81 FR 210

Page Range: 75315-75670

FR Document: Page and Subject | |
---|---|---
81 FR 75456 - Sunshine Act Meeting | |
81 FR 75427 - Indian Gaming; Tribal-State Class III Gaming Compact Taking Effect in the State of California | |
81 FR 75426 - Indian Gaming; Tribal-State Class III Gaming Compact Taking Effect in the State of California | |
81 FR 75427 - Indian Gaming; Approval of Amended Tribal-State Class III Gaming Compact in the State of South Dakota | |
81 FR 75428 - Indian Gaming; Approval of Amendment to Tribal-State Class III Gaming Compact in the State of Oregon | |
81 FR 75427 - Indian Gaming; Approval of Amendment to Tribal-State Class III Gaming Compact in the State of California | |
81 FR 75428 - Indian Gaming; Approval of Amended Tribal-State Class III Gaming Compact in the State of California | |
81 FR 75405 - Proposed Data Collection Submitted for Public Comment and Recommendations | |
81 FR 75411 - Report on the Performance of Drug and Biologics Firms in Conducting Postmarketing Requirements and Commitments; Availability | |
81 FR 75406 - Agency Information Collection Activities: Proposed Collection; Comment Request | |
81 FR 75349 - Describing a Hazard That Needs Control in Documents Accompanying the Food, as Required by Four Rules Implementing the FDA Food Safety Modernization Act: Guidance for Industry; Availability | |
81 FR 75351 - Good Laboratory Practice for Nonclinical Laboratory Studies; Extension of Comment Period | |
81 FR 75419 - Labeling for Permanent Hysteroscopically Placed Tubal Implants Intended for Sterilization; Guidance for Industry and Food and Drug Administration Staff; Availability | |
81 FR 75409 - Agency Information Collection Activities: Submission for OMB Review; Comment Request | |
81 FR 75429 - Atlantic Wind Lease Sale 6 (ATLW-6) for Commercial Leasing for Wind Power on the Outer Continental Shelf Offshore New York-Final Sale Notice MMAA104000 | |
81 FR 75476 - Petition for Exemption; Summary of Petition Received; Douglas Myers | |
81 FR 75477 - Petition for Exemption; Summary of Petition Received; Pentastar Aviation Charter, Inc. | |
81 FR 75438 - Environmental Assessment for Commercial Wind Lease Issuance and Site Assessment Activities on the Atlantic Outer Continental Shelf Offshore New York; MMAA104000 | |
81 FR 75352 - Withholding of Unclassified Technical Data and Technology From Public Disclosure | |
81 FR 75327 - Drawbridge Operation Regulation; Newtown Creek, Brooklyn and Queens, NY | |
81 FR 75315 - Temporary Exceptions to FIRREA Appraisal Requirements in Areas Affected by Severe Storms and Flooding in Louisiana | |
81 FR 75361 - Approval and Promulgation of Air Quality Implementation Plans; State of Utah; Revisions to Nonattainment Permitting Regulations | |
81 FR 75398 - Combined Notice of Filings | |
81 FR 75397 - Applied Energy LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for Blanket Section 204 Authorization | |
81 FR 75393 - Moapa Southern Paiute Solar, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for Blanket Section 204 Authorization | |
81 FR 75398 - Combined Notice of Filings #1 | |
81 FR 75399 - Combined Notice of Filings | |
81 FR 75394 - Combined Notice of Filings #1 | |
81 FR 75395 - Combined Notice of Filings #1 | |
81 FR 75401 - Change in Bank Control Notices; Acquisitions of Shares of a Bank or Bank Holding Company | |
81 FR 75387 - Agency Information Collection Activities; Comment Request; National Professional Development Program: Grantee Performance Report | |
81 FR 75345 - Fisheries of the Exclusive Economic Zone Off Alaska; Groundfish by Vessels Using Trawl Gear in the of the Gulf of Alaska | |
81 FR 75378 - Polyethylene Retail Carrier Bags From Malaysia: Final Results of the Antidumping Duty Administrative Review; 2014-2015 | |
81 FR 75373 - Foreign-Trade Zone (FTZ) 38-Spartanburg, South Carolina Authorization of Production Activity Benteler Automotive Corporation (Automotive Suspension and Body Components) Duncan, South Carolina | |
81 FR 75374 - Call for Applications for the International Buyer Program Select Service for Calendar Year 2018 | |
81 FR 75400 - Children's Health Protection Advisory Committee | |
81 FR 75379 - Call for Applications for the International Buyer Program Calendar Year 2018 | |
81 FR 75453 - New Postal Products | |
81 FR 75376 - Certain Frozen Warmwater Shrimp From India: Initiation and Preliminary Results of Antidumping Duty Changed Circumstances Review | |
81 FR 75439 - Agency Information Collection Activities; Submission for OMB Review; Comment Request; Special Dipping and Coating Operations (Dip Tanks) | |
81 FR 75440 - Agency Information Collection Activities; Submission for OMB Review; Comment Request; Safety Standards for Underground Coal Mine Ventilation-Belt Entry Used as an Intake Air Course To Ventilate Working Sections and Areas Where Mechanized Mining Equipment Is Being Installed or Removed | |
81 FR 75449 - NuScale Power, LLC, Design-Specific Review Standard and Scope and Safety Review Matrix | |
81 FR 75365 - Mercury and Air Toxics Standards (MATS) Completion of Electronic Reporting Requirements | |
81 FR 75478 - Pilot Program for Transit-Oriented Development Planning Project Selections | |
81 FR 75452 - Duke Energy Florida, LLC; Levy Nuclear Plant Units 1 and 2 | |
81 FR 75392 - Environmental Management Site-Specific Advisory Board, Savannah River Site | |
81 FR 75370 - Submission for OMB Review; Comment Request | |
81 FR 75444 - TUV Rheinland of North America, Inc.: Applications for Expansion of Recognition and Proposed Modification to the List of Appropriate NRTL Test Standards | |
81 FR 75442 - Intertek Testing Services NA, Inc.: Application for Expansion of Recognition | |
81 FR 75446 - Curtis-Strauss LLC: Application for Expansion of Recognition | |
81 FR 75371 - National Urban and Community Forestry Advisory Council | |
81 FR 75347 - Refunding Baggage Fees for Delayed Checked Bags | |
81 FR 75368 - Petitions for Reconsideration and Clarification of Action in Rulemaking Proceeding | |
81 FR 75400 - Schedule Change Open Commission Meeting, Thursday, October 27, 2016 | |
81 FR 75370 - Forest Resource Coordinating Committee | |
81 FR 75388 - National Assessment Governing Board Quarterly Board Meeting | |
81 FR 75390 - Request for Information on Interagency Working Group on Language and Communication's Report on Research and Development Activities | |
81 FR 75480 - Nondiscrimination on the Basis of Disability in Air Travel: Negotiated Rulemaking Committee Seventh Meeting | |
81 FR 75481 - Exploring Industry Practices on Distribution and Display of Airline Fare, Schedule, and Availability Information | |
81 FR 75396 - Breitburn Operating LP v. Florida Gas Transmission Company, LLC; Notice of Complaint | |
81 FR 75397 - City of Tuscaloosa, Alabama; Notice of Preliminary Permit Application Accepted for Filing and Soliciting Comments and Motions To Intervene | |
81 FR 75399 - Public Service Company of New Hampshire; Notice of Availability of Environmental Assessment | |
81 FR 75392 - Alabama Power Company v. Southwest Power Pool; Notice of Complaint | |
81 FR 75393 - Indianapolis Power & Light Company v. Midcontinent Independent System Operator, Inc.; Notice of Complaint | |
81 FR 75396 - Dominion Carolina Gas Transmission, LLC; Notice of Application | |
81 FR 75328 - Medicare Program; Hospital Inpatient Prospective Payment Systems for Acute Care Hospitals and the Long-Term Care Hospital Prospective Payment System and Policy Changes and Fiscal Year 2017 Rates; Quality Reporting Requirements for Specific Providers; Graduate Medical Education; Hospital Notification Procedures Applicable to Beneficiaries Receiving Observation Services; Technical Changes Relating to Costs to Organizations and Medicare Cost Reports; Finalization of Interim Final Rules With Comment Period on LTCH PPS Payments for Severe Wounds, Modifications of Limitations on Redesignation by the Medicare Geographic Classification Review Board, and Extensions of Payments to MDHs and Low-Volume Hospitals; Correction | |
81 FR 75423 - Commercial Customs Operations Advisory Committee (COAC) | |
81 FR 75366 - Notice of Proposed Supplementary Rules for Public Lands Managed by the Moab Field Office in Grand County, Utah | |
81 FR 75384 - Notice of Intent To Grant Exclusive Patent License to RF Networking Solutions, LLC; East Brunswick, NJ | |
81 FR 75386 - Notice of Public Hearing and Business Meeting; November 9 and December 14, 2016 | |
81 FR 75344 - NASA Federal Acquisition Regulation Supplement: Remove NASA FAR Supplement Clause Engineering Change Proposals (2016-N030) | |
81 FR 75385 - Submission for OMB Review; Comment Request | |
81 FR 75449 - Submission for OMB Review; Comment Request | |
81 FR 75425 - Endangered and Threatened Wildlife and Plants; 5-Year Status Review of the Red Wolf | |
81 FR 75454 - 2017 Railroad Experience Rating Proclamations, Monthly Compensation Base and Other Determinations | |
81 FR 75424 - Announcement of Meetings: North American Wetlands Conservation Council; Neotropical Migratory Bird Conservation Advisory Group | |
81 FR 75371 - Allegheny Resource Advisory Committee Meeting | |
81 FR 75316 - Excepted Benefits; Lifetime and Annual Limits; and Short-Term, Limited-Duration Insurance | |
81 FR 75439 - Agency Information Collection Activities; Proposed eCollection eComments Requested; Proposed Renewal, With Change, of a Previously Approved Collection; Attorney Student Loan Repayment Program Electronic Forms | |
81 FR 75384 - Proposed Information Collection; Comment Request; Natural Resource Damage Assessment Restoration Project Information Sheet | |
81 FR 75383 - Proposed Information Collection; Comment Request; Expanded Vessel Monitoring System Requirement in the Pacific Coast Groundfish Fishery | |
81 FR 75388 - Agency Information Collection Activities; Comment Request; Evaluation of the Comprehensive Technical Assistance Centers | |
81 FR 75374 - Proposed Information Collection; Comment Request; Report of Requests for Restrictive Trade Practice or Boycott | |
81 FR 75383 - Submission for OMB Review; Comment Request | |
81 FR 75382 - Submission for OMB Review; Comment Request | |
81 FR 75372 - Notice of Invitation for Nominations to the Advisory Committee on Agriculture Statistics | |
81 FR 75373 - Notice of Intent To Request Revision and Extension of a Currently Approved Information Collection | |
81 FR 75428 - Information Collection Request: National Park Service Centennial National Household Survey | |
81 FR 75327 - Drawbridge Operation Regulation; Upper Mississippi River, Clinton, IA | |
81 FR 75491 - Proposed Information Collection (Application Requirements To Receive VA Dental Insurance Plan Benefits Under 38 CFR 17.169) Activity: Comment Request | |
81 FR 75377 - Freshwater Crawfish Tail Meat From the People's Republic of China: Initiation of Antidumping Duty New Shipper Review | |
81 FR 75477 - Railroad Safety Advisory Committee; Notice of Meeting | |
81 FR 75478 - Norfolk Southern Railway Company's Request for Positive Train Control Safety Plan Approval and System Certification | |
81 FR 75448 - NASA Advisory Council; Aeronautics Committee; Meeting | |
81 FR 75401 - Patient Safety Organizations: Voluntary Relinquishment From the Patient Safety Leadership Council PSO | |
81 FR 75402 - Agency Information Collection Activities: Proposed Collection; Comment Request | |
81 FR 75454 - New Postal Product | |
81 FR 75473 - Self-Regulatory Organizations; NASDAQ PHLX LLC; Notice of Filing and Immediate Effectiveness of Proposed Rule Change to Add Commentary .14 to Rule 3317 (Compliance With Regulation NMS Plan To Implement a Tick Size Pilot) | |
81 FR 75471 - Self-Regulatory Organizations; The NASDAQ Stock Market LLC; Notice of Filing and Immediate Effectiveness of Proposed Rule Change To Add Commentary .14 to Rule 4770 (Compliance With Regulation NMS Plan To Implement a Tick Size Pilot) | |
81 FR 75468 - Self-Regulatory Organizations; NASDAQ BX, Inc.; Notice of Filing and Immediate Effectiveness of Proposed Rule Change To Add Commentary .14 to Rule 4770 (Compliance With Regulation NMS Plan To Implement a Tick Size Pilot) | |
81 FR 75460 - Options Price Reporting Authority; Notice of Filing and Immediate Effectiveness of Proposed Amendment to the Plan for Reporting of Consolidated Options Last Sale Reports and Quotation Information To Amend OPRA's Non-Display Use Fees | |
81 FR 75462 - Options Price Reporting Authority; Notice of Filing and Immediate Effectiveness of Proposed Amendment to the Plan for Reporting of Consolidated Options Last Sale Reports and Quotation Information To Amend the Professional Subscriber Device-Based Fees and Policies with Respect to Device-Based Fees | |
81 FR 75458 - Self-Regulatory Organizations; Bats EDGX Exchange, Inc.; Notice of Filing and Immediate Effectiveness of a Proposed Rule Change To Amend EDGX Rule 11.11, Routing to Away Trade Centers | |
81 FR 75466 - Self-Regulatory Organizations; Bats EDGA Exchange, Inc.; Notice of Filing and Immediate Effectiveness of a Proposed Rule Change to EDGA Rule 11.11, Routing to Away Trading Centers | |
81 FR 75464 - Self-Regulatory Organizations; Bats BZX Exchange, Inc.; Notice of Filing and Immediate Effectiveness of a Proposed Rule Change to BZX Rule 11.13, Order Execution and Routing | |
81 FR 75456 - Self-Regulatory Organizations; Bats BYX Exchange, Inc.; Notice of Filing and Immediate Effectiveness of a Proposed Rule Change To Amend BYX Rule 11.13, Order Execution and Routing | |
81 FR 75423 - National Cancer Institute; Notice of Meeting | |
81 FR 75421 - Government-Owned Inventions; Availability for Licensing | |
81 FR 75421 - Center for Scientific Review; Notice of Closed Meetings | |
81 FR 75381 - Determination of Overfishing or an Overfished Condition | |
81 FR 75488 - Unblocking of Specially Designated Nationals and Blocked Persons Resulting From the Termination of the National Emergency and Revocation of Executive Orders Related to Burma | |
81 FR 75387 - Agency Information Collection Activities; Comment Request; GEPA Section 427 Guidance for All Grant Applications | |
81 FR 75408 - Agency Information Collection Activities: Proposed Collection; Comment Request | |
81 FR 75441 - Agency Information Collection Activities; Submission for OMB Review; Comment Request; Derricks Standard | |
81 FR 75487 - Fee Schedule for the Transfer of U.S. Treasury Book-Entry Securities Held on the National Book-Entry System | |
81 FR 75338 - Amendment of the Commission's Space Station Licensing Rules and Policies, Second Order on Reconsideration | |
81 FR 75330 - Procedures for Disclosure of Information Under the Freedom of Information Act | |
81 FR 75624 - Qualified Financial Contracts Recordkeeping Related to Orderly Liquidation Authority | |
81 FR 75494 - Teacher Preparation Issues |
Forest Service
National Agricultural Statistics Service
Foreign-Trade Zones Board
Industry and Security Bureau
International Trade Administration
National Oceanic and Atmospheric Administration
Army Department
Navy Department
Federal Energy Regulatory Commission
Agency for Healthcare Research and Quality
Centers for Disease Control and Prevention
Centers for Medicare & Medicaid Services
Food and Drug Administration
National Institutes of Health
Coast Guard
U.S. Customs and Border Protection
Fish and Wildlife Service
Indian Affairs Bureau
Land Management Bureau
National Park Service
Ocean Energy Management Bureau
Employee Benefits Security Administration
Occupational Safety and Health Administration
Federal Aviation Administration
Federal Railroad Administration
Federal Transit Administration
Bureau of the Fiscal Service
Comptroller of the Currency
Foreign Assets Control Office
Internal Revenue Service
Consult the Reader Aids section at the end of this issue for phone numbers, online resources, finding aids, and notice of recently enacted public laws.
To subscribe to the Federal Register Table of Contents electronic mailing list, go to https://public.govdelivery.com/accounts/USGPOOFR/subscriber/new, enter your e-mail address, then follow the instructions to join, leave, or manage your subscription.
AGENCY: Office of the Comptroller of the Currency, Treasury (OCC); Board of Governors of the Federal Reserve System (Board); Federal Deposit Insurance Corporation (FDIC); and National Credit Union Administration (NCUA), collectively referred to as the Agencies.
ACTION: Statement and order; temporary exceptions.
SUMMARY: Section 2 of the Depository Institutions Disaster Relief Act of 1992 (DIDRA) authorizes the Agencies to make exceptions to statutory and regulatory appraisal requirements under Title XI of the Financial Institutions Reform, Recovery, and Enforcement Act of 1989 (FIRREA). The exceptions are available for transactions involving real property located within an area declared to be a major disaster area by the President if the Agencies determine, and describe by publication of a regulation or order, that the exceptions would facilitate recovery from the disaster and would be consistent with safety and soundness. In this statement and order, the Agencies exercise their authority to grant temporary exceptions to the FIRREA appraisal requirements for real estate related transactions, provided certain criteria are met, in the Louisiana parishes declared a major disaster area by President Obama on August 14, 2016, as a result of the severe storms and flooding in Louisiana. The expiration date for the exceptions is December 31, 2017.
DATES: This order is effective on October 31, 2016; the exceptions for the specified areas expire on December 31, 2017.
Section 2 of DIDRA, which added section 1123 to Title XI of FIRREA, authorizes the Agencies to make exceptions to the appraisal requirements of Title XI and its implementing regulations for transactions involving real property located in an area the President has declared to be a major disaster area, where the Agencies determine that the exceptions would facilitate recovery from the disaster and would be consistent with safety and soundness.
On August 14, 2016, the President declared that 22 parishes in Louisiana were in a major disaster area (Major Disaster Area) due to extensive damage that occurred as a result of severe storms and subsequent flooding.
The Agencies have determined that the disruption of real estate markets in the Major Disaster Area interferes with the ability of depository institutions to obtain appraisals that comply with all statutory and regulatory requirements. Further, the Agencies have determined that the disruption may impede institutions in making loans and engaging in other transactions that would aid in the reconstruction and rehabilitation of the affected area. Accordingly, the Agencies have determined that recovery from this major disaster would be facilitated by exempting certain transactions involving real estate located in the area directly affected by the severe storms and flooding from the real estate appraisal requirements of Title XI of FIRREA and its implementing regulations.
The Agencies also have determined that the exceptions are consistent with safety and soundness, provided that the depository institution determines and maintains appropriate documentation of the following: (1) The transaction involves real property located in the Major Disaster Area; (2) there is a binding commitment to fund the transaction that was entered into on or after August 14, 2016, but no later than December 31, 2017; and (3) the value of the real property supports the institution's decision to enter into the transaction. In addition, the transaction must continue to be subject to review by management and by the Agencies in the course of examinations of the institution.
Exceptions made under section 1123 of FIRREA may be provided for no more than three years after the President determines that a major disaster exists in the area.
In accordance with section 2 of DIDRA, relief is hereby granted from the provisions of Title XI of FIRREA and the Agencies' appraisal regulations for any real estate-related financial transaction that requires the services of an appraiser under those provisions, provided that the institution determines, and maintains documentation made available to the Agencies upon request, of the following:
(1) The transaction involves real property located in one of the 22 parishes declared a major disaster area as a result of severe storms and flooding in Louisiana by the President on August 14, 2016 (identified in the Appendix);
(2) There is a binding commitment to fund a transaction that was entered into on or after August 14, 2016, but no later than December 31, 2017; and
(3) The value of the real property supports the institution's decision to enter into the transaction.
By order of the Board of Directors.
AGENCY: Internal Revenue Service, Department of the Treasury; Employee Benefits Security Administration, Department of Labor; Centers for Medicare & Medicaid Services, Department of Health and Human Services.
ACTION: Final rules.
SUMMARY: This document contains final regulations regarding the definition of short-term, limited-duration insurance for purposes of the exclusion from the definition of individual health insurance coverage, and standards for travel insurance and supplemental health insurance coverage to be considered excepted benefits. This document also amends a reference in the final regulations relating to the prohibition on lifetime and annual dollar limits.
FOR FURTHER INFORMATION CONTACT: Elizabeth Schumacher or Matthew Litton, Department of Labor, at 202-693-8335; Karen Levin, Internal Revenue Service, Department of the Treasury, at (202) 317-5500; or David Mlawsky or Cam Clemmons, Centers for Medicare & Medicaid Services, Department of Health and Human Services, at 410-786-1565.
The Health Insurance Portability and Accountability Act of 1996 (HIPAA), Public Law 104-191 (110 Stat. 1936), added title XXVII of the Public Health Service Act (PHS Act), part 7 of the Employee Retirement Income Security Act of 1974 (ERISA), and Chapter 100 of the Internal Revenue Code (the Code), providing portability and nondiscrimination rules with respect to health coverage. These provisions of the PHS Act, ERISA, and the Code were later augmented by other consumer protection laws, including the Mental Health Parity Act of 1996 and, most recently, the Patient Protection and Affordable Care Act, as amended by the Health Care and Education Reconciliation Act of 2010 (collectively, the Affordable Care Act).
The Affordable Care Act reorganizes, amends, and adds to the provisions of part A of title XXVII of the PHS Act relating to group health plans and health insurance issuers in the group and individual markets. For this purpose, the term “group health plan” includes both insured and self-insured group health plans.
On June 10, 2016, the Departments of Labor, Health and Human Services, and the Treasury (the Departments) published proposed regulations addressing expatriate health plans, excepted benefits, lifetime and annual limits, and short-term, limited-duration insurance.
On July 20, 2015, the Internal Revenue Service published Notice 2015-43, 2015-29 IRB 73, to provide interim guidance with respect to the treatment of expatriate health plans, expatriate health plan issuers, and employers in their capacity as plan sponsors of expatriate health plans, as defined in the Expatriate Health Coverage Clarification Act of 2014 (EHCCA).
Short-term, limited-duration insurance is a type of health insurance coverage that is designed to fill temporary gaps in coverage when an individual is transitioning from one plan or coverage to another plan or coverage. Although short-term, limited-duration insurance is not an excepted benefit, it is similarly exempt from PHS Act requirements because it is not individual health insurance coverage. Section 2791(b)(5) of the PHS Act provides that the term “individual health insurance coverage” means health insurance coverage offered to individuals in the individual market, but does not include short-term, limited-duration insurance. The PHS Act does not define short-term, limited-duration insurance. Under current regulations, short-term, limited-duration insurance means “health insurance coverage provided pursuant to a contract with an issuer that has an expiration date specified in the contract (taking into account any extensions that may be elected by the policyholder without the issuer's consent) that is less than 12 months after the original effective date of the contract.”
Before enactment of the Affordable Care Act, short-term, limited-duration insurance was an important means for individuals to obtain health coverage when transitioning from one job to another (and from one group health plan to another) or when faced with other similar situations. However, with guaranteed availability of coverage and special enrollment period requirements in the individual health insurance market under the Affordable Care Act, individuals can purchase coverage with the protections of the Affordable Care Act to fill in the gaps in coverage.
The Departments have become aware that short-term, limited-duration insurance is being sold in situations other than those that the exception from the definition of individual health insurance coverage was initially intended to address.
To address the issue of short-term, limited-duration insurance being sold as a type of primary coverage, the Departments proposed regulations to revise the definition of short-term, limited-duration insurance so that the coverage must be less than three months in duration, including any period for which the policy may be renewed. The proposed regulations also included a requirement that a notice must be prominently displayed in the contract and in any application materials provided in connection with enrollment in such coverage with the following language: THIS IS NOT QUALIFYING HEALTH COVERAGE (“MINIMUM ESSENTIAL COVERAGE”) THAT SATISFIES THE HEALTH COVERAGE REQUIREMENT OF THE AFFORDABLE CARE ACT. IF YOU DON'T HAVE MINIMUM ESSENTIAL COVERAGE, YOU MAY OWE AN ADDITIONAL PAYMENT WITH YOUR TAXES.
In addition to proposing to reduce the length of short-term, limited-duration insurance to less than three months, the proposed regulations modified the permitted coverage period to take into account extensions made by the policyholder “with or without the issuer's consent.” This modification was intended to address the Departments' concern that some issuers are taking liberty with the current definition of short-term, limited-duration insurance—either by automatically renewing such policies or having a simplified reapplication process with the result being that such coverage, which does not contain the important protections of the Affordable Care Act, lasts longer than 12 months and serves as an individual's primary health coverage.
The Departments received a number of comments relating to the treatment of short-term, limited-duration insurance. Several commenters supported the proposed rules and the reasoning behind them, noting that short-term, limited-duration insurance is not subject to the same consumer protections as major medical coverage and can discriminate based on health status by recruiting healthier consumers to the exclusion of sicker consumers. These commenters suggested the proposed rules would limit the number of consumers relying on short-term, limited-duration insurance as their primary form of coverage and improve the Affordable Care Act's single risk pool.
Some commenters requested that the Departments go further and prohibit issuers from offering short-term, limited-duration insurance to consumers who have previously purchased this type of coverage to prevent consumers from stringing together coverage under policies offered by the same or different issuers. However, in the Departments' view, such a restriction is not warranted. The individual shared responsibility provision of the Code, under which individuals who do not maintain minimum essential coverage (and do not qualify for an exemption) may owe an additional payment with their taxes, already provides an incentive for consumers not to rely on a series of short-term, limited-duration policies as their primary coverage.
Other commenters expressed general opposition to the proposed rules or requested that short-term, limited-duration insurance be allowed to provide coverage for a longer period. Several commenters stated that some individuals who lose their employer-sponsored coverage may not be able to obtain COBRA continuation coverage and may therefore need short-term, limited-duration insurance for longer than three months while they seek other coverage.
After consideration of the comments and feedback received from stakeholders, the Departments are finalizing the proposed regulations without change.
The revised definition of short-term, limited-duration insurance applies for policy years beginning on or after January 1, 2017. The Departments recognize, however, that State regulators may have approved short-term, limited-duration insurance products for sale in 2017 that met the definition in effect prior to January 1, 2017. Accordingly, the Department of Health and Human Services (HHS) will not take enforcement action against an issuer with respect to the issuer's sale of a short-term, limited-duration insurance product before April 1, 2017 on the ground that the coverage period is three months or more, provided that the coverage ends on or before December 31, 2017 and otherwise complies with the definition of short-term, limited-duration insurance in effect under the regulations.
Sections 2722 and 2763 of the PHS Act, section 732 of ERISA, and section 9831 of the Code provide that the respective requirements of title XXVII of the PHS Act, part 7 of ERISA, and Chapter 100 of the Code generally do not apply to the provision of certain types of benefits, known as “excepted benefits.” Excepted benefits are described in section 2791(c) of the PHS Act, section 733(c) of ERISA, and section 9832(c) of the Code.
The parallel statutory provisions establish four categories of excepted benefits. The first category, under section 2791(c)(1) of the PHS Act, section 733(c)(1) of ERISA and section 9832(c)(1) of the Code, includes benefits that are generally not health coverage (such as automobile insurance, liability insurance, workers compensation, and accidental death and dismemberment coverage). The benefits in this category are excepted in all circumstances. In contrast, the benefits in the second, third, and fourth categories are types of health coverage that are excepted only if certain conditions are met.
The second category of excepted benefits is limited excepted benefits, which may include limited scope vision or dental benefits, and benefits for long-term care, nursing home care, home health care, or community-based care. Section 2791(c)(2)(C) of the PHS Act, section 733(c)(2)(C) of ERISA, and section 9832(c)(2)(C) of the Code authorize the Secretaries of HHS, Labor, and the Treasury (collectively, the Secretaries) to issue regulations establishing other, similar limited benefits as excepted benefits. The Secretaries exercised this authority previously with respect to certain health flexible spending arrangements.
The third category of excepted benefits, referred to as “noncoordinated excepted benefits,” includes both coverage for only a specified disease or illness (such as cancer-only policies), and hospital indemnity or other fixed indemnity insurance. These benefits are excepted under section 2722(c)(2) of the PHS Act, section 732(c)(2) of ERISA, and section 9831(c)(2) of the Code only if all of the following conditions are met: (1) The benefits are provided under a separate policy, certificate, or contract of insurance; (2) there is no coordination between the provision of such benefits and any exclusion of benefits under any group health plan maintained by the same plan sponsor; and (3) the benefits are paid with respect to any event without regard to whether benefits are provided under any group health plan maintained by the same plan sponsor.
The fourth category, under section 2791(c)(4) of the PHS Act, section 733(c)(4) of ERISA, and section 9832(c)(4) of the Code, is supplemental excepted benefits. These benefits are excepted only if they are provided under a separate policy, certificate, or contract of insurance and are Medicare supplemental health insurance (also known as Medigap), TRICARE supplemental programs, or “similar supplemental coverage provided to coverage under a group health plan.” The phrase “similar supplemental coverage provided to coverage under a group health plan” is not defined in the statute or regulations. However, the Departments issued regulations clarifying that one requirement to be similar supplemental coverage is that the coverage “must be specifically designed to fill gaps in primary coverage, such as coinsurance or deductibles.”
In 2007 and 2008, the Departments issued guidance on the circumstances under which supplemental health insurance would be considered excepted benefits under section 2791(c)(4) of the PHS Act (and the parallel provisions of ERISA and the Code).
On February 13, 2015, the Departments issued Affordable Care Act Implementation FAQs Part XXIII, providing additional guidance on the circumstances under which health insurance coverage that supplements group health plan coverage may be considered supplemental excepted benefits.
The proposed regulations incorporated guidance from the Affordable Care Act Implementation FAQs Part XXIII addressing supplemental health insurance products that provide categories of benefits in addition to those in the primary coverage. Under the proposed regulations, if group or individual supplemental health insurance covers items and services not included in the primary coverage (referred to as providing "additional categories of benefits"), the coverage is considered to be designed "to fill gaps in primary coverage" for purposes of being supplemental excepted benefits, provided that none of the benefits provided by the supplemental policy is an EHB, as defined under section 1302(b) of the Affordable Care Act, in the State in which the coverage is issued.
The Departments received several comments in support of the proposed regulations. One commenter expressed support but requested that the Departments provide additional examples in the regulations. Another commenter requested clarification regarding the application of the standards for similar supplemental coverage that provides benefits outside of the United States, noting that no State's EHB rules require coverage for services outside of the United States. In response, the Departments clarify that if any benefit provided by the supplemental policy is a type of service that is an EHB in the State where the coverage is issued, the coverage would not be supplemental excepted benefits under the final regulations, even if the supplemental coverage was limited to covering the benefit in a location or setting where it would not be covered as an EHB.
After consideration of the comments, the Departments are finalizing the proposed regulations on similar supplemental coverage without substantive change. For purposes of consistency and clarity, HHS is also including a cross reference in the individual market excepted benefits regulations at 45 CFR 148.220 to reflect the standard for similar supplemental coverage under the group market regulations at 45 CFR 146.145(b)(5)(i)(C). The Departments may provide additional guidance on similar supplemental coverage that meets the criteria to be excepted benefits in the future.
The Departments are aware that certain travel insurance products may include limited health benefits. However, these products typically are not designed as major medical coverage. Instead, the risks being insured relate primarily to: (1) The interruption or cancellation of a trip; (2) the loss of baggage or personal effects; (3) damages to accommodations or rental vehicles; or (4) sickness, accident, disability, or death occurring during travel, with any health benefits usually incidental to other coverage.
Section 2791(c)(1)(H) of the PHS Act, section 733(c)(1)(H) of ERISA, and section 9832(c)(1)(H) of the Code provide that the Departments may, in regulations, designate as excepted benefits "benefits for medical care [that] are secondary or incidental to other insurance benefits." Pursuant to this authority, and to clarify which types of travel-related insurance products are excepted benefits under the PHS Act, ERISA, and the Code, the Departments' proposed regulations identified travel insurance as an excepted benefit under the first category of excepted benefits and proposed a definition of travel insurance consistent with the definition of travel insurance under final regulations issued by the Treasury Department and the IRS for the health insurance providers fee imposed by section 9010 of the Affordable Care Act.
The proposed regulations defined the term “travel insurance” as insurance coverage for personal risks incident to planned travel, which may include, but are not limited to, interruption or cancellation of a trip or event, loss of baggage or personal effects, damages to accommodations or rental vehicles, and sickness, accident, disability, or death occurring during travel, provided that the health benefits are not offered on a stand-alone basis and are incidental to other coverage. For this purpose, travel insurance does not include major medical plans that provide comprehensive medical protection for travelers with trips lasting six months or longer, including, for example, those working overseas as an expatriate or military personnel being deployed.
The Departments received a number of comments in favor of the treatment of travel insurance as an excepted benefit, as well as the proposed definition of travel insurance. Several comments expressed support for the proposed definition's consistency with regulations governing the health insurance providers fee. One commenter requested clarification that the requirement that health benefits are incidental to other coverage be determined based solely on coverage under the travel insurance policy, without regard to other coverage provided by an employer or plan sponsor; the Departments agree that this is correct. The Departments are finalizing without change the proposed regulations defining travel insurance and treating such coverage as an excepted benefit.
Section 2711 of the PHS Act, as added by the Affordable Care Act, generally prohibits group health plans and health insurance issuers offering group or individual health insurance coverage from imposing lifetime and annual dollar limits on EHB, as defined under section 1302(b) of the Affordable Care Act. These prohibitions apply to both grandfathered and non-grandfathered health plans, except the annual limits prohibition does not apply to grandfathered individual health insurance coverage.
Under the Affordable Care Act, self-insured group health plans, large group market health plans, and grandfathered health plans are not required to offer EHB, but they generally cannot place lifetime or annual dollar limits on services they cover that are considered EHB. On November 18, 2015, the Departments issued final regulations implementing section 2711 of the PHS Act.
The final regulations under section 2711 of the PHS Act include a reference to selecting a “base-benchmark” plan, as specified under 45 CFR 156.100, for purposes of determining which benefits cannot be subject to lifetime or annual dollar limits. The base-benchmark plan selected by a State or applied by default under 45 CFR 156.100, however, may not reflect the complete definition of EHB in the applicable State. For that reason, the Departments are amending the regulations at 26 CFR 54.9815-2711(c), 29 CFR 2590.715-2711(c), and 45 CFR 147.126(c) to refer to the provisions that capture the complete definition of EHB in a State.
Specifically, in these final regulations, the Departments replace the phrase “in a manner consistent with one of the three Federal Employees Health Benefit Program (FEHBP) options as defined by 45 CFR 156.100(a)(3) or one of the base-benchmark plans selected by a State or applied by default pursuant to 45 CFR 156.100” in each of the regulations with the following: “in a manner that is consistent with (1) one of the EHB-benchmark plans applicable in a State under 45 CFR 156.110, and includes coverage of any additional required benefits that are considered EHB consistent with 45 CFR 155.170(a)(2); or (2) one of the three Federal Employees Health Benefit Program (FEHBP) plan options as defined by 45 CFR 156.100(a)(3), supplemented, as necessary, to meet the standards in 45 CFR 156.110.” This change reflects the possibility that base-benchmark plans, including the FEHBP plan options, could require supplementation under 45 CFR 156.110, and ensures the inclusion of State-required benefit mandates enacted on or before December 31, 2011 in accordance with 45 CFR 155.170, which when coupled with a State's EHB-benchmark plan, establish the definition of EHB in that State under regulations implementing section 1302(b) of the Affordable Care Act.
Some commenters requested clarification that self-insured group health plans, large group market health plans and grandfathered plans are not required to include as covered benefits any specific items and services covered by the State-EHB benchmark plan, including any additional State-required benefits considered EHB under 45 CFR 155.170(a)(2). The requirement in section 2707(a) of the PHS Act to provide the EHB package required under section 1302(a) of the Affordable Care Act applies only to non-grandfathered health insurance coverage in the individual and small group markets. Self-insured group health plans, large group market health plans and grandfathered health plans are not required to include coverage of EHB, but cannot place lifetime or annual dollar limits on any EHB covered by these plans.
One commenter urged the Departments to eliminate the option for large group market health plans to define EHB based on one of the three largest nationally available FEHBP benchmark plan options to ensure consistency with the definition of EHB in the individual and small group markets. However, these FEHBP plan options remain a permissible basis for defining EHB under the final regulations, supplemented as necessary to meet the standards in 45 CFR 156.110.
These final regulations are applicable for plan years (or, in the individual market, policy years) beginning on or after January 1, 2017. The HHS final regulations specify the applicability dates in the group market regulations at 45 CFR 146.125 and in the individual market regulations at 45 CFR 148.102.
These final regulations specify the conditions for similar supplemental coverage products that are designed to fill gaps in primary coverage by providing coverage of additional categories of benefits (as opposed to filling in gaps in cost sharing) to constitute supplemental excepted benefits, and clarify that certain travel-related insurance products that provide only incidental health benefits constitute excepted benefits.
These final regulations also revise the definition of short-term, limited-duration insurance so that the coverage (including renewals) has to be less than three months in total duration (as opposed to the current definition of less than 12 months in duration), and provide that a notice must be prominently displayed in the contract and in any application materials provided in connection with enrollment in the coverage indicating that such coverage is not minimum essential coverage.
Finally, the regulations amend the definition of “essential health benefits” for purposes of the prohibition on lifetime and annual dollar limits with respect to group health plans and health insurance issuers that are not required to provide essential health benefits, including self-insured group health plans, large group market health plans, and grandfathered health plans.
The Departments are publishing these final regulations to implement the protections intended by the Congress in the most economically efficient manner possible. The Departments have examined the effects of this rule as required by Executive Order 13563 (76 FR 3821, January 21, 2011), Executive Order 12866 (58 FR 51735, September 30, 1993, Regulatory Planning and Review), the Regulatory Flexibility Act (September 19, 1980, Pub. L. 96-354), the Unfunded Mandates Reform Act of 1995 (Pub. L. 104-4), Executive Order 13132 on Federalism, and the Congressional Review Act (5 U.S.C. 804(2)).
Executive Order 12866 (58 FR 51735) directs agencies to assess all costs and benefits of available regulatory alternatives and, if regulation is necessary, to select regulatory approaches that maximize net benefits (including potential economic, environmental, public health and safety effects; distributive impacts; and equity). Executive Order 13563 (76 FR 3821, January 21, 2011) is supplemental to and reaffirms the principles, structures, and definitions governing regulatory review as established in Executive Order 12866.
Section 3(f) of Executive Order 12866 defines a “significant regulatory action” as an action that is likely to result in a final rule—(1) having an annual effect on the economy of $100 million or more in any one year, or adversely and materially affecting a sector of the economy, productivity, competition, jobs, the environment, public health or safety, or state, local or tribal governments or communities (also referred to as “economically significant”); (2) creating a serious inconsistency or otherwise interfering with an action taken or planned by another agency; (3) materially altering the budgetary impacts of entitlement grants, user fees, or loan programs or the rights and obligations of recipients thereof; or (4) raising novel legal or policy issues arising out of legal mandates, the President's priorities, or the principles set forth in the Executive Order.
A regulatory impact analysis must be prepared for rules with economically significant effects (for example, $100 million or more in any 1 year), and a “significant” regulatory action is subject to review by the Office of Management and Budget. The Departments have determined that this regulatory action is not likely to have economic impacts of $100 million or more in any one year, and is not significant within the meaning of Executive Order 12866. However, the Departments are nonetheless providing a discussion of the benefits and costs that might stem from these final regulations in the Summary of Impacts section below.
These final regulations clarify the conditions for similar supplemental coverage and travel insurance to be recognized as excepted benefits. These clarifications are necessary to provide health insurance issuers offering supplemental coverage and travel insurance products with a clearer understanding of the Federal standards that apply to these types of coverage. These final regulations also amend the definition of short-term, limited-duration insurance for purposes of the exclusion from the definition of individual health insurance coverage and impose a new notice requirement in response to reports that short-term, limited-duration insurance coverage is being sold to individuals as primary coverage.
The final regulations outline the conditions for travel insurance and similar supplemental health insurance coverage to be considered excepted benefits, and revise the definition of short-term, limited-duration insurance.
The Departments received comments suggesting that the majority of travel insurance policies are issued for trips of short duration, with the average policy length being approximately three months, and these policies generally provide limited medical coverage and property and casualty coverage to protect against risks related to travel. The Departments believe that the designation of certain travel insurance products (as defined by the regulations) as excepted benefits is consistent with prevailing industry practices, and therefore, will not result in significant cost to issuers of these products or consumers who purchase them.
Short-term, limited-duration policies represent a very small fraction of the health insurance market, though their use is increasing. In 2015, total premiums earned for short-term, limited-duration insurance were approximately $160 million, covering approximately 1,517,000 member months and approximately 148,000 covered lives at the end of the year.
The Departments received comments indicating that a large majority of the short-term, limited-duration insurance plans are sold as transitional coverage, particularly for individuals seeking to cover periods of unemployment or gaps between employer-sponsored coverage, and typically provide coverage for less than three months. Therefore, the Departments believe that the final regulations will have no effect on the majority of consumers who purchase such coverage and issuers of those policies. The small fraction of consumers who purchase such policies for longer periods and who may have to transition to individual market coverage will benefit from the protections afforded by the Affordable Care Act, such as no preexisting condition exclusions, essential health benefits without annual or lifetime dollar limits, and guaranteed renewability. While some of these consumers may experience an increase in costs due to higher premiums compared with short-term, limited-duration coverage, they will also avoid potential tax liability by having minimum essential coverage. Some consumers may also be eligible for premium tax credits and cost-sharing reductions for coverage offered through the Exchanges. Finally, inclusion of these individuals, often relatively healthier individuals, in the individual market will help strengthen the individual market's single risk pool. The notice requirement will help ensure that consumers do not inadvertently purchase these products expecting them to be minimum essential coverage. Further, the Departments believe that any costs incurred by issuers of short-term, limited-duration insurance to include the required notice in application or enrollment materials will be negligible since the Departments have provided the exact text for the notice.
As a result, the Departments have concluded that the impacts of these final regulations are not economically significant.
The final regulations provide that to be considered short-term, limited-duration insurance for policy years beginning on or after January 1, 2017, a notice must be prominently displayed in the contract and in any application materials, stating that the coverage is not minimum essential coverage and that failure to have minimum essential coverage may result in an additional tax payment. The Departments have provided the exact text for these notice requirements and the language will not need to be customized. The burden associated with these notices is not subject to the Paperwork Reduction Act of 1995 because the notices are public disclosures of information originally supplied by the Federal Government (see 5 CFR 1320.3(c)(2)).
The Regulatory Flexibility Act (5 U.S.C. 601 et seq.) (RFA) generally requires agencies to analyze regulatory options that would minimize the economic impact of a rule on small entities unless the agency certifies that the rule will not have a significant economic impact on a substantial number of small entities.
The RFA generally defines a “small entity” as (1) a proprietary firm meeting the size standards of the Small Business Administration (13 CFR 121.201); (2) a nonprofit organization that is not dominant in its field; or (3) a small government jurisdiction with a population of less than 50,000. (States and individuals are not included in the definition of “small entity.”) The Departments use as their measure of significant economic impact on a substantial number of small entities a change in revenues of more than 3 to 5 percent.
The Departments expect the impact of these final regulations to be limited because the provisions are generally consistent with current industry practices and impact only a small fraction of the health insurance market. Therefore, the Departments certify that the final regulations will not have a significant impact on a substantial number of small entities. In addition, section 1102(b) of the Social Security Act requires agencies to prepare a regulatory impact analysis if a rule may have a significant economic impact on the operations of a substantial number of small rural hospitals. This analysis must conform to the provisions of section 604 of the RFA. These final regulations will not affect small rural hospitals. Therefore, the Departments have determined that these final regulations will not have a significant impact on the operations of a substantial number of small rural hospitals.
Certain IRS regulations, including this one, are exempt from the requirements of Executive Order 12866, as supplemented and reaffirmed by Executive Order 13563. Therefore, a regulatory impact assessment is not required. For applicability of RFA, see paragraph D of this section III.
Pursuant to section 7805(f) of the Code, these regulations have been submitted to the Chief Counsel for Advocacy of the Small Business Administration for comment on their impact on small business.
For purposes of the Unfunded Mandates Reform Act of 1995 (2 U.S.C. 1501 et seq.), these final regulations do not include any Federal mandate that may result in expenditures by State, local, or tribal governments, or by the private sector, of $100 million or more (adjusted annually for inflation) in any one year.
Executive Order 13132 outlines fundamental principles of federalism. It requires adherence to specific criteria by Federal agencies in formulating and implementing policies that have “substantial direct effects” on the States, the relationship between the national government and States, or on the distribution of power and responsibilities among the various levels of government. Federal agencies promulgating regulations that have these federalism implications must consult with State and local officials, and describe the extent of their consultation and the nature of the concerns of State and local officials in the preamble to the final regulation.
In the Departments' view, these final regulations have federalism implications because they would have direct effects on the States, the relationship between the national government and the States, or on the distribution of power and responsibilities among various levels of government. Under these final regulations, health insurance issuers offering short-term, limited-duration insurance, travel insurance and similar supplemental coverage will be required to follow the minimum Federal standards to not be subject to the market reform provisions under the PHS Act, ERISA and the Code. However, in the Departments' view, the federalism implications of these final regulations are substantially mitigated because, with respect to health insurance issuers, the Departments expect that the majority of States will enact laws or take other appropriate action resulting in their meeting or exceeding the Federal standards.
In general, through section 514, ERISA supersedes State laws to the extent that they relate to any covered employee benefit plan, and preserves State laws that regulate insurance, banking, or securities. While ERISA prohibits States from regulating an employee benefit plan as an insurance or investment company or bank, the preemption provisions of section 731 of ERISA and section 2724 of the PHS Act (implemented in 29 CFR 2590.731(a) and 45 CFR 146.143(a) and 148.210(b)) apply so that the requirements in title XXVII of the PHS Act (including those added by the Affordable Care Act) are not to be construed to supersede any provision of State law which establishes, implements, or continues in effect any standard or requirement solely relating to health insurance issuers in connection with individual or group health insurance coverage except to the extent that such standard or requirement prevents the application of a Federal requirement. The conference report accompanying HIPAA indicates that this is intended to be the “narrowest” preemption of State laws (See House Conf. Rep. No. 104-736, at 205, reprinted in 1996 U.S. Code Cong. & Admin. News 2018).
States may continue to apply State law requirements except to the extent that such requirements prevent the application of the market reform requirements that are the subject of this rulemaking. Accordingly, States have significant latitude to impose requirements on health insurance issuers that are more restrictive than the Federal law.
In compliance with the requirement of Executive Order 13132 that agencies examine closely any policies that may have federalism implications or limit the policy making discretion of the States, the Departments have engaged in efforts to consult with and work cooperatively with affected States, including consulting with, and attending conferences of, the National Association of Insurance Commissioners and consulting with State insurance officials on an individual basis. It is expected that the Departments will act in a similar fashion in enforcing the market reform provisions of the Affordable Care Act.
Throughout the process of developing these final regulations, to the extent feasible within the preemption provisions of HIPAA as they apply to the PHS Act, ERISA, and the Code, the Departments have attempted to balance the States' interests in regulating health insurance issuers with the consumer protections these Federal standards are intended to provide.
Pursuant to the requirements set forth in section 8(a) of Executive Order 13132, and by the signatures affixed to this final rule, the Departments certify that the Employee Benefits Security Administration and the Centers for Medicare & Medicaid Services have complied with the requirements of Executive Order 13132 for the attached final rules in a meaningful and timely manner.
These final regulations are subject to the Congressional Review Act provisions of the Small Business Regulatory Enforcement Fairness Act of 1996 (5 U.S.C. 801 et seq.) and will be transmitted to the Congress and the Comptroller General for review.
IRS Revenue Procedures, Revenue Rulings, notices, and other guidance cited in this document are published in the Internal Revenue Bulletin (or Cumulative Bulletin) and are available from the Superintendent of Documents, U.S. Government Printing Office, Washington, DC 20402, or by visiting the IRS Web site at http://www.irs.gov.
The Department of the Treasury regulations are adopted pursuant to the authority contained in sections 7805 and 9833 of the Code.
The Department of Labor regulations are adopted pursuant to the authority contained in 29 U.S.C. 1135 and 1191c; and Secretary of Labor's Order 1-2011, 77 FR 1088 (Jan. 9, 2012).
The Department of Health and Human Services regulations are adopted pursuant to the authority contained in sections 2701 through 2763, 2791, and 2792 of the PHS Act (42 U.S.C. 300gg through 300gg-63, 300gg-91, and 300gg-92), as amended.
Pension and excise taxes.
Continuation coverage, Disclosure, Employee benefit plans, Group health plans, Health care, Health insurance, Medical child support, Reporting and recordkeeping requirements.
Health care, Health insurance, Reporting and recordkeeping requirements.
Administrative practice and procedure, Health care, Health insurance, Penalties, Reporting and recordkeeping requirements.
Accordingly, 26 CFR part 54 is amended as follows:
26 U.S.C. 7805 * * *
(1) Has an expiration date specified in the contract (taking into account any extensions that may be elected by the policyholder with or without the issuer's consent) that is less than 3 months after the original effective date of the contract; and
(2) Displays prominently in the contract and in any application materials provided in connection with enrollment in such coverage in at least 14 point type the following: “THIS IS NOT QUALIFYING HEALTH COVERAGE (“MINIMUM ESSENTIAL COVERAGE”) THAT SATISFIES THE HEALTH COVERAGE REQUIREMENT OF THE AFFORDABLE CARE ACT. IF YOU DON'T HAVE MINIMUM ESSENTIAL COVERAGE, YOU MAY OWE AN ADDITIONAL PAYMENT WITH YOUR TAXES.”
(c)
(1) One of the EHB-benchmark plans applicable in a State under 45 CFR 156.110, and includes coverage of any additional required benefits that are considered essential health benefits consistent with 45 CFR 155.170(a)(2); or
(2) One of the three Federal Employees Health Benefits Program (FEHBP) plan options as defined by 45 CFR 156.100(a)(3), supplemented, as necessary, to meet the standards in 45 CFR 156.110.
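The amended definition above turns on two conditions: a contract term (counting any permitted extensions) of less than 3 months from the original effective date, and the prescribed consumer notice displayed in at least 14 point type. The following sketch is purely illustrative and not part of the regulatory text; the function name, inputs, and month arithmetic are assumptions made only to show how the two conditions combine.

```python
from datetime import date

def _add_months(d, months):
    """Return the date `months` calendar months after `d`, clamping to month end."""
    y, m = divmod(d.month - 1 + months, 12)
    y, m = d.year + y, m + 1
    leap = y % 4 == 0 and (y % 100 != 0 or y % 400 == 0)
    last_day = [31, 29 if leap else 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31][m - 1]
    return date(y, m, min(d.day, last_day))

def is_short_term_limited_duration(effective, expiration_with_extensions, notice_point_size):
    """Illustrative check of the two conditions in the amended definition:
    (1) expiration, counting permitted extensions, less than 3 months after the
        original effective date; and
    (2) the required notice displayed in at least 14 point type."""
    return (expiration_with_extensions < _add_months(effective, 3)
            and notice_point_size >= 14)

# A contract effective January 1, 2017 that can run only through March 31, 2017
# meets the duration condition; one extendable through April 30, 2017 does not.
print(is_short_term_limited_duration(date(2017, 1, 1), date(2017, 3, 31), 14))  # True
print(is_short_term_limited_duration(date(2017, 1, 1), date(2017, 4, 30), 14))  # False
```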
The revisions and additions are as follows:
(c) * * *
(2) * * *
(ix) Travel insurance, within the meaning of § 54.9801-2.
(5) * * *
(i) * * *
(C)
* * * Notwithstanding the previous sentence, the definition of “short-term, limited-duration insurance” in § 54.9801-2 and paragraph (c)(5)(i)(C) of § 54.9831-1 apply for plan years beginning on or after January 1, 2017.
For the reasons stated in the preamble, the Department of Labor amends 29 CFR part 2590 as set forth below:
29 U.S.C. 1027, 1059, 1135, 1161-1168, 1169, 1181-1183, 1181 note, 1185, 1185a, 1185b, 1191, 1191a, 1191b, and 1191c; sec. 101(g), Pub. L. 104-191, 110 Stat. 1936; sec. 401(b), Pub. L. 105-200, 112 Stat. 645 (42 U.S.C. 651 note); sec. 512(d), Pub. L. 110-343, 122 Stat. 3881; sec. 1001, 1201, and 1562(e), Pub. L. 111-148, 124 Stat. 119, as amended by Pub. L. 111-152, 124 Stat. 1029; Division M, Pub. L. 113-235, 128 Stat. 2130; Secretary of Labor's Order 1-2011, 77 FR 1088 (Jan. 9, 2012).
(1) Has an expiration date specified in the contract (taking into account any extensions that may be elected by the policyholder with or without the issuer's consent) that is less than 3 months after the original effective date of the contract; and
(2) Displays prominently in the contract and in any application materials provided in connection with enrollment in such coverage in at least 14 point type the following: “THIS IS NOT QUALIFYING HEALTH COVERAGE (“MINIMUM ESSENTIAL COVERAGE”) THAT SATISFIES THE HEALTH COVERAGE REQUIREMENT OF THE AFFORDABLE CARE ACT. IF YOU DON'T HAVE MINIMUM ESSENTIAL COVERAGE, YOU MAY OWE AN ADDITIONAL PAYMENT WITH YOUR TAXES.”
(c)
(1) One of the EHB-benchmark plans applicable in a State under 45 CFR 156.110, and includes coverage of any additional required benefits that are considered essential health benefits consistent with 45 CFR 155.170(a)(2); or
(2) One of the three Federal Employees Health Benefits Program (FEHBP) plan options as defined by 45 CFR 156.100(a)(3), supplemented, as necessary, to meet the standards in 45 CFR 156.110.
(c) * * *
(2) * * *
(ix) Travel insurance, within the meaning of § 2590.701-2.
(5) * * *
(i) * * *
(C)
* * * Notwithstanding the previous sentence, the definition of “short-term, limited-duration insurance” in § 2590.701-2 and paragraph (c)(5)(i)(C) of § 2590.732 apply for plan years beginning on or after January 1, 2017.
For the reasons stated in the preamble, the Department of Health and Human Services amends 45 CFR parts 144, 146, 147, and 148 as set forth below:
Secs. 2701 through 2763, 2791, and 2792 of the Public Health Service Act, 42 U.S.C. 300gg through 300gg-63, 300gg-91, and 300gg-92.
(1) Has an expiration date specified in the contract (taking into account any extensions that may be elected by the policyholder with or without the issuer's consent) that is less than 3 months after the original effective date of the contract; and
(2) Displays prominently in the contract and in any application materials provided in connection with enrollment in such coverage in at least 14 point type the following: “THIS IS NOT QUALIFYING HEALTH COVERAGE (“MINIMUM ESSENTIAL COVERAGE”) THAT SATISFIES THE HEALTH COVERAGE REQUIREMENT OF THE AFFORDABLE CARE ACT. IF YOU DON'T HAVE MINIMUM ESSENTIAL COVERAGE, YOU MAY OWE AN ADDITIONAL PAYMENT WITH YOUR TAXES.”
Secs. 2702 through 2705, 2711 through 2723, 2791, and 2792 of the Public Health Service Act (42 U.S.C. 300gg-1 through 300gg-5, 300gg-11 through 300gg-23, 300gg-91, and 300gg-92).
* * * Notwithstanding the previous sentence, the definition of “short-term, limited-duration insurance” in § 144.103 of this subchapter and paragraph (c)(5)(i)(C) of § 146.145 apply for policy years and plan years beginning on or after January 1, 2017.
(b) * * *
(2) * * *
(ix) Travel insurance, within the meaning of § 144.103 of this subchapter.
(5) * * *
(i) * * *
(C)
Secs. 2701 through 2763, 2791, and 2792 of the Public Health Service Act (42 U.S.C. 300gg through 300gg-63, 300gg-91, and 300gg-92), as amended.
(c)
(1) One of the EHB-benchmark plans applicable in a State under 45 CFR 156.110, and includes coverage of any additional required benefits that are considered essential health benefits consistent with 45 CFR 155.170(a)(2); or
(2) One of the three Federal Employees Health Benefits Program (FEHBP) plan options as defined by 45 CFR 156.100(a)(3), supplemented, as necessary, to meet the standards in 45 CFR 156.110.
Secs. 2701 through 2763, 2791, and 2792 of the Public Health Service Act (42 U.S.C. 300gg through 300gg-63, 300gg-91, and 300gg-92), as amended.
(b) * * * Notwithstanding the previous sentence, the definition of “short-term, limited-duration insurance” in § 144.103 of this subchapter and paragraph (b)(7) of § 148.220 apply for policy years beginning on or after January 1, 2017.
(a) * * *
(9) Travel insurance, within the meaning of § 144.103 of this subchapter.
(b) * * *
(7) Similar supplemental coverage provided to coverage under a group health plan (as described in § 146.145(b)(5)(i)(C) of this subchapter).
Coast Guard, DHS.
Notice of deviation from drawbridge regulation.
The Coast Guard has issued a temporary deviation from the operating schedule that governs three drawbridges crossing the Upper Mississippi River in Iowa: The Illinois Central Railroad Drawbridge, mile 579.9, Dubuque, IA; the Sabula Railroad Drawbridge, mile 535.0, Sabula, IA; and the Clinton Railroad Drawbridge, mile 518.0, Clinton, IA. The deviation is necessary to allow the bridge owners time to perform preventive maintenance that is essential to the continued safe operation of the drawbridges; a similar seasonal deviation is issued for these bridges each year. Maintenance is scheduled in the winter, when traffic is lighter and the impact on navigation is reduced. This deviation allows the bridges to open on signal if at least 24 hours advance notice is given.
This deviation is effective from 5 p.m., December 13, 2016 until 9 a.m., March 2, 2017.
The docket for this deviation, (USCG-2016-0956), is available at
If you have questions on this temporary deviation, call or email Eric A. Washburn, Bridge Administrator, Western Rivers, Coast Guard; telephone 314-269-2378, email
The Illinois Central, Canadian Pacific, and Union Pacific Railroads requested a temporary deviation for the Illinois Central Railroad Drawbridge, mile 579.9, Dubuque, Iowa; the Sabula Railroad Drawbridge, mile 535.0, Sabula, Iowa; and the Clinton Railroad Drawbridge, mile 518.0, Clinton, Iowa, across the Upper Mississippi River. The deviation would allow the bridges to open on signal if at least 24 hours advance notice is given for 79 days, from 5 p.m. on December 13, 2016, to 9 a.m. on March 2, 2017, so that scheduled maintenance can be performed on the bridges.
The Illinois Central, Sabula, and Clinton Railroad Drawbridges currently operate in accordance with 33 CFR 117.5, which states the general requirement that drawbridges open on signal.
There are no alternate routes for vessels transiting these sections of the Upper Mississippi River. The bridges cannot open in case of emergency.
The Illinois Central Railroad Drawbridge provides a vertical clearance of 19.9 feet, the Sabula Railroad Drawbridge 18.1 feet, and the Clinton Railroad Drawbridge 18.7 feet above normal pool in their closed-to-navigation positions. Navigation on the waterway consists primarily of commercial tows and recreational watercraft and will not be significantly impacted. This temporary deviation has been coordinated with waterway users. No objections were received.
In accordance with 33 CFR 117.35(e), each of these drawbridges must return to its regular operating schedule immediately at the end of the effective period of this temporary deviation. This deviation from the operating regulations is authorized under 33 CFR 117.35.
Coast Guard, DHS.
Notice of deviation from drawbridge regulation.
The Coast Guard has issued a temporary deviation from the operating schedule that governs the Pulaski Bridge across the Newtown Creek, mile 0.6, between Brooklyn and Queens, New York. This deviation is necessary to allow the bridge owner to adjust the span locks at the bridge.
This deviation is effective from 12:01 a.m. on November 8, 2016 to 5 a.m. on December 2, 2016.
The docket for this deviation, [USCG-2016-0948], is available at
If you have questions on this temporary deviation, call or email Judy Leung-Yee, Project Officer, First Coast Guard District, telephone (212) 514-4330, email
The Pulaski Bridge, mile 0.6, across the Newtown Creek, has a vertical clearance in the closed position of 39 feet at mean high water and 43 feet at mean low water. The existing bridge operating regulations are found at 33 CFR 117.801(g)(1).
The waterway is transited by commercial barge traffic of various sizes.
The bridge owner, New York City DOT, requested a temporary deviation from the normal operating schedule in order to adjust the span locks at the bridge.
Under this temporary deviation, the Pulaski Bridge shall remain in the closed position as follows:
November 8, 2016 between 12:01 a.m. and 5 a.m.
November 9, 2016 between 12:01 a.m. and 5 a.m.
November 10, 2016 between 12:01 a.m. and 5 a.m.
November 11, 2016 between 12:01 a.m. and 5 a.m.
November 15, 2016 between 12:01 a.m. and 5 a.m.
November 16, 2016 between 12:01 a.m. and 5 a.m.
November 17, 2016 between 12:01 a.m. and 5 a.m.
November 18, 2016 between 12:01 a.m. and 5 a.m.
November 22, 2016 between 12:01 a.m. and 5 a.m.
November 23, 2016 between 12:01 a.m. and 5 a.m.
November 24, 2016 between 12:01 a.m. and 5 a.m.
November 25, 2016 between 12:01 a.m. and 5 a.m.
November 29, 2016 between 12:01 a.m. and 5 a.m.
November 30, 2016 between 12:01 a.m. and 5 a.m.
December 1, 2016 between 12:01 a.m. and 5 a.m.
December 2, 2016 between 12:01 a.m. and 5 a.m.
Vessels able to pass under the bridge in the closed position may do so at any time. The bridge will not be able to open for emergencies, and there is no immediate alternate route for vessels to pass.
The Coast Guard will inform waterway users of the change in the bridge's operating schedule through our Local and Broadcast Notices to Mariners so that vessel operators can arrange their transits to minimize any impact caused by the temporary deviation. The Coast Guard notified the known commercial oil and barge vessel companies operating in the area, and they have no objections to the temporary deviation.
In accordance with 33 CFR 117.35(e), the drawbridge must return to its regular operating schedule immediately at the end of the effective period of this temporary deviation. This deviation from the operating regulations is authorized under 33 CFR 117.35.
Centers for Medicare & Medicaid Services (CMS), HHS.
Final rule; correction.
This document corrects typographical errors in the final rule that appeared in the August 22, 2016 Federal Register (81 FR 56761) and in the correcting document that appeared in the October 5, 2016 Federal Register (81 FR 68947).
Donald Thompson, (410) 786-4487.
In the final rule that appeared in the August 22, 2016 Federal Register (81 FR 56761) and the correcting document that appeared in the October 5, 2016 Federal Register (81 FR 68947), the following errors were identified and are corrected in this document:
On page 57105, we inadvertently made a typographical error in defining an MSA-dominant hospital.
On page 68953 in the table titled “CHANGE OF FY 2016 STANDARDIZED AMOUNTS TO THE FY 2017 STANDARDIZED AMOUNTS,” we inadvertently made a typographical error in the Labor figure for the “National Standardized Amount for FY 2017 if Wage Index is Greater than 1”.
On page 68955 in the table titled “Table 1A—NATIONAL ADJUSTED OPERATING STANDARDIZED AMOUNTS, LABOR/NONLABOR (69.6 PERCENT LABOR SHARE/30.4 PERCENT NONLABOR SHARE IF WAGE INDEX IS GREATER THAN 1)—FY 2017,” we inadvertently made a typographical error in the Nonlabor figure under the classification of “Hospital submitted quality data and is a meaningful EHR user (update = 1.65 percent)”.
On page 68958 in the table titled “FY 2017 IPPS ESTIMATED PAYMENTS DUE TO RURAL AND IMPUTED FLOOR WITH NATIONAL BUDGET NEUTRALITY,” we made errors in the alignment of the data in the fourth column titled “Difference (in $ millions)”. Specifically, when creating the table in the correcting document, the data in the fourth column was inadvertently misaligned starting with the entry for Washington, DC and continuing to the end, resulting in incorrect values in that column.
We ordinarily publish a notice of proposed rulemaking in the Federal Register to provide a period for public comment before the provisions of a rule take effect.
Section 553(d) of the APA ordinarily requires a 30-day delay in the effective date of final rules after the date of their publication in the Federal Register.
We believe that this correcting document does not constitute a rule that would be subject to the APA notice and comment or delayed effective date requirements. This correcting document corrects typographical errors in the FY 2017 IPPS/LTCH PPS final rule and the FY 2017 IPPS/LTCH PPS correcting document but does not make substantive changes to the policies or payment methodologies that were adopted in the final rule. As a result, this correcting document is intended to ensure that the information in the FY 2017 IPPS/LTCH PPS final rule accurately reflects the policies adopted in that final rule.
In addition, even if this were a rule to which the notice and comment procedures and delayed effective date requirements applied, we find that there is good cause to waive such requirements. Undertaking further notice and comment procedures to incorporate the corrections in this document into the final rule or delaying the effective date would be contrary to the public interest because it is in the public's interest for providers to receive appropriate payments in as timely a manner as possible, and to ensure that the FY 2017 IPPS/LTCH PPS final rule accurately reflects our policies. Furthermore, such procedures would be unnecessary, as we are not altering our payment methodologies or policies, but rather, we are simply implementing correctly the policies that we previously proposed, received comment on, and subsequently finalized. This correcting document is intended solely to ensure that the FY 2017 IPPS/LTCH PPS final rule accurately reflects these payment methodologies and policies. Therefore, we believe we have good cause to waive the notice and comment and effective date requirements.
In FR Doc. 2016-18476 of August 22, 2016 (81 FR 56761), we are making the following correction:
1. On page 57105, first column, first partial paragraph, lines 6 and 7, the phrase “total hospital's Medicare discharges” is corrected to read “total hospital Medicare discharges”.
In FR Doc. 2016-24042 of October 5, 2016 (81 FR 68947), we are making the following corrections:
1. On pages 68952 through 68954 in the table titled, “CHANGE OF FY 2016 STANDARDIZED AMOUNTS TO THE FY 2017 STANDARDIZED AMOUNTS”, the last entry on page 68953 is corrected to read as follows:
2. On page 68955, top of the page in the table titled, “Table 1A—NATIONAL ADJUSTED OPERATING STANDARDIZED AMOUNTS, LABOR/NONLABOR (69.6 PERCENT LABOR SHARE/30.4 PERCENT NONLABOR SHARE IF WAGE INDEX IS GREATER THAN 1)—FY 2017”, the first column of the table is corrected to read as follows:
3. On page 68958, top of the page, the table titled, “FY 2017 IPPS ESTIMATED PAYMENTS DUE TO RURAL AND IMPUTED FLOOR WITH NATIONAL BUDGET NEUTRALITY” is corrected to read as follows:
Legal Services Corporation.
Final rule, request for comments.
The Legal Services Corporation (LSC) is publishing for public comment a proposed final rule to implement the statutorily required amendments in the FOIA Improvement Act of 2016. LSC is also making technical changes to clarify the language and update the structure of its FOIA regulations.
The final rule is effective on December 15, 2016, unless LSC receives substantive adverse comments during the comment period. Written comments will be accepted until November 30, 2016.
You may submit comments by any of the following methods:
Helen Gerostathos Guyton, Assistant General Counsel, Legal Services Corporation, 3333 K Street NW., Washington, DC 20007, (202) 295-1632 (phone), (202) 337-6519 (fax),
LSC is subject to the FOIA by the terms of the Legal Services Corporation Act. 42 U.S.C. 2996d(g). LSC has implemented FOIA by adopting regulations that contain the rules and procedures LSC will follow in making its records available to the public. LSC last amended its FOIA regulations in 2008. 73 FR 67791, Dec. 31, 2008.
On June 30, 2016, President Obama signed into law the FOIA Improvement Act of 2016 (“2016 Amendments” or the “Act”). The Act codifies a number of transparency and openness principles and enacts housekeeping measures designed to facilitate FOIA requests and production. LSC must review its current regulations and issue revised regulations on procedures for the disclosure of records consistent with the Act no later than December 27, 2016. The revised regulations described in this final rule reflect the required changes prescribed by the Act. LSC also identified and proposed technical changes to clarify the language and update the structure of its FOIA regulations.
In light of the deadline established by Congress, LSC management requested that the Operations and Regulations Committee (Committee) recommend that the Board authorize expedited rulemaking and publication of this final rule. On October 16, 2016, the Committee considered the request and voted to make the recommendation to the Board. On October 18, 2016, the Board voted to authorize expedited rulemaking and the publication of this final rule.
There are no proposed changes to this section.
LSC modified several existing definitions, deleted one definition, and added five new definitions to make its regulations clearer. Specifically, LSC amended the Definitions section as follows:
LSC is making minor technical edits to clarify this section.
LSC is making minor technical edits to clarify this section.
This section sets out the process by which LSC makes available for public inspection the records described in the FOIA, 5 U.S.C. 552(a)(2). In the current version of its FOIA regulations, LSC sets out the specific categories of records that must be publicly disclosed. LSC is deleting those specific provisions and replacing them with a broader reference to § 552(a)(2) generally in anticipation of implementing the “Release to One, Release to All” policy.
The Department of Justice Office of Information Policy launched a pilot program as part of its Open Government Initiative called “Release to One, Release to All.” Under this policy, agencies would release FOIA processed records not only to a requester, but to the public at large by posting them online. LSC intends to comply with this policy immediately. As a result, it is revising the description of records in this section to track what LSC actually will be disclosing upon implementation of the “Release to One, Release to All” policy.
LSC is also making minor technical revisions to clarify this section.
LSC is adding a provision to this section that will provide requesters with onsite computer and printer access to electronic reading room records. This provision is consistent with federal agency practice and provides greater access to LSC's records to the public at large.
LSC is updating this section to reflect its current practice of maintaining its index of records electronically.
The current version of § 1602.8 includes provisions relating to the format of requests for records, the timing of responses, and the format of responses to requests. There are no subheadings to distinguish these provisions within the section, making it difficult to follow. To improve readability, LSC is restructuring § 1602.8 by limiting the section solely to provisions related to the format of FOIA requests. LSC is also adding a provision that informs requesters of their right to specify the preferred form or format for the records sought and that requires requesters to provide their contact information to assist LSC in communicating with them about their request.
This is a new section. As described in the discussion of § 1602.8, LSC determined that it would be clearer if the provisions for timing and responses to requests were contained in a separate section. LSC also is making technical changes to the language and structure to improve clarity. In addition, LSC is adding provisions describing the dispute resolution processes available to the public as required by the 2016 Amendments. These provisions describe when a requester may seek assistance, including dispute resolution services, from an LSC FOIA Public Liaison or the U.S. National Archives and Record Administration's Office of Government Information Services.
LSC is amending this section to incorporate the 2016 Amendments' codification of the Department of Justice's foreseeable harm standard, which requires LSC to withhold information only if disclosure would harm an interest protected by an exemption or is prohibited by law. It further obligates LSC to consider whether partial disclosure of information is possible when full disclosure is not and to take reasonable steps to segregate and release nonexempt information.
In addition, LSC is modifying its rule regarding the applicability of the deliberative process privilege, as required by the 2016 Amendments. The privilege now applies only to records created within 25 years of the date on which the records were requested.
Finally, LSC is adding exemptions 1, 8, and 9 from 5 U.S.C. 552(b) to its regulations. While these exemptions, which deal with national security, financial institutions, and geological information, generally do not apply to the work of LSC, their absence caused confusion because LSC's exemption numbers did not track the commonly used exemption numbers found in both the FOIA and case law. This change will eliminate any confusion.
LSC is deleting paragraph (a) of this section, which describes the role of the General Counsel in adequately and consistently applying the provisions of this part within LSC. The 2016 Amendments establish the role of the Chief FOIA Officer in ensuring compliance with FOIA, thereby superseding LSC's current regulations.
LSC is adding a provision to this section requiring it to include a provision in its denial decisions notifying the requester of his or her right to seek dispute resolution services from LSC's FOIA Public Liaison or the Office of Government Information Services.
LSC is making minor technical edits to clarify this section. LSC is also adding a provision required by the 2016 Amendments. This provision requires LSC to notify a requester of the mediation services offered by the Office of Government Information Services as a non-exclusive alternative to litigation.
LSC is adding a provision to this section that prohibits LSC from assessing fees if its response time is delayed, subject to limited exceptions described in the 2016 Amendments. LSC is also deleting references to the specific dollar amounts it will charge for search and reproduction costs because they are outdated and providing instead the web address for its FOIA page, which will contain current fee and cost schedules.
As previously described in the discussion of § 1602.2's definition of the term
LSC is further modifying this section to include a right to appeal to the Inspector General for Office of Inspector General-related requests, as the current regulations do not provide a mechanism to do so.
Finally, LSC is clarifying an ambiguous provision that requires a submitter to provide to LSC, within seven business days, a statement objecting to disclosure of his or her information. LSC must receive the submitter's statement within seven business days of the date of LSC's notice to the submitter.
Freedom of Information.
42 U.S.C. 2996g(e)
This part contains the rules and procedures the Legal Services Corporation (LSC) follows in making records available to the public under the Freedom of Information Act.
(a)
(b)
(c)
(d)
(e)
(f)
(g)
(h)
(i)
(j)
(k)
(l)
(m)
(n)
(o)
LSC will make records concerning its operations, activities, and business available to the public to the maximum extent reasonably possible. LSC will withhold records from the public only in accordance with the FOIA and this part. LSC will disclose records otherwise exempt from disclosure under the FOIA when disclosure is not prohibited by law and disclosure would not foreseeably harm a legitimate interest of the public, LSC, a recipient, or any individual.
LSC routinely publishes in the Federal Register the information described in 5 U.S.C. 552(a)(1).
(a) LSC will maintain a public reading room at its offices at 3333 K St. NW., Washington, DC 20007. This room will be supervised and will be open to the public during LSC's regular business hours. Procedures for use of the public reading room are described in § 1602.6. LSC also maintains an electronic public reading room that may be accessed at
(b) Subject to the limitation stated in paragraph (c), LSC will make available for public inspection in its electronic public reading room the records described in 5 U.S.C. 552(a)(2).
(c) Certain records otherwise required by FOIA to be available in the public reading room may be exempt from mandatory disclosure pursuant to 5 U.S.C. 552(b) and § 1602.10. LSC will not make such records available in the public reading room. LSC may edit other records maintained in the reading room by redacting details about individuals to prevent clearly unwarranted invasions of personal privacy. In such cases, LSC will attach
(a) A person who wishes to inspect or copy records in the public reading room should arrange a time in advance, by telephone or letter request made to the Office of Legal Affairs, Legal Services Corporation, 3333 K Street NW., Washington, DC 20007 or by email to
(1) In appropriate circumstances, LSC will advise persons making telephonic requests to use the public reading room that a written request would aid in the identification and expeditious processing of the records sought.
(2) Written requests should identify the records sought in the manner provided in § 1602.8(b) and should request a specific date for inspecting the records.
(b) LSC will advise the requester as promptly as possible if, for any reason, it is not feasible to make the records sought available on the date requested.
(c) A computer terminal and printer are available upon request in the public reading room for accessing Electronic Reading Room records.
LSC will maintain and make available for public inspection in an electronic format a current index identifying any matter within the scope of §§ 1602.4 and 1602.5(b).
(a) LSC will make its records promptly available, upon request, to any person in accordance with this section, unless:
(1) The FOIA requires the records to be published in the Federal Register; or
(2) LSC determines that such records should be withheld and are exempt from mandatory disclosure under the FOIA and § 1602.10.
(b)(1)
(2)
(3) Any request not marked and addressed as specified in this section will be so marked by LSC personnel as soon as it is properly identified, and will be forwarded immediately to the appropriate Office. A request improperly addressed will be deemed to have been received in accordance with § 1602.9 only when it has been received by the appropriate Office. Upon receipt of an improperly addressed request, the Chief FOIA Officer, Office of Inspector General Legal Counsel, or their designees shall notify the requester of the date on which the time period began.
(c) A request must reasonably describe the records requested so that employees of LSC who are familiar with the subject area of the request are able, with a reasonable amount of effort, to determine which particular records are within the scope of the request. Before submitting their requests, requesters may contact LSC's or OIG's FOIA Analyst or FOIA Public Liaison to discuss the records they seek and to receive assistance in describing the records. If LSC determines that a request does not reasonably describe the records sought, LSC will inform the requester what additional information is needed or why the request is otherwise insufficient. Requesters who are attempting to reformulate or modify their request may discuss their request with LSC's or OIG's FOIA Analyst or FOIA Public Liaison. If a request does not reasonably describe the records sought, LSC's response to the request may be delayed.
(d) To facilitate the location of records by LSC, a requester should try to provide the following kinds of information, if known:
(1) The specific event or action to which the record refers;
(2) The unit or program of LSC which may be responsible for or may have produced the record;
(3) The date of the record or the date or period to which it refers or relates;
(4) The type of record, such as an application, a grant, a contract, or a report;
(5) Personnel of LSC who may have prepared or have knowledge of the record;
(6) Citations to newspapers or publications which have referred to the record.
(e) Requests may specify the preferred form or format (including electronic formats) for the records sought. LSC will provide records in the form or format indicated by the requester to the extent such records are readily reproducible in the requested form or format. LSC reserves the right to limit the number of copies of any document that will be provided to any one requester or to require that special arrangements for duplication be made in the case of bound volumes or other records representing unusual problems of handling or reproduction.
(f) Requesters must provide contact information, such as their phone number, email address, and/or mailing address, to assist LSC in communicating with them and providing released records.
(g) LSC is not required to create a record or to perform research to satisfy a request.
(h) Any request for a waiver or reduction of fees should be included in the FOIA request, and any such request should indicate the grounds for a waiver or reduction of fees, as set out in § 1602.14(g). LSC shall respond to such request as promptly as possible.
(a)(1)(i) Upon receiving a request for LSC or Inspector General records under § 1602.8, the Chief FOIA Officer, Office of Inspector General Legal Counsel or their designees shall make an initial determination of whether to comply with or deny such request. The Chief FOIA Officer, Office of Inspector General Legal Counsel or their designees will send the determination to the requester within 20 business days after receipt of the request and will notify the requester of their right to seek assistance from an LSC FOIA Public Liaison.
(ii) If the processing Office determines that a request or portion thereof is for the other Office's records, the processing Office shall promptly refer the request or portion thereof to the appropriate Office and send notice of such referral to the requester.
(2) The 20-day period under paragraph (a)(1)(i) of this section shall commence on the date on which the request is first received by the appropriate Office, but in no event later than 10 working days after the request has been received by either the Office of Legal Affairs or the Office of Inspector General. The processing Office may toll the 20-day period while:
(i) It is awaiting such information that it has reasonably requested from the requester under this section; or
(ii) It communicates with the requester to clarify issues regarding fee assessment. In either case, the processing Office's receipt of the requester's response to such a request for information or clarification ends the tolling period.
(b)(1) In unusual circumstances, as specified in paragraph (b)(3) of this section, LSC may extend the time limit for up to 10 working days by written notice to the requester setting forth the reasons for such extension and the date on which LSC expects to send its determination.
(2) If a request is particularly broad or complex so that it cannot be completed within the time periods stated in paragraph (a)(1)(i) of this section, LSC may ask the requester to narrow the request or agree to an additional delay. In addition, to aid the requester, LSC shall make available a FOIA Public Liaison, who shall assist in the resolution of any disputes between the requester and LSC, and shall notify the requester of his right to seek dispute resolution services from the U.S. National Archives and Records Administration's Office of Government Information Services.
(3) As used in this paragraph (b), “unusual circumstances” means, but only to the extent reasonably necessary to the proper processing of the particular request:
(i) The need to search for and collect the requested records from establishments that are separate from the office processing the request;
(ii) The need to search for, collect, and appropriately examine a voluminous amount of separate and distinct records which are demanded in a single request; or
(iii) The need for consultation, which shall be conducted with all practicable speed, with another agency or organization, such as a recipient, having a substantial interest in the determination of the request or among two or more components of LSC having substantial subject matter interest therein.
(c)(1) When the processing Office cannot send a determination to the requester within the applicable time limit, the Chief FOIA Officer, Office of the Inspector General Legal Counsel, or their designees shall inform the requester of the reason for the delay, the date on which the processing Office expects to send its determination, and the requester's right to treat the delay as a denial and to appeal to LSC's President or Inspector General, in accordance with § 1602.13, or to seek dispute resolution services from a FOIA Public Liaison or the Office of Government Information Services.
(2) If the processing Office has not sent its determination by the end of the 20-day period or the last extension thereof, the requester may deem the request denied, and exercise a right of appeal in accordance with § 1602.13, or seek dispute resolution services from LSC's or OIG's FOIA Public Liaison or the National Archives and Records Administration's Office of Government Information Services. The Chief FOIA Officer, Office of Inspector General Legal Counsel, or their designees may ask the requester to forego appeal until a determination is made.
(d) After the processing Office determines that a request will be granted, LSC or the OIG will act with due diligence in providing a substantive response.
(e)(1) LSC will process requests and appeals on an expedited basis whenever it determines that a compelling need exists. A compelling need exists in the following circumstances:
(i) Circumstances in which the lack of expedited treatment could reasonably be expected to pose an imminent threat to the life or physical safety of an individual;
(ii) An urgency to inform the public about an actual or alleged LSC activity and the request is made by a person primarily engaged in disseminating information;
(iii) The loss of substantial due process rights; or
(iv) A matter of widespread and exceptional media interest raising questions about LSC's integrity which may affect public confidence in LSC.
(2) A request for expedited processing may be made at the time of the initial request for records or at any later time. For a prompt determination, a request for expedited processing must be properly addressed and marked and received by LSC pursuant to § 1602.8.
(3) A requester who seeks expedited processing must submit a statement demonstrating a compelling need and explaining in detail the basis for requesting expedited processing. The requester must certify that the statement is true and correct to the best of the requester's knowledge and belief.
(4) Within 10 calendar days of receiving a request for expedited processing, the Chief FOIA Officer, Office of Inspector General Legal Counsel or their designees shall decide whether to grant the request and shall notify the requester of the decision. If a request for expedited treatment is granted, the request shall be given priority and shall be processed as soon as practicable. If a request for expedited processing is denied, the requester may appeal in writing to LSC's President or Inspector General in the format described in § 1602.13(a). Any appeal of a denial for expedited treatment shall be acted on expeditiously by LSC.
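As an informal illustration of the time limits in this section, the sketch below computes a response due date from the 20-business-day period in paragraph (a)(1)(i), an extension of up to 10 working days under paragraph (b)(1), and any tolled time under paragraph (a)(2). It is not part of the rule text: the function names are hypothetical, weekends are the only non-business days assumed, and working days are treated the same as business days.

```python
from datetime import date, timedelta

def add_business_days(start, n, holidays=()):
    """Advance `n` business days from `start`, skipping weekends and any supplied holidays."""
    d, added = start, 0
    while added < n:
        d += timedelta(days=1)
        if d.weekday() < 5 and d not in holidays:
            added += 1
    return d

def response_due(received, extension_days=0, tolled_days=0, holidays=()):
    """Estimate when the initial determination is due: 20 business days after
    receipt, plus up to 10 more days in unusual circumstances, plus any days
    the clock was tolled while awaiting information from the requester."""
    total = 20 + min(extension_days, 10) + tolled_days
    return add_business_days(received, total, holidays)

# A request received Tuesday, November 1, 2016, with a full 10-day extension and
# no tolling, would be due December 13, 2016 (federal holidays ignored here).
print(response_due(date(2016, 11, 1), extension_days=10))  # 2016-12-13
```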
(a) LSC shall—
(1) Withhold information under this section only if—
(i) LSC reasonably foresees that disclosure would harm an interest protected by an exemption described in paragraph (b); or
(ii) Disclosure is prohibited by law; and
(2)(i) Consider whether partial disclosure of information is possible whenever LSC determines that a full disclosure of a requested record is not possible; and
(ii) Take reasonable steps necessary to segregate and release nonexempt information; and
(b) LSC may withhold a requested record from public disclosure only if one or more of the following exemptions authorized by the FOIA apply:
(1)(i) Matter that is specifically authorized under criteria established by an Executive order to be kept secret in the interest of national defense or foreign policy and
(ii) Is in fact properly classified pursuant to such Executive Order;
(2) Matter that is related solely to the internal personnel rules and practices of LSC;
(3) Matter that is specifically exempted from disclosure by statute (other than the exemptions under FOIA at 5 U.S.C. 552(b)), provided that such statute requires that the matters be withheld from the public in such a manner as to leave no discretion on the issue, or establishes particular criteria for withholding, or refers to particular types of matters to be withheld;
(4) Trade secrets and commercial or financial information obtained from a person and privileged or confidential;
(5) Inter-agency or intra-agency memoranda or letters that would not be available by law to a party other than an agency in litigation with the Corporation, provided that the deliberative process privilege shall not apply to records created 25 years or more before the date on which the records were requested;
(6) Personnel and medical files and similar files, the disclosure of which would constitute a clearly unwarranted invasion of personal privacy;
(7) Records or information compiled for law enforcement purposes, including enforcing the Legal Services Corporation Act or any other law, but only to the extent that the production of such law enforcement records or information:
(i) Could reasonably be expected to interfere with enforcement proceedings;
(ii) Would deprive a person or a recipient of a right to a fair trial or an impartial adjudication;
(iii) Could reasonably be expected to constitute an unwarranted invasion of personal privacy;
(iv) Could reasonably be expected to disclose the identity of a confidential source, including a State, local, or foreign agency or authority or any private institution that furnished information on a confidential basis, and in the case of a record or information compiled by a criminal law enforcement authority in the course of a criminal investigation, information furnished by a confidential source;
(v) Would disclose techniques and procedures for law enforcement investigations or prosecutions, or would disclose guidelines for law enforcement investigations or prosecutions if such disclosure could reasonably be expected to risk circumvention of the law; or
(vi) Could reasonably be expected to endanger the life or physical safety of any individual;
(8) Matter that is contained in or related to examination, operating, or condition reports prepared by, on behalf of, or for the use of an agency responsible for the regulation or supervision of financial institutions; or
(9) Geological and geophysical information and data, including maps, concerning wells.
(c) In the event that one or more of the exemptions in paragraph (b) of this section applies, any reasonably segregable portion of a record shall be provided to the requester after redaction of the exempt portions. The amount of information redacted and the exemption under which the redaction is being made shall be indicated on the released portion of the record, unless doing so would harm the interest protected by the exemption under which the redaction is made. If technically feasible, the amount of information redacted and the exemption under which the redaction is being made shall be indicated at the place in the record where the redaction occurs.
(d) No requester shall have a right to insist that any or all of the techniques in paragraph (c) of this section should be employed in order to satisfy a request.
(e) Records that may be exempt from disclosure pursuant to paragraph (b) of this section may be made available at the discretion of the LSC official authorized to grant or deny the request for records, after appropriate consultation as provided in § 1602.11. Records may be made available pursuant to this paragraph when disclosure is not prohibited by law and does not appear adverse to legitimate interests of LSC, the public, a recipient, or any person.
(a) The Chief FOIA Officer, Office of Inspector General Legal Counsel or their designees are authorized to grant or deny requests under this part. In the absence of an Office of Inspector General Legal Counsel, the Inspector General shall name a designee who will be authorized to grant or deny requests under this part and who will perform all other functions of the Office of Inspector General Legal Counsel under this part.
(b)(1) The Chief FOIA Officer or designee shall consult with the Office of Inspector General Legal Counsel or designee prior to granting or denying any request for records or portions of records which originated with the OIG, or which contain information which originated with the OIG, but which are maintained by other components of LSC.
(2) The Office of Inspector General Legal Counsel or designee shall consult with the Chief FOIA Officer or designee prior to granting or denying any request for records or portions of records which originated with any component of LSC other than the OIG, or which contain information which originated with a component of LSC other than the OIG, but which are maintained by the OIG.
(a) A denial of a written request for a record that complies with the requirements of § 1602.8 shall be in writing and shall include the following:
(1) A reference to the applicable exemption or exemptions in § 1602.10(b) upon which the denial is based;
(2) An explanation of how the exemption applies to the requested records;
(3) A statement explaining why it is deemed unreasonable to provide segregable portions of the record after deleting the exempt portions;
(4) An estimate of the volume of requested matter denied unless providing such estimate would harm the interest protected by the exemption under which the denial is made;
(5) The name and title of the person or persons responsible for denying the request;
(6) An explanation of the right to appeal the denial and of the procedures for submitting an appeal, as described in § 1602.13, including the address of the official to whom appeals should be submitted; and
(7) An explanation of the right of the requester to seek dispute resolution services from a FOIA Public Liaison or the Office of Government Information Services.
(b) Whenever LSC makes a record available subject to the deletion of a portion of the record, such action shall be deemed a denial of a record for purposes of paragraph (a) of this section.
(c) All denials shall be treated as final opinions under § 1602.5(b)(1).
(a) Any person whose written request has been denied is entitled to appeal the denial within 90 days of the date of the response by writing to the President of LSC or, in the case of a denial of a request for OIG records, the Inspector General, at the mailing or email addresses given in § 1602.8(b)(1) and (2). The envelope and letter or email appeal should be clearly marked: “Freedom of Information Appeal.” An appeal need not be in any particular form, but should adequately identify the denial, if possible, by describing the requested record, identifying the official who issued the denial, and providing the date on which the denial was issued.
(b) No personal appearance, oral argument, or hearing will ordinarily be permitted on appeal of a denial. Upon request and a showing of special circumstances, however, this limitation may be waived and an informal conference may be arranged with the President, Inspector General or their designees for this purpose.
(c) The decision of the President or the Inspector General on an appeal shall be in writing and, in the event the denial is in whole or in part upheld, shall contain an explanation responsive to the arguments advanced by the requester, the matters described in § 1602.12(a)(1) through (4), and the provisions for judicial review of such decision under 5 U.S.C. 552(a)(4). The decision must also notify the requester of the mediation services offered by the Office of Government Information Services as a non-exclusive alternative to litigation.
(d) LSC will send its decision to the requester within 20 business days after receipt of the appeal, unless an additional period is justified due to unusual circumstances, as described in § 1602.9, in which case LSC may extend the time limit for up to 10 working days by written notice to the requester setting forth the reasons for such extension and the date on which LSC expects to send its determination. The decision of the President or the Inspector General shall constitute the final action of LSC. All such decisions shall be treated as final opinions under § 1602.5(b)(1).
(e) On an appeal, the President or designee shall consult with the OIG prior to reversing in whole or in part the denial of any request for records or portions of records which originated with the OIG, or which contain information which originated with the OIG, but which are maintained by other components of LSC. The Inspector General or designee shall consult with the President prior to reversing in whole or in part the denial of any request for records or portions of records which originated with LSC, or which contain information which originated with LSC, but which are maintained by the OIG.
(a) LSC will not charge fees for information routinely provided in the normal course of doing business.
(b)(1) When records are requested for commercial use, LSC shall limit fees to reasonable standard charges for document search, review, and duplication.
(2) LSC shall not assess any search fees (or, if the requester is a representative of the news media, duplication fees) if LSC has failed to comply with the time limits set forth in § 1602.9 and no unusual circumstances, as defined in that section, apply.
(3)(i) If LSC has determined that unusual circumstances as defined in § 1602.9 apply and LSC has provided timely written notice to the requester in accordance with § 1602.9(b)(1), a failure described in § 1602.9(c)(2) is excused for an additional 10 days. If LSC fails to comply with the extended time limit, LSC may not assess any search fees (or, if the requester is a representative of the news media, duplication fees).
(ii) If LSC has determined that unusual circumstances as defined in § 1602.9 apply and more than 5,000 pages are necessary to respond to the request, LSC may charge search fees or duplication fees if LSC has provided a timely written notice to the requester in accordance with § 1602.9 and LSC has discussed with the requester via written mail, electronic mail, or telephone (or made not less than three good-faith attempts to do so) how the requester could effectively limit the scope of the request in accordance with § 1602.9.
(c) When records are sought by a representative of the news media or by an educational or non-commercial scientific institution, LSC shall limit fees to reasonable standard charges for document duplication after the first 100 pages; and
(d) For all other requests, LSC shall limit fees to reasonable standard charges for search time after the first 2 hours and duplication after the first 100 pages.
(e) The schedule of charges and fees for services regarding the production or disclosure of the Corporation's records may be viewed on LSC's FOIA home page at
(f) LSC may charge for time spent searching even if it does not locate any responsive records or it withholds the records located as exempt from disclosure.
(g) LSC will furnish records without charge or at a reduced charge when it determines that disclosure of the information is in the public interest because it is likely to contribute significantly to public understanding of the operations or activities of LSC and is not primarily in the commercial interest of the requester.
(1) In order to determine whether disclosure of the information is in the public interest because it is likely to contribute significantly to public understanding of the operations or activities of LSC, LSC shall consider the following four factors:
(i) The subject of the request: Whether the subject of the requested records concerns “the operations or activities of LSC.” The subject of the requested records must concern identifiable operations or activities of LSC, with a connection that is direct and clear, not remote or attenuated.
(ii) The informative value of the information to be disclosed: Whether the disclosure is “likely to contribute” to an understanding of LSC operations or activities. The requested records must be meaningfully informative about LSC operations or activities in order to be likely to contribute to an increased public understanding of those operations or activities. The disclosure of information that is already in the public domain, in either a duplicative or a substantially identical form, would not be likely to contribute to such understanding where nothing new would be added to the public's understanding.
(iii) The contribution to an understanding of the subject by the public likely to result from disclosure: Whether disclosure of the requested records will contribute to “public understanding.” The disclosure must contribute to a reasonably broad audience of persons interested in the subject, as opposed to the personal interest of the requester. A requester's expertise in the subject area and ability and intention to effectively convey information to the public shall be considered. LSC shall presume that a representative of the news media will satisfy this consideration.
(iv) The significance of the contribution to public understanding: Whether the disclosure is likely to contribute “significantly” to public understanding of LSC operations or activities. The disclosure must enhance the public's understanding of the subject in question to a significant extent.
(2) In order to determine whether disclosure of the information is not primarily in the commercial interest of the requester, LSC will consider the following two factors:
(i) The existence and magnitude of a commercial interest: Whether the requester has a commercial interest that would be furthered by the requested disclosure. LSC shall consider any commercial interest of the requester (with reference to the definition of “commercial use” in this part) or of any person on whose behalf the requester may be acting, that would be furthered by the requested disclosure.
(ii) The primary interest in disclosure: Whether the magnitude of the identified commercial interest is sufficiently large, in comparison with the public interest in disclosure, that disclosure is “primarily” in the commercial interest of the requester. A fee waiver or reduction is justified where the public interest is of greater magnitude than is any identified commercial interest in disclosure. LSC ordinarily shall presume that where a news media requester has satisfied the public interest standard, the public interest will be the interest primarily served by disclosure to that requester. Disclosure to data brokers or others who merely compile and market government information for direct economic return shall not be presumed to primarily serve the public interest.
(3) Where LSC has determined that a fee waiver or reduction request is justified for only some of the records to be released, LSC shall grant the fee waiver or reduction for those records.
(4) Requests for fee waivers and reductions shall be made in writing and must address the factors listed in this paragraph as they apply to the request.
(h) Requesters must agree to pay all fees charged for services associated with their requests. LSC will assume that requesters agree to pay all charges for services associated with their requests up to $25 unless otherwise indicated by the requester. For requests estimated to exceed $25, LSC will consult with the requester prior to processing the request, and such requests will not be deemed to have been received by LSC until the requester agrees in writing to pay all fees charged for services.
(i) No requester will be required to make an advance payment of any fee unless:
(1) The requester has previously failed to pay a required fee within 30 days of the date of billing, in which case an advance deposit of the full amount of the anticipated fee together with the fee then due plus interest accrued may be required (and the request will not be deemed to have been received by LSC until such payment is made); or
(2) LSC determines that an estimated fee will exceed $250, in which case the requester shall be notified of the amount of the anticipated fee or such portion thereof as can readily be estimated. Such notification shall be transmitted as soon as possible, but in any event within five working days of receipt by LSC, giving the best estimate then available. The notification shall offer the requester the opportunity to confer with appropriate representatives of LSC for the purpose of reformulating the request so as to meet the needs of the requester at a reduced cost. The request will not be deemed to have been received by LSC for purposes of the initial 20-day response period until the requester makes a deposit on the fee in an amount determined by LSC.
(j) Interest may be charged to those requesters who fail to pay the fees charged. Interest will be assessed on the amount billed, starting on the 31st day following the day on which the billing was sent. The rate charged will be as prescribed in 31 U.S.C. 3717.
(k) If LSC reasonably believes that a requester or group of requesters is attempting to break a request into a series of requests for the purpose of evading the assessment of fees, LSC shall aggregate such requests and charge accordingly. Likewise, LSC will aggregate multiple requests for documents received from the same requester within 45 days.
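The fee limits in paragraphs (b) through (d) of this section can be read as a small decision table keyed to the requester category. The sketch below is a hypothetical illustration only: the rate constants are placeholders rather than LSC's actual fee schedule (which is posted on its FOIA home page), and the function name and inputs are assumptions.

```python
# Placeholder rates for illustration; LSC's actual schedule is posted on its FOIA home page.
SEARCH_RATE_PER_HOUR = 20.00
REVIEW_RATE_PER_HOUR = 20.00
DUPLICATION_RATE_PER_PAGE = 0.10

def estimate_fee(category, search_hours, review_hours, pages,
                 time_limits_missed=False, unusual_circumstances=False):
    """Rough estimate of chargeable services by requester category under paragraphs (b)-(d)."""
    # Paragraph (b)(2): no search fees (or, for news media, duplication fees)
    # if the time limits were missed and no unusual circumstances apply.
    waive_for_delay = time_limits_missed and not unusual_circumstances

    if category == "commercial":
        search = 0 if waive_for_delay else search_hours * SEARCH_RATE_PER_HOUR
        return search + review_hours * REVIEW_RATE_PER_HOUR + pages * DUPLICATION_RATE_PER_PAGE

    if category in ("news_media", "educational", "noncommercial_scientific"):
        billable_pages = max(0, pages - 100)  # first 100 pages free under paragraph (c)
        if category == "news_media" and waive_for_delay:
            billable_pages = 0
        return billable_pages * DUPLICATION_RATE_PER_PAGE

    # All other requesters, paragraph (d): search after the first 2 hours,
    # duplication after the first 100 pages.
    billable_hours = 0 if waive_for_delay else max(0, search_hours - 2)
    return billable_hours * SEARCH_RATE_PER_HOUR + max(0, pages - 100) * DUPLICATION_RATE_PER_PAGE

# Example: an "other" requester, 5 search hours and 250 pages:
print(estimate_fee("other", 5, 0, 250))  # (5 - 2) * 20.00 + (250 - 100) * 0.10 = 75.0
```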
(a) When LSC receives a FOIA request seeking the release of confidential commercial information, LSC shall provide prompt written notice of the request to the submitter in order to afford the submitter an opportunity to object to the disclosure of the requested confidential commercial information. The notice shall reasonably describe the confidential commercial information requested and inform the submitter of the process required by paragraph (b) of this section.
(b) If a submitter who has received notice of a request for the submitter's confidential commercial information wishes to object to the disclosure of the confidential commercial information, the submitter must provide LSC with a detailed written statement identifying the information which it objects to LSC disclosing. The submitter must send its objections to the Office of Legal Affairs or, if it pertains to Office of Inspector General records, to the Office of Inspector General, and must specify the grounds for withholding the information under FOIA or this part. In particular, the submitter must demonstrate why the information is commercial or financial information that is privileged or confidential. The submitter's statement must be received by LSC within seven business days of the date of the notice from LSC. If the submitter fails to respond to the notice from LSC within that time, LSC will deem the submitter to have no objection to the disclosure of the information.
(c) Upon receipt of written objection to disclosure by a submitter, LSC shall consider the submitter's objections and specific grounds for withholding in deciding whether to release the disputed information. Whenever LSC decides to disclose information over the objection of the submitter, LSC shall give the submitter written notice which shall include:
(1) A description of the information to be released and a notice that LSC intends to release the information;
(2) A statement of the reason(s) why the submitter's request for withholding is being rejected; and
(3) Notice that the submitter shall have five business days from the date of the notice of proposed release to appeal that decision to the LSC President or Inspector General (as provided in § 1602.13(c)), whose decision shall be final.
(d) The requirements of this section shall not apply if:
(1) LSC determines upon initial review of the requested confidential commercial information that the requested information should not be disclosed;
(2) The information has been previously published or officially made available to the public; or
(3) Disclosure of the information is required by statute (other than FOIA) or LSC's regulations.
(e) Whenever a requester files a lawsuit seeking to compel disclosure of a submitter's information, LSC shall promptly notify the submitter.
(f) Whenever LSC provides a submitter with notice and opportunity to oppose disclosure under this section, LSC shall notify the requester that the submitter's rights process under this section has been triggered. Likewise, whenever a submitter files a lawsuit seeking to prevent the disclosure of the submitter's information, LSC shall notify the requester.
Federal Communications Commission.
Final rule.
The Federal Communications Commission addresses the remaining petitions for reconsideration of the
Effective November 30, 2016.
Jay Whaley, 202-418-7184, or if concerning the information collections in this document, Cathy Williams, 202-418-2918.
This is a summary of the Commission's
In the
We revise section 25.157(e) of the current rules to eliminate the requirement that the Commission withhold spectrum for use in a subsequent processing round if fewer than three qualified applicants file applications in the initial processing round, known as the “three-licensee presumption.” We find that the “three-licensee presumption” is overly restrictive for its intended purpose. We agree with petitioners that a specific frequency band does not necessarily equate to a market, and thus having fewer than three licensees in a band does not necessarily indicate a harmful lack of competition in some market that we should attempt to remedy. We find that it is common for licensees in different bands to compete with each other in the provision of satellite-based services in broader markets, and we note that numerous NGSO-like system operators currently compete across frequency bands.
We also recognize that in cases where one or more applicants in a processing round request less spectrum than they would be assigned if all the available spectrum were divided equally among all the qualified applicants, some spectrum would remain unassigned; thus, we retain the procedure that the Commission adopted in the
We clarify the procedures that apply when we redistribute spectrum among the remaining NGSO-like systems after an authorization for a NGSO-like system has been canceled or otherwise becomes available. This redistribution procedure applies only in cases where spectrum was granted pursuant to a processing round, and one or more of those grants of spectrum is lost or surrendered for any reason. In these cases, the Commission will issue a public notice or order announcing the loss or surrender of such spectrum, and will then propose to modify the remaining grants to redistribute the returned spectrum among the remaining system operators that have requested use of the spectrum. The returned spectrum will generally be redistributed equally among the remaining operators that requested the spectrum, although no operator will receive more spectrum on redistribution than it requested in its application. Additionally, if an operator has not requested use of a particular spectrum band, it will not receive spectrum in that band. If the Commission is unable to make a finding that there will be reasonably efficient use of the spectrum, we will consider on a case-by-case basis whether to open a new processing round for the returned spectrum, leave it unassigned at that point, or repurpose it for another use.
In the
In the
Under section 25.159(d) of the rules, adopted in the
SES Americom (SES) maintains that the Commission should not consider a licensee's relinquishing a license prior to the contract execution milestone in determining whether to impose the limit on satellite applications and/or unbuilt satellites on that licensee. As an initial matter, we note that the milestone rules have been revised in the
SIA asserts that it is unclear in the
In its Petition, Hughes asserts that the limit on pending applications and licensed-but-unlaunched satellites is not necessary for those orbital locations not covering the United States.
The purpose of the safeguards in section 25.159 of the Commission's rules is not to reduce the number of satellite applications to a “reasoned and measured” level. Rather, the Commission intended the safeguards to discourage speculators from applying for satellite licenses, thereby precluding another applicant from obtaining a license, constructing a satellite, and providing service to the customers. Hughes assumes that, because fewer applications are filed outside of the arc from 60° W.L. to 140° W.L. than within that arc, speculation is not a concern. Although demand may not be as great for locations that cannot serve large portions of the United States, we have licensed many satellites at orbital locations in this portion of the arc that are subject to competition. We have also granted U.S. market access to many non-U.S.-licensed satellites operating at those locations to provide services to U.S. customers. Thus, allowing operators to hold these orbital locations while they decide whether to proceed with implementation could preclude other operators whose plans also involve providing international service from going forward. For these reasons, we will continue to apply the safeguards against speculation, including the bond requirement, where appropriate, regardless of orbital location.
In its petition for reconsideration, ICO asserts that the
The Commission eliminated the anti-trafficking rule to allow NGSO-like licensees in modified processing rounds to acquire rights to operate on additional spectrum from other licensees if they feel it is necessary to meet their business needs. It would be inefficient to require these licensees to build two incompatible satellite networks, each operating in only part of the spectrum rights that the licensee is authorized to use. We therefore clarify that NGSO-like licensees acquiring spectrum rights from other NGSO-like licensees are permitted to build a single, integrated NGSO-like system operating on all authorized frequency bands, under a single milestone schedule. These cases are inherently fact-specific, and so we decline to adopt a blanket approach about the milestone schedule that would apply in these cases.
Under the terms of the World Trade Organization (WTO) Agreement on
In the
According to SIA, the rule revisions adopted in the
Further, in the
In the
The Commission discussed international coordination issues in the
Hughes notes that the rule revisions adopted in the
In the
As required by the Regulatory Flexibility Act (RFA), an Initial Regulatory Flexibility Analysis (IRFA) was incorporated in the
This document does not contain new or modified information collection requirements subject to the Paperwork Reduction Act of 1995, Public Law 104-13. Therefore it does not contain any new or modified “information burden for small business concerns with fewer than 25 employees” pursuant to the Small Business Paperwork Relief Act of 2002, Public Law 107-198. Thus, on October 14, 2016, the Office of Management and Budget (OMB) determined that the rule changes in this document are non-substantive changes to the currently approved collection, OMB Control Number 3060-0678. ICR Reference Number: 201610-3060-011.
Pursuant to the Small Business Paperwork Relief Act of 2002, Public Law 107-198, see 44 U.S.C. 3506(c)(4), we previously sought specific comment on how the Commission might further reduce the information collection burden for small business concerns with fewer than 25 employees. We received no comments on this issue. We have assessed the effects of the revisions adopted that might impose information collection burdens on small business concerns, and find that the impact on businesses with fewer than 25 employees will be an overall reduction in burden. The amendments adopted in this
The Commission will send copies of this Second Order on Reconsideration to Congress and the Government Accountability Office pursuant to the Congressional Review Act, 5 U.S.C. 801(a)(1)(A).
The effective date for the rules adopted in this
This Order adopts minor changes to part 25 of the Commission's rules, which governs licensing and operation of space stations and earth stations for the provision of satellite communication services.
This
(1) Eliminate the “three-licensee presumption” that applies to the NGSO-like processing round procedure, and also revise the procedures that we will apply when we redistribute spectrum among remaining NGSO-like licensees when a license is cancelled for any reason.
(2) Clarify that non-U.S.-satellite operators may notify the Commission of a change of ownership after the transfer takes place.
No party filing comments in this proceeding responded to the IRFA, and no party filing comments in this proceeding otherwise argued that the policies and rules proposed in this proceeding would have a significant economic impact on a substantial number of small entities. The Commission has, nonetheless, considered any potential significant economic impact that the rule changes may have on the affected small entities. On balance, the Commission believes that the economic impact on small entities will be positive rather than negative, and that the rule changes serve to streamline the part 25 requirements.
Pursuant to the Small Business Jobs Act of 2010, the Commission is required to respond to any comments filed by the Chief Counsel for Advocacy of the Small Business Administration, and to provide a detailed statement of any change made to the proposed rules as a result of those comments. The Chief Counsel did not file any comments in response to the proposed rules in this proceeding.
The RFA directs agencies to provide a description of, and, where feasible, an estimate of, the number of small entities that may be affected by the rules adopted herein. The RFA generally defines the term “small entity” as having the same meaning as the terms “small business,” “small organization,” and “small governmental jurisdiction.” In addition, the term “small business” has the same meaning as the term “small business concern” under the Small Business Act. A small business concern is one which: (1) Is independently owned and operated; (2) is not dominant in its field of operation; and (3) satisfies any additional criteria established by the Small Business Administration (SBA). Below, we describe and estimate the number of small entity licensees that may be affected by the adopted rules.
The rules adopted in this Order will affect some providers of satellite telecommunications services. Satellite telecommunications service providers include satellite and earth station operators. Since 2007, the SBA has recognized two census categories for satellite telecommunications firms: “Satellite Telecommunications” and “Other Telecommunications.” Under the “Satellite Telecommunications” category, a business is considered small if it had $32.5 million or less in annual receipts. Under the “Other Telecommunications” category, a business is considered small if it had $32.5 million or less in annual receipts.
The first category of Satellite Telecommunications “comprises establishments primarily engaged in providing point-to-point telecommunications services to other establishments in the telecommunications and broadcasting industries by forwarding and receiving communications signals via a system of satellites or reselling satellite telecommunications.” For this category, Census Bureau data for 2007 show that there were a total of 512 satellite communications firms that operated for the entire year. Of this total, 482 firms had annual receipts of under $25 million.
The second category of Other Telecommunications is comprised of entities “primarily engaged in providing specialized telecommunications services, such as satellite tracking, communications telemetry, and radar station operation. This industry also includes establishments primarily engaged in providing satellite terminal stations and associated facilities connected with one or more terrestrial systems and capable of transmitting telecommunications to, and receiving telecommunications from, satellite systems. Establishments providing Internet services or voice over Internet protocol (VoIP) services via client-supplied telecommunications connections are also included in this industry.” For this category, Census Bureau data for 2007 show that there were a total of 2,383 firms that operated for the entire year. Of this total, 2,346 firms had annual receipts of under $25 million. We anticipate that some of these “Other Telecommunications firms,” which are small entities, are earth station applicants/licensees that will be affected by our adopted rule changes.
We anticipate that our rule changes will have an impact on space station applicants and licensees. Space station applicants and licensees, however, rarely qualify under the definition of a small entity. Generally, space stations cost hundreds of millions of dollars to construct, launch and operate. Consequently, we do not anticipate that any space station operators are small entities that would be affected by our actions.
The Order adopts a number of rule changes that will affect reporting, recordkeeping and other compliance requirements for space station operators. These changes, as described below, will decrease the burden for all business operators, especially firms that are applicants for licenses to operate NGSO-like space stations.
We simplify the rules to facilitate improved compliance. First, the Order simplifies information collections in applications for NGSO-like space station licenses. Specifically, the Order eliminates reporting requirements that are more burdensome than necessary. For example, the Order removes the “three-licensee presumption,” a rebuttable presumption that assumes, for purposes of the modified processing round procedure for NGSO-like space station applications, that a sufficient number of licensees in the frequency band is three, and if the processing round results in fewer than three applicants,
Another example is that we see no reason to require non-U.S.-satellite operators with satellites on the Permitted List to notify the Commission of a change of ownership before the transfer takes place. Thus, we revise our rule to state clearly that non-U.S.-satellite operators are allowed to notify the Commission of transfers of ownership of Permitted List satellites
The RFA requires an agency to describe any significant, specifically small business, alternatives that it has considered in reaching its proposed approach, which may include the following four alternatives (among others): “(1) the establishment of differing compliance or reporting requirements or timetables that take into account the resources available to small entities; (2) the clarification, consolidation, or simplification of compliance and reporting requirements under the rules for such small entities; (3) the use of performance rather than design standards; and (4) an exemption from coverage of the rule, or any part thereof, for such small entities.”
The Commission is aware that some of the revisions may impact small entities. The
The adopted changes for NGSO-like space station licensing clarify requirements for NGSO-like modified processing rounds. Each of these changes will lessen the burden in the licensing process. Specifically, this Order adopts revisions to reduce filing requirements and clarify the procedures for redistribution of surrendered spectrum in such a way that applicant burden will be reduced. Thus, the revisions will ultimately lead to benefits
The Commission will send a copy of this
The action is authorized under sections 4(i), 7(a), 303(c), 303(f), 303(g), and 303(r) of the Communications Act of 1934, as amended, 47 U.S.C. 154(i), 157(a), 161, 303(c), 303(f), 303(g), and 303(r).
Administrative practice and procedure, Earth stations, Satellites.
For the reasons discussed in the preamble, the Federal Communications Commission amends 47 CFR part 25 as follows:
Interprets or applies 47 U.S.C. 154, 301, 302, 303, 307, 309, 310, 319, 332, 605, and 721, unless otherwise noted.
(g) A non-U.S.-licensed satellite operator that acquires control of a non-U.S.-licensed space station that has been permitted to serve the United States must notify the Commission within 30 days after consummation of the transaction so that the Commission can afford interested parties an opportunity to comment on whether the transaction affected any of the considerations we made when we allowed the satellite operator to enter the U.S. market. A non-U.S.-licensed satellite that has been transferred to new owners may continue to provide service in the United States unless and until the Commission determines otherwise. If the transferee or assignee is not licensed by, or seeking a license from, a country that is a member of the World Trade Organization for services covered under the World Trade Organization Basic Telecommunications Agreement, the non-U.S.-licensed satellite operator will be required to make the showing described in paragraph (a) of this section.
(e)(1) In the event that there is insufficient spectrum in the frequency band available to accommodate all the qualified applicants in a processing round, the available spectrum will be divided equally among the licensees whose applications are granted pursuant to paragraph (d) of this section, except as set forth in paragraph (e)(2) of this section.
(2) In cases where one or more applicants apply for less spectrum than they would be warranted under paragraph (e)(1) of this section, those applicants will be assigned the bandwidth amount they requested in their applications. In those cases, the remaining qualified applicants will be assigned the lesser of the amount of spectrum they requested in their applications, or the amount of spectrum that they would be assigned if the available spectrum were divided equally among the remaining qualified applicants.
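To show one way the division procedure in paragraphs (e)(1) and (e)(2) can be worked through, here is a minimal sketch that divides the available bandwidth equally among qualified applicants while honoring requests for less than an equal share. The iterative approach, function name, and example figures are assumptions for illustration, not part of the rule text.

```python
def divide_spectrum(total_mhz, requested_mhz):
    """Illustrative reading of the 47 CFR 25.157(e) division procedure.

    Applicants that requested less than an equal share receive exactly what
    they requested; the remaining applicants each receive the lesser of
    their request or an equal share of whatever spectrum is left over.
    """
    assignments = {}
    remaining = dict(requested_mhz)
    spectrum_left = float(total_mhz)
    while remaining:
        share = spectrum_left / len(remaining)
        small = {a: r for a, r in remaining.items() if r <= share}
        if not small:
            # No one requested less than the equal share: split equally.
            for applicant in remaining:
                assignments[applicant] = share
            break
        # Satisfy the smaller requests in full, then recompute the share.
        for applicant, request in small.items():
            assignments[applicant] = request
            spectrum_left -= request
            del remaining[applicant]
    return assignments

# Example: 500 megahertz available and three qualified applicants.
print(divide_spectrum(500, {"A": 100, "B": 400, "C": 300}))
# {'A': 100, 'B': 200.0, 'C': 200.0}
```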
National Aeronautics and Space Administration.
Final rule.
NASA is issuing a final rule amending the NASA Federal Acquisition Regulation Supplement (NFS) to remove the Engineering Change Proposals (ECPs) basic clause with its Alternate I & II and associated information collection from the NFS.
Andrew O'Rourke, telephone 202-358-4560.
NASA published a proposed rule in the
NASA reviewed the public comments received in the development of the final rule. The six comments received were advertisements for personal services from the same respondent and completely unrelated to the purpose of this rule. Therefore, no change was made to the final rule as a result of the public comments received.
Executive Orders (E.O.s) 12866 and 13563 direct agencies to assess all costs and benefits of available regulatory alternatives and, if regulation is necessary, to select regulatory approaches that maximize net benefits (including potential economic, environmental, public health and safety effects, distributive impacts, and equity). E.O. 13563 emphasizes the importance of quantifying both costs and benefits, of reducing costs, of harmonizing rules, and of promoting flexibility. This is not a significant regulatory action and, therefore, was not subject to review under section 6(b) of E.O. 12866, Regulatory Planning and Review, dated September 30, 1993. This rule is not a major rule under 5 U.S.C. 804.
A final regulatory flexibility analysis has been prepared consistent with the Regulatory Flexibility Act, 5 U.S.C. 601,
The National Aeronautics and Space Administration (NASA) is issuing a final rule to amend the NASA FAR Supplement (NFS) to remove NFS clause 1852.243-70, Engineering Change Proposals (ECPs) basic clause with its Alternate I & II and associated information collection from the NFS because the NFS clause is no longer used in procurements and is duplicative of FAR requirements. NASA conducted a retrospective review of its regulations and determined NFS clause 1852.243-70 should be removed along with the corresponding information collection requirement OMB Control No. 2700-0054.
No changes were made to the final rule as a result of public comments received. Comments received in response to the proposed rule were advertisements for personal services and deemed out of scope.
NASA does not expect this final rule to have a significant economic impact on a substantial number of small entities within the meaning of the Regulatory Flexibility Act, 5 U.S.C. 601
This rule does not include any new reporting, recordkeeping, or other compliance requirements for small businesses. There are no significant alternatives that could further minimize the already minimal impact on businesses, small or large.
The rule contains information collection requirements that require the approval of the OMB under the Paperwork Reduction Act (44 U.S.C. chapter 35); however, the changes to the NFS remove the information collection requirements previously approved under OMB Control Number 2700-0054, entitled NFS 1843 Contract Modifications for Engineering Change Proposals (ECP).
Government procurement.
Accordingly, 48 CFR parts 1801, 1843, and 1852 are amended as follows:
51 U.S.C. 20113(a) and 48 CFR chapter 1.
1801.106 OMB approval under the Paperwork Reduction Act.
The following OMB control numbers apply:
The contracting officer may insert a clause substantially as stated at 1852.243-72, Equitable Adjustments, in solicitations and contracts for—
(a) Dismantling, demolishing, or removing improvements; or
(b) Construction, when the contract amount is expected to exceed the simplified acquisition threshold and a fixed-price contract is contemplated.
National Marine Fisheries Service (NMFS), National Oceanic and Atmospheric Administration (NOAA), Commerce.
Temporary rule; opening.
NMFS is opening directed fishing for groundfish by vessels using trawl gear in the Gulf of Alaska (GOA). This action is necessary to fully use the 2016 groundfish total allowable catch in the GOA.
Effective 1200 hours, Alaska local time (A.l.t.), October 28, 2016, through 2400 hours, A.l.t., December 31, 2016.
Comments must be received at the following address no later than 4:30 p.m., A.l.t., November 15, 2016.
You may submit comments on this document, identified by FDMS Docket Number NOAA-NMFS-2015-0110, by any of the following methods:
•
•
Josh Keaton 907-586-7228.
NMFS manages the groundfish fishery in the GOA exclusive economic zone according to the Fishery Management Plan for Groundfish of the Gulf of Alaska (FMP) prepared by the North Pacific Fishery Management Council under authority of the Magnuson-Stevens Fishery Conservation and Management Act. Regulations governing fishing by U.S. vessels in accordance with the FMP appear at subpart H of 50 CFR part 600 and 50 CFR part 679.
NMFS prohibited directed fishing for groundfish by vessels using trawl gear in the GOA, effective 1200 hours, A.l.t., October 22, 2016 (81 FR 74313) under § 679.21(d)(6)(i). That action was necessary because the annual prohibited species catch (PSC) limit for Pacific halibut specified for vessels using trawl gear in the GOA was reached.
As of October 25, 2016, NMFS has determined that approximately 250 metric tons of the trawl Pacific halibut PSC limit remains. Therefore, in accordance with § 679.25(a)(1)(i), (a)(2)(i)(C), and (a)(2)(iii)(D), and to fully utilize the 2016 groundfish total allowable catch, NMFS is terminating the previous closure and is opening directed fishing for groundfish by vessels using trawl gear in the GOA. The Administrator, Alaska Region (Regional Administrator) considered the following factors in reaching this decision: (1) The current harvest of Pacific halibut PSC in the trawl fisheries of the GOA, and (2) the harvest capacity and stated intent on future harvesting patterns of vessels participating in this fishery.
This action responds to the best available information recently obtained from the fishery. The Assistant Administrator for Fisheries, NOAA (AA), finds good cause to waive the requirement to provide prior notice and opportunity for public comment pursuant to the authority set forth at 5 U.S.C. 553(b)(B) as such requirement is impracticable and contrary to the public interest. This requirement is impracticable and contrary to the public interest as it would prevent NMFS from responding to the most recent fisheries data in a timely fashion and would delay the opening of directed fishing for groundfish by vessels using trawl gear in the GOA. NMFS was unable to publish a notice providing time for public comment because the most recent, relevant data only became available as of October 25, 2016.
The AA also finds good cause to waive the 30-day delay in the effective date of this action under 5 U.S.C. 553(d)(3). This finding is based upon the reasons provided above for waiver of prior notice and opportunity for public comment.
Without this inseason adjustment, NMFS could not allow the trawl deep-water species fishery in the GOA to be harvested in an expedient manner and in accordance with the regulatory schedule. Under § 679.25(c)(2), interested persons are invited to submit written comments on this action to the above address until November 15, 2016.
This action is required by §§ 679.21 and 679.25 and is exempt from review under Executive Order 12866.
16 U.S.C. 1801
Office of the Secretary (OST), Department of Transportation (DOT).
Advance notice of proposed rulemaking (ANPRM).
The Department of Transportation (DOT or Department) is soliciting public comment and feedback on various issues related to the requirement for airlines to refund checked baggage fees when they fail to deliver the bags in a timely manner, as provided by the FAA Extension, Safety, and Security Act of 2016.
Comments should be filed by November 30, 2016. Late-filed comments will be considered to the extent practicable.
You may file comments identified by the docket number DOT-OST-2016-0208 by any of the following methods:
•
•
•
•
Clereece Kroha, Senior Trial Attorney, Office of the Assistant General Counsel for Aviation Enforcement and Proceedings, U.S. Department of Transportation, 1200 New Jersey Ave. SE., Washington, DC 20590, 202-366-9342 (phone), 202-366-7152 (fax),
The Department of Transportation (DOT or Department) is seeking comment on the appropriate means to implement a requirement in recent legislation for airlines to refund checked baggage fees when they fail to deliver the bags in a timely manner. Specifically, the Department seeks comment on how to define a baggage delay, and the appropriate method for providing the refund for delayed baggage.
On April 25, 2011, the Department of Transportation published its second Enhancing Airline Passenger Protections final rule that requires, among other things, that U.S. and foreign air carriers adopt and adhere to a customer service plan that addresses various consumer issues.
Baggage fees, along with other ancillary fees, have become an increasingly important component of the airline industry's revenue structure. According to data from the Department's Bureau of Transportation Statistics (BTS), the top 13 U.S. carriers collectively generated over $3.8 billion in revenue in 2015 from baggage fees.
This matter has also caught the attention of the Congress. In 2016, both the Senate and the House of Representatives included in their Federal Aviation Administration reauthorization bills a provision to require the Department to issue a rule that mandates refunds of baggage fees for delayed bags.
Section 2305 of the FAA Extension Act provides that the Department shall issue a final rule within one year of the enactment of the Act that requires U.S. and foreign carriers to promptly provide an automated refund for any ancillary fees paid by the passenger for checked baggage if the carriers fail to deliver the bag to passengers within 12 hours of arrival for domestic flights and within 15 hours of arrival for international flights, if the passenger notifies the carrier about the delayed or lost baggage. The Act also allows the Department to extend these timeframes to up to 18 hours for domestic flights and up to 30 hours for international flights, if the Department determines that the 12-hour or 15-hour standards are not feasible and would adversely affect consumers in certain cases.
Each delayed bag affects an individual passenger's travel experience, resulting in inconvenience and other harms. The Department is seeking comments from all stakeholders in order to determine how to implement section 2305 of the Act so the mandated regulation would best achieve Congress' and the Department's goal of mitigating the inconvenience and harm to consumers caused by delayed baggage.
DOT is seeking comment to help it determine the appropriate length of delay within the statutory parameters that would trigger the refund requirement. As stated above, the Act provides that a refund should be issued to passengers if the carrier fails to deliver the checked baggage to the passenger not later than 12 hours after the arrival of a domestic flight, or not later than 15 hours after the arrival of an international flight. The Act also authorizes the Department to extend these timeframes to up to 18 hours for domestic flights and 30 hours for international flights if the Secretary determines that the 12-hour or 15-hour standards are infeasible and would “adversely affect consumers in certain cases.” The Department invites public input on the 12 and 15 hour standards prescribed in the Act as well as any other standards within the statutory parameters, which are for domestic flights between 12 and 18 hours after the flight's arrival and for international flights between 15 and 30 hours after the flight's arrival. The Department seeks comment on why a particular length of time within this timeframe would be more appropriate than other times.
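As a simple illustration of the statutory parameters being discussed, the sketch below tests whether a given delay would trigger a refund under the default 12-hour and 15-hour thresholds, with room for the extended limits of up to 18 and 30 hours. The function, its parameters, and the example are hypothetical and do not reflect any standard the Department has adopted.

```python
def refund_required(delay_hours, international, passenger_notified,
                    domestic_threshold=12, international_threshold=15):
    """Hypothetical eligibility check against the section 2305 timeframes.

    The Act's defaults are 12 hours (domestic) and 15 hours (international);
    DOT may extend them to as much as 18 and 30 hours respectively. A refund
    is owed only if the passenger notified the carrier of the delayed or
    lost bag.
    """
    if international:
        assert 15 <= international_threshold <= 30
        threshold = international_threshold
    else:
        assert 12 <= domestic_threshold <= 18
        threshold = domestic_threshold
    return passenger_notified and delay_hours > threshold

# Example: a bag delivered 14 hours after a domestic flight's arrival.
print(refund_required(14, international=False, passenger_notified=True))  # True
```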
The Department also seeks comment on how the rule should deal with a passenger itinerary that consists of an international flight connecting to a domestic flight. Is there a reason that this itinerary should be considered an international flight within the meaning of the statute, or does the final domestic flight cause the passenger to be treated as domestic for purposes of the statute and rule? Is there a reason to distinguish between a standard interline (
We solicit comments on the ways in which standard industry practice for baggage interlining and mishandled baggage may affect the mandated rule. For example, the last carrier on an interline itinerary is generally responsible for handling a mishandled-baggage report to conclusion, but in the case of a baggage delay on an interline trip, that carrier will generally not be the carrier to whom the passenger paid the baggage fee.
In addition to situations, such as interline, in which there are multiple entities involved in the transportation of bags, there are also situations in which there are multiple entities involved in the transactions of bag fees. Specifically, although not a common practice among most carriers, there are instances in which a carrier authorizes a ticket agent, by contractual agreement, to collect baggage fees from the ticket agent's customers on behalf of the carrier. To the extent an entity other than the carrier is involved in collecting baggage fees, we seek comments on who should be held responsible to refund the bag fees for delayed bags. Should we hold both entities responsible? Based on the structure of the agreement between the two entities, and common business practice, what is the best way to ensure that bag fees are refunded in a timely manner and to avoid passengers being sent back and forth between two entities to determine which entity is responsible?
As the statute gives the Department some flexibility to modify the length of delay taking into consideration feasibility and any negative impact on consumers, we construe the statute's use of the phrase “in certain cases” to mean that Congress intends to provide the Department the flexibility to differentiate, where appropriate, the length of delay that triggers a refund based on the circumstances, instead of applying one standard to all domestic flights and another standard to all international flights. In that regard, in addition to domestic versus international flights, is there a reason that the rule should establish a secondary set of criteria, such as the flight duration and/or the frequency of service in question? Is the frequency of the operation by the transporting carrier or all carriers that operate on the same route relevant to defining the delay? Since some international flights are short haul flights (
DOT is also seeking comment on how to determine when the clock stops running for purposes of measuring the delay. The Act provides that the 12 hour and 15 hour clock stops when the carrier “delivers the checked baggage to the passenger.” Sometimes, a passenger may stay at the arrival airport and wait for the delayed baggage if the delay is likely to be within a few hours. However, when the delay goes beyond a certain point, the industry's common practice is to deliver the bags to the passenger's residence or a designated location requested by the passenger. In
DOT seeks comment on the number of bags that are delayed annually beyond the 12-to-18-hour and 15-to-30-hour statutory timeframes, as well as the number of bags lost annually. The Department receives information on the number of mishandled-baggage reports filed by passengers, but we do not have data on how many of these reports involve delayed bags and how many involve lost bags. Information on the number of delayed and lost bags that would be affected by this rulemaking would help the Department to better estimate the impact this rule would have on consumers and airlines.
The Department is also seeking comment on the appropriate method for providing a refund for delayed baggage. The Department's credit card refund regulation, 14 CFR part 374, implements the Consumer Credit Protection Act and Regulation Z of the Board of Governors of the Federal Reserve System, 15 U.S.C. 1601-1693r and 12 CFR part 226 (Regulation Z) with respect to air carriers and foreign air carriers. It states that when refunds are due on purchases with a credit card, a carrier must transmit a credit statement to the credit card issuer within seven business days of receipt of full documentation for the refund requested. In addition, the Department requires that, with respect to purchases with forms of payment other than credit cards, an airline must provide a refund within 20 days of receipt of full documentation of such a request. See 14 CFR 259.5(b)(5). The Department applies these refund standards to all refunds that are due to consumers, including airfare refunds and ancillary fee refunds. In order to receive a refund under Regulation Z, a consumer must request the refund from the carrier and provide all necessary supporting documents. In contrast, the Act states that carriers should “promptly provide an automated refund” to an eligible passenger when the carriers fail to meet the applicable time limit in delivering the checked bag, and the passenger has notified the carrier of the lost or delayed checked baggage. Under the Act, an “automated refund” should be issued to passengers as long as the delay has met the threshold timeframe and the passenger has notified the carrier about the delayed or lost bag. In that regard, we view the delayed baggage fee refund provision in the FAA Extension Act differently from Regulation Z in that the Act only requires a passenger to notify the carrier that a bag is delayed or lost, and there is no requirement for the passenger to request a refund for the baggage fee. We emphasize that since the Act's automated refund requirement covers all bags that are delayed for more than a set number of hours, it will also cover “lost bags,” for which a refund of the fees charged is already required by 14 CFR 259.5(b)(3).
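To contrast the existing refund timing standards with the Act's automated-refund approach, the following sketch computes the deadline for the two existing cases (seven business days for credit card purchases, 20 days otherwise). It is a simplified illustration that ignores holidays, and the function name and example date are hypothetical.

```python
from datetime import date, timedelta

def refund_deadline(documentation_received, paid_by_credit_card):
    """Illustrative deadlines under the existing refund rules.

    Credit card purchases: the carrier must transmit a credit statement to
    the card issuer within seven business days of receiving full
    documentation (14 CFR part 374). Other forms of payment: refund within
    20 days (14 CFR 259.5(b)(5)). Weekends are skipped here; holidays are
    ignored for simplicity.
    """
    if not paid_by_credit_card:
        return documentation_received + timedelta(days=20)
    deadline = documentation_received
    business_days = 0
    while business_days < 7:
        deadline += timedelta(days=1)
        if deadline.weekday() < 5:  # Monday through Friday
            business_days += 1
    return deadline

# Example: full documentation received on Friday, November 4, 2016.
print(refund_deadline(date(2016, 11, 4), paid_by_credit_card=True))  # 2016-11-15
```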
The Department seeks comment on whether prescribing a specific mechanism for the carriers to use to provide the statutorily required automated refund would negatively or positively impact carriers and consumers. What procedures would be necessary on interline itineraries, for which the carrier to whom the passenger reports the delayed bag at his or her destination or stopover is not the carrier to whom the passenger had paid the baggage fee? In addition to soliciting comment on all of the issues and concerns identified above, we also welcome any other information relevant to this issue. This specifically includes comments and data on the cost impact on new-entrant carriers (many of whom do not have interline agreements) of the time standard developed in this proceeding, and the cost impact on regional airlines.
Food and Drug Administration, HHS.
Notification of availability.
The Food and Drug Administration (FDA, we, or Agency) is announcing the availability of a draft guidance for industry entitled “Describing a Hazard That Needs Control in Documents Accompanying the Food, as Required by Four Rules Implementing the FDA Food Safety Modernization Act: Guidance for Industry.” This draft guidance explains our current thinking on disclosure statements made by an entity, in documents accompanying food, that certain hazards have not been controlled by that entity as required by certain provisions in four final rules. This document describes our current thinking on how to describe the hazard under each of the four rules and which documents we consider to be “documents of the trade” for the purpose of disclosure statements.
Although you can comment on any guidance at any time (see 21 CFR 10.115(g)(5)), to ensure that we consider your comment on this draft guidance before we begin work on the final version of the guidance, submit either electronic or written comments on the draft guidance by May 1, 2017. Submit either electronic or written comments on the proposed collection of information by May 1, 2017.
You may submit comments as follows:
Submit electronic comments in the following way:
•
• If you want to submit a comment with confidential information that you do not wish to be made available to the public, submit the comment as a written/paper submission and in the manner detailed (see “Written/Paper Submissions” and “Instructions”).
Submit written/paper submissions as follows:
•
• For written/paper comments submitted to the Division of Dockets Management, FDA will post your comment, as well as any attachments, except for information submitted, marked and identified, as confidential, if submitted as detailed in “Instructions.”
•
Submit written requests for single copies of the draft guidance to the Center for Food Safety and Applied Nutrition (HFS-300), Food and Drug Administration, 5001 Campus Drive, College Park, MD 20740. Send two self-addressed adhesive labels to assist that office in processing your request. See the
For questions regarding this draft guidance as it relates to our regulation entitled “Current Good Manufacturing Practice, Hazard Analysis, and Risk-Based Preventive Controls for Food for Animals,” contact Jeanette Murphy, Center for Veterinary Medicine (HFV-200), Food and Drug Administration, 7519 Standish Pl., Rockville, MD 20855, 240-402-6246.
For questions regarding this draft guidance as it relates to our regulation entitled “Standards for the Growing, Harvesting, Packing, and Holding of Produce for Human Consumption,” contact Samir Assar, Center for Food Safety and Applied Nutrition (HFS-317), Food and Drug Administration, 5001 Campus Dr., College Park, MD 20740, 240-401-1636.
For questions regarding this draft guidance as it relates to our regulation entitled “Foreign Supplier Verification Programs (FSVP) for Importers of Food for Humans and Animals,” contact Rebecca Buckner, Office of Food and Veterinary Medicine, Food and Drug Administration, 10903 New Hampshire Ave., Silver Spring, MD 20993-0002, 301-796-4576.
We are announcing the availability of a draft guidance for industry entitled “Describing a Hazard That Needs Control in Documents Accompanying the Food, as Required by Four Rules Implementing the FDA Food Safety Modernization Act: Guidance for Industry.” We are issuing the draft guidance consistent with FDA's good guidance practices regulation (21 CFR 10.115). The draft guidance, when finalized, will represent the current thinking of FDA on this topic. It does not create or confer any rights for or on any person and does not operate to bind FDA or the public. You can use an alternate approach if it satisfies the requirements of the applicable statutes and regulations.
The draft guidance relates to four of the seven foundational rules that we have established in Title 21 of the Code of Federal Regulations (21 CFR) as part of our implementation of the FDA Food Safety Modernization Act (FSMA) (Pub. L. 111-353). Table 1 lists these four rules. Each of these rules includes “customer provisions” as specified in table 1.
The “customer provisions” of part 117 and part 507 each include a requirement for a “disclosure statement” in which a manufacturer/processor must disclose, in documents accompanying the food, in accordance with the practice of the trade, that the food is “not processed to control [identified hazard]” in certain circumstances. Likewise, the “customer provisions” of the FSVP regulation include a requirement for a “disclosure statement” in which an importer must disclose, in documents accompanying the food, in accordance with the practice of the trade, that the food is “not processed to control [identified hazard]” in certain circumstances. The “customer provisions” of the produce safety regulation relate to an exemption from that regulation that includes a requirement for a “disclosure statement” in which a farm must disclose, in documents accompanying the food, in accordance with the practice of the trade, that the food is “not processed to adequately reduce the presence of microorganisms of public health significance.”
The draft guidance responds to industry questions regarding these requirements for a disclosure statement. On March 23, 2016, FDA met with a food trade association at their request to listen to concerns regarding the customer provisions of part 117 (Ref. 1), including concerns regarding the disclosure statement in part 117. At the meeting, the trade association expressed concern about providing a disclosure statement when multiple hazards may be present, including chemical hazards (such as mycotoxins) and physical hazards (such as stones in raw agricultural commodities), as well as for multiple biological hazards (such as microbial pathogens). The trade association also asked us to allow a variety of types of documents that accompany the food to have the disclosure statement (
The trade association focused its discussion on the requirements of part 117, but noted that it had parallel concerns for the analogous provisions of part 507 and the FSVP regulation (Ref. 1). Although the trade association did not express concern with the disclosure statement in the produce safety regulation, we believe it will be helpful to businesses subject to the produce safety regulation to include our current thinking on the disclosure statement in all four rules that have requirements for a disclosure statement, not just the three rules mentioned by the trade association.
This draft guidance refers to previously approved collections of information found in FDA regulations. These collections of information are subject to review by the Office of Management and Budget (OMB) under the Paperwork Reduction Act of 1995 (44 U.S.C. 3501-3520). The collections of information in 21 CFR part 117 have been approved under OMB control number 0910-0751. The collections of information in 21 CFR part 507 have been approved under OMB control number 0910-0789. The collections of information in 21 CFR part 112 have been approved under OMB control number 0910-0816. The collections of information in 21 CFR part 1, subpart L have been approved under OMB control number 0910-0752.
Persons with access to the Internet may obtain the draft guidance at either
The following references are on display in the Division of Dockets Management, 5630 Fishers Lane, Rm. 1061, Rockville, MD 20852, and are available for viewing by interested persons between 9 a.m. and 4 p.m., Monday through Friday; they are also available electronically at
Food and Drug Administration, HHS.
Proposed rule; extension of comment period.
The Food and Drug Administration (FDA) is extending the comment period for the proposed rule that appeared in the
FDA is extending the comment period on the proposed rule published August 24, 2016 (81 FR 58342). Submit either electronic or written comments by January 21, 2017.
You may submit comments as follows:
Submit electronic comments in the following way:
•
• If you want to submit a comment with confidential information that you do not wish to be made available to the public, submit the comment as a written/paper submission and in the manner detailed (see “Written/Paper Submissions” and “Instructions”).
Submit written/paper submissions as follows:
•
• For written/paper comments submitted to the Division of Dockets Management, FDA will post your comment, as well as any attachments, except for information submitted, marked and identified, as confidential, if submitted as detailed in “Instructions.”
•
Vernon Toelle, Office of Surveillance and Compliance, Center for Veterinary Medicine, Food and Drug Administration, 7519 Standish Pl., MPN4-142, Rockville, MD 20855, 240-402-5637; or Kristin Webster Maloney, Office of Policy and Risk Management, Office of Regulatory Affairs, Food and Drug Administration, 10903 New Hampshire Ave., Bldg. 32, Rm. 4373, Silver Spring, MD 20993, 240-402-4993.
In the
The Agency has received requests for a 90-day extension of the comment period for the proposed rule. Each request conveyed concern that the current 90-day comment period does not allow sufficient time to develop a meaningful or thoughtful response to the proposed rule.
FDA has considered the requests and is extending the comment period for the proposed rule for 60 days, until January 21, 2017. The Agency believes that a 60-day extension allows adequate time for interested persons to submit comments without significantly delaying rulemaking on these important issues.
Office of the Under Secretary of Defense for Acquisition, Technology, and Logistics, DoD.
Proposed rule.
This rulemaking establishes policy, assigns responsibilities, and prescribes procedures for the dissemination and withholding of certain unclassified technical data and technology subject to the International Traffic in Arms Regulations (ITAR) and Export Administration Regulations (EAR). It applies to DoD Components and their contractors and grantees, and is meant to control the transfer of technical data and technology contributing to the military potential of any country or countries, groups, or individuals that could prove detrimental to U.S. national security or critical interests.
Comments must be received by December 30, 2016.
You may submit comments, identified by docket number and/or RIN number and title, by any of the following methods:
•
•
Vakare Valaitis, 703-767-9159.
For the purposes of this regulation, public disclosure of technical data and technology is the same as providing uncontrolled foreign access. This rule instructs DoD employees, contractors, and grantees to ensure that unclassified technical data and technology disclosing technology or information with a military or space application is not exported without authorization and is controlled and disseminated consistent with U.S. export control laws and regulations. These policies preserve the U.S. military's technological superiority, establish and maintain interoperability with allies and coalition partners, and manage direct and indirect impacts on the defense industrial base.
There are penalties for export control violations. For export control violations involving items controlled by the United States Department of State under the International Traffic in Arms Regulations (ITAR), including many munitions items, the statute authorizes a maximum criminal penalty of $1 million per violation and, for an individual person, up to 10 years imprisonment. In addition, ITAR violations can result in the imposition of a maximum civil fine of $500,000 per violation, as well as debarment from exporting defense articles or services. For export control violations involving dual-use and certain munitions items controlled by the United States Department of Commerce under the Export Administration Regulations, criminal and civil penalties are currently provided by the International Emergency Economic Powers Act (IEEPA), 50 U.S.C. 1705, which has continued the Export Administration Regulations (EAR) in effect while the Export Administration Act is in lapse through Executive Order 13222 of August 17, 2001 (3 CFR 2001 Comp. 783 (2002)), as amended by Executive Order 13637 of March 8, 2013, 78 FR 16129 (March 13, 2013), and as extended by successive Presidential Notices, the most recent being that of August 4, 2016 (81 FR 52587 (Aug. 8, 2016)). Under the EAR and IEEPA, as adjusted by 15 CFR 5.4(b), the penalty for persons who violate, attempt or conspire to violate, or cause a violation of the export control regulations includes civil penalties of not more than $284,582 per transaction or twice the amount of the transaction, whichever is greater, and criminal penalties of not more than $1,000,000, imprisonment of not more than 20 years, or both. Violations of the EAR may also result in the denial of export privileges and other administrative sanctions.
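To make the civil penalty ceiling described above concrete, here is a minimal arithmetic sketch of the greater-of test (the per-transaction cap or twice the transaction amount). The function name and examples are illustrative, and the dollar cap is periodically adjusted for inflation.

```python
def max_ear_civil_penalty(transaction_amount, per_transaction_cap=284_582):
    """Greater-of test for EAR/IEEPA civil penalties as described above.

    The cap shown is the inflation-adjusted figure cited in this rule
    (see 15 CFR 5.4(b)); it is updated periodically, so treat it as a
    snapshot rather than a fixed constant.
    """
    return max(per_transaction_cap, 2 * transaction_amount)

# Examples: a $200,000 unauthorized transaction can draw up to $400,000;
# a $50,000 transaction is capped at $284,582.
print(max_ear_civil_penalty(200_000))  # 400000
print(max_ear_civil_penalty(50_000))   # 284582
```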
In accordance with 10 U.S.C. 133 part (b)(2), the Under Secretary of Defense for Acquisition, Technology and Logistics (USD(AT&L)) may exercise powers relating to establishing policies for acquisition (including procurement of goods and services, research and development, developmental testing, and contract administration) for all elements of the Department of Defense. In addition, U.S. export control laws, including 22 U.S.C. 2778 (also known as the “Arms Export Control Act”); 50 U.S.C. chapter 35 (also known as the “International Emergency Economic Powers Act” (IEEPA)); 22 CFR parts 120 through 130 (also known as “International Traffic in Arms Regulations” (ITAR)); and 15 CFR parts 730 through 774 (also known as “Export Administration Regulations” (EAR)) govern this rule.
This proposed rule describes procedures for the release of technical information; discusses procedures for technical data and technology to be marked for distribution; and provides an example of the notice to accompany export-controlled technical data and technology.
DoD is proposing this regulation to update the CFR and DoD Directive 5230.25 (available at
The program involves no discernible increase in anticipated costs or benefits, as it is being updated to conform to the national security guidance cited in the text in §§ 250.1 through 250.7.
The potential benefits include greater public access to, and understanding of, information about the qualifications needed for access to export controlled technical data and technology. Such information may help potential contractors and grantees to better understand their options for participating in DoD activities; to better enable funders and researchers to determine the need for information and technology; to provide more complete information to those who use information from DoD research and contracts to inform other decisions; and to better enable the scientific community to examine the overall state of information and technology in this area as a basis for engaging in quality improvement (
This proposed rule is included in DoD's retrospective plan, completed in August 2011, and will be reported in future status updates of DoD's retrospective review in accordance with the requirements in Executive Order 13563. DoD's full plan can be accessed at:
Executive Orders 13563 and 12866 direct agencies to assess all costs and benefits of available regulatory alternatives and, if regulation is necessary, to select regulatory approaches that maximize net benefits (including potential economic, environmental, public health and safety effects, distributive impacts, and equity). Executive Order 13563 emphasizes the importance of quantifying both costs and benefits, of reducing costs, of harmonizing rules, and of promoting flexibility. Although this rulemaking is not “economically significant” because it does not have an annual effect on the economy of $100 million or more or adversely affect in a material way the economy, it has been deemed “other significant” for raising novel legal or policy issues arising out of legal mandates, the President's priorities, or
Section 202 of the Unfunded Mandates Reform Act of 1995 (UMRA) (Pub. L. 104-4) requires agencies to assess anticipated costs and benefits before issuing any rule whose mandates require spending in any 1 year of $100 million in 1995 dollars, updated annually for inflation. In 2014, that threshold was approximately $141 million. This proposed rule would not mandate any requirements for State, local, or tribal governments, nor would it affect private sector costs.
The Department of Defense certifies that this proposed rule is not subject to the Regulatory Flexibility Act (5 U.S.C. 601) because it would not, if promulgated, have a significant economic impact on a substantial number of small entities. Therefore, the Regulatory Flexibility Act, as amended, does not require us to prepare a regulatory flexibility analysis.
It has been certified that this proposed rule does impose reporting or recordkeeping requirements under the Paperwork Reduction Act of 1995. These reporting requirements have been approved by OMB under OMB Control Number 0704-0207 titled DD Form 2345, Militarily Critical Technical Data Agreement.
In exchange for Government-owned unclassified export controlled technical data and technology, a contractor provides basic company information, identifies a technical data and technology custodian, and describes need-to-know. The reporting burden is estimated to average 20 minutes per response. The DD Form 2345 and supporting documentation must be submitted to the U.S./Canada Joint Certification Office in hardcopy. Approximately 24,000 U.S. companies have active certifications.
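For a rough sense of scale, the following back-of-the-envelope calculation combines the stated 20-minute average response with the roughly 24,000 active certifications; the one-response-per-company assumption is ours, not the rule's.

```python
# Back-of-the-envelope burden estimate, not a figure from the rule: it
# assumes each of the roughly 24,000 certified companies submits one
# DD Form 2345 response at the stated 20-minute average.
responses = 24_000
minutes_per_response = 20
total_burden_hours = responses * minutes_per_response / 60
print(total_burden_hours)  # 8000.0
```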
Executive Order 13132 establishes certain requirements that an agency must meet when it promulgates a proposed rule (and subsequent final rule) that imposes substantial direct requirement costs on State and local governments, preempts State law, or otherwise has federalism implications. This proposed rule will not have a substantial effect on State and local governments.
Exports, Science and technology.
Accordingly, 32 CFR part 250 is proposed to be revised to read as follows:
10 U.S.C. 133.
This part establishes policy, assigns responsibilities, and prescribes procedures for the dissemination and withholding of certain unclassified technical data and technology consistent with the requirements of 10 U.S.C. 130.
This part:
(a) Applies to:
(1) The Office of the Secretary of Defense, the Military Departments, the Office of the Chairman of the Joint Chiefs of Staff and the Joint Staff, the Combatant Commands, the Office of Inspector General of the Department of Defense, the Defense Agencies, the DoD Field Activities, and all other organizational entities within the DoD (referred to collectively in this part as the “DoD Components”).
(2) All unclassified technical data and technology that discloses technology or information with military or space application, in the possession or under the control of a DoD Component, that may not be exported lawfully without an approval, authorization, license, license exception, or exemption in accordance with U.S. export control laws and regulations: 22 U.S.C. 2778 (also known as the “Arms Export Control Act”); 50 U.S.C. chapter 35 (also known as the “International Emergency Economic Powers Act”); 22 CFR parts 120-130 (also known as “International Traffic in Arms Regulations” (ITAR)); and 15 CFR parts 730 through 774 (also known as “Export Administration Regulations” (EAR)).
(b) Does not modify or supplant the regulations governing the export of technical data and technology established by 22 U.S.C. 2778, 50 U.S.C. chapter 35, 22 CFR parts 120 through 130, 10 CFR part 810, and 15 CFR parts 730 through 774.
(c) Does not apply to technical information under the control of the Department of Energy or the Nuclear Regulatory Commission.
(d) Does not introduce any additional controls on the dissemination of technical data and technology by private enterprises or individuals beyond those specified by export control laws and regulations or in contracts or other agreements, including certifications as specified in paragraph (a)(9) of § 250.5. Accordingly, the fact that DoD may possess such technical data and technology does not in itself provide a basis for control of such technical data and technology under this part.
(e) Does not introduce any controls on the dissemination of:
(1) Scientific, educational, or other items that are not subject to the EAR or exclusively controlled for export or reexport by another department or agency pursuant to 15 CFR 734.3, 734.7 through 734.8;
(2) Information in the public domain as described in 22 CFR 120.11 and technical data that has been approved for release in accordance with 22 CFR 125.4(b)(13).
(f) Does not alter the responsibilities of the DoD Components to protect proprietary technical data and technology of a private party, including:
(1) In which the DoD has less than unlimited rights; or
(2) That is authorized to be withheld from public disclosure pursuant to 5 U.S.C. 552, also known and referred to in this part as the “Freedom of Information Act (FOIA).”
(g) Does not pertain to or affect the release of technical data and technology by DoD Components to foreign governments, international organizations or their respective representatives, or contractors pursuant to official agreements or formal arrangements with the U.S. Government (USG), or pursuant to USG-licensed transactions involving such entities or individuals. However, in the absence of such USG-sanctioned relationships, this part does apply.
(h) Does not apply to classified technical data. However, after declassification, dissemination of the technical data and technology within the scope of paragraph (a)(2) of this section is governed by this part.
(i) Does not alter the responsibilities of the DoD Components to mark and protect information qualifying for designation as controlled unclassified information in accordance with Executive Order 13556, “Controlled Unclassified Information,” as implemented by volume 4 of DoD Manual 5200.01, “DoD Information Security Program” (available at
Unless otherwise noted, these terms and their definitions are for the purpose of this part.
(1) With respect to defense articles or defense services: Those technologies specified in 22 CFR 121.1.
(2) With respect to categories of systems, equipment, and components; test, inspection, and production equipment; materials; software; and technology subject to the EAR: Those technologies specified in 15 CFR part 774.
(3) With respect to nuclear equipment, materials, and technology: Those technologies specified in 10 CFR part 810.
(4) With respect to select agents and toxins: Those technologies specified in 7 CFR part 331, 9 CFR part 121, and 42 CFR part 73; and any other technologies affecting the critical infrastructure.
(5) With respect to emerging critical defense technology: Research and engineering development, or engineering and technology integration that will produce a defense article or defense service, including its underlying technology and software, covered by 22 CFR parts 120 through 130, or a dual-use or munitions item, including its underlying technology and software, covered by 15 CFR parts 730 through 774.
(1) Providing or seeking to provide equipment or technology to a foreign government with USG approval (for example, through foreign military sale).
(2) Bidding, or preparing to bid, on a sale of surplus property.
(3) Selling or producing products for the commercial domestic marketplace or for the commercial foreign marketplace,
(4) Engaging in scientific research in a professional capacity.
(5) Acting as a subcontractor to a qualified contractor.
(1) Certifies that the individual who will act as recipient of the export-controlled technical data and technology on behalf of the U.S. contractor is a U.S. citizen or a person admitted lawfully into the United States for permanent residence and is located in the United States.
(2) Certifies that such data and technology are needed to bid or perform on a contract with the DoD or other USG agency, or for other legitimate business purposes in which the U.S. contractor is engaged or plans to engage. The purpose for which the data and technology are needed must be described sufficiently in such certification to permit an evaluation of whether subsequent requests for data and technology are related properly to such business purpose.
(3) Acknowledges its responsibilities under U.S. export control laws and regulations (including the obligation, under certain circumstances, to obtain an export license prior to the release of technical data and technology within the United States) and agrees that it will not disseminate any export-controlled technical data and technology subject to this part in violation of applicable export control laws and regulations.
(4) Agrees that, unless dissemination is permitted by paragraph (i) of § 250.6, it will not provide access, including network access, to export-controlled technical data and technology subject to this part to persons other than its employees or persons acting on its behalf who meet the same citizenship or residency requirements, without the permission of the DoD Component that provided the technical data and technology.
(5) To the best of its knowledge, knows of no person employed by it or acting on its behalf who will have access to such data and technology and who is debarred, suspended, or otherwise ineligible to perform on USG contracts, or who has violated U.S. export control laws or a certification previously made to the DoD under the provisions of this part.
(6) Asserts that it is not debarred, suspended, or otherwise determined ineligible by any agency of the USG to perform on USG contracts, has not been convicted of export control law violations, and has not been disqualified under the provisions of this part.
(7) Requests the certification be accepted based on its description of extenuating circumstances when the certifications required by this definition cannot be made truthfully.
(1) Classified data relating to defense articles and defense services on the U.S. Munitions List;
(2) Information covered by an invention secrecy order; or
(3) Software (see 22 CFR 120.45(f)) directly related to defense articles.
(4) The definition does not include information concerning general scientific, mathematical, or engineering principles commonly taught in schools, colleges, and universities, or information in the public domain as defined in 22 CFR 120.11, or telemetry data as defined in note 3 to Category XV(f) of 22 CFR part 121. It also does not include basic marketing information on function or purpose or general system descriptions of defense articles.
It is DoD policy that:
(a) Pursuant to 10 U.S.C. 130 and 133, the Secretary of Defense may withhold from public disclosure any technical data and technology with military or space application in the possession or under the control of the DoD, if such technical data and technology may not be exported lawfully without a license, exception, exemption, or other export authorization.
(b) Because public disclosure of technical data and technology subject to this part is the same as providing uncontrolled foreign access, withholding such technical data and technology from public disclosure, unless approved, authorized, or licensed in accordance with export control laws, is necessary and in the national interest.
(c) Notwithstanding the authority in paragraph (c)(1) of this section, it is DoD policy to provide technical data and technology governed by this part to individuals and enterprises that are:
(1) Currently qualified U.S. contractors, when such technical data and technology relate to a legitimate business purpose for which the contractor is certified; or
(2) A certified Canadian contractor referred to in and governed by Canada Minister of Justice, Technical Data Control Regulations SOR/86-345, May 27, 2014 current edition (available at
(d) This part may not be used by the DoD Components as authority to deny access to technical data and technology to the Congress or to any Federal, State, or local government agency that requires the technical data and technology for regulatory or other official government purposes. Dissemination of the technical data and technology will include a statement that DoD controls it, in accordance with this part.
(e) The authority in this part may not be used to withhold from public disclosure unclassified information regarding DoD operations, policies, activities, or programs, including the costs and evaluations of performance and reliability of military and space equipment. When information does contain technical data and technology subject to this part, the technical data and technology must be excised from what is disclosed publicly.
(f) This part may not be used as a basis for the release of limited rights or restricted rights data as defined in 48 CFR or those that are authorized to be withheld from public disclosure pursuant to 5 U.S.C. 552.
(g) This part may not be used to provide protection for technical data that should be classified in accordance with Executive Order 13526, “Classified National Security Information,” and volume 1 of DoD Manual 5200.01 (available at
(h) This part provides immediate authority to cite 5 U.S.C. 552(b)(3) (FOIA Exemption 3), described in 32 CFR part 286, as the basis for denials under 5 U.S.C. 552 of technical data and technology currently determined to be subject to the provisions of this part. The technical data will be withheld under the authority of 10 U.S.C. 130. If the information originated with or is under the control of a Government agency outside the DoD, DoD Components will refer the request to that agency for a release determination.
(i) Technical data and technology subject to this part must be marked in accordance with DoD Instruction 5230.24, “Distribution Statements on Technical Documents” (available at
(j) Technical data and technology subject to this part, when disseminated electronically, must be marked in accordance with volume 4 of DoD Manual 5200.01 and are subject to all applicable security requirements specified in DoD Instruction 8500.01, “Cybersecurity” (available at
(k) In accordance with DoD Instruction 5015.02, “DoD Records Management Program” (available at
(a) The Under Secretary of Defense for Acquisition, Technology, and Logistics (USD(AT&L)) has overall responsibility for the implementation of this part and will designate an office to:
(1) Administer and monitor compliance with this part.
(2) Receive and disseminate notifications of temporary revocation of contractor qualification in accordance with paragraph (e) of § 250.6.
(3) Receive recommendations for contractor disqualification made in accordance with paragraph (f) of § 250.6, and act as disqualification authority.
(4) Provide technical assistance when necessary to the DoD Components to assess the significance of the military or space application of technical data and technology that may be withheld from public disclosure in accordance with this part.
(5) Maintain and update procedures and appropriate mechanisms for the certification of qualified contractors, in accordance with paragraph (c) of § 250.4 of this part.
(6) Ensure that the requirements of this part are incorporated into 48 CFR for application to contracts involving technical data and technology governed by this part.
(7) Develop, in conjunction with the Office of the General Counsel of the Department of Defense (GC DoD), guidelines for responding to appeals, as identified in paragraph (k) of § 250.6.
(8) Develop procedures to ensure that the DoD Components apply consistent criteria in authorizing exceptions in accordance with paragraph (j) of § 250.6.
(9) Prescribe procedures to develop, collect, and disseminate certification statements; to ensure their sufficiency, accuracy, and periodic renewal; and to make final determinations of qualification.
(10) Take such other actions that may be required to ensure consistent and appropriate implementation of this part within the DoD.
(b) The Under Secretary of Defense for Policy (USD(P)):
(1) Prepares and issues policy guidance regarding the foreign disclosure and security controls for information in international programs within the scope of this part.
(2) Provides consultation to DoD offices on export control and commodity jurisdiction determinations.
(c) The Deputy Chief Management Officer (DCMO) of the Department of Defense:
(1) Monitors the implementation of the provisions of this part that pertain to 5 U.S.C. 552 and 32 CFR part 285.
(2) Provides such other assistance as may be necessary to ensure compliance with this part.
(d) The GC DoD:
(1) Advises DoD Components with respect to the statutory and regulatory requirements governing the export of technical data and technology.
(2) Advises the USD(AT&L) regarding consistent and appropriate implementation of this part.
(e) The DoD Component heads:
(1) Disseminate and withhold from public disclosure technical data and technology subject to this part consistent with its policies and procedures.
(2) Designate a focal point to:
(i) Ensure implementation of this part.
(ii) Identify classes of technical data and technology whose release is governed by paragraph (d)(3) of § 250.6.
(iii) Act on appeals relating to case-by-case denials for release of technical data and technology.
(iv) Temporarily revoke a contractor's qualification in accordance with paragraph (e) of § 250.6.
(v) Receive and evaluate requests for reinstatement of a contractor's qualification in accordance with paragraph (e)(4) of § 250.6.
(vi) Recommend a contractor's disqualification to the USD(AT&L) in accordance with paragraph (f) of § 250.6.
(3) Develop, distribute, and effect Component regulations to implement this part.
(4) Ensure that the controlling DoD office that created or sponsored the technical information exercises its inherently governmental responsibility to determine the appropriate marking in accordance with DoD Instruction 5230.24 and volumes 2 and 4 of DoD Manual 5200.01 (volume 2 available at
(a) Release of technical information must be handled in accordance with the following guidelines:
(1) DoD Components may make their technical information for other than military or space application available for public disclosure in accordance with DoD Directive 5230.09 and DoD Instruction 5230.29. DoD has the authority to withhold technical data and technology as defined in § 250.3 from public disclosure.
(2) DoD Components will process FOIA requests from the public for technical information in accordance with 32 CFR part 286 and governing DoD Component issuances. All requested technical data and technology currently determined to be subject to the withholding authority in this part will be denied under Exemption 3 of 5 U.S.C. 552 and 10 U.S.C. 130. Any FOIA appeals for the denied information will be processed in accordance with 32 CFR part 286 and governing DoD Component issuances.
(3) DoD Components may give qualified contractors access to their technical data and technology as permitted by the provisions of this part.
(i) The United States-Canada Joint Certification Office adjudicates certification of qualified contractors.
(ii) To qualify, U.S. and Canadian contractors must submit a completed DD Form 2345 “Militarily Critical Technical Data Agreement,” to the United States-Canada Joint Certification Office.
(iii) To qualify, a Canadian contractor will submit a completed DD Form 2345 when it intends to request access to DoD-controlled technical data and technology.
(iv) A copy of the company's State/Provincial Business License, Incorporation Certificate, Sales Tax Identification Form, ITAR Controlled Goods Registration letter or certificate, or other documentation that verifies the legitimacy of the company must accompany all DD Forms 2345.
(v) The contractor's business activity is a key element of the certification process since this information is used by the controlling office as a basis for approving or disapproving specific requests for technical data and technology. The business activity statement should be sufficiently detailed to support requests for any data that the contractor expects to need for legitimate business purposes.
(b) Upon receipt of a request for technical information in the possession of, or under the control of the DoD, the controlling DoD office for the requested information will determine whether the information is governed by this part.
(1) The determination will be based on whether:
(i) The information is subject to 22 CFR part 121 or 15 CFR part 774.
(ii) The information would require a license, exception, exemption, or other export authorization under U.S. export control laws and regulations, in accordance with 22 U.S.C. 2778, 50 U.S.C. chapter 35, 22 CFR parts 120 through 130, and 15 CFR parts 730 through 774.
(iii) The information would not fall into the categories of information described in paragraphs (c) and (d) of § 250.2.
(2) In making such a determination, the controlling office may consult with the Defense Technology Security Administration for advice on whether U.S. export control laws or regulations apply. The controlling DoD office may request assistance in making this determination from the USD(AT&L), and if necessary, consult the Departments of State, Commerce, or Energy.
(c) The controlling DoD office will ensure technical data and technology governed by this part are marked for distribution in accordance with DoD Instruction 5230.24 and volume 4 of DoD Manual 5200.01.
(d) The controlling DoD office will authorize release of technical data and technology governed by this part to qualified contractors, as defined in § 250.3, unless:
(1) The qualification of the contractor concerned has been temporarily revoked in accordance with paragraph (e) of this section;
(2) The controlling DoD office judges the requested technical data and technology to be unrelated to the purpose for which the qualified contractor is certified. When release of technical data and technology is denied in accordance with this paragraph, the controlling DoD office will request additional information to explain the intended use of the requested technical data and technology and, if appropriate, request a new certification (see § 250.3) describing the intended use of the requested technical data and technology; or
(3) The technical data and technology are being requested for a purpose other than to permit the requester to bid or perform on a contract with the DoD or other USG agency. In this case, the controlling DoD office will withhold the technical data and technology if the DoD Component focal point determines the release of the technical data and technology may jeopardize an important technological or operational military advantage of the United States.
(e) Upon receipt of substantial and credible information that a qualified U.S. contractor has violated U.S. export control law; violated its certification; made a certification in bad faith; or omitted or misstated a material fact, the DoD Component will temporarily revoke the U.S. contractor's qualification.
(1) The DoD Component may delay such a temporary revocation when the revocation would have the potential to compromise a USG investigation.
(2) Immediately upon a temporary revocation, the DoD Component will notify the contractor and the USD(AT&L).
(3) The contractor will be given an opportunity to respond in writing to the information upon which the temporary revocation is based before being disqualified.
(4) Any U.S. contractor whose qualification has been temporarily revoked may present information to the DoD Component showing that the basis for revocation was in error or has been remedied and be reinstated.
(f) When the basis for a contractor's temporary revocation cannot be removed within 20 working days, the DoD Component will recommend to the USD(AT&L) that the contractor be disqualified.
(g) After receipt of substantial and credible information that a qualified U.S. contractor has violated U.S. export control law, the DoD Component must notify the appropriate law enforcement agency.
(h) Charges for copying, certifying, and searching records rendered to requesters will be levied in accordance with chapter 4, appendix 2 of volume 11A of DoD 7000.14-R, “Department of Defense Financial Management Regulations (FMRs)” (available at
(i) Qualified U.S. contractors who receive technical data and technology governed by this part may disseminate that technical data and technology for
(1) To any foreign recipient for which the technical data and technology are approved, authorized, or licensed in accordance with 22 U.S.C. 2778 or 15 CFR parts 730 through 774.
(2) To another qualified U.S. contractor including existing or potential subcontractors, but only within the scope of the certified legitimate business purpose of the recipient.
(3) To the Departments of State and Commerce to apply for approvals, authorizations, or licenses for export pursuant to 22 U.S.C. 2778 or 15 CFR parts 730 through 774. The application will include a statement that the technical data and technology for which the approval, authorization, or license is sought is controlled by the DoD in accordance with this part.
(4) To the Congress or any Federal, State, or local governmental agency for regulatory purposes or otherwise as may be required by law or court order. Any such dissemination will include a statement that the technical data and technology are controlled by the DoD in accordance with this part.
(j) A qualified contractor desiring to disseminate technical data and technology subject to this part in a manner not permitted expressly by the terms of this part must be granted authority to do so by the controlling DoD office, consistent with U.S. export control laws and regulations specified in 22 U.S.C. 2778, 50 U.S.C. chapter 35, 22 CFR parts 120 through 130, and 15 CFR parts 730 through 774 and DoD policies.
(k) Any requester denied technical data and technology or any qualified U.S. contractor denied permission to disseminate such technical data and technology in accordance with this part will be promptly provided with a written statement of reasons for that action, and advised of the right to make a written appeal to a specifically identified appellate authority within the DoD Component. Other appeals will be processed as directed by the USD(AT&L).
(l) Denials will cite 10 U.S.C. 130 and 133 as implemented by this part. Implementing procedures will provide for resolution of any appeal within 20 working days.
(a) USG officials and certified U.S. contractors and Canadian government officials and certified Canadian contractors may use the certification process to facilitate directly arranged visits that involve access to unclassified technical data and technology. Activities under this process are limited to:
(1) Procurement activities such as unclassified pre-solicitation conferences, discussions related to unclassified solicitations, and collection of unclassified procurement documents.
(2) Performance of an unclassified contract.
(3) Scientific research, in support of unclassified U.S. or Canadian national defense initiatives.
(4) Attendance at restricted meetings, conferences, symposia, and program briefings where technical data and technology governed by this part or Canada Minister of Justice, Technical Data Control Regulations SOR/86-345, May 27, 2014 current edition will be presented, or the event is being held in an unclassified access controlled area.
(b) A directly arranged visit does not apply to uncertified U.S. or Canadian contractors; classified visits, where confirmation of the visitors' security clearances is required; or unsolicited marketing visits.
(c) A directly arranged visit related to the release of information controlled in the United States by this part or in Canada by Canada Minister of Justice, Technical Data Control Regulations SOR/86-345, May 27, 2014 current edition, is permitted when two conditions are satisfied.
(1) First condition:
(i) There is a valid license covering the export of the technical data and technology;
(ii) The export or release is permitted under the Canadian exemption in 22 CFR 126.5;
(iii) The export or release is covered by the general exemptions in 22 CFR 125.4; or
(iv) The export or release qualifies for a license exception under 15 CFR parts 730 through 774.
(2) Second condition:
(i) The distribution statement applied to the technical data and technology pursuant to DoD Instruction 5230.24 permits release; or
(ii) The originator or government controlling office authorizes release.
Environmental Protection Agency (EPA).
Proposed rule.
The Environmental Protection Agency (EPA) is proposing to conditionally approve State Implementation Plan (SIP) revisions submitted by the state of Utah on August 20, 2013, with supporting administrative documentation submitted on September 12, 2013. These submittals revise the provisions of the Utah Administrative Code (UAC) that pertain to the issuance of Utah air quality permits for major sources in nonattainment areas. The EPA is proposing a conditional approval because, while the submitted revisions to Utah's nonattainment permitting rules do not fully address the deficiencies in the state's program, Utah has committed to address the remaining deficiencies in its nonattainment permitting program no later than one year after the EPA finalizes this conditional approval. If this action is finalized, and upon the EPA finding that Utah has met this commitment fully and on time, the conditional approval of the SIP revisions would convert to a final approval of Utah's plan. This action is being taken under section 110 of the Clean Air Act (CAA or Act).
Written comments must be received on or before November 30, 2016.
Submit your comments, identified by EPA-R08-OAR-2016-0620 at
Kevin Leone, Air Program, EPA, Region 8, Mailcode 8P-AR, 1595 Wynkoop Street, Denver, Colorado 80202-1129, (303) 312-6227,
i. Identify the rulemaking by docket number and other identifying information (subject heading, Federal Register date and page number).
ii. Follow directions—The agency may ask you to respond to specific questions or organize comments by referencing a Code of Federal Regulations (CFR) part or section number.
iii. Explain why you agree or disagree; suggest alternatives and substitute language for your requested changes.
iv. Describe any assumptions and provide any technical information and/or data that you used.
v. If you estimate potential costs or burdens, explain how you arrived at your estimate in sufficient detail to allow for it to be reproduced.
vi. Provide specific examples to illustrate your concerns, and suggest alternatives.
vii. Explain your views as clearly as possible, avoiding the use of profanity or personal threats.
viii. Make sure to submit your comments by the comment period deadline identified.
On May 10, 2001, the EPA sent Utah a letter outlining concerns that Utah's nonattainment permitting rules, which are codified in UAC R307-403 (Permits: New and Modified Sources in Nonattainment Areas and Maintenance Areas), were not consistent with federal requirements (see docket EPA-R08-OAR-2016-0620). On August 20, 2013, with supporting administrative documentation submitted on September 12, 2013, Utah sent the EPA revisions to its nonattainment permitting regulations, specifically to address EPA-identified deficiencies in its nonattainment permitting regulations that affected the EPA's ability to approve Utah's PM
The SIP revisions submitted by the Utah Division of Air Quality (UDAQ) on August 20, 2013, establish specific nonattainment new source review permitting requirements. In this revision, the UDAQ has incorporated federal regulatory language—establishing permitting requirements for new and modified major stationary sources in a nonattainment area—from portions of 40 CFR 51.165 and reformatted it into state-specific requirements for sources in Utah under R307-403-1 (Purpose and Definitions) and R307-403-2 (Applicability), including provisions relevant to nonattainment NSR programs for PM
CAA section 110(a)(2)(C) requires each state plan to include “a program to provide for . . . regulation of the modification and construction of any stationary source within the areas covered by the plan as necessary to assure that [NAAQS] are achieved, including a permit program as required in parts C and D of this subchapter,” and CAA section 172(c)(5) provides that the plan “shall require permits for the construction and operation of new or modified major stationary sources anywhere in the nonattainment area, in accordance with section [173].” CAA section 173 lays out the requirements for obtaining a permit that must be included in a state's SIP-approved permit program. CAA section 110(a)(2)(A) requires that SIPs contain enforceable emissions limitations and other control measures. Under CAA section 110(a)(2), the enforceability requirement in section 110(a)(2)(A) applies to all plans submitted by a state. CAA section 110(i) (with certain limited exceptions) prohibits states from modifying SIP requirements for stationary sources except through the SIP revision process. CAA section 172(c)(7) requires that nonattainment plans, including nonattainment New Source Review (NSR) programs required by section 172(c)(5), meet the applicable provisions of section 110(a)(2), including the requirement in section 110(a)(2)(A) for enforceable emission limitations and other control measures. CAA section 110(l) provides that the
Section 51.165 in title 40 of the CFR (Permit Requirements) sets out the minimum plan requirements states are to meet within each SIP nonattainment NSR permitting program. Generally, 40 CFR 51.165 consists of a set of definitions, minimum plan requirements regarding procedures for determining applicability of nonattainment NSR and use of offsets, and minimum plan requirements regarding other source obligations, such as recordkeeping.
Specifically, subparagraphs 51.165(a)(1)(i) through (xlvi) enumerate a set of definitions that states must either use or replace with definitions that a state demonstrates are more stringent or at least as stringent in all respects. Subparagraph 51.165(a)(2) sets minimum plan requirements for procedures to determine the applicability of the nonattainment NSR program to new and modified sources. Subparagraphs 51.165(a)(3), (a)(9), and (a)(11) set minimum plan requirements for the use of offsets by sources subject to nonattainment NSR requirements. Subparagraphs (a)(8) and (a)(10) address precursors, and subparagraphs (a)(6) and (a)(7) address recordkeeping obligations. Subparagraph 51.165(a)(4) allows nonattainment NSR programs to treat fugitive emissions in certain ways. Subparagraph 51.165(a)(5) addresses enforceable procedures that apply after approval to construct has been granted. Subparagraph 51.165(b) sets minimum plan requirements for new major stationary sources and major modifications in attainment and unclassifiable areas that would cause or contribute to violations of the national ambient air quality standards (NAAQS). Finally, subparagraph 51.165(f) sets minimum plan requirements for the use of plantwide applicability limitations (PALs). Please refer to docket EPA-R08-OAR-2016-0620 to view a crosswalk table that outlines how Utah's nonattainment permitting rules correlate with the requirements of 40 CFR 51.165.
Clean Air Act section 189(e) requires that state SIPs apply the same control requirements that apply to major stationary sources of PM
As a result, it became clear that Utah needed to submit further revisions to address remaining deficiencies in the nonattainment permitting program for the EPA to approve the August 20, 2013, submittal. Among those deficiencies was that Utah had not submitted an analysis demonstrating that sources of ammonia, as a PM
1. UDAQ commits to submit a SIP revision that either regulates major stationary sources of the pursuant to Utah's nonattainment new source review (NNSR) permitting program, consistent with all applicable federal regulatory requirements or demonstrates that sources of ammonia, as a PM
2. UDAQ commits to revise R307-403-2 consistent with the new definitions in 40 CFR 51.165 that EPA recently finalized in the PM
3. UDAQ commits to revise R307-403-3, including R307-403-3(3), to remove the reference to NNSR determinations being made “at the time of the source's proposed start-up date”;
4. UDAQ commits to revise R307-403-3, including R307-403-3(2) and R307-403-3(3), to specify that NNSR permit requirements are applicable to all new major stationary sources or major modifications located in a nonattainment area that are major for the pollutant for which the area is designated nonattainment;
5. UDAQ commits to revise R307-403-3, in addition to the previously adopted definition of lowest achievable emission rate (LAER) in R307-403-1, to explicitly state that LAER applies to all major new sources and major modifications for the relevant pollutants in nonattainment areas;
6. UDAQ commits to revise R307-403-4 to incorporate the requirements from 40 CFR 51.165 to establish that all general offset permitting requirements apply for all offsets regardless of the pollutant at issue, and to revise the provision to impose immediate and direct general offset permitting requirements on all new major stationary sources or major modifications located in a nonattainment area that are major for the pollutant for which the area is designated nonattainment;
7. UDAQ commits to work with the Utah Air Quality Board to revise R307-403-4 to reference the criteria discussed in section IV.D. of 40 CFR 51, Appendix S; and
8. UDAQ will update R307-403 to include a new section that imposes requirements that address emission offsets for PM
Under section 110(k)(4) of the Act, the EPA may approve a SIP revision based on a commitment by the state to adopt specific enforceable measures by a date certain, but not later than one year after the date of approval of the plan revision.
The EPA is proposing to conditionally approve Utah's revisions submitted on August 20, 2013, which have not been withdrawn by Utah. These revisions addressed R307-403-1 (Purpose and Definitions), R307-403-2 (Applicability), R307-403-11 (Actual PALs), and R307-420 (Ozone Offset Requirements in Davis and Salt Lake Counties). In addition, Utah moved R307-401-19 (Analysis of Alternatives) to R307-403-10 and moved R307-401-20 (Relaxation of Limits) to R307-403-2. The EPA proposes that these changes, when combined with the changes Utah has committed to submitting to the EPA by December 8, 2017, in Utah's September 30, 2016, commitment letter, create enforceable obligations for sources and are consistent with the CAA and EPA regulations, including the requirements of CAA sections 110(a)(2)(A), 110(a)(2)(C), 110(i), 110(l), 172(c)(5), 172(c)(7), and 173.
The crosswalk table in the docket details how the submittal corresponds to specific requirements in 40 CFR 51.165; however, as stated earlier, we are not proposing to determine that Utah's PM
Specifically, we are proposing to conditionally approve:
Section R307-401-19 being removed from R307-401 and added to R307-403-10. Because this section applies only to major sources or major modifications that are located in a nonattainment area or impact a nonattainment area, this section is more appropriately located in R307-403.
Section R307-401-20 being removed from R307-401 and added to R307-403-2. Because this section applies only to major sources or major modifications that are located in a nonattainment area or impact a nonattainment area, this section is more appropriately located in R307-403.
Language being added in R307-403-1(1)-(4) to parallel federal nonattainment permitting regulations in 40 CFR 51.165; however, Utah committed to addressing further deficiencies regarding ammonia as a precursor to PM
In particular, R307-403-1(4)(b) states that “ammonia is not a precursor to PM
The title of this section being changed from “Emission Limitations” to “Applicability” and language being added to R307-403-2(1)-(12) to parallel federal nonattainment permitting regulations in 40 CFR 51.165; however, Utah committed to addressing further deficiencies in this section in its September 30, 2016, commitment letter. Utah committed to revise R307-403-2 consistent with the new definitions in 40 CFR 51.165 that the EPA recently finalized in the PM
On September 23, 2016, Utah submitted a letter to the EPA requesting to withdraw R307-403-2(12) (see docket EPA-R08-OAR-2016-0620). As a result, we will not be acting on that subparagraph.
R307-403-11 being added to implement a portion of the EPA's NSR Reform provisions that were adopted in the federal regulations in 2002 and have not yet been incorporated into the Utah Air Quality Rules. R307-403-11 incorporates by reference the provisions of 40 CFR 51.165(f)(1) through (14).
This rule being revised to include the definitions and applicability provisions of R307-403-1. This rule change will ensure that the definitions and applicability provisions in R307-420 are consistent with related permitting rules in R307-403.
UDAQ additionally committed to submit a revised SIP by December 8, 2017, to: (1) Revise R307-403-3, including R307-403-3(3), to remove the reference to NNSR determinations being made “at the time of the source's proposed start-up date”; (2) revise R307-403-3, including R307-403-3(2) and R307-403-3(3), to specify that NNSR permit requirements are applicable to all new major stationary sources or major modifications located in a nonattainment area that are major for the pollutant for which the area is designated nonattainment; (3) revise R307-403-3, in addition to the previously adopted definition of LAER in R307-403-1, to explicitly state that LAER applies to all major new sources and major modifications for the relevant pollutants in nonattainment areas; (4) revise R307-403-4 to incorporate the requirements from 40 CFR 51.165 to
Under section 110(l) of the CAA, the EPA cannot approve a SIP revision if the revision would interfere with any applicable requirements concerning attainment and reasonable further progress (RFP) toward attainment of the NAAQS, or any other applicable requirement of the Act. In addition, section 110(l) requires that each revision to an implementation plan submitted by a state shall be adopted by the state after reasonable notice and public hearing.
The Utah SIP revisions that the EPA is proposing to approve do not interfere with any applicable requirements of the Act. The revisions to R307-401 and R307-403 submitted by Utah on August 20, 2013, are intended to strengthen the SIP. Therefore, CAA section 110(l) requirements are satisfied.
In this rule, the EPA is proposing to include in a final EPA rule regulatory text that includes incorporation by reference. In accordance with requirements of 1 CFR 51.5, the EPA is proposing to incorporate by reference the UDAQ rules promulgated in the DAR, R307-400 Series as discussed in section III of this preamble. The EPA has made, and will continue to make, these materials generally available through
Under the Clean Air Act, the Administrator is required to approve a SIP submission that complies with the provisions of the Act and applicable federal regulations. 42 U.S.C. 7410(k); 40 CFR 52.02(a). Thus, in reviewing SIP submissions, the EPA's role is to approve state choices, provided that they meet the criteria of the Clean Air Act. Accordingly, this action merely proposes to approve state law as meeting federal requirements and does not impose additional requirements beyond those imposed by state law. For that reason, this proposed action:
• Is not a “significant regulatory action” subject to review by the Office of Management and Budget under Executive Order 12866 (58 FR 51735, October 4, 1993);
• does not impose an information collection burden under the provisions of the Paperwork Reduction Act (44 U.S.C. 3501
• is certified as not having a significant economic impact on a substantial number of small entities under the Regulatory Flexibility Act (5 U.S.C. 601
• does not contain any unfunded mandate or significantly or uniquely affect small governments, as described in the Unfunded Mandates Reform Act of 1995 (Pub. L. 104-4);
• does not have federalism implications as specified in Executive Order 13132 (64 FR 43255, August 10, 1999);
• is not an economically significant regulatory action based on health or safety risks subject to Executive Order 13045 (62 FR 19885, April 23, 1997);
• is not a significant regulatory action subject to Executive Order 13211 (66 FR 28355, May 22, 2001);
• is not subject to requirements of Section 12(d) of the National Technology Transfer and Advancement Act of 1995 (15 U.S.C. 272 note) because application of those requirements would be inconsistent with the CAA; and
• does not provide the EPA with the discretionary authority to address, as appropriate, disproportionate human health or environmental effects, using practicable and legally permissible methods, under Executive Order 12898 (59 FR 7629, February 16, 1994).
In addition, the SIP is not approved to apply on any Indian reservation land or in any other area where the EPA or an Indian tribe has demonstrated that a tribe has jurisdiction. In those areas of Indian country, the proposed rule does not have tribal implications and will not impose substantial direct costs on tribal governments or preempt tribal law as specified by Executive Order 13175 (65 FR 67249, November 9, 2000).
Environmental protection, Air pollution control, Carbon monoxide, Intergovernmental relations, Incorporation by reference, Lead, Nitrogen dioxide, Ozone, Particulate matter, Reporting and recordkeeping requirements, Sulfur oxides, Volatile organic compounds.
42 U.S.C. 7401
Environmental Protection Agency (EPA).
Proposed rule; extension of comment period.
On September 29, 2016, the Environmental Protection Agency (EPA) proposed a rule titled, “Mercury and Air Toxics Standards (MATS) Completion of Electronic Reporting Requirements.” The EPA is extending the comment period on the proposed rule that was scheduled to close on October 31, 2016, by 15 days until November 15, 2016. The EPA is making this change based on three requests for additional time to prepare comments on this proposed rule.
The public comment period for the proposed rule published in the
The EPA has established a docket for the proposed rulemaking (available at
For additional submission methods, the full EPA public comment policy, and general guidance on making effective comments, please visit
For additional information on this action, contact Barrett Parker, Sector Policies and Programs Division, Office of Air Quality Planning and Standards (D243-05), Environmental Protection Agency, Research Triangle Park, NC 27711; telephone number: (919) 541-5635; email address:
To allow additional time for stakeholders to provide comments, the EPA has decided to extend the public comment period until November 15, 2016.
Bureau of Land Management, Interior.
Proposed supplementary rule.
The Bureau of Land Management (BLM) is proposing a supplementary rule addressing conduct on public lands in the vicinity of Corona Arch and Gemini Bridges in Grand County, Utah. The proposed supplementary rule would prohibit roped activities around Corona Arch and Gemini Bridges. Such activities involve the use of ropes or other climbing aids, and include, but are not limited to, ziplining, highlining, slacklining, traditional rock climbing, sport rock climbing, rappelling, and swinging.
Comments on the proposed supplementary rule must be received or postmarked by December 30, 2016, to be assured of consideration. Comments received, postmarked, or electronically dated after that date will not necessarily be considered in the development of the final supplementary rule.
Please mail or hand deliver all comments concerning the proposed supplementary rule to the Bureau of Land Management, 82 E. Dogwood, Moab, UT 84532, or email comments to Katie Stevens, at
Beth Ransel, Moab Field Manager, BLM Moab Field Office, 82 E. Dogwood, Moab, UT 84532, or telephone (435) 259-2110. Persons who use a telecommunications device for the deaf (TDD) may call the Federal Information Relay Service (FIRS) at 800-877-8339 to leave a message or question with the above individual. The FIRS is available 24 hours a day, 7 days a week. You will receive a reply during normal business hours.
The public is invited to provide comments on the proposed supplementary rule. See the
Written comments on the proposed supplementary rule should be specific, confined to issues pertinent to the proposed supplementary rule, and should explain the reason for any recommended change. Where possible, comments should reference the specific section of the rule that the comment is addressing. Comments, including names, street addresses, and other contact information of respondents, will be available for public review at 82 E. Dogwood, Moab, UT 84532, during regular business hours (8:00 a.m. to 4:30 p.m.), Monday through Friday, except Federal holidays. Before including your address, telephone number, email address, or other personal identifying information in your comment, be advised that your entire comment, including your personal identifying information, may be made publicly available at any time. While you can ask us in your comment to withhold from public review your personal identifying information, we cannot guarantee that we will be able to do so.
The BLM establishes supplementary rules under the authority of 43 CFR 8365.1-6, which allows the BLM State Directors to establish such rules for the protection of persons, property, and public lands and resources. This regulation allows the BLM to issue rules of less than national effect without codifying the rules in the Code of Federal Regulations.
Corona Arch and Gemini Bridges are two of the most popular locations in the Moab Field Office. Corona Arch is a partly freestanding arch with a 110-foot by 110-foot opening. Gemini Bridges are two large arches standing side-by-side. Corona Arch is visited by approximately 40,000 visitors per year, and Gemini Bridges are visited by approximately 50,000 visitors per year. The BLM has received many complaints that roped activities, including swinging from the arches, conflict with other visitors' use and enjoyment of the arches. The BLM finds merit in these complaints. People setting up and using swings and rappels from the arches endanger both themselves and those viewing them from below. In addition, the rock arches may be damaged by ropes “sawing” on the rock spans. The supplementary rules currently in effect in the Moab Field Office (at 81 FR 9498 (Feb. 25, 2016)) do not address roped activities on the affected arches, although a temporary restriction (80 FR 27703 (May 14, 2015)) is in effect until May 2017.
The legal descriptions of the affected public lands are:
The areas described aggregate 37.3 acres.
This proposed supplementary rule would allow for enforcement as a tool in minimizing the adverse effects of roped activities within the affected areas. After it goes into effect, the supplementary rule will be available for inspection in the Moab Field Office, and it will be announced broadly through the news media and direct mail to the constituents included on the Moab Field Office mailing list. It will also be posted on signs at main entry points to the affected areas.
The Moab Field Office proposes to ban roped activities in the vicinity of Corona Arch and Gemini Bridges. The prohibited activities would include, but not be limited to, ziplining, highlining, slacklining, traditional rock climbing, sport rock climbing, rappelling, and swinging, using equipment such as ropes, cables, climbing aids, webbing, or anchors. The proposed supplementary rule would affect 31 acres surrounding Corona Arch and 6.3 acres surrounding Gemini Bridges. The proposed supplementary rule is necessary for the protection of visitors and for the protection of the arches.
This proposed supplementary rule is not a significant regulatory action and is not subject to review by the Office of Management and Budget under Executive Order 12866. This proposed supplementary rule would not have an annual effect of $100 million or more on the economy. It is not intended to affect commercial activity, but imposes a rule of conduct on recreational visitors for public safety and resource protection reasons in a limited area of public lands. This supplementary rule would not adversely affect, in a material way, the economy, productivity, competition, jobs, the environment, public health or safety, or State, local, or tribal governments or communities. This proposed supplementary rule would not create a serious inconsistency or otherwise interfere with an action taken or planned by another agency. This proposed supplementary rule does not materially alter the budgetary effects of entitlements, grants, user fees, or loan programs or the right or obligations of their recipients, nor does it raise novel legal or policy issues; it merely strives to protect public safety and the environment.
Executive Order 12866 requires each agency to write regulations that are simple and easy to understand. The BLM invites comments on how to make this proposed supplementary rule easier to understand, including answers to questions such as the following:
(1) Are the requirements in the proposed supplementary rule clearly stated?
(2) Does the proposed supplementary rule contain technical language or jargon that interferes with its clarity?
(3) Does the format of the proposed supplementary rule (grouping and order of sections, use of headings, paragraphing, etc.) aid or reduce its clarity?
(4) Would the proposed supplementary rule be easier to understand if it were divided into more (but shorter) sections?
(5) Is the description of the proposed supplementary rule in the
Please send any comments you have on the clarity of the proposed supplementary rule to the address specified in the
A temporary restriction on roped activities was analyzed in Environmental Assessment (EA) DOI-BLM-UT-2014-0170-EA,
Congress enacted the Regulatory Flexibility Act (RFA), 5 U.S.C. 601-612, to ensure that Government regulations do not unnecessarily or disproportionately burden small entities. The RFA requires a regulatory flexibility analysis if a rule would have a significant economic impact, either detrimental or beneficial, on a substantial number of small entities. The proposed supplementary rule does not pertain specifically to commercial or governmental entities of any size, but to public recreational use of specific public lands. Therefore, the BLM has determined under the RFA that the proposed supplementary rule would not have a significant economic impact on a substantial number of small entities.
This proposed supplementary rule would not constitute a “major rule” as defined at 5 U.S.C. 804(2). This proposed supplementary rule merely contains rules of conduct for recreational use of public lands. This proposed rule would not affect business, commercial, or industrial use of public lands.
This proposed supplementary rule would not pose an unfunded mandate on State, local, or tribal governments of more than $100 million per year; nor would it have a significant or unique effect on small governments. This proposed supplementary rule does not require anything of State, local, or tribal governments. Therefore, the BLM is not required to prepare a statement containing the information required by the Unfunded Mandates Reform Act, 2 U.S.C. 1531
This proposed supplementary rule is not a government action capable of interfering with constitutionally protected property rights. This proposed supplementary rule does not address property rights in any form, and does not cause the impairment of anybody's property rights. Therefore, the BLM has determined that this proposed supplementary rule would not cause a taking of private property or require further discussion of takings implications under this Executive Order.
This proposed supplementary rule would not have a substantial direct effect on the states, on the relationship between the Federal government and the states, or on the distribution of power and responsibilities among the various levels of government. Therefore, the BLM has determined that this proposed supplementary rule does not have sufficient Federalism implications to warrant preparation of a Federalism assessment.
Under Executive Order 12988, the BLM has determined that this proposed supplementary rule would not unduly burden the judicial system and that the requirements of sections 3(a) and 3(b)(2) of the Order are met. This supplementary rule contains rules of conduct for recreational use of certain public lands to protect public safety and the environment.
In accordance with Executive Order 13175, the BLM has found that this proposed supplementary rule does not include policies that have tribal implications. This proposed supplementary rule does not affect lands held in trust for the benefit of Native American tribes, individual Indians, Aleuts, or others, nor does it affect lands for which title is held in fee status by Indian tribes or U.S. Government-owned lands managed by the Bureau of Indian Affairs.
This proposed supplementary rule does not contain information collection requirements that the Office of Management and Budget must approve under the Paperwork Reduction Act, 44 U.S.C. 3501
This proposed supplementary rule does not comprise a significant energy action. This proposed supplementary rule would not have an adverse effect on energy supplies, production, or consumption. It only addresses rules of conduct for recreational use of certain public lands to protect public safety and the environment, and has no connection with energy policy.
The principal author of the proposed supplementary rule is Beth Ransel, Field Manager for the Moab Field Office, Utah.
For the reasons stated in the preamble, and under the authority for supplementary rules at 43 U.S.C. 1740 and 43 CFR 8365.1-6, the Utah State Director, BLM, proposes to issue this supplementary rule for public lands managed by the BLM in Utah, to read as follows:
1. You must not participate in any roped activities on public lands in the vicinity of Corona Arch or Gemini Bridges. This prohibition includes, but is not limited to, the use of ropes, cables, climbing aids, webbing, anchors, and similar devices.
The following persons are exempt from this supplementary rule: Any Federal, State, local government officer or employee in the scope of their duties; members of any organized law enforcement, rescue, or firefighting force in performance of an official duty; and any persons, agencies, municipalities or companies whose activities are authorized in writing by the BLM.
Any person who violates this supplementary rule may be tried before a United States Magistrate and fined in accordance with 18 U.S.C. 3571, imprisoned no more than 12 months under 43 U.S.C. 1733(a) and 43 CFR 8360.0-7, or both. In accordance with 43 CFR 8365.1-7, State or local officials may also impose penalties for violations of Utah law.
Federal Communications Commission.
Petitions for reconsideration and clarification.
Petitions for Reconsideration and Clarification (Petitions) have been filed in the Commission's rulemaking proceeding by Andrew D. Lipman, on behalf of Submarine Cable Coalition, and Kent D. Bressie, on behalf of North American Submarine Cable Association.
Oppositions to the Petitions must be filed on or before November 15, 2016. Replies to an opposition must be filed on or before November 25, 2016.
Federal Communications Commission, 445 12th Street SW., Washington, DC 20554.
Peter Shroyer, Public Safety and Homeland Security Bureau, email:
This is a summary of the Commission's document, Report No. 3052, released October 12, 2016. The full text of the Petitions is available for viewing and copying at the FCC Reference Information Center, 445 12th Street SW., Room CY-A257, Washington, DC 20554 or may be accessed online via the Commission's Electronic Comment Filing System at
The Department of Agriculture has submitted the following information collection requirement(s) to OMB for review and clearance under the Paperwork Reduction Act of 1995, Public Law 104-13. Comments are requested regarding (1) whether the collection of information is necessary for the proper performance of the functions of the agency, including whether the information will have practical utility; (2) the accuracy of the agency's estimate of burden including the validity of the methodology and assumptions used; (3) ways to enhance the quality, utility and clarity of the information to be collected; and (4) ways to minimize the burden of the collection of information on those who are to respond, including through the use of appropriate automated, electronic, mechanical, or other technological collection techniques or other forms of information technology.
Comments regarding this information collection received by November 30, 2016 will be considered. Written comments should be addressed to: Desk Officer for Agriculture, Office of Information and Regulatory Affairs, Office of Management and Budget (OMB), New Executive Office Building, 725 17th Street NW., Washington, DC 20502. Commenters are encouraged to submit their comments to OMB via email to:
An agency may not conduct or sponsor a collection of information unless the collection of information displays a currently valid OMB control number and the agency informs potential persons who are to respond to the collection of information that such persons are not required to respond to the collection of information unless it displays a currently valid OMB control number.
Forest Service, USDA.
Notice of meeting.
The Forest Resource Coordinating Committee (Committee) will meet in Washington, DC. The Committee is authorized under Section 8005 of the Food, Conservation, and Energy Act of 2008 (the Act) (Pub. L. 110-246). Additional information concerning the Committee, including the meeting agenda, supporting documents and minutes, can be found by visiting the Committee's Web site at
The meeting will be held on the following dates and time:
All meetings are subject to cancellation. For updated status of the meeting prior to attendance, please contact the person listed under
The meeting will be held at the Hotel Indigo, Inspiration Conference Room, 151 Haywood Street, Asheville, North Carolina.
Written comments may be submitted as described under
Scott Stewart, Forest Resource Coordinating Committee Designated Federal Officer, Cooperative Forestry Staff by phone at 202-205-1618 or Jennifer Helwig, Forest Resource Coordinating Committee Program Coordinator, Cooperative Forestry Staff by phone at 202-205-0892. Individuals who use telecommunication devices for the deaf (TDD) may call the Federal Information Relay Service (FIRS) at 1-800-877-8339 between 8:00 a.m. and 8:00 p.m., Eastern Daylight Time, Monday through Friday.
The purpose of the meeting is to:
1. Discuss current and emerging recommendation efforts and develop a briefing paper for the incoming Administration;
2. Meet with partners to hear concerns and identify opportunities to collaborate; and
3. Conduct Work Group breakout sessions.
The meeting is open to the public. The agenda will include time for people to make oral statements of three minutes or less. Individuals wishing to make an oral statement should submit a request in writing by November 3, 2016 to be scheduled on the agenda. Anyone who would like to bring related matters to the attention of the Committee may file written statements with the Committee staff before November 3, 2016. Written comments and time requests for oral comments must be sent to Scott Stewart, 1400 Independence Ave. SW., Mailstop 1123, Washington, DC 20250; or by email to
Forest Service, USDA.
Notice of meeting.
The Allegheny Resource Advisory Committee (RAC) will meet in Warren, Pennsylvania. The committee is authorized under the Secure Rural Schools and Community Self-Determination Act (the Act) and operates in compliance with the Federal Advisory Committee Act. The purpose of the committee is to improve collaborative relationships and to provide advice and recommendations to the Forest Service concerning projects and funding consistent with Title II of the Act. Additional RAC information can be found at the following Web site:
The meeting will be held Thursday, December 8, 2016, at 10:00 a.m. EST.
All RAC meetings are subject to cancellation. For status of meeting prior to attendance, please contact the person listed under
The meeting will be held at the Allegheny National Forest Supervisor's Office, 4 Farm Colony Drive, Warren, Pennsylvania.
Written comments may be submitted as described under
Ruth Sutton, RAC Coordinator by phone at 814-728-6100, or via email at
Individuals who use telecommunication devices for the deaf (TDD) may call the Federal Information Relay Service (FIRS) at 1-800-877-8339 between 8:00 a.m. and 8:00 p.m., Eastern Standard Time, Monday through Friday. Please make requests in advance for sign language interpreting, assistive listening devices or other reasonable accommodation for access to the facility or proceedings by contacting the person listed above.
The purpose of the meeting is to review and approve project submissions.
The meeting is open to the public. The agenda will include time for people to make oral statements of three minutes or less. Individuals wishing to make an oral statement should submit a request in writing by November 30, 2016, to be scheduled on the agenda. Anyone who would like to bring related matters to the attention of the committee may file written statements with the committee staff before or after the meeting. Written comments and requests for time to make oral comments must be sent to Ruth Sutton, RAC Coordinator, Allegheny National Forest Supervisor's Office, 4 Farm Colony Drive, Warren, Pennsylvania 16365; by email to
Forest Service, USDA.
Notice of meeting.
The National Urban and Community Forestry Advisory Council (Council) will meet in Washington, DC. The Council is authorized under Section 9 of the Cooperative Forestry Assistance Act, as amended by Title XII, Section 1219 of Public Law 101-624 (the Act) (16 U.S.C. 2105g) and the Federal Advisory Committee Act (FACA) (5 U.S.C. App. II). Additional information concerning the Council can be found by visiting the Council's Web site at:
The meeting will be held on the following dates and times:
• Monday, November 13, 2016, from 9:00 a.m. to 5:00 p.m. Central Time, or until Council business is completed. All meetings are subject to cancellation. For an updated status of the meeting prior to attendance, please contact the person listed under
The meeting will be held at the Santa Fe Room, Indianapolis Marriott Downtown, 350 West Maryland Street, Indianapolis, Indiana.
Written comments concerning this meeting should be submitted as described under
Nancy Stremple, Executive Staff, National Urban and Community Forestry Advisory Council, Sidney Yates Building, Room 3SC-01C, 201 14th Street SW., Washington, DC 20024 by telephone at 202-205-7829, or by email at
The purpose of the meeting is to:
1. Introduce new members;
2. Finalize the 2016 Accomplishments and Recommendations;
3. Update status of the 2017 grant review;
4. Listen to local constituents' urban forestry concerns;
5. Provide updates on the 10-year action plan (2016-2026);
6. Receive Forest Service budget and program updates; and
The meeting is open to the public. The agenda will include time for people to make oral statements of three minutes or less. Individuals wishing to make an oral statement should submit a request in writing by November 1, 2016, to be scheduled on the agenda. Council discussion is limited to Forest Service staff and Council members; however, anyone who would like to bring urban and community forestry matters to the attention of the Council may file written statements with the Council's staff before or after the meeting. Written comments and time requests for oral comments must be sent to Nancy Stremple, Executive Staff, National Urban and Community Forestry Advisory Council, Sidney Yates Building, Room 3SC-01C, 201 14th Street SW., Washington, DC 20024, or by email at
National Agricultural Statistics Service (NASS), Department of Agriculture.
Solicitation of Nominations to the Advisory Committee on Agriculture Statistics.
In accordance with the Federal Advisory Committee Act, 5 U.S.C. App. 2, this notice announces an invitation from the Office of the Secretary of Agriculture for nominations to the Advisory Committee on Agriculture Statistics.
On August 15, 2016, the Secretary of Agriculture renewed the Advisory Committee charter for a two-year term to expire on August 15, 2018. The purpose of the Committee is to advise the Secretary of Agriculture on the scope, timing, content, etc., of the periodic censuses and surveys of agriculture, other related surveys, and the types of information to obtain from respondents concerning agriculture. The Committee also prepares recommendations regarding the content of agriculture reports and presents the views and needs for data of major suppliers and users of agriculture statistics.
The nomination period for interested candidates will close 30 days after publication of this notice.
You may submit nominations by any of the following methods:
•
•
•
•
Renee Picanso, Associate Administrator, National Agricultural Statistics Service, (202) 720-2707.
Each person nominated to serve on the committee is required to submit the following form: AD-755 (Advisory Committee Membership Background Information, OMB Number 0505-0001), available on the Internet at
The Committee, appointed by the Secretary of Agriculture, consists of 20 members representing a broad range of disciplines and interests, including, but not limited to, producers, representatives of national farm organizations, agricultural economists, rural sociologists, farm policy analysts, educators, State agriculture representatives, and agriculture-related business and marketing experts.
Members serve staggered 2-year terms, with terms for half of the Committee members expiring in any given year. Nominations are being sought for 10 open Committee seats. Members can serve up to 3 terms for a total of 6 consecutive years. The Chairperson of the Committee shall be elected by members to serve a 1-year term.
Equal opportunity practices, in line with USDA policies, will be followed in all membership appointments to the Committee. To ensure that the recommendations of the Committee have taken into account the needs of the diverse groups served by USDA, membership will include, to the extent possible, individuals with demonstrated ability to represent the needs of all racial and ethnic groups, women and men, and persons with disabilities.
The duties of the Committee are solely advisory. The Committee will make recommendations to the Secretary of Agriculture with regards to the agricultural statistics programs of NASS, and such other matters as it may deem advisable, or which the Secretary of Agriculture; Under Secretary for Research, Education, and Economics; or
Send questions, comments, and requests for additional information to the email address, fax number, or address listed above.
National Agricultural Statistics Service, USDA.
Notice and request for comments.
In accordance with the Paperwork Reduction Act of 1995, this notice announces the intention of the National Agricultural Statistics Service (NASS) to request revision and extension of a currently approved information collection, the Cotton Ginning Survey. Revision to burden hours will be needed due to changes in the size of the target population, sampling design, and/or questionnaire length.
Comments on this notice must be received by December 30, 2016 to be assured of consideration.
You may submit comments, identified by docket number 0535-0220, by any of the following methods:
•
•
•
•
R. Renee Picanso, Associate Administrator, National Agricultural Statistics Service, U.S. Department of Agriculture, (202) 720-2707. Copies of this information collection and related instructions can be obtained without charge from David Hancock, NASS—OMB Clearance Officer, at (202) 690-2388 or at
NASS also complies with OMB Implementation Guidance, “Implementation Guidance for Title V of the E-Government Act, Confidential Information Protection and Statistical Efficiency Act of 2002 (CIPSEA),”
All responses to this notice will become a matter of public record and be summarized in the request for OMB approval.
On June 28, 2016, the South Carolina State Ports Authority, grantee of FTZ 38, submitted a notification of proposed production activity to the FTZ Board on behalf of Benteler Automotive Corporation, within Subzone 38F, in Duncan, South Carolina.
The notification was processed in accordance with the regulations of the FTZ Board (15 CFR part 400), including notice in the
Bureau of Industry and Security, Department of Commerce.
Notice.
The Department of Commerce, as part of its continuing effort to reduce paperwork and respondent burden, invites the general public and other Federal agencies to take this opportunity to comment on proposed and/or continuing information collections, as required by the Paperwork Reduction Act of 1995.
Written comments must be submitted on or before December 30, 2016.
Direct all written comments to Jennifer Jessup, Departmental Paperwork Clearance Officer, Department of Commerce, Room 6616, 14th and Constitution Avenue NW., Washington, DC 20230 (or via the Internet at
Requests for additional information or copies of the information collection instrument and instructions should be directed to Mark Crace, BIS ICB Liaison, (202) 482-8093,
This information is used to monitor requests for participation in foreign boycotts against countries friendly to the U.S. The information is analyzed to note changing trends and to decide upon appropriate action to be taken to carry out the United States' policy of discouraging its citizens from participating in foreign restrictive trade practices and boycotts directed against friendly countries.
Submitted on paper or electronically.
Comments are invited on: (a) Whether the proposed collection of information is necessary for the proper performance of the functions of the agency, including whether the information shall have practical utility; (b) the accuracy of the agency's estimate of the burden (including hours and cost) of the proposed collection of information; (c) ways to enhance the quality, utility, and clarity of the information to be collected; and (d) ways to minimize the burden of the collection of information on respondents, including through the use of automated collection techniques or other forms of information technology.
Comments submitted in response to this notice will be summarized and/or included in the request for OMB approval of this information collection; they also will become a matter of public record.
International Trade Administration, Department of Commerce.
Notice and call for applications.
The U.S. Department of Commerce (DOC), International Trade Administration (ITA) announces that it will accept applications for the International Buyer Program (IBP) Select service for calendar year 2018 (January 1, 2018, through December 31, 2018). This announcement sets out the objectives, procedures and application review criteria for IBP Select. Under IBP Select, ITA recruits international buyers to U.S. trade shows to meet with U.S. suppliers exhibiting at those shows. The main difference between IBP and IBP Select is that IBP offers worldwide promotion, whereas IBP Select focuses on promotion and recruitment in up to five international markets. Specifically, through the IBP Select, the DOC selects domestic trade shows that will receive DOC assistance in the form of targeted promotion and recruitment in up to five foreign markets, export counseling to exhibitors, and export counseling and matchmaking services at the trade show. This notice covers selection for IBP Select participation during calendar year 2018.
Applications for IBP Select must be received by Friday, January 6, 2017.
The application form can be found at
Vidya Desai, Senior Advisor, Trade Promotion Programs, International Trade Administration, U.S. Department of Commerce, 1300 Pennsylvania Ave. NW., Ronald Reagan Building, Suite 800M—Mezzanine Level—Atrium North, Washington, DC 20004; Telephone (202) 482-2311; Facsimile: (202) 482-7800; Email:
The IBP was established in the Omnibus Trade and Competitiveness Act of 1988 (Pub. L. 100-418, title II, § 2304, codified at 15 U.S.C. 4724) to bring international buyers together with U.S. firms by promoting leading U.S. trade shows in industries with high export potential. The IBP emphasizes cooperation
Through the IBP Select, the DOC selects trade shows that DOC determines to be leading trade shows with participation by U.S. firms interested in exporting. DOC provides successful applicants with assistance in the form of targeted overseas promotion of the show by U.S. Embassies and Consulates; outreach to show participants about exporting; recruitment of potential buyers to attend the shows; and staff assistance in setting up and staffing international trade centers at the shows. Targeted promotion in up to five markets can be executed through the overseas offices of ITA or in U.S. Embassies in countries where ITA does not maintain offices.
ITA is accepting applications for IBP Select from trade show organizers of trade shows taking place between January 1, 2018, and December 31, 2018. Selection of a trade show for IBP Select is valid for one show. A trade show organizer seeking selection for a recurring show must submit a new application for selection for each occurrence of the show. For shows that occur more than once in a calendar year, the trade show organizer must submit a separate application for each show.
There is no fee required to submit an application. For IBP Select in calendar year 2018, ITA expects to select approximately 10 shows from among the applicants. ITA will select those shows that are determined to most clearly support the statutory mandate in 15 U.S.C. 4721 to promote U.S. exports, especially those of small- and medium-sized enterprises, and that best meet the selection criteria articulated below. Once selected, applicants will be required to enter into a Memorandum of Agreement (MOA) with the DOC, and submit payment of the $6,000 2018 participation fee (by check or credit card) within 30 days of written notification of acceptance into IBP Select. The MOA constitutes an agreement between the DOC and the show organizer specifying which responsibilities for international promotion and export assistance services at the trade shows are to be undertaken by the DOC as part of the IBP Select and, in turn, which responsibilities are to be undertaken by the show organizer. Anyone requesting application information will be sent a sample copy of the MOA along with the application form and a copy of this
Selection as an IBP Select show does not constitute a guarantee by DOC of the show's success. IBP Select participation status is not an endorsement of the show except as to its international buyer activities. Non-selection of an applicant for IBP Select status should not be viewed as a determination that the show will not be successful in promoting U.S. exports.
(a)
(b)
(c)
(d)
(e)
(f)
(g)
(h)
(i)
The Office of Management and Budget (OMB) has approved the information collection requirements of the application to this program (0625-0143) under the provisions of the Paperwork Reduction Act of 1995 (44 U.S.C. 3501
Enforcement and Compliance, International Trade Administration, Department of Commerce.
Avanti Frozen Foods Private Limited (Avanti Frozen) requested a changed circumstances review of the antidumping duty order on certain frozen warmwater shrimp (shrimp) from India. The Department of Commerce (Department) is initiating this changed circumstances review and preliminarily determining that Avanti Frozen is the successor-in-interest to Avanti Feeds Limited (Avanti Feeds).
Effective October 31, 2016.
E. Whitley Herndon, AD/CVD Operations, Office II, Enforcement and Compliance, International Trade Administration, U.S. Department of Commerce, 1401 Constitution Avenue NW., Washington, DC 20230; telephone: 202-482-6274.
On February 1, 2005, the Department published in the
On September 7, 2016, Avanti Frozen requested that, pursuant to section 751(b)(1) of the Tariff Act of 1930, as amended (the Act) and 19 CFR 351.216(b), the Department conduct a changed circumstances review of the
The merchandise subject to the order is certain frozen warmwater shrimp.
Pursuant to section 751(b)(1) of the Act, the Department will conduct a changed circumstances review upon receipt of information concerning, or a request from, an interested party for a review of an antidumping duty order which shows changed circumstances sufficient to warrant a review of the order. As indicated in the “Background” section, we received information indicating that Avanti Feeds has transferred its shrimp business to Avanti Frozen. This constitutes changed circumstances warranting a review of the order.
Section 351.221(c)(3)(ii) of the Department's regulations permits the Department to combine the notice of initiation of a changed circumstances review and the notice of preliminary results if the Department concludes that expedited action is warranted. In this instance, because the record contains information necessary to make a preliminary finding, we find that expedited action is warranted and have combined the notice of initiation and the notice of preliminary results.
In this changed circumstances review, pursuant to section 751(b) of the Act, the Department conducted a successor-in-interest analysis. In making a successor-in-interest determination, the Department examines several factors, including, but not limited to, changes in the following: (1) Management; (2) production facilities; (3) supplier
In accordance with 19 CFR 351.216, we preliminarily determine that Avanti Frozen is the successor-in-interest to Avanti Feeds. Record evidence, as submitted by Avanti Frozen, indicates that Avanti Frozen operates as essentially the same business entity as Avanti Feeds with respect to the subject merchandise.
Record evidence, as submitted by Avanti Frozen, indicates that the shrimp business was transferred fully from Avanti Feeds to its subsidiary, Avanti Frozen. Specifically, Avanti Frozen provided a Business Transfer Agreement that transfers Avanti Feeds' entire shrimp business to Avanti Frozen; approvals from various governing entities approving/confirming the transfer of the shrimp business from Avanti Feeds to Avanti Frozen; letters notifying customers, suppliers, and employees of the business transfer; Avanti Frozen's first annual report; charts showing the boards of directors and equity stockholders of both Avanti Feeds and Avanti Frozen; and a list of suppliers, customers, and production and business locations before and after the transfer.
We find that the evidence provided by Avanti Frozen is sufficient to preliminarily determine that the transfer of shrimp operations from Avanti Feeds to its subsidiary Avanti Frozen did not affect the company's operations in a meaningful way. Therefore, based on the aforementioned reasons, we preliminarily determine that Avanti Frozen is the successor-in-interest to Avanti Feeds and, thus, should receive the same antidumping duty treatment with respect to the subject merchandise as Avanti Feeds.
Pursuant to 19 CFR 351.310(c), any interested party may request a hearing within 30 days of publication of this notice. In accordance with 19 CFR 351.309(c)(1)(ii), interested parties may submit case briefs not later than 30 days after the date of publication of this notice. Rebuttal briefs, limited to issues raised in the case briefs, may be filed no later than five days after the case briefs, in accordance with 19 CFR 351.309(d). Parties who submit case or rebuttal briefs are encouraged to submit with each argument: (1) A statement of the issue; (2) a brief summary of the argument; and (3) a table of authorities. All comments are to be filed electronically using Enforcement and Compliance's Antidumping and Countervailing Duty Centralized Electronic Service System (ACCESS) available to registered users at
Consistent with 19 CFR 351.216(e), we will issue the final results of this changed circumstances review no later than 270 days after the date on which this review was initiated, or within 45 days if all parties agree to our preliminary finding. This notice is published in accordance with sections 751(b)(1) and 777(i) of the Act and 19 CFR 351.216(b), 351.221(b) and 351.221(c)(3).
Enforcement and Compliance, International Trade Administration, Department of Commerce.
Effective October 31, 2016.
Based on a request, the Department of Commerce (the Department) is initiating a new shipper review (NSR) of the antidumping duty order on freshwater crawfish tail meat from the People's Republic of China (PRC) with respect to Jingzhou Tianhe Aquatic Products Co., Ltd. (Jingzhou Tianhe). We have determined that this request meets the statutory and regulatory requirements for initiation.
Dmitry Vladimirov, AD/CVD Operations Office I, Enforcement and Compliance, International Trade Administration, U.S. Department of Commerce, 1401 Constitution Avenue NW., Washington, DC 20230; Telephone: (202) 482-0665.
The antidumping duty order on freshwater crawfish tail meat from the PRC published in the
Pursuant to section 751(a)(2)(B)(i)(I) of the Act and 19 CFR 351.214(b)(2)(i), Jingzhou Tianhe certified that it did not export subject merchandise to the United States during the period of investigation (POI).
In addition to the certifications described above, pursuant to 19 CFR 351.214(b)(2), Jingzhou Tianhe submitted documentation establishing the following: (1) The date on which it first shipped subject merchandise for export to the United States; (2) the volume of its first shipment; and (3) the date of its first sale to an unaffiliated customer in the United States.
In accordance with 19 CFR 351.214(g)(1)(i)(A), the period of review (POR) for a NSR initiated in the month immediately following the anniversary month will be the twelve-month period immediately preceding the anniversary month. Therefore, the POR for this NSR is September 1, 2015, through August 31, 2016.
Pursuant to section 751(a)(2)(B) of the Act and 19 CFR 351.214(b), we find that the request from Jingzhou Tianhe meets the threshold requirements for initiation of a NSR for shipments of freshwater crawfish tail meat from the PRC produced and exported by Jingzhou Tianhe.
On February 24, 2016, the President signed into law the “Trade Facilitation and Trade Enforcement Act of 2015,” H.R. 644, which made several amendments to section 751(a)(2)(B) of the Act. We will conduct this NSR in accordance with section 751(a)(2)(B) of the Act, as amended by the Trade Facilitation and Trade Enforcement Act of 2015.
Unless extended, the Department intends to issue the preliminary results of this NSR no later than 180 days from the date of initiation and final results of the review no later than 90 days after the date the preliminary results are issued.
It is the Department's usual practice, in cases involving non-market economy countries, to require that a company seeking to establish eligibility for an antidumping duty rate separate from the country-wide rate provide evidence of
Because Jingzhou Tianhe certified that it produced and exported subject merchandise, the sale of which is the basis for the request for a NSR, we will instruct CBP to continue to suspend liquidation of all entries of subject merchandise produced and exported by Jingzhou Tianhe.
To assist in its analysis of the
Interested parties requiring access to proprietary information in the NSR should submit applications for disclosure under administrative protective order, in accordance with 19 CFR 351.305 and 351.306.
This initiation and notice are published in accordance with section 751(a)(2)(B) of the Act and 19 CFR 351.214 and 351.221(c)(1)(i).
Enforcement and Compliance, International Trade Administration, Department of Commerce.
On June 24, 2016, the Department of Commerce (the Department) published the preliminary results of the administrative review of the antidumping duty order on polyethylene retail carrier bags (PRCBs) from Malaysia. The review covers one producer/exporter of the subject merchandise, Euro SME Sdn Bhd (Euro SME), for the period of review (POR) August 1, 2014, through July 31, 2015. The final estimated weighted-average dumping margin is listed below in the “Final Results of Review” section of this notice.
Effective October 31, 2016.
Bryan Hansen or Minoo Hatten, AD/CVD Operations, Office I, Enforcement and Compliance, International Trade Administration, U.S. Department of Commerce, 1401 Constitution Avenue NW., Washington, DC 20230; telephone: (202) 482-3683 or (202) 482-1690, respectively.
On June 24, 2016, the Department published the
The merchandise subject to the order is PRCBs. The product is currently classified under the Harmonized Tariff Schedules of the United States (HTSUS) subheading 3923.21.0085. While the HTSUS subheading is provided for convenience and customs purposes, the written description is dispositive. A full description of the scope of the order is contained in the Final Decision Memorandum.
All issues raised in the case and rebuttal briefs by parties to this review are addressed in the Final Decision Memorandum, which is hereby adopted by this notice. A list of the issues raised is attached to this notice as Appendix. The Final Decision Memorandum is a public document and is on file electronically
Based on our analysis of comments received, we made one revision that changed the results for Euro SME.
As a result of this administrative review, we determine that a weighted-average dumping margin of 0.00 percent exists for Euro SME for this POR.
We intend to disclose the calculations performed to parties in this proceeding within five days after public announcement of the final results, in accordance with 19 CFR 351.224(b).
In accordance with 19 CFR 351.212 and the
The following cash deposit requirements will be effective upon publication of this notice of final results of administrative review for all shipments of PRCBs from Malaysia entered, or withdrawn from warehouse, for consumption on or after the date of publication, as provided by section 751(a)(2) of the Act: (1) The cash deposit rate for Euro SME will be 0.00 percent, the rate established in the final results of this administrative review; (2) for merchandise exported by manufacturers or exporters not covered in this review but covered in a prior segment of the proceeding, the cash deposit rate will continue to be the company-specific rate published for the most recently completed segment of this proceeding in which that manufacturer or exporter participated; (3) if the exporter is not a firm covered in this review, a prior review, or the original investigation but the manufacturer is, the cash deposit rate will be the rate established for the manufacturer of the merchandise in the most recently completed segment of this proceeding; and (4) the cash deposit rate for all other manufacturers or exporters will continue to be 84.94 percent.
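The cash deposit hierarchy above is, in effect, a four-step lookup. The sketch below is an illustrative restatement only, not part of this notice; the function and parameter names are hypothetical, and it assumes the caller supplies the company-specific rates published in prior segments of the proceeding.

```python
# Illustrative sketch of the four-step cash deposit hierarchy described above.
# All names are hypothetical; rates are expressed in percent ad valorem.

EURO_SME_RATE = 0.00      # rate from the final results of this review
ALL_OTHERS_RATE = 84.94   # all-others rate that continues to apply


def cash_deposit_rate(exporter, manufacturer,
                      prior_exporter_rates, prior_manufacturer_rates):
    """Return the cash deposit rate for an entry of subject merchandise.

    prior_exporter_rates and prior_manufacturer_rates map company names to the
    company-specific rates from prior segments of the proceeding.
    """
    # (1) The company reviewed here receives the rate from these final results.
    if exporter == "Euro SME":
        return EURO_SME_RATE
    # (2) An exporter covered in a prior segment keeps its most recent
    #     company-specific rate.
    if exporter in prior_exporter_rates:
        return prior_exporter_rates[exporter]
    # (3) If only the manufacturer has a rate from a prior segment, use it.
    if manufacturer in prior_manufacturer_rates:
        return prior_manufacturer_rates[manufacturer]
    # (4) Otherwise the all-others rate continues to apply.
    return ALL_OTHERS_RATE
```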
This notice serves as a final reminder to importers of their responsibility under 19 CFR 351.402(f)(2) to file a certificate regarding the reimbursement of antidumping duties prior to liquidation of the relevant entries during this review period. Failure to comply with this requirement could result in the Secretary's presumption that reimbursement of antidumping duties occurred and the subsequent assessment of double antidumping duties.
This notice also serves as a reminder to parties subject to administrative protective order (APO) of their responsibility concerning the destruction of proprietary information disclosed under APO in accordance with 19 CFR 351.305(a)(3). Timely written notification of the return or destruction of APO materials or conversion to judicial protective order is hereby requested. Failure to comply with the regulations and terms of an APO is a sanctionable violation.
The Department is issuing and publishing these final results of administrative review in accordance with sections 751(a)(1) and 777(i)(1) of the Act, and 19 CFR 351.213(h).
International Trade Administration, Department of Commerce.
Notice and call for applications.
In this notice, the U.S. Department of Commerce (DOC) International Trade Administration
Applications for the IBP must be received by Friday, January 6, 2017.
The application form can be found at
Vidya Desai, Senior Advisor for Trade Events, Trade Promotion Programs, International Trade Administration, U.S. Department of Commerce, 1300 Pennsylvania Ave. NW., Ronald Reagan Building, Suite 800M—Mezzanine Level—Atrium North, Washington, DC 20004; Telephone (202) 482-2311; Facsimile: (202) 482-7800; Email:
The IBP was established in the Omnibus Trade and Competitiveness Act of 1988 (Pub. L. 100-418, codified at 15 U.S.C. 4724) to bring international buyers together with U.S. firms by promoting leading U.S. trade shows in industries with high export potential. The IBP emphasizes cooperation between the DOC and trade show organizers to benefit U.S. firms exhibiting at selected events and provides practical, hands-on assistance such as export counseling and market analysis to U.S. companies interested in exporting. Shows selected for the IBP will provide a venue for U.S. companies interested in expanding their sales into international markets.
Through the IBP, ITA selects U.S. trade shows with participation by U.S. firms interested in exporting that ITA determines to be leading international trade shows, for promotion in overseas markets by U.S. Embassies and Consulates. The DOC is authorized to provide successful applicants with assistance in the form of overseas promotion of the show; outreach to show participants about exporting; recruitment of potential buyers to attend the events; and staff assistance in setting up international trade centers at the shows. Worldwide promotion is executed through ITA offices at U.S. Embassies and Consulates in more than 70 countries representing the United States' major trading partners, and also in Embassies in countries where ITA does not maintain offices.
The International Trade Administration (ITA) is accepting applications from trade show organizers for the IBP for trade shows taking place between January 1, 2018, and December 31, 2018. Selection of a trade show is valid for one show,
For the IBP in calendar year 2018, the ITA expects to select approximately 20 shows from among the applicants. The ITA will select those shows that are determined to most clearly meet the statutory mandate in 15 U.S.C. 4721 to promote U.S. exports, especially those of small- and medium-sized enterprises, and the selection criteria articulated below.
There is no fee required to submit an application. If accepted into the program for calendar year 2018, a participation fee of $9,800 is required for shows of five days or fewer. For trade shows more than five days in duration, or requiring more than one International Trade Center, a participation fee of $15,000 is required. For trade shows ten days or more in duration, and/or requiring more than two International Trade Centers, the participation fee will be determined by DOC, calculated on a full cost-recovery basis, and stated in the written notification of acceptance. Successful applicants will be required to enter into a Memorandum of Agreement (MOA) with ITA within 10 days of written notification of acceptance into the program. The participation fee (payable by check or credit card) is due within 30 days of written notification of acceptance into the program.
The MOA constitutes an agreement between ITA and the show organizer specifying which responsibilities for international promotion and export assistance services at the trade shows are to be undertaken by ITA as part of the IBP and, in turn, which responsibilities are to be undertaken by the show organizer. Anyone requesting application information will be sent a sample copy of the MOA along with the application and a copy of this
Selection as an IBP partner does not constitute a guarantee by DOC of the show's success. IBP partnership status is not an endorsement of the show except as to its international buyer activities. Non-selection of an applicant for IBP
(a)
(b)
(c)
(d)
(e)
(f)
(g)
(h)
(i)
(j)
The Office of Management and Budget (OMB) has approved the information collection requirements of the application to this program (Form OMB 0625-0143) under the provisions of the Paperwork Reduction Act of 1995 (44 U.S.C. 3501
National Marine Fisheries Service (NMFS), National Oceanic and Atmospheric Administration (NOAA), Commerce.
Notice.
This action serves as a notice that NMFS, on behalf of the Secretary of Commerce (Secretary), has found that the following stocks are subject to overfishing—Hood Canal coho salmon and Pribilof Islands blue king crab; the following salmon stocks are approaching an overfished condition—Quillayute Fall coho and Snohomish coho; and the following stocks are still both overfished and subject to overfishing—Western and Central North
Regina Spallone, (301) 427-8568.
Pursuant to sections 304(e)(2) and (e)(7) of the Magnuson-Stevens Fishery Conservation and Management Act (Magnuson-Stevens Act), 16 U.S.C. 1854(e)(2) and (e)(7), and implementing regulations at 50 CFR 600.310(e)(2) and (j)(1), NMFS, on behalf of the Secretary, must notify Councils whenever it determines that a stock or stock complex is overfished or approaching an overfished condition; or if an existing rebuilding plan has not ended overfishing or resulted in adequate rebuilding progress. NMFS also notifies Councils when it determines a stock or stock complex is subject to overfishing.
NMFS has determined that Hood Canal coho is subject to overfishing, based on the most recent salmon stock assessments conducted by the Pacific Fishery Management Council (Pacific Council) Salmon Technical Team (STT). The Pacific Council has, consistent with the Pacific Coast Salmon Fishery Management Plan, already taken action shaping the 2016 fisheries to ensure Pacific Council area fisheries are not contributing to overfishing (May 2, 2016, 81 FR 26157). In addition, NMFS has determined that Pribilof Islands blue king crab is subject to overfishing based on catch levels exceeding the stock's overfishing limit. The North Pacific Fishery Management Council has been informed that it must take action to end overfishing of this stock immediately.
NMFS has determined that Quillayute Fall coho and Snohomish coho salmon are both approaching an overfished condition, based on the most recent salmon stock assessments conducted by the Pacific Council STT. A salmon stock is considered to be approaching an overfished condition if the 3-year geometric mean of the stock's two most recent postseason estimates of spawning escapement and the current preseason forecast of spawning escapement falls below the stock's minimum stock size threshold. The Pacific Council has been informed that if either of these stocks becomes overfished, it must direct the STT to prepare a rebuilding plan within one year.
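Stated compactly, and using illustrative symbols that are not drawn from the underlying regulations, the approaching-overfished test described above is:

\[ \left( E_{y-2} \cdot E_{y-1} \cdot F_{y} \right)^{1/3} < \mathrm{MSST} \]

where \(E_{y-2}\) and \(E_{y-1}\) are the two most recent postseason estimates of spawning escapement, \(F_{y}\) is the current preseason forecast of spawning escapement, and MSST is the stock's minimum stock size threshold; the stock is considered approaching an overfished condition when the inequality holds.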
In addition, NMFS has determined that both Western and Central North Pacific striped marlin and Atlantic and Gulf of Mexico dusky shark are still overfished and subject to overfishing, based on the most recent assessments of these stocks. The striped marlin determination is based on a 2015 assessment conducted by the Billfish Working Group of the International Scientific Committee for Tuna and Tuna-like Species in the North Pacific Ocean. On May 19, 2014, NMFS announced its overfishing and overfished status determination for striped marlin and informed the Western Pacific Fishery Management Council and the Pacific Fishery Management Council of their obligations under the MSA to address the domestic and international impact of U.S. fisheries on this stock (79 FR 28686). NMFS continues to work with the Councils and its partners to meet its domestic and international obligations, as specified in that earlier notice.
The dusky shark determination is based on a 2016 stock assessment update to the 21st Southeast Data, Assessment, and Review benchmark assessment for this stock, finalized in 2011. NMFS manages the dusky shark under the 2006 Consolidated Atlantic Highly Migratory Species Fishery Management Plan and its amendments. The dusky shark has been a prohibited species since 2000 and may not be landed or retained in any fishery; however, multiple commercial and recreational fisheries sometimes interact with the species as bycatch.
The Department of Commerce will submit to the Office of Management and Budget (OMB) for clearance the following proposal for collection of information under the provisions of the Paperwork Reduction Act (44 U.S.C. Chapter 35).
On June 15, 2006, President Bush established the Papahānaumokuākea Marine National Monument by issuing Presidential Proclamation 8031 (71 FR 36443, June 26, 2006) under the authority of the Antiquities Act (16 U.S.C. 431). The proclamation includes restrictions and prohibitions regarding activities in the monument consistent with the authority provided by the act. Specifically, the proclamation prohibits access to the monument except when passing through without interruption or as allowed under a permit issued by NOAA and the U.S. Fish and Wildlife Service (FWS). Vessels passing through the monument without interruption are required to notify NOAA and FWS upon entering into and leaving the monument. Individuals wishing to access the monument to conduct certain regulated activities must first apply for and be granted a permit issued by NOAA and FWS to certify compliance with vessel monitoring system requirements, monument regulations and best management practices. On August 29, 2006, NOAA and FWS published a final rule codifying the provisions of the proclamation (71 FR 51134).
This information collection request may be viewed at
Written comments and recommendations for the proposed information collection should be sent within 30 days of publication of this notice to
National Oceanic and Atmospheric Administration (NOAA), Commerce.
Notice.
The Department of Commerce, as part of its continuing effort to reduce paperwork and respondent burden, invites the general public and other Federal agencies to take this opportunity to comment on proposed and/or continuing information collections, as required by the Paperwork Reduction Act of 1995.
Written comments must be submitted on or before December 30, 2016.
Direct all written comments to Jennifer Jessup, Departmental Paperwork Clearance Officer, Department of Commerce, Room 6616, 14th and Constitution Avenue NW., Washington, DC 20230 (or via the Internet at
Requests for additional information or copies of the information collection instrument and instructions should be directed to Karen Palmigiano, (206) 526-4491 or
This request is for extension of a currently approved information collection.
The National Oceanic and Atmospheric Administration (NOAA) has established large-scale, depth-based management areas, referred to as Groundfish Conservation Areas (GCAs), where groundfish fishing is prohibited or restricted. These areas were specifically designed to reduce the catch of overfished species while allowing healthy fisheries to continue in areas, and with gears, where little incidental catch of overfished species is likely to occur. Because NOAA needs methods to effectively enforce area restrictions, certain commercial fishing vessels are required to install and use a vessel monitoring system (VMS) that automatically sends hourly position reports. Exemptions from the reporting requirement are available for inactive vessels or vessels fishing outside the monitored area. Vessels are also required to declare what gear will be used.
To ensure the integrity of the GCAs and Rockfish Conservation Areas, a pilot VMS program was implemented on January 1, 2004. The pilot program required vessels registered to Pacific Coast groundfish fishery limited entry permits to carry and use VMS transceiver units while fishing off the coasts of Washington, Oregon, and California. On January 1, 2007, VMS program coverage was expanded to include all open access fisheries in addition to the limited entry fisheries. Finally, in 2010, NMFS expanded the declaration reports to include several more limited entry categories.
The installation/activation reports are available over the Internet. Due to the need for the owner's signature, installation reports must be faxed or mailed to the National Marine Fisheries Service (NMFS). Hourly position reports are automatically sent from VMS transceivers installed aboard vessels. Exemption reports and declaration reports are submitted via a toll-free telephone number.
Comments are invited on: (a) Whether the proposed collection of information is necessary for the proper performance of the functions of the agency, including whether the information shall have practical utility; (b) the accuracy of the agency's estimate of the burden (including hours and cost) of the proposed collection of information; (c) ways to enhance the quality, utility, and clarity of the information to be collected; and (d) ways to minimize the burden of the collection of information on respondents, including through the use of automated collection techniques or other forms of information technology.
Comments submitted in response to this notice will be summarized and/or included in the request for OMB approval of this information collection; they also will become a matter of public record.
The Department of Commerce will submit to the Office of Management and Budget (OMB) for clearance the following proposal for collection of information under the provisions of the Paperwork Reduction Act (44 U.S.C. Chapter 35).
NOAA has established requirements for the licensing of private operators of remote-sensing space systems. The information in applications and subsequent reports is needed to ensure compliance with the Land Remote-Sensing Policy Act of 1992 and with the national security and international obligations of the United States. The requirements are contained in 15 CFR part 960.
This information collection request may be viewed at
Written comments and recommendations for the proposed information collection should be sent within 30 days of publication of this notice to
National Oceanic and Atmospheric Administration (NOAA), Commerce.
Notice.
The Department of Commerce, as part of its continuing effort to reduce paperwork and respondent burden, invites the general public and other Federal agencies to take this opportunity to comment on proposed and/or continuing information collections, as required by the Paperwork Reduction Act of 1995.
Written comments must be submitted on or before December 30, 2016.
Direct all written comments to Jennifer Jessup, Departmental Paperwork Clearance Officer, Department of Commerce, Room 6616, 14th and Constitution Avenue NW., Washington, DC 20230 (or via the Internet at
Requests for additional information or copies of the information collection instrument and instructions should be directed to Megan Brockway, (301) 427-8692 or
This request is for an extension of a currently approved information collection.
The purpose of this information collection is to assist state and federal Natural Resource Trustees in more efficiently carrying out the restoration planning phase of Natural Resource Damage Assessments (NRDA), in compliance with the National Environmental Policy Act (NEPA) of 1969, 42 U.S.C. 4321-4370d; 40 CFR parts 1500-1508; and other federal and local statutes and regulations as applicable. The NRDA Restoration Project Information Sheet is designed to facilitate the collection of information on existing, planned, or proposed restoration projects. This information will be used by the Natural Resource Trustees to develop potential restoration alternatives for natural resource injuries and service losses requiring restoration during the restoration planning phase of the NRDA process.
The Restoration Project Information Sheet can be submitted on paper through the mail or faxed, or can be submitted electronically via the internet or email.
Comments are invited on: (a) Whether the proposed collection of information is necessary for the proper performance of the functions of the agency, including whether the information shall have practical utility; (b) the accuracy of the agency's estimate of the burden (including hours and cost) of the proposed collection of information; (c) ways to enhance the quality, utility, and clarity of the information to be collected; and (d) ways to minimize the burden of the collection of information on respondents, including through the use of automated collection techniques or other forms of information technology.
Comments submitted in response to this notice will be summarized and/or included in the request for OMB approval of this information collection; they also will become a matter of public record.
Department of the Army, DoD.
Notice of Intent.
In compliance with 35 U.S.C. 209(e) and 37 CFR 404.7(a)(1)(i), the Department of the Army hereby gives notice of its intent to grant to RF Networking Solutions, LLC, a company having its principal place of business at 4 Huron Court, East Brunswick, NJ 08816, an exclusive license in all fields. The proposed license would cover the following: U.S. Patent Number 6,844,841, entitled “Radio Frequency Link Performance Tool Process and System,” Inventor Michael Masciulli, Issue Date January 18, 2005.
The prospective exclusive license may be granted unless within fifteen (15) days from the date of this published notice, the U.S. Army Research Laboratory receives written objections including evidence and argument that establish that the grant of the license would not be consistent with the requirements of 35 U.S.C. 209 and 37 CFR 404.7. Competing applications completed and received by the U.S. Army Research Laboratory within fifteen (15) days from the date of this published notice will also be treated as objections to the grant of the contemplated exclusive license.
Objections submitted in response to this notice will not be made available to the public for inspection and, to the extent permitted by law, will not be released under the Freedom of Information Act, 5 U.S.C. 552.
Send written objections to U.S. Army Research Laboratory Technology Transfer and Outreach Office, RDRL-DPT/Thomas Mulkern, Building 321, Room 110, Aberdeen Proving Ground, MD 21005-5425.
Thomas Mulkern, (410) 278-0889, E-Mail:
None.
Notice.
The Department of Defense has submitted to OMB for clearance, the following proposal for collection of information under the provisions of the Paperwork Reduction Act.
Consideration will be given to all comments received by November 30, 2016.
Fred Licari, 571-372-0493.
Comments and recommendations on the proposed information collection should be emailed to Ms. Jasmeet Seehra, DoD Desk Officer, at
You may also submit comments and recommendations, identified by Docket ID number and title, by the following method:
•
Written requests for copies of the information collection proposal should be sent to Mr. Licari at WHS/ESD Directives Division, 4800 Mark Center Drive, East Tower, Suite 03F09, Alexandria, VA 22350-3100.
Notice.
The Department of Defense has submitted to OMB for clearance, the following proposal for collection of information under the provisions of the Paperwork Reduction Act.
Consideration will be given to all comments received by November 30, 2016.
Fred Licari, 571-372-0493.
Comments and recommendations on the proposed information collection should be emailed to Ms. Jasmeet Seehra, DoD Desk Officer, at
You may also submit comments and recommendations, identified by Docket ID number and title, by the following method:
•
Written requests for copies of the information collection proposal should be sent to Mr. Licari at WHS/ESD Directives Division, 4800 Mark Center Drive, East Tower, Suite 03F09, Alexandria, VA 22350-3100.
Notice is hereby given that the Delaware River Basin Commission will hold a public hearing on Wednesday, November 9, 2016. A business meeting will be held the following month, on Wednesday, December 14, 2016. The hearing and business meeting are open to the public and will be held at the Washington Crossing Historic Park Visitor Center, 1112 River Road, Washington Crossing, Pennsylvania.
The list of projects scheduled for hearing, including project descriptions, will be posted on the Commission's Web site,
Written comments on matters scheduled for hearing on November 9 will be accepted through 5:00 p.m. on November 10. After the hearing on all scheduled matters has been completed, and as time allows, an opportunity for Open Public Comment will also be provided.
The public is advised to check the Commission's Web site periodically prior to the hearing date, as items scheduled for hearing may be postponed if additional time is deemed necessary to complete the Commission's review, and items may be added up to ten days prior to the hearing date. In reviewing docket descriptions, the public is also asked to be aware that project details commonly change in the course of the Commission's review, which is ongoing.
After all scheduled business has been completed and as time allows, the meeting will also include up to one hour of Open Public Comment.
There will be no opportunity for additional public comment for the record at the December 14 business meeting on items for which a hearing was completed on November 9 or a previous date. Commission consideration on December 14 of items for which the public hearing is closed may result in approval of the item (by docket or resolution) as proposed, approval with changes, denial, or deferral. When the Commissioners defer an action, they may announce an additional period for written comment on the item, with or without an additional hearing date, or they may take additional time to consider the input they have already received without requesting further public input. Any deferred items will be considered for action at a public meeting of the Commission on a future date.
Department of Education (ED), Office of English Language Acquisition (OELA).
Notice.
In accordance with the Paperwork Reduction Act of 1995 (44 U.S.C. chapter 3501
Interested persons are invited to submit comments on or before December 30, 2016.
To access and review all the documents related to the information collection listed in this notice, please use
For specific questions related to collection activities, please contact Samuel Lopez, 202-401-1423.
The Department of Education (ED), in accordance with the Paperwork Reduction Act of 1995 (PRA) (44 U.S.C. 3506(c)(2)(A)), provides the general public and Federal agencies with an opportunity to comment on proposed, revised, and continuing collections of information. This helps the Department assess the impact of its information collection requirements and minimize the public's reporting burden. It also helps the public understand the Department's information collection requirements and provide the requested data in the desired format. ED is soliciting comments on the proposed information collection request (ICR) that is described below. The Department of Education is especially interested in public comment addressing the following issues: (1) Is this collection necessary to the proper functions of the Department; (2) will this information be processed and used in a timely manner; (3) is the estimate of burden accurate; (4) how might the Department enhance the quality, utility, and clarity of the information to be collected; and (5) how might the Department minimize the burden of this collection on the respondents, including through the use of information technology. Please note that written comments received in response to this notice will be considered public records.
Office of the Secretary (OS), Department of Education (ED).
Notice.
In accordance with the Paperwork Reduction Act of 1995 (44 U.S.C. chapter 3501
Interested persons are invited to submit comments on or before December 30, 2016.
To access and review all the documents related to the information collection listed in this notice, please use
For specific questions related to collection activities, please contact Alfreida Pettiford, 202-245-6110.
The Department of Education (ED), in accordance with the Paperwork Reduction Act of 1995 (PRA) (44 U.S.C. 3506(c)(2)(A)), provides the general public and Federal agencies with an opportunity to comment on proposed, revised, and continuing collections of information. This helps the Department assess the impact of its information collection requirements and minimize the public's reporting burden. It also helps the public understand the Department's information collection requirements and provide the requested data in the desired format. ED is soliciting comments on the proposed information collection request (ICR) that is described below. The Department of Education is especially interested in public comment addressing the following issues: (1) Is this collection necessary to the proper functions of the Department; (2) will this information be processed and used in a timely manner; (3) is the estimate of burden accurate; (4) how might the Department enhance the quality, utility, and clarity of the information to be collected; and (5) how might the Department minimize the burden of this collection on the respondents, including through the use of information technology. Please note that written comments received in response to this notice will be considered public records.
Institute of Education Sciences (IES), Department of Education (ED).
Notice.
In accordance with the Paperwork Reduction Act of 1995 (44 U.S.C. chapter 3501
Interested persons are invited to submit comments on or before December 30, 2016.
To access and review all the documents related to the information collection listed in this notice, please use
For specific questions related to collection activities, please contact Amy Johnson, 202-245-7781.
The Department of Education (ED), in accordance with the Paperwork Reduction Act of 1995 (PRA) (44 U.S.C. 3506(c)(2)(A)), provides the general public and Federal agencies with an opportunity to comment on proposed, revised, and continuing collections of information. This helps the Department assess the impact of its information collection requirements and minimize the public's reporting burden. It also helps the public understand the Department's information collection requirements and provide the requested data in the desired format. ED is soliciting comments on the proposed information collection request (ICR) that is described below. The Department of Education is especially interested in public comment addressing the following issues: (1) Is this collection necessary to the proper functions of the Department; (2) will this information be processed and used in a timely manner; (3) is the estimate of burden accurate; (4) how might the Department enhance the quality, utility, and clarity of the information to be collected; and (5) how might the Department minimize the burden of this collection on the respondents, including through the use of information technology. Please note that written comments received in response to this notice will be considered public records.
National Assessment Governing Board, U.S. Department of Education.
Announcement of open and closed meetings.
This notice sets forth the agenda for the November 17-19, 2016 Quarterly Board Meeting of the National Assessment Governing Board (hereafter referred to as Governing Board). This notice provides information to members of the public who may be interested in attending the meeting or providing written comments on the meeting. The notice of this meeting is required under § 10(a)(2) of the Federal Advisory Committee Act (FACA).
The Quarterly Board Meeting will be held on the following dates:
Sheraton Pentagon City, 900 South Orme Street, Arlington, Virginia 22204.
Munira Mwalimu, Executive Officer/Designated Federal Official of the Governing Board, 800 North Capitol Street NW., Suite 825, Washington, DC 20002, telephone: (202) 357-6938, fax: (202) 357-6945.
The Governing Board is established to formulate policy for the National Assessment of Educational Progress (NAEP). The Governing Board's responsibilities include the following: Selecting subject areas to be assessed, developing assessment frameworks and specifications, developing appropriate student achievement levels for each grade and subject tested, developing standards and procedures for interstate and national comparisons, improving the form and use of NAEP, developing guidelines for reporting and disseminating results, and releasing initial NAEP results to the public.
The Governing Board's standing committees will meet to conduct regularly scheduled work based on agenda items planned for this Quarterly Board Meeting and follow-up items as reported in the Governing Board's committee meeting minutes available at
On Thursday, November 17, 2016, the Assessment Development Committee (ADC) will meet in closed session from 12:30 p.m. to 2:30 p.m. to review secure digital-based tasks in mathematics for grade 12 and for science at grades 4 and 8. This meeting must be conducted in closed session because the test items are secure and have not been released to the public. Public disclosure of the secure test items would significantly impede implementation of the NAEP assessment program if conducted in open session. Such matters are protected by exemption 9(B) of § 552b(c) of Title 5 of the United States Code.
From 2:30 p.m. to 4:00 p.m. the ADC will meet in open session to review grade 12 contextual questions for students, teachers, and schools in reading and mathematics.
The Executive Committee will meet in open session on November 17 from 4:30 p.m. to 5:35 p.m. and thereafter in closed session from 5:35 p.m. to 6:00 p.m. During the closed session, the Executive Committee will be briefed on the development of the NAEP research grants program and the forthcoming request for proposals (RFP). This discussion will include secure information to be included in the RFP that is not yet available to the public. This meeting must be conducted in closed session because premature public disclosure of this information would likely have an adverse impact on the proposed agency action if conducted in open session. Such matters are protected by exemption 9(B) of § 552b(c) of Title 5 of the United States Code.
On Friday, November 18, the full Governing Board will meet in open session from 8:30 a.m. to 10:00 a.m. The Governing Board will review and approve the November 17-19, 2016 Governing Board meeting agenda and meeting minutes from the August 2016 Quarterly Board Meeting. Thereafter, the Secretary of Education, John B. King, Jr., will administer the oath of office to a new Board member and four reappointed members, following which he will provide remarks to the Governing Board.
This session will be followed by a report from the Executive Director of the Governing Board, William Bushaw, followed by an update on National Center for Education Statistics (NCES) work by Holly Spurlock, Branch Chief, National Assessment Operations, NCES.
The Governing Board will recess at 10:00 a.m. for committee meetings, which are scheduled to take place from 10:15 a.m. to 12:30 p.m.
On November 18, 2016, the ADC will meet in a joint open session with the Committee on Standards, Design and Methodology (COSDAM) from 10:15 a.m. to 11:00 a.m. Thereafter the two committees will meet in a joint closed session from 11:00 a.m. to 11:30 a.m. to receive a briefing on an embargoed NCES research study involving 2015 mathematics data from grades 4 and 8 at national and state levels. The data and analyses are secure and have not been released to the public. Public disclosure of the secure test data and analyses would significantly impede implementation of the NAEP assessment program if conducted in open session. Such matters are protected by exemption 9(B) of § 552b(c) of Title 5 of the United States Code.
Following this joint meeting, ADC will meet in closed session from 11:45 a.m. to 12:30 p.m. to receive a briefing on the history and content of the NAEP Long-Term Trend assessments in reading and mathematics, which are conducted at ages 9, 13, and 17. The briefing will include secure reading and mathematics test items from these three age-level assessments that have not been released to the public. This meeting must be conducted in closed session because the test items are secure and have not been released to the public. Public disclosure of the secure test items would significantly impede implementation of the NAEP assessment program if conducted in open session. Such matters are protected by exemption 9(B) of § 552b(c) of Title 5 of the United States Code.
On November 18, the COSDAM will meet in open session from 11:30 a.m. to 12:30 p.m. to conduct regularly scheduled work. Also on November 18, the R&D Committee will meet in open session from 10:15 a.m. to 12:30 p.m. to conduct regularly scheduled work.
Following the committee meetings on Friday, November 18, the Governing Board will meet in closed session from 12:45 p.m. to 1:45 p.m. to receive a briefing on the 2015 National Indian Education Study in reading and mathematics from James Deaton, NCES. Results from this study have not been released to the public. Public disclosure of the study results would significantly impede implementation of the NAEP assessment program if conducted in open session. Such matters are protected by exemption 9(B) of § 552b(c) of Title 5 of the United States Code.
Following this closed session, the Governing Board will meet in closed session from 2:00 p.m. to 3:15 p.m. to receive a briefing from Eunice Greer, NCES, on data from recent NAEP digital-based pilot assessments in reading, mathematics, and writing.
Thereafter, the Governing Board will take a fifteen-minute break and reconvene in open session from 3:30 p.m. to 4:15 p.m. to discuss and take action on the Governing Board's Strategic Vision. The discussion will be led by the Governing Board's Vice Chair Lucille Davy, with a presentation from Lily Clark of the Governing Board staff.
From 4:15 p.m. to 5:00 p.m., Marcella Goodridge-Keiller, Office of the General Counsel, will provide the annual ethics briefing, and William Bushaw, Governing Board Executive Director, and Peggy Carr, NCES Acting Commissioner, will provide a briefing on keeping embargoed data secure.
The November 18, 2016 meeting will adjourn at 5:00 p.m.
On November 19, the Nominations Committee will meet in closed session from 7:30 a.m. to 8:15 a.m. The committee will receive a briefing on nominations received for Governing Board terms beginning on October 1, 2017. The Nominations Committee's discussions pertain solely to internal personnel rules and practices of an agency and information of a personal nature where disclosure would constitute a clearly unwarranted invasion of personal privacy. As such, the discussions are protected by exemptions 2 and 6 of § 552b(c) of Title 5 of the United States Code.
On November 19, the Governing Board will meet from 8:30 a.m. to 9:45 a.m. to receive a briefing from the National Academy of Sciences on the Evaluation of the NAEP Achievement Levels for Mathematics and Reading. The evaluation report has not yet been publicly released by the National Academy of Sciences. Public disclosure of the evaluation results would significantly impede implementation of the NAEP assessment and reporting program if conducted in open session. Such matters are protected by exemption 9(B) of § 552b(c) of Title 5 of the United States Code.
Thereafter, the Governing Board will have a short break and reconvene from 10:00 a.m. to 10:30 a.m. to receive an update on committee reports and take action on the R&D recommended release plan for the 2016 NAEP Arts assessment. The Governing Board will also take action on a joint delegation of authority to COSDAM and the Executive Committee for providing an official response to the Evaluation of NAEP Achievement Levels.
Following a short break from 10:30 a.m. to 10:45 a.m., the Governing Board will meet in open session from 10:45 a.m. to 11:45 a.m. to receive a briefing on draft Governing Board guidelines for Releasing, Reporting, and Disseminating Results.
The November 19, 2016 meeting is scheduled to adjourn at 11:45 a.m.
You may also access documents of the Department published in the
Public Law 107-279, Title III—National Assessment of Educational Progress § 301.
Institute of Education Sciences, U.S. Department of Education.
Request for information.
To assist the National Science and Technology Council's (NSTC) Interagency Working Group on Language and Communication (IWGLC) in its efforts to further improve coordination and collaboration of research and development (R & D) agendas related to language and communication across the Federal Government, the Institute of Education Sciences (the Institute) requests information from interested parties through this notice.
Written submissions must be received by the Department on or before December 30, 2016.
Submit your comments through the Federal eRulemaking Portal or via postal mail or commercial delivery. We will not accept comments by fax, email, or hand delivery. To ensure that we do not receive duplicate copies, please submit your comments only one time. In addition, please include the Docket ID and the term “Language and Communication R & D Activities response” at the top of your comments.
To assist us in making a determination on your request, we encourage you to identify any specific information in your comments that you consider confidential commercial information. Please list the information by page and paragraph numbers.
This is a request for information (RFI) only. This RFI is not a request for proposals (RFP) or a promise to issue an RFP or a notice inviting applications (NIA). This RFI does not commit the Department to contract for any supply or service whatsoever. Further, the Department is not seeking proposals and will not accept unsolicited proposals. The Department will not pay for any information or administrative costs that you may incur in responding to this RFI. If you do not respond to this RFI, you may still apply for future contracts and grants. The Department posts RFPs on the Federal Business Opportunities Web site (
Dr. Rebecca McGill-Wilkinson, U.S. Department of Education, 400 Maryland Avenue SW., PCP 4127, Washington, DC. Telephone: (202) 245-7613 or by email:
If you use a telecommunications device for the deaf (TDD) or a text telephone (TTY), call the Federal Relay Service (FRS), toll free, at 1-800-877-8339.
The Institute requests information from interested parties to help inform its work with the IWGLC as it moves forward to improve coordination and collaboration of research and development agendas related to language and communication across the Federal Government, building on a recently published report. The
The NSTC is the principal means by which the Executive Branch coordinates science and technology policy across the Federal Government. A primary objective of the NSTC is establishing clear national goals for Federal science and technology investments. The IWGLC serves as part of the internal deliberative process of the NSTC. The IWGLC includes representatives from the White House Office of Science and Technology Policy, National Science Foundation, Department of Health and Human Services, Department of Education, Department of Defense, Department of Agriculture, Department of Justice, Department of Energy, Department of Homeland Security, Department of State, Department of Commerce, National Endowment for the Humanities, National Aeronautics and Space Administration, and the Department of Transportation, and recently researched and authored the Report.
Human interaction in society depends upon language and communication. Across the Federal Government, agencies support R & D activities focused on furthering the understanding of and supporting better language and communication. To date, however, there has been no systematic accounting or description of the range of language and communication R & D programs and activities being supported by the Federal Government. In the Report, the IWGLC took on the challenge of creating a taxonomy of language and communication R & D activities and summarizing current and recent Federal investment in this area.
The taxonomy included in the Report identified four broad R & D topics in language and communication funded by the Federal Government, along with a number of subtopics under each broad topic. Please consult the taxonomy on pages 48-50 in the Report. The four broad topic headings include:
1. Knowledge and Processes Underlying Language and Communication.
2. Language and Communication Abilities and Skills.
3. Using Language and Communication to Influence Behavior and Share Information.
4. Language and Communication Technologies.
The taxonomy also identified four types of R & D activities that could be supported within each topic area:
1. Basic/foundational.
2. Translational.
3. Applied.
4. Implementation.
The Report provides programmatic recommendations for key areas for investment and collaboration in language and communication research to support a broad range of government functions such as environmental protection, education, national security, law enforcement, transportation, and public health.
The Institute is interested in gathering information that would be of help to the IWGLC in coordinating and making recommendations about the range of R & D programs and activities related to key topics of language and communication that are supported across the Federal agencies. Specifically, the Institute, on behalf of the IWGLC, requests information on the following:
1. Whether the taxonomy included in the Report captures all types of federally funded R & D programs and activities on language and communication. If not, please indicate which types of R & D activities should be added to the taxonomy.
2. Whether there are language and communication R & D programs and activities carried out in the non-Federal sector (
3. Whether there are activities that are not included in the Report's list of recommended next steps for the Federal Government to take related to language and communication R & D programs and activities that should be considered (see pgs. 33-36). If so, please indicate what activities should be added to the Report's recommendations.
Written comments may be submitted through any of the methods discussed in the
You may also access documents of the Department published in the
Department of Energy.
Notice of open meeting.
This notice announces a meeting of the Environmental Management Site-Specific Advisory Board (EM SSAB), Savannah River Site. The Federal Advisory Committee Act (Pub. L. 92-463, 86 Stat. 770) requires that public notice of this meeting be announced in the
Applied Research Center, 301 Gateway Drive, Aiken, SC 29802.
James Giusti, Office of External Affairs, Department of Energy, Savannah River Operations Office, P.O. Box A, Aiken, SC, 29802; Phone: (803) 952-7684.
Take notice that on October 24, 2016, pursuant to Rule 206 of the Federal Energy Regulatory Commission's (Commission) Rules of Practice and Procedure, 18 CFR 385.206, and sections 205, 206, 306, and 309 of the Federal Power Act (FPA)
The Complainant certifies that a copy of the complaint has been served on the Respondent.
Any person desiring to intervene or to protest this filing must file in accordance with Rules 211 and 214 of the Commission's Rules of Practice and Procedure (18 CFR 385.211, 385.214). Protests will be considered by the Commission in determining the appropriate action to be taken, but will not serve to make protestants parties to the proceeding. Any person wishing to become a party must file a notice of intervention or motion to intervene, as appropriate. The Respondent's answer and all interventions, or protests must be filed on or before the comment date. The Respondent's answer, motions to intervene, and protests must be served on the Complainants.
The Commission encourages electronic submission of protests and interventions in lieu of paper using the “eFiling” link at
This filing is accessible on-line at
This is a supplemental notice in the above-referenced proceeding of Moapa Southern Paiute Solar, LLC's application for market-based rate authority, with an accompanying rate tariff, noting that such application includes a request for blanket authorization, under 18 CFR part 34, of future issuances of securities and assumptions of liability.
Any person desiring to intervene or to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in accordance with Rules 211 and 214 of the Commission's Rules of Practice and Procedure (18 CFR 385.211 and 385.214). Anyone filing a motion to intervene or protest must serve a copy of that document on the Applicant.
Notice is hereby given that the deadline for filing protests with regard to the applicant's request for blanket authorization, under 18 CFR part 34, of future issuances of securities and assumptions of liability is November 14, 2016.
The Commission encourages electronic submission of protests and interventions in lieu of paper, using the FERC Online links at
Persons unable to file electronically should submit an original and 5 copies of the intervention or protest to the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426.
The filings in the above-referenced proceeding are accessible in the Commission's eLibrary system by clicking on the appropriate link in the above list. They are also available for electronic review in the Commission's Public Reference Room in Washington, DC. There is an eSubscription link on the Web site that enables subscribers to receive email notification when a document is added to a subscribed docket(s). For assistance with any FERC Online service, please email
Take notice that on October 21, 2016, pursuant to section 206 of the Federal Power Act, 16 U.S.C. 824e, and Rule 206 of the Federal Energy Regulatory Commission's (Commission) Rules of Practice and Procedure, 18 CFR 385.206, Indianapolis Power & Light Company (IPL or Complainant) filed a formal complaint against Midcontinent Independent System Operator, Inc. (MISO or Respondent) alleging that the Respondent's Open Access Transmission, Energy and Operating Reserve Markets Tariff is unjust and unreasonable, unduly discriminatory and preferential because it does not provide a means for IPL's Advancion® Energy Storage Array, a.k.a. the Harding Street Station Battery Energy Storage System, to be compensated for services it provides to the MISO system, including Primary Frequency Response, as more fully explained in the complaint.
Complainant certifies that copies of the complaint were served on the contacts for Respondent as listed on the Commission's list of Corporate Officials.
Any person desiring to intervene or to protest this filing must file in accordance with Rules 211 and 214 of the Commission's Rules of Practice and Procedure (18 CFR 385.211 and 385.214). Protests will be considered by the Commission in determining the appropriate action to be taken, but will not serve to make protestants parties to the proceeding. Any person wishing to become a party must file a notice of intervention or motion to intervene, as appropriate. The Respondent's answer and all interventions, or protests must be filed on or before the comment date. The Respondent's answer, motions to intervene, and protests must be served on the Complainants.
The Commission encourages electronic submission of protests and interventions in lieu of paper using the “eFiling” link at
This filing is accessible on-line at
Take notice that the Commission received the following electric rate filings:
Description: Baseline eTariff Filing: Market-Based Rates Application to be effective 12/24/2016.
The filings are accessible in the Commission's eLibrary system by clicking on the links or querying the docket number.
Any person desiring to intervene or protest in any of the above proceedings must file in accordance with Rules 211 and 214 of the Commission's Regulations (18 CFR 385.211 and 385.214) on or before 5:00 p.m. Eastern time on the specified comment date. Protests may be considered, but intervention is necessary to become a party to the proceeding.
eFiling is encouraged. More detailed information relating to filing requirements, interventions, protests, service, and qualifying facilities filings can be found at:
Take notice that the Commission received the following electric corporate filings:
Take notice that the Commission received the following electric rate filings:
The filings are accessible in the Commission's eLibrary system by clicking on the links or querying the docket number.
Any person desiring to intervene or protest in any of the above proceedings must file in accordance with Rules 211 and 214 of the Commission's Regulations (18 CFR 385.211 and 385.214) on or before 5:00 p.m. Eastern time on the specified comment date. Protests may be considered, but intervention is necessary to become a party to the proceeding.
eFiling is encouraged. More detailed information relating to filing requirements, interventions, protests, service, and qualifying facilities filings can be found at:
Take notice that on October 24, 2016, pursuant to Rule 206 of the Federal Energy Regulatory Commission's (Commission) Rules of Practice and Procedure, 18 CFR 385.206, and section 5 of the Natural Gas Act (NGA), 15 U.S.C. 717d (2009), Breitburn Operating LP (Complainant) filed a formal complaint against Florida Gas Transmission Company, LLC (Respondent) alleging that Respondent: (1) Unduly discriminated against Complainant by unilaterally requiring its natural gas supplier to pay both the Western Division and Market Area rates while similarly situated shippers paid only the Western Division rate and (2) unlawfully charged and collected a rate under section 4 of the NGA without Commission authorization, all as more fully explained in the complaint.
The Complainant certifies that a copy of the complaint has been served on the contacts for the Respondent.
Any person desiring to intervene or to protest this filing must file in accordance with Rules 211 and 214 of the Commission's Rules of Practice and Procedure (18 CFR 385.211, 385.214). Protests will be considered by the Commission in determining the appropriate action to be taken, but will not serve to make protestants parties to the proceeding. Any person wishing to become a party must file a notice of intervention or motion to intervene, as appropriate. The Respondent's answer and all interventions, or protests must be filed on or before the comment date. The Respondent's answer, motions to intervene, and protests must be served on the Complainants.
The Commission encourages electronic submission of protests and interventions in lieu of paper using the “eFiling” link at
This filing is accessible on-line at
Take notice that on October 13, 2016, Dominion Carolina Gas Transmission, LLC, 707 East Main Street, Richmond, VA 23219, filed an application pursuant to section 7(b) of the Natural Gas Act (NGA) requesting authorization to abandon approximately 60 miles of mainline transmission pipeline facilities in Chester, Kershaw, Lancaster, and York Counties, South Carolina that comprise the Line A Abandonment Project, all as more fully set forth in the application which is on file with the Commission and open to public inspection. The filing may also be viewed on the web at
Any questions concerning this application may be directed to Richard D. Jessee, Gas Transmission Certificates Program Manager, Dominion Carolina Gas Transmission, LLC, 707 East Main Street, Richmond, VA 23219, telephone no. (866) 319-3382, facsimile no. (804) 771-4804 and email:
Pursuant to section 157.9 of the Commission's rules (18 CFR 157.9), within 90 days of this Notice, the Commission staff will either: Complete its environmental assessment (EA) and place it into the Commission's public record (eLibrary) for this proceeding or issue a Notice of Schedule for Environmental Review. If a Notice of Schedule for Environmental Review is issued, it will indicate, among other milestones, the anticipated date for the Commission staff's issuance of the final environmental impact statement (FEIS) or EA for this proposal. The filing of the EA in the Commission's public record for this proceeding or the issuance of a Notice of Schedule will serve to notify federal and state agencies of the timing for the completion of all necessary reviews, and the subsequent need to complete all federal authorizations within 90 days of the date of issuance of the Commission staff's FEIS or EA.
There are two ways to become involved in the Commission's review of this project. First, any person wishing to obtain legal status by becoming a party to the proceedings for this project should, on or before the comment date stated below, file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, a motion to intervene in accordance with the requirements of the Commission's Rules of Practice and Procedure (18 CFR 385.214 or 385.211) and the Regulations under the NGA (18 CFR 157.10). A person obtaining party status will be placed on the service list maintained by the Secretary of the Commission and will receive copies of all documents filed by the applicant and by all other parties. A party must submit 5 copies of filings made with the Commission and must mail a copy to the applicant and to every other party in the proceeding. Only parties to the proceeding can ask for court review of Commission orders in the proceeding.
However, a person does not have to intervene in order to have comments considered. The second way to participate is by filing with the Secretary of the Commission, as soon as possible, an original and two copies of comments in support of or in opposition to this project. The Commission will consider these comments in determining the appropriate action to be taken, but the filing of a comment alone will not serve to make the filer a party to the proceeding. The Commission's rules require that persons filing comments in opposition to the project provide copies of their protests only to the party or parties directly involved in the protest.
Persons who wish to comment only on the environmental review of this project should submit an original and two copies of their comments to the
The Commission strongly encourages electronic filings of comments, protests and interventions in lieu of paper using the “eFiling” link at
On June 29, 2016, the City of Tuscaloosa, Alabama filed an application for a preliminary permit, pursuant to section 4(f) of the Federal Power Act (FPA), proposing to study the feasibility of the Lake Tuscaloosa Dam Hydroelectric Project (Lake Tuscaloosa Project or project) to be located on the North River, near the City of Tuscaloosa in Tuscaloosa County, Alabama. The sole purpose of a preliminary permit, if issued, is to grant the permit holder priority to file a license application during the permit term. A preliminary permit does not authorize the permit holder to perform any land-disturbing activities or otherwise enter upon lands or waters owned by others without the owners' express permission.
More information about this project, including a copy of the application, can be viewed or printed on the “eLibrary” link of Commission's Web site at
This is a supplemental notice in the above-referenced proceeding of Applied Energy LLC's application for market-based rate authority, with an accompanying rate tariff, noting that such application includes a request for blanket authorization, under 18 CFR part 34, of future issuances of securities and assumptions of liability.
Any person desiring to intervene or to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in accordance with Rules 211 and 214 of the Commission's Rules of Practice and Procedure (18 CFR 385.211 and 385.214). Anyone filing a motion to intervene or protest must serve a copy of that document on the Applicant.
Notice is hereby given that the deadline for filing protests with regard to the applicant's request for blanket authorization, under 18 CFR part 34, of future issuances of securities and assumptions of liability is November 14, 2016.
The Commission encourages electronic submission of protests and interventions in lieu of paper, using the FERC Online links at
Persons unable to file electronically should submit an original and 5 copies of the intervention or protest to the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426.
The filings in the above-referenced proceeding are accessible in the Commission's eLibrary system by clicking on the appropriate link in the above list. They are also available for electronic review in the Commission's
Take notice that the Commission has received the following Natural Gas Pipeline Rate and Refund Report filings:
The filings are accessible in the Commission's eLibrary system by clicking on the links or querying the docket number.
Any person desiring to intervene or protest in any of the above proceedings must file in accordance with Rules 211 and 214 of the Commission's Regulations (18 CFR 385.211 and 385.214) on or before 5:00 p.m. Eastern time on the specified comment date. Protests may be considered, but intervention is necessary to become a party to the proceeding.
eFiling is encouraged. More detailed information relating to filing requirements, interventions, protests, service, and qualifying facilities filings can be found at:
Take notice that the Commission received the following electric corporate filings:
Take notice that the Commission received the following electric rate filings:
The filings are accessible in the Commission's eLibrary system by clicking on the links or querying the docket number.
Any person desiring to intervene or protest in any of the above proceedings must file in accordance with Rules 211 and 214 of the Commission's Regulations (18 CFR 385.211 and 385.214) on or before 5:00 p.m. Eastern time on the specified comment date. Protests may be considered, but intervention is necessary to become a party to the proceeding.
eFiling is encouraged. More detailed information relating to filing requirements, interventions, protests, service, and qualifying facilities filings can be found at:
In accordance with the National Environmental Policy Act of 1969 and the Federal Energy Regulatory Commission's (Commission) regulations, 18 CFR part 380 (Order No. 486, 52 FR 47897), the Office of Energy Projects has reviewed the application for a new license for the Eastman Falls Hydroelectric Project, located on the Pemigewasset River in the town of Franklin, in Merrimack and Belknap Counties, New Hampshire, and has prepared an Environmental Assessment (EA).
The EA contains the staff's analysis of the potential environmental impacts of the project and concludes that licensing the project, with appropriate environmental protective measures, would not constitute a major federal action that would significantly affect the quality of the human environment.
A copy of the EA is available for review at the Commission in the Public Reference Room or may be viewed on the Commission's Web site at
You may also register online at
Any comments should be filed within 30 days from the date of this notice. The Commission strongly encourages electronic filing. Please file comments using the Commission's eFiling system at
For further information, contact Steve Kartalia at (202) 502-6131 or
Take notice that the Commission has received the following Natural Gas Pipeline Rate and Refund Report filings:
Any person desiring to intervene or protest in any of the above proceedings must file in accordance with Rules 211 and 214 of the Commission's Regulations (18 CFR 385.211 and 385.214) on or before 5:00 p.m. Eastern time on the specified comment date. Protests may be considered, but intervention is necessary to become a party to the proceeding.
Any person desiring to protest in any of the above proceedings must file in accordance with Rule 211 of the Commission's Regulations (18 CFR 385.211) on or before 5:00 p.m. Eastern time on the specified comment date.
The filings are accessible in the Commission's eLibrary system by clicking on the links or querying the docket number.
eFiling is encouraged. More detailed information relating to filing requirements, interventions, protests, service, and qualifying facilities filings can be found at:
Environmental Protection Agency (EPA).
Notice of meeting of the Children's Health Protection Advisory Committee.
Pursuant to the provisions of the Federal Advisory Committee Act, Public Law 92-463, notice is hereby given that the next meeting of the Children's Health Protection Advisory Committee (CHPAC) will be held November 15 and 16, 2016 at the George Washington University Milken Institute School of Public Health, located at 950 New Hampshire Avenue NW., Washington, DC 20037.
November 15 and 16, 2016.
950 New Hampshire Avenue NW., Washington, DC 20037.
The meetings of the CHPAC are open to the public. The CHPAC will meet on Tuesday, November 15 from 1:00 p.m. to 5:30 p.m. and Wednesday, November 16 from 9:00 a.m. to 4:00 p.m. An agenda will be posted to
Martha Berger, Designated Federal Officer, U.S. EPA; telephone (202) 564-2191 or
Please note that the time for the Federal Communications Commission Open Meeting is rescheduled from 10:30 a.m. to 9:30 a.m.
The Federal Communications Commission will consider the Agenda items listed on the Commission's Notice of October 20 at the Open Meeting on Thursday, October 27, 2016, scheduled to commence at 9:30 a.m. in room TW-C305, at 445 12th Street SW., Washington, DC. The order of the agenda items is changed as follows:
The notificants listed below have applied under the Change in Bank Control Act (12 U.S.C. 1817(j)) and § 225.41 of the Board's Regulation Y (12 CFR 225.41) to acquire shares of a bank or bank holding company. The factors that are considered in acting on the notices are set forth in paragraph 7 of the Act (12 U.S.C. 1817(j)(7)).
The notices are available for immediate inspection at the Federal Reserve Bank indicated. The notices also will be available for inspection at the offices of the Board of Governors. Interested persons may express their views in writing to the Reserve Bank indicated for that notice or to the offices of the Board of Governors. Comments must be received not later than November 15, 2016.
1.
1.
Agency for Healthcare Research and Quality (AHRQ), Department of Health and Human Services (HHS).
Notice of delisting.
The Patient Safety and Quality Improvement Act of 2005, 42 U.S.C. 299b-21 to b-26, (Patient Safety Act) and the related Patient Safety and Quality Improvement Final Rule, 42 CFR part 3 (Patient Safety Rule), published in the
The directories for both listed and delisted PSOs are maintained on an ongoing basis and reviewed weekly by AHRQ. The delisting was effective at 12:00 Midnight ET (2400) on September 30, 2016.
Both directories can be accessed electronically at the following HHS Web site:
Eileen Hogan, Center for Quality Improvement and Patient Safety, AHRQ, 5600 Fishers Lane, Room 06N94B, Rockville, MD 20857; Telephone (toll free): (866) 403-3697; Telephone (local):
The Patient Safety Act authorizes the listing of PSOs, which are entities or component organizations whose mission and primary activity are to conduct activities to improve patient safety and the quality of health care delivery.
HHS issued the Patient Safety Rule to implement the Patient Safety Act. AHRQ administers the provisions of the Patient Safety Act and Patient Safety Rule relating to the listing and operation of PSOs. The Patient Safety Rule authorizes AHRQ to list as a PSO an entity that attests that it meets the statutory and regulatory requirements for listing. A PSO can be “delisted” if it is found to no longer meet the requirements of the Patient Safety Act and Patient Safety Rule, when a PSO chooses to voluntarily relinquish its status as a PSO for any reason, or when a PSO's listing expires. Section 3.108(d) of the Patient Safety Rule requires AHRQ to provide public notice when it removes an organization from the list of federally approved PSOs.
AHRQ has accepted a notification from the Patient Safety Leadership Council PSO, PSO number P0164, to voluntarily relinquish its status as a PSO. Accordingly, the Patient Safety Leadership Council PSO was delisted effective at 12:00 Midnight ET (2400) on September 30, 2016. AHRQ notes that the Patient Safety Leadership Council PSO submitted this request for voluntary relinquishment following receipt of the Notice of Preliminary Finding of Deficiency sent on September 1, 2016.
More information on PSOs can be obtained through AHRQ's PSO Web site at
Agency for Healthcare Research and Quality, HHS.
Notice.
This notice announces the intention of the Agency for Healthcare Research and Quality (AHRQ) to request that the Office of Management and Budget (OMB) approve the proposed information collection project:
This proposed information collection was previously published in the
Comments on this notice must be received by November 30, 2016.
Written comments should be submitted to: AHRQ's OMB Desk Officer by fax at (202) 395-6974 (attention: AHRQ's desk officer) or by email at
Doris Lefkowitz, AHRQ Reports Clearance Officer, (301) 427-1477, or by email at
There is a substantial evidence base showing that engaging patients and families in their care can lead to improvements in patient safety. Since the 1999 release of
Patient and Family Engagement (PFE) strategies for acute care settings include: patient and family advisory committees; membership on patient safety oversight bodies at both operations and governance levels; consultation in the development of patient information material; engaging patients in process improvement or redesign projects; rounding with patients and families; patient and family participation in clinical education programs; and welcoming patients and families to work alongside providers and health systems employees on transparency, culture change, and high reliability organization initiatives.
Although the field of PFE in patient safety for hospitals and health systems is maturing, leveraging PFE to improve patient safety in non-acute settings is in its infancy. Building sustainable processes and practice-based infrastructure is crucial to improving patient safety through patient and family engagement in primary care.
In response to the limited guidance available for primary care practices to improve safety through patient and family engagement, the Agency for Healthcare Research and Quality (AHRQ) has funded the development of a
Active engagement requires organizational commitment to hearing the patient and family voice and action by leadership to include them as central members of the health care team.
Patients and families expect and increasingly demand meaningful engagement in harm prevention efforts.
Institutional courage is required to openly share patient safety vulnerabilities and proactively engage patients in developing solutions that prevent harm.
Supportive infrastructure is needed to hardwire PFE into all facets of care delivery across the care continuum.
When done well, patient engagement yields important and measurable results. When not done well, PFE activities may disenfranchise patients, contribute to misunderstanding about risk, result in lack of trust between providers and their organizations, and create fissures among members of the clinical care team.
With these insights as a basis, three precepts undergird our approach to developing the Guide. The Guide interventions must yield:
The Guide will principally, but not exclusively, meet the needs of practices that have not already implemented effective PFE structures or processes. An environmental scan revealed several promising interventions for possible inclusion in the Guide. The four interventions selected as part of the Guide include:
The interventions will be compiled into a Guide for adoption by primary care practices. The environmental scan also yielded several important implications for Guide development, including:
Engagement efforts in primary care to date have focused on the patient as the agent of change with limited guidance to providers on how to support patients in these efforts.
Many interventions are focused heavily on educational efforts alone, either for the patient, the provider, or the practice.
Few of the tools and interventions identified are immediately usable without the need for additional development or enabling materials to support sustainable adoption.
Health equity and literacy considerations are limited. Tools for patients often require a relatively high level of literacy and/or health literacy for use.
Current interventions, tools, and toolkits have a high level of complexity that may impede adoption.
Existing evidence-based interventions are being refined to reduce complexity and enhance the opportunity for implementation. Implementation development activities, including guidance for each intervention and for the Guide as a whole, are currently underway. Guide field testing will evaluate the implementation challenges faced by primary care practices, thereby offering an opportunity to revise the Guide materials for optimal implementation success prior to widespread dissemination.
The Guide will be made publicly accessible through the AHRQ Web site for easy referral, access, and use by other health care professionals and primary care practices. AHRQ recognizes the importance of ensuring that the Guide will be useful, well implemented and effective in achieving the goals of improving patient safety by engaging patients and families. Thus, the purpose of the Field Testing evaluation is to gain insight on the implementation challenges identified by the twelve primary care practices field testing the Guide. The Guide materials will be revised in an effort to overcome these implementation challenges prior to broad dissemination.
The specific goals of the proposed Guide field testing evaluation are to examine the following:
The feasibility of implementing a minimum of two of the four Guide interventions within twelve medium or large primary care practices.
The challenges to implementing the interventions at the patient, clinician, practice staff, and practice level.
The uptake and confidence among primary care practices to improve patient safety through patient and family engagement.
How the implementation of two of the four Guide interventions changes the perception of patient safety among patients, clinicians, and practice staff.
How the implementation of two of the four Guide interventions changes the perception of patient and family engagement among patients, clinicians, and practice staff.
Whether primary care practices will continue to use the Guide (or its interventions) beyond the period of field testing and evaluation (
What changes patients, clinicians, and practice staff would recommend to the interventions and the Guide to enhance sustainability.
This study is being conducted by AHRQ through its contractor, MedStar, pursuant to AHRQ's statutory authority to conduct and support research on health care and on systems for the delivery of such care, including activities with respect to the quality, effectiveness, efficiency, appropriateness and value of healthcare services and with respect to quality measurement and improvement. 42 U.S.C. 299a(a)(1) and (2).
To achieve the goals of the project, the following data collections will be implemented during the Field Testing evaluation:
1. Baseline Practice Assessment of Primary Care Practices. This pen-and-paper survey will be administered to the twelve primary care practice champions (the individuals at each practice responsible for coordinating Guide activities and responding to inquiries from MedStar during Field Testing) immediately following recruitment into the Guide Field Test and prior to commencing implementation of the Guide. Information collected includes: (i) Practice name and location (
2. Post-Implementation Focus Groups for Patients and Families. Information from patients on their experiences with the Guide and its interventions will be solicited twice during the Field Test—once at 3 months and again at 6 months post-implementation of the Guide. Each patient and family focus group will aim to recruit between 6 and 8 participants and solicit feedback from patients and family members on their experiences with the Guide materials. Information collected will include: (i) Perceptions of patient safety in primary care practices; (ii) perceptions of patient and family engagement in primary care practices; (iii) feedback from the patient perspective on the Guide materials and their general use; (iv) feasibility of adopting the patient and family focused intervention materials in practice; (v) feedback on patient and family experiences of the Guide and its relation to patient safety.
3. Baseline Practice Readiness Assessment. Information from primary care practices about their readiness to adopt patient and family engagement strategies will be solicited through telephone interviews with practice staff champions. Information collected will include: (i) Descriptive information on the person completing the interview (
4. Post-Implementation Interviews of Primary Care Clinicians. Information from primary care clinicians (
5. Post-Implementation Focus Groups for Practice Staff Members. Information from practice staff members (
6. Monthly Telephone Interviews with Practice Champions. This survey will be completed over the phone on a monthly basis with the practice champions from the twelve primary care practices engaged in the Field Testing of the Guide. Information collected will include: (i) Current progress towards implementation of the intervention(s); (ii) movement towards target goals set in the prior meeting; (iii) barriers to implementation; (iv) facilitators of implementation; (v) perceived impact on patient safety; (vi) perceived impact on patient and family engagement; (vii) plans for the coming weeks/months.
The Guide will be tested to evaluate the feasibility of adopting it in primary care practices. A mixed-methods approach will be used to identify barriers and facilitators to uptake and sustainability, and to answer the question “How and in what contexts do the chosen interventions work, or can they be amended to work?” rather than “Do they work?” Testing will occur at up to 12 primary care sites and feasibility will be assessed at the patient, provider, and practice levels. The Guide will be revised based on these findings.
Exhibit 1 shows the estimated annualized burden hours for the respondents' time to participate in this evaluation of the Guide during field testing. Two formative evaluations will be conducted during field testing in twelve primary care practices in at least two geographic regions of the United States. Evaluation efforts will include collection of baseline practice-level data prior to Guide implementation and two separate rounds of focus groups and interviews conducted 3 months and 6 months after Guide implementation. Baseline assessments will be conducted on paper via phone consultation between the Contractor and the local practice champion and will take between 30 and 60 minutes. Patient focus groups will be conducted at the 3- and 6-month evaluation periods, each lasting between 60 and 90 minutes. Practice staff focus groups will be conducted during each of the site visits, held outside regular practice hours, and will last between 60 and 90 minutes. Primary care clinician interviews will last approximately 45 minutes. We estimate that approximately 12 individuals will participate in the monthly telephone interviews over the 9-month implementation and evaluation period.
Exhibit 2 shows the estimated annualized cost burden based on the respondents' time to participate in this project. The total cost burden is estimated to be $18,629.16.
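For readers unfamiliar with how cost figures such as Exhibit 2's are typically derived, the sketch below shows the general structure of an annualized cost-burden estimate: burden hours multiplied by a mean hourly wage, summed across respondent groups. The group names, hours, and wage rates in the sketch are hypothetical placeholders rather than the notice's exhibit entries; only the $18,629.16 total is taken from the notice.

```python
# Generic structure of an annualized cost-burden estimate: for each respondent
# group, multiply total annual burden hours by an assumed mean hourly wage and
# sum the results. All values below are illustrative placeholders; the notice's
# Exhibit 2 reports a total cost burden of $18,629.16 based on its own figures.

burden_by_group = {
    # group: (annual burden hours, assumed mean hourly wage in dollars)
    "practice champions":      (30.0, 52.00),
    "patients and families":   (120.0, 23.86),
    "primary care clinicians": (36.0, 97.00),
    "practice staff":          (60.0, 33.00),
}

total_cost = sum(hours * wage for hours, wage in burden_by_group.values())
print(f"Illustrative total: ${total_cost:,.2f}")  # not the notice's $18,629.16
```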
In accordance with the Paperwork Reduction Act, comments on AHRQ's information collection are requested with regard to any of the following: (a) Whether the proposed collection of information is necessary for the proper performance of AHRQ health care research and health care information dissemination functions, including whether the information will have practical utility; (b) the accuracy of AHRQ's estimate of burden (including hours and costs) of the proposed collection(s) of information; (c) ways to enhance the quality, utility, and clarity of the information to be collected; and (d) ways to minimize the burden of the collection of information upon the respondents, including the use of automated collection techniques or other forms of information technology.
Comments submitted in response to this notice will be summarized and included in the Agency's subsequent request for OMB approval of the proposed information collection. All comments will become a matter of public record.
Centers for Disease Control and Prevention (CDC), Department of Health and Human Services (HHS).
Notice with comment period.
The Centers for Disease Control and Prevention (CDC), as part of its continuing efforts to reduce public burden and maximize the utility of government information, invites the general public and other Federal agencies to take this opportunity to comment on proposed and/or continuing information collections, as required by the Paperwork Reduction Act of 1995. This notice invites comment on a proposed information collection entitled “Understanding the Needs, Challenges, Opportunities, Vision and Emerging Roles in Environmental Health (UNCOVER EH).” The purpose of the data collection is to collect information from the health department environmental health (EH) workforce to determine demographics, education/training, experience, areas of practice, and current and future needs to address emerging environmental issues.
Written comments must be received on or before December 30, 2016.
You may submit comments, identified by Docket No. CDC-2016-0103 by any of the following methods:
•
•
To request more information on the proposed project or to obtain a copy of the information collection plan and instruments, contact the Information Collection Review Office, Centers for Disease Control and Prevention, 1600 Clifton Road NE., MS-D74, Atlanta, Georgia 30329; phone: 404-639-7570; Email:
Under the Paperwork Reduction Act of 1995 (PRA) (44 U.S.C. 3501-3520), Federal agencies must obtain approval from the Office of Management and Budget (OMB) for each collection of information they conduct or sponsor. In addition, the PRA also requires Federal agencies to provide a 60-day notice in the
Understanding the Needs, Challenges, Opportunities, Vision and Emerging Roles in Environmental Health (UNCOVER EH)—NEW—National Center for Environmental Health (NCEH), Centers for Disease Control and Prevention (CDC).
The environmental health (EH) workforce is an essential component of the public health workforce. According to recent health department surveys, EH professionals are employed at approximately 85% of local health departments, 81% of state health departments, and 30% of tribal health departments. Describing and characterizing the EH workforce is essential to identifying gaps in staffing, training, and ultimately ensuring EH professionals are prepared to meet future challenges.
CDC seeks OMB approval for a one-time, one-year information collection designed to thoroughly describe the health department EH workforce in terms of: (1) The current supply of EH professionals; (2) EH workforce demographics and professional roles; (3) gaps in current EH education and competencies, and training needs; and (4) critical skills and resources needed to meet the evolving and emerging EH issues and challenges. This information will benefit the government and other entities by providing essential data to inform and support workforce development activities and initiatives, and to improve understanding of areas of practice and of where gaps may exist in the capacity to address current EH issues and future challenges.
The respondent universe will be the estimated 20,000 EH professionals working within health departments. They will be enumerated and recruited by identifying a point of contact in each state, local, tribal, and territorial health department from whom a roster of EH professionals will be requested. A list of respondents and their business email addresses will be generated and used for recruitment and survey administration. Any contact information collected will be related to the respondents' role in the organization. Participation will be voluntary.
Data will be collected one time from a census of members of the public health department EH workforce using a web-based survey instrument. The UNCOVER EH Survey will take approximately 30 minutes to complete per respondent, and it will take approximately 5 minutes for health department administrative staff to compile EH workforce names and email addresses into the Health Department Roster.
There will be no cost to respondents other than their time. The requested time burden is 10,269 hours.
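As a rough arithmetic check, the sketch below recomputes the burden total from the figures stated in this notice (20,000 respondents at 30 minutes each, plus roster compilation at 5 minutes per health department). The number of health departments compiling rosters is not stated in the notice; the figure used here is an assumption, back-calculated only so that the example reproduces the 10,269-hour total.

# Illustrative burden-hour arithmetic (a sketch, not part of the notice).
EH_PROFESSIONALS = 20000      # estimated respondent universe (stated in the notice)
SURVEY_MINUTES = 30           # UNCOVER EH Survey time per respondent (stated in the notice)
ROSTER_MINUTES = 5            # roster compilation time per health department (stated in the notice)
HEALTH_DEPARTMENTS = 3228     # assumption: back-calculated to reproduce the stated total

survey_hours = EH_PROFESSIONALS * SURVEY_MINUTES / 60     # 10,000 hours
roster_hours = HEALTH_DEPARTMENTS * ROSTER_MINUTES / 60   # 269 hours
print(survey_hours + roster_hours)                        # 10,269 hours, matching the notice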
Centers for Medicare & Medicaid Services, HHS.
Notice.
The Centers for Medicare & Medicaid Services (CMS) is announcing an opportunity for the public to comment on CMS' intention to collect information from the public. Under the Paperwork Reduction Act of 1995 (the PRA), federal agencies are required to publish notice in the
Comments must be received by December 30, 2016.
When commenting, please reference the document identifier or OMB control number. To be assured consideration, comments and recommendations must be submitted in any one of the following ways:
1.
2.
To obtain copies of a supporting statement and any related forms for the proposed collection(s) summarized in this notice, you may make your request using one of the following:
1. Access CMS' Web site address at
2. Email your request, including your address, phone number, OMB number, and CMS document identifier, to
3. Call the Reports Clearance Office at (410) 786-1326.
Reports Clearance Office at (410) 786-1326.
This notice sets out a summary of the use and burden associated with the following information collections. More detailed information can be found in each collection's supporting statement and associated materials (see
Under the PRA (44 U.S.C. 3501-3520), federal agencies must obtain approval from the Office of Management and Budget (OMB) for each collection of information they conduct or sponsor. The term “collection of information” is defined in 44 U.S.C. 3502(3) and 5 CFR 1320.3(c) and includes agency requests or requirements that members of the public submit reports, keep records, or provide information to a third party. Section 3506(c)(2)(A) of the PRA requires federal agencies to publish a 60-day notice in the
1.
2.
3.
4.
5.
Centers for Medicare & Medicaid Services, Department of Health and Human Services.
Notice.
The Centers for Medicare & Medicaid Services (CMS) is announcing an opportunity for the public to comment on CMS' intention to collect information from the public. Under the Paperwork Reduction Act of 1995 (the PRA), Federal agencies are required to publish notice in the
Comments must be received by December 30, 2016.
When commenting, please reference the document identifier or OMB control number. To be assured consideration, comments and recommendations must be submitted in any one of the following ways:
1.
2.
To obtain copies of a supporting statement and any related forms for the proposed collection(s) summarized in this notice, you may make your request using one of the following:
1. Access CMS' Web site address at
2. Email your request, including your address, phone number, OMB number, and CMS document identifier, to
3. Call the Reports Clearance Office at (410) 786-1326.
Reports Clearance Office at (410) 786-1326.
This notice sets out a summary of the use and burden associated with the following information collections. More detailed information can be found in each collection's supporting statement and associated materials (see
Under the PRA (44 U.S.C. 3501-3520), Federal agencies must obtain approval from the Office of Management and Budget (OMB) for each collection of information they conduct or sponsor. The term “collection of information” is defined in 44 U.S.C. 3502(3) and 5 CFR 1320.3(c) and includes agency requests or requirements that members of the public submit reports, keep records, or provide information to a third party. Section 3506(c)(2)(A) of the PRA requires Federal agencies to publish a 60-day notice in the
1.
Centers for Medicare & Medicaid Services, HHS.
Notice.
The Centers for Medicare & Medicaid Services (CMS) is announcing an opportunity for the public to comment on CMS' intention to collect information from the public. Under the Paperwork Reduction Act of 1995 (PRA), federal agencies are required to publish notice in the
Comments on the collection(s) of information must be received by the OMB desk officer by November 30, 2016.
When commenting on the proposed information collections, please reference the document identifier or OMB control number. To be assured consideration, comments and recommendations must be received by the OMB desk officer via one of the following transmissions: OMB, Office of Information and Regulatory Affairs, Attention: CMS Desk Officer, Fax Number: (202) 395-5806
To obtain copies of a supporting statement and any related forms for the proposed collection(s) summarized in this notice, you may make your request using one of the following:
1. Access CMS' Web site address at
2. Email your request, including your address, phone number, OMB number, and CMS document identifier, to
3. Call the Reports Clearance Office at (410) 786-1326.
Reports Clearance Office at (410) 786-1326.
Under the Paperwork Reduction Act of 1995 (PRA) (44 U.S.C. 3501-3520), federal agencies must obtain approval from the Office of Management and Budget (OMB) for each collection of information they conduct or sponsor. The term “collection of information” is defined in 44 U.S.C. 3502(3) and 5 CFR 1320.3(c) and includes agency requests or requirements that members of the public submit reports, keep records, or provide information to a third party. Section 3506(c)(2)(A) of the PRA (44 U.S.C. 3506(c)(2)(A)) requires federal agencies to publish a 30-day notice in the
1.
CMS evaluates the quality and effectiveness of the QIO program as authorized in Part B of Title XI of the Social Security Act. CMS created the Independent Evaluation Center (IEC) to provide CMS and its stakeholders with an independent and objective program evaluation of the 11th SOW. Evaluation activities will focus on analyzing how well the QIO program is achieving the three aims of better care, better health, and lower cost, as well as the effectiveness of the new QIO program structure. One of the QIN-QIOs' tasks to achieve these three aims is to support participating nursing homes in their efforts to improve quality of care and health outcomes among residents. According to the 2013 CMS Nursing Home Data Compendium, more than 15,000 nursing homes participated in Medicare and Medicaid programs, and more than 1.4 million beneficiaries resided in U.S. nursing homes. These residents and their families rely on nursing homes to provide reliable, safe, high-quality care. However, cognitive and functional impairments, pain, incontinence, antipsychotic drug use, and healthcare-associated conditions (HACs), such as pressure ulcers and falls, remain areas of concern.
This information collection will provide data to assess QIN-QIOs' efforts aimed at addressing these HACs in nursing homes. QIN-QIOs are responsible for recruiting nursing homes to participate in the program. We will conduct an annual survey of administrators of nursing homes participating in the QIN-QIO program (intervention group) and administrators at nursing homes that are not participating in the QIN-QIO program (comparison group). Our proposed survey assesses progress towards the goals of the QIN-QIO SOW, including activities and strategies to increase mobility among residents, reduce infections, and reduce use of inappropriate antipsychotic medication among long-term stay residents.
We plan to conduct qualitative interviews with nursing home administrators. This interview will supplement the Nursing Home Survey and provide more in-depth contextual information about QIN-QIO program implementation at nursing homes, including: (i) Their experience with, and perceived success of, QIN-QIO collaboratives; (ii) their satisfaction with the QIN-QIO Collaborative and QIO support; (iii) the perceived value and impact of the QIO program; and (iv) drivers of and barriers to QIN-QIO involvement and success.
Information from QIO leadership and/or state/territory task leads will be collected by interviews and focus groups. Interviews with Nursing Home Task leaders at the QIN and QIO will be conducted in-person during site visits and/or over the phone. We will conduct focus groups with QIO-level Directors during the annual CMS Quality conference. The purpose of the interviews and focus groups is to examine: (i) QIO processes for recruiting nursing homes, peer coaches, and beneficiaries to participate in the program; (ii) strengths and challenges of QIN-QIO activities related to nursing homes; (iii) partnership and coordination with other QIN-QIO tasks; and (iv) overall lessons learned. We will also conduct qualitative interviews with nursing home peer coaches. Subsequent to the 60-day notice
2.
3.
4.
5.
6.
7.
8.
9.
Food and Drug Administration, HHS.
Notice of availability.
Under the Federal Food, Drug, and Cosmetic Act (the FD&C Act), the Food and Drug Administration (FDA or Agency) is required to report annually in the
Cathryn C. Lee, Center for Drug Evaluation and Research, Food and Drug Administration, 10903 New Hampshire Ave., Bldg. 22, Rm. 6484, Silver Spring, MD 20993-0002, 301-796-0700; or Stephen Ripley, Center for Biologics Evaluation and Research, Food and Drug Administration, 10903 New Hampshire Ave., Bldg. 71, Rm. 3128, Silver Spring, MD 20993-0002, 240-402-7911.
A PMR is a study or clinical trial that an applicant is required by statute or regulation to conduct postapproval. A PMC is a study or clinical trial that an applicant agrees in writing to conduct postapproval, but that is not required by statute or regulation. PMRs and PMCs can be issued upon approval of a drug
FDA can require application holders to conduct postmarketing studies and clinical trials:
• To assess a known serious risk, assess signals of serious risk, or identify an unexpected serious risk (when available data indicates the potential for a serious risk) related to the use of a drug product (section 505(o)(3) of the FD&C Act, as added by the Food and Drug Administration Amendments Act of 2007 (FDAAA)).
• Under the Pediatric Research Equity Act (PREA), to study certain new drugs for pediatric populations, when these drugs are not adequately labeled for children. Under section 505B(a)(3) of the FD&C Act, the initiation of these studies may be deferred until required safety information from other studies in adults has first been submitted and reviewed.
• To verify and describe the predicted effect or other clinical benefit for drugs approved in accordance with the accelerated approval provisions in section 506(c)(2)(A) of the FD&C Act (21 CFR 314.510 and 601.41).
• For a drug that was approved on the basis of animal efficacy data because human efficacy trials are not ethical or feasible (21 CFR 314.610(b)(1) and 601.91(b)(1)). PMRs for drug products approved under the animal efficacy rule
Under the regulations (21 CFR 314.81(b)(2)(vii) and 601.70), applicants of approved drugs are required to submit annually a report on the status of each clinical safety, clinical efficacy, clinical pharmacology, and nonclinical toxicology study or clinical trial either required by FDA or that they have committed to conduct, either at the time of approval or after approval of their new drug application (NDA), abbreviated new drug application (ANDA), or biologics license application (BLA). Applicants are required to report to FDA on these requirements and commitments made for NDAs and ANDAs under 21 CFR 314.81(b)(2)(viii), and for BLAs under 21 CFR 601.70(b). The status of PMCs concerning chemistry, manufacturing, and production controls and the status of other studies or clinical trials conducted on an applicant's own initiative are not required to be reported under 21 CFR 314.81(b)(2)(vii) and 601.70 and are not addressed in this report. Furthermore, section 505(o)(3)(E) of the FD&C Act requires that applicants report periodically on the status of each required study or clinical trial and each study or clinical trial “otherwise undertaken . . . to investigate a safety issue . . . .”
An applicant must report on the progress of the PMR/PMC on the anniversary of the drug product's approval
The status of the PMR/PMC must be described in the ASR according to the terms and definitions provided in 21 CFR 314.81 and 601.70. For its own reporting purposes, FDA has also established terms to describe when the conditions of the PMR/PMC have been met, and when it has been determined that a PMR/PMC is no longer necessary.
•
•
•
•
•
•
•
In addition to the above statuses, PMRs/PMCs may also be characterized as closed or open. “Open” PMRs/PMCs comprise those that are pending, ongoing, delayed, submitted, or terminated; whereas “closed”
If an applicant fails to comply with the original schedule for completion of postmarketing studies or clinical trials required under section 505(o)(3) of the FD&C Act (
Section 505B(a)(3)(B) of the FD&C Act, as amended by the Food and Drug Administration Safety and Innovation Act, authorizes FDA to grant an extension of deferral of pediatric assessments that are required under PREA.
FDA may take enforcement action against applicants who are noncompliant with or otherwise fail to conduct studies and clinical trials required under FDA statutes and regulations (see, for example, sections 505(o)(1), 502(z), and 303(f)(4) of the FD&C Act (21 U.S.C. 355(o)(1), 352(z), and 333(f)(4))).
Databases containing information on PMRs/PMCs are maintained at the Center for Drug Evaluation and Research (CDER) and the Center for Biologics Evaluation and Research (CBER). The information in these databases is periodically updated as new PMRs/PMCs are issued, upon FDA review of PMR/PMC ASRs or other PMR/PMC correspondence, upon receipt of final reports from completed studies and clinical trials, and after the final reports are reviewed and FDA determines that the PMR/PMC has been fulfilled, or when FDA determines that the PMR/PMC is either no longer feasible or would no longer provide useful information. Because applicants typically report on the status of their PMRs/PMCs annually, and because updating the status of PMRs/PMCs in FDA's databases involves FDA review of received information, there is an inherent lag in updating the data (that is, the data are not “real time”). FDA strives to maintain as accurate information as possible on the status of PMRs/PMCs.
Both CDER and CBER have established policies and procedures to help ensure that FDA's data on PMRs/PMCs are current and accurate. When identified, data discrepancies are addressed as expeditiously as possible and/or are corrected in later reports.
In 2013, CDER initiated an internal audit of a sample of PMRs and PMCs that had been established after March 25, 2008,
FDA also maintains an online searchable and downloadable database that contains information about PMRs/PMCs that is publicly reportable (
This report is published to fulfill the annual reporting requirement under section 506B(c) of the FD&C Act. Information in this report covers any PMR/PMC that was made, in writing, at the time of approval or after approval of an application or a supplement to an application (see section I.A) and summarizes the status of PMRs/PMCs in FY2013 (
This report reflects combined data from CDER and CBER. Information summarized in the report includes the following: (1) The number of applicants with open PMRs/PMCs
Numbers published in this report and in the accompanying supplemental report on FDA's Web site cannot be compared with the numbers resulting from searches of the publicly accessible and downloadable database. This is because this report incorporates data for all PMRs/PMCs in FDA databases as of the end of the fiscal year, including PMRs/PMCs undergoing review for accuracy. The publicly accessible and downloadable database includes a subset of PMRs/PMCs, specifically those that, at the time of data retrieval, either had an open status or were closed within the past 12 months. In addition, the status information in this report is updated annually while the downloadable database is updated quarterly (
This report provides information on PMRs/PMCs as of September 30, 2013 (
Although a comparison of the number of open and on-schedule or off-schedule PMRs/PMCs over time is not appropriate for the aforementioned reasons, a comparison of the data for FY2013 and FY2014 may be helpful in understanding the effect of CDER's 2013 audit. The observed differences are considered to reflect the results of CDER's efforts to update the information on the statuses of PMRs and PMCs following the internal audit of the data for a sample of PMRs/PMCs (see section II.A), as well as the natural progress of postmarketing studies and clinical trials over time. Finally, due to rounding, the percentages in the tables may not add up to 100 percent.
An applicant may have multiple approved drug products, and an approved drug product may have multiple PMRs and/or PMCs. Table 1 shows that as of September 30, 2013, there were 256 unique applicants with open PMRs/PMCs under 613 unique NDAs and BLAs. There were 184 unique NDA applicants (and 496 associated applications) and 72 unique BLA applicants (and 117 associated applications) with open PMRs/PMCs.
As of September 30, 2014, there were 257 unique applicants with open PMRs/PMCs under 639 unique NDAs and BLAs. There were 181 unique NDA applicants (and 510 associated applications) and 76 unique BLA applicants (and 129 associated applications) with open PMRs/PMCs.
As previously mentioned, applicants must submit an ASR on the progress of each open PMR/PMC within 60 days of the anniversary date of U.S. approval of the original application or an alternate reporting date that was granted by FDA (21 CFR 314.81 and 21 CFR 601.70).
There were 569 NDAs and BLAs with an ASR due in FY2014 (454 NDAs and 115 BLAs). Of the 454 NDA ASRs due in that fiscal year, 58 percent (265/454) were received on time, 19 percent (88/454) were not received on time, and 22 percent (101/454) were not received during FY2014. Of the 115 BLA ASRs due, 63 percent (73/115) were received on time, 17 percent (20/115) were not received on time, and 19 percent (22/115) were not received during FY2014.
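The percentages above can be recomputed directly from the stated counts. The short sketch below does so for the FY2014 ASRs; the counts are taken from this report, while rounding to the nearest whole percent is an assumption, which is also why the rounded shares need not sum to exactly 100 percent.

# Recomputing FY2014 ASR receipt percentages from the counts stated above (a sketch).
asr_counts = {
    "NDA": {"on time": 265, "late": 88, "not received": 101},  # 454 ASRs due
    "BLA": {"on time": 73, "late": 20, "not received": 22},    # 115 ASRs due
}
for app_type, counts in asr_counts.items():
    total = sum(counts.values())
    shares = {status: round(100 * n / total) for status, n in counts.items()}
    print(app_type, total, shares)
# NDA shares: 58 / 19 / 22 (sum 99, because each share is rounded independently)
# BLA shares: 63 / 17 / 19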
Table 3 shows that as of September 30, 2013, most open PMRs (84 percent for NDAs and 89 percent for BLAs) and most open PMCs (77 percent for NDAs and 74 percent for BLAs) were progressing on schedule (
Table 4 shows that as of September 30, 2013, the majority of open NDA PMRs (60 percent; 534/887) and open BLA PMRs (45 percent; 80/179) were pending.
Table 4 also shows that the proportion of open BLA PMRs that were pending decreased between FY2013 (45 percent; 80/179) and FY2014 (38 percent; 74/194). The proportion of open BLA PMRs that were ongoing did not change substantially between FY2013 (32 percent; 57/179) and FY2014 (35 percent; 68/194).
In addition, table 4 provides detail on the status of open PMRs and PMCs for each category of PMR. The table shows that as of September 30, 2013, 50 percent (305/614) of pending PMRs for drug and biological products were in response to the requirements under PREA. The next largest category of pending PMRs for drug and biological products (47 percent; 286/614) comprises those studies/clinical trials required by FDA under FDAAA. As of September 30, 2014, PREA PMRs and FDAAA PMRs comprised 55 percent (292/530) and 42 percent (222/530) of pending PMRs, respectively.
Table 5 provides additional information on the status of open and off-schedule (
As of September 30, 2014, 13 percent (126/943) of the open NDA PMRs were off-schedule. Of the off-schedule NDA PMRs, 94 percent (118/126) were off-schedule because they were delayed and the remaining 6 percent (8/126) were terminated. At the end of FY2014, 12 percent (24/194) of the open BLA PMRs were off-schedule. The majority of the off-schedule BLA PMRs (88 percent; 21/24) were off-schedule because they were delayed; the remaining three (3/194, or 2 percent of all open BLA PMRs) were terminated.
In certain situations, the original PMR schedules were adjusted for unanticipated delays in the progress of the study or clinical trial (
Table 6 provides the status of open on-schedule and off-schedule PMCs. As shown in the table, pending NDA PMCs comprised the largest category of all open NDA PMCs as of September 30, 2013 (37 percent; 97/264), and September 30, 2014 (29 percent; 61/207). Among all open BLA PMCs, 35 percent (88/251) and 30 percent (69/228) were pending at the end of FY2013 and FY2014, respectively.
As of September 30, 2013, the largest category of off-schedule PMCs consisted of those that were delayed according to the original schedule milestones.
Table 7 provides details about PMRs and PMCs that were closed (released or fulfilled) within FY2013 and FY2014. The majority of closed PMRs were fulfilled (53 percent of NDA PMRs and 88 percent of BLA PMRs at the end of FY2013; 72 percent of NDA PMRs and 77 percent of BLA PMRs at the end of FY2014). Similarly, the majority of PMCs closed within FY2013 and FY2014 were fulfilled.
Tables 8 and 9 show the distribution, as of September 30, 2014, of the statuses of all PMRs and PMCs, presented by the year in which the PMR/PMC was established (FY2008 to FY2014).
Food and Drug Administration, HHS.
Notice of availability.
The Food and Drug Administration (FDA or Agency) is announcing the availability of the guidance entitled “Labeling for Permanent Hysteroscopically-Placed Tubal Implants Intended for Sterilization.” This guidance addresses the inclusion of a boxed warning and patient decision checklist in the product labeling for permanent hysteroscopically placed tubal implants intended for female sterilization, and the content and format of those materials. FDA believes that the labeling described in this guidance will help to ensure that a woman receives and understands information regarding the benefits and risks of this type of device prior to undergoing implantation. FDA considered comments received on the draft guidance and revised the guidance as appropriate.
The guidance identifies the content and format of certain labeling components for permanent, hysteroscopically placed tubal implants that are intended for sterilization. The guidance applies to all devices of this type, regardless of the insert material composition, location of intended implantation, or exact method of delivery.
Submit either electronic or written comments on this guidance at any time. General comments on Agency guidance documents are welcome at any time.
You may submit comments as follows:
Submit electronic comments in the following way:
•
• If you want to submit a comment with confidential information that you do not wish to be made available to the public, submit the comment as a written/paper submission and in the manner detailed (see “Written/Paper Submissions” and “Instructions”).
Submit written/paper submissions as follows:
•
• For written/paper comments submitted to the Division of Dockets Management, FDA will post your comment, as well as any attachments, except for information submitted, marked and identified, as confidential, if submitted as detailed in “Instructions.”
• Confidential Submissions—To submit a comment with confidential information that you do not wish to be made publicly available, submit your comments only as a written/paper submission. You should submit two copies total. One copy will include the information you claim to be confidential with a heading or cover note that states “THIS DOCUMENT CONTAINS
An electronic copy of the guidance document is available for download from the Internet. See the
Jason Roberts, Division of Reproductive, Gastro-Renal and Urological Devices, Center for Devices and Radiological Health, Food and Drug Administration, 10903 New Hampshire Ave., Bldg. 66, Rm. G218, Silver Spring, MD 20993-0002, 240-402-6400.
Female sterilization is a commonly performed surgical procedure that permanently prevents a woman from becoming pregnant by occluding her fallopian tubes. Traditionally, such surgery has been performed by surgical bilateral tubal ligation (BTL) through a laparotomy, a mini-laparotomy, a transvaginal approach or at the time of cesarean delivery, and, more recently, laparoscopy. During surgical BTL, the fallopian tubes are cut or physically occluded by using various procedures or medical instruments, such as electrosurgical coagulation or implantable clips or rings. On November 4, 2002, FDA approved the Essure System for Permanent Birth Control, the first permanent hysteroscopically placed tubal implant, as an alternative, non-incisional method of providing female sterilization. As the number of hysteroscopic sterilizations with such devices has increased, additional information, including reports of adverse events, has accumulated. Some of these events have resulted in surgery and/or removal of the implants.
In the
A draft guidance regarding the labeling for permanent hysteroscopically placed tubal implants intended for sterilization was announced in the
This guidance is being issued consistent with FDA's good guidance practices regulation (21 CFR 10.115). The guidance represents the current thinking of FDA on “Labeling for Permanent Hysteroscopically-Placed Tubal Implants Intended for Sterilization.” It does not establish any rights for any person and is not binding on FDA or the public. You can use an alternative approach if it satisfies the requirements of the applicable statutes and regulations.
Persons interested in obtaining a copy of the guidance may do so by downloading an electronic copy from the Internet. A search capability for all Center for Devices and Radiological Health guidance documents is available at
This guidance refers to previously approved collections of information found in FDA regulations. These collections of information are subject to review by the Office of Management and Budget (OMB) under the Paperwork Reduction Act of 1995 (44 U.S.C. 3501-3520). The collections of information in 21 CFR part 801, regarding labeling, have been approved under OMB control number 0910-0485.
The following reference is on display in the Division of Dockets Management (see
Pursuant to section 10(d) of the Federal Advisory Committee Act, as amended (5 U.S.C. App.), notice is hereby given of the following meetings.
The meetings will be closed to the public in accordance with the provisions set forth in sections 552b(c)(4) and 552b(c)(6), Title 5 U.S.C., as amended. The grant applications and the discussions could disclose confidential trade secrets or commercial property such as patentable material, and personal information concerning individuals associated with the grant applications, the disclosure of which would constitute a clearly unwarranted invasion of personal privacy.
National Institutes of Health, HHS.
Notice.
The invention listed below is owned by an agency of the U.S. Government and is available for licensing and/or co-development in the U.S. in accordance with 35 U.S.C. 209 and 37 CFR part 404 to achieve expeditious commercialization of results of federally-funded research and development. Foreign patent applications are filed on selected inventions to extend market coverage for companies and may also be available for licensing and/or co-development.
Invention Development and Marketing Unit, Technology Transfer Center, National Cancer Institute, 9609 Medical Center Drive, Mail Stop 9702, Rockville, MD 20850-9702.
Information on licensing and co-development research collaborations, and copies of the U.S. patent applications listed below may be obtained by contacting: Attn. Invention Development and Marketing Unit, Technology Transfer Center, National Cancer Institute, 9609 Medical Center Drive, Mail Stop 9702, Rockville, MD 20850-9702, Tel. 240-276-5515 or email
Technology description follows.
Small Molecule Inhibitors of Drug Resistant Forms of HIV-1 Integrase
Integrase strand transfer inhibitors (“INSTIs”) are currently in use as a component of prophylactic antiretroviral therapy for preventing HIV-1 infection from progressing to AIDS. Three INSTIs are approved by the FDA for inclusion in antiretroviral regimens: Raltegravir (RAL), elvitegravir (EVG), and dolutegravir (DTG). Clinicians have already identified several HIV-1 integrase mutations that confer resistance to RAL and EVG, and additional mutations that confer resistance to all three INSTIs have been identified in the laboratory.
Researchers at the National Cancer Institute discovered small-molecule compounds containing 1-hydroxy-2-oxo-1,8-naphthyridine moieties that are active against HIV-1 integrase mutants resistant to currently approved INSTIs. These new compounds exhibit potent and selective activity against comprehensive and varied panels of INSTI-resistant mutants of HIV-1 integrase. Preliminary rodent efficacy, metabolic, and pharmacokinetic studies have been completed by the NCI researchers.
The National Cancer Institute (NCI) seeks partners to in-license or co-develop this class of compounds for therapeutic use. Parties interested in licensing the technology should submit an Application for Licensing, and seek detailed information from the Licensing and Patenting Manager indicated below.
Co-development partners would apply under a Cooperative Research and Development Agreement (CRADA) to conduct pre-clinical studies that include lead optimization,
Interested potential CRADA collaborators can receive detailed information by contacting the Licensing and Patenting Manager (see below). Interested parties will receive detailed information on the current status of the project after signing a confidentiality disclosure agreement (CDA) with NCI. Interested candidate partners must submit a statement of interest and capability to the NCI point of contact for consideration by 5:00 p.m. Eastern Standard Time, December 30, 2016.
Guidelines for the preparation of a full CRADA proposal will be communicated to all respondents with whom initial confidential discussions have been established. Licensing of background technology related to this CRADA opportunity, specifically HHS Reference No.: E-093-2013/0,1,2, entitled “Compounds for Inhibiting Drug-Resistant Strains of HIV-1 Integrase”, is also available to potential collaborators. All proposals received by the above date will be considered. NCI reserves the right to consider additional proposals or none at all if no partner is selected from the initial response.
Further information about the NCI Technology Transfer Center can be found on its Web site
• HIV therapeutic for drug-resistant forms of HIV-1 integrase
• Currently, the only INSTI effective against drug resistant mutants of HIV-1 integrase
Pre-clinical (in vivo validation)
Terrence Burke, Stephen Hughes, Yves Pommier, Xue Zhao, Mathieu Metifiot, Stephen Smith, Barry Johnson, Christophe Marchand (all from NCI)
Requests for copies of the patent application and inquiries about licensing, research collaborations, and co-development opportunities for this invention should be sent to Lauren Nguyen-Antczak, Ph.D., J.D., Senior Licensing & Patenting Manager, NCI Technology Transfer Center, 8490 Progress Drive, Suite 400, Frederick, MD 21701, Tel: (301) 624-8752, email:
Pursuant to section 10(d) of the Federal Advisory Committee Act, as amended (5 U.S.C. App. 2), notice is hereby given of the joint meeting of the National Cancer Advisory Board (NCAB) and NCI Board of Scientific Advisors (BSA).
The meeting will be open to the public as indicated below, with attendance limited to space available. Individuals who plan to attend and need special assistance, such as sign language interpretation or other reasonable accommodations, should notify the Contact Person listed below in advance of the meeting. The open session will be videocast and can be accessed from the NIH Videocasting and Podcasting Web site (
A portion of the National Cancer Advisory Board meeting will be closed to the public in accordance with the provisions set forth in section 552b(c)(6), Title 5 U.S.C., as amended, for the review, discussion, and evaluation of individual intramural programs and projects conducted by the National Cancer Institute, including consideration of personnel qualifications and performance, and the competence of individual investigators, the disclosure of which would constitute a clearly unwarranted invasion of personal privacy.
Any interested person may file written comments with the committee by forwarding the statement to the Contact Person listed on this notice. The statement should include the name, address, telephone number and when applicable, the business or professional affiliation of the interested person.
In the interest of security, NIH has instituted stringent procedures for entrance onto the NCI—Shady Grove campus. All visitors will be asked to show one form of identification (for example, a government-issued photo ID, driver's license, or passport) and to state the purpose of their visit. Information is also available on the Institute's/Center's home page: NCAB:
U.S. Customs and Border Protection (CBP), Department of Homeland Security (DHS).
Committee Management; Notice of Federal Advisory Committee Meeting.
The Commercial Customs Operations Advisory Committee (COAC) will meet in Washington, DC. The meeting will be open to the public.
The Commercial Customs Operations Advisory Committee (COAC) will meet on Thursday, November 17, 2016, from 12:30 p.m. to 4:30 p.m. EST. Please note that the meeting may close early if the committee has completed its business.
Members of the public who are pre-registered and later need to cancel should do so in advance of the meeting by accessing one (1) of the following links:
The meeting will be held at the Washington Marriott Wardman Park Hotel, 2660 Woodley Road NW., Washington, DC 20008. There will be signage posted directing visitors to the location of the meeting room.
For information on facilities or services for individuals with disabilities or to request special assistance at the meeting, contact Ms. Karmeshia Tuck,
To facilitate public participation, we are inviting public comment on the issues the committee will consider prior to the formulation of recommendations as listed in the “Agenda” section below.
Comments must be submitted in writing no later than November 7, 2016, must be identified by Docket No. USCBP-2016-0066, and may be submitted by
•
•
•
•
There will be multiple public comment periods held during the meeting on November 17, 2016. Speakers are requested to limit their comments to two (2) minutes or less to facilitate greater participation. Contact the individual listed below to register as a speaker. Please note that the public comment period for speakers may end before the time indicated on the schedule that is posted on the CBP Web page,
Ms. Karmeshia Tuck, Office of Trade Relations, U.S. Customs and Border Protection, 1300 Pennsylvania Avenue NW., Room 3.5A, Washington, DC 20229; telephone (202) 344-1661; facsimile (202) 325-4290.
Notice of this meeting is given under the
The COAC will hear from the following subcommittees on the topics listed below and then will review, deliberate, provide observations, and formulate recommendations on how to proceed:
1. The Trade Enforcement and Revenue Collection (TERC) Subcommittee will discuss the progress made on prior TERC, Bond Working Group, and Intellectual Property Rights Working Group recommendations, as well as the recommendations from the Forced Labor Working Group.
2. The Global Supply Chain Subcommittee will provide an update report on the progress of the Customs-Trade Partnership Against Terrorism (C-TPAT) Working Group that is reviewing and developing recommendations to update the C-TPAT minimum security criteria.
3. The One U.S. Government Subcommittee (1 USG) will discuss the progress of the North American Single Window (NASW) Working Group's NASW approach. The subcommittee will also discuss the progress of the Automated Commercial Environment (ACE) Single Window effort.
4. The Exports Subcommittee will give an update on the Air, Ocean, and Rail Manifest Pilots and discuss the progress of the Truck Manifest Sub-Working Group, which is coordinating with the 1 USG NASW Working Group.
5. The Trade Modernization Subcommittee will discuss the progress of the International Engagement and Trade Facilitation Working Group, which will be identifying examples of best practices in the U.S. and abroad that facilitate trade. The subcommittee will also discuss the startup of the Revenue Modernization Working Group, which will be generating advice pertaining to the strategic modernization of Customs and Border Protection's revenue collections process and systems. Finally, the subcommittee will discuss the startup of the Rulings and Decisions Working Group, which will be identifying process improvements in the receipt and issuance of Customs and Border Protection Headquarters' rulings and decisions.
6. The Trusted Trader Subcommittee will continue its discussion of its vision for an enhanced Trusted Trader concept that includes engagement with CBP and relevant partner government agencies, with the potential for international interoperability.
Meeting materials will be available by November 14, 2016, at:
Fish and Wildlife Service, Interior.
Notice of meetings.
The North American Wetlands Conservation Council (Council) will meet to select North American Wetlands Conservation Act (NAWCA) grant proposals for recommendation to the Migratory Bird Conservation Commission (Commission). The Council will consider Canadian, Mexican, and U.S. Standard grant proposals. The Advisory Group for the Neotropical Migratory Bird Conservation Act (NMBCA) grants program (Advisory Group) also will meet. The Advisory Group will discuss the strategic direction and management of the NMBCA program. Both meetings are open to the public, and interested persons may present oral or written statements.
The Council and Advisory Group meetings will take place at the U.S. Fish and Wildlife Service Headquarters, 5275 Leesburg Pike, Falls Church, Virginia 22041.
Sarah Mott, Council/Advisory Group Coordinator, by phone at 703-358-1784; by email at
The Council meets two to three times per year to select grant proposals. The Council will consider Canadian, Mexican, and U.S. Standard NAWCA grant proposals for recommendation to the Commission. Council meetings are open to the public, and interested persons may present oral or written statements. The Advisory Group for the Neotropical Migratory Bird Conservation Act (NMBCA) grants program meets once a year. The Advisory Group will discuss the strategic direction and management of the NMBCA program. This meeting is also open to the public, and interested persons may present oral or written statements.
In accordance with NAWCA (Pub. L. 101-233, 103 Stat. 1968, December 13, 1989, as amended), the State-private-Federal Council meets to consider wetland acquisition, restoration, enhancement, and management projects for recommendation to, and final funding approval by, the Commission. NAWCA provides matching grants to organizations and individuals who have developed partnerships to carry out wetlands conservation projects in the United States, Canada, and Mexico. These projects must involve long-term protection, restoration, and/or enhancement of wetlands and associated uplands habitats for the benefit of all wetlands-associated migratory birds. Project proposal due dates, application instructions, and eligibility requirements are available on the NAWCA Web site at
In accordance with NMBCA (Pub. L. 106-247, 114 Stat. 593, July 20, 2000), the Advisory Group will hold its meeting to discuss the strategic direction and management of the NMBCA program and provide advice to the Director of the Fish and Wildlife Service. NMBCA promotes long-term conservation of neotropical migratory birds and their habitats through a competitive grants program by promoting partnerships, encouraging local conservation efforts, and achieving habitat protection in 36 countries. The goals of NMBCA include perpetuating healthy bird populations, providing financial resources for bird conservation, and fostering international cooperation. Because the greatest need is south of the U.S. border, at least 75 percent of NMBCA funding supports projects outside the United States. Project proposal due dates, application instructions, and eligibility requirements are available on the NMBCA Web site at
Interested members of the public may submit relevant information or questions to be considered during the public meetings. If you wish to submit a written statement so information may be made available to the Council or Advisory Group for their consideration prior to the meetings, you must contact the Council/Advisory Group Coordinator by the date in
Individuals or groups requesting to make an oral presentation at the meetings will be limited to 2 minutes per speaker, with no more than a total of 30 minutes for all speakers. Interested parties should contact the Council/Advisory Group Coordinator by the date in
Summary minutes of the Council and Advisory Group meetings will be maintained by the Council/Advisory Group Coordinator at the address under
Fish and Wildlife Service, Interior.
Notice of initiation of review; request for information.
We, the U.S. Fish and Wildlife Service (Service), are initiating a 5-year status review for the red wolf (
To allow us adequate time to conduct this review, we must receive your comments or information on or before December 30, 2016. However, we will continue to accept new information about any listed species at any time.
For instructions on how to submit information and review information we receive on the red wolf, see “Request for New Information.”
Aaron Valenta, Chief, Division of Restoration and Recovery, 404-679-4144.
Under the Act (16 U.S.C. 1531
This notice announces our active review of the red wolf (
In conducting a 5-year review, the Service considers the best scientific and commercial data that have become available since the current listing determination or most recent status review of each species, such as:
A. Species biology, including but not limited to population trends, distribution, abundance, demographics, and genetics;
B. Habitat conditions, including but not limited to amount, distribution, and suitability;
C. Conservation measures that have been implemented to benefit the species;
D. Threat status and trends (see five factors under heading “How Do We Determine Whether a Species Is Endangered or Threatened?”); and
E. Other new information, data, or corrections, including but not limited to taxonomic or nomenclatural changes, identification of erroneous information contained in the Lists of Endangered and Threatened Wildlife and Plants, and improved analytical methods.
New information will be considered in the 5-year review and ongoing recovery programs for the species.
A.
B.
C.
Section 4(a)(1) of the Act establishes that we determine whether a species is endangered or threatened based on one or more of the following five factors:
A. The present or threatened destruction, modification, or curtailment of its habitat or range;
B. Overutilization for commercial, recreational, scientific, or educational purposes;
C. Disease or predation;
D. The inadequacy of existing regulatory mechanisms; or
E. Other natural or manmade factors affecting its continued existence.
To do any of the following, contact Aaron Valenta at the Service's Southeast Regional Office, 1875 Century Boulevard, Atlanta, GA 30345; fax 404-679-7081; email at
A. To get more information on the red wolf;
B. To submit information on the red wolf; or
C. To review information we receive, which will be available for public inspection by appointment, during normal business hours at the Southeast Regional Office, Ecological Services Division, at the address above.
We request any new information concerning the status of the red wolf. See “What information do we consider in our review?” above for specific criteria. Information submitted should be supported by documentation such as maps, bibliographic references, methods used to gather and analyze the data, and/or copies of any pertinent publications, reports, or letters by knowledgeable sources.
Before including your address, phone number, email address, or other personal identifying information in your comment, you should be aware that the entire comment—including your personal identifying information—may be made publicly available at any time. While you can ask us in your comment to withhold your personal identifying information from public review, we cannot guarantee that we will be able to do so.
We publish this document under the authority of the Endangered Species Act (16 U.S.C. 1531
Bureau of Indian Affairs, Interior.
Notice.
The State of California and the Viejas (Baron Long) Group of Capitan Grande Band of Mission Indians of the Viejas Reservation entered into a Tribal-State compact governing Class III gaming. This notice announces that the compact is taking effect.
The effective date of the compact is October 31, 2016.
Ms. Paula L. Hart, Director, Office of Indian Gaming, Office of the Assistant Secretary—Indian Affairs, Washington, DC 20240, (202) 219-4066.
Section 11 of the Indian Gaming Regulatory Act (IGRA) requires the Secretary of the Interior to publish in the
Bureau of Indian Affairs, Interior.
Notice.
The State of California and the Agua Caliente Band of Cahuilla Indians of the Agua Caliente Indian Reservation entered into a Tribal-State compact governing Class III gaming. This notice announces that the compact is taking effect.
The effective date of the compact is October 31, 2016.
Ms. Paula L. Hart, Director, Office of Indian Gaming, Office of the Assistant Secretary—Indian Affairs, Washington, DC 20240, (202) 219-4066.
Section 11 of the Indian Gaming Regulatory Act (IGRA) requires the Secretary of the Interior to publish in the
Bureau of Indian Affairs, Interior.
Notice.
The Yankton Sioux Tribe of South Dakota and State of South Dakota negotiated an Amended Gaming Compact governing Class III gaming; this notice announces approval of the amended compact.
Effective October 31, 2016.
Ms. Paula L. Hart, Director, Office of Indian Gaming, Office of the Assistant Secretary—Indian Affairs, Washington, DC 20240, (202) 219-4066.
Section 11 of the Indian Gaming Regulatory Act (IGRA) requires the Secretary of the Interior to publish in the
Bureau of Indian Affairs, Interior.
Notice.
The Yurok Tribe (Tribe) of the Yurok Reservation and State of California (State) entered into an amendment to an existing Tribal-State compact governing Class III gaming. This notice announces approval of the amendment.
Effective October 31, 2016.
Ms. Paula L. Hart, Director, Office of Indian Gaming, Office of the Assistant Secretary—Indian Affairs, Washington, DC 20240, (202) 219-4066.
Section 11 of the Indian Gaming Regulatory Act (IGRA) requires the Secretary of the Interior to publish in the
Bureau of Indian Affairs, Interior.
Notice.
The Coquille Indian Tribe and State of Oregon entered into an amendment to an existing Tribal-State compact governing Class III gaming. This notice announces approval of the amendment.
Effective October 31, 2016.
Ms. Paula L. Hart, Director, Office of Indian Gaming, Office of the Assistant Secretary—Indian Affairs, Washington, DC 20240, (202) 219-4066.
Section 11 of the Indian Gaming Regulatory Act (IGRA) requires the Secretary of the Interior to publish in the
Bureau of Indian Affairs, Interior.
Notice.
The Jackson Band of Miwuk Indians (Tribe) and State of California entered into an amendment to the existing Tribal-State Compact governing Class III gaming. This notice announces approval of the amendment.
Effective October 31, 2016.
Ms. Paula L. Hart, Director, Office of Indian Gaming, Office of the Assistant Secretary—Indian Affairs, Washington, DC 20240, (202) 219-4066.
Section 11 of the Indian Gaming Regulatory Act (IGRA) requires the Secretary of the Interior to publish in the
National Park Service, Interior.
Notice; request for comments.
We (National Park Service, NPS) have sent an Information Collection Request (ICR) to OMB for review and approval. We summarize the ICR below and describe the nature of the collection and the estimated annual burden. We may not conduct or sponsor, and a person is not required to respond to, a collection of information unless it displays a currently valid OMB Control Number.
To ensure that we are able to consider your comments on this ICR, we must receive them by November 30, 2016.
Please direct all written comments on this ICR directly to the Office of Management and Budget (OMB) Office of Information and Regulatory Affairs, Attention: Desk Officer for the Department of the Interior, to
Bret Meldrum, Chief, Social Science Program, National Park Service, 1201 Oakridge Drive, Fort Collins, CO 80525 (mail); or
2016 marks the 100th anniversary of the National Park Service (NPS)—a defining moment that offers an opportunity to reflect on and celebrate our accomplishments as we move forward into a new century of stewardship and engagement. As we prepare for our centennial anniversary, discussions concerning the relevancy of the National Parks have ignited the need for a third iteration of the NPS Comprehensive Survey of the American
This request is to reinstate OMB Control Number 1024-0254 in order to pretest the survey and collection methods before we ask OMB to review the final version of the survey instrument for approval. The new content is sufficiently different to necessitate this request to pretest question and response-choice wording and survey length before requesting approval of the final survey. The purpose and intent of the final survey will be to measure the awareness, engagement, values, and preferences of both visitors and non-visitors. This information will be used to assess the relevancy of NPS as well as to assess change over time, which in turn will be used to evaluate the effectiveness of NPS efforts to increase its relevancy.
A notice was published in the
We again invite comments concerning this information collection on:
• Whether or not the collection of information is necessary, including whether or not the information will have practical utility;
• The accuracy of our estimate of the burden for this collection of information;
• Ways to enhance the quality, utility, and clarity of the information to be collected; and
• Ways to minimize the burden of the collection of information on respondents.
Comments that you submit in response to this notice are a matter of public record. We will include or summarize each comment in our request to OMB to approve this IC. Before including your address, phone number, email address, or other personal identifying information in your comment, you should be aware that your entire comment, including your personal identifying information, may be made publicly available at any time. While you can ask us in your comment to withhold your personal identifying information from public review, we cannot guarantee that we will be able to do so.
Bureau of Ocean Energy Management, Interior.
Final sale notice for commercial leasing for Wind Power on the Outer Continental Shelf Offshore New York.
This document is the Final Sale Notice (FSN) for the sale of one commercial wind energy lease on the Outer Continental Shelf (OCS) offshore New York, pursuant to 30 CFR 585.216. The Bureau of Ocean Energy Management (BOEM or “the Bureau”) will offer Lease OCS-A 0512 for sale using a multiple-factor auction format. This FSN contains information pertaining to the area available for leasing, lease provisions and conditions, auction details, the lease form, criteria for evaluating competing bids, award procedures, appeal procedures, and lease execution. The issuance of the lease resulting from this sale would not constitute an approval of project-specific plans to develop offshore wind energy. Such plans, if submitted by the lease sale winner, would be subject to subsequent environmental and public review prior to a decision to proceed with development.
BOEM will hold a mock auction for the bidders starting at 8:30 a.m. Eastern Standard Time (EST) on December 13, 2016. The monetary auction will be held online and will begin at 8:30 a.m. EST on December 15, 2016. Additional details are provided in the section entitled, “Deadlines and Milestones for Bidders.”
Wright Frank, New York Project Coordinator and Auction Manager, BOEM Office of Renewable Energy Programs, 45600 Woodland Road, VAM-OREP, Sterling, Virginia, 20166, (703) 787-1325 or
BOEM made several changes from the description of the New York lease sale that was published in the PSN. Three changes worth highlighting are: A 10% bidding credit for entities that establish that they are a “government authority” meeting the definition included in this notice, an adaptation to the auction format, and the removal of a small portion of the lease area. The auction format described here differs slightly from that used in past lease sales in that bidders may have a “limited opportunity to revoke” a provisionally winning bid without penalty if the next-highest bid was submitted by a governmental entity. An explanation regarding the reduction in the area of the LA relative to the area described in the PSN is provided in the section entitled “Area Offered for Leasing.”
On May 28, 2014, BOEM published a Notice of Intent (NOI) to Prepare an Environmental Assessment (EA) for commercial wind lease issuance and approval of site assessment activities on the Atlantic OCS offshore New York with a 45-day public comment period (79 FR 30643). In response to the NOI, BOEM received 32 comment
On June 6, 2016, in conjunction with the PSN, BOEM published an EA for public comment (81 FR 36344). BOEM received approximately 60 submittals. Submittals included letters, emails, comment cards, and comments made to a court reporter at public meetings. BOEM identified 300 discrete comments within the submittals received. Comments were received from various stakeholders, including private citizens, environmental groups, Federal agencies, trade associations, businesses, state agencies, universities, and Federal organizations.
Concurrent with publication of this FSN, BOEM has published a Notice of Availability (NOA) for the revised EA and Finding of No Significant Impact (FONSI) for commercial wind lease issuance and site assessment activities on the Atlantic OCS offshore New York. The EA and FONSI are available at:
All consultations necessary to inform BOEM's lease issuance decision have been completed. BOEM completed consultations with the National Oceanic and Atmospheric Administration's National Marine Fisheries Service (NMFS) and the U.S. Fish and Wildlife Service (USFWS) under the Endangered Species Act (ESA). BOEM completed formal consultation with NMFS upon receipt of a Biological Opinion on March 10, 2013, (revised on April 10, 2013). That consultation covered lease issuance and site characterization activities (
BOEM also consulted with the State Historic Preservation Offices of New York and New Jersey, the National Park Service, and Monmouth County, New Jersey, under the National Historic Preservation Act. The
On July 11, 2016, NMFS provided comments on the EA pursuant to the Magnuson-Stevens Fishery Conservation and Management Act (MSFCMA) and recommended that BOEM coordinate with NMFS in the review of site-specific survey plans and Site Assessment Plans (SAPs). Because of the programmatic nature of the essential fish habitat (EFH) assessment, NMFS elected not to provide any specific EFH conservation measures until such time as site-specific plans are received.
For the issuance of a commercial lease, BOEM considers the environmental consequences of associated site characterization activities (
• FSN Waiting Period
•
•
•
Further information on this subject can be found in the section of this notice entitled, “Auction Procedures.”
•
•
•
•
○
•
•
• From the Auction to Lease Execution
•
•
•
•
•
A map of the New York LA and GIS spatial files can be found on BOEM's Web site at:
A large scale map of the area, showing boundaries of the area with numbered blocks, is available from BOEM upon request at the following address: Bureau of Ocean Energy Management, Office of Renewable Energy Programs, 45600 Woodland Road, VAM-OREP, Sterling, Virginia, 20166, Phone: (703) 787-1300, Fax: (703) 787-1708.
During the Area Identification (Area ID) process, BOEM identified three issues of concern associated with potential development of the New York Wind Energy Area (WEA): (1) Navigational safety; (2) commercial fishing; and (3) visual impacts to National Park Service lands and historic properties. Although BOEM did not remove any areas from leasing consideration during Area ID, potential bidders should be aware that future analysis of these or other issues could result in BOEM's requiring mitigation measures and/or development restrictions in all or part of the New York LA. In addition, mitigation measures and/or development restrictions could result from future BOEM environmental reviews and consultations.
Potential bidders should note that future mitigation measures, including potential restrictions on the placement of structures, may be applied to development within all or portions of the New York LA to ensure navigation safety and the U.S. Coast Guard's (USCG's) ability to maintain mission readiness.
The New York LA has been delineated to accommodate a setback of 1 nautical mile (nmi) from the adjacent Traffic Separation Schemes (TSSs) for the Port of New York and New Jersey. This setback is consistent with BOEM's delineation of other lease and wind energy areas that are in close proximity to TSSs.
In September 2015, BOEM received additional input from the USCG recommending a larger setback of 2 nmi from the TSSs and 5 nmi from the entry/exit points of the TSSs. USCG's correspondence to BOEM, which explains the recommendation, is available on BOEM's Web site at
Potential bidders should note that future mitigation measures may be applied to development within all or portions of the New York LA due to the use of the area as a fishery.
BOEM received fishery-related comments in response to the RFI, Call for Information and Nominations, NOI, NOA, and several public outreach meetings. Commenters included NMFS, the New England and Mid-Atlantic Fishery Management Councils, and several fishing industry groups, primarily representing members of the sea scallop and squid fisheries. BOEM also received comments from commercial and recreational fishermen during BOEM's November 2015 fisheries workshops. A meeting summary of BOEM's November 2015 fisheries workshops and comments associated with these workshops are available on BOEM's Web site at
BOEM has also gathered information regarding the use of the LA as a fishery through a joint study with NMFS. This data, specific to the New York LA, is included in the revised EA and is available on BOEM's Web site at
Between 2012 and 2016, BOEM collaborated with numerous stakeholders in the fishing and offshore wind industries to develop best management practices (BMPs) in furtherance of its goal of minimizing potential multiple use conflicts between offshore renewable energy developers and the fishing industry. As a result of this effort, BOEM has concluded that there would be great merit in a lessee's utilizing a fisheries liaison and a fisheries representative during the lessee's plan development process. BOEM has also received comments from the public regarding the importance of ensuring effective communication between the lessee and the fishing community. As a result, BOEM has issued guidance to lessees for communicating with fisheries stakeholders regarding social and economic impacts of renewable energy development on the Atlantic Outer Continental Shelf:
Potential bidders should note that future mitigation measures may be applied to development within all or portions of the New York LA to avoid, minimize, or mitigate adverse effects to historic properties or National Park Service (NPS) lands. The NPS, New York State Historic Preservation Office (NY SHPO), and New Jersey State Historic Preservation Office (NJ SHPO) have expressed concern regarding the potential for wind energy development within the New York WEA to cause adverse effects to onshore historic properties. Correspondence outlining these concerns is available on BOEM's Web site at
During the summer and fall of 2015, BOEM conducted stakeholder outreach with the NPS, NY SHPO, and NJ SHPO. BOEM also completed a study entitled, “Renewable Energy Viewshed Analysis and Visualization Simulation for the New York Outer Continental Shelf Call Area” to assist in this outreach effort and to provide scientific and technical information about visual impacts to inform its Area ID decision. Results of this study are available under the header “Visual Simulations” at
The lease is available on BOEM's Web site at
• Addendum “A” (Description of Leased Area and Lease Activities);
• Addendum “B” (Lease Term and Financial Schedule);
• Addendum “C” (Lease Specific Terms, Conditions, and Stipulations);
• Addendum “D” (Project Easement);
• Addendum “E” (Rent Schedule post COP approval);
• Appendix A to Addendum “C”: (Incident Report: Protected Species Injury or Mortality); and
• Appendix B to Addendum “C”: (Required Data Elements for Protected Species Observer Reports).
Addenda “A,” “B,” and “C” provide detailed descriptions of lease terms and conditions. Addenda “D” and “E” will be completed at the time of COP approval or approval with modifications.
The most recent version of BOEM's renewable energy commercial lease form (BOEM-0008) is available on BOEM's Web site at:
Potential bidders should note that BOEM and the Bureau of Safety and Environmental Enforcement (BSEE) are in the process of reassigning regulations relating to safety and environmental oversight and enforcement responsibilities for offshore renewable energy projects from BOEM to BSEE. Once this administrative reassignment is finalized, BOEM may make ministerial and non-substantive amendments to the lease to conform it to regulatory revisions.
For a 79,350-acre lease (the size of the New York LA), the rent payment will be $238,050 per year ($3 per acre × 79,350 acres) if no portion of the leased area is authorized for commercial operations. If 300 megawatts (MW) of a project's nameplate capacity is operating (or authorized for operation), and the approved COP specifies a maximum project size of 500 MW, the rent payment will be $95,220, based on the 200 MW of nameplate capacity that BOEM has not yet authorized for commercial operations: 200 MW/500 MW × ($3/acre × 79,350 acres) = $95,220.
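A minimal sketch of the proration arithmetic in the example above, using only the figures given there (79,350 acres at $3 per acre, 300 MW authorized of a 500 MW maximum); the function and variable names are illustrative and are not part of BOEM's payment systems:

```python
def annual_rent(acres, rate_per_acre=3, authorized_mw=0, max_project_mw=None):
    """Annual lease rent: rent on the full acreage, reduced in proportion to the
    share of the approved project's nameplate capacity already authorized for
    commercial operations."""
    base = rate_per_acre * acres
    if not authorized_mw or not max_project_mw:
        return base  # no portion of the lease authorized for commercial operations
    unauthorized_share = (max_project_mw - authorized_mw) / max_project_mw
    return unauthorized_share * base

print(annual_rent(79_350))                                          # 238050
print(annual_rent(79_350, authorized_mw=300, max_project_mw=500))   # 95220.0
```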
If the lessee submits an application for relinquishment of a portion of its lease area within the first 45 calendar days following the date that the lease is received by the lessee for execution, and BOEM approves that application, no rent payment will be due on the relinquished portion of the LA. Later relinquishments of any portion of the LA will reduce the lessee's rent payments starting in the year following BOEM's approval of the relinquishment.
The lessee also must pay rent for any project easement associated with the lease, commencing on the date that BOEM approves the COP (or modification thereof) that describes the project easement. Annual rent for a project easement is the greater of $5 per acre per year or $450 per year.
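The project easement rent described above reduces to a simple maximum. A one-line sketch, with hypothetical acreages shown only for illustration:

```python
def project_easement_rent(easement_acres):
    """Annual project easement rent: the greater of $5 per acre or $450."""
    return max(5 * easement_acres, 450)

print(project_easement_rent(50))   # 450 (the $450 minimum applies below 90 acres)
print(project_easement_rent(200))  # 1000
```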
For purposes of calculating the initial annual operating fee payment, pursuant to 30 CFR 585.506, an operating fee rate is applied to a proxy for the wholesale market value of the electricity expected to be generated from the project during its first twelve months of operations. This initial payment will be prorated to reflect the period between the commencement of commercial operations and the Lease Anniversary. The initial annual operating fee payment is due within 45 days of the commencement of commercial operations. Thereafter, subsequent annual operating fee payments are due on or before each Lease Anniversary.
The subsequent annual operating fee payments are calculated by multiplying the operating fee rate by the imputed wholesale market value of the projected annual electric power production. For the purposes of this calculation, the imputed market value is the product of the project's annual nameplate capacity, the total number of hours in a year (8,760), the capacity factor, and the annual average price of electricity derived from a historical regional wholesale power price index. For example, the annual operating fee for a 100 MW wind facility operating at a 40% capacity factor is the operating fee rate multiplied by the product of 100 MW, 8,760 hours, 0.4, and the annual average wholesale power price.
The capacity factor is expressed as a decimal between zero and one, and represents the share of the wind facility's anticipated generation that is delivered to the interconnection grid.
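The operating fee calculation described above can be sketched as follows. The 2% fee rate and $40/MWh average price are placeholder assumptions for illustration only; the actual fee rate and the wholesale power price index are established in Addendum "B" of the lease. The months_covered argument reflects the prorated initial payment described above.

```python
def annual_operating_fee(fee_rate, nameplate_mw, capacity_factor,
                         avg_price_per_mwh, months_covered=12):
    """Operating fee = fee rate x imputed wholesale market value, where the
    imputed value = nameplate capacity x 8,760 hours x capacity factor x
    average regional wholesale power price. The initial payment is prorated
    for the partial year preceding the first Lease Anniversary."""
    imputed_value = nameplate_mw * 8_760 * capacity_factor * avg_price_per_mwh
    return fee_rate * imputed_value * (months_covered / 12)

# Placeholder inputs: 100 MW nameplate, 40% capacity factor, $40/MWh, 2% fee rate.
print(annual_operating_fee(0.02, 100, 0.40, 40.0))       # 280320.0
print(annual_operating_fee(0.02, 100, 0.40, 40.0, 6))    # 140160.0 (half-year proration)
```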
Within 10 business days after receiving the lease copies and pursuant to 30 CFR 585.515-.516, the provisional winner of the New York LA must provide an initial lease-specific bond or other approved means of meeting BOEM's initial financial assurance requirements. The provisional winner may meet financial assurance requirements by posting a surety bond or by setting up an escrow account with a trust agreement giving BOEM the right to withdraw the money held in the account on demand. BOEM encourages the provisionally winning bidder to discuss the financial assurance requirement with BOEM as soon as possible after the auction has concluded.
BOEM will base the amount of all SAP, COP, and decommissioning financial assurance requirements on cost estimates for meeting all accrued lease obligations at the respective stages of development. The required amount of supplemental and decommissioning financial assurance will be determined on a case-by-case basis.
The financial terms can be found in Addendum “B” of the lease, which BOEM has made available with this notice on its Web site at:
Each bidder must fill out the BFF referenced in this FSN. BOEM has made a copy of the form available with this notice on its Web site at:
BOEM will not consider BFFs submitted by bidders for previous lease sales to satisfy the requirements of this auction. Further, BOEM will only consider BFFs submitted after the deadline if BOEM determines that the failure to timely submit the BFF was caused by events beyond the bidder's control. BOEM will only accept an original, executed paper copy of the BFF. The BFF must be executed by an authorized representative who has been identified in the qualifications package on file with BOEM as authorized to bind the company.
Following the auction, bid deposits will be applied against bids or other obligations owed to BOEM. If the bid deposit exceeds a bidder's total financial obligation, the balance of the bid deposit will be refunded to the bidder. BOEM will refund bid deposits to non-winners once BOEM has announced the provisional winner.
A provisionally winning bidder will forfeit its bid deposit if it fails to execute a lease pursuant to its provisionally winning bid. Exercising the LOR pursuant to the rules described in this notice is a limited exception to this rule: if BOEM notifies a bidder immediately following the lease sale that it may revoke its provisionally winning bid, and the bidder revokes that bid within the allotted time, the bidder will not forfeit its $450,000 bid deposit. If a bidder exercises its LOR in this manner, BOEM will reoffer the lease to the government authority that is the second-highest bidder. That government authority would then inherit the obligation to execute a lease pursuant to its now-provisionally winning bid, and would forfeit its bid deposit if it does not execute the lease within the required timeframe.
If BOEM offers a lease pursuant to a provisionally winning bid, and that bidder fails to timely return the signed lease form, establish financial assurance, or pay the balance of its bid, BOEM will retain that bidder's $450,000 bid deposit. BOEM reserves the right to reconvene the panel to determine which bidder would have won in the absence of the provisionally winning bid, and to offer a lease to that bidder.
As authorized under 30 CFR 585.220(a)(4) and 585.221(a)(6), BOEM will use a multiple-factor auction format, with a multiple-factor bidding system, for this lease sale. Under this system, BOEM may consider a combination of monetary and non-monetary factors, or “variables,” in determining the outcome of the auction. BOEM will appoint a panel of BOEM employees to review the non-monetary packages and verify the results of the lease sale. BOEM reserves the right to change the composition of this panel at any time.
In response to public comments on the PSN, BOEM is offering a 10% non-monetary bid credit in this lease sale for government authorities. In order to be considered for this non-monetary credit, a bidder must ensure that BOEM receives its non-monetary package by the deadline specified in this notice.
If a bidder wishes to establish itself as a government authority for the purposes of the auction, it must timely submit a non-monetary package for approval by BOEM. The non-monetary package may consist of new information to help a bidder demonstrate its status as a government authority, and/or may reference materials that the bidder has already submitted to BOEM to establish that the bidder is legally qualified to participate in the sale. If bidders wish to review what materials they have already submitted, they should contact Gina Best at 703-787-1341, as soon as practicable.
Prior to the date of the auction, the panel will determine which bidders, if any, have qualified for the non-monetary credit. Bidders will be notified by email prior to the date of the auction if they have been granted a non-monetary credit. If the panel determines that no bidder is eligible to bid as a government authority and receive a credit, the auction will proceed with each bidder registered with no imputed credit. Bidders will not be notified whether other bidders have qualified for a non-monetary credit until after the bidding has concluded.
Under the format for this sale, a bidder may submit a bid proposal in each round, as described in the paragraphs that follow.
In response to public comments on the PSN, BOEM is introducing the LOR as a feature of the New York lease sale. Each bidder may download, complete, sign and return the RLOR form from BOEM's Web site at
If a bidder opts into an LOR and then becomes the provisional winner of the auction, it will be given a short opportunity just after the auction to revoke its provisionally winning bid without forfeiting its bid deposit of $450,000, if the second-place bidder is a government authority. Alternatively, bidders may choose not to opt in. If a provisionally winning bidder does not reserve the LOR, that bidder will not be given an opportunity to revoke its provisionally winning bid following the sale without jeopardizing its bid deposit of $450,000. If a bidder fails to return the form in a timely manner, absent any extension granted by BOEM, it will be deemed to have opted out of its LOR. More information on the LOR can be found in the section of this notice entitled "Auction Procedures."
The auction will be conducted in a series of rounds. At the start of each round, BOEM will state an asking price for the LA. If a bidder is willing to meet that asking price for the LA, it will indicate this by submitting a bid equal to the asking price, referred to in this notice as a "live bid."
To participate in any round of the auction, a bidder must have submitted a live bid in the previous round. As long as there are two or more live bids for the LA, the auction proceeds to the next round. Between rounds, BOEM will raise the asking price for the LA by an increment that it determines appropriate. Asking price increments are within BOEM's sole discretion, but are based on a number of factors, including the number of bidders still active in the auction and BOEM's best estimate of how many rounds may remain before the auction is resolved. BOEM also reserves the right to increase or decrease bidding increments between rounds, if it determines that a different increment is warranted to enhance the efficiency of the auction process.
As the auction proceeds, a bidder retains its eligibility to continue bidding as long as that bidder submitted a live bid on the LA in the previous round. Between rounds, BOEM will release information indicating the number of live bids for the LA in the previous round of the auction.
In any round after the first round of the auction, a bidder may submit an exit bid that is higher than the previous round's asking price, but less than the current round's asking price. An exit bid must consist of a single offer price. If a bidder submits an exit bid, it is not eligible to participate in subsequent bidding rounds of the auction. During the auction, exit bids will be seen only by BOEM and not by other bidders.
If the LA receives only exit bids in a round, BOEM will not raise the price and start another round, because no bidders would be eligible to bid in the next round.
The auction will end in the first round in which one or zero live bids are received. If one live bid is received, that bid is the provisionally winning bid. If no live bids are received, then the highest exit bid received is the provisionally winning bid. If there is a tie for the highest exit bid, BOEM's tie-breaking procedures will resolve the tie. If no live or exit bids are received, then there is a tie among all bidders that submitted live bids at the most recent asking price, and BOEM's tie-breaking procedures will determine the provisionally winning bid.
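The round logic described above can be summarized in a simplified sketch. It assumes monetary bids only and omits the government-authority credit, BOEM's discretion over increments, and the fuller tie-breaking rules; the bidder names, budgets, and prices are illustrative assumptions, not figures from this notice.

```python
import random

def run_clock_auction(strategies, opening_price, increment):
    """Simplified ascending-clock auction per the procedure described above.
    `strategies` maps each bidder to a function that, given the asking price,
    returns True to submit a live bid or False to drop out of that round."""
    asking = opening_price
    eligible = set(strategies)     # bidders who bid live in the prior round
    exit_bids = {}                 # stand-in: exiting bidders recorded at the prior price
    while True:
        live = {b for b in eligible if strategies[b](asking)}
        for b in eligible - live:
            exit_bids[b] = asking - increment
        if len(live) == 1:
            return live.pop(), asking              # sole live bid wins provisionally
        if not live:
            top = max(exit_bids.values())
            tied = [b for b in exit_bids if exit_bids[b] == top]
            return random.choice(tied), top        # random tie-break among highest exits
        eligible = live
        asking += increment

# Illustrative strategies: each bidder stays in while the price is within its budget.
winner, price = run_clock_auction(
    {"Bidder A": lambda p: p <= 40_000_000, "Bidder B": lambda p: p <= 42_000_000},
    opening_price=1_000_000, increment=1_000_000)
print(winner, price)   # Bidder B 41000000
```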
As noted, in response to public comments on the PSN, this lease sale includes an LOR. Ordinarily, if a provisionally winning bidder does not execute a lease pursuant to that provisionally winning bid, that bidder will forfeit its bid deposit. In this lease sale, a provisionally winning bidder will have a chance to revoke its provisionally winning bid without forfeiting its bid deposit if both of the following conditions are met:
1. The provisionally winning bidder reserved the right to a LOR through a timely-submitted RLOR in advance of the auction; and
2. The second highest bid was submitted by a government authority.
If these two conditions are satisfied, then BOEM will offer the provisionally winning bidder one hour to revoke its provisionally winning bid. If there is a tie for the second-highest bid that includes a government authority, the tie will be resolved first, and an LOR will be offered only if the government authority holds the second-place bid after the tie is resolved.
The provisionally winning bidder will be given precisely one hour to revoke, using the messaging tool in the auction system. A bidder wishing to revoke must include in its message the revocation statement specified by BOEM; if that statement is not included, the bid will not be treated as revoked.
If the provisionally winning bidder does not revoke its bid within the designated hour, BOEM's requirements for the bidder will be the same as they would be for a sale without the LOR. Pursuant to 30 CFR 585.224, once BOEM sends the lease copies to the bidder, the bidder must timely pay the balance of its bid, establish financial assurance, and properly sign and return the lease copies. If the bidder fails to do so, then BOEM may not issue the lease to that bidder, in which case the bidder would forfeit its bid deposit. BOEM may consider a bidder's failure to timely pay the full amount due an indication that the bidder is no longer financially qualified to participate in other lease sales under BOEM's regulations at 30 CFR 585.106 and 585.107.
If the highest bidder revokes its provisionally winning bid pursuant to an LOR, the government authority with the second-highest bid in the auction becomes the provisionally winning bidder and must follow all of BOEM's requirements contained in 30 CFR 585.224. The government authority would then need to execute a lease pursuant to its provisionally winning bid, or risk forfeiture of its bid deposit.
BOEM will use its tie-breaking procedures to resolve any ties before determining whether the conditions have been met for offering a provisionally winning bidder a LOR. Ties are resolved by a random process. The auction system generates a random number for each bidder. In the event of a tie, these numbers are compared, and the bidder with the higher random number is deemed the provisional winner.
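A minimal sketch of the tie-break and LOR logic described above; all function and argument names are illustrative assumptions, not BOEM system terminology.

```python
import random

def resolve_tie(tied_bidders):
    """Ties are resolved by comparing system-generated random numbers;
    the bidder drawing the higher number prevails."""
    draws = {bidder: random.random() for bidder in tied_bidders}
    return max(draws, key=draws.get)

def post_auction_winner(provisional_winner, runner_up, reserved_lor,
                        runner_up_is_gov_authority, winner_revokes_in_time):
    """The provisional winner may revoke within one hour, keeping its $450,000
    deposit, only if it timely reserved the LOR and the runner-up (after any
    tie-break) is a government authority; the runner-up then becomes the
    provisional winner and inherits the obligation to execute the lease."""
    if reserved_lor and runner_up_is_gov_authority and winner_revokes_in_time:
        return runner_up
    return provisional_winner
```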
Following the lease sale, the non-monetary panel will convene, review the auction record, and certify the results of the sale. Shortly thereafter, BOEM will notify the DOJ that it may begin its antitrust review pursuant to 43 U.S.C. 1337(c).
If a bidder fails to execute a lease pursuant to a provisionally winning bid, BOEM may reoffer that lease to the next highest bidder. If the bidder that fails to execute is a government authority that had been declared the provisional winner after the exercise of a LOR, BOEM may first reoffer the lease to the bidder that had exercised the LOR. If BOEM reoffers the lease following a bidder's failure to execute a lease pursuant to a provisionally winning bid, the second bidder to which the lease is offered may decline the offer without forfeiting its bid deposit.
For the online auction, BOEM will require two-factor authentication. Prior to the auction, the Auction Manager will send several bidder authentication packages to the bidders shortly after BOEM has processed the BFFs. One package will contain digital authentication tokens allowing access to the auction Web site. The tokens will be mailed to the Primary Point of Contact indicated on the BFF. This individual is responsible for distributing the tokens to the individuals authorized to bid for that company.
The second package contains login credentials for authorized bidders. The login credentials will be mailed to the address provided in the BFF for each authorized individual. Bidders can confirm these addresses by calling 703-787-1320. This package will contain user login information and instructions for accessing the Auction System Technical Supplement and Alternative Bidding Form. The login information, along with the tokens, will be tested during the Mock Auction.
The auction will begin at 8:30 a.m. EST on December 15, 2016. Bidders may log in as early as 8:00 a.m. on that day. We recommend that bidders log in earlier than 8:30 a.m. on that day to ensure that any login issues are resolved prior to the start of the auction. Once bidders have logged in, they should review the auction schedule, which lists the start times, end times, and recess times of each round in the auction. Each round is structured as follows:
• Round bidding begins;
• Bidders enter their bids;
• Round bidding ends and the Recess begins;
• During the Recess, previous Round results are posted;
• Bidders review the previous Round results and prepare their next Round bids; and
• Next Round bidding begins.
The first round will last about 30 minutes, though subsequent rounds may be shorter. Recesses are anticipated to last approximately 10 minutes. The descriptions of the auction schedule and asking price increments included with this FSN are tentative. Bidders should consult the auction schedule on the bidding Web site during the auction for updated times. Bidding will continue until about 6:00 p.m. each day. BOEM anticipates the auction will last one or two business days, but bidders are advised to prepare to continue bidding for additional business days as necessary to resolve the auction.
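The tentative timing above can be represented with a small schedule sketch. The 30-minute round and 10-minute recess are the tentative figures from this notice; the class itself is illustrative only and is not the auction system's data model.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class AuctionRound:
    """One bidding round followed by a recess, per the outline above."""
    start: datetime
    bidding: timedelta
    recess: timedelta

    @property
    def bidding_ends(self):
        return self.start + self.bidding

    @property
    def next_round_starts(self):
        return self.bidding_ends + self.recess

# Tentative first round: 8:30 a.m. EST on December 15, 2016, with a 30-minute
# round and a 10-minute recess.
r1 = AuctionRound(datetime(2016, 12, 15, 8, 30), timedelta(minutes=30),
                  timedelta(minutes=10))
print(r1.bidding_ends.time(), r1.next_round_starts.time())   # 09:00:00 09:10:00
```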
BOEM and the auction contractors will use the auction platform messaging service to keep bidders informed on issues of interest during the auction. For example, BOEM may change the schedule at any time, including during the auction. If BOEM changes the schedule during the auction, it will use the messaging feature to notify bidders that a revision has been made, and direct bidders to the relevant page. BOEM will also use the messaging system for other changes and items of note during the auction.
Bidders may place bids at any time during the round.
The timing of the auction will be elaborated on and clarified in the Auction System Technical Supplement available on BOEM's Web site at:
During the auction, and for one hour after the auction if the LOR is triggered, bidders are prohibited from communicating with each other regarding their participation in the auction. During the same period, bidders are also prohibited from communicating with the general public regarding any aspect of their participation, or lack thereof, in the auction, including, but not limited to, through social media, updated Web sites, or press releases.
Alternate Bidding Procedures enable a bidder that is having difficulties accessing the Internet to submit its bid via fax using an Alternate Bidding Form available on BOEM's Web site at:
In order to be authorized to use an Alternative Bidding Form, a bidder must call the help desk number listed in the Auction Manual
Bidding behavior in this sale is subject to Federal antitrust laws. Accordingly, following the auction, but before the acceptance of bids and the issuance of leases, BOEM will “allow the Attorney General, in consultation with the Federal Trade Commission, 30 days to review the results of the lease sale.” 43 U.S.C. 1337(c). If a bidder is found to have engaged in anti-competitive behavior in connection with its participation in the competitive bidding process, BOEM may reject the provisionally winning bid. Compliance with BOEM's auction procedures and regulations is not an absolute defense to violations of antitrust laws.
Anti-competitive behavior determinations are fact-specific. However, such behavior may manifest itself in several different ways, including, but not limited to:
• An express or tacit agreement among bidders not to bid in an auction, or to bid a particular price;
• An agreement among bidders not to bid for a particular LA;
• An agreement among bidders not to bid against each other; or
• Other agreements among bidders that have the potential to affect the final auction price.
BOEM will decline to award a lease if the Attorney General, in consultation with the Federal Trade Commission, determines that doing so would be inconsistent with the antitrust laws. 43 U.S.C. 1337(c).
For more information on whether specific communications or agreements could constitute a violation of Federal antitrust law, please see:
Within ten business days of receiving the lease copies, the provisionally winning bidder must:
1. Sign the lease on the bidder's behalf;
2. File financial assurance, as required under 30 CFR 585.515-537; and
3. Pay by electronic funds transfer (EFT) the balance (if any) of the bonus bid (winning bid less the bid deposit). BOEM requires bidders to use EFT procedures for this payment.
BOEM will not execute a lease until the three requirements above have been satisfied, BOEM has accepted the provisionally winning bidder's financial assurance pursuant to 30 CFR 585.515, and BOEM has processed the provisionally winning bidder's payment.
BOEM may extend the ten business day deadline for executing the lease on the bidder's behalf, filing the required financial assurance, and/or paying the balance of the bonus bid if it determines the delay was caused by events beyond the provisionally winning bidder's control.
If the provisionally winning bidder does not meet these requirements or otherwise fails to comply with applicable regulations or the terms of the FSN, BOEM reserves the right to not issue the lease to that bidder. In such a case, the provisionally winning bidder will forfeit its bid deposit.
Within 45 calendar days of the date that the provisionally winning bidder receives copies of the lease, it must pay the first year's rent.
(a) If BOEM rejects your bid, BOEM will provide a written statement of the reasons and refund any money deposited with your bid, without interest.
(b) You will then be able to ask the BOEM Director for reconsideration, in writing, within 15 business days of bid rejection, under 30 CFR 585.118(c)(1). We will send you a written response either affirming or reversing the rejection.
The procedures for appealing final decisions with respect to lease sales are described in 30 CFR 585.118(c).
Consistent with the Freedom of Information Act (FOIA), BOEM will protect privileged or confidential information that you submit. Exemption 4 of FOIA applies to “trade secrets and commercial or financial information that you submit that is privileged or confidential.” 5 U.S.C. 552(b)(4). If you wish to protect the confidentiality of such information, clearly mark it, “Contains Privileged or Confidential Information,” and consider submitting such information as a separate attachment. BOEM will not disclose such information, except as required by FOIA. Information that is not labeled as privileged or confidential will be regarded by BOEM as suitable for public release. Further, BOEM will not treat as confidential aggregate summaries of otherwise confidential information.
This FSN is published pursuant to subsection 8(p) of the OCS Lands Act (43 U.S.C. 1337(p)) (“the Act”), as amended by section 388 of the Energy Policy Act of 2005 (EPAct), and the implementing regulations at 30 CFR part 585, including sections 211 and 216.
Bureau of Ocean Energy Management (BOEM), Interior.
Notice of availability of a revised environmental assessment and a finding of no significant impact.
BOEM is announcing the availability of a revised environmental assessment (EA) and finding of no significant impact (FONSI) for commercial wind lease issuance, site characterization activities (geophysical, geotechnical, archaeological, and biological surveys), and site assessment activities (including the installation and operation of a meteorological tower or buoys or both a tower and buoys) on the Atlantic Outer Continental Shelf offshore New York. The revised EA provides a discussion of potential impacts of the proposed action and an analysis of reasonable alternatives to the proposed action. In accordance with the requirements of the National Environmental Policy Act (NEPA) and the Council on Environmental Quality's (CEQ) regulations implementing NEPA at 40 CFR 1500-1508, BOEM issued a FONSI supported by the analysis in the revised EA. The FONSI concluded that the reasonably foreseeable environmental impacts associated with the proposed action and alternatives, as set forth in the EA, would not significantly impact the quality of the human environment; therefore, the preparation of an environmental impact statement is not required. This notice is being published concurrently with the Final Sale Notice for the New York Wind Energy Area (WEA). These documents and associated information are available on BOEM's Web site at
Michelle Morin, BOEM Office of Renewable Energy Programs, 45600 Woodland Road, Sterling, Virginia 20166, (703) 787-1340 or
In June 2016, BOEM published an EA to consider the reasonably foreseeable environmental consequences associated with commercial wind lease issuance, site characterization activities, and site assessment activities within the WEA offshore New York. A notice was published on June 6, 2016, to announce the availability of the EA and initiate a 30-day public comment period (81 FR 36344). Due to requests for extension, the public comment period closed on July 13, 2016. The EA was subsequently revised based on comments received through the close of the comment period.
In addition to the proposed action, the revised EA considers two alternatives: (1) Restricting site assessment structure placement within 2 nm (3.7 km) of the traffic separation scheme, and (2) no action. BOEM's analysis of the proposed action and alternatives takes into account standard operating conditions (SOCs) designed to avoid or minimize potential impacts to marine mammals and sea turtles. The SOCs can be found in Appendix B of the revised EA.
BOEM will use the revised EA to inform its decisions regarding lease issuance in the New York WEA and subsequent review of site assessment plans in the lease area. The competitive leasing process is set forth at 30 CFR 585.210-585.225. A future lessee may propose a wind energy generation facility on its lease by submitting a construction and operations plan (COP) to BOEM. BOEM would then prepare a separate site- and project-specific NEPA analysis of the activities proposed in the COP.
This notice of availability for an EA is in compliance with the National Environmental Policy Act (NEPA) of 1969, as amended (42 U.S.C. 4321 et seq.).
Department of Justice.
30-Day notice.
The Department of Justice (DOJ), Justice Management Division, Office of Attorney Recruitment and Management (OARM), will be submitting the following information collection request to the Office of Management and Budget (OMB) for review and approval in accordance with the Paperwork Reduction Act of 1995. This proposed information collection was previously published in the Federal Register, allowing a 60-day comment period.
Comments are encouraged and will be accepted for an additional 30 days until December 30, 2016.
Written comments and/or suggestions regarding the item(s) contained in this notice, especially regarding the estimated public burden and associated response time, should be directed to the U.S. Department of Justice, Office of Attorney Recruitment and Management, 450 5th Street NW., Suite 10200, Attn: Deana Willis, Washington, DC 20530 or sent to
Written comments and suggestions from the public and affected agencies concerning the proposed collection of information are encouraged. Your comments should address one or more of the following four points:
(1) Evaluate whether the proposed collection of information is necessary for the proper performance of the functions of the agency, including whether the information will have practical utility;
(2) Evaluate the accuracy of the agency's estimate of the burden of the proposed collection of information;
(3) Enhance the quality, utility, and clarity of the information to be collected; and
(4) Minimize the burden of the collection of information on those who are to respond, including through the use of appropriate automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses.
Overview of this information collection:
The Department of Justice Attorney Student Loan Repayment Program (ASLRP) is an agency recruitment and retention incentive program based on 5 U.S.C. 5379, as amended, and 5 CFR part 537. Anyone currently employed as an attorney or hired to serve in an attorney position within the Department may request consideration for the ASLRP. The Department selects new participants during an annual open season each spring and renews current beneficiaries who remain qualified for these benefits, subject to availability of funds. There are two application forms—one for new requests, and the other for renewal requests. A justification form (applicable to new requests only) and a loan continuation form complete the collection.
If additional information is required, contact: Jerri Murray, Department Clearance Officer, United States Department of Justice, Justice Management Division, Policy and Planning Staff, Two Constitution Square, 145 N Street NE., 3E.405B, Washington, DC 20530.
Notice.
The Department of Labor (DOL) is submitting the Occupational Safety and Health Administration (OSHA) sponsored information collection request (ICR) titled, "Special Dipping and Coating Operations (Dip Tanks)," to the Office of Management and Budget (OMB) for review and approval for continued use, without change, in accordance with the Paperwork Reduction Act of 1995 (PRA), 44 U.S.C. 3501 et seq.
The OMB will consider all written comments that the agency receives on or before November 30, 2016.
A copy of this ICR with applicable supporting documentation, including a description of the likely respondents, proposed frequency of response, and estimated total burden, may be obtained free of charge from the
Submit comments about this request by mail or courier to the Office of Information and Regulatory Affairs, Attn: OMB Desk Officer for DOL-OSHA, Office of Management and Budget, Room 10235, 725 17th Street NW., Washington, DC 20503; by Fax: 202-395-5806 (this is not a toll-free number); or by email:
Michel Smyth by telephone at 202-693-4129, TTY 202-693-8064, (these are not toll-free numbers) or by email at
This ICR seeks to extend PRA authority for the Special Dipping and Coating Operations (Dip Tanks) information collection. The Dipping and Coating Operations Standard requires employers to post a conspicuous sign near each piece of electrostatic detearing equipment that notifies employees of the minimum safe distance they must maintain between goods undergoing electrostatic detearing and the electrodes or conductors of the equipment used in the process.
This information collection is subject to the PRA. A Federal agency generally cannot conduct or sponsor a collection of information, and the public is generally not required to respond to an information collection, unless it is approved by the OMB under the PRA and displays a currently valid OMB Control Number. In addition, notwithstanding any other provisions of law, no person shall generally be subject to penalty for failing to comply with a collection of information that does not display a valid Control Number.
OMB authorization for an ICR cannot be for more than three (3) years without renewal, and the current approval for this collection is scheduled to expire on October 31, 2016. The DOL seeks to extend PRA authorization for this information collection for three (3) more years, without any change to existing requirements. The DOL notes that existing information collection requirements submitted to the OMB receive a month-to-month extension while they undergo review. For additional substantive information about this ICR, see the related notice published in the Federal Register.
Interested parties are encouraged to send comments to the OMB, Office of Information and Regulatory Affairs, at the address shown in the ADDRESSES section within 30 days of publication of this notice in the Federal Register. The OMB is particularly interested in comments that:
• Evaluate whether the proposed collection of information is necessary for the proper performance of the functions of the agency, including whether the information will have practical utility;
• Evaluate the accuracy of the agency's estimate of the burden of the proposed collection of information, including the validity of the methodology and assumptions used;
• Enhance the quality, utility, and clarity of the information to be collected; and
• Minimize the burden of the collection of information on those who are to respond, including through the use of appropriate automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses.
Notice.
The Department of Labor (DOL) is submitting the Mine Safety and Health Administration (MSHA) sponsored information collection request (ICR) titled, "Safety Standards for Underground Coal Mine Ventilation—Belt Entry Used as an Intake Air Course to Ventilate Working Sections and Areas Where Mechanized Mining Equipment is Being Installed or Removed," to the Office of Management and Budget (OMB) for review and approval for continued use, without change, in accordance with the Paperwork Reduction Act of 1995 (PRA), 44 U.S.C. 3501 et seq.
The OMB will consider all written comments that the agency receives on or before November 30, 2016.
A copy of this ICR with applicable supporting documentation, including a description of the likely respondents, proposed frequency of response, and estimated total burden, may be obtained free of charge from the
Submit comments about this request by mail or courier to the Office of Information and Regulatory Affairs, Attn: OMB Desk Officer for DOL-MSHA, Office of Management and Budget, Room 10235, 725 17th Street NW., Washington, DC 20503; by Fax: 202-395-5806 (this is not a toll-free number); or by email:
Michel Smyth by telephone at 202-693-4129, TTY 202-693-8064, (these are not toll-free numbers) or by email at
This ICR seeks to extend PRA authority for the Safety Standards for Underground Coal Mine Ventilation—Belt Entry Used as an Intake Air Course to Ventilate Working Sections and Areas Where Mechanized Mining Equipment is Being Installed or Removed information collection requirements codified in regulations at 30 CFR part 75. More specifically, 30 CFR 75.351 requires a mine operator that elects to use belt air to ventilate a working section, or an area where mechanized equipment is being installed or removed, to maintain records used by coal mine supervisors, miners, and Federal and State mine inspectors to show that required examinations and tests were conducted. These records give insight into hazardous conditions that have been or may be encountered. Inspection records help in making decisions that ultimately affect the safety and health of miners working in belt air mines. Sections 101(a) and 103(h) of the Federal Mine Safety and Health Act of 1977 authorize this information collection.
This information collection is subject to the PRA. A Federal agency generally cannot conduct or sponsor a collection of information, and the public is generally not required to respond to an information collection, unless it is approved by the OMB under the PRA and displays a currently valid OMB Control Number. In addition, notwithstanding any other provisions of law, no person shall generally be subject to penalty for failing to comply with a collection of information that does not display a valid Control Number.
OMB authorization for an ICR cannot be for more than three (3) years without renewal, and the current approval for this collection is scheduled to expire on December 31, 2016. The DOL seeks to extend PRA authorization for this information collection for three (3) more years, without any change to existing requirements. The DOL notes that existing information collection requirements submitted to the OMB receive a month-to-month extension while they undergo review. For additional substantive information about this ICR, see the related notice published in the Federal Register.
Interested parties are encouraged to send comments to the OMB, Office of Information and Regulatory Affairs, at the address shown in the ADDRESSES section within 30 days of publication of this notice in the Federal Register. The OMB is particularly interested in comments that:
• Evaluate whether the proposed collection of information is necessary for the proper performance of the functions of the agency, including whether the information will have practical utility;
• Evaluate the accuracy of the agency's estimate of the burden of the proposed collection of information, including the validity of the methodology and assumptions used;
• Enhance the quality, utility, and clarity of the information to be collected; and
• Minimize the burden of the collection of information on those who are to respond, including through the use of appropriate automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses.
Notice.
The Department of Labor (DOL) is submitting the Occupational Safety and Health Administration (OSHA) sponsored information collection request (ICR) titled, "Derricks Standard," to the Office of Management and Budget (OMB) for review and approval for continued use, without change, in accordance with the Paperwork Reduction Act of 1995 (PRA), 44 U.S.C. 3501 et seq.
The OMB will consider all written comments that the agency receives on or before November 30, 2016.
A copy of this ICR with applicable supporting documentation, including a description of the likely respondents, proposed frequency of response, and estimated total burden, may be obtained free of charge from the
Submit comments about this request by mail or courier to the Office of Information and Regulatory Affairs, Attn: OMB Desk Officer for DOL-OSHA, Office of Management and Budget, Room 10235, 725 17th Street NW., Washington, DC 20503; by Fax: 202-395-5806 (this is not a toll-free number); or by email:
Contact Michel Smyth by telephone at 202-693-4129, TTY 202-693-8064, (these are not toll-free numbers) or by email at
44 U.S.C. 3507(a)(1)(D).
This ICR seeks to extend PRA authority for the Derricks Standard information collection requirements codified in regulations 29 CFR 1910.181. The specified requirements are for marking the rated load on derricks, preparing certification records that verify the inspection of derrick ropes, and posting warning signs while the derrick is undergoing adjustments and repairs. Certification records must be maintained and disclosed upon request. Occupational Safety and Health Act sections 2(b)(3), 6(b)(7), and 8(c) authorize this information collection.
This information collection is subject to the PRA. A Federal agency generally cannot conduct or sponsor a collection of information, and the public is generally not required to respond to an information collection, unless it is approved by the OMB under the PRA and displays a currently valid OMB Control Number. In addition, notwithstanding any other provisions of law, no person shall generally be subject to penalty for failing to comply with a collection of information that does not display a valid Control Number.
OMB authorization for an ICR cannot be for more than three (3) years without renewal, and the current approval for this collection is scheduled to expire on October 31, 2016. The DOL seeks to extend PRA authorization for this information collection for three (3) more years, without any change to existing requirements. The DOL notes that existing information collection requirements submitted to the OMB receive a month-to-month extension while they undergo review. For additional substantive information about this ICR, see the related notice published in the Federal Register.
Interested parties are encouraged to send comments to the OMB, Office of Information and Regulatory Affairs, at the address shown in the ADDRESSES section within 30 days of publication of this notice in the Federal Register. The OMB is particularly interested in comments that:
• Evaluate whether the proposed collection of information is necessary for the proper performance of the functions of the agency, including whether the information will have practical utility;
• Evaluate the accuracy of the agency's estimate of the burden of the proposed collection of information, including the validity of the methodology and assumptions used;
• Enhance the quality, utility, and clarity of the information to be collected; and
• Minimize the burden of the collection of information on those who are to respond, including through the use of appropriate automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses.
Occupational Safety and Health Administration (OSHA), Labor.
Notice.
In this notice, OSHA announces the application of Intertek Testing Services NA, Inc. for expansion of its recognition as a Nationally Recognized Testing Laboratory (NRTL) and presents the Agency's preliminary finding to grant the application.
Submit comments, information, and documents in response to this notice, or requests for an extension of time to make a submission, on or before November 15, 2016.
Submit comments by any of the following methods:
Information regarding this notice is available from the following sources:
The Occupational Safety and Health Administration is providing notice that Intertek Testing Services NA, Inc. (ITSNA), is applying for expansion of its current recognition as an NRTL. ITSNA requests the addition of twenty-three (23) test standards to its NRTL scope of recognition.
OSHA recognition of an NRTL signifies that the organization meets the requirements specified in 29 CFR 1910.7. Recognition is an acknowledgment that the organization can perform independent safety testing and certification of the specific products covered within its scope of recognition. Each NRTL's scope of recognition includes (1) the type of products the NRTL may test, with each type specified by its applicable test standard; and (2) the recognized site(s) that has/have the technical capability to perform the product-testing and product-certification activities for test standards within the NRTL's scope. Recognition is not a delegation or grant of government authority; however, recognition enables employers to use products approved by the NRTL to meet OSHA standards that require product testing and certification.
The Agency processes applications by an NRTL for initial recognition and for an expansion or renewal of this recognition, following requirements in Appendix A to 29 CFR 1910.7. This appendix requires that the Agency publish two notices in the Federal Register.
ITSNA currently has fourteen (14) facilities (sites) recognized by OSHA for product testing and certification, with its headquarters located at: Intertek Testing Services NA, Inc., 545 East Algonquin Road, Suite F, Arlington Heights, Illinois 60005. A complete list of ITSNA's scope of recognition is available at
ITSNA submitted an application, dated April 21, 2015 (OSHA-2007-0039-0022), to expand its recognition to include twenty-three (23) additional test standards. OSHA staff performed a detailed analysis of the application packet and reviewed other pertinent information. OSHA did not perform any on-site reviews in relation to this application.
Table 1 below lists the appropriate test standards found in ITSNA's application for expansion for testing and certification of products under the NRTL Program.
ITSNA submitted an acceptable application for expansion of its scope of recognition. OSHA's review of the application file and pertinent documentation indicates that ITSNA can meet the requirements prescribed by 29 CFR 1910.7 for expanding its recognition to include the twenty-three test standards for NRTL testing and certification listed above. This preliminary finding does not constitute an interim or temporary approval of ITSNA's application.
OSHA welcomes public comment as to whether ITSNA meets the requirements of 29 CFR 1910.7 for expansion of its recognition as an NRTL. Comments should consist of pertinent written documents and exhibits. Commenters needing more time to comment must submit a request in writing, stating the reasons for the request. Commenters must submit the written request for an extension by the due date for comments. OSHA will limit any extension to 10 days unless the requester justifies a longer period. OSHA may deny a request for an extension if the request is not adequately justified. To obtain or review copies of the exhibits identified in this notice, as well as comments submitted to the docket, contact the Docket Office, Room N-3653, Occupational Safety and Health Administration, U.S. Department of Labor, at the above address. These materials also are available online at
OSHA staff will review all comments to the docket submitted in a timely manner and, after addressing the issues raised by these comments, will recommend to the Assistant Secretary for Occupational Safety and Health whether to grant ITSNA's application for expansion of its scope of recognition. The Assistant Secretary will make the final decision on granting the application. In making this decision, the Assistant Secretary may undertake other proceedings prescribed in Appendix A to 29 CFR 1910.7.
OSHA will publish a public notice of its final decision in the Federal Register.
David Michaels, Ph.D., MPH, Assistant Secretary of Labor for Occupational Safety and Health, 200 Constitution Avenue NW., Washington, DC 20210, authorized the preparation of this notice. Accordingly, the Agency is issuing this notice pursuant to 29 U.S.C. 657(g)(2), Secretary of Labor's Order No. 1-2012 (77 FR 3912, Jan. 25, 2012), and 29 CFR 1910.7.
Occupational Safety and Health Administration (OSHA), Labor.
Notice.
In this notice, OSHA announces the applications of TUV Rheinland of North America, Inc., for expansion of its recognition as a Nationally Recognized Testing Laboratory (NRTL) and presents the Agency's preliminary finding to grant the applications. Additionally, OSHA proposes to add a new test standard to the NRTL listing of Appropriate Test Standards.
Submit comments, information, and documents in response to this notice, or requests for an extension of time to make a submission, on or before November 15, 2016.
Submit comments by any of the following methods:
Information regarding this notice is available from the following sources:
The Occupational Safety and Health Administration is providing notice that TUV Rheinland of North America, Inc. (TUVRNA), is applying for expansion of its current recognition as an NRTL. TUVRNA requests the addition of three test standards and two additional recognized sites to its NRTL scope of recognition.
OSHA recognition of an NRTL signifies that the organization meets the requirements specified in 29 CFR 1910.7. Recognition is an acknowledgment that the organization can perform independent safety testing and certification of the specific products covered within its scope of recognition. Each NRTL's scope of recognition includes (1) the type of products the NRTL may test, with each type specified by its applicable test standard; and (2) the recognized site(s) that has/have the technical capability to perform the product-testing and product-certification activities for test standards within the NRTL's scope. Recognition is not a delegation or grant of government authority; however, recognition enables employers to use products approved by the NRTL to meet OSHA standards that require product testing and certification.
The Agency processes applications by an NRTL for initial recognition and for an expansion or renewal of this recognition, following requirements in Appendix A to 29 CFR 1910.7. This appendix requires that the Agency publish two notices in the Federal Register.
TUVRNA currently has three facilities (sites) recognized by OSHA for product testing and certification, with its headquarters located at: TUV Rheinland of North America, Inc., 12 Commerce Road, Newtown, Connecticut 06470. A complete list of TUVRNA's scope of recognition is available at
TUVRNA submitted five applications, dated April 1, 2015 (OSHA-2007-0042-0016), May 6, 2015 (OSHA-2007-0042-0017), August 20, 2015 (OSHA-2007-0042-0018), December 7, 2015 (OSHA-2007-0042-0019) and March 2, 2016 (OSHA-2007-0042-0020), to expand its recognition to include three additional test standards and two additional recognized sites. The two proposed recognized testing sites are located at: TUV Rheinland Japan Ltd., Global Technology Assessment Center, 4-25-2 Kita-Yamata, Tsuzuki-ku, Yokohama, Kanagawa, 224-0021 JAPAN and TUV Rheinland LGA Products GmbH, Am Grauen Stein 29, Koln, NRW 51105 GERMANY. OSHA performed on-site reviews of TUV Yokohama on February 16-17, 2016, and TUV Cologne on June 9-10, 2016, in relation to these applications, in which assessors found some nonconformances with the requirements of 29 CFR 1910.7. TUVRNA addressed these issues sufficiently and OSHA staff preliminarily determined that OSHA should grant the additional site applications.
TUVRNA's expansion application also requested the addition of three test standards to its NRTL scope of recognition. OSHA staff performed a detailed analysis of the application packet and reviewed other pertinent information as well as conducted the on-site reviews discussed above. Table 1 below lists the appropriate test standards found in TUVRNA's applications for expansion for testing and certification of products under the NRTL Program.
Periodically, OSHA will propose to add new test standards to the NRTL list of appropriate test standards following an evaluation of the test standard document. To qualify as an appropriate test standard, the Agency evaluates the document to (1) verify it represents a product category for which OSHA requires certification by an NRTL, (2) verify the document represents an end product and not a component, and (3) verify the document defines safety test specifications (not installation or operational performance specifications). OSHA becomes aware of new test standards through various avenues. For example, OSHA may become aware of new test standards by: (1) Monitoring notifications issued by certain standards development organizations (SDOs); (2) reviewing applications by NRTLs or applicants seeking recognition to include a new test standard in their scopes of recognition; and (3) obtaining notification from manufacturers.
In this notice, OSHA proposes to add a new test standard to the NRTL Program's list of appropriate test standards. Table 2, below, lists the test standard that is new to the NRTL Program. OSHA preliminarily determined that this test standard is an appropriate test standard and proposes to include it in the NRTL Program's List of Appropriate Test Standards. OSHA seeks public comment on this preliminary determination.
TUVRNA submitted acceptable applications for expansion of its scope of recognition. OSHA's review of the application files and pertinent documentation indicates that TUVRNA can meet the requirements prescribed by 29 CFR 1910.7 for expanding its recognition to include the three test standards for NRTL testing and certification listed above. OSHA's detailed on-site assessments indicate that TUVRNA can meet the requirements prescribed by 29 CFR 1910.7 for expanding its recognition to include the two additional sites for NRTL testing and certification. This preliminary finding does not constitute an interim or temporary approval of TUVRNA's applications.
OSHA welcomes public comment as to whether TUVRNA meets the requirements of 29 CFR 1910.7 for expansion of its recognition as an NRTL. Comments should consist of pertinent written documents and exhibits. Commenters needing more time to comment must submit a request in writing, stating the reasons for the request. Commenters must submit the written request for an extension by the due date for comments. OSHA will limit any extension to 10 days unless the requester justifies a longer period. OSHA may deny a request for an extension if the request is not adequately justified. To obtain or review copies of the exhibits identified in this notice, as well as comments submitted to the docket, contact the Docket Office, Room N-2625, Occupational Safety and Health Administration, U.S. Department of Labor, at the above address. These materials also are available online at
OSHA staff will review all comments to the docket submitted in a timely manner and, after addressing the issues raised by these comments, will recommend to the Assistant Secretary for Occupational Safety and Health whether to grant TUVRNA's application for expansion of its scope of recognition. The Assistant Secretary will make the final decision on granting the application. In making this decision, the Assistant Secretary may undertake other proceedings prescribed in Appendix A to 29 CFR 1910.7.
OSHA will publish a public notice of its final decision in the
David Michaels, Ph.D., MPH, Assistant Secretary of Labor for Occupational Safety and Health, 200 Constitution Avenue NW., Washington, DC 20210, authorized the preparation of this notice. Accordingly, the Agency is issuing this notice pursuant to 29 U.S.C. 657(g)(2), Secretary of Labor's Order No. 1-2012 (77 FR 3912, Jan. 25, 2012), and 29 CFR 1910.7.
Occupational Safety and Health Administration (OSHA), Labor.
Notice.
In this notice, OSHA announces the application of Curtis-Strauss LLC for expansion of its recognition as a Nationally Recognized Testing Laboratory (NRTL) and presents the Agency's preliminary finding to grant the application.
Submit comments, information, and documents in response to this notice, or requests for an extension of time to make a submission, on or before November 15, 2016.
Submit comments by any of the following methods:
1.
2.
3.
4.
5.
6.
Information regarding this notice is available from the following sources:
The Occupational Safety and Health Administration is providing notice that Curtis-Strauss LLC (CSL) is applying for expansion of its current recognition as an NRTL. CSL requests the addition of sixteen (16) test standards to its NRTL scope of recognition.
OSHA recognition of an NRTL signifies that the organization meets the requirements specified in 29 CFR 1910.7. Recognition is an acknowledgment that the organization can perform independent safety testing and certification of the specific products covered within its scope of recognition. Each NRTL's scope of recognition includes (1) the type of products the NRTL may test, with each type specified by its applicable test standard; and (2) the recognized site(s) that has/have the technical capability to perform the product-testing and product-certification activities for test standards within the NRTL's scope. Recognition is not a delegation or grant of government authority; however, recognition enables employers to use products approved by the NRTL to meet OSHA standards that require product testing and certification.
The Agency processes applications by an NRTL for initial recognition and for an expansion or renewal of this recognition, following requirements in Appendix A to 29 CFR 1910.7. This appendix requires that the Agency publish two notices in the
CSL currently has one facility (site) recognized by OSHA for product testing and certification, with its headquarters located at: Curtis-Strauss LLC, One Distribution Center Circle, Suite #1, Littleton, MA 01460. A complete list of CSL's scope of recognition is available at
CSL submitted four applications, each dated December 29, 2015 (OSHA-2009-0026-0065; OSHA-2009-0026-0066; OSHA-2009-0026-0069; OSHA-2009-0026-0068), to expand its recognition to include 16 additional test standards. OSHA staff performed a detailed analysis of the application packets and reviewed other pertinent information. OSHA did not perform any on-site reviews in relation to these applications.
Table 1 below lists the appropriate test standards found in CSL's application for expansion for testing and certification of products under the NRTL Program.
CSL submitted an acceptable application for expansion of its scope of recognition. OSHA's review of the application file and pertinent documentation indicates that CSL can meet the requirements prescribed by 29 CFR 1910.7 for expanding its recognition to include the 16 test standards listed above for NRTL testing and certification. This preliminary finding does not constitute an interim or temporary approval of CSL's application.
OSHA welcomes public comment as to whether CSL meets the requirements of 29 CFR 1910.7 for expansion of its recognition as an NRTL. Comments should consist of pertinent written documents and exhibits. Commenters needing more time to comment must submit a request in writing, stating the reasons for the request. Commenters must submit the written request for an extension by the due date for comments. OSHA will limit any extension to 10 days unless the requester justifies a longer period. OSHA may deny a request for an extension if the request is not adequately justified. To obtain or review copies of the exhibits identified in this notice, as well as comments submitted to the docket, contact the Docket Office, Room N-2625, Occupational Safety and Health Administration, U.S. Department of Labor, at the above address. These materials also are available online at
OSHA staff will review all comments to the docket submitted in a timely manner and, after addressing the issues raised by these comments, will recommend to the Assistant Secretary for Occupational Safety and Health whether to grant CSL's application for expansion of its scope of recognition. The Assistant Secretary will make the final decision on granting the application. In making this decision, the Assistant Secretary may undertake other proceedings prescribed in Appendix A to 29 CFR 1910.7.
OSHA will publish a public notice of its final decision in the
David Michaels, Ph.D., MPH, Assistant Secretary of Labor for Occupational Safety and Health, 200 Constitution Avenue NW., Washington, DC 20210, authorized the preparation of this notice. Accordingly, the Agency is issuing this notice pursuant to 29 U.S.C. 657(g)(2), Secretary of Labor's Order No. 1-2012 (77 FR 3912, Jan. 25, 2012), and 29 CFR 1910.7.
National Aeronautics and Space Administration.
Notice of meeting.
In accordance with the Federal Advisory Committee Act, Public Law 92-463, as amended, the National Aeronautics and Space Administration announces a meeting of the Aeronautics Committee of the NASA Advisory Council (NAC). This meeting will be held for the purpose of soliciting, from the aeronautics community and other persons, research and technical information relevant to program planning. This Committee reports to the NAC.
Monday, November 14, 2016, 2:00 p.m.-6:00 p.m., and Tuesday, November 15, 2016, 9:45 a.m. to 3:00 p.m., Local Time.
NASA Ames Conference Center, Building 3, 500 Severyns Avenue, Moffett Field, CA 94035-0001.
Ms. Irma Rodriguez, Executive Secretary for the NAC Aeronautics Committee, NASA Headquarters, Washington, DC 20546, phone number (202) 358-0984, or
The meeting will be open to the public up to the capacity of the room. Any person interested in participating in the meeting by WebEx and telephone should contact Ms. Irma Rodriguez at (202) 358-0984 for the web link, toll-free number and passcode. The agenda for the meeting includes the following topics:
Attendees will be requested to sign a register and to comply with NASA security requirements, including the presentation of a valid picture ID before receiving access to NASA Ames Research Center. All attendees must state that they are attending the NASA Advisory Council Aeronautics Committee meeting in the NASA Ames Conference Center in Building 3. Due to the Real ID Act, Public Law 109-13, any attendees with driver's licenses issued from non-compliant states/territories must present a second form of ID. [Federal employee badge; passport; active military identification card; enhanced driver's license; U.S. Coast Guard Merchant Mariner card; Native American tribal document; school identification accompanied by an item from LIST C (documents that establish employment authorization) from the “List of the Acceptable Documents” on Form I-9]. Non-compliant states/territories are: American Samoa, Minnesota, Missouri, and Washington. Foreign nationals attending this meeting will be required to provide a copy of their passport and visa in addition to providing the following information no less than 8 working days prior to the meeting: Full name; gender; date/place of birth; citizenship; passport information (number, country, telephone); visa information (number, type, expiration date); employer affiliation information (name of institution, address, country, telephone); title/position of attendee to Ms. Irma Rodriguez, NAC Aeronautics Committee Executive Secretary, fax (202) 358-4060. U.S. Citizens and Permanent Residents (green card
National Credit Union Administration (NCUA).
Notice.
The National Credit Union Administration (NCUA) will be submitting the following information collection requests to the Office of Management and Budget (OMB) for review and clearance in accordance with the Paperwork Reduction Act of 1995, Public Law 104-13, on or after the date of publication of this notice.
Comments should be received on or before November 30, 2016 to be assured of consideration.
Send comments regarding the burden estimate, or any other aspect of the information collection, including suggestions for reducing the burden, to (1) Office of Information and Regulatory Affairs, Office of Management and Budget, Attention: Desk Officer for NCUA, New Executive Office Building, Room 10235, Washington, DC 20503, or email at
Copies of the submission may be obtained by emailing
By Gerard Poliquin, Secretary of the Board, the National Credit Union Administration, on October 19, 2016.
Nuclear Regulatory Commission.
NuScale design-specific review standard; issuance.
The U.S. Nuclear Regulatory Commission (NRC or Commission) has issued the NuScale Power, LLC, (NuScale), Design-Specific Review Standard (DSRS) Sections, and is issuing the final NuScale DSRS Scope and Safety Review Matrix, for NuScale Design Certification (DC), Combined License (COL), and Early Site Permit (ESP) reviews. The NRC staff is also issuing the DSRS public comment resolution matrices, which address the comments received on the draft DSRS. The NuScale DSRS provides guidance to the NRC staff for performing safety reviews for those specific areas where existing NUREG-0800, “Standard Review Plan [SRP] for the Review of Safety Analysis Reports for Nuclear Power Plants: LWR Edition,” sections do not address the unique features of the NuScale design.
The DSRS sections were effective upon issuance between June 24 and August 4, 2016.
Please refer to Docket ID NRC-2015-0160 when contacting the NRC about the availability of information regarding this document. You may obtain publicly-available information related to this document using any of the following methods:
•
•
•
Rajender Auluck, telephone: 301-415-1025; email:
In the Staff Requirements Memorandum (SRM) COMGBJ-10-0004/COMGEA-10-0001, “Use of Risk Insights to Enhance Safety Focus of Small Modular Reactor Reviews,” dated August 31, 2010 (ADAMS Accession No. ML102510405), the Commission provided direction to the NRC staff on the preparation for, and review of, small modular reactor (SMR) applications, with a near-term focus on integral pressurized-water reactor designs. The Commission directed the NRC staff to more fully integrate the use of risk insights into pre-application activities and the review of applications and, consistent with regulatory requirements and Commission policy statements, to align the review focus and resources to risk-significant structures, systems, and components and other aspects of the design that contribute most to safety in order to enhance the effectiveness and efficiency of the review process. The Commission directed the NRC staff to develop a design-specific, risk-informed review plan for each SMR design to address pre-application and application review activities. An important part of this review plan is the DSRS. The DSRS for the NuScale design is the result of the implementation of the Commission's direction.
The NuScale DSRS (available in ADAMS Package Accession No. ML15355A295) reflects current NRC staff safety review methods and practices which integrate risk insights and, where appropriate, lessons learned from the NRC's reviews of DC and COL applications completed since the last revision of the NUREG-0800, SRP Introduction, Part 2, “Standard Review Plan for the Review of Safety Analysis Reports for Nuclear Power Plants: Light-Water Small Modular Reactor Edition,” January 2014 (ADAMS Accession No. ML13207A315). The NuScale DSRS Scope and Safety Review Matrix provides a complete list of SRP sections and identifies which SRP sections will be used for DC, COL, or ESP reviews concerning the NuScale design; which SRP sections are not applicable to the NuScale design; which SRP sections needed modification and were reissued as DSRS sections; and which new DSRS sections were added to address a unique design consideration in the NuScale design. The final NuScale DSRS Scope and Safety Review Matrix is available in ADAMS under Accession No. ML16263A000.
The NRC staff developed the content of the NuScale DSRS as an alternative method for evaluating a NuScale-specific application and has determined that the application may address the DSRS in lieu of addressing the SRP, with specified exceptions. These exceptions include particular review areas in which the DSRS directs reviewers to consult the SRP and others in which the SRP is used for the review as identified in the final NuScale DSRS Scope and Safety Review Matrix. If NuScale chooses to address the DSRS, the application should identify and describe all differences between the design features (DC and COL applications only), analytical techniques, and procedural measures proposed in an application and the guidance of the applicable DSRS section (or SRP section, as specified in the NuScale DSRS Scope and Safety Review Matrix), and discuss how the proposed alternative provides an acceptable method of complying with the regulations that underlie the DSRS acceptance criteria. The staff has accepted the content of the DSRS as an alternative method for evaluating whether an application complies with NRC regulations for the NuScale Small Modular Reactor applications, provided that the application does not deviate significantly from the design and siting assumptions made by the NRC staff while preparing the DSRS. If the design or siting assumptions in a NuScale application deviate significantly from the design and siting assumptions the staff used in preparing the DSRS, the staff will use the more general guidance in the SRP, as specified in sections 52.17(a)(1)(xii), 52.47(a)(9), or 52.79(a)(41) of title 10 of the
The NRC staff issued a
In the June 30, 2015
The results of determinations to use the related SRP sections rather than draft DSRS sections, along with other identified issues with the draft NuScale DSRS Scope and Safety Review Matrix, are documented in a separate "transitional" NuScale DSRS Scope and Safety Review Matrix (ADAMS Accession No. ML16076A048). The "transitional" Matrix shows the differences between the draft and final NuScale DSRS Scope and Safety Review Matrices and describes the reasons for these differences. The resulting final list of DSRS titles with corresponding section numbers and ADAMS references is provided in the table below and in ADAMS Package Accession No. ML15355A295.
In the future, should additional SRP sections be developed, the staff will determine at that time their applicability to the NuScale design. In addition, the NRC disseminates information regarding current safety issues and proposed solutions through various means, such as generic communications and the process for treating generic safety issues. When current issues are resolved, the staff will determine the need, extent, and nature of revision that should be made to the SRP and/or DSRS to reflect the new NRC guidance.
For the Nuclear Regulatory Commission.
Nuclear Regulatory Commission
Notice of intent to enter into a modified indemnity agreement.
The U.S. Nuclear Regulatory Commission (NRC) is issuing a notice of intent to enter into a modified indemnity agreement with Duke Energy Florida, LLC, (DEF) to operate Levy Nuclear Plant Units 1 and 2 (LNP 1 and 2). The NRC is required to publish notice of its intent to enter into an indemnity agreement which contains provisions different from the general form found in the NRC's regulations. A modification to the general form is necessary to accommodate the unique timing provisions of a combined license (COL).
On October 20, 2016, the Commission authorized the Director of the Office of New Reactors to issue COLs to DEF to construct and operate LNP 1 and 2. The modified indemnity agreement would be effective upon issuance of the COLs.
Please refer to Docket ID NRC-2008-0558 when contacting the NRC about the availability of information regarding this document. You may obtain publicly-available information related to this document using any of the following methods:
•
•
•
Donald Habib, Office of New Reactors, U.S. Nuclear Regulatory Commission, Washington DC 20555-0001; telephone: 301-415-1035, email:
On October 20, 2016, the Commission authorized issuance of COLs to DEF for LNP 1 and 2. These COLs would include a license pursuant to part 70 of title 10 of the
Pursuant to 10 CFR 140.9, the NRC is publishing notice of its intent to enter into an indemnity agreement that contains provisions different from the general form found in 10 CFR 140.92. Modifications to the general indemnity agreement are addressed in the following discussion.
The provisions of the general form of indemnity agreement provided in 10 CFR 140.92 address insurance and indemnity for a licensee that is authorized to operate as soon as an operating license (OL) is issued pursuant to 10 CFR part 50, "Domestic licensing of production and utilization facilities." DEF, however, has requested a COL pursuant to 10 CFR part 52, "Licenses, Certifications, and Approvals for Nuclear Power Plants" to construct and operate LNP 1 and 2. Unlike an OL, which authorizes operation of the facility as soon as the license is issued, a COL authorizes the construction of the facility but does not authorize operation of the facility until the Commission makes a finding pursuant to 10 CFR 52.103(g) that the acceptance criteria in the COL are met (also called a "§ 52.103(g) finding"). The COL holders are not required to maintain financial protection in the amount specified in 10 CFR 140.11(a)(4) before the § 52.103(g) finding is made, but must maintain financial protection in the amount specified by 10 CFR 140.13 upon receipt of a COL because the COL includes a license issued pursuant to 10 CFR part 70. Therefore, the provisions in the general form of indemnity agreement must be modified to address the timing differences applicable to COLs.
Modifications to the general form of indemnity agreement will reflect the timing distinctions applicable to COLs. In addition, other modifications and their intent are described below:
(1) References to Mutual Atomic Energy Liability Underwriters have been removed because this entity no longer exists.
(2) Monetary amounts have been updated to reflect changes that have been made to Sec. 170, "Indemnification and Limitation of Liability," of the Atomic Energy Act of 1954, as amended (42 U.S.C. 2210).
Accordingly, for the reasons discussed in this notice and in accordance with 10 CFR 140.9, the NRC hereby provides notice of its intent to enter into an agreement of indemnity with DEF for LNP 1 and 2 with the described modifications to the general form of indemnity.
For the Nuclear Regulatory Commission.
Postal Regulatory Commission.
Notice.
The Commission is noticing recent Postal Service filings for the Commission's consideration concerning negotiated service agreements. This notice informs the public of the filing, invites public comment, and takes other administrative steps.
Submit comments electronically via the Commission's Filing Online system at
David A. Trissell, General Counsel, at 202-789-6820.
The Commission gives notice that the Postal Service filed request(s) for the Commission to consider matters related to negotiated service agreement(s). The request(s) may propose the addition or removal of a negotiated service agreement from the market dominant or the competitive product list, or the modification of an existing product currently appearing on the market dominant or the competitive product list.
Section II identifies the docket number(s) associated with each Postal Service request, the title of each Postal Service request, the request's acceptance date, and the authority cited by the Postal Service for each request. For each request, the Commission appoints an officer of the Commission to represent the interests of the general public in the proceeding, pursuant to 39 U.S.C. 505 (Public Representative). Section II also establishes comment deadline(s) pertaining to each request.
The public portions of the Postal Service's request(s) can be accessed via the Commission's Web site (
The Commission invites comments on whether the Postal Service's request(s) in the captioned docket(s) are consistent with the policies of title 39. For request(s) that the Postal Service states concern market dominant product(s), applicable statutory and regulatory requirements include 39 U.S.C. 3622, 39 U.S.C. 3642, 39 CFR part 3010, and 39 CFR part 3020, subpart B. For request(s) that the Postal Service states concern competitive product(s), applicable statutory and regulatory requirements include 39 U.S.C. 3632, 39 U.S.C. 3633, 39 U.S.C. 3642, 39 CFR part 3015, and 39 CFR part 3020, subpart B. Comment deadline(s) for each request appear in section II.
1.
2.
3.
4.
5.
6.
7.
This notice will be published in the
Postal Regulatory Commission.
Notice.
The Commission is noticing a recent Postal Service filing for the Commission's consideration concerning a negotiated service agreement. This notice informs the public of the filing, invites public comment, and takes other administrative steps.
Submit comments electronically via the Commission's Filing Online system at
David A. Trissell, General Counsel, at 202-789-6820.
The Commission gives notice that the Postal Service filed request(s) for the Commission to consider matters related to negotiated service agreement(s). The request(s) may propose the addition or removal of a negotiated service agreement from the market dominant or the competitive product list, or the modification of an existing product currently appearing on the market dominant or the competitive product list.
Section II identifies the docket number(s) associated with each Postal Service request, the title of each Postal Service request, the request's acceptance date, and the authority cited by the Postal Service for each request. For each request, the Commission appoints an officer of the Commission to represent the interests of the general public in the proceeding, pursuant to 39 U.S.C. 505 (Public Representative). Section II also establishes comment deadline(s) pertaining to each request.
The public portions of the Postal Service's request(s) can be accessed via the Commission's Web site (
The Commission invites comments on whether the Postal Service's request(s) in the captioned docket(s) are consistent with the policies of title 39. For request(s) that the Postal Service states concern market dominant product(s), applicable statutory and regulatory requirements include 39 U.S.C. 3622, 39 U.S.C. 3642, 39 CFR part 3010, and 39 CFR part 3020, subpart B. For request(s) that the Postal Service states concern competitive product(s), applicable statutory and regulatory requirements include 39 U.S.C. 3632, 39 U.S.C. 3633, 39 U.S.C. 3642, 39 CFR part 3015, and 39 CFR part 3020, subpart B. Comment deadline(s) for each request appear in section II.
1.
This notice will be published in the
Railroad Retirement Board.
Notice.
Pursuant to section 8(c)(2) and section 12(r)(3) of the Railroad Unemployment Insurance Act (Act) (45 U.S.C. 358(c)(2) and 45 U.S.C. 362(r)(3), respectively), the Board gives notice of the following:
1. The balance to the credit of the Railroad Unemployment Insurance (RUI) Account, as of June 30, 2016, is $93,849,116.28;
2. The September 30, 2016, balance of any new loans to the RUI Account, including accrued interest, is zero;
3. The system compensation base is $4,224,601,102.31 as of June 30, 2016;
4. The cumulative system unallocated charge balance is ($408,501,327.51) as of June 30, 2016;
5. The pooled credit ratio for calendar year 2017 is zero;
6. The pooled charged ratio for calendar year 2017 is zero;
7. The surcharge rate for calendar year 2017 is 1.5 percent;
8. The monthly compensation base under section 1(i) of the Act is $1,545 for months in calendar year 2017;
9. The amount described in sections 1(k) and 3 of the Act as “2.5 times the monthly compensation base” is $3,862.50 for base year (calendar year) 2017;
10. The amount described in section 4(a-2)(i)(A) of the Act as “2.5 times the monthly compensation base” is $3,862.50 with respect to disqualifications ending in calendar year 2017;
11. The amount described in section 2(c) of the Act as “an amount that bears the same ratio to $775 as the monthly compensation base for that year as computed under section 1(i) of this Act bears to $600” is $1,996 for months in calendar year 2017;
12. The maximum daily benefit rate under section 2(a)(3) of the Act is $72 with respect to days of unemployment and days of sickness in registration periods beginning after June 30, 2017.
The balance in notice (1) and the determinations made in notices (3) through (7) are based on data as of June 30, 2016. The balance in notice (2) is based on data as of September 30, 2016. The determinations made in notices (5) through (7) apply to the calculation, under section 8(a)(1)(C) of the Act, of employer contribution rates for 2017. The determinations made in notices (8) through (11) are effective January 1, 2017. The determination made in notice (12) is effective for registration periods beginning after June 30, 2017.
Secretary to the Board, Railroad Retirement Board, 844 Rush Street, Chicago, Illinois 60611-2092.
Michael J. Rizzo, Bureau of the Actuary, Railroad Retirement Board, 844 Rush Street, Chicago, Illinois 60611-2092, telephone (312) 751-4771.
The RRB is required by section 8(c)(1) of the Railroad Unemployment Insurance Act (Act) (45 U.S.C. 358(c)(1)) as amended by Public Law 100-647, to proclaim by October 15 of each year certain system-wide factors used in calculating experience-based employer contribution rates for the following year. The RRB is further required by section 8(c)(2) of the Act (45 U.S.C. 358(c)(2)) to publish the amounts so determined and proclaimed. The RRB is required by section 12(r)(3) of the Act (45 U.S.C. 362(r)(3)) to publish by December 11, 2016, the computation of the calendar year 2017 monthly compensation base (section 1(i) of the Act) and amounts described in sections 1(k), 2(c), 3 and 4(a-2)(i)(A) of the Act which are related to changes in the monthly compensation base. Also, the RRB is required to publish, by June 11, 2017, the maximum daily benefit rate under section 2(a)(3) of the Act for days of unemployment and days of sickness in registration periods beginning after June 30, 2017.
A surcharge is added in the calculation of each employer's contribution rate, subject to the applicable maximum rate, for a calendar year whenever the balance to the credit of the RUI Account on the preceding June 30 is less than the greater of $100 million or the amount that bears the same ratio to $100 million as the system compensation base for that June 30 bears to the system compensation base as of June 30, 1991. If the RUI Account balance is less than $100 million (as indexed), but at least $50 million (as indexed), the surcharge will be 1.5 percent. If the RUI Account balance is less than $50 million (as indexed), but greater than zero, the surcharge will be 2.5 percent. The maximum surcharge of 3.5 percent applies if the RUI Account balance is less than zero.
The ratio of the June 30, 2016 system compensation base of $4,224,601,102.31 to the June 30, 1991 system compensation base of $2,763,287,237.04 is 1.52883169. Multiplying 1.52883169 by $100 million yields $152,883,169.00. Multiplying $50 million by 1.52883169 produces $76,441,584.50. The Account balance on June 30, 2016, was $93,849,116.28. Accordingly, the surcharge rate for calendar year 2017 is 1.5 percent.
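The surcharge determination described above can be checked with a short calculation. The following Python sketch is offered for illustration only and is not part of the Board's proclamation; the variable names are illustrative, and the figures are those stated in this notice.

    # Illustrative check of the calendar year 2017 surcharge determination.
    # Figures are taken from this notice; this is not an official RRB computation.
    BASE_2016 = 4_224_601_102.31       # system compensation base, June 30, 2016
    BASE_1991 = 2_763_287_237.04       # system compensation base, June 30, 1991
    ACCOUNT_BALANCE = 93_849_116.28    # RUI Account balance, June 30, 2016

    index = round(BASE_2016 / BASE_1991, 8)               # 1.52883169, as carried in the notice
    upper = round(max(100_000_000, 100_000_000 * index), 2)   # $152,883,169.00
    lower = round(50_000_000 * index, 2)                  # $76,441,584.50

    if ACCOUNT_BALANCE >= upper:
        surcharge_rate = 0.0           # no surcharge
    elif ACCOUNT_BALANCE >= lower:
        surcharge_rate = 1.5           # percent
    elif ACCOUNT_BALANCE > 0:
        surcharge_rate = 2.5           # percent
    else:
        surcharge_rate = 3.5           # percent

    print(index, upper, lower, surcharge_rate)
    # 1.52883169 152883169.0 76441584.5 1.5

Because the June 30, 2016 balance of $93,849,116.28 falls between the two indexed thresholds, the sketch reproduces the 1.5 percent surcharge stated above.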
For years after 1988, section 1(i) of the Act contains a formula for determining the monthly compensation base. Under the prescribed formula, the monthly compensation base increases by approximately two-thirds of the cumulative growth in average national wages since 1984. The monthly compensation base for months in calendar year 2017 shall be equal to the greater of (a) $600 or (b) $600 [1 + {(A−37,800)/56,700}], where A equals the amount of the applicable base with respect to tier 1 taxes for 2017 under section 3231(e)(2) of the Internal Revenue Code of 1986. Section 1(i) further provides that if the amount so determined is not a multiple of $5, it shall be rounded to the nearest multiple of $5.
Using the calendar year 2017 tier 1 tax base of $127,200 for A above produces the amount of $1,546.03, which must then be rounded to $1,545. Accordingly, the monthly compensation base is determined to be $1,545 for months in calendar year 2017.
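The section 1(i) formula set out above lends itself to a short check. The Python sketch below is illustrative only; it simply restates the formula and the rounding step described in this notice.

    # Illustrative check of the 2017 monthly compensation base under section 1(i).
    A = 127_200                                      # calendar year 2017 tier 1 tax base
    raw = max(600, 600 * (1 + (A - 37_800) / 56_700))
    monthly_compensation_base = 5 * round(raw / 5)   # round to the nearest multiple of $5
    print(round(raw, 2), monthly_compensation_base)  # 1546.03 1545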
For years after 1988, sections 1(k), 3, 4(a-2)(i)(A) and 2(c) of the Act contain formulas for determining amounts related to the monthly compensation base.
Under section 1(k), remuneration earned from employment covered under the Act cannot be considered subsidiary remuneration if the employee's base year compensation is less than 2.5 times the monthly compensation base for months in such base year. Under section 3, an employee shall be a “qualified employee” if his/her base year compensation is not less than 2.5 times the monthly compensation base for months in such base year. Under section 4(a-2)(i)(A), an employee who leaves work voluntarily without good cause is disqualified from receiving unemployment benefits until he has been paid compensation of not less than 2.5 times the monthly compensation base for months in the calendar year in which the disqualification ends.
Multiplying 2.5 by the calendar year 2017 monthly compensation base of $1,545 produces $3,862.50. Accordingly, the amount determined under sections 1(k), 3 and 4(a-2)(i)(A) is $3,862.50 for calendar year 2017.
Under section 2(c), the maximum amount of normal benefits paid for days of unemployment within a benefit year and the maximum amount of normal benefits paid for days of sickness within a benefit year shall not exceed an employee's compensation in the base year. In determining an employee's base year compensation, any money remuneration in a month not in excess of an amount that bears the same ratio to $775 as the monthly compensation base for that year bears to $600 shall be taken into account. The calendar year 2017 monthly compensation base is $1,545. The ratio of $1,545 to $600 is 2.57500000. Multiplying 2.57500000 by $775 produces $1,995.63. Accordingly, the amount determined under section 2(c) is $1,996 for months in calendar year 2017.
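The amounts discussed in the two preceding paragraphs follow directly from the 2017 monthly compensation base. The following Python sketch is illustrative only and restates the arithmetic given above.

    # Illustrative check of the amounts derived from the 2017 monthly compensation base.
    MCB_2017 = 1_545

    # Sections 1(k), 3, and 4(a-2)(i)(A): 2.5 times the monthly compensation base.
    qualifying_amount = 2.5 * MCB_2017        # 3862.50

    # Section 2(c): the amount bearing the same ratio to $775 as the
    # monthly compensation base bears to $600.
    ratio = MCB_2017 / 600                    # 2.575
    section_2c_amount = round(ratio * 775)    # 1995.625, stated in this notice as $1,996

    print(qualifying_amount, ratio, section_2c_amount)   # 3862.5 2.575 1996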
Section 2(a)(3) contains a formula for determining the maximum daily benefit rate for registration periods beginning after June 30, 1989, and after each June 30 thereafter. Legislation enacted on
The calendar year 2016 monthly compensation base is $1,455. Multiplying $1,455 by 0.05 yields $72.75, which is rounded down to the next lower multiple of $1. Accordingly, the maximum daily benefit rate for days of unemployment and days of sickness in registration periods beginning after June 30, 2017, is determined to be $72.
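The section 2(a)(3) computation above can likewise be restated in a short sketch. This is illustrative only; it reproduces the multiplication shown in this notice and drops the fractional $0.75 to arrive at the $72 rate.

    # Illustrative check of the maximum daily benefit rate for registration
    # periods beginning after June 30, 2017.
    MCB_2016 = 1_455                          # calendar year 2016 monthly compensation base
    raw_rate = MCB_2016 * 5 / 100             # 5 percent of the base = 72.75
    max_daily_benefit_rate = int(raw_rate)    # whole-dollar rate reported in this notice
    print(raw_rate, max_daily_benefit_rate)   # 72.75 72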
By Authority of the Board.
[81 FR 73459, October 25, 2016].
Open Meeting.
100 F Street NE., Washington, DC.
Wednesday, October 26, 2016 10:00 a.m.
Time Change.
The Open Meeting scheduled for Wednesday, October 26, 2016 at 10:00 a.m., has been changed to Wednesday, October 26, 2016 at 11:00 a.m.
For further information and to ascertain what, if any, matters have been added, deleted or postponed, please contact:
The Office of the Secretary at (202) 551-5400.
Pursuant to Section 19(b)(1) of the Securities Exchange Act of 1934 (the “Act”),
The Exchange filed a proposal to amend Exchange Rule 11.13(b)(1) to describe when an order marked as “short” may be eligible for routing when a short sale price test restriction is in effect.
The text of the proposed rule change is available at the Exchange's Web site at
In its filing with the Commission, the Exchange included statements concerning the purpose of and basis for the proposed rule change and discussed any comments it received on the proposed rule change. The text of these statements may be examined at the places specified in Item IV below. The Exchange has prepared summaries, set forth in Sections A, B, and C below, of the most significant parts of such statements.
The Exchange proposes to amend Exchange Rule 11.13(b)(1) to describe when an order to sell marked
Under Rule 11.13(b)(1), an order marked “short” when a short sale price test restriction is in effect is not eligible for routing by the Exchange. If an order is ineligible for routing due to a short sale price test restriction and such order is an Immediate or Cancel (“IOC”) Order
The Exchange proposes to specify in Rule 11.13 that orders marked “short” may be eligible for routing by the Exchange when a short sale price test restriction is in effect where the User
Under Exchange Rule 11.13(b)(1), IOC Orders marked “short” that are not eligible for routing during a short sale price test restriction will continue to be cancelled.
The Exchange believes that the proposed rule change is consistent with Section 6(b) of the Act
The Exchange does not believe that the proposed rule change will result in any burden on competition that is not necessary or appropriate in furtherance of the purposes of the Act. The Exchange is simply proposing to reflect in its rules that orders marked “short” may be routed to an away market for execution under one specific routing strategy offered by the Exchange.
The Exchange has neither solicited nor received written comments on the proposed rule change.
Because the foregoing proposed rule change does not: (A) Significantly affect the protection of investors or the public interest; (B) impose any significant burden on competition; and (C) by its terms, become operative for 30 days from the date on which it was filed, or such shorter time as the Commission may designate, it has become effective pursuant to Section 19(b)(3)(A) of the Act
At any time within 60 days of the filing of the proposed rule change, the Commission summarily may temporarily suspend such rule change if it appears to the Commission that such action is: (1) Necessary or appropriate in the public interest; (2) for the protection of investors; or (3) otherwise in furtherance of the purposes of the Act. If the Commission takes such action, the Commission shall institute proceedings to determine whether the proposed rule should be approved or disapproved.
Interested persons are invited to submit written data, views, and arguments concerning the foregoing, including whether the proposed rule change is consistent with the Act.
• Use the Commission's Internet comment form (
• Send an email to
• Send paper comments in triplicate to Secretary, Securities and Exchange Commission, 100 F Street NE., Washington, DC 20549-1090.
For the Commission, by the Division of Trading and Markets, pursuant to delegated authority.
Pursuant to Section 19(b)(1) of the Securities Exchange Act of 1934 (the “Act”),
The Exchange filed a proposal to amend Exchange Rule 11.11(a) to describe when an order that includes a Short Sale instruction may be eligible for routing when a short sale price test restriction is in effect.
The text of the proposed rule change is available at the Exchange's Web site at
In its filing with the Commission, the Exchange included statements concerning the purpose of and basis for the proposed rule change and discussed any comments it received on the proposed rule change. The text of these statements may be examined at the places specified in Item IV below. The Exchange has prepared summaries, set forth in Sections A, B, and C below, of the most significant parts of such statements.
The Exchange proposes to amend Exchange Rule 11.11(a) to describe when an order that includes a Short Sale
Under Rule 11.11(a),
The Exchange proposes to specify in Rule 11.11(a) that orders that include a Short Sale instruction may be eligible for routing by the Exchange when a Short Sale Circuit Breaker is in effect where the User
Under Exchange Rule 11.11(a), orders that include a Short Sale instruction and a Time-in-Force of IOC that are not eligible for routing during a Short Sale Circuit Breaker will continue to be cancelled. For any other order that includes a Short Sale instruction that is ineligible for routing due to a Short Sale Circuit Breaker being in effect, the Exchange will continue to post the unfilled balance of the order to the EDGX Book, treat the order as if it included a Book Only or Post Only instruction, and subject it to the Re-Pricing Instructions to Comply with Rule 201 of Regulation SHO, as described in Rule 11.6(l)(2), unless the User has elected the order Cancel Back as described in Rule 11.6(b).
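The order handling just described amounts to a short decision sequence. The following Python sketch is offered only as an illustration of that sequence; the function and field names are hypothetical and are not taken from the Exchange's rules or systems.

    # Hypothetical illustration of the handling described above for orders with a
    # Short Sale instruction while a Short Sale Circuit Breaker is in effect.
    # Names are invented for this sketch; they are not Exchange rule text.
    def handle_short_sale_order(order, circuit_breaker_active):
        if not circuit_breaker_active:
            return "normal handling"
        if order.get("routing_strategy_elected"):
            return "eligible for routing to an away market"
        if order.get("time_in_force") == "IOC":
            return "cancelled"
        if order.get("cancel_back"):
            return "cancelled back (Rule 11.6(b))"
        # Otherwise the unfilled balance is posted to the book, treated as
        # Book Only / Post Only, and re-priced to comply with Rule 201 of
        # Regulation SHO (Rule 11.6(l)(2)).
        return "posted to book and re-priced"

    # Example: a non-IOC short sale order with no routing election is posted and re-priced.
    print(handle_short_sale_order({"time_in_force": "DAY"}, circuit_breaker_active=True))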
The Exchange believes that the proposed rule change is consistent with Section 6(b) of the Act
The Exchange does not believe that the proposed rule change will result in any burden on competition that is not necessary or appropriate in furtherance of the purposes of the Act. The Exchange is simply proposing to reflect in its rules that orders that include a Short Sale instruction may be routed to an away market for execution under two specific routing strategies offered by the Exchange.
The Exchange has neither solicited nor received written comments on the proposed rule change.
Because the foregoing proposed rule change does not: (A) Significantly affect the protection of investors or the public interest; (B) impose any significant burden on competition; and (C) by its terms, become operative for 30 days from the date on which it was filed, or such shorter time as the Commission may designate, it has become effective pursuant to Section 19(b)(3)(A) of the Act
At any time within 60 days of the filing of the proposed rule change, the Commission summarily may temporarily suspend such rule change if it appears to the Commission that such action is: (1) Necessary or appropriate in the public interest; (2) for the protection of investors; or (3) otherwise in furtherance of the purposes of the Act. If the Commission takes such action, the Commission shall institute proceedings to determine whether the proposed rule should be approved or disapproved.
Interested persons are invited to submit written data, views, and arguments concerning the foregoing, including whether the proposed rule change is consistent with the Act. Comments may be submitted by any of the following methods:
• Use the Commission's Internet comment form (
• Send an email to
• Send paper comments in triplicate to Secretary, Securities and Exchange Commission, 100 F Street NE., Washington, DC 20549-1090.
For the Commission, by the Division of Trading and Markets, pursuant to delegated authority.
Pursuant to Section 11A of the Securities Exchange Act of 1934 (“Act”)
OPRA proposes to amend footnotes 10 and 11 in the OPRA Fee Schedule to clarify the application of OPRA's “Non-Display Use” fees in certain respects.
OPRA proposes to eliminate the use of the term “datafeed” in footnotes 10 and 11. Some OPRA Vendors have argued that the use of the term “datafeed” in these footnotes provides a basis for saying that the Non-Display Use fees are not applicable to their downstream OPRA data recipients. That argument is based on a separate OPRA Policy entitled “Datafeeds.”
From OPRA's perspective, this is clearly incorrect. The Datafeeds Policy is directed to describing how OPRA data is
Nonetheless, OPRA recognizes that the use of the term “datafeed” in separate OPRA documents with
OPRA also proposes to add a sentence in footnote 10 to state that the Category 1 Non-Display Use Fee
OPRA is also proposing a separate change in footnote 10 to the OPRA Fee Schedule for a purpose relating to the administration of the Non-Display Use Fees. A few OPRA data recipients have tried to suggest that if a device is subject to the Professional Subscriber Device-Based Fees it is immune from Non-Display Use Fees, and that therefore by attaching a display monitor to a server an OPRA data recipient can avoid payment of Non-Display Use Fees even if the server is used for Non-Display Use of OPRA data. OPRA believes that this is clearly incorrect, and that this can be clearly seen in the first sentence of footnote 10 in its current form (“Non-Display Use refers to the accessing, processing or consuming . . . of OPRA market data . . .
OPRA proposes to add the word “Monthly” to the heading of the Non-Display Use Fee entry in its Fee Schedule, so that the entry reads “Monthly Non-Display Use Fees.” The absence of this word was recently brought to the attention of OPRA staff. The word is used in the other entries in OPRA's Fee Schedule that are for monthly fees, and OPRA believes that for clarity the word should be used in this entry as well.
OPRA does not anticipate any material increase in its revenues as a result of the changes described in this filing—indeed, on balance, OPRA may not experience any increase at all in its revenues as a result of the changes described in this filing. A few OPRA data recipients that have resisted payment of Non-Display Use fees on the basis of the assertion that they are not receiving the data through “datafeeds” will no longer be able to make that assertion, possibly resulting in a small increase in OPRA's revenues. On the other hand, there may be recipients of OPRA data that have been paying Category 1 Non-Display Use fees and that may no longer pay them as a result of the express exemption from Category 1 Non-Display Use fees for certain data recipients with a single UserID that use OPRA data for Category 1 Non-Display Use. OPRA believes that the change described in this filing to make more explicit that payment of Device-based Fees does not make Non-Display Use fees inapplicable will have no material effect on its revenues.
OPRA believes that the most important of these changes is the deletion of the term “datafeed” in the footnotes to its Fee Schedule, not because of its effect on OPRA revenues, but because of concerns expressed to OPRA staff by data recipients that have been paying the Non-Display Use fees
The text of the amendment to the OPRA Plan is available at OPRA, the Commission's Public Reference Room, the OPRA Web site at
Pursuant to paragraph (b)(3)(i) of Rule 608 of Regulation NMS under the Act, OPRA designated this amendment as establishing or changing fees or other charges collected on behalf of all of the OPRA Participant exchanges in connection with access to or use of OPRA facilities. OPRA proposes to implement the revisions in the Non-Display Use Fee footnotes that are described in this amendment on November 1, 2016. According to OPRA, implementation of the revisions as of that date will permit OPRA to provide persons that may be affected by these changes with thirty days' notice of the changes.
The Commission may summarily abrogate the amendment within sixty days of its filing and require refiling and approval of the amendment by Commission order pursuant to Rule 608(b)(2) under the Act
Interested persons are invited to submit written data, views, and arguments concerning the foregoing, including whether the OPRA Plan amendment is consistent with the Act. Comments may be submitted by any of the following methods:
• Use the Commission's Internet comment form (
• Send an email to
• Send paper comments in triplicate to Secretary, Securities and Exchange Commission, 100 F Street NE., Washington, DC 20549-1090.
By the Commission.
Pursuant to Section 11A of the Securities Exchange Act of 1934 (“Act”)
The primary purpose of the proposed Fee Schedule amendments is to specify OPRA's Professional Subscriber Device-Based Fee effective commencing January 1, 2017 and make conforming changes in OPRA's Enterprise Rate Professional Subscriber Fee. OPRA's Enterprise Rate Professional Subscriber Fee is available to those Professional Subscribers that elect that rate in place of the regular OPRA device-based fees.
Specifically, OPRA proposes, effective January 1, 2017, to: (1) Increase the current $29.50 monthly per device fee by $1.00; (2) increase the Enterprise Rate, currently a monthly fee of $29.50 times the number of a Professional Subscriber's U.S.-based registered representatives, to be a monthly fee of $30.50 times the number of the Subscriber's U.S.-based registered representatives; and (3) make conforming changes to the minimum monthly fee under the Enterprise Rate. "Professional Subscribers" are persons who subscribe to OPRA data, do not qualify for the reduced fees charged to "Nonprofessional Subscribers," and do not redistribute the OPRA data to third parties. OPRA permits the counting of "User IDs" as a surrogate for counting "devices" for purposes of its Professional Subscriber Device-based Fees.
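For illustration only, the following sketch shows how a Professional Subscriber's monthly charge would be computed under the two rate structures described above after the proposed increase. The function names are hypothetical, and the minimum monthly fee under the Enterprise Rate is left as a placeholder because the conforming amount is not restated in this notice.

    # Hypothetical illustration of the proposed January 1, 2017 Professional
    # Subscriber fees; this is not OPRA's billing logic.
    DEVICE_FEE_2017 = 29.50 + 1.00      # proposed $30.50 monthly per-device fee

    def device_based_monthly_fee(device_or_user_id_count):
        # Regular device-based fee (User IDs may be counted as a surrogate for devices).
        return DEVICE_FEE_2017 * device_or_user_id_count

    def enterprise_rate_monthly_fee(us_registered_reps, minimum_monthly_fee):
        # minimum_monthly_fee is a placeholder; the conforming minimum is not
        # restated in this notice.
        return max(DEVICE_FEE_2017 * us_registered_reps, minimum_monthly_fee)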
The number of devices reported to OPRA as subject to Professional Subscriber Device-Based Fees has been steadily trending downwards over many years. In 2008, OPRA received device-based fees, including enterprise fees, with respect to approximately 210,500 devices. In 2014, OPRA received device-based fees, including enterprise fees, with respect to approximately 148,400 devices, and in 2015 OPRA received device-based fees, including enterprise fees, with respect to approximately 141,300 devices. OPRA is receiving device-based fees in the third calendar quarter of 2016 with respect to approximately 135,500 devices—already a reduction of approximately 4.1% from 2015. OPRA believes that this long-term downward trend is the result of the increasing use of trading algorithms and automated trading platforms and other fundamental changes in the securities industry, and OPRA anticipates that this trend is likely to continue.
The proposed increase in the Professional Subscriber Device-Based Fees is consistent with OPRA's past practice of making incremental $1.00 increases in its monthly Professional Subscriber Device-Based Fees,
A secondary purpose of the proposed Fee Schedule amendments is to add the word “display” in the statements of the monthly Professional Subscriber Device-Based Fees for the periods commencing on January 1, 2016 and January 1, 2017. A few OPRA Professional Subscribers have asked whether, if a device is subject to the Professional Subscriber Device-Based Fees, it is therefore not subject to the OPRA Non-Display Use Fees, and suggested that a Subscriber could perhaps avoid payment of Non-Display Use Fees by attaching a display monitor to a server even if the server is being used for Non-Display Use of OPRA data. OPRA believes that this suggestion is not consistent even with the current wording of the Fee Schedule, but that the addition of the word “display” will make the wording clearer in this respect.
The proposed changes in the Policies with respect to Device-Based Fees are for a purpose similar to the purpose described above of adding the word “display” in the OPRA Fee Schedule, namely to avert misreading the Policies as saying that, if a Professional Subscriber is paying Device-Based Fees with respect to a device, the payment of the Device-Based Fees in and of itself is a sufficient basis for not paying Non-Display Use Fees even if the Non-Display Use Fees would otherwise be applicable. No Professional Subscriber has actually suggested such a reading to OPRA, and OPRA believes that the suggestion would be untenable even in terms of the current phrasing of the Policies, but OPRA believes that it is appropriate to revise the Policies to make clearer that the Device-based Fees may not be the only fees applicable to a particular device that receives OPRA data.
The text of the amendment to the OPRA Plan is available at OPRA, the Commission's Public Reference Room, the OPRA Web site at
Pursuant to paragraph (b)(3)(i) of Rule 608 of Regulation NMS under the Act, OPRA designated this amendment as establishing or changing fees or other charges collected on behalf of all of the OPRA participant exchanges in connection with access to or use of OPRA facilities. OPRA proposes to implement the changes in the Professional Subscriber Device-Based Fee on January 1, 2017. Implementation of the changes in the Professional Subscriber Device-Based Fee on January 1 is consistent with OPRA's prior practice with respect to changes in this fee, and will provide ample opportunity to give persons subject to this fee advance notice of the change. OPRA also proposes to implement the changes in the Policies with respect to Device-Based Fees immediately.
The Commission may summarily abrogate the amendment within sixty days of its filing and require refiling and approval of the amendment by Commission order pursuant to Rule 608(b)(2) under the Act
Interested persons are invited to submit written data, views, and arguments concerning the foregoing, including whether the OPRA Plan amendment is consistent with the Act. Comments may be submitted by any of the following methods:
• Use the Commission's Internet comment form (
• Send an email to
• Send paper comments in triplicate to Secretary, Securities and Exchange Commission, 100 F Street NE., Washington, DC 20549-1090.
By the Commission.
Pursuant to Section 19(b)(1) of the Securities Exchange Act of 1934 (the “Act”),
The Exchange filed a proposal to amend Exchange Rule 11.13(b)(1) to describe when an order marked as “short” may be eligible for routing when a short sale price test restriction is in effect.
The text of the proposed rule change is available at the Exchange's Web site at
In its filing with the Commission, the Exchange included statements concerning the purpose of and basis for the proposed rule change and discussed any comments it received on the proposed rule change. The text of these statements may be examined at the places specified in Item IV below. The Exchange has prepared summaries, set forth in Sections A, B, and C below, of the most significant parts of such statements.
The Exchange proposes to amend Exchange Rule 11.13(b)(1) to describe when an order to sell marked
Under Rule 11.13(b)(1), an order marked “short” when a short sale price test restriction is in effect is not eligible for routing by the Exchange. If an order is ineligible for routing due to a short sale price test restriction and such order is an Immediate or Cancel (“IOC”) Order
The Exchange proposes to specify in Rule 11.13 that orders marked “short” may be eligible for routing by the Exchange when a short sale price test restriction is in effect where the User
Under Exchange Rule 11.13(b)(1), IOC Orders marked “short” that are not eligible for routing during a short sale price test restriction will continue to be cancelled.
The Exchange believes that the proposed rule change is consistent with Section 6(b) of the Act
The Exchange does not believe that the proposed rule change will result in any burden on competition that is not necessary or appropriate in furtherance of the purposes of the Act. The Exchange is simply proposing to reflect in its rules that orders marked “short” may be routed to an away market for execution under two specific routing strategies offered by the Exchange.
Not applicable.
Because the foregoing proposed rule change does not: (A) Significantly affect the protection of investors or the public interest; (B) impose any significant burden on competition; and (C) by its terms, become operative for 30 days from the date on which it was filed, or such shorter time as the Commission may designate, it has become effective pursuant to Section 19(b)(3)(A) of the Act
At any time within 60 days of the filing of the proposed rule change, the Commission summarily may temporarily suspend such rule change if
Interested persons are invited to submit written data, views, and arguments concerning the foregoing, including whether the proposed rule change is consistent with the Act. Comments may be submitted by any of the following methods:
• Use the Commission's Internet comment form (
• Send an email to
• Send paper comments in triplicate to Secretary, Securities and Exchange Commission, 100 F Street NE., Washington, DC 20549-1090.
For the Commission, by the Division of Trading and Markets, pursuant to delegated authority.
Pursuant to Section 19(b)(1) of the Securities Exchange Act of 1934 (the “Act”),
The Exchange filed a proposal to amend Exchange Rule 11.11(a) to describe when an order that includes a Short Sale instruction may be eligible for routing when a short sale price test restriction is in effect.
The text of the proposed rule change is available at the Exchange's Web site at
In its filing with the Commission, the Exchange included statements concerning the purpose of and basis for the proposed rule change and discussed any comments it received on the proposed rule change. The text of these statements may be examined at the places specified in Item IV below. The Exchange has prepared summaries, set forth in Sections A, B, and C below, of the most significant parts of such statements.
The Exchange proposes to amend Exchange Rule 11.11(a) to describe when an order that includes a Short Sale
Under Rule 11.11(a), an order that includes a Short Sale instruction when a short sale price test restriction pursuant to Rule 201 of Regulation SHO is in effect is not eligible for routing by the Exchange. If an order is ineligible for routing due to a Short Sale Circuit Breaker
The Exchange proposes to specify in Rule 11.11(a) that orders that include a Short Sale instruction may be eligible for routing by the Exchange when a Short Sale Circuit Breaker is in effect where the User
Under Exchange Rule 11.11(a), orders that include a Short Sale instruction and a Time-in-Force of IOC that are not eligible for routing during a Short Sale Circuit Breaker will continue to be cancelled. For any other order that includes a Short Sale instruction that is ineligible for routing due to a Short Sale Circuit Breaker being in effect, the Exchange will continue to post the unfilled balance of the order to the EDGA Book, treat the order as if it included a Book Only or Post Only instruction, and subject it to the Re-Pricing Instructions to Comply with Rule 201 of Regulation SHO, as described in Rule 11.6(l)(2), unless the User has elected the order Cancel Back as described in Rule 11.6(b).
The Exchange believes that the proposed rule change is consistent with Section 6(b) of the Act
The Exchange does not believe that the proposed rule change will result in any burden on competition that is not necessary or appropriate in furtherance of the purposes of the Act. The Exchange is simply proposing to reflect in its rules that orders that include a Short Sale instruction may be routed to an away market for execution under one specific routing strategy offered by the Exchange.
Not applicable.
Because the foregoing proposed rule change does not: (A) Significantly affect the protection of investors or the public interest; (B) impose any significant burden on competition; and (C) by its terms, become operative for 30 days from the date on which it was filed, or such shorter time as the Commission may designate, it has become effective pursuant to Section 19(b)(3)(A) of the Act
At any time within 60 days of the filing of the proposed rule change, the Commission summarily may temporarily suspend such rule change if it appears to the Commission that such action is: (1) Necessary or appropriate in the public interest; (2) for the protection of investors; or (3) otherwise in furtherance of the purposes of the Act. If the Commission takes such action, the Commission shall institute proceedings to determine whether the proposed rule should be approved or disapproved.
Interested persons are invited to submit written data, views, and arguments concerning the foregoing, including whether the proposed rule change is consistent with the Act. Comments may be submitted by any of the following methods:
• Use the Commission's Internet comment form (
• Send an email to
• Send paper comments in triplicate to Secretary, Securities and Exchange Commission, 100 F Street NE., Washington, DC 20549-1090.
For the Commission, by the Division of Trading and Markets, pursuant to delegated authority.
Pursuant to Section 19(b)(1) of the Securities Exchange Act of 1934 (“Act”),
The Exchange proposes to add Commentary .14 to Rule 4770 (Compliance with Regulation NMS Plan to Implement a Tick Size Pilot) to provide the SEC with notice of its efforts to re-program its systems to eliminate a re-pricing functionality for certain orders in Test Group Three securities in connection with the Regulation NMS Plan to Implement a Tick Size Pilot Program (“Plan” or “Pilot”).
The text of the proposed rule change is set forth below. Proposed new language is in italics; deleted text is in brackets.
(a) through (d) No Change.
Commentary: .01-.12 No change.
.1[2]
In its filing with the Commission, the Exchange included statements concerning the purpose of and basis for the proposed rule change and discussed any comments it received on the proposed rule change. The text of these statements may be examined at the places specified in Item IV below. The Exchange has prepared summaries, set forth in sections A, B, and C below, of the most significant aspects of such statements.
On September 7, 2016, the Exchange filed with the Securities and Exchange Commission (“SEC” or “Commission”) a proposed rule change (“Proposal”) to adopt paragraph (d) to Exchange Rule 4770 to describe changes to system functionality necessary to implement the Plan. The Exchange also proposed amendments to Rule 4770(a) and (c) to clarify how the Trade-at exception may be satisfied. The SEC published the Proposal in the
In SR-BX-2016-050, BX had initially proposed a re-pricing functionality for Price to Comply Orders, Non-Displayed Orders, and Post-Only Orders entered through the OUCH and FLITE protocols in Group Three securities.
In that amendment, BX noted that this change would only impact the treatment of Price to Comply Orders, Non-Displayed Orders, and Post-Only orders that are submitted through the OUCH and FLITE protocols in Test Group Three Pilot Securities, as these types of Orders that are currently submitted to BX through the RASH or FIX protocols are already subject to this re-pricing functionality and will remain subject to this functionality under the Pilot.
In the Amendment, BX further noted that its systems are currently programmed so that Price to Comply Orders, Non-Displayed Orders and Post-Only Orders entered through the OUCH and FLITE protocols in Test Group Three Securities may be adjusted repeatedly to reflect changes to the NBBO and/or the best price on the BX book. BX stated that it is re-programming its systems to remove this functionality for Price to Comply Orders, Non-Displayed Orders and Post-Only Orders entered through the OUCH and FLITE protocols in Test Group Three Securities.
At this time, BX is still in the process of re-programming its systems to eliminate the re-pricing functionality in Test Group Three securities for Price to Comply Orders, Price to Display Orders, Non-Displayed Orders, and Post-Only Orders that are entered through the OUCH or FLITE protocols. BX anticipates that this re-programming shall be complete on or before October 31, 2016.
Therefore, the current treatment of Price to Comply Orders, Price to Display Orders, Non-Displayed Orders, and Post-Only Orders that are entered through the OUCH or FLITE protocols in Test Group Three securities shall be as follows:
Following entry, and if market conditions allow, a Price to Comply Order in a Test Group Three Pilot Security will be adjusted repeatedly in accordance with changes to the NBBO until such time as the Price to Comply Order is able to be ranked and displayed at its original entered limit price.
Following entry, and if market conditions allow, a Price to Display Order in a Test Group Three Pilot Security will be adjusted repeatedly in accordance with changes to the NBBO until such time as the Price to Display Order is able to be ranked and displayed at its original entered limit price.
Following entry, and if market conditions allow, a Non-Displayed Order in a Test Group Three Pilot Security will be adjusted repeatedly in accordance with changes to the NBBO up (down) to the Order's limit price.
Following entry, and if market conditions allow, a Post-Only Order in a Test Group Three Pilot Security will be adjusted repeatedly in accordance with changes to the NBBO or the best price on the BX Book, as applicable, until such time as the Post-Only Order is able to be ranked and displayed at its original entered limit price.
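As a minimal sketch of the "adjusted repeatedly" re-pricing behavior described above, the Python fragment below illustrates how a resting order could be re-priced on each NBBO change until it can rest at its original entered limit price. It is illustrative only: the names, the one-sided lock/cross check, and the omission of tick-increment and Trade-at details are simplifying assumptions, not BX system logic.

from dataclasses import dataclass


@dataclass
class PilotOrder:
    side: str            # "buy" or "sell"
    limit_price: float   # original entered limit price
    working_price: float = 0.0


def permissible_price(order: PilotOrder, nbbo_bid: float, nbbo_ask: float) -> float:
    """Most aggressive price at which the order may rest without crossing the
    protected quotation (simplified)."""
    if order.side == "buy":
        return min(order.limit_price, nbbo_ask)
    return max(order.limit_price, nbbo_bid)


def on_nbbo_update(order: PilotOrder, nbbo_bid: float, nbbo_ask: float) -> None:
    """Called on each NBBO change; re-prices the resting order toward its
    original entered limit price. Once working_price equals limit_price, the
    order is ranked (and, for displayed order types, displayed) at that price
    and no further adjustment occurs."""
    order.working_price = permissible_price(order, nbbo_bid, nbbo_ask)


# Example: a buy order limited at 10.05 is constrained while the offer is
# 10.00, then rests at its original limit once the NBBO moves up.
order = PilotOrder(side="buy", limit_price=10.05)
on_nbbo_update(order, nbbo_bid=9.95, nbbo_ask=10.00)   # working_price == 10.00
on_nbbo_update(order, nbbo_bid=10.05, nbbo_ask=10.10)  # working_price == 10.05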
In addition to this proposal, BX will also issue an Equity Trader Alert that describes the current operation of the BX systems in this regard, and the timing related to the removal of this re-pricing functionality.
BX also proposes to re-number Commentary .12, which relates to the Block Size exception, to Commentary .13. A previous filing (SR-BX-2016-048) added Commentary to Rule 4770 that resulted in Commentary .11, which addresses the effective date of the Rule, being re-numbered as Commentary .12. BX therefore proposes to re-number the Commentary .12 that addresses the Block Size exception as Commentary .13.
The Exchange believes that its proposal is consistent with Section 6(b) of the Act,
BX also believes that the proposal is consistent with the Act because the re-pricing functionality will not significantly impact the data gathered pursuant to the Pilot. BX notes that this re-pricing functionality only affects Price to Comply Orders, Price to Display Orders, Non-Displayed Orders, and Post-Only Orders that are entered through the OUCH or FLITE protocols for Test Group Three securities until the re-pricing functionality is eliminated, and only becomes relevant when an Order in a Test Group Three security would cross a Protected Quotation of another market center. BX has analyzed data relating to the frequency with which Orders in Test Group Three securities are entered with a limit price that would cross a Protected Quotation of another market center, and believes that the re-pricing functionality will be triggered infrequently once Test Group Three becomes operational.
The Exchange does not believe that the proposed rule change will impose any burden on competition not necessary or appropriate in furtherance of the purposes of the Act. The purpose of this proposal is to provide the SEC and market participants with notice of BX's efforts to remove its re-pricing functionality in Test Group Three securities for Price to Comply Orders, Price to Display Orders, Non-Displayed Orders, and Post-Only Orders that are entered through the OUCH or FLITE protocols, consistent with its statements in SR-BX-2016-050.
No written comments were either solicited or received.
Because the foregoing proposed rule change does not: (i) Significantly affect the protection of investors or the public interest; (ii) impose any significant burden on competition; and (iii) become operative for 30 days from the date on which it was filed, or such shorter time as the Commission may designate, it has become effective pursuant to Section 19(b)(3)(A)(iii) of the Act
A proposed rule change filed under Rule 19b-4(f)(6) normally does not become operative prior to 30 days after the date of filing. Rule 19b-4(f)(6)(iii), however, permits the Commission to designate a shorter time if such action is consistent with the protection of investors and the public interest. The Exchange requests that the Commission waive the 30-day operative delay contained in Rule 19b-4(f)(6)(iii) so that this proposed change will be operative as of October 17, 2016, the date that Test Group Three securities begin to be subject to the quoting and trading restrictions of the Plan and, therefore, the relevant language in Rule 4770.
The Commission believes that waiving the 30-day operative delay is consistent with the protection of investors and the public interest because it will allow the Exchange to implement the proposed rules immediately, thereby preventing delays in the implementation of the Plan. The Commission notes that the Pilot started implementation on October 3, 2016, Test Group Three securities were phased into the Pilot starting on October 17, 2016, and waiving the 30-day operative delay would ensure that the rules of the Exchange would be in place during implementation. Therefore, the Commission hereby waives the 30-day operative delay and designates the proposed rule change to be operative upon filing with the Commission.
At any time within 60 days of the filing of the proposed rule change, the Commission summarily may temporarily suspend such rule change if it appears to the Commission that such action is: (i) Necessary or appropriate in the public interest; (ii) for the protection of investors; or (iii) otherwise in furtherance of the purposes of the Act. If the Commission takes such action, the Commission shall institute proceedings to determine whether the proposed rule should be approved or disapproved.
Interested persons are invited to submit written data, views, and arguments concerning the foregoing, including whether the proposed rule change is consistent with the Act. Comments may be submitted by any of the following methods:
• Use the Commission's Internet comment form (
• Send an email to
• Send paper comments in triplicate to Secretary, Securities and Exchange Commission, 100 F Street NE., Washington, DC 20549-1090.
For the Commission, by the Division of Trading and Markets, pursuant to delegated authority.
Pursuant to Section 19(b)(1) of the Securities Exchange Act of 1934 (“Act”),
The Exchange proposes to add Commentary .14 to Rule 4770 (Compliance with Regulation NMS Plan to Implement a Tick Size Pilot) to provide the SEC with notice of its efforts to re-program its systems to eliminate a re-pricing functionality for certain orders in Test Group Three securities in connection with the Regulation NMS Plan to Implement a Tick Size Pilot Program (“Plan” or “Pilot”).
(a) through (d) No Change.
Commentary: .01-.12 No change.
.1[2]
In its filing with the Commission, the Exchange included statements concerning the purpose of and basis for the proposed rule change and discussed any comments it received on the proposed rule change. The text of these statements may be examined at the places specified in Item IV below. The Exchange has prepared summaries, set forth in sections A, B, and C below, of the most significant aspects of such statements.
On September 7, 2016, The Nasdaq Stock Market LLC (“Nasdaq” or “Exchange”) filed with the Securities and Exchange Commission (“SEC” or “Commission”) a proposed rule change (“Proposal”) to adopt paragraph (d) and Commentary .12 to Exchange Rule 4770 to describe changes to system functionality necessary to implement the Plan. The Exchange also proposed amendments to Rule 4770(a) and (c) to clarify how the Trade-at exception may be satisfied. The SEC published the Proposal in the
In SR-NASDAQ-2016-126, Nasdaq had initially proposed a re-pricing functionality for Price to Comply Orders, Non-Displayed Orders, and Post-Only Orders entered through the OUCH and FLITE protocols in Group Three securities.
In that amendment, Nasdaq noted that this change would only impact the treatment of Price to Comply Orders, Non-Displayed Orders, and Post-Only orders that are submitted through the OUCH and FLITE protocols in Test Group Three Pilot Securities, as these types of Orders that are currently submitted to Nasdaq through the RASH or FIX protocols are already subject to this re-pricing functionality and will remain subject to this functionality under the Pilot.
In the Amendment, Nasdaq further noted that its systems are currently programmed so that Price to Comply Orders, Non-Displayed Orders and Post-Only Orders entered through the OUCH and FLITE protocols in Test Group Three Securities may be adjusted repeatedly to reflect changes to the NBBO and/or the best price on the Nasdaq book. Nasdaq stated that it is re-programming its systems to remove this functionality for Price to Comply Orders, Non-Displayed Orders and Post-Only Orders entered through the OUCH and FLITE protocols in Test Group Three Securities.
At this time, Nasdaq is still in the process of re-programming its systems to eliminate the re-pricing functionality in Test Group Three securities for Price to Comply Orders, Price to Display Orders, Non-Displayed Orders, and Post-Only Orders that are entered through the OUCH or FLITE protocols. Nasdaq anticipates that this re-programming shall be complete on or before October 31, 2016.
Therefore, the current treatment of Price to Comply Orders, Price to Display Orders, Non-Displayed Orders, and Post-Only Orders that are entered through the OUCH or FLITE protocols in Test Group Three securities shall be as follows:
Following entry, and if market conditions allow, a Price to Comply Order in a Test Group Three Pilot Security will be adjusted repeatedly in accordance with changes to the NBBO until such time as the Price to Comply Order is able to be ranked and displayed at its original entered limit price.
Following entry, and if market conditions allow, a Price to Display Order in a Test Group Three Pilot Security will be adjusted repeatedly in accordance with changes to the NBBO until such time as the Price to Display Order is able to be ranked and displayed at its original entered limit price.
Following entry, and if market conditions allow, a Non-Displayed Order in a Test Group Three Pilot Security will be adjusted repeatedly in accordance with changes to the NBBO up (down) to the Order's limit price.
Following entry, and if market conditions allow, a Post-Only Order in a Test Group Three Pilot Security will be adjusted repeatedly in accordance with changes to the NBBO or the best price on the Nasdaq Book, as applicable, until such time as the Post-Only Order is able to be ranked and displayed at its original entered limit price.
In addition to this proposal, Nasdaq will also issue an Equity Trader Alert that describes the current operation of the Nasdaq systems in this regard, and the timing related to the removal of this re-pricing functionality.
Nasdaq also proposes to re-number Commentary .12, which relates to the Block Size exception, to Commentary .13. A previous filing (SR-NASDAQ-2016-123) added Commentary to Rule 4770 that resulted in Commentary .11, which addresses the effective date of the Rule, being re-numbered as Commentary .12. Nasdaq therefore proposes to re-number the Commentary .12 that addresses the Block Size exception as Commentary .13.
The Exchange believes that its proposal is consistent with Section 6(b) of the Act,
Nasdaq also believes that the proposal is consistent with the Act because the re-pricing functionality will not significantly impact the data gathered pursuant to the Pilot. Nasdaq notes that this re-pricing functionality only affects Price to Comply Orders, Price to Display Orders, Non-Displayed Orders, and Post-Only Orders that are entered through the OUCH or FLITE protocols for Test Group Three securities until the re-pricing functionality is eliminated, and only becomes relevant when an Order in a Test Group Three security would cross a Protected Quotation of another market center. Nasdaq has analyzed data relating to the frequency with which Orders in Test Group Three securities are entered with a limit price that would cross a Protected Quotation of another market center, and believes that the re-pricing functionality will be triggered infrequently once Test Group Three becomes operational.
The Exchange does not believe that the proposed rule change will impose any burden on competition not necessary or appropriate in furtherance of the purposes of the Act. The purpose of this proposal is to provide the SEC and market participants with notice of Nasdaq's efforts to remove its re-pricing functionality in Test Group Three securities for Price to Comply Orders, Price to Display Orders, Non-Displayed Orders, and Post-Only Orders that are entered through the OUCH or FLITE protocols, consistent with its statements in SR-NASDAQ-2016-126.
No written comments were either solicited or received.
Because the foregoing proposed rule change does not: (i) Significantly affect the protection of investors or the public interest; (ii) impose any significant burden on competition; and (iii) become operative for 30 days from the date on which it was filed, or such shorter time as the Commission may designate, it has become effective pursuant to Section 19(b)(3)(A)(iii) of the Act
A proposed rule change filed under Rule 19b-4(f)(6) normally does not become operative prior to 30 days after the date of filing. Rule 19b-4(f)(6)(iii), however, permits the Commission to designate a shorter time if such action is consistent with the protection of investors and the public interest. The Exchange requests that the Commission waive the 30-day operative delay contained in Rule 19b-4(f)(6)(iii) so that this proposed change will be operative as of October 17, 2016, the date that Test Group Three securities begin to be subject to the quoting and trading restrictions of the Plan and, therefore, the relevant language in Rule 4770.
The Commission believes that waiving the 30-day operative delay is consistent with the protection of investors and the public interest because it will allow the Exchange to implement the proposed rules immediately, thereby preventing delays in the implementation of the Plan. The Commission notes that the Pilot started implementation on October 3, 2016, Test Group Three securities were phased into the Pilot starting on October 17, 2016, and waiving the 30-day operative delay would ensure that the rules of the Exchange would be in place during implementation. Therefore, the Commission hereby waives the 30-day operative delay and designates the proposed rule change to be operative upon filing with the Commission.
At any time within 60 days of the filing of the proposed rule change, the Commission summarily may temporarily suspend such rule change if it appears to the Commission that such action is: (i) Necessary or appropriate in the public interest; (ii) for the protection of investors; or (iii) otherwise in furtherance of the purposes of the Act. If the Commission takes such action, the Commission shall institute proceedings to determine whether the proposed rule should be approved or disapproved.
Interested persons are invited to submit written data, views, and arguments concerning the foregoing, including whether the proposed rule change is consistent with the Act. Comments may be submitted by any of the following methods:
• Use the Commission's Internet comment form (
• Send an email to
• Send paper comments in triplicate to Secretary, Securities and Exchange Commission, 100 F Street NE., Washington, DC 20549-1090.
For the Commission, by the Division of Trading and Markets, pursuant to delegated authority.
Pursuant to Section 19(b)(1) of the Securities Exchange Act of 1934 (“Act”),
The Exchange proposes to add Commentary .14 to Rule 3317 (Compliance with Regulation NMS Plan to Implement a Tick Size Pilot) to provide the SEC with notice of its efforts to re-program its systems to eliminate a re-pricing functionality for certain orders in Test Group Three securities in connection with the Regulation NMS Plan to Implement a Tick Size Pilot Program (“Plan” or “Pilot”).
The text of the proposed rule change is set forth below. Proposed new language is in italics; deleted text is in brackets.
(a) through (d) No Change.
.1[2]
In its filing with the Commission, the Exchange included statements concerning the purpose of and basis for the proposed rule change and discussed any comments it received on the proposed rule change. The text of these statements may be examined at the places specified in Item IV below. The Exchange has prepared summaries, set forth in sections A, B, and C below, of the most significant aspects of such statements.
On September 7, 2016, the Exchange filed with the Securities and Exchange Commission (“SEC” or “Commission”) a proposed rule change (“Proposal”) to adopt paragraph (d) and Commentary .12 to Exchange Rule 3317 to describe changes to system functionality necessary to implement the Plan. The Exchange also proposed amendments to Rule 3317(a) and (c) to clarify how the Trade-at exception may be satisfied. The SEC published the Proposal in the
In SR-Phlx-2016-92, Phlx had initially proposed a re-pricing functionality for Price to Comply Orders, Non-Displayed Orders, and Post-Only Orders entered through the OUCH and FLITE protocols in Group Three securities.
In that amendment, Phlx noted that this change would only impact the treatment of Price to Comply Orders, Non-Displayed Orders, and Post-Only orders that are submitted through the OUCH and FLITE protocols in Test Group Three Pilot Securities, as these types of Orders that are currently submitted to Phlx through the RASH or FIX protocols are already subject to this re-pricing functionality and will remain subject to this functionality under the Pilot.
In the Amendment, Phlx further noted that its systems are currently programmed so that Price to Comply Orders, Non-Displayed Orders and Post-Only Orders entered through the OUCH and FLITE protocols in Test Group Three Securities may be adjusted repeatedly to reflect changes to the NBBO and/or the best price on the Phlx book. Phlx stated that it is re-programming its systems to remove this functionality for Price to Comply Orders, Non-Displayed Orders and Post-Only Orders entered through the OUCH and FLITE protocols in Test Group Three Securities.
At this time, Phlx is still in the process of re-programming its systems to eliminate the re-pricing functionality in Test Group Three securities for Price to Comply Orders, Non-Displayed Orders, and Post-Only Orders that are entered through the OUCH or FLITE protocols. Phlx anticipates that this re-programming shall be complete on or before October 31, 2016.
Therefore, the current treatment of Price to Comply Orders, Price to Display Orders, Non-Displayed Orders, and Post-Only Orders that are entered through the OUCH or FLITE protocols in Test Group Three securities shall be as follows:
Following entry, and if market conditions allow, a Price to Comply Order in a Test Group Three Pilot Security will be adjusted repeatedly in accordance with changes to the NBBO until such time as the Price to Comply Order is able to be ranked and displayed at its original entered limit price.
Following entry, and if market conditions allow, a Price to Display Order in a Test Group Three Pilot Security will be adjusted repeatedly in accordance with changes to the NBBO until such time as the Price to Display Order is able to be ranked and displayed at its original entered limit price.
Following entry, and if market conditions allow, a Non-Displayed Order in a Test Group Three Pilot Security will be adjusted repeatedly in accordance with changes to the NBBO up (down) to the Order's limit price.
Following entry, and if market conditions allow, a Post-Only Order in a Test Group Three Pilot Security will be adjusted repeatedly in accordance with changes to the NBBO or the best price on the Phlx Book, as applicable, until such time as the Post-Only Order is able to be ranked and displayed at its original entered limit price.
In addition to this proposal, Phlx will also issue an Equity Trader Alert that describes the current operation of the Phlx systems in this regard, and the timing related to the removal of this re-pricing functionality.
Phlx also proposes to re-number Commentary .12, which relates to the Block Size exception, to Commentary .13. A previous filing (SR-Phlx-2016-90) added Commentary to Rule 3317 that resulted in Commentary .11, which addresses the effective date of the Rule, being re-numbered as Commentary .12. Phlx therefore proposes to re-number the Commentary .12 that addresses the Block Size exception as Commentary .13.
The Exchange believes that its proposal is consistent with Section 6(b) of the Act,
Phlx also believes that the proposal is consistent with the Act because the re-pricing functionality will not significantly impact the data gathered pursuant to the Pilot. Phlx notes that this re-pricing functionality only affects Price to Comply Orders, Price to Display Orders, Non-Displayed Orders, and Post-Only Orders that are entered through the OUCH or FLITE protocols for Test Group Three securities until the re-pricing functionality is eliminated, and only becomes relevant when an Order in a Test Group Three security would cross a Protected Quotation of another market center. Phlx has analyzed data relating to the frequency with which Orders in Test Group Three securities are entered with a limit price that would cross a Protected Quotation of another market center, and believes that the re-pricing functionality will be triggered infrequently once Test Group Three becomes operational.
The Exchange does not believe that the proposed rule change will impose any burden on competition not necessary or appropriate in furtherance of the purposes of the Act. The purpose of this proposal is to provide the SEC and market participants with notice of Phlx's efforts to remove its re-pricing functionality in Test Group Three securities for Price to Comply Orders, Price to Display Orders, Non-Displayed Orders, and Post-Only Orders that are entered through the OUCH or FLITE protocols, consistent with its statements in SR-Phlx-2016-92.
No written comments were either solicited or received.
Because the foregoing proposed rule change does not: (i) Significantly affect the protection of investors or the public interest; (ii) impose any significant burden on competition; and (iii) become operative for 30 days from the date on which it was filed, or such shorter time as the Commission may designate, it has become effective pursuant to Section 19(b)(3)(A)(iii) of the Act
A proposed rule change filed under Rule 19b-4(f)(6) normally does not become operative prior to 30 days after the date of filing. Rule 19b-4(f)(6)(iii), however, permits the Commission to designate a shorter time if such action is consistent with the protection of investors and the public interest. The Exchange requests that the Commission waive the 30-day operative delay contained in Rule 19b-4(f)(6)(iii) so that this proposed change will be operative as of October 17, 2016, the date that Test Group Three securities begin to be subject to the quoting and trading restrictions of the Plan and, therefore, the relevant language in Rule 3317.
The Commission believes that waiving the 30-day operative delay is consistent with the protection of investors and the public interest because it will allow the Exchange to implement the proposed rules immediately, thereby preventing delays in the implementation of the Plan. The Commission notes that the Pilot started implementation on October 3, 2016, Test Group Three securities were phased into the Pilot starting on October 17, 2016, and waiving the 30-day operative delay would ensure that the rules of the Exchange would be in place during implementation. Therefore, the Commission hereby waives the 30-day operative delay and designates the proposed rule change to be operative upon filing with the Commission.
At any time within 60 days of the filing of the proposed rule change, the Commission summarily may temporarily suspend such rule change if it appears to the Commission that such action is: (i) Necessary or appropriate in the public interest; (ii) for the protection of investors; or (iii) otherwise in furtherance of the purposes of the Act. If the Commission takes such action, the Commission shall institute proceedings to determine whether the proposed rule should be approved or disapproved.
Interested persons are invited to submit written data, views, and arguments concerning the foregoing, including whether the proposed rule change is consistent with the Act. Comments may be submitted by any of the following methods:
• Use the Commission's Internet comment form (
• Send an email to
For the Commission, by the Division of Trading and Markets, pursuant to delegated authority.
Federal Aviation Administration (FAA), DOT.
Notice.
This notice contains a summary of a petition seeking relief from specified requirements of Title 14 of the Code of Federal Regulations. The purpose of this notice is to improve the public's awareness of, and participation in, the FAA's exemption process. Neither publication of this notice nor the inclusion or omission of information in the summary is intended to affect the legal status of the petition or its final disposition.
Comments on this petition must identify the petition docket number and must be received on or before November 21, 2016.
Send comments identified by docket number FAA-2016-8684:
•
•
•
•
Christopher Morris, Office of Rulemaking, Federal Aviation Administration, 800 Independence Ave. SW., Washington, DC 20591; (202) 267-4418;
This notice is published pursuant to 14 CFR 11.85.
Federal Aviation Administration (FAA), DOT.
Notice.
This notice contains a summary of a petition seeking relief from specified requirements of Title 14 of the Code of Federal Regulations. The purpose of this notice is to improve the public's awareness of, and participation in, the FAA's exemption process. Neither publication of this notice nor the inclusion or omission of information in the summary is intended to affect the legal status of the petition or its final disposition.
Comments on this petition must identify the petition docket number and must be received on or before November 21, 2016.
Send comments identified by docket number FAA-2016-5027 using any of the following methods:
•
•
•
•
Dale Williams (202) 267-4179, Office of Rulemaking, Federal Aviation Administration, 800 Independence Avenue SW., Washington, DC 20591.
This notice is published pursuant to 14 CFR 11.85.
Pentastar operates under part 135 as an eligible on-demand air carrier under Certificate No. UG8A235J, which was initially issued in 1999. Pentastar currently operates eight turbojet aircraft and one Cessna 172, all of which are leased. Pentastar retains responsibility for all maintenance of the aircraft on its part 135 certificate through its part 145 repair station (No. BTVR626C). None of these aircraft is operated under part 91K; however, all aircraft are operated under part 91 by their owners when not in use by Pentastar. Pentastar is currently in compliance with § 135.25(b) and (c) by leasing the Cessna 172. Pentastar maintains this aircraft on its part 135 operations specifications although it has never utilized this aircraft for on-demand operations. The petitioner contends that a grant of exemption would provide an equivalent level of safety because all the aircraft utilized in actual on-demand operations under its part 135 certificate would remain the same.
Federal Railroad Administration (FRA), Department of Transportation.
Announcement of Railroad Safety Advisory Committee (RSAC) meeting.
FRA announces the fifty-seventh meeting of the RSAC, a Federal Advisory Committee that develops railroad safety regulations through a consensus process. The RSAC meeting agenda topics will include: Opening remarks from the FRA Administrator and the FRA Associate Administrator for Railroad Safety and Chief Safety Officer; status reports from the Remote Control Locomotive, Track Standards, Hazardous Materials Issues, Rail Integrity Working Groups, and the Engineering Task Force; an informational presentation on the high-speed passenger rail equipment (Tier III) rulemaking; and an update on the status of Positive Train Control implementation. This agenda is subject to change, including possibly adding further proposed tasks.
The RSAC meeting is scheduled to commence at 9:30 a.m. on Thursday, January 26, 2017, and will adjourn by 4:30 p.m.
The RSAC meeting will be held at the National Association of Home Builders, National Housing Center, located at 1201 15th Street NW., Washington, DC 20005. The meeting is open to the public on a first-come, first-served basis, and is accessible to individuals with disabilities. Sign and oral interpretation can be made available if requested 10 calendar days before the meeting.
Kenton Kilgore, RSAC Administrative
Under Section 10(a)(2) of the Federal Advisory Committee Act (Pub. L. 92-463), FRA is giving notice of a meeting of the RSAC. The RSAC was established to provide advice and recommendations to FRA on railroad safety matters. The RSAC is composed of 59 voting representatives from 38 member organizations, representing various rail industry perspectives. In addition, there are non-voting advisory representatives from the agencies with railroad safety regulatory responsibility in Canada and Mexico, the National Transportation Safety Board, and the Federal Transit Administration. The diversity of the RSAC ensures the requisite range of views and expertise necessary to discharge its responsibilities. See the RSAC Web site for details on prior RSAC activities and pending tasks at
Federal Railroad Administration (FRA), U.S. Department of Transportation (DOT).
Notice of availability and request for comments.
This document provides the public with notice that on August 12, 2016, Norfolk Southern Railway Company (NS) submitted to FRA its Positive Train Control Safety Plan (PTCSP) Version 1.0, dated August 12, 2016, on FRA's Secure Information Repository (SIR) site. NS asks FRA to approve its PTCSP and issue a Positive Train Control System Certification for NS's Interoperable Electronic Train Management System (I-ETMS), under 49 CFR part 236, subpart I.
FRA will consider communications received by November 30, 2016, before taking final action on the PTCSP. FRA may consider comments received after that date if practicable.
All communications concerning this proceeding should identify Docket Number FRA 2010-0060 and may be submitted by any of the following methods:
•
•
•
•
Dr. Mark Hartong, Senior Scientific Technical Advisor, at (202) 493-1332 or
In its PTCSP, NS states the I-ETMS system it is implementing is designed as a vital overlay PTC system as defined in 49 CFR 236.1015(e)(2). The PTCSP describes NS's I-ETMS implementation and the associated I-ETMS safety processes, safety analyses, and test, validation, and verification processes used during the development of I-ETMS. The PTCSP also contains NS's operational and support requirements and procedures.
NS's PTCSP and the accompanying request for approval and system certification are available for review online at
Interested parties may comment on the PTCSP by submitting written comments or data. During its review of the PTCSP, FRA will consider any comments or data submitted. However, FRA may elect not to respond to any particular comment and, under 49 CFR 236.1009(d)(3), FRA maintains the authority to approve or disapprove the PTCSP at its sole discretion. FRA does not anticipate scheduling a public hearing regarding NS's PTCSP because the circumstances do not appear to warrant a hearing. If any interested party desires an opportunity for oral comment, the party should notify FRA in writing before the end of the comment period and specify the basis for his or her request.
Anyone can search the electronic form of any written communications and comments received into any of our dockets by the name of the individual submitting the comment (or signing the document, if submitted on behalf of an association, business, labor union, etc.). Under 49 CFR 211.3, FRA solicits comments from the public to better inform its decisions. DOT posts these comments, without edit, including any personal information the commenter provides, to
Federal Transit Administration, DOT.
Pilot Program for Transit-Oriented Development Planning Announcement of Project Selections.
The U.S. Department of Transportation's Federal Transit Administration (FTA) announces the selection of projects with Fiscal Year (FY) 2015 and FY 2016 appropriations for the Pilot Program for Transit-Oriented Development Planning (TOD Pilot Program), as authorized by the Moving Ahead for Progress in the 21st Century Act (MAP-21) with additional funding provided by the Fixing America's Surface Transportation (FAST) Act. On April 14, 2016, FTA published a Notice of Funding
Successful applicants should contact the appropriate FTA Regional Office for information regarding applying for the funds. For program-specific information, applicants may contact Benjamin Owen, FTA Office of Planning and Environment, at (202) 366-5602 or
In response to the NOFO, FTA received 20 proposals from 17 states requesting $17.6 million in Federal funds. Project proposals were evaluated based on each applicant's responsiveness to the program evaluation criteria as detailed in the NOFO. Two of the 20 projects were deemed ineligible to receive funds because they did not meet the eligibility requirements described in the NOFO. A further two were associated with transit projects for which the risk of major scope changes could render the proposed comprehensive planning efforts moot. Those projects would be eligible to reapply to a future TOD Pilot Program funding opportunity if and when the uncertainties are resolved. FTA has selected 16 projects as shown in Table I for a total of $14.7 million.
Recipients selected for competitive funding should work with their FTA Regional Office to finalize the grant application in FTA's Transit Award Management System (TrAMS) for the projects identified in the attached table to quickly obligate funds. Grant applications must include eligible activities applied for in the original project application. Funds must be used consistent with the submitted proposal and for the eligible planning purposes established in the NOFO. Recipients are reminded that program requirements such as local match can be found in the NOFO. A discretionary project identification number has been assigned to each project for tracking purposes and must be used in the TrAMS application.
Selected projects are eligible to incur costs under pre-award authority no earlier than the date projects were publicly announced, October 11, 2016. Pre-award authority does not guarantee that project expenses incurred prior to the award of a grant will be eligible for reimbursement, as eligibility for reimbursement is contingent upon all applicable requirements having been met. For more about FTA's policy on pre-award authority, please see the FTA Fiscal Year 2016 Apportionments, Allocations, and Program Information and Interim Guidance found in 81 FR 7893 (February 16, 2016). Post-award reporting requirements include submission of the Federal Financial Report and Milestone progress reports in TrAMS as appropriate (see Grant Management Requirements FTA.C.5010.1D and Program Guidance for Metropolitan Planning and State Planning and Research Program Grants C.8100.1C). Recipients must comply with all applicable Federal statutes, regulations, executive orders, FTA circulars, and other Federal requirements in carrying out the project supported by the FTA grant. FTA emphasizes that grantees must follow all third-party procurement guidance, as described in FTA.C.4220.1F. Funds allocated in this announcement must be obligated in a grant by September 30, 2017.
Office of the Secretary, Department of Transportation.
Notice of seventh public meeting of advisory committee.
This notice announces the seventh meeting of the Advisory Committee on Accessible Air Transportation (ACCESS Advisory Committee).
The seventh meeting of the ACCESS Advisory Committee will be held on November 2, 2016, from 9:00 a.m. to 5:00 p.m., Eastern Daylight Time.
The meeting will be held at Hilton Arlington, 950 N. Stafford St., Arlington, VA 22203. Attendance is open to the public up to the room's capacity of 150 attendees. Since space is limited, any member of the general public who plans to attend this meeting must notify the registration contact identified below no later than October 31, 2016.
To register to attend the meeting, please contact Kyle Ilgenfritz (
The seventh meeting of the ACCESS Advisory Committee will be held on November 2, from 9:00 a.m. to 5:00 p.m., Eastern Daylight Time. The meeting will be held at Hilton Arlington, 950 N. Stafford St., Arlington, VA 22203. At the meeting, the ACCESS Advisory Committee will continue to address whether to require accessible inflight entertainment (IFE) and strengthen accessibility requirements for other in-flight communications. We expect to negotiate and vote on proposals to amend the Department's disability regulation regarding this issue. Prior to the meeting, the agenda will be available on the ACCESS Advisory Committee's Web site,
The meeting will be open to the public. Attendance will be limited by the size of the meeting room (maximum 150 attendees). Because space is limited, we ask that any member of the public who plans to attend the meeting notify the registration contact, Kyle Ilgenfritz (
Members of the public may submit written comments on the topics to be considered during the meeting by October 31, 2016, to FDMC, Docket Number DOT-OST-2015-0246. You may submit your comments and material online or by fax, mail, or hand delivery, but please use only one of these means. DOT recommends that you include your name and a mailing address, an email address, or a phone number in the body of your document so that DOT can contact you if there are questions regarding your submission.
To submit your comment online, go to
To view comments and any documents mentioned in this preamble as being available in the docket, go to
The ACCESS Advisory Committee is established by charter in accordance with the Federal Advisory Committee Act (FACA), 5 U.S.C. App. 2. Secretary of Transportation Anthony Foxx approved the ACCESS Advisory Committee charter on April 6, 2016. The committee's charter sets forth policies for the operation of the advisory committee and is available on the Department's Web site at
In accordance with 5 U.S.C. 553(c), DOT solicits comments from the public to better inform its rulemaking process. DOT posts these comments, without edit, including any personal information the commenter provides, to
Notice of this meeting is being provided in accordance with the Federal Advisory Committee Act and the General Services Administration regulations covering management of Federal advisory committees.
Office of the Secretary (OST), Department of Transportation (DOT).
Request for Information (RFI).
The industry group for travel sites, a number of its members (which include online travel booking sites), and certain members of Congress have expressed concerns to the Department of Transportation (DOT or Department) regarding airline restrictions on the distribution and display of airline flight schedule, fare, and availability information (“flight information”). Specifically, concerns were raised about practices by some airlines to restrict the distribution and/or display of flight information by certain online travel agencies (OTAs), metasearch entities that operate flight search tools, and other stakeholders involved in the distribution of flight information and the sale of air transportation. Airlines state that it is important for them to maintain control over the display and distribution of airline flight information, while OTAs and metasearch entities that operate flight search tools state that actions taken by airlines to restrict the distribution or display of flight information are anticompetitive and harm consumers.
The Department is interested in learning more about this issue. Pursuant to the Department's aviation consumer protection authority, we are requesting information on whether airline restrictions on the distribution or display of airline flight information harm consumers and constitute an unfair and deceptive business practice and/or an unfair method of competition. The Department is also requesting information on whether any entities are blocking access to critical resources needed for competitive entry into the air transportation industry. Finally, we are requesting information on whether Department action is unnecessary or whether Department action in these areas would promote a more competitive air transportation marketplace or help ensure that consumers have access to the information needed to make informed air transportation choices.
Responses should be filed by December 30, 2016.
You may file responses identified by the docket number DOT-OST-2016-0204 by any of the following methods:
•
•
•
•
Kyle-Etienne Joseph, Trial Attorney, or Kimberly Graber, Chief, Consumer Protection and Competition Law Branch, Office of the Assistant General Counsel for Aviation Enforcement and Proceedings, U.S. Department of Transportation, 1200 New Jersey Ave. SE., Washington, DC 20590, 202-366-9342, 202-366-7152 (fax),
Various entities have raised concerns to the Department regarding airlines restricting the distribution or display of information on their flights. We initially became aware of the issue in connection with certain airlines placing restrictions on flight information being displayed by metasearch sites that operate flight search tools. In a proposed rule, the Transparency of Airline Ancillary Fees and Other Consumer Protection Issues (“Consumer Rule III NPRM”), the Department sought information relating to a wide variety of distribution issues including information about the relationships between entities involved in the distribution of air transportation information. 79 FR 29974 (May 23, 2014). In the Consumer Rule III NPRM, the Department posed questions related to airline restrictions on the display of flight schedule, fare, and availability
While the rulemaking was pending, representatives of certain OTAs and representatives of metasearch sites focused on travel, and their outside counsel, met with Department representatives and urged the Department to consider taking action. Those entities stated that airlines that restrict distribution of airline fare, schedule, and availability information to metasearch sites are engaging in unfair practices and unfair methods of competition. They further stated that they were focused on enforcement action or industry guidance rather than rulemaking. See Docket item DOT-OST-2014-0056-0776.
Subsequently, many questions and concerns have been raised with the Department by members of Congress as well as various stakeholders regarding airline restrictions on the distribution and display of flight information by third parties. The Department met with representatives from OTAs, metasearch entities, airlines, and other industry stakeholders to learn about the issue and how airline decisions to place restrictions on the distribution and display of airline flight information may impact both consumers and the broader air transportation industry. The Department wanted to understand whether the issue of primary concern to industry stakeholders was (1) airlines refusing to provide any flight information to non-airline entities such as an OTA or metasearch entity; (2) airlines providing flight information to non-airline entities but placing restrictions on how that information is displayed; or (3) airlines providing flight information to an OTA but restricting the OTA from distributing that information to a metasearch entity that operates a flight search tool but does not itself sell tickets. In addition, the Department wanted to understand the impact on consumers.
In meetings with representatives of airlines and online travel entities, the Department asked about the restrictions and why some airlines are restricting some OTAs, metasearch entities that operate flight search tools, or other industry stakeholders from accessing flight information or from distributing and displaying flight information. The Department also asked how such restrictions may impact consumers who use OTA and metasearch Web sites to research and book air travel.
The Department learned that some airlines have issued cease and desist letters to some OTAs demanding that these companies stop distributing airline flight information to some metasearch entities that operate flight search tools or have included language in their contracts with OTAs prohibiting them from sharing airline flight information with any metasearch entity that has not been approved by the airline. Additionally, some airlines have issued letters to metasearch entities operating flight search tools demanding that these companies stop displaying the airline's flight information or limiting how the entities display the airline's flight information on their flight search tools.
Some airlines have explained that such actions are because there are certain Web sites marketing air transportation operated by entities with which the airline does not want to be associated because the entities provide inaccurate or incomplete information, or provide poor customer service. Additionally, certain airlines have alleged that some of these entities may have engaged in fraud. Further, several airlines have stated that they wish to control how the information regarding their flights is distributed so that the airline can market services the way it chooses, through the outlets it chooses. Some airlines also state that controlling the outlets through which information on their flights is distributed helps control their distribution costs.
Historically, competition in airline distribution has contributed to technological and retail innovation that has benefited both industry stakeholders and business and leisure air travelers and further enhanced airline competition. Meanwhile, airlines and ticket agents have had commercial incentives to display airline information to consumers as widely as possible. Generally, market forces should ensure that airlines will continue to display their fares in the outlets where consumers want to find them and that those same market forces would then result in airlines accruing the commercial benefits of displaying their services in as many reputable outlets as possible. However, some stakeholders have argued that the marketplace is no longer balanced and consumers are being harmed, so the Department should not rely on market forces to resolve these distribution and display issues.
On April 15, 2016, the White House issued Executive Order 13725: Steps to Increase Competition and Better Inform Consumers and Workers to Support Continued Growth of the American Economy (the “Executive Order”). The Executive Order expresses the importance of a fair, efficient, and competitive marketplace and notes that consumers need both competitive markets and information to make informed choices. The Department shares the goal of ensuring consumers are provided with information they need to make informed choices. In particular, as directed in the Executive Order, the DOT wants to identify any specific practices in connection with air transportation, such as blocking access to critical resources, that may impede informed consumer choice or unduly stifle new market entrants and determine whether the Department can potentially address those practices in appropriate instances. The issues raised in connection with airlines restricting ticket agents' ability to distribute or display flight information may potentially create the type of undue burdens on competition that the Executive Order has directed agencies to address. However, the Department needs to learn more about the issue to understand whether Department action is appropriate.
Under 49 U.S.C. 41712, the Department has authority to prevent unfair or deceptive practices and unfair methods of competition. Certain OTAs and metasearch entities have stated that airline restrictions on the distribution and display of flight information amount to an unfair, deceptive, or anticompetitive practice that harms consumers and an unfair method of competition, and that the Department therefore has authority to act under 49 U.S.C. 41712. Meanwhile, airlines have stated that the manner in which they distribute their fare, schedule, and availability information is a private contractual matter between airlines and third parties. Airlines further contend that they have the right to determine who they do business with and where and when their content is displayed. They state that the Department has no role in this issue because airlines are not engaging in any unfair or deceptive practices or unfair methods of competition.
The Department also is mandated to encourage and enhance consumer welfare through the benefits of a deregulated, competitive air transportation industry under the Airline Deregulation Act of 1978. The Department places maximum reliance on competitive market forces and on
Accordingly, the Department is requesting information on whether any entities are blocking access to critical resources needed for competitive entry into the air transportation industry, whether Department action in this area would promote or hinder a more competitive air transportation marketplace, or whether Department action would help ensure that consumers have access to the information needed to make informed air transportation choices.
The distribution of airline flight information is a complicated process that involves a number of industry stakeholders, but for consumers it is currently relatively simple to obtain flight information from airline Web sites and to find and compare flight information on online travel entity Web sites. Consumers routinely book air transportation through direct and indirect (non-airline) channels, including through Web sites that operate flight search tools that either lead consumers directly to airline Web sites or to an OTA with the authority to book tickets on behalf of an airline.
Airlines make flight information available through their own channels, such as airline Web sites, call centers, and airport agents, as well as outlets that range from traditional “brick and mortar” travel agents and corporate travel agencies to OTAs. Although airlines with sufficient market presence and high load factors may have incentives to limit the outlets through which their fares are displayed, airlines are generally motivated to ensure their flight information is widely available to increase consumer exposure and generate sales. Historically, the most efficient and cost-effective way for airlines to distribute flight information was to provide it to entities that consolidated the information of multiple airlines and made it available to interested parties. Accordingly, airlines have in the past provided information on their flights with few or no contract restrictions on the redistribution of flight information.
Industry participants, such as travel agents and metasearch entities that want the flight information of multiple carriers, have in the past been able to obtain flight information by subscribing to distributors of schedule information such as the Official Airline Guide (OAG) and Innovata, distributors of fare and fare related data such as the Airline Tariff Publishing Company (ATPCO) and Societe Internationale de Telecommunications Aeronautiques (SITA), and global distribution systems (“GDS”), which aggregate and distribute combined flight information that generally includes schedules, fares, and availability to subscribers. It is our understanding that in most cases, OTAs that market flight information directly to the public through Web site displays obtain that information from GDSs as their primary non-airline source. OTAs sometimes distribute flight information obtained from GDSs and other entities onward to metasearch entities that operate flight search tools. These metasearch entities often combine information obtained from OTAs with information obtained directly from GDSs and other distributors and/or airlines. Regardless of the source, the information is generally combined and displayed on online travel sites marketed to consumers in flight search tools displaying flight information for multiple airlines.
Just as airlines have financial incentives to widely distribute and display information on their flights, OTAs and metasearch engines operating flight search tools have financial incentives to distribute and display airline information. It is common for metasearch entities that operate flight search tools to include in search results links to OTAs that are able to sell air transportation on behalf of an airline. Stakeholders have informed the Department that there are a number of fee structures that exist between metasearch entities operating flight search tools and the entities that provide them flight information, whether airlines or OTAs. In connection with the relationship between an OTA and a metasearch engine, although fee structures may vary, generally speaking, when consumers follow a link from a metasearch entity flight search tool to an OTA Web site that allows consumers to book flights, the OTA pays the metasearch site a referral fee. Additionally, OTAs generally receive payments from GDSs for bookings made directly on OTA Web sites. GDSs in turn are paid a fee by airlines for such bookings. Accordingly, although airlines often benefit from having their flights marketed through a variety of outlets, airlines prefer to have consumers book directly through an airline channel, for which the airline generally bears the cost of operating its own channel but avoids paying booking fees to others such as GDSs, OTAs, or metasearch entities.
Certain airlines have placed restrictions on certain third-party industry stakeholders such as GDSs and data aggregators, prohibiting them from distributing information to any entities that the airline does not approve. Additionally, certain airlines are prohibiting OTAs from distributing flight information onward to metasearch entities, although it is not clear how many airlines have imposed these prohibitions. Some airlines are also prohibiting particular OTAs or metasearch entities from displaying flight information for an airline's codeshare partners, and at times, preventing OTAs and metasearch entities from displaying an airline's flight information altogether. In other instances, some airlines are prohibiting metasearch entities operating flight search tools from displaying flight information for that airline with any links to OTA Web sites; instead, the only permitted links are to airline Web sites. As discussed below, representatives of ticket agents allege these airline restrictions harm consumers, whereas airlines argue that they have legitimate business reasons for imposing these restrictions.
In connection with airline restrictions on ticket agent distribution or display of flight information, some ticket agents have stated to the Department that they believe flight information is public information and that airlines should not be allowed to place restrictions on it. Conversely, airlines believe flight information is both proprietary and protected under intellectual property laws and that airlines have the right to maintain control over its distribution and display.
As a result of the availability of airline flight information through so many
On the other hand, airlines state that, although airline flight information has historically been disseminated and made available to the general public, airlines have invested significant money in developing methods to set schedules and fares, to effectively market air transportation, and ultimately to fill as many seats as possible on the flights an airline operates. Further, unlike bus or train fares and schedules that change infrequently, airline fares, schedules, and availability can change many times a day in response to a competitive marketplace. According to many airlines, as a result of the investment that airlines have made in developing flight prices, schedules, and availability, the flight information that they produce and distribute to the air transportation industry is proprietary information.
Additionally, airlines have indicated that they have an interest in controlling where and how flight information is displayed in order to control airline distribution costs and ensure adequate customer service. Unlike service providers and makers of consumer products that do not sell directly to the public and only sell through an intermediary, airlines sell their services directly to consumers as well as through agents. Despite this distribution model of direct and indirect channels, airlines generally retain control of fares, particularly in domestic air transportation, and do not allow agents to discount or increase fares or to play any role in establishing schedules or seat availability. As such, some airlines believe that because they control fares and the related services, they are entitled to retain ultimate control over how and where this information is distributed and/or displayed.
Some ticket agents have indicated to the Department that a potential consumer harm that may stem from allowing airlines to restrict the display and distribution of flight information is a reduction in consumers' ability to view a full range of flight options in one location. They also state that ticket agents that operate flight search tools typically display information in a manner that is helpful to travelers seeking to purchase air transportation. Flight search tools consolidate flight options for consumers on one Web site so that consumers do not need to visit multiple Web sites to identify the options for air travel on a number of airlines for a given itinerary. Such flight search tools may also combine the flights of multiple airlines or various one-way fares for consumers in an attempt to identify the most cost-effective and efficient itinerary. According to ticket agents, combining carriers, one-way tickets, or both are options that average consumers would be unlikely to find on their own when searching multiple airline Web sites. These flight search tools often default to ranking flight options in order from the lowest-cost to the highest-cost flight option but offer other ranking options as well, such as by particular airline, arrival time, or travel time. Consumers visiting these Web sites can determine which flight options best suit their needs and preferences, for example, by taking a flight at a less popular time, enduring a long layover in order to save money on airfare, or paying more for the convenience of a non-stop flight.
Further, according to ticket agents, some flight options are offered only by ticket agents and not by airlines. However, according to ticket agents, this is an area in which airlines are increasingly imposing restrictions on OTAs and metasearch entities operating flight search tools. These entities state that certain airlines are prohibiting ticket agents from offering flight options combining one-way fares for different flight segments or from combining segments and fares from multiple carriers. For example, if a consumer wished to fly from Buffalo, New York, to Hartford, Connecticut, and then to Washington, DC, and then return to Buffalo, it is often significantly less expensive to buy multiple one-way tickets for this itinerary on different carriers as opposed to purchasing this itinerary as one group of flights from one carrier. Some airlines have limited the ability of ticket agents to book this itinerary as a series of one-way flights. Flight search tools that combine one-way fares may save consumers time and provide options the consumer would otherwise not be aware of. Searching for one-way tickets on multiple carrier Web sites to find a multi-carrier itinerary that fits a consumer's needs might not yield results that are as beneficial to the consumer.
In addition, discounted tickets that OTAs offer as part of tour packages are not presented on airline Web sites. According to ticket agents, without the ability to efficiently view flight information across multiple airlines on a ticket agent Web site, transactions are less efficient. Consumers may need to visit numerous Web sites more than once in the days before purchasing air transportation to find a current fare for the most cost-effective itinerary to match their travel plans. Ticket agents also note that some Web sites offer consumers the ability to review trends in pricing for various flights so that consumers can theoretically identify the optimal date to purchase a ticket before traveling, on-time performance information for flights, customer reviews of specific itineraries, optimal seat ratings
On the other hand, many airlines state that airline limitations placed on OTAs and metasearch sites operating flight search tools do not harm consumers. Airlines note that they also provide on-time performance information and tour package options. Airlines observe that ticket agents are not alleging that airlines are attempting to place limitations on OTAs or metasearch entity product offerings unrelated to air transportation, nor have they alleged that airlines are trying to restrict displays of customer reviews of itineraries or airline seat ratings. Further, despite placing limitations on some OTAs and metasearch entities operating flight search tools, most airlines allow what they consider to be “desirable” OTAs and metasearch entities to distribute and display the airline's flight information. Meanwhile, some airlines note that one of the largest airlines in the U.S. does not distribute its flight information through GDSs or OTAs. Most consumers, particularly the most price sensitive consumers, generally search multiple Web sites, including those operated by airlines as well as ticket agent flight search tools, before purchasing air transportation. According to the airlines, their actions
Airlines also observe that there has been a significant consolidation in the ownership of OTAs. Most leisure consumer bookings come through a small number of OTAs. Airlines assert that although consumers may believe they are comparing multiple outlets, several of those outlets are owned by the same parent company. According to airlines, the consolidation of OTAs is significant to flight option distribution and consumers may be harmed by limited OTA competition as those entities consolidate and no longer innovate to compete with each other.
Moreover, many airlines state that they should maintain ultimate control over how their airline product is offered and displayed to consumers because the flying experience that airlines offer to consumers is a unique product that individual airlines have invested significant resources to develop. For example, airlines state that some ticket agent Web sites do not display airline information in a way that optimizes the product that airlines are offering to consumers. Specifically, some ticket agent Web sites have included outdated airline logos, presented information in what airlines believe to be a disorganized and suboptimal way, and failed to offer customers the tailored experience that airlines offer. Airlines have expressed concern about improper display of airline information or poor customer service experiences that they believe may negatively impact consumer perception of the airlines' brand. Airlines have stated that some examples of poor experiences include excessively long layovers that customers are unaware of when booking through ticket agents, the failure of ticket agents to process refund requests, an inability of ticket agents to accurately relay flight status and other important information to consumers in a timely fashion, and other negative interactions that consumers may attribute to airlines. Some airlines also state that they are concerned about entities that engage in fraud by selling invalid tickets for travel on well-known airlines. In some of these instances, consumers contact the airlines directly to request a refund for an invalid ticket. Airlines are concerned that consumers defrauded by such entities may believe that the airlines are to blame. Certain airlines have demanded that entities that they consider undesirable cease displaying the airline's flights. They have also placed contractual limitations on the ability of GDSs and OTAs to distribute flight information to unapproved entities.
Further, airlines state that they allow access to their products through numerous OTAs and metasearch entities in addition to their own sites and that consumers are able to shop for air transportation on or through many of those Web sites. Airlines believe that the purchase of air transportation via the internet is an efficient process regardless of whether consumers access flight information through OTAs, metasearch entities that operate flight search tools, or airlines' Web sites. Accordingly, airlines assert that any Department action limiting airlines' ability to control how and where airline flight information is displayed would harm both consumers and airlines' brands. Several airlines also point out that it is in their financial interest to allow reputable OTAs and metasearch entities to display and distribute airline flight information despite a desire to have as many passengers book directly with the airline as possible. Airlines need to make their services available through the outlets that consumers choose to use. Bookings via OTAs in many instances account for a large percentage of airline sales, and referrals from metasearch entities that operate flight search tools are also important. Meanwhile, GDSs have historically included provisions in contracts with airlines that require airlines to offer, to ticket agents that subscribe to the GDS, all of the same fares that the airline offers through its other channels. Therefore, in many instances, airlines are not able to offer discount fares only available from the airline to drive consumers from purchasing through ticket agents to purchasing from the airline based on pricing. Accordingly, some airlines assert that it is not in their interest or even commercially viable to remove flight information from OTA or metasearch entity Web sites entirely. Some airlines have stated that, due to the quantity of bookings that originate on OTA or metasearch entity Web sites, it is unlikely that airlines would ever prevent all OTAs and metasearch sites that operate flight search tools from displaying and/or distributing airline flight information.
Some ticket agents assert that Web sites such as theirs can potentially better position new entrant airlines to compete with larger and more established airlines, especially considering recent airline consolidation. They state that new entrant airlines often offer consumers low ticket prices and increase the number of flight options for a given itinerary. This increase in air travel options tends to drive down airfares, which in turn allows more consumers to take advantage of air transportation. Some ticket agents also believe that new entrant airlines benefit from the exposure that they gain by advertising airfares on ticket agent Web sites alongside the fares offered by larger, more established carriers. Some ticket agents allege that by allowing them to display and distribute flight information for all airlines that offer service for a given itinerary, ticket agent Web sites will promote price competition in some of the more concentrated markets where the dominance of legacy airlines and other larger airlines would otherwise lead to higher airfares for consumers.
Airlines state that airline restrictions on the distribution and display of flight information are unrelated to airline market power. Accordingly, airlines assert that consolidation within the airline industry should not be taken into account when considering the issue of airline restrictions on ticket agent distribution or the display of flight information.
Ticket agents also argue that by displaying flight combinations such as one-way flights or flights on multiple carriers that are not offered by airlines, OTAs and metasearch entities operating flight search tools are creating price competition and improving consumer access to information.
Airlines counter that not all carriers use non-airline distribution channels such as OTAs or metasearch entities operating flight search tools. According to some airlines, the fact that not every flight option is available through every non-airline flight information outlet does not support the idea that price competition is harmed. According to the airlines, flight information for most airlines is available through a variety of outlets, but more importantly, flight information for every airline is readily available on the airline's own Web site. Moreover, airlines have to publish information on their flights in order to sell tickets. Therefore, they do not believe price competition is harmed simply by some airlines limiting where that airline's flight information is displayed when the information is available elsewhere, such as an airline Web site.
The Department has considered the information that has been provided thus far and now requests additional information from all stakeholders—airlines, ticket agents, consumers, and other affected parties. The Department
As an initial matter, the Department requests information on the proprietary nature of flight information and whether the widespread availability of that information is relevant to airline restrictions on the distribution or display of flight information. Specifically, when flight information is released to consumers by airlines and made generally available to the public (
In connection with consumer options for researching and purchasing air transportation, what is the value that OTA or metasearch entity flight search tools provide? To what extent do consumers, including leisure travelers, small businesses and corporate customers, benefit from saved search costs, greater confidence in search results, access to lower fares, or more travel options than they would have obtained from separate searches of individual airline Web sites? In this request for information, have we accurately described the types of actions airlines have taken that impact OTA and metasearch entity Web sites? If not, what are those actions and how do they impact OTA and metasearch entity Web sites? What effect do those actions have on the utility of OTA and metasearch entity Web sites for consumers? Do ticket agents that provide flight search tools offer consumers any flight information that consumers cannot obtain by visiting multiple airline Web sites? What effect does an inability to display schedule, fare or seat availability information of a large, well-known airline, or group of airlines, have on the utility of air travel comparison sites for consumers? Would access to one or two of those categories of airline information without,
It has been pointed out that not all airlines currently distribute information on their flights through OTAs or metasearch entities operating flight search tools and that those tools do not necessarily have the same level of information that is available on airline Web sites. Do airline restrictions currently placed on the distribution and/or display of airline flight information limit the ability of consumers to identify the best flight options available to meet consumer needs? If yes, how? Are the existing limitations of OTA or metasearch entity Web sites relevant to the ability of consumers who use those Web sites to identify the best flight options available to meet consumer needs?
As explained above, airlines have stated that in some cases they are restricting the sharing and use of their flight information by some Web sites or entities that airlines believe are disreputable or simply do not market the airline's flights in a manner that the airline would like. Some airlines have indicated that OTAs or metasearch entities have provided inaccurate or incomplete information about airline services and products, provided poor customer service, or engaged in marketing practices the airline does not approve of, and have in some cases engaged in fraud. Airlines say such conduct tarnishes the airline brand, and for these reasons airlines are trying to prevent or restrict these entities from marketing and selling their airline's products and services. Thus, airlines claim that their actions to restrict use of their flight information benefit both airlines and consumers. Some airlines also acknowledge that they are attempting to direct more consumers to their own Web sites for financial reasons as well as marketing reasons. Are there any other reasons why airlines are restricting the sharing and use of their flight information? What information is available to determine the scope and magnitude of the problems described by airlines? How many entities engage in the practices as described by airlines, and what portion of the OTA and metasearch market do these entities represent? How many consumers use these Web sites? What is the average number of consumer complaints for each of these issues regarding such entities that airlines receive each year? How would DOT appropriately measure and evaluate the effects of the problems as described by airlines? Is action by DOT necessary to allow airlines to protect their legitimate interests and also ensure that consumers are able to make informed flight choices?
We note that flight information is available through airline Web sites. Would a reduction in the availability of airline flight information on non-airline Web sites due to airline restrictions on the distribution and/or display of such information have a significant negative impact on consumers? If so, what are those impacts, and do they disproportionately affect some subsets of consumers? According to the information provided to the Department, no airline has indicated an intent to withdraw completely from ticket agent Web sites. However, if an airline that currently distributes flight information through ticket agent Web sites withdrew completely from those Web sites, would that reduce or eliminate the ability of consumers to identify the most suitable flight options? If not, how many airlines would have to withdraw from ticket agent Web sites to
Is there information to suggest that many airlines will eventually withhold flight information entirely from all or most Web sites that offer flight search tools? How many consumers would fail to investigate more than one airline Web site, with the result that they may not locate the optimal itinerary or fare?
If it is essential for consumers to be able to view as many airline flight options as possible on OTA and metasearch entity Web sites to identify the best flight options, what information is essential? Is schedule information sufficient or are both schedule and fare information necessary? Do consumers need availability information to identify the best flight options?
We note that airlines create fare rules and generally do not allow certain combinations of flight segments. Are consumers less likely to combine one-way fares when searching for an itinerary on multiple airline Web sites rather than a ticket agent Web site due to the amount of time it may take to identify these flights and pair them together by making multiple purchases?
We note that some airlines are placing restrictions on OTAs and metasearch entity Web sites preventing them from displaying codeshare flights, which at times may be the cheapest or most efficient flight options for consumers. Are consumers less likely to discover these codeshare flight options when airlines restrict the display of these flights on OTA and metasearch Web sites? Can consumers gain access to the same information by visiting airline Web sites directly?
Is Department action in connection with airline distribution practices necessary to ensure consumers have the information they need to make informed choices?
In connection with competition between airlines, we are requesting information on the impact of airline restrictions on the distribution or display of flight information on competition. What value, if any, do OTA and metasearch entity Web sites that operate flight search tools provide in facilitating or enabling competition among airlines? Does having airline information available through multiple outlets, including ticket agent outlets, impact price competition? Would the absence of several airlines that currently participate in ticket agent outlets impact price competition? Does the ability or inability of metasearch entities that operate flight search tools to provide links to OTAs impact price competition?
If restrictions placed on the distribution and/or display of airline flight information limit the flight options available on Web sites operating flight search tools that market multiple airlines, has that limitation in options led to higher prices for consumers? If so, how? How would restrictions in the future potentially lead to higher prices?
It is our understanding that most airlines do not permit fare “discounting” by OTAs. Are OTAs or metasearch entities that operate flight search tools able to identify fares that are lower than fares that can be found on airline Web sites? Do OTAs receive discounts from GDSs which allow them to then price flights lower than airlines?
Some ticket agents have stated that flight search tools are able to identify lower prices on OTA Web sites than are available on airline Web sites and that the lower fare or both fares are displayed absent any airline restriction. If lower prices are identified by OTAs, do these prices serve as a competitive check on airline prices when displayed on flight search tools adjacent to the prices offered by airlines?
In the past, OTAs negotiated special deals, rates, and promotions from airlines that resulted in consumers obtaining discounted fares. More recently, it is our understanding that contractual arrangements between airlines, GDSs, and OTAs generally include provisions that prevent OTAs or airlines from offering discounted fares that are not available through all other outlets. Accordingly, discounted fares that might otherwise be available to consumers are no longer offered. We request information on how these types of private contractual arrangements impact consumers and whether they are unfair or anticompetitive.
Some stakeholders have argued that having flight information for multiple airlines available through the flight search tools of OTAs and metasearch entities provides a platform for smaller and new entrant airlines to compete with larger, better-known airlines. They suggest that absent ticket agent Web sites that offer the flight information of multiple airlines, consumers will fly only well-known carriers that they recognize from advertisements and the airline's continuous length of operation in a given market. If OTA and metasearch entity Web sites do not provide the flight information of larger, better-known airlines, will consumers stop using those Web sites? If consumers do not use those Web sites, and instead search only airline Web sites, will that impact the ability of smaller or new entrant airlines to compete with larger, better-known airlines because consumers will not search Web sites that do not include the largest airlines? Conversely, would the ability of new entrant airlines to compete with larger airlines be enhanced by the lack of competition if large, well-known airlines limit or do not permit information on their flights to be displayed on OTA or metasearch entity Web sites and therefore consumers find only smaller airline flight options on those sites? Is Department action in this area necessary to ensure airline restrictions on the distribution or display of flight information do not harm competition? If so, what action is appropriate?
We are requesting information on all of the issues and concerns identified above and any information relevant to this issue.
Bureau of the Fiscal Service, Fiscal Service, Treasury.
Notice.
The Department of the Treasury (Treasury) is announcing a new fee schedule applicable to transfers of U.S. Treasury book-entry securities maintained on the National Book-Entry System (NBES) that occur on or after January 3, 2017.
Effective January 3, 2017.
Brandon Taylor or Janeene Wilson, Bureau of the Fiscal Service, 202-504-3550.
Treasury has established a fee structure for the transfer of Treasury book-entry securities maintained on NBES. Treasury reassesses this fee structure periodically based on our review of the latest book-entry costs and volumes.
For each Treasury securities transfer or reversal sent or received on or after January 3, 2017, the basic fee will increase from $0.81 to $0.93. The
Treasury does not charge a fee for account maintenance, the stripping and reconstitution of Treasury securities, the wires associated with original issues, or interest and redemption payments. Treasury currently absorbs these costs.
The fees described in this notice apply only to the transfer of Treasury book-entry securities held on NBES. Information concerning fees for book-entry transfers of Government Agency securities, which are priced by the Federal Reserve, is set out in a separate Federal Register notice.
The following is the Treasury fee schedule that will take effect on January 3, 2017, for book-entry transfers on NBES:
31 CFR 357.45.
Office of Foreign Assets Control, Treasury.
Notice.
The Treasury Department's Office of Foreign Assets Control (OFAC) is removing from the Specially Designated Nationals and Blocked Persons List (SDN List) the names of the persons listed below whose property and interests in property had been blocked pursuant to Executive Order 13310 of July 28, 2003 (Blocking Property of the Government of Burma and Prohibiting Certain Transactions), Executive Order 13448 of October 18, 2007 (Blocking Property and Prohibiting Certain Transactions Related to Burma), Executive Order 13464 of April 30, 2008 (Blocking Property and Prohibiting Certain Transactions Related To Burma), and Executive Order 13619 of July 11, 2012 (Blocking Property of Persons Threatening the Peace, Security, or Stability of Burma).
OFAC's actions described in this notice are effective as of October 7, 2016.
Associate Director for Global Targeting, tel.: 202/622-2420, Assistant Director for Sanctions Compliance & Evaluation, tel.: 202/622-2490, Assistant Director for Licensing, tel.: 202/622-2480, Office of Foreign Assets Control, or Chief Counsel (Foreign Assets Control), tel.: 202/622-2410 (not toll free numbers).
The SDN List and additional information concerning OFAC sanctions programs are available from OFAC's Web site (
On October 7, 2016, the President signed an Executive Order terminating the national emergency declared in Executive Order 13047 of May 20, 1997 (Prohibiting New Investment in Burma), and revoking that order as well as Executive Order 13310 of July 28, 2003 (Blocking Property of the Government of Burma and Prohibiting Certain Transactions), Executive Order 13448 of October 18, 2007 (Blocking Property and Prohibiting Certain Transactions Related to Burma), Executive Order 13464 of April 30, 2008 (Blocking Property and Prohibiting Certain Transactions Related To Burma), Executive Order 13619 of July 11, 2012 (Blocking Property of Persons Threatening the Peace, Security, or Stability of Burma), and Executive Order 13651 of August 6, 2013 (Prohibiting Certain Imports of Burmese Jadeite and Rubies).
As such, the following individuals and entities are no longer subject to the blocking provisions in any of the Burma-related Executive Orders revoked by the President and are being removed from the SDN List as of the effective date of Executive Order 13742 of October 7, 2016, Termination of Emergency With Respect to the Actions and Policies of the Government of Burma:
1. HTOO TRADING COMPANY LIMITED (a.k.a. HTOO TRADING GROUP COMPANY), 5 Pyay Road, Hlaing Township, Yangon, Burma [BURMA].
2. HTOO WOOD PRODUCTS PTE. LIMITED (a.k.a. HTOO FURNITURE; a.k.a.
3. HTOO GROUP OF COMPANIES, 5 Pyay Road, Hlaing Township, Yangon, Burma [BURMA].
4. HTAY, Thein; DOB 07 Sep 1955; POB Taunggyi, Burma; Lieutenant General; Chief of Defence Industries; Chief of Army Ordnance Industries (individual) [BURMA].
5. HOTEL MAX (a.k.a. HOTEL CHAUNG THA BEACH RESORT), No. 1 Ywama Curve, Ba Yint Naung Road, Block-2, Hlaing Township, Yangon, Burma [BURMA].
6. LWIN, Saw, Burma; DOB 1939; alt. nationality Burma; alt. citizen Burma; Major General, Minister of Industry 2 (individual) [BURMA].
7. MANN, Aung Thet (a.k.a. KO, Shwe Mann Ko), c/o Htoo Trading Company Limited, undetermined; c/o Htoo Group of Companies, undetermined; c/o Ayer Shwe Wah Company Limited, undetermined; DOB 19 Jun 1977 (individual) [BURMA].
8. MAX (MYANMAR) CONSTRUCTION CO., LTD, 1 Ywama Curve, Bayint Naung Road, Ward (2), Hlaing Township, Yangon, Burma [BURMA].
9. MAX MYANMAR GEMS AND JEWELLERY CO., LTD., 1 Ywama Curve, Bayint Naung Road, Ward (2), Hlaing Township, Yangon, Burma [BURMA].
10. MAX MYANMAR GROUP OF COMPANIES (a.k.a. MAX MYANMAR; a.k.a. MAX MYANMAR CO.; a.k.a. MAX MYANMAR COMPANY LIMITED; a.k.a. MAX MYANMAR GROUP), No. 1 Ywama Curve, Ba Yint Naung Road, Ward (2), Hlaing Township, Yangon, Burma [BURMA].
11. MAX MYANMAR MANUFACTURING CO., LTD., 1 Ywama Curve, Bayint Naung Road, Ward (2), Hlaing Township, Yangon, Burma [BURMA].
12. PAVO AIRCRAFT LEASING PTE. LTD., 3 Shenton Way, #24-02 Shenton House 068805, Singapore [BURMA].
13. OO, Tin Aung Myint (a.k.a. OO, Thiha Thura Tin Aung Myint); DOB 27 May 1950; nationality Burma; citizen Burma; Lieutenant-General; Quartermaster General; Minister of Military Affairs; Member, State Peace and Development Council (individual) [BURMA].
14. OO, Maung; DOB 1952; nationality Burma; citizen Burma; Major General; Minister of Home Affairs (individual) [BURMA].
15. OO, Kyaw Nyunt; DOB 30 Jun 1959; Lieutenant Colonel; Staff Officer (Grade 1), D.D.I. (individual) [BURMA].
16. NYEIN, Chan (a.k.a. NYEIN, Chan, Dr.; a.k.a. NYEIN, Chang, Dr.), Burma; DOB 1944; alt. nationality Burma; alt. citizen Burma; Minister of Education (individual) [BURMA].
17. NG, Sor Hong (a.k.a. LAW, Cecilia; a.k.a. LO, Cecilia; a.k.a. NG, Cecilia), 150 Prince Charles Crescent, #18-03, Singapore 159012, Singapore; 3 Shenton Way, #10-01 Shenton House, Singapore 068805, Singapore; DOB 1958; citizen Singapore; Identification Number S1481823E (Singapore); Chief Executive, Managing Director, and Owner, Golden Aaron Pte. Ltd., Singapore; Director and Owner, G A Ardmore Pte. Ltd., Singapore; Chief Executive, Director and Owner, G A Capital Pte. Ltd., Singapore; Director and Owner, G A Foodstuffs Pte. Ltd., Singapore; Chief Executive, Director and Owner, G A Land Pte. Ltd., Singapore; Director and Owner, G A Resort Pte. Ltd., Singapore; Chief Executive, Director and Owner, G A Sentosa Pte. Ltd., Singapore; Chief Executive, Director and Owner, G A Treasure Pte. Ltd., Singapore; Director and Owner, G A Whitehouse Pte. Ltd., Singapore; Chief Executive, Manager, and Owner, S H Ng Trading Pte. Ltd., Singapore (individual) [BURMA].
18. MYINT, Ye; DOB 21 Oct 1943; nationality Burma; citizen Burma; Lieutenant-General; Chief, Military Affairs; Chief, Bureau of Special Operation 1; Member, State Peace and Development Council (individual) [BURMA].
19. MYINT, Tin Lin (a.k.a. MYINT, Daw Tin Lin); DOB 25 Jan 1947; wife of Ye Myint (individual) [BURMA].
20. MYINT, Kyaw (a.k.a. MYINT, Kyaw, Dr.), Burma; DOB 1940; alt. nationality Burma; alt. citizen Burma; Minister of Health (individual) [BURMA].
21. MYINT, Htay (a.k.a. MYINT, U Htay), Burma; DOB 06 Feb 1955; nationality Burma; citizen Burma; Chairman, Yuzana Company Limited (individual) [BURMA].
22. MYAWADDY TRADING LTD. (a.k.a. MYAWADDY TRADING CO.), 189-191 Maha Bandoola Street, Botataung P.O, Yangon, Burma [BURMA].
23. MYANMAR ECONOMIC CORPORATION (a.k.a. MEC), 74-76 Shwedagon Pagoda Road, Dagon Township, Yangon, Burma [BURMA].
24. MYANMAR IMPERIAL JADE CO., LTD, 22 Sule Pagoda Road, Mayangone Township, Yangon, Burma [BURMA].
25. MYANMAR IVANHOE COPPER COMPANY LIMITED (a.k.a. MICCL; a.k.a. MONYWA JVCO; a.k.a. MYANMAR IVANHOE COPPER CO. LTD.), 70 (I) Bo Chein Street, 6.5 miles Pyay Road, Yangon, Burma; 70 (I) Bo Chein Street, Pyay Road, Hlaing Township, Yangon, Burma; Monywa, Sagaing Division, Burma [BURMA].
26. STATE PEACE AND DEVELOPMENT COUNCIL OF BURMA [BURMA].
27. THA, Soe, Burma; DOB 1945; alt. nationality Burma; alt. citizen Burma; Minister of National Planning and Economic Development (individual) [BURMA].
28. THAUNG (a.k.a. THAUNG, U), Burma; DOB 06 Jul 1937; alt. nationality Burma; alt. citizen Burma; Minister of Labor; Minister of Science & Technology (individual) [BURMA].
29. ASIA WORLD PORT MANAGEMENT CO. LTD (a.k.a. ASIA WORLD PORT MANAGEMENT; a.k.a. “PORT MANAGEMENT CO. LTD.”), 61-62 Wartan St, Bahosi Yeiktha, Rangoon, Burma [BURMA].
30. AUREUM PALACE HOTELS AND RESORTS (a.k.a. AUREUEM PALACE HOTEL AND RESORT (BAGAN); a.k.a. AUREUEM PALACE HOTEL AND RESORT (NGAPALI); a.k.a. AUREUM PALACE HOTEL AND RESORT (NGWE SAUNG); a.k.a. AUREUM PALACE HOTEL AND RESORT GROUP CO. LTD.; a.k.a. AUREUM PALACE HOTEL RESORT; a.k.a. AUREUM PALACE RESORTS; a.k.a. AUREUM PALACE RESORTS AND SPA), No. 41 Shwe Taung Gyar Street, Bahan Township, Yangon, Burma; Thandwe, Rakhine, Burma [BURMA].
31. ASIA WORLD INDUSTRIES LTD., No. 21/22 Upper Pansodan St., Aung San Stadium (East Wing), Mingalar Taung Nyunt, Rangoon, Burma [BURMA].
32. AYE, Maung; DOB 25 Dec 1937; nationality Burma; citizen Burma; Vice Senior General; Vice-Chairman of the State Peace and Development Council; Deputy Commander-in-Chief, Myanmar Defense Services (Tatmadaw); Commander-in-Chief, Myanmar Army (individual) [BURMA].
33. AYER SHWE WAH COMPANY LIMITED (a.k.a. AYE YAR SHWE WAH; a.k.a. AYER SHWE WA; a.k.a. AYEYA SHWE WAR COMPANY), 5 Pyay Road, Hlaing Township, Yangon, Burma [BURMA].
34. AYEYARWADY BANK (a.k.a. AYEYARWADDY BANK LTD; a.k.a. IRRAWADDY BANK), Block (111-112), Asint Myint Zay, Zabu Thiri Township, Nay Pyi Taw, Burma; No. 1 Ywama Curve, Ba Yint Naung Road, Block (2), Hlaing Township, Yangon, Burma; SWIFT/BIC AYAB MM MY [BURMA].
35. DIRECTORATE OF DEFENCE INDUSTRIES (a.k.a. KA PA SA; a.k.a. “DDI”), Burma; Ministry of Defence, Shwedagon Pagoda Road, Yangon, Burma [BURMA].
36. ESPACE AVENIR EXECUTIVE SERVICED APARTMENT (a.k.a. ESPACE AVENIR), No. 523, Pyay Road, Kamaryut Township, Yangon, Burma [BURMA].
37. EXCELLENCE MINERAL MANUFACTURING CO., LTD., Plot No. (142), U Ta Yuoat Gyi Street, Industrial Zone No. (4), Hlaing Thar Yar Township, Yangon, Burma [BURMA].
38. G A ARDMORE PTE. LTD., 3 Shenton Way, #10-01 Shenton House, Singapore 068805, Singapore [BURMA].
39. G A ARDMORE PTE. LTD., 101 Cecil Street, 08-08 Tong Eng Building, Singapore 069533, Singapore; 3 Shenton Way, 10-01 Shenton House, Singapore 068805, Singapore [BURMA].
40. G A CAPITAL PTE. LTD., 101 Cecil Street, 08-08 Tong Eng Building, Singapore 069533, Singapore [BURMA].
41. LAW, Steven (a.k.a. CHUNG, Lo Ping; a.k.a. HTUN MYINT NAING; a.k.a. LAW, Stephen; a.k.a. LO, Ping Han; a.k.a. LO, Ping Hau; a.k.a. LO, Ping Zhong; a.k.a. LO, Steven; a.k.a. TUN MYINT NAING; a.k.a. U MYINT NAING), No. 124 Insein Road, Ward (9), Hlaing Township, Rangoon, Burma; 61-62 Bahosi Development Housing, Wadan St., Lanmadaw Township, Rangoon, Burma; 330 Strand Rd., Latha Township, Rangoon, Burma; 8A Jalan Teliti, Singapore, Singapore; 3 Shenton Way, #10-01 Shenton House, Singapore 068805, Singapore; DOB 16 May 1958; alt. DOB 27 Aug 1960; POB Lashio, Burma; citizen Burma; Passport 937174 (Burma) (individual) [BURMA].
42. KO, Myint Myint (a.k.a. KO, Daw Myint Myint); DOB 11 Jan 1946; wife of Saw Tun (individual) [BURMA].
43. INNWA BANK LTD (a.k.a. INNWA BANK), 554-556 Corner of Merchant Street
44. HTWE, Aung; DOB 01 Feb 1943; nationality Burma; citizen Burma; Lieutenant-General; Chief of Armed Forces Training; Member, State Peace and Development Council (individual) [BURMA].
45. GOLD OCEAN PTE LTD, 101 Cecil Street #08-08, Tong Eng Building, Singapore 069533, Singapore; 1 Scotts Road, #21-07/08 Shaw Centre, Singapore 228208, Singapore [BURMA].
46. GOLD ENERGY CO. LTD., No. 74 Lan Thit Road, Insein Township, Rangoon, Burma; Taungngu (Tungoo) Branch, Karen State, Burma [BURMA].
47. G A FOODSTUFFS PTE. LTD., 101 Cecil Street, 08-08 Tong Eng Building, Singapore 069533, Singapore [BURMA].
48. SOE MIN HTAIK CO. LTD. (a.k.a. SOE MIN HTIKE CO., LTD.; a.k.a. SOE MIN JTIAK CO. LTD.; a.k.a. SOE MING HTIKE), No. 4, 6A Kabaaye Pagoda Road, Mayangon Township, Yangon, Burma; No. 3, Kan Street, No. 10 Ward, Hlaing Township, Yangon, Burma [BURMA].
49. SOE, Myint Myint (a.k.a. SOE, Daw Myint Myint); DOB 15 Jan 1953; wife of Nyan Win (individual) [BURMA].
50. SWE, Myint; DOB 24 Jun 1951; nationality Burma; citizen Burma; Lieutenant-General; Chief of Military Affairs Security (individual) [BURMA].
51. TERRESTRIAL PTE. LTD., 3 Raffles Place, #06-01 Bharat Building, Singapore 048617, Singapore; 10 Anson Road, #23-16 International Plaza, Singapore 079903, Singapore [BURMA].
52. THAUNG, Aung, No. 1099, PuBa Thiri Township, Ottara (South) Ward, Nay Pyi Taw, Burma; DOB 01 Dec 1940; POB Kyauk Kaw Village, Thaung Tha Township, Burma; Gender Male; National ID No. 13/KaLaNa (Naing) 011849 (Burma); Lower House Member of Parliament (individual) [BURMA].
53. THEIN, Tin Naing, Burma; DOB 1955; alt. nationality Burma; alt. citizen Burma; Brigadier General, Minister of Commerce (individual) [BURMA].
54. THI, Lun; DOB 18 Jul 1940; nationality Burma; citizen Burma; Brigadier-General; Minister of Energy (individual) [BURMA].
55. TUN, Hla, Burma; DOB 11 Jul 1951; alt. nationality Burma; alt. citizen Burma; Major General, Minister of Finance and Revenue (individual) [BURMA].
56. TUN, Saw, Burma; DOB 08 May 1935; alt. nationality Burma; alt. citizen Burma; Major General, Minister of Construction (individual) [BURMA].
57. UNION OF MYANMAR ECONOMIC HOLDINGS LIMITED (a.k.a. MYANMAR ECONOMIC HOLDINGS LIMITED; a.k.a. UMEH; a.k.a. UNION OF MYANMAR ECONOMIC HOLDINGS COMPANY LIMITED), 189-191 Maha Bandoola Road, Botahtaung Township, Yangon, Burma [BURMA].
58. LO, Hsing Han (a.k.a. LAW, Hsit-han; a.k.a. LO, Hsin Han; a.k.a. LO, Hsing-han), 20-23 Masoeyein Kyang St., Mayangone, Rangoon, Burma; 20B Massoeyein St., 9 Mile, Rangoon, Burma, Burma; 60-61 Strand Rd., Latha Township, Rangoon, Burma; 330 Strand Rd, Latha Township, Rangoon, Burma; 20 Wingabar Rd, Rangoon, Burma; 36 19th St., Lower Blk, Latha Township, Rangoon, Burma; 47 Latha St., Latha Township, Rangoon, Burma; 152 Sule Pagoda Rd, Rangoon, Burma; 126A Damazedi Rd, Bahan Township, Rangoon, Burma; DOB 1938; alt. DOB 1935 (individual) [BURMA].
59. GREEN LUCK TRADING COMPANY (a.k.a. GREEN LUCK TRADING COMPANY LIMITED), No. 61/62 Bahosi Development, Wadan Street, Lanmadaw Township, Rangoon, Burma; No. 74 Lan Thit Street, Insein Township, Rangoon, Burma [BURMA].
60. GOLDEN AARON PTE. LTD. (a.k.a. CHINA FOCUS DEVELOPMENT; a.k.a. CHINA FOCUS DEVELOPMENT LIMITED; a.k.a. CHINA FOCUS DEVELOPMENT LTD.), 3 Shenton Way, 10-01, Shenton House, Singapore 068805, Singapore; 101 Cecil Street, 08-08 Tong Eng Building, Singapore 069533, Singapore; China; Unit 2612A, Kuntai International Center, No. 12 Chaowai Street, Chaoyang District, Beijing 100020, China [BURMA].
61. GREAT SUCCESS PTE. LTD., 1 Scotts Road, #21/07-08 Shaw Centre, Singapore, 228208, Singapore; 101 Cecil Street #08-08, Tong Eng Building, Singapore, 069533, Singapore [BURMA].
62. G A LAND PTE. LTD., 1 Scotts Road, 21-07/08 Shaw House, Singapore 228208, Singapore [BURMA].
63. G A RESORT PTE. LTD., 1 Scotts Road, 21-07 Shaw House, Singapore 228208, Singapore; 3 Shenton Way, 10-01 Shenton House, Singapore 068805, Singapore [BURMA].
64. G A RESORT PTE. LTD., 3 Shenton Way, #10-01 Shenton House, Singapore 068805, Singapore [BURMA].
65. G A TREASURE PTE. LTD., 3 Shenton Way, #10-01 Shenton House, Singapore 068805, Singapore [BURMA].
66. G A TREASURE PTE. LTD., 3 Shenton Way, 10-01 Shenton House, Singapore 068805, Singapore [BURMA].
67. G A WHITEHOUSE PTE. LTD., 3 Shenton Way, #10-01 Shenton House, Singapore 068805, Singapore [BURMA].
68. G A WHITEHOUSE PTE. LTD., 3 Shenton Way, 10-01 Shenton House, Singapore 068805, Singapore [BURMA].
69. LIN, Aung Thein (a.k.a. “LYNN, Aung Thein”), Burma; DOB 1952; alt. nationality Burma; alt. citizen Burma; Brigadier General, Mayor and Chairman of Yangon City (Rangoon) City Development Committee (individual) [BURMA].
70. ZAY GABAR COMPANY (a.k.a. ZAYKABAR COMPANY), Burma [BURMA].
71. ZAW, Thein, Burma; DOB 20 Oct 1951; alt. nationality Burma; alt. citizen Burma; Brigadier General, Minister of Telecommunications, Post, & Telegraph (individual) [BURMA].
72. ZA, Pye Phyo Tay, Burma; 6 Cairnhill Circle, Number 18-07, Cairnhill Crest 229813, Singapore; DOB 29 Jan 1987; nationality Burma; citizen Burma; Son of Tay Za. (individual) [BURMA].
73. ASIA WORLD CO. LTD. (a.k.a. ASIA WORLD), 61-62 Bahosi Development Housing, Wadan St., Lanmadaw Township, Rangoon, Burma [BURMA].
74. ZAW, Zaw (a.k.a. ZAW, U Zaw); DOB 22 Oct 1966; nationality Burma; citizen Burma; Passport 828461 (Burma) issued 18 May 2006 expires 17 May 2009 (individual) [BURMA] (Linked To: HOTEL MAX; Linked To: MAX MYANMAR GROUP OF COMPANIES; Linked To: MAX SINGAPORE INTERNATIONAL PTE. LTD.).
75. SENTOSA TREASURE PTE. LTD., 3 Shenton Way, 10-01 Shenton House, Singapore 068805, Singapore [BURMA].
76. SHWE NAR WAH COMPANY LIMITED, No. 39/40, Bogyoke Aung San Road, Bahosi Housing, Lanmadaw, Rangoon, Burma; Registration ID 1922/2007-2008 (Burma) [BURMA] (Linked To: LAW, Steven).
77. SHWE, Khin (a.k.a. SHWE, Khin, Dr.), Burma; DOB 21 Jan 1952; alt. nationality Burma; alt. citizen Burma; President, Zay Gabar Company (individual) [BURMA].
78. SHWE, Than; DOB 02 Feb 1935; alt. DOB 02 Feb 1933; nationality Burma; citizen Burma; Senior General, Minister of Defense and Commander-in-Chief of Defense Services; Chairman, State Peace and Development Council (individual) [BURMA].
79. MIN, Zaw, Burma; DOB 10 Jan 1949; alt. nationality Burma; alt. citizen Burma; Colonel, Minister of Electric Power 1 (individual) [BURMA].
80. G A CAPITAL PTE. LTD., 3 Shenton Way, #10-01 Shenton House, Singapore 068805, Singapore [BURMA].
81. AIR BAGAN HOLDINGS PTE. LTD. (a.k.a. AIR BAGAN; a.k.a. AIRBAGAN), 545 Orchard Road, #01-04 Far East Shopping Centre, Singapore 238882, Singapore; 56 Shwe Taung Gyar Street, Bahan Township, Yangon, Burma; 9, 78th Street, Bet, 33rd and 34th Street, Mandalay, Burma; 134 Bogyoke Street, Myoma Quarter, Taunggyi, Burma; 3, Aung Thate Di Quarter, Nyaung U, Burma; Sandoway Inn, Thandwe, Burma; Pathein Hotel, Kanthonesint, Petheing-Monywa Road, Burma; 572 Ye Yeik That Street, Pear Ayekari Hotel, Myauk Ywa Quarter, Burma; 48 Quarter 2, Zay Tan Lay Yat, Kyaing Tong, Burma; 156 Bogyoke Aung San Road, Aung Chan Thar Building, San Sai Quarter, Tachileik, Burma; Myeik Golf Club, Pearl Mon Hotel, Airport Junction, Myeik, Burma; 244 Bet, Duwa Za Junn & Bayin Naung St., Third Quarter, Myitkyina, Burma; 414 Bogyoke Road, Kaw Thaung, Burma; Room (2), YMCA Building, Bogyoke Aung San Road, Forestry Quarter, Taunggyi, Burma; No. 407, Zei Phyu Kone Quarter, Near Ngapali Junction, Thandwe, Burma; No. Mitharsu (Family Video), No. 131/B Zay Taung Bak Lane, Zayit Quarter, Dawei, Burma; No. 13 (B), Zay Tan Gyi Street, Quarter (3), Zay Than Gyi Quarter, Kyaing Tong, Burma; 179 (Nya) Bogyoke Road, San Sai (Kha) Quarter, Tachileik, Burma; No. E (4), Construction Housing, Sumbrabun Road, Ayar Quarter, Myitkyina, Burma; No. 445, Anawa Quarter, Myinttzu Thaka Road, Kawthaung, Burma; No. 4, Naypyidaw Airport Compound, Naypyidaw, Burma; Kalaymyo, Red Cross Building, Bogyoke Street, Kalay Myo, Burma; Room-17, Stadium Building, Theinni Main Road, 12 Quarter, Lashio, Burma; Unit #310, 3rd Floor, Silom Complex, 191 Silom Road, Silom Bangrak, Bangkok 10500, Thailand; Room No. T1-112 & T-112A, Level 1, Main Terminal Building, Suvarnabhumi Airport, Bangpli,
82. ASIA GREEN DEVELOPMENT BANK (a.k.a. AGD BANK), 168 Thiri Yatanar Shopping Complex, Zabu Thiri Township, Nay Pyi Taw, Burma; 73/75 Sule Pagoda Road, Pabedan Township, Yangon, Burma; SWIFT/BIC AGDB MM MY [BURMA].
83. AIR BAGAN LIMITED (a.k.a. AIR BAGAN), 56 Shwe Taung Gyar Street, Bahan Township, Yangon, Burma; 9, 78th Street, Bet, Mandalay, Burma; 134 Bogyoke Street, Myoma Quarter, Taunggyi, Burma; 3, Aung Thate Di Quarter, Nyaung U, Burma; Sandoway Inn, Thandwe, Burma; Pathein Hotel, Kanthonesint, Petheing-Monywa Road, Burma; 572 Ye Yeik Tha Street, Pear Ayekari Hotel, Myauk Ywa Quarter, Burma; 48 Quarter 2, Zay Tan Lay Yat, Kyaing Tong, Burma; 156 Bogyoke Aung San Road, Aung Chan Thar Building, San Sai Quarter, Tachileik, Burma; Myeik Golf Club, Pearl Mon Hotel, Airport Junction, Myeik, Burma; 244 Bet, Duwa Zaw Junn & Bayin Naung St., Thida Quarter, Myitkyina, Burma; 414 Bogyoke Road, Kaw Thaung, Burma; No.6/88, 6 Quarter, Lalway, Naypyitaw, Burma; Kalaymyo, Red Cross Building, Bogyoke Street, Kalay Myo, Burma; Room (2), YMCA Building, Bogyoke Aung San Road, Forestry Quarter, Taunggyi, Burma; No. 407, Zei Phyu Kone Quarter, Near Ngapali Junction, Thandwe, Burma; No. Mitharsu (Family Video), No. 131/B Zay Taung Bak Lane, Zayit Quarter, Dawei, Burma; No. 13 (B) Zay Tan Gyi Street, Quarter (3), Zay Than Gyi Quarter, Kyaing Tong, Burma; 179 (Nya) Bogyoke Road, San Sai (Kha) Quarter, Tachileik, Burma; No. E (4), Construction Housing, Sumbrabun Road, Ayar Quarter, Myitkyina, Burma; No. 445, Anawa Quarter, Myinttzu Thaka Road, Kawthaung, Burma; No. 4, Naypyidaw, Airport Compound, Naypyidaw, Burma; Room-17, Stadium Building, Theinni Main Road, 12 Quarter, Lashio, Burma; Unit #310, 3rd Floor, Silom Complex, 191 Silom Road, Silom Bangrak, Bangkok 10500, Thailand; Room No. T1-112 & T1-112A, Level 1, Main Terminal Building, Suvarnabhumi Airport, Bangpli, Ssamutprakarn 10540, Thailand; Doing business as AIR BAGAN. [BURMA].
84. ASIA LIGHT CO. LTD., Mingalar Taung Nyunt Tower, 6 Upper Pansoden Street, Aung San Stadium Eastern Wing, Rangoon, Burma; 15/19 Kunjan Rd., S Aung San Std, Rangoon, Burma [BURMA].
85. ASIA MEGA LINK CO., LTD., No. 39/40, Bogyoke Aung San Road, Bahosi Housing, Lanmadaw, Rangoon, Burma; Registration ID 1679/2009-2010 (Burma) [BURMA] (Linked To: ASIA WORLD CO. LTD.).
86. ASIA MEGA LINK SERVICES CO., LTD., No. 44/45, Bogyoke Aung San Road, Bahosi Housing Complex, Lanmadaw, Rangoon, Burma; Registration ID 2652/2010-2011 (Burma) [BURMA] (Linked To: ASIA WORLD CO. LTD.).
87. ASIA METAL COMPANY LIMITED, No. 106 Pan Pe Khaung Maung Khtet Road, Industrial Zone (4), Shwe Pyi Thar Township, Yangon, Burma; No. (40) Yangon-Mandalay Road, Kywe Sekan, Pyay Gyi Tagon Township, Mandalay, Burma; No. A/B (1-5), Paung Laung (24) Street, Ext., Ward (2), Nay Pyi Taw, Pyinmana, Burma; Web site
88. ASIA PIONEER IMPEX PTE. LTD., 10 Anson Road, #23-16 International Plaza, Singapore 079903, Singapore [BURMA].
89. MYAWADDY BANK LTD. (a.k.a. MYAWADDY BANK), 24/26 Sule Pagoda Road, Yangon, Burma [BURMA].
90. MYANMAR TREASURE RESORTS (a.k.a. MYANMAR TREASURE BEACH RESORT; a.k.a. MYANMAR TREASURE BEACH RESORTS; a.k.a. MYANMAR TREASURE RESORT (BAGAN); a.k.a. MYANMAR TREASURE RESORT (PATHEIN); a.k.a. “MYANMAR TREASURE RESORT II”), No. 41 Shwe Taung Gyar Street, Bahan Township, Yangon, Burma; No 56 Shwe Taung Gyar Road, Golden Valley, Bahan Township, Yangon, Burma [BURMA].
91. MYANMAR RUBY ENTERPRISE CO. LTD. (a.k.a. MYANMAR RUBY ENTERPRISE), 24/26 Sule Pagoda Road, Kyauktada Township, Yangon, Burma [BURMA].
92. MAX MYANMAR SERVICES CO., LTD., 1 Ywama Curve, Bayint Naung Road, Ward (2), Hlaing Township, Yangon, Burma [BURMA].
93. MAX MYANMAR TRADING CO., LTD., 1 Ywama Curve, Bayint Naung Road, Ward (2), Hlaing Township, Yangon, Burma [BURMA].
94. MAX SINGAPORE INTERNATIONAL PTE. LTD., 3 Shenton Way, #24-02, Shenton House 068805, Singapore [BURMA].
95. MYANMAR AVIA EXPORT COMPANY LIMITED (a.k.a. MYANMAR AVIA EXPORT) [BURMA].
96. YUZANA COMPANY LIMITED (a.k.a. YUZANA CONSTRUCTION), No. 130 Yuzana Centre, Shwegondaing Road, Bahan Township, Yangon, Burma [BURMA].
97. PAVO TRADING PTE. LTD., 3 Shenton Way, #24-02 Shenton House, Singapore 068805, Singapore [BURMA].
98. PIONEER AERODROME SERVICES CO., LTD., No. 203/204, Thiri Mingalar Housing, Strand Rd, Ahlone, Rangoon, Burma; Registration ID 620/2007-2008 (Burma) [BURMA] (Linked To: ASIA WORLD CO. LTD.).
99. WIN, Nyan; DOB 22 Jan 1953; nationality Burma; citizen Burma; Major General; Minister of Foreign Affairs (individual) [BURMA].
100. WIN, Kyaw; DOB 03 Jan 1944; nationality Burma; citizen Burma; Lieutenant-General; Chief of Bureau of Special Operation 2; Member, State Peace and Development Council (individual) [BURMA].
101. ROYAL KUMUDRA HOTEL, No. 9 Hotel Zone, Nay Pyi Taw, Burma; No. 1 Ywama Curve, Ba Yint Naung Road, Block (2), Hlaing Township, Rangoon, Burma [BURMA].
102. S H NG TRADING, 3 Shenton Way, #10-01 Shenton House, Singapore 068805, Singapore [BURMA].
103. GREEN ASIA SERVICES CO., LTD., No. 61/62, Bahosi Housing, War Tan St., Lanmadaw T/S, Rangoon, Burma; Registration ID 4013/2011-2012 (Burma) [BURMA] (Linked To: ASIA WORLD CO. LTD.).
104. GLOBAL WORLD INSURANCE COMPANY LIMITED, No. 44, Thein Phyu Road, Corner of Bogyoke Aung San Road and Thein Phyu Road, Pazuntaung, Rangoon, Burma; Registration ID 2511/2012-2013 (Burma) [BURMA] (Linked To: ASIA WORLD CO. LTD.).
105. G A FOODSTUFFS PTE. LTD., 3 Shenton Way, #10-01 Shenton House, Singapore 068805, Singapore [BURMA].
106. G A LAND PTE. LTD., 3 Shenton Way, #10-01 Shenton House, Singapore 068805, Singapore [BURMA].
107. THET, Khin Lay (a.k.a. THET, Daw Khin Lay); DOB 19 Jun 1947; wife of Thura Shwe Mann (individual) [BURMA].
108. THIHA (a.k.a. THI HA), c/o Htoo Group of Companies, undetermined; c/o Htoo Trading Company Limited, undetermined; DOB 24 Jun 1960 (individual) [BURMA].
109. ZA, Tay (a.k.a. TAYZA; a.k.a. TEZA; a.k.a. ZA, Te; a.k.a. ZA, U Tay; a.k.a. ZA, U Te), 6 Cairnhill Circle, Number 18-07, Cairnhill Crest 229813, Singapore; Burma; DOB 18 Jul 1964; alt. DOB 18 Jun 1967; nationality Burma; citizen Burma; Managing Director, Htoo Trading Company Limited; Chairman, Air Bagan Holdings Pte. Ltd. (d.b.a. Air Bagan); Managing Director, Pavo Trading Pte. Ltd. (individual) [BURMA].
110. G A SENTOSA PTE. LTD., 3 Shenton Way, #10-01 Shenton House, Singapore 068805, Singapore [BURMA].
111. G A SENTOSA PTE. LTD., 101 Cecil Street, 08-08 Tong Eng Building, Singapore 069533, Singapore [BURMA].
Veterans Health Administration, Department of Veterans Affairs.
Notice.
The Veterans Health Administration (VHA) is announcing an opportunity for public comment on the proposed collection of certain information by the agency. Under the Paperwork Reduction Act (PRA) of 1995, Federal agencies are required to publish notice in the Federal Register concerning each proposed collection of information and to allow 60 days for public comment in response to the notice.
Written comments and recommendations on the proposed collection of information should be received on or before December 30, 2016.
Submit written comments on the collection of information through the Federal Docket Management System (FDMS) at
Brian McCarthy at (202) 461-6345.
Under the PRA of 1995 (Pub. L. 104-13; 44 U.S.C. 3501-3521), Federal agencies must obtain approval from OMB for each collection of information they conduct or sponsor. This request for comment is being made pursuant to Section 3506(c)(2)(A) of the PRA.
With respect to the following collection of information, VHA invites comments on: (1) Whether the proposed collection of information is necessary for the proper performance of VHA's functions, including whether the information will have practical utility; (2) the accuracy of VHA's estimate of the burden of the proposed collection of information; (3) ways to enhance the quality, utility, and clarity of the information to be collected; and (4) ways to minimize the burden of the collection of information on respondents, including through the use of automated collection techniques or the use of other forms of information technology.
By direction of the Secretary.
Office of Postsecondary Education, Department of Education.
Final regulations.
The Secretary establishes new regulations to implement requirements for the teacher preparation program accountability system under title II of the Higher Education Act of 1965, as amended (HEA), that will result in the collection and dissemination of more meaningful data on teacher preparation program quality (title II reporting system). The Secretary also amends the regulations governing the Teacher Education Assistance for College and Higher Education (TEACH) Grant program under title IV of the HEA to condition TEACH Grant program funding on teacher preparation program quality and to update, clarify, and improve the current regulations and align them with title II reporting system data.
The regulations in 34 CFR part 612 are effective November 30, 2016. The amendments to part 686 are effective on July 1, 2017, except for amendatory instructions 4.A., 4.B., 4.C.iv., 4.C.x. and 4.C.xi., amending 34 CFR 686.2(d) and (e), and amendatory instruction 6, amending 34 CFR 686.11, which are effective on July 1, 2021.
Sophia McArdle, Ph.D., U.S. Department of Education, 400 Maryland Avenue SW., Room 6W256, Washington, DC 20202. Telephone: (202) 453-6318 or by email:
If you use a telecommunications device for the deaf (TDD) or a text telephone (TTY), call the Federal Relay Service (FRS), toll free, at 1-800-877-8339.
Section 205 of the HEA requires States and institutions of higher education (IHEs) annually to report on various characteristics of their teacher preparation programs, including an assessment of program performance. These reporting requirements exist in part to ensure that members of the public, prospective teachers and employers (districts and schools), and the States, IHEs, and programs themselves have accurate information on the quality of these teacher preparation programs. These requirements also provide an impetus to States and IHEs to make improvements where they are needed. Thousands of novice teachers enter the profession every year
Research from States such as Tennessee, North Carolina, and Washington indicates that some teacher preparation programs report statistically significant differences in the student learning outcomes of their graduates.
Moreover, section 205 of the HEA requires States to report on the criteria they use to assess whether teacher preparation programs are low-performing or at-risk of being low-performing, but it is difficult to identify programs in need of remediation or closure because few of the reporting requirements ask for information indicative of program quality. The GAO report noted that half the States said current title II reporting system data were “slightly useful,” “neither useful nor not useful,” or “not useful”; over half the teacher preparation programs surveyed said the data were not useful in assessing their programs; and none of the surveyed school district staff said they used the data.
The final regulations address shortcomings in the current system by defining the indicators of quality that a State must use to assess the performance of its teacher preparation programs, including more meaningful indicators of program inputs and program outcomes, such as the ability of the program's graduates to produce gains in student learning.
The final regulations also link assessments of program performance under HEA title II to eligibility for the Federal TEACH Grant program. The TEACH Grant program, authorized by section 420M of the HEA, provides grants to eligible IHEs, which, in turn, use the funds to provide grants of up to $4,000 annually to eligible teacher preparation candidates who agree to serve as full-time teachers in high-need fields at low-income schools for not less than four academic years within eight years after completing their courses of study. If a TEACH Grant recipient fails to complete his or her service obligation, the grant is converted into a Federal Direct Unsubsidized Stafford Loan that must be repaid with interest.
Pursuant to section 420L(1)(A) of the HEA, one of the eligibility requirements for an institution to participate in the TEACH Grant program is that it must provide high-quality teacher preparation. However, of the 38 programs identified by States as “low-performing” or “at-risk,” 22 programs were offered by IHEs participating in the TEACH Grant program.
The final regulations—
• Establish necessary definitions and requirements for IHEs and States related to the quality of teacher preparation programs, and require States to develop measures for assessing teacher preparation performance.
• Establish indicators that States must use to report on teacher preparation program performance, to help ensure that the quality of teacher preparation programs is judged on reliable and valid indicators of program performance.
• Establish the areas States must consider in identifying teacher preparation programs that are low-performing and at-risk of being low-performing, the actions States must take with respect to those programs, and the consequences for a low-performing program that loses State approval or financial support. The final regulations also establish the conditions under which a program that loses State approval or financial support may regain its eligibility for title IV, HEA funding.
• Establish a link between the State's classification of a teacher preparation program's performance under the title II reporting system and that program's identification as “high-quality” for TEACH Grant eligibility purposes.
• Establish provisions that allow TEACH Grant recipients to satisfy the requirements of their agreement to serve by teaching in a high-need field that was designated as high-need at the time the grant was received.
• Establish conditions that allow TEACH Grant recipients to have their service obligations discharged if they are totally and permanently disabled. The final regulations also establish conditions under which a student who had a prior service obligation discharged due to total and permanent disability may receive a new TEACH Grant.
The benefits, costs, and transfers related to the regulations are discussed in more detail in the Regulatory Impact Analysis section of this document.
The net budget impact of the final regulations is approximately $0.49 million in reduced costs over the TEACH Grant cohorts from 2016 to 2026. We estimate that the total cost annualized over 10 years of the final regulations is between $27.5 million and $27.7 million (see the Accounting Statement section of this document).
On December 3, 2014, the Secretary published a notice of proposed rulemaking (NPRM) for these parts in the Federal Register.
We discuss substantive issues under the sections of the proposed regulations to which they pertain. Generally, we do not address technical or other minor changes.
Commenters argued that only the State may determine whether to include student academic achievement data (and by inference our other proposed indicators of academic content knowledge and teaching skills) in its assessments of teacher preparation program performance. One commenter contended that the Department's attempt to “shoehorn” student achievement data into the academic content knowledge and teaching skills of students enrolled in teacher preparation programs (section 205(b)(1)(F)) would render meaningless the language of section 207(a) that gives the State the authority to establish levels of performance and what those levels contain. These commenters argued that, as a result, the HEA prohibits the Department from requiring States to use any particular indicators. Other commenters argued that such State authority also flows from section 205(b)(1)(F) of the HEA.
Commenters also stated that the Department does not have the authority to require that a State's criteria for assessing the performance of any teacher preparation program include the indicators of academic content knowledge and teaching skills, including, “in significant part,” student learning outcomes and employment outcomes for high-need schools. See proposed §§ 612.6(a)(1) and 612.4(b)(1). Similar concerns were expressed with respect to proposed § 612.4(b)(2), which provided that a State could determine that a teacher preparation program was effective (or higher) only if the program was found to have “satisfactory or higher” student learning outcomes.
The proposed regulations, like the final regulations, reflect the fundamental principle and the statutory requirement that the assessment of teacher preparation program performance must be conducted by the State, with criteria the State establishes and levels of differentiated performance that are determined by the State. Section 205(b)(1)(F) of the HEA provides that a State must include in its report card a description of its criteria for assessing the performance of teacher preparation programs within IHEs in the State and that those criteria must include indicators of the academic content knowledge and teaching skills of students enrolled in such programs. Significantly, section 205(b)(1) further provides that the State's report card must conform with definitions and methods established by the Secretary, and section 205(c) authorizes the Secretary to prescribe regulations to ensure the reliability, validity, integrity, and accuracy of the data submitted in the report cards.
Consistent with those statutory provisions, § 612.5 establishes the indicators States must use to comply with the reporting requirement in section 205(b)(1)(F), namely by having States include in the report card their criteria for program assessment and the indicators of academic content knowledge and teaching skills that they must include in those criteria. While the term “teaching skills” is defined in section 200(23) of the HEA, the definition is complex, and the statute does not indicate what constitute appropriate indicators of the academic content knowledge and teaching skills of those who complete teacher preparation programs. Thus, in § 612.5, we establish reasonable definitions of these basic but ambiguous statutory phrases in an admittedly complex area (how States may reasonably assess the performance of their teacher preparation programs) so that the conclusions States reach about the performance of individual programs are valid and reliable in compliance with the statute. We discuss the reasonableness of the four general indicators of academic content knowledge and teaching skills that the Secretary has established in § 612.5 later in this preamble.
The provisions of § 612.5 are also wholly consistent with section 207(a) of the HEA. Section 207(a) provides that States determine the levels of performance in their assessments of program performance and discusses the criteria a State “may” include in those levels of performance. However, section 207(a) does not negate the basic requirement in section 205(b) that States include indicators of academic content knowledge and teaching skills within their program assessment criteria or the authority of the Secretary to establish definitions for report card elements. Moreover, the regulations do not limit a State's authority to establish, use, and report other criteria that the State determines are appropriate for generating a valid and reliable assessment of teacher preparation program performance. Section 612.5(b) of the regulations expressly permits States to supplement the required indicators with other indicators of a teacher's effect on student performance, including other indicators of academic content knowledge and teaching skills, provided that the State uses the same indicators for all teacher preparation programs in the State. In addition, working with stakeholders, States are free to determine how to apply these various criteria and indicators in order to determine, assess, and report whether a preparation program is low-performing or at-risk of being low-performing.
We appreciate commenters' concerns about the provisions in §§ 612.4(b)(1) and (b)(2) and 612.6(b)(1) regarding weighting and consideration of certain indicators. Based on consideration of the public comments and the potential complexity of these requirements, we have removed these provisions from the final regulations. While we have taken this action, we continue to believe strongly that providing significant weight to these indicators when determining a teacher preparation program's level of performance is very important. The ability of novice teachers to promote positive student academic growth should be central to the missions of all teacher preparation programs, which should focus on producing well-prepared novice teachers.
Commenters also stated that by mandating a system for rating teacher preparation programs, including the indicators by which teacher preparation programs must be rated, what a State must consider in identifying low-performing or at-risk teacher preparation programs, and the actions a State must take with respect to low-performing programs (proposed §§ 612.4, 612.5, and 612.6), the Federal government is impinging on the authority of States, which authorize, regulate, and approve IHEs and their teacher preparation programs.
As we discussed in response to the prior set of comments, these regulations establish definitions for terms provided in title II of the HEA in order to help ensure that the State and IHE reporting system meets its purpose. In authorizing the Secretary to define statutory terms and establish reporting methods needed to properly implement the title II reporting system, neither Congress nor the Department is abrogating State authority to authorize, regulate, and approve IHEs and their teacher preparation programs. Finally, in response to the comments that proposed §§ 612.4, 612.5, and 612.6 would impermissibly impinge on the authority of States in terms of actions they must take with respect to low-performing programs, we note that the regulations do little more than clarify the sanctions that Congress requires in section 207(b) of the HEA. Those sanctions address the circumstances in which students enrolled in a low-performing program may continue to receive or regain Federal student financial assistance, and thus the Federal government has a direct interest in the subject.
Since all LEAs stand to benefit from the success of the new reporting system through improved transparency and information about the quality of teacher preparation programs from which they may recruit and hire new teachers, we assume that all LEAs will want to work with their States to find manageable ways to implement the regulations. Moreover, without more information from the commenter, we cannot address why a particular State would not have the authority to insist that an LEA provide the State with the information it needs to meet these reporting requirements.
We disagree with comments that allege that the regulations reflect overreach by the Federal government into the province of States regarding the approval of teacher preparation programs and the academic domain of institutions that conduct these programs. The regulations do not constrain the academic judgments of particular institutions, what those institutions should teach in their specific programs, which students should attend those programs, or how those programs should be conducted. Nor do they dictate which teacher preparation programs States should approve or should not approve. Rather, by clarifying limited areas in which sections 205 and 207 of the HEA are unclear, the regulations implement the statutory mandate that, consistent with definitions and reporting methods the Secretary establishes, States assess the quality of the teacher preparation programs in their State, identify those that are low-performing or at-risk of being low-performing, and work to improve the performance of those programs.
With the changes we are making in these final regulations, the system for determining whether a program is low-performing or at-risk of being low-performing is unarguably a State-determined system. Specifically, as noted above, in assessing and reporting program performance, each State is free to (1) adopt and report other measures of program performance it believes are appropriate, (2) use discretion in how to measure student learning outcomes, employment outcomes, survey outcomes, and minimum program characteristics, and (3) determine for itself how these indicators of academic content knowledge and teaching skills and other criteria a State may choose to use will produce a valid and reliable overall assessment of each program's performance. Thus, the assessment system that each State will use is developed by the State, and does not compromise the ability of the State and its stakeholders to determine what is and is not a low-performing or at-risk teacher preparation program.
We also do not perceive a legitimate Tenth Amendment issue. The Tenth Amendment provides in pertinent part that powers not delegated to the Federal government by the Constitution are reserved to the States. Congress used its spending authority to require institutions that enroll students who receive Federal student financial assistance in teacher preparation programs, and States that receive HEA funds, to submit information as required by the Secretary in their institutional report cards (IRCs) and SRCs. Thus, the Secretary's authority to define the ambiguous statutory term “indicators of academic content knowledge and teaching skills” to include the measures the regulations establish, coupled with the authority States have under section 205(b)(1)(F) of the HEA to establish other criteria with which they assess program performance, resolves any claim that the assessment of program performance is a matter left to the States under the Tenth Amendment.
Similarly, some commenters stated that the proposed requirements in § 612.8(b)(1) for regaining eligibility to enroll students who receive title IV aid exceed the statutory authority in section 207(b)(4) of the HEA, which provides that a program is reinstated upon a demonstration of improved performance, as determined by the State. Commenters expressed concern that the proposed regulations would shift this responsibility from the State to the Federal government, and stated that teacher preparation programs could be caught in limbo. They argued that if a State had already reinstated funding and identified that a program had improved performance, the program's ability to enroll students who receive student financial aid would be conditioned on the Secretary's approval. The commenters contended that policy changes as significant as these should come from Congress, after scrutiny and deliberation of a reauthorized HEA.
Any teacher preparation program from which the State has withdrawn the State's approval, or terminated the State's financial support, due to the low performance of the program based upon the State assessment described in subsection (a)—
(1) Shall be ineligible for any funding for professional development activities awarded by the Department;
(2) May not be permitted to accept or enroll any student who receives aid under title IV in the institution's teacher preparation program;
(3) Shall provide transitional support, including remedial services if necessary, for students enrolled at the institution at the time of termination of financial support or withdrawal of approval; and
(4) Shall be reinstated upon demonstration of improved performance, as determined by the State.
Sections 612.7 and 612.8 implement this statutory provision through procedures that mirror existing requirements governing termination and reinstatement of student financial support under title IV of the HEA. As noted in the preceding discussion, our regulations do not usurp State authority to determine how to assess whether a given program is low-performing, and our requirement that States do so using, among other things, the indicators of novice teachers' academic content knowledge and teaching skills identified in § 612.5 is consistent with title II of the HEA.
Consistent with section 207(a) of the HEA, a State determines a teacher preparation program's performance level based on the State's use of those indicators and any other criteria or indicators the State chooses to use to measure the overall level of the program's performance. In addition, consistent with section 207(b), the loss of eligibility to enroll students receiving Federal student financial aid does not depend upon a Department decision. Rather, the State determines whether the performance of a particular teacher preparation program is so poor that it withdraws the State's approval of, or terminates the State's financial support for, that program. Each State may use a different decision model to make this determination, as contemplated by section 207(b).
Commenters' objections to our proposal for how a program subject to section 207(b) may regain eligibility to enroll students who receive title IV aid are misplaced. Section 207(b)(4) of the HEA provides that a program found to be low-performing is reinstated upon the State's determination that the program has improved, which presumably would need to include the State's reinstatement of State approval or financial support, since otherwise the institution would continue to lose its ability to accept or enroll students who receive title IV aid in its teacher preparation programs. However, the initial loss of eligibility to enroll students who receive title IV aid is a significant event, and we believe that Congress intended that section 207(b)(4) be read and implemented not in isolation, but rather in the context of the procedures established in 34 CFR 600.20 for reinstatement of eligibility based on the State's determination of improved performance.
Another commenter argued that in various ways the proposed regulations constitute a Federal overreach with regard to what Missouri provides in terms of State and local control and governance. Specifically, the commenter stated that the proposed regulations circumvent: the rights of Missouri school districts and citizens under the Missouri constitution to control the characteristics of quality education; the authority of the Missouri legislative process and the State Board of Education to determine program quality; State law (specifically, according to the commenter, Missouri House Bill 1490, which limits how school districts can share locally held student data such as student learning outcomes); and the process already underway to improve teacher preparation in Missouri.
Other commenters expressed concern that our proposal to require States to use student learning outcomes, employment outcomes, and survey outcomes, as defined in the proposed regulations, would create inconsistencies with what they consider to be the more comprehensive and more nuanced way in which their States assess teacher preparation program performance and then provide relevant feedback to programs and the institutions that operate them.
Finally, a number of commenters argued that requirements related to indicators of academic content knowledge and teaching skills are unnecessary because there is already an organization, the Council for the Accreditation of Educator Preparation (CAEP), which requires IHEs to report information similar to what the regulations require. These commenters claimed that the reporting of data on indicators of academic content knowledge and teaching skills related to each individual program on the SRC may be duplicative and unnecessary.
While the commenter who referred to Missouri law raised several broad concerns about purported Federal overreach into the State's laws, these concerns were very general. However, we note that in previously applying for and receiving ESEA flexibility, the Missouri Department of Elementary and Secondary Education (MDESE) agreed to have LEAs in the State implement basic changes in their teacher evaluation systems that would allow them to generate student growth data that would fulfill the student learning outcomes requirement. In doing so, the MDESE demonstrated that it was fully able to implement these types of activities without conflict with State law. Moreover, the regulations address neither how a State or LEA is to determine the characteristics of effective educators, nor State procedures and authority for determining when to approve a teacher preparation program. Nor do the regulations undermine any State efforts to improve teacher preparation; they simply require that, in implementing its responsibilities under sections 205(b) and 207(a) of the HEA and assessing the level of performance of each teacher preparation program, a State examine and report data about the performance of the novice teachers the program produces.
Finally, we note that, as enacted, House Bill 1490 specifically directs the Missouri State Board of Education to issue a rule regarding gathering student data in the Statewide Longitudinal Data System in terms of the Board's need to make certain data elements available to the public. This is the very process the State presumably would use to gather and report the data that these regulations require. In addition, we read House Bill 1490 to prohibit the MDESE, unless otherwise authorized, “to transfer personally identifiable student data”, something that the regulations do not contemplate. Further, we do not read House Bill 1490 as establishing the kind of limitation on LEAs' sharing student data with the MDESE that the commenter stresses. House Bill 1490 also requires the State Board to ensure
We are mindful that a number of States have begun their own efforts to use various methods and procedures to examine how well their teacher preparation programs are performing. For the title II reporting system, the HEA provides that State reporting must use common definitions and reporting methods that the Secretary determines necessary. While the regulations require all States to use data on student learning outcomes, employment outcomes, survey outcomes, and minimum program characteristics to determine which programs are low-performing or at-risk of being low-performing, States may, after working with their stakeholders, also adopt other criteria and indicators. We also know from the recent GAO report that more than half the States were already using information on program graduates' effectiveness in their teacher preparation program approval or renewal processes and at least 10 others planned to do so; we would expect such data to align with these reporting requirements.
Finally, with regard to the work of CAEP, we agree that CAEP may require some institutional reporting that is similar to the reporting required under the title II reporting system; however, reporting information to CAEP does not satisfy the reporting requirements under title II. Regardless of the information reported to CAEP, States and institutions still have a statutory obligation to submit SRCs and IRCs. The CAEP reporting requirements include the reporting of data associated with student learning outcomes, employment outcomes, and survey outcomes; however, CAEP standards do not require the disaggregation of data for individual teacher preparation programs, and this disaggregation is necessary for title II reporting.
Some commenters also expressed concern about the costs to States of providing technical assistance to teacher preparation programs that they find to be low-performing, and suggested that those programs could lose State approval or financial support.
Finally, in view of the challenges in collecting accurate and meaningful data on teacher preparation program graduates who fan out across the United States, commenters argued that the Department should find ways to provide financial resources to States and institutions to help them gather the kinds of data the regulations will require.
The regulations are designed to achieve these goals, while maintaining State responsibility for deciding how to consider the indicators of academic content knowledge and teaching skills described in § 612.5, along with other relevant criteria States choose to use. We recognize that moving from the current system—in which States, using criteria of their choosing, identified only 39 programs nationally in 2011 as low-performing or at-risk of being low-performing (see the NPRM, 79 FR 71823)—to one in which such determinations are based on meaningful indicators and criteria of program effectiveness is not without cost. We understand that States will need to make important decisions about how to provide for these costs. However, as explained in the
While providing technical assistance to low-performing teacher preparation programs will entail some costs, § 612.6(b) simply codifies the statutory requirement Congress established in section 207(a) of the HEA and offers examples of what this technical assistance could entail. Moreover, we assume that a State would want to provide such technical assistance rather than have the program continue to be low-performing and so remain at-risk of losing State support (and eligibility to enroll students who receive title IV aid).
Finally, commenters requested that we identify funding sources to help States and IHEs gather the required data on students who, upon completing their programs, do not stay in the State. We encourage States to gather and use data on all program graduates regardless of the State to which they ultimately move. However, given the evident costs of doing so on an interstate basis, the final regulations permit States to exclude these students from their calculations of student learning outcomes and teacher placement and retention rates, and from the employer and teacher surveys (see the definitions of teacher placement and retention rate in § 612.2 and the provisions governing student learning outcomes and survey outcomes in § 612.5(a)(1)(iii) and (a)(3)(ii)).
Additional commenters requested that this definition specifically refer to knowledge and skills regarding assessment. These commenters stated that the ability to measure student learning is an essential teaching skill.
Another commenter recommended that we specifically mention the distinct set of instructional skills necessary to address the needs of students who are gifted and talented. This commenter stated that there is a general lack of awareness of how to identify and support advanced and gifted learners, and that this lack of awareness has contributed to concerns about how well the Nation's top students are doing compared to top students around the world. The commenter also stated that this disparity could be rectified if teachers were required to address the specific needs of this group of students.
Multiple commenters requested that we develop data definitions and metrics related to the definition of “content and pedagogical knowledge,” and then collect related data on a national level. They stated that such a national reporting system would facilitate continuous improvement and quality assurance on a systemic level, while significantly reducing burden on States and programs.
Other commenters recommended that to directly assess for content knowledge and pedagogy, the definition of the term include rating graduates of teacher preparation programs based on a portfolio of the teaching candidates' work over the course of the academic program. These commenters stated that reviewing a portfolio reflecting a recent graduate's pedagogical preparation would be more reliable than rating an individual based on student learning, which cannot be reliably measured.
In this regard, we note that the purpose here is not to offer a comprehensive definition of the term that all States must use, as the commenters appear to recommend. Rather, it is to provide a general roadmap for States to use as they work with stakeholders (see § 612.4(c)) to decide how best to determine whether programs that lack the accreditation referenced in § 612.5(a)(4)(i) will ensure that students have the requisite content and pedagogical knowledge they will need as teachers before they complete the programs.
For this reason, we believe that requiring States to use a more prescriptive definition or to develop common data definitions and metrics aligned to that definition, as many commenters urged, would create unnecessary costs and burdens. Similarly, we do not believe that collecting this kind of data on a national level through the title II reporting system is worth the significant cost and burden that it would entail. Instead, we believe that States, working in consultation with stakeholders, should determine whether their State systems for evaluating program performance should include the kinds of additions to the definition of content and pedagogical knowledge that the commenters recommend.
We also stress that our definition underscores the need for teacher preparation programs to train teachers to have the content knowledge and pedagogical skills needed to address the learning needs of all students. It specifically refers to the need for a teacher to possess the distinct skills necessary to meet the needs of English learners and students with disabilities, both because students in these two groups face particular challenges and require additional support and because we want to emphasize the need for programs to train aspiring teachers to teach to the learning needs of the most vulnerable students they will have in their classrooms. While the definition's focus on all students plainly includes students who are gifted and talented, as well as students in all other subgroups, we do not believe that, for purposes of this title II reporting system, the definition of “content and pedagogical skills” requires similar special reference to those or other student groups. However, we emphasize again that States are free to adopt many of the commenters' recommendations. For example, because the definition refers to “effective learning experiences that make the discipline accessible and meaningful for all students,” States may consider a teacher's ability to factor students' cultural, linguistic, and experiential backgrounds into the design and implementation of productive learning experiences, just as States may include a specific focus on the learning needs of students who are gifted and talented.
Finally, through this definition we are not mandating a particular method for assessing the content and pedagogical knowledge of teachers. As such, under the definition, States may allow teacher preparation programs to use a portfolio review to assess teachers' acquisition of content and pedagogical knowledge.
Some commenters suggested, alternatively, that the Department include an additional disaggregation requirement for high-need subject areas. These commenters stated that targeting high-need subject areas would have a greater connection to employment outcomes than would high-need schools and, as such, should be tracked as a separate category when judging the quality of teacher preparation programs.
A number of commenters requested that the definition of high-need school include schools with low graduation rates. Other commenters agreed that this definition should be based on poverty, as defined in section 200(11) of the HEA, but also recommended that a performance component should be included. Specifically, these commenters suggested that high schools in which one-third or more of the students do not graduate on time be designated as high-need schools. Other commenters recommended including geography as an indicator of a school's need, arguing that, in their experience, high schools' urbanicity plays a significant role in determining student success.
Other commenters expressed concerns with using a quartile-based ranking of all schools to determine which schools are considered high need. These commenters stated that such an approach may lead to schools with very different economic conditions being considered high need. For example, a school in one district might fall into that district's highest poverty quartile with only 15 percent of students living in poverty, while a school in another district would need to have 75 percent of students living in poverty to meet the same designation.
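To make the comparability concern concrete, the following minimal sketch (illustrative only, with invented school names, invented poverty percentages, and a simplified quartile cut, not the statutory calculation) shows how ranking schools within each LEA and flagging the highest poverty quartile can label a 15 percent poverty school high-need in one district while requiring a far higher poverty rate in another.

```python
# Illustrative sketch only: invented data and a simplified quartile cut,
# not the statutory or regulatory calculation.

def high_need_by_quartile(schools):
    """Return the schools in an LEA's highest poverty quartile.

    `schools` maps school name -> percent of students living in poverty.
    """
    ranked = sorted(schools.items(), key=lambda item: item[1], reverse=True)
    quartile_size = max(1, len(ranked) // 4)  # top 25 percent, at least one school
    return ranked[:quartile_size]

lea_low_poverty = {"School A1": 15, "School A2": 12, "School A3": 9, "School A4": 5}
lea_high_poverty = {"School B1": 80, "School B2": 75, "School B3": 60, "School B4": 50}

# In the first LEA a 15 percent poverty school is flagged; in the second,
# the threshold for the same designation is 80 percent.
print(high_need_by_quartile(lea_low_poverty))   # [('School A1', 15)]
print(high_need_by_quartile(lea_high_poverty))  # [('School B1', 80)]
```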
We acknowledge the concern expressed by some commenters that the definition of “high-need school” permits schools in different LEAs (and indeed, depending on the breakdown of an LEA's schools in the highest quartile based on poverty, in the same LEA as well) that serve communities with very different levels of poverty all to be considered high-need. However, for a reporting system that will use placement and retention rates in high-need schools as factors bearing on the performance of each teacher preparation program, States may consider applying significantly greater weight to employment outcomes for novice teachers who work in LEAs and schools that serve high-poverty areas than for novice teachers who work in LEAs and schools that serve low-poverty areas.
Moreover, while we acknowledge that the definition of “high-need school” in section 200(11)(A) of the HEA does not apply to the statutory provisions requiring the submission of SRCs and IRCs, we believe that, if we use the term in the title II reporting system, it is reasonable to give some deference to the definition used elsewhere in title II of the HEA. For the reasons provided above, we believe the definition can work well for the title II reporting system.
Furthermore, we disagree with the comments that the definition of “high-need school” should include high-need subject areas. As defined in the regulations, a “teacher preparation program” is a program that leads to an initial State teacher certification or licensure in a specific field. Thus, the State's assessment of a teacher preparation program's performance already focuses on a specific subject area, including those we believe States would generally consider to be high-need. In addition, maintaining focus on placement of teachers in schools where students come from families with high actual or relative poverty levels, and not on the subject areas they teach in those schools, will help maintain a focus on the success of students who have fewer opportunities. We therefore do not see the benefit of further burdening State reporting by carrying factors that focus on high-need subject areas into the definition of “high-need school,” as commenters recommend.
We also disagree that the definition of “high-need school” should include an additional criterion of low graduation rates. While we agree that addressing the needs of schools with low graduation rates is a major priority, we believe the definition of “high-need school” should focus on the poverty level of the area the school serves. A poverty-based measure is easy to calculate and understand, and including this additional component would complicate the data collection and analysis process for States. Moreover, we believe there is a sufficiently high correlation between schools in high-poverty areas, which our definition would deem high-need, and the schools with low graduation rates on which the commenters desire to have the definition focus. We believe this correlation means that a large proportion of schools with low graduation rates would be included in a definition of high-need schools that focuses on poverty.
CEP schools are not permitted to use household applications to determine a reimbursement percentage from the USDA. Rather, the USDA determines meal reimbursement for CEP schools based on “claiming percentages,” calculated by multiplying the percentage of students identified through the direct certification data by a multiplier established in the Healthy, Hunger-Free Kids Act of 2010 and set in regulation at 1.6. The 1.6 multiplier provides an estimate of the number of students who would be eligible for free and reduced-price meals in CEP schools if the schools determined eligibility through traditional means, using both direct certification and household applications. If a State uses NSLP data from CEP schools when determining whether schools are high-need schools, it should not use the number of children actually receiving free meals in CEP schools to determine the percentage of students from low-income families because, in those schools, some children receiving free meals live in households that do not meet a definition of low-income. Therefore, States that wish to use NSLP data for purposes of determining the percentage of children from low-income families in schools that are participating in Community Eligibility should use the number of children for whom the LEA is receiving reimbursement from the USDA (the direct certification total multiplied by 1.6), not to exceed 100 percent of children enrolled. Consider, for example, a school that participates in Community Eligibility with an enrollment of 1,000 children. The school identifies 600 children through direct certification data as eligible for the NSLP. Multiplying 600 by 1.6 yields 960, so the LEA would receive reimbursement through the NSLP for meals for 960 children, or 96 percent of students enrolled. In a ranking of schools in the LEA on the basis of the percentage of students from low-income families, even though 100 percent of students are receiving free meals through the NSLP, the school would be ranked on the basis of 96 percent of students from low-income families. Using claiming percentages to identify CEP schools as high-need schools, rather than the number of students actually receiving free lunch through the NSLP, ensures comparability regardless of an individual school's decision regarding participation in the program.
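The claiming-percentage arithmetic described above can be expressed as a short calculation. The sketch below is illustrative only; the enrollment and direct-certification counts mirror the hypothetical example in the preceding paragraph, and the 1.6 multiplier is the value set in regulation.

```python
# Illustrative sketch of the CEP claiming-percentage calculation described
# above; the counts mirror the hypothetical example (1,000 enrolled, 600
# directly certified) and 1.6 is the multiplier set in regulation.

CEP_MULTIPLIER = 1.6

def cep_low_income_percentage(enrollment, directly_certified):
    """Percent of students treated as low-income when ranking a CEP school."""
    reimbursed_count = min(directly_certified * CEP_MULTIPLIER, enrollment)
    return 100 * reimbursed_count / enrollment

print(cep_low_income_percentage(1000, 600))  # 96.0 -> ranked at 96 percent
print(cep_low_income_percentage(1000, 700))  # 100.0 -> capped at total enrollment
```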
A number of commenters claimed that the proposed definition confused the attainment of certification or licensure with graduation from a program, which is often a precursor for certification or licensure. They stated that the proposed definition was not clear regarding how States would report on recent program completers who are entering the classroom. Others noted that some
Other commenters specifically requested that the definition include pre-kindergarten teachers (if a State requires postsecondary education and training for pre-kindergarten teachers), and that pre-kindergarten teachers be reflected in teacher preparation program assessment.
A number of commenters also recommended that the word “recent” be removed from the definition of “new teacher” so that individuals who take time off between completing their teaching degree and obtaining a job in a classroom are still considered to be new teachers. They argued that individuals who take time off to raise a family or who do not immediately find a full-time teaching position should still be considered new teachers if they have not already had full-time teaching experience. Other commenters stated that the term “new teacher” may result in confusion based on State decisions about when an individual may begin teaching. For example, the commenters stated that in Colorado teachers may obtain an alternative license and begin teaching before completing a formal licensure program. As such, new teachers may have been teaching for up to three years at the point that the proposed definition would consider them to be a “new teacher,” and the proposed definition therefore may cause confusion among data entry staff about which individuals should be reported as new teachers. They recommended that we replace the term “new teacher” with the term “employed completer” because the latter more clearly reflects that an individual would need to complete his or her program and have found employment to be included in the reporting requirements.
We understand that many alternative route teacher preparation programs place their participants as teachers while they are enrolled in their programs, and many traditional preparation program participants are only placed after earning their credential. Furthermore, we agree that direct comparisons between alternative route and traditional teacher preparation programs could be misleading if done without a more complete understanding of the inherent differences between the two types of programs. For example, a recent completer of an alternative route program may actually have several more years of teaching experience than a recent graduate of a traditional teacher preparation program, so apparent differences in their performance may be based more on the specific teacher's experience than the quality of the preparation program.
In addition, we agree with commenters that the preparation of preschool teachers is a critical part of improving early childhood education, and inclusion of these staff in the assessment of teacher preparation program quality could provide valuable insights. We strongly encourage States that require preschool teachers to obtain either the same level of licensure as elementary school teachers, or a level of licensure focused on preschool or early childhood education, to include preschool teachers who teach in public schools in their assessment of the quality of their teacher preparation programs. However, we also recognize that preschool licensure and teacher evaluation requirements vary among States and among settings, and therefore believe that it is important to leave the determination of whether and how to include preschool teachers in this measure to the States. We hope that States will base their determination on what is most supportive of high-quality early childhood education in their State.
We also agree with commenters that the proposed term “new teacher” may result in confusion based on State decisions about when individuals in an alternative route program have the certification they need to begin teaching, and that, in some cases, these individuals may have taught for up to three years before the proposed definition would consider them to be new teachers. We believe, however, that the term “employed completer” could be problematic for alternative route programs because, while their participants are employed, they may not have yet completed their program.
Likewise, we agree with commenters who expressed concern that our proposed definition of “new teacher” confuses the attainment of certification or licensure with graduation from a program leading to recommendation for certification or licensure.
For all of these reasons, we are removing the term and definition of “new teacher” and replacing it with the term “novice teacher,” which we are defining as “a teacher of record in the first three years of teaching who teaches elementary or secondary public school students, which may include, at a State's discretion, preschool students.” We believe this new term and definition more clearly distinguish between individuals who have met all the requirements of a teacher preparation program (recent graduates) and those who have been assigned the lead responsibility for a student's learning (novice teachers).
Finally, we agree with commenters that we should remove the word “recent” from the definition, and have made this change. As commenters suggest, making this change will ensure that individuals who take time off between completing their teacher preparation program and obtaining a job in a classroom, or who do not immediately find a full-time teaching position, are still included in the definition of “novice teacher.” Therefore, our definition of “novice teacher” does not include the word “recent”; the term instead clarifies that a novice teacher is an individual who is responsible for student outcomes, while still allowing individuals who are recent graduates to be categorized as novice teachers for three years in order to account for delays in placement.
Commenters suggested that the definition include a requirement that mentor teachers be “effective.” While our proposed definition did not use the term “mentor teacher,” we interpret the comments as pertaining to the language of paragraph (1) of the proposed definition—the requirement that those LEA-based personnel who provide training be qualified clinical instructors. Commenters also suggested that we eliminate the phrase “at least in part” when referring to the training to be provided by qualified clinical instructors, and that we require the clinical practice to include experience with high-need and high-ability students, as well as the use of data analysis and development of classroom management skills.
Other commenters suggested that the definition require multiple clinical or field experiences, or both, with effective mentor teachers who (1) address the needs of diverse, rural, or underrepresented student populations in elementary and secondary schools, including English learners, students with disabilities, high-need students, and high-ability students, and (2) assess the clinical experiences using a performance-based protocol to demonstrate teacher candidates' mastery of content and pedagogy.
Some commenters suggested that the definition require that teacher candidates use specific research-based practices in addition to those currently listed in the definition, including data analysis, differentiation, and classroom management. The commenters recommended that all instructors be qualified clinical instructors, and that they ensure that clinical experiences include working with high-need and high-ability students because doing so will provide a more robust and realistic clinical experience.
Commenters further suggested that “quality clinical preparation” use a program model similar to that utilized by many alternative route programs. This model would include significant in-service training and support as a fundamental and required component, alongside an accelerated pre-service training program. Another commenter suggested the inclusion of residency programs in the definition.
Commenters also suggested that the Department adopt, for the title II reporting system, the definitions of the terms “clinical experience” and “clinical practice” used by CAEP so that the regulatory definitions describe a collaborative relationship between a teacher preparation program and a school district. Commenters explained that CAEP defines “clinical experiences” as guided, hands-on, practical applications and demonstrations of professional knowledge of theory to practice, skills, and dispositions through collaborative and facilitated learning in field-based assignments, tasks, activities, and assessments across a variety of settings. Commenters further explained that CAEP defines “clinical practice” as student teaching or internship opportunities that provide candidates with an intensive and extensive culminating field-based set of responsibilities, assignments, tasks, activities, and assessments that demonstrate candidates' progressive development of the professional knowledge, skills, and dispositions to be effective educators. Another commenter recommended that we develop common definitions of data and metrics on quality clinical preparation.
We agree with the recommendation to remove the phrase “at least in part” from the definition, so that all training must be provided by quality clinical instructors.
We decline to revise the definition to provide that quality clinical preparation specifically include work with high-need or high-ability students, using data analysis and differentiation, and developing classroom management skills. We agree that these are important elements in developing highly effective educators and could be an important part of clinical preparation. However, the purpose of this definition is to highlight general characteristics of quality clinical instruction that must be reflected in how a State assesses teacher preparation program performance, rather than provide a comprehensive list of elements of quality clinical preparation. We believe that including the additional elements suggested by the commenters would result in an overly prescriptive definition. We note, however, that States are free to supplement this definition with additional criteria for assessing teacher preparation program performance.
We also decline to revise the definition to provide that quality clinical preparation be assessed using a performance-based protocol as a means of demonstrating candidates' mastery of content and pedagogy. While this is a strong approach that States may choose to take, we are not revising the definition to prescribe this particular method because we believe it may in some cases be overly burdensome.
We decline commenters' recommendation to include significant in-service training and support as a fundamental and required component, alongside an accelerated pre-service training program. Similarly, we reject the suggestion to include residency programs in the definition. Here again, we feel that both of these additional qualifications would result in a definition that is too prescriptive. Moreover, as noted above, this definition is meant to highlight general characteristics of quality clinical instruction that must be reflected in how a State assesses teacher preparation program performance, rather than to provide a comprehensive list of elements of quality clinical preparation.
Furthermore, while we understand why commenters recommended that we use CAEP's definitions, we do not want to issue an overly prescriptive definition of what is and is not quality clinical preparation, nor do we want to endorse any particular organization's approach. Rather, we are defining a basic indicator of teacher preparation program performance for programs that do not meet the program accreditation provision in § 612.5(a)(4)(i). However, States are free to build the CAEP definitions into their own criteria for assessing teacher preparation program performance; furthermore, programs may implement CAEP criteria.
We encourage States and teacher preparation programs to adopt research-based practices of effective teacher preparation for all aspects of their program accountability systems. Indeed, we believe the accountability systems that States establish will help programs and States to gather more evidence about what aspects of clinical training and other parts of preparation programs lead to the most successful teachers. However, we decline to develop more prescriptive common definitions of data and metrics on quality clinical preparation.
Other commenters recommended changing the definition of “recent graduate” to limit it to those graduates of teacher preparation programs who are currently credentialed and practicing teachers. The commenters stated that this would avoid having programs with completers who become gainfully employed in a non-education field or enroll in graduate school being penalized when the State determines the program's performance.
Furthermore, we decline to amend the definition to include only those individuals who are currently credentialed and practicing teachers. Doing so would create confusion between this term and “novice teacher” (defined elsewhere in this document). The term “novice teacher” is designed to capture individuals who are in their first three years of teaching, whereas the definition of “recent graduate” is designed to capture individuals who have completed a program, regardless of whether they are teaching. In order to maintain this distinction, we have retained the prohibitions that currently exist in the definitions in the title II reporting system against using recommendation to the State for licensure or becoming a teacher of record as a condition of being identified as a recent graduate.
We are, however, making slight modifications to the proposed definition. Specifically, we are removing the reference to being hired as a full-time teacher and instead using the phrase “becoming a teacher of record.” We do not believe this substantially changes the meaning of “recent graduate,” but it does clarify which newly hired, full-time teachers are to be captured under the definition.
We decline to provide States with additional flexibility in establishing other criteria for making a candidate a program completer because we believe that the revised definition of the term “recent graduate” provides States with sufficient flexibility. We believe that the additional flexibility suggested by the commenters would result in definitions that stray from the intent of the regulations.
Some commenters expressed concern that programs would be penalized if some individuals who have completed them go on to become gainfully employed in a non-education field or enroll in graduate school. We feel that it is important for the public and prospective students to know the degree to which participants in a teacher preparation program do not become teachers, regardless of whether they become gainfully employed in a non-education field. However, we think it is reasonable to allow States flexibility to exclude certain individuals when determining the teacher placement and retention rates.
Commenters recommended adding a number of specific items to the definition of exit qualifications, such as classroom management, differentiated instructional planning, and an assessment of student growth over time.
Another commenter suggested amending the definition to include culturally competent teaching, which the commenter defined as the ability of educators to teach students intellectual, social, emotional, and political knowledge by utilizing their diverse cultural knowledge, prior experiences, linguistic needs, and performance styles. This commenter stated that culturally competent teaching is an essential pedagogical skill that teachers must possess. The commenter also recommended that we include as separate terms and define “culturally competent education” and “culturally competent leadership”. Finally, this commenter requested that we develop guidance on culturally and linguistically appropriate approaches in education.
In our definition of rigorous exit requirements, we identified four basic characteristics that we believe all teacher candidates should possess. Regarding the specific components of rigorous exit requirements that commenters suggested (such as standards-based and differentiated planning, classroom management, and cultural competency), the definition does not preclude States from including those kinds of elements as rigorous exit requirements. We acknowledge that these additional characteristics, including cultural competency, may also be important, but we believe that the inclusion of these additional characteristics should be left to the discretion of States, in consultation with their stakeholders. To the extent that they choose to include them, States would need to develop definitions for each additional element. We also encourage interested parties to bring these suggestions forward to their States in the stakeholder engagement process required of all States in the design of their performance rating systems (see § 612.4(c)). Given that we are not adding cultural competency into the definition of rigorous candidate exit requirements, we are not adding the recommended related definitions or developing guidance on this topic at this time.
In addition, as we reviewed comments, we realized both that the phrase “at a minimum” was misplaced in the sentence and should refer not to the use of an assessment but to the use of validated standards and measures of the candidate's effectiveness, and that the second use of “measures of” in the phrase “measures of candidate effectiveness including measures of curriculum planning” was redundant.
Under the revised definition of student growth, States must use measures of student learning and performance, such as students' results on pre-tests and end-of-course tests, objective performance-based assessments, student learning objectives, student performance on English language proficiency assessments, and other measures of student achievement that are rigorous, comparable across schools, and consistent with State requirements. Further, as a number of commenters recommended that the definition of student achievement in non-tested grades and subjects include alignment to State and local standards, we feel that this new definition of student growth, in conjunction with altered requirements in the calculation of student learning outcomes, is sufficiently flexible to allow such alignment. In addition, a State could adopt the commenters' recommendations summarized above under the revised requirements for the calculation of student learning outcomes and the revised definition of “student growth.”
We note that the quality of individual teachers is not being measured by the student learning outcomes indicator. Rather, it will help measure overall performance of a teacher preparation program through an examination of student growth in the many grades and subjects taught by novice teachers that are not part of the State's assessment system under section 1111(b) of the ESEA, as amended by the ESSA.
While the revised requirement does not necessitate the use of ESEA standardized test scores, we believe that the use of such scores could be a valid and reliable measure of student growth and encourage its use in determining student learning outcomes where appropriate.
We now turn to the comments from those who asserted that maintaining a link between this definition and conditions of waivers granted to States under ESEA flexibility is problematic. While we maintain the substance of this definition in the definition of “student growth,” in view of section 4(c) of ESSA, which terminates waivers the Department granted under ESEA flexibility as of August 1, 2016, we have revised the requirements for calculation of student learning outcomes in § 612.5(a)(1)(ii) to allow States the flexibility to use “another State-determined measure relevant to calculating student learning outcomes.” We believe that doing so allows the flexibility recommended by commenters. In addition, as we have stressed above in the discussion of
Finally, the use of value-added measures is not specifically included in the definition, in the revised requirements for the calculation of student learning outcomes, or otherwise required by the regulations. However, we believe that there is convincing evidence that value-added scores, based on standardized tests, can be valid and reliable measures of teacher effectiveness and of a teacher's effect on long-term student outcomes.
A number of commenters also stated that the definition of “student growth” has created new testing requirements in areas that were previously not tested. They urged that non-tested grades and subjects should not be a part of the definition of student growth. By including them in this definition, the commenters argued, States and school districts would be required to test students in currently non-tested areas, which they contended should remain non-tested. Several commenters also stated that, even as the value of yearly student testing is being questioned, the regulations would effectively add cost and burden to States that have not sought ESEA flexibility or received Race to the Top funds.
Due to the removal of separate definitions of student achievement in tested grades and subjects and student achievement in non-tested grades and subjects, and their replacement by one flexible definition of student growth, we believe we have addressed many concerns raised by commenters. This definition, for example, no longer requires States to use ESEA standardized test scores to measure student growth in any grade or subject, and does not require the use of definitions of terms used for Race to the Top.
We recognize commenters' assertion that student growth defined as a comparison of achievement between two points in time downplays the potential challenges of incorporating such measures into evaluation systems. However, since the revised definition of student growth and the revised requirements for calculating student learning outcomes allow States a large degree of flexibility in how such measures are applied, we do not believe the revised definition will place a significant burden on States to implement and incorporate these concepts into their teacher preparation assessment systems.
We have addressed commenters' recommendation that non-tested grades and subjects not be a part of the definition of student growth by removing the definition of student achievement in non-tested grades and subjects, and providing States with flexibility in how they apply the definition of student growth, should they choose to use it for measuring a program's student learning outcomes. However, we continue to believe that student growth in non-tested grades and subjects can and should be measured at
Consistent with the definition, and in conjunction with the altered requirements for the calculation of student learning outcomes, and the removal of the definition of student achievement in tested grades and subjects as well as the definition of student achievement in non-tested grades and subjects, States have significant flexibility to determine the methods they use for measuring student growth and the extent to which it is factored into a teacher preparation program's performance rating. The Department's revised definition of “student growth” is meant to provide States with more flexibility in response to commenters. Additionally, if a State chooses to use a method that controls for additional factors affecting student and teacher performance, like VAM, the regulations permit it to do so. See our response to comments in § 612.5(a)(1), which provides an in-depth discussion of the use of student growth and VAM.
Further, upon review of the proposed regulations, we recognized that the structure could be confusing. In particular, we were concerned that having a definition for the term “student learning outcomes” in § 612.2, when it largely serves to operationalize other definitions in the context of § 612.5, was not the clearest way to present these requirements. We therefore are moving the explanations and requirements of this term into the text of § 612.5(a).
Some commenters also stated that it is important to follow graduates through surveys for their first five years of employment, rather than just their first year of teaching (as proposed in the regulations) to obtain a rich and well-informed understanding of the profession over time, as the first five years is a significant period when teachers decide whether to leave or stay in the profession.
Commenters were concerned about the inclusion of probationary certificate teachers in surveys of teachers and employers for purposes of reporting teacher preparation program performance. Commenters noted that, in Texas, alternate route participants may be issued a probationary certificate that allows the participants to be employed as teachers of record for a period of up to three years while they are completing the requirements for a standard certificate. As a result, these probationary certificate holders would meet the proposed definition of “new teacher” and, therefore, they and their supervisors would be asked to respond to surveys that States would use to determine teacher preparation program performance, even though they have not completed their programs.
In addition, commenters asked which States are responsible for surveying teachers from a distance education program and their employers or supervisors.
The goal of every teacher preparation program is to effectively prepare aspiring teachers to step into a classroom and teach all of their students well. As the regulations are intended to help States determine whether each teacher preparation program is meeting this goal, we have decided to focus on novice teachers in their first year of teaching, regardless of the type of certification the teachers have or the type of teacher preparation program they attended or are attending. When a teacher is given primary responsibility for the learning outcomes of a group of students, the type of program she attended or is still attending is largely irrelevant—she is expected to ensure that her students learn. We expect that alternative route teacher preparation programs are ensuring that the teachers they place in classrooms prior to completion of their coursework are sufficiently prepared to ensure student growth in that school year. We recognize that these teachers, and those who completed traditional teacher
We agree with commenters who suggested that surveying teachers and their employers about the quality of training in the teachers' preparation program would provide a richer and more well-informed understanding of the programs over time. However, we decline to require that States survey novice teachers and their employers for more than one year. As an indicator of novice teachers' academic content knowledge and teaching skills, these surveys are a much more robust indicator of program performance in preparing novice teachers for teaching when completed in the first year of teaching. At that point, the program is still fresh, and teachers and employers can best focus on the unique impact of the program, independent of other factors that may contribute to teaching quality, such as on-the-job training. However, if they so choose, States are free to survey novice teachers and their employers in subsequent years beyond a teacher's first year of teaching, and to consider the survey results in their assessment of teacher preparation program effectiveness.
For teacher preparation programs provided through distance education, a State must survey the novice teachers described in the definition of “teacher survey” who have completed such a program and who teach in that State, as well as the employers of those same teachers.
Through this change, we are clarifying that the surveys will assess whether novice teachers possess the academic content knowledge and teaching skills needed to succeed in the classroom. We do so for consistency with § 612.5(a), which requires States to assess, for each teacher preparation program, indicators of academic content knowledge and teaching skills of novice teachers from that program. We also have removed the provision that the survey is of teachers in their first year of teaching in the State where the teacher preparation is located, and instead provide that the survey is of teachers in their first year teaching in the State. This change is designed to be consistent with new language related to the reporting of teacher preparation programs provided through distance education, as discussed later in this document. Finally, we are changing the term “new teacher” to “novice teacher” for the reasons discussed under the definition of “novice teacher.”
Commenters also noted that not all States may have teacher evaluation measures that meet the proposed definition because not all States require student growth to be a significant factor in teacher evaluations, as required by the proposed definition. Other commenters suggested that, while student growth or achievement should be listed as the primary factors in calculating teacher evaluation measures, other factors such as teacher portfolios and student and teacher surveys should be included as secondary considerations.
Some commenters felt that any use of student performance to evaluate effectiveness of teacher instruction needs to include multiple measures over a period of time (more than one to two years) and take into consideration the context (socioeconomic, etc.) in which the instruction occurred.
Furthermore, while we agree that reporting on student growth separately from teacher evaluation measures would likely provide the public with more information about the performance of novice teachers, we are committed to providing States the flexibility to develop performance systems that best meet their specific needs. In addition, because of the evident cost and burden of disaggregating student growth data from teacher evaluation measures, we do not believe that the HEA title II reporting system is the right vehicle for gathering this information. As a result, we decline to require separate reporting.
States may consider having LEAs incorporate teacher portfolios and student and teacher surveys into teacher evaluation measures, as the commenters recommended. In this regard, we note that the definition of "teacher evaluation measure" requires use of multiple valid measures, and we believe that teacher evaluation systems that use such additional measures of professional practice provide the best information on a teacher's effectiveness. We also note that, because the definition of "novice teacher" encompasses the first three years as a teacher of record, teacher evaluation measures that include up to three years of student growth data are acceptable measures of student learning outcomes under § 612.5(a)(1). In addition, States can control for different kinds of student and classroom characteristics in ways that apply our definition of student learning outcomes and student growth. See the discussion of § 612.5(a)(2) for further information on the student learning outcomes indicator.
With regard to the comment that some States lack teacher evaluation measures that meet the proposed definition because they do not require student growth to be a significant factor in teacher evaluations, we previously explained in our discussion of § 612.1 (and do so again in our discussion of § 612.6) our reasons for removing any
We understand that some States and districts that use student growth in their teacher evaluation systems do not do so for teachers in their first year, or first several years, of teaching. We are satisfied that such systems meet the requirements of the regulations so long as student growth is used as one of the multiple valid measures to assess teacher performance within the first three years of teaching. To ensure such systems meet the definition of “teacher evaluation measure,” we are revising the phrase “in determining each teacher's performance level” in the first sentence of the definition so that it reads “in determining teacher performance.”
Furthermore, for the reasons included in the discussion of §§ 612.1 and 612.6, we are removing the phrase “as a significant factor” from the definition. In addition, we are removing the phrase “of performance levels” from the second sentence of the definition, as inclusion of that phrase in the NPRM was an error.
In addition, we have determined that the parenthetical phrase beginning “such as” could be shortened without changing the intent, which is to provide examples of other measures of professional practice.
Finally, in response to commenters' desire for additional flexibility in calculating student learning outcomes, and given the newly enacted ESSA, under which waivers granted under ESEA flexibility will terminate as of August 1, 2016, we have revised the regulations so that States may use student growth, a teacher evaluation measure, another State-determined measure relevant to calculating student learning outcomes, or any combination of these three options.
We are adopting a commonly used definition of “teacher of record” that focuses on a teacher or co-teacher who is responsible for student outcomes and determining a student's proficiency in the grade or subject being taught.
The regulations do not, as the commenters state, establish any detailed expectations of what such a low (or high) teacher placement rate is or should be. This they leave up to each State, in consultation with its group of stakeholders as required under § 612.4(c).
We decline to accept commenters' recommendations to allow States to determine who may be excluded from placement rate calculations beyond the exclusions the regulations permit in the definition of "teacher placement rate." Congress has directed that States report their teacher placement rate data "in a uniform and comprehensible manner that conforms to the definitions and methods established by the Secretary." See section 205(a) of the HEA. We believe the groups of recent graduates that we permit States, at their discretion, to exclude from these calculations—teachers teaching out of State and in private schools, and teachers who have enrolled in graduate school or entered the military—reflect the most common and accepted groups of recent graduates that States should be able to exclude, either because States cannot readily track them or because individual decisions to forgo becoming teachers do not speak to the program's performance. Commenters did not propose another comparable group whose failure to become novice teachers should allow a State to exclude them in calculations of a program's teacher placement rate, and upon review of the
We accept that, in discussing this matter with its group of stakeholders, a State may identify one or more such groups of recent graduates whose decisions to pass up opportunities to become novice teachers are also reasonable. However, as we said above, a teacher placement rate becomes an indicator of a teacher preparation program's performance when it is unreasonably low,
Some commenters objected to permitting States to exclude teachers or recent graduates who take teaching positions out of State, arguing that, to be useful, placement rate data need to be gathered across State boundaries as program graduates work in numerous States.
With regard to the efficacy of the teacher placement rate as an indicator of program performance, we understand that employment outcomes, including teacher placement rates, are influenced by many factors, some of which are outside of a program's control. However, we believe that employment outcomes are, in general, a good reflection of program quality because they signal a program's ability to produce graduates whom schools and districts deem to be qualified and seek to hire and retain. Moreover, abnormally low employment outcomes are an indication that something about the program is amiss (just as abnormally high outcomes suggest something is working very well). Further discussion on this topic can be found under the subheading
While we are sympathetic to the commenters' concern that the proposed definition of teacher placement rate permits States to calculate employment outcomes only using data on teachers hired to teach in public schools, States may not, depending on State law, be able to require that private schools cooperate in the State data collection that the regulations require. We do note that, generally, teacher preparation programs are designed to prepare teachers to meet the requirements to teach in public schools nationwide, and over 90 percent of teachers in elementary and secondary schools do not work in private schools.
Similarly, we appreciate commenters' recommendation that the regulations include placement rate data for those recent graduates who take teaching positions in a different State. Certainly, many novice teachers do become teachers of record in States other than those where their teacher preparation programs are located. We encourage States and programs to develop interstate data-sharing mechanisms to facilitate reporting on indicators of program performance to be as comprehensive and meaningful as possible.
Until States have a ready means of gathering these kinds of data on an interstate basis, we appreciate that many States may find the costs and complexities of this data-gathering to be daunting. On the other hand, we do not view the lack of these data (or the lack of data on recent graduates teaching in private schools) to undermine the reasonableness of employment outcomes as indicators of program performance. As we have explained, it is when employment outcomes are particularly low that they become indicators of poor performance, and we are confident that the States, working in consultation with their stakeholders, can determine an appropriate threshold for teacher placement and retention rates.
Finally, we understand that the discretion that the regulations grant to each State to exclude novice teachers who teach in other States and who work in private schools (and those program graduates who go on to graduate school or join the military) means that the teacher placement rates for teacher preparation programs will not be comparable across States. This is not a major concern. The purpose of the regulations and the SRC itself is to ensure that each State reports those programs that have been determined to be low-performing or at-risk of being low-performing based on reasonable and transparent criteria. We believe that each State, in consultation with its stakeholders (see § 612.4(c)), should exercise flexibility to determine whether to have the teacher placement rate reflect inclusion of those program graduates identified in paragraph (ii) of the definition.
Nonetheless, we decline to accept the recommendation that the regulations require that the teacher placement rate calculation account for these regional differences in job availability and the competitiveness of the employment market. Doing so would be complex, and would entail very large costs of cross-tabulating data on teacher preparation program location, area of residence of the program graduate, teacher placement data, and a series of employment and job market indicators. States may certainly choose to account for regional differences in job availability and the general competitiveness of the employment market and pursue the additional data collection that such effort would entail. However, we decline to require it.
As explained in the NPRM, while we acknowledge that teacher placement rates are affected by some considerations outside of the program's control, we believe that placement rates are still a valid indicator of the quality of a teacher preparation program (see the discussion of employment outcomes under § 612.5(a)(2)).
We understand that teachers may be hired to teach subjects and areas in which they were not prepared, and that out-of-field placement is more frequent in high-need schools. However, we maintain the requirement that the teacher placement rate assess the extent to which program graduates become novice teachers in the grade-level, grade-span, and subject area in which they were trained. A high incidence of out-of-field placement suggests that the teacher preparation program is not in touch with the hiring needs of likely prospective employers and is providing its participants with academic content knowledge and teaching skills in fields that do not match employers' teaching needs. We also recognize that placing teachers in positions for which they were not prepared could lead to less effective teaching and exacerbate the challenges already apparent in high-need schools.
In addition, we believe a number of commenters may have misunderstood how the teacher placement rate is calculated and used. Specifically, a number of commenters seemed to believe that the teacher placement rate is only calculated in the first year after program completion. This is inaccurate. The teacher placement rate is determined by calculating the percentage of recent graduates who have become novice teachers, regardless of their retention. As such, the teacher placement rate captures any recent graduate who works as a teacher of record in an elementary or secondary public school, which may include preschool at the State's discretion, within three years of program completion.
In order to provide additional clarity, we provide the following example. We examine a theoretical group of graduates from a single teacher preparation program, as outlined in Table 1. In examining the example, it is important to understand that a State reports in its SRC for a given year a program's teacher placement rate based on data from the second preceding title II reporting year (as the term is defined in the regulations). Thus, recent graduates in 2018 (in the 2017-2018 title II reporting year) might become novice teachers in 2018-2019. The State collects these data in time to report them in the SRC to be submitted in October 2019. Please see the discussion of the timing of the SRC under
In this example, the teacher preparation program has five individuals who met all of the requirements for program completion in the 2016-2017 academic year. The State counts these individuals (A, B, C, D, and E) in the denominator of the placement rate for the program's recent graduates in each of the State's 2018, 2019, and 2020 SRCs because they are, or could be, recent graduates who had become novice teachers in each of the prior title II reporting years. Moreover, in each of these years, the State would determine how many of these individuals have become novice teachers. In the 2018 SRC, the State identifies that A and B have become novice teachers in the prior reporting year. As such, the State divides the total number of recent graduates who have become novice teachers (2) by the total number of recent graduates from 2016-2017 (5). Hence, in the 2018 SRC, this teacher preparation program has a teacher placement rate of 40 percent.
In the State's 2019 SRC, all individuals who completed the program in 2017 and those who completed in 2018 (the 2016-2017 and 2017-2018 title II reporting years) meet the definition of recent graduate. In the 2018-2019 academic year, one additional completer from the 2016-2017 academic year has become a novice teacher (C), and five (F, G, H, J, and K) of the six 2017-2018 program completers have become novice teachers. In this instance, Teacher J is included as a recent graduate who has become a novice teacher even though Teacher J is not teaching in the current year. This is because the definition requires inclusion of all recent graduates who have become novice teachers at any time, regardless of their retention. Teacher J is counted as a successfully placed teacher. The fact that Teacher J is no longer still employed as a teacher is captured in the teacher retention rate, not here. As such, in the 2019 SRC, the teacher preparation program's teacher placement rate is 73 percent (eight program completers out of eleven have been placed).
In the State's 2020 SRC, there are no additional cohorts to add to the pool of recent graduates in this example although, in reality, States will be calculating this measure using three rolling cohorts of program completers each year. In this example, Teacher D has newly obtained placement as a novice teacher and would therefore be included in the numerator. As with Teacher J in the prior year's SRC, Teachers G and K remain in the numerator even though they are no longer teachers of record because they have been placed as novice teachers previously. In the 2020 SRC, the teacher preparation program's teacher placement rate is 82 percent (nine program completers out of eleven have been placed).
In the 2021 SRC, individuals who completed their teacher preparation program in the 2016-2017 academic year (A, B, C, D, and E) are no longer considered recent graduates since they completed their programs prior to the preceding three title II reporting years (2018, 2019, 2020). As such, the only cohort of recent graduates the State examines for this hypothetical teacher preparation program comprises those who completed the program in the 2017-2018 academic year (F, G, H, I, J, and K). In the 2020-2021 academic year, Teacher I is placed as a novice teacher. Once again, Teachers G and J are included in the numerator even though they are not currently employed as teachers because they have previously been placed as novice teachers. The program's teacher placement rate in the 2021 SRC would be 100 percent.
In the 2022 SRC, this hypothetical teacher preparation program has no recent graduates, as no one completed the requirements of the program in any of the three preceding title II reporting years (2019, 2020, or 2021).
As noted above, it is important to restate that recent graduates who have become novice teachers at any point, such as Teacher J, are included in the numerator of this calculation, regardless of whether they were retained as a teacher of record in a subsequent year. As such, if an individual completed a teacher preparation program in Year 1 and became a novice teacher in Year 2, regardless of whether he or she is still a novice teacher in Year 3, the individual is considered to have been successfully placed under this measure. Issues regarding retention of teachers are captured by the teacher retention rate measure, and therefore departures from a teaching position have no negative consequences under the teacher placement rate.
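For readers who find it helpful to see the placement-rate arithmetic expressed procedurally, the following Python sketch is offered for illustration only. It is not part of the regulations or of any reporting system; the Graduate record, its field names, and the rounding convention are assumptions introduced here, and the placement years are inferred from the narrative above (Table 1 is not reproduced in this excerpt).

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Graduate:
    # Hypothetical record used only for this illustration.
    name: str
    completion_year: int                        # title II reporting year of program completion
    first_placement_year: Optional[int] = None  # reporting year the graduate first became a novice teacher

def teacher_placement_rate(graduates, src_year):
    """Percentage of recent graduates (completers in the three preceding
    title II reporting years) who have become novice teachers at any point
    through the current reporting year, regardless of later retention."""
    recent = [g for g in graduates
              if src_year - 3 <= g.completion_year <= src_year - 1]
    if not recent:
        return None  # no recent graduates to report on (e.g., the 2022 SRC in this example)
    placed = [g for g in recent
              if g.first_placement_year is not None
              and g.first_placement_year <= src_year]
    return round(100 * len(placed) / len(recent))

# Completers A-E (2016-2017, reporting year 2017) and F-K (2017-2018, reporting year 2018),
# with first placement years taken from the narrative above.
graduates = [
    Graduate("A", 2017, 2018), Graduate("B", 2017, 2018), Graduate("C", 2017, 2019),
    Graduate("D", 2017, 2020), Graduate("E", 2017, 2021),  # E is first placed after leaving the recent-graduate window
    Graduate("F", 2018, 2019), Graduate("G", 2018, 2019), Graduate("H", 2018, 2019),
    Graduate("I", 2018, 2021), Graduate("J", 2018, 2019), Graduate("K", 2018, 2019),
]

for src in (2018, 2019, 2020, 2021, 2022):
    print(src, teacher_placement_rate(graduates, src))
# Prints 40, 73, 82, and 100 percent for the 2018-2021 SRCs, and None for 2022,
# matching the figures in the example above.
```

Because the numerator counts anyone who has ever been placed, a graduate such as Teacher J remains counted in later years even after leaving teaching; departures are captured only by the retention rate discussed below.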
We have adopted these procedures for State reporting of a program's teacher placement rate in each year's SRC to keep them consistent with the proposal we presented in the NPRM for reporting teacher placement rates over a three-year period, in line with the change in the SRC reporting date, and as simple and straightforward as possible. This led us to make certain non-substantive changes to the proposed definition of teacher placement rate so that the definition is clearer and less verbose. In doing so, we have removed the State's option of excluding novice teachers who have taken teaching positions that do not require State certification (paragraph (ii)(C) of the proposed definition) because it seems superfluous; our definition of teacher preparation program is one that leads to an initial State teacher certification or licensure in a specific field.
(i) The percentage of recent graduates who have become novice teachers (regardless of retention) for the grade level, span, and subject area in which they were prepared.
(ii) At the State's discretion, exclusion from the rate calculated under paragraph (i) of this definition of one or more of the following, provided that the State uses a consistent approach to assess and report on all of the teacher preparation programs in the State:
(A) Recent graduates who have taken teaching positions in another State.
(B) Recent graduates who have taken teaching positions in private schools.
(C) Recent graduates who have enrolled in graduate school or entered military service.
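Read computationally, paragraph (ii) simply removes the State-selected categories of recent graduates from the calculation before the paragraph (i) percentage is taken; because such graduates are not novice teachers in the State's public schools, the effect is to shrink the denominator. The short sketch below is illustrative only, and the status labels are hypothetical rather than regulatory terms.

```python
from typing import Iterable, Optional

def placement_rate_with_exclusions(recent_graduates: Iterable[dict],
                                   excluded_statuses: set) -> Optional[int]:
    """Apply the paragraph (ii) exclusions, then take the paragraph (i) percentage.
    The same excluded_statuses set must be applied to every program in the State."""
    pool = [g for g in recent_graduates if g.get("status") not in excluded_statuses]
    if not pool:
        return None
    placed = sum(1 for g in pool if g["placed_as_novice_teacher"])
    return round(100 * placed / len(pool))

# Example: a State that elects to exclude graduates teaching out of State and in
# private schools, but not those in graduate school or the military.
state_exclusions = {"out_of_state", "private_school"}
```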
For these reasons, we have determined that it is appropriate to allow States to use the total number of recent graduates who have obtained initial certification or licensure in the
Another commenter noted that it offers a number of graduate degree programs in education that do not lead to initial certification, but that the programs which institutions and States report on under part 612 are limited to those leading to initial certification.
Other commenters urged that aggregation of data to elementary and secondary data sets would be more appropriate in States with a primarily post-baccalaureate teacher preparation model. We understand that commenters are suggesting that our proposed definition of “teacher preparation program,” with its focus on the provision of a specific license or certificate in a specific field, will give States whose programs are primarily at the post-baccalaureate level considerable trouble collecting and reporting data for the required indicators given their small size. (See generally § 612.4(b)(3).)
While we understand that students often transfer during their college careers, we believe that the teacher preparation program that ultimately determines that a student is prepared for initial certification or licensure is the one responsible for his or her performance as a teacher. This is so regardless of whether the student started in that program or a different one. The same is true for alternative route programs. Since alternative route programs enroll individuals who have had careers, work experiences, or academic training in fields other than education, participants in these programs have almost by definition had academic training elsewhere. However, we believe it is fully appropriate to have the alternative route program assume full responsibility for effective teacher training under the title II reporting system, as it is the program that determined the teacher to have sufficient academic content knowledge and teaching skills to complete the requirements of the program.
Finally, we note that in § 612.5(a)(4), the regulations also require States to determine whether teacher preparation programs have rigorous exit requirements. Hence, regardless of student transfers, the public will know whether the State considers program
We therefore have concluded that our proposed definition of a teacher preparation program does not fit these hybrid programs. Having an IHE or the State report composite information for a teacher preparation program that has both a traditional and alternative route component does not make sense; reporting in the aggregate will mask what is happening with or in each component. The clearest and simplest way to avoid the confusion in reporting that would otherwise result is to have IHEs and States treat each component of such a hybrid program as its own teacher preparation program. We have revised the definition of a “teacher preparation program” in § 612.2 to do just that. While doing so may create more small teacher preparation programs that require States to aggregate data under § 612.4(b)(3)(ii), this consequence will be far outweighed by the benefits of cleaner and clearer information.
Other commenters stated that two of the allowable options for calculating the teacher retention rate would provide useful information regarding: (1) The percentage of new teachers hired into full-time teaching positions and serving at least three consecutive years within five years of being certified or licensed; and (2) the percentage of new teachers hired full-time and reaching tenure within five years of being certified. According to commenters, the focus of the third option, new teachers who were hired and then fired for reasons other than budget cuts, could be problematic because it overlooks teachers who voluntarily leave high-need schools, or the profession altogether. Other commenters recommended removing the definition of teacher retention rate from the regulations.
Another commenter stated that the teacher retention rate, which we had proposed to define as any of the three specific rates as selected by the State, creates the potential for incorrect calculations and confusion for consumers when teachers have initial certification in multiple States; however, the commenter did not offer further information to clarify its meaning. In addition, commenters stated that the proposed definition allows for new teachers who are not retained due to market conditions or circumstances particular to the LEA and beyond the control of teachers or schools to be excluded from calculation of the retention rate, a standard that allows each school to determine the criteria for those conditions, which are subject to interpretation.
Several commenters requested clarification of the definition. Some asked us to clarify what we meant by tenure. Another commenter asked us to clarify how to treat teachers on probationary certificates.
Another commenter recommended that the Department amend the teacher retention rate definition so that it is used to help rate teacher preparation programs by comparing the program's recent graduates who demonstrate effectiveness and remain in teaching to those who fail to achieve high ratings on evaluations. One commenter suggested that programs track the number of years graduates taught over the course of five years, regardless of whether or not the years taught were consecutive. Others suggested shortening the timeframe for reporting on retention so that the rate would be reported for each of three consecutive years and, as we understand the comments, would apply to individuals after they became novice teachers.
In response to comments, we have clarified and simplified the definition of teacher retention rate. We agree with commenters that the third proposed option, by which one subtracts from 100 percent the percentage of novice teachers who were hired and fired for reasons other than budget cuts, is not a true measure of retention because it excludes those who voluntarily leave the profession. Therefore, we have removed it as an option for calculating the retention rate. Doing so also addresses those concerns that the third option allowed for too much discretion in interpreting when local conditions beyond the schools' control caused teachers to no longer be retained.
We also agree with commenters that the second proposed option for calculating the rate, which looked to the percentage of new teachers not receiving tenure within five years, is confusing and does not make sense when looking at new teachers, which we had proposed to define as covering a three-year teaching period, as tenure may not be reached during that timeframe. For these reasons, we also have removed this option from the definition. Doing so addresses the commenters' concerns that multiple methods for calculating the rate would create confusion. We also believe this addresses the comments regarding our use of the term tenure as potentially causing confusion.
We also note that our proposed definition of teacher retention rate did not bring in the concept of certification in the State in which one teaches. Therefore, we do not believe this definition will cause the confusion identified by the commenter who was concerned about teachers who were certified to teach in multiple States.
Additionally, we revised the first option for calculating the teacher retention rate to clarify that the rate must be calculated three times for each cohort of novice teachers—after the first,
We also agree with the recommendation that States calculate a program's retention rate based on three consecutive years after individuals become novice teachers. We believe reporting on each year for the first three years is a reasonable indicator of academic content and teaching skills in that it shows how well a program prepares novice teachers to remain in teaching, and also both promotes greater transparency and helps employers make more informed hiring decisions. We note that teacher retention rate is calculated for all novice teachers, which includes those on probationary certificates. This is further explained in the discussion of “Alternative Route Programs” in section 612.5(a)(2).
We appreciate the suggestions that we should require States to report a comparison of retention rates of novice teachers based on their evaluation ratings, but decline to prescribe this measure because doing so would create costs and complexities that we do not believe are necessary for determining a program's broad level of performance. States that are interested in such information for the purposes of transparency or accountability are welcome to consider it as another criterion for assessing program performance or for other purposes.
When calculating the teacher retention rate, it is important to first note that the academic year in which an individual met all of the requirements for program completion is not relevant. Unlike the teacher placement rate, the defining concern of a teacher retention rate calculation is the first year in which an individual becomes a teacher of record for P-12 public school students. In this example, we use the same basic information as we did for the teacher placement rate example. As such, Table 2a recreates Table 1, with calculations for the teacher retention rate instead of the teacher placement rate. However, because the first year in which an individual becomes a novice teacher is the basis for the calculations, rather than the year of program completion, we could rearrange Table 2a in the order in which teachers first became novice teachers, as in Table 2b.
In addition, Table 2b removes data on program completion, and eliminates both extraneous information before an individual becomes a novice teacher and employment information after the State is no longer required to report on these individuals for purposes of the teacher retention rate.
In this example, this particular teacher preparation program has five individuals who became novice teachers for the first time in the 2017-2018 academic year (Teachers A, B, F, G, and J). For purposes of this definition, we refer to these individuals as a cohort of novice teachers. As described below, the State will first calculate a teacher retention rate for this teacher preparation program in the October 2019 SRC. In that year, the State will determine how many members of the 2017-2018 cohort of novice teachers have been continuously employed through the current year. Of Teachers A, B, F, G, and J, only Teachers A, B, F, and G are still teaching in 2018-2019. As such, the State calculates a teacher retention rate of 80 percent for this teacher preparation program for the 2019 SRC.
In the October 2020 SRC, the State is required to report on the 2017-2018 cohort and the 2018-2019 cohort. The membership of the 2017-2018 cohort does not change. From that cohort, Teachers A, B, and F were employed in both the 2018-2019 academic year and the 2019-2020 academic year. The 2018-2019 cohort consists of Teachers C, H, and K. Of those, only Teachers C and H are employed as teachers of record in the 2019-2020 academic year. Therefore, the State reports a teacher retention rate of 60 percent for the 2017-2018 cohort—because three teachers (A, B, and F) were continuously employed through the current year out of the five total teachers (A, B, F, G, and J) in that cohort—and 67 percent for the 2018-2019 cohort—because two teachers (C and H) were employed through the current year out of the three total teachers (C, H, and K) in that cohort.
In the October 2021 SRC, the State will be reporting on three cohorts of novice teachers for the first time—the 2017-2018 cohort (A, B, F, G, and J), the 2018-2019 cohort (C, H, and K), and the 2019-2020 cohort (D). Of the 2017-2018 cohort, only Teachers A and F have been continuously employed as a teacher of record since the 2017-2018 academic year, therefore the State will report a retention rate of 40 percent for this cohort (two out of five). Of the 2018-2019 cohort, only Teachers C and H have been continuously employed since the 2018-2019 academic year. Despite being a teacher of record for the 2020-2021 academic year, Teacher K does not count towards this program's teacher retention rate because Teacher K was not a teacher of record in the 2019-2020 academic year, and therefore has not been continuously employed. The State would report a 67 percent retention rate for the 2018-2019 cohort (two out of three). For the 2019-2020 cohort, Teacher D is still a teacher of record in the current year. As such, the State reports a teacher retention rate of 100 percent for that cohort.
Beginning with the 2022 SRC, the State no longer reports on the 2017-2018 cohort. Instead, the State reports on the three most recent cohorts of novice teachers—2018-2019 (C, H, and K), 2019-2020 (D), and 2020-2021 (E and I). Of the members of the 2018-2019 cohort, both Teachers C and H have been employed as teachers of record in each year from their first year as teachers of record through the current reporting year. Teacher K is still not included in the calculation because of the failure to be employed as a teacher of record in the 2019-2020 academic year. Therefore, the State reports a 67 percent retention rate for this cohort. Of the 2019-2020 cohort, Teacher D has been employed in each academic year since first becoming a teacher of record. The State would report a 100 percent retention rate for this cohort. Teachers E and I, of the 2020-2021 cohort, have also been retained in the 2021-2022 academic year. As such, the State reports a teacher retention rate of 100 percent in the 2022 SRC for this cohort.
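As with the placement rate, a brief procedural sketch of the retention arithmetic may be useful. The Python fragment below is offered for illustration only; the dictionary of years taught is inferred from the narrative above (Tables 2a and 2b are not reproduced in this excerpt), and the data layout and rounding convention are assumptions rather than regulatory requirements.

```python
from typing import Dict, Optional, Set

# Reporting years (academic year ending in that year) in which each teacher
# was employed as a teacher of record, inferred from the example above.
years_taught: Dict[str, Set[int]] = {
    "A": {2018, 2019, 2020, 2021}, "B": {2018, 2019, 2020},
    "F": {2018, 2019, 2020, 2021}, "G": {2018, 2019}, "J": {2018},
    "C": {2019, 2020, 2021, 2022}, "H": {2019, 2020, 2021, 2022}, "K": {2019, 2021},
    "D": {2020, 2021, 2022}, "E": {2021, 2022}, "I": {2021, 2022},
}

def retention_rate(cohort_first_year: int, src_year: int) -> Optional[int]:
    """Percentage of the cohort (teachers whose first year as a teacher of
    record was cohort_first_year) continuously employed from that first
    year through the most recently completed reporting year (src_year)."""
    cohort = [t for t, yrs in years_taught.items() if min(yrs) == cohort_first_year]
    if not cohort:
        return None
    retained = [t for t in cohort
                if all(y in years_taught[t]
                       for y in range(cohort_first_year, src_year + 1))]
    return round(100 * len(retained) / len(cohort))

# Each October SRC reports on the three most recent cohorts of novice teachers.
for src in (2019, 2020, 2021, 2022):
    for first_year in range(src - 3, src):
        rate = retention_rate(first_year, src)
        if rate is not None:
            print(src, first_year, rate)
# Reproduces the rates above: 80 percent in the 2019 SRC; 60 and 67 percent in
# 2020; 40, 67, and 100 percent in 2021; and 67, 100, and 100 percent in 2022.
```

Note that the continuity test is what keeps Teacher K out of the numerator after 2019: a gap year breaks the cohort's continuous-employment requirement even if the teacher later returns to the classroom.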
Some commenters provided recommendations regarding survey content. These commenters argued that the teacher survey include questions to determine whether a teacher preparation program succeeded in the following areas, which, according to the commenters, research shows are important for preparing teachers to advance student achievement: producing student learning and raising student achievement for all students; using data to assess and address student learning challenges and successes; providing differentiated teaching strategies for students with varied learning needs, including English learners; keeping students engaged; managing classroom behavior; and using technology to improve teaching and increase student learning.
However, we believe that requiring States to survey all program completers would put undue burden on States by requiring them to locate individuals who have not been hired as teachers. Rather, we believe it is enough that States ensure that surveys are conducted of all novice teachers who are in their first year of teaching. We note that this change provides consistency with the revised definition of employer survey, which is a survey of employers or supervisors designed to capture their perceptions of whether the novice teachers they employ or supervise, who are in their first year of teaching, were effectively prepared. The goal of a teacher preparation program is to effectively prepare aspiring teachers to step into a classroom prepared to teach. As the regulations seek to help States reach reasonable determinations of whether teacher preparation programs are meeting this goal, the definition of survey outcomes focuses on novice teachers in their first year of teaching. We note that the regulations do not prohibit States from surveying additional individuals or conducting their surveys of cohorts of teachers over longer periods of time, and we encourage States to consider doing so. However, considering the costs associated with further surveys of the same cohorts of novice teachers, we believe that requiring that these teachers be surveyed once, during their first year of teaching, provides sufficient information about the basic issue—how well their program prepared them to teach.
We believe that States, in consultation with their stakeholders (see § 612.4(c)), are in the best position to determine the content of the surveys used to evaluate the teacher preparation programs in their State. Therefore, the regulations do not specify the number or types of questions to be included in employer or teacher surveys.
Commenters said that the change would make it impossible to collect reliable data on several factors and on large numbers of recent students. They stated that it would be impossible to submit a final IRC by October 1 because students take State licensing assessments, as well as enter into, drop from, and complete programs through August 31, and therefore final student data, pass rates for students who took assessments used for teacher certification or licensure by the State, and other information would not be available until September or October of each year. Other commenters indicated that, because most teacher preparation programs will need to aggregate multiple years of data to meet the program size threshold for reporting, the October submission date will unnecessarily rush the production and posting of their aggregated teacher preparation program data. Some commenters noted that changing the IRC due date to October (for reporting on students and programs for the prior academic year) would require a change in the definition of academic year because, without such a change, the October reports could not reflect scores on assessment tests that students or program completers took through August 31st. Alternatively, the proposal would require institutions to prepare and submit supplemental reports later in the year in order for the reports to fully reflect information for the prior academic year.
Some commenters also stated that LEAs have limited staffing and cannot provide assistance to institutions during the summer when data would be collected, or that because teacher hiring often occurs in August, an October IRC due date does not provide enough time to collect reliable employment data.
Finally, while several commenters opined that an October date for submission of the IRC did not provide sufficient time for institutions to receive information from LEAs, we do not believe that the regulations require LEAs to submit any information to institutions for purposes of the IRC. We assume that the comments were based on a misunderstanding surrounding the data to be reported in the IRC. While our proposed indicators of program performance would require States to receive and report information from LEAs, institutions would not need to receive comparable information from
On the other hand, some commenters suggested that teacher preparation programs report the demographics and outcomes of enrolled teacher candidates by race and ethnicity. Specifically, commenters suggested reporting the graduation rates, dropout rates, placement rates for graduates, first-year evaluation scores (if available), and the percentage of teacher candidates who stay within the teaching profession for one, three, and five years. Another commenter also suggested that gender, age, grade-level, and specialized areas of study be included; and that the data be available for cross-tabulation (a method of analysis allowing comparison of the relationship between two variables). One commenter stated that because title II reporting metrics are geared to evaluate how IHEs provide training, recruitment, and education to first-time graduates of education programs, the metrics cannot be applied to alternative route certification programs, which primarily train career changers who already have a degree and content knowledge. This commenter argued that attempting to compare the results of title II metrics from alternative route certification programs and traditional IHE-based programs will result in untrue conclusions because the programs' student candidates are so different.
Another commenter suggested that, in order to ensure that States are able to separately report on the performance of alternative route preparation programs, IHEs should report whether they have a partnership agreement with alternative route providers, and identify the candidates enrolled in each of those programs. The commenter noted that, while doing so may lead States to identify groups of small numbers of alternative route program participants, it may eliminate the possibility that candidates who actually participate in alternative route programs are identified as graduates of a traditional preparation program at the same IHE.
Another commenter stated that the variety of program academic calendars, with their different “start” and “end” dates in different months and seasons of the year, created another source of inaccurate reporting. The commenter explained that, with students entering a program on different dates, the need to aggregate cohorts will result in diffuse data that have relatively little meaning since the cohort will lose its cohesiveness. As such, the commenter stated, the data reported based on aggregate cohorts should not be used in assessing or evaluating the impact of programs on participants.
A number of commenters noted what they claimed were inherent flaws in our proposed IRC. They argued that it has not been tested for validity, feasibility, or unintended consequences, and therefore should not be used to judge the quality of teacher preparation programs.
Regarding the recommendation that institutions report whether their teacher preparation programs have partnership agreements with alternative route providers, we note that section 205(a) of the HEA neither provides for IHEs to include this type of information in their IRCs nor authorizes the Secretary to add reporting elements to them. However, if they choose, States could require institutions to report such data to them for inclusion in the SRCs. We defer to States on whether they need such information and, if so, the best way to require IHEs to provide it.
In response to the comment that the IRC is unnecessary because institutions already have feedback loops for program improvement, we note that, by requiring each institution to make the information in the IRC available to the general public, Congress plainly intends that the report serve a public interest that goes beyond the private use the institution may make of the reported data. We thus disagree that the current feedback loops that IHEs may have for program improvement satisfy Congress' intent in this regard.
We understand that there are differences between traditional and alternative route teacher preparation programs and that variability among programs in each category (including program start and end dates) exists. However, section 205(a) of the HEA is very clear that an IHE that conducts either a traditional or alternative route teacher preparation program must submit an IRC that contains the information Congress has prescribed. Moreover, we do not agree that the characteristics of any of these programs, specifically the demographics of the participants in these programs or whether participants have already earned an undergraduate degree, would necessarily lead to inaccurate or confusing reporting of the information Congress requires. Nor do we believe that the IRC reporting requirements are so geared to evaluate how IHEs provide training, recruitment, and education to first-time graduates of education programs that IHEs operating alternative route programs cannot explain the specifics of their responses.
We do acknowledge that direct comparisons of traditional and alternative route programs would potentially be misleading without additional information. However, this is generally true for comparisons of all types of programs. For example, a comparison of the average cost of tuition and fees between two institutions could be misleading without the additional context of the average value of financial aid provided to each student. The fact that analyzing specific data out of context could generate confusion does not diminish the value of reporting to the general public the information that, as we have noted, Congress requires.
With specific regard to the fact that programs have different operating schedules, the IRC would have all IHEs report on students participating in teacher preparation programs during the reporting year based on their graduation date from the program. This would be true regardless of the programs' start date or whether the students have previous education credentials. We also believe the IRC would become too cumbersome if we tried to tailor the specific reporting requirements in section 205(a) of the HEA to address and reflect each individual program start time, or if the regulations created different reporting structures based on the program start time or the previous
Furthermore, we see no need for any testing of data reported in the IRC for validity, feasibility, or unintended consequences. The data required by these regulations are the data that Congress has specified in section 205(a) of the HEA. We do not perceive the data elements in section 205(a) as posing any particular issues of validity. Just as they would in any congressionally mandated report, we expect all institutions to report valid data in their IRCs and, if data quality issues exist, we expect institutions to address them so as to meet their statutory obligations. Further, we have identified no issues with the feasibility of reporting the required data. While we have worked to simplify institutional reporting, institutions have previously reported the same or similar data in their IRCs, albeit at a different level of aggregation. Finally, we fail to see any unintended consequences that follow from meeting the statutory reporting requirements. To the extent that States use the data in the IRC to help assess whether a program is low-performing or at-risk of being low-performing under section 207(a) of the HEA, under our regulations this would occur only if, in consultation with their stakeholders under § 612.4(c), States decide to use these data for this purpose. If institutions are concerned about such a use of these data, we encourage them to be active participants in the consultative process.
We discuss the comments regarding concerns about the cost estimates in the
However, while not requiring the information to be included in promotional materials, we encourage IHEs and their teacher preparation programs to provide it in places that prospective students can easily find and access. We believe IHEs can find creative ways to go beyond the regulatory requirements to provide this information to students and the public without incurring significant costs.
Commenters noted that the proposed timeline does not allow States enough time to implement the proposed regulations, and that the associated logistical challenges impose undue and costly burdens on States. Commenters noted that States need more time to make decisions about data collection, to involve stakeholders, and to pilot and revise the data systems—activities that they said cannot be completed in one year.
Several commenters recommended extending the timeline for implementation by at least five years. Some commenters suggested delaying the reporting of program ratings until at least 2021 to give States more time to create data linkages and validate data. Other commenters pointed out that their States receive employment and student learning data from LEAs in the fall or winter, which they said makes reporting outcomes in their SRCs in October of each year, as we had proposed, impossible. Still other commenters noted that some data, by their nature, may not be available to report by October. Another commenter suggested that institutions should report in October, States should report outcome data (but not performance designations) in February, and then the States should report performance designations in June, effectively creating an additional reporting requirement. To address the timing problems in the proposed schedule for SRC submission, other commenters recommended that the Department continue having States submit their SRCs in October. On the other hand, some commenters supported or encouraged the Department to maintain the proposed timelines.
Many commenters stated that no State currently implements the proposed teacher preparation program rating system. Therefore, to evaluate effectiveness, or to uncover unintended consequences, these commenters emphasized the importance of permitting States to develop and evaluate pilot programs before broader implementation. Some commenters therefore recommended that the proposed implementation timeline be delayed until the process had been
Under the final regulations, the initial SRC (a pilot) would be due October 31, 2018, for the 2016-2017 academic year. The October 2018 due date provides much more time for submission of the SRC. As we note in the discussion of comments received on § 612.3(a) (Reporting Requirements for the IRC), IHEs will continue to report on their programs, including pass rates for students who took assessments used for initial certification or licensure by the State in which the teacher preparation program is located, from the prior academic year, by April 30 of each year. States therefore will have these data available for their October 31 reporting. Because the outcome data States will need to collect to help assess the performance of their teacher preparation programs (
By maintaining the current reporting cycle, States will have a year (2016-2017) to design and implement a system. The 42 States, the District of Columbia, and the Commonwealth of Puerto Rico that were previously granted ESEA flexibility are therefore well positioned to meet the requirements of these regulations because they either already have the systems in place to measure student learning outcomes or have worked to do so. Moreover, with the flexibility that § 612.5(a)(1)(ii) now provides for States to measure student learning outcomes using student growth, a teacher evaluation measure, or another State-determined measure relevant to calculating student learning outcomes (or any combination of these three), all States should be able to design and implement their systems in time to submit their initial reports by October 31, 2018. Additionally, at least 30 States, the District of Columbia, and the Commonwealth of Puerto Rico already have the ability to aggregate data on the achievement of students taught by recent graduates and link those data back to teacher preparation programs. Similarly, as discussed below, 30 States already implement teacher surveys that could be modified for use in this accountability system.
Particularly given the added flexibility in § 612.5(a)(1)(ii), and because most States already have, or are well on their way to having, the systems required to implement the regulations, we are confident that the reduced time available before the pilot SRC must be prepared and submitted will prove manageable. We understand that some States will not have complete datasets available for all indicators during initial implementation, and so may need to make adjustments based on experience during the pilot year. We also stress that the October 2018 SRC is a pilot report; any State identification of a program as low-performing or at-risk of being low-performing included in that report would not have implications either for the program generally or for that program's eligibility to participate in the TEACH Grant program. Full SRC reporting begins in October 2019.
In addition, maintaining the SRC reporting date of October 31 is important so that those who want to apply for admission to teacher preparation programs, and for receipt of TEACH Grants, as early as January of the year they wish to begin the program know which IHEs have programs that States have identified in their SRCs as at-risk or low-performing. Prospective students should have this information as soon as possible so that they know both the State's assessment of each program's level of performance and which IHEs lack authority to award TEACH Grants. See our response to public comment regarding the definition of a TEACH Grant-eligible institution in § 686.2.
In summary, under our revised reporting cycle, the SRC is due about five months earlier than in the proposed regulations. However, because the report due October 31, 2018 is a pilot report, we believe that States will have sufficient time to complete work establishing their reporting and related systems to permit submission of all information in the SRC by the first full reporting date of October 31, 2019. While we appreciate the comments suggesting that States be able to develop and evaluate pilot programs before broader implementation, or that the implementation timeline be delayed until the State process has been piloted and evaluated for efficiency, we do not believe that adding more time for States to develop their systems is necessary. Lastly, maintaining the existing timeline does not affect the timing of consequences for TEACH Grants for at-risk or low-performing teacher preparation programs. Under the regulations, the TEACH Grant consequences would apply for the 2021-2022 award year.
In addition, the commenter who raised concerns based on the State legislature being in session on only a biennial basis did not provide enough information to permit us to consider why this necessarily bars the State's compliance with these regulations.
Commenters also stated that there are too many variations in program size and, as we understand the comment, in the way States credential their teacher preparation programs to mandate a single Federal approach to disaggregated program reporting for the entire Nation.
We fail to understand how defining a teacher preparation program as we have, in terms of initial State teacher certification or licensure in a specific field, creates concerns that top ratings would only go to programs with data showing the effectiveness of graduates working in public schools in the State. So long as the number of novice teachers the program produces meets the minimum threshold size addressed in § 612.4(b)(3) (excluding, at the State's discretion, teachers teaching out of State and in private schools from determinations of student learning outcomes and teacher placement and retention rates as permitted by § 612.5(a)(1) and § 612.2, respectively), we are satisfied that the reporting of program information will be sufficiently robust and obviate concerns about data reliability.
Moreover, we disagree with the comments that students would find reporting of outcomes at the institution level less confusing than reporting at the teacher preparation program level. We believe students want information about teacher preparation programs that are specific to the areas in which they want to teach so they can make important educational and career decisions, such as whether to enroll in a specific teacher preparation program. This information would be presented most clearly at the teacher preparation program level rather than at the institutional level, where many programs would be collapsed such that a student would not only lack information about whether a specific program in which she is interested is low-performing or at risk of being low-performing, but also be unable to review data relative to indicators of the program's performance.
We also disagree with the claim that program-level reporting as required under these regulations is inappropriate due to the variation in program size and structure across and within States. Because the commenters did not provide an example of how the requirements of these regulations make program-level reporting impossible to implement, we cannot address these concerns more specifically than to say that, since the use of indicators of program performance will generate information unique to each program, we fail to see why variation in program size and structure undermines these regulations.
Commenters also requested that the Department clarify what would happen to distance education programs and their currently enrolled students if multiple States would be assessing a single program's effectiveness and doing so with differing results. One commenter suggested a “home State” model in which, rather than developing ratings for each program in each State, all of a provider's distance education programs would be evaluated by the State in which the provider, as opposed to the program participants, is physically located. The commenter argued that this model would increase the reliability of the measures and decrease student confusion, especially where comparability of measures between States is concerned. Unless such a home State model is adopted, the commenter argued, other States may discriminate against programs physically located and operated in other States by, as we understand the comment, using the process of evaluating program performance to create excessive barriers to entry in order to protect in-State institutions. Another commenter asked that the proposed regulations provide a specific definition of the term “distance education.”
Several commenters expressed support for the change to § 612.4(a)(1)(ii) proposed in the Supplemental NPRM, which would require that reporting on the quality of all teacher preparation programs provided through distance education in the State be made by using procedures for reporting that are consistent with § 612.4(b)(4), but based on whether the program produces at least 25 or fewer than 25 new teachers whom the State certified to teach in a given reporting year.
While commenters indicated that reporting on hybrid teacher preparation programs was a complicated issue, they did not provide recommendations specific to two questions regarding hybrid programs that were posed in the Supplemental NPRM. The first question asked under what circumstances, for purposes of both reporting and determining the teacher preparation program's level of overall performance, a State should use procedures applicable to teacher preparation programs offered through distance education and when it should use procedures for teacher preparation programs provided at brick-and-mortar institutions. The second question asked what the implications would be, especially for TEACH Grant eligibility, if, for a single program, one State used procedures applicable to teacher preparation programs provided through distance education while another State used procedures for teacher preparation programs provided at brick-and-mortar institutions, and how such inconsistencies should be addressed.
In response to our questions, many commenters indicated that it was unclear how to determine whether a teacher preparation program should be classified as one provided through distance education for reporting under § 612.4(a)(1)(ii), and asked for clarification regarding the circumstances under which a program should be so classified. One commenter recommended that we define a teacher preparation program provided through distance education as one in which the full and complete program can be completed without an enrollee ever being physically present at the brick-and-mortar institution or any of its branch offices.
Commenters expressed a number of concerns about reporting. Some commenters indicated that while the December 3, 2014, NPRM allowed States to report on programs that produced fewer than 25 new teachers, it was unclear whether the same permission would be applied to distance education programs through the Supplemental NPRM. Additionally, a few commenters thought that, in cases where students apply for certification in more than one State, the outcomes of a single student could be reported multiple times by multiple States. Other commenters felt that if States are expected to evaluate distance education graduates from other States' programs, the regulations should be revised to focus on programs that are
Many commenters voiced concerns related to the identification and tracking of teacher preparation programs provided through distance education. Specifically, commenters indicated that, because the method by which a teacher preparation program is delivered is not transcribed or officially recorded on educational credentials, the receiving State (the State where the teacher has applied for certification) has no way to distinguish teacher preparation programs provided through distance education from brick-and-mortar teacher preparation programs. Furthermore, receiving States would not be able to readily distinguish individual teacher preparation programs provided through distance education from one another.
Finally, a commenter stated that the proposed regulations do not require States to provide any notice of their rating, and do not articulate an appeal process to enable institutions to challenge, inspect, or correct the data and information on the basis of which they might have received an adverse rating. Commenters also indicated that teacher preparation programs themselves should receive data on States' student and program evaluation criteria.
We appreciate commenters' expressions of support for the change to the proposed regulations under § 612.4(a)(1)(ii), as proposed in the Supplemental NPRM, requiring that reporting on the quality of all teacher preparation programs provided through distance education in the State be made by using procedures for reporting that are consistent with proposed § 612.4(b)(4), but based on whether the program produces at least 25 or fewer than 25 new teachers whom the State certified to teach in a given reporting year. In considering the language of proposed § 612.4(a)(1)(ii) and the need for clarity on the reporting requirements for teacher preparation programs provided through distance education, we have concluded that the provision would be clearer if it simply incorporated by reference the reporting requirements for those programs in § 612.4(b)(3) of the final regulations.
While we agree with the commenters who stated that the proposed regulations were unclear on what constitutes a teacher preparation program provided through distance education, we decline to accept the recommendation to define such a program as one in which the full and complete program can be completed without an enrollee ever being physically present at the brick-and-mortar institution or any of its branch offices. That definition would exclude teacher preparation programs that provide significant portions, but not all, of the program through distance education. In addition, the proposed definition would allow a teacher preparation program to easily modify its requirements so that it would not be considered a teacher preparation program provided through distance education.
Instead, in order to clarify what constitutes a teacher preparation program provided through distance education, we are adding the term “teacher preparation program provided through distance education” to § 612.2 and defining it as a teacher preparation program in which 50 percent or more of the program's required coursework is offered through distance education. The term distance education is defined under 34 CFR 600.2 to mean education that uses one or more specified technologies to deliver instruction to students who are separated from the instructor and to support regular and substantive interaction between the students and the instructor, either synchronously or asynchronously. The technologies may include the internet; one-way and two-way transmissions through open broadcast, closed circuit, cable, microwave, broadband lines, fiber optics, satellite, or wireless communications devices; audio conferencing; or video cassettes, DVDs, and CD-ROMs, if the cassettes, DVDs, or CD-ROMs are used in a course in conjunction with any of the technologies listed previously in this definition. We have incorporated this definition by reference (see § 612.2(a)).
In the Supplemental NPRM, we specifically requested public comment on how to determine when a program that has both brick-and-mortar and distance education components should be considered a teacher preparation program provided through distance education. While we received no suggestions, we believe that it is reasonable that if 50 percent or more of a teacher preparation program's required coursework is offered through distance education, it should be considered a teacher preparation program provided through distance education because the majority of the program is offered through distance education. This 50 percent threshold is consistent with thresholds used elsewhere in Departmental regulations, such as those relating to correspondence courses under 34 CFR 600.7 or treatment of institutional eligibility for disbursement of title IV HEA funds for additional locations under 34 CFR 600.10(b)(3).
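For illustration only, the 50 percent classification rule described above can be expressed as a simple calculation. The sketch below assumes that required coursework is measured in credit hours, which the regulations do not prescribe; the function name and example figures are hypothetical.

```python
def is_provided_through_distance_education(distance_ed_credits, total_required_credits):
    """Illustrative sketch of the 50 percent rule: a teacher preparation
    program counts as 'provided through distance education' when 50 percent
    or more of its required coursework is offered through distance education.

    Measuring required coursework in credit hours is an assumption made for
    this sketch; the regulations do not specify a unit of measurement.
    """
    if total_required_credits <= 0:
        raise ValueError("total_required_credits must be positive")
    return distance_ed_credits / total_required_credits >= 0.5


# Example: 18 of 36 required credit hours offered through distance education
print(is_provided_through_distance_education(18, 36))  # True (exactly 50 percent)
```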
In addition, we do not agree with the suggestion for a “home State” reporting model, in which all of a provider's distance education programs would be evaluated by the State in which the provider is physically located. First, section 205(b) of the HEA requires States to report on the performance of their teacher preparation programs. We feel strongly both that, to date, defining the program at the institutional level has not produced meaningful results, and that where programs provided through distance education prepare individuals to teach in different States, those States—and not only the “home State”—should assess those programs' performance. In addition, we believe that each State should, as the law anticipates, speak for itself about what it concludes is the performance of each teacher preparation program provided through distance education operating within its boundaries. Commenters did not provide any evidence to support their assertion that States would discriminate against distance learning programs physically located in other States, nor do we understand how they would do so if, as § 612.4(a) anticipates, they develop and apply the same set of criteria (taking into consideration the need to have different employment
Regarding reporting concerns, we provide under § 612.4(b)(3)(i) for annual reporting on the performance of each teacher preparation program that produces a total of 25 or more recent graduates in a given reporting year (that is, a program size threshold of 25), or, at the State's discretion, a lower program size threshold (
Further, for the purposes of the teacher placement rate, § 612.5(a)(2)(iv) permits a State, at its discretion, to assess the teacher placement rate for teacher preparation programs provided through distance education differently from the teacher placement rate for other teacher preparation programs based on whether the differences in the way the rate is calculated for teacher preparation programs provided through distance education affect employment outcomes.
States that certify at least 25 teachers from a teacher preparation program provided through distance education have an interest in that program and will be reporting on the program as a program in their States. Moreover, for several reasons, we disagree that the States in which distance education programs are headquartered should compile data from other States, determine a performance rating, and report it. In addition to placing a higher cost and burden on a particular State, this methodology would undermine the goal of States having a say in the quality of the programs being used to certify teachers in the State. The State in which a teacher preparation program operating in multiple States is housed is not the only State with an interest in the program. Finally, we do not believe that the regulations would force States to create a duplicative and unnecessary second tracking system, because a State is already required to report on teacher preparation programs in the State.
We agree with commenters' concerns regarding the identification and tracking of teacher preparation programs provided through distance education. To address this concern, institutions will be asked to report which of their teacher preparation programs are teacher preparation programs provided through distance education in the IRC, which the institutions provide to the State. The receiving State can then verify this information during the teacher certification process for a teacher candidate in the State.
We note that an appeal process regarding a teacher preparation program's performance is provided for under § 612.4(c). We also note that teacher preparation programs will have access to data on States' student and program evaluation criteria because State report cards are required to be publicly available.
While posting of the SRC data on the State's Web site may not lead directly to student learning or teacher preparation program improvement, it does provide the public with basic information about the performance of each program and other, broader measures about teacher preparation in the State. Moreover, making this information widely available to the general public is a requirement of section 205(b)(1) of the HEA. Posting this information on the State's Web site is the easiest and least costly way for States to meet this requirement. We also note that the commenters are mistaken in their belief that our proposed regulations did not require that information regarding teacher preparation programs be shared with consumers. Proposed § 612.4(a)(2) would require States to post on their Web sites all of the information required to be included in their SRCs, and these data include the data on each program's student learning outcomes, employment outcomes, and survey outcomes, and how the data contribute to the State's overall evaluation of the program's performance. The final regulations similarly require the State to include all of these data in the SRC, and § 612.4(a)(2) specifically requires the State to make the same SRC information it provides to the Secretary in its SRC widely available to the general public by posting it on the State's Web site.
Others noted that simply ascribing one of the four proposed performance levels to a program is not nuanced or sophisticated enough to fully explain the quality of a teacher preparation program. They recommended removing the requirement that SEAs provide a single rating to each program, and allow States instead to publish the results of a series of performance criteria for each program.
States cannot meet this requirement unless they establish procedures for using criteria, including indicators of academic content knowledge and teaching skills (see § 612.4(b)(2)(i)), to determine which programs are classified in each category. The requirement of § 612.4(b)(1) that States make meaningful differentiation of teacher preparation program performance using at least these three categories simply gives this statutory requirement regulatory expression. While § 612.4(b)(1) permits States to categorize teacher preparation programs using more than three levels of performance if they wish, the HEA cannot be properly implemented without States making meaningful differentiation among programs based on their overall performance.
We do not believe that these regulations disregard the uniqueness of each program's size, mission, or diversity, as they are intended to provide a minimum set of criteria with which States determine program performance. They do not prescribe the methods by which programs meet a State's criteria for program effectiveness.
Multiple other commenters expressed confusion about whether the regulations incentivize placement in high-need schools by making such placement a significant part of how States must determine the rating of a teacher preparation program. Some commenters argued that, on the one hand, the requirement that States use student learning outcomes to help assess a program's overall performance could incentivize teacher preparation programs to steer teaching candidates toward schools where students are likely to have higher test scores. On the other hand, they argued that the proposed regulations would also assess program performance using, as one indicator, placement of candidates in high-need schools, an indicator that commenters stated would work in the opposite direction. These commenters argued that this tension could cause confusion and create challenges in implementing the regulations by not giving States and programs a clear sense of which issue is of greater importance—student learning outcomes or placement of teachers in high-need schools.
Other commenters recommended that the Department set specific thresholds based on the affluence of the area a school serves. For example, commenters recommended that, for a program to achieve a given level of performance, 85 percent of program graduates who work in affluent, high-performing schools should have to demonstrate a certain level of student learning outcomes, while only 60 percent of program graduates who work in high-need schools would have to perform at that same level.
Multiple commenters also opposed the inclusion of student learning outcomes, employment outcomes, and survey outcomes as indicators of the performance of teacher preparation programs. These commenters believed that student learning outcomes rest on the concept of VAM as applied to standardized testing, a methodology they believe is flawed and does not accurately represent teacher preparation program effectiveness.
However, while States retain the authority to determine thresholds for performance under each indicator, in consultation with their stakeholder groups (see § 612.4(c)), we encourage States to choose thresholds purposefully. We believe that all students, regardless of their race, ethnicity, or socioeconomic status, are capable of performing at high levels, and that all teacher preparation programs need to work to ensure that teachers in all schools are capable of helping them do so. We encourage
Similarly, we encourage States to employ measures of student learning outcomes that are nuanced enough to control for prior student achievement and observable socio-economic factors so that a teacher's contribution to student learning is not affected by the affluence of his or her school. Overall, the concerns stated here would also be mitigated by use of growth, rather than some indicator of absolute performance, in the measure of student learning outcomes. But, here again, we feel strongly that decisions about how and when student learning outcomes are weighted differently should be left to each State and its consultation with stakeholders.
We respond to the commenters' objections to our requirement that States use student learning outcomes, employment outcomes, and survey outcomes in their assessment of the performance levels of their teacher preparation programs in our discussion of comment on these subjects in § 612.5(a). For reasons we addressed above in the discussion of § 612.1, while still strongly encouraging States to give significant weight to these indicators in assessing a program's performance, we have omitted from the final regulations any requirement that States consider employment outcomes in high-need schools and student outcomes “in significant part.”
Other commenters recommended modifying the regulations so that States would need to determine programs to have “above average student learning outcomes” in order to rate them in the highest category of teacher preparation performance. Another commenter suggested that student learning data be disaggregated by student groups to show hidden inequities, and that States be required to develop a pilot program to use subgroup data in their measurement of teacher preparation programs, such that if the student subgroup performance falls short the program could not be rated as effective or higher.
In addition, some commenters opposed our proposed § 612.4(b)(2)(i)(B) requiring each State to include in its SRC an assurance that a teacher preparation program either is accredited or produces teachers with content and pedagogical knowledge because of what they described as the federalization of professional standards. They indicated that our proposal to offer each State the option of presenting an assurance that the program is accredited by a specialized accrediting agency would, at best, make the specialized accreditor an agent of the Federal government, and at worst, effectively mandate specialized accreditation by CAEP. The commenters
We believe that the costs of this SRC reporting will be manageable for all States, and have provided a detailed discussion of costs in the RIA section of this document. For further discussion of reporting on student learning outcomes, see the discussion in this document of § 612.5(a)(1). We also emphasize that States will report these data in the aggregate at the teacher preparation program level and not at the teacher level. Furthermore, while States will need to comply with applicable Federal and State student privacy laws in the data they report in their SRC, the commenters have not provided information to help us understand how our requirements, except as we discuss for § 612.4(b)(3)(ii)(E), are affected by State student privacy laws.
In addition, as we reviewed these comments and the proposed regulatory language, we realized the word “disaggregated” was unclear with regard to the factors by which the data should be disaggregated, and redundant with regard to the description of indicators in § 612.5. We have therefore removed this word from § 612.4(b)(2)(i).
Under § 612.5(a)(4) States must annually report whether each program is administered by an entity that is accredited by a specialized accrediting agency recognized by the Secretary, or produces candidates (1) with content and pedagogical knowledge and quality clinical preparation, and (2) who have met rigorous teacher candidate exit qualifications. Upon review of the comments and the language of § 612.5(a)(4), we have determined that proposed § 612.4(b)(3)(i)(B), which would have had States provide an assurance in their SRCs that each program met the characteristics described in § 612.5(a)(4), is not needed. We address the substantive comments offered on that provision in our discussion of comments on § 612.5(a)(4).
Finally, in reviewing the public comment, we realized that the proposed regulations focused only on having States report in their SRCs the data they would provide for indicators of academic knowledge and teaching skills that are used to determine the performance level of each teacher preparation program. This, of course, was because State use of those indicators was the focus of the proposed regulations. But we did not mean to suggest that in their SRCs, States would not also report the data they would use for other indicators and criteria they establish for identifying each program's level of performance. While the instructions in section V of the proposed SRCs imply that States are to report their data for all indicators and criteria they use, we have revised those instructions to clarify this point.
Several commenters noted that the flexibility our proposed regulations provide to States to determine the weighting system for use of criteria and indicators to assess teacher preparation program performance undermines what the commenters state is the Department's goal of providing meaningful data to, among other things, facilitate State-to-State comparisons. The commenters argue that consumers might incorrectly assume that all States are applying the same metrics to assess program performance, and so draw incorrect conclusions, especially for programs located near each other but in different States. Several commenters also expressed concerns about the Department's proposal in § 612.5(a)(2) that States be able to weigh employment outcomes differently for alternative route programs and traditional teacher preparation programs. The commenters argued that all teacher preparation programs should be held to the same standards and levels of accountability.
Commenters also stated that our proposal, by which we understand the commenters to mean the proposed use of student learning outcomes, employment outcomes and survey outcomes as indicators of academic content knowledge and teaching skills of teachers whom programs prepare, should be adjusted based on the duration of the teachers' experience. Commenters stated we should do so because information about newer teachers' training programs should be emphasized over information about more experienced teachers, for whom data reflecting these indicators would likely be less useful.
Some commenters asked whether, if a VAM is used to generate information for indicators of student learning outcomes, the indicators should be weighted to count gains made by the lower performing third of the student population more than gains made by the upper third of the population because it would be harder to increase the former students' scores. The commenters noted that poorer performing students will have the ability to improve by greater amounts than those who score higher on tests.
Several commenters believed that the weighting of the indicators used to report on teacher preparation program performance is a critical decision, particularly with respect to the weighting of indicators specific to high-need schools, and that, because of this, decisions on weighting should be made after data are collected and analyzed. As an example of why the group of stakeholders should have information available prior to making weighting decisions, one commenter noted that, if teacher placement in high-need schools has a relatively low weight and student growth is negatively associated with the percentage of economically disadvantaged students enrolled in the school, programs may game the system by counseling students to seek employment in non-high-need schools.
Finally, several commenters stated that the regulations incentivize programs to place graduates in better
States plainly need to be able to implement procedures for taking the data relevant to each of the indicators of academic knowledge and teaching skills, and the other criteria they use to assess program performance, and turning those data into a reported overall level of program performance. We do not see how States can do this without assigning some form of weight to each of the indicators they use. However, the specific method by which a State does so is left to each State, in consultation with its stakeholders (see § 612.4(c)), to determine.
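The paragraph above notes that States must somehow weight their indicators to arrive at an overall performance level, while leaving the method to each State. The sketch below is purely illustrative of that idea: the indicator names, weights, cut scores, and category labels are hypothetical and are not drawn from the regulations, which leave all of these choices to each State in consultation with its stakeholders.

```python
# Purely illustrative: one hypothetical way a State might weight indicator
# data into an overall performance level. The weights, cut scores, and labels
# below are invented for this sketch; the regulations leave these choices to
# each State in consultation with its stakeholders.

HYPOTHETICAL_WEIGHTS = {
    "student_learning_outcomes": 0.40,
    "employment_outcomes": 0.30,
    "survey_outcomes": 0.30,
}

def overall_performance(indicator_scores):
    """Combine indicator scores (each scaled 0-100) into a performance level."""
    composite = sum(
        weight * indicator_scores[name]
        for name, weight in HYPOTHETICAL_WEIGHTS.items()
    )
    if composite < 50:          # hypothetical cut score
        return "low-performing"
    if composite < 65:          # hypothetical cut score
        return "at-risk of being low-performing"
    return "effective"          # hypothetical label for the top category

# Example: a program scoring 58, 70, and 62 on the three indicators
print(overall_performance({
    "student_learning_outcomes": 58,
    "employment_outcomes": 70,
    "survey_outcomes": 62,
}))  # prints "at-risk of being low-performing" (composite 62.8)
```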
As we addressed in the discussion of § 612.1, we had proposed in § 612.4(b)(1) that a State's assessment of a program's performance be based "in significant part" on the results for two indicators: student learning outcomes and employment outcomes in high-need schools. But as we noted in our discussion of comments on §§ 612.1 and 612.4(b)(1), while strongly encouraging States to adopt these provisions in their procedures for assessing a program's performance, we have revised these final regulations to omit that proposal and any other language requiring that any regulatory indicator receive special weight.
Furthermore, the flexibility the regulations accord to States to determine how these factors should be weighed in determining a program's level of performance extends to the relative weight a State might accord to factors like a teacher's experience and to the student learning outcomes of teachers in low-performing versus high-performing schools. It also extends to the weight a State would give to employment outcomes for traditional teacher preparation programs and alternative route teacher preparation programs; after all, these types of programs differ considerably in their concept, in whom they recruit, and in when they work with LEAs to place aspiring teachers as teachers of record. In addition, State flexibility extends to a State's ability to assess the overall performance of each teacher preparation program using other indicators of academic content knowledge and teaching skills beyond those contained in the regulations. We do not believe that this flexibility undermines any Departmental goal, or any goal that Congress had in enacting the title II reporting system.
Thus, while a State must report in its SRC the procedures and weighting of indicators of academic content knowledge and teaching skills and the other criteria it uses to assess program performance, we believe States should be able to exercise flexibility in determining how they will identify programs that are low-performing or at-risk of being so. In establishing these regulations, we stress that our goal is simple: to ensure that the public—prospective teaching candidates, LEAs that will employ novice teachers, and State and national policy makers alike—has confidence that States are reasonably identifying programs that are and are not working, and understands how States are distinguishing between the two. The flexibility the regulations accord to States in determining a program's level of performance is fully consistent with this goal. Furthermore, given the variation we expect to find in State approaches and the different environments in which each State operates, we reiterate that any State-to-State comparisons should be made only with the utmost caution.
As noted above, our discussion of §§ 612.1 and 612.4(b)(1) stressed both (1) our hope that States would adopt our proposals that student learning outcomes and employment outcomes for high-need schools be given significant weight, and that, to be considered effective, a teacher preparation program would need to show positive student learning outcomes, and (2) our decision not to establish these proposals as State requirements. Thus, we likewise leave to States issues regarding the incentives that any given weighting might create for the placement of aspiring teachers and for the programs themselves.
Finally, in reviewing the public comment, we realized that the proposed regulations focused only on having States report in their SRCs the weights they would provide to indicators of academic knowledge and teaching skills used to determine the performance level of each teacher preparation program. This, of course, was because State use of those indicators was the focus of the proposed regulations. But we did not mean to suggest that in their SRCs, States would not also report the weights they would provide to other indicators and criteria they establish for identifying each program's level of performance. While the instructions in section V of the proposed SRCs imply that States are to report their weighting for all indicators and criteria they use, we have revised them to clarify this point.
Other commenters asked for more clarity on the various methods for a program to reach the threshold of 25 new teachers (or other threshold set by the State). The commenters also stated that a State could design this threshold to limit the impact on programs. Other commenters noted that smaller teacher preparation programs may not have the technical and human resources to collect the data for proposed reporting requirements,
The proposed regulations had focused State reporting and small program aggregation procedures on the number of new teachers a teacher preparation program produced. Based on further consideration of these and other comments, it became clear that the term “new teacher” was problematic in this case as it was in other places. We realized that this approach would not hold teacher preparation programs accountable for producing recent graduates who do not become novice teachers. Because we believe that the fundamental purpose of these programs is to produce novice teachers, we have concluded that our proposal to have State reporting of a program's performance depend on the number of new teachers that the program produces was misplaced.
Therefore, in order to better account for individuals who complete a teacher preparation program but who do not become novice teachers, we are requiring a State to report annually on the performance of each “brick-and-mortar” teacher preparation program that produces a total of 25 or more recent graduates (or such lower threshold as the State may establish). Similarly, aggregation procedures for smaller programs apply to each teacher preparation program that produces fewer than 25 recent graduates (or such lower threshold as the State may establish). For teacher preparation programs provided through distance education, the requirement is the same except that, since States are not likely to know the number of recent graduates, States will continue to look at whether the program has that same threshold number of 25 recent graduates, but in this case, to be counted, these recent graduates need to have received an initial certification or licensure from the State that allows them to serve in the State as teachers of record for K-12 students.
Several commenters were concerned that our proposals for aggregating data to be used to annually identify and report the level of performance of small teacher preparation programs would make year-by-year comparisons and longitudinal trends difficult to assess in any meaningful way, since it is very likely that States will use different aggregation methods institution-by-institution and year-by-year.
Commenters noted that many small rural teacher preparation programs and programs producing small numbers of teachers who disperse across the country after program completion do not have the requisite threshold size of 25. Commenters stated that for these programs, States may be unable to collect sufficient valid data. The result will be misinformed high-stakes decision making.
Some commenters proposed that States be permitted to use a minimum threshold of 10 new teachers, rather than 25, with aggregation applied when that minimum is not met. Other suggested options were for States to report whatever data they have or to aggregate data from previous years to meet the "n" size.
One commenter recommended that rankings be initially based on relatively few normed criteria common to, and appropriate for, programs and States of all sizes,
Two commenters stated that even though the proposed rules create several ways in which States may report the performance of teacher preparation programs that annually produce fewer than 25 teachers, the feasibility of annual reporting at the program level in some States would be so limited that it would not be meaningful. The commenters added that, regardless of the aggregation strategy, having a minimum threshold of 25 will protect the confidentiality of completers for reporting, but requiring annual reporting only of programs that produce 25 or more recent graduates per year will omit a significant number of individual programs from the SRC. Several commenters had similar concerns and stated that annual reporting of teacher preparation program performance would not be feasible for the majority of teacher preparation programs across the country due to their size or where their students live. Commenters specifically mentioned that many programs at Historically Black Colleges and Universities will have small cell sizes for graduates, which will make statistical conclusions difficult. Another commenter had concerns with the manner in which particular individual personnel data will be protected from public disclosure, although commenters supported procedural improvements, discussed in the negotiated rulemaking sessions, that addressed student privacy concerns by increasing the reporting threshold from 10 to 25.
Commenters further expressed concern that, in States where a program produces fewer than 25 teachers per year, the manual calculation needed to combine programs and aggregate the number of students up to 25 before reporting the assessment of program performance and information on indicators would not only be excessive but could also lead to significant inconsistencies across entities and from one year to the next.
We do not share commenters' concerns about small elementary and secondary schools where privacy concerns purportedly require a school-level calculation of student growth measures rather than calculation of student growth at the teacher level, or related concerns about student learning outcomes for an individual teacher not yielding useable information about a particular teacher preparation program. Student learning outcomes applicable to a particular teacher preparation program would not be aggregated at the school level. Whether measured using student growth, a teacher evaluation measure, or another State-determined measure relevant to calculating student learning outcomes, each teacher—whether employed in a large school or a small school—has some impact on student learning. Under our regulations, these impacts would be aggregated across all schools (or at least all public schools in the State in which the program is located) that employ novice teachers the program had prepared.
For small teacher preparation programs, we believe that a State's use of the aggregation methods reasonably balances the need for annual reporting on teacher preparation program performance with the special challenges of generating a meaningful annual snapshot of program quality for programs that annually produce few teachers. By permitting aggregation to the threshold level across similar or broader programs run by the same teacher preparation entity (§ 612.4(b)(3)(ii)(A)), over a period of up to four years (§ 612.4(b)(3)(ii)(B)), or both (§ 612.4(b)(3)(ii)(C)), we are offering States options for meeting their annual reporting responsibilities for all programs. However, if aggregation under any of the methods identified in § 612.4(b)(3)(ii)(A)-(C) would still not yield the requisite program size threshold of 25 recent graduates (or such lower number as a State establishes), or if reporting such data would be inconsistent with Federal or State privacy and confidentiality laws and regulations, § 612.4(b)(3)(ii)(D) and § 612.4(b)(5) provide that the State need not report data on, or identify an overall performance rating for, that program.
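To make these aggregation options concrete, the sketch below walks through one possible order in which a State might try the methods in § 612.4(b)(3)(ii)(A) through (D) for a hypothetical small program. The data structures, function name, ordering, and example counts are invented for illustration; only the default threshold of 25 recent graduates and the four-year maximum aggregation window come from the regulations, which do not require any particular sequence.

```python
# Illustrative only: one possible way to walk through the aggregation options
# in § 612.4(b)(3)(ii)(A)-(D) for a small program. The threshold of 25 and the
# four-year maximum window come from the regulations; everything else
# (ordering, data structures, example counts) is hypothetical.

DEFAULT_THRESHOLD = 25
MAX_AGGREGATION_YEARS = 4

def aggregation_plan(program_counts, similar_program_counts, threshold=DEFAULT_THRESHOLD):
    """Return a description of how the program size threshold might be reached.

    program_counts: recent-graduate counts for the program itself, most
        recent reporting year first (e.g., [8, 7, 6, 5]).
    similar_program_counts: counts for similar or broader programs of the
        same teacher preparation entity, same ordering.
    """
    # Option (A): combine with similar or broader programs for the current year.
    if program_counts[0] + similar_program_counts[0] >= threshold:
        return "aggregate with similar or broader programs (paragraph (A))"

    # Option (B): aggregate the same program over up to four years.
    if sum(program_counts[:MAX_AGGREGATION_YEARS]) >= threshold:
        return "aggregate over multiple years, up to four (paragraph (B))"

    # Option (C): combine both approaches.
    combined = (sum(program_counts[:MAX_AGGREGATION_YEARS])
                + sum(similar_program_counts[:MAX_AGGREGATION_YEARS]))
    if combined >= threshold:
        return "aggregate across programs and years (paragraph (C))"

    # Option (D): threshold still not met; the State need not report data on,
    # or identify a performance rating for, this program.
    return "no reporting required (paragraph (D))"

# Example: a program producing 8, 7, 6, and 5 recent graduates over four years
print(aggregation_plan([8, 7, 6, 5], [4, 3, 3, 2]))  # paragraph (B): 26 >= 25
```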
Our regulations give States flexibility to determine, with their consultative groups, their own ways of determining a teacher preparation program's performance. But if a State were to use the “lowest common denominator” in evaluating programs, as the commenter suggested, it would not be meeting the requirement in § 612.4(b)(1) to identify meaningful differentiation between programs. We continue to caution against making comparisons of the performance of each teacher preparation program, or the data for each indicator and criterion a State uses to determine the overall level of performance, that States report in their SRCs. Each teacher preparation program is different; each has a different mission and draws different groups of aspiring teachers. The purpose of this reporting is to permit the public to understand which programs a State determines to be low-performing or at-risk of being low-performing, and the reasons for this determination. The regulations do not create a national ranking system for comparing the performance of programs across States. For these reasons, we do not believe that the regulations provide perverse incentives for States to lower their standards relative to other States.
While we appreciate the commenter's recommendation that States be required to use a set of normed criteria common across all sized programs and all States, section 205(b) of the HEA requires each State to include in its SRC its criteria for assessing program performance, including indicators of academic content knowledge and teaching skills. Therefore, subject only to use of the indicators of academic content knowledge and teaching skills defined in these regulations, the law provides that each State determine how to assess a program's performance and, in doing so, how to weight different criteria and indicators that bear on the overall assessment of a program's performance.
We appreciate the commenters' statements about potential challenges and limitations that the regulations' aggregation procedures pose for small teacher preparation programs. However, while we agree that a State's use of these procedures for small programs may produce results that are less meaningful than those for programs that annually produce 25 or more recent graduates (or such lower threshold as the State establishes), we believe that they do provide information that is far more meaningful than the omission of information about performance of these small programs altogether. We also appreciate commenters' concerns that for some States, the process of aggregating program data could entail significant effort. But we assume that data for indicators of this and other programs of the same teacher preparation entities would be procured
Like the commenter, we are concerned about the protection of individual personnel data from public disclosure. But we do not see how the procedures for aggregating data on small programs, under which what the State reports concerns a combined program that meets the size threshold of 25 (or such lower size threshold as the State establishes), create legitimate concerns about such disclosure. And because our proposed regulations did not contain a size threshold of 10, we do not believe we need to make edits to address the specific commenters' concerns regarding our threshold number.
TEACH Grant eligibility would not be impacted because either the State will determine and report the program's performance by aggregating relevant data on that program with data for other teacher preparation programs that are operated by the same teacher preparation entity and are similar to or broader than the program in content, or the program will meet the exceptions provided in § 612.4(b)(3)(ii)(D) and § 612.4(b)(5).
However, given the challenges of having States report on the performance of small programs, we believe that providing States this option, as well as options for aggregating data on the program with similar or broader programs of the same teacher preparation entity (§§ 612.4(b)(3)(ii)(A) and (C)), allows the State to make a reasonable determination of the program's level of performance. This is particularly so given that the regulations require that the State identify only whether a given teacher preparation program is low-performing or at-risk of being low-performing. We note that States have the option to aggregate across programs within an entity if, in consultation with stakeholders, they find that doing so produces a more accurate representation of program quality. See § 612.4(b)(3)(ii)(A). We believe that a State's use of these alternative methods would produce more reliable and valid measures of quality for each of these smaller programs and reasonably balance the need to report annually on program performance with the special challenges of generating a meaningful annual snapshot of program quality for programs that annually produce few novice teachers.
The commenters who recommended reducing the maximum time for aggregating data on the same small program from four years to three did not explain why the option of having an additional year to report on very small programs was preferable to omitting a report on program performance
Another commenter appears to understand that the government wants to review larger data fields for analysis and reporting, but stated that the assumption that data from a program with a smaller "n" size are not worth reporting may dampen innovation and learning on the part of a sponsoring organization that has a stated goal of producing a limited number of teachers or that is in a locale needing a limited number of teachers. The commenter noted that, if a State were to combine programs, report years, or some other combination to get to 25, the Federally stated goal of collecting information about each program, rather than about the overall sponsoring organization, is lost. The commenter argued that § 612.4(c), which the commenter states requires that States report on teacher preparation at the individual program level, appears to contradict the 25-or-more completer rule for reporting.
Another commenter recommended that the proposed regulations include what the commenter characterized as the exemption in the Family Educational Rights and Privacy Act (FERPA) (34 CFR 99.31 or 99.35) that allows for the re-disclosure of student-level data for the purposes of teacher preparation program accountability. The commenter stressed that the proposed regulations do not address a restriction in FERPA that prevents teacher preparation programs from being able to access data that the States will receive on program performance. The commenter voiced concern that as a result of this restriction in FERPA, IHEs will be unable to perform the analyses to determine which components of their teacher preparation programs are leading to improvements in student academic growth and which are not, and urged that we include an exemption in 34 CFR 99.31 or 99.35 to permit the re-disclosure of student-level data to IHEs for the purposes of promoting teacher preparation program accountability. From a program improvement standpoint, the commenter argues that aggregated data are meaningless; teacher preparation programs need fine-grained, person-specific data (data at the lowest level possible) that can be linked to student information housed within the program.
Yet another commenter stated that surveying students (by which we interpret the comment to mean surveying elementary or secondary school students) or parents raises general issues involving FERPA.
At the Federal level, the final regulations do not amend 34 CFR part 99, which are the regulations implementing section 444 of the General Education Provisions Act (GEPA), commonly referred to as FERPA. FERPA is a Federal law that protects the privacy of personally identifiable information in students' education records. See 20 U.S.C. 1232g; 34 CFR part 99. FERPA applies to educational agencies and institutions (elementary and secondary schools, school districts, colleges and universities) that are recipients of Federal funds under a program administered by the Department. FERPA prohibits educational agencies and institutions to which it applies from disclosing personally identifiable information from students' education records, without the prior written consent of the parent or eligible student, unless the disclosure meets an exception to FERPA's general consent requirement. The term “education records” means those records that are: (1) Directly related to a student; and (2) maintained by an educational agency or institution or by a party acting for the agency or institution. Education records would encompass student records that LEAs maintain and that States will need in order to have the data needed to apply the regulatory indicators of academic content and teaching skills to individual teacher preparation programs.
As the commenter implicitly noted, one of the exceptions to FERPA's general consent requirement permits the disclosure of personally identifiable information from education records by an educational agency or institution to authorized representatives of a State educational authority (as well as to local educational authorities, the Secretary, the Attorney General of the United States, and the Comptroller General of the United States) as may be necessary in connection with the audit, evaluation, or the enforcement of Federal legal requirements related to Federal or State supported education programs (termed the “audit and evaluation exception”). The term “State and local educational authority” is not specifically defined in FERPA. However, we have previously explained in the preamble to FERPA regulations published in the
We understand that all SEAs exercise this authority with regard to data provided by LEAs, and therefore FERPA permits LEAs to provide to SEAs the data the State needs to assess the indicators our regulations require. Whether other State agencies such as those that oversee or help to administer aspects of higher education programs or State teacher certification requirements are also State education authorities, and so may likewise receive such data, depends on State law. The Department would therefore need to consider State law (including valid administrative regulations) and the particular responsibilities of a State agency before providing additional guidance about whether a particular State entity qualifies as a State educational authority under FERPA.
The commenter would have us go further, and amend the FERPA regulations to permit State educational authorities to re-disclose this personally identifiable information from students' education records to IHEs or the programs themselves in order to give them the disaggregated data they need to improve the programs. While we understand the commenter's objective, we do not have the legal authority to do this.
Finally, in response to other comments, FERPA does not extend privacy protections to an LEA's records on teachers. Nor do the final regulations require any reporting of survey results from elementary or secondary school students or their parents. To the extent that either is maintained by LEAs, disclosures would be subject to the same exceptions and limitations under FERPA as records of or related to students.
Some commenters believed that, as the relevant stakeholders will vary by State, the regulations should not specify any of the stakeholders that each State must include, leaving the determination of necessary stakeholders to each State's discretion.
Some commenters suggested that States be required to include representatives beyond those listed in the proposed rule. In this regard, commenters stated that representatives of small teacher preparation programs are needed to help the State to annually revisit the aggregation of data for programs with fewer novice teachers than the program size threshold, as would be required under proposed § 612.4(b)(4)(ii). Some commenters recommended adding advocates for low-income and underserved elementary and secondary school students. Some commenters also stated that advocates for students of color, including civil rights organizations, should be required members of the group. In addition, commenters believed that the regulations should require the inclusion of a representative of at least one teacher preparation program provided through distance education, as distance education programs will have unique concerns.
One commenter recommended adding individuals with expertise in testing and assessment to the list of stakeholders. This commenter noted, for example, that there are psychologists who have expertise in aspects of psychological testing and assessment across the variety of contexts in which psychological and behavioral tests are administered. The commenter stated that, when possible, experts such as these who are vested stakeholders in education should be consulted in an effort to ensure the procedures for
Some commenters supported the need for student and parent input into the process of establishing procedures for evaluating program performance but questioned the degree to which elementary and secondary school students and their parents should be expected to provide input on the effectiveness of teacher preparation programs.
One commenter supported including representatives of school boards, but recommended adding the word “local” before “school boards” to clarify that the phrase “school boards” does not simply refer to State boards of education.
We also agree with commenters that States should be required to include as stakeholders advocates for underserved students, such as low-income students and students of color, who are not specifically advocates for English learners and students with disabilities. Section 612.4(c)(ii)(I) includes these individuals, and they could be, for example, representatives of civil rights organizations. To best meet the needs of each State, and to provide room for States to identify other groups of underserved students, the regulations do not specify what those additional groups of underserved students must be.
We agree with the recommendation to require States to include, in the group of stakeholders, a representative of at least one teacher preparation program provided through distance education, because such programs differ from brick-and-mortar programs and warrant representation on the stakeholder group. Under the final regulations, except for the teacher placement rates, States collect information on those programs and report their performance on the same basis as brick-and-mortar programs. See the discussion of comment on
While a State may include individuals with expertise in testing and assessment in the group of stakeholders, we do not require this because States may instead wish to consult with such individuals through other arrangements, or may have other means of acquiring the information they need in this area.
Nonetheless, we encourage States to use their discretion to add representatives from other groups to ensure that the process for developing their procedures and for assessing and reporting program performance is fair and equitable.
We thank commenters for their support for our inclusion of representatives of “elementary through secondary students and their parents” in the consultative group. We included them because of the importance of having teacher preparation programs focus on their ultimate customers—elementary and secondary school students.
Finally, we agree that the regulation should clarify that the school board representatives whom a State must include in its consultative group of stakeholders are those of local school boards. Similarly, we believe that the regulation should clarify that the superintendents whom a State must include in the group of stakeholders are LEA superintendents.
Commenters also stated that the proposed requirement that the procedures for assessing and reporting the performance of each teacher preparation program in the State must include State-level rewards and consequences associated with the designated performance levels is inappropriate because the HEA does not require States to develop rewards or consequences associated with the designated performance levels of teacher preparation programs. Commenters also questioned the amount of information that States would have to share with the group of stakeholders establishing the procedures on the fiscal status of the State to determine what the rewards should be for high-performing programs. Commenters noted that rewards are envisioned as financial in nature, but States operate under tight fiscal constraints. Commenters believed that States would not want to find themselves in an environment where rewards could not be distributed yet consequences (
In addition, commenters were concerned about the lack of standards in the requirement that States implement a process for programs to challenge the accuracy of their performance data and classification. Commenters noted that many aspects of the rating system carry the potential for inaccurate data to be inputted or for data to be miscalculated. Commenters noted that the proposed regulations do not address how to ensure a robust and transparent appeals process for programs to challenge their classification.
The regulations do not require a State to have State-level rewards or consequences associated with teacher preparation performance levels. To the extent that the State does, § 612.4(b)(2)(iii) requires a State to provide that information in the SRC, and § 612.4(c)(1)(ii)(C) requires the State to include those rewards or consequences in the procedures for assessing and reporting program performance it establishes in consultation with a representative group of stakeholders in accordance with § 612.4(c)(1)(i).
Certainly, whether a State can afford to provide financial rewards is an essential consideration in the development of any State-level rewards. We leave it up to each State to determine, in accordance with any applicable State laws or regulations, the amount of information to be shared in the development of any State-level rewards or consequences.
As a part of establishing appropriate opportunities for teacher preparation programs to challenge the accuracy of their performance data and program classification, States are responsible for determining the related procedures and standards, again in consultation with the required representative group of stakeholders. We expect that these procedures and standards will afford programs meaningful and timely opportunities to appeal the accuracy of their performance data and overall program performance level.
In general, many commenters opposed the use of the indicators of academic content knowledge and teaching skills in the SRC, stating that these indicators are arbitrary, and that there is no empirical evidence that connects the indicators to a quality teacher preparation program; that the proposed indicators have never been tested or evaluated to determine their workability; and that there is no consensus in research or among the teaching profession that the proposed performance indicators combine to accurately represent teacher preparation program quality. Other commenters opined that there is no evidence that the indicators selected actually represent program effectiveness, and further stated that no algorithm would accurately reflect program effectiveness and be able to connect those variables to a ranking system. Many commenters expressed concern about the proposed assessment system, stating that reliability and validity data are lacking. Some commenters indicated that reporting may not need to be annual since multi-year data are more reliable.
Commenters also stated that valid conclusions about teacher preparation program quality cannot be drawn using data with questionable validity and with confounding factors that cannot be controlled at the national level to produce a national rating system for teacher preparation programs. Many other commenters stated that teacher performance cannot be equated with the performance of the students they teach and that there are additional factors that impact teacher preparation program effectiveness that have not been taken into account by the proposed regulations. We interpret other comments as expressing concern that use of the outcome indicators would not necessarily help to ensure that teachers
Commenters stated that there are many potential opportunities for measurement error in the outcome indicators and therefore the existing data do not support a large, fully scaled implementation of this accountability system. Commenters argued that the regulations extend an untested performance assessment into a high-stakes realm by determining eligibility for Federal student aid through assessing the effectiveness of each teacher preparation program. One commenter stated that, in proposing the regulations, the Department did not consider issues that increase measurement error, and thus decrease the validity of inferences that can be made about teacher quality. For example, students who graduate but do not find a teaching job because they have chosen to stay in a specific geographic location would essentially count against a school and its respective ranking. Several commenters suggested that we pilot the proposed system and assess its outcomes, using factors that are flexible and contextualized within a narrative, without high-stakes consequences until any issues in data collection are worked out.
In broad terms, validity here refers to the accuracy of these indicators in measuring what they are supposed to measure, and reliability refers to the consistency of the results they produce.
For reasons we explain below, we believe it is important that teacher preparation programs produce new teachers who positively impact student academic success, take jobs as teachers and stay in the profession at least three years, and feel confident about the training the programs have provided to them. This is what these three indicators in our final regulations do—and by contrast what is missing from the criteria that States have reported in SRCs that they have used to date to assess program performance.
We do not believe that State conclusions about the performance levels of their teacher preparation programs can be valid or reliable if they focus, as State criteria have done to date, on the inputs a program offers, any more than an automobile manufacturer's assessment of the validity and reliability of its safety and performance testing makes sense if it does not pay attention to how the vehicles actually perform on the road.
Our final regulations give States, working with their stakeholders, the responsibility for establishing procedures for ensuring that use of these indicators, and such other indicators of academic content knowledge and teaching skills and other criteria the State may establish, permits the State to reasonably identify (
We further note that by defining novice teacher to include a three-year teaching period, which applies to the data collected for student learning outcomes and employment outcomes, the regulations will have States use data for these indicators of program performance over multiple years. Doing so will increase the reliability of the overall level of performance the State assigns to each program in at least two respects. First, it will decrease the chance that one aberrational year of performance or any given cohort of program graduates (or program participants in the case of alternative route teacher preparation programs) has a disproportionate effect on a program's performance. And second, it will decrease the chance that the level of performance a State reports for a program will be invalid or unreliable.
We stress, however, that the student learning outcomes, employment outcomes, and survey outcomes that the regulations require States to use as indicators of academic content and teaching skills are not simply measures that logically are important to assessing a program's true level of performance. Rather, as we discuss below, we believe that these measures are also workable, based on research, and reflective of the direction in which many States and programs are going, even if not reflecting an outright consensus of all teacher preparation programs.
In this regard, we disagree with the commenters' assertions that these measures are arbitrary, lack evidence of support, and have not been tested. The Department's decision to require use of these measures as indicators of academic content knowledge and teaching skills is reinforced by the adoption of similar indicators by CAEP,
We acknowledge that many factors account for the variation in a teacher's impact on student learning. However, we strongly believe that a principal function of any teacher preparation program is to train teachers to promote the academic growth of all students regardless of their personal and family circumstances, and that the indicators whose use the regulations prescribe are already being used to help measure programs' success in doing so. For example, Tennessee employs some of the outcome measures that the regulations require, and reports that some teacher preparation programs consistently produce teachers with statistically significant student learning outcomes over multiple years.
While we acknowledge that some studies of teacher preparation programs
We have found little research one way or the other that directly ties the performance of teacher preparation programs to employment outcomes and survey outcomes. However, we believe that these other measures—program graduates' and alternative route program participants' employment as teachers, retention in the profession, and perceptions (along with those of their employers) of how well their programs have trained them for the classroom—strongly complement use of student learning outcomes in that they help to complete the picture of how well programs have really trained teachers to take on and maintain their teaching responsibilities.
We understand that research into how best to evaluate both teacher effectiveness and the quality of teacher preparation programs continues. To accommodate future developments in research that improve a State's ability to measure program quality as well as State perspectives of how the performance of teacher preparation programs should best be measured, the regulations allow a State to include other indicators of academic content knowledge and teaching skills that measure teachers' effects on student performance (see § 612.5(b)). In addition, given their importance, while we strongly encourage States to provide significant weight in particular to the student learning outcomes and retention rate outcomes in high-need schools in their procedures for assessing program performance, the Department has eliminated the proposed requirements in § 612.4(b)(1) that States consider these measures “in significant part.” The change confirms States' ability to determine how to weight each of these indicators to reflect their own understanding of how best to assess program performance and address any concerns with measurement error. Moreover, the regulations offer States a pilot year, corresponding to the 2017-18 reporting year (for data States are to report in SRCs by October 31, 2018), in which to address and correct for any issues with data collection, measurement error, validity, or reliability in their reported data.
Use of these indicators themselves, of course, does not ensure that novice teachers are prepared to enter the classroom. However, we believe that the regulations, including the requirement for public reporting on each indicator and criterion a State uses to assess a program's level of performance, provide strong incentives for teacher preparation programs to use the feedback from these measures to ensure that the novice teachers they train are ready to take on their teaching responsibilities when they enter the classroom.
We continue to stress that the data on program performance that States report in their SRCs do not create and are not designed to promote any kind of a national, in-State, or interstate rating system for teacher preparation programs, and caution the public against using reported data in this way. Rather, States will use reported data to evaluate program quality based on the indicators of academic content knowledge and teaching skills and other criteria of program performance that they decide to use for this purpose. Of
The regulations define the term “teacher of record” to clarify that teacher preparation programs will be assessed on the aggregate outcomes of novice teachers who are assigned the lead responsibility for a student's learning in the subject area. In this way, although they may generate more data for the student learning outcomes measure, novice teachers who are teachers of record for more than one subject area are treated the same as those who teach in only one subject area.
We do not understand why a science teacher whose district administers only one examination is in a different position than a teacher of any other subject. More important, science is not yet a tested grade or subject under section 1111(b)(2) of the ESEA, as amended by ESSA. Therefore, for the purposes of generating data on a program's student learning outcomes, States that use the definition of “student growth” in § 612.2 will determine student growth for teacher preparation programs that train science teachers through use of measures of student learning and performance that are rigorous, comparable across schools, and consistent with State guidelines. These might include student results on pre-tests and end-of-course tests, objective performance-based assessments, and student learning objectives.
To the extent that the comments refer to small programs that train STEM teachers, the commenters did not indicate why our proposed procedures for reporting data and levels of performance for small teacher preparation programs did not adequately address their concerns. For reasons we discussed in response to comments on aggregating and then reporting data for small teacher preparation programs (§ 612.4(b)(3)(ii)), we believe the procedures the regulations establish for reporting performance of small programs adequately address concerns about program size.
In terms of gathering data about the learning outcomes for students with disabilities, the regulations do not require the teacher of record to use special education teachers' individualized monitoring plans to document student learning outcomes but rather expect teachers to identify, based on the unique needs of the students with disabilities, the appropriate data source. However, we stress that this issue highlights the importance of consultation with key stakeholders, like parents of and advocates for students with disabilities, as States determine how to calculate their student learning outcomes.
Moreover, under § 612.5(b), in assessing the performance of each teacher preparation program, a State may use additional indicators of academic content and teaching skills of its choosing, provided the State uses a consistent approach for all of its teacher preparation programs and these additional indicators provide information on how the graduates produced by the program perform in the classroom. In consultation with their stakeholder groups, States may wish to use additional indicators, such as edTPA, teacher classroom observations, or student survey results, to assess teacher preparation program performance.
As we addressed in our discussion of comment on § 612.4(b)(2)(ii) (Weighting of Indicators), we encourage States to give significant weight to student learning outcomes and employment outcomes in high-need schools. However, we have removed from the final regulations any requirement that States give special weight to these or other indicators of academic content knowledge and teaching skills. Thus, while States must include in their SRCs the weights they give to each indicator and any other criteria they use to identify a program's level of performance, each State has full authority to determine the weighting it gives to each indicator or criterion.
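For illustration only, the arithmetic involved in combining weighted indicators into a performance level is straightforward. The following is a minimal sketch, assuming hypothetical indicator scales (0-100), hypothetical weights, and hypothetical cut scores; nothing in the regulations prescribes these names or values.

    # Illustrative sketch only: combine State-chosen indicator scores into a
    # weighted composite and map the composite to a performance level.
    # The indicator names, weights, and cut scores below are hypothetical.

    def composite_score(indicators, weights):
        """Weighted average of indicator scores; weights are assumed to sum to 1."""
        return sum(weights[name] * indicators[name] for name in weights)

    def performance_level(score):
        """Map a composite score to a hypothetical State-defined performance level."""
        if score >= 75:
            return "effective"
        if score >= 60:
            return "at-risk of being low-performing"
        return "low-performing"

    program = {
        "student_learning_outcomes": 68,
        "employment_outcomes": 80,
        "survey_outcomes": 72,
        "program_characteristics": 90,
    }
    weights = {
        "student_learning_outcomes": 0.40,
        "employment_outcomes": 0.30,
        "survey_outcomes": 0.20,
        "program_characteristics": 0.10,
    }

    score = composite_score(program, weights)
    print(round(score, 1), performance_level(score))  # 74.6 at-risk of being low-performing

A State that weights student learning outcomes more heavily, or that sets different cut scores, would of course reach different classifications from the same data; the point of the sketch is only that whatever weighting scheme a State adopts must be reported in the SRC.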
While we agree that information that helps prospective students identify programs that offer a good value is important, the purpose of sections 205(b)(1)(F) and 207(a) of the HEA, and thus our regulations, is to have States identify and report on meaningful criteria that they use to identify a program's level of performance—and specifically whether the program is low-performing or at-risk of being low-performing. While we encourage States to find ways to make information on a program's costs available to the public, we do not believe the information is sufficiently related to a program's level of performance to warrant the additional costs of requiring States to report it. For similar reasons, we decline to add this consumer information to the SRC as additional data States need to report independent of its use in assessing the program's level of performance.
The commenters may be suggesting that student learning outcomes of novice teachers are partially the consequence of the professional development they receive, yet the proposed regulations seem to attribute student learning outcomes to only the teacher preparation program. The preparation that novice teachers receive in their teacher preparation programs, of course, is not the only factor that influences student learning outcomes. But for reasons we have stated, the failure of recent graduates as a whole to demonstrate positive student learning outcomes is an indicator that something in the teacher preparation program is not working. We recognize that novice teachers receive various forms of professional development, but believe that high-quality teacher preparation programs produce graduates who have the knowledge and skills they need to earn positive reviews and stay in the classroom regardless of the type of training they receive on the job.
Some commenters stated that although there can be differences in traditional and alternative route programs that make comparison difficult, political forces that are pro- or anti-alternative route programs can attempt to make certain types of programs look better or worse. Further, commenters noted that it will be difficult for the Department to enforce equivalent levels of accountability and reporting when differences exist across States' indicators and relative weighting decisions.
Another commenter recommended that, to provide context, programs and States should also report raw numbers in addition to rates for these metrics.
However, we stress that the Department has no expectation or desire that a State will designate a certain number or percentage of its programs as low-performing or at-risk of being low-performing. Rather, we want States to do what our regulations provide: Assess the level of performance of each teacher preparation program based on what they determine to be differentiated levels of performance, and report in the SRCs (1) the data they secure about each program based on the indicators and other criteria they use to assess program performance, (2) the weighting of these data to generate the program's level of performance, and (3) a list of programs they found to be low-performing or at-risk of being low-performing. Beyond this, these regulations do not create, and are not designed to promote, an in-State or inter-State ranking system, or to rank traditional versus alternative route programs based on the reported data.
We acknowledge that if they choose, States may employ growth measures specifically based on a relative distribution of teacher scores statewide, which could constitute a “norm-referenced” indicator. While these statewide scores may not improve on the whole, an individual teacher preparation program's performance can still show improvement (or declines) relative to average teacher performance in the State. The Department notes that programs are evaluated on multiple measures of program quality and the other required indicators can be criterion-referenced. For example, a State may set a specific threshold for retention rate or employer satisfaction that a program must meet to be rated as effective. Additionally, States may decide to compare any norm-referenced student learning outcomes, and other indicators, to those of teachers prepared out of State to determine relative improvement of teacher preparation programs as a whole.
With respect to the recommendation that report cards include raw numbers as well as rates attributable to the indicators and other criteria used to assess program performance, § 612.4(b)(2)(i) requires the State to report data relative to each indicator identified in § 612.5. Section V of the instructions for the SRC asks for the numbers and percentages used in the calculation of the indicators of academic content knowledge and teaching skills and any other indicators and criteria a State uses.
The regulations require States to use four indicators of academic content knowledge and teaching skills—student learning outcomes, employment outcomes, survey results, and minimum program characteristics—in assessing the level of a teacher preparation program's performance under sections 205(b)(1)(F) and 207(a) of the HEA. In
Thus, the regulations address shortcomings in the current State reporting system by defining indicators of academic content knowledge and teaching skills, focusing on program outcomes that States will use to assess program performance. The regulations build on current State systems and create a much-needed feedback loop to facilitate program improvement and provide valuable information to prospective teachers, potential employers, the general public, and the programs themselves. We agree that program innovation and capacity building are worthwhile, and we believe that what States will report on each program will encourage these efforts.
Under the regulations, teacher preparation programs whose graduates (or participants, if they are teachers while being trained in an alternative route program) do not demonstrate positive student learning outcomes are not punished, nor are States required to punish programs. To the extent that proposed § 612.4(b)(2), which would have permitted a program to be considered effective or higher only if the teachers it produces demonstrate satisfactory or higher student learning outcomes, raised concerns about the regulations seeming punitive, we have removed that provision from the final regulations. Thus, the regulations echo the requirements of section 207(a) of the HEA, which requires that States annually identify teacher preparation programs that are low-performing or that are at-risk of becoming low-performing, and section 207(b) of the HEA, which prescribes the consequences for a program from which the State has withdrawn its approval or terminated its financial support. For a discussion of the relationship between the State classification of teacher preparation programs and TEACH Grant eligibility, see § 686.2 regarding a TEACH Grant-eligible program.
These comments also led us to see potential confusion in the proposed definitions of student learning outcomes and student growth. In reviewing the proposed regulations, we recognized that the original structure of the definition of “student learning outcomes” could cause confusion. We are concerned that having a definition for the term, which was intended only to operationalize the other definitions in the context of § 612.5, was not the
In general, commenters questioned the Department's basis for the use of student learning outcomes as one measure of teacher preparation program performance, citing research to support their claim that the method of measuring student learning outcomes as proposed in the regulations is neither valid nor reliable, and that there is no evidence to support the idea that student outcomes are related to the quality of the teacher preparation program attended by the teacher. Commenters further expressed concerns about the emphasis on linking children's test scores on mandated standardized tests to student learning outcomes. Commenters also stated that teacher preparation programs are responsible for only a small portion of the variation in teacher quality.
Commenters proposed that aggregate teacher evaluation results be the only measure of student learning outcomes so long as the State teacher evaluations do not rely overly on results from standardized tests. Commenters stated that in at least one State, teacher evaluations cannot be used as part of teacher licensure decisions or to reappoint teachers due to the subjective nature of the evaluations.
Some commenters argued that student growth cannot be defined as a simple comparison of achievement between two points in time.
One commenter, who stated that the proposed regulatory approach is thorough and aligned with current trends in evaluation, also expressed concern that K-12 student performance (achievement) data are generally a snapshot in time, typically the result of one standardized test, that does not identify growth over time, the context of the test taking, or other variables that impact student learning.
Commenters further cited research that concluded that student achievement in the classroom is not a valid predictor of whether the teacher's preparation program was high quality and asserted that other professions do not use data in such a simplistic way.
Another commenter stated that local teacher evaluation instruments vary significantly across towns and States.
Another commenter stated that student performance data reported in the aggregate and by subgroups to determine trends and areas for improvement is acceptable but should not be used to label or categorize a school system, school, or classroom teacher.
As we have previously stated, we intend the use of all indicators of academic content knowledge and teaching skills to produce information about the performance-level of each teacher preparation program that, speaking broadly, is valid and reliable. It is clear from the comments we received that there is not an outright consensus on using student learning outcomes to help measure teacher preparation program performance; however, we strongly believe that a program's ability to prepare teachers who can positively influence student academic achievement is both an indicator of their academic content knowledge and teaching skills, and a critical measure for assessing a teacher preparation program's performance. Student learning outcomes therefore belong among multiple measures States must use. We continue to highlight growth as a particularly appropriate way to measure a teacher's effect on student learning because it takes a student's prior achievement into account, gives a teacher an opportunity to demonstrate success regardless of the student characteristics of the class, and therefore reflects the contribution of the teacher to student learning. Even where student growth is not used, producing teachers who can make a positive contribution to student learning should be a fundamental objective of any teacher preparation program and the reason why it should work to provide prospective teachers with academic content and teaching skills. Hence, student learning outcomes, as we define them in the regulations, associated with each teacher preparation program are an important part of an assessment of any program's performance.
States therefore need to collect data on student learning outcomes—through student growth that examines the change in student achievement in both tested and non-tested grades and subjects, a teacher evaluation measure as defined in the regulations, or another State-determined measure relevant to calculating student learning outcomes—and then link these data to the teacher preparation program that produced (or in the case of an alternative route program, is producing) these teachers.
In so doing, States may, if they wish, use statistical measures of growth, like VAM or student growth percentiles, that control for student demographics that are typically associated with student achievement. There are multiple examples of the use of similar student learning outcomes in existing research and State reporting. Tennessee, for example, reports that some teacher preparation programs consistently exhibit statistically significant differences in student learning outcomes over multiple years, indicating that scores are reliable from one year to the next.
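For illustration only, the following is a minimal sketch of one covariate-adjusted growth calculation of the kind described above, in which each student's current score is adjusted for prior achievement and a low-income indicator, and a teacher's estimate is the average residual of that teacher's students. The data, the choice of controls, and the ordinary least squares approach are assumptions made for the example, not features required by the regulations.

    # Illustrative sketch only: a simple covariate-adjusted ("value-added" style)
    # growth estimate. Current scores are regressed on prior scores and a
    # low-income indicator; each teacher's estimate is the mean residual of
    # that teacher's students. All data below are hypothetical.
    import numpy as np

    X = np.array([[610, 1], [640, 0], [590, 1], [660, 0], [620, 1], [600, 0]], dtype=float)
    y = np.array([625, 652, 605, 668, 640, 610], dtype=float)   # current-year scores
    teacher = np.array(["A", "A", "A", "B", "B", "B"])           # teacher of record

    X1 = np.column_stack([np.ones(len(y)), X])                   # add an intercept column
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)                # ordinary least squares fit
    residuals = y - X1 @ beta                                     # actual minus expected score

    for t in np.unique(teacher):
        print(t, round(float(residuals[teacher == t].mean()), 2))  # mean residual per teacher

Value-added models used in practice add many refinements, such as multiple prior scores, classroom and school covariates, and shrinkage of noisy estimates, but the basic logic of comparing actual with statistically expected achievement is the same.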
While some studies of teacher preparation programs
Moreover, looking at the related issue of educator evaluations, there is debate about the level of reliability and validity of the individual elements used in different teacher evaluation systems. However, there is evidence that student growth can be a useful and effective component in teacher evaluation systems. For example, a study found that dismissal threats and financial incentives based partially upon growth scores positively influenced teacher performance.
Teacher preparation programs may well account for only some of the variation in student learning outcomes. However, this does not absolve programs from being accountable for the extent to which their graduates positively impact student achievement. Thus, while the regulations are not intended to address the entire scope of student achievement or all factors that contribute to student learning outcomes, the regulations focus on student learning outcomes as an indicator of whether or not the program is performing properly. In doing so, one would expect that, through a greater focus on their student learning outcomes, States and teacher preparation programs will have the benefit of some basic data about where their work to provide all students with academic content knowledge and teaching skills needs to improve.
Commenters pointed out that not all graduates from a specific institution or program will be teaching in similar school contexts and that many factors influencing student achievement cannot be controlled for between testing intervals. Commenters also cited other contributing factors to test results that are not in a teacher's control, including poverty and poverty-related stress; inadequate access to health care; food insecurity; the student's development, family, home life, and community; the student's background knowledge; the available resources in the school district and classroom; school leadership; school curriculum; students not taking testing situations seriously; and school working conditions. Commenters also noted that students are not randomly placed into classrooms or schools and are often grouped by socioeconomic class and by linguistic segregation, which influences test results.
However, teacher preparation programs should prepare novice teachers to be effective and successful in all classroom environments, including in high-need schools. It is for this reason, as well as to encourage States to highlight successes in these areas, that we include placement and retention rates in high-need schools as indicators of academic content knowledge and teaching skills.
In addition, States and school districts can control for different kinds of student and classroom characteristics in the ways in which they determine student learning outcomes (and student growth). States can, for example, control for school-level characteristics like the concentration of low-income students in the school and in doing so compare teachers who teach in similar schools. Evidence cited below that student growth, as measured by well-designed statistical models, captures the causal effects of teachers on their students also suggests that measures of student growth can successfully mitigate much of the potential bias, and supports the conclusion that non-random sorting of students into classrooms does not cause substantial bias in student learning outcomes. We stress, however, that the decision to use such controls and other statistical measures to control for student and school characteristics in calculating student learning outcomes is up to States in consultation with their stakeholder groups.
A large number of commenters stated that research points to the challenges and ineffectiveness of using VAM to evaluate both teachers and teacher preparation programs, and asserted that the data collected will be neither meaningful nor useful. Commenters also stated that use of VAM for decision-making in education has been discredited by leading academic and professional organizations such as the American Statistical Association (ASA)
Many commenters also noted that value-added models of student achievement are developed and normed to test student achievement, not to evaluate educators, so using these models to evaluate educators is invalid because the tests have not been validated for that purpose. Commenters further noted that value-added models of student achievement tied to individual teachers should not be used for high-stakes, individual-level decisions or comparisons across highly dissimilar schools or student populations.
Commenters stated that in psychometric terms, VAM are not reliable. They contended that it is a well-established principle that reliability is a necessary but not sufficient condition for validity. If judgments about a teacher preparation program vary based on the method of estimating value-added scores, inferences made about programs cannot be trusted.
Others noted Edward Haertel's
On the use of VAM specifically, we reiterate that the regulations permit multiple ways of measuring student learning outcomes without use of VAM; if they use student growth, States are not required to use VAM. We note also that use of VAM was not a requirement of Race to the Top, nor was it a requirement of ESEA Flexibility, although many States that received Race to the Top funds or ESEA flexibility committed to using statistical models of student growth based on test scores. We also stress that in the context of these regulations, a State that chooses to use VAM and other statistical measures of student growth would use them to help assess the performance of teacher preparation programs as a whole. Neither the proposed nor final regulations address, as many commenters stated, how or whether a State or district might use the results of a statistical model for individual teachers' evaluations and any resulting personnel actions.
Many States and districts currently use a variety of statistical methods in teacher, principal, and school evaluation, as well as in State accountability systems. VAM are one such way of measuring student learning outcomes that are used by many States and districts for these accountability purposes. While we stress that the regulations do not require or anticipate the use of VAM to calculate student learning outcomes or teacher evaluation measures, we offer the following summary of VAM in view of the significant number of comments the Department received on the subject.
VAM are statistical methodologies developed by researchers to estimate a teacher's unique contribution to growth in student achievement, and are used in teacher evaluation and evaluation of teacher preparation programs. Several experimental and quasi-experimental studies conducted in a variety of districts have found that VAM scores can measure the causal impact teachers have on student learning.
The Department therefore disagrees with commenters who state that the efficacy of VAM is not grounded in sound research. We believe that VAM is commonly used as a component in many teacher evaluation systems precisely because the method minimizes the influence of observable factors independent of the teacher that might affect student achievement growth, like student poverty levels and prior levels of achievement.
Several commenters raised important points to consider in using VAM for teacher evaluation. Many cited the April 8, 2014, “ASA Statement on Using Value-Added Models for Educational Assessment” referenced in the summary of comment, which makes several reasonable recommendations regarding the use of VAM, including its endorsement of wise use of data, statistical models, and designed experiments for improving the quality of education. We believe that the definitions of “student learning outcomes” and “student growth” in the regulations are fully compatible with valid and reliable ways of including VAM to assess the impact of teachers on student academic growth. Therefore, States that choose to use VAM to generate student learning outcomes would have the means to do what the ASA statement recommends: Use data and statistical models to improve the quality of their teacher preparation programs. The ASA also wisely cautions that VAMs are complex statistical models that necessitate high levels of statistical expertise to develop and run, and that estimates from them should include measures of the model's precision. These specific recommendations are entirely consistent with the regulations, and we encourage States to follow them when using VAM.
We disagree, however, with the ASA and commenters' assertions that VAM typically measures correlation, not causation, and that VAM does not measure teacher contributions toward other student outcomes. These assertions contradict the evidence cited above that VAM does measure the causal effects of teachers on student achievement, and that teachers with high VAM scores also improve long-term student outcomes.
The implication of the various studies we cited in this section is clear: not only can VAM identify teachers who improve short- and long-term student outcomes, but VAM can play a substantial role in effective, useful teacher evaluation systems.
However, as we have said, States do not need to use VAM to generate student learning outcomes. Working with their stakeholders, States can, if they choose, establish other means of reporting a teacher preparation program's “student learning outcomes” that meet the basic standard in § 612.5(a)(1).
We note, however, that there is strong evidence that early career performance is a significant predictor of future performance. Two studies have found that growth scores in the first two years of a teacher's career, as measured by VAM, better predict future performance than measured teacher characteristics that are generally available to districts, such as a teacher's pathway into teaching, available credentialing scores and SAT scores, and competitiveness of undergraduate institution.
Moreover, even if States choose not to use VAM results as student growth measures, the function of teacher preparation programs is to train teachers to be ready to teach when they enter the classroom. We believe student learning outcomes should be measured early in a teacher's career, when the impact of their preparation is likely to be the strongest. However, while we urge States to give significant weight to their student outcome measures across the board, the regulations leave to each State how to weight the indicators of academic content knowledge and teaching skills for novice teachers in their first and other years of teaching.
Regarding comparability across States in the assessments administered to students, nothing in this regulation requires such comparability, and we believe such a requirement would infringe upon the discretion States have historically been provided under the ESEA in determining State standards, assessments, and curricula.
We understand the other comment to question the validity of comparisons of teacher preparation program ratings, as reported in the SRC. We continue to stress that the data regarding program performance reported in the SRCs and required by the regulations do not create, or intend to promote, any in-State or inter-State ranking system. Rather, we anticipate that States will use reported data to evaluate program performance based on State-specific weighting.
One commenter stated that there are validity issues with using tests to measure the skills of deaf children since standardized tests are based on hearing norms and may not be applicable to deaf children. Another commenter noted that deaf and hard-of-hearing K-12 students almost always fall below expected grade level standards, impacting student growth and, as a result, teacher preparation program ratings under our proposed regulations. In a similar vein, one commenter expressed concern that teacher preparation programs that prepare teachers of English learners may be unfairly branded as low-performing or at-risk because the students are forced to conform to tests that are neither valid nor reliable for them.
We expect that these measures of student learning outcomes and other indicators used in State systems under this regulation will be developed in consultation with key stakeholders (see § 612.4(c)), and be based on measures of achievement that conform to student learning outcomes as described in § 612.5(a)(1)(ii).
Some commenters stated that by, in effect, telling teacher preparation programs that their graduates should engage in behaviors that lift the test scores of their students, the likely main effect will be classrooms that are more directly committed to test preparation (and to what the psychometric community calls score inflation) than to advancement of a comprehensive education.
Commenters opined that, to avoid unfavorable outcomes, teacher preparation programs will seek to place their graduates in higher-performing schools. Rather than encouraging stronger partnerships, commenters expressed concern that programs will abandon efforts to place graduates in low-performing schools. Others were concerned that teachers will self-select out of high-need schools, and a few commenters noted that high-performing schools will continue to have the most resources while teacher shortages in high-need schools, such as those in Native American communities, will be exacerbated.
Some commenters stated that it was unfair to assess a teacher preparation program based on, as we interpret the comment, the student learning outcomes of the novice teachers produced by the program because the students taught by novice teachers may also receive instruction from other teachers who may have more than three years of experience teaching.
For the purposes of the regulations, student learning outcomes may be calculated using student growth. Because growth measures the change in student achievement between two or more points in time, the prior achievement of students is taken into account. Teacher preparation programs may thus be assessed, in part, based on their recent graduates' efforts to increase student growth, not on whether the teachers' classrooms contained students who started as high or low achieving. For this reason, teachers—regardless of the academic achievement level of the students they teach—have the same opportunity to positively impact student growth. Likewise, teacher preparation programs that place students in high-need schools have the same opportunity to achieve satisfactory or higher student learning outcomes. These regulations take into account the commenters' concerns related to teacher equity as placement and retention in high-need schools are required metrics.
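Expressed as a worked formula, and only as one illustrative reading of a gain-based growth measure rather than a formula the regulations prescribe, the growth attributed to the students of teacher j might be written:

    g_{ij} = y_{ij,\,t} - y_{ij,\,t-1}, \qquad
    G_j = \frac{1}{n_j} \sum_{i=1}^{n_j} g_{ij}

Here y_{ij,t} is the achievement of student i taught by teacher j at time t, g_{ij} is that student's gain, and G_j averages the gains over the teacher's n_j students. Because each student's own prior score enters the calculation, a teacher's measured growth does not depend on whether the class began the year high- or low-achieving, and a State could then aggregate the G_j values of a program's novice teachers into the program's student learning outcome.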
We recognize that many factors influence student achievement. Commenters who note that students taught by novice teachers may also receive instruction from other teachers who may have more than three years of experience teaching cite but one factor. But the objective in having States use student growth as an indicator of the performance of a teacher preparation program is not to finely calculate how novice teachers impact student growth. As we have said, it rather is to have the State determine whether a program's student learning outcomes are so far from the mark as to be an indicator of poor program performance.
For these reasons, we disagree with commenters that the student learning outcomes measure will discourage preparation programs and teachers from serving high-need schools. We therefore decline to make changes to the regulations.
Alternatively, to the extent that the commenter was referring to difficulties obtaining data for student learning outcomes (or other of our indicators of academic content and teaching skills) because of the small size of the teacher preparation programs, § 612.4(b)(3)(ii) provides different options for aggregation of data so the State can provide these programs with appropriate performance ratings. In this case, except for teacher preparation programs that are so small that even these aggregation methods will not permit the State to identify a performance level (see § 612.4(b)(3)(ii)(D) and § 612.4(b)(5)), all programs will have data on student learning outcomes with which to determine the program's level of performance.
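For illustration only, one way a State might operationalize aggregation for a small program is sketched below; the program size threshold and the number of years pooled are hypothetical placeholders, not values fixed by § 612.4(b)(3)(ii).

    # Illustrative sketch only: pool a small program's most recent annual cohorts
    # of novice teachers until a hypothetical program size threshold is met; if
    # even pooling every available year falls short, report no performance level.
    THRESHOLD = 25  # hypothetical program size threshold

    def pooled_cohort(cohorts_newest_first):
        """cohorts_newest_first: lists of novice-teacher records, most recent year first."""
        pooled = []
        for cohort in cohorts_newest_first:
            pooled.extend(cohort)
            if len(pooled) >= THRESHOLD:
                return pooled
        return None  # still too small to report, even after aggregation

    cohorts = [["t%d" % i for i in range(9)],   # 9 novice teachers this year
               ["t%d" % i for i in range(8)],   # 8 the year before
               ["t%d" % i for i in range(11)]]  # 11 the year before that
    result = pooled_cohort(cohorts)
    print(len(result) if result else "no performance level reported")  # 28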
One commenter asked the Department to confirm that the commenters' State's ESEA flexibility waiver would meet the student learning outcome requirements for both tested and non-tested grades and subjects, and if so, given the difficulty and cost, whether the State would still be required to report disaggregated data on student growth in assessment test scores for individual teachers, programs, or entities in the SRC. Commenters also noted that LEAs could be especially burdened, with no corresponding State or Federal authority to compel LEA compliance. A commenter stated that in one city most teachers have 20 to 40 percent of their evaluations based on tests in subjects they do not teach.
Commenters urged that States be given flexibility in determining the components of data collection and reporting systems with minimal common elements. This would, as commenters indicated, ultimately delay the State's ability to make valid and reliable determinations of teacher preparation program quality. Some commenters stated that States should be required to use student learning outcomes as a factor in performance designations, but allow each State to determine how best to incorporate these outcomes into accountability systems.
Commenters noted that a plan for creating or implementing a measure of student achievement in content areas for which States do not have valid statewide achievement data was not proposed, nor was a plan proposed to pilot or fund such standardized measures.
Regarding the suggestion that State standards for student learning outcomes should be nationally coordinated, States are free to coordinate. But how each State assesses a program's performance is a State decision; the HEA does not otherwise provide for such national coordination.
With respect to the comment asking whether a State's ESEA flexibility waiver would meet the student learning outcomes requirement for both tested and non-tested grades and subjects, this issue is likely no longer relevant since the enactment of the ESSA will make ESEA flexibility waivers null and void on August 1, 2016. However, in response to the commenters' question, so long as the State is implementing the evaluation systems as they committed to do in order to receive ESEA flexibility, the data it uses for student learning outcomes would most likely represent an acceptable way, among other ways, to comply with the title II reporting requirements.
We understand the comment, that LEAs would be especially burdened with no corresponding State or Federal authority to compel LEA compliance, to refer to LEA financial costs. It is unclear that LEAs would be so burdened. We believe that our cost estimates, as revised to respond to public comment, are accurate. Therefore, we also believe that States, LEAs, and IHEs will be able to meet their responsibilities under this reporting system without the need for new funding sources. We discuss authorities related to LEA compliance in the discussion under § 612.1.
Regarding specific reporting recommendations for State flexibility in use of student learning outcomes, States must use the indicators of academic content knowledge and teaching skills identified in § 612.5(a). However, States otherwise determine for themselves how to use these indicators and other indicators and criteria they may establish to assess a program's performance. In identifying the performance level of each program, States also determine the weighting of all indicators and criteria they use to assess program performance.
Finally, we understand that all States are working to implement their responsibilities to provide results of student assessments for grades and subjects in which assessments are required under section 1111(b)(2) of the ESEA, as amended by ESSA. With respect to the comment that the Department did not propose a plan for creating or implementing a measure of student achievement in content areas for which States do not have valid statewide achievement data, the regulations give States substantial flexibility in how they measure student achievement. Moreover, we do not agree that time to pilot such new assessments or growth calculations, or more Federal funding in this area, is needed.
However, upon review of the definitions of the terms “student achievement in non-tested grades and subjects,” “student achievement in tested grades and subjects,” and “teacher evaluation measure” in proposed § 612.2, we realized that these definitions did not clearly authorize States to exclude student learning outcomes associated with these teachers from their calculation of a teacher preparation program's aggregate student learning outcomes. Therefore, we have revised § 612.5(a)(1) to include authority for the State to exclude data on student learning outcomes for students of novice teachers teaching out of State or in private schools from its calculation of a teacher preparation program's student learning outcomes. In doing so, as with the definitions of teacher placement rate and teacher retention rate, we have included in the regulations a requirement that the State use a consistent approach with regard to omitting or using these data in assessing and reporting on all teacher preparation programs.
In addition, programs that persistently produce teachers who fail to find jobs, or, once teaching, fail to remain employed as teachers, may well not be providing the level of academic content knowledge and teaching skills that novice teachers need to succeed in the classroom. Working with their stakeholders (see § 612.4(c)), each State will determine the point at which the reported employment outcomes for a program go from the acceptable to the unacceptable, the latter indicating a problem with the quality of the program. We fully believe that these outcomes reflect another reasonable way to define an indicator of academic content knowledge and teaching skills, and that unacceptable employment outcomes show something is wrong with the quality of preparation the teaching candidates have received.
Further, we believe that given the need for teacher preparation programs to produce teachers who are prepared to address the needs of students in high-need schools, it is reasonable and appropriate that indicators of academic content and teaching skills used to help assess a program's performance focus particular attention on teachers in those schools. Therefore, we do not believe that States should have the option of omitting teacher placement rates (and teacher retention rates) for high-need schools from their SRCs.
We agree with commenters that, in States where postsecondary training and certification is required, and the State licenses those teachers, data on the placement and retention of preschool teachers should be reported. We strongly encourage States to report this information. However, we decline to require that they do so because pre-kindergarten licensure and teacher evaluation requirements vary significantly between States and among settings, and given these State and local differences in approach we believe that it is important to leave the determination of whether and how to include preschool teachers in this measure to the States.
We disagree with the commenter's suggestion that alternative route program participants are teaching in out-of-field positions. Employment as a teacher is generally a prerequisite to entry into alternative route programs, and the alternative route program participants are being prepared for an initial certification or licensure in the field in which they are teaching. We do not know of evidence to suggest that most participants in alternative route programs become teachers of record without first having demonstrated adequate subject-matter content knowledge in the subjects they teach.
Nonetheless, traditional route programs and alternative route programs recruit from different groups of prospective teachers and have different characteristics. It is for this reason that, both in our proposed and final regulations, States are permitted to assess the employment outcomes of traditional route programs versus alternative route programs differently, provided that the different assessments result in equivalent standards of accountability and reporting.
Some commenters also expressed concern about how the retention rate measure will be used to assess performance during the first few years of implementation. They stated that it would be unfair to rate teacher preparation programs without complete information on retention rates.
We understand that, during the initial years of implementation, States will not have complete data on retention. We expect that States will weigh indicators for which data are unavailable during these initial implementation years in a way that is consistent and applies equivalent levels of accountability across programs. For further discussion of the reporting cycle and implementation timeline, see § 612.4(a). We also note that, as we explain in our response to comments on the definition of “teacher retention rate”, under the final regulations States will report on teachers who remain in the profession in the first three consecutive years after placement.
However, permitting States to exclude from the teacher placement rate calculation, but not from the teacher retention rate calculation, recent graduates who have taken teaching positions that do not require State certification could create inconsistencies between the measures. Moreover, upon further review, we believe permitting the exclusion of this category of teachers from either calculation runs contrary to the purpose of the regulations, which is to assess the performance of programs that lead to an initial State teacher certification or licensure in a specific field. For these reasons, the option to exclude this category of teachers has been removed from the definition of “teacher placement rate” in the final regulations (see § 612.2). With this change, the differences between the categories of teachers that can be excluded from teacher placement rate and teacher retention rate will not unfairly impact the outcomes of these measures, so long as the State uses a consistent approach to assess and report on all programs in the State.
We acknowledge that special education teachers face particular challenges, and that like other teachers, there are a variety of reasons—some dealing with the demands of their specialty, and some dealing with a desire for other responsibilities, or personal factors—for novice special education teachers to decide to move to other professional areas. For example, some teachers with special education training, after initial employment, may choose to work in regular education classrooms, where many children with disabilities are taught consistent with the least restrictive environment provisions of the Individuals with Disabilities Education Act. Their specialized training can be of great benefit in the regular education setting.
Under our regulations, States will determine how to apply the teacher retention indicator, and so determine in consultation with their stakeholders (see § 612.4(c)) what levels of retention would be so unreasonably low (or so unexpectedly high) as to reflect on the quality of the teacher preparation program. We believe this State flexibility will incorporate consideration of the programmatic quality of special education teacher preparation and the general circumstances of employment of these teachers. Special education teachers are teachers first and foremost, and we do not believe the programs that train special education teachers should be exempted from the State's overall calculations of their teacher retention rates. Demand for teachers trained in special education is expected to remain high, and given the flexibility States have to determine what is a reasonable retention rate for novice special education teachers, we do not believe that this indicator of program quality will result in a reduction of special education preparation programs.
The Department required all States to submit State Plans to Ensure Equitable Access to Excellent Educators (Educator Equity Plans) to address this requirement, and we look forward to the time when employment outcomes that focus on high-need schools are unnecessary. However, it is much too early to remove employment indicators that focus on high-need schools. For this reason, we decline to accept the commenters' recommendation, made out of concern that these reporting requirements are inconsistent with those under the ESEA, that we remove them.
We add that, just as States will establish the weights for these outcomes in assessing the level of program performance, States also may adjust their expectations for placement and retention rates for high-need schools in order to support successful implementation of their State plans.
We disagree with commenters that the regulations will lead to higher turnover rates. By requiring reporting on these employment outcomes by program, we believe that employers will be better able to identify programs with strong track records for preparing novice teachers who stay, and succeed, in high-need schools. This information will help employers make informed hiring decisions and may ultimately help districts reduce teacher turnover rates.
Several commenters suggested additional indicators that could be used to report on employment outcomes. Specifically, commenters suggested that programs should report the demographics and outcomes of enrolled teacher candidates by race and ethnicity (graduation rate, dropout rates, placement rates for graduates, first-year evaluation scores (if available), and the percentage of teacher candidates who stay within the teaching profession for one, three, and five years). Also, commenters suggested that the Department include the use of readily available financial data when reporting employment outcomes. Another commenter suggested that the Department collect information on how many teachers from each teacher preparation program attain an exemplary rating through the statewide evaluation systems. Finally, one commenter suggested counting the number of times schools hire graduates from the same teacher preparation program.
We do not believe that further disaggregation of data as recommended will produce a sufficiently useful indicator of teacher preparation program performance to justify a requirement that all States implement one or more of these recommendations. We therefore decline to adopt them. We also do not believe additional indicators are necessary to assess the academic content knowledge and teaching skills of the novice teachers from each teacher preparation program, though, consistent with § 612.5(b), States are free to adopt them if they choose to do so.
Several commenters noted that school districts often handle their own decisions about hiring and placement of new school teachers, which severely limits institutions' ability to place teachers in schools. Many commenters advised against using employment data in assessments of teacher preparation programs. Some stated that these data would fail to recognize the importance of teacher preparation program students' variable career paths and potential for employment in teaching-related fields. They argued that defining teacher preparation program quality narrowly, in terms of a limited conception of employment for graduates, is misguided and unnecessarily damaging.
Other commenters argued that the assumption underlying this proposed indicator, namely that anything less than full placement or retention reflects poorly on a program, is unwarranted.
In applying these employment outcome measures, we do not assume that States will treat a rate below 100 percent as a poor reflection on the quality of the teacher preparation program. Rather, in applying these measures States may determine what placement rates and retention rates would be so low (or so high, if they choose to identify exceptionally performing programs) as to speak to the quality of the program itself.
However, while factors like those commenters identify affect employment outcomes, we believe that the primary goal of teacher preparation programs should be to produce graduates who successfully become classroom teachers and stay in teaching at least several years. We believe that high placement and retention rates are indicators that a teacher preparation program's graduates (or an alternative route program's participants if a State chooses to look at them rather than program graduates) have the requisite content knowledge and teaching skills to demonstrate sufficient competency to find a job, earn positive reviews, and choose to stay in the profession. This view is shared by States like North Carolina, Louisiana, and Tennessee, as well as CAEP, which require reporting on similar outcomes for teacher preparation programs.
Commenters accurately point out that teachers in low-performing schools with high concentrations of students of color have significantly higher rates of turnover. Research from New York State confirms this finding, but also shows that first-year teachers who leave a school are, on average, significantly less effective than those who stay.
The use of employment outcomes as indicators of the performance of a teacher preparation program also reflects the relationship between teacher retention rates and student outcomes. At the school level, high teacher turnover can have multiple negative effects on student learning. When a teacher leaves a school, it is more likely that the vacancy will be filled by a less-experienced and, on average, less-effective teacher, which will lower the achievement of students in the school. In addition to this effect on the composition of a school's teacher workforce, the findings of Ronfeldt and his colleagues suggest that turnover itself is disruptive to schools, lowering achievement even for students whose teachers remain.
Thus, we believe that employment outcomes, taken together, serve not only as reasonable indicators of academic content knowledge and teaching skill, but also as potentially important incentives for programs and States to focus on a program's ability to produce graduates with the skills and preparation to teach for many years. Placement rates, overall and in high-need schools specifically, are particularly important in that they provide a baseline context for evaluating a program's retention rates. In an extreme example, a program may have 100 graduates, but if only one graduate actually secures employment as a teacher and continues to teach, that program would have a retention rate of 100 percent. Plainly, such a retention rate does not provide a meaningful or complete assessment of the program's impact on teacher retention, and thus on this indicator of program quality. Similarly, two programs may each produce 100 teachers, but one program only places teachers in high-need schools, while the other places no teachers in high-need schools. Even if the programs produced graduates of exactly the same quality, the program that serves high-need schools would be likely to have lower retention rates, due to the challenges described in the comments and above.
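To make the arithmetic behind this example concrete, the following sketch is purely illustrative; the cohort figures and function names are hypothetical and nothing in it is prescribed by the regulations. It simply shows how a very small placement denominator can yield a 100 percent retention rate even when almost no graduates become teachers.

```python
# Hypothetical illustration (not part of the regulations): how a tiny
# placement denominator can make a retention rate look deceptively strong.

def placement_rate(graduates, placed):
    """Share of recent graduates employed as teachers of record."""
    return placed / graduates if graduates else 0.0

def retention_rate(placed, still_teaching):
    """Share of placed teachers still teaching in a later year."""
    return still_teaching / placed if placed else 0.0

graduates = 100      # assumed cohort size from the example above
placed = 1           # only one graduate secures a teaching position
still_teaching = 1   # and that one teacher remains in the classroom

print(f"Placement rate: {placement_rate(graduates, placed):.0%}")        # 1%
print(f"Retention rate: {retention_rate(placed, still_teaching):.0%}")   # 100%
```

Read together, the two rates make clear that the 100 percent retention figure rests on a single placed teacher.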
Finally, we reiterate that States have flexibility to determine how employment outcomes should be weighted, so that they may match their metrics to their individual needs and conditions. In regard to using other available measures of teaching ability and academic content knowledge, like edTPA, we believe that, taken together, the outcome-based measures that we require provide reasonable indicators of academic content knowledge and teaching skills; consistent with § 612.5(b), States remain free to adopt such additional measures if they choose to do so.
It is clear from the comments we received that there is not an outright consensus on using employment outcomes to measure teacher preparation programs; however, we strongly believe that the inclusion of employment outcomes with other measures contributes to States' abilities to make valid and reliable decisions about program performance. Under the regulations, States will work with their stakeholders (see § 612.4(c)) to establish methods for evaluating the quality of data related to a program's outcome measures, and all other indicators, to ensure that the reported data are fair and equitable. As we discussed in the NPRM, in doing so, the State should use this process to ensure the reliability, validity, integrity, and accuracy of all data reported about the performance of teacher preparation programs. We recognize the burden that reporting on employment outcomes may place on individual programs, and for this reason, we suggest, but do not require, that States examine their capacity, within their longitudinal data systems, to track employment outcomes because we believe this will reduce costs for IHEs and increase efficiency of data collection.
We recognize that program graduates may not end up teaching in the same State as their teacher preparation program for a variety of reasons, and we suggest, but do not require, that States create inter-State partnerships to better track employment outcomes of program completers, as well as agreements that allow them to track military service, graduate school enrollment, and employment as a teacher in a private school. But we do not believe that the exclusion of these recent graduates, or those who go on to teach in private schools, jeopardizes reasonable use of this indicator of teacher preparation program performance. As noted previously, we have revised the regulations so that States may not exclude recent graduates employed in positions that do not require certification from their calculations of employment outcomes. Working with their stakeholders (see § 612.4(c)), States will be able to determine how best to apply the retention rate data that they have.
Finally, we understand that many teacher preparation programs do not currently collect data on factors like job placement, how long their graduates who become teachers stay in the profession, and the gains in academic achievement that are associated with their graduates. However, collecting this information is not beyond those programs' capacity. Moreover, the regulations make the State responsible for ensuring that data needed for each indicator to assess program performance are secured and used. How they will do so would be a subject for State discussion with its consultative group.
Commenters were concerned about how often data will be updated by the Department. They stated that, due to teachers changing schools mid-year, data will be outdated and not helpful to the consumer. Several commenters suggested that a national database would need to be in place for accurate data collection so institutions would be able to track graduates across State boundaries. Two commenters noted that it will be difficult to follow graduates over several years and collect accurate data to address all of the areas relevant to a program's retention rate, and that therefore reported rates would reflect a great deal of missing data.
Another commenter suggested that the Department provide support for the development and implementation of data systems that will allow States to safely and securely share employment, placement, and retention data.
In order to decrease the costs associated with calculating teacher placement and teacher retention rates and to better focus the data collection, our proposed and final definitions of teacher placement rate and teacher retention rate in § 612.2 permit States to exclude certain categories of novice teachers from their calculations for their teacher preparation programs, provided that each State uses a consistent approach to assess and report on all of the teacher preparation programs in the State. As we have already noted, these categories include teachers who teach in other States, teach in private schools, are not retained specifically and directly due to budget cuts, or join the military or enroll in graduate school. While we encourage States to work to capture these data to make the placement and retention rates for each program as robust as possible, we understand that collecting these data may not always be feasible or cost-effective.
To address confidentiality concerns, § 612.4(b)(5) expressly exempts reporting of data where doing so would violate Federal or State privacy laws or regulations.
The regulations do not require States to submit documentation with the SRCs that supports their data collections; they must submit only the ultimate calculation for each program's indicator (and its weighting). However, States may not omit program graduates (or participants in alternative route programs if a State chooses to look at participants rather than program graduates) from any of the calculations of employment or survey outcomes indicators without being able to verify that these individuals are in the groups that the regulations permit States to omit.
Some commenters recommended that the Department maintain a national database, while others seemed to think that we plan to maintain such a database. States must submit their SRCs to the Department annually, and the Department intends to make these reports and the data they include, like SRCs that States annually submitted in prior years, publicly available. The Department has no other plans for activities relevant to a national database.
Commenters were concerned about difficulties in following graduates for the three-year period proposed in the NPRM. As discussed in response to comment on the “teacher retention rate” definition in § 612.2, we have modified the definition of “teacher retention rate” so that States will be reporting on the first three years a teacher is in the classroom rather than three out of the first five years. We believe this change addresses the commenters' concerns.
As we interpret the comment, one commenter suggested we provide support for more robust data systems so that States have access to the employment data of teachers who move to other States. We have technical assistance resources dedicated to helping States collect and use longitudinal data, including the Statewide Longitudinal Data System's Education Data Technical Assistance Program and the Privacy Technical Assistance Center, which focuses on the privacy and security of student data. We will look into whether these resources may be able to help address this matter.
As an alternative, commenters suggested that the Department alter the definition of “new teacher” so that both traditional and alternative route teacher candidates start on equal ground. For example, the definition might include “after all coursework is completed,” “at the point a teacher is placed in the classroom,” or “at the moment a teacher becomes a teacher of record.” Commenters recommended that teacher retention rate should be more in line with CAEP standards, which do not differentiate accountability for alternative and traditional route teacher preparation programs.
Many commenters were concerned about the ability of States to weight employment outcomes differently for alternative and traditional route programs, thus creating unfair comparisons among States or programs in different States while providing the illusion of fair comparisons by using the same metrics. One commenter was concerned about a teacher preparation program's ability to place candidates in fields where a degree in a specific discipline is needed, as those jobs will go to those with the discipline degree and not to a teacher preparation program degree, thus giving teachers from alternative route programs an advantage. Others stated that demographics may impact whether a student enrolls in a traditional or an alternative route program, so comparing the two types of programs in any way is not appropriate.
For reasons discussed in the
Recognizing both that (a) the differences in the characteristics of traditional and alternative route programs may create differences in teacher placement rates in high-need schools, and that (b) our removal of the requirement to include teacher placement rate for alternative certification programs creates a different number of required indicators for employment outcomes between the two types of programs, we have given States flexibility in how they apply employment outcomes in assessing each type of program.
We believe States are best suited to analyze their traditional and alternative route programs and determine how best to apply employment outcomes to assess the overall performance of these programs. As such, to further promote transparency and fair treatment, we have revised section V of the SRC to require each State to describe its rationale for treating employment outcomes differently, provided it has not chosen to add a measure of placement rate for alternative route programs and does in fact apply different bases for accountability.
We also believe that, as we had proposed, States should apply equivalent standards of accountability in how they treat employment outcomes for traditional programs and alternative route programs, and suggest a few approaches States might consider for achieving such equivalency.
For example, a State might devise a system with five areas in which a teacher preparation program must have satisfactory outcomes in order to be considered not low-performing or at-risk of being low-performing. For the employment outcomes measure (and leaving aside the need for employment outcomes for high-need schools), a State might determine that traditional route programs must have a teacher placement rate of at least 80 percent and a second-year teacher retention rate of at least 70 percent to be considered as having satisfactory employment outcomes. The State may, in consultation with stakeholders, determine that a second-year retention rate of 85 percent for alternative certification programs results in an equivalent level of accountability for those programs, given that almost all participants in such programs in the State are placed and retained for some period of time during their program.
As another example, a State might establish a numerical scale in which employment outcomes account for 20 percent of the overall rating of every teacher preparation program in the State. The State might then determine that teacher placement (overall and in high-need schools) and teacher retention (overall and in high-need schools) outcomes are weighted equally, say at 10 percent each, for all traditional route programs, but weight the placement rate in high-need schools at 10 percent and the retention rate (overall and in high-need schools) at 10 percent for alternative route programs.
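The following sketch is purely illustrative of the weighting example above; the weights mirror the hypothetical 10 percent figures in the text, the rate values are invented, and any actual scheme would be designed by a State in consultation with its stakeholders under § 612.4(c).

```python
# Illustrative sketch of the weighting example above. The weights and rates
# are hypothetical; a real scheme would be set by a State with its stakeholders.

TRADITIONAL_WEIGHTS = {
    "placement": 0.10,            # overall and high-need placement, combined
    "retention": 0.10,            # overall and high-need retention, combined
}
ALTERNATIVE_WEIGHTS = {
    "placement_high_need": 0.10,  # placement in high-need schools only
    "retention": 0.10,            # overall and high-need retention, combined
}

def employment_score(rates, weights):
    """Weighted contribution of employment outcomes (20% of the overall rating)."""
    return sum(rates[key] * weight for key, weight in weights.items())

traditional_rates = {"placement": 0.82, "retention": 0.75}
alternative_rates = {"placement_high_need": 0.64, "retention": 0.88}

print(employment_score(traditional_rates, TRADITIONAL_WEIGHTS))   # 0.157
print(employment_score(alternative_rates, ALTERNATIVE_WEIGHTS))   # 0.152
```

Under these assumed figures, the two program types contribute comparable employment-outcome scores even though the indicators feeding those scores differ.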
We also recognize that some alternative route programs are specifically designed to recruit high-quality participants who may be committed to teach only for a few years. Many also recruit participants who in college had academic majors in fields similar to what they will teach. Since a significant aspect of our indicators of academic content knowledge and teaching skills focuses on the success of novice teachers regardless of the nature of their teacher preparation program, we do not believe we should establish a one-size-fits-all rule here. Rather, we think that States are in a better position to determine how the employment outcomes should best be used to help assess the performance of alternative route and traditional route programs.
We agree that use of multiple measures of program performance is important. We reiterate that the regulations require that, in reporting the performance of all programs, both traditional and alternative route, States must use the four indicators of academic content knowledge and teaching skills the regulations identify in § 612.5(a), including employment outcomes—the teacher placement rate (excepting the requirement here for alternative route programs), teacher placement rate in high-need schools, teacher retention rate, and teacher retention rate in high-need schools—in addition to any indicators of academic content knowledge and teaching skills and other criteria they may establish on their own.
However, we do not know of any inherent differences between traditional route programs and alternative route programs that should require different treatment of the other required indicators—student learning outcomes, survey outcomes, and the basic characteristics of the program addressed in § 612.5(a)(4). Nor do we see any reason why any differences in the type of individuals that traditional route programs and alternative route programs enroll should mean that the program's student learning outcomes should be assessed differently.
Finally, while some commenters argued about the relative advantage of alternative route or traditional route programs in reporting on employment outcomes, we reiterate that neither the regulations nor the SRCs pit programs against each other. Each State determines what teacher preparation programs are and are not low-performing or at-risk of being low-performing (as well as in any other category of performance it may establish). Each State then reports the data that reflect the indicators and criteria used to make this determination, and identifies those programs that are low-performing or at-risk of being low-performing. Of course, any differences in how employment outcomes are applied to traditional route and alternative route programs would need to result in equivalent levels of accountability and reporting (see § 612.5(a)(2)(B)). But the issue for each State is identifying each program's level of performance relative to the level of expectations the State established—not relative to levels of performance or results for indicators or criteria that apply to other programs.
We have also added a new § 612.5(a)(2)(v) to provide that a State is not required to calculate a teacher placement rate under paragraph (a)(2)(i)(A) of that section for alternative route to certification programs.
Recognizing these types of issues, the Department has determined that it is appropriate to create an alternative method for States to calculate employment outcomes for teacher preparation programs provided through distance education. Specifically, we have revised the definition of teacher placement rate to allow States, in calculating teacher placement rate for teacher preparation programs provided through distance education, to use the total number of recent graduates who have obtained initial certification or licensure in the State during the three preceding title II reporting years as the denominator in their calculation instead of the total number of recent graduates. Additionally, we believe it is appropriate to give States greater flexibility in assessing these outcomes, and have added a new § 612.5(a)(2)(iv) which allows States to assess teacher placement rates differently for teacher preparation programs provided through distance education provided that the differences in assessment are transparent and result in similar levels of accountability for all teacher preparation programs.
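As a purely illustrative sketch of the two denominators described above (the counts are hypothetical and not drawn from any actual program), the alternative denominator for distance-education programs can substantially change the resulting rate:

```python
# Hypothetical sketch of the two placement-rate denominators described above
# for a distance-education program; all counts are illustrative only.

def placement_rate(numerator, denominator):
    return numerator / denominator if denominator else 0.0

recent_graduates = 400               # all recent graduates, wherever located
certified_in_state_3yrs = 90         # graduates who obtained initial certification
                                     # in this State over the three preceding
                                     # title II reporting years
employed_in_state = 60               # assumed numerator: recent graduates employed
                                     # as teachers of record in this State

standard = placement_rate(employed_in_state, recent_graduates)
distance_ed_option = placement_rate(employed_in_state, certified_in_state_3yrs)

print(f"Standard denominator:      {standard:.0%}")            # 15%
print(f"Distance-education option: {distance_ed_option:.0%}")  # 67%
```

The sketch simply shows how restricting the denominator to graduates certified in the reporting State yields a rate that better reflects the pool a State can reasonably track.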
In addition, commenters recommended allowing institutions themselves to conduct and report annual survey data for teacher graduates and employers, noting that a number of institutions currently conduct well-honed, rigorous surveys of teacher preparation program graduates and their employers. Commenters were concerned with the addition of a uniform State-level survey for assessing teacher preparation programs, stating that it is not possible to obtain high individual response rates for two surveys addressing the same area. Commenters contended that, as a result, the extensive longitudinal survey databases established by some of the best teacher education programs in the Nation will be at-risk, resulting in the potential loss of the baseline data, the annual data, and the continuous improvement systems associated with these surveys despite years of investment in them and substantial demonstrated benefits.
Some commenters noted that it is hard to predict how reliable the teacher and employer surveys required by the regulations would be as an indicator of teacher preparation program quality, since the proposed regulations do not specify how these surveys would be developed or whether they would be the same across the State or States. In addition, the commenters noted that it is hard to predict how reliable the surveys may be in capturing teacher and employer perceptions of how adequately prepared teachers are since these surveys do not exist in most places and would have to be created. Commenters also stated that survey data will need to be standardized for all of a State's institutions, which will likely result in a significant cost to States.
Some commenters stated that, in lieu of surveys, States should be allowed to create preparation program-school system partnerships that provide for joint design and administration of the preparation program. They claimed that when local school systems and preparation programs jointly design and oversee the preparation program, surveys are unnecessary because the partnership creates a single preparation program entity that is responsible for the quality of preparation and the satisfaction of district and school leaders.
We share the belief of these organizations that a novice teacher's perception, and that of his or her employer, of the teacher's readiness and capability during the first year of teaching are key indicators of that individual's academic knowledge and teaching skills, as well as of whether his or her preparation program is training teachers effectively.
Regarding commenters' concerns about the validity and reliability of using survey results to help assess program performance, we first reference our general discussion of the issue in response to public comment on
Beyond this, it plainly is important that States develop procedures to enable teachers' and employers' perceptions to be appropriately used and have the desired impacts, and at the same time to enable States to use survey results in ways that treat all programs fairly. To do so, we strongly encourage States to standardize their use of surveys so that, for novice teachers who are similarly situated, they seek common information from them and their employers. We are confident that, in consultation with key stakeholders as provided for in § 612.4(c)(1), States will be able to develop a standardized, unbiased, and reliable set of survey questions, or ensure that IHE surveys meet the same standard. This goal would be very difficult to achieve, however, if States relied on existing surveys (unless modified appropriately) whose questions vary in content and thus solicit different information and responses. Of course, it is likely that many strong surveys already exist and are in use, and we encourage States to consider using such an existing survey so long as it comports with § 612.5(a)(3). Where a State finds an existing survey of novice teachers and their employers to be adequate, using it will avoid the cost and time of preparing another and, to the extent possible, prevent the need for teachers and employers to complete more than one survey, which commenters reasonably would like to avoid. Concerns about the cost and burden of implementing teacher and employer surveys are discussed further with the next set of comments on this section.
We note that States have the discretion to determine how they will publicly post the results of surveys and how they will aggregate the results associated with teachers from each program for use as an indicator of that program's performance. We encourage States to report survey results disaggregated by question (as is done, for example, by Ohio
Like those who commented, we believe that partnerships between teacher preparation programs and local school systems have great value in improving the transition of individuals whom teacher preparation programs train to the classroom and a novice teacher's overall effectiveness. However, these partnerships cannot replace survey results as an indicator of the program's performance.
Commenters stated that, if used as expected for comparability purposes, the survey would likely need to be designed by and conducted through a third-party agency with professional credentials in survey design and survey administration. They stated that sampling errors and various forms of bias can easily skew survey results and the survey would need to be managed by a professional third-party group, which would likely be a significant cost to States.
One commenter recommended that a national training and technical assistance center be established to build data capacity, consistency, and quality among States and educator preparation providers to support scalable continuous improvement and program quality in teacher preparation. In support of this recommendation, the commenter, an accreditor of education preparation providers, stated that, based on its analysis of its first annual collection of outcome data from education preparation providers, and its follow-up survey of education preparation providers, the availability of survey outcomes data differs by survey type. The commenter noted that while 714 teacher preparation program providers reported that they have access to completer survey data, 250 providers reported that they did not have access. In addition, the commenter noted that teacher preparation program providers indicated that there were many challenges in reporting employment status, including limitations of State data systems and the fact that some programs' completers take positions across the nation or internationally.
We note that, in ensuring that the required surveys are reasonable and appropriate, States have some control over the design and length of the surveys and, therefore, over the cost and burden associated with them.
In considering the comment, we realized that while we estimated costs of reporting all indicators of academic content knowledge and teaching skills, including survey outcomes, on an annual basis, the regulations did not adequately clarify the need to collect and report data related to each indicator annually. Therefore, we have revised § 612.4(b)(2)(i) to require that data for each indicator be provided annually for the most recent title II reporting year.
Further discussion regarding the cost and burden of implementing teacher and employer surveys can be found in the
The regulations do not prescribe any particular method for obtaining the completed surveys, and States may certainly work with their teacher preparation programs and teacher preparation entities to implement effective ways to obtain survey results. Beyond this, we expect that States will seek and employ the assistance that they need to develop, implement, and manage teacher and employer surveys as they see fit. We expect that States will ensure the validity and reliability of survey outcomes—including how to address responder bias—when they establish their procedures for assessing and reporting the performance of each teacher preparation program with a representative group of stakeholders, as is required under § 612.4(c)(1)(i). The regulations do not specify the process States must use to develop, implement, or manage their employer surveys, so whether they choose to use third-party entities to help them do so is up to them.
Finally, we believe it is important for the Department to work with States and teacher preparation programs across the nation to improve those programs, and we look forward to engaging in continuing dialogue about how this can be done and what the appropriate role of the Department should be. However, the commenters' request for a national training and technical assistance center to support scalable continuous improvement and to improve program quality is outside the scope of this regulation—which is focused on the States' use of indicators of academic content knowledge and teaching skills in their processes of identifying those programs that are low-performing, or at-risk of being low-performing, and other matters related to reporting under the title II reporting system.
Commenters also felt that two of our suggestions in the NPRM to ensure completion of surveys—that States consider using commercially available survey software or that teachers be required to complete a survey before they can access their class rosters—raise tremendous questions about the security of student data and the sharing of identifying information with commercial entities.
While it may be true that responder bias could impact any survey data, we expect that the variety and number of responses from novice teachers employed at different schools and within different school districts will ensure that such bias will not substantially affect overall survey results.
There is no reason student data should ever be captured in any survey results, even if commercially available software is used or teachers are required to complete a survey before they can access and verify their class rosters. Commenters did not identify any particular concerns related to State or Federal privacy laws, and we do not understand what they might be. That being said, we fully expect States will design their survey procedures in keeping with requirements of any applicable privacy laws.
One commenter shared that, since 2007, the Illinois Association of Deans of Public Colleges of Education has conducted graduate surveys of new teachers from the twelve Illinois public universities, by mailing surveys to new teachers and their employers. The response rate for new teachers has been extremely low (44.2 percent for the 2012 survey and 22.6 percent for the 2013 survey). The supervisor response has been higher, but still insufficient, according to the commenter, for the purpose of rating programs (65.3 percent for the 2012 survey and 40.5 percent for the 2013 survey). In addition, the commenter stated that some data from these surveys indicate differences in the responses provided by new teachers and their supervisors. The commenter felt that the low response rate is compounded when trying to find matched pairs of teachers and supervisors. Using results from an institution's new teacher survey data, the commenter was only able to identify 29 out of 104 possible matched pairs in 2012 and 11 out of 106 possible matched pairs in 2013.
One commenter from an IHE stated that the institution's return rate on graduate surveys over the past 24 years has been 10 to 24 percent, which the commenter stated is in line with national response rates. While the institution's last survey of 50 school principals had a 50 percent return rate, the commenter noted that her institution surveys only those school divisions that it knows regularly hire its graduates because it does not have a source from which it can obtain actual employment information for all graduates. According to the commenter, a statewide process that better ensures that all school administrators provide feedback would be very helpful, but could also be very burdensome for the schools.
Another commenter noted that the response rate from the institution's graduates increased significantly when the questionnaire went out via email, rather than through the United States Postal Service; however, the response rate from school district administrators remained dismal, no matter what format was used—mail, email, Facebook, Instagram, SurveyMonkey, etc. One commenter added that defaulting to having teachers complete surveys during the school day, which would be yet another imposition on content time in the classroom, was not a good alternative for addressing low response rates. Commenters saw an important Federal role in accurately tracking program graduates across State boundaries.
We believe that States can increase their response rate by incorporating the surveys into other structures, for example, having LEAs disseminate the survey at various points throughout teachers' induction period. Surveys may also be made part of required end-of-year closeout activities for teachers and their supervisors. As the regulations require States to survey only those teachers who are teaching in public schools and the public school employees who employ them (see the discussion of the definition of a novice teacher under § 612.2(d)), we believe that approaches such as these will enable States to achieve reasonably high response rates and, thus, valid survey results.
Finally, before the Department would consider working to develop a system, like one the commenter suggested, for tracking program graduates across State boundaries, we would want to consult with States, IHEs and other stakeholders.
Commenters raised an additional concern that the Department is seeking to implicitly mandate national accreditation, which would result in increased costs; and that the proposed regulations set a disturbing precedent by effectively mandating specialized accreditation as a requirement for demonstrating program quality. Some commenters were concerned that with CAEP as the only national accreditor for teacher preparation, variety of and access to national accreditation would be limited and controlled.
Other commenters expressed concern that our proposal to offer each State the option of presenting an assurance that the program is accredited by a specialized accrediting agency would, at best, make the specialized accreditor an agent of the Federal government, and at worst, effectively mandate specialized accreditation by CAEP. The commenters argued instead that professional accreditation should remain a voluntary, independent process based on evolving standards of the profession.
Some commenters asked that the requirement for State reporting on accreditation or program characteristics in § 612.5(a)(4)(i) and (ii) be removed because these are duplicative of existing State efforts with no clear benefit to understanding whether a teacher preparation program can effectively prepare candidates for classroom success, and because the proposed regulations duplicate work being done for State and national accreditation.
Some commenters argued that stronger standards are essential to improving teacher preparation programs, and providing some gradation of ratings of how well preparation programs are doing would provide useful information to the prospective candidates, hiring districts, and the teacher preparation programs the IRCs and SRCs are intended to inform. They noted that as long as CAEP continued with these accreditation levels, rather than lumping them all together under a high-level assurance, indicators of these levels should be reflected in the rating system. They also stated that where States do not require accreditation, States should attempt to assess the level at which programs are meeting the additional criteria.
Some commenters argued that accreditation alone is sufficient to hold teacher preparation programs accountable. Other commenters stated their agreement that active participation in professional accreditation should be recognized as an indicator of program quality. One commenter supported the alignment between the proposed regulations and CAEP's annual outcomes-based reporting measures, but was concerned that the regulations as proposed would spawn 50 separate State reporting systems, data definitions, and processes for quality assurance. The commenter supported incentivizing accreditation and holding all teacher preparation programs to the same standards and reporting requirements, and stated that CAEP's new accreditation process would achieve the goals of the proposed rules on a national level, while removing burden from the States. The commenter expressed concern about the requirement that the Secretary recognize the specialized accrediting agency, and the statement in the preamble of the NPRM that alternative route programs are often not eligible for specialized accreditation.
The commenter also indicated that the current input- and compliance-based system requirements within the Department's recognition process for accreditors run counter to the overarching goal of providing meaningful data and feedback loops for continuous improvement. The commenter noted that CAEP was launched to bring all teacher preparation programs, whether alternative, higher education based, or online based, into the fold of accreditation. The commenter recommended that specialized accrediting agencies recognized by the Council for Higher Education Accreditation (CHEA) should be allowed to serve as a State indicator for program quality.
Commenters also noted that no definition of specialized accreditation was proposed, and requested that we include a definition of this term. One commenter recommended that a definition of specialized accreditation include the criteria that the Secretary would use to recognize an agency for the accreditation of professional teacher preparation programs, and that one of the criteria for a specialized agency should be the inclusion of alternative certification programs as eligible professional teacher preparation programs.
Also, upon review of the comments, we realized that imprecise wording in the proposed regulations likely led to misunderstanding of our intent regarding program-level accreditation. Our intent was simple: to allow a State that is able to certify that the entity offering a teacher preparation program has been accredited by a teacher preparation program accreditor recognized by the Secretary to rely on that accreditation to demonstrate that the program produces teacher candidates with the basic qualifications identified in § 612.5(a)(4)(ii), rather than having to report separately on those qualifications. The proposed regulations would not have required separate accreditation of each individual program offered by an entity, but we have revised § 612.5(a)(4)(i) to better reflect this intent. In response to the concern about whether an entity that administers an alternative route program can receive such accreditation, we note that such an entity can apply for CAEP accreditation, as one of the commenters observed.
As summarized above, commenters presented opposing views of the role in the regulations of national accreditation through an accreditor recognized by the Secretary: Opinions that the inclusion of national accreditation in the regulations represented an unauthorized mandate for accreditation on the one hand, and an implication that accreditation alone was sufficient, thus making other options or further indicators unnecessary, on the other. Similarly, some commenters argued that the regulations require too much standardization across States (through either accreditation or a consistent set of broad indicators), while others argued that the regulations either allow too much variability among States (leading to lack of comparability) or encourage the duplicative effort of creating over 50 separate systems.
In the final regulations we seek to balance these concerns. States are to assess whether a program either has Federally recognized accreditation (§ 612.5(a)(4)(i)) or produces teacher candidates with certain characteristics (§ 612.5(a)(4)(ii)). Allowing States to report and assess whether their teacher preparation programs have specialized accreditation or produce teacher candidates with specific characteristics is not a mandate that a program fulfill either option, and it may eliminate or reduce duplication of effort by the State. If a State has an existing process to assess the program characteristics in § 612.5(a)(4)(ii), it can use that process rather than report on whether a program has specialized accreditation; conversely, if a State would like simply to use accreditation by an agency that evaluates the factors in § 612.5(a)(4)(ii) (whether federally recognized or not) to fulfill this requirement, it may choose to do so. We believe these factors do relate to preparation of effective teachers, which is reflected in standards and expectations developed by the field, including the CAEP standards. And since accreditation remains a voluntary process, we cannot rely on it alone for transparency and accountability across all programs.
We now address the commenters' statement that there may be no federally recognized accreditor for educator preparation entities. If there is none, and a State would like to use accreditation by an agency whose standards align with the elements listed in § 612.5(a)(4)(ii), the State may use that accreditation as part of its own process for assessing those program characteristics.
As we summarized above, some commenters requested that we include a definition of specialized accreditation, and that it include criteria the Secretary would use to recognize an agency for accreditation of teacher preparation programs, and that one of the criteria should be inclusion of alternative certification programs as eligible programs. While we appreciate these comments, we believe they are outside the scope of the proposed and final regulations.
Finally, because teacher preparation program oversight authority lies with the States, we do not intend for the regulations to require a single approach—via accreditation or otherwise—for all States to use in assessing the characteristics of teacher preparation programs. We do, however, encourage States to work together in designing data collection processes, in order to reduce or share costs, learn from one another, and allow greater comparability across States.
In terms of the use of other specific indicators (
As one commenter noted, the current statutory recognition process for accreditors is heavily input based, while the emphasis of the regulations is on outcomes. Any significant reorientation of the accreditor recognition process would require statutory change. Nonetheless, given the rigor and general acceptance of the Federal recognition process, we believe that only accreditation by a Federally recognized accreditor should be specifically assessed under § 612.5(a)(4)(i), rather than accreditation by agencies recognized by outside organizations such as CHEA. For programs not accredited by a federally recognized accreditor, States determine whether or to what degree a program meets the characteristics in the alternative provision, § 612.5(a)(4)(ii).
Because the regulations provide for the use of State procedures as an alternative to accreditation by a specialized accreditor recognized by the Secretary, nothing in § 612.5(a)(4) would mandate program accreditation by CAEP or any other entity. Nor would the regulations otherwise interfere with what commenters argue should be a voluntary, independent process based on evolving standards of the profession. Indeed, this provision does not require any program accreditation at all.
Other commenters expressed concern about the consequences of creating rigorous teacher candidate entry and exit qualifications. Some commenters expressed concerns that this requirement does not take into account the unique missions of the institutions and will have a disproportionate and negative impact on MSIs, which may see decreases in eligible teacher preparation program candidates by denying entry to candidates who do not meet entry requirements established by this provision. These commenters were concerned that rigorous entrance requirements could decrease diversity in the teaching profession.
Commenters also expressed general opposition to requiring rigorous entry and exit qualifications because they felt that a general assurance of entry and exit requirements did little to provide transparency or to differentiate programs by quality. They argued that the provisions were therefore unneeded and only added to the confusion and bureaucracy of these requirements.
Other commenters noted that novice teachers struggle when they lack clinical experience similar to the teaching environment in which they begin their careers, limiting their ability to meet the needs of their students in their early years in the classroom. They suggested that the regulations include “teaching placement,” for example, or “produces teacher candidates with content and pedagogical knowledge and quality clinical preparation relevant to their teaching placement, who have met rigorous teacher candidate entry and exit qualifications pursuant” to increase the skills and knowledge of teacher preparation program completers who are placed in the classroom as teachers.
Rather, as discussed in our response to public comment in the section on Specialized Accreditation, States have the authority to use their own process to determine whether a program has these characteristics. We feel that this authority provides ample flexibility for State discretion in how to treat this indicator when assessing overall program performance, and in determining what information about each program could help it improve its program design. Moreover, the basic elements identified in § 612.5(a)(4)(ii) reflect recommendations of the non-Federal negotiators, and we agree with them that the presence or absence of these elements should impact the overall level of a teacher preparation program's performance.
The earlier discussion of “rigorous entry and exit requirements” in our
Ensuring that the program produces teacher candidates who have met rigorous exit qualifications alone will not provide necessary transparency or differentiation of program quality. However, having States report data on the full set of indicators for each program will provide significant and useful information, and explain the basis for a State's determination that a particular program is or is not low-performing or at-risk of being low-performing.
We agree with the importance of high quality clinical experience. However, it is unrealistic to require programs to ensure that each candidate's clinical experience is directly relevant to his or her future, as yet undetermined, teaching placement.
In reviewing commenters' suggestions, we realized that the term “predictive” in the phrase “predictive of a teacher's effect on student performance” is inaccurate. The additional measures States may use are indicators of their academic content knowledge and teaching skill, rather than predictors of teacher performance.
We therefore are removing the word “predictive” from the regulations. If a State uses other indicators of academic content knowledge and teaching skills, it must, as we had proposed, apply the same indicators for all of its teacher preparation programs to ensure consistent evaluation of preparation programs within the State.
Some commenters suggested additional examples of technical assistance to include in the regulations. Commenters believed that technical assistance could include: Training teachers to serve as clinical faculty or cooperating teachers using the National Board for Professional Teaching Standards; integrating models of accomplished practice into the preparation program curriculum; and assisting preparation programs to provide richer clinical experiences. Commenters also suggested including first-year teacher mentoring programs and peer networks as potential ways in which a State could provide technical assistance to low-performing programs. One commenter noted that, in a recent survey of educators, teachers cite mentor programs in their first year of teaching (90 percent) and peer networks (84 percent) as the top ways to improve teacher training programs.
Commenters recommended that States have the discretion to determine the scope of the technical assistance, rather than requiring that technical assistance focus only on low-performing programs. They argued this would allow States to distribute support as appropriate in an individual context and minimize the risk of missing essential opportunities to identify best practices from high-performing programs and to support those programs that are best positioned to become increasingly productive and effective providers. Commenters suggested that entities that administer teacher preparation programs be responsible for seeking and resourcing improvement for their low-performing programs.
Some commenters suggested that the Federal government provide financial assistance to States to facilitate the provision of technical assistance to low-performing programs. Commenters suggested that the Department make competitive grants available to States to distribute to low-performing programs in support of program improvement. Commenters also suggested that the Federal government offer meaningful incentives to help States design, test, and share approaches to strengthening weak programs and support research to assess effective interventions, as it would be difficult for States to offer the required technical assistance because State agencies have little experience and few staff in this area. In addition, commenters recommended that a national training and technical assistance center be established to build data capacity, consistency, and quality among States and teacher preparation programs to support scalable continuous improvement and program quality in educator preparation.
Commenters recommended that, in addition to a description of the procedures used to assist low-performing programs as required by section 207 of the HEA, States should be required to describe in the SRC the technical assistance they provided to low-performing teacher preparation programs in the previous year. Commenters suggested that this would shift the information reported from descriptions of processes to more detailed information about actual technical assistance efforts, which could inform technical assistance efforts in other States.
Commenters suggested adding a timeframe for States to provide the technical assistance to low-performing programs. Commenters suggested a maximum of three months from the time that the program is identified as low-performing because, while waiting for the assistance, and in the early stages of its implementation, the program will continue to produce teacher candidates of lower quality.
Commenters suggested that States should be required to offer the assistance of a team of well-recognized scholars in teacher education and in the education of diverse students in P-12 schools to assist in the assessment and redesign of programs that are rated below effective. Some commenters noted that States with publicly supported universities designated as Historically Black Colleges and Universities, Hispanic Serving Institutions, and tribal institutions are required to file with the Secretary a supplemental report of equity in funding and other support to these institutions. Private and publicly supported institutions in these categories often lack the resources to attract the most recognized scholars in the field.
We decline to adopt the recommendations of commenters who suggested that the regulations require States to provide specific types of technical assistance because we seek to provide States with flexibility to design technical assistance that is appropriate for the circumstances of each low-performing program. States have the discretion to implement technical assistance in a variety of ways. The regulations outline the minimum requirements, and we encourage States that wish to do more, such as providing assistance to at-risk or other programs, to do so. Furthermore, nothing in the regulations prohibits States from providing technical assistance to at-risk programs in addition to low-performing programs. Similarly, while we encourage States to provide timely assistance to low-performing programs, we decline to prescribe a certain timeframe so that States have the flexibility to meet these requirements according to their capacity. In the SRC, States are required to provide a description of the process used to determine the kind of technical assistance to provide to low-performing programs and how such assistance is administered.
The Department appreciates comments requesting Federal guidance and resources to support high-quality technical assistance. We agree that such activities could be beneficial. However, the commenters' suggestions that the Department provide financial assistance to States to facilitate their provision of technical assistance and to teacher preparation programs to support their improvement, as well as the request for national technical assistance centers to support scalable continuous improvement and improve program quality, are outside the scope of these regulations, which are focused on reporting. The Department will consider ways to help States implement this and other provisions of the regulations, including by facilitating the sharing of best practices across States.
A number of commenters expressed their concerns about the impacts of losing financial aid eligibility, and stated that decreasing financial aid for prospective teachers would negatively impact the number of teachers joining the profession. As costs for higher education continue to increase and less financial aid is available, prospective teacher preparation program students may decide not to enroll in a teacher preparation program and instead pursue other fields that may offer other financial incentives to offset the costs associated with college. The commenters believed this would result in fewer teachers entering the field because fewer students would begin and complete teacher preparation programs, thus increasing teacher shortages. Other commenters were concerned about how performance results of teacher preparation programs may impact job outcomes for students who attended those programs in the past, as their ability to obtain jobs may be affected by the rating of a program they have not attended recently. The commenters noted that being rated as low-performing would likely reduce the ability of a program to recruit, enroll, and retain students, which would translate into fewer teachers being available for teaching positions. Others stated that there will be a decrease in the number of students who seek certification in a high-need subject area due to the link between TEACH Grant eligibility and teacher preparation program metrics. They believe this will increase teacher shortages in areas that have a shortage of qualified teachers. Additional commenters believed that reporting results attributable to an individual teacher would raise privacy concerns and further drive potential teachers away from the field, for fear that their performance would be made public.
Some commenters were specifically concerned about the requirement that low-performing programs be required to provide transition support and remedial services to students enrolled at the time of termination of State support or approval. The commenters noted that low-performing programs are unlikely to have the resources or capacity to provide transitional support to students.
We disagree with the commenters that the loss of TEACH Grant funds will have a negative impact on the affordability of, and access to, teacher preparation programs. A program that loses its eligibility would be required to provide transitional support, if necessary, to students enrolled at the institution at the time of termination of financial support or withdrawal of approval to assist students in finding another teacher preparation program that is eligible to enroll students receiving title IV, HEA funds. By providing transition services to students, individuals who receive title IV, HEA funds would be able to find another program in which to use their financial aid and continue in a teacher preparation program in a manner that still addresses college affordability. We also disagree with the commenters who stated that low-performing programs are unlikely to have the resources to provide transitional support to students. We believe that an IHE with a low-performing teacher preparation program will be offering other programs that may not be considered low-performing. As such, an IHE will have resources to provide transition services to students affected by the teacher preparation program being labeled as low-performing, even if the money does not come directly from the teacher preparation program.
While teacher preparation program labels may negatively impact job market outcomes because low-performing teacher preparation programs' ability to recruit and enroll future cohorts of students would be negatively impacted by the rating, we believe these labels better serve the interests of students who deserve to know the quality of the program they may enroll in. As we have explained, § 612.7 applies only to programs that lose State approval or financial support as a result of being identified by the State as low-performing. It does not apply to every program that is identified as low-performing. We believe that, while providing information about the quality of a program to a prospective student may impact the student's enrollment decision, a student who wishes to become a teacher will find and enroll in a program that has not lost State approval or State financial support. We believe that providing quality consumer information to prospective students will allow them to make informed enrollment decisions. Students who are aware that a teacher preparation program is not approved by the State may reasonably choose not to enter that program. Individuals who wish to enter the teaching field will continue to find programs that prepare them for the workforce, while avoiding less effective programs. By doing so, we believe, the overall impact to the number of individuals entering the field will be minimal. Section 612.4(b) implements protections and allowances for teacher preparation programs with a program size of fewer than 25 students, which would help to protect against privacy violations, but does not require sharing information on individual teacher effectiveness with the general public.
In addition, we believe that, as section 207(b) of the HEA requires, removing title IV, HEA program eligibility from low-performing teacher preparation programs that lose State approval or financial support as a result of the State assessment will encourage individuals to enroll in more successful teacher preparation programs. This will keep more prospective teachers enrolled and will mitigate any negative impact on teacher employment rates.
While these regulations specify that the teacher placement rate and the teacher retention rate be calculated separately for high-need schools, no requirements have been created to track employment outcomes based on high-need subject areas. We believe that an emphasis on high-need schools will help focus on improving student success across the board for students in these schools. In addition, the requirement to report performance at the individual teacher preparation program level will likely promote reporting by high-need subjects as well.
Section 612.7(a) codifies statutory requirements related to teacher
The regulations do not dictate how an institution must assist a student at the time of termination of financial support or withdrawal of State approval. Transition services may include helping a student transfer to another program at the same institution that still receives State funding and State approval, or to a program at another institution. The transition services offered by the institution should be in the best interest of the student and assist the student in meeting his or her educational and occupational goals. However, the Department believes that teacher preparation programs may already be offering these services through their staff, and those services should not stop because of the consequences of withdrawal of State approval or financial support.
Additionally, we intend § 612.7(b) to focus exclusively on the title IV, HEA consequences to the teacher preparation program that loses State approval or financial support and on the students enrolled in those programs. This subsection describes the procedure that a program must undertake to ensure that students are informed of the loss of State approval or financial support.
Another commenter noted that some programs might not ever regain authorization to prepare teachers if they must transfer students to other programs since there will not be any future student outcomes associated with the recent graduates of the low-performing programs.
We do not propose to tie an entire institution's eligibility for title IV, HEA funds to the performance of its teacher preparation program. Any loss of title IV, HEA funds based on these regulations would apply only to the institution's teacher preparation program and not to the entire institution. Therefore, an institution would be able to have both title IV eligible and non-title IV eligible programs. In addition, because reporting is done by program, an institution could have both eligible and ineligible teacher preparation programs, depending on the rating of each program. The remaining programs at the institution would still be eligible to receive title IV, HEA funds. We are concerned that our inclusion of proposed § 612.8(b)(2) may have led the commenter to believe that an entire institution would be prohibited from participating in the title IV programs as a result of a teacher preparation program's loss of approval or financial support based on low performance. To avoid such confusion, we have removed § 612.8(b)(2) from the final regulations. The institutional eligibility requirements in part 600 sufficiently describe the requirements for institutions to participate in the title IV, HEA programs.
We believe that providing transitional support to students enrolled at the institution at the time a State may terminate financial support or withdraw approval of a teacher preparation program will provide appropriate consumer protections to students. We disagree with the commenter who stated that it would be impossible for a program to improve its performance on the State assessment because no data on which the program could be assessed, such as student learning outcomes, would be available if the program were prohibited from enrolling additional title IV eligible students. Programs would not be prohibited from enrolling students to determine future student outcomes. Programs that have lost State approval or financial support would be limited only in their ability to enroll additional title IV eligible students, not in their ability to enroll students generally.
Furthermore, to ensure that the TEACH Grant program regulations are consistent with the changes made to part 612, we have revised the timelines that we proposed in the definition of the term high-quality teacher preparation program in part 686, which we now incorporate in the terms “high quality teacher preparation program not provided through distance education” and “high quality teacher preparation program provided through distance education.” We have also removed the phrase “or of higher quality” from “effective or of higher quality” to align the definition of “high-quality teacher preparation program not provided through distance education” with the definition of the term “effective teacher preparation program” in 34 CFR 612.1(d), which provides that an effective teacher preparation program is a program with a level of performance higher than that of a low-performing or at-risk teacher preparation program. The phrase “or of higher quality” was redundant and unnecessary.
The new definition is consistent with changes we made with respect to program-level reporting (including distance education), which are described in the section of the preamble related to § 612.4(a)(1)(i). We note that the new definition of the term “high quality teacher preparation program not provided through distance education” relates to the classification of the program under 34 CFR 612.4(b) made by the State where the program was located, as the proposed definition of the term “high-quality teacher preparation program” provided. This is in contrast to the definition of the term “high-quality teacher preparation program provided through distance education” discussed later in this document.
Also, the proposed definition provided that in the 2020-2021 award year, a program would be “high-quality” only if it was classified as an effective teacher preparation program in either or both of the April 2019 and April 2020 State Report Cards. We have determined that this provision is unnecessary and have deleted it. Now, because the first State Report Cards under the regulations will be submitted in October 2019, we have provided that, starting with the 2021-2022 award year, a program is high-quality if it has not been classified by the State as less than an effective teacher preparation program, based on 34 CFR 612.4(b), in two out of the previous three years. We note that in the NPRM, the definition of the term “high-quality teacher preparation program” contained an error. The proposed definition provided that a program would be considered high-quality if it were classified as effective or of higher quality for two out of three years. We intended the requirement to be that a program is high-quality if it has not received a rating lower than effective in two out of three years. This is a more reasonable standard, and it allows a program that has been rated as less than effective to improve its rating before becoming ineligible to award TEACH Grants.
Several commenters stated that it was unclear how the proposed regulations would take into account TEACH Grant eligibility for students enrolled in a teacher preparation program provided through distance education that does not lead to initial certification or if the program does not receive an evaluation by a State. Another commenter stated that the proposed regulations would effectively impose a requirement for distance education institutions to adopt a 50-State authorization compliance strategy to offer their distance education teacher licensure programs to students in all 50 States.
We disagree with the commenter that the determination of institutional eligibility to disburse TEACH Grants is meant to rest squarely with the Department, separate from determinations relating to teacher preparation program performance under title II of the HEA. The HEA provides that the Secretary determines which teacher preparation programs are high-quality, and the Secretary has reasonably decided to rely, in part, on the classification of teacher preparation program performance by States under title II of the HEA. Further, as the performance rating of teacher preparation programs not provided through distance education could also be subject to unrepresentative samples (for example, programs located near a State border), this concern is not limited to teacher preparation programs provided through distance education.
The performance standards related to title II are left to a State's discretion; thus, if States want to work together to create a single set of performance standards, there is no barrier to them doing so.
By way of clarification, the HEA and current regulations provide for TEACH Grant eligibility for students enrolled in post-baccalaureate and master's degree programs. The eligibility of programs that do not lead to initial certification is not based on a title II performance rating. In addition, if the teacher preparation program provided through distance education is not classified by a State for a given year due to small n-size, students would still be able to receive TEACH Grants if the program meets the exception from State reporting of teacher preparation program performance under 34 CFR 612.4(b)(3)(ii)(D) or 34 CFR 612.4(b)(5). We disagree that the regulations effectively impose a requirement for distance education institutions to adopt a 50-State authorization compliance strategy to offer their distance education teacher licensure programs to students in all 50 States. Rather, our regulations provide, in part, for reporting on teacher preparation programs provided through distance education under the title II reporting system with the resulting performance level classification of the program based on that reporting forming the basis for that program's eligibility to disburse TEACH Grants.
A commenter that offers only graduate degree programs and no programs that lead to initial certification noted that the HEA provides that current teachers may be eligible for TEACH Grants to obtain graduate degrees, and questioned how those students could obtain TEACH Grants under the proposed definitions of the terms “TEACH Grant-eligible institution” and “TEACH Grant-eligible program.”
Commenters also expressed concern that the proposed definition of the term TEACH Grant-eligible institution will result in an overall reduction in the number of institutions that are eligible to provide TEACH Grants, and that, because of this reduction, fewer students will pursue high-need fields such as special education, or teach in high-poverty, diverse, urban or rural communities where student test scores may be lower. One commenter stated that it is unfair to punish students by denying them access to financial aid when the States they live in and the institutions they attend may not be able to supply the data on which the teacher preparation programs are being assessed.
We agree that States will assess teacher preparation programs based on different criteria and measures. The HEA only requires a State to assess the quality of teacher preparation in that State and does not require comparability between States. That different States may use different standards is not necessarily unfair, as it is reasonable for States to consider specific conditions in their States when designing their annual assessments. We believe it is important that students receiving TEACH Grants be enrolled in programs that the State has identified as providing effective teacher preparation.
We agree that, in addition to ensuring that students wishing to achieve initial certification to become teachers are eligible for TEACH Grants, the HEA provides that a teacher or a retiree from another occupation with expertise in a field in which there is a shortage of teachers, such as mathematics, science, special education, English language acquisition, or another high-need field, or a teacher who is using high-quality alternative certification routes to become certified, is eligible to receive TEACH Grants. To ensure that these eligible students are able to obtain TEACH Grants, we have modified the definitions of the terms “TEACH Grant-eligible institution” and “TEACH Grant-eligible program.”
We also acknowledge the possibility that the overall number of institutions eligible to award TEACH Grants could decrease, because a TEACH Grant-eligible institution now must, in most cases, provide at least one high-quality teacher preparation program, while in the current regulation, an institution may be TEACH Grant-eligible if it offers a baccalaureate degree that, in combination with other training or experience, will prepare an individual to teach in a high-need field and has entered into an agreement with another institution to provide courses necessary for its students to begin a career in teaching. We note that so long as an otherwise eligible institution has one high-quality teacher preparation program not provided through distance education or one high-quality program provided through distance education, it continues to be a TEACH Grant-eligible institution. Furthermore, we do not believe that the regulations would necessarily create fewer incentives for students to pursue fields such as special education or to teach in high-poverty, diverse, or rural communities where test scores may be lower. TEACH Grants will continue to be available to students so long as their teacher preparation programs are classified as effective teacher preparation programs by the State (subject to the exceptions previously discussed), and we are not aware of any evidence that programs that prepare teachers who pursue fields such as special education or who teach in communities where test scores are lower will be classified as at-risk or low-performing teacher preparation programs on the basis of lower test scores. We believe that those students will choose to pursue those fields while enrolled in high-quality programs. The larger reason that the number of institutions providing TEACH Grants may decrease is that the final regulations generally narrow the definition of a TEACH Grant-eligible institution to those institutions that offer at least one high-quality teacher preparation program, whether or not provided through distance education, either at the baccalaureate or master's degree level (and that also meets additional requirements) or as a post-baccalaureate program of study.
We do not agree that student learning outcomes for any subgroup, including for teachers who teach students with disabilities, would necessarily be lower if properly measured. Further, student learning outcomes are only one of multiple measures used to determine a rating and, thereby, TEACH Grant eligibility. Thus, a single measure, whether student learning outcomes or another, would not necessarily lead to a teacher preparation program being determined by the State to be low-performing or at-risk of being low-performing and correspondingly being ineligible for TEACH Grants. As discussed elsewhere in this document, States determine the ways to measure student learning outcomes that give all teachers a chance to demonstrate effectiveness regardless of the composition of their classrooms, and States may also determine the weights of the criteria used in their State assessments of teacher preparation program quality.
We do not agree with the comment that the definition of the term TEACH Grant-eligible program will unfairly punish students who live in States or attend institutions that fail to comply with the regulations in part 612 by failing to supply the data required in that part. Section 205 of the HEA requires States and institutions to submit IRCs and SRCs annually. In addition, students will have access to information about a teacher preparation program's eligibility before they enroll so that they may select programs that are TEACH Grant-eligible. Section 686.3(c) also allows students who are currently enrolled in a TEACH Grant-eligible program to receive additional TEACH Grants to complete their program, even if the program becomes ineligible to award TEACH Grants to new students.
For reasons discussed under the TEACH Grant-eligible program section of this document, we have made conforming changes to the definition of a TEACH Grant-eligible program that are reflected in the definition of TEACH Grant-eligible institution where applicable.
Commenters also questioned what criteria the Secretary would use to determine eligibility, since the Secretary would be responsible for determining which STEM programs are TEACH Grant-eligible. Finally, commenters emphasized the importance of the pedagogical aspects of teacher education.
Many commenters stated that the proposed regulations would grant the State, rather than the Department of Education, authority to determine TEACH Grant eligibility, which is a delegation of authority that Congress did not provide the Department, and that a State's strict requirements may make the TEACH Grant program unusable by institutions, thereby eliminating TEACH Grant funding from students at those institutions. It was recommended that the regulations allow for professional judgment regarding TEACH Grant eligibility, that TEACH Grants mimic Federal Pell grants in annual aggregates, and that a link should be available at
With respect to comments objecting to the use of student growth to determine TEACH Grant eligibility, student growth is only one of the many indicators that States use to assess teacher preparation program quality in part 612, and States have discretion to determine the weight assigned to that indicator in their assessment.
While the new regulations will require financial aid offices to track and review additional information with respect to student eligibility for TEACH Grants, we do not agree that this would result in greater risk of incorrect packaging of financial aid. For an institution to begin and continue to participate in any title IV, HEA program, the institution must demonstrate to the Secretary that it is capable of administering that program under the standards of administrative capability provided under § 668.16 (Standards of administrative capability). An institution that does not meet administrative capability standards
We disagree with comments asserting that the proposed regulations would grant States, rather than the Department, authority to determine TEACH Grant eligibility, which they claimed is a delegation of authority that Congress did not authorize. The HEA provides that an “eligible institution” for purposes of the TEACH Grant program is one “that the Secretary determines . . . provides high quality teacher preparation . . . .” The Secretary has determined that States are in the best position to assess the quality of teacher preparation programs located in their States, and it is reasonable for the Secretary to rely on the results of the State assessment required by section 207 of the HEA. We believe that it is appropriate to use the regulatory process to define how the Secretary determines that an institution provides high quality teacher preparation and that the final regulations reasonably amend the current requirements so that they are more meaningful.
We also disagree with commenters that a State's strict requirements may make the TEACH Grant program unusable by institutions and thereby eliminate TEACH Grant funding for students at those institutions. We believe that States will conduct careful and reasonable assessments of teacher preparation programs located in their States, and we also believe if a State determines a program is not effective at providing teacher preparation, students should not receive TEACH Grants to attend that program.
Regarding the recommendation that the regulations allow for professional judgment regarding TEACH Grant eligibility, there is no prohibition on the use of professional judgment for the TEACH Grant program, provided that all applicable regulatory requirements are met. With respect to the comment suggesting that the TEACH Grant program should mimic the Pell Grant program in annual aggregates, we note that, just as the Pell Grant program has its own annual aggregates, the TEACH Grant program has its own statutory annual award limits that must be adhered to. The HEA provides that an undergraduate or post-baccalaureate student may receive up to $4,000 per year, and § 686.3(a) provides that an undergraduate or post-baccalaureate student may receive the equivalent of up to four Scheduled Awards during the period required for completion of the first undergraduate baccalaureate program of study and the first post-baccalaureate program of study combined. For graduate students, the HEA provides up to $4,000 per year, and § 686.3(b) stipulates that a graduate student may receive the equivalent of up to two Scheduled Awards during the period required for the completion of the TEACH Grant-eligible master's degree program of study.
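To illustrate the award limits described above, and assuming for this illustration that each Scheduled Award equals the $4,000 statutory annual maximum, the corresponding maximum aggregate amounts would be approximately:

Undergraduate and post-baccalaureate study: 4 Scheduled Awards × $4,000 = $16,000
TEACH Grant-eligible master's degree study: 2 Scheduled Awards × $4,000 = $8,000

These figures are illustrative only and are derived from the annual and Scheduled Award limits stated above, not from any separate aggregate limit set out in this section.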
Regarding the comment requesting a link to the TEACH Grant program via the
We disagree with the comment that the Department should focus specifically on issues or deficiencies with the TEACH Grant program and not connect any issues or deficiencies to reporting of teacher preparation programs under title II. The regulations are intended to improve the TEACH Grant program, in part, by operationalizing the definition of a high-quality teacher preparation program by connecting the definition to the ratings of teacher preparation programs under the title II reporting system. The regulations are not meant to address specific TEACH Grant program issues or program deficiencies.
We decline to adopt the suggestion that an at-risk teacher preparation program should be given the opportunity and support to improve before any consequences, including those regarding TEACH Grants, are imposed. The HEA specifies that TEACH Grants may only be provided to high-quality teacher preparation programs, and we do not believe that a program identified as being at-risk should be considered a high-quality teacher preparation program. With respect to the comment that institutions in the specific commenter's State will remove themselves from participation in the TEACH Grant program rather than pursue high-stakes Federal requirements, we note that, while we cannot prevent institutions from ending their participation in the program, we believe that institutions understand the need for providing TEACH Grants to eligible students and that institutions will continue to try to meet that need. Additionally, we note that all institutions that enroll students receiving Federal financial assistance are required to submit an annual IRC under section 205(a) of the HEA, and that all States that receive funds under the HEA must submit an annual SRC. These provisions apply whether or not an institution participates in the TEACH Grant program.
We agree with the commenters who recommended avoiding specific carve-outs for potential mathematics and science teachers. As discussed under the section titled “TEACH Grant-eligible STEM program,” we have removed the TEACH Grant-eligible STEM program definition from § 686.2 and deleted the term where it appeared elsewhere in part 686.
Under Executive Order 12866, the Secretary must determine whether this regulatory action is “significant” and, therefore, subject to the requirements of the Executive order and subject to review by the Office of Management and Budget (OMB). Section 3(f) of Executive Order 12866 defines a “significant regulatory action” as an action likely to result in a rule that may—
(1) Have an annual effect on the economy of $100 million or more, or adversely affect a sector of the economy, productivity, competition, jobs, the environment, public health or safety, or State, local, or tribal governments or communities in a material way (also referred to as an “economically significant” rule);
(2) Create serious inconsistency or otherwise interfere with an action taken or planned by another agency;
(3) Materially alter the budgetary impacts of entitlement grants, user fees, or loan programs or the rights and obligations of recipients thereof; or
(4) Raise novel legal or policy issues arising out of legal mandates, the President's priorities, or the principles stated in the Executive order.
This final regulatory action is a significant regulatory action subject to review by OMB under section 3(f) of Executive Order 12866.
We have also reviewed these regulations under Executive Order 13563, which supplements and explicitly reaffirms the principles, structures, and definitions governing regulatory review established in Executive Order 12866. To the extent permitted by law, Executive Order 13563 requires that an agency—
(1) Propose or adopt regulations only on a reasoned determination that their benefits justify their costs (recognizing that some benefits and costs are difficult to quantify);
(2) Tailor its regulations to impose the least burden on society, consistent with obtaining regulatory objectives and taking into account—among other things and to the extent practicable—the costs of cumulative regulations;
(3) In choosing among alternative regulatory approaches, select those approaches that maximize net benefits (including potential economic, environmental, public health and safety, and other advantages; distributive impacts; and equity);
(4) To the extent feasible, specify performance objectives, rather than the behavior or manner of compliance a regulated entity must adopt; and
(5) Identify and assess available alternatives to direct regulation, including economic incentives—such as user fees or marketable permits—to encourage the desired behavior, or provide information that enables the public to make choices.
Executive Order 13563 also requires an agency “to use the best available techniques to quantify anticipated present and future benefits and costs as accurately as possible.” The Office of Information and Regulatory Affairs of OMB has emphasized that these techniques may include “identifying changing future compliance costs that might result from technological innovation or anticipated behavioral changes.”
We are issuing these final regulations only on a reasoned determination that their benefits justify their costs. In choosing among alternative regulatory approaches, we selected those approaches that maximize net benefits. Based on the analysis that follows, the Department believes that these regulations are consistent with the principles in Executive Order 13563.
We also have determined that this regulatory action does not unduly interfere with State, local, or tribal governments in the exercise of their governmental functions.
In this RIA we discuss the need for regulatory action, the potential costs and benefits, net budget impacts, assumptions, limitations, and data sources, as well as regulatory alternatives we considered. Although the majority of the costs related to information collection are discussed within this RIA, we also identify and further explain the burdens specifically associated with information collection requirements elsewhere in this document, under Paperwork Reduction Act of 1995.
Recent international assessments of student achievement have revealed that students in the United States are significantly behind students in other countries in science, reading, and mathematics.
A number of factors are associated with teacher quality, including academic content knowledge, in-service training, and years of experience, but researchers and policymakers have begun to examine whether student achievement discrepancies can be
Subsequent studies have examined the value-added scores of teachers prepared through different teacher preparation programs in Missouri, Louisiana, North Carolina, Tennessee, and Washington.
In contrast to these findings, Koedel, et al. found very small differences in effectiveness between teachers prepared at different programs in Missouri.
We acknowledge that there is debate in the research community about the specifications that should be used when conducting value-added analyses of the effectiveness of teachers prepared through different preparation programs,
Thus, despite the methodological debate in the research community, CAEP has developed new standards that require, among other measures, evidence that students completing a teacher preparation program positively impact student learning.
Despite research suggesting that the academic achievement of students taught by graduates of different teacher preparation programs may vary according to the program their teacher attended, analyses linking student achievement to teacher preparation programs have not been conducted and made publicly available for teacher preparation programs in all States. Congress has recognized the value of assessing and reporting on the quality of teacher preparation, and requires States and IHEs to report detailed information about the quality of teacher preparation programs in the State under the HEA. When reauthorizing the title II reporting system, members of Congress noted a goal of having teacher preparation programs explore ways to assess the impact of their programs' graduates on student academic achievement. In fact, the report accompanying the House Bill (H. Rep. 110-500) included the following statement: “[i]t is the intent of the Committee that teacher preparation programs, both traditional and those providing alternative routes to State certification, should strive to increase the quality of individuals graduating from their programs with the goal of exploring ways to assess the impact of such programs on student's academic achievement.”
Moreover, in roundtable discussions and negotiated rulemaking sessions held by the Department, stakeholders repeatedly expressed concern that the current title II reporting system provides little meaningful data on the quality of teacher preparation programs or the impact of those programs' graduates on student achievement. The recent GAO report on teacher preparation programs noted that half or more of the States and teacher preparation programs surveyed said the current title II data collection was not useful to assessing their programs; and none of the surveyed school district staff said they used the data.
Currently, States must annually calculate and report data on more than 400 data elements, and IHEs must report on more than 150 elements. While some information requested in the current reporting system is statutorily required, other elements—such as whether the IHE requires a personality test prior to admission—are not required by statute and do not provide information that is particularly useful to the public. Thus, stakeholders stressed at the negotiated rulemaking sessions that the current system is too focused on inputs and that outcome-based measures would provide more meaningful information.
Similarly, even some of the statutorily-required data elements in the current reporting system do not provide meaningful information on program performance and how program graduates are likely to perform in a classroom. For example, the HEA requires IHEs to report both scaled scores on licensure tests and pass rates for students who complete their teacher preparation programs. Yet, research provides mixed findings on the relationship between licensure test scores and teacher effectiveness.
Thus, while the current title II reporting system produces detailed and voluminous data about teacher preparation programs, the data do not convey a clear picture of program quality as measured by how program graduates will perform in a classroom. This lack of meaningful data prevents school districts, principals, and prospective teacher candidates from making informed choices, creating a market failure due to imperfect information.
On the demand side, principals and school districts lack information about the past performance of teachers from different teacher preparation programs and may rely on inaccurate assumptions about the quality of teacher preparation programs when recruiting and hiring novice teachers. An accountability system that provides information about how teacher preparation program graduates are likely to perform in a classroom and how likely they are to stay in the classroom will be valuable to school districts and principals seeking to efficiently recruit, hire, train, and retain high-quality educators. Such a system can help to reduce teacher attrition, a particularly important problem because many novice teachers do not remain in the profession, with more than a quarter of novice teachers leaving the teaching profession altogether within three years of becoming classroom teachers.
On the supply side, when considering which program to attend, prospective teachers lack comparative information about the placement rates and effectiveness of a program's graduates. Teacher candidates may enroll in a program without the benefit of information on employment rates post-graduation, employer and graduate feedback on program quality, and, most importantly, without understanding how well the program prepared prospective teachers to be effective in the classroom. NCES data indicate that 66 percent of certified teachers who received their bachelor's degree in 2008 took out loans to finance their undergraduate education. These teachers borrowed an average of $22,905.
The lack of meaningful data also prevents States from restricting program credentials to programs with the demonstrated ability to prepare more effective teachers, or accurately identifying low-performing and at-risk teacher preparation programs and helping these programs improve. Not surprisingly, States have not identified many programs as low-performing or at-risk based on the data currently collected. In the latest title II reporting requirement submissions, twenty-one States did not classify any teacher preparation programs as low-performing or at-risk.
Similarly, under the current title II reporting system, the Federal government is unable to ensure that financial assistance for prospective teachers is used to help students attend programs with the best record for producing effective classroom teachers. The final regulations help accomplish this by ensuring that program performance information is available for all teacher preparation programs in all States and by restricting eligibility for Federal TEACH Grants to programs that are rated “effective.”
Most importantly, elementary and secondary school students, including those students in high-need schools and communities who are disproportionately taught by recent teacher preparation program graduates, will be the ultimate beneficiaries of an improved teacher preparation program accountability system.
Recognizing the benefits of improved information on teacher preparation program quality and associated accountability, several States have already developed and implemented systems that map teacher effectiveness data back to teacher preparation programs. The regulations help ensure that all States generate useful data that are accessible to the public to support efforts to improve teacher preparation programs.
The Department's plan to improve teacher preparation has three core elements: (1) Reduce the reporting burden on IHEs while encouraging States to make use of data on teacher effectiveness to build an effective teacher preparation accountability system driven by meaningful indicators of quality (title II accountability system); (2) reform targeted financial aid for students preparing to become teachers by directing scholarship aid to students attending higher-performing teacher preparation programs (TEACH Grants); and (3) provide more support for IHEs that prepare high-quality teachers.
The regulations address the first two elements of this plan. Improving institutional and State reporting and State accountability builds on the work that States like Louisiana and Tennessee have already started, as well as work that is underway in States receiving grants under Phase One or Two of the Race to the Top Fund.
Consistent with feedback the Department has received from stakeholders, under the regulations
The regulations will help provide meaningful information on program quality to prospective teacher candidates, school districts, States, and IHEs that administer traditional teacher preparation programs and alternative routes to State certification or licensure programs. The regulations will make data available that also can inform academic program selection, program improvement, and accountability.
During public roundtable discussions and subsequent negotiated rulemaking sessions, the Department consulted with representatives from the teacher preparation community, States, teacher preparation program students, teachers, and other stakeholders about the best way to produce more meaningful data on the quality of teacher preparation programs while also reducing the reporting burden on States and teacher preparation programs where possible. The regulations specify three types of outcomes States must use to assess teacher preparation program quality, but States retain discretion to select the most appropriate methods to collect and report these data. In order to give States and stakeholders sufficient time to develop these methods, the requirements of these regulations are implemented over several years.
The Department has analyzed the costs of complying with the final regulations. Due to uncertainty about the current capacity of States in some relevant areas and the considerable discretion the regulations will provide States (
The Department has reviewed the comments submitted in response to the NPRM and has revised some assumptions in response to the information we received. We discuss specific public comments, where relevant, in the appropriate sections below. In general, we do not discuss non-substantive comments.
A number of commenters expressed general concerns regarding the cost estimates included in the NPRM and indicated that implementing these regulations would cost far more than $42.0 million over ten years. As noted above, we believe most of these comments arose from a fundamental misunderstanding of the estimates presented in the NPRM. While several commenters attempted to provide alternate cost estimates, we note that many of these estimates were unreasonably high because they included costs for activities or initiatives that are not required by the regulations. For instance, in one alternate estimate (submitted jointly by the California Department of Education, the California Commission on Teacher Credentialing, and the California State Board of Education) cited by a number of commenters, over 95 percent of the costs outlined were due to non-required activities such as dramatically expanding State standardized assessments to all grades and subjects or completing time- and cost-intensive teacher evaluations of all teachers in the State in every year. Nonetheless, we have taken portions of those estimates into account where appropriate (
In addition, some commenters argued that our initial estimates were too low because they did not include costs for activities not directly required by the regulations. These activities included making changes in State laws where those laws prohibited the sharing of data between State entities responsible for teacher certification and the State educational agency. Upon reviewing these comments, we have declined to include estimates of these potential costs. Such costs are difficult to quantify, as it is unclear how many States would be affected, how extensive the needed changes would be, or how much time and resources would be required on the part of State legislatures. Also, we believe that many States removed potential barriers in order to receive ESEA flexibility prior to the passage of ESSA, further minimizing the potential cost of legislative changes. To the extent that States do experience costs associated with these actions, or other actions not specifically required by the regulations and therefore not outlined below (
We have also updated our estimates using the most recently available wage rates from the Bureau of Labor Statistics, and we have updated our estimates of the number of teacher preparation programs and teacher preparation entities using the most recent data submitted to the Department in the 2015 title II data collection. While no commenters specifically addressed these issues, we believe that these updates will provide the most reasonable estimate of costs.
Based on revised assumptions, the Department estimates that the total annualized cost of the regulations will be between $27.5 million and $27.7 million (see the Accounting Statement section of this document for further detail). This estimate is significantly lower than the total annualized cost estimated in the proposed rule. The largest driver of this decrease is the increased flexibility provided to States under § 612.5(a)(1)(ii), as explained below. To provide additional context, we provide estimates in Table 3 for IHEs, States, and LEAs in Year 1 and Year 5. These estimates are not annualized or calculated on a net present value basis, but instead represent real dollar estimates.
Relative to these costs, the major benefit of the requirements, taken as a whole, will be better publicly available information on the effectiveness of teacher preparation programs that can be used by prospective students when choosing programs to attend; employers in selecting teacher preparation program graduates to recruit, train, and hire; States in making funding decisions; and teacher preparation programs themselves in seeking to improve.
The following is a detailed analysis of the estimated costs of implementing the specific requirements, including the costs of complying with paperwork-related requirements, followed by a discussion of the anticipated benefits.
Section 205(a) of the HEA requires that each IHE that provides a teacher preparation program leading to State certification or licensure report on a statutorily enumerated series of data elements for the programs it provides. Section 205(b) of the HEA requires that each State that receives funds under the HEA provide to the Secretary and make widely available to the public information on the quality of traditional and alternative route teacher preparation programs that includes not less than the statutorily enumerated series of data elements it provides. The State must do so in a uniform and comprehensible manner, conforming with definitions and methods established by the Secretary. Section 205(c) of the HEA directs the Secretary to prescribe regulations to ensure the validity, reliability, accuracy, and integrity of the data submitted. Section 206(b) requires that IHEs provide assurance to the Secretary that their teacher training programs respond to the needs of LEAs, be closely linked with the instructional decisions novice teachers confront in the classroom, and prepare candidates to work with diverse populations and in urban and rural settings, as applicable. Consistent with these statutory provisions, the Department is issuing regulations to ensure that the data reported by IHEs and States are accurate. The following sections provide a detailed examination of the costs associated with each of the regulatory provisions.
The regulations require that beginning on April 1, 2018, and annually thereafter, each IHE that conducts a traditional teacher preparation program or alternative route to State certification or licensure program and enrolls students receiving title IV, HEA funds, report to the State on the quality of its program using an IRC prescribed by the Secretary.
Under the current IRC, IHEs typically report at the entity level, rather than the program level, such that an IHE that administers multiple teacher preparation programs typically gathers data on each of those programs, aggregates the data, and reports the required information as a single teacher preparation entity on a single report card. By contrast, the regulations generally require that States report on program performance at the individual program level. The Department originally estimated that the initial burden for each IHE to adjust its recordkeeping systems in order to report the required data separately for each of its teacher preparation programs would be four hours per IHE. Numerous commenters argued that this estimate was low. Several commenters argued that initial set-up would take 8 to 12 hours, while others argued that it would take 20 to 40 hours per IHE. While we recognize that the amount of time it will take to initially adjust their record-keeping systems will vary, we believe that the estimates in excess of 20 hours are too high, given that IHEs will only be adjusting the way in which they report data, rather than collecting new data. However, the Department found arguments in favor of both 8 hours and 12 hours to be compelling and reasonable. We believe that eight hours is a reasonable estimate for how long it will take to complete this process generally; and for institutions with greater levels of oversight, review, or complexity, this process may take longer. Without additional information about the specific levels of review and oversight at individual institutions, we assume that the amount of time it will take institutions to complete this work will be normally distributed between 8 and 12 hours, with a national average of 10 hours per institution. Therefore, the Department has upwardly revised its initial estimate of four hours to ten hours. In the most recent year for which data are available, 1,490 IHEs submitted IRCs to the Department, for an estimated one-time cost of $384,120.
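As an illustration of how the one-time figure above appears to have been derived (the hourly rate is not stated in this section and is inferred here from the reported totals, so it should be treated as approximate):

1,490 IHEs × 10 hours per IHE = 14,900 hours
14,900 hours × roughly $25.78 per hour (implied wage rate) ≈ $384,120 one-time cost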
One commenter argued that institutions would have to make costly updates and upgrades to their existing information technology (IT) platforms in order to generate the required new reports. However, given that institutions will not be required to generate reports on any new data elements, but only disaggregate the data already being collected by program, and that we include cost estimates for making the necessary changes to their existing systems in order to generate reports in that way, we do not believe it would be appropriate to include additional costs associated with large IT purchases in this cost estimate.
The Department further estimated that each of the 1,490 IHEs would need to spend 78 hours to collect the data elements required for the IRC for its teacher preparation programs. Several commenters argued that it would take longer than 78 hours to collect the data elements required for the IRC each year. The Department reviewed its original estimates in light of these comments and the new requirement for IHEs to identify, in their IRCs, whether each program met the definition of a teacher preparation program provided through distance education. Pursuant to that review, the Department has increased its initial estimate to 80 hours, for an annual cost of $3,072,980.
We originally estimated that entering the required information into the information collection instrument would take 13.65 hours per entity. We currently estimate that, on average, it takes one hour for institutions to enter the data for the current IRC. The Department believed that it would take institutions approximately as long to complete the report for each program as it does currently for the entire entity. As such, the regulations would result in an additional burden of the time to complete all individual program level
The regulations also require that each IHE provide the information reported on the IRC to the general public by prominently and promptly posting the IRC on the IHE's Web site, and, if applicable, on the teacher preparation portion of the Web site. We originally estimated that each IHE would require 30 minutes to post the IRC. One commenter stated that this estimate was reasonable given the tasks involved, while two commenters argued that this was an underestimate. One of these commenters stated that posting data on the institutional Web site often involved multiple staff, which was not captured in the Department's initial estimate. Another commenter argued that this estimate did not take into account time for data verification, drafting of summary text to accompany the document, or ensuring compliance with the Americans with Disabilities Act (ADA). Given that institutions will simply be posting on their Web site the final IRC that was submitted to the Department, we assume that the document has already been reviewed by all necessary parties and that all included data have been verified prior to being submitted to the Department. As such, the requirement to post the IRC to the Web site should not require any additional review or data validation. Regarding ADA compliance, we assume the commenter was referring to the broad set of statutory requirements regarding accessibility of communications by entities receiving Federal funding. In general, we believe that the vast majority of institutions, when developing materials for public dissemination, already ensure that such materials meet government- and industry-recognized standards for accessibility. To the extent that they do not already do so, nothing in the regulations imposes additional accessibility requirements beyond those in the Rehabilitation Act of 1973, as amended, or the ADA. As such, while there may be accessibility-related work associated with the preparation of these documents that is not already within the standard procedures of the institution, such work is not a burden created by the regulations. Thus, we believe our initial estimate of 30 minutes is appropriate, for an annual cumulative cost of $19,210. The estimated total annual cost to IHEs to meet the requirements concerning IRCs would be $3,991,030.
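The annual IHE components cited above can be reproduced with the following illustrative sketch, again assuming the $25.78 hourly rate implied by the rounded totals; the $3,991,030 annual total additionally includes the program-level data-entry burden discussed earlier, which is not itemized here.

```python
# Illustrative check of the annual per-IHE cost components.
IHES = 1_490
HOURLY_WAGE = 25.78   # assumed, consistent with the rate implied above

data_collection = IHES * 80 * HOURLY_WAGE   # 80 hours per IHE per year
irc_posting = IHES * 0.5 * HOURLY_WAGE      # 30 minutes per IHE per year

print(f"${round(data_collection, -1):,.0f}")  # $3,072,980
print(f"${round(irc_posting, -1):,.0f}")      # $19,210
```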
We note that several commenters, in response to the Supplemental NPRM, argued that institutions would experience increased compliance costs given new provisions related to teacher preparation programs provided through distance education. However, nothing in the Supplemental NPRM proposed changes to institutional burden under § 612.3. Under the final regulations, the only increased burden on IHEs with respect to teacher preparation programs provided through distance education is that they identify whether each of the teacher preparation programs they offer meets the definition in § 612.2. We believe that the additional two hours estimated for data collection above the Department's initial estimate provides more than enough time for IHEs to meet this requirement. We do not estimate additional compliance costs to accrue to IHEs as a result of provisions in this regulation related to teacher preparation programs provided through distance education.
Section 205(b) of the HEA requires each State that receives funds under the HEA to report annually to the Secretary on the quality of teacher preparation in the State, both for traditional teacher preparation programs and for alternative routes to State certification or licensure programs, and to make this report available to the general public. In the NPRM, the Department estimated that the 50 States, the District of Columbia, the Commonwealth of Puerto Rico, Guam, American Samoa, the United States Virgin Islands, the Commonwealth of the Northern Mariana Islands, and the Freely Associated States (the Republic of the Marshall Islands, the Federated States of Micronesia, and the Republic of Palau) would each need 235 hours to report the data required under the SRC.
In response to the original NPRM, two commenters argued that this estimate was too low. Specifically, one commenter stated that, based on the amount of time their State has historically devoted to reporting the data in the SRC, it would take approximately 372.5 hours to complete. We note that not all States will be able to complete the reporting requirements in 235 hours and that some States, particularly those with more complex systems or more institutions, will take much longer. We also note that the State identified by the commenter in developing the 372.5 hour estimate meets both of those conditions—it uses a separate reporting structure to develop its SRC (one of only two States nationwide to do so), and has an above-average number of preparation programs. As such, it is reasonable to assume that this State would require more than the nationwide average amount of time to complete the process. Another commenter stated that the Department's estimates did not take into account the amount of time and potential staff resources needed to prepare and post the information. We note that there are many other aspects of preparing and posting the data that are not reflected in this estimate, such as collecting, verifying, and validating the data. We also note that this estimate does not take into account the time required to report on student learning outcomes, employment outcomes, or survey results. However, all of these estimates are included elsewhere in these cost estimates. We believe that, taken as a whole, all of these various elements appropriately capture the time and staff resources necessary to comply with the SRC reporting requirement.
As proposed in the Supplemental NPRM, and as described in greater detail below, in these final regulations, States will be required to report on teacher preparation programs offered through distance education that produce 25 or more certified teachers in their State. The Department estimates that the reporting on these additional programs, in conjunction with the reduction in the total number of teacher preparation programs from our initial estimates in the NPRM, will result in a net increase in the time necessary to report the data required in the SRC from the 235 hours
Section 612.4(a)(2) requires that States post the SRC on the State's Web site. Because all States already have at least one Web site in operation, we originally estimated that posting the SRC on an existing Web site would require no more than half an hour at a cost of $25.78 per hour. Two commenters suggested that this estimate was too low. One commenter argued that the Department's initial estimate did not take into account time to create Web-ready materials or to address technical errors. In general, the regulations do not require the SRC to be posted in any specific format, and we believe that it would take a State minimal time to create a file that would be compliant with the regulations by, for example, creating a PDF containing the SRC. We were unable to determine from this comment the specific technical errors that the commenter was concerned about, but believe that enough States will need less than the originally estimated 30 minutes to post the SRC that the overall average will not be affected if a handful of States encounter technical issues. Another commenter estimated that, using its current Web reporting system, it would take approximately 450 hours to initially set up the SRC Web site, with a recurring 8 hours annually to update it. However, we note that the system the commenter describes is more labor intensive and includes more data analysis than the regulations require. While we recognize the value in States' actively trying to make the SRC data more accessible and useful to the public, we cannot accurately estimate how many States will choose to do more than the regulations require, or what costs they would encounter to do so. We have therefore opted to estimate only the time and costs necessary to comply with the regulations. As such, we retain our initial estimate of 30 minutes to post the SRC. For the 50 States, the District of Columbia, the Commonwealth of Puerto Rico, Guam, American Samoa, the United States Virgin Islands, the Commonwealth of the Northern Mariana Islands, and the Freely Associated States (the Republic of the Marshall Islands, the Federated States of Micronesia, and the Republic of Palau), the total annual estimated cost of meeting this requirement would be $760.
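The $760 figure is consistent with the following illustrative calculation, assuming 59 reporting entities (the 50 States, the District of Columbia, Puerto Rico, the four insular areas, and the three freely associated States listed above).

```python
# Illustrative check of the SRC Web posting estimate.
ENTITIES = 59         # assumed count of the jurisdictions listed above
HOURS = 0.5           # 30 minutes to post the SRC
HOURLY_WAGE = 25.78   # dollars per hour

print(f"${round(ENTITIES * HOURS * HOURLY_WAGE, -1):,.0f}")  # $760 per year
```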
The costs associated with the reporting requirements in paragraphs (b) and (c) of § 612.4 are discussed in the following paragraphs. The requirements regarding reporting of a teacher preparation program's indicators of academic content knowledge and teaching skills do not apply to the insular areas of American Samoa, Guam, the Commonwealth of the Northern Mariana Islands, and the U.S. Virgin Islands, or to the freely associated States of the Republic of the Marshall Islands, the Federated States of Micronesia, and the Republic of Palau. Given the size and the limited resources and capacity of some of these areas, we believe that requiring these areas to collect and report data on these indicators would impose costs that would not be warranted.
As described in the Supplemental NPRM (81 FR 18808), the Department initially estimated that the portions of this regulation relating to reporting on teacher preparation programs offered through distance education would result in 812 additional reporting instances for States. A number of commenters acknowledged the difficulty in arriving at an accurate estimate of the number of teacher preparation programs offered through distance education that would be subject to reporting under the final regulation. However, those commenters also noted that, without a clear definition from the Department on what constitutes a teacher preparation program offered through distance education, it would be exceptionally difficult to offer an alternative estimate. No commenters provided alternate estimates. In these final regulations, the Department has adopted a definition of teacher preparation program offered through distance education. We believe that this definition is consistent with our initial estimation methodology and have no reason to adjust that estimate at this time.
Under § 612.4(b)(1), a State would be required to make meaningful differentiations in teacher preparation program performance using at least three performance levels—low-performing teacher preparation program, at-risk teacher preparation program, and effective teacher preparation program—based on the indicators in § 612.5, including student learning outcomes and employment outcomes for teachers in high-need schools. Because States would have the discretion to determine the weighting of these indicators, the Department assumes that States would consult with early adopter States or researchers to determine best practices for making such determinations and whether an underlying qualitative basis should exist for these decisions. The Department originally estimated that State higher education authorities responsible for making State-level classifications of teacher preparation programs would require at least 35 hours to discuss methods for ensuring that meaningful differentiations are made in their classifications. This initial estimate also included determining what it meant for particular indicators to be included “in significant part” and what constituted “satisfactory” student learning outcomes, which are not included in the final regulations.
A number of commenters stated that 35 hours was an underestimate. Among the commenters that suggested alternative estimates, most proposed 60 to 70 hours; the highest estimate was 350 hours. Based on these comments, the Department believes that its original estimate would not have provided sufficient time for multiple staff to meet and discuss teacher preparation program quality in a meaningful way. As such, and given that these staff will be making decisions regarding a smaller range of issues, the Department is revising its estimate to 70 hours per State. We believe that this amount of time would be sufficient for staff to discuss and make decisions on these issues in a meaningful and purposeful way. To estimate the cost per State, we assume that the State employee or employees would likely be in a managerial position (with national average hourly earnings of $45.58), for a total one-time cost to the 50 States, the District of Columbia, and the Commonwealth of Puerto Rico of $165,910.
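As an illustrative check, the one-time cost follows directly from the figures stated above.

```python
# Illustrative check of the one-time cost of determining meaningful differentiations.
STATES = 52            # 50 States, the District of Columbia, and Puerto Rico
HOURS = 70             # revised from the original 35-hour estimate
MANAGER_WAGE = 45.58   # national average hourly earnings, managerial position

print(f"${round(STATES * HOURS * MANAGER_WAGE, -1):,.0f}")  # $165,910
```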
Section 612.4(c)(1) requires States to consult with a representative group of stakeholders to determine the procedures for assessing and reporting the performance of each teacher preparation program in the State. The regulations specify that these stakeholders must include, at a minimum, representatives of leaders and faculty of traditional teacher preparation programs and alternative routes to State certification or licensure programs; students of teacher preparation programs; LEA superintendents; local school board members; elementary and secondary school leaders and instructional staff; elementary and secondary school students and their parents; IHEs that serve high proportions of low-income students or students of color, or English learners; advocates for English learners and students with disabilities; officials
Many commenters stated that their States would likely adopt methods different from those outlined below. In particular, these commenters argued that their States would include more than the minimum number of participants we used for these estimates. In general, while States may opt to do more than what is required by the regulations, for purposes of estimating the cost, we have based the estimate on what the regulations require. If States opt to include more participants or consult with them more frequently or for longer periods of time, then the costs incurred by States and the participants would be higher.
In order to estimate the cost of implementing these requirements, we assume that the average State will convene at least three meetings with at least the following representatives from required categories of stakeholders: One administrator or faculty member from a traditional teacher preparation program, one administrator or faculty member from an alternative route teacher preparation program, one student from a traditional or alternative route teacher preparation program, one teacher or other instructional staff, one representative of a small teacher preparation program, one LEA superintendent, one local school board member, one student in elementary or secondary school and one of his or her parents, one administrator or faculty member from an IHE that serves high percentages of low-income students or students of color, one representative of the interests of English learners, one representative of the interests of students with disabilities, one official from the State's standards board or other appropriate standards body, and one administrator or faculty from a teacher preparation program provided through distance education. We note that a representative of a small teacher preparation program and a representative from a teacher preparation program provided through distance education were not required stakeholders in the proposed regulations, but are included in these final regulations.
To estimate the cost of participating in these meetings for the required categories of stakeholders, we initially assumed that each meeting would require four hours of each participant's time and used the following national average hourly wages for full-time State government workers employed in these professions: Postsecondary education administrators, $50.57 (4 stakeholders); elementary or secondary education administrators, $50.97 (1 stakeholder); postsecondary teachers, $45.78 (1 stakeholder); primary, secondary, and special education school teachers, $41.66 (1 stakeholder). For the official from the State's standards board or other appropriate standards body, we used the national average hourly earnings of $59.32 for chief executives employed by State governments. For the representatives of the interests of students who are English learners and students with disabilities, we used the national average hourly earnings of $62.64 for lawyers in educational services (including private, State, and local government schools). For the opportunity cost to the representatives of elementary and secondary school students, we used the Federal minimum wage of $7.25 per hour and the average hourly wage for all workers of $22.71. These wage rates could represent either the involvement of a parent and a student at these meetings, or a single representative from an organization representing their interests who has an above average wage rate (
A number of commenters stated that this consultation process would take longer than the 12 hours in our initial estimate and that our estimates did not include time for preparation for the meetings or for participant travel. Alternate estimates from commenters ranged from 56 hours to 3,900 hours. Based on the comments we received, the Department believes that both States and participants may opt to meet for longer periods at each meeting or to meet more frequently. However, we believe that many of the estimates from commenters were overestimates for an annual process. For example, the 3,900-hour estimate would require a commitment from each participant of 75 hours per week for 52 weeks per year, which we believe is highly unrealistic. However, we do recognize that States and interested parties may wish to spend more time in the first year to discuss and establish the initial framework than we initially estimated. As such, we are increasing our initial estimate of 12 hours in the first year to 60 hours. We believe that this will provide adequate time for discussion of these important issues. We therefore estimate the cumulative cost to the 50 States, the District of Columbia, and Puerto Rico to be $2,385,900.
We also recognize that, although the Department initially estimated that this consultative process would occur only once every five years, States may wish to consult with these stakeholders on a continuing basis. We believe that this engagement would take place over email, by conference call, or through an on-site meeting. We are therefore adding an estimated 20 hours per year in the intervening years for consulting with stakeholders, and estimate that these additional consultations will cumulatively cost the 50 States, the District of Columbia, and Puerto Rico $690,110.
States would also be required to report on the State-level rewards or consequences associated with the designated performance levels and on the opportunities they provide for teacher preparation programs to challenge the accuracy of their performance data and classification of the program. Costs associated with implementing these requirements are estimated in the discussion of annual costs associated with the SRC.
Under final § 612.4(b)(3), a State would be required to ensure that teacher preparation programs in the State are included on the SRC, but with some flexibility due to the Department's recognition that reporting on teacher preparation programs particularly consisting of a small number of prospective teachers could present privacy and data validity concerns. See § 612.4(b)(5). The Department originally
Two commenters stated that the Department's initial estimate seemed low given the amount of work involved, while three other commenters stated that the Department's initial estimates were adequate. Another commenter stated that this process would likely take longer in his State. No commenters offered alternative estimates. For the vast majority of States, we continue to believe that 14 hours is a sufficient amount of time for staff to review and analyze the applicable laws and statutes. However, given the potential complexity of these issues, as raised by commenters, we recognize that there may be additional staff involved and additional meetings required for purposes of consultation. In order to account for these additional burdens where they may exist, the Department is increasing its initial estimate to 20 hours. We believe that this will provide sufficient time for review, analysis, and discussion of these important issues. This provides an estimated cost to the 50 States, the District of Columbia, and the Commonwealth of Puerto Rico of $51,750, based on the average national hourly earnings for a lawyer employed full-time by a State government ($49.76).
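As an illustrative check, the cost follows directly from the revised estimate.

```python
# Illustrative check of the legal review estimate for program inclusion on the SRC.
STATES = 52           # 50 States, the District of Columbia, and Puerto Rico
HOURS = 20            # revised from the original 14-hour estimate
LAWYER_WAGE = 49.76   # average hourly earnings, State-government lawyer

print(f"${round(STATES * HOURS * LAWYER_WAGE, -1):,.0f}")  # $51,750
```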
For purposes of reporting under § 612.4, each State will need to establish indicators that would be used to assess the academic content knowledge and teaching skills of the graduates of teacher preparation programs within its jurisdiction. At a minimum, States must base their assessments on student learning outcomes, employment outcomes, survey outcomes, and whether or not the program is accredited by a specialized accrediting agency recognized by the Secretary for accreditation of professional teacher education programs, or provides teacher candidates with content and pedagogical knowledge, and quality clinical preparation, and has rigorous teacher candidate exit qualifications.
States are required to report these outcomes for teacher preparation programs within their jurisdiction, with the only exceptions being for small programs for which aggregation under § 612.4(b)(3)(ii) would not yield the program size threshold (or for a State that chooses a lower program size threshold, would not yield the lower program size threshold) for that program, and for any program where reporting data would lead to conflicts with Federal or State privacy and confidentiality laws and regulations.
In § 612.5, the Department requires that States assess the performance of teacher preparation programs based in part on data on the aggregate learning outcomes of students taught by novice teachers prepared by those programs. States have the option of calculating these outcomes using student growth, a teacher evaluation measure that includes student growth, another State-determined measure relevant to calculating student learning outcomes, or a combination of the three. Regardless of how they determine student learning outcomes, States are required to link these data to novice teachers and their teacher preparation programs. In the NPRM, we used available sources of information to assess the extent to which States appeared to already have the capacity to measure student learning outcomes and estimated the additional costs States that did not currently have the capacity might incur in order to comply with the regulations. However, in these final regulations, the Department has expanded the definition of “teacher evaluation measure” and provided States with the discretion to use a State-determined measure relevant to calculating student learning outcomes, which they did not have in the proposed regulations. In our initial estimates, the Department assumed that only eight States would experience costs associated with measuring student learning outcomes. Of those, the Department noted that two already had annual teacher evaluations that included at least some objective evidence of student learning. For these two States, we estimated it would cost approximately $596,720 to comply with the proposed regulations. For the six remaining States, we estimated a cost of $16,079,390. We note that several commenters raised concerns about the specifics of some of our assumptions in making these estimates, particularly the amount of time we assumed it would take to complete the tasks we described. We outline and respond to those comments below. However, given the revised definition of “teacher evaluation measure,” the additional option for States to use a State-defined measure other than student growth or a teacher evaluation measure, and the measures that States are already planning to implement consistent with ESSA, we believe all States either already have in place a system for measuring student learning outcomes or are already planning to have one in place absent these regulations. As such, we no longer believe that States will incur costs associated with measuring student learning outcomes solely as a result of these regulations.
In the NPRM, we assumed that the States would not need to incur any additional costs to measure student growth for tested grades and subjects and would only need to link these outcomes to teacher preparation programs by first linking the students' teachers to the teacher preparation program from which they graduated. The costs of linking student learning outcomes to teacher preparation programs are discussed below. Several commenters stated that assuming no costs for teachers in tested grades and subjects was unrealistic because this estimate was based on assurances provided by States, rather than on an assessment of actual State practice. We recognize the commenters' point. States that have made assurances to provide these student growth data may not currently be providing this information
In the NPRM, we assumed that the District of Columbia, Puerto Rico, and the 42 States that had their requests for flexibility regarding specific requirements of the ESEA approved would not incur additional costs to comply with the proposed regulations. This was, in part, because the teacher evaluation measures that they agreed to implement as part of the flexibility would meet the definition of a "teacher evaluation measure" under the proposed regulations. Some commenters expressed doubt that there would be no additional costs for these States, and others cited costs associated with developing new assessments for all currently non-tested grades and subjects (totaling as many as 57 new assessments). We recognize that States likely incurred costs to implement statewide comprehensive teacher evaluations. However, those additional costs did not accrue to States as a result of the regulations, but instead as part of their efforts under flexibility agreements. Therefore, we do not include an analysis of costs for States that received ESEA flexibility herein. Additionally, as noted previously, the regulations do not require States to develop new assessments for all currently non-tested grades and subjects. Therefore, we do not include costs for such efforts in these estimates.
To estimate, in the NPRM, the cost of measuring student growth for teachers in non-tested grades and subjects in the eight States that were not approved for ESEA flexibility, we divided the States into two groups—those who had annual teacher evaluations with at least some objective evidence of student learning outcomes and those that did not.
For those States that did not have an annual teacher evaluation in place, we estimated that it would take approximately 6.85 hours of a teacher's time and 5.05 hours of an evaluator's time to measure student growth using student learning objectives. Two commenters stated that these were underestimates, specifically noting that certain student outcomes (
In fact, we believe that this estimate likely overstated the cost to States that already require annual evaluations of all novice teachers because many of these evaluations would already encompass many of the activities in the framework. The National Council on Teacher Quality has reported that two of the eight States that did not receive ESEA flexibility required annual evaluations of all novice teachers and that those evaluations included at least some objective evidence of student learning. In these States, we initially estimated that teachers and evaluators would need to spend only a combined three hours to develop and measure against student learning objectives for the 4,629 novice teachers in these States.
Several commenters stated that their States did not currently have these data, and others argued that this estimate did not account for the costs of verifying the data. We understand that States may not currently have structures in place to measure student learning outcomes as defined in the proposed rules. However, we believe that the revisions in the final rule provide sufficient flexibility to States to ensure that they can meet the requirements of this section without incurring additional measurement costs as a result of compliance with this regulation. We have included costs for challenging data elsewhere in these estimates.
Whether using student scores on State assessments, teacher evaluation ratings, or other measures of student growth, under the regulations States must link the student learning outcomes data back to the teacher, and then back to that teacher's preparation program. The costs to States to comply with this requirement will depend, in part, on the data and linkages in their statewide longitudinal data system. Through the Statewide Longitudinal Data Systems (SLDS) program, the Department has awarded $575.7 million in grants to support data systems that, among other things, allow States to link student achievement data to individual teachers and to postsecondary education systems. Forty-seven States, the District of Columbia, and the Commonwealth of Puerto Rico have already received at least one grant under this program to support the development of these data systems, so we expect that the cost to these States of linking student learning outcomes to teacher preparation programs would be lower than for the remaining States.
According to information from the SLDS program in June 2015, nine States currently link K-12 teacher data, including data on both teacher/administrator evaluations and teacher preparation programs, to K-12 student data. An additional 11 States and the District of Columbia are currently in the process of establishing this linkage, and ten States and the Commonwealth of Puerto Rico have plans to add this linkage to their systems during their SLDS grant. Based on this information, it appears that 30 States, the Commonwealth of Puerto Rico, and the District of Columbia either already have the ability to aggregate data on the achievement of students taught by program graduates and link those data back to teacher preparation programs or have committed to doing so; therefore, we do not estimate any additional costs for these States to comply with this aspect of the regulations. We note that, based on information from other Department programs and initiatives, a larger number of States currently make these linkages and would therefore incur no additional costs associated with the regulations. However, for purposes of this estimate, we use data from the SLDS program. As a result, these estimates are likely overestimates of the actual costs borne by States to make these data connections.
During the development of the regulations, the Department consulted with experts familiar with the development of student growth models and longitudinal data systems. These experts indicated that the cost of calculating growth for students taught by individual teachers and aggregating these data according to the teacher preparation program that these teachers completed would vary among States. For example, in States in which data on teacher preparation programs are housed within different or even multiple different postsecondary data systems that are not currently linked to data systems for elementary through secondary education students and teachers, these experts suggested that a
Several commenters stated that their States do not currently have the ability to make these linkages and would need to update their data systems, and that even States that already have these linkages may need to make system updates. We recognize that some States for which we assume no costs do not yet have the required functionality in their State data systems to make the links required under the regulations. However, as noted elsewhere, we reasonably rely on the assurances made by States that they are already planning on establishing these links, and are not doing so as a result of the regulations. As a result, we do not estimate costs for those States here. With regard to States that already have systems with these links in place, we are not aware of any updates that will need to be made to any of these systems solely in order to comply with the regulations, and therefore estimate no additional costs to these States.
The final regulations require States to report employment outcomes, including data on both the teacher placement rate and the teacher retention rate, and on the effectiveness of a teacher preparation program in preparing, placing, and supporting novice teachers consistent with local educational needs. We have limited information on the extent to which States currently collect and maintain data on placement and retention for individual teachers.
Under § 612.4(b), States are required to report annually, for each teacher preparation program, on the teacher placement rate for traditional teacher preparation programs, the teacher placement rate calculated for high-need schools for all teacher preparation programs (whether traditional or alternative route), the teacher retention rate for all teacher preparation programs (whether traditional or alternative route), and the teacher retention rate calculated for high-need schools for all teacher preparation programs (whether traditional or alternative route). States are not required to report on the teacher placement rate for alternative route programs. The Department has defined the “teacher placement rate” as the percentage of recent graduates who have become novice teachers (regardless of retention) for the grade level, span, and subject area in which they were prepared. “High-need schools” is defined in § 612.2(d) by using the definition of “high-need school” in section 200(11) of the HEA. The regulations will give States discretion to exclude recent graduates from this measure if they are teaching in a private school, teaching in another State, teaching in a position that does not require State certification, enrolled in graduate school, or engaged in military service.
Section 612.5(a)(2) and the definition of "teacher retention rate" in § 612.2 require a State to provide data on each teacher preparation program's teacher retention rate, by calculating, for each of the last three cohorts of novice teachers preceding the current title II reporting year, the percentage of those teachers who have been continuously employed as teachers of record in each year between their first year as a novice teacher and the current reporting year. For the purposes of this definition, a cohort of novice teachers is determined by the first year in which they were identified as a novice teacher by the State. "High-need school" is defined in § 612.2 by using the definition of "high-need school" from section 200(11) of the HEA. The regulations give States discretion to exclude novice teachers from this measure if they are teaching in a private school or another State, enrolled in graduate school, or serving in the military. States also have the discretion to treat this rate differently for alternative route and traditional route providers.
In its comments on the Department's Notice of Intention to Develop Proposed Regulations Regarding Teacher Preparation Reporting Requirements, the Data Quality Campaign reported that 50 States, the District of Columbia, and the Commonwealth of Puerto Rico all collect some certification information on individual teachers and that a subset of States collect the following specific information on teacher preparation or qualifications that is relevant to the requirements: Type of teacher preparation program (42 States), location of teacher preparation program (47 States), and year of certification (51 States).
Data from the SLDS program indicate that 24 States can currently link data on individual teachers with their teacher preparation programs, including information on their current certification status and placement. In addition, seven States are currently in the process of making these links, and 10 States plan to add this capacity to their data systems, but have not yet established the link and process for doing so. Because these States would also maintain information on the certification status and year of certification of individual teachers, we assume they would already be able to calculate the teacher placement and retention rates for novice teachers but may incur additional costs to identify recent graduates who are not employed in a full-time teaching position within the State. It should be possible to do this at minimal cost by matching rosters of recent graduates from teacher preparation programs against teachers employed in full-time teaching positions who received their initial certification within the last three years. Additionally, because States already maintain the necessary information in State databases to identify schools as “high-need,” we do not believe there would be any appreciable additional cost associated with adding “high-need” flags to any accounting of teacher retention or placement rates in the State.
Several commenters stated that it was unrealistic to assume that any States currently had the information required under the regulations, given that the requirements are new. While we recognize that States may not have conducted these specific data analyses in the past, this does not mean that their systems are incapable of doing so. In fact, as outlined above, information available to the Department indicates that at least 24 States already have this capacity and that an additional 17 are in the process of developing it or plan to do so. Therefore, regardless of whether the specific data analysis itself is new, these States will not incur additional costs associated with the final regulations to establish that functionality.
The remaining 11 States may need to collect additional information from teacher preparation programs and LEAs because they do not appear to be able to link information on the employment,
A number of commenters stated that IHEs would experience substantial burden in obtaining this information from all graduates. We agree that teacher preparation programs individually tracking and contacting their recent graduates would be highly burdensome and inefficient. However, in the regulations, the reporting burden falls on States, rather than institutions. As such, we believe it would be inappropriate to assume data collection costs and reporting burdens accruing to institutions.
For each of these 11 States, the Department originally estimated that 150 hours may be required at the State level to collect information about novice teachers employed in full-time teaching positions (including designing the data collection instruments, disseminating them, providing training or other technical assistance on completing the instruments, collecting the data, and checking their accuracy). Several commenters stated that the Department's estimates were too low. One commenter estimated that this process would take 350 hours. Another commenter indicated that his State takes approximately 100 hours to collect data on first-year teachers and that data collection on more cohorts would take more time. Generally, the Department believes that this sort of data collection is subject to economies of scale: for each additional cohort on which data are collected in a given year, the average time and cost associated with each cohort will decrease. This is because many of the costs associated with such a collection, such as designing the data request instruments and disseminating them, are largely fixed. As such, we do not think that collecting data on three cohorts will take three times as long as collecting data on one. However, we do recognize that there could be wide variation across States depending on the complexity of their systems and the way in which they opt to collect these data. For example, a State that sends data requests to individual LEAs to query their own data systems will experience a much higher overall burden under this provision than one that sends data requests to a handful of analysts at the State level who perform a small number of queries on State databases. Because of this potentially wide variation in burden across States, it is difficult to accurately estimate an average. Based on public comment, however, we recognize that our initial estimate may have been too low. At the same time, we believe that States will make every effort to reduce the burdens associated with this provision. As such, we are increasing our estimate to 200 hours, with the expectation that this may vary widely across States. Using this estimate, we calculate a total annual cost to the 11 States of $112,130, based on the national average hourly wage for education administrators of $50.97.
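As an illustrative check, the annual cost to the 11 States follows from the revised figures.

```python
# Illustrative check of the employment-data collection estimate for the 11 States.
STATES = 11
HOURS = 200          # revised from the original 150-hour estimate
ADMIN_WAGE = 50.97   # national average hourly wage, education administrators

print(f"${round(STATES * HOURS * ADMIN_WAGE, -1):,.0f}")  # $112,130 per year
```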
Under § 612.5(a)(4), States are required to report whether each teacher preparation program in the State either: (a) Is accredited by a specialized accrediting agency recognized by the Secretary for accreditation of professional teacher education programs, or (b) provides teacher candidates with content and pedagogical knowledge and quality clinical preparation, and has rigorous teacher candidate exit standards. As discussed in greater detail in the Paperwork Reduction Act section of this document, we estimate that the total cost to the 50 States, the District of Columbia, and the Commonwealth of Puerto Rico of providing these assurances for the estimated 15,335 teacher preparation programs nationwide that States have already determined, based on previous title II reporting submissions, to be accredited would be $790,670, assuming that 2 hours are required per teacher preparation program and using an estimated hourly wage of $25.78. Several commenters argued that these estimates did not accurately reflect the costs associated with seeking specialized accreditation. We agree with this statement. However, the regulations do not require programs to seek specialized accreditation. Thus, there would be no additional costs associated with this requirement for programs that are already seeking or have obtained specialized accreditation. If teacher preparation programs that do not currently have specialized accreditation decide to seek it, they would not be doing so because of a requirement in these regulations, and therefore, it would be inappropriate to include those costs here.
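As an illustrative check, the accreditation-assurance figure follows from the stated assumptions.

```python
# Illustrative check of the accreditation-assurance reporting estimate.
PROGRAMS = 15_335      # programs already determined to be accredited
HOURS_PER_PROGRAM = 2
HOURLY_WAGE = 25.78    # dollars per hour

print(f"${round(PROGRAMS * HOURS_PER_PROGRAM * HOURLY_WAGE, -1):,.0f}")  # $790,670
```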
The Department requires States to report—disaggregated for each teacher preparation program—qualitative and quantitative data from surveys of novice teachers and their employers in order to capture their perceptions of whether novice teachers who were prepared at a teacher preparation program in that State possess the skills needed to succeed in the classroom. The design and implementation of these surveys would be determined by the State, but we provide the following estimates of costs associated with possible options for meeting this requirement.
Some States and IHEs currently survey graduates or recent graduates of teacher preparation programs. According to experts consulted by the Department, depending on the number of questions and the size of the sample, some of these surveys have been administered quite inexpensively. Oregon conducted a survey of a stratified random sample of approximately 50 percent of its teacher preparation program graduates and estimated that it cost $5,000 to develop and administer the survey and $5,000 to analyze and report the data. Since these data will be used to assess and publicly report on the quality of each teacher preparation program, we expect that the cost of implementing the proposed regulations is likely to be higher, because States may need to survey a larger sample of teachers and their employers in order to capture information on all teacher preparation programs.
Another potential factor in the cost of the teacher and employer surveys would be the number and type of questions. We have consulted with researchers experienced in the collection of survey data, and they have indicated that it is important to balance the burden on the respondent with the need to collect adequate information. In addition to asking teachers and their employers whether graduates of particular teacher preparation programs are adequately prepared before entering the classroom, States may also wish to ask about course-taking and student teaching experiences, as well as to collect demographic information on the respondent, including information on the school environment in which the
Based on our consultation with experts and previous experience conducting surveys of teachers through evaluations of Department programs or policies, we originally estimated that it would cost the average State approximately $25,000 to develop the survey instruments, including instructions for the survey recipients. However, a number of commenters argued that this development cost estimate was far too low. Alternate estimates provided by commenters ranged from $50,000 per State to $200,000, with the majority of commenters offering a $50,000 estimate. As such, the Department has revised its original estimate to $50,000. This provides a total cost to the 50 States, the District of Columbia, and the Commonwealth of Puerto Rico of $2,600,000. However, we recognize that the cost would be lower for States that identify an existing instrument that could be adapted or used for this purpose, potentially including survey instruments previously developed by other States.
To estimate the cost of administering these surveys, we consulted researchers with experience conducting a survey of all recent graduates of teacher preparation programs in New York City.
The California State School Climate Survey (CSCS) is one portion of the larger California School Climate, Health, & Learning Survey, designed to survey teachers and staff to address questions of school climate. While the CSCS is subsidized by the State of California, it is also offered to school districts outside of the State for a fee, ranging from $500 to $1,500 per district, depending on its enrollment size. Applying this cost structure to all school districts nationwide with reported enrollment (as outlined in the Department's Common Core of Data), we estimated in the NPRM that costs would range from a low of $0.05 per full-time equivalent (FTE) teacher to $500 per FTE teacher, with an average of $21.29 per FTE teacher. However, these costs are inflated by single-school, single-teacher districts, which are largely either charter schools or small, rural school districts unlikely to administer separate surveys. When single-school, single-teacher districts are removed, the average cost per respondent decreases to $12.27.
Given the cost savings associated with online administration of surveys and the likelihood that States will fold these surveys into existing structures, we believe that many of these costs are likely overestimates of the actual costs that States will bear in administering these surveys. However, for purposes of estimating costs in this context, we use a rate of $30.33 per respondent, which represents a cost per respondent at the 85th percentile of the CSCS administration and well above the maximum administration cost for popular consumer survey software. One commenter stated that the Department's initial estimate was appropriate, but also suggested that, to reduce costs further, a survey could be administered less frequently than annually or to only a subset of novice teachers. One commenter argued that this estimate was too low and provided an alternate estimate of aggregate costs for their State of $300,000 per year. We note, however, that this commenter's alternate estimate actually reflects a lower cost per respondent than the Department's initial estimate: approximately $25 per respondent, compared with $30.33. Another commenter argued that administration of the survey would cost $100 per respondent. Some commenters also argued that administering the survey would require additional staff. Given the information discussed above and that public comment was divided on whether our estimate was too high, too low, or appropriate, we do not believe there is adequate reason to change our initial estimate of $30.33 per respondent. Some States may bear the administration costs by hiring additional staff, while others will contract with an outside entity to administer the survey. In either case, we believe our original estimates to be reasonable. Using that estimate, we estimate that, if States surveyed a combined sample of 180,744 teachers and an equivalent number of
If States surveyed all teacher preparation program graduates and their employers, assuming that both the teacher and employer surveys would take no more than 30 minutes to complete, that the employers are likely to be principals or district administrators, and that 70 percent of the teachers and employers surveyed would respond, the total estimated burden for 126,521 teachers and their 126,521 employers of completing the surveys would be $2,635,430 and $3,224,390, respectively, based on the national average hourly wage of $41.66 for elementary and secondary public school teachers and $50.97 for elementary and secondary school administrators. These costs would vary depending on the extent to which a State determines that it can measure these outcomes based on a sample of novice teachers and their employers. This may depend on the distribution of novice teachers prepared by teacher preparation programs throughout the LEAs and schools within each State, and also on whether some of this information is available from existing sources, such as surveys of recent graduates conducted by teacher preparation programs as part of their accreditation process.
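These burden figures are consistent with the following illustrative sketch, which assumes that the 126,521 respondents in each group represent a 70 percent response rate among 180,744 surveyed teachers and an equal number of employers.

```python
# Illustrative check of the survey-completion burden for teachers and employers.
COMPLETERS = 180_744
RESPONSE_RATE = 0.70                              # assumed response rate
respondents = round(COMPLETERS * RESPONSE_RATE)   # ~126,521 in each group

SURVEY_HOURS = 0.5     # 30 minutes per survey
TEACHER_WAGE = 41.66   # elementary and secondary public school teachers
ADMIN_WAGE = 50.97     # elementary and secondary school administrators

print(f"${round(respondents * SURVEY_HOURS * TEACHER_WAGE, -1):,.0f}")  # $2,635,430
print(f"${round(respondents * SURVEY_HOURS * ADMIN_WAGE, -1):,.0f}")    # $3,224,390
```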
One commenter stated that principals would be unlikely to complete these surveys unless paid to do so. We recognize that some administrators may see these surveys as a burden and may be less willing to complete these surveys. However, we believe that States will likely take this factor into consideration when designing and administering these surveys by either reducing the amount of time necessary to complete the surveys, providing a financial incentive to complete them, or incorporating the surveys into other, pre-existing instruments that already require administrator input. Some States may also simply make completion a mandatory part of administrators' duties.
As discussed in greater detail in the Paperwork Reduction Act section of this document, § 612.4 includes several requirements for which States must annually report on the SRC. Using an estimated hourly wage of $25.78, we estimate that the total cost for the 50 States, the District of Columbia, and the Commonwealth of Puerto Rico to report the following required information in the SRC would be: Classifications of teacher preparation programs ($370,280, based on 0.5 hours per 28,726 programs); assurances of accreditation ($98,830, based on 0.25 hours per 15,335 programs); State's weighting of the different indicators in § 612.5 ($340 annually, based on 0.25 hours per State); State-level rewards and consequences associated with the designated performance levels ($670 in the first year and $130 thereafter, based on 0.5 hours per State in the first year and 0.1 hours per State in subsequent years); method of program aggregation ($130 annually, based on 0.1 hours per State); and process for challenging data and program classification ($4,020 in the first year and $1,550 thereafter, based on 3 hours per State in the first year and 6 hours for 10 States in subsequent years).
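As an illustrative check only, each itemized figure above is the product of hours per unit, the number of units, and the $25.78 hourly wage, rounded to the nearest $10; the "10 States" factor for later-year challenges reflects the assumption stated in the preceding sentence.

```python
# Illustrative check of the itemized SRC reporting costs.
WAGE = 25.78
STATES = 52  # 50 States, the District of Columbia, and Puerto Rico

burden_hours = {
    "program classifications":                       0.5 * 28_726,
    "accreditation assurances":                      0.25 * 15_335,
    "indicator weighting":                           0.25 * STATES,
    "rewards/consequences (first year)":             0.5 * STATES,
    "rewards/consequences (later years)":            0.1 * STATES,
    "method of aggregation":                         0.1 * STATES,
    "data/classification challenges (first year)":   3 * STATES,
    "data/classification challenges (later years)":  6 * 10,  # 10 States, 6 hours each
}

for item, hours in burden_hours.items():
    print(f"{item}: ${round(hours * WAGE, -1):,.0f}")
```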
The Department's initial estimates also included costs associated with the examination of data collection quality (5.3 hours per State annually) and recordkeeping and publishing related to appeal decisions (5.3 hours per State). However, one commenter stated that the examination of data quality would require a high level of scrutiny and more time than originally estimated, and that our estimate for recordkeeping and publishing was low. Additionally, several commenters responded generally to the overall cost estimates in the NPRM with concerns about data quality and review. In response to these general concerns, and upon further review, the Department believes that States are likely to engage in a more robust data quality review process in response to these regulations. Furthermore, we believe that the associated documentation and recordkeeping estimates may have been lower than those reasonably expected by States. As such, the Department has increased its estimate of the time required from the original 5.3 hours to 10 hours in both cases. These changes result in an estimated cost of $13,410 for each of the two components. The sum of these annual reporting costs would be $495,960 for the first year and $492,950 in subsequent years, based on a cumulative burden of 19,238 hours in the first year and 19,121 hours in subsequent years.
In addition, a number of commenters expressed concern that our estimates included time and costs associated with challenging data and program classification but did not reflect time and costs associated with allowing programs to actually review data in the SRC to ensure that the teachers attributed to them were actual recent program graduates. We agree that program-level review of these data may be necessary, particularly in the first few years, in order to ensure valid and reliable data. As such, we have revised our cost estimates to include time for programs to individually review data reports to ensure their accuracy. We assume that this review will largely consist of matching lists of recent teacher preparation program graduates with prepopulated lists provided by the State. Based on the number of program completers during the 2013-2014 academic year and the total number of teacher preparation programs in that year, we estimate that the average program would review a list of 19 recent graduates (180,744 program completers per year, across three cohort years, divided by 27,914 programs). As such, we do not believe this review will take a considerable amount of time. However, to ensure that we estimate sufficient time for this review, we estimate 1 hour per program, for a total cost for the 27,914 teacher preparation programs of $719,620.
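As an illustrative check, both figures in the preceding paragraph follow from the stated counts.

```python
# Illustrative check of the program-level data review estimate.
PROGRAMS = 27_914
COMPLETERS_PER_YEAR = 180_744
WAGE = 25.78

avg_list_size = COMPLETERS_PER_YEAR * 3 / PROGRAMS   # three cohorts of recent graduates
review_cost = PROGRAMS * 1 * WAGE                    # 1 hour of review per program

print(round(avg_list_size))                  # ~19 recent graduates per program
print(f"${round(review_cost, -1):,.0f}")     # $719,620
```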
Under § 612.5, States would also incur burden to enter the required aggregated information on student learning, employment, and survey outcomes into the information collection instrument for each teacher preparation program. Using the estimated hourly wage rate of $25.78, we estimate the following cumulative costs to the 50 States, the District of Columbia, and Puerto Rico to report on 27,914 teacher preparation programs and 812 teacher preparation programs provided through distance education: Annual reporting on student learning outcomes ($1,851,390 annually, based on 2.5 hours per program); annual reporting of employment outcomes ($2,591,950 annually, based on 3.5 hours per program); and annual reporting of survey outcomes ($740,560 annually, based on 1 hour per program).
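As an illustrative check, each of these outcome-reporting figures follows from the stated program counts and hourly rate.

```python
# Illustrative check of the outcome-reporting costs across all programs.
PROGRAMS = 27_914 + 812   # includes programs provided through distance education
WAGE = 25.78

print(f"${round(PROGRAMS * 2.5 * WAGE, -1):,.0f}")   # student learning outcomes: $1,851,390
print(f"${round(PROGRAMS * 3.5 * WAGE, -1):,.0f}")   # employment outcomes: $2,591,950
print(f"${round(PROGRAMS * 1.0 * WAGE, -1):,.0f}")   # survey outcomes: $740,560
```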
After publication of the NPRM, we recognized that our initial estimates did not include costs or burden associated with States' reporting data on any other indicators of academic content knowledge and teaching skills. To the
Our estimate of the total annual cost of reporting these outcome measures on the SRC related to § 612.5 is $5,924,460, based on 229,808 hours.
The principal benefits related to the evaluation and classification of teacher preparation programs under the regulations are those resulting from the reporting and public availability of information on the effectiveness of teachers prepared by teacher preparation programs within each State. The Department believes that the information collected and reported as a result of these requirements will improve the accountability of teacher preparation programs, both traditional and alternative route to certification programs, for preparing teachers who are equipped to succeed in classroom settings and help their students reach their full potential.
Research studies have found significant and substantial variation in teaching effectiveness among individual teachers and some variation has also been found among graduates of different teacher preparation programs.
The Department recognizes that simply requiring States to assess the performance of teacher preparation programs and report this information to the public will not produce increases in student achievement, but it is an important part of a larger set of policies and investments designed to attract talented individuals to the teaching profession; prepare them for success in the classroom; and support, reward, and retain effective teachers. In addition, the Department believes that, once information on the performance of teacher preparation programs is more readily available, a variety of stakeholders will become better consumers of these data, which will ultimately lead to improved student achievement by influencing the behavior of States seeking to provide technical assistance to low-performing programs, IHEs engaging in deliberate self-improvement efforts, prospective teachers seeking to train at the highest quality teacher preparation programs, and employers seeking to hire the most highly qualified novice teachers.
Louisiana has already adopted some of the proposed requirements and has begun to see improvements in teacher preparation programs. Based on data suggesting that the English Language Arts program at the University of Louisiana at Lafayette was producing teachers who were less effective than novice teachers prepared by other programs, Louisiana identified the program in 2008 as being in need of improvement and provided additional analyses of the qualifications of the program's graduates and of the specific areas in which the students taught by those graduates appeared to be struggling.
This is one example, but it suggests that States can use data on student learning outcomes for graduates of teacher preparation programs to help these programs identify weaknesses and implement needed reforms in a reasonable amount of time. As more information becomes available and if the data indicate that some programs produce more effective teachers, LEAs seeking to hire novice teachers will prefer to hire teachers from those programs. All things being equal, aspiring teachers will elect to pursue their degrees or certificates at teacher preparation programs with strong student learning outcomes, placement and retention rates, survey outcomes, and other measures.
The final regulations link program eligibility for participation in the TEACH Grant program to the State assessment of program quality under 34 CFR part 612. Under §§ 686.11(a)(1)(iii) and 686.2(d), to be eligible to receive a TEACH Grant for a program, an individual must be enrolled in a high-quality teacher preparation program—that is, a program that is classified by the State as effective or higher in either or both the October 2019 or October 2020 SRC for the 2021-2022 title IV, HEA award year; or, classified by the State as effective or higher in two out of
In addition to the referenced benefits of improved accountability under the title II reporting system, the Department believes that the regulations relating to TEACH Grants will also contribute to the improvement of teacher preparation programs. Linking program eligibility for TEACH Grants to the performance assessment by the States under the title II reporting system provides an additional factor for prospective students to consider when choosing a program and an incentive for programs to achieve a rating of effective or higher.
In order to analyze the possible effects of the regulations on the number of programs eligible to participate in the TEACH Grant program and the amount of TEACH Grants disbursed, the Department analyzed data from a variety of sources. This analysis focused on teacher preparation programs at IHEs. This is because, under the HEA, alternative route programs offered independently of an IHE are not eligible to participate in the TEACH Grant program. For the purpose of analyzing the effect of the regulations on TEACH Grants, the Department estimated the number of teacher preparation programs based on data from the Integrated Postsecondary Education Data System (IPEDS) about program graduates in education-related majors as defined by the Classification of Instructional Programs (CIP) codes and award levels. For the purposes of this analysis, “teacher preparation programs” refers to programs in the relevant CIP codes that also have the IPEDS indicator flag for being a State-approved teacher education program.
As detailed in the NPRM published December 3, 2014, in order to estimate how many programs might be affected by a loss of TEACH Grant eligibility, the Department had to estimate how many programs will be individually evaluated under the regulations, which encourage States to report on the performance of individual programs offered by IHEs rather than on the aggregated performance of programs at the institutional level as currently required. As before, the Department estimates that approximately 3,000 programs may be evaluated at the highest level of aggregation and approximately 17,000 could be evaluated if reporting is done at the most disaggregated level. Table 3 summarizes these two possible approaches to program definition that represent the opposite ends of the range of options available to the States. Based on IPEDS data, approximately 30 percent of programs defined at the six digit CIP code level have at least 25 novice teachers when aggregated across three years, so States may add one additional year to the analysis or aggregate programs with similar features to push more programs over the threshold, pursuant to the regulations. The actual number of programs at IHEs reported on will likely fall between these two points represented by Approach 1 and Approach 2. The final regulations define a teacher preparation program offered through distance education as a teacher preparation program at which at least 50 percent of the program's required coursework is offered through distance education and that starting with the 2021-2022 award year and subsequent award years, is not classified as less than effective, based on 34 CFR 612.4(b), by the same State for two out of the previous three years or meets the exception from State reporting of teacher preparation program performance under 34 CFR 612.4(b)(3)(ii)(D) or (E). The exact number of these programs is uncertain, but in the Supplemental NPRM concerning teacher preparation programs offered through distance education, the Department estimated that 812 programs would be reported. Whatever the number of programs, the TEACH Grant volume associated with these schools is captured in the amounts used in our
Given the number of programs and their TEACH Grant participation status as described in Table 3, the Department examined IPEDS data and the Department's budget estimates for 2017 related to TEACH Grants to estimate the effect of the regulations on TEACH Grants beginning with the FY 2021 cohort when the regulations would be in effect. Based on prior reporting, only 37 IHEs (representing an estimated 129 programs) were identified as having a low-performing or at-risk program in 2010, and 27 States have not identified any low-performing programs in 12 years. Given prior identification of such programs and the fact that the States would continue to control the classification of teacher preparation programs subject to analysis, the Department does not expect a large percentage of programs to be subject to a loss of eligibility for TEACH Grants. Therefore, the Department evaluated the effects on the amount of TEACH Grants disbursed and the number of recipients on the basis of the States classifying a range of three percent, five percent, or eight percent of programs to be low-performing or at-risk. These results are summarized in Table 4. Ultimately, the number of programs affected is subject to the program definition, rating criteria, and program classifications adopted by the individual States, so the distribution of those effects is not known with certainty. However, the maximum effect, whatever the distribution, is limited by the amount of TEACH Grants made and the percentage of programs classified as low-performing and at-risk that participate in the TEACH Grant program. In the NPRM, the Department invited comments about the expected percentage of programs that will be found to be low-performing and at-risk. No specific comments were received, so the updated numbers based on the budget estimates for 2017 apply the same percentages as were used in the NPRM.
The estimated effects presented in Table 4 reflect assumptions about the likelihood of a program being ineligible and do not take into account the size of the program or participation in the TEACH Grant program. The Department had no program level performance information and treats the programs as equally likely to become ineligible for TEACH Grants. If, in fact, factors such as size or TEACH Grant participation were associated with high or low performance, the number of TEACH Grant recipients and TEACH Grant volume could deviate from these estimates.
Whatever the amount of TEACH Grant volume at programs found to be ineligible, the effect on IHEs will be reduced from the full amounts represented by the estimated effects presented here as students could elect to enroll in other programs at the same IHE that retain eligibility because they are classified by the State as effective or higher. Another factor that would reduce the effect of the regulations on programs and students is that an otherwise eligible student who received a TEACH Grant for enrollment in a TEACH Grant-eligible program is eligible to receive additional TEACH Grants to complete the program, even if that program loses status as a TEACH Grant-eligible program.
Several commenters expressed concern that linking TEACH Grant eligibility to the State's evaluation of the program would harm teacher development from, and availability to, poor and underserved communities. We believe that the pilot year, which provides some warning of program performance; the flexibility for States to develop their evaluation criteria; and the long history of programs performing above the at-risk or low-performing levels will reduce the possibility of this effect. The Department continues to expect that over time a large portion of the TEACH Grant volume now disbursed to students at programs that will be categorized as low-performing or at-risk will be shifted to programs that remain eligible. The extent to which this happens will depend on other factors affecting the students' enrollment decisions, such as in-State status, proximity to home or future employment locations, and the availability of programs of interest, but the Department believes that students will take into account a program's rating and the availability of TEACH Grants when looking for a teacher preparation program. As discussed in the Net Budget Impacts section of this RIA, the Department expects that the reduction in TEACH Grant volume will taper off as States identify low-performing and at-risk programs and those programs are improved or are no longer eligible for TEACH Grants. Because existing recipients will continue to have access to TEACH Grants, and incoming students will have notice and be able to consider the program's eligibility for TEACH Grants in making an enrollment decision, the reduction in TEACH Grant volume that is classified as a transfer from students at ineligible programs to the Federal government will be significantly reduced from the estimated range of approximately $3.0 million to approximately $8.0 million in Table 4 for the initial years the regulations are in effect. While we have no past experience with students' reaction to a designation of a program as low-performing and loss of TEACH Grant eligibility, we assume that, to the extent it is possible, students would choose to attend a program rated effective or higher. For IHEs, the effect of the loss of TEACH Grant funds will depend on the students' reaction and how many choose to enroll in an eligible program at the same IHE, choose to attend a different IHE, or make up for the loss of TEACH Grants by funding their program from other sources.
The Department does not anticipate that many programs will lose State approval or financial support. If this does occur, IHEs with such programs would have to notify enrolled and accepted students immediately, notify the Department within 30 days, and
The final regulations were developed through a negotiated rulemaking process in which different options were considered for several provisions. Among the alternatives the Department considered were various ways to reduce the volume of information States and teacher preparation programs are required to collect and report under the existing title II reporting system. One approach would have been to limit State reporting to items that are statutorily required. While this would reduce the reporting burden, it would not address the goal of enhancing the quality and usefulness of the data that are reported. Alternatively, by focusing the reporting requirements on student learning outcomes, employment outcomes, and teacher and employer survey data, and also providing States with flexibility in the specific methods they use to measure and weigh these outcomes, the regulations balance the desire to reduce burden with the need for more meaningful information.
Additionally, during the negotiated rulemaking session, some non-Federal negotiators spoke of the difficulty States would have developing the survey instruments, administering the surveys, and compiling and tabulating the results for the employer and teacher surveys. The Department offered to develop and conduct the surveys to alleviate additional burden and costs on States, but the non-Federal negotiators indicated that they preferred that States and teacher preparation programs conduct the surveys.
One alternative considered in carrying out the statutory directive to direct TEACH Grants to “high quality” programs was to limit eligibility only to programs that States classified as “exceptional”, positioning the grants more as a reward for truly outstanding programs than as an incentive for low-performing and at-risk programs to improve. In order to prevent a program's eligibility from fluctuating year-to-year based on small changes in evaluation systems that are being developed and to keep TEACH Grants available to a wider pool of students, including those attending teacher preparation programs producing satisfactory student learning outcomes, the Department and most non-Federal negotiators agreed that programs rated effective or higher would be eligible for TEACH Grants.
The final regulations related to the TEACH Grant program are estimated to have a net budget impact of $0.49 million in cost reduction over the 2016 to 2026 loan cohorts. These estimates were developed using the Office of Management and Budget's (OMB) Credit Subsidy Calculator. The OMB calculator takes projected future cash flows from the Department's student loan cost estimation model and produces discounted subsidy rates reflecting the net present value of all future Federal costs associated with awards made in a given fiscal year. Values are calculated using a “basket of zeros” methodology under which each cash flow is discounted using the interest rate of a zero-coupon Treasury bond with the same maturity as that cash flow. To ensure comparability across programs, this methodology is incorporated into the calculator and used Government-wide to develop estimates of the Federal cost of credit programs. Accordingly, the Department believes it is the appropriate methodology to use in developing estimates for these regulations. That said, in developing the following Accounting Statement, the Department consulted with OMB on how to integrate the Department's discounting methodology with the discounting methodology traditionally used in developing regulatory impact analyses.
Absent evidence of the impact of these regulations on student behavior, budget cost estimates were based on behavior as reflected in various Department data sets and longitudinal surveys. Program cost estimates were generated by running projected cash flows related to the provision through the Department's student loan cost estimation model. TEACH Grant cost estimates are developed across risk categories: Freshmen/sophomores at 4-year IHEs, juniors/seniors at 4-year IHEs, and graduate students. Risk categories have separate assumptions based on the historical pattern of the behavior of borrowers in each category—for example, the likelihood of default or the likelihood to use statutory deferment or discharge benefits.
As discussed in the TEACH Grants section of the
The estimated budget impact presented in Table 5 is defined against the PB 2017 baseline costs for the TEACH Grant program, and the actual volume of TEACH Grants in 2021 and beyond will vary. The budget impact estimate depends on the assumptions about the percent of TEACH Grant volume at programs that become ineligible and the share of that volume that is redistributed or reduced as shown in Table 5. Finally, absent evidence of different rates of loan conversion at programs that will be eligible or ineligible for TEACH Grants when these regulations are in place, the Department did not assume a different loan conversion rate as TEACH Grants shifted to programs rated effective or higher. However, given that placement and retention rates are one element of the program evaluation system, the Department does hope that, as students shift to programs rated effective, more TEACH Grant recipients will fulfill their service obligations. If this is the case, those recipients will benefit because their TEACH Grants will not convert to loans that must be repaid, and the expected cost reductions for the Federal government may be reduced or reversed because more of the TEACH Grants will remain grants and no loan payments on them will be made to the Federal government. The final regulations also change total and permanent disability discharge provisions related to TEACH Grants to be more consistent with the treatment of interest accrual for total and permanent discharges in the Direct Loan program. This is not expected to have a significant budget impact.
In addition to the TEACH Grant provision, the regulations include a provision that would make a program ineligible for title IV, HEA funds if the program was found to be low-performing and subject to the withdrawal of the State's approval or termination of the State's financial support. As noted in the NPRM, the Department assumes this will happen rarely and that the title IV, HEA funds involved would be shifted to other programs. Therefore, there is no budget impact associated with this provision.
As required by OMB Circular A-4 (available at
These regulations will affect IHEs that participate in the title IV, HEA programs (including TEACH Grants), alternative certification programs not housed at IHEs, States, and individual borrowers. The U.S. Small Business Administration (SBA) Size Standards define for-profit IHEs as “small businesses” if they are independently owned and operated and not dominant in their field of operation, with total annual revenue below $7,000,000. The SBA Size Standards define nonprofit IHEs as small organizations if they are independently owned and operated and not dominant in their field of operation, or as small entities if they are IHEs controlled by governmental entities with populations below 50,000. The revenues involved in the sector affected by these regulations, and the concentration of ownership of IHEs by private owners or public systems, mean that the number of title IV, HEA eligible IHEs that are small entities would be limited but for the fact that the nonprofit entities fit within the definition of a small organization regardless of revenue. The potential for some of the programs offered by entities subject to the final regulations to lose eligibility to participate in the title IV, HEA programs led to the preparation of this Final Regulatory Flexibility Analysis.
The Department has a strong interest in encouraging the development of highly trained teachers and ensuring that today's children have high quality and effective teachers in the classroom, and it seeks to help achieve this goal in these final regulations. Teacher preparation programs have operated without access to meaningful data that could inform them of the effectiveness of their graduates once those graduates go on to work in classroom settings.
The Department wants to establish a teacher preparation feedback mechanism premised upon teacher effectiveness. Under the final regulations, an accountability system would be established that would identify programs by quality so that effective teacher preparation programs could be recognized and rewarded and low-performing programs could be supported and improved. Data collected under the new system will help all teacher preparation programs make necessary corrections and continuously improve, while facilitating States' efforts to reshape and reform low-performing and at-risk programs.
We are issuing these regulations to better implement the teacher preparation program accountability and reporting system under title II of the HEA and to revise the regulations implementing the TEACH Grant program. Our key objective is to revise Federal reporting requirements, while reducing institutional burden, as appropriate. Additionally, we aim to have State reporting focus on the most important measures of teacher preparation program quality while tying TEACH Grant eligibility to assessments of program performance under the title II accountability system. The legal basis for these regulations is 20 U.S.C. 1022d, 1022f, and 1070g,
The final regulations related to title II reporting affect a larger number of entities, including small entities, than the smaller number of entities that could lose TEACH Grant eligibility or title IV, HEA program eligibility. The Department has more data on teacher preparation programs housed at IHEs than on those independent of IHEs. Whether evaluated at the aggregated institutional level or the disaggregated program level, as described in the TEACH Grant section of the
The Department has no indication that programs at small entities are more likely to be ineligible for TEACH Grants or title IV, HEA funds. Since all private not-for-profit IHEs are considered to be small because none are dominant in the field, we would expect about 5 percent of TEACH Grant volume at teacher preparation programs at private not-for-profit IHEs to be at ineligible programs. In AY 2014-15, approximately 43.7 percent of TEACH Grant disbursements went to private not-for-profit IHEs, and by applying that to the estimated TEACH Grant volume in 2021 of $95,918,782, the Department estimates that TEACH Grant volume at private not-for-profit IHEs in 2021 would be approximately $42.0 million. At the five percent low-performing or at-risk rate assumed in the TEACH Grants portion
In addition to the teacher preparation programs at IHEs included in Table 6, approximately 1,281 alternative certification programs offered outside of IHEs are subject to the reporting requirements in the regulations. The Department assumes that a significant majority of these programs are offered by non-profit entities that are not dominant in the field, so all of the alternative certification teacher preparation programs are considered to be small entities. However, the reporting burden for these programs falls on the States. As discussed in the Paperwork Reduction Act section of this document, the estimated total paperwork burden on IHEs would decrease by 66,740 hours. Small entities would benefit from this relief from the current institutional reporting requirements.
The final regulations are unlikely to conflict with or duplicate existing Federal regulations.
The Paperwork Reduction Act of 1995 (PRA) does not require you to respond to a collection of information unless it displays a valid OMB control number. We display the valid OMB control numbers assigned to the collections of information in these final regulations at the end of the affected sections of the regulations.
Sections 612.3, 612.4, 612.5, 612.6, 612.7, 612.8, and 686.2 contain information collection requirements. Under the PRA, the Department has submitted a copy of these sections, related forms, and Information Collection Requests (ICRs) to the Office of Management and Budget (OMB) for its review.
The OMB control number associated with the regulations and related forms is 1840-0837. Due to changes described in the
These regulations implement a statutory requirement that IHEs and States establish an information and accountability system through which IHEs and States report on the performance of their teacher preparation programs. Because parts of the regulations require IHEs and States to establish or scale up certain systems and processes in order to collect information necessary for annual reporting, IHEs and States may incur one-time start-up costs for developing those systems and processes. The burden associated with start-up and annual reporting is reported separately in this statement.
Section 205(a) of the HEA requires that each IHE that provides a teacher preparation program leading to State certification or licensure report on a statutorily enumerated series of data elements for the programs it provides. The HEOA revised a number of the reporting requirements for IHEs.
The final regulations under § 612.3(a) require that, beginning on April 1, 2018, and annually thereafter, each IHE that conducts traditional or alternative route teacher preparation programs leading to State initial teacher certification or licensure and that enrolls students receiving title IV, HEA funds report to the State on the quality of its programs using an IRC prescribed by the Secretary.
Under the current IRC, IHEs typically report at the entity level rather than the program level. For example, if an IHE offers multiple teacher preparation programs in a range of subject areas (for example, music education and special education), that IHE gathers data on each of those programs, aggregates the data, and reports the required information as a single teacher preparation entity on a single report card. Under the final regulations and for the reasons discussed in the NPRM and the preamble to this final rule, reporting is now required at the teacher preparation program level rather than at the entity level. No additional data must be gathered as a consequence of this regulatory requirement; instead, IHEs will simply report the required data before, rather than after, aggregation.
As a consequence, IHEs will not be required to alter appreciably their systems for data collection. However, the Department acknowledges that in order to communicate disaggregated data, minimal recordkeeping adjustments may be necessary. The Department estimates that initial burden for each IHE to adjust its recordkeeping systems will be 10 hours per entity. In the most recent year for which data are available, 1,490 IHEs reported required data to the Department through the IRC. Therefore, the Department estimates that the one-time total burden for IHEs to adjust recordkeeping systems will be 14,900 hours (1,490 IHEs multiplied by 10 burden hours per IHE).
The Department believes that IHEs' experience during prior title II reporting cycles has provided sufficient knowledge to ensure that IHEs will not incur any significant start-up burden, except for the change from entity-level to program-level reporting described above. Therefore, the subtotal of start-up burden for § 612.3 is 14,900 hours.
For a number of years IHEs have gathered, aggregated, and reported data on teacher preparation program characteristics, including those required under the HEOA, to the Department using the IRC approved under OMB control number 1840-0837. The required reporting elements of the IRC principally concern admissions criteria, student characteristics, clinical preparation, numbers of teachers prepared, accreditation of the program, and the pass rates and scaled scores of teacher candidates on State teacher certification and licensure examinations.
Given all of the reporting changes under these final rules as discussed in the NPRM, the Department estimates that each IHE will require 66 fewer burden hours to prepare the revised IRC annually. The Department estimates that each IHE will require 146 hours to complete the current IRC approved by OMB. There would thus be an annual burden of 80 hours to complete the revised IRC (146 hours minus 66 hours in reduced data collection). The Department estimates that 1,490 IHEs would respond to the IRC required under the regulations, based on reporting figures from the most recent year data are available. Therefore, reporting data using the IRC would represent a total annual reporting burden of 119,200 hours (80 hours multiplied by 1,490 IHEs).
As noted in the start-up burden section of § 612.3, under the current IRC, IHEs report teacher preparation program data at the entity level. The final regulations require that each IHE
Based on the most recent year of data available, the Department estimates that there are 27,914 teacher preparation programs nationwide, 24,430 of which are offered at the 1,490 IHEs that report through the IRC. Based on these figures, the Department estimates that, on average, each of these IHEs offers 16.40 teacher preparation programs. Because each IHE already collects disaggregated IRC data, the Department estimates it will take each IHE one additional hour to fill in existing disaggregated data into the electronic IRC for each teacher preparation program it offers. Because IHEs already submit one IRC per institution, we estimate that the added burden for program-level reporting will average 15.40 hours per IHE (an average of 16.40 programs at one hour per program, minus the one IRC already submitted). Therefore, there will be an overall burden increase of 22,946 hours each year associated with this regulatory reporting requirement (15.40 hours multiplied by 1,490 IHEs).
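The per-IHE averages above can be cross-checked against the program counts reported for 2015; the sketch below assumes the 24,430 IHE-based programs cited elsewhere in this burden statement and is illustrative only.

```python
# Cross-check of the added program-level reporting burden for IHEs.
PROGRAMS_AT_IHES = 24_430   # IHE-based programs reported for 2015 (see § 612.4 discussion)
IHES = 1_490                # IHEs that report through the IRC

average_programs = PROGRAMS_AT_IHES / IHES          # ~16.40 programs per IHE
added_hours_per_ihe = average_programs - 1          # one IRC is already filed per IHE
print(round(average_programs, 2), round(added_hours_per_ihe, 2))  # 16.4 15.4

print(round(15.40 * IHES))   # 22946 hours of added annual burden nationwide
```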
The regulations also require that the IHE provide the information reported on the IRC to the general public by prominently and promptly posting the IRC information on the IHE's Web site. Because the Department believes it is reasonable to assume that an IHE offering a teacher preparation program and communicating data related to that program by electronic means maintains a Web site, the Department presumes that posting such information to an already-existing Web site will represent a minimal burden increase. The Department therefore estimates that IHEs will require 0.5 hours (30 minutes) to meet this requirement. This would represent a total burden increase of 745 hours each year for all IHEs (0.5 hours multiplied by 1,490 IHEs).
Aggregating the annual burdens calculated under the preceding sections results in the following burdens: Together, all IHEs would incur a total burden of 119,200 hours to report data using the revised IRC, 22,946 hours to report program-level data, and 745 hours to post IRC data to their Web sites. This would constitute a total of 142,891 hours of annual burden nationwide.
Aggregating the start-up and annual burdens calculated under the preceding sections results in the following burdens: Together, all IHEs would incur a total start-up burden under § 612.3 of 14,900 hours and a total annual reporting burden under § 612.3 of 142,891 hours. This would constitute a total of 157,791 burden hours under § 612.3 nationwide.
The burden estimate for the existing IRC approved under OMB control number 1840-0837 was 146 hours for each IHE with a teacher preparation program. When the current IRC was established, the Department estimated that 1,250 IHEs would provide information using the electronic submission of the form for a total burden of 182,500 hours for all IHEs (1,250 IHEs multiplied by 146 hours). Applying these estimates to the current number of IHEs that are required to report (1,490) would constitute a burden of 217,540 hours (1,490 IHEs multiplied by 146 hours). Based on these estimates, the revised IRC would constitute a net burden reduction of 59,749 hours nationwide (217,540 hours minus 157,791 hours).
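The § 612.3 start-up, annual, and net-reduction figures fit together as follows; this is simply a cross-check of the totals above, not an independent estimate.

```python
# Cross-check of the § 612.3 burden totals for IHEs (hours).
IHES = 1_490

startup_hours = IHES * 10             # one-time recordkeeping adjustments: 14,900
revised_irc = IHES * (146 - 66)       # 80 hours per IHE for the revised IRC: 119,200
program_level = round(15.40 * IHES)   # program-level reporting: 22,946
web_posting = 0.5 * IHES              # posting IRC data to IHE Web sites: 745

annual_total = revised_irc + program_level + web_posting
print(annual_total)                   # 142891.0 hours of annual burden
print(startup_hours + annual_total)   # 157791.0 total hours under § 612.3

current_baseline = IHES * 146         # current IRC applied to 1,490 IHEs: 217,540
print(current_baseline - (startup_hours + annual_total))  # 59749.0-hour net reduction
```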
Section 205(b) of the HEA requires that each State that receives funds under the HEA provide to the Secretary and make widely available to the public not less than the statutorily required specific information on the quality of traditional and alternative route teacher preparation programs. The State must do so in a uniform and comprehensible manner, conforming with definitions and methods established by the Secretary. Section 205(c) of the HEA directs the Secretary to prescribe regulations to ensure the validity, reliability, accuracy, and integrity of the data submitted. Section 206(b) requires that IHEs assure the Secretary that their teacher training programs respond to the needs of LEAs, be closely linked with the instructional decisions novice teachers confront in the classroom, and prepare candidates to work with diverse populations and in urban and rural settings, as applicable.
Implementing the relevant statutory directives, the regulations under § 612.4(a) require that, starting October 1, 2019, and annually thereafter, each State report on the SRC the quality of all approved teacher preparation programs in the State, whether or not they enroll students receiving Federal assistance under the HEA, including distance education programs. This new SRC, to be implemented in 2019, is an update of the current SRC. The State must also make the SRC information widely available to the general public by posting the information on the State's Web site.
Section 103(20) of the HEA and § 612.2(d) of the regulations define “State” to include nine locations in addition to the 50 States: The Commonwealth of Puerto Rico, the District of Columbia, Guam, American Samoa, the United States Virgin Islands, the Commonwealth of the Northern Mariana Islands, and the Freely Associated States (the Republic of the Marshall Islands, the Federated States of Micronesia, and the Republic of Palau). For this reason, all reporting required of States explicitly enumerated under § 205(b) of the HEA (and the related portions of the regulations, specifically §§ 612.4(a) and 612.6(b)) applies to these 59 States. However, certain additional regulatory requirements (specifically §§ 612.4(b), 612.4(c), 612.5, and 612.6(a)) only apply to the 50 States of the Union, the Commonwealth of Puerto Rico, and the District of Columbia. The burden estimates under those portions of this report apply to those 52 States. For a full discussion of the reasons for the application of certain regulatory provisions to different States, see the preamble to the NPRM.
As noted in the start-up and annual burden sections of § 612.3, under the current information collection process, data are collected at the entity level, and the final regulations require data reporting at the program level. In 2015, States reported that there were 27,914 teacher preparation programs offered, including 24,430 at IHEs and 3,484 through alternative route teacher preparation programs not associated with IHEs. In addition, as discussed in the Supplemental NPRM, the Department estimates that the sections of these final regulations addressing teacher preparation programs offered through distance education will result in 812 additional reporting instances.
Section 612.4(a) codifies State reporting requirements expressly referenced in section 205(b) of the HEA; the remainder of § 612.4 provides for reporting consistent with the directives to the Secretary under sections 205(b) and (c) and the required assurance described in section 206(c).
The HEOA revised a number of the reporting requirements for States. The requirements of the SRC are more numerous than those contained in the IRC, but the reporting elements required in both are similar in many respects. In addition, the Department has successfully integrated reporting to the extent that data reported by IHEs in the IRC is pre-populated in the relevant fields on which the States are required to report in the SRC. In addition to the elements discussed in § 612.3 of this burden statement regarding the IRC, under the statute a State must also report on its certification and licensure requirements and standards, State-wide pass rates and scaled scores, shortages of highly qualified teachers, and information related to low-performing or at-risk teacher preparation programs in the State.
The SRC currently in use, approved under OMB control number 1840-0837, collects information on these elements. States have been successfully reporting information under this collection for many years. The burden estimate for the existing SRC was 911 burden hours per State. In the burden estimate for that SRC, the Department reported that 59 States were required to report data, equivalent to the current requirements. This represented a total burden of 53,749 hours for all States (59 States multiplied by 911 hours). This burden calculation was based on entity-level, rather than program-level, reporting (for a more detailed discussion of the consequences of this issue, see the sections on entity-level and program-level reporting in §§ 612.3 and 612.4). However, because relevant program-level data reported by the IHEs on the IRC will be pre-populated for States on the SRC, the burden associated with program-level reporting under § 612.4(a) will be minimal. Those elements that will require additional burden are discussed in the subsequent paragraphs of this section.
Using the calculations outlined in the NPRM and changes discussed above, the Department estimates that the total reporting burden for each State will be 243 hours (193 hours for the revised SRC plus the additional statutory reporting requirements totaling 50 hours). This would represent a reduction of 668 burden hours for each State to complete the requirements of the SRC, as compared to approved OMB collection 1840-0837 (911 burden hours under the current SRC compared to 243 burden hours under the revised SRC). The total burden for States to report this information would be 14,337 hours (243 hours multiplied by 59 States).
The final regulations also require that the State provide the information reported on the SRC to the general public by prominently and promptly posting the SRC information on the State's Web site. Because the Department believes it is reasonable to assume that each State that communicates data related to its teacher preparation programs by electronic means maintains a Web site, the Department presumes that posting such information to an already-existing Web site represents a minimal burden increase. The Department therefore estimates that States will require 0.5 hours (30 minutes) to meet this requirement. This would represent a total burden increase of 29.5 hours each year for all States (0.5 hours multiplied by 59 States).
As noted in the preceding discussion, there is no start-up burden associated solely with § 612.4(a). Therefore, the aggregate start-up and annual reporting burden associated with reporting elements under § 612.4(a) would be 14,366.5 hours (243 hours multiplied by 59 States plus 0.5 hours for each of the 59 States).
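The § 612.4(a) figures can be verified from the per-State estimates above; the short sketch below assumes the 59 reporting States and is a cross-check only.

```python
# Cross-check of the annual § 612.4(a) SRC reporting burden (hours).
REPORTING_STATES = 59          # all States required to submit an SRC
HOURS_PER_STATE = 193 + 50     # revised SRC plus additional statutory elements: 243

print(911 - HOURS_PER_STATE)                # 668-hour reduction per State
print(HOURS_PER_STATE * REPORTING_STATES)   # 14337 hours of SRC reporting
print(0.5 * REPORTING_STATES)               # 29.5 hours of Web posting
print(HOURS_PER_STATE * REPORTING_STATES + 0.5 * REPORTING_STATES)  # 14366.5 total
```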
The preceding burden discussion of § 612.4 focused on burdens related to the reporting requirements under section 205(b) of the HEA and reflected in 34 CFR 612.4(a). The remaining burden discussion of § 612.4 concerns reporting required under § 612.4(b) and (c).
Under § 612.4(b)(1), a State is required to make meaningful differentiations in teacher preparation program performance using at least three performance levels—low-performing teacher preparation program, at-risk teacher preparation program, and effective teacher preparation program—based on the indicators in § 612.5 and including employment outcomes for high-need schools and student learning outcomes.
The Department believes that State higher education authorities responsible for making State-level classifications of teacher preparation programs will require time to make meaningful differentiations in their classifications and determine whether alternative performance levels are warranted. States are required to consult with external stakeholders, review best practices by early adopter States that have more experience in program classification, and seek technical assistance.
States will also have to determine how they will make such classifications. For example, a State may choose to classify all teacher preparation programs on an absolute basis using a cut-off score that weighs the various indicators, or a State may choose to classify teacher preparation programs on a relative basis, electing to classify a certain top percentile as exceptional, the next percentile as effective, and so on. In exercising this discretion, States may choose to consult with various external and internal parties and discuss lessons learned with those States already making such classifications of their teacher preparation programs.
The Department estimates that each State will require 70 hours to make these determinations, and this would constitute a one-time total burden of 3,640 hours (70 hours multiplied by 52 States).
Under § 612.4(b)(3)(i)(A), for each teacher preparation program, a State must provide disaggregated data for each of the indicators identified pursuant to § 612.5. See the start-up burden section of § 612.5 for a more detailed discussion of the burden associated with gathering the indicator data required to be reported under this regulatory section. See the annual reporting burden section of § 612.4 for a discussion of the ongoing reporting burden associated with reporting
Under § 612.4(b)(3)(i)(B), a State is required to provide, for each teacher preparation program in the State, the State's assurance that the teacher preparation program either: (a) Is accredited by a specialized agency or (b) provides teacher candidates with content and pedagogical knowledge, quality clinical preparation, and rigorous teacher exit qualifications. See the start-up burden section of § 612.5 for a detailed discussion of the burden associated with gathering the indicator data required to be reported under this regulation. See the annual reporting burden section of § 612.4 for a discussion of the ongoing reporting burden associated with reporting these assurances. No further burden exists beyond the burden described in these two sections.
Under § 612.4(b)(2)(ii), a State must provide its weighting of the different indicators in § 612.5 for purposes of describing the State's assessment of program performance. See the start-up burden section of § 612.4 on stakeholder consultation for a detailed discussion of the burden associated with establishing the weighting of the various indicators under § 612.5. See the annual reporting burden section of § 612.4 for a discussion of the ongoing reporting burden associated with reporting these relative weightings. No further burden exists beyond the burden described in these two sections.
Under § 612.4(b)(2)(iii), a State must provide the State-level rewards or consequences associated with the designated performance levels. See the start-up burden section of § 612.4 on stakeholder consultation for a more detailed discussion of the burden associated with establishing these rewards or consequences. See the annual reporting burden section of § 612.4 for a discussion of the ongoing reporting burden associated with reporting these rewards or consequences. No further burden exists beyond the burden described in these two sections.
Under § 612.4(b)(3), a State must ensure that all of its teacher preparation programs in that State are represented on the SRC. The Department recognized that many teacher preparation programs consist of a small number of prospective teachers and that reporting on these programs could present privacy and data validity issues. After discussion and input from various non-Federal negotiators during the negotiated rulemaking process, the Department elected to set a required reporting program size threshold of 25. However, the Department realized that, on the basis of research examining accuracy and validity relating to reporting small program sizes, some States may prefer to report on programs smaller than 25. Section 612.4(b)(3)(i) permits States to report using a lower program size threshold. In order to determine the preferred program size threshold for its programs, a State may review existing research or the practices of other States that set program size thresholds to determine feasibility for its own teacher preparation program reporting. The Department estimates that such review will require 20 hours for each State, and this would constitute a one-time total burden of 1,040 hours (20 hours multiplied by 52 States).
Under § 612.4(b)(3), States must also report on the remaining small programs that do not meet the program size threshold the State chooses. States will be able to do so through a combination of two possible aggregation methods described in § 612.4(b)(3)(ii). The preferred aggregation methodology is to be determined by the States after consultation with a group of stakeholders. For a detailed discussion of the burden related to this consultation process, see the start-up burden section of § 612.4, which discusses the stakeholder consultation process. Apart from the burden discussed in that section, no other burden is associated with this requirement.
Under § 612.4(c), a State must consult with a representative group of stakeholders to determine the procedures for assessing and reporting the performance of each teacher preparation program in the State. This stakeholder group, composed of a variety of members representing viewpoints and interests affected by these regulations, must provide input on a number of issues concerning the State's discretion. There are four issues in particular on which the stakeholder group advises the State—
a. The relative weighting of the indicators identified in § 612.5;
b. The preferred method for aggregation of data such that performance data for a maximum number of small programs are reported;
c. The State-level rewards or consequences associated with the designated performance levels; and
d. The appropriate process and opportunity for programs to challenge the accuracy of their performance data and program classification.
The Department believes that this consultative process will require that the group convene at least three times to afford each of the stakeholder representatives multiple opportunities to meet and consult with the constituencies they represent. Further, the Department believes that members of the stakeholder group will require time to review relevant materials and academic literature and advise on the relative strength of each of the performance indicators under § 612.5, as well as any other matters requested by the State.
These stakeholders will also require time to advise whether any of the particular indicators will have more or less predictive value for the teacher preparation programs in their State, given its unique traits. Finally, because some States have already implemented one or more components of the regulatory indicators of program quality, these stakeholders will require time to review these States' experiences in implementing similar systems. The Department estimates that convening the stakeholder group multiple times, reviewing the relevant literature and other States' experiences, and making determinations unique to the particular State will take 900 hours for each State (60 hours per stakeholder multiplied by 15 stakeholders). This would constitute a one-time total of 46,800 hours for all States (900 hours multiplied by 52 States).
Aggregating the start-up burdens calculated under the preceding sections results in the following burdens: All States would incur a total burden of 3,640 hours to make meaningful differentiations in program classifications, 1,040 hours to determine the State's aggregation of small programs, and 46,800 hours to complete the stakeholder consultation process. This would constitute a total of 51,480 hours of start-up burden nationwide.
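The start-up total combines the three one-time components discussed above, as the following cross-check shows for the 52 States subject to §§ 612.4(b) and (c).

```python
# Cross-check of the one-time (start-up) State burden under §§ 612.4(b) and (c), in hours.
STATES = 52   # 50 States, the District of Columbia, and Puerto Rico

meaningful_differentiations = 70 * STATES      # 3,640 hours
program_size_review = 20 * STATES              # 1,040 hours
stakeholder_consultation = 60 * 15 * STATES    # 15 stakeholders at 60 hours each: 46,800

print(meaningful_differentiations + program_size_review + stakeholder_consultation)  # 51480
```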
The bulk of the State burden associated with assigning programs among classification levels should be in
Under § 612.4(b)(2)(i)(A), States must report on the indicators of program performance in § 612.5. For a full discussion of the burden related to the reporting of this requirement, see the annual reporting burden section of § 612.5. Apart from the burden discussed in this section, no other burden is associated with this requirement.
Under § 612.4(b)(2)(ii), each State must report the relative weight it places on each of the different indicators enumerated in § 612.5. The burden associated with this reporting is minimal: After the State, in consultation with a group of stakeholders, has made the determination about the percentage weight it will place on each of these indicators, reporting this information on the SRC is a simple matter of inputting a number for each of the indicators. Under § 612.5, this requires the State to input, at a minimum, a weight for each of eight general indicators of quality.
The eight indicators are—
a. Associated student learning outcome results;
b. Teacher placement results;
c. Teacher retention results;
d. Teacher placement rate calculated for high-need school results;
e. Teacher retention rate calculated for high-need school results;
f. Teacher satisfaction survey results;
g. Employer satisfaction survey results; and
h. Teacher preparation program characteristics.
This reporting burden will not be affected by the number of teacher preparation programs in a State, because such weighting applies equally to each program. Although the State has the discretion to add indicators, the Department does not believe that transmission of an additional figure representing the percentage weighting assigned to that indicator will constitute an appreciable burden increase. The Department therefore estimates that each State will incur a burden of 0.25 hours (15 minutes) to report the relative weighting of the regulatory indicators of program performance. This would constitute a total burden on States of 13 hours each year (0.25 hours multiplied by 52 States).
Similar to the reporting required under § 612.4(b)(2)(ii), after a State has made the requisite determination about rewards and consequences, reporting those rewards and consequences represents a relatively low burden. States must report this information on the SRC during the first year of implementation; the SRC could provide States with a drop-down list representing common rewards or consequences in use by early adopter States, and States could briefly describe any rewards or consequences not represented in the drop-down options. For subsequent years, the SRC could be pre-populated with the prior year's selected rewards and consequences, such that there will be no further burden associated with subsequent-year reporting unless the State altered its rewards and consequences. For these reasons, the Department estimates that States will incur, on average, 0.5 hours (30 minutes) of burden in the first year of implementation to report the State-level rewards and consequences, and 0.1 hours (6 minutes) of burden in each subsequent year. The Department therefore estimates that the total burden for the first year of implementation of this regulatory requirement will be 26 hours (0.5 hours multiplied by 52 States) and 5.2 hours each year thereafter (0.1 hours multiplied by 52 States).
Under § 612.4(b)(4), during the first year of reporting and every five years thereafter, States must report on the procedures they established in consultation with the group of stakeholders described under § 612.4(c)(1). The burden associated with the first and third of these four procedures, the weighting of the indicators and State-level rewards and consequences associated with each performance level, respectively, are discussed in the preceding paragraphs of this section.
The second procedure, the method by which small programs are aggregated, is a relatively straightforward reporting procedure on the SRC. Pursuant to § 612.4(b)(3)(ii), States are permitted to use either of two methods, or a combination of both, in aggregating small programs: a State can aggregate programs that are similar in teacher preparation subject matter, or it can aggregate using prior-year data, including data from multiple prior years. On the SRC, the State simply indicates the method it uses. The Department estimates that States will require 0.5 hours (30 minutes) to enter these data every fifth year. On an annualized basis, this would therefore constitute a total burden of 5.2 hours (0.5 hours multiplied by 52 States divided by five to annualize burden for reporting every fifth year).
The fourth procedure that States must report under § 612.4(b)(4) is the method by which teacher preparation programs in the State are able to challenge the accuracy of their data and the classification of their program. First, the Department believes that States will incur a paperwork burden each year from recordkeeping and publishing decisions on these challenges. Because the Department believes the instances of these appeals will be relatively rare, we estimate that each State will incur 10 hours of burden each year related to recordkeeping and publishing decisions. This would constitute an annual reporting burden of 520 hours (10 hours multiplied by 52 States).
After States and their stakeholder groups determine the preferred method for programs to challenge data, reporting that information will likely take the form of narrative responses. This is because the method for challenging data may differ greatly from State to State, and it is difficult for the Department to predict what methods States will choose. The Department therefore estimates that reporting this information in narrative form during the first year will constitute a burden of 3 hours for each State. This would represent a total reporting burden of 156 hours (3 hours multiplied by 52 States).
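The recurring per-element burdens discussed in the preceding paragraphs reduce to simple multiplications over the 52 States; the sketch below restates them for ease of reference.

```python
# Cross-check of the recurring per-element SRC burdens (hours) for the 52 States
# subject to §§ 612.4(b) and (c).
STATES = 52

print(0.25 * STATES)       # 13.0  indicator weightings
print(0.5 * STATES)        # 26.0  rewards/consequences, first year
print(0.1 * STATES)        # 5.2   rewards/consequences, subsequent years
print(0.5 * STATES / 5)    # 5.2   aggregation method, annualized over five years
print(10 * STATES)         # 520   recordkeeping and publishing appeal decisions
print(3 * STATES)          # 156   narrative on data challenges, first year
```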
In subsequent reporting cycles, the Department can examine State responses and (1) pre-populate this response for States that have not altered their method for challenging data or (2) provide a drop-down list of representative alternatives. This will minimize subsequent burden for most States. The Department therefore estimates that in subsequent reporting cycles (every five years under the final regulations), only 10 States will require more time to provide additional narrative responses totaling 3 burden
Under § 612.4(c)(2), each State must periodically examine the quality of its data collection and reporting activities and modify those activities as appropriate. The Department believes that this review will be carried out in a manner similar to the one described for the initial stakeholder determinations in the preceding paragraphs: States will consult with representative groups to determine their experience with providing and using the collected data, and they will consult with data experts to ensure the validity and reliability of the data collected. The Department believes such a review will recur every three years, on average. Because this review will take place years after the State's initial implementation of the regulations, the Department further believes that the State's review will be of relatively little burden. This is because the State's review will be based on the State's own experience with collecting and reporting data pursuant to the regulations, and because States can consult with many other States to determine best practices. For these reasons, the Department estimates that the periodic review and modification of data collection and reporting will require 30 hours every three years or an annualized burden of 10 hours for each State. This would constitute a total annualized burden of 520 hours for all States (10 hours per year multiplied by 52 States).
Aggregating the annual burdens calculated under the preceding sections results in the following: All States would incur a burden of 14,363 hours to report classifications of teacher preparation programs, 13 hours to report State indicator weightings, 26 hours in the first year and 5.2 hours in subsequent years to report State-level rewards and consequences associated with each performance classification, 5.2 hours to report the method of program aggregation, 520 hours for recordkeeping and publishing appeal decisions, 156 hours the first year and 60 hours in subsequent years to report the process for challenging data and program classification, and 520 hours to report on the examination of data collection quality. This totals 15,603.2 hours of annual burden in the first year and 15,486.4 hours of annual burden in subsequent years nationwide.
Aggregating the start-up and annual burdens calculated under the preceding sections results in the following burdens: All States would incur a total burden under § 612.4(a) of 14,366.5 hours, a start-up burden under §§ 612.4(b) and 612.4(c) of 51,480 hours, and an annual burden under §§ 612.4(b) and 612.4(c) of 15,603.2 hours in the first year and 15,486.4 hours in subsequent years. This totals between 81,332.9 and 81,449.7 total burden hours under § 612.4 nationwide. Based on the prior estimate of 53,749 hours of reporting burden on OMB collection 1840-0837, the total burden increase under § 612.4 is between 27,583.9 hours and 27,700.7 hours (a range of 81,332.9 to 81,449.7 total burden hours minus the 53,749 hours previously estimated).
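For readers who wish to trace the arithmetic, the following calculation is an illustrative sketch only (it is not part of the regulations, and the labels are shorthand for the burden categories discussed above); it reproduces the § 612.4 totals from the component estimates.

```python
# Illustrative only: recomputes the Section 612.4 burden totals from the
# component estimates stated above (hours; 52 reporting States assumed).
annual_first_year = 14_363 + 13 + 26 + 5.2 + 520 + 156 + 520   # about 15,603.2 hours
annual_subsequent = 14_363 + 13 + 5.2 + 5.2 + 520 + 60 + 520   # about 15,486.4 hours
start_up = 14_366.5 + 51_480        # Sec. 612.4(a) total plus Secs. 612.4(b)-(c) start-up
total_low = start_up + annual_subsequent    # about 81,332.9 hours
total_high = start_up + annual_first_year   # about 81,449.7 hours
prior_estimate = 53_749                     # prior burden on OMB collection 1840-0837
print(round(total_low, 1), round(total_high, 1))           # 81332.9 81449.7
print(round(total_low - prior_estimate, 1),
      round(total_high - prior_estimate, 1))               # 27583.9 27700.7
```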
The final regulations at § 612.5(a)(1) through (a)(4) identify those indicators that a State is required to use to assess the academic content knowledge and teaching skills of novice teachers from each of its teacher preparation programs. Under the regulations, a State must use the following indicators of teacher preparation program performance: (a) Student learning outcomes, (b) employment outcomes, (c) survey outcomes, and (d) whether the program (1) is accredited by a specialized accrediting agency or (2) produces teacher candidates with content and pedagogical knowledge and quality clinical preparation, who have met rigorous exit standards. Section 612.5(b) permits a State, at its discretion, to establish additional indicators of academic content knowledge and teaching skills.
As described in the Discussion of Costs, Benefits, and Transfers section of the RIA, we do not estimate that States will incur any additional burden associated with creating systems for evaluating student learning outcomes. However, the regulations also require that States link student growth or teacher evaluation data back to each teacher's preparation programs consistent with State discretionary guidelines included in § 612.4. Currently, few States have such capacity. However, based on data from the SLDS program, it appears that 30 States, the District of Columbia, and the Commonwealth of Puerto Rico either already have the ability to aggregate data on student achievement and map back to teacher preparation programs or have committed to do so. For these 30 States, the District of Columbia, and the Commonwealth of Puerto Rico we estimate that no additional costs will be needed to link student learning outcomes back to teacher preparation programs.
For the remaining States, the Department estimates that they will require 2,940 hours for each State, for a total burden of 58,800 hours nationwide (2,940 hours multiplied by 20 States).
Section 612.5(a)(2) requires a State to provide data on each teacher preparation program's teacher placement rate as well as the teacher placement rate calculated for high-need schools. High-need schools are defined in § 612.2(d) by using the definition of “high-need school” in section 200(11) of the HEA. The regulations give States discretion to exclude those novice teachers or recent graduates from this measure if they are teaching in a private school, teaching in another State, enrolled in graduate school, or engaged in military service. States also have the discretion to treat this rate differently for alternative route and traditional route providers.
Section 612.5(a)(2) also requires a State to provide data on each teacher preparation program's teacher retention rate and its teacher retention rate calculated for high-need schools. The regulations give States discretion to exclude novice teachers or recent graduates from this measure if they are teaching in a private school (or other school not requiring State certification), teaching in another State, enrolled in graduate school, or serving in the military. States also have the discretion to treat this rate differently for alternative route and traditional route providers.
As discussed in the NPRM, the Department believes that only 11 States will likely incur additional burden in collecting information about the employment and retention of recent graduates of teacher preparation programs in its jurisdiction. To the extent that it is not possible to establish these measures using existing data systems, States may need to obtain some or all of this information from teacher preparation programs or from the teachers themselves upon requests for certification and licensure. The Department estimates that 200 hours may be required at the State level to collect information about novice
Section 612.5(a)(3) requires a State to provide data on each teacher preparation program's teacher survey results. This requires States to report data from a survey of novice teachers in their first year of teaching designed to capture their perceptions of whether the training that they received was sufficient to meet classroom and profession realities.
Section 612.5(a)(3) also requires a State to provide data on each teacher preparation program's employer survey results. This requires States to report data from a survey of employers or supervisors designed to capture their perceptions of whether the novice teachers they employ or supervise were prepared sufficiently to meet classroom and profession realities.
Some States and IHEs already survey graduates of their teacher preparation programs. The sampling size and length of survey instrument can strongly affect the potential burden associated with administering the survey. The Department has learned that some States already have experience carrying out such surveys (for a more detailed discussion of these and other estimates in this section, see the
Based on Departmental consultation with researchers experienced in carrying out survey research, the Department assumes that survey instruments will not require more than 30 minutes to complete. The Department further assumes that a State can develop a survey in 1,224 hours. Assuming that States with experience in administering surveys will incur a lower cost, the Department estimates that the total burden incurred nationwide would be, at most, 63,648 hours (1,224 hours multiplied by 52 States).
Under § 612.5(a)(4), States must report, for each teacher preparation program in the State whether it: (a) Is accredited by a specialized accrediting agency recognized by the Secretary for accreditation of professional teacher education programs, or (b) provides teacher candidates with content and pedagogical knowledge and quality clinical preparation, and has rigorous teacher candidate exit standards.
CAEP, a union of two formerly independent national accrediting agencies, the National Council for Accreditation of Teacher Education (NCATE) and the Teacher Education Accreditation Council (TEAC), reports that currently it has fully accredited approximately 800 IHEs. The existing IRC currently requires reporting of whether each teacher preparation program is accredited by a specialized accrediting agency, and if so, which one. We note that, as of July 1, 2016, CAEP has not been recognized by the Secretary for accreditation of teacher preparation programs. As such, programs accredited by CAEP would not qualify under § 612.5(a)(4)(i). However, as described in the discussion of comments above, States would be able to use accreditation by CAEP as an indicator that the teacher preparation program meets the requirements of § 612.5(a)(4)(ii). In addition, we explain in the comments above that a State also could meet the reporting requirements in § 612.5(a)(4)(ii) by indicating that a program has been accredited by an accrediting organization whose standards cover the program characteristics identified in that section. Because section 205(a)(1)(D) of the HEA requires IHEs to include in their IRCs the identity of any agency that has accredited their programs, and the number of such accrediting agencies is small, States should readily know whether these other agencies meet these standards. For these reasons, the Department believes that no significant start-up burden will be associated with State determinations of specialized accreditation of teacher preparation programs for those programs that are already accredited.
As discussed in the NPRM, the Department estimates that States will have to provide information for 15,335 teacher preparation programs nationwide (11,461 unaccredited programs at IHEs plus 3,484 programs at alternative routes not affiliated with an IHE plus 390 reporting instances for teacher preparation programs offered through distance education).
The Department believes that States will be able to make use of accreditation guidelines from specialized accrediting agencies to determine the measures that will adequately inform them about which of its teacher preparation programs provide teacher candidates with content and pedagogical knowledge, quality clinical preparation, and have rigorous teacher candidate exit qualifications—the indicators contained in § 612.5(a)(4)(ii). The Department estimates that States will require 2 hours for each teacher preparation program to determine whether or not it can provide such information. Therefore, the Department estimates that the total reporting burden to provide this information would be 30,670 hours (15,335 teacher preparation programs multiplied by 2 hours).
Aggregating the start-up burdens calculated under the preceding sections results in the following burdens: All States would incur a burden of 58,800 hours to link student learning outcome measures back to each teacher's preparation program, 2,200 hours to measure employment outcomes, 63,648 hours to develop surveys, and 30,670 hours to establish the process to obtain information related to certain indicators for teacher preparation programs without specialized accreditation. This totals 155,318 hours of start-up burden nationwide.
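The start-up components for § 612.5 can be checked against the underlying per-State and per-program estimates. The sketch below is illustrative only; the figures are those stated in the preceding paragraphs, and the variable names are shorthand.

```python
# Illustrative only: the Section 612.5 start-up burden components described above.
linking_outcomes = 2_940 * 20   # 20 States still need to link student outcomes to programs -> 58,800 h
employment_data  = 200 * 11     # 11 States need new employment/retention data collection   ->  2,200 h
survey_develop   = 1_224 * 52   # survey development across 52 reporting States             -> 63,648 h
accreditation    = 2 * 15_335   # 2 h per program lacking specialized accreditation         -> 30,670 h
print(linking_outcomes + employment_data + survey_develop + accreditation)   # 155,318 hours
```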
Under § 612.5(a), States must transmit, through specific elements on the SRC, information related to indicators of academic content knowledge and teaching skills of novice teachers for each teacher preparation program in the State. We discuss the burden associated with establishing systems related to gathering these data in the section discussing start-up burden associated with § 612.5. The following section describes the burden associated with gathering these data and reporting them to the Department annually.
Under § 612.5(a)(1), States are required to transmit information related to student learning outcomes for each teacher preparation program in the State. The Department believes that in order to ensure the validity of the data, each State will require two hours to gather and compile data related to the student learning outcomes of each teacher preparation program. Much of
Under § 612.5(a)(2), States are required to transmit information related to employment outcomes for each teacher preparation program in the State. In order to report employment outcomes to the Department, States must compile and transmit teacher placement rate data, teacher placement rate data calculated for high-need schools, teacher retention rate data, and teacher retention rate data calculated for high-need schools. As with student learning outcome data, much of the burden of gathering employment outcome data is subsumed into the State-established data systems, which provide information on whether and where teachers are employed. The Department estimates that States will require 3 hours per teacher preparation program to gather data on both teacher placement and teacher retention. Reporting these data using the SRC is relatively straightforward: the measures are the percentage of teachers placed and the percentage of teachers who continued to teach, both generally and at high-need schools. The Department therefore estimates that States will require 0.5 hours (30 minutes) for each teacher preparation program to convey this information to the Department through the SRC. Gathering and reporting data related to employment outcomes therefore constitutes a combined burden of 3.5 hours for each teacher preparation program, or a total burden of 100,541 hours annually (3.5 hours multiplied by 28,726 teacher preparation programs).
In addition to the start-up burden needed to produce a survey, States will incur annual burdens to administer the survey. Surveys will include, but will not be limited to, a teacher survey and an employer survey, designed to capture perceptions of whether novice teachers who are employed as teachers in their first year of teaching in the State where the teacher preparation program is located possess the skills needed to succeed in the classroom. The burdens for administering an annual survey will be borne by the State administering the survey and the respondents completing it. For the reasons discussed in the RIA in this document, the Department estimates that respondents will require approximately 0.5 hours (30 minutes) to complete each survey instrument and that States will collect a sufficient number of instruments to ensure an adequate response rate. The Department employs an estimate of 253,042 respondents (70 percent of 361,488, that is, the 180,744 completers plus their 180,744 employers) who will be required to complete the survey. Therefore, the Department estimates that the annual burden to respondents nationwide would be 126,521 hours (253,042 respondents multiplied by 0.5 hours per respondent).
With respect to burden incurred by States to administer the surveys annually, the Department estimates that one hour of burden will be incurred for every respondent to the surveys. This would constitute an annual burden nationwide of 253,042 hours (253,042 respondents multiplied by one hour per respondent).
Under § 612.5(a)(3), after these surveys are administered, States are required to report the information using the SRC. In order to report survey outcomes to the Department, the Department estimates that States will need 0.5 hours to report the quantitative data related to the survey responses for each instrument on the SRC, constituting a total burden of one hour to report data on both instruments. This would represent a total burden of 28,726 hours annually (1 hour multiplied by 28,726 teacher preparation programs). The total burden associated with administering, completing, and reporting data on the surveys therefore constitutes 408,289 hours annually (126,521 hours plus 253,042 hours plus 28,726 hours).
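Because the survey figures draw on several intermediate quantities, the following sketch (illustrative only; not part of the regulations, with variable names of our own choosing) traces how the respondent count and the 408,289-hour annual total follow from the stated assumptions.

```python
# Illustrative only: survey-related annual burden, using the figures stated above.
completers = 180_744
respondents = round(0.70 * (completers * 2))   # completers plus their employers, 70% response -> 253,042
respondent_hours = 0.5 * respondents           # 0.5 h per completed instrument                -> 126,521
state_admin_hours = 1.0 * respondents          # 1 h of State administration per respondent    -> 253,042
src_reporting_hours = 1.0 * 28_726             # 1 h per program to report both instruments on the SRC
print(respondent_hours + state_admin_hours + src_reporting_hours)   # 408,289 hours annually
```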
Under § 612.5(a)(4), States are required to report whether each program in the State is accredited by a specialized accrediting agency recognized by the Secretary, or produces teacher candidates with content and pedagogical knowledge, with quality clinical preparation, and who have met rigorous teacher candidate exit qualifications. The Department estimates that 726 IHEs offering teacher preparation programs are or will be accredited by a specialized accrediting agency (see the start-up burden discussion for § 612.5 for an explanation of this figure). Using the IRC, IHEs already report to States whether teacher preparation programs have specialized accreditation. However, as noted in the start-up burden discussion of § 612.5, as of July 1, 2016, there are no specialized accrediting agencies recognized by the Secretary for teacher preparation programs. As such, the Department does not expect any teacher preparation program to qualify under § 612.5(a)(4)(i). However, as discussed elsewhere in this document, States can use accreditation by CAEP or another entity whose standards for accreditation cover the basic program characteristics in § 612.5(a)(4)(ii) as evidence that the teacher preparation program has satisfied the indicator of program performance in that provision. Since IHEs are already reporting whether they have specialized accreditation in their IRCs, and this reporting element will be pre-populated for States on the SRC, States would simply need to know whether these accrediting agencies have standards that examine the program characteristics in § 612.5(a)(4)(ii). Therefore, the Department estimates no additional burden for this reporting element for programs that have the requisite accreditation.
Under § 612.5(a)(4)(ii), for those programs that are not accredited by a specialized accrediting agency, States are required to report on certain indicators in lieu of that accreditation: Whether the program provides teacher candidates with content and pedagogical knowledge and quality clinical preparation, and has rigorous teacher candidate exit qualifications. We assume that such requirements are already built into State approval of relevant programs. The Department estimates that States will require 0.25 hours (15 minutes) to provide to the Secretary an assurance, in a yes/no format, whether each teacher preparation program in its jurisdiction not holding a specialized accreditation from CAEP, NCATE, or TEAC meets these indicators.
As discussed in the start-up burden section of § 612.5 which discusses reporting of teacher preparation program characteristics, the Department
Under § 612.5(b), States may include additional indicators of academic content knowledge and teaching skill in their determination of whether teacher preparation programs are low-performing. As discussed in the
Aggregating the annual burdens calculated under the preceding sections results in the following burdens: All States would incur a burden of 71,815 hours to report on student learning outcome measures for all subjects and grades, 100,541 hours to report on employment outcomes, 408,289 hours to report on survey outcomes, 3,834 hours to report on teacher preparation program characteristics, and 28,726 hours to report on other indicators not required in § 612.5(a)(1)-(4). This totals 613,204.75 hours of annual burden nationwide.
Aggregating the start-up and annual burdens calculated under the preceding sections results in the following burdens: All States would incur a start-up burden under § 612.5 of 155,318 hours and an annual burden under § 612.5 of 613,204.75 hours. This totals 768,522.75 burden hours under § 612.5 nationwide.
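The annual and combined § 612.5 totals can likewise be reconstructed from the per-program estimates discussed above. The sketch below is illustrative only and simply restates that arithmetic.

```python
# Illustrative only: Section 612.5 annual burden and combined total, per the estimates above.
programs, unaccredited_programs = 28_726, 15_335
annual = (
    2.5 * programs                  # student learning outcomes (gather 2 h + report 0.5 h)  ->  71,815
    + 3.5 * programs                # employment outcomes (gather 3 h + report 0.5 h)        -> 100,541
    + 408_289                       # survey administration, completion, and reporting
    + 0.25 * unaccredited_programs  # program-characteristics assurance (3,834 when rounded)
    + 1.0 * programs                # additional State-selected indicators under Sec. 612.5(b)
)
print(annual)                       # 613,204.75 hours annually
print(annual + 155_318)             # 768,522.75 hours including start-up burden
```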
The regulations in § 612.6 require States to use criteria, including, at a minimum, indicators of academic content knowledge and teaching skills from § 612.5, to identify low-performing or at-risk teacher preparation programs.
For a full discussion of the burden related to the consideration and selection of the criteria reflected in the indicators described in § 612.5, see the start-up burden section of §§ 612.4(b) and 612.4(c) discussing meaningful differentiations. Apart from that burden discussion, the Department believes States will incur no other burden related to this regulatory provision.
For any IHE administering a teacher preparation program that has lost State approval or financial support based on being identified as a low-performing teacher preparation program, the regulations under § 612.7 require the IHE to—(a) notify the Secretary of its loss of State approval or financial support within thirty days of such designation; (b) immediately notify each student who is enrolled in or accepted into the low-performing teacher preparation program and who receives funding under title IV, HEA that the IHE is no longer eligible to provide such funding to them; and (c) disclose information on its Web site and promotional materials regarding its loss of State approval or financial support and loss of eligibility for title IV funding.
The Department does not expect that a large percentage of programs will be subject to a loss of title IV eligibility. The Department estimates that approximately 50 programs will lose their State approval or financial support.
For those 50 programs, the Department estimates that it will take each program 15 minutes to notify the Secretary of its loss of eligibility; 5 hours to notify all students who are enrolled in or accepted into the program and who receive funding under title IV of the HEA; and 30 minutes to disclose this information on its Web sites and promotional materials, for a total of 5.75 hours per program. The Department estimates the total burden at 287.5 hours (50 programs multiplied by 5.75 hours).
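As an illustrative check (not part of the regulations), the § 612.7 estimate follows directly from the per-program components:

```python
# Illustrative only: per-program and total notification burden under Section 612.7.
per_program = 0.25 + 5 + 0.5   # notify Secretary + notify students + Web/promotional disclosure
print(50 * per_program)        # 287.5 hours for the estimated 50 affected programs
```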
The regulations in § 612.8 provide a process for a low-performing teacher preparation program that has lost State approval or financial support to regain its ability to accept and enroll students who receive title IV, HEA funds. Under this process, IHEs will submit an application and supporting documentation demonstrating to the Secretary: (1) Improved performance on the teacher preparation program performance criteria reflected in indicators described in § 612.5 as determined by the State; and (2) reinstatement of the State's approval or the State's financial support.
The process by which programs and institutions apply for title IV eligibility already accounts for the burden associated with this provision.
Aggregating the total burdens calculated under the preceding sections of part 612 results in the following burdens: Total burden hours incurred under § 612.3 is 157,791 hours, under § 612.4 is between 81,332.9 hours and 81,449.7 hours, under § 612.5 is 768,522.75 hours, under § 612.7 is 287.5 hours, and under § 612.8 is 200 hours. This totals between 1,008,134.15 hours and 1,008,250.95 hours nationwide.
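The nationwide total under part 612 combines the section totals above; the following sketch, again illustrative only, reproduces the stated range.

```python
# Illustrative only: nationwide total burden under part 612, combining the section totals above.
fixed = 157_791 + 768_522.75 + 287.5 + 200      # Secs. 612.3, 612.5, 612.7, and 612.8
low, high = fixed + 81_332.9, fixed + 81_449.7  # Sec. 612.4 range
print(round(low, 2), round(high, 2))            # 1,008,134.15 to 1,008,250.95 hours
```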
The changes to part 686 in these regulations have no measurable effect on the burden currently identified in the OMB Control Numbers 1845-0083 and 1845-0084.
Consistent with the discussions above, the following chart describes the sections of the final regulations involving information collections, the information being collected, and the collections the Department has submitted to the OMB for approval and public comment under the Paperwork Reduction Act. In the chart, the Department labels those estimated burdens not already associated with an OMB approval number under a single prospective designation, "OMB 1840-0837." This label represents a single information collection; the different sections of the regulations are separated in the table below for clarity and to appropriately divide the burden hours associated with each regulatory section.
Please note that the changes in burden estimated in the chart are based on the change in burden under the current IRC OMB control numbers 1840-0837 and “OMB 1840-0837.” The burden estimate for 612.3 is based on the most recent data available for the number of IHEs that are required to report (
These programs are subject to the requirements of Executive Order 12372 and the regulations in 34 CFR part 79. One of the objectives of the Executive order is to foster an intergovernmental partnership and a strengthened federalism. The Executive order relies on processes developed by State and local governments for coordination and review of proposed Federal financial assistance.
This document provides early notification of our specific plans and actions for these programs.
In the NPRM we requested comments on whether the proposed regulations would require transmission of information that any other agency or authority of the United States gathers or makes available.
Based on the response to the NPRM and on our review, we have determined that these final regulations do not require transmission of information that any other agency or authority of the United States gathers or makes available.
Executive Order 13132 requires us to ensure meaningful and timely input by State and local elected officials in the development of regulatory policies that have federalism implications. “Federalism implications” means substantial direct effects on the States, on the relationship between the National Government and the States, or on the distribution of power and responsibilities among the various levels of government.
In the NPRM we identified a specific section that may have federalism implications and encouraged State and local elected officials to review and provide comments on the proposed regulations. In the
You may also access documents of the Department published in the
Administrative practice and procedure, Aliens, Colleges and universities, Consumer protection, Grant programs—education, Loan programs—education, Reporting and recordkeeping requirements, Selective Service System, Student aid, Vocational education.
For the reasons discussed in the preamble, the Secretary amends chapter VI of title 34 of the Code of Federal Regulations as follows:
20 U.S.C. 1022d and 1022f.
This part establishes regulations related to the teacher preparation program accountability system under title II of the HEA. This part includes:
(a) Institutional Report Card reporting requirements.
(b) State Report Card reporting requirements.
(c) Requirements related to the indicators States must use to report on teacher preparation program performance.
(d) Requirements related to the areas States must consider to identify low-performing teacher preparation programs and at-risk teacher preparation programs and actions States must take with respect to those programs.
(e) The consequences for a low-performing teacher preparation program that loses the State's approval or the State's financial support.
(f) The conditions under which a low-performing teacher preparation program that has lost the State's approval or the State's financial support may regain eligibility to resume accepting and enrolling students who receive title IV, HEA funds.
(a) The following terms used in this part are defined in the regulations for Institutional Eligibility under the HEA, 34 CFR part 600:
(b) The following term used in this part is defined in subpart A of the Student Assistance General Provisions, 34 CFR part 668:
(c) The following term used in this part is defined in 34 CFR 77.1:
(d) Other terms used in this part are defined as follows:
(i) The school is in the highest quartile of schools in a ranking of all schools served by a local educational agency (LEA), ranked in descending order by percentage of students from low-income families enrolled in such schools, as determined by the LEA based on one of the following measures of poverty:
(A) The percentage of students aged 5 through 17 in poverty counted in the most recent Census data approved by the Secretary.
(B) The percentage of students eligible for a free or reduced price school lunch under the Richard B. Russell National School Lunch Act [42 U.S.C. 1751 et seq.].
(C) The percentage of students in families receiving assistance under the State program funded under part A of title IV of the Social Security Act (42 U.S.C. 601 et seq.).
(D) The percentage of students eligible to receive medical assistance under the Medicaid program.
(E) A composite of two or more of the measures described in paragraphs (i)(A) through (D) of this definition.
(ii) In the case of—
(A) An elementary school, the school serves students not less than 60 percent of whom are eligible for a free or reduced price school lunch under the Richard B. Russell National School Lunch Act; or
(B) Any school other than an elementary school, the school serves students not less than 45 percent of whom are eligible for a free or reduced price school lunch under the Richard B. Russell National School Lunch Act.
(i) Be provided by qualified clinical instructors, including school and LEA-based personnel, who meet established qualification requirements and who use a training standard that is made publicly available;
(ii) Include multiple clinical or field experiences, or both, that serve diverse, rural, or underrepresented student populations in elementary through secondary school, including English learners and students with disabilities, and that are assessed using a performance-based protocol to
(iii) Require that teacher candidates use research-based practices, including observation and analysis of instruction, collaboration with peers, and effective use of technology for instructional purposes.
(i) Becoming a teacher of record; or
(ii) Obtaining initial certification or licensure.
(ii) At the State's discretion, the rate calculated under paragraph (i) of this definition may exclude one or more of the following, provided that the State uses a consistent approach to assess and report on all of the teacher preparation programs in the State:
(A) Recent graduates who have taken teaching positions in another State.
(B) Recent graduates who have taken teaching positions in private schools.
(C) Recent graduates who have enrolled in graduate school or entered military service.
(iii) For a teacher preparation program provided through distance education, a State calculates the rate under paragraph (i) of this definition using the total number of recent graduates who have obtained certification or licensure in the State during the three preceding title II reporting years as the denominator.
(i) For the purposes of this definition, a cohort of novice teachers includes all teachers who were first identified as a novice teacher by the State in the same title II reporting year.
(ii) At the State's discretion, the teacher retention rates may exclude one or more of the following, provided that the State uses a consistent approach to assess and report on all teacher preparation programs in the State:
(A) Novice teachers who have taken teaching positions in other States.
(B) Novice teachers who have taken teaching positions in private schools.
(C) Novice teachers who are not retained specifically and directly due to budget cuts.
(D) Novice teachers who have enrolled in graduate school or entered military service.
Beginning not later than April 30, 2018, and annually thereafter, each institution of higher education that conducts a teacher preparation program and that enrolls students receiving title IV, HEA program funds—
(a) Must report to the State on the quality of teacher preparation and other information consistent with section 205(a) of the HEA, using an institutional report card that is prescribed by the Secretary;
(b) Must prominently and promptly post the institutional report card information on the institution's Web site and, if applicable, on the teacher preparation program portion of the institution's Web site; and
(c) May also provide the institutional report card information to the general public in promotional or other materials it makes available to prospective students or other individuals.
(a)
(1) Report to the Secretary, using a State report card that is prescribed by the Secretary, on—
(i) The quality of all teacher preparation programs in the State consistent with paragraph (b)(3) of this section, whether or not they enroll students receiving Federal assistance under the HEA; and
(ii) All other information consistent with section 205(b) of the HEA; and
(2) Make the State report card information widely available to the general public by posting the State report card information on the State's Web site.
(b)
(1) Must make meaningful differentiations in teacher preparation program performance using at least three performance levels—low-performing teacher preparation program, at-risk teacher preparation program, and effective teacher preparation program—based on the indicators in § 612.5.
(2) Must provide—
(i) For each teacher preparation program, data for each of the indicators identified in § 612.5 for the most recent title II reporting year;
(ii) The State's weighting of the different indicators in § 612.5 for purposes of describing the State's assessment of program performance; and
(iii) Any State-level rewards or consequences associated with the designated performance levels;
(3) In implementing paragraphs (b)(1) through (2) of this section, except as provided in paragraphs (b)(3)(ii)(D) and (b)(5) of this section, must ensure that the performance of all of the State's teacher preparation programs is represented in the State report card by—
(i)(A) Annually reporting on the performance of each teacher preparation program that, in a given reporting year, produces a total of 25 or more recent graduates who have received initial certification or licensure from the State that allows them to serve in the State as teachers of record for K-12 students and, at a State's discretion, preschool students (
(B) If a State chooses a program size threshold of less than 25 (
(ii) For any teacher preparation program that does not meet the program size threshold in paragraph (b)(3)(i)(A) or (B) of this section, annually reporting on the program's performance by aggregating data under paragraph (b)(3)(ii)(A), (B), or (C) of this section in order to meet the program size threshold except as provided in paragraph (b)(3)(ii)(D) of this section.
(A) The State may report on the program's performance by aggregating data that determine the program's performance with data for other teacher preparation programs that are operated by the same teacher preparation entity and are similar to or broader than the program in content.
(B) The State may report on the program's performance by aggregating data that determine the program's performance over multiple years for up to four years until the program size threshold is met.
(C) If the State cannot meet the program size threshold by aggregating data under paragraph (b)(3)(ii)(A) or (B) of this section, it may aggregate data using a combination of the methods under both of these paragraphs.
(D) The State is not required under this paragraph (b)(3)(ii) of this section to report data on a particular teacher preparation program for a given reporting year if aggregation under paragraph (b)(3)(ii) of this section would not yield the program size threshold for that program; and
(4) Must report on the procedures established by the State in consultation with a group of stakeholders, as described in paragraph (c)(1) of this section, and on the State's examination of its data collection and reporting, as described in paragraph (c)(2) of this section, in the State report card submitted—
(i) No later than October 31, 2019, and every four years thereafter; and
(ii) At any other time that the State makes a substantive change to the weighting of the indicators or the procedures for assessing and reporting the performance of each teacher preparation program in the State described in paragraph (c) of this section.
(5) The State is not required under this paragraph (b) to report data on a particular teacher preparation program if reporting these data would be inconsistent with Federal or State privacy and confidentiality laws and regulations.
(c)
(i) The representative group of stakeholders must include, at a minimum, representatives of—
(A) Leaders and faculty of traditional teacher preparation programs and alternative routes to State certification or licensure programs;
(B) Students of teacher preparation programs;
(C) LEA superintendents;
(D) Small teacher preparation programs (
(E) Local school boards;
(F) Elementary through secondary school leaders and instructional staff;
(G) Elementary through secondary school students and their parents;
(H) IHEs that serve high proportions of low-income students, students of color, or English learners;
(I) English learners, students with disabilities, and other underserved students;
(J) Officials of the State's standards board or other appropriate standards body; and
(K) At least one teacher preparation program provided through distance education.
(ii) The procedures for assessing and reporting the performance of each teacher preparation program in the State under this section must, at minimum, include—
(A) The weighting of the indicators identified in § 612.5 for establishing performance levels of teacher preparation programs as required by this section;
(B) The method for aggregation of data pursuant to paragraph (b)(3)(ii) of this section;
(C) Any State-level rewards or consequences associated with the designated performance levels; and
(D) Appropriate opportunities for programs to challenge the accuracy of their performance data and classification of the program.
(2)
(d)
(a) For purposes of reporting under § 612.4, a State must assess, for each teacher preparation program within its jurisdiction, indicators of academic content knowledge and teaching skills of novice teachers from that program, including, at a minimum, the following indicators:
(1) Student learning outcomes.
(i) For each year and each teacher preparation program in the State, a State must calculate the aggregate student learning outcomes of all students taught by novice teachers.
(ii) For purposes of calculating student learning outcomes under paragraph (a)(1)(i) of this section, a State must use:
(A) Student growth;
(B) A teacher evaluation measure;
(C) Another State-determined measure that is relevant to calculating student learning outcomes, including academic performance, and that meaningfully differentiates among teachers; or
(D) Any combination of paragraphs (a)(1)(ii)(A), (B), or (C) of this section.
(iii) At the State's discretion, in calculating a teacher preparation program's aggregate student learning outcomes a State may exclude one or both of the following, provided that the State uses a consistent approach to assess and report on all of the teacher preparation programs in the State—
(A) Student learning outcomes of students taught by novice teachers who have taken teaching positions in another State.
(B) Student learning outcomes of all students taught by novice teachers who have taken teaching positions in private schools.
(2) Employment outcomes.
(i) Except as provided in paragraph (a)(2)(v) of this section, for each year and each teacher preparation program in the State, a State must calculate:
(A) Teacher placement rate;
(B) Teacher placement rate in high-need schools;
(C) Teacher retention rate; and
(D) Teacher retention rate in high-need schools.
(ii) For purposes of reporting the teacher retention rate and teacher retention rate in high-need schools under paragraph (a)(2)(i)(C) and (D) of this section—
(A) Except as provided in paragraph (B), the State reports a teacher retention rate for each of the three cohorts of novice teachers immediately preceding the current title II reporting year.
(B)(
(
(
(iii) For the purposes of calculating employment outcomes under paragraph (a)(2)(i) of this section, a State may, at its discretion, assess traditional and alternative route teacher preparation programs differently, provided that differences in assessments and the reasons for those differences are transparent and that assessments result in equivalent levels of accountability and reporting irrespective of the type of program.
(iv) For the purposes of the teacher placement rate under paragraph (a)(2)(i)(A) and (B) of this section, a State may, at its discretion, assess teacher preparation programs provided through distance education differently from teacher preparation programs not provided through distance education, based on whether the differences in the way the rate is calculated for teacher preparation programs provided through distance education affect employment outcomes. Differences in assessments and the reasons for those differences must be transparent and result in equivalent levels of accountability and reporting irrespective of where the program is physically located.
(v) A State is not required to calculate a teacher placement rate under paragraph (a)(2)(i)(A) of this section for alternative route to certification programs.
(3)
(ii) At the State's discretion, in calculating a teacher preparation program's survey outcomes the State may exclude survey outcomes for all novice teachers who have taken teaching positions in private schools provided that the State uses a consistent approach to assess and report on all of the teacher preparation programs in the State.
(4)
(i) Is administered by an entity accredited by an agency recognized by the Secretary for accreditation of professional teacher education programs; or
(ii) Produces teacher candidates—
(A) With content and pedagogical knowledge;
(B) With quality clinical preparation; and
(C) Who have met rigorous teacher candidate exit qualifications.
(b) At a State's discretion, the indicators of academic content knowledge and teaching skills may include other indicators of a teacher's effect on student performance, such as student survey results, provided that the State uses the same indicators for all teacher preparation programs in the State.
(c) A State may, at its discretion, exclude from its reporting under paragraph (a)(1)-(3) of this section individuals who have not become novice teachers after three years of becoming recent graduates.
(d) This section does not apply to American Samoa, the Commonwealth of the Northern Mariana Islands, the freely associated states of the Republic of the Marshall Islands, the Federated States of Micronesia, the Republic of Palau, Guam, and the United States Virgin Islands.
(a)(1) In identifying low-performing or at-risk teacher preparation programs the State must use criteria that, at a minimum, include the indicators of academic content knowledge and teaching skills from § 612.5.
(2) Paragraph (a)(1) of this section does not apply to American Samoa, the Commonwealth of the Northern Mariana Islands, the freely associated states of the Republic of the Marshall Islands, the Federated States of Micronesia, the Republic of Palau, Guam, and the United States Virgin Islands.
(b) At a minimum, a State must provide technical assistance to low-performing teacher preparation programs in the State to help them improve their performance in accordance with section 207(a) of the HEA. Technical assistance may include, but is not limited to: Providing programs with information on the specific indicators used to determine the program's rating (
(a) Any teacher preparation program for which the State has withdrawn the State's approval or the State has terminated the State's financial support due to the State's identification of the program as a low-performing teacher preparation program—
(1) Is ineligible for any funding for professional development activities awarded by the Department as of the date that the State withdrew its approval or terminated its financial support;
(2) May not include any candidate accepted into the teacher preparation program or any candidate enrolled in the teacher preparation program who receives aid under title IV, HEA programs in the institution's teacher preparation program as of the date that the State withdrew its approval or terminated its financial support; and
(3) Must provide transitional support, including remedial services, if necessary, to students enrolled at the institution at the time of termination of financial support or withdrawal of approval for a period of time that is not less than the period of time a student continues in the program but no more than 150 percent of the published program length.
(b) Any institution administering a teacher preparation program that has lost State approval or financial support based on being identified as a low-performing teacher preparation program must—
(1) Notify the Secretary of its loss of the State's approval or the State's financial support due to identification as low-performing by the State within 30 days of such designation;
(2) Immediately notify each student who is enrolled in or accepted into the low-performing teacher preparation program and who receives title IV, HEA program funds that, commencing with the next payment period, the institution is no longer eligible to provide such funding to students enrolled in or accepted into the low-performing teacher preparation program; and
(3) Disclose on its Web site and in promotional materials that it makes available to prospective students that the teacher preparation program has been identified as a low-performing teacher preparation program by any State and has lost the State's approval or the State's financial support, including the identity of the State or States, and that students accepted or enrolled in the low-performing teacher preparation program may not receive title IV, HEA program funds.
(a) A low-performing teacher preparation program that has lost the State's approval or the State's financial support may regain its ability to accept and enroll students who receive title IV, HEA program funds upon demonstration to the Secretary under paragraph (b) of this section of—
(1) Improved performance on the teacher preparation program performance criteria in § 612.5 as determined by the State; and
(2) Reinstatement of the State's approval or the State's financial support, or, if both were lost, the State's approval and the State's financial support.
(b) To regain eligibility to accept or enroll students receiving title IV, HEA funds in a teacher preparation program that was previously identified by the State as low-performing and that lost the State's approval or the State's financial support, the institution that offers the teacher preparation program must submit an application to the Secretary along with supporting documentation that will enable the Secretary to determine that the teacher preparation program has met the requirements under paragraph (a) of this section.
20 U.S.C. 1070g,
The additions and revisions read as follows:
(d) A definition for the following term used in this part is in Title II Reporting System, 34 CFR part 612:
Effective teacher preparation program.
(e) * * *
(i) Beginning with the 2021-2022 award year, is not classified by the State to be less than an effective teacher preparation program based on 34 CFR 612.4(b) in two of the previous three years; or
(ii) Meets the exception from State reporting of teacher preparation program performance under 34 CFR 612.4(b)(3)(ii)(D) or 34 CFR 612.4(b)(5).
(i) Beginning with the 2021-2022 award year, is not classified by the same State to be less than an effective teacher preparation program based on 34 CFR 612.4(b) in two of the previous three years; or
(ii) Meets the exception from State reporting of teacher preparation program performance under 34 CFR 612.4(b)(3)(ii)(D) or 34 CFR 612.4(b)(5).
(i) Is located within the area served by the LEA that is eligible for assistance pursuant to title I of the ESEA;
(ii) Has been determined by the Secretary to be a school or educational service agency in which more than 30 percent of the school's or educational service agency's total enrollment is made up of children who qualify for services provided under title I of the ESEA; and
(iii) Is listed in the Department's Annual Directory of Designated Low-Income Schools for Teacher Cancellation Benefits. The Secretary considers all elementary and secondary schools and educational service agencies operated by the Bureau of Indian Education (BIE) in the Department of the Interior or operated on Indian reservations by Indian tribal groups under contract or grant with the BIE to qualify as schools or educational service agencies serving low-income students.
(i) Provides at least one high-quality teacher preparation program not provided through distance education or one high-quality teacher preparation program provided through distance education at the baccalaureate or master's degree level that also provides supervision and support services to teachers, or assists in the provision of services to teachers, such as—
(A) Identifying and making available information on effective teaching skills or strategies;
(B) Identifying and making available information on effective practices in the supervision and coaching of novice teachers; and
(C) Mentoring focused on developing effective teaching skills and strategies;
(ii) Provides a two-year program that is acceptable for full credit in a TEACH Grant-eligible program offered by an institution described in paragraph (i) of this definition, as demonstrated by the institution that provides the two-year program, or provides a program that is the equivalent of an associate degree, as defined in § 668.8(b)(1), that is acceptable for full credit toward a baccalaureate degree in a TEACH Grant-eligible program;
(iii) Provides a high-quality teacher preparation program not provided through distance education or a high-quality teacher preparation program provided through distance education that is a post-baccalaureate program of study; or
(iv) Provides a master's degree program that does not meet the definition of terms “high-quality teacher preparation program not provided through distance education” or “high-quality teacher preparation program that is provided through distance education” because it is not subject to reporting under 34 CFR part 612, but that prepares:
(A) A teacher or a retiree from another occupation with expertise in a field in which there is a shortage of teachers, such as mathematics, science, special education, English language acquisition, or another high-need field; or
(B) A teacher who is using high-quality alternative certification routes to become certified.
(ii) A program that is a two-year program or is the equivalent of an associate degree, as defined in 34 CFR 668.8(b)(1), that is acceptable for full credit toward a baccalaureate degree in a TEACH Grant-eligible program; or
(iii) A master's degree program that does not meet the definition of the terms “high-quality teacher preparation not provided through distance education” or “high-quality teacher preparation program that is provided through distance education” because it is not subject to reporting under 34 CFR part 612, but that prepares:
(A) A teacher or a retiree from another occupation with expertise in a field in which there is a shortage of teachers, such as mathematics, science, special education, English language acquisition, or another high-need field; or
(B) A teacher who is using high-quality alternative certification routes to become certified.
(c) An otherwise eligible student who received a TEACH Grant for enrollment in a TEACH Grant-eligible program is eligible to receive additional TEACH Grants to complete that program, even if that program is no longer considered a TEACH Grant-eligible program, not to exceed four Scheduled Awards for an undergraduate or post-baccalaureate student and up to two Scheduled Awards for a graduate student.
The revision and addition read as follows:
(a) * * *
(1) * * *
(iii) Is enrolled in a TEACH Grant-eligible institution in a TEACH Grant-eligible program or is an otherwise eligible student who received a TEACH Grant and who is completing a program under § 686.3(c);
(d)
(1) Obtains a certification from a physician that the student is able to engage in substantial gainful activity as defined in 34 CFR 685.102(b);
(2) Signs a statement acknowledging that neither the new agreement to serve for the TEACH Grant the student receives nor any previously discharged agreement to serve which the grant recipient is required to fulfill in accordance with paragraph (d)(3) of this section can be discharged in the future on the basis of any impairment present when the new grant is awarded, unless that impairment substantially deteriorates and the grant recipient applies for and meets the eligibility requirements for a discharge in accordance with 34 CFR 685.213; and
(3) In the case of a student who receives a new TEACH Grant within three years of the date that any previous TEACH Grant service obligation or title IV loan was discharged due to a total and permanent disability in accordance with § 686.42(b), 34 CFR 685.213(b)(4)(iii), 34 CFR 674.61(b)(3)(v), or 34 CFR 682.402(c)(3)(iv), acknowledges that he or she is once again subject to the terms of the previously discharged TEACH Grant agreement to serve or resumes repayment on the previously discharged loan in accordance with 34 CFR 685.213(b)(7), 674.61(b)(6), or 682.402(c)(6) before receiving the new grant.
The revision reads as follows:
(d)
(1) At the time the grant recipient begins teaching in that field, even if that field subsequently loses its high-need designation for that State; or
(2) For teaching service performed on or after July 1, 2010, at the time the grant recipient begins teaching in that field or when the grant recipient signed the agreement to serve or received the TEACH Grant, even if that field subsequently loses its high-need designation for that State before the grant recipient begins teaching.
(b) If a grant recipient is performing full-time teaching service in accordance with the agreement to serve, or agreements to serve if more than one agreement exists, the grant recipient must, upon completion of each of the four required elementary or secondary academic years of teaching service, provide to the Secretary documentation of that teaching service on a form approved by the Secretary and certified by the chief administrative officer of the school or educational service agency in which the grant recipient is teaching. The documentation must show that the grant recipient is teaching in a low-income school. If the school or educational service agency at which the grant recipient is employed meets the requirements of a low-income school in the first year of the grant recipient's four elementary or secondary academic years of teaching and the school or educational service agency fails to meet those requirements in subsequent years, those subsequent years of teaching qualify for purposes of this section for that recipient.
(f) A grant recipient who taught in more than one qualifying school or more than one qualifying educational service agency during an elementary or secondary academic year and demonstrates that the combined teaching service was the equivalent of full-time, as supported by the certification of one or more of the chief administrative officers of the schools or educational service agencies involved, is considered to have completed one elementary or secondary academic year of qualifying teaching.
(b)
(2) If at any time the Secretary determines that the grant recipient does not meet the requirements of the three-year period following the discharge as described in 34 CFR 685.213(b)(7), the
(3) The Secretary's notification under paragraph (b)(2) of this section will—
(i) Include the reason or reasons for reinstatement;
(ii) Provide information on how the grant recipient may contact the Secretary if the grant recipient has questions about the reinstatement or believes that the agreement to serve was reinstated based on incorrect information; and
(iii) Inform the TEACH Grant recipient that he or she must satisfy the service obligation within the portion of the eight-year period that remained after the date of the discharge.
(4) If the TEACH Grant of a recipient whose TEACH Grant agreement to serve is reinstated is later converted to a Direct Unsubsidized Loan, the recipient will not be required to pay interest that accrued on the TEACH Grant disbursements from the date the agreement to serve was discharged until the date the agreement to serve was reinstated.
(a) * * *
(1) The grant recipient, regardless of enrollment status, requests that the TEACH Grant be converted into a Federal Direct Unsubsidized Loan because he or she has decided not to teach in a qualified school or educational service agency, or not to teach in a high-need field, or for any other reason;
Department of the Treasury.
Final rule.
The Secretary of the Treasury (the “Secretary”), as Chairperson of the Financial Stability Oversight Council (the “Council”), is adopting final rules (the “Final Rules”) in consultation with the Federal Deposit Insurance Corporation (the “FDIC”) to implement the qualified financial contract (“QFC”) recordkeeping requirements of the Dodd-Frank Wall Street Reform and Consumer Protection Act (the “Dodd-Frank Act” or the “Act”). The Final Rules require recordkeeping with respect to positions, counterparties, legal documentation, and collateral. This information is necessary and appropriate to assist the FDIC as receiver to: Fulfill its obligations under the Dodd-Frank Act in deciding whether to transfer QFCs; assess the consequences of decisions to transfer, disaffirm or repudiate, or allow the termination of, QFCs with one or more counterparties; determine if any risks to financial stability are posed by the transfer, disaffirmance or repudiation, or termination of such QFCs; and otherwise exercise its rights under the Act and fulfill its obligations under sections 210(c)(8), (9), or (10) of the Act.
The Final Rules are effective December 30, 2016.
Monique Y.S. Rollins, Deputy Assistant Secretary for Capital Markets, (202) 622-1745; Jacob Liebschutz, Director, Office of Capital Markets, (202) 622-8954; Peter Nickoloff, Financial Economist, Office of Capital Markets, (202) 622-1692; Steven D. Laughton, Assistant General Counsel (Banking & Finance), (202) 622-8413; or Stephen T. Milligan, Attorney-Advisor, (202) 622-4051.
Title II of the Dodd-Frank Act (“Title II”)
Section 210(c)(8)(H) of the Act requires the Federal primary financial regulatory agencies, as defined in the Act
Section 210(c)(8)(H) provides that if the PFRAs do not so prescribe such joint regulations by July 21, 2012, the Secretary, as Chairperson of the Council, shall prescribe such regulations in consultation with the FDIC. As the PFRAs did not prescribe such regulations by the statutory deadline, on January 7, 2015, the Secretary, as Chairperson of the Council, in consultation with the FDIC, requested public comment on proposed rules that would implement section 210(c)(8)(H) (the “Proposed Rules”).
The substantial constraints imposed by Title II on the FDIC's exercise of its rights with respect to QFCs, discussed in greater detail in the Supplementary Information to the Proposed Rules, necessitate the detailed, standardized recordkeeping requirements adopted in the Final Rules.
As referenced throughout this Supplementary Information to the Final Rules, Title II requires that the FDIC as receiver treat the QFCs of a covered financial company with a particular counterparty and that counterparty's affiliates consistently. Within certain constraints, the FDIC may take different approaches with respect to QFCs with different counterparties. However, if the FDIC as receiver desires to transfer any QFC with a particular counterparty, it must transfer all QFCs between the covered financial company and such counterparty and any affiliate of such counterparty to a single financial institution. Similarly, if the FDIC desires to disaffirm or repudiate any QFC with a particular counterparty, it must disaffirm or repudiate all QFCs between the covered financial company and such counterparty and any affiliate of such counterparty.
Furthermore, the FDIC is required to confirm that the aggregate amount of liabilities, including QFCs, of the covered financial company that are transferred to, or assumed by, the bridge financial company from the covered financial company does not exceed the aggregate amount of the assets of the covered financial company that are transferred to, or purchased by, the bridge financial company from the covered financial company.
The Secretary has determined that, given these statutory constraints, it is necessary and appropriate for the FDIC as receiver to have access to detailed, standardized records from the financial companies that potentially would be the most likely to be considered for orderly liquidation under Title II. Nonetheless, having considered the comments received, the Secretary has determined that it is possible to reduce the scope of financial companies subject to the rules and the extent of recordkeeping required while still requiring the records the FDIC would need as receiver in order to exercise its rights under the Act and fulfill its obligations under sections 210(c)(8), (9), or (10). In particular, the Secretary has made changes in the Final Rules that provide for further differentiation among financial companies by:
• Adding to the definition of “records entity” new thresholds based on the level of a financial company's derivatives activity;
• providing an exclusion for insurance companies;
• providing a conditional exemption for clearing organizations; and
• providing a de minimis exemption from the recordkeeping requirements, other than the requirement to maintain copies of the documents that govern QFC transactions, for entities that are party to 50 or fewer open QFC positions.
The Final Rules also significantly reduce the burden of the required recordkeeping by, among other things:
• Revising the definition of “records entity” to identify which members of a corporate group are records entities by reference to whether they are consolidated under accounting standards;
• replacing the requirement to maintain organizational charts of counterparties with a requirement to identify only certain information as to each counterparty, such as the ultimate and immediate parent entities of the counterparty;
• eliminating the requirement to maintain risk metrics information;
• eliminating the requirement to maintain copies of additional information with respect to QFCs provided by the records entity to other regulators, swap data repositories, and security-based swap data repositories;
• eliminating the requirement that copies of QFC agreements be searchable;
• eliminating several fields from the required data tables; and
• providing for tiered initial compliance dates based on the size of the corporate group, with all records entities having additional time to comply with the rules.
The Final Rules also provide for additional fields in the required data tables that are not anticipated to impose a significant additional burden on records entities. In addition, the proposed requirement that records of affiliated records entities be maintained in a form that allows for aggregation has been replaced in the Final Rules with the requirement that the top-tier parent financial company be capable of aggregating such records.
The following discussion provides a summary of the Proposed Rules, the comments received, and the Secretary's responses to those comments, including modifications made in the Final Rules. In addition to the considerations discussed in this section, the Secretary, in adopting these Final Rules, has taken into account the potential costs and benefits of the rules discussed in Section III below.
Section 148.1(a) of the Final Rules defines the scope of the rules. Section 148.1(b) explains the purpose of the rules. Sections 148.1(c) and (d) set forth the rules' effective and compliance dates.
The scope of the Final Rules is established by certain key definitions that determine the entities that would be subject to the rules. Specifically, section 148.1(a) of the Final Rules provides that the rules apply to any “financial company” that is a “records entity” and, with respect to section 148.3(a), to the “top-tier financial company” of a “corporate group,” as those terms are defined in the Final Rules.
• Financial companies that are not incorporated or organized under U.S. federal or state law;
• Farm Credit System institutions;
• Governmental entities, and regulated entities under the Federal Housing Enterprises Financial Safety and Soundness Act of 1992;
• Insured depository institutions.
As described below, the Secretary has modified the definition of “records entity” in order to further differentiate financial companies by reference to certain factors listed in section 210(c)(8)(H)(iv) and to reduce the costs of complying with the rules. This has the effect of substantially narrowing the scope of entities subject to the recordkeeping requirements of the Final Rules, as discussed more fully below, and thereby reducing the costs imposed by the rules. Furthermore, as discussed below, the Secretary has eliminated the phrase “guarantees, supports, or is linked to an open QFC” from the definition of “records entity” in the Final Rules.
The proposed $50 billion asset threshold received substantial attention from commenters. Several commenters stated that reliance on this threshold would lead to an overbroad application of the recordkeeping requirements and argued for a more tailored approach that would focus on those institutions that are more likely to be resolved under Title II.
The Secretary is making two changes to the definition of “records entity” in the Final Rules that will, by incorporating additional factors, substantially reduce the number of entities that will be subject to recordkeeping requirements. These measures relate to several of the factors specifically enumerated in section 210(c)(8)(H) of the Act and allow the Secretary to better limit the financial companies included within the scope of records entities to those companies that potentially would be the most likely to be considered for orderly liquidation under Title II.
First, the Final Rules specifically include in the definition of “records entity” those entities that are identified as global systemically important banks (“G-SIBs”) under the Federal Reserve's rules.
However, the Secretary believes that to include only the G-SIBs identified by the Federal Reserve, along with designated financial market utilities and nonbank financial companies subject to a Council determination, within the definition of “records entity” would unduly limit the entities that would be subject to the recordkeeping rules. The G-SIBs identified under the Federal Reserve's rules by definition only include U.S. top-tier bank holding companies, whereas other types of financial companies potentially would also be among the most likely financial companies to be considered for orderly liquidation under Title II. Therefore, in addition to adding the G-SIBs to the definition of “records entity,” the Secretary has chosen to maintain the $50 billion threshold but supplement it with an additional factor tied to a financial company's level of derivatives activity. Specifically, section 148.2(n)(iii)(D) of the Final Rules provides that in addition to having total consolidated assets equal to or greater than $50 billion, an entity must on a consolidated basis have either (i) total gross notional derivatives outstanding equal to or greater than $250 billion or (ii) derivative liabilities equal to or greater than $3.5 billion in order to be deemed a records entity under that prong of the definition. As explained below, this approach incorporates the most relevant factors into the definition of “records entity” by reference to metrics that are already generally calculated by financial companies.
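For illustration only, the following sketch (with hypothetical names; not part of the rule text) shows how the asset-and-derivatives-activity prong just described could be checked once a financial company's consolidated figures are in hand. The thresholds are those stated above.

    # Illustrative sketch: applying the asset and derivatives-activity prong of the
    # "records entity" definition, assuming consolidated figures are already computed.
    ASSET_THRESHOLD = 50e9                # $50 billion total consolidated assets
    GROSS_NOTIONAL_THRESHOLD = 250e9      # $250 billion gross notional derivatives outstanding
    DERIV_LIABILITIES_THRESHOLD = 3.5e9   # $3.5 billion derivative liabilities

    def meets_asset_and_derivatives_prong(total_assets, gross_notional, derivative_liabilities):
        """Hypothetical helper: True if the company meets the $50 billion asset test
        and either derivatives-activity test described above."""
        if total_assets < ASSET_THRESHOLD:
            return False
        return (gross_notional >= GROSS_NOTIONAL_THRESHOLD
                or derivative_liabilities >= DERIV_LIABILITIES_THRESHOLD)

    # Example: a $60 billion company with $40 billion of gross notional but $4 billion
    # of derivative liabilities meets the prong via the derivative liabilities test.
    print(meets_asset_and_derivatives_prong(60e9, 40e9, 4e9))  # True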
Gross notional derivatives outstanding relates directly to three of the factors enumerated in section 210(c)(8)(H)(iv)—complexity, interconnectedness, and the dollar amount of QFCs. Gross notional derivatives outstanding is used in the Federal Reserve's methodology for identifying G-SIBs as an indicator of complexity.
Unlike some other potential measures of complexity and interconnectedness and unlike the measures of the volume of QFCs generally, gross notional derivatives outstanding is a measure that the Secretary understands is generally already calculated, and in most cases reported or disclosed, by financial companies with assets of $50 billion or more. Bank holding companies with assets of $50 billion or more are required to report to the Federal Reserve the amount of gross notional derivatives outstanding quarterly on Schedule HC-L of Form Y-9C and annually on Schedule D of Form Y-15. Financial companies often satisfy the requirement to disclose in their financial statements the volume of their derivatives activity by disclosing the amount of gross notional derivatives outstanding.
Referring to gross notional derivatives outstanding alone, however, would not be sufficient to identify financial companies with large exposures to derivatives. The Final Rules include the amount of a financial company's derivative liabilities as an alternative measure by which a financial company may be deemed a records entity. The Final Rules define “derivative liabilities” as the fair value of derivative instruments in a negative position that are outstanding as of the end of the most recent fiscal year as recognized and measured in accordance with GAAP or other applicable accounting standards, taking into account the effects of master netting agreements and cash collateral held with the same counterparty on a net basis to the extent reflected on the financial company's financial statements. This metric, like total gross notional derivatives outstanding, serves as a proxy for interconnectedness, as a company that has a greater level of derivative liabilities would have higher counterparty exposure throughout the financial system. For this reason, derivative liabilities is one of the metrics used by the Council for identifying nonbank financial companies that may merit further evaluation for a potential determination under section 113.
The inclusion of both the total gross notional amount of derivatives outstanding and derivative liabilities thresholds in the definition of “records entity” will better capture entities that are using substantial amounts of derivatives. The amount of total gross notional derivatives outstanding is an amount that may not, by itself, be fully representative of the interconnection and complexity of an entity and its QFC activities. For example, the notional amount of interest rate derivatives tends to be significantly larger than the notional amount of credit derivatives representing comparable levels of fair value risk, yet both types of derivatives are indicative of the interconnection and complexity of an entity. In turn, reference to derivative liabilities alone could obscure entities' level of derivatives activity to the extent a financial company's financial statements take into account the effects of netting agreements and cash collateral held with the same counterparty on a net basis. Although such netting may reduce the risk to the entity from engaging in such derivatives, even a derivatives portfolio with a low negative fair value after accounting for the effects of master netting agreements and cash collateral held with the same counterparty is indicative of interconnection and complexity if it is sufficiently large on a gross notional basis.
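The interaction of the two metrics can be seen in a simple worked example. The sketch below is illustrative only; the position data and the netting treatment are assumptions rather than the rule's prescribed methodology. It computes gross notional derivatives outstanding and derivative liabilities for a small hypothetical portfolio, showing how netting under a master netting agreement can leave derivative liabilities low even though gross notional exposure remains large.

    # Illustrative sketch: computing the two derivatives metrics discussed above from
    # position-level fair values, netting negative and positive fair values by
    # counterparty only where a master netting agreement applies.
    from collections import defaultdict

    positions = [
        # (counterparty, notional, fair_value, has_master_netting_agreement)
        ("CP-A", 120e9,  1.0e9, True),
        ("CP-A", 130e9, -1.2e9, True),    # largely offset against CP-A's positive position
        ("CP-B",  20e9, -0.8e9, False),   # no netting agreement: counts in full
    ]

    gross_notional = sum(notional for _, notional, _, _ in positions)

    netted = defaultdict(float)
    unnetted_liabilities = 0.0
    for counterparty, _, fair_value, has_mna in positions:
        if has_mna:
            netted[counterparty] += fair_value
        elif fair_value < 0:
            unnetted_liabilities += -fair_value

    derivative_liabilities = unnetted_liabilities + sum(-v for v in netted.values() if v < 0)

    print(f"gross notional: {gross_notional:,.0f}")               # 270e9: above the $250 billion threshold
    print(f"derivative liabilities: {derivative_liabilities:,.0f}")  # 1.0e9: below the $3.5 billion threshold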
By including reference to total assets, notional amount of derivatives, and derivative liabilities, the Secretary has incorporated, as explained above, consideration of size, complexity, interconnectedness to the financial system, and the dollar amount of QFCs into the definition of “records entity.” Size, complexity, and interconnectedness to the financial system are, in turn, all indicators of risk, particularly risk to financial stability.
The Final Rules provide for thresholds of $250 billion of total gross notional derivatives outstanding and $3.5 billion of total derivative liabilities. As noted above, bank holding companies with $50 billion or more in total consolidated assets report both total gross notional derivatives outstanding and derivative liabilities in regulatory filings. As of December 31, 2015, all of the G-SIBs were above the thresholds for total gross notional amount of derivatives outstanding and derivative liabilities and in most cases were significantly above the thresholds.
Another reason for setting the thresholds at these levels is to provide for some degree of stability in the set of financial companies that are deemed to be records entities. In looking back across the previous eight quarters, the bank holding companies with derivative liabilities currently at or above the $3.5 billion threshold were at or above the threshold in nearly every quarter, while those with total derivative liabilities currently below the threshold were below the threshold in each quarter. Similarly, for total gross notional derivatives outstanding, bank holding companies at or above the $250 billion threshold were at or above the threshold in nearly every quarter over the last eight quarters, while those with total gross notional derivatives outstanding currently below the threshold were below the threshold in nearly every quarter over the last eight quarters.
Similar trends are evidenced among other public financial companies reporting derivative liabilities and total gross notional derivatives outstanding.
Several commenters stated that the use of the definition of “affiliate,” discussed further below, had the effect of including too broad a scope of affiliates within the definition of “records entity.”
As discussed further below, the Secretary has adopted the suggestion of commenters, noted above, to revise the definition of “records entity” to identify which members of a corporate group are records entities by reference to whether they are consolidated under accounting standards. This change should have the effect of reducing the number of records entities. The Final Rules do not otherwise revise the scope of members of a corporate group that are included as records entities because the Secretary has decided that it is not possible to describe, in advance and as a general matter, the members of a corporate group whose QFC records the FDIC as receiver would not need in order to exercise its rights and fulfill its obligations under the Act.
Moreover, information about QFCs of each of the members of the corporate group could be of assistance to the FDIC as receiver in deciding whether to transfer the QFCs to a bridge financial company by giving the FDIC a full understanding of the impact of any transfer of the QFCs on the records entity's corporate group. For example, consider QFCs that the FDIC might otherwise determine to retain in the receivership rather than transfer to a bridge financial company to which the equity in all of the records entity's subsidiaries has been transferred. If, by reference to a subsidiary's QFC records, the FDIC determines that those QFCs are offset by QFCs between the subsidiary and another counterparty, the FDIC as receiver may decide to transfer the records entity's QFCs to the bridge financial company in order to maintain a matched book, at the corporate group level, with the QFCs of the subsidiary.
The Secretary has, instead of excluding certain types or sizes of members of a corporate group from the definition of “records entity,” differentiated among financial companies by providing the de minimis exemption discussed below for records entities that are a party to 50 or fewer open QFC positions. As discussed below, the FDIC has advised the Secretary that it would be able to review the terms of that number of QFCs on a manual basis within the time frame provided by Title II. The de minimis exemption included in the Final Rules will, unlike commenters' proposed exclusions based on the materiality of the records entity, avoid a situation in which the FDIC as receiver will not have the records it may need for a particular records entity.
(1) An insured depository institution as defined in 12 U.S.C. 1813(c)(2);
(2) A subsidiary of an insured depository institution that is not a functionally regulated subsidiary as defined in 12 U.S.C. 1844(c)(5), a security-based swap dealer as defined in 15 U.S.C. 78c(a)(71), or a major security-based swap participant as defined in 15 U.S.C. 78c(a)(67); or
(3) A financial company that is not a party to a QFC and controls only exempt entities as defined in clause (1) of this definition.
The Final Rules use the term “excluded entity” rather than “exempt entity,” as used in the Proposed Rules, in order to avoid confusion with the Secretary's authority to grant exemptive relief from the requirements of the Final Rules. Several commenters requested the addition of other types of entities to the list of excluded entities, as discussed below.
Having considered these comments and the requirements of section 203(e) of the Act, the Secretary is excluding insurance companies from the definition of “records entity” in the Final Rules. Given that the liquidation or rehabilitation of an insurance company under Title II would be conducted under state law, to subject insurance companies to the requirements of the rules would not assist the FDIC as receiver in exercising its rights under the Act or fulfilling its obligations under sections 210(c)(8), (9), or (10). As discussed below, a definition of “insurance company” has been added in the Final Rules to ensure consistency with the application of section 203(e) of the Act.
Commenters also requested that certain non-insurance affiliates of insurance companies be excluded from the scope of the rules; specifically, they requested that non-insurance affiliates within a holding company structure that is predominantly engaged in insurance activities be excluded.
The definition of “records entity” in the Final Rules would include only extremely large and interconnected asset management firms, and, for the reasons discussed above, investment advisers that are members of a corporate group that is subject to the rules. Although commenters cited examples of mergers and closures of funds and advisers that were conducted in an orderly fashion as demonstrating the unlikelihood of the need to resolve such entities under Title II, these examples did not address the potential effects of the rapid failure of a fund or of an asset management firm or other corporate group of the size and complexity that would be subject to the Final Rules.
The Secretary has made certain other changes in the Final Rules that will further reduce their impact on asset management firms, including a change made in response to the proposal of a commenter who noted that an investment adviser may be party to a QFC of one of its funds or clients for the limited purpose of providing a representation.
Commenters stated that the FDIC should coordinate with the clearing organizations' primary regulators (the Commodity Futures Trading Commission (“CFTC”) or SEC, as applicable) and utilize to the maximum extent practicable the existing reporting regulations, mechanisms, and formats already applicable to clearing organizations.
The Secretary acknowledges that all derivatives clearing organizations are required by the CFTC to maintain extensive records.
In addition, as commenters noted, the unique nature of derivatives clearing organizations makes it possible that their existing recordkeeping practices would be sufficient to meet the needs of the FDIC. The unique characteristics include the following: (i) A clearing organization's only counterparties are its clearing members; (ii) it enters into, or clears, a prescribed set of QFCs; (iii) it maintains a consolidated recordkeeping system to calculate aggregate exposures and margin requirements of its clearing members; and (iv) all transactions are governed by the rulebook of the clearing organization rather than individual legal agreements. The data requirements of the tables included in the Proposed Rules and the Final Rules were created with the expectation that the FDIC as receiver might need to make decisions as to whether to transfer, disaffirm or repudiate, or allow the termination of QFCs with a specific counterparty and its affiliates. In the case of a clearing organization, in contrast, a significant focus of the FDIC would be maintaining the clearing organization's matched book of QFCs. In these cases, the most relevant data would be the type of data that would be of value to a transferee in managing the transferred QFC portfolio, and this is the type of data that clearing organizations are required by their primary regulators to maintain and report.
Having considered the foregoing, the Secretary has determined, after consulting with the FDIC, that the FDIC would be able to exercise its rights under the Act and fulfill its obligations under sections 210(c)(8), (9), or (10) of the Act if it has access to the records currently required to be maintained by clearing organizations. Accordingly, the Final Rules provide that a clearing organization that is a records entity is exempt from the recordkeeping requirements of the rules, conditioned on its continued compliance with the recordkeeping requirements of its primary regulator (the CFTC or the SEC, as applicable).
The Secretary has decided to simplify the rules by omitting references to “guaranteed or supported” and “linked.” Under the Final Rules, a financial company would, in addition to meeting the other criteria discussed above, have to be a party to an open QFC in order to be a “records entity,” and such a records entity would only be required to maintain records with respect to its QFCs. This change reduces the complexity of the rules but generally would not be expected to change significantly which entities would be records entities because guarantees and other credit enhancements of QFCs are themselves QFCs.
The FDIA, by reference to section 2 of the BHC Act, provides that a company has control over another company if the first company directly or indirectly, or acting through one or more persons, owns, controls, or has the power to vote 25 percent or more of any class of voting securities of the other company; controls in any manner the election of a majority of the directors or trustees of the other company; or the Federal Reserve determines, after notice and opportunity for hearing, that the first company directly or indirectly exercises a controlling influence over the management or policies of the other company. The first two prongs of the definition of “control” in the Proposed Rules are consistent with the BHC Act definition. The third prong of the definition of “control” in the Proposed Rules, that an entity controls another entity if it must consolidate that entity for financial or regulatory purposes, was proposed to reflect the fact that, in certain situations, a controlling interest may be achieved through arrangements that do not involve voting interests and to provide an objective test that does not require a determination by the Federal Reserve. In the Proposed Rules, the definitions of “affiliate” and “control” related both to (1) the determination of which members of a corporate group would be records entities and (2) the information that would be required to be maintained by records entities as to the identities of affiliates of counterparties.
One commenter stated that existing recordkeeping and operational controls with respect to QFCs are customarily maintained by parent companies or other entities that have majority ownership of or are otherwise required to consolidate the entities engaging in QFC activity for financial and regulatory purposes.
The Secretary has determined that the FDIC as receiver in a Title II resolution would need to know the identities of the affiliates, as defined by reference to the BHC Act definition of “control,” of the records entity's counterparties. Specifically, as referenced above, section 210(c)(9)(A) of the Act provides the FDIC as receiver shall transfer to one transferee either all or none of the QFCs of a counterparty and the counterparty's “affiliates,” as defined by reference to the BHC Act definition of “control.”
As discussed below, the Proposed Rules would have required a records entity to identify each affiliate of a counterparty by maintaining full organizational charts of the corporate group of a QFC counterparty. This has been replaced in the Final Rules with a requirement in the tables in the appendix to the rules to maintain records as to the identity of the immediate and ultimate parent entity of each counterparty, which will allow the FDIC to identify affiliated counterparties based on their common parent and ultimate parent entities. A new term, “parent entity,” has been defined for this purpose as an entity that controls another entity.
In addition, the Final Rules have been revised to conform the third prong in the definition of “control” to that provided in the BHC Act, under which control exists where the Federal Reserve determines, after notice and opportunity for hearing, that a company directly or indirectly exercises a controlling influence over the management or policies of another company.
As to the determination of which members of a corporate group would be records entities, the Secretary has adopted the request of commenters, referenced above, to define “records entity” by reference to whether an entity is consolidated under accounting standards. Specifically, under the Final Rules, “records entity” is defined to include a member of a corporate group that consolidates, is consolidated with, or is consolidated by the financial company member of the corporate group that meets the other criteria of the definition of “records entity.”
This change addresses the concerns identified by commenters that members of a corporate group would not have access to the records of a minority-owned entity or joint venture and is intended to better align the identification of records entities with existing recordkeeping practices of corporate groups. The modification of the definition of “records entity” is also responsive to concerns from commenters that the scope of the Proposed Rules would have been too broad, given that accounting consolidation generally requires a higher level of affiliation than the 25 percent voting interest standard of the BHC Act definition of “control.”
Two commenters stated that the definition of “affiliate” could deem investment companies that are “seeded” with an initial capital investment by the fund's sponsor to be affiliates of that sponsor during the period before such a fund attracted third party investors.
Section 148.1(a) of the Final Rules provides that the recordkeeping requirements apply to each financial company that qualifies as a records entity and, with respect to section 148.3(a), to the top-tier financial company of a corporate group. As discussed above, the Secretary received numerous comments on the Proposed Rules pertaining to the definition of “records entity.” Section 210(c)(8)(H) of the Dodd-Frank Act gives the Secretary broad flexibility in determining the scope of the recordkeeping requirements as necessary or appropriate in order to assist the FDIC as a receiver for a covered financial company in being able to exercise its rights under the Act and fulfill its obligations under sections 210(c)(8), (9), or (10) of the Act. Section 210(c)(8)(H) also requires the regulations to differentiate among financial companies, as appropriate, by taking into consideration their size, risk, complexity, leverage, frequency and dollar amount of QFCs, interconnectedness to the financial system, and any other factors deemed appropriate. As discussed earlier, the Secretary has complied with these requirements and consulted extensively with the FDIC.
The Secretary anticipates that records entities may include the following types of financial companies:
Section 148.1(b) of the Proposed Rules provided that the purpose of the rules is to establish QFC recordkeeping requirements for a records entity in order to assist the FDIC as receiver for a covered financial company. The Secretary did not receive any comments requesting changes to this section and has not modified it from the Proposed Rules.
Section 148.1(c) of the Proposed Rules provided that the rules would become effective 60 days after publication of the Final Rules in the Federal Register.
Several commenters submitted that the proposed compliance period would be an inadequate amount of time for implementation because of the significant information systems upgrades and changes in recordkeeping practices that commenters said would be required for implementation.
In response to these comments, the Final Rules provide additional time to all records entities to comply with the requirements of the rules. All records entities will have 90 days after the effective date of the rules to comply with the requirement to provide point of contact information to their PFRAs and the FDIC; this extension will provide additional time to financial companies to determine whether they are records entities under the rules. As to the remainder of the requirements of the rules, the Final Rules provide staggered compliance dates that will provide all records entities with additional time to comply with the recordkeeping requirements. The Final Rules provide that records entities with $1 trillion or more in total consolidated assets and the financial company members of their corporate group will have 540 days (approximately 18 months) after the effective date to comply with the rules. The Secretary understands that only the four largest G-SIBs would meet this threshold on the effective date. The Secretary has determined that it is important for data on the largest, most systemically important entities to be available as soon as reasonably possible. The FDIC has advised that, in general, large insured depository institutions subject to the Part 371 recordkeeping requirements have been able to comply with those requirements within 270 days. Although the recordkeeping requirements under the Final Rules are more detailed in many respects than those under Part 371, the Secretary believes that the extra time allotted for compliance should be sufficient to allow the largest financial companies to adapt the processes, procedures, and systems to comply with the Final Rules.
Under the Final Rules, all other records entities will have at least two years to comply with the rules' recordkeeping requirements. Records entities with total assets equal to or greater than $500 billion (but less than $1 trillion) and financial company members of the corporate group of such entities will have two years from the effective date to comply. Records entities with total assets equal to or greater than $250 billion (but less than $500 billion) and financial company members of the corporate group of such entities will have three years from the effective date to comply. All other records entities will have four years from the effective date to comply.
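For illustration only, the staggered schedule described above amounts to a simple lookup on total consolidated assets. The sketch below uses the December 30, 2016 effective date stated above and approximates the two-, three-, and four-year periods as multiples of 365 days, which is a simplification rather than the rule's own computation; the helper name is hypothetical.

    # Illustrative sketch: mapping a records entity's total consolidated assets to the
    # initial compliance period described above (540 days, or two, three, or four years
    # after the effective date).
    from datetime import date, timedelta

    EFFECTIVE_DATE = date(2016, 12, 30)  # effective date of the Final Rules stated above

    def initial_compliance_date(total_assets):
        """Hypothetical helper: approximate initial compliance date for a records entity.
        Financial company members of the entity's corporate group follow the same tier."""
        if total_assets >= 1e12:      # $1 trillion or more
            return EFFECTIVE_DATE + timedelta(days=540)
        if total_assets >= 500e9:     # $500 billion to under $1 trillion
            return EFFECTIVE_DATE + timedelta(days=2 * 365)
        if total_assets >= 250e9:     # $250 billion to under $500 billion
            return EFFECTIVE_DATE + timedelta(days=3 * 365)
        return EFFECTIVE_DATE + timedelta(days=4 * 365)

    print(initial_compliance_date(1.3e12))  # 2018-06-23 (540 days after the effective date)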
The Final Rules provide for a staggered schedule based on the total consolidated assets of the records entities (or other members of their corporate group) on the understanding that larger entities will generally have greater capacity to devote to the task of coming into initial compliance with the rules. In addition, because the Department of the Treasury and the FDIC anticipate providing guidance to records entities as they work to come into compliance with the rules, the staggered compliance schedule will permit staff of the Department of the Treasury and the FDIC to allocate their resources to address more efficiently requests for guidance from each tier of records entities in turn. The commenter's proposal to provide for staggered compliance based on type of QFC would mean that the FDIC would not have records of meaningful use under Title II until the final compliance deadline had been met, given the requirement, discussed above, that if the FDIC as receiver decides to (i) transfer any QFC with a particular counterparty, it must transfer all QFCs between the covered financial company and such counterparty and any affiliate of such counterparty to a single financial institution and (ii) disaffirm or repudiate any QFC with a particular counterparty, it must disaffirm or repudiate all QFCs between the covered financial company and such counterparty and any affiliate of such counterparty. In contrast, the compliance schedule provided for in the Final Rules would provide the FDIC with complete records for a successively larger set of companies.
The Final Rules provide that a financial company that becomes a records entity after the effective date
Under Section 148.1(d)(2) of the Proposed Rules, a financial company that no longer qualifies as a records entity would have been permitted to cease maintaining records one year after it ceases to qualify as a records entity. The definition of “records entity” in section 148.2(n) of the Final Rules provides that a company that is a records entity by virtue of exceeding the total assets and derivatives exposure thresholds shall remain a records entity until one year after it ceases to meet the total assets and derivatives exposure thresholds. Financial companies that are members of such a corporate group would be subject to the same provision. However, in a change from the Proposed Rules, any company that is a records entity because it meets the other criteria of the definition shall cease to be a records entity and thus shall cease to be subject to the rules immediately upon ceasing to meet such criteria. For example, a nonbank financial company with respect to which the Council rescinds a determination under section 113 would no longer be a records entity upon such rescission.
The Proposed Rules provided that a financial company that becomes subject to the rules again after it had ceased recordkeeping would be required to comply with the requirements of the rules within 90 days of the date it again becomes subject to the rules. The Final Rules extend that period to 365 days, but if a longer period still remains under the applicable initial compliance period discussed above, the entity has until the end of that longer period to comply with the rules.
Section 148.1(d)(3) of the Final Rules, consistent with section 148.3(c)(3) of the Proposed Rules, authorizes the Secretary, in consultation with the FDIC, to grant extensions of time with respect to compliance with the recordkeeping requirements. As discussed in the Supplementary Information to the Proposed Rules, it is anticipated that such extensions of time would apply when records entities first become subject to the rules and likely would not be used to adjust the time periods specified in the maintenance and updating requirements of section 148.3(b) of the Final Rules. Extensions of time may also be appropriate on a limited basis with respect to a records entity that is temporarily incapable of generating records due to unforeseen technical issues.
Finally, section 148.1(d)(4) of the Final Rules provides that a top-tier financial company must comply with the requirement, discussed below, to be capable of generating a single, compiled set of records of all the members of its corporate group on the same date on which the records entity members of its corporate group are required to comply with this part.
In addition to the definitions described in detail above in reference to the scope of the Proposed Rules, certain additional terms were defined in the Proposed Rules to describe a records entity's recordkeeping obligations. The Secretary did not receive any comments on these definitions.
The definition of “primary financial regulatory agency” has been revised to include, with respect to a financial market utility that is subject to a designation pursuant to section 804 of the Act, the Supervisory Agency for that financial market utility, as defined in section 803(8) of the Act, if such financial market utility would not otherwise have a PFRA.
The term “total assets,” which is used both in the definition of “records entity” and for determining a particular records entity's compliance date, is defined in the Final Rules by reference to the audited consolidated statement of financial condition submitted to the financial company's PFRAs or, if no such statement is submitted, to the financial company's consolidated balance sheet for the most recent fiscal year end, as prepared in accordance with GAAP or other applicable accounting standards. This definition is unchanged from the Proposed Rules other than the addition of the reference to GAAP or other applicable accounting standards. One commenter proposed excluding from the definition of “total assets” any assets under management, even if those assets are included on a balance sheet under applicable accounting standards.
The Final Rules also include several additional definitions. A definition of “legal entity identifier,” previously provided in the appendix, has been added to section 148.2. In addition, a definition of “parent entity” has been added because, as discussed below, the appendix has been revised in the Final Rules to require information regarding the immediate and ultimate parent entity of a counterparty to a QFC rather than a full organizational chart for each counterparty. In order to align with the definition of “affiliate” in Title II, as discussed above, “parent entity” is defined in the Final Rules as “an entity that controls another entity.”
Because, as discussed above, the Final Rules exclude insurance companies from the definition of “records entity,” a definition of “insurance company” has been added. In addition to incorporating the definition of “insurance company” provided in Title II, the definition in the Final Rules includes mutual insurance holding companies that meet the conditions, specified by the FDIC in part 380 of its rules, for being treated as an insurance company for the purpose of section 203(e) of the Act.
Accordingly, section 148.3(a)(1) has been revised in the Final Rules to provide that a top-tier financial company, defined as a financial company that is a member of a corporate group consisting of multiple records entities and that is not itself controlled by another financial company, must be able to generate a single, compiled set of the records, in electronic form, for all records entities in the corporate group that it consolidates or are consolidated with it, in a format that allows for aggregation and disaggregation of such data by records entity and counterparty. By limiting this requirement to records of records entities that are consolidated by or with the top-tier financial company, the Secretary has sought to avoid circumstances in which the top-tier financial company might not have access to the records it is required to compile. The top-tier financial company may comply with this requirement by providing that any of its affiliates or any third-party service provider maintains the capability of generating the single, compiled set of the records, in electronic form, for all records entities in the corporate group; provided, however, that the top-tier financial company shall itself maintain records under this part in the event that such affiliate or service provider shall fail to maintain such records.
Section 148.3(a)(3) of the Proposed Rules provided that each records entity designate a point of contact to enable its PFRA and the FDIC to contact the records entity with respect to the rules and to update this information within 30 days of any change. The Secretary did not receive any comments on this subsection, which in the Final Rules appears as section 148.3(a)(2), and has not modified it from the Proposed Rules, other than by subjecting the top-tier financial company of a corporate group to this requirement and by making certain technical changes.
Section 148.3(a)(4) of the Proposed Rules provided that each records entity that is regulated by a PFRA be capable of providing all QFC records specified in the rules to its PFRA within 24 hours of request. This provision has been revised as section 148.3(a)(3) of the Final Rules to provide that the records entity is required to be capable of providing electronically, within 24 hours of the request of the PFRA, all QFC records specified in the rules to both its PFRA and the FDIC. This change has been made to ensure that the records will be maintained in a format that is compatible with the FDIC's systems and to avoid any delay resulting from the records having to be transmitted from the PFRA to the FDIC.
All QFCs, regardless of their tenor, their volume, and how they are settled, are subject to the requirement, discussed above, that if the FDIC as receiver determines (i) to transfer any QFC with a particular counterparty, it must transfer all QFCs between the covered financial company and such counterparty and any affiliate of such counterparty to a single financial institution or (ii) to disaffirm or repudiate any QFC with a particular counterparty, it must disaffirm or repudiate all QFCs between the covered financial company and such counterparty and any affiliate of such counterparty. The large volume of these short-term transactions supports the determination that the QFC information required to be provided must be maintained in the standard format specified in the rules to ensure rapid aggregation and evaluation of the information by the receiver. Whether these transactions are exchange traded will not necessarily affect the FDIC's decision as to whether to transfer the QFCs in question; rather, the FDIC's decision as to whether to transfer a particular counterparty's QFCs will be based on an evaluation of the other information required to be collected under the Final Rules and on an evaluation of the impact of such transfer on the receivership and U.S. financial stability. Furthermore, for corporate groups that include members that are subject to different recordkeeping regimes, permitting entities to rely on their existing records would not be consistent with the requirement for the top-tier financial company to be capable of generating a single, compiled set of QFC records in a format that allows for aggregation and disaggregation of such data. The Secretary notes, however, that under the exemptive process provided in the rules and discussed below, a records entity may apply for relief from particular requirements as to the information to be maintained by a records entity for a particular type of QFC or counterparty. Any exemptive relief requested with respect to a particular type of QFC or counterparty would need to be defined in such a way as to ensure consistency of treatment by each records entity.
Section 148.3(b) of the Proposed Rules would have required that each records entity maintain the capacity to produce QFC records on a daily basis based on previous end-of-day records and values. The Secretary has clarified in the Final Rules that, if records are maintained on behalf of a records entity by an affiliate or service provider, such records entity shall itself maintain records under this part in the event that such affiliate or service provider fails to maintain such records. The Secretary confirms that, as was suggested by a commenter, the information required to be capable of being provided shall be with respect to QFCs as of the end of the day on the date the request is provided.
Section 148.3(c) of the Proposed Rules provided that upon written request by a records entity, the FDIC, in consultation with the PFRAs for the records entity, may recommend that the Secretary grant a specific exemption from compliance with one or more of the requirements of the rules. In addition, under the Proposed Rules, the Secretary would also have been permitted to issue exemptions that have general applicability upon receipt of a recommendation from the FDIC, in consultation with the PFRAs for the applicable records entities.
One commenter suggested that exemptions should be granted by the PFRAs for a records entity rather than by the Secretary.
In addition, the Secretary has simplified the exemption provision by consolidating the separate provisions for general and specific exemptions and has specified in the Final Rules what a request for an exemption must contain. In determining whether to grant any requests from records entities for exemptions, the Secretary may take into consideration their size, risk, complexity, leverage, frequency and dollar amount of QFCs, interconnectedness to the financial system, and any other factors deemed appropriate, including whether the application of one or more requirements of the rules is not necessary or appropriate to achieve the purpose of the rules.
Several commenters argued that the requirements of the Proposed Rules should not apply to records entities that have a minimal level of QFC activity. Commenters noted that a financial company might be subject to the recordkeeping requirements of the Proposed Rules even if it is a party to only a single QFC.
After consideration of these comments, the Secretary has determined that an exemption from the preponderance of the recordkeeping requirements is appropriate for records entities with only a minimal number of open QFC positions.
The recordkeeping requirements of Part 371 of the FDIC's rules relax the recordkeeping requirements for institutions with fewer than twenty open QFC positions. Based on its experience with Part 371, the FDIC advised that a receiver should be able to exercise its statutory rights and duties under the Dodd-Frank Act relating to QFCs without having access to standardized records for any records entity that is a party to no more than 50 open QFC positions. Having considered the comments received and the FDIC's experience with evaluating QFC portfolios, the Secretary has provided in the Final Rules that any records entity that is a party to no more than 50 open QFC positions is not required to maintain the records described in section 148.4 other than the copies of the documents governing QFC transactions between the records entity and each counterparty as provided in section 148.4(i). This exemption provides further differentiation among financial companies and reduces the burden of the rules without compromising the ability of the FDIC to exercise its rights under the Act and fulfill its obligations under sections 210(c)(8), (9), and (10).
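As a minimal sketch of how the de minimis exemption operates (hypothetical helper name; not rule text), a records entity's count of open QFC positions determines whether the full set of section 148.4 records or only the governing documents must be maintained:

    # Illustrative sketch of the de minimis exemption described above: a records entity
    # with 50 or fewer open QFC positions need only keep copies of the documents governing
    # its QFC transactions (section 148.4(i)) rather than the full section 148.4 records.
    def required_records(open_qfc_positions):
        if open_qfc_positions <= 50:
            return ["copies of documents governing QFC transactions"]
        return ["appendix data tables",
                "copies of documents governing QFC transactions",
                "lists of vendors directly supporting QFC-related activities"]

    print(required_records(12))   # de minimis: governing documents only
    print(required_records(480))  # full section 148.4 recordkeeping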
Section 148.4 of the Final Rules requires each records entity to maintain the data listed in the appendix tables, copies of the documents that govern QFCs, and lists of vendors directly supporting the QFC-related activities of the records entity and the vendors' contact information with respect to each QFC to which it is a party. As discussed above, the Final Rules have been simplified so as not to separately require that a full set of records be maintained with respect to the underlying QFCs for which a records entity provides a guarantee or other credit enhancement. Instead, as discussed below, certain fields specific to the provision by a records entity of a guarantee of a QFC or of another type of credit enhancement of a QFC have been added to the tables in the Final Rules.
The Proposed Rules would have also required that records entities maintain any written data or information that is not listed in the appendix tables that the records entity is required to provide to a swap data repository, security-based swap data repository, the CFTC, the SEC, or any non-U.S. regulator with respect to any QFC, for any period that such data or information is required to be maintained by its PFRA. Having considered a comment received indicating that this would be unduly burdensome, the Secretary has eliminated this requirement from the Final Rules.
The Proposed Rules provided that a records entity also would be required to maintain electronic, full-text searchable copies of all agreements that govern the QFC transactions subject to the rules, as well as credit support documents related to such QFC transactions. Having considered the comments received indicating that the requirement that such electronic documents be full-text searchable would be unduly burdensome, the Secretary has eliminated the full-text searchability requirement; records entities must still maintain electronic copies of these documents.
For the receiver to make a well-informed decision that complies with the requirements of Title II discussed in section I, the receiver must have sufficient information to fully evaluate and model various QFC transfer or termination scenarios as well as the potential impact of its transfer or retention decisions. To perform this analysis in the extremely limited time frame provided by Title II, the receiver must have access to data on the QFC positions of the records entity, net QFC exposures under applicable netting agreements, detailed and aggregated collateral positions of the records entity and of its counterparties, and information regarding certain key provisions of the legal agreements governing the QFC transactions. Many commenters recognized the importance of maintaining detailed records of QFCs for use by the FDIC if it were appointed as receiver under Title II; however, several commenters expressed concern that the requirements of Tables A-1 through A-4, as proposed, were overly burdensome and would require maintenance of data that is different in content or format from that currently tracked or collected in the ordinary course of business or for other regulatory purposes.
The appendix to the Final Rules preserves the basic structure and content of the data tables included in the Proposed Rules. However, the Secretary has eliminated data fields that the Secretary decided would not provide a sufficiently significant benefit to the FDIC as receiver to justify the burden of maintaining them. The appendix also organizes certain shared reference data into master data lookup tables, as described below.
The master data lookup tables are cross-referenced to one or more of Tables A-1 through A-4 and provide a centralized site for records of affiliate, counterparty, booking location, and safekeeping agent data, which eliminates the need for a records entity to include duplicative data in Tables A-1 through A-4 and thereby makes it easier for a records entity to enter and update the data included in those Tables. In particular, the records entity members of a corporate group, which are required to utilize common identifiers for shared counterparties, will be able to use the same counterparty consolidated corporate master lookup table for a given counterparty. For example, if there were several records entities in a corporate group and each was a party to one or more QFCs with a particular counterparty, use of the counterparty master lookup table would enable the information as to that counterparty to be entered only once. The lookup table format, which conforms to customary information technology practices, will also allow for smaller file sizes by eliminating repetitive entries, thereby reducing the burden of maintaining the records and maintaining the capability of transmitting them to the FDIC and the records entity's PFRA.
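The lookup-table approach can be pictured with a small sketch; the field names below are simplified stand-ins for the actual appendix fields, not the fields prescribed by the rules. Counterparty reference data is entered once and referenced by identifier from position-level records, so several records entities in a corporate group can share a single entry for a common counterparty.

    # Illustrative sketch of the lookup-table idea described above: counterparty details
    # are entered once in a master lookup table and referenced by identifier from the
    # position-level records.
    counterparty_master = {
        # counterparty identifier -> shared reference data
        "CP-001": {"name": "Example Dealer LLC",
                   "immediate_parent": "Example Holdings Inc.",
                   "ultimate_parent": "Example Group plc"},
    }

    positions_table_a1 = [
        # each position row carries only the counterparty identifier
        {"records_entity": "RE-BANK", "position_id": "P-1", "counterparty_id": "CP-001"},
        {"records_entity": "RE-BROKER", "position_id": "P-2", "counterparty_id": "CP-001"},
    ]

    # The shared reference data can be re-joined when needed, without the counterparty
    # details being repeated on every position row.
    for row in positions_table_a1:
        counterparty = counterparty_master[row["counterparty_id"]]
        print(row["position_id"], counterparty["name"], counterparty["ultimate_parent"])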
Each table contains examples and, as relevant, instructions for recording the required information and an indication of how the FDIC as receiver would apply the required information. A records entity may leave an entry blank for any data fields that do not apply to a given QFC transaction, agreement, collateral item, or counterparty. For example, if a QFC is not collateralized, the data fields that relate to collateral may be left blank (in the case of character fields) or given a zero value (in the case of numerical fields).
Several commenters noted that the scope of the recordkeeping requirements in the appendix is more extensive than that of the recordkeeping requirements in the appendix to Part 371.
Table A-1 requires each records entity to maintain detailed position-level data to enable the FDIC as receiver to evaluate a records entity's QFC exposure to each of its counterparties on a position-by-position basis. The records required by the table include critical information about the type, terms, and value of each of the records entity's QFCs. Position-level information must be available for each counterparty, affiliate, and governing netting agreement to allow the FDIC as receiver to model the potential impacts of its decisions relating to the transfer or retention of positions. This information will also enable the FDIC to confirm that the netting-set level data provided in Table A-2, such as the market value of all positions in the netting set (A2.6), based on the aggregated data from Table A-1, is accurate and can be validated across different tables. In addition, position-level information will assist the receiver or any transferee in complying with the terms of the records entity's QFCs and thereby reduce the likelihood of inadvertent defaults.
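The cross-table validation described above amounts to aggregating position-level market values by netting agreement and comparing the totals against the netting-set values reported in Table A-2, such as field A2.6. The sketch below is illustrative only; the records and field names are simplified stand-ins for the actual table fields.

    # Illustrative sketch: aggregating Table A-1 style position-level market values by
    # netting agreement and checking that they tie to the Table A-2 netting-set values.
    from collections import defaultdict

    table_a1 = [
        {"netting_agreement_id": "NA-1", "market_value":  5_000_000},
        {"netting_agreement_id": "NA-1", "market_value": -2_000_000},
        {"netting_agreement_id": "NA-2", "market_value":  1_250_000},
    ]
    table_a2 = {"NA-1": 3_000_000, "NA-2": 1_250_000}  # reported netting-set values (A2.6 style)

    aggregated = defaultdict(float)
    for position in table_a1:
        aggregated[position["netting_agreement_id"]] += position["market_value"]

    for netting_set, reported in table_a2.items():
        assert abs(aggregated[netting_set] - reported) < 1, (
            f"Table A-2 value for {netting_set} does not tie to the Table A-1 positions")
    print("Table A-2 netting-set values tie to the aggregated Table A-1 positions.")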
In response to comments received, the Secretary has made several changes to Table A-1 that will reduce the recordkeeping burden. One commenter recommended elimination of the requirement to identify the purpose of a QFC position, stating that this could involve a complicated analysis and impose a substantial burden on records entities. The commenter stated that a QFC position may have multiple purposes that may change over time such that any identified purpose would be of minimal value to the receiver.
One commenter also recommended eliminating the requirement to maintain operational and business-level details relating to QFC positions, such as the identification of related inter-affiliate positions, trading desk identifiers, and points of contact. The commenter stated that such operational and business-level details are subject to frequent change that would require frequent updates by records entities and submitted that this information would likely be of limited value to the receiver.
One commenter stated that the requirement to provide information based on a classification under GAAP or IFRS may not be appropriate if the records entity follows a different accounting standard.
To further reduce the burden of Table A-1, the Secretary has eliminated the following proposed data fields in the Final Rules: Industry code (GIC or SIC code); position standardized contract type; and documentation status of the position.
The Final Rules include two additional fields to Table A-1 based on the FDIC's experience with implementing Part 371. The Secretary believes that the addition of these fields should impose minimal, if any, additional burden on a records entity. The first addition is a data field for the
A netting agreement counterparty identifier field (A1.10) has also been added to the table. Based on the FDIC's experience with the implementation of Part 371, the FDIC has advised that it is necessary for the rules to address circumstances in which the counterparty to a QFC is different from the counterparty securing the QFC (for example, if an affiliate of the QFC counterparty is providing collateral for the position). In such cases, the netting agreement counterparty identifier is necessary to enable the receiver to link certain position-level data from Table A-1 to the applicable netting-set level data under Table A-2.
In addition, certain fields specific to guarantees of QFCs provided by the records entity and other credit enhancements of QFCs provided by the records entity have been added to the table, including the type of QFC covered by the guarantee or other third-party credit enhancement (A1.7.1) and the underlying QFC obligor identifier (A1.7.2). Further, the Final Rules include fields requiring identification of any credit enhancement that has been provided by a third party with respect to a QFC of the records entity (A1.21.1-.5).
As in the Proposed Rules, Table A-1 under the Final Rules requires that a records entity be identified by its legal entity identifier (“LEI”). In order for an LEI to be properly maintained, it must be kept current and up to date according to the standards established by the Global LEI Foundation. In addition, to the extent a records entity uses a global standard unique transaction identifier or unique product identifier to identify a QFC for which records are kept under these rules, the records entity should use such identifiers in completing fields A1.3 and A1.7, respectively. The Secretary has made this change in recognition of the ongoing work of the Committee on Payments and Market Infrastructures and the Board of the International Organization of Securities Commissions to establish such global identifiers.
Table A-2, which specifies the information to be maintained regarding aggregated QFC exposure and collateral data by counterparty, has been adopted in the Final Rules substantially as proposed, with certain changes discussed below.
Table A-2 requires a records entity to maintain records of the aggregated QFC exposures under each netting agreement between the records entity and its counterparty. Table A-2 also requires comprehensive information on the collateral exchanged to secure net exposures under each netting agreement. Information on collateral required by the table includes the market value of collateral, any collateral excess or deficiency positions, the identification of the collateral safekeeping agent, a notation as to whether the collateral posted by a counterparty or a records entity is subject to rehypothecation, and the market value of any collateral subject to rehypothecation. The information required by Table A-2 must be maintained at each level of netting under the relevant governing agreement. For example, if a master agreement includes an annex for repurchase agreements and an annex for forward exchange transactions and requires separate netting under each annex, the information required by Table A-2 with respect to the net exposures under each annex would need to be maintained separately.
In evaluating whether to transfer or retain QFCs between a records entity and a counterparty, the receiver must be able to assess the records entity's net exposure to the counterparty (and the counterparty's affiliates), the counterparty's net exposure to the records entity, and the amount of collateral securing those exposures. Net QFC exposure data will also assist the receiver in aggregating exposures under netting agreements with a counterparty and its affiliates based on the netting rights of the entire group, in order to determine relative concentrations of risk under each applicable netting agreement. This information will assist the receiver in modeling various transfer or termination scenarios and evaluating the effects and potential impact of the FDIC's decision to transfer the covered financial company's QFCs, retain and disaffirm or repudiate them, or retain them and allow the counterparty to terminate them. Information on collateral also ensures that the FDIC as receiver is able to comply with its statutory obligation to transfer all collateral securing the QFC obligations that it elects to transfer.
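The type of aggregation described above can be sketched as follows (Python; all entity names and figures are invented, and the layout is not the format prescribed by Table A-2), rolling netting-set level exposures and collateral up to a counterparty's corporate group for scenario modeling.

```python
# Hypothetical illustration of aggregating netting-set exposures by counterparty group.
from collections import defaultdict

netting_sets = [
    {"counterparty": "CP-A", "counterparty_group": "Group-X", "net_exposure": 120.0, "collateral": 100.0},
    {"counterparty": "CP-B", "counterparty_group": "Group-X", "net_exposure": -40.0, "collateral": 0.0},
    {"counterparty": "CP-C", "counterparty_group": "Group-Y", "net_exposure": 15.0,  "collateral": 20.0},
]

group_exposure = defaultdict(float)
group_collateral = defaultdict(float)
for ns in netting_sets:
    group_exposure[ns["counterparty_group"]] += ns["net_exposure"]
    group_collateral[ns["counterparty_group"]] += ns["collateral"]

for group, exposure in group_exposure.items():
    # Relative concentration of risk and collateral coverage per counterparty group.
    print(group, "net exposure:", exposure, "collateral held:", group_collateral[group])
```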
As discussed above, one commenter recommended eliminating the requirement to maintain operational and business-level details relating to QFC positions, including points of contact and the risk or relationship manager for each counterparty.
The burden of Table A-2 has been further reduced in the Final Rules by elimination of the following fields: Industry code (GIC or SIC code); master netting agreement for counterparty corporate group; name of each master agreement, master netting agreement or other governing documentation related to netting among affiliates in a counterparty's corporate group; current market value of all inter-affiliate positions with the records entity; master netting agreement for records entity's corporate group; and name of each master agreement, master netting agreement or governing documentation related to netting among records entities.
An additional change was made to Table A-2 relating to the requirement in the Proposed Rules for the maintenance of records on the current market value of all positions netted under the applicable netting agreement. Table A-2 in the Final Rules retains this
The Proposed Rules would have required that the amount of pending margin calls be included in the calculation of collateral positions. The Final Rules instead require information on the next margin payment date (A2.15) and the next margin payment amount (A2.16) in Table A-2. This information will assist the receiver in avoiding any failure to make a pending margin call during the one business day stay. Since the amount of pending margin calls was required to be calculated under Table A-2 as proposed to determine collateral excess or deficiency, requiring such information to be capable of being separately provided should not impose a significant additional burden.
In place of the data fields in the Proposed Rules for the legal name of any master agreement guarantor and the unique counterparty identifier of guarantor, Table A-2 includes a field for third-party credit enhancement agreement identifiers (A2.5), which clarifies that it covers unaffiliated providers of credit support and encompasses forms of support in addition to guarantees. The Final Rules also add new fields to Table A-2 (A2.4.1 and A2.5.1-.5) to provide additional information as to third-party credit enhancements. The Final Rules also add to Table A-2 certain fields necessary to link the data in Table A-2 to one or more of the other data tables or lookup tables. Finally, the Final Rules add to Table A-2 the data extraction date field discussed above.
Table A-3 as adopted is intended to ensure that the FDIC as receiver has available to it the legal agreements governing and setting forth the terms and conditions of each of the QFCs subject to the rules. Table A-3 requires each legal agreement to be identified by name and unique identifier (A3.3-A3.4) and requires the maintenance of records on key legal terms of the agreement, such as relevant governing law (A3.7) and information about any third-party credit enhancement agreement (A3.10-12.3).
In response to comments received on the Proposed Rules, the Final Rules include several changes to Table A-3 to reduce the recordkeeping burden. Commenters suggested eliminating the proposed requirement in Table A-3 to maintain records containing descriptions or excerpts of certain cross-default provisions, transfer restrictions, events of default, and termination events set forth in each QFC agreement or master agreement, arguing that providing this information would be extremely burdensome and of limited value to the receiver.
To further reduce the burden of Table A-3, the Final Rules eliminate the following proposed data fields: Basic form of agreement; legal name of guarantor of records entity obligations; industry code (GIC or SIC code); and legal name of guarantor of counterparty obligations.
Other changes to Table A-3 conform to those discussed above with respect to other tables,
Table A-4 requires detailed information, on a counterparty by counterparty basis, relating to the collateral received by and the collateral posted by the records entity as reported in Table A-2. This information includes, for each collateral item, the unique collateral identifier (A4.6), information about the value of the collateral (A4.7-9), a description of the collateral (A4.10), the fair value asset classification (A4.11), the collateral segregation status (A4.12), the collateral location and jurisdiction (A4.13-14), and whether the collateral is subject to rehypothecation (A4.15). This collateral detail data, together with the netting-set level collateral data in Table A-2, will enable the receiver to more fully assess the type, nature, value, and location of the collateral and to model various QFC transfer or termination scenarios. Collateral detail information will also enable the receiver to ensure that collateral is transferred together with any QFCs that it secures, as required by the Act.
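The collateral detail data described above can be pictured as a simple record structure (a minimal sketch in Python; the attribute names are illustrative, not the rule's official field labels).

```python
# Sketch of a collateral-detail record in the spirit of Table A-4.
from dataclasses import dataclass

@dataclass
class CollateralDetail:
    collateral_id: str               # cf. A4.6, unique collateral identifier
    market_value_usd: float          # cf. A4.7-9, value of the collateral item
    description: str                 # cf. A4.10
    fair_value_asset_class: str      # cf. A4.11
    segregation_status: str          # cf. A4.12
    location: str                    # cf. A4.13
    jurisdiction: str                # cf. A4.14
    rehypothecation_permitted: bool  # cf. A4.15
```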
The Secretary did not receive any comments requesting specific changes to the requirements of Table A-4. Nevertheless, to reduce the burden of Table A-4, the following data fields have been eliminated in the Final Rules: Original face amount of collateral item in U.S. dollars; current end of day market value amount of collateral item in local currency; and collateral code. The Final Rules also eliminate the requirement to describe the scope of collateral segregation.
A collateral posted or received flag has been added to Table A-4 to clearly indicate to the receiver whether the collateral was posted or received by the records entity (A4.3). This field should impose minimal additional burden because a records entity will already need to identify all collateral as posted or received in Table A-2, which requires separate collateral information for collateral posted and collateral received. The Final Rules also add the data extraction date field (A4.1), as discussed above, to Table A-4, as well as certain other fields necessary to link the data in Table A-4 to the data maintained in one or more of the other data tables or lookup tables (A4.2, A4.4, A4.5).
In the Proposed Rules, information regarding a records entity's affiliates was required by section 148.4(a)(7) and Tables A-1 and A-2. The Secretary has determined it is appropriate to provide instead for the corporate organization information to be maintained in the new corporate organization master data lookup table, which is cross-referenced with Tables A-1 through A-4. The Final Rules require this information to be maintained by a records entity with respect to itself and all of the members of its corporate group, which includes all of the records entities' affiliates. Although, as discussed above, the definition of “records entity” has been revised in the Final Rules to identify which members of a corporate group are records entities by reference to whether they are consolidated under accounting standards, in the event of a Title II resolution, the FDIC would need the information described in the next paragraph for each affiliate, irrespective of consolidation, to allow it to exercise its rights and obligations under, and ensure compliance with, section 210(c)(16) of the Act. As referenced above, under section 210(c)(16) of the Act, the contracts of subsidiaries or affiliates of a covered financial company that are guaranteed or otherwise supported by or linked to such covered financial company can be enforced by the FDIC as receiver of the covered financial company notwithstanding the insolvency, financial condition, or receivership of the financial company if the FDIC transfers the guarantee or other support to a bridge financial company or other third party.
The information that each records entity will need to maintain with respect to itself and each of its affiliates includes its and its affiliates' identifiers and legal name (CO.2-4), identification of immediate parent (CO.5-CO.7), the immediate parent's percentage ownership (CO.8), the entity type (CO.9), domicile (CO.10), and jurisdiction of incorporation or organization (CO.11). This information will be easier to provide and to update as part of the corporate organization master data lookup table rather than as part of the corporate organization chart provided for under the Proposed Rules. Use of the corporate organization master data lookup table will also facilitate the linking of the data provided in Tables A-1 through A-4 to key information about the records entity and its affiliates.
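A minimal sketch of such a lookup entry and of walking immediate-parent links to a top-tier entity is shown below (Python; attribute names are assumptions for illustration only).

```python
# Illustrative corporate organization master data lookup entry (cf. CO.2-CO.11).
from dataclasses import dataclass
from typing import Dict, Optional

@dataclass
class CorporateOrgEntry:
    entity_id: str                       # cf. CO.2-4 (identifiers and legal name key)
    legal_name: str
    immediate_parent_id: Optional[str]   # cf. CO.5-CO.7
    parent_ownership_pct: float          # cf. CO.8
    entity_type: str                     # cf. CO.9
    domicile: str                        # cf. CO.10
    jurisdiction: str                    # cf. CO.11

def top_tier_entity(entity_id: str, table: Dict[str, CorporateOrgEntry]) -> str:
    """Follow immediate-parent links until an entity with no parent is reached."""
    entry = table[entity_id]
    while entry.immediate_parent_id is not None:
        entry = table[entry.immediate_parent_id]
    return entry.entity_id
```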
The corporate organization master data lookup table also includes a recordkeeping status field (CO.12) that was not included in the Proposed Rules. This field, which requires the records entity to identify, with respect to each of its affiliates, whether the affiliate is (i) a records entity, (ii) a non-financial company, (iii) an excluded entity, (iv) a financial company that is not a party to any open QFCs, (v) a records entity that is availing itself of the de minimis exemption, or (vi) a records entity that is availing itself of another exemption,
In the Proposed Rules, information regarding a records entity's non-affiliated QFC counterparties was required by section 148.4(a)(6) and in Table A-2. Several commenters suggested that the organizational and affiliate information for counterparties not affiliated with the records entity that would have been required by the Proposed Rules be eliminated or significantly reduced.
Having considered the comments received as to the burden of collecting, maintaining, and updating this information, the Secretary has determined that information regarding the identity of the immediate and ultimate parent of each counterparty is sufficient to enable the FDIC as receiver to comply with the requirement, discussed above, that the FDIC either (i) transfer all QFCs between the covered financial company and a counterparty and any affiliate of such counterparty to a single financial institution, (ii) disaffirm or repudiate all such QFCs, or (iii) retain all such QFCs. The data required by the counterparty master data lookup table includes the counterparty identifier (CP.2, which must be the current LEI maintained by the counterparty if the counterparty has obtained an LEI), the legal name of the counterparty (CP.4), domicile of counterparty (CP.5), jurisdiction of incorporation (CP.6), identification of the immediate parent of the counterparty (CP.7-CP.9), and identification of the ultimate parent of the counterparty (CP.10-CP.12).
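The use the receiver would make of this parent information can be sketched as follows (Python; the identifiers and names are invented), grouping counterparties by ultimate parent so that a counterparty and its affiliates can be treated as a single unit.

```python
# Hypothetical counterparty master data rows (cf. CP.2, CP.4, CP.10-CP.12) grouped by
# ultimate parent; each group's QFCs must be transferred, repudiated, or retained together.
from collections import defaultdict

counterparty_master = [
    {"cp_id": "LEI-001", "legal_name": "Alpha Dealer LLC",  "ultimate_parent_id": "LEI-900"},
    {"cp_id": "LEI-002", "legal_name": "Alpha Funding Ltd", "ultimate_parent_id": "LEI-900"},
    {"cp_id": "LEI-003", "legal_name": "Beta Bank NA",      "ultimate_parent_id": "LEI-950"},
]

counterparties_by_group = defaultdict(list)
for row in counterparty_master:
    counterparties_by_group[row["ultimate_parent_id"]].append(row["cp_id"])

print(dict(counterparties_by_group))
```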
In the Proposed Rules, the maintenance of information related to the booking location of a QFC position was required under Table A-1. To simplify the tables and facilitate the updating of this information, the Secretary has decided that some of this information should be maintained in a separate table. The information required by the booking location table, which includes the booking location identifier and booking unit or desk identifier, description and contact information, will enable the receiver to determine where the trade is booked and settled and understand the purpose of the position. As noted above, Table A-1 as
In the Proposed Rules, the maintenance of information relating to the safekeeping agent for collateral securing a QFC position was required by Table A-2. To simplify the tables and facilitate updating this information, the Secretary has decided to maintain the detailed information as to safekeeping agent in a separate table. The data required by this table includes the safekeeping agent identifier, name, and point of contact information (SA.2-SA.7). The information in this table must be capable of being provided with respect to each safekeeping agent for collateral of QFCs of a records entity, whether the safekeeping agent is a third party, the counterparty to the QFC secured by such collateral, or the records entity itself.
The Regulatory Flexibility Act (the “RFA”) (5 U.S.C. 601 et seq.) generally requires agencies to consider the impact of their rulemakings on small entities.
The RFA requires agencies either to provide an initial regulatory flexibility analysis with a proposed rule or to certify that the proposed rule will not have a significant economic impact on a substantial number of small entities. As described in the Proposed Rules, the Secretary, in accordance with section 3(a) of the RFA, reviewed the Proposed Rules and preliminarily concluded that the Proposed Rules likely would not have a significant economic impact on a substantial number of small entities.
The Secretary certifies, pursuant to 5 U.S.C. 605(b), that the Final Rules will not have a significant economic impact on a substantial number of small entities under the Small Business Administration's (“SBA”) most recently revised standards for small entities, which went into effect on February 26, 2016. As discussed below, the Secretary has made various changes to reduce the scope and burden of the rules. However, even apart from these considerations, the Final Rules are not expected to have a significant economic effect on any small entities because any entities that would be subject to the rules as “records entities” and that would otherwise meet the standards for small entities would be subsidiaries of large corporate groups and would therefore not be “independently owned and operated.”
In the Initial Regulatory Flexibility Analysis, the Secretary requested comment on whether the Proposed Rules would have a significant economic impact on a substantial number of small entities and whether the costs are the result of the Act itself, and not the Proposed Rules. Specifically, the Secretary requested that commenters quantify the number of small entities, if any, that would be subject to the Proposed Rules, describe the nature of any impact on small entities, and provide empirical and other data to illustrate and support the number of small entities subject to the Proposed Rules and the extent of any impact.
The Secretary received comments on the Proposed Rules from trade associations, asset managers, insurance companies, clearing organizations, nonprofit organizations, and a private individual. In general, commenters acknowledged the need for the FDIC to have appropriate information in order to exercise its role as a receiver under Title II of the Dodd-Frank Act.
The Proposed Rules, rather than requiring all financial companies to maintain records with respect to QFCs, would have applied to a narrower subset of financial companies. Specifically, the Secretary proposed to exclude from the scope of the Proposed Rules financial companies that did not meet one of the following three criteria: (1) Being a nonbank financial company subject to a determination by the Council pursuant to section 113 of the Act (12 U.S.C. 5323); (2) being a financial market utility designated pursuant to section 804 of the Act (12 U.S.C. 5463) as, or as likely to become, systemically important; or (3) having total assets equal to or greater than $50 billion. At the time the Proposed Rules were published, each of the financial companies expected to be subject to the rules under these criteria had revenues in excess of the SBA's revised standards for small entities that went into effect on July 22, 2013. The Proposed Rules would also have applied to these large financial companies' affiliated financial companies if an affiliated financial company otherwise qualified as a “records entity” and was not an “exempt entity” under the Proposed Rules. However, such affiliated financial
As discussed in section II.A.1 above, the Secretary, in response to comments, determined to make several changes to the definition of “records entity” in the Final Rules in order to substantially reduce the number of entities that will be subject to recordkeeping requirements. Further, as discussed in section II.C.3 above, the Secretary determined to include in the Final Rules a de minimis exemption from the preponderance of the recordkeeping requirements for certain records entities that have a minimal level of QFC activity. These changes have the effect of further reducing the likelihood that the rules would affect a substantial number of small entities. In addition, the definition of “records entity” has been revised in the Final Rules to refer to members of a corporate group that are consolidated under accounting standards, which should reduce the number of entities that would be included as records entities and ensure that records entities that are members of a corporate group are able to coordinate their compliance with the recordkeeping requirements of the rules. The addition in the Final Rules of the requirement that a top-tier financial company of a corporate group that has multiple records entities must be able to generate a single, compiled set of the records for all records entities in the corporate group that it consolidates or are consolidated with it would not affect the number of small entities that are subject to the rule as no such top-tier financial company would be a small entity.
As discussed above, the Final Rules would only affect large financial companies and certain of their affiliates that meet the definition of a records entity. Previously, the Secretary proposed that the recordkeeping requirements in the Proposed Rules would be applicable to all affiliated financial companies in a large corporate group that meet the definition of “records entity,” regardless of their size, because excluding records entities, including small entities, could significantly impair the FDIC's right to enforce certain QFCs of affiliates of covered financial companies under section 210(c)(16) of the Act. The Secretary has been advised by the FDIC that, based on its experience with Part 371, the FDIC as receiver should be able to exercise its statutory rights and duties under the Dodd-Frank Act relating to QFCs without having access to standardized records for any records entity that is a party to 50 or fewer open QFC positions. Thus the Secretary has determined that a de minimis exemption from maintaining the records described in section 148.4 of the Final Rules, other than the records described in section 148.4(i), is appropriate for records entities that have such a minimal level of QFC activity. This change has the effect of further reducing the likelihood that the Final Rules would affect a substantial number of small entities. Although it is unlikely that any small entities would be affected because affiliated members generally do not meet the definition of “small entity,” this revision will minimize the burden faced by affiliated members of a corporate group.
Based on current information and discussions with staff of several of the PFRAs who are familiar with financial company operations and have experience supervising financial companies with QFC portfolios, the Secretary believes that the large corporate groups that would be subject to the Final Rules would likely comply with the rules by utilizing a centralized recordkeeping system, whether by adapting an existing system or establishing a new system, that would obviate the need for each member of such corporate group, including small entity members of the corporate group, to maintain its own recordkeeping system in order to comply with the rules. This is expected to have the effect of substantially reducing the burden of compliance with the rules on particular small entity members, if any, of a corporate group subject to the rules. The Secretary requested information and comment in the Initial Regulatory Flexibility Analysis on the role of entities responsible for the centralized recordkeeping systems and whether such entities are small entities to which the Proposed Rules would apply. While several commenters addressed the impact of the Proposed Rules in general on information recordkeeping systems,
As discussed in more detail above, the Final Rules impose certain recordkeeping requirements on records entities. A records entity is required to maintain all records described in section 148.4 of the Final Rules, be able to generate data in the format set forth in the appendix to the Final Rules, and be capable of transmitting those records electronically to the records entity's PFRA and the FDIC. The Final Rules include recordkeeping requirements with respect to position-level data, counterparty-level data, legal documentation data, collateral detail data, corporate organization data, and a list of vendors directly supporting QFC-related activities of the records entity and the vendors' contact information.
As discussed in the Initial Regulatory Flexibility Analysis, based on discussions with several of the PFRAs that are familiar with financial company operations and have experience supervising financial companies with QFC portfolios, the Secretary believes that records entities are already maintaining, as part of their ordinary course of business, most of the QFC information required to be maintained under the Final Rules, which minimizes the potential economic impact.
The Secretary recognizes that there may be particular types of QFCs or counterparties for which more limited information may be sufficient to enable the FDIC to exercise its rights under the Act and fulfill its obligations under sections 210(c)(8), (9), or (10) of the Act. The Final Rules provide the Secretary with the discretion to grant conditional or unconditional exemptions from one or more of the requirements of the Final Rules, which could include exemptions from the recordkeeping requirements regarding particular types of QFCs or counterparties. In addition, section 148.1(d)(3) of the Final Rules provides the Secretary with the authority to grant extensions of time for compliance purposes.
The Secretary requested in the Initial Regulatory Flexibility Analysis information and comment on any costs, compliance requirements, or changes in operating procedures arising from application of the Proposed Rules on small entities.
Certain provisions of the Final Rules contain “collection of information requirements” within the meaning of the Paperwork Reduction Act of 1995 (“PRA”). An agency may not conduct or sponsor, and a person is not required to respond to, a collection of information unless it displays a currently valid control number. The collection of information requirements in the Final Rules have been submitted by the Secretary to the Office of Management and Budget (“OMB”) for review in accordance with the PRA, 44 U.S.C. 3507(d). The title of this collection is “Qualified Financial Contracts Recordkeeping Related to Orderly Liquidation Authority.” The collection of information has been assigned OMB Control No. 1505-0256.
Previously, the Secretary requested comments on the collection of information burdens associated with the Proposed Rules. Specifically, the Secretary asked for comment concerning:
(1) Whether the proposed information collection is necessary for the proper performance of agency functions, including whether the information will have practical utility;
(2) The accuracy of the estimated burden associated with the proposed collection of information, including the validity of the methodology and assumptions used;
(3) How to enhance the quality, utility, and clarity of the information required to be maintained;
(4) How to minimize the burden of complying with the proposed information collection, including the application of automated collection techniques or other forms of information technology;
(5) Estimates of capital or start-up costs and costs of operation, maintenance, and purchase of services to maintain the information; and
(6) Estimates of (i) the number of financial companies subject to the Proposed Rules, (ii) the number of records entities that are parties to an open QFC or guarantee, support, or are linked to an open QFC, and (iii) the number of affiliated financial companies that are parties to an open QFC or guarantee, support, or are linked to an open QFC of an affiliate.
Commenters on the Proposed Rules generally acknowledged the need for the FDIC to have appropriate information in order to exercise its role as a receiver under Title II of the Act. Commenters also requested various modifications to or relief from aspects of the Proposed Rules that they stated would entail burdens that outweighed the benefits to the FDIC. This included recommendations that the records required to be maintained under the Proposed Rules be tailored more narrowly to require only data that is critical to the FDIC's QFC transfer determinations under section 210 of the Act. Several commenters also remarked generally that the Proposed Rules would entail significant information technology and systems development challenges.
The collection of information is required by section 210(c)(8)(H) of the Act, which mandates that the Secretary prescribe regulations requiring financial companies to maintain records with respect to QFCs to assist the FDIC as receiver for a covered financial company in being able to exercise its rights under the Act and fulfill its obligations under sections 210(c)(8), (9), or (10) of the Act. The Final Rules implement these requirements by requiring that a records entity maintain records with respect to, among other things, position-level data, counterparty data, legal agreement data (including copies of agreements governing QFC transactions and open confirmations), collateral detail data, corporate organization information, and a list of vendors directly supporting QFC-related activities of the records entity and the vendors' contact information. The Final Rules require that a records entity be capable of providing QFC records to its PFRA and the FDIC within 24 hours of the request of such PFRA. For corporate groups that have multiple records entities, the top-tier financial company of the corporate group must be able to generate a single, compiled set of the records specified in the Final Rules for all records entities in the corporate group that it consolidates or are consolidated with it and provide such set of records to its PFRA and the FDIC within 24 hours of the request of such PFRA and in a format that allows for aggregation and disaggregation of such data by records entity and counterparty.
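The compilation, aggregation, and disaggregation requirement described above can be sketched as follows (Python; field names and values are invented and the layout is not the prescribed appendix format): per-entity record sets are flattened into a single compiled set that can be summed by counterparty across the group or filtered back down to a single records entity.

```python
# Rough sketch of compiling per-entity records and supporting aggregation/disaggregation.
from collections import defaultdict

entity_records = {
    "RE-1": [{"counterparty": "CP-A", "net_exposure": 10.0}],
    "RE-2": [{"counterparty": "CP-A", "net_exposure": 5.0},
             {"counterparty": "CP-B", "net_exposure": -3.0}],
}

# Compile into one flat set of rows, each tagged with its records entity.
compiled = [dict(row, records_entity=entity_id)
            for entity_id, rows in entity_records.items()
            for row in rows]

# Aggregate across the corporate group by counterparty ...
by_counterparty = defaultdict(float)
for row in compiled:
    by_counterparty[row["counterparty"]] += row["net_exposure"]

# ... and disaggregate by filtering on a single records entity.
re1_rows = [row for row in compiled if row["records_entity"] == "RE-1"]
```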
The Final Rules also provide that a records entity may request in writing an extension of time with respect to the compliance dates associated with the recordkeeping requirements. The Final Rules further provide that one or more records entities may request in writing an exemption from one or more of the recordkeeping requirements. Finally, the Final Rules provide a de minimis exemption from maintaining the records described in section 148.4 of the Final Rules, other than the records described in section 148.4(i), for a records entity that is a party to 50 or fewer open QFC positions.
In the PRA discussion in the Proposed Rules, the Secretary estimated that approximately 140 large corporate groups and each of their respective affiliated financial companies that is a party to an open QFC or guarantees, supports or is linked to an open QFC of an affiliate and is not an “exempt entity,” would meet the proposed definition of “records entity.” The estimate of 140 large corporate groups includes the four nonbank financial companies subject to a determination by the Council under section 113 of the Dodd-Frank Act and the eight financial market utilities designated by the Council under section 804 of the Dodd-Frank Act as systemically important. The Proposed Rules also included within the definition of “records entity” financial companies with assets greater than or equal to $50 billion. The Federal Financial Institutions Examination Council (“FFIEC”) maintains on its public Web site a list of bank holding companies with total assets of greater than $10 billion, which was used to identify bank holding companies with assets greater than or equal to $50 billion. For corporate groups that are not bank holding companies, SNL Financial, a private vendor that provides a subscription-access database that aggregates publicly available financial information on insurance, securities and investment, specialty
For purposes of the PRA discussion in the Proposed Rules, the Secretary estimated that each large corporate group comprised approximately 168 affiliates, resulting in an estimate of 23,325 affiliated financial companies. As noted above, commenters generally did not provide comments, empirical data, or other analyses directly addressing the Secretary's estimates in the PRA discussion. As discussed in detail in section II above, the Final Rules, as adopted, incorporate several changes to the Proposed Rules, including the addition to the definition of “records entity” of criteria based on the level of a financial company's derivatives activity, the exclusion of insurance companies, a conditional exemption for derivatives clearing organizations, and the inclusion of a de minimis exemption. Taken together, these changes substantially reduce the scope of financial companies subject to the recordkeeping requirements of the Final Rules.
The Secretary estimates that approximately 30 large corporate groups, and each of their respective affiliated financial companies that is a party to an open QFC and is not an “excluded entity,” will meet the definition of “records entity” in section 148.2(n) upon the effective date of the Final Rules, compared to the estimate in the Proposed Rules of 140 large corporate groups. The Secretary estimates that collectively these 30 corporate groups had approximately $15 trillion in total assets, compared to an estimated $25 trillion in total assets of the 140 corporate groups that were expected to meet the definition of “records entity” in the Proposed Rules. These estimates were based on the publicly disclosed financial statements of such corporate groups as of December 31, 2015 and December 31, 2013, respectively.
The estimate of 30 large corporate groups was calculated as follows. There are three categories of financial companies that are included within the definition of “records entity” in the Final Rules without regard to whether they meet the asset or derivatives thresholds. The estimate includes the eight U.S. top-tier bank holding companies currently identified as G-SIBs. Likewise, the estimate includes the two nonbank financial companies currently subject to a determination by the Council under section 113 of the Dodd-Frank Act. There are currently eight financial market utilities designated by the Council under section 804 of the Dodd-Frank Act as systemically important. Six of these entities are registered clearing agencies or derivatives clearing organizations, for which a conditional exemption has been provided under the Final Rules, though their affiliates may be subject to the recordkeeping requirements if they are party to open QFCs.
The estimate also includes large corporate groups that would be subject to the rules by virtue of the amount of their total consolidated assets and level of derivatives activity. For bank holding companies, the FFIEC-maintained list, referenced above, of bank holding companies with total assets of greater than $10 billion was used to identify bank holding companies with assets greater than or equal to $50 billion. The amount of total gross notional derivatives outstanding and the amount of derivatives liabilities of these bank holding companies were obtained by reference to the consolidated financial statements filed with the Federal Reserve by such bank holding companies on the Federal Reserve's Form FR Y-9C, which are publicly available on the Federal Reserve's Web site. For corporate groups that are not bank holding companies, the SNL Financial database referenced above, as well as financial statements filed with the SEC and, for broker-dealers, with the Financial Industry Regulatory Authority, were used to identify corporate groups having total assets greater than or equal to $50 billion and having either greater than or equal to $3.5 billion in derivatives liabilities or greater than or equal to $250 billion in total gross notional derivatives outstanding as of December 31, 2015. By reference to these sources, as well as conversations with the PFRAs, twelve additional corporate groups were estimated to be subject to the rules. While the number of corporate groups having total assets greater than or equal to $50 billion was similar to that estimated at the time of the issuance of the Proposed Rules, the addition to the definition of “records entity” of criteria based on the level of a financial company's derivatives activity and the exclusion of insurance companies significantly reduced the number of corporate groups estimated to be subject to the rules.
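Read together, and counting all eight FMU-designated groups (including the six whose clearing entities are conditionally exempt but whose affiliates may still be records entities), the two preceding paragraphs appear to reconcile to the 30-group estimate roughly as follows; this back-of-the-envelope restatement is illustrative and is not part of the rule text.

```latex
\[
  \underbrace{8}_{\text{U.S. G-SIBs}}
  + \underbrace{2}_{\text{nonbank SIFIs}}
  + \underbrace{8}_{\text{designated FMU groups}}
  + \underbrace{12}_{\text{asset/derivatives thresholds}}
  = 30 \ \text{large corporate groups}
\]
```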
The following table summarizes the calculation of the estimates of the number and aggregate size of large corporate groups subject to the Proposed Rules and the Final Rules.
The Final Rules would also apply to these large corporate groups' affiliated financial companies (regardless of their size) if an affiliated financial company otherwise qualifies as a “records entity,” and is not an “excluded entity.” In addition, as referenced above, the Final Rules will also require the top-tier financial company of the corporate group to be capable of generating a single, compiled set of the records specified in the Final Rules for all records entities in the corporate group that it consolidates or are consolidated with it and to be capable of providing such a set of records to its PFRA and the FDIC.
The Secretary estimates that the large corporate groups that will be subject to the rules collectively have 5,010 affiliated financial companies that may qualify as records entities. The Secretary recognizes that, based on a number of factors, the actual total number of respondents may differ significantly from this estimate. One such factor is that there is no information available to determine how many of the affiliated financial companies of a large corporate group are a party to an open QFC and thus would qualify as records entities. At the same time, the inclusion and availability of the de minimis exemption in the Final Rules will have the effect of reducing the number of affiliated financial companies in many corporate groups subject to the recordkeeping requirements. Finally, as previously noted, commenters did not provide the requested comments, empirical data, or other analyses directly addressing the Secretary's estimates of the total number of respondents for purposes of the PRA discussion. For the foregoing reasons, the Secretary has concluded it is reasonable to maintain the estimate of affiliates per corporate group used in the PRA discussion in the Proposed Rules and therefore to assume that a total of 5,010 affiliated financial companies would qualify as records entities.
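As a rough arithmetic check, the per-group average implied by these totals (an inference from the stated figures, not a number appearing in the rule) is:

```latex
\[
  \frac{5{,}010 \ \text{affiliated financial companies}}{30 \ \text{corporate groups}}
  = 167 \ \text{affiliates per corporate group, on average}
\]
```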
The Secretary's recordkeeping, reporting, data retention, and records generation burden estimates are based on discussions with the PFRAs regarding their prior experience with initial burden estimates for other recordkeeping systems. The Secretary also considered the burden estimates in rulemakings with similar recordkeeping and reporting requirements.
In order to comply with the Final Rules, each of the large corporate group respondents will need to set up its network infrastructure to collect data in the required format. This will likely impose a one-time initial burden on the large corporate group respondents in connection with the necessary updates to their recordkeeping systems, such as systems development or modifications. This initial burden is mitigated to some extent because QFC data is likely already retained in some form by each large corporate group respondent in the ordinary course of business, but large corporate group respondents may need to amend internal procedures, reprogram systems, reconfigure data tables, and implement compliance processes. Moreover, they may need to standardize the data and create records tables to match the format required by the Final Rules. In recognition of this, as discussed in section II.A.3 above, the Final Rules provide for staggered compliance dates that will provide all records entities with additional time to comply with the recordkeeping requirements. Under the Final Rules, all but the very largest institutions will have at least two years to comply with the rules' requirements.
As discussed above, the Final Rules also apply to affiliated financial companies of the large corporate group respondents. The Final Rules will likely impose a one-time initial burden on the affiliated financial companies in connection with necessary updates to their recordkeeping systems, such as systems development or modifications. These burdens will vary widely among affiliated financial companies. As noted herein and as discussed in section II.C.3 above, the Final Rules provide a de minimis exemption from the recordkeeping and reporting requirements for certain records entities that have a minimal level of QFC activity, which the Secretary believes will significantly reduce the number of affiliated financial companies subject to the recordkeeping and reporting requirements of the Final Rules.
The Secretary believes that the large corporate groups subject to the Final Rules are likely to rely on centralized systems to comply with most of the recordkeeping requirements, as set forth herein, for the QFC activities of all affiliated members of the corporate group. The entity responsible for each large corporate group's centralized system will likely operate and maintain a technology shared services model with the majority of the technology applications, systems, and data shared by the multiple affiliated financial companies within the corporate group. Therefore, the majority of the recordkeeping burden stemming from the Final Rules will be borne by the entity responsible for each large corporate group's centralized systems, while relatively little initial and ongoing recordkeeping burden will be imposed on their affiliated financial companies. The affiliated financial companies will likely have a much lower burden because they can utilize the technology and network infrastructure operated and maintained by the entity responsible for the centralized system at their respective large corporate group. Similarly, the Secretary believes that the affiliated financial companies will rely on the entities responsible for the centralized systems to perform the requirements under section 148.3(a)(1)(ii).
Similarly, the Secretary believes that affiliated financial companies will rely on large corporate group respondents to submit any requests for extensions of time under section 148.1(d)(3) or requests for exemption from one or more requirements of the Final Rules under section 148.3(c)(3).
The initial and annual recordkeeping burden is imposed by the Dodd-Frank Act, which requires that the Secretary prescribe regulations requiring financial companies to maintain records with respect to QFCs to assist the FDIC as receiver of a covered financial company in being able to exercise its rights under the Act and fulfill its obligations under sections 210(c)(8), (9), or (10) of the Act.
As discussed in more detail in section III.C.6.a below, the Secretary estimates the potential total costs of the initial recordkeeping burden associated with the Final Rules, including the burden hours estimated above plus estimated technology and systems development and modification costs, to be $36,631,995. The potential total costs of annual recordkeeping and reporting burdens associated with the Final Rules, including the burden hours estimated above, are estimated to be $1,248,795.
It has been determined that the Final Rules are a significant regulation as defined in section 3(f)(1) of Executive Order 12866, as amended. Accordingly, the Final Rules have been reviewed by OMB. The Regulatory Assessment prepared by the Secretary for the Final Rules is provided below.
The rulemaking is required by the Dodd-Frank Act to implement the QFC recordkeeping requirements of section 210(c)(8)(H) of the Act. Section 210(c)(8)(H) generally provides that if the PFRAs do not prescribe joint final or interim final regulations requiring financial companies to maintain records with respect to QFCs within 24 months from the date of enactment of the Act, the Chairperson of the Council shall prescribe such regulations in consultation with the FDIC. The Secretary, as Chairperson of the Council, is adopting the Final Rules in consultation with the FDIC because the PFRAs did not prescribe such joint final or interim final regulations. The recordkeeping required in the Final Rules is necessary and appropriate to assist the FDIC as receiver to exercise its rights and fulfill its obligations under sections 210(c)(8), (9), and (10) of the Dodd-Frank Act, by enabling it to assess the consequences of decisions to transfer, disaffirm or repudiate, or allow the termination of QFCs with one or more counterparties.
The recent financial crisis has demonstrated that management of QFC positions, including steps undertaken to close out such positions, can be an important element of a resolution strategy which, if not handled properly, may magnify market instability. Large, interconnected financial companies may hold very large positions in QFCs involving numerous counterparties. A disorderly unwinding of these QFCs, including the mass exercise of QFC default rights and the rapid liquidation of collateral, could cause severe negative consequences for not only the counterparties themselves but also U.S. financial stability. A disorderly unwind could result in rapid liquidations, or “fire sales,” of large volumes of financial assets, such as the collateral that secures the contracts, which can in turn weaken and cause stress for other firms by lowering the value of similar assets that they hold or have pledged as collateral to other counterparties.
In order for the FDIC to effectuate an orderly liquidation of a covered financial company under Title II, the FDIC would need to make appropriate decisions regarding whether to transfer QFCs to a bridge financial company or other solvent financial institution or leave QFCs of the covered financial company in receivership. Determining whether to transfer QFCs in a manner that complies with the requirements of Title II and ensuring continued performance on any QFCs transferred requires detailed and standardized records. It would not be possible for the FDIC to fully analyze a large amount of QFC information in the short time frame afforded by Title II unless such information is readily available to the FDIC in a standardized format designed to enable the FDIC to conduct the analysis in an expeditious manner.
As referenced in section I above, Title II requires the FDIC as receiver to exercise its authorities, to the greatest extent practicable, in a manner that maximizes value, minimizes losses, and mitigates the potential for serious adverse effects to the financial system. Title II also requires that the aggregate amount of liabilities of a covered financial company that are transferred to a bridge financial company from a covered financial company not exceed the aggregate amount of the assets of the covered financial company that are transferred to the bridge financial company from the covered financial company. If it does not have the records required by the rules, the FDIC may be unable to assess the financial position associated with certain QFCs and thus may not be able to determine how the transfers would affect the financial viability of a bridge financial company or other transferee institution, how the transfers would affect financial stability, whether the transfers would serve to maximize value and minimize losses in the disposition of assets of the receivership, and whether the transfers would cause the amount of aggregate transferred liabilities of the bridge financial company to exceed the amount of aggregate transferred assets.
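In shorthand, writing $L_{\text{transferred}}$ for the aggregate amount of liabilities of the covered financial company transferred to the bridge financial company and $A_{\text{transferred}}$ for the aggregate amount of assets so transferred, the Title II constraint described above can be paraphrased informally as:

```latex
\[
  L_{\text{transferred}} \;\le\; A_{\text{transferred}}
\]
```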
Furthermore, as discussed in sections I and II above, if the FDIC as receiver decides to transfer any QFC with a particular counterparty, Title II requires that it must transfer all QFCs between the covered financial company and such counterparty and any affiliate of such counterparty to a single financial institution, and if the FDIC as receiver decides to disaffirm or repudiate any QFC with a particular counterparty, it must disaffirm or repudiate all QFCs between the covered financial company and such counterparty and any affiliate of such counterparty. If the FDIC were to lack information about the affiliates of the counterparties to the QFCs of the covered financial company, it might not be able to transfer the QFCs given its uncertainty as to whether such a transfer would violate this requirement.
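Informally, for each counterparty group $G$ (a counterparty together with its affiliates), writing $Q(G)$ for the set of QFCs between the covered financial company and $G$, $T(G)$ for the subset transferred to a single financial institution, and $R(G)$ for the subset disaffirmed or repudiated, the all-or-none requirement described above can be paraphrased as:

```latex
\[
  T(G) \in \{\varnothing,\ Q(G)\}
  \quad\text{and}\quad
  R(G) \in \{\varnothing,\ Q(G)\}
  \qquad \text{for every counterparty group } G.
\]
```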
The FDIC's inability to effect the transfer of QFCs for any of the above reasons could have significant adverse effects on financial stability in circumstances in which transferring such QFCs may have prevented the unnecessary termination of QFCs and fire sales of collateral securing these QFCs. Even after a transfer decision is made, the records required by the rule are necessary to ensure that the bridge
In assessing the need for these recordkeeping requirements, we have reviewed two categories of academic literature. As highlighted above, one of the potential channels through which the disorderly unwinding of QFCs could cause severe negative consequences for both the counterparties themselves and U.S. financial stability is through the rapid liquidation of collateral. The disorderly failure of a financial company with a large QFC portfolio may lead QFC counterparties to exercise their contractual remedies and rights by closing out positions and liquidating collateral, while also potentially increasing uncertainty in both derivatives and asset markets. This could lead to lower asset prices, decrease the availability of funding, and increase the likelihood that other financial companies also are forced to liquidate assets. First, to assess the potential impact of rapid liquidations, we have reviewed economic studies of fire sales among financial companies. Second, while there is limited academic literature specifically focused on the cost of a disorderly unwinding of a large, complex financial company's QFC portfolio, there has been recent literature analyzing the cost of the Lehman Brothers bankruptcy in 2008, which may be illustrative of the potential costs.
The economic literature on financial company fire sales offers insight into their potential internal and external impacts. While not directly addressing QFCs, the fire sale literature can be applied to the potential impact of the rapid liquidation of QFC collateral that might occur in a disorderly unwinding of a large QFC portfolio. As noted above, the recordkeeping required by the Final Rules is necessary to assist the FDIC in being able to make decisions regarding whether to transfer QFCs of a covered financial company to a bridge financial company or other solvent financial institution or to retain the QFCs in the covered financial company in receivership. Transferring QFCs, if appropriate, may prevent the mass exercise of QFC default rights and a corresponding fire sale of assets held as collateral for those QFCs.
Shleifer and Vishny (2011) argue that before the September 2008 Lehman Brothers bankruptcy many specialist buyers, including most financial companies, were active in the market, but that after the Lehman bankruptcy most of them were unwilling to buy assets, causing security prices to plunge and prompting fund withdrawals, collateral calls, and self-reinforcing fire sales. This cycle of price collapses and deleveraging increased the fragility of the financial system and disrupted financial intermediation.
At the time of a fire sale both seller and non-seller financial companies may curtail their lending, thereby imposing additional social costs associated with reduced financial intermediation. Shleifer and Vishny (2010)
Coval and Stafford (2007)
Dinc, Erel, and Liao (2015)
While ample research documents the costs of fire sales to distressed firms selling assets, little analytic emphasis has been placed on the effect of fire sales on asset buyers. A recent study by Meier and Servaes (2015)
In contrast to studies of the direct discounts or stock returns associated with asset transactions during fire sales, Duarte and Eisenbach (2015)
On the borrower side, Campello
Numerous researchers have provided broad estimates of the economic costs of the 2007-09 financial crisis (see GAO (2013)
The net worth of Lehman Brothers derivative positions at the time of bankruptcy on September 15, 2008 totaled $21 billion, with 96 percent representing over-the-counter (OTC) positions.
We use a framework that divides costs associated with derivatives resolution into private costs and public (external) costs. Private costs consist of direct losses to derivatives counterparties from unrecovered claims, indirect costs to derivatives counterparties from loss of hedged positions, costs to other Lehman Brothers creditors in the bankruptcy proceeding due to reductions in recovery values resulting from the termination and settlement of OTC derivatives, losses to the Lehman estate from excess collateral transfers during bulk sales of exchange-traded derivatives, and litigation and administrative expenses. While we find no literature that assesses the public costs directly attributable to the resolution of Lehman's derivatives portfolio, below we examine the literature assessing the public impact of Lehman's failure more broadly.
While rigorous estimates of the value of each cost element listed above would be ideal, in reality we are constrained by a lack of publicly available data. Therefore, this section combines qualitative descriptions of costs with limited quantitative information when available, in an effort to provide insight on the costs of resolving Lehman's QFC portfolio under the bankruptcy proceedings.
Johnson and Mamun (2012)
Dumontaux and Pop (2012)
The economic literature on financial asset fire sales maintains that such events are more systemically harmful when occurring during industry-wide periods of distress, making mitigating these costs a public policy concern. The Lehman Brothers bankruptcy and the resulting QFC terminations occurred during a crisis period, and might have imposed widespread private and public costs. We do not compare the Lehman bankruptcy costs to the alternative of potential resolution costs under a counterfactual case had Title II of the Dodd-Frank Act been in effect at the time of the Lehman bankruptcy filing. Nonetheless, Fleming and Sarkar (2014) argue that, “some of the losses associated with the failure of Lehman Brothers may have been avoided in a more orderly liquidation process.”
The FDIC promulgated 12 CFR part 371, Recordkeeping Requirements for Qualified Financial Contracts (“Part 371”), pursuant to section 11(e)(8)(H) of the FDIA.
Based on discussions with the staff of the PFRAs who are familiar with financial company operations and have experience supervising financial companies with QFC portfolios, the Secretary believes that the large corporate groups that would be subject to the Final Rules should already be maintaining much of the QFC information required to be maintained under the Final Rules as part of their ordinary course of business. In order for these large corporate groups to effectively manage their QFC portfolios, they need to have robust recordkeeping systems in place; for example, large corporate groups that trade derivatives out of several distinct legal entities need to have detailed records, including counterparty identification, position-level data, collateral received and posted, and contractual requirements, in order to effectively manage their portfolio, perform on contracts, and monitor risks. As noted by commenters, regulated financial companies must maintain extensive QFC records pursuant to other regulatory requirements.
The Secretary considered alternatives to implementing the recordkeeping requirements of the Final Rules but believes that the adopted form is the best available method of achieving both the statutory mandate and the regulatory objectives. The assessment of alternatives below is organized into three subcategories: The scope of the rules; the content of records; and standardized recordkeeping.
The scope of the Final Rules and the reasons for the changes made to the scope of the rules as compared to the Proposed Rules are provided in section II.A.1, above. The Secretary considered alternative criteria in developing the definition of a records entity, such as including financial companies that have more than $10 billion in assets. This threshold, which would have captured more financial companies that potentially might be considered for orderly liquidation under Title II, has been used in other regulatory requirements. For example, the Dodd-Frank Act requires certain financial companies with more than $10 billion in total consolidated assets to conduct annual stress tests.
However, the Secretary determined that while it is possible that financial companies with more than $10 billion and less than $50 billion in total assets would be considered for orderly liquidation under Title II, a more appropriate threshold is $50 billion in total consolidated assets, supplemented by the secondary thresholds of $250 billion of total gross notional derivatives outstanding or $3.5 billion of derivative liabilities. Imposing the $50 billion total assets threshold by itself or including all financial companies with over $10 billion in total assets would substantially increase the number of financial companies subject to recordkeeping requirements, many of which would likely not be considered for orderly liquidation under Title II. Financial companies with total assets of $50 billion or more and with a substantial degree of activity in QFCs, as indicated by total gross notional derivatives outstanding of at least $250 billion or derivative liabilities of at least $3.5 billion, potentially would be among the most likely to be considered for orderly liquidation under Title II. The definition of “records entity” in the Final Rules is thus designed to reduce recordkeeping burdens on smaller financial company groups by only capturing those financial companies that are part of a group with a member that is the type of company for which the FDIC is most likely to be appointed as receiver.
The Secretary determined, after consulting with the FDIC, that requiring each records entity to maintain the data included in Tables A-1 through A-4 and the four master data lookup tables of the appendix to the Final Rules is necessary to assist the FDIC in being able to effectively exercise its rights under the Act and fulfill its obligations under sections 210(c)(8), (9), or (10) of the Act. To facilitate the resolution of QFC portfolios, the FDIC, upon being appointed as receiver for a covered financial company under Title II, would need to analyze such data in order to promptly effectuate decisions. The information must be sufficient to allow the FDIC to estimate the financial and operational impact on the covered financial company and its counterparties, affiliated financial companies, and the financial markets as a whole of the FDIC's decision to transfer, retain and disaffirm or repudiate, or retain and allow the counterparty to terminate the covered financial company's QFCs. It must also allow the FDIC to assess the potential impact that such decisions may have on the financial markets as a whole, which may inform its transfer decisions. The need for the information specified by each table is discussed in further detail in section II.D.2 above.
As indicated above, the recordkeeping requirements of the Final Rules are similar to the FDIC's Part 371 rules, which apply to insured depository institutions in troubled condition, but the information requirements of the Final Rules (which do not apply to insured depository institutions) are more extensive. Previously, in developing the Proposed Rules, the Secretary considered the appropriateness of reducing the recordkeeping burden by aligning the requirements more closely with those of the FDIC's Part 371, but determined, in consultation with the FDIC, that additional recordkeeping beyond that required by Part 371 would be needed for the FDIC to resolve a financial company with significant QFC positions under Title II. The Secretary reaffirms in the Final Rules that this determination is appropriate and that, in a Title II resolution scenario, the FDIC will need the additional information required by the Final Rules to analyze the QFC portfolio, decide how to manage the QFCs, and perform its obligations under the QFCs, including meeting collateral requirements. Furthermore, although applying the Part 371 requirements to records entities instead of the requirements of the Final Rules would have imposed less of a burden on records entities, even the Part 371 requirements would require records entities to update their recordkeeping systems, including by amending internal procedures, reprogramming systems, reconfiguring data tables, and implementing compliance processes, in ways similar to those expected to be required of records entities complying with the Final Rules.
As an example of the additional information required to be maintained under the Final Rules as compared to Part 371, the counterparty-level data required in Table A-2 to the appendix of the Final Rules includes the next margin payment date and payment amount. This will assist the FDIC in ensuring that a covered financial company and its subsidiaries perform their QFC obligations, including meeting clearing organization margin calls. The Table A-3 legal agreement information, which is not included in Part 371, is necessary to enable the FDIC as receiver to evaluate the likely treatment of QFCs under such contracts, and to inform the FDIC of any third-party credit enhancement and the identification of any default or other termination event provisions that reference an entity. Table A-4 includes additional collateral detail data, such as the location of collateral, the collateral segregation status, and whether the collateral may be subject to re-hypothecation by the counterparty. These additional data are necessary to enable the FDIC to assess risks associated with the collateral and improve the FDIC's ability to analyze various QFC transfer or termination scenarios. For example, for cross-border transactions, this information would help the FDIC evaluate the availability of collateral in different jurisdictions and the related close-out risks under local law if the receiver cannot arrange for the transfer of QFC positions. As noted above, we believe in many cases records entities are maintaining the additional information required under the rules due to existing business practices or other regulatory requirements. However, the Secretary understands that these large corporate groups are not currently maintaining the QFC records in the standardized format prescribed by the Final Rules and set forth in the appendix to the Final Rules, such that the additional information required will impose additional burden associated with amending internal procedures, reprogramming systems, reconfiguring data tables, and implementing compliance processes.
The Secretary determined that requiring records entities to have the capacity to maintain and generate QFC records in the uniform, standardized format set forth in the appendix to the Final Rules is necessary to assist the FDIC in being able to effectively exercise its rights under the Act and fulfill its obligations under sections 210(c)(8), (9), or (10) of the Act. Specifically, when the FDIC is appointed as receiver of a covered financial company, the covered financial company's QFC counterparties are prohibited from exercising their contractual right of termination until 5 p.m. (eastern time) on the first business day following the date of appointment. After its appointment as receiver and prior to the close of the aforementioned 5 p.m. deadline, the FDIC has three options in managing a covered financial company's QFC portfolio. Specifically, with respect to all of the covered financial company's QFCs with a particular counterparty and all its affiliates, the FDIC may: (1) Transfer the QFCs to a financial institution, including a bridge financial company established by the FDIC; (2) retain the QFCs within the receivership and allow the counterparty to exercise contractual remedies to terminate the QFCs; or (3) retain the QFCs within the receivership, disaffirm or repudiate the QFCs, and pay compensatory damages. If the FDIC transfers the QFCs to a financial institution, the counterparty may not terminate the QFCs solely because the QFCs were transferred, or by reason of the covered financial company's financial condition or insolvency or the appointment of the FDIC as receiver. If the FDIC does not transfer the QFCs and does not disaffirm or repudiate such QFCs within the one business day stay period, the counterparty may exercise contractual remedies to terminate the QFCs and assert claims for payment from the covered financial company and may have rights to liquidate the collateral pledged by the covered financial company.
Previously, in developing the Proposed Rules, the Secretary considered reducing the recordkeeping burden by permitting the maintenance of QFC records in non-standardized formats, but determined, after consulting with the FDIC, that this alternative would compromise the FDIC's flexibility as receiver in managing the QFC portfolio and impair its ability as receiver to maximize the value of the assets of the covered financial company in the context of orderly liquidation.
However, while the Final Rules specify a standardized recordkeeping format, the Secretary also recognizes that there may be particular types of QFC or counterparties for which more limited information may be sufficient to enable the FDIC to exercise its rights under the Act and fulfill its obligations under sections 210(c)(8), (9), or (10) of the Act. The Final Rules provide the Secretary with the discretion to grant conditional or unconditional exemptions from compliance with one or more of the requirements of the Final Rules, which could include exemptions with respect to the information required regarding particular types of QFCs or counterparties.
Instead of requiring all financial companies to maintain records with respect to QFCs, the Secretary is limiting the scope of the Final Rules to a narrow subset of financial companies. Discretion to do so is afforded under section 210(c)(8)(H)(iv) of the Act, which requires the recordkeeping requirements to differentiate among financial companies by taking into consideration, among other things, their size and risk. The Secretary is exercising this discretion to define the term “records entity” and thereby include within the scope of the Final Rules only those financial companies that: (1) Are identified as U.S. G-SIBs; (2) are determined by the Council to be financial companies that could pose a threat to U.S. financial stability; (3) are designated by the Council as systemically important financial market utilities; (4) have total consolidated assets equal to or greater than $50 billion and either (i) total gross notional derivatives outstanding equal to or greater than $250 billion or (ii) derivative liabilities equal to or greater than $3.5 billion; or (5) are part of the same corporate group in which at least one financial company satisfies one or more of the other foregoing criteria. The Final Rules apply only to large corporate groups (including a large corporate group's affiliated financial companies, regardless of their size, if the affiliated financial company is a party to an open QFC and is not an “excluded entity” under the Final Rules). The types of financial companies that qualify as records entities under the Final Rules include those listed in section II.A.1.b, above. The Secretary estimates that 30 large corporate groups would be subject to the recordkeeping requirements.
Based on discussions with the PFRAs who are familiar with financial company operations and have experience supervising financial companies with QFC portfolios, the Secretary believes that the costs of implementing the Final Rules may be mitigated by the fact that records entities should be maintaining most of the QFC information required by the Final Rules as part of their ordinary course of business. However, the Secretary recognizes that the requirement in the Final Rules for records to be maintained in a standardized format, among other requirements, may impose costs and burdens on records entities. In order to comply with the Final Rules, each of the approximately 30 large corporate groups that the Secretary estimates would be subject to the recordkeeping requirements will need to have network infrastructure to maintain data in the required format. The Secretary expects that this will likely impose one-time initial costs on each large corporate group in connection with necessary updates to its recordkeeping systems, such as systems development or modifications. The initial costs to set up network infrastructure will depend on whether a large corporate group already holds and maintains QFC data in an organized electronic format, and if so, whether the data currently reside on different systems rather than on one centralized system. Large corporate groups may need to amend internal procedures, reprogram systems, reconfigure data tables, and implement compliance processes. Moreover, they may need to standardize the data and create tables to match the format required by the Final Rules. However, the Secretary believes that the large corporate groups that would be subject to the Final Rules are likely to rely on existing centralized systems for recording and reporting QFC activities to perform most of the recordkeeping and reporting requirements set forth herein. The entity within the corporate group responsible for this centralized system will likely operate and maintain a technology shared services model with the majority of technology applications shared across the corporate group.
Previously, the Secretary estimated the costs of the initial and annual recordkeeping burdens, as well as the annual reporting burden, associated with the Proposed Rules in both man-hours and dollar terms and requested comment on whether the cost estimates were reasonable. As noted above, the Secretary's recordkeeping, reporting, data retention, and records generation burden estimates were based on discussions with the PFRAs regarding their prior experience with burden estimates for other recordkeeping systems. The Secretary also considered the burden estimates in rulemakings with similar recordkeeping requirements. For example, the initial non-recurring burden estimates provided in rulemakings for such recordkeeping requirements varied based on the scope of requirements and the type of entity subject to the requirements, but included initial burden estimates ranging from approximately 100 to 3,300 hours and estimates of required investments in technology and infrastructure from $50,000 to $250,000. Although the type and amount of data collected and reported for such reporting systems are substantively different in both content and format from the data that would be recorded under the Final Rules, the estimates from these prior rulemakings nevertheless provide some guidance as to the scale of system modifications and information technology investments that would be required for compliance with the Final Rules. Similarly, the types of information technology professionals that will establish the recordkeeping and data retention systems for records entities under the Final Rules are expected to be similar to the professionals involved in establishing the other systems referenced above.
Most commenters offered general comments on the costs associated with complying with the Proposed Rules, with several stating that the costs—either in general, or as related to certain proposed recordkeeping requirements—outweighed the benefits to the FDIC as receiver.
As discussed in detail in section II above, after carefully considering all of the comments received and consulting with the FDIC, the Secretary is adopting numerous changes from the Proposed Rules. Many of these changes are being adopted in response to comments and are intended to limit the scope and mitigate the burdens associated with complying with the QFC recordkeeping requirements of the Final Rules. In large part, these changes relate to narrowing the scope of the definition of “records entity,” extending the initial compliance period for all records entities, eliminating certain proposed recordkeeping requirements, and providing for a de minimis exemption from the preponderance of the recordkeeping requirements for certain records entities that have a minimal level of QFC activity.
Taking into consideration the changes made in the Final Rules and the comments received as to the burden the rules would place on records entities, the Secretary has updated the estimated potential costs. It is estimated that the initial recordkeeping burden for all records entities (including affiliates) will be approximately 218,505 hours with a total one-time initial cost of approximately $36,631,995 (in nominal dollars), representing $1,221,000 per large corporate group on average. The basis for this estimate, discussed further below, is necessarily constrained by the limited availability of relevant information, including the lack of quantitative information from commenters.
Specifically, based on staff-level discussions with several of the PFRAs, burden estimates in rulemakings with similar recordkeeping requirements, and the comments received, it is expected that each of the approximately 30 large corporate groups will incur on average approximately $500,000 in systems development and modification costs, including the purchase of computer software, and that the entity responsible for maintaining the centralized system within each large corporate group will incur 7,200 initial burden hours at a cost of $712,800 to update its recordkeeping systems. This initial burden is mitigated to some extent because QFC data is likely already retained in some form by each large corporate group respondent in the ordinary course of business, but large corporate group respondents may need to amend internal procedures, reprogram systems, reconfigure data tables, and implement compliance processes. Moreover, they may need to standardize the data and create records tables to match the format required by the Final Rules. These costs will likely be borne by the entity responsible for maintaining the centralized system within each large corporate group. It is expected that the initial burden hours will require the work of senior programmers, programmer analysts, and similar information technology professionals, whose estimated average hourly wage rate is approximately $99 per hour.
The total estimated one-time cost for all large corporate group respondents to comply with the initial recordkeeping burden is approximately $36,384,000, of which $21,384,000 is due to the burden hours and $15,000,000 is for systems development and modification costs. This is based on the estimated 7,200 initial burden hours for each of the 30 large corporate groups multiplied by the estimated average hourly wage rate for recordkeepers (216,000 hours multiplied by $99/hour) and the $500,000 in systems development and modification costs for each of the 30 large corporate groups. Finally, the total estimated one-time initial cost includes the estimated cost for the 5,010 affiliated financial company respondents to comply with the initial recordkeeping burden, which is approximately $247,995. This is based on an estimated 0.5 initial burden hour for each affiliated financial company, 5,010 affiliated financial companies, and the $99 estimated average hourly wage rate for recordkeepers described above (2,505 hours multiplied by $99/hour).
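To make the arithmetic behind these figures easier to trace, the following illustrative sketch (not part of the Final Rules; the variable names are ours) reproduces the one-time initial cost estimate from the counts, hours, and rates stated above.

```python
# Illustrative reconstruction of the one-time initial recordkeeping cost estimate.
# All inputs are taken from the preamble figures above; nothing here is rule text.

NUM_GROUPS = 30                    # large corporate groups expected to be subject to the rules
HOURS_PER_GROUP = 7_200            # initial burden hours for each group's centralized-system entity
RECORDKEEPER_RATE = 99             # estimated average hourly wage rate, dollars per hour
SYSTEMS_COST_PER_GROUP = 500_000   # systems development and modification costs per group

NUM_AFFILIATES = 5_010             # affiliated financial company respondents
HOURS_PER_AFFILIATE = 0.5          # initial burden hours per affiliate

group_hours = NUM_GROUPS * HOURS_PER_GROUP                 # 216,000 hours
group_hours_cost = group_hours * RECORDKEEPER_RATE         # $21,384,000
systems_cost = NUM_GROUPS * SYSTEMS_COST_PER_GROUP         # $15,000,000
affiliate_hours = NUM_AFFILIATES * HOURS_PER_AFFILIATE     # 2,505 hours
affiliate_cost = affiliate_hours * RECORDKEEPER_RATE       # $247,995

total_hours = group_hours + affiliate_hours                # 218,505 hours
total_cost = group_hours_cost + systems_cost + affiliate_cost    # $36,631,995
print(total_hours, total_cost, round(total_cost / NUM_GROUPS))   # ~$1,221,067 per group on average
```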
However, section 148.1(d)(1)(i) of the Final Rules provides for compliance periods of between 540 days and four years after the effective date of the Final Rules, depending on the total assets of records entities. Thus, the initial recordkeeping burden is expected to occur over multiple years, resulting in a substantial reduction in the annual cost. Information as to how records entities would spread this initial cost over the compliance period is not available. However, assuming the costs would be incurred evenly over the entire compliance period, this would result in annualized one-time initial recordkeeping costs ranging from approximately $814,000 per year for a large corporate group with a 540-day compliance period to approximately $305,267 per year for a large corporate group with a four-year compliance period.
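As an illustration only, and assuming (as the preamble does) that costs are incurred evenly over the compliance period and that the 540-day period is treated as roughly one and one-half years, the per-group annualized figures can be reproduced as follows; the variable names are ours.

```python
# Illustrative annualization of the average per-group one-time cost over the compliance periods.
# Assumes even incurrence over the period; the 540-day period is treated as ~1.5 years.

per_group_one_time = 36_631_995 / 30        # ~$1,221,067 average per large corporate group

print(round(per_group_one_time / 1.5))      # 540-day compliance period: ~$814,000 per year
print(round(per_group_one_time / 4))        # four-year compliance period: ~$305,267 per year
```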
Based in part on staff-level discussions with several of the PFRAs, burden estimates in rulemakings with similar recordkeeping requirements, and the comments received, it is expected that the total estimated recurring annual recordkeeping burden necessary to oversee, maintain, and utilize the recordkeeping system will be approximately 240 hours for each large corporate group and 0.5 hours for each affiliated financial company. Based on the estimate of 30 large corporate groups and 167 affiliates of each corporate group that will be subject to the rules, the total estimated annual recordkeeping burden for all records entities will be approximately 9,705 hours with a total annual cost of approximately $960,795 (9,705 hours multiplied by $99/hour). The estimated average hourly wage rate for recordkeepers to comply with the annual recordkeeping burden is approximately $99 per hour, using the same methodology described above for compliance with the initial recordkeeping burden.
With regard to reporting burdens under the Final Rules, a records entity may request in writing an extension of time with respect to compliance with the recordkeeping requirements or an exemption from the recordkeeping requirements. The annual reporting burden under the Final Rules associated with such exemption requests is estimated to be approximately 50 hours per large corporate group. The estimated average hourly wage rate for complying with the annual reporting burden is approximately $192 per hour, based on the U.S. Department of Labor, Bureau of Labor Statistics' occupational employment and wage statistics for financial sector occupations, dated May 2015. The $192 hourly wage rate is based on the average hourly wage rates for compliance managers, directors of compliance, and compliance attorneys that will conduct the reporting. The total annual cost of the reporting burden under the Final Rules is approximately $288,000 (50 hours multiplied by 30 large corporate groups multiplied by $192/hour).
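The recurring annual figures can be traced the same way. The short sketch below (illustrative only; the names are ours) combines the annual recordkeeping and annual reporting estimates; the 5,010 affiliates used above correspond to roughly 167 affiliates per large corporate group.

```python
# Illustrative reconstruction of the recurring annual burden estimates from the preamble figures.

NUM_GROUPS = 30
NUM_AFFILIATES = 5_010                       # roughly 167 affiliates per large corporate group

recordkeeping_hours = NUM_GROUPS * 240 + NUM_AFFILIATES * 0.5   # 9,705 hours
recordkeeping_cost = recordkeeping_hours * 99                    # $960,795

reporting_hours = NUM_GROUPS * 50                                # 1,500 hours
reporting_cost = reporting_hours * 192                           # $288,000

print(recordkeeping_cost, reporting_cost)
```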
Based on the total one-time cost (phased in over 540 days to 4 years), the total annual recordkeeping cost, and the total annual cost of the reporting burden, the estimated net present values of the estimated potential costs of the Final Rules over the next 10 years are approximately $42,103,000 using a discount rate of 3 percent and $38,000,000 using a discount rate of 7 percent.
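The preamble does not state the exact phase-in assumptions underlying the $42,103,000 and $38,000,000 net present value figures, so the sketch below is only a structural illustration under one simplified assumption (the one-time cost falls entirely in the first year and the recurring costs recur in each of the ten years); its output will not match the preamble's figures exactly.

```python
# Structural illustration of a ten-year net present value calculation.
# Simplified timing assumption; results will differ from the preamble's phased-in estimates.

ONE_TIME = 36_631_995                        # total one-time initial recordkeeping cost
ANNUAL = 960_795 + 288_000                   # annual recordkeeping plus annual reporting costs

def npv(rate, years=10):
    flows = [ONE_TIME + ANNUAL] + [ANNUAL] * (years - 1)
    return sum(flow / (1 + rate) ** t for t, flow in enumerate(flows, start=1))

print(round(npv(0.03)), round(npv(0.07)))    # compare with the preamble's 3 and 7 percent figures
```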
The estimated potential costs in nominal dollars for the initial recordkeeping burden, the annual recordkeeping burden, and the annual reporting burden associated with the Final Rules are summarized in the following table.

Burden | Estimated hours | Estimated cost (nominal dollars) |
---|---|---|
Initial recordkeeping burden (one-time, all records entities) | 218,505 | $36,631,995 |
Annual recordkeeping burden (all records entities) | 9,705 | $960,795 |
Annual reporting burden (all large corporate groups) | 1,500 | $288,000 |
As noted earlier, QFCs tend to increase the interconnectedness of the financial system, and the recent financial crisis demonstrated that the management of QFC positions can be an important element of a resolution strategy that, if not handled properly, may magnify market instability. The recordkeeping requirements of the Final Rules are therefore designed to ensure that the FDIC, as receiver of a covered financial company, will have comprehensive information about the QFC portfolio of such financial company subject to orderly resolution, and enable the FDIC to carry out the rapid and orderly resolution of a financial company's QFC portfolio in the event of insolvency, for example, by transferring QFCs to a bridge financial company within the narrow time frame afforded by the Act. Given the short time frame for FDIC decisions regarding a QFC portfolio of significant size or complexity, the Final Rules require the use of a regularly updated and standardized recordkeeping format to allow the FDIC to process the large amount of QFC information quickly. In the absence of updated and standardized information, for example, the FDIC could leave QFCs in the receivership in circumstances where transferring them to a bridge financial company or other solvent financial institution would have been the preferred course of action. Specifically, if the FDIC does not transfer the QFCs and does not disaffirm or repudiate such QFCs, counterparties may terminate the QFCs and assert claims for payment from the covered financial company and may have rights to liquidate the collateral pledged by the covered financial company. However, a decision by the FDIC not to transfer the QFCs of a large, interconnected financial company must be deliberate and based on detailed information about the QFC portfolio. Otherwise, the subsequent unwinding and termination of QFCs involving numerous counterparties risks becoming disorderly, potentially resulting in the rapid liquidation of collateral, deterioration in asset values, and severe negative consequences for U.S. financial stability. The FDIC as receiver may also wish to make sure that affiliates of the covered financial company continue to perform their QFC obligations in order to preserve the critical operations of the covered financial company and its affiliates. In such cases, the FDIC may need to arrange for additional liquidity, support, or collateral for the affiliates to enable them to meet collateral obligations and generally perform their QFC obligations.
While there could be significant benefits associated with the QFC recordkeeping requirements of the Final Rules, such benefits are difficult to quantify. The Final Rules are only one component of the orderly liquidation authority under Title II of the Act and the benefits of the Final Rules will only be realized upon such authority being exercised. Moreover, implementation of additional provisions of the Dodd-Frank Act has, among other things: (1) Subjected large, interconnected financial companies to stronger supervision, and, as a result, reduced the likelihood of their failure; and (2) blunted the impact of any such failure on U.S. financial stability and the economy. For example, bank holding companies with total consolidated assets of $50 billion or more and nonbank financial companies supervised by the Board are subject to supervisory and company-run stress tests to help the Board and the company measure the sufficiency of capital available to support the company's operations throughout periods of stress.
Nevertheless, one way to gauge the potential benefits of the Final Rules is to examine the effect of the recent financial crisis on the real economy and how the Title II orderly liquidation authority as a whole will help reduce the probability or severity of a future financial crisis. For example, in a 2013 Government Accountability Office (GAO) report, GAO cited research that suggests that U.S. output losses associated with the 2007-2009 financial crisis could range from several trillion dollars to over $10 trillion.
However, as discussed above, even if the benefits of preventing future financial crises are significant, it is difficult to quantify such benefits and determine what portion would be attributable to any single provision of the Dodd-Frank Act, let alone those benefits directly attributable to the Final Rules. In addition, as discussed above, the benefits associated with the Final Rules would only be realized if the Title II orderly liquidation authority is exercised.
Executive Order 13563 also directs the Secretary to develop a plan, consistent with law and the Department of the Treasury's resources and regulatory priorities, to conduct a periodic retrospective analysis of significant regulations to determine whether such regulations should be modified, streamlined, expanded, or repealed so as to make the regulations more effective and less burdensome. The Secretary expects to conduct a retrospective analysis not later than seven years after the effective date of the Final Rules. This review will consider whether the QFC recordkeeping requirements are necessary or appropriate to assist the FDIC as receiver in being able to exercise its rights under the Act and fulfill its obligations under sections 210(c)(8), (9), or (10) of the Act and may result in proposed amendments to the Final Rules. For example, the Secretary will review whether the total assets and derivatives thresholds of the definition of “records entity” should be adjusted and whether the data set forth in Tables A-1 through A-4 and the master tables in the appendix of the Final Rules are necessary or appropriate to assist the FDIC as receiver, and whether maintaining different data is necessary or appropriate.
Reporting and recordkeeping requirements.
For the reasons set forth in the preamble, the Department of the Treasury adds part 148 to 31 CFR chapter I to read as follows:
31 U.S.C. 321(b) and 12 U.S.C. 5390(c)(8)(H).
(a)
(b)
(c)
(d)
(A) 540 days after the effective date for a records entity that:
(
(
(B) Two years after the effective date for any records entity that is not subject to the compliance date set forth in paragraph (d)(1)(i)(A) of this section and:
(
(
(C) Three years after the effective date for any records entity that is not subject to the compliance date set forth in paragraphs (d)(1)(i)(A) or (B) of this section and:
(
(
(D) Four years after the effective date for any records entity that is not subject to the compliance dates set forth in paragraphs (d)(1)(i)(A), (B), or (C) of this section.
(ii) A financial company that becomes a records entity after the effective date must comply with § 148.3(a)(2) within 90 days of becoming a records entity and with all other applicable requirements of this part within 540 days of becoming a records entity or within the remainder of the applicable period provided under paragraph (d)(1)(i) of this section, whichever period is longer.
(2)
(3)
(i) A statement of the reasons why the records entity cannot comply by the deadline; and
(ii) A plan for achieving compliance during the requested extension period.
(4)
For purposes of this part:
(a)
(b)
(1) The entity directly or indirectly or acting through one or more other persons owns, controls, or has the power to vote 25 percent or more of any class of voting securities of the other entity;
(2) The entity controls in any manner the election of a majority of the directors or trustees of the other entity; or
(3) The Board of Governors of the Federal Reserve System has determined, after notice and opportunity for hearing in accordance with 12 CFR 225.31, that the entity directly or indirectly exercises a controlling influence over the management or policies of the other entity.
(c)
(d)
(e)
(f)
(1) An insured depository institution as defined in 12 U.S.C. 1813(c)(2);
(2) A subsidiary of an insured depository institution that is not:
(i) A functionally regulated subsidiary as defined in 12 U.S.C. 1844(c)(5);
(ii) A security-based swap dealer as defined in 15 U.S.C. 78c(a)(71); or
(iii) A major security-based swap participant as defined in 15 U.S.C. 78c(a)(67); or
(3) An insurance company.
(g)
(h)
(1) An insurance company as defined in 12 U.S.C. 5381(a)(13); and
(2) A mutual insurance holding company that meets the conditions set forth in 12 CFR 380.11 for being treated as an insurance company for the purpose of section 203(e) of the Dodd-Frank Act, 12 U.S.C. 5383(e).
(i)
(1) Regulatory Oversight Committee means the Regulatory Oversight Committee (of the Global LEI System), whose charter was set forth by the Finance Ministers and Central Bank Governors of the Group of Twenty and the Financial Stability Board, or any successor thereof; and
(2) Global LEI Foundation means the not-for-profit organization organized under Swiss law by the Financial Stability Board in 2014, or any successor thereof.
(j)
(k)
(l)
(1) With respect to any financial company, the primary financial regulatory agency as specified for such financial company in subparagraphs (A), (B), (C), and (E) of 12 U.S.C. 5301(12); and
(2) With respect to a financial market utility that is subject to a designation pursuant to 12 U.S.C. 5463 for which there is no primary financial regulatory agency under § 148.2(l)(1), the Supervisory Agency for that financial market utility as defined in 12 U.S.C. 5462(8).
(m)
(n)
(1) Records entity means any financial company that:
(i) Is not an excluded entity as defined in § 148.2(f);
(ii) Is a party to an open QFC; and
(iii) (A) Is subject to a determination that the company shall be subject to Federal Reserve supervision and enhanced prudential standards pursuant to 12 U.S.C. 5323;
(B) Is subject to a designation as, or as likely to become, systemically important pursuant to 12 U.S.C. 5463;
(C) Is identified as a global systemically important bank holding company pursuant to 12 CFR part 217;
(D)(
(
(
(
(E)(
(
(
(2) A financial company that qualifies as a records entity pursuant to paragraph (n)(1)(iii)(D) will remain a records entity until one year after it ceases to meet the criteria set forth in paragraph (n)(1)(iii)(D) of this section.
(o)
(p)
(q)
(r)
(s)
(a)
(ii) A top-tier financial company must be capable of generating a single, compiled set of the records required to be maintained by § 148.4(a)-(h), in a format that allows for aggregation and disaggregation of such data by records entity and counterparty, for all records entities in its corporate group that are consolidated by or consolidated with such top-tier financial company on financial statements prepared in accordance with U.S. generally accepted accounting principles or other applicable accounting standards or, for financial companies not subject to such principles or standards, that would be consolidated by or consolidated with such financial company if such principles or standards applied.
(2)
(3)
(i) In the case of a records entity, the records specified in § 148.4, and
(ii) In the case of a top-tier financial company, the set of records referenced in paragraph (a)(1)(ii) of this section.
(b)
(2)
(3)
(c)
(2)
(i) In compliance with the recordkeeping requirements of the Commodity Futures Trading Commission or the Securities and Exchange Commission, as applicable, including its maintenance of records pertaining to all QFCs cleared by such records entity; and
(ii) Capable of and not restricted from, whether by law, regulation, or agreement, transmitting electronically to the FDIC the records maintained under such recordkeeping requirements within 24 hours of request of the Commodity Futures Trading Commission or the Securities and Exchange Commission, as applicable.
(3)
(i) Identify the records entity or records entities or the types of records entities to which the exemption should apply;
(ii) Specify the requirement(s) under this part from which the identified records entities should be exempt;
(iii) Provide details as to the size, risk, complexity, leverage, frequency and dollar amount of qualified financial contracts, and interconnectedness to the financial system of each records entity identified in paragraph (c)(3)(i) of this section, to the extent appropriate, and any other relevant factors; and
(iv) Specify the reason(s) why granting the exemption will not impair or impede the FDIC's ability to exercise its rights or fulfill its statutory obligations under 12 U.S.C. 5390(c)(8), (9), and (10).
(4)
(ii) In determining whether to grant an exemption to one or more records entities, including whether to grant a conditional or unconditional exemption, the Secretary will consider any factors deemed appropriate by the Secretary, including whether application of one or more requirements of this part is not necessary to achieve the purpose of this part as described in § 148.1(b).
(iii) If the FDIC does not submit, within 90 days of the date on which the FDIC and the Department of the Treasury received the exemption request, a written recommendation to the Secretary as to whether to grant or deny an exemption request, the Secretary will nevertheless determine whether to grant or deny the exemption request.
Subject to § 148.3(c), a records entity must maintain the following records:
(a) The position level data listed in Table A-1 in appendix A to this part with respect to each QFC to which it is a party.
(b) The counterparty netting set data listed in Table A-2 in appendix A to this part for each netting set with respect to each QFC to which it is a party.
(c) The legal agreements information listed in Table A-3 in appendix A to this part with respect to each QFC to which it is a party.
(d) The collateral detail data listed in Table A-4 in appendix A to this part with respect to each QFC to which it is a party.
(e) The corporate organization master data lookup table in appendix A to this part for the records entity and each of its affiliates.
(f) The counterparty master data lookup table in appendix A to this part for each non-affiliated counterparty with respect to QFCs to which it is a party.
(g) The booking location master data lookup table in appendix A to this part for each booking location used with respect to QFCs to which it is a party.
(h) The safekeeping agent master data lookup table in the appendix to this part for each safekeeping agent used with respect to QFCs to which it is a party.
(i) All documents that govern QFC transactions between the records entity and each counterparty, including, without limitation, master agreements and annexes, schedules, netting agreements, supplements, or other modifications with respect to the agreements, confirmations for each open QFC position of the records entity that has been confirmed and all trade acknowledgments for each open QFC position that has not been confirmed, all credit support documents including, but not limited to, credit support annexes, guarantees, keep-well agreements, or net worth maintenance agreements that are relevant to one or more QFCs, and all assignment or novation documents, if applicable, including documents that confirm that all required consents, approvals, or other conditions precedent for such assignment or novation have been obtained or satisfied.
(j) A list of vendors directly supporting the QFC-related activities of the records entity and the vendors' contact information.