Federal Register Volume 81, Issue 210 (October 31, 2016)

Page Range: 75315-75670
Page and Subject
81 FR 75456 - Sunshine Act Meeting
81 FR 75427 - Indian Gaming; Tribal-State Class III Gaming Compact Taking Effect in the State of California
81 FR 75426 - Indian Gaming; Tribal-State Class III Gaming Compact Taking Effect in the State of California
81 FR 75427 - Indian Gaming; Approval of Amended Tribal-State Class III Gaming Compact in the State of South Dakota
81 FR 75428 - Indian Gaming; Approval of Amendment to Tribal-State Class III Gaming Compact in the State of Oregon
81 FR 75427 - Indian Gaming; Approval of Amendment to Tribal-State Class III Gaming Compact in the State of California
81 FR 75428 - Indian Gaming; Approval of Amended Tribal-State Class III Gaming Compact in the State of California
81 FR 75405 - Proposed Data Collection Submitted for Public Comment and Recommendations
81 FR 75411 - Report on the Performance of Drug and Biologics Firms in Conducting Postmarketing Requirements and Commitments; Availability
81 FR 75406 - Agency Information Collection Activities: Proposed Collection; Comment Request
81 FR 75349 - Describing a Hazard That Needs Control in Documents Accompanying the Food, as Required by Four Rules Implementing the FDA Food Safety Modernization Act: Guidance for Industry; Availability
81 FR 75351 - Good Laboratory Practice for Nonclinical Laboratory Studies; Extension of Comment Period
81 FR 75419 - Labeling for Permanent Hysteroscopically Placed Tubal Implants Intended for Sterilization; Guidance for Industry and Food and Drug Administration Staff; Availability
81 FR 75409 - Agency Information Collection Activities: Submission for OMB Review; Comment Request
81 FR 75429 - Atlantic Wind Lease Sale 6 (ATLW-6) for Commercial Leasing for Wind Power on the Outer Continental Shelf Offshore New York-Final Sale Notice MMAA104000
81 FR 75476 - Petition for Exemption; Summary of Petition Received; Douglas Myers
81 FR 75477 - Petition for Exemption; Summary of Petition Received; Pentastar Aviation Charter, Inc.
81 FR 75438 - Environmental Assessment for Commercial Wind Lease Issuance and Site Assessment Activities on the Atlantic Outer Continental Shelf Offshore New York; MMAA104000
81 FR 75352 - Withholding of Unclassified Technical Data and Technology From Public Disclosure
81 FR 75327 - Drawbridge Operation Regulation; Newtown Creek, Brooklyn and Queens, NY
81 FR 75315 - Temporary Exceptions to FIRREA Appraisal Requirements in Areas Affected by Severe Storms and Flooding in Louisiana
81 FR 75361 - Approval and Promulgation of Air Quality Implementation Plans; State of Utah; Revisions to Nonattainment Permitting Regulations
81 FR 75398 - Combined Notice of Filings
81 FR 75397 - Applied Energy LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for Blanket Section 204 Authorization
81 FR 75393 - Moapa Southern Paiute Solar, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for Blanket Section 204 Authorization
81 FR 75398 - Combined Notice of Filings #1
81 FR 75399 - Combined Notice of Filings
81 FR 75394 - Combined Notice of Filings #1
81 FR 75395 - Combined Notice of Filings #1
81 FR 75401 - Change in Bank Control Notices; Acquisitions of Shares of a Bank or Bank Holding Company
81 FR 75387 - Agency Information Collection Activities; Comment Request; National Professional Development Program: Grantee Performance Report
81 FR 75345 - Fisheries of the Exclusive Economic Zone Off Alaska; Groundfish by Vessels Using Trawl Gear in the Gulf of Alaska
81 FR 75378 - Polyethylene Retail Carrier Bags From Malaysia: Final Results of the Antidumping Duty Administrative Review; 2014-2015
81 FR 75373 - Foreign-Trade Zone (FTZ) 38-Spartanburg, South Carolina Authorization of Production Activity Benteler Automotive Corporation (Automotive Suspension and Body Components) Duncan, South Carolina
81 FR 75374 - Call for Applications for the International Buyer Program Select Service for Calendar Year 2018
81 FR 75400 - Children's Health Protection Advisory Committee
81 FR 75379 - Call for Applications for the International Buyer Program Calendar Year 2018
81 FR 75453 - New Postal Products
81 FR 75376 - Certain Frozen Warmwater Shrimp From India: Initiation and Preliminary Results of Antidumping Duty Changed Circumstances Review
81 FR 75439 - Agency Information Collection Activities; Submission for OMB Review; Comment Request; Special Dipping and Coating Operations (Dip Tanks)
81 FR 75440 - Agency Information Collection Activities; Submission for OMB Review; Comment Request; Safety Standards for Underground Coal Mine Ventilation-Belt Entry Used as an Intake Air Course To Ventilate Working Sections and Areas Where Mechanized Mining Equipment Is Being Installed or Removed
81 FR 75449 - NuScale Power, LLC, Design-Specific Review Standard and Scope and Safety Review Matrix
81 FR 75365 - Mercury and Air Toxics Standards (MATS) Completion of Electronic Reporting Requirements
81 FR 75478 - Pilot Program for Transit-Oriented Development Planning Project Selections
81 FR 75452 - Duke Energy Florida, LLC; Levy Nuclear Plant Units 1 and 2
81 FR 75392 - Environmental Management Site-Specific Advisory Board, Savannah River Site
81 FR 75370 - Submission for OMB Review; Comment Request
81 FR 75444 - TUV Rheinland of North America, Inc.: Applications for Expansion of Recognition and Proposed Modification to the List of Appropriate NRTL Test Standards
81 FR 75442 - Intertek Testing Services NA, Inc.: Application for Expansion of Recognition
81 FR 75446 - Curtis-Strauss LLC: Application for Expansion of Recognition
81 FR 75371 - National Urban and Community Forestry Advisory Council
81 FR 75347 - Refunding Baggage Fees for Delayed Checked Bags
81 FR 75368 - Petitions for Reconsideration and Clarification of Action in Rulemaking Proceeding
81 FR 75400 - Schedule Change Open Commission Meeting, Thursday, October 27, 2016
81 FR 75370 - Forest Resource Coordinating Committee
81 FR 75388 - National Assessment Governing Board Quarterly Board Meeting
81 FR 75390 - Request for Information on Interagency Working Group on Language and Communication's Report on Research and Development Activities
81 FR 75480 - Nondiscrimination on the Basis of Disability in Air Travel: Negotiated Rulemaking Committee Seventh Meeting
81 FR 75481 - Exploring Industry Practices on Distribution and Display of Airline Fare, Schedule, and Availability Information
81 FR 75396 - Breitburn Operating LP v. Florida Gas Transmission Company, LLC; Notice of Complaint
81 FR 75397 - City of Tuscaloosa, Alabama; Notice of Preliminary Permit Application Accepted for Filing and Soliciting Comments and Motions To Intervene
81 FR 75399 - Public Service Company of New Hampshire; Notice of Availability of Environmental Assessment
81 FR 75392 - Alabama Power Company v. Southwest Power Pool; Notice of Complaint
81 FR 75393 - Indianapolis Power & Light Company v. Midcontinent Independent System Operator, Inc.; Notice of Complaint
81 FR 75396 - Dominion Carolina Gas Transmission, LLC; Notice of Application
81 FR 75328 - Medicare Program; Hospital Inpatient Prospective Payment Systems for Acute Care Hospitals and the Long-Term Care Hospital Prospective Payment System and Policy Changes and Fiscal Year 2017 Rates; Quality Reporting Requirements for Specific Providers; Graduate Medical Education; Hospital Notification Procedures Applicable to Beneficiaries Receiving Observation Services; Technical Changes Relating to Costs to Organizations and Medicare Cost Reports; Finalization of Interim Final Rules With Comment Period on LTCH PPS Payments for Severe Wounds, Modifications of Limitations on Redesignation by the Medicare Geographic Classification Review Board, and Extensions of Payments to MDHs and Low-Volume Hospitals; Correction
81 FR 75423 - Commercial Customs Operations Advisory Committee (COAC)
81 FR 75366 - Notice of Proposed Supplementary Rules for Public Lands Managed by the Moab Field Office in Grand County, Utah
81 FR 75384 - Notice of Intent To Grant Exclusive Patent License to RF Networking Solutions, LLC; East Brunswick, NJ
81 FR 75386 - Notice of Public Hearing and Business Meeting; November 9 and December 14, 2016
81 FR 75344 - NASA Federal Acquisition Regulation Supplement: Remove NASA FAR Supplement Clause Engineering Change Proposals (2016-N030)
81 FR 75385 - Submission for OMB Review; Comment Request
81 FR 75449 - Submission for OMB Review; Comment Request
81 FR 75425 - Endangered and Threatened Wildlife and Plants; 5-Year Status Review of the Red Wolf
81 FR 75454 - 2017 Railroad Experience Rating Proclamations, Monthly Compensation Base and Other Determinations
81 FR 75424 - Announcement of Meetings: North American Wetlands Conservation Council; Neotropical Migratory Bird Conservation Advisory Group
81 FR 75371 - Allegheny Resource Advisory Committee Meeting
81 FR 75316 - Excepted Benefits; Lifetime and Annual Limits; and Short-Term, Limited-Duration Insurance
81 FR 75439 - Agency Information Collection Activities; Proposed eCollection eComments Requested; Proposed Renewal, With Change, of a Previously Approved Collection; Attorney Student Loan Repayment Program Electronic Forms
81 FR 75384 - Proposed Information Collection; Comment Request; Natural Resource Damage Assessment Restoration Project Information Sheet
81 FR 75383 - Proposed Information Collection; Comment Request; Expanded Vessel Monitoring System Requirement in the Pacific Coast Groundfish Fishery
81 FR 75388 - Agency Information Collection Activities; Comment Request; Evaluation of the Comprehensive Technical Assistance Centers
81 FR 75374 - Proposed Information Collection; Comment Request; Report of Requests for Restrictive Trade Practice or Boycott
81 FR 75383 - Submission for OMB Review; Comment Request
81 FR 75382 - Submission for OMB Review; Comment Request
81 FR 75372 - Notice of Invitation for Nominations to the Advisory Committee on Agriculture Statistics
81 FR 75373 - Notice of Intent To Request Revision and Extension of a Currently Approved Information Collection
81 FR 75428 - Information Collection Request: National Park Service Centennial National Household Survey
81 FR 75327 - Drawbridge Operation Regulation; Upper Mississippi River, Clinton, IA
81 FR 75491 - Proposed Information Collection (Application Requirements To Receive VA Dental Insurance Plan Benefits Under 38 CFR 17.169) Activity: Comment Request
81 FR 75377 - Freshwater Crawfish Tail Meat From the People's Republic of China: Initiation of Antidumping Duty New Shipper Review
81 FR 75477 - Railroad Safety Advisory Committee; Notice of Meeting
81 FR 75478 - Norfolk Southern Railway Company's Request for Positive Train Control Safety Plan Approval and System Certification
81 FR 75448 - NASA Advisory Council; Aeronautics Committee; Meeting
81 FR 75401 - Patient Safety Organizations: Voluntary Relinquishment From the Patient Safety Leadership Council PSO
81 FR 75402 - Agency Information Collection Activities: Proposed Collection; Comment Request
81 FR 75454 - New Postal Product
81 FR 75473 - Self-Regulatory Organizations; NASDAQ PHLX LLC; Notice of Filing and Immediate Effectiveness of Proposed Rule Change to Add Commentary .14 to Rule 3317 (Compliance With Regulation NMS Plan To Implement a Tick Size Pilot)
81 FR 75471 - Self-Regulatory Organizations; The NASDAQ Stock Market LLC; Notice of Filing and Immediate Effectiveness of Proposed Rule Change To Add Commentary .14 to Rule 4770 (Compliance With Regulation NMS Plan To Implement a Tick Size Pilot)
81 FR 75468 - Self-Regulatory Organizations; NASDAQ BX, Inc.; Notice of Filing and Immediate Effectiveness of Proposed Rule Change To Add Commentary .14 to Rule 4770 (Compliance With Regulation NMS Plan To Implement a Tick Size Pilot)
81 FR 75460 - Options Price Reporting Authority; Notice of Filing and Immediate Effectiveness of Proposed Amendment to the Plan for Reporting of Consolidated Options Last Sale Reports and Quotation Information To Amend OPRA's Non-Display Use Fees
81 FR 75462 - Options Price Reporting Authority; Notice of Filing and Immediate Effectiveness of Proposed Amendment to the Plan for Reporting of Consolidated Options Last Sale Reports and Quotation Information To Amend the Professional Subscriber Device-Based Fees and Policies with Respect to Device-Based Fees
81 FR 75458 - Self-Regulatory Organizations; Bats EDGX Exchange, Inc.; Notice of Filing and Immediate Effectiveness of a Proposed Rule Change To Amend EDGX Rule 11.11, Routing to Away Trade Centers
81 FR 75466 - Self-Regulatory Organizations; Bats EDGA Exchange, Inc.; Notice of Filing and Immediate Effectiveness of a Proposed Rule Change to EDGA Rule 11.11, Routing to Away Trading Centers
81 FR 75464 - Self-Regulatory Organizations; Bats BZX Exchange, Inc.; Notice of Filing and Immediate Effectiveness of a Proposed Rule Change to BZX Rule 11.13, Order Execution and Routing
81 FR 75456 - Self-Regulatory Organizations; Bats BYX Exchange, Inc.; Notice of Filing and Immediate Effectiveness of a Proposed Rule Change To Amend BYX Rule 11.13, Order Execution and Routing
81 FR 75423 - National Cancer Institute; Notice of Meeting
81 FR 75421 - Government-Owned Inventions; Availability for Licensing
81 FR 75421 - Center for Scientific Review; Notice of Closed Meetings
81 FR 75381 - Determination of Overfishing or an Overfished Condition
81 FR 75488 - Unblocking of Specially Designated Nationals and Blocked Persons Resulting From the Termination of the National Emergency and Revocation of Executive Orders Related to Burma
81 FR 75387 - Agency Information Collection Activities; Comment Request; GEPA Section 427 Guidance for All Grant Applications
81 FR 75408 - Agency Information Collection Activities: Proposed Collection; Comment Request
81 FR 75441 - Agency Information Collection Activities; Submission for OMB Review; Comment Request; Derricks Standard
81 FR 75487 - Fee Schedule for the Transfer of U.S. Treasury Book-Entry Securities Held on the National Book-Entry System
81 FR 75338 - Amendment of the Commission's Space Station Licensing Rules and Policies, Second Order on Reconsideration
81 FR 75330 - Procedures for Disclosure of Information Under the Freedom of Information Act
81 FR 75624 - Qualified Financial Contracts Recordkeeping Related to Orderly Liquidation Authority
81 FR 75494 - Teacher Preparation Issues


Vol. 81, No. 210, Monday, October 31, 2016

Contents

Agency for Healthcare Research and Quality
NOTICES
Agency Information Collection Activities; Proposals, Submissions, and Approvals, 75402-75405 2016-26143
Patient Safety Organizations: Voluntary Relinquishment from Patient Safety Leadership Council, 75401-75402 2016-26144

Agriculture Department
See Forest Service
See National Agricultural Statistics Service
NOTICES
Agency Information Collection Activities; Proposals, Submissions, and Approvals, 75370 2016-26205

Army Department
NOTICES
Exclusive Patent Licenses: RF Networking Solutions, LLC, East Brunswick, NJ, 75384-75385 2016-26177

Bureau of the Fiscal Service
NOTICES
Fee Schedule for Transfer of U.S. Treasury Book-Entry Securities Held on National Book-Entry System, 75487-75488 2016-26079

Centers for Disease Control and Prevention
NOTICES
Agency Information Collection Activities; Proposals, Submissions, and Approvals, 75405-75406 2016-26248

Centers for Medicare & Medicaid Services
RULES
Medicare Program: Hospital Inpatient Prospective Payment Systems for Acute Care Hospitals, etc.; Correction, 75328-75330 2016-26182
NOTICES
Agency Information Collection Activities; Proposals, Submissions, and Approvals, 75406-75411 2016-26246 2016-26122 2016-26242

Coast Guard
RULES
Drawbridge Operations:
Newtown Creek, Brooklyn and Queens, NY, 75327-75328 2016-26235
Upper Mississippi River, Clinton, IA, 75327 2016-26150

Commerce Department
See Foreign-Trade Zones Board
See Industry and Security Bureau
See International Trade Administration
See National Oceanic and Atmospheric Administration

Comptroller of the Currency
RULES
Financial Institutions Reform, Recovery, and Enforcement Act Appraisal Requirements: Louisiana; Temporary Exceptions in Areas Affected by Severe Storms and Flooding, 75315-75316 2016-26234

Defense Department
See Army Department
See Navy Department
PROPOSED RULES
Withholding of Unclassified Technical Data and Technology from Public Disclosure, 75352-75361 2016-26236

Delaware River Basin Commission
NOTICES
Meetings: Delaware River Basin Commission, 75386 2016-26176

Education Department
RULES
Teacher Preparation Issues, 75494-75622 2016-24856
NOTICES
Agency Information Collection Activities; Proposals, Submissions, and Approvals:
Evaluation of Comprehensive Technical Assistance Centers, 75388 2016-26158
Guidance for All Grant Applications, 75387-75388 2016-26123
National Professional Development Program: Grantee Performance Report, 75387 2016-26222
Meetings: National Assessment Governing Board, 75388-75390 2016-26194
Requests for Information: Interagency Working Group on Language and Communication's Report on Research and Development Activities, 75390-75392 2016-26193

Employee Benefits Security Administration
RULES
Excepted Benefits; Lifetime and Annual Limits; and Short-Term, Limited-Duration Insurance, 75316-75327 2016-26162

Energy Department
See Federal Energy Regulatory Commission
NOTICES
Meetings: Environmental Management Site-Specific Advisory Board, Savannah River Site, 75392 2016-26206

Environmental Protection Agency
PROPOSED RULES
Air Quality State Implementation Plans; Approvals and Promulgations: Utah; Revisions to Nonattainment Permitting Regulations, 75361-75365 2016-26233
Mercury and Air Toxics Standards Completion of Electronic Reporting Requirements, 75365-75366 2016-26209
NOTICES
Meetings: Children's Health Protection Advisory Committee, 75400 2016-26217

Federal Aviation Administration
NOTICES
Petitions for Exemption; Summaries:
Douglas Myers, 75476 2016-26239
Pentastar Aviation Charter, Inc., 75477 2016-26238

Federal Communications Commission
RULES
Space Station Licensing Rules and Policies, 75338-75344 2016-25935
PROPOSED RULES
Petitions for Reconsideration and Clarification of Action in Rulemaking Proceeding, 75368-75369 2016-26198
NOTICES
Meetings: Federal Communications Commission; Schedule Change, 75400-75401 2016-26197

Federal Deposit Insurance Corporation
RULES
Financial Institutions Reform, Recovery, and Enforcement Act Appraisal Requirements: Louisiana; Temporary Exceptions in Areas Affected by Severe Storms and Flooding, 75315-75316 2016-26234

Federal Energy Regulatory Commission
NOTICES
Applications: Dominion Carolina Gas Transmission, LLC, 75396-75397 2016-26185
Combined Filings, 75394-75396, 75398-75400 2016-26224 2016-26225 2016-26226 2016-26228 2016-26231
Complaints:
Alabama Power Co. v. Southwest Power Pool, 75392-75393 2016-26187
Breitburn Operating, LP v. Florida Gas Transmission Co., LLC, 75396 2016-26190
Indianapolis Power and Light Co. v. Midcontinent Independent System Operator, Inc., 75393-75394 2016-26186
Environmental Assessments; Availability, etc.: Public Service Co. of New Hampshire, 75399 2016-26188
Hydroelectric Applications: Tuscaloosa, AL, 75397 2016-26189
Initial Market-Based Rate Filings Including Requests for Blanket Section 204 Authorizations:
Applied Energy, LLC, 75397-75398 2016-26230
Moapa Southern Paiute Solar, LLC, 75393 2016-26229

Federal Railroad Administration
NOTICES
Meetings: Railroad Safety Advisory Committee, 75477-75478 2016-26147
Positive Train Control Safety Plans: Norfolk Southern Railway Co.; Approval and System Certification, 75478 2016-26146

Federal Reserve System
RULES
Financial Institutions Reform, Recovery, and Enforcement Act Appraisal Requirements: Louisiana; Temporary Exceptions in Areas Affected by Severe Storms and Flooding, 75315-75316 2016-26234
NOTICES
Changes in Bank Control: Acquisitions of Shares of Bank or Bank Holding Company, 75401 2016-26223

Federal Transit Administration
NOTICES
Pilot Program for Transit-Oriented Development Planning Project Selections, 75478-75480 2016-26208

Fish and Wildlife Service
NOTICES
Endangered and Threatened Wildlife and Plants: 5-Year Status Review of Red Wolf, 75425-75426 2016-26168
Meetings: North American Wetlands Conservation Council; Neotropical Migratory Bird Conservation Advisory Group, 75424-75425 2016-26166

Food and Drug Administration
PROPOSED RULES
Good Laboratory Practice for Nonclinical Laboratory Studies, 75351-75352 2016-26244
Guidance: Describing Hazard that Needs Control in Documents Accompanying Food, as Required by Four Rules Implementing FDA Food Safety Modernization Act, 75349-75351 2016-26245
NOTICES
Guidance:
Labeling for Permanent Hysteroscopically Placed Tubal Implants Intended for Sterilization, 75419-75421 2016-26243
Report on the Performance of Drug and Biologics Firms in Conducting Postmarketing Requirements and Commitments; Availability, 75411-75419 2016-26247

Foreign Assets Control Office
NOTICES
Blocking or Unblocking of Persons and Properties, 75488-75491 2016-26124

Foreign-Trade Zones Board
NOTICES
Production Activities: Benteler Automotive Corp., Foreign-Trade Zone 38, Spartanburg, SC, 75373-75374 2016-26219

Forest Service
NOTICES
Meetings:
Allegheny Resource Advisory Committee, 75371 2016-26165
Forest Resource Coordinating Committee, 75370-75371 2016-26195
National Urban and Community Forestry Advisory Council, 75371-75372 2016-26200

Health and Human Services Department
See Agency for Healthcare Research and Quality
See Centers for Disease Control and Prevention
See Centers for Medicare & Medicaid Services
See Food and Drug Administration
See National Institutes of Health
RULES
Excepted Benefits; Lifetime and Annual Limits; and Short-Term, Limited-Duration Insurance, 75316-75327 2016-26162

Homeland Security Department
See Coast Guard
See U.S. Customs and Border Protection

Indian Affairs Bureau
NOTICES
Indian Gaming:
Approval of Amended Tribal-State Class III Gaming Compact in State of California, 75428 2016-26250
Approval of Amended Tribal-State Class III Gaming Compact in State of South Dakota, 75427 2016-26253
Approval of Amendment to Tribal-State Class III Gaming Compact in State of California, 75427 2016-26251
Approval of Amendment to Tribal-State Class III Gaming Compact in State of Oregon, 75428 2016-26252
Tribal-State Class III Gaming Compact Taking Effect in State of California, 75426-75428 2016-26254 2016-26255 2016-26256

Industry and Security Bureau
NOTICES
Agency Information Collection Activities; Proposals, Submissions, and Approvals: Report of Requests for Restrictive Trade Practice or Boycott, 75374 2016-26157

Interior Department
See Fish and Wildlife Service
See Indian Affairs Bureau
See Land Management Bureau
See National Park Service
See Ocean Energy Management Bureau

Internal Revenue Service
RULES
Excepted Benefits; Lifetime and Annual Limits; and Short-Term, Limited-Duration Insurance, 75316-75327 2016-26162

International Trade Administration
NOTICES
Antidumping or Countervailing Duty Investigations, Orders, or Reviews:
Certain Frozen Warmwater Shrimp from India, 75376-75377 2016-26214
Freshwater Crawfish Tail Meat from the People's Republic of China, 75377-75378 2016-26148
Polyethylene Retail Carrier Bags from Malaysia, 75378-75379 2016-26220
Requests for Applications:
International Buyer Program Calendar Year 2018, 75379-75381 2016-26216
International Buyer Program Select Service for Calendar Year 2018, 75374-75376 2016-26218

Justice Department
NOTICES
Agency Information Collection Activities; Proposals, Submissions, and Approvals: Attorney Student Loan Repayment Program Electronic Forms, 75439 2016-26161

Labor Department
See Employee Benefits Security Administration
See Occupational Safety and Health Administration
NOTICES
Agency Information Collection Activities; Proposals, Submissions, and Approvals:
Derricks Standard, 75441-75442 2016-26121
Safety Standards for Underground Coal Mine Ventilation, etc., 75440-75441 2016-26212
Special Dipping and Coating Operations, 75439-75440 2016-26213

Land Management Bureau
PROPOSED RULES
Public Land Orders: Moab Field Office, Grand County, UT, 75366-75368 2016-26179

Legal Services Corporation
RULES
Procedures for Disclosure of Information under the Freedom of Information Act, 75330-75338 2016-25832

National Aeronautics and Space Administration
RULES
Federal Acquisition Regulation Supplement: Removal of Engineering Change Proposals Clause, 75344-75345 2016-26174
NOTICES
Meetings: NASA Advisory Council Aeronautics Committee, 75448-75449 2016-26145

National Agricultural Statistics Service
NOTICES
Agency Information Collection Activities; Proposals, Submissions, and Approvals, 75373 2016-26153
Requests for Nominations: Advisory Committee on Agriculture Statistics, 75372-75373 2016-26154

National Credit Union Administration
RULES
Financial Institutions Reform, Recovery, and Enforcement Act Appraisal Requirements: Louisiana; Temporary Exceptions in Areas Affected by Severe Storms and Flooding, 75315-75316 2016-26234
NOTICES
Agency Information Collection Activities; Proposals, Submissions, and Approvals, 75449 2016-26169

National Institutes of Health
NOTICES
Government-Owned Inventions; Availability for Licensing, 75421-75423 2016-26129
Meetings:
Center for Scientific Review, 75421 2016-26128
National Cancer Institute, 75423 2016-26130

National Oceanic and Atmospheric Administration
RULES
Fisheries of the Exclusive Economic Zone Off Alaska: Groundfish by Vessels Using Trawl Gear in Gulf of Alaska, 75345-75346 2016-26221
NOTICES
Agency Information Collection Activities; Proposals, Submissions, and Approvals:
Expanded Vessel Monitoring System Requirement in Pacific Coast Groundfish Fishery, 75383 2016-26159
Licensing of Private Remote-Sensing Space Systems, 75383-75384 2016-26156
Natural Resource Damage Assessment Restoration Project Information Sheet, 75384 2016-26160
Papahanaumokuakea Marine National Monument Permit Application and Reports for Permits, 75382-75383 2016-26155
Determinations of Overfishing or Overfished Condition, 75381-75382 2016-26126

National Park Service
NOTICES
Agency Information Collection Activities; Proposals, Submissions, and Approvals, 75428-75429 2016-26151

Navy Department
NOTICES
Agency Information Collection Activities; Proposals, Submissions, and Approvals, 75385-75386 2016-26173 2016-26170

Nuclear Regulatory Commission
NOTICES
Guidance: NuScale Power, LLC, Design-Specific Review Standard and Scope and Safety Review Matrix, 75449-75452 2016-26210
Indemnity Agreements: Duke Energy Florida, LLC; Levy Nuclear Plant Units 1 and 2; Modification Intent, 75452-75453 2016-26207

Occupational Safety and Health Administration
NOTICES
Applications for Expansion of Recognition:
Curtis-Strauss, LLC, 75446-75448 2016-26202
Intertek Testing Services NA, Inc., 75442-75444 2016-26203
TUV Rheinland of North America, Inc., 75444-75446 2016-26204

Ocean Energy Management Bureau
NOTICES
Atlantic Wind Lease Sales: Commercial Leasing for Wind Power on Outer Continental Shelf Offshore New York, 75429-75438 2016-26240
Environmental Assessments; Availability, etc.: Commercial Wind Lease Issuance and Site Assessment Activities on Atlantic Outer Continental Shelf Offshore New York, 75438 2016-26237

Postal Regulatory Commission
NOTICES
New Postal Products, 2016-26140 75453-75454 2016-26215

Railroad Retirement Board
NOTICES
2017 Railroad Experience Rating Proclamations, Monthly Compensation Base, and Other Determinations, 75454-75456 2016-26167

Securities and Exchange Commission
NOTICES
Meetings; Sunshine Act, 75456 2016-26304
Self-Regulatory Organizations; Proposed Rule Changes:
Bats BYX Exchange, Inc., 75456-75458 2016-26131
Bats BZX Exchange, Inc., 75464-75466 2016-26132
Bats EDGA Exchange, Inc., 75466-75468 2016-26133
Bats EDGX Exchange, Inc., 75458-75460 2016-26134
NASDAQ BX, Inc., 75468-75471 2016-26137
NASDAQ PHLX, LLC, 75473-75476 2016-26139
NASDAQ Stock Market, LLC, 75471-75473 2016-26138
Options Price Reporting Authority, 2016-26135 75460-75464 2016-26136

Transportation Department
See Federal Aviation Administration
See Federal Railroad Administration
See Federal Transit Administration
PROPOSED RULES
Refunding Baggage Fees for Delayed Checked Bags, 75347-75349 2016-26199
NOTICES
Exploring Industry Practices on Distribution and Display of Airline Fare, Schedule, and Availability Information, 75481-75487 2016-26191
Meetings: Advisory Committee on Accessible Air Transportation, 75480-75481 2016-26192

Treasury Department
See Bureau of the Fiscal Service
See Comptroller of the Currency
See Foreign Assets Control Office
See Internal Revenue Service
RULES
Qualified Financial Contracts Recordkeeping Related to Orderly Liquidation Authority, 75624-75670 2016-25329

U.S. Customs and Border Protection
NOTICES
Meetings: Commercial Customs Operations Advisory Committee, 75423-75424 2016-26180

Veterans Affairs Department
NOTICES
Agency Information Collection Activities; Proposals, Submissions, and Approvals: Application Requirements to Receive VA Dental Insurance Plan Benefits, 75491-75492 2016-26149

Separate Parts In This Issue

Part II
Education Department, 75494-75622 2016-24856

Part III
Treasury Department, 75624-75670 2016-25329

Reader Aids

Consult the Reader Aids section at the end of this issue for phone numbers, online resources, finding aids, and notice of recently enacted public laws.

To subscribe to the Federal Register Table of Contents electronic mailing list, go to https://public.govdelivery.com/accounts/USGPOOFR/subscriber/new, enter your e-mail address, then follow the instructions to join, leave, or manage your subscription.

Vol. 81, No. 210, Monday, October 31, 2016

Rules and Regulations

DEPARTMENT OF THE TREASURY
Office of the Comptroller of the Currency
12 CFR Part 34
[Docket No. OCC-2016-0030]

FEDERAL RESERVE SYSTEM
12 CFR Part 225
[Docket No. R-1551]
RIN 7100 AE-62

FEDERAL DEPOSIT INSURANCE CORPORATION
12 CFR Part 323

NATIONAL CREDIT UNION ADMINISTRATION
12 CFR Part 722

Temporary Exceptions to FIRREA Appraisal Requirements in Areas Affected by Severe Storms and Flooding in Louisiana

AGENCY:

Office of the Comptroller of the Currency, Treasury (OCC); Board of Governors of the Federal Reserve System (Board); Federal Deposit Insurance Corporation (FDIC); and National Credit Union Administration (NCUA), collectively referred to as the Agencies.

ACTION:

Statement and order; temporary exceptions.

SUMMARY:

Section 2 of the Depository Institutions Disaster Relief Act of 1992 (DIDRA) authorizes the Agencies to make exceptions to statutory and regulatory appraisal requirements under Title XI of the Financial Institutions Reform, Recovery, and Enforcement Act of 1989 (FIRREA). The exceptions are available for transactions involving real property located within an area declared to be a major disaster area by the President if the Agencies determine, and describe by publication of a regulation or order, that the exceptions would facilitate recovery from the disaster and would be consistent with safety and soundness. In this statement and order, the Agencies exercise their authority to grant temporary exceptions to the FIRREA appraisal requirements for real estate related transactions, provided certain criteria are met, in the Louisiana parishes declared a major disaster area by President Obama on August 14, 2016, as a result of the severe storms and flooding in Louisiana. The expiration date for the exceptions is December 31, 2017.

DATES:

This order is effective on October 31, 2016 and expires for specific areas on December 31, 2017.

FOR FURTHER INFORMATION CONTACT:

OCC: Robert Parson, Senior Appraisal Policy Advisor, Chief National Bank Examiner's Office, at (202) 649-6423; Kevin Lawton, Appraisal Specialist, Chief National Bank Examiner's Office, at (202) 649-7152; Christopher Manthey, Special Counsel, Chief Counsel's Office, at (202) 649-6203; or Mitchell Plave, Special Counsel, Chief Counsel's Office, at (202) 649-6285 or, for persons who are deaf or hard of hearing, TTY (202) 649-5597.

Board: Carmen D. Holly, Senior Supervisory Financial Analyst, Division of Banking Supervision and Regulation at 202-973-6122; Gillian Burgess, Counsel, Legal Division, at (202) 736-5564.

FDIC: Beverlea S. Gardner, Senior Examination Specialist, Division of Risk Management and Supervision, at (202) 898-3640; Benjamin K. Gibbs, Counsel, Legal Division, at (202) 898-6726; or Kimberly Stock, Counsel, Legal Division, at (202) 898-3815, Federal Deposit Insurance Corporation, 550 17th Street NW., Washington, DC 20429.

NCUA: D. Scott Neat, Director of Supervision, Office of Examination and Insurance, at (703) 518-6363; John Brolin, Staff Attorney, Office of General Counsel, at (703) 518-6438, National Credit Union Administration, 1775 Duke Street, Alexandria, VA 22314.

SUPPLEMENTARY INFORMATION:

Statement

Section 2 of DIDRA, which added section 1123 to Title XI of FIRREA,1 authorizes the Agencies to make exceptions to statutory and regulatory appraisal requirements for certain transactions. These exceptions are available for transactions involving real property located in areas in which the President has determined a major disaster exists, pursuant to 42 U.S.C. 5170, provided that the exception would facilitate recovery from the major disaster and is consistent with safety and soundness.

1 12 U.S.C. 3352.

On August 14, 2016, the President declared that 22 parishes in Louisiana were in a major disaster area (Major Disaster Area) due to extensive damage that occurred as a result of severe storms and subsequent flooding.2 The Agencies believe that granting relief from the appraisal requirements set forth in Title XI of FIRREA for real estate transactions in the Major Disaster Area is consistent with the provisions of DIDRA.

2 Press Release, The White House (Aug. 14, 2016), available at https://www.whitehouse.gov/the-press-office/2016/08/14/president-obama-signs-louisiana-disaster-declaration.

Facilitation of Recovery From the Storms and Flooding Declared as Major Disaster

The Agencies have determined that the disruption of real estate markets in the Major Disaster Area interferes with the ability of depository institutions to obtain appraisals that comply with all statutory and regulatory requirements. Further, the Agencies have determined that the disruption may impede institutions in making loans and engaging in other transactions that would aid in the reconstruction and rehabilitation of the affected area. Accordingly, the Agencies have determined that recovery from this major disaster would be facilitated by exempting certain transactions involving real estate located in the area directly affected by the severe storms and flooding from the real estate appraisal requirements of Title XI of FIRREA and its implementing regulations.3

3 12 U.S.C. 3331-3355; 12 CFR 34.41-34.47 (OCC); 12 CFR part 225, subpart G (Board); 12 CFR part 323, subpart A (FDIC); 12 CFR part 722 (NCUA).

Consistency With Safety and Soundness

The Agencies also have determined that the exceptions are consistent with safety and soundness, provided that the depository institution determines and maintains appropriate documentation of the following: (1) The transaction involves real property located in the Major Disaster Area; (2) there is a binding commitment to fund the transaction that was entered into on or after August 14, 2016, but no later than December 31, 2017; and (3) the value of the real property supports the institution's decision to enter into the transaction. In addition, the transaction must continue to be subject to review by management and by the Agencies in the course of examinations of the institution.

Expiration Date

Exceptions made under section 1123 of FIRREA may be provided for no more than three years after the President determines that a major disaster exists in the area.4 The Agencies have determined that the exceptions provided for by this order shall expire on December 31, 2017.

4 12 U.S.C. 3352(b).

Order

In accordance with section 2 of DIDRA, relief is hereby granted from the provisions of Title XI of FIRREA and the Agencies' appraisal regulations for any real estate-related financial transaction that requires the services of an appraiser under those provisions, provided that the institution determines, and maintains documentation made available to the Agencies upon request, of the following:

(1) The transaction involves real property located in one of the 22 parishes declared a major disaster area as a result of severe storms and flooding in Louisiana by the President on August 14, 2016 (identified in the Appendix);

(2) There is a binding commitment to fund a transaction that was entered into on or after August 14, 2016, but no later than December 31, 2017; and

(3) The value of the real property supports the institution's decision to enter into the transaction.

Appendix (Major Disaster Area)

Designated Parishes: Acadia, Ascension, Avoyelles, East Baton Rouge, East Feliciana, Evangeline, Iberia, Iberville, Jefferson Davis, Lafayette, Livingston, Pointe Coupee, St. Helena, St. James, St. Landry, St. Martin, St. Tammany, Tangipahoa, Vermilion, Washington, West Baton Rouge and West Feliciana.

Dated: October 19, 2016.
Thomas J. Curry,
Comptroller of the Currency.

By order of the Board of Governors of the Federal Reserve System, October 21, 2016.
Margaret McCloskey Shanks,
Deputy Secretary of the Board.

Dated at Washington, DC, October 19, 2016.

By order of the Board of Directors.

Federal Deposit Insurance Corporation.
Robert E. Feldman,
Executive Secretary.

Dated at Alexandria, VA, October 27, 2016.

By order of the Board of Directors.

National Credit Union Administration.
Gerard Poliquin,
Secretary of the Board.
[FR Doc. 2016-26234 Filed 10-28-16; 8:45 am] BILLING CODE 6210-01-P
DEPARTMENT OF THE TREASURY
Internal Revenue Service
26 CFR Part 54
[TD 9791]
RIN 1545-BN44

DEPARTMENT OF LABOR
Employee Benefits Security Administration
29 CFR Part 2590
RIN 1210-AB75

DEPARTMENT OF HEALTH AND HUMAN SERVICES
45 CFR Parts 144, 146, 147, and 148
[CMS-9932-F]
RIN 0938-AS93

Excepted Benefits; Lifetime and Annual Limits; and Short-Term, Limited-Duration Insurance

AGENCY:

Internal Revenue Service, Department of the Treasury; Employee Benefits Security Administration, Department of Labor; Centers for Medicare & Medicaid Services, Department of Health and Human Services.

ACTION:

Final rules.

SUMMARY:

This document contains final regulations regarding the definition of short-term, limited-duration insurance for purposes of the exclusion from the definition of individual health insurance coverage, and standards for travel insurance and supplemental health insurance coverage to be considered excepted benefits. This document also amends a reference in the final regulations relating to the prohibition on lifetime and annual dollar limits.

DATES:

Effective date. These final regulations are effective on December 30, 2016.

Applicability date. These final regulations apply to group health plans and health insurance issuers beginning on the first day of the first plan year (or, in the individual market, the first day of the first policy year) beginning on or after January 1, 2017.

FOR FURTHER INFORMATION CONTACT:

Elizabeth Schumacher or Matthew Litton of the Department of Labor, at 202-693-8335; Karen Levin, Internal Revenue Service, Department of the Treasury, at (202) 317-5500; or David Mlawsky or Cam Clemmons, Centers for Medicare & Medicaid Services, Department of Health and Human Services, at 410-786-1565.

Customer Service Information: Individuals interested in obtaining information from the Department of Labor concerning employment-based health coverage laws may call the Employee Benefits Security Administration (EBSA) Toll-Free Hotline, at 1-866-444-EBSA (3272) or visit the Department of Labor's Web site (http://www.dol.gov/ebsa). In addition, information from the Department of Health and Human Services (HHS) on private health insurance for consumers can be found on the Centers for Medicare & Medicaid Services (CMS) Web site (www.cms.gov/cciio) and information on health reform can be found at www.HealthCare.gov.

SUPPLEMENTARY INFORMATION:

I. Background

The Health Insurance Portability and Accountability Act of 1996 (HIPAA), Public Law 104-191 (110 Stat. 1936), added title XXVII of the Public Health Service Act (PHS Act), part 7 of the Employee Retirement Income Security Act of 1974 (ERISA), and Chapter 100 of the Internal Revenue Code (the Code), providing portability and nondiscrimination rules with respect to health coverage. These provisions of the PHS Act, ERISA, and the Code were later augmented by other consumer protection laws, including the Mental Health Parity Act of 1996,1 the Paul Wellstone and Pete Domenici Mental Health Parity and Addiction Equity Act of 2008,2 the Newborns' and Mothers' Health Protection Act,3 the Women's Health and Cancer Rights Act,4 the Genetic Information Nondiscrimination Act of 2008,5 the Children's Health Insurance Program Reauthorization Act of 2009,6 Michelle's Law,7 and the Patient Protection and Affordable Care Act, as amended by the Health Care and Education Reconciliation Act of 2010 (Affordable Care Act).8

1 Public Law 104-204, 110 Stat. 2944 (September 26, 1996).

2 Public Law 110-343, 122 Stat. 3881 (October 3, 2008).

3 Public Law 104-204, 110 Stat. 2935 (September 26, 1996).

4 Public Law 105-277, 112 Stat. 2681-436 (October 21, 1998).

5 Public Law 110-233, 122 Stat. 881 (May 21, 2008).

6 Public Law 111-3, 123 Stat. 65 (February 4, 2009).

7 Public Law 110-381, 122 Stat. 4081 (October 9, 2008).

8 The Patient Protection and Affordable Care Act, Public Law 111-148, was enacted on March 23, 2010, and the Health Care and Education Reconciliation Act of 2010, Public Law 111-152, was enacted on March 30, 2010. (These statutes are collectively known as the “Affordable Care Act”.)

The Affordable Care Act reorganizes, amends, and adds to the provisions of part A of title XXVII of the PHS Act relating to group health plans and health insurance issuers in the group and individual markets. For this purpose, the term “group health plan” includes both insured and self-insured group health plans.9 The Affordable Care Act added section 715(a)(1) of ERISA and section 9815(a)(1) of the Code to incorporate the provisions of part A of title XXVII of the PHS Act (generally, sections 2701 through 2728 of the PHS Act) into ERISA and the Code to make them applicable to group health plans and health insurance issuers providing health insurance coverage in connection with group health plans.

9 The term “group health plan” is used in title XXVII of the PHS Act, part 7 of ERISA, and Chapter 100 of the Code, and is distinct from the term “health plan,” as used in other provisions of title I of the Affordable Care Act. The term “health plan” as used in other provisions of title I of the Affordable Care Act does not include self-insured group health plans.

II. Overview of the Final Regulations

On June 10, 2016, the Departments of Labor, Health and Human Services, and the Treasury (the Departments 10) issued proposed regulations with respect to expatriate health plans, expatriate health plan issuers, and qualified expatriates; requirements for travel insurance, similar supplemental coverage, and hospital indemnity or other fixed indemnity insurance to be excepted benefits; the prohibition on lifetime and annual limits; and short-term, limited-duration insurance.11 After consideration of comments on the proposed regulations, the Departments are publishing final regulations regarding short-term, limited-duration insurance, travel insurance, similar supplemental coverage, and lifetime and annual limits. The Departments intend to address hospital indemnity or other fixed indemnity insurance and expatriate health plans in future rulemaking, taking into account comments received on these issues.12

10 Note, however, that in sections under headings listing only two of the three Departments, the term “Departments” generally refers only to the two Departments listed in the heading.

11 81 FR 38019 (June 10, 2016).

12 The preamble to the proposed regulations also invited public comment on insurance coverage of specified diseases or illnesses as excepted benefits. While not addressed in this rulemaking, the Departments may address this issue in future regulations or guidance.

On July 20, 2015, the Internal Revenue Service published Notice 2015-43, 2015-29 IRB 73, to provide interim guidance with respect to the treatment of expatriate health plans, expatriate health plan issuers, and employers in their capacity as plan sponsors of expatriate health plans, as defined in the Expatriate Health Coverage Clarification Act of 2014 (EHCCA).13 The interim guidance in Notice 2015-43 generally allows a taxpayer to apply the requirements of the EHCCA using a reasonable good faith interpretation of the EHCCA until further guidance is issued, except as otherwise specifically provided with respect to the health insurance providers fee under section 9010 of the Affordable Care Act. Notice 2015-29 provided interim guidance pertaining to the fee under section 9010 for calendar years 2014 and 2015, and Notice 2016-14 provided guidance pertaining to the fee for calendar year 2016. Additionally, the preamble to the Departments' proposed regulations provides that issuers, employers, administrators, and individuals are permitted to rely on the proposed regulations pending the applicability date of final regulations in the Federal Register.14 Until final regulations are issued and effective, this reliance rule as well as the interim guidance in Notice 2015-43 remain in effect.

13 Division M of the Consolidated and Further Continuing Appropriations Act, 2015, Public Law 113-235.

14 81 FR 38019, 38033 (June 10, 2016).

A. Short-Term, Limited-Duration Insurance

Short-term, limited-duration insurance is a type of health insurance coverage that is designed to fill temporary gaps in coverage when an individual is transitioning from one plan or coverage to another plan or coverage. Although short-term, limited-duration insurance is not an excepted benefit, it is similarly exempt from PHS Act requirements because it is not individual health insurance coverage. Section 2791(b)(5) of the PHS Act provides that the term “individual health insurance coverage” means health insurance coverage offered to individuals in the individual market, but does not include short-term, limited-duration insurance. The PHS Act does not define short-term, limited-duration insurance. Under current regulations, short-term, limited-duration insurance means “health insurance coverage provided pursuant to a contract with an issuer that has an expiration date specified in the contract (taking into account any extensions that may be elected by the policyholder without the issuer's consent) that is less than 12 months after the original effective date of the contract.” 15

15 26 CFR 54.9801-2, 29 CFR 2590.701-2, 45 CFR 144.103.

Before enactment of the Affordable Care Act, short-term, limited-duration insurance was an important means for individuals to obtain health coverage when transitioning from one job to another (and from one group health plan to another) or when faced with other similar situations. However, with guaranteed availability of coverage and special enrollment period requirements in the individual health insurance market under the Affordable Care Act, individuals can purchase coverage with the protections of the Affordable Care Act to fill in the gaps in coverage.

The Departments have become aware that short-term, limited-duration insurance is being sold in situations other than those that the exception from the definition of individual health insurance coverage was initially intended to address.16 In some instances, individuals are purchasing this coverage as their primary form of health coverage and, contrary to the intent of the 12-month coverage limitation in the current definition of short-term, limited-duration insurance, some issuers are providing renewals of the coverage that extend the duration beyond 12 months. Because short-term, limited-duration insurance is exempt from certain consumer protections, the Departments are concerned that these policies may have significant limitations, such as lifetime and annual dollar limits on essential health benefits (EHB) and pre-existing condition exclusions, and therefore may not provide meaningful health coverage. Further, because these policies can be medically underwritten based on health status, healthier individuals may be targeted for this type of coverage, thus adversely impacting the risk pool for Affordable Care Act-compliant coverage.

16 See, e.g., Mathews, Anna W., “Sales of Short-Term Health Policies Surge,” The Wall Street Journal, April 10, 2016, available at http://www.wsj.com/articles/sales-of-short-term-health-policies-surge-1460328539.

To address the issue of short-term, limited-duration insurance being sold as a type of primary coverage, the Departments proposed regulations to revise the definition of short-term, limited-duration insurance so that the coverage must be less than three months in duration, including any period for which the policy may be renewed. The proposed regulations also included a requirement that a notice must be prominently displayed in the contract and in any application materials provided in connection with enrollment in such coverage with the following language: THIS IS NOT QUALIFYING HEALTH COVERAGE (“MINIMUM ESSENTIAL COVERAGE”) THAT SATISFIES THE HEALTH COVERAGE REQUIREMENT OF THE AFFORDABLE CARE ACT. IF YOU DON'T HAVE MINIMUM ESSENTIAL COVERAGE, YOU MAY OWE AN ADDITIONAL PAYMENT WITH YOUR TAXES.

In addition to proposing to reduce the length of short-term, limited-duration insurance to less than three months, the proposed regulations modified the permitted coverage period to take into account extensions made by the policyholder “with or without the issuer's consent.” This modification was intended to address the Departments' concern that some issuers are taking liberties with the current definition of short-term, limited-duration insurance, either by automatically renewing such policies or by using a simplified reapplication process, with the result that such coverage, which does not contain the important protections of the Affordable Care Act, lasts longer than 12 months and serves as an individual's primary health coverage.

The Departments received a number of comments relating to the treatment of short-term, limited-duration insurance. Several commenters supported the proposed rules and the reasoning behind them, noting that short-term, limited-duration insurance is not subject to the same consumer protections as major medical coverage and can discriminate based on health status by recruiting healthier consumers to the exclusion of sicker consumers. These commenters suggested the proposed rules would limit the number of consumers relying on short-term, limited-duration insurance as their primary form of coverage and improve the Affordable Care Act's single risk pool.

Some commenters requested that the Departments go further and prohibit issuers from offering short-term, limited-duration insurance to consumers who have previously purchased this type of coverage to prevent consumers from stringing together coverage under policies offered by the same or different issuers. However, in the Departments' view, such a restriction is not warranted. The individual shared responsibility provision of the Code,17 which generally requires individuals to obtain minimum essential coverage in order to avoid an additional payment with their taxes, provides sufficient incentive to discourage consumers from purchasing multiple successive short-term, limited-duration insurance policies. The added notice requirement ensures that individuals purchasing such policies are aware of the individual shared responsibility requirement and its potential implications. Furthermore, such a prohibition would be difficult for State regulators to enforce, since prior coverage of a consumer would have to be tracked.

17 See Code section 5000A.

Other commenters expressed general opposition to the proposed rules or requested that short-term, limited-duration insurance be allowed to provide coverage for a longer period. Several commenters stated that some individuals who lose their employer-sponsored coverage may not be able to obtain COBRA continuation coverage 18 and that a job search can often take longer than three months. One commenter suggested alignment of short-term, limited-duration insurance with the employer waiting period rules by permitting a coverage period of up to four months.19 Another commenter asked that issuers be allowed to renew coverage beyond the three-month period in certain situations, such as when an individual experiences a triggering event for a special enrollment period.20 The Departments decline to adopt these suggestions. Short-term, limited-duration insurance allows for coverage to fill temporary coverage gaps when an individual transitions between sources of primary coverage. As explained above, for longer gaps in coverage, guaranteed availability of coverage and special enrollment period requirements in the individual health insurance market under the Affordable Care Act ensure that individuals can purchase individual market coverage through or outside of the Exchange that is minimum essential coverage and includes the consumer protections of the Affordable Care Act. Further, limiting the coverage of short-term, limited-duration insurance to less than three months is consistent with the exemption from the individual shared responsibility provision for gaps in coverage of less than three months (the short coverage gap exemption).21 Under current law, an individual who is not enrolled in minimum essential coverage (whether enrolled in short-term, limited-duration coverage or otherwise) for a period of three months or more generally cannot claim the short coverage gap exemption for any of those months. The final regulations help ensure that individuals who purchase a short-term, limited-duration insurance policy will be eligible for the short coverage gap exemption (assuming other requirements are met) during the temporary coverage period.

18 COBRA continuation coverage means coverage that satisfies an applicable COBRA continuation provision. These provisions are sections 601-608 of ERISA, section 4980B of the Code (other than paragraph (f)(1) of such section 4980B insofar as it relates to pediatric vaccines), or Title XXII of the PHS Act.

19 See 26 CFR 54.9815-2708; 29 CFR 2590.715-2708; 45 CFR 147.116.

20 See 26 CFR 54.9801-6; 29 CFR 2590.701-6; 45 CFR 146.117 and 147.104.

21 26 CFR 1.5000A-3(j).

After consideration of the comments and feedback received from stakeholders, the Departments are finalizing the proposed regulations without change.

The revised definition of short-term, limited-duration insurance applies for policy years beginning on or after January 1, 2017. The Departments recognize, however, that State regulators may have approved short-term, limited-duration insurance products for sale in 2017 that met the definition in effect prior to January 1, 2017. Accordingly, the Department of Health and Human Services (HHS) will not take enforcement action against an issuer with respect to the issuer's sale of a short-term, limited-duration insurance product before April 1, 2017 on the ground that the coverage period is three months or more, provided that the coverage ends on or before December 31, 2017 and otherwise complies with the definition of short-term, limited-duration insurance in effect under the regulations.22 States may also elect not to take enforcement actions against issuers with respect to such coverage sold before April 1, 2017.

22 This non-enforcement policy is limited to the requirement that short-term, limited-duration insurance must be less than three months. It does not relieve issuers of short-term, limited-duration insurance of the notice requirement, which applies for policy years beginning on or after January 1, 2017.

B. Excepted Benefits

Sections 2722 and 2763 of the PHS Act, section 732 of ERISA, and section 9831 of the Code provide that the respective requirements of title XXVII of the PHS Act, part 7 of ERISA, and Chapter 100 of the Code generally do not apply to the provision of certain types of benefits, known as “excepted benefits.” Excepted benefits are described in section 2791(c) of the PHS Act, section 733(c) of ERISA, and section 9832(c) of the Code.

The parallel statutory provisions establish four categories of excepted benefits. The first category, under section 2791(c)(1) of the PHS Act, section 733(c)(1) of ERISA, and section 9832(c)(1) of the Code, includes benefits that are generally not health coverage (such as automobile insurance, liability insurance, workers' compensation, and accidental death and dismemberment coverage). The benefits in this category are excepted in all circumstances. In contrast, the benefits in the second, third, and fourth categories are types of health coverage that are excepted only if certain conditions are met.

The second category of excepted benefits is limited excepted benefits, which may include limited scope vision or dental benefits, and benefits for long-term care, nursing home care, home health care, or community-based care. Section 2791(c)(2)(C) of the PHS Act, section 733(c)(2)(C) of ERISA, and section 9832(c)(2)(C) of the Code authorize the Secretaries of HHS, Labor, and the Treasury (collectively, the Secretaries) to issue regulations establishing other, similar limited benefits as excepted benefits. The Secretaries exercised this authority previously with respect to certain health flexible spending arrangements.23 To be excepted under this second category, the benefits must either: (1) Be provided under a separate policy, certificate, or contract of insurance; or (2) otherwise not be an integral part of a group health plan, whether insured or self-insured.24

23 26 CFR 54.9831-1(c)(3)(v), 29 CFR 2590.732(c)(3)(v), 45 CFR 146.145(b)(3)(v).

24 PHS Act section 2722(c)(1), ERISA section 732(c)(1), Code section 9831(c)(1).

The third category of excepted benefits, referred to as “noncoordinated excepted benefits,” includes both coverage for only a specified disease or illness (such as cancer-only policies), and hospital indemnity or other fixed indemnity insurance. These benefits are excepted under section 2722(c)(2) of the PHS Act, section 732(c)(2) of ERISA, and section 9831(c)(2) of the Code only if all of the following conditions are met: (1) The benefits are provided under a separate policy, certificate, or contract of insurance; (2) there is no coordination between the provision of such benefits and any exclusion of benefits under any group health plan maintained by the same plan sponsor; and (3) the benefits are paid with respect to any event without regard to whether benefits are provided under any group health plan maintained by the same plan sponsor.

The fourth category, under section 2791(c)(4) of the PHS Act, section 733(c)(4) of ERISA, and section 9832(c)(4) of the Code, is supplemental excepted benefits. These benefits are excepted only if they are provided under a separate policy, certificate, or contract of insurance and are Medicare supplemental health insurance (also known as Medigap), TRICARE supplemental programs, or “similar supplemental coverage provided to coverage under a group health plan.” The phrase “similar supplemental coverage provided to coverage under a group health plan” is not defined in the statute or regulations. However, the Departments issued regulations clarifying that one requirement to be similar supplemental coverage is that the coverage “must be specifically designed to fill gaps in primary coverage, such as coinsurance or deductibles.” 25

25 26 CFR 54.9831-1(c)(5)(i)(C), 29 CFR 2590.732(c)(5)(i)(C), and 45 CFR 146.145(b)(5)(i)(C).

In 2007 and 2008, the Departments issued guidance on the circumstances under which supplemental health insurance would be considered excepted benefits under section 2791(c)(4) of the PHS Act (and the parallel provisions of ERISA and the Code).26 The guidance identifies several factors the Departments will apply when evaluating whether supplemental health insurance will be considered to be “similar supplemental coverage provided to coverage under a group health plan.” The guidance provides a safe harbor under which supplemental health insurance will be considered an excepted benefit if it is provided through a policy, certificate, or contract of insurance separate from the primary coverage under the plan and meets all of the following requirements: (1) The supplemental policy, certificate, or contract of insurance is issued by an entity that does not provide the primary coverage under the plan; (2) the supplemental policy, certificate, or contract of insurance is specifically designed to fill gaps in primary coverage, such as coinsurance or deductibles, but does not become secondary or supplemental only under a coordination of benefits provision; (3) the cost of the supplemental coverage is 15 percent or less of the cost of primary coverage (determined in the same manner as the applicable premium is calculated under a COBRA continuation provision); and (4) the supplemental coverage sold in the group health insurance market does not differentiate among individuals in eligibility, benefits, or premiums based upon any health factor of the individual (or any dependents of the individual).

26 See EBSA Field Assistance Bulletin No. 2007-04 (available at http://www.dol.gov/ebsa/regs/fab2007-4.html); CMS Insurance Standards Bulletin 08-01 (available at http://www.cms.gov/CCIIO/Resources/Files/Downloads/hipaa_08_01_508.pdf); and IRS Notice 2008-23 (available at http://www.irs.gov/irb/2008-07_IRB/ar09.html).
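As a rough illustration of condition (3) of the safe harbor above, the test reduces to a simple premium comparison. The figures below are hypothetical and the check is only a sketch; the actual COBRA applicable-premium calculation is more involved.

```python
def meets_cost_condition(supplemental_premium: float, primary_applicable_premium: float) -> bool:
    """Condition (3): the cost of the supplemental coverage must be 15 percent
    or less of the cost of the primary coverage, measured the same way as the
    applicable premium under a COBRA continuation provision."""
    return supplemental_premium <= 0.15 * primary_applicable_premium

# Hypothetical monthly figures: with a $500 applicable premium for primary coverage,
# supplemental coverage costing up to $75 per month satisfies the 15 percent test.
print(meets_cost_condition(75.0, 500.0))  # True
print(meets_cost_condition(90.0, 500.0))  # False
```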

On February 13, 2015, the Departments issued Affordable Care Act Implementation FAQs Part XXIII, providing additional guidance on the circumstances under which health insurance coverage that supplements group health plan coverage may be considered supplemental excepted benefits.27 The FAQ states that the Departments intend to propose regulations clarifying the circumstances under which supplemental insurance products that do not fill in cost-sharing gaps under the primary plan are considered to be specifically designed to fill gaps in primary coverage. Specifically, the FAQ provides that health insurance coverage that supplements group health coverage by providing coverage of additional categories of benefits (as opposed to filling in cost-sharing gaps under the primary plan) would be considered to be designed to “fill in the gaps” of the primary coverage only if the benefits covered by the supplemental insurance product are not EHB, as defined under section 1302(b) of the Affordable Care Act, in the State in which the product is being marketed. The FAQ further states that, until regulations are issued and effective, the Departments will not take enforcement action against an issuer of group or individual market coverage that otherwise meets the conditions to be supplemental excepted benefits that does not fill cost-sharing gaps in the group health plan and only provides coverage of additional categories of benefits that are not covered by the group health plan and are not EHB in the applicable State. States were encouraged to exercise similar enforcement discretion.

27 Frequently Asked Questions about Affordable Care Act Implementation (Part XXIII), available at http://www.dol.gov/ebsa/pdf/faq-AffordableCareAct23.pdf and https://www.cms.gov/CCIIO/Resources/Fact-Sheets-and-FAQs/Downloads/Supplmental-FAQ_2-13-15-final.pdf.

1. Similar Supplemental Coverage

The proposed regulations incorporated guidance from the Affordable Care Act Implementation FAQs Part XXIII addressing supplemental health insurance products that provide categories of benefits in addition to those in the primary coverage. Under the proposed regulations, if group or individual supplemental health insurance covers items and services not included in the primary coverage (referred to as providing “additional categories of benefits”), the coverage will be considered to be designed “to fill gaps in primary coverage,” for purposes of being supplemental excepted benefits if none of the benefits provided by the supplemental policy are an EHB, as defined under section 1302(b) of the Affordable Care Act, in the State in which the coverage is issued.28 Thus, if any benefit provided by the supplemental policy is either included in the primary coverage or is an EHB in the State where the coverage is issued, the insurance coverage would not be supplemental excepted benefits under the proposed regulations. Furthermore, supplemental health insurance products that both fill in cost sharing in the primary coverage, such as coinsurance or deductibles, and cover additional categories of benefits that are not EHB, would be considered supplemental excepted benefits under the proposed regulations provided all other criteria are met.

28 For this purpose, a supplemental plan would determine what benefits are EHB based on the EHB-benchmark plan applicable in the State, along with any additional benefits that are considered EHB consistent with 45 CFR 155.170(a)(2).

The Departments received several comments in support of the proposed regulations. One commenter expressed support but requested that the Departments provide additional examples in the regulations. Another commenter requested clarification regarding the application of the standards for similar supplemental coverage that provides benefits outside of the United States, noting that no State's EHB rules require coverage for services outside of the United States. In response, the Departments clarify that if any benefit provided by the supplemental policy is a type of service that is an EHB in the State where the coverage is issued, the coverage would not be supplemental excepted benefits under the final regulations, even if the supplemental coverage is limited to covering the benefit in a location or setting where it would not be covered as an EHB.

After consideration of the comments, the Departments are finalizing the proposed regulations on similar supplemental coverage without substantive change. For purposes of consistency and clarity, HHS is also including a cross reference in the individual market excepted benefits regulations at 45 CFR 148.220 to reflect the standard for similar supplemental coverage under the group market regulations at 45 CFR 146.145(b)(5)(i)(C). The Departments may provide additional guidance on similar supplemental coverage that meets the criteria to be excepted benefits in the future.

2. Travel Insurance

The Departments are aware that certain travel insurance products may include limited health benefits. However, these products typically are not designed as major medical coverage. Instead, the risks being insured relate primarily to: (1) The interruption or cancellation of a trip; (2) the loss of baggage or personal effects; (3) damages to accommodations or rental vehicles; or (4) sickness, accident, disability, or death occurring during travel, with any health benefits usually incidental to other coverage.

Section 2791(c)(1)(H) of the PHS Act, section 733(c)(1)(H) of ERISA, and section 9832(c)(1)(H) of the Code provide that the Departments may, in regulations, designate as excepted benefits “benefits for medical care [that] are secondary or incidental to other insurance benefits.” Pursuant to this authority, and to clarify which types of travel-related insurance products are excepted benefits under the PHS Act, ERISA, and the Code, the Departments' proposed regulations identified travel insurance as an excepted benefit under the first category of excepted benefits and proposed a definition of travel insurance consistent with the definition of travel insurance under final regulations issued by the Treasury Department and the IRS for the health insurance providers fee imposed by section 9010 of the Affordable Care Act,29 which uses a modified version of the National Association of Insurance Commissioners definition of travel insurance.

29 26 CFR 57.2(h)(4).

The proposed regulations defined the term “travel insurance” as insurance coverage for personal risks incident to planned travel, which may include, but are not limited to, interruption or cancellation of a trip or event, loss of baggage or personal effects, damages to accommodations or rental vehicles, and sickness, accident, disability, or death occurring during travel, provided that the health benefits are not offered on a stand-alone basis and are incidental to other coverage. For this purpose, travel insurance does not include major medical plans that provide comprehensive medical protection for travelers with trips lasting six months or longer, including, for example, those working overseas as an expatriate or military personnel being deployed.

The Departments received a number of comments in favor of the treatment of travel insurance as an excepted benefit, as well as the proposed definition of travel insurance. Several comments expressed support for the proposed definition's consistency with regulations governing the health insurance providers fee. One commenter requested clarification that the requirement that health benefits are incidental to other coverage be determined based solely on coverage under the travel insurance policy, without regard to other coverage provided by an employer or plan sponsor; the Departments agree that this is correct. The Departments are finalizing without change the proposed regulations defining travel insurance and treating such coverage as an excepted benefit.

C. Definition of EHB for Purposes of the Prohibition on Lifetime and Annual Limits

Section 2711 of the PHS Act, as added by the Affordable Care Act, generally prohibits group health plans and health insurance issuers offering group or individual health insurance coverage from imposing lifetime and annual dollar limits on EHB, as defined under section 1302(b) of the Affordable Care Act. These prohibitions apply to both grandfathered and non-grandfathered health plans, except the annual limits prohibition does not apply to grandfathered individual health insurance coverage.

Under the Affordable Care Act, self-insured group health plans, large group market health plans, and grandfathered health plans are not required to offer EHB, but they generally cannot place lifetime or annual dollar limits on services they cover that are considered EHB. On November 18, 2015, the Departments issued final regulations implementing section 2711 of the PHS Act.30 The final regulations provide that, for plan years (in the individual market, policy years) beginning on or after January 1, 2017, a plan or issuer that is not required to provide EHB must define EHB, for purposes of the prohibition on lifetime and annual dollar limits, in a manner consistent with any of the 51 EHB base-benchmark plans applicable in a State or the District of Columbia, or one of the three Federal Employees Health Benefits Program (FEHBP) EHB base-benchmark plans, as specified under 45 CFR 156.100.31

30 80 FR 72192.

31 26 CFR 54.9815-2711(c), 29 CFR 2590.715-2711(c), 45 CFR 147.126(c).

The final regulations under section 2711 of the PHS Act include a reference to selecting a “base-benchmark” plan, as specified under 45 CFR 156.100, for purposes of determining which benefits cannot be subject to lifetime or annual dollar limits. The base-benchmark plan selected by a State or applied by default under 45 CFR 156.100, however, may not reflect the complete definition of EHB in the applicable State. For that reason, the Departments are amending the regulations at 26 CFR 54.9815-2711(c), 29 CFR 2590.715-2711(c), and 45 CFR 147.126(c) to refer to the provisions that capture the complete definition of EHB in a State.

Specifically, in these final regulations, the Departments replace the phrase “in a manner consistent with one of the three Federal Employees Health Benefit Program (FEHBP) options as defined by 45 CFR 156.100(a)(3) or one of the base-benchmark plans selected by a State or applied by default pursuant to 45 CFR 156.100” in each of the regulations with the following: “in a manner that is consistent with (1) one of the EHB-benchmark plans applicable in a State under 45 CFR 156.110, and includes coverage of any additional required benefits that are considered EHB consistent with 45 CFR 155.170(a)(2); or (2) one of the three Federal Employees Health Benefit Program (FEHBP) plan options as defined by 45 CFR 156.100(a)(3), supplemented, as necessary, to meet the standards in 45 CFR 156.110.” This change reflects the possibility that base-benchmark plans, including the FEHBP plan options, could require supplementation under 45 CFR 156.110, and ensures the inclusion of State-required benefit mandates enacted on or before December 31, 2011 in accordance with 45 CFR 155.170, which when coupled with a State's EHB-benchmark plan, establish the definition of EHB in that State under regulations implementing section 1302(b) of the Affordable Care Act.32

32 In the HHS Notice of Benefit and Payment Parameters for 2016 published February 27, 2015 (80 FR 10750), HHS instructed States to select a new base-benchmark plan to take effect beginning with plan or policy years beginning in 2017. The new final EHB base-benchmark plans selected as a result of this process are publicly available at downloads.cms.gov/cciio/Final%20List%20of%20BMPs_15_10_21.pdf. Additional information about the new base-benchmark plans, including plan documents and summaries of benefits, is available at www.cms.gov/CCIIO/Resources/Data-Resources/ehb.html. The definition of EHB in each of the 50 states and the District of Columbia is based on the base-benchmark plan, and takes into account any additions to the base-benchmark plan, such as supplementation under 45 CFR 156.110, and State-required benefit mandates in accordance with 45 CFR 155.170.

Some commenters requested clarification that self-insured group health plans, large group market health plans and grandfathered plans are not required to include as covered benefits any specific items and services covered by the State-EHB benchmark plan, including any additional State-required benefits considered EHB under 45 CFR 155.170(a)(2). The requirement in section 2707(a) of the PHS Act to provide the EHB package required under section 1302(a) of the Affordable Care Act applies only to non-grandfathered health insurance coverage in the individual and small group markets. Self-insured group health plans, large group market health plans and grandfathered health plans are not required to include coverage of EHB, but cannot place lifetime or annual dollar limits on any EHB covered by these plans.33 These plans are permitted to impose limits other than dollar limits on EHB, as long as they comply with other applicable statutory provisions. In addition, these plans can continue to impose annual and lifetime dollar limits on benefits that do not fall within the definition of EHB.

33 The annual limits prohibition does not apply to grandfathered individual market coverage.

One commenter urged the Departments to eliminate the option for large group market health plans to define EHB based on one of the three largest nationally available FEHBP benchmark plan options to ensure consistency with the definition of EHB in the individual and small group markets. However, these FEHBP plan options 34 are unique among benchmark plans in that they are available nationally, and thus can more appropriately be utilized to determine what benefits would be categorized as EHB for those employers that provide health coverage to employees throughout the United States and are not situated only in a single State. The Departments are finalizing the proposed clarification to the lifetime and annual limit regulations without change.

34 The three largest nationally available FEHBP plan options are available at https://www.cms.gov/CCIIO/Resources/Regulations-and-Guidance/Downloads/Top3ListFinal-5-19-2015.pdf.

D. Applicability Date

These final regulations are applicable for plan years (or, in the individual market, policy years) beginning on or after January 1, 2017. The HHS final regulations specify the applicability dates in the group market regulations at 45 CFR 146.125 and in the individual market regulations at 45 CFR 148.102.

III. Economic Impact and Paperwork Burden

A. Summary—Department of Labor and Department of Health and Human Services

These final regulations specify the conditions for similar supplemental coverage products that are designed to fill gaps in primary coverage by providing coverage of additional categories of benefits (as opposed to filling in gaps in cost sharing) to constitute supplemental excepted benefits, and clarify that certain travel-related insurance products that provide only incidental health benefits constitute excepted benefits.

These final regulations also revise the definition of short-term, limited-duration insurance so that the coverage (including renewals) has to be less than three months in total duration (as opposed to the current definition of less than 12 months in duration), and provide that a notice must be prominently displayed in the contract and in any application materials provided in connection with enrollment in the coverage indicating that such coverage is not minimum essential coverage.

Finally, the regulations amend the definition of “essential health benefits” for purposes of the prohibition on lifetime and annual dollar limits with respect to group health plans and health insurance issuers that are not required to provide essential health benefits, including self-insured group health plans, large group market health plans, and grandfathered health plans.

The Departments are publishing these final regulations to implement the protections intended by the Congress in the most economically efficient manner possible. The Departments have examined the effects of this rule as required by Executive Order 13563 (76 FR 3821, January 21, 2011), Executive Order 12866 (58 FR 51735, September 1993, Regulatory Planning and Review), the Regulatory Flexibility Act (September 19, 1980, Pub. L. 96-354), the Unfunded Mandates Reform Act of 1995 (Pub. L. 104-4), Executive Order 13132 on Federalism, and the Congressional Review Act (5 U.S.C. 804(2)).

B. Executive Orders 12866 and 13563—Department of Labor and Department of Health and Human Services

Executive Order 12866 (58 FR 51735) directs agencies to assess all costs and benefits of available regulatory alternatives and, if regulation is necessary, to select regulatory approaches that maximize net benefits (including potential economic, environmental, public health and safety effects; distributive impacts; and equity). Executive Order 13563 (76 FR 3821, January 21, 2011) is supplemental to and reaffirms the principles, structures, and definitions governing regulatory review as established in Executive Order 12866.

Section 3(f) of Executive Order 12866 defines a “significant regulatory action” as an action that is likely to result in a final rule—(1) having an annual effect on the economy of $100 million or more in any one year, or adversely and materially affecting a sector of the economy, productivity, competition, jobs, the environment, public health or safety, or state, local or tribal governments or communities (also referred to as “economically significant”); (2) creating a serious inconsistency or otherwise interfering with an action taken or planned by another agency; (3) materially altering the budgetary impacts of entitlement grants, user fees, or loan programs or the rights and obligations of recipients thereof; or (4) raising novel legal or policy issues arising out of legal mandates, the President's priorities, or the principles set forth in the Executive Order.

A regulatory impact analysis must be prepared for rules with economically significant effects (for example, $100 million or more in any 1 year), and a “significant” regulatory action is subject to review by the Office of Management and Budget. The Departments have determined that this regulatory action is not likely to have economic impacts of $100 million or more in any one year, and is not significant within the meaning of Executive Order 12866. However, the Departments are nonetheless providing a discussion of the benefits and costs that might stem from these final regulations in the Summary of Impacts section below.

1. Need for Regulatory Action

These final regulations clarify the conditions for similar supplemental coverage and travel insurance to be recognized as excepted benefits. These clarifications are necessary to provide health insurance issuers offering supplemental coverage and travel insurance products with a clearer understanding of the Federal standards that apply to these types of coverage. These final regulations also amend the definition of short-term, limited-duration insurance for purposes of the exclusion from the definition of individual health insurance coverage and impose a new notice requirement in response to reports that short-term, limited-duration insurance coverage is being sold to individuals as primary coverage.

2. Summary of Impacts

The final regulations outline the conditions for travel insurance and similar supplemental health insurance coverage to be considered excepted benefits, and revise the definition of short-term, limited-duration insurance.

The Departments received comments suggesting that the majority of travel insurance policies are issued for trips of short duration, with the average policy length being approximately three months, and that these policies generally provide limited medical coverage and property and casualty coverage to protect against risks related to travel. The Departments believe that the designation of certain travel insurance products (as defined by the regulations) as excepted benefits is consistent with prevailing industry practices and, therefore, will not result in significant cost to issuers of these products or to consumers who purchase them.

Short-term, limited-duration policies represent a very small fraction of the health insurance market, though their use is increasing. In 2015, total premiums earned for short-term, limited-duration insurance were approximately $160 million for approximately 1,517,000 member months, with approximately 148,000 covered lives at the end of the year,35 while in 2013, total premiums were approximately $98 million for 1,031,000 member months, with approximately 80,400 covered lives at the end of the year.36

35 National Association of Insurance Commissioners, 2015 Accident and Health Policy Experience Report, 2016, available at http://naic.org/prod_serv/AHP-LR-16.pdf.

36 National Association of Insurance Commissioners, 2013 Accident and Health Policy Experience Report, 2014, available at http://naic.org/prod_serv/AHP-LR-14.pdf.
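Dividing the aggregate NAIC figures cited above gives a rough sense of the average earned premium per member month. The quotients below are derived from those totals and are approximate; they are not figures reported by NAIC.

```python
# Approximate average earned premium per member month, computed from the
# aggregate totals cited above (rounded; illustrative only).
premiums_2015, member_months_2015 = 160_000_000, 1_517_000
premiums_2013, member_months_2013 = 98_000_000, 1_031_000

print(round(premiums_2015 / member_months_2015))  # about 105 dollars per member month in 2015
print(round(premiums_2013 / member_months_2013))  # about 95 dollars per member month in 2013
```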

The Departments received comments indicating that a large majority of the short-term, limited-duration insurance plans are sold as transitional coverage, particularly for individuals seeking to cover periods of unemployment or gaps between employer-sponsored coverage, and typically provide coverage for less than three months. Therefore, the Departments believe that the final regulations will have no effect on the majority of consumers who purchase such coverage and issuers of those policies. The small fraction of consumers who purchase such policies for longer periods and who may have to transition to individual market coverage will benefit from the protections afforded by the Affordable Care Act, such as no preexisting condition exclusions, essential health benefits without annual or lifetime dollar limits, and guaranteed renewability. While some of these consumers may experience an increase in costs due to higher premiums compared with short-term, limited-duration coverage, they will also avoid potential tax liability by having minimum essential coverage. Some consumers may also be eligible for premium tax credits and cost-sharing reductions for coverage offered through the Exchanges. Finally, inclusion of these individuals, often relatively healthier individuals, in the individual market will help strengthen the individual market's single risk pool. The notice requirement will help ensure that consumers do not inadvertently purchase these products expecting them to be minimum essential coverage. Further, the Departments believe that any costs incurred by issuers of short-term, limited-duration insurance to include the required notice in application or enrollment materials will be negligible since the Departments have provided the exact text for the notice.

As a result, the Departments have concluded that the impacts of these final regulations are not economically significant.

C. Paperwork Reduction Act—Department of Health and Human Services

The final regulations provide that to be considered short-term, limited-duration insurance for policy years beginning on or after January 1, 2017, a notice must be prominently displayed in the contract and in any application materials, stating that the coverage is not minimum essential coverage and that failure to have minimum essential coverage may result in an additional tax payment. The Departments have provided the exact text for these notice requirements and the language will not need to be customized. The burden associated with these notices is not subject to the Paperwork Reduction Act of 1995 in accordance with 5 CFR 1320.3(c)(2) because they do not contain a “collection of information” as defined in 44 U.S.C. 3502(3).

D. Regulatory Flexibility Act

The Regulatory Flexibility Act (5 U.S.C. 601 et seq.) (RFA) imposes certain requirements with respect to Federal rules that are subject to the notice and comment requirements of section 553(b) of the Administrative Procedure Act (5 U.S.C. 551 et seq.) and that are likely to have a significant economic impact on a substantial number of small entities. Unless an agency certifies that a proposed rule is not likely to have a significant economic impact on a substantial number of small entities, section 603 of RFA requires that the agency present an initial regulatory flexibility analysis at the time of the publication of the notice of proposed rulemaking describing the impact of the rule on small entities and seeking public comment on such impact. Small entities include small businesses, organizations and governmental jurisdictions.

The RFA generally defines a “small entity” as (1) a proprietary firm meeting the size standards of the Small Business Administration (13 CFR 121.201); (2) a nonprofit organization that is not dominant in its field; or (3) a small government jurisdiction with a population of less than 50,000. (States and individuals are not included in the definition of “small entity.”) The Departments use as their measure of significant economic impact on a substantial number of small entities a change in revenues of more than 3 to 5 percent.

The Departments expect the impact of these final regulations to be limited because the provisions are generally consistent with current industry practices and impact only a small fraction of the health insurance market. Therefore, the Departments certify that the final regulations will not have a significant impact on a substantial number of small entities. In addition, section 1102(b) of the Social Security Act requires agencies to prepare a regulatory impact analysis if a rule may have a significant economic impact on the operations of a substantial number of small rural hospitals. This analysis must conform to the provisions of section 604 of the RFA. These final regulations will not affect small rural hospitals. Therefore, the Departments have determined that these final regulations will not have a significant impact on the operations of a substantial number of small rural hospitals.

E. Special Analysis—Department of the Treasury

Certain IRS regulations, including this one, are exempt from the requirements of Executive Order 12866, as supplemented and reaffirmed by Executive Order 13563. Therefore, a regulatory impact assessment is not required. For applicability of RFA, see paragraph D of this section III.

Pursuant to section 7805(f) of the Code, these regulations have been submitted to the Chief Counsel for Advocacy of the Small Business Administration for comment on their impact on small business.

F. Unfunded Mandates Reform Act

For purposes of the Unfunded Mandates Reform Act of 1995 (2 U.S.C. 1501 et seq.), as well as Executive Order 12875, these final regulations do not include any Federal mandate that may result in expenditures of $146 million or more in any one year (adjusted for inflation since 1995) by State, local, or tribal governments, or by the private sector.

G. Federalism—Department of Labor and Department of Health and Human Services

Executive Order 13132 outlines fundamental principles of federalism. It requires adherence to specific criteria by Federal agencies in formulating and implementing policies that have “substantial direct effects” on the States, the relationship between the national government and States, or on the distribution of power and responsibilities among the various levels of government. Federal agencies promulgating regulations that have these federalism implications must consult with State and local officials, and describe the extent of their consultation and the nature of the concerns of State and local officials in the preamble to the final regulation.

In the Departments' view, these final regulations have federalism implications because they would have direct effects on the States, the relationship between the national government and the States, or on the distribution of power and responsibilities among various levels of government. Under these final regulations, health insurance issuers offering short-term, limited-duration insurance, travel insurance and similar supplemental coverage will be required to follow the minimum Federal standards to not be subject to the market reform provisions under the PHS Act, ERISA and the Code. However, in the Departments' view, the federalism implications of these final regulations are substantially mitigated because, with respect to health insurance issuers, the Departments expect that the majority of States will enact laws or take other appropriate action resulting in their meeting or exceeding the Federal standards.

In general, through section 514, ERISA supersedes State laws to the extent that they relate to any covered employee benefit plan, and preserves State laws that regulate insurance, banking, or securities. While ERISA prohibits States from regulating an employee benefit plan as an insurance or investment company or bank, the preemption provisions of section 731 of ERISA and section 2724 of the PHS Act (implemented in 29 CFR 2590.731(a) and 45 CFR 146.143(a) and 148.210(b)) apply so that the requirements in title XXVII of the PHS Act (including those added by the Affordable Care Act) are not to be construed to supersede any provision of State law which establishes, implements, or continues in effect any standard or requirement solely relating to health insurance issuers in connection with individual or group health insurance coverage except to the extent that such standard or requirement prevents the application of a Federal requirement. The conference report accompanying HIPAA indicates that this is intended to be the “narrowest” preemption of State laws (See House Conf. Rep. No. 104-736, at 205, reprinted in 1996 U.S. Code Cong. & Admin. News 2018).

States may continue to apply State law requirements except to the extent that such requirements prevent the application of the market reform requirements that are the subject of this rulemaking. Accordingly, States have significant latitude to impose requirements on health insurance issuers that are more restrictive than the Federal law.

In compliance with the requirement of Executive Order 13132 that agencies examine closely any policies that may have federalism implications or limit the policy making discretion of the States, the Departments have engaged in efforts to consult with and work cooperatively with affected States, including consulting with, and attending conferences of, the National Association of Insurance Commissioners and consulting with State insurance officials on an individual basis. It is expected that the Departments will act in a similar fashion in enforcing the market reform provisions of the Affordable Care Act.

Throughout the process of developing these final regulations, to the extent feasible within the applicable preemption provisions, the Departments have attempted to balance the States' interests in regulating health insurance issuers, and Congress' intent to provide uniform minimum protections to consumers in every State. By doing so, it is the Departments' view that they have complied with the requirements of Executive Order 13132.

Pursuant to the requirements set forth in section 8(a) of Executive Order 13132, and by the signatures affixed to this final rule, the Departments certify that the Employee Benefits Security Administration and the Centers for Medicare & Medicaid Services have complied with the requirements of Executive Order 13132 for the attached final rules in a meaningful and timely manner.

H. Congressional Review Act

These final regulations are subject to the Congressional Review Act provisions of the Small Business Regulatory Enforcement Fairness Act of 1996 (5 U.S.C. 801 et seq.) and will be transmitted to the Congress and to the Comptroller General for review in accordance with such provisions.

I. Statement of Availability of IRS Documents

IRS Revenue Procedures, Revenue Rulings, notices, and other guidance cited in this document are published in the Internal Revenue Bulletin (or Cumulative Bulletin) and are available from the Superintendent of Documents, U.S. Government Printing Office, Washington, DC 20402, or by visiting the IRS Web site at http://www.irs.gov.

IV. Statutory Authority

The Department of the Treasury regulations are adopted pursuant to the authority contained in sections 7805 and 9833 of the Code.

The Department of Labor regulations are adopted pursuant to the authority contained in 29 U.S.C. 1135 and 1191c; and Secretary of Labor's Order 1-2011, 77 FR 1088 (Jan. 9, 2012).

The Department of Health and Human Services regulations are adopted pursuant to the authority contained in sections 2701 through 2763, 2791, and 2792 of the PHS Act (42 U.S.C. 300gg through 300gg-63, 300gg-91, and 300gg-92), as amended.

List of Subjects

26 CFR Part 54

Pension and excise taxes.

29 CFR Part 2590

Continuation coverage, Disclosure, Employee benefit plans, Group health plans, Health care, Health insurance, Medical child support, Reporting and recordkeeping requirements.

45 CFR Parts 144, 146 and 147

Health care, Health insurance, Reporting and recordkeeping requirements.

45 CFR Part 148

Administrative practice and procedure, Health care, Health insurance, Penalties, Reporting and recordkeeping requirements.

John Dalrymple, Deputy Commissioner for Services and Enforcement, Internal Revenue Service.
Approved: October 25, 2016.
Mark J. Mazur, Assistant Secretary of the Treasury (Tax Policy).
Signed this 25th day of October 2016.
Phyllis C. Borzi, Assistant Secretary, Employee Benefits Security Administration, Department of Labor.
Dated: October 24, 2016.
Andrew M. Slavitt, Acting Administrator, Centers for Medicare & Medicaid Services.
Dated: October 25, 2016.
Sylvia M. Burwell, Secretary, Department of Health and Human Services.
DEPARTMENT OF THE TREASURY Internal Revenue Service 26 CFR Chapter I

Accordingly, 26 CFR part 54 is amended as follows:

PART 54—PENSION AND EXCISE TAXES Par. 1. The authority citation for part 54 continues to read in part as follows: Authority:

26 U.S.C. 7805 * * *

Par. 2. Section 54.9801-2 is amended by revising the definition of “short-term, limited-duration insurance”, and adding a definition of “travel insurance” in alphabetical order. The revision and addition read as follows:
§ 54.9801-2 Definitions.

Short-term, limited-duration insurance means health insurance coverage provided pursuant to a contract with an issuer that:

(1) Has an expiration date specified in the contract (taking into account any extensions that may be elected by the policyholder with or without the issuer's consent) that is less than 3 months after the original effective date of the contract; and

(2) Displays prominently in the contract and in any application materials provided in connection with enrollment in such coverage in at least 14 point type the following: “THIS IS NOT QUALIFYING HEALTH COVERAGE (“MINIMUM ESSENTIAL COVERAGE”) THAT SATISFIES THE HEALTH COVERAGE REQUIREMENT OF THE AFFORDABLE CARE ACT. IF YOU DON'T HAVE MINIMUM ESSENTIAL COVERAGE, YOU MAY OWE AN ADDITIONAL PAYMENT WITH YOUR TAXES.”

Travel insurance means insurance coverage for personal risks incident to planned travel, which may include, but is not limited to, interruption or cancellation of trip or event, loss of baggage or personal effects, damages to accommodations or rental vehicles, and sickness, accident, disability, or death occurring during travel, provided that the health benefits are not offered on a stand-alone basis and are incidental to other coverage. For this purpose, the term travel insurance does not include major medical plans that provide comprehensive medical protection for travelers with trips lasting 6 months or longer, including, for example, those working overseas as an expatriate or military personnel being deployed.

Par. 3. Section 54.9815-2711 is amended by revising paragraph (c) to read as follows:
§ 54.9815-2711 No lifetime or annual limits.

(c) Definition of essential health benefits. The term “essential health benefits” means essential health benefits under section 1302(b) of the Patient Protection and Affordable Care Act and applicable regulations. For this purpose, a group health plan or a health insurance issuer that is not required to provide essential health benefits under section 1302(b) must define “essential health benefits” in a manner that is consistent with—

(1) One of the EHB-benchmark plans applicable in a State under 45 CFR 156.110, and includes coverage of any additional required benefits that are considered essential health benefits consistent with 45 CFR 155.170(a)(2); or

(2) One of the three Federal Employees Health Benefits Program (FEHBP) plan options as defined by 45 CFR 156.100(a)(3), supplemented, as necessary, to meet the standards in 45 CFR 156.110.

Par. 4. Section 54.9831-1 is amended:
a. In paragraph (b)(1) by removing the reference “54.9812-1T” and adding in its place the reference “54.9812-1, 54.9815-1251 through 54.9815-2719A,” and in paragraph (c)(1) by removing the reference “54.9811-1T, 54.9812-1T” and adding in its place the phrase “54.9811-1, 54.9812-1, 54.9815-1251 through 54.9815-2719A”;
b. In paragraph (c)(2)(vii) by removing “and” at the end;
c. In paragraph (c)(2)(viii) by removing the period and adding “; and” at the end;
d. Adding paragraph (c)(2)(ix); and
e. Revising paragraph (c)(5)(i)(C).

The revisions and additions are as follows:

§ 54.9831-1 Special rules relating to group health plans.

(c) * * *

(2) * * *

(ix) Travel insurance, within the meaning of § 54.9801-2.

(5) * * *

(i) * * *

(C) Similar supplemental coverage provided to coverage under a group health plan. To be similar supplemental coverage, the coverage must be specifically designed to fill gaps in the primary coverage. The preceding sentence is satisfied if the coverage is designed to fill gaps in cost sharing in the primary coverage, such as coinsurance or deductibles, or the coverage is designed to provide benefits for items and services not covered by the primary coverage and that are not essential health benefits (as defined under section 1302(b) of the Patient Protection and Affordable Care Act) in the State where the coverage is issued, or the coverage is designed to both fill such gaps in cost sharing under, and cover such benefits not covered by, the primary coverage. Similar supplemental coverage does not include coverage that becomes secondary or supplemental only under a coordination-of-benefits provision.

Par. 5. Section 54.9833-1 is amended by adding a sentence at the end to read as follows:
§ 54.9833-1 Effective dates.

* * * Notwithstanding the previous sentence, the definition of “short-term, limited-duration insurance” in § 54.9801-2 and paragraph (c)(5)(i)(C) of § 54.9831-1 apply for plan years beginning on or after January 1, 2017.

DEPARTMENT OF LABOR Employee Benefits Security Administration 29 CFR Chapter XXV

For the reasons stated in the preamble, the Department of Labor amends 29 CFR part 2590 as set forth below:

PART 2590—RULES AND REGULATIONS FOR GROUP HEALTH PLANS 6. The authority citation for part 2590 is revised to read as follows: Authority:

29 U.S.C. 1027, 1059, 1135, 1161-1168, 1169, 1181-1183, 1181 note, 1185, 1185a, 1185b, 1191, 1191a, 1191b, and 1191c; sec. 101(g), Pub. L. 104-191, 110 Stat. 1936; sec. 401(b), Pub. L. 105-200, 112 Stat. 645 (42 U.S.C. 651 note); sec. 512(d), Pub. L. 110-343, 122 Stat. 3881; sec. 1001, 1201, and 1562(e), Pub. L. 111-148, 124 Stat. 119, as amended by Pub. L. 111-152, 124 Stat. 1029; Division M, Pub. L. 113-235, 128 Stat. 2130; Secretary of Labor's Order 1-2011, 77 FR 1088 (Jan. 9, 2012).

7. Section 2590.701-2 is amended by revising the definition of “short-term, limited-duration insurance”, and adding a definition of “travel insurance” in alphabetical order. The addition and revision read as follows:
§ 2590.701-2 Definitions.

Short-term, limited-duration insurance means health insurance coverage provided pursuant to a contract with an issuer that:

(1) Has an expiration date specified in the contract (taking into account any extensions that may be elected by the policyholder with or without the issuer's consent) that is less than 3 months after the original effective date of the contract; and

(2) Displays prominently in the contract and in any application materials provided in connection with enrollment in such coverage in at least 14 point type the following: “THIS IS NOT QUALIFYING HEALTH COVERAGE (“MINIMUM ESSENTIAL COVERAGE”) THAT SATISFIES THE HEALTH COVERAGE REQUIREMENT OF THE AFFORDABLE CARE ACT. IF YOU DON'T HAVE MINIMUM ESSENTIAL COVERAGE, YOU MAY OWE AN ADDITIONAL PAYMENT WITH YOUR TAXES.”

Travel insurance means insurance coverage for personal risks incident to planned travel, which may include, but is not limited to, interruption or cancellation of trip or event, loss of baggage or personal effects, damages to accommodations or rental vehicles, and sickness, accident, disability, or death occurring during travel, provided that the health benefits are not offered on a stand-alone basis and are incidental to other coverage. For this purpose, the term travel insurance does not include major medical plans that provide comprehensive medical protection for travelers with trips lasting 6 months or longer, including, for example, those working overseas as an expatriate or military personnel being deployed.

8. Section 2590.715-2711 is amended by revising paragraph (c) to read as follows:
§ 2590.715-2711 No lifetime or annual limits.

(c) Definition of essential health benefits. The term “essential health benefits” means essential health benefits under section 1302(b) of the Patient Protection and Affordable Care Act and applicable regulations. For this purpose, a group health plan or a health insurance issuer that is not required to provide essential health benefits under section 1302(b) must define “essential health benefits” in a manner that is consistent with—

(1) One of the EHB-benchmark plans applicable in a State under 45 CFR 156.110, and includes coverage of any additional required benefits that are considered essential health benefits consistent with 45 CFR 155.170(a)(2); or

(2) One of the three Federal Employees Health Benefits Program (FEHBP) plan options as defined by 45 CFR 156.100(a)(3), supplemented, as necessary, to meet the standards in 45 CFR 156.110.

9. Section 2590.732 is amended by adding paragraph (c)(2)(ix) and revising paragraph (c)(5)(i)(C) to read as follows:
§ 2590.732 Special rules relating to group health plans.

(c) * * *

(2) * * *

(ix) Travel insurance, within the meaning of § 2590.701-2.

(5) * * *

(i) * * *

(C) Similar supplemental coverage provided to coverage under a group health plan. To be similar supplemental coverage, the coverage must be specifically designed to fill gaps in the primary coverage. The preceding sentence is satisfied if the coverage is designed to fill gaps in cost sharing in the primary coverage, such as coinsurance or deductibles, or the coverage is designed to provide benefits for items and services not covered by the primary coverage and that are not essential health benefits (as defined under section 1302(b) of the Patient Protection and Affordable Care Act) in the State where the coverage is issued, or the coverage is designed to both fill such gaps in cost sharing under, and cover such benefits not covered by, the primary coverage. Similar supplemental coverage does not include coverage that becomes secondary or supplemental only under a coordination-of-benefits provision.

10. Section 2590.736 is amended by adding a sentence at the end to read as follows:
§ 2590.736 Applicability dates.

* * * Notwithstanding the previous sentence, the definition of “short-term, limited-duration insurance” in § 2590.701-2 and paragraph (c)(5)(i)(C) of § 2590.732 apply for plan years beginning on or after January 1, 2017.

DEPARTMENT OF HEALTH AND HUMAN SERVICES 45 CFR Chapter 1

For the reasons stated in the preamble, the Department of Health and Human Services amends 45 CFR parts 144, 146, 147, and 148 as set forth below:

PART 144—REQUIREMENTS RELATING TO HEALTH INSURANCE COVERAGE 11. The authority citation for part 144 continues to read as follows: Authority:

Secs. 2701 through 2763, 2791, and 2792 of the Public Health Service Act, 42 U.S.C. 300gg through 300gg-63, 300gg-91, and 300gg-92.

12. Section 144.103 is amended by revising the definition of “short-term, limited-duration insurance” and adding a definition of “travel insurance” in alphabetical order. The revision and addition read as follows:
§ 144.103 Definitions.

Short-term, limited-duration insurance means health insurance coverage provided pursuant to a contract with an issuer that:

(1) Has an expiration date specified in the contract (taking into account any extensions that may be elected by the policyholder with or without the issuer's consent) that is less than 3 months after the original effective date of the contract; and

(2) Displays prominently in the contract and in any application materials provided in connection with enrollment in such coverage in at least 14 point type the following: “THIS IS NOT QUALIFYING HEALTH COVERAGE (“MINIMUM ESSENTIAL COVERAGE”) THAT SATISFIES THE HEALTH COVERAGE REQUIREMENT OF THE AFFORDABLE CARE ACT. IF YOU DON'T HAVE MINIMUM ESSENTIAL COVERAGE, YOU MAY OWE AN ADDITIONAL PAYMENT WITH YOUR TAXES.”

Travel insurance means insurance coverage for personal risks incident to planned travel, which may include, but is not limited to, interruption or cancellation of trip or event, loss of baggage or personal effects, damages to accommodations or rental vehicles, and sickness, accident, disability, or death occurring during travel, provided that the health benefits are not offered on a stand-alone basis and are incidental to other coverage. For this purpose, the term travel insurance does not include major medical plans that provide comprehensive medical protection for travelers with trips lasting 6 months or longer, including, for example, those working overseas as an expatriate or military personnel being deployed.

PART 146—REQUIREMENTS FOR THE GROUP HEALTH INSURANCE MARKET 13. The authority citation for part 146 continues to read as follows: Authority:

Secs. 2702 through 2705, 2711 through 2723, 2791, and 2792 of the Public Health Service Act (42 U.S.C. 300gg-1 through 300gg-5, 300gg-11 through 300gg-23, 300gg-91, and 300gg-92).

14. Section 146.125 is amended by adding a sentence at the end to read as follows:
§ 146.125 Applicability dates.

* * * Notwithstanding the previous sentence, the definition of “short-term, limited-duration insurance” in § 144.103 of this subchapter and paragraph (c)(5)(i)(C) of § 146.145 apply for policy years and plan years beginning on or after January 1, 2017.

15. Section 146.145 is amended by adding paragraph (b)(2)(ix) and revising paragraph (b)(5)(i)(C) to read as follows:
§ 146.145 Special rules relating to group health plans.

(b) * * *

(2) * * *

(ix) Travel insurance, within the meaning of § 144.103 of this subchapter.

(5) * * *

(i) * * *

(C) Similar supplemental coverage provided to coverage under a group health plan. To be similar supplemental coverage, the coverage must be specifically designed to fill gaps in the primary coverage. The preceding sentence is satisfied if the coverage is designed to fill gaps in cost sharing in the primary coverage, such as coinsurance or deductibles, or the coverage is designed to provide benefits for items and services not covered by the primary coverage and that are not essential health benefits (as defined under section 1302(b) of the Patient Protection and Affordable Care Act) in the State where the coverage is issued, or the coverage is designed to both fill such gaps in cost sharing under, and cover such benefits not covered by, the primary coverage. Similar supplemental coverage does not include coverage that becomes secondary or supplemental only under a coordination-of-benefits provision.

PART 147—HEALTH INSURANCE REFORM REQUIREMENTS FOR THE GROUP AND INDIVIDUAL HEALTH INSURANCE MARKETS 16. The authority citation for part 147 continues to read as follows: Authority:

Secs. 2701 through 2763, 2791, and 2792 of the Public Health Service Act (42 U.S.C. 300gg through 300gg-63, 300gg-91, and 300gg-92), as amended.

17. Section 147.126 is amended by revising paragraph (c) to read as follows:
§ 147.126 No lifetime or annual limits.

(c) Definition of essential health benefits. The term “essential health benefits” means essential health benefits under section 1302(b) of the Patient Protection and Affordable Care Act and applicable regulations. For this purpose, a group health plan or a health insurance issuer that is not required to provide essential health benefits under section 1302(b) must define “essential health benefits” in a manner that is consistent with—

(1) One of the EHB-benchmark plans applicable in a State under 45 CFR 156.110, and includes coverage of any additional required benefits that are considered essential health benefits consistent with 45 CFR 155.170(a)(2); or

(2) One of the three Federal Employees Health Benefits Program (FEHBP) plan options as defined by 45 CFR 156.100(a)(3), supplemented, as necessary, to meet the standards in 45 CFR 156.110.

PART 148—REQUIREMENTS FOR THE INDIVIDUAL HEALTH INSURANCE MARKET 18. The authority citation for part 148 continues to read as follows: Authority:

Secs. 2701 through 2763, 2791, and 2792 of the Public Health Service Act (42 U.S.C. 300gg through 300gg-63, 300gg-91, and 300gg-92), as amended.

19. Section 148.102 is amended by adding a sentence at the end of paragraph (b) to read as follows:
§ 148.102 Scope, applicability, and effective dates.

(b) * * * Notwithstanding the previous sentence, the definition of “short-term, limited-duration insurance” in § 144.103 of this subchapter and paragraph (b)(7) of § 148.220 apply for policy years beginning on or after January 1, 2017.

20. Section 148.220 is amended by adding paragraph (a)(9) and revising paragraph (b)(7) to read as follows:
§ 148.220 Excepted benefits.

(a) * * *

(9) Travel insurance, within the meaning of § 144.103 of this subchapter.

(b) * * *

(7) Similar supplemental coverage provided to coverage under a group health plan (as described in § 146.145(b)(5)(i)(C) of this subchapter).

[FR Doc. 2016-26162 Filed 10-28-16; 8:45 am] BILLING CODE 4830-01-P; 4120-01-P; 4510-29-P
DEPARTMENT OF HOMELAND SECURITY Coast Guard 33 CFR Part 117 [Docket No. USCG-2016-0956] Drawbridge Operation Regulation; Upper Mississippi River, Clinton, IA AGENCY:

Coast Guard, DHS.

ACTION:

Notice of deviation from drawbridge regulation.

SUMMARY:

The Coast Guard has issued a temporary deviation from the operating schedule that governs three drawbridges crossing the Upper Mississippi River in Iowa: the Illinois Central Railroad Drawbridge, mile 579.9, Dubuque, IA; the Sabula Railroad Drawbridge, mile 535.0, Sabula, IA; and the Clinton Railroad Drawbridge, mile 518.0, Clinton, IA. The deviation is necessary to allow the bridge owners time to perform preventive maintenance that is essential to the continued safe operation of the drawbridges; a similar seasonal deviation is issued for these bridges each year. Maintenance is scheduled in the winter, when reduced traffic lessens the impact on navigation. This deviation allows the bridges to open on signal if at least 24 hours advance notice is given.

DATES:

This deviation is effective from 5 p.m., December 13, 2016 until 9 a.m., March 2, 2017.

ADDRESSES:

The docket for this deviation (USCG-2016-0956) is available at http://www.regulations.gov. Type the docket number in the “SEARCH” box and click “SEARCH.” Click on Open Docket Folder on the line associated with this deviation.

FOR FURTHER INFORMATION CONTACT:

If you have questions on this temporary deviation, call or email Eric A. Washburn, Bridge Administrator, Western Rivers, Coast Guard; telephone 314-269-2378, email [email protected].

SUPPLEMENTARY INFORMATION:

The Illinois Central, Canadian Pacific, and Union Pacific Railroads requested a temporary deviation for the Illinois Central Railroad Drawbridge, mile 579.9, Dubuque, Iowa, Sabula Railroad Drawbridge, mile 535.0, Sabula, Iowa, and Clinton Railroad Drawbridge, mile 518.0, Clinton, Iowa, across the Upper Mississippi River to open on signal if at least 24 hours advance notice is given for 79 days from 5 p.m., December 13, 2016 to 9 a.m., March 2, 2017 for scheduled maintenance on the bridges.

The Illinois Central, Sabula, and Clinton Railroad Drawbridges currently operate in accordance with 33 CFR 117.5, which states the general requirement that drawbridges open on signal.

There are no alternate routes for vessels transiting these sections of the Upper Mississippi River. The bridges cannot open in case of emergency.

The Illinois Central Railroad Drawbridge provides a vertical clearance of 19.9 feet, Sabula Railroad Drawbridge provides a vertical clearance of 18.1 feet, and Clinton Railroad Drawbridge provides a vertical clearance of 18.7 feet, above normal pool in their closed-to-navigation positions. Navigation on the waterway consists primarily of commercial tows and recreational watercraft and will not be significantly impacted. This temporary deviation has been coordinated with waterway users. No objections were received.

In accordance with 33 CFR 117.35(e), each of these drawbridges must return to its regular operating schedule immediately at the end of the effective period of this temporary deviation. This deviation from the operating regulations is authorized under 33 CFR 117.35.

Dated: October 25, 2016. Eric A. Washburn, Bridge Administrator, Western Rivers.
[FR Doc. 2016-26150 Filed 10-28-16; 8:45 am] BILLING CODE 9110-04-P
DEPARTMENT OF HOMELAND SECURITY Coast Guard 33 CFR Part 117 [Docket No. USCG-2016-0948] Drawbridge Operation Regulation; Newtown Creek, Brooklyn and Queens, NY AGENCY:

Coast Guard, DHS.

ACTION:

Notice of deviation from drawbridge regulation.

SUMMARY:

The Coast Guard has issued a temporary deviation from the operating schedule that governs the Pulaski Bridge across the Newtown Creek, mile 0.6, between Brooklyn and Queens, New York. This deviation is necessary to allow the bridge owner to adjust the span locks at the bridge.

DATES:

This deviation is effective from 12:01 a.m. on November 8, 2016 to 5 a.m. on December 2, 2016.

ADDRESSES:

The docket for this deviation [USCG-2016-0948] is available at http://www.regulations.gov. Type the docket number in the “SEARCH” box and click “SEARCH”. Click on Open Docket Folder on the line associated with this deviation.

FOR FURTHER INFORMATION CONTACT:

If you have questions on this temporary deviation, call or email Judy Leung-Yee, Project Officer, First Coast Guard District, telephone (212) 514-4330, email [email protected].

SUPPLEMENTARY INFORMATION:

The Pulaski Bridge, mile 0.6, across the Newtown Creek, has a vertical clearance in the closed position of 39 feet at mean high water and 43 feet at mean low water. The existing bridge operating regulations are found at 33 CFR 117.801(g)(1).

The waterway is transited by commercial barge traffic of various sizes.

The bridge owner, New York City DOT, requested a temporary deviation from the normal operating schedule to adjust the span locks at the bridge.

Under this temporary deviation, the Pulaski Bridge shall remain in the closed position as follows:

November 8, 2016 between 12:01 a.m. and 5 a.m.

November 9, 2016 between 12:01 a.m. and 5 a.m.

November 10, 2016 between 12:01 a.m. and 5 a.m.

November 11, 2016 between 12:01 a.m. and 5 a.m.

November 15, 2016 between 12:01 a.m. and 5 a.m.

November 16, 2016 between 12:01 a.m. and 5 a.m.

November 17, 2016 between 12:01 a.m. and 5 a.m.

November 18, 2016 between 12:01 a.m. and 5 a.m.

November 22, 2016 between 12:01 a.m. and 5 a.m.

November 23, 2016 between 12:01 a.m. and 5 a.m.

November 24, 2016 between 12:01 a.m. and 5 a.m.

November 25, 2016 between 12:01 a.m. and 5 a.m.

November 29, 2016 between 12:01 a.m. and 5 a.m.

November 30, 2016 between 12:01 a.m. and 5 a.m.

December 1, 2016 between 12:01 a.m. and 5 a.m.

December 2, 2016 between 12:01 a.m. and 5 a.m.

Vessels able to pass under the bridge in the closed position may do so at any time. The bridge will not be able to open for emergencies, and there is no immediate alternate route for vessels to pass.

The Coast Guard will inform the users of the waterways, through our Local Notice to Mariners and Broadcast Notice to Mariners, of the change in operating schedule for the bridge so that vessel operators can arrange their transits to minimize any impact caused by the temporary deviation. The Coast Guard notified the known commercial oil and barge vessel companies in the area, and they have no objections to the temporary deviation.

In accordance with 33 CFR 117.35(e), the drawbridge must return to its regular operating schedule immediately at the end of the effective period of this temporary deviation. This deviation from the operating regulations is authorized under 33 CFR 117.35.

Dated: October 26, 2016. C.J. Bisignano, Supervisory Bridge Management Specialist, First Coast Guard District.
[FR Doc. 2016-26235 Filed 10-28-16; 8:45 am] BILLING CODE 9110-04-P
DEPARTMENT OF HEALTH AND HUMAN SERVICES Centers for Medicare & Medicaid Services 42 CFR Parts 405, 412, 413, and 489 [CMS-1655-CN3] RINs 0938-AS77; 0938-AS88; 0938-AS41 Medicare Program; Hospital Inpatient Prospective Payment Systems for Acute Care Hospitals and the Long-Term Care Hospital Prospective Payment System and Policy Changes and Fiscal Year 2017 Rates; Quality Reporting Requirements for Specific Providers; Graduate Medical Education; Hospital Notification Procedures Applicable to Beneficiaries Receiving Observation Services; Technical Changes Relating to Costs to Organizations and Medicare Cost Reports; Finalization of Interim Final Rules With Comment Period on LTCH PPS Payments for Severe Wounds, Modifications of Limitations on Redesignation by the Medicare Geographic Classification Review Board, and Extensions of Payments to MDHs and Low-Volume Hospitals; Correction AGENCY:

Centers for Medicare & Medicaid Services (CMS), HHS.

ACTION:

Final rule; correction.

SUMMARY:

This document corrects a typographical error in the final rule that appeared in the August 22, 2016 Federal Register as well as additional typographical errors in a related correction to that rule that appeared in the October 5, 2016 Federal Register. The final rule was titled “Medicare Program; Hospital Inpatient Prospective Payment Systems for Acute Care Hospitals and the Long-Term Care Hospital Prospective Payment System and Policy Changes and Fiscal Year 2017 Rates; Quality Reporting Requirements for Specific Providers; Graduate Medical Education; Hospital Notification Procedures Applicable to Beneficiaries Receiving Observation Services; Technical Changes Relating to Costs to Organizations and Medicare Cost Reports; Finalization of Interim Final Rules With Comment Period on LTCH PPS Payments for Severe Wounds, Modifications of Limitations on Redesignation by the Medicare Geographic Classification Review Board, and Extensions of Payments to MDHs and Low-Volume Hospitals”.

DATES:

Effective Date: This correcting document is effective on October 28, 2016.

Applicability Date: This correcting document is applicable for discharges beginning October 1, 2016.

FOR FURTHER INFORMATION CONTACT:

Donald Thompson, (410) 786-4487.

SUPPLEMENTARY INFORMATION:

I. Background

In the final rule which appeared in the August 22, 2016 Federal Register (81 FR 56761) entitled “Medicare Program; Hospital Inpatient Prospective Payment Systems for Acute Care Hospitals and the Long Term Care Hospital Prospective Payment System and Policy Changes and Fiscal Year 2017 Rates; Quality Reporting Requirements for Specific Providers; Graduate Medical Education; Hospital Notification Procedures Applicable to Beneficiaries Receiving Observation Services; Technical Changes Relating to Costs to Organizations and Medicare Cost Reports; Finalization of Interim Final Rules with Comment Period on LTCH PPS Payments for Severe Wounds, Modifications of Limitations on Redesignation by the Medicare Geographic Classification Review Board, and Extensions of Payments to MDHs and Low Volume Hospitals” (hereinafter referred to as the FY 2017 IPPS/LTCH PPS final rule), there were a number of technical and typographical errors. To correct the typographical and technical errors in the FY 2017 IPPS/LTCH PPS final rule, we published a correcting document that appeared in the October 5, 2016 Federal Register (81 FR 68947) (hereinafter referred to as the FY 2017 IPPS/LTCH PPS correcting document).

II. Summary of Errors A. Summary of Errors in the FY 2017 IPPS/LTCH PPS Final Rule

On page 57105, we inadvertently made a typographical error in defining an MSA-dominant hospital.

B. Summary of Errors in the FY 2017 IPPS/LTCH PPS Correcting Document

On page 68953 in the table titled “CHANGE OF FY 2016 STANDARDIZED AMOUNTS TO THE FY 2017 STANDARDIZED AMOUNTS,” we inadvertently made a typographical error in the Labor figure for the “National Standardized Amount for FY 2017 if Wage Index is Greater than 1.0000; Labor/Non-Labor Share Percentage (69.6/30.4)” under the classification of “Hospital did NOT submit quality data and is a meaningful EHR user”.

On page 68955 in the table titled “Table 1A—NATIONAL ADJUSTED OPERATING STANDARDIZED AMOUNTS, LABOR/NONLABOR (69.6 PERCENT LABOR SHARE/30.4 PERCENT NONLABOR SHARE IF WAGE INDEX IS GREATER THAN 1)—FY 2017,” we inadvertently made a typographical error in the Nonlabor figure under the classification of “Hospital submitted quality data and is a meaningful EHR user (update = 1.65 percent)”.

On page 68958 in the table titled “FY 2017 IPPS ESTIMATED PAYMENTS DUE TO RURAL AND IMPUTED FLOOR WITH NATIONAL BUDGET NEUTRALITY,” we made errors in the alignment of the data in the fourth column titled “Difference (in $ millions)”. Specifically, when creating the table in the correcting document, the data in the fourth column was inadvertently misaligned starting with the entry for Washington, DC and continuing to the end, resulting in incorrect values in that column.

III. Waiver of Proposed Rulemaking and Delay in Effective Date

We ordinarily publish a notice of proposed rulemaking in the Federal Register to provide a period for public comment before the provisions of a rule take effect, in accordance with section 553(b) of the Administrative Procedure Act (APA) (5 U.S.C. 553(b)). However, we can waive this notice and comment procedure if the Secretary finds, for good cause, that the notice and comment process is impracticable, unnecessary, or contrary to the public interest, and incorporates a statement of the finding and the reasons therefor in the notice.

Section 553(d) of the APA ordinarily requires a 30-day delay in the effective date of final rules after the date of their publication in the Federal Register. This 30-day delay in effective date can be waived, however, if an agency finds for good cause that the delay is impracticable, unnecessary, or contrary to the public interest, and the agency incorporates a statement of the findings and its reasons in the rule issued.

We believe that this correcting document does not constitute a rule that would be subject to the APA notice and comment or delayed effective date requirements. This correcting document corrects typographical errors in the FY 2017 IPPS/LTCH PPS final rule and the FY 2017 IPPS/LTCH PPS correcting document but does not make substantive changes to the policies or payment methodologies that were adopted in the final rule. As a result, this correcting document is intended to ensure that the information in the FY 2017 IPPS/LTCH PPS final rule accurately reflects the policies adopted in that final rule.

In addition, even if this were a rule to which the notice and comment procedures and delayed effective date requirements applied, we find that there is good cause to waive such requirements. Undertaking further notice and comment procedures to incorporate the corrections in this document into the final rule or delaying the effective date would be contrary to the public interest because it is in the public's interest for providers to receive appropriate payments in as timely a manner as possible, and to ensure that the FY 2017 IPPS/LTCH PPS final rule accurately reflects our policies. Furthermore, such procedures would be unnecessary, as we are not altering our payment methodologies or policies, but rather, we are simply implementing correctly the policies that we previously proposed, received comment on, and subsequently finalized. This correcting document is intended solely to ensure that the FY 2017 IPPS/LTCH PPS final rule accurately reflects these payment methodologies and policies. Therefore, we believe we have good cause to waive the notice and comment and effective date requirements.

IV. Correction of Errors A. Correction of Errors in the Final Rule

In FR Doc. 2016-18476 of August 22, 2016 (81 FR 56761), we are making the following correction:

1. On page 57105, first column, first partial paragraph, lines 6 and 7, the phrase “total hospital's Medicare discharges” is corrected to read “total hospital Medicare discharges”.

B. Correction of Errors in the Correcting Document

In FR Doc. 2016-24042 of October 5, 2016 (81 FR 68947), we are making the following corrections:

1. On pages 68952 through 68954 in the table titled, “CHANGE OF FY 2016 STANDARDIZED AMOUNTS TO THE FY 2017 STANDARDIZED AMOUNTS”, the last entry on page 68953 is corrected to read as follows:

National Standardized Amount for FY 2017 if Wage Index is Greater Than 1.0000; Labor/Non-Labor Share Percentage (69.6/30.4):

Hospital submitted quality data and is a meaningful EHR user: Labor $3,839.23; Nonlabor $1,676.91.

Hospital submitted quality data and is NOT a meaningful EHR user: Labor $3,762.75; Nonlabor $1,643.50.

Hospital did NOT submit quality data and is a meaningful EHR user: Labor $3,813.74; Nonlabor $1,665.77.

Hospital did NOT submit quality data and is NOT a meaningful EHR user: Labor $3,737.25; Nonlabor $1,632.37.

2. On page 68955, top of the page in the table titled, “Table 1A—NATIONAL ADJUSTED OPERATING STANDARDIZED AMOUNTS, LABOR/NONLABOR (69.6 PERCENT LABOR SHARE/30.4 PERCENT NONLABOR SHARE IF WAGE INDEX IS GREATER THAN 1)—FY 2017”, the first column of the table is corrected to read as follows:

Hospital submitted quality data and is a meaningful EHR User (update = 1.65 percent): Labor $3,839.23; Nonlabor $1,676.91.

    3. On page 68958, top of the page, the table titled, “FY 2017 IPPS ESTIMATED PAYMENTS DUE TO RURAL AND IMPUTED FLOOR WITH NATIONAL BUDGET NEUTRALITY” is corrected to read as follows:

FY 2017 IPPS Estimated Payments Due to Rural and Imputed Floor With National Budget Neutrality

State | Number of hospitals (1) | Number of hospitals that will receive the rural floor or imputed floor (2) | Percent change in payments due to application of rural floor and imputed floor with budget neutrality (3) | Difference (in $ millions) (4)
Alabama | 83 | 6 | −0.3 | −6
Alaska | 6 | 4 | 2.1 | 4
Arizona | 57 | 46 | 3.5 | 63
Arkansas | 44 | 0 | −0.4 | −4
California | 301 | 186 | 1.3 | 131
Colorado | 48 | 3 | 0.2 | 3
Connecticut | 31 | 8 | 0.2 | 4
Delaware | 6 | 2 | 0 | 0
Washington, DC | 7 | 0 | −0.4 | −2
Florida | 171 | 16 | −0.3 | −18
Georgia | 105 | 0 | −0.4 | −10
Hawaii | 12 | 0 | −0.3 | −1
Idaho | 14 | 0 | −0.3 | −1
Illinois | 126 | 3 | −0.4 | −19
Indiana | 89 | 0 | −0.4 | −11
Iowa | 35 | 0 | −0.4 | −4
Kansas | 53 | 0 | −0.3 | −3
Kentucky | 65 | 0 | −0.4 | −6
Louisiana | 95 | 2 | −0.4 | −5
Maine | 18 | 0 | −0.4 | −2
Massachusetts | 58 | 15 | 0.6 | 22
Michigan | 95 | 0 | −0.4 | −18
Minnesota | 49 | 0 | −0.3 | −6
Mississippi | 62 | 0 | −0.4 | −4
Missouri | 74 | 2 | −0.3 | −8
Montana | 12 | 4 | 0.3 | 1
Nebraska | 26 | 0 | −0.3 | −2
Nevada | 24 | 3 | −0.2 | −2
New Hampshire | 13 | 9 | 2.2 | 11
New Jersey | 64 | 18 | 0.2 | 6
New Mexico | 25 | 0 | −0.3 | −1
New York | 154 | 21 | −0.3 | −20
North Carolina | 84 | 1 | −0.4 | −12
North Dakota | 6 | 1 | −0.3 | −1
Ohio | 130 | 10 | −0.4 | −13
Oklahoma | 86 | 2 | −0.3 | −4
Oregon | 34 | 2 | −0.4 | −4
Pennsylvania | 151 | 5 | −0.4 | −20
Puerto Rico | 51 | 12 | 0.1 | 0
Rhode Island | 11 | 10 | 4.7 | 18
South Carolina | 57 | 5 | −0.1 | −2
South Dakota | 18 | 0 | −0.2 | −1
Tennessee | 92 | 20 | −0.3 | −7
Texas | 320 | 3 | −0.4 | −26
Utah | 33 | 1 | −0.3 | −2
Vermont | 6 | 0 | −0.2 | −1
Virginia | 76 | 1 | −0.3 | −8
Washington | 49 | 6 | −0.1 | −1
West Virginia | 29 | 3 | −0.2 | −1
Wisconsin | 65 | 6 | −0.3 | −5
Wyoming | 10 | 0 | −0.1 | 0
    Dated: October 26, 2016. Madhura Valverde, Executive Secretary to the Department, Department of Health and Human Services.
    [FR Doc. 2016-26182 Filed 10-28-16; 8:45 am] BILLING CODE 4120-01-P
    LEGAL SERVICES CORPORATION 45 CFR Part 1602 Procedures for Disclosure of Information Under the Freedom of Information Act AGENCY:

    Legal Services Corporation.

    ACTION:

    Final rule, request for comments.

    SUMMARY:

    The Legal Services Corporation (LSC) is publishing for public comment a proposed final rule to implement the statutorily required amendments in the FOIA Improvement Act of 2016. LSC is also making technical changes to Part 1602 to improve the structure and clarity of its Freedom of Information Act (FOIA) regulations.

    DATES:

    The final rule is effective on December 15, 2016, unless LSC receives substantive adverse comments during the comment period. Written comments will be accepted until November 30, 2016.

    ADDRESSES:

    You may submit comments by any of the following methods:

    Email: [email protected]. Include “Part 1602 Proposed Final Rule” in the subject line of the message.

    Fax: (202) 337-6519, ATTN: Helen Guyton, Part 1602 Proposed Final Rule.

    Mail/Hand Delivery/Courier: Helen Guyton, Assistant General Counsel, Legal Services Corporation, 3333 K Street NW., Washington, DC 20007, ATTN: Part 1602 Proposed Final Rule.

    Instructions: Electronic submissions are preferred via email with attachments in Acrobat PDF format. LSC may not consider written comments sent via any other method or received after the end of the comment period.

    FOR FURTHER INFORMATION CONTACT:

    Helen Gerostathos Guyton, Assistant General Counsel, Legal Services Corporation, 3333 K Street NW., Washington, DC 20007, (202) 295-1632 (phone), (202) 337-6519 (fax), [email protected].

    SUPPLEMENTARY INFORMATION: I. Background

    LSC is subject to the FOIA by the terms of the Legal Services Corporation Act. 42 U.S.C. 2996d(g). LSC has implemented FOIA by adopting regulations that contain the rules and procedures LSC will follow in making its records available to the public. LSC last amended its FOIA regulations in 2008. 73 FR 67791, Dec. 31, 2008.

    On June 30, 2016, President Obama signed into law the FOIA Improvement Act of 2016 (“2016 Amendments” or the “Act”). The Act codifies a number of transparency and openness principles and enacts housekeeping measures designed to facilitate FOIA requests and production. LSC must review its current regulations and issue revised regulations on procedures for the disclosure of records consistent with the Act no later than December 27, 2016. The revised regulations described in this final rule reflect the required changes prescribed by the Act. LSC also identified and proposed technical changes to clarify the language and update the structure of its FOIA regulations.

    In light of the deadline established by Congress, LSC management requested that the Operations and Regulations Committee (Committee) recommend that the Board authorize expedited rulemaking and publication of this final rule. On October 16, 2016, the Committee considered the request and voted to make the recommendation to the Board. On October 18, 2016, the Board voted to authorize expedited rulemaking and the publication of this final rule.

    II. Section-by-Section Analysis § 1602.1 Purpose

    There are no proposed changes to this section.

    § 1602.2 Definitions

    LSC modified several existing definitions, deleted one definition, and added five new definitions to make its regulations clearer. Specifically, LSC amended the Definitions section as follows:

    Duplication. LSC is modifying this definition to require the release of records “in a form appropriate for release.” This change complies with FOIA guidance that records be released in the format requested, where possible.

    LSC. LSC is replacing all references to “the Corporation” with “LSC” for simplicity. LSC is introducing this definition to make clear that, unless otherwise specified, references to LSC in this rule include both the Corporation and LSC's Office of Inspector General.

    Office. LSC is adding this definition in order to simplify references to the Office of Inspector General and/or the Office of Legal Affairs, where appropriate.

    Office of Inspector General records. LSC is deleting this definition because the general definition of “Records” includes the Office of Inspector General records, making this definition redundant.

    Person. LSC's current regulations do not define person. To address this gap, LSC is adding a definition modeled after the definition of person contained in the FOIA, 5 U.S.C. 551(2).

Records. LSC is modifying the definition of this term to comport with the definition of records in LSC's Records Management Policy, which was updated in September 2015. It also incorporates Office of Inspector General records, which were previously defined separately.

    Rule. LSC's FOIA regulations cite to personnel rules, rules of procedure, and substantive rules, but do not define the term rule. To address this gap, LSC has added a definition of rule modeled on the definition contained in the FOIA, 5 U.S.C. 551(4).

    Submitter. On February 14, 2003, LSC published in the Federal Register a final rule adding provisions for a submitter's rights process to its FOIA regulations. 68 FR 7433, Feb. 14, 2003. These provisions were modeled after the process outlined in Executive Order No. 12,600 (June 23, 1987). The 2003 final rule limited submitter solely to any person or entity from whom LSC receives grant application records. LSC is now expanding the definition of submitter to include “any person or applicant for funds who provides confidential commercial information to LSC.” This definition more closely conforms with the spirit of E.O. 12,600 and ensures that submitters who may have an interest in the protection of their confidential commercial information are properly notified.

    Confidential Commercial Information. LSC is adding a definition of “Confidential Commercial Information” modeled on the definition in E.O. 12,600, to comport with the new definition of “Submitter” described above.

    § 1602.3 Policy

    LSC is making minor technical edits to clarify this section.

    § 1602.4 Records Published in the Federal Register

    LSC is making minor technical edits to clarify this section.

    § 1602.5 Public Reading Room

    This section sets out the process by which LSC makes available for public inspection the records described in the FOIA, 5 U.S.C. 552(a)(2). In the current version of its FOIA regulations, LSC sets out the specific categories of records that must be publicly disclosed. LSC is deleting those specific provisions and replacing them with a broader reference to § 552(a)(2) generally in anticipation of implementing the “Release to One, Release to All” policy.

    The Department of Justice Office of Information Policy launched a pilot program as part of its Open Government Initiative called “Release to One, Release to All.” Under this policy, agencies would release FOIA processed records not only to a requester, but to the public at large by posting them online. LSC intends to comply with this policy immediately. As a result, it is revising the description of records in this section to track what LSC actually will be disclosing upon implementation of the “Release to One, Release to All” policy.

    LSC is also making minor technical revisions to clarify this section.

    § 1602.6 Procedures for Using the Public Reading Room

    LSC is adding a provision to this section that will provide requesters with onsite computer and printer access to electronic reading room records. This provision is consistent with federal agency practice and provides greater access to LSC's records to the public at large.

    § 1602.7 Index of Records

    LSC is updating this section to reflect its current practice of maintaining its index of records electronically.

    § 1602.8 Requests for Records

    The current version of § 1602.8 includes provisions relating to the format of requests for records, the timing of responses, and the format of responses to requests. There are no subheadings to distinguish these provisions within the section, making it difficult to follow. To improve readability, LSC is restructuring § 1602.8 by limiting the section solely to provisions related to the format of FOIA requests. LSC is also adding a provision that informs requesters of their right to specify the preferred form or format for the records sought and that requires requesters to provide their contact information to assist LSC in communicating with them about their request.

    § 1602.9 Timing and Responses to Requests for Records

This is a new section. As described in the discussion of § 1602.8, LSC determined that it would be clearer if the provisions for timing and responses to requests were contained in a separate section. LSC also is making technical changes to the language and structure to improve clarity. In addition, LSC is adding provisions describing the dispute resolution processes available to the public as required by the 2016 Amendments. These provisions describe when a requester may seek assistance, including dispute resolution services, from an LSC FOIA Public Liaison or the U.S. National Archives and Records Administration's Office of Government Information Services.

    § 1602.10 Exemptions for Withholding Records

    LSC is amending this section to incorporate the 2016 Amendments' codification of the Department of Justice's foreseeable harm standard, which requires LSC to withhold information only if disclosure would harm an interest protected by an exemption or prohibited by law. It further obligates LSC to consider whether partial disclosure of information is possible when full disclosure is not and to take reasonable steps to segregate and release nonexempt information.

    In addition, LSC is modifying its rule regarding the applicability of the deliberative process privilege, as required by the 2016 Amendments. The privilege now applies only to records created within 25 years of the date on which the records were requested.

Finally, LSC is adding exemptions 1, 8, and 9 from 5 U.S.C. 552(b) to its regulations. While these exemptions, which deal with national security, financial institutions, and geological information, generally do not apply to the work of LSC, their absence caused confusion because LSC's exemption numbers did not track the commonly used exemption numbers found in both the FOIA and case law. This change will eliminate any confusion.

    § 1602.11 Officials Authorized To Grant or Deny Requests for Records

    LSC is deleting paragraph (a) of this section, which describes the role of the General Counsel in adequately and consistently applying the provisions of this part within LSC. The 2016 Amendments establish the role of the Chief FOIA Officer in ensuring compliance with FOIA, thereby superseding LSC's current regulations.

    § 1602.12 Denials

LSC is adding a provision to this section requiring LSC to notify the requester, in its denial decisions, of his or her right to seek dispute resolution services from LSC's FOIA Public Liaison or the Office of Government Information Services.

    § 1602.13 Appeals of Denials

LSC is making minor technical edits to clarify this section. LSC is also adding a provision required by the 2016 Amendments. This provision requires LSC to notify a requester of the mediation services offered by the Office of Government Information Services as a non-exclusive alternative to litigation.

    § 1602.14 Fees

    LSC is adding a provision to this section that prohibits LSC from assessing fees if its response time is delayed, subject to limited exceptions described in the 2016 Amendments. LSC is also deleting references to the specific dollar amounts it will charge for search and reproduction costs because they are outdated and providing instead the web address for its FOIA page, which will contain current fee and cost schedules.

    § 1602.15 Submitter's Rights Process

    As previously described in the discussion of § 1602.2's definition of the term submitter, LSC is expanding the submitter's rights process to include “any person or applicant for funds who provides confidential commercial information to LSC.” This definition more closely conforms with the spirit of E.O. 12,600 and ensures that submitters who may have an interest in the protection of their confidential information are properly notified.

    LSC is further modifying this section to include a right to appeal to the Inspector General for Office of Inspector General-related requests, as the current regulations do not provide a mechanism to do so.

Finally, LSC is clarifying an ambiguous provision that requires a submitter to provide to LSC, within seven days, a statement objecting to disclosure of his or her information. LSC must receive the submitter's statement within seven days of the date of LSC's notice to the submitter.

    List of Subjects in 45 CFR Part 1602

    Freedom of Information.

For the reasons stated in the preamble, revise 45 CFR part 1602 to read as follows:

PART 1602—PROCEDURES FOR DISCLOSURE OF INFORMATION UNDER THE FREEDOM OF INFORMATION ACT

Sec.
1602.1 Purpose.
1602.2 Definitions.
1602.3 Policy.
1602.4 Records published in the Federal Register.
1602.5 Public reading room.
1602.6 Procedures for use of public reading room.
1602.7 Index of records.
1602.8 Requests for records.
1602.9 Timing and responses to requests for records.
1602.10 Exemptions for withholding records.
1602.11 Officials authorized to grant or deny requests for records.
1602.12 Denials.
1602.13 Appeals of denials.
1602.14 Fees.
1602.15 Submitter's rights process.

Authority:

    42 U.S.C. 2996g(e)

    § 1602.1 Purpose.

    This part contains the rules and procedures the Legal Services Corporation (LSC) follows in making records available to the public under the Freedom of Information Act.

    § 1602.2 Definitions.

    (a) Commercial use request means a request from or on behalf of one who seeks information for a use or purpose that furthers the commercial, trade, or profit interests of the requester or the person on whose behalf the request is made. In determining whether a requester properly belongs in this category, LSC will look to the use to which a requester will put the documents requested. When LSC has reasonable cause to doubt the requester's stated use of the records sought, or where the use is not clear from the request itself, it will seek additional clarification before assigning the request to a category.

    (b) Confidential commercial information means records provided to LSC by a submitter that arguably contain material exempt from release under Exemption 4 of the FOIA, 5 U.S.C. 552(b)(4), because disclosure could reasonably be expected to cause substantial competitive harm.

    (c) Duplication means the process of making a copy of a requested record pursuant to this part in a form appropriate for release in response to a FOIA request.

    (d) Educational institution means a preschool, a public or private elementary or secondary school, an institution of undergraduate or graduate higher education, or an institution of professional or vocational education which operates a program or programs of scholarly research.

    (e) FOIA means the Freedom of Information Act, 5 U.S.C. 552.

    (f) LSC means the Legal Services Corporation. Unless explicitly stated otherwise, LSC includes the Office of Inspector General.

    (g) Non-commercial scientific institution means an institution that is not operated on a commercial basis and which is operated solely for the purpose of conducting scientific research, the results of which are not intended to promote any particular product or industry.

    (h) Office refers to the Office of Legal Affairs and/or the Office of Inspector General (OIG).

    (i) Person includes an individual, partnership, corporation, association, or public or private organization other than LSC.

    (j) Records are any type of information made or received by LSC or the OIG for purposes of transacting LSC or OIG business and preserved by LSC or the OIG (either directly or maintained by a third party under contract to LSC or the OIG for records management purposes) regardless of form (e.g., paper or electronic, formal or informal, copies or original) as evidence of LSC's or OIG's organization, functions, policies, decisions, procedures, operations, or other activities of LSC or the OIG or because the Record has informational value.

    (k) Representative of the news media means any person or entity that gathers information of potential interest to a segment of the public, uses its editorial skills to turn the raw materials into a distinct work, and distributes that work to an audience. In this clause, the term “news” means information that is about current events or that would be of current interest to the public. Examples of news media entities are television or radio stations broadcasting to the public at large and publishers of periodicals (but only if such entities qualify as disseminators of “news”) who make their products available for purchase or subscription or by free distribution to the general public. These examples are not all-inclusive. Moreover, as methods of news delivery evolve (for example, the adoption of the electronic dissemination of newspapers through telecommunications services), such alternative media shall be considered to be news media entities. A freelance journalist shall be regarded as working for a news media entity if the journalist can demonstrate a solid basis for expecting publication through that entity, whether or not the journalist is actually employed by the entity. A publication contract would present a solid basis for such an expectation. LSC may also consider the past publication record of the requester in making such a determination.

    (l) Review means the process of examining documents located in response to a request to determine whether any portion of any such document is exempt from disclosure. It also includes processing any such documents for disclosure. Review does not include time spent resolving general legal or policy issues regarding the application of exemptions.

    (m) Rule means the whole or a part of an LSC statement of general or particular applicability and future effect designed to implement, interpret, or prescribe law or policy or describing the organization, procedure, or practice requirements of LSC.

    (n) Search means the process of looking for and retrieving records that are responsive to a request for records. It includes page-by-page or line-by-line identification of material within documents and also includes reasonable efforts to locate and retrieve information from records maintained in electronic form or format. Searches may be conducted manually or by automated means and will be conducted in the most efficient and least expensive manner.

    (o) Submitter means any person or applicant for funds who provides confidential commercial information to LSC.

    § 1602.3 Policy.

    LSC will make records concerning its operations, activities, and business available to the public to the maximum extent reasonably possible. LSC will withhold records from the public only in accordance with the FOIA and this part. LSC will disclose records otherwise exempt from disclosure under the FOIA when disclosure is not prohibited by law and disclosure would not foreseeably harm a legitimate interest of the public, LSC, a recipient, or any individual.

    § 1602.4 Records published in the Federal Register.

    LSC routinely publishes in the Federal Register information on its basic structure and operations necessary to inform the public how to deal effectively with LSC. LSC will make reasonable efforts to currently update such information, which will include basic information on LSC's location, functions, rules of procedure, substantive rules, statements of general policy, and information regarding how the public may obtain information, make submittals or requests, or obtain decisions.

    § 1602.5 Public reading room.

    (a) LSC will maintain a public reading room at its offices at 3333 K St. NW., Washington, DC 20007. This room will be supervised and will be open to the public during LSC's regular business hours. Procedures for use of the public reading room are described in § 1602.6. LSC also maintains an electronic public reading room that may be accessed at http://www.lsc.gov/about-lsc/foia/foia-electronic-public-reading-room.

    (b) Subject to the limitation stated in paragraph (c), LSC will make available for public inspection in its electronic public reading room the records described in 5 U.S.C. 552(a)(2).

    (c) Certain records otherwise required by FOIA to be available in the public reading room may be exempt from mandatory disclosure pursuant to 5 U.S.C. 552(b) and § 1602.10. LSC will not make such records available in the public reading room. LSC may edit other records maintained in the reading room by redacting details about individuals to prevent clearly unwarranted invasions of personal privacy. In such cases, LSC will attach a full explanation of the redactions to the record. LSC will indicate the extent of the redactions unless doing so would harm an interest protected by the exemption under which the redactions are made. If technically feasible, LSC will indicate the extent of the redactions at the place in the record where the redactions were made.

    § 1602.6 Procedures for use of public reading room.

    (a) A person who wishes to inspect or copy records in the public reading room should arrange a time in advance, by telephone or letter request made to the Office of Legal Affairs, Legal Services Corporation, 3333 K Street NW., Washington, DC 20007 or by email to [email protected].

    (1) In appropriate circumstances, LSC will advise persons making telephonic requests to use the public reading room that a written request would aid in the identification and expeditious processing of the records sought.

    (2) Written requests should identify the records sought in the manner provided in § 1602.8(b) and should request a specific date for inspecting the records.

    (b) LSC will advise the requester as promptly as possible if, for any reason, it is not feasible to make the records sought available on the date requested.

    (c) A computer terminal and printer are available upon request in the public reading room for accessing Electronic Reading Room records.

    § 1602.7 Index of records.

    LSC will maintain and make available for public inspection in an electronic format a current index identifying any matter within the scope of §§ 1602.4 and 1602.5(b).

    § 1602.8 Requests for records.

    (a) LSC will make its records promptly available, upon request, to any person in accordance with this section, unless:

    (1) The FOIA requires the records to be published in the Federal Register (§ 1602.4) or to be made available in the public reading room (§ 1602.5); or

    (2) LSC determines that such records should be withheld and are exempt from mandatory disclosure under the FOIA and § 1602.10.

    (b)(1) Requests for LSC records. All requests for LSC records must be clearly marked Freedom of Information Act Request and shall be addressed to the FOIA Analyst, Office of Legal Affairs, Legal Services Corporation, 3333 K Street NW., Washington, DC 20007. Email requests shall be sent to [email protected]. Requests for LSC Records may also be made online using the FOIA Request Electronic Submission Form located at http://www.lsc.gov/about-lsc/foia.

    (2) Requests for Office of Inspector General records. All requests for records maintained by the OIG must be clearly marked Freedom of Information Act Request and shall be addressed to the FOIA Officer, Office of Inspector General, Legal Services Corporation, 3333 K Street NW., Washington, DC 20007. Email requests shall be sent to [email protected].

(3) Any request not marked and addressed as specified in this section will be so marked by LSC personnel as soon as it is properly identified, and will be forwarded immediately to the appropriate Office. A request improperly addressed will be deemed to have been received, in accordance with § 1602.9, only when it has been received by the appropriate Office. Upon receipt of an improperly addressed request, the Chief FOIA Officer, Office of Inspector General Legal Counsel, or their designees shall notify the requester of the date on which the time period began.

    (c) A request must reasonably describe the records requested so that employees of LSC who are familiar with the subject area of the request are able, with a reasonable amount of effort, to determine which particular records are within the scope of the request. Before submitting their requests, requesters may contact LSC's or OIG's FOIA Analyst or FOIA Public Liaison to discuss the records they seek and to receive assistance in describing the records. If LSC determines that a request does not reasonably describe the records sought, LSC will inform the requester what additional information is needed or why the request is otherwise insufficient. Requesters who are attempting to reformulate or modify their request may discuss their request with LSC's or OIG's FOIA Analyst or FOIA Public Liaison. If a request does not reasonably describe the records sought, LSC's response to the request may be delayed.

    (d) To facilitate the location of records by LSC, a requester should try to provide the following kinds of information, if known:

    (1) The specific event or action to which the record refers;

    (2) The unit or program of LSC which may be responsible for or may have produced the record;

    (3) The date of the record or the date or period to which it refers or relates;

    (4) The type of record, such as an application, a grant, a contract, or a report;

    (5) Personnel of LSC who may have prepared or have knowledge of the record;

    (6) Citations to newspapers or publications which have referred to the record.

    (e) Requests may specify the preferred form or format (including electronic formats) for the records sought. LSC will provide records in the form or format indicated by the requester to the extent such records are readily reproducible in the requested form or format. LSC reserves the right to limit the number of copies of any document that will be provided to any one requester or to require that special arrangements for duplication be made in the case of bound volumes or other records representing unusual problems of handling or reproduction.

    (f) Requesters must provide contact information, such as their phone number, email address, and/or mailing address, to assist LSC in communicating with them and providing released records.

    (g) LSC is not required to create a record or to perform research to satisfy a request.

    (h) Any request for a waiver or reduction of fees should be included in the FOIA request, and any such request should indicate the grounds for a waiver or reduction of fees, as set out in § 1602.14(g). LSC shall respond to such request as promptly as possible.

    § 1602.9 Timing and responses to requests for records.

    (a)(1)(i) Upon receiving a request for LSC or Inspector General records under § 1602.8, the Chief FOIA Officer, Office of Inspector General Legal Counsel or their designees shall make an initial determination of whether to comply with or deny such request. The Chief FOIA Officer, Office of Inspector General Legal Counsel or their designees will send the determination to the requester within 20 business days after receipt of the request and will notify the requester of their right to seek assistance from an LSC FOIA Public Liaison.

    (ii) If the processing Office determines that a request or portion thereof is for the other Office's records, the processing Office shall promptly refer the request or portion thereof to the appropriate Office and send notice of such referral to the requester.

    (2) The 20-day period under paragraph (a)(1)(i) of this section shall commence on the date on which the request is first received by the appropriate Office, but in no event later than 10 working days after the request has been received by either the Office of Legal Affairs or the Office of Inspector General. The 20-day period shall not be tolled by the Office processing the request except that the processing Office may make one request to the requester for information pursuant to paragraph (b) of this section and toll the 20-day period while

    (i) It is awaiting such information that it has reasonably requested from the requester under this section; or

    (ii) It communicates with the requester to clarify issues regarding fee assessment. In either case, the processing Office's receipt of the requester's response to such a request for information or clarification ends the tolling period.

    (b)(1) In unusual circumstances, as specified in paragraph (b)(3) of this section, LSC may extend the time limit for up to 10 working days by written notice to the requester setting forth the reasons for such extension and the date on which LSC expects to send its determination.

(2) If a request is particularly broad or complex so that it cannot be completed within the time periods stated in paragraph (a)(1)(i) of this section, LSC may ask the requester to narrow the request or agree to an additional delay. In addition, to aid the requester, LSC shall make available a FOIA Public Liaison, who shall assist in the resolution of any disputes between the requester and LSC, and shall notify the requester of his or her right to seek dispute resolution services from the U.S. National Archives and Records Administration's Office of Government Information Services.

    (3) Unusual circumstances. As used in this part, unusual circumstances are limited to the following, but only to the extent reasonably necessary for the proper processing of the particular request:

    (i) The need to search for and collect the requested records from establishments that are separate from the office processing the request;

    (ii) The need to search for, collect, and appropriately examine a voluminous amount of separate and distinct records which are demanded in a single request; or

    (iii) The need for consultation, which shall be conducted with all practicable speed, with another agency or organization, such as a recipient, having a substantial interest in the determination of the request or among two or more components of LSC having substantial subject matter interest therein.

    (c)(1) When the processing Office cannot send a determination to the requester within the applicable time limit, the Chief FOIA Officer, Office of the Inspector General Legal Counsel, or their designees shall inform the requester of the reason for the delay, the date on which the processing Office expects to send its determination, and the requester's right to treat the delay as a denial and to appeal to LSC's President or Inspector General, in accordance with § 1602.13, or to seek dispute resolution services from a FOIA Public Liaison or the Office of Government Information Services.

    (2) If the processing Office has not sent its determination by the end of the 20-day period or the last extension thereof, the requester may deem the request denied, and exercise a right of appeal in accordance with § 1602.13, or seek dispute resolution services from LSC's or OIG's FOIA Public Liaison or the National Archives and Records Administration's Office of Government Information Services. The Chief FOIA Officer, Office of Inspector General Legal Counsel, or their designees may ask the requester to forego appeal until a determination is made.

    (d) After the processing Office determines that a request will be granted, LSC or the OIG will act with due diligence in providing a substantive response.

    (e)(1) Expedited treatment. Requests and appeals will be taken out of order and given expedited treatment whenever the requester demonstrates a compelling need. A compelling need means:

    (i) Circumstances in which the lack of expedited treatment could reasonably be expected to pose an imminent threat to the life or physical safety of an individual;

    (ii) An urgency to inform the public about an actual or alleged LSC activity and the request is made by a person primarily engaged in disseminating information;

    (iii) The loss of substantial due process rights; or

    (iv) A matter of widespread and exceptional media interest raising questions about LSC's integrity which may affect public confidence in LSC.

    (2) A request for expedited processing may be made at the time of the initial request for records or at any later time. For a prompt determination, a request for expedited processing must be properly addressed and marked and received by LSC pursuant to § 1602.8.

    (3) A requester who seeks expedited processing must submit a statement demonstrating a compelling need and explaining in detail the basis for requesting expedited processing. The requester must certify that the statement is true and correct to the best of the requester's knowledge and belief.

    (4) Within 10 calendar days of receiving a request for expedited processing, the Chief FOIA Officer, Office of Inspector General Legal Counsel or their designees shall decide whether to grant the request and shall notify the requester of the decision. If a request for expedited treatment is granted, the request shall be given priority and shall be processed as soon as practicable. If a request for expedited processing is denied, the requester may appeal in writing to LSC's President or Inspector General in the format described in § 1602.13(a). Any appeal of a denial for expedited treatment shall be acted on expeditiously by LSC.

    § 1602.10 Exemptions for withholding records.

    (a) LSC shall—

    (1) Withhold information under this section only if—

    (i) LSC reasonably foresees that disclosure would harm an interest protected by an exemption described in paragraph (b); or

    (ii) Disclosure is prohibited by law; and

    (2)(i) Consider whether partial disclosure of information is possible whenever LSC determines that a full disclosure of a requested record is not possible; and

(ii) Take reasonable steps necessary to segregate and release nonexempt information.

    (b) LSC may withhold a requested record from public disclosure only if one or more of the following exemptions authorized by the FOIA apply:

    (1)(i) Matter that is specifically authorized under criteria established by an Executive order to be kept secret in the interest of national defense or foreign policy and

    (ii) Is in fact properly classified pursuant to such Executive Order;

    (2) Matter that is related solely to the internal personnel rules and practices of LSC;

    (3) Matter that is specifically exempted from disclosure by statute (other than the exemptions under FOIA at 5 U.S.C. 552(b)), provided that such statute requires that the matters be withheld from the public in such a manner as to leave no discretion on the issue, or establishes particular criteria for withholding, or refers to particular types of matters to be withheld;

    (4) Trade secrets and commercial or financial information obtained from a person and privileged or confidential;

    (5) Inter-agency or intra-agency memoranda or letters that would not be available by law to a party other than an agency in litigation with the Corporation, provided that the deliberative process privilege shall not apply to records created 25 years or more before the date on which the records were requested;

    (6) Personnel and medical files and similar files, the disclosure of which would constitute a clearly unwarranted invasion of personal privacy;

    (7) Records or information compiled for law enforcement purposes, including enforcing the Legal Services Corporation Act or any other law, but only to the extent that the production of such law enforcement records or information:

    (i) Could reasonably be expected to interfere with enforcement proceedings;

    (ii) Would deprive a person or a recipient of a right to a fair trial or an impartial adjudication;

    (iii) Could reasonably be expected to constitute an unwarranted invasion of personal privacy;

    (iv) Could reasonably be expected to disclose the identity of a confidential source, including a State, local, or foreign agency or authority or any private institution that furnished information on a confidential basis, and in the case of a record or information compiled by a criminal law enforcement authority in the course of a criminal investigation, information furnished by a confidential source;

    (v) Would disclose techniques and procedures for law enforcement investigations or prosecutions, or would disclose guidelines for law enforcement investigations or prosecutions if such disclosure could reasonably be expected to risk circumvention of the law; or

    (vi) Could reasonably be expected to endanger the life or physical safety of any individual;

    (8) Matter that is contained in or related to examination, operating, or condition reports prepared by, on behalf of, or for the use of an agency responsible for the regulation or supervision of financial institutions; or

    (9) Geological and geophysical information and data, including maps, concerning wells.

    (c) In the event that one or more of the exemptions in paragraph (b) of this section applies, any reasonably segregable portion of a record shall be provided to the requester after redaction of the exempt portions. The amount of information redacted and the exemption under which the redaction is being made shall be indicated on the released portion of the record, unless doing so would harm the interest protected by the exemption under which the redaction is made. If technically feasible, the amount of information redacted and the exemption under which the redaction is being made shall be indicated at the place in the record where the redaction occurs.

    (d) No requester shall have a right to insist that any or all of the techniques in paragraph (c) of this section should be employed in order to satisfy a request.

    (e) Records that may be exempt from disclosure pursuant to paragraph (b) of this section may be made available at the discretion of the LSC official authorized to grant or deny the request for records, after appropriate consultation as provided in § 1602.11. Records may be made available pursuant to this paragraph when disclosure is not prohibited by law and does not appear adverse to legitimate interests of LSC, the public, a recipient, or any person.

    § 1602.11 Officials authorized to grant or deny requests for records.

    (a) The Chief FOIA Officer, the Office of Inspector General Legal Counsel, or their designees are authorized to grant or deny requests under this part. In the absence of an Office of Inspector General Legal Counsel, the Inspector General shall name a designee who will be authorized to grant or deny requests under this part and who will perform all other functions of the Office of Inspector General Legal Counsel under this part.

    (b)(1) The Chief FOIA Officer or designee shall consult with the Office of Inspector General Legal Counsel or designee prior to granting or denying any request for records or portions of records which originated with the OIG, or which contain information which originated with the OIG, but which are maintained by other components of LSC.

    (2) The Office of Inspector General Legal Counsel or designee shall consult with the Chief FOIA Officer or designee prior to granting or denying any request for records or portions of records which originated with any component of LSC other than the OIG, or which contain information which originated with a component of LSC other than the OIG, but which are maintained by the OIG.

    § 1602.12 Denials.

    (a) A denial of a written request for a record that complies with the requirements of § 1602.8 shall be in writing and shall include the following:

    (1) A reference to the applicable exemption or exemptions in § 1602.10(b) upon which the denial is based;

    (2) An explanation of how the exemption applies to the requested records;

    (3) A statement explaining why it is deemed unreasonable to provide segregable portions of the record after deleting the exempt portions;

    (4) An estimate of the volume of requested matter denied unless providing such estimate would harm the interest protected by the exemption under which the denial is made;

    (5) The name and title of the person or persons responsible for denying the request;

    (6) An explanation of the right to appeal the denial and of the procedures for submitting an appeal, as described in § 1602.13, including the address of the official to whom appeals should be submitted; and

    (7) An explanation of the right of the requester to seek dispute resolution services from a FOIA Public Liaison or the Office of Government Information Services.

    (b) Whenever LSC makes a record available subject to the deletion of a portion of the record, such action shall be deemed a denial of a record for purposes of paragraph (a) of this section.

    (c) All denials shall be treated as final opinions under § 1602.5(b)(1).

    § 1602.13 Appeals of denials.

    (a) Any person whose written request has been denied is entitled to appeal the denial within 90 days of the date of the response by writing to the President of LSC or, in the case of a denial of a request for OIG records, the Inspector General, at the mailing or email addresses given in § 1602.8(b)(1) and (2). The envelope and letter or email appeal should be clearly marked: “Freedom of Information Appeal.” An appeal need not be in any particular form, but should adequately identify the denial, if possible, by describing the requested record, identifying the official who issued the denial, and providing the date on which the denial was issued.

    (b) No personal appearance, oral argument, or hearing will ordinarily be permitted on appeal of a denial. Upon request and a showing of special circumstances, however, this limitation may be waived and an informal conference may be arranged with the President, Inspector General or their designees for this purpose.

    (c) The decision of the President or the Inspector General on an appeal shall be in writing and, in the event the denial is in whole or in part upheld, shall contain an explanation responsive to the arguments advanced by the requester, the matters described in § 1602.12(a)(1) through (4), and the provisions for judicial review of such decision under 5 U.S.C. 552(a)(4). The decision must also notify the requester of the mediation services offered by the National Archives and Records Administration's Office of Government Information Services as a non-exclusive alternative to litigation.

    (d) LSC will send its decision to the requester within 20 business days after receipt of the appeal, unless an additional period is justified due to unusual circumstances, as described in § 1602.9, in which case LSC may extend the time limit for up to 10 working days by written notice to the requester setting forth the reasons for such extension and the date on which LSC expects to send its determination. The decision of the President or the Inspector General shall constitute the final action of LSC. All such decisions shall be treated as final opinions under § 1602.5(b)(1).

    (e) On an appeal, the President or designee shall consult with the OIG prior to reversing in whole or in part the denial of any request for records or portions of records which originated with the OIG, or which contain information which originated with the OIG, but which are maintained by other components of LSC. The Inspector General or designee shall consult with the President prior to reversing in whole or in part the denial of any request for records or portions of records which originated with LSC, or which contain information which originated with LSC, but which are maintained by the OIG.

    § 1602.14 Fees.

    (a) LSC will not charge fees for information routinely provided in the normal course of doing business.

    (b)(1) When records are requested for commercial use, LSC shall limit fees to reasonable standard charges for document search, review, and duplication.

    (2) LSC shall not assess any search fees (or, if the requester is a representative of the news media, duplication fees) if LSC has failed to comply with the time limits set forth in § 1602.9 and no unusual circumstances, as defined in that section, apply.

    (3)(i) If LSC has determined that unusual circumstances as defined in § 1602.9 apply and LSC has provided timely written notice to the requester in accordance with § 1602.9(b)(1), a failure described in § 1602.9(c)(2) is excused for an additional 10 days. If LSC fails to comply with the extended time limit, LSC may not assess any search fees (or, if the requester is a representative of the news media, duplication fees).

    (ii) If LSC has determined that unusual circumstances as defined in § 1602.9 apply and more than 5,000 pages are necessary to respond to the request, LSC may charge search fees or duplication fees if LSC has provided a timely written notice to the requester in accordance with § 1602.9 and LSC has discussed with the requester via written mail, electronic mail, or telephone (or made not less than three good-faith attempts to do so) how the requester could effectively limit the scope of the request in accordance with § 1602.9.

    (c) When records are sought by a representative of the news media or by an educational or non-commercial scientific institution, LSC shall limit fees to reasonable standard charges for document duplication after the first 100 pages; and

    (d) For all other requests, LSC shall limit fees to reasonable standard charges for search time after the first 2 hours and duplication after the first 100 pages.

    (e) The schedule of charges and fees for services regarding the production or disclosure of the Corporation's records may be viewed on LSC's FOIA home page at http://www.lsc.gov/about-lsc/foia.

    (f) LSC may charge for time spent searching even if it does not locate any responsive records or it withholds the records located as exempt from disclosure.

    (g) Fee waivers. A requester may seek a waiver or reduction of the fees established under paragraph (e) of this section. A fee waiver or reduction request will be granted where LSC has determined that the requester has demonstrated that disclosure of the information is in the public interest because it is likely to contribute significantly to public understanding of the operations of LSC and is not primarily in the commercial interest of the requester.

    (1) In order to determine whether disclosure of the information is in the public interest because it is likely to contribute significantly to public understanding of the operations or activities of LSC, LSC shall consider the following four factors:

    (i) The subject of the request: Whether the subject of the requested records concerns “the operations or activities of LSC.” The subject of the requested records must concern identifiable operations or activities of LSC, with a connection that is direct and clear, not remote or attenuated.

    (ii) The informative value of the information to be disclosed: Whether the disclosure is “likely to contribute” to an understanding of LSC operations or activities. The requested records must be meaningfully informative about LSC operations or activities in order to be likely to contribute to an increased public understanding of those operations or activities. The disclosure of information that is already in the public domain, in either a duplicative or a substantially identical form, would not be likely to contribute to such understanding where nothing new would be added to the public's understanding.

    (iii) The contribution to an understanding of the subject by the public likely to result from disclosure: Whether disclosure of the requested records will contribute to “public understanding.” The disclosure must contribute to a reasonably broad audience of persons interested in the subject, as opposed to the personal interest of the requester. A requester's expertise in the subject area and ability and intention to effectively convey information to the public shall be considered. LSC shall presume that a representative of the news media will satisfy this consideration.

    (iv) The significance of the contribution to public understanding: Whether the disclosure is likely to contribute “significantly” to public understanding of LSC operations or activities. The disclosure must enhance the public's understanding of the subject in question to a significant extent.

    (2) In order to determine whether disclosure of the information is not primarily in the commercial interest of the requester, LSC will consider the following two factors:

    (i) The existence and magnitude of a commercial interest: Whether the requester has a commercial interest that would be furthered by the requested disclosure. LSC shall consider any commercial interest of the requester (with reference to the definition of “commercial use” in this part) or of any person on whose behalf the requester may be acting, that would be furthered by the requested disclosure.

    (ii) The primary interest in disclosure: Whether the magnitude of the identified commercial interest is sufficiently large, in comparison with the public interest in disclosure, that disclosure is “primarily” in the commercial interest of the requester. A fee waiver or reduction is justified where the public interest is of greater magnitude than is any identified commercial interest in disclosure. LSC ordinarily shall presume that where a news media requester has satisfied the public interest standard, the public interest will be the interest primarily served by disclosure to that requester. Disclosure to data brokers or others who merely compile and market government information for direct economic return shall not be presumed primarily to serve a public interest.

    (3) Where LSC has determined that a fee waiver or reduction request is justified for only some of the records to be released, LSC shall grant the fee waiver or reduction for those records.

    (4) Requests for fee waivers and reductions shall be made in writing and must address the factors listed in this paragraph as they apply to the request.

    (h) Requesters must agree to pay all fees charged for services associated with their requests. LSC will assume that requesters agree to pay all charges for services associated with their requests up to $25 unless otherwise indicated by the requester. For requests estimated to exceed $25, LSC will consult with the requester prior to processing the request, and such requests will not be deemed to have been received by LSC until the requester agrees in writing to pay all fees charged for services.

    (i) No requester will be required to make an advance payment of any fee unless:

    (1) The requester has previously failed to pay a required fee within 30 days of the date of billing, in which case an advance deposit of the full amount of the anticipated fee together with the fee then due plus interest accrued may be required (and the request will not be deemed to have been received by LSC until such payment is made); or

    (2) LSC determines that an estimated fee will exceed $250, in which case the requester shall be notified of the amount of the anticipated fee or such portion thereof as can readily be estimated. Such notification shall be transmitted as soon as possible, but in any event within five working days of receipt by LSC, giving the best estimate then available. The notification shall offer the requester the opportunity to confer with appropriate representatives of LSC for the purpose of reformulating the request so as to meet the needs of the requester at a reduced cost. The request will not be deemed to have been received by LSC for purposes of the initial 20-day response period until the requester makes a deposit on the fee in an amount determined by LSC.

    (j) Interest may be charged to those requesters who fail to pay the fees charged. Interest will be assessed on the amount billed, starting on the 31st day following the day on which the billing was sent. The rate charged will be as prescribed in 31 U.S.C. 3717.

    (k) If LSC reasonably believes that a requester or group of requesters is attempting to break a request into a series of requests for the purpose of evading the assessment of fees, LSC shall aggregate such requests and charge accordingly. Likewise, LSC will aggregate multiple requests for documents received from the same requester within 45 days.
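    The interaction of the fee limits and thresholds in this section can be illustrated with a short sketch. The following Python fragment is purely illustrative and is not part of the rule text: it assumes the "all other requests" category in paragraph (d) (first 2 hours of search and first 100 pages of duplication free) together with the $25 and $250 thresholds in paragraphs (h) and (i)(2). The hourly search rate and per-page duplication rate shown are hypothetical placeholders; the actual schedule of charges is the one posted on LSC's FOIA home page under paragraph (e).

        def estimate_other_requester_fee(search_hours, pages,
                                         search_rate=40.0, per_page=0.10):
            # Paragraph (d): no charge for the first 2 hours of search time
            # or the first 100 pages of duplication (rates are placeholders).
            billable_hours = max(0.0, search_hours - 2)
            billable_pages = max(0, pages - 100)
            fee = billable_hours * search_rate + billable_pages * per_page

            # Thresholds from paragraphs (h) and (i)(2).
            if fee > 250:
                action = "notify requester of estimate; deposit required before processing"
            elif fee > 25:
                action = "consult with the requester before processing"
            else:
                action = "process; agreement to pay up to $25 is assumed"
            return fee, action

        # Hypothetical example: 6 hours of search time and 350 pages of copies.
        print(estimate_other_requester_fee(6, 350))
        # (185.0, 'consult with the requester before processing')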

    § 1602.15 Submitter's rights process.

    (a) When LSC receives a FOIA request seeking the release of confidential commercial information, LSC shall provide prompt written notice of the request to the submitter in order to afford the submitter an opportunity to object to the disclosure of the requested confidential commercial information. The notice shall reasonably describe the confidential commercial information requested and inform the submitter of the process required by paragraph (b) of this section.

    (b) If a submitter who has received notice of a request for the submitter's confidential commercial information wishes to object to the disclosure of the confidential commercial information, the submitter must provide LSC with a detailed written statement identifying the information which it objects to LSC disclosing. The submitter must send its objections to the Office of Legal Affairs or, if it pertains to Office of Inspector General records, to the Office of Inspector General, and must specify the grounds for withholding the information under FOIA or this part. In particular, the submitter must demonstrate why the information is commercial or financial information that is privileged or confidential. The submitter's statement must be received by LSC within seven business days of the date of the notice from LSC. If the submitter fails to respond to the notice from LSC within that time, LSC will deem the submitter to have no objection to the disclosure of the information.

    (c) Upon receipt of written objection to disclosure by a submitter, LSC shall consider the submitter's objections and specific grounds for withholding in deciding whether to release the disputed information. Whenever LSC decides to disclose information over the objection of the submitter, LSC shall give the submitter written notice which shall include:

    (1) A description of the information to be released and a notice that LSC intends to release the information;

    (2) A statement of the reason(s) why the submitter's request for withholding is being rejected; and

    (3) Notice that the submitter shall have five business days from the date of the notice of proposed release to appeal that decision to the LSC President or Inspector General (as provided in § 1602.13(c)), whose decision shall be final.

    (d) The requirements of this section shall not apply if:

    (1) LSC determines upon initial review of the requested confidential commercial information that the requested information should not be disclosed;

    (2) The information has been previously published or officially made available to the public; or

    (3) Disclosure of the information is required by statute (other than FOIA) or LSC's regulations.

    (e) Whenever a requester files a lawsuit seeking to compel disclosure of a submitter's information, LSC shall promptly notify the submitter.

    (f) Whenever LSC provides a submitter with notice and opportunity to oppose disclosure under this section, LSC shall notify the requester that the submitter's rights process under this section has been triggered. Likewise, whenever a submitter files a lawsuit seeking to prevent the disclosure of the submitter's information, LSC shall notify the requester.

    Dated: October 20, 2016. Stefanie K. Davis, Assistant General Counsel.
    [FR Doc. 2016-25832 Filed 10-28-16; 8:45 am] BILLING CODE P
    FEDERAL COMMUNICATIONS COMMISSION

    47 CFR Part 25

    [IB Docket No. 02-34; FCC 16-108]

    Amendment of the Commission's Space Station Licensing Rules and Policies, Second Order on Reconsideration

    AGENCY:

    Federal Communications Commission.

    ACTION:

    Final rule.

    SUMMARY:

    The Federal Communications Commission addresses the remaining petitions for reconsideration of the First Space Station Licensing Reform Order, and amends, clarifies, or eliminates certain provisions to streamline its procedures and ease administrative burdens on applicants and licensees.

    DATES:

    Effective November 30, 2016.

    FOR FURTHER INFORMATION CONTACT:

    Jay Whaley, 202-418-7184, or, for questions concerning the information collections in this document, Cathy Williams, 202-418-2918.

    SUPPLEMENTARY INFORMATION:

    This is a summary of the Commission's Second Order on Reconsideration, FCC 16-108, adopted on August 15, 2016, and released August 16, 2016. The full text of the Second Order on Reconsideration is available at https://apps.fcc.gov/edocs_public/attachmatch/FCC-16-108A1.pdf. It is also available for inspection and copying during business hours in the FCC Reference Information Center, Portals II, 445 12th Street SW., Room CY-A257, Washington, DC 20554. To request materials in accessible formats for people with disabilities, send an email to fcc504@fcc.gov or call the Consumer & Governmental Affairs Bureau at 202-418-0530 (voice), 202-418-0432 (TTY).

    Synopsis

    In the First Space Station Licensing Reform Order, 68 FR 51499, the Commission adopted new satellite licensing procedures intended to enable the Commission to issue satellite licenses more quickly without allowing satellite license applicants to abuse the Commission's licensing procedures. In response, a number of petitions for reconsideration were filed. The Commission addressed the petitions focused on the satellite bond requirements in the First Order on Reconsideration and Fifth Report and Order. This Second Order on Reconsideration addresses the remaining petitions for reconsideration of the First Space Station Licensing Reform Order. It amends the Commission's rules to streamline the new satellite licensing procedures and to clarify and reaffirm safeguards against subversion of the licensing process, furthering the goals of the First Space Station Licensing Reform Order: a faster satellite licensing procedure that guards against speculative applications and thereby expedites service to the public.

    NGSO-Like Processing Round Procedure

    We revise section 25.157(e) of the current rules to eliminate the requirement that the Commission withhold spectrum for use in a subsequent processing round if fewer than three qualified applicants file applications in the initial processing round, known as the “three-licensee presumption.” We find that the “three-licensee presumption” is overly restrictive for its intended purpose. We agree with petitioners that a specific frequency band does not necessarily equate to a market, and thus having fewer than three licensees in a band does not necessarily indicate a harmful lack of competition in some market that we should attempt to remedy. We find it common that licensees in different bands compete with each other in the provision of satellite-based services in broader markets, and we note that there are numerous NGSO-like system operators that currently compete across frequency bands.

    We also recognize that, in cases where one or more applicants in a processing round request less spectrum than they would be assigned if all the available spectrum were divided equally among all the qualified applicants, some spectrum would remain unassigned. We therefore retain the procedure that the Commission adopted in the First Space Station Licensing Reform Order to redistribute the remaining spectrum among the other qualified applicants that previously applied for the spectrum. If spectrum still remains, interested parties would be free to apply for that unassigned spectrum in another processing round.

    Procedures for Redistribution of Spectrum

    We clarify the procedures that apply when we redistribute spectrum among the remaining NGSO-like systems after an authorization for an NGSO-like system has been canceled or otherwise becomes available. This redistribution procedure applies only in cases where spectrum was granted pursuant to a processing round, and one or more of those grants of spectrum is lost or surrendered for any reason. In these cases, the Commission will issue a public notice or order announcing the loss or surrender of such spectrum, and will then propose to modify the remaining grants to redistribute the returned spectrum among the remaining system operators that have requested use of the spectrum. The returned spectrum will generally be redistributed equally among the remaining operators that requested the spectrum, although no operator will receive more spectrum on redistribution than it requested in its application. Additionally, if an operator has not requested use of a particular spectrum band, it will not receive spectrum in that band. If the Commission is unable to make a finding that there will be reasonably efficient use of the spectrum, we will consider on a case-by-case basis whether to open a new processing round for the returned spectrum, leave it unassigned at that point, or repurpose it for another use.
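    As an illustration only, and not part of the Order, the capped equal-split logic described above can be sketched in a few lines of Python. The operator names and megahertz figures in the example are hypothetical, and the single-pass treatment of any residue (left unassigned for case-by-case handling) is an assumption of the sketch.

        def redistribute_spectrum(returned_mhz, requests_mhz):
            # Split the returned spectrum equally among the remaining operators
            # that requested it, capping each operator at the amount requested
            # in its application; any residue is left unassigned.
            requesters = {op: mhz for op, mhz in requests_mhz.items() if mhz > 0}
            if not requesters:
                return {}, returned_mhz
            equal_share = returned_mhz / len(requesters)
            allocations = {op: min(equal_share, mhz) for op, mhz in requesters.items()}
            unassigned = returned_mhz - sum(allocations.values())
            return allocations, unassigned

        # Hypothetical example: 60 MHz returned, three remaining operators.
        alloc, leftover = redistribute_spectrum(60, {"A": 30, "B": 10, "C": 25})
        print(alloc)     # {'A': 20.0, 'B': 10.0, 'C': 20.0}
        print(leftover)  # 10.0 MHz unassigned, handled case by case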

    Safeguards Against Speculation

    In the First Space Station Licensing Reform Order, the Commission eliminated the anti-trafficking rule for satellites, which prohibited satellite licensees from selling "bare" satellite licenses for profit, so as not to prevent a satellite license from being transferred to an entity that would put it to its highest valued use in the shortest amount of time. The Commission put in place certain safeguards, including a determination of whether the seller obtained the license in good faith or for the primary purpose of selling it for profit, whether the licensee made serious efforts to develop a satellite or constellation, and/or whether the licensee faces changed circumstances. Petitioners expressed concern that by making this determination, the Commission would undercut the public interest benefits it identified in eliminating the anti-trafficking rule. We reiterate that this limited exception does not undermine our elimination of the anti-trafficking rule. We require parties opposing a transaction based on a seller's motivation to provide, at a minimum, substantial evidence that a satellite license was obtained for the purpose of selling the license for profit; this requirement prevents opponents of a transaction from delaying it on purely frivolous grounds and ensures that these transactions do not encounter any unwarranted delay.

    In the First Space Station Licensing Reform Order, the Commission adopted a rule prohibiting sales of places in the queue as an additional safeguard against speculation and revised its rules so that an applicant proposing to merge with another company could do so without losing its place in the processing queue. The revised rule treated transfers of control as minor amendments, which remain in the queue, and major amendments to applications as newly filed applications, which move to the end of the queue. We find that it is not inconsistent to prohibit an applicant from selling its place in the queue while allowing an applicant that transfers control over itself to a new controlling party to retain its place in the queue, especially when the new company is better positioned to compete in the marketplace. An applicant's transfer of control is also less likely to be used as an abusive strategy than a sale of its place in the queue.

    Effect of License Surrender Prior to Milestone Deadlines on Application Limit

    Under section 25.159(d) of the rules, adopted in the First Space Station Licensing Reform Order and commonly referred to as the “Three-Strikes” rule, if a licensee misses three milestones in any three-year period, it is prohibited from filing additional satellite applications if it possesses two satellite applications and/or unbuilt satellites in any frequency band. This limit remains in force until the licensee demonstrates that it would be very likely to construct its licensed facilities if it were allowed to file more applications. The Commission reasoned that a licensee that consistently obtains licenses but does not meet its milestones precludes others from going forward with their business plans while it holds those licenses.
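    As an illustration only, the rolling three-year window test described above can be sketched as follows. The dates are hypothetical, and the 365-day-per-year approximation is an assumption of the sketch rather than anything stated in the rule.

        from datetime import date, timedelta

        def three_strikes_triggered(missed_milestone_dates):
            # True if any three missed milestones fall within a single
            # three-year period (approximated here as 3 x 365 days).
            dates = sorted(missed_milestone_dates)
            window = timedelta(days=3 * 365)
            return any(dates[i + 2] - dates[i] <= window
                       for i in range(len(dates) - 2))

        # Hypothetical example: three milestones missed within roughly two years.
        print(three_strikes_triggered(
            [date(2014, 1, 15), date(2015, 6, 1), date(2015, 12, 20)]))  # True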

    SES Americom (SES) maintains that the Commission should not consider a licensee's relinquishing a license prior to the contract execution milestone in determining whether to impose the limit on satellite applications and/or unbuilt satellites on that licensee. As an initial matter, we note that the milestone rules have been revised in the Part 25 Review Second R&O to eliminate interim milestones. As a result, there is no longer a contract execution milestone, and thus SES's arguments are now moot in part. However, since we retained the final milestone requirement, any authorization surrendered prior to fulfilling the remaining milestone requirement will continue to be subject to the "Three-Strikes" rule. For the reasons set forth in the Part 25 Review Second R&O, we continue to believe that, on balance, retaining this milestone and the resulting operation of the "Three-Strikes" rule best serves the public interest, and we see no compelling justification to counterbalance the public interest benefits in retaining the current requirements. Accordingly, we will continue to presume that these licensees (i.e., those covered under the "Three-Strikes" rule) acquired licenses for speculative purposes, and we will restrict the number of additional satellite applications they may file to limit the potential for future speculation while the presumption is in effect.

    Effects of Mergers on Application Limits

    SIA asserts that it is unclear in the First Space Station Licensing Reform Order how the limit on pending and licensed but unlaunched satellites applies to satellite operators that would be formed by the merger of two companies. We clarify that the limit on satellite applications does not prevent the filing of an application for transfer of control or assignment of licenses, even if the combined entities would not meet the limits on pending applications and unbuilt stations specified in the rule. Of course, any such approval of the transfer of control will ultimately be conditioned on the entity coming into compliance with the limits within a reasonable amount of time.1

    1 In ruling on proposed mergers, the Commission routinely assesses “whether the proposed transaction complies with the specific provisions of the Act, other applicable statutes, and the Commission's rules.”

    Needs for Safeguards in Different Parts of the GSO Orbit

    In its Petition, Hughes asserts that the limit on pending applications and licensed-but-unlaunched satellites is not necessary for those orbital locations not covering the United States.2 Hughes also advocates eliminating the bond requirement for applicants for satellites that will operate at non-U.S. orbital locations.3 Hughes proposes to define “U.S.” orbital locations as those within the orbital arc between 60° W.L. and 140° W.L., and to define “non-U.S.” locations as those outside that arc. Hughes argues that the limit should not apply to the “non-U.S.” orbital locations because other Administrations have international coordination priority at many of those locations and because many other Administrations have volatile economies. Hughes argues that the demand for such locations has been “reasoned and measured,” so that the Commission can address them in an orderly fashion.

    2 As noted above, the First Space Station Licensing Reform Order established two limits on pending applications and/or unbuilt satellites, the stricter of the two limits is applicable to licensees that have established a pattern of missing milestones. Hughes maintains that the stricter limit should not apply to orbital locations not covering the United States. We also observed above that the Part 25 Review Second R&O eliminated one of the two limits on pending applications and/or unbuilt satellites and the bond requirement. As a result, this issue is moot.

    3 In the Part 25 Review Second R&O, the Commission adopted significant revisions to the bond requirement adopted in the First Space Station Licensing Reform Order. However, the Commission continues to require a bond for all satellite licenses regardless of the orbit location.

    The purpose of the safeguards in section 25.159 of the Commission's rules is not to reduce the number of satellite applications to a “reasoned and measured” level. Rather, the Commission intended the safeguards to discourage speculators from applying for satellite licenses, thereby precluding another applicant from obtaining a license, constructing a satellite, and providing service to the customers. Hughes assumes that, because fewer applications are filed outside of the arc from 60° W.L. to 140° W.L. than within that arc, speculation is not a concern. Although demand may not be as great for locations that cannot serve large portions of the United States, we have licensed many satellites at orbital locations in this portion of the arc that are subject to competition. We have also granted U.S. market access to many non-U.S.-licensed satellites operating at those locations to provide services to U.S. customers. Thus, allowing operators to hold these orbital locations while they decide whether to proceed with implementation could preclude other operators whose plans also involve providing international service from going forward. For these reasons, we will continue to apply the safeguards against speculation, including the bond requirement, where appropriate, regardless of orbital location.

    Satellite System Implementation Requirements

    In its petition for reconsideration, ICO asserts that the First Space Station Licensing Reform Order does not state clearly that NGSO-like licensees acquiring additional spectrum from other NGSO-like licensees are permitted to implement a single, integrated NGSO system under a single milestone schedule. ICO requests the Commission to clarify that such licensees will not be required to construct multiple separate satellite systems.

    The Commission eliminated the anti-trafficking rule to allow NGSO-like licensees in modified processing rounds to acquire rights to operate on additional spectrum from other licensees if they feel it is necessary to meet their business needs. It would be inefficient to require these licensees to build two incompatible satellite networks, each operating in only part of the spectrum rights that the licensee is authorized to use. We therefore clarify that NGSO-like licensees acquiring spectrum rights from other NGSO-like licensees are permitted to build a single, integrated NGSO-like system operating on all authorized frequency bands, under a single milestone schedule. These cases are inherently fact-specific, and so we decline to adopt a blanket approach about the milestone schedule that would apply in these cases.4 If the milestone schedules of each license differ, we will address, on a case-by-case basis, the particular milestone schedule that will be imposed on the integrated system.

    4 For example, depending on the differences in the milestone schedules, permitting licensees to adopt a schedule with significantly more time might encourage licensees to acquire other licensees merely to gain more time to fulfill their milestone schedules. On the other hand, integrating additional spectrum into a single network may legitimately require more time in some cases.

    Non-U.S.-Licensed Satellites

    Under the terms of the World Trade Organization (WTO) Agreement on Basic Telecommunication Services (WTO Telecom Agreement),5 WTO signatories, including the United States, have made binding commitments to open their markets to foreign competition in satellite services.6 Consistent with those commitments, the Commission adopted DISCO II in 1997 to establish procedures for non-U.S.-licensed satellite operators seeking access to the U.S. market. In the DISCO II First Reconsideration Order, the Commission streamlined those procedures.

    5 The WTO came into being on January 1, 1995, pursuant to the Marrakesh Agreement Establishing the World Trade Organization (the Marrakesh Agreement). The Marrakesh Agreement includes multilateral agreements on the trade in goods, services, intellectual property, and dispute settlement. The General Agreement on Trade in Services (GATS) is Annex 1B of the Marrakesh Agreement. The WTO Telecom Agreement was incorporated into the GATS by the Fourth Protocol to the GATS (April 30, 1996).

    6 The United States made market access commitments for Direct-to-Home (DTH) Service, Direct Broadcast Satellite (DBS) Service, and Satellite Digital Audio Radio Service (SDARS), and took an exemption from most-favored nation (MFN) treatment for those services as well. Generally, GATS requires WTO member countries to afford MFN treatment to all other WTO member nations.

    In the First Space Station Licensing Reform Order, the Commission established a procedure for addressing changes in ownership of non-U.S.-licensed satellites. Specifically, when the operator of such a satellite undergoes a change in ownership, the Commission requires the satellite operator to notify the Commission of the change. The Commission then issues a public notice announcing that the transaction has taken place and inviting comment on whether the transaction affects any of the considerations made when the original satellite operator was allowed to enter the U.S. market. In addition, if control of the satellite was transferred to an operator not based in a WTO member country, the Commission would invite comment on whether the purchaser has satisfied all applicable DISCO II requirements. The Commission then determines whether any commenter raised any concern that would warrant precluding the new operator from entering the U.S. market, including concerns relating to national security, law enforcement, foreign policy, or trade issues.

    According to SIA, the rule revisions adopted in the First Space Station Licensing Reform Order to implement this satellite transfer procedure do not state clearly that satellite operators are allowed to notify the Commission of transfers of ownership of satellites after the transfer takes place. SIA asks us to revise section 25.137(g) of the Commission's rules to make clear that non-U.S.-satellite operators may notify the Commission of a change of ownership after the transfer takes place. We will do so. The Commission did not intend to require foreign entities to notify the Commission of the transaction before it had been completed. Rather, the Commission adopted its proposal in the Space Station Licensing Reform NPRM to address such changes in ownership by “issuing a public notice announcing that the transaction has taken place.” Therefore, we revise section 25.137(g) as SIA suggests, as set forth in Appendix B of the Second Order on Reconsideration. We also clarify that parties must notify the Commission within 30 days after consummation of the transaction in order to enable the Commission to perform the review described in the First Space Station Licensing Reform Order in a meaningful and timely manner while the new foreign operator is permitted to access the U.S. market.

    Further, in the First Space Station Licensing Reform Order, the Commission stated that operators requesting authority to provide service in the United States from a foreign-licensed satellite must file Form 312 (Application for Satellite Space and Earth Station Authorizations). Hughes asserts that the electronic Form 312 does not allow a non-U.S.-licensed satellite operator to indicate that it is not seeking a Commission license, but is instead seeking U.S. market access. Hughes also questions whether parties seeking U.S. market access must file their requests electronically. First, contrary to Hughes's assertion, the electronic version of Form 312 provides a place to indicate that the applicant is filing for a petition for declaratory ruling, which is the procedure for requesting U.S. market access. Second, the Commission stated explicitly in the First Space Station Licensing Reform Order that U.S. market access requests must be filed electronically, and we continue to believe that mandatory electronic filing serves the public interest by facilitating prompt receipt of petitions for declaratory ruling and accurate recording of the time of filing under the first-come, first-served processing procedure, and by providing other administrative efficiencies.

    ITU Priority

    In the First Space Station Licensing Reform Order, the Commission discussed the interrelationship between its domestic licensing framework and the international coordination framework set forth in the Radio Regulations of the International Telecommunication Union (ITU). Hughes requests that we clarify how we will determine whether to grant or deny market access requests from non-U.S.-licensed satellite operators, particularly in cases where a non-U.S. operator has ITU coordination date-filing priority, i.e., an earlier ITU protection date, but is behind a U.S. applicant in the U.S. space station queue. In particular, Hughes argues that the first-come, first-served procedure should not “block” a non-U.S.-licensed satellite operator with ITU priority.

    The Commission discussed international coordination issues in the First Space Station Licensing Reform Order. Specifically, the Commission stated that it will license satellites at orbital locations at which another Administration has ITU priority. In fact, if a later-filed market access request—with or without ITU priority—is mutually exclusive with an earlier-filed, granted application, it may be dismissed absent a coordination agreement between the applicants. The Commission further stated, however, that it will issue the earlier-filed authorization subject to the outcome of the international coordination process, and emphasized that the Commission is not responsible for the success or failure of the required international coordination. Absent such coordination, a U.S.-licensed satellite making use of an ITU filing with a later protection date would be required to cease service to the U.S. market immediately upon launch and operation of a non-U.S.-licensed satellite with an earlier protection date, or be subject to further conditions. We continue to follow this general approach today.

    Modifications

    Hughes notes that the rule revisions adopted in the First Space Station Licensing Reform Order require the Commission to treat modification requests involving new orbital locations or new frequency bands in the application processing queue, and other modification requests outside of the queue. Hughes supports this approach, but asserts that the Commission stated elsewhere in the First Space Station Licensing Reform Order that, unless it could categorically classify certain modification requests involving new frequencies or orbital locations as “minor,” it would treat all such modification requests in the processing queue. Hughes requests the Commission to reconcile these two statements.

    In the First Space Station Licensing Reform Order, the Commission revised its rules to adopt a clear, simple test for determining whether to process a modification request in the processing queue: modification requests involving new orbital locations or new frequency bands are considered in the queue, and other modifications are considered outside of the queue.7 We clarify here that nothing in the text of the First Space Station Licensing Reform Order was intended to alter the Commission's decision to consider modification requests in this fashion. The Commission also suggested, however, that it could, at a later date, adopt rules to define certain modification requests involving new orbital locations as minor, and to consider such modification requests outside the queue. In this regard, in the Second Space Station Licensing Reform Order, the Commission decided to treat certain fleet management modification requests involving orbital reassignment of specific satellites outside the queue. We affirm, however, that, absent a rulemaking finding public interest reasons to create additional exceptions, we will continue to process orbital reassignment and frequency modification requests as set forth in section 25.117(d)(2)(iii).

    7 The Commission adopted this test instead of a more complex proposal to place “major” modification requests in the queue, and to define “major” modification requests as those that would “degrade the interference environment.”

    Supplemental Final Regulatory Flexibility Analysis

    As required by the Regulatory Flexibility Act (RFA), an Initial Regulatory Flexibility Analysis (IRFA) was incorporated in the Further Notice of Proposed Rulemaking in the Matter of Comprehensive Review of Licensing and Operating Rules for Satellite Services. The Commission sought written public comment on the proposals in the Further Notice, including comment on the IRFA. No comments were received on the IRFA. This Final Regulatory Flexibility Analysis (FRFA) conforms to the RFA.

    Paperwork Reduction Act

    This document does not contain new or modified information collection requirements subject to the Paperwork Reduction Act of 1995, Public Law 104-13. Therefore it does not contain any new or modified “information burden for small business concerns with fewer than 25 employees” pursuant to the Small Business Paperwork Relief Act of 2002, Public Law 107-198. Thus, on October 14, 2016, the Office of Management and Budget (OMB) determined that the rule changes in this document are non-substantive changes to the currently approved collection, OMB Control Number 3060-0678. ICR Reference Number: 201610-3060-011.

    Pursuant to the Small Business Paperwork Relief Act of 2002, Public Law 107-198, see 44 U.S.C. 3506(c)(4), we previously sought specific comment on how the Commission might further reduce the information collection burden for small business concerns with fewer than 25 employees. We received no comments on this issue. We have assessed the effects of the adopted revisions that might impose information collection burdens on small business concerns, and find that the impact on businesses with fewer than 25 employees will be an overall reduction in burden. The amendments adopted in this Second Order on Reconsideration eliminate unnecessary information filing requirements for licensees and applicants and eliminate unnecessary technical restrictions, enabling applicants and licensees to conserve time, effort, and expense in preparing applications and reports. Overall, these changes may have a greater positive impact on small business entities with more limited resources.

    Congressional Review Act

    The Commission will send copies of this Second Order on Reconsideration to Congress and the Government Accountability Office pursuant to the Congressional Review Act, 5 U.S.C. 801(a)(1)(A).

    Effective Date

    The effective date for the rules adopted in this Second Order on Reconsideration is 30 days after date of publication in the Federal Register.

    Need for, and Objectives of, the Rules

    This Order adopts minor changes to part 25 of the Commission's rules, which governs licensing and operation of space stations and earth stations for the provision of satellite communication services.8 We revise the rules to, among other things, further the goals of the First Space Station Licensing Reform Order to develop a faster satellite licensing procedure while safeguarding against speculative applications, thereby expediting service to the public.

    8 47 CFR part 25, Satellite Communications.

    This Order revises two sections of part 25 of the rules. Specifically, it revises the rules to:

    (1) Eliminate the “three-licensee presumption” that applies to the NGSO-like processing round procedure, and also revise the procedures that we will apply when we redistribute spectrum among remaining NGSO-like licensees when a license is cancelled for any reason.

    (2) Clarify that non-U.S.-satellite operators may notify the Commission of a change of ownership after the transfer takes place.

    Summary of Significant Issues Raised by Public Comments in Response to the IRFA

    No party filing comments in this proceeding responded to the IRFA, and no party filing comments otherwise argued that the policies and rules proposed in this proceeding would have a significant economic impact on a substantial number of small entities. The Commission has nonetheless considered the potential significant economic impact that the rule changes may have on the small entities that are affected. On balance, the Commission believes that the economic impact on small entities will be positive rather than negative and that the rule changes further streamline the part 25 requirements.

    Response to Comments by the Chief Counsel for Advocacy of the Small Business Administration

    Pursuant to the Small Business Jobs Act of 2010, the Commission is required to respond to any comments filed by the Chief Counsel for Advocacy of the Small Business Administration, and to provide a detailed statement of any change made to the proposed rules as a result of those comments. The Chief Counsel did not file any comments in response to the proposed rules in this proceeding.

    Description and Estimate of the Number of Small Entities to Which the Rules May Apply

    The RFA directs agencies to provide a description of, and, where feasible, an estimate of, the number of small entities that may be affected by the rules adopted herein. The RFA generally defines the term “small entity” as having the same meaning as the terms “small business,” “small organization,” and “small governmental jurisdiction.” In addition, the term “small business” has the same meaning as the term “small business concern” under the Small Business Act. A small business concern is one which: (1) Is independently owned and operated; (2) is not dominant in its field of operation; and (3) satisfies any additional criteria established by the Small Business Administration (SBA). Below, we describe and estimate the number of small entity licensees that may be affected by the adopted rules.

    Satellite Telecommunications and All Other Telecommunications

    The rules adopted in this Order will affect some providers of satellite telecommunications services. Satellite telecommunications service providers include satellite and earth station operators. Since 2007, the SBA has recognized two census categories for satellite telecommunications firms: “Satellite Telecommunications” and “Other Telecommunications.” Under the “Satellite Telecommunications” category, a business is considered small if it had $32.5 million or less in annual receipts. Under the “Other Telecommunications” category, a business is considered small if it had $32.5 million or less in annual receipts.

    The first category of Satellite Telecommunications “comprises establishments primarily engaged in providing point-to-point telecommunications services to other establishments in the telecommunications and broadcasting industries by forwarding and receiving communications signals via a system of satellites or reselling satellite telecommunications.” For this category, Census Bureau data for 2007 show that there were a total of 512 satellite communications firms that operated for the entire year. Of this total, 482 firms had annual receipts of under $25 million.

    The second category of Other Telecommunications is comprised of entities “primarily engaged in providing specialized telecommunications services, such as satellite tracking, communications telemetry, and radar station operation. This industry also includes establishments primarily engaged in providing satellite terminal stations and associated facilities connected with one or more terrestrial systems and capable of transmitting telecommunications to, and receiving telecommunications from, satellite systems. Establishments providing Internet services or voice over Internet protocol (VoIP) services via client-supplied telecommunications connections are also included in this industry.” For this category, Census Bureau data for 2007 show that there were a total of 2,383 firms that operated for the entire year. Of this total, 2,346 firms had annual receipts of under $25 million. We anticipate that some of these “Other Telecommunications firms,” which are small entities, are earth station applicants/licensees that will be affected by our adopted rule changes.

    We anticipate that our rule changes will have an impact on space station applicants and licensees. Space station applicants and licensees, however, rarely qualify under the definition of a small entity. Generally, space stations cost hundreds of millions of dollars to construct, launch and operate. Consequently, we do not anticipate that any space station operators are small entities that would be affected by our actions.

    Description of Projected Reporting, Recordkeeping, and Other Compliance Requirements for Small Entities

    The Order adopts a number of rule changes that will affect reporting, recordkeeping, and other compliance requirements for space station operators. These changes, as described below, will decrease the burden for all business operators, especially firms that are applicants for licenses to operate NGSO-like space stations.

    We simplify the rules to facilitate improved compliance. First, the Order simplifies information collections in applications for NGSO-like space station licenses. Specifically, the Order eliminates reporting requirements that are more burdensome than necessary. For example, the Order removes the "three-licensee presumption," a rebuttable presumption that, for purposes of the modified processing round procedure for NGSO-like space station applications, three licensees is a sufficient number in a frequency band; under that presumption, if a processing round yielded fewer than three qualified applicants, one-third of the spectrum in the allocated band would be reserved for an additional processing round. To rebut the presumption, a party had to provide convincing evidence that allowing fewer than three licensees in the frequency band would result in extraordinarily large, cognizable, and non-speculative efficiencies. Applicants for NGSO-like space stations therefore will no longer need to expend technical and legal resources demonstrating that their NGSO-like systems are designed to provide such efficiencies. Furthermore, in cases where spectrum was granted pursuant to a processing round and one or more of those grants is lost or surrendered for any reason, the rules now allow the returned spectrum to be redistributed without automatically triggering a new processing round and the corresponding costs and paperwork, thus reducing the administrative burdens on those applicants.

    Another example is that we see no reason to require non-U.S.-satellite operators with satellites on the Permitted List to notify the Commission of a change of ownership before the transfer takes place. We therefore revise our rule to state clearly that non-U.S.-satellite operators are allowed to notify the Commission of transfers of ownership of Permitted List satellites after the transfer takes place. These satellite operators are thus relieved of any additional burden that could result from delaying the completion of a transfer of Permitted List satellites pending Commission approval.

    Steps Taken To Minimize Significant Economic Impact on Small Entities, and Significant Alternatives Considered

    The RFA requires an agency to describe any significant, specifically small business, alternatives that it has considered in reaching its proposed approach, which may include the following four alternatives (among others): “(1) the establishment of differing compliance or reporting requirements or timetables that take into account the resources available to small entities; (2) the clarification, consolidation, or simplification of compliance and reporting requirements under the rules for such small entities; (3) the use of performance rather than design standards; and (4) an exemption from coverage of the rule, or any part thereof, for such small entities.”

    The Commission is aware that some of the revisions may impact small entities. The First Space Station Licensing Reform Order sought comment from all interested parties, and small entities were encouraged to bring to the Commission's attention any specific concerns they may have with the proposals outlined in the First Space Station Licensing Reform Order. No commenters raised any specific concerns about the impact of the revisions on small entities. This order adopts rule revisions to modernize the rules and advance the satellite industry. The revisions eliminate unnecessary requirements and expand routine processing to applications in additional frequency bands, among other changes. Together, the revisions in this Order lessen the burden of compliance on small entities with more limited resources than larger entities.

    The adopted changes for NGSO-like space station licensing clarify requirements for NGSO-like modified processing rounds. Each of these changes will lessen the burden of the licensing process. Specifically, this Order adopts revisions to reduce filing requirements and to clarify the procedures for redistribution of surrendered spectrum in a way that reduces applicant burden. The revisions will thus benefit small NGSO-like space station operators over the long term.

    Report to Congress

    The Commission will send a copy of this Second Order on Reconsideration, including this FRFA, in a report to be sent to Congress pursuant to the Congressional Review Act. In addition, the Commission will send a copy of this Order, including this FRFA, to the Chief Counsel for Advocacy of the SBA. A copy of this Second Order on Reconsideration and FRFA (or summaries thereof) will also be published in the Federal Register.

    Legal Basis

    The action is authorized under sections 4(i), 7(a), 303(c), 303(f), 303(g), and 303(r) of the Communications Act of 1934, as amended, 47 U.S.C. 154(i), 157(a), 161, 303(c), 303(f), 303(g), and 303(r).

    Ordering Clauses

    It is ordered, that pursuant to sections 4(i), 301, 302, 303(r), 308, 309, and 310 of the Communications Act, 47 U.S.C. 154(i), 301, 302, 303(r), 308, 309, and 310, and section 1.429 of the Commission's rules, 47 CFR 1.429, the petitions for reconsideration listed in Appendix A to the Second Order on Reconsideration are granted in part, denied in part, and dismissed as moot in part, to the extent indicated above.

    It is further ordered, pursuant to sections 4(i), 7(a), 303(c), 303(f), 303(g), and 303(r) of the Communications Act of 1934, as amended, 47 U.S.C. 154(i), 157(a), 303(c), 303(f), 303(g), 303(r), that this Second Order on Reconsideration in IB Docket 02-34 is hereby adopted.

    It is further ordered, that part 25 of the Commission's Rules is amended as set forth in Appendix B of the Second Order on Reconsideration and section 25.157 is revised to remove the “three-licensee presumption” as well as the requirement that the Commission withhold spectrum for use in a subsequent processing round if fewer than three qualified applicants are licensed in the initial processing round.

    It is further ordered, that section 25.137(g) is amended to clarify that satellite operators are allowed to notify the Commission of transfers of ownership of Permitted List satellites after the transfer takes place.

    It is further ordered, that all rule revisions will be effective on the same date, which will be announced in a Public Notice.

    It is further ordered, that the Consumer Information Bureau, Reference Information Center, shall send a copy of this Order, including the Final Regulatory Flexibility Certification, to the Chief Counsel for Advocacy of the Small Business Administration.

    It is further ordered, that the Chief, International Bureau is delegated authority to modify satellite licenses consistent with the provisions of this Order above.

    It is further ordered, that this proceeding is terminated pursuant to sections 4(i) and 4(j) of the Communications Act, 47 U.S.C. 154(i) and (j), absent applications for review or further appeals of this Second Order on Reconsideration.

    Federal Communications Commission.
    Marlene H. Dortch, Secretary.
    List of Subjects in 47 CFR Part 25

    Administrative practice and procedure, Earth stations, Satellites.

    For the reasons discussed in the preamble, the Federal Communications Commission amends 47 CFR part 25 as follows:

    PART 25—SATELLITE COMMUNICATIONS 1. The authority citation for part 25 continues to read as follows: Authority:

    Interprets or applies 47 U.S.C. 154, 301, 302, 303, 307, 309, 310, 319, 332, 605, and 721, unless otherwise noted.

    2. Revise § 25.137(g) to read as follows:
    § 25.137 Requests for U.S. market access through non-U.S.-licensed space stations.

    (g) A non-U.S.-licensed satellite operator that acquires control of a non-U.S.-licensed space station that has been permitted to serve the United States must notify the Commission within 30 days after consummation of the transaction so that the Commission can afford interested parties an opportunity to comment on whether the transaction affected any of the considerations we made when we allowed the satellite operator to enter the U.S. market. A non-U.S.-licensed satellite that has been transferred to new owners may continue to provide service in the United States unless and until the Commission determines otherwise. If the transferee or assignee is not licensed by, or seeking a license from, a country that is a member of the World Trade Organization for services covered under the World Trade Organization Basic Telecommunications Agreement, the non-U.S.-licensed satellite operator will be required to make the showing described in paragraph (a) of this section.

    3. Amend § 25.157 by revising paragraph (e) and removing paragraph (g)(3) to read as follows:

    (e)(1) In the event that there is insufficient spectrum in the frequency band available to accommodate all the qualified applicants in a processing round, the available spectrum will be divided equally among the licensees whose applications are granted pursuant to paragraph (d) of this section, except as set forth in paragraph (e)(2) of this section.

    (2) In cases where one or more applicants apply for less spectrum than they would be assigned under paragraph (e)(1) of this section, those applicants will be assigned the bandwidth amount they requested in their applications. In those cases, the remaining qualified applicants will be assigned the lesser of the amount of spectrum they requested in their applications, or the amount of spectrum that they would be assigned if the available spectrum were divided equally among the remaining qualified applicants.
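
    To illustrate how the division in paragraphs (e)(1) and (e)(2) might operate in practice, the following short Python sketch encodes one reading of the rule; the function name, the single-pass treatment of below-share requests, and the example figures are illustrative assumptions and are not part of the rule text.

        def divide_spectrum(available_mhz, requests):
            # Illustrative sketch of the split described in paragraphs (e)(1) and (e)(2).
            # `requests` maps each qualified applicant to the bandwidth (MHz) it applied for.
            # Assumption (one reading of paragraph (e)(2)): applicants that requested less
            # than an equal share receive their requested amount, and the remaining spectrum
            # is divided equally among the other applicants, capped at each one's request.
            equal_share = available_mhz / len(requests)
            assignments = {a: amt for a, amt in requests.items() if amt < equal_share}
            remaining = [a for a in requests if a not in assignments]
            if remaining:
                leftover = available_mhz - sum(assignments.values())
                share = leftover / len(remaining)
                for a in remaining:
                    assignments[a] = min(requests[a], share)
            return assignments

        # Example: 500 MHz available and three qualified applicants.
        print(divide_spectrum(500, {"A": 100, "B": 300, "C": 300}))
        # {'A': 100, 'B': 200.0, 'C': 200.0}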

    [FR Doc. 2016-25935 Filed 10-28-16; 8:45 am] BILLING CODE 6712-01-P
    NATIONAL AERONAUTICS AND SPACE ADMINISTRATION
    48 CFR Parts 1801, 1843, and 1852
    RIN 2700-AE35
    NASA Federal Acquisition Regulation Supplement: Remove NASA FAR Supplement Clause Engineering Change Proposals (2016-N030)
    AGENCY:

    National Aeronautics and Space Administration.

    ACTION:

    Final rule.

    SUMMARY:

    NASA is issuing a final rule amending the NASA Federal Acquisition Regulation Supplement (NFS) to remove the Engineering Change Proposals (ECPs) basic clause, with its Alternates I and II, and the associated information collection from the NFS.

    DATES:

    Effective: November 30, 2016.

    FOR FURTHER INFORMATION CONTACT:

    Andrew O'Rourke, telephone 202-358-4560.

    SUPPLEMENTARY INFORMATION:

    I. Background

    NASA published a proposed rule in the Federal Register at 81 FR 54783 on August 17, 2016, to amend the NFS to remove contract clause 1852.243-70, Engineering Change Proposals (ECPs), with its Alternates I and II, and the associated information collection from the NFS. Six comments were received in response to the proposed rule.

    II. Discussion and Analysis

    NASA reviewed the public comments received in the development of the final rule. The six comments received were advertisements for personal services from the same respondent and completely unrelated to the purpose of this rule. Therefore, no change was made to the final rule as a result of the public comments received.

    III. Executive Orders 12866 and 13563

    Executive Orders (E.O.s) 12866 and 13563 direct agencies to assess all costs and benefits of available regulatory alternatives and, if regulation is necessary, to select regulatory approaches that maximize net benefits (including potential economic, environmental, public health and safety effects, distributive impacts, and equity). E.O. 13563 emphasizes the importance of quantifying both costs and benefits, of reducing costs, of harmonizing rules, and of promoting flexibility. This is not a significant regulatory action and, therefore, was not subject to review under section 6(b) of E.O. 12866, Regulatory Planning and Review, dated September 30, 1993. This rule is not a major rule under 5 U.S.C. 804.

    IV. Regulatory Flexibility Act

    A final regulatory flexibility analysis has been prepared consistent with the Regulatory Flexibility Act, 5 U.S.C. 601, et seq., and is summarized as follows:

    The National Aeronautics and Space Administration (NASA) is issuing a final rule to amend the NASA FAR Supplement (NFS) to remove NFS clause 1852.243-70, Engineering Change Proposals (ECPs), with its Alternates I and II, and the associated information collection from the NFS because the NFS clause is no longer used in procurements and is duplicative of FAR requirements. NASA conducted a retrospective review of its regulations and determined NFS clause 1852.243-70 should be removed along with the corresponding information collection requirement, OMB Control Number 2700-0054.

    No changes were made to the final rule as a result of public comments received. Comments received in response to the proposed rule were advertisements for personal services and deemed out of scope.

    NASA does not expect this final rule to have a significant economic impact on a substantial number of small entities within the meaning of the Regulatory Flexibility Act, 5 U.S.C. 601 et seq., because we are removing an NFS clause and its associated information collection requirements for contractors. Removing this clause reduces the information collection burden on contractors, thus providing all entities, both large and small, with a positive benefit.

    This rule does not include any new reporting, recordkeeping, or other compliance requirements for small businesses. There are no significant alternatives that could further minimize the already minimal impact on businesses, small or large.

    V. Paperwork Reduction Act

    The rule contains information collection requirements that require the approval of the OMB under the Paperwork Reduction Act (44 U.S.C. chapter 35); however, the changes to the NFS remove the information collection requirements previously approved under OMB Control Number 2700-0054, entitled NFS 1843 Contract Modifications for Engineering Change Proposals (ECP).

    List of Subjects in 48 CFR Parts 1801, 1843, and 1852

    Government procurement.

    Manuel Quinones, NASA FAR Supplement Manager.

    Accordingly, 48 CFR parts 1801, 1843, and 1852 are amended as follows:

    1. The authority citation for parts 1801, 1843 and 1852 continues to read as follows: Authority:

    51 U.S.C. 20113(a) and 48 CFR chapter 1.

    PART 1801—FEDERAL ACQUISITION REGULATIONS SYSTEM 2. Revise section 1801.106 to read as follows:

    1801.106 OMB approval under the Paperwork Reduction Act.

    The following OMB control numbers apply:

    NFS Segment       OMB Control No.
    1823              2700-0089
    1827              2700-0052
    1852.223-70       2700-0160
    NF 533            2700-0003
    NF 1018           2700-0017
    PART 1843—CONTRACT MODIFICATIONS 3. Revise section 1843.205-70 to read as follows:
    1843.205-70 NASA contract clauses.

    The contracting officer may insert a clause substantially as stated at 1852.243-72, Equitable Adjustments, in solicitations and contracts for—

    (a) Dismantling, demolishing, or removing improvements; or

    (b) Construction, when the contract amount is expected to exceed the simplified acquisition threshold and a fixed-price contract is contemplated.

    PART 1852—SOLICITATION PROVISIONS AND CONTRACT CLAUSES
    1852.243-70 [Removed and Reserved]
    4. Section 1852.243-70 is removed and reserved.
    1852.243-72 [Amended]
    5. Amend section 1852.243-72 by removing “1843.205-70(b)” and adding “1843.205-70” in its place.
    [FR Doc. 2016-26174 Filed 10-28-16; 8:45 am] BILLING CODE 7510-13-P
    DEPARTMENT OF COMMERCE
    National Oceanic and Atmospheric Administration
    50 CFR Part 679
    [Docket No. 150818742-6210-02]
    RIN 0648-XF007
    Fisheries of the Exclusive Economic Zone Off Alaska; Groundfish by Vessels Using Trawl Gear in the Gulf of Alaska
    AGENCY:

    National Marine Fisheries Service (NMFS), National Oceanic and Atmospheric Administration (NOAA), Commerce.

    ACTION:

    Temporary rule; opening.

    SUMMARY:

    NMFS is opening directed fishing for groundfish by vessels using trawl gear in the Gulf of Alaska (GOA). This action is necessary to fully use the 2016 groundfish total allowable catch in the GOA.

    DATES:

    Effective 1200 hours, Alaska local time (A.l.t.), October 28, 2016, through 2400 hours, A.l.t., December 31, 2016.

    Comments must be received at the following address no later than 4:30 p.m., A.l.t., November 15, 2016.

    ADDRESSES:

    You may submit comments on this document, identified by FDMS Docket Number NOAA-NMFS-2015-0110, by any of the following methods:

    Electronic Submission: Submit all electronic public comments via the Federal e-Rulemaking Portal. Go to www.regulations.gov/#!docketDetail;D=NOAA-NMFS-2015-0110, click the “Comment Now!” icon, complete the required fields, and enter or attach your comments.

    Mail: Address written comments to Glenn Merrill, Assistant Regional Administrator, Sustainable Fisheries Division, Alaska Region NMFS, Attn: Ellen Sebastian. Mail comments to P.O. Box 21668, Juneau, AK 99802-1668.

    Instructions: Comments sent by any other method, to any other address or individual, or received after the end of the comment period, may not be considered by NMFS. All comments received are a part of the public record and will generally be posted for public viewing on www.regulations.gov without change. All personal identifying information (e.g., name, address), confidential business information, or otherwise sensitive information submitted voluntarily by the sender will be publicly accessible. NMFS will accept anonymous comments (enter “N/A” in the required fields if you wish to remain anonymous). Attachments to electronic comments will be accepted in Microsoft Word, Excel, or Adobe PDF file formats only.

    FOR FURTHER INFORMATION CONTACT:

    Josh Keaton 907-586-7228.

    SUPPLEMENTARY INFORMATION:

    NMFS manages the groundfish fishery in the GOA exclusive economic zone according to the Fishery Management Plan for Groundfish of the Gulf of Alaska (FMP) prepared by the North Pacific Fishery Management Council under authority of the Magnuson-Stevens Fishery Conservation and Management Act. Regulations governing fishing by U.S. vessels in accordance with the FMP appear at subpart H of 50 CFR part 600 and 50 CFR part 679.

    NMFS prohibited directed fishing for groundfish by vessels using trawl gear in the GOA, effective 1200 hours, A.l.t., October 22, 2016 (81 FR 74313) under § 679.21(d)(6)(i). That action was necessary because the annual prohibited species catch (PSC) limit for Pacific halibut specified for vessels using trawl gear in the GOA was reached.

    As of October 25, 2016, NMFS has determined that approximately 250 metric tons of the trawl Pacific halibut PSC limit remains. Therefore, in accordance with § 679.25(a)(1)(i), (a)(2)(i)(C), and (a)(2)(iii)(D), and to fully utilize the 2016 groundfish total allowable catch, NMFS is terminating the previous closure and is opening directed fishing for groundfish by vessels using trawl gear in the GOA. The Administrator, Alaska Region (Regional Administrator) considered the following factors in reaching this decision: (1) The current harvest of Pacific halibut PSC in the trawl fishery of the GOA and (2) the harvest capacity and stated intent on future harvesting patterns of vessels participating in this fishery.

    Classification

    This action responds to the best available information recently obtained from the fishery. The Assistant Administrator for Fisheries, NOAA (AA), finds good cause to waive the requirement to provide prior notice and opportunity for public comment pursuant to the authority set forth at 5 U.S.C. 553(b)(B) as such requirement is impracticable and contrary to the public interest. This requirement is impracticable and contrary to the public interest as it would prevent NMFS from responding to the most recent fisheries data in a timely fashion and would delay the opening of directed fishing for groundfish by vessels using trawl gear in the GOA. NMFS was unable to publish a notice providing time for public comment because the most recent, relevant data only became available as of October 25, 2016.

    The AA also finds good cause to waive the 30-day delay in the effective date of this action under 5 U.S.C. 553(d)(3). This finding is based upon the reasons provided above for waiver of prior notice and opportunity for public comment.

    Without this inseason adjustment, NMFS could not allow the trawl deep-water species fishery in the GOA to be harvested in an expedient manner and in accordance with the regulatory schedule. Under § 679.25(c)(2), interested persons are invited to submit written comments on this action to the above address until November 15, 2016.

    This action is required by §§ 679.21 and 679.25 and is exempt from review under Executive Order 12866.

    Authority:

    16 U.S.C. 1801 et seq.

    Dated: October 26, 2016. Emily H. Menashes, Acting Director, Office of Sustainable Fisheries, National Marine Fisheries Service.
    [FR Doc. 2016-26221 Filed 10-26-16; 4:15 pm] BILLING CODE 3510-22-P
    DEPARTMENT OF TRANSPORTATION
    Office of the Secretary
    14 CFR Part 259
    [Docket No. DOT-OST-2016-0208]
    RIN 2105-AE53
    Refunding Baggage Fees for Delayed Checked Bags
    AGENCY:

    Office of the Secretary (OST), Department of Transportation (DOT).

    ACTION:

    Advance notice of proposed rulemaking (ANPRM).

    SUMMARY:

    The Department of Transportation (DOT or Department) is soliciting public comment and feedback on various issues related to the requirement for airlines to refund checked baggage fees when they fail to deliver the bags in a timely manner, as provided by the FAA Extension, Safety, and Security Act of 2016.

    DATES:

    Comments should be filed by November 30, 2016. Late-filed comments will be considered to the extent practicable.

    ADDRESSES:

    You may file comments identified by the docket number DOT-OST-2016-0208 by any of the following methods:

    Federal eRulemaking Portal: go to http://www.regulations.gov and follow the online instructions for submitting comments.

    Mail: Docket Management Facility, U.S. Department of Transportation, 1200 New Jersey Ave. SE., West Building Ground Floor, Room W12-140, Washington, DC 20590-0001.

    Hand Delivery or Courier: West Building Ground Floor, Room W12-140, 1200 New Jersey Ave. SE., Washington, DC, between 9 a.m. and 5 p.m. ET, Monday through Friday, except Federal holidays.

    Fax: (202) 493-2251.

    Instructions: You must include the agency name and docket number DOT-OST-2016-0208 or the Regulatory Identification Number (RIN) for the rulemaking at the beginning of your comment. All comments received will be posted without change to http://www.regulations.gov, including any personal information provided.

    Privacy Act: Anyone is able to search the electronic form of all comments received in any of our dockets by the name of the individual submitting the comment (or signing the comment, if submitted on behalf of an association, business, labor union, etc.). You may review DOT's complete Privacy Act statement in the Federal Register published on April 11, 2000 (65 FR 19477-78), or you may visit http://DocketsInfo.dot.gov.

    Docket: For access to the docket to read background documents and comments received, go to http://www.regulations.gov or to the street address listed above. Follow the online instructions for accessing the docket.

    FOR FURTHER INFORMATION CONTACT:

    Clereece Kroha, Senior Trial Attorney, Office of the Assistant General Counsel for Aviation Enforcement and Proceedings, U.S. Department of Transportation, 1200 New Jersey Ave. SE., Washington, DC 20590, 202-366-9342 (phone), 202-366-7152 (fax), [email protected] (email).

    SUPPLEMENTARY INFORMATION:

    The Department of Transportation (DOT or Department) is seeking comment on the appropriate means to implement a requirement in recent legislation for airlines to refund checked baggage fees when they fail to deliver the bags in a timely manner. Specifically, the Department seeks comment on how to define a baggage delay, and the appropriate method for providing the refund for delayed baggage.

    Background

    On April 25, 2011, the Department of Transportation published its second Enhancing Airline Passenger Protections final rule that requires, among other things, that U.S. and foreign air carriers adopt and adhere to a customer service plan that addresses various consumer issues. See 76 FR 23110 (April 25, 2011). In the proposal preceding that final rule, the Department solicited comments on whether we should include as standards: (1) That carriers reimburse passengers for the fee charged to transport a bag if that bag is lost or not timely delivered, and (2) the time when a bag should be considered not to have been timely delivered (e.g., delivered on the same or an earlier flight than the passenger, delivered within 2 hours of the passenger's arrival). After reviewing the comments received, we adopted in the final rule a customer service standard that requires carriers to reimburse passengers for any fee charged to transport a bag if the bag is lost. We decided not to require carriers to reimburse passengers for any fee charged to transport a bag that is not timely delivered. In making this determination, we stated that, as is the case with transporting passengers, while delay in receiving baggage may be inconvenient, once the carrier delivers a bag, the service has been performed. We clarified that although not required to refund baggage fees in the case of delayed delivery of a checked bag, carriers must comply with the Department's baggage liability rule, 14 CFR part 254, and applicable international agreements, to compensate passengers for direct or consequential damages resulting from the delay in the delivery of luggage, up to the limits set by the rule and the agreements.

    Baggage fees, along with other ancillary fees, have become an increasingly important component of the airline industry's revenue structure. According to data from the Department's Bureau of Transportation Statistics (BTS), the top 13 U.S. carriers collectively generated over $3.8 billion in revenue in 2015 from baggage fees.1 While we have no doubt that airlines continue to invest in baggage handling infrastructure and technology to improve the efficiency and quality of their services, we also realize that baggage delays do occur and affect many consumers on a daily basis. Data from the Department's Air Travel Consumer Report demonstrate that, in 2015, the 13 largest U.S. carriers received close to 2 million mishandled baggage reports from passengers for their domestic scheduled flights.2 Although these mishandled baggage reports also include reports of lost, damaged, and pilfered baggage in addition to delayed baggage, this figure suggests that the number of delayed baggage incidents is likely significant.3 Since the issuance of the 2011 final rule in which the Department decided not to require airlines to refund baggage fees for delayed bags, many consumers and consumer rights advocacy groups have voiced their opinion that airlines should be required to refund checked baggage fees if they fail to deliver bags on time.

    1 Source: Baggage Fees by Airline 2015, Bureau of Transportation Statistics, Office of the Assistant Secretary for Research and Technology, updated on May 2, 2016. https://www.rita.dot.gov/bts/sites/rita.dot.gov.bts/files/subject_areas/airline_information/baggage_fees/html/2015.html.

    2 Source: Air Travel Consumer Report, February 2016 Edition, Page 31. https://www.rita.dot.gov/bts/sites/rita.dot.gov.bts/files/subject_areas/airline_information/baggage_fees/html/2015.html. The Department does not collect information on mishandled baggage for international flights.

    3 The mishandled baggage data as reported to the Department is based on the number of mishandled baggage reports received from passengers by the reporting carriers. Each report may involve more than one piece of mishandled baggage.

    This matter has also caught the attention of the Congress. In 2016, both the Senate and the House of Representatives included in their Federal Aviation Administration reauthorization bills a provision to require the Department to issue a rule that mandates refunds of baggage fees for delayed bags.4 On July 15, 2016, the President signed into law the FAA Extension, Safety, and Security Act of 2016 (“FAA Extension Act” or “Act”) which includes a requirement for the Department to issue a rule mandating that airlines provide automated refunds to passengers for any fee charged to transport a bag if the bag is delayed.5

    4 Sec. 3109, Federal Aviation Administration Reauthorization Act of 2016, S. 2658, 114th Cong. (2015-2016); Sec. 507, Aviation Innovation, Reform, and Reauthorization Act of 2016, H.R. 4441, 114th Cong. (2015-2016).

    5 See, FAA Extension, Safety, and Security Act of 2016, Public Law 114-190, July 15, 2016.

    Defining a Baggage Delay

    Section 2305 of the FAA Extension Act provides that the Department shall issue a final rule within one year of the enactment of the Act that requires U.S. and foreign carriers to promptly provide an automated refund for any ancillary fees paid by the passenger for checked baggage if the carriers fail to deliver the bag to passengers within 12 hours of arrival for domestic flights and within 15 hours of arrival for international flights, if the passenger notifies the carrier about the delayed or lost baggage. The Act also allows the Department to extend these timeframes to up to 18 hours for domestic flights and up to 30 hours for international flights, if the Department determines that the 12-hour or 15-hour standards are not feasible and would adversely affect consumers in certain cases.

    Each delayed bag affects an individual passenger's travel experience, resulting in inconvenience and other harms. The Department is seeking comments from all stakeholders in order to determine how to implement section 2305 of the Act so the mandated regulation would best achieve Congress' and the Department's goal of mitigating the inconvenience and harm to consumers caused by delayed baggage.

    DOT is seeking comment to help it determine the appropriate length of delay within the statutory parameters that would trigger the refund requirement. As stated above, the Act provides that a refund should be issued to passengers if the carrier fails to deliver the checked baggage to the passenger not later than 12 hours after the arrival of a domestic flight, or not later than 15 hours after the arrival of an international flight. The Act also authorizes the Department to extend these timeframes to up to 18 hours for domestic flights and 30 hours for international flights if the Secretary determines that the 12-hour or 15-hour standards are infeasible and would “adversely affect consumers in certain cases.” The Department invites public input on the 12 and 15 hour standards prescribed in the Act as well as any other standards within the statutory parameters, which are for domestic flights between 12 and 18 hours after the flight's arrival and for international flights between 15 and 30 hours after the flight's arrival. The Department seeks comment on why a particular length of time within this timeframe would be more appropriate than other times.
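
    To make the statutory parameters discussed above concrete, the following Python sketch encodes the Act's default 12-hour (domestic) and 15-hour (international) triggers and the 18-hour and 30-hour outer bounds the Act permits; the function name, its arguments, and the omission of the passenger-notification condition are illustrative assumptions only, not positions the Department has taken.

        def refund_required(delay_hours, international, domestic_threshold=12, international_threshold=15):
            # Illustrative sketch of the refund trigger in section 2305 of the FAA Extension Act.
            # Defaults are the Act's 12-hour (domestic) and 15-hour (international) standards;
            # the Act lets the Department extend them to at most 18 and 30 hours, respectively.
            # The Act's separate condition that the passenger notify the carrier of the delayed
            # or lost bag is not modeled here.
            if not 12 <= domestic_threshold <= 18:
                raise ValueError("domestic threshold must be between 12 and 18 hours")
            if not 15 <= international_threshold <= 30:
                raise ValueError("international threshold must be between 15 and 30 hours")
            threshold = international_threshold if international else domestic_threshold
            return delay_hours > threshold

        # A bag delivered 14 hours after a domestic arrival triggers a refund under the
        # 12-hour default, but not under an 18-hour standard.
        print(refund_required(14, international=False))                         # True
        print(refund_required(14, international=False, domestic_threshold=18))  # False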

    The Department also seeks comment on how the rule should deal with a passenger itinerary that consists of an international flight connecting to a domestic flight. Is there a reason that this itinerary should be considered an international flight within the meaning of the statute, or does the final domestic flight cause the passenger to be treated as domestic for purposes of the statute and rule? Is there a reason to distinguish between a standard interline (i.e., multiple-carrier) connection on a single ticket and a connection constructed by the passenger using two tickets (e.g., where the carriers do not interline with each other)?

    We solicit comments on the ways in which standard industry practice for baggage interlining and mishandled baggage may affect the mandated rule. For example, the last carrier on an interline itinerary is generally responsible for handling a mishandled-baggage report to conclusion, but in the case of a baggage delay on an interline trip, that carrier will generally not be the carrier to whom the passenger paid the baggage fee.

    In addition to situations, such as interline itineraries, in which there are multiple entities involved in the transportation of bags, there are also situations in which multiple entities are involved in the transaction of bag fees. Specifically, although not a common practice among most carriers, there are instances in which a carrier authorizes a ticket agent, by contractual agreement, to collect baggage fees from the ticket agent's customers on behalf of the carrier. To the extent an entity other than the carrier is involved in collecting baggage fees, we seek comments on who should be held responsible for refunding the bag fees for delayed bags. Should we hold both entities responsible? Based on the structure of the agreement between the two entities, and common business practice, what is the best way to ensure that bag fees are refunded in a timely manner and to avoid passengers being sent back and forth between two entities to determine which entity is responsible?

    As the statute gives the Department some flexibility to modify the length of delay taking into consideration feasibility and any negative impact on consumers, we construe the statute's use of the phrase “in certain cases” to mean that Congress intends to provide the Department the flexibility to differentiate the length of delay that triggers a refund based on certain circumstances, if the Department determines this is appropriate, instead of applying one standard to all domestic flights and another standard to all international flights. In that regard, in addition to domestic versus international flights, is there a reason that the rule should establish a secondary set of criteria, such as the flight duration and/or the frequency of service in question? Is the frequency of the operation by the transporting carrier or all carriers that operate on the same route relevant to defining the delay? Since some international flights are short haul flights (e.g., trans-border flights), and some domestic flights can last for over 10 hours (e.g., New York to Honolulu), should we instead tier the delay standard based on the length of the passenger's flight(s)?

    DOT is also seeking comment on how to determine when the clock stops running for purposes of measuring the delay. The Act provides that the 12 hour and 15 hour clock stops when the carrier “delivers the checked baggage to the passenger.” Sometimes, a passenger may stay at the arrival airport and wait for the delayed baggage if the delay is likely to be within a few hours. However, when the delay goes beyond a certain point, the industry's common practice is to deliver the bags to the passenger's residence or a designated location requested by the passenger. In some cases, the passengers may choose to receive notice when their bags arrive and pick up the bags at the carrier's baggage office at the destination airport. How should we determine that the bags have been “delivered” to the passenger and therefore stop the clock from running in each of these situations?

    DOT seeks comment on the number of bags that are delayed annually beyond the 12- to 18-hour and 15- to 30-hour statutory timeframes, as well as the number of bags that are lost. The Department receives information on the number of mishandled-baggage reports filed by passengers, but we do not have data on how many of these are delayed bags, and how many are lost. Information on the number of delayed and lost bags that would be affected by this rulemaking would help the Department to better estimate the impact this rule would have on consumers and airlines.

    Method for Refunding Delayed Baggage

    The Department is also seeking comment on the appropriate method for providing a refund for delayed baggage. The Department's credit card refund regulation, 14 CFR part 374, implements the Consumer Credit Protection Act and Regulation Z of the Board of Governors of the Federal Reserve System, 15 U.S.C. 1601-1693r and 12 CFR part 226 (Regulation Z) with respect to air carriers and foreign air carriers. It states that when refunds are due on purchases with a credit card, a carrier must transmit a credit statement to the credit card issuer within seven business days of receipt of full documentation for the refund requested. In addition, the Department requires that, with respect to purchases with forms of payment other than credit cards, an airline must provide a refund within 20 days of receipt of full documentation of such a request. See 14 CFR 259.5(b)(5). The Department applies these refund standards to all refunds that are due to consumers, including airfare refunds and ancillary fee refunds. In order to receive a refund under Regulation Z, a consumer must request the refund from the carrier and provide all necessary supporting documents. In contrast, the Act states that carriers should “promptly provide an automated refund” to an eligible passenger when the carriers fail to meet the applicable time limit in delivering the checked bag, and the passenger has notified the carrier of the lost or delayed checked baggage. Under the Act, an “automated refund” should be issued to passengers as long as the delay has met the threshold timeframe and the passenger has notified the carrier about the delayed or lost bag. In that regard, we view the delayed baggage fee refund provision in the FAA Extension Act differently from Regulation Z in that the Act only requires a passenger to notify the carrier that a bag is delayed or lost, and there is not a requirement for the passenger to request a refund for the baggage fee. We emphasize that since the Act's automated refund requirement covers all bags that are delayed for more than a set number of hours, it will also cover “lost bags,” for which a refund of the fees charged is already required by 14 CFR 259.5(b)(3).6 As such, both bags delayed for more than the set number of hours and bags that are considered “lost” would be eligible for an automated refund.

    6 We have not defined “lost” for purposes of 14 CFR 259.5(b)(3) mandating a refund of the baggage fee for lost bags. Instead, in a Frequently Asked Questions document issued by the Department's Office of Aviation Enforcement and Proceedings, that office states that if a carrier unreasonably refuses to consider a bag to be lost after it has been missing for a considerable period of time, it could be subject to enforcement action for violating the statutory prohibition against unfair and deceptive practices. See, Answers to Frequently Asked Questions Concerning the Enforcement of the Second Final Rule on Enhancing Airline Passenger Protections (EAPP #2), last updated May 8, 2015, https://www.transportation.gov/sites/dot.gov/files/docs/EAPP_2_FAQ_2_0.pdf.

    The Department seeks comment on whether prescribing a specific mechanism for the carriers to use to provide the statutorily required automated refund would negatively or positively impact carriers and consumers. What procedures would be necessary on interline itineraries, for which the carrier to whom the passenger reports the delayed bag at his or her destination or stopover is not the carrier to whom the passenger had paid the baggage fee? In addition to soliciting comment on all of the issues and concerns identified above, we also welcome any other information relevant to this issue. This specifically includes comments and data on the cost impact on new-entrant carriers (many of whom do not have interline agreements) of the time standard developed in this proceeding, and the cost impact on regional airlines.

    Issued this 18th day of October, 2016, in Washington, DC. Anthony R. Foxx, Secretary of Transportation.
    [FR Doc. 2016-26199 Filed 10-28-16; 8:45 am] BILLING CODE 4910-9X-P
    DEPARTMENT OF HEALTH AND HUMAN SERVICES
    Food and Drug Administration
    21 CFR Parts 1, 112, 117, and 507
    [Docket No. FDA-2016-D-2841]
    Describing a Hazard That Needs Control in Documents Accompanying the Food, as Required by Four Rules Implementing the FDA Food Safety Modernization Act: Guidance for Industry; Availability
    AGENCY:

    Food and Drug Administration, HHS.

    ACTION:

    Notification of availability.

    SUMMARY:

    The Food and Drug Administration (FDA, we, or Agency) is announcing the availability of a draft guidance for industry entitled “Describing a Hazard That Needs Control in Documents Accompanying the Food, as Required by Four Rules Implementing the FDA Food Safety Modernization Act: Guidance for Industry.” This draft guidance explains our current thinking on disclosure statements made by an entity, in documents accompanying food, that certain hazards have not been controlled by that entity as required by certain provisions in four final rules. This document describes our current thinking on how to describe the hazard under each of the four rules and which documents we consider to be “documents of the trade” for the purpose of disclosure statements.

    DATES:

    Although you can comment on any guidance at any time (see 21 CFR 10.115(g)(5)), to ensure that we consider your comment on this draft guidance before we begin work on the final version of the guidance, submit either electronic or written comments on the draft guidance by May 1, 2017. Submit either electronic or written comments on the proposed collection of information by May 1, 2017.

    ADDRESSES:

    You may submit comments as follows:

    Electronic Submissions

    Submit electronic comments in the following way:

    Federal eRulemaking Portal: http://www.regulations.gov. Follow the instructions for submitting comments. Comments submitted electronically, including attachments, to http://www.regulations.gov will be posted to the docket unchanged. Because your comment will be made public, you are solely responsible for ensuring that your comment does not include any confidential information that you or a third party may not wish to be posted, such as medical information, your or anyone else's Social Security number, or confidential business information, such as a manufacturing process. Please note that if you include your name, contact information, or other information that identifies you in the body of your comments, that information will be posted on http://www.regulations.gov.

    • If you want to submit a comment with confidential information that you do not wish to be made available to the public, submit the comment as a written/paper submission and in the manner detailed (see “Written/Paper Submissions” and “Instructions”).

    Written/Paper Submissions

    Submit written/paper submissions as follows:

    Mail/Hand delivery/Courier (for written/paper submissions): Division of Dockets Management (HFA-305), Food and Drug Administration, 5630 Fishers Lane, Rm. 1061, Rockville, MD 20852.

    • For written/paper comments submitted to the Division of Dockets Management, FDA will post your comment, as well as any attachments, except for information submitted, marked and identified, as confidential, if submitted as detailed in “Instructions.”

    Instructions: All submissions received must include the Docket No. FDA-2016-D-2841 for “Describing a Hazard That Needs Control in Documents Accompanying the Food, as Required by Four Rules Implementing the FDA Food Safety Modernization Act: Guidance for Industry.” Received comments will be placed in the docket and, except for those submitted as “Confidential Submissions,” publicly viewable at http://www.regulations.gov or at the Division of Dockets Management between 9 a.m. and 4 p.m., Monday through Friday.

    Confidential Submissions—To submit a comment with confidential information that you do not wish to be made publicly available, submit your comments only as a written/paper submission. You should submit two copies total. One copy will include the information you claim to be confidential with a heading or cover note that states “THIS DOCUMENT CONTAINS CONFIDENTIAL INFORMATION.” The Agency will review this copy, including the claimed confidential information, in its consideration of comments. The second copy, which will have the claimed confidential information redacted/blacked out, will be available for public viewing and posted on http://www.regulations.gov. Submit both copies to the Division of Dockets Management. If you do not wish your name and contact information to be made publicly available, you can provide this information on the cover sheet and not in the body of your comments and you must identify this information as “confidential.” Any information marked as “confidential” will not be disclosed except in accordance with 21 CFR 10.20 and other applicable disclosure law. For more information about FDA's posting of comments to public dockets, see 80 FR 56469, September 18, 2015, or access the information at: http://www.fda.gov/regulatoryinformation/dockets/default.htm.

    Docket: For access to the docket to read background documents or the electronic and written/paper comments received, go to http://www.regulations.gov and insert the docket number, found in brackets in the heading of this document, into the “Search” box and follow the prompts and/or go to the Division of Dockets Management, 5630 Fishers Lane, Rm. 1061, Rockville, MD 20852.

    Submit written requests for single copies of the draft guidance to the Center for Food Safety and Applied Nutrition (HFS-300), Food and Drug Administration, 5001 Campus Drive, College Park, MD 20740. Send two self-addressed adhesive labels to assist that office in processing your request. See the SUPPLEMENTARY INFORMATION section for electronic access to the draft guidance.

    FOR FURTHER INFORMATION CONTACT:

    With regard to this draft guidance: For questions regarding this draft guidance as it relates to our regulation entitled “Current Good Manufacturing Practice, Hazard Analysis, and Risk-Based Preventive Controls for Human Food,” contact Jenny Scott, Center for Food Safety and Applied Nutrition, (HFS-300), Food and Drug Administration, 5001 Campus Dr., College Park, MD 20740, 240-402-2166.

    For questions regarding this draft guidance as it relates to our regulation entitled “Current Good Manufacturing Practice, Hazard Analysis, and Risk-Based Preventive Controls for Food for Animals,” contact Jeanette Murphy, Center for Veterinary Medicine (HFV-200), Food and Drug Administration, 7519 Standish Pl., Rockville, MD 20855, 240-402-6246.

    For questions regarding this draft guidance as it relates to our regulation entitled “Standards for the Growing, Harvesting, Packing, and Holding of Produce for Human Consumption,” contact Samir Assar, Center for Food Safety and Applied Nutrition (HFS-317), Food and Drug Administration, 5001 Campus Dr., College Park, MD 20740, 240-401-1636.

    For questions regarding this draft guidance as it relates to our regulation entitled “Foreign Supplier Verification Programs (FSVP) for Importers of Food for Humans and Animals,” contact Rebecca Buckner, Office of Food and Veterinary Medicine, Food and Drug Administration, 10903 New Hampshire Ave., Silver Spring, MD 20993-0002, 301-796-4576.

    SUPPLEMENTARY INFORMATION:

    I. Background

    We are announcing the availability of a draft guidance for industry entitled “Describing a Hazard That Needs Control in Documents Accompanying the Food, as Required by Four Rules Implementing the FDA Food Safety Modernization Act: Guidance for Industry.” We are issuing the draft guidance consistent with FDA's good guidance practices regulation (21 CFR 10.115). The draft guidance, when finalized, will represent the current thinking of FDA on this topic. It does not create or confer any rights for or on any person and does not operate to bind FDA or the public. You can use an alternate approach if it satisfies the requirements of the applicable statutes and regulations.

    The draft guidance relates to four of the seven foundational rules that we have established in Title 21 of the Code of Federal Regulations (21 CFR) as part of our implementation of the FDA Food Safety Modernization Act (FSMA) (Pub. L. 111-353). Table 1 lists these four rules. Each of these rules includes “customer provisions” as specified in table 1.

    Table 1—The Four Foundational FSMA Rules Relevant to the Draft Guidance

    • Current Good Manufacturing Practice, Hazard Analysis, and Risk-Based Preventive Controls for Human Food (part 117). Regulatory codification: 21 CFR part 117. “Customer provisions”: 21 CFR 117.136(a)(2), (3), and (4). Publication: 80 FR 55908, September 17, 2015.
    • Current Good Manufacturing Practice, Hazard Analysis, and Risk-Based Preventive Controls for Food for Animals (part 507). Regulatory codification: 21 CFR part 507. “Customer provisions”: 21 CFR 507.36(a)(2), (3), and (4). Publication: 80 FR 56170, September 17, 2015.
    • Standards for the Growing, Harvesting, Packing, and Holding of Produce for Human Consumption (produce safety regulation). Regulatory codification: 21 CFR part 112. “Customer provisions”: 21 CFR 112.2(b). Publication: 80 FR 74354, November 27, 2015.
    • Foreign Supplier Verification Programs (FSVP) for Importers of Food for Humans and Animals (FSVP regulation). Regulatory codification: 21 CFR part 1, subpart L. “Customer provisions”: 21 CFR 1.507(a)(2)(i), (a)(3)(i), and (a)(4)(i). Publication: 80 FR 74226, November 27, 2015.

    The “customer provisions” of part 117 and part 507 each include a requirement for a “disclosure statement” in which a manufacturer/processor must disclose, in documents accompanying the food, in accordance with the practice of the trade, that the food is “not processed to control [identified hazard]” in certain circumstances. Likewise, the “customer provisions” of the FSVP regulation include a requirement for a “disclosure statement” in which an importer must disclose, in documents accompanying the food, in accordance with the practice of the trade, that the food is “not processed to control [identified hazard]” in certain circumstances. The “customer provisions” of the produce safety regulation relate to an exemption from that regulation that includes a requirement for a “disclosure statement” in which a farm must disclose, in documents accompanying the food, in accordance with the practice of the trade, that the food is “not processed to adequately reduce the presence of microorganisms of public health significance.”

    The draft guidance responds to industry questions regarding these requirements for a disclosure statement. On March 23, 2016, FDA met with a food trade association at its request to listen to concerns regarding the customer provisions of part 117 (Ref. 1), including concerns regarding the disclosure statement in part 117. At the meeting, the trade association expressed concern about providing a disclosure statement when multiple hazards may be present, including chemical hazards (such as mycotoxins) and physical hazards (such as stones in raw agricultural commodities), as well as for multiple biological hazards (such as microbial pathogens). The trade association also asked us to allow a variety of types of documents that accompany the food to have the disclosure statement (e.g., contractual agreements, Web sites referenced on labels and in contracts, labels, letters of guarantee, shipment-specific certificates of analysis, shipping documents, specifications, and terms and conditions).

    The trade association focused its discussion on the requirements of part 117, but noted that it had parallel concerns for the analogous provisions of part 507 and the FSVP regulation (Ref. 1). Although the trade association did not express concern with the disclosure statement in the produce safety regulation, we believe it will be helpful to businesses subject to the produce safety regulation to include our current thinking on the disclosure statement in all four rules that have requirements for a disclosure statement, not just the three rules mentioned by the trade association.

    II. Paperwork Reduction Act of 1995

    This draft guidance refers to previously approved collections of information found in FDA regulations. These collections of information are subject to review by the Office of Management and Budget (OMB) under the Paperwork Reduction Act of 1995 (44 U.S.C. 3501-3520). The collections of information in 21 CFR part 117 have been approved under OMB control number 0910-0751. The collections of information in 21 CFR part 507 have been approved under OMB control number 0910-0789. The collections of information in 21 CFR part 112 have been approved under OMB control number 0910-0816. The collections of information in 21 CFR part 1, subpart L have been approved under OMB control number 0910-0752.

    III. Electronic Access

    Persons with access to the Internet may obtain the draft guidance at either http://www.fda.gov/FoodGuidances or http://www.regulations.gov. Use the FDA Web site listed in the previous sentence to find the most current version of the guidance.

    IV. References

    The following references are on display in the Division of Dockets Management, 5630 Fishers Lane, Rm. 1061, Rockville, MD 20852, and are available for viewing by interested persons between 9 a.m. and 4 p.m., Monday through Friday; they are also available electronically at http://www.regulations.gov.

    1. Grocery Manufacturers Association, “21 CFR 117.136. Industry Impacts from Disclosure and Written Assurance Requirements,” 2016.

    Dated: October 26, 2016. Leslie Kux, Associate Commissioner for Policy.
    [FR Doc. 2016-26245 Filed 10-28-16; 8:45 am] BILLING CODE 4164-01-P
    DEPARTMENT OF HEALTH AND HUMAN SERVICES
    Food and Drug Administration
    21 CFR Parts 16 and 58
    [Docket No. FDA-2010-N-0548]
    Good Laboratory Practice for Nonclinical Laboratory Studies; Extension of Comment Period
    AGENCY:

    Food and Drug Administration, HHS.

    ACTION:

    Proposed rule; extension of comment period.

    SUMMARY:

    The Food and Drug Administration (FDA) is extending the comment period for the proposed rule that appeared in the Federal Register of August 24, 2016. In the proposed rule, FDA requested comments on its proposal to amend the regulations for good laboratory practice for nonclinical laboratory studies. The Agency is taking this action in response to requests for an extension to allow interested persons additional time to submit comments.

    DATES:

    FDA is extending the comment period on the proposed rule published August 24, 2016 (81 FR 58342). Submit either electronic or written comments by January 21, 2017.

    ADDRESSES:

    You may submit comments as follows:

    Electronic Submissions

    Submit electronic comments in the following way:

    Federal eRulemaking Portal: http://www.regulations.gov. Follow the instructions for submitting comments. Comments submitted electronically, including attachments, to http://www.regulations.gov will be posted to the docket unchanged. Because your comment will be made public, you are solely responsible for ensuring that your comment does not include any confidential information that you or a third party may not wish to be posted, such as medical information, your or anyone else's Social Security number, or confidential business information, such as a manufacturing process. Please note that if you include your name, contact information, or other information that identifies you in the body of your comments, that information will be posted on http://www.regulations.gov.

    • If you want to submit a comment with confidential information that you do not wish to be made available to the public, submit the comment as a written/paper submission and in the manner detailed (see “Written/Paper Submissions” and “Instructions”).

    Written/Paper Submissions

    Submit written/paper submissions as follows:

    Mail/Hand delivery/Courier (for written/paper submissions): Division of Dockets Management (HFA-305), Food and Drug Administration, 5630 Fishers Lane, Rm. 1061, Rockville, MD 20852.

    • For written/paper comments submitted to the Division of Dockets Management, FDA will post your comment, as well as any attachments, except for information submitted, marked and identified, as confidential, if submitted as detailed in “Instructions.”

    Instructions: All submissions received must include the Docket No. FDA-2010-N-0548 for “Good Laboratory Practice for Nonclinical Laboratory Studies.” Received comments will be placed in the docket and, except for those submitted as “Confidential Submissions,” publicly viewable at http://www.regulations.gov or at the Division of Dockets Management between 9 a.m. and 4 p.m., Monday through Friday.

    Confidential Submissions—To submit a comment with confidential information that you do not wish to be made publicly available, submit your comments only as a written/paper submission. You should submit two copies total. One copy will include the information you claim to be confidential with a heading or cover note that states “THIS DOCUMENT CONTAINS CONFIDENTIAL INFORMATION.” The Agency will review this copy, including the claimed confidential information, in its consideration of comments. The second copy, which will have the claimed confidential information redacted/blacked out, will be available for public viewing and posted on http://www.regulations.gov. Submit both copies to the Division of Dockets Management. If you do not wish your name and contact information to be made publicly available, you can provide this information on the cover sheet and not in the body of your comments and you must identify this information as “confidential.” Any information marked as “confidential” will not be disclosed except in accordance with 21 CFR 10.20 and other applicable disclosure law. For more information about FDA's posting of comments to public dockets, see 80 FR 56469, September 18, 2015, or access the information at: http://www.fda.gov/regulatoryinformation/dockets/default.htm.

    Docket: For access to the docket to read background documents or the electronic and written/paper comments received, go to http://www.regulations.gov and insert the docket number, found in brackets in the heading of this document, into the “Search” box and follow the prompts and/or go to the Division of Dockets Management, 5630 Fishers Lane, Rm. 1061, Rockville, MD 20852.

    FOR FURTHER INFORMATION CONTACT:

    Vernon Toelle, Office of Surveillance and Compliance, Center for Veterinary Medicine, Food and Drug Administration, 7519 Standish Pl., MPN4-142, Rockville, MD 20855, 240-402-5637; or Kristin Webster Maloney, Office of Policy and Risk Management, Office of Regulatory Affairs, Food and Drug Administration, 10903 New Hampshire Ave., Bldg. 32, Rm. 4373, Silver Spring, MD 20993, 240-402-4993.

    SUPPLEMENTARY INFORMATION:

    In the Federal Register of August 24, 2016, FDA published a proposed rule with a 90-day comment period to request comments on its proposal to amend the regulations for good laboratory practice for nonclinical laboratory studies. Comments on the proposed amendments will inform FDA's rulemaking to establish regulations for good laboratory practice for nonclinical laboratory studies.

    The Agency has received requests for a 90-day extension of the comment period for the proposed rule. Each request conveyed concern that the current 90-day comment period does not allow sufficient time to develop a meaningful or thoughtful response to the proposed rule.

    FDA has considered the requests and is extending the comment period for the proposed rule for 60 days, until January 21, 2017. The Agency believes that a 60-day extension allows adequate time for interested persons to submit comments without significantly delaying rulemaking on these important issues.

    Dated: October 26, 2016. Leslie Kux, Associate Commissioner for Policy.
    [FR Doc. 2016-26244 Filed 10-28-16; 8:45 am] BILLING CODE 4164-01-P
    DEPARTMENT OF DEFENSE
    Office of the Secretary
    32 CFR Part 250
    [Docket ID: DOD-2015-OS-0126]
    RIN 0790-AI73
    Withholding of Unclassified Technical Data and Technology From Public Disclosure
    AGENCY:

    Office of the Under Secretary of Defense for Acquisition, Technology, and Logistics, DoD.

    ACTION:

    Proposed rule.

    SUMMARY:

    This rulemaking establishes policy, assigns responsibilities, and prescribes procedures for the dissemination and withholding of certain unclassified technical data and technology subject to the International Traffic in Arms Regulations (ITAR) and the Export Administration Regulations (EAR). It applies to DoD components and their contractors and grantees, and is meant to control the transfer of technical data and technology contributing to the military potential of any country or countries, groups, or individuals that could prove detrimental to U.S. national security or critical interests.

    DATES:

    Comments must be received by December 30, 2016.

    ADDRESSES:

    You may submit comments, identified by docket number and/or RIN number and title, by any of the following methods:

    Federal Rulemaking Portal: http://www.regulations.gov. Follow the instructions for submitting comments.

    Mail: Department of Defense, Office of the Deputy Chief Management Officer, Directorate for Oversight and Compliance, 4800 Mark Center Drive, Mailbox #24, Alexandria, VA 22350-1700.

    Instructions: All submissions received must include the agency name and docket number or Regulatory Information Number (RIN) for this Federal Register document. The general policy for comments and other submissions from members of the public is to make these submissions available for public viewing on the Internet at http://www.regulations.gov as they are received without change, including any personal identifiers or contact information.

    FOR FURTHER INFORMATION CONTACT:

    Vakare Valaitis, 703-767-9159.

    SUPPLEMENTARY INFORMATION: Background

    For the purposes of this regulation, public disclosure of technical data and technology is the same as providing uncontrolled foreign access. This rule instructs DoD employees, contractors, and grantees to ensure that unclassified technical data and technology disclosing technology or information with a military or space application is not exported without authorization and is controlled and disseminated consistent with U.S. export control laws and regulations. These policies preserve the U.S. military's technological superiority, establish and maintain interoperability with allies and coalition partners, and manage direct and indirect impacts on the defense industrial base.

    There are penalties for export control violations. For export control violations involving items controlled by the United States Department of State under the International Traffic in Arms Regulations (ITAR), including many munitions items, the statute authorizes a maximum criminal penalty of $1 million per violation and, for an individual person, up to 10 years imprisonment. In addition, ITAR violations can result in the imposition of a maximum civil fine of $500,000 per violation, as well as debarment from exporting defense articles or services. For export control violations involving dual-use and certain munitions items controlled by the United States Department of Commerce under the Export Administration Regulations (EAR), criminal and civil penalties are currently provided by the International Emergency Economic Powers Act (IEEPA), 50 U.S.C. 1705, which has continued the EAR in effect while the Export Administration Act is in lapse through Executive Order 13222 of August 17, 2001 (3 CFR 2001 Comp. 783 (2002)), as amended by Executive Order 13637 of March 8, 2013, 78 FR 16129 (March 13, 2013), and as extended by successive Presidential Notices, the most recent being that of August 4, 2016 (81 FR 52587 (Aug. 8, 2016)). Under the EAR and IEEPA, as adjusted by 15 CFR 5.4(b), the penalty for persons who violate, attempt or conspire to violate, or cause a violation of the export control regulations includes civil penalties of not more than $284,582 per transaction or twice the amount of the transaction, whichever is greater, and criminal penalties of not more than $1,000,000, imprisonment of not more than 20 years, or both. Violations of the EAR may also result in the denial of export privileges and other administrative sanctions.
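
    For illustration only, the EAR/IEEPA civil penalty ceiling described above is the greater of the adjusted statutory cap or twice the value of the transaction. The short sketch below works through that comparison; the function name and the example amounts are hypothetical and are not part of this rulemaking.

        # Hedged sketch of the "whichever is greater" civil penalty ceiling
        # described in this notice (EAR/IEEPA). The cap figure is the amount
        # stated in the notice; it is adjusted periodically for inflation.
        def max_civil_penalty(transaction_amount: float) -> float:
            STATUTORY_CAP = 284_582.00  # adjusted cap per transaction, as stated above
            return max(STATUTORY_CAP, 2 * transaction_amount)

        # Example (hypothetical amounts): a $500,000 transaction yields a
        # $1,000,000 ceiling; a $50,000 transaction yields the $284,582 cap.
        print(max_civil_penalty(500_000.00))  # 1000000.0
        print(max_civil_penalty(50_000.00))   # 284582.0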

    Authority To Issue This Regulation

    In accordance with 10 U.S.C. 133(b)(2), the Under Secretary of Defense for Acquisition, Technology, and Logistics (USD(AT&L)) may exercise powers relating to establishing policies for acquisition (including procurement of goods and services, research and development, developmental testing, and contract administration) for all elements of the Department of Defense. In addition, U.S. export control laws, including 22 U.S.C. 2778 (also known as the “Arms Export Control Act”); 50 U.S.C. chapter 35 (also known as the “International Emergency Economic Powers Act” (IEEPA)); 22 CFR parts 120 through 130 (also known as “International Traffic in Arms Regulations” (ITAR)); and 15 CFR parts 730 through 774 (also known as “Export Administration Regulations” (EAR)), govern this rule.

    Summary of the Major Provisions of the Rulemaking

    This proposed rule describes procedures for the release of technical information; discusses procedures for technical data and technology to be marked for distribution; and provides an example of the notice to accompany export-controlled technical data and technology.

    Costs and Benefits

    DoD is proposing this regulation to update the CFR and DoD Directive 5230.25 (available at http://dtic.mil/whs/directives/corres/pdf/523025p.pdf). The Department currently spends $571,876 annually on export control certification activities. The costs to DoD contractors and grantees consist primarily of the time needed to organize, format, and submit information to the U.S./Canada Joint Certification Office to qualify for access to export controlled technical data and technology.

    No discernible increase in anticipated costs or benefits is expected, as the program is being updated to conform to the national security guidance cited in §§ 250.1 through 250.7.

    The potential benefits include greater public access to and understanding of information about the qualifications needed for access to export controlled technical data and technology. Such information may help potential contractors and grantees to better understand their options for participating in DoD activities; to better enable funders and researchers to determine the need for information and technology; to provide more complete information to those who use information from DoD research and contracts to inform other decisions; and to better enable the scientific community to examine the overall state of information and technology in this area as a basis for engaging in quality improvement (e.g., with regard to research methods). The proposed rule is also expected to provide greater clarity about what is required for those who are authorized holders of export controlled technical data and technology.

    This proposed rule is included in DoD's retrospective plan, completed in August 2011, and will be reported in future status updates of DoD's retrospective review in accordance with the requirements in Executive Order 13563. DoD's full plan can be accessed at: http://www.regulations.gov/#!docketDetail;D=DOD-2011-OS-0036.

    Regulatory Procedures Executive Order 12866, “Regulatory Planning and Review” and Executive Order 13563, “Improving Regulation and Regulatory Review”

    Executive Orders 13563 and 12866 direct agencies to assess all costs and benefits of available regulatory alternatives and, if regulation is necessary, to select regulatory approaches that maximize net benefits (including potential economic, environmental, public health and safety effects, distributive impacts, and equity). Executive Order 13563 emphasizes the importance of quantifying both costs and benefits, of reducing costs, of harmonizing rules, and of promoting flexibility. Although this rulemaking is not “economically significant” because it does not have an annual effect on the economy of $100 million or more or adversely affect in a material way the economy, it has been deemed “other significant” for raising novel legal or policy issues arising out of legal mandates, the President's priorities, or the principles set forth in these Executive Orders. For that reason, it has been reviewed by the Office of Management and Budget (OMB).

    Section 202, Public Law 104-4, “Unfunded Mandates Reform Act”

    Section 202 of the Unfunded Mandates Reform Act of 1995 (UMRA) (Pub. L. 104-4) requires agencies to assess anticipated costs and benefits before issuing any rule whose mandates require spending in any 1 year of $100 million in 1995 dollars, updated annually for inflation. In 2014, that threshold was approximately $141 million. This proposed rule would not mandate any requirements for State, local, or tribal governments, nor would it affect private sector costs.

    Public Law 96-354, “Regulatory Flexibility Act” (5 U.S.C. 601)

    The Department of Defense certifies that this proposed rule is not subject to the Regulatory Flexibility Act (5 U.S.C. 601) because it would not, if promulgated, have a significant economic impact on a substantial number of small entities. Therefore, the Regulatory Flexibility Act, as amended, does not require us to prepare a regulatory flexibility analysis.

    Public Law 96-511, “Paperwork Reduction Act” (44 U.S.C. Chapter 35)

    It has been certified that this proposed rule does impose reporting or recordkeeping requirements under the Paperwork Reduction Act of 1995. These reporting requirements have been approved by OMB under OMB Control Number 0704-0207 titled DD Form 2345, Militarily Critical Technical Data Agreement.

    Cost to the Public

    In exchange for Government-owned unclassified export controlled technical data and technology, a contractor provides basic company information, identifies a technical data and technology custodian, and describes need-to-know. The reporting burden is estimated to average 20 minutes per response. The DD Form 2345 and supporting documentation must be submitted to the U.S./Canada Joint Certification Office in hardcopy. Approximately 24,000 U.S. companies have active certifications.
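
    For illustration only, the $9.94 labor cost per response shown in the table below can be reproduced from the Bureau of Labor Statistics weekly earnings figure cited there if a standard 40-hour workweek is assumed. The sketch below, including that workweek assumption and the variable names, is illustrative and is not part of this notice.

        # Hedged sketch: deriving the per-response labor cost from the BLS
        # median weekly earnings figure cited in the table below.
        WEEKLY_EARNINGS = 1193.00  # 2014 median weekly earnings, bachelor's degree or higher
        HOURS_PER_WEEK = 40        # assumed full-time workweek (not stated in the notice)
        BURDEN_MINUTES = 20        # estimated reporting burden per DD Form 2345 response

        hourly_rate = WEEKLY_EARNINGS / HOURS_PER_WEEK               # about $29.83 per hour
        labor_cost_per_response = hourly_rate * BURDEN_MINUTES / 60  # about $9.94 per response
        print(f"Labor cost per response: ${labor_cost_per_response:.2f}")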

    Responses per year: 24,000.
    Labor cost per response: $9.94.*
    Postage per response: $19.99.**
    Estimated total cost to the public: $638,400.

    * U.S. Department of Labor, Bureau of Labor Statistics, 2014 median weekly earnings of full-time workers with at least a bachelor's degree: $1,193. http://www.bls.gov/spotlight/2015/a-look-at-pay-at-the-top-the-bottom-and-in-between/home.htm.

    ** Most applicants choose the Priority Mail Express Flat Rate Envelope; USPS Postage Price Calculator, http://postcalc.usps.com/.

    Cost to the Government

    4 FTE registrars (GS-9, step 5, $59,036 *): $236,144.
    1 FTE team lead (GS-11, step 5, $71,429 *): $71,429.
    0.5 FTE U.S. representative (GS-13, step 5, $101,807): $50,904.
    0.25 FTE division chief (GS-14, step 5, $120,303): $30,075.
    0.25 FTE director (GS-15, step 5, $35,378 *): $35,378.
    O&M for IT (SP4701-15-F-0031, $2,958,915): $147,946.
    Total: $571,876.

    * 2014 General Schedule (Base), Office of Personnel Management, Salaries and Wages, https://www.opm.gov/policy-data-oversight/pay-leave/salaries-wages/2014/general-schedule/.

    Executive Order 13132, “Federalism”

    Executive Order 13132 establishes certain requirements that an agency must meet when it promulgates a proposed rule (and subsequent final rule) that imposes substantial direct requirement costs on State and local governments, preempts State law, or otherwise has federalism implications. This proposed rule will not have a substantial effect on State and local governments.

    List of Subjects in 32 CFR Part 250

    Exports, Science and technology.

    Accordingly, 32 CFR part 250 is proposed to be revised to read as follows:

    PART 250—WITHHOLDING OF UNCLASSIFIED TECHNICAL DATA AND TECHNOLOGY FROM PUBLIC DISCLOSURE Sec. 250.1 Purpose. 250.2 Applicability. 250.3 Definitions. 250.4 Policy. 250.5 Responsibilities. 250.6 Procedures. 250.7 Directly arranged visits. Authority:

    10 U.S.C. 133.

    § 250.1 Purpose.

    This part establishes policy, assigns responsibilities, and prescribes procedures for the dissemination and withholding of certain unclassified technical data and technology consistent with the requirements of 10 U.S.C. 130.

    § 250.2 Applicability.

    This part:

    (a) Applies to:

    (1) The Office of the Secretary of Defense, the Military Departments, the Office of the Chairman of the Joint Chiefs of Staff and the Joint Staff, the Combatant Commands, the Office of Inspector General of the Department of Defense, the Defense Agencies, the DoD Field Activities, and all other organizational entities within the DoD (referred to collectively in this part as the “DoD Components”).

    (2) All unclassified technical data and technology that discloses technology or information with military or space application, in the possession or under the control of a DoD Component, that may not be exported lawfully without an approval, authorization, license, license exception, or exemption in accordance with U.S. export control laws and regulations: 22 U.S.C. 2778 (also known as the “Arms Export Control Act”); 50 U.S.C. chapter 35 (also known as the “International Emergency Economic Powers Act”); 22 CFR parts 120-130 (also known as “International Traffic in Arms Regulations” (ITAR)); and 15 CFR parts 730 through 774 (also known as “Export Administration Regulations” (EAR)).

    (b) Does not modify or supplant the regulations governing the export of technical data and technology established by 22 U.S.C. 2778, 50 U.S.C. chapter 35, 22 CFR parts 120 through 130, 10 CFR 810, and 15 CFR parts 730 through 774.

    (c) Does not apply to technical information under the control of the Department of Energy or the Nuclear Regulatory Commission pursuant to the Atomic Energy Act of 1954, as amended, and the Nuclear Non-Proliferation Act of 1978, as amended.

    (d) Does not introduce any additional controls on the dissemination of technical data and technology by private enterprises or individuals beyond those specified by export control laws and regulations or in contracts or other agreements, including certifications as specified in paragraph (a)(9) of § 250.5. Accordingly, the fact that DoD may possess such technical data and technology does not in itself provide a basis for control of such technical data and technology under this part.

    (e) Does not introduce any controls on the dissemination of:

    (1) Scientific, educational, or other items that are not subject to the EAR or exclusively controlled for export or reexport by another department or agency pursuant to 15 CFR 734.3, 734.7 through 734.8;

    (2) Information in the public domain as described in 22 CFR 120.11 and technical data that has been approved for release in accordance with 22 CFR 125.4(b)(13).

    (f) Does not alter the responsibilities of the DoD Components to protect proprietary technical data and technology of a private party, including:

    (1) In which the DoD has less than unlimited rights (e.g., pursuant to 48 CFR 227.7202, 252.227-7013, 252.227-7014, 252.227-7015, and 252.227-7018); and

    (2) That is authorized to be withheld from public disclosure pursuant to 5 U.S.C. 552, also known and referred to in this part as the “Freedom of Information Act (FOIA).”

    (g) Does not pertain to or affect the release of technical data and technology by DoD Components to foreign governments, international organizations or their respective representatives, or contractors pursuant to official agreements or formal arrangements with the U.S. Government (USG), or pursuant to USG-licensed transactions involving such entities or individuals. However, in the absence of such USG-sanctioned relationships this part does apply.

    (h) Does not apply to classified technical data. However, after declassification, dissemination of the technical data and technology within the scope of paragraph (a)(2) of this section is governed by this part.

    (i) Does not alter the responsibilities of the DoD Components to mark and protect information qualifying for designation as controlled unclassified information in accordance with Executive Order 13556, “Controlled Unclassified Information,” as implemented by volume 4 of DoD Manual 5200.01, “DoD Information Security Program” (available at http://www.dtic.mil/whs/directives/corres/pdf/520001_vol4.pdf).

    § 250.3 Definitions.

    Unless otherwise noted, these terms and their definitions are for the purpose of this part.

    Certification. The United States-Canada Joint Certification Program certifies contractors of each country for access, on an equally favorable basis, to unclassified technical data and technology that discloses technology or information with military or space application controlled in the United States by this part and in Canada by Canada Minister of Justice, Technical Data Control Regulations SOR/86-345, May 27, 2014 current edition (available at http://laws-lois.justice.gc.ca/PDF/SOR-86-345.pdf).

    Controlling DoD office. The DoD activity that sponsored the work that generated the technical data and technology or received the technical data and technology on behalf of the DoD and therefore is responsible for determining the distribution of a document containing the technical data and technology. In the case of joint sponsorship, the controlling office is determined by advance agreement and may be a party, a group, or a committee representing the interested activities or the DoD Components.

    Critical technology. Technology or technologies essential to the design, development, production, operation, application, or maintenance of a defense or dual-use article or service, which makes or could make a significant contribution to the military potential of any country, including the United States (also referred to as militarily critical technology). This includes, but is not limited to, design and manufacturing know-how, technical data, keystone equipment including manufacturing, inspection, and test equipment that is required for the effective application of technical information and technical know-how.

    (1) With respect to defense articles or defense services: Those technologies specified in 22 CFR 121.1.

    (2) With respect to categories of systems, equipment, and components; test, inspection, and production equipment; materials; software; and technology subject to the EAR: Those technologies specified in 15 CFR part 774.

    (3) With respect to nuclear equipment, materials, and technology: Those technologies specified in 10 CFR part 810.

    (4) With respect to select agents and toxins: Those technologies specified in 7 CFR part 331, 9 CFR part 121, and 42 CFR part 73; and any other technologies affecting the critical infrastructure.

    (5) With respect to emerging critical defense technology: Research and engineering development, or engineering and technology integration that will produce a defense article or defense service, including its underlying technology and software, covered by 22 CFR parts 120 through 130, or a dual-use or munitions item, including its underlying technology and software, covered by 15 CFR parts 730 through 774.

    Defense article. Defined at 22 CFR 120.6.

    Defense services. Defined at 22 CFR 120.9.

    Formal arrangement. An instrument that provides the formal authorization to establish a voluntary agreement between two or more parties for mutual sharing of resources and tasks to achieve a common set of objectives, such as The Technical Cooperation Program.

    Legitimate business relationship. A relationship in which the DoD determines that a need exists to acquire, share, exchange, or disseminate DoD technical information to anyone other than a DoD employee for supporting the DoD mission. The relationship may be established by a memorandum of understanding, agreement, contract, or grant. The DoD has the sole responsibility for determining that a legitimate business relationship exists since the only purpose is to provide access to information created by or under the control of the DoD. Relationships may be established with an individual or organization in another Federal department or agency; contractors, grantees, or potential DoD contractors; other branches of the Federal Government; State and local governments; and foreign countries.

    Limited rights. The rights to use, modify, reproduce, release, perform, display, or disclose technical data and technology, in whole or in part, within the government.

    Other legitimate business purposes. Include:

    (1) Providing or seeking to provide equipment or technology to a foreign government with USG approval (for example, through foreign military sale).

    (2) Bidding, or preparing to bid, on a sale of surplus property.

    (3) Selling or producing products for the commercial domestic marketplace or for the commercial foreign marketplace, provided that any required export license is obtained.

    (4) Engaging in scientific research in a professional capacity.

    (5) Acting as a subcontractor to a qualified contractor.

    Potential DoD contractor. An individual or organization outside the DoD declared eligible for DoD information services by a sponsoring DoD activity.

    Public disclosure. Making technical data available without restricting its dissemination or use.

    Qualified contractor. A qualified U.S. contractor or a qualified Canadian contractor referred to in and governed by Canada Minister of Justice, Technical Data Control Regulations SOR/86-345, May 27, 2014 current edition and certified in the Joint Certification Program through acceptance of a valid DD Form 2345.

    Qualified Canadian contractor. Canadian contractors are qualified for technical data and technology that do not require a license or other authorization for export to Canada under 22 CFR 126.5 by submitting a certification request to the United States-Canada Joint Certification Office established at the Defense Logistics Agency, Battle Creek, Michigan, in accordance with the “Memorandum of Understanding Between the Government of Canada and the Government of the United States Concerning Strategic Technical Exchange”.

    Qualified U.S. contractor. A private individual or enterprise that, in accordance with procedures established by the Under Secretary of Defense for Acquisition, Technology, and Logistics (USD(AT&L)) and as a condition of obtaining export-controlled technical data and technology subject to this part from the DoD:

    (1) Certifies that the individual who will act as recipient of the export-controlled technical data and technology on behalf of the U.S. contractor is a U.S. citizen or a person admitted lawfully into the United States for permanent residence and is located in the United States.

    (2) Certifies that such data and technology are needed to bid or perform on a contract with the DoD or other USG agency, or for other legitimate business purposes in which the U.S. contractor is engaged or plans to engage. The purpose for which the data and technology are needed must be described sufficiently in such certification to permit an evaluation of whether subsequent requests for data and technology are related properly to such business purpose.

    (3) Acknowledges its responsibilities under U.S. export control laws and regulations (including the obligation, under certain circumstances, to obtain an export license prior to the release of technical data and technology within the United States) and agrees that it will not disseminate any export-controlled technical data and technology subject to this part in violation of applicable export control laws and regulations.

    (4) Agrees that, unless dissemination is permitted by paragraph (i) of § 250.6, it will not provide access, including network access, to export-controlled technical data and technology subject to this part to persons other than its employees or persons acting on its behalf who meet the same citizenship or residency requirements, without the permission of the DoD Component that provided the technical data and technology.

    (5) To the best of its knowledge, knows of no person employed by it or acting on its behalf who will have access to such data and technology and who is debarred, suspended, or otherwise ineligible to perform on USG contracts, or who has violated U.S. export control laws or a certification previously made to the DoD under the provisions of this part.

    (6) Asserts that it is not debarred, suspended, or otherwise determined ineligible by any agency of the USG to perform on USG contracts, has not been convicted of export control law violations, and has not been disqualified under the provisions of this part.

    (7) Requests the certification be accepted based on its description of extenuating circumstances when the certifications required by this definition cannot be made truthfully.

    Restricted rights. The government's rights to use a computer program with one computer at one time. Applicable only to noncommercial computer software.

    Technical data. Defined at 22 CFR 120.10.

    (1) Classified data relating to defense articles and defense services on the U.S. Munitions List;

    (2) Information covered by an invention secrecy order; or

    (3) Software (see 22 CFR 120.45(f)) directly related to defense articles.

    (4) The definition does not include information concerning general scientific, mathematical, or engineering principles commonly taught in schools, colleges, and universities, or information in the public domain as defined in 22 CFR 120.11 or telemetry data as defined in note 3 to Category XV(f) of 22 CFR part 121. It also does not include basic marketing information on function or purpose or general system descriptions of defense articles.

    Technical information. Includes technical data and technology as defined in 15 CFR parts 730 through 774, as well as technical information that is not subject to 22 CFR parts 120 through 130 or 15 CFR parts 730 through 774. It also includes technical data or computer software of any kind that can be used or adapted for use in the design, production, manufacture, assembly, repair, overhaul, processing, engineering, development, operation, maintenance, adapting, testing, or reconstruction of goods or materiel; or any technology that advances the state of the art, or establishes a new art, in an area of significant military or space applicability in the United States. The data may be in tangible form, such as a blueprint, photograph, plan, instruction, or an operating manual, or may be intangible, such as a technical service or oral, auditory, or visual descriptions. Examples of technical data include research and engineering data, engineering drawings, and associated lists, specifications, standards, process sheets, manuals, technical reports, technical orders, catalog item identifications, data sets, studies and analyses and related information, and computer software.

    Technology. Defined in 15 CFR 772.1.

    United States. The 50 States, the District of Columbia, and the territories and possessions of the United States.

    United States-Canada Joint Certification Office. The office established to certify contractors of each country for access, on an equally favorable basis, to unclassified technical data and technology disclosing technology controlled in the United States by this part and in Canada by Canada Minister of Justice, Technical Data Control Regulations SOR/86-345, May 27, 2014 current edition.

    U.S. DoD contractor. Those qualified U.S. contractors currently holding grants or contracts with DoD or those contractors declared eligible for DoD information services by a sponsoring DoD activity on the basis of participation in a DoD Potential Contractor Program.

    § 250.4 Policy.

    It is DoD policy that:

    (a) Pursuant to 10 U.S.C. 130 and 133, the Secretary of Defense may withhold from public disclosure any technical data and technology with military or space application in the possession or under the control of the DoD, if such technical data and technology may not be exported lawfully without a license, exception, exemption, or other export authorization, in accordance with U.S. export control laws and regulations (including 22 U.S.C. 2778, 50 U.S.C. chapter 35, 22 CFR parts 120 through 130, and 15 CFR parts 730 through 774). However, technical data and technology may not be withheld if regulations distributed in accordance with 22 U.S.C. 2778 authorize the export of such technical data and technology pursuant to a general unrestricted license or exemption.

    (b) Because public disclosure of technical data and technology subject to this part is the same as providing uncontrolled foreign access, withholding such technical data and technology from public disclosure, unless approved, authorized, or licensed in accordance with export control laws, is necessary and in the national interest.

    (c) Notwithstanding the authority in paragraph (c)(1) of this section, it is DoD policy to provide technical data and technology governed by this part to individuals and enterprises that are:

    (1) Currently qualified U.S. contractors, when such technical data and technology relate to a legitimate business purpose for which the contractor is certified; or

    (2) A certified Canadian contractor referred to in and governed by Canada Minister of Justice, Technical Data Control Regulations SOR/86-345, May 27, 2014 current edition (available at http://laws-lois.justice.gc.ca/PDF/SOR-86-345.pdf) and registered at the United States-Canada Joint Certification Office when a legitimate business relationship has been established between the government and the contractor.

    (d) This part may not be used by the DoD Components as authority to deny access to technical data and technology to the Congress or to any Federal, State, or local government agency that requires the technical data and technology for regulatory or other official government purposes. Dissemination of the technical data and technology will include a statement that DoD controls it, in accordance with this part.

    (e) The authority in this part may not be used to withhold from public disclosure unclassified information regarding DoD operations, policies, activities, or programs, including the costs and evaluations of performance and reliability of military and space equipment. When information does contain technical data and technology subject to this part, the technical data and technology must be excised from what is disclosed publicly.

    (f) This part may not be used as a basis for the release of limited rights or restricted rights data as defined in 48 CFR or those that are authorized to be withheld from public disclosure pursuant to 5 U.S.C. 552.

    (g) This part may not be used to provide protection for technical data that should be classified in accordance with Executive Order 13526, “Classified National Security Information,” and volume 1 of DoD Manual 5200.01 (available at http://www.dtic.mil/whs/directives/corres/pdf/520001_vol1.pdf).

    (h) This part provides immediate authority to cite section (b)(3) of 5 U.S.C. 552 (FOIA Exemption 3), described in 32 CFR part 286, as the basis for denials under 5 U.S.C. 552 of technical data and technology currently determined to be subject to the provisions of this part. The technical data will be withheld under the authority of 10 U.S.C. 130. If the information originated with or is under the control of a Government Agency outside the DoD, DoD Components will refer the request to that Government Agency for a release determination.

    (i) Technical data and technology subject to this part must be marked in accordance with DoD Instruction 5230.24, “Distribution Statements on Technical Documents” (available at http://www.dtic.mil/whs/directives/corres/pdf/523024p.pdf) and volume 4 of DoD Manual 5200.01 and released in accordance with DoD Instruction 2040.02, “International Transfers of Technology, Articles, and Services” (available at http://www.dtic.mil/whs/directives/corres/pdf/204002_2014.pdf), DoD Directive 5230.09, “Clearance of DoD Information for Public Release” (available at http://www.dtic.mil/whs/directives/corres/pdf/523009p.pdf), DoD Instruction 5230.29, “Security and Policy Review of DoD Information for Public Release” (available at http://www.dtic.mil/whs/directives/corres/pdf/523029p.pdf), and 32 CFR part 285.

    (j) Technical data and technology subject to this part, when disseminated electronically, must be marked in accordance with volume 4 of DoD Manual 5200.01 and are subject to all applicable security requirements specified in DoD Instruction 8500.01, “Cybersecurity” (available at http://www.dtic.mil/whs/directives/corres/pdf/850001_2014.pdf) and Chairman of the Joint Chiefs of Staff Instruction 6510.01F, “Information Assurance (IA) and Support to Computer Network Defense (CND),” February 9, 2011, as amended (available at http://www.dtic.mil/cjcs_directives/cdata/unlimit/6510_01.pdf).

    (k) In accordance with DoD Instruction 5015.02, “DoD Records Management Program” (available at http://www.dtic.mil/whs/directives/corres/pdf/501502p.pdf), technical data and technology subject to this part must be maintained and managed consistent with National Archives and Records Administration approved dispositions to ensure proper maintenance, use, accessibility, and preservation, regardless of format or medium.

    § 250.5 Responsibilities.

    (a) The Under Secretary of Defense for Acquisition, Technology, and Logistics (USD(AT&L)) has overall responsibility for the implementation of this part and will designate an office to:

    (1) Administer and monitor compliance with this part.

    (2) Receive and disseminate notifications of temporary revocation of contractor qualification in accordance with paragraph (e) of § 250.6.

    (3) Receive recommendations for contractor disqualification made in accordance with paragraph (f) of § 250.6, and act as disqualification authority.

    (4) Provide technical assistance when necessary to the DoD Components to assess the significance of the military or space application of technical data and technology that may be withheld from public disclosure in accordance with this part.

    (5) Maintain and update procedures and appropriate mechanisms for the certification of qualified contractors, in accordance with paragraph (c) of § 250.4 of this part.

    (6) Ensure that the requirements of this part are incorporated into 48 CFR for application to contracts involving technical data and technology governed by this part.

    (7) Develop, in conjunction with the Office of the General Counsel of the Department of Defense (GC DoD), guidelines for responding to appeals, as identified in paragraph (k) of § 250.6.

    (8) Develop procedures to ensure that the DoD Components apply consistent criteria in authorizing exceptions in accordance with paragraph (j) of § 250.6.

    (9) Prescribe procedures to develop, collect, and disseminate certification statements; to ensure their sufficiency, accuracy, and periodic renewal; and to make final determinations of qualification.

    (10) Take such other actions that may be required to ensure consistent and appropriate implementation of this part within the DoD.

    (b) The Under Secretary of Defense for Policy (USD(P)):

    (1) Prepares and issues policy guidance regarding the foreign disclosure and security controls for information in international programs within the scope of this part.

    (2) Provides consultation to DoD offices on export control and commodity jurisdiction determinations.

    (c) The Deputy Chief Management Officer (DCMO) of the Department of Defense:

    (1) Monitors the implementation of the provisions of this part that pertain to 5 U.S.C. 552 and 32 CFR part 285.

    (2) Provides such other assistance as may be necessary to ensure compliance with this part.

    (d) The GC DoD:

    (1) Advises DoD Components with respect to the statutory and regulatory requirements governing the export of technical data and technology.

    (2) Advises the USD(AT&L) regarding consistent and appropriate implementation of this part.

    (e) The DoD Component heads:

    (1) Disseminate and withhold from public disclosure technical data and technology subject to this part consistent with its policies and procedures.

    (2) Designate a focal point to:

    (i) Ensure implementation of this part.

    (ii) Identify classes of technical data and technology whose release is governed by paragraph (d)(3) of § 250.6.

    (iii) Act on appeals relating to case-by-case denials for release of technical data and technology.

    (iv) Temporarily revoke a contractor's qualification in accordance with paragraph (e) of § 250.6.

    (v) Receive and evaluate requests for reinstatement of a contractor's qualification in accordance with paragraph (e)(4) of § 250.6.

    (vi) Recommend a contractor's disqualification to the USD(AT&L) in accordance with paragraph (f) of § 250.6.

    (3) Develop, distribute, and effect Component regulations to implement this part.

    (4) Ensure that the controlling DoD office that created or sponsored the technical information exercises its inherently governmental responsibility to determine the appropriate marking in accordance with DoD Instruction 5230.24 and volumes 2 and 4 of DoD Manual 5200.01 (volume 2 available at http://www.dtic.mil/whs/directives/corres/pdf/520001_vol2.pdf) and that all technical documents, including research, development, engineering, test, sustainment, and logistics information, regardless of media or form, are marked correctly.

    § 250.6 Procedures.

    (a) Procedures for release of technical information must be made under the following guidelines:

    (1) DoD Components may make their technical information for other than military or space application available for public disclosure in accordance with DoD Directive 5230.09 and DoD Instruction 5230.29. DoD has the authority to withhold technical data and technology as defined in § 250.3 from public disclosure.

    (2) DoD Components will process FOIA requests from the public for technical information in accordance with 32 CFR part 286 and governing DoD Component issuances. All requested technical data and technology currently determined to be subject to the withholding authority in this part will be denied under Exemption 3 of 5 U.S.C. 552 and 10 U.S.C. 130. Any FOIA appeals for the denied information will be processed in accordance with 32 CFR part 286 and governing DoD Component issuances.

    (3) DoD Components may give qualified contractors access to their technical data and technology as permitted by the provisions of this part.

    (i) The United States-Canada Joint Certification Office adjudicates certification of qualified contractors.

    (ii) To qualify, U.S. and Canadian contractors must submit a completed DD Form 2345 “Militarily Critical Technical Data Agreement,” to the United States-Canada Joint Certification Office.

    (iii) Canadian contractors will submit a completed DD Form 2345 when they intend to request access to DoD-controlled technical data and technology.

    (iv) A copy of the company's State/Provincial Business License, Incorporation Certificate, Sales Tax Identification Form, ITAR Controlled Goods Registration letter or certificate, or other documentation that verifies the legitimacy of the company must accompany all DD Forms 2345.

    (v) The contractor's business activity is a key element of the certification process since this information is used by the controlling office as a basis for approving or disapproving specific requests for technical data and technology. The business activity statement should be sufficiently detailed to support requests for any data that the contractor expects to need for legitimate business purposes.

    (b) Upon receipt of a request for technical information in the possession of, or under the control of the DoD, the controlling DoD office for the requested information will determine whether the information is governed by this part.

    (1) The determination will be based on whether

    (i) The information is subject to 22 CFR part 121 or 15 CFR part 774.

    (ii) The information would require a license, exception, exemption, or other export authorization in accordance with U.S. export control laws and regulations in accordance with 22 U.S.C. 2778, 50 U.S.C. chapter 35, 22 CFR parts 120 through 130, and 15 CFR parts 730 through 774.

    (iii) The information would not fall into the categories of information described in paragraphs (c) and (d) of § 250.2.

    (2) In making such a determination, the controlling office may consult with the Defense Technology Security Administration for advice on whether U.S. export control laws or regulations apply. The controlling DoD office may request assistance in making this determination from the USD(AT&L), and if necessary, consult the Departments of State, Commerce, or Energy.

    (c) The controlling DoD office will ensure technical data and technology governed by this part are marked for distribution in accordance with DoD Instruction 5230.24 and volume 4 of DoD Manual 5200.01.

    (d) The controlling DoD office will authorize release of technical data and technology governed by this part to qualified contractors, as defined in § 250.3, unless either:

    (1) The qualification of the contractor concerned has been temporarily revoked in accordance with paragraph (e) of this section;

    (2) The controlling DoD office judges the requested technical data and technology to be unrelated to the purpose for which the qualified contractor is certified. When release of technical data and technology is denied in accordance with this paragraph, the controlling DoD office will request additional information to explain the intended use of the requested technical data and technology and, if appropriate, request a new certification (see § 250.3) describing the intended use of the requested technical data and technology; or

    (3) The technical data and technology are being requested for a purpose other than to permit the requester to bid or perform on a contract with the DoD or other USG agency. In this case, the controlling DoD office will withhold the technical data and technology if the DoD Component focal point determines the release of the technical data and technology may jeopardize an important technological or operational military advantage of the United States.

    (e) Upon receipt of substantial and credible information that a qualified U.S. contractor has violated U.S. export control law; violated its certification; made a certification in bad faith; or omitted or misstated a material fact, the DoD Component will temporarily revoke the U.S. contractor's qualification. Canadian contractors are disqualified in accordance with Canada Minister of Justice, Technical Data Control Regulations SOR/86-345, May 27, 2014 current edition.

    (1) The DoD Component may delay such temporary revocations when they have the potential to compromise a USG investigation.

    (2) Immediately upon a temporary revocation, the DoD Component will notify the contractor and the USD(AT&L).

    (3) The contractor will be given an opportunity to respond in writing to the information upon which the temporary revocation is based before being disqualified.

    (4) Any U.S. contractor whose qualification has been temporarily revoked may present information to the DoD Component showing that the basis for revocation was in error or has been remedied and be reinstated.

    (f) When the basis for a contractor's temporary revocation cannot be removed within 20 working days, the DoD Component will recommend to the USD(AT&L) that the contractor be disqualified.

    (g) After receipt of substantial and credible information that a qualified U.S. contractor has violated U.S. export control law, the DoD Component must notify the appropriate law enforcement agency.

    (h) Charges for copying, certifying, and searching records rendered to requesters will be levied in accordance with chapter 4, appendix 2 of volume 11A of DoD 7000.14-R, “Department of Defense Financial Management Regulations (FMRs)” (available at http://comptroller.defense.gov/Portals/45/documents/fmr/Volume_11a.pdf). Normally, only one copy of the same record or document will be provided to each requester. Each release to qualified contractors of controlled technical data and technology governed by this part will be accompanied by a “Notice to Accompany the Dissemination of Export-Controlled Technical Data and Technology” (see Figure to § 250.6(h)).

    [Figure to § 250.6(h): Notice to Accompany the Dissemination of Export-Controlled Technical Data and Technology (graphic EP31OC16.007, not reproduced here).]

    (i) Qualified U.S. contractors who receive technical data and technology governed by this part may disseminate that technical data and technology for purposes consistent with their certification without the permission of the controlling DoD office or when dissemination is:

    (1) To any foreign recipient for which the technical data and technology are approved, authorized, or licensed in accordance with 22 U.S.C. 2778 or 15 CFR parts 730 through 774.

    (2) To another qualified U.S. contractor including existing or potential subcontractors, but only within the scope of the certified legitimate business purpose of the recipient.

    (3) To the Departments of State and Commerce to apply for approvals, authorizations, or licenses for export pursuant to 22 U.S.C. 2778 or 15 CFR parts 730 through 774. The application will include a statement that the technical data and technology for which the approval, authorization, or license is sought is controlled by the DoD in accordance with this part.

    (4) To the Congress or any Federal, State, or local governmental agency for regulatory purposes or otherwise as may be required by law or court order. Any such dissemination will include a statement that the technical data and technology are controlled by the DoD in accordance with this part.

    (j) A qualified contractor desiring to disseminate technical data and technology subject to this part in a manner not permitted expressly by the terms of this part must be granted authority to do so by the controlling DoD office, consistent with U.S. export control laws and regulations specified in 22 U.S.C. 2778, 50 U.S.C. chapter 35, 22 CFR parts 120 through 130, and 15 CFR parts 730 through 774 and DoD policies.

    (k) Any requester denied technical data and technology or any qualified U.S. contractor denied permission to disseminate such technical data and technology in accordance with this part will be promptly provided with a written statement of reasons for that action, and advised of the right to make a written appeal to a specifically identified appellate authority within the DoD Component. Other appeals will be processed as directed by the USD(AT&L).

    (l) Denials will cite 10 U.S.C. 130 and 133 as implemented by this part. Implementing procedures will provide for resolution of any appeal within 20 working days.

    § 250.7 Directly arranged visits.

    (a) USG officials and certified U.S. contractors and Canadian government officials and certified Canadian contractors may use the certification process to facilitate directly arranged visits that involve access to unclassified technical data and technology. Activities under this process are limited to:

    (1) Procurement activities such as unclassified pre-solicitation conferences, discussions related to unclassified solicitations, and collection of unclassified procurement documents.

    (2) Performance of an unclassified contract.

    (3) Scientific research, in support of unclassified U.S. or Canadian national defense initiatives.

    (4) Attendance at restricted meetings, conferences, symposia, and program briefings where technical data and technology governed by this part or Canada Minister of Justice, Technical Data Control Regulations SOR/86-345, May 27, 2014 current edition will be presented, or where the event is being held in an unclassified, access-controlled area.

    (b) A directly arranged visit does not apply to uncertified U.S. or Canadian contractors; classified visits, where confirmation of the visitors' security clearances is required; or unsolicited marketing visits.

    (c) A directly arranged visit related to the release of information controlled in the United States by this part or in Canada by Canada Minister of Justice, Technical Data Control Regulations SOR/86-345, May 27, 2014 current edition, is permitted when two conditions are satisfied.

    (1) First condition:

    (i) There is a valid license covering the export of the technical data and technology;

    (ii) The export or release is permitted under the Canadian exemption in 22 CFR 126.5;

    (iii) The export or release is covered by the general exemptions in 22 CFR 125.4; or

    (iv) The export or release qualifies for a license exception under 15 CFR parts 730 through 774.

    (2) Second condition:

    (i) The distribution statement applied to the technical data and technology pursuant to DoD Instruction 5230.24 permits release; or

    (ii) The originator or government controlling office authorizes release.

    Dated: October 26, 2016. Aaron Siegel, Alternate OSD Federal Register Liaison Officer, Department of Defense.
    [FR Doc. 2016-26236 Filed 10-28-16; 8:45 am] BILLING CODE 5001-06-P
    ENVIRONMENTAL PROTECTION AGENCY 40 CFR Part 52 [EPA-R08-OAR-2016-0620; FRL-9954-67-Region 8] Approval and Promulgation of Air Quality Implementation Plans; State of Utah; Revisions to Nonattainment Permitting Regulations AGENCY:

    Environmental Protection Agency (EPA).

    ACTION:

    Proposed rule.

    SUMMARY:

    The Environmental Protection Agency (EPA) is proposing to conditionally approve State Implementation Plan (SIP) revisions submitted by the state of Utah on August 20, 2013, with supporting administrative documentation submitted on September 12, 2013. These submittals revise the provisions of the Utah Administrative Code (UAC) that pertain to the issuance of Utah air quality permits for major sources in nonattainment areas. The EPA proposes a conditional approval because, although the submitted revisions to Utah's nonattainment permitting rules do not fully address the deficiencies in the state's program, Utah has committed to address the remaining deficiencies in the state's nonattainment permitting program no later than one year after the EPA finalizes this conditional approval. If this conditional approval is finalized and the EPA finds that Utah has timely met this commitment in full, the conditional approval of the SIP revisions would convert to a final approval of Utah's plan. This action is being taken under section 110 of the Clean Air Act (CAA) (Act).

    DATES:

    Written comments must be received on or before November 30, 2016.

    ADDRESSES:

    Submit your comments, identified by EPA-R08-OAR-2016-0620 at http://www.regulations.gov. Follow the online instructions for submitting comments. Once submitted, comments cannot be edited or removed from regulations.gov. The EPA may publish any comment received to its public docket. Do not submit electronically any information you consider to be Confidential Business Information (CBI) or other information whose disclosure is restricted by statute. Multimedia submissions (audio, video, etc.) must be accompanied by a written comment. The written comment is considered the official comment and should include discussion of all points you wish to make. The EPA will generally not consider comments or comment contents located outside of the primary submission (i.e., on the Web, cloud, or other file sharing system). For additional submission methods, the full EPA public comment policy, information about CBI or multimedia submissions, and general guidance on making effective comments, please visit http://www2.epa.gov/dockets/commenting-epa-dockets.

    Docket: All documents in the docket are listed in the http://www.regulations.gov index. Although listed in the index, some information is not publicly available, e.g., CBI or other information whose disclosure is restricted by statute. Certain other material, such as copyrighted material, will be publicly available only in hard copy. Publicly-available docket materials are available at http://www.regulations.gov or in hard copy at the EPA Region 8, Office of Partnerships and Regulatory Assistance, Air Program, 1595 Wynkoop Street, Denver, Colorado 80202-1129. The EPA requests that if at all possible, you contact the individual listed in the FOR FURTHER INFORMATION CONTACT section to view the hard copy of the docket. You may view the hard copy of the docket Monday through Friday, 8:00 a.m. to 4:00 p.m., excluding federal holidays.

    FOR FURTHER INFORMATION CONTACT:

    Kevin Leone, Air Program, EPA, Region 8, Mailcode 8P-AR, 1595 Wynkoop Street, Denver, Colorado 80202-1129, (303) 312-6227, [email protected].

    SUPPLEMENTARY INFORMATION: I. General Information What should I consider as I prepare my comments for the EPA?

    a. Submitting CBI. Do not submit CBI to EPA through http://www.regulations.gov or email. Clearly mark the part or all of the information that you claim to be CBI. For CBI information in a disk or CD ROM that you mail to the EPA, mark the outside of the disk or CD ROM as CBI and then identify electronically within the disk or CD ROM the specific information that is claimed as CBI. In addition to one complete version of the comment that includes information claimed as CBI, a copy of the comment that does not contain the information claimed as CBI must be submitted for inclusion in the public docket. Information so marked will not be disclosed except in accordance with procedures set forth in 40 CFR part 2.

    b. Tips for Preparing Your Comments. When submitting comments, remember to:

    i. Identify the rulemaking by docket number and other identifying information (subject heading, Federal Register date and page number).

    ii. Follow directions—The agency may ask you to respond to specific questions or organize comments by referencing a Code of Federal Regulations (CFR) part or section number.

    iii. Explain why you agree or disagree; suggest alternatives and substitute language for your requested changes.

    iv. Describe any assumptions and provide any technical information and/or data that you used.

    v. If you estimate potential costs or burdens, explain how you arrived at your estimate in sufficient detail to allow for it to be reproduced.

    vi. Provide specific examples to illustrate your concerns, and suggest alternatives.

    vii. Explain your views as clearly as possible, avoiding the use of profanity or personal threats.

    viii. Make sure to submit your comments by the comment period deadline identified.

    II. Background

    On May 10, 2001, the EPA sent Utah a letter outlining concerns that Utah's nonattainment permitting rules, which are codified in UAC R307-403 (Permits: New and Modified Sources in Nonattainment Areas and Maintenance Areas), have not been consistent with federal requirements (see docket R08-OAR-2016-0620). On August 20, 2013, with supporting administrative documentation submitted on September 12, 2013, Utah sent the EPA revisions to its nonattainment permitting regulations, specifically to address EPA-identified deficiencies in its nonattainment permitting regulations that affected the EPA's ability to approve Utah's PM10 maintenance plan and that may affect the EPA's ability to approve Utah's PM2.5 SIP. These revisions addressed R307-403-1 (Purpose and Definitions), R307-403-2 (Applicability), R307-403-11 (Actual Plant-wide Applicability Limits (PALs)), and R307-420 (Ozone Offset Requirements in Davis and Salt Lake Counties). In addition, Utah moved R307-401-19 (Analysis of Alternatives) to R307-403-10 and moved R307-401-20 (Relaxation of Limits) to R307-403-2. On June 2, 2016, the EPA entered into a consent decree with the Center for Biological Diversity, Center for Environmental Health, and Neighbors for Clean Air regarding a failure to act, pursuant to CAA sections 110(k)(2)-(4), on certain complete SIP submissions from states intended to address specific requirements related to the 2006 PM2.5 NAAQS for certain nonattainment areas, including the submittal from the Governor of Utah dated August 20, 2013.

    The SIP revisions submitted by the Utah Department of Air Quality (UDAQ) on August 20, 2013, establish specific nonattainment new source review permitting requirements. In this revision, the UDAQ has incorporated federal regulatory language (establishing permitting requirements for new and modified major stationary sources in a nonattainment area) from portions of 40 CFR 51.165 and reformatted it into state-specific requirements for sources in Utah under R307-403-1 (Purpose and Definitions) and R307-403-2 (Applicability), including provisions relevant to nonattainment NSR programs for PM2.5 nonattainment areas. Additionally, UDAQ incorporated by reference the provisions of 40 CFR 51.165(f)(1)-(f)(14) into R307-403-11 (Actual PALs), and revised R307-420 to state that the definitions and applicability provisions in R307-403-1 apply to that section.

    CAA section 110(a)(2)(C) requires each state plan to include “a program to provide for . . . regulation of the modification and construction of any stationary source within the areas covered by the plan as necessary to assure that [NAAQS] are achieved, including a permit program as required in parts C and D of this subchapter,” and CAA section 172(c)(5) provides that the plan “shall require permits for the construction and operation of new or modified major stationary sources anywhere in the nonattainment area, in accordance with section [173].” CAA section 173 lays out the requirements for obtaining a permit that must be included in a state's SIP-approved permit program. CAA section 110(a)(2)(A) requires that SIPs contain enforceable emissions limitations and other control measures. Under CAA section 110(a)(2), the enforceability requirement in section 110(a)(2)(A) applies to all plans submitted by a state. CAA section 110(i) (with certain limited exceptions) prohibits states from modifying SIP requirements for stationary sources except through the SIP revision process. CAA section 172(c)(7) requires that nonattainment plans, including nonattainment New Source Review (NSR) programs required by section 172(c)(5), meet the applicable provisions of section 110(a)(2), including the requirement in section 110(a)(2)(A) for enforceable emission limitations and other control measures. CAA section 110(l) provides that the EPA cannot approve a SIP revision that interferes with any applicable requirement of the Act.

    Section 51.165 in title 40 of the CFR (Permit Requirements) sets out the minimum plan requirements states are to meet within each SIP nonattainment NSR permitting program. Generally, 40 CFR 51.165 consists of a set of definitions, minimum plan requirements regarding procedures for determining applicability of nonattainment NSR and use of offsets, and minimum plan requirements regarding other source obligations, such as recordkeeping.

    Specifically, subparagraphs 51.165(a)(1)(i) through (xlvi) enumerate a set of definitions which states must either use or replace with definitions that a state demonstrates are more stringent or at least as stringent in all respects. Subparagraph 51.165(a)(2) sets minimum plan requirements for procedures to determine the applicability of the nonattainment NSR program to new and modified sources. Subparagraphs 51.165(a)(3), (a)(9), and (a)(11) set minimum plan requirements for the use of offsets by sources subject to nonattainment NSR requirements. Subparagraphs (a)(8) and (a)(10) address precursors, and subparagraphs (a)(6) and (a)(7) address recordkeeping obligations. Subparagraph 51.165(a)(4) allows nonattainment NSR programs to treat fugitive emissions in certain ways. Subparagraph 51.165(a)(5) addresses enforceable procedures that apply after approval to construct has been granted. Subparagraph 51.165(b) sets minimum plan requirements for new major stationary sources and major modifications in attainment and unclassifiable areas that would cause or contribute to violations of the national ambient air quality standards (NAAQS). Finally, subparagraph 51.165(f) sets minimum plan requirements for the use of PALs. Please refer to docket EPA-R08-OAR-2016-0620 to view a crosswalk table which outlines how Utah's nonattainment permitting rules correlate with the requirements of 40 CFR 51.165.

    Clean Air Act section 189(e) requires that state SIPs apply the same control requirements that apply to major stationary sources of PM10 to major stationary sources of PM10 precursors, “except where the Administrator determines that such sources do not contribute significantly to PM10 levels which exceed the standard in the area.” On January 4, 2013, the U.S. Court of Appeals for the District of Columbia Circuit, in Natural Resources Defense Council v. EPA, 706 F.3d 428 (D.C. Cir. 2013), issued a decision that remanded the EPA's 2008 PM2.5 NSR Implementation Rule (73 FR 28321). The court found that the EPA erred in implementing the PM2.5 NAAQS in these rules solely pursuant to the general implementation provisions of subpart 1 of part D of title I of the CAA, rather than pursuant to the additional implementation provisions specific to particulate matter nonattainment areas in subpart 4. In particular, subpart 4 includes section 189(e) of the CAA, which requires the control of major stationary sources of PM10 precursors (and hence under the court decision, PM2.5 precursors) “except where the Administrator determines that such sources do not contribute significantly to PM10 levels which exceed the standard in the area.” Accordingly, nonattainment NSR programs that are submitted for PM2.5 nonattainment areas must regulate all PM2.5 precursors, i.e., SO2, NOX, VOC, and ammonia, unless the Administrator determines that such sources of a particular precursor do not contribute significantly to nonattainment in the nonattainment area. The EPA recently finalized a new provision at 40 CFR 51.165(a)(13) that codifies this requirement, as it applies to PM2.5, in the federal regulations.

    As a result, it became clear that Utah needed to submit further revisions to address remaining deficiencies in the nonattainment permitting program for the EPA to approve the August 20, 2013, submittal. Among those deficiencies was that Utah had not submitted an analysis demonstrating that sources of ammonia, as a PM2.5 precursor, do not contribute significantly to PM2.5 levels that exceed the NAAQS in nonattainment areas in the State. On September 30, 2016, Utah submitted to the EPA a commitment letter in which Utah commits to address, by December 8, 2017, additional remaining deficiencies in the State's nonattainment permitting program in R307-403 that were not addressed in the August 20, 2013, submittal, including revisions to R307-403-2, R307-403-3, and R307-403-4. In the commitment letter, Utah specifies that:

    1. UDAQ commits to submit a SIP revision that either regulates major stationary sources of ammonia pursuant to Utah's nonattainment new source review (NNSR) permitting program, consistent with all applicable federal regulatory requirements, or demonstrates that sources of ammonia, as a PM2.5 precursor, do not contribute significantly to PM2.5 levels that exceed the NAAQS in nonattainment areas in the State, consistent with new provisions at 40 CFR 51.1006(a)(3);

    2. UDAQ commits to revise R307-403-2 consistent with the new definitions in 40 CFR 51.165 that the EPA recently finalized in the PM2.5 SIP Requirements Rules;

    3. UDAQ commits to revise R307-403-3, including R307-403-3(3), to remove the reference to NNSR determinations being made “at the time of the source's proposed start-up date”;

    4. UDAQ commits to revise R307-403-3, including R307-403-3(2) and R307-403-3(3), to specify that NNSR permit requirements are applicable to all new major stationary sources or major modifications located in a nonattainment area that are major for the pollutant for which the area is designated nonattainment;

    5. UDAQ commits to revise R307-403-3, in addition to the previously adopted definition of lowest achievable emission rate (LAER) in R307-403-1, to explicitly state that LAER applies to all major new sources and major modifications for the relevant pollutants in nonattainment areas;

    6. UDAQ commits to revise R307-403-4 to incorporate the requirements from 40 CFR 51.165 to establish that all general offset permitting requirements apply for all offsets regardless of the pollutant at issue, and to revise the provision to impose immediate and direct general offset permitting requirements on all new major stationary sources or major modifications located in a nonattainment area that are major for the pollutant for which the area is designated nonattainment;

    7. UDAQ commits to work with the Utah Air Quality Board to revise R307-403-4 to reference the criteria discussed in section IV.D. of 40 CFR 51, Appendix S; and

    8. UDAQ will update R307-403 to include a new section that imposes requirements that address emission offsets for PM2.5 nonattainment areas (as required in 40 CFR 51.165(a)(11)) on NNSR sources in Utah. UDAQ will revise R307-403-3, including R307-403-3(3)(c), to cross reference this new section, as well as the requirements in R307-403-4, R307-403-5, and R307-403-6; and UDAQ commits to work with the Utah Air Quality Board to revise this section to include the requirements of CAA Section 173(c)(1) and 40 CFR 51.165 (specifically 40 CFR 51.165(a)(3)) concerning the requirement that creditable reductions be calculated based on actual emissions for offset purposes.

    Under section 110(k)(4) of the Act, the EPA may approve a SIP revision based on a commitment by the state to adopt specific enforceable measures by a date certain, but not later than one year after the date of approval of the plan revision. Under a conditional approval, the state must adopt and submit the specific revisions it has committed to within one year of the EPA's final conditional approval. If the EPA fully approves the submittal of the revisions specified in the commitment letter, the conditional nature of the approval would be removed and the submittal would become fully approved. If the state does not submit these revisions within one year, if the EPA finds the state's revisions to be incomplete, or if the EPA disapproves the state's revisions, the conditional approval will convert to a disapproval. If any of these occur and the EPA's conditional approval converts to a disapproval, that will constitute a disapproval of a required plan element under part D of title I of the Act, which starts an 18-month clock for sanctions (see section 179(a)(2)) and a two-year clock for a federal implementation plan (FIP) (see section 110(c)(1)(B)).

    III. Proposed Action

    The EPA is proposing to conditionally approve Utah's revisions submitted on August 20, 2013, which have not been withdrawn by Utah. These revisions addressed R307-403-1 (Purpose and Definitions), R307-403-2 (Applicability), R307-403-11 (Actual PALs), and R307-420 (Ozone Offset Requirements in Davis and Salt Lake Counties). In addition, Utah moved R307-401-19 (Analysis of Alternatives) to R307-403-10 and moved R307-401-20 (Relaxation of Limits) to R307-403-2. The EPA proposes that these changes, when combined with the changes Utah has committed to submitting to the EPA by December 8, 2017, in Utah's September 30, 2016 commitment letter, create enforceable obligations for sources and are consistent with the CAA and EPA regulations, including the requirements of CAA sections 110(a)(2)(A), 110(a)(2)(C), 110(i), 110(l), 172(c)(5), 172(c)(7), and 173.

    The crosswalk table in the docket details how the submittal corresponds to specific requirements in 40 CFR 51.165; however, as stated earlier, we are not proposing to determine that Utah's PM2.5 nonattainment permitting rules meet all requirements of 40 CFR 51.165 at this time, but rather are conditionally approving these revisions based on Utah's September 30, 2016 commitment letter. If we finalize our proposed conditional approval, Utah must adopt and submit to the EPA the specific revisions it has committed to by December 8, 2017. If the EPA fully approves the submittal of the revisions specified in the commitment letter, the conditional nature of this proposed approval would be removed and the August 20, 2013 submittal would, at that time, become fully approved. If Utah does not submit these revisions by December 8, 2017, if we find Utah's revisions to be incomplete, or if we disapprove Utah's revisions, the final conditional approval will convert to a disapproval. If any of these occur and our final conditional approval converts to a disapproval, that will constitute a disapproval of a required plan element under part D of title I of the Act, which starts an 18-month clock for sanctions (see CAA section 179(a)(2)) and the two-year clock for a FIP (see CAA section 110(c)(1)(B)).

    Specifically, we are proposing to conditionally approve:

    R307-401-19 (Analysis of Alternatives)

    Section R307-401-19 being removed from R307-401 and being added to R307-403-10. Because this section applies only to major sources or major modifications that are located in a nonattainment area or impact a nonattainment area, this section is more appropriately located in R307-403.

    R307-401-20 (Relaxation of Limits)

    Section R307-401-20 being removed from R307-401 and being added to R307-403-2. Because this section applies only to major sources or major modifications that are located in a nonattainment area or impact a nonattainment area, this section is more appropriately located in R307-403.

    R307-403-1 (Purpose and Definitions)

    Language being added in R307-403-1(1)-(4) to parallel federal nonattainment permitting regulations in 40 CFR 51.165; however, Utah committed to addressing further deficiencies regarding ammonia as a precursor to PM2.5 in this section, as specified in Utah's September 30, 2016 commitment letter.

    In particular, R307-403-1(4)(b) states that “ammonia is not a precursor to PM2.5 in the Logan, Salt Lake City, and Provo PM2.5 nonattainment areas as defined in the July 1, 2010 version of 40 CFR 81.345”; however, UDAQ has not submitted an analysis demonstrating that sources of ammonia, as a PM2.5 precursor, do not contribute significantly to PM2.5 levels that exceed the NAAQS in nonattainment areas in the State. UDAQ committed to submit a SIP revision that either regulates major stationary sources of ammonia pursuant to Utah's NNSR permitting program, consistent with all applicable federal regulatory requirements, or demonstrates that sources of ammonia, as a PM2.5 precursor, do not contribute significantly to PM2.5 levels that exceed the NAAQS in nonattainment areas in the State, consistent with new provisions at 40 CFR 51.1006(a)(3).

    R307-403-2 (Applicability)

    The title of this section being changed from “Emission Limitations” to “Applicability” and language being added to R307-403-2(1)-(12) to parallel federal nonattainment permitting regulations in 40 CFR 51.165; however, Utah committed to addressing further deficiencies in this section in its September 30, 2016 commitment letter. Utah committed to revise R307-403-2 consistent with the new definitions in 40 CFR 51.165 that the EPA recently finalized in the PM2.5 SIP Requirements Rules.

    On September 23, 2016, Utah submitted a letter to the EPA requesting to withdraw R307-403-2(12) (see docket EPA-R08-OAR-2016-0620). As a result, we will not be acting on that subparagraph.

    R307-403-11 (Actuals PALs)

    R307-403-11 being added to implement a portion of the EPA's NSR Reform provisions that were adopted in the federal regulations in 2002 and have not yet been incorporated into the Utah Air Quality Rules. R307-403-11 incorporates by reference the provisions of 40 CFR 51.165(f)(1) through (14).

    R307-420 (Permits: Ozone Offset Requirements in Davis and Salt Lake Counties)

    This rule being revised to include the definitions and applicability provisions of R307-403-1. This rule change will ensure that the definitions and applicability provisions in R307-420 are consistent with related permitting rules in R307-403.

    UDAQ additionally committed to submit a revised SIP by December 8, 2017 to: (1) Revise R307-403-3, including R307-403-3(3), to remove the reference to NNSR determinations being made “at the time of the source's proposed start-up date”; (2) revise R307-403-3, including R307-403-3(2) and R307-403-3(3), to specify that NNSR permit requirements are applicable to all new major stationary sources or major modifications located in a nonattainment area that are major for the pollutant for which the area is designated nonattainment; (3) revise R307-403-3, in addition to the previously adopted definition of LAER in R307-403-1, to explicitly state that LAER applies to all major new sources and major modifications for the relevant pollutants in nonattainment areas; (4) revise R307-403-4 to incorporate the requirements from 40 CFR 51.165 to establish that all general offset permitting requirements apply for all offsets regardless of the pollutant at issue, and to revise the provision to impose immediate and direct general offset permitting requirements on all new major stationary sources or major modifications located in a nonattainment area that are major for the pollutant for which the area is designated nonattainment; (5) revise R307-403-4 to reference the criteria discussed in section IV.D. of 40 CFR 51, Appendix S; (6) update R307-403 to include a new section that imposes requirements that address emission offsets for PM2.5 nonattainment areas (as required in 40 CFR 51.165(a)(11)) on NNSR sources, revise R307-403-3, including R307-403-3(3)(c), to cross reference this new section, as well as the requirements in R307-403-4, R307-403-5, and R307-403-6, and revise this section to include the requirements of CAA Section 173(c)(1) and 40 CFR 51.165 (specifically 40 CFR 51.165(a)(3)) concerning the requirement that creditable reductions be calculated based on actual emissions for offset purposes; and (7) address further deficiencies regarding ammonia as a precursor to PM2.5.

    IV. Consideration of Section 110(l) of the CAA

    Under section 110(l) of the CAA, the EPA cannot approve a SIP revision if the revision would interfere with any applicable requirements concerning attainment and reasonable further progress (RFP) toward attainment of the NAAQS, or any other applicable requirement of the Act. In addition, section 110(l) requires that each revision to an implementation plan submitted by a state shall be adopted by the state after reasonable notice and public hearing.

    The Utah SIP revisions that the EPA is proposing to approve do not interfere with any applicable requirements of the Act. The revisions to R307-401 and R307-403 submitted by Utah on August 20, 2013, are intended to strengthen the SIP. Therefore, CAA section 110(l) requirements are satisfied.

    V. Incorporation by Reference

    In this rule, the EPA is proposing to include in a final EPA rule regulatory text that includes incorporation by reference. In accordance with requirements of 1 CFR 51.5, the EPA is proposing to incorporate by reference the UDAQ rules promulgated in the DAR, R307-400 Series as discussed in section III of this preamble. The EPA has made, and will continue to make, these materials generally available through www.regulations.gov and/or at the EPA Region 8 Office (please contact the person identified in the FOR FURTHER INFORMATION CONTACT section of this preamble for more information).

    VI. Statutory and Executive Order Reviews

    Under the Clean Air Act, the Administrator is required to approve a SIP submission that complies with the provisions of the Act and applicable federal regulations. 42 U.S.C. 7410(k); 40 CFR 52.02(a). Thus, in reviewing SIP submissions, the EPA's role is to approve state choices, provided that they meet the criteria of the Clean Air Act. Accordingly, this action merely proposes to approve state law as meeting federal requirements and does not impose additional requirements beyond those imposed by state law. For that reason, this proposed action:

    • Is not a “significant regulatory action” subject to review by the Office of Management and Budget under Executive Order 12866 (58 FR 51735, October 4, 1993);

    • does not impose an information collection burden under the provisions of the Paperwork Reduction Act (44 U.S.C. 3501 et seq.);

    • is certified as not having a significant economic impact on a substantial number of small entities under the Regulatory Flexibility Act (5 U.S.C. 601 et seq.);

    • does not contain any unfunded mandate or significantly or uniquely affect small governments, as described in the Unfunded Mandates Reform Act of 1995 (Pub. L. 104-4);

    • does not have federalism implications as specified in Executive Order 13132 (64 FR 43255, August 10, 1999);

    • is not an economically significant regulatory action based on health or safety risks subject to Executive Order 13045 (62 FR 19885, April 23, 1997);

    • is not a significant regulatory action subject to Executive Order 13211 (66 FR 28355, May 22, 2001);

    • is not subject to requirements of Section 12(d) of the National Technology Transfer and Advancement Act of 1995 (15 U.S.C. 272 note) because application of those requirements would be inconsistent with the CAA; and

    • does not provide the EPA with the discretionary authority to address, as appropriate, disproportionate human health or environmental effects, using practicable and legally permissible methods, under Executive Order 12898 (59 FR 7629, February 16, 1994).

    In addition, the SIP is not approved to apply on any Indian reservation land or in any other area where the EPA or an Indian tribe has demonstrated that a tribe has jurisdiction. In those areas of Indian country, the proposed rule does not have tribal implications and will not impose substantial direct costs on tribal governments or preempt tribal law as specified by Executive Order 13175 (65 FR 67249, November 9, 2000).

    List of Subjects in 40 CFR Part 52

    Environmental protection, Air pollution control, Carbon monoxide, Intergovernmental relations, Incorporation by reference, Lead, Nitrogen dioxide, Ozone, Particulate matter, Reporting and recordkeeping requirements, Sulfur oxides, Volatile organic compounds.

    Authority:

    42 U.S.C. 7401 et seq.

    Dated: October 20, 2016. Shaun L. McGrath, Regional Administrator, Region 8.
    [FR Doc. 2016-26233 Filed 10-28-16; 8:45 am] BILLING CODE 6560-50-P
    ENVIRONMENTAL PROTECTION AGENCY 40 CFR Part 63 [EPA-HQ-OAR-2009-0234; FRL-9954-62-OAR] RIN 2060-AS75 Mercury and Air Toxics Standards (MATS) Completion of Electronic Reporting Requirements AGENCY:

    Environmental Protection Agency (EPA).

    ACTION:

    Proposed rule; extension of comment period.

    SUMMARY:

    On September 29, 2016, the Environmental Protection Agency (EPA) proposed a rule titled, “Mercury and Air Toxics Standards (MATS) Completion of Electronic Reporting Requirements.” The EPA is extending the comment period on the proposed rule that was scheduled to close on October 31, 2016, by 15 days until November 15, 2016. The EPA is making this change based on three requests for additional time to prepare comments on this proposed rule.

    DATES:

    The public comment period for the proposed rule published in the Federal Register on September 29, 2016 (81 FR 67062), is being extended. Written comments must be received on or before November 15, 2016.

    ADDRESSES:

    The EPA has established a docket for the proposed rulemaking (available at http://www.regulations.gov). The Docket ID No. is EPA-HQ-OAR-2009-0234. Submit your comments, identified by Docket ID No. EPA-HQ-OAR-2009-0234, to the Federal eRulemaking Portal: http://www.regulations.gov. Follow the online instructions for submitting comments. Once submitted, comments cannot be edited or withdrawn. The EPA may publish any comment received to its public docket. Do not submit electronically any information you consider to be Confidential Business Information (CBI) or other information whose disclosure is restricted by statute. If you need to include CBI as part of your comment, please visit http://www.epa.gov/dockets/comments.html for instructions. Multimedia submissions (audio, video, etc.) must be accompanied by a written comment. The written comment is considered the official comment and should include discussion of all points you wish to make.

    For additional submission methods, the full EPA public comment policy, and general guidance on making effective comments, please visit http://www.epa.gov/dockets/comments.html.

    FOR FURTHER INFORMATION CONTACT:

    For additional information on this action, contact Barrett Parker, Sector Policies and Programs Division, Office of Air Quality Planning and Standards (D243-05), Environmental Protection Agency, Research Triangle Park, NC 27711; telephone number: (919) 541-5635; email address: [email protected].

    SUPPLEMENTARY INFORMATION:

    To allow additional time for stakeholders to provide comments, the EPA has decided to extend the public comment period until November 15, 2016.

    Dated: October 24, 2016. Stephen Page, Director, Office of Air Quality Planning and Standards.
    [FR Doc. 2016-26209 Filed 10-28-16; 8:45 am] BILLING CODE 6560-50-P
    DEPARTMENT OF THE INTERIOR Bureau of Land Management 43 CFR Part 8360 [16XL 1109AF LLUTY0100 L12200000.EA0000 24-1A] Notice of Proposed Supplementary Rules for Public Lands Managed by the Moab Field Office in Grand County, Utah AGENCY:

    Bureau of Land Management, Interior.

    ACTION:

    Proposed supplementary rule.

    SUMMARY:

    The Bureau of Land Management (BLM) is proposing a supplementary rule addressing conduct on public lands in the vicinity of Corona Arch and Gemini Bridges in Grand County, Utah. The proposed supplementary rule would prohibit roped activities around Corona Arch and Gemini Bridges. Such activities involve the use of ropes or other climbing aids, and include, but are not limited to, ziplining, highlining, slacklining, traditional rock climbing, sport rock climbing, rappelling, and swinging.

    DATES:

    Comments on the proposed supplementary rule must be received or postmarked by December 30, 2016, to be assured of consideration. Comments received, postmarked, or electronically dated after that date will not necessarily be considered in the development of the final supplementary rule.

    ADDRESSES:

    Please mail or hand deliver all comments concerning the proposed supplementary rule to the Bureau of Land Management, 82 E. Dogwood, Moab, UT 84532, or email comments to Katie Stevens, at [email protected]. The proposed supplementary rule and a map depicting the area that would be affected are available for public review at the Moab Field Office, located at 82 E. Dogwood, Moab, UT 84532. The affected area is also shown on a map on the Moab Field Office's Web site at http://www.blm.gov/ut/st/en/fo/moab.html.

    FOR FURTHER INFORMATION CONTACT:

    Beth Ransel, Moab Field Manager, BLM Moab Field Office, 82 E. Dogwood, Moab, UT 84532, or telephone (435) 259-2110. Persons who use a telecommunications device for the deaf (TDD) may call the Federal Information Relay Service (FIRS) at 800-877-8339 to leave a message or question with the above individual. The FIRS is available 24 hours a day, 7 days a week. You will receive a reply during normal business hours.

    SUPPLEMENTARY INFORMATION:

    I. Public Comment Procedures

    The public is invited to provide comments on the proposed supplementary rule. See the DATES and ADDRESSES sections for information on submitting comments. Written comments on the proposed supplementary rule must be sent in accordance with the information outlined in the DATES and ADDRESSES sections of this notice. The BLM need not consider, or include in the Administrative Record for the final supplementary rule, (a) comments delivered to an address other than those listed above (See ADDRESSES), or (b) comments that the BLM receives after the close of the comment period (See DATES), unless they are postmarked or electronically dated before the deadline.

    Written comments on the proposed supplementary rule should be specific, confined to issues pertinent to the proposed supplementary rule, and should explain the reason for any recommended change. Where possible, comments should reference the specific section of the rule that the comment is addressing. Comments, including names, street addresses, and other contact information of respondents, will be available for public review at 82 E. Dogwood, Moab, UT 84532, during regular business hours (8:00 a.m. to 4:30 p.m.), Monday through Friday, except Federal holidays. Before including your address, telephone number, email address, or other personal identifying information in your comment, be advised that your entire comment, including your personal identifying information, may be made publicly available at any time. While you can ask us in your comment to withhold from public review your personal identifying information, we cannot guarantee that we will be able to do so.

    II. Background

    The BLM establishes supplementary rules under the authority of 43 CFR 8365.1-6, which allows the BLM State Directors to establish such rules for the protection of persons, property, and public lands and resources. This regulation allows the BLM to issue rules of less than national effect without codifying the rules in the Code of Federal Regulations.

    Corona Arch and Gemini Bridges are two of the most popular locations in the Moab Field Office. Corona Arch is a partly freestanding arch with a 110-foot by 110-foot opening. Gemini Bridges are two large arches standing side-by-side. Corona Arch is visited by approximately 40,000 visitors per year, and Gemini Bridges are visited by approximately 50,000 visitors per year. The BLM has received many complaints that roped activities, including swinging from the arches, conflict with other visitors' use and enjoyment of the arches. The BLM finds merit in these complaints. People setting up and using swings and rappels from the arches endanger both themselves and those viewing them from below. In addition, the rock arches may be damaged by ropes “sawing” on the rock spans. The supplementary rules currently in effect in the Moab Field Office (at 81 FR 9498 (Feb. 25, 2016)) do not address roped activities on the affected arches, although a temporary restriction (80 FR 27703 (May 14, 2015)) is in effect until May 2017.

    The legal descriptions of the affected public lands are:

    Salt Lake Meridian
    T. 25 S., R. 20 E., sec. 34, NW1/4 SW1/4, that part surrounding Gemini Bridges.
    T. 25 S., R. 21 E., sec. 32, SE1/4 SE1/4, that part surrounding Corona Arch.
    T. 26 S., R. 21 E., sec. 5, NE1/4, that part surrounding Corona Arch.

    The areas described aggregate 37.3 acres.

    This proposed supplementary rule would allow for enforcement as a tool in minimizing the adverse effects of roped activities within the affected areas. After it goes into effect, the supplementary rule will be available for inspection in the Moab Field Office, and it will be announced broadly through the news media and direct mail to the constituents included on the Moab Field Office mailing list. It will also be posted on signs at main entry points to the affected areas.

    III. Discussion of the Proposed Supplementary Rule

    The Moab Field Office proposes to ban roped activities in the vicinity of Corona Arch and Gemini Bridges. The prohibited activities would include, but not be limited to, ziplining, highlining, slacklining, traditional rock climbing, sport rock climbing, rappelling, and swinging, using equipment such as ropes, cables, climbing aids, webbing, or anchors. The proposed supplementary rule would affect 31 acres surrounding Corona Arch and 6.3 acres surrounding Gemini Bridges. The proposed supplementary rule is necessary for the protection of visitors and for the protection of the arches.
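    The two per-site figures above are consistent with the 37.3-acre aggregate given in the legal land description; as a simple arithmetic check, using only the acreages stated in this notice:

$$31\ \text{acres (Corona Arch)} + 6.3\ \text{acres (Gemini Bridges)} = 37.3\ \text{acres}$$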

    IV. Procedural Matters Executive Order 12866, Regulatory Planning and Review

    This proposed supplementary rule is not a significant regulatory action and is not subject to review by the Office of Management and Budget under Executive Order 12866. This proposed supplementary rule would not have an annual effect of $100 million or more on the economy. It is not intended to affect commercial activity, but imposes a rule of conduct on recreational visitors for public safety and resource protection reasons in a limited area of public lands. This supplementary rule would not adversely affect, in a material way, the economy, productivity, competition, jobs, the environment, public health or safety, or State, local, or tribal governments or communities. This proposed supplementary rule would not create a serious inconsistency or otherwise interfere with an action taken or planned by another agency. This proposed supplementary rule does not materially alter the budgetary effects of entitlements, grants, user fees, or loan programs or the rights or obligations of their recipients, nor does it raise novel legal or policy issues; it merely strives to protect public safety and the environment.

    Clarity of the Rule

    Executive Order 12866 requires each agency to write regulations that are simple and easy to understand. The BLM invites comments on how to make this proposed supplementary rule easier to understand, including answers to questions such as the following:

    (1) Are the requirements in the proposed supplementary rule clearly stated?

    (2) Does the proposed supplementary rule contain technical language or jargon that interferes with its clarity?

    (3) Does the format of the proposed supplementary rule (grouping and order of sections, use of headings, paragraphing, etc.) aid or reduce its clarity?

    (4) Would the proposed supplementary rule be easier to understand if it were divided into more (but shorter) sections?

    (5) Is the description of the proposed supplementary rule in the SUPPLEMENTARY INFORMATION section of this preamble helpful to your understanding of the proposed supplementary rule? How could this description be more helpful in making the proposed supplementary rule easier to understand?

    Please send any comments you have on the clarity of the proposed supplementary rule to the address specified in the ADDRESSES section.

    National Environmental Policy Act (NEPA)

    A temporary restriction on roped activities was analyzed in Environmental Assessment (EA) DOI-BLM-UT-2014-0170-EA, Temporary Restriction of Roped Activities at Corona Arch and Gemini Bridges. This document was subject to a 30-day public comment period; it was signed on January 6, 2015. The permanent restriction on roped activities was analyzed in Environmental Assessment DOI-BLM-UT-2015-0227, Permanent Restriction of Corona Arch and Gemini Bridges to Roped Activities. This document was subject to a 30-day scoping period and a 30-day public comment period. The Decision Record was signed on May 5, 2016. The EA found that the proposed supplementary rule would not constitute a major Federal action significantly affecting the quality of the human environment under Section 102(2)(C) of the National Environmental Policy Act of 1969 (NEPA), 42 U.S.C. 4332(2)(C). The proposed supplementary rule merely contains rules of conduct for the BLM public lands administered by the Moab Field Office within a 31-acre area around Corona Arch and 6.3-acre area around Gemini Bridges. This rule is designed to protect the environment and public safety. A detailed impact statement under NEPA is not required. The BLM has placed the EA and the Finding of No Significant Impact on file in the BLM Administrative Record at the address specified in the ADDRESSES section.

    Regulatory Flexibility Act

    Congress enacted the Regulatory Flexibility Act (RFA), 5 U.S.C. 601-612, to ensure that Government regulations do not unnecessarily or disproportionately burden small entities. The RFA requires a regulatory flexibility analysis if a rule would have a significant economic impact, either detrimental or beneficial, on a substantial number of small entities. The proposed supplementary rule does not pertain specifically to commercial or governmental entities of any size, but to public recreational use of specific public lands. Therefore, the BLM has determined under the RFA that the proposed supplementary rule would not have a significant economic impact on a substantial number of small entities.

    Small Business Regulatory Enforcement Fairness Act

    This proposed supplementary rule would not constitute a “major rule” as defined at 5 U.S.C. 804(2). This proposed supplementary rule merely contains rules of conduct for recreational use of public lands. This proposed rule would not affect business, commercial, or industrial use of public lands.

    Unfunded Mandates Reform Act

    This proposed supplementary rule would not pose an unfunded mandate on State, local, or tribal governments of more than $100 million per year; nor would it have a significant or unique effect on small governments. This proposed supplementary rule does not require anything of State, local, or tribal governments. Therefore, the BLM is not required to prepare a statement containing the information required by the Unfunded Mandates Reform Act, 2 U.S.C. 1531 et seq.

    Executive Order 12630, Governmental Actions and Interference With Constitutionally Protected Property Rights (Takings)

    This proposed supplementary rule is not a government action capable of interfering with constitutionally protected property rights. This proposed supplementary rule does not address property rights in any form, and does not cause the impairment of anybody's property rights. Therefore, the BLM has determined that this proposed supplementary rule would not cause a taking of private property or require further discussion of takings implications under this Executive Order.

    Executive Order 13132, Federalism

    This proposed supplementary rule would not have a substantial direct effect on the states, on the relationship between the Federal government and the states, or on the distribution of power and responsibilities among the various levels of government. Therefore, the BLM has determined that this proposed supplementary rule does not have sufficient Federalism implications to warrant preparation of a Federalism assessment.

    Executive Order 12988, Civil Justice Reform

    Under Executive Order 12988, the BLM has determined that this proposed supplementary rule would not unduly burden the judicial system and that the requirements of sections 3(a) and 3(b)(2) of the Order are met. This supplementary rule contains rules of conduct for recreational use of certain public lands to protect public safety and the environment.

    Executive Order 13175, Consultation and Coordination With Indian Tribal Governments

    In accordance with Executive Order 13175, the BLM has found that this proposed supplementary rule does not include policies that have tribal implications. This proposed supplementary rule does not affect lands held in trust for the benefit of Native American tribes, individual Indians, Aleuts, or others, nor does it affect lands for which title is held in fee status by Indian tribes or U.S. Government-owned lands managed by the Bureau of Indian Affairs.

    Paperwork Reduction Act

    This proposed supplementary rule does not contain information collection requirements that the Office of Management and Budget must approve under the Paperwork Reduction Act, 44 U.S.C. 3501 et seq.

    Executive Order 13211, Actions Concerning Regulations That Significantly Affect Energy Supply, Distribution, or Use

    This proposed supplementary rule does not comprise a significant energy action. This proposed supplementary rule would not have an adverse effect on energy supplies, production, or consumption. It only addresses rules of conduct for recreational use of certain public lands to protect public safety and the environment, and has no connection with energy policy.

    Author

    The principal author of the proposed supplementary rule is Beth Ransel, Field Manager for the Moab Field Office, Utah.

    For the reasons stated in the preamble, and under the authority for supplementary rules at 43 U.S.C. 1740 and 43 CFR 8365.1-6, the Utah State Director, BLM, proposes to issue this supplementary rule for public lands managed by the BLM in Utah, to read as follows:

    V. Proposed Supplementary Rule Definitions

    Roped activities means activities that involve the use of ropes, cables, climbing aids, webbing, or anchors, and includes, but is not limited to, ziplining, highlining, slacklining, traditional rock climbing, sport rock climbing, rappelling, and swinging.

    Prohibited Acts

    1. You must not participate in any roped activities on public lands in the vicinity of Corona Arch or Gemini Bridges. This prohibition includes, but is not limited to, the use of ropes, cables, climbing aids, webbing, anchors, and similar devices.

    Exemptions

    The following persons are exempt from this supplementary rule: Any Federal, State, or local government officer or employee acting within the scope of their duties; members of any organized law enforcement, rescue, or firefighting force in performance of an official duty; and any persons, agencies, municipalities, or companies whose activities are authorized in writing by the BLM.

    Enforcement

    Any person who violates this supplementary rule may be tried before a United States Magistrate and fined in accordance with 18 U.S.C. 3571, imprisoned no more than 12 months under 43 U.S.C. 1733(a) and 43 CFR 8360.0-7, or both. In accordance with 43 CFR 8365.1-7, State or local officials may also impose penalties for violations of Utah law.

    Jenna Whitlock, Bureau of Land Management, Acting State Director, Utah.
    [FR Doc. 2016-26179 Filed 10-28-16; 8:45 am] BILLING CODE 4310-DQ-P
    FEDERAL COMMUNICATIONS COMMISSION 47 CFR Parts 1 and 4 [GN Docket No. 15-206; Report No. 3052] Petitions for Reconsideration and Clarification of Action in Rulemaking Proceeding AGENCY:

    Federal Communications Commission.

    ACTION:

    Petitions for reconsideration and clarification.

    SUMMARY:

    Petitions for Reconsideration and Clarification (Petitions) have been filed in the Commission's rulemaking proceeding by Andrew D. Lipman, on behalf of Submarine Cable Coalition, and Kent D. Bressie, on behalf of North American Submarine Cable Association.

    DATES:

    Oppositions to the Petitions must be filed on or before November 15, 2016. Replies to an opposition must be filed on or before November 25, 2016.

    ADDRESSES:

    Federal Communications Commission, 445 12th Street SW., Washington, DC 20554.

    FOR FURTHER INFORMATION CONTACT:

    Peter Shroyer, Public Safety and Homeland Security Bureau, email: [email protected]; phone: (202) 418-1575.

    SUPPLEMENTARY INFORMATION:

    This is a summary of the Commission's document, Report No. 3052, released October 12, 2016. The full text of the Petitions is available for viewing and copying at the FCC Reference Information Center, 445 12th Street SW., Room CY-A257, Washington, DC 20554 or may be accessed online via the Commission's Electronic Comment Filing System at http://apps.fcc.gov/ecfs/. The Commission will not send a copy of this Notice pursuant to the Congressional Review Act, 5 U.S.C. 801(a)(1)(A), because this Notice does not have an impact on any rules of particular applicability.

    Subject: Improving Outage Reporting for Submarine Cables and Enhanced Submarine Cable Outage Data; NORS; FCC 16-81, published at 81 FR 52354, August 8, 2016, in GN Docket No. 15-206. This Notice is being published pursuant to 47 CFR 1.429(e). See also 47 CFR 1.4(b)(1) and 1.429(f), (g).

    Number of Petitions Filed: 2.

    Federal Communications Commission. Marlene H. Dortch, Secretary.
    [FR Doc. 2016-26198 Filed 10-28-16; 8:45 am] BILLING CODE 6712-01-P
    DEPARTMENT OF AGRICULTURE Submission for OMB Review; Comment Request October 26, 2016.

    The Department of Agriculture has submitted the following information collection requirement(s) to OMB for review and clearance under the Paperwork Reduction Act of 1995, Public Law 104-13. Comments are requested regarding (1) whether the collection of information is necessary for the proper performance of the functions of the agency, including whether the information will have practical utility; (2) the accuracy of the agency's estimate of burden including the validity of the methodology and assumptions used; (3) ways to enhance the quality, utility and clarity of the information to be collected; and (4) ways to minimize the burden of the collection of information on those who are to respond, including through the use of appropriate automated, electronic, mechanical, or other technological collection techniques or other forms of information technology.

    Comments regarding this information collection received by November 30, 2016 will be considered. Written comments should be addressed to: Desk Officer for Agriculture, Office of Information and Regulatory Affairs, Office of Management and Budget (OMB), New Executive Office Building, 725 17th Street NW., Washington, DC 20502. Commenters are encouraged to submit their comments to OMB via email to: [email protected] or fax (202) 395-5806 and to Departmental Clearance Office, USDA, OCIO, Mail Stop 7602, Washington, DC 20250-7602. Copies of the submission(s) may be obtained by calling (202) 720-8958.

    An agency may not conduct or sponsor a collection of information unless the collection of information displays a currently valid OMB control number and the agency informs potential persons who are to respond to the collection of information that such persons are not required to respond to the collection of information unless it displays a currently valid OMB control number.

    Animal and Plant Health Inspection Service

    Title: Untreated Oranges, Tangerines, and Grapefruit From Mexico Transiting the United States to Foreign Countries.

    OMB Control Number: 0579-0303.

    Summary of Collection: Under the Plant Protection Act (7 U.S.C. 7701 et seq.), the Secretary of Agriculture is authorized to prohibit or restrict the importation, entry, or movement of plants and plant pests to prevent the introduction of plant pests into the United States or their dissemination within the United States. The Code of Federal Regulations, § 352.30 addresses the movement into or through the United States of untreated oranges, tangerines, and grapefruit from Mexico that transit the United States en route to foreign countries.

    Need and Use of the Information: The Animal and Plant Health Inspection Service (APHIS) is taking action to provide additional protection against the possible introduction of fruit flies via untreated oranges, tangerines, and grapefruit from Mexico that transit the United States. Untreated oranges, tangerines, and grapefruit from Mexico transiting the United States for export to another country must be shipped in sealed, refrigerated containers and insect-proof packaging. A transportation and export permit must be issued by an inspector for shipments of untreated oranges, tangerines, and grapefruit from Mexico, as well as an inspection certificate and notice of arrival. Without the information, APHIS would not be able to allow the movement of untreated citrus to transit the United States to foreign countries.

    Description of Respondents: Business, importers.

    Number of Respondents: 3.

    Frequency of Responses: Reporting: On occasion.

    Total Burden Hours: 26.

    Ruth Brown, Departmental Information Collection Clearance Officer.
    [FR Doc. 2016-26205 Filed 10-28-16; 8:45 am] BILLING CODE 3410-34-P
    DEPARTMENT OF AGRICULTURE Forest Service Forest Resource Coordinating Committee AGENCY:

    Forest Service, USDA.

    ACTION:

    Notice of meeting.

    SUMMARY:

    The Forest Resource Coordinating Committee (Committee) will meet in Washington, DC. The Committee is authorized under Section 8005 of the Food, Conservation, and Energy Act of 2008 (the Act) (Pub. L. 110-246). Additional information concerning the Committee, including the meeting agenda, supporting documents and minutes, can be found by visiting the Committee's Web site at http://www.fs.fed.us/spf/coop/frcc/.

    DATES:

    The meeting will be held on the following dates and times:

    • Wednesday, November 9, 2016, from 8:30 a.m. to 5:00 p.m. EST • Thursday, November 10, 2016, from 8:30 a.m. to 5:00 p.m. EST

    All meetings are subject to cancellation. For updated status of the meeting prior to attendance, please contact the person listed under FOR FURTHER INFORMATION CONTACT.

    ADDRESSES:

    The meeting will be held at the Hotel Indigo, Inspiration Conference Room, 151 Haywood Street, Asheville, North Carolina.

    Written comments may be submitted as described under SUPPLEMENTARY INFORMATION. All comments, including names and addresses when provided, are placed in the record and are available for public inspection and copying. The public may inspect comments placed on the Committee's Web site listed above.

    FOR FURTHER INFORMATION CONTACT:

    Scott Stewart, Forest Resource Coordinating Committee Designated Federal Officer, Cooperative Forestry Staff by phone at 202-205-1618 or Jennifer Helwig, Forest Resource Coordinating Committee Program Coordinator, Cooperative Forestry Staff by phone at 202-205-0892. Individuals who use telecommunication devices for the deaf (TDD) may call the Federal Information Relay Service (FIRS) at 1-800-877-8339 between 8:00 a.m. and 8:00 p.m., Eastern Daylight Time, Monday through Friday.

    SUPPLEMENTARY INFORMATION:

    The purpose of the meeting is to:

    1. Discuss current and emerging recommendation efforts and develop a briefing paper for the incoming Administration;

    2. Meet with partners to hear concerns and opportunities to collaborate; and

    3. Conduct Work Group breakout sessions.

    The meeting is open to the public. The agenda will include time for people to make oral statements of three minutes or less. Individuals wishing to make an oral statement should submit a request in writing by November 3, 2016 to be scheduled on the agenda. Anyone who would like to bring related matters to the attention of the Committee may file written statements with the Committee staff before November 3, 2016. Written comments and time requests for oral comments must be sent to Scott Stewart, 1400 Independence Ave. SW., Mailstop 1123, Washington, DC 20250; or by email to [email protected]. A summary of the meeting will be posted at http://www.fs.fed.us/spf/coop/frcc within 21 days after the meeting.

    Meeting Accommodations: If you are a person requiring reasonable accommodation, please make requests in advance for sign language interpreting, assistive listening devices or other reasonable accommodations for access to the facility or proceedings by contacting the person listed under the FOR FURTHER INFORMATION CONTACT. All reasonable accommodation requests are managed on a case by case basis.

    Dated: October 21, 2016. Patricia Hirami, Associate Deputy Chief, State and Private Forestry.
    [FR Doc. 2016-26195 Filed 10-28-16; 8:45 am] BILLING CODE 3411-15-P
    DEPARTMENT OF AGRICULTURE Forest Service Allegheny Resource Advisory Committee Meeting AGENCY:

    Forest Service, USDA.

    ACTION:

    Notice of meeting.

    SUMMARY:

    The Allegheny Resource Advisory Committee (RAC) will meet in Warren, Pennsylvania. The committee is authorized under the Secure Rural Schools and Community Self-Determination Act (the Act) and operates in compliance with the Federal Advisory Committee Act. The purpose of the committee is to improve collaborative relationships and to provide advice and recommendations to the Forest Service concerning projects and funding consistent with Title II of the Act. Additional RAC information can be found at the following Web site: http://www.fs.usda.gov/main/allegheny/workingtogether/advisorycommittees.

    DATES:

    The meeting will be held Thursday, December 8, 2016, at 10:00 a.m. EST.

    All RAC meetings are subject to cancellation. For status of meeting prior to attendance, please contact the person listed under FOR FURTHER INFORMATION CONTACT.

    ADDRESSES:

    The meeting will be held at the Allegheny National Forest Supervisor's Office, 4 Farm Colony Drive, Warren, Pennsylvania.

    Written comments may be submitted as described under SUPPLEMENTARY INFORMATION. All comments, including names and addresses when provided, are placed in the record and are available for public inspection and copying. The public may inspect comments received at the Allegheny National Forest Supervisor's Office. Please call ahead at 814-728-6100 to facilitate entry into the building.

    FOR FURTHER INFORMATION CONTACT:

    Ruth Sutton, RAC Coordinator by phone at 814-728-6100, or via email at [email protected].

    Individuals who use telecommunication devices for the deaf (TDD) may call the Federal Information Relay Service (FIRS) at 1-800-877-8339 between 8:00 a.m. and 8:00 p.m., Eastern Standard Time, Monday through Friday. Please make requests in advance for sign language interpreting, assistive listening devices or other reasonable accommodation for access to the facility or proceedings by contacting the person listed above.

    SUPPLEMENTARY INFORMATION:

    The purpose of the meeting is to review and approve project submissions.

    The meeting is open to the public. The agenda will include time for people to make oral statements of three minutes or less. Individuals wishing to make an oral statement should request in writing by November 30, 2016, to be scheduled on the agenda. Anyone who would like to bring related matters to the attention of the committee may file written statements with the committee staff before or after the meeting. Written comments and requests for time to make oral comments must be sent to Ruth Sutton, RAC Coordinator, Allegheny National Forest Supervisor's Office, 4 Farm Colony Drive, Warren, Pennsylvania 16365; by email to [email protected], or via facsimile to 814-726-1465.

    Meeting Accommodations: If you are a person requiring reasonable accommodation, please make requests in advance for sign language interpreting, assistive listening devices or other reasonable accommodation for access to the facility or proceedings by contacting the person listed in the section titled For Further Information Contact. All reasonable accommodation requests are managed on a case-by-case basis.

    Dated: October 24, 2016. Sherry A. Tune, Forest Supervisor.
    [FR Doc. 2016-26165 Filed 10-28-16; 8:45 am] BILLING CODE 3411-15-P
    DEPARTMENT OF AGRICULTURE Forest Service National Urban and Community Forestry Advisory Council AGENCY:

    Forest Service, USDA.

    ACTION:

    Notice of meeting.

    SUMMARY:

    The National Urban and Community Forestry Advisory Council (Council) will meet in Washington, DC. The Council is authorized under Section 9 of the Cooperative Forestry Assistance Act, as amended by Title XII, Section 1219 of Public Law 101-624 (the Act) (16 U.S.C. 2105g) and the Federal Advisory Committee Act (FACA) (5 U.S.C. App. II). Additional information concerning the Council can be found by visiting the Council's Web site at: http://www.fs.fed.us/ucf/nucfac.shtml.

    DATES:

    The meeting will be held on the following dates and times:

    • Monday, November 13, 2016, from 9:00 a.m. to 5:00 p.m. Central Time or until Council business is completed. All meetings are subject to cancellation. For an updated status of meeting prior to attendance, please contact the person listed under FOR FURTHER INFORMATION CONTACT.

    ADDRESSES:

    The meeting will be held at the Santa Fe Room, Indianapolis Marriott Downtown, 350 West Maryland Street, Indianapolis, Indiana.

    Written comments concerning this meeting should be submitted as described under SUPPLEMENTARY INFORMATION. All comments, including names and addresses, when provided, are placed in the record and available for public inspection and copying. The public may inspect comments received at the USDA Forest Service, Sidney Yates Building, Room 3SC-01C, 201 14th Street SW., Washington, DC 20024. Please call ahead at 202-205-7829 to facilitate entry into the building.

    FOR FURTHER INFORMATION CONTACT:

    Nancy Stremple, Executive Staff, National Urban and Community Forestry Advisory Council, Sidney Yates Building, Room 3SC-01C, 201 14th Street SW., Washington, DC 20024 by telephone at 202-205-7829, or by email at [email protected], or by cell phone at 202-309-9873, or via facsimile at 202-690-5792. Individuals who use telecommunication devices for the deaf (TDD) may call the Federal Information Relay Service (FIRS) at 1-800-877-8339 between 8:00 a.m. and 8:00 p.m., Eastern Standard Time, Monday through Friday.

    SUPPLEMENTARY INFORMATION:

    The purpose of the meeting is to:

    1. Introduce new members;

    2. Finalize the 2016 Accomplishment and Recommendations;

    3. Update status of the 2017 grant review;

    4. Listen to local constituents' urban forestry concerns;

    5. Provide updates on the 10-year action plan (2016-2026);

    6. Receive Forest Service budget and program updates; and

    The meeting is open to the public. The agenda will include time for people to make oral statements of three minutes or less. Individuals wishing to make an oral statement should submit a request in writing by November 1, 2016, to be scheduled on the agenda. Council discussion is limited to Forest Service staff and Council members; however, anyone who would like to bring urban and community forestry matters to the attention of the Council may file written statements with the Council's staff before or after the meeting. Written comments and time requests for oral comments must be sent to Nancy Stremple, Executive Staff, National Urban and Community Forestry Advisory Council, Sidney Yates Building, Room 3SC-01C, 201 14th Street SW., Washington, DC 20024, or by email at [email protected].

    Meeting Accommodations: If you are a person requiring reasonable accommodation, please make requests in advance for sign language interpreting, assistive listening devices or other reasonable accommodation for access to the facility or proceedings by contacting the person listed in the section titled FOR FURTHER INFORMATION CONTACT. All reasonable accommodation requests are managed on a case by case basis.

    Dated: October 25, 2016. Steven W. Koehn, Director, Cooperative Forestry.
    [FR Doc. 2016-26200 Filed 10-28-16; 8:45 am] BILLING CODE 3411-15-P
    DEPARTMENT OF AGRICULTURE National Agricultural Statistics Service Notice of Invitation for Nominations to the Advisory Committee on Agriculture Statistics AGENCY:

    National Agricultural Statistics Service (NASS), Department of Agriculture.

    ACTION:

    Solicitation of Nominations to the Advisory Committee on Agriculture Statistics.

    SUMMARY:

    In accordance with the Federal Advisory Committee Act, 5 U.S.C. App. 2, this notice announces an invitation from the Office of the Secretary of Agriculture for nominations to the Advisory Committee on Agriculture Statistics.

    On August 15, 2016, the Secretary of Agriculture renewed the Advisory Committee charter for a two-year term to expire on August 15, 2018. The purpose of the Committee is to advise the Secretary of Agriculture on the scope, timing, content, etc., of the periodic censuses and surveys of agriculture, other related surveys, and the types of information to obtain from respondents concerning agriculture. The Committee also prepares recommendations regarding the content of agriculture reports and presents the views and needs for data of major suppliers and users of agriculture statistics.

    DATES:

    The nomination period for interested candidates will close 30 days after publication of this notice.

    ADDRESSES:

    You may submit nominations by any of the following methods:

    Email: Scan the completed form and email to: [email protected].

    eFax: 855-493-0445.

    Mail: Nominations should be mailed to Renee Picanso, Associate Administrator, National Agricultural Statistics Service, U.S. Department of Agriculture, 1400 Independence Avenue SW., Room 5041 South Building, Washington, DC 20250-2010.

    Hand Delivery/Courier: Hand deliver to: Renee Picanso, Associate Administrator, National Agricultural Statistics Service, U.S. Department of Agriculture, 1400 Independence Avenue SW., Room 5041 South Building, Washington, DC 20250-2010.

    FOR FURTHER INFORMATION CONTACT:

    Renee Picanso, Associate Administrator, National Agricultural Statistics Service, (202) 720-2707.

    SUPPLEMENTARY INFORMATION:

    Each person nominated to serve on the Committee is required to submit the following form: AD-755 (Advisory Committee Membership Background Information, OMB Number 0505-0001), available on the Internet at https://www.nass.usda.gov/About_NASS/Advisory_Committee_on_Agriculture_Statistics/AD-755.pdf. This form may also be requested by telephone, fax, or email using the information above. Completed forms may be faxed to the number above, mailed, or completed and emailed directly from the Internet site. For more information on the Advisory Committee on Agriculture Statistics, see the NASS Web site at https://www.nass.usda.gov/About_NASS/Advisory_Committee_on_Agriculture_Statistics/index.php. The Committee draws on the experience and expertise of its members to form a collective judgment concerning agriculture data collected and the statistics issued by NASS. This input is vital to keeping NASS current with shifting data needs in the rapidly changing agricultural environment and to keeping the agency informed of emerging issues in the agriculture community that can affect agriculture statistics activities.

    The Committee, appointed by the Secretary of Agriculture, consists of 20 members representing a broad range of disciplines and interests, including, but not limited to, producers, representatives of national farm organizations, agricultural economists, rural sociologists, farm policy analysts, educators, State agriculture representatives, and agriculture-related business and marketing experts.

    Members serve staggered 2-year terms, with terms for half of the Committee members expiring in any given year. Nominations are being sought for 10 open Committee seats. Members can serve up to 3 terms for a total of 6 consecutive years. The Chairperson of the Committee shall be elected by members to serve a 1-year term.

    Equal opportunity practices, in line with USDA policies, will be followed in all membership appointments to the Committee. To ensure that the recommendations of the Committee have taken into account the needs of the diverse groups served by USDA, membership will include, to the extent possible, individuals with demonstrated ability to represent the needs of all racial and ethnic groups, women and men, and persons with disabilities.

    The duties of the Committee are solely advisory. The Committee will make recommendations to the Secretary of Agriculture with regard to the agricultural statistics programs of NASS, and such other matters as it may deem advisable, or which the Secretary of Agriculture; Under Secretary for Research, Education, and Economics; or the Administrator of NASS may request. The Committee will meet at least annually. All meetings are open to the public. Committee members are reimbursed for official travel expenses only.

    Send questions, comments, and requests for additional information to the email address, fax number, or address listed above.

    Signed at Washington, DC, October 20, 2016. R. Renee Picanso, Associate Administrator, National Agricultural Statistics Service.
    [FR Doc. 2016-26154 Filed 10-28-16; 8:45 am] BILLING CODE 3410-20-P
    DEPARTMENT OF AGRICULTURE National Agricultural Statistics Service Notice of Intent To Request Revision and Extension of a Currently Approved Information Collection AGENCY:

    National Agricultural Statistics Service, USDA.

    ACTION:

    Notice and request for comments.

    SUMMARY:

    In accordance with the Paperwork Reduction Act of 1995, this notice announces the intention of the National Agricultural Statistics Service (NASS) to request revision and extension of a currently approved information collection, the Cotton Ginning Survey. Revision to burden hours will be needed due to changes in the size of the target population, sampling design, and/or questionnaire length.

    DATES:

    Comments on this notice must be received by December 30, 2016, to be assured of consideration.

    ADDRESSES:

    You may submit comments, identified by docket number 0535-0220, by any of the following methods:

    Email: [email protected]. Include the docket number above in the subject line of the message.

    E-fax: (855) 838-6382.

    Mail: Mail any paper, disk, or CD-ROM submissions to: David Hancock, NASS Clearance Officer, U.S. Department of Agriculture, Room 5336 South Building, 1400 Independence Avenue SW., Washington, DC 20250-2024.

    Hand Delivery/Courier: Hand deliver to: David Hancock, NASS Clearance Officer, U.S. Department of Agriculture, Room 5336 South Building, 1400 Independence Avenue SW., Washington, DC 20250-2024.

    FOR FURTHER INFORMATION CONTACT:

    R. Renee Picanso, Associate Administrator, National Agricultural Statistics Service, U.S. Department of Agriculture, (202) 720-2707. Copies of this information collection and related instructions can be obtained without charge from David Hancock, NASS—OMB Clearance Officer, at (202) 690-2388 or at [email protected].

    SUPPLEMENTARY INFORMATION:

    Title: Cotton Ginning Survey.

    OMB Control Number: 0535-0220.

    Expiration Date of Approval: March 31, 2017.

    Type of Request: Intent to Seek Approval to Revise and Extend an Information Collection for a period of three years.

    Abstract: The primary objective of the National Agricultural Statistics Service (NASS) is to collect, prepare, and issue State and national estimates of crop and livestock production, prices, and disposition, as well as economic and environmental statistics related to agriculture, and to conduct the Census of Agriculture. The Cotton Ginning surveys provide cotton ginning statistics by State from August through May. Data collected consist of bales of cotton ginned to date, cotton to be ginned, lint cotton produced, cottonseed produced, cottonseed sold to oil mills, cottonseed used for other uses, number of gins by type, bales produced by county of origin, and cottonseed prices received by producers. The forecasting procedure involves calculating a weighted percent ginned to date as well as an allowance for cross-State movement and bale weight adjustments. Production by State allows adjustments for year-end State and county estimates. Total pounds of lint cotton produced is used to derive an actual bale weight, which increases the precision of production estimates.
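    To make the forecasting arithmetic above concrete, the following is a minimal, hypothetical sketch written in Python with made-up figures. It is not NASS's estimation code, the function and variable names are assumptions, and it omits the cross-State movement allowance; it only illustrates how a weighted percent ginned to date and an actual bale weight derived from total lint pounds can be combined into a production forecast.

        # Illustrative sketch only; not NASS methodology. All figures are hypothetical.
        def forecast_production(bales_ginned_to_date, pct_ginned_to_date, lint_pounds_to_date):
            # Expand season-to-date ginnings by the weighted share of the crop
            # historically ginned by this point in the season (expressed as 0-1).
            forecast_bales = bales_ginned_to_date / pct_ginned_to_date
            # Derive an actual bale weight from total lint pounds produced to date.
            actual_bale_weight = lint_pounds_to_date / bales_ginned_to_date  # pounds per bale
            forecast_lint_pounds = forecast_bales * actual_bale_weight
            return forecast_bales, actual_bale_weight, forecast_lint_pounds

        # Example: 1.2 million bales ginned, 60 percent of the crop typically ginned
        # by this date, and 576 million pounds of lint produced to date.
        print(forecast_production(1_200_000, 0.60, 576_000_000))
        # -> (2000000.0, 480.0, 960000000.0): about 2.0 million bales at 480 pounds per bale.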

    Authority: These data will be collected under authority of 7 U.S.C. 2204(a). Individually identifiable data collected under this authority are governed by Section 1770 of the Food Security Act of 1985 as amended, 7 U.S.C. 2276, which requires USDA to afford strict confidentiality to non-aggregated data provided by respondents. This Notice is submitted in accordance with the Paperwork Reduction Act of 1995, Public Law 104-13 (44 U.S.C. 3501, et seq.) and Office of Management and Budget regulations at 5 CFR part 1320.

    NASS also complies with OMB Implementation Guidance, “Implementation Guidance for Title V of the E-Government Act, Confidential Information Protection and Statistical Efficiency Act of 2002 (CIPSEA),” Federal Register, Vol. 72, No. 115, June 15, 2007, p. 33362.

    Estimate of Burden: Public reporting burden for this collection of information is estimated to be between 10 and 15 minutes per respondent per survey.

    Respondents: Active Cotton Gins.

    Estimated Number of Respondents: 650.

    Estimated Total Annual Burden on Respondents: 1,250 hours.

    Comments: Comments are invited on: (a) Whether the proposed collection of information is necessary for the proper performance of the functions of the agency, including whether the information will have practical utility; (b) the accuracy of the agency's estimate of the burden of the proposed collection of information including the validity of the methodology and assumptions used; (c) ways to enhance the quality, utility, and clarity of the information to be collected; and (d) ways to minimize the burden of the collection of information on those who are to respond, including through the use of appropriate automated, electronic, mechanical, technological, or other forms of information technology collection methods.

    All responses to this notice will become a matter of public record and be summarized in the request for OMB approval.

    Signed at Washington, DC, October 18, 2016. R. Renee Picanso, Associate Administrator.
    [FR Doc. 2016-26153 Filed 10-28-16; 8:45 am] BILLING CODE 3410-20-P
    DEPARTMENT OF COMMERCE Foreign-Trade Zones Board [B-48-2016] Foreign-Trade Zone (FTZ) 38—Spartanburg, South Carolina Authorization of Production Activity Benteler Automotive Corporation (Automotive Suspension and Body Components) Duncan, South Carolina

    On June 28, 2016, the South Carolina State Ports Authority, grantee of FTZ 38, submitted a notification of proposed production activity to the FTZ Board on behalf of Benteler Automotive Corporation, within Subzone 38F, in Duncan, South Carolina.

    The notification was processed in accordance with the regulations of the FTZ Board (15 CFR part 400), including notice in the Federal Register inviting public comment (81 FR 49927, July 29, 2016). The FTZ Board has determined that no further review of the activity is warranted at this time. The production activity described in the notification is authorized, subject to the FTZ Act and the Board's regulations, including Section 400.14.

    Dated: October 25, 2016. Andrew McGilvray, Executive Secretary.
    [FR Doc. 2016-26219 Filed 10-28-16; 8:45 am] BILLING CODE 3510-DS-P
    DEPARTMENT OF COMMERCE Bureau of Industry and Security Proposed Information Collection; Comment Request; Report of Requests for Restrictive Trade Practice or Boycott AGENCY:

    Bureau of Industry and Security, Department of Commerce.

    ACTION:

    Notice.

    SUMMARY:

    The Department of Commerce, as part of its continuing effort to reduce paperwork and respondent burden, invites the general public and other Federal agencies to take this opportunity to comment on proposed and/or continuing information collections, as required by the Paperwork Reduction Act of 1995.

    DATES:

    Written comments must be submitted on or before December 30, 2016.

    ADDRESSES:

    Direct all written comments to Jennifer Jessup, Departmental Paperwork Clearance Officer, Department of Commerce, Room 6616, 14th and Constitution Avenue NW., Washington, DC 20230 (or via the Internet at [email protected]).

    FOR FURTHER INFORMATION CONTACT:

    Requests for additional information or copies of the information collection instrument and instructions should be directed to Mark Crace, BIS ICB Liaison, (202) 482-8093, [email protected].

    SUPPLEMENTARY INFORMATION:

    I. Abstract

    This information is used to monitor requests for participation in foreign boycotts against countries friendly to the U.S. The information is analyzed to note changing trends and to decide upon appropriate action to be taken to carry out the United States' policy of discouraging its citizens from participating in foreign restrictive trade practices and boycotts directed against friendly countries.

    II. Method of Collection

    Submitted on paper or electronically.

    III. Data

    OMB Control Number: 0694-0012.

    Form Number(s): BIS-621P, BIS-6051P, BIS-6051P-a.

    Type of Review: Regular submission.

    Affected Public: Business or other for-profit organizations.

    Estimated Number of Respondents: 892.

    Estimated Time per Response: 1 hour to 1 hour and 30 minutes.

    Estimated Total Annual Burden Hours: 1,171.

    Estimated Total Annual Cost to Public: $0.

    IV. Request for Comments

    Comments are invited on: (a) Whether the proposed collection of information is necessary for the proper performance of the functions of the agency, including whether the information shall have practical utility; (b) the accuracy of the agency's estimate of the burden (including hours and cost) of the proposed collection of information; (c) ways to enhance the quality, utility, and clarity of the information to be collected; and (d) ways to minimize the burden of the collection of information on respondents, including through the use of automated collection techniques or other forms of information technology.

    Comments submitted in response to this notice will be summarized and/or included in the request for OMB approval of this information collection; they also will become a matter of public record.

    Sheleen Dumas, PRA Departmental Lead, Office of the Chief Information Officer.
    [FR Doc. 2016-26157 Filed 10-28-16; 8:45 am] BILLING CODE 3510-33-P
    DEPARTMENT OF COMMERCE International Trade Administration [Docket No.: 161012955-6955-01] Call for Applications for the International Buyer Program Select Service for Calendar Year 2018 AGENCY:

    International Trade Administration, Department of Commerce.

    ACTION:

    Notice and call for applications.

    SUMMARY:

    The U.S. Department of Commerce (DOC), International Trade Administration (ITA) announces that it will accept applications for the International Buyer Program (IBP) Select service for calendar year 2018 (January 1, 2018, through December 31, 2018). This announcement sets out the objectives, procedures and application review criteria for IBP Select. Under IBP Select, ITA recruits international buyers to U.S. trade shows to meet with U.S. suppliers exhibiting at those shows. The main difference between IBP and IBP Select is that IBP offers worldwide promotion, whereas IBP Select focuses on promotion and recruitment in up to five international markets. Specifically, through the IBP Select, the DOC selects domestic trade shows that will receive DOC assistance in the form of targeted promotion and recruitment in up to five foreign markets, export counseling to exhibitors, and export counseling and matchmaking services at the trade show. This notice covers selection for IBP Select participation during calendar year 2018.

    DATES:

    Applications for IBP Select must be received by Friday, January 6, 2017.

    ADDRESSES:

    The application form can be found at www.export.gov/ibp. Applications may be submitted by any of the following methods: (1) Mail/Hand (including express) Delivery Service: International Buyer Program, Trade Promotion Programs, International Trade Administration, U.S. Department of Commerce, Ronald Reagan Building, 1300 Pennsylvania Ave. NW., Suite 800—Mezzanine Level—Atrium North, Washington, DC 20004; (2) Facsimile: (202) 482-7800; or (3) email: [email protected]. Facsimile and email applications will be accepted as interim applications, and must be followed by a signed original application that is received by the program no later than five (5) business days after the application deadline. To ensure that applications are received by the deadline, applicants are strongly urged to send applications by express delivery service (e.g., U.S. Postal Service Express Delivery, Federal Express, UPS, etc.).

    FOR FURTHER INFORMATION CONTACT:

    Vidya Desai, Senior Advisor, Trade Promotion Programs, International Trade Administration, U.S. Department of Commerce, 1300 Pennsylvania Ave. NW., Ronald Reagan Building, Suite 800M—Mezzanine Level—Atrium North, Washington, DC 20004; Telephone (202) 482-2311; Facsimile: (202) 482-7800; Email: [email protected].

    SUPPLEMENTARY INFORMATION:

    The IBP was established in the Omnibus Trade and Competitiveness Act of 1988 (Pub. L. 100-418, title II, § 2304, codified at 15 U.S.C. 4724) to bring international buyers together with U.S. firms by promoting leading U.S. trade shows in industries with high export potential. The IBP emphasizes cooperation between the DOC and trade show organizers to benefit U.S. firms exhibiting at selected shows and provides practical, hands-on assistance such as export counseling and market analysis to U.S. companies interested in exporting. Shows selected for the IBP Select will provide a venue for U.S. companies interested in expanding their sales into international markets.

    Through the IBP Select, the DOC selects trade shows that DOC determines to be leading trade shows with participation by U.S. firms interested in exporting. DOC provides successful applicants with assistance in the form of targeted overseas promotion of the show by U.S. Embassies and Consulates; outreach to show participants about exporting; recruitment of potential buyers to attend the shows; and staff assistance in setting up and staffing international trade centers at the shows. Targeted promotion in up to five markets can be executed through the overseas offices of ITA or in U.S. Embassies in countries where ITA does not maintain offices.

    ITA is accepting applications for IBP Select from trade show organizers of trade shows taking place between January 1, 2018, and December 31, 2018. Selection of a trade show for IBP Select is valid for one show. A trade show organizer seeking selection for a recurring show must submit a new application for selection for each occurrence of the show. For shows that occur more than once in a calendar year, the trade show organizer must submit a separate application for each show.

    There is no fee required to submit an application. For IBP Select in calendar year 2018, ITA expects to select approximately 10 shows from among the applicants. ITA will select those shows that are determined to most clearly support the statutory mandate in 15 U.S.C. 4721 to promote U.S. exports, especially those of small- and medium-sized enterprises, and that best meet the selection criteria articulated below. Once selected, applicants will be required to enter into a Memorandum of Agreement (MOA) with the DOC and submit payment of the $6,000 participation fee for 2018 (by check or credit card) within 30 days of written notification of acceptance into IBP Select. The MOA constitutes an agreement between the DOC and the show organizer specifying which responsibilities for international promotion and export assistance services at the trade shows are to be undertaken by the DOC as part of the IBP Select and, in turn, which responsibilities are to be undertaken by the show organizer. Anyone requesting application information will be sent a sample copy of the MOA along with the application form and a copy of this Federal Register Notice. Applicants are encouraged to review the MOA closely, as IBP Select participants are required to comply with all terms, conditions, and obligations in the MOA. Trade show organizer obligations include the construction of an International Trade Center at the trade show, production of an export interest directory, and provision of complimentary hotel accommodations for DOC staff as explained in the MOA. ITA responsibilities include targeted promotion of the trade show and, where feasible, recruitment of international buyers to that show from up to five identified target markets, provision of on-site export assistance to U.S. exhibitors at the show, and the reporting of results to the show organizer.

    Selection as an IBP Select show does not constitute a guarantee by DOC of the show's success. IBP Select participation status is not an endorsement of the show except as to its international buyer activities. Non-selection of an applicant for IBP Select status should not be viewed as a determination that the show will not be successful in promoting U.S. exports.

    Eligibility: 2018 U.S. trade shows with 1,350 or fewer exhibitors are eligible to apply, through the show organizer, for IBP Select participation. First-time shows will also be considered.

    Exclusions: U.S. trade shows with over 1,350 exhibitors will not be considered for IBP Select.

    General Evaluation Criteria: ITA will evaluate applicants for IBP Select using the following criteria:

    (a) Export Potential: The trade show promotes products and services from U.S. industries that have high export potential, as determined by DOC sources, including industry analysts' assessment of export potential, ITA best prospects lists, and U.S. export analysis.

    (b) Level of International Interest: The trade show meets the needs of a significant number of overseas markets and corresponds to marketing opportunities as identified by ITA. Previous international attendance at the show may be used as an indicator.

    (c) Scope of the Show: The show must offer a broad spectrum of U.S.-made products and services for the subject industry. Trade shows with a majority of U.S. firms as exhibitors are given priority.

    (d) U.S. Content of Show Exhibitors: Trade shows with exhibitors featuring a high percentage of products produced in the United States or products with a high degree of U.S. content will be preferred.

    (e) Stature of the Show: The trade show is clearly recognized by the industry it covers as a leading show for the promotion of that industry's products and services both domestically and internationally, and as a showplace for the latest technology or services in that industry.

    (f) Level of Exhibitor Interest: There is significant interest on the part of U.S. exhibitors in receiving international business visitors during the trade show. A significant number of U.S. exhibitors should be new-to-export or seeking to expand their sales into additional export markets.

    (g) Level of Overseas Marketing: There has been a demonstrated effort by the applicant to market prior shows overseas. In addition, the applicant should describe in detail the international marketing program to be conducted for the show, and explain how those efforts are expected to increase individual and group international attendance.

    (h) Level of Cooperation: The applicant demonstrates a willingness to cooperate with ITA to fulfill the program's goals and adhere to the target dates set out in the MOA and in the show timetables, both of which are available from the program office (see the FOR FURTHER INFORMATION CONTACT section above). Past experience in the IBP will be taken into account in evaluating the applications received.

    (i) Delegation Incentives: Waived or reduced (by at least 50 percent off the lowest price) admission fees are required for international attendees who are participating in IBP Select. Delegation leaders also must be provided complimentary admission to the show. In addition, show organizers should offer a range of incentives to delegations and/or delegation leaders recruited by the DOC overseas posts. Examples of incentives to international visitors and to organized delegations include: Special organized events, such as receptions, meetings with association executives, briefings, and site tours; or complimentary accommodations for delegation leaders.

    Review Process: ITA will vet all applications received based on the criteria set out in this notice. Vetting will include soliciting input from ITA industry analysts, as well as domestic and international field offices, focusing primarily on the export potential, level of international interest, and stature of the show. In reviewing applications, ITA will also consider sector and calendar diversity in terms of the need to allocate resources to support selected shows.

    Application Requirements: Show organizers submitting applications for 2018 IBP Select are required to submit: (1) A narrative statement addressing each question in the application, OMB 0625-0143 (found at www.export.gov/ibp); and (2) a signed statement that “The above information provided is correct and the applicant will abide by the terms set forth in this Call for Applications for the International Buyer Program Select (January 1, 2018 through December 31, 2018);” on or before the deadline noted above. Applications for IBP Select must be received by Friday, January 6, 2017. There is no fee required to apply. ITA expects to issue the results of this process in April 2017.

    Legal Authority: The statutory program authority for ITA to conduct the IBP is 15 U.S.C. 4724. ITA has the legal authority to enter into MOAs with show organizers under the provisions of the Mutual Educational and Cultural Exchange Act of 1961 (MECEA), as amended (22 U.S.C. 2455(f) and 2458(c)). MECEA allows ITA to accept contributions of funds and services from firms for the purposes of furthering its mission.

    The Office of Management and Budget (OMB) has approved the information collection requirements of the application to this program under the provisions of the Paperwork Reduction Act of 1995 (44 U.S.C. 3501 et seq.) (OMB Control No. 0625-0143). Notwithstanding any other provision of law, no person is required to respond to, nor shall a person be subject to a penalty for failure to comply with, a collection of information subject to the requirements of the Paperwork Reduction Act, unless that collection of information displays a currently valid OMB Control Number.

    For further information please contact: Vidya Desai, Senior Advisor, Trade Promotion Programs ([email protected]).

    Frank Spector, Trade Promotion Programs.
    [FR Doc. 2016-26218 Filed 10-28-16; 8:45 am] BILLING CODE 3510-DR-P
    DEPARTMENT OF COMMERCE International Trade Administration [A-533-840] Certain Frozen Warmwater Shrimp From India: Initiation and Preliminary Results of Antidumping Duty Changed Circumstances Review AGENCY:

    Enforcement and Compliance, International Trade Administration, Department of Commerce.

    SUMMARY:

    Avanti Frozen Foods Private Limited (Avanti Frozen) requested a changed circumstances review of the antidumping duty order on certain frozen warmwater shrimp (shrimp) from India. The Department of Commerce (Department) is initiating this changed circumstances review and preliminarily determining that Avanti Frozen is the successor-in-interest to Avanti Feeds Limited (Avanti Feeds).

    DATES:

    Effective October 31, 2016.

    FOR FURTHER INFORMATION CONTACT:

    E. Whitley Herndon, AD/CVD Operations, Office II, Enforcement and Compliance, International Trade Administration, U.S. Department of Commerce, 1401 Constitution Avenue NW., Washington, DC 20230; telephone: 202-482-6274.

    SUPPLEMENTARY INFORMATION:

    Background

    On February 1, 2005, the Department published in the Federal Register an antidumping duty order on shrimp from India.1 In the tenth administrative review of the Order, Avanti Feeds was assigned a cash deposit rate of 2.20 percent.2

    1See Notice of Amended Final Determination of Sales at Less Than Fair Value and Antidumping Duty Order: Certain Frozen Warmwater Shrimp from India, 70 FR 5147 (February 1, 2005) (Order).

    2See Certain Frozen Warmwater Shrimp From India: Final Results of Antidumping Duty Administrative Review; Final Determination of No Shipments; 2014-2015, 81 FR 62867 (September 13, 2016) (10th AR).

    On September 7, 2016, Avanti Frozen requested that, pursuant to section 751(b)(1) of the Tariff Act of 1930, as amended (the Act) and 19 CFR 351.216(b), the Department conduct a changed circumstances review of the Order to confirm that Avanti Frozen is the successor-in-interest to Avanti Feeds.3 In its submission, Avanti Frozen explained that Avanti Feeds undertook a business reorganization and transferred its shrimp business to its subsidiary company, Avanti Frozen.4 The domestic industry did not file any comment for these preliminary results.

    3See Letter from Avanti Frozen entitled “Frozen Warmwater Shrimp from India: Request to Initiate a Successor-in-Interest Changed Circumstances Review,” dated September 7, 2016 (Avanti Frozen CCR Request).

    4Id. at 2.

    Scope of the Order

    The merchandise subject to the order is certain frozen warmwater shrimp.5 The product is currently classified under the following Harmonized Tariff Schedule of the United States (HTSUS) item numbers: 0306.17.00.03, 0306.17.00.06, 0306.17.00.09, 0306.17.00.12, 0306.17.00.15, 0306.17.00.18, 0306.17.00.21, 0306.17.00.24, 0306.17.00.27, 0306.17.00.40, 1605.21.10.30, and 1605.29.10.10. Although the HTSUS numbers are provided for convenience and customs purposes, the written product description remains dispositive.

    5 For a complete description of the Scope of the Order, see 10th AR, and accompanying Issues and Decision Memorandum at “Scope.”

    Initiation and Preliminary Results

    Pursuant to section 751(b)(1) of the Act, the Department will conduct a changed circumstances review upon receipt of information concerning, or a request from, an interested party for a review of an antidumping duty order which shows changed circumstances sufficient to warrant a review of the order. As indicated in the “Background” section, we received information indicating that Avanti Feeds has transferred its shrimp business to Avanti Frozen. This constitutes changed circumstances warranting a review of the order.6 Therefore, in accordance with section 751(b)(1) of the Act and 19 CFR 351.216(d) and (e), we are initiating a changed circumstances review based upon the information contained in Avanti Frozen's submission.

    6See 19 CFR 351.216(d).

    Section 351.221(c)(3)(ii) of the Department's regulations permits the Department to combine the notice of initiation of a changed circumstances review and the notice of preliminary results if the Department concludes that expedited action is warranted. In this instance, because the record contains information necessary to make a preliminary finding, we find that expedited action is warranted and have combined the notice of initiation and the notice of preliminary results.

    In this changed circumstances review, pursuant to section 751(b) of the Act, the Department conducted a successor-in-interest analysis. In making a successor-in-interest determination, the Department examines several factors, including, but not limited to, changes in the following: (1) Management; (2) production facilities; (3) supplier relationships; and (4) customer base.7 While no single factor or combination of factors will necessarily provide a dispositive indication of a successor-in-interest relationship, generally, the Department will consider the new company to be the successor to the previous company if the new company's resulting operation is not materially dissimilar to that of its predecessor.8 Thus, if the record evidence demonstrates that, with respect to the production and sale of the subject merchandise, the new company operates as the same business entity as the predecessor company, the Department may assign the new company the cash deposit rate of its predecessor.9

    7See, e.g., Notice of Final Results of Changed Circumstances Antidumping Duty Administrative Review: Polychloroprene Rubber From Japan, 67 FR 58 (January 2, 2002).

    8See, e.g., Fresh and Chilled Atlantic Salmon From Norway; Final Results of Changed Circumstances Antidumping Duty Administrative Review, 64 FR 9979, 9980 (March 1, 1999).

    9See, e.g., Circular Welded Non-Alloy Steel Pipe From the Republic of Korea; Preliminary Results of Antidumping Duty Changed Circumstances Review, 63 FR 14679 (March 26, 1998), unchanged in Circular Welded Non-Alloy Steel Pipe From Korea; Final Results of Antidumping Duty Changed Circumstances Review, 63 FR 20572 (April 27, 1998), in which the Department found that a company which only changed its name and did not change its operations is a successor-in-interest to the company before it changed its name.

    In accordance with 19 CFR 351.216, we preliminarily determine that Avanti Frozen is the successor-in-interest to Avanti Feeds. Record evidence, as submitted by Avanti Frozen, indicates that Avanti Frozen operates as essentially the same business entity as Avanti Feeds with respect to the subject merchandise.10 For the complete successor-in-interest analysis, including discussion of business proprietary information, refer to the accompanying successor-in-interest memorandum.11

    10See Avanti Frozen CCR Request.

    11See Memorandum to Melissa G. Skinner, Director, Office II, entitled “Certain Frozen Warmwater Shrimp from India: Preliminary Successor-In-Interest Determination” dated concurrently with this notice.

    Record evidence, as submitted by Avanti Frozen, indicates that the shrimp business was transferred fully from Avanti Feeds to its subsidiary, Avanti Frozen. Specifically, Avanti Frozen provided a Business Transfer Agreement which transfers Avanti Feeds' entire shrimp business to Avanti Frozen; approvals from various governing entities confirming the transfer of the shrimp business from Avanti Feeds to Avanti Frozen; letters notifying customers, suppliers, and employees of the business transfer; Avanti Frozen's first annual report; charts demonstrating the board of directors and equity stockholders of both Avanti Feeds and Avanti Frozen; and a list of suppliers, customers, and production and business locations before and after the transfer.12 In summary, Avanti Frozen presented evidence to support its claim of successorship, and the transfer did not impact any of the criteria that the Department typically examines when making a changed circumstances determination.

    12See Avanti Frozen CCR Request.

    We find that the evidence provided by Avanti Frozen is sufficient to preliminarily determine that the transfer of shrimp operations from Avanti Feeds to its subsidiary Avanti Frozen did not affect the company's operations in a meaningful way. Therefore, based on the aforementioned reasons, we preliminarily determine that Avanti Frozen is the successor-in-interest to Avanti Feeds and, thus, should receive the same antidumping duty treatment with respect to the subject merchandise as Avanti Feeds.

    Public Comment

    Pursuant to 19 CFR 351.310(c), any interested party may request a hearing within 30 days of publication of this notice. In accordance with 19 CFR 351.309(c)(1)(ii), interested parties may submit case briefs not later than 30 days after the date of publication of this notice. Rebuttal briefs, limited to issues raised in the case briefs, may be filed no later than five days after the case briefs, in accordance with 19 CFR 351.309(d). Parties who submit case or rebuttal briefs are encouraged to submit with each argument: (1) A statement of the issue; (2) a brief summary of the argument; and (3) a table of authorities. All comments are to be filed electronically using Enforcement and Compliance's Antidumping and Countervailing Duty Centralized Electronic Service System (ACCESS), available to registered users at http://iaaccess.trade.gov and in the Central Records Unit, Room B8024 of the main Department of Commerce building, and must also be served on interested parties. An electronically filed document must be received successfully in its entirety by ACCESS by 5:00 p.m. Eastern Time on the day it is due.13

    13See 19 CFR 351.303(b).

    Consistent with 19 CFR 351.216(e), we will issue the final results of this changed circumstances review no later than 270 days after the date on which this review was initiated, or within 45 days if all parties agree to our preliminary finding. This notice is published in accordance with sections 751(b)(1) and 777(i) of the Act and 19 CFR 351.216(b), 351.221(b) and 351.221(c)(3).

    Dated: October 24, 2016. Ronald K. Lorentzen, Acting Assistant Secretary for Enforcement and Compliance.
    [FR Doc. 2016-26214 Filed 10-28-16; 8:45 am] BILLING CODE 3510-DS-P
    DEPARTMENT OF COMMERCE International Trade Administration [A-570-848] Freshwater Crawfish Tail Meat From the People's Republic of China: Initiation of Antidumping Duty New Shipper Review AGENCY:

    Enforcement and Compliance, International Trade Administration, Department of Commerce.

    DATES:

    Effective October 31, 2016.

    SUMMARY:

    Based on a request, the Department of Commerce (the Department) is initiating a new shipper review (NSR) of the antidumping duty order on freshwater crawfish tail meat from the People's Republic of China (PRC) with respect to Jingzhou Tianhe Aquatic Products Co., Ltd. (Jingzhou Tianhe). We have determined that this request meets the statutory and regulatory requirements for initiation.

    FOR FURTHER INFORMATION CONTACT:

    Dmitry Vladimirov, AD/CVD Operations Office I, Enforcement and Compliance, International Trade Administration, U.S. Department of Commerce, 1401 Constitution Avenue NW., Washington, DC 20230; Telephone: (202) 482-0665.

    SUPPLEMENTARY INFORMATION:

    Background

    The antidumping duty order on freshwater crawfish tail meat from the PRC was published in the Federal Register on September 15, 1997.1 Pursuant to section 751(a)(2)(B)(i) of the Tariff Act of 1930, as amended (the Act), the Department received a timely and properly filed request for a NSR of the order from Jingzhou Tianhe during the anniversary month of the antidumping duty order.2 In its request, Jingzhou Tianhe certified that it is both the producer and exporter of the subject merchandise upon which the request was based.3

    1See Notice of Amendment to Final Determination of Sales at Less Than Fair Value and Antidumping Duty Order: Freshwater Crawfish Tail Meat From the People's Republic of China, 62 FR 48218 (September 15, 1997) (Crawfish Order).

    2See Letter from Jingzhou Tianhe, “RE: Freshwater Crawfish Tail meat From the People's Republic of China; Request for New Shipper Review,” dated September 30, 2016.

    3Id., at 2.

    Pursuant to section 751(a)(2)(B)(i)(I) of the Act and 19 CFR 351.214(b)(2)(i), Jingzhou Tianhe certified that it did not export subject merchandise to the United States during the period of investigation (POI).4 In addition, pursuant to section 751(a)(2)(B)(i)(II) of the Act and 19 CFR 351.214(b)(2)(iii)(A), Jingzhou Tianhe certified that, since the initiation of the investigation, it has never been affiliated with any exporter or producer who exported subject merchandise to the United States during the POI, including those respondents not individually examined during the POI.5 As required by 19 CFR 351.214(b)(2)(iii)(B), Jingzhou Tianhe also certified that its export activities were not controlled by the government of the PRC.6

    4Id., at Attachment 1.

    5Id.

    6Id.

    In addition to the certifications described above, pursuant to 19 CFR 351.214(b)(2), Jingzhou Tianhe submitted documentation establishing the following: (1) The date on which it first shipped subject merchandise for export to the United States; (2) the volume of its first shipment; and (3) the date of its first sale to an unaffiliated customer in the United States.7

    7Id., at Attachment 2; see also Jingzhou Tianhe's October 14, 2016, response to the Department's request for additional information, dated October 3, 2016.

    Period of Review

    In accordance with 19 CFR 351.214(g)(1)(i)(A), the period of review (POR) for a NSR initiated in the month immediately following the anniversary month will be the twelve-month period immediately preceding the anniversary month. Therefore, the POR for this NSR is September 1, 2015, through August 31, 2016.

    Initiation of New Shipper Review

    Pursuant to section 751(a)(2)(B) of the Act and 19 CFR 351.214(b), we find that the request from Jingzhou Tianhe meets the threshold requirements for initiation of a NSR for shipments of freshwater crawfish tail meat from the PRC produced and exported by Jingzhou Tianhe.8

    8See the memorandum to the file entitled, “Freshwater Crawfish Tail Meat From the People's Republic of China: Initiation Checklist for Antidumping Duty New Shipper Review of Jingzhou Tianhe Aquatic Products Co., Ltd.,” dated concurrently with this notice.

    On February 24, 2016, the President signed into law the “Trade Facilitation and Trade Enforcement Act of 2015,” H.R. 644, which made several amendments to section 751(a)(2)(B) of the Act. We will conduct this NSR in accordance with section 751(a)(2)(B) of the Act, as amended by the Trade Facilitation and Trade Enforcement Act of 2015.9

    9 Notably, the Trade Facilitation and Trade Enforcement Act of 2015 removed from section 751(a)(2)(B) of the Act the provision directing the Department to instruct U.S. Customs and Border Protection (CBP) to allow an importer the option of posting a bond or security in lieu of a cash deposit during the pendency of an NSR.

    Unless extended, the Department intends to issue the preliminary results of this NSR no later than 180 days from the date of initiation and final results of the review no later than 90 days after the date the preliminary results are issued.10

    10See section 751(a)(2)(B)(iv) of the Act.

    It is the Department's usual practice, in cases involving non-market economy countries, to require that a company seeking to establish eligibility for an antidumping duty rate separate from the country-wide rate provide evidence of de jure and de facto absence of government control over the company's export activities. Accordingly, we will issue a questionnaire to Jingzhou Tianhe which will include a section requesting information concerning its eligibility for a separate rate. We will rescind the NSR of Jingzhou Tianhe if we determine that Jingzhou Tianhe has not demonstrated that it is eligible for a separate rate.

    Because Jingzhou Tianhe certified that it produced and exported subject merchandise, the sale of which is the basis for the request for a NSR, we will instruct CBP to continue to suspend liquidation of all entries of subject merchandise produced and exported by Jingzhou Tianhe.

    To assist in its analysis of the bona fides of Jingzhou Tianhe's sales, upon initiation of this NSR, the Department will require Jingzhou Tianhe to submit on an ongoing basis complete transaction information concerning any sales of subject merchandise to the United States that were made subsequent to the POR.

    Interested parties requiring access to proprietary information in the NSR should submit applications for disclosure under administrative protective order, in accordance with 19 CFR 351.305 and 351.306.

    This initiation and notice are published in accordance with section 751(a)(2)(B) of the Act and 19 CFR 351.214 and 351.221(c)(1)(i).

    Christian Marsh, DAS for AD/CVD Operations.
    [FR Doc. 2016-26148 Filed 10-28-16; 8:45 am] BILLING CODE 3510-DS-P
    DEPARTMENT OF COMMERCE International Trade Administration [A-557-813] Polyethylene Retail Carrier Bags From Malaysia: Final Results of the Antidumping Duty Administrative Review; 2014-2015 AGENCY:

    Enforcement and Compliance, International Trade Administration, Department of Commerce.

    SUMMARY:

    On June 24, 2016, the Department of Commerce (the Department) published the preliminary results of the administrative review of the antidumping duty order on polyethylene retail carrier bags (PRCBs) from Malaysia. The review covers one producer/exporter of the subject merchandise, Euro SME Sdn Bhd (Euro SME), for the period of review (POR) August 1, 2014, through July 31, 2015. The final estimated weighted-average dumping margin is listed below in the “Final Results of Review” section of this notice.

    DATES:

    Effective October 31, 2016.

    FOR FURTHER INFORMATION CONTACT:

    Bryan Hansen or Minoo Hatten, AD/CVD Operations, Office I, Enforcement and Compliance, International Trade Administration, U.S. Department of Commerce, 1401 Constitution Avenue NW., Washington, DC 20230; telephone: (202) 482-3683 or (202) 482-1690, respectively.

    SUPPLEMENTARY INFORMATION:

    Background

    On June 24, 2016, the Department published the Preliminary Results in the Federal Register, and invited parties to comment.1 For events subsequent to the Preliminary Results, see the Department's Final Decision Memorandum.2 The Department conducted this review in accordance with section 751 of the Tariff Act of 1930, as amended (the Act).

    1See Polyethylene Retail Carrier Bags From Malaysia: Preliminary Results of Antidumping Duty Administrative Review; 2014-2015, 81 FR 41294 (June 24, 2016) (Preliminary Results).

    2See Memorandum from Christian Marsh, Deputy Assistant Secretary for Antidumping and Countervailing Duty Operations, to Ronald K. Lorentzen, Acting Assistant Secretary for Enforcement and Compliance, “Issues and Decision Memorandum for Final Results of Antidumping Duty Administrative Review: Polyethylene Retail Carrier Bags from Malaysia,” dated concurrently with, and hereby adopted by this notice (Final Decision Memorandum).

    Scope of the Order

    The merchandise subject to the order is PRCBs. The product is currently classified under the Harmonized Tariff Schedule of the United States (HTSUS) subheading 3923.21.0085. While the HTSUS subheading is provided for convenience and customs purposes, the written description is dispositive. A full description of the scope of the order is contained in the Final Decision Memorandum.3

    3Id.

    Analysis of Comments Received

    All issues raised in the case and rebuttal briefs by parties to this review are addressed in the Final Decision Memorandum, which is hereby adopted by this notice. A list of the issues raised is attached to this notice as Appendix. The Final Decision Memorandum is a public document and is on file electronically via Enforcement and Compliance's Antidumping and Countervailing Duty Centralized Electronic Service System (ACCESS). ACCESS is available to registered users at https://access.trade.gov, and to all parties in the Central Records Unit (CRU), Room B8024 of the main Department of Commerce building. In addition, a complete version of the Final Decision Memorandum can be accessed directly on the Internet at http://enforcement.trade.gov/frn/index.html.

    Changes Since the Preliminary Results

    Based on our analysis of comments received, we made one revision that changed the results for Euro SME.4

    4 We corrected a programming error in the margin calculation we included in the Preliminary Results. A detailed discussion of the correction we made is in the final analysis memorandum for Euro SME, dated concurrently with this notice, which is available in ACCESS, to registered users at https://access.trade.gov, or available in the CRU.

    Final Results of the Review

    As a result of this administrative review, we determine that a weighted-average dumping margin of 0.00 percent exists for Euro SME for this POR.

    Disclosure

    We intend to disclose the calculations performed to parties in this proceeding within five days after public announcement of the final results, in accordance with 19 CFR 351.224(b).

    Assessment Rates

    In accordance with 19 CFR 351.212 and the Final Modification,5 the Department will instruct U.S. Customs and Border Protection (CBP) to liquidate all appropriate entries for Euro SME without regard to antidumping duties. For entries of subject merchandise during the POR produced by Euro SME for which it did not know its merchandise was destined for the United States, we will instruct CBP to liquidate unreviewed entries at the all-others rate if there is no rate for the intermediate company(ies) involved in the transaction. We intend to issue instructions to CBP 15 days after publication of the final results of this review.

    5See Antidumping Proceedings: Calculation of the Weighted-Average Dumping Margin and Assessment Rate in Certain Antidumping Duty Proceedings; Final Modification, 77 FR 8101, 8102 (February 14, 2012) (Final Modification).

    Cash Deposit Requirements

    The following deposit requirements will be effective upon publication of the notice of final results of administrative review for all shipments of PRCBs from Malaysia entered, or withdrawn from warehouse, for consumption on or after the date of publication as provided by section 751(a)(2) of the Act: (1) The cash deposit rate for Euro SME will be 0.00 percent, the rate established in the final results of this administrative review; (2) for merchandise exported by manufacturers or exporters not covered in this review but covered in a prior segment of the proceeding, the cash deposit rate will continue to be the company-specific rate published for the most recently completed segment of this proceeding in which that manufacturer or exporter participated; (3) if the exporter is not a firm covered in this review, a prior review, or the original investigation but the manufacturer is, the cash deposit rate will be the rate established for the most recently completed segment of this proceeding for the manufacturer of the merchandise; (4) the cash deposit rate for all other manufacturers or exporters will continue to be 84.94 percent.6 These cash deposit requirements, when imposed, shall remain in effect until further notice.

    6 This all-others rate was established in the Notice of Final Determination of Sales at Less Than Fair Value: Polyethylene Retail Carrier Bags From Malaysia, 69 FR 34128 (June 18, 2004).

    Notification to Importers

    This notice serves as a final reminder to importers of their responsibility under 19 CFR 351.402(f)(2) to file a certificate regarding the reimbursement of antidumping duties prior to liquidation of the relevant entries during this review period. Failure to comply with this requirement could result in the Secretary's presumption that reimbursement of antidumping duties occurred and the subsequent assessment of double antidumping duties.

    Administrative Protective Orders

    This notice also serves as a reminder to parties subject to administrative protective order (APO) of their responsibility concerning the destruction of proprietary information disclosed under APO in accordance with 19 CFR 351.305(a)(3). Timely written notification of the return or destruction of APO materials or conversion to judicial protective order is hereby requested. Failure to comply with the regulations and terms of an APO is a sanctionable violation.

    Notification to Interested Parties

    The Department is issuing and publishing these final results of administrative review in accordance with sections 751(a)(1) and 777(i)(1) of the Act, and 19 CFR 351.213(h).

    Dated: October 24, 2016. Ronald K. Lorentzen, Acting Assistant Secretary for Enforcement and Compliance.
    Appendix
    List of Topics Discussed in the Final Decision Memorandum:
    I. Summary
    II. Background
    III. Scope of the Order
    IV. Margin Calculation
    V. Discussion of the Issues
    Issue 1: Whether the U.S. Sale is Bona Fide
    Issue 2: Home Market Window Period
    VI. Recommendation
    [FR Doc. 2016-26220 Filed 10-28-16; 8:45 am] BILLING CODE 3510-DS-P
    DEPARTMENT OF COMMERCE International Trade Administration [Docket No. 161012954-6954-01] Call for Applications for the International Buyer Program Calendar Year 2018 AGENCY:

    International Trade Administration, Department of Commerce.

    ACTION:

    Notice and call for applications.

    SUMMARY:

    In this notice, the U.S. Department of Commerce (DOC) International Trade Administration (ITA) announces that it will accept applications for the International Buyer Program (IBP) for calendar year 2018 (January 1, 2018, through December 31, 2018). The announcement also sets out the objectives, procedures and application review criteria for the IBP. The purpose of the IBP is to bring international buyers together with U.S. firms in industries with high export potential at leading U.S. trade shows. Specifically, through the IBP, the ITA selects domestic trade shows which will receive ITA assistance in the form of global promotion in foreign markets, provision of export counseling to exhibitors, and provision of matchmaking services at the trade show. This notice covers selection for IBP participation during calendar year 2018.

    DATES:

    Applications for the IBP must be received by Friday, January 6, 2017.

    ADDRESSES:

    The application form can be found at www.export.gov/ibp. Applications may be submitted by any of the following methods: (1) Mail/Hand (including express) Delivery Service: International Buyer Program, Trade Promotion Programs, International Trade Administration, U.S. Department of Commerce, Ronald Reagan Building, 1300 Pennsylvania Ave. NW., Suite 800M—Mezzanine Level—Atrium North, Washington, DC 20004; (2) Facsimile: (202) 482-7800; or (3) email: [email protected]. Facsimile and email applications will be accepted as interim applications, but must be followed by a signed original application that is received by the program no later than five (5) business days after the application deadline. To ensure that applications are received by the deadline, applicants are strongly urged to send applications by express delivery service (e.g., U.S. Postal Service Express Delivery, Federal Express, UPS, etc.).

    FOR FURTHER INFORMATION CONTACT:

    Vidya Desai, Senior Advisor for Trade Events, Trade Promotion Programs, International Trade Administration, U.S. Department of Commerce, 1300 Pennsylvania Ave. NW., Ronald Reagan Building, Suite 800M—Mezzanine Level—Atrium North, Washington, DC 20004; Telephone (202) 482-2311; Facsimile: (202) 482-7800; Email: [email protected].

    SUPPLEMENTARY INFORMATION:

    The IBP was established in the Omnibus Trade and Competitiveness Act of 1988 (Pub. L. 100-418, codified at 15 U.S.C. 4724) to bring international buyers together with U.S. firms by promoting leading U.S. trade shows in industries with high export potential. The IBP emphasizes cooperation between the DOC and trade show organizers to benefit U.S. firms exhibiting at selected events and provides practical, hands-on assistance such as export counseling and market analysis to U.S. companies interested in exporting. Shows selected for the IBP will provide a venue for U.S. companies interested in expanding their sales into international markets.

    Through the IBP, ITA selects U.S. trade shows with participation by U.S. firms interested in exporting that ITA determines to be leading international trade shows, for promotion in overseas markets by U.S. Embassies and Consulates. The DOC is authorized to provide successful applicants with assistance in the form of overseas promotion of the show; outreach to show participants about exporting; recruitment of potential buyers to attend the events; and staff assistance in setting up international trade centers at the shows. Worldwide promotion is executed through ITA offices at U.S. Embassies and Consulates in more than 70 countries representing the United States' major trading partners, and also in Embassies in countries where ITA does not maintain offices.

    The International Trade Administration (ITA) is accepting applications from trade show organizers for the IBP for trade shows taking place between January 1, 2018, and December 31, 2018. Selection of a trade show is valid for one show, i.e., a trade show organizer seeking selection for a recurring show must submit a new application for selection for each occurrence of the show. For shows that occur more than once in a calendar year, the trade show organizer must submit a separate application for each show.

    For the IBP in calendar year 2018, the ITA expects to select approximately 20 shows from among the applicants. The ITA will select those shows that are determined to most clearly meet the statutory mandate in 15 U.S.C. 4721 to promote U.S. exports, especially those of small- and medium-sized enterprises, and the selection criteria articulated below.

    There is no fee required to submit an application. If accepted into the program for calendar year 2018, a participation fee of $9,800 is required for shows of five days or fewer. For trade shows more than five days in duration, or requiring more than one International Trade Center, a participation fee of $15,000 is required. For trade shows ten days or more in duration, and/or requiring more than two International Trade Centers, the participation fee will be determined by DOC and stated in the written notification of acceptance; that fee will be calculated on a full cost-recovery basis. Successful applicants will be required to enter into a Memorandum of Agreement (MOA) with ITA within 10 days of written notification of acceptance into the program. The participation fee (by check or credit card) is due within 30 days of written notification of acceptance into the program.
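    For illustration only, the fee tiers described above reduce to a small piece of conditional logic. The sketch below is our own restatement, not a DOC tool; the function name, its parameters, and the use of None for the cost-recovery tier are hypothetical, and it assumes that show duration and the number of International Trade Centers are the only inputs.

```python
def ibp_participation_fee(days, trade_centers=1):
    """Hypothetical restatement of the 2018 IBP participation fee tiers.

    Returns a dollar amount for the two fixed tiers, or None for the
    cost-recovery tier, whose amount is set by DOC in the written
    notification of acceptance.
    """
    if days >= 10 or trade_centers > 2:
        return None   # determined by DOC on a full cost-recovery basis
    if days > 5 or trade_centers > 1:
        return 15000  # more than five days, or more than one International Trade Center
    return 9800       # five days or fewer, one International Trade Center
```

    Under this reading, for example, a six-day show with one International Trade Center would fall in the $15,000 tier, while a four-day show with one center would owe $9,800.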

    The MOA constitutes an agreement between ITA and the show organizer specifying which responsibilities for international promotion and export assistance services at the trade shows are to be undertaken by ITA as part of the IBP and, in turn, which responsibilities are to be undertaken by the show organizer. Anyone requesting application information will be sent a sample copy of the MOA along with the application and a copy of this Federal Register Notice. Applicants are encouraged to review the MOA closely, as IBP participants are required to comply with all terms, conditions, and obligations in the MOA. Trade show organizer obligations include, but are not limited to, providing waived or reduced admission fees for international attendees who are participating in the IBP, the construction of an International Trade Center at the trade show, production of an export interest directory, and provision of complimentary hotel accommodations for DOC staff as explained in the MOA. Among the most important commitments are those requiring the trade show organizer to: include in the terms and conditions of its exhibitor contracts provisions for the protection of intellectual property rights (IPR); have procedures in place at the trade show to address IPR infringement which, at a minimum, provide information to help U.S. exhibitors procure legal representation during the trade show; and agree to assist the DOC in reaching and educating U.S. exhibitors on the Strategy Targeting Organized Piracy (STOP!), IPR protection measures available during the show, and the means to protect IPR in overseas markets, as well as in the United States. ITA responsibilities include, but are not limited to, the worldwide promotion of the trade show and, where feasible, recruitment of international buyers to that show, provision of on-site export assistance to U.S. exhibitors at the show, and the reporting of results to the show organizer.

    Selection as an IBP partner does not constitute a guarantee by DOC of the show's success. IBP partnership status is not an endorsement of the show except as to its international buyer activities. Non-selection of an applicant for IBP partnership status should not be viewed as a determination that the show will not be successful in promoting U.S. exports.

    Eligibility: All 2018 U.S. trade shows are eligible to apply for IBP participation through the show organizer.

    Exclusions: Trade shows that are either first-time or horizontal (non-industry specific) shows generally will not be considered.

    General Evaluation Criteria: The ITA will evaluate shows to be International Buyer Program partners using the following criteria:

    (a) Export Potential: The trade show promotes products and services from U.S. industries that have high export potential, as determined by DOC sources, including industry analysts' assessment of export potential, ITA best prospects lists and U.S. export statistics.

    (b) Level of International Interest: The trade show meets the needs of a significant number of overseas markets and corresponds to marketing opportunities as identified by ITA. Previous international attendance at the show may be used as an indicator of such interest.

    (c) Scope of the Show: The show offers a broad spectrum of U.S.-made products and services for the subject industry. Trade shows with a majority of U.S. firms as exhibitors will be given priority.

    (d) U.S. Content of Show Exhibitors: Trade shows with exhibitors featuring a high percentage of products produced in the United States or products with a high degree of U.S. content will be preferred.

    (e) Stature of the Show: The trade show is clearly recognized by the industry it covers as a leading show for the promotion of that industry's products and services both domestically and internationally, and as a showplace for the latest technology or services in that industry.

    (f) Level of Exhibitor Interest: U.S. exhibitors have expressed interest in receiving international business visitors during the trade show. A significant number of U.S. exhibitors should be seeking to begin exporting or to expand their sales into additional export markets.

    (g) Level of Overseas Marketing: There has been a demonstrated effort by the applicant to market this show and prior related shows. For this criterion, the applicant should describe in detail, among other information, the international marketing program to be conducted for the show, and explain how those efforts are expected to increase individual and group international attendance.

    (h) Logistics: The trade show site, facilities, transportation services, and availability of accommodations at the site of the exhibition (e.g., an International Trade Center, interpreters) are capable of accommodating large numbers of attendees whose native language is not English.

    (i) Level of Cooperation: The applicant demonstrates a willingness to cooperate with the ITA to fulfill the program's goals and adhere to the target dates set out in the MOA and in the show timetables, both of which are available from the program office (see the FOR FURTHER INFORMATION CONTACT section above). Past experience in the IBP will be taken into account in evaluating the applications received.

    (j) Delegation Incentives: The IBP Office will evaluate the level and/or range of incentives offered to delegations and/or delegation leaders recruited by U.S. overseas Embassies and Consulates. Examples of incentives to international visitors and to organized delegations include: Specially organized events, such as receptions, meetings with association executives, briefings, and site tours; and complimentary accommodations for delegation leaders (beyond those required in the MOA).

    Review Process: ITA will evaluate all applications received based on the criteria set out in this notice. Vetting will include soliciting input from ITA industry analysts, as well as domestic and international field offices, focusing primarily on the export potential, level of international interest, and stature of the show. In reviewing applications, ITA will also consider scheduling and sector balance in terms of the need to allocate resources to support selected shows.

    Application Requirements: Show organizers submitting applications for the 2018 IBP are requested to submit: (1) A narrative statement addressing each question in the application, Form OMB 0625-0143 (found at www.export.gov/ibp); (2) a signed statement that “The information submitted in this application is correct and the applicant will abide by the terms set forth in the Call for Applications for the 2018 International Buyer Program (January 1, 2018 through December 31, 2018);” and (3) two copies of the application: one copy of the application printed on company letterhead, and one electronic copy of the application submitted on a CD-RW (preferably in Microsoft Word® format), on or before the deadline noted above. There is no fee required to apply. Applications for the IBP must be received by Friday, January 6, 2017. ITA expects to issue the results of its review process in April 2017.

    Legal Authority: The statutory program authority for the ITA to conduct the International Buyer Program is 15 U.S.C. 4724. The DOC has the legal authority to enter into MOAs with show organizers under the provisions of the Mutual Educational and Cultural Exchange Act of 1961 (MECEA), as amended (22 U.S.C. 2455(f) and 2458(c)). MECEA allows ITA to accept contributions of funds and services from firms for the purposes of furthering its mission.

    The Office of Management and Budget (OMB) has approved the information collection requirements of the application to this program (Form OMB 0625-0143) under the provisions of the Paperwork Reduction Act of 1995 (44 U.S.C. 3501 et seq.) (OMB Control No. 0625-0143). Notwithstanding any other provision of law, no person is required to respond to, nor shall a person be subject to a penalty for failure to comply with, a collection of information subject to the requirements of the Paperwork Reduction Act, unless that collection of information displays a currently valid OMB Control Number.

    For further information please contact: Vidya Desai, Senior Advisor for Trade Events, Trade Promotion Programs ([email protected]).

    Frank Spector, Trade Promotion Programs.
    [FR Doc. 2016-26216 Filed 10-28-16; 8:45 am] BILLING CODE 3510-DR-P
    DEPARTMENT OF COMMERCE National Oceanic and Atmospheric Administration RIN 0648-XE906 Determination of Overfishing or an Overfished Condition AGENCY:

    National Marine Fisheries Service (NMFS), National Oceanic and Atmospheric Administration (NOAA), Commerce.

    ACTION:

    Notice.

    SUMMARY:

    This action serves as a notice that NMFS, on behalf of the Secretary of Commerce (Secretary), has found that the following stocks are subject to overfishing—Hood Canal coho salmon and Pribilof Islands blue king crab; the following salmon stocks are approaching an overfished condition—Quillayute Fall coho and Snohomish coho; and the following stocks are still both overfished and subject to overfishing—Western and Central North Pacific striped marlin and Atlantic and Gulf of Mexico dusky shark. NMFS, on behalf of the Secretary, notifies the appropriate fishery management council (Council) whenever it determines that overfishing is occurring, a stock is in an overfished condition, a stock is approaching an overfished condition, or when a rebuilding plan has not resulted in adequate progress toward ending overfishing and rebuilding affected fish stocks.

    FOR FURTHER INFORMATION CONTACT:

    Regina Spallone, (301) 427-8568.

    SUPPLEMENTARY INFORMATION:

    Pursuant to sections 304(e)(2) and (e)(7) of the Magnuson-Stevens Fishery Conservation and Management Act (Magnuson-Stevens Act), 16 U.S.C. 1854(e)(2) and (e)(7), and implementing regulations at 50 CFR 600.310(e)(2) and (j)(1), NMFS, on behalf of the Secretary, must notify Councils whenever it determines that a stock or stock complex is overfished or approaching an overfished condition; or if an existing rebuilding plan has not ended overfishing or resulted in adequate rebuilding progress. NMFS also notifies Councils when it determines a stock or stock complex is subject to overfishing.

    NMFS has determined that Hood Canal coho is subject to overfishing, based on the most recent salmon stock assessments conducted by the Pacific Fishery Management Council (Pacific Council) Salmon Technical Team (STT). The Pacific Council has, consistent with the Pacific Coast Salmon Fishery Management Plan, already taken action shaping the 2016 fisheries to ensure Pacific Council area fisheries are not contributing to overfishing (May 2, 2016, 81 FR 26157). In addition, NMFS has determined that Pribilof Islands blue king crab is subject to overfishing based on catch levels exceeding the stock's overfishing limit. The North Pacific Fishery Management Council has been informed that it must take action immediately to end overfishing of this stock.

    NMFS has determined that Quillayute Fall coho and Snohomish coho salmon are both approaching an overfished condition, based on the most recent salmon stock assessments conducted by the Pacific Council STT. A salmon stock is considered to be approaching an overfished condition if the 3-year geometric mean of its two most recent postseason estimates of spawning escapement and the current preseason forecast of spawning escapement falls below the stock's minimum stock size threshold. The Pacific Council has been informed that if either of these stocks becomes overfished, it must direct the STT to prepare a rebuilding plan within one year.
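    Restated as a formula (the symbols below are ours, not NMFS's), a salmon stock meets this criterion when

    \[ \left( E_{t-2}\, E_{t-1}\, F_{t} \right)^{1/3} < \mathrm{MSST} \]

    where E_{t-2} and E_{t-1} are the stock's two most recent postseason estimates of spawning escapement, F_t is the current preseason forecast of spawning escapement, and MSST is the stock's minimum stock size threshold.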

    In addition, NMFS has determined that both Western and Central North Pacific striped marlin and Atlantic and Gulf of Mexico dusky shark are still overfished and subject to overfishing, based on the most recent assessments of these stocks. The striped marlin determination was based on a 2015 assessment conducted by the Billfish Working Group of the International Scientific Committee for Tuna and Tuna-like Species in the North Pacific Ocean. On May 19, 2014, NMFS had announced its overfishing and overfished status determination for striped marlin, and informed the Western Pacific Fishery Management Council and the Pacific Fishery Management Council of their obligations under the Magnuson-Stevens Act to address the domestic and international impact of U.S. fisheries on this stock (79 FR 28686). NMFS continues to work with the Councils and its partners to meet its domestic and international obligations, as specified in that earlier notice.

    The dusky shark determination is based on a 2016 stock assessment update to the 21st Southeast Data Assessment and Review benchmark assessment for this stock, finalized in 2011. NMFS manages dusky shark under the 2006 Consolidated Atlantic Highly Migratory Species Fishery Management Plan and its amendments. Dusky shark has been a prohibited species since 2000, and may not be landed or retained in any fisheries. However, multiple commercial and recreational fisheries sometimes interact with the species as bycatch.

    Dated: October 25, 2016. Jennifer M. Wallace, Acting Director, Office of Sustainable Fisheries, National Marine Fisheries Service.
    [FR Doc. 2016-26126 Filed 10-28-16; 8:45 am] BILLING CODE 3510-22-P
    DEPARTMENT OF COMMERCE National Oceanic and Atmospheric Administration Submission for OMB Review; Comment Request

    The Department of Commerce will submit to the Office of Management and Budget (OMB) for clearance the following proposal for collection of information under the provisions of the Paperwork Reduction Act (44 U.S.C. Chapter 35).

    Agency: National Oceanic and Atmospheric Administration (NOAA).

    Title: Papahānaumokuākea Marine National Monument Permit Application and Reports for Permits (fka Northwestern Hawaiian Islands Marine National Monument).

    OMB Control Number: 0648-0548.

    Form Number(s): None.

    Type of Request: Regular (revision and extension of a currently approved information collection).

    Number of Respondents: 192.

    Average Hours per Response: Research, Conservation and Management and Education (“general” permits), 5 hours; Special Ocean Use permits, 10 hours; Native Hawaiian Practices permits, 8 hours; Recreation permits, 6 hours; modification requests and final reports, 10 hours; annual reports, 5 hours.

    Burden Hours: 1,343.

    Needs and Uses: This request is for revision and extension of a currently approved information collection. There will be minor changes to the forms and instructions.

    On June 15, 2006, President Bush established the Papahānaumokuākea Marine National Monument by issuing Presidential Proclamation 8031 (71 FR 36443, June 26, 2006) under the authority of the Antiquities Act (16 U.S.C. 431). The proclamation includes restrictions and prohibitions regarding activities in the monument consistent with the authority provided by the act. Specifically, the proclamation prohibits access to the monument except when passing through without interruption or as allowed under a permit issued by NOAA and the U.S. Fish and Wildlife Service (FWS). Vessels passing through the monument without interruption are required to notify NOAA and FWS upon entering into and leaving the monument. Individuals wishing to access the monument to conduct certain regulated activities must first apply for and be granted a permit issued by NOAA and FWS to certify compliance with vessel monitoring system requirements, monument regulations and best management practices. On August 29, 2006, NOAA and FWS published a final rule codifying the provisions of the proclamation (71 FR 51134).

    Affected Public: Individuals; not-for-profit institutions; Federal, State, and local governments; Native Hawaiian organizations; businesses or other for-profit organizations.

    Frequency: Annually and on occasion.

    Respondent's Obligation: Required to obtain or maintain benefits.

    This information collection request may be viewed at reginfo.gov. Follow the instructions to view Department of Commerce collections currently under review by OMB.

    Written comments and recommendations for the proposed information collection should be sent within 30 days of publication of this notice to [email protected] or fax to (202) 395-5806.

    Dated: October 26, 2016. Sarah Brabson, NOAA PRA Clearance Officer.
    [FR Doc. 2016-26155 Filed 10-28-16; 8:45 am] BILLING CODE 3510-NK-P
    DEPARTMENT OF COMMERCE National Oceanic and Atmospheric Administration Proposed Information Collection; Comment Request; Expanded Vessel Monitoring System Requirement in the Pacific Coast Groundfish Fishery AGENCY:

    National Oceanic and Atmospheric Administration (NOAA), Commerce.

    ACTION:

    Notice.

    SUMMARY:

    The Department of Commerce, as part of its continuing effort to reduce paperwork and respondent burden, invites the general public and other Federal agencies to take this opportunity to comment on proposed and/or continuing information collections, as required by the Paperwork Reduction Act of 1995.

    DATES:

    Written comments must be submitted on or before December 30, 2016.

    ADDRESSES:

    Direct all written comments to Jennifer Jessup, Departmental Paperwork Clearance Officer, Department of Commerce, Room 6616, 14th and Constitution Avenue NW., Washington, DC 20230 (or via the Internet at [email protected]).

    FOR FURTHER INFORMATION CONTACT:

    Requests for additional information or copies of the information collection instrument and instructions should be directed to Karen Palmigiano, (206) 526-4491 or [email protected].

    SUPPLEMENTARY INFORMATION:

    I. Abstract

    This request is for extension of a currently approved information collection.

    The National Oceanic and Atmospheric Administration (NOAA) has established large-scale depth-based management areas, referred to as Groundfish Conservation Areas (GCAs), where groundfish fishing is prohibited or restricted. These areas were specifically designed to reduce the catch of overfished species while allowing healthy fisheries to continue in areas and with gears where little incidental catch of overfished species is likely to occur. Because NOAA needs methods to effectively enforce area restrictions, certain commercial fishing vessels are required to install and use a vessel monitoring system (VMS) that automatically sends hourly position reports. Exemptions from the reporting requirement are available for inactive vessels or vessels fishing outside the monitored area. The vessels are also required to declare what gear will be used.

    To ensure the integrity of the GCAs and Rockfish Conservation Areas, a pilot VMS program was implemented on January 1, 2004. The pilot program required vessels registered to Pacific Coast groundfish fishery limited entry permits to carry and use VMS transceiver units while fishing off the coasts of Washington, Oregon and California. On January 1, 2007, the VMS program coverage was expanded to include all open access fisheries in addition to the limited entry fisheries. Finally, in 2010, NMFS expanded the declaration reports to include several more limited entry categories.

    II. Method of Collection

    The installation/activation reports are available over the Internet. Due to the need for the owner's signature, installation reports must be faxed or mailed to the National Marine Fisheries Service (NMFS). Hourly position reports are automatically sent from VMS transceivers installed aboard vessels. Exemption reports and declaration reports are submitted via a toll-free telephone number.

    III. Data

    OMB Control Number: 0648-0573.

    Form Number(s): None.

    Type of Review: Regular (extension of a currently approved collection).

    Affected Public: Business or other for-profit organizations; individuals or households.

    Estimated Number of Respondents: 1,500.

    Estimated Time per Response: VMS installation: 4 hours; VMS maintenance: 4 hours; installation, exemption and activation reports: 5 minutes each; and declaration reports: 4 minutes.

    Estimated Total Annual Burden Hours: 12,872.

    Estimated Total Annual Cost to Public: $4,350,375.

    IV. Request for Comments

    Comments are invited on: (a) Whether the proposed collection of information is necessary for the proper performance of the functions of the agency, including whether the information shall have practical utility; (b) the accuracy of the agency's estimate of the burden (including hours and cost) of the proposed collection of information; (c) ways to enhance the quality, utility, and clarity of the information to be collected; and (d) ways to minimize the burden of the collection of information on respondents, including through the use of automated collection techniques or other forms of information technology.

    Comments submitted in response to this notice will be summarized and/or included in the request for OMB approval of this information collection; they also will become a matter of public record.

    Dated: October 26, 2016. Sarah Brabson, NOAA PRA Clearance Officer.
    [FR Doc. 2016-26159 Filed 10-28-16; 8:45 am] BILLING CODE 3510-22-P
    DEPARTMENT OF COMMERCE National Oceanic and Atmospheric Administration Submission for OMB Review; Comment Request

    The Department of Commerce will submit to the Office of Management and Budget (OMB) for clearance the following proposal for collection of information under the provisions of the Paperwork Reduction Act (44 U.S.C. Chapter 35).

    Agency: National Oceanic and Atmospheric Administration (NOAA).

    Title: Licensing of Private Remote-Sensing Space Systems.

    OMB Control Number: 0648-0174.

    Form Number(s): None.

    Type of Request: Regular (extension of a currently approved information collection).

    Number of Respondents: 18.

    Average Hours per Response: 0 hours for the submission of a license application; 10 hours for the submission of a data protection plan; 5 hours for the submission of a plan describing how the licensee will comply with data collection restrictions; 3 hours for the submission of an operations plan for restricting collection or dissemination of imagery of Israeli territory; 3 hours for submission of a data flow diagram; 2 hours for the submission of satellite sub-systems drawings; 3 hours for the submission of a final imaging system specifications document; 2 hours for the submission of a public summary for a licensed system; 2 hours for the submission of a preliminary design review; 2 hours for the submission of a critical design review; 1 hour for notification of a binding launch services contract; 1 hour for notification of completion of pre-ship review; 10 hours for the submission of a license amendment; 2 hours for the submission of a foreign agreement notification; 2 hours for the submission of spacecraft operational information when a spacecraft becomes operational; 2 hours for notification of deviation in orbit or spacecraft disposition; 2 hours for notification of any operational deviation; 2 hours for notification of planned purges of information to the National Satellite Land Remote Sensing Data Archive; 3 hours for the submission of an operational quarterly report; 8 hours for an annual compliance audit; 10 hours for an annual operational audit; and 2 hours for notification of the demise of a system or a decision to discontinue system operations.

    Burden Hours: 552.

    Needs and Uses: This request is for extension of a current information collection.

    NOAA has established requirements for the licensing of private operators of remote-sensing space systems. The information in applications and subsequent reports is needed to ensure compliance with the Land Remote-Sensing Policy Act of 1992 and with the national security and international obligations of the United States. The requirements are contained in 15 CFR part 960.

    Affected Public: Business or other for-profit organizations.

    Frequency: Quarterly, annually and on occasion.

    Respondent's Obligation: Mandatory.

    This information collection request may be viewed at reginfo.gov. Follow the instructions to view Department of Commerce collections currently under review by OMB.

    Written comments and recommendations for the proposed information collection should be sent within 30 days of publication of this notice to [email protected] or fax to (202) 395-5806.

    Dated: October 26, 2016. Sarah Brabson, NOAA PRA Clearance Officer.
    [FR Doc. 2016-26156 Filed 10-28-16; 8:45 am] BILLING CODE 3510-HR-P
    DEPARTMENT OF COMMERCE National Oceanic and Atmospheric Administration Proposed Information Collection; Comment Request; Natural Resource Damage Assessment Restoration Project Information Sheet AGENCY:

    National Oceanic and Atmospheric Administration (NOAA), Commerce.

    ACTION:

    Notice.

    SUMMARY:

    The Department of Commerce, as part of its continuing effort to reduce paperwork and respondent burden, invites the general public and other Federal agencies to take this opportunity to comment on proposed and/or continuing information collections, as required by the Paperwork Reduction Act of 1995.

    DATES:

    Written comments must be submitted on or before December 30, 2016.

    ADDRESSES:

    Direct all written comments to Jennifer Jessup, Departmental Paperwork Clearance Officer, Department of Commerce, Room 6616, 14th and Constitution Avenue NW., Washington, DC 20230 (or via the Internet at [email protected]).

    FOR FURTHER INFORMATION CONTACT:

    Requests for additional information or copies of the information collection instrument and instructions should be directed to Megan Brockway, (301) 427-8692 or [email protected].

    SUPPLEMENTARY INFORMATION:

    I. Abstract

    This request is for an extension of a currently approved information collection.

    The purpose of this information collection is to assist state and federal Natural Resource Trustees in more efficiently carrying out the restoration planning phase of Natural Resource Damage Assessments (NRDA), in compliance with the National Environmental Policy Act (NEPA) of 1969, 42 U.S.C. 4321-4370d; 40 CFR 1500-1508; and other federal and local statutes and regulations as applicable. The NRDA Restoration Project Information Sheet is designed to facilitate the collection of information on existing, planned, or proposed restoration projects. This information will be used by the Natural Resource Trustees to develop potential restoration alternatives for natural resource injuries and service losses requiring restoration, during the restoration planning phase of the NRDA process.

    II. Method of Collection

    The Restoration Project Information Sheet can be submitted on paper through the mail or faxed, or can be submitted electronically via the internet or email.

    III. Data

    OMB Control Number: 0648-0497.

    Form Number: None.

    Type of Review: Regular submission (extension of a current information collection).

    Affected Public: State, local, or tribal governments; individuals or households; business or other for-profit organizations; not-for-profit institutions; farms; and the federal government.

    Estimated Number of Respondents: 300.

    Estimated Time per Response: 20 minutes.

    Estimated Total Annual Burden Hours: 100.

    Estimated Total Annual Cost to Public: $0 in recordkeeping/reporting costs.

    IV. Request for Comments

    Comments are invited on: (a) Whether the proposed collection of information is necessary for the proper performance of the functions of the agency, including whether the information shall have practical utility; (b) the accuracy of the agency's estimate of the burden (including hours and cost) of the proposed collection of information; (c) ways to enhance the quality, utility, and clarity of the information to be collected; and (d) ways to minimize the burden of the collection of information on respondents, including through the use of automated collection techniques or other forms of information technology.

    Comments submitted in response to this notice will be summarized and/or included in the request for OMB approval of this information collection; they also will become a matter of public record.

    Dated: October 26, 2016. Sarah Brabson, NOAA PRA Clearance Officer.
    [FR Doc. 2016-26160 Filed 10-28-16; 8:45 am] BILLING CODE 3510-22-P
    DEPARTMENT OF DEFENSE Department of the Army Notice of Intent To Grant Exclusive Patent License to RF Networking Solutions, LLC; East Brunswick, NJ AGENCY:

    Department of the Army, DoD.

    ACTION:

    Notice of Intent.

    SUMMARY:

    In compliance with 35 U.S.C. 209(e) and 37 CFR 404.7(a)(1)(i), the Department of the Army hereby gives notice of its intent to grant to RF Networking Solutions, LLC, a company having its principal place of business at 4 Huron Court, East Brunswick, NJ 08816, an exclusive license in all fields. The proposed license would be relative to the following: U.S. Patent Number 6,844,841 entitled “Radio Frequency Link Performance Tool Process and System”, Inventor Michael Masciulli, Issue Date January 18, 2005.

    DATES:

    The prospective exclusive license may be granted unless within fifteen (15) days from the date of this published notice, the U.S. Army Research Laboratory receives written objections including evidence and argument that establish that the grant of the license would not be consistent with the requirements of 35 U.S.C. 209 and 37 CFR 404.7. Competing applications completed and received by the U.S. Army Research Laboratory within fifteen (15) days from the date of this published notice will also be treated as objections to the grant of the contemplated exclusive license.

    Objections submitted in response to this notice will not be made available to the public for inspection and, to the extent permitted by law, will not be released under the Freedom of Information Act, 5 U.S.C. 552.

    ADDRESSES:

    Send written objections to U.S. Army Research Laboratory Technology Transfer and Outreach Office, RDRL-DPT/Thomas Mulkern, Building 321, Room 110, Aberdeen Proving Ground, MD 21005-5425.

    FOR FURTHER INFORMATION CONTACT:

    Thomas Mulkern, (410) 278-0889, E-Mail: [email protected].

    SUPPLEMENTARY INFORMATION:

    None.

    Brenda S. Bowen, Army Federal Register Liaison Officer.
    [FR Doc. 2016-26177 Filed 10-28-16; 8:45 am] BILLING CODE 5001-03-P
    DEPARTMENT OF DEFENSE Department of the Navy [Docket ID USN-2014-0017] Submission for OMB Review; Comment Request ACTION:

    Notice.

    SUMMARY:

    The Department of Defense has submitted to OMB for clearance, the following proposal for collection of information under the provisions of the Paperwork Reduction Act.

    DATES:

    Consideration will be given to all comments received by November 30, 2016.

    FOR FURTHER INFORMATION CONTACT:

    Fred Licari, 571-372-0493.

    SUPPLEMENTARY INFORMATION:

    Title, Associated Form and OMB Number: Candidate Application Procedures for the United States Naval Academy; USNA 1110/11, 1110/12, 1110/14, 1110/15, 1110/91, 1110/92, 1110/23, 1110/19, 1110/93, 1110/96, 1531/34, and 5500/1; OMB Control Number 0703-0036.

    Type of Request: Reinstatement, with change, of a previously approved collection for which approval has expired.

    Number of Respondents: 84,000.

    Responses per Respondent: 1.

    Annual Responses: 84,000.

    Average Burden per Response: 1 hour and 21 minutes.

    Annual Burden Hours: 99,165.

    Needs and Uses: The information is collected to determine the eligibility, overall competitive standing, scholastic, and leadership potential of candidates for an appointment to the USNA. Respondents are high school or college students applying for admission to the USNA, officials assisting with the application process, Chain of Command officials for active duty applicants, Blue and Gold Officers, and local law enforcement officials.

    Affected Public: Individuals or households; state, local, or tribal government.

    Frequency: Annually.

    Respondent's Obligation: Voluntary.

    OMB Desk Officer: Ms. Jasmeet Seehra.

    Comments and recommendations on the proposed information collection should be emailed to Ms. Jasmeet Seehra, DoD Desk Officer, at [email protected]. Please identify the proposed information collection by DoD Desk Officer and the Docket ID number and title of the information collection.

    You may also submit comments and recommendations, identified by Docket ID number and title, by the following method:

    Federal eRulemaking Portal: http://www.regulations.gov. Follow the instructions for submitting comments.

    Instructions: All submissions received must include the agency name, Docket ID number and title for this Federal Register document. The general policy for comments and other submissions from members of the public is to make these submissions available for public viewing on the Internet at http://www.regulations.gov as they are received without change, including any personal identifiers or contact information.

    DOD Clearance Officer: Mr. Frederick Licari.

    Written requests for copies of the information collection proposal should be sent to Mr. Licari at WHS/ESD Directives Division, 4800 Mark Center Drive, East Tower, Suite 03F09, Alexandria, VA 22350-3100.

    Dated: October 26, 2016. Aaron Siegel, Alternate OSD Federal Register Liaison Officer, Department of Defense.
    [FR Doc. 2016-26170 Filed 10-28-16; 8:45 am] BILLING CODE 5001-06-P
    DEPARTMENT OF DEFENSE Department of the Navy [Docket ID: USN-2013-0040] Submission for OMB Review; Comment Request ACTION:

    Notice.

    SUMMARY:

    The Department of Defense has submitted to OMB for clearance, the following proposal for collection of information under the provisions of the Paperwork Reduction Act.

    DATES:

    Consideration will be given to all comments received by November 30, 2016.

    FOR FURTHER INFORMATION CONTACT:

    Fred Licari, 571-372-0493.

    SUPPLEMENTARY INFORMATION:

    Title, Associated Form and OMB Number: Camp Lejeune Notification Database; OMB Control Number 0703-0057.

    Type of Request: Reinstatement.

    Number of Respondents: 10,000.

    Responses per Respondent: 1.

    Annual Responses: 10,000.

    Average Burden per Response: 6 minutes.

    Annual Burden Hours: 1,000 hours.

    Needs and Uses: The information collection requirement is used to obtain and maintain contact information on people who may have been exposed to contaminated drinking water in the past aboard Marine Corps Base Camp Lejeune, NC, as well as other persons interested in the issue. The information will be used to provide notifications and updated information as it becomes available. The information will also be used to correspond with registrants, as necessary (e.g. respond to voicemails or letters).

    Affected Public: Individuals or households; Federal Government.

    Frequency: On occasion.

    Respondent's Obligation: Voluntary.

    OMB Desk Officer: Ms. Jasmeet Seehra.

    Comments and recommendations on the proposed information collection should be emailed to Ms. Jasmeet Seehra, DoD Desk Officer, at [email protected]. Please identify the proposed information collection by DoD Desk Officer and the Docket ID number and title of the information collection.

    You may also submit comments and recommendations, identified by Docket ID number and title, by the following method:

    Federal eRulemaking Portal: http://www.regulations.gov. Follow the instructions for submitting comments.

    Instructions: All submissions received must include the agency name, Docket ID number and title for this Federal Register document. The general policy for comments and other submissions from members of the public is to make these submissions available for public viewing on the Internet at http://www.regulations.gov as they are received without change, including any personal identifiers or contact information.

    DOD Clearance Officer: Mr. Frederick Licari.

    Written requests for copies of the information collection proposal should be sent to Mr. Licari at WHS/ESD Directives Division, 4800 Mark Center Drive, East Tower, Suite 03F09, Alexandria, VA 22350-3100.

    Dated: October 26, 2016. Aaron Siegel, Alternate OSD Federal Register Liaison Officer, Department of Defense.
    [FR Doc. 2016-26173 Filed 10-28-16; 8:45 am] BILLING CODE 5001-06-P
    DELAWARE RIVER BASIN COMMISSION Notice of Public Hearing and Business Meeting; November 9 and December 14, 2016

    Notice is hereby given that the Delaware River Basin Commission will hold a public hearing on Wednesday, November 9, 2016. A business meeting will be held the following month, on Wednesday, December 14, 2016. The hearing and business meeting are open to the public and will be held at the Washington Crossing Historic Park Visitor Center, 1112 River Road, Washington Crossing, Pennsylvania.

    Public Hearing. The public hearing on November 9, 2016 will begin at 1:30 p.m. Hearing items will include draft dockets for the withdrawals, discharges and other water-related projects subject to the Commission's review. The Commission will also accept public input on the persistent dry conditions throughout the basin and how to address them. The Commission would then be prepared, if conditions worsen, to consider a declaration of water supply emergency under section 10.4 of the Compact.

    The list of projects scheduled for hearing, including project descriptions, will be posted on the Commission's Web site, www.drbc.net, in a long form of this notice at least ten days before the hearing date. Draft resolutions scheduled for hearing also will be posted at www.drbc.net ten or more days prior to the hearing.

    Written comments on matters scheduled for hearing on November 9 will be accepted through 5:00 p.m. on November 10. After the hearing on all scheduled matters has been completed, and as time allows, an opportunity for Open Public Comment will also be provided.

    The public is advised to check the Commission's Web site periodically prior to the hearing date, as items scheduled for hearing may be postponed if additional time is deemed necessary to complete the Commission's review, and items may be added up to ten days prior to the hearing date. In reviewing docket descriptions, the public is also asked to be aware that project details commonly change in the course of the Commission's review, which is ongoing.

    Public Meeting. The public business meeting on December 14, 2016 will begin at 10:30 a.m. and will include: adoption of the Minutes of the Commission's September 14, 2016 business meeting, announcements of upcoming meetings and events, a report on hydrologic conditions, reports by the Executive Director and the Commission's General Counsel, and consideration of any items for which a hearing has been completed or is not required.

    After all scheduled business has been completed and as time allows, the meeting will also include up to one hour of Open Public Comment.

    There will be no opportunity for additional public comment for the record at the December 14 business meeting on items for which a hearing was completed on November 9 or a previous date. Commission consideration on December 14 of items for which the public hearing is closed may result in approval of the item (by docket or resolution) as proposed, approval with changes, denial, or deferral. When the Commissioners defer an action, they may announce an additional period for written comment on the item, with or without an additional hearing date, or they may take additional time to consider the input they have already received without requesting further public input. Any deferred items will be considered for action at a public meeting of the Commission on a future date.

    Advance Sign-Up for Oral Comment. Individuals who wish to comment on the record during the public hearing on November 9 or to address the Commissioners informally during the Open Public Comment portion of the meeting on either November 9 or December 14 as time allows, are asked to sign up in advance by contacting Ms. Paula Schmitt of the Commission staff, at [email protected].

    Addresses for Written Comment. Written comment on items scheduled for hearing may be delivered by hand at the public hearing or: By hand, U.S. Mail or private carrier to: Commission Secretary, P.O. Box 7360, 25 State Police Drive, West Trenton, NJ 08628; by fax to Commission Secretary, DRBC at 609-883-9522; or by email (preferred) to [email protected]. If submitted by email, written comments on a docket should also be sent to Mr. David Kovach, Manager, Project Review Section at [email protected].

    Accommodations for Special Needs. Individuals in need of an accommodation as provided for in the Americans with Disabilities Act who wish to attend the informational meeting, conference session or hearings should contact the Commission Secretary directly at 609-883-9500 ext. 203 or through the Telecommunications Relay Services (TRS) at 711, to discuss how we can accommodate your needs.

    Additional Information, Contacts. Additional public records relating to hearing items may be examined at the Commission's offices by appointment by contacting Carol Adamovic, 609-883-9500, ext. 249. For other questions concerning hearing items, please contact Judith Scharite, Project Review Section assistant at 609-883-9500, ext. 216.

    Dated: October 25, 2016. Pamela M. Bush, Commission Secretary and Assistant General Counsel.
    [FR Doc. 2016-26176 Filed 10-28-16; 8:45 am] BILLING CODE 6360-01-P
    DEPARTMENT OF EDUCATION [Docket No. ED-2016-ICCD-0120] Agency Information Collection Activities; Comment Request; National Professional Development Program: Grantee Performance Report AGENCY:

    Department of Education (ED), Office of English Language Acquisition (OELA).

    ACTION:

    Notice.

    SUMMARY:

    In accordance with the Paperwork Reduction Act of 1995 (44 U.S.C. 3501 et seq.), ED is proposing an extension of an existing information collection.

    DATES:

    Interested persons are invited to submit comments on or before December 30, 2016.

    ADDRESSES:

    To access and review all the documents related to the information collection listed in this notice, please use http://www.regulations.gov by searching the Docket ID number ED-2016-ICCD-0120. Comments submitted in response to this notice should be submitted electronically through the Federal eRulemaking Portal at http://www.regulations.gov by selecting the Docket ID number or via postal mail, commercial delivery, or hand delivery. Please note that comments submitted by fax or email and those submitted after the comment period will not be accepted. Written requests for information or comments submitted by postal mail or delivery should be addressed to the Director of the Information Collection Clearance Division, U.S. Department of Education, 400 Maryland Avenue SW., LBJ, Room 2E-347, Washington, DC 20202-4537.

    FOR FURTHER INFORMATION CONTACT:

    For specific questions related to collection activities, please contact Samuel Lopez, 202-401-1423.

    SUPPLEMENTARY INFORMATION:

    The Department of Education (ED), in accordance with the Paperwork Reduction Act of 1995 (PRA) (44 U.S.C. 3506(c)(2)(A)), provides the general public and Federal agencies with an opportunity to comment on proposed, revised, and continuing collections of information. This helps the Department assess the impact of its information collection requirements and minimize the public's reporting burden. It also helps the public understand the Department's information collection requirements and provide the requested data in the desired format. ED is soliciting comments on the proposed information collection request (ICR) that is described below. The Department of Education is especially interested in public comment addressing the following issues: (1) Is this collection necessary to the proper functions of the Department; (2) will this information be processed and used in a timely manner; (3) is the estimate of burden accurate; (4) how might the Department enhance the quality, utility, and clarity of the information to be collected; and (5) how might the Department minimize the burden of this collection on the respondents, including through the use of information technology. Please note that written comments received in response to this notice will be considered public records.

    Title of Collection: National Professional Development Program: Grantee Performance Report.

    OMB Control Number: 1885-0555.

    Type of Review: An extension of an existing information collection.

    Respondents/Affected Public: State, Local, and Tribal Governments.

    Total Estimated Number of Annual Responses: 138.

    Total Estimated Number of Annual Burden Hours: 6,900.

    Abstract: The National Professional Development (NPD) program provides professional development activities intended to improve instruction for students with limited English proficiency and assists education personnel working with such children to meet high professional standards. The NPD program office is submitting this application to request approval to collect information from NPD grantees. This data collection serves two purposes: the data are necessary to assess the performance of the NPD program on Government Performance and Results Act measures, and budget information and data on project-specific performance measures are collected from NPD grantees for project monitoring.

    Dated: October 26, 2016. Kate Mullan, Acting Director, Information Collection Clearance Division, Office of the Chief Privacy Officer, Office of Management.
    [FR Doc. 2016-26222 Filed 10-28-16; 8:45 am] BILLING CODE 4000-01-P
    DEPARTMENT OF EDUCATION [Docket No.: ED-2016-ICCD-0118] Agency Information Collection Activities; Comment Request; GEPA Section 427 Guidance for All Grant Applications AGENCY:

    Office of the Secretary (OS), Department of Education (ED).

    ACTION:

    Notice.

    SUMMARY:

    In accordance with the Paperwork Reduction Act of 1995 (44 U.S.C. 3501 et seq.), ED is proposing an extension of an existing information collection.

    DATES:

    Interested persons are invited to submit comments on or before December 30, 2016.

    ADDRESSES:

    To access and review all the documents related to the information collection listed in this notice, please use http://www.regulations.gov by searching the Docket ID number ED-2016-ICCD-0118. Comments submitted in response to this notice should be submitted electronically through the Federal eRulemaking Portal at http://www.regulations.gov by selecting the Docket ID number or via postal mail, commercial delivery, or hand delivery. Please note that comments submitted by fax or email and those submitted after the comment period will not be accepted. Written requests for information or comments submitted by postal mail or delivery should be addressed to the Director of the Information Collection Clearance Division, U.S. Department of Education, 400 Maryland Avenue SW., LBJ, Room 2E-343, Washington, DC 20202-4537.

    FOR FURTHER INFORMATION CONTACT:

    For specific questions related to collection activities, please contact Alfreida Pettiford, 202-245-6110.

    SUPPLEMENTARY INFORMATION:

    The Department of Education (ED), in accordance with the Paperwork Reduction Act of 1995 (PRA) (44 U.S.C. 3506(c)(2)(A)), provides the general public and Federal agencies with an opportunity to comment on proposed, revised, and continuing collections of information. This helps the Department assess the impact of its information collection requirements and minimize the public's reporting burden. It also helps the public understand the Department's information collection requirements and provide the requested data in the desired format. ED is soliciting comments on the proposed information collection request (ICR) that is described below. The Department of Education is especially interested in public comment addressing the following issues: (1) Is this collection necessary to the proper functions of the Department; (2) will this information be processed and used in a timely manner; (3) is the estimate of burden accurate; (4) how might the Department enhance the quality, utility, and clarity of the information to be collected; and (5) how might the Department minimize the burden of this collection on the respondents, including through the use of information technology. Please note that written comments received in response to this notice will be considered public records.

    Title of Collection: GEPA Section 427 Guidance for All Grant Applications.

    OMB Control Number: 1894-0005.

    Type of Review: An extension of an existing information collection.

    Respondents/Affected Public: State, Local, and Tribal Governments.

    Total Estimated Number of Annual Responses: 12,396.

    Total Estimated Number of Annual Burden Hours: 18,594.

    Abstract: On October 20, 1994, the Improving America's Schools Act, Public Law 103-382 (The Act), became law. The Act added a provision to the General Education Provisions Act (GEPA). Section 427 of GEPA requires an applicant for assistance under Department programs to develop and describe in the grant application the steps it proposes to take to ensure equitable access to, and equitable participation in, its proposed project for students, teachers, and other program beneficiaries with special needs. The current GEPA Section 427 guidance for discretionary grant applications and formula grant applications has approval through March 31, 2014; the Department is requesting an extension of this approval.

    Dated: October 25, 2016. Stephanie Valentine, Acting Director, Information Collection Clearance Division, Office of the Chief Privacy Officer, Office of Management.
    [FR Doc. 2016-26123 Filed 10-28-16; 8:45 am] BILLING CODE 4000-01-P
    DEPARTMENT OF EDUCATION [Docket No. ED-2016-ICCD-0119] Agency Information Collection Activities; Comment Request; Evaluation of the Comprehensive Technical Assistance Centers AGENCY:

    Institute of Education Sciences (IES), Department of Education (ED).

    ACTION:

    Notice.

    SUMMARY:

    In accordance with the Paperwork Reduction Act of 1995 (44 U.S.C. 3501 et seq.), ED is proposing a revision of an existing information collection.

    DATES:

    Interested persons are invited to submit comments on or before December 30, 2016.

    ADDRESSES:

    To access and review all the documents related to the information collection listed in this notice, please use http://www.regulations.gov by searching the Docket ID number ED-2016-ICCD-0119. Comments submitted in response to this notice should be submitted electronically through the Federal eRulemaking Portal at http://www.regulations.gov by selecting the Docket ID number or via postal mail, commercial delivery, or hand delivery. Please note that comments submitted by fax or email and those submitted after the comment period will not be accepted. Written requests for information or comments submitted by postal mail or delivery should be addressed to the Director of the Information Collection Clearance Division, U.S. Department of Education, 400 Maryland Avenue SW., LBJ, Room 2E-347, Washington, DC 20202-4537.

    FOR FURTHER INFORMATION CONTACT:

    For specific questions related to collection activities, please contact Amy Johnson, 202-245-7781.

    SUPPLEMENTARY INFORMATION:

    The Department of Education (ED), in accordance with the Paperwork Reduction Act of 1995 (PRA) (44 U.S.C. 3506(c)(2)(A)), provides the general public and Federal agencies with an opportunity to comment on proposed, revised, and continuing collections of information. This helps the Department assess the impact of its information collection requirements and minimize the public's reporting burden. It also helps the public understand the Department's information collection requirements and provide the requested data in the desired format. ED is soliciting comments on the proposed information collection request (ICR) that is described below. The Department of Education is especially interested in public comment addressing the following issues: (1) Is this collection necessary to the proper functions of the Department; (2) will this information be processed and used in a timely manner; (3) is the estimate of burden accurate; (4) how might the Department enhance the quality, utility, and clarity of the information to be collected; and (5) how might the Department minimize the burden of this collection on the respondents, including through the use of information technology. Please note that written comments received in response to this notice will be considered public records.

    Title of Collection: Evaluation of the Comprehensive Technical Assistance Centers.

    OMB Control Number: 1850-0914.

    Type of Review: A revision of an existing information collection.

    Respondents/Affected Public: Individuals or Households.

    Total Estimated Number of Annual Responses: 648.

    Total Estimated Number of Annual Burden Hours: 236.

    Abstract: The National Evaluation of the Comprehensive Technical Assistance Centers will examine and document how the Comprehensive Center program and its individual centers intend to build State educational agency (SEA) capacity and what types of activities they actually conduct to build that capacity. The study will use surveys and interviews of center staff and technical assistance recipients, as well as technical assistance event observations, to collect information about how the Comprehensive Centers design their work, how they operate, and the results of their work.

    Dated: October 26, 2016. Kate Mullan, Acting Director, Information Collection Clearance Division, Office of the Chief Privacy Officer, Office of Management.
    [FR Doc. 2016-26158 Filed 10-28-16; 8:45 am] BILLING CODE 4000-01-P
    DEPARTMENT OF EDUCATION National Assessment Governing Board Quarterly Board Meeting AGENCY:

    National Assessment Governing Board, U.S. Department of Education.

    ACTION:

    Announcement of open and closed meetings.

    SUMMARY:

    This notice sets forth the agenda for the November 17-19, 2016 Quarterly Board Meeting of the National Assessment Governing Board (hereafter referred to as Governing Board). This notice provides information to members of the public who may be interested in attending the meeting or providing written comments on the meeting. The notice of this meeting is required under § 10(a)(2) of the Federal Advisory Committee Act (FACA).

    DATES:

    The Quarterly Board Meeting will be held on the following dates:

    • November 17, 2016, from 12:30 p.m. to 6:00 p.m.
    • November 18, 2016, from 8:30 a.m. to 5:00 p.m.
    • November 19, 2016, from 7:30 a.m. to 11:45 a.m.

    ADDRESSES:

    Sheraton Pentagon City, 900 South Orme Street, Arlington, Virginia 22204.

    FOR FURTHER INFORMATION CONTACT:

    Munira Mwalimu, Executive Officer/Designated Federal Official of the Governing Board, 800 North Capitol Street NW., Suite 825, Washington, DC 20002, telephone: (202) 357-6938, fax: (202) 357-6945.

    SUPPLEMENTARY INFORMATION:

    Statutory Authority and Function: The Governing Board is established under the National Assessment of Educational Progress Authorization Act, Title III of Public Law 107-279. Information on the Governing Board and its work can be found at www.nagb.gov.

    The Governing Board is established to formulate policy for the National Assessment of Educational Progress (NAEP). The Governing Board's responsibilities include the following: Selecting subject areas to be assessed, developing assessment frameworks and specifications, developing appropriate student achievement levels for each grade and subject tested, developing standards and procedures for interstate and national comparisons, improving the form and use of NAEP, developing guidelines for reporting and disseminating results, and releasing initial NAEP results to the public.

    November 17-19, 2016 Committee Meetings

    The Governing Board's standing committees will meet to conduct regularly scheduled work based on agenda items planned for this Quarterly Board Meeting and follow-up items as reported in the Governing Board's committee meeting minutes available at http://nagb.gov/what-we-do/board-committee-reports-and-agendas.html.

    Detailed Meeting Agenda: November 17-19, 2016

    November 17: Assessment Development Committee (ADC): Closed Session: 12:30 p.m. to 2:30 p.m.; Open Session: 2:30 p.m. to 4:00 p.m.

    November 17: Executive Committee: Open Session: 4:30 p.m. to 5:35 p.m.; Closed Session: 5:35 p.m. to 6:00 p.m.

    November 18: Full Governing Board and Committee Meetings

    Full Governing Board: Open Session: 8:30 a.m. to 10:00 a.m.; Closed Sessions: 12:45 p.m. to 3:15 p.m.; Open Session: 3:30 p.m. to 5:00 p.m.

    ADC and Committee on Standards, Design and Methodology (COSDAM): Joint Open Session: 10:15 a.m. to 11:00 a.m.; Joint Closed Session: 11:00 a.m. to 11:30 a.m.;

    ADC: Closed Session: 11:45 a.m. to 12:30 p.m.

    COSDAM: Open Session: 11:30 a.m. to 12:30 p.m.

    Reporting & Dissemination (R&D) Committee: Open Session 10:15 a.m. to 12:30 p.m.

    November 19: Full Governing Board and Committee Meetings

    Nominations Committee: Closed Session: 7:30 a.m. to 8:15 a.m.

    Full Governing Board: Closed Session: 8:30 a.m. to 9:45 a.m.; Open Session: 10:00 a.m. to 11:45 a.m.

    On Thursday, November 17, 2016, ADC will meet in closed session from 12:30 p.m. to 2:30 p.m. to review secure digital-based tasks in mathematics for grade 12 and for science at grades 4 and 8. This meeting must be conducted in closed session because the test items are secure and have not been released to the public. Public disclosure of the secure test items would significantly impede implementation of the NAEP assessment program if conducted in open session. Such matters are protected by exemption 9(B) of § 552b(c) of Title 5 of the United States Code.

    From 2:30 p.m. to 4:00 p.m. the ADC will meet in open session to review grade 12 contextual questions for students, teachers, and schools in reading and mathematics.

    The Executive Committee will meet in open session on November 17 from 4:30 p.m. to 5:35 p.m. and thereafter in closed session from 5:35 p.m. to 6:00 p.m. During the closed session, the Executive Committee will be briefed on the development of the NAEP research grants program and the forthcoming request for proposals (RFP). This discussion will include secure information that will be included in the request for proposals which is not yet available to the public. This meeting must be conducted in closed session because premature public disclosure of this information would likely have an adverse impact on the proposed agency action if conducted in open session. Such matters are protected by exemption 9(B) of § 552b(c) of Title 5 of the United States Code.

    On Friday, November 18, the full Governing Board will meet in open session from 8:30 a.m. to 10:00 a.m. The Governing Board will review and approve the November 17-19, 2016 Governing Board meeting agenda and the meeting minutes from the August 2016 Quarterly Board Meeting. Thereafter, the Secretary of Education, John B. King, Jr., will administer the oath of office to a new Board member and four reappointed members, following which he will provide remarks to the Governing Board.

    This session will be followed by a report from the Executive Director of the Governing Board, William Bushaw, followed by an update on National Center for Education Statistics (NCES) work by Holly Spurlock, Branch Chief, National Assessment Operations, NCES.

    The Governing Board will recess at 10:00 a.m. for committee meetings, which are scheduled to take place from 10:15 a.m. to 12:30 p.m.

    On November 18, 2016, the ADC will meet in a joint open session with COSDAM from 10:15 a.m. to 11:00 a.m. Thereafter the two committees will meet in a joint closed session from 11:00 a.m. to 11:30 a.m. to receive a briefing on an embargoed NCES research study involving 2015 mathematics data from grades 4 and 8 at national and state levels. The data and analyses are secure and have not been released to the public. Public disclosure of the secure test data and analyses would significantly impede implementation of the NAEP assessment program if conducted in open session. Such matters are protected by exemption 9(B) of § 552b(c) of Title 5 of the United States Code.

    Following this joint meeting, ADC will meet in closed session from 11:45 a.m. to 12:30 p.m. to receive a briefing on the history and content of the NAEP Long-Term Trend assessments in reading and mathematics, which are conducted at ages 9, 13, and 17. The briefing will include secure reading and mathematics test items from these three age-level assessments that have not been released to the public. This meeting must be conducted in closed session because the test items are secure and have not been released to the public. Public disclosure of the secure test items would significantly impede implementation of the NAEP assessment program if conducted in open session. Such matters are protected by exemption 9(B) of § 552b(c) of Title 5 of the United States Code.

    On November 18, the COSDAM will meet in open session from 11:30 a.m. to 12:30 p.m. to conduct regularly scheduled work. Also on November 18, the R&D Committee will meet in open session from 10:15 a.m. to 12:30 p.m. to conduct regularly scheduled work.

    Following the committee meetings on Friday, November 18, the Governing Board will meet in closed session from 12:45 p.m. to 1:45 p.m. to receive a briefing on the 2015 National Indian Education Study in reading and mathematics from James Deaton, NCES. Results from this study have not been released to the public. Public disclosure of the study results would significantly impede implementation of the NAEP assessment program if conducted in open session. Such matters are protected by exemption 9(B) of § 552b(c) of Title 5 of the United States Code.

    Following this closed session, the Governing Board will meet in closed session from 2:00 p.m. to 3:15 p.m. to receive a briefing from Eunice Greer, NCES, on data from recent NAEP digital-based pilot assessments in reading, mathematics, and writing. Secure test questions in each subject area as well as embargoed data will be presented during this briefing. The test questions and data have not been released to the public and the session must be conducted in closed session. Public disclosure of the secure test items and data would significantly impede implementation of the NAEP assessment program if conducted in open session. Such matters are protected by exemption 9(B) of § 552b(c) of Title 5 of the United States Code.

    Thereafter, the Governing Board will take a fifteen-minute break and reconvene in open session from 3:30 p.m. to 4:15 p.m. to discuss and take action on the Governing Board's Strategic Vision. The discussion will be led by the Governing Board's Vice Chair Lucille Davy, with a presentation from Lily Clark of the Governing Board staff.

    From 4:15 p.m. to 5:00 p.m., Marcella Goodridge-Keiller, Office of the General Counsel, will provide the annual ethics briefing, and William Bushaw, Governing Board Executive Director, and Peggy Carr, NCES Acting Commissioner, will provide a briefing on keeping embargoed data secure.

    The November 18, 2016 meeting will adjourn at 5:00 p.m.

    On November 19, the Nominations Committee will meet in closed session from 7:30 a.m. to 8:15 a.m. The committee will receive a briefing on nominations received for Governing Board terms beginning October 1, 2017. The Nominations Committee's discussions pertain solely to internal personnel rules and practices of an agency and information of a personal nature where disclosure would constitute a clearly unwarranted invasion of personal privacy. As such, the discussions are protected by exemptions 2 and 6 of § 552b(c) of Title 5 of the United States Code.

    On November 19, the Governing Board will meet in closed session from 8:30 a.m. to 9:45 a.m. to receive a briefing from the National Academy of Sciences on the Evaluation of the NAEP Achievement Levels for Mathematics and Reading. The evaluation report has not yet been publicly released by the National Academy of Sciences. Public disclosure of the evaluation results would significantly impede implementation of the NAEP assessment and reporting program if conducted in open session. Such matters are protected by exemption 9(B) of § 552b(c) of Title 5 of the United States Code.

    Thereafter, the Governing Board will have a short break and reconvene from 10:00 a.m. to 10:30 a.m. to receive an update on committee reports and take action on the R&D recommended release plan for the 2016 NAEP Arts assessment. The Governing Board will also take action on a joint delegation of authority to COSDAM and the Executive Committee for providing an official response to the Evaluation of NAEP Achievement Levels.

    Following a short break from 10:30 a.m. to 10:45 a.m., the Governing Board will meet in open session from 10:45 a.m. to 11:45 a.m. to receive a briefing on draft Governing Board guidelines for Releasing, Reporting, and Disseminating Results.

    The November 19, 2016 meeting is scheduled to adjourn at 11:45 a.m.

    Access to Records of the Meeting: Pursuant to FACA requirements, the public may also inspect the meeting materials at www.nagb.gov beginning on Thursday, November 17, 2016 by 10:00 a.m. ET. The official verbatim transcripts of the public meeting sessions will be available for public inspection no later than 30 calendar days following the meeting.

    Reasonable Accommodations: The meeting site is accessible to individuals with disabilities. If you will need an auxiliary aid or service to participate in the meeting (e.g., interpreting service, assistive listening device, or materials in an alternate format), notify the contact person listed in this notice at least two weeks before the scheduled meeting date. Although we will attempt to meet a request received after that date, we may not be able to make available the requested auxiliary aid or service because of insufficient time to arrange it.

    Electronic Access to This Document: The official version of this document is the document published in the Federal Register. Free Internet access to the official edition of the Federal Register and the Code of Federal Regulations is available via the Federal Digital System at: www.thefederalregister.org/fdsys. At this site you can view this document, as well as all other documents of this Department published in the Federal Register, in text or Adobe Portable Document Format (PDF). To use PDF, you must have Adobe Acrobat Reader, which is available free at the Adobe Web site.

    You may also access documents of the Department published in the Federal Register by using the article search feature at: www.federalregister.gov. Specifically, through the advanced search feature at this site, you can limit your search to documents published by the Department.

    Authority:

    Public Law 107-279, Title III—National Assessment of Educational Progress § 301.

    Dated: October 26, 2016. William J. Bushaw, Executive Director, National Assessment Governing Board (NAGB), U.S. Department of Education.
    [FR Doc. 2016-26194 Filed 10-28-16; 8:45 am] BILLING CODE 4000-01-P
    DEPARTMENT OF EDUCATION [Docket No. ED-2016-IES-0109] Request for Information on Interagency Working Group on Language and Communication's Report on Research and Development Activities AGENCY:

    Institute of Education Sciences, U.S. Department of Education.

    ACTION:

    Request for information.

    SUMMARY:

    To assist the National Science and Technology Council's (NSTC) Interagency Working Group on Language and Communication (IWGLC) in its efforts to further improve coordination and collaboration of research and development (R & D) agendas related to language and communication across the Federal Government, the Institute of Education Sciences (the Institute) requests information from interested parties through this notice.

    DATES:

    Written submissions must be received by the Department on or before December 30, 2016.

    ADDRESSES:

    Submit your comments through the Federal eRulemaking Portal or via postal mail or commercial delivery. We will not accept comments by fax, email, or hand delivery. To ensure that we do not receive duplicate copies, please submit your comments only one time. In addition, please include the Docket ID and the term “Language and Communication R & D Activities response” at the top of your comments.

    Federal eRulemaking Portal: Go to www.regulations.gov to submit your comments electronically. Information on using Regulations.gov, including instructions for accessing agency documents, submitting comments, and viewing the docket, is available on the site under “Are you new to this site?”

    Postal Mail or Commercial Delivery: If you mail your comments, address them to Rebecca McGill-Wilkinson, National Center for Education Research, Institute of Education Sciences, Attention: Language and Communication R & D Activities RFI, U.S. Department of Education, 400 Maryland Avenue SW., PCP-4127, Washington, DC 20202.

    Privacy Note: The Department's policy for comments received from members of the public (including comments submitted by mail or commercial delivery) is to make these submissions available for public viewing in their entirety on the Federal eRulemaking Portal at www.regulations.gov. Therefore, commenters should be careful to include in their comments only information that they wish to make publicly available on the Internet.

    Submission of Proprietary Information: Given the subject matter, some comments may include proprietary information as it relates to confidential commercial information. The Freedom of Information Act defines “confidential commercial information” as information the disclosure of which could reasonably be expected to cause substantial competitive harm. You may wish to request that we not disclose what you regard as confidential commercial information.

    To assist us in making a determination on your request, we encourage you to identify any specific information in your comments that you consider confidential commercial information. Please list the information by page and paragraph numbers.

    This is a request for information (RFI) only. This RFI is not a request for proposals (RFP) or a promise to issue an RFP or a notice inviting applications (NIA). This RFI does not commit the Department to contract for any supply or service whatsoever. Further, the Department is not seeking proposals and will not accept unsolicited proposals. The Department will not pay for any information or administrative costs that you may incur in responding to this RFI. If you do not respond to this RFI, you may still apply for future contracts and grants. The Department posts RFPs on the Federal Business Opportunities Web site (www.fbo.gov). The Department announces grant competitions in the Federal Register (www.thefederalregister.org/fdsys). It is your responsibility to monitor these sites to determine whether the Department issues an RFP or NIA after considering the information received in response to this RFI. The documents and information submitted in response to this RFI become the property of the U.S. Government and will not be returned.

    FOR FURTHER INFORMATION CONTACT:

    Dr. Rebecca McGill-Wilkinson, U.S. Department of Education, 400 Maryland Avenue SW., PCP 4127, Washington, DC. Telephone: (202) 245-7613 or by email: [email protected].

    If you use a telecommunications device for the deaf (TDD) or a text telephone (TTY), call the Federal Relay Service (FRS), toll free, at 1-800-877-8339.

    SUPPLEMENTARY INFORMATION:

    Introduction

    The Institute requests information from interested parties to help inform its work with the IWGLC as it moves forward to improve coordination and collaboration of research and development agendas related to language and communication across the Federal Government, as described in a recently published report. The Report from the Interagency Working Group on Language & Communication (Report) is available at: www.whitehouse.gov/sites/default/files/microsites/ostp/NSTC/report_of_the_interagency_working_group_on_language_and_communication_final.pdf.

    Background

    The NSTC is the principal means by which the Executive Branch coordinates science and technology policy across the Federal Government. A primary objective of the NSTC is establishing clear national goals for Federal science and technology investments. The IWGLC serves as part of the internal deliberative process of the NSTC. The IWGLC, which recently researched and authored the Report, includes representatives from the White House Office of Science and Technology Policy, National Science Foundation, Department of Health and Human Services, Department of Education, Department of Defense, Department of Agriculture, Department of Justice, Department of Energy, Department of Homeland Security, Department of State, Department of Commerce, National Endowment for the Humanities, National Aeronautics and Space Administration, and the Department of Transportation.

    Human interaction in society depends upon language and communication. Across the Federal Government, agencies support R & D activities focused on furthering the understanding of, and supporting better, language and communication. To date, however, there has been no systematic accounting or description of the range of language and communication R & D programs and activities being supported by the Federal Government. In the Report, the IWGLC took on the challenge of creating a taxonomy of language and communication R & D activities and summarizing current and recent Federal investment in this area.

    The taxonomy included in the Report identified four broad R & D topics in language and communication funded by the Federal Government, along with a number of subtopics under each broad topic. Please consult the taxonomy on pages 48-50 in the Report. The four broad topic headings include:

    1. Knowledge and Processes Underlying Language and Communication.

    2. Language and Communication Abilities and Skills.

    3. Using Language and Communication to Influence Behavior and Share Information.

    4. Language and Communication Technologies.

    The taxonomy also identified four types of R & D activities that could be supported within each topic area:

    1. Basic/foundational.

    2. Translational.

    3. Applied.

    4. Implementation.

    The Report provides programmatic recommendations for key areas for investment and collaboration in language and communication research to support a broad range of government functions such as environmental protection, education, national security, law enforcement, transportation, and public health.

    Questions

    The Institute is interested in gathering information that would be of help to the IWGLC in coordinating and making recommendations about the range of R & D programs and activities related to key topics of language and communication that are supported across the Federal agencies. Specifically, the Institute, on behalf of the IWGLC, requests information on the following:

    1. Whether the taxonomy included in the Report captures all types of federally funded R & D programs and activities on language and communication. If not, please indicate which types of R & D activities should be added to the taxonomy.

    2. Whether there are language and communication R & D programs and activities carried out in the non-Federal sector (e.g., commercial industry, nonprofit organizations, institutions of higher education) that do not fall into any of the taxonomy subtopics (see pgs. 48-50 of the Report). If so, please describe those activities.

    3. Whether there are activities related to language and communication R & D programs and activities that should be considered but are not included in the Report's list of recommended next steps for the Federal Government (see pgs. 33-36). If so, please indicate what activities should be added to the Report's recommendations.

    Written comments may be submitted through any of the methods discussed in the ADDRESSES section of this notice. This notice is for information purposes only. The Institute and the other member Federal agencies on the IWGLC will review and consider information provided in response to this notice as the IWGLC moves forward with its new charter to improve coordination and collaboration of research and development agendas related to language and communication across the Federal Government.

    Accessible Format: Individuals with disabilities can obtain this document in an accessible format (e.g., braille, large print, audiotape, or compact disc) on request to Dr. Rebecca McGill-Wilkinson at (202) 245-7613 or [email protected].

    Electronic Access to This Document: The official version of this document is the document published in the Federal Register. Free Internet access to the official edition of the Federal Register and the Code of Federal Regulations is available via the Federal Digital System at: www.thefederalregister.org/fdsys. At this site you can view this document, as well as all other documents of this Department published in the Federal Register, in text or Portable Document Format (PDF). To use PDF you must have Adobe Acrobat Reader, which is available free at the site.

    You may also access documents of the Department published in the Federal Register by using the article search feature at: www.federalregister.gov. Specifically, through the advanced search feature at this site, you can limit your search to documents published by the Department.

    Authority: Executive Order 12881 of November 23, 1993, as amended by Executive Order 13284 of January 23, 2003. 20 U.S.C. 3402(4).

    Dated: October 26, 2016. Ruth Neild, Deputy Director for Policy and Research, Delegated the Duties of the Director, Institute of Education Sciences.
    [FR Doc. 2016-26193 Filed 10-28-16; 8:45 am] BILLING CODE 4000-01-P
    DEPARTMENT OF ENERGY Environmental Management Site-Specific Advisory Board, Savannah River Site AGENCY:

    Department of Energy.

    ACTION:

    Notice of open meeting.

    SUMMARY:

    This notice announces a meeting of the Environmental Management Site-Specific Advisory Board (EM SSAB), Savannah River Site. The Federal Advisory Committee Act (Pub. L. 92-463, 86 Stat. 770) requires that public notice of this meeting be announced in the Federal Register.

    DATES:

    Monday, November 14, 2016, 1:00 p.m.-4:45 p.m.
    Tuesday, November 15, 2016, 8:30 a.m.-4:45 p.m.

    ADDRESSES:

    Applied Research Center, 301 Gateway Drive, Aiken, SC 29802.

    FOR FURTHER INFORMATION CONTACT:

    James Giusti, Office of External Affairs, Department of Energy, Savannah River Operations Office, P.O. Box A, Aiken, SC, 29802; Phone: (803) 952-7684.

    SUPPLEMENTARY INFORMATION:

    Purpose of the Board: The purpose of the Board is to make recommendations to DOE-EM and site management in the areas of environmental restoration, waste management, and related activities.

    Tentative Agenda

    Monday, November 14, 2016
    Opening and Agenda Review
    Combined Committees Session
    Order of committees:
    • Administrative & Outreach
    • Facilities Disposition & Site Remediation
    • Strategic & Legacy Management
    • Waste Management
    • Nuclear Materials
    Public Comments
    Adjourn

    Tuesday, November 15, 2016
    Opening, Chair Update, and Agenda Review
    Agency Updates
    Public Comments
    Recommendation Voting
    • Waste Management Committee Draft Recommendation
    • Nuclear Materials Committee Draft Recommendation
    • Strategic & Legacy Management Committee Draft Recommendation
    Break
    Administrative & Outreach Committee Update
    • Voting for Board Chair and Vice Chair
    Facilities Disposition & Site Remediation Committee Update
    Lunch Break
    Strategic & Legacy Management Committee Update
    Waste Management Committee Update
    Public Comments
    Break
    Nuclear Materials Committee Update
    Strategic Plan Update
    Public Comments
    Adjourn

    Public Participation: The EM SSAB, Savannah River Site, welcomes the attendance of the public at its advisory committee meetings and will make every effort to accommodate persons with physical disabilities or special needs. If you require special accommodations due to a disability, please contact James Giusti at least seven days in advance of the meeting at the phone number listed above. Written statements may be filed with the Board either before or after the meeting. Individuals who wish to make oral statements pertaining to agenda items should contact James Giusti's office at the address or telephone listed above. Requests must be received five days prior to the meeting and reasonable provision will be made to include the presentation in the agenda. The Deputy Designated Federal Officer is empowered to conduct the meeting in a fashion that will facilitate the orderly conduct of business. Individuals wishing to make public comments will be provided a maximum of five minutes to present their comments.

    Minutes: Minutes will be available by writing or calling James Giusti at the address or phone number listed above. Minutes will also be available at the following Web site: http://cab.srs.gov/srs-cab.html.

    Issued at Washington, DC, on October 25, 2016. LaTanya R. Butler, Deputy Committee Management Officer.
    [FR Doc. 2016-26206 Filed 10-28-16; 8:45 am] BILLING CODE 6450-01-P
    DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. EL17-11-000] Alabama Power Company v. Southwest Power Pool; Notice of Complaint

    Take notice that on October 24, 2016, pursuant to Rule 206 of the Federal Energy Regulatory Commission's (Commission) Rules of Practice and Procedure, 18 CFR 385.206, and sections 205, 206, 306, and 309 of the Federal Power Act (FPA),1 Alabama Power Company (Complainant) filed a formal complaint against the Southwest Power Pool (Respondent) alleging that Respondent levied unlawful charges upon the Complainant and that Respondent's rates for transmission service are unjust, unreasonable, unduly discriminatory and preferential, all in violation of the FPA, as more fully explained in the complaint.

    1 16 U.S.C. 824(d), 824(e), 825(e), and 825(h) (2015).

    The Complainant certifies that a copy of the complaint has been served on the Respondent.

    Any person desiring to intervene or to protest this filing must file in accordance with Rules 211 and 214 of the Commission's Rules of Practice and Procedure (18 CFR 385.211, 385.214). Protests will be considered by the Commission in determining the appropriate action to be taken, but will not serve to make protestants parties to the proceeding. Any person wishing to become a party must file a notice of intervention or motion to intervene, as appropriate. The Respondent's answer and all interventions or protests must be filed on or before the comment date. The Respondent's answer, motions to intervene, and protests must be served on the Complainant.

    The Commission encourages electronic submission of protests and interventions in lieu of paper using the “eFiling” link at http://www.ferc.gov. Persons unable to file electronically should submit an original and 5 copies of the protest or intervention to the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426.

    This filing is accessible on-line at http://www.ferc.gov, using the “eLibrary” link and is available for electronic review in the Commission's Public Reference Room in Washington, DC. There is an “eSubscription” link on the Web site that enables subscribers to receive email notification when a document is added to a subscribed docket(s). For assistance with any FERC Online service, please email [email protected], or call (866) 208-3676 (toll free). For TTY, call (202) 502-8659.

    Comment Date: 5:00 p.m. Eastern Time on November 14, 2016.

    Dated: October 25, 2016. Kimberly D. Bose, Secretary.
    [FR Doc. 2016-26187 Filed 10-28-16; 8:45 am] BILLING CODE 6717-01-P
    DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. ER17-157-000] Moapa Southern Paiute Solar, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for Blanket Section 204 Authorization

    This is a supplemental notice in the above-referenced proceeding of Moapa Southern Paiute Solar, LLC's application for market-based rate authority, with an accompanying rate tariff, noting that such application includes a request for blanket authorization, under 18 CFR part 34, of future issuances of securities and assumptions of liability.

    Any person desiring to intervene or to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in accordance with Rules 211 and 214 of the Commission's Rules of Practice and Procedure (18 CFR 385.211 and 385.214). Anyone filing a motion to intervene or protest must serve a copy of that document on the Applicant.

    Notice is hereby given that the deadline for filing protests with regard to the applicant's request for blanket authorization, under 18 CFR part 34, of future issuances of securities and assumptions of liability is November 14, 2016.

    The Commission encourages electronic submission of protests and interventions in lieu of paper, using the FERC Online links at http://www.ferc.gov. To facilitate electronic service, persons with Internet access who will eFile a document and/or be listed as a contact for an intervenor must create and validate an eRegistration account using the eRegistration link. Select the eFiling link to log on and submit the intervention or protests.

    Persons unable to file electronically should submit an original and 5 copies of the intervention or protest to the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426.

    The filings in the above-referenced proceeding are accessible in the Commission's eLibrary system by clicking on the appropriate link in the above list. They are also available for electronic review in the Commission's Public Reference Room in Washington, DC. There is an eSubscription link on the Web site that enables subscribers to receive email notification when a document is added to a subscribed docket(s). For assistance with any FERC Online service, please email [email protected] or call (866) 208-3676 (toll free). For TTY, call (202) 502-8659.

    Dated: October 25, 2016. Nathaniel J. Davis, Sr., Deputy Secretary.
    [FR Doc. 2016-26229 Filed 10-28-16; 8:45 am] BILLING CODE 6717-01-P
    DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. EL17-8-000] Indianapolis Power & Light Company v. Midcontinent Independent System Operator, Inc.; Notice of Complaint

    Take notice that on October 21, 2016, pursuant to section 206 of the Federal Power Act, 16 U.S.C. 824e, and Rule 206 of the Federal Energy Regulatory Commission's (Commission) Rules of Practice and Procedure, 18 CFR 385.206, Indianapolis Power & Light Company (IPL or Complainant) filed a formal complaint against Midcontinent Independent System Operator, Inc. (MISO or Respondent) alleging that the Respondent's Open Access Transmission, Energy and Operating Reserve Markets Tariff is unjust and unreasonable, unduly discriminatory and preferential because it does not provide a means for IPL's Advancion® Energy Storage Array, a.k.a. the Harding Street Station Battery Energy Storage System, to be compensated for services it provides to the MISO system, including Primary Frequency Response, as more fully explained in the complaint.

    Complainant certifies that copies of the complaint were served on the contacts for Respondent as listed on the Commission's list of Corporate Officials.

    Any person desiring to intervene or to protest this filing must file in accordance with Rules 211 and 214 of the Commission's Rules of Practice and Procedure (18 CFR 385.211 and 385.214). Protests will be considered by the Commission in determining the appropriate action to be taken, but will not serve to make protestants parties to the proceeding. Any person wishing to become a party must file a notice of intervention or motion to intervene, as appropriate. The Respondent's answer and all interventions or protests must be filed on or before the comment date. The Respondent's answer, motions to intervene, and protests must be served on the Complainant.

    The Commission encourages electronic submission of protests and interventions in lieu of paper using the “eFiling” link at http://www.ferc.gov. Persons unable to file electronically should submit an original and 5 copies of the protest or intervention to the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426.

    This filing is accessible on-line at http://www.ferc.gov, using the “eLibrary” link and is available for review in the Commission's Public Reference Room in Washington, DC. There is an “eSubscription” link on the Web site that enables subscribers to receive email notification when a document is added to a subscribed docket(s). For assistance with any FERC Online service, please email [email protected], or call (866) 208-3676 (toll free). For TTY, call (202) 502-8659.

    Comment Date: 5:00 p.m. Eastern Time on November 10, 2016.

    Dated: October 25, 2016. Kimberly D. Bose, Secretary.
    [FR Doc. 2016-26186 Filed 10-28-16; 8:45 am] BILLING CODE 6717-01-P
    DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Combined Notice of Filings #1

    Take notice that the Commission received the following electric rate filings:

    Docket Numbers: ER11-1844-003.

    Applicants: Midcontinent Independent System Operator, Inc.

    Description: Midcontinent Independent System Operator, Inc. submits tariff filing per 35: 2016-10-24_Compliance filing to address ITC PARs Order to be effective 1/1/2011.

    Filed Date: 10/24/16.

    Accession Number: 20161024-5046.

    Comments Due: 5 p.m. ET 11/14/16.

    Docket Numbers: ER16-833-003.

    Applicants: Midcontinent Independent System Operator, Inc.

    Description: Compliance filing: 2016-10-21_Default Technology Specific Avoidable Cost to be effective 9/1/2016.

    Filed Date: 10/21/16.

    Accession Number: 20161021-5153.

    Comments Due: 5 p.m. ET 11/14/16.

    Docket Numbers: ER16-1314-002.

    Applicants: Southwest Power Pool, Inc.

    Description: Compliance filing: 2198R20 and 2198R21 KPP NITSA NOA Compliance Filing to be effective 3/1/2016.

    Filed Date: 10/24/16.

    Accession Number: 20161024-5078.

    Comments Due: 5 p.m. ET 11/14/16.

    Docket Numbers: ER16-2402-001.

    Applicants: UGI Utilities Inc.

    Description: Compliance filing: Supplemental Revisions to Market Based Rate Tariff to be effective 10/10/2016.

    Filed Date: 10/24/16.

    Accession Number: 20161024-5119.

    Comments Due: 5 p.m. ET 11/14/16.

    Docket Numbers: ER16-2403-001.

    Applicants: UGI Development Company.

    Description: Compliance filing: Supplemental Revisions to Market Based Rate Tariff to be effective 10/10/2016.

    Filed Date: 10/24/16.

    Accession Number: 20161024-5126.

    Comments Due: 5 p.m. ET 11/14/16.

    Docket Numbers: ER17-158-000.

    Applicants: Western Farmers Electric Cooperative.

    Description: Petition for Tariff Waiver of Western Farmers Electric Cooperative.

    Filed Date: 10/21/16.

    Accession Number: 20161021-5173.

    Comments Due: 5 p.m. ET 11/14/16.

    Docket Numbers: ER17-159-000.

    Applicants: DTE Electric Company.

    Description: Notice of Cancellation of Tariff No. 1 of DTE Electric Company.

    Filed Date: 10/24/16.

    Accession Number: 20161024-5033.

    Comments Due: 5 p.m. ET 11/14/16.

    Docket Numbers: ER17-160-000.

    Applicants: Public Service Company of New Mexico.

    Description: Tariff Cancellation: Notice of Cancellation of Third Revised NITSA and Third Revised NOA to be effective 12/31/2016.

    Filed Date: 10/24/16.

    Accession Number: 20161024-5059.

    Comments Due: 5 p.m. ET 11/14/16.

    Docket Numbers: ER17-161-000.

    Applicants: Public Service Company of New Mexico.

    Description: Tariff Cancellation: Notice of Cancellation of Power Sale Agreement to be effective 12/31/2016.

    Filed Date: 10/24/16.

    Accession Number: 20161024-5057.

    Comments Due: 5 p.m. ET 11/14/16.

    Docket Numbers: ER17-162-000.

    Applicants: Public Service Company of New Mexico.

    Description: Compliance filing: Executed Service Agreement for Electric Service under PNM's Coordination Tariff to be effective 1/1/2017.

    Filed Date: 10/24/16.

    Accession Number: 20161024-5058.

    Comments Due: 5 p.m. ET 11/14/16.

    Docket Numbers: ER17-163-000.

    Applicants: PJM Interconnection, L.L.C.

    Description: § 205(d) Rate Filing: First Revised ISA No. 4331, Queue No. AA2-139 to be effective 9/22/2016.

    Filed Date: 10/24/16.

    Accession Number: 20161024-5113.

    Comments Due: 5 p.m. ET 11/14/16.

    Docket Numbers: ER17-164-000.

    Applicants: Wisconsin Public Service Corporation.

    Description: § 205(d) Rate Filing: WPS Corp and Daggett Agreement for Wholesale Distribution Service to be effective 1/1/2017.

    Filed Date: 10/24/16.

    Accession Number: 20161024-5127.

    Comments Due: 5 p.m. ET 11/14/16.

    Docket Numbers: ER17-165-000.

    Applicants: Wisconsin Public Service Corporation.

    Description: § 205(d) Rate Filing: WPS Corp and Stephenson Agreement for Wholesale Distribution Service to be effective 1/1/2017.

    Filed Date: 10/24/16.

    Accession Number: 20161024-5128.

    Comments Due: 5 p.m. ET 11/14/16.

    Docket Numbers: ER17-166-000.

    Applicants: Upper Michigan Energy Resources Corporation.

    Description: § 205(d) Rate Filing: UMERC to Daggett Rate Schedule No 6 to be effective 1/1/2017.

    Filed Date: 10/24/16.

    Accession Number: 20161024-5129.

    Comments Due: 5 p.m. ET 11/14/16.

    Docket Numbers: ER17-167-000.

    Applicants: Upper Michigan Energy Resources Corporation.

    Description: § 205(d) Rate Filing: UMERC to Stephenson Rate Schedule No 7 to be effective 1/1/2017.

    Filed Date: 10/24/16.

    Accession Number: 20161024-5135.

    Comments Due: 5 p.m. ET 11/14/16.

    Docket Numbers: ER17-168-000.

    Applicants: Applied Energy LLC.

    Description: Baseline eTariff Filing: Market-Based Rates Application to be effective 12/24/2016.

    Filed Date: 10/24/16.

    Accession Number: 20161024-5141.

    Comments Due: 5 p.m. ET 11/14/16.

    The filings are accessible in the Commission's eLibrary system by clicking on the links or querying the docket number.

    Any person desiring to intervene or protest in any of the above proceedings must file in accordance with Rules 211 and 214 of the Commission's Regulations (18 CFR 385.211 and 385.214) on or before 5:00 p.m. Eastern time on the specified comment date. Protests may be considered, but intervention is necessary to become a party to the proceeding.

    eFiling is encouraged. More detailed information relating to filing requirements, interventions, protests, service, and qualifying facilities filings can be found at: http://www.ferc.gov/docs-filing/efiling/filing-req.pdf. For other information, call (866) 208-3676 (toll free). For TTY, call (202) 502-8659.

    Dated: October 24, 2016. Nathaniel J. Davis, Sr., Deputy Secretary.
    [FR Doc. 2016-26225 Filed 10-28-16; 8:45 am] BILLING CODE 6717-01-P
    DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Combined Notice of Filings #1

    Take notice that the Commission received the following electric corporate filings:

    Docket Numbers: EC17-19-000.

    Applicants: Chisholm View Wind Project II, LLC.

    Description: Application for Authorization Under Section 203 of the Federal Power Act, Request for Expedited Consideration and Confidential Treatment for Chisholm View Wind Project II, LLC.

    Filed Date: 10/19/16.

    Accession Number: 20161019-5145.

    Comments Due: 5 p.m. ET 11/9/16.

    Docket Numbers: EC17-20-000.

    Applicants: Eurus Combine Hills I LLC, Crescent Ridge LLC.

    Description: Application for Authorization Under Section 203 of the Federal Power Act and Request for Waivers, Confidential Treatment, Expedited Action and Shortened Comment Period of Eurus Combine Hills I LLC, et al.

    Filed Date: 10/19/16.

    Accession Number: 20161019-5148.

    Comments Due: 5 p.m. ET 11/9/16.

    Docket Numbers: EC17-21-000.

    Applicants: Vantage Wind Energy LLC.

    Description: Application for Authorization Under Section 203 of the Federal Power Act and Request for Waivers and Expedited Action of Vantage Wind Energy LLC.

    Filed Date: 10/20/16.

    Accession Number: 20161020-5098.

    Comments Due: 5 p.m. ET 11/10/16.

    Take notice that the Commission received the following electric rate filings:

    Docket Numbers: ER15-2571-003.

    Applicants: GenOn Energy Management, LLC.

    Description: Report Filing: Refund Report—Informational Filing to be effective N/A.

    Filed Date: 10/20/16.

    Accession Number: 20161020-5086.

    Comments Due: 5 p.m. ET 11/10/16.

    Docket Numbers: ER15-2572-002.

    Applicants: GenOn Energy Management, LLC.

    Description: Report Filing: Refund Report—Informational Filing to be effective N/A.

    Filed Date: 10/20/16.

    Accession Number: 20161020-5087.

    Comments Due: 5 p.m. ET 11/10/16.

    Docket Numbers: ER15-2573-002.

    Applicants: GenOn Energy Management, LLC.

    Description: Report Filing: Refund Report—Informational Filing to be effective N/A.

    Filed Date: 10/20/16.

    Accession Number: 20161020-5088.

    Comments Due: 5 p.m. ET 11/10/16.

    Docket Numbers: ER16-2518-001.

    Applicants: PJM Interconnection, L.L.C.

    Description: Compliance filing: OATT Revisions re: Earlier Queue Submittal per 10/7/16 Order in ER16-2518-000 to be effective 10/31/2016.

    Filed Date: 10/20/16.

    Accession Number: 20161020-5066.

    Comments Due: 5 p.m. ET 11/10/16.

    Docket Numbers: ER17-142-000.

    Applicants: Broadview Energy JN, LLC.

    Description: § 205(d) Rate Filing: Certificate of Concurrence to be effective 12/1/2016.

    Filed Date: 10/19/16.

    Accession Number: 20161019-5136.

    Comments Due: 5 p.m. ET 11/9/16.

    Docket Numbers: ER17-143-000.

    Applicants: Southern California Edison Company.

    Description: § 205(d) Rate Filing: EDC Letter Agreement between SCE and RPU to be effective 12/21/2016.

    Filed Date: 10/20/16.

    Accession Number: 20161020-5001.

    Comments Due: 5 p.m. ET 11/10/16.

    Docket Numbers: ER17-144-000.

    Applicants: New York Independent System Operator, Inc.

    Description: Request of New York Independent System Operator, Inc. for Limited Tariff Waiver, et al.

    Filed Date: 10/19/16.

    Accession Number: 20161019-5150.

    Comments Due: 5 p.m. ET 10/26/16.

    Docket Numbers: ER17-145-000.

    Applicants: Southwest Power Pool, Inc.

    Description: § 205(d) Rate Filing: 3006R1 CP Bloom Wind, LLC Generator Interconnection Agr to be effective 9/26/2016.

    Filed Date: 10/20/16.

    Accession Number: 20161020-5037.

    Comments Due: 5 p.m. ET 11/10/16.

    Docket Numbers: ER17-146-000.

    Applicants: PJM Interconnection, L.L.C.

    Description: Tariff Cancellation: Notice of Cancellation of Service Agreement No. 3780, Queue No. W4-045 to be effective 7/26/2016.

    Filed Date: 10/20/16.

    Accession Number: 20161020-5047.

    Comments Due: 5 p.m. ET 11/10/16.

    Docket Numbers: ER17-147-000.

    Applicants: Midcontinent Independent System Operator, Inc., MidAmerican Energy Company.

    Description: § 205(d) Rate Filing: 2016-10-20_MidAmerican-ITC Midwest Louisa Facilities and Operating Agreements to be effective 10/21/2016.

    Filed Date: 10/20/16.

    Accession Number: 20161020-5051.

    Comments Due: 5 p.m. ET 11/10/16.

    Docket Numbers: ER17-148-000.

    Applicants: Citizens Sunrise Transmission LLC.

    Description: § 205(d) Rate Filing: Annual TRBAA Filing to be effective 1/1/2017.

    Filed Date: 10/20/16.

    Accession Number: 20161020-5067.

    Comments Due: 5 p.m. ET 11/10/16.

    Docket Numbers: ER17-149-000.

    Applicants: Grady Wind Energy Center, LLC.

    Description: Baseline eTariff Filing: Certificate of Concurrence to be effective 12/1/2016.

    Filed Date: 10/20/16.

    Accession Number: 20161020-5072.

    Comments Due: 5 p.m. ET 11/10/16.

    The filings are accessible in the Commission's eLibrary system by clicking on the links or querying the docket number.

    Any person desiring to intervene or protest in any of the above proceedings must file in accordance with Rules 211 and 214 of the Commission's Regulations (18 CFR 385.211 and 385.214) on or before 5:00 p.m. Eastern time on the specified comment date. Protests may be considered, but intervention is necessary to become a party to the proceeding.

    eFiling is encouraged. More detailed information relating to filing requirements, interventions, protests, service, and qualifying facilities filings can be found at: http://www.ferc.gov/docs-filing/efiling/filing-req.pdf. For other information, call (866) 208-3676 (toll free). For TTY, call (202) 502-8659.

    Dated: October 20, 2016. Nathaniel J. Davis, Sr., Deputy Secretary.
    [FR Doc. 2016-26224 Filed 10-28-16; 8:45 am] BILLING CODE 6717-01-P
    DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. RP17-52-000] Breitburn Operating LP v. Florida Gas Transmission Company, LLC; Notice of Complaint

    Take notice that on October 24, 2016, pursuant to Rule 206 of the Federal Energy Regulatory Commission's (Commission) Rules of Practice and Procedure, 18 CFR 385.206, and section 5 of the Natural Gas Act (NGA), 15 U.S.C. 717d (2009), Breitburn Operating LP (Complainant) filed a formal complaint against Florida Gas Transmission Company, LLC (Respondent) alleging that Respondent: (1) Unduly discriminated against Complainant by unilaterally requiring its natural gas supplier to pay both the Western Division and Market Area rates while similarly situated shippers paid only the Western Division rate, and (2) unlawfully charged and collected a rate under section 4 of the NGA without Commission authorization, all as more fully explained in the complaint.

    The Complainant certifies that a copy of the complaint has been served on the contacts for the Respondent.

    Any person desiring to intervene or to protest this filing must file in accordance with Rules 211 and 214 of the Commission's Rules of Practice and Procedure (18 CFR 385.211, 385.214). Protests will be considered by the Commission in determining the appropriate action to be taken, but will not serve to make protestants parties to the proceeding. Any person wishing to become a party must file a notice of intervention or motion to intervene, as appropriate. The Respondent's answer and all interventions or protests must be filed on or before the comment date. The Respondent's answer, motions to intervene, and protests must be served on the Complainant.

    The Commission encourages electronic submission of protests and interventions in lieu of paper using the “eFiling” link at http://www.ferc.gov. Persons unable to file electronically should submit an original and 5 copies of the protest or intervention to the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426.

    This filing is accessible on-line at http://www.ferc.gov, using the “eLibrary” link and is available for electronic review in the Commission's Public Reference Room in Washington, DC. There is an “eSubscription” link on the Web site that enables subscribers to receive email notification when a document is added to a subscribed docket(s). For assistance with any FERC Online service, please email [email protected], or call (866) 208-3676 (toll free). For TTY, call (202) 502-8659.

    Comment Date: 5:00 p.m. Eastern Time on November 14, 2016.

    Dated: October 25, 2016. Kimberly D. Bose, Secretary.
    [FR Doc. 2016-26190 Filed 10-28-16; 8:45 am] BILLING CODE 6717-01-P
    DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. CP17-3-000] Dominion Carolina Gas Transmission, LLC; Notice of Application

    Take notice that on October 13, 2016, Dominion Carolina Gas Transmission, LLC, 707 East Main Street, Richmond, VA 23219, filed an application pursuant to section 7(b) of the Natural Gas Act (NGA) requesting authorization to abandon approximately 60 miles of mainline transmission pipeline facilities in Chester, Kershaw, Lancaster, and York Counties, South Carolina that comprise the Line A Abandonment Project, all as more fully set forth in the application which is on file with the Commission and open to public inspection. The filing may also be viewed on the web at http://www.ferc.gov using the “eLibrary” link. Enter the docket number excluding the last three digits in the docket number field to access the document. For assistance, please contact FERC Online Support at [email protected] or toll free at (866) 208-3676, or TTY, contact (202) 502-8659.

    Any questions concerning this application may be directed to Richard D. Jessee, Gas Transmission Certificates Program Manager, Dominion Carolina Gas Transmission, LLC, 707 East Main Street, Richmond, VA 23219, telephone no. (866) 319-3382, facsimile no. (804) 771-4804 and email: [email protected].

    Pursuant to section 157.9 of the Commission's rules (18 CFR 157.9), within 90 days of this Notice, the Commission staff will either: Complete its environmental assessment (EA) and place it into the Commission's public record (eLibrary) for this proceeding or issue a Notice of Schedule for Environmental Review. If a Notice of Schedule for Environmental Review is issued, it will indicate, among other milestones, the anticipated date for the Commission staff's issuance of the final environmental impact statement (FEIS) or EA for this proposal. The filing of the EA in the Commission's public record for this proceeding or the issuance of a Notice of Schedule will serve to notify federal and state agencies of the timing for the completion of all necessary reviews, and the subsequent need to complete all federal authorizations within 90 days of the date of issuance of the Commission staff's FEIS or EA.

    There are two ways to become involved in the Commission's review of this project. First, any person wishing to obtain legal status by becoming a party to the proceedings for this project should, on or before the comment date stated below, file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, a motion to intervene in accordance with the requirements of the Commission's Rules of Practice and Procedure (18 CFR 385.214 or 385.211) and the Regulations under the NGA (18 CFR 157.10). A person obtaining party status will be placed on the service list maintained by the Secretary of the Commission and will receive copies of all documents filed by the applicant and by all other parties. A party must submit 5 copies of filings made with the Commission and must mail a copy to the applicant and to every other party in the proceeding. Only parties to the proceeding can ask for court review of Commission orders in the proceeding.

    However, a person does not have to intervene in order to have comments considered. The second way to participate is by filing with the Secretary of the Commission, as soon as possible, an original and two copies of comments in support of or in opposition to this project. The Commission will consider these comments in determining the appropriate action to be taken, but the filing of a comment alone will not serve to make the filer a party to the proceeding. The Commission's rules require that persons filing comments in opposition to the project provide copies of their protests only to the party or parties directly involved in the protest.

    Persons who wish to comment only on the environmental review of this project should submit an original and two copies of their comments to the Secretary of the Commission. Environmental commenters will be placed on the Commission's environmental mailing list, will receive copies of the environmental documents, and will be notified of meetings associated with the Commission's environmental review process. Environmental commenters will not be required to serve copies of filed documents on all other parties. However, the non-party commenters will not receive copies of all documents filed by other parties or issued by the Commission (except for the mailing of environmental documents issued by the Commission) and will not have the right to seek court review of the Commission's final order.

    The Commission strongly encourages electronic filings of comments, protests and interventions in lieu of paper using the “eFiling” link at http://www.ferc.gov. Persons unable to file electronically should submit an original and 5 copies of the protest or intervention to the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426.

    Comment Date: 5:00 p.m. Eastern Time on November 15, 2016.

    Dated: October 25, 2016. Kimberly Bose, Secretary.
    [FR Doc. 2016-26185 Filed 10-28-16; 8:45 am] BILLING CODE 6717-01-P
    DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Project No. 14790-000] City of Tuscaloosa, Alabama; Notice of Preliminary Permit Application Accepted for Filing and Soliciting Comments and Motions To Intervene

    On June 29, 2016, the City of Tuscaloosa, Alabama filed an application for a preliminary permit, pursuant to section 4(f) of the Federal Power Act (FPA), proposing to study the feasibility of the Lake Tuscaloosa Dam Hydroelectric Project (Lake Tuscaloosa Project or project) to be located on the North River, near the City of Tuscaloosa in Tuscaloosa County, Alabama. The sole purpose of a preliminary permit, if issued, is to grant the permit holder priority to file a license application during the permit term. A preliminary permit does not authorize the permit holder to perform any land-disturbing activities or otherwise enter upon lands or waters owned by others without the owners' express permission.

    The proposed project would consist of the following: (1) The City of Tuscaloosa's existing 1,280-foot-long, 36-foot-wide earth-filled embankment dam; (2) a reservoir with a surface area of 5,885 acres and a storage capacity of 122,755 acre-feet; (3) a 20-foot-long, 96-foot-wide intake channel; (4) a 200-foot-long, 66-inch-diameter penstock with a 66-inch-diameter butterfly valve at the junction of the existing outlet; (5) a powerhouse containing one generating unit with a total capacity of 3.0 megawatts; (6) a 540-foot-long, 20-foot-wide tailrace; and (7) a 3.9-mile-long, 15 kV transmission line. The proposed project would have an estimated average annual generation of 18,207 megawatt-hours.

    Applicant Contact: Mr. Scott B. Holmes, City of Tuscaloosa, 2201 University Blvd., Tuscaloosa, Alabama 35401; Phone (205) 248-5140; Email: [email protected]

    FERC Contact: Christiane Casey, [email protected], (202) 502-8577.

    Competing Application: This application competes with Project No. 14750-000 filed December 22, 2015. Competing applications had to be filed on or before June 30, 2016.

    Deadline for filing comments and motions to intervene: 60 days from the issuance of this notice. Comments and motions to intervene may be filed electronically via the Internet. See 18 CFR 385.2001(a)(1)(iii) and the instructions on the Commission's Web site http://www.ferc.gov/docs-filing/efiling.asp. Commenters can submit brief comments up to 6,000 characters, without prior registration, using the eComment system at http://www.ferc.gov/docs-filing/ecomment.asp. You must include your name and contact information at the end of your comments. For assistance, please contact FERC Online Support at [email protected] or toll free at 1-866-208-3676, or for TTY, (202) 502-8659. Although the Commission strongly encourages electronic filing, documents may also be paper-filed. To paper-file, mail an original and five copies to: Kimberly D. Bose, Secretary, Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426.

    More information about this project, including a copy of the application, can be viewed or printed on the “eLibrary” link of the Commission's Web site at http://www.ferc.gov/docs-filing/elibrary.asp. Enter the docket number (P-14790) in the docket number field to access the document. For assistance, contact FERC Online Support.

    Dated: October 25, 2016. Kimberly D. Bose, Secretary.
    [FR Doc. 2016-26189 Filed 10-28-16; 8:45 am] BILLING CODE 6717-01-P
    DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. ER17-168-000] Applied Energy LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for Blanket Section 204 Authorization

    This is a supplemental notice in the above-referenced proceeding of Applied Energy LLC's application for market-based rate authority, with an accompanying rate tariff, noting that such application includes a request for blanket authorization, under 18 CFR part 34, of future issuances of securities and assumptions of liability.

    Any person desiring to intervene or to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in accordance with Rules 211 and 214 of the Commission's Rules of Practice and Procedure (18 CFR 385.211 and 385.214). Anyone filing a motion to intervene or protest must serve a copy of that document on the Applicant.

    Notice is hereby given that the deadline for filing protests with regard to the applicant's request for blanket authorization, under 18 CFR part 34, of future issuances of securities and assumptions of liability is November 14, 2016.

    The Commission encourages electronic submission of protests and interventions in lieu of paper, using the FERC Online links at http://www.ferc.gov. To facilitate electronic service, persons with Internet access who will eFile a document and/or be listed as a contact for an intervenor must create and validate an eRegistration account using the eRegistration link. Select the eFiling link to log on and submit the intervention or protests.

    Persons unable to file electronically should submit an original and 5 copies of the intervention or protest to the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426.

    The filings in the above-referenced proceeding are accessible in the Commission's eLibrary system by clicking on the appropriate link in the above list. They are also available for electronic review in the Commission's Public Reference Room in Washington, DC. There is an eSubscription link on the Web site that enables subscribers to receive email notification when a document is added to a subscribed docket(s). For assistance with any FERC Online service, please email [email protected] or call (866) 208-3676 (toll free). For TTY, call (202) 502-8659.

    Dated: October 25, 2016. Nathaniel J. Davis, Sr., Deputy Secretary.
    [FR Doc. 2016-26230 Filed 10-28-16; 8:45 am] BILLING CODE 6717-01-P
    DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Combined Notice of Filings

    Take notice that the Commission has received the following Natural Gas Pipeline Rate and Refund Report filings:

    Filings Instituting Proceedings

    Docket Numbers: RP17-49-000.

    Applicants: Destin Pipeline Company, L.L.C.

    Description: § 4(d) Rate Filing: Tariff Changes in Response to Audit Order FA15-001-000 to be effective 12/1/2016.

    Filed Date: 10/24/16.

    Accession Number: 20161024-5024.

    Comments Due: 5 p.m. ET 11/7/16.

    Docket Numbers: RP17-50-000.

    Applicants: Columbia Gulf Transmission, LLC.

    Description: Compliance filing Columbia Gulf Section 5 Settlement Implementation to be effective 7/1/2016.

    Filed Date: 10/24/16.

    Accession Number: 20161024-5025.

    Comments Due: 5 p.m. ET 11/7/16.

    Docket Numbers: RP17-51-000.

    Applicants: Freeport-McMoRan Exploration & Production, Anadarko US Offshore LLC.

    Description: Joint Petition for Temporary Waivers of Commission Policies, et al. of Freeport-McMoRan Exploration & Production LLC, et al. under RP17-51.

    Filed Date: 10/24/16.

    Accession Number: 20161024-5148.

    Comments Due: 5 p.m. ET 11/1/16.

    Docket Numbers: RP17-52-000.

    Applicants: Breitburn Operating LP v. Florida Gas Tr.

    Description: Formal Complaint of Breitburn Operating LP under RP17-52.

    Filed Date: 10/24/16.

    Accession Number: 20161024-5178.

    Comments Due: 5 p.m. ET 11/7/16.

    The filings are accessible in the Commission's eLibrary system by clicking on the links or querying the docket number.

    Any person desiring to intervene or protest in any of the above proceedings must file in accordance with Rules 211 and 214 of the Commission's Regulations (18 CFR 385.211 and 385.214) on or before 5:00 p.m. Eastern time on the specified comment date. Protests may be considered, but intervention is necessary to become a party to the proceeding.

    eFiling is encouraged. More detailed information relating to filing requirements, interventions, protests, service, and qualifying facilities filings can be found at: http://www.ferc.gov/docs-filing/efiling/filing-req.pdf. For other information, call (866) 208-3676 (toll free). For TTY, call (202) 502-8659.

    Dated: October 25, 2016. Nathaniel J. Davis, Sr., Deputy Secretary.
    [FR Doc. 2016-26231 Filed 10-28-16; 8:45 am] BILLING CODE 6717-01-P
    DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Combined Notice of Filings #1

    Take notice that the Commission received the following electric corporate filings:

    Docket Numbers: EC17-22-000.

    Applicants: Bluestem Wind Energy, LLC.

    Description: Application for Authorization under Section 203 of the Federal Power Act of Bluestem Wind Energy, LLC.

    Filed Date: 10/24/16.

    Accession Number: 20161024-5187.

    Comments Due: 5 p.m. ET 11/14/16.

    Take notice that the Commission received the following electric rate filings:

    Docket Numbers: ER10-2249-005.

    Applicants: Portland General Electric Company.

    Description: Second Supplement to June 30, 2016 Triennial Market Power Analysis in the Northwest Region for Portland General Electric Company.

    Filed Date: 10/24/16.

    Accession Number: 20161024-5065.

    Comments Due: 5 p.m. ET 11/14/16.

    Docket Numbers: ER16-2126-001.

    Applicants: ISO New England Inc., New England Power Pool Participants Committee.

    Description: Compliance filing: Compliance Filing Re: Automatically Matching Capacity & Multi-Year Lock In to be effective 12/27/2016.

    Filed Date: 10/25/16.

    Accession Number: 20161025-5022.

    Comments Due: 5 p.m. ET 11/15/16.

    Docket Numbers: ER17-169-000.

    Applicants: ALLETE, Inc.

    Description: § 205(d) Rate Filing: ALLETE Maintenance Services Agreement Filing to be effective 12/23/2016.

    Filed Date: 10/24/16.

    Accession Number: 20161024-5143.

    Comments Due: 5 p.m. ET 11/14/16.

    Docket Numbers: ER17-170-000.

    Applicants: California Independent System Operator Corporation.

    Description: § 205(d) Rate Filing: 2016-10-24—Planning Coordinator Agreement with MWD and CEII Request to be effective 12/26/2016.

    Filed Date: 10/24/16.

    Accession Number: 20161024-5144.

    Comments Due: 5 p.m. ET 11/14/16.

    Docket Numbers: ER17-171-000.

    Applicants: Pacific Gas and Electric Company.

    Description: Tariff Cancellation: Notice of Termination of Lathrop IA and WDT Service Agreement (SA 23) to be effective 8/31/2016.

    Filed Date: 10/24/16.

    Accession Number: 20161024-5146.

    Comments Due: 5 p.m. ET 11/14/16.

    Docket Numbers: ER17-172-000.

    Applicants: Lockhart Power Company.

    Description: § 205(d) Rate Filing: Request for Revision to FERC Electric Tariff, Original Volume No. 1 to be effective 12/24/2016.

    Filed Date: 10/24/16.

    Accession Number: 20161024-5147.

    Comments Due: 5 p.m. ET 11/14/16.

    Docket Numbers: ER17-173-000.

    Applicants: Wabash Valley Power Association, Inc.

    Description: Request for Revised and Additional Depreciation Rates of Wabash Valley Power Association, Inc.

    Filed Date: 10/24/16.

    Accession Number: 20161024-5170.

    Comments Due: 5 p.m. ET 11/14/16.

    Docket Numbers: ER17-174-000.

    Applicants: Southwest Power Pool, Inc.

    Description: § 205(d) Rate Filing: 2236R8 Golden Spread Electric Cooperative, Inc. NITSA NOA to be effective 10/1/2016.

    Filed Date: 10/25/16.

    Accession Number: 20161025-5021.

    Comments Due: 5 p.m. ET 11/15/16.

    Docket Numbers: ER17-175-000.

    Applicants: Southwest Power Pool, Inc.

    Description: § 205(d) Rate Filing: 1276R12 KCPL NITSA NOA to be effective 10/1/2016.

    Filed Date: 10/25/16.

    Accession Number: 20161025-5033.

    Comments Due: 5 p.m. ET 11/15/16.

    Docket Numbers: ER17-176-000.

    Applicants: Southern California Edison Company.

    Description: Tariff Cancellation: Notices of Cancellation of SGIA and Service Agreement for Lucerne Valley Solar to be effective 7/15/2016.

    Filed Date: 10/25/16.

    Accession Number: 20161025-5034.

    Comments Due: 5 p.m. ET 11/15/16.

    Docket Numbers: ER17-177-000.

    Applicants: UGI Energy Services, LLC.

    Description: Compliance filing: New Baseline Market Based Rate Filing to be effective 10/1/2013.

    Filed Date: 10/25/16.

    Accession Number: 20161025-5043.

    Comments Due: 5 p.m. ET 11/15/16.

    Docket Numbers: ER17-178-000.

    Applicants: UGI Energy Services, LLC.

    Description: § 205(d) Rate Filing: Supplemental Revisions to Market Based Rate Tariff to be effective 10/10/2016.

    Filed Date: 10/25/16.

    Accession Number: 20161025-5048.

    Comments Due: 5 p.m. ET 11/15/16.

    Docket Numbers: ER17-179-000.

    Applicants: PPL Electric Utilities Corporation, American Transmission Systems, Incorporated, PJM Interconnection, L.L.C.

    Description: § 205(d) Rate Filing: PJM and PJM Transmission Owners Submit Tariff Revisions re Supplemental Projects to be effective 12/31/9998.

    Filed Date: 10/25/16.

    Accession Number: 20161025-5063.

    Comments Due: 5 p.m. ET 11/15/16.

    Docket Numbers: ER17-180-000.

    Applicants: San Diego Gas & Electric Company.

    Description: § 205(d) Rate Filing: SDGE Resubmittal of Standard LGIA—Clone to be effective 10/22/2011.

    Filed Date: 10/25/16.

    Accession Number: 20161025-5085.

    Comments Due: 5 p.m. ET 11/15/16.

    The filings are accessible in the Commission's eLibrary system by clicking on the links or querying the docket number.

    Any person desiring to intervene or protest in any of the above proceedings must file in accordance with Rules 211 and 214 of the Commission's Regulations (18 CFR 385.211 and 385.214) on or before 5:00 p.m. Eastern time on the specified comment date. Protests may be considered, but intervention is necessary to become a party to the proceeding.

    eFiling is encouraged. More detailed information relating to filing requirements, interventions, protests, service, and qualifying facilities filings can be found at: http://www.ferc.gov/docs-filing/efiling/filing-req.pdf. For other information, call (866) 208-3676 (toll free). For TTY, call (202) 502-8659.

    Dated: October 25, 2016. Nathaniel J. Davis, Sr., Deputy Secretary.
    [FR Doc. 2016-26228 Filed 10-28-16; 8:45 am] BILLING CODE 6717-01-P
    DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Project No. 2457-041-NH] Public Service Company of New Hampshire; Notice of Availability of Environmental Assessment

    In accordance with the National Environmental Policy Act of 1969 and the Federal Energy Regulatory Commission's (Commission) regulations, 18 CFR part 380 (Order No. 486, 52 FR 47897), the Office of Energy Projects has reviewed the application for a new license for the Eastman Falls Hydroelectric Project, located on the Pemigewasset River in the town of Franklin, in Merrimack and Belknap Counties, New Hampshire, and has prepared an Environmental Assessment (EA).

    The EA contains the staff's analysis of the potential environmental impacts of the project and concludes that licensing the project, with appropriate environmental protective measures, would not constitute a major federal action that would significantly affect the quality of the human environment.

    A copy of the EA is available for review at the Commission in the Public Reference Room or may be viewed on the Commission's Web site at http://www.ferc.gov using the “eLibrary” link. Enter the docket number excluding the last three digits in the docket number field to access documents. For assistance, contact FERC Online Support at [email protected], (866) 208-3676 (toll free), or (202) 502-8659 (TTY).

    You may also register online at http://www.ferc.gov/docs-filing/esubscription.asp to be notified via email of new filings and issuances related to this or other pending projects. For assistance, contact FERC Online Support.

    Any comments should be filed within 30 days from the date of this notice. The Commission strongly encourages electronic filing. Please file comments using the Commission's eFiling system at http://www.ferc.gov/docs-filing/efiling.asp. Commenters can submit brief comments up to 6,000 characters, without prior registration, using the eComment system at http://www.ferc.gov/docs-filing/ecomment.asp. You must include your name and contact information at the end of your comments. For assistance, please contact FERC Online Support. In lieu of electronic filing, please send a paper copy to: Secretary, Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426. The first page of any filing should include docket number P-2457-041.

    For further information, contact Steve Kartalia at (202) 502-6131 or [email protected].

    Dated: October 24, 2016. Kimberly D. Bose, Secretary.
    [FR Doc. 2016-26188 Filed 10-28-16; 8:45 am] BILLING CODE 6717-01-P
    DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Combined Notice of Filings

    Take notice that the Commission has received the following Natural Gas Pipeline Rate and Refund Report filings:

    Filings Instituting Proceedings

    Docket Numbers: RP17-42-000.

    Applicants: Natural Gas Pipeline Company of America.

    Description: § 4(d) Rate Filing: Wells Fargo Negotiated Rate to be effective 11/1/2016.

    Filed Date: 10/20/16.

    Accession Number: 20161020-5105.

    Comments Due: 5 p.m. ET 11/1/16.

    Docket Numbers: RP17-43-000.

    Applicants: Iroquois Gas Transmission System, L.P.

    Description: § 4(d) Rate Filing: 10/20/16 Negotiated Rates—Trafigura Trading LLC (RTS) 7445-10 to be effective 11/1/2016.

    Filed Date: 10/20/16.

    Accession Number: 20161020-5114.

    Comments Due: 5 p.m. ET 11/1/16.

    Docket Numbers: RP17-44-000.

    Applicants: Midcontinent Express Pipeline LLC.

    Description: § 4(d) Rate Filing: Fuel Tracker Filing 10/21/16 to be effective 12/1/2016.

    Filed Date: 10/21/16.

    Accession Number: 20161021-5040.

    Comments Due: 5 p.m. ET 11/2/16.

    Docket Numbers: RP17-45-000.

    Applicants: Iroquois Gas Transmission System, L.P.

    Description: § 4(d) Rate Filing: 10/21/16 Negotiated Rates—Macquarie Energy LLC (RTS) 4090-13 to be effective 11/1/2016.

    Filed Date: 10/21/16.

    Accession Number: 20161021-5062.

    Comments Due: 5 p.m. ET 11/2/16.

    Docket Numbers: RP17-46-000.

    Applicants: Iroquois Gas Transmission System, L.P.

    Description: § 4(d) Rate Filing: 10/21/16 Negotiated Rates—Macquarie Energy LLC (RTS) 4090-14 to be effective 11/1/2016.

    Filed Date: 10/21/16.

    Accession Number: 20161021-5063.

    Comments Due: 5 p.m. ET 11/2/16.

    Docket Numbers: RP17-47-000.

    Applicants: Algonquin Gas Transmission, LLC.

    Description: § 4(d) Rate Filing: Negotiated Rates—Cargill contract 510950 to be effective 11/1/2016.

    Filed Date: 10/21/16.

    Accession Number: 20161021-5065.

    Comments Due: 5 p.m. ET 11/2/16.

    Docket Numbers: RP17-48-000.

    Applicants: Iroquois Gas Transmission System, L.P.

    Description: § 4(d) Rate Filing: 10/21/16 Negotiated Rates—Twin Eagle Resource Management, LLC (RTS) 7300-01 to be effective 11/1/2016.

    Filed Date: 10/21/16.

    Accession Number: 20161021-5066.

    Comments Due: 5 p.m. ET 11/2/16.

    Any person desiring to intervene or protest in any of the above proceedings must file in accordance with Rules 211 and 214 of the Commission's Regulations (18 CFR 385.211 and 385.214) on or before 5:00 p.m. Eastern time on the specified comment date. Protests may be considered, but intervention is necessary to become a party to the proceeding.

    Filings in Existing Proceedings

    Docket Numbers: RP16-1301-001.

    Applicants: Rockies Express Pipeline LLC.

    Description: Tariff Amendment: Errata to Interim Fuel Filing RP16-1301 to be effective 11/1/2016.

    Filed Date: 10/20/16.

    Accession Number: 20161020-5116.

    Comments Due: 5 p.m. ET 11/1/16.

    Any person desiring to protest in any of the above proceedings must file in accordance with Rule 211 of the Commission's Regulations (18 CFR 385.211) on or before 5:00 p.m. Eastern time on the specified comment date.

    The filings are accessible in the Commission's eLibrary system by clicking on the links or querying the docket number.

    eFiling is encouraged. More detailed information relating to filing requirements, interventions, protests, service, and qualifying facilities filings can be found at: http://www.ferc.gov/docs-filing/efiling/filing-req.pdf. For other information, call (866) 208-3676 (toll free). For TTY, call (202) 502-8659.

    Dated: October 24, 2016. Nathaniel J. Davis, Sr., Deputy Secretary.
    [FR Doc. 2016-26226 Filed 10-28-16; 8:45 am] BILLING CODE 6717-01-P
    ENVIRONMENTAL PROTECTION AGENCY [FRL-9954-64-OA] Children's Health Protection Advisory Committee AGENCY:

    Environmental Protection Agency (EPA).

    ACTION:

    Notice of meeting of the Children's Health Protection Advisory Committee.

    SUMMARY:

    Pursuant to the provisions of the Federal Advisory Committee Act, Public Law 92-463, notice is hereby given that the next meeting of the Children's Health Protection Advisory Committee (CHPAC) will be held November 15 and 16, 2016 at the George Washington University Milken Institute School of Public Health, located at 950 New Hampshire Avenue NW., Washington, DC 20037.

    DATES:

    November 15 and 16, 2016.

    ADDRESSES:

    950 New Hampshire Avenue NW., Washington, DC 20037.

    SUPPLEMENTARY INFORMATION:

    The meetings of the CHPAC are open to the public. The CHPAC will meet on Tuesday, November 15 from 1:00 p.m. to 5:30 p.m. and Wednesday, November 16 from 9:00 a.m. to 4:00 p.m. An agenda will be posted to www.epa.gov/children.

    Access and Accommodations: For information on access or services for individuals with disabilities, please contact Martha Berger at 202-564-2191 or [email protected], preferably at least 10 days prior to the meeting.

    FOR FURTHER INFORMATION CONTACT:

    Martha Berger, Designated Federal Officer, U.S. EPA; telephone (202) 564-2191 or [email protected].

    Dated: November 24, 2016. Martha Berger, Designated Federal Officer.
    [FR Doc. 2016-26217 Filed 10-28-16; 8:45 am] BILLING CODE 6560-50-P
    FEDERAL COMMUNICATIONS COMMISSION Schedule Change Open Commission Meeting, Thursday, October 27, 2016 October 25, 2016.

    Please note that the time for the Federal Communications Commission Open Meeting is rescheduled from 10:30 a.m. to 9:30 a.m.

    The Federal Communications Commission will consider the Agenda items listed on the Commission's Notice of October 20 at the Open Meeting on Thursday, October 27, 2016, scheduled to commence at 9:30 a.m. in room TW-C305, at 445 12th Street SW., Washington, DC. The order of the agenda items is changed as follows:

    Item No. 1. Bureau: Enforcement. Title: Locus Telecommunications, Inc. Summary: The Commission will consider a Memorandum Opinion and Order that dismisses and denies a Petition for Reconsideration of a Forfeiture Order issued by the Commission for the deceptive marketing of prepaid calling cards.

    Item No. 2. Bureau: Enforcement. Title: Lyca Tel, LLC. Summary: The Commission will consider a Memorandum Opinion and Order that dismisses and denies a Petition for Reconsideration of a Forfeiture Order issued by the Commission for the deceptive marketing of prepaid calling cards.

    Item No. 3. Bureau: Enforcement. Title: Touch-Tel USA, LLC. Summary: The Commission will consider a Memorandum Opinion and Order that dismisses and denies a Petition for Reconsideration of a Forfeiture Order issued by the Commission for the deceptive marketing of prepaid calling cards.

    Item No. 4. Bureau: Enforcement. Title: NobelTel, LLC. Summary: The Commission will consider a Memorandum Opinion and Order that dismisses and denies a Petition for Reconsideration of a Forfeiture Order issued by the Commission for the deceptive marketing of prepaid calling cards.

    Item No. 5. Bureau: Wireline Competition. Title: Protecting the Privacy of Customers of Broadband and Other Telecommunications Services Alerts (WC Docket No. 16-106). Summary: The Commission will consider a Report and Order that applies the privacy requirements of the Communications Act to broadband Internet access service providers and other telecommunications services to provide broadband customers with the tools they need to make informed decisions about the use and sharing of their information by their broadband providers.

    Federal Communications Commission. Marlene H. Dortch, Secretary.
    [FR Doc. 2016-26197 Filed 10-28-16; 8:45 am] BILLING CODE 6712-01-P
    FEDERAL RESERVE SYSTEM Change in Bank Control Notices; Acquisitions of Shares of a Bank or Bank Holding Company

    The notificants listed below have applied under the Change in Bank Control Act (12 U.S.C. 1817(j)) and § 225.41 of the Board's Regulation Y (12 CFR 225.41) to acquire shares of a bank or bank holding company. The factors that are considered in acting on the notices are set forth in paragraph 7 of the Act (12 U.S.C. 1817(j)(7)).

    The notices are available for immediate inspection at the Federal Reserve Bank indicated. The notices also will be available for inspection at the offices of the Board of Governors. Interested persons may express their views in writing to the Reserve Bank indicated for that notice or to the offices of the Board of Governors. Comments must be received not later than November 15, 2016.

    A. Federal Reserve Bank of Chicago (Colette A. Fried, Assistant Vice President) 230 South LaSalle Street, Chicago, Illinois 60690-1414:

    1. Blair M. Bowman, Brighton, Michigan, and Peter D. Scodeller, Beverly Hills, Michigan, together as a group acting in concert; to acquire additional voting shares of Huron Valley Bancorp, Inc. and thereby indirectly acquire Huron Valley State Bank, both of Milford, Michigan.

    B. Federal Reserve Bank of Kansas City (Dennis Denney, Assistant Vice President) 1 Memorial Drive, Kansas City, Missouri 64198-0001:

    1. The Judy Svajgr Trust dated June 24, 1983, Cozad, Nebraska, the Judy Svajgr Trust dated March 20, 1997, Cozad, Nebraska, and Kirk Randal Riley, Cozad, Nebraska, individually and as voting representative of the foregoing trusts; to acquire voting shares of Midwest Banco Corporation, and thereby indirectly acquire voting shares of First Bank and Trust Company, both of Cozad, Nebraska. In addition, the Rebecca Akers Irrevocable Trust, Cozad, Nebraska, the Kevin Olson Irrevocable Trust, Cozad, Nebraska, the Keith Olson 2016 Irrevocable Family Trust, Colorado Springs, Colorado, along with Rebecca Anne Akers, Monument, Colorado, Kevin Edward Olson, Colorado Springs, Colorado, and Steven K. Mulliken, Colorado Springs, Colorado, request approval as members of the Olson/Svajgr group acting in concert to control Midwest Banco Corporation, and thereby own shares of First Bank and Trust Company, Cozad, Nebraska.

    Board of Governors of the Federal Reserve System, October 26, 2016. Yao-Chin Chao, Assistant Secretary of the Board.
    [FR Doc. 2016-26223 Filed 10-28-16; 8:45 am] BILLING CODE 6210-01-P
    DEPARTMENT OF HEALTH AND HUMAN SERVICES Agency for Healthcare Research and Quality Patient Safety Organizations: Voluntary Relinquishment From the Patient Safety Leadership Council PSO AGENCY:

    Agency for Healthcare Research and Quality (AHRQ), Department of Health and Human Services (HHS).

    ACTION:

    Notice of delisting.

    SUMMARY:

    The Patient Safety and Quality Improvement Act of 2005, 42 U.S.C. 299b-21 to b-26, (Patient Safety Act) and the related Patient Safety and Quality Improvement Final Rule, 42 CFR part 3 (Patient Safety Rule), published in the Federal Register on November 21, 2008, 73 FR 70732-70814, establish a framework by which hospitals, doctors, and other health care providers may voluntarily report information to Patient Safety Organizations (PSOs), on a privileged and confidential basis, for the aggregation and analysis of patient safety events. The Patient Safety Rule authorizes AHRQ, on behalf of the Secretary of HHS, to list as a PSO an entity that attests that it meets the statutory and regulatory requirements for listing. A PSO can be “delisted” by the Secretary if it is found to no longer meet the requirements of the Patient Safety Act and Patient Safety Rule, when a PSO chooses to voluntarily relinquish its status as a PSO for any reason, or when a PSO's listing expires. AHRQ has accepted a notification of voluntary relinquishment from the Patient Safety Leadership Council PSO of its status as a PSO, and has delisted the PSO accordingly. The Patient Safety Leadership Council PSO submitted this request for voluntary relinquishment after receiving a Notice of Preliminary Finding of Deficiency.

    DATES:

    The directories for both listed and delisted PSOs are ongoing and reviewed weekly by AHRQ. The delisting was effective at 12:00 Midnight ET (2400) on September 30, 2016.

    ADDRESSES:

    Both directories can be accessed electronically at the following HHS Web site: http://www.pso.ahrq.gov/listed.

    FOR FURTHER INFORMATION CONTACT:

    Eileen Hogan, Center for Quality Improvement and Patient Safety, AHRQ, 5600 Fishers Lane, Room 06N94B, Rockville, MD 20857; Telephone (toll free): (866) 403-3697; Telephone (local): (301) 427-1111; TTY (toll free): (866) 438-7231; TTY (local): (301) 427-1130; Email: [email protected].

    SUPPLEMENTARY INFORMATION:

    Background

    The Patient Safety Act authorizes the listing of PSOs, which are entities or component organizations whose mission and primary activity are to conduct activities to improve patient safety and the quality of health care delivery.

    HHS issued the Patient Safety Rule to implement the Patient Safety Act. AHRQ administers the provisions of the Patient Safety Act and Patient Safety Rule relating to the listing and operation of PSOs. The Patient Safety Rule authorizes AHRQ to list as a PSO an entity that attests that it meets the statutory and regulatory requirements for listing. A PSO can be “delisted” if it is found to no longer meet the requirements of the Patient Safety Act and Patient Safety Rule, when a PSO chooses to voluntarily relinquish its status as a PSO for any reason, or when a PSO's listing expires. Section 3.108(d) of the Patient Safety Rule requires AHRQ to provide public notice when it removes an organization from the list of federally approved PSOs.

    AHRQ has accepted a notification from the Patient Safety Leadership Council PSO, PSO number P0164, to voluntarily relinquish its status as a PSO. Accordingly, the Patient Safety Leadership Council PSO was delisted effective at 12:00 Midnight ET (2400) on September 30, 2016. AHRQ notes that the Patient Safety Leadership Council PSO submitted this request for voluntary relinquishment following receipt of the Notice of Preliminary Finding of Deficiency sent on September 1, 2016.

    More information on PSOs can be obtained through AHRQ's PSO Web site at http://www.pso.ahrq.gov.

    Sharon B. Arnold, Deputy Director.
    [FR Doc. 2016-26144 Filed 10-28-16; 8:45 am] BILLING CODE 4160-90-P
    DEPARTMENT OF HEALTH AND HUMAN SERVICES Agency for Healthcare Research and Quality Agency Information Collection Activities: Proposed Collection; Comment Request AGENCY:

    Agency for Healthcare Research and Quality, HHS.

    ACTION:

    Notice.

    SUMMARY:

    This notice announces the intention of the Agency for Healthcare Research and Quality (AHRQ) to request that the Office of Management and Budget (OMB) approve the proposed information collection project: “Agency for Healthcare Research and Quality's (AHRQ) Guide to Improving Patient Safety in Primary Care Settings by Engaging Patients and Families—Evaluation.” In accordance with the Paperwork Reduction Act, 44 U.S.C. 3501-3521, AHRQ invites the public to comment on this proposed information collection.

    This proposed information collection was previously published in the Federal Register on August 11, 2016, and allowed 60 days for public comment. AHRQ did not receive any substantive comments. The purpose of this notice is to allow an additional 30 days for public comment.

    DATES:

    Comments on this notice must be received by November 30, 2016.

    ADDRESSES:

    Written comments should be submitted to: AHRQ's OMB Desk Officer by fax at (202) 395-6974 (attention: AHRQ's desk officer) or by email at [email protected] (attention: AHRQ's desk officer).

    FOR FURTHER INFORMATION CONTACT:

    Doris Lefkowitz, AHRQ Reports Clearance Officer, (301) 427-1477, or by email at [email protected].

    SUPPLEMENTARY INFORMATION:

    Proposed Project Agency for Healthcare Research and Quality's Guide To Improving Patient Safety in Primary Care Settings by Engaging Patients and Families—Evaluation

    There is a substantial evidence base showing that engaging patients and families in their care can lead to improvements in patient safety. Since the 1999 release of To Err is Human, there has been an undeniable focus on improving patient safety and eliminating patient harm within acute care. What is not as well documented is how to achieve these improvements in primary care settings.

    Patient and Family Engagement (PFE) strategies for acute care settings include: patient and family advisory committees; membership on patient safety oversight bodies at both the operations and governance levels; consultation in the development of patient information material; engaging patients in process improvement or redesign projects; rounding with patients and families; patient and family participation in clinical education programs; and welcoming patients and families to work alongside providers and health systems employees on transparency, culture change, and high reliability organization initiatives.

    Although the field of PFE in patient safety for hospitals and health systems is maturing, leveraging PFE to improve patient safety in non-acute settings is in its infancy. Building sustainable processes and practice-based infrastructure are crucial to improving patient safety through patient and family engagement in primary care.

    In response to the limited guidance available for primary care practices to improve safety through patient and family engagement, the Agency for Healthcare Research and Quality (AHRQ) has funded the development of a Guide to Improving Safety in Primary Care Settings by Engaging Patients and Families (hereafter referred to as the Guide). The comprehensive Guide will provide primary care practices with interventions that they can use to engage patients and families in ways that lead to improved patient safety. It will include explicit instructions to help primary care practices, providers, and patients and families adopt new behaviors. The Guide and its development are premised on several key insights relevant to primary care, including:

    Active engagement requires organizational commitment to hearing the patient and family voice and action by leadership to include them as central members of the health care team.

    Patients and families expect and increasingly demand meaningful engagement in harm prevention efforts.

    Institutional courage is required to openly share patient safety vulnerabilities and proactively engage patients in developing solutions that prevent harm.

    Supportive infrastructure is needed to hardwire PFE into all facets of care delivery across the care continuum.

    When done well, patient engagement yields important and measurable results. When not done well, PFE activities may disenfranchise patients, contribute to misunderstanding about risk, result in lack of trust between providers and their organizations, and create fissures among members of the clinical care team.

    With these insights as a basis, three precepts undergird our approach to the development of the Guide. The Guide interventions must yield:

    Meaningful relationship-based engagement for patients and families and primary care providers.

    Innovation and enabling technologies to support engagement, shared decision making and patient safety.

    Workable processes yielding sustainable engagement opportunities for patients, families, providers, and practice staff.

    The Guide will principally, but not exclusively, meet the needs of practices that have not already implemented effective PFE structures or processes. An environmental scan revealed several promising interventions for consideration for inclusion in the Guide. The four interventions selected as part of the Guide include:

    Teach-back

    Be Prepared to Be Engaged

    Medication Management

    Warm Handoff

    The interventions will be compiled into a Guide for adoption by primary care practices. The environmental scan also yielded several important implications for Guide development including:

    Engagement efforts in primary care to date have focused on the patient as the agent of change with limited guidance to providers on how to support patients in these efforts.

    Many interventions are focused heavily on educational efforts alone, either for the patient, the provider, or the practice.

    Few of the tools and interventions identified are immediately usable without the need for additional development or enabling materials to support sustainable adoption.

    Health equity and literacy considerations are limited. Tools for patients often require a relatively high level of literacy and/or health literacy to use.

    Current interventions, tools, and toolkits have a high level of complexity that may impede adoption.

    Existing evidence-based interventions are being refined to reduce complexity and enhance the opportunity for implementation. Implementation development activities, including guidance for each intervention and for the Guide as a whole, are currently underway. Guide field testing will evaluate the implementation challenges faced by primary care practices, thereby offering an opportunity to revise the Guide materials for optimal implementation success prior to widespread dissemination.

    The Guide will be made publicly accessible through the AHRQ Web site for easy referral, access, and use by other health care professionals and primary care practices. AHRQ recognizes the importance of ensuring that the Guide will be useful, well implemented, and effective in achieving the goals of improving patient safety by engaging patients and families. Thus, the purpose of the Field Testing evaluation is to gain insight into the implementation challenges identified by the twelve primary care practices field testing the Guide. The Guide materials will be revised in an effort to overcome these implementation challenges prior to broad dissemination.

    The specific goals of the proposed Guide field testing evaluation are to examine the following:

    The feasibility of implementing a minimum of two of the four Guide interventions within twelve medium or large primary care practices.

    The challenges to implementing the interventions at the patient, clinician, practice staff, and practice level.

    The uptake and confidence among primary care practices to improve patient safety through patient and family engagement.

    How the implementation of two of the four Guide interventions changes the perception of patient safety among patients, clinicians, and practice staff.

    How the implementation of two of the four Guide interventions changes the perception of patient and family engagement among patients, clinicians, and practice staff.

    Whether primary care practices will continue to use the Guide (or its interventions) beyond the period of field testing and evaluation (i.e. examine sustainability).

    What changes patients, clinicians, and practice staff would recommend to the interventions and the Guide to enhance sustainability.

    This study is being conducted by AHRQ through its contractor, MedStar, pursuant to AHRQ's statutory authority to conduct and support research on health care and on systems for the delivery of such care, including activities with respect to the quality, effectiveness, efficiency, appropriateness and value of healthcare services and with respect to quality measurement and improvement. 42 U.S.C. 299a(a)(1) and (2).

    Method of Collection

    To achieve the goals of the project, the following data collections will be implemented during the Field Testing evaluation:

    1. Baseline Practice Assessment of Primary Care Practices. This pen-and-paper survey will be administered to the twelve primary care practice champions, individuals at each practice responsible for coordinating Guide activities and responding to inquiries from MedStar during Field Testing, immediately following recruitment as part of the Guide Field Test and prior to commencing implementation of the Guide. Information collected includes: (i) Practice name and location (e.g., city and State); (ii) non-identifying demographic information about the practice (e.g., number of clinicians by type, number of patients served by the practice, payer mix of patients served by practice, race and ethnicity of patients served by practice); (iii) general descriptive information on the practice's experience with patient safety and quality improvement activities (e.g., current experience with Guide interventions, patient safety culture routinely measured); (iv) information related to the practice's affiliation with a larger health system; and (v) information related to any competing priorities or practice improvement initiatives (e.g., patient centered medical home designation, etc.).

    2. Post-Implementation Focus Groups for Patients and Families. Information from patients on their experiences with the Guide and its interventions will be solicited twice during the Field Test—once at 3 months and again at 6 months post-implementation of the Guide. Each patient and family focus group will aim to recruit 6 to 8 participants and solicit feedback from patients and family members on their experiences with the Guide materials. Information collected will include: (i) Perceptions of patient safety in primary care practices; (ii) perceptions of patient and family engagement in primary care practices; (iii) feedback from the patient perspective on the Guide materials and their general use; (iv) feasibility of adopting the patient and family focused intervention materials in practice; and (v) feedback on the patient and family experiences of the Guide and its relation to patient safety.

    3. Baseline Practice Readiness Assessment. Information from primary care practices about their readiness to adopt patient and family engagement strategies will be solicited through telephone interviews with practice staff champions. Information collected will include: (i) Descriptive information on the person completing the interview (e.g., position in the practice, length of employment, experience in implementing patient safety improvements); (ii) description of the patient safety culture of the primary care practice (e.g., teamwork, communication, patient safety culture, etc.); (iii) perceptions of patient and family engagement within the practice; (iv) perceptions of change management strategies, challenges, and barriers (e.g., leadership support, competing initiatives, other production pressures); and (v) capacity for ongoing internal measurement and assessment of the intervention. This process will also solicit any general information the interviewee would like to share about the practice's readiness to implement the Guide strategies.

    4. Post-Implementation Interviews of Primary Care Clinicians. Information from primary care clinicians (e.g., physicians, nurses, nurse practitioners, social workers, etc.) on their experiences with the Guide and its interventions will be solicited twice during the Field Test—once at 3 months and again at 6 months post-implementation of the Guide. Interviews with 2 or 3 primary care clinicians per practice will be conducted during Field Testing to solicit feedback on their experiences with the Guide materials. Information collected will include: (i) Perceptions of patient safety in primary care practices; (ii) perceptions of patient and family engagement in primary care practices; (iii) feedback from the clinician perspective on the Guide materials and their general use; (iv) feasibility of adopting the intervention materials in practice; and (v) feedback on the clinicians' experiences of the Guide and its relation to patient safety.

    5. Post-Implementation Focus Groups for Practice Staff Members. Information from practice staff members (e.g., practice administrators, medical assistants, schedulers, practice facilitators, and other non-clinical staff) on their experiences with the Guide and its interventions will be solicited twice during the Field Test—once at 3 months and again at 6 months post-implementation of the Guide. Focus groups with 6 to 8 primary care practice staff will be conducted in each practice during Field Testing to solicit feedback on their experiences with the Guide materials. Information collected will include: (i) Perceptions of patient safety in primary care practices; (ii) perceptions of patient and family engagement in primary care practices; (iii) feedback from the practice staff perspective on the Guide materials and their general use; (iv) feasibility of adopting the intervention materials in practice; and (v) feedback on the practice staff's experiences of the Guide and its relation to patient safety.

    6. Monthly Telephone Interviews with Practice Champions. This survey will be completed over the phone on a monthly basis with the practice champions from the twelve primary care practices engaged in the Field Testing of the Guide. Information collected will include: (i) Current progress towards implementation of the intervention(s); (ii) movement towards target goals set in the prior meeting; (iii) barriers to implementation; (iv) facilitators of implementation; (v) perceived impact on patient safety; (vi) perceived impact on patient and family engagement; and (vii) plans for the coming weeks/months.

    The Guide will be tested to evaluate the feasibility of adopting it in primary care practices. A mixed-methods approach will be used to identify barriers and facilitators to uptake and sustainability, and to answer the question “How and in what contexts do the chosen interventions work, or can they be amended to work?” rather than “Do they work?” Testing will occur at up to 12 primary care sites, and feasibility will be assessed at the patient, provider, and practice levels. The Guide will be revised based on these findings.

    Estimated Annual Respondent Burden

    Exhibit 1 shows the estimated annualized burden hours for the respondents' time to participate in this evaluation of the Guide during field testing. Two formative evaluations will be conducted during field testing in twelve primary care practices in at least two geographic regions of the United States. Evaluation efforts will include collection of baseline practice-level data prior to Guide implementation and two separate rounds of focus groups and interviews conducted 3 months and 6 months after Guide implementation. Baseline assessments will be conducted on paper via phone consultation between the Contractor and the local practice champion and will take 30 to 60 minutes. Patient focus groups will be conducted at the 3- and 6-month evaluation periods, each lasting 60 to 90 minutes. Practice staff focus groups will be conducted during each of the site visits, conducted outside regular practice hours, and will last 60 to 90 minutes. Primary care clinician interviews will last approximately 45 minutes. We estimate that approximately 12 individuals will participate in the monthly telephone interviews over the 9-month implementation and evaluation period.

    Exhibit 1—Estimated Annualized Burden Hours

    Baseline Practice Assessment: 12 respondents; 1 response per respondent; 1 hour per response; 12 total burden hours.
    Post-Implementation Focus Group for Patients and Family Members: 72 respondents; 2 responses per respondent; 1.5 hours per response; 216 total burden hours.
    Interview Guide—Baseline Practice Readiness: 12 respondents; 1 response per respondent; 0.75 hours per response; 9 total burden hours.
    Post-Implementation Interview Protocol—Providers: 24 respondents; 2 responses per respondent; 0.75 hours per response; 36 total burden hours.
    Post-Implementation Focus Group Protocol—Practice Staff: 72 respondents; 2 responses per respondent; 1.5 hours per response; 216 total burden hours.
    Topic Guide for Telephone Protocol—Guide Practice Champions: 12 respondents; 6 responses per respondent; 0.5 hours per response; 36 total burden hours.
    Total: 204 respondents; 525 total burden hours.
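    As a check on Exhibit 1, each form's total burden hours equal the number of respondents multiplied by the responses per respondent and the hours per response, and the form totals sum to the overall figure shown above:

    \[
    12(1)(1) + 72(2)(1.5) + 12(1)(0.75) + 24(2)(0.75) + 72(2)(1.5) + 12(6)(0.5) = 12 + 216 + 9 + 36 + 216 + 36 = 525 \text{ hours}.
    \]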

    Exhibit 2 shows the estimated annualized cost burden based on the respondents' time to participate in this project. The total cost burden is estimated to be $18,629.16.

    Exhibit 2—Estimated Annualized Cost Burden

    Baseline Practice Assessment: 12 respondents; 12 total burden hours; average hourly wage rate $37.40 (a); total cost burden $448.80.
    Post-Implementation Focus Group for Patients and Family Members: 72 respondents; 216 total burden hours; average hourly wage rate $23.23 (c); total cost burden $5,017.68.
    Interview Guide—Baseline Practice Readiness: 12 respondents; 9 total burden hours; average hourly wage rate $37.40 (a); total cost burden $336.60.
    Post-Implementation Interview Protocol—Providers: 24 respondents; 36 total burden hours; average hourly wage rate $94.48 (b); total cost burden $3,401.28.
    Post-Implementation Focus Group Protocol—Practice Staff: 72 respondents; 216 total burden hours; average hourly wage rate $37.40 (a); total cost burden $8,078.40.
    Topic Guide for Telephone Protocol—Guide Practice Champions: 12 respondents; 36 total burden hours; average hourly wage rate $37.40 (a); total cost burden $1,346.40.
    Total: 204 respondents; 525 total burden hours; $18,629.16 total cost burden.

    * Hourly wage rates are from the National Compensation Survey: Occupational Wages in the United States, May 2015, U.S. Department of Labor, Bureau of Labor Statistics, http://www.bls.gov/oes/current/oes_nat.htm. (a) Based on the mean wages for Miscellaneous Health Care Workers (Code 29-9090). (b) Based on the mean wages for Internists, General (Code 29-1063). (c) Based on the mean wages for All Occupations (Code 00-0000).
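    Similarly, each row's total cost burden in Exhibit 2 is its total burden hours multiplied by the applicable average hourly wage rate, and the row totals sum to the stated overall cost:

    \[
    12(37.40) + 216(23.23) + 9(37.40) + 36(94.48) + 216(37.40) + 36(37.40) = 448.80 + 5{,}017.68 + 336.60 + 3{,}401.28 + 8{,}078.40 + 1{,}346.40 = \$18{,}629.16.
    \]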
    Request for Comments

    In accordance with the Paperwork Reduction Act, comments on AHRQ's information collection are requested with regard to any of the following: (a) Whether the proposed collection of information is necessary for the proper performance of AHRQ health care research and health care information dissemination functions, including whether the information will have practical utility; (b) the accuracy of AHRQ's estimate of burden (including hours and costs) of the proposed collection(s) of information; (c) ways to enhance the quality, utility, and clarity of the information to be collected; and (d) ways to minimize the burden of the collection of information upon the respondents, including the use of automated collection techniques or other forms of information technology.

    Comments submitted in response to this notice will be summarized and included in the Agency's subsequent request for OMB approval of the proposed information collection. All comments will become a matter of public record.

    Sharon B. Arnold, Deputy Director.
    [FR Doc. 2016-26143 Filed 10-28-16; 8:45 am] BILLING CODE 4160-90-P
    DEPARTMENT OF HEALTH AND HUMAN SERVICES Centers for Disease Control and Prevention [60Day-17-17BX; Docket No. CDC-2016-0103] Proposed Data Collection Submitted for Public Comment and Recommendations AGENCY:

    Centers for Disease Control and Prevention (CDC), Department of Health and Human Services (HHS).

    ACTION:

    Notice with comment period.

    SUMMARY:

    The Centers for Disease Control and Prevention (CDC), as part of its continuing efforts to reduce public burden and maximize the utility of government information, invites the general public and other Federal agencies to take this opportunity to comment on proposed and/or continuing information collections, as required by the Paperwork Reduction Act of 1995. This notice invites comment on a proposed information collection entitled “Understanding the Needs, Challenges, Opportunities, Vision and Emerging Roles in Environmental Health (UNCOVER EH).” The purpose of the data collection is to collect information from the health department environmental health (EH) workforce to determine demographics, education/training, experience, areas of practice, and current and future needs to address emerging environmental issues.

    DATES:

    Written comments must be received on or before December 30, 2016.

    ADDRESSES:

    You may submit comments, identified by Docket No. CDC-2016-0103 by any of the following methods:

    Federal eRulemaking Portal: Regulations.gov. Follow the instructions for submitting comments.

    Mail: Leroy A. Richardson, Information Collection Review Office, Centers for Disease Control and Prevention, 1600 Clifton Road NE., MS-D74, Atlanta, Georgia 30329.

    Instructions: All submissions received must include the agency name and Docket Number. All relevant comments received will be posted without change to Regulations.gov, including any personal information provided. For access to the docket to read background documents or comments received, go to Regulations.gov.

    Please note: All public comment should be submitted through the Federal eRulemaking portal (Regulations.gov) or by U.S. mail to the address listed above.

    FOR FURTHER INFORMATION CONTACT:

    To request more information on the proposed project or to obtain a copy of the information collection plan and instruments, contact the Information Collection Review Office, Centers for Disease Control and Prevention, 1600 Clifton Road NE., MS-D74, Atlanta, Georgia 30329; phone: 404-639-7570; Email: [email protected].

    SUPPLEMENTARY INFORMATION:

    Under the Paperwork Reduction Act of 1995 (PRA) (44 U.S.C. 3501-3520), Federal agencies must obtain approval from the Office of Management and Budget (OMB) for each collection of information they conduct or sponsor. In addition, the PRA also requires Federal agencies to provide a 60-day notice in the Federal Register concerning each proposed collection of information, including each new proposed collection, each proposed extension of existing collection of information, and each reinstatement of previously approved information collection before submitting the collection to OMB for approval. To comply with this requirement, we are publishing this notice of a proposed data collection as described below.

    Comments are invited on: (a) Whether the proposed collection of information is necessary for the proper performance of the functions of the agency, including whether the information shall have practical utility; (b) the accuracy of the agency's estimate of the burden of the proposed collection of information; (c) ways to enhance the quality, utility, and clarity of the information to be collected; (d) ways to minimize the burden of the collection of information on respondents, including through the use of automated collection techniques or other forms of information technology; and (e) estimates of capital or start-up costs and costs of operation, maintenance, and purchase of services to provide information. Burden means the total time, effort, or financial resources expended by persons to generate, maintain, retain, disclose or provide information to or for a Federal agency. This includes the time needed to review instructions; to develop, acquire, install and utilize technology and systems for the purpose of collecting, validating and verifying information, processing and maintaining information, and disclosing and providing information; to train personnel and to be able to respond to a collection of information, to search data sources, to complete and review the collection of information; and to transmit or otherwise disclose the information.

    Proposed Project

    Understanding the Needs, Challenges, Opportunities, Vision and Emerging Roles in Environmental Health (UNCOVER EH)—NEW—National Center for Environmental Health (NCEH), Centers for Disease Control and Prevention (CDC).

    Background and Brief Description

    The environmental health (EH) workforce is an essential component of the public health workforce. According to recent health department surveys, EH professionals are employed at approximately 85% of local health departments, 81% of state health departments, and 30% of tribal health departments. Describing and characterizing the EH workforce is essential to identifying gaps in staffing, training, and ultimately ensuring EH professionals are prepared to meet future challenges.

    CDC seeks OMB approval for a one-time, one-year information collection designed to thoroughly describe the health department EH workforce on: (1) The current supply of EH professionals; (2) EH workforce demographics and professional roles; (3) gaps in current EH education and competencies and training needs; and (4) critical skills and resources needed to meet the evolving and emerging EH issues and challenges. This information will benefit the government and other entities by providing essential data to inform and support workforce development activities and initiatives and understand areas of practice and where gaps may exist in capacity to address current EH issues and future challenges.

    The respondent universe will be the estimated 20,000 EH professionals working within health departments. They will be enumerated and recruited by identifying a point of contact in each state, local, tribal, and territorial health department from whom a roster of EH professionals will be requested. A list of respondents and their business email addresses will be generated and used for recruitment and survey administration. Any contact information collected will be related to the respondents' role in the organization. Participation will be voluntary.

    Data will be collected one time from a census of members of the public health department EH workforce using a web-based survey instrument. The UNCOVER EH Survey will take approximately 30 minutes to complete per respondent, and it will take approximately 5 minutes for health department administrative staff to compile EH workforce names and email addresses into the Health Department Roster.

    There will be no cost to respondents other than their time. The requested time burden is 10,269 hours.

    Estimated Annualized Burden Hours

    Health Department EH Administrative Staff (Health Department Roster): 3,231 respondents; 1 response per respondent; 5/60 hours per response; 269 total burden hours.
    Health Department EH Professionals (UNCOVER EH Survey): 20,000 respondents; 1 response per respondent; 30/60 hours per response; 10,000 total burden hours.
    Total: 10,269 burden hours.
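    For reference, the requested burden total follows directly from the two rows above:

    \[
    3{,}231 \times \tfrac{5}{60} \approx 269 \text{ hours}, \qquad 20{,}000 \times \tfrac{30}{60} = 10{,}000 \text{ hours}, \qquad 269 + 10{,}000 = 10{,}269 \text{ hours}.
    \]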
    Leroy A. Richardson, Chief, Information Collection Review Office, Office of Scientific Integrity, Office of the Associate Director for Science, Office of the Director, Centers for Disease Control and Prevention.
    [FR Doc. 2016-26248 Filed 10-28-16; 8:45 am] BILLING CODE 4163-18-P
    DEPARTMENT OF HEALTH AND HUMAN SERVICES Centers for Medicare & Medicaid Services [Document Identifiers: CMS-416, CMS-8003, CMS-10142, CMS-10396, and CMS-R-262] Agency Information Collection Activities: Proposed Collection; Comment Request AGENCY:

    Centers for Medicare & Medicaid Services, HHS.

    ACTION:

    Notice.

    SUMMARY:

    The Centers for Medicare & Medicaid Services (CMS) is announcing an opportunity for the public to comment on CMS' intention to collect information from the public. Under the Paperwork Reduction Act of 1995 (the PRA), federal agencies are required to publish notice in the Federal Register concerning each proposed collection of information (including each proposed extension or reinstatement of an existing collection of information) and to allow 60 days for public comment on the proposed action. Interested persons are invited to send comments regarding our burden estimates or any other aspect of this collection of information, including any of the following subjects: (1) The necessity and utility of the proposed information collection for the proper performance of the agency's functions; (2) the accuracy of the estimated burden; (3) ways to enhance the quality, utility, and clarity of the information to be collected; and (4) the use of automated collection techniques or other forms of information technology to minimize the information collection burden.

    DATES:

    Comments must be received by December 30, 2016.

    ADDRESSES:

    When commenting, please reference the document identifier or OMB control number. To be assured consideration, comments and recommendations must be submitted in any one of the following ways:

    1. Electronically. You may send your comments electronically to http://www.regulations.gov. Follow the instructions for “Comment or Submission” or “More Search Options” to find the information collection document(s) that are accepting comments.

    2. By regular mail. You may mail written comments to the following address: CMS, Office of Strategic Operations and Regulatory Affairs, Division of Regulations Development, Attention: Document Identifier/OMB Control Number ___, Room C4-26-05, 7500 Security Boulevard, Baltimore, Maryland 21244-1850.

    To obtain copies of a supporting statement and any related forms for the proposed collection(s) summarized in this notice, you may make your request using one of the following:

    1. Access CMS' Web site address at http://www.cms.hhs.gov/PaperworkReductionActof1995.

    2. Email your request, including your address, phone number, OMB number, and CMS document identifier, to [email protected].

    3. Call the Reports Clearance Office at (410) 786-1326.

    FOR FURTHER INFORMATION CONTACT:

    Reports Clearance Office at (410) 786-1326.

    SUPPLEMENTARY INFORMATION: Contents

    This notice sets out a summary of the use and burden associated with the following information collections. More detailed information can be found in each collection's supporting statement and associated materials (see ADDRESSES).

    CMS-416 Annual Early and Periodic Screening, Diagnostic and Treatment (EPSDT) Participation Report
    CMS-8003 1915(c) Home and Community Based Services (HCBS) Waiver
    CMS-10142 Bid Pricing Tool (BPT) for Medicare Advantage (MA) Plans and Prescription Drug Plans (PDP)
    CMS-10396 Medication Therapy Management Program Improvements
    CMS-R-262 Contract Year 2018 Plan Benefit Package (PBP) Software and Formulary Submission

    Under the PRA (44 U.S.C. 3501-3520), federal agencies must obtain approval from the Office of Management and Budget (OMB) for each collection of information they conduct or sponsor. The term “collection of information” is defined in 44 U.S.C. 3502(3) and 5 CFR 1320.3(c) and includes agency requests or requirements that members of the public submit reports, keep records, or provide information to a third party. Section 3506(c)(2)(A) of the PRA requires federal agencies to publish a 60-day notice in the Federal Register concerning each proposed collection of information, including each proposed extension or reinstatement of an existing collection of information, before submitting the collection to OMB for approval. To comply with this requirement, CMS is publishing this notice.

    Information Collection

    1. Type of Information Collection Request: Revision of a currently approved collection; Title of Information Collection: Annual Early and Periodic Screening, Diagnostic and Treatment (EPSDT) Participation Report; Use: The collected baseline data are used to assess the effectiveness of state early and periodic screening, diagnostic and treatment (EPSDT) programs in reaching eligible children (by age group and basis of Medicaid eligibility) who are provided initial and periodic child health screening services, referred for corrective treatment, and receiving dental, hearing, and vision services. This assessment is coupled with the state's results in attaining the participation goals set for the state. The information gathered from this report permits federal and state managers to evaluate the effectiveness of the EPSDT law on the basic aspects of the program. Form Number: CMS-416 (OMB control number 0938-0354); Frequency: Yearly and on occasion; Affected Public: State, Local, or Tribal Governments; Number of Respondents: 56; Total Annual Responses: 168; Total Annual Hours: 1,624. (For policy questions regarding this collection contact Kimberly Perrault at 410-786-2482.)

    2. Type of Information Collection Request: Extension without change of a currently approved collection; Title of Information Collection: 1915(c) Home and Community Based Services (HCBS) Waiver; Use: We will use the web-based application to review and adjudicate individual waiver actions. The web-based application will also be used by states to submit and revise their waiver requests. Form Number: CMS-8003 (OMB control number 0938-0449); Frequency: Yearly; Affected Public: State, Local, or Tribal Governments; Number of Respondents: 47; Total Annual Responses: 71; Total Annual Hours: 6,005. (For policy questions regarding this collection contact Kathy Poisal at 410-786-5940.)

    3. Type of Information Collection Request: Revision of a currently approved collection; Title of Information Collection: Bid Pricing Tool (BPT) for Medicare Advantage (MA) Plans and Prescription Drug Plans (PDP); Use: We require that Medicare Advantage organizations and Prescription Drug Plans complete the BPT as part of the annual bidding process. During this process, organizations prepare their proposed actuarial bid pricing for the upcoming contract year and submit them to us for review and approval. The purpose of the BPT is to collect the actuarial pricing information for each plan. The BPT calculates the plan's bid, enrollee premiums, and payment rates. We publish beneficiary premium information using a variety of formats (www.medicare.gov, the Medicare & You handbook, Summary of Benefits marketing information) for the purpose of beneficiary education and enrollment. Form Number: CMS-10142 (OMB control number 0938-0944); Frequency: Yearly; Affected Public: Business or other for-profits and Not-for-profit institutions; Number of Respondents: 555; Total Annual Responses: 4,995; Total Annual Hours: 149,850. (For policy questions regarding this collection contact Rachel Shevland at 410-786-3026.)

    4. Type of Information Collection Request: Extension of a currently approved collection; Title of Information Collection: Medication Therapy Management Program Improvements; Use: Information collected by Part D medication therapy management programs (as required by the standardized format for the comprehensive medication review summary) will be used by beneficiaries or their authorized representatives, caregivers, and their healthcare providers to improve medication use and achieve better healthcare outcomes. Form Number: CMS-10396 (OMB control number 0938-1154); Frequency: Occasionally; Affected Public: Business or other for-profits; Number of Respondents: 599; Total Annual Responses: 1,211,661; Total Annual Hours: 807,451. (For policy questions regarding this collection contact Victoria Dang at 410-786-3991.)

    5. Type of Information Collection Request: Revision of a currently approved collection; Title of Information Collection: Contract Year 2018 Plan Benefit Package (PBP) Software and Formulary Submission; Use: We require that Medicare Advantage and Prescription Drug Plan organizations submit a completed PBP and formulary as part of the annual bidding process. During this process, organizations prepare their proposed plan benefit packages for the upcoming contract year and submit them to us for review and approval. We publish beneficiary education information using a variety of formats. The specific education initiatives that utilize PBP and formulary data include web application tools on www.medicare.gov and the plan benefit insert in the Medicare & You handbook. In addition, organizations utilize the PBP data to generate their Summary of Benefits marketing information. Form Number: CMS-R-262 (OMB control number 0938-0763); Frequency: Yearly; Affected Public: Business or other for-profits and Not-for-profit institutions; Number of Respondents: 524; Total Annual Responses: 5,185; Total Annual Hours: 50,619. (For policy questions regarding this collection contact Kristy Holtje at 410-786-2209.)

    Dated: October 26, 2016. William N. Parham, III, Director, Paperwork Reduction Staff, Office of Strategic Operations and Regulatory Affairs.
    [FR Doc. 2016-26246 Filed 10-28-16; 8:45 am] BILLING CODE 4120-01-P
    DEPARTMENT OF HEALTH AND HUMAN SERVICES Centers for Medicare & Medicaid Services [Document Identifier: CMS-10629] Agency Information Collection Activities: Proposed Collection; Comment Request AGENCY:

    Centers for Medicare & Medicaid Services, Department of Health and Human Services.

    ACTION:

    Notice.

    SUMMARY:

    The Centers for Medicare & Medicaid Services (CMS) is announcing an opportunity for the public to comment on CMS' intention to collect information from the public. Under the Paperwork Reduction Act of 1995 (the PRA), Federal agencies are required to publish notice in the Federal Register concerning each proposed collection of information (including each proposed extension or reinstatement of an existing collection of information) and to allow 60 days for public comment on the proposed action. Interested persons are invited to send comments regarding our burden estimates or any other aspect of this collection of information, including any of the following subjects: (1) The necessity and utility of the proposed information collection for the proper performance of the agency's functions; (2) the accuracy of the estimated burden; (3) ways to enhance the quality, utility, and clarity of the information to be collected; and (4) the use of automated collection techniques or other forms of information technology to minimize the information collection burden.

    DATES:

    Comments must be received by December 30, 2016.

    ADDRESSES:

    When commenting, please reference the document identifier or OMB control number. To be assured consideration, comments and recommendations must be submitted in any one of the following ways:

    1. Electronically. You may send your comments electronically to http://www.regulations.gov. Follow the instructions for “Comment or Submission” or “More Search Options” to find the information collection document(s) that are accepting comments.

    2. By regular mail. You may mail written comments to the following address: CMS, Office of Strategic Operations and Regulatory Affairs, Division of Regulations Development, Attention: Document Identifier/OMB Control Number___, Room C4-26-05, 7500 Security Boulevard, Baltimore, Maryland 21244-1850.

    To obtain copies of a supporting statement and any related forms for the proposed collection(s) summarized in this notice, you may make your request using one of the following:

    1. Access CMS' Web site address at http://www.cms.hhs.gov/PaperworkReductionActof1995.

    2. Email your request, including your address, phone number, OMB number, and CMS document identifier, to [email protected].

    3. Call the Reports Clearance Office at (410) 786-1326.

    FOR FURTHER INFORMATION CONTACT:

    Reports Clearance Office at (410) 786-1326.

    SUPPLEMENTARY INFORMATION: Contents

    This notice sets out a summary of the use and burden associated with the following information collections. More detailed information can be found in each collection's supporting statement and associated materials (see ADDRESSES).

    CMS-10629 Waiver Application for Providers and Suppliers Subject to an Enrollment Moratorium

    Under the PRA (44 U.S.C. 3501-3520), Federal agencies must obtain approval from the Office of Management and Budget (OMB) for each collection of information they conduct or sponsor. The term “collection of information” is defined in 44 U.S.C. 3502(3) and 5 CFR 1320.3(c) and includes agency requests or requirements that members of the public submit reports, keep records, or provide information to a third party. Section 3506(c)(2)(A) of the PRA requires Federal agencies to publish a 60-day notice in the Federal Register concerning each proposed collection of information, including each proposed extension or reinstatement of an existing collection of information, before submitting the collection to OMB for approval. To comply with this requirement, CMS is publishing this notice.

    Information Collection

    1. Type of Information Collection Request: Extension of a currently approved collection;

    Title of Information Collection: Waiver Application for Providers and Suppliers Subject to an Enrollment Moratorium; Use: This demonstration, in conjunction with an expansion of the existing provider enrollment moratoria, will allow CMS to mitigate known vulnerabilities within the existing moratoria and will lead to increased investigations of fraud. Section 402(a)(1)(J) of the Social Security Amendments of 1967 (42 U.S.C. 1395b-1(a)(1)(J)) permits the Secretary to “develop or demonstrate improved methods for the investigation and prosecution of fraud in the provision of care or services under the health programs established by the Social Security Act.” In addition to the development and demonstration of improved methods for investigations, CMS will utilize this demonstration to address beneficiary access to care issues. Form Number: CMS-10629 (OMB control number: 0938-1313); Frequency: Occasionally; Affected Public: Business or other for-profit, Not-for-profit institutions; Number of Respondents: 800; Total Annual Responses: 800; Total Annual Hours: 4,800. (For policy questions regarding this collection contact Kim Jung at 410-786-9370).

    Dated: October 25, 2016. William N. Parham, III, Director, Paperwork Reduction Staff, Office of Strategic Operations and Regulatory Affairs.
    [FR Doc. 2016-26122 Filed 10-28-16; 8:45 am] BILLING CODE 4120-01-P
    DEPARTMENT OF HEALTH AND HUMAN SERVICES Centers for Medicare & Medicaid Services [Document Identifiers: CMS-10622, CMS-339, CMS-460, CMS-R-64, CMS-379, CMS-10311, CMS-1490, CMS-10137, and CMS-10237] Agency Information Collection Activities: Submission for OMB Review; Comment Request AGENCY:

    Centers for Medicare & Medicaid Services, HHS.

    ACTION:

    Notice.

    SUMMARY:

    The Centers for Medicare & Medicaid Services (CMS) is announcing an opportunity for the public to comment on CMS' intention to collect information from the public. Under the Paperwork Reduction Act of 1995 (PRA), federal agencies are required to publish notice in the Federal Register concerning each proposed collection of information, including each proposed extension or reinstatement of an existing collection of information, and to allow a second opportunity for public comment on the notice. Interested persons are invited to send comments regarding the burden estimate or any other aspect of this collection of information, including any of the following subjects: (1) The necessity and utility of the proposed information collection for the proper performance of the agency's functions; (2) the accuracy of the estimated burden; (3) ways to enhance the quality, utility, and clarity of the information to be collected; and (4) the use of automated collection techniques or other forms of information technology to minimize the information collection burden.

    DATES:

    Comments on the collection(s) of information must be received by the OMB desk officer by November 30, 2016.

    ADDRESSES:

    When commenting on the proposed information collections, please reference the document identifier or OMB control number. To be assured consideration, comments and recommendations must be received by the OMB desk officer via one of the following transmissions: OMB, Office of Information and Regulatory Affairs, Attention: CMS Desk Officer, Fax Number: (202) 395-5806 OR, Email: [email protected].

    To obtain copies of a supporting statement and any related forms for the proposed collection(s) summarized in this notice, you may make your request using one of the following:

    1. Access CMS' Web site address at http://www.cms.hhs.gov/PaperworkReductionActof1995.

    2. Email your request, including your address, phone number, OMB number, and CMS document identifier, to [email protected].

    3. Call the Reports Clearance Office at (410) 786-1326.

    FOR FURTHER INFORMATION CONTACT:

    Reports Clearance Office at (410) 786-1326.

    SUPPLEMENTARY INFORMATION:

    Under the Paperwork Reduction Act of 1995 (PRA) (44 U.S.C. 3501-3520), federal agencies must obtain approval from the Office of Management and Budget (OMB) for each collection of information they conduct or sponsor. The term “collection of information” is defined in 44 U.S.C. 3502(3) and 5 CFR 1320.3(c) and includes agency requests or requirements that members of the public submit reports, keep records, or provide information to a third party. Section 3506(c)(2)(A) of the PRA (44 U.S.C. 3506(c)(2)(A)) requires federal agencies to publish a 30-day notice in the Federal Register concerning each proposed collection of information, including each proposed extension or reinstatement of an existing collection of information, before submitting the collection to OMB for approval. To comply with this requirement, CMS is publishing this notice that summarizes the following proposed collection(s) of information for public comment:

    1. Type of Information Collection Request: New collection (Request for a new OMB control number); Title of Information Collection: Evaluation of the CMS Quality Improvement Organizations: Reducing Healthcare-Acquired Conditions in Nursing Homes; Use: As mandated by Sections 1152-1154 of the Social Security Act, CMS directs the QIO program, one of the largest federal programs dedicated to improving health quality for Medicare beneficiaries. In the 11th SOW, CMS restructured the QIO program into funded Quality Innovation Network (QIN)-QIOs, Beneficiary and Family-Centered Care (BFCC) organizations, National Coordinating Centers (NCCs), Program Collaboration Centers (PCCs), and the Strategic Innovation Engine (SIE). In the current SOW, 14 QIN-QIOs coordinate the work of 53 QIOs nationwide, including all 50 states and the U.S. territories.

    CMS evaluates the quality and effectiveness of the QIO program as authorized in Part B of Title XI of the Social Security Act. CMS created the Independent Evaluation Center (IEC) to provide CMS and its stakeholders with an independent and objective program evaluation of the 11th SOW. Evaluation activities will focus on analyzing how well the QIO program is achieving the three aims of better care, better health, and lower cost, as well as the effectiveness of the new QIO program structure. One of the QIN-QIOs' tasks to achieve these three aims is to support participating nursing homes in their efforts to improve quality of care and health outcomes among residents. According to the 2013 CMS Nursing Home Data Compendium, more than 15,000 nursing homes participated in Medicare and Medicaid programs, and more than 1.4 million beneficiaries resided in U.S. nursing homes. These residents and their families rely on nursing homes to provide reliable, safe, high-quality care. However, cognitive and functional impairments, pain, incontinence, antipsychotic drug use, and healthcare-associated conditions (HACs), such as pressure ulcers and falls, remain areas of concern.

    This information collection will provide data to assess QIN-QIOs' efforts aimed at addressing these HACs in nursing homes. QIN-QIOs are responsible for recruiting nursing homes to participate in the program. We will conduct an annual survey of administrators of nursing homes participating in the QIN-QIO program (intervention group) and administrators at nursing homes that are not participating in the QIN-QIO program (comparison group). Our proposed survey assesses progress toward the goals of the QIN-QIO SOW, including activities and strategies to increase mobility among residents, reduce infections, and reduce use of inappropriate antipsychotic medication among long-term-stay residents.

    We plan to conduct qualitative interviews with nursing home administrators. These interviews will supplement the Nursing Home Survey and provide more in-depth contextual information about QIN-QIO program implementation at nursing homes, including: (i) Their experience with, and perceived success of, QIN-QIO collaboratives; (ii) their satisfaction with the QIN-QIO Collaborative and QIO support; (iii) the perceived value and impact of the QIO program; and (iv) drivers and barriers to QIN-QIO involvement and success.

    Information from QIO leadership and/or state/territory task leads will be collected through interviews and focus groups. Interviews with Nursing Home Task leaders at the QIN and QIO will be conducted in person during site visits and/or over the phone. We will conduct focus groups with QIO-level Directors during the annual CMS Quality conference. The purpose of the interviews and focus groups is to examine: (i) QIO processes for recruiting nursing homes, peer coaches, and beneficiaries to participate in the program; (ii) strengths and challenges of QIN-QIO activities related to nursing homes; (iii) partnership and coordination with other QIN-QIO tasks; and (iv) overall lessons learned. We will also conduct qualitative interviews with nursing home peer coaches. Subsequent to the 60-day Federal Register notice, the survey was revised by adding and rewording questions. Form Number: CMS-10622 (OMB control number: 0938-NEW); Frequency: Annually; Affected Public: Business or other For-profits and Not-for-profit institutions; Number of Respondents: 856; Total Annual Responses: 856; Total Annual Hours: 255. (For policy questions regarding this collection contact Robert Kambic at 410-786-1515.)

    2. Type of Information Collection Request: Extension of a currently approved collection; Title of Information Collection: Provider Cost Report Reimbursement Questionnaire; Use: The information collected in this form (Exhibits 1 and 2) is authorized under Sections 1815(a) and 1833(e) of the Social Security Act, 42 U.S.C. 1395g. Regulations at 42 CFR 413.20 and 413.24 require providers to submit financial and statistical records to verify the cost data disclosed on their annual Medicare cost report. Providers participating in the Medicare program are reimbursed for furnishing covered services to eligible beneficiaries on the basis of an annual cost report (filed with the provider's MAC) in which the proper reimbursement is computed. Consequently, it is necessary to collect this documentation of providers' costs and activities that supports the Medicare cost report data in order to ensure proper Medicare reimbursement to providers. Form Number: CMS-339 (OMB control number: 0938-0301); Frequency: Yearly; Affected Public: Private sector (Business or other For-profits); Number of Respondents: 2,273; Total Annual Responses: 2,273; Total Annual Hours: 15,911. (For policy questions regarding this collection contact Christine Dobrzycki at 410-786-3389.)

    3. Type of Information Collection Request: Extension of a currently approved collection; Title of Information Collection: Medicare Participation Agreement for Physicians and Suppliers; Use: Section 1842(h) of the Social Security Act permits physicians and suppliers to voluntarily participate in Medicare Part B by agreeing to take assignment on all claims for services to Medicare beneficiaries. The law also requires that the Secretary provide specific benefits to the physicians, suppliers and other persons who choose to participate. The CMS-460 is the agreement by which the physician or supplier elects to participate in Medicare. Form Number: CMS-460 (OMB control number: 0938-0373); Frequency: Yearly; Affected Public: Private sector (Business or other For-profits); Number of Respondents: 120,000; Total Annual Responses: 120,000; Total Annual Hours: 30,000. (For policy questions regarding this collection contact Mark Baldwin at 410-786-8139.)

    4. Type of Information Collection Request: Extension of a currently approved collection; Title of Information Collection: Indirect Medical Education and Supporting Regulations; Use: Section 1886(d)(5)(B) of the Social Security Act requires additional payments to be made under the Medicare Prospective Payment System (PPS) for the indirect medical educational costs a hospital incurs in connection with interns and residents (IRs) in approved teaching programs. In addition, Title 42, Part 413, sections 75 through 83 implement section 1886(d) of the Act by establishing the methodology for Medicare payment of the cost of direct graduate medical educational activities. These payments, which are adjustments (add-ons) to other payments made to a hospital under PPS, are largely determined by the number of full-time equivalent (FTE) IRs that work at a hospital during its cost reporting period. In Federal fiscal year (FY) 2015, the estimated Medicare program payments for indirect medical education (IME) costs amounted to $8.38 billion. Medicare program payments for direct graduate medical education (GME) are also based upon the number of FTE-IRs that work at a hospital. In FY 2015, the estimated Medicare program payments for GME costs amounted to $3.1 billion. Form Number: CMS-R-64 (OMB control number: 0938-0456); Frequency: Yearly; Affected Public: Private sector (Business or other For-profits); Number of Respondents: 1,245; Total Annual Responses: 1,245; Total Annual Hours: 2,490. (For policy questions regarding this collection contact Milton Jacobson at 410-786-7553.)

    5. Type of Information Collection Request: Extension of a currently approved collection; Title of Information Collection: Financial Statement of Debtor; Use: Section 1893(f)(1) of the Social Security Act and 42 CFR 401.607 provide the authority for collection of this information. Section 42 CFR 405.607 requires that CMS recover amounts of claims due from debtors, including interest where appropriate, by direct collections in lump sums or in installments. In addition, the DOJ Final Rule, the Federal Claims Collection Standards, published as 32 CFR parts 900-904 in the Federal Register on November 22, 2000, prescribes at 32 CFR 900.1 standards for Federal agency use in the administrative collection, offset, compromise, and suspension or termination of collection activity. Section 32 CFR 901.8(a) states that agencies should obtain financial statements from debtors who represent that they are unable to pay the debt in one lump sum. Form Number: CMS-379 (OMB control number: 0938-0270); Frequency: Yearly; Affected Public: Business or other for-profits; Number of Respondents: 500; Total Annual Responses: 500; Total Annual Hours: 1,000. (For policy questions regarding this collection contact Anita Crosier at 410-786-0217.)

    6. Type of Information Collection Request: Extension of a currently approved collection; Title of Information Collection: Medicare Program/Home Health Prospective Payment System Rate Update for Calendar Year 2010: Physician Narrative Requirement and Supporting Regulation; Use: Section 1861(o) of the Act (42 U.S.C. 1395x) specifies certain requirements that a home health agency must meet to participate in the Medicare program. To qualify for Medicare coverage of home health services, a Medicare beneficiary must meet each of the following requirements as stipulated in § 409.42: be confined to the home or an institution that is not a hospital, SNF, or nursing facility as defined in sections 1861(e)(1), 1819(a)(1), or 1919 of the Act; be under the care of a physician as described in § 409.42(b); be under a plan of care that meets the requirements specified in § 409.43; have the care furnished by or under arrangements made by a participating HHA; and be in need of skilled services as described in § 409.42(c). Subsection 409.42(c) of our regulations requires that the beneficiary need at least one of the following services as certified by a physician in accordance with § 424.22: intermittent skilled nursing services and the need for skilled services which meet the criteria in § 409.32; physical therapy which meets the requirements of § 409.44(c); speech-language pathology which meets the requirements of § 409.44(c); or a continuing need for occupational therapy that meets the requirements of § 409.44(c), subject to the limitations described in § 409.42(c)(4). On March 23, 2010, the Affordable Care Act of 2010 (Pub. L. 111-148) was enacted. Section 6407(a) of the Affordable Care Act (as amended by section 10605) amends the requirements for physician certification of home health services contained in Sections 1814(a)(2)(C) and 1835(a)(2)(A) by requiring that, prior to certifying a patient as eligible for Medicare's home health benefit, the physician document that the physician himself or herself, or a permitted non-physician practitioner, has had a face-to-face encounter (including through the use of tele-health services, subject to the requirements in section 1834(m) of the Act) with the patient. The Affordable Care Act provision does not amend the statutory requirement that a physician must certify a patient's eligibility for Medicare's home health benefit (see Sections 1814(a)(2)(C) and 1835(a)(2)(A) of the Act). Form Number: CMS-10311 (OMB control number: 0938-1083); Frequency: Yearly; Affected Public: Business or other For-profits; Number of Respondents: 345,600; Total Annual Responses: 345,600; Total Annual Hours: 28,800. (For policy questions regarding this collection contact Hillary Loeffler at 410-786-0456.)

    7. Type of Information Collection Request: New collection (Request for a new OMB control number); Title of Information Collection: Patient's Request for Medicare Payment; Use: Form CMS-1490S provides beneficiaries with a relatively easy-to-use form for filing their claims. Without the collection of this information, claims for reimbursement relating to the provision of Part B medical services/supplies could not be acted upon. This would result in a nationwide paralysis of the operation of the Federal Government's Part B Medicare program and major problems for patients/beneficiaries, inflicting severe physical and financial hardship on them. This form was explicitly developed for easy use by beneficiaries who file their own claims. The CMS-1490S form can be obtained from any Social Security office, a Medicare Administrative Contractor, or CMS. When the CMS-1490S is used, the beneficiary must attach to it his/her bills from physicians or suppliers. The form is, therefore, designed specifically to aid beneficiaries who cannot get assistance from their physicians or suppliers in completing claim forms. The form is currently approved under 0938-1197; however, we are submitting it for approval as a standalone information collection request. Once a new OMB control number is issued, we will remove the burden for the CMS-1490S that is currently approved under OMB control number 0938-1197. Form Number: CMS-1490 (OMB control number: 0938-NEW); Frequency: Occasionally; Affected Public: Individuals and Households; Number of Respondents: 167,839; Total Annual Responses: 167,839; Total Annual Hours: 83,920. (For policy questions regarding this collection contact Sumita Sen at 410-786-5755.)

    8. Type of Information Collection Request: Revision of a currently approved collection; Title of Information Collection: Solicitation for Applications for Medicare Prescription Drug Plan 2018 Contracts; Use: Coverage for the prescription drug benefit is provided through contracted prescription drug (PD) plans or through Medicare Advantage (MA) plans that offer integrated prescription drug and health care coverage (MA-PD plans). Cost Plans that are regulated under Section 1876 of the Social Security Act, and Employer Group Waiver Plans may also provide a Part D benefit. Organizations wishing to provide services under the Prescription Drug Benefit Program must complete an application, negotiate rates, and receive final approval from CMS. Existing Part D Sponsors may also expand their contracted service area by completing the Service Area Expansion application. Form Number: CMS-10137 (OMB control number: 0938-0936); Frequency: Yearly; Affected Public: Private sector (Business or other For-profits and Not-for-profit institutions); Number of Respondents: 463; Total Annual Responses: 160; Total Annual Hours: 1,565. (For policy questions regarding this collection contact Arianne Spaccarelli at 410-786-5715.)

    9. Type of Information Collection Request: Revision of a currently approved collection; Title of Information Collection: Applications for Part C Medicare Advantage, 1876 Cost Plans, and Employer Group Waiver Plans to Provide Part C Benefits; Use: Organizations wishing to provide healthcare services under MA and/or MA-PD plans must complete an application annually, file a bid, and receive final approval from CMS. The application process offers applicants two options: a request for a new MA product or a request to expand the service area of an existing product. This collection process is the only mechanism for MA and/or MA-PD organizations to complete the required application process. CMS utilizes the application process as the means to review, assess, and determine whether applicants are compliant with the current requirements for participation in the Medicare Advantage program and to make a decision related to contract award. Form Number: CMS-10237 (OMB control number: 0938-0935); Frequency: Yearly; Affected Public: Private sector (Business or other For-profits and Not-for-profit institutions); Number of Respondents: 310; Total Annual Responses: 310; Total Annual Hours: 10,941. (For policy questions regarding this collection contact Marcella Watts at 410-786-5724.)

    Dated: October 26, 2016. William N. Parham, III, Director, Paperwork Reduction Staff, Office of Strategic Operations and Regulatory Affairs.
    [FR Doc. 2016-26242 Filed 10-28-16; 8:45 am] BILLING CODE 4120-01-P
    DEPARTMENT OF HEALTH AND HUMAN SERVICES Food and Drug Administration [Docket No. FDA-2016-N-3083] Report on the Performance of Drug and Biologics Firms in Conducting Postmarketing Requirements and Commitments; Availability AGENCY:

    Food and Drug Administration, HHS.

    ACTION:

    Notice of availability.

    SUMMARY:

    Under the Federal Food, Drug, and Cosmetic Act (the FD&C Act), the Food and Drug Administration (FDA or Agency) is required to report annually in the Federal Register on the status of postmarketing requirements (PMRs) and postmarketing commitments (PMCs) required of, or agreed upon by, holders of approved drug and biological products. This notice is the Agency's report on the status of the studies and clinical trials that applicants have agreed to, or are required to, conduct. A supplemental report entitled “Supplementary Report: Performance of Drug and Biologics Firms in Conducting Postmarketing Requirements (PMRs) and Postmarketing Commitments (PMCs) (FY 2013 and FY 2014),” containing additional information and analyses on the status of PMRs and PMCs as of September 30, 2013, and September 30, 2014, is available on FDA's Web site at http://www.fda.gov/Drugs/GuidanceComplianceRegulatoryInformation/Post-marketingPhaseIVCommitments/ucm064436.htm.

    FOR FURTHER INFORMATION CONTACT:

    Cathryn C. Lee, Center for Drug Evaluation and Research, Food and Drug Administration, 10903 New Hampshire Ave., Bldg. 22, Rm. 6484, Silver Spring, MD 20993-0002, 301-796-0700; or Stephen Ripley, Center for Biologics Evaluation and Research, Food and Drug Administration, 10903 New Hampshire Ave., Bldg. 71, Rm. 3128, Silver Spring, MD 20993-0002, 240-402-7911.

    SUPPLEMENTARY INFORMATION:

    I. Background A. Postmarketing Requirements and Commitments

    A PMR is a study or clinical trial that an applicant is required by statute or regulation to conduct postapproval. A PMC is a study or clinical trial that an applicant agrees in writing to conduct postapproval, but that is not required by statute or regulation. PMRs and PMCs can be issued upon approval of a drug 1 or postapproval, if warranted.

    1 For the purposes of this notice, references to “drugs” or “drug products” include drugs approved under the FD&C Act and biological products licensed under the Public Health Service Act, other than biological products that also meet the definition of a device in section 201(h) of the FD&C Act (21 U.S.C. 321(h)).

    FDA can require application holders to conduct postmarketing studies and clinical trials:

    • To assess a known serious risk, assess signals of serious risk, or identify an unexpected serious risk (when available data indicates the potential for a serious risk) related to the use of a drug product (section 505(o)(3) of the FD&C Act, as added by the Food and Drug Administration Amendments Act of 2007 (FDAAA)).

    • Under the Pediatric Research Equity Act (PREA), to study certain new drugs for pediatric populations, when these drugs are not adequately labeled for children. Under section 505B(a)(3) of the FD&C Act, the initiation of these studies may be deferred until required safety information from other studies in adults has first been submitted and reviewed.

    • To verify and describe the predicted effect or other clinical benefit for drugs approved in accordance with the accelerated approval provisions in section 506(c)(2)(A) of the FD&C Act (21 CFR 314.510 and 601.41).

    • For a drug that was approved on the basis of animal efficacy data because human efficacy trials are not ethical or feasible (21 CFR 314.610(b)(1) and 601.91(b)(1)). PMRs for drug products approved under the animal efficacy rule 2 can be conducted only when the drug product is used for its indication and when an exigency (or event or need) arises. In the absence of a public health emergency, these studies or clinical trials will remain pending indefinitely.

    2 21 CFR 314.600 for drugs; 21 CFR 601.90 for biological products.

    B. Reporting Requirements

    Under the regulations (21 CFR 314.81(b)(2)(vii) and 601.70), applicants of approved drugs are required to submit annually a report on the status of each clinical safety, clinical efficacy, clinical pharmacology, and nonclinical toxicology study or clinical trial either required by FDA or that they have committed to conduct, either at the time of approval or after approval of their new drug application (NDA), abbreviated new drug application (ANDA), or biologics license application (BLA). Applicants are required to report to FDA on these requirements and commitments made for NDAs and ANDAs under 21 CFR 314.81(b)(2)(viii), and for BLAs under 21 CFR 601.70(b). The status of PMCs concerning chemistry, manufacturing, and production controls and the status of other studies or clinical trials conducted on an applicant's own initiative are not required to be reported under 21 CFR 314.81(b)(2)(vii) and 601.70 and are not addressed in this report. Furthermore, section 505(o)(3)(E) of the FD&C Act requires that applicants report periodically on the status of each required study or clinical trial and each study or clinical trial “otherwise undertaken . . . to investigate a safety issue . . . .”

    An applicant must report on the progress of the PMR/PMC on the anniversary of the drug product's approval 3 until the PMR/PMC is completed or terminated and FDA determines that the PMR/PMC has been fulfilled or that the PMR/PMC is either no longer feasible or would no longer provide useful information. The annual status report (ASR) must include a description of the PMR/PMC, a schedule for completing the PMR/PMC, and a characterization of the current status of the PMR/PMC. The report must also provide an explanation of the PMR/PMC status by describing briefly the progress of the PMR/PMC. A PMR/PMC schedule is expected to include the actual or projected dates for the following: (1) Submission of the final protocol to FDA; (2) completion of the study or clinical trial; and (3) submission of the final report to FDA.

    3 An applicant must submit an annual status report on the progress of each open PMR/PMC within 60 days of the anniversary date of U.S. approval of the original application or on an alternate reporting date that was granted by FDA in writing. Some applicants have requested and been granted by FDA alternate annual reporting dates to facilitate harmonized reporting across multiple applications.

    C. PMR/PMC Status Categories

    The status of the PMR/PMC must be described in the ASR according to the terms and definitions provided in 21 CFR 314.81 and 601.70. For its own reporting purposes, FDA has also established terms to describe when the conditions of the PMR/PMC have been met, and when it has been determined that a PMR/PMC is no longer necessary.4 The PMR/PMC status categories are summarized in the following list. As reflected in the definitions, the status of a PMR/PMC is generally determined based on the original schedule.5

    4 See the guidance for industry entitled “Reports on the Status of Postmarketing Study Commitments—Implementation of Section 130 of the Food and Drug Administration Modernization Act of 1997.” We update guidances periodically. To make sure you have the most recent version of a guidance, check the FDA Drugs guidance Web page at http://www.fda.gov/Drugs/GuidanceComplianceRegulatoryInformation/Guidances/default.htm.

    5 The definitions for the terms “pending,” “ongoing,” “delayed,” “terminated,” and “submitted” are adapted from 21 CFR 314.81 and 601.70; the definitions for the terms “fulfilled” and “released” are described in the guidance for industry entitled “Reports on the Status of Postmarketing Study Commitments—Implementation of Section 130 of the Food and Drug Administration Modernization Act of 1997.”

    Pending: The study or clinical trial has not been initiated (i.e., no subjects have been enrolled or animals dosed), but does not meet the criteria for delayed (i.e., the original projected date for initiation of subject accrual or initiation of animal dosing has not passed).6

    6 It is important to note that PMRs/PMCs that are in pending status are not yet delayed; that is, per the milestones, the studies or clinical trials are indeed on schedule and are not expected to be underway yet.

    Ongoing: The study or clinical trial is proceeding according to or ahead of the original schedule.

    Delayed: The study or clinical trial is behind the original schedule.7

    7 In some instances, an applicant may have justifiable reasons for delay of its PMR/PMC (see section I.D).

    Terminated: The study or clinical trial was ended before completion, but a final report has not been submitted to FDA.

    Submitted: The study or clinical trial has been completed or terminated, and a final report has been submitted to FDA.

    Fulfilled: The final report for the study or clinical trial was submitted to FDA and FDA notified the applicant that the requirement or commitment was fulfilled through written correspondence.

    Released: FDA has informed the applicant in writing that it is released from its obligation to conduct the study or clinical trial because the study or clinical trial is no longer feasible, would no longer provide useful information, or the underlying application has been formally withdrawn.

    In addition to the above statuses, PMRs/PMCs may also be characterized as closed or open. “Open” PMRs/PMCs comprise those that are pending, ongoing, delayed, submitted, or terminated; whereas “closed” 8 PMRs/PMCs are either fulfilled or released. Open PMRs are also described by whether they are on- or off-schedule. “On-schedule” PMRs/PMCs are those that are pending, ongoing, or submitted. “Off-schedule” PMRs/PMCs are those that have missed one of the milestone dates in the original schedule and are categorized as either delayed or terminated.

    8 Previous FDA reports on the status of PMRs/PMCs used the term “completed” to refer to PMRs/PMCs that are closed.
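    For readers tracking these groupings, the relationships described above can be summarized schematically. The following minimal sketch is illustrative only; the status terms follow the definitions in this section, and the classify helper is a hypothetical name, not part of any FDA system:

```python
# Illustrative mapping of the PMR/PMC status terms defined above onto the
# open/closed and on-/off-schedule groupings; the function name is hypothetical.

CLOSED_STATUSES = {"fulfilled", "released"}
OPEN_STATUSES = {"pending", "ongoing", "delayed", "submitted", "terminated"}
ON_SCHEDULE = {"pending", "ongoing", "submitted"}   # open and on-schedule
OFF_SCHEDULE = {"delayed", "terminated"}            # open but off-schedule

def classify(status):
    """Return (open/closed, schedule grouping or None for closed PMRs/PMCs)."""
    status = status.lower()
    if status in CLOSED_STATUSES:
        return ("closed", None)
    if status in OPEN_STATUSES:
        return ("open", "on-schedule" if status in ON_SCHEDULE else "off-schedule")
    raise ValueError("unrecognized status: " + status)

print(classify("Pending"))    # ('open', 'on-schedule')
print(classify("Delayed"))    # ('open', 'off-schedule')
print(classify("Fulfilled"))  # ('closed', None)
```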

    D. Additional Requirements

    If an applicant fails to comply with the original schedule for completion of postmarketing studies or clinical trials required under section 505(o)(3) of the FD&C Act (i.e., under the FDAAA authorities), or fails to submit periodic reports on the status of the studies or clinical trials, the applicant is considered to be in violation of section 505(o)(3), unless it has demonstrated “good cause” for its noncompliance or other violation. Failure to meet an original milestone and, as a result, falling behind the original schedule is one type of noncompliance with a PMR issued under FDAAA. In these circumstances, the FDAAA PMR is considered delayed, with or without good cause.

    Section 505B(a)(3)(B) of the FD&C Act, as amended by the Food and Drug Administration Safety and Innovation Act, authorizes FDA to grant an extension of deferral of pediatric assessments that are required under PREA.9 On its own initiative or upon request, FDA may grant an extension of a pediatric assessment deferral, provided that certain applicable PREA criteria for deferral are still met and the applicant submits certain materials in support of the extension.10 Applicants must submit requests for deferral extensions to FDA not less than 90 days before the date the deferral would otherwise expire. If FDA grants the extension of a pediatric study deferral, this new deferral date is considered the original due date of the PMR. Consequently, the status of PREA PMRs would be determined based on the new deferral date (and not the original PREA PMR schedule).

    9 This provision does not apply to PMRs required under other provisions, or to PMCs.

    10 See section 505B(a)(3)(B) of the FD&C Act.

    FDA may take enforcement action against applicants who are noncompliant with or otherwise fail to conduct studies and clinical trials required under FDA statutes and regulations (see, for example, sections 505(o)(1), 502(z), and 303(f)(4) of the FD&C Act (21 U.S.C. 355(o)(1), 352(z), and 333(f)(4))).

    II. Understanding FDA's Data on Postmarketing Studies and Clinical Trials A. FDA's Internal PMR/PMC Databases

    Databases containing information on PMRs/PMCs are maintained at the Center for Drug Evaluation and Research (CDER) and the Center for Biologics Evaluation and Research (CBER). The information in these databases is periodically updated as new PMRs/PMCs are issued, upon FDA review of PMR/PMC ASRs or other PMR/PMC correspondence, upon receipt of final reports from completed studies and clinical trials, and after the final reports are reviewed and FDA determines that the PMR/PMC has been fulfilled, or when FDA determines that the PMR/PMC is either no longer feasible or would no longer provide useful information. Because applicants typically report on the status of their PMRs/PMCs annually, and because updating the status of PMRs/PMCs in FDA's databases involves FDA review of received information, there is an inherent lag in updating the data (that is, the data are not “real time”). FDA strives to maintain as accurate information as possible on the status of PMRs/PMCs.

    Both CDER and CBER have established policies and procedures to help ensure that FDA's data on PMRs/PMCs are current and accurate. When identified, data discrepancies are addressed as expeditiously as possible and/or are corrected in later reports.

    In 2013, CDER initiated an internal audit of a sample of PMRs and PMCs that had been established after March 25, 2008,11 to ascertain the accuracy of their status. The effort resulted in revisions to the status of certain PMRs/PMCs, and procedures to improve tracking and accuracy of data on PMRs and PMCs. The details of this audit and ensuing activities are summarized in an accompanying supplemental report that is available on FDA's Web site at http://www.fda.gov/Drugs/GuidanceComplianceRegulatoryInformation/Post-marketingPhaseIVCommitments/ucm064436.htm. CDER's internal audit of its PMR/PMC data and subsequent processes for verifying and updating PMR/PMC status took several months to complete, therefore delaying FDA's reporting on PMR/PMC status for fiscal year 2013 (FY2013). As such, this report includes CDER and CBER information for both FY2013 and fiscal year 2014 (FY2014).

    11 This is the effective date of FDAAA. FDAAA included a new requirement for FDA to, among other things, review the entire backlog of PMRs and PMCs to determine which ones required revision or should be eliminated, and assign start dates and estimated completion dates for these PMRs and PMCs. FDAAA also gave new authority to require applicants to conduct and report on postmarketing studies or clinical trials to assess or identify a serious risk related to the use of a drug, and to take action against noncompliance with this requirement. Therefore, the effective date of FDAAA resulted in certain changes to FDA's establishment and monitoring of PMRs and PMCs, and the internal audit was intended to evaluate data for a sample of the PMRs and PMCs that had been established after FDAAA took effect.

    B. Publicly Available PMR/PMC Data

    FDA also maintains an online searchable and downloadable database that contains information about PMRs/PMCs that is publicly reportable (i.e., for which applicants must report on the status of the study or clinical trial, as required under section 506B of the FD&C Act). The data are a subset of all PMRs/PMCs and reflect only those postmarketing studies and clinical trials that, at the time of data retrieval, either had an open status or were closed within the past year. Information on PMRs/PMCs closed more than a year before the date the data are extracted (i.e., September 30 of the reporting fiscal year) are not included on the public Web site. The FDA Web site is updated quarterly.12 The FDA Web site does not include information about PMCs concerning chemistry, manufacturing, and controls. It is FDA policy not to post information on the Web site until it has been verified and reviewed for suitability for public disclosure.

    12 http://www.accessdata.fda.gov/scripts/cder/pmc/index.cfm

    III. About This Report

    This report is published to fulfill the annual reporting requirement under section 506B(c) of the FD&C Act. Information in this report covers any PMR/PMC that was made, in writing, at the time of approval or after approval of an application or a supplement to an application (see section I.A) and summarizes the status of PMRs/PMCs in FY2013 (i.e., as of September 30, 2013) and FY2014 (i.e., as of September 30, 2014). The information in this report reflects the PMR/PMC status in CBER's and CDER's databases at the time the data were extracted (September 30 of the fiscal year). Specifically, the report summarizes the status of all open PMRs/PMCs at the end of the fiscal year, and the status of only those PMRs/PMCs that were closed in the fiscal year. If a requirement or commitment did not have a schedule, or an ASR was not received in the previous 12 months, the PMR/PMC is categorized according to the most recent information available to the Agency.13

    13 Although the data included in this report do not include a summary of reports that applicants have failed to file by their due date, the Agency notes that their inclusion or description in this report has no effect on the Agency's ability to take appropriate regulatory action in the event reports are not filed on a timely basis.

    This report reflects combined data from CDER and CBER. Information summarized in the report includes the following: (1) The number of applicants with open PMRs/PMCs 14 ; (2) the number of open PMRs/PMCs; (3) the number of applications for which an ASR was expected but was not submitted within 60 days of the anniversary date of U.S. approval or an alternate reporting date that was granted by FDA; (4) FDA-verified status of open PMRs/PMCs reported in 21 CFR 314.81(b)(2)(vii) or 601.70 ASRs; (5) the status of closed PMRs/PMCs; and (6) the distribution of the status by fiscal year of establishment 15 (fiscal year 2008 (FY2008) to FY2014) for PMRs and PMCs that were open at the end of FY2014 or closed within FY2014. The tables in this report distinguish between PMRs and PMCs, PMRs/PMCs for NDAs and BLAs,16 and on-schedule and off-schedule PMRs/PMCs, according to the original schedule milestones. A more detailed summary of this information and additional information about PMRs/PMCs is provided on FDA's Web site at http://www.fda.gov/Drugs/GuidanceComplianceRegulatoryInformation/Post-marketingPhaseIVCommitments/default.htm. In the accompanying supplemental report, information is presented separately for CDER and CBER.

    14 At the end of FY2013 and FY2014, there were no PMRs/PMCs for ANDAs that met the reporting requirements under FDAMA. Therefore, this report reflects information for NDAs and BLAs only.

    15 The establishment date is the date of the formal FDA communication to the applicant that included the final FDA required (PMR), or requested (PMC), postmarketing study or clinical trial.

    16 Before July 2014, all BLA PMR/PMC data were maintained in CBER's data system. In July 2014, the data for CDER-managed BLAs were migrated to CDER's data system. Similar to previous reports, this report presents data for CDER and CBER BLAs combined.

    Numbers published in this report and in the accompanying supplemental report on FDA's Web site cannot be compared with the numbers resulting from searches of the publicly accessible and downloadable database. This is because this report incorporates data for all PMRs/PMCs in FDA databases as of the end of the fiscal year, including PMRs/PMCs undergoing review for accuracy. The publicly accessible and downloadable database includes a subset of PMRs/PMCs, specifically those that, at the time of data retrieval, either had an open status or were closed within the past 12 months. In addition, the status information in this report is updated annually while the downloadable database is updated quarterly (i.e., in January, April, July, and October).

    IV. Summary of Information on PMR/PMC Status

    This report provides information on PMRs/PMCs as of September 30, 2013 (i.e., for FY2013) and September 30, 2014 (i.e., for FY2014). It is important to note that a comparison of the number of open and on-schedule or off-schedule PMRs/PMCs over time can be misleading because it does not take into account that the cohort of open PMRs/PMCs is not static from year to year. New PMRs/PMCs are continually being established for studies and clinical trials with varying start dates and durations; and other PMRs/PMCs are closed because they are either fulfilled or released. Also, ongoing PMRs/PMCs are carried forward into the subsequent fiscal year. Therefore, the number of on- and off-schedule PMRs/PMCs can vary from year to year, and a year-to-year comparison of on- or off-schedule PMRs/PMCs (e.g., to assess for a potential trend) is not appropriate.

    Although a comparison of the number of open and on-schedule or off-schedule PMRs/PMCs over time is not appropriate for the aforementioned reasons, a comparison of the data for FY2013 and FY2014 may be helpful in understanding the effect of CDER's 2013 audit. The observed differences are considered to reflect the results of CDER's efforts to update the information on the statuses of PMRs and PMCs following the internal audit of the data for a sample of PMRs/PMCs (see section II.A), as well as the natural progress of postmarketing studies and clinical trials over time. Finally, due to rounding, the percentages in the tables may not add up to 100 percent.

    A. Applicants With Open PMRs/PMCs

    An applicant may have multiple approved drug products, and an approved drug product may have multiple PMRs and/or PMCs. Table 1 shows that as of September 30, 2013, there were 256 unique applicants with open PMRs/PMCs under 613 unique NDAs and BLAs. There were 184 unique NDA applicants (and 496 associated applications) and 72 unique BLA applicants (and 117 associated applications) with open PMRs/PMCs.

    As of September 30, 2014, there were 257 unique applicants with open PMRs/PMCs under 639 unique NDAs and BLAs. There were 181 unique NDA applicants (and 510 associated applications) and 76 unique BLA applicants (and 129 associated applications) with open PMRs/PMCs.

    B. Annual Status Reports Received

    As previously mentioned, applicants must submit an ASR on the progress of each open PMR/PMC within 60 days of the anniversary date of U.S. approval of the original application or an alternate reporting date that was granted by FDA (21 CFR 314.81 and 21 CFR 601.70).17 Table 2 shows that there were 530 NDAs and BLAs with an ASR due in FY2013 (429 NDAs and 101 BLAs).18 Of the NDA ASRs due in that fiscal year, 60 percent (257/429) were received on time, 21 percent (90/429) were not received on time, and 19 percent (82/429) were not received during FY2013. There were 101 BLAs with an ASR due in FY2013. Of the BLA ASRs due, 69 percent (70/101) were received on time, 20 percent (20/101) were not received on time, and 11 percent (11/101) were not received during FY2013.

    17 An applicant must submit an ASR on the progress of each open PMR/PMC within 60 days of the anniversary date of U.S. approval of the original application or on an alternate reporting date that was granted by FDA in writing. Some applicants have requested and been granted by FDA alternate annual reporting dates to facilitate harmonized reporting across multiple applications.

    18 The number of ASRs that were expected is different from the total number of unique applications with open PMRs/PMCs because not all applications had an ASR due during FY2013/FY2014. Applicants with PMRs/PMCs associated with multiple applications may have submitted the ASR to only one of the applications. In addition, if all of the PMRs/PMCs for an application were established in the preceding fiscal year, or if all PMRs/PMCs for an application were closed before the ASR due date, submission of an ASR would not have been expected.

    There were 569 NDAs and BLAs with an ASR due in FY2014 (454 NDAs and 115 BLAs). Of the 454 NDA ASRs due in that fiscal year, 58 percent (265/454) were received on time, 19 percent (88/454) were not received on time, and 22 percent (101/454) were not received during FY2014. Of the 115 BLA ASRs due, 63 percent (73/115) were received on time, 19 percent (20/115) were not received on time, and 19 percent (22/115) were not received during FY2014.
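    As an illustrative check (an editorial sketch, not part of FDA's report), each percentage above is simply the count of ASRs in a category divided by the number expected, rounded to the nearest whole percent. For the FY2014 NDA figures:

    \[
      \frac{265}{454} \approx 58.4\% \;(58\%), \qquad \frac{88}{454} \approx 19.4\% \;(19\%), \qquad \frac{101}{454} \approx 22.2\% \;(22\%).
    \]

    Because each share is rounded independently, the rounded values may total slightly more or less than 100 percent, which is why the tables carry a rounding footnote.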

    C. Overview of On- and Off-Schedule Open PMRs/PMCs

    Table 3 shows that as of September 30, 2013, most open PMRs (84 percent for NDAs and 89 percent for BLAs) and most open PMCs (77 percent for NDAs and 74 percent for BLAs) were progressing on schedule (i.e., were not delayed or terminated). Similarly, as of September 30, 2014, most open PMRs (87 percent for NDAs and 88 percent for BLAs) and most open PMCs (68 percent for NDAs and 77 percent for BLAs) were progressing on schedule.

    D. Open and On-Schedule PMRs

    Table 4 shows that as of September 30, 2013, the majority of open NDA PMRs (60 percent; 534/887) were pending, as was the largest share of open BLA PMRs (45 percent; 80/179).19 This is similar to the findings from fiscal year 2012.20 As of September 30, 2014, 48 percent (456/943) of open NDA PMRs and 38 percent (74/194) of open BLA PMRs were pending. Table 4 also shows that the proportion of open NDA PMRs that were categorized as ongoing increased from 19 percent (166/887) at the end of FY2013 to 32 percent (303/943) at the end of FY2014.

    19 It is important to note that PMRs/PMCs that are in pending status are not yet delayed; that is, per the milestones, the studies/clinical trials are indeed on schedule and are not expected to be underway yet.

    20 As of September 30, 2012, 58 percent of open NDA PMRs and 46 percent of open BLA PMRs were pending (79 FR 9230, February 18, 2014).

    Table 4 also shows that the proportion of open BLA PMRs that were pending decreased between FY2013 (45 percent; 80/179) and FY2014 (38 percent; 74/194). The proportion of open BLA PMRs that were ongoing did not change substantially between FY2013 (32 percent; 57/179) and FY2014 (35 percent; 68/194).

    In addition, table 4 provides detail on the status of open PMRs and PMCs for each category of PMR. The table shows that as of September 30, 2013, 50 percent (305/614) of pending PMRs for drug and biological products were in response to the requirements under PREA. The next largest category of pending PMRs for drug and biological products (47 percent; 286/614) comprises those studies/clinical trials required by FDA under FDAAA. As of September 30, 2014, PREA PMRs and FDAAA PMRs comprised 55 percent (292/530) and 42 percent (222/530) of pending PMRs, respectively.

    E. Open and Off-Schedule PMRs

    Table 5 provides additional information on the status of open and off-schedule (i.e., delayed and terminated) PMRs. At the end of FY2013, 16 percent (143/887) of the open NDA PMRs and 11 percent (20/179) of the open BLA PMRs were off-schedule. The majority of the off-schedule NDA PMRs (98 percent; 140/143) were delayed; the remaining 2 percent (3/143) were terminated. At the end of that same fiscal year, 10 percent (18/179) of the open BLA PMRs were delayed and 1 percent (2/179) were terminated. Most of the off-schedule BLA PMRs (90 percent; 18/20) were delayed.

    As of September 30, 2014, 13 percent (126/943) of the open NDA PMRs were off-schedule. Of the off-schedule NDA PMRs, 94 percent (118/126) were delayed and the remaining 6 percent (8/126) were terminated. At the end of FY2014, 12 percent (24/194) of the open BLA PMRs were off-schedule. The majority of the off-schedule BLA PMRs (88 percent; 21/24) were delayed; the remaining three (2 percent of open BLA PMRs; 3/194) were terminated.

    In certain situations, the original PMR schedules were adjusted for unanticipated delays in the progress of the study or clinical trial (e.g., difficulties with subject enrollment in a clinical trial for a marketed drug or need for additional time to analyze results). In this report, study or clinical trial status reflects the status in relation to the original 21 study or clinical trial schedule regardless of whether FDA has acknowledged that additional time was required to complete the study or clinical trial.

    21 With the exception of PREA PMRs for which a deferral extension of the final report submission date has been granted.

    F. Open On-Schedule and Off-Schedule PMCs

    Table 6 provides the status of open on-schedule and off-schedule PMCs. As shown in the table, pending NDA PMCs comprised the largest category of all open NDA PMCs as of September 30, 2013 (37 percent; 97/264), and September 30, 2014 (29 percent; 61/207). Among all open BLA PMCs, 35 percent (88/251) and 30 percent (69/228) were pending at the end of FY2013 and FY2014, respectively.

    As of September 30, 2013, most off-schedule PMCs were delayed, rather than terminated, according to the original schedule milestones.22 Similarly, as of September 30, 2014, the majority of off-schedule NDA and BLA PMCs were delayed according to the original schedule milestones.23

    22 As of September 30, 2013, off-schedule PMCs accounted for 23 percent (61/264) of open NDA PMCs and 26 percent (65/251) of open BLA PMCs.

    23 As of September 30, 2014, off-schedule PMCs accounted for 32 percent (66/207) of open NDA PMCs and 23 percent (53/228) of open BLA PMCs.

    G. Closed PMRs and PMCs

    Table 7 provides details about PMRs and PMCs that were closed (released or fulfilled) within FY2013 and FY2014. The majority of closed PMRs were fulfilled (53 percent of NDA PMRs and 88 percent of BLA PMRs at the end of FY2013; 72 percent of NDA PMRs and 77 percent of BLA PMRs at the end of FY2014). Similarly, the majority of PMCs closed within FY2013 and FY2014 were fulfilled.

    H. Distribution of the Status of PMRs and PMCs

    Tables 8 and 9 show the distribution of the statuses of all PMRs and PMCs as of September 30, 2014, presented by the fiscal year in which the PMR/PMC was established (FY2008 to FY2014).24 Note that the data shown for closed (fulfilled or released) PMRs/PMCs are for all PMRs/PMCs that were closed as of FY2014; therefore, data for PMRs/PMCs that were closed in prior fiscal years are included. Based on the data shown in table 8, an average of 243 PMRs were established each year since fiscal year 2009.25 26 Most PMRs that were established in the earlier years were either fulfilled or released. For example, as of September 30, 2014, 45 percent (57/128) of the PMRs that were established in FY2008 were fulfilled, and 22 percent (28/128) were released. The majority of PMRs that were established in more recent years were either pending (i.e., not yet underway) or ongoing (i.e., still in progress and on schedule). For example, as of September 30, 2014, 87 percent (226/260) of the PMRs established in FY2014 were pending, and 9 percent (23/260) were ongoing. Overall, of the PMRs that were pending as of September 30, 2014, 81 percent (414/512) were established within the past 3 years. Finally, table 8 shows that, on average, 7 percent of the PMRs established since FY2008 were delayed (as of September 30, 2014). Table 9 provides an analogous overview for PMCs and shows a similar pattern.

    24 The establishment date is the date of the formal FDA communication to the applicant that included the final FDA required (PMR) or requested (PMC) postmarketing study or clinical trial.

    25 The number of PMRs issued at any particular period is determined by a variety of factors including but not necessarily limited to: (1) The number of NDAs approved in that period; (2) whether additional efficacy or clinical benefit issues were evaluated; (3) if any drug-associated serious risk(s) have been identified; and (4) whether or not FDA determines that a postmarketing study or clinical trial is necessary to further assess risk(s) or efficacy issues.

    26 Data for FY2008 were not included in the calculation of the average number of PMRs established each year because, given that FDAAA took effect on March 25, 2008, data are only available for a partial fiscal year.
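    As a worked check on the average cited above in section IV.H (an editorial sketch using only the table 8 column totals, with FY2008 excluded per footnote 26):

    \[
      \frac{244 + 231 + 259 + 205 + 262 + 260}{6} = \frac{1{,}461}{6} \approx 243.5,
    \]

    which is consistent with the approximately 243 PMRs established per fiscal year reported above.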

    Table 1—Applicants and Applications (NDA/BLA) With Open Postmarketing Requirements and Commitments
    [Numbers as of September 30, 2013, and September 30, 2014]

                                                          NDA 1    BLA 2    Total (NDA and BLA)
    FY 2013:
      Number of unique applicants with open PMRs/PMCs     184       72      256
      Number of applications with open PMRs/PMCs          496      117      613
    FY 2014:
      Number of unique applicants with open PMRs/PMCs     181       76      257
      Number of applications with open PMRs/PMCs          510      129      639

    1 Includes two NDAs with associated PMRs/PMCs managed by CBER.
    2 Includes BLAs managed by both CDER and CBER.
    Table 2—Annual Status Reports Received
    [Numbers as of September 30, 2013, and September 30, 2014] 1

                 Expected 2    Received, on time 3    Received, not on time 4    Expected but not received
                               (% of expected)        (% of expected)            (% of expected)
    FY 2013:
      NDA           429          257 (60%)               90 (21%)                   82 (19%)
      BLA           101           70 (69%)               20 (20%)                   11 (11%)
    FY 2014:
      NDA           454          265 (58%)               88 (19%)                  101 (22%)
      BLA           115           73 (63%)               20 (19%)                   22 (19%)

    1 Percentages may not total 100 due to rounding.
    2 ASR expected during fiscal year (within 60 days (before or after) of the anniversary of original approval date or alternate agreed-upon date).
    3 ASR was received within 60 days (before or after) of the anniversary of the original approval date or alternate agreed-upon date.
    4 ASR was received, but not within 60 days (before or after) of the anniversary of the original approval date or alternate agreed-upon date.
    Table 3—Summary of On- and Off-Schedule Postmarketing Requirements and Commitments
    [Numbers as of September 30, 2013, and September 30, 2014] 1

    FY 2013             Open PMRs (N = 1,066)                          Open PMCs (N = 515)
                        NDA (% of Open      BLA (% of Open             NDA (% of Open      BLA (% of Open
                        NDA PMRs)           BLA PMRs)                  NDA PMCs)           BLA PMCs)
      On-schedule       744 (84%)           159 (89%)                  203 (77%)           186 (74%)
      Off-schedule      143 (16%)            20 (11%)                   61 (23%)            65 (26%)
      Total             887                 179                        264                 251

    FY 2014             Open PMRs (N = 1,137)                          Open PMCs (N = 435)
                        NDA (% of Open      BLA (% of Open             NDA (% of Open      BLA (% of Open
                        NDA PMRs)           BLA PMRs)                  NDA PMCs)           BLA PMCs)
      On-schedule       817 (87%)           170 (88%)                  141 (68%)           175 (77%)
      Off-schedule      126 (13%)            24 (12%)                   66 (32%)            53 (23%)
      Total             943                 194                        207                 228

    1 Percentages may not total 100 due to rounding.
    Table 4—Summary of Open and On-Schedule Postmarketing Requirements
    [Numbers as of September 30, 2013, and September 30, 2014] 1

    FY 2013                                      NDA (N = 887)                            BLA (N = 179)
                                                 (% of Total open NDA PMRs)               (% of Total open BLA PMRs)
    Reporting authority/PMR status               Pending     Ongoing     Submitted        Pending     Ongoing     Submitted
      Accelerated approval                       17 (2%)     12 (1%)     1 (<1%)          1 (<1%)     8 (4%)      0
      PREA 2                                     272 (31%)   65 (7%)     10 (1%)          33 (18%)    13 (7%)     4 (2%)
      Animal efficacy 3                          2 (<1%)     0           0                3 (2%)      0           0
      FDAAA safety (since March 25, 2008) 4      243 (27%)   89 (10%)    33 (4%) 5        43 (24%)    36 (20%)    18 (10%)
      Total                                      534 (60%)   166 (19%)   44 (5%)          80 (45%)    57 (32%)    22 (12%)

    FY 2014                                      NDA (N = 943)                            BLA (N = 194)
                                                 (% of Total open NDA PMRs)               (% of Total open BLA PMRs)
    Reporting authority/PMR status               Pending     Ongoing     Submitted        Pending     Ongoing     Submitted
      Accelerated approval                       8 (<1%)     26 (3%)     0                3 (2%)      4 (2%)      2 (1%)
      PREA                                       253 (27%)   136 (14%)   27 (3%)          39 (20%)    20 (10%)    8 (4%)
      Animal efficacy                            2 (<1%)     0           1 (<1%)          3 (2%)      0           0
      FDAAA safety (since March 25, 2008) 6      193 (20%)   141 (15%)   30 (3%)          29 (15%)    44 (23%)    18 (9%)
      Total                                      456 (48%)   303 (32%)   58 (6%)          74 (38%)    68 (35%)    28 (14%)

    1 Percentages may not total 100 due to rounding.
    2 Many PREA studies have a pending status. PREA studies are usually deferred because the drug product is ready for approval in adults. Initiation of these studies may be deferred until additional safety information from other studies has first been submitted and reviewed before beginning the studies in pediatric populations.
    3 PMRs for drug products approved under the animal efficacy rule (21 CFR 314.600 for drugs; 21 CFR 601.90 for biological products) can be conducted only when the drug product is used for its indication and when an exigency (or event or need) arises. In the absence of a public health emergency, these studies or clinical trials will remain pending indefinitely.
    4 Includes one NDA PMR FDAAA safety study from CBER in pending status.
    5 Includes one NDA PMR FDAAA safety study from CBER in submitted status.
    6 Includes one NDA PMR FDAAA safety study from CBER in pending status.
    Table 5—Summary of Open and Off-Schedule Postmarketing Requirements
    [Numbers as of September 30, 2013, and September 30, 2014] 1

    FY 2013                                      NDA (N = 887)                  BLA (N = 179)
                                                 (% of Open NDA PMRs)           (% of Open BLA PMRs)
    Reporting authority/PMR status               Delayed      Terminated        Delayed      Terminated
      Accelerated approval                       7 (0.8%)     1 (0.1%)          1 (0.6%)     0
      PREA                                       94 (11%)     2 (0.2%)          6 (3%)       2 (1%)
      Animal efficacy                            1 (0.1%)     0                 0            0
      FDAAA safety (since March 25, 2008)        38 (4%)      0                 11 (6%)      0
      Total                                      140 (16%)    3 (0.3%)          18 (10%)     2 (1%)

    FY 2014                                      NDA (N = 943)                  BLA (N = 194)
                                                 (% of Open NDA PMRs)           (% of Open BLA PMRs)
    Reporting authority/PMR status               Delayed      Terminated        Delayed      Terminated
      Accelerated approval                       6 (0.6%)     2 (0.2%)          2 (1%)       0
      PREA                                       67 (7%)      2 (0.2%)          5 (3%)       3 (2%)
      Animal efficacy                            0            0                 0            0
      FDAAA safety (since March 25, 2008)        45 (5%)      4 (0.4%)          14 (7%)      0
      Total                                      118 (13%)    8 (0.8%)          21 (11%)     3 (2%)

    1 Percentages may not total 100 due to rounding.
    Table 6—Summary of Open Postmarketing Commitments
    [Numbers as of September 30, 2013, and September 30, 2014] 1

                        FY 2013                                FY 2014
                        NDA (N = 264)     BLA (N = 251)        NDA (N = 207)     BLA (N = 228)
                        (% Open PMCs)     (% Open PMCs)        (% Open PMCs)     (% Open PMCs)
    On-Schedule:
      Pending           97 (37%)          88 (35%)             61 (29%)          69 (30%)
      Ongoing           61 (23%)          61 (24%)             49 (24%)          76 (33%)
      Submitted         45 (17%)          37 (15%)             31 (15%)          30 (13%)
      Total             203 (77%)         186 (74%)            141 (68%)         175 (77%)
    Off-Schedule:
      Delayed           56 (21%)          63 (25%)             63 (30%)          51 (22%)
      Terminated        5 (2%)            2 (0.8%)             3 (1%)            2 (1%)
      Total             61 (23%)          65 (26%)             66 (32%)          53 (23%)

    1 Percentages may not total 100 due to rounding.
    Table 7—Summary of Closed 1 Postmarketing Requirements and Commitments
    [Numbers as of September 30, 2013, and September 30, 2014] 2

    Postmarketing requirements                        FY 2013                          FY 2014
                                                      NDA           BLA                NDA           BLA
                                                      (N = 134)     (N = 17)           (N = 188)     (N = 30)
    Closed PMRs (% of Total Closed PMRs):
      Requirement met (fulfilled)                     71 (53%)      15 (88%)           136 (72%)     23 (77%)
      Requirement not met (released and new
        revised requirement issued)                   27 (20%)      1 (6%)             14 (7%)       3 (10%)
      Requirement no longer feasible or drug
        product withdrawn (released)                  36 (27%)      1 (6%)             38 (20%)      4 (13%)

    Postmarketing commitments                         FY 2013                          FY 2014
                                                      NDA           BLA                NDA           BLA
                                                      (N = 53)      (N = 33)           (N = 96)      (N = 70)
    Closed PMCs (% of Total Closed PMCs):
      Requirement met (fulfilled)                     42 (79%)      28 (85%)           84 (88%)      57 (81%)
      Requirement not met (released and new
        revised requirement issued)                   0             0                  0             2 (3%)
      Requirement no longer feasible or drug
        product withdrawn (released)                  11 (21%)      5 (15%)            12 (13%)      11 (16%)

    1 The table shows data for only those PMRs/PMCs that were closed (fulfilled or released) within the fiscal year. Therefore, data for PMRs/PMCs that were closed in prior fiscal years are not included.
    2 Percentages may not total 100 due to rounding.
    Table 8—Summary of Status of Postmarketing Requirements Established Between FY 2008 and FY 2014 1 2
    [Numbers as of September 30, 2014] 3

    PMR status as of FY 2014                       Fiscal year of PMR establishment
    (% of total PMRs in each establishment year)   2008       2009       2010       2011       2012       2013       2014
      Pending                                      11 (9%)    15 (6%)    29 (13%)   43 (17%)   60 (29%)   128 (49%)  226 (87%)
      Ongoing                                      20 (16%)   51 (20%)   49 (21%)   74 (29%)   58 (28%)   62 (24%)   23 (9%)
      Submitted                                    1 (<1%)    11 (5%)    21 (9%)    8 (3%)     15 (7%)    19 (7%)    1 (<1%)
      Delayed                                      11 (9%)    26 (11%)   18 (8%)    19 (7%)    18 (9%)    19 (7%)    0
      Terminated                                   0          2 (<1%)    0          0          1 (<1%)    3 (1%)     1 (<1%)
      Released                                     28 (22%)   51 (21%)   22 (10%)   43 (17%)   20 (10%)   8 (3%)     1 (<1%)
      Fulfilled                                    57 (45%)   88 (36%)   92 (40%)   72 (28%)   33 (16%)   23 (9%)    8 (3%)
      Total                                        128        244        231        259        205        262        260

    1 The establishment date is the date of the formal FDA communication to the applicant that included the final FDA required (PMR) or requested (PMC) postmarketing study or clinical trial.
    2 The table shows data for PMRs that were closed (fulfilled or released) as of FY2014. Therefore, data for PMRs that were closed in prior fiscal years are included.
    3 Percentages may not total 100 due to rounding.
    Table 9—Summary of Status of Postmarketing Commitments Established Between FY 2008 and FY 2014 1 2
    [Numbers as of September 30, 2014] 3

    PMC status as of FY 2014                       Fiscal year of PMC establishment
    (% of total PMCs in each establishment year)   2008       2009       2010       2011       2012       2013       2014
      Pending                                      1 (1%)     4 (9%)     3 (3%)     11 (13%)   12 (23%)   22 (45%)   47 (82%)
      Ongoing                                      11 (9%)    5 (11%)    16 (18%)   25 (30%)   16 (30%)   14 (29%)   9 (16%)
      Submitted                                    1 (1%)     6 (13%)    9 (10%)    2 (2%)     5 (9%)     6 (12%)    0
      Delayed                                      8 (7%)     8 (17%)    16 (18%)   8 (10%)    6 (11%)    3 (6%)     0
      Terminated                                   0          1 (2%)     0          0          0          0          0
      Released                                     12 (10%)   3 (6%)     6 (7%)     7 (9%)     0          0          0
      Fulfilled                                    86 (72%)   20 (43%)   40 (44%)   29 (35%)   14 (26%)   4 (8%)     1 (2%)
      Total                                        119        47         90         82         53         49         57

    1 The establishment date is the date of the formal FDA communication to the applicant that included the final FDA required (PMR) or requested (PMC) postmarketing study or clinical trial.
    2 The table shows data for PMCs that were closed (fulfilled or released) as of FY2014. Therefore, data for PMCs that were closed in prior fiscal years are included.
    3 Percentages may not total 100 due to rounding.
    Dated: October 25, 2016. Leslie Kux, Associate Commissioner for Policy.
    [FR Doc. 2016-26247 Filed 10-28-16; 8:45 am] BILLING CODE 4164-01-P
    DEPARTMENT OF HEALTH AND HUMAN SERVICES Food and Drug Administration [Docket No. FDA-2016-D-0435] Labeling for Permanent Hysteroscopically Placed Tubal Implants Intended for Sterilization; Guidance for Industry and Food and Drug Administration Staff; Availability AGENCY:

    Food and Drug Administration, HHS.

    ACTION:

    Notice of availability.

    SUMMARY:

    The Food and Drug Administration (FDA or Agency) is announcing the availability of the guidance entitled “Labeling for Permanent Hysteroscopically-Placed Tubal Implants Intended for Sterilization.” This guidance addresses the inclusion of a boxed warning and patient decision checklist in the product labeling for permanent hysteroscopically placed tubal implants intended for female sterilization, and the content and format of those materials. FDA believes that the labeling described in this guidance will help to ensure that a woman receives and understands information regarding the benefits and risks of this type of device prior to undergoing implantation. FDA considered comments received on the draft guidance and revised the guidance as appropriate.

    The guidance identifies the content and format of certain labeling components for permanent, hysteroscopically placed tubal implants that are intended for sterilization. The guidance applies to all devices of this type, regardless of the insert material composition, location of intended implantation, or exact method of delivery.

    DATES:

    Submit either electronic or written comments on this guidance at any time. General comments on Agency guidance documents are welcome at any time.

    ADDRESSES:

    You may submit comments as follows:

    Electronic Submissions

    Submit electronic comments in the following way:

    Federal eRulemaking Portal: http://www.regulations.gov. Follow the instructions for submitting comments. Comments submitted electronically, including attachments, to http://www.regulations.gov will be posted to the docket unchanged. Because your comment will be made public, you are solely responsible for ensuring that your comment does not include any confidential information that you or a third party may not wish to be posted, such as medical information, your or anyone else's Social Security number, or confidential business information, such as a manufacturing process. Please note that if you include your name, contact information, or other information that identifies you in the body of your comments, that information will be posted on http://www.regulations.gov.

    • If you want to submit a comment with confidential information that you do not wish to be made available to the public, submit the comment as a written/paper submission and in the manner detailed (see “Written/Paper Submissions” and “Instructions”).

    Written/Paper Submissions

    Submit written/paper submissions as follows:

    Mail/Hand delivery/Courier (for written/paper submissions): Division of Dockets Management (HFA-305), Food and Drug Administration, 5630 Fishers Lane, Rm. 1061, Rockville, MD 20852.

    • For written/paper comments submitted to the Division of Dockets Management, FDA will post your comment, as well as any attachments, except for information submitted, marked and identified, as confidential, if submitted as detailed in “Instructions.”

    Instructions: All submissions received must include the Docket No. FDA-2016-D-0435 for “Labeling for Permanent Hysteroscopically-Placed Tubal Implants Intended for Sterilization, Guidance for Industry and Food and Drug Administration Staff.” Received comments will be placed in the docket and, except for those submitted as “Confidential Submissions,” publicly viewable at http://www.regulations.gov or at the Division of Dockets Management between 9 a.m. and 4 p.m., Monday through Friday.

    • Confidential Submissions—To submit a comment with confidential information that you do not wish to be made publicly available, submit your comments only as a written/paper submission. You should submit two copies total. One copy will include the information you claim to be confidential with a heading or cover note that states “THIS DOCUMENT CONTAINS CONFIDENTIAL INFORMATION.” The Agency will review this copy, including the claimed confidential information, in its consideration of comments. The second copy, which will have the claimed confidential information redacted/blacked out, will be available for public viewing and posted on http://www.regulations.gov. Submit both copies to the Division of Dockets Management. If you do not wish your name and contact information to be made publicly available, you can provide this information on the cover sheet and not in the body of your comments and you must identify this information as “confidential.” Any information marked as “confidential” will not be disclosed except in accordance with 21 CFR 10.20 and other applicable disclosure law. For more information about FDA's posting of comments to public dockets, see 80 FR 56469, September 18, 2015, or access the information at: http://www.fda.gov/regulatoryinformation/dockets/default.htm.

    Docket: For access to the docket to read background documents or the electronic and written/paper comments received, go to http://www.regulations.gov and insert the docket number, found in brackets in the heading of this document, into the “Search” box and follow the prompts and/or go to the Division of Dockets Management, 5630 Fishers Lane, Rm. 1061, Rockville, MD 20852.

    An electronic copy of the guidance document is available for download from the Internet. See the SUPPLEMENTARY INFORMATION section for information on electronic access to the guidance. Submit written requests for a single hard copy of the guidance document entitled “Labeling for Permanent Hysteroscopically-Placed Tubal Implants Intended for Sterilization, Guidance for Industry and Food and Drug Administration Staff” to the Office of the Center Director, Guidance and Policy Development, Center for Devices and Radiological Health, Food and Drug Administration, 10903 New Hampshire Ave., Bldg. 66, Rm. 5431, Silver Spring, MD 20993-0002. Send one self-addressed adhesive label to assist that office in processing your request.

    FOR FURTHER INFORMATION CONTACT:

    Jason Roberts, Division of Reproductive, Gastro-Renal and Urological Devices, Center for Devices and Radiological Health, Food and Drug Administration, 10903 New Hampshire Ave., Bldg. 66, Rm. G218, Silver Spring, MD 20993-0002, 240-402-6400.

    SUPPLEMENTARY INFORMATION: I. Background

    Female sterilization is a commonly performed surgical procedure that permanently prevents a woman from becoming pregnant by occluding her fallopian tubes. Traditionally, such surgery has been performed by surgical bilateral tubal ligation (BTL) through a laparotomy, a mini-laparotomy, a transvaginal approach or at the time of cesarean delivery, and, more recently, laparoscopy. During surgical BTL, the fallopian tubes are cut or physically occluded by using various procedures or medical instruments, such as electrosurgical coagulation or implantable clips or rings. On November 4, 2002, FDA approved the Essure System for Permanent Birth Control, the first permanent hysteroscopically placed tubal implant, as an alternative, non-incisional method of providing female sterilization. As the number of hysteroscopic sterilizations with such devices has increased, additional information, including reports of adverse events, has accumulated. Some of these events have resulted in surgery and/or removal of the implants.

    In the Federal Register on July 22, 2015 (80 FR 43440), FDA announced a meeting of a public advisory committee to seek expert scientific and clinical opinion on the risks and benefits of the Essure System for Permanent Birth Control. On September 24, 2015, FDA convened its Obstetrics and Gynecology Devices Panel of the Medical Devices Advisory Committee to discuss available data regarding benefits, risks, and potential mitigation strategies to prevent or reduce the frequency/severity of the adverse events reported in association with this device (Ref. 1).

    A draft guidance regarding the labeling for permanent hysteroscopically placed tubal implants intended for sterilization was announced in the Federal Register on March 4, 2016 (81 FR 11577) and made available for public comment. The comment period closed on May 3, 2016. FDA reviewed and considered all public comments received and revised the guidance as appropriate, including revisions to the content and format of a boxed warning and patient decision checklist. FDA intends to require such labeling as part of a premarket approval application (PMA) for hysteroscopically placed tubal implants intended for sterilization (or a PMA supplement for an already marketed device).

    II. Significance of Guidance

    This guidance is being issued consistent with FDA's good guidance practices regulation (21 CFR 10.115). The guidance represents the current thinking of FDA on “Labeling for Permanent Hysteroscopically-Placed Tubal Implants Intended for Sterilization.” It does not establish any rights for any person and is not binding on FDA or the public. You can use an alternative approach if it satisfies the requirements of the applicable statutes and regulations.

    III. Electronic Access

    Persons interested in obtaining a copy of the guidance may do so by downloading an electronic copy from the Internet. A search capability for all Center for Devices and Radiological Health guidance documents is available at http://www.fda.gov/MedicalDevices/DeviceRegulationandGuidance/GuidanceDocuments/default.htm. Guidance documents are also available at http://www.regulations.gov. Persons unable to download an electronic copy of “Labeling for Permanent Hysteroscopically-Placed Tubal Implants Intended for Sterilization” may send an email request to [email protected] to receive an electronic copy of the document. Please use the document number 1500051 to identify the guidance you are requesting.

    IV. Paperwork Reduction Act of 1995

    This guidance refers to previously approved collections of information found in FDA regulations. These collections of information are subject to review by the Office of Management and Budget (OMB) under the Paperwork Reduction Act of 1995 (44 U.S.C. 3501-3520). The collections of information in 21 CFR part 801, regarding labeling, have been approved under OMB control number 0910-0485.

    V. References

    The following reference is on display in the Division of Dockets Management (see ADDRESSES) and is available for viewing by interested persons between 9 a.m. and 4 p.m., Monday through Friday; it is also available electronically at http://www.regulations.gov. FDA has verified the Web site address, as of the date this document publishes in the Federal Register, but Web sites are subject to change over time.

    1. Meeting Materials of the Obstetrics and Gynecology Devices Panel (2015), available at http://www.fda.gov/AdvisoryCommittees/CommitteesMeetingMaterials/MedicalDevices/MedicalDevicesAdvisoryCommittee/ObstetricsandGynecologyDevices/ucm463457.htm.
    Dated: October 26, 2016. Leslie Kux, Associate Commissioner for Policy.
    [FR Doc. 2016-26243 Filed 10-28-16; 8:45 am] BILLING CODE 4164-01-P
    DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review; Notice of Closed Meetings

    Pursuant to section 10(d) of the Federal Advisory Committee Act, as amended (5 U.S.C. App.), notice is hereby given of the following meetings.

    The meetings will be closed to the public in accordance with the provisions set forth in sections 552b(c)(4) and 552b(c)(6), Title 5 U.S.C., as amended. The grant applications and the discussions could disclose confidential trade secrets or commercial property such as patentable material, and personal information concerning individuals associated with the grant applications, the disclosure of which would constitute a clearly unwarranted invasion of personal privacy.

    Name of Committee: Center for Scientific Review Special Emphasis Panel; Small Business: Cell and Molecular Biology.

    Date: November 16-17, 2016.

    Time: 8:00 a.m. to 5:00 p.m.

    Agenda: To review and evaluate grant applications.

    Place: Embassy Suites at the Chevy Chase Pavilion, 4300 Military Road NW., Washington, DC 20015.

    Contact Person: Amy Kathleen Wernimont, Ph.D., Scientific Review Officer, Center for Scientific Review, 6701 Rockledge Drive, Bethesda, MD 20892, 301-827-6427, [email protected].

    Name of Committee: Center for Scientific Review Special Emphasis Panel; PAR Panel: Studies of HIV/AIDS and Aging.

    Date: November 21, 2016.

    Time: 10:00 a.m. to 5:00 p.m.

    Agenda: To review and evaluate grant applications.

    Place: National Institutes of Health, 6701 Rockledge Drive, Bethesda, MD 20892 (Virtual Meeting).

    Contact Person: Robert Freund, Ph.D., Scientific Review Officer, Center for Scientific Review, National Institutes of Health, 6701 Rockledge Drive, Room 5216, MSC 7852, Bethesda, MD 20892, 301-435-1050, [email protected].

    Name of Committee: Center for Scientific Review Special Emphasis Panel; Musculoskeletal, Oral and Skin Sciences Continuous Submission.

    Date: November 21, 2016.

    Time: 1:00 p.m. to 4:30 p.m.

    Agenda: To review and evaluate grant applications.

    Place: National Institutes of Health, 6701 Rockledge Drive, Bethesda, MD 20892.

    Contact Person: Richard Ingraham, Ph.D., Scientific Review Officer, Center for Scientific Review, National Institutes of Health, 6701 Rockledge Drive, Room 4116, MSC 7814, Bethesda, MD 20892, 301-496-8551, [email protected].

    Name of Committee: Center for Scientific Review Special Emphasis Panel; Small Business: Neuroscience Assay, Diagnostics and Animal Model Development.

    Date: November 29, 2016.

    Time: 8:00 a.m. to 5:00 p.m.

    Agenda: To review and evaluate grant applications.

    Place: The St. Regis Washington DC, 923 16th Street NW., Washington, DC 20006.

    Contact Person: Susan Gillmor, Ph.D., Scientific Review Officer, National Institutes of Health, Center for Scientific Review, 6701 Rockledge Drive, Bethesda, MD 20892, 301-435-1730, [email protected].

    Name of Committee: Center for Scientific Review Special Emphasis Panel; Small Business: Nephrology.

    Date: November 29-30, 2016.

    Time: 9:00 a.m. to 6:00 p.m.

    Agenda: To review and evaluate grant applications.

    Place: National Institutes of Health, 6701 Rockledge Drive, Bethesda, MD 20892 (Virtual Meeting).

    Contact Person: Atul Sahai, Ph.D., Scientific Review Officer, Center for Scientific Review, National Institutes of Health, 6701 Rockledge Drive, Room 2188, MSC 7818, Bethesda, MD 20892, 301-435-1198, [email protected].

    Name of Committee: Center for Scientific Review Special Emphasis Panel; Biological Chemistry and Macromolecular Biophysics.

    Date: November 29, 2016.

    Time: 1:30 p.m. to 5:00 p.m.

    Agenda: To review and evaluate grant applications.

    Place: National Institutes of Health, 6701 Rockledge Drive, Bethesda, MD 20892 (Telephone Conference Call).

    Contact Person: William A Greenberg, Ph.D., Scientific Review Officer, Center for Scientific Review, National Institutes of Health, 6701 Rockledge Drive, Room 4168, MSC 7806, Bethesda, MD 20892, (301) 435-1726, [email protected].

    Name of Committee: Center for Scientific Review Special Emphasis Panel; Small Business: Psycho/Neuropathology, Lifespan Development, and STEM Education.

    Date: November 29, 2016.

    Time: 12:00 p.m. to 6:00 p.m.

    Agenda: To review and evaluate grant applications.

    Place: National Institutes of Health, 6701 Rockledge Drive, Bethesda, MD 20892 (Virtual Meeting).

    Contact Person: John H Newman, Ph.D., Scientific Review Officer, Center for Scientific Review, National Institutes of Health, 6701 Rockledge Drive, Room 3222, MSC 7808, Bethesda, MD 20892, (301) 435-0628, [email protected].

    Name of Committee: Center for Scientific Review Special Emphasis Panel; Member Conflict: Molecular Cellular Developmental Neuroscience.

    Date: November 30, 2016.

    Time: 1:00 p.m. to 4:00 p.m.

    Agenda: To review and evaluate grant applications.

    Place: National Institutes of Health, 6701 Rockledge Drive, Bethesda, MD 20892 (Telephone Conference Call).

    Contact Person: Christine A Piggee, Ph.D., Scientific Review Officer, Center for Scientific Review, National Institutes of Health, 6701 Rockledge Drive, Room 4186, MSC 7850, Bethesda, MD 20892, 301-435-0657, [email protected].

    Name of Committee: Center for Scientific Review Special Emphasis Panel; Member Conflict: Cognition and Perception.

    Date: November 30, 2016.

    Time: 1:00 p.m. to 3:00 p.m.

    Agenda: To review and evaluate grant applications.

    Place: National Institutes of Health, 6701 Rockledge Drive, Bethesda, MD 20892 (Virtual Meeting).

    Contact Person: Wind Cowles, Ph.D., Scientific Review Officer, Center for Scientific Review, National Institutes of Health, 6701 Rockledge Drive, Room 3172, Bethesda, MD 20892, [email protected].

    (Catalogue of Federal Domestic Assistance Program Nos. 93.306, Comparative Medicine; 93.333, Clinical Research, 93.306, 93.333, 93.337, 93.393-93.396, 93.837-93.844, 93.846-93.878, 93.892, 93.893, National Institutes of Health, HHS)
    Dated: October 25, 2016. Natasha M. Copeland, Program Analyst, Office of Federal Advisory Committee Policy.
    [FR Doc. 2016-26128 Filed 10-28-16; 8:45 am] BILLING CODE 4140-01-P
    DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Government-Owned Inventions; Availability for Licensing AGENCY:

    National Institutes of Health, HHS.

    ACTION:

    Notice.

    SUMMARY:

    The invention listed below is owned by an agency of the U.S. Government and is available for licensing and/or co-development in the U.S. in accordance with 35 U.S.C. 209 and 37 CFR part 404 to achieve expeditious commercialization of results of federally-funded research and development. Foreign patent applications are filed on selected inventions to extend market coverage for companies and may also be available for licensing and/or co-development.

    ADDRESSES:

    Invention Development and Marketing Unit, Technology Transfer Center, National Cancer Institute, 9609 Medical Center Drive, Mail Stop 9702, Rockville, MD 20850-9702.

    FOR FURTHER INFORMATION CONTACT:

    Information on licensing and co-development research collaborations, and copies of the U.S. patent applications listed below may be obtained by contacting: Attn. Invention Development and Marketing Unit, Technology Transfer Center, National Cancer Institute, 9609 Medical Center Drive, Mail Stop 9702, Rockville, MD 20850-9702, Tel. 240-276-5515 or email [email protected]. A signed Confidential Disclosure Agreement may be required to receive copies of the patent applications.

    SUPPLEMENTARY INFORMATION:

    Technology description follows.

    Title of Invention

    Small Molecule Inhibitors of Drug Resistant Forms of HIV-1 Integrase

    Description of Technology

    Integrase strand transfer inhibitors (“INSTIs”) are currently in use as a component of antiretroviral therapy for preventing HIV-1 infection from progressing to AIDS. Three INSTIs are approved by the FDA for inclusion in antiretroviral regimens: raltegravir (RAL), elvitegravir (EVG), and dolutegravir (DTG). Clinicians have already identified several HIV-1 integrase mutations that confer resistance to RAL and EVG, and additional mutations that confer resistance to all three INSTIs have been identified in the laboratory.

    Researchers at the National Cancer Institute discovered small-molecule compounds containing 1-hydroxy-2-oxo-1,8-naphthyridine moieties that are active against HIV-1 integrase mutants that confer resistance to currently approved INSTIs. These new compounds exhibit potent and selective activity against comprehensive and varied panels of INSTI-resistant mutants of HIV-1 integrase. Preliminary rodent efficacy, metabolic, and pharmacokinetic studies have been completed by the NCI researchers.

    The National Cancer Institute (NCI) seeks partners to in-license or co-develop this class of compounds for therapeutic use. Parties interested in licensing the technology should submit an Application for Licensing, and seek detailed information from the Licensing and Patenting Manager indicated below.

    Co-development partners would apply under a Cooperative Research and Development Agreement (CRADA) to conduct pre-clinical studies that include lead optimization, in vitro and in vivo evaluation, and preclinical development of a novel series of INSTIs for the treatment of infection by HIV-1 strains with resistance to currently available integrase inhibitors, including raltegravir and elvitegravir. Under the CRADA, further in vitro and in vivo ADME studies, as well as activity studies, will be conducted by the partner on current and optimized lead compounds using rodent and non-rodent models. Efficacy studies in non-human primates of select compounds are needed and will be part of the CRADA program. The CRADA scope will also include all aspects of toxicity studies, and synthesis scale-up under GMP of optimized lead compounds, to support submission of a successful IND application.

    Interested potential CRADA collaborators can receive detailed information by contacting the Licensing and Patenting Manager (see below). Interested parties will receive detailed information on the current status of the project after signing a confidentiality disclosure agreement (CDA) with NCI. Interested candidate partners must submit a statement of interest and capability to the NCI point of contact for consideration by 5:00 p.m. Eastern Standard Time, December 30, 2016.

    Guidelines for the preparation of a full CRADA proposal will be communicated to all respondents with whom initial confidential discussions have been established. Licensing of background technology related to this CRADA opportunity, specifically HHS Reference No.: E-093-2013/0,1,2, entitled “Compounds for Inhibiting Drug-Resistant Strains of HIV-1 Integrase”, is also available to potential collaborators. All proposals received by the above date will be considered. NCI reserves the right to consider additional proposals or none at all if no partner is selected from the initial response.

    Further information about the NCI Technology Transfer Center can be found on its Web site http://techtransfer.cancer.gov.

    Potential Commercial Applications

    • HIV therapeutic effective against drug-resistant mutants of HIV-1 integrase

    Value Proposition

    • Currently, the only INSTIs effective against drug-resistant mutants of HIV-1 integrase

    Development Stage

    Pre-clinical (in vivo validation)

    Inventor(s)

    Terrence Burke, Stephen Hughes, Yves Pommier, Xue Zhao, Mathieu Metifiot, Stephen Smith, Barry Johnson, Christophe Marchand (all from NCI)

    Intellectual Property

    HHS Reference No.: E-093-2013/0,1,2; all entitled “Compounds For Inhibiting Drug-Resistant Strains Of HIV-1 Integrase”
    US Provisional App. No.: 61/952,928 filed May 16, 2013
    US Provisional App. No.: 61/899,061 filed November 1, 2013
    International App. No.: PCT/US2014/037905 filed May 13, 2014
    Brazilian App. No.: BR1120150287603 filed May 13, 2014
    Canadian App. No.: CA2912064 filed May 13, 2014
    Chinese App. No.: 2014-80039611.5 filed May 13, 2014
    European App. No.: 14728395.6 filed May 13, 2014
    Indian App. No.: 3937/KOLNP/2015 filed May 13, 2014
    Japanese App. No.: JP100078282 filed May 13, 2014
    US Non-Provisional App. No.: 14/891,309 filed May 13, 2014
    South African App. No.: ZA2015/08408 filed May 13, 2014

    Publications

    Zhao, X.Z. et al., “HIV-1 Integrase Strand Transfer Inhibitors with Reduced Susceptibility to Drug Resistant Mutant Integrases”, ACS Chem Biol., Apr 15, 2016, 11(4):1074-81.
    Métifiot, M. et al., “Selectivity for strand-transfer over 3'-processing and susceptibility to clinical resistance of HIV-1 integrase inhibitors are driven by key enzyme-DNA interactions in the active site”, Nucleic Acids Res., Aug 19, 2016, 44(14):6896-906.
    Zhao, X. Z. et al., “4-Amino-1-hydroxy-2-oxo-1,8-naphthyridine-containing compounds having high potency against raltegravir-resistant integrase mutants of HIV-1”, J. Med. Chem., 57, 5190-5202 (2014), doi: 10.1021/jm501059k.
    Zhao, X. Z. et al., “Bicyclic 1-hydroxy-2-oxo-1,2-dihydropyridine-3-carboxamide-containing HIV-1 integrase inhibitors having high antiviral potency against cells harboring raltegravir-resistant integrase mutants”, J. Med. Chem., 57, 1573-1582 (2014), doi: 10.1021/jm401902n.

    Contact Information

    Requests for copies of the patent application and inquiries about licensing, research collaborations, and co-development opportunities for this invention should be sent to Lauren Nguyen-Antczak, Ph.D., J.D., Senior Licensing & Patenting Manager, NCI Technology Transfer Center, 8490 Progress Drive, Suite 400, Frederick, MD 21701, Tel: (301) 624-8752, email: [email protected].

    Dated: October 25, 2016. John D. Hewes, Technology Transfer Specialist, Technology Transfer Center, National Cancer Institute.
    [FR Doc. 2016-26129 Filed 10-28-16; 8:45 am] BILLING CODE 4140-01-P
    DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health National Cancer Institute; Notice of Meeting

    Pursuant to section 10(d) of the Federal Advisory Committee Act, as amended (5 U.S.C. App. 2), notice is hereby given of the joint meeting of the National Cancer Advisory Board (NCAB) and NCI Board of Scientific Advisors (BSA).

    The meeting will be open to the public as indicated below, with attendance limited to space available. Individuals who plan to attend and need special assistance, such as sign language interpretation or other reasonable accommodations, should notify the Contact Person listed below in advance of the meeting. The open session will be videocast and can be accessed from the NIH Videocasting and Podcasting Web site (http://videocast.nih.gov).

    A portion of the National Cancer Advisory Board meeting will be closed to the public in accordance with the provisions set forth in section 552b(c)(6), Title 5 U.S.C., as amended, for the review, discussion, and evaluation of individual intramural programs and projects conducted by the National Cancer Institute, including consideration of personnel qualifications and performance, and the competence of individual investigators, the disclosure of which would constitute a clearly unwarranted invasion of personal privacy.

    Name of Committee: National Cancer Advisory Board; Ad Hoc Subcommittee on Global Cancer Research.

    Open: December 5, 2016, 4:30 p.m. to 6:00 p.m.

    Agenda: Discussion on Global Cancer Research.

    Place: Gaithersburg Marriott Washingtonian Center, 9751 Washington Boulevard, Lakeside 1 Meeting Room, Gaithersburg, MD 20878.

    Contact Person: Dr. Edward Trimble, Executive Secretary, NCAB Ad Hoc Subcommittee on Global Cancer Research, National Cancer Institute—Shady Grove, National Institutes of Health, 9609 Medical Center Drive, Room 3W562, Bethesda, MD 20892, (240) 276-5796, [email protected].

    Name of Committee: National Cancer Advisory Board and NCI Board of Scientific Advisors.

    Open: December 6, 2016, 8:30 a.m. to 4:00 p.m.

    Agenda: Joint meeting of the National Cancer Advisory Board and NCI Board of Scientific Advisors; NCI Board of Scientific Advisors Concepts Review, NCI acting Director's report and presentations.

    Closed: December 6, 2016, 4:00 p.m. to 5:30 p.m.

    Agenda: Review of intramural program site visit outcomes and the discussion of confidential personnel issues.

    Open: December 7, 2016, 9:00 a.m. to 12:00 p.m.

    Agenda: Joint meeting of the National Cancer Advisory Board and NCI Board of Scientific Advisors and presentations.

    Place: National Cancer Institute—Shady Grove, National Institutes of Health, 9609 Medical Center Drive, Room TE406, Bethesda, MD 20892.

    Contact Person: Paulette S. Gray, Ph.D., Director, Division of Extramural Activities, National Cancer Institute—Shady Grove, National Institutes of Health, 9609 Medical Center Drive, Room 7W444, Bethesda, MD 20892, 240-276-6340, [email protected].

    Any interested person may file written comments with the committee by forwarding the statement to the Contact Person listed on this notice. The statement should include the name, address, telephone number and when applicable, the business or professional affiliation of the interested person.

    In the interest of security, NIH has instituted stringent procedures for entrance onto the NCI—Shady Grove campus. All visitors will be asked to show one form of identification (for example, a government-issued photo ID, driver's license, or passport) and to state the purpose of their visit. Information is also available on the Institute's/Center's home page: NCAB: http://deainfo.nci.nih.gov/advisory/ncab/ncab.htm, BSA: http://deainfo.nci.nih.gov/advisory/bsa/bsa.htm, where an agenda and any additional information for the meeting will be posted when available.

    (Catalogue of Federal Domestic Assistance Program Nos. 93.392, Cancer Construction; 93.393, Cancer Cause and Prevention Research; 93.394, Cancer Detection and Diagnosis Research; 93.395, Cancer Treatment Research; 93.396, Cancer Biology Research; 93.397, Cancer Centers Support; 93.398, Cancer Research Manpower; 93.399, Cancer Control, National Institutes of Health, HHS)
    Dated: October 25, 2016. Melanie J. Gray, Program Analyst, Office of Federal Advisory Committee Policy.
    [FR Doc. 2016-26130 Filed 10-28-16; 8:45 am] BILLING CODE 4140-01-P
    DEPARTMENT OF HOMELAND SECURITY U.S. Customs and Border Protection [Docket No. USCBP-2016-0066] Commercial Customs Operations Advisory Committee (COAC) AGENCY:

    U.S. Customs and Border Protection (CBP), Department of Homeland Security (DHS).

    ACTION:

    Committee Management; Notice of Federal Advisory Committee Meeting.

    SUMMARY:

    The Commercial Customs Operations Advisory Committee (COAC) will meet in Washington, DC. The meeting will be open to the public.

    DATES:

    The Commercial Customs Operations Advisory Committee (COAC) will meet on Thursday, November 17, 2016, from 12:30 p.m. to 4:30 p.m. EST. Please note that the meeting may close early if the committee has completed its business.

    Pre-Registration: Meeting participants may attend either in person or via webinar after pre-registering using a method indicated below:

    —For members of the public who plan to attend the meeting in person, please register by 5:00 p.m. EST on November 15, 2016, either online at https://apps.cbp.gov/te_reg/index.asp?w=97; by email to [email protected]; or by fax to (202) 325-4290. You must register prior to the meeting in order to attend the meeting in person.

    —For members of the public who plan to participate via webinar, please register online at https://apps.cbp.gov/te_reg/index.asp?w=96 by 5:00 p.m. EST on November 15, 2016. Please feel free to share this information with other interested members of your organization or association.

    Members of the public who have pre-registered and later need to cancel should do so in advance of the meeting by accessing one (1) of the following links: https://apps.cbp.gov/te_reg/cancel.asp?w=97 to cancel an in-person registration, or https://apps.cbp.gov/te_reg/cancel.asp?w=96 to cancel a webinar registration.

    ADDRESSES:

    The meeting will be held at the Washington Marriott Wardman Park Hotel, 2660 Woodley Road NW., Washington, DC 20008. There will be signage posted directing visitors to the location of the meeting room.

    For information on facilities or services for individuals with disabilities or to request special assistance at the meeting, contact Ms. Karmeshia Tuck, Office of Trade Relations, U.S. Customs and Border Protection at (202) 325-1030 as soon as possible.

    To facilitate public participation, we are inviting public comment on the issues the committee will consider prior to the formulation of recommendations as listed in the “Agenda” section below.

    Comments must be submitted in writing no later than November 7, 2016, and must be identified by Docket No. USCBP-2016-0066, and may be submitted by one (1) of the following methods:

    Federal eRulemaking Portal: http://www.regulations.gov. Follow the instructions for submitting comments.

    Email: [email protected]. Include the docket number in the subject line of the message.

    Fax: (202) 325-4290.

    Mail: Ms. Karmeshia Tuck, Office of Trade Relations, U.S. Customs and Border Protection, 1300 Pennsylvania Avenue NW., Room 3.5A, Washington, DC 20229.

    Instructions: All submissions received must include the words “Department of Homeland Security” and the docket number (USCBP-2016-0066) for this action. Comments received will be posted without alteration at http://www.regulations.gov. Please do not submit personal information to this docket.

    Docket: For access to the docket or to read background documents or comments, go to http://www.regulations.gov and search for Docket Number USCBP-2016-0066. To submit a comment, click the “Comment Now!” button located on the top-right hand side of the docket page.

    There will be multiple public comment periods held during the meeting on November 17, 2016. Speakers are requested to limit their comments to two (2) minutes or less to facilitate greater participation. Contact the individual listed below to register as a speaker. Please note that the public comment period for speakers may end before the time indicated on the schedule that is posted on the CBP Web page, http://www.cbp.gov/trade/stakeholder-engagement/coac.

    FOR FURTHER INFORMATION CONTACT:

    Ms. Karmeshia Tuck, Office of Trade Relations, U.S. Customs and Border Protection, 1300 Pennsylvania Avenue NW., Room 3.5A, Washington, DC 20229; telephone (202) 344-1661; facsimile (202) 325-4290.

    SUPPLEMENTARY INFORMATION:

    Notice of this meeting is given under the Federal Advisory Committee Act, 5 U.S.C. Appendix. The Commercial Customs Operations Advisory Committee (COAC) provides advice to the Secretary of Homeland Security, the Secretary of the Treasury, and the Commissioner of U.S. Customs and Border Protection (CBP) on matters pertaining to the commercial operations of CBP and related functions within the Department of Homeland Security and the Department of the Treasury.

    Agenda

    The COAC will hear from the following subcommittees on the topics listed below and then will review, deliberate, provide observations, and formulate recommendations on how to proceed:

    1. The Trade Enforcement and Revenue Collection (TERC) Subcommittee will discuss the progress made on prior TERC, Bond Working Group, and Intellectual Property Rights Working Group recommendations, as well as the recommendations from the Forced Labor Working Group.

    2. The Global Supply Chain Subcommittee will provide an update report on the progress of the Customs-Trade Partnership Against Terrorism (C-TPAT) Working Group that is reviewing and developing recommendations to update the C-TPAT minimum security criteria.

    3. The One U.S. Government Subcommittee (1 USG) will discuss the progress of the North American Single Window (NASW) Working Group's NASW approach. The subcommittee will also discuss the progress of the Automated Commercial Environment (ACE) Single Window effort.

    4. The Exports Subcommittee will give an update on the Air, Ocean, and Rail Manifest Pilots and discuss the progress of the Truck Manifest Sub-Working Group, which is coordinating with the 1 USG NASW Working Group.

    5. The Trade Modernization Subcommittee will discuss the progress of the International Engagement and Trade Facilitation Working Group, which will be identifying examples of best practices in the U.S. and abroad that facilitate trade. The subcommittee will discuss the startup of the Revenue Modernization Working Group, which will be generating advice pertaining to the strategic modernization of Customs and Border Protection's revenue collections process and systems. Finally, the subcommittee will discuss the startup of the Rulings and Decisions Working Group, which will be identifying process improvements in the receipt and issuance of Customs and Border Protection Headquarters' rulings and decisions.

    6. The Trusted Trader Subcommittee will continue its discussion of its vision for an enhanced Trusted Trader concept that includes engagement with CBP and relevant partner government agencies, with a potential for international interoperability.

    Meeting materials will be available by November 14, 2016, at: http://www.cbp.gov/trade/stakeholder-engagement/coac/coac-public-meetings.

    Dated: October 26, 2016. Maria Luisa Boyce, Senior Advisor for Private Sector Engagement, Office of Trade Relations.
    [FR Doc. 2016-26180 Filed 10-28-16; 8:45 am] BILLING CODE 9111-14-P
    DEPARTMENT OF THE INTERIOR Fish and Wildlife Service [FWS-HQ-MB-2016-N184; 91100-3740-GRNT 7C] Announcement of Meetings: North American Wetlands Conservation Council; Neotropical Migratory Bird Conservation Advisory Group AGENCY:

    Fish and Wildlife Service, Interior.

    ACTION:

    Notice of meetings.

    SUMMARY:

    The North American Wetlands Conservation Council (Council) will meet to select North American Wetlands Conservation Act (NAWCA) grant proposals for recommendation to the Migratory Bird Conservation Commission (Commission). The Council will consider Canadian, Mexican, and U.S. Standard grant proposals. The Advisory Group for the Neotropical Migratory Bird Conservation Act (NMBCA) grants program (Advisory Group) also will meet. The Advisory Group will discuss the strategic direction and management of the NMBCA program. Both meetings are open to the public, and interested persons may present oral or written statements.

    DATES:

    Meetings: Council: December 1, 2016, from 8:30 a.m. to 4:30 p.m.

    Advisory Group: November 30, 2016, from 8:30 a.m. to 4:30 p.m.

    Participation Deadlines: Attendance: To attend either or both meetings, contact the Council/Advisory Group Coordinator (see FOR FURTHER INFORMATION CONTACT) no later than November 23, 2016.

    Submitting Information: To submit written information or questions before the Council or Advisory Group meeting for consideration during the meeting, contact the Council/Advisory Group Coordinator (see FOR FURTHER INFORMATION CONTACT) no later than November 23, 2016.

    ADDRESSES:

    The Council and Advisory Group meetings will take place at the U.S. Fish and Wildlife Service Headquarters, 5275 Leesburg Pike, Falls Church, Virginia 22041.

    FOR FURTHER INFORMATION CONTACT:

    Sarah Mott, Council/Advisory Group Coordinator, by phone at 703-358-1784; by email at [email protected]; or by U.S. mail at U.S. Fish and Wildlife Service, 5275 Leesburg Pike MS: MB, Falls Church, Virginia 22041. Persons who use a telecommunications device for the deaf (TDD) may call the Federal Information Relay Service (FIRS) at 1-800-877-8339 to contact the above individual during normal business hours. FIRS is available 24 hours a day, 7 days a week, to leave a message or question with the above individual. You will receive a reply during normal business hours.

    SUPPLEMENTARY INFORMATION:

    The Council meets two to three times per year to select grant proposals. At this meeting, the Council will consider Canadian, Mexican, and U.S. Standard NAWCA grant proposals for recommendation to the Commission. Council meetings are open to the public, and interested persons may present oral or written statements. The Advisory Group for the Neotropical Migratory Bird Conservation Act (NMBCA) grants program meets once a year. The Advisory Group will discuss the strategic direction and management of the NMBCA program. This meeting is also open to the public, and interested persons may present oral or written statements.

    About the Council

    In accordance with NAWCA (Pub. L. 101-233, 103 Stat. 1968, December 13, 1989, as amended), the State-private-Federal Council meets to consider wetland acquisition, restoration, enhancement, and management projects for recommendation to, and final funding approval by, the Commission. NAWCA provides matching grants to organizations and individuals who have developed partnerships to carry out wetlands conservation projects in the United States, Canada, and Mexico. These projects must involve long-term protection, restoration, and/or enhancement of wetlands and associated uplands habitats for the benefit of all wetlands-associated migratory birds. Project proposal due dates, application instructions, and eligibility requirements are available on the NAWCA Web site at www.fws.gov/birds/grants/north-american-wetland-conservation-act.php.

    About the Advisory Group

    In accordance with NMBCA (Pub. L. 106-247, 114 Stat. 593, July 20, 2000), the Advisory Group will hold its meeting to discuss the strategic direction and management of the NMBCA program and provide advice to the Director of the Fish and Wildlife Service. NMBCA promotes long-term conservation of neotropical migratory birds and their habitats through a competitive grants program by promoting partnerships, encouraging local conservation efforts, and achieving habitat protection in 36 countries. The goals of NMBCA include perpetuating healthy bird populations, providing financial resources for bird conservation, and fostering international cooperation. Because the greatest need is south of the U.S. border, at least 75 percent of NMBCA funding supports projects outside the United States. Project proposal due dates, application instructions, and eligibility requirements are available on the NMBCA Web site at http://www.fws.gov/birds/grants/neotropical-migratory-bird-conservation-act.php.

    Public Input Submitting Written Information or Questions

    Interested members of the public may submit relevant information or questions to be considered during the public meetings. If you wish to submit a written statement so information may be made available to the Council or Advisory Group for their consideration prior to the meetings, you must contact the Council/Advisory Group Coordinator by the date in DATES. Written statements must be supplied to the Council/Advisory Group Coordinator in both of the following formats: One hard copy with original signature, and one electronic copy via email (acceptable file formats are Adobe Acrobat PDF, MS Word, MS PowerPoint, or rich text file).

    Giving an Oral Presentation

    Individuals or groups requesting to make an oral presentation at the meetings will be limited to 2 minutes per speaker, with no more than a total of 30 minutes for all speakers. Interested parties should contact the Council/Advisory Group Coordinator by the date in DATES, in writing (preferably via email; see FOR FURTHER INFORMATION CONTACT), to be placed on the public speaker list for either of these meetings. Nonregistered public speakers will not be considered during the Council or the Advisory Group meeting. Registered speakers who wish to expand upon their oral statements, or those who had wished to speak but could not be accommodated on the agenda, are invited to submit written statements to the Council or Advisory Group within 30 days following the meeting.

    Meeting Minutes

    Summary minutes of the Council and Advisory Group meetings will be maintained by the Council/Advisory Group Coordinator at the address under FOR FURTHER INFORMATION CONTACT. Meeting notes will be available by contacting the Council/Advisory Group Coordinator within 30 days following the meeting. Personal copies may be purchased for the cost of duplication.

    Jerome Ford, Assistant Director, Migratory Birds.
    [FR Doc. 2016-26166 Filed 10-28-16; 8:45 am] BILLING CODE 4333-15-P
    DEPARTMENT OF THE INTERIOR Fish and Wildlife Service [FWS-R4-ES-2016-N161; FXES11130900000C2-167-FF09E32000] Endangered and Threatened Wildlife and Plants; 5-Year Status Review of the Red Wolf AGENCY:

    Fish and Wildlife Service, Interior.

    ACTION:

    Notice of initiation of review; request for information.

    SUMMARY:

    We, the U.S. Fish and Wildlife Service (Service), are initiating a 5-year status review for the red wolf (Canis rufus) under the Endangered Species Act of 1973, as amended (Act). A 5-year review is an assessment of the best scientific and commercial data available at the time of the review. We are requesting submission of information that has become available since the last review of this species.

    DATES:

    To allow us adequate time to conduct this review, we must receive your comments or information on or before December 30, 2016. However, we will continue to accept new information about any listed species at any time.

    ADDRESSES:

    For instructions on how to submit information and review information we receive on the red wolf, see “Request for New Information.”

    FOR FURTHER INFORMATION CONTACT:

    Aaron Valenta, Chief, Division of Restoration and Recovery, 404-679-4144.

    SUPPLEMENTARY INFORMATION:

    Why do we conduct a 5-year review?

    Under the Act (16 U.S.C. 1531 et seq.), we maintain lists of endangered and threatened wildlife and plant species in the Code of Federal Regulations (CFR) at 50 CFR 17.11 (for wildlife) and 17.12 (for plants). Section 4(c)(2)(A) of the Act requires us to review each listed species' status at least once every 5 years. Our regulations at 50 CFR 424.21 require that we publish a notice in the Federal Register announcing those species under active review. For additional information about 5-year reviews, go to http://www.fws.gov/endangered/what-we-do/recovery-overview.html, scroll down to “Learn More about 5-Year Reviews,” and click on our factsheet.

    Species Under Review

    This notice announces our active review of the red wolf (Canis rufus), which is currently listed as endangered.

    What information do we consider in our review?

    In conducting a 5-year review, the Service considers the best scientific and commercial data that have become available since the current listing determination or most recent status review of each species, such as:

    A. Species biology, including but not limited to population trends, distribution, abundance, demographics, and genetics;

    B. Habitat conditions, including but not limited to amount, distribution, and suitability;

    C. Conservation measures that have been implemented to benefit the species;

    D. Threat status and trends (see five factors under heading “How Do We Determine Whether a Species Is Endangered or Threatened?”); and

    E. Other new information, data, or corrections, including but not limited to taxonomic or nomenclatural changes, identification of erroneous information contained in the Lists of Endangered and Threatened Wildlife and Plants, and improved analytical methods.

    New information will be considered in the 5-year review and ongoing recovery programs for the species.

    Definitions

    A. Species means any species or subspecies of fish, wildlife, or plant, and any distinct population segment of any species of vertebrate which interbreeds when mature.

    B. Endangered means any species that is in danger of extinction throughout all or a significant portion of its range.

    C. Threatened means any species that is likely to become an endangered species within the foreseeable future throughout all or a significant portion of its range.

    How do we determine whether a species is endangered or threatened?

    Section 4(a)(1) of the Act establishes that we determine whether a species is endangered or threatened based on one or more of the following five factors:

    A. The present or threatened destruction, modification, or curtailment of its habitat or range;

    B. Overutilization for commercial, recreational, scientific, or educational purposes;

    C. Disease or predation;

    D. The inadequacy of existing regulatory mechanisms; or

    E. Other natural or manmade factors affecting its continued existence.

    Request for New Information

    To do any of the following, contact Aaron Valenta at the Service's Southeast Regional Office, 1875 Century Boulevard, Atlanta, GA 30345; fax 404-679-7081; email at [email protected]:

    A. To get more information on the red wolf;

    B. To submit information on the red wolf; or

    C. To review information we receive, which will be available for public inspection by appointment, during normal business hours at the Southeast Regional Office, Ecological Services Division, at the address above.

    We request any new information concerning the status of the red wolf. See “What information do we consider in our review?” above for specific criteria. Information submitted should be supported by documentation such as maps, bibliographic references, methods used to gather and analyze the data, and/or copies of any pertinent publications, reports, or letters by knowledgeable sources.

    Public Availability of Comments

    Before including your address, phone number, email address, or other personal identifying information in your comment, you should be aware that the entire comment—including your personal identifying information—may be made publicly available at any time. While you can ask us in your comment to withhold your personal identifying information from public review, we cannot guarantee that we will be able to do so.

    Authority

    We publish this document under the authority of the Endangered Species Act (16 U.S.C. 1531 et seq.).

    Dated: September 23, 2016. Mike Oetker, Acting Regional Director, Southeast Region.
    [FR Doc. 2016-26168 Filed 10-28-16; 8:45 am] BILLING CODE 4310-55-P
    DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs [178A2100DD/AAKC001030/A0A501010.999900 253G] Indian Gaming; Tribal-State Class III Gaming Compact Taking Effect in the State of California AGENCY:

    Bureau of Indian Affairs, Interior.

    ACTION:

    Notice.

    SUMMARY:

    The State of California and the Viejas (Baron Long) Group of Capitan Grande Band of Mission Indians of the Viejas Reservation entered into a Tribal-State compact governing Class III gaming. This notice announces that the compact is taking effect.

    DATES:

    The effective date of the compact is October 31, 2016.

    FOR FURTHER INFORMATION CONTACT:

    Ms. Paula L. Hart, Director, Office of Indian Gaming, Office of the Assistant Secretary—Indian Affairs, Washington, DC 20240, (202) 219-4066.

    SUPPLEMENTARY INFORMATION:

    Section 11 of the Indian Gaming Regulatory Act (IGRA) requires the Secretary of the Interior to publish in the Federal Register notice of approved Tribal-State compacts that are for the purpose of engaging in Class III gaming activities on Indian lands. See Public Law 100-497, 25 U.S.C. 2701 et seq. All Tribal-State Class III compacts, including amendments, are subject to review and approval by the Secretary under 25 CFR 293.4. The Secretary took no action on the compact within 45 days of its submission. Therefore, the compact is considered to have been approved, but only to the extent the compact is consistent with IGRA. See 25 U.S.C. 2710(d)(8)(C).

    Dated: October 21, 2016. Lawrence S. Roberts, Principal Deputy Assistant Secretary—Indian Affairs.
    [FR Doc. 2016-26255 Filed 10-28-16; 8:45 am] BILLING CODE 4337-15-P
    DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs [178A2100DD/AAKC001030/A0A501010.999900 253G] Indian Gaming; Tribal-State Class III Gaming Compact Taking Effect in the State of California AGENCY:

    Bureau of Indian Affairs, Interior.

    ACTION:

    Notice.

    SUMMARY:

    The State of California and the Agua Caliente Band of Cahuilla Indians of the Agua Caliente Indian Reservation entered into a Tribal-State compact governing Class III gaming. This notice announces that the compact is taking effect.

    DATES:

    The effective date of the compact is October 31, 2016.

    FOR FURTHER INFORMATION CONTACT:

    Ms. Paula L. Hart, Director, Office of Indian Gaming, Office of the Assistant Secretary—Indian Affairs, Washington, DC 20240, (202) 219-4066.

    SUPPLEMENTARY INFORMATION:

    Section 11 of the Indian Gaming Regulatory Act (IGRA) requires the Secretary of the Interior to publish in the Federal Register notice of approved Tribal-State compacts that are for the purpose of engaging in Class III gaming activities on Indian lands. See Public Law 100-497, 25 U.S.C. 2701 et seq. All Tribal-State Class III compacts, including amendments, are subject to review and approval by the Secretary under 25 CFR 293.4. The Secretary took no action on the compact within 45 days of its submission. Therefore, the compact is considered to have been approved, but only to the extent the compact is consistent with IGRA. See 25 U.S.C. 2710(d)(8)(C).

    Dated: October 21, 2016. Lawrence S. Roberts, Principal Deputy Assistant Secretary—Indian Affairs.
    [FR Doc. 2016-26256 Filed 10-28-16; 8:45 am] BILLING CODE 4337-15-P
    DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs [178A2100DD/AAKC001030/A0A501010.999900 253G] Indian Gaming; Approval of Amended Tribal-State Class III Gaming Compact in the State of South Dakota AGENCY:

    Bureau of Indian Affairs, Interior.

    ACTION:

    Notice.

    SUMMARY:

    The Yankton Sioux Tribe of South Dakota and State of South Dakota negotiated an Amended Gaming Compact governing Class III gaming; this notice announces approval of the amended compact.

    DATES:

    Effective October 31, 2016.

    FOR FURTHER INFORMATION CONTACT:

    Ms. Paula L. Hart, Director, Office of Indian Gaming, Office of the Assistant Secretary—Indian Affairs, Washington, DC 20240, (202) 219-4066.

    SUPPLEMENTARY INFORMATION:

    Section 11 of the Indian Gaming Regulatory Act (IGRA) requires the Secretary of the Interior to publish in the Federal Register notice of approved Tribal-State compacts that are for the purpose of engaging in Class III gaming activities on Indian lands. See Public Law 100-497, 25 U.S.C. 2701 et seq. All Tribal-State Class III compacts, including amendments, are subject to review and approval by the Secretary under 25 CFR 293.4. The Amended Compact adds games to the “no-limit” category, removes arbitration procedures, transfers responsibility for background checks to the Tribal Gaming Commission, increases the maximum number of slot machines the Tribe may operate, and adds a personal injury remedy for patrons. The Amended Compact is subject to review at four-year intervals. The Amended Compact is approved. See 25 U.S.C. 2710(d)(8)(A).

    Dated: October 21, 2016. Lawrence S. Roberts, Principal Deputy Assistant Secretary—Indian Affairs.
    [FR Doc. 2016-26253 Filed 10-28-16; 8:45 am] BILLING CODE 4337-15-P
    DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs [178A2100DD/AAKC001030/A0A501010.999900 253G] Indian Gaming; Approval of Amendment to Tribal-State Class III Gaming Compact in the State of California AGENCY:

    Bureau of Indian Affairs, Interior.

    ACTION:

    Notice.

    SUMMARY:

    The Yurok Tribe (Tribe) of the Yurok Reservation and State of California (State) entered into an amendment to an existing Tribal-State compact governing Class III gaming. This notice announces approval of the amendment.

    DATES:

    Effective October 31, 2016.

    FOR FURTHER INFORMATION CONTACT:

    Ms. Paula L. Hart, Director, Office of Indian Gaming, Office of the Assistant Secretary—Indian Affairs, Washington, DC 20240, (202) 219-4066.

    SUPPLEMENTARY INFORMATION:

    Section 11 of the Indian Gaming Regulatory Act (IGRA) requires the Secretary of the Interior to publish in the Federal Register notice of approved Tribal-State compacts that are for the purpose of engaging in Class III gaming activities on Indian lands. See Public Law 100-497, 25 U.S.C. 2701 et seq. All Tribal-State Class III compacts, including amendments, are subject to review and approval by the Secretary under 25 CFR 293.4. The amendment provides that the Tribe may participate in the State's workers' compensation program or, in lieu of participation in the State's statutory workers' compensation system, the Tribe may create and maintain a system that provides redress for employees' work-related injuries. The amendment is approved. See 25 U.S.C. 2710(d)(8)(A).

    Dated: October 21, 2016. Lawrence S. Roberts, Principal Deputy Assistant Secretary—Indian Affairs.
    [FR Doc. 2016-26251 Filed 10-28-16; 8:45 am] BILLING CODE 4337-15-P
    DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs [178A2100DD/AAKC001030/A0A501010.999900 253G] Indian Gaming; Tribal-State Class III Gaming Compact Taking Effect in the State of California AGENCY:

    Bureau of Indian Affairs, Interior.

    ACTION:

    Notice.

    SUMMARY:

    The State of California and the Viejas (Baron Long) Group of Capitan Grande Band of Mission Indians of the Viejas Reservation entered into a Tribal-State compact governing Class III gaming. This notice announces that the compact is taking effect.

    DATES:

    The effective date of the compact is October 31, 2016.

    FOR FURTHER INFORMATION CONTACT:

    Ms. Paula L. Hart, Director, Office of Indian Gaming, Office of the Assistant Secretary—Indian Affairs, Washington, DC 20240, (202) 219-4066.

    SUPPLEMENTARY INFORMATION:

    Section 11 of the Indian Gaming Regulatory Act (IGRA) requires the Secretary of the Interior to publish in the Federal Register notice of approved Tribal-State compacts that are for the purpose of engaging in Class III gaming activities on Indian lands. See Public Law 100-497, 25 U.S.C. 2701 et seq. All Tribal-State Class III compacts, including amendments, are subject to review and approval by the Secretary under 25 CFR 293.4. The Secretary took no action on the compact within 45 days of its submission. Therefore, the compact is considered to have been approved, but only to the extent the compact is consistent with IGRA. See 25 U.S.C. 2710(d)(8)(C).

    Dated: October 21, 2016. Lawrence S. Roberts, Principal Deputy Assistant Secretary—Indian Affairs.
    [FR Doc. 2016-26254 Filed 10-28-16; 8:45 am] BILLING CODE 4337-15-P
    DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs [178A2100DD/AAKC001030/A0A501010.999900 253G] Indian Gaming; Approval of Amendment to Tribal-State Class III Gaming Compact in the State of Oregon AGENCY:

    Bureau of Indian Affairs, Interior.

    ACTION:

    Notice.

    SUMMARY:

    The Coquille Indian Tribe and State of Oregon entered into an amendment to an existing Tribal-State compact governing Class III gaming. This notice announces approval of the amendment.

    DATES:

    Effective October 31, 2016.

    FOR FURTHER INFORMATION CONTACT:

    Ms. Paula L. Hart, Director, Office of Indian Gaming, Office of the Assistant Secretary—Indian Affairs, Washington, DC 20240, (202) 219-4066.

    SUPPLEMENTARY INFORMATION:

    Section 11 of the Indian Gaming Regulatory Act (IGRA) requires the Secretary of the Interior to publish in the Federal Register notice of approved Tribal-State compacts that are for the purpose of engaging in Class III gaming activities on Indian lands. See Public Law 100-497, 25 U.S.C. 2701 et seq. All Tribal-State Class III compacts, including amendments, are subject to review and approval by the Secretary under 25 CFR 293.4. The amendment expands on the Coquille Tribal Gaming Commission's criteria for denial or termination of contracts for vendors based on the nature and severity of the conduct that constituted the offense or crime, the time that has passed since satisfactory completion of sentence, probation, or payment of fine imposed, the number of offenses or crimes, and any extenuating circumstances that enhance or reduce the impact of the crime. The amendment is approved. See 25 U.S.C. 2710(d)(8)(A).

    Dated: October 21, 2016. Lawrence S. Roberts, Principal Deputy Assistant Secretary—Indian Affairs.
    [FR Doc. 2016-26252 Filed 10-28-16; 8:45 am] BILLING CODE 4337-15-P
    DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs [178A2100DD/AAKC001030/A0A501010.999900 253G] Indian Gaming; Approval of Amended Tribal-State Class III Gaming Compact in the State of California AGENCY:

    Bureau of Indian Affairs, Interior.

    ACTION:

    Notice.

    SUMMARY:

    The Jackson Band of Miwuk Indians (Tribe) and State of California entered into an amendment to the existing Tribal-State Compact governing Class III gaming. This notice announces approval of the amendment.

    DATES:

    Effective October 31, 2016.

    FOR FURTHER INFORMATION CONTACT:

    Ms. Paula L. Hart, Director, Office of Indian Gaming, Office of the Assistant Secretary—Indian Affairs, Washington, DC 20240, (202) 219-4066.

    SUPPLEMENTARY INFORMATION:

    Section 11 of the Indian Gaming Regulatory Act (IGRA) requires the Secretary of the Interior to publish in the Federal Register notice of approved Tribal-State compacts that are for the purpose of engaging in Class III gaming activities on Indian lands. See Public Law 100-497, 25 U.S.C. 2701 et seq. All Tribal-State Class III compacts, including amendments, are subject to review and approval by the Secretary under 25 CFR 293.4. The amendment reduces and otherwise adjusts the existing compact's revenue sharing requirements and increases the available credits that may be claimed for certain infrastructure and other projects or programs underwritten by the Tribe. The amendment is approved. See 25 U.S.C. 2710(d)(8)(A).

    Dated: October 21, 2016. Lawrence S. Roberts, Principal Deputy Assistant Secretary—Indian Affairs.
    [FR Doc. 2016-26250 Filed 10-28-16; 8:45 am] BILLING CODE 4337-15-P
    DEPARTMENT OF THE INTERIOR National Park Service [NPS-WASO-EQD-SSB-22271; PPAKGAARC6, PPMPRLE1Z.LS0000 (166)] Information Collection Request: National Park Service Centennial National Household Survey AGENCY:

    National Park Service, Interior.

    ACTION:

    Notice; request for comments.

    SUMMARY:

    We (National Park Service, NPS) have sent an Information Collection Request (ICR) to OMB for review and approval. We summarize the ICR below and describe the nature of the collection and the estimated annual burden. We may not conduct or sponsor, and a person is not required to respond to, a collection of information unless it displays a currently valid OMB Control Number.

    DATES:

    To ensure that we are able to consider your comments on this ICR, we must receive them by November 30, 2016.

    ADDRESSES:

    Please send all written comments on this ICR directly to the Office of Management and Budget (OMB) Office of Information and Regulatory Affairs, Attention: Desk Officer for the Department of the Interior, to [email protected] (email) or 202-395-5806 (fax); and identify your submission as 1024-0254. Please also send a copy of your comments to Phadrea Ponds, Information Collection Coordinator, National Park Service, 1201 Oakridge Drive, Fort Collins, CO 80525 (mail); or [email protected] (email). Please reference Information Collection 1024-0254 in the subject line. You may also access this ICR at www.reginfo.gov.

    FOR FURTHER INFORMATION CONTACT:

    Bret Meldrum, Chief, Social Science Program, National Park Service, 1201 Oakridge Drive, Fort Collins, CO 80525 (mail); or [email protected] (email). Please reference Information Collection 1024-0254 in the subject line.

    SUPPLEMENTARY INFORMATION:

    I. Abstract

    2016 marks the 100th anniversary of the National Park Service (NPS)—a defining moment that offers an opportunity to reflect on and celebrate our accomplishments as we move forward into a new century of stewardship and engagement. As we prepare for our centennial anniversary, discussions concerning the relevancy of the National Parks have ignited the need for a third iteration of the NPS Comprehensive Survey of the American Public. This survey will include questions from the original surveys as well as updated questions that can be used to provide views from a national audience concerning the current relevancy of the NPS that would otherwise be unavailable.

    This request is to reinstate OMB Control Number 1024-0254 in order to pretest the survey and collection methods before we ask OMB to review and approve the final version of the survey instrument. The new content is sufficiently different to necessitate this request to pretest question wording, response choice wording, and survey length before we request approval of the final survey. The purpose and intent of the final survey will be to measure the awareness, engagement, values, and preferences of both visitors and non-visitors. This information will be used to assess the relevancy of the NPS, as well as to assess change over time, which in turn will be used to evaluate the effectiveness of NPS efforts to increase its relevancy.

    II. Data

    OMB Control Number: 1024-0254.

    Title: National Park Service Centennial National Household Survey.

    Type of Request: Reinstatement with change to a previously approved collection.

    Affected Public: Individuals and households.

    Respondent Obligation: Voluntary.

    Frequency of Collection: One time.

    Estimated Number of Annual Responses: 120.

    Annual Burden Hours: 54 hours.

    Estimated Annual Reporting and Recordkeeping “Non-Hour Cost”: None.

    III. Comments

    A notice was published in the Federal Register (80 FR 80384) on December 24, 2015, stating that we intended to submit a request to OMB for approval of the information collection for the NPS Comprehensive Survey of the American Public. In this notice, we solicited public comment for 60 days, ending February 22, 2016. We did not receive any comments in response to that notice that required changes to the collection instruments.

    We again invite comments concerning this information collection on:

    • Whether or not the collection of information is necessary, including whether or not the information will have practical utility;

    • The accuracy of our estimate of the burden for this collection of information;

    • Ways to enhance the quality, utility, and clarity of the information to be collected; and

    • Ways to minimize the burden of the collection of information on respondents.

    Comments that you submit in response to this notice are a matter of public record. We will include or summarize each comment in our request to OMB to approve this IC. Before including your address, phone number, email address, or other personal identifying information in your comment, you should be aware that your entire comment, including your personal identifying information, may be made publicly available at any time. While you can ask us in your comment to withhold your personal identifying information from public review, we cannot guarantee that we will be able to do so.

    Dated: October 26, 2016. Madonna L. Baucum, Information Collection Clearance Officer, National Park Service.
    [FR Doc. 2016-26151 Filed 10-28-16; 8:45 am] BILLING CODE 4310-EH-P
    DEPARTMENT OF THE INTERIOR Bureau of Ocean Energy Management [Docket No. BOEM-2016-0071] Atlantic Wind Lease Sale 6 (ATLW-6) for Commercial Leasing for Wind Power on the Outer Continental Shelf Offshore New York—Final Sale Notice MMAA104000 AGENCY:

    Bureau of Ocean Energy Management, Interior.

    ACTION:

    Final sale notice for commercial leasing for Wind Power on the Outer Continental Shelf Offshore New York.

    SUMMARY:

    This document is the Final Sale Notice (FSN) for the sale of one commercial wind energy lease on the Outer Continental Shelf (OCS) offshore New York, pursuant to 30 CFR 585.216. The Bureau of Ocean Energy Management (BOEM or “the Bureau”) will offer Lease OCS-A 0512 for sale using a multiple-factor auction format. This FSN contains information pertaining to the area available for leasing, lease provisions and conditions, auction details, the lease form, criteria for evaluating competing bids, award procedures, appeal procedures, and lease execution. The issuance of the lease resulting from this sale would not constitute an approval of project-specific plans to develop offshore wind energy. Such plans, if submitted by the lease sale winner, would be subject to subsequent environmental and public review prior to a decision to proceed with development.

    DATES:

    BOEM will hold a mock auction for the bidders starting at 8:30 a.m. Eastern Standard Time (EST) on December 13, 2016. The monetary auction will be held online and will begin at 8:30 a.m. EST on December 15, 2016. Additional details are provided in the section entitled, “Deadlines and Milestones for Bidders.”

    FOR FURTHER INFORMATION CONTACT:

    Wright Frank, New York Project Coordinator and Auction Manager, BOEM Office of Renewable Energy Programs, 45600 Woodland Road, VAM-OREP, Sterling, Virginia, 20166, (703) 787-1325 or [email protected].

    Background: BOEM proposed this lease sale on June 6, 2016, in Proposed Sale Notice (PSN) for Commercial Leasing for Wind Power on the Outer Continental Shelf (OCS) Offshore New York, which was published in the Federal Register with a 60-day public comment period (81 FR 36336). BOEM received 401 comment submissions in response to the PSN, which are available in the Federal Register docket (Docket ID: BOEM-2016-0027) through BOEM's Web site at: http://www.boem.gov/New-York/. BOEM has posted a document containing responses to comments submitted during the PSN comment period. The document, entitled Response to Comments, can be found at the following URL: http://www.boem.gov/New-York/.

    BOEM made several changes from the description of the New York lease sale that was published in the PSN. Three changes worth highlighting are: A 10% bidding credit for entities that establish that they are a “government authority” meeting the definition included in this notice, an adaptation to the auction format, and the removal of a small portion of the lease area (LA). The auction format described here differs slightly from past lease sales in that bidders may have a “limited opportunity to revoke” a provisionally winning bid without penalty if the next-highest bid was submitted by a governmental entity. An explanation regarding the reduction in the area of the LA relative to the area described in the PSN is provided in the section entitled “Area Offered for Leasing.”

    Environmental Reviews

    On May 28, 2014, BOEM published a Notice of Intent (NOI) to Prepare an Environmental Assessment (EA) for commercial wind lease issuance and approval of site assessment activities on the Atlantic OCS offshore New York with a 45-day public comment period (79 FR 30643). In response to the NOI, BOEM received 32 comment submissions, a link to which is available at http://www.boem.gov/New-York/. BOEM considered these comments in determining the scope of issues and alternatives analyzed in the EA.

    On June 6, 2016, in conjunction with the PSN, BOEM published an EA for public comment (81 FR 36344). BOEM received approximately 60 submittals. Submittals included letters, emails, comment cards, and comments made to a court reporter at public meetings. BOEM identified 300 discrete comments within the submittals received. Comments were received from various stakeholders, including private citizens, environmental groups, Federal agencies, trade associations, businesses, state agencies, universities, and Federal organizations.

    Concurrent with publication of this FSN, BOEM has published a Notice of Availability (NOA) for the revised EA and Finding of No Significant Impact (FONSI) for commercial wind lease issuance and site assessment activities on the Atlantic OCS offshore New York. The EA and FONSI are available at: http://www.boem.gov/New-York/.

    All consultations necessary to inform BOEM's lease issuance decision have been completed. BOEM completed consultations with the National Oceanic and Atmospheric Administration's National Marine Fisheries Service (NMFS) and the U.S. Fish and Wildlife Service (USFWS) under the Endangered Species Act (ESA). BOEM completed formal consultation with NMFS upon receipt of a Biological Opinion on March 10, 2013, (revised on April 10, 2013). That consultation covered lease issuance and site characterization activities (i.e., high resolution geophysical surveys, biological surveys, and geotechnical sampling). On September 14, 2016, the USFWS concurred with BOEM's determination that such activities were not likely to adversely affect piping plovers, roseate terns, and red knots, and BOEM's determination of no effect on the northern long-eared bat for site characterization and assessment activities.

    BOEM also consulted with the State Historic Preservation Offices of New York and New Jersey, the National Park Service, and Monmouth County, New Jersey, under the National Historic Preservation Act. The Finding of No Historic Properties Affected for the Issuance of a Commercial Lease within the New York Wind Energy Area on the Outer Continental Shelf Offshore New York can be found at: http://www.boem.gov/Renewable-Energy/Historic-Preservation-Activities/. In August 2016, the States of New York and New Jersey concurred with BOEM's consistency determination under the Coastal Zone Management Act.

    On July 11, 2016, NMFS provided comments on the EA pursuant to the Magnuson-Stevens Fishery Conservation and Management Act (MSFCMA) and recommended that BOEM coordinate with NMFS in the review of site-specific survey plans and Site Assessment Plans (SAPs). Because of the programmatic nature of the essential fish habitat (EFH) assessment, NMFS elected not to provide any specific EFH conservation measures until such time as site-specific plans are received.

    For the issuance of a commercial lease, BOEM considers the environmental consequences of associated site characterization activities (i.e., biological, archeological, geological and geophysical surveys, and core sampling). Therefore, mitigation measures designed to reduce or eliminate impacts from survey activities will be included as stipulations in Addendum “C” of the lease (OCS-A 0512). If a lease is issued, BOEM will prepare additional environmental reviews upon receipt of the lessee's SAP and/or Construction and Operations Plan (COP). BOEM will continue to work with affected stakeholders and assess ongoing and future research relating to potential survey, site assessment, and construction and operations impacts, including potential mitigation measures.

    List of Eligible Bidders: BOEM has determined that pursuant to 30 CFR 585.106 and 107, the following entities are legally, technically, and financially qualified to hold a commercial wind lease offshore New York, and therefore may participate in this lease sale as bidders subject to meeting the requirements outlined in this notice.

    Company name (Company No.):
    Avangrid Renewables, LLC (15019)
    CI-II NY Inc (15063)
    Clean Power Northeast Development Inc (15064)
    Convalt Energy LLC (15051)
    Deepwater Wind Hudson Canyon, LLC (15028)
    DONG Energy Wind Power (U.S.) Inc (15059)
    EDF Renewable Development, Inc (15027)
    Energy Management, Inc (15015)
    Fishermen's Energy, LLC (15005)
    Innogy US Renewable Projects LLC (15061)
    New York State Energy Research and Development Authority (15062)
    Sea Breeze Energy LLC (15044)
    Statoil Wind US LLC (15058)
    wpd offshore Alpha LLC (15060)

    Deadlines and Milestones for Bidders: This section describes the major deadlines and milestones in the auction process from publication of this FSN to execution of leases pursuant to this sale. These are organized into various stages: The FSN Waiting Period; Conducting the Auction; and From the Auction to Lease Execution.

    • FSN Waiting Period

    Bidder's Financial Form (BFF): Each bidder must submit a BFF to BOEM in order to participate in the auction. BOEM must receive each bidder's BFF no later than November 14, 2016. BOEM will consider extensions to this deadline only if BOEM determines that the failure to timely submit a BFF was caused by events beyond the bidder's control. The BFF can be downloaded at: http://www.boem.gov/New-York/. Once the BFF has been processed, bidders may log into pay.gov and submit bid deposits. For purposes of this auction, BOEM will not consider any BFFs submitted by bidders for previous lease sales. BOEM will only accept an originally executed paper copy of the BFF. The BFF must be executed by an authorized representative as shown on the bidder's legal qualifications. Each bidder is required to sign the self-certification in the BFF, in accordance with 18 U.S.C. 1001 (Fraud and False Statements).

    Bid Deposits: Each bidder must provide a bid deposit of $450,000 no later than November 28, 2016, in order to participate in the mock auction and the monetary auction. BOEM will consider extensions to this deadline only if BOEM determines that the failure to timely submit the bid deposit was caused by events beyond the bidder's control. Further information about bid deposits can be found in the “Bid Deposit” section of this notice.

    Non-Monetary Package: Each bidder must submit a non-monetary package if it is applying for a credit as a governmental authority as described in the “Auction Procedures: Credit Factors” section of this notice. For bidders applying for a credit, BOEM must receive non-monetary packages no later than November 28, 2016. BOEM will consider extensions to this deadline only if BOEM determines that the failure to timely submit a non-monetary package was caused by events beyond the bidder's control. Non-monetary packages must be submitted in both paper and electronic formats. BOEM considers Adobe .pdf files stored on electronic media (e.g., a flash drive) to be acceptable.

    Further information on this subject can be found in the section of this notice entitled, “Auction Procedures.”

    Reservation of Limited Opportunity to Revoke (RLOR): Under certain circumstances described in detail in this notice, a bidder that submits a provisionally winning bid may be afforded a one-hour opportunity to revoke its provisionally winning bid without penalty at the end of the auction. This opportunity will only be allowed if a governmental authority submitted the second-highest bid. In order to revoke a provisionally winning bid, a bidder must have reserved this opportunity in advance. BOEM must receive, no later than November 28, 2016, a completed form that can be downloaded from BOEM's Web site at http://www.boem.gov/New-York/, called the “Reservation of Limited Opportunity to Revoke” (RLOR). BOEM will consider extensions to this deadline only if it determines that the failure to timely submit the RLOR was caused by events beyond the bidder's control. By “opting-in,” the bidder will have an opportunity, if certain conditions are met, to revoke a provisionally winning bid without penalty during a short period of time following the auction. If the bidder does not “opt-in,” the bidder will not have this opportunity, and refusal to execute a lease pursuant to a provisionally winning bid will result in the loss of the bidder's bid deposit.

    Panel Convenes to Evaluate Non-Monetary Packages: A short time before the auction, the panel described in the “Auction Procedures” section will convene to evaluate non-monetary packages. The panel is tentatively scheduled to meet on December 9, 2016, for this purpose. If BOEM has not received a non-monetary package by November 28, 2016, then the BOEM panel designated as responsible for determining bidder eligibility for the credit may not consider that bidder for a non-monetary auction credit. Once it has made its decisions, the panel will report determinations of eligibility to BOEM. BOEM will then inform each bidder by email of the panel's determination as to whether the bidder qualifies for a non-monetary bid credit.

    Mock Auction: BOEM will hold a Mock Auction on December 13, 2016, beginning at 8:30 a.m. EST. The Mock Auction will be held online. BOEM will contact each bidder that has timely filed a BFF and bid deposit and provide instructions for participation. Only bidders that have timely submitted BFFs and bid deposits will be permitted to participate in the Mock Auction.

    Conducting the Auction: BOEM, through its contractor, will hold an auction as described in this notice.

    Auction: On December 15, 2016, BOEM, through its contractor, will hold the auction. The first round of the auction will start at 8:30 a.m. EST. The auction will proceed electronically according to a schedule to be distributed by the BOEM Auction Manager at the time of the auction. BOEM anticipates that the auction may continue on consecutive business days, as necessary, until the auction ends in accordance with the procedures described in the “Auction Format” section of this notice. The monetary bidding will end in the first round where BOEM receives one or zero bids at the asking price.

    Limited Opportunity to Revoke (LOR) (if criteria met): If the highest bidder has reserved an LOR, and a government authority is the second highest bidder, BOEM will contact the provisionally winning bidder through the auction system's messaging platform and ask whether the bidder would like to revoke its provisionally winning bid without penalty. The bidder will have one hour from the time the message is sent to respond via the messaging system. If the bidder fails to respond within the allotted hour, BOEM will presume the bidder does not wish to exercise its revocation right, and the bidder will lose the right to revoke its provisionally winning bid without penalty. Further information can be found in the Auction Procedures section of this notice.

    Announce Provisional Winner: BOEM will announce the provisional winner of the lease sale after the auction ends and the one-hour LOR period, if applicable, has elapsed.

    Reconvene the Panel: The panel will reconvene after the bidding has concluded to verify auction results.

    • From the Auction to Lease Execution

    Refund Non-Winners: Once the provisional winner has been announced and the panel has verified the auction results, BOEM will provide the non-winners a written explanation of why they did not win and return their bid deposits.

    Department of Justice (DOJ) Review: DOJ will have 30 days in which to conduct an antitrust review of the auction, pursuant to 43 U.S.C. 1337(c).

    Delivery of the Lease: BOEM will send three lease copies to the winner, with instructions on how to execute the lease. The first year's rent is due 45 calendar days after the winner receives the lease copies for execution.

    Return the Lease: Within 10 business days of receiving the lease copies, the auction winner must post financial assurance, pay any outstanding balance of its bonus bid (i.e., winning monetary bid minus applicable non-monetary credits and bid deposit), and sign and return the three executed lease copies.

    Execution of Lease: Once BOEM has received the lease copies and verified that all other required materials have been received, BOEM will make a final determination regarding its issuance of the lease and will execute the lease if appropriate.

    Area Offered for Leasing: The area available for sale will be auctioned as one lease, Lease OCS-A 0512 (New York LA). The New York LA consists of approximately 79,350 acres, which is reduced from the area originally proposed in the PSN. The reduction comprises five aliquots (sixteenths of an OCS block), which were removed in response to comments received from NMFS on the NOA of the EA and as part of consultations pursuant to the MSFCMA. A description of the final LA can be found in Addendum “A” of the lease, which BOEM has made available with this notice on its Web site at: http://www.boem.gov/New-York/.

    Map of the Area Offered for Leasing

    A map of the New York LA and GIS spatial files can be found on BOEM's Web site at: http://www.boem.gov/New-York/.

    A large scale map of the area, showing boundaries of the area with numbered blocks, is available from BOEM upon request at the following address: Bureau of Ocean Energy Management, Office of Renewable Energy Programs, 45600 Woodland Road, VAM-OREP, Sterling, Virginia, 20166, Phone: (703) 787-1300, Fax: (703) 787-1708.

    Potential Mitigation Measures and Restrictions on Development

    During the Area Identification (Area ID) process, BOEM identified three issues of concern associated with potential development of the New York Wind Energy Area (WEA): (1) Navigational safety; (2) commercial fishing; and (3) visual impacts to National Park Service lands and historic properties. Although BOEM did not remove any areas from leasing consideration during Area ID, potential bidders should be aware that future analysis of these or other issues could result in BOEM's requiring mitigation measures and/or development restrictions in all or part of the New York LA. In addition, mitigation measures and/or development restrictions could result from future BOEM environmental reviews and consultations (e.g., future consultations under section 106 of the National Historic Preservation Act or future government-to-government consultations with federally recognized tribes).

    Navigational Safety

    Potential bidders should note that future mitigation measures, including potential restrictions on the placement of structures, may be applied to development within all or portions of the New York LA to ensure navigation safety and the U.S. Coast Guard's (USCG's) ability to maintain mission readiness.

    The New York LA has been delineated to accommodate a setback of 1 nautical mile (nmi) from the adjacent Traffic Separation Schemes (TSSs) for the Port of New York and New Jersey. This setback is consistent with BOEM's delineation of other lease and wind energy areas that are in close proximity to TSSs (e.g., the lease areas offshore Massachusetts, Rhode Island/Massachusetts, Delaware, and Maryland; and the Wilmington West Wind Energy Area offshore North Carolina), and is based on input provided by the USCG as a member of the BOEM New York Intergovernmental Renewable Energy Task Force during development of the 2013 New York Request for Interest (RFI). As noted in the RFI, the LA includes aliquots that are transected by the 1 nmi setback line, and BOEM will require that no structures be installed on the portions of those aliquots located within the setback.

    In September 2015, BOEM received additional input from the USCG recommending a larger setback of 2 nmi from the TSSs and 5 nmi from the entry/exit points of the TSSs. USCG's correspondence to BOEM, which explains the recommendation, is available on BOEM's Web site at http://www.boem.gov/New-York/. In addition, on March 22, 2016, the USCG released its Final Report for its Atlantic Coast Port Access Route Study (ACPARS), available at http://www.uscg.mil/lantarea/acpars. The USCG's Marine Planning Guidelines, included as Enclosure 2 of the ACPARS, are consistent with its September 2015 recommendation to BOEM. Although BOEM did not adopt the USCG's recommendation during Area ID, BOEM may determine at a later stage in the process (e.g., after evaluating a Navigational Safety Risk Assessment that is submitted as a part of a COP) that, even with the application of mitigation measures, portions of the LA are not appropriate for the installation of wind facilities due to navigational safety concerns.

    Commercial Fishing

    Potential bidders should note that future mitigation measures may be applied to development within all or portions of the New York LA due to the use of the area as a fishery.

    BOEM received fishery-related comments in response to the RFI, Call for Information and Nominations, NOI, NOA, and several public outreach meetings. Commenters included NMFS, the New England and Mid-Atlantic Fishery Management Councils, and several fishing industry groups, primarily representing members of the sea scallop and squid fisheries. BOEM also received comments from commercial and recreational fishermen during BOEM's November 2015 fisheries workshops. A meeting summary of BOEM's November 2015 fisheries workshops and comments associated with these workshops are available on BOEM's Web site at http://www.boem.gov/New-York/, along with those comments received in response to BOEM's Federal Register notices relating to the New York LA.

    BOEM has also gathered information regarding the use of the LA as a fishery through a joint study with NMFS. This data, specific to the New York LA, is included in the revised EA and is available on BOEM's Web site at http://www.boem.gov/Fishing-Revenue-NY-Call-Area/. The spatial dataset is available at http://www.boem.gov/Renewable-Energy-GIS-Data/. Potential bidders should be aware that BOEM will be gathering additional data and may require plan-specific mitigation measures to minimize impacts.

    Between 2012 and 2016, BOEM collaborated with numerous stakeholders in the fishing and offshore wind industries to develop best management practices (BMPs) in furtherance of its goal of minimizing potential multiple use conflicts between offshore renewable energy developers and the fishing industry. As a result of this effort, BOEM has concluded that there would be great merit in a lessee's utilizing a fisheries liaison and a fisheries representative during the lessee's plan development process. BOEM has also received comments from the public regarding the importance of ensuring effective communication between the lessee and the fishing community. As a result, BOEM has issued guidance to lessees for communicating with fisheries stakeholders regarding social and economic impacts of renewable energy development on the Atlantic Outer Continental Shelf: http://www.boem.gov/Social-and-Economic-Conditions-Fishery-Communication-Guidelines/. Further, BOEM is requiring in Addendum C of the lease that the lessee develop a Fisheries Communication Plan that includes the utilization of a fisheries liaison to facilitate communication with the fishing industry.

    Visual Impacts to Historic Properties

    Potential bidders should note that future mitigation measures may be applied to development within all or portions of the New York LA to avoid, minimize, or mitigate adverse effects to historic properties or National Park Service (NPS) lands. The NPS, New York State Historic Preservation Office (NY SHPO), and New Jersey State Historic Preservation Office (NJ SHPO) have expressed concern regarding the potential for wind energy development within the New York WEA to cause adverse effects to onshore historic properties. Correspondence outlining these concerns is available on BOEM's Web site at http://www.boem.gov/New-York/.

    During the summer and fall of 2015, BOEM conducted stakeholder outreach with the NPS, NY SHPO, and NJ SHPO. BOEM also completed a study entitled, “Renewable Energy Viewshed Analysis and Visualization Simulation for the New York Outer Continental Shelf Call Area” to assist in this outreach effort and to provide scientific and technical information about visual impacts to inform its Area ID decision. Results of this study are available under the header “Visual Simulations” at http://www.boem.gov/New-York/.

    Withdrawal of Blocks: BOEM reserves the right to withdraw all or portions of the LA prior to executing the lease with the winning bidder, based upon relevant information provided to the Bureau.

    Lease Terms and Conditions: BOEM has included terms, conditions, and stipulations for the OCS commercial wind lease to be offered through this sale. After the lease is issued, BOEM reserves the right to require compliance with additional terms and conditions associated with approval of a SAP or COP.

    The lease is available on BOEM's Web site at http://www.boem.gov/New-York/. The lease includes the following seven attachments:

    • Addendum “A” (Description of Leased Area and Lease Activities);

    • Addendum “B” (Lease Term and Financial Schedule);

    • Addendum “C” (Lease Specific Terms, Conditions, and Stipulations);

    • Addendum “D” (Project Easement);

    • Addendum “E” (Rent Schedule post COP approval);

    • Appendix A to Addendum “C”: (Incident Report: Protected Species Injury or Mortality); and

    • Appendix B to Addendum “C”: (Required Data Elements for Protected Species Observer Reports).

    Addenda “A,” “B,” and “C” provide detailed descriptions of lease terms and conditions. Addenda “D” and “E” will be completed at the time of COP approval or approval with modifications.

    The most recent version of BOEM's renewable energy commercial lease form (BOEM-0008) is available on BOEM's Web site at: http://www.boem.gov/BOEM-OCS-Operation-Forms/.

    Potential bidders should note that BOEM and the Bureau of Safety and Environmental Enforcement (BSEE) are in the process of reassigning regulations relating to safety and environmental oversight and enforcement responsibilities for offshore renewable energy projects from BOEM to BSEE. Once this administrative reassignment is finalized, BOEM may make ministerial and non-substantive amendments to the lease to conform it to regulatory revisions.

    Plans: Pursuant to 30 CFR 585.601, the lessee must submit a SAP within 12 months of lease issuance. If the lessee intends to continue its commercial lease with an operations term, the lessee must submit a COP at least 6 months before the end of the site assessment term.

    Financial Terms and Conditions: This section provides an overview of the annual payments required of the lessee that will be fully described in the lease, and the financial assurance requirements that will be associated with the lease.

    Rent: Pursuant to 30 CFR 585.224(b) and 585.503, the first year's rent payment of $3 per acre is due within 45 calendar days of the date the lessee receives the lease for execution. Thereafter, annual rent payments are due on the anniversary of the Effective Date of the lease (the “Lease Anniversary”). Once commercial operations under the lease begin, BOEM will charge rent only for the portions of the lease not authorized for commercial operations, i.e., not generating electricity. However, instead of geographically dividing the LA into acreage that is “generating” and “non-generating,” the fraction of the lease accruing rent will be based on the fraction of the total nameplate capacity of the project that is not yet in operation. This fraction is calculated by dividing the nameplate capacity not yet authorized for commercial operations at the time payment is due by the anticipated nameplate capacity after full installation of the project (as described in the COP). The annual rent due for a given year is then derived by multiplying this fraction by the amount of rent that would have been due for the lessee's entire LA at the rental rate of $3 per acre.

    For a 79,350-acre lease (the size of the New York LA), the rent payment will be $238,050 per year ($3 times 79,350) if no portion of the leased area is authorized for commercial operations. If 300 megawatts (MW) of a project's nameplate capacity is operating (or authorized for operation), and the approved COP specifies a maximum project size of 500 MW, the rent payment will be $95,220. This payment is based on the 200 MW of nameplate capacity BOEM has not yet authorized for commercial operations. For the above example, this would be calculated as follows: 200 MW/500 MW × ($3/acre × 79,350 acres) = $95,220.
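    To make the proration mechanics concrete, the following sketch (hypothetical Python, not part of the lease terms; the function and variable names are illustrative) reproduces the rent calculation described above.

```python
def annual_rent(lease_acres, rate_per_acre, non_operating_mw, total_nameplate_mw):
    """Rent accrues only on the fraction of nameplate capacity not yet
    authorized for commercial operations, applied to the full-lease rent."""
    full_lease_rent = rate_per_acre * lease_acres
    # Multiply before dividing so the worked example stays exact.
    return full_lease_rent * non_operating_mw / total_nameplate_mw

# Worked example from the notice: 79,350-acre LA at $3/acre, 500 MW approved project.
print(annual_rent(79_350, 3.0, 500, 500))  # no capacity operating -> 238050.0
print(annual_rent(79_350, 3.0, 200, 500))  # 300 MW operating      -> 95220.0
```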

    If the lessee submits an application for relinquishment of a portion of its lease area within the first 45 calendar days following the date that the lease is received by the lessee for execution, and BOEM approves that application, no rent payment will be due on the relinquished portion of the LA. Later relinquishments of any portion of the LA will reduce the lessee's rent payments starting in the year following BOEM's approval of the relinquishment.

    The lessee also must pay rent for any project easement associated with the lease, commencing on the date that BOEM approves the COP (or modification thereof) that describes the project easement. Annual rent for a project easement is the greater of $5 per acre per year or $450 per year.

    Operating Fee

    For purposes of calculating the initial annual operating fee payment, pursuant to 30 CFR 585.506, an operating fee rate is applied to a proxy for the wholesale market value of the electricity expected to be generated from the project during its first twelve months of operations. This initial payment will be prorated to reflect the period between the commencement of commercial operations and the Lease Anniversary. The initial annual operating fee payment is due within 45 days of the commencement of commercial operations. Thereafter, subsequent annual operating fee payments are due on or before each Lease Anniversary.

    The subsequent annual operating fee payments are calculated by multiplying the operating fee rate by the imputed wholesale market value of the projected annual electric power production. For the purposes of this calculation, the imputed market value is the product of the project's annual nameplate capacity, the total number of hours in a year (8,760), the capacity factor, and the annual average price of electricity derived from a historical regional wholesale power price index. For example, the annual operating fee for a 100 MW wind facility operating at a 40% capacity (i.e., capacity factor of 0.4) with a regional wholesale power price of $50/MWh and an operating fee rate of 0.02 would be calculated as follows:

    Annual Operating Fee = 0.02 × (100 MW × 8,760 hours × 0.4 × $50/MWh) = $350,400
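    A short illustrative calculation (hypothetical Python mirroring the formula above; names are illustrative) follows.

```python
HOURS_PER_YEAR = 8_760

def annual_operating_fee(nameplate_mw, capacity_factor, price_per_mwh, fee_rate=0.02):
    """Operating fee = fee rate x imputed wholesale market value of projected annual generation."""
    imputed_market_value = nameplate_mw * HOURS_PER_YEAR * capacity_factor * price_per_mwh
    return fee_rate * imputed_market_value

# Example from the notice: 100 MW facility, 0.4 capacity factor, $50/MWh, 2% fee rate.
print(annual_operating_fee(100, 0.4, 50))  # -> 350400.0
```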

    Operating Fee Rate: The operating fee rate is the share of imputed wholesale market value of the projected annual electric power production due to BOEM as an annual operating fee. For the New York LA to be offered in this sale, this fee is set at 0.02 (i.e., 2%) during the entire life of commercial operations.

    Nameplate Capacity: Nameplate capacity is the maximum rated electric output, expressed in MW, that the turbines of the wind facility under commercial operations can produce at their rated wind speed as designated by the turbine's manufacturer. The lessee will specify in its COP the nameplate capacity available at the start of each year of commercial operations on the lease. For example, if the lessee specifies 20 turbines in its COP, and each is rated by the manufacturer at 5 MW, the nameplate capacity of the wind facility would be 100 MW.

    Capacity Factor: The capacity factor compares the amount of energy delivered to the grid during a period of time to the amount of energy the wind facility would have produced at full capacity. The amount of power delivered will always be less than the theoretical 100% capacity, largely because of the variability of wind speeds, transmission line loss, and down time for maintenance or other purposes.

    The capacity factor is expressed as a decimal between zero and one, and represents the share of anticipated generation of the wind facility that is delivered to the interconnection grid (i.e., where the lessee's facility interconnects with the electric grid) relative to the wind facility's generation at continuous full power operation at nameplate capacity. BOEM has set the capacity factor for the year in which commercial operations commence and the six full years thereafter at 0.4 (i.e., 40%). At the end of the sixth year, BOEM may adjust the capacity factor to reflect the performance over the previous five full years based upon the actual metered electricity generation at the delivery point to the electrical grid. BOEM may make similar adjustments to the capacity factor once every five years thereafter. The maximum change in the capacity factor from one period to the next will be limited to plus or minus 10 percent of the previous period's value.
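    As an illustration of the adjustment limit, the sketch below (hypothetical Python; only the plus-or-minus 10 percent cap is taken from the notice) clamps a proposed new capacity factor to within 10 percent of the previous period's value.

```python
def adjusted_capacity_factor(previous_cf, measured_cf):
    """Limit the period-to-period change to +/-10% of the previous period's value."""
    lower_bound = previous_cf * 0.90
    upper_bound = previous_cf * 1.10
    # Rounded for display only; the clamping rule is the substantive part.
    return round(min(max(measured_cf, lower_bound), upper_bound), 4)

# If metered performance implies 0.33, the initial 0.4 factor can fall no lower
# than 0.36; a measured 0.47 would be capped at 0.44.
print(adjusted_capacity_factor(0.40, 0.33))  # -> 0.36
print(adjusted_capacity_factor(0.40, 0.47))  # -> 0.44
```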

    Wholesale Power Price Index: Pursuant to 30 CFR 585.506(c)(2)(i), the wholesale power price, expressed in dollars per MW-hour, is determined at the time each annual operating fee payment is due, based on the weighted average of the inflation-adjusted peak and off-peak spot price indices. Typically, BOEM's commercial wind leases specify an electric region and a source for referencing price information. However, at the current time, it is uncertain where a project's transmission cable may make landfall, so BOEM decided not to specify the electric region and source of price information at the lease issuance stage. The electric region of the wholesale power price index will encompass the location where the cable makes landfall. BOEM will specify the referencing price information upon approval of the COP. The wholesale power price is adjusted for inflation from the year associated with the published spot price indices to the year in which the operating fee is to be due, based on the Lease Anniversary and using annual implicit price deflators as reported by the U.S. Department of Commerce Bureau of Economic Analysis.

    Financial Assurance

    Within 10 business days after receiving the lease copies and pursuant to 30 CFR 585.515-.516, the provisional winner of the New York LA must provide an initial lease-specific bond or other approved means of meeting BOEM's initial financial assurance requirements. The provisional winner may meet financial assurance requirements by posting a surety bond or by setting up an escrow account with a trust agreement giving BOEM the right to withdraw the money held in the account on demand. BOEM encourages the provisionally winning bidder to discuss the financial assurance requirement with BOEM as soon as possible after the auction has concluded.

    BOEM will base the amount of all SAP, COP, and decommissioning financial assurance requirements on cost estimates for meeting all accrued lease obligations at the respective stages of development. The required amount of supplemental and decommissioning financial assurance will be determined on a case-by-case basis.

    The financial terms can be found in Addendum “B” of the lease, which BOEM has made available with this notice on its Web site at: http://www.boem.gov/New-York/.

    Bid Deposit: A bid deposit is an advance cash deposit submitted to BOEM in order to participate in the auction. Each bidder must submit a bid deposit of $450,000 no later than November 28, 2016. Any bidder that fails to submit the bid deposit by this deadline may be disqualified from participating in the auction. Bid deposits will be accepted online via pay.gov.

    Each bidder must fill out the BFF referenced in this FSN. BOEM has made a copy of the form available with this notice on its Web site at: http://www.boem.gov/New-York/. BOEM recommends that each bidder designate an email address in its BFF that the bidder will then use to create an account in pay.gov (if it has not already done so). Bidders may then use the Bid Deposit Form on the pay.gov Web site to leave a deposit.

    BOEM will not consider BFFs submitted by bidders for previous lease sales to satisfy the requirements of this auction. Further, BOEM will only consider BFFs submitted after the deadline if BOEM determines that the failure to timely submit the BFF was caused by events beyond the bidder's control. BOEM will only accept an original, executed paper copy of the BFF. The BFF must be executed by an authorized representative who has been identified in the qualifications package on file with BOEM as authorized to bind the company.

    Following the auction, bid deposits will be applied against bids or other obligations owed to BOEM. If the bid deposit exceeds a bidder's total financial obligation, the balance of the bid deposit will be refunded to the bidder. BOEM will refund bid deposits to non-winners once BOEM has announced the provisional winner.

    Bidders will forfeit their bid deposit if they are the provisionally winning bidder and they fail to execute a lease pursuant to their provisionally winning bid. Exercising the LOR pursuant to the rules described in this notice constitutes a limited exception to this rule: if BOEM notifies a bidder that it may revoke its provisionally winning bid immediately following the lease sale, and the bidder revokes that bid within the allotted time, the bidder will not forfeit its $450,000 bid deposit. If a bidder exercises its LOR in this manner, BOEM will reoffer the lease to the government authority that is the second-highest bidder. In this case, the government authority would inherit the obligation to execute a lease pursuant to its now-provisionally winning bid, forfeiting its bid deposit if it does not execute the lease within the required timeframe.

    If BOEM offers a lease pursuant to a provisionally winning bid, and that bidder fails to timely return the signed lease form, establish financial assurance, or pay the balance of its bid, BOEM will retain that bidder's $450,000 bid deposit. BOEM reserves the right to reconvene the panel to determine which bidder would have won in the absence of the provisionally winning bid, and to offer a lease to that bidder.

    Minimum Bid: The minimum bid is the lowest bid price BOEM will accept as a winning bid, and it is the price at which BOEM will start the monetary bidding. BOEM has established a minimum bid of $2.00 per acre for this lease sale. Accordingly, the minimum bid will be $158,700 for Lease OCS-A 0512 ($2.00 per acre times 79,350 acres).

    Auction Procedures

    Multiple-Factor Bidding

    As authorized under 30 CFR 585.220(a)(4) and 585.221(a)(6), BOEM will use a multiple-factor auction format, with a multiple-factor bidding system, for this lease sale. Under this system, BOEM may consider a combination of monetary and non-monetary factors, or “variables,” in determining the outcome of the auction. BOEM will appoint a panel of BOEM employees to review the non-monetary packages and verify the results of the lease sale. BOEM reserves the right to change the composition of this panel at any time.

    10% Non-Monetary Credit for Government Authorities

    In response to public comments on the PSN, BOEM is offering a 10% non-monetary bid credit in this lease sale for government authorities. To be considered for this non-monetary credit, a bidder must ensure that BOEM receives its non-monetary package no later than November 28, 2016, establishing that the bidder meets the definition of a government authority, below:

    Government Authority: A governmental entity, political subdivision thereof, or public benefit corporation exercising executive and/or regulatory functions within the United States.

    If a bidder wishes to establish itself as a government authority for the purposes of the auction, it must timely submit a non-monetary package for approval by BOEM. The non-monetary package may consist of new information to help a bidder demonstrate its status as a government authority, and/or may reference materials that the bidder has already submitted to BOEM to establish that the bidder is legally qualified to participate in the sale. If bidders wish to review what materials they have already submitted, they should contact Gina Best at 703-787-1341, as soon as practicable.

    Prior to the date of the auction, the panel will determine which bidders, if any, have qualified for the non-monetary credit. Bidders will be notified by email prior to the date of the auction if they have been granted a non-monetary credit. If the panel determines that no bidder is eligible to bid as a government authority and receive a credit, the auction will proceed with no imputed credit applied to any registered bidder's bids. Bidders will not be notified whether other bidders have qualified for a non-monetary credit until after the bidding has concluded.

    Under the format for this sale, in each round a bidder may submit a bid proposal, i.e., a multiple-factor bid, for the LA. The multiple-factor bid made by a particular bidder in each round represents the sum of a non-monetary credit and a monetary (cash) amount. The non-monetary portion of the bid is represented by a 10% credit on the bid. This credit will be applied throughout the auction in each round as a form of imputed payment against the LA's asking price in a bidder's multiple-factor bid. The bid credit will be bundled into each bid. In each round, the auction system will show each bidder how its As-Bid auction price is affected by the credit imputed to its bid.

    Reservation of Limited Opportunity To Revoke (RLOR)

    In response to public comments on the PSN, BOEM is introducing the LOR as a feature of the New York lease sale. Each bidder may download the RLOR form from BOEM's Web site at http://www.boem.gov/New-York/ and complete, sign, and return it to BOEM. BOEM must receive the completed, signed RLOR no later than November 28, 2016. If BOEM does not receive the form by that date, BOEM will presume that the bidder does NOT wish to reserve the LOR. BOEM will consider extensions to this deadline only if BOEM determines that the failure to timely submit an RLOR was caused by events beyond the bidder's control.

    If a bidder opts into an LOR, and then becomes the provisional winner of the auction, it will be given a short opportunity just after the auction to revoke its provisionally winning bid without forfeiting its bid deposit of $450,000, if the second-place bidder is a government authority. Alternatively, bidders may choose not to opt-in. If a provisionally winning bidder does not reserve the LOR, that bidder will not be given an opportunity to revoke its provisionally winning bid following the sale without jeopardizing its bid deposit of $450,000. If a bidder fails to return the form in a timely manner, absent any extension granted by BOEM, it will be deemed to have opted out of its LOR. More information on LOR can be found in the “Determining Provisional Winner” section below.

    The Auction

    The auction will be conducted in a series of rounds. At the start of each round, BOEM will state an asking price for the LA. If a bidder is willing to meet that asking price for the LA, it will indicate this by submitting a bid equal to the asking price, i.e., a live bid. If the bidder has earned a non-monetary credit, it will meet the asking price by submitting a multiple-factor bid—that is, a live bid that consists of a monetary element (90%) and a non-monetary element (10%), the sum of which equals the asking price. Bidders without a non-monetary credit will submit a cash bid equal to the asking price.
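    To illustrate how the 10% credit is bundled into a live bid, the sketch below (hypothetical Python; only the 90/10 split comes from this paragraph, and the names are illustrative) shows the cash portion a credited bidder would owe against a given asking price.

```python
def live_bid(asking_price, has_government_credit):
    """Split an asking price into the monetary (cash) and imputed (credit) portions of a live bid."""
    imputed_credit = round(0.10 * asking_price, 2) if has_government_credit else 0.0
    cash_portion = asking_price - imputed_credit
    return {"asking_price": asking_price, "cash_portion": cash_portion, "imputed_credit": imputed_credit}

# A government authority meeting a $1,000,000 asking price bids $900,000 in cash plus a
# $100,000 imputed credit; a bidder without the credit bids the full amount in cash.
print(live_bid(1_000_000, True))
print(live_bid(1_000_000, False))
```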

    To participate in any round of the auction, a bidder must have submitted a live bid in the previous round. As long as there are two or more live bids for the LA, the auction proceeds to the next round. Between rounds, BOEM will raise the asking price for the LA by an increment that it determines appropriate. Asking price increments are within BOEM's sole discretion, but are based on a number of factors, including the number of bidders still active in the auction and BOEM's best estimate of how many rounds may remain before the auction is resolved. BOEM also reserves the right to increase or decrease bidding increments between rounds, if it determines that a different increment is warranted to enhance the efficiency of the auction process.

    As the auction proceeds, a bidder retains its eligibility to continue bidding as long as that bidder submitted a live bid on the LA in the previous round. Between rounds, BOEM will release information indicating the number of live bids for the LA in the previous round of the auction (i.e., the level of demand) and the asking price for the LA in the upcoming round of the auction. Bidders may be bound by any of their bids until the auction results are finalized.

    Exit Bidding

    In any round after the first round of the auction, a bidder may submit an exit bid that is higher than the previous round's asking price, but less than the current round's asking price. An exit bid must consist of a single offer price. If a bidder submits an exit bid, it is not eligible to participate in subsequent bidding rounds of the auction. During the auction, exit bids will be seen only by BOEM and not by other bidders.

    If the LA receives only exit bids in a round, BOEM will not raise the price and start another round, because no bidders would be eligible to bid in the next round.

    Determining the Provisional Winner

    The auction will end in the first round in which one or zero live bids is received. If one live bid is received, that bid is the provisionally winning bid. If no live bids are received, then the highest exit bid received is the provisionally winning bid. If there is a tie for the highest exit bid, BOEM's tie-breaking procedures will resolve the tie. If no live or exit bids are received, then there is a tie among all bidders that submitted live bids at the most recent asking price, and BOEM's tie-breaking procedures will determine the provisionally winning bid.
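    A simplified sketch of the round-resolution logic described above (hypothetical Python; any tie is deferred to the tie-breaking procedure described later in this notice) follows.

```python
def resolve_round(live_bids, exit_bids):
    """Decide whether the auction ends this round and identify the provisionally winning bid.

    live_bids: list of (bidder, amount) bids at the current asking price
    exit_bids: list of (bidder, amount) exit bids received this round
    Returns (auction_ended, winning_bid_or_None, needs_tiebreak).
    """
    if len(live_bids) >= 2:
        return False, None, False               # two or more live bids: proceed to the next round
    if len(live_bids) == 1:
        return True, live_bids[0], False        # the single live bid wins provisionally
    if exit_bids:
        high = max(amount for _, amount in exit_bids)
        top = [bid for bid in exit_bids if bid[1] == high]
        return True, top[0], len(top) > 1       # highest exit bid wins; flag a tie if needed
    return True, None, True                     # no live or exit bids: tie among prior-round bidders
```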

    LOR

    As noted, in response to public comments on the PSN, this lease sale includes an LOR. Ordinarily, if a provisionally winning bidder does not execute a lease pursuant to that provisionally winning bid, that bidder will forfeit its bid deposit. In this lease sale, a provisionally winning bidder will have a chance to revoke its provisionally winning bid without this penalty, but only under the following circumstances:

    1. The provisionally winning bidder reserved the right to a LOR through a timely-submitted RLOR in advance of the auction; and

    2. The second highest bid was submitted by a government authority.

    If these two elements are satisfied, then BOEM will offer the provisionally winning bidder one hour to revoke its provisionally winning bid. If a government authority is among the bidders tied for the second highest bid, the tie will be resolved first, and an LOR will be offered only if the government authority holds the second-place bid following resolution of the tie.

    The provisionally winning bidder will be given precisely one hour to revoke, using the messaging tool in the auction system. If that bidder wishes to revoke, the message should consist of the following statement:

    “We hereby revoke our provisionally winning bid for ATLW-6, pursuant to the Reservation of Limited Opportunity to Revoke form submitted previously.”

    If the statement above is not included verbatim in the message a bidder uses to exercise its limited right to revoke, BOEM may not accept the LOR. Once BOEM receives this message, it will consider the provisionally winning bid to be revoked. If the provisionally winning bidder revokes its bid, the government authority will then become the new provisionally winning bidder and will be subject to the conditions in 30 CFR 585.224. In this case, the provisionally winning bid will be the government authority's last bid for the LA.

    If the provisionally winning bidder does not revoke its bid within the designated hour, BOEM's requirements for the bidder will be the same as they would be for a sale without the LOR. Pursuant to 30 CFR 585.224, once BOEM sends the lease copies to the bidder, the bidder must timely pay the balance of its bid, establish financial assurance, and properly sign and return the lease copies. If the bidder fails to do so, then BOEM may not issue the lease to that bidder, in which case the bidder would forfeit its bid deposit. BOEM may consider a bidder's failure to timely pay the full amount due an indication that the bidder is no longer financially qualified to participate in other lease sales under BOEM's regulations at 30 CFR 585.106 and 585.107.

    If the highest bidder revokes its provisionally winning bid pursuant to an LOR, the government authority with the second-highest bid in the auction becomes the provisionally winning bidder and must follow all of BOEM's requirements contained in 30 CFR 585.224. The government authority would then need to execute a lease pursuant to its provisionally winning bid, or risk forfeiture of its bid deposit.

    BOEM will use its tie-breaking procedures to resolve any ties before determining whether the conditions have been met for offering a provisionally winning bidder a LOR. Ties are resolved by a random process. The auction system generates a random number for each bidder. In the event of a tie, these numbers are compared, and the bidder with the higher random number is deemed the provisional winner.
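    A minimal sketch of that random tie-break (hypothetical Python; names are illustrative) is shown below.

```python
import random

def break_tie(tied_bidders):
    """Assign each tied bidder a random draw and select the bidder with the highest value."""
    draws = {bidder: random.random() for bidder in tied_bidders}
    return max(draws, key=draws.get)

# Example: two bidders tied on the highest exit bid.
print(break_tie(["Bidder A", "Bidder B"]))
```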

    Following the lease sale, the non-monetary panel will convene, review the auction record, and certify the results of the sale. Shortly thereafter, BOEM will notify the DOJ that it may begin its antitrust review pursuant to 43 U.S.C. 1337(c).

    If a bidder fails to execute a lease pursuant to a provisionally winning bid, BOEM may reoffer that lease to the next highest bidder. If the bidder that fails to execute is a government authority that had been declared the provisional winner after the exercise of a LOR, BOEM may first reoffer the lease to the bidder that had exercised the LOR. If BOEM reoffers the lease following a bidder's failure to execute a lease pursuant to a provisionally winning bid, the second bidder to which the lease is offered may decline the offer without forfeiting its bid deposit.

    Additional Information Regarding the Auction Format

    Bidder Authentication

    For the online auction, BOEM will require two-factor authentication. Prior to the auction, the Auction Manager will send several bidder authentication packages to the bidders shortly after BOEM has processed the BFFs. One package will contain digital authentication tokens allowing access to the auction Web site. The tokens will be mailed to the Primary Point of Contact indicated on the BFF. This individual is responsible for distributing the tokens to the individuals authorized to bid for that company. Bidders are to ensure that each token is returned within three business days following the auction. An addressed, stamped envelope will be provided to facilitate this process. In the event that a bidder fails to submit a bid deposit or does not participate in the auction, BOEM will de-activate that bidder's token and login information, and the bidder will be asked to return its tokens.

    The second package contains login credentials for authorized bidders. The login credentials will be mailed to the address provided in the BFF for each authorized individual. Bidders can confirm these addresses by calling 703-787-1320. This package will contain user login information and instructions for accessing the Auction System Technical Supplement and Alternative Bidding Form. The login information, along with the tokens, will be tested during the Mock Auction.

    Timing of Auction

    The auction will begin at 8:30 a.m. EST on December 15, 2016. Bidders may log in as early as 8:00 a.m. on that day. We recommend that bidders log in earlier than 8:30 a.m. on that day to ensure that any login issues are resolved prior to the start of the auction. Once bidders have logged in, they should review the auction schedule, which lists the start times, end times, and recess times of each round in the auction. Each round is structured as follows:

    • Round bidding begins;

    • Bidders enter their bids;

    • Round bidding ends and the Recess begins;

    • During the Recess, previous Round results are posted;

    • Bidders review the previous Round results and prepare their next Round bids; and

    • Next Round bidding begins.

    The first round will last about 30 minutes, though subsequent rounds may be shorter. Recesses are anticipated to last approximately 10 minutes. The descriptions of the auction schedule and asking price increments included with this FSN are tentative. Bidders should consult the auction schedule on the bidding Web site during the auction for updated times. Bidding will continue until about 6:00 p.m. each day. BOEM anticipates the auction will last one or two business days, but bidders are advised to prepare to continue bidding for additional business days as necessary to resolve the auction.

    BOEM and the auction contractors will use the auction platform messaging service to keep bidders informed on issues of interest during the auction. For example, BOEM may change the schedule at any time, including during the auction. If BOEM changes the schedule during the auction, it will use the messaging feature to notify bidders that a revision has been made, and direct bidders to the relevant page. BOEM will also use the messaging system for other changes and items of note during the auction.

    Bidders may place bids at any time during the round. At the top of the bidding page, a countdown clock will show how much time remains in the round. Bidders have until the scheduled time to place bids. Bidders should do so according to the procedures described in this notice, and the Auction System Technical Supplement. No information about the round is available until the round has closed and results have been posted, so there should be no strategic advantage to placing bids early or late in the round.

    The timing of the auction will be elaborated on and clarified in the Auction System Technical Supplement available on BOEM's Web site at: http://www.boem.gov/New-York/. The Auction System Technical Supplement describes auction procedures that are incorporated by reference in this notice, unless the procedures described in the Auction System Technical Supplement directly contradict this notice. In the event of a contradiction, this FSN is controlling.

    Prohibition on Communications Between Bidders During Auction

    During the auction, and including one hour after the auction if LOR is triggered, bidders are prohibited from communicating with each other regarding their participation in the auction. Additionally, during the auction, and including one hour after the auction if LOR is triggered, bidders are prohibited from communicating to the general public, including, but not limited to, through social media, updated Web sites, or press releases, regarding any aspect of their participation or lack thereof in the auction.

    Alternate Bidding Procedures

    Alternate Bidding Procedures enable a bidder that is having difficulties accessing the Internet to submit its bid via fax using an Alternate Bidding Form available on BOEM's Web site at: http://www.boem.gov/New-York/.

    In order to be authorized to use an Alternative Bidding Form, a bidder must call the help desk number listed in the Auction Manual before the end of the round. BOEM will authenticate the caller to ensure he/she is authorized to bid on behalf of the bidder. The bidder must explain the reasons for which he/she is forced to place a bid using the Alternate Bidding Procedures. BOEM may, in its sole discretion, permit or refuse to accept a request for the placement of a bid using the Alternate Bidding Procedures.

    If bidders need to submit an Alternate Bidding Form, they are strongly encouraged to do so before the round ends.

    Rejection or Non-Acceptance of Bids: BOEM reserves the right and authority to reject any and all bids that do not satisfy the requirements and rules of the auction, the FSN, and all applicable regulations and statutes.

    Anti-Competitive Review

    Bidding behavior in this sale is subject to Federal antitrust laws. Accordingly, following the auction, but before the acceptance of bids and the issuance of leases, BOEM will “allow the Attorney General, in consultation with the Federal Trade Commission, 30 days to review the results of the lease sale.” 43 U.S.C. 1337(c). If a bidder is found to have engaged in anti-competitive behavior in connection with its participation in the competitive bidding process, BOEM may reject the provisionally winning bid. Compliance with BOEM's auction procedures and regulations is not an absolute defense to violations of antitrust laws.

    Anti-competitive behavior determinations are fact-specific. However, such behavior may manifest itself in several different ways, including, but not limited to:

    • An express or tacit agreement among bidders not to bid in an auction, or to bid a particular price;

    • An agreement among bidders not to bid for a particular LA;

    • An agreement among bidders not to bid against each other; or

    • Other agreements among bidders that have the potential to affect the final auction price.

    BOEM will decline to award a lease if the Attorney General, in consultation with the Federal Trade Commission, determines that doing so would be inconsistent with the antitrust laws. 43 U.S.C. 1337(c).

    For more information on whether specific communications or agreements could constitute a violation of Federal antitrust law, please see: http://www.justice.gov/atr/public/business-resources.html, or consult counsel.

    Process for Issuing the Lease: Once all post-auction reviews have been completed to BOEM's satisfaction, BOEM will issue three unsigned copies of the lease to the provisionally winning bidder. Within 10 business days after receiving the lease copies, the provisionally winning bidder must:

    1. Sign the lease on the bidder's behalf;

    2. File financial assurance, as required under 30 CFR 585.515-537; and

    3. Pay by electronic funds transfer (EFT) the balance (if any) of the bonus bid (winning bid less the bid deposit). BOEM requires bidders to use EFT procedures (not pay.gov, the Web site bidders used to submit bid deposits) for payment of the balance of the bonus bid, following the detailed instructions contained in the “Instructions for Making Electronic Payments” available on BOEM's Web site at: http://www.boem.gov/New-York/.

    BOEM will not execute a lease until the three requirements above have been satisfied, BOEM has accepted the provisionally winning bidder's financial assurance pursuant to 30 CFR 585.515, and BOEM has processed the provisionally winning bidder's payment.

    BOEM may extend the ten business day deadline for executing the lease on the bidder's behalf, filing the required financial assurance, and/or paying the balance of the bonus bid if it determines the delay was caused by events beyond the provisionally winning bidder's control.

    If the provisionally winning bidder does not meet these requirements or otherwise fails to comply with applicable regulations or the terms of the FSN, BOEM reserves the right to not issue the lease to that bidder. In such a case, the provisionally winning bidder will forfeit its bid deposit.

    Within 45 calendar days of the date that the provisionally winning bidder receives copies of the lease, it must pay the first year's rent using the pay.gov Renewable Energy Initial Rental Payment form available at: https://pay.gov/paygov/forms/formInstance.html?agencyFormId=27797604. Subsequent annual rent payments must be made following the detailed instructions contained in the “Instructions for Making Electronic Payments,” available on BOEM's Web site at: http://www.boem.gov/New-York/.

    Non-Procurement Debarment and Suspension Regulations: Pursuant to regulations at 43 CFR part 42, subpart C, an OCS renewable energy lessee must comply with the Department of the Interior's non-procurement debarment and suspension regulations at 2 CFR 180 and 1400. The lessee must also communicate this requirement to persons with whom the lessee does business relating to this lease, by including this term as a condition in its contracts and other transactions.

    Force Majeure: The Program Manager of BOEM's Office of Renewable Energy Programs has the discretion to change any auction details specified in the FSN, including the date and time, in case of a force majeure event that the Program Manager determines may interfere with a fair and proper lease sale process. Such events may include, but are not limited to: Natural disasters (e.g., earthquakes, hurricanes, floods, blizzards), wars, riots, acts of terrorism, fire, strikes, civil disorder or other events of a similar nature. In case of such an event, BOEM will notify all bidders via email, phone, or through the BOEM Web site at: http://www.boem.gov/Renewable-Energy-Program/index.aspx. Bidders should call 703-787-1320 if they have concerns.

    Appeals: The appeals procedures are provided in BOEM's regulations at 30 CFR 585.118(c) and 585.225. Pursuant to 30 CFR 585.225:

    (a) If BOEM rejects your bid, BOEM will provide a written statement of the reasons and refund any money deposited with your bid, without interest.

    (b) You will then be able to ask the BOEM Director for reconsideration, in writing, within 15 business days of bid rejection, under 30 CFR 585.118(c)(1). We will send you a written response either affirming or reversing the rejection.

    The procedures for appealing final decisions with respect to lease sales are described in 30 CFR 585.118(c).

    Protection of Privileged or Confidential Information

    Consistent with the Freedom of Information Act (FOIA), BOEM will protect privileged or confidential information that you submit. Exemption 4 of FOIA applies to “trade secrets and commercial or financial information that you submit that is privileged or confidential.” 5 U.S.C. 552(b)(4). If you wish to protect the confidentiality of such information, clearly mark it, “Contains Privileged or Confidential Information,” and consider submitting such information as a separate attachment. BOEM will not disclose such information, except as required by FOIA. Information that is not labeled as privileged or confidential will be regarded by BOEM as suitable for public release. Further, BOEM will not treat as confidential aggregate summaries of otherwise confidential information.

    Authority:

    This FSN is published pursuant to subsection 8(p) of the OCS Lands Act (43 U.S.C. 1337(p)) (“the Act”), as amended by section 388 of the Energy Policy Act of 2005 (EPAct), and the implementing regulations at 30 CFR part 585, including sections 211 and 216.

    Dated: October 25, 2016. Abigail Ross Hopper, Director, Bureau of Ocean Energy Management.
    [FR Doc. 2016-26240 Filed 10-28-16; 8:45 am] BILLING CODE 4310-MR-P
    DEPARTMENT OF THE INTERIOR Bureau of Ocean Energy Management [Docket No. BOEM-2016-0066] Environmental Assessment for Commercial Wind Lease Issuance and Site Assessment Activities on the Atlantic Outer Continental Shelf Offshore New York; MMAA104000 AGENCY:

    Bureau of Ocean Energy Management (BOEM), Interior.

    ACTION:

    Notice of availability of a revised environmental assessment and a finding of no significant impact.

    SUMMARY:

    BOEM is announcing the availability of a revised environmental assessment (EA) and finding of no significant impact (FONSI) for commercial wind lease issuance, site characterization activities (geophysical, geotechnical, archaeological, and biological surveys), and site assessment activities (including the installation and operation of a meteorological tower or buoys or both a tower and buoys) on the Atlantic Outer Continental Shelf offshore New York. The revised EA provides a discussion of potential impacts of the proposed action and an analysis of reasonable alternatives to the proposed action. In accordance with the requirements of the National Environmental Policy Act (NEPA) and the Council on Environmental Quality's (CEQ) regulations implementing NEPA at 40 CFR 1500-1508, BOEM issued a FONSI supported by the analysis in the revised EA. The FONSI concluded that the reasonably foreseeable environmental impacts associated with the proposed action and alternatives, as set forth in the EA, would not significantly impact the quality of the human environment; therefore, the preparation of an environmental impact statement is not required. This notice is being published concurrently with the Final Sale Notice for the New York Wind Energy Area (WEA). These documents and associated information are available on BOEM's Web site at http://www.boem.gov/New-York/.

    FOR FURTHER INFORMATION CONTACT:

    Michelle Morin, BOEM Office of Renewable Energy Programs, 45600 Woodland Road, Sterling, Virginia 20166, (703) 787-1340 or [email protected].

    SUPPLEMENTARY INFORMATION:

    In June 2016, BOEM published an EA to consider the reasonably foreseeable environmental consequences associated with commercial wind lease issuance, site characterization activities, and site assessment activities within the WEA offshore New York. A notice was published on June 6, 2016, to announce the availability of the EA and initiate a 30-day public comment period (81 FR 36344). Due to requests for extension, the public comment period closed on July 13, 2016. The EA was subsequently revised based on comments received through Regulations.gov and at public information meetings during the comment period. The revised EA provides updated environmental data, incorporates the results of consultations, and reflects a change to the proposed lease area (i.e., removal of Cholera Bank sensitive habitat). The revised EA also includes a summary of comments received on the June 2016 EA and BOEM's responses to those comments.

    In addition to the proposed action, the revised EA considers two alternatives: (1) Restricting site assessment structure placement within 2 nm (3.7 km) of the traffic separation scheme, and (2) no action. BOEM's analysis of the proposed action and alternatives takes into account standard operating conditions (SOCs) designed to avoid or minimize potential impacts to marine mammals and sea turtles. The SOCs can be found in Appendix B of the revised EA.

    BOEM will use the revised EA to inform its decisions regarding lease issuance in the New York WEA and subsequent review of site assessment plans in the lease area. The competitive leasing process is set forth at 30 CFR 585.210-585.225. A future lessee may propose a wind energy generation facility on its lease by submitting a construction and operations plan (COP) to BOEM. BOEM would then prepare a separate site- and project-specific NEPA analysis of the activities proposed in the COP.

    Authority:

    This notice of availability for an EA is in compliance with the National Environmental Policy Act (NEPA) of 1969, as amended (42 U.S.C. 4321 et seq.), and is published pursuant to 43 CFR 46.305.

    Dated: October 25, 2016. Abigail Ross Hopper, Director, Bureau of Ocean Energy Management.
    [FR Doc. 2016-26237 Filed 10-28-16; 8:45 am] BILLING CODE 4310-MR-P
    DEPARTMENT OF JUSTICE [OMB Number 1105-0086] Agency Information Collection Activities; Proposed eCollection eComments Requested; Proposed Renewal, With Change, of a Previously Approved Collection; Attorney Student Loan Repayment Program Electronic Forms AGENCY:

    Department of Justice.

    ACTION:

    30-day notice.

    SUMMARY:

    The Department of Justice (DOJ), Justice Management Division, Office of Attorney Recruitment and Management (OARM), will be submitting the following information collection request to the Office of Management and Budget (OMB) for review and approval in accordance with the Paperwork Reduction Act of 1995. This proposed information collection was previously published in the Federal Register at 81 FR 54604 on August 16, 2016, allowing for a 60-day comment period.

    DATES:

    Comments are encouraged and will be accepted for an additional 30 days until November 30, 2016.

    FOR FURTHER INFORMATION CONTACT:

    Written comments and/or suggestions regarding the item(s) contained in this notice, especially regarding the estimated public burden and associated response time, should be directed to the U.S. Department of Justice, Office of Attorney Recruitment and Management, 450 5th Street NW., Suite 10200, Attn: Deana Willis, Washington, DC 20530 or sent to [email protected]. Written comments and/or suggestions can also be sent to the Office of Management and Budget, Office of Information and Regulatory Affairs, Attention Department of Justice Desk Officer, Washington, DC 20503 or sent to [email protected].

    SUPPLEMENTARY INFORMATION:

    Written comments and suggestions from the public and affected agencies concerning the proposed collection of information are encouraged. Your comments should address one or more of the following four points:

    (1) Evaluate whether the proposed collection of information is necessary for the proper performance of the functions of the agency, including whether the information will have practical utility;

    (2) Evaluate the accuracy of the agency's estimate of the burden of the proposed collection of information;

    (3) Enhance the quality, utility, and clarity of the information to be collected; and

    (4) Minimize the burden of the collection of information on those who are to respond, including through the use of appropriate automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses.

    Overview of this information collection:

    1. Type of Information Collection: Revision and renewal of a currently approved collection.

    2. The Title of the Form/Collection: Attorney Student Loan Repayment Program Electronic Forms.

    3. The agency form number, if any, and the applicable component of the Department sponsoring the collection: Form Number: None. Office of Attorney Recruitment and Management, Justice Management Division, U.S. Department of Justice.

    4. Affected public who will be asked or required to respond, as well as a brief abstract: Primary: Individuals or households. Other: None.

    The Department of Justice Attorney Student Loan Repayment Program (ASLRP) is an agency recruitment and retention incentive program based on 5 U.S.C. 5379, as amended, and 5 CFR part 537. Anyone currently employed as an attorney or hired to serve in an attorney position within the Department may request consideration for the ASLRP. The Department selects new participants during an annual open season each spring and renews current beneficiaries who remain qualified for these benefits, subject to availability of funds. There are two application forms—one for new requests, and the other for renewal requests. A justification form (applicable to new requests only) and a loan continuation form complete the collection.

    5. An estimate of the total number of respondents and the amount of time estimated for an average respondent to respond: The Department anticipates about 275 respondents annually will complete the new request form and justification form and apply for participation in the ASLRP. In addition, each year the Department expects to receive approximately 110 applications from attorneys requesting renewal of the benefits they received in previous years. It is estimated that each new request (including justification) will take two (2) hours to complete, and each renewal request approximately 20 minutes to complete.

    6. An estimate of the total public burden (in hours) associated with the collection: The estimated public burden associated with this collection is 586 hours, 40 minutes. It is estimated that new applicants will take 2 hours to complete the request form and justification and that current recipients requesting continued funding will take 20 minutes to complete a renewal form. The burden hours for collecting respondent data, 586 hours, 40 minutes, are calculated as follows: 275 new respondents × 2 hours = 550 hours, plus 110 renewing respondents × 20 minutes = 36 hours, 40 minutes.
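    The burden total can be verified with simple arithmetic; the sketch below (hypothetical Python) reproduces the calculation.

```python
new_request_hours = 275 * 2           # 550 hours for new requests (2 hours each)
renewal_minutes = 110 * 20            # 2,200 minutes for renewal requests (20 minutes each)
total_minutes = new_request_hours * 60 + renewal_minutes
print(divmod(total_minutes, 60))      # -> (586, 40), i.e., 586 hours, 40 minutes
```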

    If additional information is required contact: Jerri Murray, Department Clearance Officer, United States Department of Justice, Justice Management Division, Policy and Planning Staff, Two Constitution Square, 145 N Street NE., 3E.405B, Washington, DC 20530.

    Dated: October 26, 2016. Jerri Murray, Department Clearance Officer for PRA, U.S. Department of Justice.
    [FR Doc. 2016-26161 Filed 10-28-16; 8:45 am] BILLING CODE 4410-PB-P
    DEPARTMENT OF LABOR Office of the Secretary Agency Information Collection Activities; Submission for OMB Review; Comment Request; Special Dipping and Coating Operations (Dip Tanks) ACTION:

    Notice.

    SUMMARY:

    The Department of Labor (DOL) is submitting the Occupational Safety and Health Administration (OSHA) sponsored information collection request (ICR) titled, “Special Dipping and Coating Operations (Dip Tanks),” to the Office of Management and Budget (OMB) for review and approval for continued use, without change, in accordance with the Paperwork Reduction Act of 1995 (PRA), 44 U.S.C. 3501 et seq. Public comments on the ICR are invited.

    DATES:

    The OMB will consider all written comments that the agency receives on or before November 30, 2016.

    ADDRESSES:

    A copy of this ICR with applicable supporting documentation; including a description of the likely respondents, proposed frequency of response, and estimated total burden may be obtained free of charge from the RegInfo.gov Web site at http://www.reginfo.gov/public/do/PRAViewICR?ref_nbr=201604-1218-003 (this link will only become active on the day following publication of this notice) or by contacting Michel Smyth by telephone at 202-693-4129, TTY 202-693-8064, (these are not toll-free numbers) or by email at [email protected].

    Submit comments about this request by mail or courier to the Office of Information and Regulatory Affairs, Attn: OMB Desk Officer for DOL-OSHA, Office of Management and Budget, Room 10235, 725 17th Street NW., Washington, DC 20503; by Fax: 202-395-5806 (this is not a toll-free number); or by email: [email protected]. Commenters are encouraged, but not required, to send a courtesy copy of any comments by mail or courier to the U.S. Department of Labor-OASAM, Office of the Chief Information Officer, Attn: Departmental Information Compliance Management Program, Room N1301, 200 Constitution Avenue NW., Washington, DC 20210; or by email: [email protected].

    FOR FURTHER INFORMATION CONTACT:

    Michel Smyth by telephone at 202-693-4129, TTY 202-693-8064, (these are not toll-free numbers) or by email at [email protected].

    (Authority: 44 U.S.C. 3507(a)(1)(D)).
    SUPPLEMENTARY INFORMATION:

    This ICR seeks to extend PRA authority for the Special Dipping and Coating Operations (Dip Tanks) information collection. The Dipping and Coating Operations Standard requires employers to post a conspicuous sign near each piece of electrostatic detearing equipment that notifies employees of the minimum safe distance they must maintain between goods undergoing electrostatic detearing and the electrodes or conductors of the equipment used in the process. See 29 CFR 1910.126(g)(4). Occupational Safety and Health Act sections 2(b)(9), 6, and 8(c) authorize this information collection. See 29 U.S.C. 651(b)(9), 655 and 657(c).

    This information collection is subject to the PRA. A Federal agency generally cannot conduct or sponsor a collection of information, and the public is generally not required to respond to an information collection, unless it is approved by the OMB under the PRA and displays a currently valid OMB Control Number. In addition, notwithstanding any other provisions of law, no person shall generally be subject to penalty for failing to comply with a collection of information that does not display a valid Control Number. See 5 CFR 1320.5(a) and 1320.6. The DOL obtains OMB approval for this information collection under OMB Control Number 1218-0237.

    OMB authorization for an ICR cannot be for more than three (3) years without renewal, and the current approval for this collection is scheduled to expire on October 31, 2016. The DOL seeks to extend PRA authorization for this information collection for three (3) more years, without any change to existing requirements. The DOL notes that existing information collection requirements submitted to the OMB receive a month-to-month extension while they undergo review. For additional substantive information about this ICR, see the related notice published in the Federal Register on March 11, 2016 (81 FR 12967).

    Interested parties are encouraged to send comments to the OMB, Office of Information and Regulatory Affairs at the address shown in the ADDRESSES section within thirty (30) days of publication of this notice in the Federal Register. In order to help ensure appropriate consideration, comments should mention OMB Control Number 1218-0237. The OMB is particularly interested in comments that:

    • Evaluate whether the proposed collection of information is necessary for the proper performance of the functions of the agency, including whether the information will have practical utility;

    • Evaluate the accuracy of the agency's estimate of the burden of the proposed collection of information, including the validity of the methodology and assumptions used;

    • Enhance the quality, utility, and clarity of the information to be collected; and

    • Minimize the burden of the collection of information on those who are to respond, including through the use of appropriate automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses.

    Agency: DOL-OSHA.

    Title of Collection: Special Dipping and Coating Operations (Dip Tanks).

    OMB Control Number: 1218-0237.

    Affected Public: Private Sector—businesses or other for-profits.

    Total Estimated Number of Respondents: 10.

    Total Estimated Number of Responses: 10.

    Total Estimated Annual Time Burden: 1 hour.

    Total Estimated Annual Other Costs Burden: $0

    Dated: October 25, 2016. Michel Smyth, Departmental Clearance Officer.
    [FR Doc. 2016-26213 Filed 10-28-16; 8:45 am] BILLING CODE 4510-26-P
    DEPARTMENT OF LABOR Office of the Secretary Agency Information Collection Activities; Submission for OMB Review; Comment Request; Safety Standards for Underground Coal Mine Ventilation—Belt Entry Used as an Intake Air Course To Ventilate Working Sections and Areas Where Mechanized Mining Equipment Is Being Installed or Removed ACTION:

    Notice.

    SUMMARY:

    The Department of Labor (DOL) is submitting the Mine Safety and Health Administration (MSHA) sponsored information collection request (ICR) titled, “Safety Standards for Underground Coal Mine Ventilation—Belt Entry Used as an Intake Air Course to Ventilate Working Sections and Areas Where Mechanized Mining Equipment is Being Installed or Removed,” to the Office of Management and Budget (OMB) for review and approval for continued use, without change, in accordance with the Paperwork Reduction Act of 1995 (PRA), 44 U.S.C. 3501 et seq. Public comments on the ICR are invited.

    DATES:

    The OMB will consider all written comments that the agency receives on or before November 30, 2016.

    ADDRESSES:

    A copy of this ICR with applicable supporting documentation; including a description of the likely respondents, proposed frequency of response, and estimated total burden may be obtained free of charge from the RegInfo.gov Web site at http://www.reginfo.gov/public/do/PRAViewICR?ref_nbr=201609-1219-002 (this link will only become active on the day following publication of this notice) or by contacting Michel Smyth by telephone at 202-693-4129, TTY 202-693-8064, (these are not toll-free numbers) or by email at [email protected].

    Submit comments about this request by mail or courier to the Office of Information and Regulatory Affairs, Attn: OMB Desk Officer for DOL-MSHA, Office of Management and Budget, Room 10235, 725 17th Street NW., Washington, DC 20503; by Fax: 202-395-5806 (this is not a toll-free number); or by email: [email protected]. Commenters are encouraged, but not required, to send a courtesy copy of any comments by mail or courier to the U.S. Department of Labor-OASAM, Office of the Chief Information Officer, Attn: Departmental Information Compliance Management Program, Room N1301, 200 Constitution Avenue NW., Washington, DC 20210; or by email: [email protected].

    FOR FURTHER INFORMATION CONTACT:

    Michel Smyth by telephone at 202-693-4129, TTY 202-693-8064, (these are not toll-free numbers) or by email at [email protected].

    (Authority: 44 U.S.C. 3507(a)(1)(D)).
    SUPPLEMENTARY INFORMATION:

    This ICR seeks to extend PRA authority for the Safety Standards for Underground Coal Mine Ventilation—Belt Entry Used as an Intake Air Course to Ventilate Working Sections and Areas Where Mechanized Mining Equipment is Being Installed or Removed information collection requirements codified in regulations at 30 CFR part 75. More specifically, 30 CFR 75.351 makes it mandatory for a mine operator electing to use belt air to ventilate a working section or area where mechanized equipment is being installed or removed to maintain records used by coal mine supervisors, miners, and Federal and State mine inspectors to show that required examinations and tests were conducted. These records give insight into hazardous conditions that have been or may be encountered. Inspection records help in making decisions that ultimately affect the safety and health of miners working in belt air mines. Federal Mine Safety and Health Act of 1977 sections 101(a) and 103(h) authorize this information collection. See 30 U.S.C. 811(a) and 813(h).

    This information collection is subject to the PRA. A Federal agency generally cannot conduct or sponsor a collection of information, and the public is generally not required to respond to an information collection, unless it is approved by the OMB under the PRA and displays a currently valid OMB Control Number. In addition, notwithstanding any other provisions of law, no person shall generally be subject to penalty for failing to comply with a collection of information that does not display a valid Control Number. See 5 CFR 1320.5(a) and 1320.6. The DOL obtains OMB approval for this information collection under Control Number 1219-0138.

    OMB authorization for an ICR cannot be for more than three (3) years without renewal, and the current approval for this collection is scheduled to expire on December 31, 2016. The DOL seeks to extend PRA authorization for this information collection for three (3) more years, without any change to existing requirements. The DOL notes that existing information collection requirements submitted to the OMB receive a month-to-month extension while they undergo review. For additional substantive information about this ICR, see the related notice published in the Federal Register on June 30, 2016 (81 FR 42735).

    Interested parties are encouraged to send comments to the OMB, Office of Information and Regulatory Affairs at the address shown in the ADDRESSES section within thirty (30) days of publication of this notice in the Federal Register. In order to help ensure appropriate consideration, comments should mention OMB Control Number 1219-0138. The OMB is particularly interested in comments that:

    • Evaluate whether the proposed collection of information is necessary for the proper performance of the functions of the agency, including whether the information will have practical utility;

    • Evaluate the accuracy of the agency's estimate of the burden of the proposed collection of information, including the validity of the methodology and assumptions used;

    • Enhance the quality, utility, and clarity of the information to be collected; and

    • Minimize the burden of the collection of information on those who are to respond, including through the use of appropriate automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses.

    Agency: DOL-MSHA.

    Title of Collection: Safety Standards for Underground Coal Mine Ventilation—Belt Entry Used as an Intake Air Course to Ventilate Working Sections and Areas Where Mechanized Mining Equipment is Being Installed or Removed.

    OMB Control Number: 1219-0138.

    Affected Public: Private Sector—businesses or other for-profits.

    Total Estimated Number of Respondents: 17.

    Total Estimated Number of Responses: 205.

    Total Estimated Annual Time Burden: 3,441 hours.

    Total Estimated Annual Other Costs Burden: $54,740.

    Dated: October 25, 2016. Michel Smyth, Departmental Clearance Officer.
    [FR Doc. 2016-26212 Filed 10-28-16; 8:45 am] BILLING CODE 4510-43-P
    DEPARTMENT OF LABOR Office of the Secretary Agency Information Collection Activities; Submission for OMB Review; Comment Request; Derricks Standard ACTION:

    Notice.

    SUMMARY:

    The Department of Labor (DOL) is submitting the Occupational Safety and Health Administration (OSHA) sponsored information collection request (ICR) titled, “Derricks Standard,” to the Office of Management and Budget (OMB) for review and approval for continued use, without change, in accordance with the Paperwork Reduction Act of 1995 (PRA), 44 U.S.C. 3501 et seq. Public comments on the ICR are invited.

    DATES:

The OMB will consider all written comments that the agency receives on or before November 30, 2016.

    ADDRESSES:

A copy of this ICR with applicable supporting documentation, including a description of the likely respondents, proposed frequency of response, and estimated total burden, may be obtained free of charge from the RegInfo.gov Web site at http://www.reginfo.gov/public/do/PRAViewICR?ref_nbr=201608-1218-007 (this link will only become active on the day following publication of this notice) or by contacting Michel Smyth by telephone at 202-693-4129, TTY 202-693-8064, (these are not toll-free numbers) or by email at [email protected].

    Submit comments about this request by mail or courier to the Office of Information and Regulatory Affairs, Attn: OMB Desk Officer for DOL-OSHA, Office of Management and Budget, Room 10235, 725 17th Street NW., Washington, DC 20503; by Fax: 202-395-5806 (this is not a toll-free number); or by email: [email protected]. Commenters are encouraged, but not required, to send a courtesy copy of any comments by mail or courier to the U.S. Department of Labor-OASAM, Office of the Chief Information Officer, Attn: Departmental Information Compliance Management Program, Room N1301, 200 Constitution Avenue NW., Washington, DC 20210; or by email: [email protected].

    FOR FURTHER INFORMATION CONTACT:

    Contact Michel Smyth by telephone at 202-693-4129, TTY 202-693-8064, (these are not toll-free numbers) or by email at [email protected].

    Authority:

    44 U.S.C. 3507(a)(1)(D).

    SUPPLEMENTARY INFORMATION:

    This ICR seeks to extend PRA authority for the Derricks Standard information collection requirements codified in regulations 29 CFR 1910.181. The specified requirements are for marking the rated load on derricks, preparing certification records that verify the inspection of derrick ropes, and posting warning signs while the derrick is undergoing adjustments and repairs. Certification records must be maintained and disclosed upon request. Occupational Safety and Health Act sections 2(b)(3), 6(b)(7), and 8(c) authorize this information collection. See 29 U.S.C. 651(b)(3), 655(b)(7), and 657(c).

    This information collection is subject to the PRA. A Federal agency generally cannot conduct or sponsor a collection of information, and the public is generally not required to respond to an information collection, unless it is approved by the OMB under the PRA and displays a currently valid OMB Control Number. In addition, notwithstanding any other provisions of law, no person shall generally be subject to penalty for failing to comply with a collection of information that does not display a valid Control Number. See 5 CFR 1320.5(a) and 1320.6. The DOL obtains OMB approval for this information collection under Control Number 1218-0222.

    OMB authorization for an ICR cannot be for more than three (3) years without renewal, and the current approval for this collection is scheduled to expire on October 31, 2016. The DOL seeks to extend PRA authorization for this information collection for three (3) more years, without any change to existing requirements. The DOL notes that existing information collection requirements submitted to the OMB receive a month-to-month extension while they undergo review. For additional substantive information about this ICR, see the related notice published in the Federal Register on June 10, 2016 (81 FR 37644).

    Interested parties are encouraged to send comments to the OMB, Office of Information and Regulatory Affairs at the address shown in the ADDRESSES section within thirty (30) days of publication of this notice in the Federal Register. In order to help ensure appropriate consideration, comments should mention OMB Control Number 1218-0222. The OMB is particularly interested in comments that:

    • Evaluate whether the proposed collection of information is necessary for the proper performance of the functions of the agency, including whether the information will have practical utility;

    • Evaluate the accuracy of the agency's estimate of the burden of the proposed collection of information, including the validity of the methodology and assumptions used;

    • Enhance the quality, utility, and clarity of the information to be collected; and

    • Minimize the burden of the collection of information on those who are to respond, including through the use of appropriate automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses.

    Agency: DOL-OSHA.

    Title of Collection: Derricks Standard.

    OMB Control Number: 1218-0222.

    Affected Public: Private Sector—businesses or other for-profits.

    Total Estimated Number of Respondents: 500.

    Total Estimated Number of Responses: 7,750.

    Total Estimated Annual Time Burden: 1,355 hours.

    Total Estimated Annual Other Costs Burden: $0.

    Dated: October 24, 2016. Michel Smyth, Departmental Clearance Officer.
    [FR Doc. 2016-26121 Filed 10-28-16; 8:45 am] BILLING CODE 4510-26-P
    DEPARTMENT OF LABOR Occupational Safety and Health Administration [Docket No. OSHA-2007-0039] Intertek Testing Services NA, Inc.: Application for Expansion of Recognition AGENCY:

    Occupational Safety and Health Administration (OSHA), Labor.

    ACTION:

    Notice.

    SUMMARY:

    In this notice, OSHA announces the application of Intertek Testing Services NA, Inc. for expansion of its recognition as a Nationally Recognized Testing Laboratory (NRTL) and presents the Agency's preliminary finding to grant the application.

    DATES:

    Submit comments, information, and documents in response to this notice, or requests for an extension of time to make a submission, on or before November 15, 2016.

    ADDRESSES:

    Submit comments by any of the following methods:

    1. Electronically: Submit comments and attachments electronically at http://www.regulations.gov, which is the Federal eRulemaking Portal. Follow the instructions online for making electronic submissions.

    2. Facsimile: If submissions, including attachments, are not longer than 10 pages, commenters may fax them to the OSHA Docket Office at (202) 693-1648.

    3. Regular or express mail, hand delivery, or messenger (courier) service: Submit comments, requests, and any attachments to the OSHA Docket Office, Docket No. OSHA-2007-0039, Technical Data Center, U.S. Department of Labor, 200 Constitution Avenue NW., Room N-3653, Washington, DC 20210; telephone: (202) 693-2350 (TTY number: (877) 889-5627). Note that security procedures may result in significant delays in receiving comments and other written materials by regular mail. Contact the OSHA Docket Office for information about security procedures concerning delivery of materials by express mail, hand delivery, or messenger service. The hours of operation for the OSHA Docket Office are 8:15 a.m.-4:45 p.m., e.t.

    4. Instructions: All submissions must include the Agency name and the OSHA docket number (OSHA-2007-0039). OSHA places comments and other materials, including any personal information, in the public docket without revision, and these materials will be available online at http://www.regulations.gov. Therefore, the Agency cautions commenters about submitting statements they do not want made available to the public, or submitting comments that contain personal information (either about themselves or others) such as Social Security numbers, birth dates, and medical data.

    5. Docket: To read or download submissions or other material in the docket, go to http://www.regulations.gov or the OSHA Docket Office at the address above. All documents in the docket are listed in the http://www.regulations.gov index; however, some information (e.g., copyrighted material) is not publicly available to read or download through the Web site. All submissions, including copyrighted material, are available for inspection at the OSHA Docket Office. Contact the OSHA Docket Office for assistance in locating docket submissions.

    6. Extension of comment period: Submit requests for an extension of the comment period on or before November 15, 2016 to the Office of Technical Programs and Coordination Activities, Directorate of Technical Support and Emergency Management, Occupational Safety and Health Administration, U.S. Department of Labor, 200 Constitution Avenue NW., Room N-3655, Washington, DC 20210, or by fax to (202) 693-1644.

    FOR FURTHER INFORMATION CONTACT:

    Information regarding this notice is available from the following sources:

    Press inquiries: Contact Mr. Frank Meilinger, Director, OSHA Office of Communications, U.S. Department of Labor, 200 Constitution Avenue NW., Room N-3647, Washington, DC 20210; telephone: (202) 693-1999; email: [email protected].

    General and technical information: Contact Mr. Kevin Robinson, Director, Office of Technical Programs and Coordination Activities, Directorate of Technical Support and Emergency Management, Occupational Safety and Health Administration, U.S. Department of Labor, 200 Constitution Avenue NW., Room N-3655, Washington, DC 20210; phone: (202) 693-2110 or email: [email protected].

    SUPPLEMENTARY INFORMATION:

    I. Notice of the Application for Expansion

    The Occupational Safety and Health Administration is providing notice that Intertek Testing Services NA, Inc. (ITSNA), is applying for expansion of its current recognition as an NRTL. ITSNA requests the addition of twenty-three (23) test standards to its NRTL scope of recognition.

    OSHA recognition of an NRTL signifies that the organization meets the requirements specified in 29 CFR 1910.7. Recognition is an acknowledgment that the organization can perform independent safety testing and certification of the specific products covered within its scope of recognition. Each NRTL's scope of recognition includes (1) the type of products the NRTL may test, with each type specified by its applicable test standard; and (2) the recognized site(s) that has/have the technical capability to perform the product-testing and product-certification activities for test standards within the NRTL's scope. Recognition is not a delegation or grant of government authority; however, recognition enables employers to use products approved by the NRTL to meet OSHA standards that require product testing and certification.

    The Agency processes applications by an NRTL for initial recognition and for an expansion or renewal of this recognition, following requirements in Appendix A to 29 CFR 1910.7. This appendix requires that the Agency publish two notices in the Federal Register in processing an application. In the first notice, OSHA announces the application and provides its preliminary finding. In the second notice, the Agency provides its final decision on the application. These notices set forth the NRTL's scope of recognition or modifications of that scope. OSHA maintains an informational Web page for each NRTL, including ITSNA, which details the NRTL's scope of recognition. These pages are available from the OSHA Web site at http://www.osha.gov/dts/otpca/nrtl/index.html.

    ITSNA currently has fourteen (14) facilities (sites) recognized by OSHA for product testing and certification, with its headquarters located at: Intertek Testing Services NA, Inc., 545 East Algonquin Road, Suite F, Arlington Heights, Illinois 60005. A complete list of ITSNA's scope of recognition is available at https://www.osha.gov/dts/otpca/nrtl/its.html.

    II. General Background on the Application

ITSNA submitted an application, dated April 21, 2015 (OSHA-2007-0039-0022), to expand its recognition to include twenty-three (23) additional test standards. OSHA staff performed a detailed analysis of the application packet and reviewed other pertinent information. OSHA did not perform any on-site reviews in relation to this application.

    Table 1 below lists the appropriate test standards found in ITSNA's application for expansion for testing and certification of products under the NRTL Program.

Table 1—Proposed List of Appropriate Test Standards for Inclusion in ITSNA's NRTL Scope of Recognition

Test standard Test standard title
UL 5C Standard for Surface Raceways and Fittings for Use with Data, Signal and Control Circuits.
UL 50E Enclosures for Electrical Equipment, Environmental Considerations.
UL 565 Standard for Liquid-Level Gauges for Anhydrous Ammonia and LP-Gas.
UL 60745-2-1 Hand-Held Motor-Operated Electric Tools—Safety—Part 2-1: Particular Requirements for Drills and Impact Drills.
UL 60745-2-14 Hand-Held Motor-Operated Electric Tools—Safety—Part 2-14: Particular Requirements for Planers.
UL 60745-2-17 Hand-Held Motor-Operated Electric Tools—Safety—Part 2-17: Particular Requirements for Routers and Trimmers.
UL 60745-2-3 Hand-Held Motor-Operated Electric Tools—Safety—Part 2-3: Particular Requirements for Grinders, Polishers and Disk-Type Sanders.
UL 962A Standard for Furniture Power Distribution Units.
UL 1769 Standard for Cylinder Valves.
UL 2061 Standard for Adapters and Cylinder Connection Devices for Portable LP-Gas Cylinder Assemblies.
UL 2108 Standard for Low-Voltage Lighting Systems.
UL 2238 Standard for Cable Assemblies and Fittings for Industrial Control and Signal Distribution.
UL 2305 Standard for Display Units, Fabrication and Installation.
UL 2438 Standard for Outdoor Seasonal-Use Cord-Connected Wiring Devices.
UL 5085-2 Low Voltage Transformers—Part 2: General Purpose Transformers.
UL 61010-031 Safety-Requirements for Electrical Equipment for Measurement, Control and Laboratory Use—Part 031: Safety requirements for hand-held probe assemblies for electrical measurement and test.
UL 61010-2-030 Safety requirements for electrical equipment for measurement, control and laboratory use—Part 2-030: Particular requirements for testing and measuring circuits.
UL 60730-2-2 Standard for Automatic Electrical Controls for Household and Similar Use; Part 2: Particular Requirements for Thermal Motor Protectors.
UL 60745-2-5 Hand-Held Motor-Operated Electric Tools—Safety—Part 2-5: Particular Requirements for Circular Saws.
UL 60745-2-21 Hand-Held Motor-Operated Electric Tools—Safety—Part 2-21: Particular Requirements for Drain Cleaners.
UL 60950-23 Information Technology Equipment—Safety—Part 23: Large Data Storage Equipment.
UL 62368-1 Audio/video, information and communication technology equipment—Part 1: Safety requirements.
UL 1691 Single Pole Locking-Type Separable Connectors.

III. Preliminary Findings on the Application

ITSNA submitted an acceptable application for expansion of its scope of recognition. OSHA's review of the application file and pertinent documentation indicates that ITSNA can meet the requirements prescribed by 29 CFR 1910.7 for expanding its recognition to include the twenty-three test standards listed above for NRTL testing and certification. This preliminary finding does not constitute an interim or temporary approval of ITSNA's application.

OSHA welcomes public comment as to whether ITSNA meets the requirements of 29 CFR 1910.7 for expansion of its recognition as an NRTL. Comments should consist of pertinent written documents and exhibits. Commenters needing more time to comment must submit a request in writing, stating the reasons for the request. Commenters must submit the written request for an extension by the due date for comments. OSHA will limit any extension to 10 days unless the requester justifies a longer period. OSHA may deny a request for an extension if the request is not adequately justified. To obtain or review copies of the exhibits identified in this notice, as well as comments submitted to the docket, contact the Docket Office, Room N-3653, Occupational Safety and Health Administration, U.S. Department of Labor, at the above address. These materials also are available online at http://www.regulations.gov under Docket No. OSHA-2007-0039.

    OSHA staff will review all comments to the docket submitted in a timely manner and, after addressing the issues raised by these comments, will recommend to the Assistant Secretary for Occupational Safety and Health whether to grant ITSNA's application for expansion of its scope of recognition. The Assistant Secretary will make the final decision on granting the application. In making this decision, the Assistant Secretary may undertake other proceedings prescribed in Appendix A to 29 CFR 1910.7.

    OSHA will publish a public notice of its final decision in the Federal Register.

    Authority and Signature

    David Michaels, Ph.D., MPH, Assistant Secretary of Labor for Occupational Safety and Health, 200 Constitution Avenue NW., Washington, DC 20210, authorized the preparation of this notice. Accordingly, the Agency is issuing this notice pursuant to 29 U.S.C. 657(g)(2), Secretary of Labor's Order No. 1-2012 (77 FR 3912, Jan. 25, 2012), and 29 CFR 1910.7.

    Signed at Washington, DC, on October 25, 2016. David Michaels, Assistant Secretary of Labor for Occupational Safety and Health.
    [FR Doc. 2016-26203 Filed 10-28-16; 8:45 am] BILLING CODE 4510-26-P
    DEPARTMENT OF LABOR Occupational Safety and Health Administration [Docket No. OSHA-2007-0042] TUV Rheinland of North America, Inc.: Applications for Expansion of Recognition and Proposed Modification to the List of Appropriate NRTL Test Standards AGENCY:

    Occupational Safety and Health Administration (OSHA), Labor.

    ACTION:

    Notice.

    SUMMARY:

    In this notice, OSHA announces the applications of TUV Rheinland of North America, Inc., for expansion of its recognition as a Nationally Recognized Testing Laboratory (NRTL) and presents the Agency's preliminary finding to grant the applications. Additionally, OSHA proposes to add a new test standard to the NRTL listing of Appropriate Test Standards.

    DATES:

    Submit comments, information, and documents in response to this notice, or requests for an extension of time to make a submission, on or before November 15, 2016.

    ADDRESSES:

    Submit comments by any of the following methods:

    1. Electronically: Submit comments and attachments electronically at http://www.regulations.gov, which is the Federal eRulemaking Portal. Follow the instructions online for making electronic submissions.

    2. Facsimile: If submissions, including attachments, are not longer than 10 pages, commenters may fax them to the OSHA Docket Office at (202) 693-1648.

    3. Regular or express mail, hand delivery, or messenger (courier) service: Submit comments, requests, and any attachments to the OSHA Docket Office, Docket No. OSHA-2007-0042, Technical Data Center, U.S. Department of Labor, 200 Constitution Avenue NW., Room N-2625, Washington, DC 20210; telephone: (202) 693-2350 (TTY number: (877) 889-5627). Note that security procedures may result in significant delays in receiving comments and other written materials by regular mail. Contact the OSHA Docket Office for information about security procedures concerning delivery of materials by express mail, hand delivery, or messenger service. The hours of operation for the OSHA Docket Office are 8:15 a.m.-4:45 p.m., e.t.

    4. Instructions: All submissions must include the Agency name and the OSHA docket number (OSHA-2007-0042). OSHA places comments and other materials, including any personal information, in the public docket without revision, and these materials will be available online at http://www.regulations.gov. Therefore, the Agency cautions commenters about submitting statements they do not want made available to the public, or submitting comments that contain personal information (either about themselves or others) such as Social Security numbers, birth dates, and medical data.

    5. Docket: To read or download submissions or other material in the docket, go to http://www.regulations.gov or the OSHA Docket Office at the address above. All documents in the docket are listed in the http://www.regulations.gov index; however, some information (e.g., copyrighted material) is not publicly available to read or download through the Web site. All submissions, including copyrighted material, are available for inspection at the OSHA Docket Office. Contact the OSHA Docket Office for assistance in locating docket submissions.

    6. Extension of comment period: Submit requests for an extension of the comment period on or before November 15, 2016 to the Office of Technical Programs and Coordination Activities, Directorate of Technical Support and Emergency Management, Occupational Safety and Health Administration, U.S. Department of Labor, 200 Constitution Avenue NW., Room N-3655, Washington, DC 20210, or by fax to (202) 693-1644.

    FOR FURTHER INFORMATION CONTACT:

    Information regarding this notice is available from the following sources:

    Press inquiries: Contact Mr. Frank Meilinger, Director, OSHA Office of Communications, U.S. Department of Labor, 200 Constitution Avenue NW., Room N-3647, Washington, DC 20210; telephone: (202) 693-1999; email: [email protected].

    General and technical information: Contact Mr. Kevin Robinson, Director, Office of Technical Programs and Coordination Activities, Directorate of Technical Support and Emergency Management, Occupational Safety and Health Administration, U.S. Department of Labor, 200 Constitution Avenue NW., Room N-3655, Washington, DC 20210; phone: (202) 693-2110 or email: [email protected].

SUPPLEMENTARY INFORMATION:

I. Notice of the Application for Expansion

    The Occupational Safety and Health Administration is providing notice that TUV Rheinland of North America, Inc. (TUVRNA), is applying for expansion of its current recognition as an NRTL. TUVRNA requests the addition of three test standards and two additional recognized sites to its NRTL scope of recognition.

    OSHA recognition of an NRTL signifies that the organization meets the requirements specified in 29 CFR 1910.7. Recognition is an acknowledgment that the organization can perform independent safety testing and certification of the specific products covered within its scope of recognition. Each NRTL's scope of recognition includes (1) the type of products the NRTL may test, with each type specified by its applicable test standard; and (2) the recognized site(s) that has/have the technical capability to perform the product-testing and product-certification activities for test standards within the NRTL's scope. Recognition is not a delegation or grant of government authority; however, recognition enables employers to use products approved by the NRTL to meet OSHA standards that require product testing and certification.

    The Agency processes applications by an NRTL for initial recognition and for an expansion or renewal of this recognition, following requirements in Appendix A to 29 CFR 1910.7. This appendix requires that the Agency publish two notices in the Federal Register in processing an application. In the first notice, OSHA announces the application and provides its preliminary finding. In the second notice, the Agency provides its final decision on the application. These notices set forth the NRTL's scope of recognition or modifications of that scope. OSHA maintains an informational Web page for each NRTL, including TUVRNA, which details the NRTL's scope of recognition. These pages are available from the OSHA Web site at http://www.osha.gov/dts/otpca/nrtl/index.html.

    TUVRNA currently has three facilities (sites) recognized by OSHA for product testing and certification, with its headquarters located at: TUV Rheinland of North America, Inc., 12 Commerce Road, Newtown, Connecticut 06470. A complete list of TUVRNA's scope of recognition is available at https://www.osha.gov/dts/otpca/nrtl/tuv.html.

    II. General Background on the Application

    TUVRNA submitted five applications, dated April 1, 2015 (OSHA-2007-0042-0016), May 6, 2015 (OSHA-2007-0042-0017), August 20, 2015 (OSHA-2007-0042-0018), December 7, 2015 (OSHA-2007-0042-0019) and March 2, 2016 (OSHA-2007-0042-0020), to expand its recognition to include three additional test standards and two additional recognized sites. The two proposed recognized testing sites are located at: TUV Rheinland Japan Ltd., Global Technology Assessment Center, 4-25-2 Kita-Yamata, Tsuzuki-ku, Yokohama, Kanagawa, 224-0021 JAPAN and TUV Rheinland LGA Products GmbH, Am Grauen Stein 29, Koln, NRW 51105 GERMANY. OSHA performed on-site reviews of TUV Yokohama on February 16-17, 2016, and TUV Cologne on June 9-10, 2016, in relation to these applications, in which assessors found some nonconformances with the requirements of 29 CFR 1910.7. TUVRNA addressed these issues sufficiently and OSHA staff preliminarily determined that OSHA should grant the additional site applications.

TUVRNA's expansion application also requested the addition of three test standards to its NRTL scope of recognition. OSHA staff performed a detailed analysis of the application packet, reviewed other pertinent information, and conducted the on-site reviews discussed above. Table 1 below lists the appropriate test standards found in TUVRNA's applications for expansion for testing and certification of products under the NRTL Program.

Table 1—Proposed List of Appropriate Test Standards for Inclusion in TUVRNA's NRTL Scope of Recognition

Test standard Test standard title
UL 62368-1 Audio/Visual Information and Communication Technology Equipment—Part 1: Safety Requirements.
UL 1004-1 Standard for Rotating Electrical Machines—General Requirements.
UL 62109-1 * Safety of Power Converters for Use in Photovoltaic Power Systems—Part 1: General Requirements.

* Indicates the standard that OSHA proposes to add to the NRTL List of Appropriate Test Standards.

III. Proposal To Add New Test Standard to the NRTL Program's List of Appropriate Test Standards

Periodically, OSHA will propose to add new test standards to the NRTL list of appropriate test standards following an evaluation of the test standard document. To qualify as an appropriate test standard, the Agency evaluates the document to (1) verify it represents a product category for which OSHA requires certification by an NRTL, (2) verify the document represents an end product and not a component, and (3) verify the document defines safety test specifications (not installation or operational performance specifications). OSHA becomes aware of new test standards through various avenues. For example, OSHA may become aware of new test standards by: (1) Monitoring notifications issued by certain standards development organizations (SDOs); (2) reviewing applications by NRTLs or applicants seeking recognition to include a new test standard in their scopes of recognition; and (3) obtaining notification from manufacturers, manufacturing organizations, government agencies, or other parties that a new test standard may be appropriate to add to its list of appropriate standards. OSHA may determine to include a new test standard in the list, for example, if the test standard covers a particular type of product that another test standard also covers, or if it covers a type of product that no standard previously covered.

    In this notice, OSHA proposes to add a new test standard to the NRTL Program's list of appropriate test standards. Table 2, below, lists the test standard that is new to the NRTL Program. OSHA preliminarily determined that this test standard is an appropriate test standard and proposes to include it in the NRTL Program's List of Appropriate Test Standards. OSHA seeks public comment on this preliminary determination.

Table 2—Test Standards OSHA Is Proposing To Add to the NRTL Program's List of Appropriate Test Standards

Test standard Test standard title
UL 62109-1 Safety of Power Converters for Use in Photovoltaic Power Systems—Part 1: General Requirements.

IV. Preliminary Findings on the Application

TUVRNA submitted acceptable applications for expansion of its scope of recognition. OSHA's review of the application files and pertinent documentation indicates that TUVRNA can meet the requirements prescribed by 29 CFR 1910.7 for expanding its recognition to include the three test standards listed above for NRTL testing and certification. OSHA's detailed on-site assessments indicate that TUVRNA can meet the requirements prescribed by 29 CFR 1910.7 for expanding its recognition to include the two additional sites for NRTL testing and certification. This preliminary finding does not constitute an interim or temporary approval of TUVRNA's applications.

    OSHA welcomes public comment as to whether TUVRNA meets the requirements of 29 CFR 1910.7 for expansion of its recognition as an NRTL. Comments should consist of pertinent written documents and exhibits. Commenters needing more time to comment must submit a request in writing, stating the reasons for the request. Commenters must submit the written request for an extension by the due date for comments. OSHA will limit any extension to 10 days unless the requester justifies a longer period. OSHA may deny a request for an extension if the request is not adequately justified. To obtain or review copies of the exhibits identified in this notice, as well as comments submitted to the docket, contact the Docket Office, Room N-2625, Occupational Safety and Health Administration, U.S. Department of Labor, at the above address. These materials also are available online at http://www.regulations.gov under Docket No. OSHA-2007-0042.

    OSHA staff will review all comments to the docket submitted in a timely manner and, after addressing the issues raised by these comments, will recommend to the Assistant Secretary for Occupational Safety and Health whether to grant TUVRNA's application for expansion of its scope of recognition. The Assistant Secretary will make the final decision on granting the application. In making this decision, the Assistant Secretary may undertake other proceedings prescribed in Appendix A to 29 CFR 1910.7.

    OSHA will publish a public notice of its final decision in the Federal Register.

    Authority and Signature

    David Michaels, Ph.D., MPH, Assistant Secretary of Labor for Occupational Safety and Health, 200 Constitution Avenue NW., Washington, DC 20210, authorized the preparation of this notice. Accordingly, the Agency is issuing this notice pursuant to 29 U.S.C. 657(g)(2), Secretary of Labor's Order No. 1-2012 (77 FR 3912, Jan. 25, 2012), and 29 CFR 1910.7.

    Signed at Washington, DC, on October 25, 2016. David Michaels, Assistant Secretary of Labor for Occupational Safety and Health.
    [FR Doc. 2016-26204 Filed 10-28-16; 8:45 am] BILLING CODE 4510-26-P
    DEPARTMENT OF LABOR Occupational Safety and Health Administration [Docket No. OSHA-2009-0026] Curtis-Strauss LLC: Application for Expansion of Recognition AGENCY:

    Occupational Safety and Health Administration (OSHA), Labor.

    ACTION:

    Notice.

    SUMMARY:

    In this notice, OSHA announces the application of Curtis-Strauss LLC for expansion of its recognition as a Nationally Recognized Testing Laboratory (NRTL) and presents the Agency's preliminary finding to grant the application.

    DATES:

    Submit comments, information, and documents in response to this notice, or requests for an extension of time to make a submission, on or before November 15, 2016.

    ADDRESSES:

    Submit comments by any of the following methods:

    1. Electronically: Submit comments and attachments electronically at http://www.regulations.gov, which is the Federal eRulemaking Portal. Follow the instructions online for making electronic submissions.

    2. Facsimile: If submissions, including attachments, are not longer than 10 pages, commenters may fax them to the OSHA Docket Office at (202) 693-1648.

    3. Regular or express mail, hand delivery, or messenger (courier) service: Submit comments, requests, and any attachments to the OSHA Docket Office, Docket No. OSHA-2009-0026, Technical Data Center, U.S. Department of Labor, 200 Constitution Avenue NW., Room N-2625, Washington, DC 20210; telephone: (202) 693-2350 (TTY number: (877) 889-5627). Note that security procedures may result in significant delays in receiving comments and other written materials by regular mail. Contact the OSHA Docket Office for information about security procedures concerning delivery of materials by express mail, hand delivery, or messenger service. The hours of operation for the OSHA Docket Office are 8:15 a.m.-4:45 p.m., e.t.

    4. Instructions: All submissions must include the Agency name and the OSHA docket number (OSHA-2009-0026). OSHA places comments and other materials, including any personal information, in the public docket without revision, and these materials will be available online at http://www.regulations.gov. Therefore, the Agency cautions commenters about submitting statements they do not want made available to the public, or submitting comments that contain personal information (either about themselves or others) such as Social Security numbers, birth dates, and medical data.

    5. Docket: To read or download submissions or other material in the docket, go to http://www.regulations.gov or the OSHA Docket Office at the address above. All documents in the docket are listed in the http://www.regulations.gov index; however, some information (e.g., copyrighted material) is not publicly available to read or download through the Web site. All submissions, including copyrighted material, are available for inspection and copying at the OSHA Docket Office. Contact the OSHA Docket Office for assistance in locating docket submissions.

    6. Extension of comment period: Submit requests for an extension of the comment period on or before November 15, 2016 to the Office of Technical Programs and Coordination Activities, Directorate of Technical Support and Emergency Management, Occupational Safety and Health Administration, U.S. Department of Labor, 200 Constitution Avenue NW., Room N-3655, Washington, DC 20210, or by fax to (202) 693-1644.

    FOR FURTHER INFORMATION CONTACT:

    Information regarding this notice is available from the following sources:

    Press inquiries: Contact Mr. Frank Meilinger, Director, OSHA Office of Communications, U.S. Department of Labor, 200 Constitution Avenue NW., Room N-3647, Washington, DC 20210; telephone: (202) 693-1999; email: [email protected].

    General and technical information: Contact Mr. Kevin Robinson, Director, Office of Technical Programs and Coordination Activities, Directorate of Technical Support and Emergency Management, Occupational Safety and Health Administration, U.S. Department of Labor, 200 Constitution Avenue NW., Room N-3655, Washington, DC 20210; phone: (202) 693-2110 or email: [email protected].

    SUPPLEMENTARY INFORMATION:

    I. Notice of the Application for Expansion

The Occupational Safety and Health Administration is providing notice that Curtis-Strauss LLC (CSL) is applying for expansion of its current recognition as an NRTL. CSL requests the addition of sixteen (16) test standards to its NRTL scope of recognition.

    OSHA recognition of an NRTL signifies that the organization meets the requirements specified in 29 CFR 1910.7. Recognition is an acknowledgment that the organization can perform independent safety testing and certification of the specific products covered within its scope of recognition. Each NRTL's scope of recognition includes (1) the type of products the NRTL may test, with each type specified by its applicable test standard; and (2) the recognized site(s) that has/have the technical capability to perform the product-testing and product-certification activities for test standards within the NRTL's scope. Recognition is not a delegation or grant of government authority; however, recognition enables employers to use products approved by the NRTL to meet OSHA standards that require product testing and certification.

    The Agency processes applications by an NRTL for initial recognition and for an expansion or renewal of this recognition, following requirements in Appendix A to 29 CFR 1910.7. This appendix requires that the Agency publish two notices in the Federal Register in processing an application. In the first notice, OSHA announces the application and provides its preliminary finding. In the second notice, the Agency provides its final decision on the application. These notices set forth the NRTL's scope of recognition or modifications of that scope. OSHA maintains an informational Web page for each NRTL, including CSL, which details the NRTL's scope of recognition. These pages are available from the OSHA Web site at http://www.osha.gov/dts/otpca/nrtl/index.html.

CSL currently has one facility (site) recognized by OSHA for product testing and certification, with its headquarters located at: Curtis-Strauss LLC, One Distribution Center Circle, Suite #1, Littleton, MA 01460. A complete list of CSL's scope of recognition is available at https://www.osha.gov/dts/otpca/nrtl/csl.html.

    II. General Background on the Application

    CSL submitted four applications, each dated December 29, 2015 (OSHA-2009-0026-0065; OSHA-2009-0026-0066; OSHA-2009-0026-0069; OSHA-2009-0026-0068), to expand its recognition to include 16 additional test standards. OSHA staff performed a detailed analysis of the application packets and reviewed other pertinent information. OSHA did not perform any on-site reviews in relation to these applications.

    Table 1 below lists the appropriate test standards found in CSL's application for expansion for testing and certification of products under the NRTL Program.

Table 1—Proposed List of Appropriate Test Standards for Inclusion in CSL's NRTL Scope of Recognition

Test standard Test standard title
UL 60745-1 Hand-Held Motor-Operated Electric Tools—Safety—Part 1: General Requirements.
UL 60745-2-1 Hand-Held Motor-Operated Electric Tools—Safety—Part 2-1: Particular Requirements for Drills and Impact Drills.
UL 60745-2-11 Hand-Held Motor-Operated Electric Tools—Safety—Part 2-11: Particular Requirements for Reciprocating Saws.
UL 60745-2-2 Hand-Held Motor-Operated Electric Tools—Safety—Part 2-2: Particular Requirements for Screwdrivers and Impact Wrenches.
UL 60745-2-3 Hand-Held Motor-Operated Electric Tools—Safety—Part 2-3: Particular Requirements for Grinders, Polishers and Disk-Type Sanders.
UL 60745-2-4 Hand-Held Motor-Operated Electric Tools—Safety—Part 2-4: Particular Requirements for Sanders and Polishers Other Than Disk Type.
UL 60745-2-5 Hand-Held Motor-Operated Electric Tools—Safety—Part 2-5: Particular Requirements for Circular Saws.
UL 60745-2-6 Hand-Held Motor-Operated Electric Tools—Safety—Part 2-6: Particular Requirements for Hammers.
UL 1741 Standard for Inverters, Converters, Controllers and Interconnection System Equipment for Use With Distributed Energy Resources.
UL 1778 Uninterruptible Power Systems.
UL 1083 Household Electric Skillets and Frying-Type Appliances.
UL 153 Standard for Portable Electric Lights.
UL 1598 Luminaires.
UL 1993 Self-Ballasted Lamps and Lamp Adapters.
UL 8750 Standard for Light Emitting Diode (LED) Equipment for Use in Lighting Products.
UL 935 Fluorescent-Lamp Ballasts.

III. Preliminary Findings on the Application

CSL submitted an acceptable application for expansion of its scope of recognition. OSHA's review of the application file and pertinent documentation indicates that CSL can meet the requirements prescribed by 29 CFR 1910.7 for expanding its recognition to include the 16 test standards listed above for NRTL testing and certification. This preliminary finding does not constitute an interim or temporary approval of CSL's application.

    OSHA welcomes public comment as to whether CSL meets the requirements of 29 CFR 1910.7 for expansion of its recognition as an NRTL. Comments should consist of pertinent written documents and exhibits. Commenters needing more time to comment must submit a request in writing, stating the reasons for the request. Commenters must submit the written request for an extension by the due date for comments. OSHA will limit any extension to 10 days unless the requester justifies a longer period. OSHA may deny a request for an extension if the request is not adequately justified. To obtain or review copies of the exhibits identified in this notice, as well as comments submitted to the docket, contact the Docket Office, Room N-2625, Occupational Safety and Health Administration, U.S. Department of Labor, at the above address. These materials also are available online at http://www.regulations.gov under Docket No. OSHA-2009-0026.

    OSHA staff will review all comments to the docket submitted in a timely manner and, after addressing the issues raised by these comments, will recommend to the Assistant Secretary for Occupational Safety and Health whether to grant CSL's application for expansion of its scope of recognition. The Assistant Secretary will make the final decision on granting the application. In making this decision, the Assistant Secretary may undertake other proceedings prescribed in Appendix A to 29 CFR 1910.7.

    OSHA will publish a public notice of its final decision in the Federal Register.

    Authority and Signature

    David Michaels, Ph.D., MPH, Assistant Secretary of Labor for Occupational Safety and Health, 200 Constitution Avenue NW., Washington, DC 20210, authorized the preparation of this notice. Accordingly, the Agency is issuing this notice pursuant to 29 U.S.C. 657(g)(2), Secretary of Labor's Order No. 1-2012 (77 FR 3912, Jan. 25, 2012), and 29 CFR 1910.7.

    Signed at Washington, DC, on October 25, 2016. David Michaels, Assistant Secretary of Labor for Occupational Safety and Health.
    [FR Doc. 2016-26202 Filed 10-28-16; 8:45 am] BILLING CODE 4510-26-P
    NATIONAL AERONAUTICS AND SPACE ADMINISTRATION [Notice: (16-077)] NASA Advisory Council; Aeronautics Committee; Meeting AGENCY:

    National Aeronautics and Space Administration.

    ACTION:

    Notice of meeting.

    SUMMARY:

    In accordance with the Federal Advisory Committee Act, Public Law 92-463, as amended, the National Aeronautics and Space Administration announces a meeting of the Aeronautics Committee of the NASA Advisory Council (NAC). This meeting will be held for the purpose of soliciting, from the aeronautics community and other persons, research and technical information relevant to program planning. This Committee reports to the NAC.

    DATES:

    Monday, November 14, 2016, 2:00 p.m.-6:00 p.m., and Tuesday, November 15, 2016, 9:45 a.m. to 3:00 p.m., Local Time.

    ADDRESSES:

    NASA Ames Conference Center, Building 3, 500 Severyns Avenue, Moffett Field, CA 94035-0001.

    FOR FURTHER INFORMATION CONTACT:

    Ms. Irma Rodriguez, Executive Secretary for the NAC Aeronautics Committee, NASA Headquarters, Washington, DC 20546, phone number (202) 358-0984, or [email protected].

    SUPPLEMENTARY INFORMATION:

    The meeting will be open to the public up to the capacity of the room. Any person interested in participating in the meeting by WebEx and telephone should contact Ms. Irma Rodriguez at (202) 358-0984 for the web link, toll-free number and passcode. The agenda for the meeting includes the following topics:

• Computational Fluid Dynamics (CFD) Vision 2030 Implementation Plan

• Vision and Strategic Planning for Advanced Aviation Operations

• Autonomy Roadmap and Project Planning

Attendees will be requested to sign a register and to comply with NASA security requirements, including the presentation of a valid picture ID before receiving access to NASA Ames Research Center. All attendees must state that they are attending the NASA Advisory Council Aeronautics Committee meeting in the NASA Ames Conference Center in Building 3. Due to the Real ID Act, Public Law 109-13, any attendees with driver's licenses issued from non-compliant states/territories must present a second form of ID. [Federal employee badge; passport; active military identification card; enhanced driver's license; U.S. Coast Guard Merchant Mariner card; Native American tribal document; school identification accompanied by an item from LIST C (documents that establish employment authorization) from the “List of the Acceptable Documents” on Form I-9]. Non-compliant states/territories are: American Samoa, Minnesota, Missouri, and Washington. Foreign nationals attending this meeting will be required to provide a copy of their passport and visa in addition to providing the following information no less than 8 working days prior to the meeting: Full name; gender; date/place of birth; citizenship; passport information (number, country, telephone); visa information (number, type, expiration date); employer affiliation information (name of institution, address, country, telephone); title/position of attendee to Ms. Irma Rodriguez, NAC Aeronautics Committee Executive Secretary, fax (202) 358-4060. U.S. Citizens and Permanent Residents (green card holders) will need to show a valid, officially-issued picture identification at the gate to enter the NASA Research Park. For questions, please call Ms. Irma Rodriguez at (202) 358-0984. It is imperative that these meetings be held on this date to accommodate the scheduling priorities of the key participants.

    Patricia D. Rausch, Advisory Committee Management Officer, National Aeronautics and Space Administration.
    [FR Doc. 2016-26145 Filed 10-28-16; 8:45 am] BILLING CODE 7510-13-P
    NATIONAL CREDIT UNION ADMINISTRATION Submission for OMB Review; Comment Request AGENCY:

    National Credit Union Administration (NCUA).

    ACTION:

    Notice.

    SUMMARY:

    The National Credit Union Administration (NCUA) will be submitting the following information collection requests to the Office of Management and Budget (OMB) for review and clearance in accordance with the Paperwork Reduction Act of 1995, Public Law 104-13, on or after the date of publication of this notice.

    DATES:

    Comments should be received on or before November 30, 2016 to be assured of consideration.

    ADDRESSES:

    Send comments regarding the burden estimate, or any other aspect of the information collection, including suggestions for reducing the burden, to (1) Office of Information and Regulatory Affairs, Office of Management and Budget, Attention: Desk Officer for NCUA, New Executive Office Building, Room 10235, Washington, DC 20503, or email at [email protected] and (2) NCUA PRA Clearance Officer, 1775 Duke Street, Alexandria, VA 22314, Suite 5067, or email at [email protected].

    FOR FURTHER INFORMATION CONTACT:

    Copies of the submission may be obtained by emailing [email protected] or viewing the entire information collection request at www.reginfo.gov.

    SUPPLEMENTARY INFORMATION:

    OMB Number: 3133-0033.

    Title: Security Program, 12 CFR 748.

    Abstract: In accordance with Title V of the Gramm-Leach-Bliley Act (15 U.S.C. 6801 et seq.), as implemented by 12 CFR part 748, federally-insured credit unions (FICU) are required to develop and implement a written security program to safeguard sensitive member information. This information collection requires that such programs be designed to respond to incidents of unauthorized access or use, in order to prevent substantial harm or serious inconvenience to members.

    Type of Review: Extension of a previously approved collection.

    Affected Public: Private Sector: Not-for-profit institutions.

    Estimated Total Annual Burden Hours: 15,982.

    OMB Number: 3133-0168.

    Title: Maximum Borrowing Authority, 12 CFR 741.2.

    Abstract: Section 741.2 of the NCUA Rules and Regulations (12 CFR 741.2) places a maximum borrowing limitation on federally insured credit unions of 50 percent of paid-in and unimpaired capital and surplus. This limitation is statutory for federal credit unions. The collection of information requirement is for federally insured state-chartered credit unions seeking a waiver from the borrowing limit. These credit unions must submit a detailed safety and soundness analysis, a proposed aggregate amount, a letter from the state regulator approving the request and an explanation of the need for the waiver to the NCUA Regional Director. This collection of information is necessary to protect the National Credit Union Share Insurance Fund (“Fund”). The NCUA must be made aware of and be able to monitor those credit unions seeking a waiver from the maximum borrowing limitation.

    Type of Review: Extension without change of a previously approved collection.

    Affected Public: Private Sector: Not-for-profit institutions.

    Estimated Total Annual Burden Hours: 16.

    By Gerard Poliquin, Secretary of the Board, the National Credit Union Administration, on October 19, 2016.

    Dated: October 26, 2016. Dawn D. Wolfgang, NCUA PRA Clearance Officer.
    [FR Doc. 2016-26169 Filed 10-28-16; 8:45 am] BILLING CODE 7535-01-P
    NUCLEAR REGULATORY COMMISSION [NRC-2015-0160] NuScale Power, LLC, Design-Specific Review Standard and Scope and Safety Review Matrix AGENCY:

    Nuclear Regulatory Commission.

    ACTION:

    NuScale design-specific review standard; issuance.

    SUMMARY:

    The U.S. Nuclear Regulatory Commission (NRC or Commission) has issued the NuScale Power, LLC, (NuScale), Design-Specific Review Standard (DSRS) Sections, and is issuing the final NuScale DSRS Scope and Safety Review Matrix, for NuScale Design Certification (DC), Combined License (COL), and Early Site Permit (ESP) reviews. The NRC staff is also issuing the DSRS public comment resolution matrices, which address the comments received on the draft DSRS. The NuScale DSRS provides guidance to the NRC staff for performing safety reviews for those specific areas where existing NUREG-0800, “Standard Review Plan [SRP] for the Review of Safety Analysis Reports for Nuclear Power Plants: LWR Edition,” sections do not address the unique features of the NuScale design.

    DATES:

    The DSRS sections were effective upon issuance between June 24 and August 4, 2016.

    ADDRESSES:

Please refer to Docket ID NRC-2015-0160 when contacting the NRC about the availability of information regarding this document. You may obtain publicly available information related to this document using any of the following methods:

    Federal Rulemaking Web site: Go to http://www.regulations.gov and search for Docket ID NRC-2015-0160. Address questions about NRC dockets to Carol Gallagher; telephone: 301-415-3463; email: [email protected]. For technical questions, contact the individual listed in the FOR FURTHER INFORMATION CONTACT section of this document.

    NRC's Agencywide Documents Access and Management System (ADAMS): You may obtain publicly-available documents online in the ADAMS Public Documents collection at http://www.nrc.gov/reading-rm/adams.html. To begin the search, select “ADAMS Public Documents” and then select “Begin Web-based ADAMS Search.” For problems with ADAMS, please contact the NRC's Public Document Room (PDR) reference staff at 1-800-397-4209, 301-415-4737, or by email to [email protected]. The ADAMS accession number for each document referenced (if it is available in ADAMS) is provided the first time that it is mentioned in this document. The DSRS is available in ADAMS Package Accession No. ML15355A295 and the final NuScale DSRS Scope and Safety Review Matrix is also available in ADAMS under Accession No. ML16263A000. The resolution of comments on the draft DSRS is documented in the DSRS Public Comment Resolution Matrices (ADAMS Package Accession No. ML16083A615). In addition, for the convenience of the reader, the ADAMS accession numbers are provided in a table in the “Availability of Documents” section of this document.

    NRC's PDR: You may examine and purchase copies of public documents at the NRC's PDR, Room O1-F21, One White Flint North, 11555 Rockville Pike, Rockville, Maryland 20852.

    FOR FURTHER INFORMATION CONTACT:

    Rajender Auluck, telephone: 301-415-1025; email: [email protected] or Gregory Cranston, telephone: 301-415-0546; email: [email protected]; both are staff members of the Office of New Reactors, U.S. Nuclear Regulatory Commission, Washington, DC 20555-0001.

    SUPPLEMENTARY INFORMATION:

    I. Background

    In the Staff Requirements Memorandum (SRM) COMGBJ-10-0004/COMGEA-10-0001, “Use of Risk Insights to Enhance Safety Focus of Small Modular Reactor Reviews,” dated August 31, 2010 (ADAMS Accession No. ML102510405), the Commission provided direction to the NRC staff on the preparation for, and review of, small modular reactor (SMR) applications, with a near-term focus on integral pressurized-water reactor designs. The Commission directed the NRC staff to more fully integrate the use of risk insights into pre-application activities and the review of applications and, consistent with regulatory requirements and Commission policy statements, to align the review focus and resources to risk-significant structures, systems, and components and other aspects of the design that contribute most to safety in order to enhance the effectiveness and efficiency of the review process. The Commission directed the NRC staff to develop a design-specific, risk-informed review plan for each SMR design to address pre-application and application review activities. An important part of this review plan is the DSRS. The DSRS for the NuScale design is the result of the implementation of the Commission's direction.

    II. DSRS for the NuScale Design

    The NuScale DSRS (available in ADAMS Package Accession No. ML15355A295) reflects current NRC staff safety review methods and practices which integrate risk insights and, where appropriate, lessons learned from the NRC's reviews of DC and COL applications completed since the last revision of the NUREG-0800, SRP Introduction, Part 2, “Standard Review Plan for the Review of Safety Analysis Reports for Nuclear Power Plants: Light-Water Small Modular Reactor Edition,” January 2014 (ADAMS Accession No. ML13207A315). The NuScale DSRS Scope and Safety Review Matrix provides a complete list of SRP sections and identifies which SRP sections will be used for DC, COL, or ESP reviews concerning the NuScale design; which SRP sections are not applicable to the NuScale design; which SRP sections needed modification and were reissued as DSRS sections; and which new DSRS sections were added to address a unique design consideration in the NuScale design. The final NuScale DSRS Scope and Safety Review Matrix is available in ADAMS under Accession No. ML16263A000.

    The NRC staff developed the content of the NuScale DSRS as an alternative method for evaluating a NuScale-specific application and has determined that the application may address the DSRS in lieu of addressing the SRP, with specified exceptions. These exceptions include particular review areas in which the DSRS directs reviewers to consult the SRP and others in which the SRP is used for the review as identified in the final NuScale DSRS Scope and Safety Review Matrix. If NuScale chooses to address the DSRS, the application should identify and describe all differences between the design features (DC and COL applications only), analytical techniques, and procedural measures proposed in an application and the guidance of the applicable DSRS section (or SRP section, as specified in the NuScale DSRS Scope and Safety Review Matrix), and discuss how the proposed alternative provides an acceptable method of complying with the regulations that underlie the DSRS acceptance criteria. The staff has accepted the content of the DSRS as an alternative method for evaluating whether an application complies with NRC regulations for the NuScale Small Modular Reactor applications, provided that the application does not deviate significantly from the design and siting assumptions made by the NRC staff while preparing the DSRS. If the design or siting assumptions in a NuScale application deviate significantly from the design and siting assumptions the staff used in preparing the DSRS, the staff will use the more general guidance in the SRP, as specified in sections 52.17(a)(1)(xii), 52.47(a)(9), or 52.79(a)(41) of title 10 of the Code of Federal Regulations, depending on the type of application. Alternatively, the staff may supplement the DSRS section by adding appropriate criteria to address new design or siting assumptions.

    The NRC staff issued a Federal Register notice on June 30, 2015 (80 FR 37312), to request public comments on the draft NuScale DSRS Scope and Safety Review Matrix (ADAMS Accession No. ML15156B063) and the individual NuScale-specific DSRS sections referenced in the table included in the FRN. A correction Federal Register notice was published on July 9, 2015 (80 FR 39454), to identify an additional draft DSRS section for which comments were requested. In response, the NRC received comments from NuScale Power, LLC, by letter dated August 31, 2015 (ADAMS Accession No. ML15258A081); the Nuclear Energy Institute (NEI), by letter dated August 31, 2015 (ADAMS Accession No. ML15257A012); Mark Thomson, by electronic submission dated August 31, 2015 (ADAMS Accession No. ML15292A309); an anonymous submitter, by electronic submission dated August 31, 2015 (ADAMS Accession No. ML15292A310); an anonymous submitter, by electronic submission dated August 31, 2015 (ADAMS Accession No. ML15292A311); Clinton Ferrara, by electronic submission dated August 31, 2015 (ADAMS Accession No. ML15292A333); and Paula Ferrara, by electronic submission dated August 31, 2015 (ADAMS Accession No. ML15292A334). Several of these comments had been discussed previously during public meetings held in support of developing the draft DSRS sections. These comments and their resolutions are documented in the DSRS Public Comment Resolution Matrices and are publicly available (ADAMS Package Accession No. ML16083A615).

    In the June 30, 2015 Federal Register notice, the NRC requested public comments on 115 DSRS sections. The NRC staff determined whether to develop a DSRS section after considering whether significant differences in the functions, characteristics, or attributes of the NuScale design required major revision of the related SRP section guidance, or whether structures, systems, and components identified in the NuScale design are unique and not addressed by the current SRP. Following publication of the draft version of the DSRS sections, the NRC staff revisited these criteria and determined, based on the most recent NuScale design, that it is appropriate to use the related SRP section in lieu of a draft DSRS section in a number of cases. In these cases the draft DSRS sections have not been issued as final, and the related SRP sections will be used for the NuScale review. In deciding to use the related SRP sections, the staff has not necessarily determined that the SRP sections are wholly applicable without modification. For example, as the NRC staff gains greater understanding of the NuScale design or if the design changes during the review, the staff would assess whether different or supplemental review criteria are needed. Stakeholders who believe that different or supplemental review criteria are needed may provide these views to the NRC staff for consideration during the application review period.

    The results of determinations to use the related SRP sections rather than draft DSRS sections, along with other identified issues with the draft NuScale DSRS Scope and Safety Review Matrix, are documented in a separate “transitional” NuScale DSRS Scope and Safety Review Matrix (ADAMS Accession No. ML16076A048). The “transitional” Matrix shows the differences between the draft and final NuScale DSRS Scope and Safety Review Matrices and describes the reasons for these differences. The resulting final list of DSRS titles with corresponding section numbers and ADAMS references are provided in the table below and in ADAMS Package Accession No. ML15355A295.

    In the future, should additional SRP sections be developed, the staff will determine at that time their applicability to the NuScale design. In addition, the NRC disseminates information regarding current safety issues and proposed solutions through various means, such as generic communications and the process for treating generic safety issues. When current issues are resolved, the staff will determine the need, extent, and nature of revision that should be made to the SRP and/or DSRS to reflect the new NRC guidance.

    III. Availability of Documents

    Section | Design-specific review standard title | ADAMS accession No.
    Matrix | NuScale DSRS Scope and Safety Review Matrix (Transitional) | ML16076A048
    Matrix | NuScale DSRS Scope and Safety Review Matrix (Final) | ML16263A000
    3.5.1.3 | Turbine Missiles | ML15355A364
    3.7.1 | Seismic Design Parameters | ML15355A384
    3.7.2 | Seismic System Analysis | ML15355A389
    3.7.3 | Seismic Subsystem Analysis | ML15355A402
    3.8.2 | Steel Containment | ML15355A411
    3.8.4 | Other Seismic Category I Structures | ML15355A444
    3.8.5 | Foundations | ML15355A451
    3.11 | Environmental Qualification of Mechanical and Electrical Equipment | ML15355A455
    4.4 | Thermal and Hydraulic Design | ML15355A468
    5.2.4 | Reactor Coolant Pressure Boundary Inservice Inspection and Testing | ML15355A479
    5.2.5 | Reactor Coolant Pressure Boundary Leakage Detection | ML15355A505
    5.3.1 | Reactor Vessel Materials | ML15355A513
    5.3.2 | Pressure-Temperature Limits, Upper-Shelf Energy, and Pressurized Thermal Shock | ML15355A526
    5.3.3 | Reactor Vessel Integrity | ML15355A530
    5.4.2.1 | Steam Generator Materials | ML15355A532
    5.4.2.2 | Steam Generator Program | ML15355A535
    5.4.7 | Decay Heat Removal (DHR) System | ML15355A536
    BTP 5-4 | Design Requirements of the Decay Heat Removal System | ML15355A313
    6.2.1 | Containment Functional Design | ML15356A259
    6.2.1.1.A | Containments | ML15355A544
    6.2.1.3 | Mass and Energy Release Analysis for Postulated Loss-of-Coolant Accidents (LOCAs) | ML15357A327
    6.2.1.4 | Mass and Energy Release Analysis for Postulated Secondary System Pipe Ruptures | ML15356A241
    6.2.2 | Containment Heat Removal Systems | ML15356A267
    6.2.4 | Containment Isolation System | ML15356A332
    6.2.5 | Combustible Gas Control in Containment | ML15356A356
    6.2.6 | Containment Leakage Testing | ML15356A388
    6.3 | Emergency Core Cooling System | ML15356A393
    6.6 | Inservice Inspection and Testing of Class 2 and 3 Components | ML15356A396
    7.0 | Instrumentation and Controls—Introduction and Overview of Review Process | ML15356A416
    7.1 | Instrumentation and Controls—Fundamental Design Principles | ML15363A293
    7.2 | Instrumentation and Controls—System Characteristics | ML15363A347
    7.0, App A | Instrumentation and Controls—Hazard Analysis | ML15355A316
    7.0, App B | Instrumentation and Controls—System Architecture | ML15355A318
    7.0, App C | Instrumentation and Controls—Simplicity | ML15355A319
    7.0, App D | Instrumentation and Controls—References | ML15355A320
    8.1 | Electric Power—Introduction | ML15356A473
    8.2 | Offsite Power System | ML15356A516
    8.3.1 | AC Power Systems (Onsite) | ML15356A533
    8.3.2 | DC Power Systems (Onsite) | ML15356A552
    8.4 | Station Blackout | ML15356A570
    9.1.2 | New and Spent Fuel Storage | ML15356A584
    9.1.3 | Spent Fuel Pool Cooling and Cleanup System | ML15356A595
    9.3.4 | Chemical and Volume Control System | ML15356A622
    9.3.6 | Containment Evacuation and Flooding Systems | ML15356A637
    9.5.2 | Communications Systems | ML15363A400
    10.2.3 | Turbine Rotor Integrity | ML15356A700
    10.3 | Main Steam Supply System | ML15355A322
    10.4.7 | Condensate and Feedwater System | ML15355A331
    11.1 | Source Terms | ML15355A333
    11.2 | Liquid Waste Management System | ML15355A334
    11.3 | Gaseous Waste Management System | ML15355A335
    11.4 | Solid Waste Management System | ML15355A336
    11.5 | Process and Effluent Radiological Monitoring Instrumentation and Sampling Systems | ML15355A337
    11.6 | Guidance on Instrumentation and Control Design Features for Process and Effluent Radiological Monitoring, and Area Radiation and Airborne Radioactivity Monitoring | ML15355A338
    12.2 | Radiation Sources | ML15350A320
    12.3-12.4 | Radiation Protection Design Features | ML15350A339
    12.5 | Operational Radiation Protection Program | ML15350A341
    14.2 | Initial Plant Test Program—Design Certification and New License Applicants | ML15355A339
    14.3.5 | Instrumentation and Controls—Inspections, Tests, Analyses, and Acceptance Criteria | ML15355A340
    15.0 | Introduction—Transient and Accident Analyses | ML15355A302
    15.0.3 | Design Basis Accidents Radiological Consequence Analyses for NuScale SMR Design | ML15355A341
    15.1.1—15.1.4 | Decrease in FW Temperature, Increase in FW Flow, Increase in Steam Flow and Inadvertent Opening of the Turbine Bypass System or Inadvertent Operation of the Decay Heat Removal System | ML15355A303
    15.1.5 | Steam System Piping Failures Inside and Outside of Containment | ML15355A304
    15.1.6 | Loss of Containment Vacuum | ML15355A305
    15.2.1-15.2.5 | Loss of External Load; Turbine Trip; Loss of Condenser Vacuum; Closure of Main Steam Isolation Valve (BWR); and Steam Pressure Regulator Failure (Closed) | ML15355A306
    15.2.6 | Loss of Non-Emergency AC Power to the Station Auxiliaries | ML15363A348
    15.2.7 | Loss of Normal Feedwater Flow | ML15355A307
    15.2.8 | Feedwater System Pipe Breaks Inside and Outside Containment | ML15355A308
    15.5.1-15.5.2 | Chemical and Volume Control System Malfunction that Increases Reactor Coolant Inventory | ML15363A397
    15.6.5 | LOCAs Resulting From Spectrum of Postulated Piping Breaks Within the Reactor Coolant Pressure Boundary | ML15355A309
    15.6.6 | Inadvertent Opening of the Emergency Core Cooling System | ML15355A310
    15.9A | Thermal-hydraulic Stability | ML15355A311
    16.0 | Technical Specifications | ML15355A312
    Dated at Rockville, Maryland, this 21st day of October 2016.

    For the Nuclear Regulatory Commission.

    Frank Akstulewicz, Director, Division of New Reactor Licensing, Office of New Reactors.
    [FR Doc. 2016-26210 Filed 10-28-16; 8:45 am] BILLING CODE 7590-01-P
    NUCLEAR REGULATORY COMMISSION [Docket Nos.: 52-029 and 52-030; NRC-2008-0558] Duke Energy Florida, LLC; Levy Nuclear Plant Units 1 and 2 AGENCY:

    Nuclear Regulatory Commission

    ACTION:

    Notice of intent to enter into a modified indemnity agreement.

    SUMMARY:

    The U.S. Nuclear Regulatory Commission (NRC) is issuing a notice of intent to enter into a modified indemnity agreement with Duke Energy Florida, LLC, (DEF) to operate Levy Nuclear Plant Units 1 and 2 (LNP 1 and 2). The NRC is required to publish notice of its intent to enter into an indemnity agreement which contains provisions different from the general form found in the NRC's regulations. A modification to the general form is necessary to accommodate the unique timing provisions of a combined license (COL).

    DATES:

    On October 20, 2016, the Commission authorized the Director of the Office of New Reactors to issue COLs to DEF to construct and operate LNP 1 and 2. The modified indemnity agreement would be effective upon issuance of the COLs.

    ADDRESSES:

    Please refer to Docket ID NRC-2008-0558 when contacting the NRC about the availability of information regarding this document. You may obtain publicly-available information related to this document using any of the following methods:

    Federal Rulemaking Web site: Go to http://www.regulations.gov and search for Docket ID NRC-2008-0558. Address questions about NRC dockets to Carol Gallagher; telephone: 301-415-3463; email: [email protected]. For technical questions, contact the individual listed in the FOR FURTHER INFORMATION CONTACT section of this document.

    NRC's Agencywide Documents Access and Management System (ADAMS): You may obtain publicly-available documents online in the ADAMS Public Documents collection at http://www.nrc.gov/reading-rm/adams.html. To begin the search, select “ADAMS Public Documents” and then select “Begin Web-based ADAMS Search.” For problems with ADAMS, please contact the NRC's Public Document Room (PDR) reference staff at 1-800-397-4209, 301-415-4737, or by email to [email protected].

    NRC's PDR: You may examine and purchase copies of public documents at the NRC's PDR, Room O1-F21, One White Flint North, 11555 Rockville Pike, Rockville, Maryland 20852.

    FOR FURTHER INFORMATION CONTACT:

    Donald Habib, Office of New Reactors, U.S. Nuclear Regulatory Commission, Washington DC 20555-0001; telephone: 301-415-1035, email: [email protected].

    SUPPLEMENTARY INFORMATION:

    I. Background

    On October 20, 2016, the Commission authorized issuance of COLs to DEF for LNP 1 and 2. These COLs would include a license pursuant to part 70 of title 10 of the Code of Federal Regulations (10 CFR), “Domestic Licensing of Special Nuclear Material.” Pursuant to 10 CFR 140.20(a)(1)(iii), the NRC will execute and issue agreements of indemnity effective on the date of a license under 10 CFR part 70 authorizing the licensee to possess and store special nuclear material at the site of the nuclear reactor for use as fuel in operation of the nuclear reactor after issuance of an operating license for the reactor. The general form of indemnity agreement to be entered into by the NRC with DEF is contained in 10 CFR 140.92, “Appendix B—Form of Indemnity Agreement with licensees furnishing insurance policies as proof of financial protection.”

    II. Request/Action

    Pursuant to 10 CFR 140.9, the NRC is publishing notice of its intent to enter into an indemnity agreement that contains provisions different from the general form found in 10 CFR 140.92. Modifications to the general indemnity agreement are addressed in the following discussion.

    III. Discussion

    The provisions of the general form of indemnity agreement provided in 10 CFR 140.92 address insurance and indemnity for a licensee that is authorized to operate as soon as an operating license (OL) is issued pursuant to 10 CFR part 50, “Domestic licensing of production and utilization facilities.” The DEF, however, has requested a COL pursuant to 10 CFR part 52, “Licenses, Certifications, and Approvals for Nuclear Power Plants” to construct and operate LNP 1 and 2. Unlike an OL, which authorizes operation of the facility as soon as the license is issued, a COL authorizes the construction of the facility but does not authorize operation of the facility until the Commission makes a finding pursuant to 10 CFR 52.103(g) that the acceptance criteria in the COL are met (also called a “§ 52.103(g) finding”). The COL holders are not required to maintain financial protection in the amount specified in 10 CFR 140.11(a)(4) before the § 52.103(g) finding is made, but must maintain financial protection in the amount specified by 10 CFR 140.13 upon receipt of a COL because the COL includes a license issued pursuant to 10 CFR part 70. Therefore, the provisions in the general form of indemnity agreement must be modified to address the timing differences applicable to COLs.

    Modifications to the general form of indemnity agreement will reflect the timing distinctions applicable to COLs. In addition, other modifications and their intent are described below:

    (1) References to Mutual Atomic Energy Liability Underwriters have been removed because this entity no longer exists.

    (2) Monetary amounts have been updated to reflect changes that have been made to section 170, “Indemnification and Limitation of Liability,” of the Atomic Energy Act of 1954, as amended (42 U.S.C. 2210).

    IV. Conclusions

    Accordingly, for the reasons discussed in this notice and in accordance with 10 CFR 140.9, the NRC hereby provides notice of its intent to enter into an agreement of indemnity with DEF for LNP 1 and 2 with the described modifications to the general form of indemnity.

    Dated at Rockville, Maryland, this 25th day of October 2016.

    For the Nuclear Regulatory Commission.

    Jennifer Dixon-Herrity, Chief, Licensing Branch, Division of New Reactor Licensing, Office of New Reactors.
    [FR Doc. 2016-26207 Filed 10-28-16; 8:45 am] BILLING CODE 7590-01-P
    POSTAL REGULATORY COMMISSION [Docket Nos. CP2015-39; MC2017-7 and CP2017-22; MC2017-8 and CP2017-23; MC2017-9 and CP2017-24; MC2017-10 and CP2017-25; MC2017-11 and CP2017-26; and MC2017-12 and CP2017-27] New Postal Products AGENCY:

    Postal Regulatory Commission.

    ACTION:

    Notice.

    SUMMARY:

    The Commission is noticing recent Postal Service filings for the Commission's consideration concerning negotiated service agreements. This notice informs the public of the filing, invites public comment, and takes other administrative steps.

    DATES:

    Comments are due: November 2, 2016 (COMMENT DUE DATE APPLIES TO ALL DOCKET NOS. LISTED ABOVE).

    ADDRESSES:

    Submit comments electronically via the Commission's Filing Online system at http://www.prc.gov. Those who cannot submit comments electronically should contact the person identified in the FOR FURTHER INFORMATION CONTACT section by telephone for advice on filing alternatives.

    FOR FURTHER INFORMATION CONTACT:

    David A. Trissell, General Counsel, at 202-789-6820.

    SUPPLEMENTARY INFORMATION:

    Table of Contents

    I. Introduction
    II. Docketed Proceeding(s)

    I. Introduction

    The Commission gives notice that the Postal Service filed request(s) for the Commission to consider matters related to negotiated service agreement(s). The request(s) may propose the addition or removal of a negotiated service agreement from the market dominant or the competitive product list, or the modification of an existing product currently appearing on the market dominant or the competitive product list.

    Section II identifies the docket number(s) associated with each Postal Service request, the title of each Postal Service request, the request's acceptance date, and the authority cited by the Postal Service for each request. For each request, the Commission appoints an officer of the Commission to represent the interests of the general public in the proceeding, pursuant to 39 U.S.C. 505 (Public Representative). Section II also establishes comment deadline(s) pertaining to each request.

    The public portions of the Postal Service's request(s) can be accessed via the Commission's Web site (http://www.prc.gov). Non-public portions of the Postal Service's request(s), if any, can be accessed through compliance with the requirements of 39 CFR 3007.40.

    The Commission invites comments on whether the Postal Service's request(s) in the captioned docket(s) are consistent with the policies of title 39. For request(s) that the Postal Service states concern market dominant product(s), applicable statutory and regulatory requirements include 39 U.S.C. 3622, 39 U.S.C. 3642, 39 CFR part 3010, and 39 CFR part 3020, subpart B. For request(s) that the Postal Service states concern competitive product(s), applicable statutory and regulatory requirements include 39 U.S.C. 3632, 39 U.S.C. 3633, 39 U.S.C. 3642, 39 CFR part 3015, and 39 CFR part 3020, subpart B. Comment deadline(s) for each request appear in section II.

    II. Docketed Proceeding(s)

    1. Docket No(s).: CP2015-39; Filing Title: Notice of United States Postal Service of Change in Prices Pursuant to Amendment to Priority Mail Contract 111; Filing Acceptance Date: October 25, 2016; Filing Authority: 39 CFR 3015.5.; Public Representative: Curtis E. Kidd; Comments Due: November 2, 2016.

    2. Docket No(s).: MC2017-7 and CP2017-22; Filing Title: Request of the United States Postal Service to Add Priority Mail Contract 249 to Competitive Product List and Notice of Filing (Under Seal) of Unredacted Governors' Decision, Contract, and Supporting Data; Filing Acceptance Date: October 25, 2016; Filing Authority: 39 U.S.C. 3642 and 39 CFR 3020.30 et seq.; Public Representative: Helen Fonda; Comments Due: November 2, 2016.

    3. Docket No(s).: MC2017-8 and CP2017-23; Filing Title: Request of the United States Postal Service to Add Priority Mail Contract 250 to Competitive Product List and Notice of Filing (Under Seal) of Unredacted Governors' Decision, Contract, and Supporting Data; Filing Acceptance Date: October 25, 2016; Filing Authority: 39 U.S.C. 3642 and 39 CFR 3020.30 et seq.; Public Representative: Helen Fonda; Comments Due: November 2, 2016.

    4. Docket No(s).: MC2017-9 and CP2017-24; Filing Title: Request of the United States Postal Service to Add Priority Mail Contract 251 to Competitive Product List and Notice of Filing (Under Seal) of Unredacted Governors' Decision, Contract, and Supporting Data; Filing Acceptance Date: October 25, 2016; Filing Authority: 39 U.S.C. 3642 and 39 CFR 3020.30 et seq.; Public Representative: Kenneth R. Moeller; Comments Due: November 2, 2016.

    5. Docket No(s).: MC2017-10 and CP2017-25; Filing Title: Request of the United States Postal Service to Add Priority Mail Contract 252 to Competitive Product List and Notice of Filing (Under Seal) of Unredacted Governors' Decision, Contract, and Supporting Data; Filing Acceptance Date: October 25, 2016; Filing Authority: 39 U.S.C. 3642 and 39 CFR 3020.30 et seq.; Public Representative: Kenneth R. Moeller; Comments Due: November 2, 2016.

    6. Docket No(s).: MC2017-11 and CP2017-26; Filing Title: Request of the United States Postal Service to Add Priority Mail Contract 253 to Competitive Product List and Notice of Filing (Under Seal) of Unredacted Governors' Decision, Contract, and Supporting Data; Filing Acceptance Date: October 25, 2016; Filing Authority: 39 U.S.C. 3642 and 39 CFR 3020.30 et seq.; Public Representative: Curtis E. Kidd; Comments Due: November 2, 2016.

    7. Docket No(s).: MC2017-12 and CP2017-27; Filing Title: Request of the United States Postal Service to Add Priority Mail Express Contract 43 to Competitive Product List and Notice of Filing (Under Seal) of Unredacted Governors' Decision, Contract, and Supporting Data; Filing Acceptance Date: October 25, 2016; Filing Authority: 39 U.S.C. 3642 and 39 CFR 3020.30 et seq.; Public Representative: Curtis E. Kidd; Comments Due: November 2, 2016.

    This notice will be published in the Federal Register.

    Stacy L. Ruble, Secretary.
    [FR Doc. 2016-26215 Filed 10-28-16; 8:45 am] BILLING CODE 7710-FW-P
    POSTAL REGULATORY COMMISSION [Docket No. CP2017-21] New Postal Product AGENCY:

    Postal Regulatory Commission.

    ACTION:

    Notice.

    SUMMARY:

    The Commission is noticing recent Postal Service filing for the Commission's consideration concerning a negotiated service agreement. This notice informs the public of the filing, invites public comment, and takes other administrative steps.

    DATES:

    Comments are due: November 1, 2016.

    ADDRESSES:

    Submit comments electronically via the Commission's Filing Online system at http://www.prc.gov. Those who cannot submit comments electronically should contact the person identified in the FOR FURTHER INFORMATION CONTACT section by telephone for advice on filing alternatives.

    FOR FURTHER INFORMATION CONTACT:

    David A. Trissell, General Counsel, at 202-789-6820.

    SUPPLEMENTARY INFORMATION:

    Table of Contents

    I. Introduction
    II. Docketed Proceeding(s)

    I. Introduction

    The Commission gives notice that the Postal Service filed request(s) for the Commission to consider matters related to negotiated service agreement(s). The request(s) may propose the addition or removal of a negotiated service agreement from the market dominant or the competitive product list, or the modification of an existing product currently appearing on the market dominant or the competitive product list.

    Section II identifies the docket number(s) associated with each Postal Service request, the title of each Postal Service request, the request's acceptance date, and the authority cited by the Postal Service for each request. For each request, the Commission appoints an officer of the Commission to represent the interests of the general public in the proceeding, pursuant to 39 U.S.C. 505 (Public Representative). Section II also establishes comment deadline(s) pertaining to each request.

    The public portions of the Postal Service's request(s) can be accessed via the Commission's Web site (http://www.prc.gov). Non-public portions of the Postal Service's request(s), if any, can be accessed through compliance with the requirements of 39 CFR 3007.40.

    The Commission invites comments on whether the Postal Service's request(s) in the captioned docket(s) are consistent with the policies of title 39. For request(s) that the Postal Service states concern market dominant product(s), applicable statutory and regulatory requirements include 39 U.S.C. 3622, 39 U.S.C. 3642, 39 CFR part 3010, and 39 CFR part 3020, subpart B. For request(s) that the Postal Service states concern competitive product(s), applicable statutory and regulatory requirements include 39 U.S.C. 3632, 39 U.S.C. 3633, 39 U.S.C. 3642, 39 CFR part 3015, and 39 CFR part 3020, subpart B. Comment deadline(s) for each request appear in section II.

    II. Docketed Proceeding(s)

    1. Docket No(s).: CP2017-21; Filing Title: Notice of United States Postal Service of Filing a Functionally Equivalent Global Expedited Package Services 3 Negotiated Service Agreement and Application for Non-Public Treatment of Materials Filed Under Seal; Filing Acceptance Date: October 24, 2016; Filing Authority: 39 CFR 3015.5; Public Representative: Jennaca D. Upperman; Comments Due: November 1, 2016.

    This notice will be published in the Federal Register.

    Stacy L. Ruble, Secretary.
    [FR Doc. 2016-26140 Filed 10-28-16; 8:45 am] BILLING CODE 7710-FW-P
    RAILROAD RETIREMENT BOARD 2017 Railroad Experience Rating Proclamations, Monthly Compensation Base and Other Determinations AGENCY:

    Railroad Retirement Board.

    ACTION:

    Notice.

    SUMMARY:

    Pursuant to section 8(c)(2) and section 12(r)(3) of the Railroad Unemployment Insurance Act (Act) (45 U.S.C. 358(c)(2) and 45 U.S.C. 362(r)(3), respectively), the Board gives notice of the following:

    1. The balance to the credit of the Railroad Unemployment Insurance (RUI) Account, as of June 30, 2016, is $93,849,116.28;

    2. The September 30, 2016, balance of any new loans to the RUI Account, including accrued interest, is zero;

    3. The system compensation base is $4,224,601,102.31 as of June 30, 2016;

    4. The cumulative system unallocated charge balance is ($408,501,327.51) as of June 30, 2016;

    5. The pooled credit ratio for calendar year 2017 is zero;

    6. The pooled charged ratio for calendar year 2017 is zero;

    7. The surcharge rate for calendar year 2017 is 1.5 percent;

    8. The monthly compensation base under section 1(i) of the Act is $1,545 for months in calendar year 2017;

    9. The amount described in sections 1(k) and 3 of the Act as “2.5 times the monthly compensation base” is $3,862.50 for base year (calendar year) 2017;

    10. The amount described in section 4(a-2)(i)(A) of the Act as “2.5 times the monthly compensation base” is $3,862.50 with respect to disqualifications ending in calendar year 2017;

    11. The amount described in section 2(c) of the Act as “an amount that bears the same ratio to $775 as the monthly compensation base for that year as computed under section 1(i) of this Act bears to $600” is $1,996 for months in calendar year 2017;

    12. The maximum daily benefit rate under section 2(a)(3) of the Act is $72 with respect to days of unemployment and days of sickness in registration periods beginning after June 30, 2017.

    DATES:

    The balance in notice (1) and the determinations made in notices (3) through (7) are based on data as of June 30, 2016. The balance in notice (2) is based on data as of September 30, 2016. The determinations made in notices (5) through (7) apply to the calculation, under section 8(a)(1)(C) of the Act, of employer contribution rates for 2017. The determinations made in notices (8) through (11) are effective January 1, 2017. The determination made in notice (12) is effective for registration periods beginning after June 30, 2017.

    ADDRESSES:

    Secretary to the Board, Railroad Retirement Board, 844 Rush Street, Chicago, Illinois 60611-2092.

    FOR FURTHER INFORMATION CONTACT:

    Michael J. Rizzo, Bureau of the Actuary, Railroad Retirement Board, 844 Rush Street, Chicago, Illinois 60611-2092, telephone (312) 751-4771.

    SUPPLEMENTARY INFORMATION:

    The RRB is required by section 8(c)(1) of the Railroad Unemployment Insurance Act (Act) (45 U.S.C. 358(c)(1)) as amended by Public Law 100-647, to proclaim by October 15 of each year certain system-wide factors used in calculating experience-based employer contribution rates for the following year. The RRB is further required by section 8(c)(2) of the Act (45 U.S.C. 358(c)(2)) to publish the amounts so determined and proclaimed. The RRB is required by section 12(r)(3) of the Act (45 U.S.C. 362(r)(3)) to publish by December 11, 2016, the computation of the calendar year 2017 monthly compensation base (section 1(i) of the Act) and amounts described in sections 1(k), 2(c), 3 and 4(a-2)(i)(A) of the Act which are related to changes in the monthly compensation base. Also, the RRB is required to publish, by June 11, 2017, the maximum daily benefit rate under section 2(a)(3) of the Act for days of unemployment and days of sickness in registration periods beginning after June 30, 2017.

    Surcharge Rate

    A surcharge is added in the calculation of each employer's contribution rate, subject to the applicable maximum rate, for a calendar year whenever the balance to the credit of the RUI Account on the preceding June 30 is less than the greater of $100 million or the amount that bears the same ratio to $100 million as the system compensation base for that June 30 bears to the system compensation base as of June 30, 1991. If the RUI Account balance is less than $100 million (as indexed), but at least $50 million (as indexed), the surcharge will be 1.5 percent. If the RUI Account balance is less than $50 million (as indexed), but greater than zero, the surcharge will be 2.5 percent. The maximum surcharge of 3.5 percent applies if the RUI Account balance is less than zero.

    The ratio of the June 30, 2016 system compensation base of $4,224,601,102.31 to the June 30, 1991 system compensation base of $2,763,287,237.04 is 1.52883169. Multiplying 1.52883169 by $100 million yields $152,883,169.00. Multiplying $50 million by 1.52883169 produces $76,441,584.50. The Account balance on June 30, 2016, was $93,849,116.28. Accordingly, the surcharge rate for calendar year 2017 is 1.5 percent.
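    For illustration only, the surcharge determination described above can be summarized in the following short Python sketch. The figures are those stated in this notice; the variable names, and the treatment of a balance of exactly zero, are illustrative assumptions and are not part of the Act.

    SYSTEM_BASE_2016 = 4_224_601_102.31   # system compensation base as of June 30, 2016
    SYSTEM_BASE_1991 = 2_763_287_237.04   # system compensation base as of June 30, 1991
    RUI_BALANCE_2016 = 93_849_116.28      # RUI Account balance as of June 30, 2016

    index_ratio = SYSTEM_BASE_2016 / SYSTEM_BASE_1991   # approximately 1.52883169
    threshold_100m = 100_000_000 * index_ratio           # approximately $152,883,169.00
    threshold_50m = 50_000_000 * index_ratio              # approximately $76,441,584.50

    if RUI_BALANCE_2016 >= threshold_100m:
        surcharge_rate = 0.0   # no surcharge applies
    elif RUI_BALANCE_2016 >= threshold_50m:
        surcharge_rate = 1.5
    elif RUI_BALANCE_2016 > 0:
        surcharge_rate = 2.5
    else:
        surcharge_rate = 3.5   # balance less than zero

    print(surcharge_rate)      # 1.5 percent for calendar year 2017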

    Monthly Compensation Base

    For years after 1988, section 1(i) of the Act contains a formula for determining the monthly compensation base. Under the prescribed formula, the monthly compensation base increases by approximately two-thirds of the cumulative growth in average national wages since 1984. The monthly compensation base for months in calendar year 2017 shall be equal to the greater of (a) $600 or (b) $600 [1 + {(A−37,800)/56,700}], where A equals the amount of the applicable base with respect to tier 1 taxes for 2017 under section 3231(e)(2) of the Internal Revenue Code of 1986. Section 1(i) further provides that if the amount so determined is not a multiple of $5, it shall be rounded to the nearest multiple of $5.

    Using the calendar year 2017 tier 1 tax base of $127,200 for A above produces the amount of $1,546.03, which must then be rounded to $1,545. Accordingly, the monthly compensation base is determined to be $1,545 for months in calendar year 2017.
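    The section 1(i) computation above can be illustrated with the following Python sketch; the variable names are illustrative, and the rounding to the nearest multiple of $5 follows the description in the preceding paragraphs.

    A = 127_200                                   # calendar year 2017 tier 1 tax base
    raw_base = 600 * (1 + (A - 37_800) / 56_700)  # approximately $1,546.03
    monthly_compensation_base = 5 * round(raw_base / 5)  # round to the nearest multiple of $5
    print(monthly_compensation_base)              # 1545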

    Amounts Related to Changes in Monthly Compensation Base

    For years after 1988, sections 1(k), 3, 4(a-2)(i)(A) and 2(c) of the Act contain formulas for determining amounts related to the monthly compensation base.

    Under section 1(k), remuneration earned from employment covered under the Act cannot be considered subsidiary remuneration if the employee's base year compensation is less than 2.5 times the monthly compensation base for months in such base year. Under section 3, an employee shall be a “qualified employee” if his/her base year compensation is not less than 2.5 times the monthly compensation base for months in such base year. Under section 4(a-2)(i)(A), an employee who leaves work voluntarily without good cause is disqualified from receiving unemployment benefits until he has been paid compensation of not less than 2.5 times the monthly compensation base for months in the calendar year in which the disqualification ends.

    Multiplying 2.5 by the calendar year 2017 monthly compensation base of $1,545 produces $3,862.50. Accordingly, the amount determined under sections 1(k), 3 and 4(a-2)(i)(A) is $3,862.50 for calendar year 2017.

    Under section 2(c), the maximum amount of normal benefits paid for days of unemployment within a benefit year and the maximum amount of normal benefits paid for days of sickness within a benefit year shall not exceed an employee's compensation in the base year. In determining an employee's base year compensation, any money remuneration in a month not in excess of an amount that bears the same ratio to $775 as the monthly compensation base for that year bears to $600 shall be taken into account. The calendar year 2017 monthly compensation base is $1,545. The ratio of $1,545 to $600 is 2.57500000. Multiplying 2.57500000 by $775 produces $1,996. Accordingly, the amount determined under section 2(c) is $1,996 for months in calendar year 2017.
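    An illustrative Python sketch of the two derived amounts discussed above follows. Rounding the section 2(c) figure to a whole dollar is an assumption made here for illustration; the notice states only the resulting amount of $1,996.

    monthly_compensation_base_2017 = 1_545

    # Sections 1(k), 3, and 4(a-2)(i)(A): 2.5 times the monthly compensation base
    qualifying_amount = 2.5 * monthly_compensation_base_2017
    print(qualifying_amount)                      # 3862.5, i.e., $3,862.50

    # Section 2(c): the amount bearing the same ratio to $775 as the monthly
    # compensation base bears to $600
    ratio = monthly_compensation_base_2017 / 600  # 2.575
    section_2c_amount = round(ratio * 775)        # 2.575 x $775 = $1,995.625
    print(section_2c_amount)                      # 1996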

    Maximum Daily Benefit Rate

    Section 2(a)(3) contains a formula for determining the maximum daily benefit rate for registration periods beginning after June 30, 1989, and after each June 30 thereafter. Legislation enacted on October 9, 1996, revised the formula for indexing maximum daily benefit rates. Under the prescribed formula, the maximum daily benefit rate increases by approximately two-thirds of the cumulative growth in average national wages since 1984. The maximum daily benefit rate for registration periods beginning after June 30, 2017, shall be equal to 5 percent of the monthly compensation base for the base year immediately preceding the beginning of the benefit year. Section 2(a)(3) further provides that if the amount so computed is not a multiple of $1, it shall be rounded down to the nearest multiple of $1.

    The calendar year 2016 monthly compensation base is $1,455. Multiplying $1,455 by 0.05 yields $72.75. Accordingly, the maximum daily benefit rate for days of unemployment and days of sickness beginning in registration periods after June 30, 2017, is determined to be $72.
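    The section 2(a)(3) computation above can be illustrated with the following Python sketch; the rounding down to the nearest multiple of $1 follows the preceding paragraph.

    import math

    monthly_compensation_base_2016 = 1_455
    max_daily_benefit_rate = math.floor(0.05 * monthly_compensation_base_2016)
    print(max_daily_benefit_rate)   # 72, from $72.75 rounded down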

    Dated: October 26, 2016.

    By Authority of the Board.

    Martha P. Rico, Secretary to the Board.
    [FR Doc. 2016-26167 Filed 10-28-16; 8:45 am] BILLING CODE 7905-01-P
    SECURITIES AND EXCHANGE COMMISSION Sunshine Act Meeting FEDERAL REGISTER CITATION OF PREVIOUS ANNOUNCEMENT:

    [81 FR 73459, October 25, 2016].

    STATUS:

    Open Meeting.

    PLACE:

    100 F Street NE., Washington, DC.

    DATE AND TIME OF PREVIOUSLY ANNOUNCED MEETING:

    Wednesday, October 26, 2016 10:00 a.m.

    CHANGE IN THE MEETING:

    Time Change.

    The Open Meeting scheduled for Wednesday, October 26, 2016 at 10:00 a.m., has been changed to Wednesday, October 26, 2016 at 11:00 a.m.

    For further information and to ascertain what, if any, matters have been added, deleted or postponed, please contact:

    The Office of the Secretary at (202) 551-5400.

    Dated: October 26, 2016. Brent J. Fields, Secretary.
    [FR Doc. 2016-26304 Filed 10-27-16; 11:15 am] BILLING CODE 8011-01-P
    SECURITIES AND EXCHANGE COMMISSION [Release No. 34-79148; File No. SR-BatsBYX-2016-27] Self-Regulatory Organizations; Bats BYX Exchange, Inc.; Notice of Filing and Immediate Effectiveness of a Proposed Rule Change To Amend BYX Rule 11.13, Order Execution and Routing October 25, 2016.

    Pursuant to Section 19(b)(1) of the Securities Exchange Act of 1934 (the “Act”),1 and Rule 19b-4 thereunder,2 notice is hereby given that on October 12, 2016, Bats BYX Exchange, Inc. (the “Exchange” or “BYX”) filed with the Securities and Exchange Commission (“Commission”) the proposed rule change as described in Items I, II, and III below, which Items have been prepared by the Exchange. The Exchange has designated this proposal as a “non-controversial” proposed rule change pursuant to Section 19(b)(3)(A) of the Act 3 and Rule 19b-4(f)(6)(iii) thereunder,4 which renders it effective upon filing with the Commission. The Commission is publishing this notice to solicit comments on the proposed rule change from interested persons.

    1 15 U.S.C. 78s(b)(1).

    2 17 CFR 240.19b-4.

    3 15 U.S.C. 78s(b)(3)(A).

    4 17 CFR 240.19b-4(f)(6)(iii).

    I. Self-Regulatory Organization's Statement of the Terms of the Substance of the Proposed Rule Change

    The Exchange filed a proposal to amend Exchange Rule 11.13(b)(1) to describe when an order marked as “short” may be eligible for routing when a short sale price test restriction is in effect.

    The text of the proposed rule change is available at the Exchange's Web site at www.batstrading.com, at the principal office of the Exchange, and at the Commission's Public Reference Room.

    II. Self-Regulatory Organization's Statement of the Purpose of, and Statutory Basis for, the Proposed Rule Change

    In its filing with the Commission, the Exchange included statements concerning the purpose of and basis for the proposed rule change and discussed any comments it received on the proposed rule change. The text of these statements may be examined at the places specified in Item IV below. The Exchange has prepared summaries, set forth in Sections A, B, and C below, of the most significant parts of such statements.

    A. Self-Regulatory Organization's Statement of the Purpose of, and Statutory Basis for, the Proposed Rule Change 1. Purpose

    The Exchange proposes to amend Exchange Rule 11.13(b)(1) to describe when an order to sell marked 5 as “short” 6 may be eligible for routing when a short sale price test restriction is in effect. Under Rule 201 of Regulation SHO,7 short sale orders in a covered security 8 generally cannot be executed or displayed by a Trading Center,9 such as the Exchange, at a price that is at or below the current national best bid (“NBB”) 10 when a short sale circuit breaker is in effect for the covered security (the “short sale price test restriction”).11

    5 17 CFR 242.200(g).

    6 The term “short sale” is defined as “any sale of a security which the seller does not own or any sale which is consummated by the delivery of a security borrowed by, or for the account of, the seller.” 17 CFR 242.200(a).

    7See 17 CFR 242.201; Securities Exchange Act Release No. 61595 (February 26, 2010), 75 FR 11232 (March 10, 2010).

    8 Rule 201(a)(1) of Regulation SHO defines the term “covered security” to mean any “NMS stock” as defined under Rule 600(b)(47) of Regulation NMS. Rule 600(b)(47) of Regulation NMS defines an “NMS stock” as “any NMS security other than an option.” Rule 600(b)(46) of Regulation NMS defines an “NMS security” as “any security or class of securities for which transaction reports are collected, processed, and made available pursuant to an effective transaction reporting plan, or an effective national market system plan for reporting transactions in listed options.” 17 CFR 242.201(a)(1); 17 CFR 242.600(b)(46); and 17 CFR 242.600(b)(47).

    9 Rule 201(a)(9) of Regulation SHO states that the term “Trading Center” shall have the same meaning as in Rule 600(b)(78) of Regulation NMS. Rule 600(b)(78) of Regulation NMS defines a “Trading Center” as “a national securities exchange or national securities association that operates an SRO trading facility, an alternative trading system, an exchange market maker, an OTC market maker, or any other broker or dealer that executes orders internally by trading as principal or crossing orders as agent.” 17 CFR 242.200(a)(9); 17 CFR 242.600(b)(78).

    10 17 CFR 242.201(a)(4); 17 CFR 242.600(b)(42).

    11 17 CFR 242.201(b)(1).

    Under Rule 11.13(b)(1), an order marked “short” when a short sale price test restriction is in effect is not eligible for routing by the Exchange. If an order is ineligible for routing due to a short sale price test restriction and such order is an Immediate or Cancel (“IOC”) Order 12 or a BYX Market Order,13 then the order will be cancelled. If an order is ineligible for routing due to a short sale price test restriction and such order is a Limit Order,14 the Exchange will post the unfilled balance of the order to the BYX Book,15 subject to the price sliding process as defined in paragraph (g) of Exchange Rule 11.9.16

    12See Exchange Rule 11.9(b)(1).

    13See Exchange Rule 11.9(a)(2). The Exchange also proposes to remove the reference to BYX Market Orders in Rule 11.13(b)(1) as BYX Market Orders with a time-in-force of Day that are ineligible for routing due to a short sale price test restriction pursuant to Rule 201 of Regulation SHO are not cancelled, but rather posted to the BYX Book pursuant to Exchange Rule 11.9(a)(2). All other BYX Market Orders are handled in accordance with Exchange Rule 11.13(a).

    14See Exchange Rule 11.9(a)(1).

    15See Exchange Rule 1.5(e).

    16 In sum, under Exchange Rule 11.9(g), a short sale order that, at the time of entry, could not be executed or displayed in compliance with Rule 201 of Regulation SHO will be re-priced by the System at one minimum price variation above the current NBB (“Permitted Price”). See Exchange Rule 11.9(g) for a full description of the Exchange's Short Sale Price Sliding Process.

    The Exchange proposes to specify in Rule 11.13 that orders marked “short” may be eligible for routing by the Exchange when a short sale price test restriction is in effect where the User 17 selects the Post to Away 18 routing option.19 In contrast to all other routing strategies, under which orders are routed to other Trading Centers for immediate execution, the Post to Away routing option sends an order to other Trading Centers for posting and/or later execution, as further described below. Under the Post to Away routing option, the remainder of a routed order is routed to and posted to the order book of a destination on the System routing table,20 as specified by the User. Orders routed pursuant to the Post to Away routing option that are identified as “short” are subject to the receiving Trading Center's processes for handling short sale orders in compliance with Rule 201 of Regulation SHO.21

    17See Exchange Rule 1.5(cc).

    18See 11.13(b)(3)(H).

    19 The Exchange also proposes to specify within Rule 11.13(b)(1) that the short sale price test restriction is declared pursuant to Rule 201 of Regulation SHO.

    20 The term “System routing table” is defined as “the proprietary process for determining the specific trading venues to which the System routes orders and the order in which it routes them.” See Exchange Rule 11.13(b)(3).

    21See, e.g., Nasdaq Stock Market LLC (“Nasdaq”) Rule 4763; New York Stock Exchange, Inc. (“NYSE”) Rule 440B; and Nasdaq's Regulation SHO Frequently Asked Questions (updated March 10, 2011), available at https://nasdaqtrader.com/content/marketregulation/regsho/regshoFAQs.pdf.

    Under Exchange Rule 11.13(b)(1), IOC Orders marked “short” that are not eligible for routing during a short sale price test restriction will continue to be cancelled.22 The unfilled portions of Limit Orders marked “short” that are ineligible for routing due to a short sale price test restriction will continue to be posted to the BYX Book, subject to the price sliding process as defined in paragraph (g) of Exchange Rule 11.9.

    22See supra note 13.
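    For illustration only, the order-handling logic described in the preceding paragraphs can be summarized in the following Python sketch. The function and type names (handle_short_sale_order, OrderKind, and the like) are hypothetical and are not part of Exchange Rule 11.13, and the sketch omits details such as the treatment of BYX Market Orders with a time-in-force of Day discussed in note 13.

    from enum import Enum, auto

    class OrderKind(Enum):
        IOC = auto()      # Immediate or Cancel Order
        MARKET = auto()   # BYX Market Order
        LIMIT = auto()    # Limit Order

    def handle_short_sale_order(kind, post_to_away_selected):
        """Summarize handling of an order marked "short" while a short sale
        price test restriction under Rule 201 of Regulation SHO is in effect."""
        if post_to_away_selected:
            # Proposed change: Post to Away orders remain eligible for routing;
            # the receiving Trading Center applies its own Rule 201 handling.
            return "route to the away Trading Center for posting"
        if kind in (OrderKind.IOC, OrderKind.MARKET):
            return "cancel"
        # Limit Orders: post the unfilled balance to the BYX Book, re-priced under
        # the short sale price sliding process to the Permitted Price (one minimum
        # price variation above the current national best bid).
        return "post to the BYX Book subject to price sliding"

    print(handle_short_sale_order(OrderKind.LIMIT, post_to_away_selected=False))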

    2. Statutory Basis

    The Exchange believes that the proposed rule change is consistent with Section 6(b) of the Act 23 and furthers the objectives of Section 6(b)(5) of the Act 24 because it is designed to promote just and equitable principles of trade, remove impediments to and perfect the mechanism of a free and open market and a national market system, foster cooperation and coordination with persons engaged in facilitating transactions in securities, and, in general, protect investors and the public interest. Specifically, the proposed changes are designed to ensure clarity in the Exchange's rulebook with respect to the routing of orders in compliance with Rule 201 of Regulation SHO. In addition, providing Users the ability to send short sale orders that are routable pursuant to the Post to Away routing option provides them additional flexibility with regard to the handling of their orders. The Exchange notes that short sale orders routed pursuant to the Post to Away routing option are identified as “short” and, therefore, subject to the receiving Trading Center's processes for handling short sale orders in compliance with Rule 201 of Regulation SHO.25 The Exchange also notes that other national securities exchanges do not expressly prohibit the routing of short sale orders. For example, Nasdaq and NYSE Arca, Inc. (“NYSE Arca”) allow for the routing of short sale orders generally, and do not limit a short sale order's ability to route to certain routing options.26 Thus, the proposal is directly targeted at removing impediments to and perfecting the mechanism of a free and open market and national market system. The proposed rule change also is designed to support the principles of Section 11A(a)(1) 27 of the Act in that it seeks to assure fair competition among brokers and dealers and among exchange markets.

    23 15 U.S.C. 78f(b).

    24 15 U.S.C. 78f(b)(5).

    25See supra note 21.

    26See, e.g., Nasdaq Rules 4702(a) (stating generally that an “[o]rder . . . may be routed to other market centers for potential execution if designated as `Routable' ”) and 4763 (not prohibiting the routing of a short sale order during a short sale price test). See also, e.g., NYSE Arca Rule 7.6P (not prohibiting the routing of a short sale order during a short sale price test).

    27 15 U.S.C. 78k-1(a)(1).

    B. Self-Regulatory Organization's Statement on Burden on Competition

    The Exchange does not believe that the proposed rule change will result in any burden on competition that is not necessary or appropriate in furtherance of the purposes of the Act. The Exchange is simply proposing to reflect in its rules that orders marked “short” may be routed to an away market for execution under one specific routing strategy offered by the Exchange.

    C. Self-Regulatory Organization's Statement on Comments on the Proposed Rule Change Received From Members, Participants, or Others

    The Exchange has neither solicited nor received written comments on the proposed rule change.

    III. Date of Effectiveness of the Proposed Rule Change and Timing for Commission Action

    Because the foregoing proposed rule change does not: (A) Significantly affect the protection of investors or the public interest; (B) impose any significant burden on competition; and (C) by its terms, become operative for 30 days from the date on which it was filed, or such shorter time as the Commission may designate, it has become effective pursuant to Section 19(b)(3)(A) of the Act 28 and paragraph (f)(6) of Rule 19b-4 thereunder.29 The Exchange has designated this rule filing as non-controversial. The Exchange has given the Commission written notice of its intent to file the proposed rule change, along with a brief description and text of the proposed rule change, at least five business days prior to the date of filing of the proposed rule change, or such shorter time as designated by the Commission.

    28 15 U.S.C. 78s(b)(3)(A).

    29 17 CFR 240.19b-4.

    At any time within 60 days of the filing of the proposed rule change, the Commission summarily may temporarily suspend such rule change if it appears to the Commission that such action is: (1) Necessary or appropriate in the public interest; (2) for the protection of investors; or (3) otherwise in furtherance of the purposes of the Act. If the Commission takes such action, the Commission shall institute proceedings to determine whether the proposed rule should be approved or disapproved.

    IV. Solicitation of Comments

    Interested persons are invited to submit written data, views, and arguments concerning the foregoing, including whether the proposed rule change is consistent with the Act. Comments may be submitted by any of the following methods:

    Electronic Comments

    • Use the Commission's Internet comment form (http://www.sec.gov/rules/sro.shtml); or

    • Send an email to [email protected]. Please include File Number SR-BatsBYX-2016-27 on the subject line.

    Paper Comments

    • Send paper comments in triplicate to Secretary, Securities and Exchange Commission, 100 F Street NE., Washington, DC 20549-1090.

    All submissions should refer to File Number SR-BatsBYX-2016-27. This file number should be included on the subject line if email is used. To help the Commission process and review your comments more efficiently, please use only one method. The Commission will post all comments on the Commission's Internet Web site (http://www.sec.gov/rules/sro.shtml). Copies of the submission, all subsequent amendments, all written statements with respect to the proposed rule change that are filed with the Commission, and all written communications relating to the proposed rule change between the Commission and any person, other than those that may be withheld from the public in accordance with the provisions of 5 U.S.C. 552, will be available for Web site viewing and printing in the Commission's Public Reference Room, 100 F Street NE., Washington, DC 20549 on official business days between the hours of 10:00 a.m. and 3:00 p.m. Copies of such filing also will be available for inspection and copying at the principal office of the Exchange. All comments received will be posted without change; the Commission does not edit personal identifying information from submissions. You should submit only information that you wish to make available publicly. All submissions should refer to File Number SR-BatsBYX-2016-27, and should be submitted on or before November 21, 2016.

    30 17 CFR 200.30-3(a)(12).

    For the Commission, by the Division of Trading and Markets, pursuant to delegated authority.30

    Brent J. Fields, Secretary.
    [FR Doc. 2016-26131 Filed 10-28-16; 8:45 am] BILLING CODE 8011-01-P
    SECURITIES AND EXCHANGE COMMISSION [Release No. 34-79151; File No. SR-BatsEDGX-2016-54] Self-Regulatory Organizations; Bats EDGX Exchange, Inc.; Notice of Filing and Immediate Effectiveness of a Proposed Rule Change To Amend EDGX Rule 11.11, Routing to Away Trade Centers October 25, 2016.

    Pursuant to Section 19(b)(1) of the Securities Exchange Act of 1934 (the “Act”),1 and Rule 19b-4 thereunder,2 notice is hereby given that on October 12, 2016, Bats EDGX Exchange, Inc. (the “Exchange” or “EDGX”) filed with the Securities and Exchange Commission (“Commission”) the proposed rule change as described in Items I, II, and III below, which Items have been prepared by the Exchange. The Exchange has designated this proposal as a “non-controversial” proposed rule change pursuant to Section 19(b)(3)(A) of the Act 3 and Rule 19b-4(f)(6)(iii) thereunder,4 which renders it effective upon filing with the Commission. The Commission is publishing this notice to solicit comments on the proposed rule change from interested persons.

    1 15 U.S.C. 78s(b)(1).

    2 17 CFR 240.19b-4.

    3 15 U.S.C. 78s(b)(3)(A).

    4 17 CFR 240.19b-4(f)(6)(iii).

    I. Self-Regulatory Organization's Statement of the Terms of the Substance of the Proposed Rule Change

    The Exchange filed a proposal to amend Exchange Rule 11.11(a) to describe when an order that includes a Short Sale instruction may be eligible for routing when a short sale price test restriction is in effect.

    The text of the proposed rule change is available at the Exchange's Web site at www.batstrading.com, at the principal office of the Exchange, and at the Commission's Public Reference Room.

    II. Self-Regulatory Organization's Statement of the Purpose of, and Statutory Basis for, the Proposed Rule Change

    In its filing with the Commission, the Exchange included statements concerning the purpose of and basis for the proposed rule change and discussed any comments it received on the proposed rule change. The text of these statements may be examined at the places specified in Item IV below. The Exchange has prepared summaries, set forth in Sections A, B, and C below, of the most significant parts of such statements.

    A. Self-Regulatory Organization's Statement of the Purpose of, and Statutory Basis for, the Proposed Rule Change 1. Purpose

    The Exchange proposes to amend Exchange Rule 11.11(a) to describe when an order that includes a Short Sale 5 instruction may be eligible for routing when a short sale price test restriction is in effect. Under Rule 201 of Regulation SHO,6 a short sale order in a covered security 7 generally cannot be executed or displayed by a Trading Center,8 such as the Exchange, at a price that is at or below the current national best bid (“NBB”) 9 when a short sale circuit breaker is in effect for the covered security (the “short sale price test restriction”).10

    5See Exchange Rule 11.6(o). The term “short sale” is defined as “any sale of a security which the seller does not own or any sale which is consummated by the delivery of a security borrowed by, or for the account of, the seller.” 17 CFR 242.200(a).

    6See 17 CFR 242.201; Securities Exchange Act Release No. 61595 (February 26, 2010), 75 FR 11232 (March 10, 2010).

    7 Rule 201(a)(1) of Regulation SHO defines the term “covered security” to mean any “NMS stock” as defined under Rule 600(b)(47) of Regulation NMS. Rule 600(b)(47) of Regulation NMS defines an “NMS stock” as “any NMS security other than an option.” Rule 600(b)(46) of Regulation NMS defines an “NMS security” as “any security or class of securities for which transaction reports are collected, processed, and made available pursuant to an effective transaction reporting plan, or an effective national market system plan for reporting transactions in listed options.” 17 CFR 242.201(a)(1); 17 CFR 242.600(b)(46); and 17 CFR 242.600(b)(47).

    8 Rule 201(a)(9) of Regulation SHO states that the term “Trading Center” shall have the same meaning as in Rule 600(b)(78) of Regulation NMS. Rule 600(b)(78) of Regulation NMS defines a “Trading Center” as “a national securities exchange or national securities association that operates an SRO trading facility, an alternative trading system, an exchange market maker, an OTC market maker, or any other broker or dealer that executes orders internally by trading as principal or crossing orders as agent.” 17 CFR 242.200(a)(9); 17 CFR 242.600(b)(78).

    9 17 CFR 242.201(a)(4); 17 CFR 242.600(b)(42).

    10 17 CFR 242.201(b)(1).

    Under Rule 11.11(a), an order that includes a Short Sale instruction when a short sale price test restriction pursuant to Rule 201 of Regulation SHO is in effect is not eligible for routing by the Exchange. If an order is ineligible for routing due to a Short Sale Circuit Breaker 11 being in effect and such order contains a Time-in-Force of Immediate-or-Cancel (“IOC”),12 then the order will be cancelled. For any other order ineligible for routing due to a Short Sale Circuit Breaker being in effect, the Exchange will post the unfilled balance of the order to the EDGX Book,13 treat the order as if it included a Book Only 14 or Post Only 15 instruction, and subject it to the Re-Pricing Instructions to Comply with Rule 201 of Regulation SHO, as described in Exchange Rule 11.6(l)(2),16 unless the User has elected the order Cancel Back as described in Exchange Rule 11.6(b).

    11 In order to use consistent terminology, the Exchange proposes to replace the term “short sale price test restriction” with “Short Sale Circuit Breaker” within the first sentence of Rule 11.11(a).

    12See Exchange Rule 11.6(q)(1).

    13See Exchange Rule 1.5(d).

    14See Exchange Rule 11.6(n)(3).

    15See Exchange Rule 11.6(n)(4).

    16 In sum, under Exchange Rule 11.6(l)(2), an order to sell with a Short Sale instruction that, at the time of entry, could not be executed or displayed in compliance with Rule 201 of Regulation SHO will be re-priced by the System at the Permitted Price. See Exchange Rule 11.6(l)(2) for a full description of the Exchange's Re-Pricing Instructions to Comply with Rule 201 of Regulation SHO.

    The Exchange proposes to specify in Rule 11.11(a) that orders that include a Short Sale instruction may be eligible for routing by the Exchange when a Short Sale Circuit Breaker is in effect where the User 17 selects either the Post to Away 18 or ROOC 19 routing options. In contrast to all other routing strategies, under which orders are routed to other Trading Centers for immediate execution, the Post to Away and ROOC routing options send orders to other Trading Centers for posting and/or later execution, as further described below. Under the Post to Away routing option, the remainder of a routed order is routed to and posted to the order book of a destination on the System routing table,20 as specified by the User. ROOC is a routing option for orders that the User wishes to designate for participation in the opening, re-opening (following a halt, suspension, or pause), or closing process of a primary listing market other than the Exchange (e.g., the New York Stock Exchange, Inc. (“NYSE”), Nasdaq Stock Market LLC (“Nasdaq”), NYSE MKT LLC, or NYSE Arca, Inc. (“NYSE Arca”)) if received before the opening/re-opening/closing time of such market. If shares remain unexecuted after attempting to execute in the opening, re-opening, or closing process, they are either posted to the EDGX Book, executed, or routed to destinations on the System routing table.21 Orders routed pursuant to the Post to Away and ROOC routing options that include a Short Sale instruction are identified as “short” and are subject to the receiving Trading Center's processes for handling short sale orders in compliance with Rule 201 of Regulation SHO.22

    17See Exchange Rule 1.5(cc).

    18See 11.11(g)(12).

    19See 11.11(g)(8).

    20 The term “System routing table” is defined as “the proprietary process for determining the specific trading venues to which the System routes orders and the order in which it routes them.” See Exchange Rule 11.11(g).

    21 Shares returned to the Exchange after routing are handled in accordance with Exchange Rules, including Rule 11.10(a).

    22See, e.g., Nasdaq Rule 4763; NYSE Rule 440B; and Nasdaq's Regulation SHO Frequently Asked Questions (updated March 10, 2011), available at https://nasdaqtrader.com/content/marketregulation/regsho/regshoFAQs.pdf.
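
    The handling described in the preceding paragraphs, together with the proposed routing eligibility, amounts to a short decision procedure. The following Python sketch is illustrative only; the function, field, and option names (for example, handle_short_sale_order, cancel_back_elected, "POST_TO_AWAY") are hypothetical and are not drawn from the Exchange's rules or System.

        from dataclasses import dataclass
        from typing import Optional

        @dataclass
        class Order:
            short_sale: bool               # order includes a Short Sale instruction
            time_in_force: str             # e.g., "IOC" or "DAY"
            routing_option: Optional[str]  # e.g., "POST_TO_AWAY", "ROOC", or None
            cancel_back_elected: bool      # Cancel Back election (Exchange Rule 11.6(b))

        # Routing options that send orders to other Trading Centers for posting
        # and/or later execution rather than immediate execution.
        POSTING_STYLE_ROUTING = {"POST_TO_AWAY", "ROOC"}

        def handle_short_sale_order(order: Order, circuit_breaker: bool) -> str:
            if not (order.short_sale and circuit_breaker):
                return "handled normally (no Short Sale Circuit Breaker in effect)"
            if order.routing_option in POSTING_STYLE_ROUTING:
                # Proposed change: the order may be routed, identified as "short";
                # the receiving Trading Center applies its own Rule 201 handling.
                return "routed away, marked short"
            # Otherwise the order is ineligible for routing.
            if order.time_in_force == "IOC":
                return "cancelled"
            if order.cancel_back_elected:
                return "cancelled back to the User"
            # Post the unfilled balance to the EDGX Book as Book Only / Post Only,
            # re-priced under Exchange Rule 11.6(l)(2) to comply with Rule 201.
            return "posted to the EDGX Book at the Permitted Price"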

    Under Exchange Rule 11.11(a), orders that include a Short Sale instruction and a Time-in-Force of IOC that are not eligible for routing during a Short Sale Circuit Breaker will continue to be cancelled. For any other order that includes a Short Sale instruction that is ineligible for routing due to a Short Sale Circuit Breaker being in effect, the Exchange will continue to post the unfilled balance of the order to the EDGX Book, treat the order as if it included a Book Only or Post Only instruction, and subject it to the Re-Pricing Instructions to Comply with Rule 201 of Regulation SHO, as described in Rule 11.6(l)(2), unless the User has elected the order Cancel Back as described in Rule 11.6(b).

    2. Statutory Basis

    The Exchange believes that the proposed rule change is consistent with Section 6(b) of the Act 23 and furthers the objectives of Section 6(b)(5) of the Act 24 because it is designed to promote just and equitable principles of trade, remove impediments to and perfect the mechanism of a free and open market and a national market system, foster cooperation and coordination with persons engaged in facilitating transactions in securities, and, in general, protect investors and the public interest. Specifically, the proposed changes are designed to ensure clarity in the Exchange's rulebook with respect to the routing of orders in compliance with Rule 201 of Regulation SHO. In addition, providing Users the ability to send short sale orders that are routable pursuant to the Post to Away and ROOC routing options provides them additional flexibility with regard to the handling of their orders. The Exchange notes that orders that include a Short Sale instruction routed pursuant to the Post to Away or ROOC routing options are identified as “short” and, therefore, subject to the receiving Trading Center's processes for handling short sale orders in compliance with Regulation SHO.25 The Exchange also notes that other national securities exchanges do not expressly prohibit the routing of short sale orders. For example, Nasdaq and NYSE Arca allow for the routing of short sale orders generally, and do not limit a short sale order's ability to route to certain routing options.26 Thus, the proposal is directly targeted at removing impediments to and perfecting the mechanism of a free and open market and national market system. The proposed rule change also is designed to support the principles of Section 11A(a)(1) 27 of the Act in that it seeks to assure fair competition among brokers and dealers and among exchange markets.

    23 15 U.S.C. 78f(b).

    24 15 U.S.C. 78f(b)(5).

    25See supra note 22.

    26See e.g., Nasdaq Rules 4702(a) (stating generally that an “[o]rder may . . . may be routed to other market centers for potential execution if designated as `Routable' ”) and 4763 (not prohibiting the routing of a short sale order during a short sale price test). See also e.g., NYSE Arca Rule 7.6P (not prohibiting the routing of a short sale order during a short sale price test).

    27 15 U.S.C. 78k-1(a)(1).

    B. Self-Regulatory Organization's Statement on Burden on Competition

    The Exchange does not believe that the proposed rule change will result in any burden on competition that is not necessary or appropriate in furtherance of the purposes of the Act. The Exchange is simply proposing to reflect in its rules that orders that include a Short Sale instruction may be routed to an away market for execution under two specific routing strategies offered by the Exchange.

    C. Self-Regulatory Organization's Statement on Comments on the Proposed Rule Change Received From Members, Participants, or Others

    The Exchange has neither solicited nor received written comments on the proposed rule change.

    III. Date of Effectiveness of the Proposed Rule Change and Timing for Commission Action

    Because the foregoing proposed rule change does not: (A) Significantly affect the protection of investors or the public interest; (B) impose any significant burden on competition; and (C) by its terms, become operative for 30 days from the date on which it was filed, or such shorter time as the Commission may designate, it has become effective pursuant to Section 19(b)(3)(A) of the Act 28 and paragraph (f)(6) of Rule 19b-4 thereunder,29 the Exchange has designated this rule filing as non-controversial. The Exchange has given the Commission written notice of its intent to file the proposed rule change, along with a brief description and text of the proposed rule change, at least five business days prior to the date of filing of the proposed rule change, or such shorter time as designated by the Commission.

    28 15 U.S.C. 78s(b)(3)(A).

    29 17 CFR 240.19b-4.

    At any time within 60 days of the filing of the proposed rule change, the Commission summarily may temporarily suspend such rule change if it appears to the Commission that such action is: (1) Necessary or appropriate in the public interest; (2) for the protection of investors; or (3) otherwise in furtherance of the purposes of the Act. If the Commission takes such action, the Commission shall institute proceedings to determine whether the proposed rule should be approved or disapproved.

    IV. Solicitation of Comments

    Interested persons are invited to submit written data, views, and arguments concerning the foregoing, including whether the proposed rule change is consistent with the Act. Comments may be submitted by any of the following methods:

    Electronic Comments

    • Use the Commission's Internet comment form (http://www.sec.gov/rules/sro.shtml); or

    • Send an email to [email protected]. Please include File Number SR-BatsEDGX-2016-54 on the subject line.

    Paper Comments

    • Send paper comments in triplicate to Secretary, Securities and Exchange Commission, 100 F Street NE., Washington, DC 20549-1090.

    All submissions should refer to File Number SR-BatsEDGX-2016-54. This file number should be included on the subject line if email is used. To help the Commission process and review your comments more efficiently, please use only one method. The Commission will post all comments on the Commission's Internet Web site (http://www.sec.gov/rules/sro.shtml). Copies of the submission, all subsequent amendments, all written statements with respect to the proposed rule change that are filed with the Commission, and all written communications relating to the proposed rule change between the Commission and any person, other than those that may be withheld from the public in accordance with the provisions of 5 U.S.C. 552, will be available for Web site viewing and printing in the Commission's Public Reference Room, 100 F Street NE., Washington, DC 20549 on official business days between the hours of 10:00 a.m. and 3:00 p.m. Copies of such filing also will be available for inspection and copying at the principal office of the Exchange. All comments received will be posted without change; the Commission does not edit personal identifying information from submissions. You should submit only information that you wish to make available publicly. All submissions should refer to File Number SR-BatsEDGX-2016-54, and should be submitted on or before November 21, 2016.

    For the Commission, by the Division of Trading and Markets, pursuant to delegated authority.30

    Brent J. Fields, Secretary.

    30 17 CFR 200.30-3(a)(12).

    [FR Doc. 2016-26134 Filed 10-28-16; 8:45 am] BILLING CODE 8011-01-P
    SECURITIES AND EXCHANGE COMMISSION

    [Release No. 34-79153; File No. SR-OPRA-2016-02]

    Options Price Reporting Authority; Notice of Filing and Immediate Effectiveness of Proposed Amendment to the Plan for Reporting of Consolidated Options Last Sale Reports and Quotation Information To Amend OPRA's Non-Display Use Fees

    October 25, 2016.

    Pursuant to Section 11A of the Securities Exchange Act of 1934 (“Act”) 1 and Rule 608 thereunder,2 notice is hereby given that on September 29, 2016, the Options Price Reporting Authority (“OPRA”) submitted to the Securities and Exchange Commission (“Commission”) an amendment to the Plan for Reporting of Consolidated Options Last Sale Reports and Quotation Information (“OPRA Plan”).3 The OPRA Plan Amendment would implement changes to OPRA's Non-Display Use Fees on November 1, 2016. The Commission is publishing this notice to provide interested persons an opportunity to submit written comments on the OPRA Plan amendment.

    1 15 U.S.C. 78k-1.

    2 17 CFR 242.608.

    3 The OPRA Plan is a national market system plan approved by the Commission pursuant to Section 11A of the Act and Rule 608 thereunder. See Securities Exchange Act Release No. 17638 (March 18, 1981), 22 SEC Docket 484 (March 31, 1981). The full text of the OPRA Plan is available at http://www.opradata.com. The OPRA Plan provides for the collection and dissemination of last sale and quotation information on options that are traded on the participant exchanges. The fourteen participants to the OPRA Plan are BATS BZX Exchange, Inc., BATS EDGX Exchange, Inc., BOX Options Exchange, LLC, Chicago Board Options Exchange, Incorporated, C2 Options Exchange, Incorporated, International Securities Exchange, LLC, ISE Gemini, LLC, ISE Mercury, LLC, Miami International Securities Exchange, LLC, NASDAQ BX, Inc., NASDAQ PHLX LLC, The NASDAQ Stock Market LLC, NYSE MKT LLC, and NYSE Arca, Inc.

    I. Description and Purpose of the Plan Amendment

    OPRA proposes to amend footnotes 10 and 11 in the OPRA Fee Schedule to clarify the application of OPRA's “Non-Display Use” fees in certain respects.4 OPRA is not proposing any changes in the Non-Display Use fees themselves, although OPRA does propose to add the word “Monthly” to the first phrase in the Non-Display Use Fee entry in the Fee Schedule.

    4 OPRA proposed its current Non-Display Use fees in Securities Exchange Act Release No. 77584 (April 12, 2016), 81 FR 22670 (April 18, 2016) (File No. OPRA-2015-01).

    (a) Elimination of the Term “Datafeed”

    OPRA proposes to eliminate the use of the term “datafeed” in footnotes 10 and 11. Some OPRA Vendors have argued that the use of the term “datafeed” in these footnotes provides a basis for saying that the Non-Display Use fees are not applicable to their downstream OPRA data recipients. That argument is based on a separate OPRA Policy entitled “Datafeeds.” 5 In that Policy, the term “datafeed” is defined as “any uncontrolled retransmission of OPRA market data.” The argument has been that an OPRA Vendor and its downstream data recipients are not making Non-Display Use of OPRA data if the Vendor “controls”—that is, entitles—the server on which the Non-Display Use of the OPRA data is made.

    5 This Policy is available on the OPRA Web site under the “Policies” tab.

    From OPRA's perspective, this is clearly incorrect. The Datafeeds Policy is directed to describing how OPRA data is received, in order to explain the circumstances in which an OPRA data recipient needs to be a party to a “Vendor Agreement”, a “Direct Access Rider” and/or an “Indirect Access Rider”. The Datafeeds Policy is not relevant to the question of how OPRA data is used, specifically the question of whether a particular use of OPRA data constitutes “Non-Display Use.” OPRA believes that the Datafeeds Policy is irrelevant to the question of the applicability of Non-Display Use fees.

    Nonetheless, OPRA recognizes that the use of the term “datafeed” in separate OPRA documents with different meanings carries the potential for confusion and that the description of OPRA's fees should be as clear as possible. Accordingly, OPRA is proposing to amend footnotes 10 and 11 in its Fee Schedule so that the term “datafeed” is no longer used in the footnotes.

    (b) Exception for Category 1 Non-Display Use by a Single UserID on Behalf of Certain Data Recipients

    OPRA also proposes to add a sentence in footnote 10 to state that the Category 1 Non-Display Use Fee 6 does not apply to an OPRA data recipient during a calendar month if the data recipient: “(i) has a single UserID that uses OPRA data for Non-Display Use and (ii) is not a broker-dealer and does not place more than 390 orders in listed options per day on average during the calendar month (counting orders for this purpose in accordance with the rules of the OPRA Participant exchanges to which it submits orders during the month) for its own beneficial account(s).” This sentence is intended to provide relief from the Category 1 Non-Display Use Fee for a data recipient that has a single UserID (a single natural person) that uses OPRA data for Category 1 Non-Display Use, unless the OPRA data recipient is acting as a broker-dealer or is submitting orders to the OPRA Participant exchanges at a rate (390 orders per day on average over a calendar month) that indicates that it is making extensive use of OPRA data. Clause (ii) of this sentence is phrased to take advantage of language in the rules of the various OPRA Participant exchanges that uses the “390 orders per day on average during the calendar month” concept for purposes of determining whether a person submitting trades to the exchanges is subject to rules applicable to public customers or to professional traders. OPRA understands the rules of the various exchanges to be similar enough in substance to allow for effective and meaningful counting of orders sent to all of the OPRA Participant exchanges, even though these rules are not stated in identical language.7 OPRA believes that the same concept provides a reasonable basis for distinguishing data recipients that are appropriately exempted from the Category 1 Non-Display Use fee.8 OPRA believes that this exemption from the application of the Category 1 Non-Display Use Fee would be similar to one of the exemptions stated in the OPRA Fee Schedule from the Subscriber Indirect Access Fee, which states that the Subscriber Indirect Access Fee “shall not apply to a subscriber . . . that receives a data feed transmission on a single, stand-alone computer for the sole purpose of providing a single-screen display of OPRA Data for the subscriber's internal use.”

    6 “Category 1” Non-Display Use is defined in footnote 10 to refer to Non-Display Use of OPRA data by a recipient of the data “on its own behalf.”

    7 Although the precise language used by the various exchanges to make this distinction varies from one to the next, all of the exchanges use the “390 orders per day on average during the calendar month” concept, and their specific provisions for counting orders are similar enough to permit accurate counting of orders across exchanges for the purposes of the distinction described in the text. See, for example, Bats BZX Rules 16.1(a)(46) and Interpretations and Policies .01 to Rule 16.1; BOX Options Exchange LLC Rule 100(a)(50); CBOE Rule 1.1(ggg); ISE Rule 100(a)(37A) and Rule 100(a)(37C); MIAX Rule 100 (definition of “Priority Customer” including Interpretation and Policy .01); Nasdaq PHLX LLC Rule 1000(b)(14); and NYSE Arca, Inc. Rule 6.1A(a)(4A).

    8 OPRA believes that it is not appropriate to make the same distinction for Category 2 Non-Display Use fees (Category 2 Non-Display Use is Non-Display Use of OPRA data on behalf of clients of the OPRA data recipient) or for Category 3 Non-Display Use fees (Category 3 Non-Display Use is for the purpose of internally matching buy and sell orders within the OPRA data recipient).
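
    Stated as a test applied each calendar month, the proposed exception turns on the number of UserIDs making Non-Display Use, broker-dealer status, and the average daily order count. The Python sketch below is a hypothetical simplification; the function and parameter names are not OPRA's, and the order-counting conventions of the individual Participant exchanges are abstracted into a single average.

        def qualifies_for_category1_exemption(
            non_display_user_ids: int,
            is_broker_dealer: bool,
            avg_daily_listed_option_orders: float,
        ) -> bool:
            """True if the Category 1 Non-Display Use Fee would not apply for the
            month under the proposed exception in footnote 10 (illustrative only)."""
            single_user_id = non_display_user_ids == 1
            exceeds_390_per_day = avg_daily_listed_option_orders > 390
            return single_user_id and not is_broker_dealer and not exceeds_390_per_day

        # Example: one UserID, not a broker-dealer, averaging 50 orders/day qualifies;
        # averaging more than 390 orders/day during the month removes the exemption.
        assert qualifies_for_category1_exemption(1, False, 50) is True
        assert qualifies_for_category1_exemption(1, False, 400) is False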

    (c) Clarification With Respect to Non-Display Use Fees and Professional Subscriber Device-Based Fees

    OPRA is also proposing a separate change in footnote 10 to the OPRA Fee Schedule for a purpose relating to the administration of the Non-Display Use Fees. A few OPRA data recipients have tried to suggest that if a device is subject to the Professional Subscriber Device-Based Fees it is immune from Non-Display Use Fees, and that therefore by attaching a display monitor to a server an OPRA data recipient can avoid payment of Non-Display Use Fees even if the server is used for Non-Display Use of OPRA data. OPRA believes that this is clearly incorrect, and that this can be clearly seen in the first sentence of footnote 10 in its current form (“Non-Display Use refers to the accessing, processing or consuming . . . of OPRA market data . . . for a purpose other than in support of the datafeed recipient's display or further internal or external redistribution.” (Emphasis added.)) Nonetheless, OPRA believes that it is appropriate to make changes in footnote 10 to make clearer that a device is subject both to the Professional Subscriber Device-Based Fees and to Non-Display Use Fees if it is used both to display OPRA data and for Non-Display Use of OPRA data.

    (d) Addition of the Word “Monthly” in the Non-Display Use Fee Entry in the Fee Schedule

    OPRA proposes to add the word “Monthly” to the heading of the Non-Display Use Fee entry in its Fee Schedule, so that the entry reads “Monthly Non-Display Use Fees.” The absence of this word was recently brought to the attention of OPRA staff. The word is used in the other entries in OPRA's Fee Schedule that are for monthly fees, and OPRA believes that for clarity the word should be used in this entry as well.9 So far as OPRA staff is aware, no OPRA data recipient has relied on the absence of the word to conclude that the Non-Display Use fees are payable on any basis other than monthly.

    9 OPRA's original filing for Non-Display Use did clearly identify the fees as monthly. See Securities Exchange Act Release No. 77584, 81 FR 22760 at 22762 (“The OPRA Plan amendment adopted fees for Non-Display Use as follows: A monthly fee of $2,000/Enterprise for Category 1 . . .; a monthly fee of $2,000/Enterprise for Category 2 . . .; and a monthly fee of $2,000/Platform for Category 3 . . .”).

    (e) Effect on OPRA Revenue of Proposed Changes; Discussions With Data Recipients

    OPRA does not anticipate any material increase in its revenues as a result of the changes described in this filing—indeed, on balance, OPRA may not experience any increase at all in its revenues as a result of the changes described in this filing. A few OPRA data recipients that have resisted payment of Non-Display Use fees on the basis of the assertion that they are not receiving the data through “datafeeds” will no longer be able to make that assertion, possibly resulting in a small increase in OPRA's revenues. On the other hand, there may be recipients of OPRA data that have been paying Category 1 Non-Display Use fees and that may no longer pay them as a result of the express exemption from Category 1 Non-Display Use fees for certain data recipients with a single UserID that use OPRA data for Category 1 Non-Display Use. OPRA believes that the change described in this filing to make more explicit that payment of Device-based Fees does not make Non-Display Use fees inapplicable will have no material effect on its revenues.

    OPRA believes that the most important of these changes is the deletion of the term “datafeed” in the footnotes to its Fee Schedule, not because of its effect on OPRA revenues, but because of concerns expressed to OPRA staff by data recipients that have been paying the Non-Display Use fees and have recently been told that their competitors may not be paying the fees on the basis of the “datafeed” argument. OPRA recognizes that equal treatment for persons similarly situated is an essential aspect of its operations, and believes that elimination of the word “datafeed” is important to providing equal treatment for persons making Non-Display Use of OPRA data. Similarly, OPRA believes that it is appropriate to provide relief from the Non-Display Fee for all data recipients that make limited Category 1 Non-Display Use of OPRA data within the scope of the exception. Finally, OPRA believes that it is appropriate to reinforce the concept that Non-Display Use Fees would be applicable if Non-Display Use is being made of OPRA data, even if the Non-Display Use is being made on a device that is subject to Professional Subscriber Device-Based Fees, again in furtherance of the fundamental concept that persons similarly situated should be treated equally.

    The text of the amendment to the OPRA Plan is available at OPRA, the Commission's Public Reference Room, the OPRA Web site at http://opradata.com, and on the Commission's Web site at www.sec.gov.

    II. Implementation of the OPRA Plan Amendment

    Pursuant to paragraph (b)(3)(i) of Rule 608 of Regulation NMS under the Act, OPRA designated this amendment as establishing or changing fees or other charges collected on behalf of all of the OPRA Participant exchanges in connection with access to or use of OPRA facilities. OPRA proposes to implement the revisions in the Non-Display Use Fee footnotes that are described in this amendment on November 1, 2016. According to OPRA, implementation of the revisions as of that date will permit OPRA to provide persons that may be affected by these changes with thirty days' notice of the changes.

    The Commission may summarily abrogate the amendment within sixty days of its filing and require refiling and approval of the amendment by Commission order pursuant to Rule 608(b)(2) under the Act 10 if it appears to the Commission that such action is necessary or appropriate in the public interest, for the protection of investors, or the maintenance of fair and orderly markets, to remove impediments to, and perfect the mechanisms of, a national market system, or otherwise in furtherance of the purposes of the Act.11

    10See 17 CFR 242.608(b)(2).

    11See 17 CFR 242.608(b)(3)(iii).

    III. Solicitation of Comments

    Interested persons are invited to submit written data, views, and arguments concerning the foregoing, including whether the OPRA Plan amendment is consistent with the Act. Comments may be submitted by any of the following methods:

    Electronic Comments

    • Use the Commission's Internet comment form (http://www.sec.gov/rules/sro.shtml); or

    • Send an email to [email protected]. Please include File No. SR-OPRA-2016-02 on the subject line.

    Paper Comments

    • Send paper comments in triplicate to Secretary, Securities and Exchange Commission, 100 F Street NE., Washington, DC 20549-1090.

    All submissions should refer to File Number SR-OPRA-2016-02. This file number should be included on the subject line if email is used. To help the Commission process and review your comments more efficiently, please use only one method. The Commission will post all comments on the Commission's Internet Web site (http://www.sec.gov/rules/sro.shtml). Copies of the submission, all subsequent amendments, all written statements with respect to the OPRA Plan amendment that are filed with the Commission, and all written communications relating to the OPRA Plan amendment between the Commission and any person, other than those that may be withheld from the public in accordance with the provisions of 5 U.S.C. 552, will be available for Web site viewing and printing in the Commission's Public Reference Room, 100 F Street NE., Washington, DC 20549, on official business days between the hours of 10:00 a.m. and 3:00 p.m. Copies of such filing also will be available for inspection and copying at the principal office of OPRA. All comments received will be posted without change; the Commission does not edit personal identifying information from submissions. You should submit only information that you wish to make available publicly. All submissions should refer to File Number SR-OPRA-2016-02 and should be submitted on or before November 21, 2016.

    By the Commission.

    Brent J. Fields, Secretary.
    [FR Doc. 2016-26136 Filed 10-28-16; 8:45 am] BILLING CODE 8011-01-P
    SECURITIES AND EXCHANGE COMMISSION

    [Release No. 34-79152; File No. SR-OPRA-2016-01]

    Options Price Reporting Authority; Notice of Filing and Immediate Effectiveness of Proposed Amendment to the Plan for Reporting of Consolidated Options Last Sale Reports and Quotation Information To Amend the Professional Subscriber Device-Based Fees and Policies with Respect to Device-Based Fees

    October 25, 2016.

    Pursuant to Section 11A of the Securities Exchange Act of 1934 (“Act”) 1 and Rule 608 thereunder,2 notice is hereby given that on September 29, 2016, the Options Price Reporting Authority (“OPRA”) submitted to the Securities and Exchange Commission (“Commission”) an amendment to the Plan for Reporting of Consolidated Options Last Sale Reports and Quotation Information (“OPRA Plan”).3 The OPRA Plan Amendment would implement changes to the Professional Subscriber Device-Based Fee effective January 1, 2017. The OPRA Plan Amendment would also implement minor clarifying changes to the Policies with Respect to Device-Based Fees, effective immediately. The Commission is publishing this notice to provide interested persons an opportunity to submit written comments on the OPRA Plan amendment.

    1 15 U.S.C. 78k-1.

    2 17 CFR 242.608.

    3 The OPRA Plan is a national market system plan approved by the Commission pursuant to Section 11A of the Act and Rule 608 thereunder. See Securities Exchange Act Release No. 17638 (March 18, 1981), 22 S.E.C. Docket 484 (March 31, 1981). The full text of the OPRA Plan is available at http://www.opradata.com. The OPRA Plan provides for the collection and dissemination of last sale and quotation information on options that are traded on the participant exchanges. The fourteen participants to the OPRA Plan are BATS Exchange, Inc., BOX Options Exchange, LLC, Chicago Board Options Exchange, Incorporated, C2 Options Exchange, Incorporated, EDGX Exchange, Inc., International Securities Exchange, LLC, ISE Gemini, LLC, ISE Mercury, LLC, Miami International Securities Exchange, LLC, NASDAQ OMX BX, Inc., NASDAQ OMX PHLX LLC, The NASDAQ Stock Market LLC, NYSE MKT LLC, and NYSE Arca, Inc.

    I. Description and Purpose of the Plan Amendment

    (a) Fee Schedule Amendments

    The primary purpose of the proposed Fee Schedule amendments is to specify OPRA's Professional Subscriber Device-Based Fee effective January 1, 2017 and to make conforming changes in OPRA's Enterprise Rate Professional Subscriber Fee. OPRA's Enterprise Rate Professional Subscriber Fee is available to those Professional Subscribers that elect that rate in place of the regular OPRA device-based fees.4

    4 OPRA's Enterprise Rate is based on the number of a Professional Subscriber's U.S. registered representatives and independent investment advisers who contract with the Subscriber to provide advisory services to the Subscriber's customers.

    Specifically, OPRA proposes, effective January 1, 2017, to: (1) Increase the current $29.50 monthly per device fee by $1.00; (2) increase the Enterprise Rate, currently a monthly fee of $29.50 times the number of a Professional Subscriber's U.S.-based registered representatives, to be a monthly fee of $30.50 times the number of the Subscriber's U.S.-based registered representatives; and (3) make conforming changes to the minimum monthly fee under the Enterprise Rate. “Professional Subscribers” are persons who subscribe to OPRA data, do not qualify for the reduced fees charged to “Nonprofessional Subscribers,” and do not redistribute the OPRA data to third parties. OPRA permits the counting of “User IDs” as a surrogate for counting “devices” for purposes of its Professional Subscriber Device-based Fees.5

    5See footnote 2 in the OPRA Fee Schedule and OPRA's Policies with respect to Device-based Fees.

    The number of devices reported to OPRA as subject to Professional Subscriber Device-Based Fees has been steadily trending downwards over many years. In 2008, OPRA received device-based fees, including enterprise fees, with respect to approximately 210,500 devices. In 2014, OPRA received device-based fees, including enterprise fees, with respect to approximately 148,400 devices, and in 2015 OPRA received device-based fees, including enterprise fees, with respect to approximately 141,300 devices. OPRA is receiving device-based fees in the third calendar quarter of 2016 with respect to approximately 135,500 devices—already a reduction of approximately 4.1% from 2015. OPRA believes that this long-term downward trend is the result of the increasing use of trading algorithms and automated trading platforms and other fundamental changes in the securities industry, and OPRA anticipates that this trend is likely to continue.

    The proposed increase in the Professional Subscriber Device-Based Fees is consistent with OPRA's past practice of making incremental $1.00 increases in its monthly Professional Subscriber Device-Based Fees,6 and OPRA believes that OPRA's Professional Subscribers should not be surprised by the increase. The proposed increase in the Professional Subscriber Device-Based Fee—which is an increase of approximately 3.4%—will partially offset the impact on revenue of the reduction in the number of devices in 2016 as compared to 2015.

    6 The year 2015 was an exception: For 2015, OPRA implemented an increase of $1.50 in its Professional Subscriber Device-Based Fee, because during 2015 one of OPRA's member exchanges initiated after-hours trading, causing OPRA to incur additional expenses associated with data dissemination during expanded trading hours. OPRA implemented $1.00/month increases in its Professional Subscriber Device-Based Fee for each of the years 2008-2014 and for the year 2016. See, Securities Exchange Act Release No. 72826, 79 FR 48777 (August 18, 2014) (File No. OPRA-2014-06) and Securities Exchange Act Release No. 77585, 81 FR 22668 (April 18, 2016) (File No. OPRA-2015-02).
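
    As a check on the approximate percentages cited above, the figures follow directly from the reported numbers; the short Python sketch below simply reproduces that arithmetic.

        # Reported device counts (approximate).
        devices_2015 = 141_300
        devices_2016_q3 = 135_500
        reduction = (devices_2015 - devices_2016_q3) / devices_2015
        print(f"Reduction from 2015 to Q3 2016: {reduction:.1%}")  # about 4.1%

        # Proposed $1.00 increase on the current $29.50 monthly per device fee.
        increase = 1.00 / 29.50
        print(f"Proposed fee increase: {increase:.1%}")  # about 3.4%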

    A secondary purpose of the proposed Fee Schedule amendments is to add the word “display” in the statements of the monthly Professional Subscriber Device-Based Fees for the periods commencing on January 1, 2016 and January 1, 2017. A few OPRA Professional Subscribers have asked whether, if a device is subject to the Professional Subscriber Device-Based Fees, it is therefore not subject to the OPRA Non-Display Use Fees, and suggested that a Subscriber could perhaps avoid payment of Non-Display Use Fees by attaching a display monitor to a server even if the server is being used for Non-Display Use of OPRA data. OPRA believes that this suggestion is not consistent even with the current wording of the Fee Schedule, but that the addition of the word “display” will make the wording clearer in this respect.

    (b) Changes in the Policies With respect to Device-Based Fees

    The proposed changes in the Policies with respect to Device-Based Fees are for a purpose similar to the purpose described above of adding the word “display” in the OPRA Fee Schedule, namely to avert misreading the Policies as saying that, if a Professional Subscriber is paying Device-Based Fees with respect to a device, the payment of the Device-Based Fees in and of itself is a sufficient basis for not paying Non-Display Use Fees even if the Non-Display Use Fees would otherwise be applicable. No Professional Subscriber has actually suggested such a reading to OPRA, and OPRA believes that the suggestion would be untenable even in terms of the current phrasing of the Policies, but OPRA believes that it is appropriate to revise the Policies to make clearer that the Device-based Fees may not be the only fees applicable to a particular device that receives OPRA data.

    The text of the amendment to the OPRA Plan is available at OPRA, the Commission's Public Reference Room, the OPRA Web site at http://opradata.com, and on the Commission's Web site at www.sec.gov.

    II. Implementation of the OPRA Plan Amendment

    Pursuant to paragraph (b)(3)(i) of Rule 608 of Regulation NMS under the Act, OPRA designated this amendment as establishing or changing fees or other charges collected on behalf of all of the OPRA participant exchanges in connection with access to or use of OPRA facilities. OPRA proposes to implement the changes in the Professional Subscriber Device-Based Fee on January 1, 2017. Implementation of the changes in the Professional Subscriber Device-Based Fee on January 1 is consistent with OPRA's prior practice with respect to changes in this fee, and will provide ample opportunity to give persons subject to this fee advance notice of the change. OPRA also proposes to implement the changes in the Policies with respect to Device-Based Fees immediately.

    The Commission may summarily abrogate the amendment within sixty days of its filing and require refiling and approval of the amendment by Commission order pursuant to Rule 608(b)(2) under the Act 7 if it appears to the Commission that such action is necessary or appropriate in the public interest, for the protection of investors, or the maintenance of fair and orderly markets, to remove impediments to, and perfect the mechanisms of, a national market system, or otherwise in furtherance of the purposes of the Act.8

    7See 17 CFR 242.608(b)(2).

    8See 17 CFR 242.608(b)(3)(iii).

    III. Solicitation of Comments

    Interested persons are invited to submit written data, views, and arguments concerning the foregoing, including whether the OPRA Plan amendment is consistent with the Act. Comments may be submitted by any of the following methods:

    Electronic Comments

    • Use the Commission's Internet comment form (http://www.sec.gov/rules/sro.shtml); or

    • Send an email to [email protected]. Please include File No. SR-OPRA-2016-01 on the subject line.

    Paper Comments

    • Send paper comments in triplicate to Secretary, Securities and Exchange Commission, 100 F Street NE., Washington, DC 20549-1090.

    All submissions should refer to File Number SR-OPRA-2016-01. This file number should be included on the subject line if email is used. To help the Commission process and review your comments more efficiently, please use only one method. The Commission will post all comments on the Commission's Internet Web site (http://www.sec.gov/rules/sro.shtml). Copies of the submission, all subsequent amendments, all written statements with respect to the OPRA Plan amendment that are filed with the Commission, and all written communications relating to the OPRA Plan amendment between the Commission and any person, other than those that may be withheld from the public in accordance with the provisions of 5 U.S.C. 552, will be available for Web site viewing and printing in the Commission's Public Reference Room, 100 F Street NE., Washington, DC 20549, on official business days between the hours of 10:00 a.m. and 3:00 p.m. Copies of such filing also will be available for inspection and copying at the principal office of OPRA. All comments received will be posted without change; the Commission does not edit personal identifying information from submissions. You should submit only information that you wish to make available publicly. All submissions should refer to File Number SR-OPRA-2016-01 and should be submitted on or before November 21, 2016.

    By the Commission.

    Brent J. Fields, Secretary.
    [FR Doc. 2016-26135 Filed 10-28-16; 8:45 am] BILLING CODE 8011-01-P
    SECURITIES AND EXCHANGE COMMISSION

    [Release No. 34-79149; File No. SR-BatsBZX-2016-65]

    Self-Regulatory Organizations; Bats BZX Exchange, Inc.; Notice of Filing and Immediate Effectiveness of a Proposed Rule Change to BZX Rule 11.13, Order Execution and Routing

    October 25, 2016.

    Pursuant to Section 19(b)(1) of the Securities Exchange Act of 1934 (the “Act”),1 and Rule 19b-4 thereunder,2 notice is hereby given that on October 12, 2016, Bats BZX Exchange, Inc. (the “Exchange” or “BZX”) filed with the Securities and Exchange Commission (“Commission”) the proposed rule change as described in Items I, II, and III below, which Items have been prepared by the Exchange. The Exchange has designated this proposal as a “non-controversial” proposed rule change pursuant to Section 19(b)(3)(A) of the Act 3 and Rule 19b-4(f)(6)(iii) thereunder,4 which renders it effective upon filing with the Commission. The Commission is publishing this notice to solicit comments on the proposed rule change from interested persons.

    1 15 U.S.C. 78s(b)(1).

    2 17 CFR 240.19b-4.

    3 15 U.S.C. 78s(b)(3)(A).

    4 17 CFR 240.19b-4(f)(6)(iii).

    I. Self-Regulatory Organization's Statement of the Terms of the Substance of the Proposed Rule Change

    The Exchange filed a proposal to amend Exchange Rule 11.13(b)(1) to describe when an order marked as “short” may be eligible for routing when a short sale price test restriction is in effect.

    The text of the proposed rule change is available at the Exchange's Web site at www.batstrading.com, at the principal office of the Exchange, and at the Commission's Public Reference Room.

    II. Self-Regulatory Organization's Statement of the Purpose of, and Statutory Basis for, the Proposed Rule Change

    In its filing with the Commission, the Exchange included statements concerning the purpose of and basis for the proposed rule change and discussed any comments it received on the proposed rule change. The text of these statements may be examined at the places specified in Item IV below. The Exchange has prepared summaries, set forth in Sections A, B, and C below, of the most significant parts of such statements.

    A. Self-Regulatory Organization's Statement of the Purpose of, and Statutory Basis for, the Proposed Rule Change

    1. Purpose

    The Exchange proposes to amend Exchange Rule 11.13(b)(1) to describe when an order to sell marked 5 as “short” 6 may be eligible for routing when a short sale price test restriction is in effect. Under Rule 201 of Regulation SHO,7 a short sale order in a covered security 8 generally cannot be executed or displayed by a Trading Center,9 such as the Exchange, at a price that is at or below the current national best bid (“NBB”) 10 when a short sale circuit breaker is in effect for the covered security (the “short sale price test restriction”).11

    5 17 CFR 242.200(g).

    6 The term “short sale” is defined as “any sale of a security which the seller does not own or any sale which is consummated by the delivery of a security borrowed by, or for the account of, the seller.” 17 CFR 242.200(a).

    7See 17 CFR 242.201; Securities Exchange Act Release No. 61595 (February 26, 2010), 75 FR 11232 (March 10, 2010).

    8 Rule 201(a)(1) of Regulation SHO defines the term “covered security” to mean any “NMS stock” as defined under Rule 600(b)(47) of Regulation NMS. Rule 600(b)(47) of Regulation NMS defines an “NMS stock” as “any NMS security other than an option.” Rule 600(b)(46) of Regulation NMS defines an “NMS security” as “any security or class of securities for which transaction reports are collected, processed, and made available pursuant to an effective transaction reporting plan, or an effective national market system plan for reporting transactions in listed options.” 17 CFR 242.201(a)(1); 17 CFR 242.600(b)(46); and 17 CFR 242.600(b)(47).

    9 Rule 201(a)(9) of Regulation SHO states that the term “Trading Center” shall have the same meaning as in Rule 600(b)(78) of Regulation NMS. Rule 600(b)(78) of Regulation NMS defines a “Trading Center” as “a national securities exchange or national securities association that operates an SRO trading facility, an alternative trading system, an exchange market maker, an OTC market maker, or any other broker or dealer that executes orders internally by trading as principal or crossing orders as agent.” 17 CFR 242.200(a)(9); 17 CFR 242.600(b)(78).

    10 17 CFR 242.201(a)(4); 17 CFR 242.600(b)(42).

    11 17 CFR 242.201(b)(1).
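
    Expressed as a condition, the short sale price test restriction described above turns on whether the order's price is above the current NBB while the circuit breaker is in effect for the covered security. The following Python sketch is a hypothetical illustration of that check, not the Exchange's System logic.

        def rule_201_permits_display_or_execution(
            order_price: float, nbb: float, circuit_breaker_in_effect: bool
        ) -> bool:
            # With no circuit breaker in effect, Rule 201 imposes no price test.
            if not circuit_breaker_in_effect:
                return True
            # During the restriction, a short sale order generally may not be
            # executed or displayed at a price at or below the current NBB.
            return order_price > nbb

        # Example: with the NBB at $10.00 and the restriction in effect,
        # $10.00 is not permitted but $10.01 is.
        assert rule_201_permits_display_or_execution(10.00, 10.00, True) is False
        assert rule_201_permits_display_or_execution(10.01, 10.00, True) is True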

    Under Rule 11.13(b)(1), an order marked “short” when a short sale price test restriction is in effect is not eligible for routing by the Exchange. If an order is ineligible for routing due to a short sale price test restriction and such order is an Immediate or Cancel (“IOC”) Order 12 or a BZX Market Order,13 then the order will be cancelled. If an order is ineligible for routing due to a short sale price test restriction and such order is a Limit Order,14 the Exchange will post the unfilled balance of the order to the BZX Book,15 subject to the price sliding process as defined in paragraph (g) of Exchange Rule 11.9.16

    12See Exchange Rule 11.9(b)(1).

    13See Exchange Rule 11.9(a)(2). The Exchange also proposes to remove the reference to BZX Market Orders in Rule 11.13(b)(1) as BZX Market Orders with a time-in-force of Day that are ineligible for routing due to a short sale price test restriction pursuant to Rule 201 of Regulation SHO are not cancelled, but rather posted to the BZX Book pursuant to Exchange Rule 11.9(a)(2). All other BZX Market Orders are handled in accordance with Exchange Rule 11.13(a).

    14See Exchange Rule 11.9(a)(1).

    15See Exchange Rule 1.5(e).

    16 In sum, under Exchange Rule 11.9(g), a short sale order that, at the time of entry, could not be executed or displayed in compliance with Rule 201 of Regulation SHO will be re-priced by the System at one minimum price variation above the current NBB (“Permitted Price”). See Exchange Rule 11.9(g) for a full description of the Exchange's Short Sale Price Sliding Process.
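
    In other words, the Short Sale Price Sliding Process re-prices a non-routable short sale order to one minimum price variation above the current NBB. The Python sketch below is a hypothetical numeric illustration that assumes a $0.01 minimum price variation; that value is an assumption for the example and is not taken from the rule text.

        def permitted_price(nbb: float, min_price_variation: float = 0.01) -> float:
            # One minimum price variation above the current national best bid.
            return round(nbb + min_price_variation, 2)

        # Example: a short sale Limit Order entered while the NBB is $10.00 and
        # priced at or below $10.00 would be displayed at $10.01.
        assert permitted_price(10.00) == 10.01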

    The Exchange proposes to specify in Rule 11.13 that orders marked “short” may be eligible for routing by the Exchange when a short sale price test restriction is in effect where the User 17 selects either the Post to Away 18 or ROOC 19 routing options.20 In contrast to all other routing strategies, which are routed to other Trading Centers for immediate execution, the Post to Away and ROOC routing options are orders that are sent to other Trading Centers for posting and/or later execution as further described below. Under the Post to Away routing option, the remainder of a routed order is routed to and posted to the order book of a destination on the System routing table,21 as specified by the User. ROOC is a routing option for orders that the User wishes to designate for participation in the opening, re-opening (following a halt, suspension, or pause), or closing process of a primary listing market other than the Exchange (e.g., the New York Stock Exchange, Inc. (“NYSE”), Nasdaq Stock Market LLC (“Nasdaq”), NYSE MKT LLC, or NYSE Arca, Inc. (“NYSE Arca”)) if received before the opening/re-opening/closing time of such market. If shares remain unexecuted after attempting to execute in the opening, re-opening, or closing process, they are either posted to the BZX Book, executed, or routed to destinations on the System routing table.22 Orders routed pursuant to the Post to Away and ROOC routing options that are identified as “short” are subject to the receiving Trading Center's processes for handling short sale orders in compliance with Rule 201 of Regulation SHO.23

    17See Exchange Rule 1.5(cc).

    18See 11.13(b)(3)(H).

    19See 11.13(b)(3)(N).

    20 The Exchange also proposes to specify within Rule 11.13(b)(1) that the short sale price test restriction is declared pursuant to Rule 201 of Regulation SHO.

    21 The term “System routing table” is defined as “the proprietary process for determining the specific trading venues to which the System routes orders and the order in which it routes them.” See Exchange Rule 11.13(b)(3).

    22 Shares returned to the Exchange after routing are handled in accordance with Exchange Rules, including Rule 11.13(a).

    23See, e.g., Nasdaq Rule 4763; NYSE Rule 440B; and Nasdaq's Regulation SHO Frequently Asked Questions (updated March 10, 2011), available at https://nasdaqtrader.com/content/marketregulation/regsho/regshoFAQs.pdf.

    Under Exchange Rule 11.13(b)(1), IOC Orders marked “short” that are not eligible for routing during a short sale price test restriction will continue to be cancelled.24 The unfilled portions of Limit Orders marked “short” that are ineligible for routing due to a short sale price test restriction will continue to be posted to the BZX Book subject to the price sliding process as defined in paragraph (g) of Exchange Rule 11.9.

    24See supra note 13.

    2. Statutory Basis

    The Exchange believes that the proposed rule change is consistent with Section 6(b) of the Act 25 and furthers the objectives of Section 6(b)(5) of the Act 26 because it is designed to promote just and equitable principles of trade, remove impediments to and perfect the mechanism of a free and open market and a national market system, foster cooperation and coordination with persons engaged in facilitating transactions in securities, and, in general, protect investors and the public interest. Specifically, the proposed changes are designed to ensure clarity in the Exchange's rulebook with respect to the routing of orders in compliance with Rule 201 of Regulation SHO. In addition, providing Users the ability to send short sale orders that are routable pursuant to the Post to Away and ROOC routing options provides them additional flexibility with regard to the handling of their orders. The Exchange notes that short sale orders routed pursuant to the Post to Away or ROOC routing options are identified as “short” and, therefore, subject to the receiving Trading Center's processes for handling short sale orders in compliance with Regulation SHO.27 The Exchange also notes that other national securities exchanges do not expressly prohibit the routing of short sale orders. For example, Nasdaq and NYSE Arca allow for the routing of short sale orders generally, and do not limit a short sale order's ability to route to certain routing options.28 Thus, the proposal is directly targeted at removing impediments to and perfecting the mechanism of a free and open market and national market system. The proposed rule change also is designed to support the principles of Section 11A(a)(1) 29 of the Act in that it seeks to assure fair competition among brokers and dealers and among exchange markets.

    25 15 U.S.C. 78f(b).

    26 15 U.S.C. 78f(b)(5).

    27See supra note 23.

    28See e.g., Nasdaq Rules 4702(a) (stating generally that an “[o]rder may . . . may be routed to other market centers for potential execution if designated as `Routable' ”) and 4763 (not prohibiting the routing of a short sale order during a short sale price test). See also e.g., NYSE Arca Rule 7.6P (not prohibiting the routing of a short sale order during a short sale price test).

    29 15 U.S.C. 78k-1(a)(1).

    B. Self-Regulatory Organization's Statement on Burden on Competition

    The Exchange does not believe that the proposed rule change will result in any burden on competition that is not necessary or appropriate in furtherance of the purposes of the Act. The Exchange is simply proposing to reflect in its rules that orders marked “short” may be routed to an away market for execution under two specific routing strategies offered by the Exchange.

    C. Self-Regulatory Organization's Statement on Comments on the Proposed Rule Change Received From Members, Participants, or Others

    Not applicable.

    III. Date of Effectiveness of the Proposed Rule Change and Timing for Commission Action

    Because the foregoing proposed rule change does not: (A) Significantly affect the protection of investors or the public interest; (B) impose any significant burden on competition; and (C) by its terms, become operative for 30 days from the date on which it was filed, or such shorter time as the Commission may designate, it has become effective pursuant to Section 19(b)(3)(A) of the Act 30 and paragraph (f)(6) of Rule 19b-4 thereunder,31 the Exchange has designated this rule filing as non-controversial. The Exchange has given the Commission written notice of its intent to file the proposed rule change, along with a brief description and text of the proposed rule change, at least five business days prior to the date of filing of the proposed rule change, or such shorter time as designated by the Commission.

    30 15 U.S.C. 78s(b)(3)(A).

    31 17 CFR 240.19b-4.

    At any time within 60 days of the filing of the proposed rule change, the Commission summarily may temporarily suspend such rule change if it appears to the Commission that such action is: (1) Necessary or appropriate in the public interest; (2) for the protection of investors; or (3) otherwise in furtherance of the purposes of the Act. If the Commission takes such action, the Commission shall institute proceedings to determine whether the proposed rule should be approved or disapproved.

    IV. Solicitation of Comments

    Interested persons are invited to submit written data, views, and arguments concerning the foregoing, including whether the proposed rule change is consistent with the Act. Comments may be submitted by any of the following methods:

    Electronic Comments

    • Use the Commission's Internet comment form (http://www.sec.gov/rules/sro.shtml); or

    • Send an email to [email protected]. Please include File Number SR-BatsBZX-2016-65 on the subject line.

    Paper Comments

    • Send paper comments in triplicate to Secretary, Securities and Exchange Commission, 100 F Street NE., Washington, DC 20549-1090.

    All submissions should refer to File Number SR-BatsBZX-2016-65. This file number should be included on the subject line if email is used. To help the Commission process and review your comments more efficiently, please use only one method. The Commission will post all comments on the Commission's Internet Web site (http://www.sec.gov/rules/sro.shtml). Copies of the submission, all subsequent amendments, all written statements with respect to the proposed rule change that are filed with the Commission, and all written communications relating to the proposed rule change between the Commission and any person, other than those that may be withheld from the public in accordance with the provisions of 5 U.S.C. 552, will be available for Web site viewing and printing in the Commission's Public Reference Room, 100 F Street NE., Washington, DC 20549 on official business days between the hours of 10:00 a.m. and 3:00 p.m. Copies of such filing also will be available for inspection and copying at the principal office of the Exchange. All comments received will be posted without change; the Commission does not edit personal identifying information from submissions. You should submit only information that you wish to make available publicly. All submissions should refer to File Number SR-BatsBZX-2016-65, and should be submitted on or before November 21, 2016.

    For the Commission, by the Division of Trading and Markets, pursuant to delegated authority.32

    32 17 CFR 200.30-3(a)(12).

    Brent J. Fields, Secretary.
    [FR Doc. 2016-26132 Filed 10-28-16; 8:45 am] BILLING CODE 8011-01-P
    SECURITIES AND EXCHANGE COMMISSION

    [Release No. 34-79150; File No. SR-BatsEDGA-2016-22]

    Self-Regulatory Organizations; Bats EDGA Exchange, Inc.; Notice of Filing and Immediate Effectiveness of a Proposed Rule Change to EDGA Rule 11.11, Routing to Away Trading Centers

    October 25, 2016.

    Pursuant to Section 19(b)(1) of the Securities Exchange Act of 1934 (the “Act”),1 and Rule 19b-4 thereunder,2 notice is hereby given that on October 12, 2016, Bats EDGA Exchange, Inc. (the “Exchange” or “EDGA”) filed with the Securities and Exchange Commission (“Commission”) the proposed rule change as described in Items I, II, and III below, which Items have been prepared by the Exchange. The Exchange has designated this proposal as a “non-controversial” proposed rule change pursuant to Section 19(b)(3)(A) of the Act 3 and Rule 19b-4(f)(6)(iii) thereunder,4 which renders it effective upon filing with the Commission. The Commission is publishing this notice to solicit comments on the proposed rule change from interested persons.

    1 15 U.S.C. 78s(b)(1).

    2 17 CFR 240.19b-4.

    3 15 U.S.C. 78s(b)(3)(A).

    4 17 CFR 240.19b-4(f)(6)(iii).

    I. Self-Regulatory Organization's Statement of the Terms of the Substance of the Proposed Rule Change

    The Exchange filed a proposal to amend Exchange Rule 11.11(a) to describe when an order that includes a Short Sale instruction may be eligible for routing when a short sale price test restriction is in effect.

    The text of the proposed rule change is available at the Exchange's Web site at www.batstrading.com, at the principal office of the Exchange, and at the Commission's Public Reference Room.

    II. Self-Regulatory Organization's Statement of the Purpose of, and Statutory Basis for, the Proposed Rule Change

    In its filing with the Commission, the Exchange included statements concerning the purpose of and basis for the proposed rule change and discussed any comments it received on the proposed rule change. The text of these statements may be examined at the places specified in Item IV below. The Exchange has prepared summaries, set forth in Sections A, B, and C below, of the most significant parts of such statements.

    A. Self-Regulatory Organization's Statement of the Purpose of, and Statutory Basis for, the Proposed Rule Change

    1. Purpose

    The Exchange proposes to amend Exchange Rule 11.11(a) to describe when an order that includes a Short Sale 5 instruction may be eligible for routing when a short sale price test restriction is in effect. Under Rule 201 of Regulation SHO,6 a short sale order in a covered security 7 generally cannot be executed or displayed by a Trading Center,8 such as the Exchange, at a price that is at or below the current national best bid (“NBB”) 9 when a short sale circuit breaker is in effect for the covered security (the “short sale price test restriction”).10

    5See Exchange Rule 11.6(o). The term “short sale” is defined as “any sale of a security which the seller does not own or any sale which is consummated by the delivery of a security borrowed by, or for the account of, the seller.” 17 CFR 242.200(a).

    6See 17 CFR 242.201; Securities Exchange Act Release No. 61595 (February 26, 2010), 75 FR 11232 (March 10, 2010).

    7 Rule 201(a)(1) of Regulation SHO defines the term “covered security” to mean any “NMS stock” as defined under Rule 600(b)(47) of Regulation NMS. Rule 600(b)(47) of Regulation NMS defines an “NMS stock” as “any NMS security other than an option.” Rule 600(b)(46) of Regulation NMS defines an “NMS security” as “any security or class of securities for which transaction reports are collected, processed, and made available pursuant to an effective transaction reporting plan, or an effective national market system plan for reporting transactions in listed options.” 17 CFR 242.201(a)(1); 17 CFR 242.600(b)(46); and 17 CFR 242.600(b)(47).

    8 Rule 201(a)(9) of Regulation SHO states that the term “Trading Center” shall have the same meaning as in Rule 600(b)(78) of Regulation NMS. Rule 600(b)(78) of Regulation NMS defines a “Trading Center” as “a national securities exchange or national securities association that operates an SRO trading facility, an alternative trading system, an exchange market maker, an OTC market maker, or any other broker or dealer that executes orders internally by trading as principal or crossing orders as agent.” 17 CFR 242.200(a)(9); 17 CFR 242.600(b)(78).

    9 17 CFR 242.201(a)(4); 17 CFR 242.600(b)(42).

    10 17 CFR 242.201(b)(1).

    Under Rule 11.11(a), an order that includes a Short Sale instruction when a short sale price test restriction pursuant to Rule 201 of Regulation SHO is in effect is not eligible for routing by the Exchange. If an order is ineligible for routing due to a Short Sale Circuit Breaker 11 being in effect and such order contains a Time-in-Force of Immediate-or-Cancel (“IOC”),12 then the order will be cancelled. For any other order ineligible for routing due to a Short Sale Circuit Breaker being in effect, the Exchange will post the unfilled balance of the order to the EDGA Book,13 treat the order as if it included a Book Only 14 or Post Only 15 instruction, and subject it to the Re-Pricing Instructions to Comply with Rule 201 of Regulation SHO, as described in Exchange Rule 11.6(l)(2),16 unless the User has elected the order Cancel Back as described in Exchange Rule 11.6(b).

    11 In order to use consistent terminology, the Exchange proposes to replace the term “short sale price test restriction” with “Short Sale Circuit Breaker” within the first sentence of Rule 11.11(a).

    12See Exchange Rule 11.6(q)(1).

    13See Exchange Rule 1.5(d).

    14See Exchange Rule 11.6(n)(3).

    15See Exchange Rule 11.6(n)(4).

    16 In sum, under Exchange Rule 11.6(l)(2), an order to sell with a Short Sale instruction that, at the time of entry, could not be executed or displayed in compliance with Rule 201 of Regulation SHO will be re-priced by the System at the Permitted Price. See Exchange Rule 11.6(l)(2) for a full description of the Exchange's Re-Pricing Instructions to Comply with Rule 201 of Regulation SHO.

    The Exchange proposes to specify in Rule 11.11(a) that orders that include a Short Sale instruction may be eligible for routing by the Exchange when a Short Sale Circuit Breaker is in effect where the User 17 selects the Post to Away 18 routing option. In contrast to all other routing strategies, under which orders are routed to other Trading Centers for immediate execution, an order routed pursuant to the Post to Away routing option is sent to other Trading Centers for posting and/or later execution, as further described below. Under the Post to Away routing option, the remainder of a routed order is routed to and posted to the order book of a destination on the System routing table,19 as specified by the User. Orders routed pursuant to the Post to Away routing option that include a Short Sale instruction are identified as “short” and are subject to the receiving Trading Center's processes for handling short sale orders in compliance with Rule 201 of Regulation SHO.20

    17See Exchange Rule 1.5(cc).

    18See Exchange Rule 11.11(g)(15).

    19 The term “System routing table” is defined as “the proprietary process for determining the specific trading venues to which the System routes orders and the order in which it routes them.” See Exchange Rule 11.11(g).

    20See, e.g., Nasdaq Stock Market LLC (“Nasdaq”) Rule 4763; New York Stock Exchange, Inc. (“NYSE”) Rule 440B; and Nasdaq's Regulation SHO Frequently Asked Questions (updated March 10, 2011), available at https://nasdaqtrader.com/content/marketregulation/regsho/regshoFAQs.pdf.

    Under Exchange Rule 11.11(a), orders that include a Short Sale instruction and a Time-in-Force of IOC that are not eligible for routing during a Short Sale Circuit Breaker will continue to be cancelled. For any other order that includes a Short Sale instruction that is ineligible for routing due to a Short Sale Circuit Breaker being in effect, the Exchange will continue to post the unfilled balance of the order to the EDGA Book, treat the order as if it included a Book Only or Post Only instruction, and subject it to the Re-Pricing Instructions to Comply with Rule 201 of Regulation SHO, as described in Rule 11.6(l)(2), unless the User has elected the order Cancel Back as described in Rule 11.6(b).
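    For illustration only, the handling described above can be summarized as the following decision flow. This is a minimal sketch in Python rather than Exchange source code; the function and field names (for example, handle_short_sale_order and cancel_back) are hypothetical, and the sketch abstracts away the details of the Permitted Price calculation under Rule 11.6(l)(2).

from dataclasses import dataclass

@dataclass
class Order:
    short_sale: bool      # order includes a Short Sale instruction (Rule 11.6(o))
    time_in_force: str    # e.g., "IOC"
    routing_option: str   # e.g., "POST_TO_AWAY" or another routing strategy
    cancel_back: bool     # User elected Cancel Back (Rule 11.6(b))

def handle_short_sale_order(order: Order, circuit_breaker_active: bool) -> str:
    # Illustrative decision flow for amended Rule 11.11(a); not Exchange code.
    if not (order.short_sale and circuit_breaker_active):
        return "normal order handling"
    if order.routing_option == "POST_TO_AWAY":
        # Eligible for routing: the order is sent to the away Trading Center's
        # book, identified as "short," and handled under that Trading Center's
        # Rule 201 procedures.
        return "route to away Trading Center, marked short"
    if order.time_in_force == "IOC":
        return "cancel"
    if order.cancel_back:
        return "cancel back to the User"
    # Otherwise, post the unfilled balance to the EDGA Book, treat the order as
    # if it included a Book Only or Post Only instruction, and re-price it
    # under Rule 11.6(l)(2).
    return "post to EDGA Book and re-price at the Permitted Price"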

    2. Statutory Basis

    The Exchange believes that the proposed rule change is consistent with Section 6(b) of the Act 21 and furthers the objectives of Section 6(b)(5) of the Act 22 because it is designed to promote just and equitable principles of trade, remove impediments to and perfect the mechanism of a free and open market and a national market system, foster cooperation and coordination with persons engaged in facilitating transactions in securities, and, in general, protect investors and the public interest. Specifically, the proposed changes are designed to ensure clarity in the Exchange's rulebook with respect to the routing of orders in compliance with Rule 201 of Regulation SHO. In addition, permitting Users to send short sale orders that are routable pursuant to the Post to Away routing option provides them additional flexibility with regard to the handling of their orders. The Exchange notes that orders that include a Short Sale instruction routed pursuant to the Post to Away routing option are identified as “short” and, therefore, subject to the receiving Trading Center's processes for handling short sale orders in compliance with Regulation SHO.23 The Exchange also notes that other national securities exchanges do not expressly prohibit the routing of short sale orders. For example, Nasdaq and NYSE Arca, Inc. (“NYSE Arca”) allow for the routing of short sale orders generally, and do not limit a short sale order's ability to route to certain routing options.24 Thus, the proposal is directly targeted at removing impediments to and perfecting the mechanism of a free and open market and a national market system. The proposed rule change also is designed to support the principles of Section 11A(a)(1) 25 of the Act in that it seeks to assure fair competition among brokers and dealers and among exchange markets.

    21 15 U.S.C. 78f(b).

    22 15 U.S.C. 78f(b)(5).

    23See supra note 20.

    24See, e.g., Nasdaq Rules 4702(a) (stating generally that an “[o]rder may . . . may be routed to other market centers for potential execution if designated as `Routable' ”) and 4763 (not prohibiting the routing of a short sale order during a short sale price test). See also, e.g., NYSE Arca Rule 7.6P (not prohibiting the routing of a short sale order during a short sale price test).

    25 15 U.S.C. 78k-1(a)(1).

    B. Self-Regulatory Organization's Statement on Burden on Competition

    The Exchange does not believe that the proposed rule change will result in any burden on competition that is not necessary or appropriate in furtherance of the purposes of the Act. The Exchange is simply proposing to reflect in its rules that orders that include a Short Sale instruction may be routed to an away market for execution under one specific routing strategy offered by the Exchange.

    C. Self-Regulatory Organization's Statement on Comments on the Proposed Rule Change Received From Members, Participants, or Others

    Not applicable.

    III. Date of Effectiveness of the Proposed Rule Change and Timing for Commission Action

    Because the foregoing proposed rule change does not: (A) Significantly affect the protection of investors or the public interest; (B) impose any significant burden on competition; and (C) by its terms, become operative for 30 days from the date on which it was filed or such shorter time as the Commission may designate, it has become effective pursuant to Section 19(b)(3)(A) of the Act 26 and paragraph (f)(6) of Rule 19b-4 thereunder.27 The Exchange has designated this rule filing as non-controversial. The Exchange has given the Commission written notice of its intent to file the proposed rule change, along with a brief description and text of the proposed rule change, at least five business days prior to the date of filing of the proposed rule change, or such shorter time as designated by the Commission.

    26 15 U.S.C. 78s(b)(3)(A).

    27 17 CFR 240.19b-4.

    At any time within 60 days of the filing of the proposed rule change, the Commission summarily may temporarily suspend such rule change if it appears to the Commission that such action is: (1) Necessary or appropriate in the public interest; (2) for the protection of investors; or (3) otherwise in furtherance of the purposes of the Act. If the Commission takes such action, the Commission shall institute proceedings to determine whether the proposed rule should be approved or disapproved.

    IV. Solicitation of Comments

    Interested persons are invited to submit written data, views, and arguments concerning the foregoing, including whether the proposed rule change is consistent with the Act. Comments may be submitted by any of the following methods:

    Electronic Comments

    • Use the Commission's Internet comment form (http://www.sec.gov/rules/sro.shtml); or

    • Send an email to rule-comments@sec.gov. Please include File Number SR-BatsEDGA-2016-22 on the subject line.

    Paper Comments

    • Send paper comments in triplicate to Secretary, Securities and Exchange Commission, 100 F Street NE., Washington, DC 20549-1090.

    All submissions should refer to File Number SR-BatsEDGA-2016-22. This file number should be included on the subject line if email is used. To help the Commission process and review your comments more efficiently, please use only one method. The Commission will post all comments on the Commission's Internet Web site (http://www.sec.gov/rules/sro.shtml). Copies of the submission, all subsequent amendments, all written statements with respect to the proposed rule change that are filed with the Commission, and all written communications relating to the proposed rule change between the Commission and any person, other than those that may be withheld from the public in accordance with the provisions of 5 U.S.C. 552, will be available for Web site viewing and printing in the Commission's Public Reference Room, 100 F Street NE., Washington, DC 20549 on official business days between the hours of 10:00 a.m. and 3:00 p.m. Copies of such filing also will be available for inspection and copying at the principal office of the Exchange. All comments received will be posted without change; the Commission does not edit personal identifying information from submissions. You should submit only information that you wish to make available publicly. All submissions should refer to File Number SR-BatsEDGA-2016-22, and should be submitted on or before November 21, 2016.

    For the Commission, by the Division of Trading and Markets, pursuant to delegated authority.28

    28 17 CFR 200.30-3(a)(12).

    Brent J. Fields, Secretary.
    [FR Doc. 2016-26133 Filed 10-28-16; 8:45 am] BILLING CODE 8011-01-P
    SECURITIES AND EXCHANGE COMMISSION

    [Release No. 34-79154; File No. SR-BX-2016-054]

    Self-Regulatory Organizations; NASDAQ BX, Inc.; Notice of Filing and Immediate Effectiveness of Proposed Rule Change To Add Commentary .14 to Rule 4770 (Compliance With Regulation NMS Plan To Implement a Tick Size Pilot)

    October 25, 2016.

    Pursuant to Section 19(b)(1) of the Securities Exchange Act of 1934 (“Act”),1 and Rule 19b-4 thereunder,2 notice is hereby given that on October 17, 2016, NASDAQ BX, Inc. (“BX” or “Exchange”) filed with the Securities and Exchange Commission (“SEC” or “Commission”) the proposed rule change as described in Items I and II, below, which Items have been prepared by the Exchange. The Commission is publishing this notice to solicit comments on the proposed rule change from interested persons.

    1 15 U.S.C. 78s(b)(1).

    2 17 CFR 240.19b-4.

    I. Self-Regulatory Organization's Statement of the Terms of Substance of the Proposed Rule Change

    The Exchange proposes to add Commentary .14 to Rule 4770 (Compliance with Regulation NMS Plan to Implement a Tick Size Pilot) to provide the SEC with notice of its efforts to re-program its systems to eliminate a re-pricing functionality for certain orders in Test Group Three securities in connection with the Regulation NMS Plan to Implement a Tick Size Pilot Program (“Plan” or “Pilot”).3 BX also proposes to re-number current Commentary .12 relating to the Block Size exception to Commentary .13 as a technical correction.

    3See Securities Exchange Act Release No. 74892 (May 6, 2015), 80 FR 27513 (May 13, 2015) (“Approval Order”).

    The text of the proposed rule change is set forth below. Proposed new language is in italics; deleted text is in brackets.

    NASDAQ BX Rules 4770. Compliance With Regulation NMS Plan To Implement a Tick Size Pilot

    (a) through (d) No Change.

    Commentary: .01-.12 No change.

    .1[2]3 For purposes of qualifying for the Block Size exception under paragraph (c)(3)(D)(iii) of this Rule, the Order must have a size of 5,000 shares or more and the resulting execution upon entry must have a size of 5,000 shares or more in aggregate.

    .14 Until October 31, 2016, the treatment of Price to Comply Orders, Price to Display Orders, Non-Displayed Orders, and Post-Only Orders that are entered through the OUCH or FLITE protocols in Test Group Three securities shall be as follows:

    Following entry, and if market conditions allow, a Price to Comply Order in a Test Group Three Pilot Security will be adjusted repeatedly in accordance with changes to the NBBO until such time as the Price to Comply Order is able to be ranked and displayed at its original entered limit price.

    Following entry, and if market conditions allow, a Price to Display Order in a Test Group Three Pilot Security will be adjusted repeatedly in accordance with changes to the NBBO until such time as the Price to Display Order is able to be ranked and displayed at its original entered limit price.

    Following entry, and if market conditions allow, a Non-Displayed Order in a Test Group Three Pilot Security will be adjusted repeatedly in accordance with changes to the NBBO up (down) to the Order's limit price.

    Following entry, and if market conditions allow, the Post-Only Order in a Test Group Three Pilot Security will be adjusted repeatedly in accordance with changes to the NBBO or the best price on the Exchange Book, as applicable until such time as the Post-Only Order is able to be ranked and displayed at its original entered limit price.

    II. Self-Regulatory Organization's Statement of the Purpose of, and Statutory Basis for, the Proposed Rule Change

    In its filing with the Commission, the Exchange included statements concerning the purpose of and basis for the proposed rule change and discussed any comments it received on the proposed rule change. The text of these statements may be examined at the places specified in Item IV below. The Exchange has prepared summaries, set forth in sections A, B, and C below, of the most significant aspects of such statements.

    A. Self-Regulatory Organization's Statement of the Purpose of, and Statutory Basis for, the Proposed Rule Change

    1. Purpose

    On September 7, 2016, the Exchange filed with the Securities and Exchange Commission (“SEC” or “Commission”) a proposed rule change (“Proposal”) to adopt paragraph (d) to Exchange Rule 4770 to describe changes to system functionality necessary to implement the Plan. The Exchange also proposed amendments to Rule 4770(a) and (c) to clarify how the Trade-at exception may be satisfied. The SEC published the Proposal in the Federal Register for notice and comment on September 20, 2016.4 BX subsequently filed three Partial Amendments to clarify aspects of the Proposal. The Commission approved the Proposal, as amended, on October 7, 2016.5

    4See Securities Exchange Act Release No. 78838 (September 14, 2016), 81 FR 64566 (September 20, 2016) (SR-BX-2016-050).

    5See Securities Exchange Act Release No. 79076 (October 7, 2016) (SR-BX-2016-050).

    In SR-BX-2016-050, BX had initially proposed a re-pricing functionality for Price to Comply Orders, Non-Displayed Orders, and Post-Only Orders entered through the OUCH and FLITE protocols in Group Three securities.6 BX subsequently determined that it would not offer this re-pricing functionality for Price to Comply Orders, Non-Displayed Orders, and Post-Only Orders entered through the OUCH and FLITE protocols in Group Three securities. As part of Partial Amendment No. 2 to SR-BX-2016-050, BX proposed to delete the relevant language from Rule 4770 related to this re-pricing functionality.

    6 As originally proposed, Rule 4770(d)(2) stated that Price to Comply Orders in a Test Group Three Pilot Security will be adjusted repeatedly in accordance with changes to the NBBO until such time as the Price to Comply Order is able to be ranked and displayed at its original entered limit price. Rule 4770(d)(3) stated that, if market conditions allow, a Non-Displayed Order in a Test Group Three Pilot Security will be adjusted repeatedly in accordance with changes to the NBBO up (down) to the Order's limit price. Rule 4770(d)(4) stated that, if market conditions allow, the Post-Only Order in a Test Group Three Pilot Security will be adjusted repeatedly in accordance with changes to the NBBO or the best price on the BX Book, as applicable until such time as the Post-Only Order is able to be ranked and displayed at its original entered limit price.

    In that amendment, BX noted that this change would only impact the treatment of Price to Comply Orders, Non-Displayed Orders, and Post-Only orders that are submitted through the OUCH and FLITE protocols in Test Group Three Pilot Securities, as these types of Orders that are currently submitted to BX through the RASH or FIX protocols are already subject to this re-pricing functionality and will remain subject to this functionality under the Pilot.

    In the Amendment, BX further noted that its systems are currently programmed so that Price to Comply Orders, Non-Displayed Orders and Post-Only Orders entered through the OUCH and FLITE protocols in Test Group Three Securities may be adjusted repeatedly to reflect changes to the NBBO and/or the best price on the BX book. BX stated that it is re-programming its systems to remove this functionality for Price to Comply Orders, Non-Displayed Orders and Post-Only Orders entered through the OUCH and FLITE protocols in Test Group Three Securities.7 In the Amendment, BX stated that it anticipated that this re-programming would be completed no later than November 30, 2016. BX indicated that, if it appeared that this functionality would remain operational as of October 17, 2016, it would file a proposed rule change with the SEC and would provide notice to market participants sufficiently in advance of that date to provide effective notice. The rule change and the notice to market participants would describe the current operation of the BX systems in this regard, and the timing related to the re-programming.

    7 BX has become aware that this re-pricing functionality also applies to Price to Display Orders that are entered through the OUCH and FLITE protocols in Test Group Three Securities, and is including those Orders as part of this proposal accordingly. Price to Display Orders will be treated in the same manner as Price to Comply Orders under the re-pricing functionality.

    At this time, BX is still in the process of re-programming its systems to eliminate the re-pricing functionality in Test Group Three securities for Price to Comply Orders, Price to Display Orders, Non-Displayed Orders, and Post-Only Orders that are entered through the OUCH or FLITE protocols. BX anticipates that this re-programming shall be complete on or before October 31, 2016.

    Therefore, the current treatment of Price to Comply Orders, Price to Display Orders, Non-Displayed Orders, and Post-Only Orders that are entered through the OUCH or FLITE protocols in Test Group Three securities shall be as follows:

    Following entry, and if market conditions allow, a Price to Comply Order in a Test Group Three Pilot Security will be adjusted repeatedly in accordance with changes to the NBBO until such time as the Price to Comply Order is able to be ranked and displayed at its original entered limit price.

    Following entry, and if market conditions allow, a Price to Display Order in a Test Group Three Pilot Security will be adjusted repeatedly in accordance with changes to the NBBO until such time as the Price to Display Order is able to be ranked and displayed at its original entered limit price.

    Following entry, and if market conditions allow, a Non-Displayed Order in a Test Group Three Pilot Security will be adjusted repeatedly in accordance with changes to the NBBO up (down) to the Order's limit price.

    Following entry, and if market conditions allow, a Post-Only Order in a Test Group Three Pilot Security will be adjusted repeatedly in accordance with changes to the NBBO or the best price on the BX Book, as applicable until such time as the Post-Only Order is able to be ranked and displayed at its original entered limit price.
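    As an editorial illustration only, the repeated adjustment described in the paragraphs above can be sketched as follows in Python. The helper names (adjusted_price, best_permissible, reprice_on_nbbo_changes) are hypothetical; best_permissible stands in for whatever price the NBBO (or, for Post-Only Orders, the best price on the BX Book) permits at a given moment. The sketch shows only the general behavior that each adjustment moves the order toward, but never beyond, its original entered limit price.

def adjusted_price(limit_price: float, best_permissible: float) -> float:
    # For a buy order: rest as aggressively as the market currently permits,
    # but never above the original entered limit price.
    return min(limit_price, best_permissible)

def reprice_on_nbbo_changes(limit_price: float, permissible_prices: list[float]) -> list[float]:
    # Apply the adjustment each time the NBBO (or the book) changes; the order
    # keeps moving with the market until it can rest at its limit price.
    return [adjusted_price(limit_price, p) for p in permissible_prices]

# Example: a buy order with a $10.20 limit as the permissible price rises.
print(reprice_on_nbbo_changes(10.20, [10.05, 10.10, 10.15, 10.25]))
# [10.05, 10.1, 10.15, 10.2]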

    In addition to this proposal, BX will also issue an Equity Trader Alert that describes the current operation of the BX systems in this regard, and the timing related to the removal of this re-pricing functionality.8

    8 BX anticipates providing additional specificity to market participants as to the timing of the new functionality at a later date.

    BX also proposes to re-number Commentary .12, which relates to the Block Size exception, to Commentary .13. A previous filing (SR-BX-2016-048) added Commentary to Rule 4770 that resulted in Commentary .11, which addresses the effective date of the Rule, being re-numbered as Commentary .12. BX therefore proposes to re-number the Commentary .12 that addresses the Block Size exception as Commentary .13.

    2. Statutory Basis

    The Exchange believes that its proposal is consistent with Section 6(b) of the Act,9 in general, and furthers the objectives of Section 6(b)(5) of the Act,10 in particular, in that it is designed to promote just and equitable principles of trade, to remove impediments to and perfect the mechanism of a free and open market and a national market system, and, in general to protect investors and the public interest. The purpose of this filing is to inform the SEC and market participants of the status of BX's attempts to re-program its systems to remove the re-pricing functionality in Test Group Three securities for Price to Comply Orders, Price to Display Orders, Non-Displayed Orders, and Post-Only Orders that are entered through the OUCH or FLITE protocols, and the current treatment of such orders pending the removal of this functionality. This proposal is consistent with the Act because it provides the SEC and market participants with notice of BX's efforts in this regard, and is being submitted in connection with the statements made by BX in SR-BX-2016-050 in proposing the removal of this functionality.

    9 15 U.S.C. 78f(b).

    10 15 U.S.C. 78f(b)(5).

    BX also believes that the proposal is consistent with the Act because the re-pricing functionality will not significantly impact the data gathered pursuant to the Pilot. BX notes that this re-pricing functionality only affects Price to Comply Orders, Price to Display Orders, Non-Displayed Orders, and Post-Only Orders that are entered through the OUCH or FLITE protocols for Test Group Three securities until the re-pricing functionality is eliminated, and only becomes relevant when an Order in a Test Group Three security would cross a Protected Quotation of another market center. BX has analyzed data relating to the frequency with which Orders in Test Group Three securities are entered with a limit price that would cross a Protected Quotation of another market center, and believes that the re-pricing functionality will be triggered infrequently once Test Group Three becomes operational.11 The Exchange also notes that it is diligently working to re-program its systems to remove this re-pricing functionality, and that it anticipates this re-programming to be complete on or before October 31, 2016.

    11 For example, on September 23, 2016, 0.3% of orders in Test Group Three securities were entered on the NASDAQ Stock Market LLC at a price that crossed the NBBO. BX believes that this number will be even lower starting October 17, 2016, when the $0.05 tick increment for Test Group Three securities is in place.

    B. Self-Regulatory Organization's Statement on Burden on Competition

    The Exchange does not believe that the proposed rule change will impose any burden on competition not necessary or appropriate in furtherance of the purposes of the Act. The purpose of this proposal is to provide the SEC and market participants with notice of BX's efforts to remove its re-pricing functionality in Test Group Three securities for Price to Comply Orders, Price to Display Orders, Non-Displayed Orders, and Post-Only Orders that are entered through the OUCH or FLITE protocols, consistent with its statements in SR-BX-2016-050.

    C. Self-Regulatory Organization's Statement on Comments on the Proposed Rule Change Received From Members, Participants, or Others

    No written comments were either solicited or received.

    III. Date of Effectiveness of the Proposed Rule Change and Timing for Commission Action

    Because the foregoing proposed rule change does not: (i) Significantly affect the protection of investors or the public interest; (ii) impose any significant burden on competition; and (iii) become operative for 30 days from the date on which it was filed, or such shorter time as the Commission may designate, it has become effective pursuant to Section 19(b)(3)(A)(iii) of the Act 12 and subparagraph (f)(6) of Rule 19b-4 thereunder.13

    12 15 U.S.C. 78s(b)(3)(A)(iii).

    13 17 CFR 240.19b-4(f)(6).

    A proposed rule change filed under Rule 19b-4(f)(6) normally does not become operative prior to 30 days after the date of filing. Rule 19b-4(f)(6)(iii), however, permits the Commission to designate a shorter time if such action is consistent with the protection of investors and the public interest. The Exchange requests that the Commission waive the 30-day operative delay contained in Rule 19b-4(f)(6)(iii) so that this proposed change will be operative as of October 17, 2016, the date that Test Group Three securities begin to be subject to the quoting and trading restrictions of the Plan and, therefore, the relevant language in Rule 4770.

    The Commission believes that waiving the 30-day operative delay is consistent with the protection of investors and the public interest because it will allow the Exchange to implement the proposed rules immediately thereby preventing delays in the implementation of the Plan. The Commission notes that the Pilot started implementation on October 3, 2016, Test Group Three securities were phased into the Pilot starting on October 17, 2016, and waiving the 30-day operative delay would ensure that the rules of the Exchange would be in place during implementation. Therefore, the Commission hereby waives the 30-day operative delay and designates the proposed rule change to be operative upon filing with the Commission.14

    14 For purposes only of waiving the operative delay for this proposal, the Commission has considered the proposed rule's impact on efficiency, competition, and capital formation. See 15 U.S.C. 78c(f).

    At any time within 60 days of the filing of the proposed rule change, the Commission summarily may temporarily suspend such rule change if it appears to the Commission that such action is: (i) Necessary or appropriate in the public interest; (ii) for the protection of investors; or (iii) otherwise in furtherance of the purposes of the Act. If the Commission takes such action, the Commission shall institute proceedings to determine whether the proposed rule should be approved or disapproved.

    IV. Solicitation of Comments

    Interested persons are invited to submit written data, views, and arguments concerning the foregoing, including whether the proposed rule change is consistent with the Act. Comments may be submitted by any of the following methods:

    Electronic Comments

    • Use the Commission's Internet comment form (http://www.sec.gov/rules/sro.shtml); or

    • Send an email to rule-comments@sec.gov. Please include File Number SR-BX-2016-054 on the subject line.

    Paper Comments

    • Send paper comments in triplicate to Secretary, Securities and Exchange Commission, 100 F Street NE., Washington, DC 20549-1090.

    All submissions should refer to File Number SR-BX-2016-054. This file number should be included on the subject line if email is used. To help the Commission process and review your comments more efficiently, please use only one method. The Commission will post all comments on the Commission's Internet Web site (http://www.sec.gov/rules/sro.shtml). Copies of the submission, all subsequent amendments, all written statements with respect to the proposed rule change that are filed with the Commission, and all written communications relating to the proposed rule change between the Commission and any person, other than those that may be withheld from the public in accordance with the provisions of 5 U.S.C. 552, will be available for Web site viewing and printing in the Commission's Public Reference Room, 100 F Street NE., Washington, DC 20549, on official business days between the hours of 10:00 a.m. and 3:00 p.m. Copies of the filing also will be available for inspection and copying at the principal office of the Exchange. All comments received will be posted without change; the Commission does not edit personal identifying information from submissions. You should submit only information that you wish to make available publicly. All submissions should refer to File Number SR-BX-2016-054, and should be submitted on or before November 21, 2016.

    For the Commission, by the Division of Trading and Markets, pursuant to delegated authority.15

    15 17 CFR 200.30-3(a)(12).

    Brent J. Fields, Secretary.
    [FR Doc. 2016-26137 Filed 10-28-16; 8:45 am] BILLING CODE 8011-01-P
    SECURITIES AND EXCHANGE COMMISSION

    [Release No. 34-79155; File No. SR-NASDAQ-2016-143]

    Self-Regulatory Organizations; The NASDAQ Stock Market LLC; Notice of Filing and Immediate Effectiveness of Proposed Rule Change To Add Commentary .14 to Rule 4770 (Compliance With Regulation NMS Plan To Implement a Tick Size Pilot)

    October 25, 2016.

    Pursuant to Section 19(b)(1) of the Securities Exchange Act of 1934 (“Act”),1 and Rule 19b-4 thereunder,2 notice is hereby given that on October 17, 2016, The NASDAQ Stock Market LLC (“Nasdaq” or “Exchange”) filed with the Securities and Exchange Commission (“SEC” or “Commission”) the proposed rule change as described in Items I and II, below, which Items have been prepared by the Exchange. The Commission is publishing this notice to solicit comments on the proposed rule change from interested persons.3

    1 15 U.S.C. 78s(b)(1).

    2 17 CFR 240.19b-4.

    3 The text of the proposed rule change is set forth below. Proposed new language is in italics; deleted text is in brackets.

    I. Self-Regulatory Organization's Statement of the Terms of Substance of the Proposed Rule Change

    The Exchange proposes to add Commentary .14 to Rule 4770 (Compliance with Regulation NMS Plan to Implement a Tick Size Pilot) to provide the SEC with notice of its efforts to re-program its systems to eliminate a re-pricing functionality for certain orders in Test Group Three securities in connection with the Regulation NMS Plan to Implement a Tick Size Pilot Program (“Plan” or “Pilot”).4 Nasdaq also proposes to re-number current Commentary .12 relating to the Block Size exception to Commentary .13 as a technical correction.

    4See Securities Exchange Act Release No. 74892 (May 6, 2015), 80 FR 27513 (May 13, 2015) (“Approval Order”).

    The NASDAQ Stock Market Rules 4770. Compliance With Regulation NMS Plan To Implement a Tick Size Pilot

    (a) through (d) No Change.

    Commentary: .01-.12 No change.

    .1[2]3 For purposes of qualifying for the Block Size exception under paragraph (c)(3)(D)(iii) of this Rule, the Order must have a size of 5,000 shares or more and the resulting execution upon entry must have a size of 5,000 shares or more in aggregate.

    .14 Until October 31, 2016, the treatment of Price to Comply Orders, Price to Display Orders, Non-Displayed Orders, and Post-Only Orders that are entered through the OUCH or FLITE protocols in Test Group Three securities shall be as follows:

    Following entry, and if market conditions allow, a Price to Comply Order in a Test Group Three Pilot Security will be adjusted repeatedly in accordance with changes to the NBBO until such time as the Price to Comply Order is able to be ranked and displayed at its original entered limit price.

    Following entry, and if market conditions allow, a Price to Display Order in a Test Group Three Pilot Security will be adjusted repeatedly in accordance with changes to the NBBO until such time as the Price to Display Order is able to be ranked and displayed at its original entered limit price.

    Following entry, and if market conditions allow, a Non-Displayed Order in a Test Group Three Pilot Security will be adjusted repeatedly in accordance with changes to the NBBO up (down) to the Order's limit price.

    Following entry, and if market conditions allow, the Post-Only Order in a Test Group Three Pilot Security will be adjusted repeatedly in accordance with changes to the NBBO or the best price on the Nasdaq Book, as applicable until such time as the Post-Only Order is able to be ranked and displayed at its original entered limit price.

    II. Self-Regulatory Organization's Statement of the Purpose of, and Statutory Basis for, the Proposed Rule Change

    In its filing with the Commission, the Exchange included statements concerning the purpose of and basis for the proposed rule change and discussed any comments it received on the proposed rule change. The text of these statements may be examined at the places specified in Item IV below. The Exchange has prepared summaries, set forth in sections A, B, and C below, of the most significant aspects of such statements.

    A. Self-Regulatory Organization's Statement of the Purpose of, and Statutory Basis for, the Proposed Rule Change

    1. Purpose

    On September 7, 2016, The Nasdaq Stock Market LLC (“Nasdaq” or “Exchange”) filed with the Securities and Exchange Commission (“SEC” or “Commission”) a proposed rule change (“Proposal”) to adopt paragraph (d) and Commentary .12 to Exchange Rule 4770 to describe changes to system functionality necessary to implement the Plan. The Exchange also proposed amendments to Rule 4770(a) and (c) to clarify how the Trade-at exception may be satisfied. The SEC published the Proposal in the Federal Register for notice and comment on September 20, 2016.5 Nasdaq subsequently filed three Partial Amendments to clarify aspects of the Proposal. The Commission approved the Proposal, as amended, on October 7, 2016.6

    5See Securities Exchange Act Release No. 78837 (September 14, 2016), 81 FR 64544 (September 20, 2016) (SR-NASDAQ-2016-126).

    6See Securities Exchange Act Release No. 79075 (October 7, 2016) (SR-NASDAQ-2016-126).

    In SR-NASDAQ-2016-126, Nasdaq had initially proposed a re-pricing functionality for Price to Comply Orders, Non-Displayed Orders, and Post-Only Orders entered through the OUCH and FLITE protocols in Group Three securities.7 Nasdaq subsequently determined that it would not offer this re-pricing functionality for Price to Comply Orders, Non-Displayed Orders, and Post-Only Orders entered through the OUCH and FLITE protocols in Group Three securities. As part of Partial Amendment No. 2 to SR-NASDAQ-2016-126, Nasdaq proposed to delete the relevant language from Rule 4770 related to this re-pricing functionality.

    7 As originally proposed, Rule 4770(d)(2) stated that Price to Comply Orders in a Test Group Three Pilot Security will be adjusted repeatedly in accordance with changes to the NBBO until such time as the Price to Comply Order is able to be ranked and displayed at its original entered limit price. Rule 4770(d)(3) stated that, if market conditions allow, a Non-Displayed Order in a Test Group Three Pilot Security will be adjusted repeatedly in accordance with changes to the NBBO up (down) to the Order's limit price. Rule 4770(d)(4) stated that, if market conditions allow, the Post-Only Order in a Test Group Three Pilot Security will be adjusted repeatedly in accordance with changes to the NBBO or the best price on the Nasdaq Book, as applicable until such time as the Post-Only Order is able to be ranked and displayed at its original entered limit price.

    In that amendment, Nasdaq noted that this change would only impact the treatment of Price to Comply Orders, Non-Displayed Orders, and Post-Only orders that are submitted through the OUCH and FLITE protocols in Test Group Three Pilot Securities, as these types of Orders that are currently submitted to Nasdaq through the RASH, QIX or FIX protocols are already subject to this re-pricing functionality and will remain subject to this functionality under the Pilot.

    In the Amendment, Nasdaq further noted that its systems are currently programmed so that Price to Comply Orders, Non-Displayed Orders and Post-Only Orders entered through the OUCH and FLITE protocols in Test Group Three Securities may be adjusted repeatedly to reflect changes to the NBBO and/or the best price on the Nasdaq book. Nasdaq stated that it is re-programming its systems to remove this functionality for Price to Comply Orders, Non-Displayed Orders and Post-Only Orders entered through the OUCH and FLITE protocols in Test Group Three Securities.8 In the Amendment, Nasdaq stated that it anticipated that this re-programming would be completed no later than November 30, 2016. Nasdaq indicated that, if it appeared that this functionality would remain operational as of October 17, 2016, it would file a proposed rule change with the SEC and would provide notice to market participants sufficiently in advance of that date to provide effective notice. The rule change and the notice to market participants would describe the current operation of the Nasdaq systems in this regard, and the timing related to the re-programming.

    8 Nasdaq has become aware that this re-pricing functionality also applies to Price to Display Orders that are entered through the OUCH and FLITE protocols in Test Group Three Securities, and is including those Orders as part of this proposal accordingly. Price to Display Orders will be treated in the same manner as Price to Comply Orders under the re-pricing functionality.

    At this time, Nasdaq is still in the process of re-programming its systems to eliminate the re-pricing functionality in Test Group Three securities for Price to Comply Orders, Price to Display Orders, Non-Displayed Orders, and Post-Only Orders that are entered through the OUCH or FLITE protocols. Nasdaq anticipates that this re-programming shall be complete on or before October 31, 2016.

    Therefore, the current treatment of Price to Comply Orders, Price to Display Orders, Non-Displayed Orders, and Post-Only Orders that are entered through the OUCH or FLITE protocols in Test Group Three securities shall be as follows:

    Following entry, and if market conditions allow, a Price to Comply Order in a Test Group Three Pilot Security will be adjusted repeatedly in accordance with changes to the NBBO until such time as the Price to Comply Order is able to be ranked and displayed at its original entered limit price.

    Following entry, and if market conditions allow, a Price to Display Order in a Test Group Three Pilot Security will be adjusted repeatedly in accordance with changes to the NBBO until such time as the Price to Display Order is able to be ranked and displayed at its original entered limit price.

    Following entry, and if market conditions allow, a Non-Displayed Order in a Test Group Three Pilot Security will be adjusted repeatedly in accordance with changes to the NBBO up (down) to the Order's limit price.

    Following entry, and if market conditions allow, a Post-Only Order in a Test Group Three Pilot Security will be adjusted repeatedly in accordance with changes to the NBBO or the best price on the Nasdaq Book, as applicable until such time as the Post-Only Order is able to be ranked and displayed at its original entered limit price.

    In addition to this proposal, Nasdaq will also issue an Equity Trader Alert that describes the current operation of the Nasdaq systems in this regard, and the timing related to the removal of this re-pricing functionality.9

    9 Nasdaq anticipates providing additional specificity to market participants as to the timing of the new functionality at a later date.

    Nasdaq also proposes to re-number Commentary .12, which relates to the Block Size exception, to Commentary .13. A previous filing (SR-NASDAQ-2016-123) added Commentary to Rule 4770 that resulted in Commentary .11, which addresses the effective date of the Rule, being re-numbered as Commentary .12. Nasdaq therefore proposes to re-number the Commentary .12 that addresses the Block Size exception as Commentary .13.

    2. Statutory Basis

    The Exchange believes that its proposal is consistent with Section 6(b) of the Act,10 in general, and furthers the objectives of Section 6(b)(5) of the Act,11 in particular, in that it is designed to promote just and equitable principles of trade, to remove impediments to and perfect the mechanism of a free and open market and a national market system, and, in general to protect investors and the public interest. The purpose of this filing is to inform the SEC and market participants of the status of Nasdaq's attempts to re-program its systems to remove the re-pricing functionality in Test Group Three securities for Price to Comply Orders, Price to Display Orders, Non-Displayed Orders, and Post-Only Orders that are entered through the OUCH or FLITE protocols, and the current treatment of such orders pending the removal of this functionality. This proposal is consistent with the Act because it provides the SEC and market participants with notice of Nasdaq's efforts in this regard, and is being submitted in connection with the statements made by Nasdaq in SR-NASDAQ-2016-126 in proposing the removal of this functionality.

    10 15 U.S.C. 78f(b).

    11 15 U.S.C. 78f(b)(5).

    Nasdaq also believes that the proposal is consistent with the Act because the re-pricing functionality will not significantly impact the data gathered pursuant to the Pilot. Nasdaq notes that this re-pricing functionality only affects Price to Comply Orders, Price to Display Orders, Non-Displayed Orders, and Post-Only Orders that are entered through the OUCH or FLITE protocols for Test Group Three securities until the re-pricing functionality is eliminated, and only becomes relevant when an Order in a Test Group Three security would cross a Protected Quotation of another market center. Nasdaq has analyzed data relating to the frequency with which Orders in Test Group Three securities are entered with a limit price that would cross a Protected Quotation of another market center, and believes that the re-pricing functionality will be triggered infrequently once Test Group Three becomes operational.12 The Exchange also notes that it is diligently working to re-program its systems to remove this re-pricing functionality, and that it anticipates this re-programming to be complete on or before October 31, 2016.

    12 For example, on September 23, 2016, 0.3% of orders in Test Group Three securities were entered on Nasdaq at a price that crossed the NBBO. Nasdaq believes that this number will be even lower starting October 17, 2016, when the $0.05 tick increment for Test Group Three securities is in place.

    B. Self-Regulatory Organization's Statement on Burden on Competition

    The Exchange does not believe that the proposed rule change will impose any burden on competition not necessary or appropriate in furtherance of the purposes of the Act. The purpose of this proposal is to provide the SEC and market participants with notice of Nasdaq's efforts to remove its re-pricing functionality in Test Group Three securities for Price to Comply Orders, Price to Display Orders, Non-Displayed Orders, and Post-Only Orders that are entered through the OUCH or FLITE protocols, consistent with its statements in SR-NASDAQ-2016-126.

    C. Self-Regulatory Organization's Statement on Comments on the Proposed Rule Change Received From Members, Participants, or Others

    No written comments were either solicited or received.

    III. Date of Effectiveness of the Proposed Rule Change and Timing for Commission Action

    Because the foregoing proposed rule change does not: (i) Significantly affect the protection of investors or the public interest; (ii) impose any significant burden on competition; and (iii) become operative for 30 days from the date on which it was filed, or such shorter time as the Commission may designate, it has become effective pursuant to Section 19(b)(3)(A)(iii) of the Act 13 and subparagraph (f)(6) of Rule 19b-4 thereunder.14

    13 15 U.S.C. 78s(b)(3)(A)(iii).

    14 17 CFR 240.19b-4(f)(6).

    A proposed rule change filed under Rule 19b-4(f)(6) normally does not become operative prior to 30 days after the date of filing. Rule 19b-4(f)(6)(iii), however, permits the Commission to designate a shorter time if such action is consistent with the protection of investors and the public interest. The Exchange requests that the Commission waive the 30-day operative delay contained in Rule 19b-4(f)(6)(iii) so that this proposed change will be operative as of October 17, 2016, the date that Test Group Three securities begin to be subject to the quoting and trading restrictions of the Plan and, therefore, the relevant language in Rule 4770.

    The Commission believes that waiving the 30-day operative delay is consistent with the protection of investors and the public interest because it will allow the Exchange to implement the proposed rules immediately thereby preventing delays in the implementation of the Plan. The Commission notes that the Pilot started implementation on October 3, 2016, Test Group Three securities were phased into the Pilot starting on October 17, 2016, and waiving the 30-day operative delay would ensure that the rules of the Exchange would be in place during implementation. Therefore, the Commission hereby waives the 30-day operative delay and designates the proposed rule change to be operative upon filing with the Commission.15

    15 For purposes only of waiving the operative delay for this proposal, the Commission has considered the proposed rule's impact on efficiency, competition, and capital formation. See 15 U.S.C. 78c(f).

    At any time within 60 days of the filing of the proposed rule change, the Commission summarily may temporarily suspend such rule change if it appears to the Commission that such action is: (i) Necessary or appropriate in the public interest; (ii) for the protection of investors; or (iii) otherwise in furtherance of the purposes of the Act. If the Commission takes such action, the Commission shall institute proceedings to determine whether the proposed rule should be approved or disapproved.

    IV. Solicitation of Comments

    Interested persons are invited to submit written data, views, and arguments concerning the foregoing, including whether the proposed rule change is consistent with the Act. Comments may be submitted by any of the following methods:

    Electronic Comments

    • Use the Commission's Internet comment form (http://www.sec.gov/rules/sro.shtml); or

    • Send an email to rule-comments@sec.gov. Please include File Number SR-NASDAQ-2016-143 on the subject line.

    Paper Comments

    • Send paper comments in triplicate to Secretary, Securities and Exchange Commission, 100 F Street NE., Washington, DC 20549-1090.

    All submissions should refer to File Number SR-NASDAQ-2016-143. This file number should be included on the subject line if email is used. To help the Commission process and review your comments more efficiently, please use only one method. The Commission will post all comments on the Commission's Internet Web site (http://www.sec.gov/rules/sro.shtml). Copies of the submission, all subsequent amendments, all written statements with respect to the proposed rule change that are filed with the Commission, and all written communications relating to the proposed rule change between the Commission and any person, other than those that may be withheld from the public in accordance with the provisions of 5 U.S.C. 552, will be available for Web site viewing and printing in the Commission's Public Reference Room, 100 F Street NE., Washington, DC 20549, on official business days between the hours of 10:00 a.m. and 3:00 p.m. Copies of the filing also will be available for inspection and copying at the principal office of the Exchange. All comments received will be posted without change; the Commission does not edit personal identifying information from submissions. You should submit only information that you wish to make available publicly. All submissions should refer to File Number SR-NASDAQ-2016-143, and should be submitted on or before November 21, 2016.

    For the Commission, by the Division of Trading and Markets, pursuant to delegated authority.16

    16 17 CFR 200.30-3(a)(12).

    Brent J. Fields, Secretary.
    [FR Doc. 2016-26138 Filed 10-28-16; 8:45 am] BILLING CODE 8011-01-P
    SECURITIES AND EXCHANGE COMMISSION

    [Release No. 34-79156; File No. SR-Phlx-2016-106]

    Self-Regulatory Organizations; NASDAQ PHLX LLC; Notice of Filing and Immediate Effectiveness of Proposed Rule Change to Add Commentary .14 to Rule 3317 (Compliance With Regulation NMS Plan To Implement a Tick Size Pilot)

    October 25, 2016.

    Pursuant to Section 19(b)(1) of the Securities Exchange Act of 1934 (“Act”),1 and Rule 19b-4 thereunder,2 notice is hereby given that on October 17, 2016, NASDAQ PHLX LLC (“Phlx” or “Exchange”) filed with the Securities and Exchange Commission (“SEC” or “Commission”) the proposed rule change as described in Items I and II, below, which Items have been prepared by the Exchange. The Commission is publishing this notice to solicit comments on the proposed rule change from interested persons.

    1 15 U.S.C. 78s(b)(1).

    2 17 CFR 240.19b-4.

    I. Self-Regulatory Organization's Statement of the Terms of Substance of the Proposed Rule Change

    The Exchange proposes to add Commentary .14 to Rule 3317 (Compliance with Regulation NMS Plan to Implement a Tick Size Pilot) to provide the SEC with notice of its efforts to re-program its systems to eliminate a re-pricing functionality for certain orders in Test Group Three securities in connection with the Regulation NMS Plan to Implement a Tick Size Pilot Program (“Plan” or “Pilot”).3 Phlx also proposes to re-number current Commentary .12 relating to the Block Size exception to Commentary .13 as a technical correction.

    3See Securities Exchange Act Release No. 74892 (May 6, 2015), 80 FR 27513 (May 13, 2015) (“Approval Order”).

    The text of the proposed rule change is set forth below. Proposed new language is in italics; deleted text is in brackets.

    NASDAQ PHLX Rules 3317. Compliance with Regulation NMS Plan to Implement a Tick Size Pilot

    (a) through (d) No Change.

    Commentary: .01-.12 No change.

    .1[2]3 For purposes of qualifying for the Block Size exception under paragraph (c)(3)(D)(iii) of this Rule, the Order must have a size of 5,000 shares or more and the resulting execution upon entry must have a size of 5,000 shares or more in aggregate.

    .14 Until October 31, 2016, the treatment of Price to Comply Orders, Price to Display Orders, Non-Displayed Orders, and Post-Only Orders that are entered through the OUCH or FLITE protocols in Test Group Three securities shall be as follows:

    Following entry, and if market conditions allow, a Price to Comply Order in a Test Group Three Pilot Security will be adjusted repeatedly in accordance with changes to the NBBO until such time as the Price to Comply Order is able to be ranked and displayed at its original entered limit price.

    Following entry, and if market conditions allow, a Price to Display Order in a Test Group Three Pilot Security will be adjusted repeatedly in accordance with changes to the NBBO until such time as the Price to Display Order is able to be ranked and displayed at its original entered limit price.

    Following entry, and if market conditions allow, a Non-Displayed Order in a Test Group Three Pilot Security will be adjusted repeatedly in accordance with changes to the NBBO up (down) to the Order's limit price.

    Following entry, and if market conditions allow, the Post-Only Order in a Test Group Three Pilot Security will be adjusted repeatedly in accordance with changes to the NBBO or the best price on the Exchange Book, as applicable until such time as the Post-Only Order is able to be ranked and displayed at its original entered limit price.

    II. Self-Regulatory Organization's Statement of the Purpose of, and Statutory Basis for, the Proposed Rule Change

    In its filing with the Commission, the Exchange included statements concerning the purpose of and basis for the proposed rule change and discussed any comments it received on the proposed rule change. The text of these statements may be examined at the places specified in Item IV below. The Exchange has prepared summaries, set forth in sections A, B, and C below, of the most significant aspects of such statements.

    A. Self-Regulatory Organization's Statement of the Purpose of, and Statutory Basis for, the Proposed Rule Change

    1. Purpose

    On September 7, 2016, the Exchange filed with the Securities and Exchange Commission (“SEC” or “Commission”) a proposed rule change (“Proposal”) to adopt paragraph (d) and Commentary .12 to Exchange Rule 3317 to describe changes to system functionality necessary to implement the Plan. The Exchange also proposed amendments to Rule 3317(a) and (c) to clarify how the Trade-at exception may be satisfied. The SEC published the Proposal in the Federal Register for notice and comment on September 20, 2016.4 Phlx subsequently filed three Partial Amendments to clarify aspects of the Proposal. The Commission approved the Proposal, as amended, on October 7, 2016.5

    4See Securities Exchange Act Release No. 78835 (September 14, 2016), 81 FR 64552 (September 20, 2016) (SR-Phlx-2016-92).

    5See Securities Exchange Act Release No. 79074 (October 7, 2016) (SR-Phlx-2016-92).

    In SR-Phlx-2016-92, Phlx had initially proposed a re-pricing functionality for Price to Comply Orders, Non-Displayed Orders, and Post-Only Orders entered through the OUCH and FLITE protocols in Group Three securities.6 Phlx subsequently determined that it would not offer this re-pricing functionality for Price to Comply Orders, Non-Displayed Orders, and Post-Only Orders entered through the OUCH and FLITE protocols in Group Three securities. As part of Partial Amendment No. 2 to SR-Phlx-2016-92, Phlx proposed to delete the relevant language from Rule 3317 related to this re-pricing functionality.

    6 As originally proposed, Rule 3317(d)(2) stated that Price to Comply Orders in a Test Group Three Pilot Security will be adjusted repeatedly in accordance with changes to the NBBO until such time as the Price to Comply Order is able to be ranked and displayed at its original entered limit price. Rule 3317(d)(3) stated that, if market conditions allow, a Non-Displayed Order in a Test Group Three Pilot Security will be adjusted repeatedly in accordance with changes to the NBBO up (down) to the Order's limit price. Rule 3317(d)(4) stated that, if market conditions allow, the Post-Only Order in a Test Group Three Pilot Security will be adjusted repeatedly in accordance with changes to the NBBO or the best price on the Phlx Book, as applicable until such time as the Post-Only Order is able to be ranked and displayed at its original entered limit price.

    In that amendment, Phlx noted that this change would only impact the treatment of Price to Comply Orders, Non-Displayed Orders, and Post-Only orders that are submitted through the OUCH and FLITE protocols in Test Group Three Pilot Securities, as these types of Orders that are currently submitted to Phlx through the RASH or FIX protocols are already subject to this re-pricing functionality and will remain subject to this functionality under the Pilot.

    In the Amendment, Phlx further noted that its systems are currently programmed so that Price to Comply Orders, Non-Displayed Orders and Post-Only Orders entered through the OUCH and FLITE protocols in Test Group Three Securities may be adjusted repeatedly to reflect changes to the NBBO and/or the best price on the Phlx book. Phlx stated that it is re-programming its systems to remove this functionality for Price to Comply Orders, Non-Displayed Orders and Post-Only Orders entered through the OUCH and FLITE protocols in Test Group Three Securities.7 In the Amendment, Phlx stated that it anticipated that this re-programming would be completed no later than November 30, 2016. Phlx indicated that, if it appeared that this functionality would remain operational as of October 17, 2016, it would file a proposed rule change with the SEC and would provide notice to market participants sufficiently in advance of that date to provide effective notice. The rule change and the notice to market participants would describe the current operation of the Phlx systems in this regard, and the timing related to the re-programming.

    7 BX [sic] has become aware that this re-pricing functionality also applies to Price to Display Orders that are entered through the OUCH and FLITE protocols in Test Group Three Securities, and is including those Orders as part of this proposal accordingly. Price to Display Orders will be treated in the same manner as Price to Comply Orders under the re-pricing functionality.

    At this time, Phlx is still in the process of re-programming its systems to eliminate the re-pricing functionality in Test Group Three securities for Price to Comply Orders, Non-Displayed Orders, and Post-Only Orders that are entered through the OUCH or FLITE protocols. Phlx anticipates that this re-programming will be complete on or before October 31, 2016.

    Therefore, the current treatment of Price to Comply Orders, Price to Display Orders, Non-Displayed Orders, and Post-Only Orders that are entered through the OUCH or FLITE protocols in Test Group Three securities is as follows:

    Following entry, and if market conditions allow, a Price to Comply Order in a Test Group Three Pilot Security will be adjusted repeatedly in accordance with changes to the NBBO until such time as the Price to Comply Order is able to be ranked and displayed at its original entered limit price.

    Following entry, and if market conditions allow, a Price to Display Order in a Test Group Three Pilot Security will be adjusted repeatedly in accordance with changes to the NBBO until such time as the Price to Display Order is able to be ranked and displayed at its original entered limit price.

    Following entry, and if market conditions allow, a Non-Displayed Order in a Test Group Three Pilot Security will be adjusted repeatedly in accordance with changes to the NBBO up (down) to the Order's limit price.

    Following entry, and if market conditions allow, a Post-Only Order in a Test Group Three Pilot Security will be adjusted repeatedly in accordance with changes to the NBBO or the best price on the Phlx Book, as applicable until such time as the Post-Only Order is able to be ranked and displayed at its original entered limit price.
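    For illustration only, the following minimal Python sketch shows one simplified reading of the re-pricing behavior described above for a buy or sell order resting on the book. It is not part of the rule text or a description of Phlx's actual systems; the data structure, function name, and single-tick adjustment are assumptions made solely for this example, and the sketch omits the ranking, display, and execution details governed by the Plan and Rule 3317.

    # Minimal, hypothetical sketch of NBBO-driven re-pricing (not Phlx's actual system).
    # Assumption: a resting order is displayed at the most aggressive price that does not
    # lock or cross the protected quotation on the other side of the market, and is moved
    # toward its original entered limit price each time the NBBO changes.

    from dataclasses import dataclass

    TICK = 0.05  # assumed minimum price increment for Test Group Three securities

    @dataclass
    class RestingOrder:
        side: str            # "buy" or "sell"
        limit_price: float   # original entered limit price
        display_price: float = 0.0

    def reprice_on_nbbo_change(order: RestingOrder, nbbo_bid: float, nbbo_offer: float) -> RestingOrder:
        """Re-price the order after an NBBO update (simplified illustration)."""
        if order.side == "buy":
            # A buy order is displayed no higher than one tick below the protected offer,
            # capped at its original limit price.
            order.display_price = min(order.limit_price, round(nbbo_offer - TICK, 2))
        else:
            # A sell order is displayed no lower than one tick above the protected bid,
            # floored at its original limit price.
            order.display_price = max(order.limit_price, round(nbbo_bid + TICK, 2))
        return order

    # Example: a buy order with a $10.20 limit is constrained while the protected offer is
    # $10.15, then reaches its original limit price once the offer moves up to $10.30.
    if __name__ == "__main__":
        order = RestingOrder(side="buy", limit_price=10.20)
        reprice_on_nbbo_change(order, nbbo_bid=10.05, nbbo_offer=10.15)   # displayed at 10.10
        reprice_on_nbbo_change(order, nbbo_bid=10.20, nbbo_offer=10.30)   # displayed at 10.20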

    In addition to this proposal, Phlx will also issue an Equity Trader Alert that describes the current operation of the Phlx systems in this regard, and the timing related to the removal of this re-pricing functionality.8

    8 Phlx anticipates providing additional specificity to market participants as to the timing of the new functionality at a later date.

    Phlx also proposes to re-number Commentary .12, which relates to the Block Size exception, to Commentary .13. A previous filing (SR-Phlx-2016-90) added Commentary to Rule 3317 that resulted in Commentary .11, which addresses the effective date of the Rule, being re-numbered as Commentary .12. Phlx therefore proposes to re-number the Commentary .12 that addresses the Block Size exception as Commentary .13.

    2. Statutory Basis

    The Exchange believes that its proposal is consistent with Section 6(b) of the Act,9 in general, and furthers the objectives of Section 6(b)(5) of the Act,10 in particular, in that it is designed to promote just and equitable principles of trade, to remove impediments to and perfect the mechanism of a free and open market and a national market system, and, in general, to protect investors and the public interest. The purpose of this filing is to inform the SEC and market participants of the status of Phlx's attempts to re-program its systems to remove the re-pricing functionality in Test Group Three securities for Price to Comply Orders, Price to Display Orders, Non-Displayed Orders, and Post-Only Orders that are entered through the OUCH or FLITE protocols, and the current treatment of such orders pending the removal of this functionality. This proposal is consistent with the Act because it provides the SEC and market participants with notice of Phlx's efforts in this regard, and is being submitted in connection with the statements made by Phlx in SR-Phlx-2016-92 in proposing the removal of this functionality.

    9 15 U.S.C. 78f(b).

    10 15 U.S.C. 78f(b)(5).

    Phlx also believes that the proposal is consistent with the Act because the re-pricing functionality will not significantly impact the data gathered pursuant to the Pilot. Phlx notes that this re-pricing functionality only affects Price to Comply Orders, Price to Display Orders, Non-Displayed Orders, and Post-Only Orders that are entered through the OUCH or FLITE protocols for Test Group Three securities until the re-pricing functionality is eliminated, and only becomes relevant when an Order in a Test Group Three security would cross a Protected Quotation of another market center. Phlx has analyzed data relating to the frequency with which Orders in Test Group Three securities are entered with a limit price that would cross a Protected Quotation of another market center, and believes that the re-pricing functionality will be triggered infrequently once Test Group Three becomes operational.11 The Exchange also notes that it is diligently working to re-program its systems to remove this re-pricing functionality, and that it anticipates this re-programming to be complete on or before October 31, 2016.

    11 For example, on September 23, 2016, 0.3% of orders in Test Group Three securities were entered on the NASDAQ Stock Market LLC at a price that crossed the NBBO. Phlx believes that this number will be even lower starting October 17, 2016, when the $0.05 tick increment for Test Group Three securities is in place.

    B. Self-Regulatory Organization's Statement on Burden on Competition

    The Exchange does not believe that the proposed rule change will impose any burden on competition not necessary or appropriate in furtherance of the purposes of the Act. The purpose of this proposal is to provide the SEC and market participants with notice of Phlx's efforts to remove its re-pricing functionality in Test Group Three securities for Price to Comply Orders, Price to Display Orders, Non-Displayed Orders, and Post-Only Orders that are entered through the OUCH or FLITE protocols, consistent with its statements in SR-Phlx-2016-92.

    C. Self-Regulatory Organization's Statement on Comments on the Proposed Rule Change Received From Members, Participants, or Others

    No written comments were either solicited or received.

    III. Date of Effectiveness of the Proposed Rule Change and Timing for Commission Action

    Because the foregoing proposed rule change does not: (i) Significantly affect the protection of investors or the public interest; (ii) impose any significant burden on competition; and (iii) become operative for 30 days from the date on which it was filed, or such shorter time as the Commission may designate, it has become effective pursuant to Section 19(b)(3)(A)(iii) of the Act 12 and subparagraph (f)(6) of Rule 19b-4 thereunder.13

    12 15 U.S.C. 78s(b)(3)(A)(iii).

    13 17 CFR 240.19b-4(f)(6).

    A proposed rule change filed under Rule 19b-4(f)(6) normally does not become operative prior to 30 days after the date of filing. Rule 19b-4(f)(6)(iii), however, permits the Commission to designate a shorter time if such action is consistent with the protection of investors and the public interest. The Exchange requests that the Commission waive the 30-day operative delay contained in Rule 19b-4(f)(6)(iii) so that this proposed change will be operative as of October 17, 2016, the date that Test Group Three securities begin to be subject to the quoting and trading restrictions of the Plan and, therefore, the relevant language in Rule 3317.

    The Commission believes that waiving the 30-day operative delay is consistent with the protection of investors and the public interest because it will allow the Exchange to implement the proposed rules immediately, thereby preventing delays in the implementation of the Plan. The Commission notes that the Pilot started implementation on October 3, 2016, that Test Group Three securities were phased into the Pilot starting on October 17, 2016, and that waiving the 30-day operative delay would ensure that the rules of the Exchange would be in place during implementation. Therefore, the Commission hereby waives the 30-day operative delay and designates the proposed rule change to be operative upon filing with the Commission.14

    14 For purposes only of waiving the operative delay for this proposal, the Commission has considered the proposed rule's impact on efficiency, competition, and capital formation. See 15 U.S.C. 78c(f).

    At any time within 60 days of the filing of the proposed rule change, the Commission summarily may temporarily suspend such rule change if it appears to the Commission that such action is: (i) Necessary or appropriate in the public interest; (ii) for the protection of investors; or (iii) otherwise in furtherance of the purposes of the Act. If the Commission takes such action, the Commission shall institute proceedings to determine whether the proposed rule should be approved or disapproved.

    IV. Solicitation of Comments

    Interested persons are invited to submit written data, views, and arguments concerning the foregoing, including whether the proposed rule change is consistent with the Act. Comments may be submitted by any of the following methods:

    Electronic Comments

    • Use the Commission's Internet comment form (http://www.sec.gov/rules/sro.shtml); or

    • Send an email to [email protected]. Please include File Number SR-Phlx-2016-106 on the subject line.

    Paper Comments

    • Send paper comments in triplicate to Secretary, Securities and Exchange Commission, 100 F Street NE., Washington, DC 20549-1090.

    All submissions should refer to File Number SR-Phlx-2016-106. This file number should be included on the subject line if email is used. To help the Commission process and review your comments more efficiently, please use only one method. The Commission will post all comments on the Commission's Internet Web site (http://www.sec.gov/rules/sro.shtml). Copies of the submission, all subsequent amendments, all written statements with respect to the proposed rule change that are filed with the Commission, and all written communications relating to the proposed rule change between the Commission and any person, other than those that may be withheld from the public in accordance with the provisions of 5 U.S.C. 552, will be available for Web site viewing and printing in the Commission's Public Reference Room, 100 F Street NE., Washington, DC 20549, on official business days between the hours of 10:00 a.m. and 3:00 p.m. Copies of the filing also will be available for inspection and copying at the principal office of the Exchange. All comments received will be posted without change; the Commission does not edit personal identifying information from submissions. You should submit only information that you wish to make available publicly. All submissions should refer to File Number SR-Phlx-2016-106, and should be submitted on or before November 21, 2016.

    For the Commission, by the Division of Trading and Markets, pursuant to delegated authority.15

    15 17 CFR 200.30-3(a)(12).

    Brent J. Fields, Secretary.
    [FR Doc. 2016-26139 Filed 10-28-16; 8:45 am] BILLING CODE 8011-01-P
    DEPARTMENT OF TRANSPORTATION Federal Aviation Administration [Summary Notice No. 2016-108] Petition for Exemption; Summary of Petition Received; Douglas Myers AGENCY:

    Federal Aviation Administration (FAA), DOT.

    ACTION:

    Notice.

    SUMMARY:

    This notice contains a summary of a petition seeking relief from specified requirements of Title 14 of the Code of Federal Regulations. The purpose of this notice is to improve the public's awareness of, and participation in, the FAA's exemption process. Neither publication of this notice nor the inclusion or omission of information in the summary is intended to affect the legal status of the petition or its final disposition.

    DATES:

    Comments on this petition must identify the petition docket number and must be received on or before November 21, 2016.

    ADDRESSES:

    Send comments identified by docket number FAA-2016-8684:

    Federal eRulemaking Portal: Go to http://www.regulations.gov and follow the online instructions for sending your comments electronically.

    Mail: Send comments to Docket Operations, M-30; U.S. Department of Transportation (DOT), 1200 New Jersey Avenue SE., Room W12-140, West Building Ground Floor, Washington, DC 20590-0001.

    Hand Delivery or Courier: Take comments to Docket Operations in Room W12-140 of the West Building Ground Floor at 1200 New Jersey Avenue SE., Washington, DC, between 9 a.m. and 5 p.m., Monday through Friday, except Federal holidays.

    Fax: Fax comments to Docket Operations at 202-493-2251.

    Privacy: In accordance with 5 U.S.C. 553(c), DOT solicits comments from the public to better inform its rulemaking process. DOT posts these comments, without edit, including any personal information the commenter provides, to http://www.regulations.gov, as described in the system of records notice (DOT/ALL-14 FDMS), which can be reviewed at http://www.dot.gov/privacy.

    Docket: Background documents or comments received may be read at http://www.regulations.gov at any time. Follow the online instructions for accessing the docket or go to the Docket Operations in Room W12-140 of the West Building Ground Floor at 1200 New Jersey Avenue SE., Washington, DC, between 9 a.m. and 5 p.m., Monday through Friday, except Federal holidays.

    FOR FURTHER INFORMATION CONTACT:

    Christopher Morris, Office of Rulemaking, Federal Aviation Administration, 800 Independence Ave. SW., Washington, DC 20591; (202) 267-4418; [email protected].

    This notice is published pursuant to 14 CFR 11.85.

    Issued in Washington, DC, on October 25, 2016. Dale A. Bouffiou, Director, Office of Rulemaking. Petition for Exemption

    Docket No.: FAA-2016-8684.

    Petitioner: Douglas Myers.

    Section(s) of 14 CFR Affected: § 107.65.

    Description of Relief Sought: The petitioner, a Certificated Flight Instructor, requests relief from 14 CFR 107.65, which provides that a holder of a part 61 pilot certificate (other than student pilot) may establish aeronautical knowledge recency by meeting the flight review requirements specified in § 61.56 within the previous 24 months and completing either an initial or recurrent training course covering the areas of knowledge specified in § 107.74(a) or (b) in a manner acceptable to the Administrator. The petitioner proposes instead that he, and others similarly situated, be permitted to establish aeronautical knowledge recency by holding an unexpired flight instructor certificate and completing a flight instructor refresher course in accordance with 14 CFR 61.197(a).

    [FR Doc. 2016-26239 Filed 10-28-16; 8:45 am] BILLING CODE 4910-13-P
    DEPARTMENT OF TRANSPORTATION Federal Aviation Administration [Summary Notice No. 2016-98] Petition for Exemption; Summary of Petition Received; Pentastar Aviation Charter, Inc. AGENCY:

    Federal Aviation Administration (FAA), DOT.

    ACTION:

    Notice.

    SUMMARY:

    This notice contains a summary of a petition seeking relief from specified requirements of Title 14 of the Code of Federal Regulations. The purpose of this notice is to improve the public's awareness of, and participation in, the FAA's exemption process. Neither publication of this notice nor the inclusion or omission of information in the summary is intended to affect the legal status of the petition or its final disposition.

    DATES:

    Comments on this petition must identify the petition docket number and must be received on or before November 21, 2016.

    ADDRESSES:

    Send comments identified by docket number FAA-2016-5027 using any of the following methods:

    Federal eRulemaking Portal: Go to http://www.regulations.gov and follow the online instructions for sending your comments electronically.

    Mail: Send comments to Docket Operations, M-30; U.S. Department of Transportation (DOT), 1200 New Jersey Avenue SE., Room W12-140, West Building Ground Floor, Washington, DC 20590-0001.

    Hand Delivery or Courier: Take comments to Docket Operations in Room W12-140 of the West Building Ground Floor at 1200 New Jersey Avenue SE., Washington, DC, between 9 a.m. and 5 p.m., Monday through Friday, except Federal holidays.

    Fax: Fax comments to Docket Operations at 202-493-2251.

    Privacy: In accordance with 5 U.S.C. 553(c), DOT solicits comments from the public to better inform its rulemaking process. DOT posts these comments, without edit, including any personal information the commenter provides, to http://www.regulations.gov, as described in the system of records notice (DOT/ALL-14 FDMS), which can be reviewed at http://www.dot.gov/privacy.

    Docket: Background documents or comments received may be read at http://www.regulations.gov at any time. Follow the online instructions for accessing the docket or go to the Docket Operations in Room W12-140 of the West Building Ground Floor at 1200 New Jersey Avenue SE., Washington, DC, between 9 a.m. and 5 p.m., Monday through Friday, except Federal holidays.

    FOR FURTHER INFORMATION CONTACT:

    Dale Williams (202) 267-4179, Office of Rulemaking, Federal Aviation Administration, 800 Independence Avenue SW., Washington, DC 20591.

    This notice is published pursuant to 14 CFR 11.85.

    Issued in Washington, DC, on October 25, 2016. Dale A. Bouffiou, Deputy Director, Office of Rulemaking. Petition for Exemption

    Docket No.: FAA-2016-5027.

    Petitioner: Pentastar Aviation Charter, Inc.

    Section(s) of 14 CFR Affected: §§ 135.25 (b) and (c).

    Description of Relief Sought: Pentastar Aviation Charter, Inc. (Pentastar) seeks exemption from § 135.25 (b), which requires a part 135 certificate holder to have the exclusive use of at least one aircraft that meets the requirements for at least one kind of operation authorized in its operations specifications. In addition, Pentastar seeks exemption from § 135.25 (c), which specifies that, for the purposes of § 135.25 (b), a person has exclusive use of an aircraft if that person has the sole possession, control, and use of it for flight, as owner, or has a written agreement (including arrangements for performing required maintenance), in effect when the aircraft is operated, giving the person that possession, control, and use for at least 6 consecutive months. In addition, the FAA notes that an exemption from § 135.419 may be relevant to the disposition of this petition. Section 135.419 states that the FAA Administrator may require or allow an approved aircraft inspection program for any make and model aircraft of which the certificate holder has exclusive use of at least one aircraft (as defined in § 135.25(b)).

    Pentastar operates under part 135 as an eligible on demand air carrier under Certificate No. UG8A235J, which was issued initially in 1999. Pentastar currently operates eight turbojet aircraft and one Cessna-172 aircraft, all of which are leased aircraft. Pentastar retains responsibility for all maintenance of the aircraft on its part 135 certificate through its part 145 repair station (No. BTVR626C). None of these aircraft are operated under part 91K; however, all aircraft are operated under part 91 by owners when not in use by Pentastar. Pentastar is currently in compliance with §§ 135.25 (b) and (c) by leasing a C-172 aircraft. Pentastar maintains this aircraft on its part 135 operations specifications although it has never utilized this aircraft for on demand operations. The petitioner contends that a grant of exemption would provide an equivalent level of safety because all the aircraft utilized in actual on demand operations under its part 135 certificate would remain the same.

    [FR Doc. 2016-26238 Filed 10-28-16; 8:45 am] BILLING CODE 4910-13-P
    DEPARTMENT OF TRANSPORTATION Federal Railroad Administration [Docket No. FRA-2000-7257, Notice No. 83] Railroad Safety Advisory Committee; Notice of Meeting AGENCY:

    Federal Railroad Administration (FRA), Department of Transportation.

    ACTION:

    Announcement of Railroad Safety Advisory Committee (RSAC) meeting.

    SUMMARY:

    FRA announces the fifty-seventh meeting of the RSAC, a Federal Advisory Committee that develops railroad safety regulations through a consensus process. The RSAC meeting agenda topics will include: Opening remarks from the FRA Administrator and the FRA Associate Administrator for Railroad Safety and Chief Safety Officer; status reports from the Remote Control Locomotive, Track Standards, Hazardous Materials Issues, and Rail Integrity Working Groups, and from the Engineering Task Force; an informational presentation on the high-speed passenger rail equipment (Tier III) rulemaking; and an update on the status of Positive Train Control implementation. This agenda is subject to change, including possibly adding further proposed tasks.

    DATES:

    The RSAC meeting is scheduled to commence at 9:30 a.m. on Thursday, January 26, 2017, and will adjourn by 4:30 p.m.

    ADDRESSES:

    The RSAC meeting will be held at the National Association of Home Builders, National Housing Center, located at 1201 15th Street NW., Washington, DC 20005. The meeting is open to the public on a first-come, first-served basis, and is accessible to individuals with disabilities. Sign and oral interpretation can be made available if requested 10 calendar days before the meeting.

    FOR FURTHER INFORMATION CONTACT:

    Kenton Kilgore, RSAC Administrative Officer/Coordinator, FRA, 1200 New Jersey Avenue SE., Mailstop 25, Washington, DC 20590, (202) 493-6286; or Robert Lauby, Associate Administrator for Railroad Safety and Chief Safety Officer, FRA, 1200 New Jersey Avenue SE., Mailstop 25, Washington, DC 20590, (202) 493-6474.

    SUPPLEMENTARY INFORMATION:

    Under Section 10(a)(2) of the Federal Advisory Committee Act (Pub. L. 92-463), FRA is giving notice of a meeting of the RSAC. The RSAC was established to provide advice and recommendations to FRA on railroad safety matters. The RSAC is composed of 59 voting representatives from 38 member organizations, representing various rail industry perspectives. In addition, there are non-voting advisory representatives from the agencies with railroad safety regulatory responsibility in Canada and Mexico, the National Transportation Safety Board, and the Federal Transit Administration. The diversity of the RSAC ensures the requisite range of views and expertise necessary to discharge its responsibilities. See the RSAC Web site for details on prior RSAC activities and pending tasks at http://rsac.fra.dot.gov/. Please refer to the notice published in the Federal Register on March 11, 1996 (61 FR 9740), for additional information about the RSAC.

    Issued in Washington, DC, on October 25, 2016. Robert C. Lauby, Associate Administrator for Railroad Safety, Chief Safety Officer.
    [FR Doc. 2016-26147 Filed 10-28-16; 8:45 am] BILLING CODE 4910-06-P
    DEPARTMENT OF TRANSPORTATION Federal Railroad Administration [Docket No. FRA-2010-0060] Norfolk Southern Railway Company's Request for Positive Train Control Safety Plan Approval and System Certification AGENCY:

    Federal Railroad Administration (FRA), U.S. Department of Transportation (DOT).

    ACTION:

    Notice of availability and request for comments.

    SUMMARY:

    This document provides the public with notice that, on August 12, 2016, Norfolk Southern Railway Company (NS) submitted to FRA its Positive Train Control Safety Plan (PTCSP) Version 1.0, dated August 12, 2016, via FRA's Secure Information Repository (SIR) site. NS asks FRA to approve its PTCSP and issue a Positive Train Control System Certification for NS's Interoperable Electronic Train Management System (I-ETMS), under 49 CFR part 236, subpart I.

    DATES:

    FRA will consider communications received by November 30, 2016 before taking final action on the PTCSP. FRA may consider comments received after that date if practicable.

    ADDRESSES:

    All communications concerning this proceeding should identify Docket Number FRA 2010-0060 and may be submitted by any of the following methods:

    Web site: http://www.regulations.gov. Follow the online instructions for submitting comments.

    Fax: 202-493-2251.

    Mail: Docket Operations Facility, U.S. Department of Transportation (DOT), 1200 New Jersey Avenue SE., W12-140, Washington, DC 20590.

    Hand Delivery: 1200 New Jersey Avenue SE., Room W12-140, Washington, DC 20590, between 9 a.m. and 5 p.m., Monday through Friday, except Federal Holidays.

    FOR FURTHER INFORMATION CONTACT:

    Dr. Mark Hartong, Senior Scientific Technical Advisor, at (202) 493-1332 or [email protected]; or Mr. David Blackmore, Staff Director, Positive Train Control Division, at (312) 835-3903 or [email protected].

    SUPPLEMENTARY INFORMATION:

    In its PTCSP, NS states the I-ETMS system it is implementing is designed as a vital overlay PTC system as defined in 49 CFR 236.1015(e)(2). The PTCSP describes NS's I-ETMS implementation and the associated I-ETMS safety processes, safety analyses, and test, validation, and verification processes used during the development of I-ETMS. The PTCSP also contains NS's operational and support requirements and procedures.

    NS's PTCSP and the accompanying request for approval and system certification are available for review online at www.regulations.gov (Docket Number FRA-2010-0060) and in person at DOT's Docket Operations Facility, 1200 New Jersey Avenue SE., W12-140, Washington, DC 20590. The Docket Operations Facility is open from 9 a.m. to 5 p.m., Monday through Friday, except Federal Holidays.

    Interested parties may comment on the PTCSP by submitting written comments or data. During its review of the PTCSP, FRA will consider any comments or data submitted. However, FRA may elect not to respond to any particular comment and, under 49 CFR 236.1009(d)(3), FRA maintains the authority to approve or disapprove the PTCSP at its sole discretion. FRA does not anticipate scheduling a public hearing regarding NS's PTCSP because the circumstances do not appear to warrant a hearing. If any interested party desires an opportunity for oral comment, the party should notify FRA in writing before the end of the comment period and specify the basis for the request.

    Privacy Act Notice

    Anyone can search the electronic form of any written communications and comments received into any of our dockets by the name of the individual submitting the comment (or signing the document, if submitted on behalf of an association, business, labor union, etc.). Under 49 CFR 211.3, FRA solicits comments from the public to better inform its decisions. DOT posts these comments, without edit, including any personal information the commenter provides, to www.regulations.gov, as described in the system of records notice (DOT/ALL-14 FDMS), which you can review at www.dot.gov/privacy. See https://www.regulations.gov/privacyNotice for the privacy notice of regulations.gov.

    Issued in Washington, DC, on October 25, 2016. Robert C. Lauby, Associate Administrator for Railroad Safety, Chief Safety Officer.
    [FR Doc. 2016-26146 Filed 10-28-16; 8:45 am] BILLING CODE 4910-06-P
    DEPARTMENT OF TRANSPORTATION Federal Transit Administration Pilot Program for Transit-Oriented Development Planning Project Selections AGENCY:

    Federal Transit Administration, DOT.

    ACTION:

    Pilot Program for Transit-Oriented Development Planning Announcement of Project Selections.

    SUMMARY:

    The U.S. Department of Transportation's Federal Transit Administration (FTA) announces the selection of projects with Fiscal Year (FY) 2015 and FY 2016 appropriations for the Pilot Program for Transit-Oriented Development Planning (TOD Pilot Program), as authorized by the Moving Ahead for Progress in the 21st Century Act (MAP-21) with additional funding provided by the Fixing America's Surface Transportation (FAST) Act. On April 14, 2016, FTA published a Notice of Funding Opportunity (NOFO) (81 FR 22155) announcing the availability of $20.49 million in funding for this program. This program supports comprehensive planning efforts associated with new fixed guideway and core capacity improvement projects that are seeking or have recently received funding through FTA's Fixed Guideway Capital Investment Grants (CIG) Program.

    FOR FURTHER INFORMATION CONTACT:

    Successful applicants should contact the appropriate FTA Regional Office for information regarding applying for the funds. For program-specific information, applicants may contact Benjamin Owen, FTA Office of Planning and Environment, at (202) 366-5602 or [email protected]. A list of Regional Offices can be found at www.transit.dot.gov. A TDD is available at 1-800-877-8339 (TDD/FIRS).

    SUPPLEMENTARY INFORMATION:

    In response to the NOFO, FTA received 20 proposals from 17 states requesting $17.6 million in Federal funds. Project proposals were evaluated based on each applicant's responsiveness to the program evaluation criteria as detailed in the NOFO. Two of the 20 projects were deemed ineligible to receive funds because they did not meet the eligibility requirements described in the NOFO. A further two were associated with transit projects for which the risk of major scope changes could render the proposed comprehensive planning efforts moot. Those projects would be eligible to reapply to a future TOD Pilot Program funding opportunity if and when the uncertainties are resolved. FTA has selected 16 projects as shown in Table I for a total of $14.7 million.

    Recipients selected for competitive funding should work with their FTA Regional Office to finalize the grant application in FTA's Transit Award Management System (TrAMS) for the projects identified in the attached table to quickly obligate funds. Grant applications must include eligible activities applied for in the original project application. Funds must be used consistent with the submitted proposal and for the eligible planning purposes established in the NOFO. Recipients are reminded that program requirements such as local match can be found in the NOFO. A discretionary project identification number has been assigned to each project for tracking purposes and must be used in the TrAMS application.

    Selected projects are eligible to incur costs under pre-award authority no earlier than the date projects were publicly announced, October 11, 2016. Pre-award authority does not guarantee that project expenses incurred prior to the award of a grant will be eligible for reimbursement, as eligibility for reimbursement is contingent upon all applicable requirements having been met. For more about FTA's policy on pre-award authority, please see the FTA Fiscal Year 2016 Apportionments, Allocations, and Program Information and Interim Guidance found in 81 FR 7893 (February 16, 2016). Post-award reporting requirements include submission of the Federal Financial Report and Milestone progress reports in TrAMS as appropriate (see Grant Management Requirements FTA.C.5010.1D and Program Guidance for Metropolitan Planning and State Planning and Research Program Grants C.8100.1C). Recipients must comply with all applicable Federal statutes, regulations, executive orders, FTA circulars, and other Federal requirements in carrying out the project supported by the FTA grant. FTA emphasizes that grantees must follow all third-party procurement guidance, as described in FTA.C.4220.1F. Funds allocated in this announcement must be obligated in a grant by September 30, 2017.

    Carolyn Flowers, Acting Administrator.

    Table I—FY 2017 Pilot Program for Transit-Oriented Development Planning Project Selections

    State | Recipient | Project ID | Project description | Allocation ($)
    AZ | City of Phoenix | D2017-TODP-001 | South Central Light Rail TOD Planning Grant | 2,000,000
    CA | Santa Clara Valley Transportation Authority (VTA) | D2017-TODP-002 | VTA BART Phase II—TOD and Station Access Planning Study | 1,520,000
    CA | Los Angeles County Metropolitan Transportation Authority | D2017-TODP-003 | West Santa Ana Branch Transit Corridor TOD Strategic Implementation Plan | 2,000,000
    CO | Regional Transportation District | D2017-TODP-004 | Colfax Corridor Connections TOD Implementation Plan (C3 TOD Plan) | 1,350,000
    FL | Miami-Dade County | D2017-TODP-005 | Master TOD Plan for the Miami-Dade County East-West Corridor | 960,000
    ID | Valley Regional Transit | D2017-TODP-006 | State Street Corridor Transit Oriented Development (TOD) Design and Implementation Plan | 279,000
    MI | City of Detroit Department of Transportation | D2017-TODP-007 | City of Detroit Feasibility Study for Major Transit Capital Investment—East Jefferson Avenue | 300,000
    MO | Bi-State Development Agency | D2017-TODP-008 | Northside-Southside Pilot Program for Transit-Oriented Development Planning | 374,278
    MN | Metropolitan Council/Metro Transit | D2017-TODP-009 | METRO Blue Line Extension—Advanced Transit-Oriented Development Planning (BLRT TOD) | 1,200,000
    NM | Rio Metro Regional Transit District | D2017-TODP-010, D2017-TODP-011 | New Mexico's Knowledge Corridor: BRT, Land Use and Economic Opportunity on University Boulevard | 572,000
    OR | Lane Transit District | D2017-TODP-012 | River Road Transit Community Implementation Plan | 450,000
    OR | Metro | D2017-TODP-013 | Southwest Corridor Equitable Development Strategy (SWEDS) | 895,000
    TX | North Central Texas Council of Governments | D2017-TODP-014 | DART Red and Blue Line Corridors TOD Planning Study | 1,400,000
    UT | Utah Transit Authority | D2017-TODP-015 | Ogden/WSU BRT—Transit Oriented Development (TOD) Analysis and Implementation Plan | 250,000
    VA | County of Fairfax, Virginia | D2017-TODP-016 | Transit-Oriented Development Planning for the Richmond Highway Corridor | 400,000
    WI | City of Milwaukee | D2017-TODP-017 | Milwaukee Streetcar King Drive and Walker's Point Extensions—Equitable Growth through TOD | 750,000
    [FR Doc. 2016-26208 Filed 10-28-16; 8:45 am] BILLING CODE P
    DEPARTMENT OF TRANSPORTATION Office of the Secretary [Docket No. DOT-OST-2015-0246] RIN 2105-AE12 Nondiscrimination on the Basis of Disability in Air Travel: Negotiated Rulemaking Committee Seventh Meeting AGENCY:

    Office of the Secretary, Department of Transportation.

    ACTION:

    Notice of seventh public meeting of advisory committee.

    SUMMARY:

    This notice announces the seventh meeting of the Advisory Committee on Accessible Air Transportation (ACCESS Advisory Committee).

    DATES:

    The seventh meeting of the ACCESS Advisory Committee will be held on November 2, 2016, from 9:00 a.m. to 5:00 p.m., Eastern Daylight Time.

    ADDRESSES:

    The meeting will be held at Hilton Arlington, 950 N. Stafford St., Arlington, VA 22203. Attendance is open to the public up to the room's capacity of 150 attendees. Since space is limited, any member of the general public who plans to attend this meeting must notify the registration contact identified below no later than October 31, 2016.

    FOR FURTHER INFORMATION CONTACT:

    To register to attend the meeting, please contact Kyle Ilgenfritz ([email protected]; 703-442-4575 extension 128). For other information, please contact Livaughn Chapman or Vinh Nguyen, Office of the Aviation Enforcement and Proceedings, U.S. Department of Transportation, by email at [email protected] or [email protected] or by telephone at 202-366-9342.

    SUPPLEMENTARY INFORMATION:

    I. Seventh Public Meeting of the ACCESS Committee

    The seventh meeting of the ACCESS Advisory Committee will be held on November 2, from 9:00 a.m. to 5:00 p.m., Eastern Daylight Time. The meeting will be held at Hilton Arlington, 950 N. Stafford St., Arlington, VA 22203. At the meeting, the ACCESS Advisory Committee will continue to address whether to require accessible inflight entertainment (IFE) and strengthen accessibility requirements for other in-flight communications. We expect to negotiate and vote on proposals to amend the Department's disability regulation regarding this issue. Prior to the meeting, the agenda will be available on the ACCESS Advisory Committee's Web site, www.transportation.gov/access-advisory-committee. Information on how to access advisory committee documents via the FDMC is contained in Section III, below.

    The meeting will be open to the public. Attendance will be limited by the size of the meeting room (maximum 150 attendees). Because space is limited, we ask that any member of the public who plans to attend the meeting notify the registration contact, Kyle Ilgenfritz ([email protected]; 703-442-4575 extension 128) at Linkvisum, no later than October 31, 2016. At the discretion of the facilitator and the Committee and time permitting, members of the public are invited to contribute to the discussion and provide oral comments.

    II. Submitting Written Comments

    Members of the public may submit written comments on the topics to be considered during the meeting by October 31, 2016, to FDMC, Docket Number DOT-OST-2015-0246. You may submit your comments and material online or by fax, mail, or hand delivery, but please use only one of these means. DOT recommends that you include your name and a mailing address, an email address, or a phone number in the body of your document so that DOT can contact you if there are questions regarding your submission.

    To submit your comment online, go to http://www.regulations.gov, put the docket number, DOT-OST-2015-0246, in the keyword box, and click “Search.” When the new screen appears, click on the “Comment Now!” button and type your comment into the text box on the following screen. Choose whether you are submitting your comment as an individual or on behalf of a third party and then submit. If you submit your comments by mail or hand delivery, submit them in an unbound format, no larger than 81/2 by 11 inches, suitable for copying and electronic filing.

    III. Viewing Comments and Documents

    To view comments and any documents mentioned in this preamble as being available in the docket, go to www.regulations.gov. Enter the docket number, DOT-OST-2015-0246, in the keyword box, and click “Search.” Next, click the link to “Open Docket Folder” and choose the document to review. If you do not have access to the Internet, you may view the docket online by visiting the Docket Management Facility in Room W12-140 on the ground floor of the DOT West Building, 1200 New Jersey Avenue SE., Washington, DC 20590, between 9 a.m. and 5 p.m., E.T., Monday through Friday, except Federal holidays.

    IV. ACCESS Advisory Committee Charter

    The ACCESS Advisory Committee is established by charter in accordance with the Federal Advisory Committee Act (FACA), 5 U.S.C. App. 2. Secretary of Transportation Anthony Foxx approved the ACCESS Advisory Committee charter on April 6, 2016. The committee's charter sets forth policies for the operation of the advisory committee and is available on the Department's Web site at www.transportation.gov/office-general-counsel/negotiated-regulations/charter.

    V. Privacy Act

    In accordance with 5 U.S.C. 553(c), DOT solicits comments from the public to better inform its rulemaking process. DOT posts these comments, without edit, including any personal information the commenter provides, to www.regulations.gov, as described in the system of records notice (DOT/ALL-14 FDMS), which can be reviewed at www.dot.gov/privacy.

    VI. Federal Advisory Committee Act

    Notice of this meeting is being provided in accordance with the Federal Advisory Committee Act and the General Services Administration regulations covering management of Federal advisory committees. See 41 CFR part 102-3.

    Dated: October 24, 2016. Molly J. Moran, Acting General Counsel.
    [FR Doc. 2016-26192 Filed 10-28-16; 8:45 am] BILLING CODE 4910-9X-P
    DEPARTMENT OF TRANSPORTATION Office of the Secretary [Docket No. DOT-OST-2016-0204] Exploring Industry Practices on Distribution and Display of Airline Fare, Schedule, and Availability Information AGENCY:

    Office of the Secretary (OST), Department of Transportation (DOT).

    ACTION:

    Request for Information (RFI).

    SUMMARY:

    The industry group for travel sites, a number of its members, which include online travel booking sites, and certain members of Congress have expressed concerns to the Department of Transportation (DOT or Department) regarding airline restrictions on the distribution and display of airline flight schedule, fare, and availability information (“flight information”). Specifically, concerns were raised about practices by some airlines to restrict the distribution and/or display of flight information by certain online travel agencies (OTAs), metasearch entities that operate flight search tools, and other stakeholders involved in the distribution of flight information and sale of air transportation. Airlines state that it is important for them to maintain control over the display and distribution of airline flight information, while OTAs and metasearch entities that operate flight search tools state that actions taken by airlines to restrict the distribution or display of flight information are anticompetitive and harm consumers.

    The Department is interested in learning more about this issue. Pursuant to the Department's aviation consumer protection authority, we are requesting information on whether airline restrictions on the distribution or display of airline flight information harm consumers and constitute an unfair and deceptive business practice and/or an unfair method of competition. The Department is also requesting information on whether any entities are blocking access to critical resources needed for competitive entry into the air transportation industry. Finally, we are requesting information on whether Department action is unnecessary or whether Department action in these areas would promote a more competitive air transportation marketplace or help ensure that consumers have access to the information needed to make informed air transportation choices.

    DATES:

    Responses should be filed by December 30, 2016.

    ADDRESSES:

    You may file responses identified by the docket number DOT-OST-2016-0204 by any of the following methods:

    Federal eRulemaking Portal: go to http://www.regulations.gov and follow the online instructions for submitting comments.

    Mail: Docket Management Facility, U.S. Department of Transportation, 1200 New Jersey Ave. SE., West Building Ground Floor, Room W12-140, Washington, DC 20590-0001.

    Hand Delivery or Courier: West Building Ground Floor, Room W12-140, 1200 New Jersey Ave. SE., between 9:00 a.m. and 5:00 p.m. ET, Monday through Friday, except Federal holidays.

    Fax: (202) 493-2251.

    Instructions: You must include the agency name and docket number DOT-OST-2016-0204 at the beginning of your submission. All submissions received will be posted without change to http://www.regulations.gov, including any personal information provided.

    Privacy Act: Anyone is able to search the electronic form of all submissions received in any of our dockets by the name of the individual submitting the document (or signing the submission, if submitted on behalf of an association, business, labor union, etc.). You may review DOT's complete Privacy Act statement in the Federal Register published on April 11, 2000 (65 FR 19477-78), or you may visit http://DocketsInfo.dot.gov.

    Docket: For access to the docket to read background documents and comments received, go to http://www.regulations.gov or to the street address listed above. Follow the online instructions for accessing the docket.

    FOR FURTHER INFORMATION CONTACT:

    Kyle-Etienne Joseph, Trial Attorney, or Kimberly Graber, Chief, Consumer Protection and Competition Law Branch, Office of the Assistant General Counsel for Aviation Enforcement and Proceedings, U.S. Department of Transportation, 1200 New Jersey Ave. SE., Washington, DC 20590, 202-366-9342, 202-366-7152 (fax), [email protected] or [email protected] (email).

    SUPPLEMENTARY INFORMATION:

    Background

    Various entities have raised concerns to the Department regarding airlines restricting the distribution or display of information on their flights. We initially became aware of the issue in connection with certain airlines placing restrictions on flight information being displayed by metasearch sites that operate flight search tools. In a proposed rule, the Transparency of Airline Ancillary Fees and Other Consumer Protection Issues (“Consumer Rule III NPRM”), the Department sought information relating to a wide variety of distribution issues including information about the relationships between entities involved in the distribution of air transportation information. 79 FR 29974 (May 23, 2014). In the Consumer Rule III NPRM, the Department posed questions related to airline restrictions on the display of flight schedule, fare, and availability information. The Department stated that it was “considering whether carriers should be prohibited from restricting the information provided by ticket agents when those ticket agents do not sell air transportation directly to consumers but rather provide consumers with different airlines' flight information for comparison shopping.” 79 FR 29970, 29974 (May 23, 2014).

    While the rulemaking was pending, representatives of certain OTAs and representatives of metasearch sites focused on travel, and their outside counsel, met with Department representatives and urged the Department to consider taking action. Those entities stated that airlines that restrict distribution of airline fare, schedule, and availability information to metasearch sites are engaging in unfair practices and unfair methods of competition. They further stated that they were focused on enforcement action or industry guidance rather than rulemaking. See Docket item DOT-OST-2014-0056-0776.

    Subsequently, many questions and concerns have been raised with the Department by members of Congress as well as various stakeholders regarding airline restrictions on the distribution and display of flight information by third parties. The Department met with representatives from OTAs, metasearch entities, airlines, and other industry stakeholders to learn about the issue and how airline decisions to place restrictions on the distribution and display of airline flight information may impact both consumers and the broader air transportation industry. The Department wanted to understand whether the issue of primary concern to industry stakeholders was (1) airlines refusing to provide any flight information to non-airline entities such as an OTA or metasearch entity; (2) airlines providing flight information to non-airline entities but placing restrictions on how that information is displayed; or (3) airlines providing flight information to an OTA but restricting the OTA from distributing that information to a metasearch entity that operates a flight search tool but does not itself sell tickets. In addition, the Department wanted to understand the impact on consumers.

    In meetings with representatives of airlines and online travel entities, the Department asked about the restrictions and why some airlines are restricting some OTAs, metasearch entities that operate flight search tools, or other industry stakeholders from accessing flight information or from distributing and displaying flight information. The Department also asked how such restrictions may impact consumers who use OTA and metasearch Web sites to research and book air travel.

    The Department learned that some airlines have issued cease and desist letters to some OTAs demanding that these companies stop distributing airline flight information to some metasearch entities that operate flight search tools or have included language in their contracts with OTAs prohibiting them from sharing airline flight information with any metasearch entity that has not been approved by the airline. Additionally, some airlines have issued letters to metasearch entities operating flight search tools demanding that these companies stop displaying the airline's flight information or limiting how the entities display the airline's flight information on their flight search tools.

    Some airlines have explained that such actions are because there are certain Web sites marketing air transportation operated by entities with which the airline does not want to be associated because the entities provide inaccurate or incomplete information, or provide poor customer service. Additionally, certain airlines have alleged that some of these entities may have engaged in fraud. Further, several airlines have stated that they wish to control how the information regarding their flights is distributed so that the airline can market services the way it chooses, through the outlets it chooses. Some airlines also state that controlling the outlets through which information on their flights is distributed helps control their distribution costs.

    Historically, competition in airline distribution has contributed to technological and retail innovation that has benefited both industry stakeholders and business and leisure air travelers and further enhanced airline competition. Meanwhile, airlines and ticket agents had commercial incentives to display airline information to consumers as widely as possible. Generally, market forces should ensure that airlines will continue to display their fares in the outlets where consumers want to find them and that those same market forces would then result in airlines accruing the commercial benefits of displaying their services in as many reputable outlets as possible. However, some stakeholders have argued that the marketplace is no longer balanced and consumers are being harmed so the Department should not rely on market forces to resolve these distribution and display issues.

    On April 15, 2016, the White House issued Executive Order 13725: Steps to Increase Competition and Better Inform Consumers and Workers to Support Continued Growth of the American Economy (the “Executive Order”). The Executive Order expresses the importance of a fair, efficient, and competitive marketplace and notes that consumers need both competitive markets and information to make informed choices. The Department shares the goal of ensuring consumers are provided with information they need to make informed choices. In particular, as directed in the Executive Order, the DOT wants to identify any specific practices in connection with air transportation, such as blocking access to critical resources, that may impede informed consumer choice or unduly stifle new market entrants and determine whether the Department can potentially address those practices in appropriate instances. The issues raised in connection with airlines restricting ticket agents' ability to distribute or display flight information may potentially create the type of undue burdens on competition that the Executive Order has directed agencies to address. However, the Department needs to learn more about the issue to understand whether Department action is appropriate.

    Departmental Authority Under 49 U.S.C. 41712 and 40101

    Under 49 U.S.C. 41712, the Department has authority to prevent unfair or deceptive practices and unfair methods of competition. Certain OTAs and metasearch entities have stated that airline restrictions on the distribution and display of flight information amount to an unfair, deceptive, or anticompetitive practice that harms consumers and an unfair method of competition; therefore, in their view, the Department has authority to act under 49 U.S.C. 41712. Meanwhile, airlines have stated that the manner in which they distribute their fare, schedule, and availability information is a private contractual matter between airlines and third parties. Airlines further contend that they have the right to determine with whom they do business and where and when their content is displayed. They state that the Department has no role in this issue because airlines are not engaging in any unfair or deceptive practices or unfair methods of competition.

    The Department also is mandated to encourage and enhance consumer welfare through the benefits of a deregulated, competitive air transportation industry under the Airline Deregulation Act of 1978. The Department places maximum reliance on competitive market forces and on actual and potential competition while preventing unfair, deceptive, predatory, or anti-competitive practices in air transportation pursuant to 49 U.S.C. 40101. As a general rule, the Department does not intervene in private contractual agreements between airlines and third parties unless there is a market failure. However, to the extent commercial arrangements constitute or further an unfair or deceptive practice or unfair method of competition, resulting in harm to consumers, it would be within the Department's authority to prohibit parties from implementing such agreements or place restrictions on such agreements. As a part of any review of potentially unfair or deceptive practices or anti-competitive behavior by an airline or ticket agent, the Department considers legal precedent to make sure that any action taken is within the boundaries of Departmental authority.

    Accordingly, the Department is requesting information on whether any entities are blocking access to critical resources needed for competitive entry into the air transportation industry, whether Department action in this area would promote or hinder a more competitive air transportation marketplace, or whether Department action would help ensure that consumers have access to the information needed to make informed air transportation choices.

    Distribution of Airline Flight Information and Airline Restrictions

    The distribution of airline flight information is a complicated process that involves a number of industry stakeholders, but for consumers it is currently relatively simple to obtain flight information from airline Web sites and to find and compare flight information on online travel entity Web sites. Consumers routinely book air transportation through direct and indirect (non-airline) channels, including through Web sites that operate flight search tools that lead consumers either directly to airline Web sites or to an OTA with the authority to book tickets on behalf of an airline.

    Airlines make flight information available through their own channels, such as airline Web sites, call centers, and airport agents, as well as outlets that range from traditional “brick and mortar” travel agents and corporate travel agencies to OTAs. Although airlines with sufficient market presence and high load factors may have incentives to limit the outlets through which their fares are displayed, airlines are generally motivated to ensure their flight information is widely available to increase consumer exposure and generate sales. Historically, the most efficient and cost-effective way for airlines to distribute flight information was to provide it to entities that consolidated the information of multiple airlines and made it available to interested parties. Accordingly, airlines have in the past provided information on their flights with few or no contract restrictions on the redistribution of flight information.

    Industry participants, such as travel agents and metasearch entities that want the flight information of multiple carriers, have in the past been able to obtain flight information by subscribing to distributors of schedule information such as the Official Airline Guide (OAG) and Innovata, distributors of fare and fare related data such as the Airline Tariff Publishing Company (ATPCO) and Societe Internationale de Telecommunications Aeronautiques (SITA), and global distribution systems (“GDS”), which aggregate and distribute combined flight information that generally includes schedules, fares, and availability to subscribers. It is our understanding that in most cases, OTAs that market flight information directly to the public through Web site displays obtain that information from GDSs as their primary non-airline source. OTAs sometimes distribute flight information obtained from GDSs and other entities onward to metasearch entities that operate flight search tools. These metasearch entities often combine information obtained from OTAs with information obtained directly from GDSs and other distributors and/or airlines. Regardless of the source, the information is generally combined and displayed on online travel sites marketed to consumers in flight search tools displaying flight information for multiple airlines.

    Just as airlines have financial incentives to widely distribute and display information on their flights, OTAs and metasearch engines operating flight search tools have financial incentives to distribute and display airline information. It is common for metasearch entities that operate flight search tools to include in search results links to OTAs that are able to sell air transportation on behalf of an airline. Stakeholders have informed the Department that a number of fee structures exist between metasearch entities operating flight search tools and the entities that provide them flight information, whether airlines or OTAs. In connection with the relationship between an OTA and a metasearch engine, although fee structures may vary, generally speaking, when consumers follow a link from a metasearch entity flight search tool to an OTA Web site that allows consumers to book flights, the OTA pays the metasearch site a referral fee. Additionally, OTAs generally receive payments from GDSs for bookings made directly on OTA Web sites. GDSs in turn are paid a fee by airlines for such bookings. Accordingly, although airlines often benefit from having their flights marketed through a variety of outlets, airlines prefer that consumers book directly through an airline channel; the airline then bears the cost of operating its own channel but avoids paying booking fees to others such as GDSs, OTAs, or metasearch entities.

    Certain airlines have placed restrictions on certain third party industry stakeholders such as GDSs and data aggregators, prohibiting them from distributing information to any entities that the airline does not approve. Additionally, certain airlines are prohibiting OTAs from distributing flight information onward to metasearch entities, although it is not clear how many airlines have imposed these prohibitions. Some airlines are also prohibiting particular OTAs or metasearch entities from displaying flight information for an airline's codeshare partners, and at times, preventing OTAs and metasearch entities from displaying an airline's flight information altogether. In other instances, some airlines are prohibiting metasearch entities operating flight search tools from displaying flight information for that airline with any links to OTA Web sites; instead, any links must be to airline Web sites. As discussed below, representatives of ticket agents allege these airline restrictions harm consumers, whereas airlines argue that they have legitimate business reasons for imposing these restrictions.

    Availability of Flight Information and Concerns Regarding Proprietary Nature of Flight Information

    In connection with airline restrictions on ticket agent distribution or display of flight information, some ticket agents have stated to the Department that they believe flight information is public information and that airlines should not be allowed to place restrictions on it. Conversely, airlines believe that flight information is both proprietary and protected under intellectual property laws and that airlines have the right to maintain control over its distribution and display.

    As a result of the availability of airline flight information through so many outlets (e.g., GDSs, OAG, ATPCO, and Innovata), some industry stakeholders believe that airline schedule, fare, and availability information is not airline property and is instead similar to bus or train schedules that are widely available to the public. Some OTAs and metasearch entities that operate flight search tools have stated that airlines have historically provided airline flight information to the general public and that it is purely factual data; therefore, according to these entities, airline flight information has historically not been, and still should not be, considered the intellectual property of airlines.

    On the other hand, airlines state that despite the fact that airline flight information has historically been disseminated and available to the general public, airlines have invested significant money in developing methods to set schedules and fares, to effectively market air transportation, and ultimately to fill as many seats as possible on the flights an airline operates. Further, unlike bus or train fares and schedules that change infrequently, airline fares, schedules, and availability can change many times a day in response to a competitive marketplace. According to many airlines, as a result of the investment that airlines have made in developing flight prices, schedules, and availability, the flight information that they produce and distribute to the air transportation industry is proprietary information.

    Additionally, airlines have indicated that they have an interest in controlling where and how flight information is displayed in order to control airline distribution costs and ensure adequate customer service. Unlike service providers and makers of consumer products that do not sell directly to the public and only sell through an intermediary, airlines sell their services directly to consumers as well as through agents. Despite this distribution model of direct and indirect channels, airlines generally retain control of fares, particularly in domestic air transportation, and do not allow agents to discount or increase fares or to play any role in establishing schedules or seat availability. As such, some airlines believe that because they control fares and the related services, they are entitled to retain ultimate control over how and where this information is distributed and/or displayed.

    Consumer Options for Researching and Purchasing Air Transportation

    Some ticket agents have indicated to the Department that a potential consumer harm that may stem from allowing airlines to restrict the display and distribution of flight information is a reduction in consumers' ability to view a full range of flight options in one location. They also state that ticket agents that operate flight search tools typically display information in a manner that is helpful to travelers seeking to purchase air transportation. Flight search tools consolidate flight options for consumers on one Web site so that consumers do not need to visit multiple Web sites to identify the options for air travel on a number of airlines for a given itinerary. Such flight search tools may also combine the flights of multiple airlines or various one-way fares for consumers in an attempt to identify the most cost-effective and efficient itinerary. According to ticket agents, combining carriers, one-way tickets, or both are options that average consumers would be unlikely to find on their own when searching multiple airline Web sites. These flight search tools often default to ranking flight options in order from the lowest- to the highest-cost flight option, but offer other ranking options as well, such as by particular airline, arrival time, or travel time. Consumers visiting these Web sites can determine which flight options best suit their needs and preferences, for example, by taking a flight at a less popular time, enduring a long layover in order to save money on air fare, or paying more for the convenience of a non-stop flight.

    Further, according to ticket agents, some flight options are offered only by ticket agents and not by airlines. However, according to ticket agents, this is an area in which airlines are increasingly imposing restrictions on OTAs and metasearch entities operating flight search tools. These entities state that certain airlines are prohibiting ticket agents from offering flight options combining one-way fares for different flight segments or from combining segments and fares from multiple carriers. For example, if a consumer wished to fly from Buffalo, New York to Hartford, Connecticut, and then to Washington, DC, and then return to Buffalo, it is often significantly less expensive to buy multiple one-way tickets for this itinerary on different carriers than to purchase this itinerary as one group of flights from one carrier. Some airlines have limited the ability of ticket agents to book this itinerary as a series of one-way flights. Flight search tools that combine one-way fares may save consumers time and provide options the consumer would otherwise not be aware of. Searching for one-way tickets on multiple carrier Web sites to find a multi-carrier itinerary that fits a consumer's needs might not yield results as beneficial to the consumer.

    In addition, discounted tickets that OTAs offer as part of tour packages are not presented on airline Web sites. According to ticket agents, without the ability to efficiently view flight information across multiple airlines on a ticket agent Web site, transactions are less efficient. Consumers may need to visit numerous Web sites more than once in the days before purchasing air transportation to find a current fare for the most cost-effective itinerary to match their travel plans. Ticket agents also note that some Web sites offer consumers the ability to review trends in pricing for various flights so that consumers can theoretically identify the optimal date to purchase a ticket before traveling, on-time performance information for flights, customer reviews of specific itineraries, optimal seat ratings 1 for various aircraft, as well as hotel options, rental cars, and other products like entertainment. Additionally, according to ticket agents, research suggests that the more time-consuming it is for consumers to research and select airfare, the more burdensome, and potentially costly, air travel becomes.

    1 Optimal seat ratings indicate which seats on an aircraft are the best seats to sit in during travel for a specific aircraft type and configuration.

    On the other hand, many airlines state that airline limitations placed on OTAs and metasearch sites operating flight search tools do not harm consumers. Airlines note that they also provide on-time performance information and tour package options. Airlines observe that ticket agents are not alleging that airlines are attempting to place limitations on OTAs or metasearch entity product offerings unrelated to air transportation, nor have they alleged that airlines are trying to restrict displays of customer reviews of itineraries or airline seat ratings. Further, despite placing limitations on some OTAs and metasearch entities operating flight search tools, most airlines allow what they consider to be “desirable” OTAs and metasearch entities to distribute and display the airline's flight information. Meanwhile, some airlines note that one of the largest airlines in the U.S. does not distribute its flight information through GDSs or OTAs. Most consumers, particularly the most price sensitive consumers, generally search multiple Web sites, including those operated by airlines as well as ticket agent flight search tools, before purchasing air transportation. According to the airlines, their actions have not had a significant impact on the search process that consumers use to identify and purchase or reserve air transportation.

    Airlines also observe that there has been a significant consolidation in the ownership of OTAs. Most leisure consumer bookings come through a small number of OTAs. Airlines assert that although consumers may believe they are comparing multiple outlets, several of those outlets are owned by the same parent company. According to airlines, the consolidation of OTAs is significant to flight option distribution and consumers may be harmed by limited OTA competition as those entities consolidate and no longer innovate to compete with each other.

    Moreover, many airlines state that they should maintain ultimate control over how their airline product is offered and displayed to consumers because the flying experience that airlines offer to consumers is a unique product that individual airlines have invested significant resources to develop. For example, airlines state that some ticket agent Web sites do not display airline information in a way that optimizes the product that airlines are offering to consumers. Specifically, some ticket agent Web sites have included outdated airline logos, presented information in what airlines believe to be a disorganized and suboptimal way, and failed to offer customers the tailored experience that airlines offer. Airlines have expressed concern about improper display of airline information or poor customer service experiences that they believe may negatively impact consumer perception of the airlines' brand. Airlines have stated that some examples of poor experiences include excessively long layovers that customers are unaware of when booking through ticket agents, the failure of ticket agents to process refund requests, an inability of ticket agents to accurately relay flight status and other important information to consumers in a timely fashion, and other negative interactions that consumers may attribute to airlines. Some airlines also allege they are concerned about entities that engage in fraudulent activity by selling fraudulent tickets for travel on well-known airlines. In some of these instances, consumers contact the airlines directly to request a refund for an invalid ticket. Airlines are concerned that consumers defrauded by such entities may believe that the airlines are to blame. Certain airlines have demanded that entities that they consider undesirable cease displaying the airline's flights. They have also placed contractual limitations on the ability of GDSs and OTAs to distribute flight information to unapproved entities.

    Further, airlines state that they allow access to their products through numerous OTAs and metasearch entities in addition to their own sites and that consumers are able to shop for air transportation on or through many of those Web sites. Airlines believe that the purchase of air transportation via the internet is an efficient process regardless of whether consumers access flight information through OTAs, metasearch entities that operate flight search tools, or airlines' Web sites. Accordingly, airlines assert that any Department action limiting airlines' ability to control how and where airline flight information is displayed would harm both consumers and the airline's brand. Several airlines also point out that it is in their financial interest to allow reputable OTAs and metasearch entities to display and distribute airline flight information despite a desire to have as many passengers book directly with the airline as possible. Airlines need to make their services available through the outlets that consumers choose to use. Bookings via OTAs in many instances account for a large percentage of airline sales, and referrals from metasearch entities that operate flight search tools are also important. Meanwhile, GDSs have historically included provisions in contracts with airlines that require airlines to offer all of the same fares that the airline offers to ticket agents that subscribe to the GDS. Therefore, in many instances, airlines are not able to offer discounted fares available only from the airline as a way to draw consumers away from purchasing through ticket agents and toward purchasing from the airline on the basis of price. Accordingly, some airlines assert that it is not in their interest, or even commercially viable, to remove flight information from OTA or metasearch entity Web sites entirely. Some airlines have stated that, due to the quantity of bookings that originate on OTA or metasearch entity Web sites, it is unlikely that airlines would ever prevent all OTAs and metasearch sites that operate flight search tools from displaying and/or distributing airline flight information.

    Competition in the Airline Industry and Price Competition

    Some ticket agents assert that Web sites such as theirs can potentially better position new entrant airlines to compete with larger and more established airlines, especially considering recent airline consolidation. They state that new entrant airlines often offer consumers low ticket prices and increase the number of flight options for a given itinerary. This increase in air travel options tends to drive down airfares, which in turn allows more consumers to take advantage of air transportation. Some ticket agents also believe that new entrant airlines benefit from the exposure that they gain by advertising airfares on ticket agent Web sites alongside the fares offered by larger, more established carriers. Some ticket agents allege that by allowing them to display and distribute flight information for all airlines that offer service for a given itinerary, ticket agent Web sites will promote price competition in some of the more concentrated markets where the dominance of legacy airlines and other larger airlines would otherwise lead to higher airfares for consumers.

    Airlines state that airline restrictions on the distribution and display of flight information are unrelated to airline market power. Accordingly, airlines assert that consolidation within the airline industry should not be taken into account when considering the issue of airline restrictions on ticket agent distribution or display of flight information.

    Ticket agents also argue that by displaying flight combinations such as one-way flights or flights on multiple carriers that are not offered by airlines, OTAs and metasearch entities operating flight search tools are creating price competition and improving consumer access to information.

    Airlines counter that not all carriers use non-airline distribution channels such as OTAs or metasearch entities operating flight search tools. According to some airlines, the fact that not every flight option is available through every non-airline flight information outlet does not support the idea that price competition is harmed. According to the airlines, flight information for most airlines is available through a variety of outlets, but more importantly, flight information for every airline is readily available on the airline's own Web site. Moreover, airlines have to publish information on their flights in order to sell tickets. Therefore, they do not believe price competition is harmed simply by some airlines limiting where that airline's flight information is displayed when the information is available elsewhere, such as an airline Web site.

    Request for Information

    The Department has considered the information that has been provided thus far and now requests additional information from all stakeholders—airlines, ticket agents, consumers, and other affected parties. The Department is not proposing to take any specific action at this time. Rather, the Department is requesting information that will assist the Department in determining whether airline restrictions on the distribution and display of flight information are causing consumer harm, are unfair or deceptive in some way, or are anticompetitive. If airline restrictions are causing consumer harm or are unfair, deceptive, or anticompetitive, the comments would assist the Department in determining what action is appropriate, if any. Also, consistent with 49 U.S.C. 40101, the Department places maximum reliance on competitive market forces and on actual and potential competition while preventing unfair, deceptive, predatory, or anticompetitive practices in air transportation. We are also requesting information on the extent to which airline practices that restrict the distribution and display of information on their flights benefit consumers. Further, the Department is specifically requesting information on whether any entities are blocking access to critical resources needed for competitive entry into the air transportation industry and whether Department action in this area would promote a more competitive air transportation marketplace. In addition, the Department is seeking information on whether action in this area would improve consumer access to the information needed to make informed air transportation choices. Information that provides historical or statistical data or peer-reviewed studies will be particularly helpful for determining whether or not Departmental action is appropriate in this area.

    As an initial matter, the Department requests information on the proprietary nature of flight information and whether the widespread availability of that information is relevant to airline restrictions on the distribution or display of flight information. Specifically, when flight information is released to consumers by airlines and made generally available to the public (e.g., published on an airline's Web site), do stakeholders consider this flight information to be factual, non-proprietary information? Do stakeholders consider the airline schedule, fare, or availability information, individually or in combination, the proprietary information of the airline that produces the information? Do stakeholders consider the schedule, fare, and availability information proprietary only when this information is combined in one product but not when distributed separately?

    Consumer Access to Information Needed To Make Informed Air Transportation Choices

    In connection with consumer options for researching and purchasing air transportation, what is the value that OTA or metasearch entity flight search tools provide? To what extent do consumers, including leisure travelers, small businesses and corporate customers, benefit from saved search costs, greater confidence in search results, access to lower fares, or more travel options than they would have obtained from separate searches of individual airline Web sites? In this request for information, have we accurately described the types of actions airlines have taken that impact OTA and metasearch entity Web sites? If not, what are those actions and how do they impact OTA and metasearch entity Web sites? What effect do those actions have on the utility of OTA and metasearch entity Web sites for consumers? Do ticket agents that provide flight search tools offer consumers any flight information that consumers cannot obtain by visiting multiple airline Web sites? What effect does an inability to display schedule, fare or seat availability information of a large, well-known airline, or group of airlines, have on the utility of air travel comparison sites for consumers? Would access to one or two of those categories of airline information without, e.g., seat availability information, be of any practical use on its own?

    It has been pointed out that not all airlines currently distribute information on their flights through OTAs or metasearch entities operating flight search tools and that those tools do not necessarily have the same level of information that is available on airline Web sites. Do airline restrictions currently placed on the distribution and/or display of airline flight information limit the ability of consumers to identify the best flight options available to meet consumer needs? If yes, how? Are the existing limitations of OTA or metasearch entity Web sites relevant to the ability of consumers who use those Web sites to identify the best flight options available to meet consumer needs?

    Airlines' Stated Reasons for Restricting Flight Information

    As explained above, airlines have stated that in some cases they are restricting the sharing and use of their flight information by some Web sites or entities that airlines believe are disreputable or simply do not market the airline's flights in a manner that the airline would like. Some airlines have indicated that OTAs or metasearch entities have provided inaccurate or incomplete information about airline services and products, provided poor customer service, or engaged in marketing practices the airline does not approve of, and have in some cases engaged in fraud. Airlines say such conduct tarnishes the airline brand, and for these reasons airlines are trying to prevent or restrict these entities from marketing and selling their airline's products and services. Thus, airlines claim that their actions to restrict use of their flight information benefit both airlines and consumers. Some airlines also acknowledge that they are attempting to direct more consumers to their own Web sites for financial reasons as well as marketing reasons. Are there any other reasons why airlines are restricting the sharing and use of their flight information? What information is available to determine the scope and magnitude of the problems described by airlines? How many entities engage in the practices as described by airlines, and what portion of the OTA and metasearch market do these entities represent? How many consumers use these Web sites? What is the average number of consumer complaints for each of these issues regarding such entities that airlines receive each year? How would DOT appropriately measure and evaluate the effects of the problems as described by airlines? Is action by DOT necessary to allow airlines to protect their legitimate interests and also ensure that consumers are able to make informed flight choices?

    Effects of Airlines Restricting Use of Flight Information

    We note that flight information is available through airline Web sites. Would a reduction in the availability of airline flight information on non-airline Web sites due to airline restrictions on the distribution and/or display of such information have a significant negative impact on consumers? If so, what are those impacts, and do they disproportionately affect some subsets of consumers? According to the information provided to the Department, no airline has indicated an intent to withdraw completely from ticket agent Web sites. However, if an airline that currently distributes flight information through ticket agent Web sites withdrew completely from those Web sites, would that reduce or eliminate the ability of consumers to identify the most suitable flight options? If not, how many airlines would have to withdraw from ticket agent Web sites to eliminate the ability of consumers to identify the most suitable flight options?

    Is there information to suggest that many airlines will eventually withhold flight information entirely from all or most Web sites that offer flight search tools? How many consumers would fail to investigate more than one airline Web site, with the result that they may not locate the optimal itinerary or fare?

    If it is essential for consumers to be able to view as many airline flight options as possible on OTA and metasearch entity Web sites to identify the best flight options, what information is essential? Is schedule information sufficient or are both schedule and fare information necessary? Do consumers need availability information to identify the best flight options?

    We note that airlines create fare rules and generally do not allow certain combinations of flight segments. Are consumers less likely to combine one-way fares when searching for an itinerary on multiple airline Web sites rather than a ticket agent Web site due to the amount of time it may take to identify these flights and pair them together by making multiple purchases?

    We note that some airlines are placing restrictions on OTAs and metasearch entity Web sites preventing them from displaying codeshare flights, which at times may be the cheapest or most efficient flight options for consumers. Are consumers less likely to discover these codeshare flight options when airlines restrict the display of these flights on OTA and metasearch Web sites? Can consumers gain access to the same information by visiting airline Web sites directly?

    Is Department action in connection with airline distribution practices necessary to ensure consumers have the information they need to make informed choices?

    Competitive Air Transportation Marketplace

    In connection with competition between airlines, we are requesting information on the impact of airline restrictions on the distribution or display of flight information on competition. What value, if any, do OTA and metasearch entity Web sites that operate flight search tools provide in facilitating or enabling competition among airlines? Does having airline information available through multiple outlets, including ticket agent outlets, impact price competition? Would the absence of several airlines that currently participate in ticket agent outlets impact price competition? Does the ability or inability of metasearch entities that operate flight search tools to provide links to OTAs impact price competition?

    If restrictions placed on the distribution and/or display of airline flight information limit the flight options available on Web sites operating flight search tools that market multiple airlines, has that limitation in options led to higher prices for consumers? If so, how? How would restrictions in the future potentially lead to higher prices?

    It is our understanding that most airlines do not permit fare “discounting” by OTAs. Are OTAs or metasearch entities that operate flight search tools able to identify fares that are lower than fares that can be found on airline Web sites? Do OTAs receive discounts from GDSs that allow them to price flights lower than airlines do?

    Some ticket agents have stated that flight search tools are able to identify lower prices on OTA Web sites than are available on airline Web sites and that the lower fare or both fares are displayed absent any airline restriction. If lower prices are identified by OTAs, do these prices serve as a competitive check on airline prices when displayed on flight search tools adjacent to the prices offered by airlines?

    In the past, OTAs negotiated special deals, rates, and promotions from airlines that resulted in consumers obtaining discounted fares. More recently, it is our understanding that contractual arrangements between airlines, GDSs, and OTAs generally include provisions that prevent OTAs or airlines from offering discounted fares that are not available through all other outlets. Accordingly, discounted fares that might otherwise be available to consumers are no longer offered. We request information on how these types of private contractual arrangements impact consumers and whether they are unfair or anticompetitive.

    Resources Needed for Competitive Entry

    Some stakeholders have argued that having flight information for multiple airlines available through the flight search tools of OTAs and metasearch entities provides a platform for smaller and new entrant airlines to compete with larger, better known airlines. They suggest that absent ticket agent Web sites that offer the flight information of multiple airlines, consumers will fly only well-known carriers that they recognize from advertisements and the airline's continuous length of operation in a given market. If OTA and metasearch entity Web sites do not provide the flight information of larger, better known airlines, will consumers stop using those Web sites? If consumers do not use those Web sites, and instead search only airline Web sites, will that impact the ability of smaller or new entrant airlines to compete with larger, better known airlines because consumers will not search Web sites that do not include the largest airlines? Conversely, would the ability of new entrant airlines to compete with larger airlines be enhanced by the reduced competition on those sites if large, well-known airlines limit or do not permit information on their flights to be displayed on OTA or metasearch entity Web sites and consumers therefore find only smaller airline flight options on those sites? Is Department action in this area necessary to ensure that airline restrictions on the distribution or display of flight information do not harm competition? If so, what action is appropriate?

    We are requesting information on all of the issues and concerns identified above and any information relevant to this issue.

    Issued this 18th day of October 2016, in Washington, DC. Molly J. Moran, Acting General Counsel.
    [FR Doc. 2016-26191 Filed 10-28-16; 8:45 am] BILLING CODE 4910-9X-P
    DEPARTMENT OF THE TREASURY Bureau of the Fiscal Service Fee Schedule for the Transfer of U.S. Treasury Book-Entry Securities Held on the National Book-Entry System AGENCY:

    Bureau of the Fiscal Service, Fiscal Service, Treasury.

    ACTION:

    Notice.

    SUMMARY:

    The Department of the Treasury (Treasury) is announcing a new fee schedule applicable to transfers of U.S. Treasury book-entry securities maintained on the National Book-Entry System (NBES) that occur on or after January 3, 2017.

    DATES:

    Effective January 3, 2017.

    FOR FURTHER INFORMATION CONTACT:

    Brandon Taylor or Janeene Wilson, Bureau of the Fiscal Service, 202-504-3550.

    SUPPLEMENTARY INFORMATION:

    Treasury has established a fee structure for the transfer of Treasury book-entry securities maintained on NBES. Treasury reassesses this fee structure periodically based on our review of the latest book-entry costs and volumes.

    For each Treasury securities transfer or reversal sent or received on or after January 3, 2017, the basic fee will increase from $0.81 to $0.93. The Federal Reserve System also charges a funds movement fee for each of these transactions for the funds settlement component of a Treasury securities transfer.1 The surcharge for an off-line Treasury book-entry securities transfer will increase from $50.00 to $70.00. Off-line refers to the sending and receiving of transfer messages to or from a Federal Reserve Bank by means other than on-line access, such as by written, facsimile, or telephone voice instruction. The basic transfer fee assessed to both sends and receives is reflective of costs associated with the processing of securities transfers. The off-line surcharge, which is in addition to the basic fee and the funds movement fee, reflects the additional processing costs associated with the manual processing of off-line securities transfers.

    1 The Board of Governors of the Federal Reserve System sets this fee separately from the fees assessed by Treasury. As of January 4, 2016, that fee was $0.11 per transaction. For a current listing of the Federal Reserve System's fees, please refer to https://www.frbservices.org/servicefees/.
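
    For illustration only, and assuming the Federal Reserve funds movement fee remains $0.11 per transaction as noted in footnote 1 (that fee is set separately by the Board of Governors and may change), the total charge for a single transfer sent on or after January 3, 2017 would be:

    Off-line transfer originated: $0.93 (basic fee) + $70.00 (off-line surcharge) + $0.11 (funds movement fee) = $71.04
    On-line transfer originated:  $0.93 (basic fee) + $0.11 (funds movement fee) = $1.04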

    Treasury does not charge a fee for account maintenance, the stripping and reconstitution of Treasury securities, the wires associated with original issues, or interest and redemption payments. Treasury currently absorbs these costs.

    The fees described in this notice apply only to the transfer of Treasury book-entry securities held on NBES. Information concerning fees for book-entry transfers of Government Agency securities, which are priced by the Federal Reserve, is set out in a separate Federal Register notice published by the Federal Reserve.

    The following is the Treasury fee schedule that will take effect on January 3, 2017, for book-entry transfers on NBES:

    Treasury—NBES Fee Schedule—Effective January 3, 2017
    [In dollars]

    Transfer type                            Basic fee   Off-line surcharge
    On-line transfer originated              0.93        N/A
    On-line transfer received                0.93        N/A
    On-line reversal transfer originated     0.93        N/A
    On-line reversal transfer received       0.93        N/A
    Off-line transfer originated             0.93        70.00
    Off-line transfer received               0.93        70.00
    Off-line account switch received         0.93        0.00
    Off-line reversal transfer originated    0.93        70.00
    Off-line reversal transfer received      0.93        70.00
    Authority:

    31 CFR 357.45.

    David A. Lebryk, Fiscal Assistant Secretary.
    [FR Doc. 2016-26079 Filed 10-27-16; 11:15 am] BILLING CODE 4810-AS-P
    DEPARTMENT OF THE TREASURY Office of Foreign Assets Control Unblocking of Specially Designated Nationals and Blocked Persons Resulting From the Termination of the National Emergency and Revocation of Executive Orders Related to Burma AGENCY:

    Office of Foreign Assets Control, Treasury.

    ACTION:

    Notice.

    SUMMARY:

    The Treasury Department's Office of Foreign Assets Control (OFAC) is removing from the Specially Designated Nationals and Blocked Persons List (SDN List) the names of the persons listed below whose property and interests in property had been blocked pursuant to Executive Order 13310 of July 28, 2003 (Blocking Property of the Government of Burma and Prohibiting Certain Transactions), Executive Order 13448 of October 18, 2007 (Blocking Property and Prohibiting Certain Transactions Related to Burma), Executive Order 13464 of April 30, 2008 (Blocking Property and Prohibiting Certain Transactions Related To Burma), and Executive Order 13619 of July 11, 2012 (Blocking Property of Persons Threatening the Peace, Security, or Stability of Burma).

    DATES:

    OFAC's actions described in this notice are effective as of October 7, 2016.

    FOR FURTHER INFORMATION CONTACT:

    Associate Director for Global Targeting, tel.: 202/622-2420, Assistant Director for Sanctions Compliance & Evaluation, tel.: 202/622-2490, Assistant Director for Licensing, tel.: 202/622-2480, Office of Foreign Assets Control, or Chief Counsel (Foreign Assets Control), tel.: 202/622-2410 (not toll free numbers).

    SUPPLEMENTARY INFORMATION:

    Electronic and Facsimile Availability

    The SDN List and additional information concerning OFAC sanctions programs are available from OFAC's Web site (https://www.treasury.gov/resource-center/sanctions/Pages/default.aspx).

    Notice of OFAC Actions

    On October 7, 2016, the President signed an Executive Order terminating the national emergency declared in Executive Order 13047 of May 20, 1997 (Prohibiting New Investment in Burma), and revoking that order, Executive Order 13310 of July 28, 2003 (Blocking Property of the Government of Burma and Prohibiting Certain Transactions), Executive Order 13448 of October 18, 2007 (Blocking Property and Prohibiting Certain Transactions Related to Burma), Executive Order 13464 of April 30, 2008 (Blocking Property and Prohibiting Certain Transactions Related To Burma), Executive Order 13619 of July 11, 2012 (Blocking Property of Persons Threatening the Peace, Security, or Stability of Burma), and Executive Order 13651 of August 6, 2013 (Prohibiting Certain Imports of Burmese Jadeite and Rubies).

    As such, the following individuals and entities are no longer subject to the blocking provisions in any of the Burma-related Executive Orders revoked by the President and are being removed from the SDN List as of the effective date of Executive Order 13742 of October 7, 2016, Termination of Emergency With Respect to the Actions and Policies of the Government of Burma:

    1. HTOO TRADING COMPANY LIMITED (a.k.a. HTOO TRADING GROUP COMPANY), 5 Pyay Road, Hlaing Township, Yangon, Burma [BURMA].

    2. HTOO WOOD PRODUCTS PTE. LIMITED (a.k.a. HTOO FURNITURE; a.k.a. HTOO WOOD; a.k.a. HTOO WOOD PRODUCTS; a.k.a. HTOO WOOD-BASED INDUSTRY), 3 Shenton Way, #24-02 Shenton House, Singapore, 068805, Singapore; 5 Pyay Road, Hlaing Township, Yangon, Burma; Shwe Pyithar T/S, Tangon, Burma; No. 21 Thukha Waddy Road, Yankin T/S, Yangon, Burma [BURMA].

    3. HTOO GROUP OF COMPANIES, 5 Pyay Road, Hlaing Township, Yangon, Burma [BURMA].

    4. HTAY, Thein; DOB 07 Sep 1955; POB Taunggyi, Burma; Lieutenant General; Chief of Defence Industries; Chief of Army Ordnance Industries (individual) [BURMA].

    5. HOTEL MAX (a.k.a. HOTEL CHAUNG THA BEACH RESORT), No. 1 Ywama Curve, Ba Yint Naung Road, Block-2, Hlaing Township, Yangon, Burma [BURMA].

    6. LWIN, Saw, Burma; DOB 1939; alt. nationality Burma; alt. citizen Burma; Major General, Minister of Industry 2 (individual) [BURMA].

    7. MANN, Aung Thet (a.k.a. KO, Shwe Mann Ko), c/o Htoo Trading Company Limited, undetermined; c/o Htoo Group of Companies, undetermined; c/o Ayer Shwe Wah Company Limited, undetermined; DOB 19 Jun 1977 (individual) [BURMA].

    8. MAX (MYANMAR) CONSTRUCTION CO., LTD, 1 Ywama Curve, Bayint Naung Road, Ward (2), Hlaing Township, Yangon, Burma [BURMA].

    9. MAX MYANMAR GEMS AND JEWELLERY CO., LTD., 1 Ywama Curve, Bayint Naung Road, Ward (2), Hlaing Township, Yangon, Burma [BURMA].

    10. MAX MYANMAR GROUP OF COMPANIES (a.k.a. MAX MYANMAR; a.k.a. MAX MYANMAR CO.; a.k.a. MAX MYANMAR COMPANY LIMITED; a.k.a. MAX MYANMAR GROUP), No. 1 Ywama Curve, Ba Yint Naung Road, Ward (2), Hlaing Township, Yangon, Burma [BURMA].

    11. MAX MYANMAR MANUFACTURING CO., LTD., 1 Ywama Curve, Bayint Naung Road, Ward (2), Hlaing Township, Yangon, Burma [BURMA].

    12. PAVO AIRCRAFT LEASING PTE. LTD., 3 Shenton Way, #24-02 Shenton House 068805, Singapore [BURMA].

    13. OO, Tin Aung Myint (a.k.a. OO, Thiha Thura Tin Aung Myint); DOB 27 May 1950; nationality Burma; citizen Burma; Lieutenant-General; Quartermaster General; Minister of Military Affairs; Member, State Peace and Development Council (individual) [BURMA].

    14. OO, Maung; DOB 1952; nationality Burma; citizen Burma; Major General; Minister of Home Affairs (individual) [BURMA].

    15. OO, Kyaw Nyunt; DOB 30 Jun 1959; Lieutenant Colonel; Staff Officer (Grade 1), D.D.I. (individual) [BURMA].

    16. NYEIN, Chan (a.k.a. NYEIN, Chan, Dr.; a.k.a. NYEIN, Chang, Dr.), Burma; DOB 1944; alt. nationality Burma; alt. citizen Burma; Minister of Education (individual) [BURMA].

    17. NG, Sor Hong (a.k.a. LAW, Cecilia; a.k.a. LO, Cecilia; a.k.a. NG, Cecilia), 150 Prince Charles Crescent, #18-03, Singapore 159012, Singapore; 3 Shenton Way, #10-01 Shenton House, Singapore 068805, Singapore; DOB 1958; citizen Singapore; Identification Number S1481823E (Singapore); Chief Executive, Managing Director, and Owner, Golden Aaron Pte. Ltd., Singapore; Director and Owner, G A Ardmore Pte. Ltd., Singapore; Chief Executive, Director and Owner, G A Capital Pte. Ltd., Singapore; Director and Owner, G A Foodstuffs Pte. Ltd., Singapore; Chief Executive, Director and Owner, G A Land Pte. Ltd., Singapore; Director and Owner, G A Resort Pte. Ltd., Singapore; Chief Executive, Director and Owner, G A Sentosa Pte. Ltd., Singapore; Chief Executive, Director and Owner, G A Treasure Pte. Ltd., Singapore; Director and Owner, G A Whitehouse Pte. Ltd., Singapore; Chief Executive, Manager, and Owner, S H Ng Trading Pte. Ltd., Singapore (individual) [BURMA].

    18. MYINT, Ye; DOB 21 Oct 1943; nationality Burma; citizen Burma; Lieutenant-General; Chief, Military Affairs; Chief, Bureau of Special Operation 1; Member, State Peace and Development Council (individual) [BURMA].

    19. MYINT, Tin Lin (a.k.a. MYINT, Daw Tin Lin); DOB 25 Jan 1947; wife of Ye Myint (individual) [BURMA].

    20. MYINT, Kyaw (a.k.a. MYINT, Kyaw, Dr.), Burma; DOB 1940; alt. nationality Burma; alt. citizen Burma; Minister of Health (individual) [BURMA].

    21. MYINT, Htay (a.k.a. MYINT, U Htay), Burma; DOB 06 Feb 1955; nationality Burma; citizen Burma; Chairman, Yuzana Company Limited (individual) [BURMA].

    22. MYAWADDY TRADING LTD. (a.k.a. MYAWADDY TRADING CO.), 189-191 Maha Bandoola Street, Botataung P.O, Yangon, Burma [BURMA].

    23. MYANMAR ECONOMIC CORPORATION (a.k.a. MEC), 74-76 Shwedagon Pagoda Road, Dagon Township, Yangon, Burma [BURMA].

    24. MYANMAR IMPERIAL JADE CO., LTD, 22 Sule Pagoda Road, Mayangone Township, Yangon, Burma [BURMA].

    25. MYANMAR IVANHOE COPPER COMPANY LIMITED (a.k.a. MICCL; a.k.a. MONYWA JVCO; a.k.a. MYANMAR IVANHOE COPPER CO. LTD.), 70 (I) Bo Chein Street, 6.5 miles Pyay Road, Yangon, Burma; 70 (I) Bo Chein Street, Pyay Road, Hlaing Township, Yangon, Burma; Monywa, Sagaing Division, Burma [BURMA].

    26. STATE PEACE AND DEVELOPMENT COUNCIL OF BURMA [BURMA].

    27. THA, Soe, Burma; DOB 1945; alt. nationality Burma; alt. citizen Burma; Minister of National Planning and Economic Development (individual) [BURMA].

    28. THAUNG (a.k.a. THAUNG, U), Burma; DOB 06 Jul 1937; alt. nationality Burma; alt. citizen Burma; Minister of Labor; Minister of Science & Technology (individual) [BURMA].

    29. ASIA WORLD PORT MANAGEMENT CO. LTD (a.k.a. ASIA WORLD PORT MANAGEMENT; a.k.a. “PORT MANAGEMENT CO. LTD.”), 61-62 Wartan St, Bahosi Yeiktha, Rangoon, Burma [BURMA].

    30. AUREUM PALACE HOTELS AND RESORTS (a.k.a. AUREUEM PALACE HOTEL AND RESORT (BAGAN); a.k.a. AUREUEM PALACE HOTEL AND RESORT (NGAPALI); a.k.a. AUREUM PALACE HOTEL AND RESORT (NGWE SAUNG); a.k.a. AUREUM PALACE HOTEL AND RESORT GROUP CO. LTD.; a.k.a. AUREUM PALACE HOTEL RESORT; a.k.a. AUREUM PALACE RESORTS; a.k.a. AUREUM PALACE RESORTS AND SPA), No. 41 Shwe Taung Gyar Street, Bahan Township, Yangon, Burma; Thandwe, Rakhine, Burma [BURMA].

    31. ASIA WORLD INDUSTRIES LTD., No. 21/22 Upper Pansodan St., Aung San Stadium (East Wing), Mingalar Taung Nyunt, Rangoon, Burma [BURMA].

    32. AYE, Maung; DOB 25 Dec 1937; nationality Burma; citizen Burma; Vice Senior General; Vice-Chairman of the State Peace and Development Council; Deputy Commander-in-Chief, Myanmar Defense Services (Tatmadaw); Commander-in-Chief, Myanmar Army (individual) [BURMA].

    33. AYER SHWE WAH COMPANY LIMITED (a.k.a. AYE YAR SHWE WAH; a.k.a. AYER SHWE WA; a.k.a. AYEYA SHWE WAR COMPANY), 5 Pyay Road, Hlaing Township, Yangon, Burma [BURMA].

    34. AYEYARWADY BANK (a.k.a. AYEYARWADDY BANK LTD; a.k.a. IRRAWADDY BANK), Block (111-112), Asint Myint Zay, Zabu Thiri Township, Nay Pyi Taw, Burma; No. 1 Ywama Curve, Ba Yint Naung Road, Block (2), Hlaing Township, Yangon, Burma; SWIFT/BIC AYAB MM MY [BURMA].

    35. DIRECTORATE OF DEFENCE INDUSTRIES (a.k.a. KA PA SA; a.k.a. “DDI”), Burma; Ministry of Defence, Shwedagon Pagoda Road, Yangon, Burma [BURMA].

    36. ESPACE AVENIR EXECUTIVE SERVICED APARTMENT (a.k.a. ESPACE AVENIR), No. 523, Pyay Road, Kamaryut Township, Yangon, Burma [BURMA].

    37. EXCELLENCE MINERAL MANUFACTURING CO., LTD., Plot No. (142), U Ta Yuoat Gyi Street, Industrial Zone No. (4), Hlaing Thar Yar Township, Yangon, Burma [BURMA].

    38. G A ARDMORE PTE. LTD., 3 Shenton Way, #10-01 Shenton House, Singapore 068805, Singapore [BURMA].

    39. G A ARDMORE PTE. LTD., 101 Cecil Street, 08-08 Tong Eng Building, Singapore 069533, Singapore; 3 Shenton Way, 10-01 Shenton House, Singapore 068805, Singapore [BURMA].

    40. G A CAPITAL PTE. LTD., 101 Cecil Street, 08-08 Tong Eng Building, Singapore 069533, Singapore [BURMA].

    41. LAW, Steven (a.k.a. CHUNG, Lo Ping; a.k.a. HTUN MYINT NAING; a.k.a. LAW, Stephen; a.k.a. LO, Ping Han; a.k.a. LO, Ping Hau; a.k.a. LO, Ping Zhong; a.k.a. LO, Steven; a.k.a. TUN MYINT NAING; a.k.a. U MYINT NAING), No. 124 Insein Road, Ward (9), Hlaing Township, Rangoon, Burma; 61-62 Bahosi Development Housing, Wadan St., Lanmadaw Township, Rangoon, Burma; 330 Strand Rd., Latha Township, Rangoon, Burma; 8A Jalan Teliti, Singapore, Singapore; 3 Shenton Way, #10-01 Shenton House, Singapore 068805, Singapore; DOB 16 May 1958; alt. DOB 27 Aug 1960; POB Lashio, Burma; citizen Burma; Passport 937174 (Burma) (individual) [BURMA].

    42. KO, Myint Myint (a.k.a. KO, Daw Myint Myint); DOB 11 Jan 1946; wife of Saw Tun (individual) [BURMA].

    43. INNWA BANK LTD (a.k.a. INNWA BANK), 554-556 Corner of Merchant Street and 35th Street, Kyauktada Township, Yangon, Burma; SWIFT/BIC AVAB MM M1 [BURMA].

    44. HTWE, Aung; DOB 01 Feb 1943; nationality Burma; citizen Burma; Lieutenant-General; Chief of Armed Forces Training; Member, State Peace and Development Council (individual) [BURMA].

    45. GOLD OCEAN PTE LTD, 101 Cecil Street #08-08, Tong Eng Building, Singapore 069533, Singapore; 1 Scotts Road, #21-07/08 Shaw Centre, Singapore 228208, Singapore [BURMA].

    46. GOLD ENERGY CO. LTD., No. 74 Lan Thit Road, Insein Township, Rangoon, Burma; Taungngu (Tungoo) Branch, Karen State, Burma [BURMA].

    47. G A FOODSTUFFS PTE. LTD., 101 Cecil Street, 08-08 Tong Eng Building, Singapore 069533, Singapore [BURMA].

    48. SOE MIN HTAIK CO. LTD. (a.k.a. SOE MIN HTIKE CO., LTD.; a.k.a. SOE MIN JTIAK CO. LTD.; a.k.a. SOE MING HTIKE), No. 4, 6A Kabaaye Pagoda Road, Mayangon Township, Yangon, Burma; No. 3, Kan Street, No. 10 Ward, Hlaing Township, Yangon, Burma [BURMA].

    49. SOE, Myint Myint (a.k.a. SOE, Daw Myint Myint); DOB 15 Jan 1953; wife of Nyan Win (individual) [BURMA].

    50. SWE, Myint; DOB 24 Jun 1951; nationality Burma; citizen Burma; Lieutenant-General; Chief of Military Affairs Security (individual) [BURMA].

    51. TERRESTRIAL PTE. LTD., 3 Raffles Place, #06-01 Bharat Building, Singapore 048617, Singapore; 10 Anson Road, #23-16 International Plaza, Singapore 079903, Singapore [BURMA].

    52. THAUNG, Aung, No. 1099, PuBa Thiri Township, Ottara (South) Ward, Nay Pyi Taw, Burma; DOB 01 Dec 1940; POB Kyauk Kaw Village, Thaung Tha Township, Burma; Gender Male; National ID No. 13/KaLaNa (Naing) 011849 (Burma); Lower House Member of Parliament (individual) [BURMA].

    53. THEIN, Tin Naing, Burma; DOB 1955; alt. nationality Burma; alt. citizen Burma; Brigadier General, Minister of Commerce (individual) [BURMA].

    54. THI, Lun; DOB 18 Jul 1940; nationality Burma; citizen Burma; Brigadier-General; Minister of Energy (individual) [BURMA].

    55. TUN, Hla, Burma; DOB 11 Jul 1951; alt. nationality Burma; alt. citizen Burma; Major General, Minister of Finance and Revenue (individual) [BURMA].

    56. TUN, Saw, Burma; DOB 08 May 1935; alt. nationality Burma; alt. citizen Burma; Major General, Minister of Construction (individual) [BURMA].

    57. UNION OF MYANMAR ECONOMIC HOLDINGS LIMITED (a.k.a. MYANMAR ECONOMIC HOLDINGS LIMITED; a.k.a. UMEH; a.k.a. UNION OF MYANMAR ECONOMIC HOLDINGS COMPANY LIMITED), 189-191 Maha Bandoola Road, Botahtaung Township, Yangon, Burma [BURMA].

    58. LO, Hsing Han (a.k.a. LAW, Hsit-han; a.k.a. LO, Hsin Han; a.k.a. LO, Hsing-han), 20-23 Masoeyein Kyang St., Mayangone, Rangoon, Burma; 20B Massoeyein St., 9 Mile, Rangoon, Burma, Burma; 60-61 Strand Rd., Latha Township, Rangoon, Burma; 330 Strand Rd, Latha Township, Rangoon, Burma; 20 Wingabar Rd, Rangoon, Burma; 36 19th St., Lower Blk, Latha Township, Rangoon, Burma; 47 Latha St., Latha Township, Rangoon, Burma; 152 Sule Pagoda Rd, Rangoon, Burma; 126A Damazedi Rd, Bahan Township, Rangoon, Burma; DOB 1938; alt. DOB 1935 (individual) [BURMA].

    59. GREEN LUCK TRADING COMPANY (a.k.a. GREEN LUCK TRADING COMPANY LIMITED), No. 61/62 Bahosi Development, Wadan Street, Lanmadaw Township, Rangoon, Burma; No. 74 Lan Thit Street, Insein Township, Rangoon, Burma [BURMA].

    60. GOLDEN AARON PTE. LTD. (a.k.a. CHINA FOCUS DEVELOPMENT; a.k.a. CHINA FOCUS DEVELOPMENT LIMITED; a.k.a. CHINA FOCUS DEVELOPMENT LTD.), 3 Shenton Way, 10-01, Shenton House, Singapore 068805, Singapore; 101 Cecil Street, 08-08 Tong Eng Building, Singapore 069533, Singapore; China; Unit 2612A, Kuntai International Center, No. 12 Chaowai Street, Chaoyang District, Beijing 100020, China [BURMA].

    61. GREAT SUCCESS PTE. LTD., 1 Scotts Road, #21/07-08 Shaw Centre, Singapore, 228208, Singapore; 101 Cecil Street #08-08, Tong Eng Building, Singapore, 069533, Singapore [BURMA].

    62. G A LAND PTE. LTD., 1 Scotts Road, 21-07/08 Shaw House, Singapore 228208, Singapore [BURMA].

    63. G A RESORT PTE. LTD., 1 Scotts Road, 21-07 Shaw House, Singapore 228208, Singapore; 3 Shenton Way, 10-01 Shenton House, Singapore 068805, Singapore [BURMA].

    64. G A RESORT PTE. LTD., 3 Shenton Way, #10-01 Shenton House, Singapore 068805, Singapore [BURMA].

    65. G A TREASURE PTE. LTD., 3 Shenton Way, #10-01 Shenton House, Singapore 068805, Singapore [BURMA].

    66. G A TREASURE PTE. LTD., 3 Shenton Way, 10-01 Shenton House, Singapore 068805, Singapore [BURMA].

    67. G A WHITEHOUSE PTE. LTD., 3 Shenton Way, #10-01 Shenton House, Singapore 068805, Singapore [BURMA].

    68. G A WHITEHOUSE PTE. LTD., 3 Shenton Way, 10-01 Shenton House, Singapore 068805, Singapore [BURMA].

    69. LIN, Aung Thein (a.k.a. “LYNN, Aung Thein”), Burma; DOB 1952; alt. nationality Burma; alt. citizen Burma; Brigadier General, Mayor and Chairman of Yangon City (Rangoon) City Development Committee (individual) [BURMA].

    70. ZAY GABAR COMPANY (a.k.a. ZAYKABAR COMPANY), Burma [BURMA].

    71. ZAW, Thein, Burma; DOB 20 Oct 1951; alt. nationality Burma; alt. citizen Burma; Brigadier General, Minister of Telecommunications, Post, & Telegraph (individual) [BURMA].

    72. ZA, Pye Phyo Tay, Burma; 6 Cairnhill Circle, Number 18-07, Cairnhill Crest 229813, Singapore; DOB 29 Jan 1987; nationality Burma; citizen Burma; Son of Tay Za. (individual) [BURMA].

    73. ASIA WORLD CO. LTD. (a.k.a. ASIA WORLD), 61-62 Bahosi Development Housing, Wadan St., Lanmadaw Township, Rangoon, Burma [BURMA].

    74. ZAW, Zaw (a.k.a. ZAW, U Zaw); DOB 22 Oct 1966; nationality Burma; citizen Burma; Passport 828461 (Burma) issued 18 May 2006 expires 17 May 2009 (individual) [BURMA] (Linked To: HOTEL MAX; Linked To: MAX MYANMAR GROUP OF COMPANIES; Linked To: MAX SINGAPORE INTERNATIONAL PTE. LTD.).

    75. SENTOSA TREASURE PTE. LTD., 3 Shenton Way, 10-01 Shenton House, Singapore 068805, Singapore [BURMA].

    76. SHWE NAR WAH COMPANY LIMITED, No. 39/40, Bogyoke Aung San Road, Bahosi Housing, Lanmadaw, Rangoon, Burma; Registration ID 1922/2007-2008 (Burma) [BURMA] (Linked To: LAW, Steven).

    77. SHWE, Khin (a.k.a. SHWE, Khin, Dr.), Burma; DOB 21 Jan 1952; alt. nationality Burma; alt. citizen Burma; President, Zay Gabar Company (individual) [BURMA].

    78. SHWE, Than; DOB 02 Feb 1935; alt. DOB 02 Feb 1933; nationality Burma; citizen Burma; Senior General, Minister of Defense and Commander-in-Chief of Defense Services; Chairman, State Peace and Development Council (individual) [BURMA].

    79. MIN, Zaw, Burma; DOB 10 Jan 1949; alt. nationality Burma; alt. citizen Burma; Colonel, Minister of Electric Power 1 (individual) [BURMA].

    80. G A CAPITAL PTE. LTD., 3 Shenton Way, #10-01 Shenton House, Singapore 068805, Singapore [BURMA].

    81. AIR BAGAN HOLDINGS PTE. LTD. (a.k.a. AIR BAGAN; a.k.a. AIRBAGAN), 545 Orchard Road, #01-04 Far East Shopping Centre, Singapore 238882, Singapore; 56 Shwe Taung Gyar Street, Bahan Township, Yangon, Burma; 9, 78th Street, Bet, 33rd and 34th Street, Mandalay, Burma; 134 Bogyoke Street, Myoma Quarter, Taunggyi, Burma; 3, Aung Thate Di Quarter, Nyaung U, Burma; Sandoway Inn, Thandwe, Burma; Pathein Hotel, Kanthonesint, Petheing-Monywa Road, Burma; 572 Ye Yeik That Street, Pear Ayekari Hotel, Myauk Ywa Quarter, Burma; 48 Quarter 2, Zay Tan Lay Yat, Kyaing Tong, Burma; 156 Bogyoke Aung San Road, Aung Chan Thar Building, San Sai Quarter, Tachileik, Burma; Myeik Golf Club, Pearl Mon Hotel, Airport Junction, Myeik, Burma; 244 Bet, Duwa Za Junn & Bayin Naung St., Third Quarter, Myitkyina, Burma; 414 Bogyoke Road, Kaw Thaung, Burma; Room (2), YMCA Building, Bogyoke Aung San Road, Forestry Quarter, Taunggyi, Burma; No. 407, Zei Phyu Kone Quarter, Near Ngapali Junction, Thandwe, Burma; No. Mitharsu (Family Video), No. 131/B Zay Taung Bak Lane, Zayit Quarter, Dawei, Burma; No. 13 (B), Zay Tan Gyi Street, Quarter (3), Zay Than Gyi Quarter, Kyaing Tong, Burma; 179 (Nya) Bogyoke Road, San Sai (Kha) Quarter, Tachileik, Burma; No. E (4), Construction Housing, Sumbrabun Road, Ayar Quarter, Myitkyina, Burma; No. 445, Anawa Quarter, Myinttzu Thaka Road, Kawthaung, Burma; No. 4, Naypyidaw Airport Compound, Naypyidaw, Burma; Kalaymyo, Red Cross Building, Bogyoke Street, Kalay Myo, Burma; Room-17, Stadium Building, Theinni Main Road, 12 Quarter, Lashio, Burma; Unit #310, 3rd Floor, Silom Complex, 191 Silom Road, Silom Bangrak, Bangkok 10500, Thailand; Room No. T1-112 & T-112A, Level 1, Main Terminal Building, Suvarnabhumi Airport, Bangpli, Ssamutprakarn 10540, Thailand; Doing business as AIR BAGAN [BURMA].

    82. ASIA GREEN DEVELOPMENT BANK (a.k.a. AGD BANK), 168 Thiri Yatanar Shopping Complex, Zabu Thiri Township, Nay Pyi Taw, Burma; 73/75 Sule Pagoda Road, Pabedan Township, Yangon, Burma; SWIFT/BIC AGDB MM MY [BURMA].

    83. AIR BAGAN LIMITED (a.k.a. AIR BAGAN), 56 Shwe Taung Gyar Street, Bahan Township, Yangon, Burma; 9, 78th Street, Bet, Mandalay, Burma; 134 Bogyoke Street, Myoma Quarter, Taunggyi, Burma; 3, Aung Thate Di Quarter, Nyaung U, Burma; Sandoway Inn, Thandwe, Burma; Pathein Hotel, Kanthonesint, Petheing-Monywa Road, Burma; 572 Ye Yeik Tha Street, Pear Ayekari Hotel, Myauk Ywa Quarter, Burma; 48 Quarter 2, Zay Tan Lay Yat, Kyaing Tong, Burma; 156 Bogyoke Aung San Road, Aung Chan Thar Building, San Sai Quarter, Tachileik, Burma; Myeik Golf Club, Pearl Mon Hotel, Airport Junction, Myeik, Burma; 244 Bet, Duwa Zaw Junn & Bayin Naung St., Thida Quarter, Myitkyina, Burma; 414 Bogyoke Road, Kaw Thaung, Burma; No.6/88, 6 Quarter, Lalway, Naypyitaw, Burma; Kalaymyo, Red Cross Building, Bogyoke Street, Kalay Myo, Burma; Room (2), YMCA Building, Bogyoke Aung San Road, Forestry Quarter, Taunggyi, Burma; No. 407, Zei Phyu Kone Quarter, Near Ngapali Junction, Thandwe, Burma; No. Mitharsu (Family Video), No. 131/B Zay Taung Bak Lane, Zayit Quarter, Dawei, Burma; No. 13 (B) Zay Tan Gyi Street, Quarter (3), Zay Than Gyi Quarter, Kyaing Tong, Burma; 179 (Nya) Bogyoke Road, San Sai (Kha) Quarter, Tachileik, Burma; No. E (4), Construction Housing, Sumbrabun Road, Ayar Quarter, Myitkyina, Burma; No. 445, Anawa Quarter, Myinttzu Thaka Road, Kawthaung, Burma; No. 4, Naypyidaw, Airport Compound, Naypyidaw, Burma; Room-17, Stadium Building, Theinni Main Road, 12 Quarter, Lashio, Burma; Unit #310, 3rd Floor, Silom Complex, 191 Silom Road, Silom Bangrak, Bangkok 10500, Thailand; Room No. T1-112 & T1-112A, Level 1, Main Terminal Building, Suvarnabhumi Airport, Bangpli, Ssamutprakarn 10540, Thailand; Doing business as AIR BAGAN. [BURMA].

    84. ASIA LIGHT CO. LTD., Mingalar Taung Nyunt Tower, 6 Upper Pansoden Street, Aung San Stadium Eastern Wing, Rangoon, Burma; 15/19 Kunjan Rd., S Aung San Std, Rangoon, Burma [BURMA].

    85. ASIA MEGA LINK CO., LTD., No. 39/40, Bogyoke Aung San Road, Bahosi Housing, Lanmadaw, Rangoon, Burma; Registration ID 1679/2009-2010 (Burma) [BURMA] (Linked To: ASIA WORLD CO. LTD.).

    86. ASIA MEGA LINK SERVICES CO., LTD., No. 44/45, Bogyoke Aung San Road, Bahosi Housing Complex, Lanmadaw, Rangoon, Burma; Registration ID 2652/2010-2011 (Burma) [BURMA] (Linked To: ASIA WORLD CO. LTD.).

    87. ASIA METAL COMPANY LIMITED, No. 106 Pan Pe Khaung Maung Khtet Road, Industrial Zone (4), Shwe Pyi Thar Township, Yangon, Burma; No. (40) Yangon-Mandalay Road, Kywe Sekan, Pyay Gyi Tagon Township, Mandalay, Burma; No. A/B (1-5), Paung Laung (24) Street, Ext., Ward (2), Nay Pyi Taw, Pyinmana, Burma; Web site http://www.amcsteel.com; Email Address [email protected] [BURMA].

    88. ASIA PIONEER IMPEX PTE. LTD., 10 Anson Road, #23-16 International Plaza, Singapore 079903, Singapore [BURMA].

    89. MYAWADDY BANK LTD. (a.k.a. MYAWADDY BANK), 24/26 Sule Pagoda Road, Yangon, Burma [BURMA].

    90. MYANMAR TREASURE RESORTS (a.k.a. MYANMAR TREASURE BEACH RESORT; a.k.a. MYANMAR TREASURE BEACH RESORTS; a.k.a. MYANMAR TREASURE RESORT (BAGAN); a.k.a. MYANMAR TREASURE RESORT (PATHEIN); a.k.a. “MYANMAR TREASURE RESORT II”), No. 41 Shwe Taung Gyar Street, Bahan Township, Yangon, Burma; No 56 Shwe Taung Gyar Road, Golden Valley, Bahan Township, Yangon, Burma [BURMA].

    91. MYANMAR RUBY ENTERPRISE CO. LTD. (a.k.a. MYANMAR RUBY ENTERPRISE), 24/26 Sule Pagoda Road, Kyauktada Township, Yangon, Burma [BURMA].

    92. MAX MYANMAR SERVICES CO., LTD., 1 Ywama Curve, Bayint Naung Road, Ward (2), Hlaing Township, Yangon, Burma [BURMA].

    93. MAX MYANMAR TRADING CO., LTD., 1 Ywama Curve, Bayint Naung Road, Ward (2), Hlaing Township, Yangon, Burma [BURMA].

    94. MAX SINGAPORE INTERNATIONAL PTE. LTD., 3 Shenton Way, #24-02, Shenton House 068805, Singapore [BURMA].

    95. MYANMAR AVIA EXPORT COMPANY LIMITED (a.k.a. MYANMAR AVIA EXPORT) [BURMA].

    96. YUZANA COMPANY LIMITED (a.k.a. YUZANA CONSTRUCTION), No. 130 Yuzana Centre, Shwegondaing Road, Bahan Township, Yangon, Burma [BURMA].

    97. PAVO TRADING PTE. LTD., 3 Shenton Way, #24-02 Shenton House, Singapore 068805, Singapore [BURMA].

    98. PIONEER AERODROME SERVICES CO., LTD., No. 203/204, Thiri Mingalar Housing, Strand Rd, Ahlone, Rangoon, Burma; Registration ID 620/2007-2008 (Burma) [BURMA] (Linked To: ASIA WORLD CO. LTD.).

    99. WIN, Nyan; DOB 22 Jan 1953; nationality Burma; citizen Burma; Major General; Minister of Foreign Affairs (individual) [BURMA].

    100. WIN, Kyaw; DOB 03 Jan 1944; nationality Burma; citizen Burma; Lieutenant-General; Chief of Bureau of Special Operation 2; Member, State Peace and Development Council (individual) [BURMA].

    101. ROYAL KUMUDRA HOTEL, No. 9 Hotel Zone, Nay Pyi Taw, Burma; No. 1 Ywama Curve, Ba Yint Naung Road, Block (2), Hlaing Township, Rangoon, Burma [BURMA].

    102. S H NG TRADING, 3 Shenton Way, #10-01 Shenton House, Singapore 068805, Singapore [BURMA].

    103. GREEN ASIA SERVICES CO., LTD., No. 61/62, Bahosi Housing, War Tan St., Lanmadaw T/S, Rangoon, Burma; Registration ID 4013/2011-2012 (Burma) [BURMA] (Linked To: ASIA WORLD CO. LTD.).

    104. GLOBAL WORLD INSURANCE COMPANY LIMITED, No. 44, Thein Phyu Road, Corner of Bogyoke Aung San Road and Thein Phyu Road, Pazuntaung, Rangoon, Burma; Registration ID 2511/2012-2013 (Burma) [BURMA] (Linked To: ASIA WORLD CO. LTD.).

    105. G A FOODSTUFFS PTE. LTD., 3 Shenton Way, #10-01 Shenton House, Singapore 068805, Singapore [BURMA].

    106. G A LAND PTE. LTD., 3 Shenton Way, #10-01 Shenton House, Singapore 068805, Singapore [BURMA].

    107. THET, Khin Lay (a.k.a. THET, Daw Khin Lay); DOB 19 Jun 1947; wife of Thura Shwe Mann (individual) [BURMA].

    108. THIHA (a.k.a. THI HA), c/o Htoo Group of Companies, undetermined; c/o Htoo Trading Company Limited, undetermined; DOB 24 Jun 1960 (individual) [BURMA].

    109. ZA, Tay (a.k.a. TAYZA; a.k.a. TEZA; a.k.a. ZA, Te; a.k.a. ZA, U Tay; a.k.a. ZA, U Te), 6 Cairnhill Circle, Number 18-07, Cairnhill Crest 229813, Singapore; Burma; DOB 18 Jul 1964; alt. DOB 18 Jun 1967; nationality Burma; citizen Burma; Managing Director, Htoo Trading Company Limited; Chairman, Air Bagan Holdings Pte. Ltd. (d.b.a. Air Bagan); Managing Director, Pavo Trading Pte. Ltd. (individual) [BURMA].

    110. G A SENTOSA PTE. LTD., 3 Shenton Way, #10-01 Shenton House, Singapore 068805, Singapore [BURMA].

    111. G A SENTOSA PTE. LTD., 101 Cecil Street, 08-08 Tong Eng Building, Singapore 069533, Singapore [BURMA].

    Dated: October 11, 2016. John E. Smith, Acting Director, Office of Foreign Assets Control.
    [FR Doc. 2016-26124 Filed 10-28-16; 8:45 am] BILLING CODE 4810-AL-P
    DEPARTMENT OF VETERANS AFFAIRS [OMB Control No. 2900-0789] Proposed Information Collection (Application Requirements To Receive VA Dental Insurance Plan Benefits Under 38 CFR 17.169) Activity: Comment Request AGENCY:

    Veterans Health Administration, Department of Veterans Affairs.

    ACTION:

    Notice.

    SUMMARY:

    The Veterans Health Administration (VHA) is announcing an opportunity for public comment on the proposed collection of certain information by the agency. Under the Paperwork Reduction Act (PRA) of 1995, Federal agencies are required to publish notice in the Federal Register concerning each proposed collection of information, including each proposed extension of a currently approved collection, and allow 60 days for public comment in response to the notice. This notice solicits comments on the information needed to enroll eligible Veterans and certain survivors and dependents in the VA Dental Insurance Plan (VADIP) and to process disenrollment requests and appeals of claims decisions.

    DATES:

    Written comments and recommendations on the proposed collection of information should be received on or before December 30, 2016.

    ADDRESSES:

    Submit written comments on the collection of information through the Federal Docket Management System (FDMS) at www.Regulations.gov; or to Brian McCarthy, Office of Regulatory and Administrative Affairs, Veterans Health Administration (10B4), Department of Veterans Affairs, 810 Vermont Avenue NW., Washington, DC 20420 or email: [email protected]. Please refer to “OMB Control No. 2900-0789” in any correspondence. During the comment period, comments may be viewed online through FDMS.

    FOR FURTHER INFORMATION CONTACT:

    Brian McCarthy at (202) 461-6345.

    SUPPLEMENTARY INFORMATION:

    Under the PRA of 1995 (Pub. L. 104-13; 44 U.S.C. 3501-3521), Federal agencies must obtain approval from OMB for each collection of information they conduct or sponsor. This request for comment is being made pursuant to Section 3506(c)(2)(A) of the PRA.

    With respect to the following collection of information, VHA invites comments on: (1) Whether the proposed collection of information is necessary for the proper performance of VHA's functions, including whether the information will have practical utility; (2) the accuracy of VHA's estimate of the burden of the proposed collection of information; (3) ways to enhance the quality, utility, and clarity of the information to be collected; and (4) ways to minimize the burden of the collection of information on respondents, including through the use of automated collection techniques or the use of other forms of information technology.

    Title: VA Dental Insurance Plan (VADIP) Fact Sheet.

    OMB Control Number: 2900-0789.

    Type of Review: Revision of a currently approved collection.

    Abstract: The Department of Veterans Affairs Dental Insurance Reauthorization Act of 2016 (Pub. L. 114-218) requires VA to establish and administer a dental insurance plan for Veterans enrolled in VA health care and for survivors and dependents of Veterans eligible for VA's Civilian Health and Medical Program (CHAMPVA). Public Law 114-218 requires VA to contract with a private insurer (using the Federal contracting process) to offer dental insurance, and the private insurer will be responsible for virtually all aspects of the administration of the dental insurance program. VA's role will primarily be to form the contract with the private insurer and to verify the eligibility of veterans and certain survivors and dependents. Enrolled veterans and certain survivors and dependents of veterans will be required to complete an application to be enrolled in this dental insurance program, and will be required to submit certain documentation or information for certain types of disenrollment requests and for appeals of claims decisions. VA will not prescribe the form these collections are to take, but is prescribing regulations that nonetheless require these collections. These collections are required to fulfill VA's obligations under Public Law 114-218.

    Affected Public: Individuals or households.

    Estimated Annual Burden: 40,750 hours.

    Estimated Average Burden per Respondent: 76 minutes.

    Frequency of Response: Annually.

    Estimated Annual Responses: 301,500.

    By direction of the Secretary.

    Cynthia Harvey-Pryor, Program Specialist, Office of Privacy and Records Management, Department of Veterans Affairs.
    [FR Doc. 2016-26149 Filed 10-28-16; 8:45 am] BILLING CODE 8320-01-P
    DEPARTMENT OF EDUCATION 34 CFR Parts 612 and 686 [Docket ID ED-2014-OPE-0057] RIN 1840-AD07 Teacher Preparation Issues AGENCY:

    Office of Postsecondary Education, Department of Education.

    ACTION:

    Final regulations.

    SUMMARY:

    The Secretary establishes new regulations to implement requirements for the teacher preparation program accountability system under title II of the Higher Education Act of 1965, as amended (HEA), that will result in the collection and dissemination of more meaningful data on teacher preparation program quality (title II reporting system). The Secretary also amends the regulations governing the Teacher Education Assistance for College and Higher Education (TEACH) Grant program under title IV of the HEA to condition TEACH Grant program funding on teacher preparation program quality and to update, clarify, and improve the current regulations and align them with title II reporting system data.

    DATES:

    The regulations in 34 CFR part 612 are effective November 30, 2016. The amendments to part 686 are effective on July 1, 2017, except for amendatory instructions 4.A., 4.B., 4.C.iv., 4.C.x. and 4.C.xi., amending 34 CFR 686.2(d) and (e), and amendatory instruction 6, amending 34 CFR 686.11, which are effective on July 1, 2021.

    FOR FURTHER INFORMATION CONTACT:

    Sophia McArdle, Ph.D., U.S. Department of Education, 400 Maryland Avenue SW., Room 6W256, Washington, DC 20202. Telephone: (202) 453-6318 or by email: [email protected].

    If you use a telecommunications device for the deaf (TDD) or a text telephone (TTY), call the Federal Relay Service (FRS), toll free, at 1-800-877-8339.

    SUPPLEMENTARY INFORMATION: Executive Summary Purpose of This Regulatory Action

    Section 205 of the HEA requires States and institutions of higher education (IHEs) annually to report on various characteristics of their teacher preparation programs, including an assessment of program performance. These reporting requirements exist in part to ensure that members of the public, prospective teachers and employers (districts and schools), and the States, IHEs, and programs themselves have accurate information on the quality of these teacher preparation programs. These requirements also provide an impetus to States and IHEs to make improvements where they are needed. Thousands of novice teachers enter the profession every year,1 and their students deserve to have well-prepared teachers.

    1 U.S. Department of Education, Digest of Education Statistics (2013). Public and private elementary and secondary teachers, enrollment, pupil/teacher ratios, and new teacher hires: Selected years, fall 1955 through fall 2023 [Data File]. Retrieved from: http://nces.ed.gov/programs/digest/d13/tables/dt13_208.20.asp.

    Research from States such as Tennessee, North Carolina, and Washington indicates that some teacher preparation programs report statistically significant differences in the student learning outcomes of their graduates.2 Statutory reporting requirements on teacher preparation program quality for States and IHEs are broad. The Department's existing title II reporting system framework has not, however, ensured sufficient quality feedback to various stakeholders on program performance. A U.S. Government Accountability Office (GAO) report found that some States are not assessing whether teacher preparation programs are low-performing, as required by law, and so prospective teachers may have difficulty identifying low-performing teacher preparation programs, possibly resulting in teachers who are not fully prepared to educate children.3 In addition, struggling teacher preparation programs may not receive the technical assistance they need and, like the teaching candidates themselves, school districts, and other stakeholders, will not be able to make informed decisions.

    2 See Report Card on the Effectiveness of Teacher Training Programs, Tennessee 2014 Report Card. (n.d.). Retrieved from: www.tn.gov/thec/article/report-card; Goldhaber, D., & Liddle, S. (2013). The Gateway to the Profession: Assessing Teacher Preparation Programs Based on Student Achievement. Economics of Education Review, 34, 29-44.

    3 See U.S. Government Accountability Office (GAO) (2015). Teacher Preparation Programs: Education Should Ensure States Identify Low-Performing Programs and Improve Information-Sharing. GAO-15-598. Washington, DC. Retrieved from: http://gao.gov/products/GAO-15-598. (Hereafter referred to as “GAO.”)

    Moreover, section 205 of the HEA requires States to report on the criteria they use to assess whether teacher preparation programs are low-performing or at-risk of being low-performing, but it is difficult to identify programs in need of remediation or closure because few of the reporting requirements ask for information indicative of program quality. The GAO report noted that half the States said current title II reporting system data were “slightly useful,” “neither useful nor not useful,” or “not useful”; over half the teacher preparation programs surveyed said the data were not useful in assessing their programs; and none of the surveyed school district staff said they used the data.4 The Secretary is committed to ensuring that the measures by which States judge the quality of teacher preparation programs reflect the true quality of the programs and provide information that facilitates program improvement and, by extension, improvement in student achievement.

    4 GAO at 26.

    The final regulations address shortcomings in the current system by defining the indicators of quality that a State must use to assess the performance of its teacher preparation programs, including more meaningful indicators of program inputs and program outcomes, such as the ability of the program's graduates to produce gains in student learning 5 (understanding that not all students will learn at the same rate). The final regulations build on current State data systems and linkages and create a much-needed feedback loop to facilitate program improvement and provide valuable information to prospective teachers, potential employers, and the general public.

    5 Rivkin, S. G., Hanushek, E. A., & Kain, J. F. (2005). Teachers, schools, and academic achievement. Econometrica, 73(2), 417-458. http://doi.org/10.1111/j.1468-0262.2005.00584.x.

    The final regulations also link assessments of program performance under HEA title II to eligibility for the Federal TEACH Grant program. The TEACH Grant program, authorized by section 420M of the HEA, provides grants to eligible IHEs, which, in turn, use the funds to provide grants of up to $4,000 annually to eligible teacher preparation candidates who agree to serve as full-time teachers in high-need fields at low-income schools for not less than four academic years within eight years after completing their courses of study. If a TEACH Grant recipient fails to complete his or her service obligation, the grant is converted into a Federal Direct Unsubsidized Stafford Loan that must be repaid with interest.

    Pursuant to section 420L(1)(A) of the HEA, one of the eligibility requirements for an institution to participate in the TEACH Grant program is that it must provide high-quality teacher preparation. However, of the 38 programs identified by States as “low-performing” or “at-risk,” 22 programs were offered by IHEs participating in the TEACH Grant program. The final regulations limit TEACH Grant eligibility to only those programs that States have identified as “effective” or higher in their assessments of program performance under HEA title II.

    Summary of the Major Provisions of This Regulatory Action

    The final regulations—

    • Establish necessary definitions and requirements for IHEs and States related to the quality of teacher preparation programs, and require States to develop measures for assessing teacher preparation performance.

    • Establish indicators that States must use to report on teacher preparation program performance, to help ensure that the quality of teacher preparation programs is judged on reliable and valid indicators of program performance.

    • Establish the areas States must consider in identifying teacher preparation programs that are low-performing and at-risk of being low-performing, the actions States must take with respect to those programs, and the consequences for a low-performing program that loses State approval or financial support. The final regulations also establish the conditions under which a program that loses State approval or financial support may regain its eligibility for title IV, HEA funding.

    • Establish a link between the State's classification of a teacher preparation program's performance under the title II reporting system and that program's identification as “high-quality” for TEACH Grant eligibility purposes.

    • Establish provisions that allow TEACH Grant recipients to satisfy the requirements of their agreement to serve by teaching in a high-need field that was designated as high-need at the time the grant was received.

    • Establish conditions that allow TEACH Grant recipients to have their service obligations discharged if they are totally and permanently disabled. The final regulations also establish conditions under which a student who had a prior service obligation discharged due to total and permanent disability may receive a new TEACH Grant.

    Costs and Benefits

    The benefits, costs, and transfers related to the regulations are discussed in more detail in the Regulatory Impact Analysis (RIA) section of this document. Significant benefits of the final regulations include improvements to the HEA title II accountability system that will enable prospective teachers to make more informed choices about their enrollment in a teacher preparation program, and will enable employers of prospective teachers to make more informed hiring decisions. Further, the final regulations will create incentives for States and IHEs to monitor and continuously improve the quality of their teacher preparation programs. Most importantly, the final regulations will help support elementary and secondary school students because the changes will lead to better prepared, higher quality teachers in classrooms, including for students in high-need schools and communities, who are disproportionately taught by less experienced teachers.

    The net budget impact of the final regulations is approximately $0.49 million in reduced costs over the TEACH Grant cohorts from 2016 to 2026. We estimate that the total cost of the final regulations, annualized over 10 years, is between $27.5 million and $27.7 million (see the Accounting Statement section of this document).

    On December 3, 2014, the Secretary published a notice of proposed rulemaking (NPRM) for these parts in the Federal Register (79 FR 71820). The final regulations contain changes from the NPRM, which are fully explained in the Analysis of Comments and Changes section of this document. Some commenters requested clarification regarding how the proposed State reporting requirements would affect teacher preparation programs provided through distance education and TEACH Grant eligibility for students enrolled in teacher preparation programs provided through distance education. In response to these comments, on April 1, 2016, the Department published a supplemental notice of proposed rulemaking (Supplemental NPRM) in the Federal Register (81 FR 18808) that reopened the public comment period for 30 days solely to seek comment on those specific issues. The Department specifically requested public comment on issues related to reporting by States on teacher preparation programs provided through distance education, and on TEACH Grant eligibility requirements for teacher preparation programs provided through distance education. The comment period for the Supplemental NPRM closed on May 2, 2016.

    Public Comment: In response to our invitation in the December 3, 2014, NPRM, approximately 4,800 parties submitted comments on the proposed regulations. In response to our invitation in the Supplemental NPRM, the Department received 58 comments.

    We discuss substantive issues under the sections of the proposed regulations to which they pertain. Generally, we do not address technical or other minor changes.

    Analysis of Comments and Changes: An analysis of the comments and of any changes in the regulations since publication of the NPRM and the Supplemental NPRM follows.

    Part 612—Title II Reporting System Subpart A—Scope, Purpose, and Definitions Section 612.1 Scope and Purpose Statutory Authority

    Comments: A number of commenters raised concerns about whether the Department has authority under the HEA to issue these regulations. In this regard, several commenters asserted that the Department does not have the statutory authority to require States to include student learning outcomes, employment outcomes, and survey outcomes among the indicators of academic content knowledge and teaching skills that would be included in the State's report card under § 612.5. Commenters also claimed that the HEA does not authorize the Department to require States, in identifying low-performing or at-risk teacher preparation programs, to use those indicators of academic content knowledge and teaching skills as would be required under § 612.6. These commenters argued that section 207 of the HEA provides that levels of performance shall be determined solely by the State, and that the Department may not provide itself authority to mandate these requirements through regulations when the HEA does not do so.

    Commenters argued that only the State may determine whether to include student academic achievement data (and by inference our other proposed indicators of academic content knowledge and teaching skills) in their assessments of teacher preparation program performance. One commenter contended that the Department's attempt to “shoehorn” student achievement data into the academic content knowledge and teaching skills of students enrolled in teacher preparation programs (section 205(b)(1)(F)) would render meaningless the language of section 207(a) that gives the State the authority to establish levels of performance, and what those levels contain. These commenters argued that, as a result, the HEA prohibits the Department from requiring States to use any particular indicators. Other commenters argued that such State authority also flows from section 205(b)(1)(F) of the HEA, which provides that, in the State Report Card (SRC), the State must include a description of the method of assessing teacher preparation program performance. This includes indicators of the academic content knowledge and teaching skills of the students enrolled in such programs.

    Commenters also stated that the Department does not have the authority to require that a State's criteria for assessing the performance of any teacher preparation program include the indicators of academic content knowledge and teaching skills, including, “in significant part,” student learning outcomes and employment outcomes for high-need schools. See proposed §§ 612.6(a)(1) and 612.4(b)(1). Similar concerns were expressed with respect to proposed § 612.4(b)(2), which provided that a State could determine that a teacher preparation program was effective (or higher) only if the program was found to have “satisfactory or higher” student learning outcomes.

    Discussion: Before we respond to the comments about specific regulations and statutory provisions, we think it would be helpful to outline the statutory framework under which we are issuing these regulations. Section 205(a) of the HEA requires that each IHE that provides a teacher preparation program leading to State certification or licensure and that enrolls students who receive HEA student financial assistance report on a statutorily enumerated series of data elements for the programs it provides. Section 205(b) of the HEA requires each State that receives funds under the HEA to provide to the Secretary and make widely available to the public information on, among other things, the quality of traditional and alternative route teacher preparation programs that includes not less than the statutorily enumerated series of data elements. The State must do so in a uniform and comprehensible manner, conforming to definitions and methods established by the Secretary. Section 205(c) of the HEA directs the Secretary to prescribe regulations to ensure the validity, reliability, accuracy, and integrity of the data submitted. Section 206(b) requires that IHEs provide assurance to the Secretary that their teacher training programs respond to the needs of local educational agencies (LEAs), are closely linked with the instructional decisions novice teachers confront in the classroom, and prepare candidates to work with diverse populations and in urban and rural settings, as applicable. Section 207(a) of the HEA provides that in order to receive funds under the HEA, a State must conduct an assessment to identify low-performing teacher preparation programs in the State, and help those programs through provision of technical assistance. Section 207(a) further provides that the State's report identify programs that the State determines to be low-performing or at risk of being low-performing, and that levels of performance are to be determined solely by the State.

    The proposed regulations, like the final regulations, reflect the fundamental principle and the statutory requirement that the assessment of teacher preparation program performance must be conducted by the State, with criteria the State establishes and levels of differentiated performance that are determined by the State. Section 205(b)(1)(F) of the HEA provides that a State must include in its report card a description of its criteria for assessing the performance of teacher preparation programs within IHEs in the State and that those criteria must include indicators of the academic content knowledge and teaching skills of students enrolled in such programs. Significantly, section 205(b)(1) further provides that the State's report card must conform with definitions and methods established by the Secretary, and section 205(c) authorizes the Secretary to prescribe regulations to ensure the reliability, validity, integrity, and accuracy of the data submitted in the report cards.

    Consistent with those statutory provisions, § 612.5 establishes the indicators States must use to comply with the reporting requirement in section 205(b)(1)(F), namely by having States include in the report card their criteria for program assessment and the indicators of academic content knowledge and teaching skills that they must include in those criteria. While the term “teaching skills” is defined in section 200(23) of the HEA, the definition is complex and the statute does not indicate what are appropriate indicators of academic content knowledge and teaching skills of those who complete teacher preparation programs. Thus, in § 612.5, we establish reasonable definitions of these basic, but ambiguous statutory phrases in an admittedly complex area—how States may reasonably assess the performance of their teacher preparation programs—so that the conclusions States reach about the performance of individual programs are valid and reliable in compliance with the statute. We discuss the reasonableness of the four general indicators of academic content knowledge and teaching skills that the Secretary has established in § 612.5 later in this preamble under the heading What indicators must a State use to report on teacher preparation program performance for purposes of the State report card?. Ultimately though, section 205(b) clearly permits the Secretary to establish definitions for the types of information that must be included in the State report cards, and, in doing so, complements the Secretary's general authority to define statutory phrases that are ambiguous or require clarification.

    The provisions of § 612.5 are also wholly consistent with section 207(a) of the HEA. Section 207(a) provides that States determine the levels of program performance in their assessments of program performance and discusses the criteria a State “may” include in those levels of performance. However, section 207(a) does not negate the basic requirement in section 205(b) that States include indicators of academic content knowledge and teaching skills within their program assessment criteria or the authority of the Secretary to establish definitions for report card elements. Moreover, the regulations do not limit a State's authority to establish, use, and report other criteria that the State determines are appropriate for generating a valid and reliable assessment of teacher preparation program performance. Section 612.5(b) of the regulations expressly permits States to supplement the required indicators with other indicators of a teacher's effect on student performance, including other indicators of academic content and knowledge and teaching skills, provided that the State uses the same indicators for all teacher preparation programs in the State. In addition, working with stakeholders, States are free to determine how to apply these various criteria and indicators in order to determine, assess, and report whether a preparation program is low-performing or at-risk of being low-performing.

    We appreciate commenters' concerns regarding the provisions in §§ 612.4(b)(1) and (b)(2) and 612.6(b)(1) regarding weighting and consideration of certain indicators. Based on consideration of the public comments and the potential complexity of these requirements, we have removed these provisions from the final regulations. While we have taken this action, we continue to believe strongly that providing significant weight to these indicators when determining a teacher preparation program's level of performance is very important. The ability of novice teachers to promote positive student academic growth should be central to the missions of all teacher preparation programs, and having those programs focus on producing well-prepared novice teachers who work and stay in high-need schools is critical to meeting the Nation's needs. Therefore, as they develop their measures and weights for assessing and reporting the performance of each teacher preparation program in their SRCs, we strongly encourage States, in consultation with their stakeholders, to give significant weight to these indicators.

    Changes: We have revised §§ 612.4(b)(1) and 612.6(a)(1) to remove the requirement for States to include student learning outcomes and employment outcomes, “in significant part,” in their use of indicators of academic content knowledge and teaching skills as part of their criteria for assessing the performance of each teacher preparation program. We also have revised § 612.4(b)(2) to remove the requirement that permitted States to determine that a teacher preparation program was effective (or higher quality) only if the State found the program to have “satisfactory or higher” student learning outcomes.

    Comments: Several commenters objected to the Department's proposal to establish four performance levels for States' assessment of their teacher preparation programs. They argued that section 207(a), which specifically requires States to report those programs found to be either low-performing or at-risk of being low-performing, establishes the need for three performance levels (low-performing, at-risk of being low-performing, and all other programs) and that the Department lacks authority to require reporting on the four performance levels proposed in the NPRM, i.e., those programs that are “low-performing,” “at-risk,” “exceptional,” and everything else. These commenters stated that these provisions of the HEA give the States the authority to determine whether to establish more than three performance levels.

    Discussion: Section 205(b) of the HEA provides that State reports “shall include not less than the following,” and this provision authorizes the Secretary to add reporting elements to the State reports. It was on this basis that we proposed, in § 612.4(b)(1), to supplement the statutorily required elements to require States, when making meaningful differentiation in teacher preparation program performance, to use at least four performance levels, including exceptional. While we encourage States to identify programs that are exceptional in order to recognize and celebrate outstanding programs, and so that prospective teachers and their employers know of them and others may learn from them, in consideration of comments that urged the Secretary not to require States to report a fourth performance level and other comments that expressed concerns about overall implementation costs, we are not adopting this proposal in the final regulations.

    Changes: We have revised § 612.4(b)(1) to remove the requirement for States to rate their teacher preparation programs using the category “exceptional.” We have also removed the definition of “exceptional teacher preparation program” from the Definitions section in § 612.2.

    Comments: Several commenters raised concerns about whether the provisions of § 612.6 are consistent with section 205(b)(2) of the HEA, which prohibits the Secretary from creating a national list or ranking of States, institutions, or schools using the scaled scores required under section 205. Some of these commenters acknowledged the usefulness of a system for public information on teacher preparation. However, the commenters argued that, if these regulations are implemented, the Federal government would instead be creating a program rating system in violation of section 205(b)(2).

    Commenters also stated that by mandating a system for rating teacher preparation programs, including the indicators by which teacher preparation programs must be rated, what a State must consider in identifying low-performing or at-risk teacher preparation programs, and the actions a State must take with respect to low-performing programs (proposed §§ 612.4, 612.5, and 612.6), the Federal government is impinging on the authority of States, which authorize, regulate, and approve IHEs and their teacher preparation programs.

    Discussion: Although section 207(a) of the HEA expressly requires States to include in their SRCs a list of programs that they have identified as low-performing or at-risk of being low-performing, the regulations do not in any other way require States to specify or create a list or ranking of institutions or programs and the Department has no intention of requiring States to do so. Nor will the Department be creating a national list or ranking of States, institutions, or teacher preparation programs. Thus, there is no conflict with section 205(b)(2).

    As we discussed in response to the prior set of comments, these regulations establish definitions for terms provided in title II of the HEA in order to help ensure that the State and IHE reporting system meet its purpose. In authorizing the Secretary to define statutory terms and establish reporting methods needed to properly implement the title II reporting system, neither Congress nor the Department is abrogating State authority to authorize, regulate, and approve IHEs and their teacher preparation programs. Finally, in response to the comments that proposed §§ 612.4, 612.5, and 612.6 would impermissibly impinge on the authority of States in terms of actions they must take with respect to low-performing programs, we note that the regulations do little more than clarify the sanctions that Congress requires in section 207(b) of the HEA. Those sanctions address the circumstances in which students enrolled in a low-performing program may continue to receive or regain Federal student financial assistance, and thus the Federal government has a direct interest in the subject.

    Changes: None.

    Comments: One commenter contended that Federal law provides no authority to compel LEAs to develop the criteria and implement the collection and reporting of student learning outcome data, and that there is little that the commenter's State can do to require LEA compliance with those reporting requirements.

    Discussion: Section 205(b) of the HEA requires all States receiving HEA funds to provide the information the law identifies “in a uniform and comprehensible manner that conforms with the definitions and methods established by the Secretary.” These regulations place responsibility for compliance upon the States, not the LEAs.

    Since all LEAs stand to benefit from the success of the new reporting system through improved transparency and information about the quality of teacher preparation programs from which they may recruit and hire new teachers, we assume that all LEAs will want to work with their States to find manageable ways to implement the regulations. Moreover, without more information from the commenter, we cannot address why a particular State would not have the authority to insist that an LEA provide the State with the information it needs to meet these reporting requirements.

    Changes: None.

    Federal-State-Institution Relationship, Generally

    Comments: Many commenters stated generally that the proposed regulations are an example of Federal overreach and represent a profound and improper shift in the historic relationship among institutions, States, school districts, accrediting agencies, and the Federal government in the area of teacher preparation and certification. For example, one commenter stated that the proposal threatens the American tradition of Federal non-interference with academic judgments, and makes the Department the national arbiter of what teacher preparation programs should teach, who they should teach, and how they should teach. Commenters also contended that the proposed regulations impermissibly interfere with local and State control and governance by circumventing States' rights delegated to local school districts and the citizens of those districts to control the characteristics of quality educators and to determine program approval.

    Discussion: The need for teacher preparation programs to produce teachers who can adequately and effectively teach to the needs of the Nation's elementary and secondary school students is national in scope and self-evident. Congress enacted the HEA title II reporting system as an important tool to address this need. Our final regulations are intended to give the public confidence that, as Congress anticipated when it enacted sections 205(b) and 207 of the HEA, States have reasonably determined whether teacher preparation programs are, or are not, meeting the States' expectations for their performance. While the regulations provide for use of certain minimum indicators and procedures for determining and reporting program performance, they provide States with a substantial amount of discretion in how to measure these indicators, what additional indicators a State may choose to add, and how to weight and combine these indicators and criteria into an overall assessment of a teacher preparation program's performance. Thus, the final regulations are consistent with the traditional importance of State decision-making in the area of evaluating educational performance. The public, however, must have confidence that the procedures and criteria that each State uses to assess program performance and to report programs as low-performing or at-risk are reasonable and transparent. Consistent with the statutory requirement that States report annually to the Secretary and to the public “in a uniform and comprehensible manner that conforms to the definitions and methods established by the Secretary,” the regulations aim to help ensure that each State report meets this basic test.

    We disagree with comments that allege that the regulations reflect overreach by the Federal government into the province of States regarding the approval of teacher preparation programs and the academic domain of institutions that conduct these programs. The regulations do not constrain the academic judgments of particular institutions, what those institutions should teach in their specific programs, which students should attend those programs, or how those programs should be conducted. Nor do they dictate which teacher preparation programs States should approve or should not approve. Rather, by clarifying limited areas in which sections 205 and 207 of the HEA are unclear, the regulations implement the statutory mandate that, consistent with definitions and reporting methods the Secretary establishes, States assess the quality of the teacher preparation programs in their State, identify those that are low-performing or at-risk of being low-performing, and work to improve the performance of those programs.

    With the changes we are making in these final regulations, the system for determining whether a program is low-performing or at-risk of being low-performing is unarguably a State-determined system. Specifically, as noted above, in assessing and reporting program performance, each State is free to (1) adopt and report other measures of program performance it believes are appropriate, (2) use discretion in how to measure student learning outcomes, employment outcomes, survey outcomes, and minimum program characteristics, and (3) determine for itself how these indicators of academic content knowledge and teaching skills and other criteria a State may choose to use will produce a valid and reliable overall assessment of each program's performance. Thus, the assessment system that each State will use is developed by the State, and does not compromise the ability of the State and its stakeholders to determine what is and is not a low-performing or at-risk teacher preparation program.

    Changes: None.

    Constitutional Issues

    Comments: One commenter stated that the proposed regulations amounted to a coercive activity that violates the U.S. Constitution's Spending Clause (i.e., Article I, Section 8, Clause 1 of the U.S. Constitution). The commenter argued that sections 205 and 207 of the HEA are grounded in the Spending Clause and Spending Clause jurisprudence, including cases such as Arlington Cent. Sch. Dist. Bd. of Educ. v. Murphy, 548 U.S. 291 (2006), which provides that States are not bound by requirements of which they have no clear notice. In particular, the commenter asserted that, in examining the text of the statute in order to decide whether to accept Federal financial assistance, a State would not have clear notice that it would be required to commit substantial amounts of funds to develop the infrastructure required to include student learning outcome data in its SRC or include student learning outcomes in its evaluation of teacher preparation programs. Some commenters stated that the proposed regulations violate the Tenth Amendment to the U.S. Constitution.

    Discussion: Congress' authority to enact the provisions in title II of the HEA governing the State reporting system flows from its authority to “provide for . . . general Welfare of the United States.” Article I, Section 8, Clause 1 (commonly referred to as Congress' “spending authority”). Under that authority, Congress authorized the Secretary to implement the provisions of sections 205 through 207. Thus, the regulations do not conflict with Congress' authority under the Spending Clause. With respect to cases such as Arlington Cent. Sch. Dist. Bd. of Educ. v. Murphy, States have full notice of their responsibilities under the reporting system through the rulemaking process the Department has conducted under the Administrative Procedure Act and the General Education Provisions Act to develop these regulations.

    We also do not perceive a legitimate Tenth Amendment issue. The Tenth Amendment provides in pertinent part that powers not delegated to the Federal government by the Constitution are reserved to the States. Congress used its spending authority to require institutions that enroll students who receive Federal student financial assistance in teacher preparation programs, and States that receive HEA funds, to submit information as required by the Secretary in their institutional report cards (IRCs) and SRCs. Thus, the Secretary's authority to define the ambiguous statutory term “indicators of academic content knowledge and teaching skills” to include the measures the regulations establish, coupled with the authority States have under section 205(b)(1)(F) of the HEA to establish other criteria with which they assess program performance, resolves any claim that the assessment of program performance is a matter left to the States under the Tenth Amendment.

    Changes: None.

    Unfunded Mandates

    Comments: Some commenters stated that the proposed regulations would amount to an unfunded mandate, in that they would require States, institutions with teacher preparation programs, and public schools to bear significant implementation costs, yet offer no Federal funding to cover them. To pay for this unfunded mandate, several commenters stated that costs would be passed on to students via tuition increases, decreases in funding for higher education, or both.

    Discussion: These regulations do not constitute an unfunded mandate. Section 205(b) makes reporting “in a uniform and comprehensible manner that conforms with the definitions and methods established by the Secretary” a condition of the State's receipt of HEA funds. And, as we have stated, the regulations implement this statutory mandate.

    Changes: None.

    Loss of Eligibility To Enroll Students Who Receive HEA-Funded Student Financial Aid

    Comments: Many commenters stated that the Department lacks authority to establish Federally defined performance criteria for the purpose of determining a teacher preparation program's eligibility for student financial aid under title IV of the HEA. Commenters expressed concern that the Department is moving from the current model, in which the Department determines institutional eligibility for title IV student aid, to a model in which this function would be outsourced to the States. While some commenters acknowledged that, under the HEA, a teacher preparation program loses its title IV eligibility if its State decides to withdraw approval or financial support, commenters asserted that the HEA does not intend for this State determination to be coupled with a prescriptive Federal mandate governing how the determination should be made. A number of commenters also stated that the regulations would result in a process of determining eligibility for Federal student aid that will vary by State.

    Similarly, some commenters stated that the proposed requirements in § 612.8(b)(1) for regaining eligibility to enroll students who receive title IV aid exceed the statutory authority in section 207(b)(4) of the HEA, which provides that a program is reinstated upon a demonstration of improved performance, as determined by the State. Commenters expressed concern that the proposed regulations would shift this responsibility from the State to the Federal government, and stated that teacher preparation programs could be caught in limbo. They argued that if a State had already reinstated funding and identified that a program had improved performance, the program's ability to enroll students who receive student financial aid would be conditioned on the Secretary's approval. The commenters contended that policy changes as significant as these should come from Congress, after scrutiny and deliberation of a reauthorized HEA.

    Discussion: Section 207(b) of the HEA states, in relevant part:

    Any teacher preparation program from which the State has withdrawn the State's approval, or terminated the State's financial support, due to the low performance of the program based upon the State assessment described in subsection (a)—

    (1) Shall be ineligible for any funding for professional development activities awarded by the Department;

    (2) May not be permitted to accept or enroll any student who receives aid under title IV in the institution's teacher preparation program;

    (3) Shall provide transitional support, including remedial services if necessary, for students enrolled at the institution at the time of termination of financial support or withdrawal of approval; and

    (4) Shall be reinstated upon demonstration of improved performance, as determined by the State.

    Sections 612.7 and 612.8 implement this statutory provision through procedures that mirror existing requirements governing termination and reinstatement of student financial support under title IV of the HEA. As noted in the preceding discussion, our regulations do not usurp State authority to determine how to assess whether a given program is low-performing, and our requirement that States do so using, among other things, the indicators of novice teachers' academic content knowledge and teaching skills identified in § 612.5 is consistent with title II of the HEA.

    Consistent with section 207(a) of the HEA, a State determines a teacher preparation program's performance level based on the State's use of those indicators and any other criteria or indicators the State chooses to use to measure the overall level of the program's performance. In addition, consistent with section 207(b), the loss of eligibility to enroll students receiving Federal student financial aid does not depend upon a Department decision. Rather, the State determines whether the performance of a particular teacher preparation program is so poor that it withdraws the State's approval of, or terminates the State's financial support for, that program. Each State may use a different decision model to make this determination, as contemplated by section 207(b).

    Commenters' objections to our proposal for how a program subject to section 207(b) may regain eligibility to enroll students who receive title IV aid are misplaced. Section 207(b)(4) of the HEA provides that a program found to be low-performing is reinstated upon the State's determination that the program has improved, which presumably would need to include the State's reinstatement of State approval or financial support, since otherwise the institution would continue to lose its ability to accept or enroll students who receive title IV aid in its teacher preparation programs. However, the initial loss of eligibility to enroll students who receive title IV aid is a significant event, and we believe that Congress intended that section 207(b)(4) be read and implemented not in isolation, but rather in the context of the procedures established in 34 CFR 600.20 for reinstatement of eligibility based on the State's determination of improved performance.

    Changes: None.

    Relationship to Department Waivers Under ESEA Flexibility

    Comments: A number of commenters stated that the proposed regulations inappropriately extend the Federal requirements of the Department's Elementary and Secondary Education Act (ESEA) flexibility initiative to States that have either chosen not to seek a waiver of certain ESEA requirements or have applied for a waiver but not received one. The commenters argued that requiring States to assess all students in non-tested grades and subjects (i.e., those grades and subjects for which testing is not required under title I, part A of the ESEA)—a practice that is currently required only in States with ESEA flexibility or in States that have chosen to participate in the Race to the Top program—sets a dangerous precedent.

    Discussion: While the regulations are similar to requirements the Department established for States that received ESEA flexibility or Race to the Top grants regarding linking data on student growth to individual teachers of non-tested grades and subjects under ESEA title I, part A, they are independent of those requirements. While section 4(c) of the Every Student Succeeds Act (ESSA) 6 ends conditions of waivers granted under ESEA flexibility on August 1, 2016, States that received ESEA flexibility or a Race to the Top grant may well have a head start in implementing systems for linking academic growth data for elementary and secondary school students to individual novice teachers, and then linking data on these novice teachers to individual teacher preparation programs. However, we believe that all States have a strong interest and incentive in finding out whether each of their teacher preparation programs is meeting the needs of their K-12 students and the expectations of their parents and the public. We therefore expect that States will seek to work with other stakeholders to find appropriate ways to generate the data needed to perform the program assessments that these regulations implementing section 205 of the HEA require.

    6 ESSA, which was signed into law in December 2015 (i.e., after the NPRM was published), reauthorizes and amends the ESEA.

    Changes: None.

    Consistency With State Law and Practice

    Comments: A number of commenters expressed concerns about whether the proposed regulations were consistent with State law. Some commenters stated that California law prohibits the kind of data sharing between the two State agencies, the California Commission on Teacher Credentialing (CTC) and the California Department of Education (CDE), that would be needed to implement the proposed regulations. Specifically, the commenter stated that section 44230.5 of the California Education Code (CEC) does not allow CTC to release information on credential holders to any entity other than the type of credential and employing district. In addition, the commenter noted that California statutes (sections 44660-44665 of the CEC) authorize each of the approximately 1,800 districts and charter schools to independently negotiate and implement teacher evaluations, so there is no statewide collection of teacher evaluation data. The commenter also noted that current law prohibits employers from sharing teacher evaluation data with teacher preparation programs or with the State if an individual teacher would be identifiable.

    Another commenter argued that in various ways the proposed regulations constitute a Federal overreach with regard to what Missouri provides in terms of State and local control and governance. Specifically, the commenter stated that the proposed regulations circumvent: the rights of Missouri school districts and citizens under the Missouri constitution to control the characteristics of quality education; the authority of the Missouri legislative process and the State Board of Education to determine program quality; State law (according to the commenter, Missouri House Bill 1490 limits how school districts can share locally held student data such as student learning outcomes); and the process already underway to improve teacher preparation in Missouri.

    Other commenters expressed concern that our proposal to require States to use student learning outcomes, employment outcomes, and survey outcomes, as defined in the proposed regulations, would create inconsistencies with what they consider to be the more comprehensive and more nuanced way in which their States assess teacher preparation program performance and then provide relevant feedback to programs and the institutions that operate them.

    Finally, a number of commenters argued that requirements related to indicators of academic content knowledge and teaching skills are unnecessary because there is already an organization, the Council for the Accreditation of Educator Preparation (CAEP), which requires IHEs to report information similar to what the regulations require. These commenters claimed that the reporting of data on indicators of academic content knowledge and teaching skills related to each individual program on the SRC may be duplicative and unnecessary.

    Discussion: With respect to comments on the CEC, we generally defer to each State to interpret its own laws. However, assuming that the CTC will play a role in how California would implement these regulations, we do not read section 44230.5 of the CEC to prohibit CTC from releasing to any entity information on credential holders other than the type of credential and employing district, as the commenters state. Rather, the provision requires CTC to “establish a nonpersonally identifiable educator identification number for each educator to whom it issues a credential, certificate, permit, or other document authorizing that individual to provide a service in the public schools.” Moreover, while sections 44660 through 44665 of the CEC authorize each LEA in California to independently negotiate and implement teacher evaluations, we do not read this to mean that California is prohibited from collecting data relevant to the student learning outcomes of novice teachers and linking those data to the teachers' preparation programs. Commenters did not cite any provision of the CEC that prohibits LEAs from sharing teacher evaluation data with teacher preparation programs or the State if it is done without identifying any individual teachers. We assume that use of the nonpersonally identifiable educator identification number that section 44230.5 of the CEC directs CTC to establish would provide one way to accomplish this task. Finally, we have reviewed the commenters' brief description of the employer surveys and teacher entry and retention data that California is developing for use in its assessments of teacher preparation programs. Based on the comments, and as discussed more fully under the subheading Student Learning Outcomes, we believe that the final regulations are not inconsistent with California's approach.

    While the commenter who referred to Missouri law raised several broad concerns about purported Federal overreach into the State's laws, these concerns were very general. However, we note that in previously applying for and receiving ESEA flexibility, the Missouri Department of Elementary and Secondary Education (MDESE) agreed to have LEAs in the State implement basic changes in their teacher evaluation systems that would allow them to generate student growth data that would fulfill the student learning outcomes requirement. In doing so, the MDESE demonstrated that it was fully able to implement these types of activities without conflict with State law. Moreover, the regulations address neither how a State or LEA is to determine the characteristics of effective educators, nor State procedures and authority for determining when to approve a teacher preparation program. Nor do the regulations undermine any State efforts to improve teacher preparation; they simply require that, in implementing their responsibilities under sections 205(b) and 207(a) of the HEA and assessing the level of performance of each teacher preparation program, States examine and report data about the performance of the novice teachers each program produces.

    Finally, we note that, as enacted, House Bill 1490 specifically directs the Missouri State Board of Education to issue a rule regarding gathering student data in the Statewide Longitudinal Data System in terms of the Board's need to make certain data elements available to the public. This is the very process the State presumably would use to gather and report the data that these regulations require. In addition, we read House Bill 1490 to prohibit the MDESE, unless otherwise authorized, “to transfer personally identifiable student data”, something that the regulations do not contemplate. Further, we do not read House Bill 1490 as establishing the kind of limitation on LEAs' sharing student data with the MDESE that the commenter stresses. House Bill 1490 also requires the State Board to ensure compliance with the Family Educational Rights and Privacy Act (FERPA) and other laws and policies; see our discussion of comment on FERPA and State privacy laws under § 612.4(b)(3)(ii)(E).

    We are mindful that a number of States have begun their own efforts to use various methods and procedures to examine how well their teacher preparation programs are performing. For the title II reporting system, the HEA provides that State reporting must use common definitions and reporting methods as the Secretary determines necessary. While the regulations require all States to use data on student learning outcomes, employment outcomes, survey outcomes, and minimum program characteristics to determine which programs are low-performing or at-risk of being low-performing, States may, after working with their stakeholders, also adopt other criteria and indicators. We also know from the recent GAO report that more than half the States were already using information on program graduates' effectiveness in their teacher preparation program approval or renewal processes and at least 10 others planned to do so—data we would expect to align with these reporting requirements.7 Hence, we trust that what States report in the SRCs will complement their own systems of assessing program performance.

    7 GAO at 13-14.

    Finally, with regard to the work of CAEP, we agree that CAEP may require some institutional reporting that may be similar to the reporting required under the title II reporting system; however, reporting information to CAEP does not satisfy the reporting requirements under title II. Regardless of the information reported to CAEP, States and institutions still have a statutory obligation to submit SRCs and IRCs. The CAEP reporting requirements include the reporting of data associated with student learning outcomes, employment outcomes, and survey outcomes; however, CAEP standards do not require the disaggregation of data for individual teacher preparation programs but this disaggregation is necessary for title II reporting.

    Changes: None.

    Cost Implications

    Comments: A number of commenters raised concerns about the costs of implementing the regulations. They stated that the implementation costs, such as those for the required statewide data systems to be designed, implemented, and refined in the pilot year, would require States either to take funds away from other programs or raise taxes or fees to comply. The commenters noted that these costs could be passed on to students via tuition increases or result in decreased State funding for higher education, and that doing so would create many other unintended consequences, such as drawing State funding away from hiring of educators, minority-serving institutions, or future innovation, reforms, and accountability initiatives. Commenters also stated that the cost to institutions of implementing the regulations could pull funding away from earning national accreditation.

    Some commenters also expressed concern about the costs to States of providing technical assistance to teacher preparation programs that they find to be low-performing, and suggested that those programs could lose State approval or financial support.

    Finally, in view of the challenges in collecting accurate and meaningful data on teacher preparation program graduates who fan out across the United States, commenters argued that the Department should find ways to provide financial resources to States and institutions to help them gather the kinds of data the regulations will require.

    Discussion: The United States has a critical need to ensure that it is getting a good return on the billions of dollars of public funds it spends producing novice teachers. The teacher preparation program reporting system established in title II of the HEA provides an important tool for understanding whether these programs are making good on this investment. But the system can only serve its purpose if States measure and report a program's performance in a variety of ways—in particular, based on important inputs, such as good clinical education and support, as well as on important outcomes, such as novice teachers' success in improving student performance.

    The regulations are designed to achieve these goals, while maintaining State responsibility for deciding how to consider the indicators of academic content knowledge and teaching skills described in § 612.5, along with other relevant criteria States choose to use. We recognize that moving from the current system—in which States, using criteria of their choosing, identified only 39 programs nationally in 2011 as low-performing or at-risk of being low-performing (see the NPRM, 79 FR 71823)—to one in which such determinations are based on meaningful indicators and criteria of program effectiveness is not without cost. We understand that States will need to make important decisions about how to provide for these costs. However, as explained in the Regulatory Impact Analysis section of this document, we concluded both that (1) these costs are manageable, regardless of States' current ability to establish the systems they will need, and (2) the benefits of a system in which the public has confidence that program reporting is valid and reliable are worth those costs.

    While providing technical assistance to low-performing teacher preparation programs will entail some costs, § 612.6(b) simply codifies the statutory requirement Congress established in section 207(a) of the HEA and offers examples of what this technical assistance could entail. Moreover, we assume that a State would want to provide such technical assistance rather than have the program continue to be low-performing and so remain at-risk of losing State support (and eligibility to enroll students who receive title IV aid).

    Finally, commenters requested that we identify funding sources to help States and IHEs gather the required data on students who, upon completing their programs, do not stay in the State. We encourage States to gather and use data on all program graduates regardless of the State to which they ultimately move. However, given the evident costs of doing so on an interstate basis, the final regulations permit States to exclude these students from their calculations of student learning outcomes and teacher placement and retention rates, and from the employer and teacher surveys (see the definitions of teacher placement and retention rate in § 612.2 and the provisions governing student learning outcomes and survey outcomes in § 612.5(a)(1)(iii) and (a)(3)(ii)).

    Changes: None.

    Section 612.2 Definitions

    Content and Pedagogical Knowledge

    Comments: Several commenters requested that we revise the definition of “content and pedagogical knowledge” to specifically refer to a teacher's ability to factor students' cultural, linguistic, and experiential backgrounds into the design and implementation of productive learning experiences. The commenters stated that pedagogical diversity is an important construct in elementary and secondary education and should be included in this definition.

    Additional commenters requested that this definition specifically refer to knowledge and skills regarding assessment. These commenters stated that the ability to measure student learning outcomes depends upon a teacher's ability to understand the assessment of such learning, and not just upon the conveyance and explanation of content.

    Another commenter recommended that we specifically mention the distinct set of instructional skills necessary to address the needs of students who are gifted and talented. This commenter stated that there is a general lack of awareness of how to identify and support advanced and gifted learners, and that this lack of awareness has contributed to concerns about how well the Nation's top students are doing compared to top students around the world. The commenter also stated that this disparity could be rectified if teachers were required to address the specific needs of this group of students.

    Multiple commenters requested that we develop data definitions and metrics related to the definition of “content and pedagogical knowledge,” and then collect related data on a national level. They stated that such a national reporting system would facilitate continuous improvement and quality assurance on a systemic level, while significantly reducing burden on States and programs.

    Other commenters recommended that to directly assess for content knowledge and pedagogy, the definition of the term include rating graduates of teacher preparation programs based on a portfolio of the teaching candidates' work over the course of the academic program. These commenters stated that reviewing a portfolio reflecting a recent graduate's pedagogical preparation would be more reliable than rating an individual based on student learning, which cannot be reliably measured.

    Discussion: The proposed definition of “content and pedagogical knowledge” reflected the specific and detailed suggestions of a consensus of non-Federal negotiators. We believe that the definition is sufficiently broad to address, in general terms, the key areas of content and pedagogical knowledge that aspiring teachers should gain in their teacher preparation programs.

    In this regard, we note that the purpose here is not to offer a comprehensive definition of the term that all States must use, as the commenters appear to recommend. Rather, it is to provide a general roadmap for States to use as they work with stakeholders (see § 612.4(c)) to decide how best to determine whether programs that lack the accreditation referenced in § 612.5(a)(4)(i) will ensure that students have the requisite content and pedagogical knowledge they will need as teachers before they complete the programs.

    For this reason, we believe that requiring States to use a more prescriptive definition or to develop common data definitions and metrics aligned to that definition, as many commenters urged, would create unnecessary costs and burdens. Similarly, we do not believe that collecting this kind of data on a national level through the title II reporting system is worth the significant cost and burden that it would entail. Instead, we believe that States, working in consultation with stakeholders, should determine whether their State systems for evaluating program performance should include the kinds of additions to the definition of content and pedagogical knowledge that the commenters recommend.

    We also stress that our definition underscores the need for teacher preparation programs to train teachers to have the content knowledge and pedagogical skills needed to address the learning needs of all students. It specifically refers to the need for a teacher to possess the distinct skills necessary to meet the needs of English learners and students with disabilities, both because students in these two groups face particular challenges and require additional support and because we wish to emphasize the need for programs to train aspiring teachers to teach to the learning needs of the most vulnerable students they will have in their classrooms. While the definition's focus on all students plainly includes students who are gifted and talented, as well as students in all other subgroups, we do not believe that, for purposes of this title II reporting system, the definition of “content and pedagogical knowledge” requires similar special reference to those or other student groups. However, we emphasize again that States are free to adopt many of the commenters' recommendations. For example, because the definition refers to “effective learning experiences that make the discipline accessible and meaningful for all students,” States may consider a teacher's ability to factor students' cultural, linguistic, and experiential backgrounds into the design and implementation of productive learning experiences, just as States may include a specific focus on the learning needs of students who are gifted and talented.

    Finally, through this definition we are not mandating a particular method for assessing the content and pedagogical knowledge of teachers. As such, under the definition, States may allow teacher preparation programs to use a portfolio review to assess teachers' acquisition of content and pedagogical knowledge.

    Changes: None.

    Employer Survey

    Comments: None.

    Discussion: The proposed definition of “survey outcomes” specified that a State would be required to survey the employers or supervisors of new teachers who were in their first year of teaching in the State where their teacher preparation program is located. To avoid confusion with regard to teacher preparation programs provided through distance education, in the final regulations we have removed the phrase “where their teacher preparation program is located” from the final definition of “employer survey.” In addition to including a requirement to survey those in their first year of teaching in the State and their employers in the “survey outcomes” provision that we have moved to § 612.5(a)(3) of the final regulations, we are including the same clarification in the definitions of “employer survey” and “teacher survey”. We also changed the term “new teacher” to “novice teacher” for the reasons discussed in this document under the definition of “novice teacher.”

    Changes: We have revised the definition of “employer survey” to clarify that this survey is of employers or supervisors of novice teachers who are in their first year of teaching.

    Employment Outcomes

    Comments: None.

    Discussion: Upon review of the proposed regulations, we recognized that the original structure of the regulations could have generated confusion. We are concerned that having a definition for the term “employment outcomes” in § 612.2, when that provision largely serves to operationalize other definitions in the context of § 612.5, was not the clearest way to present these requirements. We therefore are moving the explanations and requirements of those terms into the text of § 612.5(a).

    Changes: We have removed the proposed definition of “employment outcomes” from § 612.2, and moved the text and requirements from the proposed definition to § 612.5(a)(2).

    Exceptional Teacher Preparation Program

    Comments: Many commenters opposed having the regulations define, and having States identify in their SRCs, “exceptional teacher preparation programs”, stating that section 207(a) of the HEA only gives the Department authority to require reporting of three categories of teacher preparation programs: Low-performing, at-risk of being low-performing, and teacher preparation programs that are neither low-performing nor at-risk. A number of commenters noted that some States have used a designation of exceptional and found that the rating did not indicate truly exceptional educational quality. They also stated that teacher preparation programs have used that rating in their marketing materials, and that it may mislead the public as to the quality of the program. In addition, commenters noted that, with respect to the determination of a high-quality teacher preparation program for TEACH Grant program eligibility, it makes no practical difference whether a teacher preparation program is rated as effective or exceptional because eligible students would be able to receive TEACH Grants whether the programs in which they enroll are effective, exceptional, or some other classification above effective.

    Discussion: Section 207(a) of the HEA requires that a State identify programs as low-performing or at-risk of being low-performing, and report those programs in its SRC. However, section 205(b) of the HEA authorizes the Secretary to require States to include other information in their SRCs. Therefore, we proposed that States report which teacher preparation programs they had identified as exceptional because we believe the public should know which teacher preparation programs each State has concluded are working very well. We continue to urge States to identify for the public those teacher preparation programs that are indeed exceptional. Nonetheless, based on our consideration of the concerns raised in the comments, and the costs of reporting using this fourth performance level, we have decided to remove this requirement from the final regulations. Doing so has no impact on TEACH Grants because, as commenters noted, an institution's eligibility to offer TEACH Grants is impacted only where a State has identified a teacher preparation program as low-performing or at-risk. Despite these changes, we encourage States to adopt and report on this additional performance level.

    Changes: We have removed the proposed definition of “exceptional teacher preparation program,” and revised the proposed definition of “effective teacher preparation program” under § 612.2 to mean a teacher preparation program with a level of performance that is higher than low-performing or at-risk. We have also revised § 612.4(b)(1) to remove the requirement that an SRC include “exceptional” as a fourth teacher preparation program performance level.

    High-Need School

    Comments: Multiple commenters requested that States be allowed to develop and use their own definitions of “high-need school” so that State systems do not need to be modified to comply with the regulations. These commenters stated that many States had made great strides in improving the quality of teacher preparation programs, and that the definition of “high-need school” may detract from the reforms already in place in those States. In addition, the commenters noted that States are in the best position to define a high-need school since they can do so with better knowledge of State-specific context.

    Some commenters suggested, alternatively, that the Department include an additional disaggregation requirement for high-need subject areas. These commenters stated that targeting high-need subject areas would have a greater connection to employment outcomes than would high-need schools and, as such, should be tracked as a separate category when judging the quality of teacher preparation programs.

    A number of commenters requested that the definition of high-need school include schools with low graduation rates. Other commenters agreed that this definition should be based on poverty, as defined in section 200(11) of the HEA, but also recommended that a performance component should be included. Specifically, these commenters suggested that high schools in which one-third or more of the students do not graduate on time be designated as high-need schools. Other commenters recommended including geography as an indicator of a school's need, arguing that, in their experience, high schools' urbanicity plays a significant role in determining student success.

    Other commenters expressed concerns with using a quartile-based ranking of all schools to determine which schools are considered high need. These commenters stated that such an approach may lead to schools with very different economic conditions being considered high need. For example, a school in one district might fall into the lowest quartile with only 15 percent of students living in poverty while a school in another district would need to have 75 percent of students living in poverty to meet the same designation.

    Discussion: Our definition of “high-need school” mirrors the definition of that term in section 200(11)(A) of the HEA and, we believe, provides sufficient breadth and flexibility for all States to use it to help determine the performance of their teacher preparation programs. Under the definition, all schools that are in an LEA's highest quartile of schools ranked by family need based on measures that include student eligibility for free and reduced price lunch are deemed high-need schools. (We focus here on this measure of poverty because we believe that this is the primary measure on which many LEAs will collect data.) So, too, are schools with high individual family poverty rates measured by large numbers or percentages of students who are eligible for free and reduced price lunches. Hence, for purposes of title II reporting, not only will all schools with sufficiently high family poverty rates be considered high-need schools, but, regardless of a school's family poverty level, every LEA in the Nation with four or more schools will have at least one high-need school. The definition therefore eliminates a novice teacher's LEA preference as a factor affecting the placement or retention rate in high-need schools, and thus permits these measures to work well with this definition of high-need school. This would not necessarily be true if we permitted States to adopt their own definitions of this term.
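
    As an illustration of the quartile component of this determination, the sketch below shows one way a State data system might flag the highest-need quartile of an LEA's schools using free and reduced price lunch eligibility percentages. It is a minimal sketch only: the school names, data values, and function name are hypothetical, and neither the regulations nor the HEA prescribes any particular computation or data source.

        import math

        def flag_high_need_quartile(schools):
            # schools: list of (school_name, frpl_percent) pairs for one LEA,
            # where frpl_percent is the percent of enrolled students eligible
            # for free or reduced price lunch.
            ranked = sorted(schools, key=lambda s: s[1], reverse=True)
            cutoff = math.ceil(len(ranked) / 4)  # highest-need quarter, rounded up
            return {name for name, _ in ranked[:cutoff]}

        # An LEA with four schools would flag its single highest-need school.
        lea = [("Adams ES", 15.0), ("Birch MS", 42.5), ("Cedar HS", 8.0), ("Dunn ES", 27.0)]
        print(flag_high_need_quartile(lea))  # {'Birch MS'}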

    We acknowledge the concern expressed by some commenters that the definition of “high-need school” permits schools in different LEAs (and indeed, depending on the breakdown of an LEA's schools in the highest quartile based on poverty, in the same LEA as well) that serve communities with very different levels of poverty all to be considered high-need. However, for a reporting system that will use placement and retention rates in high-need schools as factors bearing on the performance of each teacher preparation program, States may consider applying significantly greater weight to employment outcomes for novice teachers who work in LEAs and schools that serve high-poverty areas than for novice teachers who work in LEAs and schools that serve low-poverty areas.

    Moreover, while we acknowledge that the definition of “high-need school” in section 200(11)(A) of the HEA does not apply to the statutory provisions requiring the submission of SRCs and IRCs, we believe that if we use the term in the title II reporting system it is reasonable that we should give some deference to the definition used elsewhere in title II of the HEA. For reasons provided above, we believe the definition can work well for the indicators concerning teacher placement and retention rates in high-need schools.

    Furthermore, we disagree with the comments that the definition of “high-need school” should include high-need subject areas. As defined in the regulations, a “teacher preparation program” is a program that leads to an initial State teacher certification or licensure in a specific field. Thus, the State's assessment of a teacher preparation program's performance already focuses on a specific subject area, including those we believe States would generally consider to be high-need. In addition, maintaining focus on placement of teachers in schools where students come from families with high actual or relative poverty levels, and not on the subject areas they teach in those schools, will help maintain a focus on the success of students who have fewer opportunities. We therefore do not see the benefit of further burdening State reporting by carrying into the definition of “high-need school,” as commenters recommend, factors that focus on high-need subjects.

    We also disagree that the definition of “high-need school” should include an additional criterion of low graduation rates. While we agree that addressing the needs of schools with low graduation rates is a major priority, we believe the definition of “high-need school” should focus on the poverty level of the area the school serves. The measure is easy to calculate and understand, and including this additional component would complicate the data collection and analysis process for States. However, we believe there is a sufficiently high correlation between schools in high-poverty areas, which our definition would deem high-need, and the schools with low graduation rates on which the commenters desire to have the definition focus. We believe this correlation means that a large proportion of low-performing schools would be included in a definition of high-need schools that focuses on poverty.

    Changes: None.

    Comments: None.

    Discussion: Under paragraphs (i)(B) and (ii) of the definition of “high-need school” in the regulations, the identification of a high-need school may be based, in part, on the percentage of students enrolled in the school that are eligible for free or reduced price school lunch under the Richard B. Russell National School Lunch Act. With the passage of the Healthy, Hunger-Free Kids Act of 2010, the National School Lunch Program (NSLP) now includes a new universal meal option, the “Community Eligibility Provision” (CEP or Community Eligibility). CEP reduces burden at the household and local level by eliminating the need to obtain eligibility data from families through individual household applications, and permits schools, if they meet certain criteria, to provide meal service to all students at no charge to the students or their families. To be eligible to participate in Community Eligibility, schools must: (1) Have at least 40 percent of their students qualify for free meals through “direct certification” 8 in the year prior to implementing Community Eligibility; (2) agree to serve free breakfasts and lunches to all students; and, (3) agree to cover, with non-Federal funds, any costs of providing free meals to students above the amounts provided by Federal assistance.

    8 “Direct certification” is a process by which schools identify students as eligible for free meals using data from, among other sources, the Supplemental Nutrition Assistance Program (SNAP) or the Temporary Assistance for Needy Families (TANF) program.

    CEP schools are not permitted to use household applications to determine a reimbursement percentage from the USDA. Rather, the USDA determines meal reimbursement for CEP schools based on “claiming percentages,” calculated by multiplying the percentage of students identified through the direct certification data by a multiplier established in the Healthy, Hunger-Free Kids Act of 2010 and set in regulation at 1.6. The 1.6 multiplier provides an estimate of the number of students that would be eligible for free and reduced-price meals in CEP schools if the schools determined eligibility through traditional means, using both direct certification and household applications. If a State uses NSLP data from CEP schools when determining whether schools are high-need schools, it should not use the number of children actually receiving free meals in CEP schools to determine the percentage of students from low-income families because, in those schools, some children receiving free meals live in households that do not meet a definition of low-income. Therefore, States that wish to use NSLP data for purposes of determining the percentage of children from low-income families in schools that are participating in Community Eligibility should use the number of children for whom the LEA is receiving reimbursement from the USDA (the direct certification total with the 1.6 multiplier), not to exceed 100 percent of children enrolled. For example, consider a school that participates in Community Eligibility with an enrollment of 1,000 children. The school identifies 600 children through direct certification data as eligible for the NSLP. The school multiplies 600 by 1.6, and the result is 960. The LEA would receive reimbursement through the NSLP for meals for 960 children, or 96 percent of students enrolled. In a ranking of schools in the LEA on the basis of the percentage of students from low-income families, even though 100 percent of students are receiving free meals through the NSLP, the school would be ranked on the basis of 96 percent of students from low-income families. The use of claiming percentages for identifying CEP schools as high-need schools, rather than the number of students actually receiving free lunch through the NSLP, ensures comparability regardless of an individual school's decision regarding participation in the program.
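
    The sketch below simply restates that arithmetic in code form, assuming a State chooses to compute the percentage this way; the function name and inputs are illustrative rather than any prescribed method, and the 1.6 multiplier and 100 percent cap follow the discussion above.

        def cep_low_income_percentage(enrollment, direct_cert_count, multiplier=1.6):
            # Claiming-percentage approach described above: multiply the number of
            # directly certified students by the 1.6 multiplier, cap the result at
            # total enrollment, and express it as a percentage of enrollment.
            reimbursed = min(direct_cert_count * multiplier, enrollment)
            return 100.0 * reimbursed / enrollment

        # The example from the discussion: 1,000 students enrolled, 600 identified
        # through direct certification, so the school would be ranked on the basis
        # of 96 percent of students from low-income families.
        print(cep_low_income_percentage(1000, 600))  # 96.0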

    Changes: None.

    Novice Teacher

    Comments: Many commenters expressed concerns about the proposed definition of “new teacher.” These commenters noted that the definition distinguishes between traditional teacher preparation programs and alternative route teacher preparation programs. The commenters argued that, because alternative route teacher preparation programs place their participants as teachers while they are still enrolled, these participants will have already established teacher retention rates by the time they complete their programs. Traditional program participants, on the other hand, are only placed as teachers after earning their credential, leaving their programs at a comparative disadvantage under the indicators of academic content knowledge and teaching skills. Many of these commenters contended that, as a result, comparisons between traditional teacher preparation programs and alternative route teacher preparation programs will be invalid. Others recommended that the word “licensure” be changed to “professional licensure” to alleviate the need for States to compare traditional teacher preparation programs and alternative route teacher preparation programs.

    A number of commenters claimed that the proposed definition confused the attainment of certification or licensure with graduation from a program, which is often a precursor for certification or licensure. They stated that the proposed definition was not clear regarding how States would report on recent program completers who are entering the classroom. Others noted that some States allow individuals to be employed as full-time teachers for up to five years before obtaining licensure. They contended that reporting all of these categories together would provide misleading statistics on teacher preparation programs.

    Other commenters specifically requested that the definition include pre-kindergarten teachers (if a State requires postsecondary education and training for pre-kindergarten teachers), and that pre-kindergarten teachers be reflected in teacher preparation program assessment.

    A number of commenters also recommended that the word “recent” be removed from the definition of “new teacher” so that individuals who take time off between completing their teaching degree and obtaining a job in a classroom are still considered to be new teachers. They argued that individuals who take time off to raise a family or who do not immediately find a full-time teaching position should still be considered new teachers if they have not already had full-time teaching experience. Other commenters stated that the term “new teacher” may result in confusion based on State decisions about when an individual may begin teaching. For example, the commenters stated that in Colorado teachers may obtain an alternative license and begin teaching before completing a formal licensure program. As such, new teachers may have been teaching for up to three years at the point that the proposed definition would consider them to be a “new teacher,” and the proposed definition therefore may cause confusion among data entry staff about which individuals should be reported as new teachers. They recommended that we replace the term “new teacher” with the term “employed completer” because the latter more clearly reflects that an individual would need to complete his or her program and have found employment to be included in the reporting requirements.

    Discussion: The intent of the proposed definition of “new teacher” was to capture those individuals who have newly entered the classroom and become responsible for student outcomes. Upon review of the public comments, we agree that the proposed definition of “new teacher” is unclear and needs revision.

    We understand that many alternative route teacher preparation programs place their participants as teachers while they are enrolled in their programs, and many traditional preparation program participants are only placed after earning their credential. Furthermore, we agree that direct comparisons between alternative route and traditional teacher preparation programs could be misleading if done without a more complete understanding of the inherent differences between the two types of programs. For example, a recent completer of an alternative route program may actually have several more years of teaching experience than a recent graduate of a traditional teacher preparation program, so apparent differences in their performance may be based more on the specific teacher's experience than the quality of the preparation program.

    In addition, we agree with commenters that the preparation of preschool teachers is a critical part of improving early childhood education, and inclusion of these staff in the assessment of teacher preparation program quality could provide valuable insights. We strongly encourage States that require preschool teachers to obtain either the same level of licensure as elementary school teachers, or a level of licensure focused on preschool or early childhood education, to include preschool teachers who teach in public schools in their assessment of the quality of their teacher preparation programs. However, we also recognize that preschool licensure and teacher evaluation requirements vary among States and among settings, and therefore believe that it is important to leave the determination of whether and how to include preschool teachers in this measure to the States. We hope that States will base their determination on what is most supportive of high-quality early childhood education in their State.

    We also agree with commenters that the proposed term “new teacher” may result in confusion based on State decisions about when individuals in an alternative route program have the certification they need to begin teaching, and that, in some cases, these individuals may have taught for up to three years before the proposed definition would consider them to be new teachers. We believe, however, that the term “employed completer” could be problematic for alternative route programs because, while their participants are employed, they may not have yet completed their program.

    Likewise, we agree with commenters who expressed concern that our proposed definition of “new teacher” confuses the attainment of certification or licensure with graduation from a program leading to recommendation for certification or licensure.

    For all of these reasons, we are removing the term and definition of “new teacher” and replacing it with the term “novice teacher,” which we are defining as “a teacher of record in the first three years of teaching who teaches elementary or secondary public school students, which may include, at a State's discretion, preschool students.” We believe this new term and definition more clearly distinguish between individuals who have met all the requirements of a teacher preparation program (recent graduates), and those who have been assigned the lead responsibility for a student's learning (i.e., a teacher of record as defined in this document) but who may or may not have completed their teacher preparation program. In doing so, we also have adopted language that captures as novice teachers those individuals who are responsible for student outcomes, because these are the teachers on whom a program's student learning outcomes should focus. We chose a period of three years because we believe this is a reasonable timeframe in which one could consider a teacher to be a novice, and because it is the length of time for which retention rate data will be collected. In this regard, the definition of novice teacher continues to include three cohorts of teachers, but treats the first year of teaching as the first year as a teacher of record regardless of whether the teacher has completed a preparation program (as is the case for most traditional programs) or is still in process of completing it (as is the case for alternate route programs).

    Finally, we agree with commenters that we should remove the word “recent” from the definition, and have made this change. As commenters suggest, making this change will ensure that individuals who take time off between completing their teacher preparation program and obtaining a job in a classroom, or who do not immediately find a full-time teaching position, are still included in the definition of “novice teacher.” Therefore, our definition of “novice teacher” does not include the word “recent”; the term instead clarifies that a novice teacher is an individual who is responsible for student outcomes, while still allowing individuals who are recent graduates to be categorized as novice teachers for three years in order to account for delays in placement.

    Changes: We have removed the term “new teacher” and replaced it with the term “novice teacher,” which we define as “a teacher of record in the first three years of teaching who teaches elementary or secondary public school students, which may include, at a State's discretion, preschool students.” See the discussion below regarding the definition of “teacher of record.”

    Quality Clinical Preparation

    Comments: Commenters provided a number of specific suggestions for revising the proposed definition of “quality clinical preparation.”

    Commenters suggested that the definition include a requirement that mentor teachers be “effective.” While our proposed definition did not use the term “mentor teacher,” we interpret the comments as pertaining to the language of paragraph (1) of the proposed definition—the requirement that those LEA-based personnel who provide training be qualified clinical instructors. Commenters also suggested that we eliminate the phrase “at least in part” when referring to the training to be provided by qualified clinical instructors, and that we require the clinical practice to include experience with high-need and high-ability students, as well as the use of data analysis and development of classroom management skills.

    Other commenters suggested that the definition require multiple clinical or field experiences, or both, with effective mentor teachers who (1) address the needs of diverse, rural, or underrepresented student populations in elementary and secondary schools, including English learners, students with disabilities, high-need students, and high-ability students, and (2) assess the clinical experiences using a performance-based protocol to demonstrate teacher candidates' mastery of content and pedagogy.

    Some commenters suggested that the definition require that teacher candidates use specific research-based practices in addition to those currently listed in the definition, including data analysis, differentiation, and classroom management. The commenters recommended that all instructors be qualified clinical instructors, and that they ensure that clinical experiences include working with high-need and high-ability students because doing so will provide a more robust and realistic clinical experience.

    Commenters further suggested that “quality clinical preparation” use a program model similar to that utilized by many alternative route programs. This model would include significant in-service training and support as a fundamental and required component, alongside an accelerated pre-service training program. Another commenter suggested the inclusion of residency programs in the definition.

    Commenters also suggested that the Department adopt, for the title II reporting system, the definitions of the terms “clinical experience” and “clinical practice” used by CAEP so that the regulatory definitions describe a collaborative relationship between a teacher preparation program and a school district. Commenters explained that CAEP defines “clinical experiences” as guided, hands-on, practical applications and demonstrations of professional knowledge of theory to practice, skills, and dispositions through collaborative and facilitated learning in field-based assignments, tasks, activities, and assessments across a variety of settings. Commenters further explained that CAEP defines “clinical practice” as student teaching or internship opportunities that provide candidates with an intensive and extensive culminating field-based set of responsibilities, assignments, tasks, activities, and assessments that demonstrate candidates' progressive development of the professional knowledge, skills, and dispositions to be effective educators. Another commenter recommended that we develop common definitions of data and metrics on quality clinical preparation.

    Discussion: We agree with the commenters that it is important to ensure that mentor teachers and qualified clinical instructors are effective. Effective instructors play an important role in ensuring that students in teacher preparation programs receive the best possible clinical training if they are to become effective educators. However, we believe that defining the term “quality clinical preparation” to provide that all clinical instructors, whether LEA-based or not, meet specific established qualification requirements and use a training standard that is publicly available (as required by paragraph (1) of our definition) reasonably ensures that students are receiving clinical training from effective instructors.

    We agree with the recommendation to remove the phrase “at least in part” from the definition, so that all training must be provided by quality clinical instructors.

    We decline to revise the definition to provide that quality clinical preparation specifically include work with high-need or high-ability students, using data analysis and differentiation, and developing classroom management skills. We agree that these are important elements in developing highly effective educators and could be an important part of clinical preparation. However, the purpose of this definition is to highlight general characteristics of quality clinical instruction that must be reflected in how a State assesses teacher preparation program performance, rather than provide a comprehensive list of elements of quality clinical preparation. We believe that including the additional elements suggested by the commenters would result in an overly prescriptive definition. We note, however, that States are free to supplement this definition with additional criteria for assessing teacher preparation program performance.

    We also decline to revise the definition to provide that quality clinical preparation be assessed using a performance-based protocol as a means of demonstrating student mastery of content and pedagogy. While this is a strong approach that States may choose to take, we are not revising the definition to prescribe this particular method because we believe it may in some cases be overly burdensome.

    We decline commenters' recommendation to include significant in-service training and support as a fundamental and required component, alongside an accelerated pre-service training program. Similarly, we reject the suggestion to include residency programs in the definition. Here again, we feel that both of these additional qualifications would result in a definition that is too prescriptive. Moreover, as noted above, this definition is meant to highlight general characteristics of quality clinical instruction that must be reflected in how a State assesses teacher preparation program performance, rather than to provide a comprehensive list of elements of quality clinical preparation.

    Furthermore, while we understand why commenters recommended that we use CAEP's definitions, we do not want to issue an overly prescriptive definition of what is and is not quality clinical preparation, nor do we want to endorse any particular organization's approach. Rather, we are defining a basic indicator of teacher preparation program performance for programs that do not meet the program accreditation provision in § 612.5(a)(4)(i). However, States are free to build the CAEP definitions into their own criteria for assessing teacher preparation program performance; furthermore, programs may implement CAEP criteria.

    We encourage States and teacher preparation programs to adopt research-based practices of effective teacher preparation for all aspects of their program accountability systems. Indeed, we believe the accountability systems that States establish will help programs and States to gather more evidence about what aspects of clinical training and other parts of preparation programs lead to the most successful teachers. However, we decline to develop more precise regulatory definitions of data and metrics on quality clinical preparation because we feel that these should be determined by the State in collaboration with IHEs, LEAs, and other stakeholders (see § 612.4(c)).

    Changes: We have revised the definition of “quality clinical preparation” by removing the phrase “at least in part” to ensure that all training is provided by quality clinical instructors.

    Recent Graduate

    Comments: Multiple commenters recommended replacing the term “recent graduate” with the term “program completer” to include candidates who have met all program requirements, regardless of enrollment in a traditional teacher preparation program or an alternative route teacher preparation program. In addition, they recommended that States be able to determine the criteria that a candidate must satisfy in order to be considered a program completer.

    Other commenters recommended changing the definition of “recent graduate” to limit it to those graduates of teacher preparation programs who are currently credentialed and practicing teachers. The commenters stated that this would avoid having programs with completers who become gainfully employed in a non-education field or enroll in graduate school being penalized when the State determines the program's performance.

    Discussion: We intended the term “recent graduate” to capture those individuals who have met all the requirements of the teacher preparation program within the last three title II reporting years. We recognize that a number of alternative route programs do not use the term “graduate” to refer to individuals who have met those requirements. However, using the term “recent graduate” to encompass both individuals who complete traditional teacher preparation programs and those who complete alternative route programs is simpler than creating a separate term for alternative route participants. Thus, we continue to believe that the term “recent graduate,” as defined, appropriately captures the relevant population for purposes of the regulations.

    Furthermore, we decline to amend the definition to include only those individuals who are currently credentialed and practicing teachers. Doing so would create confusion between this term and “novice teacher” (defined elsewhere in this document). The term “novice teacher” is designed to capture individuals who are in their first three years of teaching, whereas the definition of “recent graduate” is designed to capture individuals who have completed a program, regardless of whether they are teaching. In order to maintain this distinction, we have retained the prohibitions that currently exist in the definitions in the title II reporting system against using recommendation to the State for licensure or becoming a teacher of record as a condition of being identified as a recent graduate.

    We are, however, making slight modifications to the proposed definition. Specifically, we are removing the reference to being hired as a full-time teacher and instead using the phrase “becoming a teacher of record.” We do not believe this substantially changes the meaning of “recent graduate,” but it does clarify which newly hired, full-time teachers are to be captured under the definition.

    We decline to provide States with additional flexibility in establishing other criteria for making a candidate a program completer because we believe that the revised definition of the term “recent graduate” provides States with sufficient flexibility. We believe that the additional flexibility suggested by the commenters would result in definitions that stray from the intent of the regulations.

    Some commenters expressed concern that programs would be penalized if some individuals who have completed them go on to become gainfully employed in a non-education field or enroll in graduate school. We feel that it is important for the public and prospective students to know the degree to which participants in a teacher preparation program do not become teachers, regardless of whether they become gainfully employed in a non-education field. However, we think it is reasonable to allow States flexibility to exclude certain individuals when determining the teacher placement and retention rates (i.e., those recent graduates who have taken teaching positions in another State, or who have enrolled in graduate school or entered military service). For these reasons, we have not adopted the commenters' recommendation to limit the definition of “recent graduate” to those graduates of teacher preparation programs who are currently credentialed and practicing teachers.

    Changes: We have revised the definition of “recent graduate” to clarify that a teacher preparation program may not use the criterion “becoming a teacher of record” when it determines if an individual has met all of the program requirements.

    Rigorous Teacher Candidate Exit Qualifications

    Comments: One commenter recommended that we remove the reference to entry requirements from the proposed definition of “rigorous teacher entry and exit requirements” because using rigorous entry requirements to assess teacher preparation program performance could compromise the mission of minority-serving institutions, which often welcome disadvantaged students and develop them into profession-ready teachers. Commenters said that those institutions and others seek, in part, to identify potential teacher candidates whose backgrounds are similar to students they may ultimately teach but who, while not meeting purely grade- or test-based entry requirements, could become well-qualified teachers through an effective preparation program.

    Commenters recommended adding a number of specific items to the definition of exit qualifications, such as classroom management, differentiated instructional planning, and an assessment of student growth over time.

    Another commenter suggested amending the definition to include culturally competent teaching, which the commenter defined as the ability of educators to teach students intellectual, social, emotional, and political knowledge by utilizing their diverse cultural knowledge, prior experiences, linguistic needs, and performance styles. This commenter stated that culturally competent teaching is an essential pedagogical skill that teachers must possess. The commenter also recommended that we include as separate terms and define “culturally competent education” and “culturally competent leadership”. Finally, this commenter requested that we develop guidance on culturally and linguistically appropriate approaches in education.

    Discussion: Although overall research findings regarding the effect of teacher preparation program selectivity on student outcomes are generally mixed, some research indicates there is a correlation between admission requirements for teacher preparation programs and the teaching effectiveness of program graduates.9 In addition, under our proposed definition, States and programs could define “rigorous entry requirements” in many and varied ways, including through evidence of other skills and characteristics determined by programs to correlate with graduates' teaching effectiveness, such as grit, disposition, or performance-based assessments relevant to teaching. Nonetheless, we understand that prospective teachers who themselves come from high-need schools—and who may therefore bring a strong understanding of the backgrounds of students they may eventually teach—could be disproportionately affected by grade-based or test-based entry requirements. Additionally, because the primary emphasis of the regulations is to ensure that candidates graduate from teacher preparation programs ready to teach, we agree that measures of program effectiveness should emphasize rigorous exit requirements over program entry requirements. Therefore, we are revising the regulations to require only rigorous exit standards.

    9 See, for example: Henry, G., & Bastian, K. (2015). Measuring Up: The National Council on Teacher Quality's Ratings of Teacher Preparation Programs and Measures of Teacher Performance.

    In our definition of rigorous exit requirements, we identified four basic characteristics that we believe all teacher candidates should possess. Regarding the specific components of rigorous exit requirements that commenters suggested (such as standards-based and differentiated planning, classroom management, and cultural competency), the definition does not preclude States from including those kinds of elements as rigorous exit requirements. We acknowledge that these additional characteristics, including cultural competency, may also be important, but we believe that the inclusion of these additional characteristics should be left to the discretion of States, in consultation with their stakeholders. To the extent that they choose to include them, States would need to develop definitions for each additional element. We also encourage interested parties to bring these suggestions forward to their States in the stakeholder engagement process required of all States in the design of their performance rating systems (see § 612.4(c)). Given that we are not adding cultural competency into the definition of rigorous candidate exit requirements, we are not adding the recommended related definitions or developing guidance on this topic at this time.

    In addition, as we reviewed comments, we realized both that the phrase “at a minimum” was misplaced in the sentence and should refer not to the use of an assessment but to the use of validated standards and measures of the candidate's effectiveness, and that the second use of “measures of” in the phrase “measures of candidate effectiveness including measures of curriculum planning” was redundant.

    Changes: We have revised the term “rigorous teacher candidate entry and exit qualifications” by removing entry qualifications. We have also revised the language in § 612.5(a)(4)(ii)(C) accordingly. In addition, we have moved the phrase “at a minimum” from preceding “assessment of candidate performance” to preceding “on validated professional teaching standards.” Finally, we have revised the phrase “measures of candidate effectiveness including measures of curriculum planning” to read “measures of candidate effectiveness in curriculum planning.”

    Student Achievement in Non-Tested Grades and Subjects

    Comments: Multiple commenters opposed the definition of the term “student achievement in non-tested grades and subjects,” and provided different recommendations on how the definition should be revised. Some commenters recommended removing the definition from the regulations altogether, noting that, for some subjects (such as music, art, theater, and physical education), there simply are not effective or valid ways to judge the growth of student achievement by test scores. Others recommended that student achievement in non-tested grades and subjects be aligned to State and local standards. These commenters asserted that alignment with State and local standards will ensure rigor and consistency for non-tested grades and subjects. A number of commenters also recommended that teachers who teach in non-tested subjects should be able to use scores from an already administered test to count toward their effectiveness rating, a policy that some States have already implemented to address student achievement in non-tested subjects.

    Discussion: We have adopted the recommendation to remove the definition of “student achievement in non-tested grades and subjects,” and have moved the substance of this definition to the definition of “student growth.” Upon review of comments regarding this definition, as well as comments pertaining to student learning outcomes more generally, we have also altered the requirements in § 612.5(a)(1)(ii) for the calculation of student learning outcomes—specifically by permitting a State to use another State-determined measure relevant to calculating student learning outcomes instead of only student growth or a teacher evaluation measure. We believe that the increased flexibility resulting from these changes sufficiently addresses commenter concerns regarding the definition of “student achievement in non-tested grades and subjects.” We also believe it is important that the regulations permit States to determine an effective and valid way to measure growth for students in all grades and subjects not covered by section 1111(b)(2) of the ESEA, as amended by the ESSA, and that the revisions we have made provide sufficient flexibility for States to do so.

    Under the revised definition of student growth, States must use measures of student learning and performance, such as students' results on pre-tests and end-of-course tests, objective performance-based assessments, student learning objectives, student performance on English language proficiency assessments, and other measures of student achievement that are rigorous, comparable across schools, and consistent with State requirements. Further, as a number of commenters recommended that the definition of student achievement in non-tested grades and subjects include alignment to State and local standards, we feel that this new definition of student growth, in conjunction with the altered requirements for the calculation of student learning outcomes, is sufficiently flexible to allow such alignment. In addition, a State could adopt the commenters' recommendations summarized above under the revised requirements for the calculation of student learning outcomes and the revised definition of “student growth.”

    We note that the quality of individual teachers is not being measured by the student learning outcomes indicator. Rather, it will help measure overall performance of a teacher preparation program through an examination of student growth in the many grades and subjects taught by novice teachers that are not part of the State's assessment system under section 1111(b) of the ESEA, as amended by the ESSA.

    Changes: The definition of student achievement in non-tested grades and subjects has been removed. The substance of the definition has been moved to the definition of student growth.

    Student Achievement in Tested Grades and Subjects

    Comments: A number of commenters opposed the definition of “student achievement in tested grades and subjects” because of its link to ESEA standardized test scores and the definitions used in ESEA flexibility. Commenters found this objectionable because these sources are subject to change, which could present complications in future implementation of the regulations. Further, the commenters asserted that standardized testing and value-added models (VAM) 10 are not valid or reliable and should not be used to assess teacher preparation programs.

    10 In various comments, commenters used the phrases “value-added modeling,” “value-added metrics,” “value-added measures,” “value-added methods,” “value-added estimation,” and “value-added analysis.” For purposes of these comments, we understand the use of these terms to reflect similar ideas and concepts, so for ease of presentation of our summary of the comments and our responses to them, we use the single phrase “value-added models,” abbreviated as VAM.

    Discussion: We have adopted the recommendation to remove the definition of “student achievement in tested grades and subjects.” While we have moved the substance of this definition to the definition of “student growth,” we have also altered the requirements for the calculation of student learning outcomes upon review of comments related to this definition and comments pertaining to student learning outcomes more generally. We believe that the increased flexibility resulting from these changes sufficiently addresses commenter concerns regarding the definition of “student achievement in tested grades and subjects.” We believe it is important that the regulations permit States to determine an effective and valid way to measure growth for students in grades and subjects covered by section 1111(b)(2) of the ESEA, as amended by ESSA, and that the revisions we have made provide sufficient flexibility for States to do so.

    While the revised requirement does not necessitate the use of ESEA standardized test scores, we believe that the use of such scores could be a valid and reliable measure of student growth and encourage its use in determining student learning outcomes where appropriate.11

    11 See, for example: Chetty, R., Friedman, J., & Rockoff, J. (2014). Measuring the Impacts of Teachers II: Teacher Value-Added and Student Outcomes in Adulthood. American Economic Review, 104(9), 2633-2679 (hereafter referred to as “Chetty et al.”)

    We now turn to the comments from those who asserted that maintaining a link between this definition and conditions of waivers granted to States under ESEA flexibility is problematic. While we maintain the substance of this definition in the definition of “student growth,” in view of section 4(c) of ESSA, which terminates waivers the Department granted under ESEA flexibility as of August 1, 2016, we have revised the requirements for calculation of student learning outcomes in § 612.5(a)(1)(ii) to allow States the flexibility to use “another State-determined measure relevant to calculating student learning outcomes.” We believe that doing so allows the flexibility recommended by commenters. In addition, as we have stressed above in the discussion of Federal-State-Institution Relationship, Generally, under the regulations States have flexibility in how to weight each of the indicators of academic content knowledge and teaching skills.

    Finally, the use of value-added measures is not specifically included in the revised requirements for the calculation of student learning outcomes, or otherwise required by the regulations. However, we believe that there is convincing evidence that value-added scores, based on standardized tests, can be valid and reliable measures of teacher effectiveness and a teacher's effect on long-term student outcomes.12 See our response to comments regarding § 612.5(a)(1), which provides an in-depth discussion of the use of student growth and VAM, and why we firmly believe that our student learning outcome measure, which references “student achievement in tested grades and subjects,” is valid and reliable.

    12 See, for example: Chetty, et al. at 2633-2679.

    Changes: The definition of student achievement in tested grades and subjects has been removed. The substance of the definition has been moved to the definition of student growth.

    Student Growth

    Comments: Multiple commenters opposed the proposed definition of “student growth” because the definition, which was linked to ESEA standardized test scores and definitions of terms used for Race to the Top, would also be linked to VAM, which commenters stated are not valid or reliable. Additionally, other commenters disagreed with the suggestion that student growth may be defined as a simple comparison of achievement between two points in time, which they said downplays the potential challenges of incorporating such measures into evaluation systems.

    A number of commenters also stated that the definition of “student growth” has created new testing requirements in areas that were previously not tested. They urged that non-tested grades and subjects should not be a part of the definition of student growth. By including them in this definition, the commenters argued, States and school districts would be required to test students in currently non-tested areas, which they contended should remain non-tested. Several commenters also stated that, even as the value of yearly student testing is being questioned, the regulations would effectively add cost and burden to States that have not sought ESEA flexibility or received Race to the Top funds.

    Discussion: These regulations define student growth as the change in student achievement between two or more points in time, using a student's score on the State's assessments under section 1111(b)(2) of the ESEA, as amended by ESSA, or other measures of student learning and performance, such as student results on pre-tests and end-of-course tests; objective performance-based assessments; student learning objectives; student performance on English language proficiency assessments; and other measures that are rigorous, comparable across schools, and consistent with State guidelines.
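    For illustration only, the following minimal sketch (in Python) shows one simple way a two-point growth measure of this kind might be computed from paired pre-test and end-of-course test scores. The data, function name, and simple averaging approach are hypothetical assumptions used for the example; the regulations do not require this or any other particular computation.

```python
# Illustrative sketch only: a minimal two-point growth calculation.
# All names and data are hypothetical; States may use very different
# measures (student learning objectives, VAM, performance-based
# assessments, etc.) so long as they are rigorous, comparable across
# schools, and consistent with State guidelines.
from statistics import mean

def simple_growth(pre_scores, post_scores):
    """Average change in achievement between two points in time for
    one teacher's students (paired pre- and end-of-course scores)."""
    if not pre_scores or len(pre_scores) != len(post_scores):
        raise ValueError("need matching, non-empty score lists")
    return mean(post - pre for pre, post in zip(pre_scores, post_scores))

# Hypothetical classroom of five students.
pre = [48, 52, 61, 39, 70]
post = [57, 66, 68, 52, 74]
print(f"Average growth: {simple_growth(pre, post):.1f} points")  # 9.4 points
```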

    Due to the removal of separate definitions of student achievement in tested grades and subjects and student achievement in non-tested grades and subjects, and their replacement by one flexible definition of student growth, we believe we have addressed many concerns raised by commenters. This definition, for example, no longer requires States to use ESEA standardized test scores to measure student growth in any grade or subject, and does not require the use of definitions of terms used for Race to the Top.

    We recognize commenters' assertion that student growth defined as a comparison of achievement between two points in time downplays the potential challenges of incorporating such measures into evaluation systems. However, since the revised definition of student growth and the revised requirements for calculating student learning outcomes allow States a large degree of flexibility in how such measures are applied, we do not believe the revised definition will place a significant burden on States to implement and incorporate these concepts into their teacher preparation assessment systems.

    We have addressed commenters' recommendation that non-tested grades and subjects not be a part of the definition of student growth by removing the definition of student achievement in non-tested grades and subjects, and providing States with flexibility in how they apply the definition of student growth, should they choose to use it for measuring a program's student learning outcomes. However, we continue to believe that student growth in non-tested grades and subjects can and should be measured at regular intervals. Further, the revisions to the definition address commenters' concerns that the regulations would effectively add cost and burden to States that have not sought ESEA flexibility or received Race to the Top funds.

    Consistent with the definition, and in conjunction with the altered requirements for the calculation of student learning outcomes, and the removal of the definition of student achievement in tested grades and subjects as well as the definition of student achievement in non-tested grades and subjects, States have significant flexibility to determine the methods they use for measuring student growth and the extent to which it is factored into a teacher preparation program's performance rating. The Department's revised definition of “student growth” is meant to provide States with more flexibility in response to commenters. Additionally, if a State chooses to use a method that controls for additional factors affecting student and teacher performance, like VAM, the regulations permit it to do so. See our response to comments in § 612.5(a)(1), which provides an in-depth discussion of the use of student growth and VAM.

    Changes: The definition of student growth has been revised to mean the change in student achievement between two or more points in time, using a student's scores on the State's assessments under section 1111(b)(2) of the ESEA or other measures of student learning and performance, such as student results on pre-tests and end-of-course tests; objective performance-based assessments; student learning objectives; student performance on English language proficiency assessments; and other measures that are rigorous, comparable across schools, and consistent with State guidelines. This replaces the proposed definition, which referred to the change between two or more points in time in student achievement in tested grades and subjects and non-tested grades and subjects.

    Student Learning Outcomes

    Comments: None.

    Discussion: Due to many commenters' concerns regarding State flexibility, the use of ESEA standardized test scores, and the relationships between our original proposed requirements and those under ESEA flexibility, we have included a provision in § 612.5(a)(1)(ii)(C) allowing States to use a State-determined measure relevant to calculating student learning outcomes. This measure may be used alone, or in combination with student growth and a teacher evaluation measure, as defined. As with the measure for student growth, State-determined learning outcomes must be rigorous, comparable across schools, and consistent with State guidelines. Additionally, such measures should allow for meaningful differentiation between teachers. If a State did not select an indicator that allowed for such meaningful differentiation among teachers, and instead chose an indicator that led to consistently high results among teachers without reflecting existing inconsistencies in student learning outcomes—such as average daily attendance in schools, which is often uniformly quite high even in the lowest performing schools—the result would be very problematic: the State could not meaningfully differentiate among teachers for the purposes of identifying which teachers, and thus which teacher preparation programs, are making a positive contribution to improving student learning outcomes.

    Further, upon review of the proposed regulations, we recognized that the structure could be confusing. In particular, we were concerned that having a definition for the term “student learning outcomes” in § 612.2, when it largely serves to operationalize other definitions in the context of § 612.5, was not the clearest way to present these requirements. We therefore are moving the explanations and requirements of this term into the text of § 612.5(a).

    Changes: We have altered the requirements in § 612.5(a)(1)(ii) for calculating “student learning outcomes” to provide States with additional flexibility. We have also removed the proposed definition of “student learning outcomes” from § 612.2, and moved the substance of the text and requirements of the student learning outcomes definition to § 612.5(a)(1).

    Survey Outcomes

    Comments: Commenters argued that States need flexibility on the types of indicators used to evaluate and improve teacher preparation programs. They suggested that States be required to gather data through teacher and employer surveys in a teacher's first three years of teaching, but be afforded the flexibility to determine the content of the surveys. Commenters added that specific content dictated from the Federal level would limit innovation in an area where best practices are still developing.

    Some commenters also stated that it is important to follow graduates through surveys for their first five years of employment, rather than just their first year of teaching (as proposed in the regulations) to obtain a rich and well-informed understanding of the profession over time, as the first five years is a significant period when teachers decide whether to leave or stay in the profession.

    Commenters were concerned about the inclusion of probationary certificate teachers in surveys of teachers and employers for purposes of reporting teacher preparation program performance. Commenters noted that, in Texas, alternate route participants may be issued a probationary certificate that allows the participants to be employed as teachers of record for a period of up to three years while they are completing the requirements for a standard certificate. As a result, these probationary certificate holders would meet the proposed definition of “new teacher” and, therefore, they and their supervisors would be asked to respond to surveys that States would use to determine teacher preparation program performance, even though they have not completed their programs.

    In addition, commenters asked which States are responsible for surveying teachers from a distance education program and their employers or supervisors.

    Discussion: The regulations do not specify the number or type of questions to be included in employer or teacher surveys. Rather, we have left decisions about the content of these surveys to each State. We also note that, under the regulations, States may survey novice teachers and their employers for a number of consecutive years, even though they are only required to survey during the first year of teaching.

    The goal of every teacher preparation program is to effectively prepare aspiring teachers to step into a classroom and teach all of their students well. As the regulations are intended to help States determine whether each teacher preparation program is meeting this goal, we have decided to focus on novice teachers in their first year of teaching, regardless of the type of certification the teachers have or the type of teacher preparation program they attended or are attending. When a teacher is given primary responsibility for the learning outcomes of a group of students, the type of program she attended or is still attending is largely irrelevant—she is expected to ensure that her students learn. We expect that alternative route teacher preparation programs are ensuring that the teachers they place in classrooms prior to completion of their coursework are sufficiently prepared to ensure student growth in that school year. We recognize that these teachers, and those who completed traditional teacher preparation programs, will grow and develop as teachers in their first few years in the classroom.

    We agree with commenters who suggested that surveying teachers and their employers about the quality of training in the teachers' preparation program would provide a richer and more well-informed understanding of the programs over time. However, we decline to require that States survey novice teachers and their employers for more than one year. As an indicator of novice teachers' academic content knowledge and teaching skills, these surveys are a much more robust measure of program performance in preparing novice teachers for teaching when completed in the first year of teaching. At that point, the program is still fresh, and teachers and employers can best focus on the unique impact of the program independent of other factors that may contribute to teaching quality, such as on-the-job training. If they so choose, States are free to survey novice teachers and their employers in subsequent years beyond a teacher's first year of teaching, and to consider the survey results in their assessment of teacher preparation program effectiveness.

    For teacher preparation programs provided through distance education, a State must survey the novice teachers described in the definition of “teacher survey” who have completed such a program and who teach in that State, as well as the employers of those same teachers.

    Changes: None.

    Comments: None.

    Discussion: Upon review, we recognized that the structure of the proposed regulations could be confusing. In particular, we were concerned that having a definition for the term “survey outcomes” in § 612.2, when it largely serves to operationalize other definitions in the context of § 612.5, was not the clearest way to present these requirements. We therefore are removing the definition of “survey outcomes” from § 612.2 and moving its explanations and requirements into § 612.5(a)(3).

    Through this change, we are clarifying that the surveys will assess whether novice teachers possess the academic content knowledge and teaching skills needed to succeed in the classroom. We do so for consistency with § 612.5(a), which requires States to assess, for each teacher preparation program, indicators of academic content knowledge and teaching skills of novice teachers from that program. We also have removed the provision that the survey is of teachers in their first year of teaching in the State where the teacher preparation program is located, and instead provide that the survey is of teachers in their first year of teaching in the State. This change is designed to be consistent with new language related to the reporting of teacher preparation programs provided through distance education, as discussed later in this document. Finally, we are changing the term “new teacher” to “novice teacher” for the reasons discussed under the definition of “novice teacher.”

    Changes: We have moved the content of the proposed definition of “survey outcomes” from § 612.2, with edits for clarity, to § 612.5(a)(3). We have also replaced the term “new teacher” with “novice teacher” in § 612.5(a)(3).

    Teacher Evaluation Measure

    Comments: Many commenters noted that the proposed definition of “teacher evaluation measure” is based on the definition of “student growth.” Therefore, commenters stated that the definition is based on VAM, which they argued, citing research, is not valid or reliable for this purpose.

    Discussion: The proposed definition of “teacher evaluation measure” did include a measure of student growth. However, while VAM reflects a permissible way to examine student growth, neither in the final definition of teacher evaluation measure nor anywhere else in these regulations is the use of VAM required. For a more detailed discussion of the use of VAM, please see the discussion of § 612.5(a)(1).

    Changes: None.

    Comments: Commenters stated that the proposed definitions of “teacher evaluation measure” and “student growth” offer value from a reporting standpoint and should be used when available. Commenters also noted that it would be useful to understand novice teachers' impact on student growth and recommended that States be required to report student growth outcomes separately from teacher evaluation measures where both are available.

    Commenters also noted that not all States may have teacher evaluation measures that meet the proposed definition because not all States require student growth to be a significant factor in teacher evaluations, as required by the proposed definition. Other commenters suggested that, while student growth or achievement should be listed as the primary factors in calculating teacher evaluation measures, other factors such as teacher portfolios and student and teacher surveys should be included as secondary considerations.

    Some commenters felt that any use of student performance to evaluate effectiveness of teacher instruction needs to include multiple measures over a period of time (more than one to two years) and take into consideration the context (socioeconomic, etc.) in which the instruction occurred.

    Discussion: We first stress that the regulations allow States to use “teacher evaluation measures” as one option for student learning outcomes; use of these measures is not required. States also may use student growth, another State-determined measure relevant to calculating student learning outcomes, or a combination of these three options.

    Furthermore, while we agree that reporting on student growth separately from teacher evaluation measures would likely provide the public with more information about the performance of novice teachers, we are committed to providing States the flexibility to develop performance systems that best meet their specific needs. In addition, because of the evident cost and burden of disaggregating student growth data from teacher evaluation measures, we do not believe that the HEA title II reporting system is the right vehicle for gathering this information. As a result, we decline to require separate reporting.

    States may consider having LEAs incorporate teacher portfolios and student and teacher surveys into teacher evaluation measures, as the commenters recommended. In this regard, we note that the definition of “teacher evaluation measure” requires use of multiple valid measures, and we believe that teacher evaluation systems that use such additional measures of professional practice provide the best information on a teacher's effectiveness. We also note that, because the definition of “novice teacher” encompasses the first three years as a teacher of record, teacher evaluation measures that include up to three years of student growth data are acceptable measures of student learning outcomes under § 612.5(a)(1). In addition, States can control for different kinds of student and classroom characteristics in ways that apply our definition of student learning outcomes and student growth. See the discussion of § 612.5(a)(1) for further information on the student learning outcomes indicator.

    With regard to the comment that some States lack teacher evaluation measures that meet the proposed definition because they do not require student growth to be a significant factor in teacher evaluations, we previously explained in our discussion of § 612.1 (and do so again in our discussion of § 612.6) our reasons for removing any proposed weightings of indicators from these regulations. Thus we have removed the phrase, “as a significant factor,” from the definition of teacher evaluation measure.

    Changes: We have removed the words “as a significant factor” from the second sentence of the definition.

    Comments: None.

    Discussion: In response to the student learning outcomes indicator, some commenters recommended that States be allowed to use the teacher evaluation system they have in place. By proposing definitions relevant to student learning outcomes that align with previous Department initiatives, our intention was that the teacher evaluation systems of States that include student growth as a significant factor, especially those that had been granted ESEA flexibility, would meet the requirements for student learning outcomes under the regulations. Upon further review, we determined that revision to the definition of “teacher evaluation measure” is necessary to ensure that States are able to use teacher evaluation measures to collect data for student learning outcomes if the teacher evaluation measures include student growth, and in order to ensure that the definition describes the measure itself, which is then operationalized through a State's calculation.

    We understand that some States and districts that use student growth in their teacher evaluation systems do not do so for teachers in their first year, or first several years, of teaching. We are satisfied that such systems meet the requirements of the regulations so long as student growth is used as one of the multiple valid measures to assess teacher performance within the first three years of teaching. To ensure such systems meet the definition of “teacher evaluation measure,” we are revising the phrase “in determining each teacher's performance level” in the first sentence of the definition so that it reads “in determining teacher performance.”

    Furthermore, for the reasons included in the discussion of §§ 612.1 and 612.6, we are removing the phrase “as a significant factor” from the definition. In addition, we are removing the phrase “of performance levels” from the second sentence of the definition, as inclusion of that phrase in the NPRM was an error.

    In addition, we have determined that the parenthetical phrase beginning “such as” could be shortened without changing the intent, which is to provide examples of other measures of professional practice.

    Finally, in response to commenters' desire for additional flexibility in calculating student learning outcomes, and given the newly enacted ESSA, under which waivers granted under ESEA flexibility will terminate as of August 1, 2016, we have revised the regulations so that States may use any State-determined measure relevant to calculating student learning outcomes, alone or in combination with student growth and a teacher evaluation measure.

    Changes: We have revised the definition of “teacher evaluation measure” by removing the phrase “By grade span and subject area and consistent with statewide guidelines, the percentage of new teachers rated at each performance level under” and replacing it with “A teacher's performance level based on.” We have also removed the final phrase “determining each teacher's performance level” and replaced it with “assessing teacher performance.” Finally, we have revised the parenthetical phrase beginning “such as” so that it reads “such as observations based on rigorous teacher performance standards, teacher portfolios, and student and parent surveys.”

    Teacher of Record

    Comments: Commenters requested that the Department establish a definition of “teacher of record,” but did not provide us with recommended language.

    Discussion: We used the term “teacher of record” in the proposed definition of “new teacher,” and have retained it as part of the definitions of “novice teacher” and “recent graduate.” We agree that a definition of “teacher of record” will be helpful and will add clarity to those two definitions.

    We are adopting a commonly used definition of “teacher of record” that focuses on a teacher or co-teacher who is responsible for student outcomes and determining a student's proficiency in the grade or subject being taught.

    Changes: We have added to § 612.2 a definition of “teacher of record,” and defined it to mean a teacher (including a teacher in a co-teaching assignment) who has been assigned the lead responsibility for student learning in a subject or course section.

    Teacher Placement Rate

    Comments: Some commenters questioned whether the Department has the authority to set detailed expectations for teacher placement rates. Several commenters expressed concerns about which individuals would and would not be counted as “placed” when calculating this rate. In this regard, the commenters argued that the Federal government should not mandate the definitive list of individuals whom a State may exclude from the placement rate calculation; rather, they stated that those decisions should be entirely up to the States.

    Discussion: In response to commenters who questioned the Department's authority to establish detailed expectations for a program's teacher placement rate, we note that the regulations simply define the teacher placement rate and how it is to be calculated. The regulations also generally require that States use it as an indicator of academic content knowledge and teaching skills when assessing a program's level of performance. And they require this use because we strongly believe both (1) that a program's teacher placement rate is an important indicator of academic content knowledge and teaching skills of recent graduates, and (2) that a rate that is very low, like one that is very high, is a reasonable indicator of whether the program is successfully performing one of its basic functions—to produce individuals who are hired as teachers of record.

    Contrary to the commenters' suggestion, the regulations do not establish any detailed expectations of what such a low (or high) teacher placement rate is or should be. That determination is left up to each State, in consultation with its group of stakeholders as required under § 612.4(c).

    We decline to accept commenters' recommendations to allow States to determine who may be excluded from placement rate calculations beyond the exclusions the regulations permit in the definition of “teacher placement rate.” Congress has directed that States report their teacher placement rate data “in a uniform and comprehensible manner that conforms to the definitions and methods established by the Secretary.” See section 205(a) of the HEA. We believe the groups of recent graduates that we permit States, at their discretion, to exclude from these calculations—teachers teaching out of State and in private schools, and teachers who have enrolled in graduate school or entered the military—reflect the most common and accepted groups of recent graduates that States should be able to exclude, either because States cannot readily track them or because individual decisions to forgo becoming teachers do not speak to the program's performance. Commenters did not propose another comparable group whose failure to become novice teachers should allow a State to exclude them in calculations of a program's teacher placement rate, and upon review of the comments we have not identified such a group.

    We accept that, in discussing this matter with its group of stakeholders, a State may identify one or more such groups of recent graduates whose decisions to pass up opportunities to become novice teachers are also reasonable. However, as we said above, a teacher placement rate becomes an indicator of a teacher preparation program's performance when it is unreasonably low, i.e., below a level of reasonableness the State establishes based on the fact that the program exists to produce new teachers. We are not aware of any additional category of recent graduates, beyond those already covered by the allowable exclusions, that is both sufficiently large and sufficiently outside the control of the teacher preparation program that including it would result in an unreasonably low teacher placement rate. Given this, we believe States do not need the additional flexibility that the commenters propose.

    Changes: None.

    Comments: Commenters also expressed concern about participants who are hired in non-teaching jobs while enrolled and then withdraw from the program to pursue those jobs, suggesting that these students should not be counted against the program. Some commenters questioned the efficacy of teacher placement rates as an indicator of teacher preparation program performance, given the number of teachers who may be excluded from the calculation for various reasons (e.g., those who teach in private schools). Other commenters were more generally concerned that the discretion granted to States to exclude certain categories of novice teachers meant that the information available on teacher preparation programs would not be comparable across States.

    Some commenters objected to permitting States to exclude teachers or recent graduates who take teaching positions out of State, arguing that, to be useful, placement rate data need to be gathered across State boundaries as program graduates work in numerous States.

    Discussion: We believe that the revised definition of “recent graduate,” as well as the allowable exclusions in the definitions of both teacher placement and retention rates, not only alleviate obvious sources of burden, but provide States with sufficient flexibility to calculate these rates in reasonable ways. Program participants who do not complete the program do not become recent graduates, and would not be included in calculations of the teacher placement rate. However, if the commenters intended to address recent graduates who were employed in non-teaching positions while in or after completing the program, we would decline to accept the recommendation to exclude individuals because we believe that, except for those who become teachers out of State or in private schools, those who enroll in graduate school, or those who enter the military (which the regulations permit States to exclude), it is important to assess teacher preparation programs based on factors that include their success rates in having recent graduates hired as teachers of record.

    With regard to the efficacy of the teacher placement rate as an indicator of program performance, we understand that employment outcomes, including teacher placement rates, are influenced by many factors, some of which are outside of a program's control. However, we believe that employment outcomes are, in general, a good reflection of program performance because they signal a program's ability to produce graduates whom schools and districts deem to be qualified and seek to hire and retain. Moreover, abnormally low employment outcomes are an indication that something about the program is amiss (just as abnormally high outcomes suggest something is working very well). Further discussion on this topic can be found under the subheading Employment Outcomes as a Measure of Performance, § 612.5(a)(2).

    While we are sympathetic to the commenters' concern that the proposed definition of teacher placement rate permits States to calculate employment outcomes only using data on teachers hired to teach in public schools, States may not, depending on State law, be able to require that private schools cooperate in the State data collection that the regulations require. We do note that, generally, teacher preparation programs are designed to prepare teachers to meet the requirements to teach in public schools nationwide, and over 90 percent of teachers in elementary and secondary schools do not work in private schools.13 Additionally, requiring States to collect data on teachers employed in private schools or out of State, as well as those who enroll in graduate school or enter the military, would create undue burden on States. The regulations do not prevent teacher preparation entities from working with their States to secure data on recent graduates who are subject to one or more of the permissible State exclusions, and they likewise do not prevent the State from using those data in calculating the program's employment outcomes, including teacher placement rates.

    13 According to data from the Bureau of Labor Statistics, in May 2014, of the 3,696,580 individuals employed as preschool, primary, secondary, and special education school teachers in elementary and secondary schools nationwide, only 358,770 were employed in private schools. See www.bls.gov/oes/current/naics4_611100.htm and www.bls.gov/oes/current/611100_5.htm .

    Similarly, we appreciate commenters' recommendation that the regulations include placement rate data for those recent graduates who take teaching positions in a different State. Certainly, many novice teachers do become teachers of record in States other than those where their teacher preparation programs are located. We encourage States and programs to develop interstate data-sharing mechanisms to facilitate reporting on indicators of program performance to be as comprehensive and meaningful as possible.

    Until States have a ready means of gathering these kinds of data on an interstate basis, we appreciate that many States may find the costs and complexities of this data-gathering to be daunting. On the other hand, we do not view the lack of these data (or the lack of data on recent graduates teaching in private schools) to undermine the reasonableness of employment outcomes as indicators of program performance. As we have explained, it is when employment outcomes are particularly low that they become indicators of poor performance, and we are confident that the States, working in consultation with their stakeholders, can determine an appropriate threshold for teacher placement and retention rates.

    Finally, we understand that the discretion that the regulations grant to each State to exclude novice teachers who teach in other States and who work in private schools (and those program graduates who go on to graduate school or join the military) means that the teacher placement rates for teacher preparation programs will not be comparable across States. This is not a major concern. The purpose of the regulations and the SRC itself is to ensure that each State reports those programs that have been determined to be low-performing or at-risk of being low-performing based on reasonable and transparent criteria. We believe that each State, in consultation with its stakeholders (see § 612.4(c)), should exercise flexibility to determine whether to have the teacher placement rate reflect inclusion of those program graduates identified in paragraph (ii) of the definition.

    Changes: None.

    Comments: Several commenters recommended that a State with a statewide preschool program that requires early educators to have postsecondary training and certification and State licensure be required to include data on early educators in the teacher placement rate, rather than simply permit such inclusion at the State's discretion.

    Discussion: We strongly encourage States with a statewide preschool program in which early educators are required to obtain State licensure equivalent to that of elementary school teachers to include these teachers in their placement data. However, we decline to require States to include these early educators in calculations of programs' teacher placement rates because early childhood education centers are often independent of local districts or are run by external entities. That independence would make it extremely difficult for States to determine a valid and reasonable placement rate for these teachers.

    Changes: None.

    Comments: Commenters recommended that teachers who have been hired in part-time teaching positions be counted as “placed,” arguing that the placement of teachers in part-time teaching positions is not evidence of a lower quality teacher preparation program.

    Discussion: We are persuaded by comments that a teacher may function in a part-time capacity as a teacher of record in the subject area and grade level for which the teacher was trained and that, in those instances, it would not be appropriate to count this part-time placement against a program's teacher placement rate. As such, we have removed the requirement that a teacher placement rate be based on the percentage of recent graduates teaching in full-time positions.

    Changes: We have removed the full-time employment requirement from the definition of “teacher placement rate.”

    Comments: Commenters asked whether a participant attending a teacher preparation program who is already employed as a teacher by an LEA prior to graduation would be counted as “placed” post-graduation. Commenters felt that excluding such students may unduly penalize programs that tailor their recruitment of aspiring teachers to working adults.

    Discussion: We are uncertain whether the commenter is referring to a teacher who has already received initial certification or licensure and is enrolled in a graduate degree program, or to a participant in an alternative route to certification program who is working as a teacher as a condition of participation in the program. As discussed in the section titled “Teacher Preparation Program,” a teacher preparation program is defined, in part, as a program that prepares an individual for initial certification or licensure. As a result, it is unlikely that an already certified working teacher would be participating in such a program. See the section titled “Alternative Route Programs” for a discussion of the use of teacher placement rate in alternative route programs.

    Changes: None.

    Comments: Some commenters recommended that the teacher placement rate calculation account for regional differences in job availability and the general competitiveness of the employment market. In addition, commenters argued that placement rates should also convey whether the placement is in the area in which the candidate is trained to teach or out-of-field (i.e., where there is a mismatch between the teacher's content training and the area of the placement). The commenters suggested that young teachers may be more likely to get hired in out-of-field positions because they are among the few willing to take those jobs. Commenters contended that many teachers from alternative route programs (including Teach for America) are in out-of-field placements and should be recognized as such. Commenters also argued that high-need schools are notoriously staffed by out-of-field teachers, thus, they recommended that placement rate data account for the congruency of the placement. The commenters stated this is especially important if the final regulations include placement rates in high-need schools as an indicator of program performance.

    Discussion: We encourage entities operating teacher preparation programs to take factors affecting supply and demand, such as regional differences in job availability and the general competitiveness of the employment market, into consideration when they design and implement their programs and work to have their participants placed as teachers.

    Nonetheless, we decline to accept the recommendation that the regulations require that the teacher placement rate calculation account for these regional differences in job availability and the competitiveness of the employment market. Doing so would be complex, and would entail very large costs of cross-tabulating data on teacher preparation program location, area of residence of the program graduate, teacher placement data, and a series of employment and job market indicators. States may certainly choose to account for regional differences in job availability and the general competitiveness of the employment market and pursue the additional data collection that such effort would entail. However, we decline to require it.

    As explained in the NPRM, while we acknowledge that teacher placement rates are affected by some considerations outside of the program's control, we believe that placement rates are still a valid indicator of the quality of a teacher preparation program (see the discussion of employment outcomes under § 612.5(a)(2)).

    We understand that teachers may be hired to teach subjects and areas in which they were not prepared, and that out-of-field placement is more frequent in high-need schools. However, we maintain the requirement that the teacher placement rate assess the extent to which program graduates become novice teachers in the grade level, grade span, and subject area in which they were trained. A high incidence of out-of-field placement indicates that the teacher preparation program is out of touch with the hiring needs of likely prospective employers and is providing its participants with academic content knowledge and teaching skills in fields that do not match employers' teaching needs. We also recognize that placing teachers in positions for which they were not prepared could lead to less effective teaching and exacerbate the challenges already apparent in high-need schools.

    Changes: None.

    Comments: Some commenters stated that, while it is appropriate to exclude the categories of teachers listed in the proposed definition of “teacher placement rate,” data on the excluded teachers would still be valuable to track for purposes of the State's quality rating system. Commenters proposed requiring States to report the number of teachers excluded in each category.

    Discussion: Like the commenters, we believe that the number of recent graduates that a State excludes from its calculation of a program's teacher placement rate could provide useful information to the program. For reasons expressed above in response to comments, however, we believe a program's teacher placement rate will be a reasonable measure of program performance without reliance on the number of teachers in each category whom a State chooses to exclude from its calculations. Moreover, we do not believe that the number of recent graduates who go on to teach in other States or in private schools, or who enter graduate school or the military, is a reflection of a program's quality. Because the purpose of the teacher placement rate, like all of the regulations' indicators of academic content knowledge and teaching skills, is to provide information on the performance of the program, we decline to require that States report these data in their SRCs. We nonetheless encourage States to consider obtaining, securing, and publicizing these data as a way to make the information they provide about each program more robust.

    Changes: None.

    Comments: Commenters stated that it is important to have teacher placement data beyond the first year following graduation, because graduates sometimes move among districts in the early years of their careers. One commenter noted that, in the commenter's State, data are currently available only for teachers in their first year of teaching, and that there is an important Federal role in securing these data beyond this first year.

    Discussion: From our review of the comments, we are unclear whether the commenters intended to refer to a program's teacher retention rate, because recent graduates who become novice teachers and then immediately move to another district would be captured by the teacher retention rate calculation. But because our definition of “novice teacher” includes an initial three-year teaching period, a program's teacher retention rate would continue to track these teachers in future years.

    In addition, we believe a number of commenters may have misunderstood how the teacher placement rate is calculated and used. Specifically, some commenters seemed to believe that the teacher placement rate is calculated only in the first year after program completion. This is inaccurate. The teacher placement rate is determined by calculating the percentage of recent graduates who have become novice teachers, regardless of their retention. As such, the teacher placement rate captures any recent graduate who works as a teacher of record in an elementary or secondary public school, which may include preschool at the State's discretion, within three years of program completion.

    In order to provide additional clarity, we provide the following example. We examine a theoretical group of graduates from a single teacher preparation program, as outlined in Table 1. In examining the example, it is important to understand that a State reports in its SRC for a given year a program's teacher placement rate based on data from the second preceding title II reporting year (as the term is defined in the regulations). Thus, recent graduates in 2018 (in the 2017-2018 title II reporting year) might become novice teachers in 2018-2019. The State collects these data in time to report them in the SRC to be submitted in October 2019. Please see the discussion of the timing of the SRC under § 612.4(a)(1)(i) General State Report Card reporting and § 612.4(b) Timeline for changes in the reporting timeline from that proposed in the NPRM.

    [Table 1—Hypothetical teacher placement rate example (graphics ER31OC16.000 and ER31OC16.001 not reproduced here)]

    In this example, the teacher preparation program has five individuals who met all of the requirements for program completion in the 2016-2017 academic year. The State counts these individuals (A, B, C, D, and E) in the denominator of the placement rate for the program's recent graduates in each of the State's 2018, 2019, and 2020 SRCs because, in each of those years, they are recent graduates who have become, or could still become, novice teachers. Moreover, in each of these years, the State would determine how many of these individuals have become novice teachers. In the 2018 SRC, the State identifies that A and B have become novice teachers in the prior reporting year. As such, the State divides the total number of recent graduates who have become novice teachers (2) by the total number of recent graduates from 2016-2017 (5). Hence, in the 2018 SRC, this teacher preparation program has a teacher placement rate of 40 percent.

    In the State's 2019 SRC, all individuals who completed the program in 2017 and those who completed in 2018 (the 2016-2017 and 2017-2018 title II reporting years) meet the definition of recent graduate. In the 2018-2019 academic year, one additional completer from the 2016-2017 academic year has become a novice teacher (C), and five (F, G, H, J, and K) of the six 2017-2018 program completers have become novice teachers. In this instance, Teacher J is included as a recent graduate who has become a novice teacher even though Teacher J is not teaching in the current year. This is because the definition requires inclusion of all recent graduates who have become novice teachers at any time, regardless of their retention. Teacher J is counted as a successfully placed teacher. The fact that Teacher J is no longer still employed as a teacher is captured in the teacher retention rate, not here. As such, in the 2019 SRC, the teacher preparation program's teacher placement rate is 73 percent (eight program completers out of eleven have been placed).

    In the State's 2020 SRC, there are no additional cohorts to add to the pool of recent graduates in this example although, in reality, States will be calculating this measure using three rolling cohorts of program completers each year. In this example, Teacher D has newly obtained placement as a novice teacher and would therefore be included in the numerator. As with Teacher J in the prior year's SRC, Teachers G and K remain in the numerator even though they are no longer teachers of record because they have been placed as novice teachers previously. In the 2020 SRC, the teacher preparation program's teacher placement rate is 82 percent (nine program completers out of eleven have been placed).

    In the 2021 SRC, individuals who completed their teacher preparation program in the 2016-2017 academic year (A, B, C, D, and E) are no longer considered recent graduates since they completed their programs prior to the preceding three title II reporting years (2018, 2019, 2020). As such, the only cohort of recent graduates the State examines for this hypothetical teacher preparation program is the one that completed the program in the 2017-2018 academic year (F, G, H, I, J, and K). In the 2020-2021 academic year, Teacher I is placed as a novice teacher. Once again, Teachers G and J are included in the numerator even though they are not currently employed as teachers because they have previously been placed as novice teachers. The program's teacher placement rate in the 2021 SRC would be 100 percent.

    In the 2022 SRC, this hypothetical teacher preparation program has no recent graduates, as no one completed the requirements of the program in any of the three preceding title II reporting years (2019, 2020, or 2021).

    As noted above, it is important to restate that recent graduates who have become novice teachers at any point, such as Teacher J, are included in the numerator of this calculation, regardless of whether they were retained as a teacher of record in a subsequent year. As such, if an individual completed a teacher preparation program in Year 1 and became a novice teacher in Year 2, regardless of whether he or she is still a novice teacher in Year 3, the individual is considered to have been successfully placed under this measure. Issues regarding retention of teachers are captured by the teacher retention rate measure, and therefore departures from a teaching position have no negative consequences under the teacher placement rate.
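
    The arithmetic illustrated above can be summarized in a short script. The following is only an illustrative sketch and is not part of the regulations: the data structure, field names, and year labels (each title II reporting year is labeled by the calendar year in which it ends, so 2018 denotes 2017-2018) are hypothetical and simply mirror the example in Table 1, and the sketch assumes a State can identify, for each program completer, the reporting year of completion and the reporting year, if any, in which the completer first became a novice teacher.

# Illustrative sketch only; the data mirror the hypothetical program in Table 1.
# Reporting years are labeled by their ending calendar year (2018 = 2017-2018).

completers = {
    # completer: (reporting year of completion,
    #             reporting year first employed as a novice teacher, or None)
    "A": (2017, 2018), "B": (2017, 2018), "C": (2017, 2019),
    "D": (2017, 2020), "E": (2017, 2021),  # E first placed after aging out of recent-graduate status
    "F": (2018, 2019), "G": (2018, 2019), "H": (2018, 2019),
    "I": (2018, 2021), "J": (2018, 2019), "K": (2018, 2019),
}

def placement_rate(src_year, completers):
    """Teacher placement rate reported in the October SRC of src_year.

    Denominator: recent graduates, i.e., completers from the three preceding
    title II reporting years.  Numerator: those recent graduates who became
    novice teachers at any time through the reporting year ending in the SRC
    year, regardless of whether they are still teaching.
    """
    recent = [c for c, (done, _) in completers.items()
              if src_year - 3 <= done <= src_year - 1]
    if not recent:
        return None  # no recent graduates to report on
    placed = [c for c in recent
              if completers[c][1] is not None and completers[c][1] <= src_year]
    return 100.0 * len(placed) / len(recent)

for year in (2018, 2019, 2020, 2021, 2022):
    rate = placement_rate(year, completers)
    print(year, "no recent graduates" if rate is None else f"{round(rate)} percent")

    Run against the hypothetical cohort above, the sketch reproduces the rates discussed in the example: 40 percent in the 2018 SRC, 73 percent in 2019, 82 percent in 2020, 100 percent in 2021, and no reportable rate in 2022.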

    We have adopted these procedures for State reporting of a program's teacher placement rate in each year's SRC to keep them consistent with the proposal we presented in the NPRM for reporting teacher placement rates over a three-year period, in line with the change in the SRC reporting date, and as simple and straightforward as possible. This led us to make certain non-substantive changes to the proposed definition of teacher placement rate so that the definition is clearer and less verbose. In doing so, we have removed the State's option of excluding novice teachers who have taken teaching positions that do not require State certification (paragraph (ii)(C) of the proposed definition) because it seems superfluous; our definition of teacher preparation program is one that leads to an initial State teacher certification or licensure in a specific field.

    Changes: We have revised the definition of “teacher placement rate” to include:

    (i) The percentage of recent graduates who have become novice teachers (regardless of retention) for the grade level, span, and subject area in which they were prepared.

    (ii) At the State's discretion, exclusion from the rate calculated under paragraph (i) of this definition of one or more of the following, provided that the State uses a consistent approach to assess and report on all of the teacher preparation programs in the State:

    (A) Recent graduates who have taken teaching positions in another State.

    (B) Recent graduates who have taken teaching positions in private schools.

    (C) Recent graduates who have enrolled in graduate school or entered military service.

    Comments: None.

    Discussion: The Department recognizes that a State may be unable to accurately determine the total number of recent graduates in cases where a teacher preparation program provided through distance education is offered by a teacher preparation entity that is physically located in another State. Each institution of higher education conducting a teacher preparation program is required to submit an IRC, which would include the total number of recent graduates from each program, to the State in which it is physically located. If the teacher preparation entity operates a teacher preparation program provided through distance education in other States, it is not required to submit an IRC in those States. As a result, a State with a teacher preparation program provided through distance education that is operated by an entity physically located in another State will not have access to information on the total number of recent graduates from such program. Even if the State could access the number of recent graduates, recent graduates who neither reside in nor intend to teach in such State would be captured, inflating the number of recent graduates and resulting in a teacher placement rate that is artificially low.

    For these reasons, we have determined that it is appropriate to allow States to use the total number of recent graduates who have obtained initial certification or licensure in the State, rather than the total number of recent graduates, when calculating teacher placement rates for teacher preparation programs provided through distance education. We believe that a teacher placement rate calculated using the number of recent graduates who have obtained initial certification or licensure is likely to be more accurate in these instances than one calculated using the total number of recent graduates from a multi-State program. Even so, since fewer recent graduates obtain initial certification or licensure than the total number of recent graduates, the teacher placement rate may be artificially high. To address this, we have also revised the employment outcomes section in § 612.5(a)(2) to allow States a greater degree of flexibility in calculating and weighting employment outcomes for teacher preparation programs offered through distance education.

    Changes: We have revised the definition of teacher placement rate in § 612.2 to allow States to use the total number of recent graduates who have obtained initial certification or licensure in the State during the three preceding title II reporting years as the denominator in their calculation of teacher placement rate for teacher preparation programs provided through distance education instead of the total number of recent graduates.
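
    To illustrate the effect of this flexibility, the following sketch uses invented figures (they appear nowhere in the regulations or this preamble) to contrast the standard denominator with the alternative denominator now permitted for programs provided through distance education.

# Hypothetical figures for a distance-education program operated by an entity
# physically located in another State; none of these numbers come from the rule.
recent_graduates_nationwide = 400   # often unknown to the reporting State, and inflated
certified_in_state = 60             # recent graduates with initial certification in the State
placed_in_state = 45                # of those, the number who became novice teachers in the State

standard_rate = 100.0 * placed_in_state / recent_graduates_nationwide    # about 11 percent
alternative_rate = 100.0 * placed_in_state / certified_in_state          # 75 percent

print(f"standard denominator: {standard_rate:.0f} percent")
print(f"alternative denominator (certified in State): {alternative_rate:.0f} percent")

    As the discussion above notes, the first figure is artificially low because its denominator captures graduates who neither reside nor intend to teach in the State, while the second may run somewhat high; the sketch simply makes that trade-off concrete.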

    Teacher Preparation Program

    Comments: Commenters stated that the regulations are designed for undergraduate teacher preparation programs rather than graduate programs, in part because the definition of teacher preparation program is linked to specific teaching fields. This could result in small program sizes for post-baccalaureate preparation programs.

    Another commenter noted that it offers a number of graduate degree programs in education that do not lead to initial certification, but that the programs which institutions and States report on under part 612 are limited to those leading to initial certification.

    Other commenters urged that aggregation of data to elementary and secondary data sets would be more appropriate in States with a primarily post-baccalaureate teacher preparation model. We understand that commenters are suggesting that our proposed definition of “teacher preparation program,” with its focus on the provision of a specific license or certificate in a specific field, will give States whose programs are primarily at the post-baccalaureate level considerable trouble collecting and reporting data for the required indicators given their small size. (See generally § 612.4(b)(3).)

    Discussion: The definition of teacher preparation program in the regulations is designed to apply to both undergraduate and graduate level teacher preparation programs. We do not agree that the definition is designed to fit teacher preparation programs better at one or another level. With regard to the commenters' concerns about greater applicability to graduate-level programs, while the commenters identified these as concerns regarding the definition of teacher preparation program, we understand the issues described to be about program size, which is addressed in § 612.4(b). As such, these comments are addressed in the discussion of program size under § 612.4(b)(3)(ii). We do believe that it is important to clarify that a teacher preparation program for purposes of title II, HEA reporting is one that leads to initial certification, as has been the case under the title II reporting system since its inception.

    Changes: We have revised the definition of the term “teacher preparation program” to clarify that it is one that leads to initial State teacher certification or licensure in a specific field.

    Comments: Commenters noted that, because teacher preparation programs in some States confer academic degrees (e.g., Bachelor of Arts in English) on graduates rather than degrees in education, it would be impossible to identify graduates of teacher preparation programs and obtain information on teacher preparation graduates. Additionally, some commenters were concerned that the definition does not account for students who transfer between programs or institutions, or distinguish between students who attended more than one program; it confers all of the credit or responsibility for these students' academic content knowledge and teaching skills on the program from which the student graduates. In the case of alternative route programs, commenters stated that students may have received academic training from a different program, which could unfairly either reflect poorly on, or give credit to, the alternative route program.

    Discussion: Under the regulatory definition of the term, a teacher preparation program, whether alternative route or traditional, must lead to an initial State teacher certification or licensure in a specific field. As a result, a program that does not lead to an initial State teacher certification or licensure in a specific field (e.g., a Bachelor of Arts in English without some additional education-related coursework) is not considered a teacher preparation program that is reported on under title II. For example, a program that provides a degree in curriculum design, confers a Master of Education, but does not prepare students for an initial State certification or licensure, would not qualify as a teacher preparation program under this definition. However, a program that prepares individuals to be high school English teachers, including preparing them for an initial State certification or licensure, but confers no degree would be considered a teacher preparation program. The specific type of degree granted by the program (if any) is irrelevant to the definition in these regulations. Regardless of their structure, all teacher preparation programs are responsible for ensuring their students are prepared with the academic content knowledge and teaching skills they need to succeed in the classroom. Therefore, because the regulatory definition of teacher preparation program encompasses all teacher preparation programs, regardless of their structure, that lead to initial State teacher certification or licensure in a specific field, States must report on the performance and associated data of each of these programs.

    While we understand that students often transfer during their college careers, we believe that the teacher preparation program that ultimately determines that a student is prepared for initial certification or licensure is the one responsible for his or her performance as a teacher. This is so regardless of whether the student started in that program or a different one. The same is true for alternative route programs. Since alternative route programs enroll individuals who have had careers, work experiences, or academic training in fields other than education, participants in these programs have almost by definition had academic training elsewhere. However, we believe it is fully appropriate to have the alternative route program assume full responsibility for effective teacher training under the title II reporting system, as it is the program that determined the teacher to have sufficient academic content knowledge and teaching skills to complete the requirements of the program.

    Finally, we note that in § 612.5(a)(4), the regulations also require States to determine whether teacher preparation programs have rigorous exit requirements. Hence, regardless of student transfers, the public will know whether the State considers program completers to have reached a high standard of preparation.

    Changes: None.

    Comments: None.

    Discussion: In considering the comments we received on alternative route to certification programs, we realized that our proposed definition of “teacher preparation program” did not address the circumstance where the program, while leading to an initial teacher certification or licensure in a specific field, enrolls some students in a traditional teacher preparation program and other students in an alternative route to certification program (i.e., hybrid programs). Like the students enrolled in each of these two programmatic components, the components themselves are plainly very different. Principally, one offers instruction to those who will not become teachers of record until after they graduate and become certified to teach, while the other offers instruction to those who already are teachers of record (and have met State requirements to teach while enrolled in their teacher preparation program), and thereby supports and complements those individuals' current teaching experiences. Thus, while each component is “offered by [the same] teacher preparation entity” and “leads to an initial State teacher certification or licensure in a specific field,” this is where the similarity may end.

    We therefore have concluded that our proposed definition of a teacher preparation program does not fit these hybrid programs. Having an IHE or the State report composite information for a teacher preparation program that has both a traditional and alternative route component does not make sense; reporting in the aggregate will mask what is happening with or in each component. The clearest and simplest way to avoid the confusion in reporting that would otherwise result is to have IHEs and States treat each component of such a hybrid program as its own teacher preparation program. We have revised the definition of a “teacher preparation program” in § 612.2 to do just that. While doing so may create more small teacher preparation programs that require States to aggregate data under § 612.4(b)(3)(ii), this consequence will be far outweighed by the benefits of cleaner and clearer information.

    Changes: We have revised the definition of a “teacher preparation program” in § 612.2 to clarify that where some participants in the program are in a traditional route to certification or licensure in a specific field, and others are in an alternative route to certification or licensure in that same field, the traditional route component and the alternative route component are each treated as a separate teacher preparation program.

    Teacher Retention Rate

    Comments: Some commenters stated that by requiring reporting on teacher retention rates, both generally and for high-need schools, program officials—and their potential applicants—can ascertain if the programs are aligning themselves with districts' staffing needs.

    Other commenters stated that two of the allowable options for calculating the teacher retention rate would provide useful information regarding: (1) The percentage of new teachers hired into full-time teaching positions and serving at least three consecutive years within five years of being certified or licensed; and (2) the percentage of new teachers hired full-time and reaching tenure within five years of being certified. According to commenters, the focus of the third option, new teachers who were hired and then fired for reasons other than budget cuts, could be problematic because it overlooks teachers who voluntarily leave high-need schools, or the profession altogether. Other commenters recommended removing the definition of teacher retention rate from the regulations.

    Another commenter stated that the teacher retention rate, which we had proposed to define as any of the three specific rates as selected by the State, creates the potential for incorrect calculations and confusion for consumers when teachers have initial certification in multiple States; however, the commenter did not offer further information to clarify its meaning. In addition, commenters stated that the proposed definition allows for new teachers who are not retained due to market conditions or circumstances particular to the LEA and beyond the control of teachers or schools to be excluded from calculation of the retention rate, a standard that allows each school to determine the criteria for those conditions, which are subject to interpretation.

    Several commenters requested clarification of the definition. Some asked us to clarify what we meant by tenure. Another commenter asked us to clarify how to treat teachers on probationary certificates.

    Another commenter recommended that the Department amend the teacher retention rate definition so that it is used to help rate teacher preparation programs by comparing the program's recent graduates who demonstrate effectiveness and remain in teaching to those who fail to achieve high ratings on evaluations. One commenter suggested that programs track the number of years graduates taught over the course of five years, regardless of whether or not the years taught were consecutive. Others suggested shortening the timeframe for reporting on retention so that the rate would be reported for each of three consecutive years and, as we understand the comments, would apply to individuals after they became novice teachers.

    Discussion: We agree with the commenters who stated that reporting on teacher retention rates both generally and for high-need schools ensures that teacher preparation programs are aligning themselves with districts' staffing needs.

    In response to comments, we have clarified and simplified the definition of teacher retention rate. We agree with commenters that the third proposed option, by which one subtracts from 100 percent the percentage of novice teachers who were hired and fired for reasons other than budget cuts, is not a true measure of retention because it excludes those who voluntarily leave the profession. Therefore, we have removed it as an option for calculating the retention rate. Doing so also addresses those concerns that the third option allowed for too much discretion in interpreting when local conditions beyond the schools' control caused teachers to no longer be retained.

    We also agree with commenters that the second proposed option for calculating the rate, which looked to the percentage of new teachers not receiving tenure within five years, is confusing and does not make sense when looking at new teachers, which we had proposed to define as covering a three-year teaching period, as tenure may not be reached during that timeframe. For these reasons, we also have removed this option from the definition. Doing so addresses the commenters' concerns that multiple methods for calculating the rate would create confusion. We also believe this addresses the comments regarding our use of the term tenure as potentially causing confusion.

    We also note that our proposed definition of teacher retention rate did not bring in the concept of certification in the State in which one teaches. Therefore, we do not believe this definition will cause the confusion identified by the commenter who was concerned about teachers who were certified to teach in multiple States.

    Additionally, we revised the first option for calculating the teacher retention rate to clarify that the rate must be calculated three times for each cohort of novice teachers—after the first, second, and third years as a novice teacher. We agree with commenters who recommended shortening the timeframe for reporting on retention from three of five years to the first three consecutive years. We made this change because the definition of recent graduate already builds in a three-year window to allow for delay in placement, and to simplify the data collection and reporting requirements associated with this indicator.

    We also agree with the recommendation that States calculate a program's retention rate based on three consecutive years after individuals become novice teachers. We believe reporting on each year for the first three years is a reasonable indicator of academic content and teaching skills in that it shows how well a program prepares novice teachers to remain in teaching, and also promotes greater transparency and helps employers make more informed hiring decisions. We note that teacher retention rate is calculated for all novice teachers, which includes those on probationary certificates. This is further explained in the discussion of “Alternative Route Programs” in § 612.5(a)(2).

    We appreciate the suggestions that we should require States to report a comparison of retention rates of novice teachers based on their evaluation ratings, but decline to prescribe this measure as doing so would create costs and complexities that we do not think are sufficiently necessary in determining a program's broad level of performance. States that are interested in such information for the purposes of transparency or accountability are welcome to consider it as another criterion for assessing program performance or for other purposes.

    [Graphics ER31OC16.002 and ER31OC16.003 appear here in the published document.]

    When calculating teacher retention rate, it is important to first note that the academic year in which an individual met all of the requirements for program completion is not relevant. In contrast to the teacher placement rate, the defining factor in a teacher retention rate calculation is the first year in which an individual becomes a teacher of record for P-12 public school students. In this example, we use the same basic information as we did for the teacher placement rate example. As such, Table 2a recreates Table 1, with calculations for teacher retention rate instead of the teacher placement rate. However, because the first year in which an individual becomes a novice teacher is the basis for the calculations, rather than the year of program completion, we can rearrange Table 2a in the order in which teachers first became novice teachers, as in Table 2b.

    In addition, Table 2b removes data on program completion, and eliminates both extraneous information before an individual becomes a novice teacher and employment information after the State is no longer required to report on these individuals for purposes of the teacher retention rate.

    [Graphics ER31OC16.004 and ER31OC16.005 appear here in the published document.]

    In this example, this particular teacher preparation program has five individuals who became novice teachers for the first time in the 2017-2018 academic year (Teachers A, B, F, G, and J). For purposes of this definition, we refer to these individuals as a cohort of novice teachers. As described below, the State will first calculate a teacher retention rate for this teacher preparation program in the October 2019 State report card. In that year, the State will determine how many members of the 2017-2018 cohort of novice teachers have been continuously employed through the current year. Of Teachers A, B, F, G, and J, only teachers A, B, F, and G are still teaching in 2018-2019. As such, the State calculates a teacher retention rate of 80 percent for this teacher preparation program for the 2019 State Report Card.

    In the October 2020 SRC, the State is required to report on the 2017-2018 cohort and the 2018-2019 cohort. The membership of the 2017-2018 cohort does not change. From that cohort, Teachers A, B, and F were employed in both the 2018-2019 academic year and the 2019-2020 academic year. The 2018-2019 cohort consists of Teachers C, H, and K. Of those, only Teachers C and H are employed as teachers of record in the 2019-2020 academic year. Therefore, the State reports a teacher retention rate of 60 percent for the 2017-2018 cohort—because three teachers (A, B, and F) were continuously employed through the current year out of the five total teachers (A, B, F, G, and J) in that cohort—and 67 percent for the 2018-2019 cohort—because two teachers (C and H) were employed in the current year out of the three total teachers (C, H, and K) in that cohort.

    In the October 2021 SRC, the State will be reporting on three cohorts of novice teachers for the first time—the 2017-2018 cohort (A, B, F, G, and J), the 2018-2019 cohort (C, H, and K), and the 2019-2020 cohort (D). Of the 2017-2018 cohort, only Teachers A and F have been continuously employed as teachers of record since the 2017-2018 academic year; therefore, the State will report a retention rate of 40 percent for this cohort (two out of five). Of the 2018-2019 cohort, only Teachers C and H have been continuously employed since the 2018-2019 academic year. Despite being a teacher of record for the 2020-2021 academic year, Teacher K does not count towards this program's teacher retention rate because Teacher K was not a teacher of record in the 2019-2020 academic year, and therefore has not been continuously employed. The State would report a 67 percent retention rate for the 2018-2019 cohort (two out of three). For the 2019-2020 cohort, Teacher D is still a teacher of record in the current year. As such, the State reports a teacher retention rate of 100 percent for that cohort.

    Beginning with the 2022 SRC, the State no longer reports on the 2017-2018 cohort. Instead, the State reports on the three most recent cohorts of novice teachers—2018-2019 (C, H, and K), 2019-2020 (D), and 2020-2021 (E and I). Of the members of the 2018-2019 cohort, both Teachers C and H have been employed as teachers of record in each year from their first year as teachers of record through the current reporting year. Teacher K is still not included in the calculation because of the failure to be employed as a teacher of record in the 2019-2020 academic year. Therefore, the State reports a 67 percent retention rate for this cohort. Of the 2019-2020 cohort, Teacher D has been employed in each academic year since first becoming a teacher of record. The State would report a 100 percent retention rate for this cohort. Teachers E and I, of the 2020-2021 cohort, have also been retained in the 2021-2022 academic year. As such, the State reports a teacher retention rate of 100 percent in the 2022 SRC for this cohort.
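
    The retention rate arithmetic can be sketched in the same illustrative fashion as the placement rate example above. Again, this is not part of the regulations: the employment histories simply mirror those described for Tables 2a and 2b, reporting years are labeled by their ending calendar year, and the function name is hypothetical. The sketch assumes the State knows, for each novice teacher, the first year as a novice teacher and every subsequent year of employment as a teacher of record.

# Illustrative sketch only; employment histories mirror the example in Tables 2a and 2b.
# Reporting years are labeled by their ending calendar year (2018 = 2017-2018).

teachers = {
    # teacher: (first year as a novice teacher, years employed as a teacher of record)
    "A": (2018, {2018, 2019, 2020, 2021}),
    "B": (2018, {2018, 2019, 2020}),
    "F": (2018, {2018, 2019, 2020, 2021}),
    "G": (2018, {2018, 2019}),
    "J": (2018, {2018}),
    "C": (2019, {2019, 2020, 2021, 2022}),
    "H": (2019, {2019, 2020, 2021, 2022}),
    "K": (2019, {2019, 2021}),   # not a teacher of record in 2019-2020, so continuity is broken
    "D": (2020, {2020, 2021, 2022}),
    "E": (2021, {2021, 2022}),
    "I": (2021, {2021, 2022}),
}

def retention_rates(src_year, teachers):
    """Teacher retention rates reported in the October SRC of src_year.

    One rate is reported for each cohort of novice teachers identified in the
    three title II reporting years preceding the current one.  A teacher counts
    as retained only if employed as a teacher of record in every year from the
    first year as a novice teacher through the year ending in the SRC year.
    """
    rates = {}
    for cohort_year in range(src_year - 3, src_year):
        cohort = [t for t, (first, _) in teachers.items() if first == cohort_year]
        if not cohort:
            continue
        retained = [t for t in cohort
                    if all(y in teachers[t][1]
                           for y in range(cohort_year, src_year + 1))]
        rates[cohort_year] = 100.0 * len(retained) / len(cohort)
    return rates

for year in (2019, 2020, 2021, 2022):
    summary = {c: f"{round(r)} percent"
               for c, r in sorted(retention_rates(year, teachers).items())}
    print(year, summary)

    Run on these hypothetical histories, the sketch reproduces the rates in the example: 80 percent for the 2017-2018 cohort in the 2019 SRC; 60 and 67 percent in 2020; 40, 67, and 100 percent in 2021; and 67, 100, and 100 percent in 2022.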

    Changes: We have revised the definition of teacher retention rate by removing the second and third proposed options for calculating it. We have replaced the first option with a method for calculating the percentage of novice teachers who have been continuously employed as teachers of record in each year between their first year as a novice teacher and the current reporting year. In doing so, we also clarify that the teacher retention rate is based on the percentage of novice teachers in each of the three cohorts of novice teachers immediately preceding the current title II reporting year.

    Comments: None.

    Discussion: Upon review of comments, we recognized that the data necessary to calculate teacher retention rate, as we had proposed to define this term, will not be available for the October 2018, 2019, and 2020 State reports. We have therefore clarified in § 612.5(a)(2)(ii) the reporting requirements for this indicator for these initial implementation years. In doing so, we have re-designated proposed § 612.5(a)(2)(ii), which permits States to assess traditional and alternative route teacher preparation programs differently based on whether there are specific components of the programs' policies or structure that affect employment outcomes, as § 612.5(a)(2)(iii).

    Changes: We have added § 612.5(a)(2)(ii) to clarify that: For the October 2018 State report, the rate does not apply; for the October 2019 State report, the rate is based on the cohort of novice teachers identified in the 2017-2018 title II reporting year; for the October 2020 State report, separate rates will be calculated for the cohorts of novice teachers identified in the 2017-2018 and 2018-2019 title II reporting years. In addition, we have re-designated proposed § 612.5(a)(2)(ii) as § 612.5(a)(2)(iii).

    Teacher Survey

    Comments: Commenters stated that the proposed definition of teacher survey was unclear about whether all novice teachers or only a sample of novice teachers must be surveyed. Commenters also stated that the proposed definition missed an opportunity to collect meaningful data about teacher preparation program performance because it would only require a survey of novice teachers serving in full-time teaching positions for the grade level, span, and subject area in which they were prepared, and not all program completers. One commenter noted that Massachusetts plans to collect survey data from recent graduates upon completion and from novice teachers after a year of employment.

    Some commenters provided recommendations regarding survey content. These commenters argued that the teacher survey include questions to determine whether a teacher preparation program succeeded in the following areas, which, according to the commenters, research shows are important for preparing teachers to advance student achievement: producing student learning and raising student achievement for all students; using data to assess and address student learning challenges and successes; providing differentiated teaching strategies for students with varied learning needs, including English learners; keeping students engaged; managing classroom behavior; and using technology to improve teaching and increase student learning.

    Discussion: While the proposed definition of survey outcomes provided that States would have to survey all novice teachers in their first year of teaching in the State where their teacher preparation program is located, our proposed definition of teacher survey limited this to those teachers in full-time teaching positions. We agree with the commenters' explanations for why States should survey all novice teachers, and not just those who are in full-time teaching positions. For clarity, in addition to including the requirement that “survey outcomes” be of all novice teachers, which we have moved from its own definition in proposed § 612.2 to § 612.5(a)(3), we have revised the definition of “teacher survey” accordingly. We are also changing the term “new teacher” to “novice teacher” for the reasons discussed under the definition of “novice teacher.”

    However, we believe that requiring States to survey all program completers would put undue burden on States by requiring them to locate individuals who have not been hired as teachers. Rather, we believe it is enough that States ensure that surveys are conducted of all novice teachers who are in their first year of teaching. We note that this change provides consistency with the revised definition of employer survey, which is a survey of employers or supervisors designed to capture their perceptions of whether the novice teachers they employ or supervise, who are in their first year of teaching, were effectively prepared. The goal of a teacher preparation program is to effectively prepare aspiring teachers to step into a classroom prepared to teach. As the regulations seek to help States reach reasonable determinations of whether teacher preparation programs are meeting this goal, the definition of survey outcomes focuses on novice teachers in their first year of teaching. We note that the regulations do not prohibit States from surveying additional individuals or conducting their surveys of cohorts of teachers over longer periods of time, and we encourage States to consider doing so. However, considering the costs associated with further surveys of the same cohorts of novice teachers, we believe that requiring that these teachers be surveyed once, during their first year of teaching, provides sufficient information about the basic issue—how well their program prepared them to teach.

    We believe that States, in consultation with their stakeholders (see § 612.4(c)), are in the best position to determine the content of the surveys used to evaluate the teacher preparation programs in their State. Therefore, the regulations do not specify the number or types of questions to be included in employer or teacher surveys.

    Changes: We have revised the definition of “teacher survey” to require States to administer surveys to all novice teachers in their first year of teaching in the State.

    Title II Reporting Year

    Comments: None.

    Discussion: Since its inception, the title II reporting system has used the term “academic year” to refer to a period of twelve consecutive months, starting September 1 and ending August 31, during which States collect and subsequently report data on their annual report cards. This period of data collection and reporting is familiar to States, institutions, and the public; however, the proposed regulations did not contain a definition of this reporting period. In order to confirm that we do not intend for States to implement the regulations in a way that changes their longstanding practice of using that “academic year” as the period for their data collection and reporting, we believe that it is appropriate to add a definition to the regulations. However, to avoid confusion with the very generic term academic year, which may mean different things at the teacher preparation program and LEA levels, we instead use the term “title II reporting year.”

    Changes: We added the term “title II reporting year” under § 612.2, and defined it as a period of twelve consecutive months, starting September 1 and ending August 31.
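
    Because the title II reporting year runs from September 1 through August 31 rather than following the calendar year, mapping a given date to its reporting year is a small but recurring source of confusion. The following one-function sketch (the function name is ours, purely for illustration) shows the mapping implied by the definition added to § 612.2.

from datetime import date

def title_ii_reporting_year(d):
    """Return the title II reporting year containing date d, labeled as the
    'YYYY-YYYY' span that runs September 1 through August 31."""
    start = d.year if d.month >= 9 else d.year - 1
    return f"{start}-{start + 1}"

print(title_ii_reporting_year(date(2017, 8, 31)))   # 2016-2017
print(title_ii_reporting_year(date(2017, 9, 1)))    # 2017-2018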

    Subpart B—Reporting Requirements

    Section 612.3 What are the regulatory reporting requirements for the institutional report card?

    Timeline of Reporting Requirements (34 CFR 612.3)

    Comments: While there was some support for our proposal to change the IRC due date from April to October, many commenters stated that the proposed October 2017 pilot start date for the annual reporting cycle for the IRC, using data pertaining to an institution's programs and novice teachers for the 2016-2017 academic year, would be unworkable. Several commenters therefore strongly recommended that our proposal to move the due date for the IRC up by six months to October following the end of the institutions' academic year not be implemented.

    Commenters said that the change would make it impossible to collect reliable data on several factors and on large numbers of recent students. They stated that it would be impossible to submit a final IRC by October 1 because students take State licensing assessments, as well as enter into, drop from, and complete programs through August 31, and therefore final student data, pass rates for students who took assessments used for teacher certification or licensure by the State, and other information would not be available until September or October of each year. Other commenters indicated that, because most teacher preparation programs will need to aggregate multiple years of data to meet the program size threshold for reporting, the October submission date will unnecessarily rush the production and posting of their aggregated teacher preparation program data. Some commenters noted that changing the IRC due date to October (for reporting on students and programs for the prior academic year) would require a change in the definition of academic year because, without such a change, the October reports could not reflect scores on assessment tests that students or program completers took through August 31st. Alternatively, the proposal would require institutions to prepare and submit supplemental reports later in the year in order for the reports to fully reflect information for the prior academic year.

    Some commenters also stated that LEAs have limited staffing and cannot provide assistance to institutions during the summer when data would be collected, or that because teacher hiring often occurs in August, an October IRC due date does not provide enough time to collect reliable employment data.

    Discussion: We believe that the NPRM confused many commenters, leading them to believe that IRC reporting would occur in the October immediately after the end of the title II academic year on August 31. Rather, we had intended that the reporting would cover the academic year ending in the prior calendar year (e.g., the October 1, 2018 IRC would report data on the 2016-2017 academic year). However, as we discuss in our response to comments on our proposals for the timing of the SRC under § 612.4(a)(1)(i) General State Report Card reporting and § 612.4(b) Timeline, we have decided to maintain the submission date for the SRC report in October, and so also maintain the due date for the IRC as April of the year following the title II reporting year.

    Finally, while several commenters opined that an October date for submission of the IRC did not provide sufficient time for institutions to receive information from LEAs, we do not believe that the regulations require LEAs to submit any information to institutions for purposes of the IRC. We assume that the comments were based on a misunderstanding surrounding the data to be reported in the IRC. While our proposed indicators of program performance would require States to receive and report information from LEAs, institutions would not need to receive comparable information from LEAs in order to prepare and submit their IRCs.

    Changes: We have revised § 612.3 to provide that the first IRC under the regulations, which would cover the 2016-2017 academic year, is due not later than April 30, 2018.

    Institutional Report Card (34 CFR 612.3(a))

    Comments: Multiple commenters noted that the proposed regulations regarding the IRCs do not take into account all of the existing reporting demands, including not only the title II report, but also reports for national and regional accrediting bodies. Another commenter stated that, because feedback loops already exist to improve teacher preparation programs, there is no need to have a Federal report card on each teacher preparation program.

    On the other hand, some commenters suggested that teacher preparation programs report the demographics and outcomes of enrolled teacher candidates by race and ethnicity. Specifically, commenters suggested reporting the graduation rates, dropout rates, placement rates for graduates, first-year evaluation scores (if available), and the percentage of teacher candidates who stay within the teaching profession for one, three, and five years. Another commenter also suggested that gender, age, grade-level, and specialized areas of study be included; and that the data be available for cross-tabulation (a method of analysis allowing comparison of the relationship between two variables). One commenter stated that because title II reporting metrics are geared to evaluate how IHEs provide training, recruitment, and education to first-time graduates of education programs, the metrics cannot be applied to alternative route certification programs, which primarily train career changers who already have a degree and content knowledge. This commenter argued that attempting to compare the results of title II metrics from alternative route certification programs and traditional IHE-based programs will result in untrue conclusions because the programs' student candidates are so different.

    Another commenter suggested that, in order to ensure that States are able to separately report on the performance of alternative route preparation programs, IHEs should report whether they have a partnership agreement with alternative route providers, and identify the candidates enrolled in each of those programs. The commenter noted that, while doing so may lead States to identify groups of small numbers of alternative route program participants, it may eliminate the possibility that candidates who actually participate in alternative route programs are identified as graduates of a traditional preparation program at the same IHE.

    Another commenter stated that the variety of program academic calendars, with their different “start” and “end” dates in different months and seasons of the year, created another source of inaccurate reporting. The commenter explained that, with students entering a program on different dates, the need to aggregate cohorts will result in diffuse data that have relatively little meaning since the cohort will lose its cohesiveness. As such, the commenter stated, the data reported based on aggregate cohorts should not be used in assessing or evaluating the impact of programs on participants.

    A number of commenters noted what they claimed were inherent flaws in our proposed IRC. They argued that it has not been tested for validity, feasibility, or unintended consequences, and therefore should not be used to judge the quality of teacher preparation programs.

    Discussion: In response to comments that would have IHEs report more information on race, ethnicity, sex, and other characteristics of their students or graduates, we note that the content of the IRC is mandated by section 205(a) of the HEA. Section 205(a)(C)(ii) of the HEA provides the sole information that IHEs must report regarding the characteristics of their students: “the number of students in the program (disaggregated by race, ethnicity, and gender).” Therefore, we do not have the authority to waive or change the statutorily prescribed annual reporting requirements for the IRC.

    Regarding the recommendation that institutions report whether their teacher preparation programs have partnership agreements with alternative route providers, we note that section 205(a) of the HEA neither provides for IHEs to include this type of information in their IRCs nor authorizes the Secretary to add reporting elements to them. However, if they choose, States could require institutions to report such data to them for inclusion in the SRCs. We defer to States on whether they need such information and, if so, the best way to require IHEs to provide it.

    In response to the comment that the IRC is unnecessary because institutions already have feedback loops for program improvement, we note that by requiring each institution to make the information in the IRC available to the general public, Congress plainly intends that the report serve a public interest that goes beyond the private use the institution may make of the reported data. We thus disagree that the current feedback loops that IHEs may have for program improvement satisfy Congress' intent in this regard.

    We understand that there are differences between traditional and alternative route teacher preparation programs and that variability among programs in each category (including program start and end dates) exists. However, section 205(a) of the HEA is very clear that an IHE that conducts either a traditional or alternative route teacher preparation program must submit an IRC that contains the information Congress has prescribed. Moreover, we do not agree that the characteristics of any of these programs, specifically the demographics of the participants in these programs or whether participants have already earned an undergraduate degree, would necessarily lead to inaccurate or confusing reporting of the information Congress requires. Nor do we believe that the IRC reporting requirements are so geared to evaluate how IHEs provide training, recruitment, and education to first-time graduates of education programs that IHEs operating alternative route programs cannot explain the specifics of their responses.

    We do acknowledge that direct comparisons of traditional and alternative route programs would potentially be misleading without additional information. However, this is generally true for comparisons of all types of programs. For example, a comparison of average cost of tuition and fees between two institutions could be misleading without the additional context of the average value of financial aid provided to each student. Simply because analyzing specific data out of context could potentially generate confusion does not mitigate the value of reporting the information to the general public that, as we have noted, Congress requires.

    With specific regard to the fact that programs have different operating schedules, the IRC would have all IHEs report on students participating in teacher preparation programs during the reporting year based on their graduation date from the program. This would be true regardless of the programs' start date or whether the students have previous education credentials. We also believe the IRC would become too cumbersome if we tried to tailor the specific reporting requirements in section 205(a) of the HEA to address and reflect each individual program start time, or if the regulations created different reporting structures based on the program start time or the previous career or educational background of the program participants.

    Furthermore, we see no need for any testing of data reported in the IRC for validity, feasibility, or unintended consequences. The data required by these regulations are the data that Congress has specified in section 205(a) of the HEA. We do not perceive the data elements in section 205(a) as posing any particular issues of validity. Just as they would in any congressionally mandated report, we expect all institutions to report valid data in their IRCs and, if data quality issues exist we expect institutions will address them so as to meet their statutory obligations. Further, we have identified no issues with the feasibility of reporting the required data. While we have worked to simplify institutional reporting, institutions have previously reported the same or similar data in their IRCs, albeit at a different level of aggregation. Finally, we fail to see any unintended consequences that follow from meeting the statutory reporting requirements. To the extent that States use the data in the IRC to help assess whether a program is low-performing or at-risk of being low-performing under section 207(a) of the HEA, under our regulations this would occur only if, in consultation with their stakeholders under § 612.4(c), States decide to use these data for this purpose. If institutions are concerned about such a use of these data, we encourage them to be active participants in the consultative process.

    Changes: None.

    Prominent and Prompt Posting of Institutional Report Card (34 CFR 612.3(b))

    Comments: Multiple commenters supported the requirement to have each IHE post the information in its IRC on its Web site and, if applicable, on the teacher preparation program's own Web site. Based on the cost estimates in the NPRM, however, several commenters raised concerns about the ability of IHEs to do so.

    Discussion: We appreciate the commenters' support for our proposal as an appropriate and efficient way for IHEs to meet their statutory responsibility to report annually the content of their IRCs to the general public (see section 205(a)(1) of the HEA).

    We discuss the comments regarding concerns about the cost estimates in the IRC Reporting Requirements section of the Discussion of Costs, Benefits, and Transfers in this document.

    Changes: None.

    Availability of Institutional Report Card (34 CFR 612.3(c))

    Comments: One commenter recommended that we mandate that each IHE provide the information contained in its IRC in promotional and other materials it makes available to prospective students, rather than leaving it to the discretion of the institution.

    Discussion: While we believe that prospective students or participants of a teacher preparation program need to have ready access to the information in the institution's IRC, we do not believe that requiring the IHE to provide this information in its promotional materials is either reasonable or necessary. We believe that the costs of doing so would be very large and would likely outweigh the benefits. For example, many institutions may make large printing orders for pamphlets, brochures, and other promotional materials that get used over the course of several years. Requiring the inclusion of IRC information in those materials would require that institutions both make these promotional materials longer and print them more often. As the regulations already mandate that this information be prominently posted on the institution's Web site, we fail to see a substantial benefit to prospective students that outweighs the additional cost to the institution.

    However, while not requiring the information to be included in promotional materials, we encourage IHEs and their teacher preparation programs to provide it in places that prospective students can easily find and access. We believe IHEs can find creative ways to go beyond the regulatory requirements to provide this information to students and the public without incurring significant costs.

    Changes: None.

    Section 612.4 What are the regulatory reporting requirements for the State report card?

    General (34 CFR 612.4(a))

    Comments: None.

    Discussion: As proposed, § 612.4(a) required all States to meet the annual reporting requirements. For clarity, we have revised this provision to provide, as does section 205(b) of the HEA, that all States that receive HEA funds must do so.

    Changes: We have revised § 612.4(a) to provide that all States that receive funds under the HEA must meet the reporting requirements required by this regulation.

    General (Timeline) (34 CFR 612.4(a)(1)(i)) and Reporting of Information on Teacher Preparation Program Performance (Timeline) (34 CFR 612.4(b))

    Comments: Many commenters expressed concern with their State's ability to build data systems and to collect and report the required data under the proposed timeline.

    Commenters noted that the proposed timeline does not allow States enough time to implement the proposed regulations, and that the associated logistical challenges impose undue and costly burdens on States. Commenters noted that States need more time to make decisions about data collection, involve stakeholders, and pilot and revise their data systems—activities that they said cannot be completed in one year.

    Several commenters recommended extending the timeline for implementation by at least five years. Some commenters suggested delaying the reporting of program ratings until at least 2021 to give States more time to create data linkages and validate data. Other commenters pointed out that their States receive employment and student learning data from LEAs in the fall or winter, which they said makes reporting outcomes in their SRCs in October of each year, as we had proposed, impossible. Still other commenters noted that some data, by their nature, may not be available to report by October. Another commenter suggested that institutions should report in October, States should report outcome data (but not performance designations) in February, and then the States should report performance designations in June, effectively creating an additional reporting requirement. To address the timing problems in the proposed schedule for SRC submission, other commenters recommended that the Department continue having States submit their SRCs in October. On the other hand, some commenters supported or encouraged the Department to maintain the proposed timelines.

    Many commenters stated that no State currently implements the proposed teacher preparation program rating system. Therefore, to evaluate effectiveness, or to uncover unintended consequences, these commenters emphasized the importance of permitting States to develop and evaluate pilot programs before broader implementation. Some commenters therefore recommended that the proposed implementation timeline be delayed until the process had been piloted and evaluated for efficiency while others recommended a multiyear pilot program.

    Discussion: We appreciate the comments supporting the proposed reporting timeline changes to the SRC. However, in view of the public's explanation of problems that our proposed reporting schedule could cause, we are persuaded that the title II reporting cycle should remain as currently established—with the institutions submitting their IRCs in April of each year, and States submitting their SRCs the following October. IHEs and States are familiar with this schedule, and we see that our proposal to switch the reporting dates, while having the theoretical advantage of permitting the public to review information much earlier, was largely unworkable.

    Under the final regulations, the initial SRC (a pilot) would be due October 31, 2018, for the 2016-2017 academic year. The October 2018 due date provides much more time for submission of the SRC. As we note in the discussion of comments received on § 612.3(a) (Reporting Requirements for the IRC), IHEs will continue to report on their programs, including pass rates for students who took assessments used for initial certification or licensure by the State in which the teacher preparation program is located, from the prior academic year, by April 30 of each year. States therefore will have these data available for their October 31 reporting. Because the outcome data States will need to collect to help assess the performance of their teacher preparation programs (i.e., student learning outcomes, employment outcomes, and survey outcomes) would be collected on novice teachers employed by LEAs from the prior school year, these data would likewise be available in time for the October 31 SRC reporting. Given this, we believe all States will have enough time by October 31 of each year to obtain the data they need to submit their SRCs. In addition, since States are expected to periodically examine the quality of their data collection and reporting under § 612.4(c)(2), we expect that States have a process by which to make modifications to their system if they desire to do so.

    By maintaining the current reporting cycle, States will have a year (2016-2017) to design and implement a system. The 42 States, the District of Columbia, and the Commonwealth of Puerto Rico that were previously granted ESEA flexibility are therefore well positioned to meet the requirements of these regulations because they either already have the systems in place to measure student learning outcomes or have worked to do so. Moreover, with the flexibility that § 612.5(a)(1)(ii) now provides for States to measure student learning outcomes using student growth, a teacher evaluation measure, or another State-determined measure relevant to calculating student learning outcomes (or any combination of these three), all States should be able to design and implement their systems in time to submit their initial reports by October 31, 2018. Additionally, at least 30 States, the District of Columbia, and the Commonwealth of Puerto Rico already have the ability to aggregate data on the achievement of students taught by recent graduates and link those data back to teacher preparation programs. Similarly, as discussed below, 30 States already implement teacher surveys that could be modified to be used in this accountability system.

    Particularly given the added flexibility in § 612.5(a)(1)(ii), and because most States already have or are well on their way to having the systems required to implement the regulations, we are confident that the reduced time available to prepare before the pilot SRC is due will prove manageable. We understand that some States will not have complete datasets available for all indicators during initial implementation, and so may need to make adjustments based on experience during the pilot year. We also stress that the October 2018 SRC is a pilot report; any State identification of a program as low-performing or at-risk of being low-performing included in that report would not have implications either for the program generally or for that program's eligibility to participate in the TEACH Grant program. Full SRC reporting begins in October 2019.

    In addition, maintaining the SRC reporting date of October 31 also is important so that those who want to apply for admission to teacher preparation programs and for receipt of TEACH Grants as early as January of the year they wish to begin the program know which IHEs have programs that States have identified in their SRCs as at-risk or low-performing. Prospective students should have this information as soon as they can so that they know both the State's assessment of each program's level of performance and which IHEs lack authority to award TEACH Grants. See our response to public comment regarding the definition of a TEACH Grant-eligible institution in § 686.2.

    In summary, under our revised reporting cycle, the SRC is due about five months earlier than in the proposed regulations. However, because the report due October 31, 2018 is a pilot report, we believe that States will have sufficient time to complete work establishing their reporting and related systems to permit submission of all information in the SRC by the first full reporting date of October 31, 2019. While we appreciate the comments suggesting that States be able to develop and evaluate pilot programs before broader implementation, or that the implementation timeline be delayed until the State process has been piloted and evaluated for efficiency, we do not believe that adding more time for States to develop their systems is necessary. Lastly, maintaining the existing timeline does not affect the timing of consequences for TEACH Grants for at-risk or low-performing teacher preparation programs. Under the regulations, the TEACH Grant consequences would apply for the 2021-2022 award year.

    Changes: We have revised § 612.4(a) to provide that State reports under these final regulations would be due on October 31, 2018. We also changed the date for SRC reporting to October wherever it appears in the final regulations.

    Comments: Some commenters expressed concern with States' ability to implement valid and reliable surveys in the time provided. Commenters argued that issues related to who to survey, when to survey, and how often to survey would make this the most challenging performance indicator to develop, implement, and use for determining a program's performance level. Commenters stated that an institution's capacity to track graduates accurately and completely is highly dependent on the existence of sophisticated State data systems that track teacher employment and on appropriate incentives to assure high response rates to surveys, noting that many States do not have such systems in place and some are just beginning to implement them. Commenters suggested that the Department consider easing the timeline for implementation of surveys to reduce the cost and burden of implementation of surveys.

    Discussion: According to the GAO survey of States, 30 States have used surveys that assessed principals' and other district personnel's satisfaction with recent traditional teacher preparation program graduates when evaluating programs seeking State approval.14 We believe these States can modify these existing survey instruments to develop teacher and employer surveys that comply with the regulations without substantial additional burden. Additionally, States that do not currently use such surveys may be able to shorten the time period for developing their own surveys by using whole surveys or individual questions already employed by other States as a template. States may also choose to shorten the time required to analyze survey results by focusing on quantitative survey responses (e.g., score on a Likert scale or number of hours of training in a specific teaching skill) rather than taking the time to code and analyze qualitative written responses. However, we note that, in many instances, qualitative responses may provide important additional information on program quality. As such, States could opt to include qualitative questions in their surveys and send the responses to the applicable teacher preparation programs for their own analysis. With a far smaller set of responses to analyze, individual programs would be able to review and respond much more quickly than the State. However, these are decisions left to the States and their stakeholders to resolve.

    14 GAO at 13.

    Changes: None.

    Comments: A number of commenters indicated confusion about when particular aspects of the proposed IRC and SRC are to be reported and recommended clarification.

    Discussion: We agree with the recommendation to clarify the reporting of cohorts and metrics for reporting years. The chart below outlines how certain metrics will be reported and the reporting calendar. We understand that the information reported on the SRC may differ from the example provided below because initially some data may be unavailable or incomplete. In these instances, we expect that States will weight indicators for which data are unavailable in a way that is consistent and applies equivalent levels of accountability across programs.

    Table 3—Implementation and Reporting Calendar Example

    Year | 2018 | 2019 | 2020 | 2021 | 2022

    Institutional Report Card (IRC)
    IRC Due Date | April 30, 2018 | April 30, 2019 | April 30, 2020 | April 30, 2021 | April 30, 2022
    Pass Rate | Recent graduates (from AY 2016-17) | Recent graduates (from AY 2017-18) | Recent graduates (from AY 2018-19) | Recent graduates (from AY 2019-20) | Recent graduates (from AY 2020-21)

    State Report Card (SRC)
    SRC Due Date | October 31, 2018 (Pilot) | October 31, 2019 | October 31, 2020 | October 31, 2021 | October 31, 2022
    Placement Rate | C1 | C1, C2 | C1, C2, C3 | C2, C3, C4 | C3, C4, C5
    Retention Rate | N/A | C1 | C1, C2 | C1, C2, C3 | C2, C3, C4
    Student Learning Outcomes | C1 | C1, C2 | C1, C2, C3 | C2, C3, C4 | C3, C4, C5
    Survey Outcomes | C1 | C2 | C3 | C4 | C5
    TEACH Eligibility | Not impacted | Not impacted | Impacts 2021-22 Award Year | Impacts 2022-23 Award Year | Impacts 2023-24 Award Year

    Academic Year (AY): The Title II academic year runs from September 1 to August 31.
    Award Year: The Title IV award year runs from July 1 to June 30.
    Note: Data systems are to be designed and implemented during the 2016-17 school year.
    C1: Cohort 1, novice teachers whose first year in the classroom is 2017-18.
    C2: Cohort 2, novice teachers whose first year in the classroom is 2018-19.
    C3: Cohort 3, novice teachers whose first year in the classroom is 2019-20.
    C4: Cohort 4, novice teachers whose first year in the classroom is 2020-21.
    C5: Cohort 5, novice teachers whose first year in the classroom is 2021-22.

    Changes: None.

    Comments: To reduce information collection and dissemination burden on States, a commenter asked that the Department provide a mechanism for rolling up IRC data into the State data systems.

    Discussion: The Department currently provides a system by which all IHEs may electronically submit their IRC data, and which also prepopulates the SRC with relevant information from the IRCs. We intend to continue to provide this system.

    Changes: None.

    Comments: Some commenters stated that States should be able to replace the SRC reporting requirements in these regulations with their own State-defined accountability and improvement systems for teacher preparation programs.

    Discussion: We disagree that States should be able to replace the SRC reporting requirements with their own State-defined accountability and improvement systems for teacher preparation programs. Section 205(b) of the HEA requires reporting of the elements in the SRC by any State that receives HEA funding. The measures included in the regulations are either specifically required by that provision or are needed to give reasonable meaning to the statutorily required indicators of academic content knowledge and teaching skills a State must use to assess a teacher preparation program's performance. However, § 612.5(b) specifically permits a State to assess a program's performance using additional indicators predictive of a teacher's effect on student performance, provided that it uses the same indicators for all teacher preparation programs in the State. Following stakeholder consultation (see § 612.4(c)), States are free to adopt criteria for assessing program performance beyond those addressed in the regulations.

    Changes: None.

    Comments: Some commenters recommended that the Department provide adequate time for States to examine and address the costs of tracking student progress and academic gains for teacher preparation program completers who teach out of State.

    Discussion: Section 612.5(a)(1) has been revised to clarify that States may exclude data regarding teacher performance, or student academic progress or growth, when calculating a program's student learning outcomes for novice teachers who teach out of State or who teach in private schools. See also the discussion of comments for § 612.5(a)(1) (student learning outcomes). To the extent that States wish to include this information, they can continue to pilot and analyze data collection quality and methodology for a number of years before including it in their SRCs.

    Changes: None.

    Comments: One commenter specifically recommended laddering in the proposed performance criteria only after norming has occurred. We interpret this comment to mean that States should have time to collect data on the required indicators for multiple years on all programs and use that data to establish specific thresholds for acceptable program performance on each indicator. This would require a longer timeline before using the indicators to assess program performance than the Department had proposed.

    Discussion: We will not require “laddering in” the criteria in § 612.5 only after norming has occurred, as the commenter suggested, because we believe that States should be able to set identifiable targets for these criteria without respect to the current distribution of program performance on an indicator (e.g., a teacher retention rate of less than 50 percent as an indicator of low performance). These regulations are not intended to have States identify any particular percentage of teacher preparation programs as low-performing or at-risk of being low-performing. Rather, while they establish indicators that each State will use and report, they leave the process for determining a teacher preparation program's overall rating to the discretion of each State and its consultative group. If States wish to incorporate norming, norming around specific performance thresholds could be completed during the pilot year and, over time, performance thresholds could be adjusted during the periodic examinations of the evaluation systems that States must conduct.

    Changes: None.

    Comments: Some commenters noted that having States assess the performance of teacher preparation programs on a yearly basis seems likely to drain already limited State and institutional resources.

    Discussion: Section 207(a) of the HEA expressly requires States to provide an “annual list of low-performing [and at-risk] teacher preparation programs.” We believe that Congress intended the State program assessment requirement itself also to be met annually. While we have strived to develop a system that keeps costs manageable, we also believe that the improvement of teacher preparation programs and consumers' use of information in the SRC on program performance necessitate both annual reporting and program determinations.

    Changes: None.

    Comments: A number of commenters stated that the availability of student growth and achievement data derived from State assessment results and district-determined measures is subject to State legislative requirements and that, if the legislature changes those requirements, the State assessments given or the times when they are administered could be drastically affected. One commenter stated that, because the State operates on a biennial budget cycle, it could not request authority to create the administrative position the State needs to comply with the proposed regulations until the 2017-2019 budget cycle.

    Discussion: We understand that the availability of data States will need to calculate student learning outcomes for student achievement in tested grades and subjects depends to some extent on State legislative decisions to maintain compatible State assessments subject to section 1111(b)(2) of the ESEA, as amended by the ESSA. But we also assume that State legislatures will ensure that their States have the means to comply with this Federal law, as well as the means to permit the State to calculate student growth based on the definition of “student achievement in tested grades and subjects” in § 612.2. Moreover, we believe that our decision to revise § 612.5(a)(1)(ii) to include an option for States to use “another State-determined measure relevant to calculating student learning outcomes” should address the commenters' concerns.

    In addition, the commenter who raised concerns based on the State legislature being in session on only a biennial basis did not provide enough information to permit us to consider why this necessarily bars the State's compliance with these regulations.

    Changes: None.

    Program-Level Reporting (Including Distance Education) (34 CFR 612.4(a)(1)(i))

    Comments: Some commenters supported the shift to reporting at the individual teacher preparation program level rather than at the overall institutional level. A couple of commenters agreed that States should perform assessments of each program, but be allowed to determine the most appropriate way to include outcomes in the individual program determinations, including determining how to roll up outcomes from the program level to the entity level. Other commenters noted that States should be required to report outcomes by the overall entity, rather than by the individual program, because such reporting would increase the reliability of the measures and would be less confusing to students. Some commenters expressed concern that only those programs that have data demonstrating their graduates' effectiveness in the public schools in the State where the institution is located would receive a top rating, and argued that entity-level reporting and rating would reduce this concern. These commenters suggested that, if States report by entity, they could report the range in data across programs in addition to the median, or report data by quartile, which would make transparent the differences within an entity while maintaining appropriate thresholds.

    Commenters also stated that there are too many variations in program size and, as we understand the comment, in the way States credential their teacher preparation programs to mandate a single Federal approach to disaggregated program reporting for the entire Nation.

    Discussion: We appreciate the comments supporting the shift to reporting at the program level. The regulations provide extensive flexibility to States to determine how to measure and use outcomes in determining program ratings. If a State wishes to aggregate program level outcomes to the entity level, it is free to do so, though such aggregation would not replace the requirements to report at the program level unless the program (and the method of aggregation) meets the small-size requirements in § 612.4(b)(3)(ii). Regarding the comment that reporting at the institutional level is more reliable, we note that the commenter did not provide any additional context for this statement, though we assume this statement is based on a generalized notion that data for the institution as a whole might be more robust because of the overall institution's much larger number of recent graduates. While we agree that aggregation at a higher level would generate more data for each indicator, we believe that the program size threshold in § 612.5(b)(3) sufficiently addresses this concern while also ensuring that the general public and prospective students have access to data that are as specific as possible to the individual programs operated by the institution.

    We fail to understand how defining a teacher preparation program as we have, in terms of initial State teacher certification or licensure in a specific field, creates concerns that top ratings would only go to programs with data showing the effectiveness of graduates working in public schools in the State. So long as the number of novice teachers the program produces meets the minimum threshold size addressed in § 612.4(b)(3) (excluding, at the State's discretion, teachers teaching out of State and in private schools from determinations of student learning outcomes and teacher placement and retention rates as permitted by § 612.5(a)(1) and § 612.2, respectively), we are satisfied that the reporting of program information will be sufficiently robust and obviate concerns about data reliability.

    Moreover, we disagree with the comments that students would find reporting of outcomes at the institution level less confusing than reporting at the teacher preparation program level. We believe students want information about teacher preparation programs that is specific to the areas in which they want to teach so that they can make important educational and career decisions, such as whether to enroll in a specific teacher preparation program. This information is presented most clearly at the teacher preparation program level rather than at the institutional level, where many programs would be collapsed together such that a student would not only lack information about whether a specific program in which she is interested is low-performing or at risk of being low-performing, but would also be unable to review data on indicators of that program's performance.

    We also disagree with the claim that program-level reporting as required under these regulations is inappropriate due to the variation in program size and structure across and within States. Because the commenters did not provide an example of how the requirements of these regulations make program-level reporting impossible to implement, we cannot address these concerns more specifically than to say that, since the use of indicators of program performance will generate information unique to each program, we fail to see why variation in program size and structure undermines these regulations.

    Changes: None.

    Comments: There were many comments related to the reporting of information for teacher preparation programs provided through distance education. Several commenters indicated that the proposed regulations are unclear on how the reporting process would work for distance education programs large enough to meet a State's threshold for inclusion on their report card (see § 612.4(b)(3)), but that lack a physical presence in the State. Commenters indicated that, under our proposed regulations, States would need to identify out-of-State institutions (and their teacher preparation programs) that are serving individuals within their borders through distance education, and then collect the data, analyze it, and provide assessments on these programs operated from other States. Thus, commenters noted, States may need more authority either through regulatory action or legislation to be able to collect information from institutions over which they do not currently have authority.

    Commenters also requested that the Department clarify what would happen to distance education programs and their currently enrolled students if multiple States would be assessing a single program's effectiveness and doing so with differing results. One commenter suggested a “home State” model in which, rather than developing ratings for each program in each State, all of a provider's distance education programs would be evaluated by the State in which the provider, as opposed to the program participants, is physically located. The commenter argued that this model would increase the reliability of the measures and decrease student confusion, especially where comparability of measures between States is concerned. Unless such a home State model is adopted, the commenter argued, other States may discriminate against programs physically located and operated in other States by, as we understand the comment, using the process of evaluating program performance to create excessive barriers to entry in order to protect in-State institutions. Another commenter asked that the proposed regulations provide a specific definition of the term “distance education.”

    Several commenters expressed support for the change to § 612.4(a)(1)(ii) proposed in the Supplemental NPRM, which would require that reporting on the quality of all teacher preparation programs provided through distance education in the State be made by using procedures for reporting that are consistent with § 612.4(b)(4), but based on whether the program produces at least 25 or fewer than 25 new teachers whom the State certified to teach in a given reporting year.

    While commenters indicated that reporting on hybrid teacher preparation programs was a complicated issue, commenters did not provide recommendations specific to two questions regarding hybrid programs that were posed in the Supplemental NPRM. The first question asked under what circumstances, for purposes of both reporting and determining the teacher preparation program's level of overall performance, a State should use procedures applicable to teacher education programs offered through distance education and when it should use procedures for teacher preparation programs provided at brick-and-mortar institutions. Second, we asked, for a single program, if one State uses procedures applicable to teacher preparation programs provided through distance education and another State uses procedures for teacher preparation programs provided at brick-and-mortar institutions, what the implications would be, especially for TEACH Grant eligibility, and how these inconsistencies should be addressed.

    In response to our questions, many commenters indicated that it was unclear how to determine whether a teacher preparation program should be classified as a teacher preparation program provided through distance education for reporting under § 612.4(a)(1)(ii) and asked for clarification on the circumstances under which a program should be so classified. One commenter recommended that we define a teacher preparation program provided through distance education as one in which the full and complete program can be completed without an enrollee ever being physically present at the brick-and-mortar institution or any of its branch offices.

    Commenters expressed a number of concerns about reporting. Some commenters indicated that while the December 3, 2014, NPRM allowed States to report on programs that produced fewer than 25 new teachers, it was unclear whether the Supplemental NPRM would extend the same permission to distance education programs. Additionally, a few commenters thought that, in cases where students apply for certification in more than one State, the outcomes of a single student could be reported multiple times by multiple States. Other commenters felt that if States are expected to evaluate distance education graduates from other States' programs, the regulations should be revised to focus on programs that are tailored to meet other States' requirements. A commenter suggested that the State in which a distance education program is headquartered should be responsible for gathering the data reported by the other States in which the program operates and then, using its own data along with those other States' data, should determine the performance rating of that program. Doing so would establish one rating for each distance education program, issued by the State in which it is headquartered. The commenter stated that this would create a simplified rating system similar to that for brick-and-mortar institutions. Another commenter stated that the proposed approach would force States to create a duplicative and unnecessary second tracking system through their licensure process for graduates of their own teacher preparation programs provided through distance education who remain in the State.

    Many commenters voiced concerns related to the identification and tracking of teacher preparation programs provided through distance education. Specifically, commenters indicated that, because the method by which a teacher preparation program is delivered is not transcribed or officially recorded on educational credentials, the receiving State (the State where the teacher has applied for certification) has no way to distinguish teacher preparation programs provided through distance education from brick-and-mortar teacher preparation programs. Furthermore, receiving States would not be able to readily distinguish individual teacher preparation programs provided through distance education from one another.

    Finally, a commenter stated that the proposed regulations do not require States to provide any notice of their rating, and do not articulate an appeal process to enable institutions to challenge, inspect, or correct the data and information on the basis of which they might have received an adverse rating. Commenters also indicated that teacher preparation programs themselves should receive data on States' student and program evaluation criteria.

    Discussion: Regarding comments that the regulations need to describe how teacher preparation programs provided through distance education should be reported, we intended for a State to report on these programs operating in that State in the same way it reports on the State's brick-and-mortar teacher preparation programs.

    We appreciate commenters' expressions of support for the change to the proposed regulations under § 612.4(a)(1)(ii), as proposed in the Supplemental NPRM, requiring that reporting on the quality of all teacher preparation programs provided through distance education in the State be made by using procedures for reporting that are consistent with proposed § 612.4(b)(4), but based on whether the program produces at least 25 or fewer than 25 new teachers whom the State certified to teach in a given reporting year. In considering the language of proposed § 612.4(a)(1)(ii) and the need for clarity on the reporting requirements for teacher preparation programs provided through distance education, we have concluded that the provision would be simpler if it simply incorporated by reference the reporting requirements for those programs in § 612.4(b)(3) of the final regulations.

    While we agree with the commenters who stated that the proposed regulations were unclear on what constitutes a teacher preparation program provided through distance education, we decline to accept the recommendation to define such a program as one in which the full and complete program can be completed without an enrollee ever being physically present at the brick-and-mortar institution or any of its branch offices, because this definition would not include teacher preparation programs that provide significant portions of the program through distance education. In addition, the proposed definition would allow a teacher preparation program to easily modify its requirements so that it would not be considered a teacher preparation program provided through distance education.

    Instead, in order to clarify what constitutes a teacher preparation program provided through distance education, we are adding the term “teacher preparation program provided through distance education” to § 612.2 and defining it as a teacher preparation program in which 50 percent or more of the program's required coursework is offered through distance education. The term distance education is defined under 34 CFR 600.2 to mean education that uses one or more specified technologies to deliver instruction to students who are separated from the instructor and to support regular and substantive interaction between the students and the instructor, either synchronously or asynchronously. The technologies may include the internet; one-way and two-way transmissions through open broadcast, closed circuit, cable, microwave, broadband lines, fiber optics, satellite, or wireless communications devices; audio conferencing; or video cassettes, DVDs, and CD-ROMs, if the cassettes, DVDs, or CD-ROMs are used in a course in conjunction with any of the technologies previously in this definition. We have incorporated this definition by reference (see § 612.2(a)).

    In the Supplemental NPRM, we specifically requested public comment on how to determine when a program that has both brick-and-mortar and distance education components should be considered a teacher preparation program provided through distance education. While we received no suggestions, we believe that it is reasonable that if 50 percent or more of a teacher preparation program's required coursework is offered through distance education, it should be considered a teacher preparation program provided through distance education because the majority of the program is offered through distance education. This 50 percent threshold is consistent with thresholds used elsewhere in Departmental regulations, such as those relating to correspondence courses under 34 CFR 600.7 or treatment of institutional eligibility for disbursement of title IV HEA funds for additional locations under 34 CFR 600.10(b)(3).

    In addition, we do not agree with the suggestion for a “home State” reporting model, in which all of a provider's distance education programs would be evaluated by the State in which the provider is physically located. First, section 205(b) of the HEA requires States to report on the performance of their teacher preparation programs. We feel strongly both that, to date, defining the program at the institutional level has not produced meaningful results, and that where programs provided through distance education prepare individuals to teach in different States, those States—and not only the “home State”—should assess those programs' performance. In addition, we believe that each State should, as the law anticipates, speak for itself about what it concludes is the performance of each teacher preparation program provided through distance education operating within its boundaries. Commenters did not provide any evidence to support their assertion that States would discriminate against distance learning programs physically located in other States, nor do we understand how they would do so if, as § 612.4(a) anticipates, they develop and apply the same set of criteria (taking into consideration the need to have different employment outcomes as provided in § 612.4(b)(2) given the nature of these programs) for assessing the performance of brick-and-mortar programs and programs provided through distance education.

    Regarding reporting concerns, we provide under § 612.4(b)(3)(i) for annual reporting on the performance of each teacher preparation program that produces a total of 25 or more recent graduates in a given reporting year (that is, a program size threshold of 25), or, at the State's discretion, a lower program size threshold (e.g., 15 or 20). Thus, States can use a lower threshold than the 25 recent graduates. We do not agree that in cases where students apply for certification in more than one State, a single student would necessarily be counted multiple times. For calculations of the placement rate for a program provided through distance education, the student who teaches in one State but who has received teaching certification in that State and others would be included in the denominator of placement rates calculated by these other States only if those States chose not to exclude recent graduates teaching out of State from their calculations. (The same would be true of graduates of brick-and-mortar programs.) But those other States would only report and use a placement rate in assessing the performance of programs provided through distance education if they have graduates of those programs who are certified in their States (in which case the program size threshold and aggregation procedures in § 612.4(b) would apply).

    Further, for the purposes of the teacher placement rate, § 612.5(a)(2)(iv) permits a State, at its discretion, to assess the teacher placement rate for teacher preparation programs provided through distance education differently from the teacher placement rate for other teacher preparation programs based on whether the differences in the way the rate is calculated for teacher preparation programs provided through distance education affect employment outcomes.

    States that certify at least 25 teachers from a teacher preparation program provided through distance education have an interest in that program and will report on it as a program in their States. Moreover, we disagree, for several reasons, that States in which distance education programs are headquartered should roll up data from other States, determine a performance rating, and report it. In addition to placing a higher cost and burden on a particular State, this methodology would undermine the goal of each State having a say in the quality of the programs used to certify teachers in that State. The State where a teacher preparation program operating in multiple States is housed is not the only State with an interest in the program. Finally, we do not believe that the regulations would force States to create a duplicative and unnecessary second tracking system, because a State is already required to report on teacher preparation programs in the State.

    We agree with commenters' concerns regarding the identification and tracking of teacher preparation programs provided through distance education. To address this concern, institutions will be asked to report which of their teacher preparation programs are teacher preparation programs provided through distance education in the IRC, which the institutions provide to the State. The receiving State can then verify this information during the teacher certification process for a teacher candidate in the State.

    We note that an appeal process regarding a teacher preparation program's performance is provided for under § 612.4(c). We also note that teacher preparation programs will have access to data on States' student and program evaluation criteria because State report cards are required to be publicly available.

    Changes: We are adding the term “teacher preparation program provided through distance education” to § 612.2 and defining it as a teacher preparation program in which 50 percent or more of the program's required coursework is offered through distance education. We are also providing under § 612.4(a)(1)(ii) that States must report on the quality of all teacher preparation programs provided through distance education in the State consistent with § 612.4(b)(3).

    Making the State Report Card Available on the State's Web Site (34 CFR 612.4(a)(2))

    Comments: One commenter supported the proposed change that any data used by the State to help evaluate program performance should be published at the indicator level to ensure that programs understand the areas they need to improve, and to provide additional information to students about program success. Other commenters stated that posting SRCs does not lead to constructive student learning or to meeting pre-service preparation program improvement goals. Many commenters stated that the method by which States would share information with consumers to ensure understanding of a teacher preparation program's employment outcomes or overall rating is not stipulated in the regulations and, furthermore, that the Department does not specifically require that this information be shared.

    Discussion: We appreciate the comment supporting publication of the SRC data on the State's Web site. The regulation specifically requires posting “the State report card information” on the Web site, and this information includes all data that reflect how well a program meets indicators of academic content and teaching skills and other criteria the State uses to assess a program's level of performance, the program's identified level of performance, and all other information contained in the SRC.

    While posting of the SRC data on the State's Web site may not lead directly to student learning or teacher preparation program improvement, it does provide the public with basic information about the performance of each program and other, broader measures about teacher preparation in the State. Moreover, making this information widely available to the general public is a requirement of section 205(b)(1) of the HEA. Posting this information on the State's Web site is the easiest and least costly way for States to meet this requirement. We also note that the commenters are mistaken in their belief that our proposed regulations did not require that information regarding teacher preparation programs be shared with consumers. Proposed § 612.4(a)(2) would require States to post on their Web sites all of the information required to be included in their SRCs, and these data include the data on each program's student learning outcomes, employment outcomes, and survey outcomes, and how the data contribute to the State's overall evaluation of the program's performance. The final regulations similarly require the State to include all of these data in the SRC, and § 612.4(a)(2) specifically requires the State to make the same SRC information it provides to the Secretary in its SRC widely available to the general public by posting it on the State's Web site.

    Changes: None.

    Meaningful Differentiations in Teacher Preparation Program Performance (34 CFR 612.4(b)(1))

    Comments: Multiple commenters expressed general opposition to our proposal that in the SRC the State make meaningful differentiation of teacher preparation program performance using at least four performance levels. These commenters stated that such ratings would not take into account the uniqueness of each program, such as the program's size, mission, and diversity, and therefore would not provide an accurate rating of a program.

    Others noted that simply ascribing one of the four proposed performance levels to a program is not nuanced or sophisticated enough to fully explain the quality of a teacher preparation program. They recommended removing the requirement that SEAs provide a single rating to each program, and allow States instead to publish the results of a series of performance criteria for each program.

    Discussion: As noted under § 612.1, we have withdrawn our proposal to require States to identify programs that are exceptional. Therefore, § 612.4(b)(1), like section 207(a) of the HEA, requires States in their SRCs to identify programs as being low-performing, at-risk of being low-performing, or effective or better, with any additional categories established at the State's discretion. This revised rating requirement mirrors the requirements of section 207(a) of the HEA for reporting programs that are low-performing or at-risk of being low-performing (and thus by inference also identifying those programs that are performing well).

    States cannot meet this requirement unless they establish procedures for using criteria, including indicators of academic content knowledge and teaching skills (see § 612.4(b)(2)(i)), to determine which programs are classified in each category. The requirement of § 612.4(b)(1) that States make meaningful differentiation of teacher preparation program performance using at least these three categories simply gives this statutory requirement regulatory expression. While § 612.4(b)(1) permits States to categorize teacher preparation programs using more than three levels of performance if they wish, the HEA cannot be properly implemented without States making meaningful differentiation among programs based on their overall performance.

    We do not believe that these regulations disregard the uniqueness of each program's size, mission, or diversity, as they are intended to provide a minimum set of criteria with which States determine program performance. They do not prescribe the methods by which programs meet a State's criteria for program effectiveness.

    Changes: We have revised § 612.4(b)(1) by removing the proposed fourth program performance level, “exceptional teacher preparation program,” from the rating system.

    Comments: Commenters, for various reasons, opposed our proposal to require States, in making meaningful differentiation in program performance, to consider employment outcomes in high-need schools and student learning outcomes “in significant part.” Some commenters requested clarification on what “significant” means with regard to weighting employment outcomes for high-need schools and student learning outcomes in determining meaningful differentiations of teacher preparation programs. Commenters also noted that including employment outcomes for high-need schools would add another level of complexity to an already confusing and challenging process. Some commenters recommended the Department maintain the focus on teacher placement and retention rates, but eliminate the incentives to place recent graduates in high-need schools. They stated that doing so would permit these indicators to focus on the quality of the program without requiring the program to focus on having its students teach in high-need schools, something that may not be part of every teacher preparation program's mission.

    Multiple other commenters expressed confusion about whether the regulations incentivize placement in high-need schools by making such placement a significant part of how States must determine the rating of a teacher preparation program. Some commenters argued that, on the one hand, the requirement that States use student learning outcomes to help assess a program's overall performance could incentivize teacher preparation programs to steer their candidates toward teaching in schools where students are likely to have higher test scores. On the other hand, they argued that the proposed regulations would also assess program performance using, as one indicator, placement of candidates in high-need schools, an indicator that commenters stated would work in the opposite direction. These commenters argued that this could cause confusion and would create challenges in implementing the regulations by not giving States and programs a clear sense of which is more important—student learning outcomes or placement of teachers in high-need schools.

    Other commenters recommended that the Department set specific thresholds based on the affluence of the area the school serves. For example, commenters recommended that 85 percent of program graduates who work in affluent, high-performing schools should have a certain level of student learning outcomes, but that, to achieve the same level of program performance, only 60 percent of program graduates who work in high-need schools would need to perform at that same level.

    Multiple commenters also opposed the inclusion of student learning outcomes, employment outcomes, and survey outcomes as indicators of the performance of teacher preparation programs. These commenters believed that student learning outcomes are embedded in the concept of VAM found in standardized testing, a concept they believe constitutes a flawed methodology that does not accurately represent teacher preparation program effectiveness.

    Discussion: The final regulations require meaningful differentiation of teacher preparation programs on the basis of criteria that include employment in high-need schools as an indicator of program graduates' (or in the case of alternative route programs, participants') academic content knowledge and teaching skills for several reasons. First, like much of the education community, we recognize that the Nation needs more teachers who are better prepared to teach in high-need schools. We strongly believe that teacher preparation programs should accept a share of the responsibility for meeting this challenge. Second, data collected in response to this indicator should actually help distinguish the distinct missions of teacher preparation programs. For example, certain schools have historically focused their programs on recruiting and preparing teachers to teach in high-need schools—a contribution States and those institutions may understandably want to recognize. Third, we know that some indicators may be influenced by graduates' (or in the case of alternative route programs, participants') placement in high-need schools (e.g., teacher retention rates tend to be lower in high-need schools), and States may also want to consider this factor as they determine how to use the various criteria and indicators of academic content knowledge and teaching skills to identify an overall level of program performance.

    However, while States retain the authority to determine thresholds for performance under each indicator, in consultation with their stakeholder groups (see § 612.4(c)), we encourage States to choose thresholds purposefully. We believe that all students, regardless of their race, ethnicity, or socioeconomic status, are capable of performing at high levels, and that all teacher preparation programs need to work to ensure that teachers in all schools are capable of helping them do so. We encourage States to carefully consider whether differential performance standards for teachers in high-need schools reflect sufficiently ambitious targets to ensure that all children have access to a high quality education.

    Similarly, we encourage States to employ measures of student learning outcomes that are nuanced enough to control for prior student achievement and observable socio-economic factors so that a teacher's contribution to student learning is not affected by the affluence of his or her school. Overall, the concerns stated here would also be mitigated by use of growth, rather than some indicator of absolute performance, in the measure of student learning outcomes. But, here again, we feel strongly that decisions about how and when student learning outcomes are weighted differently should be left to each State and its consultation with stakeholders.

    We respond to the commenters' objections to our requirement that States use student learning outcomes, employment outcomes, and survey outcomes in their assessment of the performance levels of their teacher preparation programs in our discussion of comments on these subjects under § 612.5(a). For reasons we addressed above in the discussion of § 612.1, while still strongly encouraging States to give significant weight to these indicators in assessing a program's performance, we have omitted from the final regulations any requirement that States consider employment outcomes in high-need schools and student learning outcomes “in significant part.”

    Changes: We have revised § 612.4(b)(1) by removing the phrase “including, in significant part, employment outcomes for high-need schools and student learning outcomes.”

    Comments: Commenters recommended that States and their stakeholders have the authority to determine how and to what extent outcomes are included in accountability decisions for teacher preparation programs in order to mitigate the concerns regarding the validity and reliability of the student growth indicators. These commenters stated that we should give more authority to States and LEAs to identify the indicators and relative weightings that would be of the greatest benefit to their communities. Other commenters stated that the proposal to require States to provide meaningful differentiations among teacher preparation programs may conflict with existing State accountability structures, and that by giving States increased flexibility the Department would avoid inconsistencies with State-determined levels of quality.

    Discussion: Having withdrawn our proposal to require that student growth and employment outcomes in high-need schools be considered “in significant part,” the final regulations provide States with broad flexibility in how they weight different indicators of academic content knowledge and teaching skills in evaluating teacher preparation programs. While we strongly encourage States to give significant weight to these important indicators of a teacher preparation program's performance, we provide each State full authority to determine, in consultation with its stakeholders, how each of their criteria, including the required indicators of academic content knowledge and teaching skills, can be best used to fit the individual needs of its schools, teachers, and teacher preparation programs.

    Changes: None.

    Satisfactory or Higher Student Learning Outcomes for Programs Identified as Effective or Higher (34 CFR 612.4(b)(2))

    Comments: Multiple commenters asked us to define the phrase “satisfactory or higher student learning outcomes,” asking specifically what requirements a program would have to meet to be rated as effective or higher. They also stated that States had insufficient guidance on how to define programs as “effective.” Some commenters also noted that providing flexibility to States to determine when a program's student learning outcomes are satisfactory would diminish the ability to compare teacher preparation programs, and opposed giving States the flexibility to determine for themselves when a program has “satisfactory” student learning outcomes. However, other commenters disagreed, stating that States should have flexibility to determine when the teachers trained by a particular teacher preparation program have students who have achieved satisfactory student learning outcomes since States would have a better ability to know how individual teacher preparation programs have helped to meet these States' needs.

    Other commenters recommended modifying the regulations so that States would need to determine programs to have “above average student learning outcomes” in order to rate them in the highest category of teacher preparation performance. Another commenter suggested that student learning data be disaggregated by student groups to show hidden inequities, and that States be required to develop a pilot program to use subgroup data in their measurement of teacher preparation programs, such that if the student subgroup performance falls short the program could not be rated as effective or higher.

    Discussion: The Department continues to believe that a teacher preparation program should not be rated effective if the learning outcomes of the students taught by its graduates (or, in the case of alternative route programs, its participants) are not satisfactory. And we appreciate the comments from those who supported our proposal. Nonetheless, we are persuaded by the comments from those who urged that States should have the flexibility to determine how to apply the criteria and indicators of student academic achievement and learning needs to determine the performance level of each program, and have removed this provision from the regulations.

    Changes: We have removed § 612.4(b)(2). In addition, we have renumbered § 612.4(b)(3) through (b)(5) as § 612.4(b)(2) through (b)(4).

    Data for Each Indicator (34 CFR 612.4(b)(2)(i))

    Comments: One commenter requested confirmation that the commenter's State would not be required to report the disaggregated data on student growth based on assessment test scores for individual teachers, teacher preparation programs, or entities on the SRC because the educator effectiveness measure approved for its ESEA flexibility waiver meets the requirements for student learning outcomes in proposed §§ 612.4(b) and 612.5(a)(1) for both tested and non-tested subjects. The commenter stated that it would be cost prohibitive to submit student growth information on the SRC separately from reporting on its educator effectiveness measure under ESEA flexibility. Furthermore, some commenters were concerned that a State's student privacy laws would make it difficult to access the disaggregated data as required.

    In addition, some commenters opposed our proposed § 612.4(b)(2)(i)(B) requiring each State to include in its SRC an assurance that a teacher preparation program either is accredited or produces teachers with content and pedagogical knowledge because of what they described as the federalization of professional standards. They indicated that our proposal to offer each State the option of presenting an assurance that the program is accredited by a specialized accrediting agency would, at best, make the specialized accreditor an agent of the Federal government, and at worst, effectively mandate specialized accreditation by CAEP. The commenters argued instead that professional accreditation should remain a voluntary, independent process based on evolving standards of the profession. Commenters also noted that no definition of specialized accreditation was proposed and requested that we include a definition of this term. One commenter recommended that a definition of specialized accreditation include the criteria that would be used by the Secretary to recognize an agency for the accreditation of professional teacher preparation programs, and that one of the criteria for a specialized agency should be the inclusion of alternative certification programs as eligible professional teacher preparation programs.

    Discussion: Under § 612.4(b)(2)(i), States may choose to report student learning outcomes using a teacher evaluation measure that meets the definition in § 612.2. But if they do so, States still must report student learning outcomes for each teacher preparation program in the SRC.

    We believe that the costs of this SRC reporting will be manageable for all States, and have provided a detailed discussion of costs in the RIA section of this document. For further discussion of reporting on student learning outcomes, see the discussion in this document of § 612.5(a)(1). We also emphasize that States will report these data in the aggregate at the teacher preparation program level and not at the teacher level. Furthermore, while States will need to comply with applicable Federal and State student privacy laws in the data they report in their SRC, the commenters have not provided information to help us understand how our requirements, except as we discuss for § 612.4(b)(3)(ii)(E), are affected by State student privacy laws.

    In addition, as we reviewed these comments and the proposed regulatory language, we realized the word “disaggregated” was unclear with regard to the factors by which the data should be disaggregated, and redundant with regard to the description of indicators in § 612.5. We have therefore removed this word from § 612.4(b)(2)(i).

    Under § 612.5(a)(4) States must annually report whether each program is administered by an entity that is accredited by a specialized accrediting agency recognized by the Secretary, or produces candidates (1) with content and pedagogical knowledge and quality clinical preparation, and (2) who have met rigorous teacher candidate exit qualifications. Upon review of the comments and the language of § 612.5(a)(4), we have determined that proposed § 612.4(b)(3)(i)(B), which would have had States provide an assurance in their SRCs that each program met the characteristics described in § 612.5(a)(4), is not needed. We address the substantive comments offered on that provision in our discussion of comments on § 612.5(a)(4).

    Finally, in reviewing the public comment, we realized that the proposed regulations focused only on having States report in their SRCs the data they would provide for indicators of academic knowledge and teaching skills that are used to determine the performance level of each teacher preparation program. This, of course, was because State use of those indicators was the focus of the proposed regulations. But we did not mean to suggest that in their SRCs, States would not also report the data they would use for other indicators and criteria they establish for identifying each program's level of performance. While the instructions in section V of the proposed SRCs imply that States are to report their data for all indicators and criteria they use, we have revised those instructions to clarify this point.

    Changes: We have revised § 612.4(b)(2)(i) by removing the word “disaggregated.” We also have removed proposed § 612.4(b)(2)(i)(B) from the regulations.

    Weighting of Indicators (34 CFR 612.4(b)(2)(ii))

    Comments: Some commenters stated that a formulaic approach, which they argued was implied by the requirement to establish the weights of each indicator, will not yield meaningful differentiations among programs. The commenters recommended that States be allowed to use a multiple-measures system for assessing the performance of teacher preparation programs that relies on robust evidence, includes outcomes, and gives weight to professional judgment. In addition, some commenters recommended that stakeholders provide input as to how and to what extent outcomes are included in a teacher preparation program's overall performance rating.

    Several commenters noted that the flexibility our proposed regulations provide to States to determine the weighting system for use of criteria and indicators to assess teacher preparation program performance undermines what the commenters state is the Department's goal of providing meaningful data to, among other things, facilitate State-to-State comparisons. The commenters argue that consumers might incorrectly assume that all States are applying the same metrics to assess program performance, and so draw incorrect conclusions, especially about programs located near each other but in different States. Several commenters also expressed concerns about the Department's proposal in § 612.5(a)(2) that States be able to weigh employment outcomes differently for alternative route programs and traditional teacher preparation programs. The commenters argued that all teacher preparation programs should be held to the same standards and levels of accountability.

    Commenters also stated that our proposal, by which we understand the commenters to mean the proposed use of student learning outcomes, employment outcomes and survey outcomes as indicators of academic content knowledge and teaching skills of teachers whom programs prepare, should be adjusted based on the duration of the teachers' experience. Commenters stated we should do so because information about newer teachers' training programs should be emphasized over information about more experienced teachers, for whom data reflecting these indicators would likely be less useful.

    Some commenters asked whether, if a VAM is used to generate information for indicators of student learning outcomes, the indicators should be weighted to count gains made by the lower performing third of the student population more than gains made by the upper third of the population because it would be harder to increase the former students' scores. The commenters noted that poorer performing students will have the ability to improve by greater amounts than those who score higher on tests.

    Several commenters believed that the weighting of the indicators used to report on teacher preparation program performance is a critical decision, particularly with respect to the weighting of indicators specific to high-need schools, and because of this, decisions on weighting should be determined after data are collected and analyzed. As an example of why the group of stakeholders should have information available prior to making weighting decisions, the commenter noted that, if teacher placement in high-need schools has a relatively low weight and student growth is negatively associated with the percentage of economically disadvantaged students enrolled in the school, programs may game the system by choosing to counsel students to seek employment in non-high-need schools.

    Finally, several commenters stated that the regulations incentivize programs to place graduates in better performing schools, noting that the proposed regulations appeared to require that student learning outcomes be given the most weight. On the other hand, the commenters stated that the proposed regulations incentivize the placement of graduates in high-need schools, and argued that employment rates in high-need schools would receive the next highest weight. They argued that this contradiction would lead to confusion and challenges in implementing the regulations.

    Discussion: We have included a summary of these comments here because they generally address how States should weight the indicators and criteria used to assess the performance of teacher preparation programs, and the advantages and disadvantages of giving weight to certain indicators. However, we stress that we did not intend for States to adopt any particular system of weighting to generate an overall level of performance for each teacher preparation program from the various indicators and criteria they would use. Rather, proposed § 612.4(b)(3)(ii), like § 612.4(b)(2)(ii) of the final regulations, simply directs States to report in their SRCs the weighting they have given to the various indicators in § 612.5. Thus, we are not requiring any State to adopt some form of formulaic approach. And States may, if they choose, build into their indicators and criteria a reliance on robust evidence and outcomes, and give weight to professional judgment.

    States plainly need to be able to implement procedures for taking the data relevant to each of the indicators of academic knowledge and teaching skills and other criteria they use to assess program performance, and turn those data into a reported overall level of program performance. We do not see how States can do this without somehow providing some form of weight to each of the indicators they use. However, the specific method by which a State does so is left to each State, in consultation with its stakeholders (see § 612.4(c)), to determine.
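
    To make the idea of applying reported weights concrete, the following sketch, written in Python, shows one hypothetical way a State might combine normalized indicator scores into an overall performance level. The indicator names, weights, and cut scores are illustrative assumptions only; nothing in the regulations prescribes this or any other particular formula, and a State's actual procedure is determined in consultation with its stakeholders.

# Illustrative only: the indicator names, weights, and cut scores below are
# hypothetical and are not prescribed by the regulations; each State, in
# consultation with its stakeholders, establishes its own procedure.

def overall_performance(indicator_scores, weights, cut_scores):
    """Combine normalized indicator scores (0-100) into a performance level."""
    if set(indicator_scores) != set(weights):
        raise ValueError("every indicator must have a reported weight")
    total_weight = sum(weights.values())
    composite = sum(indicator_scores[name] * weights[name]
                    for name in weights) / total_weight
    for level, minimum in cut_scores:  # ordered from highest to lowest
        if composite >= minimum:
            return level, composite
    return "low-performing", composite

example_scores = {
    "student_learning_outcomes": 72,
    "employment_outcomes": 65,
    "survey_outcomes": 80,
    "other_state_criteria": 90,
}
example_weights = {name: 25 for name in example_scores}  # equal weights, hypothetical
example_cuts = [("effective", 75), ("at-risk of being low-performing", 60)]

level, composite = overall_performance(example_scores, example_weights, example_cuts)
print(level, round(composite, 1))  # effective 76.8

    A State that gives substantial weight to professional judgment could, for example, treat a composite of this kind as only one input to its final determination rather than as the determination itself.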

    As we addressed in the discussion of § 612.1, we had proposed in § 612.4(b)(1) that a State's assessment of a program's performance needed to be based “in significant part” on the results for two indicators, student learning outcomes and employment outcomes in high-need schools. But as we noted in our discussion of comment on §§ 612.1 and 612.4(b)(1), while strongly encouraging States to adopt these provisions in their procedures for assessing a program's performance, we have revised these final regulations to omit that proposal and any other language requiring that any regulatory indicator receive special weight.

    Furthermore, the flexibility the regulations accord to States to determine how these factors should be weighed to determine a program's level of performance extends to the relative weight a State might accord to factors like a teacher's experience and to student learning outcomes of teachers in low-performing versus high-performing schools. It also extends to the weight a State would provide to employment outcomes for traditional teacher preparation programs and alternative route teacher preparation programs; after all, these types of programs are very different in their concept, who they recruit, and when they work with LEAs to place aspiring teachers as teachers of record. In addition, State flexibility extends to a State's ability to assess the overall performance of each teacher preparation program using other indicators of academic content knowledge and teaching skills beyond those contained in the regulations. We do not believe that this flexibility undermines any Departmental goal, or goal that Congress had in enacting the title II reporting system.

    Thus, while a State must report the procedures and weighting of indicators of academic content knowledge and teaching skills and other criteria it uses to assess program performance in its SRC, we believe States should be able to exercise flexibility to determine how they will identify programs that are low-performing or at-risk of being so. In establishing these regulations, we stress that our goal is simple: to ensure that the public—prospective teaching candidates, LEAs that will employ novice teachers, and State and national policy makers alike—has confidence that States are reasonably identifying programs that are and are not working, and understands how States are distinguishing between the two. The flexibility the regulations accord to States to determine a program's level of performance is fully consistent with this goal. Furthermore, given the variation we expect to find in State approaches and the different environments in which each State operates, we reiterate that any State-to-State comparisons will need to be made only with the utmost caution.

    As noted above, our discussion of §§ 612.1 and 612.4(b)(1) stressed both (1) our hope that States would adopt our proposals that student learning outcomes and employment outcomes for high-need schools be given significant weight, and that to be considered effective a teacher preparation program would show positive student learning outcomes, and (2) our decision not to establish these proposals as State requirements. Thus, we likewise leave to States issues regarding incentives that any given weight might cause to placements of aspiring teachers and the programs themselves.

    Finally, in reviewing the public comment, we realized that the proposed regulations focused only on having States report in their SRCs the weights they would provide to indicators of academic knowledge and teaching skills used to determine the performance level of each teacher preparation program. This, of course, was because State use of those indicators was the focus of the proposed regulations. But we did not mean to suggest that in their SRCs, States would not also report the weights they would provide to other indicators and criteria they establish for identifying each program's level of performance. While the instructions in section V of the proposed SRCs imply that States are to report their weighting for all indicators and criteria they use, we have revised them to clarify this point.

    Changes: None.

    Reporting the Performance of All Teacher Preparation Programs (34 CFR 612.4(b)(3))

    Comments: Commenters stated that a number of non-traditional teacher preparation program providers will never meet the criteria for inclusion in annual reports due to their small numbers of students. Commenters noted that this implies that many of the most exemplary programs will neither be recognized nor rewarded and may even be harmed by their omission in reports provided to the media and public. Commenters expressed concern that this might lead prospective students and parents to exclude them as viable options, resulting in decreased program enrollment.

    Other commenters asked for more clarity on the various methods for a program to reach the threshold of 25 new teachers (or other threshold set by the State). The commenters also stated that a State could design this threshold to limit the impact on programs. Other commenters noted that smaller teacher preparation programs may not have the technical and human resources to collect the data for proposed reporting requirements, i.e., tracking employment and impact on student learning, and asked if the goal of these proposed regulations is to encourage small programs to close or merge with larger ones.

    Discussion: The regulations establish minimum requirements for States to use in assessing and reporting the performance of each teacher preparation program, and are not intended to facilitate the merger or closure of small programs. The proposed regulations provided States with three methods of identifying and reporting the performance of teacher preparation programs that produce fewer than 25 new teachers—or such lower number as the State might choose—in a given reporting year by aggregating data to reach the minimum thresholds. Under the final regulations, States could: (1) Combine a teacher preparation program's performance data with data for other teacher preparation programs that are operated by the same teacher preparation entity and are similar to or broader than the program in content; (2) combine data over multiple years for up to four years until the size threshold is met; or (3) use a combination of the two methods. Given statistical and privacy issues that are particular to small programs, we believe that these aggregation methods will adequately address the desire to have the performance of all programs, large and small, reported in SRCs. In addition, while we strongly believe that all teacher preparation programs should want to gather student learning outcomes and results of employment and survey results to help them to improve their programs, States, not institutions, ultimately have the responsibility to report under § 612.4.

    The proposed regulations had focused State reporting and small program aggregation procedures on the number of new teachers a teacher preparation program produced. Based on further consideration of these and other comments, it became clear that the term “new teacher” was problematic in this case as it was in other places. We realized that this approach would not hold teacher preparation programs accountable for producing recent graduates who do not become novice teachers. Because we believe that the fundamental purpose of these programs is to produce novice teachers, we have concluded that our proposal to have State reporting of a program's performance depend on the number of new teachers that the program produces was misplaced.

    Therefore, in order to better account for individuals who complete a teacher preparation program but who do not become novice teachers, we are requiring a State to report annually on the performance of each “brick-and-mortar” teacher preparation program that produces a total of 25 or more recent graduates (or such lower threshold as the State may establish). Similarly, aggregation procedures for smaller programs apply to each teacher preparation program that produces fewer than 25 recent graduates (or such lower threshold as the State may establish). For teacher preparation programs provided through distance education, the requirement is the same except that, since States are not likely to know the number of recent graduates, States will continue to look at whether the program has that same threshold number of 25 recent graduates, but in this case, to be counted, these recent graduates need to have received an initial certification or licensure from the State that allows them to serve in the State as teachers of record for K-12 students.
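
    As a purely illustrative aid, the sketch below (in Python, with hypothetical record fields) shows how the counting rule described above might be applied: a program is reported individually when its countable recent graduates meet the threshold of 25 (or a lower State-established threshold), and for distance-education programs only those recent graduates who received initial certification or licensure from the State are counted.

# Sketch under stated assumptions: the record fields and helper names are
# hypothetical. The default threshold of 25 recent graduates comes from the
# regulations; a State may establish a lower one.

DEFAULT_THRESHOLD = 25

def countable_recent_graduates(recent_graduates, distance_education):
    """Count recent graduates; for distance-education programs, count only
    those who received initial certification or licensure from the State."""
    if distance_education:
        return sum(1 for g in recent_graduates if g.get("state_certified"))
    return len(recent_graduates)

def reporting_route(recent_graduates, distance_education, threshold=DEFAULT_THRESHOLD):
    """Decide whether a program is reported individually or via aggregation."""
    if countable_recent_graduates(recent_graduates, distance_education) >= threshold:
        return "report individually"
    return "apply the small-program aggregation procedures"

graduates = [{"state_certified": True}] * 18 + [{"state_certified": False}] * 10
print(reporting_route(graduates, distance_education=False))  # 28 counted -> report individually
print(reporting_route(graduates, distance_education=True))   # 18 counted -> aggregate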

    Changes: We have revised § 612.4(b)(3) to provide that a State's annual reporting of a teacher preparation program's performance, and whether it provides this reporting alternatively through small program aggregation procedures, depends on whether the program produces a total of 25 or more recent graduates (or such lower threshold as the State may establish). For programs provided through distance education, the number of recent graduates counted will be those who have received an initial certification or licensure from the State that allows them to serve in the State as teachers of record for K-12 students.

    Annual Performance Reporting of Teacher Preparation Programs (34 CFR 612.4(b)(3)(i))

    Comments: Two commenters stated that differentiated reporting for large and small teacher preparation programs, coupled with allowing States to establish what the commenters referred to as “certain criteria,” will lead to invalid comparisons and rankings both within and among States.

    Discussion: The regulations require separate reporting of the performance of any teacher preparation program that annually produces 25 or more recent graduates. For programs that annually produce fewer recent graduates, the regulations also establish procedures for data aggregation that result in reporting on all of the State's teacher preparation programs (except for those programs that are particularly small and for which aggregation procedures cannot be applied, or where the aggregation would be in conflict with State or Federal privacy or confidentiality laws). Based on concerns expressed during the negotiated rulemaking sessions, the Department believes that use of an “n-size” of 25 (or such smaller number that a State may adopt) and the means of reporting the performance of smaller programs through the aggregation procedures address privacy and reliability concerns while promoting the goal of having States report on the performance of as many programs as possible. Moreover, we reiterate that the purpose of these regulations is to identify key indicators that States will use to assess the level of performance for each program, and provide transparency about how it identifies that level. We are not proposing any rankings and continue to caution against making comparisons of programs based on data States report.

    Changes: None.

    Performance Reporting of Small Teacher Preparation Programs: General (34 CFR 612.4(b)(3)(ii))

    Comments: Commenters stated that the low population in some States makes privacy of students in elementary and secondary schools, and in teacher preparation programs, difficult or impossible to assure. The commenters further stated that aggregating student growth data to the school level to assure privacy in the title II report would result in meaningless ratings, because the teachers in the schools more than likely completed the preparation program at different institutions.

    Several commenters were concerned that our proposals for aggregating data to be used to annually identify and report the level of performance of small teacher preparation programs would make year-by-year comparisons and longitudinal trends difficult to assess in any meaningful way, since it is very likely that States will use different aggregation methods institution-by-institution and year-by-year.

    Commenters noted that many small rural teacher preparation programs and programs producing small numbers of teachers who disperse across the country after program completion do not have the requisite threshold size of 25. Commenters stated that for these programs, States may be unable to collect sufficient valid data. The result will be misinformed high-stakes decision making.

    Some commenters proposed that States be able to use a minimum threshold of 10 new teachers, rather than 25, with aggregation applied when that minimum is not met. Other suggested options were for States to report whatever data they have or to aggregate data from previous years to meet the “n” size.

    One commenter recommended that rankings be initially based on relatively few normed criteria common to, and appropriate for, programs of all sizes and all States, i.e., a common baseline ranking system. The commenter stated that to do otherwise could result in States rushing to the lowest (not highest) common denominator to protect both quality programs from being unfairly ranked in comparison with weaker programs in other States, and small premier programs from unfair comparisons with mediocre larger programs.

    Two commenters stated that even though the proposed rules create several ways in which States may report the performance of teacher preparation programs that annually produce fewer than 25 teachers per year, the feasibility of annual reporting at the program level in some States would be so limited it would not be meaningful. The commenters added that regardless of the aggregation strategy, having a minimum threshold of 25 will protect the confidentiality of completers for reporting, but requiring annual reporting of programs that produce 25 or more recent graduates per year will omit a significant number of individual programs from the SRC. Several commenters had similar concerns and stated that annual reporting of the teacher preparation program performance would not be feasible for the majority of teacher preparation programs across the country due to their size or where the student lives. Commenters specifically mentioned that many programs at Historically Black Colleges and Universities will have small cell sizes for graduates, which will make statistical conclusions difficult. Another commenter had concerns with the manner in which particular individual personnel data will be protected from public disclosure, while commenters supported procedural improvements in the proposed regulations discussed in the negotiated rulemaking sessions that addressed student privacy concerns by increasing the reporting threshold from 10 to 25.

    Commenters further expressed concerns that, in States where a program produces fewer than 25 teachers per year, the manual calculation States would need to perform to combine programs and aggregate the number of students up to 25, so that the State could then report its assessment of program performance and information on indicators, would not only be excessive but might also lead to significant inconsistencies across entities and from one year to the next.

    Discussion: We first reiterate that we have revised § 612.5(a)(1)(ii) so that States do not need to use student growth, either by itself or as used in a teacher evaluation measure, for student learning outcomes when assessing a teacher preparation program's level of performance. While we encourage them to do so, if, for reasons the commenters provided or other reasons, they do not want to do so, States may instead use “another State-determined measure relevant to calculating student learning outcomes.”

    We do not share commenters' concerns about small elementary and secondary schools where privacy concerns purportedly require a school-level calculation of student growth measures rather than calculation of student growth at the teacher level, or related concerns about student learning outcomes for an individual teacher not yielding useable information about a particular teacher preparation program. Student learning outcomes applicable to a particular teacher preparation program would not be aggregated at the school level. Whether measured using student growth, a teacher evaluation measure, or another State-determined measure relevant to calculating student learning outcomes, each teacher—whether employed in a large school or a small school—has some impact on student learning. Under our regulations, these impacts would be aggregated across all schools (or at least all public schools in the State in which the program is located) that employ novice teachers the program had prepared.

    For small teacher preparation programs, we believe that a State's use of the aggregation methods reasonably balances the need for annual reporting on teacher preparation program performance with the special challenges of generating a meaningful annual snapshot of program quality for programs that annually produce few teachers. By permitting aggregation to the threshold level with similar or broader programs run by the same teacher preparation entity (paragraph (b)(3)(ii)(A)), over a period of up to four years (paragraph (b)(3)(ii)(B)), or both (paragraph (b)(3)(ii)(C)), we are offering States options for meeting their annual reporting responsibilities for all programs. However, if aggregation under any of the methods identified in § 612.4(b)(3)(ii)(A)-(C) would still not yield the requisite program size threshold of 25 recent graduates or such lower number that a State establishes, or if reporting such data would be inconsistent with Federal or State privacy and confidentiality laws and regulations, § 612.4(b)(3)(ii)(D) and § 612.4(b)(5) provide that the State would not need to report data on, or identify an overall performance rating for, that program.

    Our regulations give States flexibility to determine, with their consultative groups, their own ways of determining a teacher preparation program's performance. But if a State were to use the “lowest common denominator” in evaluating programs, as the commenter suggested, it would not be meeting the requirement in § 612.4(b)(1) to identify meaningful differentiation between programs. We continue to caution against making comparisons of the performance of each teacher preparation program, or the data for each indicator and criterion a State uses to determine the overall level of performance, that States report in their SRCs. Each teacher preparation program is different; each has a different mission and draws different groups of aspiring teachers. The purpose of this reporting is to permit the public to understand which programs a State determines to be low-performing or at-risk of being low-performing, and the reasons for this determination. The regulations do not create a national ranking system for comparing the performance of programs across States. For these reasons, we do not believe that the regulations provide perverse incentives for States to lower their standards relative to other States.

    While we appreciate the commenter's recommendation that States be required to use a set of normed criteria common across programs of all sizes and all States, section 205(b) of the HEA requires each State to include in its SRC its criteria for assessing program performance, including indicators of academic content knowledge and teaching skills. Therefore, subject only to use of the indicators of academic content knowledge and teaching skills defined in these regulations, the law provides that each State determine how to assess a program's performance and, in doing so, how to weight different criteria and indicators that bear on the overall assessment of a program's performance.

    We appreciate the commenters' statements about potential challenges and limitations that the regulations' aggregation procedures pose for small teacher preparation programs. However, while we agree that a State's use of these procedures for small programs may produce results that are less meaningful than those for programs that annually produce 25 or more recent graduates (or such lower threshold as the State establishes), we believe that they do provide information that is far more meaningful than the omission of information about performance of these small programs altogether. We also appreciate commenters' concerns that for some States, the process of aggregating program data could entail significant effort. But we assume that data for indicators of this and other programs of the same teacher preparation entities would be procured electronically, and, therefore, do not believe that aggregation of data would necessarily need to be performed manually or that the effort involved would be “excessive”. Moreover, the commenters do not explain why use of the aggregation methods to identify programs that are low-performing or at-risk of being low-performing should lead to significant inconsistencies across entities and from one year to the next, nor do we agree this will be the case.

    Like the commenter, we are concerned about protection of individual personnel data from public disclosure. But we do not see how the procedures for aggregating data on small programs, under which what the State reports concerns a combined program that meets the size threshold of 25 (or such lower size threshold as the State establishes), create legitimate concerns about such disclosure. And as our proposed regulations did not contain a size threshold of 10, we do not believe we need to make edits to address the specific commenters' concerns regarding our threshold number.

    Changes: None.

    Aggregating Data for Teacher Preparation Programs Operated by the Same Entity (34 CFR 612.4(b)(3)(ii)(A))

    Comments: One commenter expressed concerns for how our proposed definition of a teacher preparation program meshed with how States would report data for and make an overall assessment of the performance of small teacher preparation programs. The commenter noted that the proposed regulations define a teacher preparation program as a program that is “offered by a teacher preparation entity that leads to a specific State teacher certification or licensure in a specific field.” It therefore appears that a program that is a “secondary mathematics program” would instead be a “secondary program.” Based on the proposed regulatory language about aggregation of performance data among teacher preparation programs that are operated by the same teacher preparation entity and are similar to or broader than the program (§ 612.4(b)(3)(ii)(A)), the commenter added that it appears that a State can collapse secondary content areas (e.g., biology, physics) and call it a “secondary program.”

    Discussion: As explained in our discussion of the prior comments, we feel that meeting the program size threshold of 25 novice teachers (or any lower threshold a State establishes) by aggregating performance data for each of these smaller programs with performance data of similar or broader programs that the teacher preparation entity operates (thus, in effect, reporting on a broader-based set of teacher preparation programs) is an acceptable and reasonable way for a State to report on the performance of these programs. Depending on program size, reporting could also be even broader, potentially having reporting for the entire teacher preparation entity. Indicators of teacher preparation performance would then be outcomes for all graduates of the combined set of programs, regardless of what subjects they teach. A State's use of these aggregation methods balances the need to annually report on program performance with the special challenges of generating a meaningful annual snapshot of program quality for programs that annually produce few novice teachers. We understand the commenter's concern that these aggregation measures do not precisely align with the definition of teacher preparation program and permit, to use the commenter's example, a program that is a “secondary mathematics program” to potentially have its performance reported as a broader “secondary program.” But as we noted in our response to prior comments, if a State does not choose to establish a lower size threshold that would permit reporting of the secondary mathematics program, aggregating performance data for that program with another similar program still provides benefits that far exceed having the State report no program performance information at all.

    TEACH Grant eligibility would not be impacted because either the State will determine and report the program's performance by aggregating relevant data on that program with data for other teacher preparation programs that are operated by the same teacher preparation entity and are similar to or broader than the program in content, or the program will meet the exceptions provided in § 612.4(b)(3)(ii)(D) and § 612.4(b)(5).

    Changes: None.

    Aggregating Data in Performance Reporting (34 CFR 612.4(b)(3)(ii)(B))

    Comments: Several commenters stated that aggregating data for any given teacher preparation program over four years to meet the program size threshold would result in a significant lack of reliability; some urged the Department to cap the number of years allowed for aggregating data at three years. Another commenter raised concerns about reported data on any given program being affected by program characteristics that are prone to change significantly in the span of four years (i.e., faculty turnover and changes in clinical practice, curriculum, and assessments). The commenter noted that many States' programs will not meet the criterion of setting the minimum number of program completers, which the commenter stated our proposed regulations set at ten. The commenter asked the Department to consider a number of aggregation methods to reach a higher completer count.

    Discussion: The proposed regulations did not establish, as a threshold for reporting performance data and the program's level of performance, a minimum of ten program completers. Rather, where a teacher preparation program does not annually produce 25 or more recent graduates (or such lower threshold as the State may establish), proposed § 612.4(b)(3)(ii)(B) would permit a State to aggregate its performance data in any year with performance data for the same program generated over a period of up to four years. We appreciate that aggregating data on a program's new teachers over a period of up to four years is not ideal; as commenters note, program characteristics may change significantly in the span of four years.

    However, given the challenges of having States report on the performance of small programs, we believe that providing States this option, as well as options for aggregating data on the program with similar or broader programs of the same teacher preparation entity (§§ 612.4(b)(3)(ii)(A) and (C)), allows the State to make a reasonable determination of the program's level of performance. This is particularly so given that the regulations require that the State identify only whether a given teacher preparation program is low-performing or at-risk of being low-performing. We note that States have the option to aggregate across programs within an entity if, in consultation with stakeholders, they find that this produces a more accurate representation of program quality (see § 612.4(b)(3)(ii)(A)). We believe that a State's use of these alternative methods would produce more reliable and valid measures of quality for each of these smaller programs and reasonably balance the need to report annually on program performance with the special challenges of generating a meaningful annual snapshot of program quality for programs that annually produce few novice teachers.
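
    One way to picture the multi-year option in § 612.4(b)(3)(ii)(B) is as a rolling combination of a small program's data, beginning with the current year and adding prior years until the size threshold is met or four years have been combined. The short sketch below is an illustration in Python; the counts and threshold are hypothetical examples, and the regulations do not prescribe any particular computational approach.

# Illustrative sketch: combine up to four years of a small program's data,
# starting with the most recent year, until the size threshold is reached.
# The counts and threshold below are hypothetical examples.

def aggregate_years(counts_newest_first, threshold=25, max_years=4):
    """Return (years_combined, combined_count), or None if the threshold
    cannot be met within max_years (see § 612.4(b)(3)(ii)(D) and (b)(5))."""
    combined = 0
    for years_used, count in enumerate(counts_newest_first[:max_years], start=1):
        combined += count
        if combined >= threshold:
            return years_used, combined
    return None

print(aggregate_years([9, 8, 7, 6]))   # (4, 30): all four years needed
print(aggregate_years([15, 12]))       # (2, 27): two years suffice
print(aggregate_years([3, 2, 4, 5]))   # None: still below 25 after four years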

    The commenters who recommended reducing the maximum time for aggregating data on the same small program from four years to three did not explain why the option of having an additional year to report on very small programs was preferable to omitting a report on program performance altogether if the program was still below the size threshold after three years. We do not believe that it is preferable. Moreover, if a State does not want to aggregate performance data for the same small program over a full four years, the regulations permit it instead to combine performance data with data for other programs operated by the same entity that are similar or broader.

    Changes: None.

    Aggregating Data in Performance Reporting of Small Teacher Preparation Programs (34 CFR 612.4(b)(3)(ii)(C))

    Comments: Commenters noted that while the proposed rule asserts that States may use their discretion on how to report on the performance of teacher preparation programs that do not meet the threshold of 25 novice teachers (or any lower threshold the State establishes), the State may still be reporting on less than half of its programs. Commenters note that if this occurs, the Department's approach will not serve the purpose of increased accountability of all programs. Another commenter stated that human judgment would have to be used to aggregate data across programs or across years in order to meet the reporting threshold, and this would introduce error in the level of performance the State assigns to the program in what the commenter characterizes as a high-stakes accountability system.

    Another commenter appears to understand that the government wants to review larger data fields for analysis and reporting, but stated that the assumption that data from a program with a smaller “n” size are not worth reporting may dampen innovation and learning at a sponsoring organization that has a stated goal of producing a limited number of teachers or that is in a locale needing a limited number of teachers. The commenter noted that, if a State were to combine programs, report years, or some other combination to get to 25, the Federally stated goal of collecting information about each program, rather than about the overall sponsoring organization, is lost. The commenter argued that § 612.4(c), which the commenter states requires that States report on teacher preparation at the individual program level, appears to contradict the 25-or-more completer rule for reporting.

    Discussion: We expect that, working with their consultative group (see § 612.4(c)), States will adopt reasonable criteria for deciding which procedure to use in aggregating performance data for programs that do not meet the minimum threshold. We also expect that a key factor in the State's judgment of how to proceed will be how best to minimize error and confusion in reporting data for indicators of academic content knowledge and teaching skills and other criteria the State uses, and the program's overall level of performance. States will want to produce the most reliable and valid measures of quality for each of these smaller programs. Finally, while the commenter is correct that § 612.4(c) requires States to work with a consultative group on procedures for assessing and reporting the performance of each teacher preparation program in the State, how the State does so for small programs is governed by § 612.4(b)(3)(ii).

    Changes: None.

    No Required State Reporting on Small Teacher Preparation Programs That Cannot Meet Reporting Options (34 CFR 612.4(b)(3)(ii)(D))

    Comments: Some commenters urged the Department not to exempt from State title II reporting those teacher preparation programs that are so small they are unable to meet the proposed threshold size requirements even with the options for small programs we had proposed.

    Discussion: If a teacher preparation program produces so few recent graduates that the State cannot use any of the aggregation methods to enable reporting of program performance within a four-year period, we do not believe that use of the regulations' indicators of academic content knowledge and teaching skills to assess its performance will produce meaningful results.

    Changes: None.

    No Required State Reporting Where Inconsistent With Federal and State Privacy and Confidentiality Laws (34 CFR 612.4(b)(3)(ii)(E))

    Comments: Two commenters objected to the proposed regulations because of concerns that the teacher evaluation data and individual student data that would be collected and reported would potentially violate State statutes protecting or sharing elementary and secondary student performance data and teacher evaluation results with any outside entity. One commenter expressed general concern about whether this kind of reporting would violate the privacy rights of teachers, particularly those who are working in their initial years of teaching.

    Another commenter recommended that the proposed regulations include what the commenter characterized as the exemption in the Family Educational Rights and Privacy Act (FERPA) (34 CFR 99.31 or 99.35) that allows for the re-disclosure of student-level data for the purposes of teacher preparation program accountability. The commenter stressed that the proposed regulations do not address a restriction in FERPA that prevents teacher preparation programs from being able to access data that the States will receive on program performance. The commenter voiced concern that as a result of this restriction in FERPA, IHEs will be unable to perform the analyses to determine which components of their teacher preparation programs are leading to improvements in student academic growth and which are not, and urged that we include an exemption in 34 CFR 99.31 or 99.35 to permit the re-disclosure of student-level data to IHEs for the purposes of promoting teacher preparation program accountability. From a program improvement standpoint, the commenter argues that aggregated data are meaningless; teacher preparation programs need fine-grained, person-specific data (data at the lowest level possible) that can be linked to student information housed within the program.

    Yet another commenter stated that surveying students (by which we interpret the comment to mean surveying elementary or secondary school students) or parents raises general issues involving FERPA.

    Discussion: The Department appreciates the concerns raised about the privacy of information on students and teachers. Proposed § 612.4(b)(4)(ii)(E) provided that a State is not required to report data on a particular teacher preparation program that does not meet the size thresholds under § 612.4(b)(4)(ii)(A)-(C) if reporting these data would be inconsistent with Federal or State privacy and confidentiality laws and regulations. We had proposed to limit this provision to these small programs because we did (and do) not believe that, for larger programs, Federal or State laws would prohibit States or State agencies from receiving the information they need under our indicators of academic content knowledge and teaching skills to identify a program's level of performance. The commenters did not provide the text of any specific State law to make us think otherwise, and for reasons we discuss below, we are confident that FERPA does not create such concerns. Still, in an abundance of caution, we have revised this provision to clarify that no reporting of data under § 612.4(b) is needed if such reporting is inconsistent with Federal or State confidentiality laws. We also have redesignated this provision as § 612.4(b)(5) to clarify that it is not limited to reporting of small teacher preparation programs. States should be aware of any restrictions in reporting because of State privacy laws that affect students or teachers.

    At the Federal level, the final regulations do not amend 34 CFR part 99, which are the regulations implementing section 444 of the General Education Provisions Act (GEPA), commonly referred to as FERPA. FERPA is a Federal law that protects the privacy of personally identifiable information in students' education records. See 20 U.S.C. 1232g; 34 CFR part 99. FERPA applies to educational agencies and institutions (elementary and secondary schools, school districts, colleges and universities) that are recipients of Federal funds under a program administered by the Department. FERPA prohibits educational agencies and institutions to which it applies from disclosing personally identifiable information from students' education records, without the prior written consent of the parent or eligible student, unless the disclosure meets an exception to FERPA's general consent requirement. The term “education records” means those records that are: (1) Directly related to a student; and (2) maintained by an educational agency or institution or by a party acting for the agency or institution. Education records would encompass student records that LEAs maintain and that States will need in order to have the data needed to apply the regulatory indicators of academic content and teaching skills to individual teacher preparation programs.

    As the commenter implicitly noted, one of the exceptions to FERPA's general consent requirement permits the disclosure of personally identifiable information from education records by an educational agency or institution to authorized representatives of a State educational authority (as well as to local educational authorities, the Secretary, the Attorney General of the United States, and the Comptroller General of the United States) as may be necessary in connection with the audit, evaluation, or the enforcement of Federal legal requirements related to Federal or State supported education programs (termed the “audit and evaluation exception”). The term “State and local educational authority” is not specifically defined in FERPA. However, we have previously explained in the preamble to FERPA regulations published in the Federal Register on December 2, 2011 (76 FR 75604, 75606), that the term “State and local educational authority” refers to an SEA, a State postsecondary commission, Bureau of Indian Education, or any other entity that is responsible for and authorized under local, State, or Federal law to supervise, plan, coordinate, advise, audit, or evaluate elementary, secondary, or postsecondary Federal- or State-supported education programs and services in the State. Accordingly, an educational agency or institution, such as an LEA, may disclose personally identifiable information from students' education records to a State educational authority that has the authority to access such information for audit, evaluation, compliance, or enforcement purposes under FERPA.

    We understand that all SEAs exercise this authority with regard to data provided by LEAs, and therefore FERPA permits LEAs to provide to SEAs the data the State needs to assess the indicators our regulations require. Whether other State agencies such as those that oversee or help to administer aspects of higher education programs or State teacher certification requirements are also State education authorities, and so may likewise receive such data, depends on State law. The Department would therefore need to consider State law (including valid administrative regulations) and the particular responsibilities of a State agency before providing additional guidance about whether a particular State entity qualifies as a State educational authority under FERPA.

    The commenter would have us go further, and amend the FERPA regulations to permit State educational authorities to re-disclose this personally identifiable information from students' education records to IHEs or the programs themselves in order to give them the disaggregated data they need to improve the programs. While we understand the commenter's objective, we do not have the legal authority to do this.

    Finally, in response to other comments, FERPA does not extend privacy protections to an LEA's records on teachers. Nor do the final regulations require any reporting of survey results from elementary or secondary school students or their parents. To the extent that either is maintained by LEAs, disclosures would be subject to the same exceptions and limitations under FERPA as records of or related to students.

    Changes: We have revised § 612.4(b)(3)(ii)(E) and have redesignated it as § 612.4(b)(5) to clarify that where reporting of data on a particular program would be inconsistent with Federal or State privacy or confidentiality laws or regulations, the exclusion from State reporting of these data is not limited to small programs subject to § 612.4(b)(3)(ii).

    Fair and Equitable Methods: Consultation With Stakeholders (34 CFR 612.4(c)(1))

    Comments: We received several comments on the proposed list of stakeholders that each State would be required to include, at a minimum, in the group with which the State must consult when establishing the procedures for assessing and reporting the performance of each teacher preparation program in the State (proposed § 612.4(c)(1)(i)). Some commenters supported the list of stakeholders. One commenter specifically supported the inclusion of representatives of institutions serving minority and low-income students.

    Some commenters believed that, as the relevant stakeholders will vary by State, the regulations should not specify any of the stakeholders that each State must include, leaving the determination of necessary stakeholders to each State's discretion.

    Some commenters suggested that States be required to include representatives beyond those listed in the proposed rule. In this regard, commenters stated that representatives of small teacher preparation programs are needed to help the State to annually revisit the aggregation of data for programs with fewer novice teachers than the program size threshold, as would be required under proposed § 612.4(b)(4)(ii). Some commenters recommended adding advocates for low-income and underserved elementary and secondary school students. Some commenters also stated that advocates for students of color, including civil rights organizations, should be required members of the group. In addition, commenters believed that the regulations should require the inclusion of a representative of at least one teacher preparation program provided through distance education, as distance education programs will have unique concerns.

    One commenter recommended adding individuals with expertise in testing and assessment to the list of stakeholders. This commenter noted, for example, that there are psychologists who have expertise in aspects of psychological testing and assessment across the variety of contexts in which psychological and behavioral tests are administered. The commenter stated that, when possible, experts such as these who are vested stakeholders in education should be consulted in an effort to ensure the procedures for assessing teacher preparation programs are appropriate and of high quality, and that their involvement would help prevent potential adverse, unintended consequences in these assessments.

    Some commenters supported the need for student and parent input into the process of establishing procedures for evaluating program performance but questioned the degree to which elementary and secondary school students and their parents should be expected to provide input on the effectiveness of teacher preparation programs.

    One commenter supported including representatives of school boards, but recommended adding the word “local” before “school boards” to clarify that the phrase “school boards” does not simply refer to State boards of education.

    Discussion: We believe that all States must consult with the core group of individuals and entities that are most involved with, and affected by, how teachers are prepared to teach. To ensure that this is done, we have specified this core group of individuals and entities in the regulations. We agree with the commenters that States should be required to include, in the group of stakeholders with whom a State must consult, representatives of small teacher preparation programs (i.e., programs that produce fewer novice teachers in a given year than the program size threshold of 25, or any lower threshold set by a State, as described in § 612.4(b)(3)(ii)). We agree that the participation of representatives of small programs, as is required by § 612.4(c)(ii)(D), is essential because one of the procedures for assessing and reporting the performance of each teacher preparation program that States must develop with stakeholders includes the aggregation of data for small programs (§ 612.4(c)(1)(ii)(B)).

    We also agree with commenters that States should be required to include as stakeholders advocates for underserved students, such as low-income students and students of color, who are not specifically advocates for English learners and students with disabilities. Section 612.4(c)(ii)(I) includes these individuals, and they could be, for example, representatives of civil rights organizations. To best meet the needs of each State, and to provide room for States to identify other groups of underserved students, the regulations do not specify what those additional groups of underserved students must be.

    We agree with the recommendation to require States to include a representative of at least one teacher preparation program provided through distance education in the group of stakeholders as we agree teacher preparation programs provided through distance education are different from brick-and-mortar programs, and warrant representation on the stakeholder group. Under the final regulations, except for the teacher placement rates, States collect information on those programs and report their performance on the same basis as brick-and-mortar programs. See the discussion of comment on Program-Level Reporting (including distance education) (34 CFR 612.4(a)(1)(i)).

    While a State may include individuals with expertise in testing and assessment in the group of stakeholders, we do not require this because States alternatively may either wish to consult with such individuals through other arrangements, or have other means for acquiring information in this area that they need.

    Nonetheless, we encourage States to use their discretion to add representatives from other groups to ensure the process for developing their procedures and for assessing and reporting program performance are fair and equitable.

    We thank commenters for their support for our inclusion of representatives of “elementary through secondary students and their parents” in the consultative group. We included them because of the importance of having teacher preparation programs focus on their ultimate customers—elementary and secondary school students.

    Finally, we agree that the regulation should clarify that the school board representatives whom a State must include in its consultative group of stakeholders are those of local school boards. Similarly, we believe that the regulation should clarify that the superintendents whom a State must include in the group of stakeholders are LEA superintendents.

    Changes: We have revised § 612.4(c)(1)(i) to clarify that the group with which the State must consult when establishing its procedures must include representatives of small programs, of other groups of underserved students, of local school boards, and of LEA superintendents, as well as a representative of at least one teacher preparation program provided through distance education.

    Comments: Commenters recommended that States should not be required to establish consequences (associated with a program's identification as low-performing or at-risk of being low-performing), as required under proposed § 612.4(c)(1)(ii)(C), until after the phase-in of the regulations. Commenters stated that, because errors will be made in the calculation of data and in determining the weights associated with specific indicators, States should be required to calculate, analyze, and publish the data for at least two years before high-stakes consequences are attached. Commenters believed that this would ensure initial unintended consequences are identified and addressed before programs are subject to high-stakes consequences. Commenters also expressed concerns about the ability of States, under the proposed timeline for implementation, to implement appropriate opportunities for programs to challenge the accuracy of their performance data and classification of their program under proposed § 612.4(c)(1)(ii)(D).

    Commenters also stated that the proposed requirement that the procedures for assessing and reporting the performance of each teacher preparation program in the State must include State-level rewards and consequences associated with the designated performance levels is inappropriate because the HEA does not require States to develop rewards or consequences associated with the designated performance levels of teacher preparation programs. Commenters also questioned the amount of information that States would have to share with the group of stakeholders establishing the procedures on the fiscal status of the State to determine what the rewards should be for high-performing programs. Commenters noted that rewards are envisioned as financial in nature, but States operate under tight fiscal constraints. Commenters believed that States would not want to find themselves in an environment where rewards could not be distributed yet consequences (i.e., the retracting of monies) would ensue.

    In addition, commenters were concerned about the lack of standards in the requirement that States implement a process for programs to challenge the accuracy of their performance data and classification. Commenters noted that many aspects of the rating system carry the potential for inaccurate data to be inputted or for data to be miscalculated. Commenters noted that the proposed regulations do not address how to ensure a robust and transparent appeals process for programs to challenge their classification.

    Discussion: We believe the implementation schedule for these final regulations provides sufficient time for States to implement the regulations, including the time necessary to develop the procedures for assessing and reporting the performance of each teacher preparation program in the State (see the discussion of comments related to the implementation timeline for the regulations in General (Timeline) (34 CFR 612.4(a)(1)(i)) and Reporting of Information on Teacher Preparation Program Performance (Timeline) (34 CFR 612.4(b))). We note that States can use results from the pilot reporting year, when States are not required to classify program performance, to adjust their procedures. These adjustments could include the weighting of indicators, the procedure for program challenges, and other changes needed to ensure that unintended consequences are identified and addressed before the consequences have high stakes for programs. Additionally, under § 612.4(c)(2), a State has the discretion to determine how frequently it will periodically examine the quality of the data collection and reporting activities it conducts, and States may find it beneficial to examine and make changes to their systems more frequently during the initial implementation stage.

    The regulations do not require a State to have State-level rewards or consequences associated with teacher preparation performance levels. To the extent that the State does, § 612.4(b)(2)(iii) requires a State to provide that information in the SRC, and § 612.4(c)(1)(ii)(C) requires the State to include those rewards or consequences in the procedures for assessing and reporting program performance it establishes in consultation with a representative group of stakeholders in accordance with § 612.4(c)(1)(i).

    Certainly, whether a State can afford to provide financial rewards is an essential consideration in the development of any State-level rewards. We leave it up to each State to determine, in accordance with any applicable State laws or regulations, the amount of information to be shared in the development of any State-level rewards or consequences.

    As a part of establishing appropriate opportunities for teacher preparation programs to challenge the accuracy of their performance data and program classification, States are responsible for determining the related procedures and standards, again in consultation with the required representative group of stakeholders. We expect that these procedures and standards will afford programs meaningful and timely opportunities to appeal the accuracy of their performance data and overall program performance level.

    Changes: None.

    Fair and Equitable Methods: State Examination of Data Collection and Reporting (34 CFR 612.4(c)(2))

    Comments: Commenters asserted that the proposed requirement for a State to periodically examine the quality of its data collection and reporting activities under proposed § 612.4(c)(2) is insufficient. The commenters contended that data collection and reporting activities must be routinely and rigorously examined and analyzed to ensure transparency and accuracy in the data and in the high-stakes results that follow from the use of the data. According to these commenters, State data systems are not at this time equipped to fully implement the regulations, and thus careful scrutiny of the data collection—especially in the early years of the data systems—is vital to ensure that data from multiple sources are accurate, and, if they are not, that modifications are made. Commenters also suggested that there should be a mechanism to adjust measures when schools close or school boundaries change, as programs with smaller numbers of graduates concentrated in particular schools could be significantly affected by changes that are outside the control of teacher preparation programs.

    Discussion: The regulations do not specify how often a State must examine the quality of its data collection and reporting activities and make any appropriate modifications, requiring only that it be done “periodically.” We think that the frequency and extent of this review is best left to each State, in consultation with its representative group of stakeholders. We understand, as indicated by commenters, that many State data systems are not currently ready to fully implement the regulations, and therefore it is likely that such examinations and modifications will need to be made more frequently during the development stage than will be necessary once the systems have been in place and operating for a while. As States have the discretion to determine the frequency of their examinations and modifications, they may establish triggers for examining and, if necessary, modifying their procedures. This could include developing a mechanism to modify the procedures in certain situations, such as where school closures and school boundary changes may inadvertently affect certain teacher preparation programs.

    Changes: None.

    Section 612.5 What indicators must a State use to report on teacher preparation program performance for purposes of the State report card? Indicators a State Must Use To Report on Teacher Preparation Programs in the State Report Card (34 CFR 612.5(a))

    Comments: Some commenters expressed support for the proposed indicators, believing they may push States to hold teacher preparation programs more accountable. Some commenters were generally supportive of the feedback loop where teacher candidate placement, retention, and elementary and secondary classroom student achievement results can be reported back to the programs and published so that the programs can improve.

    In general, many commenters opposed the use of the indicators of academic content knowledge and teaching skills in the SRC, stating that these indicators are arbitrary, and that there is no empirical evidence that connects the indicators to a quality teacher preparation program; that the proposed indicators have never been tested or evaluated to determine their workability; and that there is no consensus in research or among the teaching profession that the proposed performance indicators combine to accurately represent teacher preparation program quality. Other commenters opined that there is no evidence that the indicators selected actually represent program effectiveness, and further stated that no algorithm would accurately reflect program effectiveness and be able to connect those variables to a ranking system. Many commenters expressed concern about the proposed assessment system, stating that reliability and validity data are lacking. Some commenters indicated that reporting may not need to be annual since multi-year data are more reliable.

    Commenters also stated that valid conclusions about teacher preparation program quality cannot be drawn using data with questionable validity and with confounding factors that cannot be controlled at the national level to produce a national rating system for teacher preparation programs. Many other commenters stated that teacher performance cannot be equated with the performance of the students they teach and that there are additional factors that impact teacher preparation program effectiveness that have not been taken into account by the proposed regulations. We interpret other comments as expressing concern that use of the outcome indicators would not necessarily help to ensure that teachers are better prepared before entering the classroom.

    Commenters stated that there are many potential opportunities for measurement error in the outcome indicators and therefore the existing data do not support a large, fully scaled implementation of this accountability system. Commenters argued that the regulations extend an untested performance assessment into a high-stakes realm by determining eligibility for Federal student aid through assessing the effectiveness of each teacher preparation program. One commenter stated that, in proposing the regulations, the Department did not consider issues that increase measurement error, and thus decrease the validity of inferences that can be made about teacher quality. For example, students who graduate but do not find a teaching job because they have chosen to stay in a specific geographic location would essentially count against a school and its respective ranking. Several commenters suggested that we pilot the proposed system and assess its outcomes, using factors that are flexible and contextualized within a narrative, without high-stakes consequences until any issues in data collection are worked out.

    Discussion: We appreciate commenters' concerns about the validity and reliability of the individual indicators of academic content knowledge and teaching skills in the proposed regulations, as well as the relationship between these indicators and the level of performance of a teacher preparation program. However, we believe the commenters misunderstood the point we were making in the preamble to the NPRM about the basis for the proposed indicators. We were not asserting that rigorous research studies had necessarily demonstrated the proposed indicators—and particularly those for student learning outcomes, employment outcomes, employment outcomes in high-need schools, and survey outcomes—to be valid and reliable. Where we believe that such research shows one or more of the indicators to be valid and reliable, we have highlighted those findings in our response to the comment on that indicator. But our assertion in the preamble to the NPRM was that use of these indicators would produce information about the performance level of each teacher preparation program that, speaking broadly, is valid and reliable. We certainly did not say that these indicators were necessarily the only measures that would permit the State's identification of each program's level of performance to be appropriate. And in our discussion of public comments we have clarified that States are free to work with their consultative group (see § 612.4(c)) to establish other measures the State would use as well.

    In broad terms, validity here refers to the accuracy of these indicators in measuring what they are supposed to measure, i.e., that they collectively work to provide significant information about a teacher preparation program's level of performance. Again, in broad terms, reliability here refers to the extent to which these indicators collectively can be used to assess a program's level of performance and to yield consistent results.

    For reasons we explain below, we believe it is important that teacher preparation programs produce new teachers who positively impact student academic success, take jobs as teachers and stay in the profession at least three years, and feel confident about the training the programs have provided to them. This is what the three outcome indicators in our final regulations measure—and it is precisely what is missing from the criteria that States have reported in SRCs to date as the basis for assessing program performance.

    We do not believe that State conclusions about the performance levels of their teacher preparation programs can be valid or reliable if they focus, as State criteria have done to date, on the inputs a program offers, any more than an automobile manufacturer's assessment of the validity and reliability of its safety and performance testing makes sense if it does not pay attention to how the vehicles actually perform on the road.

    Our final regulations give States, working with their stakeholders, the responsibility for establishing procedures for ensuring that use of these indicators, and such other indicators of academic content knowledge and teaching skills and other criteria the State may establish, permits the State to reasonably identify (i.e., with reasonable validity and reliability) those teacher preparation programs that are low-performing or at-risk of being low-performing. We understand that, to do this, they will need to identify and implement procedures for generating relevant data on how each program reflects these measures and criteria, and for using those data to assess each program in terms of its differentiated levels of performance. But we have no doubt that States can do this in ways that are fair to entities that are operating good programs while at the same time are fair to prospective teachers, prospective employers, elementary and secondary school students and their parents, and the general public—all of whom rely on States to identify and address problems with low-performing or at-risk programs.

    We further note that, by defining novice teacher to include a three-year teaching period, which applies to the data collected for student learning outcomes and employment outcomes, the regulations will have States use data for these indicators of program performance over multiple years. Doing so will increase the reliability of the overall level of performance the State assigns to each program in at least two respects. First, it will decrease the chance that one aberrational year of performance or any given cohort of program graduates (or program participants in the case of alternative route teacher preparation programs) has a disproportionate effect on a program's performance. And second, it will decrease the chance that the level of performance a State reports for a program will be invalid or unreliable.
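
    To picture how pooling data across years can stabilize these indicators, the following sketch is purely illustrative: the numbers, field names, and retention measure are hypothetical and are not drawn from the regulations. It pools three cohort years of a program's retention data before computing a single program-level rate, so that one aberrational year does not dominate the result.

# Purely illustrative sketch: pooling three cohort years of hypothetical
# novice-teacher retention data before computing a program-level rate,
# so that a single aberrational year does not dominate the result.

def pooled_rate(yearly_counts):
    """yearly_counts: list of (retained, placed) pairs, one pair per year."""
    retained = sum(r for r, _ in yearly_counts)
    placed = sum(p for _, p in yearly_counts)
    return retained / placed if placed else None

# Hypothetical data for one program; the middle year is aberrational.
three_years = [(18, 25), (9, 30), (22, 27)]
print([round(r / p, 2) for r, p in three_years])  # [0.72, 0.3, 0.81]
print(round(pooled_rate(three_years), 2))         # 0.6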

    We stress, however, that the student learning outcomes, employment outcomes, and survey outcomes that the regulations require States to use as indicators of academic content and teaching skills are not simply measures that logically are important to assessing a program's true level of performance. Rather, as we discuss below, we believe that these measures are also workable, based on research, and reflective of the direction in which many States and programs are going, even if not reflecting an outright consensus of all teacher preparation programs.

    In this regard, we disagree with the commenters' assertions that these measures are arbitrary, lack evidence of support, and have not been tested. The Department's decision to require use of these measures as indicators of academic content knowledge and teaching skills is reinforced by the adoption of similar indicators by CAEP,15 which reviews over half of the Nation's teacher preparation programs—and by the States of North Carolina, Tennessee, Ohio, and Louisiana, which already report annually on indicators of teacher preparation program performance based on data from State assessments. The recent GAO report determined that more than half the States already utilize data on program completers' effectiveness (such as surveys, placement rates, and teacher evaluation results) in assessing programs, with at least ten more planning to do so.16 These measures also reflect input received from many non-Federal negotiators during negotiated rulemaking. Taken together, we believe that the adoption of these measures of academic content knowledge and teaching skills reflects the direction in which the field is moving, and the current use of similar indicators by several SEAs demonstrates their feasibility.

    15 CAEP 2013 Accreditation Standards, Standard 4, Indicator 4. (2013). Retrieved from http://caepnet.org/standards/introduction. Amended by the CAEP Board of Directors February 13, 2015.

    16 GAO at 13-14.

    We acknowledge that many factors account for the variation in a teacher's impact on student learning. However, we strongly believe that a principal function of any teacher preparation program is to train teachers to promote the academic growth of all students regardless of their personal and family circumstances, and that the indicators whose use the regulations prescribe are already being used to help measure programs' success in doing so. For example, Tennessee employs some of the outcome measures that the regulations require, and reports that some teacher preparation programs consistently produce teachers with statistically significant student learning outcomes over multiple years.17 Delaware also collects and reports data on the performance and effectiveness of program graduates by student achievement and reports differentiated student learning outcomes by teacher preparation program.18 Studies of programs in Washington State 19 and New York City,20 as well as data from the University of North Carolina system,21 also demonstrate that graduates of different teacher preparation programs show statistically significant differences in value-added scores. The same kinds of data from Tennessee and North Carolina show large differences in teacher placement and retention rates among programs. In Ohio 22 and North Carolina, survey data also demonstrate that, on average, graduates of teacher preparation programs can have large differences in opinions of the quality of their preparation for the classroom. And a separate study of North Carolina teacher preparation programs found statistically significant correlations between programs that collect outcomes data on graduates and their graduates' value-added scores.23 These results reinforce that teacher preparation programs play an important role in teacher effectiveness, and so give prospective students and employers important information about which teacher preparation programs most consistently produce teachers who can best promote student academic achievement.

    17 See Report Card on the Effectiveness of Teacher Training Programs, Tennessee 2014 Report Card. (n.d.). Retrieved from www.tn.gov/thec/article/report-card.

    18 See 2015 Delaware Educator Preparation Program Reports. (n.d.). Retrieved June 27, 2016 from www.doe.k12.de.us/domain/398.

    19 Goldhaber, D., & Liddle, S. (2013). The Gateway to the Profession: Assessing Teacher Preparation Programs Based on Student Achievement. Economics of Education Review, 34, 29-44.

    20 Boyd, D., Grossman, P., Lankford, H., Loeb, S., & Wyckoff, J. (2009). Teacher Preparation and Student Achievement. Education Evaluation and Policy Analysis, 31(4), 416-440.

    21 See UNC Educator Quality Dashboard. (n.d.). Retrieved from http://tqdashboard.northcarolina.edu/performance-employment/.

    22 See, for example: 2013 Educator Preparation Performance Report Adolescence to Young Adult (7-12) Integrated Mathematics Ohio State University. Retrieved from http://regents.ohio.gov/educator-accountability/performance-report/2013/OhioStateUniversity/OHSU_IntegratedMathematics.pdf.

    23 Henry, G., & Bastian, K. (2015). Measuring Up: The National Council on Teacher Quality's Ratings of Teacher Preparation Programs and Measures of Teacher Performance.

    While we acknowledge that some studies of teacher preparation programs 24 find very small differences at the program level in graduates' average effect on student outcomes, we believe that the examples we have cited above provide a reasonable basis for States' use of student learning outcomes weighted in ways that they have determined best reflect the importance of this indicator. In addition, we believe the data will help programs develop insights into how they can more consistently generate high-performing graduates.

    24 For example: C. Koedel, E. Parsons, M. Podgursky, & M. Ehlert (2015). “Teacher Preparation Programs and Teacher Quality: Are There Real Differences Across Programs?” Education Finance and Policy, 10(4): 508-534; P. von Hippel, L. Bellows, C. Osborne, J. Arnold Lincove, & N. Mills (2014). “Teacher Quality Differences Between Teacher Preparation Programs: How Big? How Reliable? Which Programs Are Different?” Retrieved from Social Science Research Network, http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2506935.

    We have found little research one way or the other that directly ties the performance of teacher preparation programs to employment outcomes and survey outcomes. However, we believe that these other measures—program graduates' and alternative route program participants' employment as teachers, retention in the profession, and perceptions (along with those of their employers) of how well their programs have trained them for the classroom—strongly complement use of student learning outcomes in that they help to complete the picture of how well programs have really trained teachers to take on and maintain their teaching responsibilities.

    We understand that research into how best to evaluate both teacher effectiveness and the quality of teacher preparation programs continues. To accommodate future developments in research that improve a State's ability to measure program quality, as well as State perspectives on how the performance of teacher preparation programs should best be measured, the regulations allow a State to include other indicators of academic content knowledge and teaching skills that measure teachers' effects on student performance (see § 612.5(b)). In addition, given their importance, while we strongly encourage States to provide significant weight in particular to the student learning outcomes and retention rate outcomes in high-need schools in their procedures for assessing program performance, the Department has eliminated the proposed requirements in § 612.4(b)(1) that States consider these measures “in significant part.” The change confirms States' ability to determine how to weight each of these indicators to reflect their own understanding of how best to assess program performance and address any concerns with measurement error. Moreover, the regulations offer States a pilot year, corresponding to the 2017-18 reporting year (for data States are to report in SRCs by October 31, 2018), in which to address and correct for any issues with data collection, measurement error, validity, or reliability in their reported data.

    Use of these indicators themselves, of course, does not ensure that novice teachers are prepared to enter the classroom. However, we believe that the regulations, including the requirement for public reporting on each indicator and criterion a State uses to assess a program's level of performance, provide strong incentives for teacher preparation programs to use the feedback from these measures to ensure that the novice teachers they train are ready to take on their teaching responsibilities when they enter the classroom.

    We continue to stress that the data on program performance that States report in their SRCs do not create and are not designed to promote any kind of a national, in-State, or interstate rating system for teacher preparation programs, and caution the public against using reported data in this way. Rather, States will use reported data to evaluate program quality based on the indicators of academic content knowledge and teaching skills and other criteria of program performance that they decide to use for this purpose. Of course, the Department and the public at large will use the reported information to gain confidence in State decisions about which programs are low-performing and at-risk of being low-performing (and are at any other performance level the State establishes) and the process and data States use to make these decisions.

    Changes: None.

    Comments: Commenters stated that it is not feasible to collect and report student learning outcomes or survey data separately by credential program for science, technology, engineering, and mathematics (STEM) programs in a meaningful way when only one science test is administered, and teacher preparation program graduates teach two or more science disciplines with job placements in at least two fields.

    Discussion: We interpret these comments to be about teacher preparation programs that train teachers to teach STEM subjects. We also interpret these comments to mean that certain conditions—including the placement or retention of recent graduates in more than one field, having only one statewide science assessment at the high school level, and perhaps program size—may complicate State data collection and reporting on the required indicators for preparation programs that produce STEM teachers.

    The regulations define the term “teacher of record” to clarify that teacher preparation programs will be assessed on the aggregate outcomes of novice teachers who are assigned the lead responsibility for a student's learning in the subject area. In this way, although they may generate more data for the student learning outcomes measure, novice teachers who are teachers of record for more than one subject area are treated the same as those who teach in only one subject area.

    We do not understand why a science teacher whose district administers only one examination is in a different position than a teacher of any other subject. More important, science is not yet a tested grade or subject under section 1111(b)(2) of the ESEA, as amended by ESSA. Therefore, for the purposes of generating data on a program's student learning outcomes, States that use the definition of “student growth” in § 612.2 will determine student growth for teacher preparation programs that train science teachers through use of measures of student learning and performance that are rigorous, comparable across schools, and consistent with State guidelines. These might include student results on pre-tests and end-of-course tests, objective performance-based assessments, and student learning objectives.

    To the extent that the comments refer to small programs that train STEM teachers, the commenters did not indicate why our proposed procedures for reporting data and levels of performance for small teacher preparation programs did not adequately address their concerns. For reasons we discussed in response to comments on aggregating and then reporting data for small teacher preparation programs (§ 612.4(b)(3)(ii)), we believe the procedures the regulations establish for reporting performance of small programs adequately address concerns about program size.

    Changes: None.

    Comments: Commenters noted that the transition to new State assessments may affect reporting on student learning outcomes and stated that the proposed regulations fail to indicate when and how States must use the results of State assessments during such a transition for the purpose of evaluating teacher preparation program quality.

    Discussion: For various reasons, one or more States are often transitioning to new State assessments, and this is likely to continue as States implement section 1111(b)(2) of the ESEA, as amended by ESSA. Therefore, transitioning to new State assessments should not impact a State's ability to use data from these assessments as a measure of student learning outcomes, since there are valid statistical methods for determining student growth even during these periods of transition. However, how this should occur is best left to each State that is going through such a transition, just as it is best to leave to each State whether to use another State-determined measure relevant to calculating student learning outcomes as permitted by § 612.5(a)(1)(ii)(C) instead.

    Changes: None.

    Comments: Commenters recommended that the student learning outcomes indicator take into account whether a student with disabilities uses accommodations, and who is providing the accommodation. Another commenter was especially concerned about special education teachers' individualized progress monitoring plans created to evaluate a student's progress on individualized learning outcomes. The commenter noted that current research cautions against aggregation of student data gathered with these tools for the purposes of teacher evaluation.

    Discussion: Under the regulations, outcome data is reported on “teachers of record,” defined as teachers (including a teacher in a co-teaching assignment) who have been assigned the lead responsibility for a student's learning in a subject or course section. The teacher of record for a class that includes students with disabilities who require accommodations is responsible for the learning of those students, which may include ensuring the proper accommodations are provided. We decline to require, as data to be reported as part of the indicator, the number of students with disabilities requiring special accommodations because we assume that the LEA will meet its responsibilities to provide needed accommodations, and out of consideration for the additional reporting burden the proposal would place on States. However, States are free to adopt this recommendation if they choose to do so.

    In terms of gathering data about the learning outcomes for students with disabilities, the regulations do not require the teacher of record to use special education teachers' individualized monitoring plans to document student learning outcomes but rather expect teachers to identify, based on the unique needs of the students with disabilities, the appropriate data source. However, we stress that this issue highlights the importance of consultation with key stakeholders, like parents of and advocates for students with disabilities, as States determine how to calculate their student learning outcomes.

    Changes: None.

    Comments: Commenters recommended that the regulations establish the use of other or additional indicators, including the new teacher performance assessment edTPA, measures suggested by the Higher Education Task Force on Teacher Preparation, and standardized observations of teachers in the classroom. Some commenters contended that a teacher's effectiveness can only be measured by mentor teachers and university field instructors. Other commenters recommended applying more weight to some indicators, such as students' evaluations of their teachers, or increasing emphasis on other indicators, such as teachers' scores on their licensure tests.

    Discussion: We believe that the indicators of academic content knowledge and teaching skills that the regulations require States to use in assessing a program's performance (i.e., student learning outcomes, employment outcomes, survey outcomes, and information about basic aspects of the program) are the most important such indicators in that, by focusing on a few key areas, they provide direct information about whether the program is meeting its basic purposes. We decline to require that States use additional or other indicators like those suggested because we strongly believe they are less direct measures of academic content knowledge and teaching skills that would also add significant cost and complexity. However, we note that if district evaluations of novice teachers use multiple valid measures in determining performance levels that include, among other things, data on student growth for all students, they are “teacher evaluation measures” under § 612.2. Therefore, § 612.5(a)(1)(ii) permits the State to use and report the results of those evaluations as student learning outcomes.

    Moreover, under § 612.5(b), in assessing the performance of each teacher preparation program, a State may use additional indicators of academic content and teaching skills of its choosing, provided the State uses a consistent approach for all of its teacher preparation programs and these additional indicators provide information on how the graduates produced by the program perform in the classroom. In consultation with their stakeholder groups, States may wish to use additional indicators, such as edTPA, teacher classroom observations, or student survey results, to assess teacher preparation program performance.

    As we addressed in our discussion of comment on § 612.4(b)(2)(ii) (Weighting of Indicators), we encourage States to give significant weight to student learning outcomes and employment outcomes in high-need schools. However, we have removed from the final regulations any requirement that States give special weight to these or other indicators of academic content knowledge and teaching skills. Thus, while States must include in their SRCs the weights they give to each indicator and any other criteria they use to identify a program's level of performance, each State has full authority to determine the weighting it gives to each indicator or criterion.

    Changes: None.

    Comments: Some commenters expressed concerns that the regulations permit the exclusion of some program graduates (e.g., those leaving the State or taking jobs in private schools), thus providing an incomplete representation of program performance. In particular, commenters recommended using measures that capture the academic content knowledge and teaching skills of all recent graduates, such as State licensure test scores, portfolio assessments, student and parent surveys, performance on the edTPA, and the rate at which graduates retake licensure assessments (as opposed to pass rates).

    Discussion: While the three outcome-based measures required by the regulations assess the performance of program graduates who become novice teachers, the requirement in § 612.5(a)(4) for an indication of either a program's specialized accreditation or that it provides certain minimum characteristics examines performance based on multiple input-based measures that apply to all program participants, including those who do not become novice teachers. States are not required to also assess teacher preparation programs on the basis of any of the additional factors that commenters suggest, i.e., State licensure test scores, portfolio assessments, student and parent surveys, performance on the edTPA, and the rate at which graduates retake licensure assessments. However, we note that IHEs must continue to include information in their IRCs on the pass rates of a program's students on assessments required for State certification. Furthermore, in consultation with their stakeholders, States may choose to use the data and other factors commenters recommend to help determine a program's level of performance.

    Changes: None.

    Comments: One commenter recommended that the Department fund a comprehensive five-year pilot of a variety of measures for assessing the range of K-12 student outcomes associated with teacher preparation.

    Discussion: Committing funds for research is outside the scope of the regulations. We note that the Institute of Education Sciences and other research organizations are conducting research on teacher preparation programs that the Department believes will inform advances in the field.

    Changes: None.

    Comments: Some commenters stated that a teacher preparation program's cost of attendance and the average starting salary of the novice teachers produced by the program should be included as mandatory indicators for program ratings because these two factors, along with student outcomes, would better allow stakeholders to understand the costs and benefits of a specific teacher preparation program.

    Discussion: Section 205(b)(1)(F) of the HEA requires each State to identify in its SRC the criteria it is using to identify the performance of each teacher preparation program within the State, including its indicators of the academic knowledge and teaching skills of the program's students. The regulations define these indicators to include four measures that States must use.

    While we agree that information that helps prospective students identify programs that offer a good value is important, the purpose of sections 205(b)(1)(F) and 207(a) of the HEA, and thus our regulations, is to have States identify and report on meaningful criteria that they use to identify a program's level of performance—and specifically whether the program is low-performing or at-risk of being low-performing. While we encourage States to find ways to make information on a program's costs available to the public, we do not believe the information is sufficiently related to a program's level of performance to warrant the additional costs of requiring States to report it. For similar reasons, we decline to add this consumer information to the SRC as additional data States need to report independent of its use in assessing the program's level of performance.

    Changes: None.

    Comments: Multiple commenters stated that the teacher preparation system in the United States should mirror that of other countries and broaden the definition of classroom readiness. These commenters stated that teacher preparation programs should address readiness within a more holistic, developmental, and collective framework. Others stated that the teacher preparation system should emphasize experiential and community service styles of teaching and learning to increase student engagement.

    Discussion: While we appreciate commenters' suggestions that teacher preparation programs should be evaluated using holistic measures similar to those used by other countries, we decline to include these kinds of criteria because we believe that the ability to influence student growth and achievement is the most direct measure of academic knowledge and teaching skills. However, the regulations permit States to include indicators like those recommended by the commenters in their criteria for assessing program performance.

    Changes: None.

    Comments: Commenters noted that post-graduation professional development impacts a teacher's job performance in that there may be a difference between teachers who continue to learn during their early teaching years compared to those who do not, but that the proposed regulations did not take this factor into account.

    Discussion: By requiring the use of data from the first, second, and third year of teaching, the student learning outcomes measure captures improvements in the impact of teachers on student learning made over the first three years of teaching. To the extent that professional development received in the first three years of teaching contributes to a teacher's impact on student learning, the student learning outcomes measure may reflect it.

    The commenters may be suggesting that student learning outcomes of novice teachers are partially the consequence of the professional development they receive, yet the proposed regulations seem to attribute student learning outcomes to only the teacher preparation program. The preparation that novice teachers receive in their teacher preparation programs, of course, is not the only factor that influences student learning outcomes. But for reasons we have stated, the failure of recent graduates as a whole to demonstrate positive student learning outcomes is an indicator that something in the teacher preparation program is not working. We recognize that novice teachers receive various forms of professional development, but believe that high-quality teacher preparation programs produce graduates who have the knowledge and skills they need to earn positive reviews and stay in the classroom regardless of the type of training they receive on the job.

    Changes: None.

    Comments: Commenters were concerned that the proposed regulations would pressure States to rate some programs as low-performing even if all programs in a State are performing adequately. Commenters noted that the regulations need to ensure that programs are all rated on their own merits, rather than ranked against one another—i.e., criterion-referenced rather than norm-referenced. The commenters contended that, otherwise, programs would compete against one another rather than work together to continually improve the quality of novice teachers. Commenters stated that such competition could lead to further isolation of programs rather than fostering the collaboration necessary for addressing shortages in high-need fields.

    Some commenters stated that although there can be differences in traditional and alternative route programs that make comparison difficult, political forces that are pro- or anti-alternative route programs can attempt to make certain types of programs look better or worse. Further, commenters noted that it will be difficult for the Department to enforce equivalent levels of accountability and reporting when differences exist across States' indicators and relative weighting decisions.

    Another commenter recommended that, to provide context, programs and States should also report raw numbers in addition to rates for these metrics.

    Discussion: We interpret the comment on low-performing programs to argue that these regulations might be viewed as requiring a State to rate a certain number of programs as low-performing regardless of their performance. Section 207(a) of the HEA requires that States provide in the SRCs an annual list of low-performing teacher preparation programs and identify those programs that are at risk of being put on the list of low-performing programs. While the regulations require States to establish at least three performance categories (those two and all other programs, which would therefore be considered effective or higher), we encourage States also to differentiate between teacher preparation programs whose performance is satisfactory and those whose performance is truly exceptional. We believe that recognizing, and where possible rewarding (see § 612.4(c)(1)(ii)(C)), excellence will help other programs learn from best practices and facilitate improvement across teacher preparation programs and entities. Actions like these will encourage collaboration, especially in preparing teachers to succeed in high-need areas.

    However, we stress that the Department has no expectation or desire that a State will designate a certain number or percentage of its programs as low-performing or at-risk of being low-performing. Rather, we want States to do what our regulations provide: Assess the level of performance of each teacher preparation program based on what they determine to be differentiated levels of performance, and report in the SRCs (1) the data they secure about each program based on the indicators and other criteria they use to assess program performance, (2) the weighting of these data to generate the program's level of performance, and (3) a list of programs it found to be low-performing or at-risk of being low-performing. Beyond this, these regulations do not create, and are not designed to promote, an in-State or inter-State ranking system, or to rank traditional versus alternative route programs based on the reported data.

    We acknowledge that if they choose, States may employ growth measures specifically based on a relative distribution of teacher scores statewide, which could constitute a “norm-referenced” indicator. While these statewide scores may not improve on the whole, an individual teacher preparation program's performance can still show improvement (or declines) relative to average teacher performance in the State. The Department notes that programs are evaluated on multiple measures of program quality and the other required indicators can be criterion-referenced. For example, a State may set a specific threshold for retention rate or employer satisfaction that a program must meet to be rated as effective. Additionally, States may decide to compare any norm-referenced student learning outcomes, and other indicators, to those of teachers prepared out of State to determine relative improvement of teacher preparation programs as a whole.25 But whether or not to take steps like these is purely a State decision.

    25 See, for example, UNC Educator Quality Dashboard. (n.d.). Retrieved from http://tqdashboard.northcarolina.edu/performance-employment/.
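
    As one way to picture the criterion-referenced approach described above, the following sketch is purely illustrative: the thresholds and measure names are hypothetical and are not drawn from the regulations. It rates a program against fixed thresholds that a State might set in consultation with its stakeholders, rather than by ranking the program against other programs.

# Purely illustrative sketch: a criterion-referenced check against
# hypothetical thresholds; no ranking of programs against one another.

THRESHOLDS = {
    "retention_rate": 0.80,         # hypothetical floor for three-year retention
    "employer_satisfaction": 0.75,  # hypothetical floor for employer survey results
}

def meets_all_thresholds(program_metrics):
    """Return True only if every required measure meets its threshold."""
    return all(program_metrics.get(name, 0.0) >= floor
               for name, floor in THRESHOLDS.items())

program = {"retention_rate": 0.84, "employer_satisfaction": 0.71}
print(meets_all_thresholds(program))  # False: employer satisfaction falls short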

    With respect to the recommendation that report cards include raw numbers as well as rates attributable to the indicators and other criteria used to assess program performance, § 612.4(b)(2)(i) requires the State to report data relative to each indicator identified in § 612.5. Section V of the instructions for the SRC asks for the numbers and percentages used in the calculation of the indicators of academic content knowledge and teaching skills and any other indicators and criteria a State uses.

    Changes: None.

    Comments: Commenters contended that the proposed regulations do not specifically address the skills enumerated in the definition of “teaching skills.”

    Discussion: The commenters are correct that the regulations do not specifically address the various “teaching skills” identified in the definition of the term in section 200(23) of the HEA. However, we strongly believe that they do not need to do so.

    The regulations require States to use four indicators of academic content knowledge and teaching skills—student learning outcomes, employment outcomes, survey results, and minimum program characteristics—in assessing the level of a teacher preparation program's performance under sections 205(b)(1)(F) and 207(a) of the HEA. In establishing these indicators, we are mindful of the definition of “teaching skills” in section 200(23) of the HEA, which includes skills that enable a teacher to increase student learning, achievement, and the ability to apply knowledge, and to effectively convey and explain academic subject matter. In both the NPRM and the discussion of our response to comment on § 612.5(a)(1)-(4), we explain why each of the four measures is, in fact, a reasonable indicator of whether teachers have academic content knowledge and teaching skills. We see no reason the regulations need either to enumerate the definition of teaching skills in section 200(23) or to expressly tie these indicators to the statutory definition of one term included in “academic content knowledge and teaching skills.”

    Changes: None.

    Comments: Some commenters stated that the use of a rating system with associated consequences is a “test and punish” accountability model similar to the K-12 accountability system under the ESEA, as amended by the No Child Left Behind Act (NCLB). They contended that such a system limits innovation and growth within academia and denies the importance of capacity building.

    Discussion: We do not believe that the requirements the regulations establish for the title II reporting system are punitive. The existing HEA title II reporting framework has not provided useful feedback to teacher preparation programs, prospective teachers, other stakeholders, or the public on program performance. Until now, States have identified few programs deserving of recognition or remediation. This is because few of the criteria that States have to date reported using to assess program performance under section 205(b)(1)(F) of the HEA rely on information that examines program quality from the most critical perspective—teachers' ability to impact student achievement once they begin teaching. Given the importance of academic knowledge and teaching skills, we are confident that the associated indicators in the regulations will help provide more meaningful information about the quality of these programs, which will then facilitate self-improvement and, by extension, production of novice teachers better trained to help students achieve once they enter the classroom.

    Thus, the regulations address shortcomings in the current State reporting system by defining indicators of academic content knowledge and teaching skills, focusing on program outcomes that States will use to assess program performance. The regulations build on current State systems and create a much-needed feedback loop to facilitate program improvement and provide valuable information to prospective teachers, potential employers, the general public, and the programs themselves. We agree that program innovation and capacity building are worthwhile, and we believe that what States will report on each program will encourage these efforts.

    Under the regulations, teacher preparation programs whose graduates (or participants, if they are teachers while being trained in an alternative route program) do not demonstrate positive student learning outcomes are not punished, nor are States required to punish programs. To the extent that proposed § 612.4(b)(2), which would have permitted a program to be considered effective or higher only if the teachers it produces demonstrate satisfactory or higher student learning outcomes, raised concerns about the regulations seeming punitive, we have removed that provision from the final regulations. Thus, the regulations echo the requirements of section 207(a) of the HEA, which requires that States annually identify teacher preparation programs that are low-performing or that are at-risk of becoming low-performing, and section 207(b) of the HEA, which prescribes the consequences for a program from which the State has withdrawn its approval or terminated its financial support. For a discussion of the relationship between the State classification of teacher preparation programs and TEACH Grant eligibility, see § 686.2 regarding a TEACH Grant-eligible program.

    Changes: None.

    Comments: None.

    Discussion: In removing the term “new teacher” and adding the term “novice teacher,” as discussed earlier in this document, it became unclear for what period of time a State must report data related to those teachers. To resolve this, we have clarified that a State may, at its discretion, exclude from reporting those individuals who have not become novice teachers after three years of becoming a “recent graduate,” as defined in the regulations. We believe that requiring States to report on individuals who become novice teachers more than three years after those teachers graduated from a teacher preparation program is overly burdensome and would not provide an accurate reflection of teacher preparation program quality.

    Changes: We have added § 612.5(c) to clarify that States may exclude from reporting under § 612.5(a)(1)-(3) individuals who have not become novice teachers after three years of becoming recent graduates.

    Student Learning Outcomes (34 CFR 612.5(a)(1)) Growth, VAM, and Other Methodological Concerns

    Comments: Many commenters argued that the proposed definition of “student learning outcomes” invites States to use VAM to judge teachers and teacher preparation programs. Those commenters argued that because the efficacy of VAM is not established, the definition of “student learning outcomes” is not solidly grounded in research.

    Discussion: The final regulations permit States that choose to do so to use any measure of student growth for novice teachers that meets the definitions in § 612.2 in reporting on a program's student learning outcomes. Their options include a simple comparison of student scores on assessments between two points in time for grades and subjects tested under section 1111(b)(2) of the ESEA, as amended by ESSA; a range of options measuring student learning and performance for non-tested grades and subjects (which can also be used to supplement scores for tested grades and subjects); or more complex statistical measures, like student growth percentiles (SGPs) or VAM, that control for observable student characteristics. A detailed discussion of the use of VAM as a specific growth measure follows below; the discussion addresses the use of VAM in student learning outcomes, should States choose to use it. However, we also note that the requirement for States to assess teacher preparation programs based, in part, on student learning outcomes also allows States that choose not to use student growth to use a teacher evaluation measure or another State-determined measure relevant to calculating student learning outcomes. Nothing in the final regulations requires the use of VAM over other methodologies for calculating student growth, specifically, or student learning outcomes, more broadly.
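
    The simplest of these options can be pictured with the following sketch, which is purely illustrative: the data, field names, and aggregation are hypothetical and are not drawn from the regulations. It compares each student's scores at two points in time and averages the gains of each novice teacher's students by preparation program; VAM or SGP models would add statistical controls that are not shown here.

# Purely illustrative sketch: a simple two-point growth comparison,
# averaged by the preparation program that trained each novice teacher.
from collections import defaultdict
from statistics import mean

# Hypothetical records: (program, teacher, pre-test score, post-test score)
records = [
    ("Program A", "teacher_1", 41, 55),
    ("Program A", "teacher_2", 60, 66),
    ("Program B", "teacher_3", 48, 51),
    ("Program B", "teacher_3", 52, 63),
]

gains_by_program = defaultdict(list)
for program, _teacher, pre, post in records:
    gains_by_program[program].append(post - pre)

for program, gains in sorted(gains_by_program.items()):
    print(program, round(mean(gains), 1))  # Program A 10.0, Program B 7.0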

    These comments also led us to see potential confusion in the proposed definitions of student learning outcomes and student growth. In reviewing the proposed regulations, we recognized that the original structure of the definition of “student learning outcomes” could cause confusion. We are concerned that having a definition for the term, which was intended only to operationalize the other definitions in the context of § 612.5, was not the clearest way to present the requirements. To clarify how student learning outcomes are considered under the regulations, we have removed the definition of “student learning outcomes” from § 612.2, and revised § 612.5(a)(1) to incorporate, and operationalize, that definition.

    Changes: We have removed the definition of “student learning outcomes” and revised § 612.5(a)(1) to incorporate key aspects of that proposed definition. In addition, we have provided States with the option to determine student learning outcomes using another State-determined measure relevant to calculating student learning outcomes.

    Comments: Many commenters stated that the proposed student learning outcomes would not adequately serve as an indicator of academic content knowledge and teaching skills for the purpose of assessing teacher preparation program performance. Commenters also contended that tests only measure the ability to memorize and that several kinds of intelligence and ways of learning cannot be measured by testing.

    In general, commenters questioned the Department's basis for the use of student learning outcomes as one measure of teacher preparation program performance, citing research to support their claim that the method of measuring student learning outcomes as proposed in the regulations is neither valid nor reliable, and that there is no evidence to support the idea that student outcomes are related to the quality of the teacher preparation program attended by the teacher. Commenters further expressed concerns about the emphasis on linking children's test scores on mandated standardized tests to student learning outcomes. Commenters also stated that teacher preparation programs are responsible for only a small portion of the variation in teacher quality.

    Commenters proposed that aggregate teacher evaluation results be the only measure of student learning outcomes so long as the State teacher evaluations do not overly rely on results from standardized tests. Commenters stated that in at least one State, teacher evaluations cannot be used as part of teacher licensure decisions or to reappoint teachers due to the subjective nature of the evaluations.

    Some commenters argued that student growth cannot be defined as a simple comparison of achievement between two points in time.

    One commenter, who stated that the proposed regulatory approach is thorough and aligned with current trends in evaluation, also expressed concern that K-12 student performance (achievement) data are generally a snapshot in time, typically the result of one standardized test, that does not identify growth over time, the context of the test taking, or other variables that impact student learning.

    Commenters further cited research that concluded that student achievement in the classroom is not a valid predictor of whether the teacher's preparation program was high quality and asserted that other professions do not use data in such a simplistic way.

    Another commenter stated that local teacher evaluation instruments vary significantly across towns and States.

    Another commenter stated that student performance data reported in the aggregate and by subgroups to determine trends and areas for improvement is acceptable but should not be used to label or categorize a school system, school, or classroom teacher.

    Discussion: As discussed above, in the final regulations we have removed the requirement that States consider student growth “in significant part,” in their procedures for annually assessing teacher preparation program performance. Therefore, while we encourage States to use student growth as their measure of student learning outcomes and to adopt such a weighting of student learning outcomes on their own, our regulations give States broad flexibility to decide how to weight student learning outcomes in consultation with stakeholders (see § 612.4(c)), with the aim of it being a sound and reasonable indicator of teacher preparation program performance. Similarly, we decline commenters' suggestions to restrict the measure of student learning outcomes to only aggregated teacher evaluation results, in order to maintain that flexibility. With our decision to permit States to use their own State-determined measure relevant to calculating student learning outcomes rather than student growth or a teacher evaluation measure, we have provided even more State flexibility in calculating student learning outcomes than commenters had requested.

    As we have previously stated, we intend the use of all indicators of academic content knowledge and teaching skills to produce information about the performance-level of each teacher preparation program that, speaking broadly, is valid and reliable. It is clear from the comments we received that there is not an outright consensus on using student learning outcomes to help measure teacher preparation program performance; however, we strongly believe that a program's ability to prepare teachers who can positively influence student academic achievement is both an indicator of their academic content knowledge and teaching skills, and a critical measure for assessing a teacher preparation program's performance. Student learning outcomes therefore belong among multiple measures States must use. We continue to highlight growth as a particularly appropriate way to measure a teacher's effect on student learning because it takes a student's prior achievement into account, gives a teacher an opportunity to demonstrate success regardless of the student characteristics of the class, and therefore reflects the contribution of the teacher to student learning. Even where student growth is not used, producing teachers who can make a positive contribution to student learning should be a fundamental objective of any teacher preparation program and the reason why it should work to provide prospective teachers with academic content and teaching skills. Hence, student learning outcomes, as we define them in the regulations, associated with each teacher preparation program are an important part of an assessment of any program's performance.

    States therefore need to collect data on student learning outcomes—through either student growth that examines the change in student achievement in both tested and non-tested grades and subjects, a teacher evaluation measure as defined in the regulations, or another State-determined measure relevant to calculating student learning outcomes—and then link these data to the teacher preparation program that produced (or in the case of an alternative route program, is producing) these teachers.

    In so doing, States may if they wish choose to use statistical measures of growth, like VAM or student growth percentiles, that control for student demographics that are typically associated with student achievement. There are multiple examples of the use of similar student learning outcomes in existing research and State reporting. Tennessee, for example, reports that some teacher preparation programs consistently exhibit statistically significant differences in student learning outcomes over multiple years, indicating that scores are reliable from one year to the next.26 Studies from Washington State 27 and New York City 28 also find statistically significant differences in the student learning outcomes of teachers from different teacher preparation programs as does the University of North Carolina in how it assesses its own teacher preparation programs.29 Moreover, a teacher's effect on student growth is commonly used in education research and evaluation studies conducted by the Institute of Education Sciences as a valid measure of the effectiveness of other aspects of teacher training, like induction or professional development.30

    26 See Report Card on the Effectiveness of Teacher Training Programs, Tennessee 2014 Report Card. (n.d.). Retrieved from www.tn.gov/thec/article/report-card.

    27 D. Goldhaber & S. Liddle (2013). “The Gateway to the Profession: Assessing Teacher Preparation Programs Based on Student Achievement.” Economics of Education Review, 34: 29-44.

    28 D. Boyd, P. Grossman, H. Lankford, S. Loeb, & J. Wyckoff. (2009). Teacher Preparation and Student Achievement. Educational Evaluation and Policy Analysis, 31(4), 416-440.

    29 See UNC Educator Quality Dashboard. (n.d.). Retrieved from http://tqdashboard.northcarolina.edu/performance-employment/.

    30 See for example, S. Glazerman, E. Isenberg, S. Dolfin, M. Bleeker, A. Johnson, M. Grider & M. Jacobus. (2010). Impacts of comprehensive teacher induction: Final results from a randomized controlled study (NCEE 2010-4027). Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education.

    While some studies of teacher preparation programs 31 in other States have not found statistically significant differences at the preparation program level in graduates' effects on student outcomes, we believe that there are enough examples of statistically significant differences in program performance on student learning outcomes to justify their inclusion in the SRC. In addition, because even these studies show a wide range of individual teacher effectiveness within a program, using these data can provide new insights that can help programs to produce more consistently high-performing graduates.

    31 Koedel, C., Parsons, E., Podgursky, M., & Ehlert, M. (2015). Teacher Preparation Programs and Teacher Quality: Are There Real Differences Across Programs? Education Finance and Policy, 10(4), 508-534.

    Moreover, looking at the related issue of educator evaluations, there is debate about the level of reliability and validity of the individual elements used in different teacher evaluation systems. However, there is evidence that student growth can be a useful and effective component in teacher evaluation systems. For example, a study found that dismissal threats and financial incentives based partially upon growth scores positively influenced teacher performance.32 33 In addition, there is evidence that combining multiple measures, including student growth, into an overall evaluation result for a teacher can produce a more valid and reliable result than any one measure alone.34 For these reasons, this regulation and § 612.5(b) continue to give States the option of using teacher evaluation systems based on multiple measures that include student growth to satisfy the student learning outcomes requirement.

    32 Dee, T., & Wyckoff, J. (2015). Incentives, Selection, and Teacher Performance: Evidence from IMPACT. Journal of Policy Analysis and Management, 34(2), 267-297. doi:10.3386/w19529.

    33 Henry, G., & Bastian, K. (2015). Measuring Up: The National Council on Teacher Quality's Ratings of Teacher Preparation Programs and Measures of Teacher Performance.

    34 Mihaly, K., McCaffrey, D., Staiger, D., & Lockwood, J. (2013, January 8). A Composite Estimator of Effective Teaching.

    Teacher preparation programs may well account for only some of the variation in student learning outcomes. However, this does not absolve programs from being accountable for the extent to which their graduates positively impact student achievement. Thus, while the regulations are not intended to address the entire scope of student achievement or all factors that contribute to student learning outcomes, the regulations focus on student learning outcomes as an indicator of whether or not the program is performing properly. In doing so, one would expect that, through a greater focus on their student learning outcomes, States and teacher preparation programs will have the benefit of some basic data about where their work to provide all students with academic content knowledge and teaching skills needs to improve.

    Changes: None.

    Comments: Other commenters stated that there are many additional factors that can impact student learning outcomes that were not taken into account in the proposed regulations; that teacher evaluation is incomplete without taking into account the context in which teachers work on a daily basis; and that VAM only account for some contextual factors. Commenters stated that any proposed policies to directly link student test scores to teacher evaluation and teacher preparation programs must recognize that schools and classrooms are situated in a broader socioeconomic context.

    Commenters pointed out that not all graduates from a specific institution or program will be teaching in similar school contexts and that many factors influencing student achievement cannot be controlled for between testing intervals. Commenters also cited other contributing factors to test results that are not in a teacher's control, including poverty and poverty-related stress; inadequate access to health care; food insecurity; the student's development, family, home life, and community; the student's background knowledge; the available resources in the school district and classroom; school leadership; school curriculum; students not taking testing situations seriously; and school working conditions. Commenters also noted that students are not randomly placed into classrooms or schools and are often grouped by socioeconomic class and linguistic segregation, which influences test results.

    Discussion: Many commenters described unmeasured or poorly measured student and classroom characteristics that might bias the measurement of student outcomes and noted that students are not randomly assigned to teachers. These are valid concerns and many of the factors stated are correlated with student performance.

    However, teacher preparation programs should prepare novice teachers to be effective and successful in all classroom environments, including in high-need schools. It is for this reason, as well as to encourage States to highlight successes in these areas, that we include placement and retention rates in high-need schools as indicators of academic content knowledge and teaching skills.

    In addition, States and school districts can control for different kinds of student and classroom characteristics in the ways in which they determine student learning outcomes (and student growth). States can, for example, control for school-level characteristics like the concentration of low-income students in the school and in doing so compare teachers who teach in similar schools. Evidence cited below that student growth, as measured by well-designed statistical models, captures the causal effects of teachers on their students also suggests that measures of student growth can successfully mitigate much of the potential bias, and supports the conclusion that non-random sorting of students into classrooms does not cause substantial bias in student learning outcomes. We stress, however, that the decision to use such controls and other statistical measures to control for student and school characteristics in calculating student learning outcomes is up to States in consultation with their stakeholder groups.

    Changes: None.

    Comments: Commenters contended that although the proposed regulations offer States the option of using a teacher evaluation measure in lieu of, or in addition to, a student growth measure, this option does not provide a real alternative because it also requires that the three performance levels in the teacher evaluation measure include, as a significant factor, data on student growth, and student growth relies on student test scores. Also, while the regulations provide that evaluations need not rely on VAM, commenters suggested that VAM will drive teacher effectiveness determinations because student learning is assessed either through student growth (which includes the use of VAM) or teacher evaluation (which is based in large part on student growth), so there really is no realistic option besides VAM. Commenters also stated that VAM requirements in Race to the Top and ESEA flexibility, along with State-level legislative action, create a context in which districts are compelled to use VAM.

    A large number of commenters stated that research points to the challenges and ineffectiveness of using VAM to evaluate both teachers and teacher preparation programs, and asserted that the data collected will be neither meaningful nor useful. Commenters also stated that use of VAM for decision-making in education has been discredited by leading academic and professional organizations such as the American Statistical Association (ASA),35 the American Educational Research Association, and the National Academy of Education.36 37 Commenters provided research in support of their arguments, asserting in particular ASA's contention that VAM do not meet professional standards for validity and reliability when applied to teacher preparation programs. Commenters voiced concerns that VAM typically measure correlation and not causation, often citing the ASA's assertions. Commenters also contended that student outcomes have not been shown to be correlated with, much less predictive of, good teaching; that VAM scores and rankings can change substantially when a different model or test is used; and that variation among teachers accounts for a small part of the variation in student test scores. One commenter stated that student learning outcomes are not data but target skills and therefore the Department incorrectly defined “student learning outcomes.” We interpret this comment to mean that tests that may form the basis of student growth only measure certain skills rather than longer-term student outcomes.

    35 American Statistical Association. (2014). ASA Statement on Using Value-Added Models for Educational Assessment: www.amstat.org/policy/pdfs/ASA_VAM_Statement.pdf.

    36 American Educational Research Association (AERA) and National Academy of Education. (2011). Getting teacher evaluation right: A brief for policymakers. Washington, DC: AERA.

    37 Feuer, M. J., Floden, R. E., Chudowsky, N., & Ahn, J. (2013). Evaluation of Teacher Preparation Programs: Purposes, Methods, and Policy Options. Washington, DC: National Academy of Education.

    Many commenters also noted that value-added models of student achievement are developed and normed to test student achievement, not to evaluate educators, so using these models to evaluate educators is invalid because the tests have not been validated for that purpose. Commenters further noted that value-added models of student achievement tied to individual teachers should not be used for high-stakes, individual-level decisions or comparisons across highly dissimilar schools or student populations.

    Commenters stated that in psychometric terms, VAM are not reliable. They contended that it is a well-established principle that reliability is a necessary but not sufficient condition for validity. If judgments about a teacher preparation program vary based on the method of estimating value-added scores, inferences made about programs cannot be trusted.

    Others noted Edward Haertel's 38 conclusion that no statistical manipulation can assure fair comparisons of teachers working in very different schools, with very different students, under very different conditions. Commenters also noted Bruce Baker's conclusions that even a 20 percent weight to VAM scores can skew results too much. Thus, according to the commenters, though the proposed regulations permit States to define what is “significant” for the purposes of using student learning outcomes “in significant part,” unreliable and invalid VAM scores end up with at least a 20 percent weight in teacher evaluations.

    38 Haertel, E. (2013). Reliability and Validity of Inferences about Teachers Based on Student Test Scores. The 14th William H. Angoff Memorial Lecture, March 22. Princeton, NJ: Educational Testing Service. Retrieved from www.ets.org/Media/Research/pdf/PICANG14.pdf.

    Discussion: The proposed definition of teacher evaluation measure in § 612.2 did provide that student growth be considered in significant part, but we have removed that aspect of the definition of teacher evaluation measure from the final regulations. Moreover, we agree that use of such an evaluation system may have been required, for example, in order for a State to receive ESEA flexibility, and States may still choose to consider student growth in significant part in a teacher evaluation measure. However, not only are States not required to include growth “in significant part” in a teacher evaluation measure used for student learning outcomes, but § 612.5(a)(1)(ii) clarifies that States may choose to measure student learning outcomes without using student growth at all.

    On the use of VAM specifically, we reiterate that the regulations permit multiple ways of measuring student learning outcomes without use of VAM; if they use student growth, States are not required to use VAM. We note also that use of VAM was not a requirement of Race to the Top, nor was it a requirement of ESEA Flexibility, although many States that received Race to the Top funds or ESEA flexibility committed to using statistical models of student growth based on test scores. We also stress that in the context of these regulations, a State that chooses to use VAM and other statistical measures of student growth would use them to help assess the performance of teacher preparation programs as a whole. Neither the proposed nor final regulations address, as many commenters stated, how or whether a State or district might use the results of a statistical model for individual teachers' evaluations and any resulting personnel actions.

    Many States and districts currently use a variety of statistical methods in teacher, principal, and school evaluation, as well as in State accountability systems. VAM are one such way of measuring student learning outcomes that are used by many States and districts for these accountability purposes. While we stress that the regulations do not require or anticipate the use of VAM to calculate student learning outcomes or teacher evaluation measures, we offer the following summary of VAM in view of the significant number of comments the Department received on the subject.

    VAM are statistical methodologies developed by researchers to estimate a teacher's unique contribution to growth in student achievement, and are used in teacher evaluation and evaluation of teacher preparation programs. Several experimental and quasi-experimental studies conducted in a variety of districts have found that VAM scores can measure the causal impact teachers have on student learning.39 There is also strong evidence that VAM measure more than a teacher's ability to improve test scores; a recent paper found that teachers with higher VAM scores improved long-term student outcomes such as earnings and college enrollment.40 While tests often measure specific skills, these long-term effects show that measures of student growth are, in fact, measuring a teacher's effect on student outcomes rather than simple, rote memorization, test preparation on certain target skills, or a teacher's performance based solely on one specific student test. VAM have also been shown to consistently measure teacher quality over time and across different kinds of schools. A well-executed, randomized controlled trial found that, after the second year, elementary school students taught by teachers with high VAM scores who were induced to transfer to low-performing schools had higher reading and mathematics scores than students taught by comparison teachers in the same kinds of schools.41

    39 For example: Kane, T., & Staiger, D. (2008). Estimating teacher impacts on student achievement: An experimental evaluation. doi:10.3386/w14607; Kane, T., McCaffrey, D., Miller, T., & Staiger, D. (2013). Have We Identified Effective Teachers? Validating Measures of Effective Teaching Using Random Assignment; Bacher-Hicks, A., Kane, T., & Staiger, D. (2014). Validating Teacher Effect Estimates Using Changes in Teacher Assignment in Los Angeles (Working Paper No. 20657). Retrieved from National Bureau of Economic Research Web site: www.nber.org/papers/w20657; Chetty, et al. at 2633-2679 and 2593-2632.

    40 Chetty, et al. at 2633-2679.

    41 Glazerman, S., Protik, A., Teh, B., Bruch, J., & Max, J. (2013). Transfer incentives for high-performing teachers: Final results from a multisite randomized experiment (NCEE 2014-4003). Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education. http://files.eric.ed.gov/fulltext/ED544269.pdf.

    The Department therefore disagrees with commenters who state that the efficacy of VAM is not grounded in sound research. We believe that VAM is commonly used as a component in many teacher evaluation systems precisely because the method minimizes the influence of observable factors independent of the teacher that might affect student achievement growth, like student poverty levels and prior levels of achievement.
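    For readers less familiar with these methods, a simplified and purely illustrative formulation shows how such models take prior achievement and observable characteristics into account; the regulations do not prescribe this or any other specification, and the notation here is our own:

\[
y_{it} = \beta\, y_{i,t-1} + \gamma' X_{it} + \mu_{j(i)} + \varepsilon_{it}
\]

where \(y_{it}\) is student \(i\)'s assessment score in year \(t\), \(y_{i,t-1}\) is the same student's prior-year score, \(X_{it}\) is a vector of observed student or classroom characteristics a State may choose to include, \(\mu_{j(i)}\) is the estimated effect of the teacher (or, in program-level analyses, the preparation program) to which the student is assigned, and \(\varepsilon_{it}\) is an error term. Because prior achievement appears on the right-hand side, the estimated effect reflects growth relative to similarly situated students rather than the level of achievement with which students entered the classroom. A State that uses such a model would develop its actual specification in consultation with its stakeholders and with appropriate statistical expertise, consistent with the ASA's cautions discussed below.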

    Several commenters raised important points to consider with using VAM for teacher evaluation. Many cited the April 8, 2014, “ASA Statement on Using Value-Added Models for Educational Assessment,” referenced in the summary of comments, which makes several reasonable recommendations regarding the use of VAM, including its endorsement of the wise use of data, statistical models, and designed experiments for improving the quality of education. We believe that the definitions of “student learning outcomes” and “student growth” in the regulations are fully compatible with valid and reliable ways of including VAM to assess the impact of teachers on student academic growth. Therefore, States that choose to use VAM to generate student learning outcomes would have the means to do what the ASA statement recommends: use data and statistical models to improve the quality of their teacher preparation programs. The ASA also wisely cautions that VAM are complex statistical models that require a high level of statistical expertise to develop and run, and that reported results should include estimates of the models' precision. These specific recommendations are entirely consistent with the regulations, and we encourage States to follow them when using VAM.

    We disagree, however, with the ASA and commenters' assertions that VAM typically measures correlation, not causation, and that VAM does not measure teacher contributions toward other student outcomes. These assertions contradict the evidence cited above that VAM does measure the causal effects of teachers on student achievement, and that teachers with high VAM scores also improve long-term student outcomes.

    The implication of the various studies we cited in this section is clear: not only can VAM identify teachers who improve short- and long-term student outcomes, but VAM can play a substantial role in effective, useful teacher evaluation systems.

    However, as we have said, States do not need to use VAM to generate student learning outcomes. Working with their stakeholders, States can, if they choose, establish other means of reporting a teacher preparation program's “student learning outcomes” that meet the basic standard in § 612.5(a)(1).

    Changes: None.

    Comments: Two commenters suggested that the United States Government Accountability Office (GAO) do an analysis and suggest alternatives to VAM.

    Discussion: The Secretary of Education has no authority to direct GAO's work, so these comments are outside the Department's authority, and the scope of the regulations.

    Changes: None.

    Comments: Several commenters opined that it is not fair to measure new teachers in the manner proposed in the regulations because it takes new teachers three to five years to become good at their craft. Other commenters mentioned that value-added scores cannot be generated until at least two years after a teacher candidate has graduated.

    Discussion: We recognize the importance of experience in a teacher's development. However, while teachers can be expected to improve in effectiveness throughout their first few years in the classroom, under § 612.5(a)(1) a State is not using student learning outcomes to measure or predict the future or long-term performance of any individual teacher. It is using student learning outcomes to measure the performance of the teacher preparation program that the novice teacher completed—performance that, in part, should be measured in terms of a novice teacher's ability to achieve positive student learning outcomes in the first year the teacher begins to teach.

    We note, however, that there is strong evidence that early career performance is a significant predictor of future performance. Two studies have found that growth scores in the first two years of a teacher's career, as measured by VAM, better predict future performance than measured teacher characteristics that are generally available to districts, such as a teacher's pathway into teaching, available credentialing scores and SAT scores, and competitiveness of undergraduate institution.42 Given that early career performance is a good predictor of future performance, it is reasonable to use early career results of the graduates of teacher preparation programs as an indicator of the performance of those programs. These studies also demonstrate that VAM scores can be calculated for first-year teachers.

    42 Atteberry, A., Loeb, S., & Wyckoff, J. (2015). Do first impressions matter? Improvement in early career teacher effectiveness. AERA Open; Goldhaber, D., & Hansen, M. (2010). Assessing the Potential of Using Value-Added Estimates of Teacher Job Performance for Making Tenure Decisions. Working Paper 31. National Center for Analysis of Longitudinal Data in Education Research.

    Moreover, even if States choose not to use VAM results as student growth measures, the function of teacher preparation programs is to train teachers to be ready to teach when they enter the classroom. We believe student learning outcomes should be measured early in a teacher's career, when the impact of their preparation is likely to be the strongest. However, while we urge States to give significant weight to their student outcome measures across the board, the regulations leave to each State how to weight the indicators of academic content knowledge and teaching skills for novice teachers in their first and other years of teaching.

    Changes: None.

    Differences Between Accountability and Improvement

    Comments: Commenters stated that the Department is confusing accountability with improvement by requiring data on and accountability of programs. Several commenters remarked that VAM will not guarantee continuous program improvement.

    Discussion: The regulations require States to use the indicators of academic content knowledge and teaching skills identified in § 612.5(a), which may include VAM if a State chooses, to determine the performance level of each teacher preparation program, to report the data generated for each program, and to provide a list of which programs the State considers to be low-performing or at-risk of being low-performing. In addition, reporting the data the State uses to measure student learning outcomes will help States, IHEs, and other entities with teacher preparation programs to determine where their program graduates (or program participants in the case of alternative route to teaching programs) are or are not succeeding in increasing student achievement. No information available to those operating teacher preparation programs, whether from VAM or another source, can, on its own, ensure the programs' continuous improvement. However, those operating teacher preparation programs can use data on a program's student learning outcomes—along with data from employment outcomes, survey outcomes, and characteristics of the program—to identify key areas for improvement and focus their efforts. In addition, the availability of these data will provide States with key information in deciding what technical assistance to provide to these programs.

    Changes: None.

    Consistency

    Comments: One commenter noted that the lack of consistency in assessments at the State level (which we understand to mean assessments of students across LEAs within the same State) will make the regulations almost impossible to operationalize. Another commenter noted that the comparisons will be invalid, unreliable, and inherently biased in favor of providers that enjoy State sponsorship and are most likely to receive favorable treatment under a State-sponsored assessment schema (which we understand to mean “scheme”). Until there is a common State assessment, which we understand to mean a common assessment of students across States, the commenter argued that any evaluation of teachers using student progress and growth will be variable at best.

    Discussion: We first note that, regardless of the assessments a State uses to calculate student learning outcomes, the definition of student growth in § 612.2 requires that such assessments be comparable across schools and consistent with State policies. Comparability across LEAs is not an issue for assessments administered pursuant to section 1111(b)(2) of the ESEA, which are administered statewide; other assessments used by the State for purposes of calculating student growth may not be identical, but are required to be comparable. As such, we do not believe that LEA-to-LEA or school-to-school variation in the particular assessments that are administered should inherently bias the calculation of student learning outcomes across teacher preparation programs.

    Regarding comparability across States in the assessments administered to students, nothing in this regulation requires such comparability, and we believe such a requirement would infringe upon the discretion States have historically been provided under the ESEA in determining State standards, assessments, and curricula.

    We understand the other comment to question the validity of comparisons of teacher preparation program ratings, as reported in the SRC. We continue to stress that the data regarding program performance reported in the SRCs and required by the regulations do not create, or intend to promote, any in-State or inter-State ranking system. Rather, we anticipate that States will use reported data to evaluate program performance based on State-specific weighting.

    Changes: None.

    Special Populations and Untested Subjects

    Comments: Two commenters stated that VAMs will have an unfair impact on special education programs. Another commenter stated that for certain subjects, such as music education, it is difficult for students to demonstrate growth.

    One commenter stated that there are validity issues with using tests to measure the skills of deaf children since standardized tests are based on hearing norms and may not be applicable to deaf children. Another commenter noted that deaf and hard-of-hearing K-12 students almost always fall below expected grade level standards, impacting student growth and, as a result, teacher preparation program ratings under our proposed regulations. In a similar vein, one commenter expressed concern that teacher preparation programs that prepare teachers of English learners may be unfairly branded as low-performing or at-risk because the students are forced to conform to tests that are neither valid nor reliable for them.

    Discussion: The Department is very sensitive to the different teaching and learning experiences associated with students with disabilities (including deaf and hard-of-hearing students) and English learners, and encourages States to use student learning outcome measures that allow teachers to demonstrate positive impact on student learning outcomes regardless of the prior achievement or other characteristics of students in their classroom. Where States use the results of assessments or other tests for student learning outcomes, such measures must also conform to appropriate testing accommodations provided to students that allow them to demonstrate content mastery instead of reflecting specific disabilities or language barriers.

    We expect that these measures of student learning outcomes and other indicators used in State systems under this regulation will be developed in consultation with key stakeholders (see § 612.4(c)), and be based on measures of achievement that conform to student learning outcomes as described in § 612.5(a)(1)(ii).

    Changes: None.

    Comments: Several commenters cited a study 43 describing unintended consequences of the high-stakes use of VAM that emerged through teachers' responses. Commenters stated that the study revealed, among other things, that teachers felt heightened pressure and competition, which reduced morale and collaboration and encouraged cheating or teaching to the test.

    43 Collins, C. (2014). Houston, we have a problem: Teachers find no value in the SAS education value-added assessment system (EVAAS®), Education Policy Analysis Archives, 22(98).

    Some commenters stated that by, in effect, telling teacher preparation programs that their graduates should engage in behaviors that lift the test scores of their students, the likely main effect will be classrooms that are more directly committed to test preparation (and to what the psychometric community calls score inflation) than to advancement of a comprehensive education.

    Discussion: The Department is sensitive to issues of pressure on teachers to artificially raise student assessment scores, and perceptions of some teachers that this emphasis on testing reduces teacher morale and collaboration. However, States and LEAs have responsibility to ensure that test data are monitored for cheating and other forms of manipulation, and we have no reason to believe that the regulations will increase these incidents. With regard to reducing teacher morale and collaboration, value-added scores are typically calculated statewide for all teachers in a common grade and subject. Because teachers are compared to all similarly situated teachers statewide, it is very unlikely that a teacher could affect her own score by refusing to collaborate with other teachers in a single school. We encourage teachers to collaborate across grades, subjects, and schools to improve their practice, but also stress that the regulations use student learning outcomes only to help assess the performance of teacher preparation programs. Under the regulations, where a State does not use student growth or teacher evaluation data already gathered for purposes of an LEA educator evaluation, data related to student learning outcomes is only used to help assess the quality of teacher preparation programs, and not the quality of individual teachers.

    Changes: None.

    Comments: Commenters were concerned that the regulations will not benefit high-need schools and communities because the indicator for student learning outcomes creates a disincentive for programs to place teachers in high-need schools and certain high-need fields, such as English as a Second Language. In particular, commenters expressed concern about the requirements that student learning outcomes be given significant weight and that a program have satisfactory or higher student learning outcomes in order to be considered effective. Commenters expressed particular concern in these areas with regard to Historically Black Colleges and Universities and other programs whose graduates, the commenters stated, are more likely to work in high-need schools.

    Commenters opined that, to avoid unfavorable outcomes, teacher preparation programs will seek to place their graduates in higher-performing schools. Rather than encouraging stronger partnerships, commenters expressed concern that programs will abandon efforts to place graduates in low-performing schools. Others were concerned that teachers will self-select out of high-need schools, and a few commenters noted that high-performing schools will continue to have the most resources while teacher shortages in high-need schools, such as those in Native American communities, will be exacerbated.

    Some commenters stated that it was unfair to assess a teacher preparation program based on, as we interpret the comment, the student learning outcomes of the novice teachers produced by the program because the students taught by novice teachers may also receive instruction from other teachers who may have more than three years of experience teaching.

    Discussion: As we have already noted, under the final regulations, States are not required to apply special weight to any of the indicators of academic content knowledge and teaching skills. Because of their special importance to the purpose of teacher preparation programs, we strongly encourage, but do not require, States to include employment outcomes for high-need schools and student learning outcomes in significant part when assessing teacher preparation program performance. We also encourage, but do not require, States to identify the quality of a teacher preparation program as effective or higher if the State determines that the program's graduates produce student learning outcomes that are satisfactory or higher.

    For the purposes of the regulations, student learning outcomes may be calculated using student growth. Because growth measures the change in student achievement between two or more points in time, the prior achievement of students is taken into account. Teacher preparation programs may thus be assessed, in part, based on their recent graduates' efforts to increase student growth, not on whether the teachers' classrooms contained students who started as high or low achieving. For this reason, teachers—regardless of the academic achievement level of the students they teach—have the same opportunity to positively impact student growth. Likewise, teacher preparation programs that place students in high-need schools have the same opportunity to achieve satisfactory or higher student learning outcomes. These regulations take into account the commenters' concerns related to teacher equity as placement and retention in high-need schools are required metrics.

    We recognize that many factors influence student achievement. Commenters who note that students taught by novice teachers may also receive instruction from other teachers who may have more than three years of experience teaching cite but one factor. But the objective in having States use student growth as an indicator of the performance of a teacher preparation program is not to finely calculate how novice teachers impact student growth. As we have said, it is rather to have the State determine whether a program's student learning outcomes are so far from the mark as to be an indicator of poor program performance.

    For these reasons, we disagree with commenters that the student learning outcomes measure will discourage preparation programs and teachers from serving high-need schools. We therefore decline to make changes to the regulations.

    Changes: None.

    Comments: Commenters expressed concern with labeling programs as low-performing if student data are not made available about such programs. The commenters stated that this may lead to identifying high-quality programs as low-performing. They were also concerned about transparency, and noted that it would be unfair to label any program without actual information on how that label was earned.

    Discussion: We interpret the commenters' concern to be that States may not be able to report on student learning outcomes for particular teacher preparation programs because districts do not provide data on student learning outcomes, and yet still identify programs as low performing. In response, we clarify that the State is responsible for securing the information needed to report on each program's student learning outcomes. Given the public interest in program performance and the interest of school districts in having better information about the programs in which prospective employees have received their training, we are confident that each State can influence its school districts to get maximum cooperation in providing needed data.

    Alternatively, to the extent that the commenter was referring to difficulties obtaining data for student learning outcomes (or for other indicators of academic content knowledge and teaching skills) because of the small size of the teacher preparation programs, § 612.4(b)(3)(ii) provides different options for aggregation of data so the State can provide these programs with appropriate performance ratings. In this case, except for teacher preparation programs that are so small that even these aggregation methods will not permit the State to identify a performance level (see § 612.4(b)(3)(ii)(D) and § 612.4(b)(5)), all programs will have data on student learning outcomes with which to determine the program's level of performance.

    Changes: None.

    State and Local Concerns

    Comments: Several commenters expressed concerns about their specific State laws regarding data collection as they affect data needed for student learning outcomes. Other commenters noted that some States have specific laws preventing aggregated student achievement data from being reported for individual teachers. One commenter said that its State did not require annual teacher evaluations. Some commenters indicated that State standards should be nationally coordinated.

    One commenter asked the Department to confirm that the commenters' State's ESEA flexibility waiver would meet the student learning outcome requirements for both tested and non-tested grades and subjects, and if so, given the difficulty and cost, whether the State would still be required to report disaggregated data on student growth in assessment test scores for individual teachers, programs, or entities in the SRC. Commenters also noted that LEAs could be especially burdened, with no corresponding State or Federal authority to compel LEA compliance. A commenter stated that in one city most teachers have 20 to 40 percent of their evaluations based on tests in subjects they do not teach.

    Commenters urged that States be given flexibility in determining the components of data collection and reporting systems with minimal common elements. This would, as commenters indicated, ultimately delay the State's ability to make valid and reliable determinations of teacher preparation program quality. Some commenters stated that States should be required to use student learning outcomes as a factor in performance designations, but allow each State to determine how best to incorporate these outcomes into accountability systems.

    Commenters noted that a plan for creating or implementing a measure of student achievement in content areas for which States do not have valid statewide achievement data was not proposed, nor was a plan proposed to pilot or fund such standardized measures.

    Discussion: We agree and understand that some States may have to make changes (legislative, regulatory, budgetary, or other) in order to comply with the regulations. We have allowed time for these activities to take place, if necessary, by providing time for data system set-up and piloting before full State reporting is required as of October 31, 2019. We note that § 612.4(b)(4)(ii)(E) of the proposed regulations and § 612.4(b)(5) of the final regulations expressly exempt reporting of data where doing so would violate Federal or State privacy laws or regulations. We also provide in § 612.4(c)(2) that States must periodically examine the quality of the data collection and make adjustments as necessary. So if problems arise, States need to work on ways to resolve them.

    Regarding the suggestion that State standards for student learning outcomes should be nationally coordinated, States are free to coordinate. But how each State assesses a program's performance is a State decision; the HEA does not otherwise provide for such national coordination.

    With respect to the comment asking whether a State's ESEA flexibility waiver would meet the student learning outcomes requirement for both tested and non-tested grades and subjects, this issue is likely no longer relevant because the enactment of the ESSA made ESEA flexibility waivers null and void on August 1, 2016. However, in response to the commenters' question, so long as the State is implementing the evaluation systems as it committed to do in order to receive ESEA flexibility, the data it uses for student learning outcomes would most likely represent an acceptable way, among other ways, to comply with the title II reporting requirements.

    We understand the comment that LEAs would be especially burdened, with no corresponding State or Federal authority to compel LEA compliance, to refer to LEA financial costs. It is unclear that LEAs would be so burdened. We believe that our cost estimates, as revised to respond to public comment, are accurate. Therefore, we also believe that States, LEAs, and IHEs will be able to meet their responsibilities under this reporting system without the need for new funding sources. We discuss authorities related to LEA compliance in the discussion under § 612.1.

    Regarding specific reporting recommendations for State flexibility in use of student learning outcomes, States must use the indicators of academic content knowledge and teaching skills identified in § 612.5(a). However, States otherwise determine for themselves how to use these indicators and other indicators and criteria they may establish to assess a program's performance. In identifying the performance level of each program, States also determine the weighting of all indicators and criteria they use to assess program performance.

    Finally, we understand that all States are working to implement their responsibilities to provide results of student assessments for grades and subjects in which assessments are required under section 1111(b)(2) of the ESEA, as amended by ESSA. With respect to the comment that the Department did not propose a plan for creating or implementing a measure of student achievement in content areas for which States do not have valid statewide achievement data, the regulations give States substantial flexibility in how they measure student achievement. Moreover, we do not agree that time to pilot such new assessments or growth calculations, or more Federal funding in this area, is needed.

    Changes: None.

    Permitted Exclusions From Calculation of Student Learning Outcomes

    Comments: None.

    Discussion: In proposing use of student learning outcomes for assessing a teacher preparation program's performance, we had intended that States be able, in their discretion, to exclude student learning outcomes associated with recent graduates who take teaching positions out of State or in private schools—just as the proposed regulations would have permitted States to do in calculating employment outcomes. Our discussion of costs associated with implementation of student learning outcomes in the NPRM (79 FR 71879) noted that the proposed regulations permitted this exclusion for teachers teaching out of State. And respectful of the autonomy accorded to private schools, we never intended that States be required to obtain data on student learning outcomes regarding recent graduates teaching in those schools.

    However, upon review of the definitions of the terms “student achievement in non-tested grades and subjects,” “student achievement in tested grades and subjects,” and “teacher evaluation measure” in proposed § 612.2, we realized that these definitions did not clearly authorize States to exclude student learning outcomes associated with these teachers from their calculation of a teacher preparation program's aggregate student learning outcomes. Therefore, we have revised § 612.5(a)(1) to include authority for the State to exclude data on student learning outcomes for students of novice teachers teaching out of State or in private schools from its calculation of a teacher preparation program's student learning outcomes. In doing so, as with the definitions of teacher placement rate and teacher retention rate, we have included in the regulations a requirement that the State use a consistent approach with regard to omitting or using these data in assessing and reporting on all teacher preparation programs.

    Changes: We have revised section 612.5(a)(1) to provide that in calculating a teacher preparation program's aggregate student learning outcomes, at its discretion a State may exclude student learning outcomes of students taught by novice teachers teaching out of State or in private schools, or both, provided that the State uses a consistent approach to assess and report on all of the teacher preparation programs in the State.

    Employment Outcomes (34 CFR 612.5(a)(2))

    Measures of Employment Outcomes

    Comments: Many commenters suggested revisions to the definition of “employment outcomes.” Some commenters mentioned that the four measures included in the definition (placement rates, high-need school placement rates, retention rates, and high-need school retention rates) are not appropriate measures of a program's success in preparing teachers. One commenter recommended that high-need school placement rates not be included as a required program measure, and that instead the Department allow States to use it at their discretion. Other commenters recommended including placement and retention data for preschool teachers in States where their statewide preschool program postsecondary training and certification is required, and the State licenses those educators.

    Discussion: For several reasons, we disagree with commenters that the employment outcome measures are inappropriate measures of teacher preparation program quality. The goals of any teacher preparation program should be to provide prospective teachers with the skills and knowledge needed to pursue a teaching career, remain successfully employed as a teacher, and in doing so produce teachers who meet the needs of LEAs and their students. Therefore, the rate at which a program's graduates become and remain employed as teachers is a critical indicator of program quality.

    In addition, programs that persistently produce teachers who fail to find jobs, or, once teaching, fail to remain employed as teachers, may well not be providing the level of academic content knowledge and teaching skills that novice teachers need to succeed in the classroom. Working with their stakeholders (see § 612.4(c)), each State will determine the point at which the reported employment outcomes for a program go from the acceptable to the unacceptable, the latter indicating a problem with the quality of the program. We fully believe that these outcomes reflect another reasonable way to define an indicator of academic content knowledge and teaching skills, and that unacceptable employment outcomes show something is wrong with the quality of preparation the teaching candidates have received.

    Further, we believe that given the need for teacher preparation programs to produce teachers who are prepared to address the needs of students in high-need schools, it is reasonable and appropriate that indicators of academic content and teaching skills used to help assess a program's performance focus particular attention on teachers in those schools. Therefore, we do not believe that the inclusion of teacher placement rates (and teacher retention rates) for high-need schools in the SRCs should be left to State discretion.

    We agree with commenters that, in States where postsecondary training and certification is required, and the State licenses those teachers, data on the placement and retention of preschool teachers should be reported. We strongly encourage States to report this information. However, we decline to require that they do so because pre-kindergarten licensure and teacher evaluation requirements vary significantly between States and among settings, and given these State and local differences in approach we believe that it is important to leave the determination of whether and how to include preschool teachers in this measure to the States.

    Changes: None.

    Teacher Placement Rate

    Comments: One commenter recommended that the teacher placement rate account for “congruency,” which we interpret to mean whether novice teachers are teaching in the grade level, grade span, and subject area in which they were prepared. The commenter noted that teacher preparation programs that are placing teachers in out-of-field positions are not aligning with districts' staffing needs. In addition, we understand the commenter was noting that procedures LEAs use for filling vacancies with teachers from alternative route programs need to acknowledge the congruency issue and build in a mechanism to remediate it.

    Discussion: We agree that teachers should be placed in a position for which they have content knowledge and are prepared. For this reason, the proposed and final regulations define “teacher placement rate” as the percentage of recent graduates who have become novice teachers (regardless of retention) for the grade level, grade span, and subject area in which they were prepared, except, as discussed in the section titled “Alternative Route Programs,” we have revised the regulations to provide that a State is not required to calculate a teacher placement rate for alternative route to certification programs. While we do not agree that teacher preparation programs typically place teachers in their teaching positions, programs that do not work to ensure that novice teachers obtain employment as teachers in a grade level, span, or subject area that is the same as that for which they were prepared will likely fare relatively poorly on the placement rate measure.
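    As a rough illustration of the arithmetic involved (not a required method, data system, or set of exclusions), the sketch below shows how a State might tally a program's teacher placement rate from hypothetical per-graduate records; all field names are invented for this example:

```python
# Illustrative sketch only: the field names, record format, and exclusion rule
# below are hypothetical, not part of the regulations.

def teacher_placement_rate(recent_graduates):
    """Percentage of a program's recent graduates who became novice teachers in
    the grade level, grade span, and subject area for which they were prepared."""
    # A State that excludes certain categories (for example, graduates teaching
    # out of State) must do so consistently for all programs it reports on.
    cohort = [g for g in recent_graduates if not g.get("excluded", False)]
    if not cohort:
        return None  # too few data to report; see the aggregation options in § 612.4(b)(3)(ii)
    placed_in_field = sum(
        1 for g in cohort
        if g["became_novice_teacher"] and g["placement_matches_preparation"]
    )
    return 100.0 * placed_in_field / len(cohort)

# Example: three of four reportable graduates were placed in-field -> 75.0
graduates = [
    {"became_novice_teacher": True,  "placement_matches_preparation": True},
    {"became_novice_teacher": True,  "placement_matches_preparation": False},
    {"became_novice_teacher": True,  "placement_matches_preparation": True},
    {"became_novice_teacher": True,  "placement_matches_preparation": True},
    {"became_novice_teacher": False, "placement_matches_preparation": False, "excluded": True},
]
print(teacher_placement_rate(graduates))
```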

    We disagree with the commenter's suggestion that alternative route program participants are teaching in out-of-field positions. Employment as a teacher is generally a prerequisite to entry into alternative route programs, and the alternative route program participants are being prepared for an initial certification or licensure in the field in which they are teaching. We do not know of evidence to suggest that most participants in alternative route programs become teachers of record without first having demonstrated adequate subject-matter content knowledge in the subjects they teach.

    Nonetheless, traditional route programs and alternative route programs recruit from different groups of prospective teachers and have different characteristics. It is for this reason that, both in our proposed and final regulations, States are permitted to assess the employment outcomes of traditional route programs versus alternative route programs differently, provided that the different assessments result in equivalent standards of accountability and reporting.

    Changes: None.

    Teacher Retention Rate

    Comments: Many commenters expressed concern that the teacher retention rate measure does not consider other factors that influence retention, including induction programs, the support novice teachers receive in the classroom, and the districts' resources. Other commenters suggested requiring each State to demand from its accredited programs a 65 percent retention rate after five years.

    Some commenters also expressed concern about how the retention rate measure will be used to assess performance during the first few years of implementation. They stated that it would be unfair to rate teacher preparation programs without complete information on retention rates.

    Discussion: We acknowledge that retention rates are affected by factors outside the teacher preparation program's control. However, we believe that a teacher retention rate that is extraordinarily low, just like one that is extraordinarily high, is an important indicator of the degree to which a teacher preparation program adequately prepares teachers to teach in the schools that hire them, and thus is a useful and appropriate indicator of academic content knowledge and teaching skills that the State would use to assess the program's performance. The regulations leave to the States, in consultation with their stakeholders (see § 612.4(c)), the determination about how they calculate and then weight a program's retention rate. While we agree that programs should strive for high retention rates, and encourage States to set rigorous performance goals for their programs, we do not believe that the Department should set a specific desired rate for this indicator. Rather, we believe the States are best suited to determine how to implement and weight this measure. However, we retain the proposal to have the retention rate apply over the first three years of teaching both because we believe that having novice teachers remain in teaching for the first three years is key, and because having States continue to generate data five years out as the commenter recommended is unnecessarily burdensome.

    We understand that, during the initial years of implementation, States will not have complete data on retention. We expect that States will weigh indicators for which data are unavailable during these initial implementation years in a way that is consistent and applies equivalent levels of accountability across programs. For further discussion of the reporting cycle and implementation timeline, see § 612.4(a). We also note that, as we explain in our response to comments on the definition of “teacher retention rate”, under the final regulations States will report on teachers who remain in the profession in the first three consecutive years after placement.
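    To make the three-consecutive-year construct concrete, the following minimal sketch tallies a retention rate for a hypothetical placement cohort; as with the placement-rate example above, the field names and exclusion category are invented for illustration and are not prescribed by the regulations:

```python
# Illustrative sketch only: field names and the exclusion category are hypothetical.

def teacher_retention_rate(novice_teachers, years_required=3):
    """Share of a program's novice teachers who remained in teaching in each of
    the first `years_required` consecutive years after initial placement."""
    # A State may, for example, exclude teachers not retained specifically and
    # directly due to budget cuts, provided it applies that exclusion consistently.
    cohort = [t for t in novice_teachers if not t.get("excluded", False)]
    if not cohort:
        return None
    retained = sum(
        1 for t in cohort
        if all(t["taught_in_year"][:years_required])
    )
    return 100.0 * retained / len(cohort)

# Example: one teacher taught all three years, one left after the second year -> 50.0
cohort = [
    {"taught_in_year": [True, True, True]},
    {"taught_in_year": [True, True, False]},
]
print(teacher_retention_rate(cohort))
```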

    Changes: None.

    Comments: Commenters expressed concern that the categories of teachers who can be excluded from the “teacher placement rate” calculation are different from those who can be excluded from the “teacher retention rate” calculation. Commenters believed this could unfairly affect the rating of teacher preparation programs.

    Discussion: We agree that differences in the categories of teachers who can be excluded from the “teacher placement rate” calculation and the “teacher retention rate” calculation should not result in an inaccurate portrayal of teacher preparation program performance on these measures. Under the proposed regulations, the categories of teachers who could be excluded from these calculations would have been the same, with two exceptions: novice teachers who are not retained specifically and directly due to budget cuts could be excluded from the calculation of teacher retention rate only, and recent graduates who have taken teaching positions that do not require State certification could be excluded from the calculation of teacher placement rate only. A teacher placement rate captures whether a recent graduate has ever become a novice teacher and therefore is reliant on initial placement as a teacher of record. Retention in a teaching position has no bearing on this initial placement, and therefore allowing States to exclude teachers from the placement rate who were not retained due to budget cuts would not be appropriate. Therefore, the option to exclude this category of teachers from the retention rate calculation only does not create inconsistencies between these measures.

    However, permitting States to exclude from the teacher placement rate calculation, but not from the teacher retention rate calculation, recent graduates who have taken teaching positions that do not require State certification could create inconsistencies between the measures. Moreover, upon further review, we believe permitting the exclusion of this category of teachers from either calculation runs contrary to the purpose of the regulations, which is to assess the performance of programs that lead to an initial State teacher certification or licensure in a specific field. For these reasons, the option to exclude this category of teachers has been removed from the definition of “teacher placement rate” in the final regulations (see § 612.2). With this change, the differences between the categories of teachers that can be excluded from teacher placement rate and teacher retention rate will not unfairly impact the outcomes of these measures, so long as the State uses a consistent approach to assess and report on all programs in the State.

    Changes: None.

    Comments: Commenters stated that the teacher retention rate measure would reflect poorly on special education teachers, who have a high turnover rate, and on the programs that prepare them. They argued that, in response to the regulations, some institutions will reduce or eliminate their special education preparation programs rather than risk low ratings.

    Discussion: Novice special education teachers have chosen their area of specialization, and their teacher preparation programs trained them consistent with State requirements. The percentage of these teachers, like teachers trained in other areas, who leave their area of specialization within their first three years of teaching, or leave teaching completely, is too high on an aggregated national basis.

    We acknowledge that special education teachers face particular challenges, and that like other teachers, there are a variety of reasons—some dealing with the demands of their specialty, and some dealing with a desire for other responsibilities, or personal factors—for novice special education teachers to decide to move to other professional areas. For example, some teachers with special education training, after initial employment, may choose to work in regular education classrooms, where many children with disabilities are taught consistent with the least restrictive environment provisions of the Individuals with Disabilities Education Act. Their specialized training can be of great benefit in the regular education setting.

    Under our regulations, States will determine how to apply the teacher retention indicator, and so determine in consultation with their stakeholders (see § 612.4(c)) what levels of retention would be so unreasonably low (or so unexpectedly high) as to reflect on the quality of the teacher preparation program. We believe this State flexibility will incorporate consideration of the programmatic quality of special education teacher preparation and the general circumstances of employment of these teachers. Special education teachers are teachers first and foremost, and we do not believe the programs that train special education teachers should be exempted from the State's overall calculations of their teacher retention rates. Demand for teachers trained in special education is expected to remain high, and given the flexibility States have to determine what is a reasonable retention rate for novice special education teachers, we do not believe that this indicator of program quality will result in a reduction of special education preparation programs.

    Changes: None.

    Placement in High-Need Schools

    Comments: Many commenters noted that incentivizing the placement of novice teachers in high-need schools contradicts the ESEA requirement that States work against concentrating novice teachers in high-need schools. The "Excellent Educators for All" 44 initiative asks States to work to ensure that high-need schools obtain and retain more experienced teachers. Commenters believed States would be challenged to meet the contradictory goals of the mandated rating system and the Department's other initiatives.

    44 Equitable Access to Excellent Educators: State Plans to Ensure Equitable Access to Excellent Educators. (2014). Retrieved from http://www2.ed.gov/programs/titleiparta/resources.html.

    Discussion: The required use of teacher placement and retention rates (i.e., our employment rate outcomes) is intended to provide data that confirm the extent to which those whom a teacher preparation program prepares go on to become novice teachers and remain in teaching for at least three years. Moreover, placement rates overall are particularly important, in that they provide a baseline context for evaluating a program's retention rates. Our employment outcomes include similar measures that focus on high-need schools because of the special responsibility of programs to meet the needs of those schools until such time as SEAs and LEAs truly have implemented their responsibilities under sections 1111(g)(1)(B) and 1112(b)(2) of the ESEA, as amended by ESSA (corresponding to similar requirements in sections 1111(b)(8)(C) and 1112(c)(1)(L) of the ESEA, as previously amended by NCLB), to take actions to ensure that low-income children and children of color are not taught at higher rates than other children by inexperienced, unqualified, or out-of-field teachers.

    The Department required all States to submit State Plans to Ensure Equitable Access to Excellent Educators (Educator Equity Plans) to address this requirement, and we look forward to the time when employment outcomes that focus on high-need schools are unnecessary. However, it is much too early to remove employment indicators that focus on high-need schools. For this reason, we decline to accept the commenters' recommendation that we remove them on the ground that these reporting requirements are inconsistent with those under the ESEA.

    We add that, just as States will establish the weights assigned to these outcomes in assessing the level of program performance, States also may adjust their expectations for placement and retention rates for high-need schools in order to support successful implementation of their State plans.

    Changes: None.

    Comments: Many commenters expressed concern about placing novice teachers in high-need schools without additional support systems. Several other commenters stated that the proposed regulations would add to the problem of chronic turnover of the least experienced teachers in high-need schools.

    Discussion: We agree that high-need schools face special challenges, and that teachers who are placed in high-need schools need to be prepared for those challenges so that they have a positive impact on the achievement and growth of their students. By requiring transparency in reporting of employment outcomes through disaggregated information about high-need schools, we hope that preparation programs and high-need schools and districts will work together to ensure novice teachers have the academic content knowledge and teaching skills they need when placed as well as the supports they need to stay in high-need schools.

    We disagree with commenters that the regulations will lead to higher turnover rates. By requiring reporting of placement and retention rates by program, we believe that employers will be better able to identify programs with strong track records for preparing novice teachers who stay, and succeed, in high-need schools. This information will help employers make informed hiring decisions and may ultimately help districts reduce teacher turnover rates.

    Changes: None.

    State Flexibility To Define and Incorporate Measures

    Comments: Commenters suggested that States be able to define the specific employment information they are collecting, as well as the process for collecting it, so that they can use the systems they already have in place. Other commenters suggested that the Department require that States use employment outcomes as a factor in performance designations, but allow each State to determine how best to incorporate these outcomes into accountability systems.

    Several commenters suggested additional indicators that could be used to report on employment outcomes. Specifically, commenters suggested that programs should report the demographics and outcomes of enrolled teacher candidates by race and ethnicity (graduation rate, dropout rates, placement rates for graduates, first-year evaluation scores (if available), and the percentage of teacher candidates who stay within the teaching profession for one, three, and five years). Also, commenters suggested that the Department include the use of readily available financial data when reporting employment outcomes. Another commenter suggested that the Department collect information on how many teachers from each teacher preparation program attain an exemplary rating through the statewide evaluation systems. Finally, one commenter suggested counting the number of times schools hire graduates from the same teacher preparation program.

    Discussion: As with the other indicators, States have flexibility to determine how the employment outcome measures will be implemented and used to assess the performance of teacher preparation programs. If a State wants to adopt the recommendations in how it implements the collection of data on placement and retention rates, it certainly may do so. But we are mindful of the additional costs associated with calculating these employment measures for each teacher preparation program that would come from adopting commenters' recommendations to disaggregate their employment measures by category of teachers or to include the other categories of data they recommend.

    We do not believe that further disaggregation of data as recommended will produce a sufficiently useful indicator of teacher preparation program performance to justify a requirement that all States implement one or more of these recommendations. We therefore decline to adopt them. We also do not believe additional indicators are necessary to assess the academic content knowledge and teaching skills of the novice teachers from each teacher preparation program, though, consistent with § 612.5(b), States are free to adopt them if they choose to do so.

    Changes: None.

    Employment Outcomes as a Measure of Program Performance

    Comments: Commenters suggested that States be expected to report data on teacher placement, without being required to use the data in making annual program performance designations.

    Several commenters noted that school districts often handle their own decisions about hiring and placement of new school teachers, which severely limits institutions' ability to place teachers in schools. Many commenters advised against using employment data in assessments of teacher preparation programs. Some stated that these data would fail to recognize the importance of teacher preparation program students' variable career paths and potential for employment in teaching-related fields. They argued that narrowly defining teacher preparation program quality in terms of a limited conception of employment for graduates is misguided and unnecessarily damaging.

    Other commenters argued that the assumption underlying this proposed measure of a relationship between program quality and teacher turnover is not supported by research, especially in high-need schools. They stated that there are too many variables that impact teacher hiring, placement, and retention to connect these outcomes reliably to the quality of teacher preparation programs. Examples provided include: The economy and budget cuts, layoffs that poor school districts are likely to implement, State politics, the unavailability of a position in a given content area, personal choices (e.g., having a family), better paying positions, out-of-State positions, private school positions, military installations and military spouses, few opportunities for advancement, and geographic hiring patterns (e.g., rural versus urban hiring patterns). Some commenters also stated that edTPA, which they described as an exam that is similar to a bar exam for teaching, would be a much more direct, valid measure of a graduate's skills.

    Discussion: We acknowledge that there are factors outside of a program's control that influence teacher placement rates and teacher retention rates. As commenters note, teacher preparation program graduates (or alternative route program participants if a State chooses to look at them rather than program graduates) may decide to enter or leave the profession due to family considerations, working conditions at their school, or other reasons that do not necessarily reflect upon the quality of their teacher preparation program or the level of content knowledge and teaching skills of the program's graduates.

    In applying these employment outcome measures, it would be unreasonable to assume that States will treat a rate that is below 100 percent as a poor reflection on the quality of the teacher preparation program. Rather, in applying these measures, States may determine what placement rates and retention rates would be so low (or so high, if they choose to identify exceptionally performing programs) as to speak to the quality of the program itself.

    However, while factors like those commenters identify affect employment outcomes, we believe that the primary goal of teacher preparation programs should be to produce graduates who successfully become classroom teachers and stay in teaching at least several years. We believe that high placement and retention rates are indicators that a teacher preparation program's graduates (or an alternative route program's participants if a State chooses to look at them rather than program graduates) have the requisite content knowledge and teaching skills to demonstrate sufficient competency to find a job, earn positive reviews, and choose to stay in the profession. This view is shared by States like North Carolina, Louisiana, and Tennessee, as well as CAEP, which require reporting on similar outcomes for teacher preparation programs.

    Commenters accurately point out that teachers in low-performing schools with high concentrations of students of color have significantly higher rates of turnover. Research from New York State confirms this finding, but also shows that first-year teachers who leave a school are, on average, significantly less effective than those who stay.45 This finding, along with other similar findings,46 indicates that teacher retention and teaching skills are positively associated with one another. Another study found that, when given a choice among teachers who transfer schools, schools tend to choose the teachers with greater impact on student outcomes,47 suggesting that hiring decisions are also indications of teacher skills and content knowledge. Research studies 48 and available State data 49 on teacher preparation programs' placement and retention rates also show that there can be large differences in employment outcomes across programs within a State. While these rates are no doubt influenced by many factors, the Department believes that they are in part a reflection of the quality of the program, because they signal a program's ability to produce graduates that schools and districts deem to be qualified.

    45 Boyd, D., Grossman, P., Lankford, H., & Loeb, S. (2008). Who Leaves? Teacher Attrition and Student Achievement (Working Paper No. 14022). Retrieved from National Bureau of Economic Research.

    46 Goldhaber, D., Gross, P., & Player, D. (2007). Are public schools really losing their “best”? Assessing the career transitions of teachers and their implications for the quality of the teacher workforce (Working Paper No. 12).

    47 Boyd, D., Lankford, H., Loeb, S., Ronfeldt, M., & Wyckoff, J. (2011). The role of teacher quality in retention and hiring: Using applications to transfer to uncover preferences of teachers and schools. Journal of Policy Analysis and Management, 30(1), 88-110.

    48 Kane, T., Rockoff, J., & Staiger, D. (2008). What does certification tell us about teacher effectiveness? Evidence from New York City. Economics of Education Review, 27(6), 615-631.

    49 See, for example, information on these indicators reported by Tennessee and North Carolina: Report Card on the Effectiveness of Teacher Training Programs, Tennessee 2014 Report Card. (n.d.). Retrieved November 30, 2015, from www.tn.gov/thec/Divisions/AcademicAffairs/rttt/report_card/2014/report_card/14report_card.shtml; UNC Educator Quality Dashboard. (n.d.). Retrieved from http://tqdashboard.northcarolina.edu/performance-employment/.

    The use of employment outcomes as indicators of the performance of a teacher preparation program also reflects the relationship between teacher retention rates and student outcomes. At the school level, high teacher turnover can have multiple negative effects on student learning. When a teacher leaves a school, it is more likely that the vacancy will be filled by a less-experienced and, on average, less-effective teacher, which will lower the achievement of students in the school. In addition to this effect on the composition of a school's teacher workforce, the findings of Ronfeldt, et al. suggest that disruption from teacher turnover has an additional negative effect on the school as a whole, in part, by lowering the effectiveness of the teachers who remain in the school.50

    50 Ronfeldt, M., Loeb, S., & Wyckoff, J. (2013). How Teacher Turnover Harms Student Achievement. American Educational Research Journal, 50(1), 4-36.

    Thus, we believe that employment outcomes, taken together, serve not only as reasonable indicators of academic content knowledge and teaching skill, but also as potentially important incentives for programs and States to focus on a program's ability to produce graduates with the skills and preparation to teach for many years. Placement rates overall, and in high-need schools specifically, are particularly important, in that they provide a baseline context for evaluating a program's retention rates. In an extreme example, a program may have 100 graduates, but if only one graduate actually secures employment as a teacher and continues to teach, the program would have a retention rate of 100 percent. Plainly, such a retention rate does not provide a meaningful or complete assessment of the program's impact on teacher retention, and thus on this indicator of program quality. Similarly, two programs may each produce 100 teachers, but one program only places teachers in high-need schools, while the other places no teachers in high-need schools. Even if the programs produced graduates of the exact same quality, the program that serves high-need schools would be likely to have lower retention rates, due to the challenges described in comments and above.
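    To make the arithmetic of this extreme example concrete, the following sketch (illustrative only; the simple ratio formulas and variable names are our own and are not prescribed by the regulations, which leave calculation methods to the States) shows how a 100 percent retention rate can coexist with a 1 percent placement rate:

        # Illustrative sketch only; the regulations do not prescribe these formulas.
        def placement_rate(recent_graduates, placed_as_novice_teachers):
            # Share of recent graduates who became novice teachers (teachers of record).
            return placed_as_novice_teachers / recent_graduates

        def retention_rate(placed_as_novice_teachers, still_teaching):
            # Share of placed novice teachers who remain in teaching.
            return still_teaching / placed_as_novice_teachers

        # The example from the preamble: 100 graduates, 1 placed, and that teacher stays.
        print(placement_rate(100, 1))  # 0.01 -> a 1 percent placement rate
        print(retention_rate(1, 1))    # 1.0  -> a 100 percent retention rate

    Read together, the two rates show that the program's strong retention figure rests on a single placement, which is why placement rates provide necessary context for retention rates.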

    Finally, we reiterate that States have flexibility to determine how employment outcomes should be weighted, so that they may match their metrics to their individual needs and conditions. In regard to using other available measures of teaching ability and academic content knowledge, like edTPA, we believe that, taken together, the outcome-based measures that we require (student learning outcomes, employment outcomes, and survey outcomes) are the most direct measures of academic content knowledge and teaching skills. Placement and retention rates reflect the experiences of a program's recent graduates and novice teachers over the course of three to six years (depending on when recent graduates become novice teachers), which cannot be captured by other measures. We acknowledge that States may wish to include additional indicators, such as student survey results, to assess teacher preparation program performance. Section 612.5(b) permits States to do so. However, we decline to require that States use additional or other indicators like those suggested in place of employment outcomes, because we strongly believe they are less direct measures of academic content knowledge and teaching skills.

    Changes: None.

    Validity and Reliability

    Comments: Several commenters indicated that the teacher retention data that States would need to collect for each program do not meet the standards for being valid or reliable. They stated that data on program graduates will be incomplete because States can exclude teachers who move across State lines, teach in private schools or in positions which do not require certification, or who join the military or go to graduate school. Commenters further expressed concern over the numerous requests for additional data regarding persistence, academic achievement, and job placement that are currently beyond the reach of most educator preparation programs.

    Discussion: As we have previously stated, we intend the use of all indicators of academic content knowledge and teaching skill to produce information about the performance level of each teacher preparation program that, speaking broadly, is valid and reliable. See, generally, our discussion of the issue in response to public comment on Indicators a State Must Use to Report on Teacher Preparation Programs in the State Report Card (34 CFR 612.5(a)).

    It is clear from the comments we received that there is not an outright consensus on using employment outcomes to measure teacher preparation programs; however, we strongly believe that the inclusion of employment outcomes with other measures contributes to States' abilities to make valid and reliable decisions about program performance. Under the regulations, States will work with their stakeholders (see § 612.4(c)) to establish methods for evaluating the quality of data related to a program's outcome measures, and all other indicators, to ensure that the reported data are fair and equitable. As we discussed in the NPRM, in doing so, the State should use this process to ensure the reliability, validity, integrity, and accuracy of all data reported about the performance of teacher preparation programs. We recognize the burden that reporting on employment outcomes may place on individual programs, and for this reason, we suggest, but do not require, that States examine their capacity, within their longitudinal data systems, to track employment outcomes because we believe this will reduce costs for IHEs and increase efficiency of data collection.

    We recognize that program graduates may not end up teaching in the same State as their teacher preparation program for a variety of reasons and suggest, but do not require, that States create inter-State partnerships to better track employment outcomes of program completers as well as agreements that allow them to track military service, graduate school enrollment, and employment as a teacher in a private school. But we do not believe that the exclusion of these recent graduates, or those who go on to teach in private schools, jeopardizes reasonable use of this indicator of teacher preparation program performance. As noted previously, we have revised the regulations so that States may not exclude recent graduates employed in positions which do not require certification from their calculations of employment outcomes. Working with their stakeholders (see § 612.4(c)), States will be able to determine how best to apply the retention rate data that they have.

    Finally, we understand that many teacher preparation programs do not currently collect data on factors like job placement, how long their graduates who become teachers stay in the profession, and the gains in academic achievement that are associated with their graduates. However, collecting this information is not beyond those programs' capacity. Moreover, the regulations make the State responsible for ensuring that data needed for each indicator to assess program performance are secured and used. How they will do so would be a subject for State discussion with its consultative group.

    Changes: None.

    Data Collection and Reporting Concerns

    Comments: Commenters recommended that placement-rate data be collected beyond the first year after graduation and across State boundaries. Another commenter noted that a State would need to know which “novice teachers” or “recent graduates” who attended teacher preparation programs in their State are not actually teaching in their State, and it is unclear how a State would be able to get this information. Several commenters further stated that States would need information about program graduates who teach in private schools that is not publicly available and may violate privacy laws to obtain.

    Commenters were concerned about how often data will be updated by the Department. They stated that, due to teachers changing schools mid-year, data will be outdated and not helpful to the consumer. Several commenters suggested that a national database would need to be in place for accurate data collection so institutions would be able to track graduates across State boundaries. Two commenters noted that it will be difficult to follow graduates over several years and collect accurate data to address all of the areas relevant to a program's retention rate, and that therefore reported rates would reflect a great deal of missing data.

    Another commenter suggested that the Department provide support for the development and implementation of data systems that will allow States to safely and securely share employment, placement, and retention data.

    Discussion: We note first that, due to the definition of the terms “teacher placement rate” and “recent graduate” (see § 612.2), placement rate data are collected on individuals who have met the requirements of a program in any of the three title II reporting years preceding the current reporting year.

    In order to decrease the costs associated with calculating teacher placement and teacher retention rates and to better focus the data collection, our proposed and final definitions of teacher placement rate and teacher retention rate in § 612.2 permit States to exclude certain categories of novice teachers from their calculations for their teacher preparation programs, provided that each State uses a consistent approach to assess and report on all of the teacher preparation programs in the State. As we have already noted, these categories include teachers who teach in other States, teach in private schools, are not retained specifically and directly due to budget cuts, or join the military or enroll in graduate school. While we encourage States to work to capture these data to make the placement and retention rates for each program as robust as possible, we understand that current practicalities may affect their ability to do so for one or more of these categories of teachers. But we strongly believe that, except in rare circumstances, States will have enough data on employment outcomes for each program, based on the numbers of recent graduates who take teaching positions in the State, to use as an indicator of the program's performance.
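    As an illustration of how the permitted exclusions interact with the denominator, the following sketch is offered only to show the arithmetic; the variable names and counts are hypothetical, and a State's actual method is determined in consultation with its stakeholders and must be applied consistently across all programs:

        # Illustrative sketch only; exclusions must be verifiable and applied
        # consistently across all teacher preparation programs in the State.
        def rate_with_exclusions(total_recent_graduates, counted_as_placed, verified_exclusions):
            # Remove excludable individuals (e.g., those teaching in another State,
            # teaching in private schools, serving in the military, or enrolled in
            # graduate school) from the denominator, then compute the share placed.
            denominator = total_recent_graduates - verified_exclusions
            return counted_as_placed / denominator

        # Hypothetical program: 120 recent graduates, 10 verifiably excludable,
        # 88 placed as novice teachers in the State.
        print(rate_with_exclusions(120, 88, 10))  # 0.8 -> an 80 percent placement rate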

    To address confidentiality concerns, § 612.4(b)(5) expressly exempts reporting of data where doing so would violate Federal or State privacy laws or regulations.

    The regulations do not require States to submit documentation with the SRCs that supports their data collections; they only must submit the ultimate calculation for each program's indicator (and its weighting). However, States may not omit program graduates (or participants in alternative route programs if a State chooses to look at participants rather than program graduates) from any of the calculations of employment or survey outcomes indicators without being able to verify that these individuals are in the groups that the regulations permit States to omit.

    Some commenters recommended that the Department maintain a national database, while others seemed to think that we plan to maintain such a database. States must submit their SRCs to the Department annually, and the Department intends to make these reports and the data they include, like SRCs that States annually submitted in prior years, publicly available. The Department has no other plans for activities relevant to a national database.

    Commenters were concerned about difficulties in following graduates for the three-year period proposed in the NPRM. As discussed in response to comment on the “teacher retention rate” definition in § 612.2, we have modified the definition of “teacher retention rate” so that States will be reporting on the first three years a teacher is in the classroom rather than three out of the first five years. We believe this change addresses the commenters' concerns.

    As we interpret the comment, one commenter suggested we provide support for more robust data systems so that States have access to the employment data of teachers who move to other States. We have technical assistance resources dedicated to helping States collect and use longitudinal data, including the Statewide Longitudinal Data System's Education Data Technical Assistance Program and the Privacy Technical Assistance Center, which focuses on the privacy and security of student data. We will look into whether these resources may be able to help address this matter.

    Changes: None.

    Alternative Route Programs

    Comments: Commenters stated that the calculation of placement and retention rates for alternative route teacher preparation programs should be different from those for traditional route teacher preparation programs. Others asked that the regulations ensure the use of multiple measures by States in assessing traditional and alternative route programs. Many commenters stated that the proposed regulations give advantages to alternative route programs, as programs that train teachers on the job get significant advantages by being allowed to count all of their participants as employed while they are still learning to teach, virtually ensuring a very high placement rate for those programs. Other commenters suggested that the common starting point for both alternative and traditional route programs should be the point at which a candidate has the opportunity to become a teacher of record.

    As an alternative, commenters suggested that the Department alter the definition of “new teacher” so that both traditional and alternative route teacher candidates start on equal ground. For example, the definition might include “after all coursework is completed,” “at the point a teacher is placed in the classroom,” or “at the moment a teacher becomes a teacher of record.” Commenters recommended that teacher retention rate should be more in line with CAEP standards, which do not differentiate accountability for alternate and traditional route teacher preparation programs.

    Many commenters were concerned about the ability of States to weight employment outcomes differently for alternative and traditional route programs, thus creating unfair comparisons among States or programs in different States while providing the illusion of fair comparisons by using the same metrics. One commenter was concerned about a teacher preparation program's ability to place candidates in fields where a degree in a specific discipline is needed, as those jobs will go to those with the discipline degree and not to a teacher preparation program degree, thus giving teachers from alternative route programs an advantage. Others stated that demographics may impact whether a student enrolls in a traditional or an alternative route program, so comparing the two types of programs in any way is not appropriate.

    Discussion: We agree that employment outcomes could vary based solely on the type, rather than the quality, of a teacher preparation program. While there is great variability both among traditional route programs and among alternative route programs, those two types of programs have characteristics that are generally very different from each other. We agree with commenters that, due to the fundamental characteristics of alternative certification programs (in particular the likelihood that all participants will be employed as teachers of record while completing coursework), the reporting of teacher placement rate data of individuals who participated in such programs will inevitably result in a 100 percent placement rate. However, creation of a different methodology for calculating the teacher placement rate solely for alternative route programs would be unnecessarily complex and potentially confusing for States as they implement these regulations and for the public as they examine the data. Accordingly, we have removed the requirement that States report and assess the teacher placement rate of alternative route programs from the final regulations. States may, at their discretion, continue to include teacher placement rate for alternative certification programs in their reporting system if they determine that this information is meaningful and deserves weight. However, they are not required to do so by these final regulations.

    For reasons discussed in the Meaningful Differentiations in Teacher Preparation Program Performance section of this preamble, we have not removed the requirement that States report the teacher placement rate in high-need schools for alternative route programs. If a teacher is employed as a teacher of record in a high-need school prior to program completion, that teacher will be considered to have been placed when the State calculates and reports a teacher placement rate for high-need schools. Unlike teacher placement rate generally, the teacher placement rate in high-need schools can be used to meaningfully differentiate between programs of varying quality.

    Recognizing both that (a) the differences in the characteristics of traditional and alternative route programs may create differences between teacher placement rate in high-need schools and (b) our removal of the requirement to include teacher placement rate for alternative certification programs creates a different number of required indicators for Employment Outcomes between the two program types, we have revised § 612.5(a)(2) to clarify that (1) in their overall assessment of program performance States may assess employment outcomes for these programs differently, and (2) States may do so provided that differences in assessments and the reasons for those differences are transparent and that assessments result in equivalent levels of accountability and reporting irrespective of the type of program.

    We believe States are best suited to analyze their traditional and alternative route programs and determine how best to apply employment outcomes to assess the overall performance of these programs. As such, to further promote transparency and fair treatment, we have revised section V of the SRC to require each State to describe its rationale for treating employment outcomes differently where it has not chosen to add a measure of placement rate for alternative route programs and therefore does in fact have different bases for accountability.

    We also believe that, as we had proposed, States should apply equivalent standards of accountability in how they treat employment outcomes for traditional programs and alternative route programs, and suggest a few approaches States might consider for achieving such equivalency.

    For example, a State might devise a system with five areas in which a teacher preparation program must have satisfactory outcomes in order to be considered not low-performing or at-risk of being low-performing. For the employment outcomes measure (and leaving aside the need for employment outcomes for high-need schools), a State might determine that traditional route programs must have a teacher placement rate of at least 80 percent and a second-year teacher retention rate of at least 70 percent to be considered as having satisfactory employment outcomes. The State may, in consultation with stakeholders, determine that a second-year retention rate of 85 percent for alternative certification programs results in an equivalent level of accountability for those programs, given that almost all participants in such programs in the State are placed and retained for some period of time during their program.

    As another example, a State might establish a numerical scale wherein the employment outcomes for all teacher preparation programs in the State account for 20 percent of the total. A State might then determine that teacher placement (overall and at high-need schools) and teacher retention (overall and at high-need schools) outcomes are weighted equally, say at 10 percent each, for all traditional route programs, but weight the placement rate in high-need schools at 10 percent and retention rate (overall and at high-need schools) at 10 percent for alternative route programs.
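    A minimal sketch of this second hypothetical weighting scheme follows; the weights, the equal split between overall and high-need rates within each component, and the sample rates are all assumptions for illustration and are not prescribed by the regulations:

        # Illustrative sketch only; weights mirror the hypothetical example above.
        def employment_component_traditional(placement, placement_hn, retention, retention_hn):
            # Traditional route: placement and retention each worth 10 percent,
            # averaging the overall and high-need rates within each component.
            return 0.10 * (placement + placement_hn) / 2 + 0.10 * (retention + retention_hn) / 2

        def employment_component_alternative(placement_hn, retention, retention_hn):
            # Alternative route: placement in high-need schools worth 10 percent;
            # an overall placement rate is not required for these programs.
            return 0.10 * placement_hn + 0.10 * (retention + retention_hn) / 2

        # Sample rates, for illustration only.
        print(employment_component_traditional(0.85, 0.40, 0.75, 0.60))  # 0.13 of a possible 0.20
        print(employment_component_alternative(0.90, 0.80, 0.70))        # 0.165 of a possible 0.20

    However a State chooses to structure such a scale, the point of the example is that the two program types can be scored on somewhat different components while still carrying the same overall weight.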

    We also recognize that some alternative route programs are specifically designed to recruit high-quality participants who may be committed to teach only for a few years. Many also recruit participants who in college had academic majors in fields similar to what they will teach. Since a significant aspect of our indicators of academic content knowledge and teaching skills focus on the success of novice teachers regardless of the nature of their teacher preparation program, we do not believe we should establish a one-size-fits-all rule here. Rather, we think that States are in a better position to determine how the employment outcomes should best be used to help assess the performance of alternative route and traditional route programs.

    We agree that use of multiple measures of program performance is important. We reiterate that the regulations require that, in reporting the performance of all programs, both traditional and alternative route, States must use the four indicators of academic content knowledge and teaching skills the regulations identify in § 612.5(a), including employment outcomes—the teacher placement rate (excepting the requirement here for alternative route programs), teacher placement rate in high-need schools, teacher retention rate, and teacher retention rate in high-need schools—in addition to any indicators of academic content knowledge and teaching skills and other criteria they may establish on their own.

    However, we do not know of any inherent differences between traditional route programs and alternative route programs that should require different treatment of the other required indicators—student learning outcomes, survey outcomes, and the basic characteristics of the program addressed in § 612.5(a)(4). Nor do we see any reason why any differences in the type of individuals that traditional route programs and alternative route programs enroll should mean that the program's student learning outcomes should be assessed differently.

    Finally, while some commenters argued about the relative advantage of alternative route or traditional route programs in reporting on employment outcomes, we reiterate that neither the regulations nor the SRCs pit programs against each other. Each State determines what teacher preparation programs are and are not low-performing or at-risk of being low-performing (as well as in any other category of performance it may establish). Each State then reports the data that reflect the indicators and criteria used to make this determination, and identifies those programs that are low-performing or at-risk of being low-performing. Of course, any differences in how employment outcomes are applied to traditional route and alternative route programs would need to result in equivalent levels of accountability and reporting (see § 612.5(a)(2)(B)). But the issue for each State is identifying each program's level of performance relative to the level of expectations the State established—not relative to levels of performance or results for indicators or criteria that apply to other programs.

    Changes: We have revised § 612.5(a)(2)(iii) to clarify that in their overall assessment of program performance States may assess employment outcomes for traditional route programs and programs provided through alternative routes differently provided that doing so results in equivalent levels of accountability.

    We have also added a new § 612.5(a)(2)(v) to provide that a State is not required to calculate a teacher placement rate under paragraph (a)(2)(i)(A) of that section for alternative route to certification programs.

    Teacher Preparation Programs Provided Through Distance Education

    Comments: None.

    Discussion: In reviewing the proposed regulations, we recognized that, as with alternative route programs, teacher preparation programs provided through distance education may pose unique challenges to States in calculating employment outcomes under § 612.5(a)(2). Specifically, because such programs may operate across State lines, an individual State may be unable to accurately determine the total number of recent graduates from any given program, and only a subset of that total would, in theory, be preparing to teach in that State. For example, a teacher preparation entity may be physically located in State A and operate a teacher preparation program provided through distance education in both State A and State B. While the teacher preparation entity is required to submit an IRC to State A, which would include the total number of recent graduates from their program, only a subset of that total number would be residing in or preparing to teach in State A. Therefore, when State A calculates the teacher placement rate for that program, it would generate an artificially low rate. In addition, State B would face the same issue even if it had ready access to the total number of recent graduates (which it would not, as the program would not be required to submit an IRC to State B). Any teacher placement rate that State B attempts to calculate for this, or any other, teacher preparation program provided through distance education would be artificially low, as recent graduates who did not reside in State B, did not enroll in a teacher preparation program in State B, and never intended to seek initial certification or licensure in State B would be included in the denominator of the teacher placement rate calculation.

    Recognizing these types of issues, the Department has determined that it is appropriate to create an alternative method for States to calculate employment outcomes for teacher preparation programs provided through distance education. Specifically, we have revised the definition of teacher placement rate to allow States, in calculating teacher placement rate for teacher preparation programs provided through distance education, to use the total number of recent graduates who have obtained initial certification or licensure in the State during the three preceding title II reporting years as the denominator in their calculation instead of the total number of recent graduates. Additionally, we believe it is appropriate to give States greater flexibility in assessing these outcomes, and have added a new § 612.5(a)(2)(iv) which allows States to assess teacher placement rates differently for teacher preparation programs provided through distance education provided that the differences in assessment are transparent and result in similar levels of accountability for all teacher preparation programs.
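    For illustration, the following sketch contrasts the two denominators; the names and counts are hypothetical, and the revised definition simply permits a State to substitute in-State initial certifications or licensures over the three preceding title II reporting years for the program's total number of recent graduates:

        # Illustrative sketch only; counts are hypothetical.
        placed_in_state = 30                 # recent graduates hired as novice teachers in the State
        total_recent_graduates = 400         # nationwide total reported by the distance education program
        certified_in_state_3_years = 40      # in-State initial certifications over the three preceding reporting years

        # Using the nationwide total understates the program's in-State placement.
        print(placed_in_state / total_recent_graduates)       # 0.075 -> artificially low
        # Using in-State certifications as the denominator avoids that distortion.
        print(placed_in_state / certified_in_state_3_years)   # 0.75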

    Changes: We have added § 612.5(a)(2)(iv), which allows States to assess teacher placement rates differently for teacher preparation programs provided through distance education so long as the differences in assessment are transparent and result in similar levels of accountability.

    Survey Outcomes (34 CFR 612.5(a)(3))

    Comments: Several commenters agreed that there is value in using surveys of teacher preparation program graduates and the administrators who employ and supervise them to evaluate the programs, with some commenters noting that such surveys are already in place. Some commenters expressed concerns about the use of survey data as part of a rating system with high-stakes consequences for teacher preparation programs. Some commenters felt that States should have discretion about how or even whether to incorporate survey outcomes into an accountability system. Other commenters suggested making surveys one of a number of options that States could elect to include in their systems for evaluating the quality of teacher preparation programs. Still other commenters felt that, because surveys are currently in place for the evaluation of teacher preparation programs (for example, through State, accrediting agency, and institutional requirements), Federal regulations requiring the use of survey outcomes for this purpose would be either duplicative or add unnecessary burden if they differ from what currently exists. One commenter stated that Massachusetts is currently building valid and reliable surveys of novice teachers, recent graduates, employers, and supervising practitioners on educator preparation, and this work exceeds the expectation of the proposed rules. However, the commenter also was concerned about the reliability, validity, and feasibility of using survey outcomes as an independent measure for assessing teacher preparation program performance. The commenter felt that the proposed regulations do not specify how States would report survey results in a way that captures both qualitative and quantitative data. The commenter expressed doubt that aggregating survey data into a single data point for reporting purposes would convey valuable information, and stated that doing so would diminish the usefulness of the survey data and could lead to distorted conclusions.

    In addition, commenters recommended allowing institutions themselves to conduct and report annual survey data for teacher graduates and employers, noting that a number of institutions currently conduct well-honed, rigorous surveys of teacher preparation program graduates and their employers. Commenters were concerned with the addition of a uniform State-level survey for assessing teacher preparation programs, stating that it is not possible to obtain high individual response rates for two surveys addressing the same area. Commenters contended that, as a result, the extensive longitudinal survey databases established by some of the best teacher education programs in the Nation will be at-risk, resulting in the potential loss of the baseline data, the annual data, and the continuous improvement systems associated with these surveys despite years of investment in them and substantial demonstrated benefits.

    Some commenters noted that it is hard to predict how reliable the teacher and employer surveys required by the regulations would be as an indicator of teacher preparation program quality, since the proposed regulations do not specify how these surveys would be developed or whether they would be the same across the State or States. In addition, the commenters noted that it is hard to predict how reliable the surveys may be in capturing teacher and employer perceptions of how adequately prepared teachers are since these surveys do not exist in most places and would have to be created. Commenters also stated that survey data will need to be standardized for all of a State's institutions, which will likely result in a significant cost to States.

    Some commenters stated that, in lieu of surveys, States should be allowed to create preparation program-school system partnerships that provide for joint design and administration of the preparation program. They claimed that when local school systems and preparation programs jointly design and oversee the preparation program, surveys are unnecessary because the partnership creates one preparation program entity that is responsible for the quality of preparation and satisfaction of district and school leaders.

    Discussion: As we stressed in the NPRM, many new teachers report entering the profession feeling unprepared for classroom realities. Since teacher preparation programs have responsibility for preparing teachers for these classroom realities, we believe that asking novice teachers whether they feel prepared to teach, and asking those who supervise them whether they feel those novice teachers are prepared to teach, generate results that are necessary components in any State's process of assessing the level of a teacher preparation program's performance. Moreover, while not all States have experience employing surveys to determine program effectiveness, we believe that their use for this purpose has been well established. As noted in the NPRM, two major national organizations focused on teacher preparation and others in the higher education world are now incorporating this kind of survey data as an indicator of program quality (see 79 FR 71840).

    We share the belief of these organizations that a novice teacher's perception, and that of his or her employer, of the teacher's readiness and capability during the first year of teaching are key indicators of that individual's academic knowledge and teaching skills as well as whether his or her preparation program is training teachers well. In addition, aside from wanting to ensure that what States report about each program's level of performance is reasonable, a major byproduct of the regulations is that they can ensure that States have accurate information on the quality of teacher preparation programs so that they and the programs can make improvements where needed and recognize excellence where it exists.

    Regarding commenters' concerns about the validity and reliability of the use of survey results to help assess program performance, we first reference our general discussion of the issue in response to public comment on Indicators a State Must Use to Report on Teacher Preparation Programs in the State Report Card (34 CFR 612.5(a)).

    Beyond this, it plainly is important that States develop procedures to enable teachers' and employers' perceptions to be appropriately used and have the desired impacts, and at the same time to enable States to use survey results in ways that treat all programs fairly. To do so, we strongly encourage States to standardize their use of surveys so that for novice teachers who are similarly situated, they seek common information from them and their employers. We are confident that, in consultation with key stakeholders as provided for in § 612.4(c)(1), States will be able to develop a standardized, unbiased, and reliable set of survey questions, or ensure that IHE surveys meet the same standard. This goal would be very difficult to achieve, however, if States relied on existing surveys (unless modified appropriately) whose questions vary in content and thus solicit different information and responses. Of course, it is likely that many strong surveys already exist and are in use, and we encourage States to consider using such an existing survey so long as it comports with § 612.5(a)(3). Where a State finds an existing survey of novice teachers and their employers to be adequate, doing so will avoid the cost and time of preparing another, and to the extent possible, prevent the need for teachers and employers to complete more than one survey, which commenters reasonably would like to avoid. Concerns about the cost and burden of implementing teacher and employer surveys are discussed further with the next set of comments on this section.

    We note that States have the discretion to determine how they will publicly post the results of surveys and how they will aggregate the results associated with teachers from each program for use as an indicator of that program's performance. We encourage States to report survey results disaggregated by question (as is done, for example, by Ohio 51 and North Carolina 52 ), as we believe this information would be particularly useful for prospective teachers in evaluating the strengths of different teacher preparation programs. At some point, however, States must identify any programs that are low-performing or at-risk of being low-performing, and to accomplish this they will need to aggregate quantitative and qualitative survey responses in some way, in a method developed in consultation with key stakeholders as provided for in § 612.4(c)(1).

    51 See, for example: 2013 Educator Preparation Performance Report Adolescence to Young Adult (7-12) Integrated Mathematics Ohio State University. (2013). Retrieved from http://regents.ohio.gov/educator-accountability/performance-report/2013/OhioStateUniversity/OHSU_IntegratedMathematics.pdf.

    52 See UNC Educator Quality Dashboard. Retrieved from http://tqdashboard.northcarolina.edu/performance-employment/.

    Like those who commented, we believe that partnerships between teacher preparation programs and local school systems have great value in improving the transition to the classroom of the individuals whom teacher preparation programs train, and in improving a novice teacher's overall effectiveness. However, these partnerships cannot replace survey results as an indicator of the program's performance.

    Changes: None.

    Comments: Commenters suggested that the Department consider options for reducing the cost and burden of implementation, such as clarifying that States would not have to survey 100 percent of novice teachers or permitting States to conduct surveys less frequently than every year.

    Commenters stated that, if used as expected for comparability purposes, the survey would likely need to be designed by and conducted through a third-party agency with professional credentials in survey design and survey administration. They stated that sampling errors and various forms of bias can easily skew survey results and the survey would need to be managed by a professional third-party group, which would likely be a significant cost to States.

    One commenter recommended that a national training and technical assistance center be established to build data capacity, consistency, and quality among States and educator preparation providers to support scalable continuous improvement and program quality in teacher preparation. In support of this recommendation, the commenter, an accreditor of education preparation providers, stated that, based on its analysis of its first annual collection of outcome data from education preparation providers, and its follow-up survey of education preparation providers, the availability of survey outcomes data differs by survey type. The commenter noted that while 714 teacher preparation program providers reported that they have access to completer survey data, 250 providers reported that they did not have access. In addition, the commenter noted that teacher preparation program providers indicated that there were many challenges in reporting employment status, including State data systems as well as programs that export completers across the nation or internationally.

    Discussion: To obtain the most comprehensive feedback possible, it is important for States to survey all novice teachers who are employed as teachers in their first year of teaching and their employers. This is because feedback from novice teachers is one indicator of how successfully a preparation program imparts knowledge of content and academic skills, and survey results from only a sample may introduce unnecessary opportunities for error and increased cost and burden. There is no established sample size at which a sample is guaranteed to be representative; rather, statistical calculations must be made to verify that the sample is representative of the characteristics of program completers or participants. While drawing a larger sample often increases the likelihood that it will be representative, we believe that for nearly all programs, a representative sample will not be substantially smaller than the total population of completers. Therefore, we do not believe that there is a meaningful advantage to undertaking the analysis required to draw a representative sample. Furthermore, we believe that any potential advantage does not outweigh the potential for error that could be introduced by States or programs that unwittingly draw a biased sample, or report that their sample is representative when in fact it is not. As with student learning outcomes and employment outcomes, we have clarified in § 612.5(a)(3)(ii) that a State may exclude from its calculations of a program's survey outcomes those survey outcomes for all novice teachers who have taken teaching positions in private schools so long as the State uses a consistent approach to assess and report on all of the teacher preparation programs in the State.

    We note that in ensuring that the required surveys are reasonable and appropriate, States have some control over the cost of, and the time necessary for, implementing the surveys. Through consultation with their stakeholders (see § 612.4(c)), they determine the number and type of questions in each survey, and the method of dissemination and collection. However, we believe that it is important that teacher and employer surveys be conducted at least annually. Section 207(a) of the HEA requires that States annually identify teacher preparation programs that are low-performing or at-risk of being low-performing. To implement this requirement, we strongly believe that States need to use data from the year being evaluated to identify those programs. If data from past years were used for annual evaluations, low-performing programs would not receive credit for their improvement, deteriorating programs would tend to rest on their laurels, and prospective teachers, employers, and the public at large would not see a program's actual level of performance. Moreover, because the regulations require these surveys only of novice teachers in their first year of teaching, the commenter's proposal to collect survey outcomes less frequently than annually would mean that entire cohorts of graduates would never provide their assessment of the quality of their preparation program.

    In considering the comment, we realized that while we estimated costs of reporting all indicators of academic content knowledge and teaching skills, including survey outcomes, on an annual basis, the regulations did not adequately clarify the need to collect and report data related to each indicator annually. Therefore, we have revised § 612.4(b)(2)(i) to require that data for each indicator be provided annually for the most recent title II reporting year.

    Further discussion regarding the cost and burden of implementing teacher and employer surveys can be found in the Discussion of Costs, Benefits, and Transfers in the RIA section of this document.

    The regulations do not prescribe any particular method for obtaining the completed surveys, and States may certainly work with their teacher preparation programs and teacher preparation entities to implement effective ways to obtain survey results. Beyond this, we expect that States will seek and employ the assistance that they need to develop, implement, and manage teacher and employer surveys as they see fit. We expect that States will ensure the validity and reliability of survey outcomes—including how to address responder bias—when they establish their procedures for assessing and reporting the performance of each teacher preparation program with a representative group of stakeholders, as is required under § 612.4(c)(1)(i). The regulations do not specify the process States must use to develop, implement, or manage their employer surveys, so whether they choose to use third-party entities to help them do so is up to them.

    Finally, we believe it is important for the Department to work with States and teacher preparation programs across the nation to improve those programs, and we look forward to engaging in continuing dialogue about how this can be done and what the appropriate role of the Department should be. However, the commenters' request for a national training and technical assistance center to support scalable continuous improvement and to improve program quality is outside the scope of this regulation—which is focused on the States' use of indicators of academic content knowledge and teaching skills in their processes of identifying those programs that are low-performing, or at-risk of being low-performing, and other matters related to reporting under the title II reporting system.

    Changes: We have added § 612.5(a)(3)(ii) to clarify that a State may exclude from its calculations of a program's survey outcomes those for novice teachers who take teaching positions in private schools so long as the State uses a consistent approach to assess and report on all of the teacher preparation programs in the State. In addition, we have revised § 612.4(b)(2)(i) to provide that data for each of the indicators identified in § 612.5 are to be reported for the most recent title II reporting year.

    Comments: Commenters also expressed specific concerns about response bias on surveys, such as the belief that teacher surveys often end up providing information about the personal likes or dislikes of the respondent that can be attributed to issues not related to program effectiveness. Commenters stated that surveys can be useful tools for the evaluation of programs and methods, but believed the use of surveys in a ratings scheme is highly problematic given how susceptible they are to what some commenters referred to as “political manipulation.” In addition, commenters stated that surveys of employer satisfaction may be substantially biased by the relationship of school principals to the teacher preparation program. Commenters felt that principals who are graduates of programs at specific institutions are likely to have a positive bias toward teachers they hire from those institutions. Commenters also believed that teacher preparation programs unaffiliated with the educational leadership at the school will be disadvantaged by comparison.

    Commenters also felt that two of our suggestions in the NPRM to ensure completion of surveys—that States consider using commercially available survey software or that teachers be required to complete a survey before they can access their class rosters—raise tremendous questions about the security of student data and the sharing of identifying information with commercial entities.

    Discussion: We expect that States will ensure the validity and reliability of survey outcomes, including how to address responder bias and avoid “political manipulation” and like problems when they establish their procedures for assessing and reporting the performance of each teacher preparation program with a representative group of stakeholders, as is required under § 612.4(c)(1)(i).

    While it may be true that responder bias could impact any survey data, we expect that the variety and number of responses from novice teachers employed at different schools and within different school districts will ensure that such bias will not substantially affect overall survey results.

    There is no reason student data should ever be captured in any survey results, even if commercially available software is used or teachers are required to complete a survey before they can access and verify their class rosters. Commenters did not identify any particular concerns related to State or Federal privacy laws, and we do not understand what they might be. That being said, we fully expect States will design their survey procedures in keeping with requirements of any applicable privacy laws.

    Changes: None.

    Comments: Some commenters expressed concerns about the effect that a low response rate would have on the use of survey data as an indicator of teacher preparation program quality. Commenters noted that obtaining responses to teacher and employer surveys can be quite burdensome due to the difficulty in tracking graduates and identifying their employers. Moreover, commenters stated that efforts to obtain responses are frequently unsuccessful. Some commenters noted that, even with aggressive follow-up, it would be difficult to obtain a sufficient number of responses to warrant using them in high-stakes decision making about program quality. Some commenters felt that the regulations should offer alternatives or otherwise address what happens if an institution is unable to secure sufficient survey responses.

    One commenter shared that, since 2007, the Illinois Association of Deans of Public Colleges of Education has conducted graduate surveys of new teachers from the twelve Illinois public universities, by mailing surveys to new teachers and their employers. The response rate for new teachers has been extremely low (44.2 percent for the 2012 survey and 22.6 percent for the 2013 survey). The supervisor response has been higher, but still insufficient, according to the commenter, for the purpose of rating programs (65.3 percent for the 2012 survey and 40.5 percent for the 2013 survey). In addition, the commenter stated that some data from these surveys indicate differences in the responses provided by new teachers and their supervisors. The commenter felt that the low response rate is compounded when trying to find matched pairs of teachers and supervisors. Using results from an institution's new teacher survey data, the commenter was only able to identify 29 out of 104 possible matched pairs in 2012 and 11 out of 106 possible matched pairs in 2013.

    One commenter from an IHE stated that the institution's return rate on graduate surveys over the past 24 years has been 10 to 24 percent, which the commenter stated is in line with national response rates. While the institution's last survey of 50 school principals had a 50 percent return rate, the commenter noted that the institution surveys only those school divisions that it knows regularly hire its graduates because it does not have a source from which it can obtain actual employment information for all graduates. According to the commenter, a statewide process that better ensures that all school administrators provide feedback would be very helpful, but could also be very burdensome for the schools.

    Another commenter noted that the response rate from the institution's graduates increased significantly when the questionnaire went out via email rather than through the United States Postal Service; however, the response rate from school district administrators remained dismal no matter what format was used—mail, email, Facebook, Instagram, SurveyMonkey, etc. One commenter added that defaulting to having teachers complete surveys during the school day, which would impose yet another demand on instructional time in the classroom, was not a good alternative for addressing low response rates. Commenters saw an important Federal role in accurately tracking program graduates across State boundaries.

    Discussion: We agree that low response rates can affect the validity and reliability of survey outcomes as an indicator of program performance. While we are not sure why States would necessarily need matched pairs of surveys from novice teachers and their employers as long as they achieve what the State and its consultative group determine to be a sufficient response rate, we expect that States will work to develop procedures that will promote adequate response rates in their consultation with stakeholders, as required under § 612.4(c)(1)(i). We also expect that States will use survey data received for the initial pilot reporting year (2017-2018), when States are not required to identify program performance, to adjust their procedures, address insufficient response rates, and address other issues affecting the validity and reliability of survey results. We also note that because States, working with their stakeholders, may determine how to weight the various indicators and criteria they use to arrive at a program's overall level of performance, they also have the means to address survey response rates that they deem too low to provide any meaningful indicator of program quality.

    We believe that States can increase their response rates by incorporating the surveys into other structures, for example, having LEAs disseminate the survey at various points throughout teachers' induction period. Surveys may also be made part of required end-of-year closeout activities for teachers and their supervisors. As the regulations require States to survey only those teachers who are teaching in public schools and their public school employers (see the discussion of the definition of a novice teacher under § 612.2(d)), we believe that approaches such as these will enable States to achieve reasonably high response rates and, thus, valid survey results.

    Finally, before the Department would consider working to develop a system, like one the commenter suggested, for tracking program graduates across State boundaries, we would want to consult with States, IHEs and other stakeholders.

    Changes: None.

    Specialized Accreditation (34 CFR 612.5(a)(4)(i))

    Comments: Commenters were both supportive of and opposed to the proposed provision regarding specialized accreditation. Some commenters noted that CAEP, the new specialized accreditor for teacher preparation programs, is not an accreditor currently recognized by the Department, which creates the possibility that there would be no federally recognized specialized accreditor for teacher preparation programs. Commenters believed that the inclusion of this metric is premature without an organization, recognized by the Secretary, that can confer accreditation on these programs. Other commenters argued that this provision inserts the Federal government into the State program approval process by mandating specific requirements that a State must consider when approving teacher preparation programs within its jurisdiction. They further stated that, although the Department cites CAEP and its standards as what they referred to as a justification for some of the mandated indicators, CAEP does not accredit at the program level. They noted that, in fact, no accreditor provides accreditation specifically to individual teacher preparation programs; CAEP accredits only the entities that offer these programs.

    Commenters raised an additional concern that the Department is seeking to implicitly mandate national accreditation, which would result in increased costs; and that the proposed regulations set a disturbing precedent by effectively mandating specialized accreditation as a requirement for demonstrating program quality. Some commenters were concerned that with CAEP as the only national accreditor for teacher preparation, variety of and access to national accreditation would be limited and controlled.

    Other commenters expressed concern that our proposal to offer each State the option of presenting an assurance that the program is accredited by a specialized accrediting agency would, at best, make the specialized accreditor an agent of the Federal government, and at worst, effectively mandate specialized accreditation by CAEP. The commenters argued instead that professional accreditation should remain a voluntary, independent process based on evolving standards of the profession.

    Some commenters asked that the requirement for State reporting on accreditation or program characteristics in § 612.5(a)(4)(i) and (ii) be removed because these are duplicative of existing State efforts with no clear benefit to understanding whether a teacher preparation program can effectively prepare candidates for classroom success, and because the proposed regulations are redundant to work being done for State and national accreditation. Other commenters recommended that States should not be required to adhere to one national system because absent a floor for compliance purposes, States may build better accreditation systems. One commenter proposed that, as an alternative to program accreditation, States be allowed to include other indicators predictive of a teacher's effect on student performance, such as evidence of the effective use of positive behavioral interventions and supports on the basis of the aggregate number of suspensions and expulsions written by educators from each teacher preparation program.

    Some commenters argued that stronger standards are essential to improving teacher preparation programs, and providing some gradation of ratings of how well preparation programs are doing would provide useful information to the prospective candidates, hiring districts, and the teacher preparation programs the IRCs and SRCs are intended to inform. They noted that as long as CAEP continued with these accreditation levels, rather than lumping them all together under a high-level assurance, indicators of these levels should be reflected in the rating system. They also stated that where States do not require accreditation, States should attempt to assess the level at which programs are meeting the additional criteria.

    Some commenters argued that accreditation alone is sufficient to hold teacher preparation programs accountable. Other commenters stated their agreement that active participation in professional accreditation should be recognized as an indicator of program quality. One commenter supported the alignment between the proposed regulations and CAEP's annual outcomes-based reporting measures, but was concerned that the regulations as proposed would spawn 50 separate State reporting systems, data definitions, and processes for quality assurance. The commenter supported incentivizing accreditation and holding all teacher preparation programs to the same standards and reporting requirements, and stated that CAEP's new accreditation process would achieve the goals of the proposed rules on a national level, while removing burden from the States. The commenter expressed concern about the requirement that the Secretary recognize the specialized accrediting agency, and the statement in the preamble of the NPRM that alternative route programs are often not eligible for specialized accreditation.

    The commenter also indicated that the current input- and compliance-based requirements within the Department's recognition process for accreditors run counter to the overarching goal of providing meaningful data and feedback loops for continuous improvement. The commenter noted that CAEP was launched to bring all teacher preparation programs, whether alternative, higher education based, or online based, into the fold of accreditation. The commenter recommended that specialized accrediting agencies recognized by the Council for Higher Education Accreditation (CHEA) should be allowed to serve as a State indicator for program quality.

    Commenters also noted that no definition of specialized accreditation was proposed, and requested that we include a definition of this term. One commenter recommended that a definition of specialized accreditation include the criteria that the Secretary would use to recognize an agency for the accreditation of professional teacher preparation programs, and that one of the criteria for a specialized agency should be the inclusion of alternative certification programs as eligible professional teacher preparation programs.

    Discussion: First, it is important to note that these regulations do not set requirements for States' teacher preparation program approval processes. The regulations establish requirements for States' reporting to the Secretary on teacher preparation programs in their States, and specifically their identification of programs determined to be low-performing or at-risk of being low-performing, and the basis for those determinations.

    Also, upon review of the comments, we realized that imprecise wording in the proposed regulations likely led to misunderstanding of our intent regarding program-level accreditation. Our intent was simple: a State that is able to certify that the entity offering a teacher preparation program has been accredited by a teacher preparation accreditor recognized by the Secretary may rely on that accreditation to demonstrate that the program produces teacher candidates with the basic qualifications identified in § 612.5(a)(4)(ii), rather than having to separately report on those qualifications. The proposed regulations would not have required separate accreditation of each individual program offered by an entity, but we have revised § 612.5(a)(4)(i) to better reflect this intent. In response to the concern about whether an entity that administers an alternative route program can receive such accreditation, we note that such an entity can apply for CAEP accreditation, as one of the commenters observed.

    As summarized above, commenters presented opposing views of the role in the regulations of national accreditation through an accreditor recognized by the Secretary: Opinions that the inclusion of national accreditation in the regulations represented an unauthorized mandate for accreditation on the one hand, and an implication that accreditation alone was sufficient, thus making other options or further indicators unnecessary, on the other. Similarly, some commenters argued that the regulations require too much standardization across States (through either accreditation or a consistent set of broad indicators), while others argued that the regulations either allow too much variability among States (leading to lack of comparability) or encourage the duplicative effort of creating over 50 separate systems.

    In the final regulations we seek to balance these concerns. States are to assess whether a program either has federally recognized accreditation (§ 612.5(a)(4)(i)) or produces teacher candidates with certain characteristics (§ 612.5(a)(4)(ii)). Allowing States to report and assess whether their teacher preparation programs have specialized accreditation or produce teacher candidates with specific characteristics is not a mandate that a program fulfill either option, and it may eliminate or reduce duplication of effort by the State. If a State has an existing process to assess the program characteristics in § 612.5(a)(4)(ii), it can use that process rather than report on whether a program has specialized accreditation; conversely, if a State would like to simply use accreditation by an agency that evaluates the factors in § 612.5(a)(4)(ii) (whether federally recognized or not) to fulfill this requirement, it may choose to do so. We believe these factors do relate to the preparation of effective teachers, which is reflected in standards and expectations developed by the field, including the CAEP standards. Because accreditation remains a voluntary process, however, we cannot rely on it alone for transparency and accountability across all programs.

    We now address the commenters' statement that there may be no federally recognized accreditor for educator preparation entities. If there is none, and a State would like to use accreditation by an agency whose standards align with the elements listed in § 612.5(a)(4)(ii) (whether federally recognized or not) to fulfill the requirements in § 612.5(a)(4)(ii), it may do so. In fact, many States have worked or are working with CAEP on partnerships to align standards, data collection, and processes.

    As we summarized above, some commenters requested that we include a definition of specialized accreditation, and that it include criteria the Secretary would use to recognize an agency for accreditation of teacher preparation programs, and that one of the criteria should be inclusion of alternative certification programs as eligible programs. While we appreciate these comments, we believe they are outside the scope of the proposed and final regulations.

    Finally, because teacher preparation program oversight authority lies with the States, we do not intend for the regulations to require a single approach—via accreditation or otherwise—for all States to use in assessing the characteristics of teacher preparation programs. We do, however, encourage States to work together in designing data collection processes, in order to reduce or share costs, learn from one another, and allow greater comparability across States.

    In terms of the use of other specific indicators (e.g., positive behavioral interventions), we encourage interested parties to bring these suggestions forward to their States in the stakeholder engagement process required of all States under § 612.4(c).

    As one commenter noted, the current statutory recognition process for accreditors is heavily input based, while the emphasis of the regulations is on outcomes. Any significant reorientation of the accreditor recognition process would require statutory change. Nonetheless, given the rigor and general acceptance of the Federal recognition process, we believe that only accreditation by a federally recognized accreditor should be specifically assessed under § 612.5(a)(4)(i), rather than accreditation by accreditors recognized by outside agencies such as CHEA. For programs not accredited by a federally recognized accreditor, States determine whether or to what degree a program meets the characteristics identified in the alternative provision, § 612.5(a)(4)(ii).

    Because the regulation provides for the use of State procedures as an alternative to accreditation by a specialized accreditor recognized by the Secretary, nothing in § 612.5(a)(4) would mandate program accreditation by CAEP or any other entity. Nor would the regulation otherwise interfere in what commenters argue should be a voluntary, independent process based on evolving standards of the profession. Indeed, this provision does not require any program accreditation at all.

    Changes: We have revised § 612.5(a)(4)(i) to clarify that the assessment of whether a program is accredited by a specialized accreditor could be fulfilled by assessing the accreditation of the entity administering teacher preparation programs, not by accreditation of the individual programs themselves.

    Characteristics of Teacher Preparation Programs (34 CFR 612.5(a)(4)(ii))

    Comments: Multiple commenters expressed opposition to this provision, which would have States report whether a program lacking specialized accreditation has, under § 612.5(a)(4)(ii), certain basic program characteristics. They stated that it is Federal overreach into areas of State or institutional control. For example, while commenters raised the issue in other contexts, one commenter noted that entrance and exit qualifications of teacher candidates have traditionally been for the institution to determine, subject to the requirements of State approval of teacher preparation programs. Other commenters expressed concern about Federal involvement in State and accrediting agency approval of teacher preparation programs, in which they stated the Federal government should have limited involvement.

    Other commenters expressed concern about the consequences of creating rigorous teacher candidate entry and exit qualifications. Some commenters expressed concerns that this requirement does not take into account the unique missions of the institutions and will have a disproportionate and negative impact on MSIs, which may see decreases in eligible teacher preparation program candidates by denying entry to candidates who do not meet entry requirements established by this provision. These commenters were concerned that rigorous entrance requirements could decrease diversity in the teaching profession.

    Commenters also expressed general opposition to requiring rigorous entry and exit qualifications because they felt that a general assurance of entry and exit requirements did little to provide transparency or differentiate programs by quality. In their view, the provisions were therefore unneeded and only added to the confusion and bureaucracy of these requirements.

    Other commenters noted that novice teachers struggle when their clinical experience does not resemble the teaching environment in which they begin their careers, limiting their ability to meet the needs of their students in their early years in the classroom. They suggested that the regulations include "teaching placement" language, for example, "produces teacher candidates with content and pedagogical knowledge and quality clinical preparation relevant to their teaching placement, who have met rigorous teacher candidate entry and exit qualifications pursuant," in order to increase the skills and knowledge of teacher preparation program completers who are placed in the classroom as teachers.

    Discussion: While some commenters expressed concern about Federal overreach, as noted in the earlier discussion of § 612.5(a)(4)(i), these regulations do not set requirements for States' processes for approving teacher preparation programs; they establish requirements for State reporting to the Secretary on teacher preparation programs and on how the State determined whether any given program was low-performing or at-risk of being low-performing. In addition, a State may report whether institutions have fulfilled the requirements in § 612.5(a)(4) through one of two options: Accreditation by an accreditor recognized by the Secretary or, consistent with § 612.5(a)(4)(ii), a showing that the program produces teacher candidates (1) with content and pedagogical knowledge and quality clinical preparation, and (2) who have met rigorous exit qualifications (including, as we observe in response to the comments summarized immediately above, by being accredited by an agency whose standards align with the elements listed in § 612.5(a)(4)(ii)). Thus, the regulations do not require that programs produce teacher candidates with any Federally prescribed rigorous exit requirements or quality clinical preparation.

    Rather, as discussed in our response to public comment in the section on Specialized Accreditation, States have the authority to use their own process to determine whether a program has these characteristics. We feel that this authority provides ample flexibility for State discretion in how to treat this indicator in assessing overall program performance and the information about each program that could help that program in areas of program design. Moreover, the basic elements identified in § 612.5(a)(4)(ii) reflect recommendations of the non-Federal negotiators, and we agree with them that the presence or absence of these elements should impact the overall level of a teacher preparation program's performance.

    The earlier discussion of “rigorous entry and exit requirements” in our discussion of public comment on Definitions addresses the comments regarding rigorous entry requirements. We have revised § 612.5(a)(4)(ii)(C) accordingly to focus solely on rigorous exit standards. As mentioned in that previous discussion, the Department also encourages all States to include diversity of program graduates as an indicator in their performance rating systems, to recognize those programs that are addressing this critical need in the teaching workforce.

    Ensuring that the program produces teacher candidates who have met rigorous exit qualifications alone will not provide necessary transparency or differentiation of program quality. However, having States report data on the full set of indicators for each program will provide significant and useful information, and explain the basis for a State's determination that a particular program is or is not low-performing or at-risk of being low-performing.

    We agree with the importance of high quality clinical experience. However, it is unrealistic to require programs to ensure that each candidate's clinical experience is directly relevant to his or her future, as yet undetermined, teaching placement.

    Changes: We have revised § 612.5(a)(4)(ii)(C) to require a State to assess whether the teacher preparation program produces teacher candidates who have met rigorous teacher candidate exit qualifications. We have removed the proposed requirement that States assess whether teacher candidates meet rigorous entry requirements.

    Comments: None.

    Discussion: Under § 612.5(a)(4), States must annually report whether a program is administered by an entity that is accredited by a specialized accrediting agency or produces candidates with the same knowledge, preparation, and qualifications. Upon review of the comments and the language of § 612.5(a)(4), we realized that the proposed introductory phrase of § 612.5(a)(4)(ii), "consistent with § 612.4(b)(3)(i)(B)", is not needed because the provision it cross-referenced has been removed.

    Changes: We have removed the phrase “consistent with § 612.4(b)(3)(i)(B)” from § 612.5(a)(4)(ii).

    Other Indicators of a Teacher's Effect on Student Performance (34 CFR 612.5(b))

    Comments: Multiple commenters provided examples of other indicators that may be predictive of a teacher's effect on student performance and requested the Department to include them. Commenters stated that a teacher preparation program (by which we assume the commenters meant “State”) should be required to report on the extent to which each program meets workforce demands in their State or local area. Commenters argued this would go further than just reporting job placement, and inform the public about how the program works with the local school systems to prepare qualified teacher candidates for likely positions. Other commenters stated that, in addition to assessments, students should evaluate their own learning, reiterating that this would be a more well-rounded approach to assessing student success. One commenter recommended that the diversity of a teacher preparation program's students should be a metric to assess teacher preparation programs to ensure that teacher preparation programs have significant diversity in the teachers who will be placed in the classroom.

    Discussion: We acknowledge that a State might find that other indicators beyond those the regulations require, including those recommended by the commenters, could be used to provide additional information on teacher preparation program performance. The regulations permit States to use (in which case they must report on) additional indicators of academic content knowledge and teaching skills to assess program performance, including other measures that assess the effect of novice teachers on student performance. In addition, as we have previously noted, States also may apply and report on other criteria they have established for identifying which teacher preparation programs are low-performing or at-risk of being low-performing.

    In reviewing commenters' suggestions, we realized that the term "predictive" in the phrase "predictive of a teacher's effect on student performance" is inaccurate. The additional measures States may use are indicators of teachers' academic content knowledge and teaching skills, rather than predictors of teacher performance.

    We therefore are removing the word “predictive” from the regulations. If a State uses other indicators of academic content knowledge and teaching skills, it must, as we had proposed, apply the same indicators for all of its teacher preparation programs to ensure consistent evaluation of preparation programs within the State.

    Changes: We have removed the word “predictive” from § 612.5(b).

    Comments: None.

    Discussion: As we addressed in the discussion of public comments on Scope and Purpose (§ 612.1), we have removed the proposed requirement that in assessing the performance of each teacher preparation program States consider student learning outcomes “in significant part.” In addition, as we addressed in the discussion of public comments on Requirements for State Reporting on Characteristics of Teacher Preparation Programs (§ 612.5(a)(4)(ii)), we have removed rigorous entry requirements from the characteristics of teacher preparation programs whose administering entities do not have accreditation by an agency approved by the Secretary. Proposed § 612.6(a)(1) stated that States must use student learning outcomes in significant part to identify low-performing or at risk programs, and proposed § 612.6(b) stated that the technical assistance that a State must provide to low-performing programs included technical assistance in the form of information on assessing the rigor of their entry requirements. We have removed both phrases from the final regulations.

    Changes: The phrase “in significant part” has been removed from § 612.6(a)(1), and “entry requirement and” has been removed from § 612.6(b).

    What must a State consider in identifying low-performing teacher preparation programs or at-risk teacher preparation programs, and what actions must a State take with respect to those identified as low-performing? (34 CFR 612.6)

    Comments: Some commenters supported the requirement in § 612.6(b) that at a minimum, a State must provide technical assistance to low-performing teacher preparation programs in the State to help them improve their performance. Commenters were supportive of targeted technical assistance because it has the possibility of strengthening teacher preparation programs and the proposed requirements would allow States and teacher preparation programs to focus on continuous improvement and particular areas of strength and need. Commenters indicated that they were pleased that the first step for a State upon identifying a teacher preparation program as at-risk or low-performing is providing that program with technical support, including sharing data from specific indicators to be used to improve instruction and clinical practice. Commenters noted that States can help bridge the gap between teacher preparation programs and LEAs by using that data to create supports for those teachers whose needs were not met by their program. Commenters commended the examples of technical assistance provided in the regulations.

    Some commenters suggested additional examples of technical assistance to include in the regulations. Commenters believed that technical assistance could include: Training teachers to serve as clinical faculty or cooperating teachers using the National Board for Professional Teaching Standards; integrating models of accomplished practice into the preparation program curriculum; and assisting preparation programs to provide richer clinical experiences. Commenters also suggested including first-year teacher mentoring programs and peer networks as potential ways in which a State could provide technical assistance to low-performing programs. One commenter noted that, in a recent survey of educators, teachers cite mentor programs in their first year of teaching (90 percent) and peer networks (84 percent) as the top ways to improve teacher training programs.

    Commenters recommended that States have the discretion to determine the scope of the technical assistance, rather than requiring that technical assistance focus only on low-performing programs. This would allow States to distribute support as appropriate in an individual context, and minimize the risk of missing essential opportunities to identify best practices from high-performing programs and to support those programs that are best positioned to become increasingly productive and effective providers. Commenters also suggested that entities that administer teacher preparation programs be responsible for seeking and resourcing improvement for their low-performing programs.

    Some commenters suggested that the Federal government provide financial assistance to States to facilitate the provision of technical assistance to low-performing programs. Commenters suggested that the Department make competitive grants available to States to distribute to low-performing programs in support of program improvement. Commenters also suggested that the Federal government offer meaningful incentives to help States design, test, and share approaches to strengthening weak programs and support research to assess effective interventions, as it would be difficult for States to offer the required technical assistance because State agencies have little experience and few staff in this area. In addition, commenters recommended that a national training and technical assistance center be established to build data capacity, consistency, and quality among States and teacher preparation programs to support scalable continuous improvement and program quality in educator preparation.

    Commenters recommended that, in addition to a description of the procedures used to assist low-performing programs as required by section 207 of the HEA, States should be required to describe in the SRC the technical assistance they provided to low-performing teacher preparation programs in the previous year. Commenters suggested that this would shift the information reported from descriptions of processes to more detailed information about actual technical assistance efforts, which could inform technical assistance efforts in other States.

    Commenters suggested adding a timeframe for States to provide the technical assistance to low-performing programs. Commenters suggested a maximum of three months from the time that the program is identified as low-performing because, while waiting for the assistance, and in the early stages of its implementation, the program will continue to produce teacher candidates of lower quality.

    Commenters suggested that States should be required to offer the assistance of a team of well-recognized scholars in teacher education and in the education of diverse students in P-12 schools to assist in the assessment and redesign of programs that are rated below effective. Some commenters noted that States with publicly supported universities designated as Historically Black Colleges and Universities, Hispanic-Serving Institutions, and tribal institutions are required to file with the Secretary a supplemental report of equity in funding and other support to these institutions. Private and publicly supported institutions in these categories often lack the resources to attract the most recognized scholars in the field.

    Discussion: The Department appreciates the commenters' support for the requirement that States provide technical assistance to improve the performance of any teacher preparation program in its State that has been identified as low-performing.

    We decline to adopt the recommendations of commenters who suggested that the regulations require States to provide specific types of technical assistance because we seek to provide States with flexibility to design technical assistance that is appropriate for the circumstances of each low-performing program. States have the discretion to implement technical assistance in a variety of ways. The regulations outline the minimum requirements, and we encourage States that wish to do more, such as providing assistance to at-risk or other programs, to do so. Furthermore, nothing in the regulations prohibits States from providing technical assistance to at-risk programs in addition to low-performing programs. Similarly, while we encourage States to provide timely assistance to low-performing programs, we decline to prescribe a certain timeframe so that States have the flexibility to meet these requirements according to their capacity. In the SRC, States are required to provide a description of the process used to determine the kind of technical assistance to provide to low-performing programs and how such assistance is administered.

    The Department appreciates comments requesting Federal guidance and resources to support high-quality technical assistance. We agree that such activities could be beneficial. However, the commenters' suggestions that the Department provide financial assistance to States to facilitate their provision of technical assistance and to teacher preparation programs to support their improvement, as well as the request for a national training and technical assistance center to support scalable continuous improvement and improved program quality, are outside the scope of this regulation, which is focused on reporting. The Department will consider ways to help States implement this and other provisions of the regulations, including by facilitating the sharing of best practices across States.

    Changes: None.

    Subpart C—Consequences of Withdrawal of State Approval or Financial Support

    What are the consequences for a low-performing teacher preparation program that loses the State's approval or the State's financial support? (34 CFR 612.7(a))

    Comments: Multiple commenters opposed the consequences for a low-performing teacher preparation program based on their opinion that the loss of TEACH Grant eligibility will result in decreased access to higher education for students. Commenters noted that, as institutions become unable to award TEACH Grants to students in low-performing teacher preparation programs, students attending those programs would also lose access to TEACH Grant funds and thereby be responsible for the additional costs that the financial aid program normally would have covered. If low-income students are required to cover additional amounts of their tuition, the commenters asserted, they will be less likely to continue their education or to enroll in the first place, if they are prospective students. The commenters noted that this would disproportionately impact low-income and minority teacher preparation students and decrease the enrollment for those populations.

    A number of commenters expressed their concerns about the impacts of losing financial aid eligibility, and stated that decreasing financial aid for prospective teachers would negatively impact the number of teachers joining the profession. As costs for higher education continue to increase and less financial aid is available, prospective teacher preparation program students may decide not to enroll in a teacher preparation program, and instead pursue other fields that may offer other financial incentives to offset the costs associated with college. The commenters believed this would result in fewer teachers entering the field because fewer students would begin and complete teacher preparation programs, thus increasing teacher shortages. Other commenters were concerned about how performance results of teacher preparation programs may affect job outcomes for students who attended those programs in the past, as their ability to obtain jobs may be influenced by the rating of a program they have not attended recently. The commenters noted that being rated as low-performing would likely reduce the ability of a program to recruit, enroll, and retain students, which would translate into fewer teachers being available for teaching positions. Others stated that there would be a decrease in the number of students who seek certification in a high-need subject area due to the link between TEACH Grant eligibility and teacher preparation program metrics. They believed this would increase teacher shortages in areas that already have a shortage of qualified teachers. Additional commenters believed that publishing results attributable to an individual teacher would raise privacy concerns and further drive potential teachers away from the field due to fears that their performance would be made public.

    Some commenters were specifically concerned about the requirement that low-performing programs be required to provide transition support and remedial services to students enrolled at the time of termination of State support or approval. The commenters noted that low-performing programs are unlikely to have the resources or capacity to provide transitional support to students.

    Discussion: As an initial matter, we note that the requirements in § 612.7(a) are drawn directly from section 207(b) of the HEA, which provides that a teacher preparation program from which the State has withdrawn its approval or financial support due to the State's identification of the program as low-performing may not, among other things, accept or enroll any student who receives title IV student aid. Section 207(b) of the HEA and § 612.7(a) do not concern simply the consequences of a program being rated as low-performing, but rather the consequences associated with a State's withdrawal of the approval of a program or the State's termination of its financial support based on such a rating. Similarly, section 207(b) of the HEA and § 612.7(a) do not concern a program's loss of eligibility to participate in the TEACH Grant program pursuant to part 686, but rather the statutory prohibition on the award of title IV student aid to students enrolled in such a teacher preparation program.

    We disagree with the commenters that the loss of TEACH Grant funds will have a negative impact on the affordability of, and access to, teacher preparation programs. A program that loses its eligibility would be required to provide transitional support, if necessary, to students enrolled at the institution at the time of termination of financial support or withdrawal of approval, to assist students in finding another teacher preparation program that is eligible to enroll students receiving title IV, HEA funds. By providing transition services to students, individuals who receive title IV, HEA funds would be able to find another program in which to use their financial aid and continue in a teacher preparation program in a manner that still addresses college affordability. We also disagree with the commenters who stated that low-performing programs are unlikely to have the resources to provide transitional support to students. We believe that an IHE with a low-performing teacher preparation program will generally offer other programs that may not be considered low-performing. As such, an IHE will have resources to provide transition services to students affected by the teacher preparation program being labeled as low-performing, even if the money does not come directly from the teacher preparation program.

    While teacher preparation program labels may affect job market outcomes, because a rating of low-performing would hamper a program's ability to recruit and enroll future cohorts of students, we believe these labels better serve the interests of students, who deserve to know the quality of the program in which they may enroll. As we have explained, § 612.7 applies only to programs that lose State approval or financial support as a result of being identified by the State as low-performing. It does not apply to every program that is identified as low-performing. We believe that, while providing information about the quality of a program to a prospective student may affect the student's enrollment decision, a student who wishes to become a teacher will find and enroll in a program that has not lost State approval or State financial support. We believe that providing quality consumer information to prospective students will allow them to make informed enrollment decisions. Students who are aware that a teacher preparation program is not approved by the State may reasonably choose not to enter that program. Individuals who wish to enter the teaching field will continue to find programs that prepare them for the workforce, while avoiding less effective programs. For these reasons, we believe the overall impact on the number of individuals entering the field will be minimal. Section 612.4(b) implements protections and allowances for teacher preparation programs with fewer than 25 students, which help to protect against privacy violations, and the regulations do not require sharing information on individual teacher effectiveness with the general public.

    In addition, we believe that, as section 207(b) of the HEA requires, removing title IV, HEA program eligibility from low-performing teacher preparation programs that lose State approval or financial support as a result of the State assessment will encourage individuals to enroll in more successful teacher preparation programs. This will keep more prospective teachers enrolled and will mitigate any negative impact on teacher employment rates.

    While these regulations specify that the teacher placement rate and the teacher retention rate be calculated separately for high-need schools, no requirements have been created to track employment outcomes based on high-need subject areas. We believe that an emphasis on high-need schools will help focus on improving student success across the board for students in these schools. In addition, the requirement to report performance at the individual teacher preparation program level will likely promote reporting by high-need subjects as well.

    Section 612.7(a) codifies statutory requirements related to teacher preparation programs that lose State approval or State financial support, and the Department does not have flexibility to alter the language. This includes the requirements for providing transitional services to enrolled students. However, we believe that many transition services are already being offered by colleges and universities, as well as through community organizations focused on student transition to higher education. For example, assistance in identifying potential colleges, support in completing admissions and financial aid applications, disability support services, remedial education, and career services are all components of transition services that most IHEs offer to some degree to their student body.

    The regulations do not dictate how an institution must assist a student at the time of termination of financial support or withdrawal of approval by the State. Transition services may include helping a student transfer to another program at the same institution that still receives State funding and State approval, or to another program at another institution. The transition services offered by the institution should be in the best interest of the student and assist the student in meeting his or her educational and occupational goals. However, the Department believes that teacher preparation programs may already be offering these services through their staff, and those services should not stop because of the consequences of withdrawal of State approval or financial support.

    Changes: None.

    Institutional Requirements for Institutions Administering a Teacher Preparation Program That Has Lost State Approval or Financial Support (34 CFR 612.7(b))

    Comments: One commenter believed that the Department should require States to notify K-12 school officials in the instance where a teacher preparation program student is involved in clinical practice at the school, noting that the K-12 school would be impacted by the loss of State support for the teacher preparation program.

    Discussion: We decline to require schools and districts to be notified directly when a teacher preparation program of a student teacher is assessed as low-performing. While that information would be available to the public, we believe that directly notifying school officials may unfairly paint students within that program as ineffective. A student enrolled in a low-performing teacher preparation program may be an effective and successful teacher and we believe that notifying school officials directly may influence the school officials to believe the student teacher would be a poor performer even though there would be no evidence about the individual supporting this assumption.

    Additionally, we intend § 612.7(b) to focus exclusively on the title IV, HEA consequences to the teacher preparation program that loses State approval or financial support and on the students enrolled in those programs. This subsection describes the procedure that a program must undertake to ensure that students are informed of the loss of State approval or financial support.

    Changes: None.

    How does a low-performing teacher preparation program regain eligibility to accept or enroll students receiving title IV, HEA program funds after a loss of the State's approval or the State's financial support? (34 CFR 612.8(a))

    Comments: One commenter noted that even if a State has given its reinstatement of funds and recognition of improved performance, the program would have to wait for the Department's approval to be fully reinstated. The commenter stated that this would be Federal overreach into State jurisdiction and decision-making. Additionally, the commenter noted that the regulations appear to make access to title IV, HEA funds for an entire institution contingent on the ratings of teacher preparation programs.

    Another commenter noted that some programs might not ever regain authorization to prepare teachers if they must transfer students to other programs since there will not be any future student outcomes associated with the recent graduates of the low-performing programs.

    Discussion: We decline to adopt the commenter's suggestion that the Department should not require a low-performing teacher preparation program that previously lost its eligibility to accept or enroll students receiving title IV, HEA funds to apply to regain that eligibility. Section 207(b)(4) of the HEA provides that a teacher preparation program that loses eligibility to enroll students receiving title IV, HEA funds may be reinstated upon demonstration of improved performance, as determined by the State. Reinstatement of eligibility of a low-performing teacher preparation program would occur if the program meets two criteria: (1) Improved performance on the teacher preparation program performance criteria in § 612.5, as determined by the State; and (2) reinstatement of the State's approval or the State's financial support, or, if both were lost, the State's approval and the State's financial support. Section 612.8 operationalizes the process for an institution to notify the Secretary that the State has determined that the program has improved its performance sufficiently to regain the State's approval or financial support and that the teacher preparation program should again be permitted to enroll students receiving title IV aid.

    We do not propose to tie the entire institution's eligibility for title IV, HEA funds to the performance of its teacher preparation programs. Any loss of title IV, HEA funds based on these regulations would apply only to the institution's teacher preparation program and not to the entire institution. Therefore, based on the reporting by program and the rating of each program, an institution could have both title IV eligible and non-title IV eligible teacher preparation programs. The remaining programs at the institution would still be eligible to receive title IV, HEA funds. We are concerned that our inclusion of proposed § 612.8(b)(2) may have led the commenter to believe that an entire institution would be prohibited from participating in the title IV programs as a result of a teacher preparation program's loss of approval or financial support based on low performance. To avoid such confusion, we have removed § 612.8(b)(2) from the final regulations. The institutional eligibility requirements in part 600 sufficiently describe the requirements for institutions to participate in the title IV, HEA programs.

    We believe that providing transitional support to students enrolled at the institution at the time a State terminates financial support or withdraws approval of a teacher preparation program will provide appropriate consumer protections to students. We disagree with the commenter who stated that it would be impossible for a program to improve its performance on the State assessment because, if the program were prohibited from enrolling additional title IV eligible students, there would be no data, such as student learning outcomes, on which the program could be assessed. Programs would not be prohibited from enrolling students to determine future student outcomes. Programs that have lost State approval or financial support would be limited only in their ability to enroll additional title IV eligible students, not in their ability to enroll students generally.

    Changes: We have removed § 612.8(b)(2), which was related to institutional eligibility.

    Part 686—Teacher Education Assistance for College and Higher Education (TEACH) Grant Program

    Subpart A—Scope, Purpose, and General Definitions

    Section 686.1 Scope and Purpose

    Comments: None.

    Discussion: The Higher Education Opportunity Act of 2008 (HEOA) (Pub. L. 110-315) amended section 465(a)(2)(A) of the HEA to include educational service agencies in the description of the term low-income school, and added a new section 481(f) that provides that the term “educational service agency” has the meaning given the term in section 9101 of the ESEA. Also, the ESSA maintained the definition of the term “educational service agency”, but it now appears in section 8101 of the ESEA, as amended by the ESSA. We proposed changes to the TEACH Grant program regulations to incorporate the statutory change, such as replacing the definition of the term “school serving low-income students (low-income school)” in § 686.2 with the term “school or educational service agency serving low-income students (low-income school).” Previously, § 686.1 stated that in exchange for a TEACH Grant, a student must agree to serve as a full-time teacher in a high-need field in a school serving low-income students. We revise the section to provide that a student must teach in a school or educational service agency serving low-income students.

    Changes: We revised § 686.1 to update the citation in the definition of the term educational service agency to section 8101 of the ESEA, as amended, and to use the new term “school or educational service agency serving low-income students (low-income school)” in place of the term “school serving low-income students (low-income school).”

    Section 686.2 Definitions

    Classification of Instructional Program (CIP)

    Comments: None.

    Discussion: In the NPRM, we proposed to use the CIP to identify TEACH Grant-eligible STEM programs. Because, as discussed below, we are no longer identifying TEACH Grant-eligible STEM programs, the term CIP is not used in the final regulations.

    Changes: We have removed the definition of the term CIP from § 686.2.

    High-Quality Teacher Preparation Program Not Provided Through Distance Education § 686.2

    Comments: None.

    Discussion: In the NPRM, we proposed a definition for the term “high-quality teacher preparation program.” In response to comments, we have added a definition of a “high-quality teacher preparation program provided through distance education” in § 686.2. We make a corresponding change to the proposed definition of the term “high-quality teacher preparation program” to distinguish a “high-quality teacher preparation program not provided through distance education” from a “high-quality teacher preparation program provided through distance education.”

    Furthermore, to ensure that the TEACH Grant program regulations are consistent with the changes made to part 612, we have revised the timelines that we proposed in the definition of the term high-quality teacher preparation program in part 686, which we now incorporate in the terms “high quality teacher preparation not provided through distance education” and “high quality teacher preparation program provided through distance education.” We have also removed the phrase “or of higher quality” from “effective or of higher quality” to align the definition of “high-quality teacher preparation program not provided through distance education” with the definition of the term “effective teacher preparation program” in 34 CFR 612.1(d), which provides that an effective teacher preparation program is a program with a level of performance higher than a low-performing teacher preparation program or an at-risk teacher preparation program. The phrase “or of higher quality” was redundant and unnecessary.

    The new definition is consistent with changes we made with respect to program-level reporting (including distance education), which are described in the section of the preamble related to § 612.4(a)(1)(i). We note that the new definition of the term “high quality teacher preparation program not provided through distance education” relates to the classification of the program under 34 CFR 612.4(b) made by the State where the program was located, as the proposed definition of the term “high-quality teacher preparation program” provided. This is in contrast to the definition of the term “high-quality teacher preparation program provided through distance education” discussed later in this document.

    Also, the proposed definition provided that in the 2020-2021 award year, a program would be “high-quality” only if it was classified as an effective teacher preparation program in either or both of the April 2019 and April 2020 State Report Cards. We have determined that this provision is unnecessary and have deleted it. Now, because the first State Report Cards under the regulations will be submitted in October 2019, we have provided that starting with the 2021-2022 award year, a program is high-quality if it is not classified by the State to be less than an effective teacher preparation program based on 34 CFR 612.4(b) in two out of the previous three years. We note that in the NPRM, the definition of the term “high-quality teacher preparation program” contained an error. The proposed definition provided that a program would be considered high-quality if it were classified as effective or of higher quality for two out of three years. We intended the requirement to be that a program is high-quality if it is not rated lower than effective in two out of three years. This is a more reasonable standard, and it allows a program that has been rated as less than effective to improve its rating before becoming ineligible to award TEACH Grants.

    Changes: We have added to § 686.2 the term “high-quality teacher preparation program not provided through distance education” and defined it as a teacher preparation program at which less than 50 percent of the program's required coursework is offered through distance education that, starting with the 2021-2022 award year and subsequent award years, is not classified by the State to be less than an effective teacher preparation program, based on 34 CFR 612.4(b) in two out of the previous three years or meets the exception from State reporting of teacher preparation program performance under 34 CFR 612.4(b)(3)(ii)(D) or 34 CFR 612.4(b)(5).
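    For illustration only, the two-out-of-three-years test in this definition can be expressed as a simple check. The sketch below is ours and is not regulatory text; the rating labels, the function name, and the single flag standing in for the reporting exceptions under 34 CFR 612.4(b)(3)(ii)(D) and 612.4(b)(5) are hypothetical simplifications. The same test applies under the distance education definition discussed below, except that only classifications issued by the same State are counted.

        # Illustrative sketch only; not regulatory text.
        # A program remains high-quality for TEACH Grant purposes unless the State
        # classified it below "effective" in two of the previous three years, or
        # unless it meets a reporting exception under 34 CFR 612.4(b)(3)(ii)(D) or (b)(5).
        def is_high_quality(last_three_ratings, meets_reporting_exception=False):
            """last_three_ratings: the State's classifications for the previous
            three years, e.g. ["effective", "at-risk", "effective"]."""
            if meets_reporting_exception:
                return True
            below_effective = sum(
                1 for rating in last_three_ratings
                if rating in ("at-risk", "low-performing")
            )
            return below_effective < 2

        # One below-effective year does not affect eligibility; two of three years do.
        assert is_high_quality(["effective", "at-risk", "effective"])
        assert not is_high_quality(["at-risk", "low-performing", "effective"])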

    High-Quality Teacher Preparation Program Provided Through Distance Education § 686.2

    Comments: In response to the Supplemental NPRM, many commenters stated that it was unfair that one State's classification of a teacher preparation program provided through distance education as low-performing or at-risk of being low-performing would determine TEACH Grant eligibility for all students enrolled in that program who receive TEACH Grants, even if other States classified the program as effective. Commenters did not propose alternative options. One commenter argued that the determination of institutional eligibility to disburse TEACH Grants is meant to rest squarely with the Department, separate from determinations relating to the title II reporting system. Another commenter suggested that there should be a single set of performance standards for TEACH Grants, agreed to by all States, for holding distance education programs accountable. Some commenters felt that teacher preparation programs provided through distance education might have few students in a State and, as a result, might become victims of an unusually unrepresentative sample in a particular State.

    Several commenters stated that it was unclear how the proposed regulations would take into account TEACH Grant eligibility for students enrolled in a teacher preparation program provided through distance education that does not lead to initial certification or if the program does not receive an evaluation by a State. Another commenter stated that the proposed regulations would effectively impose a requirement for distance education institutions to adopt a 50-State authorization compliance strategy to offer their distance education teacher licensure programs to students in all 50 States.

    Discussion: We are persuaded by the commenters that the proposed regulations were too stringent. Consequently, we are revising the proposed definition of “high-quality teacher preparation program provided through distance education” such that, to become ineligible to participate in the TEACH Grant program, the teacher preparation program provided through distance education would need to be rated as low-performing or at-risk for two out of three years by the same State. This revision bases the determination on the classification of a teacher preparation program provided through distance education by a single State rather than on the classification of the program by multiple States, to which the commenters objected. Moreover, this is consistent with the treatment of teacher preparation programs at brick-and-mortar institutions, which also must be classified as low-performing or at-risk for two out of three years by the same State to become ineligible to participate in the TEACH Grant program.

    We disagree with the commenter that the determination of institutional eligibility to disburse TEACH Grants is meant to rest squarely with the Department, separate from determinations relating to teacher preparation program performance under title II of the HEA. The HEA provides that the Secretary determines which teacher preparation programs are high-quality, and the Secretary has reasonably decided to rely, in part, on the classification of teacher preparation program performance by States under title II of the HEA. Further, as the performance rating of teacher preparation programs not provided through distance education could also be subject to unrepresentative samples (for example, programs located near a State border), this concern is not limited to teacher preparation programs provided through distance education.

    The performance standards related to title II are left to a State's discretion; thus, if States want to work together to create a single set of performance standards, there is no barrier to their doing so.

    By way of clarification, the HEA and current regulations provide for TEACH Grant eligibility for students enrolled in post-baccalaureate and master's degree programs. The eligibility of programs that do not lead to initial certification is not based on a title II performance rating. In addition, if the teacher preparation program provided through distance education is not classified by a State for a given year due to small n-size, students would still be able to receive TEACH Grants if the program meets the exception from State reporting of teacher preparation program performance under 34 CFR 612.4(b)(3)(ii)(D) or 34 CFR 612.4(b)(5). We disagree that the regulations effectively impose a requirement for distance education institutions to adopt a 50-State authorization compliance strategy to offer their distance education teacher licensure programs to students in all 50 States. Rather, our regulations provide, in part, for reporting on teacher preparation programs provided through distance education under the title II reporting system with the resulting performance level classification of the program based on that reporting forming the basis for that program's eligibility to disburse TEACH Grants.

    Changes: We have revised the definition of a high-quality teacher preparation program provided through distance education to be a teacher preparation program at which at least 50 percent of the program's required coursework is offered through distance education and that starting with the 2021-2022 award year and subsequent award years, is not classified by the same State to be less than an effective teacher preparation program based on 34 CFR 612.4(b) in two of the previous three years or meets the exception from State reporting of teacher preparation program performance under 34 CFR 612.4(b)(3)(ii)(D) or 34 CFR 612.4(b)(5).

    TEACH Grant-Eligible Institution

    Comments: Several commenters disagreed with our proposal to link TEACH Grant program eligibility to State ratings of teacher preparation program performance conducted under the title II reporting system described in part 612. Commenters asserted that State ratings of teacher preparation programs should not determine TEACH Grant program eligibility because it is not a good precedent to withhold financial aid from qualified students on the basis of the quality of the program in which the student is enrolled. Commenters also expressed concern that, under part 612, each State may develop its own criteria for assessing teacher preparation program quality, and that this variation between States will impact teacher preparation programs' eligibility for TEACH Grants. Commenters stated that using different quality measures to determine student eligibility for TEACH Grants will be unfair to students, as programs in different States will be evaluated using different criteria.

    A commenter that offers only graduate degree programs and no programs that lead to initial certification noted that the HEA provides that current teachers may be eligible for TEACH Grants to obtain graduate degrees, and questioned how those students could obtain TEACH Grants under the proposed definitions of the terms “TEACH Grant-eligible institution” and “TEACH Grant-eligible program.”

    Commenters also expressed concern that the proposed definition of the term TEACH Grant-eligible institution will result in an overall reduction in the number of institutions that are eligible to provide TEACH Grants, and that, because of this reduction, fewer students will pursue high-need fields such as special education, or teach in high-poverty, diverse, urban or rural communities where student test scores may be lower. One commenter stated that it is unfair to punish students by denying them access to financial aid when the States they live in and the institutions they attend may not be able to supply the data on which the teacher preparation programs are being assessed.

    Discussion: We believe that creating a link between institutions with teacher preparation programs eligible for TEACH Grants and the ratings of teacher preparation programs under the title II reporting system is critical, and will allow the Secretary to identify what teacher preparation programs are high-quality. An “eligible institution,” as defined in section 420L(1)(A) of the HEA, is one that the Secretary determines “provides high-quality teacher preparation and professional development services, including extensive clinical experience as part of pre-service preparation,” among other requirements. Consistent with this requirement, we have defined the term “TEACH Grant-eligible program” to include those teacher preparation programs that a State has determined provide at least effective teacher preparation. Under title II of the HEA, States are required to assess the quality of teacher preparation programs in the State and to make a determination as to whether a program is low-performing or at-risk of being low-performing. A teacher preparation program that does not fall under either one of these categories is considered an effective teacher preparation program under these final regulations. It is appropriate and reasonable for the Secretary to rely on a State's assessment of the quality of teacher preparation programs in that State for purposes of determining which programs are TEACH Grant-eligible programs.

    We agree that States will assess teacher preparation programs based on different criteria and measures. The HEA only requires a State to assess the quality of teacher preparation in that State and does not require comparability between States. That different States may use different standards is not necessarily unfair, as it is reasonable for States to consider specific conditions in their States when designing their annual assessments. We believe it is important that students receiving TEACH Grants be enrolled in programs that the State has identified as providing effective teacher preparation.

    We agree that in addition to ensuring that students wishing to achieve initial certification to become teachers are eligible for TEACH Grants, the HEA provides that a teacher or a retiree from another occupation with expertise in a field in which there is a shortage of teachers, such as mathematics, science, special education, English language acquisition, or another high-need field, or a teacher who is using high-quality alternative certification routes to become certified is eligible to receive TEACH Grants. To ensure that these eligible students are able to obtain TEACH Grants, we have modified the definitions of the terms “TEACH Grant-eligible institution” and “TEACH Grant-eligible program.”

    We also acknowledge the possibility that the overall number of institutions eligible to award TEACH Grants could decrease, because a TEACH Grant-eligible institution now must, in most cases, provide at least one high quality teacher preparation program, while in the current regulation, an institution may be TEACH Grant-eligible if it offers a baccalaureate degree that, in combination with other training or experience, will prepare an individual to teach in a high-need field and has entered into an agreement with another institution to provide courses necessary for its students to begin a career in teaching. We note that so long as an otherwise eligible institution has one high-quality teacher preparation program not provided through distance education or one high-quality program provided through distance education, it continues to be a TEACH Grant-eligible institution. Furthermore, we do not believe that fewer incentives for students to pursue fields such as special education or to teach in high-poverty, diverse, or rural communities where test scores may be lower would necessarily be created. TEACH Grants will continue to be available to students so long as their teacher preparation programs are classified as effective teacher preparation programs by the State (subject to the exceptions previously discussed), and we are not aware of any evidence that programs that prepare teachers who pursue fields such as special education or who teach in communities where test scores are lower will be classified as at-risk or low-performing teacher preparation programs on the basis of lower test scores. We believe that those students will choose to pursue those fields while enrolled in high-quality programs. The larger reason that the number of institutions providing TEACH Grants may decrease is that the final regulations narrow the definition of a TEACH Grant-eligible institution to generally those institutions that offer at least one high-quality teacher preparation program not provided through distance education or one high-quality teacher preparation program provided through distance education at the baccalaureate or master's degree level (that also meets additional requirements) and institutions that provide a high-quality teacher preparation program not provided through distance education or one high-quality teacher preparation program provided through distance education that is a post-baccalaureate program of study.

    We do not agree that student learning outcomes for any subgroup, including for teachers who teach students with disabilities, would necessarily be lower if properly measured. Further, student learning outcomes are only one of multiple measures used to determine a rating and, thereby, TEACH Grant eligibility. A single measure, whether student learning outcomes or another, therefore would not necessarily lead to the teacher preparation program being determined by the State to be low-performing or at-risk of being low-performing and correspondingly becoming ineligible for TEACH Grants. As discussed elsewhere in this document, States determine the ways to measure student learning outcomes that give all teachers a chance to demonstrate effectiveness regardless of the composition of their classrooms, and States may also determine the weights of the criteria used in their State assessments of teacher preparation program quality.

    We do not agree with the comment that the definition of the term TEACH Grant-eligible program will unfairly punish students who live in States or attend institutions that fail to comply with the regulations in part 612 by failing to supply the data required in that part. Section 205 of the HEA requires States and institutions to submit IRCs and SRCs annually. In addition, students will have access to information about a teacher preparation program's eligibility before they enroll so that they may select programs that are TEACH Grant-eligible. Section 686.3(c) also allows students who are currently enrolled in a TEACH Grant-eligible program to receive additional TEACH Grants to complete their program, even if the program becomes ineligible to award TEACH Grants to new students.

    For reasons discussed under the TEACH Grant-eligible program section of this document, we have made conforming changes to the definition of a TEACH Grant-eligible program that are reflected in the definition of TEACH Grant-eligible institution where applicable.

    Changes: We have revised the definition of a TEACH Grant-eligible institution to provide that, if an institution provides a program that is the equivalent of an associate degree as defined in § 668.8(b)(1) that is acceptable for full credit toward a baccalaureate degree in a high-quality teacher preparation program not provided through distance education or one high-quality teacher preparation program provided through distance education or provides a master's degree program that does not meet the definition of the terms “high quality teacher preparation not provided through distance education” or “high quality teacher preparation program that is provided through distance education” because it is not subject to reporting under 34 CFR part 612, but that prepares (1) a teacher or a retiree from another occupation with expertise in a field in which there is a shortage of teachers, such as mathematics, science, special education, English language acquisition, or another high-need field; or (2) a teacher who is using high-quality alternative certification routes to become certified, the institution is considered a TEACH Grant-eligible institution.

    TEACH Grant-Eligible Program

    Comments: A commenter recommended that the definition of TEACH Grant-eligible program be amended to add “or equivalent,” related to the eligibility of a two-year program, so that the definition would read, “Provides a two-year or equivalent program that is acceptable for full credit toward a baccalaureate degree in a high-quality teacher preparation program,” because some programs could be less than two years but the curriculum covered is the equivalent of a two-year program.

    Discussion: We agree with the commenter that some programs could be less than two years, but the curriculum could cover the equivalent of a two-year program, and therefore agree that the provision regarding what constitutes an eligible two-year program of study should be revised. However, we base the revision on already existing regulations regarding “eligible program” rather than the commenter's specific language recommendations. The regulations for “eligible program” in § 668.8 provide that an eligible program is an educational program that is provided by a participating institution and satisfies other relevant requirements contained in the section, including that an eligible program provided by an institution of higher education must, in part, lead to an associate, bachelor's, professional, or graduate degree or be at least a two-academic-year program that is acceptable for full credit toward a bachelor's degree. For purposes of § 668.8, the Secretary considers an “equivalent of an associate degree” to be, in part, the successful completion of at least a two-year program that is acceptable for full credit toward a bachelor's degree and qualifies a student for admission into the third year of a bachelor's degree program. Based on these existing regulations, we amended the proposed definition of TEACH Grant-eligible program to provide that a program that is the equivalent of an associate degree as defined in § 668.8(b)(1) that is acceptable for full credit toward a baccalaureate degree in a high-quality teacher preparation program is considered to be a TEACH Grant-eligible program. In addition, as described in the discussion of the term “TEACH Grant-eligible institution,” we have made a corresponding change to the definition of the term “TEACH Grant-eligible program” to ensure that programs that prepare graduate degree students who are eligible to receive TEACH Grants pursuant to section 420N(a)(2)(B) of the HEA are eligible programs. This change applies to programs that are not assessed by a State under title II of the HEA.

    Changes: We have revised the definition of TEACH Grant-eligible program to provide that a program that is a two-year program or is the equivalent of an associate degree as defined in § 668.8(b)(1) that is acceptable for full credit toward a baccalaureate degree in a high quality teacher preparation program is also considered to be a TEACH Grant-eligible program. We have also clarified that a master's degree program that does not meet the definition of the terms “high quality teacher preparation not provided through distance education” or “high quality teacher preparation program that is provided through distance education” because it is not subject to reporting under 34 CFR part 612, but that prepares (1) a teacher or a retiree from another occupation with expertise in a field in which there is a shortage of teachers, such as mathematics, science, special education, English language acquisition, or another high-need field; or (2) a teacher who is using high-quality alternative certification routes to become certified is a TEACH Grant-eligible program.

    TEACH Grant-Eligible STEM Program

    Comments: Multiple commenters stated that the proposed definition of the term TEACH Grant-eligible STEM program was not discussed during the negotiated rulemaking process and unreasonably creates a separate standard for TEACH Grant eligibility without the corresponding reporting required in the SRC. Commenters generally stated that all teacher preparation programs should be held accountable in a fair and equitable manner. Commenters further stated that the Department did not provide any rationale for excepting STEM programs from the ratings of teacher preparation programs described in part 612. Commenters also noted that the proposed definition ignores foreign language, special education, bilingual education, and reading specialists, which are identified as high-need fields in the HEA. Several commenters also disagreed with the different treatment provided to STEM programs under the definition because they believed that STEM fields were being given extra allowances with respect to failing programs and that creating different standards of program effectiveness for STEM programs and teacher preparation programs makes little sense. Commenters suggested that, instead, the Department should require that STEM programs be rated as effective or exceptional in order for students in those programs to receive TEACH Grants.

    Commenters also questioned what criteria the Secretary would use to determine eligibility, since the Secretary would be responsible for determining which STEM programs are TEACH Grant-eligible. Finally, commenters emphasized the importance of the pedagogical aspects of teacher education.

    Discussion: We agree that it is important that teacher preparation programs that are considered TEACH Grant-eligible programs be high-quality programs, and that the proposed definition of the term TEACH Grant-eligible STEM program may not achieve that goal. The regulations in part 612 only apply to teacher preparation programs, which are defined in that part generally as programs that lead to an initial State teacher certification or licensure in a specific field. Many STEM programs do not lead to an initial State teacher certification or licensure, and hence are not subject to the State assessments described in part 612 and section 207 of the HEA. We have carefully considered the commenters' concerns, and have decided to remove our proposed definition of the term TEACH Grant-eligible STEM program because it would be difficult to implement and would result in different types of programs being held to different quality standards. We also acknowledge the importance of the pedagogical aspects of teacher education. A result of the removal of this definition will be that a student must be enrolled in a high-quality teacher preparation program as defined in § 686.2(e) to be eligible for a TEACH Grant, and that few students participating in STEM programs will receive TEACH Grants. Those students may be eligible for TEACH Grants for post-baccalaureate or graduate study after completion of their STEM programs.

    Changes: We have removed the TEACH Grant-eligible STEM program definition from § 686.2, as well as references to and uses of that definition elsewhere in part 686 where this term appeared.

    Section 686.11 Eligibility To Receive a TEACH Grant

    Comments: Some commenters supported linking TEACH Grant eligibility to the title II reporting system for the 2020-2021 title IV award year, noting that this would prevent programs that fail to prepare teachers effectively from remaining TEACH Grant-eligible, and that linking TEACH Grant program eligibility to teacher preparation program quality is an important lever to bring accountability to programs equipping teachers to teach in the highest need schools. Other commenters were concerned that linking title II teacher preparation program ratings to TEACH Grant eligibility will have a negative impact on recruitment for teacher preparation programs, will restrict student access to TEACH Grants, and will negatively impact college affordability for many students, especially for low- and middle-income students and students of color, who may be disproportionately impacted because these students typically depend more on Federal student aid. Commenters were concerned that limiting aid for these students, as well as for students in rural communities or students in special education programs, would further increase teacher shortages in these areas, would slow progress in building a culturally and racially representative educator workforce, and would possibly exacerbate current or pending teacher shortages across the nation in general. Many commenters opined that, because there is no evidence supporting the use of existing student growth models for determining institutional eligibility for the TEACH Grant program, institutional eligibility for TEACH Grants and student eligibility for all title IV Federal student aid in a teacher preparation program would be determined based on an invalid and unreliable rating system. Some commenters recommended that Federal student aid be based on student need, not on institutional ratings, which they asserted lack a sound research base because of potential unknown impacts on underrepresented groups. Others expressed concern that financial aid offices would experience more burden and more risk of error in the student financial aid packaging process because they would have more information to review to determine student eligibility. This would include, for distance education programs, where each student lives and which programs are eligible in which States.

    Many commenters stated that the proposed regulations would grant the State, rather than the Department of Education, authority to determine TEACH Grant eligibility, which is a delegation of authority that Congress did not provide the Department, and that a State's strict requirements may make the TEACH Grant program unusable by institutions, thereby eliminating TEACH Grant funding for students at those institutions. It was recommended that the regulations allow for professional judgment regarding TEACH Grant eligibility, that TEACH Grants mimic Federal Pell Grants in annual aggregates, and that a link should be available at studentloans.gov for TEACH Grant requirements. One commenter further claimed that the proposed regulations represent a profound and unwelcome shift in the historic relationship between colleges, States, and the Federal government and that there is no indication that the HEA envisions the kind of approach to institutional and program eligibility for TEACH Grants proposed in the regulations. The commenter opined that substantive changes to the eligibility requirements should be addressed through the legislative process, rather than through regulation. A commenter noted that a purpose of the proposed regulations is to deal with deficiencies in the TEACH Grant program, and thus, the Department should focus specifically on issues with the TEACH Grant program and not connect these to reporting of the teacher preparation programs.

    Discussion: We appreciate the comments supporting the linking of TEACH Grant eligibility to the title II reporting system for the 2021-2022 title IV award year. We disagree, however, with comments suggesting that such a link will have a negative impact on recruitment for teacher preparation programs and restrict student access to TEACH Grants because this circumstance would only arise in the case of programs rated other than effective, and it is not unreasonable for students to choose to attend teacher preparation programs that are effective over those that are not. While we agree that low- and middle-income students and students of color are more likely to depend on Federal student aid, the regulations would not affect their eligibility for Federal student aid as long as they are enrolled in a TEACH Grant-eligible teacher preparation program at a TEACH Grant-eligible institution. The same would be true for students in rural communities or in special education programs. Because student eligibility for Federal student aid would not be affected in these circumstances, teacher shortages in these areas also would not be impacted. In 2011, only 38 institutions were identified by their States as having a low-performing teacher preparation program.53 That evaluation was based on an institution-wide assessment of quality. Under part 612, each individual teacher preparation program offered by an institution will be evaluated by the State, and it would be unlikely for all teacher preparation programs at an institution to be rated as low-performing. We believe that students reliant on Federal student aid will have sufficient options to enroll in high-quality teacher preparation programs under the final regulations. While we hope that students would use the ratings of teacher preparation programs to pick more effective programs initially, we also provide under § 686.3 that an otherwise eligible student who received a TEACH Grant for enrollment in a TEACH Grant-eligible program is eligible to receive additional TEACH Grants to complete that program, even if that program is no longer considered TEACH Grant-eligible. An otherwise eligible student who received a TEACH Grant for enrollment in a program before July 1 of the year these final regulations become effective would remain eligible to receive additional TEACH Grants to complete the program even if the program is no longer considered TEACH Grant-eligible under § 686.2(e).

    53 U.S. Department of Education, Office of Postsecondary Education (2013). Preparing and Credentialing the Nation's Teachers: The Secretary's Ninth Report on Teacher Quality. Washington, DC. Retrieved from https://title2.ed.gov/Public/TitleIIReport13.pdf. (Hereafter referred to as “Secretary's Ninth Report.”)

    With respect to comments objecting to the use of student growth to determine TEACH Grant eligibility, student growth is only one of the many indicators that States use to assess teacher preparation program quality in part 612, and States have discretion to determine the weight assigned to that indicator in their assessment.

    While the new regulations will require financial aid offices to track and review additional information with respect to student eligibility for TEACH Grants, we do not agree that this would result in greater risk of incorrect packaging of financial aid. For an institution to begin and continue to participate in any title IV, HEA program, the institution must demonstrate to the Secretary that it is capable of administering that program under the standards of administrative capability provided under § 668.16 (Standards of administrative capability). An institution that does not meet administrative capability standards would not be eligible to disburse any title IV, HEA funds, including TEACH Grants. Moreover, institutions have always had to determine whether a student seeking a TEACH Grant is enrolled in a TEACH Grant-eligible program. The final regulations require the institution to be aware of whether any of the teacher preparation programs at the institution have been rated as low-performing or at-risk by the State when identifying which programs it offers are TEACH Grant-eligible programs.

    We disagree with comments asserting that the proposed regulations would grant States, rather than the Department, authority to determine TEACH Grant eligibility, which they claimed is a delegation of authority that Congress did not authorize. The HEA provides that an “eligible institution” for purposes of the TEACH Grant program is one “that the Secretary determines . . . provides high quality teacher preparation . . . .” The Secretary has determined that States are in the best position to assess the quality of teacher preparation programs located in their States, and it is reasonable for the Secretary to rely on the results of the State assessment required by section 207 of the HEA. We believe that it is appropriate to use the regulatory process to define how the Secretary determines that an institution provides high quality teacher preparation and that the final regulations reasonably amend the current requirements so that they are more meaningful.

    We also disagree with commenters that a State's strict requirements may make the TEACH Grant program unusable by institutions and thereby eliminate TEACH Grant funding for students at those institutions. We believe that States will conduct careful and reasonable assessments of teacher preparation programs located in their States, and we also believe if a State determines a program is not effective at providing teacher preparation, students should not receive TEACH Grants to attend that program.

    Regarding the recommendation that the regulations allow for professional judgment regarding TEACH Grant eligibility, there is no prohibition regarding the use of professional judgment for the TEACH Grant program, provided that all applicable regulatory requirements are met. With respect to the comment suggesting that the TEACH Grant program should mimic the Pell Grant program in annual aggregates, we note that, just as the Pell Grant program has its own annual aggregates, the TEACH Grant program has its own statutory annual award limits that must be adhered to. The HEA provides that an undergraduate or post-baccalaureate student may receive up to $4,000 per year, and § 686.3(a) provides that an undergraduate or post-baccalaureate student may receive the equivalent of up to four Scheduled Awards during the period required for completion of the first undergraduate baccalaureate program of study and the first post-baccalaureate program of study combined. For graduate students, the HEA provides up to $4,000 per year, and § 686.3(b) stipulates that a graduate student may receive the equivalent of up to two Scheduled Awards during the period required for the completion of the TEACH Grant-eligible master's degree program of study.
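    As a worked illustration of these limits (our arithmetic, assuming a full Scheduled Award of $4,000 and no reductions), the maximum aggregate TEACH Grant amounts are:

    \[ 4 \times \$4{,}000 = \$16{,}000 \text{ (undergraduate and post-baccalaureate study combined)} \]

    \[ 2 \times \$4{,}000 = \$8{,}000 \text{ (graduate study)} \]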

    Regarding the comment requesting a link to the TEACH Grant program via the studentloans.gov Web site, we do not believe that adding a link to the studentloans.gov Web site for TEACH Grants would be helpful; in fact, it could be confusing. This Web site is specific to loans, not grants. Only if a student does not fulfill the Agreement to Serve is the TEACH Grant converted to a Direct Unsubsidized Loan. The Web site already includes a link to the teach-ats.ed.gov Web site, where students can complete TEACH Grant counseling and the Agreement to Serve. The Department does provide information about the TEACH Grant program on its studentaid.ed.gov Web site.

    We disagree with the comment that the Department should focus specifically on issues or deficiencies with the TEACH Grant program and not connect any issues or deficiencies to reporting of teacher preparation programs under title II. The regulations are intended to improve the TEACH Grant program, in part, by operationalizing the definition of a high-quality teacher preparation program by connecting the definition to the ratings of teacher preparation programs under the title II reporting system. The regulations are not meant to address specific TEACH Grant program issues or program deficiencies.

    We decline to adopt the suggestion that an at-risk teacher preparation program should be given the opportunity and support to improve before any consequences, including those regarding TEACH Grants, are imposed. The HEA specifies that TEACH Grants may only be provided to high-quality teacher preparation programs, and we do not believe that a program identified as being at-risk should be considered a high-quality teacher preparation program. With respect to the comment that institutions in the specific commenter's State will remove themselves from participation in the TEACH Grant program rather than pursue high-stakes Federal requirements, we note that, while we cannot prevent institutions from ending their participation in the program, we believe that institutions understand the need for providing TEACH Grants to eligible students and that institutions will continue to try to meet that need. Additionally, we note that all institutions that enroll students receiving Federal financial assistance are required to submit an annual IRC under section 205(a) of the HEA, and that all States that receive funds under the HEA must submit an annual SRC. These provisions apply whether or not an institution participates in the TEACH Grant program.

    We agree with the commenters who recommended avoiding specific carve-outs for potential mathematics and science teachers. As discussed under the section titled “TEACH Grant-eligible STEM program,” we have removed the TEACH Grant-eligible STEM program definition from § 686.2 and deleted the term where it appeared elsewhere in part 686.

    Changes: None.

    § 686.42 Discharge of Agreement To Serve

    Comments: None.

    Discussion: Section 686.42(b) describes the procedure we use to determine a TEACH Grant recipient's eligibility for discharge of an agreement to serve based on the recipient's total and permanent disability. We intend this procedure to mirror the procedure outlined in § 685.213 which governs discharge of Direct Loans. We are making a change to § 686.42(b) to make the discharge procedures for TEACH Grants more consistent with the Direct Loan discharge procedures. Specifically, § 685.213(b)(7)(ii)(C) provides that the Secretary does not require a borrower to pay interest on a Direct Loan for the period from the date the loan was discharged until the date the borrower's obligation to repay the loan was reinstated. This idea was not clearly stated in § 686.42(b). We have added new § 686.42(b)(4) to explicitly state that if the TEACH Grant of a recipient whose TEACH Grant agreement to serve is reinstated is later converted to a Direct Unsubsidized Stafford Loan, the recipient will not be required to pay interest that accrued on the TEACH Grant disbursements from the date the agreement to serve was discharged until the date the agreement to serve was reinstated. Similarly, § 685.213(b)(7)(iii) describes the information that the Secretary's notification to a borrower in the event of reinstatement of the loan will include. We have amended § 686.42(b)(3) to make the TEACH Grant regulations more consistent with the Direct Loan regulations. Specifically, we removed proposed § 686.42(b)(3)(iii), which provided that interest accrual would resume on TEACH Grant disbursements made prior to the date of discharge if the agreement was reinstated.

    Changes: We have removed proposed § 686.42(b)(3)(iii) and added a new § 686.42(b)(4) to more clearly describe that, if the TEACH Grant of a recipient whose TEACH Grant agreement to serve is reinstated is later converted to a Direct Unsubsidized Stafford Loan, the recipient will not be required to pay interest that accrued on the TEACH Grant disbursements from the date the agreement to serve was discharged until the date the agreement to serve was reinstated. This change also makes the TEACH Grant regulation related to total and permanent disability more consistent with the Direct Loan discharge procedures.

    Executive Orders 12866 and 13563

    Regulatory Impact Analysis

    Under Executive Order 12866, the Secretary must determine whether this regulatory action is “significant” and, therefore, subject to the requirements of the Executive order and subject to review by the Office of Management and Budget (OMB). Section 3(f) of Executive Order 12866 defines a “significant regulatory action” as an action likely to result in a rule that may—

    (1) Have an annual effect on the economy of $100 million or more, or adversely affect a sector of the economy, productivity, competition, jobs, the environment, public health or safety, or State, local, or tribal governments or communities in a material way (also referred to as an “economically significant” rule);

    (2) Create serious inconsistency or otherwise interfere with an action taken or planned by another agency;

    (3) Materially alter the budgetary impacts of entitlement grants, user fees, or loan programs or the rights and obligations of recipients thereof; or

    (4) Raise novel legal or policy issues arising out of legal mandates, the President's priorities, or the principles stated in the Executive order.

    This final regulatory action is a significant regulatory action subject to review by OMB under section 3(f) of Executive Order 12866.

    We have also reviewed these regulations under Executive Order 13563, which supplements and explicitly reaffirms the principles, structures, and definitions governing regulatory review established in Executive Order 12866. To the extent permitted by law, Executive Order 13563 requires that an agency—

    (1) Propose or adopt regulations only on a reasoned determination that their benefits justify their costs (recognizing that some benefits and costs are difficult to quantify);

    (2) Tailor its regulations to impose the least burden on society, consistent with obtaining regulatory objectives and taking into account—among other things and to the extent practicable—the costs of cumulative regulations;

    (3) In choosing among alternative regulatory approaches, select those approaches that maximize net benefits (including potential economic, environmental, public health and safety, and other advantages; distributive impacts; and equity);

    (4) To the extent feasible, specify performance objectives, rather than the behavior or manner of compliance a regulated entity must adopt; and

    (5) Identify and assess available alternatives to direct regulation, including economic incentives—such as user fees or marketable permits—to encourage the desired behavior, or provide information that enables the public to make choices.

    Executive Order 13563 also requires an agency “to use the best available techniques to quantify anticipated present and future benefits and costs as accurately as possible.” The Office of Information and Regulatory Affairs of OMB has emphasized that these techniques may include “identifying changing future compliance costs that might result from technological innovation or anticipated behavioral changes.”

    We are issuing these final regulations only on a reasoned determination that their benefits justify their costs. In choosing among alternative regulatory approaches, we selected those approaches that maximize net benefits. Based on the analysis that follows, the Department believes that these regulations are consistent with the principles in Executive Order 13563.

    We also have determined that this regulatory action does not unduly interfere with State, local, or tribal governments in the exercise of their governmental functions.

    In this RIA we discuss the need for regulatory action, the potential costs and benefits, net budget impacts, assumptions, limitations, and data sources, as well as regulatory alternatives we considered. Although the majority of the costs related to information collection are discussed within this RIA, elsewhere in this document under Paperwork Reduction Act of 1995, we also identify and further explain burdens specifically associated with information collection requirements.

    1. Need for Regulatory Action

    Recent international assessments of student achievement have revealed that students in the United States are significantly behind students in other countries in science, reading, and mathematics.54 Although many factors influence student achievement, a large body of research has used value-added modeling to demonstrate that teacher quality is the largest in-school factor affecting student achievement.55 We use “value-added” modeling and related terms to refer to statistical methods that use changes in the academic achievement of students over time to isolate and estimate the effect of particular factors, such as family, school, or teachers, on changes in student achievement.56 One study found that the difference between having a teacher who performed at a level one standard deviation below the mean and a teacher who performed at a level one standard deviation above the mean was equivalent to student learning gains of a full year's worth of knowledge.57

    54 Kelly, D., Xie, H., Nord, C.W., Jenkins, F., Chan, J.Y., Kastberg, D. (2013). Performance of U.S. 15-Year-Old Students in Mathematics, Science, and Reading Literacy in an International Context: First Look at PISA 2012 (NCES 2014-024). Retrieved from U.S. Department of Education, National Center for Education Statistics Web site: http://nces.ed.gov/pubs2014/2014024rev.pdf.

    55 Sanders, W., Rivers, J.C. (1996). Cumulative and Residual Effects of Teachers on Future Student Academic Achievement. Retrieved from University of Tennessee, Value-Added Research and Assessment Center; Rivkin, S., Hanushek, E., & Kain, J. (2005). Teachers, Schools, and Academic Achievement. Econometrica, 73(2), 417-458; Rockoff, J. (2004). The Impact of Individual Teachers on Student Achievement: Evidence from Panel Data. American Economic Review, 94(2), 247-252.

    56 For more information on approaches to value-added modeling, see also: Braun, H. (2005). Using Student Progress to Evaluate Teachers: A Primer on Value-Added Models. Retrieved from http://files.eric.ed.gov/fulltext/ED529977.pdf; Sanders, W.J. (2006). Comparisons Among Various Educational Assessment Value-Added Models, Power of Two—National Value-Added Conference, Battelle for Kids, Columbus, OH. SAS, Inc.

    57 E. Hanushek. (1992). The Trade-Off between Child Quantity and Quality. Journal of Political Economy, 100(1), 84-117.
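    As a simplified illustration of the value-added approach described above (the notation is ours and is not drawn from the regulations or from the cited studies), a basic specification regresses a student's current achievement on prior achievement, observed student characteristics, and a teacher or preparation-program effect:

    \[ A_{it} = \beta A_{i,t-1} + \gamma X_{it} + \theta_{p(i,t)} + \varepsilon_{it} \]

    where \(A_{it}\) is student \(i\)'s achievement in year \(t\), \(X_{it}\) denotes observed student characteristics, \(\theta_{p(i,t)}\) is the estimated effect of the teacher (or of the program that prepared the teacher) assigned to the student in year \(t\), and \(\varepsilon_{it}\) is an error term. The estimated \(\theta\) terms are the “value added” attributed to teachers or to programs.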

    A number of factors are associated with teacher quality, including academic content knowledge, in-service training, and years of experience, but researchers and policymakers have begun to examine whether student achievement discrepancies can be explained by differences in the preparation their teachers received before entering the classroom.58 An influential study on this topic found that the effectiveness of teachers in public schools in New York City who were prepared through different teacher preparation programs varied in statistically significant ways, as shown by student growth estimated with value-added measures.59

    58 D. Harris & T. Sass. (2011). Teacher Training, Teacher Quality, and Student Achievement. Journal of Public Economics, 95(7-8), 798-812; D. Aaronson, L. Barrow, & W. Sanders. (2007). Teachers and Student Achievement in the Chicago Public High Schools. Journal of Labor Economics, 25(1), 95-135; D. Boyd, H. Lankford, S. Loeb, J. Rockoff, & J. Wyckoff. (2008). The Narrowing Gap in New York City Teacher Qualifications and Its Implications for Student Achievement in High-Poverty Schools. Journal of Policy Analysis and Management, 27(4), 793-818.

    59 D. Boyd, P. Grossman, H. Lankford, S. Loeb, & J. Wyckoff (2009). “Teacher Preparation and Student Achievement.” Education Evaluation and Policy Analysis, 31(4): 416-440.

    Subsequent studies have examined the value-added scores of teachers prepared through different teacher preparation programs in Missouri, Louisiana, North Carolina, Tennessee, and Washington.60 Many of these studies have found statistically significant differences between teachers prepared at different preparation programs. For example, State officials in Tennessee and Louisiana have worked with researchers to examine whether student achievement could be used to inform teacher preparation program accountability. After controlling for observable differences in students, researchers in Tennessee found that the most effective teacher preparation programs in that State produced graduates who were two to three times more likely than other novice teachers to be in the top quintile of teachers in a particular subject area, as measured by increases in the achievement of their students, with the least-effective programs producing teachers who were equally likely to be in the bottom quintile.61 Analyses based on Louisiana data on student growth linked to the programs that prepared students' teachers found some statistically significant differences in teacher effectiveness.62 Although the study's sample size was small, three teacher preparation programs produced novice teachers who appeared, on average, to be as effective as teachers with at least two years of experience, based on growth in student achievement in four or more content areas.63 A study analyzing differences between teacher preparation programs in Washington based on the value-added scores of their graduates also found a few statistically significant differences, which the authors argued were educationally meaningful.64 In mathematics, the average difference between teachers from the highest performing program and the lowest performing program was approximately 1.5 times the difference in performance between students eligible for free or reduced-price lunches and those who are not, while in reading the average difference was 2.3 times that gap.65

    60 Koedel, C., Parsons, E., Podgursky, M., & Ehlert, M. (2015). Teacher Preparation Programs and Teacher Quality: Are There Real Differences Across Programs? Education Finance and Policy, 10(4), 508-534; Campbell, S., Henry, G., Patterson, K., & Yi, P. (2011). Teacher Preparation Program Effectiveness Report. Carolina Institute for Public Policy; Goldhaber, D., & Liddle, S. (2013). The Gateway to the Profession: Assessing Teacher Preparation Programs Based on Student Achievement. Economics of Education Review, 34, 29-44.

    61 Tennessee Higher Education Commission. Report Card on the Effectiveness of Teacher Training Programs, 2010.

    62 Gansle, K., Noell, G., Knox, R.M., & Schafer, M.J. (2010). Value Added Assessment of Teacher Preparation Programs in Louisiana: 2007-2008 to 2009-2010: Overview of 2010-11 Results. Retrieved from Louisiana Board of Regents.

    63 Ibid.

    64 Goldhaber, D., & Liddle, S. (2013). The Gateway to the Profession: Assessing Teacher Preparation Programs Based on Student Achievement. Economics of Education Review, 34, 29-44.

    65 Ibid. 1.5 times the difference between students eligible for free or reduced price lunch is approximately 12 percent of a standard deviation, while 2.3 times the difference is approximately 19 percent of a standard deviation.

    In contrast to these findings, Koedel, et al. found very small differences in effectiveness between teachers prepared at different programs in Missouri.66 The vast majority of variation in teacher effectiveness was within programs, instead of between programs.67 However, the authors noted that the lack of variation between programs in Missouri could reflect a lack of competitive pressure to spur innovation within traditional teacher preparation programs.68 A robust evaluation system that included outcomes could spur innovation and increase differentiation between teacher preparation programs.69

    66 Koedel, C., Parsons, E., Podgursky, M., & Ehlert, M. (2015). Teacher Preparation Programs and Teacher Quality: Are There Real Differences Across Programs? Education Finance and Policy, 10(4), 508-534.

    67 Ibid.

    68 Ibid.

    69 Ibid.

    We acknowledge that there is debate in the research community about the specifications that should be used when conducting value-added analyses of the effectiveness of teachers prepared through different preparation programs,70 but also recognize that the field is moving in the direction of weighting value-added analyses in assessments of teacher preparation program quality.

    70 For a discussion of issues and considerations related to using school fixed effects models to compare the effectiveness of teachers from different teacher preparation programs who are working in the same school, see Lockwood, J.R., McCaffrey, D., Mihaly, K., & Sass, T. (2012). Where You Come From or Where You Go? Distinguishing Between School Quality and the Effectiveness of Teacher Preparation Program Graduates. (Working Paper 63). Retrieved from National Center for Analysis of Longitudinal Data in Education Research.

    Thus, despite the methodological debate in the research community, CAEP has developed new standards that require, among other measures, evidence that students completing a teacher preparation program positively impact student learning.71 The new standards are currently voluntary for the more than 900 education preparation providers who participate in the education preparation accreditation system. Participating institutions account for nearly 60 percent of the providers of educator preparation in the United States, and their enrollments account for nearly two-thirds of newly prepared teachers. The new CAEP standards will be required beginning in 2016.72 The standards are an indication that the effectiveness ratings of teachers trained through teacher preparation programs are increasingly being used as a way to evaluate teacher preparation program performance. The research on teacher preparation program effectiveness is relevant to the elementary and secondary schools that rely on teacher preparation programs to recruit and select talented individuals and prepare them to become future teachers. In 2011-2012 (the most recent year for which data are available), 203,701 individuals completed either a traditional teacher preparation program or an alternative route program. The National Center for Education Statistics (NCES) projects that by 2020, public and private schools will need to hire as many as 362,000 teachers each year due to teacher retirement and attrition and increased student enrollment.73 In order to meet the needs of public and private schools, States may have to expand traditional and alternative route programs to prepare more teachers, find new ways to recruit and train qualified individuals, or reduce the need for novice teachers by reducing attrition or developing different staffing models. Better information on the quality of teacher preparation programs will help States and LEAs make sound staffing decisions.

    71 CAEP 2013 Accreditation Standards. (2013). Retrieved from http://caepnet.files.wordpress.com/2013/09/final_board_approved1.

    72 Teacher Preparation: Ensuring a Quality Teacher in Every Classroom. Hearing before the Senate Committee on Health, Education, Labor and Pensions, 113th Cong. (2014) (Statement by Mary Brabeck).

    73 U.S. Department of Education (2015). Table 208.20. Digest of Education Statistics, 2014. Retrieved from National Center for Education Statistics.

    Despite research suggesting that the academic achievement of students may vary according to the teacher preparation program their teachers completed, analyses linking student achievement to teacher preparation programs have not been conducted and made publicly available for programs in all States. Congress has recognized the value of assessing and reporting on the quality of teacher preparation, and requires States and IHEs to report detailed information about the quality of teacher preparation programs in the State under the HEA. When reauthorizing the title II reporting system, members of Congress noted a goal of having teacher preparation programs explore ways to assess the impact of their programs' graduates on student academic achievement. In fact, the report accompanying the House Bill (H. Rep. 110-500) included the following statement, “[i]t is the intent of the Committee that teacher preparation programs, both traditional and those providing alternative routes to State certification, should strive to increase the quality of individuals graduating from their programs with the goal of exploring ways to assess the impact of such programs on student's academic achievement.”

    Moreover, in roundtable discussions and negotiated rulemaking sessions held by the Department, stakeholders repeatedly expressed concern that the current title II reporting system provides little meaningful data on the quality of teacher preparation programs or the impact of those programs' graduates on student achievement. The recent GAO report on teacher preparation programs noted that half or more of the States and teacher preparation programs surveyed said the current title II data collection was not useful for assessing their programs; and none of the surveyed school district staff said they used the data.74

    74 GAO at 26.

    Currently, States must annually calculate and report data on more than 400 data elements, and IHEs must report on more than 150 elements. While some information requested in the current reporting system is statutorily required, other elements—such as whether the IHE requires a personality test prior to admission—are not required by statute and do not provide information that is particularly useful to the public. Thus, stakeholders stressed at the negotiated rulemaking sessions that the current system is too focused on inputs and that outcome-based measures would provide more meaningful information.

    Similarly, even some of the statutorily-required data elements in the current reporting system do not provide meaningful information on program performance and how program graduates are likely to perform in a classroom. For example, the HEA requires IHEs to report both scaled scores on licensure tests and pass rates for students who complete their teacher preparation programs. Yet, research provides mixed findings on the relationship between licensure test scores and teacher effectiveness.75 This may be because most licensure tests were designed to measure the knowledge and skills of prospective teachers but not necessarily to predict classroom effectiveness.76 The predictive value of licensure exams is further eroded by the significant variation in State pass/cut scores on these exams, with many States setting pass scores at a very low level. The National Council on Teacher Quality found that every State except Massachusetts sets its pass/cut scores on content assessments for elementary school teachers below the average score for all test takers, and most States set pass/cut scores at the 16th percentile or lower.77 Further, even with low pass/cut scores, some States allow teacher candidates to take licensure exams multiple times. Some States also permit IHEs to exclude students who have completed all program coursework but have not passed licensure exams when the IHEs report pass rates on these exams for individuals who have completed teacher preparation programs under the current title II reporting system. This may explain, in part, why States and IHEs have reported over the past three years a consistently high average pass rate on licensure or certification exams ranging between 95 and 96 percent for individuals who completed traditional teacher preparation programs in the 2009-10 academic year.78

    75 Clotfelter, C., Ladd, H., & Vigdor, J. (2010). Teacher Credentials and Student Achievement: Longitudinal Analysis with Student Fixed Effects. Economics of Education Review, 26(6), 673-682; Goldhaber, D. (2007). Everyone's Doing It, But What Does Teacher Testing Tell Us about Teacher Effectiveness? The Journal of Human Resources, 42(4), 765-794; Buddin, R., & Zamarro, G. (2009). Teacher Qualifications and Student Achievement in Urban Elementary Schools. Journal of Urban Economics, 66, 103-115.

    76 Goldhaber, D. (2007). Everyone's Doing It, But What Does Teacher Testing Tell Us about Teacher Effectiveness? The Journal of Human Resources, 42(4), 765-794.

    77 National Council on Teacher Quality, State Teacher Policy Yearbook, 2011. Washington, DC: National Council on Teacher Quality (2011). For more on licensure tests, see U.S. Department of Education, Office of Planning, Evaluation, and Policy Development, Policy and Program Studies Service (2010), Recent Trends in Mean Scores and Characteristics of Test-Takers on Praxis II Licensure Tests. Washington, DC: U.S. Department of Education.

    78 Secretary's Tenth Report.

    Thus, while the current title II reporting system produces detailed and voluminous data about teacher preparation programs, the data do not convey a clear picture of program quality as measured by how program graduates will perform in a classroom. This lack of meaningful data prevents school districts, principals, and prospective teacher candidates from making informed choices, creating a market failure due to imperfect information.

    On the demand side, principals and school districts lack information about the past performance of teachers from different teacher preparation programs and may rely on inaccurate assumptions about the quality of teacher preparation programs when recruiting and hiring novice teachers. An accountability system that provides information about how teacher preparation program graduates are likely to perform in a classroom and how likely they are to stay in the classroom will be valuable to school districts and principals seeking to efficiently recruit, hire, train, and retain high-quality educators. Such a system can help to reduce teacher attrition, a particularly important problem because many novice teachers do not remain in the profession, with more than a quarter of novice teachers leaving the teaching profession altogether within three years of becoming classroom teachers.79 High teacher turnover rates are problematic because research has demonstrated that, on average, student achievement increases considerably as teachers gain experience during their first three to five years of teaching.80

    79 Ingersoll, R. (2003). Is There Really a Teacher Shortage? Retrieved from University of Washington Center for the Study of Teaching and Policy Web site: http://depts.washington.edu/ctpmail/PDFs/Shortage-RI-09-2003.pdf.

    80 Ferguson, R.F. & Ladd, H.F. (1996). How and why money matters: An analysis of Alabama schools. In H.F. Ladd (Ed.), Holding schools accountable: Performance-based education reform (pp. 265-298). Washington, DC: The Brookings Institution; Hanushek, E., Kain, J., O'Brien, D., & Rivkin, S. (2005). The Market for Teacher Quality (Working Paper No. 11154). Retrieved from National Bureau of Economic Research Web site: www.nber.org/papers/w11154; Gordon, R., Kane, T., & Staiger, D. (2006). Identifying Effective Teachers Using Performance on the Job; Clotfelter, C., Ladd, H., & Vigdor, J. (2007). How and Why Do Teacher Credentials Matter for Student Achievement? (Working Paper No. 2). Retrieved from National Center for Analysis of Longitudinal Data in Education Research; Kane, T., Rockoff, J., & Staiger, D. (2008). What does certification tell us about teacher effectiveness? Evidence from New York City. Economics of Education Review, 27(6), 615-631.

    On the supply side, when considering which program to attend, prospective teachers lack comparative information about the placement rates and effectiveness of a program's graduates. Teacher candidates may enroll in a program without the benefit of information on employment rates post-graduation, employer and graduate feedback on program quality, and, most importantly, without understanding how well the program prepares its candidates to be effective in the classroom. NCES data indicate that 66 percent of certified teachers who received their bachelor's degree in 2008 took out loans to finance their undergraduate education. These teachers borrowed an average of $22,905.81 The average base salary for full-time teachers with a bachelor's degree in their first year of teaching in public elementary and secondary schools is $38,490.82 Thus, two-thirds of prospective teacher candidates may incur debt equivalent to 60 percent of their starting salary in order to attend teacher preparation programs without access to reliable indicators of how well these programs will prepare them for classroom teaching or help them find a teaching position in their chosen field. A better accountability system with more meaningful information will enable prospective teachers to make more informed choices while also enabling and encouraging States, IHEs, and alternative route providers to monitor and continuously improve the quality of their teacher preparation programs.

    81 National Center for Education Statistics (2009). Baccalaureate and Beyond Longitudinal Study. Washington, DC: U.S. Department of Education.

    82 National Center for Education Statistics (2015). Digest of Education Statistics, 2014. Washington, DC: U.S. Department of Education (2015): Table 211.20.
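
    The debt-to-salary comparison above follows from a single division of the two figures cited in the paragraph preceding footnotes 81 and 82. The short Python calculation below is an illustrative sketch only and is not part of the regulations; the variable names are ours.

        # Illustrative check of the debt-to-starting-salary comparison cited above.
        average_undergraduate_debt = 22_905  # average amount borrowed by 2008 bachelor's recipients who became certified teachers
        average_starting_salary = 38_490     # average first-year base salary, full-time public school teachers with a bachelor's degree

        debt_to_salary_ratio = average_undergraduate_debt / average_starting_salary
        print(f"Debt as a share of starting salary: {debt_to_salary_ratio:.0%}")  # approximately 60 percent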

    The lack of meaningful data also prevents States from restricting program credentials to programs with the demonstrated ability to prepare more effective teachers, or accurately identifying low-performing and at-risk teacher preparation programs and helping these programs improve. Not surprisingly, States have not identified many programs as low-performing or at-risk based on the data currently collected. In the latest title II reporting requirement submissions, twenty-one States did not classify any teacher preparation programs as low-performing or at-risk.83 Of the programs identified by States as low-performing or at-risk, 28 were based in IHEs that participate in the Teacher Education Assistance for College and Higher Education (TEACH) Grant program. The GAO also found that some States were not assessing whether programs in their State were low performing at all.84 Since the beginning of Title II, HEA reporting in 2001, 29 States and territories have never identified a single IHE with an at-risk or low-performing teacher preparation program.85 Under the final regulations, however, every State will collect and report more meaningful information about teacher preparation program performance which will enable them to target scarce public funding more efficiently through direct support to more effective teacher preparation programs and State financial aid to prospective students attending those programs.

    83 Secretary's Tenth Report.

    84 GAO at 17.

    85 Secretary's Tenth Report.

    Similarly, under the current title II reporting system, the Federal government is unable to ensure that financial assistance for prospective teachers is used to help students attend programs with the best record for producing effective classroom teachers. The final regulations help accomplish this by ensuring that program performance information is available for all teacher preparation programs in all States and by restricting eligibility for Federal TEACH Grants to programs that are rated “effective.”

    Most importantly, elementary and secondary school students, including those students in high-need schools and communities who are disproportionately taught by recent teacher preparation program graduates, will be the ultimate beneficiaries of an improved teacher preparation program accountability system.86 Such a system better focuses State and Federal resources on promising teacher preparation programs while informing teacher candidates and potential employers about high-performing teacher preparation programs and enabling States to more effectively identify and improve low-performing teacher preparation programs.

    86 Several studies have found that inexperienced teachers are far more likely to be assigned to high-poverty schools, including Boyd, D., Lankford, H., Loeb, S., Rockoff, J., & Wyckoff, J. (2008). The Narrowing Gap in New York City Teacher Qualifications and Its Implications for Student Achievement in High-Poverty Schools. Journal of Policy Analysis and Management, 27(4), 793-818; Clotfelter, C., Ladd, H., Vigdor, J., & Wheeler, J. (2007). High Poverty Schools and the Distribution of Teachers and Principals. North Carolina Law Review, 85, 1345-1379; Sass, T., Hannaway, J., Xu, Z., Figlio, D., & Feng, L. (2010). Value Added of Teachers in High-Poverty Schools and Lower-Poverty Schools (Working Paper No. 52). Retrieved from National Center for Analysis of Longitudinal Data in Education Research at www.coweninstitute.com/wp-content/uploads/2011/01/1001469-calder-working-paper-52-1.pdf.

    Recognizing the benefits of improved information on teacher preparation program quality and associated accountability, several States have already developed and implemented systems that map teacher effectiveness data back to teacher preparation programs. The regulations help ensure that all States generate useful data that are accessible to the public to support efforts to improve teacher preparation programs.

    Brief Summary of the Regulations

    The Department's plan to improve teacher preparation has three core elements: (1) Reduce the reporting burden on IHEs while encouraging States to make use of data on teacher effectiveness to build an effective teacher preparation accountability system driven by meaningful indicators of quality (title II accountability system); (2) reform targeted financial aid for students preparing to become teachers by directing scholarship aid to students attending higher-performing teacher preparation programs (TEACH Grants); and (3) provide more support for IHEs that prepare high-quality teachers.

    The regulations address the first two elements of this plan. Improving institutional and State reporting and State accountability builds on the work that States like Louisiana and Tennessee have already started, as well as work that is underway in States receiving grants under Phase One or Two of the Race to the Top Fund.87 All of these States have, will soon have, or plan to have statewide systems that track the academic growth of a teacher's students by the teacher preparation program from which the teacher graduated and, as a result, will be better able to identify the teacher preparation programs that are producing effective teachers and the policies and programs that need to be strengthened to scale those effects.

    87 The applications and Scopes of Work for States that received a grant under Phase One or Two of the Race to the Top Fund are available online at: http://www2.ed.gov/programs/racetothetop/awards.html.

    Consistent with feedback the Department has received from stakeholders, under the regulations States must assess the quality of teacher preparation programs according to the following indicators: (1) Student learning outcomes of students taught by graduates of teacher preparation programs (as measured by aggregating learning outcomes of students taught by graduates of each teacher preparation program); (2) job placement and retention rates of these graduates (based on the number of program graduates who are hired into teaching positions and whether they stay in those positions); and (3) survey outcomes for surveys of program graduates and their employers (based on questions about whether or not graduates of each teacher preparation program are prepared to be effective classroom teachers).

    The regulations will help provide meaningful information on program quality to prospective teacher candidates, school districts, States, and IHEs that administer traditional teacher preparation programs and alternative routes to State certification or licensure programs. The regulations will make data available that also can inform academic program selection, program improvement, and accountability.

    During public roundtable discussions and subsequent negotiated rulemaking sessions, the Department consulted with representatives from the teacher preparation community, States, teacher preparation program students, teachers, and other stakeholders about the best way to produce more meaningful data on the quality of teacher preparation programs while also reducing the reporting burden on States and teacher preparation programs where possible. The regulations specify three types of outcomes States must use to assess teacher preparation program quality, but States retain discretion to select the most appropriate methods to collect and report these data. In order to give States and stakeholders sufficient time to develop these methods, the requirements of these regulations are implemented over several years.

    2. Discussion of Costs, Benefits, and Transfers

    The Department has analyzed the costs of complying with the final regulations. Due to uncertainty about the current capacity of States in some relevant areas and the considerable discretion the regulations will provide States (e.g., the flexibility States would have in determining who conducts the teacher and employer surveys), we cannot evaluate the costs of implementing the regulations with absolute precision. In the NPRM, the Department estimated that the total annualized cost of these regulations would be between $42.0 million and $42.1 million over ten years. However, based on public comments received, it became clear to us that this estimate created confusion. In particular, a number of commenters incorrectly interpreted this estimate as the total cost of the regulations over a ten year period. That is not correct. The estimates in the NPRM captured an annualized cost (i.e., between $42.0 million and $42.1 million per year over the ten year period) rather than a total cost (i.e., between $42.0 million and $42.1 million in total over ten years). In addition, these estimated costs reflected both startup and ongoing costs, so affected entities would likely see costs higher than these estimates in the first year of implementation and costs lower than these estimates in subsequent years. The Department believed that these assumptions were clearly outlined for the public in the NPRM; however, based on the nature of public comments received, we recognize that additional explanation is necessary.

    The Department has reviewed the comments submitted in response to the NPRM and has revised some assumptions in response to the information we received. We discuss specific public comments, where relevant, in the appropriate sections below. In general, we do not discuss non-substantive comments.

    A number of commenters expressed general concerns regarding the cost estimates included in the NPRM and indicated that implementing these regulations would cost far more than $42.0 million over ten years. As noted above, we believe most of these comments arose from a fundamental misunderstanding of the estimates presented in the NPRM. While several commenters attempted to provide alternate cost estimates, we note that many of these estimates were unreasonably high because they included costs for activities or initiatives that are not required by the regulations. For instance, in one alternate estimate (submitted jointly by the California Department of Education, the California Commission on Teacher Credentialing, and the California State Board of Education) cited by a number of commenters, over 95 percent of the costs outlined were due to non-required activities such as dramatically expanding State standardized assessments to all grades and subjects or completing time- and cost-intensive teacher evaluations of all teachers in the State in every year. Nonetheless, we have taken portions of those estimates into account where appropriate (i.e., where the alternate estimates reflect actual requirements of the final regulations) in revising our assumptions.

    In addition, some commenters argued that our initial estimates were too low because they did not include costs for activities not directly required by the regulations. These activities included making changes in State laws where those laws prohibited the sharing of data between State entities responsible for teacher certification and the State educational agency. Upon reviewing these comments, we have declined to include estimates of these potential costs. Such costs are difficult to quantify, as it is unclear how many States would be affected, how extensive the needed changes would be, or how much time and resources would be required on the part of State legislatures. Also, we believe that many States removed potential barriers in order to receive ESEA flexibility prior to the passage of ESSA, further minimizing the potential cost of legislative changes. To the extent that States do experience costs associated with these actions, or other actions not specifically required by the regulations and therefore not outlined below (e.g., costs associated with including more than the minimum number of participants in the consultation process described in § 612.4(c)), our estimates will not account for those costs.

    We have also updated our estimates using the most recently available wage rates from the Bureau of Labor Statistics. We have also updated our estimates of the number of teacher preparation programs and teacher preparation entities using the most recent data submitted to the Department in the 2015 title II data collection. While no commenters specifically addressed these issues, we believe that these updates will provide the most reasonable estimate of costs.

    Based on revised assumptions, the Department estimates that the total annualized cost of the regulations will be between $27.5 million and $27.7 million (see the Accounting Statement section of this document for further detail). This estimate is significantly lower than the total annualized cost estimated in the proposed rule. The largest driver of this decrease is the increased flexibility provided to States under § 612.5(a)(1)(ii), as explained below. To provide additional context, Table 4 presents estimates for IHEs, States, and LEAs in Year 1 and Year 5. These estimates are not annualized or calculated on a net present value basis, but instead represent real dollar estimates.

    Table 4—Estimated Costs by Entity Type in Years 1 and 5

    Entity type       Year 1          Year 5
    IHE               $4,800,050      $4,415,930
    State             $24,077,040     $16,111,570
    LEA               $5,859,820      $5,859,820
    Total             $34,736,910     $26,387,320
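
    As an illustrative cross-check of Table 4 (not part of the regulations), the following Python sketch sums the Year 1 and Year 5 estimates by entity type; all dollar figures are taken from the table above.

        # Cross-check of the column totals in Table 4.
        year_1 = {"IHE": 4_800_050, "State": 24_077_040, "LEA": 5_859_820}
        year_5 = {"IHE": 4_415_930, "State": 16_111_570, "LEA": 5_859_820}

        print(f"Year 1 total: ${sum(year_1.values()):,}")  # $34,736,910
        print(f"Year 5 total: ${sum(year_5.values()):,}")  # $26,387,320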

    Relative to these costs, the major benefit of the requirements, taken as a whole, will be better publicly available information on the effectiveness of teacher preparation programs that can be used by prospective students when choosing programs to attend; employers in selecting teacher preparation program graduates to recruit, train, and hire; States in making funding decisions; and teacher preparation programs themselves in seeking to improve.

    The following is a detailed analysis of the estimated costs of implementing the specific requirements, including the costs of complying with paperwork-related requirements, followed by a discussion of the anticipated benefits.88 The burden hours of implementing specific paperwork-related requirements are also shown in the tables in the Paperwork Reduction Act section of this document.

    88 Unless otherwise specified, all hourly wage estimates for particular occupation categories were taken from the May 2014 National Occupational Employment and Wage Estimates for Federal, State, and local government published by the Department of Labor's Bureau of Labor Statistics and available online at www.bls.gov/oes/current/999001.htm.

    Title II Accountability System (HEA Title II Regulations)

    Section 205(a) of the HEA requires that each IHE that provides a teacher preparation program leading to State certification or licensure report on a statutorily enumerated series of data elements for the programs it provides. Section 205(b) of the HEA requires that each State that receives funds under the HEA provide to the Secretary and make widely available to the public information on the quality of traditional and alternative route teacher preparation programs that includes not less than the statutorily enumerated series of data elements. The State must do so in a uniform and comprehensible manner, conforming with definitions and methods established by the Secretary. Section 205(c) of the HEA directs the Secretary to prescribe regulations to ensure the validity, reliability, accuracy, and integrity of the data submitted. Section 206(b) requires that IHEs provide assurance to the Secretary that their teacher training programs respond to the needs of LEAs, be closely linked with the instructional decisions novice teachers confront in the classroom, and prepare candidates to work with diverse populations and in urban and rural settings, as applicable. Consistent with these statutory provisions, the Department is issuing regulations to ensure that the data reported by IHEs and States are accurate. The following sections provide a detailed examination of the costs associated with each of the regulatory provisions.

    Institutional Report Card Reporting Requirements

    The regulations require that beginning on April 1, 2018, and annually thereafter, each IHE that conducts a traditional teacher preparation program or alternative route to State certification or licensure program and enrolls students receiving title IV, HEA funds, report to the State on the quality of its program using an IRC prescribed by the Secretary.

    Under the current IRC, IHEs typically report at the entity level, rather than the program level, such that an IHE that administers multiple teacher preparation programs typically gathers data on each of those programs, aggregates the data, and reports the required information as a single teacher preparation entity on a single report card. By contrast, the regulations generally require that States report on program performance at the individual program level. The Department originally estimated that the initial burden for each IHE to adjust its recordkeeping systems in order to report the required data separately for each of its teacher preparation programs would be four hours per IHE. Numerous commenters argued that this estimate was low. Several commenters argued that initial set-up would take 8 to 12 hours, while others argued that it would take 20 to 40 hours per IHE. While we recognize that the amount of time it will take to initially adjust their record-keeping systems will vary, we believe that the estimates in excess of 20 hours are too high, given that IHEs will only be adjusting the way in which they report data, rather than collecting new data. However, the Department found arguments in favor of both 8 hours and 12 hours to be compelling and reasonable. We believe that eight hours is a reasonable estimate for how long it will take to complete this process generally; and for institutions with greater levels of oversight, review, or complexity, this process may take longer. Without additional information about the specific levels of review and oversight at individual institutions, we assume that the amount of time it will take institutions to complete this work will be normally distributed between 8 and 12 hours, with a national average of 10 hours per institution. Therefore, the Department has upwardly revised its initial estimate of four hours to ten hours. In the most recent year for which data are available, 1,490 IHEs submitted IRCs to the Department, for an estimated one-time cost of $384,120.89

    89 Unless otherwise specified, for paperwork reporting requirements, we use a wage rate of $25.78, which is based on a weighted national average hourly wage for full-time Federal, State and local government workers in office and administrative support (75 percent) and managerial occupations (25 percent), as reported by the Bureau of Labor Statistics in the National Occupational Employment and Wage Estimates, May 2014.
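
    The one-time cost figure above follows from the revised hour estimate, the number of reporting IHEs, and the wage rate described in footnote 89. The following Python sketch is illustrative only; the variable names are ours, and the result is rounded to the nearest ten dollars as in the text.

        # Illustrative reproduction of the one-time IRC record-keeping adjustment cost.
        hours_per_ihe = 10        # revised estimate of one-time setup hours per IHE
        wage_rate = 25.78         # weighted hourly wage rate from footnote 89
        reporting_ihes = 1_490    # IHEs that submitted IRCs in the most recent year

        one_time_cost = hours_per_ihe * wage_rate * reporting_ihes
        print(f"${round(one_time_cost, -1):,.0f}")  # approximately $384,120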

    One commenter argued that institutions would have to make costly updates and upgrades to their existing information technology (IT) platforms in order to generate the required new reports. However, given that institutions will not be required to generate reports on any new data elements, but only disaggregate the data already being collected by program, and that we include cost estimates for making the necessary changes to their existing systems in order to generate reports in that way, we do not believe it would be appropriate to include additional costs associated with large IT purchases in this cost estimate.

    The Department further estimated that each of the 1,490 IHEs would need to spend 78 hours to collect the data elements required for the IRC for its teacher preparation programs. Several commenters argued that it would take longer than 78 hours to collect the data elements required for the IRC each year. The Department reviewed its original estimates in light of these comments and the new requirement for IHEs to identify, in their IRCs, whether each program met the definition of a teacher preparation program provided through distance education. Pursuant to that review, the Department has increased its initial estimate to 80 hours, for an annual cost of $3,072,980.

    We originally estimated that entering the required information into the information collection instrument would take 13.65 hours per entity. We currently estimate that, on average, it takes one hour for institutions to enter the data for the current IRC. The Department believed that it would take institutions approximately as long to complete the report for each program as it does currently for the entire entity. As such, the regulations would result in an additional burden equal to the time needed to complete all individual program-level reports minus the current entity-level time burden. In the NPRM, this estimate was based on an average of 14.65 teacher preparation programs per entity—22,312 IHE-based programs divided by 1,522 IHEs. Given that entities are already taking approximately one hour to complete the report, we estimated the time burden associated with this regulation at 13.65 hours (14.65 hours to complete individual program level reports minus one hour of current entity time burden). Based on the most recent data available, we now estimate an average of 16.40 teacher preparation programs per entity—24,430 IHE-based programs divided by 1,490 IHEs. This corresponds to an incremental burden of 15.40 hours per entity (16.40 hours to complete individual program-level reports minus one hour of current entity time burden), which results in a total cost of $591,550 to the 1,490 IHEs. One commenter stated that it would take a total of 140 hours to enter the required information into the information collection instrument. However, it appears that this estimate is based on an assumption that it would require 10 hours of data entry for each program at an institution. Given the number of data elements involved and our understanding of how long institutions have historically taken to complete data entry tasks, we believe this estimate is high, and that our revised estimate, as described above, is appropriate.

    The regulations also require that each IHE provide the information reported on the IRC to the general public by prominently and promptly posting the IRC on the IHE's Web site, and, if applicable, on the teacher preparation portion of the Web site. We originally estimated that each IHE would require 30 minutes to post the IRC. One commenter stated that this estimate was reasonable given the tasks involved, while two commenters argued that this was an underestimate. One of these commenters stated that posting data on the institutional Web site often involved multiple staff, which was not captured in the Department's initial estimate. Another commenter argued that this estimate did not take into account time for data verification, drafting of summary text to accompany the document, or ensuring compliance with the Americans with Disabilities Act (ADA). Given that institutions will simply be posting on their Web site the final IRC that was submitted to the Department, we assume that the document has already been reviewed by all necessary parties and that all included data have been verified prior to being submitted to the Department. As such, the requirement to post the IRC to the Web site should not incur any additional levels of review or data validation. Regarding ADA compliance, we assume the commenter was referring to the broad set of statutory requirements regarding accessibility of communications by entities receiving Federal funding. In general, it is our belief that the vast majority of institutions, when developing materials for public dissemination, already ensure that such materials meet government- and industry-recognized standards for accessibility. To the extent that they do not already do so, nothing in the regulations imposes additional accessibility requirements beyond those in the Rehabilitation Act of 1973, as amended, or the ADA. As such, while there may be accessibility-related work associated with the preparation of these documents that is not already within the standard procedures of the institution, such work is not a burden created by the regulations. Thus, we believe our initial estimate of 30 minutes is appropriate, for an annual cumulative cost of $19,210. The estimated total annual cost to IHEs to meet the requirements concerning IRCs would be $3,991,030.
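
    The annual IRC cost components discussed above can be reproduced from the per-IHE hour estimates and the wage rate in footnote 89. The following Python sketch is illustrative only (the grouping of tasks and the variable names are ours), with results rounded to the nearest ten dollars as in the text.

        # Illustrative reproduction of the annual IRC cost components.
        wage_rate = 25.78        # weighted hourly wage rate from footnote 89
        reporting_ihes = 1_490   # IHEs submitting IRCs

        annual_hours_per_ihe = {
            "data collection": 80.0,   # revised annual estimate for collecting IRC data elements
            "data entry": 15.40,       # 16.40 program-level reports minus one hour of current entity burden
            "posting the IRC": 0.5,    # posting the final IRC to the IHE's Web site
        }

        for task, hours in annual_hours_per_ihe.items():
            cost = hours * wage_rate * reporting_ihes
            print(f"{task}: ${round(cost, -1):,.0f}")
        # data collection: $3,072,980; data entry: $591,550; posting the IRC: $19,210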

    We note that several commenters, in response to the Supplemental NPRM, argued that institutions would experience increased compliance costs given new provisions related to teacher preparation programs provided through distance education. However, nothing in the Supplemental NPRM proposed changes to institutional burden under § 612.3. Under the final regulations, the only increased burden on IHEs with respect to teacher preparation programs provided through distance education is that they identify whether each of the teacher preparation programs they offer meet the definition in § 612.2. We believe that the additional two hours estimated for data collection above the Department's initial estimate provides more than enough time for IHEs to meet this requirement. We do not estimate additional compliance costs to accrue to IHEs as a result of provisions in this regulation related to teacher preparation programs provided through distance education.

    State Report Card Reporting Requirements

    Section 205(b) of the HEA requires each State that receives funds under the HEA to report annually to the Secretary on the quality of teacher preparation in the State, both for traditional teacher preparation programs and for alternative routes to State certification or licensure programs, and to make this report available to the general public. In the NPRM, the Department estimated that the 50 States, the District of Columbia, the Commonwealth of Puerto Rico, Guam, American Samoa, the United States Virgin Islands, the Commonwealth of the Northern Mariana Islands, and the Freely Associated States, which include the Republic of the Marshall Islands, the Federated States of Micronesia, and the Republic of Palau, would each need 235 hours to report the data required under the SRC.

    In response to the original NPRM, two commenters argued that this estimate was too low. Specifically, one commenter stated that, based on the amount of time their State has historically devoted to reporting the data in the SRC, it would take approximately 372.5 hours to complete. We note that not all States will be able to complete the reporting requirements in 235 hours and that some States, particularly those with more complex systems or more institutions, will take much longer. We also note that the State identified by the commenter in developing the 372.5 hour estimate meets both of those conditions—it uses a separate reporting structure to develop its SRC (one of only two States nationwide to do so), and has an above-average number of preparation programs. As such, it is reasonable to assume that this State would require more than the nationwide average amount of time to complete the process. Another commenter stated that the Department's estimates did not take into account the amount of time and potential staff resources needed to prepare and post the information. We note that there are many other aspects of preparing and posting the data that are not reflected in this estimate, such as collecting, verifying, and validating the data. We also note that this estimate does not take into account the time required to report on student learning outcomes, employment outcomes, or survey results. However, all of these estimates are included elsewhere in these cost estimates. We believe that, taken as a whole, all of these various elements appropriately capture the time and staff resources necessary to comply with the SRC reporting requirement.

    As proposed in the Supplemental NPRM, and as described in greater detail below, in these final regulations, States will be required to report on teacher preparation programs offered through distance education that produce 25 or more certified teachers in their State. The Department estimates that the reporting on these additional programs, in conjunction with the reduction in the total number of teacher preparation programs from our initial estimates in the NPRM, will result in a net increase in the time necessary to report the data required in the SRC from the 235 hours estimated in the NPRM to 243 hours, for an annual cost of $369,610.

    Section 612.4(a)(2) requires that States post the SRC on the State's Web site. Because all States already have at least one Web site in operation, we originally estimated that posting the SRC on an existing Web site would require no more than half an hour at a cost of $25.78 per hour. Two commenters suggested that this estimate was too low. One commenter argued that the Department's initial estimate did not take into account time to create Web-ready materials or to address technical errors. In general, the regulations do not require the SRC to be posted in any specific format and we believe that it would take a State minimal time to create a file that would be compliant with the regulations by, for example, creating a PDF containing the SRC. We were unable to determine from this comment the specific technical errors that the commenter was concerned about, but believe that enough States will need less than the originally estimated 30 minutes to post the SRC so that the overall average will not be affected if a handful of States encounter technical issues. Another commenter estimated that, using its current Web reporting system, it would take approximately 450 hours to initially set up the SRC Web site with a recurring 8 hours annually to update it. However, we note that the system the commenter describes is more labor intensive and includes more data analysis than the regulations require. While we recognize the value in States' actively trying to make the SRC data more accessible and useful to the public, we cannot accurately estimate how many States will choose to do more than the regulations require, or what costs they would encounter to do so. We have therefore opted to estimate only the time and costs necessary to comply with the regulations. As such, we retain our initial estimate of 30 minutes to post the SRC. For the 50 States, the District of Columbia, the Commonwealth of Puerto Rico, Guam, American Samoa, the United States Virgin Islands, the Commonwealth of the Northern Mariana Islands, and the Freely Associated States (the Republic of the Marshall Islands, the Federated States of Micronesia, and the Republic of Palau), the total annual estimated cost of meeting this requirement would be $760.
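
    The SRC reporting and posting figures above are consistent with 59 reporting jurisdictions (the 50 States, the District of Columbia, the Commonwealth of Puerto Rico, the four insular areas, and the three Freely Associated States); that count is our reading of the list above rather than a number stated in the text. The following Python sketch is illustrative only.

        # Illustrative reproduction of the annual SRC reporting and posting cost estimates.
        wage_rate = 25.78     # weighted hourly wage rate from footnote 89
        jurisdictions = 59    # assumed count of reporting jurisdictions (see note above)

        reporting_cost = 243 * wage_rate * jurisdictions   # 243 hours to report the SRC data
        posting_cost = 0.5 * wage_rate * jurisdictions     # 30 minutes to post the SRC online

        print(f"Annual SRC reporting cost: ${round(reporting_cost, -1):,.0f}")  # approximately $369,610
        print(f"Annual SRC posting cost: ${round(posting_cost, -1):,.0f}")      # approximately $760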

    Scope of State Reporting

    The costs associated with the reporting requirements in paragraphs (b) and (c) of § 612.4 are discussed in the following paragraphs. The requirements regarding reporting of a teacher preparation program's indicators of academic content knowledge and teaching skills do not apply to the insular areas of American Samoa, Guam, the Commonwealth of the Northern Mariana Islands, the U.S. Virgin Islands, the freely associated States of the Republic of the Marshall Islands, the Federated States of Micronesia, and the Republic of Palau. Given the small size and the limited resources and capacity of some of these areas, we believe that the cost to these insular areas of collecting and reporting data on these indicators would not be warranted.

    Number of Distance Education Programs

    As described in the Supplemental NPRM (81 FR 18808), the Department initially estimated that the portions of this regulation relating to reporting on teacher preparation programs offered through distance education would result in 812 additional reporting instances for States. A number of commenters acknowledged the difficulty in arriving at an accurate estimate of the number of teacher preparation programs offered through distance education that would be subject to reporting under the final regulation. However, those commenters also noted that, without a clear definition from the Department on what constitutes a teacher preparation program offered through distance education, it would be exceptionally difficult to offer an alternative estimate. No commenters provided alternate estimates. In these final regulations, the Department has adopted a definition of teacher preparation program offered through distance education. We believe that this definition is consistent with our initial estimation methodology and have no reason to adjust that estimate at this time.

    Reporting of Information on Teacher Preparation Program Performance

    Under § 612.4(b)(1), a State would be required to make meaningful differentiations in teacher preparation program performance using at least three performance levels—low-performing teacher preparation program, at-risk teacher preparation program, and effective teacher preparation program—based on the indicators in § 612.5, including student learning outcomes and employment outcomes for teachers in high-need schools. Because States would have the discretion to determine the weighting of these indicators, the Department assumes that States would consult with early adopter States or researchers to determine best practices for making such determinations and whether an underlying qualitative basis should exist for these decisions. The Department originally estimated that State higher education authorities responsible for making State-level classifications of teacher preparation programs would require at least 35 hours to discuss methods for ensuring that meaningful differentiations are made in their classifications. This initial estimate also included determining what it meant for particular indicators to be included “in significant part” and what constituted “satisfactory” student learning outcomes, which are not included in the final regulations.

    A number of commenters stated that 35 hours was an underestimate. Of the commenters that suggested alternative estimates, those estimates typically ranged from 60 to 70 hours (the highest estimate was 350 hours). Based on these comments, the Department believes that its original estimate would not have provided sufficient time for multiple staff to meet and discuss teacher preparation program quality in a meaningful way. As such, and given that these staff will be making decisions regarding a smaller range of issues, the Department is revising its estimate to 70 hours per State. We believe that this amount of time would be sufficient for staff to discuss and make decisions on these issues in a meaningful and purposeful way. To estimate the cost per State, we assume that the State employee or employees would likely be in a managerial position (with national average hourly earnings of $45.58), for a total one-time cost for each of the 50 States, the District of Columbia, and the Commonwealth of Puerto Rico of $165,910.
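
    The one-time figure above follows from the revised 70-hour estimate, the managerial wage rate, and the 52 jurisdictions listed (the 50 States, the District of Columbia, and the Commonwealth of Puerto Rico). The following Python sketch is illustrative only.

        # Illustrative reproduction of the one-time cost of establishing performance-level determinations.
        hours_per_state = 70       # revised estimate for discussing meaningful differentiations
        managerial_wage = 45.58    # national average hourly earnings for a State managerial position
        jurisdictions = 52         # 50 States, the District of Columbia, and Puerto Rico

        print(f"${round(hours_per_state * managerial_wage * jurisdictions, -1):,.0f}")  # approximately $165,910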

    Fair and Equitable Methods

    Section 612.4(c)(1) requires States to consult with a representative group of stakeholders to determine the procedures for assessing and reporting the performance of each teacher preparation program in the State. The regulations specify that these stakeholders must include, at a minimum, representatives of leaders and faculty of traditional teacher preparation programs and alternative routes to State certification or licensure programs; students of teacher preparation programs; LEA superintendents; local school board members; elementary and secondary school leaders and instructional staff; elementary and secondary school students and their parents; IHEs that serve high proportions of low-income students or students of color, or English learners; advocates for English learners and students with disabilities; officials of the State's standards board or other appropriate standards body; and a representative of at least one teacher preparation program provided through distance education. Because the final regulations do not prescribe any particular methods or activities, we expect that States will implement these requirements in ways that vary considerably, depending on their population and geography and any applicable State laws concerning public meetings.

    Many commenters stated that their States would likely adopt methods different from those outlined below. In particular, these commenters argued that their States would include more than the minimum number of participants we used for these estimates. In general, while States may opt to do more than what is required by the regulations, for purposes of estimating the cost, we have based the estimate on what the regulations require. If States opt to include more participants or consult with them more frequently or for longer periods of time, then the costs incurred by States and the participants would be higher.

    In order to estimate the cost of implementing these requirements, we assume that the average State will convene at least three meetings with at least the following representatives from required categories of stakeholders: One administrator or faculty member from a traditional teacher preparation program, one administrator or faculty member from an alternative route teacher preparation program, one student from a traditional or alternative route teacher preparation program, one teacher or other instructional staff, one representative of a small teacher preparation program, one LEA superintendent, one local school board member, one student in elementary or secondary school and one of his or her parents, one administrator or faculty member from an IHE that serves high percentages of low-income students or students of color, one representative of the interests of English learners, one representative of the interests of students with disabilities, one official from the State's standards board or other appropriate standards body, and one administrator or faculty from a teacher preparation program provided through distance education. We note that a representative of a small teacher preparation program and a representative from a teacher preparation program provided through distance education were not required stakeholders in the proposed regulations, but are included in these final regulations.

    To estimate the cost of participating in these meetings for the required categories of stakeholders, we initially assumed that each meeting would require four hours of each participant's time and used the following national average hourly wages for full-time State government workers employed in these professions: Postsecondary education administrators, $50.57 (4 stakeholders); elementary or secondary education administrators, $50.97 (1 stakeholder); postsecondary teachers, $45.78 (1 stakeholder); primary, secondary, and special education school teachers, $41.66 (1 stakeholder). For the official from the State's standards board or other appropriate standards body, we used the national average hourly earnings of $59.32 for chief executives employed by State governments. For the representatives of the interests of students who are English learners and students with disabilities, we used the national average hourly earnings of $62.64 for lawyers in educational services (including private, State, and local government schools). For the opportunity cost to the representatives of elementary and secondary school students, we used the Federal minimum wage of $7.25 per hour and the average hourly wage for all workers of $22.71. These wage rates could represent either the involvement of a parent and a student at these meetings, or a single representative from an organization representing their interests who has an above average wage rate (i.e., $29.96). We used the average hourly wage rate for all workers ($22.71) for the school board official. For the student from a traditional or alternative route teacher preparation program, we used the 25th percentile of hourly wage for all workers of $11.04. We also assumed that at least two State employees in managerial positions (with national average hourly earnings of $45.58) would attend each meeting, with one budget or policy analyst to assist them (with national average hourly earnings of $33.98).90

    90 Unless otherwise noted, all wage rates in this section are based on average hourly earnings as reported in the May 2014 National Occupational Employment and Wage Estimates from the Bureau of Labor Statistics available online at www.bls.gov/oes/current/oessrci.htm. Where hourly wages were unavailable, we estimated hourly wages using average annual wages from this source and the average annual hours worked from the National Compensation Survey, 2010.

    A number of commenters stated that this consultation process would take longer than the 12 hours in our initial estimate and that our estimates did not include time for preparation for the meetings or for participant travel. Alternate estimates from commenters ranged from 56 hours to 3,900 hours. Based on the comments we received, the Department believes that both States and participants may opt to meet for longer periods of time at each meeting or more frequently. However, we believe that many of the estimates from commenters were overestimates for an annual process. For example, the 3,900 hour estimate would require a commitment on the part of participants totaling 75 hours per week for 52 weeks per year. We believe this is highly unrealistic. However, we do recognize that States and interested parties may wish to spend a greater amount of time in the first year to discuss and establish the initial framework than we initially estimated. As such, we are increasing our initial estimate of 12 hours in the first year to 60 hours. We believe that this will allow adequate time for discussion of these important issues. We therefore estimate the cumulative cost to the 50 States, the District of Columbia, and Puerto Rico to be $2,385,900.
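
    The first-year consultation estimate above can be approximated from the hourly rates listed earlier, 60 hours of meetings, and 52 jurisdictions. The exact number of participants assigned to each rate is not spelled out in the text; the Python sketch below assumes five participants at the postsecondary administrator/faculty rate (an allocation that reproduces the published total) and is illustrative only.

        # Illustrative approximation of the first-year stakeholder consultation cost.
        # The assignment of five participants to the postsecondary administrator rate is an assumption.
        hourly_rates = [
            50.57, 50.57, 50.57, 50.57, 50.57,  # postsecondary administrators/faculty (assumed: five participants)
            50.97,                              # elementary/secondary education administrator (superintendent)
            45.78,                              # postsecondary teacher
            41.66,                              # elementary/secondary teacher or other instructional staff
            59.32,                              # State standards board official (chief executive rate)
            62.64, 62.64,                       # representatives of English learners and of students with disabilities
            7.25, 22.71,                        # elementary/secondary student and parent
            22.71,                              # local school board member
            11.04,                              # teacher preparation program student
            45.58, 45.58,                       # two State employees in managerial positions
            33.98,                              # budget or policy analyst
        ]

        first_year_hours = 60   # revised first-year consultation estimate
        jurisdictions = 52      # 50 States, the District of Columbia, and Puerto Rico

        total_cost = sum(hourly_rates) * first_year_hours * jurisdictions
        print(f"${round(total_cost, -2):,.0f}")  # approximately $2,385,900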

    We also recognize that, although the Department initially estimated this consultative process occurring only once every five years, States may wish to maintain a continuing consultation with these stakeholders. We believe that this engagement would take place over email, by conference call, or through an on-site meeting. We are therefore adding an estimated 20 hours per year for consulting with stakeholders in the intervening years, and we estimate that these additional consultations will cumulatively cost the 50 States, the District of Columbia, and Puerto Rico $690,110.

    States would also be required to report on the State-level rewards or consequences associated with the designated performance levels and on the opportunities they provide for teacher preparation programs to challenge the accuracy of their performance data and classification of the program. Costs associated with implementing these requirements are estimated in the discussion of annual costs associated with the SRC.

    Procedures for Assessing and Reporting Performance

    Under final § 612.4(b)(3), a State would be required to ensure that teacher preparation programs in the State are included on the SRC, but with some flexibility due to the Department's recognition that reporting on teacher preparation programs, particularly those consisting of a small number of prospective teachers, could present privacy and data validity concerns. See § 612.4(b)(5). The Department originally estimated that each State would need up to 14 hours to review and analyze applicable State and Federal privacy laws and regulations, as well as existing research on the practices of other States that set program size thresholds, in order to determine the most appropriate aggregation level and procedures for its own teacher preparation program reporting. Most of the comments the Department received on this estimate focused on the comparability of data across years and stated that this process would have to be conducted annually in order to reassess appropriate cut points. The Department agrees that comparability could be an issue in several instances, but is equally concerned with variability in the data induced solely by the small size of programs. As such, we believe providing States the flexibility to aggregate data across small programs is key to ensuring meaningful data for the public. Upon further review, the Department also recognized an error in the NPRM, in which we initially stated that this review would be a one-time cost. Contrary to that statement, our overall estimates in the NPRM included this cost on an annual basis. This review will likely take place annually to determine whether there are any necessary changes in law, regulation, or practice that need to be taken into consideration. As such, we are revising our statement to clarify that these costs will be reflected annually. However, because the error was limited to the original description of the burden estimate, this change does not substantively affect the underlying calculations.

    Two commenters stated that the Department's initial estimate seemed low given the amount of work involved and three other commenters stated that the Department's initial estimates were adequate. Another commenter stated that this process would likely take longer in his State. No commenters offered alternative estimates. For the vast majority of States, we continue to believe that 14 hours is a sufficient amount of time for staff to review and analyze the applicable laws and statutes. However, given the potential complexity of these issues, as raised by commenters, we recognize that there may be additional staff involved and additional meetings required for purposes of consultation. In order to account for these additional burdens where they may exist, the Department is increasing its initial estimate to 20 hours. We believe that this will provide sufficient time for review, analysis, and discussion of these important issues. This provides an estimated cost to the 50 States, the District of Columbia, and the Commonwealth of Puerto Rico of $51,750, based on the average national hourly earnings for a lawyer employed full-time by a State government ($49.76).
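
    As a rough check on this figure, the arithmetic reduces to a single multiplication; a minimal sketch in Python follows, assuming the 52 reporting jurisdictions and the rounding to the nearest $10 used elsewhere in this analysis.

        hours = 20
        lawyer_wage = 49.76     # State government lawyer, hourly
        jurisdictions = 52      # 50 States, the District of Columbia, and Puerto Rico
        print(hours * lawyer_wage * jurisdictions)  # 51,750.40, reported as $51,750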

    Required Elements of the State Report Card

    For purposes of reporting under § 612.4, each State will need to establish indicators that would be used to assess the academic content knowledge and teaching skills of the graduates of teacher preparation programs within its jurisdiction. At a minimum, States must base their assessments on student learning outcomes, employment outcomes, survey outcomes, and whether or not the program is accredited by a specialized accrediting agency recognized by the Secretary for accreditation of professional teacher education programs, or provides teacher candidates with content and pedagogical knowledge, and quality clinical preparation, and has rigorous teacher candidate exit qualifications.

    States are required to report these outcomes for teacher preparation programs within their jurisdiction, with the only exceptions being for small programs for which aggregation under § 612.4(b)(3)(ii) would not yield the program size threshold (or for a State that chooses a lower program size threshold, would not yield the lower program size threshold) for that program, and for any program where reporting data would lead to conflicts with Federal or State privacy and confidentiality laws and regulations.

    Student Learning Outcomes

    In § 612.5, the Department requires that States assess the performance of teacher preparation programs based in part on data on the aggregate learning outcomes of students taught by novice teachers prepared by those programs. States have the option of calculating these outcomes using student growth, a teacher evaluation measure that includes student growth, another State-determined measure relevant to calculating student learning outcomes, or a combination of the three. Regardless of how they determine student learning outcomes, States are required to link these data to novice teachers and their teacher preparation programs. In the NPRM, we used available sources of information to assess the extent to which States appeared to already have the capacity to measure student learning outcomes and estimated the additional costs States that did not currently have the capacity might incur in order to comply with the regulations. However, in these final regulations, the Department has expanded the definition of “teacher evaluation measure” and provided States with the discretion to use a State-determined measure relevant to calculating student learning outcomes, which they did not have in the proposed regulations. In our initial estimates, the Department assumed that only eight States would experience costs associated with measuring student learning outcomes. Of those, the Department noted that two already had annual teacher evaluations that included at least some objective evidence of student learning. For these two States, we estimated it would cost approximately $596,720 to comply with the proposed regulations. For the six remaining States, we estimated a cost of $16,079,390. We note that several commenters raised concerns about the specifics of some of our assumptions in making these estimates, particularly the amount of time we assumed it would take to complete the tasks we described. We outline and respond to those comments below. However, given the revised definition of “teacher evaluation measure,” the additional option for States to use a State-defined measure other than student growth or a teacher evaluation measure, and the measures that States are already planning to implement consistent with ESSA, we believe all States either already have in place a system for measuring student learning outcomes or are already planning to have one in place absent these regulations. As such, we no longer believe that States will incur costs associated with measuring student learning outcomes solely as a result of these regulations.

    Tested Grades and Subjects

    In the NPRM, we assumed that the States would not need to incur any additional costs to measure student growth for tested grades and subjects and would only need to link these outcomes to teacher preparation programs by first linking the students' teachers to the teacher preparation program from which they graduated. The costs of linking student learning outcomes to teacher preparation programs are discussed below. Several commenters stated that assuming no costs for teachers in tested grades and subjects was unrealistic because this estimate was based on assurances provided by States, rather than on an assessment of actual State practice. We recognize the commenters' point. States that have made assurances to provide these student growth data may not currently be providing this information to teachers and therefore will still incur a cost to do so. However, such cost and burden is not occurring as a result of the regulations, but as a result of prior assurances made by the States under other programs. In general, we do not include costs herein that arise from other programs or requirements, but only those that are newly created by the final rule. As such, we continue to estimate no new costs in this area for States to comply with this final rule.

    Non-Tested Grades and Subjects

    In the NPRM, we assumed that the District of Columbia, Puerto Rico, and the 42 States that had their requests for flexibility regarding specific requirements of the ESEA approved would not incur additional costs to comply with the proposed regulations. This was, in part, because the teacher evaluation measures that they agreed to implement as part of the flexibility would meet the definition of a “teacher evaluation measure” under the proposed regulations. Some commenters expressed doubt that there would be no additional costs for these States, and others cited costs associated with developing new assessments for all currently non-tested grades and subjects (totaling as many as 57 new assessments). We recognize that States likely incurred costs to implement statewide comprehensive teacher evaluations. However, those additional costs did not accrue to States as a result of the regulations, but instead as part of their efforts under flexibility agreements. Therefore, we do not include an analysis of costs for States that received ESEA flexibility herein. Additionally, as noted previously, the regulations do not require States to develop new assessments for all currently non-tested grades and subjects. Therefore, we do not include costs for such efforts in these estimates.

    To estimate, in the NPRM, the cost of measuring student growth for teachers in non-tested grades and subjects in the eight States that were not approved for ESEA flexibility, we divided the States into two groups—those that had annual teacher evaluations with at least some objective evidence of student learning outcomes and those that did not.

    For those States that did not have an annual teacher evaluation in place, we estimated that it would take approximately 6.85 hours of a teacher's time and 5.05 hours of an evaluator's time to measure student growth using student learning objectives. Two commenters stated that these were underestimates, specifically noting that certain student outcomes (e.g., in the arts) are process-oriented and would likely take longer. We recognize that it may be more time-intensive to develop student learning objectives to measure student growth in some subject areas. However, the Rhode Island model we used as a basis for these estimates was designed to be used across subject areas, including the arts. Further, we believe that both teachers and evaluators would have sufficient expertise in their content areas that they would be able to complete the activities outlined in the Rhode Island guidance in times approximating our initial estimates. As such, we continue to believe those estimates were appropriate for the average teacher.

    In fact, we believe that this estimate likely overstated the cost to States that already require annual evaluations of all novice teachers because many of these evaluations would already encompass many of the activities in the framework. The National Council on Teacher Quality has reported that two of the eight States that did not receive ESEA flexibility required annual evaluations of all novice teachers and that those evaluations included at least some objective evidence of student learning. In these States, we initially estimated that teachers and evaluators would need to spend only a combined three hours to develop and measure against student learning objectives for the 4,629 novice teachers in these States.

    Several commenters stated that their States did not currently have these data, and others argued that this estimate did not account for the costs of verifying the data. We understand that States may not currently have structures in place to measure student learning outcomes as defined in the proposed rules. However, we believe that the revisions in the final rule provide sufficient flexibility to States to ensure that they can meet the requirements of this section without incurring additional measurement costs as a result of compliance with this regulation. We have included costs for challenging data elsewhere in these estimates.

    Linking Student Learning Outcomes to Teacher Preparation Programs

    Whether using student scores on State assessments, teacher evaluation ratings, or other measures of student growth, under the regulations States must link the student learning outcomes data back to the teacher, and then back to that teacher's preparation program. The costs to States to comply with this requirement will depend, in part, on the data and linkages in their statewide longitudinal data system. Through the Statewide Longitudinal Data Systems (SLDS) program, the Department has awarded $575.7 million in grants to support data systems that, among other things, allow States to link student achievement data to individual teachers and to postsecondary education systems. Forty-seven States, the District of Columbia, and the Commonwealth of Puerto Rico have already received at least one grant under this program to support the development of these data systems, so we expect that the cost to these States of linking student learning outcomes to teacher preparation programs would be lower than for the remaining States.

    According to information from the SLDS program in June 2015, nine States currently link K-12 teacher data, including data on both teacher/administrator evaluations and teacher preparation programs, to K-12 student data. An additional 11 States and the District of Columbia are currently in the process of establishing this linkage, and ten States and the Commonwealth of Puerto Rico have plans to add this linkage to their systems during their SLDS grant. Based on this information, it appears that 30 States, the Commonwealth of Puerto Rico, and the District of Columbia either already have the ability to aggregate data on the achievement of students taught by program graduates and link those data back to teacher preparation programs or have committed to doing so; therefore, we do not estimate any additional costs for these States to comply with this aspect of the regulations. We note that, based on information from other Department programs and initiatives, a larger number of States currently make these linkages and would therefore incur no additional costs associated with the regulations. However, for purposes of this estimate, we use data from the SLDS program. As a result, these estimates are likely overestimates of the actual costs borne by States to make these data connections.

    During the development of the regulations, the Department consulted with experts familiar with the development of student growth models and longitudinal data systems. These experts indicated that the cost of calculating growth for students taught by individual teachers, and of aggregating these data according to the teacher preparation program that those teachers completed, would vary among States. For example, in States in which data on teacher preparation programs are housed within one or more separate postsecondary data systems that are not currently linked to data systems for elementary through secondary education students and teachers, these experts suggested that a reasonable estimate of the cost of additional staff or vendor time to link and analyze the data would be $250,000 per State. For States that already have data systems that include data from elementary through postsecondary education levels, we estimate that the cost of additional staff or vendor time to analyze the data would be $100,000. Since we do not know enough about the data systems in the remaining 20 States to determine whether they are likely to incur the higher or lower estimate of costs, we averaged the two figures. Accordingly, we estimate that the remaining 20 States will each need to incur an average cost of $175,000 to develop models that calculate growth for students taught by individual teachers and then link these data to teacher preparation programs, for a total cost of $3,500,000.
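
    A minimal sketch of this calculation, using only the figures stated above:

        high_cost = 250_000    # States with separate, unlinked postsecondary data systems
        low_cost = 100_000     # States with existing elementary-through-postsecondary data systems
        remaining_states = 20
        average_cost = (high_cost + low_cost) / 2   # 175,000 per State
        print(average_cost * remaining_states)      # 3,500,000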

    Several commenters stated that their States did not currently have the ability to make these linkages and that their data systems would have to be updated, and that, even in States that already have these linkages, updates to the systems may be required. We recognize that some States for which we assume no costs do not yet have the required functionality in their State data systems to make the links required under the regulations. However, as noted elsewhere, we reasonably rely on the assurances made by States that they are already planning to establish these links, and are not doing so as a result of the regulations. As a result, we do not estimate costs for those States here. With regard to States that already have systems with these links in place, we are not aware of any updates that will need to be made to any of these systems solely in order to comply with the regulations, and therefore estimate no additional costs to these States.

    Employment Outcomes

    The final regulations require States to report employment outcomes, including data on both the teacher placement rate and the teacher retention rate, and on the effectiveness of a teacher preparation program in preparing, placing, and supporting novice teachers consistent with local educational needs. We have limited information on the extent to which States currently collect and maintain data on placement and retention for individual teachers.

    Under § 612.4(b), States are required to report annually, for each teacher preparation program, on the teacher placement rate for traditional teacher preparation programs, the teacher placement rate calculated for high-need schools for all teacher preparation programs (whether traditional or alternative route), the teacher retention rate for all teacher preparation programs (whether traditional or alternative route), and the teacher retention rate calculated for high-need schools for all teacher preparation programs (whether traditional or alternative route). States are not required to report on the teacher placement rate for alternative route programs. The Department has defined the “teacher placement rate” as the percentage of recent graduates who have become novice teachers (regardless of retention) for the grade level, span, and subject area in which they were prepared. “High-need schools” is defined in § 612.2(d) by using the definition of “high-need school” in section 200(11) of the HEA. The regulations will give States discretion to exclude recent graduates from this measure if they are teaching in a private school, teaching in another State, teaching in a position that does not require State certification, enrolled in graduate school, or engaged in military service.

    Section 612.5(a)(2) and the definition of “teacher retention rate” in § 612.2 require a State to provide data on each teacher preparation program's teacher retention rate, by calculating, for each of the last three cohorts of novice teachers preceding the current title II reporting year, the percentage of those teachers who have been continuously employed as teachers of record in each year between their first year as a novice teacher and the current reporting year. For the purposes of this definition, a cohort of novice teachers is determined by the first year in which they were identified as a novice teacher by the State. High-need schools is defined in § 612.2 by using the definition of “high-need school” from section 200(11) of the HEA. The regulations give States discretion to exclude novice teachers from this measure if they are teaching in a private school or another State, enrolled in graduate school, or serving in the military. States also have the discretion to treat this rate differently for alternative route and traditional route providers.

    In its comments on the Department's Notice of Intention to Develop Proposed Regulations Regarding Teacher Preparation Reporting Requirements, the Data Quality Campaign reported that 50 States, the District of Columbia, and the Commonwealth of Puerto Rico all collect some certification information on individual teachers and that a subset of States collect the following specific information on teacher preparation or qualifications that is relevant to the requirements: Type of teacher preparation program (42 States), location of teacher preparation program (47 States), and year of certification (51 States).91

    91 ED's Notice of Intention to Develop Proposed Regulations Regarding Teacher Preparation Reporting Requirements: DQC Comments to Share Knowledge on States' Data Capacity. Retrieved from www.dataqualitycampaign.org/files/HEA%20Neg%20Regs%20formatted.pdf.

    Data from the SLDS program indicate that 24 States can currently link data on individual teachers with their teacher preparation programs, including information on their current certification status and placement. In addition, seven States are currently in the process of making these links, and 10 States plan to add this capacity to their data systems, but have not yet established the link and process for doing so. Because these States would also maintain information on the certification status and year of certification of individual teachers, we assume they would already be able to calculate the teacher placement and retention rates for novice teachers but may incur additional costs to identify recent graduates who are not employed in a full-time teaching position within the State. It should be possible to do this at minimal cost by matching rosters of recent graduates from teacher preparation programs against teachers employed in full-time teaching positions who received their initial certification within the last three years. Additionally, because States already maintain the necessary information in State databases to identify schools as “high-need,” we do not believe there would be any appreciable additional cost associated with adding “high-need” flags to any accounting of teacher retention or placement rates in the State.

    Several commenters stated that it was unrealistic to assume that any States currently had the information required under the regulations as the requirements were new. While we recognize that States may not have previously conducted these specific data analyses in the past, this does not mean that their systems are incapable of doing so. In fact, as outlined above, information available to the Department indicates that at least 24 States already have this capacity and that an additional 17 are in the process of developing it or plan to do so. Therefore, regardless of whether the specific data analysis itself is new, these States will not incur additional costs associated with the final regulations to establish that functionality.

    The remaining 11 States may need to collect additional information from teacher preparation programs and LEAs because they do not appear to be able to link information on the employment, certification, and teacher preparation programs of individual teachers. If it is not possible to establish this link using existing data systems, States may need to obtain some or all of this information from teacher preparation programs or from the teachers themselves. The American Association of Colleges for Teacher Education reported that, in 2012, 495 of 717 institutions (or about 70 percent) had begun tracking their graduates into job placements. Although only about half of those institutions had successfully obtained placement information, these efforts suggest that States may be able to take advantage of work already underway.92

    92 American Association of Colleges for Teacher Education (2013), The Changing Teacher Preparation Profession: A report from AACTE's Professional Education Data System (PEDS).

    A number of commenters stated that IHEs would experience substantial burden in obtaining this information from all graduates. We agree that teacher preparation programs individually tracking and contacting their recent graduates would be highly burdensome and inefficient. However, in the regulations, the reporting burden falls on States, rather than institutions. As such, we believe it would be inappropriate to assume data collection costs and reporting burdens accruing to institutions.

    For each of these 11 States, the Department originally estimated that 150 hours may be required at the State level to collect information about novice teachers employed in full-time teaching positions (including designing the data collection instruments, disseminating them, providing training or other technical assistance on completing the instruments, collecting the data, and checking their accuracy). Several commenters stated that the Department's estimates were too low. One commenter estimated that this process would take 350 hours. Another commenter indicated that his State takes approximately 100 hours to collect data on first-year teachers and that data collection on more cohorts would take more time. Generally, the Department believes that this sort of data collection is subject to economies of scale—that for each additional cohort on which data are collected in a given year, the average time and cost associated with each cohort will decrease. This belief arises from the fact that many of the costs associated with such a collection, such as designing the data request instruments and disseminating them, are largely fixed. As such, we do not think that collecting data on three cohorts will take three times as long as collecting data on one. We do recognize, however, that there could be wide variation across States depending on the complexity of their systems and the way in which they opt to collect these data. For example, a State that sends data requests to individual LEAs to query their own data systems will experience a much higher overall burden under this provision than one that sends data requests to a handful of analysts at the State level who perform a small number of queries on State databases. Because of this potentially wide variation in burden across States, it is difficult to accurately estimate an average. Based on public comment, we recognize that our initial estimate may have been too low; at the same time, we believe that States will make every effort to reduce the burdens associated with this provision. As such, we are increasing our estimate to 200 hours, with an expectation that this may vary widely across States. Using this estimate, we calculate a total annual cost to the 11 States of $112,130, based on the national average hourly wage for education administrators of $50.97.
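
    A minimal sketch of the revised calculation, assuming rounding to the nearest $10:

        hours_per_state = 200
        administrator_wage = 50.97   # education administrators, hourly
        states = 11
        print(hours_per_state * administrator_wage * states)  # 112,134, reported as $112,130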

    Teacher Preparation Program Characteristics

    Under § 612.5(a)(4) States are required to report whether each teacher preparation program in the State either: (a) Is accredited by a specialized accrediting agency recognized by the Secretary for accreditation of professional teacher education programs, or (b) provides teacher candidates with content and pedagogical knowledge and quality clinical preparation, and has rigorous teacher candidate exit standards. As discussed in greater detail in the Paperwork Reduction Act section of this document, we estimate that the total cost to the 50 States, the District of Columbia, and the Commonwealth of Puerto Rico of providing these assurances for the estimated 15,335 teacher preparation programs nationwide that States have already determined are accredited, based on previous title II reporting submissions, would be $790,670, assuming that 2 hours were required per teacher preparation program and using an estimated hourly wage of $25.78. Several commenters argued that these estimates did not accurately reflect the costs associated with seeking specialized accreditation. We agree with this statement. However, the regulations do not require programs to seek specialized accreditation. Thus, there would be no additional costs associated with this requirement for programs that are already seeking or have obtained specialized accreditation. If teacher preparation programs that do not currently have specialized accreditation decide to seek it, they would not be doing so because of a requirement in these regulations, and therefore, it would be inappropriate to include those costs here.
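
    A minimal sketch of this calculation, assuming rounding to the nearest $10:

        programs = 15_335
        hours_per_program = 2
        hourly_wage = 25.78
        print(programs * hours_per_program * hourly_wage)  # 790,672.60, reported as $790,670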

    Survey Outcomes

    The Department requires States to report—disaggregated for each teacher preparation program—qualitative and quantitative data from surveys of novice teachers and their employers in order to capture their perceptions of whether novice teachers who were prepared at a teacher preparation program in that State possess the skills needed to succeed in the classroom. The design and implementation of these surveys would be determined by the State, but we provide the following estimates of costs associated with possible options for meeting this requirement.

    Some States and IHEs currently survey graduates or recent graduates of teacher preparation programs. According to experts consulted by the Department, depending on the number of questions and the size of the sample, some of these surveys have been administered quite inexpensively. Oregon conducted a survey of a stratified random sample of approximately 50 percent of its teacher preparation program graduates and estimated that it cost $5,000 to develop and administer the survey and $5,000 to analyze and report the data. Since these data will be used to assess and publicly report on the quality of each teacher preparation program, we expect that the cost of implementing the proposed regulations is likely to be higher, because States may need to survey a larger sample of teachers and their employers in order to capture information on all teacher preparation programs.

    Another potential factor in the cost of the teacher and employer surveys would be the number and type of questions. We have consulted with researchers experienced in the collection of survey data, and they have indicated that it is important to balance the burden on the respondent with the need to collect adequate information. In addition to asking teachers and their employers whether graduates of particular teacher preparation programs are adequately prepared before entering the classroom, States may also wish to ask about course-taking and student teaching experiences, as well as to collect demographic information on the respondent, including information on the school environment in which the teacher is currently employed. Because the researchers we consulted stressed that teachers and their employers are unlikely to respond to a survey that requires more than 30 minutes to complete, we assume that the surveys would not exceed this length.

    Based on our consultation with experts and previous experience conducting surveys of teachers through evaluations of Department programs or policies, we originally estimated that it would cost the average State approximately $25,000 to develop the survey instruments, including instructions for the survey recipients. However, a number of commenters argued that these development costs were far too low. Alternate estimates provided by commenters ranged from $50,000 per State to $200,000, with the majority of commenters offering a $50,000 estimate. As such, the Department has revised its original estimate to $50,000. This provides a total cost to the 50 States, the District of Columbia, and the Commonwealth of Puerto Rico of $2,600,000. However, we recognize that the cost would be lower for States that identify an existing instrument that could be adapted or used for this purpose, potentially including survey instruments previously developed by other States.93 If States surveyed all individuals who completed teacher preparation programs in the previous year, we estimate that they would survey 180,744 teachers, based on the reported number of individuals completing teacher preparation programs, both traditional and alternative route programs, during the 2013-2014 academic year.

    93 The experts with whom we consulted did not provide estimates of the number of hours involved in the development of this type of survey. For the estimated burden hours for the Paperwork Reduction Act section, this figure represents 1,179 hours at an average hourly wage rate of $42.40, based on the hourly wage for faculty at a public IHE and statisticians employed by State governments.
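
    A minimal sketch of the revised development-cost calculation, together with a check that the burden-hour figure in the footnote is consistent with the $50,000 per-State estimate:

        development_cost_per_state = 50_000
        jurisdictions = 52    # 50 States, the District of Columbia, and Puerto Rico
        print(development_cost_per_state * jurisdictions)  # 2,600,000
        print(1_179 * 42.40)  # 49,989.60, roughly the $50,000 per-State figure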

    To estimate the cost of administering these surveys, we consulted researchers with experience conducting a survey of all recent graduates of teacher preparation programs in New York City.94 In order to meet the target of a 70 percent response rate for that survey, the researchers estimated that their cost per respondent was $100, which included a $25 incentive for respondents. We believe that it is unlikely that States will provide cash incentives for respondents to the survey, which yields an estimate of $75 per respondent. However, since the time of data collection in that survey, there have been dramatic advances in the availability and usefulness of online survey software, with a corresponding decrease in cost. As such, we believe that the $75 per respondent estimate may represent an extreme upper bound and may dramatically overestimate the costs associated with administering any such survey. For example, several prominent online survey companies offer survey hosting services for as little as $300 per year for unlimited questions and unlimited respondents. In the NPRM, using that total cost, assuming surveys administered and hosted by the State, and using the number of program graduates in 2013 (203,701), we estimated that the cost per respondent would range from $0.02 to $21.43, with an average cost per State of $0.97. We recognize that this estimate would represent an extreme lower bound and that many States are unlikely to see costs per respondent that low until the survey is fully integrated into existing systems. For example, States may be able to provide teachers with a mechanism, such as an online portal, to both verify their class rosters and complete the survey. Because teachers would be motivated to ensure that they were not evaluated based on the performance of students they did not teach, requiring novice teachers to complete the survey in order to access their class rosters would increase the response rate for the survey and allow novice teachers to select their teacher preparation program from a pull-down menu, reducing the amount of time required to link the survey results to particular programs. States could also have teacher preparation programs disseminate the novice teacher survey with other information for teacher preparation program alumni, or have LEAs disseminate the novice teacher survey during induction or professional development activities. We believe that, as States incorporate these surveys into other structures, data collection costs will decline dramatically toward the lower bound noted above.

    94 These cost estimates were based primarily on our consultation with a researcher involved in the development, implementation, and analysis of surveys of teacher preparation program graduates and graduates of alternative certification programs in New York City in 2004 as part of the Teacher Pathways Project. These survey instruments are available online at: www.teacherpolicyresearch.org/TeacherPathwaysProject/Surveys/tabid.
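
    The per-respondent lower bound cited above follows from dividing a flat annual hosting fee by the number of graduates surveyed in a State. The graduate counts below are assumptions chosen only to illustrate the endpoints of the reported range; they are not figures from the rule.

        hosting_fee = 300                    # flat annual fee for unlimited questions and respondents
        for graduates in (14, 15_000):       # assumed smallest-State and largest-State graduate counts
            print(round(hosting_fee / graduates, 2))  # 21.43 and 0.02, the bounds of the reported range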

    The California State School Climate Survey (CSCS) is one portion of the larger California School Climate, Health, & Learning Survey, designed to survey teachers and staff to address questions of school climate. While the CSCS is subsidized by the State of California, it is also offered to school districts outside of the State for a fee, ranging from $500 to $1,500 per district, depending on its enrollment size. Applying this cost structure to all school districts nationwide with enrollment (as outlined in the Department's Common Core of Data), we estimated in the NPRM that costs would range from a low of $0.05 per FTE teacher to $500 per FTE teacher with an average of $21.29 per FTE. However, these costs are inflated by single-school, single-teacher districts, which are largely either charter schools or small, rural school districts unlikely to administer separate surveys. When removing single-school, single-teacher districts, the average cost per respondent decreased to $12.27.

    Given the cost savings associated with online administration of surveys and the likelihood that States will fold these surveys into existing structures, we believe that many of these costs are likely overestimates of the actual costs that States will bear in administering these surveys. However, for purposes of estimating costs in this context, we use a rate of $30.33 per respondent, which represents a cost per respondent at the 85th percentile of the CSCS administration and well above the maximum administration cost for popular consumer survey software. One commenter stated that the Department's initial estimate was appropriate, but also suggested that, to reduce costs further, a survey could be administered less than annually, or only a subset of novice teachers could be surveyed. One commenter argued that this estimate was too low and provided an alternate estimate of aggregate costs for their State of $300,000 per year. We note, however, that this commenter's alternate estimate actually reflected a lower cost per respondent than the Department's initial estimate—approximately $25 per respondent compared to $30.33. Another commenter argued that administration of the survey would cost $100 per respondent. Some commenters also argued that administering the survey would require additional staff. Given the information discussed above, and given that public comment was divided on whether our estimate was too high, too low, or appropriate, we do not believe there is adequate reason to change our initial estimate of $30.33 per respondent. Some States may bear the administration costs by hiring additional staff, while others will contract with an outside entity for the administration of the survey. In either case, we believe our original estimates to be reasonable. Using that estimate, we estimate that, if States surveyed a combined sample of 180,744 teachers and an equivalent number of employers,95 with a response rate of 70 percent, the cumulative cost to the 50 States, the District of Columbia, and the Commonwealth of Puerto Rico of administering the survey would be $7,674,760.

    95 We note that, to the extent that multiple novice teachers are employed in the same school, there would be fewer employers surveyed than the estimates outlined above. However, for purposes of this estimate, we have assumed an equivalent number of employers. This assumption will result in an overestimate of actual costs.
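
    A minimal sketch of the administration-cost calculation, assuming (per footnote 95) one employer surveyed for each teacher and rounding the respondent count to a whole number:

        teachers = 180_744
        employers = 180_744            # assumed equal to the number of teachers (footnote 95)
        response_rate = 0.70
        cost_per_respondent = 30.33
        respondents = round((teachers + employers) * response_rate)   # 253,042
        print(respondents * cost_per_respondent)  # 7,674,763.86, reported as $7,674,760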

    If States surveyed all teacher preparation program graduates and their employers, assuming that both the teacher and employer surveys would take no more than 30 minutes to complete, that the employers are likely to be principals or district administrators, and a response rate of 70 percent of teachers and employers surveyed, the total estimated burden for 126,521 teachers and their 126,521 employers of completing the surveys would be $2,635,430 and $3,224,390 respectively, based on the national average hourly wage of $41.66 for elementary and secondary public school teachers and $50.97 for elementary and secondary school level administrators. These costs would vary depending on the extent to which a State determines that it can measure these outcomes based on a sample of novice teachers and their employers. This may depend on the distribution of novice teachers prepared by teacher preparation programs throughout the LEAs and schools within each State and also on whether or not some of this information is available from existing sources such as surveys of recent graduates conducted by teacher preparation programs as part of their accreditation process.
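
    A minimal sketch of the respondent-burden calculation, assuming a 30-minute survey and the wage rates stated above:

        respondents = round(180_744 * 0.70)   # 126,521 responding teachers (and as many employers)
        survey_hours = 0.5
        teacher_wage = 41.66
        administrator_wage = 50.97
        print(respondents * survey_hours * teacher_wage)        # 2,635,432.43, reported as $2,635,430
        print(respondents * survey_hours * administrator_wage)  # 3,224,387.69, reported as $3,224,390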

    One commenter stated that principals would be unlikely to complete these surveys unless paid to do so. We recognize that some administrators may see these surveys as a burden and may be less willing to complete these surveys. However, we believe that States will likely take this factor into consideration when designing and administering these surveys by either reducing the amount of time necessary to complete the surveys, providing a financial incentive to complete them, or incorporating the surveys into other, pre-existing instruments that already require administrator input. Some States may also simply make completion a mandatory part of administrators' duties.

    Annual Reporting Requirements Related to State Report Card

    As discussed in greater detail in the Paperwork Reduction Act section of this document, § 612.4 includes several requirements for which States must annually report on the SRC. Using an estimated hourly wage of $25.78, we estimate that the total cost for the 50 States, the District of Columbia, and the Commonwealth of Puerto Rico to report the following required information in the SRC would be: Classifications of teacher preparation programs ($370,280, based on 0.5 hours per 28,726 programs); assurances of accreditation ($98,830, based on 0.25 hours per 15,335 programs); State's weighting of the different indicators in § 612.5 ($340 annually, based on 0.25 hours per State); State-level rewards and consequences associated with the designated performance levels ($670 in the first year and $130 thereafter, based on 0.5 hours per State in the first year and 0.1 hours per State in subsequent years); method of program aggregation ($130 annually, based on 0.1 hours per State); and process for challenging data and program classification ($4,020 in the first year and $1,550 thereafter, based on 3 hours per State in the first year and 6 hours for 10 States in subsequent years).
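
    Each line item above is the product of a per-unit burden-hour figure, a count of programs or States, and the $25.78 hourly wage; a minimal sketch for the three largest first-year items follows, assuming rounding to the nearest $10.

        wage = 25.78
        print(0.5 * 28_726 * wage)   # program classifications: 370,278.14, reported as $370,280
        print(0.25 * 15_335 * wage)  # accreditation assurances: 98,834.08, reported as $98,830
        print(3 * 52 * wage)         # data challenge process, first year: 4,021.68, reported as $4,020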

    The Department's initial estimates also included costs associated with the examination of data collection quality (5.3 hours per State annually) and recordkeeping and publishing related to appeal decisions (5.3 hours per State). However, one commenter stated that the examination of data quality would require a high level of scrutiny and would take more time than was originally estimated, and that our estimate associated with recordkeeping and publishing was low. Additionally, several commenters responded generally to the overall cost estimates in the NPRM with concerns about data quality and review. In response to these general concerns, and upon further review, the Department believes that States are likely to engage in a more robust data quality review process in response to these regulations. Furthermore, we believe that the associated documentation and recordkeeping estimates may have been lower than those reasonably expected by States. As such, the Department has increased its estimate of the time required from the original 5.3 hours to 10 hours in both cases. These changes result in an estimated cost of $13,410 for each of the two components. The sum of these annual reporting costs would be $495,960 for the first year and $492,950 in subsequent years, based on cumulative burden of 19,238 hours in the first year and 19,121 hours in subsequent years.
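
    A minimal sketch of the revised estimates in this paragraph, assuming the 52 reporting jurisdictions and the $25.78 hourly wage used above:

        wage = 25.78
        print(10 * 52 * wage)   # data quality review (and, separately, recordkeeping): 13,405.60, reported as $13,410
        print(19_238 * wage)    # first-year burden hours in dollars: 495,955.64, reported as $495,960
        print(19_121 * wage)    # subsequent-year burden hours in dollars: 492,939.38, reported as $492,950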

    In addition, a number of commenters expressed concern that our estimates included time and costs associated with challenging data and program classification but did not reflect time and costs associated with allowing programs to actually review data in the SRC to ensure that the teachers attributed to them were actual recent program graduates. We agree that program-level review of these data may be necessary, particularly in the first few years, in order to ensure valid and reliable data. As such, we have revised our cost estimates to include time for programs to individually review data reports to ensure their accuracy. We assume that this review will largely consist of matching lists of recent teacher preparation program graduates with prepopulated lists provided by the State. Based on the number of program completers during the 2013-2014 academic year and the total number of teacher preparation programs in that year, we estimate the average program would review a list of 19 recent graduates (180,744 program completers per year, multiplied by three years and divided by 27,914 programs). As such, we do not believe this review will take a considerable amount of time. However, to ensure that we estimate sufficient time for this review, we estimate 1 hour per program, for a total cost for the 27,914 teacher preparation programs of $719,620.
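
    A minimal sketch of the program-review estimate, using the completer and program counts stated above:

        completers_per_year = 180_744
        programs = 27_914
        review_hours = 1
        wage = 25.78
        print(round(completers_per_year * 3 / programs))   # 19 recent graduates on the average review list
        print(programs * review_hours * wage)              # 719,622.92, reported as $719,620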

    Under § 612.5, States would also incur burden to enter the required aggregated information on student learning, employment, and survey outcomes into the information collection instrument for each teacher preparation program. Using the estimated hourly wage rate of $25.78, we estimate the following cumulative costs to the 50 States, the District of Columbia, and Puerto Rico to report on 27,914 teacher preparation programs and 812 teacher preparation programs provided through distance education: Annual reporting on student learning outcomes ($1,851,390 annually, based on 2.5 hours per program); annual reporting of employment outcomes ($2,591,950 annually, based on 3.5 hours per program); and annual reporting of survey outcomes ($740,560 annually, based on 1 hour per program).

    After publication of the NPRM, we recognized that our initial estimates did not include costs or burden associated with States' reporting data on any other indicators of academic content knowledge and teaching skills. To the extent that States use additional indicators not required by these regulations, we believe that they will choose to use indicators currently in place for identifying low-performing teacher preparation programs rather than instituting new indicators and new data collection processes. As such, we do not believe that States will incur any additional data collection costs. Additionally, we assume that transitioning reporting on these indicators from the entity level to the program level will result in minimal costs at the State level that are already captured elsewhere in these estimates. As such, we believe the only additional costs associated with these other indicators will be in entering the aggregated information into the information collection instrument. We assume that, on average, it will take States 1 hour per program to enter this information. States with no or few other indicators will experience much lower costs than those estimated here. Those States that use a large number of other indicators may experience higher costs than those estimated here, though we believe it is unlikely that the data entry process per program for these other indicators will exceed this estimate. As such, we estimate an annual cost to the 50 States, the District of Columbia, and Puerto Rico of $740,560 to report on other indicators of academic content knowledge and teaching skills.

    Our estimate of the total annual cost of reporting these outcome measures on the SRC related to § 612.5 is $5,924,460, based on 229,808 hours.
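
    A minimal sketch of how the per-program outcome reporting costs and the total above fit together, assuming the 28,726 programs (27,914 plus 812 distance education programs) and the hours per program stated in the preceding paragraphs:

        programs = 27_914 + 812     # 28,726, including distance education programs
        wage = 25.78
        hours_per_program = {"student learning": 2.5, "employment": 3.5, "survey": 1, "other indicators": 1}
        for item, hours in hours_per_program.items():
            print(item, programs * hours * wage)
        total_hours = programs * sum(hours_per_program.values())
        print(total_hours, total_hours * wage)  # 229,808 hours; 5,924,450.24, reported as $5,924,460 (the sum of the rounded items)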

    Potential Benefits

    The principal benefits related to the evaluation and classification of teacher preparation programs under the regulations are those resulting from the reporting and public availability of information on the effectiveness of teachers prepared by teacher preparation programs within each State. The Department believes that the information collected and reported as a result of these requirements will improve the accountability of teacher preparation programs, both traditional and alternative route to certification programs, for preparing teachers who are equipped to succeed in classroom settings and help their students reach their full potential.

    Research studies have found significant and substantial variation in teaching effectiveness among individual teachers and some variation has also been found among graduates of different teacher preparation programs.96 For example, Tennessee reports that some teacher preparation programs consistently report statistically significant differences in student learning outcomes for grades and subjects covered by State assessments over multiple years and meaningful differences in teacher placement and retention rates.97 Because this variation in the effectiveness of graduates is not associated with any particular type of preparation program, the only way to determine which programs are producing more effective teachers is to link information on the performance of teachers in the classroom back to their teacher preparation programs.98 The regulations do this by requiring States to link data on student learning outcomes, employment outcomes, and teacher and employer survey outcomes back to the teacher preparation programs, rating each program based on these data, and then making that information available to the public.

    96 See, for example: Boyd, D., Grossman, P., Lankford, H., Loeb, S., & Wyckoff, J. (2009). Teacher Preparation and Student Achievement. Education Evaluation and Policy Analysis, 31(4), 416-440.

    97 See Report Card on the Effectiveness of Teacher Training Programs, Tennessee 2014 Report Card. (n.d.). Retrieved from http://www.tn.gov/thec/article/report-card.

    98 Kane, T., Rockoff, J., & Staiger, D. (2008). What does certification tell us about teacher effectiveness? Evidence from New York City. Economics of Education Review, 27(6), 615-631.

    The Department recognizes that simply requiring States to assess the performance of teacher preparation programs and report this information to the public will not produce increases in student achievement, but it is an important part of a larger set of policies and investments designed to attract talented individuals to the teaching profession; prepare them for success in the classroom; and support, reward, and retain effective teachers. In addition, the Department believes that, once information on the performance of teacher preparation programs is more readily available, a variety of stakeholders will become better consumers of these data, which will ultimately lead to improved student achievement by influencing the behavior of States seeking to provide technical assistance to low-performing programs, IHEs engaging in deliberate self-improvement efforts, prospective teachers seeking to train at the highest quality teacher preparation programs, and employers seeking to hire the most highly qualified novice teachers.

    Louisiana has already adopted some of the proposed requirements and has begun to see improvements in teacher preparation programs. Based on data suggesting that the English Language Arts program at the University of Louisiana at Lafayette was producing teachers who were less effective than novice teachers prepared by other programs, Louisiana identified the program in 2008 as being in need of improvement and provided additional analyses of the qualifications of the program's graduates and of the specific areas where the students taught by program graduates appeared to be struggling.99 When data suggested that students struggled with essay questions, faculty from the elementary education program and the liberal arts department in the university collaborated to restructure the teacher education curriculum to include more writing instruction. Based on 2010-11 data, student learning outcomes for teachers prepared by this program are now comparable to those of other novice teachers in the State, and the program is no longer identified for improvement.100

    99 Sawchuk, S. (2012). Value Added Concept Proves Beneficial to Teacher Colleges. Retrieved from www.edweek.org.

    100 Gansle, K., Noell, G., Knox, R.M., Schafer, M.J. (2010). Value Added Assessment of Teacher Preparation Programs in Louisiana: 2007-2008 to 2009-2010 Overview of 2010-11 Results. Retrieved from Louisiana Board of Regents.

    This is one example, but it suggests that States can use data on student learning outcomes for graduates of teacher preparation programs to help these programs identify weaknesses and implement needed reforms in a reasonable amount of time. As more information becomes available and if the data indicate that some programs produce more effective teachers, LEAs seeking to hire novice teachers will prefer to hire teachers from those programs. All things being equal, aspiring teachers will elect to pursue their degrees or certificates at teacher preparation programs with strong student learning outcomes, placement and retention rates, survey outcomes, and other measures.

    TEACH Grants

    The final regulations link program eligibility for participation in the TEACH Grant program to the State assessment of program quality under 34 CFR part 612. Under §§ 686.11(a)(1)(iii) and 686.2(d), to be eligible to receive a TEACH Grant for a program, an individual must be enrolled in a high-quality teacher preparation program—that is, a program that is classified by the State as effective or higher in either or both of the October 2019 and October 2020 SRCs for the 2021-2022 title IV, HEA award year; or classified by the State as effective or higher in two out of the previous three years, beginning with the October 2020 SRC, for the 2022-2023 title IV, HEA award year, under 34 CFR 612.4(b). As noted in the NPRM, the Department estimates that approximately 10 percent of TEACH Grant recipients are not enrolled in teacher preparation programs, but are majoring in such subjects as STEM, foreign languages, and history. Under the final regulations, in a change from the NPRM and from the current TEACH Grant regulations, students would need to be in an effective teacher preparation program as defined in § 612.2, but those who pursue a dual major that includes a teacher preparation program would be eligible for a TEACH Grant. Additionally, institutions could design and designate programs that aim to develop teachers in STEM and other high-demand teaching fields, and that combine subject matter and teacher preparation courses, as TEACH Grant-eligible programs. Therefore, while we expect some reduction in TEACH Grant volume as detailed in the Net Budget Impacts section of this Regulatory Impact Analysis (RIA), we expect that many students interested in teaching STEM and other key subjects will still be able to receive TEACH Grants at some point in their postsecondary education.

    In addition to the referenced benefits of improved accountability under the title II reporting system, the Department believes that the regulations relating to TEACH Grants will also contribute to the improvement of teacher preparation programs. Linking program eligibility for TEACH Grants to the performance assessment by the States under the title II reporting system provides an additional factor for prospective students to consider when choosing a program and an incentive for programs to achieve a rating of effective or higher.

    In order to analyze the possible effects of the regulations on the number of programs eligible to participate in the TEACH Grant program and the amount of TEACH Grants disbursed, the Department analyzed data from a variety of sources. This analysis focused on teacher preparation programs at IHEs. This is because, under the HEA, alternative route programs offered independently of an IHE are not eligible to participate in the TEACH Grant program. For the purpose of analyzing the effect of the regulations on TEACH Grants, the Department estimated the number of teacher preparation programs based on data from the Integrated Postsecondary Education Data System (IPEDS) about program graduates in education-related majors as defined by the Category of Instructional Program (CIP) codes and award levels. For the purposes of this analysis, “teacher preparation programs” refers to programs in the relevant CIP codes that also have the IPEDS indicator flag for being a State-approved teacher education program.

    As detailed in the NPRM published December 3, 2014, in order to estimate how many programs might be affected by a loss of TEACH Grant eligibility, the Department had to estimate how many programs will be individually evaluated under the regulations, which encourage States to report on the performance of individual programs offered by IHEs rather than on the aggregated performance of programs at the institutional level as currently required. As before, the Department estimates that approximately 3,000 programs may be evaluated at the highest level of aggregation and approximately 17,000 could be evaluated if reporting is done at the most disaggregated level. Table 3 summarizes these two possible approaches to program definition that represent the opposite ends of the range of options available to the States. Based on IPEDS data, approximately 30 percent of programs defined at the six-digit CIP code level have at least 25 novice teachers when aggregated across three years, so States may, pursuant to the regulations, add one additional year to the analysis or aggregate programs with similar features to push more programs over the threshold. The actual number of programs at IHEs reported on will likely fall between these two points represented by Approach 1 and Approach 2. The final regulations define a teacher preparation program offered through distance education as a teacher preparation program at which at least 50 percent of the program's required coursework is offered through distance education and that, starting with the 2021-2022 award year and subsequent award years, is not classified as less than effective, based on 34 CFR 612.4(b), by the same State for two out of the previous three years or meets the exception from State reporting of teacher preparation program performance under 34 CFR 612.4(b)(3)(ii)(D) or (E). The exact number of these programs is uncertain, but in the Supplemental NPRM concerning teacher preparation programs offered through distance education, the Department estimated that 812 programs would be reported. Whatever the number of programs, the TEACH Grant volume associated with these schools is captured in the amounts used in our Net Budget Impacts discussion. In addition, as discussed earlier in the Analysis of Comments and Changes section, States will have to report on alternative certification teacher preparation programs that are not housed at IHEs, but these programs are not relevant for the analysis of effects on TEACH Grants because they are ineligible for title IV, HEA funds; accordingly, they are not included in Table 5.
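    As an illustrative sketch only (the regulations do not prescribe any particular methodology), the threshold check and the effect of adding a year of data can be expressed as follows; the function name and counts are hypothetical:

        # Illustrative sketch only; not a prescribed State methodology.
        def meets_threshold(novice_teachers_by_year, years=3, threshold=25):
            # Sum novice teachers over the most recent `years` years and compare the
            # total to the program size threshold (25 under the regulations).
            return sum(novice_teachers_by_year[-years:]) >= threshold

        # A program with 7, 9, and 8 novice teachers over the last three years (24 total)
        # falls short of the threshold; adding a fourth year with 6 pushes it over.
        assert not meets_threshold([6, 7, 9, 8], years=3)
        assert meets_threshold([6, 7, 9, 8], years=4)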

    Table 5—Teacher Preparation Programs at IHEs and TEACH Grant Program

                                            Approach 1     Approach 1      Approach 2     Approach 2
                                                 Total    TEACH Grant           Total    TEACH Grant
                                                         participating                  participating
    Public Total ......................          2,522          1,795          11,931          8,414
      4-year ..........................          2,365          1,786          11,353          8,380
      2-year or less ..................            157              9             578             34
    Private Not-for-Profit Total ......          1,879          1,212          12,316          8,175
      4-year ..........................          1,878          1,212          12,313          8,175
      2-year or less ..................              1                               3
    Private For-Profit Total ..........             67             39             250            132
      4-year ..........................             59             39             238            132
      2-year or less ..................              8                              12
    Total .............................          4,468          3,046          24,497         16,721

    Given the number of programs and their TEACH Grant participation status as described in Table 5, the Department examined IPEDS data and the Department's budget estimates for 2017 related to TEACH Grants to estimate the effect of the regulations on TEACH Grants beginning with the FY 2021 cohort, when the regulations would be in effect. Based on prior reporting, only 37 IHEs (representing an estimated 129 programs) were identified as having a low-performing or at-risk program in 2010, and 27 States have not identified any low-performing programs in 12 years. Given prior identification of such programs and the fact that the States would continue to control the classification of teacher preparation programs subject to analysis, the Department does not expect a large percentage of programs to be subject to a loss of eligibility for TEACH Grants. Therefore, the Department evaluated the effects on the amount of TEACH Grants disbursed and the number of recipients on the basis of the States classifying 3 percent, 5 percent, or 8 percent of programs as low-performing or at-risk. These results are summarized in Table 6. Ultimately, the number of programs affected is subject to the program definition, rating criteria, and program classifications adopted by the individual States, so the distribution of those effects is not known with certainty. However, the maximum effect, whatever the distribution, is limited by the amount of TEACH Grants made and the percentage of programs classified as low-performing and at-risk that participate in the TEACH Grant program. In the NPRM, the Department invited comments about the expected percentage of programs that will be found to be low-performing and at-risk. No specific comments were received, so the updated numbers based on the budget estimates for 2017 apply the same percentages as were used in the NPRM.

    Table 6—Estimated Effect in 2021 on Programs and TEACH Grant Amounts of Different Rates of Ineligibility

    [Percentage of low-performing or at-risk programs]

                                                                 3%            5%            8%
    Programs:
      Approach 1 ..................................             214           356           570
      Approach 2 ..................................             385           641         1,026
    TEACH Grant Recipients ........................           1,061         1,768         2,828
    TEACH Grant Amount at Low-Performing or
      At-Risk Programs ............................      $3,127,786    $5,212,977    $8,340,764
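    The recipient and dollar figures in Table 6 appear to follow from applying each assumed ineligibility rate to the 2021 baseline estimates of 35,354 awards and $104,259,546 in TEACH Grant volume shown later in Table 7. The following arithmetic sketch, which assumes that simple proportional relationship, reproduces those figures to within rounding:

        # Arithmetic sketch; assumes the recipient and dollar rows of Table 6 apply each
        # rate directly to the 2021 baseline of 35,354 awards and $104,259,546 in volume.
        baseline_awards = 35_354
        baseline_volume = 104_259_546
        for rate in (0.03, 0.05, 0.08):
            recipients = round(baseline_awards * rate)   # 1,061 / 1,768 / 2,828
            amount = round(baseline_volume * rate)       # $3,127,786 / $5,212,977 / $8,340,764
            print(f"{rate:.0%}: {recipients:,} recipients, ${amount:,}")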

    The estimated effects presented in Table 6 reflect assumptions about the likelihood of a program being ineligible and do not take into account the size of the program or participation in the TEACH Grant program. The Department has no program-level performance information and therefore treats all programs as equally likely to become ineligible for TEACH Grants. If, in fact, factors such as size or TEACH Grant participation were associated with high or low performance, the number of TEACH Grant recipients and TEACH Grant volume could deviate from these estimates.

    Whatever the amount of TEACH Grant volume at programs found to be ineligible, the effect on IHEs will be reduced from the full amounts represented by the estimated effects presented here as students could elect to enroll in other programs at the same IHE that retain eligibility because they are classified by the State as effective or higher. Another factor that would reduce the effect of the regulations on programs and students is that an otherwise eligible student who received a TEACH Grant for enrollment in a TEACH Grant-eligible program is eligible to receive additional TEACH Grants to complete the program, even if that program loses status as a TEACH Grant-eligible program.

    Several commenters expressed concern that linking TEACH Grant eligibility to the State's evaluation of the program would harm teacher development from, and availability to, poor and underserved communities. We believe that the pilot year, which provides some warning of program performance, the flexibility for States to develop their evaluation criteria, and the long history of programs performing above the at-risk or low-performing levels will reduce the possibility of this effect. The Department continues to expect that over time a large portion of the TEACH Grant volume now disbursed to students at programs that will be categorized as low-performing or at-risk will be shifted to programs that remain eligible. The extent to which this happens will depend on other factors affecting the students' enrollment decisions, such as in-State status, proximity to home or future employment locations, and the availability of programs of interest, but the Department believes that students will take into account a program's rating and the availability of TEACH Grants when looking for a teacher preparation program. As discussed in the Net Budget Impacts section of this RIA, the Department expects that the reduction in TEACH Grant volume will taper off as States identify low-performing and at-risk programs and those programs are improved or are no longer eligible for TEACH Grants. Because existing recipients will continue to have access to TEACH Grants, and incoming students will have notice and be able to consider the program's eligibility for TEACH Grants in making an enrollment decision, the reduction in TEACH Grant volume that is classified as a transfer from students at ineligible programs to the Federal government will be significantly reduced from the estimated range of approximately $3.0 million to approximately $8.0 million in Table 6 for the initial years the regulations are in effect. While we have no past experience with students' reaction to a designation of a program as low-performing and loss of TEACH Grant eligibility, we assume that, to the extent it is possible, students would choose to attend a program rated effective or higher. For IHEs, the effect of the loss of TEACH Grant funds will depend on the students' reaction and how many choose to enroll in an eligible program at the same IHE, choose to attend a different IHE, or make up for the loss of TEACH Grants by funding their program from other sources.

    The Department does not anticipate that many programs will lose State approval or financial support. If this does occur, IHEs with such programs would have to notify enrolled and accepted students immediately, notify the Department within 30 days, and disclose that information on their Web sites and in promotional materials. The Department estimates that 50 IHEs would offer programs that lose State approval or financial support and that it would take 5.75 hours to make the necessary notifications and disclosures at a wage rate of $25.78 per hour, for a total cost of $7,410. Finally, some of the programs that lose State approval or financial support may apply to regain eligibility for title IV, HEA funds upon improved performance and restoration of State approval or financial support. The Department estimates that 10 IHEs with such programs would apply for restored eligibility and that the process would require 20 hours at a wage rate of $25.78 per hour, for a total cost of $5,160.
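    The underlying arithmetic for these two estimates is sketched below for reference; the stated totals appear to reflect rounding of the products shown:

        # Arithmetic sketch of the notification and reinstatement cost estimates above.
        wage = 25.78
        notification_cost = 50 * 5.75 * wage   # 50 IHEs x 5.75 hours x $25.78 ~= $7,411.75
        reinstatement_cost = 10 * 20 * wage    # 10 IHEs x 20 hours x $25.78  ~= $5,156.00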

    3. Regulatory Alternatives Considered

    The final regulations were developed through a negotiated rulemaking process in which different options were considered for several provisions. Among the alternatives the Department considered were various ways to reduce the volume of information States and teacher preparation programs are required to collect and report under the existing title II reporting system. One approach would have been to limit State reporting to items that are statutorily required. While this would reduce the reporting burden, it would not address the goal of enhancing the quality and usefulness of the data that are reported. Alternatively, by focusing the reporting requirements on student learning outcomes, employment outcomes, and teacher and employer survey data, and also providing States with flexibility in the specific methods they use to measure and weigh these outcomes, the regulations balance the desire to reduce burden with the need for more meaningful information.

    Additionally, during the negotiated rulemaking session, some non-Federal negotiators spoke of the difficulty States would have developing the survey instruments, administering the surveys, and compiling and tabulating the results for the employer and teacher surveys. The Department offered to develop and conduct the surveys to alleviate additional burden and costs on States, but the non-Federal negotiators indicated that they preferred that States and teacher preparation programs conduct the surveys.

    One alternative considered in carrying out the statutory directive to direct TEACH Grants to “high quality” programs was to limit eligibility only to programs that States classified as “exceptional”, positioning the grants more as a reward for truly outstanding programs than as an incentive for low-performing and at-risk programs to improve. In order to prevent a program's eligibility from fluctuating year-to-year based on small changes in evaluation systems that are being developed and to keep TEACH Grants available to a wider pool of students, including those attending teacher preparation programs producing satisfactory student learning outcomes, the Department and most non-Federal negotiators agreed that programs rated effective or higher would be eligible for TEACH Grants.

    4. Net Budget Impacts

    The final regulations related to the TEACH Grant program are estimated to have a net budget impact of $0.49 million in cost reduction over the 2016 to 2026 TEACH Grant cohorts. These estimates were developed using the Office of Management and Budget's (OMB) Credit Subsidy Calculator. The OMB calculator takes projected future cash flows from the Department's student loan cost estimation model and produces discounted subsidy rates reflecting the net present value of all future Federal costs associated with awards made in a given fiscal year. Values are calculated using a “basket of zeros” methodology under which each cash flow is discounted using the interest rate of a zero-coupon Treasury bond with the same maturity as that cash flow. To ensure comparability across programs, this methodology is incorporated into the calculator and used Government-wide to develop estimates of the Federal cost of credit programs. Accordingly, the Department believes it is the appropriate methodology to use in developing estimates for these regulations. That said, in developing the following Accounting Statement, the Department consulted with OMB on how to integrate the Department's discounting methodology with the discounting methodology traditionally used in developing regulatory impact analyses.
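    A minimal sketch of the “basket of zeros” idea follows; the cash flows and zero-coupon rates shown are hypothetical placeholders, not inputs drawn from the Department's student loan cost estimation model or the OMB calculator:

        # Minimal sketch of "basket of zeros" discounting; all inputs are hypothetical.
        def net_present_value(cash_flows, zero_rates):
            # cash_flows[i] is the projected net cash flow arriving i + 1 years out;
            # zero_rates[i] is the yield of a zero-coupon Treasury with that maturity.
            return sum(cf / (1.0 + r) ** (i + 1)
                       for i, (cf, r) in enumerate(zip(cash_flows, zero_rates)))

        # Example: a $100 outlay in year 1 and a $40 recovery in year 5.
        npv = net_present_value([-100, 0, 0, 0, 40], [0.010, 0.012, 0.015, 0.018, 0.020])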

    Absent evidence of the impact of these regulations on student behavior, budget cost estimates were based on behavior as reflected in various Department data sets and longitudinal surveys. Program cost estimates were generated by running projected cash flows related to the provision through the Department's student loan cost estimation model. TEACH Grant cost estimates are developed across risk categories: Freshmen/sophomores at 4-year IHEs, juniors/seniors at 4-year IHEs, and graduate students. Risk categories have separate assumptions based on the historical pattern of the behavior of borrowers in each category—for example, the likelihood of default or the likelihood to use statutory deferment or discharge benefits.

    As discussed in the TEACH Grants section of the Discussion of Costs, Benefits, and Transfers section in this RIA, the regulations could result in a reduction in TEACH Grant volume. Under the effective dates and data collection schedule in the regulations, that reduction in volume would start with the 2021 TEACH Grant cohort. The Department assumes that the effect of the regulations would be greatest in the first years they were in effect as the low-performing and at-risk programs are identified, removed from TEACH Grant eligibility, and helped to improve or are replaced by better performing programs. Therefore, the percent of volume estimated to be at programs in the low-performing or at-risk categories is assumed to drop for future cohorts. As shown in Table 7, the net budget impact over the 2016-2026 TEACH Grant cohorts is approximately $0.49 million in reduced costs.

    Table 7—Estimated Budget Impact

    [PB 2017 baseline; the columns correspond to the 2021 through 2026 TEACH Grant cohorts]

                                       2021         2022         2023         2024         2025         2026
    TEACH Grant:
      Awards .................       35,354       36,055       36,770       37,498       38,241       38,999
      Amount .................  104,259,546  106,326,044  108,433,499  110,582,727  112,774,555  115,009,826
    Remaining Volume after Reduction from Change in TEACH Grants for STEM Programs:
      % ......................       92.00%       92.00%       92.00%       92.00%       92.00%       92.00%
      Awards .................       32,526       33,171       33,828       34,498       35,182       35,879
      Amount .................   95,918,782   97,819,960   99,758,819  101,736,109  103,752,591  105,809,040
    Low Performing and At Risk:
      % ......................        5.00%        3.00%        1.50%        1.00%        0.75%        0.50%
      Awards .................        1,626          995          507          345          264          179
      Amount .................    4,795,939    2,934,599    1,496,382    1,017,361      778,144      529,045
    Redistributed TEACH Grants:
      % ......................          75%          75%          75%          75%          75%          75%
      Amount .................    3,596,954    2,200,949    1,122,287      763,021      583,608      396,784
    Reduced TEACH Grant Volume:
      % ......................          25%          25%          25%          25%          25%          25%
      Amount .................    1,198,985      733,650      374,096      254,340      194,536      132,261
    Estimated Budget Impact of Policy:
      Subsidy Rate ...........       17.00%       17.16%       17.11%       16.49%       16.40%       16.24%
      Baseline Volume ........  104,259,546  106,326,044  108,433,499  110,582,727  112,774,555  115,009,826
      Revised Volume .........  103,060,561  105,592,394  108,059,403  110,328,387  112,580,019  114,877,565
      Baseline Cost ..........   17,724,123   18,245,549   18,552,972   18,235,092   18,495,027   18,677,596
      Revised Cost ...........   17,520,295   18,119,655   18,488,964   18,193,151   18,463,123   18,656,117
      Estimated Cost Reduction      203,827      125,894       64,008       41,941       31,904       21,479

    The estimated budget impact presented in Table 7 is defined against the PB 2017 baseline costs for the TEACH Grant program, and the actual volume of TEACH Grants in 2021 and beyond will vary. The budget impact estimate depends on the assumptions about the percent of TEACH Grant volume at programs that become ineligible and the share of that volume that is redistributed or reduced, as shown in Table 7. Finally, absent evidence of different rates of loan conversion at programs that will be eligible or ineligible for TEACH Grants when the regulations are in place, the Department did not assume a different loan conversion rate as TEACH Grants shifted to programs rated effective or higher. However, given that placement and retention rates are one element of the program evaluation system, the Department does hope that, as students shift to programs rated effective, more TEACH Grant recipients will fulfill their service obligations. If this is the case and their TEACH Grants do not convert to loans, those students will benefit from not having to repay converted loans, and the expected cost reductions for the Federal government may be reduced or reversed because more of the TEACH Grants will remain grants on which no repayment is made to the Federal government. The final regulations also change total and permanent disability discharge provisions related to TEACH Grants to be more consistent with the treatment of interest accrual for total and permanent disability discharges in the Direct Loan program. This is not expected to have a significant budget impact.
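    For reference, the following arithmetic sketch reproduces the first cohort column of Table 7 (assumed here to be the 2021 cohort) from the percentages shown in the table; it is illustrative only and does not replicate the Department's cost estimation model:

        # Arithmetic sketch of the first cohort column of Table 7; percentages are as shown.
        baseline_volume = 104_259_546
        remaining = 0.92 * baseline_volume            # after the TEACH Grants-for-STEM change
        low_performing = 0.05 * remaining             # volume at low-performing or at-risk programs
        reduced = 0.25 * low_performing               # share not redistributed to eligible programs
        revised_volume = baseline_volume - reduced    # ~103,060,561
        subsidy_rate = 0.17
        cost_reduction = subsidy_rate * reduced       # ~$203,827 for the first cohort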

    In addition to the TEACH Grant provision, the regulations include a provision that would make a program ineligible for title IV, HEA funds if the program was found to be low-performing and subject to the withdrawal of the State's approval or termination of the State's financial support. As noted in the NPRM, the Department assumes this will happen rarely and that the title IV, HEA funds involved would be shifted to other programs. Therefore, there is no budget impact associated with this provision.

    5. Accounting Statement

    As required by OMB Circular A-4 (available at www.whitehouse.gov/sites/default/files/omb/assets/omb/circulars/a004/a-4.pdf), in the following table we have prepared an accounting statement showing the classification of the expenditures associated with the provisions of these final regulations. This table provides our best estimate of the changes in annual monetized costs, benefits, and transfers as a result of the final regulations.

    Category: Benefits
      Better and more publicly available information on the effectiveness of teacher preparation programs: Not Quantified
      Distribution of TEACH Grants to better performing programs: Not Quantified

    Category: Costs (annualized, at 7% and 3% discount rates)
      Institutional Report Card (set-up, annual reporting, posting on Web site): $3,734,852 (7%); $3,727,459 (3%)
      State Report Card (Statutory requirements: Annual reporting, posting on Web site; Regulatory requirements: Meaningful differentiation, consulting with stakeholders, aggregation of small programs, teacher preparation program characteristics, other annual reporting costs): $3,653,206 (7%); $3,552,147 (3%)
      Reporting Student Learning Outcomes (develop model to link aggregate data on student achievement to teacher preparation programs, modifications to student growth models for non-tested grades and subjects, and measuring student growth): $2,317,111 (7%); $2,249,746 (3%)
      Reporting Employment Outcomes (placement and retention data collection directly from IHEs or LEAs): $2,704,080 (7%); $2,704,080 (3%)
      Reporting Survey Results (developing survey instruments, annual administration, and response costs): $14,621,104 (7%); $14,571,062 (3%)
      Reporting Other Indicators: $740,560 (7%); $740,560 (3%)
      Identifying TEACH Grant-eligible Institutions: $12,570 (7%); $12,570 (3%)

    Category: Transfers (annualized, at 7% and 3% discount rates)
      Reduced costs to the Federal government from TEACH Grants to prospective students at teacher preparation programs found ineligible: ($60,041) (7%); ($53,681) (3%)

    6. Final Regulatory Flexibility Analysis

    These regulations will affect IHEs that participate in the title IV, HEA programs, including TEACH Grants, alternative certification programs not housed at IHEs, States, and individual borrowers. The U.S. Small Business Administration (SBA) Size Standards define for-profit IHEs as “small businesses” if they are independently owned and operated, are not dominant in their field of operation, and have total annual revenue below $7,000,000. The SBA Size Standards define nonprofit IHEs as small organizations if they are independently owned and operated and not dominant in their field of operation, or as small entities if they are IHEs controlled by governmental entities with populations below 50,000. The revenues involved in the sector affected by these regulations, and the concentration of ownership of IHEs by private owners or public systems, mean that the number of title IV, HEA-eligible IHEs that are small entities would be limited but for the fact that nonprofit entities fit within the definition of a small organization regardless of revenue. The potential for some of the programs offered by entities subject to the final regulations to lose eligibility to participate in the title IV, HEA programs led to the preparation of this Final Regulatory Flexibility Analysis.

    Description of the Reasons That Action by the Agency Is Being Considered

    The Department has a strong interest in encouraging the development of highly trained teachers and ensuring that today's children have high-quality and effective teachers in the classroom, and it seeks to help achieve this goal through these final regulations. Teacher preparation programs have long operated without access to meaningful data that could inform them of the effectiveness of their graduates once those graduates go on to teach in the classroom.

    The Department wants to establish a teacher preparation feedback mechanism premised upon teacher effectiveness. Under the final regulations, an accountability system would be established that would identify programs by quality so that effective teacher preparation programs could be recognized and rewarded and low-performing programs could be supported and improved. Data collected under the new system will help all teacher preparation programs make necessary corrections and continuously improve, while facilitating States' efforts to reshape and reform low-performing and at-risk programs.

    We are issuing these regulations to better implement the teacher preparation program accountability and reporting system under title II of the HEA and to revise the regulations implementing the TEACH Grant program. Our key objective is to revise Federal reporting requirements, while reducing institutional burden, as appropriate. Additionally, we aim to have State reporting focus on the most important measures of teacher preparation program quality while tying TEACH Grant eligibility to assessments of program performance under the title II accountability system. The legal basis for these regulations is 20 U.S.C. 1022d, 1022f, and 1070g, et seq.

    The final regulations related to title II reporting affect a larger number of entities, including small entities, than the smaller number of entities that could lose TEACH Grant eligibility or title IV, HEA program eligibility. The Department has more data on teacher preparation programs housed at IHEs than on those independent of IHEs. Whether evaluated at the aggregated institutional level or the disaggregated program level, as described in the TEACH Grant section of the Discussion of Costs, Benefits, and Transfers section in this RIA as Approach 1 and Approach 2, respectively, State-approved teacher preparation programs are concentrated in the public and private not-for-profit sectors. For the provisions related to the TEACH Grant program, and using the institutional approach with a threshold of 25 novice teachers (or a lower threshold at the discretion of the State), the IHEs will be reporting for all of their programs. We estimate that approximately 56.4 percent of teacher preparation programs are at public IHEs (the vast majority of which would not be small entities) and 42.1 percent are at private not-for-profit IHEs. The remaining 1.5 percent are at private for-profit IHEs; of the for-profit IHEs with teacher preparation programs, approximately 18 percent reported FY 2012 total revenues under $7 million based on IPEDS data and are considered small entities. Table 8 summarizes the estimated number of teacher preparation programs offered at small entities.

    Table 8—Teacher Preparation Programs at Small Entities

                                          Total      Programs at    % of Total        Programs at
                                       programs    small entities   programs          TEACH Grant
                                                                     offered at       participating
                                                                     small entities   small entities
    Public:
      Approach 1 ..............           2,522              17              1                  14
      Approach 2 ..............          11,931              36              0                  34
    Private Not-for-Profit:
      Approach 1 ..............           1,879           1,879            100               1,212
      Approach 2 ..............          12,316          12,316            100               8,175
    Private For-Profit:
      Approach 1 ..............              67              12             18                   1
      Approach 2 ..............             250              38             15                  21

    Source: IPEDS.
    Note: Table includes programs at IHEs only.

    The Department has no indication that programs at small entities are more likely to be ineligible for TEACH Grants or title IV, HEA funds. Since all private not-for-profit IHEs are considered to be small entities because none are dominant in the field, we would expect about 5 percent of TEACH Grant volume at teacher preparation programs at private not-for-profit IHEs to be at ineligible programs. In AY 2014-15, approximately 43.7 percent of TEACH Grant disbursements went to private not-for-profit IHEs; applying that share to the estimated TEACH Grant volume in 2021 of $95,918,782, the Department estimates that TEACH Grant volume at private not-for-profit IHEs in 2021 would be approximately $42.0 million. At the 5 percent low-performing or at-risk rate assumed in the TEACH Grants portion of the Discussion of Costs, Benefits, and Transfers section of the Regulatory Impact Analysis, TEACH Grant revenues at programs at private not-for-profit entities would be reduced by approximately $2.1 million in the initial year the regulations are in effect and by a lesser amount thereafter. Much of this revenue could be shifted to eligible programs within the IHE or the sector, and the cost to programs would be greatly reduced by students substituting other sources of funds for the TEACH Grants.
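    The arithmetic behind these figures is sketched below for reference:

        # Arithmetic sketch of the private not-for-profit estimates above.
        volume_2021 = 95_918_782                     # estimated 2021 TEACH Grant volume (Table 7)
        pnp_share = 0.437                            # AY 2014-15 share of disbursements at these IHEs
        pnp_volume = pnp_share * volume_2021         # ~$41.9 million ("approximately $42.0 million")
        initial_year_reduction = 0.05 * pnp_volume   # ~$2.1 million at the 5 percent rate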

    In addition to the teacher preparation programs at IHEs included in Table 8, approximately 1,281 alternative certification programs offered outside of IHEs are subject to the reporting requirements in the regulations. The Department assumes that a significant majority of these programs are offered by non-profit entities that are not dominant in the field, so all of the alternative certification teacher preparation programs are considered to be small entities. However, the reporting burden for these programs falls on the States. As discussed in the Paperwork Reduction Act section of this document, the estimated total paperwork burden on IHEs would decrease by 66,740 hours. Small entities would benefit from this relief from the current institutional reporting requirements.

    The final regulations are unlikely to conflict with or duplicate existing Federal regulations.

    Paperwork Reduction Act of 1995

    The Paperwork Reduction Act of 1995 (PRA) does not require you to respond to a collection of information unless it displays a valid OMB control number. We display the valid OMB control numbers assigned to the collections of information in these final regulations at the end of the affected sections of the regulations.

    Sections 612.3, 612.4, 612.5, 612.6, 612.7, 612.8, and 686.2 contain information collection requirements. Under the PRA, the Department has submitted a copy of these sections, related forms, and Information Collection Requests (ICRs) to the Office of Management and Budget (OMB) for its review.

    The OMB control number associated with the regulations and related forms is 1840-0837. Due to changes described in the Discussion of Costs, Benefits, and Transfers section of the RIA, estimated burdens have been updated below.

    Start-Up and Annual Reporting Burden

    These regulations implement a statutory requirement that IHEs and States establish an information and accountability system through which IHEs and States report on the performance of their teacher preparation programs. Because parts of the regulations require IHEs and States to establish or scale up certain systems and processes in order to collect information necessary for annual reporting, IHEs and States may incur one-time start-up costs for developing those systems and processes. The burden associated with start-up and annual reporting is reported separately in this statement.

    Section 612.3 Reporting Requirements for the Institutional Report Cards

    Section 205(a) of the HEA requires that each IHE that provides a teacher preparation program leading to State certification or licensure report on a statutorily enumerated series of data elements for the programs it provides. The HEOA revised a number of the reporting requirements for IHEs.

    The final regulations under § 612.3(a) require that, beginning on April 1, 2018, and annually thereafter, each IHE that conducts traditional or alternative route teacher preparation programs leading to State initial teacher certification or licensure and that enrolls students receiving title IV, HEA funds report to the State on the quality of its programs using an IRC prescribed by the Secretary.

    Start-Up Burden

    Entity-Level and Program-Level Reporting

    Under the current IRC, IHEs typically report at the entity level rather than the program level. For example, if an IHE offers multiple teacher preparation programs in a range of subject areas (for example, music education and special education), that IHE gathers data on each of those programs, aggregates the data, and reports the required information as a single teacher preparation entity on a single report card. Under the final regulations and for the reasons discussed in the NPRM and the preamble to this final rule, reporting is now required at the teacher preparation program level rather than at the entity level. No additional data must be gathered as a consequence of this regulatory requirement; instead, IHEs will simply report the required data before, rather than after, aggregation.

    As a consequence, IHEs will not be required to alter appreciably their systems for data collection. However, the Department acknowledges that in order to communicate disaggregated data, minimal recordkeeping adjustments may be necessary. The Department estimates that initial burden for each IHE to adjust its recordkeeping systems will be 10 hours per entity. In the most recent year for which data are available, 1,490 IHEs reported required data to the Department through the IRC. Therefore, the Department estimates that the one-time total burden for IHEs to adjust recordkeeping systems will be 14,900 hours (1,490 IHEs multiplied by 10 burden hours per IHE).

    Subtotal of Start-Up Burden Under § 612.3

    The Department believes that IHEs' experience during prior title II reporting cycles has provided sufficient knowledge to ensure that IHEs will not incur any significant start-up burden, except for the change from entity-level to program-level reporting described above. Therefore, the subtotal of start-up burden for § 612.3 is 14,900 hours.

    Annual Reporting Burden

    Changes to the Institutional Report Card

    For a number of years IHEs have gathered, aggregated, and reported data on teacher preparation program characteristics, including those required under the HEOA, to the Department using the IRC approved under OMB control number 1840-0837. The required reporting elements of the IRC principally concern admissions criteria, student characteristics, clinical preparation, numbers of teachers prepared, accreditation of the program, and the pass rates and scaled scores of teacher candidates on State teacher certification and licensure examinations.

    The Department estimates that each IHE currently requires 146 hours to complete the IRC approved by OMB. Given all of the reporting changes under these final rules, as discussed in the NPRM, the Department estimates that each IHE will require 66 fewer burden hours to prepare the revised IRC annually, for an annual burden of 80 hours to complete the revised IRC (146 hours minus 66 hours in reduced data collection). The Department estimates that 1,490 IHEs would respond to the IRC required under the regulations, based on reporting figures from the most recent year for which data are available. Therefore, reporting data using the IRC would represent a total annual reporting burden of 119,200 hours (80 hours multiplied by 1,490 IHEs).
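    In summary form:

        # Arithmetic sketch of the revised IRC annual reporting burden estimate.
        hours_current_irc = 146
        hours_saved = 66
        hours_revised_irc = hours_current_irc - hours_saved   # 80 hours per IHE per year
        total_annual_hours = hours_revised_irc * 1_490        # 119,200 hours nationwide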

    Entity-Level and Program-Level Reporting

    As noted in the start-up burden section of § 612.3, under the current IRC, IHEs report teacher preparation program data at the entity level. The final regulations require that each IHE report disaggregated data at the teacher preparation program level. The Department believes this will not require any additional data collection or appreciably alter the time needed to calculate data reported to the Department. However, the Department believes that some additional reporting burden will exist for IHEs' electronic input and submission of disaggregated data because each IHE typically houses multiple teacher preparation programs.

    Based on the most recent year of data available, the Department estimates that there are 24,430 teacher preparation programs at 1,490 IHEs nationwide. Based on these figures, the Department estimates that, on average, each of these IHEs offers 16.40 teacher preparation programs. Because each IHE already collects disaggregated IRC data, the Department estimates it will take each IHE one additional hour to enter existing disaggregated data into the electronic IRC for each teacher preparation program it offers. Because IHEs already submit one IRC for the IHE as a whole, the added burden for reporting on a program level will be an average of 15.40 hours per IHE (an average of 16.40 programs at one hour per program, minus the one hour accounted for by the existing entity-level submission). There will therefore be an overall burden increase of 22,946 hours each year associated with this regulatory reporting requirement (15.40 hours multiplied by 1,490 IHEs).
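    For reference, the following sketch restates this calculation; it assumes the 24,430 programs reported at IHEs and follows the rounding used in the figures above:

        # Arithmetic sketch of the program-level reporting increment.
        programs_at_ihes = 24_430
        ihes = 1_490
        avg_programs_per_ihe = round(programs_at_ihes / ihes, 2)   # 16.40
        added_hours_per_ihe = avg_programs_per_ihe - 1             # 15.40 (one IRC is already submitted)
        total_added_hours = round(added_hours_per_ihe * ihes)      # 22,946 hours nationwide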

    Posting on the Institution's Web site

    The regulations also require that the IHE provide the information reported on the IRC to the general public by prominently and promptly posting the IRC information on the IHE's Web site. Because the Department believes it is reasonable to assume that an IHE offering a teacher preparation program and communicating data related to that program by electronic means maintains a Web site, the Department presumes that posting such information to an already-existing Web site will represent a minimal burden increase. The Department therefore estimates that IHEs will require 0.5 hours (30 minutes) to meet this requirement. This would represent a total burden increase of 745 hours each year for all IHEs (0.5 hours multiplied by 1,490 IHEs).

    Subtotal of Annual Reporting Burden Under § 612.3

    Aggregating the annual burdens calculated under the preceding sections results in the following burdens: Together, all IHEs would incur a total burden of 119,200 hours to complete the revised IRC, 22,946 hours to report program-level data, and 745 hours to post IRC data to their Web sites. This would constitute a total annual burden of 142,891 hours nationwide.

    Total Institutional Report Card Reporting Burden

    Aggregating the start-up and annual burdens calculated under the preceding sections results in the following burdens: Together, all IHEs would incur a total start-up burden under § 612.3 of 14,900 hours and a total annual reporting burden under § 612.3 of 142,891 hours. This would constitute a total of 157,791 burden hours under § 612.3 nationwide.

    The burden estimate for the existing IRC approved under OMB control number 1840-0837 was 146 hours for each IHE with a teacher preparation program. When the current IRC was established, the Department estimated that 1,250 IHEs would provide information using the electronic submission of the form for a total burden of 182,500 hours for all IHEs (1,250 IHEs multiplied by 146 hours). Applying these estimates to the current number of IHEs that are required to report (1,490) would constitute a burden of 217,540 hours (1,490 IHEs multiplied by 146 hours). Based on these estimates, the revised IRC would constitute a net burden reduction of 59,749 hours nationwide (217,540 hours minus 157,791 hours).
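    The comparison reduces to the following arithmetic:

        # Arithmetic sketch of the net IRC burden comparison.
        prior_burden = 1_490 * 146                     # 217,540 hours under the current IRC
        revised_burden = 14_900 + 142_891              # start-up plus annual burden = 157,791 hours
        net_reduction = prior_burden - revised_burden  # 59,749 hours nationwide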

    Section 612.4 Reporting Requirements for the State Report Card

    Section 205(b) of the HEA requires that each State that receives funds under the HEA provide to the Secretary and make widely available to the public not less than the statutorily required specific information on the quality of traditional and alternative route teacher preparation programs. The State must do so in a uniform and comprehensible manner, conforming with definitions and methods established by the Secretary. Section 205(c) of the HEA directs the Secretary to prescribe regulations to ensure the validity, reliability, accuracy, and integrity of the data submitted. Section 206(b) requires that IHEs assure the Secretary that their teacher training programs respond to the needs of LEAs, be closely linked with the instructional decisions novice teachers confront in the classroom, and prepare candidates to work with diverse populations and in urban and rural settings, as applicable.

    Implementing the relevant statutory directives, the regulations under § 612.4(a) require that, starting October 1, 2019, and annually thereafter, each State report on the SRC the quality of all approved teacher preparation programs in the State, whether or not they enroll students receiving Federal assistance under the HEA, including distance education programs. This new SRC, to be implemented in 2019, is an update of the current SRC. The State must also make the SRC information widely available to the general public by posting the information on the State's Web site.

    Section 103(20) of the HEA and § 612.2(d) of the proposed regulations define “State” to include nine locations in addition to the 50 States: The Commonwealth of Puerto Rico, the District of Columbia, Guam, American Samoa, the United States Virgin Islands, the Commonwealth of the Northern Mariana Islands, and the Freely Associated States, which include the Republic of the Marshall Islands, the Federated States of Micronesia, and the Republic of Palau. For this reason, all reporting required of States explicitly enumerated under § 205(b) of the HEA (and the related portions of the regulations, specifically §§ 612.4(a) and 612.6(b)) applies to these 59 States. However, certain additional regulatory requirements (specifically §§ 612.4(b), 612.4(c), 612.5, and 612.6(a)) apply only to the 50 States of the Union, the Commonwealth of Puerto Rico, and the District of Columbia. The burden estimates under those portions of this report apply to those 52 States. For a full discussion of the reasons for the application of certain regulatory provisions to different States, see the preamble to the NPRM.

    Entity-Level and Program-Level Reporting

    As noted in the start-up and annual burden sections of § 612.3, under the current information collection process, data are collected at the entity level, and the final regulations require data reporting at the program level. In 2015, States reported that there were 27,914 teacher preparation programs offered, including 24,430 at IHEs and 3,484 through alternative route teacher preparation programs not associated with IHEs. In addition, as discussed in the Supplemental NPRM, the Department estimates that the sections of these final regulations addressing teacher preparation programs offered through distance education will result in 812 additional reporting instances. Because the remainder of the data reporting discussed in this burden statement is transmitted using the SRC, for those burden estimates concerning reporting on the basis of teacher preparation programs, the Department uses the estimate of 28,726 teacher preparation programs (27,914 teacher preparation programs plus 812 reporting instances related to teacher preparation programs offered through distance education).

    Start Up and Annual Burden Under § 612.4(a)

    Section 612.4(a) codifies State reporting requirements expressly referenced in section 205(b) of the HEA; the remainder of § 612.4 provides for reporting consistent with the directives to the Secretary under sections 205(b) and (c) and the required assurance described in section 206(c).

    The HEOA revised a number of the reporting requirements for States. The requirements of the SRC are more numerous than those contained in the IRC, but the reporting elements required in both are similar in many respects. In addition, the Department has successfully integrated reporting to the extent that data reported by IHEs in the IRC is pre-populated in the relevant fields on which the States are required to report in the SRC. In addition to the elements discussed in § 612.3 of this burden statement regarding the IRC, under the statute a State must also report on its certification and licensure requirements and standards, State-wide pass rates and scaled scores, shortages of highly qualified teachers, and information related to low-performing or at-risk teacher preparation programs in the State.

    The SRC currently in use, approved under OMB control number 1840-0837, collects information on these elements. States have been successfully reporting information under this collection for many years. The burden estimate for the existing SRC was 911 burden hours per State. In the burden estimate for that SRC, the Department reported that 59 States were required to report data, equivalent to the current requirements. This represented a total burden of 53,749 hours for all States (59 States multiplied by 911 hours). This burden calculation was made on entity-level, rather than program-level, reporting (for a more detailed discussion of the consequences of this issue, see the sections on entity-level and program-level reporting in §§ 612.3 and 612.4). However, because relevant program-level data reported by the IHEs on the IRC will be pre-populated for States on the SRC, the burden associated with program-level reporting under § 612.4(a) will be minimal. Those elements that will require additional burden are discussed in the subsequent paragraphs of this section.

    Elements Changed in the State Report Card

    Using the calculations outlined in the NPRM and changes discussed above, the Department estimates that the total reporting burden for each State will be 243 hours (193 hours for the revised SRC plus the additional statutory reporting requirements totaling 50 hours). This would represent a reduction of 668 burden hours for each State to complete the requirements of the SRC, as compared to approved OMB collection 1840-0837 (911 burden hours under the current SRC compared to 243 burden hours under the revised SRC). The total burden for States to report this information would be 14,337 hours (243 hours multiplied by 59 States).
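    In arithmetic form:

        # Arithmetic sketch of the revised SRC burden estimate.
        hours_per_state = 193 + 50                    # revised SRC plus added statutory elements = 243
        reduction_per_state = 911 - hours_per_state   # 668 fewer hours than the current SRC
        total_annual_hours = hours_per_state * 59     # 14,337 hours for all 59 States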

    Posting on the State's Web Site

    The final regulations also require that the State provide the information reported on the SRC to the general public by prominently and promptly posting the SRC information on the State's Web site. Because the Department believes it is reasonable to assume that each State that communicates data related to its teacher preparation programs by electronic means maintains a Web site, the Department presumes that posting such information to an already-existing Web site represents a minimal burden increase. The Department therefore estimates that States will require 0.5 hours (30 minutes) to meet this requirement. This would represent a total burden increase of 29.5 hours each year for all States (0.5 hours multiplied by 59 States).

    Subtotal § 612.4(a) Start-Up and Annual Reporting Burden

    As noted in the preceding discussion, there is no start-up burden associated solely with § 612.4(a). Therefore, the aggregate start-up and annual reporting burden associated with reporting elements under § 612.4(a) would be 14,366.5 hours (243 hours multiplied by 59 States plus 0.5 hours for each of the 59 States).

    Reporting Required Under § 612.4(b) and § 612.4(c)

    The preceding burden discussion of § 612.4 focused on burdens related to the reporting requirements under section 205(b) of the HEA and reflected in 34 CFR 612.4(a). The remaining burden discussion of § 612.4 concerns reporting required under § 612.4(b) and (c).

    Start-Up Burden

    Meaningful Differentiations

    Under § 612.4(b)(1), a State is required to make meaningful differentiations in teacher preparation program performance using at least three performance levels—low-performing teacher preparation program, at-risk teacher preparation program, and effective teacher preparation program—based on the indicators in § 612.5 and including employment outcomes for high-need schools and student learning outcomes.

    The Department believes that State higher education authorities responsible for making State-level classifications of teacher preparation programs will require time to make meaningful differentiations in their classifications and determine whether alternative performance levels are warranted. States are required to consult with external stakeholders, review best practices by early adopter States that have more experience in program classification, and seek technical assistance.

    States will also have to determine how they will make such classifications. For example, a State may choose to classify all teacher preparation programs on an absolute basis using a cut-off score that weighs the various indicators, or a State may choose to classify teacher preparation programs on a relative basis, electing to classify a certain top percentile as exceptional, the next percentile as effective, and so on. In exercising this discretion, States may choose to consult with various external and internal parties and discuss lessons learned with those States already making such classifications of their teacher preparation programs.

    The Department estimates that each State will require 70 hours to make these determinations, and this would constitute a one-time total burden of 3,640 hours (70 hours multiplied by 52 States).

    Assurance of Specialized Accreditation

    Under § 612.4(b)(3)(i)(A), for each teacher preparation program, a State must provide disaggregated data for each of the indicators identified pursuant to § 612.5. See the start-up burden section of § 612.5 for a more detailed discussion of the burden associated with gathering the indicator data required to be reported under this regulatory section. See the annual reporting burden section of § 612.4 for a discussion of the ongoing reporting burden associated with reporting disaggregated indicator data under this regulation. No further burden exists beyond the burden described in these two sections.

    Under § 612.4(b)(3)(i)(B), a State is required to provide, for each teacher preparation program in the State, the State's assurance that the teacher preparation program either: (a) Is accredited by a specialized agency or (b) provides teacher candidates with content and pedagogical knowledge, quality clinical preparation, and rigorous teacher exit qualifications. See the start-up burden section of § 612.5 for a detailed discussion of the burden associated with gathering the indicator data required to be reported under this regulation. See the annual reporting burden section of § 612.4 for a discussion of the ongoing reporting burden associated with reporting these assurances. No further burden exists beyond the burden described in these two sections.

    Indicator Weighting

    Under § 612.4(b)(2)(ii), a State must provide its weighting of the different indicators in § 612.5 for purposes of describing the State's assessment of program performance. See the start-up burden section of § 612.4 on stakeholder consultation for a detailed discussion of the burden associated with establishing the weighting of the various indicators under § 612.5. See the annual reporting burden section of § 612.4 for a discussion of the ongoing reporting burden associated with reporting these relative weightings. No further burden exists beyond the burden described in these two sections.

    State-Level Rewards or Consequences

    Under § 612.4(b)(2)(iii), a State must provide the State-level rewards or consequences associated with the designated performance levels. See the start-up burden section of § 612.4 on stakeholder consultation for a more detailed discussion of the burden associated with establishing these rewards or consequences. See the annual reporting burden section of § 612.4 for a discussion of the ongoing reporting burden associated with reporting these rewards or consequences. No further burden exists beyond the burden described in these two sections.

    Aggregation of Small Programs

    Under § 612.4(b)(3), a State must ensure that all of its teacher preparation programs in that State are represented on the SRC. The Department recognized that many teacher preparation programs consist of a small number of prospective teachers and that reporting on these programs could present privacy and data validity issues. After discussion and input from various non-Federal negotiators during the negotiated rulemaking process, the Department elected to set a required reporting program size threshold of 25. However, the Department realized that, on the basis of research examining accuracy and validity relating to reporting small program sizes, some States may prefer to report on programs smaller than 25. Section 612.4(b)(3)(i) permits States to report using a lower program size threshold. In order to determine the preferred program size threshold for its programs, a State may review existing research or the practices of other States that set program size thresholds to determine feasibility for its own teacher preparation program reporting. The Department estimates that such review will require 20 hours for each State, and this would constitute a one-time total burden of 1,040 hours (20 hours multiplied by 52 States).

    Under § 612.4(b)(3), all teacher preparation entities must report on the remaining small programs that do not meet the program size threshold the State chooses. States will be able to do so through a combination of two possible aggregation methods described in § 612.4(b)(3)(ii). The preferred aggregation methodology is to be determined by the States after consultation with a group of stakeholders. For a detailed discussion of the burden related to this consultation process, see the start-up burden section of § 612.4, which discusses the stakeholder consultation process. Apart from the burden discussed in that section, no other burden is associated with this requirement.

    Stakeholder Consultation

    Under § 612.4(c), a State must consult with a representative group of stakeholders to determine the procedures for assessing and reporting the performance of each teacher preparation program in the State. This stakeholder group, composed of a variety of members representing viewpoints and interests affected by these regulations, must provide input on a number of issues concerning the State's discretion. There are four issues in particular on which the stakeholder group advises the State—

    a. The relative weighting of the indicators identified in § 612.5;

    b. The preferred method for aggregation of data such that performance data for a maximum number of small programs are reported;

    c. The State-level rewards or consequences associated with the designated performance levels; and

    d. The appropriate process and opportunity for programs to challenge the accuracy of their performance data and program classification.

    The Department believes that this consultative process will require that the group convene at least three times to afford each of the stakeholder representatives multiple opportunities to meet and consult with the constituencies they represent. Further, the Department believes that members of the stakeholder group will require time to review relevant materials and academic literature and advise on the relative strength of each of the performance indicators under § 612.5, as well as any other matters requested by the State.

    These stakeholders will also require time to advise whether any of the particular indicators will have more or less predictive value for the teacher preparation programs in their State, given its unique traits. Finally, because some States have already implemented one or more components of the regulatory indicators of program quality, these stakeholders will require time to review these States' experiences in implementing similar systems. The Department estimates that the combination of gathering the stakeholder group multiple times, review of the relevant literature and other States' experiences, and making determinations unique to their particular State will take 900 hours for each State (60 hours per stakeholder multiplied by 15 stakeholders). This would constitute a one-time total of 46,800 hours for all States (900 hours multiplied by 52 States).

    Subtotal of Start-Up Burden Under § 612.4(b) and § 612.4(c)

    Aggregating the start-up burdens calculated under the preceding sections results in the following burdens: All States would incur a total burden of 3,640 hours to make meaningful differentiations in program classifications, 1,040 hours to determine the State's aggregation of small programs, and 46,800 hours to complete the stakeholder consultation process. This would constitute a total of 51,480 hours of start-up burden nationwide.

    Annual Reporting Burden

    Classification of Teacher Preparation Programs

    The bulk of the State burden associated with assigning programs among classification levels should be in gathering and compiling data on the indicators of program quality that form the basis for the classification. Once a State has determined how a teacher preparation program will be classified at a particular performance level, applying the data gathered under § 612.5 to this classification basis is straightforward. The Department estimates that States will require 0.5 hours (30 minutes) per program to apply already-gathered indicator data to the existing program classification methodology. The total burden associated with classification of all teacher preparation programs using meaningful differentiations would be 14,363 hours each year (0.5 hours multiplied by 28,726 teacher preparation programs).

    Disaggregated Data on Each Indicator in § 612.5

    Under § 612.4(b)(2)(i)(A), States must report on the indicators of program performance in § 612.5. For a full discussion of the burden related to the reporting of this requirement, see the annual reporting burden section of § 612.5. Apart from the burden discussed in this section, no other burden is associated with this requirement.

    Indicator Weighting

    Under § 612.4(b)(2)(ii), a State must report the relative weight it places on each of the different indicators enumerated in § 612.5. The burden associated with this reporting is minimal: After the State, in consultation with a group of stakeholders, has made the determination about the percentage weight it will place on each of these indicators, reporting this information on the SRC is a simple matter of inputting a number for each of the indicators. At a minimum, this requires the State to input a weight for each of the eight general indicators of quality under § 612.5.

    Note:

    The eight indicators are—

    a. Associated student learning outcome results;

    b. Teacher placement results;

    c. Teacher retention results;

    d. Teacher placement rate calculated for high-need school results;

    e. Teacher retention rate calculated for high-need school results;

    f. Teacher satisfaction survey results;

    g. Employer satisfaction survey results; and

    h. Teacher preparation program characteristics.

    This reporting burden will not be affected by the number of teacher preparation programs in a State, because such weighting applies equally to each program. Although the State has the discretion to add indicators, the Department does not believe that transmission of an additional figure representing the percentage weighting assigned to that indicator will constitute an appreciable burden increase. The Department therefore estimates that each State will incur a burden of 0.25 hours (15 minutes) to report the relative weighting of the regulatory indicators of program performance. This would constitute a total burden on States of 13 hours each year (0.25 hours multiplied by 52 States).

    State-Level Rewards or Consequences

    Similar to the reporting required under § 612.4(b)(2)(ii), after a State has made the requisite determination about rewards and consequences, reporting those rewards and consequences represents a relatively low burden. States must report this information on the SRC. During the first year of implementation, the SRC could provide States with a drop-down list representing common rewards or consequences in use by early adopter States, and States can briefly describe any rewards or consequences not represented in the drop-down options. For subsequent years, the SRC could be pre-populated with the prior year's selected rewards and consequences, such that there will be no further burden associated with subsequent-year reporting unless the State altered its rewards and consequences. For these reasons, the Department estimates that States will incur, on average, 0.5 hours (30 minutes) of burden in the first year of implementation to report the State-level rewards and consequences, and 0.1 hours (6 minutes) of burden in each subsequent year. The Department therefore estimates that the total burden for the first year of implementation of this regulatory requirement will be 26 hours (0.5 hours multiplied by 52 States) and 5.2 hours each year thereafter (0.1 hours multiplied by 52 States).

    Stakeholder Consultation

    Under § 612.4(b)(4), during the first year of reporting and every five years thereafter, States must report on the procedures they established in consultation with the group of stakeholders described under § 612.4(c)(1). The burdens associated with the first and third of these four procedures (the weighting of the indicators and the State-level rewards and consequences associated with each performance level, respectively) are discussed in the preceding paragraphs of this section.

    The second procedure, the method by which small programs are aggregated, is a relatively straightforward reporting item on the SRC. Pursuant to § 612.4(b)(3)(ii), States may aggregate small programs using one of two methods or a combination of both: a State can aggregate programs that are similar in teacher preparation subject matter, or it can aggregate using prior-year data, including data from multiple prior years. On the SRC, the State simply indicates the method it uses. The Department estimates that States will require 0.5 hours (30 minutes) to enter these data every fifth year. On an annualized basis, this would therefore constitute a total burden of 5.2 hours (0.5 hours multiplied by 52 States, divided by five to annualize burden for reporting every fifth year).

    The fourth procedure that States must report under § 612.4(b)(4) is the method by which teacher preparation programs in the State are able to challenge the accuracy of their data and the classification of their program. First, the Department believes that States will incur a paperwork burden each year from recordkeeping and publishing decisions of these challenges. Because the Department believes the instances of these appeals will be relatively rare, we estimate that each State will incur 10 hours of burden each year related to recordkeeping and publishing decisions. This would constitute an annual reporting burden of 520 hours (10 hours multiplied by 52 States).

    After States and their stakeholder groups determine the preferred method for programs to challenge data, reporting that information will likely take the form of narrative responses. This is because the method for challenging data may differ greatly from State to State, and it is difficult for the Department to predict what methods States will choose. The Department therefore estimates that reporting this information in narrative form during the first year will constitute a burden of 3 hours for each State. This would represent a total reporting burden of 156 hours (3 hours multiplied by 52 States).

    In subsequent reporting cycles, the Department can examine State responses and (1) pre-populate this response for States that have not altered their method for challenging data or (2) provide a drop-down list of representative alternatives. This will minimize subsequent burden for most States. The Department therefore estimates that in subsequent reporting cycles (every five years under the final regulations), only 10 States will require more time to provide additional narrative responses totaling 3 burden hours each, with the remaining 42 States incurring a negligible burden. This represents an annualized reporting burden of 6 hours for those 10 States (3 hours multiplied by 10 States, divided by 5 years), for a total annualized reporting burden of 60 hours for subsequent years (6 hours multiplied by 10 States).

    Under § 612.4(c)(2), each State must periodically examine the quality of its data collection and reporting activities and modify those activities as appropriate. The Department believes that this review will be carried out in a manner similar to the one described for the initial stakeholder determinations in the preceding paragraphs: States will consult with representative groups to determine their experience with providing and using the collected data, and they will consult with data experts to ensure the validity and reliability of the data collected. The Department believes such a review will recur every three years, on average. Because this review will take place years after the State's initial implementation of the regulations, the Department further believes that the State's review will be of relatively little burden. This is because the State's review will be based on the State's own experience with collecting and reporting data pursuant to the regulations, and because States can consult with many other States to determine best practices. For these reasons, the Department estimates that the periodic review and modification of data collection and reporting will require 30 hours every three years or an annualized burden of 10 hours for each State. This would constitute a total annualized burden of 520 hours for all States (10 hours per year multiplied by 52 States).

    Subtotal Annual Reporting Burden Under § 612.4(b) and § 612.4(c)

    Aggregating the annual burdens calculated under the preceding sections results in the following: All States would incur a burden of 14,363 hours to report classifications of teacher preparation programs, 13 hours to report State indicator weightings, 26 hours in the first year and 5.2 hours in subsequent years to report State-level rewards and consequences associated with each performance classification, 5.2 hours to report the method of program aggregation, 520 hours for recordkeeping and publishing appeal decisions, 156 hours the first year and 60 hours in subsequent years to report the process for challenging data and program classification, and 520 hours to report on the examination of data collection quality. This totals 15,603.2 hours of annual burden in the first year and 15,486.4 hours of annual burden in subsequent years nationwide.
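    For illustration only, the following Python sketch reproduces the arithmetic behind this annual subtotal; the variable names are illustrative and are not part of the regulations or the information collection.

        # Illustrative check of the annual reporting subtotal under §§ 612.4(b) and 612.4(c)
        classification = 0.5 * 28_726      # 14,363 hours to classify programs
        weighting = 0.25 * 52              # 13 hours to report indicator weightings
        rewards_year_one = 0.5 * 52        # 26 hours (first year)
        rewards_later_years = 0.1 * 52     # 5.2 hours (subsequent years)
        aggregation = 0.5 * 52 / 5         # 5.2 hours (annualized, every fifth year)
        appeal_records = 10 * 52           # 520 hours of recordkeeping and publishing
        challenges_year_one = 3 * 52       # 156 hours (first year)
        challenges_later_years = 60        # hours (subsequent years, per the estimate above)
        data_quality_review = 10 * 52      # 520 hours for periodic data quality review
        first_year = (classification + weighting + rewards_year_one + aggregation
                      + appeal_records + challenges_year_one + data_quality_review)
        later_years = (classification + weighting + rewards_later_years + aggregation
                       + appeal_records + challenges_later_years + data_quality_review)
        print(first_year, later_years)     # about 15,603.2 and 15,486.4 hours nationwide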

    Total Reporting Burden Under § 612.4

    Aggregating the start-up and annual burdens calculated under the preceding sections results in the following burdens: All States would incur a total burden under § 612.4(a) of 14,366.5 hours, a start-up burden under §§ 612.4(b) and 612.4(c) of 51,480 hours, and an annual burden under §§ 612.4(b) and 612.4(c) of 15,603.2 hours in the first year and 15,486.4 hours in subsequent years. This totals between 81,332.9 and 81,449.7 total burden hours under § 612.4 nationwide. Based on the prior estimate of 53,749 hours of reporting burden on OMB collection 1840-0837, the total burden increase under § 612.4 is between 27,583.9 hours and 27,700.7 hours (a range of 81,332.9 to 81,449.7 total burden hours minus the 53,749 hours previously estimated).
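    For illustration only, the following Python sketch reproduces the range calculation above; the variable names are illustrative and are not part of the regulations or the information collection.

        # Illustrative check of the total burden range under § 612.4
        general_reporting = 14_366.5       # § 612.4(a) burden from the earlier discussion
        start_up = 51_480                  # §§ 612.4(b) and (c) start-up burden
        annual_first_year = 15_603.2
        annual_later_years = 15_486.4
        low = general_reporting + start_up + annual_later_years    # 81,332.9 hours
        high = general_reporting + start_up + annual_first_year    # 81,449.7 hours
        prior_estimate = 53_749            # hours previously approved under OMB 1840-0837
        print(low - prior_estimate, high - prior_estimate)         # about 27,583.9 and 27,700.7 hours of increase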

    Section 612.5 Indicators a State Must Use To Report on Teacher Preparation Program Performance

    The final regulations at § 612.5(a)(1) through (a)(4) identify those indicators that a State is required to use to assess the academic content knowledge and teaching skills of novice teachers from each of its teacher preparation programs. Under the regulations, a State must use the following indicators of teacher preparation program performance: (a) Student learning outcomes, (b) employment outcomes, (c) survey outcomes, and (d) whether the program (1) is accredited by a specialized accrediting agency or (2) produces teacher candidates with content and pedagogical knowledge and quality clinical preparation, who have met rigorous exit standards. Section 612.5(b) permits a State, at its discretion, to establish additional indicators of academic content knowledge and teaching skills.

    Start-Up Burden

    Student Learning Outcomes

    As described in the Discussion of Costs, Benefits, and Transfers section of the RIA, we do not estimate that States will incur any additional burden associated with creating systems for evaluating student learning outcomes. However, the regulations also require that States link student growth or teacher evaluation data back to each teacher's preparation program, consistent with State discretionary guidelines included in § 612.4. Currently, few States have such capacity. However, based on data from the SLDS program, it appears that 30 States, the District of Columbia, and the Commonwealth of Puerto Rico either already have the ability to aggregate data on student achievement and map them back to teacher preparation programs or have committed to doing so. For these 30 States, the District of Columbia, and the Commonwealth of Puerto Rico, we estimate that no additional burden will be incurred to link student learning outcomes back to teacher preparation programs.

    For the remaining 20 States, the Department estimates a burden of 2,940 hours for each State, for a total burden of 58,800 hours nationwide (2,940 hours multiplied by 20 States).

    Employment Outcomes

    Section 612.5(a)(2) requires a State to provide data on each teacher preparation program's teacher placement rate as well as the teacher placement rate calculated for high-need schools. High-need schools are defined in § 612.2(d) by using the definition of “high-need school” in section 200(11) of the HEA. The regulations give States discretion to exclude those novice teachers or recent graduates from this measure if they are teaching in a private school, teaching in another State, enrolled in graduate school, or engaged in military service. States also have the discretion to treat this rate differently for alternative route and traditional route providers.

    Section 612.5(a)(2) also requires a State to provide data on each teacher preparation program's teacher retention rate and teacher retention rate calculated for high-need schools. The regulations give States discretion to exclude novice teachers or recent graduates from this measure if they are teaching in a private school (or other school not requiring State certification), teaching in another State, enrolled in graduate school, or serving in the military. States also have the discretion to treat this rate differently for alternative route and traditional route providers.

    As discussed in the NPRM, the Department believes that only 11 States will likely incur additional burden in collecting information about the employment and retention of recent graduates of teacher preparation programs in its jurisdiction. To the extent that it is not possible to establish these measures using existing data systems, States may need to obtain some or all of this information from teacher preparation programs or from the teachers themselves upon requests for certification and licensure. The Department estimates that 200 hours may be required at the State level to collect information about novice teachers employed in full-time teaching positions (including designing the data request instruments, disseminating them, providing training or other technical assistance on completing the instruments, collecting the data, and checking their accuracy), which would amount to a total of 2,200 hours (200 hours multiplied by 11 States).

    Survey Outcomes

    Section 612.5(a)(3) requires a State to provide data on each teacher preparation program's teacher survey results. This requires States to report data from a survey of novice teachers in their first year of teaching designed to capture their perceptions of whether the training that they received was sufficient to meet classroom and profession realities.

    Section 612.5(a)(3) also requires a State to provide data on each teacher preparation program's employer survey results. This requires States to report data from a survey of employers or supervisors designed to capture their perceptions of whether the novice teachers they employ or supervise were prepared sufficiently to meet classroom and profession realities.

    Some States and IHEs already survey graduates of their teacher preparation programs. The sample size and the length of the survey instrument can strongly affect the potential burden associated with administering the survey. The Department has learned that some States already have experience carrying out such surveys (for a more detailed discussion of these and other estimates in this section, see the Discussion of Costs, Benefits and Transfers section regarding student learning outcomes in the RIA). To account for variance in States' abilities to conduct such surveys, variance in the survey instruments themselves, and the need to ensure statistical validity and reliability, the Department assumes a somewhat higher burden estimate than States' initial experiences would suggest.

    Based on Departmental consultation with researchers experienced in carrying out survey research, the Department assumes that survey instruments will not require more than 30 minutes to complete. The Department further assumes that a State can develop a survey in 1,224 hours. Assuming that States with experience in administering surveys will incur a lower cost, the Department assumes that the total burden incurred nationwide would maximally be 63,648 hours (1,224 hours multiplied by 52 States).

    Teacher Preparation Program Characteristics

    Under § 612.5(a)(4), States must report, for each teacher preparation program in the State whether it: (a) Is accredited by a specialized accrediting agency recognized by the Secretary for accreditation of professional teacher education programs, or (b) provides teacher candidates with content and pedagogical knowledge and quality clinical preparation, and has rigorous teacher candidate exit standards.

    CAEP, a union of two formerly independent national accrediting agencies, the National Council for Accreditation of Teacher Education (NCATE) and the Teacher Education Accreditation Council (TEAC), reports that it currently has fully accredited approximately 800 IHEs. The existing IRC currently requires reporting of whether each teacher preparation program is accredited by a specialized accrediting agency and, if so, which one. We note that, as of July 1, 2016, CAEP has not been recognized by the Secretary for accreditation of teacher preparation programs. As such, programs accredited by CAEP would not qualify under § 612.5(a)(4)(i). However, as described in the discussion of comments above, States would be able to use accreditation by CAEP as an indicator that the teacher preparation program meets the requirements of § 612.5(a)(4)(ii). In addition, we explain in the comments above that a State also could meet the reporting requirements in § 612.5(a)(4)(ii) by indicating that a program has been accredited by an accrediting organization whose standards cover the program characteristics identified in that section. Because section 205(a)(1)(D) of the HEA requires IHEs to include in their IRCs the identity of any agency that has accredited their programs, and the number of such accrediting agencies is small, States should readily know whether these other agencies' standards cover those program characteristics. For these reasons, the Department believes that no significant start-up burden will be associated with State determinations of specialized accreditation of teacher preparation programs for those programs that are already accredited.

    As discussed in the NPRM, the Department estimates that States will have to provide information for 15,335 teacher preparation programs nationwide (11,461 unaccredited programs at IHEs plus 3,484 programs at alternative routes not affiliated with an IHE plus 390 reporting instances for teacher preparation programs offered through distance education).

    The Department believes that States will be able to make use of accreditation guidelines from specialized accrediting agencies to determine the measures that will adequately inform them about which of their teacher preparation programs provide teacher candidates with content and pedagogical knowledge and quality clinical preparation and have rigorous teacher candidate exit qualifications (the indicators contained in § 612.5(a)(4)(ii)). The Department estimates that States will require 2 hours for each teacher preparation program to determine whether or not it can provide such information. Therefore, the Department estimates that the total reporting burden to provide this information would be 30,670 hours (15,335 teacher preparation programs multiplied by 2 hours).

    Subtotal of Start-Up Reporting Burden Under § 612.5

    Aggregating the start-up burdens calculated under the preceding sections results in the following burdens: All States would incur a burden of 58,800 hours to link student learning outcome measures back to each teacher's preparation program, 2,200 hours to measure employment outcomes, 63,648 hours to develop surveys, and 30,670 hours to establish the process to obtain information related to certain indicators for teacher preparation programs without specialized accreditation. This totals 155,318 hours of start-up burden nationwide.
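    For illustration only, the following Python sketch reproduces the arithmetic behind this start-up subtotal; the variable names are illustrative and are not part of the regulations or the information collection.

        # Illustrative check of the start-up subtotal under § 612.5
        student_learning_links = 2_940 * 20     # 58,800 hours for the 20 States without linkage capacity
        employment_outcomes = 200 * 11          # 2,200 hours for the 11 States needing new collections
        survey_development = 1_224 * 52         # 63,648 hours to develop survey instruments
        program_characteristics = 2 * 15_335    # 30,670 hours for programs without specialized accreditation
        print(student_learning_links + employment_outcomes
              + survey_development + program_characteristics)   # 155318 hours of start-up burden nationwide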

    Annual Reporting Burden

    Under § 612.5(a), States must transmit, through specific elements on the SRC, information related to indicators of academic content knowledge and teaching skills of novice teachers for each teacher preparation program in the State. We discuss the burden associated with establishing systems related to gathering these data in the section discussing start-up burden associated with § 612.5. The following section describes the burden associated with gathering these data and reporting them to the Department annually.

    Student Learning Outcomes

    Under § 612.5(a)(1), States are required to transmit information related to student learning outcomes for each teacher preparation program in the State. The Department believes that in order to ensure the validity of the data, each State will require two hours to gather and compile data related to the student learning outcomes of each teacher preparation program. Much of the burden related to data collection will be built into State-established reporting systems, limiting the remaining burden to technical support to ensure proper reporting and to correct data that were entered incorrectly. States have the discretion to use student growth measures or teacher evaluation measures in determining student learning outcomes. Regardless of the measure(s) used, the Department estimates that States will require 0.5 hours (30 minutes) for each teacher preparation program to convey this information to the Department through the SRC. This is because these measures will be calculated on a quantitative basis. The combination of gathering and reporting data related to student learning outcomes therefore constitutes a burden of 2.5 hours for each teacher preparation program, and would represent a total burden of 71,815 hours annually (2.5 hours multiplied by 28,726 teacher preparation programs).

    Employment Outcomes

    Under § 612.5(a)(2), States are required to transmit information related to employment outcomes for each teacher preparation program in the State. In order to report employment outcomes to the Department, States must compile and transmit teacher placement rate data, teacher placement rate data calculated for high-need schools, teacher retention rate data, and teacher retention rate data for high-need schools. Similar to the process for reporting student learning outcome data, much of the burden related to gathering data on employment outcomes is subsumed into the State-established data systems, which provide information on whether and where teachers are employed. The Department estimates that States will require 3 hours to gather data both on teacher placement and teacher retention for each teacher preparation program in the State. Reporting these data using the SRC is relatively straightforward. The measures are the percentage of teachers placed and the percentage of teachers who continued to teach, both generally and at high-need schools. The Department therefore estimates that States will require 0.5 hours (30 minutes) for each teacher preparation program to convey this information to the Department through the SRC. The combination of gathering and reporting data related to employment outcomes therefore constitutes a burden of 3.5 hours for each teacher preparation program and would represent a total burden of 100,541 hours annually (3.5 hours multiplied by 28,726 teacher preparation programs).

    Survey Outcomes

    In addition to the start-up burden needed to produce a survey, States will incur annual burdens to administer the survey. Surveys will include, but will not be limited to, a teacher survey and an employer survey, designed to capture perceptions of whether novice teachers who are employed as teachers in their first year of teaching in the State where the teacher preparation program is located possess the skills needed to succeed in the classroom. The burdens for administering an annual survey will be borne by the State administering the survey and the respondents completing it. For the reasons discussed in the RIA in this document, the Department estimates that States will require approximately 0.5 hours (30 minutes) per respondent to collect a sufficient number of survey instruments to ensure an adequate response rate. The Department employs an estimate of 253,042 respondents (70 percent of 361,488—the 180,744 completers plus their 180,744 employers) that will be required to complete the survey. Therefore, the Department estimates that the annual burden to respondents nationwide would be 126,521 hours (253,042 respondents multiplied by 0.5 hours per respondent).

    With respect to burden incurred by States to administer the surveys annually, the Department estimates that one hour of burden will be incurred for every respondent to the surveys. This would constitute an annual burden nationwide of 253,042 hours (253,042 respondents multiplied by one hour per respondent).

    Under § 612.5(a)(3), after these surveys are administered, States are required to report the information using the SRC. In order to report survey outcomes to the Department, the Department estimates that States will need 0.5 hours to report the quantitative data related to the survey responses for each instrument on the SRC, constituting a total burden of one hour to report data on both instruments. This would represent a total burden of 28,726 hours annually (1 hour multiplied by 28,726 teacher preparation programs). The total burden associated with administering, completing, and reporting data on the surveys therefore constitutes 408,289 hours annually (126,521 hours plus 253,042 hours plus 28,726 hours).
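    For illustration only, the following Python sketch reproduces the arithmetic behind the survey burden estimates above; the variable names are illustrative and are not part of the regulations or the information collection.

        # Illustrative check of the annual survey burden under § 612.5(a)(3)
        completers = 180_744
        respondents = round(0.70 * (completers + completers))   # completers plus their employers; about 253,042
        respondent_hours = respondents * 0.5    # 0.5 hours per survey response; about 126,521 hours
        state_admin_hours = respondents * 1     # 1 hour of State administration per response; about 253,042 hours
        reporting_hours = 1 * 28_726            # 1 hour per program to report both instruments on the SRC
        print(respondent_hours + state_admin_hours + reporting_hours)   # about 408,289 hours annually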

    Teacher Preparation Program Characteristics

    Under § 612.5(a)(4), States are required to report whether each program in the State is accredited by a specialized accrediting agency recognized by the Secretary, or produces teacher candidates with content and pedagogical knowledge, with quality clinical preparation, and who have met rigorous teacher candidate exit qualifications. The Department estimates that 726 IHEs offering teacher preparation programs are or will be accredited by a specialized accrediting agency (see the start-up burden discussion for § 612.5 for an explanation of this figure). Using the IRC, IHEs already report to States whether teacher preparation programs have specialized accreditation. However, as noted in the start-up burden discussion of § 612.5, as of July 1, 2016, there are no specialized accrediting agencies recognized by the Secretary for teacher preparation programs. As such, the Department does not expect any teacher preparation program to qualify under § 612.5(a)(4)(i). However, as discussed elsewhere in this document, States can use accreditation by CAEP or another entity whose standards for accreditation cover the basic program characteristics in § 612.5(a)(4)(ii) as evidence that the teacher preparation program has satisfied the indicator of program performance in that provision. Since IHEs are already reporting whether they have specialized accreditation in their IRCs, and this reporting element will be pre-populated for States on the SRC, States would simply need to know whether these accrediting agencies have standards that examine the program characteristics in § 612.5(a)(4)(ii). Therefore, the Department estimates no additional burden for this reporting element for programs that have the requisite accreditation.

    Under § 612.5(a)(4)(ii), for those programs that are not accredited by a specialized accrediting agency, States are required to report on certain indicators in lieu of that accreditation: Whether the program provides teacher candidates with content and pedagogical knowledge and quality clinical preparation, and has rigorous teacher candidate exit qualifications. We assume that such requirements are already built into State approval of relevant programs. The Department estimates that States will require 0.25 hours (15 minutes) to provide to the Secretary an assurance, in a yes/no format, whether each teacher preparation program in its jurisdiction not holding a specialized accreditation from CAEP, NCATE, or TEAC meets these indicators.

    As discussed in the start-up burden section of § 612.5 which discusses reporting of teacher preparation program characteristics, the Department estimates States will have to provide such assurances for 15,335 teacher preparation programs that do not have specialized accreditation. Therefore, the Department estimates that the total burden associated with providing an assurance that these teacher preparation programs meet these indicators is 3,834 hours (0.25 hours multiplied by the 15,335 teacher preparation programs that do not have specialized accreditation).

    Other Indicators

    Under § 612.5(b), States may include additional indicators of academic content knowledge and teaching skill in their determination of whether teacher preparation programs are low-performing. As discussed in the Discussion of Costs, Benefits, and Transfers section of the RIA, we do not assume that States will incur any additional burden under this section beyond entering the relevant data into the information collection instrument. The Department estimates that the total reporting burden associated with this provision will be 28,726 hours (28,726 teacher preparation programs multiplied by 1 hour).

    Subtotal of Annual Reporting Burden Under § 612.5

    Aggregating the annual burdens calculated under the preceding sections results in the following burdens: All States would incur a burden of 71,815 hours to report on student learning outcome measures for all subjects and grades, 100,541 hours to report on employment outcomes, 408,289 hours to report on survey outcomes, 3,834 hours to report on teacher preparation program characteristics, and 28,726 hours to report on other indicators not required in § 612.5(a)(1)-(4). This totals 613,204.75 hours of annual burden nationwide.
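    For illustration only, the following Python sketch reproduces the arithmetic behind this annual subtotal; the variable names are illustrative and are not part of the regulations or the information collection. The subtotal of 613,204.75 hours reflects the unrounded figure of 3,833.75 hours for teacher preparation program characteristics (reported above, rounded, as 3,834 hours).

        # Illustrative check of the annual reporting subtotal under § 612.5
        student_learning = 2.5 * 28_726           # 71,815 hours
        employment = 3.5 * 28_726                 # 100,541 hours
        surveys = 408_289                         # hours, from the survey discussion above
        program_characteristics = 0.25 * 15_335   # 3,833.75 hours (rounded above to 3,834)
        other_indicators = 1 * 28_726             # 28,726 hours
        print(student_learning + employment + surveys
              + program_characteristics + other_indicators)   # 613204.75 hours nationwide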

    Total Reporting Burden Under § 612.5

    Aggregating the start-up and annual burdens calculated under the preceding sections results in the following burdens: All States would incur a start-up burden under § 612.5 of 155,318 hours and an annual burden under § 612.5 of 613,204.75 hours. This totals 768,522.75 burden hours under § 612.5 nationwide.

    Section 612.6 What Must a State Consider in Identifying Low-Performing Teacher Preparation Programs or At-Risk Programs?

    The regulations in § 612.6 require States to use criteria, including, at a minimum, indicators of academic content knowledge and teaching skills from § 612.5, to identify low-performing or at-risk teacher preparation programs.

    For a full discussion of the burden related to the consideration and selection of the criteria reflected in the indicators described in § 612.5, see the start-up burden section of §§ 612.4(b) and 612.4(c) discussing meaningful differentiations. Apart from that burden discussion, the Department believes States will incur no other burden related to this regulatory provision.

    Section 612.7 Consequences for a Low-Performing Teacher Preparation Program That Loses the State's Approval or the State's Financial Support

    For any IHE administering a teacher preparation program that has lost State approval or financial support based on being identified as a low-performing teacher preparation program, the regulations under § 612.7 require the IHE to—(a) notify the Secretary of its loss of State approval or financial support within thirty days of such designation; (b) immediately notify each student who is enrolled in or accepted into the low-performing teacher preparation program and who receives funding under title IV, HEA that the IHE is no longer eligible to provide such funding to them; and (c) disclose information on its Web site and promotional materials regarding its loss of State approval or financial support and loss of eligibility for title IV funding.

    The Department does not expect that a large percentage of programs will be subject to a loss of title IV eligibility. The Department estimates that approximately 50 programs will lose their State approval or financial support.

    For those 50 programs, the Department estimates that it will take each program 15 minutes to notify the Secretary of its loss of eligibility; 5 hours to notify all students who are enrolled in or accepted into the program and who receive funding under title IV of the HEA; and 30 minutes to disclose this information on its Web sites and promotional materials, for a total of 5.75 hours per program. The Department estimates the total burden at 287.5 hours (50 programs multiplied by 5.75 hours).

    Section 612.8 Regaining Eligibility To Accept or Enroll Students Receiving Title IV, HEA Funds After Loss of State Approval or Financial Support

    The regulations in § 612.8 provide a process for a low-performing teacher preparation program that has lost State approval or financial support to regain its ability to accept and enroll students who receive title IV, HEA funds. Under this process, IHEs will submit an application and supporting documentation demonstrating to the Secretary: (1) Improved performance on the teacher preparation program performance criteria reflected in indicators described in § 612.5 as determined by the State; and (2) reinstatement of the State's approval or the State's financial support.

    The process by which programs and institutions apply for title IV eligibility already accounts for the burden associated with this provision.

    Total Reporting Burden Under Part 612

    Aggregating the total burdens calculated under the preceding sections of part 612 results in the following burdens: The total burden incurred is 157,791 hours under § 612.3, between 81,332.9 and 81,449.7 hours under § 612.4, 768,522.75 hours under § 612.5, 287.5 hours under § 612.7, and 200 hours under § 612.8. This totals between 1,008,134.15 hours and 1,008,250.95 hours nationwide.
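    For illustration only, the following Python sketch reproduces the range calculation above; the variable names are illustrative and are not part of the regulations or the information collection.

        # Illustrative check of the total burden range under part 612
        sec_612_3 = 157_791
        sec_612_4_low, sec_612_4_high = 81_332.9, 81_449.7
        sec_612_5 = 768_522.75
        sec_612_7 = 287.5
        sec_612_8 = 200
        low = sec_612_3 + sec_612_4_low + sec_612_5 + sec_612_7 + sec_612_8
        high = sec_612_3 + sec_612_4_high + sec_612_5 + sec_612_7 + sec_612_8
        print(low, high)   # about 1,008,134.15 and 1,008,250.95 hours nationwide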

    Reporting Burden Under Part 686

    The changes to part 686 in these regulations have no measurable effect on the burden currently identified in the OMB Control Numbers 1845-0083 and 1845-0084.

    Consistent with the discussions above, the following chart describes the sections of the final regulations involving information collections, the information being collected, and the collections the Department has submitted to OMB for approval and public comment under the Paperwork Reduction Act. In the chart, the Department labels those estimated burdens not already associated with an OMB approval number under a single prospective designation, “OMB 1840-0837.” This label represents a single information collection; the different sections of the regulations are separated in the table below for clarity and to appropriately divide the burden hours associated with each regulatory section.

    Please note that the changes in burden estimated in the chart are based on the change in burden under the current IRC OMB control number 1840-0837 and the prospective “OMB 1840-0837” collection. The burden estimate for § 612.3 is based on the most recent data available for the number of IHEs that are required to report (i.e., 1,522 IHEs based on the most recent available data rather than the 1,250 IHEs used in prior estimates). For a complete discussion of the costs associated with the burden incurred under these regulations, please see the RIA in this document, specifically the accounting statement.

    Regulatory section: 612.3
    Information collection: This section requires IHEs that provide a teacher preparation program leading to State certification or licensure to provide data on teacher preparation program performance to the States.
    OMB Control No. and estimated change in burden: OMB 1840-0837. The burden will decrease by 64,421 hours.

    Regulatory section: 612.4
    Information collection: This section requires States that receive funds under the Higher Education Act of 1965, as amended, to report to the Secretary on the quality of teacher preparation in the State, both for traditional teacher preparation programs and for alternative route to State certification and licensure programs.
    OMB Control No. and estimated change in burden: OMB 1840-0837. The burden will increase by between 27,583.9 and 27,700.7 hours.

    Regulatory section: 612.5
    Information collection: This regulatory section requires States to use certain indicators of teacher preparation performance for purposes of the State report card.
    OMB Control No. and estimated change in burden: OMB 1840-0837. The burden will increase by 768,522.75 hours.

    Regulatory section: 612.6
    Information collection: This regulatory section requires States to use criteria, including indicators of academic content knowledge and teaching skills, to identify low-performing or at-risk teacher preparation programs.
    OMB Control No. and estimated change in burden: OMB 1840-0837. The burden associated with this regulatory provision is accounted for in other portions of this burden statement.

    Regulatory section: 612.7
    Information collection: The regulations under this section require any IHE administering a teacher preparation program that has lost State approval or financial support based on being identified as a low-performing teacher preparation program to notify the Secretary and students receiving title IV, HEA funds, and to disclose this information on its Web site.
    OMB Control No. and estimated change in burden: OMB 1840-0837. The burden will increase by 287.5 hours.

    Regulatory section: 612.8
    Information collection: The regulations in this section provide a process for a low-performing teacher preparation program that lost State approval or financial support to regain its ability to accept and enroll students who receive title IV funds.
    OMB Control No. and estimated change in burden: OMB 1840-0837. The burden will increase by 200 hours.

    Total change in burden: The total increase in burden under part 612 will be between 732,173.15 hours and 732,289.95 hours.
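    For illustration only, the following Python sketch reproduces the total change in burden shown in the chart; the variable names are illustrative and are not part of the regulations or the information collection.

        # Illustrative check of the total change in burden shown in the chart
        change_612_3 = -64_421
        change_612_4_low, change_612_4_high = 27_583.9, 27_700.7
        change_612_5 = 768_522.75
        change_612_7 = 287.5
        change_612_8 = 200
        low = change_612_3 + change_612_4_low + change_612_5 + change_612_7 + change_612_8
        high = change_612_3 + change_612_4_high + change_612_5 + change_612_7 + change_612_8
        print(low, high)   # about 732,173.15 and 732,289.95 hours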
    Intergovernmental Review

    These programs are subject to the requirements of Executive Order 12372 and the regulations in 34 CFR part 79. One of the objectives of the Executive order is to foster an intergovernmental partnership and a strengthened federalism. The Executive order relies on processes developed by State and local governments for coordination and review of proposed Federal financial assistance.

    This document provides early notification of our specific plans and actions for these programs.

    Assessment of Educational Impact

    In the NPRM we requested comments on whether the proposed regulations would require transmission of information that any other agency or authority of the United States gathers or makes available.

    Based on the response to the NPRM and on our review, we have determined that these final regulations do not require transmission of information that any other agency or authority of the United States gathers or makes available.

    Federalism

    Executive Order 13132 requires us to ensure meaningful and timely input by State and local elected officials in the development of regulatory policies that have federalism implications. “Federalism implications” means substantial direct effects on the States, on the relationship between the National Government and the States, or on the distribution of power and responsibilities among the various levels of government.

    In the NPRM we identified a specific section that may have federalism implications and encouraged State and local elected officials to review and provide comments on the proposed regulations. In the Public Comment section of this preamble, we discuss any comments we received on this subject.

    Accessible Format: Individuals with disabilities can obtain this document in an accessible format (e.g., braille, large print, audiotape, or compact disc) on request to the person listed under FOR FURTHER INFORMATION CONTACT.

    Electronic Access to This Document: The official version of this document is the document published in the Federal Register. Free Internet access to the official edition of the Federal Register and the Code of Federal Regulations is available via the Federal Digital System at: www.thefederalregister.org/fdsys. At this site you can view this document, as well as all other documents of this Department published in the Federal Register, in text or Adobe Portable Document Format (PDF). To use PDF you must have Adobe Acrobat Reader, which is available free at the site.

    You may also access documents of the Department published in the Federal Register by using the article search feature at: www.federalregister.gov. Specifically, through the advanced search feature at this site, you can limit your search to documents published by the Department.

    List of Subjects in 34 CFR Parts 612 and 686

    Administrative practice and procedure, Aliens, Colleges and universities, Consumer protection, Grant programs—education, Loan programs—education, Reporting and recordkeeping requirements, Selective Service System, Student aid, Vocational education.

    Dated: October 11, 2016.

    John B. King, Jr.,
    Secretary of Education.

    For the reasons discussed in the preamble, the Secretary amends chapter VI of title 34 of the Code of Federal Regulations as follows:

    1. Part 612 is added to read as follows:

    PART 612—TITLE II REPORTING SYSTEM

    Subpart A—Scope, Purpose, and Definitions

    Sec.
    612.1 Scope and purpose.
    612.2 Definitions.

    Subpart B—Reporting Requirements

    612.3 What are the regulatory reporting requirements for the Institutional Report Card?
    612.4 What are the regulatory reporting requirements for the State Report Card?
    612.5 What indicators must a State use to report on teacher preparation program performance for purposes of the State report card?
    612.6 What must States consider in identifying low-performing teacher preparation programs or at-risk teacher preparation programs, and what actions must a State take with respect to those programs identified as low-performing?

    Subpart C—Consequences of Withdrawal of State Approval or Financial Support

    612.7 What are the consequences for a low-performing teacher preparation program that loses the State's approval or the State's financial support?
    612.8 How does a low-performing teacher preparation program regain eligibility to accept or enroll students receiving Title IV, HEA funds after loss of the State's approval or the State's financial support?

    Authority:

    20 U.S.C. 1022d and 1022f.

    Subpart A—Scope, Purpose, and Definitions
    § 612.1 Scope and purpose.

    This part establishes regulations related to the teacher preparation program accountability system under title II of the HEA. This part includes:

    (a) Institutional Report Card reporting requirements.

    (b) State Report Card reporting requirements.

    (c) Requirements related to the indicators States must use to report on teacher preparation program performance.

    (d) Requirements related to the areas States must consider to identify low-performing teacher preparation programs and at-risk teacher preparation programs and actions States must take with respect to those programs.

    (e) The consequences for a low-performing teacher preparation program that loses the State's approval or the State's financial support.

    (f) The conditions under which a low-performing teacher preparation program that has lost the State's approval or the State's financial support may regain eligibility to resume accepting and enrolling students who receive title IV, HEA funds.

    § 612.2 Definitions.

    (a) The following terms used in this part are defined in the regulations for Institutional Eligibility under the HEA, 34 CFR part 600:

    Distance education
    Secretary
    State
    Title IV, HEA program

    (b) The following term used in this part is defined in subpart A of the Student Assistance General Provisions, 34 CFR part 668:

    Payment period

    (c) The following term used in this part is defined in 34 CFR 77.1:

    Local educational agency (LEA)

    (d) Other terms used in this part are defined as follows:

    At-risk teacher preparation program: A teacher preparation program that is identified as at-risk of being low-performing by a State based on the State's assessment of teacher preparation program performance under § 612.4.

    Candidate accepted into the teacher preparation program: An individual who has been admitted into a teacher preparation program but who has not yet enrolled in any coursework that the institution has determined to be part of that teacher preparation program.

    Candidate enrolled in the teacher preparation program: An individual who has been accepted into a teacher preparation program and is in the process of completing coursework but has not yet completed the teacher preparation program.

    Content and pedagogical knowledge: An understanding of the central concepts and structures of the discipline in which a teacher candidate has been trained, and how to create effective learning experiences that make the discipline accessible and meaningful for all students, including a distinct set of instructional skills to address the needs of English learners and students with disabilities, in order to assure mastery of the content by the students, as described in applicable professional, State, or institutional standards.

    Effective teacher preparation program: A teacher preparation program with a level of performance higher than a low-performing teacher preparation program or an at-risk teacher preparation program.

    Employer survey: A survey of employers or supervisors designed to capture their perceptions of whether the novice teachers they employ or supervise who are in their first year of teaching were effectively prepared.

    High-need school: A school that, based on the most recent data available, meets one or both of the following:

    (i) The school is in the highest quartile of schools in a ranking of all schools served by a local educational agency (LEA), ranked in descending order by percentage of students from low-income families enrolled in such schools, as determined by the LEA based on one of the following measures of poverty:

    (A) The percentage of students aged 5 through 17 in poverty counted in the most recent Census data approved by the Secretary.

    (B) The percentage of students eligible for a free or reduced price school lunch under the Richard B. Russell National School Lunch Act (42 U.S.C. 1751 et seq.).

    (C) The percentage of students in families receiving assistance under the State program funded under part A of title IV of the Social Security Act (42 U.S.C. 601 et seq.).

    (D) The percentage of students eligible to receive medical assistance under the Medicaid program.

    (E) A composite of two or more of the measures described in paragraphs (i)(A) through (D) of this definition.

    (ii) In the case of—

    (A) An elementary school, the school serves students not less than 60 percent of whom are eligible for a free or reduced price school lunch under the Richard B. Russell National School Lunch Act; or

    (B) Any school other than an elementary school, the school serves students not less than 45 percent of whom are eligible for a free or reduced price school lunch under the Richard B. Russell National School Lunch Act.

    Low-performing teacher preparation program: A teacher preparation program that is identified as low-performing by a State based on the State's assessment of teacher preparation program performance under § 612.4.

    Novice teacher: A teacher of record in the first three years of teaching who teaches elementary or secondary public school students, which may include, at a State's discretion, preschool students.

    Quality clinical preparation: Training that integrates content, pedagogy, and professional coursework around a core of pre-service clinical experiences. Such training must, at a minimum—

    (i) Be provided by qualified clinical instructors, including school and LEA-based personnel, who meet established qualification requirements and who use a training standard that is made publicly available;

    (ii) Include multiple clinical or field experiences, or both, that serve diverse, rural, or underrepresented student populations in elementary through secondary school, including English learners and students with disabilities, and that are assessed using a performance-based protocol to demonstrate teacher candidate mastery of content and pedagogy; and

    (iii) Require that teacher candidates use research-based practices, including observation and analysis of instruction, collaboration with peers, and effective use of technology for instructional purposes.

    Recent graduate: An individual whom a teacher preparation program has documented as having met all the requirements of the program in any of the three title II reporting years preceding the current reporting year, as defined in the report cards prepared under §§ 612.3 and 612.4. Documentation may take the form of a degree, institutional certificate, program credential, transcript, or other written proof of having met the program's requirements. For the purposes of this definition, a program may not use either of the following criteria to determine if an individual has met all the requirements of the program:

    (i) Becoming a teacher of record; or

    (ii) Obtaining initial certification or licensure.

    Rigorous teacher candidate exit qualifications: Qualifications of a teacher candidate established by a teacher preparation program prior to the candidate's completion of the program using an assessment of candidate performance that relies, at a minimum, on validated professional teaching standards and measures of the candidate's effectiveness in curriculum planning, instruction of students, appropriate plans and modifications for all students, and assessment of student learning.

    Student growth: The change in student achievement between two or more points in time, using a student's scores on the State's assessments under section 1111(b)(2) of the ESEA or other measures of student learning and performance, such as student results on pre-tests and end-of-course tests; objective performance-based assessments; student learning objectives; student performance on English language proficiency assessments; and other measures that are rigorous, comparable across schools, and consistent with State guidelines.

    Teacher evaluation measure: A teacher's performance level based on an LEA's teacher evaluation system that differentiates teachers on a regular basis using at least three performance levels and multiple valid measures in assessing teacher performance. For purposes of this definition, multiple valid measures must include data on student growth for all students (including English learners and students with disabilities) and other measures of professional practice (such as observations based on rigorous teacher performance standards, teacher portfolios, and student and parent surveys).

    Teacher of record: A teacher (including a teacher in a co-teaching assignment) who has been assigned the lead responsibility for student learning in a subject or area.

    Teacher placement rate: (i) The percentage of recent graduates who have become novice teachers (regardless of retention) for the grade level, grade span, and subject area in which they were prepared.

    (ii) At the State's discretion, the rate calculated under paragraph (i) of this definition may exclude one or more of the following, provided that the State uses a consistent approach to assess and report on all of the teacher preparation programs in the State:

    (A) Recent graduates who have taken teaching positions in another State.

    (B) Recent graduates who have taken teaching positions in private schools.

    (C) Recent graduates who have enrolled in graduate school or entered military service.

    (iii) For a teacher preparation program provided through distance education, a State calculates the rate under paragraph (i) of this definition using the total number of recent graduates who have obtained certification or licensure in the State during the three preceding title II reporting years as the denominator.

    Teacher preparation entity: An institution of higher education or other organization that is authorized by the State to prepare teachers.

    Teacher preparation program: A program, whether traditional or alternative route, offered by a teacher preparation entity that leads to initial State teacher certification or licensure in a specific field. Where some participants in the program are in a traditional route to certification or licensure in a specific field, and others are in an alternative route to certification or licensure in that same field, the traditional and alternative route components are considered to be separate teacher preparation programs. The term teacher preparation program includes a teacher preparation program provided through distance education.

    Teacher preparation program provided through distance education: A teacher preparation program at which at least 50 percent of the program's required coursework is offered through distance education.

    Teacher retention rate: The percentage of individuals in a given cohort of novice teachers who have been continuously employed as teachers of record in each year between their first year as a novice teacher and the current reporting year.

    (i) For the purposes of this definition, a cohort of novice teachers includes all teachers who were first identified as a novice teacher by the State in the same title II reporting year.

    (ii) At the State's discretion, the teacher retention rates may exclude one or more of the following, provided that the State uses a consistent approach to assess and report on all teacher preparation programs in the State:

    (A) Novice teachers who have taken teaching positions in other States.

    (B) Novice teachers who have taken teaching positions in private schools.

    (C) Novice teachers who are not retained specifically and directly due to budget cuts.

    (D) Novice teachers who have enrolled in graduate school or entered military service.

    Teacher survey: A survey administered to all novice teachers who are in their first year of teaching that is designed to capture their perceptions of whether the preparation that they received from their teacher preparation program was effective.

    Title II reporting year: A period of twelve consecutive months, starting September 1 and ending August 31.

    Subpart B—Reporting Requirements
    § 612.3 What are the regulatory reporting requirements for the Institutional report card?

    Beginning not later than April 30, 2018, and annually thereafter, each institution of higher education that conducts a teacher preparation program and that enrolls students receiving title IV, HEA program funds—

    (a) Must report to the State on the quality of teacher preparation and other information consistent with section 205(a) of the HEA, using an institutional report card that is prescribed by the Secretary;

    (b) Must prominently and promptly post the institutional report card information on the institution's Web site and, if applicable, on the teacher preparation program portion of the institution's Web site; and

    (c) May also provide the institutional report card information to the general public in promotional or other materials it makes available to prospective students or other individuals.

    § 612.4 What are the regulatory reporting requirements for the State report card?

    (a) General. Beginning not later than October 31, 2018, and annually thereafter, each State that receives funds under the HEA must—

    (1) Report to the Secretary, using a State report card that is prescribed by the Secretary, on—

    (i) The quality of all teacher preparation programs in the State consistent with paragraph (b)(3) of this section, whether or not they enroll students receiving Federal assistance under the HEA; and

    (ii) All other information consistent with section 205(b) of the HEA; and

    (2) Make the State report card information widely available to the general public by posting the State report card information on the State's Web site.

    (b) Reporting of information on teacher preparation program performance. In the State report card, beginning not later than October 31, 2019, and annually thereafter, the State—

    (1) Must make meaningful differentiations in teacher preparation program performance using at least three performance levels—low-performing teacher preparation program, at-risk teacher preparation program, and effective teacher preparation program—based on the indicators in § 612.5.

    (2) Must provide—

    (i) For each teacher preparation program, data for each of the indicators identified in § 612.5 for the most recent title II reporting year;

    (ii) The State's weighting of the different indicators in § 612.5 for purposes of describing the State's assessment of program performance; and

    (iii) Any State-level rewards or consequences associated with the designated performance levels;

    (3) In implementing paragraph (b)(1) through (2) of this section, except as provided in paragraphs (b)(3)(ii)(D) and (b)(5) of this section, must ensure that the performance of all of the State's teacher preparation programs is represented in the State report card by—

    (i)(A) Annually reporting on the performance of each teacher preparation program that, in a given reporting year, produces a total of 25 or more recent graduates who have received initial certification or licensure from the State that allows them to serve in the State as teachers of record for K-12 students and, at a State's discretion, preschool students (i.e., the program size threshold); or

    (B) If a State chooses a program size threshold of less than 25 (e.g., 15 or 20), annually reporting on the performance of each teacher preparation program that, in a given reporting year, produces a number of recent graduates, as described in this paragraph (b)(3)(i), that meets or exceeds this threshold; and

    (ii) For any teacher preparation program that does not meet the program size threshold in paragraph (b)(3)(i)(A) or (B) of this section, annually reporting on the program's performance by aggregating data under paragraph (b)(3)(ii)(A), (B), or (C) of this section in order to meet the program size threshold except as provided in paragraph (b)(3)(ii)(D) of this section.

    (A) The State may report on the program's performance by aggregating data that determine the program's performance with data for other teacher preparation programs that are operated by the same teacher preparation entity and are similar to or broader than the program in content.

    (B) The State may report on the program's performance by aggregating data that determine the program's performance over multiple years for up to four years until the program size threshold is met.

    (C) If the State cannot meet the program size threshold by aggregating data under paragraph (b)(3)(ii)(A) or (B) of this section, it may aggregate data using a combination of the methods under both of these paragraphs.

    (D) The State is not required under this paragraph (b)(3)(ii) to report data on a particular teacher preparation program for a given reporting year if aggregation under paragraph (b)(3)(ii) of this section would not yield the program size threshold for that program; and

    (4) Must report on the procedures established by the State in consultation with a group of stakeholders, as described in paragraph (c)(1) of this section, and on the State's examination of its data collection and reporting, as described in paragraph (c)(2) of this section, in the State report card submitted—

    (i) No later than October 31, 2019, and every four years thereafter; and

    (ii) At any other time that the State makes a substantive change to the weighting of the indicators or the procedures for assessing and reporting the performance of each teacher preparation program in the State described in paragraph (c) of this section.

    (5) The State is not required under this paragraph (b) to report data on a particular teacher preparation program if reporting these data would be inconsistent with Federal or State privacy and confidentiality laws and regulations.
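    The threshold and aggregation rules in paragraphs (b)(3)(i) and (ii) amount to a short decision procedure. The following Python sketch is purely illustrative and is not part of the regulatory text; the record fields, the function name, and the default threshold of 25 are assumptions about how a State data system might encode the rule.

```python
def reporting_basis(program, threshold=25, max_years=4):
    """Decide how a program's performance is represented in the State report card
    under 612.4(b)(3). `program` is a hypothetical record with:
      "recent_graduates_by_year": annual counts, most recent reporting year first
      "related_program_graduates": counts for similar or broader programs run by
                                   the same teacher preparation entity
    """
    by_year = program["recent_graduates_by_year"]
    related = sum(program["related_program_graduates"])

    # (b)(3)(i): report individually if the program meets the threshold this year.
    if by_year[0] >= threshold:
        return "report individually"

    # (b)(3)(ii)(A): aggregate with similar or broader programs of the same entity.
    if by_year[0] + related >= threshold:
        return "aggregate across related programs"

    # (b)(3)(ii)(B): aggregate over multiple years, up to four, until the threshold is met.
    for years in range(2, max_years + 1):
        if sum(by_year[:years]) >= threshold:
            return f"aggregate over {years} years"

    # (b)(3)(ii)(C): combine both methods if neither alone meets the threshold.
    if sum(by_year[:max_years]) + related >= threshold:
        return "aggregate across related programs and multiple years"

    # (b)(3)(ii)(D): reporting is not required if no aggregation yields the threshold.
    return "reporting not required for this year"
```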

    (c) Fair and equitable methods—(1) Consultation. Each State must establish, in consultation with a representative group of stakeholders, the procedures for assessing and reporting the performance of each teacher preparation program in the State under this section.

    (i) The representative group of stakeholders must include, at a minimum, representatives of—

    (A) Leaders and faculty of traditional teacher preparation programs and alternative routes to State certification or licensure programs;

    (B) Students of teacher preparation programs;

    (C) LEA superintendents;

    (D) Small teacher preparation programs (i.e., programs that produce fewer than a program size threshold of 25 recent graduates in a given year or any lower threshold set by a State, as described in paragraph (b)(3)(i) of this section);

    (E) Local school boards;

    (F) Elementary through secondary school leaders and instructional staff;

    (G) Elementary through secondary school students and their parents;

    (H) IHEs that serve high proportions of low-income students, students of color, or English learners;

    (I) English learners, students with disabilities, and other underserved students;

    (J) Officials of the State's standards board or other appropriate standards body; and

    (K) At least one teacher preparation program provided through distance education.

    (ii) The procedures for assessing and reporting the performance of each teacher preparation program in the State under this section must, at minimum, include—

    (A) The weighting of the indicators identified in § 612.5 for establishing performance levels of teacher preparation programs as required by this section;

    (B) The method for aggregation of data pursuant to paragraph (b)(3)(ii) of this section;

    (C) Any State-level rewards or consequences associated with the designated performance levels; and

    (D) Appropriate opportunities for programs to challenge the accuracy of their performance data and classification of the program.

    (2) State examination of data collection and reporting. Each State must periodically examine the quality of the data collection and reporting activities it conducts pursuant to paragraph (b) of this section and § 612.5, and, as appropriate, modify its procedures for assessing and reporting the performance of each teacher preparation program in the State using the procedures in paragraph (c)(1) of this section.

    (d) Inapplicability to certain insular areas. Paragraphs (b) and (c) of this section do not apply to American Samoa, the Commonwealth of the Northern Mariana Islands, the freely associated States of the Republic of the Marshall Islands, the Federated States of Micronesia, the Republic of Palau, Guam, and the United States Virgin Islands.

    § 612.5 What indicators must a State use to report on teacher preparation program performance for purposes of the State report card?

    (a) For purposes of reporting under § 612.4, a State must assess, for each teacher preparation program within its jurisdiction, indicators of academic content knowledge and teaching skills of novice teachers from that program, including, at a minimum, the following indicators:

    (1) Student learning outcomes.

    (i) For each year and each teacher preparation program in the State, a State must calculate the aggregate student learning outcomes of all students taught by novice teachers.

    (ii) For purposes of calculating student learning outcomes under paragraph (a)(1)(i) of this section, a State must use:

    (A) Student growth;

    (B) A teacher evaluation measure;

    (C) Another State-determined measure that is relevant to calculating student learning outcomes, including academic performance, and that meaningfully differentiates among teachers; or

    (D) Any combination of paragraphs (a)(1)(ii)(A), (B), or (C) of this section.

    (iii) At the State's discretion, in calculating a teacher preparation program's aggregate student learning outcomes a State may exclude one or both of the following, provided that the State uses a consistent approach to assess and report on all of the teacher preparation programs in the State—

    (A) Student learning outcomes of students taught by novice teachers who have taken teaching positions in another State.

    (B) Student learning outcomes of all students taught by novice teachers who have taken teaching positions in private schools.

    (2) Employment outcomes.

    (i) Except as provided in paragraph (a)(2)(v) of this section, for each year and each teacher preparation program in the State, a State must calculate:

    (A) Teacher placement rate;

    (B) Teacher placement rate in high-need schools;

    (C) Teacher retention rate; and

    (D) Teacher retention rate in high-need schools.

    (ii) For purposes of reporting the teacher retention rate and teacher retention rate in high-need schools under paragraph (a)(2)(i)(C) and (D) of this section—

    (A) Except as provided in paragraph (B), the State reports a teacher retention rate for each of the three cohorts of novice teachers immediately preceding the current title II reporting year.

    (B)(1) The State is not required to report a teacher retention rate for any teacher preparation program in the State report to be submitted in October 2018.

    (2) For the State report to be submitted in October 2019, the teacher retention rate must be calculated for the cohort of novice teachers identified in the 2017-2018 title II reporting year.

    (3) For the State report to be submitted in October 2020, separate teacher retention rates must be calculated for the cohorts of novice teachers identified in the 2017-2018 and 2018-2019 title II reporting years.

    (iii) For the purposes of calculating employment outcomes under paragraph (a)(2)(i) of this section, a State may, at its discretion, assess traditional and alternative route teacher preparation programs differently, provided that differences in assessments and the reasons for those differences are transparent and that assessments result in equivalent levels of accountability and reporting irrespective of the type of program.

    (iv) For the purposes of the teacher placement rate under paragraph (a)(2)(i)(A) and (B) of this section, a State may, at its discretion, assess teacher preparation programs provided through distance education differently from teacher preparation programs not provided through distance education, based on whether the differences in the way the rate is calculated for teacher preparation programs provided through distance education affect employment outcomes. Differences in assessments and the reasons for those differences must be transparent and result in equivalent levels of accountability and reporting irrespective of where the program is physically located.

    (v) A State is not required to calculate a teacher placement rate under paragraph (a)(2)(i)(A) of this section for alternative route to certification programs.
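    The phase-in schedule for teacher retention rates in paragraph (a)(2)(ii) can be read as a mapping from the October report year to the cohorts whose rates must appear in that report. The sketch below is illustrative only; labeling each title II reporting year by its ending calendar year and the function name are assumptions, not part of the regulatory text.

```python
def retention_cohorts(report_year):
    """Return the novice-teacher cohorts whose retention rates appear in the
    State report submitted in October of `report_year` (see 612.5(a)(2)(ii))."""
    if report_year <= 2018:
        return []        # (ii)(B)(1): no retention rate required in the October 2018 report
    first_cohort = 2018  # cohort identified in the 2017-2018 title II reporting year
    # (ii)(A): up to three cohorts immediately preceding the current reporting year,
    # limited during the phase-in to cohorts identified in 2017-2018 or later.
    start = max(first_cohort, report_year - 3)
    return list(range(start, report_year))

print(retention_cohorts(2019))  # [2018]
print(retention_cohorts(2020))  # [2018, 2019]
print(retention_cohorts(2022))  # [2019, 2020, 2021]
```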

    (3) Survey outcomes. (i) For each year and each teacher preparation program on which a State must report, a State must collect through survey instruments qualitative and quantitative data including, but not limited to, a teacher survey and an employer survey designed to capture perceptions of whether novice teachers who are employed in their first year of teaching possess the academic content knowledge and teaching skills needed to succeed in the classroom.

    (ii) At the State's discretion, in calculating a teacher preparation program's survey outcomes the State may exclude survey outcomes for all novice teachers who have taken teaching positions in private schools provided that the State uses a consistent approach to assess and report on all of the teacher preparation programs in the State.

    (4) Characteristics of teacher preparation programs. Whether the program—

    (i) Is administered by an entity accredited by an agency recognized by the Secretary for accreditation of professional teacher education programs; or

    (ii) Produces teacher candidates—

    (A) With content and pedagogical knowledge;

    (B) With quality clinical preparation; and

    (C) Who have met rigorous teacher candidate exit qualifications.

    (b) At a State's discretion, the indicators of academic content knowledge and teaching skills may include other indicators of a teacher's effect on student performance, such as student survey results, provided that the State uses the same indicators for all teacher preparation programs in the State.

    (c) A State may, at its discretion, exclude from its reporting under paragraphs (a)(1) through (3) of this section individuals who have not become novice teachers within three years of becoming recent graduates.

    (d) This section does not apply to American Samoa, the Commonwealth of the Northern Mariana Islands, the freely associated states of the Republic of the Marshall Islands, the Federated States of Micronesia, the Republic of Palau, Guam, and the United States Virgin Islands.

    § 612.6 What must a State consider in identifying low-performing teacher preparation programs or at-risk teacher preparation programs, and what actions must a State take with respect to those programs identified as low-performing?

    (a)(1) In identifying low-performing or at-risk teacher preparation programs the State must use criteria that, at a minimum, include the indicators of academic content knowledge and teaching skills from § 612.5.

    (2) Paragraph (a)(1) of this section does not apply to American Samoa, the Commonwealth of the Northern Mariana Islands, the freely associated states of the Republic of the Marshall Islands, the Federated States of Micronesia, the Republic of Palau, Guam, and the United States Virgin Islands.

    (b) At a minimum, a State must provide technical assistance to low-performing teacher preparation programs in the State to help them improve their performance in accordance with section 207(a) of the HEA. Technical assistance may include, but is not limited to: Providing programs with information on the specific indicators used to determine the program's rating (e.g., specific areas of weakness in student learning, job placement and retention, and novice teacher and employer satisfaction); assisting programs to address the rigor of their exit criteria; helping programs identify specific areas of curriculum or clinical experiences that correlate with gaps in graduates' preparation; helping identify potential research and other resources to assist program improvement (e.g., evidence of other successful interventions, other university faculty, other teacher preparation programs, nonprofits with expertise in educator preparation and teacher effectiveness improvement, accrediting organizations, or higher education associations); and sharing best practices from exemplary programs.

    Subpart C—Consequences of Withdrawal of State Approval or Financial Support
    § 612.7 What are the consequences for a low-performing teacher preparation program that loses the State's approval or the State's financial support?

    (a) Any teacher preparation program for which the State has withdrawn the State's approval or the State has terminated the State's financial support due to the State's identification of the program as a low-performing teacher preparation program—

    (1) Is ineligible for any funding for professional development activities awarded by the Department as of the date that the State withdrew its approval or terminated its financial support;

    (2) May not include any candidate accepted into the teacher preparation program or any candidate enrolled in the teacher preparation program who receives aid under title IV, HEA programs in the institution's teacher preparation program as of the date that the State withdrew its approval or terminated its financial support; and

    (3) Must provide transitional support, including remedial services, if necessary, to students enrolled at the institution at the time of termination of financial support or withdrawal of approval for a period of time that is not less than the period of time a student continues in the program but no more than 150 percent of the published program length.

    (b) Any institution administering a teacher preparation program that has lost State approval or financial support based on being identified as a low-performing teacher preparation program must—

    (1) Notify the Secretary of its loss of the State's approval or the State's financial support due to identification as low-performing by the State within 30 days of such designation;

    (2) Immediately notify each student who is enrolled in or accepted into the low-performing teacher preparation program and who receives title IV, HEA program funds that, commencing with the next payment period, the institution is no longer eligible to provide such funding to students enrolled in or accepted into the low-performing teacher preparation program; and

    (3) Disclose on its Web site and in promotional materials that it makes available to prospective students that the teacher preparation program has been identified as a low-performing teacher preparation program by any State and has lost the State's approval or the State's financial support, including the identity of the State or States, and that students accepted or enrolled in the low-performing teacher preparation program may not receive title IV, HEA program funds.

    § 612.8 How does a low-performing teacher preparation program regain eligibility to accept or enroll students receiving Title IV, HEA program funds after loss of the State's approval or the State's financial support?

    (a) A low-performing teacher preparation program that has lost the State's approval or the State's financial support may regain its ability to accept and enroll students who receive title IV, HEA program funds upon demonstration to the Secretary under paragraph (b) of this section of—

    (1) Improved performance on the teacher preparation program performance criteria in § 612.5 as determined by the State; and

    (2) Reinstatement of the State's approval or the State's financial support, or, if both were lost, the State's approval and the State's financial support.

    (b) To regain eligibility to accept or enroll students receiving title IV, HEA funds in a teacher preparation program that was previously identified by the State as low-performing and that lost the State's approval or the State's financial support, the institution that offers the teacher preparation program must submit an application to the Secretary along with supporting documentation that will enable the Secretary to determine that the teacher preparation program has met the requirements under paragraph (a) of this section.

    PART 686—TEACHER EDUCATION ASSISTANCE FOR COLLEGE AND HIGHER EDUCATION (TEACH) GRANT PROGRAM

    2. The authority citation for part 686 continues to read as follows:

    Authority:

    20 U.S.C. 1070g, et seq., unless otherwise noted.

    § 686.1 [Amended]
    3. Section 686.1 is amended by removing the words “school serving low-income students” and adding, in their place, the words “school or educational service agency serving low-income students (low-income school)”.

    4. Section 686.2 is amended by:

    A. Redesignating paragraph (d) as paragraph (e).

    B. Adding a new paragraph (d).

    C. In newly redesignated paragraph (e):

    i. Redesignating paragraphs (1) and (2) in the definition of “Academic year or its equivalent for elementary and secondary schools (elementary or secondary academic year)” as paragraphs (i) and (ii);

    ii. Adding in alphabetical order the definition of “Educational service agency”;

    iii. Redesignating paragraphs (1) through (7) in the definition of “High-need field” as paragraphs (i) through (vii), respectively;

    iv. Adding in alphabetical order definitions of “High-quality teacher preparation program not provided through distance education” and “High-quality teacher preparation program provided through distance education”;

    v. Redesignating paragraphs (1) through (3) in the definition of “Institutional Student Information Record (ISIR)” as paragraphs (i) through (iii), respectively;

    vi. Redesignating paragraphs (1) and (2) as paragraphs (i) and (ii) and paragraphs (2)(i) and (ii) as paragraphs (ii)(A) and (B), respectively, in the definition of “Numeric equivalent”;

    vii. Redesignating paragraphs (1) through (3) in the definition of “Post-baccalaureate program” as paragraphs (i) through (iii), respectively;

    viii. Adding in alphabetical order a definition for “School or educational service agency serving low-income students (low-income school)”;

    ix. Removing the definition of “School serving low-income students (low-income school)”;

    x. Revising the definitions of “TEACH Grant-eligible institution” and “TEACH Grant-eligible program”; and

    xi. Revising the definition of “Teacher preparation program.”

    The additions and revisions read as follows:

    § 686.2 Definitions.

    (d) A definition for the following term used in this part is in Title II Reporting System, 34 CFR part 612:

    Effective teacher preparation program.

    (e) * * *

    Educational service agency: A regional public multiservice agency authorized by State statute to develop, manage, and provide services or programs to LEAs, as defined in section 8101 of the Elementary and Secondary Education Act of 1965, as amended (ESEA).

    High-quality teacher preparation program not provided through distance education: A teacher preparation program at which less than 50 percent of the program's required coursework is offered through distance education; and

    (i) Beginning with the 2021-2022 award year, is not classified by the State to be less than an effective teacher preparation program based on 34 CFR 612.4(b) in two of the previous three years; or

    (ii) Meets the exception from State reporting of teacher preparation program performance under 34 CFR 612.4(b)(3)(ii)(D) or 34 CFR 612.4(b)(5).

    High-quality teacher preparation program provided through distance education: A teacher preparation program at which at least 50 percent of the program's required coursework is offered through distance education; and

    (i) Beginning with the 2021-2022 award year, is not classified by the same State to be less than an effective teacher preparation program based on 34 CFR 612.4(b) in two of the previous three years; or

    (ii) Meets the exception from State reporting of teacher preparation program performance under 34 CFR 612.4(b)(3)(ii)(D) or (E).

    School or educational service agency serving low-income students (low-income school): An elementary or secondary school or educational service agency that—

    (i) Is located within the area served by the LEA that is eligible for assistance pursuant to title I of the ESEA;

    (ii) Has been determined by the Secretary to be a school or educational service agency in which more than 30 percent of the school's or educational service agency's total enrollment is made up of children who qualify for services provided under title I of the ESEA; and

    (iii) Is listed in the Department's Annual Directory of Designated Low-Income Schools for Teacher Cancellation Benefits. The Secretary considers all elementary and secondary schools and educational service agencies operated by the Bureau of Indian Education (BIE) in the Department of the Interior or operated on Indian reservations by Indian tribal groups under contract or grant with the BIE to qualify as schools or educational service agencies serving low-income students.

    TEACH Grant-eligible institution: An eligible institution as defined in 34 CFR part 600 that meets financial responsibility standards established in 34 CFR part 668, subpart L, or that qualifies under an alternative standard in 34 CFR 668.175 and—

    (i) Provides at least one high-quality teacher preparation program not provided through distance education or one high-quality teacher preparation program provided through distance education at the baccalaureate or master's degree level that also provides supervision and support services to teachers, or assists in the provision of services to teachers, such as—

    (A) Identifying and making available information on effective teaching skills or strategies;

    (B) Identifying and making available information on effective practices in the supervision and coaching of novice teachers; and

    (C) Mentoring focused on developing effective teaching skills and strategies;

    (ii) Provides a two-year program that is acceptable for full credit in a TEACH Grant-eligible program offered by an institution described in paragraph (i) of this definition, as demonstrated by the institution that provides the two-year program, or provides a program that is the equivalent of an associate degree, as defined in § 668.8(b)(1), that is acceptable for full credit toward a baccalaureate degree in a TEACH Grant-eligible program;

    (iii) Provides a high-quality teacher preparation program not provided through distance education or a high-quality teacher preparation program provided through distance education that is a post-baccalaureate program of study; or

    (iv) Provides a master's degree program that does not meet the definition of the terms “high-quality teacher preparation program not provided through distance education” or “high-quality teacher preparation program provided through distance education” because it is not subject to reporting under 34 CFR part 612, but that prepares:

    (A) A teacher or a retiree from another occupation with expertise in a field in which there is a shortage of teachers, such as mathematics, science, special education, English language acquisition, or another high-need field; or

    (B) A teacher who is using high-quality alternative certification routes to become certified.

    TEACH Grant-eligible program: (i) An eligible program, as defined in 34 CFR 668.8, that meets the definition of a “high-quality teacher preparation program not provided through distance education” or “high-quality teacher preparation program provided through distance education” and that is designed to prepare an individual to teach as a highly-qualified teacher in a high-need field and leads to a baccalaureate or master's degree, or is a post-baccalaureate program of study;

    (ii) A program that is a two-year program or is the equivalent of an associate degree, as defined in 34 CFR 668.8(b)(1), that is acceptable for full credit toward a baccalaureate degree in a TEACH Grant-eligible program; or

    (iii) A master's degree program that does not meet the definition of the terms “high-quality teacher preparation program not provided through distance education” or “high-quality teacher preparation program provided through distance education” because it is not subject to reporting under 34 CFR part 612, but that prepares:

    (A) A teacher or a retiree from another occupation with expertise in a field in which there is a shortage of teachers, such as mathematics, science, special education, English language acquisition, or another high-need field; or

    (B) A teacher who is using high-quality alternative certification routes to become certified.

    Teacher preparation program: A course of study, provided by an institution of higher education, the completion of which signifies that an enrollee has met all of the State's educational or training requirements for initial certification or licensure to teach in the State's elementary or secondary schools. A teacher preparation program may be a traditional program or an alternative route to certification or licensure, as defined by the State.
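    To make the interaction of the new definitions concrete, the following Python sketch is offered as an illustration only; it is not part of the regulatory text. The dictionary keys, rating labels, and classification logic condense the two “high-quality teacher preparation program” definitions above, and the treatment of award years before 2021-2022 and of programs exempt from State reporting reflects one reading of those definitions.

```python
def classify(program, award_year):
    """Return (definition, qualifies) for a program under the 686.2 definitions.

    `program` is a hypothetical record with the share of required coursework
    offered through distance education, whether the program meets an exception
    from State reporting under 34 CFR 612.4, and its last three State ratings.
    """
    definition = (
        "high-quality teacher preparation program provided through distance education"
        if program["distance_coursework_share"] >= 0.50
        else "high-quality teacher preparation program not provided through distance education"
    )

    # Programs meeting the exception from State reporting satisfy the definition
    # without regard to performance classification.
    if program["exempt_from_state_reporting"]:
        return definition, True

    # The performance-based condition begins with the 2021-2022 award year.
    if award_year < "2021-2022":
        return definition, True

    # Otherwise the program must not have been classified below "effective"
    # under 34 CFR 612.4(b) in two of the previous three years.
    below_effective = sum(
        1 for rating in program["last_three_state_ratings"]
        if rating in ("low-performing", "at-risk")
    )
    return definition, below_effective < 2
```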

    5. Section 686.3 is amended by adding paragraph (c) to read as follows:
    § 686.3 Duration of student eligibility.

    (c) An otherwise eligible student who received a TEACH Grant for enrollment in a TEACH Grant-eligible program is eligible to receive additional TEACH Grants to complete that program, even if that program is no longer considered a TEACH Grant-eligible program, not to exceed four Scheduled Awards for an undergraduate or post-baccalaureate student and up to two Scheduled Awards for a graduate student.

    6. Section 686.11 is amended by: A. Revising paragraph (a)(1)(iii). B. Adding paragraph (d).

    The revision and addition read as follows:

    § 686.11 Eligibility to receive a grant.

    (a) * * *

    (1) * * *

    (iii) Is enrolled in a TEACH Grant-eligible institution in a TEACH Grant-eligible program or is an otherwise eligible student who received a TEACH Grant and who is completing a program under § 686.3(c);

    (d) Students who received a total and permanent disability discharge of a TEACH Grant agreement to serve or a title IV, HEA loan. If a student's previous TEACH Grant agreement to serve or title IV, HEA loan was discharged based on total and permanent disability, the student is eligible to receive a TEACH Grant if the student—

    (1) Obtains a certification from a physician that the student is able to engage in substantial gainful activity as defined in 34 CFR 685.102(b);

    (2) Signs a statement acknowledging that neither the new agreement to serve for the TEACH Grant the student receives nor any previously discharged agreement to serve which the grant recipient is required to fulfill in accordance with paragraph (d)(3) of this section can be discharged in the future on the basis of any impairment present when the new grant is awarded, unless that impairment substantially deteriorates and the grant recipient applies for and meets the eligibility requirements for a discharge in accordance with 34 CFR 685.213; and

    (3) In the case of a student who receives a new TEACH Grant within three years of the date that any previous TEACH Grant service obligation or title IV loan was discharged due to a total and permanent disability in accordance with § 686.42(b), 34 CFR 685.213(b)(4)(iii), 34 CFR 674.61(b)(3)(v), or 34 CFR 682.402(c)(3)(iv), acknowledges that he or she is once again subject to the terms of the previously discharged TEACH Grant agreement to serve or resumes repayment on the previously discharged loan in accordance with 34 CFR 685.213(b)(7), 674.61(b)(6), or 682.402(c)(6) before receiving the new grant.

    7. Section 686.12 is amended by: A. In paragraph (b)(2), adding the words “low-income” before the word “school”; and B. Revising paragraph (d).

    The revision reads as follows:

    § 686.12 Agreement to serve.

    (d) Majoring and serving in a high-need field. In order for a grant recipient's teaching service in a high-need field listed in the Nationwide List to count toward satisfying the recipient's service obligation, the high-need field in which he or she prepared to teach must be listed in the Nationwide List for the State in which the grant recipient teaches—

    (1) At the time the grant recipient begins teaching in that field, even if that field subsequently loses its high-need designation for that State; or

    (2) For teaching service performed on or after July 1, 2010, at the time the grant recipient begins teaching in that field or when the grant recipient signed the agreement to serve or received the TEACH Grant, even if that field subsequently loses its high-need designation for that State before the grant recipient begins teaching.

    § 686.32 [Amended]
    8. Section 686.32 is amended by: A. In paragraph (a)(3)(iii)(B), adding the words “or when the grant recipient signed the agreement to serve or received the TEACH Grant” after the words “that field”; and B. In paragraph (c)(4)(iv)(B), adding the words “or when the grant recipient signed the agreement to serve or received the TEACH Grant” after the words “that field”.
    § 686.37 [Amended]
    9. Section 686.37(a)(1) is amended by removing the citation “§§ 686.11” and adding in its place the citation “§§ 686.3(c), 686.11”.

    10. Section 686.40 is amended by revising paragraphs (b) and (f) to read as follows:
    § 686.40 Documenting the service obligation.

    (b) If a grant recipient is performing full-time teaching service in accordance with the agreement to serve, or agreements to serve if more than one agreement exists, the grant recipient must, upon completion of each of the four required elementary or secondary academic years of teaching service, provide to the Secretary documentation of that teaching service on a form approved by the Secretary and certified by the chief administrative officer of the school or educational service agency in which the grant recipient is teaching. The documentation must show that the grant recipient is teaching in a low-income school. If the school or educational service agency at which the grant recipient is employed meets the requirements of a low-income school in the first year of the grant recipient's four elementary or secondary academic years of teaching and the school or educational service agency fails to meet those requirements in subsequent years, those subsequent years of teaching qualify for purposes of this section for that recipient.

    (f) A grant recipient who taught in more than one qualifying school or more than one qualifying educational service agency during an elementary or secondary academic year and demonstrates that the combined teaching service was the equivalent of full-time, as supported by the certification of one or more of the chief administrative officers of the schools or educational service agencies involved, is considered to have completed one elementary or secondary academic year of qualifying teaching.

    11. Section 686.42 is amended by revising paragraph (b) to read as follows:
    § 686.42 Discharge of agreement to serve.

    (b) Total and permanent disability. (1) A grant recipient's agreement to serve is discharged if the recipient becomes totally and permanently disabled, as defined in 34 CFR 685.102(b), and the grant recipient applies for and satisfies the eligibility requirements for a total and permanent disability discharge in accordance with 34 CFR 685.213.

    (2) If at any time the Secretary determines that the grant recipient does not meet the requirements of the three-year period following the discharge as described in 34 CFR 685.213(b)(7), the Secretary will notify the grant recipient that the grant recipient's obligation to satisfy the terms of the agreement to serve is reinstated.

    (3) The Secretary's notification under paragraph (b)(2) of this section will—

    (i) Include the reason or reasons for reinstatement;

    (ii) Provide information on how the grant recipient may contact the Secretary if the grant recipient has questions about the reinstatement or believes that the agreement to serve was reinstated based on incorrect information; and

    (iii) Inform the TEACH Grant recipient that he or she must satisfy the service obligation within the portion of the eight-year period that remained after the date of the discharge.

    (4) If the TEACH Grant of a recipient whose TEACH Grant agreement to serve is reinstated is later converted to a Direct Unsubsidized Loan, the recipient will not be required to pay interest that accrued on the TEACH Grant disbursements from the date the agreement to serve was discharged until the date the agreement to serve was reinstated.

    12. Section 686.43 is amended by revising paragraph (a)(1) to read as follows:
    § 686.43 Obligation to repay the grant.

    (a) * * *

    (1) The grant recipient, regardless of enrollment status, requests that the TEACH Grant be converted into a Federal Direct Unsubsidized Loan because he or she has decided not to teach in a qualified school or educational service agency, or not to teach in a high-need field, or for any other reason;

    [FR Doc. 2016-24856 Filed 10-28-16; 8:45 am] BILLING CODE 4000-01-P
    Part III

    Department of the Treasury

    31 CFR Part 148

    Qualified Financial Contracts Recordkeeping Related to Orderly Liquidation Authority; Final Rule

    DEPARTMENT OF THE TREASURY

    31 CFR Part 148

    RIN 1505-AC46

    Qualified Financial Contracts Recordkeeping Related to Orderly Liquidation Authority

    AGENCY:

    Department of the Treasury.

    ACTION:

    Final rule.

    SUMMARY:

    The Secretary of the Treasury (the “Secretary”), as Chairperson of the Financial Stability Oversight Council (the “Council”), is adopting final rules (the “Final Rules”) in consultation with the Federal Deposit Insurance Corporation (the “FDIC”) to implement the qualified financial contract (“QFC”) recordkeeping requirements of the Dodd-Frank Wall Street Reform and Consumer Protection Act (the “Dodd-Frank Act” or the “Act”). The Final Rules require recordkeeping with respect to positions, counterparties, legal documentation, and collateral. This information is necessary and appropriate to assist the FDIC as receiver to: Fulfill its obligations under the Dodd-Frank Act in deciding whether to transfer QFCs; assess the consequences of decisions to transfer, disaffirm or repudiate, or allow the termination of, QFCs with one or more counterparties; determine if any risks to financial stability are posed by the transfer, disaffirmance or repudiation, or termination of such QFCs; and otherwise exercise its rights under the Act and fulfill its obligations under sections 210(c)(8), (9), or (10) of the Act.

    DATES:

    The Final Rules are effective December 30, 2016.

    FOR FURTHER INFORMATION CONTACT:

    Monique Y.S. Rollins, Deputy Assistant Secretary for Capital Markets, (202) 622-1745; Jacob Liebschutz, Director, Office of Capital Markets, (202) 622-8954; Peter Nickoloff, Financial Economist, Office of Capital Markets, (202) 622-1692; Steven D. Laughton, Assistant General Counsel (Banking & Finance), (202) 622-8413; or Stephen T. Milligan, Attorney-Advisor, (202) 622-4051.

    SUPPLEMENTARY INFORMATION:

    Table of Contents

    I. Introduction
    II. Description of the Final Rules
    A. Scope, Purpose, Effective Date, and Compliance Dates
    1. Scope
    2. Purpose
    3. Effective Date and Compliance Dates
    B. General Definitions
    C. Form, Availability, and Maintenance of Records
    1. Form and Availability
    2. Maintenance and Updating
    3. Exemptions
    D. Content of Records
    1. General Information
    2. Appendix Information
    III. Administrative Law Matters
    A. Regulatory Flexibility Act
    B. Paperwork Reduction Act
    C. Executive Orders 12866 and 13563
    1. Description of the Need for the Regulatory Action
    2. Literature Review
    3. Baseline
    4. Evaluation of Alternatives
    5. Affected Population
    6. Assessment of Potential Costs and Benefits
    7. Retrospective Analysis
    IV. Text of the Final Rules

    I. Introduction

    Title II of the Dodd-Frank Act (“Title II”) 1 generally establishes a mechanism for the orderly resolution of a financial company whose failure and resolution under otherwise applicable federal or state law would have serious adverse effects on financial stability in the United States.

    1 Dodd-Frank Wall Street Reform and Consumer Protection Act, Public Law 111-203, 124 Stat. 1376 (2010).

    Section 210(c)(8)(H) of the Act requires the Federal primary financial regulatory agencies, as defined in the Act 2 (the “PFRAs”), to jointly prescribe, by July 21, 2012, final or interim final regulations that require financial companies to maintain such records with respect to QFCs that the PFRAs determine to be necessary or appropriate to assist the FDIC as receiver for a covered financial company in being able to exercise its rights under the Act and fulfill its obligations under sections 210(c)(8), (9), or (10).3 Section 210(c)(8)(H) also requires the regulations to, as appropriate, differentiate among financial companies by taking into consideration their size, risk, complexity, leverage, frequency and dollar amount of QFCs, interconnectedness to the financial system, and any other factors deemed appropriate.

    2 12 U.S.C. 5301(12). See the term “primary financial regulatory agency.”

    3 12 U.S.C. 5390(c)(8)(H).

    Section 210(c)(8)(H) provides that if the PFRAs do not so prescribe such joint regulations by July 21, 2012, the Secretary, as Chairperson of the Council, shall prescribe such regulations in consultation with the FDIC. As the PFRAs did not prescribe such regulations by the statutory deadline, on January 7, 2015, the Secretary, as Chairperson of the Council, in consultation with the FDIC, requested public comment on proposed rules that would implement section 210(c)(8)(H) (the “Proposed Rules”).4 The Secretary received comments on the Proposed Rules from trade associations, asset managers, insurance companies, clearing organizations, a nonprofit organization, and a private individual. In general, commenters acknowledged the need for the FDIC to have access to appropriate QFC records in order to exercise its role as a receiver under Title II of the Dodd-Frank Act but also requested relief from aspects of the Proposed Rules that they argued were unduly burdensome.5 As discussed below, the Secretary has, in consultation with the FDIC, made substantial changes in the Final Rules in response to the comments received. In making these changes, the Secretary has sought to reduce the burden of the rules while still assuring that the FDIC will have the records it needs to exercise its rights under the Act and fulfill its obligations under sections 210(c)(8), (9), and (10).

    4 80 FR 966 (Jan. 7, 2015).

    5See, e.g., comment letters from The Clearing House Association L.L.C., the Securities Industry and Financial Markets Association, the American Bankers Association, the Financial Services Roundtable, and the Int'l Swaps and Derivatives Association, Inc. (April 7, 2015) (the “TCH et al. letter”), p. 2; The Depository Trust & Clearing Corporation (April 7, 2015) (“DTCC letter”), pp. 1-2; Sutherland Asbill & Brennan LLP on behalf of The Commercial Energy Working Group (April 7, 2015) (“CEWG letter), p. 2; the Asset Management Group of the Securities Industry and Financial Markets Association (April 7, 2015) (“SIFMA AMG letter”), p. 1.

    The substantial constraints imposed by Title II on the FDIC's exercise of its rights with respect to QFCs necessitate the detailed, standardized recordkeeping requirements adopted in the Final Rules. As discussed in greater detail in the Supplementary Information to the Proposed Rules,6 Title II provides the FDIC as receiver of a covered financial company with the authority to (i) transfer the QFCs of the covered financial company to another financial institution, including a bridge financial company established by the FDIC or (ii) retain the QFCs within the receivership, disaffirm or repudiate the QFCs, and pay compensatory damages.7 The FDIC may also retain the QFCs within the receivership and allow the counterparties to terminate the QFCs. In deciding whether to transfer, disaffirm or repudiate, or allow counterparties to terminate the QFCs of the covered financial company, the FDIC must take into consideration the requirements of Title II, including those discussed below.

    6 A more general summary of the treatment of QFCs under Title II and the rights and obligations of the FDIC under the Act was provided in section II of the Supplementary Information to the Proposed Rules. See 80 FR 966, 968-70.

    7 12 U.S.C. 5390(c)(11).

    As referenced throughout this Supplementary Information to the Final Rules, Title II requires that the FDIC as receiver treat the QFCs of a covered financial company with a particular counterparty and that counterparty's affiliates consistently. Within certain constraints, the FDIC may take different approaches with respect to QFCs with different counterparties. However, if the FDIC as receiver desires to transfer any QFC with a particular counterparty, it must transfer all QFCs between the covered financial company and such counterparty and any affiliate of such counterparty to a single financial institution. Similarly, if the FDIC desires to disaffirm or repudiate any QFC with a particular counterparty, it must disaffirm or repudiate all QFCs between the covered financial company and such counterparty and any affiliate of such counterparty.8

    8 For transfer, see 12 U.S.C. 5390(c)(9)(A); for disaffirmance or repudiation, see 12 U.S.C. 5390(c)(11).

    Furthermore, the FDIC is required to confirm that the aggregate amount of liabilities, including QFCs, of the covered financial company that are transferred to, or assumed by, the bridge financial company from the covered financial company do not exceed the aggregate amount of the assets of the covered financial company that are transferred to, or purchased by, the bridge financial company from the covered financial company.9 In addition, in order to repudiate any QFCs of the covered financial company, the receiver must first determine that the performance of such QFCs would be burdensome and that such repudiation will promote the orderly administration of the affairs of the covered financial company.10 More generally, Title II provides that with respect to the disposition of assets of a covered financial company, including a repudiation or transfer of QFCs, the FDIC shall, to the greatest extent practicable, do so in a way that maximizes value and minimizes losses and mitigates the potential for serious adverse effects to the financial system.11 Finally, the FDIC must make its decision as to how to treat the QFCs of the covered financial company within a very limited time frame because the stay that prevents termination based on the appointment of the receiver lasts only for the period between the appointment of the FDIC as receiver and 5 p.m. (eastern time) on the business day following the date of the appointment.12

    9See 12 U.S.C. 5390(h)(5)(F).

    10See 12 U.S.C. 5390(c)(1).

    11See 12 U.S.C. 5390(a)(9)(E). See also 12 U.S.C. 5390(a)(1)(B)(iv).

    12See 12 U.S.C. 5390(c)(10)(B)(i).
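    These constraints explain why the Final Rules emphasize records that can be aggregated by counterparty and counterparty affiliates. The sketch below is illustrative only and is not drawn from the rule text; the record fields are hypothetical, and treating the transferred QFCs as the only liabilities assumed by the bridge financial company is a simplification of the constraint in 12 U.S.C. 5390(h)(5)(F).

```python
from collections import defaultdict

def group_by_counterparty_family(qfc_records):
    """Group open QFC positions by the counterparty's ultimate parent entity,
    since all QFCs with a counterparty and its affiliates must be transferred
    (or disaffirmed or repudiated) together."""
    families = defaultdict(list)
    for qfc in qfc_records:
        families[qfc["counterparty_ultimate_parent"]].append(qfc)
    return families

def bridge_transfer_is_permissible(families_to_transfer, transferred_assets,
                                   other_assumed_liabilities=0.0):
    """Simplified check that liabilities assumed by the bridge financial company,
    including the transferred QFCs, do not exceed the assets transferred to it."""
    qfc_liabilities = sum(
        qfc["liability_value"]
        for family in families_to_transfer
        for qfc in family
    )
    return qfc_liabilities + other_assumed_liabilities <= transferred_assets
```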

    The Secretary has determined that, given these statutory constraints, it is necessary and appropriate for the FDIC as receiver to have access to detailed, standardized records from the financial companies that potentially would be the most likely to be considered for orderly liquidation under Title II. Nonetheless, having considered the comments received, the Secretary has determined that it is possible to reduce the scope of financial companies subject to the rules and the extent of recordkeeping required while still requiring the records the FDIC would need as receiver in order to exercise its rights under the Act and fulfill its obligations under sections 210(c)(8), (9), or (10). In particular, the Secretary has made changes in the Final Rules that provide for further differentiation among financial companies by:

    • Adding to the definition of “records entity” new thresholds based on the level of a financial company's derivatives activity;

    • providing an exclusion for insurance companies;

    • providing a conditional exemption for clearing organizations; and

    • providing a de minimis exemption from the recordkeeping requirements, other than the requirement to maintain copies of the documents that govern QFC transactions, for entities that are party to 50 or fewer open QFC positions.

    The Final Rules also significantly reduce the burden of the required recordkeeping by, among other things:

    • Revising the definition of “records entity” to identify which members of a corporate group are records entities by reference to whether they are consolidated under accounting standards;

    • replacing the requirement to maintain organizational charts of counterparties with a requirement to identify only certain information as to each counterparty, such as the ultimate and immediate parent entities of the counterparty;

    • eliminating the requirement to maintain risk metrics information;

    • eliminating the requirement to maintain copies of additional information with respect to QFCs provided by the records entity to other regulators, swap data repositories, and security-based swap data repositories;

    • eliminating the requirement that copies of QFC agreements be searchable;

    • eliminating several fields from the required data tables; and

    • providing for tiered initial compliance dates based on the size of the corporate group, with all records entities having additional time to comply with the rules.

    The Final Rules also provide for additional fields in the required data tables that are not anticipated to impose a significant additional burden on records entities, and the proposed requirement that records of affiliated records entities be maintained in a form that allows for aggregation has been replaced in the Final Rules with the requirement that the top-tier parent financial company be capable of aggregating such records.

    II. Description of the Final Rules

    The following discussion provides a summary of the Proposed Rules, the comments received, and the Secretary's responses to those comments, including modifications made in the Final Rules. In addition to the considerations discussed in this section, the Secretary, in adopting these Final Rules, has taken into account the potential costs and benefits of the rules discussed in Section III below.

    A. Scope, Purpose, Effective Date, and Compliance Dates

    Section 148.1(a) of the Final Rules defines the scope of the rules. Section 148.1(b) explains the purpose of the rules. Sections 148.1(c) and (d) set forth the rules' effective and compliance dates.

    1. Scope a. Key Definitions

    The scope of the Final Rules is established by certain key definitions that determine the entities that would be subject to the rules. Specifically, section 148.1(a) of the Final Rules provides that the rules apply to any “financial company” that is a “records entity” and, with respect to section 148.3(a), to the “top-tier financial company” of a “corporate group,” as those terms are defined in the Final Rules.

    Financial Company: The Final Rules, as did the Proposed Rules, incorporate the definition of “financial company” set forth in section 201(a)(11) of the Dodd-Frank Act.13 Entities that are not included in the section 201(a)(11) definition of “financial company” are not included in the definition of “records entity” and, therefore, are not subject to the rules. Entities that are included in the section 201(a)(11) definition of “financial company” are subject to the rules if they also meet the other criteria in the definition of “records entity.” In addition, the definition of “covered financial company” in section 201(a)(8) of the Dodd-Frank Act excludes insured depository institutions,14 which as a result are ineligible for a Title II orderly liquidation. Thus, based on the section 201(a)(11) definition of “financial company” and the section 201(a)(8) definition of “covered financial company,” the following entities are not required to maintain records under the Final Rules:

    13 12 U.S.C. 5381(a)(11).

    14 12 U.S.C. 5381(a)(8)(B).

    • Financial companies that are not incorporated or organized under U.S. federal or state law;

    • Farm Credit System institutions;

    • Governmental entities, and regulated entities under the Federal Housing Enterprises Financial Safety and Soundness Act of 1992; 15 and

    15 12 U.S.C. 4502(20).

    • Insured depository institutions.

    Records Entity: Each records entity is required to maintain records with respect to all of its QFCs unless such records entity receives an exemption under the rules. The Proposed Rules would have defined “records entity” as a financial company that: Is not an exempt entity; is a party to an open QFC, or guarantees, supports, or is linked to an open QFC; and meets one of the following requirements: (a) Is determined pursuant to section 113 of the Dodd-Frank Act 16 to be an entity that could pose a threat to the financial stability of the United States; (b) is designated pursuant to section 804 of the Dodd-Frank Act 17 as a financial market utility that is, or is likely to become, systemically important; (c) has total assets equal to or greater than $50 billion; or (d) is a party to an open QFC or guarantees, supports, or is linked to an open QFC of an affiliate and is a member of a corporate group within which at least one affiliate meets one of the criteria in (a), (b), or (c).

    16 12 U.S.C. 5323.

    17 12 U.S.C. 5463.

    As described below, the Secretary has modified the definition of “records entity” in order to further differentiate financial companies by reference to certain factors listed in section 210(c)(8)(H)(iv) and to reduce the costs of complying with the rules. This has the effect of substantially narrowing the scope of entities subject to the recordkeeping requirements of the Final Rules, as discussed more fully below, and thereby reducing the costs imposed by the rules. Furthermore, as discussed below, the Secretary has eliminated the phrase “guarantees, supports, or is linked to an open QFC” from the definition of “records entity” in the Final Rules.

    Designated nonbank financial companies and financial market utilities. The Secretary continues to believe that nonbank financial companies subject to a determination by the Council under section 113 of the Act and financial market utilities designated by the Council under section 804 of the Act as, or as likely to become, systemically important should be included as records entities. As was noted in the Supplementary Information to the Proposed Rules, certain of the factors relevant to a designation under both section 113 and section 804 are similar to the factors listed in section 210(c)(8)(H)(iv). The Council may make a determination under section 113 if it determines that material financial distress at the nonbank financial company, or the nature, scope, size, scale, concentration, interconnectedness, or mix of the activities of the nonbank financial company could pose a threat to the financial stability of the United States.18 Similarly, in making a determination that a financial market utility is or is likely to become systemically important, the Council is required to consider the effect that the failure of or a disruption to the financial market utility would have on critical markets, financial institutions, or the broader financial system.19 In light of the factors the Council must consider in making a determination regarding a nonbank financial company under section 113 or a designation of a financial market utility under section 804, the Secretary has concluded that these are the types of financial companies that potentially would be the most likely to be considered for orderly liquidation under Title II 20 and that it is therefore appropriate that they be deemed to be records entities. Therefore, the Secretary has retained the inclusion of such nonbank financial companies and financial market utilities in the definition of “records entity” in the Final Rules. However, the Secretary has provided a conditional exemption applicable to certain financial market utilities as described below.

    18 A determination under section 113 subjects the nonbank financial company to supervision by the Board of Governors of the Federal Reserve System and to enhanced prudential standards established in accordance with Title I of the Act. See 12 U.S.C. 5365.

    19See 12 U.S.C. 5463(a)(2)(D).

    20 In making a determination under section 113, the Council may take into consideration each of the factors expressly referenced in section 210(c)(8)(H)(iv), including as follows: Leverage of a company may be considered under sections 113(a)(2)(A) or 113(b)(2)(A); complexity may be considered under sections 113(a)(2)(B) or 113(b)(2)(B); interconnectedness to the financial system may be considered under sections 113(a)(2) (C), (G), and (I) or 113(b)(2)(C), (G), and (I); size may be considered under sections 113(a)(2)(B), (D), (E), (G), (I), and (J) or 113(b)(2) (B), (D), (E), (G), (I) and (J); frequency and dollar amount of QFCs may be considered under sections 113(a)(2)(I) and (J) or 113(b)(2)(I) and (J); and risk may be considered throughout sections 113(a)(2) and 113(b)(2). See also 12 CFR 1310.11 (setting forth the Council's considerations in making proposed and final determinations, which correspond to the considerations provided in section 113) and 77 FR 21637 (April 11, 2012) (adopting 12 CFR part 1310 and related interpretive guidance). In making a determination under section 804, the Council takes into consideration various factors under section 804(a)(2) and 12 CFR 1320.10 that correspond to the factors referenced in section 210(c)(8)(H)(iv). See also 76 FR 44763 (July 27, 2011) (adopting 12 CFR part 1320).

    Financial Companies with $50 Billion in Assets; Additional Factors. The Proposed Rules would have included as a records entity any financial company that is not an exempt entity; is a party to an open QFC, or guarantees, supports, or is linked to an open QFC; and has total assets equal to or greater than $50 billion. The Secretary proposed the $50 billion threshold as a useful means of identifying entities that are of a sufficient size that they could potentially be considered for orderly liquidation under Title II. In proposing the $50 billion asset threshold, the Secretary took into consideration the fact that it corresponds to the threshold that was established for determining which bank holding companies would be subject to enhanced supervision and prudential standards under Title I of the Dodd-Frank Act 21 and was also adopted by the Council as an initial threshold for identifying nonbank financial companies that merit further evaluation as to whether they should be designated under section 113 of the Act.22

    21See 12 U.S.C. 5365(a).

    22See Financial Stability Oversight Council Guidance for Nonbank Financial Company Determinations, 12 CFR part 1310, app. A., III.a.

    The proposed $50 billion asset threshold received substantial attention from commenters. Several commenters stated that reliance on this threshold would lead to an overbroad application of the recordkeeping requirements and argued for a more tailored approach that would focus on those institutions that are more likely to be resolved under Title II.23 One commenter proposed $250 billion as a more appropriate level for an asset threshold.24 Several commenters recommended that the Secretary adopt a multi-factor approach, citing the use of multi-factor approaches in other contexts, including the Council's nonbank financial holding company determinations process and the methodology used by the Board of Governors of the Federal Reserve System (“Federal Reserve”) for identifying U.S. global systemically important bank holding companies (“G-SIBs”).25 One commenter stated that the scope of entities subject to the Proposed Rules was too narrow.26

    23See, e.g., TCH et al. letter, p. 7; IIB letter, pp. 5-6; ICI letter, pp. 7-9; SIFMA AMG letter, pp. 3-5. The specific concerns raised with respect to the application of the $50 billion asset threshold to investment companies and investment advisers are discussed below.

    24See IIB letter, p. 7.

    25See IIB letter, pp. 3, 11; TCH et al. letter, p. 11; letter from Capital One Financial Corporation, Fifth Third Bancorp, The PNC Financial Services Group, Inc., Regions Financial Corporation, and SunTrust Banks, Inc. (April 7, 2015) (the “Regional Banks letter”).

    26See Letter from Better Markets, Inc. (April 7, 2015) (“Better Markets letter”), pp. 6-10.

    The Secretary is making two changes to the definition of “records entity” in the Final Rules that will, by incorporating additional factors, substantially reduce the number of entities that will be subject to recordkeeping requirements. These measures relate to several of the factors specifically enumerated in section 210(c)(8)(H) of the Act and allow the Secretary to better limit the financial companies included within the scope of records entities to those companies that potentially would be the most likely to be considered for orderly liquidation under Title II.

    First, the Final Rules specifically include in the definition of “records entity” those entities that are identified as G-SIBs.27 Since the Proposed Rules were issued, the Federal Reserve has adopted rules specifying the criteria by which U.S. bank holding companies are identified as G-SIBs.28 G-SIBs are required to hold additional capital to increase their resiliency in light of the greater threat they pose to the financial stability of the United States.29 An entity is identified as a G-SIB pursuant to the Federal Reserve's rules based on its levels of twelve systemic indicators as compared to the aggregate indicator amounts across other large, global banking organizations. These twelve systemic indicators correspond to five categories—size, interconnectedness, cross-jurisdictional activity, substitutability, and complexity—that correlate with systemic importance and overlap with the factors specifically enumerated in section 210(c)(8)(H) of the Act, listed above.30 Because the G-SIBs have been deemed to be the top-tier U.S. bank holding companies with the greatest systemic importance, the Secretary has determined that it is appropriate to include them within the definition of “records entity” under the Final Rules. By incorporating the Federal Reserve's multi-factor framework into the definition of “records entity,” the Secretary has responded to comments recommending the use of additional factors in that definition.

    27 § 148.2(n)(1)(iii)(C).

    28See 12 CFR part 217, subpart H.

    29See 12 CFR part 217, subpart H; Federal Reserve, Regulatory Capital Rules: Implementation of Risk-Based Capital Surcharges for Global Systemically Important Bank Holding Companies, 80 FR 49082, 83 (Aug. 14, 2015).

    30See 12 CFR 217.404. See also 80 FR at 49095-97.

    However, the Secretary believes that to include only the G-SIBs identified by the Federal Reserve, along with designated financial market utilities and nonbank financial companies subject to a Council determination, within the definition of “records entity” would unduly limit the entities that would be subject to the recordkeeping rules. The G-SIBs identified under the Federal Reserve's rules by definition include only U.S. top-tier bank holding companies, whereas other types of financial companies potentially would also be among the most likely financial companies to be considered for orderly liquidation under Title II. Therefore, in addition to adding the G-SIBs to the definition of “records entity,” the Secretary has chosen to maintain the $50 billion threshold but supplement it with an additional factor tied to a financial company's level of derivatives activity. Specifically, section 148.2(n)(1)(iii)(D) of the Final Rules provides that in addition to having total consolidated assets equal to or greater than $50 billion, an entity must on a consolidated basis have either (i) total gross notional derivatives outstanding equal to or greater than $250 billion or (ii) derivative liabilities equal to or greater than $3.5 billion in order to be deemed a records entity under that prong of the definition. As explained below, this approach incorporates the most relevant factors into the definition of “records entity” by reference to metrics that are already generally calculated by financial companies.
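
    The asset and derivatives test just described reduces to a simple conjunctive threshold check. The following sketch is offered solely as an illustration of how the $50 billion, $250 billion, and $3.5 billion figures interact; it is written in Python, the function and variable names are hypothetical, and it is not part of the rule text.

        # Illustrative sketch only; names are hypothetical and all amounts are in U.S. dollars.
        def meets_asset_and_derivatives_prong(total_consolidated_assets,
                                              gross_notional_derivatives_outstanding,
                                              derivative_liabilities):
            """Approximates the threshold prong described above: total consolidated
            assets of $50 billion or more, together with either $250 billion or more
            in total gross notional derivatives outstanding or $3.5 billion or more
            in derivative liabilities."""
            ASSET_THRESHOLD = 50e9
            NOTIONAL_THRESHOLD = 250e9
            DERIVATIVE_LIABILITIES_THRESHOLD = 3.5e9
            return (total_consolidated_assets >= ASSET_THRESHOLD
                    and (gross_notional_derivatives_outstanding >= NOTIONAL_THRESHOLD
                         or derivative_liabilities >= DERIVATIVE_LIABILITIES_THRESHOLD))

    Under this reading, for example, an entity with $60 billion in total consolidated assets and $4 billion in derivative liabilities would satisfy the prong even if its total gross notional derivatives outstanding were below $250 billion, reflecting the alternative formulation of the derivatives condition.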

    Gross notional derivatives outstanding relates directly to three of the factors enumerated in section 210(c)(8)(H)(iv)—complexity, interconnectedness, and the dollar amount of QFCs. Gross notional derivatives outstanding is used in the Federal Reserve's methodology for identifying G-SIBs as an indicator of complexity.31 Gross derivatives exposure is also one metric the Council has taken into consideration when assessing the interconnectedness of a nonbank financial company under review for a potential determination under section 113.32 In addition, because derivatives reflected in the total gross notional derivatives outstanding metric are all QFCs as defined in Title II, this metric relates directly to the importance of an institution's maintaining QFC records. Derivatives are among the most complex QFCs, and thus the inclusion in the definition of “records entity” of measures of derivatives activity relates directly to the objective of the rules, which is to allow the FDIC to make informed judgments about complex portfolios of QFCs in a timely manner.

    31Id.

    32See 12 CFR part 1310, appx. A.II.d.2.

    Unlike some other potential measures of complexity and interconnectedness and unlike the measures of the volume of QFCs generally, gross notional derivatives outstanding is a measure that the Secretary understands is generally already calculated, and in most cases reported or disclosed, by financial companies with assets of $50 billion or more. Bank holding companies with assets of $50 billion or more are required to report to the Federal Reserve the amount of gross notional derivatives outstanding quarterly on Schedule HC-L of Form FR Y-9C and annually on Schedule D of Form FR Y-15. Financial companies often satisfy the requirement to disclose in their financial statements the volume of their derivatives activity by disclosing the amount of gross notional derivatives outstanding; 33 disclosure of gross notional derivatives outstanding is also frequently provided by large financial companies filing annual and quarterly reports under sections 13 and 15(d) of the Securities Exchange Act of 1934 (“Exchange Act”) to satisfy the requirement of the Securities and Exchange Commission (“SEC”) to provide quantitative disclosures about the market risk of their derivatives portfolio.34 In addition, registered investment companies typically disclose notional amounts with respect to certain derivatives. The Final Rules define “total gross notional derivatives outstanding” as the gross notional value of all derivative instruments that are outstanding as of the end of the most recent fiscal year as recognized and measured in accordance with U.S. generally accepted accounting principles (“GAAP”) or other applicable accounting standards.

    33See FASB Accounting Standards Codification Topic 815, Derivatives and Hedging ¶ 10-50-1A.

    34See Item 305 of Regulation S-K, 17 CFR 229.305.

    Referring to gross notional derivatives outstanding alone, however, would not be sufficient to identify financial companies with large exposures to derivatives. The Final Rules include the amount of a financial company's derivative liabilities as an alternative measure by which a financial company may be deemed a records entity. The Final Rules define “derivative liabilities” as the fair value of derivative instruments in a negative position that are outstanding as of the end of the most recent fiscal year as recognized and measured in accordance with GAAP or other applicable accounting standards, taking into account the effects of master netting agreements and cash collateral held with the same counterparty on a net basis to the extent reflected on the financial company's financial statements. This metric, like total gross notional derivatives outstanding, serves as a proxy for interconnectedness, as a company that has a greater level of derivative liabilities would have higher counterparty exposure throughout the financial system. For this reason, derivative liabilities is one of the metrics used by the Council for identifying nonbank financial companies that may merit further evaluation for a potential determination under section 113.35 Bank holding companies with assets of $50 billion or more are required to report quarterly to the Federal Reserve the net negative fair value of their derivatives contracts classified as trading liabilities on Schedule HC-D of Form FR Y-9C. Moreover, large financial companies filing annual and quarterly reports under the Exchange Act generally disclose the amount of their derivative liabilities in the footnotes to their financial statements in accordance with GAAP.

    35See 12 CFR part 1310, appx. A.III.a.

    The inclusion of both the total gross notional amount of derivatives outstanding and derivative liabilities thresholds in the definition of “records entity” will better capture entities that are using substantial amounts of derivatives. The amount of total gross notional derivatives outstanding is an amount that may not, by itself, be fully representative of the interconnection and complexity of an entity and its QFC activities. For example, the notional amount of interest rate derivatives tends to be significantly larger than the notional amount of credit derivatives representing comparable levels of fair value risk, yet both types of derivatives are indicative of the interconnection and complexity of an entity. In turn, reference to derivative liabilities alone could obscure entities' level of derivatives activity to the extent a financial company's financial statements take into account the effects of netting agreements and cash collateral held with the same counterparty on a net basis. Although such netting may reduce the risk to the entity from engaging in such derivatives, even a derivatives portfolio with a low negative fair value after accounting for the effects of master netting agreements and cash collateral held with the same counterparty is indicative of interconnection and complexity if it is sufficiently large on a gross notional basis.

    By including reference to total assets, notional amount of derivatives, and derivative liabilities, the Secretary has incorporated, as explained above, consideration of size, complexity, interconnectedness to the financial system, and the dollar amount of QFCs into the definition of “records entity.” Size, complexity, and interconnectedness to the financial system are, in turn, all indicators of risk, particularly risk to financial stability.36 The Secretary, in adopting the definition of “records entity,” also considered the other factors listed in section 210(c)(8)(H), i.e., frequency of QFCs and leverage. To the extent that the inclusion of frequency of QFCs among these factors is intended to serve as a proxy for the extent to which QFCs are utilized by a financial company, the Secretary believes that the inclusion of the total gross notional amount of derivatives outstanding and derivative liabilities achieves the same purpose. In addition, the Secretary has considered the frequency of QFCs in providing in the Final Rules for the de minimis exemption pursuant to which a records entity of any size that is a party to 50 or fewer open QFC positions is not required to maintain the records required under the rules other than to maintain copies of the documents governing its QFC transactions. The Secretary has decided not to reference leverage in the definition of “records entity,” because the appropriate methodology for calculating leverage may vary depending on the type of financial company, which would make incorporation of a specific measure of leverage difficult, particularly given the wide variety of entities that fall within the definition of “financial company.”

    36See, e.g., 80 FR at 49095-49097.

    The Final Rules provide for thresholds of $250 billion of total gross notional derivatives outstanding and $3.5 billion of total derivative liabilities. As noted above, bank holding companies with $50 billion or more in total consolidated assets report both total gross notional derivatives outstanding and derivative liabilities in regulatory filings. As of December 31, 2015, all of the G-SIBs were above the thresholds for total gross notional amount of derivatives outstanding and derivative liabilities and in most cases were significantly above the thresholds.37 Conversely, most other bank holding companies were well below both of these thresholds. In addition, the calibration of the derivatives thresholds provided for in the Final Rules brings within the scope of the rules large, complex, and interconnected U.S. subsidiaries of foreign banking organizations that have been identified as global systemically important banks in their home countries.

    37 Although each of the eight bank holding companies that currently are identified as G-SIBs pursuant to 12 CFR part 217 would also qualify as records entities pursuant to § 148.2(n)(1)(iii)(D) of the Final Rules because they each have total consolidated assets in excess of $50 billion and total gross notional derivatives outstanding equal to or greater than $250 billion or derivative liabilities equal to or greater than $3.5 billion, it is possible that in the future, an entity could be deemed a G-SIB without being a records entity under § 148.2(n)(1)(iii)(D) of the Final Rules if it does not maintain a large portfolio of derivatives but does have comparatively high levels of the other systemic indicators set forth in the G-SIB rules. The Secretary has determined that the G-SIBs, having been identified as the bank holding companies with the greatest systemic importance, should be subject to the recordkeeping requirements of the Final Rules regardless of whether they meet the other thresholds provided for in the definition of “records entity.”

    Another reason for setting the thresholds at these levels is to provide for some degree of stability in the set of financial companies that are deemed to be records entities. In looking back across the previous eight quarters, the bank holding companies with derivative liabilities currently at or above the $3.5 billion threshold were at or above the threshold in nearly every quarter, while those with total derivative liabilities currently below the threshold were below the threshold in each quarter. Similarly, for total gross notional derivatives outstanding, bank holding companies at or above the $250 billion threshold were at or above the threshold in nearly every quarter over the last eight quarters, while those with total gross notional derivatives outstanding currently below the threshold were below the threshold in nearly every quarter over the last eight quarters.

    Similar trends are evidenced among other public financial companies reporting derivative liabilities and total gross notional derivatives outstanding in their financial statements filed with the SEC. Among the nonbank financial companies with greater than $50 billion in total consolidated assets that publicly disclose their derivative liabilities or total gross notional derivatives outstanding, as of December 31, 2015, several reported amounts significantly above one or both thresholds while the majority were well below both thresholds. In looking back across the previous eight quarters, those with total derivative liabilities currently at or above the $3.5 billion threshold were above the threshold in every quarter, while those with total derivative liabilities currently below the threshold were below the threshold in nearly every quarter. Similarly, for total gross notional derivatives outstanding, those at or above the $250 billion threshold were above the threshold in nearly every quarter over the last eight quarters, while those below were below in every quarter over the last eight quarters.

    Members of Corporate Groups. The Proposed Rules included within the definition of “records entity” those financial companies that (i) are members of a corporate group in which at least one financial company is a nonbank financial company subject to a Council determination or financial market utility designated by the Council, is a U.S. G-SIB, or meets the $50 billion asset threshold, (ii) are a party to or support a QFC, and (iii) are not excluded entities. The Proposed Rules defined “corporate group” of an entity to include all affiliates of that entity and “affiliate” to include any entity that controls, is controlled by, or is under common control with another entity.

    Several commenters stated that the use of the definition of “affiliate,” discussed further below, had the effect of including too broad a scope of affiliates within the definition of “records entity.” 38 Several commenters argued that only the affiliates that reasonably might be subject to resolution under the orderly liquidation authority of Title II should be included as records entities.39 Other commenters proposed that only those affiliates that meet threshold minimum asset, QFC activity, and complexity criteria should be considered records entities.40 One commenter proposed including as records entities only entities that are identified as being significant to a critical operation or core business line, which, in the case of bank holding companies, would be the “material entities” identified in the resolution plans they are required to prepare.41 Another commenter proposed that the definition of “records entity” only include entities that are consolidated for financial reporting purposes either on the Federal Reserve's Form FR Y-9C (regarding the financial condition of bank holding companies, savings and loan holding companies, and securities holding companies) or under any other generally applicable reporting rules or regulations applicable to the records entity.42

    38See TCH et al. letter, pp. 8-10; ACLI letter, pp. 11-13; ICI letter, pp. 9-10; TIAA-CREF letter, pp. 5-6.

    39See TCH et al. letter, pp. 2-3, 8-10, and 13-15; ACLI letter, p. 12; CEWG letter, p. 2.

    40See ACLI letter, p. 11; TIAA-CREF letter, p. 7.

    41See TCH et al. letter, p. 15. See also Dodd-Frank Act § 165(d) (12 U.S.C. 5365); 12 CFR parts 243, 381.

    42See Letter from The Clearing House Association L.L.C. and the Asset Management Group of the Securities Industry and Financial Markets Association (Nov. 13, 2015) (“TCH/SIFMA letter”).

    As discussed further below, the Secretary has adopted the suggestion of commenters, noted above, to revise the definition of “records entity” to identify which members of a corporate group are records entities by reference to whether they are consolidated under accounting standards. This change should have the effect of reducing the number of records entities. The Final Rules do not otherwise revise the scope of members of a corporate group that are included as records entities because the Secretary has decided that it is not possible to describe, ex ante, the precise characteristics of a financial company that could be placed into receivership under Title II. In particular, an entity could be resolved under Title II without the Secretary making the determination required under section 203(b) with respect to a covered financial company. Title II provides that the FDIC may appoint itself as receiver of an entity if it is a “covered subsidiary” of a covered financial company of which the FDIC has been appointed as receiver and it is jointly determined by the FDIC and the Secretary that (i) the covered subsidiary is in default or in danger of default, (ii) the FDIC's appointment as receiver would avoid or mitigate serious adverse effects on the financial stability or economic conditions of the United States, and (iii) the FDIC's appointment as receiver would facilitate the orderly liquidation of the covered financial company.43 If the FDIC appoints itself receiver of a covered subsidiary, that subsidiary is treated as a covered financial company for purposes of Title II, and the FDIC as receiver would have the same rights under the Act and the same obligations under sections 210(c)(8), (9), or (10) of the Act as it does for other covered financial companies.44

    43See 12 U.S.C. 5390(a)(1)(E)(i). “Covered subsidiary” is defined as any subsidiary of a covered financial company, other than an insured depository institution, an insurance company, or a covered broker or dealer. See 12 U.S.C. 5381(a)(9).

    44See 12 U.S.C. 5390(a)(1)(E)(ii).

    Moreover, information about the QFCs of each member of the corporate group could assist the FDIC as receiver in deciding whether to transfer QFCs to a bridge financial company by giving the FDIC a full understanding of the impact of any transfer on the records entity's corporate group. For example, consider certain QFCs that the FDIC might otherwise determine to retain in the receivership rather than transfer to a bridge financial company (to which the equity in all of the records entity's subsidiaries has been transferred). If, by reference to a subsidiary's QFC records, the FDIC determines that those QFCs are offset by QFCs of the subsidiary with another counterparty, the FDIC as receiver may decide to transfer the records entity's QFCs to the bridge financial company in order to maintain a matched book, at the corporate group level, with the QFCs of the subsidiary.

    The Secretary has, instead of excluding certain types or sizes of members of a corporate group from the definition of “records entity,” differentiated among financial companies by providing the de minimis exemption discussed below for records entities that are a party to 50 or fewer QFCs. As discussed below, the FDIC has advised the Secretary that it would be able to review the terms of that number of QFCs on a manual basis within the time frame provided by Title II. The de minimis exemption included in the Final Rules will, unlike commenters' proposed exclusions based on the materiality of the records entity, avoid a situation in which the FDIC as receiver will not have the records it may need for a particular records entity.
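
    Because the de minimis exemption turns on a single count of open QFC positions, it can be stated compactly. The short sketch below, in Python, is purely illustrative; the names are hypothetical, and the 50-position figure is the one discussed above.

        # Illustrative sketch only; names are hypothetical.
        def qualifies_for_de_minimis_exemption(open_qfc_position_count):
            """A records entity of any size that is a party to 50 or fewer open QFC
            positions is relieved of the detailed recordkeeping requirements, other
            than maintaining copies of the documents governing its QFC transactions."""
            return open_qfc_position_count <= 50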

    Requested additional limitations on definition of “records entity.” Referring to the FDIC's rules at 12 CFR part 371 (“Part 371”), which require recordkeeping by insured depository institutions that are “in a troubled condition,” commenters suggested that the recordkeeping requirements should apply only to financial companies “in a troubled condition” 45 or that meet an analogous threshold.46 Unlike the Federal Deposit Insurance Act (the “FDIA”), which restricts the authority of the FDIC to require QFC recordkeeping by insured depository institutions to those that are “in a troubled condition,” 47 Title II contains no such limitation, and the Secretary believes that adding such a limitation to the Final Rules would not be appropriate. There is no statutory or other established definition of “in a troubled condition” or of an analogous concept for a financial company as there is for an insured depository institution. Although one commenter proposed adoption of a condition based on the amount of risk-based capital at an insurance company,48 such a condition would have to be appropriately calibrated for each type of financial company subject to the rules. More important, given the amount of time that records entities are anticipated to need to come into compliance with the rules, allowing companies to wait until such a condition is met would not leave sufficient time to ensure that the relevant records would be available to the FDIC if needed. Several commenters requested two years to establish the recordkeeping systems required by the Proposed Rules,49 and, as discussed below, the Secretary has provided for two or more years for all but the largest corporate groups to comply with the rules.

    45See letter from The Capital Group Companies, Inc. (April 7, 2015) (the “Capital Group letter”), p. 3; ICI letter, p. 9.

    46See ACLI letter, p. 17.

    47See section 11(e)(8)(H) of the FDIA (12 U.S.C. 1821(e)(8)(H)).

    48See ACLI letter, p. 17.

    49See AMG letter, p. 13; Regional Banks letter, p. 4.

    Excluded Entity: The Proposed Rules provided that the following entities would be exempt from the definition of “records entity” and, therefore, from the scope of the rules:

    (1) An insured depository institution as defined in 12 U.S.C. 1813(c)(2);

    (2) A subsidiary of an insured depository institution that is not a functionally regulated subsidiary as defined in 12 U.S.C. 1844(c)(5), a security-based swap dealer as defined in 15 U.S.C. 78c(a)(71), or a major security-based swap participant as defined in 15 U.S.C. 78c(a)(67); or

    (3) A financial company that is not a party to a QFC and controls only exempt entities as defined in clause (1) of this definition.

    The Final Rules use the term “excluded entity” rather than “exempt entity,” as used in the Proposed Rules, in order to avoid confusion with the Secretary's authority to grant exemptive relief from the requirements of the Final Rules. Several commenters requested the addition of other types of entities to the list of excluded entities, as discussed below.

    Insurance companies. Several commenters recommended that the Proposed Rules be revised to exclude insurance companies from the definition of “records entity.” These commenters pointed to section 203(e) of the Dodd-Frank Act, which requires that the liquidation or rehabilitation of an insurance company, as defined in Title II, be conducted as provided under applicable state law, rather than under the orderly liquidation authority otherwise provided for under Title II.50 Citing this provision, these commenters argued that subjecting insurance companies to the rules' recordkeeping requirements would not be sufficiently justified.51

    50 12 U.S.C. 5383(e).

    51See ACLI letter, pp. 4-6; letter from New York Life Insurance Company, The Northwestern Mutual Life Insurance Company, Massachusetts Mutual Life Insurance Company, and The Guardian Life Insurance Company of America (April 7, 2015) (the “Mutual Insurance Companies letter”), pp. 3-4; TIAA-CREF letter, p. 4.

    Having considered these comments and the requirements of section 203(e) of the Act, the Secretary is excluding insurance companies from the definition of “records entity” in the Final Rules. Given that the liquidation or rehabilitation of an insurance company under Title II would be conducted under state law, to subject insurance companies to the requirements of the rules would not assist the FDIC as receiver in exercising its rights under the Act or fulfilling its obligations under sections 210(c)(8), (9), or (10). As discussed below, a definition of “insurance company” has been added in the Final Rules to ensure consistency with the application of section 203(e) of the Act.

    Commenters also requested that certain non-insurance affiliates of insurance companies be excluded from the scope of the rules, specifically, that non-insurance affiliates within a holding company structure that is predominantly engaged in insurance activities be excluded from the rules.52 Section 203(e) of the Act, however, excludes non-insurance company subsidiaries and affiliates from the requirement, referenced above, that the liquidation or rehabilitation of insurance companies be conducted under state law. Such non-insurance company subsidiaries and affiliates could themselves be determined to be a covered financial company or covered subsidiary. As these entities would be subject to the orderly liquidation authority of Title II, the records that would be required to be generated by these entities under the rules would assist the FDIC in being able to exercise its rights under the Act and fulfill its obligations under sections 210(c)(8), (9), or (10) of the Act. The Secretary is therefore not excluding such non-insurance affiliates of insurance companies from the definition of “records entity” under the Final Rules. However, the changes to the definition of “records entity” discussed above will reduce the number of corporate groups, including those predominantly engaged in insurance activities, that are subject to the rules, and the de minimis exemption discussed below will substantially eliminate recordkeeping requirements for those records entities with minimal QFC activity. A commenter proposed that QFCs that are entered into for the benefit of or on behalf of affiliated insurance companies be excluded from the rules.53 However, it is unclear how such QFCs would be distinguished from other QFCs of non-insurance company affiliates, and the FDIC has advised that it would not necessarily treat such QFCs differently from other QFCs of non-insurance company affiliates.

    52See ACLI letter, p. 3; Mutual Insurance Companies letter, p. 5.

    53See ACLI letter, p. 10.

    Investment companies and investment advisers. A number of commenters argued that investment companies and investment advisers should not be included as records entities subject to the rules' recordkeeping requirements.54 Commenters outlined the manner in which investment advisers and funds are typically resolved outside the scope of Title II 55 and argued that it would be very unlikely for an investment adviser or the funds it manages either to be resolved under Title II or to be important to the FDIC's consideration of a resolution under Title II of a financial company of which the adviser is an affiliate.56 Commenters argued that regulatory constraints applicable to registered investment companies, particularly leverage requirements, and structural features, such as the ability to limit redemptions, mitigate the potential for use of the orderly liquidation authority of Title II.57 Additionally, they contended that because each investment adviser and investment company is highly substitutable, its assets under management could be liquidated or transferred to other managers without threatening financial stability.58

    54See SIFMA AMG letter, pp. 3-4; ICI letter, pp. 7-12.

    55See SIFMA AMG letter, p. 7; ICI letter, pp. 4-5.

    56See SIFMA AMG letter, p. 4; ICI letter, pp. 3-4.

    57See TIAA-CREF letter, p. 5; ICI letter, p. 4.

    58See SIFMA AMG letter, p. 6.

    The definition of “records entity” in the Final Rules would include only extremely large and interconnected asset management firms, and, for the reasons discussed above, investment advisers that are members of a corporate group that is subject to the rules. Although commenters cited examples of mergers and closures of funds and advisers that were conducted in an orderly fashion as demonstrating the unlikelihood of the need to resolve such entities under Title II, these examples did not address the potential effects of the rapid failure of a fund or of an asset management firm or other corporate group of the size and complexity that would be subject to the Final Rules.

    The Secretary has made certain other changes in the Final Rules that will further reduce their impact on asset management firms. In response to the observation of a commenter that an investment adviser may be a party to a QFC of one of its funds or clients for the limited purpose of providing a representation,59 the Secretary confirms that an entity will not be considered to be a party to a QFC for purposes of the rules if it is only a party to such QFC for the limited purpose of providing a representation. In addition, the Secretary notes that individual investment funds, including mutual funds, would not be deemed to be affiliates of an investment adviser or other funds managed by that investment adviser solely by virtue of the investment adviser serving in such capacity with respect to the funds. Further, the Secretary confirms that, as stated in the Supplementary Information to the Proposed Rules,60 each series of a series company (as defined in Rule 18f-2 under the Investment Company Act) 61 will be deemed to be a separate financial company, which means that an individual series would itself have to meet the asset and derivatives thresholds in order to be subject to the rules as a “records entity” and that such an individual series would be able to avail itself of the de minimis exemption if it alone were a party to 50 or fewer QFCs.

    59Id., p. 10.

    60See 80 FR 966, 975, n. 66.

    61 17 CFR 270.18f-2.

    Clearing Organizations. The Proposed Rules' inclusion of designated financial market utilities within the definition of “records entity” would have subjected certain clearing organizations to the recordkeeping requirements of the rules. Three commenters recommended either excluding or exempting clearing organizations from the scope of the Final Rules.62 Commenters stated that the requirements of the Proposed Rules were not appropriate for clearing organizations because the requirements were designed to collect information relevant to bilateral trades, information that is generally irrelevant to, and not collected by, clearing organizations.63 Commenters stated that there is no need to require maintenance of copies of legal agreements as contemplated by the Proposed Rules, as a clearing organization's legal relationships with its clearing members are governed by its rulebook and not by individual contracts with its clearing members.64 More generally, commenters stated that the recordkeeping requirements under the Proposed Rules were not tailored in a manner that would best facilitate resolution of a clearing organization.65

    62See DTCC letter, p. 11; letter from the Options Clearing Corporation (April 7, 2015) (“OCC letter”), pp. 6-8; letter from the Clearing Division of Chicago Mercantile Exchange Inc. (April 7, 2015) (“CME letter”), pp. 5-6.

    63See Letter from the Futures Industry Association (April 10, 2015), p. 2; DTCC letter, p. 9; OCC letter, p. 8.

    64See DTCC letter, p. 9; OCC letter, pp. 11-12. See also CME letter, pp. 6-7.

    65See DTCC letter, p. 7; CME letter, p. 6; OCC letter, pp. 8-13.

    Commenters stated that the FDIC should coordinate with the clearing organizations' primary regulators (the Commodity Futures Trading Commission (“CFTC”) or SEC, as applicable) and utilize to the maximum extent practicable the existing reporting regulations, mechanisms, and formats already applicable to clearing organizations.66 Commenters submitted that the records required to be provided under existing regulations should be sufficient to allow the FDIC as receiver to decide whether to transfer, disaffirm or repudiate, or allow the termination of a clearing organization's QFCs.67 For example, one commenter indicated that a clearing organization can be expected to maintain trade records; aggregated trade data by clearing member; records of the amount of margin posted by or through clearing members; detail on the amount, type, and location of collateral; records of variation margin payments; and the terms of each QFC cleared by the derivatives clearing organization as provided in its rulebook.68

    66See DTCC letter, p. 7; OCC letter, pp. 7-8; CME letter, p. 5.

    67See OCC letter, p. 7.

    68See CME letter, p. 7.

    The Secretary acknowledges that all derivatives clearing organizations are required by the CFTC to maintain extensive records.69 In addition, systemically important derivatives clearing organizations are required by CFTC rules to have procedures for providing the CFTC and FDIC with “information needed for purposes of resolution planning.” 70 Likewise, clearing agencies registered with the SEC are required to maintain extensive records,71 and systemically important or covered clearing agencies for which the SEC is the supervisory agency under the Dodd-Frank Act are required to adopt recovery and wind-down plans.72

    69See 17 CFR 39.14(e), 39.20.

    70See 17 CFR 39.39(c)(2).

    71See 17 CFR 240.17a-1.

    72See 17 CFR 240.17Ad-22(e)(3)(ii).

    In addition, as commenters noted, the unique nature of derivatives clearing organizations makes it possible that their existing recordkeeping practices would be sufficient to meet the needs of the FDIC. The unique characteristics include the following: (i) A clearing organization's only counterparties are its clearing members; (ii) it enters into, or clears, a prescribed set of QFCs; (iii) it maintains a consolidated recordkeeping system to calculate aggregate exposures and margin requirements of its clearing members; and (iv) all transactions are governed by the rulebook of the clearing organization rather than by individual legal agreements. The data requirements of the tables included in the Proposed Rules and the Final Rules were created with the expectation that the FDIC as receiver might need to make decisions as to whether to transfer, disaffirm or repudiate, or allow the termination of QFCs with a specific counterparty and its affiliates. In the case of a clearing organization, in contrast, a significant focus of the FDIC would be maintaining the clearing organization's matched book of QFCs. In these cases, the most relevant data would be the type of data that would be of value to a transferee in managing the transferred QFC portfolio, and this is the type of data that clearing organizations are required by their primary regulators to maintain and report.

    Having considered the foregoing, the Secretary has determined, after consulting with the FDIC, that the FDIC would be able to exercise its rights under the Act and fulfill its obligations under sections 210(c)(8), (9), or (10) of the Act if it has access to the records currently required to be maintained by clearing organizations. Accordingly, the Final Rules provide that a clearing organization is exempt from complying with the recordkeeping requirements of the Final Rules, other than the requirement to designate a point of contact, if it is (i) in compliance with the recordkeeping requirements of the CFTC and the SEC, as applicable, including its maintenance of records pertaining to all QFCs cleared by the clearing organization, and (ii) capable of, and not restricted from (whether by law, regulation, or agreement, such as the clearing organization's rulebook), transmitting electronically directly to the FDIC the records maintained under such recordkeeping requirements within 24 hours of a request of the SEC or CFTC, as applicable, as PFRA for the clearing organization. The Secretary has determined that this approach should eliminate the burden of duplicative and unnecessary data collection for such entities.

    Guaranteed, Supported, or Linked: The Proposed Rules provided definitions for “guaranteed or supported” and “linked.” Under section 210(c)(16) of the Act, the FDIC as receiver has additional powers with respect to contracts of subsidiaries or affiliates of a covered financial company that are guaranteed or otherwise supported by or linked to such covered financial company.73 Such contracts can be enforced by the FDIC as receiver of the covered financial company notwithstanding the insolvency, financial condition, or receivership of the covered financial company. The terms “guarantees or supports” and “linked” in the Proposed Rules were defined in the same way as they are defined in the FDIC's regulations implementing section 210(c)(16) of the Act. Under the Proposed Rules, a financial company would have had to be a party to or have guaranteed or supported or been linked to an open QFC in order to be deemed a records entity, and a records entity would have been required to have maintained records with respect to QFCs that it guaranteed or supported.

    73See 12 U.S.C. 5390(c)(16).

    The Secretary has decided to simplify the rules by omitting references to “guaranteed or supported” and “linked.” Under the Final Rules, a financial company would, in addition to meeting the other criteria discussed above, have to be a party to an open QFC in order to be a “records entity,” and such a records entity would only be required to maintain records with respect to its QFCs. This change reduces the complexity of the rules but generally would not be expected to change significantly which entities would be records entities because guarantees and other credit enhancements of QFCs are themselves QFCs.74 Further, given that the FDIC has adopted regulations clarifying that no special action will be required of the receiver to preserve enforceability of QFCs that are merely “linked” to the entity in receivership,75 the Secretary has removed all references to “linked” from the Final Rules.

    74See 12 U.S.C. 5390(c)(8)(D).

    75See 12 CFR 380.12.

    Affiliate, Subsidiary, and Control: The Proposed Rules defined the terms “affiliate” and “subsidiary” consistently with the definitions given to such terms in the Dodd-Frank Act. Sections 2(1) 76 and 2(18) 77 of the Dodd-Frank Act provide that these terms will have the same meanings as in section 3 of the FDIA. Under section 3(w)(4) of the FDIA, the term “subsidiary” is defined as “any company which is owned or controlled directly or indirectly by another company.” Similarly, the term “affiliate” is defined in section 3(w)(6) of the FDIA by reference to section 2(k) of the Bank Holding Company Act of 1956, as amended (“BHC Act”) 78 as “any company that controls, is controlled by, or is under common control with another company.”

    76 12 U.S.C. 5301(1).

    77 12 U.S.C. 5301(18).

    78 12 U.S.C. 1841(k).

    The FDIA, by reference to section 2 of the BHC Act, provides that a company has control over another company if the first company, directly or indirectly or acting through one or more persons, owns, controls, or has the power to vote 25 percent or more of any class of voting securities of the other company; the first company controls in any manner the election of a majority of the directors or trustees of the other company; or the Federal Reserve determines, after notice and opportunity for hearing, that the first company directly or indirectly exercises a controlling influence over the management or policies of the other company. The first two prongs of the definition of “control” in the Proposed Rules are consistent with the BHC Act definition. The third prong of the definition of “control” in the Proposed Rules, that an entity controls another entity if it must consolidate that entity for financial or regulatory purposes, was proposed to reflect the fact that, in certain situations, a controlling interest may be achieved through arrangements that do not involve voting interests and to provide an objective test that does not require a determination by the Federal Reserve. In the Proposed Rules, the definitions of “affiliate” and “control” related both to (1) the determination of which members of a corporate group would be records entities and (2) the information that would be required to be maintained by records entities as to the identities of affiliates of counterparties.
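
    Summarized in plain terms, the BHC Act control test described above is the disjunction of three prongs. The sketch below, in Python, is purely illustrative; the simplified inputs (a voting-interest percentage and two boolean flags) are hypothetical stand-ins for the statutory concepts and do not capture every nuance of the definition.

        # Illustrative sketch only; inputs are simplified stand-ins for the statutory prongs.
        def has_bhc_act_control(voting_interest_pct,
                                controls_majority_of_board,
                                fed_controlling_influence_determination):
            """Returns True if any of the three prongs summarized above is met:
            (1) owning, controlling, or holding the power to vote 25 percent or more
                of any class of voting securities of the other company;
            (2) controlling in any manner the election of a majority of the other
                company's directors or trustees; or
            (3) a Federal Reserve determination, after notice and opportunity for
                hearing, of a controlling influence over management or policies."""
            return (voting_interest_pct >= 25.0
                    or controls_majority_of_board
                    or fed_controlling_influence_determination)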

    One commenter stated that existing recordkeeping and operational controls with respect to QFCs are customarily maintained by parent companies or other entities that have majority ownership of or are otherwise required to consolidate the entities engaging in QFC activity for financial and regulatory purposes.79 Commenters stated that, in contrast, the proposed definition of “control” would result in records entity status for legal entities, such as joint ventures and companies in which other members of the corporate group only have a minority interest, that might not be subject to actual governing control by the other members of the corporate group. These commenters indicated that this would pose difficulties for corporate groups attempting to coordinate the compliance of all of their member records entities.80 This concern would apply in particular to the requirement that affiliated records entities use the same unique counterparty identifier for each counterparty and the proposed requirement that records of affiliated records entities be maintained in a form that allows for aggregation, which has been replaced in the Final Rules with the requirement that the top-tier parent financial company be capable of aggregating such records. As to the Proposed Rules' requirement to identify the affiliates of counterparties, one commenter argued that non-financial company counterparties' lack of familiarity with the BHC Act definition of “control” would make it difficult for records entities to maintain records as to the identity of such affiliates.81

    79See ACLI letter, p. 14.

    80See ACLI letter, pp. 13-14; TIAA-CREF letter, p. 6.

    81See TCH et al. letter, p. 16.

    The Secretary has determined that the FDIC as receiver in a Title II resolution would need to know the identities of the affiliates, as defined by reference to the BHC Act definition of “control,” of the records entity's counterparties. Specifically, as referenced above, section 210(c)(9)(A) of the Act provides that the FDIC as receiver shall transfer to one transferee either all or none of the QFCs of a counterparty and the counterparty's “affiliates,” as defined by reference to the BHC Act definition of “control.” 82 In addition, this provision requires that in making any such transfer, the FDIC as receiver must also transfer (i) all claims of the counterparty or any of its affiliates against the covered financial company under any such QFC, (ii) all claims of the covered financial company against the counterparty and any of its affiliates under any such QFC, and (iii) all property securing or any other credit enhancement for any such QFC. In order for the FDIC to comply with these requirements, the FDIC must have available to it the information as to affiliates, as defined in Title II, of counterparties that is specified in the tables in the appendix to the rules.

    82See 12 U.S.C. 5390(c)(9)(A).

    As discussed below, the Proposed Rules would have required a records entity to identify each affiliate of a counterparty by maintaining full organizational charts of the corporate group of a QFC counterparty. This has been replaced in the Final Rules with a requirement in the tables in the appendix to the rules to maintain records as to the identity of the immediate and ultimate parent entity of each counterparty, which will allow the FDIC to identify affiliated counterparties based on their common parent and ultimate parent entities. A new term, “parent entity,” has been defined for this purpose as an entity that controls another entity.

    In addition, the Final Rules have been revised to conform the third prong in the definition of “control” to that provided in the BHC Act, i.e., that control exists if the Federal Reserve has determined, after notice and opportunity for hearing, that the company directly or indirectly exercises a controlling influence over the management or policies of the company.83 Including this prong will ensure that in the case in which the Federal Reserve has made such a determination, the FDIC would have the relevant records with respect to QFCs with that entity. Likewise, eliminating the proposed consolidation prong of the definition of “control,” i.e., that an entity controls another entity if it must consolidate another entity for financial or regulatory purposes, will avoid the possibility of capturing entities that are not affiliates of the counterparty for purposes of Title II.

    83See 12 U.S.C. 1841(a)(2)(C).

    As to the determination of which members of a corporate group would be records entities, the Secretary has adopted the request of commenters, referenced above, to define “records entity” by reference to whether an entity is consolidated under accounting standards. Specifically, under the Final Rules, “records entity” is defined to include a member of a corporate group that consolidates, is consolidated with, or is consolidated by the financial company member of the corporate group that meets the other criteria of the definition of “records entity,” e.g., the asset and derivatives thresholds. The rules provide that with respect to financial companies that are not subject to such accounting principles or standards, for instance because they are not required to prepare financial statements, such member of the corporate group would be a “records entity” if it would consolidate, be consolidated by, or be consolidated with such financial company if such principles or standards applied.

    This change addresses the concerns identified by commenters that members of a corporate group would not have access to the records of a minority-owned entity or joint venture and is intended to align the identification of records entities with existing recordkeeping practices of corporate groups. The modification of the definition of “records entity” is also responsive to concerns from commenters that the scope of the Proposed Rules would have been too broad, given that accounting consolidation generally requires a higher degree of affiliation than the 25 percent voting interest standard of the BHC Act definition of “control.”

    Two commenters stated that the definition of “affiliate” could deem investment companies that are “seeded” with an initial capital investment by the fund's sponsor to be affiliates of that sponsor during the period before such a fund attracted third party investors.84 The changes made to the definition of “records entity” in the Final Rules should greatly limit the circumstances in which this is likely to arise. In the event that such a seeded fund were to be deemed a records entity under the rules, the fund would be able to request an exemption from the recordkeeping requirements of the rules for the duration of the seeding period.

    84See TIAA-CREF letter, p. 6; ICI letter, p. 10.

    Non-U.S. Entities: Because the Proposed Rules incorporated the Title II definition of “financial company,” the Proposed Rules applied only to entities incorporated or organized in the United States.85 One commenter argued that the records of foreign affiliates of U.S. broker-dealers should be subject to the recordkeeping requirements.86 However, the Secretary's authority to adopt recordkeeping rules under section 210(c)(8)(H) only extends to financial companies as defined in Title II of the Act; therefore, entities that are not incorporated or organized within the United States, including foreign affiliates of records entities, are not subject to the Final Rules.

    85See 12 U.S.C. 5381(a)(11)(A).

    86See Better Markets letter, pp. 16-19.

    b. Scope of Final Rules

    Section 148.1(a) of the Final Rules provides that the recordkeeping requirements apply to each financial company that qualifies as a records entity and, with respect to section 148.3(a), to the top-tier financial company of a corporate group. As discussed above, the Secretary received numerous comments on the Proposed Rules pertaining to the definition of “records entity.” Section 210(c)(8)(H) of the Dodd-Frank Act gives the Secretary broad flexibility in determining the scope of the recordkeeping requirements as necessary or appropriate in order to assist the FDIC as a receiver for a covered financial company in being able to exercise its rights under the Act and fulfill its obligations under sections 210(c)(8), (9), or (10) of the Act. Section 210(c)(8)(H) also requires the regulations to differentiate among financial companies, as appropriate, by taking into consideration their size, risk, complexity, leverage, frequency and dollar amount of QFCs, interconnectedness to the financial system, and any other factors deemed appropriate. As discussed earlier, the Secretary has complied with these requirements and consulted extensively with the FDIC.

    The Secretary anticipates that records entities may include the following types of financial companies: 87 (i) Broker-dealers, investment advisers, investment companies, swap dealers, security-based swap dealers, major swap participants, major security-based swap participants, derivatives clearing organizations, and clearing agencies; (ii) bank holding companies or bank holding company subsidiaries (that are not insured depository institutions or other types of excluded entities); savings and loan holding companies or savings and loan holding company subsidiaries (that are not insured depository institutions or other types of excluded entities); U.S. affiliates of a foreign bank; noninsured state member banks; agencies or commercial lending companies other than a federal agency; organizations organized and operated under section 25A of the Federal Reserve Act or operating under section 25 of the Federal Reserve Act; (iii)(A) nonbank financial companies that the Council has determined shall be subject to Federal Reserve supervision and enhanced prudential standards under section 113 or (B) financial market utilities that the Council has designated as, or as likely to become, systemically important under section 804; (iv) subsidiaries of State non-member insured banks that are not supervised on a consolidated basis with the State non-member insured bank, or financial companies that are not supervised by a PFRA; and (v) other non-bank financial companies satisfying criteria set forth in the Final Rules.

    87 Not all of these entities would qualify as records entities subject to the Final Rules because of conditions in the definition of records entity related to asset size and level of derivatives activity. “Financial company” includes any company that is incorporated or organized under any provision of federal law or the laws of any state and is predominantly engaged in activities that the Board of Governors has determined are financial in nature for purposes of section 4(k) of the BHC Act. 12 U.S.C. 5381(a)(11). Activities that are “financial in nature” include “providing financial, investment, or economic advisory services, including advising an investment company” and “issuing or selling instruments representing interests in pools of assets . . .” and “underwriting, dealing in, or making a market in securities.” 12 U.S.C. 1843(k)(4).

    2. Purpose

    Section 148.1(b) of the Proposed Rules provided that the purpose of the rules is to establish QFC recordkeeping requirements for a records entity in order to assist the FDIC as receiver for a covered financial company. The Secretary did not receive any comments requesting changes to this section and has not modified it from the Proposed Rules.

    3. Effective Date and Compliance Dates

    a. Initial Compliance Dates

    Section 148.1(c) of the Proposed Rules provided that the rules would become effective 60 days after publication of the Final Rules in the Federal Register. Section 148.1(d)(1) of the Proposed Rules provided that each entity that constitutes a records entity on the date the rules become effective would be required to provide each of its PFRAs and the FDIC a point of contact responsible for recordkeeping under the rules and to comply with all the other requirements of the rules within 270 days of the effective date. For a records entity that becomes subject to the rules after they become effective, compliance with the point of contact requirement would have been required within 60 days after such entity becomes subject to the rules and compliance with all the other requirements of the rules would have been required within 270 days after such entity becomes subject to the rules.

    Several commenters submitted that the proposed compliance period would not provide adequate time for implementation because of the significant information systems upgrades and changes in recordkeeping practices that commenters said would be required.88 Some commenters suggested that the initial compliance period be extended to two years.89 Other commenters suggested that compliance be phased in, with staggered compliance dates for various types of QFCs 90 or for entities based on the size of their QFC portfolios, with entities with the largest QFC portfolios required to comply first under the assumption that they would be more likely to have the infrastructure in place to comply with the recordkeeping requirements.91

    88See ACLI letter, pp. 19-20; OCC letter, p. 12; SIFMA AMG letter, pp. 13, 22-23.

    89See Regional Banks letter, p. 4; SIFMA AMG letter, p. 13.

    90See TCH et al. letter, p. 23.

    91See ACLI letter, pp. 15-16.

    In response to these comments, the Final Rules give all records entities additional time to comply with the requirements of the rules. All records entities will have 90 days after the effective date of the rules to comply with the requirement to provide point of contact information to their PFRAs and the FDIC; this extension will give financial companies additional time to determine whether they are records entities under the rules. As to the remainder of the requirements of the rules, the Final Rules provide staggered compliance dates that will provide all records entities with additional time to comply with the recordkeeping requirements. The Final Rules provide that records entities with $1 trillion or more in total consolidated assets and the financial company members of their corporate group will have 540 days (approximately 18 months) after the effective date to comply with the rules. The Secretary understands that only the four largest G-SIBs would meet this threshold on the effective date. The Secretary has determined that it is important for data on the largest, most systemically important entities to be available as soon as reasonably possible. The FDIC has advised that, in general, large insured depository institutions subject to the Part 371 recordkeeping requirements have been able to comply with those requirements within 270 days. Although the recordkeeping requirements under the Final Rules are more detailed in many respects than those under Part 371, the Secretary believes that the extra time allotted for compliance should be sufficient to allow the largest financial companies to adapt their processes, procedures, and systems to comply with the Final Rules.

    Under the Final Rules, all other records entities will have at least two years to comply with the rules' recordkeeping requirements. Records entities with total assets equal to or greater than $500 billion (but less than $1 trillion) and financial company members of the corporate group of such entities will have two years from the effective date to comply. Records entities with total assets equal to or greater than $250 billion (but less than $500 billion) and financial company members of the corporate group of such entities will have three years from the effective date to comply. All other records entities will have four years from the effective date to comply.
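
    The tiered schedule described above can be summarized, for illustration only, in the following sketch, which is not part of the rule text; the function and variable names are hypothetical, and the two-, three-, and four-year periods are approximated as 730, 1,095, and 1,460 days.

```python
# Illustrative sketch only (not rule text): the staggered initial compliance
# schedule described above, keyed to total consolidated assets.

def initial_compliance_days(total_consolidated_assets_usd: float) -> int:
    """Days after the effective date to comply with the recordkeeping
    requirements; the same period applies to the financial company members of
    the records entity's corporate group. Point-of-contact information is due
    separately, 90 days after the effective date, for all records entities."""
    if total_consolidated_assets_usd >= 1_000_000_000_000:  # $1 trillion or more
        return 540
    if total_consolidated_assets_usd >= 500_000_000_000:    # $500 billion to under $1 trillion
        return 730                                           # two years
    if total_consolidated_assets_usd >= 250_000_000_000:    # $250 billion to under $500 billion
        return 1095                                          # three years
    return 1460                                              # all other records entities: four years
```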

    The Final Rules provide for a staggered schedule based on the total consolidated assets of the records entities (or other members of their corporate group) on the understanding that larger entities will generally have greater capacity to devote to the task of coming into initial compliance with the rules. In addition, because the Department of the Treasury and the FDIC anticipate providing guidance to records entities as they work to come into compliance with the rules, the staggered compliance schedule will permit staff of the Department of the Treasury and the FDIC to allocate their resources to address more efficiently requests for guidance from each tier of records entities in turn. The commenter's proposal to provide for staggered compliance based on type of QFC would mean that the FDIC would not have records that would be meaningfully useful under Title II until the final compliance deadline had been met, given the requirement, discussed above, that if the FDIC as receiver decides (i) to transfer any QFC with a particular counterparty, it must transfer all QFCs between the covered financial company and such counterparty and any affiliate of such counterparty to a single financial institution or (ii) to disaffirm or repudiate any QFC with a particular counterparty, it must disaffirm or repudiate all QFCs between the covered financial company and such counterparty and any affiliate of such counterparty. In contrast, the compliance schedule provided for in the Final Rules would provide the FDIC with complete records for a successively larger set of companies.

    The Final Rules provide that a financial company that becomes a records entity after the effective date must provide point of contact information within 90 days of becoming a records entity and must comply with all other applicable requirements of the rules within 540 days of becoming a records entity or within the remainder of the applicable initial compliance period if it has not yet expired, whichever period is longer. The Secretary believes that this amount of time will be sufficient given that financial companies generally should be able to anticipate meeting the criteria for being deemed a records entity in advance of crossing the total assets and derivatives thresholds.
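
    For illustration only, the longer-of calculation described above for a company that becomes a records entity after the effective date might be sketched as follows; the names are hypothetical and the sketch is not part of the rule text.

```python
from datetime import date, timedelta

def new_records_entity_deadline(effective_date: date,
                                becomes_records_entity: date,
                                initial_period_days: int) -> date:
    """Later of (i) 540 days after the company becomes a records entity and
    (ii) the end of the applicable initial compliance period, if that period
    has not yet expired. initial_period_days is the tiered period discussed
    above (540, 730, 1095, or 1460 days)."""
    own_deadline = becomes_records_entity + timedelta(days=540)
    initial_deadline = effective_date + timedelta(days=initial_period_days)
    return max(own_deadline, initial_deadline)
```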

    b. Subsequent Compliance Dates

    Under Section 148.1(d)(2) of the Proposed Rules, a financial company that no longer qualifies as a records entity would have been permitted to cease maintaining records one year after it ceases to qualify as a records entity. The definition of “records entity” in section 148.2(n) of the Final Rules provides that a company that is a records entity by virtue of exceeding the total assets and derivatives exposure thresholds shall remain a records entity until one year after it ceases to meet the total assets and derivatives exposure thresholds. Financial companies that are members of such a corporate group would be subject to the same provision. However, in a change from the Proposed Rules, any company that is a records entity because it meets the other criteria of the definition shall cease to be a records entity and thus shall cease to be subject to the rules immediately upon ceasing to meet such criteria. For example, a nonbank financial company with respect to which the Council rescinds a determination under section 113 would no longer be a records entity upon such rescission.

    The Proposed Rules provided that a financial company that becomes subject to the rules again after it had ceased recordkeeping would be required to comply with the requirements of the rules within 90 days of the date it again becomes subject to the rules. The Final Rules extend that period to 365 days, but if a longer period still remains under the applicable initial compliance period discussed above, the entity has until the end of that longer period to comply with the rules.

    c. Extensions of Compliance Dates

    Section 148.1(d)(3) of the Final Rules, consistent with section 148.3(c)(3) of the Proposed Rules, authorizes the Secretary, in consultation with the FDIC, to grant extensions of time with respect to compliance with the recordkeeping requirements. As discussed in the Supplemental Information to the Proposed Rules, it is anticipated that such extensions of time would apply when records entities first become subject to the rules and likely would not be used to adjust the time periods specified in the maintenance and updating requirements of section 148.3(b) of the Final Rules. Extensions of time may also be appropriate on a limited basis with respect to a records entity that is temporarily incapable of generating records due to unforeseen technical issues.

    d. Compliance by Top-Tier Financial Company

    Finally, section 148.1(d)(4) of the Final Rules provides that a top-tier financial company must comply with the requirement, discussed below, to be capable of generating a single, compiled set of records of all the members of its corporate group by the date on which the records entity members of its corporate group are required to comply with this part.

    B. General Definitions

    In addition to the definitions described in detail above in reference to the scope of the Proposed Rules, certain additional terms were defined in the Proposed Rules to describe a records entity's recordkeeping obligations. The Secretary did not receive any comments on these definitions.

    The definition of “primary financial regulatory agency” has been revised to include, with respect to a financial market utility that is subject to a designation pursuant to section 804 of the Act, the Supervisory Agency for that financial market utility, as defined in section 803(8) of the Act, if such financial market utility would not otherwise have a PFRA.92

    92 12 U.S.C. 5462(8).

    The term “total assets,” which is used both in the definition of “records entity” and for determining a particular records entity's compliance date, is defined in the Final Rules by reference to the audited consolidated statement of financial condition submitted to the financial company's PFRAs or, if no such statement is submitted, to the financial company's consolidated balance sheet for the most recent fiscal year end, as prepared in accordance with GAAP or other applicable accounting standards. This definition is unchanged from the Proposed Rules other than the addition of the reference to GAAP or other applicable accounting standards. One commenter proposed excluding from the definition of “total assets” any assets under management, even if those assets are included on a balance sheet under applicable accounting standards.93 The Secretary has decided, for the sake of consistency and to allow for ease of determination as to what a financial company's total assets are, not to provide such an exclusion. However, to the extent assets under management are not reflected on a financial company's consolidated statement of financial condition or consolidated balance sheet, as applicable, such assets would not be included within the definition of “total assets.”

    93See SIFMA AMG letter, p. 10.

    The Final Rules also include several additional definitions. A definition of “legal entity identifier,” previously provided in the appendix, has been added to section 148.2. In addition, a definition of “parent entity” has been added because, as discussed below, the appendix has been revised in the Final Rules to require information regarding the immediate and ultimate parent entity of a counterparty to a QFC rather than a full organizational chart for each counterparty. In order to align with the definition of “affiliate” in Title II, as discussed above, “parent entity” is defined in the Final Rules as “an entity that controls another entity.”

    Because, as discussed above, the Final Rules exclude insurance companies from the definition of “records entity,” a definition of “insurance company” has been added. In addition to incorporating the definition of “insurance company” provided in Title II, the definition in the Final Rules includes mutual insurance holding companies that meet the conditions, specified by the FDIC in part 380 of its rules, for being treated as an insurance company for the purpose of section 203(e) of the Act.94 The Final Rules also include definitions of “gross notional amount of derivatives outstanding” and “derivative liabilities,” as discussed above, and a definition of “top-tier financial company,” as discussed below.

    94 A mutual insurance holding company is created through the restructuring of a mutual insurance company into two entities, a mutual insurance holding company and a stock insurance company that is converted from the original mutual insurance company. The FDIC excluded mutual insurance holding companies that meet the conditions specified in its rules in order to address concerns that, because, under applicable state laws, a mutual insurance holding company generally is prohibited from selling policies of insurance, it might not fit squarely within a literal reading of the statutory definition of insurance company under the Dodd-Frank Act. The FDIC also noted that state law generally subjects a mutual insurance holding company to liquidation or rehabilitation under the state regime if the converted mutual insurance company is placed in liquidation or rehabilitation and that in the liquidation of a converted mutual insurance company, the assets of the mutual insurance holding company generally are included in the estate of the converted mutual insurance company being liquidated. See 77 FR 25349, 25349-50 (April 30, 2012).

    C. Form, Availability, and Maintenance of Records

    1. Form and Availability

    Generally applicable requirements. Section 148.3(a)(1) of the Proposed Rules provided that a records entity must maintain all records in electronic form in the format set forth in the appendix to the Proposed Rules. The Proposed Rules further provided that all affiliated records entities in a corporate group must be able to generate data in the same data format and use the same unique counterparty identifiers to enable the aggregation of data. As explained in the Supplemental Information to the Proposed Rules, the FDIC would use the aggregation of counterparty positions to determine the effects of termination or transfer of QFCs. The Secretary requested comments on whether the rules should require that the parent company of a corporate group aggregate the records of the records entities of the corporate group.95 The Secretary, after consulting with the FDIC, has determined that it is important that the FDIC be able to receive a single set of compiled records from a corporate group in order to allow it to exercise its rights under the Act and fulfill its obligations under sections 210(c)(8), (9), or (10) of the Act under the short time frame provided in Title II.

    95See 80 FR 966, 975.

    Accordingly, section 148.3(a)(1) has been revised in the Final Rules to provide that a top-tier financial company, defined as a financial company that is a member of a corporate group consisting of multiple records entities and that is not itself controlled by another financial company, must be able to generate a single, compiled set of the records, in electronic form, for all records entities in the corporate group that it consolidates or that are consolidated with it, in a format that allows for aggregation and disaggregation of such data by records entity and counterparty. By limiting this requirement to records of records entities that are consolidated by or with the top-tier financial company, the Secretary has sought to avoid circumstances in which the top-tier financial company might not have access to the records it is required to compile. The top-tier financial company may comply with this requirement by providing that any of its affiliates or any third-party service provider maintains the capability of generating the single, compiled set of the records, in electronic form, for all records entities in the corporate group; provided, however, that the top-tier financial company shall itself maintain records under this part in the event that such affiliate or service provider fails to maintain such records.96 Given that the Proposed Rules would have required each records entity in a corporate group to generate data in the same format, the Secretary does not anticipate that this will place a significant additional burden on records entities. Section 148.3(a)(2) of the Proposed Rules has been consolidated in the Final Rules with section 148.4, as discussed below under section II.D.1.

    96 It is possible that there could be more than one top-tier financial company in a corporate group, particularly in the circumstance in which the top-tier parent entity of the group is not itself a financial company; in such a case, the top-tier financial companies would presumably provide that only one of them, or an affiliate or service provider, would maintain the capability of generating the single, compiled set of the records for all records entities in the corporate group.
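
    For illustration only, the aggregation and disaggregation capability described above might be sketched as follows; the record layout and names are hypothetical and do not reflect the appendix format.

```python
from collections import defaultdict

# Hypothetical compiled record set for a corporate group (illustrative only).
compiled_records = [
    {"records_entity": "BankSub-A",   "counterparty": "CP-1", "market_value": 125.0},
    {"records_entity": "BankSub-A",   "counterparty": "CP-2", "market_value": -40.0},
    {"records_entity": "BrokerSub-B", "counterparty": "CP-1", "market_value": 15.5},
]

def aggregate_by(records, key):
    """Aggregate position market values by records entity or by counterparty."""
    totals = defaultdict(float)
    for record in records:
        totals[record[key]] += record["market_value"]
    return dict(totals)

by_entity = aggregate_by(compiled_records, "records_entity")
by_counterparty = aggregate_by(compiled_records, "counterparty")
# Disaggregation: recover the positions of one records entity with one counterparty.
cp1_at_bank_a = [r for r in compiled_records
                 if r["records_entity"] == "BankSub-A" and r["counterparty"] == "CP-1"]
```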

    Section 148.3(a)(3) of the Proposed Rules provided that each records entity designate a point of contact to enable its PFRA and the FDIC to contact the records entity with respect to the rules and to update this information within 30 days of any change. The Secretary did not receive any comments on this subsection, which in the Final Rules appears as section 148.3(a)(2), and has not modified it from the Proposed Rules, other than by subjecting the top-tier financial company of a corporate group to this requirement and by making certain technical changes.

    Section 148.3(a)(4) of the Proposed Rules provided that each records entity that is regulated by a PFRA be capable of providing all QFC records specified in the rules to its PFRA within 24 hours of request. This provision has been revised as section 148.3(a)(3) of the Final Rules to provide that the records entity is required to be capable of providing electronically, within 24 hours of the request of the PFRA, all QFC records specified in the rules to both its PFRA and the FDIC. This change has been made to ensure that the records will be maintained in a format that is compatible with the FDIC's systems and to avoid any delay resulting from the records having to be transmitted from the PFRA to the FDIC.97 This provision also provides that the top-tier financial company of a corporate group be required to be capable of providing, upon the request of the PFRA, the compiled set of records for all records entities of the corporate group to both its PFRA and the FDIC.

    97 One commenter requested that the Secretary provide clarification that, given the global nature of many financial companies that would be records entities under the rule, a request for records made before 5:00 p.m. eastern time on a given day must be satisfied by 5:00 p.m. eastern time on the following day. See TCH et al. letter, p. 23. This is not the intention of the Secretary in adopting the 24-hour requirement.

    Request for reliance on existing recordkeeping requirements. Commenters suggested that the records required under the Proposed Rules be made consistent with supervisory recordkeeping or reporting requirements for derivatives imposed by other federal regulatory agencies.98 However, the types of financial contracts included within the scope of other derivatives recordkeeping and reporting requirements are not as broad as the definition of QFCs under the Dodd-Frank Act.99 Further, the scope of entities required to maintain records under such other recordkeeping and reporting rules is different from that under the Final Rules, given their differing purposes. Finally, reliance on a collection of records maintained under different recordkeeping and reporting regimes would not permit the aggregation of data that will be necessary for the receiver to comply with the time frame under which the FDIC as receiver must take action with respect to the covered financial company's QFCs under the statutory constraints discussed above.

    98See SIFMA AMG letter, pp. 12-13; DTCC letter, p. 7; ACLI letter, p. 20; Capital Group letter, pp. 3-4.

    99 For example, the CFTC's swap data recordkeeping requirement at 17 CFR part 46 covers “swaps,” a category that does not include certain contracts, such as commodity contracts and margin loans, that are included in the definition of QFCs under the Dodd-Frank Act.

    Request for exclusion of certain types of transactions. One commenter proposed that the recordkeeping requirements of the Final Rules not apply to QFCs that are for the purchase and sale of securities such as typical cash transactions that settle on a delivery-versus-payment basis or settle within a fixed number of days following the transaction date.100 The commenter argued that (i) these short-term transactions are not relevant to the FDIC for the purposes of its decision making under Title II, (ii) the significant volume of these transactions that would be reported on any given day would overwhelm and obscure otherwise relevant data, and (iii) for those transactions that are exchange traded, only the settlement system and the clearing agency would be listed as direct counterparties, which should simplify the FDIC's decisions with respect to such transactions. The commenter offered similar arguments with respect to QFCs entered into with retail customers or as part of a records entity's retail or brokerage account activities.

    100See TCH/SIFMA letter.

    All QFCs, regardless of their tenor, their volume, and how they are settled, are subject to the requirement, discussed above, that if the FDIC as receiver determines (i) to transfer any QFC with a particular counterparty, it must transfer all QFCs between the covered financial company and such counterparty and any affiliate of such counterparty to a single financial institution or (ii) to disaffirm or repudiate any QFC with a particular counterparty, it must disaffirm or repudiate all QFCs between the covered financial company and such counterparty and any affiliate of such counterparty. The large volume of these short-term transactions supports the determination that the QFC information required to be provided must be maintained in the standard format specified in the rules to ensure rapid aggregation and evaluation of the information by the receiver. Whether these transactions are exchange traded will not necessarily affect the FDIC's decision as to whether to transfer the QFCs in question; rather, the FDIC's decision as to whether to transfer a particular counterparty's QFCs will be based on an evaluation of the other information required to be collected under the Final Rules and on an evaluation of the impact of such transfer on the receivership and U.S. financial stability. Furthermore, for corporate groups that include members that are subject to different recordkeeping regimes, permitting entities to rely on their existing records would not be consistent with the requirement for the top-tier financial company to be capable of generating a single, compiled set of QFC records in a format that allows for aggregation and disaggregation of such data. The Secretary notes, however, that under the exemptive process provided in the rules and discussed below, a records entity may apply for relief from particular requirements as to the information to be maintained by a records entity for a particular type of QFC or counterparty. Any exemptive relief requested with respect to a particular type of QFC or counterparty would need to be defined in such a way as to ensure consistency of treatment by each records entity.

    2. Maintenance and Updating

    Section 148.3(b) of the Proposed Rules would have required that each records entity maintain the capacity to produce QFC records on a daily basis based on previous end-of-day records and values. The Secretary has clarified in the Final Rules that, if records are maintained on behalf of a records entity by an affiliate or service provider, such records entity shall itself maintain records under this part in the event that such affiliate or service provider fails to maintain such records. The Secretary confirms that, as was suggested by a commenter, the information required to be capable of being provided shall be with respect to QFCs as of the end of the day on the date the request is provided.101

    101See TCH et al. letter, p. 23.

    3. Exemptions

    a. Requests for Exemptions

    Section 148.3(c) of the Proposed Rules provided that upon written request by a records entity, the FDIC, in consultation with the PFRAs for the records entity, may recommend that the Secretary grant a specific exemption from compliance with one or more of the requirements of the rules. In addition, under the Proposed Rules, the Secretary would also have been permitted to issue exemptions that have general applicability upon receipt of a recommendation from the FDIC, in consultation with the PFRAs for the applicable records entities.

    One commenter suggested that exemptions should be granted by the PFRAs for a records entity rather than by the Secretary.102 Another commenter suggested that exemption recommendations should be made by the PFRAs rather than by the FDIC.103 A third commenter suggested that the exemption process should be streamlined to involve only one agency.104 After considering these comments, the Secretary is adopting the provision for granting exemptions substantially as proposed, with certain modifications as described below. The Secretary believes that the Act does not authorize the Secretary, as Chairperson of the Council, to delegate decision making authority with respect to these rules to other agencies. In making any decision regarding exemptions, the Secretary continues to believe that it is appropriate to obtain a recommendation from the FDIC, prepared in consultation with the PFRAs for the relevant records entities. The provision for a recommendation from the FDIC is consistent with the requirement that the Secretary consult with the FDIC in adopting these rules and reflects the fact that the FDIC is the intended user of the QFC records. Including the PFRAs for the relevant records entities in the exemption process recognizes their familiarity with the operations of the records entities. The Final Rules have been modified to clarify that, even if the FDIC does not make a recommendation, the Secretary nevertheless may make a determination to grant or deny an exemption request.

    102See Capital Group letter, p. 4.

    103See ICI letter, p. 10.

    104See OCC letter, p. 8.

    In addition, the Secretary has simplified the exemption provision by consolidating the separate provisions for general and specific exemptions and has specified in the Final Rules what a request for an exemption must contain. In determining whether to grant any requests from records entities for exemptions, the Secretary may take into consideration their size, risk, complexity, leverage, frequency and dollar amount of QFCs, interconnectedness to the financial system, and any other factors deemed appropriate, including whether the application of one or more requirements of the rules is not necessary or appropriate to achieve the purpose of the rules.

    b. De Minimis Exemption

    Several commenters argued that the requirements of the Proposed Rules should not apply to records entities that have a minimal level of QFC activity. Commenters noted that a financial company might be subject to the recordkeeping requirements of the Proposed Rules even if it is a party to only a single QFC.105 One commenter suggested that the definition of “records entity” exclude any financial company that, over the immediately preceding 12 months, (i) had fewer than 50 unaffiliated counterparties or entered into fewer than 100 QFC transactions with non-affiliates and (ii) entered into QFCs having a gross notional value equal to or less than $2.5 billion.106 Another commenter proposed providing varying de minimis thresholds for each type of QFC, with different levels set to reflect the different risks associated with each type of QFC.107

    105See ACLI letter, p. 15; TCH et al. letter, p. 11; TIAA-CREF letter, p. 7; CWEG letter, pp. 4-5.

    106See TCH/SIFMA letter.

    107See ACLI letter, p. 15.

    After consideration of these comments, the Secretary has determined that an exemption from the preponderance of the recordkeeping requirements of the rules is appropriate for records entities that have a minimal level of QFC activity such that if the FDIC were appointed as receiver for any such records entity, the FDIC would be in a position to make the requisite determinations with respect to the treatment of QFCs during the stay period even in the absence of the records required to be maintained under the rules. The Secretary considered a number of different approaches to setting the threshold for the de minimis exemption, including the gross notional value of a records entity's QFC portfolio over a defined period, the number of discrete unaffiliated QFC counterparties, and the number of open positions. The Secretary determined that gross notional value would not be an appropriate metric because the gross notional amount of a QFC portfolio is not a good proxy for the difficulty the receiver would have in assessing the QFC portfolio and in making the requisite determinations with speed and accuracy. For instance, a single interest rate swap that exceeds a specified threshold may easily be reviewed by the receiver without standardized recordkeeping. By contrast, a records entity may have a QFC portfolio that falls below the threshold but comprises hundreds of open positions, such that the portfolio would pose challenges for the receiver to review and act upon during the one business day stay period and thus would necessitate the advance recordkeeping required by the rules. Likewise, the Secretary determined that neither the risk each type of QFC might pose, even if that were something that could be distinguished for purposes of these rules, nor any of the other factors listed in section 210(c)(8)(H)(iv) would be relevant to the question of how many QFCs a receiver will be able to review during the one business day stay period.

    Part 371 of the FDIC's rules relaxes the recordkeeping requirements for institutions with fewer than twenty open QFC positions. Based on its experience with Part 371, the FDIC advised that a receiver should be able to exercise its statutory rights and duties under the Dodd-Frank Act relating to QFCs without having access to standardized records for any records entity that is a party to no more than 50 open QFC positions. Having considered the comments received and the FDIC's experience with evaluating QFC portfolios, the Secretary has provided in the Final Rules that any records entity that is a party to no more than 50 open QFC positions is not required to maintain the records described in section 148.4 other than the copies of the documents governing QFC transactions between the records entity and each counterparty as provided in section 148.4(i). This exemption provides further differentiation among financial companies and reduces the burden of the rules without compromising the ability of the FDIC to exercise its rights under the Act and fulfill its obligations under sections 210(c)(8), (9), and (10).
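
    For illustration only, the de minimis test described above reduces to a simple threshold check; the names are hypothetical and the sketch is not part of the rule text.

```python
DE_MINIMIS_OPEN_POSITIONS = 50  # threshold stated in the Final Rules

def de_minimis_exempt(open_qfc_positions: int) -> bool:
    """A records entity with no more than 50 open QFC positions need not
    maintain the section 148.4 records other than copies of the documents
    governing its QFC transactions (section 148.4(i))."""
    return open_qfc_positions <= DE_MINIMIS_OPEN_POSITIONS
```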

    D. Content of Records

    1. General Information

    Section 148.4 of the Final Rules requires each records entity to maintain the data listed in the appendix tables, copies of the documents that govern QFCs, and lists of vendors directly supporting the QFC-related activities of the records entity and the vendors' contact information with respect to each QFC to which it is a party. As discussed above, the Final Rules have been simplified so as not to separately require that a full set of records be maintained with respect to the underlying QFCs for which a records entity provides a guarantee or other credit enhancement. Instead, as discussed below, certain fields specific to the provision by a records entity of a guarantee of a QFC or of another type of credit enhancement of a QFC have been added to the tables in the Final Rules.

    The Proposed Rules would have also required that records entities maintain any written data or information that is not listed in the appendix tables that the records entity is required to provide to a swap data repository, security-based swap data repository, the CFTC, the SEC, or any non-U.S. regulator with respect to any QFC, for any period that such data or information is required to be maintained by its PFRA. Having considered a comment received indicating that this would be unduly burdensome,108 the Secretary has chosen to eliminate these requirements as not sufficiently significant to the receiver to justify the burden they would place on records entities.

    108See TCH et al. letter, p. 17.

    The Proposed Rules provided that a records entity also would be required to maintain electronic, full-text searchable copies of all agreements that govern the QFC transactions subject to the rules, as well as credit support documents related to such QFC transactions. Having considered the comments received indicating that the requirement that such electronic documents be full-text searchable would be unduly burdensome,109 the Secretary has decided to omit this requirement as not sufficiently significant to the receiver to justify the burden it would place on records entities. No comments were received on the proposed requirement that each records entity maintain a list of vendors directly supporting the QFC-related activities and the contact information for such vendors, and this provision has been retained without change in the Final Rules. The Proposed Rules also provided that each records entity would be required to maintain information about the risk metrics used to monitor the QFC portfolios and contact information for each risk manager. The Secretary has decided to eliminate this requirement as not sufficiently significant to the receiver to justify its burden on records entities.

    109See TCH et al. letter, pp. 18-19; ACLI letter, pp. 18-19.

    2. Appendix Information

    For the receiver to make a well-informed decision that complies with the requirements of Title II discussed in section I, the receiver must have sufficient information to fully evaluate and model various QFC transfer or termination scenarios as well as the potential impact of its transfer or retention decisions. To perform this analysis in the extremely limited time frame provided by Title II, the receiver must have access to data on the QFC positions of the records entity, net QFC exposures under applicable netting agreements, detailed and aggregated collateral positions of the records entity and of its counterparties, and information regarding certain key provisions of the legal agreements governing the QFC transactions. Many commenters recognized the importance of maintaining detailed records of QFCs for use by the FDIC if it were appointed as receiver under Title II; however, several commenters expressed concern that the requirements of Tables A-1 through A-4, as proposed, were overly burdensome and would require maintenance of data that is different in content or format from that currently tracked or collected in the ordinary course of business or for other regulatory purposes.110

    110See TCH et al. letter, pp. 15, 20; ACLI letter, pp. 17-18; SIFMA AMG letter, pp. 12-13; TIAA-CREF letter, p. 2.

    The appendix to the Final Rules preserves the basic structure and content of the data tables included in the Proposed Rules. However, the Secretary has eliminated data fields that the Secretary decided would not provide a sufficiently significant benefit to the FDIC as receiver to justify the burden they would place on records entities. Further, the Final Rules add four master data lookup tables, composed largely of requirements that previously appeared in the four data tables of the Proposed Rules, in order to reduce the burden on records entities and improve the tables' functionality for the receiver. These include: (1) A corporate organization master data lookup table; (2) a counterparty master data lookup table; (3) a booking location master data lookup table; and (4) a safekeeping agent master data lookup table.

    The master data lookup tables are cross-referenced to one or more of Tables A-1 through A-4 and provide a centralized site for records of affiliate, counterparty, booking location, and safekeeping agent data, which eliminates the need for a records entity to include duplicative data in Tables A-1 through A-4 and thereby makes it easier for a records entity to enter and update the data included in those Tables. In particular, the records entity members of a corporate group, which are required to utilize common identifiers for shared counterparties, will be able to use the same entry in the counterparty master data lookup table for a given counterparty. For example, if there were several records entities in a corporate group and each was a party to one or more QFCs with a particular counterparty, use of the counterparty master lookup table would enable the information as to that counterparty to be entered only once. The lookup table format, which conforms to customary information technology practices, will also allow for smaller file sizes by eliminating repetitive entries, thereby reducing the burden of maintaining the records and maintaining the capability of transmitting them to the FDIC and the records entity's PFRA.
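
    For illustration only, the following sketch shows how a master data lookup table eliminates repetitive counterparty entries; the structures and field names are hypothetical and are not the appendix fields.

```python
# One shared entry per counterparty; the position-level table carries only the identifier.
counterparty_master = {
    "CP-1": {"legal_name": "Example Dealer LLC", "lei": "5299000EXAMPLE000001"},
}

table_a1_positions = [
    {"position_id": "P-001", "records_entity": "BankSub-A",   "counterparty_id": "CP-1"},
    {"position_id": "P-002", "records_entity": "BrokerSub-B", "counterparty_id": "CP-1"},
]

def counterparty_details(position):
    """Resolve a position's counterparty identifier against the single master entry."""
    return counterparty_master[position["counterparty_id"]]
```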

    Each table contains examples and, as relevant, instructions for recording the required information and an indication of how the FDIC as receiver would apply the required information. A records entity may leave an entry blank for any data fields that do not apply to a given QFC transaction, agreement, collateral item, or counterparty. For example, if a QFC is not collateralized, the data fields that relate to collateral may be left blank (in the case of character fields) or given a zero value (in the case of numerical fields).

    Several commenters noted that the scope of the recordkeeping requirements in the appendix is more extensive than that of the recordkeeping requirements in the appendix to Part 371.111 As noted in the Supplementary Information to the Proposed Rules, the recordkeeping requirements of the rules have been informed by the FDIC's experience in evaluating multiple QFC portfolios of insured depository institutions.

    111See TCH et al. letter, p. 20; Capital Group letter, p. 3.

    a. Table A-1—Position-Level Data

    Table A-1 requires each records entity to maintain detailed position-level data to enable the FDIC as receiver to evaluate a records entity's QFC exposure to each of its counterparties on a position-by-position basis. The records required by the table include critical information about the type, terms, and value of each of the records entity's QFCs. Position-level information must be available for each counterparty, affiliate, and governing netting agreement to allow the FDIC as receiver to model the potential impacts of its decisions relating to the transfer or retention of positions. This information will also enable the FDIC to confirm that the netting-set level data provided in Table A-2, such as the market value of all positions in the netting set (A2.6), which is based on the aggregated data from Table A-1, is accurate and can be validated across different tables. In addition, position-level information will assist the receiver or any transferee in complying with the terms of the records entity's QFCs and thereby reduce the likelihood of inadvertent defaults.

    In response to comments received, the Secretary has made several changes to Table A-1 that will reduce the recordkeeping burden. One commenter recommended elimination of the requirement to identify the purpose of a QFC position, stating that this could involve a complicated analysis and impose a substantial burden on records entities. The commenter stated that a QFC position may have multiple purposes that may change over time such that any identified purpose would be of minimal value to the receiver.112 In response to this comment, the Secretary has eliminated from Table A-1 the requirement to identify the purpose of each QFC.

    112See TCH et al. letter, pp. 20-21.

    One commenter also recommended eliminating the requirement to maintain operational and business-level details relating to QFC positions, such as the identification of related inter-affiliate positions, trading desk identifiers, and points of contact. The commenter stated that such operational and business-level details are subject to frequent change that would require frequent updates by records entities and submitted that this information would likely be of limited value to the receiver.113 In consideration of this comment, the Secretary has decided to eliminate both the requirement to maintain data on related inter-affiliate positions and the requirement to maintain contact information for the person at the records entity responsible for each position. The Secretary has replaced the inter-affiliate fields of the Proposed Rules with a narrower requirement to link only related positions, if any, to which the records entity itself is a party (A1.22). All positions of a particular records entity that are reported on Table A-1 and that are related to one another should have the same designation in this field. The requirement to identify loans related to a QFC position has also been retained (A1.23-24). In addition, in recognition that it may be necessary for the FDIC, in determining whether to transfer a QFC, to locate the personnel at a records entity who are familiar with a particular position and can provide the receiver with additional information on the position, the Final Rules require a records entity to provide, in the booking location master table, identifiers for the booking unit or desk, a description of the booking location, and contact information for the desk associated with a QFC (BL.3-BL.7).

    113See TCH et al. letter, p. 19.

    One commenter stated that the requirement to provide information based on a classification under GAAP or IFRS may not be appropriate if the records entity follows a different accounting standard.114 In response to this comment, the Secretary has decided to require that each records entity maintain the asset classification available under any accounting principles or standards used by the records entity (A1.18). If no asset classification scheme is available under any accounting principles or standards used by a records entity, the records entity may leave the entry blank.

    114See ACLI letter, p. 18.

    To further reduce the burden of Table A-1, the Secretary has eliminated the following proposed data fields in the Final Rules: Industry code (GIC or SIC code); position standardized contract type; and documentation status of the position.

    The Final Rules add two fields to Table A-1 based on the FDIC's experience with implementing Part 371. The Secretary believes that the addition of these fields should impose minimal, if any, additional burden on a records entity. The first addition is a data field for the date that the data maintained in the table was extracted from the records of the records entity (A1.1). Because records entities may derive data from multiple systems in multiple locations, information on the date that data was extracted is necessary to enable the receiver to assess whether all recorded information is current. The data extraction date field has been included in each of the tables of the appendix.

    A netting agreement counterparty identifier field (A1.10) has also been added to the table. Based on its experience with the implementation of Part 371, the FDIC has advised that it is necessary for the rules to address circumstances in which the counterparty to a QFC is different from the counterparty securing the QFC (for example, if an affiliate of the QFC counterparty is providing collateral for the position). In such cases, the netting agreement counterparty identifier is necessary to enable the receiver to link certain position-level data from Table A-1 to the applicable netting-set level data under Table A-2.

    In addition, certain fields specific to guarantees and other credit enhancements of QFCs provided by the records entity have been added to the table, including the type of QFC covered by the guarantee or other third-party credit enhancement (A1.7.1) and the underlying QFC obligor identifier (A1.7.2). Further, the Final Rules include fields requiring identification of any credit enhancement that has been provided by a third party with respect to a QFC of the records entity (A1.21.1-.5).

    As in the Proposed Rules, Table A-1 under the Final Rules requires that a records entity be identified by its legal entity identifier (“LEI”). In order for an LEI to be properly maintained, it must be kept current and up to date according to the standards established by the Global LEI Foundation. In addition, to the extent a records entity uses a global standard unique transaction identifier or unique product identifier to identify a QFC for which records are kept under these rules, the records entity should use such identifiers in completing fields A1.3 and A1.7, respectively. The Secretary has made this change in recognition of the ongoing work of the Committee on Payments and Market Infrastructures and the Board of the International Organization of Securities Commissions to establish such global identifiers.

    b. Table A-2—Counterparty Netting Set Data

    Table A-2, which specifies the information to be maintained regarding aggregated QFC exposure and collateral data by counterparty, has been adopted in the Final Rules substantially as proposed, with certain changes discussed below.

    Table A-2 requires a records entity to maintain records of the aggregated QFC exposures under each netting agreement between the records entity and its counterparty. Table A-2 also requires comprehensive information on the collateral exchanged to secure net exposures under each netting agreement. Information on collateral required by the table includes the market value of collateral, any collateral excess or deficiency positions, the identification of the collateral safekeeping agent, a notation as to whether the collateral posted by a counterparty or a records entity is subject to rehypothecation, and the market value of any collateral subject to rehypothecation. The information required by Table A-2 must be maintained at each level of netting under the relevant governing agreement. For example, if a master agreement includes an annex for repurchase agreements and an annex for forward exchange transactions and requires separate netting under each annex, the information required by Table A-2 with respect to the net exposures under each annex would need to be maintained separately.

    In evaluating whether to transfer or retain QFCs between a records entity and a counterparty, the receiver must be able to assess the records entity's net exposure to the counterparty (and the counterparty's affiliates), the counterparty's net exposure to the records entity, and the amount of collateral securing those exposures. Net QFC exposure data will also assist the receiver in aggregating exposures under netting agreements with a counterparty and its affiliates based on the netting rights of the entire group, in order to determine relative concentrations of risk under each applicable netting agreement. This information will assist the receiver in modeling various transfer or termination scenarios and evaluating the effects and potential impact of the FDIC's decision to transfer the covered financial company's QFCs, retain and disaffirm or repudiate them, or retain them and allow the counterparty to terminate them. Information on collateral also ensures that the FDIC as receiver is able to comply with its statutory obligation to transfer all collateral securing the QFC obligations that it elects to transfer.115 In addition, the records required to be maintained under Table A-2 will assist the receiver in identifying any excess collateral posted by a counterparty for possible return to the counterparty should the contracts be terminated after the one business day stay period.

    115See 12 U.S.C. 5390(c)(9)(A)(i)(IV).

    As discussed above, one commenter recommended eliminating the requirement to maintain operational and business-level details relating to QFC positions, including points of contact and the risk or relationship manager for each counterparty.116 In addition to the changes made to Table A-1 in response to this comment, the Final Rules eliminate from Table A-2 the requirement to provide information on a counterparty risk or relationship manager at the records entity. However, the receiver may need contact information for the counterparty to fulfill its statutory notice requirements under section 210(c)(10) of the Act. Accordingly, the Final Rules retain the requirement, now in Table A-3, to identify a point of contact at the counterparty, but provide that the information to be maintained by the records entity is limited to the information provided by the counterparty pursuant to the notification section of the relevant QFC documentation. Thus, a records entity is not required to update the counterparty contact information unless the counterparty has provided to the records entity a notice of a change to this information.

    116See TCH et al. letter, p. 19.

    The burden of Table A-2 has been further reduced in the Final Rules by elimination of the following fields: Industry code (GIC or SIC code); master netting agreement for counterparty corporate group; name of each master agreement, master netting agreement or other governing documentation related to netting among affiliates in a counterparty's corporate group; current market value of all inter-affiliate positions with the records entity; master netting agreement for records entity's corporate group; and name of each master agreement, master netting agreement or governing documentation related to netting among records entities.

    An additional change was made to Table A-2 relating to the requirement in the Proposed Rules for the maintenance of records on the current market value of all positions netted under the applicable netting agreement. Table A-2 in the Final Rules retains this requirement (A2.6) and adds a related requirement to maintain records of the aggregate current market values of all positive positions (A2.7) and, separately, of all negative positions under the netting agreement (A2.8). Providing such valuations should not pose a significant additional burden, given that the records entity is required to calculate the aggregate current market value of all positions under the netting agreement. Such aggregate positive and aggregate negative positions can be calculated by summing the applicable position-level values provided in Table A-1; however, the FDIC has advised, based on its experience implementing Part 371, that inclusion of this information in summary format will make this information more useful to the receiver in making the determinations necessary to exercise its rights and fulfill its obligations within the one business day stay period.
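
    For illustration only, the relationship between the position-level values in Table A-1 and the netting-set aggregates in Table A-2 might be sketched as follows; the names are hypothetical stand-ins for fields A2.6 through A2.8.

```python
def netting_set_aggregates(position_values):
    """position_values: current market values of all positions under one
    netting agreement, taken from the position-level records (Table A-1)."""
    return {
        "all_positions": sum(position_values),                            # cf. A2.6
        "positive_positions": sum(v for v in position_values if v > 0),   # cf. A2.7
        "negative_positions": sum(v for v in position_values if v < 0),   # cf. A2.8
    }

netting_set_aggregates([125.0, -40.0, 15.5])
# -> {'all_positions': 100.5, 'positive_positions': 140.5, 'negative_positions': -40.0}
```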

    The Proposed Rules would have required that the amount of pending margin calls be included in the calculation of collateral positions. The Final Rules instead require information on the next margin payment date (A2.15) and the next margin payment amount (A2.16) in Table A-2. This information will assist the receiver in avoiding any failure to make a pending margin call during the one business day stay. Since the amount of pending margin calls was required to be calculated under Table A-2 as proposed to determine collateral excess or deficiency, requiring such information to be capable of being separately provided should not impose a significant additional burden.

    In place of the data fields in the Proposed Rules for the legal name of any master agreement guarantor and the unique counterparty identifier of guarantor, Table A-2 includes a field for third-party credit enhancement agreement identifiers (A2.5), which clarifies that it covers unaffiliated providers of credit support and encompasses forms of support in addition to guarantees. The Final Rules also add new fields to Table A-2 (A2.4.1 and A2.5.1-.5) to provide additional information as to third-party credit enhancements. The Final Rules also add to Table A-2 certain fields necessary to link the data in Table A-2 to one or more of the other data tables or lookup tables. Finally, the Final Rules add to Table A-2 the data extraction date field discussed above.

    c. Table A-3—Legal Agreement Data

    Table A-3 as adopted is intended to ensure that the FDIC as receiver has available to it the legal agreements governing and setting forth the terms and conditions of each of the QFCs subject to the rules. Table A-3 requires each legal agreement to be identified by name and unique identifier (A3.3-A3.4) and requires the maintenance of records on key legal terms of the agreement, such as relevant governing law (A3.7) and information about any third-party credit enhancement agreement (A3.10-12.3).

    In response to comments received on the Proposed Rules, the Final Rules include several changes to Table A-3 to reduce the recordkeeping burden. Commenters suggested eliminating the proposed requirement in Table A-3 to maintain records containing descriptions or excerpts of certain cross-default provisions, transfer restrictions, events of default, and termination events set forth in each QFC agreement or master agreement, arguing that providing this information would be extremely burdensome and of limited value to the receiver.117 In response to this comment, the Secretary has eliminated from Table A-3 the requirements to provide any information on transfer restrictions and substantially reduced the information required as to default provisions. As to cross-defaults, the Final Rules require only that a records entity indicate whether a QFC contains a default or other termination event provision that references another entity that is not a party to the QFC and, if so, the identity of such entity (A3.8-A3.9).

    117See TCH letter, p. 19; ACLI letter, p. 17.

    To further reduce the burden of Table A-3, the Final Rules eliminate the following proposed data fields: Basic form of agreement; legal name of guarantor of records entity obligations; industry code (GIC or SIC code); and legal name of guarantor of counterparty obligations.

    Other changes to Table A-3 conform to those discussed above with respect to other tables, i.e., inclusion of the data extraction date field (A3.1), a field for the records entity identifier (A3.2, to link the data in Table A-3 to other data tables or look-up tables), an agreement date field (A3.5) and a field to identify the underlying QFC obligation for QFCs that are guarantees or credit enhancements (A3.6.1). In addition, as noted above in the discussion of Table A-2, the counterparty contact information that was required under Table A-2 in the Proposed Rules has been moved to fields A3.13-A3.16.

    d. Table A-4—Collateral Detail Data

    Table A-4 requires detailed information, on a counterparty by counterparty basis, relating to the collateral received by and the collateral posted by the records entity as reported in Table A-2. This information includes, for each collateral item, the unique collateral identifier (A4.6), information about the value of the collateral (A4.7-9), a description of the collateral (A4.10), the fair value asset classification (A4.11), the collateral segregation status (A4.12), the collateral location and jurisdiction (A4.13-14), and whether the collateral is subject to rehypothecation (A4.15). This collateral detail data, together with the netting-set level collateral data in Table A-2, will enable the receiver to more fully assess the type, nature, value, and location of the collateral and to model various QFC transfer or termination scenarios. Collateral detail information will also enable the receiver to ensure that collateral is transferred together with any QFCs that it secures, as required by the Act.118 For cross-border transactions, the comprehensive information on collateral will assist the receiver in determining the sufficiency and availability of collateral posted outside the United States, as well as any close-out risk if the receiver does not arrange for the transfer of QFC positions.

    118See 12 U.S.C. 5390(c)(9)(A)(i)(IV).

    The Secretary did not receive any comments requesting specific changes to the requirements of Table A-4. Nevertheless, to reduce the burden of Table A-4, the following data fields have been eliminated in the Final Rules: Original face amount of collateral item in U.S. dollars; current end of day market value amount of collateral item in local currency; and collateral code. The Final Rules also eliminate the requirement to describe the scope of collateral segregation.

    A collateral posted or received flag has been added to Table A-4 to clearly indicate to the receiver whether the collateral was posted or received by the records entity (A4.3). This field should impose minimal additional burden because a records entity will already need to identify all collateral as posted or received in Table A-2, which requires separate collateral information for collateral posted and collateral received. The Final Rules also add the data extraction date field (A4.1), as discussed above, to Table A-4 as well as certain other fields necessary to link the data in Table A-4 to the data maintained in one or more of the other data tables or look-up tables (A4.2, A4.4, A4.5).

    e. Corporate Organization Master Data Lookup Table

    In the Proposed Rules, information regarding a records entity's affiliates was required by section 148.4(a)(7) and Tables A-1 and A-2. The Secretary has determined it is appropriate to provide instead for the corporate organization information to be maintained in the new corporate organization master data lookup table, which is cross-referenced with Tables A-1 through A-4. The Final Rules require this information to be maintained by a records entity with respect to itself and all of the members of its corporate group, which includes all of the records entity's affiliates. Although, as discussed above, the definition of “records entity” has been revised in the Final Rules to identify which members of a corporate group are records entities by reference to whether they are consolidated under accounting standards, in the event of a Title II resolution, the FDIC would need the information described in the next paragraph for each affiliate, irrespective of consolidation, to allow it to exercise its rights and fulfill its obligations under, and ensure compliance with, section 210(c)(16) of the Act. As referenced above, under section 210(c)(16) of the Act, the contracts of subsidiaries or affiliates of a covered financial company that are guaranteed or otherwise supported by or linked to such covered financial company can be enforced by the FDIC as receiver of the covered financial company notwithstanding the insolvency, financial condition, or receivership of the financial company if the FDIC transfers the guarantee or other support to a bridge financial company or other third party.119 The FDIC's decision as to whether to transfer such a guarantee or credit support pursuant to sections 210(c)(9) and (10) of the Act may thus be influenced by the information required to be maintained as to a records entity's affiliates. Information about affiliates of the records entity will also, as discussed below, assist the FDIC with monitoring compliance with the rules.

    119 12 U.S.C. 5390(c)(16).

    The information that each records entity will need to maintain with respect to itself and each of its affiliates includes its and its affiliates' identifiers and legal name (CO.2-4), identification of immediate parent (CO.5-CO.7), the immediate parent's percentage ownership (CO.8), the entity type (CO.9), domicile (CO.10), and jurisdiction of incorporation or organization (CO.11). This information will be easier to provide and to update as part of the corporate organization master data lookup table rather than as part of the corporate organization chart provided for under the Proposed Rules. Use of the corporate organization master data lookup table will also facilitate the linking of the data provided in Tables A-1 through A-4 to key information about the records entity and its affiliates.

    The corporate organization master data lookup table also includes a recordkeeping status field (CO.12) that was not included in the Proposed Rules. This field requires the records entity to identify, with respect to each of its affiliates, whether the affiliate is (i) a records entity, (ii) a non-financial company, (iii) an excluded entity, (iv) a financial company that is not a party to any open QFCs, (v) a records entity that is availing itself of the de minimis exemption, or (vi) a records entity that is availing itself of another exemption, e.g., the conditional exemption for clearing organizations provided under the Final Rules. The information provided in this field will enable the FDIC as receiver to validate that all affiliates that are records entities have provided records to the extent appropriate. For example, if an affiliate has not provided QFC records, the FDIC will be able to ascertain, by reference to this field, whether the affiliate has not provided records because it is not a party to any QFCs, has availed itself of the de minimis exemption, or is not included within the definition of “records entity.” The addition of the de minimis exemption in the Final Rules made the need for this field more acute; without this information, the FDIC as receiver would not be alerted that an entity has availed itself of the de minimis exemption and that the FDIC therefore would need to review the QFC documentation of that entity manually. Because each member of a corporate group for which there is a records entity will make its own determination as to whether it is subject to the recordkeeping requirements of the rules, the addition of this field should impose only a minimal burden.
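
    A minimal sketch of how the corporate organization master data lookup table fields and the new recordkeeping status field might be represented is shown below; the enumeration simply restates categories (i) through (vi) above, and all names are illustrative rather than taken from the appendix to the Final Rules.

        from dataclasses import dataclass
        from enum import Enum

        class RecordkeepingStatus(Enum):
            """Illustrative encoding of the CO.12 categories (i)-(vi) described above."""
            RECORDS_ENTITY = "records entity"
            NON_FINANCIAL_COMPANY = "non-financial company"
            EXCLUDED_ENTITY = "excluded entity"
            NO_OPEN_QFCS = "financial company that is not a party to any open QFCs"
            DE_MINIMIS_EXEMPTION = "records entity availing itself of the de minimis exemption"
            OTHER_EXEMPTION = "records entity availing itself of another exemption"

        @dataclass
        class CorporateOrganizationRecord:
            """Illustrative sketch of a corporate organization master data lookup table row."""
            entity_id: str                             # CO.2-CO.4  entity identifier(s)
            legal_name: str                            # CO.2-CO.4  legal name of the records entity or affiliate
            immediate_parent_id: str                   # CO.5-CO.7  identification of the immediate parent (collapsed here)
            immediate_parent_ownership_pct: float      # CO.8  immediate parent's percentage ownership
            entity_type: str                           # CO.9  entity type
            domicile: str                              # CO.10 domicile
            jurisdiction_of_organization: str          # CO.11 jurisdiction of incorporation or organization
            recordkeeping_status: RecordkeepingStatus  # CO.12 recordkeeping status, per the enumeration above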

    f. Counterparty Master Data Lookup Table

    In the Proposed Rules, information regarding a records entity's non-affiliated QFC counterparties was required by section 148.4(a)(6) and in Table A-2. Several commenters suggested that the organizational and affiliate information for counterparties not affiliated with the records entity that would have been required by the Proposed Rules be eliminated or significantly reduced.120 These commenters stated that the broad definitions of “affiliate” and “control” would make this a complex and difficult analysis.121 One commenter noted that most financial companies do not ask for or maintain records on affiliations between counterparties (other than parent-subsidiary relationships) and that these relationships are subject to change, such that even if such information were maintained, the records entity would not be in a position to verify the accuracy of the information.122

    120See TCH et al. letter, pp. 16-17; SIFMA AMG letter, p. 11.

    121Id.

    122See TCH et al. letter, p. 16.

    Having considered the comments received as to the burden of collecting, maintaining, and updating this information, the Secretary has determined that information regarding the identity of the immediate and ultimate parent of each counterparty is sufficient to enable the FDIC as receiver to comply with the requirement, discussed above, that the FDIC either (i) transfer all QFCs between the covered financial company and a counterparty and any affiliate of such counterparty to a single financial institution, (ii) disaffirm or repudiate all such QFCs, or (iii) retain all such QFCs. The data required by the counterparty master data lookup table includes the counterparty identifier (CP.2, which must be the current LEI maintained by the counterparty if the counterparty has obtained an LEI), the legal name of the counterparty (CP.4), domicile of counterparty (CP.5), jurisdiction of incorporation (CP.6), identification of the immediate parent of the counterparty (CP.7-CP.9), and identification of the ultimate parent of the counterparty (CP.10-CP.12).
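
    To illustrate why the parent identifiers matter, the hypothetical sketch below groups a records entity's counterparties by their ultimate parent; each resulting group approximates a counterparty family whose QFCs the FDIC as receiver would have to transfer, disaffirm or repudiate, or retain as a unit. The record layout and the helper function are assumptions made for this example, not part of the Final Rules.

        from collections import defaultdict
        from dataclasses import dataclass

        @dataclass
        class CounterpartyRecord:
            """Illustrative sketch of a counterparty master data lookup table row; names are paraphrased."""
            counterparty_id: str      # CP.2  the counterparty's current LEI, where it has obtained one
            legal_name: str           # CP.4  legal name of the counterparty
            domicile: str             # CP.5  domicile of the counterparty
            jurisdiction: str         # CP.6  jurisdiction of incorporation
            immediate_parent_id: str  # CP.7-CP.9   identification of the immediate parent (collapsed here)
            ultimate_parent_id: str   # CP.10-CP.12 identification of the ultimate parent (collapsed here)

        def group_by_ultimate_parent(counterparties):
            """Group counterparty identifiers under a shared ultimate parent (hypothetical helper)."""
            groups = defaultdict(list)
            for cp in counterparties:
                groups[cp.ultimate_parent_id].append(cp.counterparty_id)
            return dict(groups)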

    g. Booking Location Master Data Lookup Table

    In the Proposed Rules, the maintenance of information related to the booking location of a QFC position was required under Table A-1. To simplify the tables and facilitate the updating of this information, the Secretary has decided that some of this information should be maintained in a separate table. The information required by the booking location table, which includes the booking location identifier, the booking unit or desk identifier, and the related description and contact information, will enable the receiver to determine where the trade is booked and settled and to understand the purpose of the position. As noted above, Table A-1 as proposed had also required information pertaining to a point of contact responsible for the position. Based on consideration of comments received, the Secretary determined that this information is not necessary for the FDIC so long as records entities are required to provide current information on the booking location and the booking unit or desk pertaining to QFCs.

    h. Safekeeping Agent Master Data Lookup Table

    In the Proposed Rules, the maintenance of information relating to the safekeeping agent for collateral securing a QFC position was required by Table A-2. To simplify the tables and facilitate updating this information, the Secretary has decided to maintain the detailed information as to safekeeping agent in a separate table. The data required by this table includes the safekeeping agent identifier, name, and point of contact information (SA.2-SA.7). The information in this table must be capable of being provided with respect to each safekeeping agent for collateral of QFCs of a records entity, whether the safekeeping agent is a third party, the counterparty to the QFC secured by such collateral, or the records entity itself.

    III. Administrative Law Matters

    A. Regulatory Flexibility Act

    The Regulatory Flexibility Act (the “RFA”) (5 U.S.C. 601 et seq.) requires an agency to consider whether the rules it promulgates will have a significant economic impact on a substantial number of small entities. Congress enacted the RFA to address concerns related to the effects of agency rules on small entities, and the Secretary is sensitive to the impact the Final Rules may impose on small entities. The RFA defines a “small business” as having the same meaning as “small business concern” under section 3 of the Small Business Act,123 which is defined as an entity that is “independently owned and operated” and is “not dominant in its field of operation.” 124 In this case, the Secretary believes that the Final Rules likely would not have a “significant economic impact on a substantial number of small entities.” The Dodd-Frank Act mandates that the Secretary prescribe regulations requiring financial companies to maintain records with respect to QFCs to assist the FDIC as receiver of a covered financial company in being able to exercise its rights under the Act and to fulfill its obligations under sections 210(c)(8), (9), or (10) of the Dodd-Frank Act. As a result, the economic impact on financial companies, including any impact on small entities, flows directly from the Dodd-Frank Act, and not the Final Rules.

    123See 5 U.S.C. 601(3).

    124See 15 U.S.C. 632(a)(1).

    The RFA requires agencies either to provide an initial regulatory flexibility analysis with a proposed rule or to certify that the proposed rule will not have a significant economic impact on a substantial number of small entities. As described in the Proposed Rules, the Secretary, in accordance with section 3(a) of the RFA, reviewed the Proposed Rules and preliminarily concluded that the Proposed Rules likely would not have a significant economic impact on a substantial number of small entities.125 However, because the Secretary did not have complete data at that time to certify this determination, particularly with regard to affiliated financial companies, an Initial Regulatory Flexibility Analysis was prepared in accordance with 5 U.S.C. 603.

    125See 5 U.S.C. 605(b).

    The Secretary certifies, pursuant to 5 U.S.C. 605(b), that the Final Rules will not have a significant economic impact on a substantial number of small entities under the Small Business Administration's (“SBA”) most recently revised standards for small entities, which went into effect on February 26, 2016. As discussed below, the Secretary has made various changes to reduce the scope and burden of the rules. However, even apart from these considerations, the Final Rules are not expected to have a significant economic effect on any small entities because any entities subject to the rules as “records entities” that would otherwise meet the standards for small entities would be subsidiaries of large corporate groups and therefore would not be “independently owned and operated.”

    In the Initial Regulatory Flexibility Analysis, the Secretary requested comment on whether the Proposed Rules would have a significant economic impact on a substantial number of small entities and whether the costs are the result of the Act itself, and not the Proposed Rules. Specifically, the Secretary requested that commenters quantify the number of small entities, if any, that would be subject to the Proposed Rules, describe the nature of any impact on small entities, and provide empirical and other data to illustrate and support the number of small entities subject to the Proposed Rules and the extent of any impact.

    The Secretary received comments on the Proposed Rules from trade associations, asset managers, insurance companies, clearing organizations, nonprofit organizations, and a private individual. In general, commenters acknowledged the need for the FDIC to have appropriate information in order to exercise its role as a receiver under Title II of the Dodd-Frank Act.126 However, while commenters also requested various modifications to or relief from aspects of the Proposed Rules that they stated would entail burdens that outweighed the benefits to the FDIC, none provided comments, empirical data, or other analyses in response to the Initial Regulatory Flexibility Analysis or in response to the questions posed by the Secretary regarding the economic impact on small entities.127 As discussed in detail in section II above, after carefully considering all of the comments received and consulting with the FDIC, Treasury has adopted these Final Rules.

    126See, e.g., Better Markets letter, p. 1; TCH et al. letter, p. 2; DTCC letter, p. 1-2; CEWG letter, p. 2; SIFMA AMG letter, p. 1.

    127See, e.g., TIAA-CREF letter, p. 1; ACLI letter, p. 9; TCH et al. letter, p. 2. Several commenters also commented on the potential impact of the Proposed Rules on affiliates of a corporate group, though such affiliates were not identified as small entities. See discussion under “Members of Corporate Groups” in section II.A.1.a above.

    The Proposed Rules, rather than requiring all financial companies to maintain records with respect to QFCs, would have applied to a narrower subset of financial companies. Specifically, the Secretary proposed to limit the scope of the Proposed Rules to financial companies that met at least one of the following three criteria: (1) the company was a nonbank financial company subject to a determination by the Council pursuant to section 113 of the Act (12 U.S.C. 5323); (2) the company was a financial market utility designated pursuant to section 804 of the Act (12 U.S.C. 5463) as, or as likely to become, systemically important; or (3) the company had total assets equal to or greater than $50 billion. At the time the Proposed Rules were published, each of the financial companies expected to be subject to the rules under these criteria had revenues in excess of the SBA's revised standards for small entities that went into effect on July 22, 2013. The Proposed Rules would also have applied to these large financial companies' affiliated financial companies if an affiliated financial company otherwise qualified as a “records entity” and was not an “exempt entity” under the Proposed Rules. However, such affiliated financial companies are not independently owned and operated.

    As discussed in section II.A.1 above, the Secretary, in response to comments, determined to make several changes to the definition of “records entity” in the Final Rules in order to substantially reduce the number of entities that will be subject to recordkeeping requirements. Further, as discussed in section II.C.3 above, the Secretary determined to include in the Final Rules a de minimis exemption from the preponderance of the recordkeeping requirements for certain records entities that have a minimal level of QFC activity. These changes have the effect of further reducing the likelihood that the rules would affect a substantial number of small entities. In addition, the definition of “records entity” has been revised in the Final Rules to refer to members of a corporate group that are consolidated under accounting standards, which should reduce the number of entities that would be included as records entities and ensure that records entities that are members of a corporate group are able to coordinate their compliance with the recordkeeping requirements of the rules. The addition in the Final Rules of the requirement that a top-tier financial company of a corporate group that has multiple records entities must be able to generate a single, compiled set of the records for all records entities in the corporate group that it consolidates or are consolidated with it would not affect the number of small entities that are subject to the rule as no such top-tier financial company would be a small entity.

    As discussed above, the Final Rules would only affect large financial companies and certain of their affiliates that meet the definition of a records entity. Previously, the Secretary proposed that the recordkeeping requirements in the Proposed Rules would be applicable to all affiliated financial companies in a large corporate group that meet the definition of “records entity,” regardless of their size, because excluding records entities, including small entities, could significantly impair the FDIC's right to enforce certain QFCs of affiliates of covered financial companies under section 210(c)(16) of the Act. The Secretary has been advised by the FDIC that, based on its experience with Part 371, the FDIC as receiver should be able to exercise its statutory rights and duties under the Dodd-Frank Act relating to QFCs without having access to standardized records for any records entity that is a party to 50 or fewer open QFC positions. Thus the Secretary has determined that a de minimis exemption from maintaining the records described in section 148.4 of the Final Rules, other than the records described in section 148.4(i), is appropriate for records entities that have such a minimal level of QFC activity. This change has the effect of further reducing the likelihood that the Final Rules would affect a substantial number of small entities. Although it is unlikely that any small entities would be affected because affiliated members generally do not meet the definition of “small entity,” this revision will minimize the burden faced by affiliated members of a corporate group.

    Based on current information and discussions with staff of several of the PFRAs who are familiar with financial company operations and have experience supervising financial companies with QFC portfolios, the Secretary believes that the large corporate groups that would be subject to the Final Rules would likely comply with the rules by utilizing a centralized recordkeeping system, whether by adapting an existing system or establishing a new system, that would obviate the need for each member of such corporate group, including small entity members of the corporate group, to maintain its own recordkeeping system in order to comply with the rules. This is expected to have the effect of substantially reducing the burden of compliance with the rules on particular small entity members, if any, of a corporate group subject to the rules. The Secretary requested information and comment in the Initial Regulatory Flexibility Analysis on the role of entities responsible for the centralized recordkeeping systems and whether such entities are small entities to which the Proposed Rules would apply. While several commenters addressed the impact of the Proposed Rules in general on information recordkeeping systems,128 none specifically addressed the role of entities responsible for such systems and whether any such entities are small entities.

    128See DTCC letter, p. 10; OCC letter, p. 12; TCH et al. letter, pp. 22-23; TIAA letter, p. 2.

    As discussed in more detail above, the Final Rules impose certain recordkeeping requirements on records entities. A records entity is required to maintain all records described in section 148.4 of the Final Rules, be able to generate data in the format set forth in the appendix to the Final Rules, and be capable of transmitting those records electronically to the records entity's PFRA and the FDIC. The Final Rules include recordkeeping requirements with respect to position-level data, counterparty-level data, legal documentation data, collateral detail data, corporate organization data, and a list of vendors directly supporting QFC-related activities of the records entity and the vendors' contact information.

    As discussed in the Initial Regulatory Flexibility Analysis, based on discussions with several of the PFRAs that are familiar with financial company operations and have experience supervising financial companies with QFC portfolios, the Secretary believes that records entities are already maintaining, as part of their ordinary course of business, most of the QFC information required to be maintained under the Final Rules, which minimizes the potential economic impact.129 However, the Secretary acknowledges that the Final Rules' form and availability requirements may impose additional costs and burdens on records entities.

    129 Registered derivatives clearing organizations and clearing agencies, given the nature of their business, do not currently maintain much of the required records and have been provided a conditional exemption under the Final Rules for the reasons discussed under “Clearing Organizations” in section II.A.1.a above.

    The Secretary recognizes that there may be particular types of QFCs or counterparties for which more limited information may be sufficient to enable the FDIC to exercise its rights under the Act and fulfill its obligations under sections 210(c)(8), (9), or (10) of the Act. The Final Rules provide the Secretary with the discretion to grant conditional or unconditional exemptions from one or more of the requirements of the Final Rules, which could include exemptions from the recordkeeping requirements regarding particular types of QFCs or counterparties. In addition, section 148.1(d)(3) of the Final Rules provides the Secretary with the authority to grant extensions of time for compliance purposes.

    The Secretary requested in the Initial Regulatory Flexibility Analysis information and comment on any costs, compliance requirements, or changes in operating procedures arising from the application of the Proposed Rules to small entities.130 Most commenters offered general comments on the costs of compliance requirements and changes in operating procedures.131 These comments have been addressed by the Secretary in section II, above. However, none of these commenters quantified the costs of compliance by small entities or otherwise provided empirical data regarding the costs of compliance by small entities.132 Moreover, the Secretary received no comments on its discussion of the impact on small entities in the Initial Regulatory Flexibility Analysis. In light of the foregoing and the considerations discussed above, the Secretary certifies that the Final Rules will not have a significant economic effect on a substantial number of small entities.

    130See 80 FR 966, 986.

    131See, e.g., ACLI letter, pp. 17-19; SIFMA AMG letter, pp. 11-14.

    132 One commenter stated that the Secretary's estimate of the cost of initial compliance for most financial groups subject to the rules will, on an individual basis, far exceed the Secretary's estimation of the total industry-wide compliance cost included in the Secretary's Paperwork Reduction Act analysis of the Proposed Rules; however, the commenter did not otherwise offer an estimate of compliance costs or estimate the costs of compliance by small entities specifically. See TCH et al. letter, pp. 3-4.

    B. Paperwork Reduction Act

    Certain provisions of the Final Rules contain “collection of information requirements” within the meaning of the Paperwork Reduction Act of 1995 (“PRA”). An agency may not conduct or sponsor, and a person is not required to respond to, a collection of information unless it displays a currently valid control number. The collection of information requirements in the Final Rules have been submitted by the Secretary to the Office of Management and Budget (“OMB”) for review in accordance with the PRA, 44 U.S.C. 3507(d). The title of this collection is “Qualified Financial Contracts Recordkeeping Related to Orderly Liquidation Authority.” The collection of information has been assigned OMB Control No. 1505-0256.

    Previously, the Secretary requested comments on the collection of information burdens associated with the Proposed Rules. Specifically, the Secretary asked for comment concerning:

    (1) Whether the proposed information collection is necessary for the proper performance of agency functions, including whether the information will have practical utility;

    (2) The accuracy of the estimated burden associated with the proposed collection of information, including the validity of the methodology and assumptions used;

    (3) How to enhance the quality, utility, and clarity of the information required to be maintained;

    (4) How to minimize the burden of complying with the proposed information collection, including the application of automated collection techniques or other forms of information technology;

    (5) Estimates of capital or start-up costs and costs of operation, maintenance, and purchase of services to maintain the information; and

    (6) Estimates of (i) the number of financial companies subject to the Proposed Rules, (ii) the number of records entities that are parties to an open QFC or that guarantee, support, or are linked to an open QFC, and (iii) the number of affiliated financial companies that are parties to an open QFC or that guarantee, support, or are linked to an open QFC of an affiliate.

    Commenters on the Proposed Rules generally acknowledged the need for the FDIC to have appropriate information in order to exercise its role as a receiver under Title II of the Act. Commenters also requested various modifications to or relief from aspects of the Proposed Rules that they stated would entail burdens that outweighed the benefits to the FDIC. This included recommendations that the records required to be maintained under the Proposed Rules be tailored more narrowly to require only data that is critical to the FDIC's QFC transfer determinations under section 210 of the Act. Several commenters also remarked generally that the Proposed Rules would entail significant information technology and systems development challenges.133 However, none of the commenters provided comments, empirical data, estimates of costs or benefits, or other analyses directly addressing matters pertaining to the PRA discussion.

    133See DTCC letter, pp. 3, 8-11; OCC letter, p. 12; TCH et al. letter, pp. 19, 22; TIAA-CREF letter, p. 2.

    The collection of information is required by section 210(c)(8)(H) of the Act, which mandates that the Secretary prescribe regulations requiring financial companies to maintain records with respect to QFCs to assist the FDIC as receiver for a covered financial company in being able to exercise its rights under the Act and fulfill its obligations under sections 210(c)(8), (9), or (10) of the Act. The Final Rules implement these requirements by requiring that a records entity maintain records with respect to, among other things, position-level data, counterparty data, legal agreement data (including copies of agreements governing QFC transactions and open confirmations), collateral detail data, corporate organization information, and a list of vendors directly supporting QFC-related activities of the records entity and the vendors' contact information. The Final Rules require that a records entity be capable of providing QFC records to its PFRA and the FDIC within 24 hours of the request of such PFRA. For corporate groups that have multiple records entities, the top-tier financial company of the corporate group must be able to generate a single, compiled set of the records specified in the Final Rules for all records entities in the corporate group that it consolidates or are consolidated with it and provide such set of records to its PFRA and the FDIC within 24 hours of the request of such PFRA and in a format that allows for aggregation and disaggregation of such data by records entity and counterparty.

    The Final Rules also provide that a records entity may request in writing an extension of time with respect to the compliance dates associated with the recordkeeping requirements. The Final Rules further provide that one or more records entities may request in writing an exemption from one or more of the recordkeeping requirements. Finally, the Final Rules provide a de minimis exemption from maintaining the records described in section 148.4 of the Final Rules, other than the records described in section 148.4(i), for a records entity that is a party to 50 or fewer open QFC positions.

    Respondents

    In the PRA discussion in the Proposed Rules, the Secretary estimated that approximately 140 large corporate groups, and each of their respective affiliated financial companies that is a party to an open QFC or guarantees, supports, or is linked to an open QFC of an affiliate and is not an “exempt entity,” would meet the proposed definition of “records entity.” The estimate of 140 large corporate groups includes the four nonbank financial companies subject to a determination by the Council under section 113 of the Dodd-Frank Act and the eight financial market utilities designated by the Council under section 804 of the Dodd-Frank Act as systemically important. The Proposed Rules also included within the definition of “records entity” financial companies with assets greater than or equal to $50 billion. The Federal Financial Institutions Examination Council (“FFIEC”) maintains on its public Web site a list of bank holding companies with total assets of greater than $10 billion, which was used to identify bank holding companies with assets greater than or equal to $50 billion. For corporate groups that are not bank holding companies, the Secretary used SNL Financial, a private vendor that provides a subscription-access database that aggregates publicly available financial information on insurance, securities and investment, specialty finance, and financial technology companies, as well as financial statements filed with the SEC and, for broker-dealers, with the Financial Industry Regulatory Authority, to identify corporate groups with assets greater than or equal to $50 billion as of December 31, 2013. By reference to these sources, as well as conversations with the PFRAs, 128 additional corporate groups were estimated to be subject to the rules.

    For purposes of the PRA discussion in the Proposed Rules, the Secretary estimated that each large corporate group was comprised of approximately 168 affiliates, resulting in an estimate of 23,325 affiliated financial companies. As noted above, commenters generally did not provide comments, empirical data, or other analyses directly addressing the Secretary's estimates in the PRA discussion. As discussed in detail in section II above, the Final Rules, as adopted, incorporate several changes to the Proposed Rules, including the addition to the definition of “records entity” of criteria based on the level of a financial company's derivatives activity, the exclusion of insurance companies, a conditional exemption for derivatives clearing organizations, and the inclusion of a de minimis exemption. Taken together, these changes substantially reduce the scope of financial companies subject to the recordkeeping requirements of the Final Rules.

    The Secretary estimates that approximately 30 large corporate groups, and each of their respective affiliated financial companies that is a party to an open QFC and is not an “excluded entity,” will meet the definition of “records entity” in section 148.2(n) upon the effective date of the Final Rules, compared to the estimate in the Proposed Rules of 140 large corporate groups. The Secretary estimates that collectively these 30 corporate groups had approximately $15 trillion in total assets, compared to an estimated $25 trillion in total assets of the 140 corporate groups that were expected to meet the definition of “records entity” in the Proposed Rules. These estimates were based on the publicly disclosed financial statements of such corporate groups as of December 31, 2015 and December 31, 2013, respectively.

    The estimate of 30 large corporate groups was calculated as follows. There are three categories of financial companies that are included within the definition of “records entity” in the Final Rules without regard to whether they meet the asset or derivatives thresholds. The estimate includes the eight U.S. top-tier bank holding companies currently identified as G-SIBs. Likewise, the estimate includes the two nonbank financial companies currently subject to a determination by the Council under section 113 of the Dodd-Frank Act. The estimate also includes the eight financial market utilities currently designated by the Council under section 804 of the Dodd-Frank Act as systemically important; six of these entities are registered clearing agencies or derivatives clearing organizations, for which a conditional exemption has been provided under the Final Rules, though their affiliates may be subject to the recordkeeping requirements if they are party to open QFCs.

    The estimate also includes large corporate groups that would be subject to the rules by virtue of the amount of their total consolidated assets and level of derivatives activity. For bank holding companies, the FFIEC-maintained list, referenced above, of bank holding companies with total assets of greater than $10 billion was used to identify bank holding companies with assets greater than or equal to $50 billion. The amount of total gross notional derivatives outstanding and the amount of derivatives liabilities of these bank holding companies were obtained by reference to the consolidated financial statements filed with the Federal Reserve by such bank holding companies on the Federal Reserve's Form FR Y-9C, which are publicly available on the Federal Reserve's Web site. For corporate groups that are not bank holding companies, the Secretary used the SNL Financial database referenced above, as well as financial statements filed with the SEC and, for broker-dealers, with the Financial Industry Regulatory Authority, to identify corporate groups having total assets greater than or equal to $50 billion and having either greater than or equal to $3.5 billion in derivatives liabilities or greater than or equal to $250 billion in total gross notional derivatives outstanding as of December 31, 2015. By reference to these sources, as well as conversations with the PFRAs, twelve additional corporate groups were estimated to be subject to the rules. While the number of corporate groups having total assets greater than or equal to $50 billion was similar to that estimated at the time of the issuance of the Proposed Rules, the addition to the definition of “records entity” of criteria based on the level of a financial company's derivatives activity and the exclusion of insurance companies significantly reduced the number of corporate groups estimated to be subject to the rules.

    The following table summarizes the calculation of the estimates of the number and aggregate size of large corporate groups subject to the Proposed Rules and the Final Rules.

    Large Corporate Groups Subject to the Rules
    (number of corporate groups: Proposed rules / Final rules)

    Subject to a determination that the company shall be subject to Federal Reserve supervision and enhanced prudential standards pursuant to 12 U.S.C. 5323: 4 / 2
    Subject to a designation as, or as likely to become, systemically important pursuant to 12 U.S.C. 5463: 8 / 8
    Identified as a global systemically important bank holding company pursuant to 12 CFR Part 217: N/A / 8
    Corporate group (excluding the above) that has, on a consolidated basis, greater than $50 billion in total assets:* 128 / N/A
    Corporate group (excluding the above) that has, on a consolidated basis, (1) greater than $50 billion in total assets and (2)(i) total gross notional derivatives outstanding equal to or greater than $250 billion or (ii) derivative liabilities equal to or greater than $3.5 billion:* N/A / 12
    Total corporate groups: 140 / 30
    Aggregate total assets:* **$25 / **$15

    * Based on data obtained from the FFIEC public Web site; SNL Financial, a private vendor that provides a subscription-access database that aggregates publicly available financial information on insurance, securities and investment, specialty finance, and financial technology companies; financial statements filed with the SEC, the Financial Industry Regulatory Authority, and the Federal Reserve; and conversations with the PFRAs.
    ** Trillion.

    The Final Rules would also apply to these large corporate groups' affiliated financial companies (regardless of their size) if an affiliated financial company otherwise qualifies as a “records entity,” and is not an “excluded entity.” In addition, as referenced above, the Final Rules will also require the top-tier financial company of the corporate group to be capable of generating a single, compiled set of the records specified in the Final Rules for all records entities in the corporate group that it consolidates or are consolidated with it and to be capable of providing such a set of records to its PFRA and the FDIC.

    The Secretary estimates that the large corporate groups that will be subject to the rules collectively have 5,010 affiliated financial companies that may qualify as records entities. The Secretary recognizes that, based on a number of factors, the actual total number of respondents may differ significantly from this estimate. One such factor is that there is no information available to determine how many of the affiliated financial companies of a large corporate group are a party to an open QFC and thus would qualify as records entities. At the same time, the inclusion and availability of the de minimis exemption in the Final Rules will have the effect of reducing the number of affiliated financial companies in many corporate groups subject to the recordkeeping requirements. Finally, as previously noted, commenters did not provide the requested comments, empirical data, or other analyses directly addressing the Secretary's estimates of the total number of respondents for purposes of the PRA discussion. For the foregoing reasons, the Secretary has concluded it is reasonable to maintain the estimate of affiliates per corporate group used in the PRA discussion in the Proposed Rules and therefore to assume that a total of 5,010 affiliated financial companies would qualify as records entities.

    The Secretary's recordkeeping, reporting, data retention, and records generation burden estimates are based on discussions with the PFRAs regarding their prior experience with initial burden estimates for other recordkeeping systems. The Secretary also considered the burden estimates in rulemakings with similar recordkeeping and reporting requirements.134 As noted above, some commenters stated that certain aspects of the Proposed Rules entailed burdens that outweighed the benefits to the FDIC. Several commenters also provided general comments that the recordkeeping requirements of the Proposed Rules would involve significant information technology and systems development challenges. In general, commenters did not directly address the Secretary's estimates and analysis in the PRA discussion. Nevertheless, the Secretary has taken all comments into consideration and made certain modifications and adjustments to this PRA discussion in the Final Rules to reflect those comments. As discussed in section II above, the Final Rules incorporate numerous changes in response to commenters' concerns, and this PRA discussion reflects those changes.

    134See 80 FR 14563 (Mar. 19, 2015); 77 FR 2136 (Jan. 13, 2012); 76 FR 46960 (Aug. 3, 2011); 76 FR 43851 (July 22, 2011); 73 FR 78162 (Dec. 22, 2008).

    In order to comply with the Final Rules, each of the large corporate group respondents will need to set up its network infrastructure to collect data in the required format. This will likely impose a one-time initial burden on the large corporate group respondents in connection with the necessary updates to their recordkeeping systems, such as systems development or modifications. This initial burden is mitigated to some extent because QFC data is likely already retained in some form by each large corporate group respondent in the ordinary course of business, but large corporate group respondents may need to amend internal procedures, reprogram systems, reconfigure data tables, and implement compliance processes. Moreover, they may need to standardize the data and create records tables to match the format required by the Final Rules. In recognition of this, as discussed in section II.A.3 above, the Final Rules provide for staggered compliance dates that will provide all records entities with additional time to comply with the recordkeeping requirements. Under the Final Rules, all but the very largest institutions will have at least two years to comply with the rules' requirements.135

    135 All records entities and top-tier financial companies will be required to provide point of contact information to their PFRAs and the FDIC on the effective date of the rules.

    As discussed above, the Final Rules also apply to affiliated financial companies of the large corporate group respondents. The Final Rules will likely impose a one-time initial burden on the affiliated financial companies in connection with necessary updates to their recordkeeping systems, such as systems development or modifications. These burdens will vary widely among affiliated financial companies. As noted herein and as discussed in section II.C.3 above, the Final Rules provide a de minimis exemption from the recordkeeping and reporting requirements for certain records entities that have a minimal level of QFC activity, which the Secretary believes will significantly reduce the number of affiliated financial companies subject to the recordkeeping and reporting requirements of the Final Rules.

    The Secretary believes that the large corporate groups subject to the Final Rules are likely to rely on centralized systems to comply with most of the recordkeeping requirements, as set forth herein, for the QFC activities of all affiliated members of the corporate group. The entity responsible for each large corporate group's centralized system will likely operate and maintain a technology shared services model with the majority of the technology applications, systems, and data shared by the multiple affiliated financial companies within the corporate group. Therefore, the majority of the recordkeeping burden stemming from the Final Rules will be borne by the entity responsible for each large corporate group's centralized systems, while relatively little initial and ongoing recordkeeping burden will be imposed on their affiliated financial companies. The affiliated financial companies will likely have a much lower burden because they can utilize the technology and network infrastructure operated and maintained by the entity responsible for the centralized system at their respective large corporate group. Similarly, the Secretary believes that the affiliated financial companies will rely on the entities responsible for the centralized systems to perform the requirements under section 148.3(a)(1)(ii).

    Similarly, the Secretary believes that affiliated financial companies will rely on large corporate group respondents to submit any requests for extensions of time under section 148.1(d)(3) or requests for exemption from one or more requirements of the Final Rules under section 148.3(c)(3).

    Estimated Paperwork Burden

    Recordkeeping

    Estimated number of respondents

    Estimated number of large corporate groups: 30.

    Estimated number of affiliated financial companies: 5,010.

    Total estimated initial recordkeeping burden

    Estimated average initial burden hours per respondent: 7,200 hours for large corporate groups, 0.5 hours for affiliated financial companies.

    Estimated frequency: One-time, spread over applicable compliance period.

    Estimated total initial recordkeeping burden: 216,000 hours for large corporate groups and 2,505 hours for affiliated financial companies.

    Total estimated annual recordkeeping burden

    Estimated average annual burden hours per respondent: 240 hours for large corporate groups, 0.5 hours for affiliated financial companies.

    Estimated frequency: Annually.

    Estimated total annual recordkeeping burden: 7,200 hours per year for large corporate groups and 2,505 hours per year for affiliated financial companies.

    The initial and annual recordkeeping burden is imposed by the Dodd-Frank Act, which requires that the Secretary prescribe regulations requiring financial companies to maintain records with respect to QFCs to assist the FDIC as receiver of a covered financial company in being able to exercise its rights under the Act and fulfill its obligations under sections 210(c)(8), (9), or (10) of the Act.

    Reporting

    Estimated number of respondents: 30.

    Total estimated annual reporting burden

    Estimated average annual burden hours per respondent: 50 hours.

    Estimated frequency: Annually.

    Estimated total annual reporting burden: 1,500 hours per year.
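
    The hour totals above follow arithmetically from the respondent counts and the per-respondent estimates; the short check below (illustrative only) reproduces them.

        # Illustrative check of the burden-hour totals stated above.
        large_corporate_groups = 30
        affiliated_financial_companies = 5_010

        initial_group_hours = large_corporate_groups * 7_200            # 216,000 hours (one-time)
        initial_affiliate_hours = affiliated_financial_companies * 0.5  # 2,505 hours (one-time)

        annual_group_hours = large_corporate_groups * 240               # 7,200 hours per year
        annual_affiliate_hours = affiliated_financial_companies * 0.5   # 2,505 hours per year
        annual_reporting_hours = large_corporate_groups * 50            # 1,500 hours per year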

    As discussed in more detail in section III.C.6.a below, the Secretary estimates the potential total costs of the initial recordkeeping burden associated with the Final Rules, including the burden hours estimated above plus estimated technology and systems development and modification costs, to be $36,631,995. The potential total costs of annual recordkeeping and reporting burdens associated with the Final Rules, including the burden hours estimated above, are estimated to be $1,248,795.136

    136 All cost and wage estimates are in nominal dollars and have not been adjusted for inflation.

    C. Executive Orders 12866 and 13563

    It has been determined that the Final Rules are a significant regulation as defined in section 3(f)(1) of Executive Order 12866, as amended. Accordingly, the Final Rules have been reviewed by OMB. The Regulatory Assessment prepared by the Secretary for the Final Rules is provided below.

    1. Description of the Need for the Regulatory Action

    The rulemaking is required by the Dodd-Frank Act to implement the QFC recordkeeping requirements of section 210(c)(8)(H) of the Act. Section 210(c)(8)(H) generally provides that if the PFRAs do not prescribe joint final or interim final regulations requiring financial companies to maintain records with respect to QFCs within 24 months from the date of enactment of the Act, the Chairperson of the Council shall prescribe such regulations in consultation with the FDIC. The Secretary, as Chairperson of the Council, is adopting the Final Rules in consultation with the FDIC because the PFRAs did not prescribe such joint final or interim final regulations. The recordkeeping required in the Final Rules is necessary and appropriate to assist the FDIC as receiver to exercise its rights and fulfill its obligations under sections 210(c)(8), (9), and (10) of the Dodd-Frank Act, by enabling it to assess the consequences of decisions to transfer, disaffirm or repudiate, or allow the termination of QFCs with one or more counterparties.

    The recent financial crisis demonstrated that the management of QFC positions, including the steps undertaken to close out such positions, can be an important element of a resolution strategy and, if not handled properly, may magnify market instability. Large, interconnected financial companies may hold very large positions in QFCs involving numerous counterparties. A disorderly unwinding of these QFCs, including the mass exercise of QFC default rights and the rapid liquidation of collateral, could cause severe negative consequences not only for the counterparties themselves but also for U.S. financial stability. A disorderly unwind could result in rapid liquidations, or “fire sales,” of large volumes of financial assets, such as the collateral that secures the contracts, which can in turn weaken and cause stress for other firms by lowering the value of similar assets that they hold or have pledged as collateral to other counterparties.

    In order for the FDIC to effectuate an orderly liquidation of a covered financial company under Title II, the FDIC would need to make appropriate decisions regarding whether to transfer QFCs to a bridge financial company or other solvent financial institution or leave QFCs of the covered financial company in receivership. Determining whether to transfer QFCs in a manner that complies with the requirements of Title II and ensuring continued performance on any QFCs transferred requires detailed and standardized records. It would not be possible for the FDIC to fully analyze a large amount of QFC information in the short time frame afforded by Title II unless such information is readily available to the FDIC in a standardized format designed to enable the FDIC to conduct the analysis in an expeditious manner.

    As referenced in section I above, Title II requires the FDIC as receiver to exercise its authorities, to the greatest extent practicable, in a manner that maximizes value, minimizes losses, and mitigates the potential for serious adverse effects to the financial system. Title II also requires that the aggregate amount of liabilities of a covered financial company that are transferred to a bridge financial company from a covered financial company not exceed the aggregate amount of the assets of the covered financial company that are transferred to the bridge financial company from the covered financial company. If it does not have the records required by the rules, the FDIC may be unable to assess the financial position associated with certain QFCs and thus may not be able to determine how the transfers would affect the financial viability of a bridge financial company or other transferee institution, how the transfers would affect financial stability, whether the transfers would serve to maximize value and minimize losses in the disposition of assets of the receivership, and whether the transfers would cause the amount of aggregate transferred liabilities of the bridge financial company to exceed the amount of aggregate transferred assets.

    Furthermore, as discussed in sections I and II above, if the FDIC as receiver decides to transfer any QFC with a particular counterparty, Title II requires that it must transfer all QFCs between the covered financial company and such counterparty and any affiliate of such counterparty to a single financial institution, and if the FDIC as receiver decides to disaffirm or repudiate any QFC with a particular counterparty, it must disaffirm or repudiate all QFCs between the covered financial company and such counterparty and any affiliate of such counterparty. If the FDIC were to lack information about the affiliates of the counterparties to the QFCs of the covered financial company, it might not be able to transfer the QFCs given its uncertainty as to whether such a transfer would violate this requirement.

    The FDIC's inability to effect the transfer of QFCs for any of the above reasons could have significant adverse effects on financial stability in circumstances in which transferring such QFCs may have prevented the unnecessary termination of QFCs and fire sales of collateral securing these QFCs. Even after a transfer decision is made, the records required by the rule are necessary to ensure that the bridge financial company and its subsidiaries continue to perform their obligations on any QFCs that are transferred. The inadvertent failure to perform their obligations under the QFCs, including meeting margin requirements and other obligations, could result in counterparties terminating QFCs, asset fire sales, and the failure of the bridge financial company.

    2. Literature Review

    In assessing the need for these recordkeeping requirements, we have reviewed two categories of academic literature. As highlighted above, one of the potential channels through which the disorderly unwinding of QFCs could cause severe negative consequences for both the counterparties themselves and U.S. financial stability is the rapid liquidation of collateral. The disorderly failure of a financial company with a large QFC portfolio may lead QFC counterparties to exercise their contractual remedies and rights by closing out positions and liquidating collateral, while also potentially increasing uncertainty in both derivatives and asset markets. This could lead to lower asset prices, decrease the availability of funding, and increase the likelihood that other financial companies also are forced to liquidate assets. First, to assess the potential impact of rapid liquidations, we have reviewed economic studies of fire sales among financial companies. Second, while there is limited academic literature specifically focused on the cost of a disorderly unwinding of a large, complex financial company's QFC portfolio, there has been recent literature analyzing the cost of the Lehman Brothers bankruptcy in 2008, which may be illustrative of the potential costs.137

    137 Lehman Brothers Holdings, Inc. (“Lehman Brothers”), Lehman Brothers Inc. (the U.S. registered broker-dealer), and Lehman Brothers International (Europe) (the UK registered broker-dealer) were subject to separate liquidation proceedings.

    a. Fire Sales Among Financial Institutions

    The economic literature on financial company fire sales offers insight into their potential internal and external impacts. While not directly addressing QFCs, the fire sale literature can be applied to the potential impact of the rapid liquidation of QFC collateral that might occur in a disorderly unwinding of a large QFC portfolio. As noted above, the recordkeeping required by the Final Rules is necessary to assist the FDIC in being able to make decisions regarding whether to transfer QFCs of a covered financial company to a bridge financial company or other solvent financial institution or to retain the QFCs in the covered financial company in receivership. Transferring QFCs, if appropriate, may prevent the mass exercise of QFC default rights and a corresponding fire sale of assets held as collateral for those QFCs.

    Principles of Fire Sales among Financial Companies. According to the literature, a fire sale can occur when a company cannot pay its creditors without selling assets. During a fire sale, assets sold may be heavily discounted below their fundamental values, depending on the market of participating buyers. If buyers are other investors in the asset class or classes being sold (“specialists”), prices may decline little. However, if the fire sale occurs during a financial crisis when uncertainty is higher and many specialists, including financial companies, may be constrained by solvency or liquidity pressures, they may not participate in the other side of the market. As a result, prices may fall substantially, to a level at which buyers who would only buy the assets in question at a large discount enter the market. Low sale prices may cause other financial companies to reduce the value at which they hold similar assets on their books when marking to market, which may trigger a downward spiral marked by more firms in distress (Shleifer and Vishny, 2011).138 In addition, because many financial companies rely upon short-term sources of financing, such as repurchase agreements, the falling asset prices and heightened uncertainty may contribute to liquidity pressures as these financing sources withdraw funding or demand more collateral. This may force even solvent financial companies to sell assets in order to deleverage, decrease the size of their balance sheets, and reduce risk. This self-reinforcing cycle can result in additional fire sales, and eventually, precipitate or magnify a financial crisis.

    138 Shleifer, A, and Vishny, R. (2011). Fire Sales in Finance and Macroeconomics. Journal of Economic Perspectives 25: 29-48.

    Shleifer and Vishny (2011) believe that before the September 2008 Lehman Brothers bankruptcy many specialist buyers, including most financial companies, were active in the market, but after the Lehman bankruptcy most of them were unwilling to buy assets, causing security prices to plunge, and prompting fund withdrawals, collateral calls, and self-reinforcing fire sales. This cycle of price collapses and deleveraging increased the fragility of the financial system, and disrupted financial intermediation.

    At the time of a fire sale both seller and non-seller financial companies may curtail their lending, thereby imposing additional social costs associated with reduced financial intermediation. Shleifer and Vishny (2010) 139 use a three-period model of bank lending to illustrate the dynamics. They show that, in normal times, securitization can lead to higher lending volumes and earnings, but market sentiment shocks can quickly reverse these outcomes. When banks are highly leveraged, they may be more vulnerable to unanticipated shocks. A severe shock can lead them to liquidate assets in fire sales, fostering industry-wide asset price declines and weakening the banking system. In that environment, banks may forego lending, both to meet capital requirements and to preserve the capacity to purchase deeply discounted assets in the future. This credit contraction may reduce economic welfare due to a large number of potentially profitable investments that do not receive financing. He et al. (2010) 140 and Ivashina and Scharfstein (2010) 141 offer evidence that financial companies used spare balance sheet capacity to purchase discounted securities after the financial crisis rather than to increase lending. Hence, foregone lending during a crisis is a potential social cost.

    139 Shleifer, A. and Vishny, R. (2010). Asset Fire Sales and Credit Easing. National Bureau of Economic Research working paper 15652.

    140 He, Z., Khang, I.G., and Krishnamurthy, A. (2010). Balance Sheet Adjustments During the 2008 Crisis. IMF Economic Review 58: 118-156.

    141 Ivashina, V. and Scharfstein, D. (2010). Bank Lending During the Financial Crisis of 2008. Journal of Financial Economics 97: 319-338.

    Empirical Estimates of the Economic Effects of Fire Sales. The literature provides empirical estimates of the economic effects of asset fire sales. Research suggests both the potential direct price discount effect and the indirect spillover effects of fire sales are economically substantial. Although this body of work does not necessarily target financial companies, it provides broadly applicable insights.

    Coval and Stafford (2007) 142 compare stock transactions by mutual funds under normal conditions and fire sale conditions from 1980-2004. The study regards high volumes of concurrent capital outflows from mutual funds as creating stock fire sale conditions when they force several funds to sell substantial amounts of underlying stock (the same stocks may be sold by multiple investment funds that are experiencing similar stresses). It finds a negative 7.9 percent average abnormal stock return in the two quarters preceding and including the distressed selling of a stock by mutual funds. This stock price dip tends to rebound after the high sales volumes dissipate, which the authors point out is consistent with fire sale dynamics, as liquidity providers earn abnormal positive returns after a crisis period and stock prices revert to reflect their fundamental values.

    142 Coval, J. and Stafford, E. (2007). “Asset Fire Sales (and Purchases) in Equity Markets.” Journal of Financial Economics 86: 479-512.

    Dinc, Erel, and Liao (2015) 143 find industry-adjusted distressed asset sale discounts of 8 to 9 percent when a firm buys equity shares of target firms in distressed industries over the 2000-12 period. The model controls for target firm size, liquidity, leverage, and profitability, and the results are robust to alternative definitions of distressed firms, analytic periods, and industry classifications. The authors consider the estimated discounts to be a lower bound for fire sale discounts on assets less liquid than equities, such as real assets or debt securities, which may be more difficult to sell during periods of distress.

    143 Dinc, S., Erel I., and Liao, R. (2015). “Fire Sale Discount: Evidence from the Sale of Minority Equity Stakes.” Ohio State University Fisher College of Business working paper 2015-03-11.

    While ample research documents the costs of fire sales to distressed firms selling assets, little analytic emphasis has been placed on the effect of fire sales on asset buyers. A recent study by Meier and Servaes (2015) 144 examines the direct effects of fire sale purchases on the stock returns of the acquiring firms. Using data for 1982-2012, their model finds abnormal stock price increases of roughly 2 percent among firms buying assets or entire companies under fire sale conditions, compared to purchasing during normal economic conditions.145 The result is robust to model specifications with alternative control variables, and buyer returns are inversely associated with the level of liquidity in the market and the potential for alternative uses for the assets. The authors conclude that when the gains to firms buying assets during fire sales are included in the estimates, the welfare costs of fire sales may be lower than previously expected. However, the study does not consider the negative spillover effects of fire sales that may infect other firms in the seller's industry, and is not intended to be a full welfare analysis.

    144 Meier, J.A. and Servaes, H. (2015). “The Bright Side of Fire Sales.” London Business School working paper.

    145 The model uses an event study approach to study a three-day period starting one day before the transaction announcement.

    In contrast to studies of the direct discounts or stock returns associated with asset transactions during fire sales, Duarte and Eisenbach (2015) 146 assess the indirect spillover costs of fire sales. They develop a model to assess vulnerability to fire sale spillovers, and find substantial negative economic effects. Based on several assumptions developed by the authors, the model estimates that from July 2008 to March 2014, an exogenous 1 percent decline in the price of assets financed with repos leads to average fire sale losses of 8 percent of total equity capital in the broker-dealer sector. The authors conclude that asset fire sale spillovers are an important part of overall risk to the financial system.

    146 Duarte, F. and Eisenbach, T.M. (2015). “Fire Sale Spillovers and Systemic Risk.” Federal Reserve Bank of New York Staff Reports, No. 645.

    Potential Effects on Lending. As predicted by the theoretical models discussed above, empirical research shows bank lending declined sharply during the crisis. Ivashina and Scharfstein (2010) show that from August through December 2008, banks that depended more heavily on short-term debt (other than insured deposits) reduced their business lending by significantly more than banks less dependent on short-term debt financing. The paper identifies two channels, operating at the time of the Lehman bankruptcy, that drove this result and collectively constituted a “run” on financial companies. First, short-term creditors refused to roll over their unsecured commercial paper loans and repo lenders increased collateral requirements, which particularly constrained financial companies dependent on short-term credit for a significant share of their financing. Second, borrowers substantially increased draws on their existing credit lines “to enhance their liquidity and financial flexibility during the credit crisis.” In particular, financial companies that co-syndicated credit lines with Lehman Brothers were more likely to experience larger credit line drawdowns after the Lehman failure, and they reduced their new lending more than those without co-syndication relationships with Lehman. Ivashina and Scharfstein conclude the results are consistent with a decline in the supply of funding as a result of the run associated with the Lehman event.

    On the borrower side, Campello et al. (2010) 147 surveyed the chief financial officers of 1,050 nonfinancial firms in the United States, Europe, and Asia and found that those that identified their firms as “financially constrained” 148 during the financial crisis cut back more on capital and technology investments compared to those that identified their firms as “financially unconstrained.” They also cut marketing expenditures by significantly greater margins, and shed far more employees (financially constrained firms planned to cut 10.9 percent of their personnel in 2009, while financially unconstrained firms planned to shed 2.7 percent). The survey revealed that during the crisis, 86 percent of constrained firms reported foregoing attractive investments, compared to 44 percent of unconstrained firms. This suggests the crisis-related decline in bank credit supply directly contributed to the reduction in constrained firms' investments, and imposed associated economic effects.

    147 Campello, M., Graham, J., and Harvey, C. (2010). The Real Effects of Financial Constraints: Evidence from a Financial Crisis. Journal of Financial Economics 97: 470-487.

    148 Derived from survey respondents' self-assessments of their financial condition.

    b. Costs of Lehman Brothers Bankruptcy

    Numerous researchers have provided broad estimates of the economic costs of the 2007-09 financial crisis (see GAO (2013) 149 for a useful review). This section focuses more narrowly on the terminations of derivative contracts associated with the Lehman bankruptcy to help illustrate the potential costs of unwinding the derivatives portfolio of a large, complex financial company. While this particular example occurred under the U.S. Bankruptcy Code rather than as a Title II orderly liquidation, the disorderly unwind and disruptions that resulted are indicative of the potential negative consequences that could result from a situation in which the FDIC as receiver in a Title II resolution is unable to make informed decisions as to whether to transfer a QFC because it does not have adequate records.

    149 Government Accountability Office, Financial Regulatory Reform: Financial Crisis Losses and Potential Impacts of the Dodd-Frank Act, GAO-13-180 (January 16, 2013).

    The net worth of Lehman Brothers derivative positions at the time of bankruptcy on September 15, 2008 totaled $21 billion, with 96 percent representing over-the-counter (OTC) positions.150 The portfolio consisted of more than 6,000 OTC derivative contracts involving over 900,000 transactions. Fleming and Sarkar's (2014) 151 detailed assessment of the Lehman Brothers bankruptcy finds the overall recovery rate of all allowed unsecured claims (not limited to QFCs) amounted to roughly 28 percent, a rate the authors describe as low relative to both an estimated 59 percent for other financial company failures and 40 percent for failures occurring in recessions.

    150 Most derivatives were held in several subsidiaries specializing in derivatives and related instruments. Since Lehman had numerous subsidiaries with intermingled interests, we simplify the discussion by describing them as if they were a single entity, except when specificity is necessary for descriptive accuracy.

    151 Fleming, M. and Sarkar, A. (2014). The Failure Resolution of Lehman Brothers. Economic Policy Review 20(2). Federal Reserve Bank of New York.

    We use a framework that divides costs associated with derivatives resolution into private costs and public (external) costs. Private costs consist of direct losses to derivatives counterparties from unrecovered claims, indirect costs to derivatives counterparties from loss of hedged positions, costs to other Lehman Brothers creditors in the bankruptcy proceeding due to reductions in recovery values resulting from the termination and settlement of OTC derivatives, losses to the Lehman estate from excess collateral transfers during bulk sales of exchange-traded derivatives, and litigation and administrative expenses. While we find no literature that assesses the public costs directly attributable to the resolution of Lehman's derivatives portfolio, below we examine the literature assessing the public impact of Lehman's failure more broadly.

    While rigorous estimates of the value of each cost element listed above would be ideal, in reality we are constrained by a lack of publicly available data. Therefore, this section combines qualitative descriptions of costs with limited quantitative information when available, in an effort to provide insight on the costs of resolving Lehman's QFC portfolio under the bankruptcy proceedings.

    Private Derivatives Counterparty Costs: Unrecovered Claims. Estimates of bankruptcy claim recovery rates of OTC derivative counterparties (excluding Lehman affiliate claims) are reported in the literature at the Lehman subsidiary level, and vary widely, ranging from 31 percent for Lehman Brothers Special Financing (the largest Lehman derivatives entity) to 100 percent each for Lehman Brothers OTC Derivatives, Lehman Brothers Derivatives Products, and Lehman Brothers Financial Products, as of March 27, 2014 (Fleming and Sarkar (2014)). Still, the authors emphasize that “many counterparties of Lehman's OTC derivatives suffered substantial losses.”

    Private Derivatives Counterparty Costs: Loss of Hedged Positions. A key reason for many counterparties to acquire derivative positions is to hedge against potential future market developments. These hedges reduce uncertainties and serve as valuable risk management instruments. Fleming and Sarkar (2014) suggest Lehman's abrupt bankruptcy took counterparties by surprise, and allowed them little time to assess their derivative positions facing Lehman, decide whether to terminate contracts, and rehedge their positions as needed.152 Therefore, many counterparties lost their hedged positions within a brief period and were unexpectedly exposed to risks until new positions could be established. We find no estimates of the costs of these lost hedges in the literature.

    152 Fleming and Sarkar believe the selection of the termination date for safe harbor purposes influenced this. They write (p. 25), “Although Lehman filed for bankruptcy protection at about 1:00 a.m. on Monday, September 15, 2008, the termination date was set as Friday, September 12 for derivatives subject to automatic termination. Normally, nondefaulting derivatives counterparties of Lehman would have attempted to hedge their positions on Monday to mitigate expected losses on their position. However, they could not do so since their positions were deemed to have terminated two days earlier.”

    Private Costs to the Entire Lehman Bankruptcy Estate: Settlement of OTC Derivatives. Fleming and Sarkar (2014) note that the settlement of Lehman's OTC derivatives claims may have also resulted in significant losses to the Lehman bankruptcy estate. Derivatives valuation claims are generally based on replacement costs, and the authors note that, due to the large prevailing bid-ask spreads at the time of Lehman's bankruptcy filing, replacement costs may have diverged significantly from fair value. During the settlement process the Lehman estate received $11.85 billion in OTC derivatives receivables by January 10, 2011. It is unclear how much in additional receivables may have been “lost” by Lehman due to the termination and settlement of contracts following its bankruptcy filing. The literature notes that the relatively abrupt timing of the bankruptcy filing may have also influenced the magnitude of losses. Valukas (2010) suggested that Lehman insufficiently planned for the possibility of bankruptcy, such that management began to plan seriously for bankruptcy only a few days before the filing. A bankruptcy court document 153 cites a “turnaround specialist” advising Lehman, Bryan Marsal, as telling the court-appointed examiner that the sudden bankruptcy resulted in the loss of 70 percent of $48 billion of receivables from derivatives that could have been unwound. Yet the same document notes that Lehman counsel Harvey Miller did not think the rushed filing had an adverse impact on the estate (Valukas 2010). These accounts appear anecdotal, and no information is provided on the derivation of the figures cited by Marsal.

    153 Valukas, A. (2010). “Report of the Examiner in the Chapter 11 Proceedings of Lehman Brothers Holdings Inc.” March 11. Accessed at: http://jenner.com/lehman/.

    Private Costs to the Entire Lehman Bankruptcy Estate: Settlement of Exchange-traded Derivatives. Wiggins and Metrick (2015) 154 report that three days following the Lehman bankruptcy filing, the derivatives exchange holding its accounts sold them through a bulk auction to three buyer entities, who assumed the positions taken in the derivatives contracts. The transactions included transfer of $2 billion in Lehman collateral and clearing deposits to the buyers, which exceeded the market value of the obligations by roughly $1.2 billion. This excess collateral value was considered a loss to Lehman by the bankruptcy examiner.

    154 Wiggins, R.Z. and Metrick, A. (2015). “The Lehman Brothers Bankruptcy G: The Special Case of Derivatives.” Yale Program on Financial Stability Case Study 2014-3G-V1.

    Private Costs: Litigation and Administrative. The extended duration of the OTC derivatives settlement process included multiple court petitions, procedure approvals, settlement mechanisms, and legal challenges. While 81 percent of derivative contracts in claims against Lehman were terminated by November 13, 2008, the final settlement process moved more deliberately due to the multiple steps involved in properly addressing the unprecedented scale and complexity of claims within the bankruptcy process. Only 84 percent of derivatives claims had been settled by the end of 2012. Estimates of litigation and administrative expenses for OTC derivatives alone are not available, but these expense categories for the full Lehman settlement process were estimated to total $3.2 billion as of May 13, 2011 (Fleming and Sarkar (2014)).

    Public Costs: Externalities. The event study is a common method of estimating the market impact of a particular event. Measured market reactions to the Lehman bankruptcy are based on the institution's failure event as a whole; they are not reactions to the QFC resolution process alone and therefore overstate the impacts of these terminations. We may plausibly assume, however, that the market reactions to the overall Lehman collapse announcement included a component associated with potential costs of settling their derivative contracts.155

    155 Still, we caution that event study results may produce “noisy” signals. For example, attribution is problematic as the period surrounding the Lehman collapse was a particularly active one with nearly two dozen significant economic events in September 2008.

    Johnson and Mamun (2012) 156 apply an event study approach to assess stock market reactions of a sample of 742 U.S. financial institutions—divided into banks, savings and loans, brokers, and primary dealers—on the date of the Lehman bankruptcy filing. While each group of institutions showed negative abnormal returns, only the bank (−3 percent) and primary dealer (−6 percent) coefficients were statistically significant. The data strongly support the notion that the event's impact, as measured by abnormal returns, differed across types of financial institutions.

    156 Johnson, M.A. and Mamun, A. (2012). The Failure of Lehman Brothers and its Impact on Other Financial Institutions. Applied Financial Economics 22: 375-385.

    Dumontaux and Pop (2012) 157 apply a similar approach to assess stock market reactions of a sample of 382 U.S. financial companies, using brief event windows. They report heterogeneous outcomes according to institution size and business lines. Among the twenty large companies 158 (excluding Lehman Brothers), cumulative abnormal stock price returns were highly significantly negative, ranging from −10 percent to −18 percent over five distinct event windows of up to five days in duration. However, the effects on the full sample were not statistically significant, indicating the immediate contagion effect was limited to large companies. The results of both event studies suggest the Lehman bankruptcy likely imparted immediate negative external effects on a subset of financial companies, causing substantial drops in their market valuations. However, as noted above, it is not clear from these studies the extent to which the change in company valuation is driven by the costs of the QFC resolution process. We did not find event studies specifically assessing market impacts on non-financial firms.

    157 Dumontaux, N. and Pop, A. (2012). “Contagion Effects in the Aftermath of Lehman's Collapse: Measuring the Collateral Damage.” University of Nantes working paper 2012/27.

    158 Large financial companies are defined as those with total assets over $1 billion in their last audited report before the event date.

    Domestic Public Support: Federal Reserve Facility. The Federal Reserve provided substantial liquidity to the markets during the 2007-2009 period. Fleming and Sarkar (2014) consider the support provided in the first week after the bankruptcy a critical factor in the recovery of claims against at least part of Lehman Brothers, because it allowed the broker-dealer, Lehman Brothers Inc., to keep operating until it was acquired by Barclays. Between September 15 and 18, 2008, Lehman Brothers Inc. borrowed $68 billion from the Primary Dealer Credit Facility (“PDCF”). Because the borrowed funds were fully collateralized and repaid in full with interest, the Congressional Budget Office (2010) 159 estimated that total lending through the PDCF involved a negligible subsidy value.

    159 Congressional Budget Office. (2010). The Budgetary Impact and Subsidy Costs of the Federal Reserve's Actions During the Financial Crisis.

    Global Public Costs: Externalities. The economic literature is rich with event studies of market reactions to policy announcements designed to alleviate the financial crisis. We acknowledge global spillovers as a potential public cost; however, we find no studies focusing directly on the global market impacts of the Lehman Brothers bankruptcy as an event.

    c. Conclusion

    The economic literature on financial asset fire sales maintains that such events are more systemically harmful when they occur during industry-wide periods of distress, making the mitigation of these costs a public policy concern. The Lehman Brothers bankruptcy and the resulting QFC terminations occurred during a crisis period, and might have imposed widespread private and public costs. We do not compare the Lehman bankruptcy costs to the potential resolution costs under a counterfactual case in which Title II of the Dodd-Frank Act had been in effect at the time of the Lehman bankruptcy filing. Nonetheless, Fleming and Sarkar (2014) argue that “some of the losses associated with the failure of Lehman Brothers may have been avoided in a more orderly liquidation process.”

    3. Baseline

    The FDIC promulgated 12 CFR part 371, Recordkeeping Requirements for Qualified Financial Contracts (“Part 371”), pursuant to section 11(e)(8)(H) of the FDIA.160 The FDIC's QFC recordkeeping rule, which applies to insured depository institutions that are in a troubled condition, was promulgated to enable the FDIC as receiver to make an informed decision as to whether to transfer or retain QFCs and thereby reduce losses to the deposit insurance fund and minimize the potential for market disruptions that could occur with respect to the liquidation of QFC portfolios of insured depository institutions. The recordkeeping requirements of the Final Rules, which do not apply to insured depository institutions, are based, in part, on Part 371. However, the information requirements of the Final Rules are more extensive, reflecting the FDIC's experience with portfolios of QFCs of insured depository institutions subject to Part 371.

    160 12 U.S.C. 1821(e)(8)(H).

    Based on discussions with the staff of the PFRAs who are familiar with financial company operations and have experience supervising financial companies with QFC portfolios, the Secretary believes that the large corporate groups that would be subject to the Final Rules should already be maintaining much of the QFC information required to be maintained under the Final Rules as part of their ordinary course of business. In order for these large corporate groups to effectively manage their QFC portfolios, they need to have robust recordkeeping systems in place; for example, large corporate groups that trade derivatives out of several distinct legal entities need to have detailed records, including counterparty identification, position-level data, collateral received and posted, and contractual requirements, in order to effectively manage their portfolio, perform on contracts, and monitor risks. As noted by commenters, regulated financial companies must maintain extensive QFC records pursuant to other regulatory requirements.161 However, the Secretary understands that these large corporate groups are not currently maintaining the QFC records in the standardized format prescribed by the Final Rules and as set forth in the appendix to the Final Rules such that they may have to modify existing recordkeeping systems with respect to QFCs or build new systems in order to comply with the rules.

    161See SIFMA AMG letter, pp. 12-13; ACLI letter, pp. 20-21.

    4. Evaluation of Alternatives

    The Secretary considered alternatives to implementing the recordkeeping requirements of the Final Rules but believes that the adopted form is the best available method of achieving both the statutory mandate and the regulatory objectives. The assessment of alternatives below is organized into three subcategories: The scope of the rules; the content of records; and standardized recordkeeping.

    a. Scope of the Final Rules

    The scope of the Final Rules and the reasons for the changes made to the scope of the rules as compared to the Proposed Rules are provided in section II.A.1, above. The Secretary considered alternative criteria in developing the definition of a records entity, such as including financial companies that have more than $10 billion in assets. This threshold, which would have captured more financial companies that potentially might be considered for orderly liquidation under Title II, has been used in other regulatory requirements. For example, the Dodd-Frank Act requires certain financial companies with more than $10 billion in total consolidated assets to conduct annual stress tests.162 Additionally, the CFTC's final rule on the end-user exemption to the clearing requirement for swaps exempts banks, savings associations, farm credit system institutions, and credit unions with total assets of $10 billion or less from the definition of “financial entity,” making such “smaller” financial institutions eligible for the end-user exception.163

    162See 12 U.S.C. 5365(i)(2).

    163See 17 CFR 50.50(d).

    However, the Secretary determined that while it is possible that financial companies with more than $10 billion and less than $50 billion in total assets would be considered for orderly liquidation under Title II, a more appropriate threshold is $50 billion in total consolidated assets, supplemented by the secondary thresholds of $250 billion of total gross notional derivatives outstanding or $3.5 billion of derivative liabilities. Imposing the $50 billion total assets threshold by itself or including all financial companies with over $10 billion in total assets would substantially increase the number of financial companies subject to recordkeeping requirements, many of which would likely not be considered for orderly liquidation under Title II. Financial companies with total assets of $50 billion or more and with a substantial degree of activity in QFCs, as indicated by total gross notional derivatives outstanding of at least $250 billion or derivative liabilities of at least $3.5 billion, potentially would be among the most likely to be considered for orderly liquidation under Title II. The definition of “records entity” in the Final Rules is thus designed to reduce recordkeeping burdens on smaller financial company groups by capturing only those financial companies that are part of a group with a member that is the type of company for which the FDIC is most likely to be appointed as receiver.

    b. Content of Records

    The Secretary determined, after consulting with the FDIC, that requiring each records entity to maintain the data included in Tables A-1 through A-4 and the four master data lookup tables of the appendix to the Final Rules is necessary to assist the FDIC in being able to effectively exercise its rights under the Act and fulfill its obligations under sections 210(c)(8), (9), or (10) of the Act. To facilitate the resolution of QFC portfolios, the FDIC, upon being appointed as receiver for a covered financial company under Title II, would need to analyze such data in order to promptly effectuate decisions. The information must be sufficient to allow the FDIC to estimate the financial and operational impact on the covered financial company and its counterparties, affiliated financial companies, and the financial markets as a whole of the FDIC's decision to transfer, retain and disaffirm or repudiate, or retain and allow the counterparty to terminate the covered financial company's QFCs. It must also allow the FDIC to assess the potential impact that such decisions may have on the financial markets as a whole, which may inform its transfer decisions. The need for the information specified by each table is discussed in further detail in section II.D.2 above.

    As indicated above, the recordkeeping requirements of the Final Rules are similar to the FDIC's Part 371 rules applicable to insured depository institutions in troubled condition, but the information requirements of the Final Rules (which do not apply to insured depository institutions) are more extensive. Previously, in developing the Proposed Rules, the Secretary considered the appropriateness of reducing the recordkeeping burden by aligning the requirements more closely with those of the FDIC's Part 371, but determined, in consultation with the FDIC, that additional recordkeeping beyond that required by Part 371 would be needed for the FDIC to resolve a financial company with significant QFC positions under Title II. The Secretary reaffirms in the Final Rules that this determination is appropriate and that, in a Title II resolution scenario, the FDIC will need the additional information required by the Final Rules to analyze the QFC portfolio, decide how to manage the QFCs, and perform their obligations under the QFCs, including meeting collateral requirements. Furthermore, although applying the Part 371 requirements to records entities instead of the requirements of the Final Rules would have imposed less of a burden on records entities, even the Part 371 requirements would require records entities to update their recordkeeping systems, including by amending internal procedures, reprogramming systems, reconfiguring data tables, and implementing compliance processes, in ways similar to those expected to be required for compliance with the Final Rules.

    As an example of the additional information required to be maintained under the Final Rules as compared to Part 371, the counterparty-level data required in Table A-2 to the appendix of the Final Rules includes the next margin payment date and payment amount. This will assist the FDIC in ensuring that a covered financial company and its subsidiaries perform their QFC obligations, including meeting clearing organization margin calls. The Table A-3 legal agreement information, which is not included in Part 371, is necessary to enable the FDIC as receiver to evaluate the likely treatment of QFCs under such contracts, and to inform the FDIC of any third-party credit enhancement and the identification of any default or other termination event provisions that reference an entity. Table A-4 includes additional collateral detail data, such as the location of collateral, the collateral segregation status, and whether the collateral may be subject to re-hypothecation by the counterparty. These additional data are necessary to enable the FDIC to assess risks associated with the collateral and improve the FDIC's ability to analyze various QFC transfer or termination scenarios. For example, for cross-border transactions, this information would help the FDIC evaluate the availability of collateral in different jurisdictions and the related close-out risks under local law if the receiver cannot arrange for the transfer of QFC positions. As noted above, we believe in many cases records entities are maintaining the additional information required under the rules due to existing business practices or other regulatory requirements. However, the Secretary understands that these large corporate groups are not currently maintaining the QFC records in the standardized format prescribed by the Final Rules and as set forth in the appendix to the Final Rules such that the additional information required will impose additional burden associated with amending internal procedures, reconfiguring data tables, and implementing compliance processes.

    c. Standardized Recordkeeping

    The Secretary determined that requiring records entities to have the capacity to maintain and generate QFC records in the uniform, standardized format set forth in the appendix to the Final Rules is necessary to assist the FDIC in being able to effectively exercise its rights under the Act and fulfill its obligations under sections 210(c)(8), (9), or (10) of the Act. Specifically, when the FDIC is appointed as receiver of a covered financial company, the covered financial company's QFC counterparties are prohibited from exercising their contractual right of termination until 5 p.m. (eastern time) on the first business day following the date of appointment. After its appointment as receiver and prior to the close of the aforementioned 5 p.m. deadline, the FDIC has three options in managing a covered financial company's QFC portfolio. Specifically, with respect to all of the covered financial company's QFCs with a particular counterparty and all its affiliates, the FDIC may: (1) Transfer the QFCs to a financial institution, including a bridge financial company established by the FDIC; (2) retain the QFCs within the receivership and allow the counterparty to exercise contractual remedies to terminate the QFCs; or (3) retain the QFCs within the receivership, disaffirm or repudiate the QFCs, and pay compensatory damages. If the FDIC transfers the QFCs to a financial institution, the counterparty may not terminate the QFCs solely because the QFCs were transferred, or by reason of the covered financial company's financial condition or insolvency or the appointment of the FDIC as receiver. If the FDIC does not transfer the QFCs and does not disaffirm or repudiate such QFCs within the one business day stay period, the counterparty may exercise contractual remedies to terminate the QFCs and assert claims for payment from the covered financial company and may have rights to liquidate the collateral pledged by the covered financial company.

    Previously, in developing the Proposed Rules, the Secretary considered reducing the recordkeeping burden by permitting the maintenance of QFC records in non-standardized formats, but determined, after consulting with the FDIC, that this alternative would compromise the FDIC's flexibility as receiver in managing the QFC portfolio and impair its ability as receiver to maximize the value of the assets of the covered financial company in the context of orderly liquidation.164 The Secretary reaffirms in the Final Rules that this determination is appropriate in order to ensure that the FDIC, in a Title II resolution scenario, has the maximum potential to execute a prompt and effective decision regarding the disposition of the QFC portfolio of a covered financial company.

    164See 12 U.S.C. 5390(a)(1)(B)(iv).

    However, while the Final Rules specify a standardized recordkeeping format, the Secretary also recognizes that there may be particular types of QFC or counterparties for which more limited information may be sufficient to enable the FDIC to exercise its rights under the Act and fulfill its obligations under sections 210(c)(8), (9), or (10) of the Act. The Final Rules provide the Secretary with the discretion to grant conditional or unconditional exemptions from compliance with one or more of the requirements of the Final Rules, which could include exemptions with respect to the information required regarding particular types of QFCs or counterparties.

    5. Affected Population

    Instead of requiring all financial companies to maintain records with respect to QFCs, the Secretary is limiting the scope of the Final Rules to a narrow subset of financial companies. Discretion to do so is afforded under section 210(c)(8)(H)(iv) of the Act, which requires the recordkeeping requirements to differentiate among financial companies by taking into consideration, among other things, their size and risk. The Secretary is exercising this discretion to define the term “records entity” and thereby include within the scope of the Final Rules only those financial companies that: (1) Are identified as U.S. G-SIBs; (2) the Council determines could pose a threat to U.S. financial stability; (3) the Council designates as systemically important financial market utilities; (4) have total consolidated assets equal to or greater than $50 billion and either (i) total gross notional derivatives outstanding equal to or greater than $250 billion or (ii) derivative liabilities equal to or greater than $3.5 billion; or (5) are part of the same corporate group in which at least one financial company satisfies one or more of the other foregoing criteria. The Final Rules would only apply to large corporate groups (including a large corporate group's affiliated financial companies, regardless of their size, if the affiliated financial company is a party to an open QFC and is not an “excluded entity” under the Final Rules). The types of financial companies that would qualify as records entities under the Final Rules include those listed in section II.A.1.b, above. The Secretary estimates that 30 large corporate groups would be subject to the recordkeeping requirements.
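
    For illustration only, the asset and derivatives thresholds in criterion (4) of the “records entity” definition can be expressed as a simple decision rule. The sketch below is a hypothetical rendering and is not part of the Final Rules; the variable names are assumed for purposes of the example.

    # Hypothetical sketch of the criterion (4) thresholds; variable names are
    # illustrative only and do not appear in the Final Rules.
    def meets_criterion_4_thresholds(total_consolidated_assets,
                                     gross_notional_derivatives,
                                     derivative_liabilities):
        # $50 billion or more in total consolidated assets, together with either
        # $250 billion or more in total gross notional derivatives outstanding
        # or $3.5 billion or more in derivative liabilities.
        return (total_consolidated_assets >= 50e9
                and (gross_notional_derivatives >= 250e9
                     or derivative_liabilities >= 3.5e9))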

    6. Assessment of Potential Costs and Benefits

    a. Potential Costs

    Based on discussions with the PFRAs who are familiar with financial company operations and have experience supervising financial companies with QFC portfolios, the Secretary believes that the costs of implementing the Final Rules may be mitigated by the fact that records entities should be maintaining most of the QFC information required by the Final Rules as part of their ordinary course of business. However, the Secretary recognizes that the requirement in the Final Rules for records to be maintained in a standardized format, among other requirements, may impose costs and burdens on records entities. In order to comply with the Final Rules, each of the approximately 30 large corporate groups that the Secretary estimates would be subject to the recordkeeping requirements will need to have network infrastructure to maintain data in the required format. The Secretary expects that this will likely impose one-time initial costs on each large corporate group in connection with necessary updates to their recordkeeping systems, such as systems development or modifications. The initial costs to set up network infrastructure will depend on whether a large corporate group already holds and maintains QFC data in an organized electronic format, and if so, whether the data currently reside on different systems rather than on one centralized system. Large corporate groups may need to amend internal procedures, reprogram systems, reconfigure data tables, and implement compliance processes. Moreover, they may need to standardize the data and create tables to match the format required by the Final Rules. However, the Secretary believes that the large corporate groups that would be subject to the Final Rules are likely to rely on existing centralized systems for recording and reporting QFC activities to perform most of the recordkeeping and reporting requirements set forth herein. The entity within the corporate group responsible for this centralized system will likely operate and maintain a technology shared services model with the majority of technology applications, systems, and data shared by the affiliated financial companies within the large corporate group. In addition, as referenced above, the Final Rules will also require the top-tier financial company of the corporate group to be capable of generating a single, compiled set of the records specified in the Final Rules for all records entities in the corporate group that it consolidates or are consolidated with it and to be capable of providing such a set of records to its PFRA and the FDIC. Therefore, the Final Rules will likely impose the most significant costs on the entity or entities within the large corporate group responsible for such centralized systems, which is reflected in the cost estimates for large corporate groups provided herein; most affiliated financial companies within a large corporate group are not expected to bear significant costs. The affiliated financial companies will likely have much lower costs because they can utilize and rely upon the technology and network infrastructure operated and maintained by the entity responsible for the centralized system within the large corporate group.

    Previously, the Secretary estimated the costs of the initial and annual recordkeeping burdens, as well as the annual reporting burden, associated with the Proposed Rules in both man-hours and dollar terms and requested comment on whether the cost estimates were reasonable. As noted above, the Secretary's recordkeeping, reporting, data retention, and records generation burden estimates were based on discussions with the PFRAs regarding their prior experience with burden estimates for other recordkeeping systems. The Secretary also considered the burden estimates in rulemakings with similar recordkeeping requirements. For example, the initial non-recurring burden estimates provided in rulemakings for such recordkeeping requirements varied based on the scope of requirements and the type of entity subject to the requirements, but included initial burden estimates ranging from approximately 100 to 3,300 hours and estimates of required investments in technology and infrastructure from $50,000 to $250,000. Although the type and amount of data collected and reported for such reporting systems are substantively different in both content and format from the data that would be recorded under the Final Rules, the estimates from these prior rulemakings nevertheless provide some guidance as to the scale of system modifications and information technology investments that would be required for compliance with the Final Rules. The types of information technology professionals that will establish the recordkeeping and data retention systems for records entities under the Final Rules are also expected to be similar to the professionals involved in establishing the other systems referenced above.

    Most commenters offered general comments on the costs associated with complying with the Proposed Rules, with several stating that the costs—either in general, or as related to certain proposed recordkeeping requirements—outweighed the benefits to the FDIC as receiver.165 Some commenters addressed the impact that the Proposed Rules would have on entities' recordkeeping and information systems. For example, one commenter stated that the Proposed Rules, if not modified, would force market participants to rebuild existing recordkeeping systems and protocols and impose significant expense.166 One commenter directly referenced the Secretary's cost estimates in the context of its request for an extension of the proposed initial compliance period, stating that the cost of such work for most financial groups subject to the rule will far exceed the Secretary's estimate of the total industry-wide compliance cost.167 On this basis, the commenter went on to request that the initial 270-day compliance period provided for in the Proposed Rules be extended to two years and that compliance be phased in over a period of years based on the potential criticality of QFCs to the FDIC during resolution. However, neither this commenter, nor any other commenter on the Proposed Rules, offered quantified costs, estimated or otherwise, or other empirical data in support of these comments.

    165See, e.g., ACLI letter, pp. 17-19; SIFMA AMG letter, p. 4.

    166See TIAA-CREF letter, p. 2. Two other commenters stated that the Proposed Rules would have a significant impact on information technology and systems development, but these comments arose in the context of clearing organizations not having access to much of the information required under the Final Rules. See DTCC letter, p. 10; OCC letter, p. 12. The Secretary has provided a conditional exemption for registered derivatives clearing organizations and clearing agencies from the recordkeeping requirements of the Final Rules as discussed in section II.A.1.a, above.

    167See TCH et al. letter, pp. 3-4.

    As discussed in detail in section II above, after carefully considering all of the comments received and consulting with the FDIC, the Secretary is adopting numerous changes from the Proposed Rules. Many of these changes are being adopted in response to comments and are intended to limit the scope and mitigate the burdens associated with complying with the QFC recordkeeping requirements of the Final Rules. In large part, these changes relate to narrowing the scope of the definition of “records entity,” extending the initial compliance period for all records entities, eliminating certain proposed recordkeeping requirements, and providing for a de minimis exemption from the preponderance of the recordkeeping requirements for certain records entities that have a minimal level of QFC activity.

    Taking into consideration the changes made in the Final Rules and the comments received as to the burden the rules would place on records entities, the Secretary has updated the estimated potential costs. It is estimated that the initial recordkeeping burden for all records entities (including affiliates) will be approximately 218,505 hours with a total one-time initial cost of approximately $36,631,995 (in nominal dollars), representing $1,221,000 per large corporate group on average. The basis for this estimate, discussed further below, is necessarily constrained by the limited availability of relevant information, including the lack of quantitative information from commenters.

    Specifically, based on staff-level discussions with several of the PFRAs, burden estimates in rulemakings with similar recordkeeping requirements, and the comments received, it is expected that each of the approximately 30 large corporate groups will incur on average approximately $500,000 in systems development and modification costs, including the purchase of computer software, and that the entity responsible for maintaining the centralized system within each large corporate group will incur 7,200 initial burden hours at a cost of $712,800 to update their recordkeeping systems. This initial burden is mitigated to some extent because QFC data is likely already retained in some form by each large corporate group respondent in the ordinary course of business, but large corporate group respondents may need to amend internal procedures, reprogram systems, reconfigure data tables, and implement compliance processes. Moreover, they may need to standardize the data and create records tables to match the format required by the Final Rules. These costs will likely be borne by the entity responsible for maintaining the centralized system within each large corporate group. It is expected that the initial burden hours will require the work of senior programmers, programmer analysts, senior system analysts, compliance managers, compliance clerks, directors of compliance, and compliance attorneys. The Secretary has estimated that the average hourly wage rate for recordkeepers to comply with the initial recordkeeping burden is approximately $99 per hour, based in part on the average hourly wage rates for these occupations in the U.S. Department of Labor, Bureau of Labor Statistics' occupational employment statistics and wage statistics for financial sector occupations, dated May 2015.168

    168 All cost and wage estimates are in nominal dollars and have not been adjusted for inflation.

    The total estimated one-time cost for all large corporate group respondents to comply with the initial recordkeeping burden is approximately $36,384,000, of which $21,384,000 is due to the burden hours and $15,000,000 is for systems development and modification costs. This is based on the estimated 7,200 initial burden hours for each of the 30 large corporate groups multiplied by the estimated average hourly wage rate for recordkeepers (216,000 hours multiplied by $99/hour) and the $500,000 in systems development and modification costs for each of the 30 large corporate groups. Finally, the total estimated one-time initial cost includes the estimated cost for the 5,010 affiliated financial company respondents to comply with the initial recordkeeping burden, which is approximately $247,995. This is based on an estimated 0.5 initial burden hour for each affiliated financial company, 5,010 affiliated financial companies, and the $99 estimated average hourly wage rate for recordkeepers described above (2,505 hours multiplied by $99/hour).
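
    The arithmetic behind the one-time initial cost estimate can be restated compactly. The short sketch below simply reproduces the figures given above and introduces no new assumptions.

    # Reproduces the one-time initial recordkeeping cost estimate described above.
    groups = 30
    hours_per_group = 7_200
    recordkeeper_rate = 99            # dollars per hour
    it_cost_per_group = 500_000       # systems development and modification
    affiliates = 5_010
    hours_per_affiliate = 0.5

    group_hours_cost = groups * hours_per_group * recordkeeper_rate          # $21,384,000
    it_cost = groups * it_cost_per_group                                     # $15,000,000
    affiliate_cost = affiliates * hours_per_affiliate * recordkeeper_rate    # $247,995

    total_one_time_cost = group_hours_cost + it_cost + affiliate_cost        # $36,631,995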

    However, section 148.1(d)(1)(i) of the Final Rules provides for compliance periods of between 540 days and four years after the effective date of the Final Rules, depending on the total assets of records entities. Thus, the initial recordkeeping burden is expected to be incurred over multiple years, resulting in a substantial reduction in the annual cost. Information as to how records entities would spread this initial cost over the compliance period is not available. However, assuming the costs would be incurred evenly over the entire compliance period, this would result in annualized initial recordkeeping costs ranging from approximately $814,000 per year for a large corporate group with a 540-day compliance period to approximately $305,267 per year for a large corporate group with a four-year compliance period.
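
    Assuming, as above, that the average per-group initial cost is incurred evenly over the applicable compliance period (and treating the 540-day period as 1.5 years, an assumption consistent with the figures cited), the annualized figures follow directly:

    # Annualizing the average per-group initial cost over the compliance period;
    # assumes even phasing and a 540-day period treated as 1.5 years.
    avg_initial_cost_per_group = 36_631_995 / 30             # about $1,221,000
    annualized_540_days = avg_initial_cost_per_group / 1.5   # about $814,000
    annualized_4_years = avg_initial_cost_per_group / 4      # about $305,267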

    Based in part on staff-level discussions with several of the PFRAs, burden estimates in rulemakings with similar recordkeeping requirements, and the comments received, it is expected that the total estimated recurring annual recordkeeping burden necessary to oversee, maintain, and utilize the recordkeeping system will be approximately 240 hours for each large corporate group and 0.5 hours for each affiliated financial company. Based on the estimate of 30 large corporate groups and approximately 167 affiliates of each corporate group (5,010 affiliated financial companies in total) that will be subject to the rules, the total estimated annual recordkeeping burden for all records entities will be approximately 9,705 hours with a total annual cost of approximately $960,795 (9,705 hours multiplied by $99/hour). The estimated average hourly wage rate for recordkeepers to comply with the annual recordkeeping burden is approximately $99 per hour, using the same methodology described above for compliance with the initial recordkeeping burden.
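
    The recurring annual recordkeeping figures above follow the same arithmetic:

    # Recurring annual recordkeeping burden and cost, as stated above.
    annual_hours = 30 * 240 + 5_010 * 0.5    # 7,200 + 2,505 = 9,705 hours
    annual_cost = annual_hours * 99          # $960,795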

    With regard to reporting burdens under the Final Rules, a records entity may request in writing an extension of time with respect to compliance with the recordkeeping requirements or an exemption from the recordkeeping requirements. The annual reporting burden under the Final Rules associated with such extension and exemption requests is estimated to be approximately 50 hours per large corporate group. The estimated average hourly rate for recordkeepers to comply with the annual reporting burden is approximately $192 per hour, based on the U.S. Department of Labor, Bureau of Labor Statistics' occupational employment statistics and wage statistics for financial sector occupations, dated May 2015. The $192 hourly wage rate is based on the average hourly wage rates for compliance managers, directors of compliance, and compliance attorneys that will conduct the reporting. The total annual cost of the reporting burden under the Final Rules is approximately $288,000 (50 hours multiplied by 30 large corporate groups multiplied by $192/hour).
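
    Similarly, the annual reporting cost is the product of the per-group hours, the number of large corporate groups, and the hourly rate:

    # Annual reporting burden associated with extension and exemption requests.
    reporting_cost = 50 * 30 * 192           # 1,500 hours at $192/hour = $288,000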

    Based on the total one-time cost (phased in over 540 days to 4 years), the total annual recordkeeping cost, and the total annual cost of the reporting burden, the estimated net present value of the potential costs of the Final Rules over the next 10 years is approximately $42,103,000 using a discount rate of 3 percent and approximately $38,000,000 using a discount rate of 7 percent.
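
    The precise phasing assumptions underlying the 10-year present-value figures are not stated above. The sketch below illustrates one plausible way such a calculation could be structured; the even three-year phasing of the one-time cost is an assumption made only for illustration and is not intended to reproduce the stated figures exactly.

    # Illustrative structure of a 10-year present-value calculation. The phasing
    # of the one-time cost over years 1-3 is an assumption; the result depends on
    # the phasing actually used.
    def present_value(cash_flows, rate):
        return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

    one_time_total = 36_631_995
    annual_recurring = 960_795 + 288_000
    flows = [annual_recurring + (one_time_total / 3 if year <= 3 else 0)
             for year in range(1, 11)]

    pv_at_3_percent = present_value(flows, 0.03)
    pv_at_7_percent = present_value(flows, 0.07)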

    The estimated potential costs in nominal dollars for the initial recordkeeping burden, the annual recordkeeping burden, and the annual reporting burden associated with the Final Rules are summarized in the following table.

    QFC Recordkeeping Requirements Final Rule—Potential Costs

                                                    Initial          Annual           Annual
                                                    recordkeeping    recordkeeping    reporting
    30 Large Corporate Groups:
      Estimated Hours per Group                     7,200            240              50
      Total Hours                                   216,000          7,200            1,500
      Total Cost                                    $21,384,000      $712,800         $288,000
    5,010 Affiliates:
      Estimated Hours per Affiliate                 0.5              0.5
      Total Hours                                   2,505            2,505
      Total Cost                                    $247,995         $247,995
    IT Costs:
      Estimated IT Costs per Corporate Group        $500,000
      Total Cost                                    $15,000,000
    Total:
      Hours                                         218,505          9,705            1,500
      Cost                                          $36,631,995      $960,795         $288,000
    Memorandum:
      Estimated average hourly wage/rate *          $99              $99              $192

    * Estimated average hourly rate for recordkeepers to comply with the initial and annual recordkeeping and annual reporting requirements, based on the U.S. Department of Labor, Bureau of Labor Statistics' occupational employment statistics and wage statistics for financial sector occupations, dated May 2015.
    b. Potential Benefits

    As noted earlier, QFCs tend to increase the interconnectedness of the financial system, and the recent financial crisis demonstrated that the management of QFC positions can be an important element of a resolution strategy which, if not handled properly, may magnify market instability. The recordkeeping requirements of the Final Rules are therefore designed to ensure that the FDIC, as receiver of a covered financial company, will have comprehensive information about the QFC portfolio of such financial company subject to orderly resolution, and enable the FDIC to carry out the rapid and orderly resolution of a financial company's QFC portfolio in the event of insolvency, for example, by transferring QFCs to a bridge financial company within the narrow time frame afforded by the Act. Given the short time frame for FDIC decisions regarding a QFC portfolio of significant size or complexity, the Final Rules require the use of a regularly updated and standardized recordkeeping format to allow the FDIC to process the large amount of QFC information quickly. In the absence of updated and standardized information, for example, the FDIC could leave QFCs in the receivership when transferring them to a bridge financial company or other solvent financial institution would have been the preferred course of action had better information been available. Specifically, if the FDIC does not transfer the QFCs and does not disaffirm or repudiate such QFCs, counterparties may terminate the QFCs and assert claims for payment from the covered financial company and may have rights to liquidate the collateral pledged by the covered financial company. However, a decision by the FDIC not to transfer the QFCs of a large, interconnected financial company must be calculated and based on detailed information about the QFC portfolio. Otherwise, the subsequent unwinding and termination of QFCs involving numerous counterparties risks becoming disorderly, potentially resulting in the rapid liquidation of collateral, deterioration in asset values, and severe negative consequences for U.S. financial stability. The FDIC as receiver may also wish to make sure that affiliates of the covered financial company continue to perform their QFC obligations in order to preserve the critical operations of the covered financial company and its affiliates. In such cases, the FDIC may need to arrange for additional liquidity, support, or collateral to the affiliates to enable them to meet collateral obligations and generally perform their QFC obligations.

    While there could be significant benefits associated with the QFC recordkeeping requirements of the Final Rules, such benefits are difficult to quantify. The Final Rules are only one component of the orderly liquidation authority under Title II of the Act and the benefits of the Final Rules will only be realized upon such authority being exercised. Moreover, implementation of additional provisions of the Dodd-Frank Act has, among other things: (1) Subjected large, interconnected financial companies to stronger supervision, and, as a result, reduced the likelihood of their failure; and (2) blunted the impact of any such failure on U.S. financial stability and the economy. For example, bank holding companies with total consolidated assets of $50 billion or more and nonbank financial companies supervised by the Board are subject to supervisory and company-run stress tests to help the Board and the company measure the sufficiency of capital available to support the company's operations throughout periods of stress.169 These financial companies also are or will be subject to more stringent prudential standards, including risk-based capital and liquidity requirements, which will make their failure less likely. However, if such a financial company does fail, the implementation of the Dodd-Frank Act is also intended to ensure that its failure and resolution under the Bankruptcy Code may occur without adverse effects on U.S. financial stability. For example, each of these large bank holding companies and nonbank financial companies supervised by the Board will have in place resolution plans to facilitate their rapid and orderly resolution under the Bankruptcy Code in the event of material financial distress or failure.170 The Title II orderly liquidation authority will only be used to resolve a failing financial company if its resolution under the Bankruptcy Code would have serious adverse effects on U.S. financial stability. In addition, there are substantial procedural safeguards to prevent the unwarranted use of the Title II orderly liquidation authority.

    169See 12 U.S.C. 5365(i); 12 CFR part 252.

    170See 12 U.S.C. 5365(d).

    Nevertheless, one way to gauge the potential benefits of the Final Rules is to examine the effect of the recent financial crisis on the real economy and how the Title II orderly liquidation authority as a whole will help reduce the probability or severity of a future financial crisis. For example, in a 2013 Government Accountability Office (GAO) report, GAO cited research that suggests that U.S. output losses associated with the 2007-2009 financial crisis could range from several trillion dollars to over $10 trillion.171 GAO also surveyed financial market regulators, academics, and industry and public interest groups who identified, inter alia, the more stringent prudential standards discussed above and the orderly liquidation authority as not only enhancing financial stability, at least in principle, but also helping to reduce the probability or severity of a future crisis.172

    171See Government Accountability Office, Financial Regulatory Reform: Financial Crisis Losses and Potential Impacts of the Dodd-Frank Act, GAO-13-180 at 15-16 (Jan. 16, 2013).

    172Id. at 33-34. GAO added that the experts it surveyed had differing views on these provisions but that many expect some or all of the provisions to improve the financial system's resilience to shocks.

    However, as discussed above, even if the benefits of preventing future financial crises are significant, it is difficult to quantify such benefits and determine what portion would be attributable to any single provision of the Dodd-Frank Act, let alone those benefits directly attributable to the Final Rules. In addition, as discussed above, the benefits associated with the Final Rules would only be realized if the Title II orderly liquidation authority is exercised and, even if utilized, the Final Rules are only one component of the orderly liquidation authority and the resulting benefits.

    7. Retrospective Analysis

    Executive Order 13563 also directs the Secretary to develop a plan, consistent with law and the Department of the Treasury's resources and regulatory priorities, to conduct a periodic retrospective analysis of significant regulations to determine whether such regulations should be modified, streamlined, expanded, or repealed so as to make the regulations more effective and less burdensome. The Secretary expects to conduct a retrospective analysis not later than seven years after the effective date of the Final Rules. This review will consider whether the QFC recordkeeping requirements are necessary or appropriate to assist the FDIC as receiver in being able to exercise its rights under the Act and fulfill its obligations under sections 210(c)(8), (9), or (10) of the Act and may result in proposed amendments to the Final Rules. For example, the Secretary will review whether the total assets and derivatives thresholds of the definition of “records entity” should be adjusted and whether the data set forth in Tables A-1 through A-4 and the master tables in the appendix of the Final Rules are necessary or appropriate to assist the FDIC as receiver, and whether maintaining different data is necessary or appropriate.

    List of Subjects in 31 CFR Part 148

    Reporting and recordkeeping requirements.

    Authority and Issuance

    For the reasons set forth in the preamble, the Department of the Treasury adds part 148 to 31 CFR chapter I to read as follows:

    Part 148—Qualified Financial Contracts Recordkeeping Related to the FDIC Orderly Liquidation Authority

    Sec.
    148.1 Scope, purpose, effective date, and compliance dates.
    148.2 Definitions.
    148.3 Form, availability and maintenance of records.
    148.4 Content of records.
    Appendix A to Part 148—File Structure for Qualified Financial Contract Records

    Authority: 31 U.S.C. 321(b) and 12 U.S.C. 5390(c)(8)(H).

    § 148.1 Scope, purpose, effective date, and compliance dates.

    (a) Scope. This part applies to each financial company that is a records entity and, with respect to § 148.3(a), a top-tier financial company of a corporate group as defined in § 148.2.

    (b) Purpose. This part establishes recordkeeping requirements with respect to QFCs of records entities in order to assist the Federal Deposit Insurance Corporation (“FDIC”) as receiver for a covered financial company (as defined in 12 U.S.C. 5381(a)(8)) in being able to exercise its rights and fulfill its obligations under 12 U.S.C. 5390(c)(8), (9), or (10).

    (c) Effective Date. This part shall become effective December 30, 2016.

    (d) Compliance—(1) Initial compliance dates. (i) A records entity subject to this part on the effective date must comply with § 148.3(a)(2) on the date that is 90 days after the effective date and with all other applicable requirements of this part on the date that is:

    (A) 540 days after the effective date for a records entity that:

    (1) Has total assets equal to or greater than $1 trillion; or

    (2) Is a member of the corporate group of any such records entity described in paragraph (d)(1)(i)(A)(1) of this section;

    (B) Two years after the effective date for any records entity that is not subject to the compliance date set forth in paragraph (d)(1)(i)(A) of this section and:

    (1) Has total assets equal to or greater than $500 billion; or

    (2) Is a member of the corporate group of any such records entity described in paragraph (d)(1)(i)(B)(1) of this section; and

    (C) Three years after the effective date for any records entity that is not subject to the compliance date set forth in paragraphs (d)(1)(i)(A) or (B) of this section and:

    (1) Has total assets equal to or greater than $250 billion; or

    (2) Is a member of the corporate group of any such records entity described in paragraph (d)(1)(i)(C)(1) of this section; and

    (D) Four years after the effective date for any records entity that is not subject to the compliance dates set forth in paragraphs (d)(1)(i)(A), (B), or (C) of this section.

    (ii) A financial company that becomes a records entity after the effective date must comply with § 148.3(a)(2) within 90 days of becoming a records entity and with all other applicable requirements of this part within 540 days of becoming a records entity or within the remainder of the applicable period provided under paragraph (d)(1)(i) of this section, whichever period is longer.

    (2) Subsequent compliance dates. If a financial company that at one time met the definition of records entity later ceases to meet the definition of records entity and thereafter, on any subsequent date, again meets the definition of a records entity, such financial company must comply with all applicable requirements of this part within 365 days after such subsequent date, or within the remainder of the applicable period provided under paragraph (d)(1)(i) of this section, whichever period is longer.

    (3) Extensions of time to comply. The Secretary, in consultation with the FDIC, may grant one or more extensions of time for compliance with this part. A records entity may request an extension of time by submitting a written request to the Department of the Treasury and the FDIC at least 30 days prior to the deadline for its compliance provided under paragraph (d)(1) of this section. The written request for an extension must contain:

    (i) A statement of the reasons why the records entity cannot comply by the deadline; and

    (ii) A plan for achieving compliance during the requested extension period.

    (4) Compliance by top-tier financial company. A top-tier financial company must comply with § 148.3(a)(1)(ii) on the same date as the date on which the records entity members of the corporate group of which it is the top-tier financial company are required to comply with this part.
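
    For illustration only, the following Python sketch expresses the tiered initial compliance dates of paragraph (d)(1)(i). It is not part of the rule text; the function and variable names are hypothetical, and the sketch assumes (per paragraphs (d)(1)(i)(A)(2), (B)(2), and (C)(2)) that a group member's tier is determined by the largest total assets of any records entity in its corporate group.

```python
# Illustrative sketch only: tiered initial compliance dates under
# § 148.1(d)(1)(i), keyed off the effective date of December 30, 2016.
# "group_total_assets" stands for the largest total assets of any records
# entity in the corporate group, so group members inherit the earliest tier.
from datetime import date, timedelta

EFFECTIVE_DATE = date(2016, 12, 30)

def initial_compliance_date(group_total_assets: float) -> date:
    """Initial compliance date for a records entity subject to this part on
    the effective date, per § 148.1(d)(1)(i)(A) through (D)."""
    if group_total_assets >= 1_000_000_000_000:   # $1 trillion or more: 540 days
        return EFFECTIVE_DATE + timedelta(days=540)
    if group_total_assets >= 500_000_000_000:     # $500 billion or more: two years
        return EFFECTIVE_DATE.replace(year=EFFECTIVE_DATE.year + 2)
    if group_total_assets >= 250_000_000_000:     # $250 billion or more: three years
        return EFFECTIVE_DATE.replace(year=EFFECTIVE_DATE.year + 3)
    return EFFECTIVE_DATE.replace(year=EFFECTIVE_DATE.year + 4)  # all others: four years

# Example: a records entity in a $600 billion group must comply two years after
# the effective date; the § 148.3(a)(2) point-of-contact notice is due 90 days
# after the effective date regardless of tier.
print(initial_compliance_date(600e9))   # 2018-12-30
```
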

    § 148.2 Definitions.

    For purposes of this part:

    (a) Affiliate means any entity that controls, is controlled by, or is under common control with another entity.

    (b) Control. An entity “controls” another entity if:

    (1) The entity directly or indirectly or acting through one or more other persons owns, controls, or has the power to vote 25 percent or more of any class of voting securities of the other entity;

    (2) The entity controls in any manner the election of a majority of the directors or trustees of the other entity; or

    (3) The Board of Governors of the Federal Reserve System has determined, after notice and opportunity for hearing in accordance with 12 CFR 225.31, that the entity directly or indirectly exercises a controlling influence over the management or policies of the other entity.

    (c) Corporate group means an entity and all affiliates of that entity.

    (d) Counterparty means any natural person or entity (or separate foreign branch or division of any entity) that is a party to a QFC with a records entity.

    (e) Derivative liabilities means the fair value of derivative instruments in a negative position as of the end of the most recent fiscal year end, as recognized and measured in accordance with U.S. generally accepted accounting principles or other applicable accounting standards. Such value shall be adjusted for the effects of master netting agreements and cash collateral held with the same counterparty on a net basis to the extent such adjustments are reflected on the audited consolidated statement of financial condition of the applicable financial company filed with its primary financial regulatory agency or agencies or, for financial companies not required to file such statements, on the consolidated balance sheet of the financial company prepared in accordance with U.S. generally accepted accounting principles or other applicable accounting standards.

    (f) Excluded entity means:

    (1) An insured depository institution as defined in 12 U.S.C. 1813(c)(2);

    (2) A subsidiary of an insured depository institution that is not:

    (i) A functionally regulated subsidiary as defined in 12 U.S.C. 1844(c)(5);

    (ii) A security-based swap dealer as defined in 15 U.S.C. 78c(a)(71); or

    (iii) A major security-based swap participant as defined in 15 U.S.C. 78c(a)(67); or

    (3) An insurance company.

    (g) Financial company has the meaning set forth in 12 U.S.C. 5381(a)(11).

    (h) Insurance company means:

    (1) An insurance company as defined in 12 U.S.C. 5381(a)(13); and

    (2) A mutual insurance holding company that meets the conditions set forth in 12 CFR 380.11 for being treated as an insurance company for the purpose of section 203(e) of the Dodd-Frank Act, 12 U.S.C. 5383(e).

    (i) Legal Entity Identifier or LEI for an entity shall mean the global legal entity identifier maintained for such entity by a utility accredited by the Global LEI Foundation or by a utility endorsed by the Regulatory Oversight Committee. As used in this definition:

    (1) Regulatory Oversight Committee means the Regulatory Oversight Committee (of the Global LEI System), whose charter was set forth by the Finance Ministers and Central Bank Governors of the Group of Twenty and the Financial Stability Board, or any successor thereof; and

    (2) Global LEI Foundation means the not-for-profit organization organized under Swiss law by the Financial Stability Board in 2014, or any successor thereof.

    (j) Parent entity with respect to an entity is an entity that controls that entity.

    (k) Position means an individual transaction under or evidenced by a QFC and includes the rights and obligations of a party to an individual transaction under or evidenced by a QFC.

    (l) Primary financial regulatory agency means:

    (1) With respect to any financial company, the primary financial regulatory agency as specified for such financial company in subparagraphs (A), (B), (C), and (E) of 12 U.S.C. 5301(12); and

    (2) With respect to a financial market utility that is subject to a designation pursuant to 12 U.S.C. 5463 for which there is no primary financial regulatory agency under § 148.2(l)(1), the Supervisory Agency for that financial market utility as defined in 12 U.S.C. 5462(8).

    (m) Qualified financial contract or QFC means any qualified financial contract defined in 12 U.S.C. 5390(c)(8)(D), including without limitation, any “swap” defined in section 1a(47) of the Commodity Exchange Act (7 U.S.C. 1a(47)) and in any rules or regulations issued by the Commodity Futures Trading Commission pursuant to such section; any “security-based swap” defined in section 3(a) of the Securities Exchange Act of 1934 (15 U.S.C. 78c(a)) and in any rules or regulations issued by the Securities and Exchange Commission pursuant to such section; and any securities contract, commodity contract, forward contract, repurchase agreement, swap agreement, and any similar agreement that the FDIC determines by regulation, resolution, or order to be a qualified financial contract as provided in 12 U.S.C. 5390(c)(8)(D).

    (n) Records entity

    (1) Records entity means any financial company that:

    (i) Is not an excluded entity as defined in § 148.2(f);

    (ii) Is a party to an open QFC; and

    (iii)(A) Is subject to a determination that the company shall be subject to Federal Reserve supervision and enhanced prudential standards pursuant to 12 U.S.C. 5323;

    (B) Is subject to a designation as, or as likely to become, systemically important pursuant to 12 U.S.C. 5463;

    (C) Is identified as a global systemically important bank holding company pursuant to 12 CFR part 217;

    (D)(1) Has total assets on a consolidated basis equal to or greater than $50 billion; and

    (2) On a consolidated basis has:

    (i) Total gross notional derivatives outstanding equal to or greater than $250 billion; or

    (ii) Derivative liabilities equal to or greater than $3.5 billion; or

    (E)(1) Is a member of a corporate group in which at least one financial company meets the criteria under one or more of paragraphs (n)(1)(iii)(A), (B), (C), or (D) of this section; and

    (2)(i) Consolidates, is consolidated by, or is consolidated with such financial company on financial statements prepared in accordance with U.S. generally accepted accounting principles or other applicable accounting standards; or

    (ii) For financial companies not subject to such principles or standards, would consolidate, be consolidated by, or be consolidated with such financial company if such principles or standards applied.

    (2) A financial company that qualifies as a records entity pursuant to paragraph (n)(1)(iii)(D) will remain a records entity until one year after it ceases to meet the criteria set forth in paragraph (n)(1)(iii)(D) of this section.

    (o) Secretary means the Secretary of the Treasury or the Secretary's designee.

    (p) Subsidiary means any company that is controlled by another company.

    (q) Top-tier financial company means a financial company that is a member of a corporate group consisting of multiple records entities and that is not itself controlled by another financial company.

    (r) Total assets means the total assets reported on the audited consolidated statement of financial condition of the applicable financial company for the most recent year end filed with its primary financial regulatory agency or agencies or, for financial companies not required to file such statements, the total assets shown on the consolidated balance sheet of the financial company for the most recent fiscal year end as prepared in accordance with U.S. generally accepted accounting principles or other applicable accounting standards.

    (s) Total gross notional derivatives outstanding means the gross notional value of all derivative instruments that are outstanding as of the most recent fiscal year end, as recognized and measured in accordance with U.S. generally accepted accounting principles or other applicable accounting standards.
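
    For illustration only, the following Python sketch expresses the asset and derivatives test of § 148.2(n)(1)(iii)(D) and the overall structure of the records entity definition in § 148.2(n)(1). The function names are hypothetical and the sketch is not part of the rule text; criteria (A) through (C) and the corporate-group test in (E) are represented only as inputs.

```python
# Illustrative sketch only: the $50 billion asset / derivatives test of
# § 148.2(n)(1)(iii)(D), one of several alternative criteria under which a
# financial company can be a records entity.

def meets_asset_derivatives_test(total_assets: float,
                                 gross_notional_derivatives: float,
                                 derivative_liabilities: float) -> bool:
    """§ 148.2(n)(1)(iii)(D): consolidated total assets of $50 billion or more
    AND either total gross notional derivatives outstanding of $250 billion or
    more OR derivative liabilities of $3.5 billion or more."""
    return (total_assets >= 50e9
            and (gross_notional_derivatives >= 250e9
                 or derivative_liabilities >= 3.5e9))

def is_records_entity(is_excluded_entity: bool,
                      has_open_qfc: bool,
                      meets_any_iii_criterion: bool) -> bool:
    """§ 148.2(n)(1): a financial company is a records entity if it is not an
    excluded entity, is a party to an open QFC, and meets at least one of the
    criteria in paragraph (n)(1)(iii) (including the group test in (E))."""
    return (not is_excluded_entity) and has_open_qfc and meets_any_iii_criterion

# Example: a non-excluded financial company with an open QFC, $60 billion in
# consolidated total assets, and $4 billion in derivative liabilities.
print(is_records_entity(False, True, meets_asset_derivatives_test(60e9, 100e9, 4e9)))
```
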

    § 148.3 Form, availability and maintenance of records.

    (a) Form and availability—(1) Electronic records. (i) Except to the extent of any relevant exemption provided under paragraph (c) of this section, a records entity is required to maintain the records described in § 148.4 in electronic form and, as applicable, in the format set forth in the tables in the appendix to this part.

    (ii) A top-tier financial company must be capable of generating a single, compiled set of the records required to be maintained by § 148.4(a)-(h), in a format that allows for aggregation and disaggregation of such data by records entity and counterparty, for all records entities in its corporate group that are consolidated by or consolidated with such top-tier financial company on financial statements prepared in accordance with U.S. generally accepted accounting principles or other applicable accounting standards or, for financial companies not subject to such principles or standards, that would be consolidated by or consolidated with such financial company if such principles or standards applied.

    (2) Point of contact. Each records entity and top-tier financial company must provide a point of contact who is responsible for recordkeeping under this part by written notice to its primary financial regulatory agency or agencies and the FDIC and must provide written notice to its primary financial regulatory agency or agencies and the FDIC within 30 days of any change in its point of contact.

    (3) Access to records. Except to the extent of any relevant exemption provided under paragraph (c) of this section, a records entity and a top-tier financial company that are regulated by a primary financial regulatory agency shall be capable of providing electronically to such primary financial regulatory agency and the FDIC, within 24 hours of request by the primary financial regulatory agency:

    (i) In the case of a records entity, the records specified in § 148.4, and

    (ii) In the case of a top-tier financial company, the set of records referenced in paragraph (a)(1)(ii) of this section.

    (b) Maintenance and updating—(1) Daily updating. Except to the extent of any relevant exemption provided under paragraph (c) of this section, the records maintained under § 148.4 shall be based on values and information that are no less current than previous end-of-day values and information.

    (2) Records maintenance. The records required under § 148.4 and the capability of generating the set of records required by paragraph (a)(1)(ii) of this section may be maintained on behalf of the records entity or top-tier financial company, as applicable, by any affiliate of such records entity or top-tier financial company, as applicable, or any third-party service provider; provided that such records entity shall itself maintain records under this part in the event that such affiliate or service provider shall fail to maintain such records and such top-tier financial company shall itself maintain the capability of generating the set of records required by paragraph (a)(1)(ii) of this section in the event that such affiliate or service provider shall fail to maintain the capability of doing so.

    (3) Record retention. A records entity shall retain records maintained under § 148.4 based on end-of-day values and information for the five preceding business days.

    (c) Exemptions—(1) De minimis exemption. A records entity that is a party to 50 or fewer open QFC positions is not required to maintain the records described in § 148.4, other than the records described in § 148.4(i).

    (2) Clearing organizations. A records entity that is a derivatives clearing organization registered with the Commodity Futures Trading Commission under section 5b of the Commodity Exchange Act (7 U.S.C. 7a-1) or a clearing agency registered with the Securities and Exchange Commission under section 17A of the Securities Exchange Act of 1934 (15 U.S.C. 78q-1) is not required to maintain the records described in § 148.4 if it is:

    (i) In compliance with the recordkeeping requirements of the Commodity Futures Trading Commission or the Securities and Exchange Commission, as applicable, including its maintenance of records pertaining to all QFCs cleared by such records entity; and

    (ii) Capable of and not restricted from, whether by law, regulation, or agreement, transmitting electronically to the FDIC the records maintained under such recordkeeping requirements within 24 hours of request of the Commodity Futures Trading Commission or the Securities and Exchange Commission, as applicable.

    (3) Requests for exemptions. One or more records entities may request an exemption from one or more of the requirements of this part by writing to the Department of the Treasury, the FDIC, and its primary financial regulatory agency or agencies, if any. The written request for an exemption must:

    (i) Identify the records entity or records entities or the types of records entities to which the exemption should apply;

    (ii) Specify the requirement(s) under this part from which the identified records entities should be exempt;

    (iii) Provide details as to the size, risk, complexity, leverage, frequency and dollar amount of qualified financial contracts, and interconnectedness to the financial system of each records entity identified in paragraph (c)(3)(i) of this section, to the extent appropriate, and any other relevant factors; and

    (iv) Specify the reason(s) why granting the exemption will not impair or impede the FDIC's ability to exercise its rights or fulfill its statutory obligations under 12 U.S.C. 5390(c)(8), (9), and (10).

    (4) Granting exemptions. (i) Upon receipt of a written recommendation from the FDIC, prepared in consultation with the primary financial regulatory agency or agencies for the applicable records entity or entities, that takes into consideration each of the factors referenced in 12 U.S.C. 5390(c)(8)(H)(iv) and any other factors the FDIC considers appropriate, the Secretary may grant, in whole or in part, a conditional or unconditional exemption from compliance with one or more of the requirements of this part by issuing an exemption to one or more records entities.

    (ii) In determining whether to grant an exemption to one or more records entities, including whether to grant a conditional or unconditional exemption, the Secretary will consider any factors deemed appropriate by the Secretary, including whether application of one or more requirements of this part is not necessary to achieve the purpose of this part as described in § 148.1(b).

    (iii) If the FDIC does not submit, within 90 days of the date on which the FDIC and the Department of the Treasury received the exemption request, a written recommendation to the Secretary as to whether to grant or deny an exemption request, the Secretary will nevertheless determine whether to grant or deny the exemption request.
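
    For illustration only, the following Python sketch shows how a records entity might organize daily record maintenance consistent with § 148.3(b) and check the de minimis exemption of § 148.3(c)(1). The class and method names are hypothetical; only the five-business-day retention window and the 50-position threshold come from the rule text.

```python
# Illustrative sketch only: end-of-day snapshots retained for the five
# preceding business days (§ 148.3(b)(1) and (b)(3)) and the de minimis
# threshold of § 148.3(c)(1). Record content is a placeholder dict here.
from collections import OrderedDict
from datetime import date

DE_MINIMIS_OPEN_POSITIONS = 50   # § 148.3(c)(1)
RETENTION_BUSINESS_DAYS = 5      # § 148.3(b)(3)

def is_de_minimis(open_qfc_positions: int) -> bool:
    """A records entity with 50 or fewer open QFC positions need only keep
    the documents described in § 148.4(i)."""
    return open_qfc_positions <= DE_MINIMIS_OPEN_POSITIONS

class DailyQfcRecords:
    """Keeps the most recent end-of-day record snapshots, one per business day."""
    def __init__(self) -> None:
        self._snapshots: "OrderedDict[date, dict]" = OrderedDict()

    def store_end_of_day(self, business_day: date, records: dict) -> None:
        # Records must be no less current than previous end-of-day values
        # (§ 148.3(b)(1)); snapshots older than the retention window are dropped.
        self._snapshots[business_day] = records
        while len(self._snapshots) > RETENTION_BUSINESS_DAYS:
            self._snapshots.popitem(last=False)   # discard the oldest business day

    def snapshot(self, business_day: date) -> dict:
        """Return stored end-of-day records for a retained business day, e.g.,
        to answer a 24-hour request under § 148.3(a)(3)."""
        return self._snapshots[business_day]
```
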

    § 148.4 Content of records.

    Subject to § 148.3(c), a records entity must maintain the following records:

    (a) The position level data listed in Table A-1 in appendix A to this part with respect to each QFC to which it is a party.

    (b) The counterparty netting set data listed in Table A-2 in appendix A to this part for each netting set with respect to each QFC to which it is a party.

    (c) The legal agreements information listed in Table A-3 in appendix A to this part with respect to each QFC to which it is a party.

    (d) The collateral detail data listed in Table A-4 in appendix A to this part with respect to each QFC to which it is a party.

    (e) The corporate organization master data lookup table in appendix A to this part for the records entity and each of its affiliates.

    (f) The counterparty master data lookup table in appendix A to this part for each non-affiliated counterparty with respect to QFCs to which it is a party.

    (g) The booking location master data lookup table in appendix A to this part for each booking location used with respect to QFCs to which it is a party.

    (h) The safekeeping agent master data lookup table in the appendix to this part for each safekeeping agent used with respect to QFCs to which it is a party.

    (i) All documents that govern QFC transactions between the records entity and each counterparty, including, without limitation, master agreements and annexes, schedules, netting agreements, supplements, or other modifications with respect to the agreements, confirmations for each open QFC position of the records entity that has been confirmed and all trade acknowledgments for each open QFC position that has not been confirmed, all credit support documents including, but not limited to, credit support annexes, guarantees, keep-well agreements, or net worth maintenance agreements that are relevant to one or more QFCs, and all assignment or novation documents, if applicable, including documents that confirm that all required consents, approvals, or other conditions precedent for such assignment or novation have been obtained or satisfied.

    (j) A list of vendors directly supporting the QFC-related activities of the records entity and the vendors' contact information.
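
    For illustration only, the following Python sketch groups the record sets required by § 148.4 into a single hypothetical container. The attribute names are illustrative and not part of the rule; the field-level content of each record set is specified in Tables A-1 through A-4 and the master lookup tables in appendix A.

```python
# Illustrative sketch only: one hypothetical way to hold the record sets a
# records entity must maintain under § 148.4. Each dict stands in for a row
# whose fields are defined in appendix A.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class QfcRecordSet:
    position_level_data: List[dict] = field(default_factory=list)            # Table A-1, per QFC
    counterparty_netting_set_data: List[dict] = field(default_factory=list)  # Table A-2, per netting set
    legal_agreements: List[dict] = field(default_factory=list)               # Table A-3, per QFC
    collateral_detail_data: List[dict] = field(default_factory=list)         # Table A-4, per QFC
    corporate_organization: Dict[str, dict] = field(default_factory=dict)    # master lookup, entity id -> row
    counterparties: Dict[str, dict] = field(default_factory=dict)            # master lookup, counterparty id -> row
    booking_locations: Dict[str, dict] = field(default_factory=dict)         # master lookup
    safekeeping_agents: Dict[str, dict] = field(default_factory=dict)        # master lookup
    governing_documents: List[str] = field(default_factory=list)             # § 148.4(i): master agreements, annexes, etc.
    vendor_list: List[dict] = field(default_factory=list)                    # § 148.4(j): vendors and contact information
```
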

    Appendix A to Part 148—File Structure for Qualified Financial Contract Records

    Table A-1—Position-Level Data
    Columns: Field; Example; Instructions and data application; Definition; Validation.

A1.1 As of date 2015-01-05 Provide data extraction date YYYY-MM-DD
A1.2 Records entity identifier 999999999 Provide LEI for records entity. Information needed to review position-level data by records entity Varchar(50) Validated against CO.2.
A1.3 Position identifier 20058953 Provide a position identifier. Should be used consistently across all record entities within the corporate group. Use the unique transaction identifier if available. Information needed to readily track and distinguish positions Varchar(100).
A1.4 Counterparty identifier 888888888 Provide a counterparty identifier. Use LEI if counterparty has one. Should be used consistently by all record entities within the corporate group. Information needed to identify counterparty by reference to Counterparty Master Table Varchar(50) Validated against CP.2.
A1.5 Internal booking location identifier New York, New York Provide office where the position is booked. Information needed to determine system on which the trade is booked and settled Varchar(50) Combination A1.2 + A1.5 + A1.6 should have a corresponding unique combination BL.2 + BL.3 + BL.4 entry in Booking Location Master Table.
A1.6 Unique booking unit or desk identifier xxxxxx Provide an identifier for unit or desk at which the position is booked. Information needed to help determine purpose of position Varchar(50) Combination A1.2 + A1.5 + A1.6 should have a corresponding unique combination BL.2 + BL.3 + BL.4 entry in Booking Location Master Table.
A1.7 Type of QFC Credit, equity, foreign exchange, interest rate (including cross-currency), other commodity, securities repurchase agreement, securities lending, loan repurchase agreement, guarantee or other third party credit enhancement of a QFC Provide type of QFC. Use unique product identifier if available. Information needed to determine the nature of the QFC Varchar(100).
A1.7.1 Type of QFC covered by guarantee or other third party credit enhancement Credit, equity, foreign exchange, interest rate (including cross-currency), other commodity, securities repurchase agreement, securities lending, or loan repurchase agreement If QFC type is guarantee or other third party credit enhancement, provide type of QFC of the QFC that is covered by such guarantee or other third party credit enhancement. Use unique product identifier if available. If multiple asset classes are covered by the guarantee or credit enhancement, enter the asset classes separated by comma. If all the QFCs of the underlying QFC obligor identifier are covered by the guarantee or other third party credit enhancement, enter “All” Varchar(500) Only required if QFC type (A1.7) is a guarantee or other third party credit enhancement.
A1.7.2 Underlying QFC obligor identifier 888888888 If QFC type is guarantee or other third party credit enhancement, provide an identifier for the QFC obligor whose obligation is covered by the guarantee or other third party credit enhancement. Use LEI if underlying QFC obligor has one. Complete the counterparty master table with respect to a QFC obligor that is a non-affiliate Varchar(50) Only required if QFC asset type (A1.7) is a guarantee or other third party credit enhancement. Validated against CO.2 if affiliate or CP.2 if non-affiliate.
A1.8 Agreement identifier xxxxxxxxx Provide an identifier for the primary governing documentation, e.g., the master agreement or guarantee agreement, as applicable Varchar(50) Validated against A3.3.
A1.9 Netting agreement identifier xxxxxxxxx Provide an identifier for netting agreement. If this agreement is the same as provided in A1.8, use same identifier. Information needed to identify unique netting sets Varchar(50) Validated against A3.3.
A1.10 Netting agreement counterparty identifier xxxxxxxxx Provide a netting agreement counterparty identifier. Use same identifier as provided in A1.4 if counterparty and netting agreement counterparty are the same. Use LEI if netting agreement counterparty has one. Information needed to identify unique netting sets Varchar(50) Validated against CP.2.
A1.11 Trade date 2014-12-20 Provide trade or other commitment date for the QFC. Information needed to determine when the entity's rights and obligations regarding the position originated YYYY-MM-DD.
A1.12 Termination date 2014-03-31 Provide date the QFC terminates or is expected to terminate, expire, mature, or when final performance is required. Information needed to determine when the entity's rights and obligations regarding the position are expected to end YYYY-MM-DD.
A1.13 Next call, put, or cancellation date 2015-01-25 Provide next call, put, or cancellation date YYYY-MM-DD.
A1.14 Next payment date 2015-01-25 Provide next payment date YYYY-MM-DD.
A1.15 Local currency of position USD Provide currency in which QFC is denominated. Use ISO currency code Char(3).
A1.16 Current market value of the position in local currency 995000 Provide current market value of the position in local currency. In the case of a guarantee or other third party credit enhancements, provide the current mark-to-market expected value of the exposure. Information needed to determine the current size of the obligation or benefit associated with the QFC Num (25,5).
A1.17 Current market value of the position in U.S. dollars 995000 In the case of a guarantee or other third party credit enhancements, provide the current mark-to-market expected value of the exposure. Information needed to determine the current size of the obligation/benefit associated with the QFC Num (25,5).
A1.18 Asset classification 1 Provide fair value asset classification under GAAP, IFRS, or other accounting principles or standards used by records entity. Provide “1” for Level 1, “2” for Level 2, or “3” for Level 3. Information needed to assess fair value of the position Char(1).
A1.19 Notional or principal amount of the position in local currency 1000000 Provide the notional or principal amount, as applicable, in local currency. In the case of a guarantee or other third party credit enhancement, provide the maximum possible exposure. Information needed to help evaluate the position Num (25,5).
A1.20 Notional or principal amount of the position in U.S. dollars 1000000 Provide the notional or principal amount, as applicable, in U.S. dollars. In the case of a guarantee or other third party credit enhancements, provide the maximum possible exposure. Information needed to help evaluate the position Num (25,5).
A1.21 Covered by third-party credit enhancement agreement (for the benefit of the records entity)? Y/N Indicate whether QFC is covered by a guarantee or other third-party credit enhancement. Information needed to determine credit enhancement Char(1) Should be “Y” or “N.”
A1.21.1 Third-party credit enhancement provider identifier (for the benefit of the records entity) 999999999 If QFC is covered by a guarantee or other third-party credit enhancement, provide an identifier for provider. Use LEI if available. Complete the counterparty master table with respect to a provider that is a non-affiliate Varchar(50) Required if A1.21 is “Y”. Validated against CP.2.
A1.21.2 Third-party credit enhancement agreement identifier (for the benefit of the records entity) 4444444 If QFC is covered by a guarantee or other third-party credit enhancement, provide an identifier for the agreement Varchar(50) Required if A1.21 is “Y”. Validated against A3.3.
A1.21.3 Covered by third-party credit enhancement agreement (for the benefit of the counterparty)? Y/N Indicate whether QFC is covered by a guarantee or other third-party credit enhancement. Information needed to determine credit enhancement Char(1) Should be “Y” or “N.”
A1.21.4 Third-party credit enhancement provider identifier (for the benefit of the counterparty) 999999999 If QFC is covered by a guarantee or other third-party credit enhancement, provide an identifier for provider. Use LEI if available. Complete the counterparty master table with respect to a provider that is a non-affiliate Varchar(50) Required if A1.21.3 is “Y”. Validated against CO.2 or CP.2.
A1.21.5 Third-party credit enhancement agreement identifier (for the benefit of the counterparty) 4444444 If QFC is covered by a guarantee or other third-party credit enhancement, provide an identifier for agreement Varchar(50) Required if A1.21.3 is “Y”. Validated against A3.3.
A1.22 Related position of records entity 3333333 Use this field to link any related positions of the records entity. All positions that are related to one another should have same designation in this field Varchar(100).
A1.23 Reference number for any related loan 9999999 Provide a unique reference number for any loan held by the records entity or a member of its corporate group related to the position (with multiple entries delimited by commas) Varchar(500).
A1.24 Identifier of the lender of the related loan 999999999 For any loan recorded in A1.23, provide identifier for records entity or member of its corporate group that holds any related loan. Use LEI if entity has one Varchar(500).

    Table A-2—Counterparty Netting Set Data
    Columns: Field; Example; Instructions and data application; Definition; Validation.

A2.1 As of date 2015-01-05 Data extraction date YYYY-MM-DD.
A2.2 Records entity identifier 999999999 Provide the LEI for the records entity Varchar(50) Validated against CO.2.
A2.3 Netting agreement counterparty identifier 888888888 Provide an identifier for the netting agreement counterparty. Use LEI if counterparty has one Varchar(50) Validated against CP.2.
A2.4 Netting agreement identifier xxxxxxxxx Provide an identifier for the netting agreement Varchar(50) Validated against A3.3.
A2.4.1 Underlying QFC obligor identifier 888888888 Provide identifier for underlying QFC obligor if netting agreement is associated with a guarantee or other third party credit enhancement. Use LEI if available Varchar(50) Validated against CO.2 or CP.2.
A2.5 Covered by third-party credit enhancement agreement (for the benefit of the records entity)? Y/N Indicate whether the positions subject to the netting set agreement are covered by a third-party credit enhancement agreement Char(1) Should be “Y” or “N.”
A2.5.1 Third-party credit enhancement provider identifier (for the benefit of the records entity) 999999999 Use LEI if available. Information needed to identify third-party credit enhancement provider Varchar(50) Required if A2.5 is “Y”. Validated against CP.2.
A2.5.2 Third-party credit enhancement agreement identifier (for the benefit of the records entity) 4444444 Varchar(50) Required if A2.5 is “Y”. Validated against A3.3.
A2.5.3 Covered by third-party credit enhancement agreement (for the benefit of the counterparty)? Y/N Information needed to determine credit enhancement Char(1) Should be “Y” or “N.”
A2.5.4 Third-party credit enhancement provider identifier (for the benefit of the counterparty) 999999999 Use LEI if available. Information needed to identify third-party credit enhancement provider Varchar(50) Required if A2.5.3 is “Y”. Should be a valid entry in the Counterparty Master Table. Validated against CP.2.
A2.5.5 Third-party credit enhancement agreement identifier (for the benefit of the counterparty) 4444444 Information used to determine guarantee or other third-party credit enhancement Varchar(50) Required if A2.5.3 is “Y”. Validated against A3.3.
A2.6 Aggregate current market value in U.S. dollars of all positions under this netting agreement −1000000 Information needed to help evaluate the positions subject to the netting agreement Num (25,5) Market value of all positions in A1 for the given netting agreement identifier should be equal to this value. A2.6 = A2.7 + A2.8.
A2.7 Current market value in U.S. dollars of all positive positions, as aggregated under this netting agreement 3000000 Information needed to help evaluate the positions subject to the netting agreement Num (25,5) Market value of all positive positions in A1 for the given netting agreement identifier should be equal to this value. A2.6 = A2.7 + A2.8.
A2.8 Current market value in U.S. dollars of all negative positions, as aggregated under this netting agreement −4000000 Information needed to help evaluate the positions subject to the netting agreement Num (25,5) Market value of all negative positions in A1 for the given netting agreement identifier should be equal to this value. A2.6 = A2.7 + A2.8.
A2.9 Current market value in U.S. dollars of all collateral posted by records entity, as aggregated under this netting agreement 950000 Information needed to determine the extent to which collateral has been provided by records entity Num (25,5) Market value of all collateral posted by records entity for the given netting agreement identifier should be equal to sum of all A4.9 for the same netting agreement identifier in A4.
A2.10 Current market value in U.S. dollars of all collateral posted by counterparty, as aggregated under this netting agreement 50000 Information needed to determine the extent to which collateral has been provided by counterparty Num (25,5) Market value of all collateral posted by counterparty for the given netting agreement identifier should be equal to sum of all A4.9 for the same netting agreement identifier in A4.
A2.11 Current market value in U.S. dollars of all collateral posted by records entity that is subject to re-hypothecation, as aggregated under this netting agreement 950000 Information needed to determine the extent to which collateral has been provided by records entity Num (25,5).
A2.12 Current market value in U.S. dollars of all collateral posted by counterparty that is subject to re-hypothecation, as aggregated under this netting agreement 950000 Information needed to determine the extent to which collateral has been provided by counterparty Num (25,5).
A2.13 Records entity collateral—net 950000 Provide records entity's collateral excess or deficiency with respect to all of its positions, as determined under each applicable agreement, including thresholds and haircuts where applicable Num (25,5) Should be less than or equal to A2.9.
A2.14 Counterparty collateral—net 950000 Provide counterparty's collateral excess or deficiency with respect to all of its positions, as determined under each applicable agreement, including thresholds and haircuts where applicable Num (25,5) Should be less than or equal to A2.10.
A2.15 Next margin payment date 2015-11-05 Provide next margin payment date for position YYYY-MM-DD.
A2.16 Next margin payment amount in U.S. dollars 150000 Use positive value if records entity is due a payment and use negative value if records entity has to make the payment Num (25,5).
A2.17 Safekeeping agent identifier for records entity 888888888 Provide an identifier for the records entity's safekeeping agent, if any. Use LEI if safekeeping agent has one Varchar(50) Validated against SA.2.
A2.18 Safekeeping agent identifier for counterparty 888888888 Provide an identifier for the counterparty's safekeeping agent, if any. Use LEI if safekeeping agent has one Varchar(50) Validated against SA.2.

    Table A-3—Legal Agreements
    Columns: Field; Example; Instructions and data application; Definition; Validation.

A3.1 As of date 2015-01-05 Data extraction date YYYY-MM-DD.
A3.2 Records entity identifier 999999999 Provide LEI for records entity Varchar(50) Validated against CO.2.
A3.3 Agreement identifier xxxxxx Provide identifier for each master agreement, governing document, netting agreement or third-party credit enhancement agreement Varchar(50).
A3.4 Name of agreement or governing document ISDA Master 1992 or Guarantee Agreement or Master Netting Agreement Provide name of agreement or governing document Varchar(50).
A3.5 Agreement date 2010-01-25 Provide the date of the agreement YYYY-MM-DD.
A3.6 Agreement counterparty identifier 888888888 Use LEI if counterparty has one. Information needed to identify counterparty Varchar(50) Validated against field CP.2.
A3.6.1 Underlying QFC obligor identifier 888888888 Provide underlying QFC obligor identifier if document identifier is associated with a guarantee or other third party credit enhancement. Use LEI if underlying QFC obligor has one Varchar(50) Validated against CO.2 or CP.2.
A3.7 Agreement governing law New York Provide law governing contract disputes Varchar(50).
A3.8 Cross-default provision? Y/N Specify whether agreement includes default or other termination event provisions that reference an entity not a party to the agreement (“cross-default entity”). Information needed to determine exposure to affiliates or other entities Char(1) Should be “Y” or “N.”
A3.9 Identity of cross-default entities 777777777 Provide identity of any cross-default entities referenced in A3.8. Use LEI if entity has one. Information needed to determine exposure to other entities Varchar(500) Required if A3.8 is “Y”. ID should be a valid entry in Corporate Org Master Table or Counterparty Master Table, if applicable. Multiple entries comma separated.
A3.10 Covered by third-party credit enhancement agreement (for the benefit of the records entity)? Y/N Information needed to determine credit enhancement Char(1) Should be “Y” or “N.”
A3.11 Third-party credit enhancement provider identifier (for the benefit of the records entity) 999999999 Use LEI if available. Information needed to identify third-party credit enhancement provider Varchar(50) Required if A3.10 is “Y”. Should be a valid entry in the Counterparty Master Table. Validated against CP.2.
A3.12 Associated third-party credit enhancement agreement document identifier (for the benefit of the records entity) 33333333 Information needed to determine credit enhancement Varchar(50) Required if A3.10 is “Y”. Validated against field A3.3.
A3.12.1 Covered by third-party credit enhancement agreement (for the benefit of the counterparty)? Y/N Information needed to determine credit enhancement Char(1) Should be “Y” or “N.”
A3.12.2 Third-party credit enhancement provider identifier (for the benefit of the counterparty) 999999999 Use LEI if available. Information needed to identify third-party credit enhancement provider Varchar(50) Required if A3.12.1 is “Y”. Should be a valid entry in the Counterparty Master Table. Validated against CP.2.
A3.12.3 Associated third-party credit enhancement agreement document identifier (for the benefit of the counterparty) 33333333 Information needed to determine credit enhancement Varchar(50) Required if A3.12.1 is “Y”. Validated against field A3.3.
A3.13 Counterparty contact information: name John Doe & Co Provide contact name for counterparty as provided under notice section of agreement Varchar(200).
A3.14 Counterparty contact information: address 123 Main St, City, State Zip code Provide contact address for counterparty as provided under notice section of agreement Varchar(100).
A3.15 Counterparty contact information: phone 1-999-999-9999 Provide contact phone number for counterparty as provided under notice section of agreement Varchar(50).
A3.16 Counterparty contact information: email address [email protected] Provide contact email address for counterparty as provided under notice section of agreement Varchar(100).

    Table A-4—Collateral Detail Data
    Columns: Field; Example; Instructions and data application; Definition; Validation.

A4.1 As of date 2015-01-05 Data extraction date YYYY-MM-DD.
A4.2 Records entity identifier 999999999 Provide LEI for records entity Varchar(50) Validated against CO.2.
A4.3 Collateral posted/collateral received flag P/R Enter “P” if collateral has been posted by the records entity. Enter “R” for collateral received by records entity Char(1).
A4.4 Counterparty identifier 888888888 Provide identifier for counterparty. Use LEI if counterparty has one Varchar(50) Validated against CP.2.
A4.5 Netting agreement identifier xxxxxxxxx Provide identifier for applicable netting agreement Varchar(50) Validated against field A3.3.
A4.6 Unique collateral item identifier CUSIP/ISIN Provide identifier to reference individual collateral posted Varchar(50).
A4.7 Original face amount of collateral item in local currency 1500000 Information needed to evaluate collateral sufficiency and marketability Num (25,5).
A4.8 Local currency of collateral item USD Use ISO currency code Char(3).
A4.9 Market value amount of collateral item in U.S. dollars 850000 Information needed to evaluate collateral sufficiency and marketability and to permit aggregation across currencies Num (25,5) Market value of all collateral posted by records entity or counterparty (A2.9 or A2.10) for the given netting agreement identifier should be equal to sum of all A4.9 for the same netting agreement identifier in A4.
A4.10 Description of collateral item U.S. Treasury Strip, maturity 2020/6/30 Information needed to evaluate collateral sufficiency and marketability Varchar(200).
A4.11 Asset classification 1 Provide fair value asset classification for the collateral item under GAAP, IFRS, or other accounting principles or standards used by records entity. Provide “1” for Level 1, “2” for Level 2, or “3” for Level 3 Char(1) Should be “1” or “2” or “3.”
A4.12 Collateral or portfolio segregation status Y/N Specify whether the specific item of collateral or the related collateral portfolio is segregated from assets of the safekeeping agent Char(1) Should be “Y” or “N.”
A4.13 Collateral location ABC broker-dealer (in safekeeping account of counterparty) Provide location of collateral posted Varchar(200).
A4.14 Collateral jurisdiction New York, New York Provide jurisdiction of location of collateral posted Varchar(50).
A4.15 Is collateral re-hypothecation allowed? Y/N Information needed to evaluate exposure of the records entity to the counterparty or vice-versa for re-hypothecated collateral Char(1) Should be “Y” or “N.”

    Corporate Organization Master Table 1
    Columns: Field; Example; Instructions and data application; Definition; Validation.

CO.1 As of date 2015-01-05 Data extraction date YYYY-MM-DD.
CO.2 Entity identifier 888888888 Provide unique identifier. Use LEI if available. Information needed to identify entity Varchar(50) Should be unique across all record entities.
CO.3 Has LEI been used for entity identifier? Y/N Specify whether the entity identifier provided is an LEI Char(1) Should be “Y” or “N.”
CO.4 Legal name of entity John Doe & Co Provide legal name of entity Varchar(200).
CO.5 Immediate parent entity identifier 77777777 Use LEI if available. Information needed to complete org structure Varchar(50).
CO.6 Has LEI been used for immediate parent entity identifier? Y/N Specify whether the immediate parent entity identifier provided is an LEI Char(1) Should be “Y” or “N.”
CO.7 Legal name of immediate parent entity John Doe & Co Information needed to complete org structure Varchar(200).
CO.8 Percentage ownership of immediate parent entity in the entity 100.00 Information needed to complete org structure Num (5,2).
CO.9 Entity type Subsidiary, foreign branch, or foreign division Information needed to complete org structure Varchar(50).
CO.10 Domicile New York, New York Enter as city, state or city, foreign country Varchar(50).
CO.11 Jurisdiction under which incorporated or organized New York Enter as state or foreign jurisdiction Varchar(50).
CO.12 Reporting status REN Indicate one of the following, as appropriate, given status of entity under this part. Information needed to validate compliance with the requirements of this part:
  REN = Records entity (reporting).
  NFC = Non-financial company (not reporting).
  EXC = Excluded entity (not reporting).
  ZER = Records entity with 0 QFCs (not reporting).
  DEM = Records entity de minimis exemption (not reporting).
  OTH = Records entity using another exemption (not reporting).
  Char(3) Should be “REN” or “NFC” or “EXC” or “DEM” or “ZER” or “OTH.”

    1 Foreign branches and divisions shall be separately identified to the extent they are identified in an entity's reports to its PFRAs.
    Counterparty Master Table
    Columns: Field; Example; Instructions and data application; Definition; Validation.

CP.1 As of date 2015-01-05 Data extraction date YYYY-MM-DD.
CP.2 Counterparty identifier 888888888 Use LEI if counterparty has one. Should be used consistently across all records entities within a corporate group. The counterparty identifier shall be the global legal entity identifier if one has been issued to the entity. If a counterparty transacts with the records entity through one or more separate foreign branches or divisions and any such branch or division does not have its own unique global legal entity identifier, the records entity must include additional identifiers, as appropriate to enable the FDIC to aggregate or disaggregate the data for each counterparty and for each entity with the same ultimate parent entity as the counterparty Varchar(50).
CP.3 Has LEI been used for counterparty identifier? Y/N Indicate whether the counterparty identifier is an LEI Char(1) Should be “Y” or “N.”
CP.4 Legal name of counterparty John Doe & Co Information needed to identify and, if necessary, communicate with counterparty Varchar(200).
CP.5 Domicile New York, New York Enter as city, state or city, foreign country Varchar(50).
CP.6 Jurisdiction under which incorporated or organized New York Enter as state or foreign jurisdiction Varchar(50).
CP.7 Immediate parent entity identifier 77777777 Provide an identifier for the parent entity that directly controls the counterparty. Use LEI if immediate parent entity has one Varchar(50).
CP.8 Has LEI been used for immediate parent entity identifier? Y/N Indicate whether the immediate parent entity identifier is an LEI Char(1) Should be “Y” or “N.”
CP.9 Legal name of immediate parent entity John Doe & Co Information needed to identify and, if necessary, communicate with counterparty Varchar(200).
CP.10 Ultimate parent entity identifier 666666666 Provide an identifier for the parent entity that is a member of the corporate group of the counterparty that is not controlled by another entity. Information needed to identify counterparty. Use LEI if ultimate parent entity has one Varchar(50).
CP.11 Has LEI been used for ultimate parent entity identifier? Y/N Indicate whether the ultimate parent entity identifier is an LEI Char(1) Should be “Y” or “N.”
CP.12 Legal name of ultimate parent entity John Doe & Co Information needed to identify and, if necessary, communicate with counterparty Varchar(100).

    Booking Location Master Table
    Columns: Field; Example; Instructions and data application; Definition; Validation.

BL.1 As of date 2015-01-05 Data extraction date YYYY-MM-DD.
BL.2 Records entity identifier 999999999 Provide LEI Varchar(50) Should be a valid entry in the Corporate Org Master Table.
BL.3 Internal booking location identifier New York, New York Provide office where the position is booked. Information needed to determine the headquarters or branch where the position is booked, including the system on which the trade is booked, as well as the system on which the trade is settled Varchar(50).
BL.4 Unique booking unit or desk identifier xxxxxx Provide unit or desk at which the position is booked. Information needed to help determine purpose of position Varchar(50).
BL.5 Unique booking unit or desk description North American trading desk Additional information to help determine purpose of position Varchar(50).
BL.6 Booking unit or desk contact—phone 1-999-999-9999 Information needed to communicate with the booking unit or desk Varchar(50).
BL.7 Booking unit or desk contact—email [email protected] Information needed to communicate with the booking unit or desk Varchar(100).

    Safekeeping Agent Master Table
    Columns: Field; Example; Instructions and data application; Definition; Validation.

SA.1 As of date 2015-01-05 Data extraction date YYYY-MM-DD.
SA.2 Safekeeping agent identifier 888888888 Provide an identifier for the safekeeping agent. Use LEI if safekeeping agent has one Varchar(50).
SA.3 Legal name of safekeeping agent John Doe & Co Information needed to identify and, if necessary, communicate with the safekeeping agent Varchar(200).
SA.4 Point of contact—name John Doe Information needed to identify and, if necessary, communicate with the safekeeping agent Varchar(200).
SA.5 Point of contact—address 123 Main St, City, State Zip Code Information needed to identify and, if necessary, communicate with the safekeeping agent Varchar(100).
SA.6 Point of contact—phone 1-999-999-9999 Information needed to identify and, if necessary, communicate with the safekeeping agent Varchar(50).
SA.7 Point of contact—email [email protected] Information needed to identify and, if necessary, communicate with the safekeeping agent Varchar(100).

    Details of Formats
    Columns: Format; Content in brief; Additional explanation; Examples.

YYYY-MM-DD. Date. YYYY = four digit year, MM = 2 digit month, DD = 2 digit day. Example: 2015-11-12.
Num (25,5). Up to 25 numerical characters including 5 decimals. Up to 20 numerical characters before the decimal point and up to 5 numerical characters after the decimal point; the dot character is used to separate decimals. Examples: 1352.67; 12345678901234567890.12345; 0; −20000.25; −0.257.
Char(3). 3 alphanumeric characters. The length is fixed at 3 alphanumeric characters. Examples: USD; X1X; 999.
Varchar(25). Up to 25 alphanumeric characters. The length is not fixed but limited at up to 25 alphanumeric characters. Example: asgaGEH3268EFdsagtTRCF543.
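
    For illustration only, the following Python sketch shows simple validators for the data formats described under “Details of Formats” above, plus the netting-set consistency check noted in Table A-2 (A2.6 = A2.7 + A2.8). The helper names are hypothetical; the rule prescribes the formats themselves, not any particular validation code.

```python
# Illustrative sketch only: lightweight checks for the appendix A data formats
# and the Table A-2 netting-set identity. Not part of the rule text.
import re

DATE_RE = re.compile(r"^\d{4}-(0[1-9]|1[0-2])-(0[1-9]|[12]\d|3[01])$")   # YYYY-MM-DD
NUM_25_5_RE = re.compile(r"^-?\d{1,20}(\.\d{1,5})?$")                    # Num (25,5)
CHAR_3_RE = re.compile(r"^[A-Za-z0-9]{3}$")                              # Char(3)

def is_varchar(value: str, max_length: int) -> bool:
    """Varchar(n): up to n characters (length not fixed)."""
    return 0 < len(value) <= max_length

def netting_set_consistent(a2_6: float, a2_7: float, a2_8: float,
                           tolerance: float = 0.00001) -> bool:
    """Table A-2 validation: aggregate market value of all positions (A2.6)
    should equal positive positions (A2.7) plus negative positions (A2.8)."""
    return abs(a2_6 - (a2_7 + a2_8)) <= tolerance

# Example checks against the sample values shown in the tables above.
assert DATE_RE.match("2015-01-05")
assert NUM_25_5_RE.match("-4000000")
assert CHAR_3_RE.match("USD")
assert netting_set_consistent(-1000000, 3000000, -4000000)
```
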
    Dated: October 13, 2016. Amias Moore Gerety, Acting Assistant Secretary for Financial Institutions.
    [FR Doc. 2016-25329 Filed 10-28-16; 8:45 am] BILLING CODE 4810-25-P