
  • PENNSYLVANIA PUBLIC UTILITY COMMISSION

    Amended Reliability Benchmarks and Standards for Electric Distribution Companies

    [34 Pa.B. 2764]

       Public Meeting held
    May 7, 2004

    Commissioners Present:  Terrance J. Fitzpatrick, Chairperson; Robert K. Bloom, Vice Chairperson; Glen R. Thomas; Kim Pizzingrilli; Wendell F. Holland

    Amended Reliability Benchmarks and Standards for the Electric Distribution Companies; Doc. No. M-00991220

    Order

    By the Commission:

   Today, in conjunction with our Final Rulemaking Order at Docket No. L-00030161, we tighten our standards for reliability performance in the electric distribution industry and reiterate the Commission's regulations regarding what qualifies an interruption as a major event, as well as the process for filing formal requests for waivers from submitting reliability data for a reporting period.

    Procedural History

       The Electricity Generation Customer Choice and Competition Act (Act), December 3, 1996, P. L. 802, No. 138 § 4, became effective January 1, 1997. The Act amends 66 Pa.C.S. by adding Chapter 28 to establish standards and procedures to create direct access by retail customers to the competitive market for the generation of electricity, while maintaining the safety and reliability of the electric system. Specifically, the Commission was given a legislative mandate to ensure that levels of reliability that were present prior to the restructuring of the electric utility industry would continue in the new competitive markets. 66 Pa.C.S. § 2802(12).

       In response to this legislative mandate, the Commission adopted a Final Rulemaking Order on April 23, 1998, at Docket No. L-00970120, setting forth various reporting requirements designed to ensure the continuing safety, adequacy and reliability of the transmission and distribution of electricity in the Commonwealth. See 52 Pa. Code §§ 57.191--57.197. The Final Rulemaking Order acknowledged that the Commission could reevaluate its monitoring efforts at a later time as deemed appropriate.

       On December 16, 1999, the Commission entered a Final Order at M-00991220, which established reliability benchmarks and standards1 for the electric distribution companies (EDC) in accordance with 52 Pa. Code § 57.194(h). The Commission's regulations for Electric Reliability Standards at 52 Pa. Code § 57.194(h)(1) state that:

    ''In cooperation with an electric distribution company and other affected parties, the Commission will, from time to time, establish numerical values for each reliability index or other measure of reliability performance that identify the benchmark performance of an electric distribution company, and performance standards.''

   In a series of orders at Docket No. M-00991220, the Commission established reliability Benchmarks and Standards regarding: (1) Customer Average Interruption Duration Index (CAIDI); (2) System Average Interruption Frequency Index (SAIFI); (3) System Average Interruption Duration Index (SAIDI); and (4) Momentary Average Interruption Frequency Index (MAIFI).2 The benchmark for each performance index is company-specific: the average of the historical annual averages of the index for the 5-year period from 1994-1998. The standard is two standard deviations from the benchmark. These benchmarks and standards have remained in effect since their issuance in December 1999.
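   For illustration only, the benchmark and two-standard-deviation standard described above can be sketched in Python; the annual index values below are hypothetical and are not taken from any EDC's filings:

```python
from statistics import mean, stdev

# Hypothetical annual SAIFI values for one EDC, 1994-1998 (illustrative only).
annual_saifi = [1.10, 1.25, 0.95, 1.30, 1.15]

# Benchmark: average of the historical annual averages for the 5-year period.
benchmark = mean(annual_saifi)

# Standard (as set in 1999): two standard deviations above the benchmark.
standard = benchmark + 2 * stdev(annual_saifi)

print(f"benchmark = {benchmark:.3f}")  # 1.150
print(f"standard  = {standard:.3f}")   # 1.424
```

   Because a higher index value means worse reliability (more or longer interruptions), performance "within the standard" means an annual index value at or below this ceiling.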

       In June 2002, the Legislative Budget and Finance Committee (LB&FC) issued a report entitled Assessing the Reliability of Pennsylvania's Electric Transmission and Distribution Systems. The report, in part, concluded that the two-standard deviation minimum performance standard is too loose and should be tightened as it does not assure that reliability performance will be maintained at levels experienced prior to the Act, December 3, 1996, P. L. 802, No. 138 § 4, effective January 1, 1997.

       A Staff Internal Working Group on Electric Service Reliability (Staff Internal Working Group) prepared a report entitled Review of the Commission's Monitoring Process For Electric Distribution Service Reliability, dated July 18, 2002, which reviewed the Commission's monitoring process for electric distribution service reliability and commented on the recommendations from the LB&FC report. In its report, the Staff Internal Working Group recommended, in part, that ''the Commission should develop minimum performance standards that achieve the Commission's policy objective (See Recommendation III-1, p. 7).'' A subsequent Commission Order on August 29, 2002, at Docket No. D-02SPS021 directed:

    ''That the Commission staff shall undertake the preparation of such orders, policy statements, and proposed rulemakings as may be necessary to implement the recommendations contained within the Staff Internal Working Group . . . Report (p. 4).''

       The Staff Internal Working Group was assigned this task and conducted field visits to EDCs to identify the current capabilities of each EDC for measuring and reporting reliability performance. These field visits began in October 2002 and continued through March 2003.

       On June 27, 2003, the Commission entered a Tentative Order at M-00991220, which recomputed the reliability benchmarks and standards for the EDCs. The Tentative Order was published for comments in the Pennsylvania Bulletin. Comments were filed by the Attorney General's Office of Consumer Advocate (OCA), AFL-CIO Utility Caucus (AFL-CIO), Energy Association of Pennsylvania (EAP), Metropolitan Edison Company (Met-Ed), Pennsylvania Electric Company (Penelec), Pennsylvania Power Company (Penn Power), Citizens' Electric Company (Citizens'), Wellsboro Electric Company (Wellsboro), Pike County Light & Power Company (Pike County), PPL Electric Utilities Corporation (PPL), UGI Utilities, Inc.--Electric Division (UGI), Allegheny Power and PECO Energy Company (PECO). Reply Comments were filed by EAP, Met-Ed, Penelec, Penn Power, the AFL-CIO and the OCA.

    Discussion

       The comments raised issues regarding several topics. The following is a short synopsis of each topic, the parties' positions and our disposition of each issue.

    1.  Recalculation of Reliability Benchmarks

    A.  System-wide Major Event Exclusion Standardization

       In our Tentative Order, we discussed two sources of variability in the computation of the permanent benchmarks to date that made it difficult to set new performance standards equitably across the EDCs.

   The first source of variability was that some EDCs used one, system-wide operating area to compute their reliability metrics, while other EDCs subdivided their service territories and used multiple operating areas. The number, size and composition of operating areas used for metric computations introduced variability into the criterion used to exclude major events from the reliability metrics reported to the Commission. An EDC that subdivided its territory into several small geographic operating areas could exclude major events from its metric calculations based on a criterion of an interruption affecting 10% of the customers in an operating area, whereas an EDC employing only one service territory-wide operating area had to meet a much higher criterion of an interruption affecting 10% of the total EDC customer base. The proposed solution to this benchmark variability problem was to develop one uniform calculation method using system-wide performance (for the entire service territory) for computing and reporting reliability metrics to the Commission.

       We proposed that EDCs should compute and report their reliability metrics to the Commission considering the entire service territory as one operating area and the major event exclusion of an interruption that affects 10% of the entire customer base for a duration of 5 minutes or longer.

       To develop proposed standards based on the uniform definition of an operating area, Commission staff requested those EDCs that had developed their metrics using more than one operating area to recalculate their metrics for the 1994-2002 period using the entire service territory criterion. The data recalculations were used by Commission staff to recompute the current benchmarks using a uniform methodology across the EDCs. In the Tentative Order, the Commission emphasized that the recomputed benchmarks do not represent a lowering or raising of the historical benchmarks. All of the EDCs were asked to apply a uniform exclusion criterion to their original data. The only major events excluded from the recomputed benchmarks were unscheduled interruptions that affected 10% or more of the customers in the entire service territory for a duration of 5 minutes or longer. For EDCs that had previously excluded major events based on the multiple operating area criterion, the recomputed benchmark values may be higher than the original benchmark values because previously excluded outage data may now be included in the metric values. However, we noted in our Tentative Order that the recomputed benchmarks should be viewed as representing the actual reliability performance during the historical period, as calculated using a uniform methodology.
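   The uniform, system-wide exclusion criterion can be sketched as a simple filter; the event records and customer counts below are hypothetical:

```python
# Sketch of the system-wide major event exclusion: an unscheduled interruption
# qualifies for exclusion only if it affects 10% or more of the customers in
# the ENTIRE service territory for a duration of 5 minutes or longer.
TOTAL_CUSTOMERS = 500_000  # hypothetical entire service territory

events = [
    {"customers_affected": 60_000, "duration_min": 45},   # 12%, 45 min -> major event
    {"customers_affected": 55_000, "duration_min": 3},    # 11%, but under 5 min
    {"customers_affected": 20_000, "duration_min": 120},  # only 4% of customers
]

def is_major_event(event, total=TOTAL_CUSTOMERS):
    """System-wide criterion: >= 10% of the entire customer base
    interrupted for 5 minutes or longer."""
    return (event["customers_affected"] >= 0.10 * total
            and event["duration_min"] >= 5)

# Only non-major events remain in the metric calculations.
included = [e for e in events if not is_major_event(e)]
print(len(included))  # 2
```

   Under the former multiple-operating-area approach, the same 10% test was applied against a much smaller denominator, so more events could qualify for exclusion.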

    Positions of the Parties

   PPL filed comments in support of the Commission's proposed recalculation of the historical benchmarks using single operating area data, as it will establish a uniform method for computing and reporting reliability metrics. However, the OCA strongly urged the Commission to retain the existing historic performance benchmarks rather than lowering expectations for certain EDCs through a recomputation of historic data. The OCA agrees that, on a prospective basis, the Commission should ensure that the major event criteria are applied uniformly by the EDCs. The OCA also noted a concern identified in the LB&FC Report that wide variations exist among EDCs in both data collection and the application of Commission regulations to the data, and that the LB&FC recommended that the Commission clarify when an EDC can exclude data for major events from the underlying data used to calculate the metrics.

       In response to the OCA's comments, EAP countered that the recomputed benchmarks do not change the historical service provided to the relevant EDC customers. EAP commented that the Commission has not lowered the benchmarks going forward but has sought to ensure compatibility. EAP notes that it is critical for accurate comparisons that the same method be employed for historical and future evaluations of reliability.

       The OCA submits that operating area information reflects how an EDC manages its distribution system and utilizes its resources within its system and that worst performing circuit reports as required under the companion Final Rulemaking Order at L-00030161 are not a suitable proxy for operating area information. The OCA also recognizes that the Staff Report noted that some EDCs defined operating areas differently for internal purposes than for Commission reporting purposes. As a result, the OCA suggests that EDCs be required to continue reporting of operating area reliability metrics using operating areas consistent with those used for internal operations and monitoring.

    Disposition

   The Commission strongly emphasizes that recalculating the historical benchmarks so that all EDCs use standard criteria for excluding major events does not lower the bar for future reliability performance. The recalculation is consistent with the recommendation of the LB&FC (as noted in the comments of the OCA) that the Commission clarify when an EDC can exclude major events from the data used to calculate the metrics. The benchmark recalculation achieves three important objectives. First, it ensures uniformity of metric calculations. Second, it allows the Commission, in performing its reliability monitoring role, to view the metric values on the same numerical scale. The third objective is captured in the reply comments of EAP, which point out that it is critical to use identical calculations for historical benchmarks (reference points) and future reliability performance measures. We would add that allowing different calculation methods for benchmarks while using a standard calculation method for measuring reliability performance going forward (as suggested by the OCA) would yield erroneous results: the Commission could conclude that an EDC's performance relative to its benchmark had improved or deteriorated when in fact it had not. Therefore, we will retain the recalculated benchmarks and require EDCs to use the standard methodology that employs the system-wide definition of an operating area for the exclusion of major events from reliability metric calculations.

   It should be realized that if EDCs were required to report by the operating areas they use for internal operations, all previous years' operating area reliability metrics would need to be recomputed each time an EDC reconfigured its internal operations. This would make it more difficult to find pocket areas where reliability is a concern, since the companies could continually reconfigure operating areas to cover areas of concern. The proposed circuit analysis eliminates this potential problem and allows for identifying problem areas in need of remedial action. Therefore, we will adopt the initial Commission position, whereby companies report reliability data using a system-wide operating area and a listing of worst performing circuits. Our position is further addressed in our Final Rulemaking Order at L-00030161.

    B.  Standardization of Individual EDC Calculations

       A second source of variability discussed in our Tentative Order that made it difficult to equitably set new performance standards for all the EDCs pertained to two EDCs not excluding any data on major events and another EDC using a different major event definition than that contained in Commission regulations and used by all the other EDCs. In the first instance, Wellsboro and Citizens' did not exclude major events from their metric calculations for 1994-2002, although the regulations permit these exclusions.3 This was in contrast to the calculations of all the other EDCs and, therefore, was a source of variability to only Citizens' and Wellsboro. In the second instance, Penn Power used the FirstEnergy definition of a major event which is different than the definition used by the Commission and can yield a different result.

       Commission staff requested that the metrics for Citizens', Wellsboro and Penn Power be recomputed so that they would be calculated using the same uniform methodology that other EDCs used.

    Positions of the Parties

   The OCA noted in its comments that, for Citizens' and Wellsboro, the recomputed historical benchmarks suggest that much higher reliability was achieved from 1994-1998 than was previously calculated. The OCA is unclear as to why there was a change in these two EDCs' benchmarks with the recalculation, since these small EDCs always reported on a system-wide basis rather than a multiple operating area basis. EAP's reply comments noted that Citizens' and Wellsboro recomputed their benchmarks to exclude major events consistent with the other EDCs to ensure comparability. In their comments, FirstEnergy did not specifically address the recomputation of Penn Power's benchmarks using the Commission's definition of a major event rather than FirstEnergy's definition, which yields a different result. However, FirstEnergy commented that, as a broad conceptual matter and over the long term, it agrees with and supports the Commission's efforts to standardize among the EDCs the outage data maintained and submitted to the Commission.

    Disposition

   The reply comments of EAP correctly capture the reason why the recalculated benchmarks of Citizens' and Wellsboro reflect a higher level of reliability during the benchmark period than was previously reported. As we noted in our Tentative Order, Citizens' and Wellsboro did not exclude any major events from their metric calculations for 1994-2002, although the Commission's regulations permit these exclusions. To place Citizens' and Wellsboro's metric values on the same numerical scale as the metrics from the other EDCs, the Commission requested that they recalculate their benchmarks using the allowed exclusions of major events, thereby lowering their benchmark values from those reported previously. We will retain the recomputed benchmarks for Citizens' and Wellsboro. We will likewise retain the recomputed benchmarks for Penn Power as advanced in the Tentative Order, which used the Commission's definition of a major event so that Penn Power's benchmarks are calculated using the same methodology that the other EDCs use. We will interpret FirstEnergy's comments to be consistent with this disposition.

   Appendix A contains a table of the benchmarks as originally calculated and the recomputed benchmarks based on: (1) excluding major event data using the entire service territory criterion (changes for Allegheny Power, Duquesne Light, Met-Ed, Penelec and PPL); (2) excluding major events for the first time (Citizens' and Wellsboro); (3) using the Commission's definition of a major event (Penn Power, as noted in our Tentative Order); and (4) correcting SAIDI calculations to reflect that SAIDI is the product of SAIFI multiplied by CAIDI (UGI and Pike County). We will adopt the recomputed benchmarks contained in Appendix A and add remarks to clarify why the prior benchmarks were changed.
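   The SAIDI correction noted for UGI and Pike County follows from the definitional relationship among the three duration and frequency indices; with hypothetical values:

```python
# SAIFI: interruptions per customer served.
# CAIDI: average minutes of outage per interruption.
# SAIDI: average minutes of outage per customer served.
saifi = 1.2    # hypothetical annual values, for illustration only
caidi = 95.0

# SAIDI is, by definition, the product of SAIFI and CAIDI.
saidi = saifi * caidi
print(saidi)  # 114.0
```

   A reported SAIDI that does not equal the product of the reported SAIFI and CAIDI therefore signals an internal inconsistency in the filing.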

    2.  Reliability Data Quality Issues

   In our Tentative Order, we discussed two data quality issues that may affect the Commission's electric reliability monitoring efforts. The first issue pertained to Allegheny Power, which reported having several months of missing data for its 1997 and 1998 SAIFI calculations. We noted that because 1997 and 1998 data were used to calculate the historical benchmarks, Allegheny Power's SAIFI and SAIDI benchmarks were set artificially low. Thus, comparisons of Allegheny Power's post-restructuring SAIFI and SAIDI values to those benchmarks would be inherently unfavorable to the company.

       The second data quality issue we discussed in our Tentative Order pertained to EDCs that had implemented automated reliability Outage Management Systems (OMS) which had the potential to improve the accuracy of reliability monitoring information. We noted that the changes in data gathering methods had implications for comparing historical reliability performance to current performance and introduced a degree of uncertainty into our ability to interpret reliability trend data. Our discussion in the Tentative Order pointed out the importance of separating out the method variance (due to differences in measurement capability) from the variance in reliability scores that is attributable to true changes in reliability. We concluded that we could not quantify the exact degree of method variance resulting from OMS implementation.

    Positions of the Parties

       As to the first data quality issue, Allegheny Power commented that the specific benchmarks proposed for Allegheny Power are unrealistic and not useful for future comparisons. Allegheny Power claims that their SAIFI benchmark is skewed by a period of incomplete data and that their SAIFI benchmark is unrealistically low in comparison to other large EDCs. As evidence, Allegheny Power comments that their SAIFI performance for 2000-2002 matches the best performance of all large EDCs for the same period. Accordingly, Allegheny Power requests an adjustment of their benchmarks.

   Comments were filed on behalf of Met-Ed, Penelec and Penn Power (collectively, the FE Companies) that pertained to data quality issues about the implementation of OMS and the resulting implications for the validity of the proposed benchmarks. The FE Companies noted that, with the exception of Allegheny Power, Met-Ed and Penelec are in the unique position of having installed and implemented new automated processes for collecting outage information after the 1994-1998 base period used by the Commission in setting the reliability benchmarks. The FE Companies comment that the Tentative Order recalculates their benchmarks without any consideration of the improvement in their methods for collecting reliability data since electric restructuring. Quoting from their 2002 Reliability Report to the Commission, the FE Companies state that although statistics for several operating areas are elevated, there has been no real change in reliability performance. The FE Companies believe that the elevated statistics are the result of the implementation of the new automated systems.

       The comments of the FE Companies also note that the benchmarks and standards proposed in the Tentative Order for Penn Power do not give any consideration to the inaccuracy of some of its historic period reliability data. The comments note that in the early 1990s Penn Power relied in part on estimates of the number of customers affected by power outages. However, with more recent electronic mapping efforts, Penn Power now has substantially more accurate outage statistics that are not directly comparable to the historic benchmark as proposed by the Commission.

   In place of the benchmarks proposed by the Commission in the Tentative Order, the FE Companies request that the Commission adopt revised benchmarks and standards, proposed in Exhibit 1 of their comments, which are based on reported performance during the 1998-2002 period. The benchmarks and standards proposed by the FE Companies are significantly higher (allowing for worse reliability performance) than those we proposed. In support of these higher benchmarks and standards, the FE Companies cite the Commission's 2002 Customer Service Performance Report as evidence of customers' positive perception of reliability performance.

       The AFL-CIO and the OCA filed reply comments in response to some of the points noted in the comments of Allegheny Power and the FE Companies pertaining to data quality issues. The AFL-CIO comments that, in theory, it is possible that the mere fact of changing data collection methods could have some effect on the reliability statistics reported by the EDCs. However, the AFL-CIO notes that the FE Companies have not shown that this has occurred. Similarly, the OCA comments that no evidence has been presented by the EDCs that shows or even supports the claims that the historic data is not representative of pre-restructuring performance or that the installation of new OMS is the sole cause of the apparent deterioration in reliability. The OCA notes that the claim that the new OMS are causing the appearance of deterioration in reliability has not been subjected to evaluation or review. The OCA commented that the LB&FC Report made the point that careful analyses of these claims are needed before any adjustments should be considered. The OCA also comments that the Commission should not entertain requests to change individual EDC benchmarks and standards through the Tentative Order. In the view of the OCA, these requests are more properly made as a separate petition where the merits and all underlying facts can be thoroughly examined on the record.

   The AFL-CIO and the OCA also offered reply comments addressing the FE Companies' citation of the Commission's 2002 Customer Service Performance Report findings to note customer satisfaction with post-restructuring reliability and the need to adopt the new benchmarks and standards proposed by the FE Companies. The AFL-CIO notes that the Commission's Report evaluates EDC call center operations and has nothing to do with the reliability of distribution service. The OCA comments that the use of customer service data is not an adequate substitute for objective standards for reliability.

    Disposition

       First we will address the requests by Allegheny Power and the FE Companies to adjust the Commission's proposed benchmarks and standards or to substitute benchmarks and standards proposed by the FE Companies for those proposed by the Commission in our Tentative Order. We will adopt the position advanced by the OCA that the Commission should not entertain requests to change individual EDC benchmarks and standards through this Tentative/Final Order process. We note that this is a generic proceeding and does not have provisions for the more intensive presentation and review of evidence that the AFL-CIO and the OCA note should accompany a request for changes in benchmarks and standards.

   The data that Allegheny Power and the FE Companies now claim is inaccurate was the same data (covering the period of 1994-1998) used to establish the original benchmarks in 1999, and no EDC appealed the Commission's December 16, 1999, Order at M-00991220, which established those benchmarks. The December 16, 1999, Order stemmed from a consensus proceeding as opposed to an evidentiary hearing, and at that time the companies represented that those benchmarks were the averages of their indices over a 5-year, precompetition period (1994-1998). Based upon the companies' representations, the December 16, 1999, Order was entered establishing the benchmarks and standards. No one appealed said Order, and we believe the EDCs cannot now challenge the original benchmarks. However, we will allow the EDCs to challenge the recomputed benchmarks if they have new evidence, such as the impact of OMS on their reliability indices. Utility-specific on-the-record proceedings will afford the parties the opportunity to examine all relevant issues and provide the Commission with a complete factual record upon which to base its decision. The proceedings must be initiated within 30 days of the date of entry of this Order, and the burden of proof is on the Petitioners. The petition must be served upon all parties of interest, including the Pennsylvania AFL-CIO (Utility Caucus), the OCA and the OSBA.

   In the case of the FE Companies' requests for new benchmarks and standards, we believe that a thorough examination of factual data by all interested parties is necessary before any potential revisions to the benchmarks are made. We note that as recently as May 2001, Met-Ed and Penelec reliability metrics were incorporated into a Service Quality Index that was part of the Joint Application for Approval of the Merger of GPU, Inc. with FirstEnergy Corporation, approved by the Commission in an Order dated May 24, 2001. It is not clear why the FE Companies' claims regarding the inaccuracy of the metrics were not an issue at that time, but are an issue now.

   Further, we note that other Pennsylvania EDCs have implemented OMS and taken measures to increase connectivity but have not made similar claims of adverse effects on reliability indices. Absent an on-the-record proceeding that can determine the facts specific to the FE Companies, it does not appear fair to make adjustments to the FE Companies' benchmarks and standards that will not also be made to other EDCs' benchmarks and standards. The FE Companies should have expected in advance that implementing OMS had the potential to affect the measurement of reliability performance and thus should have taken steps to conduct parallel measurements of their old and new data gathering systems.

       With parallel measurement and analysis, the FE Companies could then determine the degree of method variance, if any, and have factual information to present to the Commission to support a request for a change in benchmarks and standards. Also, having factual information about the degree of method variance would appear to be necessary to make meaningful comparisons of current performance to past performance so that EDC management could determine if there was any true change in reliability performance over time aside from any change that may have occurred from the implementation of OMS. If parallel measurement and analyses were conducted, this information should be presented in an on-the-record proceeding before the Commission.

   We also want to address the comments of the FE Companies that cite the Commission's 2002 Customer Service Performance Report as evidence of customer satisfaction with service reliability. As noted by the AFL-CIO, the customer survey reported on in the Commission's Report does not inquire about satisfaction with the number of service interruptions or service restoration times. The focus of the survey data is on call center performance, such as access, courteousness and knowledge of the call center representatives. We do not view this data as an indication that customers are satisfied with the aspects of Met-Ed's and Penelec's service reliability measured by the benchmarks and standards.

       Going forward, the Commission wishes to stress the importance of EDCs conducting parallel measurements and analyses when implementing changes in reliability monitoring and data gathering methods so that the Commission is provided with accurate information about true reliability performance. Parallel measurement efforts also appear necessary to enable EDC management to fulfill their obligations to effectively maintain good reliability performance.

   Finally, we point out that the Commission is providing some degree of latitude to all EDCs by setting the 3-year rolling standard at 110% of the benchmark rather than 100% of the benchmark, as discussed in greater detail later in this Order. This latitude should sufficiently account for any typical degree of method variance that may have occurred in the measurement of the benchmarks and of performance in the post-restructuring period. Absent a determination from the Commission based on an on-the-record proceeding, the Commission will not permit revisions to individual EDCs' benchmarks and standards to allow for a greater degree of latitude because of reliability measurement method variance.

    3.  Revised Performance Standards

       In our Tentative Order, we noted two shortcomings in our existing minimum reliability standards that were established in 1999. The first shortcoming was statistical in nature and related to the establishment of standards that were two standard deviations above the benchmarks. This method of establishing standards yielded a result that enabled an EDC to perform worse on a performance index (such as CAIDI or SAIFI) after 1998 than any year during the 1994-1998 benchmark period and still be within the standard. This wide band of acceptable performance within the standard led to the second shortcoming, an inconsistency with the Commission's policy objective of setting standards for reliability that maintain the same level of reliability after restructuring as was experienced prior to restructuring. We also noted that the LB&FC arrived at a similar conclusion about an overly wide band of acceptable performance with the current performance standards. In our Tentative Order, we showed figures for the major EDCs revealing that our two standard deviation approach to setting standards allowed for average SAIFI values to be 40% greater than the historical benchmark and average CAIDI values to be 24% above the benchmark, but still within standards.

       Based on the shortcomings previously identified, the Commission proposed to set new reliability standards that were more closely tied to the EDCs' historic benchmark performance but also allowed for some degree of variability from year to year. The Commission considered but declined to use the standard deviation approach for setting the proposed new performance standards. A standard deviation measures the degree of variance from an average and can be useful for the establishment of variability standards. However, because the benchmark data currently available consists of only five data points for each reliability index per EDC (the annual average indices for the years 1994, 1995, 1996, 1997 and 1998), we were not confident that the standard deviation statistic would yield a valid result. The standard deviation is typically used to summarize the variability in a large data set. We did not believe that this underlying assumption for the statistic was met with only five data points per EDC for each metric.

       Instead of using the standard deviation approach for setting an acceptable band of performance, the Commission proposed thresholds using a percentage bandwidth above the benchmark for a shorter term standard and another percentage for a longer term standard.4 The proposed longer term standards were generic in the sense that the proposed percentages above each EDC's benchmarks were the same for all EDCs. However, the generic percentage standard was applied to each EDC's benchmarks, which were based on individual EDC performance from 1994-1998. The proposed longer term standard was that the rolling 3-year average for system-wide reliability indices should be no higher than 10% above the historic benchmark. The proposed rolling 3-year standard was set at 10% above the benchmark to ensure that the rolling 3-year standard is not worse than the worst annual performance experienced during the years prior to restructuring (1994-1998). Rolling 3-year performance was proposed to be measured against the standard at the end of each calendar year.

       The Commission also proposed a short-term standard to monitor performance on a more frequent basis. For the large EDCs5 (companies with 100,000 or more customers) the Commission proposed that the rolling 12-month averages for the system-wide indices be within 20% of the benchmark. For small EDCs6 (companies with fewer than 100,000 customers), the Commission proposed that the rolling 12-month averages for the system-wide indices should be within 35% of the historical benchmarks. A greater degree of short-term latitude was proposed for the small EDCs in the rolling 12-month standard because small EDCs have fewer customers and fewer circuits than the large EDCs, potentially allowing a single event to have a more significant impact on the reliability performance of the small EDCs' distribution systems.

       The distinction between small EDCs and large EDCs is illustrated by the SAIFI calculation. SAIFI is a ratio of customers interrupted divided by customers served. With a much smaller number of customers served, outages that are relatively insignificant for a large EDC's reliability measures will have a more significant impact on small EDCs. Thus, small EDCs have standard deviations that are higher than those of the large EDCs because of small sample sizes. Reducing the former two-standard deviation standard to a 135% standard is a significant tightening of the standard for the small EDCs. The rolling 12-month performance was proposed to be measured against the standard on a quarterly basis.
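The sensitivity described above follows directly from the SAIFI ratio. A minimal sketch, using hypothetical customer counts:

```python
def saifi_contribution(customers_interrupted: int, customers_served: int) -> float:
    """Contribution of a single outage to SAIFI: customers interrupted
    divided by customers served."""
    return customers_interrupted / customers_served

# The same hypothetical 5,000-customer outage on two systems:
small_edc = saifi_contribution(5_000, 50_000)      # 0.1 interruptions per customer
large_edc = saifi_contribution(5_000, 1_000_000)   # 0.005 interruptions per customer

# The outage weighs 20 times as heavily on the small EDC's index.
ratio = small_edc / large_edc
```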

       The proposed long-term and short-term standard set points were selected for a number of reasons. First, the standards allow for some variability from the benchmarks because reliability performance is influenced by weather conditions and other factors that are inherently variable in nature. Second, a review of historical reliability performance levels reveals a certain degree of variance from year to year. However, the use of rolling averages, particularly for the 3-year rolling average standard, will tend to even out some of the inherent variance in performance metrics. The longer the period under review, the more year-to-year high and low variations will tend to cancel each other out. As such, the 3-year rolling average standard should promote reliability performance that is closer to the benchmark over time. Finally, the set points were selected so that the Commission would be more actively involved in monitoring and remedial activities when performance deviates significantly from the benchmark, but would not be as involved when the variations are within the more typical range.

       The Tentative Order also made comparisons of the proposed standards with the standards set by the Commission in 1998. In all cases, the 3-year rolling average standards are tighter than the previous standards that were based on two standard deviations. Comparisons of the proposed 12-month rolling standards to the previous standards revealed that in 32 of 33 cases (11 EDCs with 3 metrics each) the proposed standards are tighter than those established in 1998. Therefore, the Commission concluded that the proposed standards represented a tightening of our reliability standards for electric distribution service.

    Positions of the Parties

       The FE Companies and EAP offered comments in support of abandoning the two standard deviation approach for setting reliability standards for the large EDCs. However, numerous comments were filed by the small EDCs (Citizens', Pike, UGI and Wellsboro) in support of using the standard deviation approach for setting reliability standards for themselves. EAP also supported this approach and joined with Citizens', Pike County and UGI in recommending that the 12-month rolling average standard should be set at 1 1/2 standard deviations above the benchmark and the 3-year rolling standard be set at one standard deviation above the benchmark for the small EDCs only. Pike County recommended the use of standard deviations because of the significant amount of variation in the data caused by small events that skew the statistics. In reply comments, the OCA noted that it could not support the use of the standard deviation approach for the small EDC standards.

       The Commission's proposal to generally tighten the reliability standards received support in comments by the AFL-CIO, the FE Companies and the OCA. However, the AFL-CIO and the OCA commented that the Commission did not go far enough in its efforts to tighten the standards. These commenters pointed out that the Commission's proposals fall short of requiring reliability performance to be at a level experienced prior to restructuring. The OCA recommends an alternative approach whereby the 12-month rolling average standard be established at 10% above the benchmark and the 3-year rolling standard be established at the benchmark.

       Comments filed by PPL recommend a different model of setting reliability standards than that proposed by the Commission in the Tentative Order. PPL comments that there should be a single Statewide standard for the industry. PPL believes that benchmarks and standards should consider an EDC's historical performance and provide additional allowances for those EDCs that have met performance objectives. In PPL's view, the application of their model for a Statewide standard would ensure that better performing EDCs are not penalized for historically good performance and that improvement by those EDCs whose performance has lagged would be encouraged.

       The Commission received supportive comments from several parties (AFL-CIO, the FE Companies, the OCA and PPL) about the overall proposed model whereby we would seek to establish short-term and long-term standards. The FE Companies and PPL also generally supported the percentages proposed for the short-term and long-term standards (20% and 10% above the benchmarks, respectively). EAP and UGI recommended that as an alternative to using standard deviations to set the standards for small EDCs, the Commission should consider setting the 12-month standard at 45% above the benchmark and the 3-year standard at 15% above the benchmark.

       The OCA filed comments recommending that the Commission clarify the regulatory purpose of the short-term, 12-month rolling average standard. The OCA recommends that the 12-month standard be used to ensure that performance does not deteriorate on an annual basis to a level that makes it unlikely that an EDC will meet the requirements of the regulation over time. The OCA suggested that the Commission incorporate language from the 2002 Staff Report that addressed the purpose of the short-term standard.

    Disposition

       The Commission will retain its proposal for using percentages to establish standards for electric distribution service reliability. In so doing, we will not adopt the position of the small EDCs who offered comments in support of the alternative of using the standard deviation statistic. With only five data points, a key underlying assumption for the standard deviation statistic is not met, thereby rendering the statistic invalid for our purposes.

       We will also retain our proposal for adopting both a long-term, 3-year rolling average standard and a 12-month rolling average short-term standard for all EDCs. We will not adopt the model advanced by PPL for a single, Statewide standard for all EDCs. The intent of the Act is that service ''be maintained at the same level of quality under retail competition'' 66 Pa.C.S. § 2807(d) (emphasis added). The Act could have required that all EDCs' performance not fall below some absolute standard, but it does not state that. Instead, the language of the Act implicitly recognizes that different EDCs may have had different levels of service reliability, and that each EDC's historic performance prior to electric restructuring would be the minimum performance standard to be maintained for the future. We also recognize that a single, Statewide performance standard may not account for legitimate differences in geography that can affect reliability. Accordingly, we shall, for the time being, retain these standards on a company-specific basis.

       As previously noted, EAP, UGI and other small EDCs provided comments in support of having somewhat more lenient standards for the small EDCs than those proposed by the Commission. Commenters supported 1 1/2 standard deviations or an upper range of 45% above the benchmark for the 12-month rolling standard for small EDCs and advanced either one standard deviation or an upper range of 15% above the benchmark for the 3-year rolling standard for small EDCs. We have already addressed our reservations about using the standard deviation statistic, the logic of which applies to small and large EDCs alike. We are not inclined to set an even wider bandwidth of acceptable performance for the small EDCs than we originally proposed. With regard to the 3-year rolling average, we believe the small EDCs should be within 10% of their benchmark, just like the large EDCs. For the 12-month rolling average, we proposed a somewhat more lenient standard of 135% for the small EDCs versus 120% for the large EDCs. We believe this extra degree of latitude is justified for the small EDCs because of the greater potential impact of single outage events on distribution systems with few circuits. However, we decline to provide even more latitude to the small EDCs. We would prefer to keep the acceptable performance range moderate and to examine specific causes and events on a case-by-case basis should the reported metric values exceed the 135% standard.
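The standards adopted here reduce to a simple percentage-band comparison. A sketch, assuming a hypothetical CAIDI benchmark of 100 minutes:

```python
def within_standard(rolling_average: float, benchmark: float, band: float) -> bool:
    """True if the rolling-average index is no more than `band` above the benchmark."""
    return rolling_average <= benchmark * (1 + band)

benchmark_caidi = 100.0  # hypothetical CAIDI benchmark, in minutes

# 12-month rolling standard: 20% above benchmark (large EDCs), 35% (small EDCs)
assert within_standard(115.0, benchmark_caidi, 0.20)      # large EDC, compliant
assert not within_standard(140.0, benchmark_caidi, 0.35)  # small EDC, above 135%

# 3-year rolling standard: 10% above benchmark for all EDCs
assert within_standard(108.0, benchmark_caidi, 0.10)
```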

       Comments filed by the AFL-CIO and the OCA recommended that the Commission further tighten the standards for EDC reliability performance beyond that proposed in our Tentative Order. We will not adopt standards that are tighter than we proposed in our Tentative Order at this time. We believe that our proposals represent very significant steps to tighten the standards over the next few years and should serve to focus EDC management on achieving benchmark performance in the future. Given the uncertainty of weather and other events that can affect reliability performance, EDCs should set goals to achieve benchmark performance or better to allow for those times when unforeseen circumstances push the indices above the benchmark. By carefully managing performance in this manner, EDCs will have the necessary latitude to occasionally have performance above the benchmark, but still have the 12-month and 3-year averages close to the benchmark and well within the Commission's standards.

       We agree with the OCA that the Commission should clarify the purpose of the short-term 12-month rolling average standard. The primary purpose of the short-term 12-month standard is to ensure that performance does not deteriorate and move too far from the benchmark without Commission attention during the period in which the 3-year average develops. If quarterly monitoring of the 12-month rolling average metric values reveals trends that are incompatible with meeting the long-term standards, the Commission will conduct further reviews and remedial activities with the subject EDC until performance trends in the desirable range.

       Appendix B contains the recomputed benchmarks, rolling 12-month standard and the rolling 3-year standard for each EDC's SAIFI, CAIDI and SAIDI metrics.

    4.  Waiver Petitions

       In Ordering Paragraph No. 4 of the Commission's June 27, 2003, Tentative Order, we ordered EDCs to request, in writing to the Commission's Secretary's Bureau, any waivers of reliability reporting requirements necessary to fulfill their obligations under 52 Pa. Code Chapter 57, Subchapter N (Electric Reliability Standards). Since there were no adverse comments to this requirement and there were supportive comments filed by the OCA and PPL, we will maintain our initial position requiring the formal filing of waiver requests, and again direct that all requests for waiver shall be made formally in writing to the Commission. EDCs are required to file a petition for waiver of formal reporting requirements under 52 Pa. Code § 5.43 (relating to petitions for waiver of regulations) in a timely manner, in advance of the applicable reporting deadline. The EDCs are directed to disclose the reasons they are not in full conformity with the reliability regulations in all waiver petitions submitted to the Commission.

    5.  Starting and Ending Times of Major Events

       The LB&FC and the Staff Internal Working Group identified scenarios wherein certain EDCs had inappropriately claimed service interruptions as a major event by excluding all outage data that took place on any day in which a major event took place, regardless of the actual timeframes in which the major event took place. The current definition of a ''major event'' (as defined in 52 Pa. Code § 57.192) indicates that ''The event begins when notification of the first interruption is received and ends when service to all customers affected by the event are restored.'' We agree that the designated starting and ending time of major events should be enforced according to the regulations.

       Although we revised the definition of a major event, there was no change made to the starting and ending times of a major event. The Commission hereby reiterates that there are regulations which define the designated starting and ending times of major events according to 52 Pa. Code § 57.192 and these shall be followed by all EDCs.

    Positions of the Parties

       No adverse comments were filed.

    Disposition

       We reiterate that the starting and ending times of major events are adequately defined in 52 Pa. Code § 57.192.


    _______

    1  A performance benchmark is the statistical average of an EDC's annual reliability performance index values for a given time period and is established by the Commission. The benchmark represents company-specific reliability performance for a specific historical period. An EDC's performance benchmark is the average of the historical annual averages of the performance index values for the 5-year time period from 1994-1998 and appears in Appendix B.
       A performance standard is a numerical value established by the Commission that represents the minimal performance allowed for each reliability index for a given EDC. Performance standards established by this order are derived from and based on each EDC's historical performance as represented in performance benchmarks. Both long-term and short-term performance standards are established for each EDC. Long-term, 3-year rolling performance standards are based on the three most recent annual index values. Short-term, 12-month rolling performance standards are based on the four most recent quarterly index values. The long-term and short-term performance standards appear in Appendix B.

    2  CAIDI is Customer Average Interruption Duration Index. It is the average duration of sustained interruptions for those customers who experience interruptions during the analysis period. CAIDI represents the average time required to restore service to the average customer per sustained interruption. It is determined by dividing the sum of all sustained customer interruption durations, in minutes, by the total number of interrupted customers. SAIFI is System Average Interruption Frequency Index. SAIFI measures the average frequency of sustained interruptions per customer occurring during the analysis period. SAIDI is System Average Interruption Duration Index. SAIDI measures the average duration of sustained customer interruptions per customer occurring during the analysis period. MAIFI measures the average frequency of momentary interruptions per customer occurring during the analysis period. These indices are accepted national reliability performance indices as adopted by the Institute of Electrical and Electronics Engineers, Inc. (IEEE), and are defined with formulas at 52 Pa. Code § 57.192.
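    The index definitions in this footnote can be sketched in code, using hypothetical sustained-interruption records; note that CAIDI equals SAIDI divided by SAIFI by construction:

```python
# Hypothetical sustained-interruption records: (customers affected, minutes out).
outages = [(2_000, 90), (500, 240), (1_200, 45)]
customers_served = 100_000

customers_interrupted = sum(n for n, _ in outages)
customer_minutes = sum(n * minutes for n, minutes in outages)

saifi = customers_interrupted / customers_served   # interruptions per customer served
saidi = customer_minutes / customers_served        # minutes per customer served
caidi = customer_minutes / customers_interrupted   # minutes per interrupted customer

# CAIDI = SAIDI / SAIFI follows from the definitions.
assert abs(caidi - saidi / saifi) < 1e-9
```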

    3  The Tentative Order noted that Citizens' did not exclude major events. However, it is clear that both Citizens and Wellsboro did not exclude major events in their original calculations.

    4  When referring to the establishment of new performance standards based on a percentage of the benchmark, it is important to note that this is the recomputed benchmark based on excluding major event data using the entire service territory criterion.

    5  Large EDCs currently include: Allegheny Power, Duquesne Light, Met-Ed, Penelec, Penn Power, PECO and PPL.

    6  Small EDCs include: UGI, Citizens', Pike County and Wellsboro.



    6.  Formal Requests for Exclusion of Service Interruptions as Major Events

       The Staff Internal Working Group's Recommendation No. IV-1 states that the Commission should implement a process that will enable EDCs to formally request exclusion of service interruptions for reporting purposes by proving an outage qualifies as a major event. To analyze and set measurable goals for service reliability performance, outage data is partitioned into normal and abnormal periods so that only normal event periods are used for calculating service reliability indices. The term ''major event'' is used to identify an abnormal event, for which this outage data is to be excluded when calculating service reliability indices. 52 Pa. Code § 57.192 currently defines a ''major event'' as follows:

       (i)  Either of the following:
       (A)  An interruption of electric service resulting from conditions beyond the control of the electric distribution company which affects at least 10% of the customers in an operating area during the course of the event for a duration of 5 minutes each or greater. The event begins when notification of the first interruption is received and ends when service to all customers affected by the event are restored. When one operating area experiences a major event, the major event shall be deemed to extend to all other affected operating areas of the electric distribution company.
       (B)  An unscheduled interruption of electric service resulting from an action taken by an electric distribution company to maintain the adequacy and security of the electrical system, including emergency load control, emergency switching and energy conservation procedures, as described in Section 57.52 (relating to emergency load control and energy conservation by electric utilities), which affects at least one customer.
       (ii)  A major event does not include scheduled outages in the normal course of business or an electric distribution company's actions to interrupt customers served under interruptible rate tariffs.

       The Staff Internal Working Group identified the following scenarios wherein certain EDCs had inappropriately claimed service interruptions as a major event:

       *  Combining two separate storm events, of which only one meets the definition of a major event, into one major event.

       *  Excluding outage data from all operating areas when a major event had occurred in only one operating area.

       *  Excluding all outage data that took place on any day in which a major event took place, regardless of the actual timeframes in which the major event took place.

       Reliability performance will appear to be better than it really is when an EDC excludes more outage data from its reliability calculations than it should. The performance will appear to be better because the number of customers interrupted and/or the customer minutes of the interruption are excluded from the calculations of the performance metrics, thus resulting in lower (better) scores. To avoid the inappropriate exclusion of outage data from any calculated service reliability indices reported to the Commission, the Staff Internal Working Group recommended that a process be established whereby the EDC could formally notify the Commission that it has recently experienced what it believes to be a major event so that the specific outage data associated with the event would be excluded for calculating reliability performance. After providing Commission Staff with the report, the utility would be able to exclude the related outage data from its reliability calculations. The Staff Internal Working Group also recommended that the following outage data be provided in support of the request:

       *  The starting and ending times of the outage.

       *  The main operating area(s) affected by the major event, including the causes and number of customers affected.

       *  The neighboring operating area(s) affected, including the causes and number of customers affected.

       It will not be necessary to provide information about neighboring operating areas affected, since the Staff Internal Working Group is recommending that the definition of a major event be revised so that it is based on interruption criteria of the entire service territory of an operating company as opposed to individual operating areas defined by each operating company.

    Positions of the Parties

       The Energy Association states that while it supports the clarification of what constitutes a major storm, there is not a compelling reason to have a costly and duplicative major storm determination process. EAP states that the information proposed to be included in the formal request form is virtually identical to the information required in the Service Outage Report filed under 52 Pa. Code § 67.1; thus, the Commission should eliminate the formal request requirement as being duplicative of existing reporting requirements. If, however, the Commission maintains a major event approval process, EAP requests that the filing be deemed approved after ten days if no Commission action is taken. (EAP's October 10, 2003, Comments, p. 10-12.)

       The FE Companies believe that the proposed form would be costly and time consuming for all parties involved. Rather than implement the formal process, the FE Companies propose that the Commission review each company's annual report to determine if there has been any abuse or misunderstanding regarding the claims for major events, and direct any necessary and appropriate adjustments based upon this after-the-fact evaluation. (FE Companies' October 9, 2003, Joint Comments, p. 15-16.)

       PPL believes that the proposed form for requesting an exclusion is duplicative because most ''major events'' would be associated with storms that require submission of a Storm Outage Report in 52 Pa. Code § 67.1. PPL recommends that the Commission develop a standard reporting format for outages which includes the required information. PPL also recommends that the request to classify a storm as a ''major event'' be part of the Storm Outage Report and that the request be deemed approved unless denied by the Commission within 10 business days. (PPL's October 10, 2003, Comments, p. 5-6.)

       The OCA submits that the Commission should retain the formal process for requesting exclusion of major event data from the reported results at this time. The OCA avers that although the Commission provided clarification about the application of the criteria to underlying data in the Tentative Order, the Commission should utilize a formal process to ensure that the criteria are being applied uniformly. The OCA asserts that a formal process to review the continuing use of the regulation is in order, given the significant differences in the application and interpretation of this regulation in the past. (OCA's October 27, 2003, Reply Comments, p. 17-18.)

    Disposition

       Upon further review of this issue, the Commission orders the implementation of a process by which the utility must submit a formal request for exclusion of service interruptions for reporting purposes, accompanied by data which demonstrates that a service interruption qualifies as a major event as defined by regulations. The outage data to be provided in support of the request will be as follows:

       (1)  The approximate number of customers involved in the incident/outage.

       (2)  The total number of customers served in the service territory.

       (3)  The geographic areas affected, in terms of the county and local political subdivision.

       (4)  The reason for the interruption, including weather conditions if applicable.

       (5)  The number of utility workers and others assigned specifically to the repair work.

       (6)  The date and time of the first information of a service interruption.

       (7)  The actual time that service was restored to the last affected customer.

       Following this Order as Appendix D is a sample Major Event exclusion request form which the Commission directs the companies to use to request exclusions for major events.

       We also reject PPL's and EAP's claim that the exclusion request would be duplicative. 52 Pa. Code § 67.1 requires utilities to provide notification to the Commission when 2,500 or 5% (whichever is less) of their customers are without service for 6 hours or more. 52 Pa. Code § 57.192 defines a major event as at least 10% of the customers being without service for at least 5 minutes. Obviously, there is the potential for 2,500 customers to be out of service for more than 6 hours, thus requiring a Section 67.1 report, but not fulfilling the requirements to be classified as a major event. Conversely, there is the potential for large numbers of customers to be out of service for less than 6 hours. In this case, the major event criteria may be met, but a Section 67.1 report would not be required. Contrary to EAP and PPL's assertion that these types of events are unlikely, they can and have occurred. Thus, tying a major event exclusion request to the Section 67.1 report does not conform to this Commission's intent to ensure the application of 52 Pa. Code § 57.192 in a timely and consistent manner.
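The non-overlap between the two reporting triggers can be checked with a simplified sketch of the thresholds. The customer counts are hypothetical, and the operating-area, causation, and exclusion qualifiers in the regulations are omitted:

```python
def needs_67_1_report(customers_out: int, customers_served: int, hours: float) -> bool:
    # 52 Pa. Code § 67.1 (simplified): 2,500 customers or 5% (whichever is
    # less), without service for 6 hours or more.
    return customers_out >= min(2_500, 0.05 * customers_served) and hours >= 6

def meets_major_event(customers_out: int, customers_served: int, minutes: float) -> bool:
    # 52 Pa. Code § 57.192 (simplified): at least 10% of customers
    # interrupted for 5 minutes or more.
    return customers_out >= 0.10 * customers_served and minutes >= 5

served = 500_000  # hypothetical service territory

# 3,000 customers out for 8 hours: § 67.1 report required, but not a major event.
assert needs_67_1_report(3_000, served, 8) and not meets_major_event(3_000, served, 480)

# 60,000 customers out for 2 hours: a major event, but no § 67.1 report required.
assert meets_major_event(60_000, served, 120) and not needs_67_1_report(60_000, served, 2)
```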

       EAP and FE Companies have characterized the requirement to submit a major event exclusion form as costly and time consuming. However, neither has presented any reasoning for these assertions. In fact, EAP points out the similarities between the forms required for §§ 67.1 and 57.192. This Commission is not aware of any arguments that compliance with the currently effective § 67.1 is costly and burdensome to utility operations. We therefore find EAP and FirstEnergy's assertions to be without merit.

       FE Companies' proposal to review each company's annual report to determine if there has been any abuse or misunderstanding regarding the claims for major events is inappropriate. A major event review could occur as late as after four quarterly reports were filed with the Commission for the year in question. If there were adjustments made as a result of a review of a utility's major event exclusion claims, those adjustments would need to be implemented retrospectively in each affected quarterly report. The intent of quarterly report submissions is that the Commission be able to perform a timely review and analysis of a utility's reliability performance on the most accurate information available. Making adjustments to quarterly data 15 to 18 months after the fact clearly does not allow for a timely and accurate analysis. We will therefore not adopt the FE Companies' suggestion in this regard.

       EAP and PPL requested that the filing be deemed approved after 10 days if no Commission action is taken. We believe that it is important that a utility know as soon as possible whether its request for a major event exclusion is accepted because of the time and complication associated with the calculation of the various reliability indices. Therefore, we will strive to provide a response to the filing utility within 10 days of the request, but we will give Commission staff 20 days from the date of receipt of the request for exclusion to request any additional information from the utility. If staff does not approve the exclusion, request additional information or reject the exclusion within that time, the filing utility may treat the excluded period as a major event.

       The authority to respond to the request is delegated to the Bureau of Fixed Utility Services, which shall notify the filing utility in writing of staff's determination. All filings are subject to an audit at a later time by the Bureau of Audits. The utility may appeal staff's determination under 52 Pa. Code § 5.44 by filing a petition within 10 days after service. Requests for major event exclusions must be filed with the Secretary's Bureau and copies served upon the Bureau of Audits and Bureau of Fixed Utility Services prior to the quarterly report in which the exclusion is proposed to be claimed.

    7.  Commission Enforcement

       Several parties raised the issue regarding what action the Commission will take if an EDC's reported indices are higher (worse) than the performance benchmark or higher (worse) than the EDC's performance standard. Generally, we view the new, recomputed benchmarks to represent the EDC's average performance prior to the Act and prior to competition. Since we have tightened the performance standards, we view performance within the standard to approximate the benchmark. Therefore, the Commission will not take compliance enforcement action against any EDC that meets its performance standard. However, once a standard is violated, Commission staff will carefully review all information presented in the EDC's quarterly and annual reliability reports including the EDC's causal analysis, inspection and maintenance goal data, expenditure data, staffing levels and other supporting information and Section 67.1 reports to determine appropriate monitoring and enforcement actions. Depending upon the findings of this review, we may consider a range of compliance actions including engaging in additional remedial review, requiring additional EDC reporting, conducting an informal investigation, initiating a formal complaint, requiring a formal improvement plan with enforceable commitments and an implementation schedule and assessing penalties and fines.

       While overall system performance trends that fall in the range between the benchmark and the standard will not be subject to compliance enforcement, the Commission will keep EDCs whose performance is within the standard, but trending away from the benchmark, under review as a precautionary measure.

       The Commission will hold the EDCs to the new 3-year standard using 2004, 2005 and 2006 annual data, effective with the April 30, 2007, annual report, and will hold the EDCs accountable to the 12-month standard using data from the last quarter of 2003 and the first three quarters of 2004, effective with the quarterly report to be filed November 1, 2004. Therefore,

    It Is Ordered That:

       1.  The Commission is issuing, under 52 Pa. Code § 57.194(h), final benchmarks and standards for EDCs operating within this Commonwealth as set forth in Appendices A, B and C.

       2.  The Commission will enforce the 3-year standard on April 30, 2007, and the Commission will enforce the 12-month standard on November 1, 2004.

       3.  EDCs are directed to use the draft form in Appendix D when requesting the exclusion of service interruptions for reporting purposes by proving that a service interruption qualifies as a major event as defined by regulations.

       4.  An EDC shall request, in writing to the Commission's Secretary's Bureau, any waivers of reliability reporting requirements necessary to fulfill its obligations under 52 Pa. Code Chapter 57, Subchapter N (Electric Reliability Standards).

       5.  The Commission shall review and consider the EDC's request for waivers and shall issue Secretarial Letters granting or denying said requests.

       6.  Copies of this Order be served upon all parties to this proceeding including: EDCs operating in this Commonwealth, the OCA, the Office of Small Business Advocate, EAP and the Pennsylvania AFL-CIO--Utility Division.

       7.  A copy of this Order and Appendices A, B, C and D shall be filed at the Proposed Rulemaking Docket L-00030161, Rulemaking Re Amending Electric Service Reliability Regulations at 52 Pa. Code Chapter 57.

       8.  The Secretary shall certify this Order with Appendices and deposit it with the Legislative Reference Bureau for publication in the Pennsylvania Bulletin.

       9.  Requests for major event exclusions must be filed with the Secretary's Bureau and copies served upon the Bureau of Fixed Utility Services and Bureau of Audits prior to the quarterly report in which the exclusion is proposed to be claimed.

       10.  Authority is delegated to the Bureau of Fixed Utility Services to determine whether requests for major event exclusions should be accepted or denied. Appeals may be taken under 52 Pa. Code § 5.44 from staff's determination within 10 days of service of the letter.

       11.  All requests for major event exclusions shall be subject to audit by the Bureau of Audits.

       12.  Any EDC requesting that its benchmark be modified is directed to file, within 30 days of the date of entry of this Order, a petition with the Commission outlining the reasons why the benchmark should be modified.

       13.  Copies of the petition to amend benchmarks and/or standards should be served upon the Pennsylvania AFL-CIO--Utility Division, the OCA and the Office of Small Business Advocate.

    JAMES J. MCNULTY,   
    Secretary

    Appendix A

    Prior Benchmarks and Recomputed Benchmarks

    Name of EDC        Reliability   Prior       Recomputed   Reason for
                       Indices       Benchmark   Benchmark    Recomputation

    Allegheny Power    SAIFI         0.67        0.67         Change to One Operating Area
                       CAIDI         178         178
                       SAIDI         116         119

    Duquesne Light     SAIFI         1.15        1.17         Change to One Operating Area
                       CAIDI         108         108
                       SAIDI         123         126

    Met-Ed             SAIFI         0.97        1.06         Change to One Operating Area
                       CAIDI         117         127
                       SAIDI         113         135

    Penelec            SAIFI         1.07        1.15         Change to One Operating Area
                       CAIDI         104         115
                       SAIDI         108         132

    Penn Power         SAIFI         1.01        1.02         PUC Definition of a Major Event
                       CAIDI         93          92           v. FE Definition
                       SAIDI         95          94

    PECO               SAIFI         1.23        1.23         No Change
                       CAIDI         112         112
                       SAIDI         138         138

    PPL                SAIFI         0.88        0.98         Change to One Operating Area
                       CAIDI         128         145
                       SAIDI         113         142

    UGI                SAIFI         0.83        0.83         SAIDI Calculation Correction =
                       CAIDI         169         169          SAIFI × CAIDI
                       SAIDI         147         140

    Citizens           SAIFI         1.29        0.20         Major Events Now Excluded
                       CAIDI         73          105
                       SAIDI         73          21

    Pike County        SAIFI         0.39        0.39         SAIDI Calculation Correction =
                       CAIDI         178         178          SAIFI × CAIDI
                       SAIDI         66          69

    Wellsboro          SAIFI         2.74        1.23         Major Events Now Excluded
                       CAIDI         128         124
                       SAIDI         309         153
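    The "SAIDI Calculation Correction" entries above follow from the definitional identity SAIDI = SAIFI × CAIDI. The arithmetic can be checked in a few lines (an illustrative sketch only; the function name is ours, not the Commission's):

    ```python
    # Illustrative check of the "SAIDI Calculation Correction" in Appendix A:
    # by definition SAIDI = SAIFI x CAIDI, so a corrected SAIDI benchmark
    # should equal the product of the SAIFI and CAIDI benchmarks.

    def saidi(saifi: float, caidi: float) -> float:
        """Average outage duration per customer = frequency x duration per interruption."""
        return saifi * caidi

    # UGI: the prior SAIDI benchmark of 147 was inconsistent with SAIFI x CAIDI
    print(round(saidi(0.83, 169)))  # 140, the recomputed benchmark

    # Pike County: the prior SAIDI benchmark of 66 was corrected likewise
    print(round(saidi(0.39, 178)))  # 69
    ```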

    Appendix B

    Benchmarks and Standards for EDC Distribution Reliability Performance

    (A)                (B)           (C)          (D)         (E)
                                                  Rolling     Rolling
                       Reliability   Recomputed   12-Month    3-Yr Avg.
    Name of EDC        Indices       Benchmark    Standard    Standard

    Allegheny Power    SAIFI         0.67         0.80        0.74
                       CAIDI         178          214         196
                       SAIDI         119          172         144

    Duquesne Light     SAIFI         1.17         1.40        1.29
                       CAIDI         108          130         119
                       SAIDI         126          182         153

    Met-Ed             SAIFI         1.06         1.27        1.17
                       CAIDI         127          152         140
                       SAIDI         135          194         163

    Penelec            SAIFI         1.15         1.38        1.27
                       CAIDI         115          138         127
                       SAIDI         132          190         160

    Penn Power         SAIFI         1.02         1.22        1.12
                       CAIDI         92           110         101
                       SAIDI         94           135         114

    PECO               SAIFI         1.23         1.48        1.35
                       CAIDI         112          134         123
                       SAIDI         138          198         167

    PPL                SAIFI         0.98         1.18        1.08
                       CAIDI         145          174         160
                       SAIDI         142          205         172

    UGI                SAIFI         0.83         1.12        0.91
                       CAIDI         169          228         186
                       SAIDI         140          256         170

    Citizens           SAIFI         0.20         0.27        0.22
                       CAIDI         105          141         115
                       SAIDI         21           38          25

    Pike County        SAIFI         0.39         0.53        0.43
                       CAIDI         178          240         196
                       SAIDI         69           127         84

    Wellsboro          SAIFI         1.23         1.66        1.35
                       CAIDI         124          167         136
                       SAIDI         153          278         185

       Column C--The recomputed benchmarks based on historical performance excluding major event data using the entire service territory criterion.

       Column D--The rolling 12-month standard. The threshold is at 120% of the recomputed benchmark for the major EDCs and 135% of the recomputed benchmarks for the small EDCs.

       Column E--The rolling three-year standard. The threshold is at 110% of the recomputed benchmark of each EDC.
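       The threshold arithmetic for Columns D and E can be sketched as follows (an illustration under our reading of the order, not an official computation; note that because SAIDI = SAIFI × CAIDI, the tabulated SAIDI standards carry the percentage factor twice):

    ```python
    # Illustrative sketch of the Appendix B threshold arithmetic (our reading
    # of the order, not the Commission's official computation).  The rolling
    # 12-month standard is 120% of the recomputed benchmark for major EDCs
    # (135% for small EDCs); the rolling 3-year standard is 110% for all EDCs.

    def standard(benchmark: float, factor: float) -> float:
        """Apply a percentage threshold to a recomputed benchmark."""
        return benchmark * factor

    # Allegheny Power (a major EDC), recomputed benchmarks from Appendix B:
    saifi_12mo = standard(0.67, 1.20)   # 0.804 -> reported as 0.80
    caidi_12mo = standard(178, 1.20)    # 213.6 -> reported as 214
    saifi_3yr = standard(0.67, 1.10)    # 0.737 -> reported as 0.74
    caidi_3yr = standard(178, 1.10)     # 195.8 -> reported as 196

    # Because SAIDI = SAIFI x CAIDI, the SAIDI standard is the product of the
    # SAIFI and CAIDI standards, i.e. the factor enters twice -- which is why
    # the tabulated SAIDI thresholds exceed 120% (or 110%) of the SAIDI
    # benchmark itself.
    saidi_12mo = saifi_12mo * caidi_12mo  # 171.7 -> reported as 172
    saidi_3yr = saifi_3yr * caidi_3yr     # 144.3 -> reported as 144
    ```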

    Appendix C

    Rolling 12-Month Standard for Major EDCs

    (120% of Benchmark)

    (A)                (B)           (C)         (D)        (E)          (F)            (G)
                                                                         2-Std. Dev.    Proposed
                       Reliability   Current     Current    Recomputed   Above Recomp.  Rolling
    Name of EDC        Indices       Benchmark   Standard   Benchmark    Benchmark      12-Month Std.

    Allegheny Power    SAIFI         0.67        1.08       0.67         1.08           0.80
                       CAIDI         178         223        178          224            214
                       SAIDI         116         159        119          241            172

    Duquesne Light     SAIFI         1.15        1.46       1.17         1.49           1.40
                       CAIDI         108         127        108          127            130
                       SAIDI         123         143        126          189            182

    Met-Ed             SAIFI         0.97        1.29       1.06         1.29           1.27
                       CAIDI         117         140        127          155            152
                       SAIDI         113         155        135          200            194

    Penelec            SAIFI         1.07        1.70       1.15         1.42           1.38
                       CAIDI         104         134        115          141            138
                       SAIDI         108         140        132          201            190

    Penn Power         SAIFI         1.01        1.41       1.02         1.41           1.22
                       CAIDI         93          117        92           119            110
                       SAIDI         95          154        94           168            135

    PECO               SAIFI         1.23        1.70       1.23         1.70           1.48
                       CAIDI         112         144        112          143            134
                       SAIDI         138         196        138          244            198

    PPL                SAIFI         0.88        1.14       0.98         1.19           1.18
                       CAIDI         128         155        145          190            174
                       SAIDI         113         155        142          226            205

    Rolling 12-Month Standard for Small EDCs

    (135% of Benchmark)

    (A)                (B)           (C)         (D)        (E)          (F)            (G)
                                                                         2-Std. Dev.    Proposed
                       Reliability   Current     Current    Recomputed   Above Recomp.  Rolling
    Name of EDC        Indices       Benchmark   Standard   Benchmark    Benchmark      12-Month Std.

    UGI                SAIFI         0.83        1.35       0.83         1.35           1.12
                       CAIDI         169         304        169          305            228
                       SAIDI         147         331        140          412            256

    Citizens           SAIFI         1.29        3.10       0.20         0.38           0.27
                       CAIDI         73          156        105          230            141
                       SAIDI         73          123        21           86             38

    Pike County        SAIFI         0.39        0.58       0.39         0.58           0.53
                       CAIDI         178         283        178          283            240
                       SAIDI         66          112        69           165            127

    Wellsboro          SAIFI         2.74        6.16       1.23         1.91           1.66
                       CAIDI         128         195        124          252            167
                       SAIDI         309         565        153          483            278

       Column C--The current benchmarks established December 16, 1999, at Docket No. M-00991220. They represent the five-year average of historical performance for the years 1994--1998.

       Column D--The current standards established December 16, 1999, at Docket No. M-00991220. Each standard is the established benchmark plus two standard deviations.

       Column E--The recomputed benchmarks based on historical performance excluding major event data using the entire service territory criterion.

       Column F--What the current standard would be if the two-standard-deviation methodology were applied to the recomputed benchmarks.

       Column G--The proposed rolling 12-month standard. The threshold is at 120% of the recomputed benchmark for the major EDCs and 135% of the recomputed benchmarks for the small EDCs.

    Appendix D

    REQUEST FOR EXCLUSION OF MAJOR OUTAGE FOR
    RELIABILITY REPORTING PURPOSES TO
    PENNSYLVANIA PUBLIC UTILITY COMMISSION
    P O BOX 3265
    HARRISBURG, PA 17105-3265

       Reports require an original and one copy to be filed with the Secretary's Bureau.

       Information Required:

       1.  Requesting Utility: __________

            Address:                __________

                                       __________

       2.  Name and title of person making request:

            _________________                          _________________
                                        (Name)                                                                                         (Title)

       3. Telephone number:      _________________
                                                               (Telephone Number)

       4. Interruption or Outage:

      (a)  Number of customers affected: __________

            Total number of customers in service territory: __________

      (b)  Number of troubled locations in each geographic area affected, listed by county and local political subdivision:

               __________

               __________

               __________

               __________

          (c)  Reason for interruption or outage, including weather data where applicable:

               __________

               __________

               __________

               __________

          (d)  The number of utility workers and others assigned specifically to the repair work:

               __________

      (e)  The date and time of the first notification of a service interruption: _________________

      (f)  The actual time that service was restored to the last affected customer: _________________

          Remarks: __________

          __________

          __________

          __________

          __________

    [Pa.B. Doc. No. 04-931. Filed for public inspection May 21, 2004, 9:00 a.m.]
