USF Libraries
USF Digital Collections

1991 performance evaluation of Florida transit systems : analysis of the performance evaluation study utilization survey...


Material Information

Title:
1991 performance evaluation of Florida transit systems : analysis of the performance evaluation study utilization survey technical report : final report
Physical Description:
Book
Language:
English
Creator:
Florida. Office of Public Transportation Operations
University of South Florida. Center for Urban Transportation Research
Publisher:
Center for Urban Transportation Research (CUTR)
Place of Publication:
Tampa, Fla
Publication Date:
April 1993
Subjects

Subjects / Keywords:
Bus lines--Florida--Management   ( lcsh )
Local transit--Florida--Evaluation   ( lcsh )
Bus lines--Florida--Evaluation   ( lcsh )
Genre:
letter   ( marcgt )

Record Information

Source Institution:
University of South Florida Library
Holding Location:
University of South Florida
Rights Management:
All applicable rights reserved by the source institution and holding location.
Resource Identifier:
usfldc doi - C01-00183
usfldc handle - c1.183
System ID:
SFS0032284:00001




Full Text

PAGE 1

1991 PERFORMANCE EVALUATION OF FLORIDA TRANSIT SYSTEMS

Analysis of the Performance Evaluation Study Utilization Survey

Technical Memorandum

Prepared for
The Office of Public Transportation Operations
Department of Transportation
State of Florida

By
Center for Urban Transportation Research
University of South Florida

Final Report
April 1993

PAGE 2


PAGE 3

1991 PERFORMANCE EVALUATION OF FLORIDA TRANSIT SYSTEMS
Analysis of the Performance Evaluation Study Utilization Survey

TABLE OF CONTENTS

SECTION . . . PAGE
TABLE OF CONTENTS . . . iii
LIST OF TABLES . . . iv
LIST OF FIGURES . . . v
FOREWORD . . . 1
I. INTRODUCTION . . . 3
II. PERFORMANCE EVALUATION STUDY UTILIZATION SURVEY . . . 5
III. ANALYSIS OF SURVEY RESULTS . . . 6
  A. Awareness of Study . . . 6
  B. Receipt & Distribution of Reports . . . 9
  C. Utilization of Reports . . . 10
  D. Utilization of Performance Measures . . . 14
  E. Comments & Suggestions . . . 18
IV. RECOMMENDATIONS . . . 20
V. SUMMARY . . . 26
APPENDIX A Survey Questionnaire . . . 27
APPENDIX B Frequencies by Question . . . 31

PAGE 4

LIST OF TABLES

TABLE . . . PAGE
1 Performance Review Indicators and Measures, Directly-Operated Transit Services . . . 4
2 Performance Evaluation Study Utilization Survey, Response Rates . . . 5
3 Receipt of Performance Evaluation Study Reports, Results of Three-Way Crosstabulation . . . 7
4 Performance Measure Frequency of Use Composite Scores . . . 17
5 Most Utilized Performance Measures by Organization Type . . . 18
6 Candidate Performance Measures by Level of Elimination . . . 21
7 Final Recommendations . . . 25

PAGE 5

LIST OF FIGURES

FIGURE . . . PAGE
1 Question 2: Were you previously aware of the Performance Evaluation Study? . . . 6
2 Question 3a: Have you received the Trend Analysis report previously? . . . 8
3 Question 3b: Have you received the Peer Review report previously? . . . 8
4 Question 3c: Have you received the Summary Report previously? . . . 8
5 Question 4: How did you receive the reports? . . . 9
6 Question 5a: Did you distribute/circulate the report(s) to anyone? . . . 11
7 Question 5b: If yes, to whom did you distribute/circulate the report(s)? . . . 11
8 Question 6: Do you use any of the reported data for your legislatively required performance reporting? . . . 12
9 Question 7a: How often do you refer to the Trend Analysis report? . . . 13
10 Question 7b: How often do you refer to the Peer Review report? . . . 13
11 Question 7c: How often do you refer to the Summary Report? . . . 13
12 Performance Measure Frequencies of Use . . . 15

PAGE 6


PAGE 7

FOREWORD

The Office of Public Transportation Operations, Department of Transportation, State of Florida, with assistance from the Center for Urban Transportation Research (CUTR), is in the fourth year of publishing the Trend Analysis and Peer Review Analysis reports for the Performance Evaluation of Florida's Transit Systems study. The Department is required by state statute to publish these reports annually. Thus far, a standard format has been used to present the various performance indicators and measures that are included in these reports.

During completion of the 1991 Performance Evaluation, a survey was conducted to determine organizational awareness of the reports and the extent to which they have been utilized by transit systems, metropolitan planning organizations (MPOs), regional planning councils (RPCs), and Florida Department of Transportation (FDOT) district offices. Additionally, the survey questionnaire was designed to identify the usefulness of the reports' current performance indicators and measures, and to determine what other information might be included in order to make the reports more useful to these organizations.

This report presents the analyses of the resulting survey data. Discussed herein are the responses and comments of the participating organizations, and the potential effects that their input may have on the format of future performance evaluation efforts. CUTR would like to thank FDOT for their cooperation and assistance in the preparation of this memorandum and each of the individual transit systems, MPOs, RPCs, and FDOT district offices who participated in the survey.

Center for Urban Transportation Research
University of South Florida
Telephone: (813) 974-3120

Project Director: Steven B. Polzin
Project Manager: Joel R. Rey
Staff Support: William L. Ball, Tony Rodriguez, Fadhely Viloria

Florida Department of Transportation
Office of Public Transportation Operations
Public Transit Office
605 Suwannee Street, Mail Station 26
Tallahassee, Florida 32399-0450
Telephone: (904) 488-7774

PAGE 8


PAGE 9

1991 PERFORMANCE EVALUATION OF FLORIDA TRANSIT SYSTEMS
Analysis of the Performance Evaluation Study Utilization Survey

I. INTRODUCTION

Due to Florida's burgeoning growth, the attention given to public transit as a potential solution for the state's transportation problems has intensified. This increased emphasis on public transit has motivated the assessment of the effectiveness and efficiency with which transit systems across the state provide service. As a result, Florida legislation requires the Florida Department of Transportation (FDOT) and Florida's transit systems to develop and report performance measures on an annual basis. In order to comply with this legislation, FDOT has contracted with the Center for Urban Transportation Research (CUTR) to conduct annual performance evaluations of Florida's fixed-route transit systems utilizing data from the systems' federally required Section 15 reports.

Each performance evaluation is conducted as two separate analyses, the trend analysis and the peer review analysis. The trend analysis reviews the Florida fixed-route total system trend as well as the performance trend for each of Florida's fixed-route transit systems for a given time period. The March 1993 version of this study analyzes trends for the period from 1984 to 1991. The purpose of this type of analysis is to examine significant changes in the various performance indicators and measures over time and attempt to identify the reasons for the changes. Conversely, the peer review analysis examines the systems' performance data for only one year. This type of analysis is essentially a comparison of a system's overall performance with the performance of similar systems or "peers" from around the country. As a result of these separate analyses, each performance evaluation study has consisted of two independent technical memoranda: Part I, Trend Analysis, 1984-19XX, and Part II, Peer Review Analysis, 19XX.

The format and presentation for each of these reports have remained relatively standard for each of the four studies conducted. The only major format change occurred in the 1991 Performance Evaluation Study. In this most recent version, each report's layout was changed from portrait (8½" w. x 11" h.) to landscape (11" w. x 8½" h.) to allow for the consolidation of graphics and to better accommodate the landscape tables used in the earlier reports. The primary reasons for these changes were to decrease the number of pages in each report and to make the documents easier to read.

PAGE 10

Since the first reports were produced in 1990, there have been no changes in data analysis methodology or in the types of performance indicators and measures used. These indicators and measures are illustrated below in Table 1.

Table 1
Performance Review Indicators and Measures, Directly-Operated Transit Services

Performance Indicators: Service Area Population; Passenger Trips; Passenger Miles; Vehicle Miles; Revenue Miles; Vehicle Hours; Revenue Hours; Route Miles; Total Operating Expense; Total Operating Expense (1984 $); Total Maintenance Expense; Total Expense; Total Expense (1984 $); Total Local Revenue; Operating Revenue; Passenger Fare Revenues; Total Employees; Transportation Operating Employees; Maintenance Employees; Administrative Employees; Vehicles Available for Maximum Service; Vehicles Operated in Maximum Service; Spare Ratio; Average Age of Fleet (in years); Number of Incidents; Total Roadcalls; Total Gallons Consumed; Kilowatt Hours of Propulsion Power.

Effectiveness Measures:
  Service Supply: Vehicle Miles Per Capita.
  Service Consumption: Passenger Trips Per Capita; Passenger Trips Per Revenue Mile; Passenger Trips Per Revenue Hour.
  Quality of Service: Average Speed; Average Age of Fleet (in years); Number of Incidents; Total Roadcalls; Revenue Miles Between Incidents; Revenue Miles Between Roadcalls.
  Availability: Revenue Miles Per Route Mile.

Efficiency Measures:
  Cost Efficiency: Operating Exp. Per Capita; Operating Exp. Per Peak Vehicle; Operating Exp. Per Passenger Trip; Operating Exp. Per Passenger Mile; Operating Exp. Per Revenue Mile; Operating Exp. Per Revenue Hour; Maintenance Exp. Per Revenue Mile; Maint. Exp. Per Operating Exp.
  Operating Ratios: Farebox Recovery; Local Revenue Per Operating Exp.; Operating Revenue Per Operating Exp.
  Vehicle Utilization: Vehicle Miles Per Peak Vehicle; Vehicle Hours Per Peak Vehicle; Revenue Miles Per Vehicle Mile; Revenue Miles Per Total Vehicles; Revenue Hours Per Total Vehicles.
  Labor Productivity: Revenue Hours Per Employee; Revenue Hours Per Oper. Employee; Revenue Hours Per Maint. Employee; Revenue Hours Per Admin. Employee; Vehicle Miles Per Maint. Employee; Passenger Trips Per Employee; Total Vehicles Per Maint. Employee; Total Vehicles Per Admin. Employee.
  Energy Utilization: Vehicle Miles Per Gallon; Vehicle Miles Per Kilowatt Hour.
  Fare: Average Fare.

PAGE 11

II. PERFORMANCE EVALUATION STUDY UTILIZATION SURVEY

In an attempt to identify the usefulness of the performance indicators and measures currently being utilized in the Performance Evaluation Study, and to determine the extent to which the study's reports have been utilized by transit systems and various planning organizations in Florida, CUTR administered a Performance Evaluation Study utilization survey in November 1992. The utilization survey included thirteen questions that allowed respondents to provide information concerning their awareness of the study and its related documents and to identify how often they use the information contained in the reports. Other important questions sought information on how the reports were received, whether the reports were distributed to others within or outside of an organization, and whether the reported data were used for legislatively required performance reporting. A copy of the survey questionnaire is provided in Appendix A.

A mail-out, mail-back questionnaire was used to conduct the survey. Forty-five survey questionnaires were mailed on November 2, 1992, to nineteen transit systems, nineteen metropolitan planning organizations (MPOs) and regional planning councils (RPCs), and the seven FDOT district offices. The response rates for each type of organization are presented in Table 2. Surprisingly, the transit systems did not have the highest response rate of the three respondent groups. It was anticipated that this would be the case given that these documents present data that are specifically oriented towards the transit systems' use and, as such, the transit systems will be affected the most by any changes made to the reports as a result of the survey findings. Instead, it was the FDOT district offices that had the highest response rate (100 percent).

Table 2
Performance Evaluation Study Utilization Survey, Response Rates

Type of Organization | No. of Surveys Mailed | No. of Surveys Returned | Response Rate (%)
Transit System | 19 | 16 | 84%
MPO/RPC | 19 | 15 | 79%
FDOT District Office | 7 | 7 | 100%
Total | 45 | 38 | 84%

PAGE 12

III. ANALYSIS OF SURVEY RESULTS

A. Awareness of Study

The survey questionnaires were mailed to the transit systems, MPOs, RPCs, and FDOT district offices with a copy of the 1990 Summary Report for the Performance Evaluation of Florida's Transit Systems study. In Question 2 of the survey, respondents were asked if they were aware of this particular study prior to the receipt of this summary report. Figure 1 illustrates both the raw response data and the percentage distributions by organization type for this question.

Figure 1
Question 2: Were you previously aware of the Performance Evaluation Study?
(Raw response data and percentage distributions of yes/no answers for transit systems, MPOs/RPCs, FDOT district offices, and total.)

It is evident that a majority of the organizations surveyed (79 percent) were aware of the Performance Evaluation Study prior to receiving the 1990 Summary Report. Only eight of the 38 total survey respondents indicated that they were not previously aware of the study. It is possible that personnel changes and/or limited internal circulation of the study's reports may have resulted in certain respondents not being aware of the study. MPOs and RPCs were the least aware of all the organizations; nearly half of those which responded indicated that they had no previous knowledge of the study. Interestingly, one of the sixteen transit systems responding to the survey indicated not having prior knowledge of the study. This response is somewhat surprising since all nineteen transit systems that were given the opportunity to complete a survey have been included in each of the previous studies and have also provided the Section 15 data that were utilized in the studies' trend and peer analyses.

PAGE 13

Question 3 of the survey questionnaire is a three-part question that asked respondents whether they had previously received each of the three reports associated with the Performance Evaluation Study. Figures 2 through 4 on the following page present the raw response data and the percentage distributions by organization type for each part of this question. The analysis of the responses to this question reveals that the majority of the organizations have previously received copies of the study's three documents: 68 percent have previously received the Trend Analysis report, 74 percent have previously received the Peer Review report, and 74 percent have previously received the Summary Report. Looking at the organizations individually, it is evident that more transit systems indicated previously receiving Trend Analysis and Peer Review reports than they did Summary Reports, which is also the case for the FDOT district offices. However, more MPOs and RPCs indicated that they previously received Summary Reports (67 percent) than they did the other two documents (40 percent and 53 percent previously received the Trend Analysis and Peer Review reports, respectively).

A three-way crosstabulation of the responses to the three parts of Question 3 was conducted to give further insight into the organizations' awareness of the Performance Evaluation Study and their receipt of its associated reports. The results of this crosstabulation are shown below in Table 3.

Table 3
Receipt of Performance Evaluation Study Reports, Results of Three-Way Crosstabulation

Report Combinations Received | No. of Respondents
Trend Analysis, Peer Review, and Summary Reports | 24
Trend Analysis and Peer Review Reports | 2
Trend Analysis and Summary Reports | 0
Peer Review and Summary Reports | 2
Trend Analysis Report only | 0
Peer Review Report only | 0
Summary Report only | 2
None of the Reports | 8
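This kind of three-way tabulation can be reproduced directly from the yes/no answers to Question 3. The sketch below is only an illustration of the technique, not the study's original tabulation code; the column names and sample responses are hypothetical.

```python
# Minimal sketch of a three-way crosstabulation of Question 3 answers.
# Column names and sample data are hypothetical; only the technique is
# meant to mirror the tabulation summarized in Table 3.
import pandas as pd

# One row per survey respondent; True means the report was received previously.
q3 = pd.DataFrame({
    "trend":   [True, True, True, False],
    "peer":    [True, True, False, False],
    "summary": [True, True, True, False],
})

# Count respondents for every combination of the three yes/no answers.
combos = (
    q3.groupby(["trend", "peer", "summary"])
      .size()
      .rename("respondents")
      .reset_index()
)
print(combos)
```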

PAGE 14

Figure 2
Question 3a: Have you received the Trend Analysis report previously?

Figure 3
Question 3b: Have you received the Peer Review report previously?

Figure 4
Question 3c: Have you received the Summary Report previously?

(Each figure presents raw response data and percentage distributions of yes, no, and no response answers for transit systems, MPOs/RPCs, FDOT district offices, and total.)

PAGE 15

The crosstabulation results illustrated in Table 3 seem to support the findings from the analysis of Question 2 of the survey. The analysis of Question 2 revealed that eight of the survey respondents were not previously aware of the Performance Evaluation Study. The crosstabulation results show that a similar number of organizations did not receive any of the study's documents. Hence, it is logical to expect that the eight organizations who did not receive any of the study's reports were the same eight who were not aware of the study until they received the survey and a copy of the 1990 Summary Report. However, further analysis of these questions revealed that only six respondents did not receive any of the reports and were not previously aware of the study. The other two organizations indicating a lack of awareness of the study actually indicated receiving copies of all three reports previously. In addition, two organizations indicated that they had not received any of the reports previously, but also indicated that they were aware of the study.

B. Receipt & Distribution of Reports

If the survey respondents indicated that they had previously received any of the Performance Evaluation Study reports, the questionnaire's instructions then directed these respondents to complete the remaining ten questions on the survey form. The first two of these remaining questions, Questions 4 and 5, sought to determine how the respondents received the report(s), whether they distributed/circulated the report(s) to others within or outside of their organization, and if they did, to whom. Figure 5 presents the raw response data and the percentage distributions by organization type for Question 4.

Figure 5
Question 4: How did you receive the reports?
(Response categories: CUTR, FDOT, someone in organization, other, no response.)

PAGE 16

The data in Figure 5 show that the majority of the respondents who received reports got them directly from CUTR or FDOT. In fact, of the 29 respondents who provided a response for this particular question, only three indicated that they received their report(s) from a source other than CUTR or FDOT. Interestingly, two of the three respondents not receiving their report(s) from either CUTR or FDOT happened to be from MPOs/RPCs. It is possible that these two organizations received their report(s) from a transit system within their planning area.

Question 5 of the survey questionnaire is a two-part question that, as mentioned previously, asked respondents whether they distributed/circulated the Performance Evaluation Study report(s) to others within or outside of their organization, and, if they did, to whom did they distribute/circulate the report(s). Figures 6 and 7 on the following page present the raw response data and the percentage distributions by organization type for each part of this question.

Sixty-one percent of the respondents distributed/circulated the report(s) that they received to other individuals or organizations, as presented in Figure 6. Of the seven respondents who did not indicate that they distributed/circulated the report(s), four were from MPOs/RPCs and two were from transit systems. The results illustrated in Figure 7 show that most of the respondents who indicated that they distributed/circulated the report(s) did so among their own staff (37 percent). However, none of the respondents indicated that they shared the report(s) with the media. Looking at the organization types on an individual basis, it is interesting to note that, aside from staff, the responding transit systems were most likely to distribute/circulate the report(s) to their Boards of Directors.

C. Utilization of Reports

The basic purpose of Questions 6 and 7 was to determine the respondents' levels of utilization of the Performance Evaluation Study report(s) that they had received previously. In Question 6, respondents were asked whether they used the data in the report(s) for their legislatively required performance reporting. Additionally, Question 7 sought to determine how often the respondents referred to each of the reports that they received. Figure 8 on page 12 presents the raw response data and the percentage distributions by organization type for Question 6; the corresponding figures for Question 7 follow on page 13.

PAGE 17

Figure 6
Question 5a: Did you distribute/circulate the report(s) to anyone?
(Response categories: yes, no, no response.)

Figure 7
Question 5b: If yes, to whom did you distribute/circulate the report(s)?
(Response categories: board members, staff, advisory committee, MPO, media, consultant, other.)

PAGE 18

Figure 8
Question 6: Do you use any of the reported data for your legislatively required performance reporting?
(Response categories: yes, no, no response.)

From the data presented in Figure 8, above, it is evident that the majority of the survey respondents (58 percent) do not use the data contained in the Performance Evaluation Study reports for their legislatively required performance reporting. Only eight of the 38 respondents indicated that they utilized the reported data for this purpose; of these eight respondents, six are from transit systems (37 percent of responding transit systems). A primary reason cited by several of the survey respondents for the data not being used in this capacity is the datedness of the information. However, progress is currently being made by FDOT, CUTR, and the transit systems to expedite the performance evaluation process so that future reports are produced in a more timely manner.

Figures 9 through 11 present the results of the three parts of Question 7 which, as mentioned previously, asked respondents how often they referred to each of the reports that they had received. The figures show that the majority of survey respondents refer to each report "sometimes" (55 percent for the Trend Analysis report, 58 percent for the Peer Review report, and 63 percent for the Summary Report). Looking at the organizations individually, it is evident that only the transit system respondents refer to the Trend Analysis report "frequently." The data also show that MPOs/RPCs have the highest percent distribution of respondents who have not received copies of the reports. Interestingly, none of the respondents indicated that they referred to the Summary Report "frequently."

These varying levels of report utilization and interest in the reported data are also exhibited in the responses to Question 11, which attempted to gauge organizational interest in receiving performance data on floppy disk. Only 12 of the 38 survey respondents (32 percent) indicated that their organizations would be interested in this opportunity. The remaining respondents either answered negatively (16 respondents, 42 percent) or did not respond to the question (10 respondents, 26 percent).

PAGE 19

Figure 9
Question 7a: How often do you refer to the Trend Analysis report?

Figure 10
Question 7b: How often do you refer to the Peer Review report?

Figure 11
Question 7c: How often do you refer to the Summary Report?

(Each figure presents the percentage distribution of frequently, sometimes, never, have not received, and no response answers for transit systems, MPOs/RPCs, FDOT district offices, and total.)

PAGE 20

D. Utilization of Performance Measures

In addition to ascertaining the levels of utilization of the three reports associated with the Performance Evaluation Study, another important purpose of the survey was to determine how often the reported performance measures were utilized by the responding organizations. It was anticipated that this portion of the questionnaire, Question 12 (parts a through mm), would play an important role in the determination of any recommended changes to the current format of the study's reports. As such, a weighting procedure was used to calculate composite "frequency of use" scores for each of the performance measures. These weights then enabled the individual measures to be compared and ranked. The weighting procedure is detailed below.

A scale of 0 to 10 was adopted for the frequency of use scoring. A response of "frequently" was given a score of 10, "sometimes" was given a score of 5, and "never" received a score of 0. The weighting scores were applied to the corresponding responses for each performance measure and the resulting weighted responses were summed. For each performance measure, the sum of the weighted responses was divided by the sum of the unweighted responses to calculate a composite frequency of use score.

Figure 12 illustrates the frequencies of use for the reported performance measures. The diamonds represent the relative positions of the measures along the frequency of use scale. The farther to the right that a diamond is located, the more frequently used is its corresponding measure. The figure shows that the cost efficiency measures, such as operating expense per passenger trip, are among the most frequently used measures, while the labor productivity measures (e.g., revenue hours per employee) are among the least utilized. Overall, the measures are used "sometimes" when they are considered as a group. The mean composite score for all measures is approximately 4.66 (see Table 4, page 17), a rating slightly below the score of 5 that was assigned to the "sometimes" response. This overall score makes sense when compared to the results for Question 7, which indicate that the majority of the survey respondents also refer to each of the study's reports "sometimes."
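As an illustration of the weighting procedure described above, the following sketch computes a composite score for a single measure. It is an assumed reconstruction, not the study's original code, and it interprets "the sum of the unweighted responses" as the count of valid (non-blank) responses.

```python
# Assumed reconstruction of the frequency-of-use scoring described above:
# "frequently" = 10, "sometimes" = 5, "never" = 0; blank responses are ignored.
WEIGHTS = {"frequently": 10, "sometimes": 5, "never": 0}

def composite_score(responses):
    """Composite frequency-of-use score (0 to 10) for one performance measure."""
    valid = [r for r in responses if r in WEIGHTS]  # drop non-responses
    if not valid:
        return None
    return sum(WEIGHTS[r] for r in valid) / len(valid)

# Hypothetical ratings from six respondents for a single measure.
ratings = ["frequently", "sometimes", "sometimes", "never",
           "frequently", "sometimes"]
print(round(composite_score(ratings), 2))  # (10 + 5 + 5 + 0 + 10 + 5) / 6 = 5.83
```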

PAGE 21

Figure 12
Performance Measure Frequencies of Use
(Frequency of use scale from "never" to "frequently" for each of the following measures: Vehicle Miles Per Capita; Passenger Trips Per Capita; Passenger Trips Per Revenue Mile; Passenger Trips Per Revenue Hour; Average Speed; Average Age of Fleet; Number of Incidents; Total Roadcalls; Revenue Miles Between Incidents; Revenue Miles Between Roadcalls; Revenue Miles Per Route Mile; Operating Expense Per Capita; Operating Expense Per Peak Vehicle; Operating Expense Per Passenger Trip; Operating Expense Per Passenger Mile; Operating Expense Per Revenue Mile; Operating Expense Per Revenue Hour; Maintenance Expense Per Revenue Mile; Maintenance Expense Per Operating Expense; Farebox Recovery Ratio; Local Revenue Per Operating Expense; Operating Revenue Per Operating Expense; Vehicle Miles Per Peak Vehicle; Vehicle Hours Per Peak Vehicle; Revenue Miles Per Vehicle Miles; Revenue Miles Per Total Vehicles; Revenue Hours Per Total Vehicles; Revenue Hours Per Employee; Revenue Hours Per Operating Employee; Revenue Hours Per Maintenance Employee; Revenue Hours Per Administrative Employee; Vehicle Miles Per Maintenance Employee; Passenger Trips Per Employee; Total Vehicles Per Maintenance Employee; Total Vehicles Per Administrative Employee; Vehicle Miles Per Gallon; Vehicle Miles Per Kilowatt-Hour; Average Fare.)

PAGE 22

Table 4 presents the frequency of use composite scores for each performance measure by organization type, as well as the mean composite scores. For the performance measure totals, the scores range from a low of 1.61 for vehicle miles per kilowatt-hour to a high of 6.94 for the farebox recovery ratio. However, most of the scores fall within the range of 4.0 to 6.0 on the frequency of use scale. For the individual organization types, the transit system respondents have the highest mean composite score (5.00) while the respondents from the FDOT district offices have the lowest (4.21).

In ranking the performance measures based on frequency of use composite scores, it is evident that there are some differences among the organization types in how often they refer to particular performance measures. For example, the composite score for transit system respondents for vehicle miles per gallon is 6.43. Based on the frequency of use scale, this rating is higher than a utilization of "sometimes." In comparison, MPO/RPC and FDOT district office respondents have composite scores of 3.50 and 2.86, respectively, for this same measure. These scores fall into a range of usage between "sometimes" and "never." Differences such as this make it difficult to determine which performance measures are superfluous and can be excluded in subsequent reports. The MPO/RPC and FDOT district office scores suggest that this energy utilization measure is dispensable; however, vehicle miles per gallon is one of the ten most utilized measures by the transit systems based on its composite score for this organization type. Hence, a predicament results where more organizations (MPOs/RPCs and FDOT district offices) utilize the measure infrequently, but the organizations (transit systems) that use the measure more often are also the most frequent users of the study's reports.

Table 5 on page 18 presents the performance measures that are most utilized by the surveyed organizations, in rank order of composite scoring. Due to similar composite scores among many of the measures, it is difficult to tabulate the top five or ten measures overall or for each organization. The measures with similar scores have been grouped together within each organization category. Overall, the most frequently utilized measure is the farebox recovery ratio. Other measures frequently used by each of the organizations include operating expense per passenger trip and operating expense per passenger mile. It is interesting to note that transit system respondents rated passenger trips per revenue mile as their most frequently utilized measure (7.50); however, this measure was rated somewhat lower by the MPOs/RPCs (6.00) and the FDOT district offices (5.00).

PAGE 23

Table 4
Performance Measure Frequency of Use Composite Scores
(Composite frequency of use scores for each of the 38 performance measures by organization type: Transit Systems, MPOs/RPCs, FDOT District Offices, and Total, together with the mean composite score for each group.)
PAGE 24

Table 5
Most Utilized Performance Measures by Organization Type
(The most frequently utilized measures, in rank order of composite score, for Transit Systems, MPOs/RPCs, FDOT District Offices, and Total; measures with similar scores are grouped together within each organization category.)

E. Comments & Suggestions

The remaining four survey questions, Questions 8 through 10 and Question 13, provided respondents with the opportunity to contribute feedback and suggestions concerning the Performance Evaluation Study and its reports. Questions 8 through 10 specifically asked respondents to provide any additional measures that they would like to see added to future Trend Analysis, Peer Review Analysis, and Summary reports. Similarly, Question 13 gave respondents an opportunity to list suggestions that they would like to see addressed in each of the future reports. Following are listings of the various comments and suggestions, in no particular order, that were indicated by the respondents on their survey forms.

Questions 8 through 10: Additional Measures and Other Information

Trend Analysis Report:
- System-by-system comparison of ridership trends
- Total cost per passenger trip (also to be added to Peer and Summary reports)

Peer Review Analysis Report:
- More employee data (e.g., part-time drivers, organizational functions)
- Vanpool comparisons throughout country
- Comparison of proportion of non-local funding
- Geographic measure of service area
- Salary rates & overtime hours for major employee categories
- Percent of labor expense per operating expense

PAGE 25

- Fuel cost per revenue mile
- Fringe benefits as percent of total pay
- Measure of on-street transit amenities (also to be added to Summary Report)
- Comparison of vehicle type & size (also to be added to Summary Report)
- Comparison of size & makeup of Boards of Directors (also to be added to Summary Report)

All Reports:
- Walk access coverage area
- Mean service area activity density
- Mean coverage area activity density
- Total annual number of transfers
- On-time performance, systemwide and by mode
- Total estimated annual trips
- Estimated transit mode split

Question 13: Suggestions for Future Reports

Trend Analysis Report:
- Use tabs to separate each system section
- Add table comparing each Florida system for each of the performance measures
- Add individual comparisons on growth of ridership

Peer Review Analysis Report:
- Do not split tables; if necessary, keep table numbers the same for split tables
- Use more systematic approach to present peer properties, i.e., improve differentiation of Florida systems from peers
- Provide the names & phone numbers of contact people for the peer agencies
- Ensure that peers closely resemble each other in terms of peak hour vehicle commitment
- Provide comparisons of each system's paratransit operation
- Provide a more detailed picture of employees within each organization
- Somehow the impacts of ADA need to be quantified

Summary Report:
- Provide the names & phone numbers of contact people for the peer agencies
- Clarify that the "# of vehicles" column in the table on page 11 represents peak vehicles

All Reports:
- Add demand response data
- Make all reports more widely available
- Address the effects that a region's demographic, economic, & social characteristics have on the region's transit performance measures

PAGE 26

IV. RECOMMENDATIONS

The analysis of the survey results, as well as the various respondent comments, suggests several potential changes that can be made to the Performance Evaluation Study reports in order to improve the documents and make them more useful to the organizations that utilize them. Interestingly, the nature of the comments and suggestions provided indicates a sophisticated audience who understand and utilize the trend and peer data, and who often want additional information. In this section, the principal changes that are recommended for inclusion in future study reports are discussed. Ideally, it would be most beneficial to incorporate the recommended changes into the reports for the 1992 Performance Evaluation Study; however, it is possible that a particular change may take longer to integrate into the reports' current structures. The recommendations have been broken down into four primary categories: Peer Groupings, Performance Measures, Layout and Structure, and Additional Information. Following are discussions of each of these categories.

Peer Groupings

Currently, the Peer Review Analysis portion of the study separates the Florida systems and their peers into four peer group categories: greater than 200 motorbuses, 50 to 200 motorbuses, 10 to 49 motorbuses, and 1 to 9 motorbuses operated in maximum service. Some of the survey respondents expressed concerns that the groupings may not be equitable in terms of "peak hour vehicle commitment," especially in the 50 to 200 motorbus group. In this particular group, the number of vehicles operated in maximum service ranges from 58 to 174 motorbuses, with a peer group average of approximately 116 vehicles. The concern stems from the perception that unfair comparisons are being made between smaller systems such as Palm Beach County Transportation Authority (58 vehicles in maximum service) and peer systems that, in some cases, operate two or three times as many vehicles in maximum service.

Because of these concerns, it is recommended that the current peer groups be revisited for the 1992 Performance Evaluation Study. Based on peak vehicle data from the 1992 Section 15 reports, CUTR and FDOT can jointly determine the most comparable peer group categories for the Florida and non-Florida systems. Secondly, some of the current non-Florida peer systems were selected during the first Performance Evaluation Study using 1987 Section 15 data. While some of the peer groups have been updated during subsequent studies, it would be beneficial to review the current non-Florida systems using the original peer selection process. This review would ensure that these systems are still the best peers to present in terms of similarity in the key characteristics utilized in the original selection process (population density, vehicle miles, and

PAGE 27

average speed), especially since it is possible that systems not originally selected may have changed such that they are now preferable peers.

Performance Measures

The analysis of performance measure utilization indicates that some of the measures are referred to relatively infrequently. On a frequency of use scale from 0 to 10 (10 representing frequent reference to a measure; 0 representing no reference), only 14 of the 38 total measures have composite usage scores of 5.00 (which signifies that a measure is referred to "sometimes") or above. Of the 24 remaining measures with scores below 5.00, 22 were reviewed for potential elimination from future reports (two indicators, total roadcalls and number of incidents, were included in the frequency of use analysis but were not considered for elimination). Table 6 presents the 22 candidate measures grouped into three suggested levels of elimination based on frequency of use composite scores.

Level 1 includes measures with frequency of use composite scores below 3.20. These are the least utilized measures and are, therefore, the most logical candidates for elimination. Level 2 includes measures with composite scores between 3.20 and 4.20. These measures are also referred to infrequently and it is reasonable to consider them for elimination as well. The final level, Level 3, is comprised of measures with composite scores above 4.20 but below 5.00. While these measures are more frequently utilized than those from Levels 1 and 2, they are still potential candidates for elimination from future reports.

Table 6
Candidate Performance Measures by Level of Elimination

Level 1: Rev. Hours Per Admin. Employee; Rev. Hours Per Maint. Employee; Rev. Hours Per Oper. Employee; Total Vehs. Per Admin. Employee; Total Vehs. Per Maint. Employee; Vehicle Miles Per Maint. Employee; Vehicle Miles Per Kilowatt-Hour; Revenue Miles Between Roadcalls.
Level 2: Average Speed; Passenger Trips Per Capita; Vehicle Miles Per Capita; Vehicle Hours Per Peak Vehicle; Vehicle Miles Per Peak Vehicle; Revenue Hours Per Employee; Passenger Trips Per Employee; Revenue Miles Between Incidents.
Level 3: Maint. Expense Per Revenue Mile; Rev. Hours Per Total Vehicles; Operating Expense Per Capita; Revenue Miles Per Route Mile; Vehicle Miles Per Gallon; Maint. Expense Per Oper. Expense.
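To make the thresholds concrete, the short sketch below maps a composite score to the elimination levels just described. It is illustrative only; in particular, how scores falling exactly on the 3.20 and 4.20 boundaries are assigned is an assumption, since the report does not specify it.

```python
# Illustrative mapping of a composite frequency-of-use score to the
# elimination levels described above (boundary handling is assumed).
def elimination_level(score: float) -> str:
    if score >= 5.00:
        return "retain"   # referred to at least "sometimes" on average
    if score > 4.20:
        return "Level 3"
    if score >= 3.20:
        return "Level 2"
    return "Level 1"

for s in (1.61, 3.50, 4.66, 6.94):  # example scores taken from the text
    print(f"{s:.2f} -> {elimination_level(s)}")
```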

PAGE 28

Based on the analysis of the frequency of use composite scores, it is recommended that the Level 1 measures be eliminated from future reports. The one exception is the energy utilization measure, vehicle miles per kilowatt-hour. This measure was the least utilized overall since only two transit systems in Florida have modes that require electric propulsion power. However, it is used by these properties and is recommended for continued inclusion.

The elimination of Level 2 and Level 3 measures should be based on considerations for the structure of future reports. If continued downsizing of the reports in terms of the number of pages is desired, then these measures may become expendable. With fewer total measures, it would be possible to combine the effectiveness and efficiency measures that now occupy two tables (on two separate pages) into one table on a single page. It may even be possible to combine these tables by only eliminating a few selected measures from each of the Level 2 and Level 3 categories. However, given the historical databases, the relatively modest levels of effort involved in calculating and compiling the measures, and the fact that most of the measures are used by some of the properties, the primary motivation for eliminating these additional measures would be cost savings realized in report production. Hence, it is not recommended that the Level 2 and Level 3 measures be eliminated. Regardless of which measures are ultimately chosen for elimination, it should be recognized that organizations utilizing the reports will still be able to calculate all of the current measures using the reported performance indicators.

Layout and Structure

In analyzing the various surveys, it was found that not many comments or suggestions specifically concerned the layout or structure of the study's documents. One respondent did comment on the tabular format used to present peer data in the Peer Review Analysis report. The respondent did not like the way that the larger peer groups had to be subdivided into multiple tables of data for the same measures or indicators. Additionally, the respondent wanted to see a more systematic approach used to order the systems within each peer group's tables. As a result of these comments, it is recommended that multiple tables of peer performance indicators or measures be given the same table number with a "(continued)" designation following the table number on each subsequent page of related data. For example, if the 1 to 9 motorbus peer group has two tables of performance indicators, the tables would be numbered "Table 1" and "Table 1 (continued)," instead of "Table 1" and "Table 2." Also, it is recommended that a single indicator, such as the number of vehicles operated in maximum service, be used to order the systems within each peer group's tables. The data for the Florida systems can then be shown in bold to better differentiate these systems from their peers.

PAGE 29

The final recommendation for report structure involves the use of tabs, primarily in the Trend Analysis report. While some respondents did comment on the difficulty involved with finding individual transit systems in the voluminous Trend document, this is an inconvenience that had been noted previously. A possible solution to this problem is the addition of tabbed cardstock pages to identify each transit system section as well as the statewide system total section. However, the viability of this solution will mostly depend on the cost of producing and incorporating the tabbed pages.

Additional Information

Finally, many respondents suggested the addition of a number of variables, measures, and data comparisons that are not currently included in the study's reports. Many of the requested data items, such as employee salary rates, number of part-time employees, and annual number of transfers, are not available in the Section 15 reports and would require a more intensive data collection effort to compile. However, once the initial information is collected, it is possible that a number of the items could be included in a more detailed system description for each Florida transit property. Potentially, a separate report of these in-depth system characteristics could be produced and updated every three to four years. Data presented in this report might include detailed employee information, vehicle inventory data, descriptions of Boards of Directors and other advisory boards, and other useful items that would be of interest to the transit systems and planning organizations in Florida. It is, however, recommended that this issue be treated separately from the trend and peer reports to preclude any delay in producing and disseminating this information. Perhaps a standardized "System Description" section could be prescribed for Transit Development Plans (TDPs) to meet this need.

Of the respondents' suggested data items, two items did make sense for inclusion in future trend and peer reports: (1) a geographic measure of service area, and (2) demand response service information. The first item, a measure of service area, may be relatively simple to collect from each of the transit systems and add to the reports' data tables without much change to the current table structure. Therefore, it is recommended that service area measures be included in the upcoming trend and peer reports. (It should be recognized that it will also be necessary to understand and, if possible, standardize the service area for them to be most meaningful.) It is the inclusion of the second item, demand response service information, that may pose problems since it is highly probable that the addition of these data will not be a straightforward process.

PAGE 30

Two primary problems exist with the addition of demand response information. First, the current structure of the systems' trend data tables will not readily accommodate the addition of a new service mode. Additional tables will need to be added and revisions will need to be made to system total tables, thereby increasing the size of the Trend document once again. Secondly, historical demand response data (1984-1991) for all of the Florida systems will have to be collected and incorporated into the current reports' tables and graphics, possibly altering system totals and trends reported in these documents and their predecessors. All of these changes will require a large amount of time to complete and the impacts of these changes should probably be discussed before attempting to address this recommendation. However, alternatives such as foregoing the collection of historical data may be reasonable for consideration.

It should also be noted that some data collection for the demand response mode has already been completed as part of Technical Memorandum No. 2 of the Florida Five-Year Transportation Disadvantaged Plan completed by CUTR for the Transportation Disadvantaged Commission and FDOT. In this report, indicators such as passenger trips, vehicle miles, operating costs, and vehicles available for maximum service were presented for Section 9 operators for the years 1985 through 1989. Several performance measures based on these particular indicators were also presented in the technical memorandum. Despite the presence of this data, it is still anticipated that the collection of historical demand response data will require considerable effort. In addition, the demand response data presented in this particular report raised numerous questions concerning the comparability of data across systems due to impacts of such factors as data quality, treatment of contract providers, and large variations in service levels, eligibility, and vehicles, among others.

Therefore, while the addition of demand response data makes sense from the standpoint of addressing the quantification of the Americans with Disabilities Act (ADA) requirements, this is one recommended change that warrants additional investigation. Further analysis of the changes in report format, data collection, tables, and graphics may serve to lessen the impacts of the addition of this particular service mode. As such, it is recommended that the possible addition of demand response service be studied further in the context of the next Performance Evaluation Study scheduled to begin in June 1993. The findings of this additional research can then be incorporated into a procedure for the inclusion of demand response data into the following study (Spring 1994).

A summary of the final recommendations resulting from the Performance Evaluation Utilization survey is presented in Table 7.

PAGE 31

Table 7
Final Recommendations

Peer Groupings: Review the current peer groups to ensure comparability within each group. The review should target peer group size and non-FL systems.

Performance Measures: Eliminate the following measures from future reports: Rev. Hours Per Admin. Employee; Rev. Hours Per Maint. Employee; Rev. Hours Per Oper. Employee; Total Vehs. Per Admin. Employee; Total Vehs. Per Maint. Employee; Vehicle Miles Per Maint. Employee.

Layout and Structure: Give multiple tables of peer performance indicators or measures the same table number; use a "continued" designation on each subsequent page of related data. Use number of vehicles operated in maximum service to order the systems within each peer group's tables. Use tabs to delineate each transit system's section within the Trend Analysis report.

Additional Information: Consider a supplemental report or specification within the TDP for a more detailed system description. Include service area measurements for each system in both reports. Study the inclusion of demand response data in future reports further.

PAGE 32

V. SUMMARY

In summary, the Center for Urban Transportation Research (CUTR) conducted a Performance Evaluation Study utilization survey in November 1992 under contract with the Office of Public Transportation Operations, Department of Transportation, State of Florida. The primary purpose of the survey was to determine organizational awareness of the Performance Evaluation Study's reports and the extent to which they have been utilized by transit systems, metropolitan planning organizations, and regional planning councils in Florida, as well as Florida Department of Transportation district offices. A 13-question survey questionnaire was designed to identify the usefulness of the reports' various performance indicators and measures, and to determine what other information might be included in future reports in order to make them more useful to the surveyed organizations.

This report presents the analyses of the resulting survey data and includes the respondents' suggestions and comments. Raw response data and percentage distributions are provided for most of the survey questions in both tabular and graphical format. Also included are recommendations for future report modifications based on the analyses of the data and the respondents' comments.

Comments and questions about this report can be directed to the Office of Public Transportation Operations, Department of Transportation, State of Florida, 605 Suwannee Street, Mail Station 26, Tallahassee, Florida, 32399-0450. Telephone: (904) 488-7774.

PAGE 33

APPENDIX A
SURVEY QUESTIONNAIRE

PAGE 34

Transit System Peer Study Utilization Questionnaire

The Florida Department of Transportation, with assistance from the Center for Urban Transportation Research (CUTR), is in the fourth year of publishing the Trend Analysis and Peer Review reports for the Performance Evaluation of Florida's Transit Systems study. The Department is required by state statute to publish these reports annually. Thus far, a standard format has been used to present the various performance indicators and measures that are included in these reports.
PAGE 35

Question 12. Measures (rated Frequently / Sometimes / Never):

a. Vehicle Miles Per Capita
b. Passenger Trips Per Capita
c. Passenger Trips Per Revenue Mile
d. Passenger Trips Per Revenue Hour
e. Average Speed
f. Average Age Of Fleet
g. Number of Incidents
h. Total Roadcalls
i. Revenue Miles Between Incidents
j. Revenue Miles Between Roadcalls
k. Revenue Miles Per Route Mile
l. Operating Expense Per Capita
m. Operating Expense Per Peak Vehicle
n. Operating Expense Per Passenger Trip
o. Operating Expense Per Passenger Mile
p. Operating Expense Per Revenue Mile
q. Operating Expense Per Revenue Hour
r. Maintenance Expense Per Revenue Mile
s. Maintenance Expense Per Operating Expense
t. Farebox Recovery
u. Local Revenue Per Operating Expense
v. Operating Revenue Per Operating Expense
w. Vehicle Miles Per Peak Vehicle
x. Vehicle Hours Per Peak Vehicle
y. Vehicle Miles Per Capita
z. Revenue Miles Per Vehicle Miles
aa. Revenue Miles Per Total Vehicles
bb. Revenue Hours Per Total Vehicles
cc. Revenue Hours Per Employee
dd. Revenue Hours Per Operating Employee
ee. Revenue Hours Per Maintenance Employee
ff. Revenue Hours Per Administrative Employee
gg. Vehicle Miles Per Employee
hh. Passenger Trips Per Employee
ii. Total Vehicles Per Maintenance Employee
jj. Total Vehicles Per Administrative Employee
kk. Vehicle Miles Per Gallon
ll. Vehicle Miles Per Kilowatt-Hour
mm. Average Fare

13. Do you have any suggestions that you would like to see addressed in any of these reports?

Trend Analysis: ______________________________
Peer Review: ______________________________
Summary Report: ______________________________

Thank you for your time and effort in completing this questionnaire. Please return your completed questionnaire in the enclosed self-addressed, stamped envelope by November 20, 1992.

PAGE 36

30

PAGE 37

APPENDIX B
FREQUENCIES BY QUESTION

PAGE 38

Question 1: Type of Organization
(Response categories: Transit Agency, Regional Planning Council, Metropolitan Planning Organization, FDOT District Office.)

(Charts for Questions 2 and 3: yes/no response distributions, including prior receipt of Technical Memorandum 1, Trend Analysis, and Technical Memorandum 2, Peer Review.)

Question 4: How did you receive the report(s)?
(Response categories: From CUTR, From FDOT, From someone in my organization, Other, No response.)

PAGE 39

(Chart for Question 5a: yes, no, no response.)

Question 5b: To whom did you distribute/circulate the report(s)?
(Response categories: Board Members, Staff, Advisory Committee(s), MPO, Media, Consultant, Other, No response.)

Question 6: Reported data used to fulfill performance reporting requirements?
(Response categories: yes, no, no response.)

(Chart for Question 7: frequently, sometimes, never, have not received, no response, for Technical Memorandum 1, Trend Analysis; Technical Memorandum 2, Peer Review; and the Summary Report.)

PAGE 40

(Charts for Questions 8, 9, and 10: provided response vs. no response.)

Question 11: Is organization interested in receiving performance data on floppy disk?
(Response categories: yes, no, no response.)

PAGE 41

(Charts for Questions 12a through 12d: frequently, sometimes, never, no response.)

PAGE 42

Question 12e: How often does organization refer to "average speed?"
Question 12f: How often does organization refer to "average age of fleet?"
Question 12g: How often does organization refer to "number of incidents?"
Question 12h: How often does organization refer to "total roadcalls?"
(Response categories for each: frequently, sometimes, never, no response.)

PAGE 43

Question 12i: How often does organization refer to "rev. miles between incidents?"
Question 12j: How often does organization refer to "rev. miles between roadcalls?"
Question 12k: How often does organization refer to "rev. miles per route mile?"
Question 12l: How often does organization refer to "operating expense per capita?"
(Response categories for each: frequently, sometimes, never, no response.)

PAGE 44

Question 12m: How often does organization refer to "oper. expense per peak veh.?"
Question 12n: How often does organization refer to "oper. expense per pass. trip?"
Question 12o: How often does organization refer to "oper. expense per pass. mile?"
Question 12p: How often does organization refer to "oper. expense per rev. mile?"
(Response categories for each: frequently, sometimes, never, no response.)

PAGE 45

Question 12q: How often does organization refer to "oper. expense per rev. hour?"
Question 12r: How often does organization refer to "maint. expense per rev. mile?"
Question 12s: How often does organization refer to "maint. exp. per oper. exp.?"
Question 12t: How often does organization refer to "farebox recovery?"
(Response categories for each: frequently, sometimes, never, no response.)

PAGE 46

Question 12u: How often does organization refer to "local revenue per oper. exp.?"
Question 12v: How often does organization refer to "oper. revenue per oper. exp.?"
Question 12w: How often does organization refer to "veh. miles per peak vehicle?"
Question 12x: How often does organization refer to "veh. hours per peak vehicle?"
(Response categories for each: frequently, sometimes, never, no response.)

PAGE 47

Question 12y: How often does organization refer to "vehicle miles per capita?"
Question 12z: How often does organization refer to "rev. miles per veh. miles?"
Question 12aa: How often does organization refer to "rev. miles per total vehs.?"
Question 12bb: How often does organization refer to "rev. hours per total vehs.?"
(Response categories for each: frequently, sometimes, never, no response.)

PAGE 48

Question 12cc: How often does organization refer to "rev. hours per employee?"
Question 12dd: How often does organization refer to "rev. hours per oper. emp.?"
Question 12ee: How often does organization refer to "rev. hours per maint. emp.?"
Question 12ff: How often does organization refer to "rev. hours per admin. emp.?"
(Response categories for each: frequently, sometimes, never, no response.)

PAGE 49

Question 12gg: How often does organization refer to "vehicle miles per maint. employee?"
Question 12hh: How often does organization refer to "pass. trips per employee?"
Question 12ii: How often does organization refer to "total vehs. per maint. emp.?"
Question 12jj: How often does organization refer to "total vehs. per admin. emp.?"
(Response categories for each: frequently, sometimes, never, no response.)

PAGE 50

Question 12kk: How often does organization refer to "vehicle miles per gallon?"
Question 12ll: How often does organization refer to "veh. miles per kw-hour?"
Question 12mm: How often does organization refer to "average fare?"
(Response categories for each: frequently, sometimes, never, no response.)

