Performance Indicators Team
Identifying, compiling, clarifying, and summarizing data collected on system performance and service delivery across divisions. Analyzing which indicators are useful in routinely characterizing system performance. Identifying additional key indicators that should be collected and reported and recommending systematic and reliable mechanisms for aggregating and reporting these data on a routine basis.
- "Dashboard indicators" of OIT performance: easily understood indicators, analogous to an automobile's dashboard, which gives a readily available picture of how the vehicle is performing. Indicators might track such factors as productivity/achievement levels, service availability, finances, consumer satisfaction, workforce climate, and operational effectiveness and efficiency.
The team reviewed the data currently collected across all divisions and the charts and graphs available on the web for each division. The first pass of data analysis was at a very high level, but it was clear that a great deal of data is currently accumulated and that many charts and graphs are already available and displayed.
We also reviewed performance indicators available from Penn State, Indiana University, Stanford University, University of Michigan, Cornell University, and the University of Delaware. In reviewing these institutions' sites, it was often difficult to locate information on technology, and even more difficult to determine whether the organization posted performance indicators. Some of these institutions' web pages contained a great deal of useful information, but often in text format. Only Penn State and Indiana University provided indicators that we were able to access. A thorough review of web information regarding the Rutgers systems showed that a large amount of information about our systems is currently available, and more is being prepared for display all the time.
Following the review of sample indicators and the Rutgers indicators now available, the team established categories of performance measurements that are good candidates for inclusion in the final performance indicators dashboard. These categories include:
- Web hits, mail messages, web browser startups, authentications, Internet utilization
- Standard software uses, account uses, number of accounts on ICI and RCI
- Modem sessions, telephone calls, number of student lines
- Registered and active dorm machines, number of registered hosts
- Pages printed, repair statistics by lab, service status indicators
- Student-workstation ratio, number of labs and equipment in labs
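As an illustration only, the kind of roll-up such a dashboard implies could be sketched as follows. This is not the team's actual reporting mechanism; the indicator names, categories, and thresholds below are hypothetical, chosen to show how raw measurements might be aggregated into an easily understood per-category status.

```python
# Hypothetical sketch of dashboard aggregation; names and thresholds
# are invented for illustration, not taken from the team's report.
from dataclasses import dataclass

@dataclass
class Indicator:
    name: str       # e.g. "modem sessions answered (%)"
    category: str   # dashboard category, e.g. "Service availability"
    value: float    # most recent measurement
    target: float   # threshold the dashboard compares against

def dashboard_status(indicators):
    """Roll indicators up by category: a category shows 'ok' only if
    every indicator in it meets its target, otherwise 'attention'."""
    category_ok = {}
    for ind in indicators:
        meets = ind.value >= ind.target
        category_ok[ind.category] = category_ok.get(ind.category, True) and meets
    return {cat: ("ok" if good else "attention")
            for cat, good in category_ok.items()}

# Example readings (hypothetical values):
readings = [
    Indicator("modem sessions answered (%)", "Service availability", 97.0, 95.0),
    Indicator("pages printed per lab printer", "Operational efficiency", 8200, 9000),
]
print(dashboard_status(readings))
# → {'Service availability': 'ok', 'Operational efficiency': 'attention'}
```

The point of the sketch is the dashboard metaphor itself: many detailed measurements reduce to a handful of at-a-glance category statuses.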
For the results, see the Report of the Performance Indicators Team. (For printing, we recommend that you use the PDF Version.)
Because of the size of Appendix A (1.6 MB), the two appendices are distributed separately:
- Performance Indicators, Appendix A. (PDF Version).
- Performance Indicators, Appendix B. (PDF Version).
Team Leader: Marie J. Botticelli [email@example.com]
Newark Computing Services.
Members: Robert Allen [firstname.lastname@example.org]
Administrative Computing Services.
Nancy Rohrman [email@example.com]
Camden Computing Services.
Roy Marantz [firstname.lastname@example.org]
New Brunswick Computing Services.
Alexander Latzko [email@example.com]
Keith Sproul [firstname.lastname@example.org]