
Action Items:



Notes / Discussion Items:

Study 8 (Strategic Plan, Metrics, KPIs): Because of the evaluation team's involvement in developing XSEDE's metrics and KPIs, we will likely engage an independent evaluator to interview the PI, SMT and XAB members, L3 managers, the NSF program officer, and PMs to assess the impact of XSEDE's management and metric system on its effectiveness.

Retreat Notes:

  • Eval team lead(s): Lizanne

  • Eval team deputy: --

  • John Towns

Interviews with NSF project officers who have managed the project (Bob, Rudi, etc.), Level 1 and 2 managers, and XSEDE PMs.

Goals for session:

  • Identify potential interviewers
  • Identify potential interviewees
  • Draft questions for Interview Protocol
    • What was the value-added of having KPIs and metrics?
      • Can you give me an example of how KPIs improved your work (e.g., efficiency, excellence, data-driven decision making, overcoming obstacles)?


Areas of study: History, organizational structure, process of development, 

Study could be a legacy item from XSEDE to the community. 

Share how a system like this was set up and created, how buy-in was generated, etc.

Moving to this system caused a change in the organizational structure of XSEDE. Was this a good thing?

John views this as looking at the strategic plan, mission/vision/goals, performance measurement system all together - "the system."

Unlike XSEDE1, XSEDE2.0 was reorganized around the strategic goals. This facilitated alignment and strategic planning to achieve goals in a large, distributed organization.

- Illustrate the Ferris wheel diagram vs. the XSEDE2.0 org structure diagram.

We made a first attempt at KPIs and we should look at both versions. Continuous improvement of the measures and process (i.e. PCRs). 

Changed the way we report and are reviewed. Helped articulate the value we bring to the community in a concrete, consistent way, supported by trend data.

Made it easier for us to be reviewed since it created focus for the panel and identified clear areas where they could provide feedback. Boundary setting.

Historically, the project/community would share quantitative data on HPC usage and system data, but this didn't provide insight into the project's sustained impact.

Narrowed guidance from year to year. KPIs were an educative tool for panels. 

KPIs gave the project control over their own definition of success and panel evaluation. 

- If done wrong, this can come across as self-serving. It requires balance between setting the bar too low and too high; KPIs needed to be challenging, attainable, and to measure the intended dimension/result.

An iterative process that was painful at times, because it's a difficult question to answer in a realistic way.

Provided clarity regarding priorities and what's important.

Connects L2s to each other. Forces each area to consider how changes in their area affect those in other areas.

Requires communication between areas to discuss impacts, compromises, mitigation etc.

Formalized internal/historical knowledge about what should be measured. The hard part was determining how to measure it.

Climate Study shows the spread of KPI utilization throughout the project over time. 

Quarterly reporting and discussions at general meetings were important in learning how to do this.

Socializing the KPIs was critical to creating familiarity and reducing anxiety around implementation and utilization. 

Continuing the improvement and utilization by creating graphical representations that are easier to digest might be the next phase for NSF.

Focus on impact over outcomes.



