
Decisions:

Summary | Description
Action Items:

Summary | Description | Responsible | Due Date
Notes / Discussion items:

Study 8—Strategic Plan, Metrics, KPIs: Because of the evaluation team’s involvement in the development of XSEDE’s metrics and KPIs, we will likely engage an independent evaluator to interview the PI, SMT and XAB members, L3 managers, the NSF program officer, and PMs to assess the impact of XSEDE’s management and metrics system on its effectiveness.

Retreat Notes:

  • Eval team lead(s): Lizanne

  • Eval team deputy: --

  • John Towns

Interviews with NSF project officers who have managed the project (Bob, Rudi, etc.), Level 1 and 2 managers, and XSEDE PMs.

Goals for session:

  • Identify potential interviewers
  • Identify potential interviewees
    • Staff, L1/2s (include Nancy and Ralph), L3s, PM&R
      • Patricia Kovatch, Tim Cockerill, Amy Schuele, Ester Soriano, Jim Marstellar, John Cazes, Marlon Pierce, Karla Gendler, Laura Herriott, Nick Berente, Kay Hunt
    • Project officers: Rudi? He might be reluctant given NSF's stance; the interview would be about the process, not the project. Might also include Barry and Bob.
    • XAB: Cliff, Karen, Albert, historical members (John will review list)
      • Could do this in an hour-long focus group during an XAB call
    • SP forum members, PIs on awards. 
      • Could do this in an online focus group during an SP forum call
    • NSF: Al and Amy, Manish?
    • Centers: Mike Norman, Bill Gropp, Sean, Dan S., Emre, Tom Cheatham. 
      • This group might need visuals/materials to orient them toward the focus of the study. Look at section 1 of the quarterly reports/IPRs.
  • Draft questions for Interview Protocol
    • What was the value-added of having KPIs and metrics? 
      • Can you give me an example of how KPIs increased your ability (e.g., efficiency, excellence, data-driven decision making) or created obstacles?
    • What should some of the higher-level, top-down metrics be?

Notes:

Areas of study: history, organizational structure, process of development.

The study could be a legacy item from XSEDE to the community.

Share how a system like this was set up and created, how buy-in was generated, etc.

Moving to this system caused a change in the organizational structure of XSEDE. Was this a good thing?

John views this as looking at the strategic plan, mission/vision/goals, and performance measurement system all together: "the system."

XSEDE2.0 was reorganized under the strategic goals, unlike XSEDE1. This facilitated alignment and strategic planning to achieve goals in a large, distributed organization.

- Illustrate the Ferris wheel diagram vs. the XSEDE2.0 org structure diagram.

We made a first attempt at KPIs, and we should look at both versions. Continuous improvement of the measures and process (i.e., PCRs).

Changed the way we report and are reviewed. Helped articulate the value we bring to the community in a concrete, consistent way, with trend data.

Made it easier for us to be reviewed since it created focus for the panel and identified clear areas where they could provide feedback. Boundary setting.

Historically, the project/community would share quantitative HPC usage and system data, but this didn't provide insight into the sustained impact of the project.

Narrowed guidance from year to year. KPIs were an educative tool for panels. 

KPIs gave the project control over their own definition of success and panel evaluation. 

- If done wrong, this can come across as self-serving; it requires a balance between setting the bar too low and too high. KPIs needed to be challenging, attainable, and measure the intended dimension/result.

An iterative process that was painful at times because it's a difficult question to answer in a realistic way.

Provided clarity regarding priorities and what's important.

Connects L2s to each other. Forces each area to consider how changes in their area affect those in other areas.

Requires communication between areas to discuss impacts, compromises, mitigation, etc.

Formalized internal/historical knowledge about what should be measured. The hard part was determining how to measure it.

The Climate Study shows the spread of KPI utilization throughout the project over time.

Quarterly meeting reporting and discussions throughout general meetings were important in learning how to do this.

Socializing the KPIs was critical to creating familiarity and reducing anxiety around implementation and utilization. 

Continuing the improvement and utilization by creating graphical representations that are easier to digest might be the next phase for NSF.

Focus on impact over outcomes.

Going through this process, and continuing to do so, creates deeper understanding and cohesion across disparate areas of the project.

It would be interesting to talk about gaps and issues with what we have. For example, the number of ECSS projects is related to the number of ECSS staff: are we big enough to do what we do?

Could relate to consistency of metrics across areas - "linking metrics."

Should we have more KPIs across L2 areas to help generate joint work toward a common goal, e.g., how is each area contributing to increasing the number of MSI users?

Can also look at publications as a top-down metric.

- What would some of these higher-level metrics be?

--ROI work might fall under here. 

*Consider adding questions related to this study to the climate study for 2020. 

Who should be interviewed?




Details:

 

 
