
More Information: What is a KPI - KPI Guidelines - Annual KPI Reviews 

1    Executive Summary

Computing across all fields of scholarship is becoming ubiquitous: digital technologies underpin, accelerate, and enable new, even transformational, research in all domains. Researchers continue to integrate an increasingly diverse set of distributed resources and instruments directly into their research and educational pursuits. Access to an array of integrated and well-supported high-end digital services is critical for the advancement of knowledge. XSEDE (the Extreme Science and Engineering Discovery Environment) is a socio-technical platform that integrates and coordinates advanced digital services within the national ecosystem to support contemporary science. This ecosystem involves a highly distributed, yet integrated and coordinated, assemblage of software, supercomputers, visualization systems, storage systems, networks, portals and gateways, collections of data, instruments and personnel with specific expertise. XSEDE fulfills the need for an advanced digital services ecosystem distributed beyond the scope of a single institution and provides a long-term platform to empower modern science and engineering research and education. As a significant contributor to this ecosystem, driven by the needs of the open research community, XSEDE substantially enhances the productivity of a growing community of scholars, researchers, and engineers. XSEDE federates with other high-end facilities and campus-based resources, serving as the foundation for a national e-science infrastructure with tremendous potential for enabling new advancements in research and education. Our vision is a world of digitally-enabled scholars, researchers, and engineers participating in multidisciplinary collaborations while seamlessly accessing computing resources and sharing data to tackle society’s grand challenges.

Researchers use advanced digital resources and services every day to expand their understanding of our world. More pointedly, research now requires more than just supercomputers, and XSEDE represents a step toward a more comprehensive and cohesive set of advanced digital services through our mission: to substantially enhance the productivity of a growing community of scholars, researchers, and engineers through access to advanced digital services that support open research; and to coordinate and add significant value to the leading cyberinfrastructure resources funded by the NSF and other agencies.

XSEDE has developed its strategic goals in a manner consistent with NSF’s strategic plan, Investing in Science, Engineering, and Education for the Nation's Future: NSF Strategic Plan for 2014 - 2018,1 NSF’s strategies stated broadly in the Cyberinfrastructure Framework for 21st Century Science and Engineering2 vision document, and the more specifically relevant Advanced Computing Infrastructure: Vision and Strategic Plan3 document.

 

1.1       Strategic Goals

To support our mission and to guide the project’s activities toward the realization of our vision, three strategic goals are defined: 

Deepen and Extend Use: XSEDE will deepen the use—make more effective use—of the advanced digital services ecosystem by existing scholars, researchers, and engineers, and extend the use to new communities. We will contribute to preparation—workforce development—of the current and next generation of scholars, researchers, and engineers in the use of advanced digital services via training, education, and outreach; and we will raise the general awareness of the value of advanced digital services.

Advance the Ecosystem: Exploiting its internal efforts and drawing on those of others, XSEDE will advance the broader ecosystem of advanced digital services by creating an open and evolving e-infrastructure, and by enhancing the array of technical expertise and support services offered. 

Sustain the Ecosystem: XSEDE will sustain the advanced digital services ecosystem by ensuring and maintaining a reliable, efficient, and secure infrastructure, and providing excellent user support services. XSEDE will further operate an effective, productive, and innovative virtual organization.

The strategic goals of XSEDE cover a considerable scope. To assure we are delivering our mission and to assess progress toward our vision, we have identified key performance indicators (KPIs) that measure our progress toward meeting each sub-goal. These KPIs are a high-level encapsulation of our project metrics. Planning is driven by our vision, mission, goals, and these metrics—which are in turn rooted in the needs and requirements of the communities we serve.

The key concept is not that the KPIs themselves must have a direct causal effect on eventual outcomes, or measure eventual outcomes or long-term impacts, but rather that the KPIs are chosen so that actions and decisions which move the metrics in the desired direction also move the organization in the direction of the desired outcomes and goals. 

Table 1-1 below shows the project’s three strategic goals and associated sub-goals along with the KPIs used to measure progress toward achieving those goals.

Table 1-1: Summary of key performance indicators (KPIs) for XSEDE.

| Strategic Goals | Sub-goals | KPIs |
|---|---|---|
| Deepen and Extend Use | Deepen use (existing communities) | Number of completed ECSS projects; Average ECSS impact rating; Average satisfaction with ECSS support |
| | Extend use (new communities) | Number of new users from underrepresented communities and non-traditional disciplines of XSEDE resources and services; Number of sustained users from underrepresented communities and non-traditional disciplines of XSEDE resources and services |
| | Prepare the current and next generation | Number of attendees in synchronous and asynchronous training; Average impact assessment of training for attendees registered through the XSEDE User Portal |
| | Raise awareness of the value of advanced digital services | Number of pageviews to the XSEDE website; Number of pageviews to the XSEDE User Portal; Number of social media impressions; Number of media hits |
| Advance the Ecosystem | Create an open and evolving e-infrastructure | Number of new capabilities made available for production deployment; Average satisfaction rating of XCI services |
| | Enhance the array of technical expertise and support services | Average rating of staff regarding how well-prepared they feel to perform their jobs |
| Sustain the Ecosystem | Provide reliable, efficient, and secure infrastructure | Average composite availability of core services; Hours of downtime with direct user impacts from an XSEDE security incident |
| | Provide excellent user support | Mean time to ticket resolution; Average user satisfaction ratings for allocations and other support services |
| | Operate an effective and productive virtual organization | Percentage of recommendations addressed by relevant project areas |
| | Operate an innovative virtual organization | Number of staff publications; Number of key improvements addressed from systematic evaluation; Number of key improvements addressed from external sources; Ratio of proactive to reactive improvements |


2    Discussion of Strategic Goals and Key Performance Indicators

The strategic goals of XSEDE cover a considerable scope. Additionally, the specific activities within our scope are often very detailed; therefore, to ensure that this significant and detailed scope will ultimately deliver our mission and realize our vision, we decompose the three strategic goals into components or sub-goals to be considered individually. 

In determining the best measures of progress toward each of the sub-goals, KPIs that correlate to impact on the scientific community are used. These often pair measurements of outcome with an assessment of quality or impact to provide both a sense of scope and significance of the supporting activities. In some cases, metrics for impact, outcome, or both are not currently being collected. In these cases, the best available metric is used. 

 

2.1       Deepen and Extend Use

XSEDE will 1) deepen the use—make more effective use—of the advanced digital services ecosystem by existing scholars, researchers, and engineers and 2) extend the use to new communities. We will 3) contribute to preparation—workforce development—of scholars, researchers, and engineers in the use of advanced digital technologies via training, education, and outreach; and we will 4) raise the general awareness of the value of advanced digital research services.

2.1.1 Deepen Use in Existing Communities

Although efforts to identify new technologies and new service providers, to evolve the e-infrastructure, and to enhance the research prowess of current and future researchers all serve to deepen use, the collaborative, yearlong work done to help research teams use the ecosystem more effectively and broadly is the best indicator of deeper use. These efforts enable increased scale and efficiency of usage and allow use of new capabilities for the delivery of science. The project has chosen three metrics (Table 2-1) that together measure the scope, quality, and impact of these activities: 1) the total number of projects completed by the Extended Collaborative Support Services (ECSS) team through work with research teams, community codes, and gateways, and the average ratings for 2) satisfaction with and 3) overall impact from the ECSS work. Satisfaction and overall impact are scores provided by the PIs of the projects on a 1-5 scale following project completion.

Table 2-1: KPIs for the sub-goal of deepen use (existing communities). 

| KPI | Program Year | Target | RP1 | RP2 | RP3 | RP4 | Total | Owner(s) |
|---|---|---|---|---|---|---|---|---|
| Average ECSS impact rating | RY5 | | | | | | | ECSS (§4) |
| | RY4 | | | | | | | |
| | RY3 | | | | | | | |
| | RY2 | 4 out of 5 / qtr | 4.11 | | | | | |
| | RY1 | 4 out of 5 / qtr | | *4.56 | 4.61 | 3.29 | 4.14 | |

 

Definition/Description: After an ECSS project is marked as completed (i.e., it has a work plan, work has progressed, and a final report has been filed), the L2 Directors interview the PIs, preferably via phone, and ask them to rate the impact of the ECSS support on their research on a scale of 1 to 5.

Collection methodology: The L2 Directors contact PIs after ECSS staff members file the final report. The L2 Director asks the PI to rate the impact of the ECSS support on their research on a scale of one to five, and this number is recorded in a spreadsheet shared with the L2 Area Project Manager (PM). The PM then transfers this number to Sciforma. The average is computed by summing each individual impact rating reported for the reporting period and dividing by the total number of interviews conducted.
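The quarterly average described above is a plain arithmetic mean of the individual interview ratings. A minimal sketch (the function and sample data are illustrative, not XSEDE's actual tooling):

```python
def average_rating(ratings):
    """Arithmetic mean of individual 1-5 ratings for a reporting period."""
    if not ratings:
        return None  # no completed-project interviews this period
    return round(sum(ratings) / len(ratings), 2)

# e.g., four hypothetical PI interviews in one quarter
quarterly_ratings = [5, 4, 4, 3]
print(average_rating(quarterly_ratings))  # 4.0
```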

| KPI | Program Year | Target | RP1 | RP2 | RP3 | RP4 | Total | Owner(s) |
|---|---|---|---|---|---|---|---|---|
| Average satisfaction with ECSS support | RY5 | | | | | | | ECSS (§4) |
| | RY4 | | | | | | | |
| | RY3 | | | | | | | |
| | RY2 | 4.5 out of 5 / qtr | 4.65 | | | | | |
| | RY1 | 4.5 out of 5 / qtr | | *4.86 | 4.72 | 4.64 | 4.54 | |

 

Definition/Description: After an ECSS project is marked as completed (i.e., it has a work plan, work has progressed, and a final report has been filed), the L2 Directors interview the PIs, preferably via phone, and ask them to rate their satisfaction with the ECSS support they have received on a scale of 1 to 5.

Collection methodology: The L2 Directors contact PIs after ECSS staff members file the final report. The L2 Director asks the PI to rate their satisfaction with the support received on a scale of one to five, and this number is recorded in a spreadsheet shared with the L2 Area Project Manager (PM). The PM then transfers this number to Sciforma. The average is computed by summing each individual satisfaction rating reported for the reporting period and dividing by the total number of interviews conducted.

| KPI | Program Year | Target | RP1 | RP2 | RP3 | RP4 | Total | Owner(s) |
|---|---|---|---|---|---|---|---|---|
| Number of completed ECSS projects | RY5 | | | | | | | ECSS (§4) |
| | RY4 | | | | | | | |
| | RY3 | | | | | | | |
| | RY2 | 50 / yr | 16 | | | | | |
| | RY1 | 50 / yr | | *10 | 13 | 25 | 48 | |

 

Definition/Description: The total number of completed ECSS projects consists of projects that were completed through work in ESRT, ESCC, and ESSGW. A completed project is defined as a project that has progressed through the complete support pipeline.  This pipeline includes steps such as assignment of a consultant, production of a work plan, execution of the work plan and reporting of progress through quarterly reports, and the filing of a final project report.

Collection methodology: The number of completed projects is tracked in Sciforma, XSEDE’s project management software. A report has been created in Sciforma that queries all the ECSS projects, filters for allocations that have ended in that quarter and also for projects that have a final report.

 * NOTE: RY1 is unique and only spans September 2016-April 2017.  RY1 RP1 doesn’t exist due to the shortened RY. RY1, RP2 only includes two months (September-October 2016). 

2.1.2 Extend Use to New Communities

New communities are defined as new fields of science, industry, and under-represented communities. New fields of science are those that represent less than 1% of XSEDE Resource Allocation Committee (XRAC) allocations. The Novel & Innovative Projects (NIP) team and the Broadening Participation team both work to bring advanced digital services to new communities. XSEDE measures both the number of new users and the number of sustained users on research projects from under-represented communities and non-traditional disciplines of XSEDE resources and services as the indicators of progress (Table 2-2).

Table 2-2: KPIs for the sub-goal of extend use (new communities).

| KPI | Program Year | Target | RP1 | RP2 | RP3 | RP4 | Total | Owner(s) |
|---|---|---|---|---|---|---|---|---|
| Number of new users from underrepresented communities and non-traditional disciplines of XSEDE resources and services | RY5 | | | | | | | ECSS – NIP (§4.3); CEE - Broadening Participation (§3.4) |
| | RY4 | | | | | | | |
| | RY3 | | | | | | | |
| | RY2 | 1,100 / yr | 352 | | | | | |
| | RY1 | > 200 / yr | | *297 | 227 | 398 | 922 | |

 

Definition/Description: This KPI tracks progress in extending participation in XSEDE by first-time users from new communities.   

Collection methodology: This is the sum of the number of new users from underrepresented communities measured by Community Engagement and Enrichment - WBS 2.1 and the number of new users from non-traditional disciplines measured by Novel and Innovative Projects - WBS 2.2.3.

| KPI | Program Year | Target | RP1 | RP2 | RP3 | RP4 | Total | Owner(s) |
|---|---|---|---|---|---|---|---|---|
| Number of sustained users from underrepresented communities and non-traditional disciplines of XSEDE resources and services | RY5 | | | | | | | ECSS – NIP (§4.3); CEE - Broadening Participation (§3.4) |
| | RY4 | | | | | | | |
| | RY3 | | | | | | | |
| | RY2 | 3,500 / yr | 1,190 | | | | | |
| | RY1 | 2,600 / yr¹ | | *773 | 755 | 1,251 | 2,779 | |

 

Definition/Description: This KPI tracks progress in extending participation in XSEDE by persistent engagement of users from new communities. 

Collection methodology: This is the sum of the number of persistent users from underrepresented communities measured by Community Engagement and Enrichment - WBS 2.1 and the number of persistent users from non-traditional disciplines measured by Novel and Innovative Projects - WBS 2.2.3. 

 * NOTE: RY1 is unique and only spans September 2016-April 2017.  RY1 RP1 doesn’t exist due to the shortened RY. RY1, RP2 only includes two months (September-October 2016). 

¹ Target has been updated as NIP updated its target for RY1 RP4.

2.1.3 Prepare the Current and Next Generation

Part of XSEDE’s mission is to provide a broad community of existing and future researchers with access and training to use advanced digital services via the sub-goal of preparing the current and next generation of computationally-savvy researchers. While many activities support this sub-goal, such as the various Champion (§3.6), Student Engagement (§3.4), and Education (§3.2) programs, the training offered through Community Engagement & Enrichment (CEE) impacts the most people directly. This, and a complementary measure of impact as indicated by those same individuals, are therefore considered the key indicators (Table 2-3) of performance toward this goal. 

Table 2-3: KPIs for the sub-goal of prepare the current and next generation

| KPI | Program Year | Target | RP1 | RP2 | RP3 | RP4 | Total | Owner(s) |
|---|---|---|---|---|---|---|---|---|
| Number of attendees in synchronous and asynchronous training | RY5 | | | | | | | CEE - Workforce Development (§3.2) |
| | RY4 | | | | | | | |
| | RY3 | | | | | | | |
| | RY2 | 5,200 / yr | 1,305 | | | | | |
| | RY1 | 6,000 / yr¹ | | *1,304 | 1,804 | 1,264 | 3,652 | |

 

Definition/Description: We will report number of attendees in two categories:

  • The total number of attendees at synchronous events (in-person and webinar), where an attendee is any XSEDE User Portal registrant who attends the event for any length of time.  For an in-person workshop, attendees are those who physically attend, and for webinars, attendees are those who sign in with their XSEDE User Portal username; and

  • The number of classes taken by all attendees, where an attendee is anyone who loaded page(s) in an online training module.  Regardless of activity level, an attendee will be counted as taking a course no more than once in a given quarter.  Only attendees who signed in with an XUP username are counted.

Collection methodology:

    • For synchronous events, information about attendees (XUP username, date, and course) who register and participate using their XSEDE User Portal username are recorded and uploaded to a central metrics database via the course coordinator.   Walk-ins are not counted.

    • For asynchronous events, attendee information (XUP username, date, and course) is pulled from the CI-Tutor and Cornell Virtual Workshop databases by the training lead and uploaded to the XSEDE User Portal database and a central metrics database. Visiting pages in one asynchronous module multiple times during a quarter counts as one participant. If this person visits the course again the next quarter, it is counted as a new class that quarter, as it is the equivalent of a person attending the same course/event a second time.
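The once-per-course-per-quarter counting rule above amounts to deduplicating page loads on the (username, course, quarter) triple. A hedged sketch (names and data are illustrative assumptions, not the actual XSEDE pipeline):

```python
def count_classes_taken(page_loads):
    """Count asynchronous classes taken: each (user, course, quarter)
    combination counts once, no matter how many pages were loaded."""
    seen = set()
    for username, course, quarter in page_loads:
        seen.add((username, course, quarter))
    return len(seen)

loads = [
    ("alice", "MPI Basics", "2017Q1"),
    ("alice", "MPI Basics", "2017Q1"),  # repeat visit, same quarter: not recounted
    ("alice", "MPI Basics", "2017Q2"),  # same course, next quarter: a new class
    ("bob",   "OpenMP",     "2017Q1"),
]
print(count_classes_taken(loads))  # 3
```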

| KPI | Program Year | Target | RP1 | RP2 | RP3 | RP4 | Total | Owner(s) |
|---|---|---|---|---|---|---|---|---|
| Average impact assessment of training for attendees registered through the XSEDE User Portal | RY5 | | | | | | | CEE - Workforce Development (§3.2) |
| | RY4 | | | | | | | |
| | RY3 | | | | | | | |
| | RY2 | 4 out of 5 / qtr | 4.29 | | | | | |
| | RY1 | 4 out of 5 / qtr | | *4.54 | 4.39 | 4.28 | 4.36 | |

 

Definition/Description: This is the average of all attendees’ self-ratings of each event they attend, on a scale of 1 to 5.

Collection methodology: The data is collected from post event surveys and recorded in the database. Specific items included in this index are:

    • Q1. The training session fulfilled my expectations.

    • Q2. The trainer stimulated my interest.

    • Q6. The training session was well-organized.

    • Q8. I was able to easily access this training session.

    • Q10. Overall I would rate my experience as successful.

 * NOTE: RY1 is unique and only spans September 2016-April 2017.  RY1 RP1 doesn’t exist due to the shortened RY. RY1, RP2 only includes two months (September-October 2016). 

¹ The target total for synchronous and asynchronous attendees (2,000 and 4,000, respectively) was previously listed as 5,600 / yr. This was an error. The target total should be 6,000.

2.1.4 Raise Awareness of the Value of Advanced Digital Services

While many PY1 activities, such as our Workforce Development (§3.2), User Engagement (§3.3), and Broadening Participation (§3.4) efforts, and the visibility of our Champions and other Campus Engagement (§3.6) activities, contribute to raising the general awareness of the value of advanced digital research services, we have chosen to focus on measures in four areas (Table 2-4): website, social media, public relations, and media hits. Desirable trends in these key outcomes can be correlated to success for this sub-goal.

Table 2-4: KPIs for the sub-goal of raise awareness of the value of advanced digital research services

| KPI | Program Year | Target | RP1 | RP2 | RP3 | RP4 | Total | Owner(s) |
|---|---|---|---|---|---|---|---|---|
| Number of pageviews to the XSEDE website | RY5 | | | | | | | Community Engagement and Enrichment - UII (§3.5) |
| | RY4 | | | | | | | |
| | RY3 | | | | | | | |
| | RY2 | 80,000 / qtr | 63,998 | | | | | |
| | RY1 | 80,000 / qtr | | *49,409 | 65,157 | 68,227 | 183,212 | |

 

Definition/Description: Per Google Analytics, “a pageview is defined as a view of a page on your site that is being tracked by the Analytics tracking code. If a user clicks reload after reaching the page, this is counted as an additional pageview. If a user navigates to a different page and then returns to the original page, a second pageview is recorded as well.”

Collection methodology: This is the number reported in the Google Analytics tracking software for the given time period.

| KPI | Program Year | Target | RP1 | RP2 | RP3 | RP4 | Total | Owner(s) |
|---|---|---|---|---|---|---|---|---|
| Number of pageviews to the XSEDE User Portal | RY5 | | | | | | | Community Engagement and Enrichment - UII (§3.5) |
| | RY4 | | | | | | | |
| | RY3 | | | | | | | |
| | RY2 | 250,000 / qtr | 256,482 | | | | | |
| | RY1 | 100,000 / qtr | | *183,408 | 219,644 | 265,151 | 670,080 | |

 

Definition/Description: Per Google Analytics, “a pageview is defined as a view of a page on your site that is being tracked by the Analytics tracking code. If a user clicks reload after reaching the page, this is counted as an additional pageview. If a user navigates to a different page and then returns to the original page, a second pageview is recorded as well.”

Collection methodology: This is the number reported in the Google Analytics tracking software for the given time period.

| KPI | Program Year | Target | RP1 | RP2 | RP3 | RP4 | Total | Owner(s) |
|---|---|---|---|---|---|---|---|---|
| Number of social media impressions | RY5 | | | | | | | Program Office - ER (§8.2) |
| | RY4 | | | | | | | |
| | RY3 | | | | | | | |
| | RY2 | 300,000 / yr | 69,607 | | | | | |
| | RY1 | 190,000 / yr | | *52,200 | 128,675 | 81,332 | 262,207 | |

 

Definition/Description: Number of people who have seen XSEDE interactions on Facebook and Twitter. Both Facebook and Twitter count “impressions,” which is essentially how many people saw a certain post. A user of Facebook or Twitter no longer has to directly “like” or “retweet” a post to be included in this number; “impressions” refers to people who see the post because their friends or followers shared our information. It is a better way of measuring social media awareness than counting only “followers” or “shares.”

Collection methodology: Facebook and Twitter both have internal methods of tracking metrics - we simply go to those sites and find the “impressions” numbers we need at the end of each reporting cycle. Anyone with access to the FB and Twitter pages can easily gather this information.

| KPI | Program Year | Target | RP1 | RP2 | RP3 | RP4 | Total | Owner(s) |
|---|---|---|---|---|---|---|---|---|
| Number of media hits | RY5 | | | | | | | Program Office - ER (§8.2) |
| | RY4 | | | | | | | |
| | RY3 | | | | | | | |
| | RY2 | 169 / yr | 42 | | | | | |
| | RY1 | 140 / yr | | *32 | 30 | 18 | 80 | |

 

Definition/Description: This is the number of XSEDE-related stories we find in the media that mention XSEDE by name, often through Google Alerts for “XSEDE.” We also often search for “XSEDE” manually, in case the daily alert email is missed.

Collection methodology: NCSA tracks media hits for both NCSA and XSEDE, which provides a baseline count. The ER team can then do additional manual searching; for example, hits from HPCwire are often not collected on the NCSA page, so an additional manual count is needed.

 * NOTE: RY1 is unique and only spans September 2016-April 2017.  RY1 RP1 doesn’t exist due to the shortened RY. RY1, RP2 only includes two months (September-October 2016). 

2.2       Advance the Ecosystem

Exploiting its internal efforts and drawing on those of others, XSEDE will advance the broader ecosystem of advanced digital services by 1) creating an open and evolving e-infrastructure, and by 2) enhancing the array of technical expertise and support services offered. 

2.2.1 Create an Open and Evolving e-Infrastructure

There are a variety of factors that affect the evolution of the e-infrastructure. These range from external factors, such as the number of XSEDE Federation members and the variety of services they provide, to internal factors, like Operations (§6) of critical infrastructure and services and the evaluation and integration of new capabilities. While we actively seek new Federation members and Service Providers, and partnerships with national and international cyberinfrastructure projects, we view our role as connector of these elements to be the most impactful. Thus, XSEDE focuses on the number of new capabilities made available for production deployment along with the satisfaction rating of all XSEDE Community Infrastructure (XCI) services as indicators of performance with respect to this sub-goal (Table 2-5). 

Table 2-5: KPIs for the sub-goal of create an open and evolving e-infrastructure

| KPI | Program Year | Target | RP1 | RP2 | RP3 | RP4 | Total | Owner(s) |
|---|---|---|---|---|---|---|---|---|
| Number of new capabilities made available for production deployment | RY5 | | | | | | | XCI (§5) |
| | RY4 | | | | | | | |
| | RY3 | | | | | | | |
| | RY2 | 7 / yr | 1 | | | | | |
| | RY1 | 7 / yr | | *0 | 0 | 6 | 6 | |

 

Definition/Description: New use cases that have been fully enabled using new or enhanced components and made available through completed RACD CDP activities.

Collection methodology: Count is done manually and maintained in JIRA.

| KPI | Program Year | Target | RP1 | RP2 | RP3 | RP4 | Total | Owner(s) |
|---|---|---|---|---|---|---|---|---|
| Average satisfaction rating of XCI services | RY5 | | | | | | | XCI (§5) |
| | RY4 | | | | | | | |
| | RY3 | | | | | | | |
| | RY2 | 4.7 out of 5 / yr | 4.25¹ | | | | | |
| | RY1 | 4 out of 5 / yr | | *4.8 | 4.81 | 4.5 | 4.5 | |

 

Definition/Description: Customer satisfaction rating of XCI staff-provided services (does not include software services, which are covered in RACD metrics).

Collection methodology: Customer satisfaction survey(s).

 * NOTE: RY1 is unique and only spans September 2016-April 2017.  RY1 RP1 doesn’t exist due to the shortened RY. RY1, RP2 only includes two months (September-October 2016). 

¹ This number is for RACD only, as XCRI satisfaction numbers are unavailable for this reporting period.

 

2.2.2 Enhance the Array of Technical Expertise and Support Services

To enhance the technical expertise of our staff to offer an evolving set of support services, we will continue many activities including workshops, symposia, and training events hosted by Extended Collaborative Support Services (ECSS) and Service Providers (§4.6). The average rating by staff for training will continue to be a KPI; the staff climate survey question that measures this has been changed to focus on staff having the training they need, instead of staff turning to XSEDE training to fulfill these needs (Table 2-6). This represents a change in approach to staff training that reflects budget constraints, where we will leverage existing training offered by the Service Providers, universities, and professional associations alongside our own training to enhance the expertise of staff.

Table 2-6: KPIs for the sub-goal of enhance the array of technical expertise and support services.

| KPI | Program Year | Target | RP1 | RP2 | RP3 | RP4 | Total | Owner(s) |
|---|---|---|---|---|---|---|---|---|
| Average rating of staff regarding how well-prepared they feel to perform their jobs | RY5 | | | | | | | Program Office - Strategic Planning, Policy & Evaluation (§8.5) |
| | RY4 | | | | | | | |
| | RY3 | | | | | | | |
| | RY2 | 4 out of 5 / yr | - | | | | | |
| | RY1 | 4 out of 5 / yr | | *-^ | - | - | - | |

 

Definition/Description: The annual Staff Climate Study is a survey administered and analyzed by an external evaluation team as part of an effort to understand whether XSEDE staff feel adequately prepared to conduct their XSEDE work. The survey includes items regarding staff training.

Collection methodology: Respondents rate closed-ended Likert scale items regarding training on a scale of 1 (strongly disagree) to 5 (strongly agree). The average of all staff responses for a given year is calculated to determine this metric. The specific item used to calculate this metric is Q2m: “I have access to adequate training to conduct my XSEDE-related work.”

 * NOTE: RY1 is unique and only spans September 2016-April 2017.  RY1 RP1 doesn’t exist due to the shortened RY. RY1, RP2 only includes two months (September-October 2016). 

- Data reported annually.  Due to the shortened reporting year, the annual XSEDE Staff Climate Study did not occur during RY1. 

^ The number 3.70 was erroneously reported in RY1, RP2.

 

2.3       Sustain the Ecosystem

XSEDE will sustain the advanced digital services ecosystem by 1) ensuring and maintaining a reliable, efficient, and secure infrastructure, and 2) providing excellent user support services. Furthermore, XSEDE will operate an 3) effective, 4) productive, and 5) innovative virtual organization.

2.3.1 Provide Reliable, Efficient, and Secure Infrastructure

Many activities support this sub-goal—such as User Information & Interfaces (§3.5), Security (§6.2), Data Transfer Services (§6.3), Systems Operations and Support (§6.5), support for Allocations (§7.2), and Allocations, Accounting & Account Management (§7.3)—but perhaps the truest measure of an infrastructure’s reliability is its robustness as reflected by sustained availability. Thus, the KPIs for this sub-goal are a composite measure of the availability of XSEDE infrastructure components and the number of hours of downtime with direct user impacts from security incidents (Table 2-7). The composite measure is a geometric mean of the availability of critical enterprise services and the XRAS allocations request management service.

Table 2-7: KPIs for the sub-goal of provide reliable, efficient, and secure infrastructure

| KPI | Program Year | Target | RP1 | RP2 | RP3 | RP4 | Total | Owner(s) |
|---|---|---|---|---|---|---|---|---|
| Average composite availability of core services | RY5 | | | | | | | Operations (§6); RAS (§7) |
| | RY4 | | | | | | | |
| | RY3 | | | | | | | |
| | RY2 | 99.9% / qtr | 99.8% | | | | | |
| | RY1 | 99% / qtr | | *99.9% | 99.9% | 99.9% | 99.9% | |

 

Definition/Description: The percent average composite availability of core services is the geometric mean of % core enterprise services availability and % POPS/XRAS availability. Because the availability percentage of each of these components is measured separately, they are aggregated and then averaged using a geometric mean to determine the composite availability.

Collection methodology: See the individual collection methodologies for each component in the responsible WBS area below: Core enterprise services (Systems Operational Support 2.4.5) and POPS/XRAS (AA&AM 2.5.3).
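The composite calculation just described, a geometric mean of the two component availabilities, can be sketched as follows. The component percentages below are hypothetical illustrations, not reported figures:

```python
import math

def composite_availability(enterprise_pct, xras_pct):
    """Geometric mean of the two component availability percentages:
    sqrt(core enterprise services % * POPS/XRAS %)."""
    return math.sqrt(enterprise_pct * xras_pct)

# hypothetical quarter: 99.95% enterprise services, 99.85% POPS/XRAS
print(round(composite_availability(99.95, 99.85), 2))  # 99.9
```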

| KPI | Program Year | Target | RP1 | RP2 | RP3 | RP4 | Total | Owner(s) |
|---|---|---|---|---|---|---|---|---|
| Hours of downtime with direct user impacts from XSEDE security incidents | RY5 | | | | | | | Operations - Cybersecurity (§6.2) |
| | RY4 | | | | | | | |
| | RY3 | | | | | | | |
| | RY2 | 0 / qtr | 0 | | | | | |
| | RY1 | < 24 / qtr | | *0 | 146 | 0 | 146 | |

 

Definition/Description: This metric will measure resource unavailability for users as a result of an XSEDE-wide security event (involving a Tier 1 service or spanning more than one SP). The metric will be calculated as the Time to Repair (TTR) summed across all applicable services and incidents during the quarter.

Collection methodology: The Cybersecurity co-leads determine that an XSEDE-wide security event is responsible for an outage. The Sysops lead will determine the TTR based on monitoring and other information in logs and tickets, and the TTR value will be the data point reported.
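Since the quarterly value is simply the TTR summed over all applicable services and incidents, the arithmetic can be sketched as below. The service names and hour figures are hypothetical, not drawn from any actual XSEDE incident:

```python
def quarterly_downtime_hours(incidents):
    """Sum time-to-repair hours across all applicable services and
    incidents attributed to XSEDE-wide security events in a quarter."""
    return sum(hours for _service, hours in incidents)

# hypothetical quarter: one event affecting two Tier 1 services
incidents = [("SSO hub", 12.0), ("data transfer service", 4.5)]
print(quarterly_downtime_hours(incidents))  # 16.5
```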

 * NOTE: RY1 is unique and only spans September 2016-April 2017.  RY1 RP1 doesn’t exist due to the shortened RY. RY1, RP2 only includes two months (September-October 2016). 

2.3.2 Provide Excellent User Support 

Although nearly every group in the organization has some support function, we have chosen to focus on metrics with respect to two primary support interfaces to the community: the XSEDE Operations Center (XOC) metrics with respect to ticket resolution times and the average satisfaction rating of the allocations process. The XOC is the frontline centralized support group that either resolves or escalates tickets to the appropriate resolution center depending on the request. Two KPIs are measured (Table 2-8): the mean time to resolution on support tickets that are resolved by the XOC or routed to, and resolved by, other XSEDE areas, and the average satisfaction rating for the allocations process measured via a quarterly survey of users who have interacted with the allocations request system and the allocations process more generally.

Table 2-8: KPIs for the sub-goal of provide excellent user support

KPI | Program Year | Target | RP1 | RP2 | RP3 | RP4 | Total | Owner(s)

Mean time to ticket resolution (hours)
Owner(s): Operations (§6)

RY | Target | RP1 | RP2 | RP3 | RP4 | Total
RY 5 |  |  |  |  |  | 
RY 4 |  |  |  |  |  | 
RY 3 |  |  |  |  |  | 
RY 2 | < 24 / qtr | 26 |  |  |  | 
RY 1 | < 24 / qtr | * | 24.0 | 28.2 | 23.1 | 25.1

 

Definition/Description: XSEDE-funded areas are measured to ensure that, on average, tickets are resolved in a timely manner. This includes all XSEDE WBS ticket queues as well as the XSEDE Operations Center (XOC) queue; it does NOT apply to Service Provider ticket queues. End users expect their issues to be addressed within a reasonable amount of time, so the target is a resolution time under 24 hours.

Collection methodology: Ticket data is stored in the Request Tracker (RT) service, the XSEDE ticketing software hosted at TACC. Reports are generated against the RT database to determine MTTR and other metrics.
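A report of this kind reduces to averaging resolution times over the period. A minimal sketch (the ticket record layout here is illustrative, not RT's actual schema):

```python
from datetime import datetime

def mean_time_to_resolution(tickets):
    """Mean resolution time, in hours, over tickets given as
    (created, resolved) ISO-8601 timestamp pairs.  This mirrors the
    kind of report run against the RT database; the data layout is
    a stand-in, not RT's real schema.
    """
    hours = [
        (datetime.fromisoformat(resolved) - datetime.fromisoformat(created)
         ).total_seconds() / 3600.0
        for created, resolved in tickets
    ]
    return sum(hours) / len(hours)

tickets = [("2017-05-01T09:00:00", "2017-05-01T21:00:00"),   # 12 h
           ("2017-05-02T10:00:00", "2017-05-03T22:00:00")]   # 36 h
print(mean_time_to_resolution(tickets))  # averages to 24.0 hours
```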

User satisfaction with allocations process and other support services
Owner(s): RAS (§7)

RY | Target | RP1 | RP2 | RP3 | RP4 | Total
RY 5 |  |  |  |  |  | 
RY 4 |  |  |  |  |  | 
RY 3 |  |  |  |  |  | 
RY 2 | 4 out of 5 / yr | 4.08 |  |  |  | 
RY 1 | 4 out of 5 / yr | * | 3.98 | 4.03 | 4.03 | 4.01

 

Definition/Description: Average of the two area metrics "average user satisfaction with XRAS" and "average user satisfaction with the allocations process." The former measures users’ satisfaction with XRAS, the primary tool for submission, review, and administration of allocation requests. The latter measures users’ satisfaction with allocation policies and procedures for the submission, review, and administration of allocation requests.

Collection methodology: Because this is a composite KPI that consists of an AM from XSEDE Allocations Process & Policies (WBS 2.5.2) and an AM from Allocations, Accounting & Account Management (WBS 2.5.3), see those sections below for how the data is collected for each AM.

 * NOTE: RY1 is unique and only spans September 2016-April 2017.  RY1 RP1 doesn’t exist due to the shortened RY. RY1, RP2 only includes two months (September-October 2016). 

2.3.3 Effective and Productive Virtual Organization

During the first five years of XSEDE, in conjunction with developing a methodology for driving and assessing performance excellence, XSEDE adopted the Baldrige Criteria4 and has assessed and applied the applicable elements of all seven criteria categories through that methodology. These include annual reviews of the vision, mission, strategic goals, and project-wide processes and standards (KPIs and area metrics); user and staff surveys (§3.3, §8.5); stakeholder communications (§8.2); advisory boards (§8.1); community engagement (§3); workforce enhancement (§3.2); and the analysis of organizational data that leads to organizational learning, strategic improvement, and innovation. Thus, XSEDE has chosen to monitor its continuous improvement efforts as an organizational measurement of recommendations addressed by the relevant project areas (Table 2-9). The rationale is that a stable or expanding number of improvements shows that XSEDE continues to systematically evaluate the organization and make informed, proactive improvements.

Table 2-9: KPIs for the sub-goal of operate an effective and productive virtual organization.

KPI | Program Year | Target | RP1 | RP2 | RP3 | RP4 | Total | Owner(s)

Percentage of recommendations addressed by relevant project areas
Owner(s): Program Office - Strategic Planning, Policy & Evaluation (§8.5)

RY | Target | RP1 | RP2 | RP3 | RP4 | Total
RY 5 |  |  |  |  |  | 
RY 4 |  |  |  |  |  | 
RY 3 |  |  |  |  |  | 
RY 2 | 90% / qtr | 23% |  |  |  | 
RY 1 | 90% / qtr | * | NA | 100% | 57% | 67%

 

Definition/Description: The SP&E team will calculate this metric from two measurements: the total key recommendations made and the total key recommendations addressed. The total addressed is divided by the total made at the point of data capture (the end of the IPR reporting period).

Collection methodology: The total made is the number of key recommendations from the annual XSEDE Climate Study findings plus the total number of recommendations recorded on the XSEDE Project-Wide Improvements & Recommendations Google Sheet (e.g., XAB, NSF, and SP&E recommendations). XSEDE staff, not the SP&E team, are accountable for entering the information used to track improvements and recommendations.

NOTE: An improvement made is not necessarily a recommendation addressed; but a recommendation addressed is an improvement made and therefore should also be tracked in the XSEDE Project-Wide Improvements & Recommendations Google Sheet once fully implemented; it is OK to have duplication between Improvements fully implemented and Recommendations addressed.

 * NOTE: RY1 is unique and only spans September 2016-April 2017.  RY1 RP1 doesn’t exist due to the shortened RY. RY1, RP2 only includes two months (September-October 2016). 

2.3.4 Innovative Virtual Organization

Measuring innovation for an organization like XSEDE (or for organizations in general) is difficult and remains an area of open research. A partial measure is the number of staff publications produced, since this shows that XSEDE staff are involved in novel activities that achieve peer-reviewed publication. Additionally, after much thought and discussion both internally and with external stakeholders and advisors, we have identified two additional indicators that strongly correlate with innovation: 1) the ratio of proactive organizational improvements to reactive ones, and 2) the number of improvements that are innovative or lead to innovations. The first indicator is a measurement of organizational maturity and agility; the second measures innovative actions directly (Table 2-10). While these provide some indication of innovation, they are still not fully satisfactory. These KPIs will continue to be the subject of an open conversation within the organization and with stakeholders and advisors as XSEDE assesses these measurements and how best to quantify innovation. In particular, this will be discussed in earnest within the Strategy, Planning, Policy, Evaluation & Organizational Improvement team (§8.5).

Table 2-10: KPIs for the sub-goal of operate an innovative organization

KPI | Program Year | Target | RP1 | RP2 | RP3 | RP4 | Total | Owner(s)

Number of staff publications
Owner(s): Program Office (§8)

RY | Target | RP1 | RP2 | RP3 | RP4 | Total
RY 5 |  |  |  |  |  | 
RY 4 |  |  |  |  |  | 
RY 3 |  |  |  |  |  | 
RY 2 | 20 / yr | 2 |  |  |  | 
RY 1 | 70 / yr | * | 5 | 0 | 13 | 18

 

Definition/Description:  

Collection methodology: When XSEDE staff members have a publication, they enter the publication data into their profile on the XUP. This data is then summed for the appropriate reporting period.

 

Number of key improvements addressed from systematic evaluation1
Owner(s): Program Office - Strategic Planning, Policy & Evaluation (§8.5)

RY | Target | RP1 | RP2 | RP3 | RP4 | Total
RY 5 |  |  |  |  |  | 
RY 4 |  |  |  |  |  | 
RY 3 |  |  |  |  |  | 
RY 2 | 10 / yr | NA3 |  |  |  | 
RY 1 | * | * | * | * | 19 | 19

 

Definition/Description: It is essential for teams to include improvements implemented based upon responses to formal survey results such as the Staff Climate survey. The following sources will be used as recommendation data:
- Ticket Survey
- Climate Survey
- User Survey
- XCI Survey
- Micro Surveys
- XRAS post-allocation survey

Collection methodology: Number of improvements addressed during the quarter and logged on the XSEDE Project-Wide Improvements & Recommendations Google Sheet. There may be future development in JIRA for tracking.

NOTE: An improvement made is not necessarily a recommendation addressed; but a recommendation addressed is an improvement made and therefore should also be tracked in the XSEDE Project-Wide Improvements & Recommendations Google Sheet once fully implemented; it is OK to have duplication between Improvements fully implemented and Recommendations addressed.

Number of key improvements addressed from external sources1
Owner(s): Program Office - Strategic Planning, Policy & Evaluation (§8.5)

RY | Target | RP1 | RP2 | RP3 | RP4 | Total
RY 5 |  |  |  |  |  | 
RY 4 |  |  |  |  |  | 
RY 3 |  |  |  |  |  | 
RY 2 | 10 / yr | 2 |  |  |  | 
RY 1 | * | * | * | * | 11 | 11

 

Definition/Description: It is essential for teams to include improvements implemented based upon responses to feedback from such sources as XSEDE advisory bodies. The following will be used as recommendation data:
- XAB
- UAC
- SP Forum
- NSF

Collection methodology: Number of improvements addressed during the quarter and logged on the XSEDE Project-Wide Improvements & Recommendations Google Sheet. There may be future development in JIRA for tracking.

NOTE: An improvement made is not necessarily a recommendation addressed; but a recommendation addressed is an improvement made and therefore should also be tracked in the XSEDE Project-Wide Improvements & Recommendations Google Sheet once fully implemented; it is OK to have duplication between Improvements fully implemented and Recommendations addressed.

Ratio of proactive to reactive improvements
Owner(s): Program Office - Strategic Planning, Policy & Evaluation (§8.5)

RY | Target | RP1 | RP2 | RP3 | RP4 | Total
RY 5 |  |  |  |  |  | 
RY 4 |  |  |  |  |  | 
RY 3 |  |  |  |  |  | 
RY 2 | 4:1 / yr | 1:1 |  |  |  | 
RY 1 | 3:1 / yr | * | 2:1 | 27:1 | 8:9 | 17:11

 

Definition/Description: The ratio of proactive to reactive improvements is a measurement of organizational maturity and agility. An improvement is classified as being reactive if the change was due to a problem that occurred naturally and had to be corrected for the process to continue. A proactive improvement is an improvement that is made in anticipation of future problems, needs, or changes.

Collection methodology: Process improvements are tracked via self-reporting: each quarter, all WBS Level 3 managers and WBS Level 2 Directors are queried by e-mail and asked to report any process improvements they have implemented in the preceding quarter. This information is collected in a Google spreadsheet and totaled each quarter. From this list, each improvement is classified as either proactive or reactive based on the definitions above.
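Given such a classified list, computing the quarterly KPI is a simple tally. A minimal sketch (the record layout and classification labels are illustrative of the spreadsheet, not its actual columns):

```python
from collections import Counter

def improvement_ratio(improvements):
    """Tally self-reported improvements classified as 'proactive' or
    'reactive' and return the ratio as a string such as '3:1'.
    The (description, classification) record layout is illustrative.
    """
    counts = Counter(kind for _, kind in improvements)
    return f"{counts['proactive']}:{counts['reactive']}"

# Hypothetical quarter: three proactive improvements, one reactive.
improvements = [
    ("automated backup verification", "proactive"),
    ("patched login outage", "reactive"),
    ("new onboarding checklist", "proactive"),
    ("pre-staged spare hardware", "proactive"),
]
print(improvement_ratio(improvements))  # 3:1
```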

 * NOTE: RY1 is unique and only spans September 2016-April 2017.  RY1 RP1 doesn’t exist due to the shortened RY. RY1, RP2 only includes two months (September-October 2016). 

1This metric is one of two new metrics that resulted from splitting a former metric, called "Number of strategic or innovative improvements," into two. The former metric had a target of 9/yr and results were 3 for RP2 and 8 for RP3 in RY1. These two new metrics each have a target of 10/yr and will be reported annually. 

2Number was updated from what was originally reported due to reporting error.

3L2 Directors are currently responding to climate study recommendations; data will be available in RY2 RP2. 

[Table of Contents]

3       Community Engagement & Enrichment - CEE (WBS 2.1) 

At the core of Community Engagement & Enrichment (CEE) is the researcher, broadly defined to include anyone who uses or may potentially use the array of resources and services offered by XSEDE. The CEE team is dedicated to actively engaging a broad and diverse cross-section of the open science community, bringing together those interested in using, integrating with, enabling, and enhancing the national cyberinfrastructure. Vital to the CEE mission is the persistent relationship with existing and future users, including allocated users, training participants, XSEDE collaborators, and campus personnel. CEE will unify public offerings to provide a more consistent, clear, and concise message about XSEDE resources and services, and bring together those aspects of XSEDE that have as their mission teaching, informing, and engaging those interested in advanced cyberinfrastructure.

The five components of CEE are Workforce Development (§3.2), which includes Training, Education and Student Preparation, User Engagement (§3.3), Broadening Participation (§3.4), User Interfaces & Online Information (§3.5) and Campus Engagement (§3.6). These five teams ensure routine collection and reporting of XSEDE’s actions to address user requirements. They provide a consistent suite of web-based information and documentation and engage with a broad range of campus personnel to ensure that XSEDE’s resources and services complement those offered by campuses. Additionally, CEE teams expand workforce development efforts to enable many more researchers, faculty, staff, and students to make effective use of local, regional, and national advanced digital resources. CEE expands efforts to broaden the diversity of the community utilizing advanced digital resources. The CEE team tightly coordinates with the rest of XSEDE, particularly Extended Collaborative Support Services (§4), Resource Allocation Services (§7), XSEDE Community Infrastructure (§5), and External Relations (§8.2). 

CEE is focused on personal interactions, ensuring that existing users, potential users and the general public have sufficient access to materials and have a positive and effective experience with XSEDE public offerings and frontline user support. As such, the CEE area metrics are designed to broadly assess this performance. CEE focuses on metrics that quantify how many users in aggregate are benefiting from XSEDE resources and services. Additionally, CEE focuses on how well the user base is sustained over time and how well training offerings evolve with changing user community needs.

Table 3-1: Area Metrics for Community Engagement & Enrichment 

Area Metric | Program Year | Target | RP1 | RP2 | RP3 | RP4 | Total | Goal Supported

Number of new users of XSEDE resources and services
Goal Supported: Deepen/Extend – Extend use to new communities

RY | Target | RP1 | RP2 | RP3 | RP4 | Total
RY 5 |  |  |  |  |  | 
RY 4 |  |  |  |  |  | 
RY 3 |  |  |  |  |  | 
RY 2 | 2,000 / qtr | 2,305 |  |  |  | 
RY 1 | > 1,000 / qtr | * | 1,973¹ | 1,849¹ | 2,359 | 6,181

 

Definition/Description: This is the number of portal accounts created within the selected time period.

Collection methodology: This is the number of portal accounts created with Liferay. This is queried in the Liferay database by looking in the Users table and seeing how many accounts have a timestamp in the created field between the selected dates.
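The described query can be sketched as follows. The snippet uses an in-memory SQLite stand-in with the Users table and created field named in the description; a real deployment would run the equivalent query against the Liferay database:

```python
import sqlite3

# Illustrative stand-in for the Liferay query described above: count
# accounts whose `created` timestamp falls between the selected dates.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Users (userId INTEGER, created TEXT)")
conn.executemany("INSERT INTO Users VALUES (?, ?)",
                 [(1, "2017-01-15"), (2, "2017-02-20"), (3, "2017-05-01")])

new_users, = conn.execute(
    "SELECT COUNT(*) FROM Users WHERE created >= ? AND created < ?",
    ("2017-01-01", "2017-04-01"),
).fetchone()
print(new_users)  # 2 accounts created in the selected period
```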

Number of sustained users of XSEDE resources and services
Goal Supported: Deepen/Extend – Deepen use to existing communities

RY | Target | RP1 | RP2 | RP3 | RP4 | Total
RY 5 |  |  |  |  |  | 
RY 4 |  |  |  |  |  | 
RY 3 |  |  |  |  |  | 
RY 2 | 3,000 / qtr | 3,962 |  |  |  | 
RY 1 | > 5,000 / qtr | * | 4,755 | 4,446¹ | 4,924 | 6,186

 

Definition/Description: This is the number of active users logged into the XSEDE User Portal within the selected time period.

Collection methodology: There is a usage log parsing script that collects the tomcat logs and parses out the relevant data during the selected time period.

Number of new users from underrepresented communities using XSEDE resources and services (KPI - composite)
Goal Supported: Deepen/Extend – Extend use to new communities

RY | Target | RP1 | RP2 | RP3 | RP4 | Total
RY 5 |  |  |  |  |  | 
RY 4 |  |  |  |  |  | 
RY 3 |  |  |  |  |  | 
RY 2 | 150 / qtr | 251 |  |  |  | 
RY 1 | > 100 / qtr | * | 150 | 135 | 240 | 525

 

Definition/Description: First-time awardees of XSEDE compute resources, or an “allocation” (i.e., Startup, Research, Education, etc.), from underrepresented communities are tracked. These communities include women and racial/ethnic domestic minorities in HPC, as well as anyone from a Minority Serving Institution (MSI) as defined by the Carnegie Classification of Institutions of Higher Education.

Collection methodology: XSEDE portal users are able to specify their race/ethnicity, gender, and institution in their user profile and are encouraged to complete this data when registering for training events through the XSEDE portal. The date of allocation awards is also recorded in the XSEDE database (XDCDB). A query is then run to generate the number of individuals meeting the aforementioned criteria within a given quarter. Care is taken not to overrepresent this metric by ensuring that an individual who meets multiple criteria is counted only once.
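The care taken to count each individual only once amounts to a set-based deduplication across criteria. A minimal sketch, with made-up profile fields standing in for the XDCDB query:

```python
def count_new_underrepresented(awardees):
    """Count first-time awardees who meet at least one
    underrepresentation criterion, counting each individual once even
    if they meet several.  The record fields are invented stand-ins
    for the XDCDB profile data, not its real schema.
    """
    qualifying = set()
    for person in awardees:
        if (person.get("gender") == "female"
                or person.get("minority")
                or person.get("msi_institution")):
            qualifying.add(person["username"])
    return len(qualifying)

awardees = [
    {"username": "alice", "gender": "female", "msi_institution": True},  # meets two criteria, counted once
    {"username": "bob", "minority": True},
    {"username": "carol", "gender": "female"},
    {"username": "dave"},  # meets no criterion
]
print(count_new_underrepresented(awardees))  # 3
```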

Number of sustained users from underrepresented communities using XSEDE resources and services (KPI - composite)
Goal Supported: Deepen/Extend – Deepen use to existing communities

RY | Target | RP1 | RP2 | RP3 | RP4 | Total
RY 5 |  |  |  |  |  | 
RY 4 |  |  |  |  |  | 
RY 3 |  |  |  |  |  | 
RY 2 | 1,500 / yr | 490 |  |  |  | 
RY 1 | > 1,000 / yr | * | 322 | 238 | 535 | 1,095

 

Definition/Description: Awardees of XSEDE compute resources, or an “allocation” (i.e., Startup, Research, Education, etc.), from underrepresented communities who access their allocation during the time period are tracked. These communities include women and racial/ethnic domestic minorities in HPC, as well as anyone from a Minority Serving Institution (MSI) as defined by the Carnegie Classification of Institutions of Higher Education.

Collection methodology: XSEDE portal users are able to specify their race/ethnicity, gender, and institution in their user profile and are encouraged to complete this data when registering for training events through the XSEDE portal. The date of allocation awards is also recorded in the XSEDE database (XDCDB). A query is then run to generate the number of individuals meeting the aforementioned criteria within a given quarter. Care is taken not to overrepresent this metric by ensuring that an individual who meets multiple criteria is counted only once.

Number of attendees in synchronous and asynchronous training
Goal Supported: Deepen/Extend – Prepare the current and next generation

RY | Target | RP1 | RP2 | RP3 | RP4 | Total
RY 5 |  |  |  |  |  | 
RY 4 |  |  |  |  |  | 
RY 3 |  |  |  |  |  | 
RY 2 | 5,200 / yr | 1,305 |  |  |  | 
RY 1 | > 6,000 / yr² | * | 1,304 | 1,084 | 1,264 | 3,652

 

Definition/Description:

  • We will report the number of attendees in two categories:

    • The total number of attendees at synchronous events (in-person and webinar), where an attendee is any XSEDE User Portal registrant who attends the event for any length of time.  For an in-person workshop, attendees are those who physically attend, and for webinars, attendees are those who sign in with their XSEDE User Portal username; and

    • The number of classes taken by all attendees, where an attendee is anyone who loaded page(s) in an online training module.  Regardless of activity level, an attendee will be counted as taking a course no more than once in a given quarter.  Only attendees who signed in with an XUP username are counted

Collection methodology:

    • For synchronous events, information about attendees (XUP username, date, and course) who register and participate using their XSEDE User Portal username is recorded and uploaded to a central metrics database via the course coordinator. Walk-ins are not counted.

    • For asynchronous events, attendee information (XUP username, date, and course) is pulled from CI-Tutor and Cornell Virtual Workshop databases and uploaded to the XSEDE User Portal database by the training lead and is uploaded to a central metrics database. Visiting pages in one asynchronous module multiple times during a quarter count as one participant. If this person visits the course again the next quarter, it is counted as a new class that quarter, as it’s the equivalent of a person attending the same course/event a second time.
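The once-per-quarter counting rule for asynchronous attendees amounts to deduplicating (user, course, quarter) combinations. A sketch, with an invented record layout standing in for the CI-Tutor / Cornell Virtual Workshop exports:

```python
def classes_taken(pageviews):
    """Count asynchronous 'class takings': each (username, course,
    quarter) combination counts once per quarter no matter how many
    pages were loaded, and a revisit in a later quarter counts again.
    The triple layout is illustrative of the exported data.
    """
    return len({(user, course, quarter) for user, course, quarter in pageviews})

pageviews = [
    ("alice", "MPI Basics", "RY1-RP2"),
    ("alice", "MPI Basics", "RY1-RP2"),  # repeat visit, same quarter: not recounted
    ("alice", "MPI Basics", "RY1-RP3"),  # next quarter: counts again
    ("bob",   "OpenMP",     "RY1-RP2"),
]
print(classes_taken(pageviews))  # 3
```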


Average impact assessment of training for attendees registered through XSEDE User Portal
Goal Supported: Deepen/Extend – Prepare the current and next generation

RY | Target | RP1 | RP2 | RP3 | RP4 | Total
RY 5 |  |  |  |  |  | 
RY 4 |  |  |  |  |  | 
RY 3 |  |  |  |  |  | 
RY 2 | 4 out of 5 / qtr | 4.29 |  |  |  | 
RY 1 | 4 out of 5 / qtr | * | 4.54 | 4.39 | 4.28 | 4.36

 

Definition/Description: This is the average of all attendees’ self-ratings of each event they attend, on a scale of 1 to 5.

Collection methodology: 

  • The data is collected from post event surveys and recorded in the database. Specific items included in this index are:

    • Q1. The training session fulfilled my expectations.

    • Q2. The trainer stimulated my interest.

    • Q6. The training session was well-organized.

    • Q8. I was able to easily access this training session.

    • Q10. Overall I would rate my experience as successful.
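The index is the plain mean of the five item ratings across all responses. A sketch with fabricated response records (the Q1/Q2/Q6/Q8/Q10 keys follow the survey items listed above):

```python
def impact_score(responses):
    """Average the five survey items (Q1, Q2, Q6, Q8, Q10) across all
    post-event responses, on the 1-5 scale.  The response records are
    fabricated examples; only the question IDs come from the survey.
    """
    items = ("Q1", "Q2", "Q6", "Q8", "Q10")
    ratings = [r[q] for r in responses for q in items]
    return round(sum(ratings) / len(ratings), 2)

responses = [
    {"Q1": 5, "Q2": 4, "Q6": 5, "Q8": 4, "Q10": 5},
    {"Q1": 4, "Q2": 4, "Q6": 4, "Q8": 3, "Q10": 4},
]
print(impact_score(responses))  # 4.2
```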


Number of pageviews to the XSEDE website
Goal Supported: Deepen/Extend – Raise awareness of the value of advanced digital research services

RY | Target | RP1 | RP2 | RP3 | RP4 | Total
RY 5 |  |  |  |  |  | 
RY 4 |  |  |  |  |  | 
RY 3 |  |  |  |  |  | 
RY 2 | 80,000 / qtr | 63,998 |  |  |  | 
RY 1 | 80,000 / qtr | * | 49,409 | 65,157 | 68,227 | 183,212

 

Definition/Description: Per Google Analytics, “a pageview is defined as a view of a page on your site that is being tracked by the Analytics tracking code. If a user clicks reload after reaching the page, this is counted as an additional pageview. If a user navigates to a different page and then returns to the original page, a second pageview is recorded as well.”

Collection methodology: This is the number reported in the Google Analytics tracking software for the given time period.

Number of pageviews to the XSEDE User Portal
Goal Supported: Deepen/Extend – Raise awareness of the value of advanced digital research services

RY | Target | RP1 | RP2 | RP3 | RP4 | Total
RY 5 |  |  |  |  |  | 
RY 4 |  |  |  |  |  | 
RY 3 |  |  |  |  |  | 
RY 2 | 250,000 / qtr | 256,482 |  |  |  | 
RY 1 | 100,000 / qtr | * | 183,408 | 219,664 | 261,115 | 670,080

 

Definition/Description: Per Google Analytics, “a pageview is defined as a view of a page on your site that is being tracked by the Analytics tracking code. If a user clicks reload after reaching the page, this is counted as an additional pageview. If a user navigates to a different page and then returns to the original page, a second pageview is recorded as well.”

Collection methodology: This is the number reported in the Google Analytics tracking software for the given time period.

 * NOTE: RY1 is unique and only spans September 2016-April 2017.  RY1 RP1 doesn’t exist due to the shortened RY. RY1, RP2 only includes two months (September-October 2016). 

1  Number was updated from original reported as this number was calculated incorrectly.

2The target total for synchronous and asynchronous attendees (2,000 and 4,000, respectively) was previously listed as 5,600 / yr. This was an error. The target total should be 6,000.

3.1 CEE Director's Office (WBS 2.1.1)

The CEE Director’s Office has been established to provide the necessary oversight to ensure the greatest efficiency and effectiveness of the CEE area. This oversight includes providing direction to the L3 management team; coordinating, and participating in, CEE planning activities and reports through the area’s Project Manager; monitoring compliance with budgets; and retargeting effort if necessary. The Director’s Office also attends and supports the preparation of project-level reviews and activities.

The CEE Director’s Office will continue to manage and set direction for CEE activities and responsibilities. It will contribute to and attend bi-weekly senior management team calls; contribute to the project-level plan, schedule, and budget; contribute to XSEDE quarterly, annual, and other reports as required by the NSF; and attend XSEDE quarterly and annual meetings. Lastly, the Director’s Office will advise the XSEDE PI on many issues, especially those relevant to this WBS area.

3.2 Workforce Development (WBS 2.1.2) 

The Workforce Development mission is to provide a continuum of learning resources and services designed to address the needs and requirements of researchers, educators, developers, integrators, and students utilizing advanced digital resources. This includes providing professional development for XSEDE team members.

Workforce Development provides an integrated suite of training, education, and student preparation activities to address formal and informal learning about advanced digital resources, addressing the needs of researchers, developers, integrators, IT staff, XSEDE staff, faculty, and undergraduate and graduate students. CEE Workforce Development also provides business and industry with access to XSEDE’s workforce development efforts, including training services and student internships that have historically proven beneficial to industry.

Workforce Development comprises three areas: Training, Education, and Student Preparation. The Training team develops and delivers training programs to enhance the skills of the national open science community and ensure productive use of XSEDE’s cyberinfrastructure. The Education team works closely with Training and Student Preparation to support faculty in all fields of study in incorporating advanced digital technology capabilities into the undergraduate and graduate curriculum. The Student Preparation program actively recruits students to use the aforementioned training and education offerings, enabling undergraduate and graduate students to use XSEDE resources and motivating and preparing them to pursue advanced studies and careers that advance discovery and scholarship.

Table 3-2: Area Metrics for Workforce Development.

Area Metric | Program Year | Target | RP1 | RP2 | RP3 | RP4 | Total | Goal Supported

Number of unique attendees, synchronous training
Goal Supported: Deepen/Extend – Prepare the current and next generation

RY | Target | RP1 | RP2 | RP3 | RP4 | Total
RY 5 |  |  |  |  |  | 
RY 4 |  |  |  |  |  | 
RY 3 |  |  |  |  |  | 
RY 2 | 1,200 / yr | 325 |  |  |  | 
RY 1 | 1,600 / yr | * | 112 | 371 | 551 | 981

 

Definition/Description: Number of unique attendees at synchronous events (in-person and webinar), where an attendee is any XSEDE User Portal registrant who attends the event for any length of time.  For an in-person workshop, attendees are those who physically attend, and for webinars, attendees are those who sign in with their XSEDE User Portal username.

Collection methodology: Information about attendees (XUP username, date, and course) who register and participate using their XSEDE User Portal username is recorded by the session moderator and uploaded to a central metrics database via the course coordinator. For the annual total, the query is rerun to look at the year as a whole rather than just the quarter. Walk-ins are not counted.

Number of total attendees, synchronous training (one person can take several classes)
Goal Supported: Deepen/Extend – Prepare the current and next generation

RY | Target | RP1 | RP2 | RP3 | RP4 | Total
RY 5 |  |  |  |  |  | 
RY 4 |  |  |  |  |  | 
RY 3 |  |  |  |  |  | 
RY 2 | 1,400 / yr | 921 |  |  |  | 
RY 1 | 2,000 / yr | * | 188 | 382 | 614 | 1,184

 

Definition/Description: The total number of attendees at synchronous events (in-person and webinar), where an attendee is any XSEDE User Portal registrant who attends the event for any length of time.  For an in-person workshop, attendees are those who physically attend, and for webinars, attendees are those who sign in with their XSEDE User Portal username. 

Collection methodology: Information about attendees (XUP username, date, and course) who register and participate using their XSEDE User Portal username is recorded and uploaded to a central metrics database via the course coordinator. Walk-ins are not counted.

Number of unique attendees, asynchronous training
Goal Supported: Deepen/Extend – Prepare the current and next generation

RY | Target | RP1 | RP2 | RP3 | RP4 | Total
RY 5 |  |  |  |  |  | 
RY 4 |  |  |  |  |  | 
RY 3 |  |  |  |  |  | 
RY 2 | 1,200 / yr | 148 |  |  |  | 
RY 1 | 2,000 / yr | * | 215 | 213 | 211 | 520

 

Definition/Description: Number of unique attendees using asynchronous materials (CI-Tutor and Cornell Virtual Workshop), where an attendee is any logged-in XSEDE User Portal user who visited at least one page in an online course.

Collection methodology: Attendee information (XUP username, date, and course) is pulled from the CI-Tutor and Cornell Virtual Workshop databases and uploaded by the training lead to the XSEDE User Portal database and a central metrics database, for the time period listed.

Number of total attendees, asynchronous training (one person can take several classes)
Goal Supported: Deepen/Extend – Prepare the current and next generation

RY | Target | RP1 | RP2 | RP3 | RP4 | Total
RY 5 |  |  |  |  |  | 
RY 4 |  |  |  |  |  | 
RY 3 |  |  |  |  |  | 
RY 2 | 4,000 / yr | 384 |  |  |  | 
RY 1 | 4,000 / yr | * | 1,116 | 702 | 650 | 2,468

 

Definition/Description: The number of classes taken by all attendees, where an attendee is anyone who loaded page(s) in an online training module.  Regardless of activity level, an attendee will be counted as taking a course no more than once in a given quarter.  Only attendees who signed in with an XUP username are counted. 

Collection methodology: Attendee information (XUP username, date, and course) is pulled from CI-Tutor and Cornell Virtual Workshop databases and uploaded to the XSEDE User Portal database by the training lead and is uploaded to a central metrics database. Visiting pages in one asynchronous module multiple times during a quarter count as one participant. If this person visits the course again the next quarter, it is counted as a new class that quarter, as it’s the equivalent of a person attending the same course/event a second time.

Average impact assessment of training for attendees registered through XSEDE User Portal
Goal Supported: Deepen/Extend – Prepare the current and next generation

RY | Target | RP1 | RP2 | RP3 | RP4 | Total
RY 5 |  |  |  |  |  | 
RY 4 |  |  |  |  |  | 
RY 3 |  |  |  |  |  | 
RY 2 | 4 out of 5 / qtr | 4.29 |  |  |  | 
RY 1 | 4 out of 5 / qtr | * | 4.54 | 4.39 | 4.28 | 4.36

 

Definition/Description: This is the average of all attendees’ self-ratings of each event they attend, on a scale of 1 to 5.

Collection methodology: This data is collected from post event surveys and recorded in the database. Specific items included in this index are:

    • Q1. The training session fulfilled my expectations.

    • Q2. The trainer stimulated my interest.

    • Q6. The training session was well-organized.

    • Q8. I was able to easily access this training session.

    • Q10. Overall I would rate my experience as successful.

Number of formal degree, minor, and certificate programs added to the curricula
Goal Supported: Deepen/Extend – Prepare the current and next generation

RY | Target | RP1 | RP2 | RP3 | RP4 | Total
RY 5 |  |  |  |  |  | 
RY 4 |  |  |  |  |  | 
RY 3 |  |  |  |  |  | 
RY 2 | 3 / yr | 0 |  |  |  | 
RY 1 | 3 / yr | * | 1 | 0 |  | 2

 

Definition/Description: The number of degrees, minors, and certificate programs addressing computational and data-enabled science and engineering added to course curricula in higher education institutions.  This will span all disciplines. 

Collection methodology: This is self-reported by these institutions as a result of XSEDE polling them on an annual basis. Pre- and post-workshop surveys are sent for each NCSI event, and a follow-up survey is conducted approximately six months after participation. The data is stored in a SurveyMonkey account. The education lead pulls these numbers from the interim reports that the evaluation team provides based on the survey data.

Number of training materials contributed to public repository
Goal Supported: Deepen/Extend – Prepare the current and next generation

RY | Target | RP1 | RP2 | RP3 | RP4 | Total
RY 5 |  |  |  |  |  | 
RY 4 |  |  |  |  |  | 
RY 3 |  |  |  |  |  | 
RY 2 | 50 / yr | 0 |  |  |  | 
RY 1 | 40 / yr | * | 10 | 25 | 10 | 45

 

Definition/Description: This is a count of training and education materials that are posted to the HPC University repository (http://www.hpcuniversity.org).

Collection methodology: This data is maintained by the HPC University web site and is extracted quarterly.

Number of materials downloaded from the repository
Goal Supported: Deepen/Extend – Prepare the current and next generation

RY | Target | RP1 | RP2 | RP3 | RP4 | Total
RY 5 |  |  |  |  |  | 
RY 4 |  |  |  |  |  | 
RY 3 |  |  |  |  |  | 
RY 2 | 62,000 / yr | 16,545 |  |  |  | 
RY 1 | 56,000 / yr | * | 11,114 | 25,233 | 25,279 | 61,626

 

Definition/Description: This is a count of training and education materials downloaded from the HPC University repository (http://www.hpcuniversity.org).

Collection methodology: This data is maintained by the HPC University web site and is extracted quarterly.

Number of computational science modules added to courses in higher education institutions
Goal Supported: Deepen/Extend – Prepare the current and next generation

RY | Target | RP1 | RP2 | RP3 | RP4 | Total
RY 5 |  |  |  |  |  | 
RY 4 |  |  |  |  |  | 
RY 3 |  |  |  |  |  | 
RY 2 | 40 / yr | 18 |  |  |  | 
RY 1 | 40 / yr | * | - | - | - | NA¹

 

Definition/Description: The number of courses addressing computational and data-enabled science and engineering modules added to courses in higher education institutions.  This spans all disciplines.

Collection methodology: This is self-reported by these institutions as a result of XSEDE polling them on an annual basis. We poll the faculty and institutions with whom we have worked and maintained contact over the years.

Number of students benefitting from XSEDE resources and services
Goal Supported: Deepen/Extend – Prepare the current and next generation

RY | Target | RP1 | RP2 | RP3 | RP4 | Total
RY 5 |  |  |  |  |  | 
RY 4 |  |  |  |  |  | 
RY 3 |  |  |  |  |  | 
RY 2 | 950 / qtr | 1,722 |  |  |  | 
RY 1 | 50 / qtr | * | 997² | 1,148 | 2,679 | 4,824

 

Definition/Description: This counts the total number of students supported by XSEDE to work on XSEDE projects and to attend the annual XSEDE Conference. 

Collection methodology: This data is provided by the organizations receiving student financial support from XSEDE on a quarterly basis.

Percentage of under-represented students benefitting from XSEDE resources and services
Goal Supported: Deepen/Extend – Prepare the current and next generation
RY 5: not yet reported
RY 4: not yet reported
RY 3: not yet reported
RY 2: Target: 50% / qtr; RP1: 28%
RY 1: Target: 50% / qtr; RP1: *; RP2: 34% (note 2); RP3: 33%; RP4: 19%; Total: 28%

 

Definition/Description: Percentage of underrepresented students out of total students supported by XSEDE to work on XSEDE projects and to attend the annual XSEDE/PEARC Conference.  

Collection methodology: This data is provided by the organizations receiving student financial support from XSEDE on a quarterly basis.

 * NOTE: RY1 is unique and only spans September 2016-April 2017.  RY1 RP1 doesn’t exist due to the shortened RY. RY1, RP2 only includes two months (September-October 2016). 

Note (-): Data reported annually.

Note 1: Not measured yet; the survey will go out in RY2 RP1.

Note 2: Number updated from the original reported in RP1 (276), which was calculated incorrectly.

3.3 User Engagement (WBS 2.1.3) 

The mission of the User Engagement (UE) team is to capture community needs, requirements, and recommendations for improvements to XSEDE’s resources and services, and report to the national community how their feedback is being addressed. XSEDE places an emphasis on maintaining consistent user contact, traceability in tracking user issues, and closing the feedback loop.

Table 3-3: Area Metrics for User Engagement.

Columns: Area Metric | Program Year | Target | RP1 | RP2 | RP3 | RP4 | Total | Goal Supported

Percentage of active and new PIs contacted quarterly
Goal Supported: Sustain – Provide excellent user support
RY 5: not yet reported
RY 4: not yet reported
RY 3: not yet reported
RY 2: Target: 100% / qtr; RP1: 100% (2,040)
RY 1: Target: 100% / qtr; RP1: *; RP2: 100% (1,533); RP3: 100% (1,586); RP4: 100% (2,062) (note 2); Total: **

 

Definition/Description: The percentage of PIs with an active allocation who are sent an email to identify user issues that need to be addressed (user requirements); the issues are then passed on to the appropriate XSEDE group for resolution.

Collection methodology: Members of the User Engagement team send emails, acknowledge user responses, create XSEDE tickets on the user’s behalf, track the issues to resolution, and close the loop with the user. Currently a spreadsheet is maintained that includes the number of PIs contacted and the date the emails were sent. The goal is to use Jira to keep track of this metric.

Percentage of user requirements resolved (note 1)
Goal Supported: Sustain – Provide excellent user support
RY 5: not yet reported
RY 4: not yet reported
RY 3: not yet reported
RY 2: Target: 100% / yr; RP1: 78% (36/44)
RY 1: Target: 100% / yr; RP1: *; RP2: 50% (16/32); RP3: 89% (40/45); RP4: 75% (40/53); Total: 74% (96/130)

 

Definition/Description: The percentage of user requirements that are resolved.

Collection methodology: Currently a spreadsheet is maintained that includes the status of each ticket and we report on the number of tickets that have been resolved in the quarter. The goal is to use Jira to track and report this metric.

 * NOTE: RY1 is unique and only spans September 2016-April 2017.  RY1 RP1 doesn’t exist due to the shortened RY. RY1, RP2 only includes two months (September-October 2016). 

Note 1: Resolution may depend on SPs and other XSEDE groups.

Note 2: The query was not catching all PIs; it was modified for RP4.

Note **: Quarterly metric; 5,181 total contact emails were sent in RP2–RP4.

3.4 Broadening Participation (WBS 2.1.4) 

Broadening Participation engages under-represented minority researchers from domains that are not traditional users of HPC and from Minority Serving Institutions. This target audience ranges from potential users with no computational experience to computationally savvy researchers, educators, Champions, and administrators who will promote change at their institutions toward increased use of advanced digital services for research and teaching.

Broadening Participation will continue the most effective recruitment activities— conference exhibiting, campus visits, and regional workshops—while increasing national impact through new partnerships and the utilization of lower cost awareness strategies to continue the growth in new users from underrepresented communities. The Diversity Forum and the Minority Research Community listservs and community calls focus on user persistence in their use of XSEDE services and their deepening engagement through participation in committees such as the User Advisory Committee (UAC) and XRAC, participation in Champions, bridging, and other programs. Persistent institutional engagement is enabled by curriculum reform and larger numbers of researchers adopting the use of advanced digital resources as a standard research method.

Table 3-4: Area Metrics for Broadening Participation.

Columns: Area Metric | Program Year | Target | RP1 | RP2 | RP3 | RP4 | Total | Goal Supported

Number of new under-represented individuals using XSEDE resources and services (KPI)
Goal Supported: Deepen/Extend – Extend use to new communities
RY 5: not yet reported
RY 4: not yet reported
RY 3: not yet reported
RY 2: Target: 150 / qtr; RP1: 251
RY 1: Target: > 100 / yr; RP1: *; RP2: 150; RP3: 135; RP4: 240; Total: 525

 

Definition/Description: This tracks first-time awardees of XSEDE compute resources, i.e., an “allocation” (Startup, Research, Education, etc.), from underrepresented communities, including women and domestic racial/ethnic minorities in HPC, as well as anyone from a Minority Serving Institution (MSI) as defined by the Carnegie Classification of Institutions of Higher Education.

Collection methodology: XSEDE portal users can specify their race/ethnicity, gender, and institution in their user profile and are encouraged to complete these fields when registering for training events through the XSEDE portal. Dates of allocation awards are also recorded in the XSEDE database (XDCDB). A query is then run to generate the number of individuals meeting the aforementioned criteria within a given quarter. Care is taken not to over-represent this metric: an individual who meets multiple criteria is counted only once.
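The de-duplication step described above amounts to a set union over the qualifying criteria. A minimal sketch, with hypothetical record fields and sample data standing in for the actual XDCDB query result:

```python
# Hypothetical user records; the real data come from the XDCDB.
users = [
    {"id": 1, "gender": "F", "minority": False, "msi": False},
    {"id": 2, "gender": "M", "minority": True,  "msi": True},   # meets two criteria
    {"id": 3, "gender": "M", "minority": False, "msi": True},
]

def count_underrepresented(users):
    """Count each qualifying individual once, even if they meet several criteria."""
    qualifying_ids = {
        u["id"] for u in users
        if u["gender"] == "F" or u["minority"] or u["msi"]
    }
    return len(qualifying_ids)

print(count_underrepresented(users))  # user 2 is counted once, not twice -> 3
```

Because membership is collected as a set of ids, a person meeting multiple criteria contributes exactly one element, which is the over-representation safeguard the methodology describes.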

Number of sustained under-represented individuals using XSEDE resources and services (KPI)
Goal Supported: Deepen/Extend – Deepen use to existing communities
RY 5: not yet reported
RY 4: not yet reported
RY 3: not yet reported
RY 2: Target: 1,500 / yr; RP1: 490
RY 1: Target: > 1,000 / yr; RP1: *; RP2: 322; RP3: 238; RP4: 535; Total: 1,095

 

Definition/Description: This tracks awardees of XSEDE compute resources, i.e., an “allocation” (Startup, Research, Education, etc.), from underrepresented communities, including women and domestic racial/ethnic minorities in HPC, as well as anyone from a Minority Serving Institution (MSI) as defined by the Carnegie Classification of Institutions of Higher Education, who access their allocation during the time period.

Collection methodology: XSEDE portal users can specify their race/ethnicity, gender, and institution in their user profile and are encouraged to complete these fields when registering for training events through the XSEDE portal. Dates of allocation awards are also recorded in the XSEDE database (XDCDB). A query is then run to generate the number of individuals meeting the aforementioned criteria within a given quarter. Care is taken not to over-represent this metric: an individual who meets multiple criteria is counted only once.

Longitudinal Assessment of Inclusion in XSEDE via the Staff Climate Survey (note 1)
Goal Supported: Advance – Enhance the array of technical expertise and support services
RY 5: not yet reported
RY 4: not yet reported
RY 3: not yet reported
RY 2: Target: 5% improvement / yr; RP1: -
RY 1: Target: 5% improvement / yr; RP1: *; RP2: -; RP3: -; RP4: -

 

Definition/Description: A measure of XSEDE staff perception and experience relating to inclusion in the project.

Collection methodology: A mean index score is generated from annual Staff Climate Study responses to items within the "Inclusion" dimension of the study. This score is then compared to the previous year to determine percent change on an annual basis.
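The year-over-year comparison described above is a simple percent-change calculation on the mean index score. A small illustration (the scores shown are made up, not actual survey results):

```python
def percent_change(current_mean, previous_mean):
    """Percent change of this year's mean 'Inclusion' index score vs. last year's."""
    return (current_mean - previous_mean) / previous_mean * 100.0

# Hypothetical mean index scores: the mean rose from 3.8 to 4.0 year over year.
print(round(percent_change(4.0, 3.8), 1))  # -> 5.3, which would meet a 5% target
```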

Longitudinal Assessment of Equity in XSEDE via the Staff Climate Survey (note 1)
Goal Supported: Advance – Enhance the array of technical expertise and support services
RY 5: not yet reported
RY 4: not yet reported
RY 3: not yet reported
RY 2: Target: 5% improvement / yr; RP1: -
RY 1: Target: 5% improvement / yr; RP1: *; RP2: -; RP3: -; RP4: -; Total: -
 

Definition/Description: A measure of XSEDE staff perception and experience relating to diversity in the project.

Collection methodology: A mean index score is generated from annual Staff Climate Study responses to items within the "diversity" dimension of the study. This score is then compared to the previous year to determine percent change on an annual basis.

 * NOTE: RY1 is unique and only spans September 2016-April 2017.  RY1 RP1 doesn’t exist due to the shortened RY. RY1, RP2 only includes two months (September-October 2016). 

Note (-): Data reported annually.

Note 1: The staff climate survey for this plan year is underway, and data are not available to report at this time.

3.5 User Interfaces & Online Information (WBS 2.1.5) 

User Interfaces & Online Information (UII) is committed to enabling the discovery, understanding, and effective utilization of XSEDE’s powerful capabilities and services. Through UII’s ongoing effort to improve and engage a variety of audiences via the XSEDE web site and user portal, UII has immediate impact on a variety of stakeholders including the general public, potential and current users, educators, services providers, campus affiliates, and funding agencies. These stakeholders will gain valuable information about XSEDE through an information rich website, the XSEDE User Portal, and a uniform set of user documentation.

Table 3-5: User Interfaces & Online Information.

Columns: Area Metric | Program Year | Target | RP1 | RP2 | RP3 | RP4 | Total | Goal Supported

Number of new users of XSEDE resources and services
Goal Supported: Deepen/Extend – Extend use to new communities
RY 5: not yet reported
RY 4: not yet reported
RY 3: not yet reported
RY 2: Target: 2,000 / qtr; RP1: 2,305
RY 1: Target: > 1,000 / qtr; RP1: *; RP2: 1,881 (note 1); RP3: 1,770 (note 1); RP4: 2,359; Total: 6,181

 

Definition/Description: This is the number of portal accounts created within the selected time period.

Collection methodology: The Liferay database is queried for the number of rows in the Users table created during the selected time period. Sample query:
select screenName, createDate from User_ where createDate between '2016-09-01 00:00:00' AND '2016-10-31 00:00:00';

Number of sustained users of XSEDE resources and services
Goal Supported: Deepen/Extend – Deepen use to existing communities
RY 5: not yet reported
RY 4: not yet reported
RY 3: not yet reported
RY 2: Target: 3,000 / qtr; RP1: 3,962
RY 1: Target: > 5,000 / qtr; RP1: *; RP2: 4,755 (note 1); RP3: 4,446 (note 1); RP4: 4,924; Total: 6,168

 

Definition/Description: This is the number of users logged into the XSEDE User Portal within the selected time period that were existing XUP users before the selected reporting period. Users who have created their account during that time period are not counted.

Collection methodology: The UII team provides the list of usernames logged in during the period to the evaluation team, which filters out the accounts created during that time to get the total number of sustained users.
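The filtering step above can be sketched as a set difference between the period's active logins and the accounts created during the period. The usernames are hypothetical; the real lists come from the XUP login records and the Liferay Users table:

```python
def sustained_users(logged_in, created_in_period):
    """Users who logged in this period but whose accounts predate it."""
    return sorted(set(logged_in) - set(created_in_period))

active = ["alice", "bob", "carol", "dave"]   # logged in during the period
new_accounts = ["dave"]                      # accounts created during the period
print(sustained_users(active, new_accounts))       # ['alice', 'bob', 'carol']
print(len(sustained_users(active, new_accounts)))  # reported count: 3
```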

Number of unique pageviews to the XSEDE website
Goal Supported: Deepen/Extend – Raise awareness of the value of advanced digital services
RY 5: not yet reported
RY 4: not yet reported
RY 3: not yet reported
RY 2: Target: 80,000 / qtr; RP1: 63,998
RY 1: Target: 80,000 / qtr; RP1: *; RP2: 49,409; RP3: 65,157; RP4: 68,227; Total: 183,212

 

Definition/Description: Per Google Analytics, “a pageview is defined as a view of a page on your site that is being tracked by the Analytics tracking code. If a user clicks reload after reaching the page, this is counted as an additional pageview. If a user navigates to a different page and then returns to the original page, a second pageview is recorded as well.”

Collection methodology: This is the number reported in the Google Analytics tracking software for the given time period.

Number of pageviews to the XSEDE User Portal
Goal Supported: Deepen/Extend – Raise awareness of the value of advanced digital services
RY 5: not yet reported
RY 4: not yet reported
RY 3: not yet reported
RY 2: Target: 250,000 / qtr; RP1: 256,482
RY 1: Target: 100,000 / qtr; RP1: *; RP2: 183,408; RP3: 219,664; RP4: 265,115; Total: 670,080

 

Definition/Description: Per Google Analytics, “a pageview is defined as a view of a page on your site that is being tracked by the Analytics tracking code. If a user clicks reload after reaching the page, this is counted as an additional pageview. If a user navigates to a different page and then returns to the original page, a second pageview is recorded as well.”

Collection methodology: This is the number reported in the Google Analytics tracking software for the given time period.

User satisfaction with the website
Goal Supported: Sustain – Provide excellent user support
RY 5: not yet reported
RY 4: not yet reported
RY 3: not yet reported
RY 2: Target: 4 out of 5 / yr; RP1: 4.17
RY 1: Target: 4 out of 5 / yr; RP1: *; RP2: -; RP3: -; RP4: -; Total: -

 

Definition/Description: The user satisfaction rating of the web site as determined by the Annual User Survey on a scale of 1-5.

Collection methodology: An annual user survey is emailed to all users of XSEDE resources and users are asked to rate their satisfaction with the website based on a scale of 1-5. The statement is “Please rate your overall satisfaction with XSEDE on a scale of 1 to 5, with 1 being “very dissatisfied” and 5 being “very satisfied.” If you have no basis for rating your satisfaction, please select "Not applicable.”

User satisfaction with the XSEDE User Portal
Goal Supported: Sustain – Provide excellent user support
RY 5: not yet reported
RY 4: not yet reported
RY 3: not yet reported
RY 2: Target: 4 out of 5 / yr; RP1: 4.18
RY 1: Target: 4 out of 5 / yr; RP1: *; RP2: -; RP3: -; RP4: -; Total: -

 

Definition/Description: The user satisfaction rating of the portal from the Annual User Survey on a scale of 1-5.

Collection methodology: An annual user survey is emailed to all users of XSEDE resources and users are asked to rate their satisfaction with the website based on a scale of 1-5. The statement is “Please rate your overall satisfaction with XSEDE on a scale of 1 to 5, with 1 being “very dissatisfied” and 5 being “very satisfied.” If you have no basis for rating your satisfaction, please select "Not applicable.”

User satisfaction with user documentation
Goal Supported: Sustain – Provide excellent user support
RY 5: not yet reported
RY 4: not yet reported
RY 3: not yet reported
RY 2: Target: 4.25 out of 5 / yr; RP1: 4.07
RY 1: Target: 4 out of 5 / yr; RP1: *; RP2: -; RP3: 4.22 (note 2); RP4: -; Total: 4.22

 

Definition/Description: The user satisfaction rating of user documentation from the Annual User Survey on a scale of 1-5.

Collection methodology: An annual user survey is emailed to all users of XSEDE resources and users are asked to rate their satisfaction with the user documentation based on a scale of 1-5.

 * NOTE: RY1 is unique and only spans September 2016-April 2017.  RY1 RP1 doesn’t exist due to the shortened RY. RY1, RP2 only includes two months (September-October 2016). 

Note (-): Data reported annually.

Note 1: Number updated from the value originally reported, which was initially calculated incorrectly.

Note 2: This measurement is based on an XSEDE Microsurvey focused on XSEDE User Documentation. The measurement is the mean satisfaction rating for the question “Please rate your level of satisfaction with the technical documentation available on the XSEDE User Portal (XUP) and XSEDE Web Site."

 

3.6 Campus Engagement (WBS 2.1.6) 

The Campus Engagement program promotes and facilitates the effective participation of a diverse national community of campuses in the application of advanced digital resources and services to accelerate discovery, enhance education, and foster scholarly achievement. 

Campus Engagement, via the Campus Champions, works directly with institutions across the US both to facilitate computing and data-intensive research and education, nationally and with collaborators worldwide, and to expand the scale, scope, ambition and impact of these endeavors. This is done by increasing scalable, sustainable institutional uptake of advanced digital services from providers at all levels (workgroup, institutional, regional, national, international), fostering a broader, deeper, more agile, more sustainable and more diverse nationwide cyberinfrastructure ecosystem across all levels, and cultivating inter-institutional interchange of resources, expertise and support. CE also aims to assist with the establishment and expansion of consortia (for example, intra-state, regional, domain-specific) that collaborate to better serve the needs of their advanced computing stakeholders.

Table 3-6: Area Metrics for Campus Engagement.

Columns: Area Metric | Program Year | Target | RP1 | RP2 | RP3 | RP4 | Total | Goal Supported

Number of institutions with a Champion
Goal Supported: Deepen/Extend – Deepen use to existing communities
RY 5: not yet reported
RY 4: not yet reported
RY 3: not yet reported
RY 2: Target: 240; RP1: 218
RY 1: Target: 225 / yr; RP1: *; RP2: 224; RP3: 231; RP4: 234; Total: 234

 

Definition/Description: The total number of Institutions that have a signed MOU with the Champion program. 

Collection methodology:  These are collected from a spreadsheet kept by the Campus Engagement leadership and on the public website listing Champion institutions.

Number of unique contributions to the Champion email list (campuschampions@xsede.org)
Goal Supported: Deepen/Extend – Deepen use to existing communities
RY 5: not yet reported
RY 4: not yet reported
RY 3: not yet reported
RY 2: Target: 100 / yr; RP1: 129
RY 1: Target: 50 / yr; RP1: *; RP2: 90; RP3: 112; RP4: 125; Total: 190

 

Definition/Description: The total number of distinct champions that sent a message to the champion email list.

Collection methodology: Archives are available at https://mhonarc.xsede.org/campuschampions/  as mbox files that can be parsed and counted.
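Since the archives are plain mbox files, the distinct-sender count can be sketched with Python's standard mailbox module. This is an illustration only; the archive path is a placeholder, and a production count would also fold together a champion's alternate addresses:

```python
import mailbox
from email.utils import parseaddr

def unique_contributors(mbox_path):
    """Count distinct From: addresses in an mbox archive."""
    senders = set()
    for msg in mailbox.mbox(mbox_path):
        # parseaddr splits "Name <addr>" into (name, addr); keep the address only.
        addr = parseaddr(msg.get("From", ""))[1].lower()
        if addr:
            senders.add(addr)
    return len(senders)

# Example (the path is a placeholder for a downloaded archive file):
# print(unique_contributors("campuschampions-archive.mbox"))
```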

Number of activities that (i) expand the emerging CI workforce and/or (ii) improve the extant CI workforce, participated in by members of the Campus Engagement team
Goal Supported: Deepen/Extend – Deepen use to existing communities
RY 5: not yet reported
RY 4: not yet reported
RY 3: not yet reported
RY 2: Target: 40 / yr; RP1: 28
RY 1: Target: 20 / yr; RP1: *; RP2: 10; RP3: 11; RP4: 10; Total: 30

 

Definition/Description: CI workforce development activities engaged in by any member of the Campus Engagement team, including but not limited to the Campus Engagement Level 3 Co-Managers, the Campus Champions Coordinator, Campus Engagement staff and leadership team, and the Campus Champions. This includes, but is not limited to, education, outreach, training, mentoring, team-building, fostering of relevant communities, professional development, recruitment, proposal submissions, workshops, and dissemination (for example, papers, posters, and presentations at conferences, in journals, and in other venues) intended to, or having the effect of, expanding and/or improving the CI workforce across all segments (academic, industry, government, non-profit/non-governmental).

Collection methodology: Written reports (documents, e-mails, etc) provided by any and all members of the Campus Engagement team, as enumerated above, including but not limited to any internal survey or other data collection results.

 * NOTE: RY1 is unique and only spans September 2016-April 2017.  RY1 RP1 doesn’t exist due to the shortened RY. RY1, RP2 only includes two months (September-October 2016). 

[Table of Contents]

4       Extended Collaborative Support Service - ECSS (WBS 2.2) 

The Extended Collaborative Support Service (ECSS) improves the productivity of the XSEDE user community both through successful, meaningful collaborations and well-planned training activities. The goal is to optimize applications, improve work and data flows, increase effective use of the XSEDE digital infrastructure and broadly expand the XSEDE user base by engaging members of under-represented communities and domain areas. The ECSS program provides professionals who can be part of a collaborative team—dedicated staff who develop deep, collaborative relationships with XSEDE users—helping them make best use of XSEDE resources to advance their work. These professionals possess combined expertise in many fields of computational science and engineering. They have a deep knowledge of underlying computer systems and of the design and implementation principles for optimally mapping scientific problems, codes, and middleware to these resources. ECSS includes experts in not just the traditional use of advanced computing systems but also in data-intensive work, workflow engineering, and the enhancement of scientific gateways. 

ECSS projects fall into five categories: Extended Support for Research Teams (ESRT), Novel and Innovative Projects (NIP), Extended Support for Community Codes (ESCC), Extended Support for Science Gateways (ESSGW), and Extended Support for Training, Education and Outreach (ESTEO). Project-based ECSS support is requested by researchers via the XSEDE peer-review allocation process. If reviewers recommend support and if staff resources are available, the ECSS expert and the requesting PI develop a work plan outlining the project tasks. The work plan includes concrete quarterly goals and staffing commitments from both the PI team and ECSS. ECSS managers review work plans and also track progress via quarterly reports. 

Table 4-1: Area Metrics for Extended Collaborative Support Services.

Columns: Area Metric | Program Year | Target | RP1 | RP2 | RP3 | RP4 | Total | Goal Supported

Average ECSS impact rating (KPI)
Goal Supported: Deepen/Extend – Deepening use to existing communities
RY 5: not yet reported
RY 4: not yet reported
RY 3: not yet reported
RY 2: Target: 4 out of 5 / yr; RP1: 4.11
RY 1: Target: 4 out of 5 / qtr; RP1: *; RP2: 4.56; RP3: 4.61; RP4: 3.29; Total: 4.14

 

Definition/Description: After an ECSS project is marked as completed (i.e. has a workplan, work has progressed, and a final report has been filed), the L2 Directors interview the PIs, preferably via phone, and ask them to rate the impact of the ECSS support on their research on a scale of 1 to 5.

Collection methodology: PIs are contacted by the L2 Directors post filing of a final report by ECSS staff members. The L2 Director asks the PI to rate the impact of the ECSS support on their research on a scale of one to five and this number is recorded in a spreadsheet, which is provided to the L2 Project Manager (PM). The PM then transfers this number to Sciforma. Each individual impact rating that is reported for the reporting period is then added together and divided by the total number of interviews conducted for the average.

Average satisfaction with ECSS support (KPI)
Goal Supported: Deepen/Extend – Deepen use to existing communities
RY 5: not yet reported
RY 4: not yet reported
RY 3: not yet reported
RY 2: Target: 4.5 out of 5 / yr; RP1: 4.65
RY 1: Target: 4.5 out of 5 / qtr; RP1: *; RP2: 4.86; RP3: 4.72; RP4: 4.64; Total: 4.54

 

Definition/Description: After an ECSS project is marked as completed (i.e. has a workplan, work has progressed, and a final report has been filed), the L2 Directors interview the PIs, preferably via phone, and ask them to rate their satisfaction with the ECSS support they have received on a scale of 1 to 5.

Collection methodology: PIs are contacted by the L2 Directors post filing of a final report by ECSS staff members. The L2 Director asks the PI to rate their satisfaction with the support received on a scale of one to five and this number is recorded in a spreadsheet, which is provided to the L2 Project Manager (PM). The PM then transfers this number to Sciforma. Each individual satisfaction rating that is reported for the reporting period is then added together and divided by the total number of interviews conducted for the average.

Number of completed ECSS projects (ESRT + ESCC + ESSGW) (KPI)
Goal Supported: Deepen/Extend – Deepening use to existing communities
RY 5: not yet reported
RY 4: not yet reported
RY 3: not yet reported
RY 2: Target: 50 / yr; RP1: 16
RY 1: Target: 50 / yr; RP1: *; RP2: 10; RP3: 13; RP4: 25; Total: 48

 

Definition/Description: The total number of completed ECSS projects consists of projects that were completed through work in ESRT, ESCC, and ESSGW. A completed project is defined as a project that has progressed through the complete support pipeline.  This pipeline includes steps such as assignment of a consultant, production of a work plan, execution of the work plan and reporting of progress through quarterly reports, and the filing of a final project report.

Collection methodology: The number of completed projects is tracked in Sciforma, XSEDE’s project management software. A report has been created in Sciforma that queries all the ECSS projects, filters for allocations that have ended in that quarter and also for projects that have a final report.

 

Number of new users from non-traditional disciplines of XSEDE resources and services (KPI - composite)
Goal Supported: Deepen/Extend – Extend use to new communities
RY 5: not yet reported
RY 4: not yet reported
RY 3: not yet reported
RY 2: Target: 500 / yr; RP1: 101
RY 1: Target: 400 / yr; RP1: *; RP2: 147; RP3: 92; RP4: 158; Total: 397

 

Definition/Description: This metric tracks the progress of extending XSEDE allocations to first-time users from fields of science (FOS) that have not been significant consumers of HPC resources and services.  

Collection methodology: A set of 60 FOS has been identified in the XD Central Database (XDCDB), each of whose usage over the past 10 years is below 0.5% of the total normalized usage. A scripted query to XDCDB counts the total number of users on the grants newly activated from these FOS.
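In outline, the count filters newly activated grants by the identified FOS set and de-duplicates users. A sketch with made-up records and hypothetical field names; the real data come from the scripted XDCDB query:

```python
# Stand-in for the 60 low-usage fields of science identified in the XDCDB.
NON_TRADITIONAL_FOS = {"Linguistics", "Economics"}

# Hypothetical grant records standing in for the query result.
grants = [
    {"user": "u1", "fos": "Linguistics", "activated": "2017-02-01"},
    {"user": "u2", "fos": "Astronomy",   "activated": "2017-02-10"},  # traditional FOS
    {"user": "u3", "fos": "Economics",   "activated": "2017-03-05"},
]

def new_nontraditional_users(grants, start, end):
    """Distinct users on non-traditional-FOS grants activated in [start, end]."""
    return len({
        g["user"] for g in grants
        if g["fos"] in NON_TRADITIONAL_FOS and start <= g["activated"] <= end
    })

print(new_nontraditional_users(grants, "2017-01-01", "2017-03-31"))  # -> 2
```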

Number of sustained users from non-traditional disciplines of XSEDE resources and services (KPI - composite)
Goal Supported: Deepen/Extend – Deepen use to existing communities
RY 5: not yet reported
RY 4: not yet reported
RY 3: not yet reported
RY 2: Target: 500 / qtr; RP1: 700
RY 1: Target: 400 / qtr (note 1); RP1: *; RP2: 451; RP3: 517; RP4: 716; Total: 1,684

 

Definition/Description: This metric tracks the progress of users from fields of science (FOS) that have not been significant consumers of HPC resources and services, who are benefiting from their XSEDE allocations in a sustained manner.  

Collection methodology: A set of 60 FOS has been identified in the XD Central Database (XDCDB), each of whose usage over the past 10 years is below 0.5% of the total normalized usage. A scripted query to XDCDB counts the total number of users on the grants from these FOS that have used at least 10% of their allocated usage in the past year.
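The sustained-user variant adds a usage threshold: only users on non-traditional-FOS grants that consumed at least 10% of their allocation are counted. Again a sketch; the records and field names are hypothetical stand-ins for the XDCDB query:

```python
# Stand-in for the 60 low-usage fields of science identified in the XDCDB.
NON_TRADITIONAL_FOS = {"Linguistics", "Economics"}

# Hypothetical grants with allocated and used service units (SUs).
grants = [
    {"user": "u1", "fos": "Linguistics", "allocated_su": 10000, "used_su": 2500},
    {"user": "u2", "fos": "Economics",   "allocated_su": 50000, "used_su":  900},  # under 10%
    {"user": "u3", "fos": "Astronomy",   "allocated_su": 10000, "used_su": 8000},  # traditional FOS
]

def sustained_nontraditional_users(grants):
    """Distinct non-traditional-FOS users that used >= 10% of their allocation."""
    return len({
        g["user"] for g in grants
        if g["fos"] in NON_TRADITIONAL_FOS
        and g["used_su"] >= 0.10 * g["allocated_su"]
    })

print(sustained_nontraditional_users(grants))  # only u1 qualifies -> 1
```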

Average estimated months saved due to ECSS support (note 2)
Goal Supported: Deepen/Extend – Deepen use to existing communities
RY 5: not yet reported
RY 4: not yet reported
RY 3: not yet reported
RY 2: Target: 12 mo / project; RP1: 13.61
RY 1: * (no data; metric not yet collected)

 

Definition/Description: This metric is an average amount of months a PI estimates to have saved due to ECSS support and is ascertained during the PI interview conducted by ECSS L2 management. The metric will be reported by adding the number of months estimated to have been saved due to ECSS support divided by the total number of PI interviews conducted during the reporting period. 

Collection methodology: During this interview, the PI is asked to estimate the time in months that it would have taken to get to the same stage that the project achieved if the project did not have ECSS Support. The number that would be reported as a metric is the average estimated months saved due to ECSS Support.

 * NOTE: RY1 is unique and only spans September 2016-April 2017.  RY1 RP1 doesn’t exist due to the shortened RY. RY1, RP2 only includes two months (September-October 2016). 

Note 1: Target has been updated for RY1 RP4.

Note 2: New metric for PY7.

4.1 ECSS Director's Office (WBS 2.2.1)

The ECSS Director’s Office has been established to provide the necessary oversight to ensure the greatest efficiency and effectiveness of the ECSS area. This oversight includes providing direction to the L3 management team, coordination of, and participation in, ECSS planning activities and reports through the area’s Project Manager, and monitoring compliance with budgets, and retargeting effort if necessary. The Director’s Office also attends and supports the preparation of project level reviews and activities. The ECSS Director’s Office will continue to manage and set direction for ECSS activities and responsibilities. They will contribute to and attend bi-weekly senior management team calls, contribute to the project level plan, schedule, and budget, contribute to XSEDE quarterly, annual, and other reports as required by the NSF and attend XSEDE quarterly and annual meetings. The Director’s Office will advise the XSEDE PI on many issues, especially those relevant to this WBS area. The office consists of two Level 2 Co-Directors (Ralph Roskies, who manages ESRT and NIP activities, and Nancy Wilkins-Diehr who manages ESCC, ESSGW, and ESTEO activities) and two project managers (Karla Gendler and Marques Bland). 

Roskies and Wilkins-Diehr carry out the post-project interviews with all project PIs who have received ECSS support, both to get their assessment of how the project went, and to hear and act on any concerns they may express. Wilkins-Diehr also organizes the monthly symposium series, one of the contributors to staff training, and runs the Campus Champions Fellows program. Roskies reviews most stories destined for inclusion in XSEDE Highlights and also convenes User Advisory Committee meetings.

The two project managers aid in the management of the day-to-day activities of ECSS which includes the management of project requests (XRAC and startups), active projects, project assignments and staffing. They continuously refine the ECSS project lifecycle, further defining processes to aid in the management of over 100 active projects. They also administer JIRA for the management and tracking of projects, both for the managers and directors of ECSS and for ECSS staff.

4.2 Extended Support for Research Teams (WBS 2.2.2) 

Extended Support for Research Teams (ESRT) accelerates scientific discovery by collaborating with researchers, engineers, and scholars in order to optimize their application codes, improve their work and data flows, and increase the effectiveness of their use of XSEDE digital infrastructure.  

ESRT projects are initiated as a result of support requests or recommendations obtained during the allocation process. Most projects focus on home-grown codes, since community codes fall under ESCC (§4.4), though the classification is not exclusive. The primary mandate of ESRT is the support of individual research teams within the context of their research goals.

Table 4-2: Area Metrics for Extended Support for Research Teams.

Columns: Area Metric | Program Year | Target | RP1 | RP2 | RP3 | RP4 | Total | Goal Supported

Average ESRT impact rating (KPI)
Goal Supported: Deepen/Extend – Deepen use to existing communities
RY 5: not yet reported
RY 4: not yet reported
RY 3: not yet reported
RY 2: Target: 4 out of 5 / yr; RP1: 3.7
RY 1: Target: 4 out of 5 / qtr; RP1: *; RP2: 5; RP3: 4.67; RP4: 5; Total: 4.67

 

Definition/Description: After an ESRT project is marked as completed (i.e. has a workplan, work has progressed, and a final report has been filed), the L2 Directors interview the PIs, preferably via phone, and ask them to rate the impact of the ESRT support on their research on a scale of 1 to 5. 

Collection methodology: PIs are contacted by the L2 Directors post filing of a final report by ESRT staff members. The L2 Director asks the PI to rate the impact of the ESRT support on their research on a scale of one to five and this number is recorded in a spreadsheet, which is shared with the L2 Area Project Manager (PM). The PM then transfers this number to Sciforma. Each individual satisfaction rating that is reported for the reporting period is then added together and divided by the total number of interviews conducted for the average. 

Average satisfaction with ESRT support (KPI)
Goal Supported: Deepen/Extend – Deepen use to existing communities
RY 5: not yet reported
RY 4: not yet reported
RY 3: not yet reported
RY 2: Target: 4.5 out of 5 / yr; RP1: 4.6
RY 1: Target: 4.5 out of 5 / qtr; RP1: *; RP2: 5; RP3: 4.33; RP4: 5; Total: 4.86

 

Definition/Description: After an ESRT project is marked as completed (i.e. has a workplan, work has progressed, and a final report has been filed), the L2 Directors interview the PIs, preferably via phone, and ask them to rate their satisfaction with the ECSS support they have received on a scale of 1 to 5. 

Collection methodology: PIs are contacted by the L2 Directors post filing of a final report by ESRT staff members. The L2 Director asks the PI to rate their satisfaction with the support received on a scale of one to five and this number is recorded in a spreadsheet which is shared with the L2 Area Project Manager (PM). The PM then transfers this number to Sciforma. Each individual satisfaction rating that is reported for the reporting period is then added together and divided by the total number of interviews conducted for the average. 

Number of completed ESRT projects (KPI)
Goal Supported: Deepen/Extend – Deepen use to existing communities
RY 3–RY 5: not yet reported
RY 2: Target 30 / yr; RP1: 10
RY 1: Target 30 / yr; RP1: *; RP2: 3; RP3: 6; RP4: 15; Total: 24

Definition/Description: A completed project is defined as a project that has progressed through the complete support pipeline.  This pipeline includes steps such as assignment of a consultant, production of a work plan, execution of the work plan and reporting of progress through quarterly reports and the filing of a final project report.

Collection methodology: The number of completed projects is tracked in Sciforma, XSEDE’s project management software. A report has been created in Sciforma that queries all the ECSS projects, filters for allocations that have ended in that quarter and also for projects that have a final report. This report can be filtered by the 3 project areas in ECSS: ESRT, ESCC, or ESSGW.

Average estimated months saved due to ESRT support1
Goal Supported: Deepen/Extend – Deepen use to existing communities
RY 3–RY 5: not yet reported
RY 2: Target 6 mo / project; RP1: 18.3
RY 1: *

Definition/Description: This metric is the average number of months that PIs estimate they saved due to ESRT support, ascertained during the PI interviews conducted by ECSS L2 management. The metric is reported as the total number of months estimated to have been saved due to ESRT support divided by the total number of PI interviews conducted during the reporting period. 

Collection methodology: During this interview, the PI is asked to estimate the time in months that it would have taken to get to the same stage that the project achieved if the project did not have ESRT Support. The number that would be reported as a metric is the average estimated months saved due to ESRT Support.

 * NOTE: RY1 is unique and only spans September 2016-April 2017.  RY1 RP1 doesn’t exist due to the shortened RY. RY1, RP2 only includes two months (September-October 2016). 

1New metric for PY7.

4.3 Novel & Innovative Projects (WBS 2.2.3) 

Novel and Innovative Projects (NIP) accelerates research, scholarship, and education in new communities that can strongly benefit from the use of XSEDE’s ecosystem of advanced digital services. Working closely with the XSEDE Outreach team, the NIP team identifies a subset of scientists, scholars, and educators from new communities, i.e., from disciplines or demographics that have not yet made significant use of advanced computing infrastructure, who are committed to projects that appear to require XSEDE services and are well positioned to use them efficiently. NIP staff then provide personal mentoring to these projects, helping them obtain XSEDE allocations and use them successfully. 

XSEDE projects generated and mentored by the personal efforts of the NIP experts should stimulate additional practitioners in their fields to become interested in XSEDE. Strategies will include building and promoting science gateways that serve communities of end users, and enhancing the Domain Champions program, through which successful practitioners spread the word about the benefits of XSEDE to their colleagues. 

Table 4-3: Area Metrics for Novel and Innovative Projects.

Columns: Area Metric | Program Year | Target | RP1 | RP2 | RP3 | RP4 | Total | Goal Supported

Number of new users from non-traditional disciplines using XSEDE resources and services (KPI)
Goal Supported: Deepen/Extend – Extend use to new communities
RY 3–RY 5: not yet reported
RY 2: Target 500 / yr; RP1: 101
RY 1: Target 400 / yr; RP1: *; RP2: 147; RP3: 92; RP4: 158; Total: 397

Definition/Description: This metric tracks the progress of extending XSEDE allocations to first-time users from fields of science (FOS) that have not been significant consumers of HPC resources and services.  

Collection methodology: A set of 60 FOS, each with usage over the past 10 years below 0.5% of the total normalized usage, has been identified in the XD Central Database (XDCDB). A scripted query against XDCDB counts the total number of users on newly activated grants from these FOS.
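The counting logic described above can be sketched against an in-memory database (a hypothetical schema and FOS list for illustration only; the real XDCDB tables and columns differ):

```python
import sqlite3

# Toy stand-in for XDCDB: grants with a field of science (FOS) and
# activation date, plus the users attached to each grant.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE grants (grant_id INTEGER, fos TEXT, activated TEXT);
    CREATE TABLE grant_users (grant_id INTEGER, username TEXT);
    INSERT INTO grants VALUES (1, 'Linguistics', '2017-02-01'),
                              (2, 'Physics',     '2017-02-10');
    INSERT INTO grant_users VALUES (1, 'alice'), (1, 'bob'), (2, 'carol');
""")
NON_TRADITIONAL_FOS = ("Linguistics",)  # stand-in for the ~60 identified FOS

def count_new_users(conn, fos_list, since):
    """Count distinct users on grants newly activated in the given FOS."""
    placeholders = ", ".join("?" for _ in fos_list)
    sql = (
        "SELECT COUNT(DISTINCT u.username) "
        "FROM grant_users u JOIN grants g ON g.grant_id = u.grant_id "
        f"WHERE g.fos IN ({placeholders}) AND g.activated >= ?"
    )
    return conn.execute(sql, (*fos_list, since)).fetchone()[0]

print(count_new_users(conn, NON_TRADITIONAL_FOS, "2017-01-01"))  # → 2
```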

Number of sustained users from non-traditional disciplines using XSEDE resources and services (KPI – composite)
Goal Supported: Deepen/Extend – Deepen use to existing communities
RY 3–RY 5: not yet reported
RY 2: Target 500 / qtr; RP1: 700
RY 1: Target 400 / qtr1; RP1: *; RP2: 451; RP3: 517; RP4: 716; Total: 1,684

Definition/Description: This metric tracks the progress of users from fields of science (FOS) that have not been significant consumers of HPC resources and services, who are benefiting from their XSEDE allocations in a sustained manner.  

Collection methodology: A set of 60 FOS, each with usage over the past 10 years below 0.5% of the total normalized usage, has been identified in the XD Central Database (XDCDB). A scripted query against XDCDB counts the total number of users on grants from these FOS that have used at least 10% of their allocated usage in the past year.
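The 10%-of-allocation filter amounts to a simple threshold test per grant (records and field names below are hypothetical, not real XDCDB data):

```python
# Hypothetical grant records: allocated vs. consumed service units (SUs).
grants = [
    {"grant": "A", "fos": "Linguistics", "allocated": 100000, "used": 25000},
    {"grant": "B", "fos": "Linguistics", "allocated": 100000, "used": 2000},
    {"grant": "C", "fos": "Economics",   "allocated": 50000,  "used": 6000},
]

def sustained(grants, threshold=0.10):
    """Keep grants that consumed at least `threshold` of their allocation,
    mirroring the 10%-of-allocated-usage filter described above."""
    return [g["grant"] for g in grants if g["used"] >= threshold * g["allocated"]]

print(sustained(grants))  # → ['A', 'C']
```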

Number of new XSEDE projects from target communities generated by NIP
Goal Supported: Deepen/Extend – Extend use to new communities
RY 3–RY 5: not yet reported
RY 2: Target 30 / yr; RP1: 11
RY 1: Target 20 / yr; RP1: *; RP2: 16; RP3: 5; RP4: 17; Total: 38

Definition/Description: This metric tracks the success of the personal efforts of NIP staff members to generate new XSEDE projects from fields of science (FOS) that have not been significant consumers of HPC resources and services.

Collection methodology: First-time grants generated by the personal efforts of NIP staff members are flagged in XDCDB. A scripted query reports the number of these grants.

Number of successful XSEDE projects from target communities mentored by NIP
Goal Supported: Deepen/Extend – Extend use to new communities
RY 3–RY 5: not yet reported
RY 2: Target 25 / qtr; RP1: 24
RY 1: Target 20 / qtr1; RP1: *; RP2: 23; RP3: 25; RP4: 34; Total: 82

Definition/Description: This metric tracks the success of the personal efforts of NIP staff members to mentor sustained projects from fields of science (FOS) that have not been significant consumers of HPC resources and services.

Collection methodology: Grants generated by the personal efforts of NIP staff members are flagged in XDCDB. A scripted query reports the number of these grants that have used at least 10% of their allocated usage in the past year.

 * NOTE: RY1 is unique and only spans September 2016-April 2017.  RY1 RP1 doesn’t exist due to the shortened RY. RY1, RP2 only includes two months (September-October 2016). 

1Target has been updated for RY1 RP4.

4.4 Extended Support for Community Codes (WBS 2.2.4) 

Extended Collaborative Support for Community Codes (ESCC) extends the use of XSEDE resources by collaborating with researchers and community code developers to deploy, harden, and optimize software systems necessary for research communities to create new knowledge. 

ESCC supports users through both requested and staff-initiated projects. Most ESCC projects are initiated as a result of requests for assistance during the allocation process; these are similar in nature to ESRT projects but involve community codes rather than codes developed for and by individual research groups. ESCC projects may also be initiated by staff to support a community’s needs. 

Table 4-4: Area Metrics for Extended Support for Community codes.

Columns: Area Metric | Program Year | Target | RP1 | RP2 | RP3 | RP4 | Total | Goal Supported

Average ESCC impact rating (KPI)
Goal Supported: Deepen/Extend – Deepen use to existing communities
RY 3–RY 5: not yet reported
RY 2: Target 4 out of 5 / yr; RP1: 5
RY 1: Target 4 out of 5 / qtr; RP1: *; RP2: 4.67; RP3: 5; RP4: 0; Total: 3.00

Definition/Description: After an ESCC project is marked as completed (i.e. has a workplan, work has progressed, and a final report has been filed), the L2 Directors interview the PIs, preferably via phone, and ask them to rate the impact of the ESCC support on their research on a scale of 1 to 5.

Collection methodology: PIs are contacted by the L2 Directors after ESCC staff members file the final report. The L2 Director asks the PI to rate the impact of the ESCC support on their research on a scale of one to five, and this rating is recorded in a spreadsheet shared with the L2 Area Project Manager (PM). The PM then transfers the rating to Sciforma. The individual ratings reported for the reporting period are summed and divided by the total number of interviews conducted to produce the average.

Average satisfaction with ESCC support (KPI)
Goal Supported: Deepen/Extend – Deepen use to existing communities
RY 3–RY 5: not yet reported
RY 2: Target 4.5 out of 5 / yr; RP1: 5
RY 1: Target 4.5 out of 5 / qtr; RP1: *; RP2: 4; RP3: 5; RP4: 4; Total: 4.57

Definition/Description: After an ESCC project is marked as completed (i.e. has a workplan, work has progressed, and a final report has been filed), the L2 Directors interview the PIs, preferably via phone, and ask them to rate their satisfaction with the ESCC support they have received on a scale of 1 to 5.

Collection methodology: PIs are contacted by the L2 Directors after ESCC staff members file the final report. The L2 Director asks the PI to rate their satisfaction with the support received on a scale of one to five, and this rating is recorded in a spreadsheet shared with the L2 Area Project Manager (PM). The PM then transfers the rating to Sciforma. The individual ratings reported for the reporting period are summed and divided by the total number of interviews conducted to produce the average.

Number of completed ESCC projects (KPI)
Goal Supported: Deepen/Extend – Deepen use to existing communities
RY 3–RY 5: not yet reported
RY 2: Target 10 / yr; RP1: 4
RY 1: Target 10 / yr; RP1: *; RP2: 3; RP3: 4; RP4: 3; Total: 10

Definition/Description: A completed project is defined as a project that has progressed through the complete support pipeline.  This pipeline includes steps such as assignment of a consultant, production of a work plan, execution of the work plan and reporting of progress through quarterly reports, and the filing of a final project report.

Collection methodology: The number of completed projects is tracked in Sciforma, XSEDE’s project management software. A report has been created in Sciforma that queries all the ECSS projects, filters for allocations that have ended in that quarter and also for projects that have a final report. This report can be filtered by the 3 project areas in ECSS: ESRT, ESCC, or ESSGW.

Average estimated months saved due to ESCC support1
Goal Supported: Deepen/Extend – Deepen use to existing communities
RY 3–RY 5: not yet reported
RY 2: Target 12 mo / project; RP1: 11.5
RY 1: *

Definition/Description: This metric is the average number of months that PIs estimate they saved due to ESCC support, ascertained during the PI interviews conducted by ECSS L2 management. The metric is reported as the total number of months estimated to have been saved due to ESCC support divided by the total number of PI interviews conducted during the reporting period. 

Collection methodology: During this interview, the PI is asked to estimate the time in months that it would have taken to get to the same stage that the project achieved if the project did not have ESCC Support. The number that would be reported as a metric is the average estimated months saved due to ESCC Support.

 * NOTE: RY1 is unique and only spans September 2016-April 2017. RY1 RP1 doesn’t exist due to the shortened RY. RY1 RP2 only includes two months (September-October 2016). 

1New metric for PY7.

4.5 Extended Support for Science Gateways (WBS 2.2.5) 

Extended Support for Science Gateways (ESSGW) broadens science impact and accelerates scientific discovery by collaborating in the development and enhancement of science-centric gateway interfaces and by fostering a science gateway community ecosystem. 

ESSGW projects primarily begin with user requests from the XSEDE allocation process. Like ESRT and ESCC projects, ESSGW projects progress through three activities. First, the project is assigned to an ECSS expert. Second, the project is scoped through a work plan formed in collaboration with the research group. Third, when the project is completed, the ESSGW expert produces a final report with input from the research group. A successful project completes all three phases, and each stage of the progression is measured to provide an assessment of progress. Additional criteria for success are submission of work plans within 45 days of initial contact, completion of 90% of projects with work plans, and filing of final reports within three months for 85% of completed projects. 

Table 4-5: Area Metrics for Extended Support for Science Gateways.

Columns: Area Metric | Program Year | Target | RP1 | RP2 | RP3 | RP4 | Total | Goal Supported

Average ESSGW impact rating (KPI)
Goal Supported: Deepen/Extend – Deepen use to existing communities
RY 3–RY 5: not yet reported
RY 2: Target 4 out of 5 / yr; RP1: 4.25
RY 1: Target 4 out of 5 / qtr; RP1: *; RP2: 4; RP3: 4.7; RP4: 4.5; Total: 4.54

Definition/Description: After an ESSGW project is marked as completed (i.e. has a workplan, work has progressed, and a final report has been filed), the L2 Directors interview the PIs, preferably via phone, and ask them to rate the impact of the ESSGW support on their research on a scale of 1 to 5.

Collection methodology: PIs are contacted by the L2 Directors after ESSGW staff members file the final report. The L2 Director asks the PI to rate the impact of the ESSGW support on their research on a scale of one to five, and this rating is recorded in a spreadsheet shared with the L2 Area Project Manager (PM). The PM then transfers the rating to Sciforma. The individual ratings reported for the reporting period are summed and divided by the total number of interviews conducted to produce the average.

Average satisfaction with ESSGW support (KPI)
Goal Supported: Deepen/Extend – Deepen use to existing communities
RY 3–RY 5: not yet reported
RY 2: Target 4.5 out of 5 / yr; RP1: 4.5
RY 1: Target 4.5 out of 5 / qtr; RP1: *; RP2: 5; RP3: 4.7; RP4: 4.88; Total: 4.75

Definition/Description: After an ESSGW project is marked as completed (i.e. has a workplan, work has progressed, and a final report has been filed), the L2 Directors interview the PIs, preferably via phone, and ask them to rate their satisfaction with the ESSGW support they have received on a scale of 1 to 5.

Collection methodology: PIs are contacted by the L2 Directors after ESSGW staff members file the final report. The L2 Director asks the PI to rate their satisfaction with the support received on a scale of one to five, and this rating is recorded in a spreadsheet shared with the L2 Area Project Manager (PM). The PM then transfers the rating to Sciforma. The individual ratings reported for the reporting period are summed and divided by the total number of interviews conducted to produce the average.

Number of completed ESSGW projects (KPI)
Goal Supported: Deepen/Extend – Deepen use to existing communities
RY 3–RY 5: not yet reported
RY 2: Target 10 / yr; RP1: 2
RY 1: Target 10 / yr; RP1: *; RP2: 4; RP3: 3; Total: 14

Definition/Description: A completed project is defined as a project that has progressed through the complete support pipeline.  This pipeline includes steps such as assignment of a consultant, production of a work plan, execution of the work plan and reporting of progress through quarterly reports, and the filing of a final project report.

Collection methodology: The number of completed projects is tracked in Sciforma, XSEDE’s project management software. A report has been created in Sciforma that queries all the ECSS projects, filters for allocations that have ended in that quarter and also for projects that have a final report. This report can be filtered by the 3 project areas in ECSS: ESRT, ESCC, or ESSGW.

Average estimated months saved due to ESSGW support1
Goal Supported: Deepen/Extend – Deepen use to existing communities
RY 3–RY 5: not yet reported
RY 2: Target 12 mo / project; RP1: 4
RY 1: *

Definition/Description: This metric is the average number of months that PIs estimate they saved due to ESSGW support, ascertained during the PI interviews conducted by ECSS L2 management. The metric is reported as the total number of months estimated to have been saved due to ESSGW support divided by the total number of PI interviews conducted during the reporting period. 

Collection methodology: During this interview, the PI is asked to estimate the time in months that it would have taken to get to the same stage that the project achieved if the project did not have ESSGW Support. The number that would be reported as a metric is the average estimated months saved due to ESSGW Support.

 * NOTE: RY1 is unique and only spans September 2016-April 2017.  RY1 RP1 doesn’t exist due to the shortened RY. RY1, RP2 only includes two months (September-October 2016). 

1New metric for PY7.

4.6 ECSS Communities – Extended Support for Education, Outreach, & Training (WBS 2.2.6) 

Extended Collaborative Support for Training, Education & Outreach (ESTEO) prepares the current and next generation of researchers, engineers, and scholars in the use of advanced digital technologies by providing technical support for planned Training, Education, and Outreach activities. 

Typical events include train-the-trainers events, on-site classes requested by Campus Champions, regional workshops, conferences, and summer schools (including the International HPC Summer School). Staff also create and review online documentation and training modules. This on-demand training is increasingly popular with the user community when both time and travel budgets are limited. 

Table 4-6: Area Metrics for Extended Support for Education, Outreach, & Training.

Columns: Area Metric | Program Year | Target | RP1 | RP2 | RP3 | RP4 | Total | Goal Supported

Number of Campus Champions fellows
Goal Supported: Deepen/Extend – Preparing the current and next generation
RY 3–RY 5: not yet reported
RY 2: Target 4 / yr; RP1: -
RY 1: Target 4 / yr; RP1: *; RP2: 5; RP3: -

Definition/Description: This is the number of campus champion fellows supported on an annual basis.

Collection methodology: Count of campus champion fellows appointed at the beginning of each project year.

Average score of fellows assessment
Goal Supported: Deepen/Extend – Preparing the current and next generation
RY 3–RY 5: not yet reported
RY 2: Target 4.5 out of 5 / yr; RP1: -
RY 1: Target 4 out of 5 / yr; RP1: *; RP2: 4.33; RP3: -; RP4: -; Total: 4.33

Definition/Description: Satisfaction score generated by participants of the Champion Fellows program.

Collection methodology: As of 2015, Champion Fellows are sent an annual follow-up survey to assess their overall experience in the program. This is developed and administered by the XSEDE Evaluation team. Fellows are asked to rate a set of statements regarding their experience on a scale of 1 (strongly disagree) to 5 (strongly agree). This metric is calculated by averaging responses to the statement “Overall I am satisfied with my experience.”

Number of live training events staffed
Goal Supported: Deepen/Extend – Preparing the current and next generation
RY 3–RY 5: not yet reported
RY 2: Target 20 / yr; RP1: 15
RY 1: Target 20 / yr; RP1: *; RP2: 7; RP3: 9; RP4: 10; Total: 26

Definition/Description: Count of live training events staffed by ESTEO trainers, as reported in ESTEO staff quarterly reports.

Collection methodology: Tally of live training events as reported by ESTEO staff.

Number of staff training events
Goal Supported: Deepen/Extend – Preparing the current and next generation
RY 3–RY 5: not yet reported
RY 2: Target 2 / yr; RP1: 2
RY 1: Target 2 / yr; RP1: *; RP2: 0; RP3: 0; RP4: 4; Total: 4

Definition/Description: Number of staff training events held.

Collection methodology: The number of ECSS staff training events is totaled.

Attendees at staff training events
Goal Supported: Deepen/Extend – Preparing the current and next generation
RY 3–RY 5: not yet reported
RY 2: Target 40 / yr; RP1: 39
RY 1: Target 40 / yr; RP1: *; RP2: 0; RP3: 0; RP4: 51; Total: 51

Definition/Description: Total number of attendees across all ECSS staff training events held.

Collection methodology: An approximate total of attendees at ECSS staff training events, based on sign-in sheets as well as counts of attendees in the room and on the teleconference.

Attendees at ECSS Symposia
Goal Supported: Deepen/Extend – Preparing the current and next generation
RY 3–RY 5: not yet reported
RY 2: Target 300 / yr; RP1: 48
RY 1: Target 300 / yr; RP1: *; RP2: 78; RP3: 83; RP4: 75; Total: 236

Definition/Description: Number of attendees at ECSS symposia.

Collection methodology: An approximate total of attendees at ECSS symposia based on sign-ins to web conferencing technology.

 * NOTE: RY1 is unique and only spans September 2016-April 2017.  RY1 RP1 doesn’t exist due to the shortened RY. RY1, RP2 only includes two months (September-October 2016). 

- Data reported annually.

[Table of Contents]

5        XSEDE Cyberinfrastructure Integration (XCI, WBS 2.3) 

The mission of XSEDE Cyberinfrastructure Integration (XCI) is to facilitate interaction, sharing, and compatibility of all relevant software and related services across the national CI community—building and improving upon the foundational efforts of XSEDE.

XCI envisions a national cyberinfrastructure that is consistent, straightforward to understand, and straightforward to use by researchers and students. Service to XSEDE Service Providers is a particularly important aspect of XCI’s activities. Service Providers should find it as simple as possible to provide resources to the national research community through shared expertise and when appropriate, software and tools. We strive to make it possible for researchers and students to effortlessly use computational and data analysis resources ranging from those allocated by XSEDE to campus-based CI facilities, an individual’s workstation and commercial cloud providers, and to interact with these resources via CI software services such as science gateways and Globus Online. Through XCI, XSEDE serves an aligning function within the nation, not by rigorously defining a particular architecture, but rather by assembling a technical infrastructure that facilitates interaction and interoperability across the national CI community and which is adopted by campus, regional, and national CI providers because it makes their task of delivering services easier and the delivered services better. The suite of interoperable and compatible software tools that XSEDE will make available to the community will be based on those already in use and services will be added that address emerging needs including data and computational services. 

The XSEDE User Requirements Evaluation and Prioritization Working Group (UREP) prioritizes users’ needs for XCI and provides the primary direction and prioritization for the Requirements Analysis and Capability Delivery (RACD) team. With that prioritization as input, the NSF-funded Level 1 SPs are generally XCI’s foremost customers. These are the nationally accessible cyberinfrastructure resources created and operated by Service Providers that are required to be interoperable with XSEDE services and to be allocated via the XRAC process. The Level 1 SPs are thus the cyberinfrastructure service providers most invested in XSEDE, with important requirements both from XSEDE and for XSEDE (and XCI) to fulfill. Level 2 and 3 SPs represent the next two most motivated and engaged groups of cyberinfrastructure providers in the US. The SP Forum is another source of user needs and priorities; its members are national leaders and exemplars for the US cyberinfrastructure community as a whole. By engaging with Level 1, 2, and 3 SPs and the SP Forum in particular, we believe we can obtain the most detailed statements of needs and priorities. New tools implemented under XCI’s leadership are most likely to be widely and quickly adopted by the national community of CI providers if they are first adopted by members of the SP Forum. Because the XSEDE Campus Resource Integration team (CRI) deals primarily with Level 1, 2, and 3 SPs, along with campus cyberinfrastructure administrators and support experts, the SP Forum and Campus Champions are XCI’s primary sources of direction for prioritizing efforts.

The Community Software Repository (CSR) is a new service and tool catalog for the national research community that facilitates connecting resources, software, and services into the broader cyberinfrastructure ecosystem. To access the CSR, visit https://software.xsede.org/xcsr/xsede-use-cases.

Table 5-1: Area Metrics for XSEDE Cyberinfrastructure Integration.

Columns: Area Metric | Program Year | Target | RP1 | RP2 | RP3 | RP4 | Total | Goal Supported

Average satisfaction rating of XCI services1 (KPI)
Goal Supported: Advance – Create an open and evolving e-infrastructure
RY 3–RY 5: not yet reported
RY 2: Target 4 out of 5 / yr; RP1: 4.25
RY 1: Target 4 out of 5 / yr; RP1: *; RP2: 4.8; RP3: 4.81; RP4: 4.5; Total: 4.5

Definition/Description: Customer satisfaction rating of XCI staff-provided services (does not include software services, which are covered in RACD metrics).

Collection methodology: User interviews and micro-surveys conducted on an as-needed basis after services have been delivered.

Number of new capabilities made available for production deployment (KPI)
Goal Supported: Advance – Create an open and evolving e-infrastructure
RY 3–RY 5: not yet reported
RY 2: Target 7 / yr; RP1: 1
RY 1: Target 7 / yr; RP1: *; RP2: 0; RP3: 0; Total: 6

Definition/Description: New use cases that have been fully enabled using new or enhanced components and made available through completed RACD Capability Delivery Plan (CDP) activities.

Collection methodology: Count is done manually and maintained in JIRA.

Total number of systems that use one or more CRI provided toolkits
Goal Supported: Advance – Create an open and evolving e-infrastructure
RY 3–RY 5: not yet reported
RY 2: Target 700 by the end of RY2; RP1: 628
RY 1: Target 450 / yr; RP1: *; RP2: 512; RP3: 513; Total: 1,594

Definition/Description: Systems using one or more of the software toolkits provided or advocated by CRI (Globus Online, the XSEDE Basic Compatible Cluster (XCBC), the XSEDE National Integration Toolkit (XNIT), or other toolkits as they are developed).

Collection methodology: Globus numbers are reported each quarter by the Globus team. XCBC installation numbers are collected through direct contact with installation sites (site visits, tickets to help@xsede.org, and a “stand and be counted” installation script). XNIT and other toolkit users are counted from download statistics on software distribution sites.
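The aggregation across these sources can be sketched as follows (source names and numbers are illustrative only, not actual CRI data):

```python
# Hypothetical per-source counts for one reporting period; the real
# numbers come from Globus quarterly reports, site contacts, and
# download statistics, as described above.
toolkit_counts = {
    "globus": 300,   # reported quarterly by the Globus team
    "xcbc": 40,      # direct contact with installation sites
    "xnit": 160,     # download statistics
}

def total_systems(counts):
    """Total systems using one or more CRI-provided toolkits.
    Note: summing per-toolkit counts can double-count a system running
    several toolkits, so the sum is effectively an upper bound."""
    return sum(counts.values())

print(total_systems(toolkit_counts))  # → 500
```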

Percentage of XSEDE recommended tools that are adopted by allocated systems (all Level 1 and Level 2 SPs that are allocated by XSEDE in whole or in part) where the tools are appropriate
Goal Supported: Advance – Create an open and evolving e-infrastructure
RY 3–RY 5: not yet reported
RY 2: Target 100% / yr; RP1: 100%
RY 1: Target 100% / yr; RP1: *; RP2: 100%; RP3: 100%; RP4: 100%; Total: 100%

Definition/Description: The ratio of the number of recommended tools adopted by Level 1 SPs and Level 2 SPs that are allocated in whole or in part by XSEDE to the total number of tools that are recommended for those SPs, expressed as a percentage.  “Recommended” tools are the tools which are applicable to the type and purpose of the system in question.

 

Collection methodology: SPs are required to register the tools they deploy.  This information is kept by Information Services.

 * NOTE: RY1 is unique and only spans September 2016-April 2017.  RY1 RP1 doesn’t exist due to the shortened RY. RY1, RP2 only includes two months (September-October 2016). 

1This number is for RACD only as XCRI satisfaction numbers are unavailable for this reporting period.

5.1 XCI Director's Office (WBS 2.3.1)

The XCI Director’s Office has been established to provide the oversight necessary to ensure the greatest efficiency and effectiveness of the XCI area. This oversight includes providing direction to the L3 management team; coordinating, and participating in, XCI planning activities and reports through the area’s Project Manager; monitoring compliance with budgets; and retargeting effort if necessary. The Director’s Office also attends and supports the preparation of project-level reviews and activities.

The XCI Director’s Office will continue to manage and set direction for XCI activities and responsibilities. Its staff will contribute to and attend bi-weekly senior management team calls; contribute to the project-level plan, schedule, and budget; contribute to XSEDE quarterly, annual, and other reports as required by the NSF; and attend XSEDE quarterly and annual meetings.

5.2 Requirements Analysis and Capability Delivery (RACD, WBS 2.3.2) 

The Requirements Analysis & Capability Delivery (RACD) team facilitates the integration, maintenance, and support of cyberinfrastructure capabilities addressing user technical requirements. The process begins by preparing Capability Delivery Plans (CDPs) that describe the technical gaps in XSEDE’s prioritized Use Cases. To fill the gaps, RACD evaluates and/or tests existing software solutions, engages with software providers and facilitates software and service integration. To ensure software and service adoption and ROI, RACD will involve users, Service Providers and operators in an integration process that uses engineering best practices and instruments components to measure usage. Once components are integrated, RACD will facilitate software maintenance and enhancements in response to evolving user needs and an evolving infrastructure environment.

Table 5-2: Area Metrics for Requirements Analysis and Capability Delivery.

Columns: Area Metric | Program Year | Target | RP1 | RP2 | RP3 | RP4 | Total | Goal Supported

Number of capability delivery plans (CDPs) prepared for prioritized Use Cases
Goal Supported: Advance – Create an open and evolving e-infrastructure
RY 3–RY 5: not yet reported
RY 2: Target 12 / yr; RP1: 3
RY 1: *

Definition/Description: A CDP is a workplan for delivering what a use case describes; this metric is the number of CDPs we prepare based on the UREP-prioritized use cases selected for implementation.

Collection methodology: Using our JIRA logs, we total the number of CDPs prepared.

Number of CI integration assistance engagements
Goal Supported: Advance – Create an open and evolving e-infrastructure
RY 3–RY 5: not yet reported
RY 2: Target 6 / yr; RP1: 4
RY 1: Target 6 / yr; RP1: *; RP2: 4; RP3: 1; RP4: 0; Total: 5

Definition/Description: The number of engagements (meetings, conferences) with non-XSEDE parties to integrate new CI components with XSEDE. Potential initial candidates include: federating OpenStack Keystone with XSEDE; Molecular Sciences Software Institute; Globus Groups XSEDE integration; QBEST XSEDE integration; and Data Hubs.

Collection methodology: Count in JIRA the number of CI integration assistance engagement activities worked on during the reporting period.

Average time from support request to solution1
Goal Supported: Advance – Create an open and evolving e-infrastructure
RY 3–RY 5: not yet reported
RY 2: Target < 30 days; RP1: 4 days
RY 1: Target 45 days or less / yr; RP1: *; RP2: 7 days; RP3: 16 days; RP4: 4 days; Total: 9 days

Definition/Description: Days from defect/support request submission to solution communicated.

Collection methodology: From a report of all tickets resolved during the reporting period, calculate the number of days between the ticket being opened and the solution being communicated.
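The calculation can be sketched as follows (the ticket dates are hypothetical, chosen to mirror the 7-, 16-, and 4-day reporting periods above):

```python
from datetime import date

# Hypothetical resolved tickets: (opened, solution communicated).
tickets = [
    (date(2017, 3, 1), date(2017, 3, 8)),    # 7 days
    (date(2017, 3, 2), date(2017, 3, 18)),   # 16 days
    (date(2017, 4, 1), date(2017, 4, 5)),    # 4 days
]

def average_days_to_solution(tickets):
    """Average the open-to-solution span over all tickets resolved
    in the reporting period."""
    spans = [(closed - opened).days for opened, closed in tickets]
    return sum(spans) / len(spans)

print(average_days_to_solution(tickets))  # → 9.0
```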

Number of significant fixes and enhancements to production components2
Goal Supported: Advance – Create an open and evolving e-infrastructure
RY 3–RY 5: not yet reported
RY 2: Target 16 / yr; RP1: 14
RY 1: *

Definition/Description: Sum of two other metrics: the number of maintenance releases and upgrades delivered for Service Provider deployment, plus the number of fixes and enhancements to centrally operated services.

Collection methodology: Will be tracked as JIRA activities.

Number of new components instrumented and tracked for usage and ROI analysis2
Goal Supported: Advance – Create an open and evolving e-infrastructure
RY 3–RY 5: not yet reported
RY 2: Target 4 / yr; RP1: 0
RY 1: *

Definition/Description: Prioritized components for which we will begin to track usage.

Collection methodology: Will be tracked as JIRA activities.

Average satisfaction rating of RACD services2
Goal Supported: Advance – Create an open and evolving e-infrastructure
RY 3–RY 5: not yet reported
RY 2: Target 4 out of 5 / yr; RP1: 3.75
RY 1: *

Definition/Description: Average of results from three Other Metrics - User rating of components delivered in production; Operator rating of components delivered for production deployment; Software/Service Provider rating of our integration assistance.

Collection methodology: User rating of components delivered in production and Operator rating of components delivered for production deployment are gathered via a micro-survey implemented in the XSEDE Community Software Repository.

Software/service provider rating of our integration assistance is gathered through user interviews and micro-surveys conducted on an as-needed basis after integration assistance has been provided.

 * NOTE: RY1 is unique and only spans September 2016-April 2017.  RY1 RP1 doesn’t exist due to the shortened RY. RY1, RP2 only includes two months (September-October 2016). 

1 This metric is reported in each reporting period, although the target is annual. The annual figure is the average over all reporting periods.

2 New metric for PY7.

5.3 XSEDE Cyberinfrastructure Resource Integration (WBS 2.3.3) 

The mission of the Cyberinfrastructure Resource Integration (CRI) team is to work with SPs, CI providers, and campuses to maximize the aggregate utility of national cyberinfrastructure. CRI will coordinate interactions between SPs and XSEDE in the SP Forum, engage with national CI providers, gather requirements for tools and training that assist users of XSEDE and other cyberinfrastructures, identify and disseminate the benefits and costs of interoperating with XSEDE, and create toolkits that fit and improve common usage modalities.

CRI’s activities are reflected in the uptake of CRI-integrated toolkits such as the XSEDE Campus Bridging Cluster toolkit and the XSEDE National Integration Toolkit, as well as Globus Transfer clients and other toolkits as they are developed.

Table 5-3: Area Metrics for XSEDE Cyberinfrastructure Resource Integration.

Area Metric | Program Year | Target | RP1 | RP2 | RP3 | RP4 | Total | Goal Supported

Total number of systems that use one or more CRI provided toolkits

Goal Supported: Advance – Create an open and evolving e-infrastructure

RY 5: –
RY 4: –
RY 3: –
RY 2: Target 700 / yr | RP1: 628
RY 1: Target 450 / yr | *5125131594

 

Definition/Description: Systems using one or more of the software provided or advocated by CRI (Globus Online, the XSEDE Basic Compatible Cluster (XCBC) or XSEDE National Integration Toolkit (XNIT), or other toolkits as they are developed).

Collection methodology: Globus numbers are reported each quarter by the Globus team. XCBC installation numbers are collected by direct contact with installation sites (site visits, tickets to help@xsede.org, “stand and be counted” installation script); XNIT and other toolkit user counts are based on download statistics from software distribution sites.

User satisfaction with CRI services1

Goal Supported: Advance – Create an open and evolving e-infrastructure

RY 5: –
RY 4: –
RY 3: –
RY 2: Target 4 of 5 / yr | RP1: 4.31
RY 1: Target 4 of 5 / yr | RP1: * | RP2: * | RP3: 5 | RP4: NA | Total: 5

 

Definition/Description: Users of CRI services (site visits) are asked to report their satisfaction with CRI services.  

Collection/Methodology: Direct request from CRI site visit participants via email and logged in the staff wiki.

Number of repository subscribers to XCRI cluster and laptop toolkits

Goal Supported: Advance – Create an open and evolving e-infrastructure

RY 5: –
RY 4: –
RY 3: –
RY 2: Target 150 / yr | RP1: 102
RY 1: Target 150 / yr | *914499

 

Definition/Description: Number of systems getting updates for XCRI toolkit software.

Collection methodology: Download logs for CRI toolkits are checked for regular access, systems that check the repository more than once per quarter are regarded to be subscribers and added to the count. Logs are stored on the software.xsede.org and cb-repo.iu.xsede.org systems.
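A minimal sketch of the subscriber count described above, assuming the download logs have been reduced to one client hostname per repository access (the hostnames are hypothetical):

```python
from collections import Counter

def count_subscribers(accesses):
    """Count systems that checked the repository more than once in the quarter."""
    per_host = Counter(accesses)
    return sum(1 for hits in per_host.values() if hits > 1)

quarter_log = [
    "c1.campus.edu", "c1.campus.edu",          # subscriber (2 checks)
    "c2.campus.edu",                           # one-off download
    "c3.lab.org", "c3.lab.org", "c3.lab.org",  # subscriber (3 checks)
]
print(count_subscribers(quarter_log))  # 2
```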

Aggregate number of TeraFLOPS of cluster systems using XCRI toolkits

Goal Supported: Advance – Create an open and evolving e-infrastructure

RY 5: Target 1,000
RY 4: Target 700
RY 3: Target 500
RY 2: Target additional 200 / yr | RP1: +10 (742 total)
RY 1: Target 1,000 / length of project | RP1: * | RP2: 5321 | RP3: 5321 | RP4: 732 | Total: 732

 

Definition/Description: Total number of TeraFLOPS of all clusters which have XCRI software toolkits installed. This metric has a five-year target of 1,000 TeraFLOPS, with aggregate goals for the end of each RY: RY1 200 TF, RY2 300 TF, RY3 500 TF, RY4 700 TF, and RY5 1,000 TF, or written cumulatively as follows:

RY1 = 200

RY1 + RY2 = 300

RY1 + RY2 + RY3 = 500

RY1 + RY2 + RY3 + RY4 = 700

RY1 + RY2 + RY3 + RY4 + RY5 = 1,000

Collection methodology: Users of clusters with XCRI toolkits installed are surveyed (self-reported) for their aggregate TeraFLOPS. The first of these surveys was conducted in May 2016 and will be repeated annually.

Number of partnership interactions between XCRI and SPs, national CI organizations, and campus CI providers

Goal Supported: Advance – Create an open and evolving e-infrastructure

RY 5: –
RY 4: –
RY 3: –
RY 2: Target 12 / yr | RP1: 10
RY 1: Target 12 / yr | RP1: * | RP2: 5¹ | RP3: 7¹ | RP4: 10 | Total: 22

 

Definition/Description: Meetings and presentations attended by XCRI and SPs, organizations such as ACI-REF and Open Science Grid, or individual campus CI providers.

Collection methodology: Whoever attends the meeting or presentation is asked to make a count of attendees; the event titles and counts are stored in a document at box.iu.edu and the XCRI section of confluence.xsede.org.

Toolkit updates1

Goal Supported: Advance – Create an open and evolving e-infrastructure

RY 5: –
RY 4: –
RY 3: –
RY 2: Target 4 / yr | RP1: 12
RY 1: Target 4 / yr | RP1: * | RP2: 2 | RP3: 1 | RP4: 6 | Total: 9

 

Definition/Description: Number of updates to packages within toolkits or updates to toolkit scripts/glue code, which includes bugfixes/features/security improvements.

Collection methodology: Packagers of individual RPMs at Cornell maintain a spreadsheet of version updates, and this information is recorded in the XCRI section of the wiki. Updates to scripts/glue code are made in git repositories and are also recorded on the wiki.

New Toolkits released1

Goal Supported: Advance – Create an open and evolving e-infrastructure

RY 5: –
RY 4: –
RY 3: –
RY 2: Target 2 / yr | RP1: 0
RY 1: Target 2 / yr | RP1: * | RP2: * | RP3: 1 | RP4: 0 | Total: 1

 

Definition/Description: Number of new toolkits released for use by community members, including use case, capability delivery plan, testing, and release.

Collection methodology: Toolkit tracking pages on XSEDE wiki will have detailed description of each toolkit, and new toolkits will be counted each quarter.

 * NOTE: RY1 is unique and only spans September 2016-April 2017.  RY1 RP1 doesn’t exist due to the shortened RY. RY1, RP2 only includes two months (September-October 2016). 

1 Data for this metric was previously underreported.

[Table of Contents]

6        XSEDE Operations (WBS 2.4) 

The mission of XSEDE Operations is to install, connect, maintain, secure, and evolve an integrated cyberinfrastructure that incorporates a wide range of digital capabilities to support national scientific, engineering, and scholarly research efforts. 

In addition to the Operations Director’s Office (§6.1), Operations staff is subdivided into four teams based on the work breakdown structure: Cybersecurity (§6.2), Data Transfer Services (DTS, §6.3), XSEDE Operations Center (XOC, §6.4), and Systems Operational Support (SysOps, §6.5). The Operations management team meets weekly and individual Operations groups meet approximately bi-weekly with all meeting minutes posted to the XSEDE wiki. 

Table 6-1: Area Metrics for XSEDE Operations.

Area Metric | Program Year | Target | RP1 | RP2 | RP3 | RP4 | Total | Goal Supported

Average composite availability of core services (geometric mean of critical services and XRAS)

(KPI)

Goal Supported: Sustain – Provide reliable and secure infrastructure

RY 5: –
RY 4: –
RY 3: –
RY 2: Target 99.9% / qtr | RP1: 99.8%
RY 1: Target 99% / qtr | RP1: * | RP2: 99.9% | RP3: 99.9% | RP4: 99.9% | Total: 99.9%

 

Definition/Description: The percent average composite availability of core services is the geometric mean of the core enterprise services availability (%) and the POPS/XRAS availability (%). Because the availability of each component is measured separately, the values are combined using a geometric mean to determine the composite availability.

Collection methodology: See the individual collection methodologies for each component in the responsible WBS area below: Core enterprise services (Systems Operational Support 2.4.5) and POPS/XRAS (AA&AM 2.5.3).
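The composite can be computed directly from the two component percentages; a minimal sketch, with the core enterprise services at 99.9% and POPS/XRAS at 99.8% (illustrative numbers, not reported values):

```python
import math

def composite_availability(percentages):
    """Geometric mean of per-component availability percentages."""
    log_sum = sum(math.log(p) for p in percentages)
    return math.exp(log_sum / len(percentages))

print(round(composite_availability([99.9, 99.8]), 2))  # 99.85
```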

Hours of downtime with direct user impacts from XSEDE security incidents

(KPI)

Goal Supported: Sustain – Provide reliable and secure infrastructure

RY 5: –
RY 4: –
RY 3: –
RY 2: Target 0 / qtr | RP1: 0
RY 1: Target <24 / qtr | RP1: * | RP2: 0 | RP3: 146 | RP4: 0 | Total: 146

 

Definition/Description: This metric measures resource unavailability for users as a result of an XSEDE-wide security event (involving a Tier 1 service or spanning more than one SP). The metric is calculated as the time to repair (TTR) summed across all applicable services and incidents during the quarter.

Collection methodology: The Cybersecurity co-leads determine that an XSEDE-wide security event is responsible for an outage. The Sysops lead will determine the TTR based on monitoring and other information in logs and tickets, and the TTR value will be the data point reported.

Mean time to ticket resolution by XOC and WBS ticket queues (hrs.)

(KPI)

Goal Supported: Sustain – Provide excellent user support

RY 5: –
RY 4: –
RY 3: –
RY 2: Target <24 / qtr | RP1: 26
RY 1: Target <24 / qtr | RP1: * | RP2: 24.0 | RP3: 28.2 | RP4: 23.1 | Total: 25.1

 

Definition/Description: XSEDE-funded areas are measured to ensure that, on average, tickets are resolved in a timely manner. This includes all XSEDE WBS ticket queues as well as the XSEDE Operations Center (XOC) queue; it does NOT apply to Service Provider ticket queues. End users expect their issues to be addressed within a reasonable amount of time, so the target is a resolution time under 24 hours.

Collection methodology: The data is stored locally via the Request Tracker service, which is the XSEDE ticket software and which is located at TACC. Reports are generated against the RT database to determine MTTR and other metrics.
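A sketch of the MTTR calculation over such a report, using epoch-second timestamps; the data layout is an illustrative assumption, not the actual Request Tracker schema:

```python
def mean_hours_to_resolution(tickets):
    """Mean time to resolution, in hours, across resolved tickets.

    `tickets` holds (created_epoch, resolved_epoch) pairs from a report
    against the ticket database.
    """
    hours = [(resolved - created) / 3600 for created, resolved in tickets]
    return sum(hours) / len(hours)

report = [(0, 72_000), (0, 90_000), (0, 97_200)]  # 20 h, 25 h, 27 h
print(mean_hours_to_resolution(report))  # 24.0
```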

 * NOTE: RY1 is unique and only spans September 2016-April 2017.  RY1 RP1 doesn’t exist due to the shortened RY. RY1, RP2 only includes two months (September-October 2016). 

6.1 Operations Director's Office (WBS 2.4.1)

The Operations Director’s Office has been established to provide the necessary oversight to ensure the greatest efficiency and effectiveness of the Operations area. This oversight includes providing direction to the L3 management team, coordinating and participating in Operations planning activities and reports through the area’s Project Manager, monitoring compliance with budgets, and retargeting effort if necessary. The Director’s Office also attends and supports the preparation of project-level reviews and activities.

The Operations Director’s Office will continue to manage and set direction for Operations activities and responsibilities. They will contribute to and attend bi-weekly senior management team calls, contribute to the project level plan, schedule, and budget, contribute to XSEDE IPR, annual, and other reports as required by the NSF and attend XSEDE quarterly & annual meetings. Lastly, the Director’s Office will advise the XSEDE PI on many issues, especially those relevant to this WBS area.

6.2 Cybersecurity (WBS 2.4.2) 

The Cybersecurity (Ops-Sec) group protects the confidentiality, integrity, and availability of XSEDE resources and services. Users expect XSEDE resources to be reliable and secure, so the security team’s goal is to minimize any interruption of services related to a security event. 

Table 6-2: Area Metrics for Cybersecurity.

Area Metric | Program Year | Target | RP1 | RP2 | RP3 | RP4 | Total | Goal Supported

Hours of downtime with direct user impacts from XSEDE security incidents

(KPI)

Goal Supported: Sustain – Provide reliable and secure infrastructure

RY 5: –
RY 4: –
RY 3: –
RY 2: Target 0 / qtr | RP1: 0
RY 1: Target <24 / qtr | RP1: * | RP2: 0 | RP3: 146 | RP4: 0 | Total: 146

 

Definition/Description: This metric measures resource unavailability for users as a result of an XSEDE-wide security event (involving a Tier 1 service or spanning more than one SP). The metric is calculated as the time to repair (TTR) summed across all applicable services and incidents during the quarter.

Collection methodology: The Cybersecurity co-leads determine that an XSEDE-wide security event is responsible for an outage. The Sysops lead will determine the TTR based on monitoring and other information in logs and tickets, and the TTR value will be the data point reported.

 * NOTE: RY1 is unique and only spans September 2016-April 2017.  RY1 RP1 doesn’t exist due to the shortened RY. RY1, RP2 only includes two months (September-October 2016). 

6.3 Data Transfer Services (WBS 2.4.3) 

The Data Transfer Services (DTS) group facilitates data movement and management for the community by maintaining and continuously evolving XSEDE data services and resources. 

Table 6-3: Area Metrics for Data Transfer Services.

Area Metric | Program Year | Target | RP1 | RP2 | RP3 | RP4 | Total | Goal Supported

Performance (Gbps) of instrumented, intra-XSEDE GridFTP transfers > 1 GB

Goal Supported: Sustain – Provide reliable and secure infrastructure

RY 5: –
RY 4: –
RY 3: –
RY 2: Target 1.5 Gbps / qtr | RP1: 1.8
RY 1: Target 1 Gbps / qtr | RP1: * | RP2: 1.6 | RP3: 1.8 | RP4: 1.2 | Total: 1.5

 

Definition/Description: The metric for data transfer performance on large files reflects the ability to consistently provide high-performance data movement facilities.

Collection methodology: Each SP logs all GridFTP transfer data locally. These logs are collected on a quarterly basis by XSEDE staff and loaded into a database. A number of SQL queries are then run to select intra-XSEDE transfers of the appropriate size and to compute the average transfer performance on those queries.
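The selection and averaging step can be sketched as follows, with each log record reduced to (bytes moved, transfer seconds); the numbers are illustrative, not actual log data:

```python
def average_gbps(transfers, min_bytes=10**9):
    """Average throughput (Gbps) over transfers larger than `min_bytes`."""
    rates = [
        (nbytes * 8 / 1e9) / seconds   # bytes -> gigabits, then per second
        for nbytes, seconds in transfers
        if nbytes > min_bytes
    ]
    return sum(rates) / len(rates)

gridftp_log = [
    (2 * 10**9, 10.0),   # 1.6 Gbps
    (5 * 10**9, 20.0),   # 2.0 Gbps
    (5 * 10**8, 1.0),    # under 1 GB, excluded from the average
]
print(average_gbps(gridftp_log))  # 1.8
```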

 * NOTE: RY1 is unique and only spans September 2016-April 2017.  RY1 RP1 doesn’t exist due to the shortened RY. RY1, RP2 only includes two months (September-October 2016). 

6.4 XSEDE Operations Center (WBS 2.4.4) 

The XSEDE Operations Center (XOC) staff serve as user advocates, providing timely and accurate assistance to the XSEDE community, while simultaneously monitoring and troubleshooting user-facing systems and services.

Table 6-4: Area Metrics for XSEDE Operations Center

Area Metric | Program Year | Target | RP1 | RP2 | RP3 | RP4 | Total | Goal Supported

Mean time to resolution in XOC ticket queue

Goal Supported: Sustain – Provide excellent user support

RY 5: –
RY 4: –
RY 3: –
RY 2: Target <4 hrs / qtr | RP1: 4.4
RY 1: Target <24 hours / qtr | RP1: * | RP2: 4.2 | RP3: 5.9 | RP4: 3.7 | Total: 4.6

 

Definition/Description: Tickets resolved by the XOC are measured to ensure that, on average, they are resolved in a timely manner. This covers only the XSEDE Operations Center (XOC) ticket queue; it does NOT apply to the WBS ticket queues or the Service Provider ticket queues. End users expect their issues to be addressed within a reasonable amount of time, so the target is a resolution time under 24 hours.

Collection methodology: The data is stored locally via the Request Tracker service, which is the XSEDE ticket software and which is located at TACC. Reports are generated against the RT database to determine MTTR and other metrics.

User satisfaction with tickets closed by the XOC

Goal Supported: Sustain – Provide excellent user support

RY 5: –
RY 4: –
RY 3: –
RY 2: Target 4.5 out of 5 / qtr | RP1: 4.5
RY 1: Target 4 out of 5 / qtr | RP1: * | RP2: 4.8 | RP3: 4.2 | RP4: 5.0 | Total: 4.7

Definition/Description: This is a survey of all external users whose help tickets are closed by the XOC during the specified reporting period.

Collection methodology: Within two hours of the resolution and closure of a ticket, the user receives an e-mail containing a link to a Survey Monkey survey. The survey consists of five questions using a 5 point Likert scale and the results are stored in Survey Monkey.

 * NOTE: RY1 is unique and only spans September 2016-April 2017.  RY1 RP1 doesn’t exist due to the shortened RY. RY1, RP2 only includes two months (September-October 2016). 

6.5 Systems Operational Support (WBS 2.4.5) 

Systems Operational Support (SysOps) provides enterprise level support and system administration for all XSEDE central services. 

Table 6-5: Area Metrics for Systems Operational Support

Area Metric

Program Year

Target

RP1

RP2

RP3

RP4

Total

Goal Supported

Average availability of critical enterprise services (%) [geometric mean]

(KPI - composite)

Goal Supported: Sustain – Provide reliable and secure infrastructure

RY 5: –
RY 4: –
RY 3: –
RY 2: Target 99.9% / qtr | RP1: 99.9%
RY 1: Target 99% / qtr | RP1: * | RP2: 99.9% | RP3: 99.9% | RP4: 99.9% | Total: 99.9%

 

Definition/Description: The availability metric shows the amount of time that critical enterprise services are available for use. Services are grouped into criticality tiers based on their significance and the dependence that other services have on them. Tier 1 systems include Kerberos, the XDCDB, and DNS, whereas Tier 3 services are development or business-day-supported systems such as the RT test instance and the Sciforma test instance. Calculating the metric as a geometric mean gives a more accurate picture of the availability of the enterprise services to users.

Collection methodology: The data is stored locally to the Nagios service at IU. Nagios is an application that constantly logs the availability of core central services. These logs are queried to determine if/when a service becomes unavailable to users.
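From those logs, a single service's quarterly availability reduces to simple arithmetic; the hours below are illustrative:

```python
def availability_pct(hours_in_period, downtime_hours):
    """Percent of the reporting period a service was available."""
    return 100.0 * (hours_in_period - downtime_hours) / hours_in_period

# A service down a total of 2.2 hours in a ~91-day (2,184-hour) quarter:
print(round(availability_pct(2184, 2.2), 2))  # 99.9
```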

 * NOTE: RY1 is unique and only spans September 2016-April 2017.  RY1 RP1 doesn’t exist due to the shortened RY. RY1, RP2 only includes two months (September-October 2016). 

[Table of Contents]

7     Resource Allocation Service (WBS 2.5) 

The Resource Allocation Service (RAS) is building on XSEDE’s current allocation processes and evolving to meet the challenges presented by new types of resources to be allocated via XSEDE, new computing and data modalities to support increasingly diverse research needs, and large-scale demands from the user community for limited XSEDE-allocated resources. RAS is pursuing these objectives through three activities: managing the XSEDE allocations process in coordination with the XD Service Providers, enhancing and maintaining the RAS infrastructure and services, and anticipating changing community needs.

Table 7-1: Area Metrics for Resource Allocation Service

Area Metric

Program Year

Target

RP1

RP2

RP3

RP4

Total

Goal Supported

Average user satisfaction rating with allocations and other support services

(KPI)

Goal Supported: Sustain – Provide excellent user support

RY 5: –
RY 4: –
RY 3: –
RY 2: Target 4 out of 5 / qtr | RP1: 4.08
RY 1: Target 4 out of 5 / qtr | RP1: * | RP2: 3.98 | RP3: 4.03 | RP4: 4.03 | Total: 4.01

 

Definition/Description: This is a composite metric that measures user satisfaction with allocation policies and procedures for the submission, review, and administration of allocation requests, as well as satisfaction with using XRAS, the primary tool for submitting, reviewing, and administering allocation requests.

Collection methodology: Users are surveyed each quarter. For the allocations process, the calculation is the weighted average of responses to a set of questions about satisfaction with various aspects of allocations (time to prepare documents, documentation, support available, reviewer comments, etc.). For satisfaction with using XRAS, submitters within the current quarter are sent a post-submission survey prior to award notification. Respondents are asked to rate aspects of the submission process on a scale of 1 (not at all satisfied) to 5 (extremely satisfied). Responses to several questions are then averaged for the current quarter to determine this metric. Specific items include: (Q11B) ease of use of the online submission system, (Q11D) interactive responsiveness of the online submission system, and (Q11H) ease of finding reviewer comments and/or the outcome of your allocation request.
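As a sketch, the per-quarter figure is the mean over the selected items' responses; the item ids follow the survey questions named above, and the ratings are illustrative:

```python
def average_satisfaction(responses):
    """Mean 1-5 rating across selected survey items and respondents.

    `responses` maps an item id (e.g. 'Q11B') to the list of ratings
    that item received during the quarter.
    """
    ratings = [r for item_ratings in responses.values() for r in item_ratings]
    return sum(ratings) / len(ratings)

quarter = {"Q11B": [4, 5, 4], "Q11D": [4, 4], "Q11H": [3, 4]}
print(average_satisfaction(quarter))  # 4.0
```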

Availability of XRAS

(KPI - composite)

Goal Supported: Sustain – Provide reliable, efficient, and secure infrastructure

RY 5: –
RY 4: –
RY 3: –
RY 2: Target 99.9% / qtr | RP1: 99.7%
RY 1: Target 99% / qtr | RP1: * | RP2: 99.9% | RP3: 99.8% | RP4: 99.9% | Total: 99.87%

 

Definition/Description: This is an availability metric that encompasses the A3M services associated with XRAS, including the XRAS Review GUI and the XRAS Admin GUI, the XRAS Submit API as well as the XDCDB.

Collection methodology: There is a table acct.downtime in the XDCDB, which is manually completed when there is an outage in the XDCDB, a network outage disrupting availability of the XRAS services, or an outage of the XSEDE User Portal. Currently, outages are detected via a variety of e-mail notifications, and the data is manually entered into the XDCDB.

 * NOTE: RY1 is unique and only spans September 2016-April 2017.  RY1 RP1 doesn’t exist due to the shortened RY. RY1, RP2 only includes two months (September-October 2016). 

7.1 RAS Director's Office (WBS 2.5.1)

The RAS Director’s Office has been established to provide the necessary oversight to ensure the greatest efficiency and effectiveness of the RAS area. This oversight includes providing direction to the L3 management team, coordinating and participating in RAS planning activities and reports through the area’s Project Manager, monitoring compliance with budgets, and retargeting effort if necessary. The Director’s Office also attends and supports the preparation of project-level reviews and activities. The RAS Director's Office also contributes to an analytics effort to support NSF, Service Providers, and XSEDE2 in understanding and projecting the stewardship of, demand for, and impact of CI resources and services.

The RAS Director’s Office will continue to manage and set direction for RAS activities and responsibilities. They will contribute to and attend bi-weekly senior management team calls, contribute to the project level plan, schedule, and budget, contribute to XSEDE IPR, annual, and other reports as required by the NSF and attend XSEDE quarterly and annual meetings. Lastly, the Director’s Office will advise the XSEDE PI on many issues, especially those relevant to this WBS area.

7.2 XSEDE Allocations Process & Policies (WBS 2.5.2) 

Allocations enable the national open science community to easily gain access to XSEDE’s advanced digital resources, allowing them to achieve their research and education goals. 

Table 7-2: Area Metrics for XSEDE Allocations Process & Policies

Area Metric

Program Year

Target

RP1

RP2

RP3

RP4

Total

Goal Supported

User satisfaction with allocation process

(KPI)

Goal Supported: Sustain – Provide excellent user support

RY 5: –
RY 4: –
RY 3: –
RY 2: Target 4 out of 5 / qtr | RP1: 4.14
RY 1: Target 4 out of 5 / qtr | RP1: * | RP2: 3.97 | RP3: 4.02 | RP4: 4.02 | Total: 4.00

 

Definition/Description: Measure of users’ satisfaction with allocation policies and procedures for the submission, review, and administration of allocation requests.

Collection methodology: Quarterly survey of persons who have submitted allocation requests in the prior quarter. The metric is the weighted average of responses to a set of questions about satisfaction with various aspects of allocations (time to prepare documents, documentation, support available, reviewer comments, etc.).

Average time to process Startup requests

Goal Supported: Sustain – Provide excellent user support

RY 5: –
RY 4: –
RY 3: –
RY 2: Target 14 calendar days or less / qtr | RP1: 11.6 days
RY 1: Target 14 calendar days or less / qtr | RP1: * | RP2: 10.65 days | RP3: 11 days | RP4: 10.4 days | Total: 10.7 days

 

Definition/Description: Average number of days it takes a Startup request to be processed and awards relayed to SPs (or a rejection to be finalized).

Collection methodology: The XRAS database schema is queried to average the days between time submitted and time notifications were sent for each Startup for which notifications are sent in a given quarter. XRAS automatically logs timestamps for each of these events as part of standard submission and processing steps.

Percentage of XRAC-recommended SUs allocated

Goal Supported: Sustain – Provide excellent user support

RY 5: –
RY 4: –
RY 3: –
RY 2: Target 100% / qtr | RP1: 60%
RY 1: Target 100% / qtr | RP1: * | RP2: 62% | RP3: 49% | RP4: 56% | Total: 56%

 

Definition/Description: The total SUs allocated (compute resources only) at each XRAC meeting divided by the total SUs recommended by the XRAC.

Collection methodology: Tallied from the meeting spreadsheet.

Percentage of Research requests successful (not rejected)1

Goal Supported: Sustain – Provide excellent user support

RY 5: –
RY 4: –
RY 3: –
RY 2: Target 85% / qtr | RP1: 70%
RY 1: Target 85% / qtr | RP1: * | RP2: 76% | RP3: 75% | RP4: 74% | Total: 75%

 

Definition/Description: To calculate the metric we will divide the number of Research requests receiving some non-zero award from an XRAC meeting by the number of Research requests submitted to the same XRAC meeting. Successful is defined as receiving some non-zero award from the XRAC meeting.

Collection methodology: Request data comes from the Quarterly XRAC meetings and is stored in XRAS as well as some meeting spreadsheets.

 * NOTE: RY1 is unique and only spans September 2016-April 2017.  RY1 RP1 doesn’t exist due to the shortened RY. RY1, RP2 only includes two months (September-October 2016). 

1 New metric. Data retroactively entered for all of PY6. 

7.3 Allocations, Accounting & Account Management CI (WBS 2.5.3) 

The Allocations, Accounting & Account Management CI (A3M) group maintains and improves the interfaces, databases and data transfer mechanisms for XSEDE-wide resource allocations, accounting of resource usage, and user account management.

Table 7-3: Area Metrics for Allocations, Accounting, & Account Management CI

Area Metric

Program Year

Target

RP1

RP2

RP3

RP4

Total

Goal Supported

User satisfaction with XRAS system

(KPI)

Goal Supported: Sustain – Provide excellent user support

RY 5: –
RY 4: –
RY 3: –
RY 2: Target 4 out of 5 / qtr | RP1: 4.03
RY 1: Target 4 out of 5 / qtr | RP1: * | RP2: 3.98 | RP3: 4.03 | RP4: 4.03 | Total: 4.01

 

Definition/Description: Measure of users’ satisfaction with using XRAS, the primary tool for submission, review, and administration of allocation requests. 

Collection methodology: XRAS submitters within the current quarter are sent a post-submission survey prior to award notification. Respondents are asked to rate aspects of the submission process on a scale of 1 (not at all satisfied) to 5 (extremely satisfied). Responses to several questions are then averaged for the current quarter to determine this metric. Specific items include: (Q11B) ease of use of the online submission system, (Q11D) interactive responsiveness of the online submission system, and (Q11H) ease of finding reviewer comments and/or the outcome of your allocation request.

Availability of XRAS systems

(KPI - composite)

Goal Supported: Sustain – Provide reliable, efficient, and secure infrastructure

RY 5: –
RY 4: –
RY 3: –
RY 2: Target 99.9% / qtr | RP1: 99.7%
RY 1: Target 99% / qtr | RP1: * | RP2: 99.9% | RP3: 99.8% | RP4: 99.9% | Total: 99.87%

 

Definition/Description: This is an availability metric that encompasses the A3M services associated with XRAS, including the XRAS Review GUI and the XRAS Admin GUI, the XRAS Submit API as well as the XDCDB.

Collection methodology: There is a table acct.downtime in the XDCDB, which is manually completed when there is an outage in the XDCDB, a network outage disrupting availability of the XRAS services, or an outage of the XSEDE User Portal. Currently, outages are detected via a variety of e-mail notifications, and the data is manually entered into the XDCDB.

Percentage of JIRA tasks assigned to sprints that are completed1

Goal Supported: Sustain – Provide reliable, efficient, and secure infrastructure

RY 5: –
RY 4: –
RY 3: –
RY 2: Target 95% / qtr | RP1: 98.5% (66/67)
RY 1: Target 95% / qtr1 | RP1: * | RP2: 100% (7/7) | RP3: 90% (63/70) | RP4: 98% (47/48) | Total: 96%

 

Definition/Description: A3M development is driven by stakeholder input, either through direct requests or discussions with stakeholder groups. This input, along with developer and management institutional knowledge, is evaluated, scoped, and prioritized quarterly for inclusion in the A3M systems. The process uses JIRA for task management, assigning tasks to a sprint based on prioritization. This metric measures the percentage of JIRA tasks assigned to sprints that are completed during a quarter.

Collection methodology: A3M feature development numbers are manually entered into a wiki page, which serves as the repository for that data. Feature and release numbers include prioritization, workload estimation, status, and number of approved requests implemented.

 * NOTE: RY1 is unique and only spans September 2016-April 2017.  RY1 RP1 doesn’t exist due to the shortened RY. RY1, RP2 only includes two months (September-October 2016). 

1 Prior to RY1 RP4, this metric was titled "Percentage of approved feature change requests implemented." The title was changed as a more intuitive summary of the RAS and A3M team's success and improvements in estimating, planning, and completing work. The target was also lowered from 100% to 95% as a reflection of the team's potential inability to complete tasks that are assigned at the very end of the quarter but can not be completed within that reporting period. 

[Table of Contents]

8     Program Office (WBS 2.6) 

The purpose of the Program Office is to ensure critical project-level functions are in place and operating effectively and efficiently. The oversight provided via the Program Office is necessary to provide consistent guidance and leadership to the L3 managers across the project. A common and consistent approach to managing projects and risks is provided by the Project Management, Reporting and Risk Management (PM&R) team, while Business Operations manages all financial functions and sub-awards. The crucial aspect of communications to all stakeholders is the focus of the External Relations team. Finally, Strategy, Planning, Policy, Evaluation & Organizational Improvement focuses attention in precisely those areas to ensure the best possible structure continues to exist within XSEDE to support all significant project activities and enable efficient and effective performance of all project responsibilities.

Table 8-1: Area Metrics for Program Office.

Area Metric

Program Year

Target

RP1

RP2

RP3

RP4

Total

Goal Supported

Number of Social Media impressions

Goal Supported: Deepen/Extend – Raise awareness of the value of advanced digital services

RY 5: –
RY 4: –
RY 3: –
RY 2: Target 300,000 | RP1: 69,607
RY 1: Target 190,000 | RP1: * | RP2: 52,200 | RP3: 128,675 | RP4: 81,332 | Total: 262,207

 

Definition/Description: Number of people who have seen XSEDE interactions on Facebook and Twitter. Both platforms count “impressions,” which is essentially how many people saw a given post. A user no longer has to directly “like” or “retweet” a post to be counted; “impressions” also include people who see the post because their friends or followers shared our information. It is a better measure of social media awareness than “followers” or “shares” alone.

Collection methodology: Facebook and Twitter both have internal methods of tracking metrics; we simply go to those sites and find the “impressions” numbers we need at the end of each reporting cycle. Anyone with access to the Facebook and Twitter pages can easily gather this information.

Number of media hits

Goal Supported: Deepen/Extend – Raise awareness of the value of advanced digital services

RY 5: –
RY 4: –
RY 3: –
RY 2: Target 169 / yr | RP1: 42
RY 1: Target 147 / yr | RP1: * | RP2: 32 | RP3: 30 | RP4: 18 | Total: 80

 

Definition/Description: This is the number of XSEDE-related stories we find in the media that mention XSEDE by name, most often via Google Alerts for “XSEDE.” We also search for “XSEDE” manually in case the daily alert email is missed.

Collection methodology: NCSA tracks media hits for both NCSA and XSEDE, so that is a baseline collection of numbers. The ER team can then do additional manual searching - as an example, oftentimes the hits of HPCwire are not collected on the NCSA page, so an additional manual count is needed.

Percentage of recommendations addressed by relevant project areas (KPI)

Goal Supported: Sustain – Operate an effective and productive virtual organization

Program Year | Target   | RP1 | RP2 | RP3  | RP4 | Total
RY 5         |          |     |     |      |     |
RY 4         |          |     |     |      |     |
RY 3         |          |     |     |      |     |
RY 2         | 90% / yr | 23% |     |      |     |
RY 1         | 90% / yr | *   | NA¹ | 100% | 57% | 67%

Definition/Description: The SP&E team calculates this metric from two measurements: total key recommendations made and total key recommendations addressed. The total addressed is divided by the total made at the point of data capture (the end of the IPR reporting period).

Collection methodology: The number of key recommendations made according to the annual XSEDE Climate Study findings + the total number of recommendations recorded on the XSEDE Project-Wide Improvements & Recommendations Google Sheet, such as XAB, NSF, and SP&E recommendations. XSEDE staff, not the SP&E team, are accountable for entering information to track improvements and recommendations.

NOTE: An improvement made is not necessarily a recommendation addressed; but a recommendation addressed is an improvement made and therefore should also be tracked in the XSEDE Project-Wide Improvements & Recommendations Google Sheet once fully implemented; it is OK to have duplication between Improvements fully implemented and Recommendations addressed.
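As an illustration only (the function below is a hypothetical sketch, not part of XSEDE's actual tooling), the KPI reduces to a guarded percentage calculation:

```python
def pct_recommendations_addressed(addressed: int, made: int) -> float:
    """Percentage of key recommendations addressed at the point of data capture.

    `addressed` and `made` are the two totals described above; returns 0.0
    when no recommendations have been made, to avoid division by zero.
    """
    if made == 0:
        return 0.0
    return round(100 * addressed / made, 1)
```

For example, 8 recommendations addressed out of 12 made would report as 66.7%.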

Number of key improvements addressed from systematic evaluation² (KPI)

Goal Supported: Sustain – Operate an effective and productive virtual organization

Program Year | Target  | RP1 | RP2 | RP3 | RP4 | Total
RY 5         |         |     |     |     |     |
RY 4         |         |     |     |     |     |
RY 3         |         |     |     |     |     |
RY 2         | 10 / yr | NA¹ |     |     |     |
RY 1         | 10 / yr | *   | –   | –   | 19  | 19

Definition/Description: It is essential for teams to include improvements implemented based upon responses to formal survey results such as the Staff Climate survey. The following sources will be used as recommendation data:
- Ticket Survey
- Climate Survey
- User Survey
- XCI Survey
- Micro Surveys
- XRAS post-allocation survey

Collection methodology: Number of improvements addressed during the quarter and logged on the XSEDE Project-Wide Improvements & Recommendations Google Sheet. There may be future development in JIRA for tracking.

NOTE: An improvement made is not necessarily a recommendation addressed; but a recommendation addressed is an improvement made and therefore should also be tracked in the XSEDE Project-Wide Improvements & Recommendations Google Sheet once fully implemented; it is OK to have duplication between Improvements fully implemented and Recommendations addressed.

Number of key improvements addressed from external sources² (KPI)

Goal Supported: Sustain – Operate an effective and productive virtual organization

Program Year | Target  | RP1 | RP2 | RP3 | RP4 | Total
RY 5         |         |     |     |     |     |
RY 4         |         |     |     |     |     |
RY 3         |         |     |     |     |     |
RY 2         | 10 / yr |     |     |     |     |
RY 1         | 10 / yr | *   | –   | –   | 11  | 11

Definition/Description: It is essential for teams to include improvements implemented based upon responses to feedback from such sources as XSEDE advisory bodies. The following will be used as recommendation data:
- XAB
- UAC
- SP Forum
- NSF

Collection methodology: Number of improvements addressed during the quarter and logged on the XSEDE Project-Wide Improvements & Recommendations Google Sheet. There may be future development in JIRA for tracking.

NOTE: An improvement made is not necessarily a recommendation addressed; but a recommendation addressed is an improvement made and therefore should also be tracked in the XSEDE Project-Wide Improvements & Recommendations Google Sheet once fully implemented; it is OK to have duplication between Improvements fully implemented and Recommendations addressed.

Ratio of proactive to reactive improvements (KPI)

Goal Supported: Sustain – Operate an innovative virtual organization

Program Year | Target    | RP1 | RP2  | RP3 | RP4 | Total
RY 5         |           |     |      |     |     |
RY 4         |           |     |      |     |     |
RY 3         |           |     |      |     |     |
RY 2         | 4:1 / qtr | 1:1 |      |     |     |
RY 1         | 3:1 / qtr | *   | 2:1³ | 7:1 | 8:9 | 17:11

Definition/Description: Ratio, of the above number of strategic or innovative improvements, of proactive versus reactive improvements.

Collection methodology: Wiki page.

Number of staff publications (KPI)

Goal Supported: Sustain – Operate an innovative virtual organization

Program Year | Target  | RP1 | RP2 | RP3 | RP4 | Total
RY 5         |         |     |     |     |     |
RY 4         |         |     |     |     |     |
RY 3         |         |     |     |     |     |
RY 2         | 20 / yr | 2   |     |     |     |
RY 1         | 70 / yr | *   | 5   | 0   | 13  | 18

Definition/Description:

Collection methodology: When an XSEDE staff member has a publication, he or she enters the publication data into their profile on the XUP. This data is then summed for the appropriate reporting period. 

* NOTE: RY1 is unique and only spans September 2016–April 2017. RY1 RP1 doesn't exist due to the shortened RY; RY1 RP2 only includes two months (September–October 2016).

¹ L2 Directors are currently responding to climate study recommendations; data will be available in RY2 RP2.

² This metric is one of two new metrics that resulted from splitting a former metric, “Number of strategic or innovative improvements,” into two. The former metric had a target of 9/yr, with results of 3 for RP2 and 8 for RP3 in RY1. The two new metrics each have a target of 10/yr and will be reported annually.

³ This metric was reported incorrectly in IPR1. It has been corrected.

8.1 Project Office (WBS 2.6.1) 

The Project Office has been established to provide the oversight needed to ensure the efficiency and effectiveness of the Program Office area and to take responsibility for ensuring that the project's advisory activities occur. This oversight includes directing the L3 management team and coordinating and participating in Program Office planning activities and reports through the area's Project Manager. The Project Office also attends and supports the preparation of project-level reviews and activities. Importantly, the Project Office is responsible for ensuring that the XSEDE Advisory Board, the User Advisory Committee, and the SP Forum are functioning, and for coordinating project-level meetings such as the bi-weekly Senior Management Team (SMT) teleconferences and the project quarterly meetings. Lastly, the Project Office advises the XSEDE PI on many issues, especially those relevant to this WBS area.

8.2  External Relations (WBS 2.6.2) 

External Relations’ (ER) mission is to communicate the value and importance of XSEDE to all stakeholders (including the internal audience) through creative and strategic communications. 

Table 8-2: Area Metrics for External Relations

Number of Social Media Impressions

Goal Supported: Deepen/Extend – Raise awareness of the value of advanced digital services

Program Year | Target       | RP1    | RP2    | RP3     | RP4    | Total
RY 5         |              |        |        |         |        |
RY 4         |              |        |        |         |        |
RY 3         |              |        |        |         |        |
RY 2         | 300,000 / yr | 69,607 |        |         |        |
RY 1         | 190,000 / yr | *      | 52,200 | 128,675 | 81,332 | 262,207

Definition/Description: The number of people who have seen XSEDE posts on Facebook and Twitter. Both platforms count “impressions,” i.e., how many people saw a given post. A user no longer has to “like” or “retweet” a post to be counted: impressions include people who see a post because their friends or followers shared it. This makes impressions a better measure of social media reach than “followers” or “shares” alone.

Collection methodology: Facebook and Twitter both track these metrics internally; at the end of each reporting cycle we retrieve the “impressions” numbers from those sites. Anyone with access to the Facebook and Twitter pages can gather this information.

Number of media hits

Goal Supported: Deepen/Extend – Raise awareness of the value of advanced digital services

Program Year | Target   | RP1 | RP2 | RP3 | RP4 | Total
RY 5         |          |     |     |     |     |
RY 4         |          |     |     |     |     |
RY 3         |          |     |     |     |     |
RY 2         | 169 / yr | 42  |     |     |     |
RY 1         | 147 / yr | *   | 32  | 30  | 18  | 80

Definition/Description: The number of XSEDE-related stories found in the media that mention XSEDE by name, typically located through Google Alerts for “XSEDE.” We also search for “XSEDE” manually in case a daily alert email is missed.

Collection methodology: NCSA tracks media hits for both NCSA and XSEDE, which provides a baseline count. The ER team then does additional manual searching; for example, hits from HPCwire are often not captured on the NCSA page and require an additional manual count.

Number of science success stories and announcements appearing in media outlets

Goal Supported: Deepen/Extend – Raise awareness of the value of advanced digital services

Program Year | Target  | RP1 | RP2 | RP3 | RP4 | Total
RY 5         |         |     |     |     |     |
RY 4         |         |     |     |     |     |
RY 3         |         |     |     |     |     |
RY 2         | 71 / yr | 18  |     |     |     |
RY 1         | 62 / yr | *   | 14  | 6   | 14  | 34

Definition/Description: The number of XSEDE science success stories and announcements that appear in media outlets.

Monthly open and click-through rates of XSEDE's newsletter

Goal Supported: Deepen/Extend – Raise awareness of the value of advanced digital services

Program Year | Target | RP1 | RP2 | RP3 | RP4 | Total
RY 5 | | | | | |
RY 4 | | | | | |
RY 3 | | | | | |
RY 2 | Open: 34% / qtr; Click-through: 12% / qtr | Open: 34%; Click-through: 21% | | | |
RY 1 | Open: 35% / qtr; Click-through: 20% / qtr | * | Open: 34.4%; Click-through: 10.9% | Open: 35.7%; Click-through: 10.3% | Open: 35%; Click-through: 7% | Open: 35.03%; Click-through: 9.4%

 

Definition/Description: Email newsletter open rates measure the community's engagement with ER outreach via email. Click-through rates help us understand which stories are of most interest to our readers.

Collection methodology: NCSA runs engagement reports through our email management system. These reports capture how many readers open a given email and how many click through embedded links to learn more or act on a prompt.

 * NOTE: RY1 is unique and only spans September 2016-April 2017.  RY1 RP1 doesn’t exist due to the shortened RY. RY1, RP2 only includes two months (September-October 2016). 

8.3 Project Management, Reporting, & Risk Management (WBS 2.6.3) 

The Project Management, Reporting & Risk Management (PM&R) team enables an effective virtual organization through the application of project management principles; provides visibility into project progress, successes, and challenges; brings new ideas and management practices into the project; and disseminates lessons learned in XSEDE to other virtual organizations. Communication is critical to success in this highly distributed virtual organization.

Table 8-3: Area Metrics for Project Management, Reporting, & Risk Management

Variance, in days, between relevant report submission and due date

Goal Supported: Sustain – Operate an effective and productive virtual organization

Program Year | Target  | RP1 | RP2 | RP3 | RP4 | Total
RY 5         |         |     |     |     |     |
RY 4         |         |     |     |     |     |
RY 3         |         |     |     |     |     |
RY 2         | 0 / qtr | 0   |     |     |     |
RY 1         | 0 / qtr | *   | NA¹ | 0   | 0   | –

 

Definition/Description: As part of its reporting process, XSEDE must submit various quarterly and annual reports to the NSF. The XSEDE PI and the NSF Program Officer agree on a date by which reports are to be submitted; this date is then communicated to the XSEDE team so the reporting schedule can be built.

Collection methodology: The XSEDE PI tracks when the relevant document was submitted to FastLane and records the number of days between the due date and this date in a spreadsheet.
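The variance itself is simple calendar arithmetic. A minimal sketch (the function name is illustrative, not part of any XSEDE tool):

```python
from datetime import date

def submission_variance_days(submitted: date, due: date) -> int:
    """Days between actual submission and the agreed due date.

    0 means on time; a negative value means early; positive means late.
    """
    return (submitted - due).days
```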

Percentage of risks reviewed

Goal Supported: Sustain – Operate an effective and productive virtual organization

Program Year | Target     | RP1  | RP2  | RP3 | RP4 | Total
RY 5         |            |      |      |     |     |
RY 4         |            |      |      |     |     |
RY 3         |            |      |      |     |     |
RY 2         | 100% / qtr | 100% |      |     |     |
RY 1         | 95% / qtr  | *    | 100% | 97% | 0%  |

 

Definition/Description: Risk management is key to the success of XSEDE. By identifying risks, their triggers, mitigation strategies, and contingency plans, XSEDE can proactively manage risks, mitigating those that may adversely impact the organization and taking advantage of those that may help XSEDE.

Collection methodology: All XSEDE risk owners are queried quarterly via e-mail with their risks attached in a spreadsheet. Each owner is asked to review their risks, note any changes in the spreadsheet, and email the spreadsheet back to the PM lead. Each risk in the spreadsheet requires a notation that it was reviewed. The PM lead then records any changes and updates the risk register accordingly. The percentage is calculated by dividing the total number of acknowledged reviews by the total number of risks project-wide.
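A hypothetical sketch of that division, assuming each risk row from the spreadsheet carries a boolean `reviewed` notation (the field and function names are illustrative):

```python
def pct_risks_reviewed(risks: list[dict]) -> float:
    """Percentage of risks with an acknowledged review this quarter."""
    if not risks:
        return 0.0
    # Count rows whose 'reviewed' notation is set, then divide by all risks.
    reviewed = sum(1 for risk in risks if risk.get("reviewed"))
    return round(100 * reviewed / len(risks), 1)
```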

Average number of days to execute PCR

Goal Supported: Sustain – Operate an effective and productive virtual organization

Program Year | Target                   | RP1  | RP2 | RP3 | RP4  | Total
RY 5         |                          |      |     |     |      |
RY 4         |                          |      |     |     |      |
RY 3         |                          |      |     |     |      |
RY 2         | < 30 calendar days / qtr | 18.5 |     |     |      |
RY 1         | < 30 calendar days / qtr | *    | 4   | 20  | 14.3 | –

 

Definition/Description: A Project Change Request (PCR) form documents requests for any baseline change. The PCR process documents project changes that affect components such as scope, budget, staffing, KPIs/Area Metrics, and/or schedule. Execution is defined as the period from the time all approvals have been received to the time the PCR is fully implemented.

Collection methodology: JIRA will be used to track the different stages of the PCR, and these numbers will be reported.

 * NOTE: RY1 is unique and only spans September 2016-April 2017.  RY1 RP1 doesn’t exist due to the shortened RY. RY1, RP2 only includes two months (September-October 2016). 

¹ There are no relevant report submissions in RY1 RP2. RPs are submitted in the quarter following the reporting period, and thus this variance will be reported in the subsequent quarter.

8.4 Business Operations (WBS 2.6.4) 

The Business Operations group, working closely with staff at the University of Illinois’ Grants and Contracts Office (GCO) and National Center for Supercomputing Applications’ (NCSA) Business Office, manages budgetary issues and sub-awards, and ensures timely processing of sub-award amendments and invoices. 

Table 8-4: Area Metrics for Business Operations

Percentage of sub-award amendments processed within target duration²

Goal Supported: Sustain – Operate an effective and productive virtual organization

Program Year | Target                                      | RP1  | RP2 | RP3  | RP4 | Total
RY 5         |                                             |      |     |      |     |
RY 4         |                                             |      |     |      |     |
RY 3         |                                             |      |     |      |     |
RY 2         | 95% processed within 40 calendar days / qtr | 100% |     |      |     |
RY 1         | 90% processed within 40 calendar days / qtr | *    | NA¹ | 100% | –   | 100%

 

Definition/Description: Process is defined as the number of calendar days from PCR approval to amendment signed by partner, excluding the time spent awaiting partner action (e.g., providing supporting documents, review and approval of amendment). Percentage is calculated as # of amendments processed within the target duration / total amendments processed.

Collection methodology: Data is logged and stored in the Business Operations master spreadsheet located on the Business Operations Google Drive folder.
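A sketch of the percentage calculation, assuming the per-amendment processing durations (calendar days, already excluding time awaiting partner action) have been pulled from the master spreadsheet; the 40-day default matches the targets above, and the function name is illustrative:

```python
def pct_within_target(processing_days: list[int], target_days: int = 40) -> float:
    """Share of sub-award amendments processed within the target duration."""
    if not processing_days:
        return 0.0
    # Amendments processed within target / total amendments processed.
    within = sum(1 for days in processing_days if days <= target_days)
    return round(100 * within / len(processing_days), 1)
```

The same helper covers the invoice metric below by passing `target_days=42`.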

Percentage of sub-award invoices processed within target duration²

Goal Supported: Sustain – Operate an effective and productive virtual organization

Program Year | Target                                      | RP1  | RP2 | RP3  | RP4 | Total
RY 5         |                                             |      |     |      |     |
RY 4         |                                             |      |     |      |     |
RY 3         |                                             |      |     |      |     |
RY 2         | 95% processed within 42 calendar days / qtr | 100% |     |      |     |
RY 1         | 90% processed within 42 calendar days / qtr | *    | NA¹ | 100% | 96% | 98%

 

Definition/Description: Process is defined as the number of calendar days from acceptance of submitted invoice to invoice paid. Percentage is calculated as # of invoices processed within the target duration / total invoices processed.

Collection methodology: Data is logged and stored in the Business Operations master spreadsheet located on the Business Operations Google Drive folder.

 * NOTE: RY1 is unique and only spans September 2016-April 2017.  RY1 RP1 doesn’t exist due to the shortened RY. RY1, RP2 only includes two months (September-October 2016). 

¹ Sub-award institutions do not yet have XSEDE2 contracts in place, so no amendments or invoices can be charged against the grant yet.

² Metrics were changed from the number of days to process sub-award amendments/invoices to the percentage processed within a range of days. Target duration ranges of < 41 and < 45 calendar days, respectively, were also modified. This change took effect in RY1 RP3.

8.5 Strategic Planning, Policy & Evaluation (WBS 2.6.5) 

XSEDE dedicates effort to project-wide strategic planning, policy development, evaluation and assessment, and organizational improvement in support of sustaining an effective and productive virtual organization. 

XSEDE has engaged an independent Evaluation Team designed to provide XSEDE with information to guide program improvement and assess the impact of XSEDE services. Evaluations are based on five primary data sources: (1) an Annual User Survey that will be part of the XSEDE annual report and program plan; (2) an Enhanced Longitudinal Study encompassing additional target groups (e.g., faculty, institutions, disciplines, etc.) and additional measures (e.g., publications, citations, research funding, promotion and tenure, etc.); (3) an Annual XSEDE Staff Climate Study; (4) XSEDE KPIs, Area Metrics, and Organizational Improvement efforts, including ensuring that procedures are in place to assess these data; and (5) Specialized Studies as contracted by Level 2 directors and the Program Office. 

Table 8-5: Area Metrics for Strategic Planning, Policy & Evaluation

Percentage of recommendations addressed by relevant project areas (KPI)

Goal Supported: Sustain – Operate an effective and productive virtual organization

Program Year | Target    | RP1 | RP2 | RP3  | RP4 | Total
RY 5         |           |     |     |      |     |
RY 4         |           |     |     |      |     |
RY 3         |           |     |     |      |     |
RY 2         | 90% / qtr | 23% |     |      |     |
RY 1         | 90% / qtr | *   | NA¹ | 100% | 57% | 67%

 

Definition/Description: The SP&E team calculates this metric from two measurements: total key recommendations made and total key recommendations addressed. The total addressed is divided by the total made at the point of data capture (the end of the IPR reporting period).

Collection methodology: The number of key recommendations made according to the annual XSEDE Climate Study findings + the total number of recommendations recorded on the XSEDE Project-Wide Improvements & Recommendations Google Sheet, such as XAB, NSF, and SP&E recommendations. XSEDE staff, not the SP&E team, are accountable for entering information to track improvements and recommendations.

NOTE: An improvement made is not necessarily a recommendation addressed; but a recommendation addressed is an improvement made and therefore should also be tracked in the XSEDE Project-Wide Improvements & Recommendations Google Sheet once fully implemented; it is OK to have duplication between Improvements fully implemented and Recommendations addressed.

Average rating of staff regarding how well-prepared they feel to perform their jobs (KPI)

Goal Supported: Advance – Enhance the array of technical expertise and support services

Program Year | Target          | RP1 | RP2 | RP3 | RP4 | Total
RY 5         |                 |     |     |     |     |
RY 4         |                 |     |     |     |     |
RY 3         |                 |     |     |     |     |
RY 2         | 4 out of 5 / yr | NA¹ |     |     |     |
RY 1         | 4 out of 5 / yr | *   | –^  | –   | –   | –

 

Definition/Description: The annual Staff Climate Study is a survey administered and analyzed by an external evaluation team to understand whether XSEDE staff feel adequately prepared to conduct their XSEDE work. The survey includes items regarding staff training.

Collection methodology: This is being revised and will be updated once the change is implemented.

Number of key improvements addressed from systematic evaluation² (KPI)

Goal Supported: Sustain – Operate an effective and productive virtual organization

Program Year | Target  | RP1 | RP2 | RP3 | RP4 | Total
RY 5         |         |     |     |     |     |
RY 4         |         |     |     |     |     |
RY 3         |         |     |     |     |     |
RY 2         | 10 / yr | NA¹ |     |     |     |
RY 1         | 10 / yr | *   | –   | –   | 19  | 19

 

Definition/Description: It is essential for teams to include improvements implemented based upon responses to formal survey results such as the Staff Climate survey. The following sources will be used as recommendation data:
- Ticket Survey
- Climate Survey
- User Survey
- XCI Survey
- Micro Surveys
- XRAS post-allocation survey

Collection methodology: Number of improvements addressed during the quarter and logged on the XSEDE Project-Wide Improvements & Recommendations Google Sheet. There may be future development in JIRA for tracking.

NOTE: An improvement made is not necessarily a recommendation addressed; but a recommendation addressed is an improvement made and therefore should also be tracked in the XSEDE Project-Wide Improvements & Recommendations Google Sheet once fully implemented; it is OK to have duplication between Improvements fully implemented and Recommendations addressed.

Number of key improvements addressed from external sources² (KPI)

Goal Supported: Sustain – Operate an effective and productive virtual organization

Program Year | Target  | RP1 | RP2 | RP3 | RP4 | Total
RY 5         |         |     |     |     |     |
RY 4         |         |     |     |     |     |
RY 3         |         |     |     |     |     |
RY 2         | 10 / yr | 2   |     |     |     |
RY 1         | 10 / yr | *   | –   | –   | 11  | 11

 

Definition/Description: It is essential for teams to include improvements implemented based upon responses to feedback from such sources as XSEDE advisory bodies. The following will be used as recommendation data:
- XAB
- UAC
- SP Forum
- NSF

 

Collection methodology: Number of improvements addressed during the quarter and logged on the XSEDE Project-Wide Improvements & Recommendations Google Sheet. There may be future development in JIRA for tracking.

NOTE: An improvement made is not necessarily a recommendation addressed; but a recommendation addressed is an improvement made and therefore should also be tracked in the XSEDE Project-Wide Improvements & Recommendations Google Sheet once fully implemented; it is OK to have duplication between Improvements fully implemented and Recommendations addressed.

Ratio of proactive to reactive improvements (KPI)

Goal Supported: Sustain – Operate an effective and productive virtual organization

Program Year | Target   | RP1 | RP2  | RP3 | RP4 | Total
RY 5         |          |     |      |     |     |
RY 4         |          |     |      |     |     |
RY 3         |          |     |      |     |     |
RY 2         | 4:1 / yr | 1:1 |      |     |     |
RY 1         | 3:1 / yr | *   | 8:1³ | 7:1 | 8:9 | 17:11

 

Definition/Description: The ratio of proactive to reactive improvements is a measure of organizational maturity and agility. An improvement is classified as reactive if the change was made because a problem occurred and had to be corrected for the process to continue. A proactive improvement is one made in anticipation of future problems, needs, or changes.

Collection methodology: Process improvements are tracked via self-reporting: each quarter, all WBS Level 3 managers and WBS Level 2 Directors are queried via e-mail and asked to report any process improvements they have implemented in the past quarter. This information is collected in a Google spreadsheet and totaled each quarter. Each reported improvement is then classified as proactive or reactive based on the definitions above.

NOTE: An improvement made is not necessarily a recommendation addressed; but a recommendation addressed is an improvement made and therefore should also be tracked in the XSEDE Project-Wide Improvements & Recommendations Google Sheet once fully implemented; it is OK to have duplication between Improvements fully implemented and Recommendations addressed.
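As a sketch, once the quarterly proactive and reactive counts are tallied from the spreadsheet, the reported ratio is just the two counts reduced by their greatest common divisor (the function name is illustrative, not part of XSEDE tooling):

```python
from math import gcd

def improvement_ratio(proactive: int, reactive: int) -> str:
    """Reduce quarterly improvement counts to a ratio string like '4:1'."""
    divisor = gcd(proactive, reactive) or 1  # guard against the 0:0 case
    return f"{proactive // divisor}:{reactive // divisor}"
```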

 * NOTE: RY1 is unique and only spans September 2016-April 2017.  RY1 RP1 doesn’t exist due to the shortened RY. RY1, RP2 only includes two months (September-October 2016). 

– Data reported annually.

^ The number 3.70 was erroneously reported in RY1 RP2.

¹ L2 Directors are currently responding to climate study recommendations; data will be available in RY1 RP3.

² This metric is one of two new metrics that resulted from splitting a former metric, “Number of strategic or innovative improvements,” into two. The former metric had a target of 9/yr, with results of 3 for RP2 and 8 for RP3 in RY1. The two new metrics each have a target of 10/yr and will be reported annually.

³ This metric was reported incorrectly in IPR1. It has been corrected.

 
