

Action Items:

  • Update story rubric (QMA-353): Note questions and include in rubric for stories. Everything here is marked as preferred in the current rubric; need to make it required. (Hannah Remmert)
  • Interview ECSS PI (QMA-354): Bob had a PI volunteer info about an ECSS person teaching them about containers, etc. Good person for Hannah to talk to. (Hannah Remmert)
  • Update post-interview survey (QMA-355): Consider adding some of these questions to the survey. (Lorna Rivera)
  • PI surveys (QMA-356): Consider including a question in PI exit interviews re. how working with XSEDE impacted/changed their research. (Robert Sinkovits)
  • Citation analysis (QMA-357): Talk to Gregor about including citation analysis in reports/quarterly updates. (Ron Payne)
  • Enhanced videos (QMA-358): Consider enhanced videos with questions about how XSEDE has changed their research. (Hannah Remmert)
  • Final report (QMA-359): Provide more structure to the final report. Consider providing structure and asking PIs to address specific questions so these can be analyzed more easily. (David Hart)


Notes / Discussion items:

  • Per REC-21 (June 2017 NSF Panel Review): As an interim measure, the project could consider enhancing their interviews of scientists for impact videos with questions that reflect how the process of working with XSEDE2 has changed 1) how they see themselves, 2) how others in their field see them, and 3) how their work with XSEDE2 has changed their teaching or research. This is consistent with one definition of learning as a transformational process. Although the management team feels that there is a gap between anecdotes and KPIs, when those anecdotes are developed into comprehensive stories, they become case studies. The impact stories and impact videos, coupled with overall statistics such as number of regular and gateway users, billions of dollars of research supported, and millions of dollars directly saved by other research projects, should be sufficient as indicators of impact in lieu of standardized, widely-applied impact analytics.
    • Came up while talking about ECSS interviews.
    • PIs that get allocations–how has award/allocation impacted your research?
    • Broader than just ECSS
    • They saw an example of a video at the review and were impressed with the campaign, but thought this would be something additional to get out of it. We need to decide what we want to get out of it.
      • Surveying PIs who may be able to articulate impact better
      • As we do success stories, alter the rubric so we ask the question in a different way.
      • Show common thread through interviews/videos we create
      • Make sure all have statement from scientist that says why this was important to their research
      • Nancy & Ralph worked together to be sure they asked the same questions the same way. Concern that Nancy & Ralph doing this in person could induce bias.  
      • Lorna & Lizanne send a short two-question survey (covering the past ~6 months)
    • Additional questions that could be wrapped into interview/survey
    • Additional metrics of impact suggested are things we dance around. 
    • The panel is telling us we don't have to have standardized impact analytics because these other measures achieve that.
  • More videos
  • More questions in post interview survey
  • More surveying of PIs when allocation expires
  • Alterations to rubrics for stories/videos
  • We already collect some of the suggested numbers. 
  • Put on the producer hat: add these questions and the videos become more impactful.
  • Don't want everyone to look the same
  • The things mentioned are short term. There is a problem quantifying something that doesn't happen.
  • Longitudinal study of researchers who have had XSEDE support vs. those who haven't. 
  • The Canadians track researcher career paths; we were never sure how to do that.
  • The review doesn't mention publication metrics, which are also impact metrics.
  • Willing to consider funding for Gregor's team to understand impact of publications. Top down metrics–project wide measures that don't naturally bubble up through WBS area. What else beyond publication analysis would make sense? 
    • User numbers
    • Research supported
    • Citation analysis is important
      • include in reports/quarterly updates on that metric & be able to track over time. 
    • Make it a funding requirement so we know it gets done.
    • Other things we should think of?
    • Questions in annual survey
      • How essential XSEDE is to your work–self reported impact
    • We can get these statistics if we ask; they aren't regularly provided.
    • Don't feel that we highlight this. 

  • The key thing is the interim measure: enhanced videos with questions about how XSEDE has changed their research.
  • "in lieu of"–recognition that there aren't standardized methods. You're giving statistical data. Ask them also how their work with XSEDE has changed them. That combination is sufficient. 
  • Survey every PI as they finish allocation & ask these kinds of questions to get at impact when they finish. 
  • Every allocation is required to submit a final report (if they don't submit a renewal). These are free form. Lorna & Sergio look at them to assess what they're telling us. We have about 300.
    • Good to analyze.  Submitted abstract to PEARC19. 
    • Consider providing structure & ask to address specific questions.
    • Looking at startups & why they don't transition 
  • We'd like to claim credit for technology transfer. Startups learn how to do things from working with us that they then apply to their own systems. Maybe reflected in exit interviews: what they learned and how they're applying it, even if not with XSEDE.
  • Success with LIGO: we provided a 4-5x speedup, and based on what they learned from us, they took that and achieved a 10x speedup. The story only applies to a subset of LIGO research. We'd like to see more stories like this.
  • Everything here is marked as preferred in current rubric–need to make required.
  • Survey of PIs: pull user survey questions or something similar around impact.
  • ECSS interviews–don't ask directly. There is survey about impact. Can we ask these questions? 
    • PI exit interview doesn't capture this. 
    • Better to capture in survey than PI exit interview. More users than PIs. Or capture in multiple places. 
    • PI interviews are focused on the ECSS project, not use of the allocation. Need to differentiate the two.
    • PI exit interviews–great to capture knowledge transfer. 
      • Bob had PI volunteer info about ECSS person teaching them about containers etc.  Good person for Hannah to talk to. 
    • How working with XSEDE has changed how they see themselves
      • Can't report a number with their story
      • I'm more capable of taking on projects that require computational work
      • More comfortable doing things that require XSEDE resources