
Decisions:

N/A 
Action Items:

Summary: NSF Meeting Slides (QMA-325)
Description: John will make first set of slides available
Responsible: John Towns
Due Date:

 

Notes/Discussion items:


Two sessions at NSF. Most of the OAC staff were present in the morning; a few people from other directorates attended in the afternoon.

  • Had to describe what XSEDE is to both groups
  • Lessons learned
    • Importance of project coordination/project management practices
    • Balancing reporting overhead against communicating about the project
    • Financial management challenges are non-trivial
    • Must act like an institution: succession planning, staff performance feedback
    • Have a vision, mission, strategic goals
    • Determine means to measure impact and reporting that NSF can use when meeting with congressional oversight committees, etc.
    • Decision making: strong central management combined with delegation and decentralization of decision-making authority
    • Central coordination brings the most expertise to bear without duplication
    • Formal methods vs. short-term solutions
    • Leverage investments by NSF to deliver capabilities as quickly as possible. Acknowledge risks 
    • Consistency in what NSF, as the major stakeholder, is asking for; keep a consistent program officer as long as possible
    • Engaging the researcher community: embedded support staff, focused partnerships with domain experts
    • Data challenges: the program was originally more about computing, but data is intimately intertwined
    • Managing this complex environment is challenging, but end users don't care; a consistent environment must be provided regardless of the underlying complexity
  • Critical Capabilities to retain post-XSEDE 2.0
    • Went through each L2 area and highlighted pieces that would need to be retained

Post-XSEDE 2.0 Options

  • What is the vision? Put some bounds on it
  • Is there a long-term exit strategy, or must it be supported long-term?
  • Budgetary constraints: a better vision might command more budget
  • Which strategic impacts are most important to NSF? How are they measured? Can a program be constructed to accomplish this?
  • Go Big: expand scope to reduce duplication across the foundation
  • Really Big: Single MREFC-level award; treat as a facility
  • Big & Wide: Multiple awards that address various service areas/multiple smaller awards that might be more manageable
  • Go home: ramp-down funding and shut it down
  • Go home v2: ramp-down funding; encourage creation of commercial spinoffs
  • Status quo: fund a similar project at a similar budget / preserve the known working environment
  • Keep the trains running: fund a similar project with a significantly reduced budget / retain only critical services

Discussion: 

  • What was the overall reception?
    • No direct message. The indirect message was that they are interested in hearing from us. JT did not endorse any of the options he presented. They would like John to return and talk with them further, and will likely give topics/direction next time.
  • Who was present:
    • Nearly every OAC program officer for most of the day (~15 people)
    • In the afternoon, 3 people from elsewhere in the foundation: Peter McCartney (BIO), Nigel Sharp (NPS), and Vladimir, who called in via satellite phone from the South Pole
      • Would have liked more attendees from around the foundation, but most don't know that they need to know about this
  • The follow-on approaches presented did not touch on multiple-award models used elsewhere at NSF (e.g., EarthCube, Power Wireless research)
    • This model was discussed
  • Have asked the ER team to work on reports we can share with NSF about resource/service usage, including a complete list of PIs so awards can be connected to the usage chart (Distribution of Consumption by NSF-supported research projects); see the sketch after this list
    • Most directorates knew they have PIs using XSEDE services, but didn't know the scope, etc.
    • Can PIs enter an award # in anything XSEDE tracks? Yes; XDMoD checks data reported by PIs against the NSF award database
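The kind of award-to-usage cross-check mentioned above could look roughly like the Python sketch below, which joins PI-reported award numbers against an exported NSF award list and totals consumption per directorate. This is only an illustration: the file names, column names, and CSV layout are assumptions, not actual XSEDE or XDMoD data formats.

import csv
from collections import defaultdict

def load_awards(path):
    # Map award number -> NSF directorate from a (hypothetical) exported award list.
    awards = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            awards[row["award_number"].strip()] = row["directorate"].strip()
    return awards

def consumption_by_directorate(usage_path, awards):
    # Sum SUs charged per directorate; flag reported award numbers not found
    # in the NSF list (analogous to the validation described above).
    totals = defaultdict(float)
    unmatched = set()
    with open(usage_path, newline="") as f:
        for row in csv.DictReader(f):
            award = row["reported_award_number"].strip()
            directorate = awards.get(award)
            if directorate is None:
                unmatched.add(award)
                continue
            totals[directorate] += float(row["sus_charged"])
    return dict(totals), unmatched

if __name__ == "__main__":
    awards = load_awards("nsf_awards.csv")  # assumed export, not a real file name
    totals, unmatched = consumption_by_directorate("xsede_usage.csv", awards)
    for directorate, sus in sorted(totals.items(), key=lambda kv: -kv[1]):
        print(f"{directorate}: {sus:,.0f} SUs")
    if unmatched:
        print(f"{len(unmatched)} reported award numbers not found in the NSF list")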