Wed, 10/00 – 12-4pm ET | 11am-3pm CT | 10am-2pm MT | 9am-1pm PT

Executive Summary of Meeting (available after summary is approved by XAB)

Attendees:

Name | Present (tick)/(error)

XAB Members
  • Karin Remington (tick)
  • Randy Bryant (tick)
  • Thomas Cheatham
  • Toni Collis (tick)
  • Rama Govindaraju
  • Cliff Jacobs (tick)
  • Albert Lazzarini (tick)
  • Phil Maechling
  • Shaowen Wang
  • Theresa Windus

Service Provider Forum
  • Shawn Strande (tick)
  • Jonathon Anderson
  • Dana Brunson (tick)

User Advisory Committee
  • Emre Brookes

XSEDE Leadership
  • John W. Towns (tick)
  • Kelly Gaither (tick)
  • Philip Blood (error)
  • Robert Sinkovits (tick)
  • Dave Lifka
  • Gregory Peterson
  • David Hart (tick)
  • Ron Payne (tick)
  • Scott Wells (tick)
  • Leslie Froeschl (tick)
  • Jennifer Houchins
  • Victor Hazlewood (tick)
  • Sergiu Sanielevici (tick)
  • Craig Stewart
  • Lizanne DeStefano (tick)
  • Kristin Williamson
  • Hannah Remmert (tick)
  • Lorna Rivera (tick)

Agenda

Time | Duration | Item w/ Notes (presentation materials linked) | Lead
11:00 AM | 10 mins

Welcome

  • Introductions

Lead: Karin Remington
11:10 AM | 30 mins

Project Level Topics

  • Thanks to XAB members for attending and for their input; the engagement of members is appreciated.
  • NSF has forward-funded XSEDE 2.0 through ~Feb 2021
    • Project end date remains Aug 2021
    • Spending authorization is still required annually
    • Forward funding is based on strong reviews and the reports we've provided
  • Interactions with Program Director are good.
    • not required to have a mid-year review this year
    • use of KPIs has been an effective way to communicate status/progress of project
  • Strategic planning session held Oct. 2
    • assess alignment of our stakeholders (stakeholders defined broadly as user community, partners (SPs), staff, funding agencies, etc.) with strategic goals
    • identify stakeholder priorities, priority areas for draft transition plan, priorities beyond XSEDE 2.0
    • Stakeholder priorities (must haves):
      • Access to research computing
      • Infrastructure services supporting the ecosystem
      • Workforce development
      • Campus champions and campus engagement
      • Advanced user support
    • How to define XSEDE's role in addressing gaps / how XSEDE is involved
      • Providing leadership role
      • Collaborating with larger community
      • When you think about how to address gaps, think in strategic terms.
  • Beyond XSEDE 2.0 scenarios
    • Fund single awardee for similar services at ~$75M/5 years (labor cost)
      • What are the most critical services we provide? Consider what could be eliminated.
    • Fund multiple awardees
      • Expanded scope for each component: e.g., RAS might be expanded to provide support for other allocations processes.
      • As resources are retired, that could affect whether there would be anything like XSEDE at that point.
    • Albert: Asking us to define vision for future?
      • JT: This will point out key things that need to be continued; we don't know what NSF will do with that. Timing is odd: decisions will come in the next 4-6 months, but our transition plan wouldn't get to them until later. Should we deliver it earlier?
    • Albert: Initiatives under the 5-10 new ideas initiative, such as multi-messenger astronomy, are exploring computing resources to support their work.
      • JT: Participating in a workshop next week that will address this. Not sure what XSEDE's role might be; also a timing issue. Services XSEDE provides could be leveraged, but there are challenges in how we'd put together a partnership.
  • NSF Discussions: Conversation with Manish Parashar re. where NSF is going, how we can provide input to the process
    • Manish would like John to come to NSF to brainstorm with him and others there about what should happen next. Encouraged that they're willing to hear John's input: what he sees as priorities, what's working & what's not. Expect to see community workshops to get community input as well. Manish should be at NSF through the discussion and award phases.
    • John is planning what topics he'll take to that meeting.
Lead: John Towns
11:40 AM | 20 mins

Evaluation Update

  • Staff climate study (inward looking study of what XSEDE is like as a workplace)
    • Trend of increasing scores from the 2013 baseline
    • Highest gains in wiki & website, communication tools, decision making
    • Leadership/management training was a common request
    • Cliff: You have learned elements of what helps a virtual org be successful. Important to share this with NSF and the community in general. There is value in studying how an organization can improve itself through climate studies.
      • Lizanne: NSF invited us to speak at large facilities workshop about this.
      • John: Don't see NSF recognizing contributions we're making. How do we incorporate this information into NSF planning processes? Particularly when they're thinking about what the next thing looks like. Don't want them to get lost in discrete services needed and forget that they have to be offered in concert.
  • User survey (annual survey about how users feel about XSEDE: how aware they are, how satisfied they are)
    • Awareness slightly down from 2017. 25% of users are new to XSEDE every year, so this is common.
      • Always need to work to get word out about our services
      • Awareness is highest among center non-research staff, center research staff, high school users, HPC centers, MSIs, and EPSCoR-state institutions.
    • Satisfaction remains high (4.28 on 5-point scale)
      • Highest among high school, government lab, faculty/PI users, minority-serving and teaching focused institutions.
      • Survey continually evolving
    • Importance of XSEDE resources to your research remains high
    • Recommendations: More and faster.
    • Randy: Numbers are extraordinary, esp since XSEDE doesn't have control over resources themselves.
    • John: Many in community don't understand difference between XSEDE & SPs
    • Cliff: Can XSEDE make the argument that its existence has enabled long-term challenges to be more effectively addressed?
      • Lizanne: Large % of people who use XSEDE training materials don't have an allocation. SPs don't have to provide training for these things.
Lead: Lizanne DeStefano
12:00 PM | 20 mins

Program Office Topics

  • Discover More marketing campaign
    • Find ways to tell the story more consistently and convey what is unique
    • Show XSEDE as an approachable project no matter who you are as a researcher. Everyone can discover more with XSEDE.
    • 2 phases: launching at SC18 along with digital implementation; the later phase highlights more human elements of XSEDE.
  • SC18 plans
    • Updated booth graphics to match the Discover More campaign
    • 2 in-booth events: Tuesday morning Breakfast of Champions to highlight Campus Champions; Wednesday Discover More reception.
    • Will begin sending communications on these soon.
    • SPs will also be promoted.
  • Cliff: Remind everyone of the strategic objective these activities are trying to address; frame the work within that context.
  • Karin: Are we tracking SC participation over the years? Any indication of how presentation materials are doing over time?
    • Hannah: We track booth leads (who comes by) and how many of them have a conversation & pick up materials.
  • Marketing analysis
    • 10 students assessing market for XRAS & SSO hub services
    • Difficult to bring them up to speed on what these products are; steep learning curve.
    • Midpoint presentation Oct. 22. SMT will be invited to join that.
Lead: Ron Payne
12:20 PM | 30 mins

RAS Topics

  • Team met in September
    • Valuable to look at a 12-month window and goals
    • Improve administrative aspects of XRAS
    • Revisit accounting/updating with modern technology
    • Reviewed tasks completed & accomplishments
    • Using XRAS to help allocate aircraft at UCAR/NCAR
    • Huge demand vs. available resources; declining success rate for research requests (down to about 65%)
  • Emre: Concerned about the declining success rate. Is there data to explain why?
    • Dave: Given the huge demand (225 requests at each quarterly meeting; 250 not uncommon), this is inevitable; we can't sustain all of them. Over time the panel approach has evolved to understand the constraints that are out there & look for ways to simplify reviewers' workload. The XRAC reviewer manual is continually referred to, including guidance for rejection. Targeting XRAC process and policies. The fewer requests we reject, the smaller everyone's award has to be.
    • Emre: Why are people being rejected?
      • Dave: 3 major reasons. First, some sort of technicality (they failed to do something, so they can easily be taken out).
      • Second, poorly justified requests: better to reject a user than to give them something so small they can't get anything done. Awardees have 12 months to use an award and aren't supposed to come back before those 12 months are up; if rejected, they can come back in 3 months.
      • Every time you reject, there is a chance the user goes away and never comes back. Don't want that.
      • Emre: Would like statistics on the categories of rejection, and numbers over time, to understand what's happening.
      • Dave: The large number of requests handled by ECSS staff have a higher success rate. Rejections are higher at the plenary and parallel session levels. A new user is more likely to be rejected than a renewal.
      • Emre: Is it more bad science, or has the process gotten harder?
      • Dave: Both. We went from 100+ to 200+ requests, and what's being requested has morphed.
    • Karin: Across-the-board cuts used to happen all the time at NIH and were difficult for programs to manage; an individual program officer probably doesn't know all the constraints.
    • Cliff: Have you analyzed who gets rejected by discipline, institution?
      • Dave: A 2016 chart shows who gets rejected, and at what percentage, by track: Adaptive; Bioscience; Material Sciences; Physics/Engineering/Astronomy; Plenary. A chart of what happens after rejection shows 20% never submit again, 20% get rejected again, and 60% later get a research or startup allocation. Always trying to improve this; it informs the outreach/education program. No simple solution.
    • Shawn: Has the panel become slightly less tolerant of proposals that aren't well written/documented? Over time there's been a tendency to be tougher than in the past (partially due to reviewers' knowledge of limited resources).
      • Dave: There is some of that dynamic; practice has evolved more than policies. Many reviewers are looking at 10+ requests/quarter (volunteer work). We continue to push to rotate out reviewers and bring in new members, which has helped; as new reviewers come in, they tend to be more willing to consider requests.
    • Albert: Is the rejection rate a reflection of demographics? Is an effort made to ensure various communities are equitably represented?
      • Dave: We do not make an effort to ensure a percentage goes to each field; we look at the merit of requests regardless of field. There is a perceived competition between biologists & physicists. Materials research has increased dramatically, with a huge volume of requests.
      • Shawn: Is there a connection to the PI's funding?
        • Dave: Once the panel makes its recommendation, the reconciliation process factors in the source & amount of funding (NSF gets highest priority). It is a formulaic process that brings recommendations in line with available resources (see the sketch after this item).
    • Toni: Are gender or other demographics tracked?
      • Dave: RAS doesn't track this; the evaluation team looks at gender. We can look at institutional criteria (MSI/EPSCoR). Allocations tend to follow funding from agencies.
      • Toni: Good to have this data and be ready to tell your story.
Lead: Dave Hart
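Dave described the reconciliation step only as "formulaic," factoring in the source and amount of funding. As a minimal sketch of what such a step could look like (the weights, names, and proportional rule below are illustrative assumptions, not the actual XRAC algorithm), panel recommendations might be scaled down to fit the available service units (SUs), with NSF-funded requests weighted highest:

# Hypothetical reconciliation sketch: scale panel-recommended awards so the
# total fits the available resource, weighting by funding source. The weights
# and the proportional rule are assumptions for illustration only; they are
# not XSEDE's actual policy.

# Illustrative weights by funding source (assumption: NSF weighted highest).
FUNDING_WEIGHT = {"NSF": 1.0, "NIH": 0.9, "DOE": 0.9, "unfunded": 0.7}

def reconcile(recommended, available_sus):
    """recommended: list of (project, funding_source, recommended_sus).
    Returns a dict of project -> awarded SUs totaling at most available_sus."""
    total = sum(sus for _, _, sus in recommended)
    if total <= available_sus:
        # Enough capacity: award the full recommendations, no cuts needed.
        return {proj: sus for proj, _, sus in recommended}
    # Oversubscribed: proportionally scale weighted awards to fit capacity.
    weighted_total = sum(FUNDING_WEIGHT[src] * sus for _, src, sus in recommended)
    scale = available_sus / weighted_total
    return {proj: FUNDING_WEIGHT[src] * sus * scale for proj, src, sus in recommended}

if __name__ == "__main__":
    panel = [("proj-A", "NSF", 5_000_000),
             ("proj-B", "NIH", 3_000_000),
             ("proj-C", "unfunded", 2_000_000)]
    for proj, sus in reconcile(panel, available_sus=6_000_000).items():
        print(f"{proj}: {sus:,.0f} SUs")

In this toy run the 10M SUs recommended are cut to the 6M available, with the unfunded project absorbing a proportionally larger cut than the NSF-funded one.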
12:50 PM | 10 mins | Break
1:00 PM | 30 mins

XCI Topics

Lead: Dave Lifka
1:30 PM | 30 mins

CEE Topics

Lead: Kelly Gaither
2:00 PM | 30 mins

ECSS Topics

Leads: Phil Blood & Bob Sinkovits
2:30 PM | 30 mins

Ops Topics

Operations-XAB-presentation

Lead: Victor Hazlewood
3:00 PM | Adjourn