2012 Software Review Planning Meeting: Mar. 16, 2012

Main Hall-D Software Review Page


Connecting

Phone:

  • +1-866-740-1260 : US+Canada
  • +1-303-248-0285 : International
  • then enter participant code: 3421244# (remember the "#").
  • or connect via www.readytalk.com (enter the code without the #)

Agenda

Dear all,

    Thanks for the work in producing concise summaries of our
previous self-assigned actions. I think this has been a very
useful communication and collaboration between the four Halls
and other interested parties. Very good!

    Please find attached an example charge as given to me by
Chip Watson, and once more the earlier document on planning for
the software review.

    Here's our further tentative planning, assuming a software
review on June 7 and 8:
    Between March 29 and April 4:
        Follow-up preparatory meeting to discuss
    1) a few introductory slides on the scientific computing outlook by 2015
    2) for each Hall, one slide on anticipated weaknesses/tight spots
        and one slide on the planned management structure

    Week of April 23-27:
        Initial Dry Run: Talk Outlines
            (what you plan to present in each slide)

    May 14 or May 15:
        First Dry Run: Near-complete presentations

    Period of May 24, 25, 29, 30 (dates to be determined):
        Final Dry Run: Timed presentations

                    Best regards,    Rolf 
  • Timeline for Hall-D
    • April 23-27 - talk outlines.
    • April 30 - Written material for the review (CDR?).
    • May 14/15 - First draft of presentations for dry run.
    • May 24/25 or 29/30 - Final talks with timed presentations.
    • June 7/8 - Software Review at JLab.
  • Cross-Hall Collaboration
  • Organization of Hall-D preparations for Review

Minutes

Attendees: Mark Ito, David Lawrence and Curtis Meyer

Status of Review Organization

Cross-Hall Collaboration

Organization of Hall-D preparations for Review

Introductory talk

Software Manpower within collaboration

Offline Computing Schedule

Software Milestones/Challenges

  • Curtis has written a one-page summary document describing the experience in CLAS of moving large data sets from Jefferson Lab to a remote site for analysis.

Calibration/Alignment

Misc.

  • Curtis will contact members of the collaboration to try to get an estimate of the current computing resources available outside of Jefferson Lab for GlueX. The example from CMU is below (a rough tally of the totals is sketched after the listing):
At CMU: These resources are shared with our lattice QCD group, with whom
we jointly operate the cluster. All nodes have roughly 2.4 GHz CPUs and
all are under three years old at the moment.

32   dual quad-core AMD machines    ->  256 compute cores, 1 GB of RAM per core
12   quad eight-core AMD machines   ->  384 compute cores, 2 GB of RAM per core
180 TB of RAID storage spread across several servers
10 Gigabit backbone from the RAID servers to the switches, and 2 Gigabit connections
to each compute node.


Over the next few years, we anticipate purchasing around 16 boxes of the next
generation of CPUs (quad sixteen- or thirty-two-core boxes). We also expect to
double our disk storage capacity as we start to retire some of our very old few-TB
servers.
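
As a quick cross-check of the CMU numbers above, here is a minimal tally in Python. It assumes "dual quad-core" means 2 sockets x 4 cores and "quad eight-core" means 4 sockets x 8 cores, and that the quoted GB-per-core figures apply uniformly; the machine counts and memory figures are taken directly from the listing, so this is a sketch under those assumptions rather than an authoritative inventory.

# Rough tally of the CMU cluster resources quoted above.
# Assumption: "dual quad-core" = 2 sockets x 4 cores, "quad eight-core" = 4 sockets x 8 cores.
machine_types = [
    # (boxes, sockets, cores per socket, GB of RAM per core)
    (32, 2, 4, 1),   # dual quad-core AMD machines
    (12, 4, 8, 2),   # quad eight-core AMD machines
]

total_cores = sum(n * s * c for n, s, c, _ in machine_types)
total_ram   = sum(n * s * c * gb for n, s, c, gb in machine_types)

print(f"Total compute cores: {total_cores}")  # 256 + 384 = 640 cores
print(f"Total RAM: {total_ram} GB")           # 256 GB + 768 GB = 1024 GB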

Action Items