2012 Software Review Planning Meeting: Mar. 16, 2012
From GlueXWiki
Revision as of 15:11, 16 March 2012
Main Hall-D Software Review Page
Connecting
Phone:
- +1-866-740-1260 : US+Canada
- +1-303-248-0285 : International
- then enter participant code: 3421244# (remember the "#").
- or www.readytalk.com (and code without the #)
Agenda
- Previous Meeting (Mar. 9, 2012)
- Status of Review Organization (JLab organizers)
- Meeting with all halls and Rolf Ent on 3/14/2012
Dear all,

Thanks for the work to come to concise summaries on our previous self-assigned actions. I think this has been a very useful communication and collaboration between the four Halls and other interested parties. Very good!

Please find attached an example charge as given to me by Chip Watson, and once more the earlier document on planning for the software review. Here's our further tentative planning, assuming a software review June 7 + 8:

- Between March 29 and April 4: Followup preparatory meeting to discuss 1) a few slides introducing the scientific computing outlook by 2015, and 2) for each Hall, one slide on anticipated weaknesses/tight spots and one slide on planned management structure
- Week of April 23-27: Initial Dry Run: talk outlines (what you plan to present in each slide)
- May 14 or May 15: First Dry Run: near-complete presentations
- Period of May 24, 25, 29, 30 (dates to be determined): Final Dry Run: timed presentations

Best regards, Rolf
- Time-line for Hall-D
- April 23-27 - talk outlines.
- April 30 - Written material for the review (CDR?).
- May 14/15 - First draft of presentations for dry run.
- May 24/25 or 29/30 - Final talks with timed presentations.
- June 7/8 - Software Review at JLab.
- Cross-Hall Collaboration
- Organization of Hall-D preparations for Review
- Introductory talk (preliminary slides) with notes.
- Software Manpower within collaboration (no change) (Mark)
- 2012 Offline Computing Schedule (David)
- Software Milestones/Challenges
- Calibration/Alignment
- Computing Resources in the Collaboration
Minutes
Attendees: Mark Ito, David Lawrence and Curtis Meyer
Status of Review Organization
Cross-Hall Collaboration
Organization of Hall-D preparations for Review
Introductory talk
- We went over the introductory talk, which we felt should be fairly graphically oriented.
Our current feeling is that the slides should include the following:
- Overview of Collaboration/Hall D software management.
- Overview (graphically) of what the software needs to do with rough data volumes and possibly estimates of compute resources.
- An "event display" showing a b1pi event and discussing what we need to have to do amplitude analysis.
- An overview of the 3pi analysis done by Jake, showing that we have carried out a full analysis: event generation and background generation, GEANT simulation on the grid, actual reconstruction of the events, physics cuts, and finally an amplitude analysis that could have been done on a GPU.
- A list of what we feel still needs to be done to have us ready for data analysis in 2015.
Software Manpower within collaboration
Offline Computing Schedule
Software Milestones/Challenges
- Curtis has written a one-page summary document describing the experience in CLAS of moving large data sets from Jefferson Lab to a remote site for analysis.
Calibration/Alignment
Misc.
- Curtis will contact members of the collaboration to try to get an estimate of the current computing resources that are available outside of Jefferson Lab for GlueX. The example from CMU is:

At CMU: These resources are split with our lattice QCD group, with whom we share operation of the cluster. All CPUs are roughly 2.4 GHz, and all nodes are under three years old at the moment.
- 32 dual quad-core AMD machines -> 256 compute cores, 1 GB of memory per core
- 12 quad eight-core AMD machines -> 384 compute cores, 2 GB of memory per core
- 180 TB of RAID storage spread across several servers
- 10 gigabit backbone from the RAID servers to the switches, and 2 gigabit connections to each compute node

Over the next few years, we anticipate purchasing around 16 boxes of the next generation of CPUs (quad sixteen- or thirty-two-core boxes). We also expect to double our disk storage capability as we start to retire some of our very old few-TB servers.
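As a cross-check of the core and memory figures quoted above, a minimal back-of-the-envelope tally (only the machine counts, sockets, cores, and GB-per-core come from the text; the script structure itself is purely illustrative):

```python
# Tally of the CMU cluster resources quoted in the minutes.
# Each entry: (boxes, CPUs per box, cores per CPU, GB of memory per core)
machines = [
    (32, 2, 4, 1),   # dual quad-core AMD nodes
    (12, 4, 8, 2),   # quad eight-core AMD nodes
]

total_cores = sum(boxes * cpus * cores for boxes, cpus, cores, _ in machines)
total_mem_gb = sum(boxes * cpus * cores * gb for boxes, cpus, cores, gb in machines)

print(total_cores)   # 256 + 384 = 640 cores
print(total_mem_gb)  # 256 GB + 768 GB = 1024 GB
```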