Physics Management:
  • March 2011 - Program Manager at the Office of High Energy Physics, US Department of Energy:
    • Theoretical Physics (March 2011-present)
    • LHC Operations (March 2011-December 2012)
    • Energy Frontier (December 2011-December 2012)
    • LHC Detector Upgrades (December 2012-present)
  • October 2010 - Nominated Leader of a CDF Exotic group focusing on searches for new particles
  • Since 2009 - Leader of a CDF group (VEP) focusing on searches for new particles
    Appointed to lead a group of about 30 people involved in physics analyses searching for new particles predicted by leading theoretical models. The main task has been coordinating and mentoring the analyzers' activities, with the goal of expeditiously obtaining approval of the results by the Collaboration and subsequent publication in peer-reviewed journals. This has been accomplished thanks to the broad experience gained while leading the Exotic Physics group of CDF in the early stages of Run II, as well as broad knowledge of the analysis tools used by the experiment. More than 70% of the analyses have been approved for public consumption outside the collaboration, and the remainder are on track for approval by the end of 2010.
  • 2006-2007 - Co-convener of the group working on the ATLAS CSC note on Single Top
    The CSC notes were produced by the ATLAS collaboration to update the results on the physics reach and potential of the ATLAS detector presented in the Physics TDR published in 1999. These new results made use of more realistic detector simulation and reconstruction algorithms. The appointment as co-convener of the group, composed of about 40 people, followed previous work on validation of the new simulation and on benchmarking of reconstruction algorithms.
  • 2001-2002 - Leader of the CDF New Particles Search (Exotic) Physics Group
    In the early stages of CDF Run II, expertise in the new software tools and data access was recognized as a key component for leading one of the four main physics groups (comprising more than 100 people each) to obtain quick physics results. Signatures that could be understood quickly were analyzed first, paving the way for more complex ones. Traditional approaches were complemented by new signature-based investigations aimed at quickly confirming or excluding Run I anomalies. Several preliminary results, direct outcomes of this oversight as Exotic Physics Group Convener, were presented at the 2003 Winter Conferences.
  • 2000 - Leader of the CDF High Level Objects Group
    Contributed to the definition of high-level objects for physics analysis and to estimates of access patterns and maximum event-data size for realistic early-run scenarios. The group included about 30 people.
  • 1999-2000 - Leader of the CDF Exotics Triggers/Datasets/Tools group
    Appointed to help the experiment design and implement triggers optimized for new-physics searches and to devise data access strategies to obtain first results quickly. This appointment recognized leadership in trigger simulation development and data access expertise. The group included about 30 people.
  • 1999-2005 - Leader of the CDF Trigger Simulation Group
    Appointed to lead the effort of writing the complete trigger simulation for CDF Run II, heading a group of about 40 people, mostly hardware experts with minimal software proficiency. The leader's tasks were to provide a common software framework, deal with issues such as data access and consistent data flow between different modules and packages, and ensure that TRGSim++ was appropriately updated for each software release.
Computing and Software:
  • 2009-2010 - Validation of CDF software releases on new Linux versions
    As a member of the code management team, revised and updated test programs to validate the execution of CDF software on different platforms.
  • Since 2007 - Architect and main developer of a suite of tools used across physics groups in CDF, aimed at automating the calculation of efficiencies and scale factors for data and Monte Carlo
    With Run II entering a mature stage, it was recognized that the majority of analyses use similar procedures and selection criteria but often repeat common tasks such as calculating selection efficiencies. Eliminating such redundancy maximizes the physics output. PerfIDia (Performance and ID instant answer) is a suite of tools that provides common efficiencies and scale factors and establishes a validation procedure applied to both data and Monte Carlo. It is used by all recently published CDF results. Analyses are streamlined, as common elements are made available centrally and in a coordinated fashion. The PerfIDia suite also allows monitoring of data quality and stability in quasi real time.
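    The kind of efficiency and scale-factor bookkeeping that a suite like PerfIDia centralizes can be illustrated with a minimal sketch; the function names and numbers below are hypothetical illustrations, not the actual PerfIDia interface:

```python
import math

# Illustrative sketch only: names and numbers are hypothetical,
# not the PerfIDia API.

def efficiency(n_pass, n_total):
    """Selection efficiency with a simple binomial uncertainty."""
    eff = n_pass / n_total
    err = math.sqrt(eff * (1.0 - eff) / n_total)
    return eff, err

def scale_factor(eff_data, eff_mc):
    """Data/MC scale factor used to correct simulated efficiencies."""
    (ed, sd), (em, sm) = eff_data, eff_mc
    sf = ed / em
    # propagate relative uncertainties in quadrature
    err = sf * math.sqrt((sd / ed) ** 2 + (sm / em) ** 2)
    return sf, err

# Example: 950 of 1000 data events pass an ID cut, vs 980 of 1000 in MC
eff_d = efficiency(950, 1000)
eff_m = efficiency(980, 1000)
sf, sf_err = scale_factor(eff_d, eff_m)
print(f"data eff = {eff_d[0]:.3f}, MC eff = {eff_m[0]:.3f}, "
      f"SF = {sf:.3f} +/- {sf_err:.3f}")
```

    Providing such quantities centrally, rather than having each analysis recompute them, is what removes the redundancy described above.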
  • 2001-2004 - Co-author of evtNtuple, a CDF analysis Ntuple
    evtNtuple was the first complete standard Ntuple used across the CDF collaboration for the initial Run II data analysis. While more complex standard Ntuples were being developed, evtNtuple, as a flat representation of the event record living outside the CDF software environment, allowed very quick data access and validation, reducing the latency associated with the software development cycles of multiple releases and the changes in object definition typical of an early-stage experiment. Data were streamed into evtNtuple immediately after offline reconstruction and analyzed, decoupling data analysis from the software development of more complex physics objects.
  • 1999-2006 - Leader of the CDF Trigger Simulation Group
    TRGSim++ is a set of C++ packages developed to emulate the fully digital trigger of CDF II (L1 and L2). TRGSim++ runs online as the engine of the trigger monitor TRIGMON (in the Control Room) and offline as an analysis tool to calculate rates and efficiencies.
  • 1997-1998 - Author of several innovative studies on the use of object oriented databases as a storage system for CDF Run II data
    Following ideas developed at CERN in previous years, the concept of separately storing different pieces of the event information on different media, to optimize storage resources and data access, was introduced at Fermilab. This event-splitting concept was used in CDF when designing multibranch files residing on the same media. The ATLAS collaboration implemented complete splitting of the event across different physical locations, using ROOT as the underlying storage technology and a relational database to manage metadata.