Trigger and DAQ at the LHC (Part II)

Trigger and DAQ at the LHC (Part II)
Tulika Bose, Brown University
NEPPSR 2007, August 16, 2007

The LHC Trigger Challenge

[Figure: inclusive cross sections (mb down to fb) and the corresponding event rates (GHz down to μHz) versus mass M [GeV], from inelastic pp scattering and bb, W, Z, tt production down to discovery channels such as gg → H_SM, qq → qqH_SM, H_SM → γγ and scalar leptoquarks, with the bunch-crossing and storage rates marked.]

Triggering at the LHC is very challenging! At the standard LHC luminosity L = 10^34 cm^-2 s^-1 = 10^10 Hz/b:
- σ_inelastic(pp) ~ 70 mb, giving 7 × 10^8 interactions/s;
- bunch-crossing frequency of 32 MHz, giving ~22 interactions per bunch crossing;
- storage rate of ~200-300 Hz, requiring an online rejection of 99.999%, with a crucial impact on the physics reach.
Keep in mind that what is discarded is lost forever.
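The numbers above follow from simple arithmetic; here is a minimal sketch that reproduces them (the constants are taken from this talk, not from any experiment database), including the Level-1 bandwidth estimate used on the next slide:

```python
# Back-of-the-envelope trigger arithmetic at nominal LHC conditions.
LUMINOSITY = 1e34          # cm^-2 s^-1 (nominal)
SIGMA_INELASTIC = 70e-27   # sigma_inelastic(pp) ~ 70 mb, in cm^2
CROSSING_RATE = 32e6       # Hz (effective bunch-crossing frequency)
STORAGE_RATE = 200.0       # Hz (events written to permanent storage)
EVENT_SIZE = 1e6           # bytes (~1 MB per ATLAS/CMS event)
L1_ACCEPT_RATE = 100e3     # Hz (maximum Level-1 output rate)

interaction_rate = LUMINOSITY * SIGMA_INELASTIC   # ~7e8 interactions/s
pileup = interaction_rate / CROSSING_RATE         # ~22 interactions/crossing
rejection = 1.0 - STORAGE_RATE / CROSSING_RATE    # ~99.999% of crossings discarded
l1_bandwidth = EVENT_SIZE * L1_ACCEPT_RATE        # ~100 GB/s after Level-1

print(f"interactions/s: {interaction_rate:.1e}")
print(f"pileup        : {pileup:.0f}")
print(f"rejection     : {rejection:.5%}")
print(f"L1 bandwidth  : {l1_bandwidth / 1e9:.0f} GB/s")
```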

Multi-level trigger systems

- The L1 trigger selects ~1 event in 10,000 (maximum output rate ~100 kHz). This is NOT enough: a typical ATLAS or CMS event is ~1 MB in size, and 1 MB × 100 kHz = 100 GB/s!
- What is the amount of data we can reasonably store these days? About 100 MB/s.
- Additional trigger levels are therefore needed to reduce the fraction of less interesting events before writing to permanent storage.

Multi-tiered trigger systems

The Level-1 trigger is an integral part of all trigger systems (it always exists) and reduces the rate from 32 MHz to 100 kHz (max). Downstream, further reduction is needed; this is done in one or two steps:

- ATLAS (3 physical levels): detectors → Lvl-1 with front-end pipelines → Lvl-2 with readout buffers → switching network → Lvl-3 processor farms
- CMS (2 physical levels): detectors → Lvl-1 with front-end pipelines → readout buffers → switching network → HLT processor farms

3-level processing (ATLAS)

High Level Trigger = Level-2 + Event Filter (EF); the additional processing at Level-2 reduces the network bandwidth requirements.

- Level-1 (hardware, ~2.5 μs latency): see the talk by Kevin Black.
- Level-2 (software, O(10 ms) latency): seeded by Regions-of-Interest; full granularity for the sub-detectors; fast rejection; steering.
- Event Filter (software, O(1 s) latency): seeded by the Level-2 result; potential full event access; offline-like algorithms.

2-level processing (CMS)

[Diagram: LV-1 (μs latency) takes 40 MHz down to 10^5 Hz; ~1000 Gb/s then flows into the HLT (ms to s latency), which outputs 10^2 Hz.]

- The Level-1 trigger reduces the rate from 32 MHz to 100 kHz (max): custom electronic boards and chips process the calorimeter and muon data to select objects.
- The High-Level Trigger (HLT) reduces the rate from 100 kHz to O(100 Hz): a filter farm of commodity PCs performs partial event reconstruction on demand, using the full detector resolution.
- Two-level processing reduces the number of building blocks and relies on commercial components for processing and communication.

On-line and off-line processing

[Diagram: the processing chain spans time scales from 25 ns (10^-9 s) through 3 μs, ms, sec, hour, and year (10^3 s and beyond), with data volumes ranging from Giga through Tera to Petabit.]

- ON-line: LEVEL-1 trigger; hardwired processors (ASIC, FPGA); pipelined, massively parallel.
- DAQ and HIGH-LEVEL triggers: farms of processors.
- OFF-line: reconstruction & analysis at the Tier-0/1/2 centers.

DAQ Overview

[Figure: schematic of the DAQ system; its components are described on the next slide.]

DAQ Architecture

- Detector front-ends: modules which store data from the detector front-end electronics upon a L1 accept.
- Readout systems: modules which read data from the front-end systems and store it until it is sent to the processors for analysis.
- Intermediate trigger level (à la ATLAS): local, partially assembled detector data provides an intermediate trigger level.
- Builder network: a collection of networks (switches) that interconnects the readout and filter systems and assembles the events.
- Filter systems: processors which execute the HLT algorithms to select interesting events for offline processing.

DAQ Overview

[Figure: DAQ schematic, highlighting the components described on the next slide.]

DAQ Architecture (continued)

- Event Manager: responsible for controlling the flow of data (events) in the DAQ system; simplifies the overall system synchronization.
- Computing systems: processors which receive the filtered events from the filter farms.
- Controls: entities responsible for the user interface and for the configuration and monitoring of the DAQ.

Event Builder Scheme

Event fragments are stored in independent physical memories, but each full event must end up in the single physical memory of one processing unit (a commodity PC). The Event Builder builds full events from the event fragments; it must interconnect all data sources to all destinations: a huge network switch.
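To make the building step concrete, here is a minimal sketch under invented assumptions (a fixed number of sources, fragments tagged with an event number and a source ID); a full event is complete once every source has contributed its fragment:

```python
from collections import defaultdict

NUM_SOURCES = 4  # readout units, one fragment each per event (illustrative)

class EventBuilder:
    """Assemble full events from per-source fragments keyed by event number."""

    def __init__(self, num_sources):
        self.num_sources = num_sources
        self.pending = defaultdict(dict)  # event_id -> {source_id: payload}

    def add_fragment(self, event_id, source_id, payload):
        """Store one fragment; return the full event once all sources arrived."""
        self.pending[event_id][source_id] = payload
        if len(self.pending[event_id]) == self.num_sources:
            return self.pending.pop(event_id)  # complete: ship to a filter node
        return None  # still waiting for fragments from other sources

builder = EventBuilder(NUM_SOURCES)
for src in range(NUM_SOURCES):
    full_event = builder.add_fragment(event_id=42, source_id=src, payload=b"...")
print(full_event is not None)  # True: all fragments of event 42 were collected
```

In a real event builder the fragments travel through the switch and the destination is assigned centrally; the bookkeeping, however, is essentially this.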

Event Building with a Switch

A switch is a networking device that connects network segments. It allows data to be sent from a PC connected to one port (the input port) to a PC connected to another port (the output port) directly, without duplicating the packet to all ports; in other words, it is an intelligent hub. The switch inspects data packets as they are received, determines the source and destination of each packet, and forwards it appropriately. This conserves network bandwidth and optimizes data transfers. A switch you may be familiar with: an 8-port consumer-grade switch.

HEP Switching Technologies

- Gigabit Ethernet: 64 ports @ 1.2 Gb/s
- Myricom Myrinet: 64 ports @ 2.5 Gb/s

Traffic Issues

Event-builder traffic jam: all sources send to the same destination concurrently, causing congestion. Event Builder congestion must not lead to readout buffer overflow: traffic shaping is needed!

Dealing with traffic

Barrel shifter: the sequence of sends from each source to each destination follows the cyclic permutations of all destinations. This allows the throughput to reach close to 100% of the input bandwidth. Additional traffic-shaping techniques are used as well; a sketch of the barrel-shifter schedule follows below.
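The scheduling idea in code (purely illustrative): in time slot t, source s sends to destination (s + t) mod N, so no two sources ever target the same destination in the same slot and every output port stays busy:

```python
NUM_NODES = 4  # number of sources and destinations (illustrative)

def barrel_shifter_schedule(num_nodes, num_slots):
    """Yield, for each time slot, a conflict-free source -> destination map."""
    for t in range(num_slots):
        yield {s: (s + t) % num_nodes for s in range(num_nodes)}

for slot, assignment in enumerate(barrel_shifter_schedule(NUM_NODES, NUM_NODES)):
    # Each destination appears exactly once per slot: no output contention.
    assert sorted(assignment.values()) == list(range(NUM_NODES))
    print(f"slot {slot}: {assignment}")
```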

Strategies

The massive Level-1 data rate poses problems even for network-based event building, and ATLAS and CMS have adopted different strategies:

- ATLAS uses the Region-of-Interest (RoI) mechanism with sequential selection to access the data only as required, i.e. only the data needed for Level-2 processing is moved (sketched below). This reduces by a substantial factor the amount of data that must be moved from the readout systems to the processors, but relatively complicated strategies are needed to serve the data selectively to the Level-2 processors, i.e. more complex software.
- CMS factorizes event building into a number of slices, each of which sees only a fraction of the rate. This requires a large total network bandwidth (and hence cost), but avoids the need for a very large single network switch.
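The RoI mechanism can be sketched in a few lines (all names here are hypothetical, not the ATLAS software): Level-2 pulls only the readout fragments that overlap the regions flagged by Level-1, and it stops fetching for a candidate as soon as one confirmation step fails:

```python
def level2_decision(rois, fetch_fragment, steps):
    """Sequential RoI-based selection: pull data on demand, reject early.

    rois           -- regions flagged by Level-1, e.g. (eta, phi) windows
    fetch_fragment -- callable returning one detector fragment for one RoI
    steps          -- ordered (detector, algorithm) confirmation steps
    """
    for roi in rois:
        confirmed = True
        for detector, algorithm in steps:
            data = fetch_fragment(detector, roi)  # only the RoI data moves
            if not algorithm(data):
                confirmed = False  # candidate fails: stop fetching for it
                break
        if confirmed:
            return True  # one confirmed candidate suffices to accept
    return False  # no RoI survived all steps: reject the event
```

Only the fragments actually requested cross the network, which is how the RoI approach trades bandwidth for software complexity.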

DAQ Slices

Eight slices: each slice sees only 1/8th of the events. An additional advantage: not all slices have to be implemented initially (funding).

Level-2 (ATLAS):
- The Region-of-Interest (RoI) data is ~1% of the total, so a smaller switching network is needed (not in the number of ports, but in throughput).
- But it adds a Level-2 farm and a lot of control and synchronization.
- The problem of a large network becomes a problem of Level-2.

Combined HLT (CMS):
- Needs very high throughput and a large switching network.
- But the data flow and operations are simpler, and the system is more flexible: the entire event is available to the HLT, not just a piece of it.
- The problem of selection becomes a problem of technology.

High Level Trigger

HLT Guidelines

Strategy/design: use the offline software as much as possible. It is easy to maintain (the software can be easily updated) and it embodies our best (bug-free) understanding of the detector.

Boundary conditions: the code runs in a single processor, which analyzes one event at a time, and it has access to the full event data (full granularity and resolution).

Limitations: CPU time, the output selection rate (~100 Hz), and the precision of the calibration constants.

HLT Requirements

- Flexible: the working conditions at 14 TeV are difficult to evaluate, so prepare for different scenarios.
- Robust: the HLT algorithms should not depend in a critical way on the alignment and calibration constants.
- Inclusive selection: rely on inclusive selections to guarantee maximum efficiency for new physics (we need to select the events that are "interesting enough").
- Fast event rejection: events that are not selected should be rejected as fast as possible, i.e. early in the processing.
- Quasi-offline software: offline software used online should be optimized for performance.

HLT Processing

High-level triggers (beyond Level 1) are implemented more or less as advanced software algorithms, run on standard processor farms with Linux as the OS; this is cost-effective, since Linux is free.

The HLT filter algorithms are set up in steps: each HLT trigger path is a sequence of modules, and processing of a trigger path stops once a module returns false. The algorithms are of essentially offline quality, but optimized for fast performance.

[Diagram, an example path: L1 seeds → L2 unpacking (MUON/ECAL/HCAL) → local reco (RecHit) → L2 algorithm → filter → L2.5 unpacking (pixels) → local reco (RecHit) → L2.5 algorithm.]
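The early-exit structure of a path can be sketched as follows (the module names and thresholds are invented for illustration; this is not the CMS framework API):

```python
def run_path(event, modules):
    """Run one HLT path: an ordered sequence of modules returning True/False.

    Processing stops at the first module that returns False, so the cheap,
    strongly rejecting steps go first and expensive reconstruction only
    runs for events that survive them.
    """
    for module in modules:
        if not module(event):
            return False  # the path rejects the event; later modules never run
    return True  # every module passed: the path accepts the event

# Illustrative e/gamma-like path: cheap confirmation first, tracking last.
egamma_path = [
    lambda ev: ev["l1_seed"],              # confirm the Level-1 seed
    lambda ev: ev["cluster_et"] > 26.0,    # calorimeter-only selection (GeV)
    lambda ev: ev["pixel_match"],          # pixel confirmation (L2.5)
    lambda ev: ev["track_quality"] > 0.9,  # full tracking, run last
]
event = {"l1_seed": True, "cluster_et": 31.5,
         "pixel_match": True, "track_quality": 0.95}
print(run_path(event, egamma_path))  # True: the event passes every step
```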

Example Trigger Path: CMS E/γ

- Level-2 (calorimeter information only): confirm the L1 candidates, apply clustering (the supercluster algorithm recovers bremsstrahlung), and select the highest-E_T cluster.
- Level-2.5 (pixels only): the calorimeter candidates are traced back to the vertex detector.
- Level-3: track reconstruction starting from the L2.5 seed, with track-quality cuts for electrons and a high-E_T cut for photons.

Trigger Menus

Need to address the following questions:
- What should be saved permanently on mass storage? Which trigger streams should be created? What bandwidth is allocated to each stream? (The bandwidth usually depends on the status of the experiment and its physics priorities.)
- What selection criteria should be applied? Inclusive triggers (to cover the major known or unknown physics channels), exclusive triggers (to extend the physics potential of certain analyses, say b-physics), plus prescaled triggers and triggers for calibration and monitoring (a prescale sketch follows below).

General rule: trigger tables should be flexible, extensible (e.g. to different luminosities), and should allow the discovery of unexpected physics. Performance is a key factor too.
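Prescaled triggers keep a controlled fraction of an otherwise overwhelming rate; a minimal sketch of the usual counter-based scheme (illustrative, not any experiment's implementation):

```python
class PrescaledTrigger:
    """Accept every Nth event that fires the underlying selection."""

    def __init__(self, prescale):
        self.prescale = prescale
        self.counter = 0

    def accept(self, fired):
        if not fired:
            return False
        self.counter += 1
        return self.counter % self.prescale == 0

# A prescale of 100 turns a 10 kHz calibration trigger into 100 Hz.
trigger = PrescaledTrigger(prescale=100)
accepted = sum(trigger.accept(True) for _ in range(10_000))
print(accepted)  # 100
```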

CMS HLT Exercise

CMS report to the LHCC (CERN-LHCC 2007-021): what is the CPU performance of the HLT? The HLT CPU time budget is ~40 ms/event. From the DAQ TDR (Dec 2002): in 2007, for a L1 accept rate of 50 kHz and 2000 CPUs, the required average processing time is 2000 / 50 kHz ~ 40 ms/event.

Focus of the exercise (motivated by the need to purchase the filter farm by the end of 2007):
- Compile a strawman trigger menu that covers the CMS needs: an implementation of the 2008 physics-run (14 TeV) trigger menu.
- Determine the CPU performance of the HLT algorithms.
- Select the events that are "interesting enough" and bring the rate down as quickly as possible.

CMS HLT Exercise: result

The average time needed to run the full trigger menu on L1-accepted events is 43 ms/event, measured on a Core 2 Xeon 5160 processor running at 3.0 GHz. The tails will be eliminated with a time-out mechanism. The CPU times depend strongly on the HLT input, so safety factors were used:
- a factor of 3 in the allocation of the L1 bandwidth: only 17 kHz used;
- a factor of 2 in the HLT accept rate: only 150 Hz allocated.
Events are auto-accepted if the processing time exceeds e.g. 600 ms; this saves significant time in MC (probably much more in real data) and keeps events of an unexpected nature.
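The time-out logic is simple to sketch (illustrative, not the CMS implementation): if an event exhausts its CPU budget before the menu finishes, it is accepted outright rather than allowed to stall the farm:

```python
import time

TIMEOUT_S = 0.600  # auto-accept budget per event (600 ms, as above)

def run_menu_with_timeout(event, paths, timeout=TIMEOUT_S):
    """Run the trigger paths until one accepts, time runs out, or all reject."""
    start = time.monotonic()
    for path in paths:
        if time.monotonic() - start > timeout:
            return True  # over budget: keep the event rather than lose it
        if path(event):
            return True  # a path accepted the event
    return False  # all paths finished within budget and rejected the event
```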

Triggering on the unexpected

How does one trigger on the unknown? The general strategy:
1. Start by looking at various physics signals and their signatures.
2. Identify the main backgrounds.
3. Design a trigger using the above information.
4. Estimate the rates and efficiencies.

Alternative signatures

1) Di-lepton, di-jet, and di-photon resonances: Z' (leptons, jets), RS extra dimensions (leptons, photons, jets), Z_KK in TeV^-1 extra dimensions, heavy neutrinos from a right-handed W (di-lepton + di-jets).
2) Single photon + missing E_T: ADD direct graviton emission.

Alternative signatures (continued)

3) Single lepton + jets / missing E_T: W' (lepton + missing E_T), twin Higgs (lepton + jets + missing E_T, involving the heavy partners W_H and t_H).
4) (a) Multi-lepton + multi-jet: technicolor, littlest Higgs, universal extra dimensions.

Alternative signatures (continued)

4) (b) Multi-leptons + photons: universal extra dimensions.
5) Same-sign di-leptons: same-sign top.
6) Black holes: high-multiplicity events, with a jets-to-leptons ratio of 5:1.

Having robust lepton and jet triggers will be crucial! (Cross-channel triggers like leptons + jets are very important too.) Note that many BSM signatures involve 3rd-generation particles (b's and τ's) and also missing E_T; though challenging, triggers for these need to be commissioned at the same time.

[Figure: missing E_T measured at DØ, labeled "NOT SUSY!".]

CMS HLT Trigger Rates

Bread-and-butter triggers for many BSM analyses, at L = 10^32 cm^-2 s^-1 (for the complete trigger list see CERN-LHCC 2007-021, LHCC-G-134):
- μ: 50 Hz
- eγ: 30 Hz
- jets/MET/HT: 30 Hz
- τ: 7 Hz
- b-jets: 10 Hz
- cross-channels: 20 Hz
- prescaled: 15 Hz
- Total: 150 Hz

Similar trigger menus are being designed by ATLAS.

Lepton thresholds and efficiencies

[Figure: efficiency of the e60 trigger vs. the electron p_T, based on a sample of 500 GeV RS G → ee events; signal efficiencies assume a 100% L1 efficiency, at L = 10^31 cm^-2 s^-1.]

Summary

- Triggering at the LHC is a real challenge.
- Sophisticated multi-tiered trigger systems have been designed by ATLAS and CMS.
- Trigger menus for the early physics runs (2008) are being laid out: the tools are in place and the strategies are being optimized. These strategies cover the final states predicted by most BSM models.
- Perhaps the most important strategy? KEEP AN OPEN MIND!

Last Resort Trigger (G. Landsberg, M. Strassler)

The general trigger strategies work, but what if an object fails the standard quality cuts? This is more likely to happen at the HLT, as the L1 quality requirements are, in general, fairly loose. Examples:
- electrons/photons with a large impact parameter, resulting in a funny cluster profile;
- events with an abnormally high multiplicity of relatively soft objects;
- b-tagged jets with an extremely large impact parameter;
- funny tracking patterns in the roads defined by the L1 candidates;
- an abnormally large fraction of L1 triggers fired, with no HLT trigger passing;
- an abnormal density of tracks within the HLT roads.

Last Resort Trigger: proposal

Take advantage of the sequential nature of HLT processing (a sketch follows below):
- Let individual HLT paths set a "weirdness" flag when the event fails the trigger but, in the process, something in the event is found to look fairly strange (e.g. one of the cuts is failed by a very large margin).
- Run the Last Resort HLT filter as the last one in the path. It tries to rescue these weird events by analyzing the weirdness flags set by the individual paths and/or global event properties: it forcefully accepts the event if several such flags are set, and it accepts the event if a large number of L1 triggers fired. The cuts are designed to keep a very low output rate (well below 1 Hz).

The LRT could serve as an early-warning system for weird events, which may indicate a hardware failure or interesting, exotic physics. Designated triggers can then be developed for the particular exotic signatures found by the LRT, without compromising the taking of these data.
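A minimal sketch of the proposal's logic (the thresholds and flag names are illustrative, not taken from the proposal):

```python
MIN_WEIRD_FLAGS = 3   # "several" weirdness flags (illustrative threshold)
MAX_L1_TYPICAL = 20   # a "large number" of fired L1 triggers (illustrative)

def last_resort_filter(weird_flags, n_l1_fired, hlt_accepted):
    """Run after all regular paths: rescue weird events at a very low rate."""
    if hlt_accepted:
        return False  # a regular path already kept the event; nothing to do
    if len(weird_flags) >= MIN_WEIRD_FLAGS:
        return True  # several paths reported something strange: keep it
    if n_l1_fired > MAX_L1_TYPICAL:
        return True  # many L1 triggers fired yet no HLT accept: keep it
    return False

print(last_resort_filter({"egamma_shape", "track_density", "soft_objects"},
                         n_l1_fired=5, hlt_accepted=False))  # True
```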

BACKUP

CMS L1 Trigger Rates

CMS High Level Trigger Rates

CMS Trigger Efficiencies

[Figure: HLT efficiency for benchmark channels, for muons, electrons, and photons.]

For high-E_T EM candidates, apply high E_T cuts and loosen up the isolation. Good W/Z efficiencies are obtained for the muon and e/gamma HLT.

Global or Regional

[Diagram: the detector layers (pixel layer 1, pixel layer 2, Si layer 1, ECAL, HCAL) processed globally vs. regionally.]

- Global processing (e.g. DIGIs to RecHits): process each detector fully, then link the detectors, then make the physics objects.
- Regional processing (e.g. DIGIs to RecHits): process each detector on a "need" basis, linking the detectors as one goes along; the physics objects are the same.