Data acquisition and Trigger (with emphasis on LHC)


Lecture 2: Data acquisition and Trigger (with emphasis on LHC)
Monika Wielers (RAL), Nov 2, 2016

Outline: Introduction. Data handling requirements for LHC. Design issues: architectures; front-end; event selection levels. Trigger. Future evolutions. Conclusion.

DAQ challenges at LHC

Challenge 1 (Physics): rejection power. The requirements for TDAQ are driven by the rejection power required for the search of rare events.
Challenge 2 (Accelerator): bunch-crossing frequency. The highest luminosity is needed for the production of rare events over a wide mass range.
Challenge 3 (Detector): size and data volume. Huge and complex detectors deliver unprecedented data volumes.

Challenge 1: Physics

Cross sections for most processes at the LHC span roughly 10 orders of magnitude. The LHC is a factory for almost everything: t, b, W, Z. But some signatures have small branching ratios (e.g. H→γγ, BR ~10^-3).

Production rates at L = 10^34 cm^-2 s^-1:
  inelastic    ~1 GHz
  bbbar        5 MHz
  W→lν         150 Hz
  Z→ll         15 Hz
  ttbar        10 Hz
  Z′           0.5 Hz
  H(125), SM   0.4 Hz

At L = 10^34 cm^-2 s^-1 the collision rate is ~10^9 Hz, while the selected events are ~1 in 10^13, i.e. ~10^-4 Hz!
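These rates are simply cross section times luminosity. A minimal sketch of the arithmetic (the cross-section values below are approximate, illustrative figures, not taken from the slide):

```python
# Event rate at a collider: rate [Hz] = cross section [cm^2] x luminosity [cm^-2 s^-1]
MB = 1e-27   # 1 millibarn in cm^2
PB = 1e-36   # 1 picobarn in cm^2

lumi = 1e34  # instantaneous luminosity, cm^-2 s^-1

processes = {
    "inelastic pp": 80 * MB,  # ~80 mb -> ~1 GHz seen by the DAQ
    "H(125), SM":   50 * PB,  # ~50 pb -> ~0.5 Hz, the needle in the haystack
}

for name, sigma in processes.items():
    print(f"{name}: {sigma * lumi:.2g} Hz")
```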

Challenge 1: Physics

The requirements for TDAQ are driven by the search for rare events within an overwhelming amount of uninteresting collisions. The main physics aims are to measure the Higgs properties and to search for new particles beyond the Standard Model (SUSY, extra dimensions, new gauge bosons, black holes, etc.), plus many interesting Standard Model studies. All of this must fit in ~1 kHz of data written out to storage. That is not trivial: W→lν alone is 150 Hz at 10^34 cm^-2 s^-1. Good physics can become your enemy!

Challenge 2: Accelerator

Unlike e+e- colliders, proton colliders are messier because of the proton remnants, and there are multiple collisions per bunch crossing: currently ~20-30 overlapping p-p interactions on top of each collision (pile-up), so >1000 particles are seen in the detector!
[Event displays: no pile-up vs. 20 pile-up events]
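The pile-up figure follows from the inelastic cross section, the luminosity and the number of colliding bunches. A back-of-the-envelope check (cross section and bunch parameters are typical values, assumed here for illustration):

```python
# Average pile-up: mu = sigma_inel * L / (filled-bunch crossing rate)
sigma_inel = 80e-27   # ~80 mb inelastic pp cross section, in cm^2 (assumed)
lumi = 1e34           # cm^-2 s^-1
n_bunches = 2808      # nominal number of colliding bunch pairs (assumed)
f_rev = 11245         # LHC revolution frequency, Hz

crossing_rate = n_bunches * f_rev           # ~3.2e7 crossings/s
mu = sigma_inel * lumi / crossing_rate
print(f"mean interactions per crossing: {mu:.0f}")   # ~25, the '20-30 pile-up' regime
```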

Challenge 3: Detector

Besides being huge, the detectors have O(10^6 - 10^8) channels at the LHC, with event sizes of ~1 MB for pp collisions and ~50 MB for Pb-Pb collisions in ALICE. This requires a huge number of connections. Some detectors need more than 25 ns to read out their channels and integrate more than one bunch crossing's worth of information (e.g. the ATLAS LAr readout takes ~400 ns). And it is on-line: you cannot go back and recover events, so the selection must be monitored and all conditions kept under very good control.
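Multiplying rates by event sizes shows why the raw stream cannot simply be stored; a quick sketch using the 40 MHz input and ~100 kHz L1 output rates quoted later in this lecture:

```python
# Data volume if every bunch crossing were read out, vs. after the L1 trigger
event_size = 1e6   # ~1 MB per pp event
bc_rate = 40e6     # 40 MHz bunch-crossing rate
l1_rate = 100e3    # ~100 kHz L1 output rate

print(f"raw:      {event_size * bc_rate / 1e12:.0f} TB/s")  # ~40 TB/s, impossible to store
print(f"after L1: {event_size * l1_rate / 1e9:.0f} GB/s")   # ~100 GB/s, shipped to the HLT
```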

Let's build a Trigger and DAQ for this

What do we need?
- Electronic readout of the sensors of the detectors ("front-end electronics")
- A system to collect the selected data ("DAQ")
- A system to keep all those things in sync ("clock"): data belonging to the same bunch crossing must be processed together, and particle time of flight, cable delays and electronic delays all contribute
- A trigger, multi-level due to the complexity
- A Control System to configure, control and monitor the entire DAQ

Let's look more at the trigger part.

Multi-level trigger system

It is sometimes impossible to take a proper decision in a single place: the decision time is too long, the data sources are too far apart, there are too many inputs. The decision burden is therefore distributed over a hierarchical structure, usually with τ(N+1) >> τ(N) and f(N+1) << f(N): each level takes longer to decide but sees a lower rate. At the DAQ level, proper buffering must be provided for every trigger level, to absorb the latency and to de-randomize the data flow.
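As a worked example of the hierarchy, here are the rejection factors implied by LHC-like rates (illustrative round numbers):

```python
# Rate reduction through a two-level trigger hierarchy
stages = [("collisions", 40e6), ("after L1", 100e3), ("after HLT", 1e3)]  # Hz
for (prev, f_prev), (cur, f_cur) in zip(stages, stages[1:]):
    print(f"{prev} -> {cur}: rejection factor {f_prev / f_cur:.0f}")
print(f"total rejection in rate: {stages[0][1] / stages[-1][1]:.0f}")  # 40000
```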

LHC DAQ phase space

When the LHC experiments were designed back in the 1990s, raw data storage was capped at ~1 PB/year per experiment.

Hardware trigger (L0, L1)

Custom electronics designed to make very fast decisions: Application-Specific Integrated Circuits (ASICs) and Field Programmable Gate Arrays (FPGAs); FPGAs make it possible to change the algorithms after installation. The trigger must cope with the 40 MHz input rate and reduce it to ~100 kHz, otherwise not all events can be processed (and event buffering is expensive, too). Digital or analog custom front-end pipelines hold the data during the L1 processing, with the different inputs processed in parallel as much as possible.

Trigger latency

The latency is the time between the collision and the arrival of the L1 decision back at the front-end. This time determines the depth of the pipeline.
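With the ATLAS Run-2 numbers quoted later in this lecture (2.5 µs latency, 25 ns bunch spacing), the required depth comes out as:

```python
# Front-end pipeline depth = L1 latency / bunch spacing
l1_latency = 2.5e-6    # s
bunch_spacing = 25e-9  # s
print(f"pipeline depth: {l1_latency / bunch_spacing:.0f} bunch crossings")  # ~100 cells/channel
```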

L1 trigger in ATLAS

Calorimeter and muon information only, with simple algorithms running on reduced data granularity. The selection is based on particle type, multiplicities and thresholds, and rejects the bulk of the uninteresting collisions.

ATLAS L1 calorimeter trigger

Example: the ATLAS e/γ trigger. Sum the energy in the calorimeter cells into EM and hadronic towers. Loop over the grid and search in 4x4 towers for a local maximum; the cluster is the most energetic 1x2 (or 2x1) pair of towers. Something similar can be done for other particles (jets, taus), or the energy of all towers can be summed to form missing E_T.
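A toy version of this sliding-window search (a sketch only: the 4x4 window is simplified to a 2x2 core with a local-maximum veto, and the grid size, noise model and threshold are invented):

```python
import numpy as np

def l1_egamma_candidates(em, threshold):
    """Toy sliding-window e/gamma finder on a 2D grid of EM-tower E_T [GeV]."""
    ny, nx = em.shape
    s22 = em[:-1, :-1] + em[:-1, 1:] + em[1:, :-1] + em[1:, 1:]  # all 2x2 tower sums
    candidates = []
    for i in range(1, ny - 2):
        for j in range(1, nx - 2):
            # local-maximum requirement on the 2x2 core suppresses double counting
            if s22[i, j] < s22[i-1:i+2, j-1:j+2].max():
                continue
            core = em[i:i+2, j:j+2]
            # cluster = most energetic 1x2 or 2x1 pair of core towers
            cluster = max(core[0, 0] + core[0, 1], core[1, 0] + core[1, 1],
                          core[0, 0] + core[1, 0], core[0, 1] + core[1, 1])
            if cluster >= threshold:
                candidates.append((i, j, round(float(cluster), 1)))
    return candidates

# Toy event: quiet towers plus one energetic electron-like deposit
rng = np.random.default_rng(0)
em = rng.exponential(0.3, size=(16, 16))
em[7, 7] += 25.0
print(l1_egamma_candidates(em, threshold=20.0))   # one candidate near tower (7, 7)
```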

CMS L1 muon trigger

Central/Global Trigger

Now we have the information on the particle candidates found by L1 in the detector: we know the type, the location and the E_T/p_T threshold passed. We can also look at topological information, e.g. a lepton opposite to missing E_T, or the invariant mass of two leptons. We then need to decide whether the event is of any interest to us, and this decision must be made quickly. The L1 calorimeter, L1 muon and L1 minimum-bias triggers all feed the Central/Global Trigger.
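For topological selections the global trigger only needs the candidates' E_T, η and φ. For example, a dilepton invariant mass can be formed in the massless approximation, m^2 = 2 E_T1 E_T2 (cosh Δη - cos Δφ); a sketch with made-up candidate values:

```python
import math

def dilepton_mass(et1, eta1, phi1, et2, eta2, phi2):
    """Invariant mass of two massless candidates from trigger-level ET, eta, phi."""
    m2 = 2.0 * et1 * et2 * (math.cosh(eta1 - eta2) - math.cos(phi1 - phi2))
    return math.sqrt(max(m2, 0.0))

# e.g. a Z -> ll selection would cut around the value computed here
print(f"m(ll) = {dilepton_mass(45.0, 0.5, 0.1, 46.0, -0.4, 3.0):.1f} GeV")
```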

Software trigger: High Level Trigger (HLT)

L1 selects a large rate (up to 100 kHz) of events that might be interesting. These events are not kept yet, as the rate is still too high for storage, but are sent to the HLT for additional filtering. The HLT is a network-based computer farm (or several) built from commercially available hardware.

HLT example: muons

Muons in CMS: reconstruct and fit tracks using only the muon system; continue if the p_T is sufficient; combine tracker hits with the muon system to improve the p_T measurement; keep the event if the p_T is large enough.
Muons in ATLAS: at Level 2, using detector information from the region around the L1 muon candidate, assign the muon p_T based on fast look-up tables (a sketch of the look-up-table idea follows below); extrapolate to the collision point and find the associated track; check whether the muon is isolated in the tracker and the calorimeters. At L3, refine the selection using offline-based reconstruction and recompute the p_T.
More on the HLT in the next lecture.
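The "fast look-up table" idea amounts to precomputing the p_T-versus-bending relation offline, so that the online step is a single table access instead of a track fit. A minimal sketch (the bending law and binning are an invented toy, not the ATLAS parametrisation):

```python
import numpy as np

# Offline, once: tabulate pT as a function of the measured bending angle.
# Toy monotonic relation: larger pT -> smaller deflection in the magnetic field.
bend_bins = np.linspace(0.001, 0.1, 200)   # bending angle [rad]
pt_table = 0.3 / bend_bins                 # toy pT [GeV] per bin

def lut_pt(bend_angle):
    """O(1) pT assignment: index the precomputed table instead of fitting."""
    idx = np.clip(np.searchsorted(bend_bins, bend_angle), 0, len(pt_table) - 1)
    return pt_table[idx]

print(f"pT ~ {lut_pt(0.015):.1f} GeV")   # ~19-20 GeV for a 15 mrad bend (toy numbers)
```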

High Level Trigger

A massive commercial computer farm: each CPU can process an individual event, or run multi-threaded. Resources are still limited: offline, full reconstruction takes seconds (to minutes), while the online latency budget is ms to s, depending on the input rate. The rate must be reduced to O(1 kHz); note that the output rate is mainly driven by the offline resources (CPU / disk space).
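The farm size follows from the input rate times the processing time per event; with the ATLAS figures quoted on the next slide:

```python
# Cores needed to keep up: cores ~ input rate x average time per event
l1_output = 100e3   # Hz into the HLT
t_event = 0.2       # s average processing time
print(f"~{l1_output * t_event:.0f} cores kept busy on average")   # ~20000 cores
```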

The ATLAS Trigger/DAQ system

Overall Trigger & DAQ architecture: 3 trigger levels. Level-1: 2.5 µs latency, 100 kHz output to DAQ/HLT. HLT: L2 and the Event Filter (EF) run in one farm. Average output rate: ~1 kHz (physics) plus ~2 kHz (calibration/monitoring). Processing time: 0.2 s on average. Average event size: 1.5-2 MB.

ATLAS special features

On-demand event building seeded by Regions of Interest (RoIs): there is no need to analyse the whole event in the HLT, just the regions flagged at L1 (e.g. regions with e/γ, µ, τ or jet candidates); on average only ~5% of the data is looked at. L2 and EF run on the same CPU within one farm (new in 2015): this couples subsequent selection steps efficiently, reducing duplicated CPU usage and network transfers, and allows a flexible combination of fast and detailed processing.

The CMS Trigger/DAQ system

Overall Trigger & DAQ architecture: 2 trigger levels, with DAQ and HLT decoupled via intermediate shared temporary storage. Level-1: 3.2 µs latency, 100 kHz output to DAQ/HLT, with event building at the full L1 rate. Average output rate: ~1 kHz. Average event size: 1.5 MB. Maximum average CPU time: ~160 ms/event.

CMS special features

Two-stage event building! First stage: combine fragments into super-fragments in the RU (Readout Unit) builder; event building happens in the Builder Units, which write the events to transient files on a RAM disk. Second stage: serve the complete events to the trigger farm. DAQ and HLT are thus decoupled via intermediate shared temporary storage (new in 2015).
Data flow: detector front-end → Front-End Readout Optical Link → data concentrator switches → Readout Units → Event Builder switch → Builder Units → Filter Units (HLT).
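Conceptually, event building collects the fragments of a given event ID from all readout units into one record before the HLT sees it. A single-process sketch of that bookkeeping (real systems do this across a switched network; the names and counts here are invented):

```python
from collections import defaultdict

N_SOURCES = 4   # number of readout units (toy value)

class EventBuilder:
    """Collect per-source fragments keyed by event ID; emit complete events."""
    def __init__(self, n_sources):
        self.n_sources = n_sources
        self.pending = defaultdict(dict)   # event_id -> {source: fragment}

    def add_fragment(self, event_id, source, payload):
        self.pending[event_id][source] = payload
        if len(self.pending[event_id]) == self.n_sources:
            return self.pending.pop(event_id)   # complete event, ready for the HLT
        return None   # still waiting for fragments

builder = EventBuilder(N_SOURCES)
for src in range(N_SOURCES):
    event = builder.add_fragment(42, src, f"data-from-RU{src}")
print(event)   # all four fragments of event 42, assembled into one record
```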

The LHCb Trigger/DAQ system

Overall Trigger & DAQ architecture: 3 trigger levels. Level-0: 4 µs latency, 1 MHz output to DAQ/HLT. HLT1: spots displaced high-p_T tracks, output 100-200 kHz. HLT2: full event reconstruction, ~34 (650) ms per event at HLT1 (HLT2). Average output rate: 12.5 kHz. Average event size: 50 kB.

LHCb special features

The HLT is decoupled from the data flow via local temporary storage: using the periods without beam boosts the CPU usage by 200%. Full offline-quality reconstruction is available online; alignments are done at the beginning of each fill, calibrations per run. Turbo stream + Tesla application: store the full information of the trigger candidates and remove most of the detector raw data, saving more than 90% of the space; ideal for very high signal yields (millions) with a very quick turn-around (24 h).

The ALICE Trigger/DAQ system

ALICE has different constraints: a low rate (max 8 kHz Pb-Pb), very large events (>40 MB) and a slow detector (TPC ~100 µs). Overall Trigger & DAQ architecture: 4 trigger levels, 3 hardware-based and 1 software-based. L0-L2: 1.2, 6.5 and 100 µs latency; L3: further rejection and data compression.

ALICE special features

Built to deal with huge events: 3 hardware trigger levels, heavy use of hardware acceleration (FPGA + GPU) and data compression in the trigger.

Towards the future

Experiments upgrade every time the conditions provided by the accelerator change, and preparations start well in advance. The four LHC TDAQ systems are already planning major upgrades: ALICE and LHCb will upgrade for Run 3, while CMS and ATLAS will mainly upgrade for Run 4. The guiding principles are the physics goals, the accelerator conditions, the technology reach and the cost. This is a rapidly evolving area.

Towards the future

ALICE: support for continuous read-out (TPC) as well as triggered read-out; read out the data of all interactions at a maximum rate of 50 kHz (upon a minimum-bias trigger); one common online-offline computing system, O2.
LHCb: triggerless read-out at 40 MHz plus a full software trigger; data centre at the surface.
CMS: hardware-based track trigger.
ATLAS: hardware-based track trigger after the very first trigger level.

Summary

It is a challenge to design an efficient trigger/DAQ system for the LHC: very large collision rates (up to 40 MHz), very large data volumes (tens of MB per collision) and very large rejection factors (>10^5) are needed. This lecture showed the data acquisition used in the LHC experiments and introduced the basic functionality of the trigger. We'll look at the trigger aspects in detail in the next lecture, which will be less technical and more physics-oriented!

Backup

Trigger/DAQ parameters

No. levels   Trigger rate (Hz)            Event size (B)      Readout bandwidth (GB/s)   HLT output, MB/s (events/s)
4 (ALICE)    Pb-Pb 500 / p-p 10^3         5x10^7 / 2x10^6     25                         1250 (10^2) / 200 (10^2)
3 (ATLAS)    LV-1 10^5, LV-2 3x10^3       1.5x10^6            4.5                        300 (2x10^2)
2 (CMS)      LV-1 10^5                    10^6                100                        ~1000 (10^2)
2 (LHCb)     LV-0 10^6                    3.5x10^4            35                         70 (2x10^3)

TDAQ comparison

Data handling requirements