Run coordination Summary. C. Gemme, INFN Genova, on behalf of Run Coordination. August 16th, 2016


LHC Cardiogram (fills at 2076b, 2173b, 2172b). Slow dump of the ATLAS Toroid due to an electrical glitch during work on the network; 7h recovery in the interfill. 1. Dipole A31L2 investigations and mitigations. 2. PS vacuum leak.

LHC Cardiogram (fills at 3x3, 2172b, 2220b). 2220b should be the maximum this year.

The problem: two quenches while ramping down RB.A12 from 6 kA at -10 A/s: 10 June 2016 at 547 A, and 3 Aug 2016 at 295 A. The second event triggered detailed investigations; it could be explained by an inter-turn short in dipole A31L2. Follow-up (Wednesday): additional instrumentation, a measurement campaign, various types of cycles evening and overnight. Thursday: analysis of the A31L2 measurements showed no sign of changes in the short. High-current quenches and Fast Power Aborts must be avoided: they would destroy the magnet. Mitigations: remove the global protection mechanism (implemented on Thursday and validated); reduce BLM thresholds (changed on Thursday/Friday); increase QPS thresholds on A31L2 (new board installed on Thursday and successfully validated).

LHC plans. Plan: continue physics with 2220 bunches; slowly increase the bunch intensity up to 1.2e11; target a restricted range of bunch flattening for LHCb (from the current fill: 0.95-1.15 ns → 0.95-1.05 ns). In discussion: decrease the crossing angle from 375 to 300 urad → ~10% more luminosity, with effects on the z-length of the luminous region and on pile-up (see the worked example below); a special fill requested by CMS (low-mu running); remove week 43 from pp running to gain one more week for training two sectors to 7 TeV; a luminosity levelling test already this year.
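The quoted luminosity gain can be checked with the standard geometric reduction factor F = 1/sqrt(1 + phi^2), where phi is the Piwinski angle. A minimal sketch, assuming illustrative 2016-like beam parameters (the sigma_z and sigma* values below are assumptions, not from the talk):

```python
import math

def geometric_factor(full_xing_angle_rad, sigma_z_m, sigma_star_m):
    """Luminosity reduction factor F = 1 / sqrt(1 + phi^2), where
    phi = (theta_c / 2) * sigma_z / sigma_star is the Piwinski angle."""
    phi = (full_xing_angle_rad / 2.0) * sigma_z_m / sigma_star_m
    return 1.0 / math.sqrt(1.0 + phi ** 2)

# Illustrative 2016-like parameters (assumptions, not from the talk):
sigma_z = 0.08        # bunch length, ~8 cm
sigma_star = 11e-6    # transverse beam size at the IP, ~11 um

f_375 = geometric_factor(375e-6, sigma_z, sigma_star)
f_300 = geometric_factor(300e-6, sigma_z, sigma_star)
print(f"luminosity gain: {f_300 / f_375 - 1.0:.1%}")  # ~ +10-15% with these numbers
```

With these assumed parameters the gain comes out in the 10-15% range, consistent with the ~10% quoted on the slide.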

ATLAS Week: fills 5181 (2076b), 5183 (2173b), 5187 (2172b), 5194 (3b), 5197 (2172b), 5198 (2172b), 5199 (2220b). Slow dump of the ATLAS Toroid; cosmic data taking.

TDAQ (Silvia Fressard-Batraneanu, Jeremy Love). Smooth running. A patch for the HLTSV is available to fix occasional resource starvation; it is pending installation and needs at least one run at high rates before being put into production. Problematic Tile ROL fibres: organizing the installation of new spares in collaboration with Tile. Weekly physics efficiency: 94.03% (plot: trigger held by system).

Pixel (Marcello Bindi). IBL timing: not optimal since the TIM replacement after MD; since then, running with 2 BC. Special fill 5183 on Monday with individual bunches: a timing scan of -10 ns/+10 ns was performed and the timing constants applied in the TIM; we recovered most of the hits, but still seem to have a fraction of clusters on tracks from the neighbouring bins (a sketch of the scan logic follows below). The 1 BC on-time hit efficiency is > 99% → move to 1 BC in the next fill. IBL calibration: finally recovered after the sw upgrade during MD; re-tuning needed to compensate for TID effects. New version of the fw for IBL and Layer 2, with debugging capabilities to analyze the TTC-TIM stream. Best week of the year for Pixel in terms of data-taking efficiency → dead-time = 0.41%.
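A hypothetical sketch of how such a scan picks the timing constant: for each tested delay, compare hits landing in the triggered bunch crossing against the neighbouring ones, then keep the delay with the highest in-time fraction (all names and counts below are illustrative, not IBL data):

```python
# delay_ns: (hits_in_time_bc, hits_in_neighbour_bcs) -- made-up numbers
scan = {
    -10: (8200, 1800),
     -5: (9300,  700),
      0: (9905,   95),
     +5: (9600,  400),
    +10: (8900, 1100),
}

def in_time_fraction(counts):
    """Fraction of hits in the triggered BC vs neighbouring BCs."""
    on_time, off_time = counts
    return on_time / (on_time + off_time)

best_delay = max(scan, key=lambda d: in_time_fraction(scan[d]))
print(best_delay, in_time_fraction(scan[best_delay]))  # 0 ns, ~0.99 -> >99% on-time
```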

SCT/TRT (Chiara Debenedetti, A. Bocci, D. Derendarz, A. Romaniouk). SCT: in general very quiet running. In fill 5183, one problematic link was not shown as removable by the shifter and not automatically recoverable. Set the ROD formatter overflow limit to 1600 (greater than the maximum number of hits per link, to protect against non-physical patterns; see the sketch below); deployed on a few RODs in stable beam, OK. TRT: stable data taking. DAQ: also needs replacement of one TTC board that works correctly during data taking but fails in test pulses. FastOR: observed lower rate than expected, caused by the change of the readout mode for the high-threshold bits (from March 2016) from three-bit readout to the single middle bit; may be reverted in a future cosmics run.
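A minimal sketch of what such an overflow guard amounts to, assuming the limit simply caps the per-link hit count (the function and handling below are hypothetical, not the actual ROD firmware logic):

```python
FORMATTER_OVERFLOW_LIMIT = 1600  # above any physical hit count per link

def check_link_hits(n_hits_on_link: int) -> str:
    """Hypothetical guard: a hit count above the limit can only come
    from a non-physical pattern, so the link data is flagged."""
    if n_hits_on_link > FORMATTER_OVERFLOW_LIMIT:
        return "overflow: truncate and flag"
    return "ok"

print(check_link_hits(120))   # ok: typical physical occupancy
print(check_link_hits(4000))  # overflow: non-physical pattern
```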

L1Calo/L1Topo (Kate Whalen). L1Calo: generally a quiet week; monitoring improvements. L1Topo: test of complex deadtime with random triggers; bucket 3 changed from 7/260 → 14/260 → 15/260 (not deployed yet, needs more testing; see the sketch below). Muon items enabled since Saturday morning, fill 5197; total rate ~15 kHz at L1, 60 Hz at the HLT. Timing checks: all algorithms are well-timed.
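The complex deadtime is a leaky-bucket rule; in the X/Y notation above, a setting such as 15/260 roughly means a bucket of 15 trigger credits that drains one credit every 260 bunch crossings. A minimal simulation sketch under that assumed semantics (not the actual CTP firmware):

```python
def leaky_bucket_accepts(trigger_bcs, bucket_size=15, leak_period_bc=260):
    """Simulate a leaky-bucket complex deadtime (assumed semantics):
    each L1 accept adds one credit; one credit drains every
    leak_period_bc bunch crossings; accepts are vetoed while full."""
    level, last_bc, accepted = 0.0, 0, []
    for bc in sorted(trigger_bcs):
        level = max(0.0, level - (bc - last_bc) / leak_period_bc)  # drain
        last_bc = bc
        if level < bucket_size:  # room in the bucket: accept the trigger
            level += 1.0
            accepted.append(bc)
    return accepted

# A burst of 30 triggers in consecutive bunch crossings: about
# bucket_size of them survive, the rest are vetoed by the deadtime.
print(len(leaky_bucket_accepts(range(30))))  # -> 16 with these settings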

LAr (Jose Benitez). HW: an HV module (EMEC A) was exchanged on Sunday due to 4 problematic channels; to be followed up whether the readings were correct. M167.C3 (HEC C) HV decreased due to a new short. HEC A cell 0x3b1a5200 (tower 0x4110400) is disabled at each run; consider permanently disabling this shaper switch, with a reprocessing to take it into account. DQ/Monitoring: online flagging of Mini Noise Bursts (previously offline) under test; added DQMD checks for PU removal, turning yellow if 1-2 PUs are disabled and red if more than 2 (rule sketched below), to complement the information in the Shifter Assistant.
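A minimal sketch of the stated DQMD colour rule for PU removal (the function name and the green default are assumptions):

```python
def pu_removal_flag(n_disabled_pus: int) -> str:
    """DQMD colour per the rule above: yellow for 1-2 disabled PUs,
    red for more than 2 (green when none is an assumption)."""
    if n_disabled_pus == 0:
        return "green"
    if n_disabled_pus <= 2:
        return "yellow"
    return "red"

assert pu_removal_flag(0) == "green"
assert pu_removal_flag(2) == "yellow"
assert pu_removal_flag(3) == "red"
```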

TILE link failure (Silvia Fracchia). Repeated stopless removals of ROL ROD5 EBC33-36 (4 neighbouring modules, an intolerable defect for DQ), starting from Sunday at 21:18 during stable beam, with consequent HLT problems; caused >3 hours of interruption in data taking. Several tests and attempts to recover it (power cycles, TTC restarts, turning off the affected modules). It finally turned out to be a problem with the ROD-ROS link, similar to what occurred on 13 July in LBA. The fibre was eventually replaced with the working one of two spares; the substitute fibre has low optical power due to the fibre end being misaligned in the connector. A reflectometry measurement on Monday spotted a problem in the same location for the downstream links. Emergency plan: restore a spare fibre from the two bad ones. Short-term plan: install a few additional spare fibres (2 per ROD crate).

TILE link failure (Rafał Bielski). With the Tile ROL disabled, any chain trying to read data from Tile was sending events to the debug stream at L1 rate. 21:25: switched to standby keys to mitigate the HLT backpressure for 40 minutes. 22:00: disabled all jet/MET chains. Recovery completed at 01:00.

Muons (Claudio Luci). CSC and MDT: the CSC latency is now set using the ATLAS latency (instead of setting it manually and comparing it to the ATLAS latency). The RCD crash has been fixed; this was also the cause of some failed recoveries. The fake chamber drop reported to CHIP by RCD is under study. RPC: access to the cavern to disconnect a module from an HV channel. TGC: a few other cases under close monitoring and investigation; new Sector Logic firmware was deployed to reduce the L1_MU4 rate.

Other subsystems. Trigger: new sw release deployed on Tuesday → fine; multiple keys deployed during the week, following LHC programs, including overnight. Cosmics run: decrease in the TRT rate wrt earlier cosmic runs due to the change in readout (March), which can be reverted for the next cosmic run. Data preparation: everything is working fine on the data-processing side; the GM infrastructure is working smoothly too; the main point now is to update references. Lucid: running smoothly; the only pending issue is the automatic recovery from all SEU occurrences, ongoing. BCM/BLM: some minor hw interventions done or in progress. DBM: debug and commissioning when there was no LHC running.

Conclusions. In general, smooth data taking with increasing efficiency; the only serious problem was the Tile link failure on Sunday evening. Next week: MD2 and 2.5 km commissioning. 6-7 weeks of pp running left.