A DAQ readout for the digital HCAL


LC-DET-2004-029

A DAQ readout for the digital HCAL

Jean-Claude Brient
brient@poly.in2p3.fr
Laboratoire Leprince-Ringuet, Ecole Polytechnique, CNRS-IN2P3

Abstract: Considerations on the size of the digital HCAL prototype and on its readout scheme are presented. A new cost estimate is deduced, far below the previous one, which makes the proposal very attractive for the coming construction of a digital HCAL prototype.

1. Introduction

A gaseous detector such as RPCs or GEMs as the active device of a digital hadron calorimeter is one of the possible options. Since a pad size of about 1 cm2 seems to be optimal for the energy resolution [1], the overall number of channels is very large, and the readout electronics is therefore the key point for the cost of such a detector. The original proposal for this readout comes from studies by A. Karar at LLR for the TDR [2], whose main philosophy is to perform the zero suppression as early as possible. A study of the prototype size needed to fulfil the goals of the test beam and a detailed description of the readout are presented.

2. The needed size of the DHCAL prototype

The containment of the hadronic shower in the foreseen prototype is worst for charged pions hitting the prototype perpendicular to the radiator. In order to visualise this worst case, a sample of 1000 charged pions of 100 GeV/c has been simulated within the MOKKA framework [3], at 90 degrees in the middle of the ECAL and HCAL. Figure 1 shows a transverse view of the 1000 events in the sample.

Figure 1: Transverse view of 100 GeV/c pion interactions.
Figure 2: 1000 pion interactions in 1 m. Note the log scale on the energy cell contents; the x,y plane in this figure is the plane perpendicular to the pion direction.

In order to reduce the number of channels by a factor of 2, we define a new size for the prototype of 70x70 cm2. From the distribution of figure 1, we found that for 32% of the events there is absolutely nothing outside this new size. Looking at the number of cells outside, less than 1% of the events have more than 1% of their cell multiplicity outside the new prototype.
Assuming a reasonable duration for the test beam and taking into account the duty cycle of the RPCs, it is clear that the check of the digital HCAL properties against the GEANT4 simulation will be at best at the 1% level; a size of 70x70 cm2 therefore seems appropriate to reduce the prototype cost. A similar study has been done with 10 GeV/c pions; it is shown in figure 3.

Figure 3: Transverse view of 10 GeV/c pion interactions; the square shows the proposed new detector size.

The containment is illustrated by figure 4, which shows the fraction of the shower outside this new detector size.

Figure 4: Fraction of the multiplicity outside the new detector size, for 100 GeV/c pion interactions.

The proposed design is therefore 2 boards of 40x72 cm2 per layer, corresponding to about 230 000 channels for the 40 layers, with a transverse size of 80x72 cm2. Each board corresponds to 45 ASICs.
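The channel count can be cross-checked from the numbers above (a sketch; the 64-channel ASIC granularity is the one described in section 3):

```python
# Channel-count cross-check for the proposed DHCAL prototype.
channels_per_asic = 64      # one VFE ASIC reads 64 pads (section 3)
asics_per_board = 45
boards_per_layer = 2
layers = 40

channels = channels_per_asic * asics_per_board * boards_per_layer * layers
print(channels)             # 230400, i.e. "about 230 000 channels"

# Consistency with the 1 cm2 pad size: an 80x72 cm2 layer has
pads_per_layer = 80 * 72    # 5760 pads per layer
assert pads_per_layer == channels_per_asic * asics_per_board * boards_per_layer
```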

3. The readout scheme

The overall scheme of the electronics readout is based on a VFE ASIC containing the signal pre-amplification, a set of discriminators, and a memory inserted at the level of the ASIC. The control and readout of all the ASICs of a plane is managed by an FPGA containing a VHDL network component, able to communicate both with the ASICs and with a dedicated DAQ computer.

The detailed scheme is the following. The pads are located on a PCB or Kapton plane. The signal goes through strip lines inserted in the board to the VFE, an ASIC reading 64 channels. A schematic view of the ASIC is shown in figure 5 [4]. An external trigger coming from the beam hodoscopes is distributed to all ASICs, starting the readout process. If at least one pad cell is above threshold, the 64 bits of the ASIC are stored in a digital memory together with the content of the trigger counter. In order to study threshold effects, the level of the discriminator can be changed by external command. Another option is to force a storing by external command, even if no pad cell is above threshold. In both cases, the control of the ASIC must come from an external line driven by the DAQ readout system.

In order to determine the memory size needed in the ASIC, the beam structure and the hardware limitations have to be taken into account.

Figure 5: Schematic view of the VFE ASIC (here N = 500).

For the exercise, the FNAL-MTBF spill structure is used. The spill duration is about 0.7 s with one spill per minute; with some work, an increase to 10 spills per minute is possible if needed. For this study, we take a spill duration of 0.7 s and a time between spills of 7 s. On the hardware side, since RPCs in streamer mode are limited to a rate of 300 Hz, there are about 210 interactions per spill which have to be stored in memory.
Naturally, a safety factor leads to a memory size corresponding to about 400 interactions, that is a memory of about 30 Kbits, when taking into account the storage of the trigger number (up to 500).
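The memory sizing can be reproduced as follows (a sketch; the 9-bit width of the trigger-number field is an assumption, chosen as the smallest field that holds numbers up to 500):

```python
# Rough sizing of the on-ASIC memory (section 3 numbers; the 9-bit
# trigger-counter width is an assumption, not stated in the text).
interactions = 400                 # 210 per spill with a ~2x safety factor
payload_bits = 64                  # one bit per pad of the ASIC
trigger_bits = (500).bit_length()  # 9 bits suffice to count up to 500

total_bits = interactions * (payload_bits + trigger_bits)
print(total_bits)                  # 29200 bits, i.e. "about 30 Kbits"
```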

4. The simulation of hadronic interactions

GEANT4-MOKKA has been used to simulate 1000 interactions of 100 GeV/c pions in the ECAL (W-Si) and the 40 layers of the RPC-iron digital HCAL. The number of hits in the DHCAL is small, with a maximum of 920 hits/interaction. Moreover, the transverse dispersion is modest, with a maximum of 18 ASICs/layer with at least one hit. For 10 GeV/c pions, the number of ASICs is smaller, as expected. It must be noted, however, that the number of ASICs to be read per interaction depends on the hadronic shower model used in GEANT4, since different models give different shower widths.

Figure 6: Distribution of the number of VFE ASICs with at least 1 hit, for 100 GeV/c pion (left) and 10 GeV/c pion (right) interactions.

5. DAQ based on FPGAs for RPC active devices

The average number of VFEs with at least one hit is 5 ASICs/layer, with a maximum of 18, which justifies the token-ring structure. Of course these numbers depend on the real hadronic shower extension, which is not known. For 100 GeV pion interactions, the average number of bits per layer is very small, at about 330 bits/interaction/layer. However, US as well as Russian groups have recently measured the RPC multiplicity per MIP track to be about 1.5 hits, leading to 495 bits/interaction/layer. Applying again a safety factor of 2, this translates to about 1 Kbit/interaction/layer. Taking into account the 210 interactions/spill, each FPGA will have to read about 210 Kbits, located in about 4 ASICs, in the time between spills.

The readout could proceed as follows. At the end of the spill, the HCAL DAQ receives the end-of-spill information from the accelerator DAQ. The DAQ then reads each FPGA located on the planes, one after another, in token-ring mode. Similarly, when an FPGA receives the readout order, it reads its ASICs one after another, again in a token ring.
The overall information to be read is about 210 Kbits times 40 layers, that is 8.4 Mbits. With 7 s between spills, this corresponds to a transfer rate of about 1200 Kbits/s. This modest speed allows the cost and complexity of the readout to be reduced, using a single PC working over Ethernet or USB 2.0, with just some development on the reading protocol.
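The between-spill data volume and transfer rate above follow directly from the inputs quoted in this section (a sketch of the arithmetic, with all values taken from the text):

```python
# Between-spill data volume and required transfer rate (section 5).
bits_per_interaction_per_layer = 1_000  # 495 bits with a safety factor of 2
interactions_per_spill = 210            # 300 Hz RPC limit x 0.7 s spill
layers = 40
gap_between_spills_s = 7.0

per_fpga_bits = bits_per_interaction_per_layer * interactions_per_spill
total_bits = per_fpga_bits * layers
rate_kbits_s = total_bits / gap_between_spills_s / 1000

print(per_fpga_bits)    # 210000 bits to read per FPGA
print(total_bits)       # 8400000 bits, i.e. 8.4 Mbits in total
print(rate_kbits_s)     # 1200.0 Kbits/s, well within Ethernet or USB 2.0
```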

6. Running with GEMs

If the RPCs are replaced by GEMs, as proposed by the UTA group, the rate limitation no longer holds. One solution consists in making the local storage of the VFE ASIC accessible continuously to the FPGA. Using, for example, a Xilinx Virtex-II, the clock runs at 420 MHz, and a readout at 50 MHz is not a problem. With one line per VFE chip, we can naively count about 80 clock cycles to read the 45 ASICs of a plane, leading to a rate of about 700 kHz. Even with a safety factor of 10, there is a priori no problem in using the same DAQ to read the GEMs in the test beam. A dedicated FPGA simulation is under way to give a more precise estimate. The only open question is the possible need of an external memory to store the interactions up to the end of the spill; the DAQ would then read this memory directly or through the FPGA.

7. Cost estimate

With the estimated transfer rate, a single FPGA per layer could do the readout. A schematic view is shown in figure 9. The FPGA receives a trigger from an external line (from the beam hodoscopes) and reads the 45 ASICs (with 2 boards and 2 FPGAs per layer), storing the 64 bits of an ASIC only when at least one bit is on. At the end of the spill, the FPGA receives the readout order through a network component (NC). This enables the reading of the ASICs and the transfer of the memory to the NC. The protocol and the end-of-spill flag are managed by the NC, which sends/receives the readout flag from the DAQ PC and frees the token ring at the end of the FPGA readout.

Figure 7: Distribution of the number of bits/interaction/layer for 100 GeV/c pions.
Figure 8: Total number of Kbits to be read between 2 spills, for 100 GeV/c pions.

The FPGA cost estimate is based on Xilinx or ALTERA off-the-shelf FPGAs, while the IP module of the FPGA is estimated at 20 euros/unit, a cost taken from the CMS experiment for their TPG cards for the ECAL. The time needed for this development is estimated at about 2-3 engineer-years.
The corresponding engineer salary, using standard CNRS-IN2P3 values, is therefore estimated to be about 300 K.
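The GEM readout-rate estimate of section 6 can be reproduced from the numbers quoted there (a sketch; the ~80-clock budget per plane scan is the naive count given in the text, and the precise figure awaits the FPGA simulation mentioned above):

```python
# GEM-mode plane-scan rate estimate (section 6 numbers).
readout_clock_hz = 50e6   # 50 MHz readout clock on the Virtex-II
clocks_per_scan = 80      # ~80 clock cycles to read the 45 ASICs of a plane

scan_rate_hz = readout_clock_hz / clocks_per_scan
print(scan_rate_hz)       # 625000.0 Hz, the quoted ~700 kHz scale

# Even with a safety factor of 10 the scan rate stays far above the
# 300 Hz limit that constrained the RPC option.
assert scan_rate_hz / 10 > 300
```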

8. Possible use for the final project

The intrinsic limitation of the RPC counting rate carries over to the ILC situation, where in addition the occupancy is expected to be well below that of a test beam. Therefore, the overall readout for a DHCAL in the full-scale detector can be just a copy of the one proposed here. However, special care and a dedicated study are needed in the low-angle region, where the production of hadrons from photon-photon interactions could be very large.

9. Conclusions

A DAQ for the digital HCAL is proposed. The expected performance looks adequate for RPCs or even GEMs as the active layer, while the cost is very effective. The key point of the project is the local-storage capability of the VFE ASIC under development.

Acknowledgement: Special thanks to D. Decotigny, F. Gastaldi, A. Karar and J-C. Vanel for helpful discussions and expert information on DAQ and digital electronics.

References
[1] V. Zutshi (NIU), presentation at the Montpellier ECFA workshop, 2003.
[2] TESLA Technical Design Report, Part IV, calorimeter section, 2001.
[3] MOKKA simulation package.
[4] A. Karar, presentation at the SNU meeting, Korea, 2000.

Table 1: Cost estimation (not counting salary) for the readout of the prototype.

  ITEM                                    NUMBER   Unit cost   Total per item
  FPGAs (1 per PCB, 2 PCBs per layer)         80         150            12000
  IP module                                   80          40             3200
  PC, cables                                   1        5000             5000
  Optical transfer                             1         800              800
  Misc. + contingency                                                    4000
  TOTAL                                                                 25000

Figure 9: Schematic view of the PCB, the VFE ASIC, the FPGA and the DAQ system. There are 2 PCBs per layer.
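The table totals are internally consistent, as a quick check shows (all values taken from Table 1):

```python
# Consistency check of the Table 1 cost estimate.
items = {
    "FPGAs":            (80, 150),   # (number, unit cost)
    "IP module":        (80, 40),
    "PC, cables":       (1, 5000),
    "Optical transfer": (1, 800),
}
line_totals = {name: n * unit for name, (n, unit) in items.items()}
total = sum(line_totals.values()) + 4000   # plus Misc. + contingency

print(line_totals["FPGAs"])  # 12000
print(total)                 # 25000, matching the quoted TOTAL
```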