LHCb and its electronics
J. Christiansen, on behalf of the LHCb collaboration
LEB 2000, Cracow


Physics background
- CP violation necessary to explain matter dominance.
- B hadron decays are good candidates for studying CP violation.
- B lifetime ~1 ps -> short decay length (few mm).
- 40-400 tracks per event.

LHCb differences from ATLAS/CMS
- ~1/4 the size: budget, physical size, number of collaborators.
- 1.2 million channels in 9 different sub-detectors.
- Particle identification vital -> RICH detectors.
- Vertex resolution vital -> vertex detector in secondary machine vacuum.
- Uses the existing DELPHI cavern: reduced cost, but must adapt to it.
- Open detector with fixed-target topology (easy access, mechanically independent sub-detectors, flexible assembly).
- Forward-angle detector -> high particle density.
- B physics triggering difficult -> 4 trigger levels, with two in the front-end.
- One interaction per ~3 bunch crossings to prevent overlapping events in the same crossing (ATLAS/CMS: factor ~50 higher).
- First-level (L0) trigger rate of 1 MHz (ATLAS/CMS: factor 10-20 lower).
- Consecutive first-level triggers supported (ATLAS/CMS: gap of 3 or more crossings).
- First- and second-level (L0 & L1) trigger buffering in the front-end.

LHCb evolution since LEB 97
- September 1998: LHCb approved.
- General architecture maintained.
- Most detector technologies now defined.
- Key front-end parameters defined:
  - L0 latency: 3 µs -> 4 µs.
  - L1 latency: 50 µs -> 1000 µs (memory is cheap).
  - Buffer-overflow prevention schemes defined.
- Front-end control defined (TTC, partitioning, overflow prevention, etc.).
- Electronics under development.
- Better understanding of the radiation environment (but more work needed).
- L2 and L3 triggers performed on the same physical processor.
- Architecture of the trigger implementations defined.
- Push architecture for the DAQ event-building network maintained.
- Standard interface and data-merger module to the DAQ under design.
- Starting to write TDRs.

LHCb sub-detectors (figure)

LHCb detector in the DELPHI cavern (figure)

Front-end and DAQ architecture
Figure: overview of the front-end and DAQ chain. The recoverable parameters:
- 1.2 million channels feed 4 µs analog or digital L0 pipelines clocked at 40 MHz; the L0 trigger (pile-up, muon, calorimeter) uses pipelined processing and buffering, steered by the clock and L0 derandomizer control.
- L0 derandomizers of 16 events; ~40K analog/digital links into the L1 electronics.
- L1 FIFOs of 1000 events (digital); the L1 vertex trigger reorganizes events (event N, event N+1, ...) at 1 MHz and ~2 GB/s onto ~100 parallel processors.
- Front-end simulated in VHDL; L1 trigger simulated in Ptolemy. Parallel processing in the L1 trigger system; event-pipelined buffering in the front-end.
- After L1 (40 kHz): event buffers of ~10 events and a few hundred links into the event-building network (4 GB/s), with a throttle back to the front-end.
- L2 & L3: parallel processing on ~1000 processors; 200 Hz x 100 KB to storage.

Front-end buffer control
- The readout supervisor contains an L0 derandomizer emulator that vetoes all L0 trigger accepts (1 MHz) that risk overflowing the L0 derandomizers; since every derandomizer follows the same rule, they all stay in the same state as the emulator.
- All L0 derandomizers must comply with the given rule:
  - Minimum depth: 16 events.
  - Maximum readout time: 900 ns = (32 + 4) x 25 ns, i.e. 32 data words plus 4 data tags (Bunch ID, Event ID, etc.) at 40 MHz, followed by data merging (e.g. 32 derandomizers into one stream).
Figure: L0 derandomizer loss (%) versus readout speed (500-1000 ns) for derandomizer depths of 4, 8, 16 and 32 events.
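
The loss curves on this slide follow from queueing behaviour that is easy to reproduce. Below is a toy Monte Carlo of that behaviour: a minimal sketch assuming random L0 accepts at an average 1 MHz on the 40 MHz bunch-crossing clock, and assuming an event frees its buffer slot when its readout starts; all names are illustrative, none of this is LHCb code.

```python
import random

def derandomizer_loss(depth=16, readout_ns=900, accept_prob=1 / 40,
                      n_crossings=5_000_000, seed=1):
    """Fraction of L0 accepts lost to a full derandomizer (toy model).

    Each 25 ns crossing carries an accept with probability 1/40 (~1 MHz);
    reading one event out takes readout_ns. In the real system the
    readout supervisor's emulator vetoes these accepts centrally, so
    nothing is actually lost."""
    random.seed(seed)
    occupancy = 0              # events waiting in the derandomizer
    drain = 0                  # crossings left on the event being read out
    accepted = lost = 0
    for _ in range(n_crossings):
        if drain > 0:
            drain -= 1
        elif occupancy > 0:    # start reading out the next waiting event
            occupancy -= 1
            drain = readout_ns // 25 - 1
        if random.random() < accept_prob:
            if occupancy < depth:
                occupancy += 1
                accepted += 1
            else:
                lost += 1      # buffer full: this trigger would be lost
    return 100 * lost / (accepted + lost)

for depth in (4, 8, 16, 32):
    print(f"depth {depth:2d}: {derandomizer_loss(depth=depth):5.2f} % lost")
```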

Consecutive L0 triggers
- Gaps between L0 triggers would imply ~3% physics loss per gap crossing at a 1 MHz trigger rate: each blocked 25 ns crossing has a 1 MHz / 40 MHz = 2.5% chance of carrying another accept.
- Problematic for detectors that need multiple samples per trigger or detectors with drift time, but all sub-detectors have agreed that this can be handled.
- Maximum of 16 consecutive triggers.
- Very useful for testing, verification, calibration and timing alignment of detectors and their electronics: time alignment, pulse width, baseline shifts.
- A single-interaction-in-a-given-time-window trigger is being considered (simple scintillator detector).
- Use of the single-bunch mode of the LHC machine is being considered.

L1 buffer control
Figure: L1 buffer and its control path. The recoverable parameters:
- Input: 900 ns per event, i.e. 36 words per event (32 data + 4 tags) at 40 MHz.
- The L1 buffer holds a maximum of 1000 events while the L1 (vertex) trigger processes events in order (event N, event N+1, ...) and reorganizes its decisions.
- The readout supervisor keeps an L1 buffer monitor (max 1000 events) and throttles L0 triggers before the buffer can overflow; L1 decisions are spaced by 900 ns and broadcast over the TTC system (400 ns per broadcast).
- On an L1 accept (40 kHz): L1 derandomizer, zero-suppression (< 25 µs), data merging, output buffer, then data to the DAQ.
- "Nearly full" signals at board and DAQ-system level throttle L1 (accept -> reject), with a history trace kept for diagnostics.
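
Because the L1 buffers sit in the front-end, far from the trigger decision path, overflow prevention is centralized: the readout supervisor counts events in and out rather than waiting for "full" flags. A minimal sketch of that bookkeeping follows; the class name and the nearly-full threshold are illustrative assumptions, not LHCb values.

```python
class L1BufferMonitor:
    """Central bookkeeping of L1 buffer occupancy (illustrative sketch).

    Every front-end L1 buffer gains one event per L0 accept and loses one
    per L1 decision issued, so a single counter in the readout supervisor
    can mirror them all."""

    def __init__(self, depth=1000, nearly_full=990):
        self.depth = depth              # max events per L1 buffer
        self.nearly_full = nearly_full  # assumed safety margin
        self.occupancy = 0

    def on_l0_accept(self):
        self.occupancy += 1             # one more event in every L1 buffer

    def on_l1_decision(self):
        if self.occupancy > 0:
            self.occupancy -= 1         # event leaves the L1 buffers

    def throttle_l0(self) -> bool:
        """Veto further L0 accepts while an overflow is possible."""
        return self.occupancy >= self.nearly_full
```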

Readout supervisor
- Main controller of the front-end and of the input to the DAQ.
- Receives L0 and L1 trigger decisions from the trigger systems.
- Restricts triggers to prevent buffer overflows in the front-end, L1 trigger and DAQ: L0 via derandomizer emulation plus throttle, L1 via throttle.
- Generates special triggers: calibration, empty bunch, no bias, etc.
- Resets the front-end.
- Drives the TTC system via a switch, allowing flexible partitioning and debugging: one readout supervisor per partition, with partitioning of both the throttle network and the TTC system.
Figure: readout supervisor block diagram (L0/L1 interfaces, L0 derandomizer emulator, throttles, TTC encoder channels A and B, sequence verification, buffer-size monitoring, resets, LHC interface, monitoring and control via the ECS interface).
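
The L0 derandomizer emulator at the heart of the supervisor lends itself to a compact synchronous model: because every derandomizer obeys the common rule (minimum depth 16, 900 ns readout), one instance can mirror them all and veto accepts centrally. The sketch below is a reconstruction under those two parameters, not the actual supervisor logic.

```python
class L0DerandomizerEmulator:
    """Central emulator of the front-end L0 derandomizers (sketch)."""

    DEPTH = 16                 # minimum derandomizer depth (events)
    READOUT_TICKS = 36         # 900 ns / 25 ns per bunch crossing

    def __init__(self):
        self.occupancy = 0     # mirrored derandomizer occupancy
        self.drain = 0         # crossings left on the event being read out

    def tick(self, l0_accept: bool) -> bool:
        """Advance one 25 ns crossing; return the accept actually
        forwarded to the front-end (False if vetoed or no accept)."""
        if self.drain > 0:
            self.drain -= 1
        elif self.occupancy > 0:
            self.occupancy -= 1
            self.drain = self.READOUT_TICKS - 1
        if l0_accept and self.occupancy < self.DEPTH:
            self.occupancy += 1
            return True
        return False           # accept vetoed to protect the derandomizers
```

Run against the trigger stream, such an emulator guarantees by construction that no compliant derandomizer ever overflows, so losses never have to be handled in the front-end itself.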

DAQ
Figure: DAQ dataflow. The recoverable parameters:
- ~1000 front-end sources, multiplexed into ~100 readout units (front-end multiplexing based on the Readout Unit).
- < 50 MB/s per link into the event-building network (~100 x 100 ports, 4 GB/s aggregate).
- ~100 farm controllers feeding ~1000 processors of 1000 MIPS or more; events then go to storage.
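
The quoted rates are mutually consistent, as a quick back-of-envelope check shows; this sketch takes the 40 kHz L1 accept rate and the 100 KB event size from the accompanying slides.

```python
# Consistency check of the DAQ dataflow numbers (values from the slides).
l1_accept_rate = 40e3     # Hz, events entering the DAQ after L1
event_size = 100e3        # bytes per event
readout_units = 100       # sources into the event-building network
storage_rate = 200        # Hz, events kept after L2 & L3

aggregate = l1_accept_rate * event_size             # into event building
per_link = aggregate / readout_units                # per readout-unit link

print(f"event building: {aggregate / 1e9:.1f} GB/s")          # ~4 GB/s
print(f"per RU link:    {per_link / 1e6:.0f} MB/s (< 50 ok)")  # ~40 MB/s
print(f"to storage:     {storage_rate * event_size / 1e6:.0f} MB/s")
```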

Experiment control system (ECS)
- ECS controls and monitors everything in LHCb:
  - DAQ (partitioning, initializing, start, stop, running, monitoring, etc.).
  - Front-end and trigger systems (initializing, calibration, monitoring, etc.).
  - Traditional slow control (magnet, gas systems, crates, power supplies, etc.).
- Requirements:
  - Based on commercial control software (from JCOP).
  - Gbytes of data to download to front-end, trigger, DAQ, etc.
  - Distributed system with ~one hundred computers/processors.
  - Partitioning into independent sub-systems (commissioning, debugging, running).
  - Support for standard links (Ethernet, CAN, etc.).
Figure: ECS hierarchy spanning the DAQ (farm, readout units), sub-detectors (front-end), trigger, magnet, gas systems and power supplies.

ECS interface to electronics
- No radiation (counting room): Ethernet to a credit-card PC on the modules. Local bus: parallel bus, I2C, JTAG.
- Low-level radiation (cavern): 10 Mbit/s custom serial link over LVDS twisted pair; SEU-immune, antifuse-based FPGA interface chip. Local bus: parallel bus, I2C, JTAG.
- High-level radiation (inside detectors): CCU control system made for the CMS tracker; radiation hard, SEU immune, with bypass. Local bus: parallel bus, I2C, JTAG.
- Support: supply of interface devices (masters and slaves), software drivers, software support.
Figure: Ethernet from master PCs to a credit-card PC and serial slaves, each driving JTAG, I2C and parallel local buses.

Radiation environment
- In detector: 1 krad - 1 Mrad/year. Affects analog front-ends and L0 pipelines (Vertex, Inner tracker, RICH). Repair: a few days to open the detector.
- Edge of detector and nearby cavern: a few hundred rad/year, ~10^10 1 MeV neutrons/cm^2 per year. Affects L0 pipelines, L0 trigger systems, L1 electronics, power supplies? (reliability).
- SEU problems: control flip-flops, memories, FPGAs.
- Access: 1 hour with 24-hour notice. Quick repairs must be possible; remote diagnostics required.
Figure: total dose map inside the experiment (X vs Z, Ecal region).

Electronics in cavern
- Relatively low total dose and relatively low neutron flux -> use of COTS justified.
- But the complex L0 trigger system and the L0 and L1 electronics sit in the cavern -> SEU becomes problematic.
- Worked example for a typical L1 front-end board (assumptions: data memory not considered; 32 FPGAs used for control and zero-suppression; 300 kbit of programming per FPGA, i.e. ~10 Mbit per board; 1000 modules in the total system).
- Hadron flux at the edge of the calorimeter: ~3 x 10^10 particles/cm^2/year, E > 10 MeV.
- Upset rate, using a Xilinx SEU cross-section of ~4 x 10^-15 cm^2 per bit:
  - Module: 3 x 10^10 x 4 x 10^-15 x 10^7 = 1200 per year (once per few hours).
  - System: 1200 x 1000 = 1.2 million per year (a few per minute).
- Recovery only by re-initialization!
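
The arithmetic behind the upset estimate, spelled out; all numerical values are the slide's own, only the variable names are mine.

```python
# SEU estimate for FPGA configuration memory in the cavern.
flux = 3e10             # hadrons / cm^2 / year (E > 10 MeV, calorimeter edge)
cross_section = 4e-15   # cm^2 per configuration bit (assumed, Xilinx)
bits_per_board = 1e7    # 32 FPGAs x ~300 kbit of programming each

upsets_per_board = flux * cross_section * bits_per_board
upsets_system = upsets_per_board * 1000             # 1000 modules in total

print(f"{upsets_per_board:.0f} upsets/board/year")  # 1200 -> one per few hours
print(f"{upsets_system / 1e6:.1f} M upsets/year")   # 1.2 M -> a few per minute
```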

Monitoring
- Errors: assume soft errors from SEU and glitches.
  - All event fragments must contain a Bunch ID and an Event ID, plus the option of two more tags (error flags, checksum, buffer address, etc.).
  - Errors in data are ignored; errors in control are fatal.
- Recovery:
  - All buffer overflows must be detected and signalled (even though the system is designed to prevent them).
  - When merging data, event fragments must be verified to be consistent.
  - Self-checking state machines encouraged (one-hot encoding; see the sketch after this list).
  - Continuous parity checking of setup parameters encouraged.
  - Quick reset of the L0 and L1 front-ends specified, with fast download of front-end parameters.
  - Local recovery considered dangerous.
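
On the one-hot suggestion: a one-hot state register is self-checking because any single-bit upset leaves it with zero or two bits set, which a single invariant catches. A minimal illustration of that invariant, written in Python here rather than in the HDL where such a check would really live:

```python
def one_hot_ok(state: int) -> bool:
    """True iff exactly one bit of the state register is set."""
    return state != 0 and state & (state - 1) == 0

assert one_hot_ok(0b0100)        # a legal one-hot state
assert not one_hot_ok(0b0110)    # SEU set a second bit -> flag and reset
assert not one_hot_ok(0b0000)    # SEU cleared the active bit -> flag and reset
```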

In-situ testing
- All registers must have read-back.
- Never mix event data and system control data.
- Effective remote diagnosis for electronics in the cavern, to enable quick repairs (1 hour): sub-systems, boards, data links, power supplies.
- Use of JTAG boundary scan encouraged (also in-situ).

ASICs
- Needed for the required performance and for acceptable cost (though ASICs are themselves expensive).
- Problematic for time schedules:
  - A 1-year delay in designs can easily accumulate.
  - Time for testing and qualification is often underestimated.
  - The remaining electronics cannot advance before the ASICs are ready.
  - Design errors cannot be corrected by straps.
  - Technologies are quickly phased out in today's market (~5 years).
  - Use of a single supplier is potentially dangerous.
- All sub-detectors rely on one or a few key ASICs.
- ASICs in LHCb: ~10 designs; total volume ~50K; technologies: 4 (0.25 µm CMOS, DMILL, BiCMOS, etc.). Prototypes of most ASICs exist.
- We are a very small and difficult customer that easily risks being put at the bottom of the manufacturers' priority list.

Where are we now
- Progressing towards TDRs over the coming year: long production time -> TDR now; short production time -> later.
- Architecture and parameters of the front-end, trigger and DAQ systems defined.
- Working on prototypes of detectors and electronics.
- Ready to select the ECS system (part of JCOP); standardizing ECS interfaces to front-ends.
- Event-building network of the DAQ not yet chosen: it uses commercial technology, which must be chosen at the latest possible moment to get the highest possible performance at the lowest price (Gigabit Ethernet or the like).

A few implementations
- Vertex vacuum tank (1.5 m).
- Beetle silicon-strip front-end chip in 0.25 µm CMOS, used in 2 (3) LHCb detectors; backup in DMILL (SCTA-VELO).
- Hybrid vertex-detector prototype with SCTA front-end.

RICH detector
- Pixel Hybrid Photon Detector: the pixel chip in 0.25 µm CMOS is a common development with ALICE.
- Critical time schedule, as the chip is integrated into the vacuum tube.
- Backup solution: commercial MAPMTs read out by an analog pipeline chip (Beetle or SCTA-VELO).

Hcal & Ecal
- 40 MHz, 12-bit front-end.
- Readout Unit: data concentration & DAQ interface.

LHCb electronics in numbers
- Sub-detectors: 9
- Channels: 1.2 million
- Trigger levels: 4
- Trigger rates: 1 MHz, 40 kHz, 5 kHz, 200 Hz
- Latencies: 4 µs, 1 ms, 10 ms, 200 ms
- Event size: 100 KByte
- ASICs: 50K in 10 different types
- TTCrx chips: 2000
- Data links: 2000 optical + 40K short-distance analog or LVDS
- 9U modules: 1000 FE + 100 L0 + 100 RU + 50 control
- Racks: 30 cavern, 80 underground counting room, 50 surface (DAQ)
- Processors: 100 L1 + 1000 DAQ + 100 ECS + front-end DSPs

Electronics status

System              | FE architecture                 | Status                                           | TDR
Front-end           | Common definitions              | Architecture and parameters defined              | -
L0 trigger          | Pipelined                       | Architecture defined, simulations                | Early 02
L1 trigger          | Parallel µPs                    | Architecture defined, simulations + prototyping  | Early 02
DAQ                 | Parallel, data push             | Architecture defined, simulations                | Mid 01
Vertex              | Analog readout                  | FE chip prototypes under test                    | -
RICH                | Binary pixel + backup           | FE chip prototype to be tested                   | Sep 00
Inner tracker       | Same as Vertex                  | Defining detector type (substitute for MSGC)     | End 01
Outer tracker       | ASD + TDC                       | Selecting ASD; TDC chip to be tested             | Mid 01
Preshower + E/H cal | Digital 10 bit / Digital 12 bit | FE prototypes tested                             | Sep 00
Muon                | Binary                          | Architecture + FE under study                    | Early 01

Worries in LHCb electronics
- Time schedules of ASICs may easily become critical.
- Correctly quantifying the SEU problem in the LHCb cavern.
- Use of power supplies in the LHCb cavern.
- Support for common projects: TTC, radiation-hard 0.25 µm CMOS, power supplies, ECS framework.
- Limited number of electronics designers available:
  - Limited electronics support available from CERN.
  - Limited number of electronics designers in HEP institutes.
  - Difficult to involve engineering institutes/groups: no funding for HEP electronics; they prefer to work on industrial problems or on specific challenges in electronics.
  - Hard to recruit electronics designers and computer scientists (booming market).
- Qualification/verification of ~10 ASIC designs, tens of hybrids and tens of complicated modules.
- Documentation and maintenance.
- Supply of electronics components expected to become very difficult for small consumers in the coming two years.

Handling electronics in LHCb
- The electronics community in LHCb is sufficiently small that general problems can be discussed openly and decisions can be reached.
- Regular one-week electronics workshop dealing with front-end, trigger, DAQ and ECS.
- Specific electronics meeting (half a day) during LHCb weeks, with no parallel sessions, so that front-end, trigger, DAQ and ECS people can discuss electronics issues together.
- Electronics coordination is part of the technical board.
- It is recognized that electronics is a critical (and complicated, and expensive, and ...) part of the experiment.
- Review policy agreed upon (but not yet used extensively): architecture, key components (ASICs, boards), production readiness.