
LCLS Engineering Specifications Document# 1.1-315, Project Management, Revision 2

LCLS Machine Protection System Engineering Design Specifications

Stephen Norum, Author
Hamid Shoaee, System Manager
David Schultz, Electron Beam Systems Manager
Patrick Krejcik, System Physicist
Darren Marsh, Quality Assurance Manager

Brief Summary: The Machine Protection System (MPS) is an interlock system used to turn off or reduce the rate of the beam in response to fault conditions that may either damage or cause unwanted activation of machine parts.

Change History Log

  Revision | Revision Date     | Sections Affected | Description of Change
  ---------+-------------------+-------------------+-------------------------
  000      | November 8, 2006  | All               | Initial version
  001      | January 5, 2007   | All               | Updated hardware design
  002      | December 10, 2007 | All               | DRAFT

This document has been formatted for two-sided (duplex) printing.

Contents

1 Introduction
2 Requirements
  2.1 Protection
  2.2 Timing
  2.3 Configurability
  2.4 Automatic Recovery
  2.5 User Interface and Diagnostics
  2.6 Physical
3 Overview
4 Inputs
  4.1 Fault Signals
    4.1.1 Multi-Bit States
    4.1.2 Latching
    4.1.3 Logic Levels
    4.1.4 Connection to Link Nodes
  4.2 Uploading MPS Algorithm to the Link Processor
  4.3 EPICS
  4.4 Event Generator
  4.5 Chatter Faults
5 Outputs
  5.1 Mitigation Devices
  5.2 Laser Light Mitigation
    5.2.1 Pockels Cell
    5.2.2 Injector Mechanical Shutter
    5.2.3 Beam Containment System
  5.3 Electron Beam Mitigation: Single Bunch Beam Dumper
  5.4 EPICS
  5.5 Event Generator
6 Link Communication
  6.1 MPS Link Protocol
  6.2 MPS Link Protocol Message Types
    6.2.1 Link Synchronization
    6.2.2 Link Node Status
    6.2.3 Link Node Unlatch
    6.2.4 Link Node Output Control
  6.3 MPS Link Protocol Timing
    6.3.1 Link Synchronization and Link Node Status
7 Logic
  7.1 Rate Limiting
  7.2 Automatic Recovery
  7.3 MPS Algorithm
  7.4 Bypassing Device Faults
  7.5 Setting and Changing Device Fault Thresholds
8 Hardware Overview
  8.1 Link Processor
  8.2 Link Node
9 Link Node
  9.1 Digital Output
  9.2 Digital Input
    9.2.1 Response Time
10 Naming
  10.1 PV Names
11 Testing
  11.1 MPS Algorithm Testing
    11.1.1 Simulate Mode
  11.2 Gigabit Ethernet
  11.3 Device Configuration
12 User Interface

Appendices

A List of Acronyms
B Link Node FPGA Registers
  B.1 Latched Input Card State Registers
    B.1.1 Latched State of Input Cards 0 and 1
    B.1.2 Latched State of Input Cards 2 and 3
    B.1.3 Latched State of Input Cards 4 and 5
  B.2 Current Input Card State Registers
    B.2.1 Current State of Input Cards 0 and 1
    B.2.2 Current State of Input Cards 2 and 3
    B.2.3 Current State of Input Cards 4 and 5
  B.3 Output Card Registers
    B.3.1 Output Card Control
    B.3.2 Output Card Status
  B.4 Input Card Channel Debounce Registers
    B.4.1 Debounce Times of Input Card 0 Channels 0 through 7
    B.4.2 Debounce Times of Input Card 0 Channels 8 through 15
    B.4.3 Debounce Times of Input Card 1 Channels 0 through 7
    B.4.4 Debounce Times of Input Card 1 Channels 8 through 15
    B.4.5 Debounce Times of Input Card 2 Channels 0 through 7
    B.4.6 Debounce Times of Input Card 2 Channels 8 through 15
    B.4.7 Debounce Times of Input Card 3 Channels 0 through 7
    B.4.8 Debounce Times of Input Card 3 Channels 8 through 15
    B.4.9 Debounce Times of Input Card 4 Channels 0 through 7
    B.4.10 Debounce Times of Input Card 4 Channels 8 through 15
    B.4.11 Debounce Times of Input Card 5 Channels 0 through 7
    B.4.12 Debounce Times of Input Card 5 Channels 8 through 15
    B.4.13 Debounce Times of Input Card 6 Channels 0 through 7
    B.4.14 Debounce Times of Input Card 6 Channels 8 through 15
    B.4.15 Debounce Times of Input Card 7 Channels 0 through 7
    B.4.16 Debounce Times of Input Card 7 Channels 8 through 15
  B.5 Timestamp and Timing Registers
    B.5.1 Current Timeslot
    B.5.2 Current Timestamp Seconds Past Epoch
    B.5.3 Current Timestamp Nanoseconds and Pulse ID
C Uniblitz Shutter Electronic Synchronization System


1 Introduction

This document discusses the design of the Machine Protection System (MPS). The MPS is an interlock system that turns off or reduces the rate of the beam in response to fault conditions that may either damage or cause unwanted activation of machine parts. The three active devices in the LCLS MPS are an abort kicker that is able to kick a single electron bunch, and a Pockels cell and mechanical shutter that block the laser light.

The MPS can only reduce the beam rate below the operator's requested beam rate; it cannot raise the beam rate above the operator's requested rate. Separate systems support the MPS by protecting other energized devices such as power supplies, magnets, and klystrons. A separate beam containment system uses redundant hardware to ensure that no beam or radiation reaches potentially occupied areas. An interim MPS has been developed for early installation in 2006 to provide machine protection during injector commissioning.

System requirements are found in ESD 1.1-312, LCLS Machine Protection System Requirements, at http://www-ssrl.slac.stanford.edu/lcls/prd/1.1-312-r1.pdf. Links to many MPS-related documents, including a list of MPS input devices and rate limiting conditions, are kept online at https://sharepoint.slac.stanford.edu/ and http://lcls-dev.slac.stanford.edu/tiki-index.php?page=mps.


2 Requirements

The primary goal of the MPS is to limit any beam loss near the permanent magnet material of the undulator to a reasonable level. The MPS must also protect other sections of the LCLS from radiation damage from the beam. The MPS must react to a fault before the next beam pulse. The LCLS has a nominal rate of 120 Hz, giving the MPS 8.33 ms to react to a fault.

2.1 Protection

1. The MPS must limit the undulator's radiation dosage to below a specified amount.
2. Beamline components are to be protected from excessive beam exposure to prevent damage to the vacuum system and unnecessary activation.

2.2 Timing

1. Must protect the machine in less than 8.33 ms.
2. Mitigation devices can limit the beam rate from 120 Hz to 0 Hz, including the intermediate values 60 Hz, 30 Hz, 10 Hz, and 1 Hz, as well as the single shot and burst modes.

2.3 Configurability

1. Must be able to change the configuration of the logic and beam rate.
2. Add and remove inputs to the MPS.
3. Bypass device fault inputs to the MPS.
4. Set and change fault thresholds.
5. Understand machine modes.

2.4 Automatic Recovery

1. After a fault is corrected, the MPS will have the ability to raise the beam rate to the rate in effect before the fault.

2. The MPS is configured for which inputs allow automatic recovery and when it is to be used.

2.5 User Interface and Diagnostics

1. A user interface will provide diagnostics and status information for the MPS, including the cause of an MPS trip.
2. When a trip occurs, the MPS uses the timing system to signal devices to stop their circular data buffers for use in postmortem analysis.

2.6 Physical

1. Size: Must fit within a standard instrument rack.
2. Weight: Must be supported by a standard instrument rack.
3. Temperature: 0 °C to 50 °C.

3 Overview

The MPS determines the maximum allowed beam rate by processing MPS device fault input signals with a rate limiting algorithm. The algorithm, run on a piece of hardware called the Link Processor, obtains the state of device faults from Link Nodes. Placed periodically along the LCLS beam line, Link Nodes gather the fault status from multiple MPS devices such as Protection Ion Chambers (PICs) and limit switches. Figure 3.1 shows an overview of the Link Node and Link Processor inputs and outputs.

[Figure 3.1. Overview of Link Processor and Link Node Signals: the MPS Link Processor connects to the timing system and EPICS, and to the MPS Link Nodes through a 1000BASE-X switch over Gigabit Ethernet (Cat6 and fiber); each Link Node connects to devices and mitigation devices with digital I/O over copper.]

The Link Processor is composed of an MVME 6100 loaded with an Event Receiver (EVR) for timing system communication. Link Nodes are a rackmount SLAC design and feature pluggable input and output cards. The Link Processor and Link Nodes communicate over a dedicated Gigabit Ethernet network to share status and control data, and also have Input Output Controller (IOC) functionality to communicate with other devices on the control system network.


4 Inputs

4.1 Fault Signals

The MPS monitors binary fault signals provided by external devices. Each signal can represent two states; groups of signals can be combined to represent many states. Devices, such as PICs, that fault when thresholds are exceeded use hardware external to the Link Nodes to convert faults into binary signals before they are read by the Link Nodes. Slow changing devices (like magnets) are able to provide their fault status to the Link Processor through EPICS.

4.1.1 Multi-Bit States

Multiple signals can be combined to provide more states for a device. For example, many moving devices provide two limit switch inputs to the MPS for a single device. The two limit switches, each providing a binary state, are combined by the MPS to create four states. An example of a device with two limit switches providing two signals and multiple states is shown in Table 4.1.

Table 4.1. Example Multi-Bit States for a Moving Device

  In Limit Switch | Out Limit Switch | State
  ----------------+------------------+----------------
        0         |        0         | Moving
        0         |        1         | Device is Out
        1         |        0         | Device is In
        1         |        1         | Invalid/Broken

4.1.2 Latching

To avoid missing a quickly changing fault input, faults can be latched on rising or falling edges in the following ways:

1. Not latched.
2. Latched, cleared on read. Equivalent to a Latching Digital Input Module (LDIM).
3. Latched, cleared on an EPICS IOC originating reset command.
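As an illustration, the two-bit decode of Table 4.1 can be written as a small lookup; the function and state strings below are illustrative, not taken from the MPS implementation:

```python
def limit_switch_state(in_limit: int, out_limit: int) -> str:
    """Combine two limit-switch bits into one of the four device states
    of Table 4.1."""
    states = {
        (0, 0): "Moving",
        (0, 1): "Device is Out",
        (1, 0): "Device is In",
        (1, 1): "Invalid/Broken",
    }
    return states[(in_limit, out_limit)]
```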

4.1.3 Logic Levels

The MPS continues the use of the 0/24 V voltage range currently used by many MPS devices.

Table 4.2. Logic Levels

  Voltage (V) | Logic State
  ------------+------------
       0      |      0
      24      |      1

4.1.4 Connection to Link Nodes

Device fault signals are grouped geographically and input to the nearest Link Node's terminal block. Periodically placing Link Nodes down the beam line keeps cable lengths short and devices grouped logically.

4.2 Uploading MPS Algorithm to the Link Processor

The MPS algorithm is loaded as part of the Link Processor's IOC executable using the standard IOC booting method.

4.3 EPICS

The Link Processor communicates with other IOCs to gather fault information for slow devices (such as magnets) and provides the user with MPS status and control. The Link Processor and Link Nodes use watchdogs that cause the MPS to rate limit the beam to 0 Hz if fresh EPICS data is not available after a specified period of time.

4.4 Event Generator

The Link Processor uses an EVR to verify the MPS information it sends to the Event Generator (EVG) (see Section 5.5).
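The watchdog behavior of Section 4.3 amounts to a staleness check on the last EPICS update; a minimal sketch, with a hypothetical timeout value (the spec says only "a specified period of time"):

```python
EPICS_WATCHDOG_TIMEOUT = 1.0  # seconds; hypothetical value, not from the spec

def allowed_rate_hz(requested_rate_hz: float, last_epics_update_s: float,
                    now_s: float) -> float:
    """Force the beam rate to 0 Hz when fresh EPICS data has not arrived
    within the watchdog timeout; otherwise pass the requested rate through."""
    if now_s - last_epics_update_s > EPICS_WATCHDOG_TIMEOUT:
        return 0.0  # watchdog expired: rate limit to 0 Hz
    return requested_rate_hz
```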

4.5 Chatter Faults

Faults that quickly and constantly switch from true to false will be ignored for a specified amount of time.
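A chatter filter of this kind can be sketched as a transition counter over a sliding window; all parameter values and names below are hypothetical, since the spec specifies only the ignore behavior:

```python
from collections import deque

class ChatterFilter:
    """Ignore a fault input for hold_off seconds once it toggles too often.

    max_transitions, window, and hold_off are illustrative parameters;
    the spec only says chattering faults are ignored for a specified time.
    """
    def __init__(self, max_transitions=4, window=1.0, hold_off=10.0):
        self.max_transitions = max_transitions
        self.window = window
        self.hold_off = hold_off
        self.edges = deque()       # timestamps of recent transitions
        self.ignore_until = 0.0
        self.last = None

    def update(self, value: int, now: float) -> bool:
        """Record a new sample; return True if the input should be ignored."""
        if self.last is not None and value != self.last:
            self.edges.append(now)
        self.last = value
        # Drop transitions that have aged out of the window.
        while self.edges and now - self.edges[0] > self.window:
            self.edges.popleft()
        if len(self.edges) >= self.max_transitions:
            self.ignore_until = now + self.hold_off
        return now < self.ignore_until
```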


5 Outputs

Output signals refer to signals that are sent from the Link Processor and Link Nodes. Figure 5.1 shows an overview of the LCLS beam line and the location of the mitigation devices.

[Figure 5.1. Overview of LCLS Beam Line and Mitigation Devices, showing the gun, BX01, BX02, Linac-X, BC1, TD11, BC2, 50B1, D10 Dump, the BX3_ energy collimator, the SBBD (BYKIK), the BX3_ collimators, TDUND, the undulator (34 sections), BYD, the photon section, the BXS energy spectrometer dump, and the main dump.]

5.1 Mitigation Devices

The Link Processor sends commands to local Link Nodes to control the following mitigation devices:

1. Pockels cell.
2. Injector mechanical shutter.
3. Single Bunch Beam Dumper.

The performance of the Single Bunch Beam Dumper (SBBD) is monitored on every pulse. If it fails to meet the MPS requirements, the MPS will use the Pockels cell and injector mechanical shutter to limit beam. In the case that the MPS does not successfully mitigate a fault, the Beam Containment System (BCS) will use a mechanical shutter to block light on the cathode (see Section 5.2.3).

5.2 Laser Light Mitigation

The MPS can stop laser light with either the Pockels cell or the injector mechanical shutter. The most upstream device is the Pockels cell. The Pockels cell blocks light to both the injector cathode and the virtual cathode, whose image is used for feedback. The injector mechanical shutter is further downstream and blocks light to the injector cathode only.

[Figure 5.2. Path of Laser Light and Mitigation Devices, showing the compressor and tripler system (IR to UV), the Pockels cell, the BCS mechanical shutter, the injector mechanical shutter, the splitter, the virtual cathode and joule meter, the injector mirror, and the injector cathode.]

5.2.1 Pockels Cell

The Pockels cell gates the injector laser. It gates the beam from 120 Hz down to 10 Hz; lower rates are gated by the injector mechanical shutter. To ensure the virtual cathode has sufficient data for feedback, the Pockels cell does not rate limit the beam below 10 Hz.

5.2.2 Injector Mechanical Shutter

The injector mechanical shutter gates the injector laser. It is used to gate a 10 Hz or less frequent beam to 1 Hz or 0 Hz. The injector mechanical shutter is also used to gate a single pulse when the machine is running in Single Shot mode. To avoid wear on the shutter, the shutter only gates beam at a maximum rate of 1 Hz. The shutter is configured to be normally closed. The shutter controller uses a TTL gate input signal provided by the MPS as shown in Table 5.1.

5.2.3 Beam Containment System

The injector mechanical shutter control signal is checked against a shutter readback signal provided by the shutter's Electronic Synchronization System (ESS) (see Appendix C). A shutter fault occurs when the shutter control does not match the shutter's readback signal from the ESS. See Table 5.2.

Table 5.1. Injector Mechanical Shutter Control Signals

  Beam Rate, f_b (Hz) | Output (TTL)
  --------------------+------------------------------------
       f_b > 1        | High
       f_b = 1        | 1 Hz trigger with < 0.1 s high-time
       f_b = 0        | Low

Table 5.2. Shutter Fault Logic

  Shutter Open | Shutter Readback | Shutter OK
  -------------+------------------+-----------
        0      |        0         |     1
        0      |        1         |     0
        1      |        0         |     0
        1      |        1         |     1

The MPS provides a dry contact for the BCS to search. The contact is closed while the shutter is operating correctly and open when the shutter has failed, as shown in Figure 5.3. When the relay opens, the BCS blocks light on both the injector and virtual cathode by closing its own mechanical shutter. Note that a shutter control fault is a fault of the MPS itself and is not a BCS fault.

[Figure 5.3. MPS Shutter Fault Output to BCS: the Shutter Open and Shutter Readback signals are XNORed to form the Shutter OK signal, which drives a relay searched by the BCS.]

5.3 Electron Beam Mitigation: Single Bunch Beam Dumper

The SBBD is used to kick beam in the Linac To Undulator (LTU) section. It can rate limit the beam from 120 Hz to 0 Hz while the linac is kept at a constant rate. The SBBD is triggered at 120 Hz. The trigger is moved from standby time to beam time in order to abort the beam. The amplitude of the SBBD's current is monitored on every pulse. If the amplitude does not fall within a specified range at trigger time, the SBBD has failed and the MPS uses the Pockels cell and mechanical shutter to continue rate limiting at the correct rate. The MPS continues to trigger the SBBD after an SBBD fault. If the SBBD begins to operate correctly, the MPS returns to using the SBBD as the mitigation device.

The SBBD trigger is created following the logic shown in Figure 5.4. The user may request to fire the SBBD at beam time (abort) or after beam time (standby). If the MPS chooses to abort the beam, the Link Processor does not transmit the MPS Permit signal to the local Link Node. The inverse of this signal is ANDed with an always present Abort trigger generated by an EVR. This signal is ORed with the user request to create a trigger for the SBBD. The SBBD ignores standby triggers that follow an abort trigger. Table 5.3 details which trigger is used by the SBBD.

[Figure 5.4. SBBD Control Logic: combinational logic of the EVR Standby and Abort triggers, the MPS Permit signal, and the user's request (IOC) producing the trigger to the SBBD.]

Table 5.3. Trigger Used by SBBD

  MPS Permit | User's Request | Trigger Used by SBBD
  -----------+----------------+---------------------
       0     |       0        | Abort
       0     |       1        | Abort
       1     |       0        | Abort
       1     |       1        | Standby

5.4 EPICS

MPS device fault signal status and configuration checks will be available through EPICS. The Link Processor provides Process Variables (PVs) with information acquired from the MPS hardware, including the Link Nodes and itself.
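Tables 5.2 and 5.3 are small truth tables and can be captured directly in code; a sketch with illustrative function names:

```python
def shutter_ok(shutter_open: int, shutter_readback: int) -> int:
    """Table 5.2: XNOR of control and readback; OK (1) only when they agree."""
    return 1 if shutter_open == shutter_readback else 0

def sbbd_trigger(mps_permit: int, users_request: int) -> str:
    """Table 5.3: the standby trigger is used only when beam is permitted
    and the user requested standby firing; otherwise the abort trigger."""
    return "Standby" if (mps_permit and users_request) else "Abort"
```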

5.5 Event Generator

The MPS provides eight bits of information to the EVG, as recorded in Figure 5.5 and Table 5.4.

[Figure 5.5. MPS Data Sent to EVG: bit 0 is the Beam Permit, bits 1 through 3 are the Rate Limit, and bits 4 through 7 are the Beam Destination.]

Table 5.4. Layout of MPS Data Given to EVG

  Purpose          | Bit Pattern | State Value | Description of State
  -----------------+-------------+-------------+--------------------------------
  Beam Permit      | -------0    |      0      | Beam Not Permitted
                   | -------1    |      1      | Beam Permitted
  Rate Limit       | ----000-    |      0      | Communication Failure
                   | ----001-    |      1      | 0 Hz
                   | ----010-    |      2      | 1 Hz
                   | ----011-    |      3      | 10 Hz
                   | ----100-    |      4      | 30 Hz
                   | ----101-    |      5      | 60 Hz
                   | ----110-    |      6      | 120 Hz
                   | ----111-    |      7      | Undefined
  Beam Destination | 0000----    |      0      | No Beam Destination Specified
                   | 0001----    |      1      | Gun Spectrometer Dump
                   | 0010----    |      2      | Straight Ahead Beam Dump (SAB)
                   | 0011----    |      3      | Insertable Tune Up Dump (TD11)
                   | 0100----    |      4      | Insertable Tune Up Dump (TD21)
                   | 0101----    |      5      | D10 Dump
                   | 0110----    |      6      | Single Bunch Beam Dumper (SBBD)
                   | 0111----    |      7      | Insertable Tune Up Dump (TDUND)
                   | 1000----    |      8      | Main Dump
                   | 1111----    |     15      | Experimental Huts
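Following the bit layout of Figure 5.5 and Table 5.4 (bit 0: Beam Permit, bits 1 through 3: Rate Limit, bits 4 through 7: Beam Destination), the status byte can be packed and unpacked as, for example:

```python
# Beam rate in Hz -> 3-bit rate limit code (code 0 is a communication failure).
RATE_CODES = {0: 1, 1: 2, 10: 3, 30: 4, 60: 5, 120: 6}

def pack_evg_byte(permit: int, rate_code: int, destination: int) -> int:
    """Pack the 8 bits of MPS data sent to the EVG (Table 5.4)."""
    return (permit & 0x1) | ((rate_code & 0x7) << 1) | ((destination & 0xF) << 4)

def unpack_evg_byte(b: int):
    """Return (permit, rate_code, destination) from a packed status byte."""
    return b & 0x1, (b >> 1) & 0x7, (b >> 4) & 0xF
```

For example, beam permitted at 120 Hz (rate code 6) to the Main Dump (destination 8) packs to 0x8D.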

6 Link Communication

Link Nodes and the Link Processor communicate using the protocols shown in Table 6.1. The link, network, and transport layers encapsulate the custom application layer protocol, the MPS Link Protocol (MLP).

Table 6.1. MPS 2007 Link Communication Protocols

  Layer       | Protocol
  ------------+------------------
  Link        | Gigabit Ethernet
  Network     | IPv4
  Transport   | UDP
  Application | MLP

6.1 MPS Link Protocol

MLP defines a set of header bits and a data section. The data section has a maximum payload of 1466 bytes to ensure the entire message fits within a single standard Ethernet frame. An MLP frame has the following structure and is shown graphically in Figure 6.1.

Protocol Version (1 byte) - The current protocol version is placed in this field. A zero rate fault is created when an unexpected protocol version is received.

Message Type (1 byte) - This field informs the receiver of the data format of the following data field.

Data (0 to 1466 bytes) - The data field has a maximum payload of 1466 bytes to ensure the MLP datagram is not split into multiple Ethernet frames.

[Figure 6.1. MPS Link Protocol Datagram (sizes in bytes): an Ethernet frame of 72 to 1526 bytes (22 bytes of Ethernet framing, a 46 to 1500 byte payload, and a 4-byte CRC) carries the IPv4 header (24 bytes), the UDP header (8 bytes), and the MLP message of 2 to 1468 bytes: a 1-byte Protocol Version, a 1-byte Message Type, and 0 to 1466 bytes of Data.]

6.2 MPS Link Protocol Message Types

The MLP defines the following message types. Message sizes do not include the Ethernet, IPv4, UDP, or MLP headers.

6.2.1 Link Synchronization

The Link Processor sends this broadcast message to all Link Nodes on update of the timestamp from the timing system (360 Hz). The synchronization message is also used as a request for Link Node status.

Details:
  Message Type Value: 0x51
  Broadcast Type: Broadcast, Link Processor to Link Nodes
  Message Size: 11 bytes

Structure:
  Timestamp Seconds Past Epoch (4 bytes) - The current timestamp's seconds past epoch field.
  Timestamp Nanoseconds and Pulse ID (4 bytes) - The current timestamp's nanoseconds past second and the Pulse ID.
  Timeslot (1 byte) - The current timeslot.
  Data For EVG (2 bytes) - Synchronization data for the EVG, including the beam rates for each destination and the beam's current destination.

6.2.2 Link Node Status

Upon receipt of a Link Synchronization message (Section 6.2.1), each Link Node sends the current status of its input bits directly to the Link Processor.

Details:
  Message Type Value: 0xAC
  Broadcast Type: Unicast, Link Node to Link Processor
  Message Size: 26 bytes

Structure:
  Timestamp Seconds Past Epoch (4 bytes) - The last received timestamp's seconds past epoch field.
  Timestamp Nanoseconds and Pulse ID (4 bytes) - The last received timestamp's nanoseconds past second and the Pulse ID.
  Timeslot (1 byte) - The last received timeslot.
  Link Node ID (1 byte) - The Link Node's ID number.

  Input Card Status (12 bytes) - The latched state of all 96 Link Node inputs. Unused inputs are included in this field.
  Output Card Status (2 bytes) - The current state of the Link Node's 8 outputs, 4 trigger inputs, and 4 trigger outputs.
  Link Node Status (2 bytes) - The latched state of internal Link Node errors.

6.2.3 Link Node Unlatch

After the Link Processor has handled the device status information from a Link Node Status message (Section 6.2.2), it sends a Link Node Unlatch message back to the Link Node. This message informs the Link Node of which faults the Link Processor has received and which can therefore be unlatched. This message mirrors the Link Node Status message: after processing a Link Node Status message, the Link Processor can change the Message Type value of the message to 0x71 and send the message back to the Link Node as a Link Node Unlatch message.

Details:
  Message Type Value: 0x71
  Broadcast Type: Unicast, Link Processor to Link Node
  Message Size: 26 bytes

Structure:
  Unused (9 bytes) - Unused.
  Link Node ID (1 byte) - The Link Node's ID number.
  Device Status (12 bytes) - The device states from the last received Link Node Status message.
  Unused (2 bytes) - Unused.
  Link Node Status (2 bytes) - The internal Link Node error states from the last received Link Node Status message.

6.2.4 Link Node Output Control

The Link Processor sends this control message as a broadcast to all Link Nodes. The message supports only 64 Link Nodes. If more Link Nodes need to be addressed, multiple Link Node Output Control messages can be sent, or the message can be sent as unicast.

Details:
  Message Type Value: 0x61
  Broadcast Type: Broadcast, Link Processor to Link Nodes
  Message Size: 256 bytes

Structure:
  Output States and Triggers (256 bytes) - A 64 element array with the element format:
    1 byte: Link Node ID.
    2 bytes: Output states, two bits per output. See Table 6.2.
    1 byte: Trigger outputs, one bit per trigger, using the four least significant bits of the byte to control the four triggers.

Table 6.2. Output State Value Effects

  Value (binary) | Effect On Output
  ---------------+----------------------
        00       | No change
        01       | Set output state to 0
        10       | Set output state to 1
        11       | No change

6.3 MPS Link Protocol Timing

Synchronization of the Link Processor and Link Nodes occurs at 360 Hz. Message latency can be calculated from the values listed in Table 6.3. The maximum fiber propagation time, t_pmax, is

  t_pmax = l_max × n / c = (1.5 km × 1.4677) / c = 7.3436 µs

Table 6.3. Symbol Descriptions and Values

  Description                          | Symbol | Value
  -------------------------------------+--------+--------------
  Bit Rate                             | R      | 1.125 Gbit/s
  Max Datagram Size                    | d_max  | 1526 bytes
  Encoding Overhead                    | E      | 10/8
  Fiber Index of Refraction at 1310 nm | n      | 1.4677
  Maximum Fiber Length in MPS          | l_max  | 1.5 km
  Number of Link Nodes                 | N_LN   | 30

The time to send out a full datagram, t_fd, is

  t_fd = d_max × E / R = (1526 bytes × 8 bit/byte × 10/8) / (1.125 Gbit/s) = 13.564 µs

and the time to serially receive full datagrams from all nodes, all from the maximum distance, t_worst, is

  t_worst = N_LN × (t_pmax + t_fd)
          = 30 × (7.3436 µs + 13.564 µs)
          = 30 × 20.908 µs
          = 627.24 µs

6.3.1 Link Synchronization and Link Node Status

Figure 6.2 shows the timing of transmitting the Link Synchronization, Link Node Output Control, and Link Node Status messages during an 8.3 ms period. The time to send the Link Node Status messages from the Link Nodes to the Link Processor assumes t_worst.
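The latency figures above follow directly from the Table 6.3 constants; a quick numerical check:

```python
C = 2.9979e8        # speed of light in vacuum, m/s
R = 1.125e9         # bit rate, bit/s
D_MAX = 1526 * 8    # max datagram size, bits
E = 10 / 8          # 8b/10b encoding overhead
N_FIBER = 1.4677    # fiber index of refraction at 1310 nm
L_MAX = 1500.0      # maximum fiber length in the MPS, m
N_LN = 30           # number of Link Nodes

t_pmax = L_MAX * N_FIBER / C      # maximum fiber propagation time, s
t_fd = D_MAX * E / R              # time to send a full datagram, s
t_worst = N_LN * (t_pmax + t_fd)  # serial worst case over all nodes, s

print(f"t_pmax  = {t_pmax * 1e6:.4f} us")   # ~7.34 us
print(f"t_fd    = {t_fd * 1e6:.3f} us")     # ~13.56 us
print(f"t_worst = {t_worst * 1e6:.2f} us")  # ~627 us
```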

[Figure 6.2. Timing Diagram of Link Communication During a 120 Hz Pulse: the Link Processor transmits the Link Synchronization and Link Node Output Control messages at 2.77 ms intervals (360 Hz); the Link Node Status replies from the Link Nodes arrive within 0.7 ms of each synchronization.]
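As an illustration of the MLP framing (Section 6.1) applied to the Link Synchronization message (Section 6.2.1), the 13 bytes on the wire might be packed as below. The big-endian byte order and the protocol version value are assumptions; the spec states neither.

```python
import struct

MSG_LINK_SYNC = 0x51  # Link Synchronization message type (Section 6.2.1)
MLP_VERSION = 1       # assumed; the spec does not give the version number

def pack_link_sync(ts_seconds: int, ts_nsec_pulse_id: int,
                   timeslot: int, evg_data: int) -> bytes:
    """Pack the 2-byte MLP header followed by the 11-byte Link
    Synchronization payload (seconds, nanoseconds/pulse ID, timeslot,
    EVG data), assuming network (big-endian) byte order."""
    return struct.pack(">BBIIBH", MLP_VERSION, MSG_LINK_SYNC,
                       ts_seconds, ts_nsec_pulse_id, timeslot, evg_data)
```

A packed message is 13 bytes: the 2-byte MLP header plus the 11-byte payload listed in Section 6.2.1.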


7 Logic

7.1 Rate Limiting

Depending on a fault's severity, the MPS may limit the beam rate rather than abort all pulses. The MPS can rate limit the beam from 120 Hz to 0 Hz. An example of an MPS rate limiting decision is the limiting of the integrated radiation dosage in the undulator. If the undulator exceeds a cumulative dosage threshold over one second, the MPS may decide to lower the beam rate to mitigate the fault.

[Figure 7.1. Example of Integrated Radiation Dosage in the Undulator: cumulative radiation dose versus time; a 120 Hz beam exceeds the loss threshold at the 20th pulse and is rate limited to 10 Hz.]

Figure 7.1 shows an example of integrated radiation dose in the undulator. The MPS calculates the integrated radiation dosage as the cumulative beam loss per second. The beam rate begins at 120 Hz. The loss threshold is exceeded by the 20th pulse in the first second. The MPS must limit the beam rate to less than 20 pulses / 1 s = 20 Hz. The MPS chooses to limit the beam rate to 10 Hz, resets the cumulated beam loss, and waits 1 pulse at 10 Hz = 0.1 s to begin the 10 Hz integration. The last pulse at 120 Hz is also the first pulse at 10 Hz. If the machine continues to lose a similar amount of beam per pulse, the cumulated beam loss will only reach 10 Hz / 20 Hz = 50% of the loss threshold.

7.2 Automatic Recovery

After certain rate limiting faults clear, the MPS has the option to restore the beam rate to the operator's current requested beam rate. Each fault is configured to either support automatic recovery or require an operator to manually reset the beam rate after the fault clears.
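The rate selection in the Section 7.1 example can be sketched as picking the highest allowed rate whose pulses per second stay below the count at which the threshold was reached; the function below is illustrative, not the MPS algorithm itself:

```python
ALLOWED_RATES = [120, 60, 30, 10, 1, 0]  # Hz; the mitigation rates of Section 2.2

def rate_after_dose_trip(pulses_to_threshold: int, current_rate: int) -> int:
    """Choose the highest allowed rate that is below both the tripping
    pulse count per second and the current rate (rate limiting can only
    lower the rate)."""
    for rate in ALLOWED_RATES:
        if rate < pulses_to_threshold and rate < current_rate:
            return rate
    return 0
```

With the Figure 7.1 numbers (threshold reached at pulse 20 of a 120 Hz beam), this selects 10 Hz, matching the text.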

7.3 MPS Algorithm

An MPS algorithm file is compiled into the Link Processor's executable. The Link Processor uses the logic to process fault inputs from the Link Nodes and EPICS and to limit the beam rate when necessary. Mitigation devices are monitored by the MPS; if a mitigation device is not performing to specifications, the MPS will limit beam with a different mitigation device.

7.4 Bypassing Device Faults

Device faults can be bypassed via an EPICS display if the user is authorized with a correct name and password. After authorization, the operator can bypass a fault by selecting the fault, choosing its bypass state, and supplying an expiration date or a bypass duration. For example, an operator can bypass a flow switch for one day by selecting the flow switch input, selecting its OK state, and giving a bypass duration of 24 hours. Equivalently, the operator could specify the same 24-hour expiration by selecting an expiration date of tomorrow.

7.5 Setting and Changing Device Fault Thresholds

Fault thresholds are changed through the MPS device's IOC. The IOCs use Channel Access (CA) security to ensure that users are authorized with a name and password before changes are made.
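The two equivalent ways of specifying a bypass in Section 7.4 — by duration or by expiration date — reduce to a single expiration time. A minimal sketch, with a hypothetical `Bypass` class that is not part of the actual MPS software:

```python
# Hypothetical sketch of a fault bypass record (Section 7.4). The
# Bypass class and its fields are illustrative assumptions only.
from datetime import datetime, timedelta

class Bypass:
    def __init__(self, device, state, *, duration=None, expires=None):
        # A duration is converted to an expiration date at creation,
        # so both entry methods yield the same stored record.
        if duration is not None:
            expires = datetime.now() + duration
        self.device, self.state, self.expires = device, state, expires

    def active(self):
        return datetime.now() < self.expires

# A flow switch bypassed to its OK state for 24 hours:
b = Bypass("flow_switch", "OK", duration=timedelta(hours=24))
assert b.active()
```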

8 Hardware Overview

The MPS uses a client/server design. The client, the Link Processor, requests data from servers, the Link Nodes, in order to determine the maximum allowed beam rate. The Link Processor and Link Nodes communicate over Gigabit Ethernet.

8.1 Link Processor

The Link Processor is the core of the MPS. It is a specialized EPICS IOC running on an MVME 6100 that gathers fault information from the Link Nodes, determines the maximum allowable beam rate, and transmits mitigation control signals to the Link Nodes. The Link Processor interfaces with EPICS, with the timing system through a mounted EVR, and with the Link Nodes through Gigabit Ethernet.

8.2 Link Node

Link Nodes gather inputs from MPS fault devices using digital input cards (see Section 9.2). They package these inputs into an Ethernet frame and send the data to the Link Processor. After processing the inputs, the Link Processor responds to the Link Nodes with control commands to either permit or deny beam. The Link Nodes control mitigation devices using digital output cards (see Section 9.1). See Chapter 9 for Link Node details.


9 Link Node

Link Nodes gather inputs from MPS fault devices using digital input cards (see Section 9.2). They package these inputs into an Ethernet frame and send the data to the Link Processor. After processing the inputs, the Link Processor responds to the Link Nodes with control commands to either permit or deny beam. The Link Nodes control mitigation devices using digital output cards (see Section 9.1). Figure 9.1 shows the Link Node circuit board layout.

[Figure 9.1. Circuit Board Layout of the Link Node Hardware]

9.1 Digital Output

Link Nodes use digital output cards to send signals to mitigation devices, the EVG, and other devices. The cards provide eight output channels of 0 V / 24 V signals for logic levels 0 and 1 respectively. The cards also feature four trigger inputs and four trigger outputs. The digital output card's circuit board layout is shown in Figure 9.2.

[Figure 9.2. Circuit Board Layout of Digital Output Card]

9.2 Digital Input

Link Nodes use digital input cards to read signals from MPS hardware. See Section 4.1.3 for more information on logic levels. The input card has sixteen input channels. The digital input card's circuit board layout is shown in Figure 9.3.

9.2.1 Response Time

Debounce time is individually programmable from 10 µs to 5 s (inclusive) in eight steps.

[Figure 9.3. Circuit Board Layout of Digital Input Card]

Table 9.1. Selectable Debounce Times

    Mode   Debounce Time
    0      10 µs
    1      100 µs
    2      1 ms
    3      10 ms
    4      100 ms
    5      500 ms
    6      1 s
    7      5 s
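Because the debounce times form eight discrete steps, software choosing a setting must quantize a requested time to a mode. A minimal sketch, assuming the natural policy of picking the smallest selectable time at least as long as the request:

```python
# Sketch: selecting a debounce mode from Table 9.1 for a requested
# debounce time. The round-up policy is an assumption, not from the spec.

DEBOUNCE_S = [10e-6, 100e-6, 1e-3, 10e-3, 100e-3, 500e-3, 1.0, 5.0]

def debounce_mode(seconds):
    """Return the smallest mode whose debounce time covers `seconds`."""
    for mode, t in enumerate(DEBOUNCE_S):
        if t >= seconds:
            return mode
    raise ValueError("requested debounce exceeds the 5 s maximum")

print(debounce_mode(250e-6))  # -> 2 (1 ms, the next step above 250 us)
```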


10 Naming

The MPS follows the LCLS naming format for its PV names and for the internal names used by the MPS Link Protocol (MLP).

10.1 PV Names


11 Testing

11.1 MPS Algorithm Testing

Test software checks the algorithm's runtime correctness to ensure that the algorithm was correctly implemented. The algorithm's syntax is checked at compile time.

11.1.1 Simulate Mode

The MPS can be put into a simulate mode where a combination of inputs and outputs can be real or simulated. Real signals reflect actual hardware signals, while simulated values can be read from and written to with software. When in simulate mode, operators can toggle input bits and view the results of the MPS algorithm from an EPICS interface.

11.2 Gigabit Ethernet

Link Nodes have an echo function to return a test message sent from the Link Processor.

11.3 Device Configuration

On request, the Link Processor and Link Nodes report their current configurations through EPICS.


12 User Interface

The MPS will use LCLS standard EPICS displays for control and monitoring of MPS signals. CA security will be used to authorize MPS threshold and bypass changes with a user name and password.

The displays will provide the current state of MPS fault inputs along with an associated string and severity level. For example, a multi-bit limit switch input where neither the in nor the out limit switch is pressed may map to the string "Moving", colored yellow to reflect a moderate fault severity.

The interface will provide displays with rate limiting information, including the maximum allowed beam rate and the enabled mitigation devices. The configuration of MPS devices will also be available from the displays.
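The multi-bit state mapping above can be sketched as a simple lookup from input bit pairs to a display string and severity. The state encoding and severity names below are assumptions for illustration, loosely following EPICS alarm severity conventions.

```python
# Hypothetical sketch of a multi-bit limit switch display mapping
# (Section 12). Bit encoding and severity names are assumptions.

LIMIT_SWITCH_STATES = {
    # (in pressed, out pressed) -> (display string, severity)
    (0, 0): ("Moving", "MINOR"),  # neither limit pressed -> yellow
    (1, 0): ("In", "NO_ALARM"),
    (0, 1): ("Out", "NO_ALARM"),
    (1, 1): ("Fault", "MAJOR"),   # both pressed: physically impossible
}

string, severity = LIMIT_SWITCH_STATES[(0, 0)]
assert string == "Moving" and severity == "MINOR"
```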


Appendices


A List of Acronyms

    BCS    Beam Containment System
    CA     Channel Access
    ESS    Electronic Synchronization System (see Appendix C)
    EVG    Event Generator
    EVR    Event Receiver
    IOC    Input Output Controller
    LDIM   Latching Digital Input Module
    LTU    Linac To Undulator
    MLP    MPS Link Protocol
    MPS    Machine Protection System
    PIC    Protection Ion Chamber
    PV     Process Variable
    SBBD   Single Bunch Beam Dumper


B Link Node FPGA Registers

FPGA registers are provided to the ColdFire processor, USB interface, and fiber interface to read and set FPGA states and settings. All registers are big-endian.

B.1 Latched Input Card State Registers

Table B.1. Input Card Status Register Values

    Value   Input Channel State
    0       Input is low or unplugged
    1       Input is high

B.1.1 Latched State of Input Cards 0 and 1

Address 0x00000000

The latched state of input cards 0 and 1. Inputs are latched to the fault state if their signal is unplugged or stays faulted for the input-specific debounce time. Bits 31:16 hold Input Card 1 and bits 15:0 hold Input Card 0.

    Bit     Description                                     Default  CPU  Link
    31:16   Latched state of Input Card 1; bits 16 through
            31 map to card inputs 0 through 15.             0x0000   R    R
    15:0    Latched state of Input Card 0; bits 0 through
            15 map to card inputs 0 through 15.             0x0000   R    R

B.1.2 Latched State of Input Cards 2 and 3

Address 0x00000001

The latched state of input cards 2 and 3. Inputs are latched to the fault state if their signal is unplugged or stays faulted for the input-specific debounce time. Bits 31:16 hold Input Card 3 and bits 15:0 hold Input Card 2.

    Bit     Description                                     Default  CPU  Link
    31:16   Latched state of Input Card 3; bits 16 through
            31 map to card inputs 0 through 15.             0x0000   R    R
    15:0    Latched state of Input Card 2; bits 0 through
            15 map to card inputs 0 through 15.             0x0000   R    R

B.1.3 Latched State of Input Cards 4 and 5

Address 0x00000002

The latched state of input cards 4 and 5. Inputs are latched to the fault state if their signal is unplugged or stays faulted for the input-specific debounce time. Bits 31:16 hold Input Card 5 and bits 15:0 hold Input Card 4.

    Bit     Description                                     Default  CPU  Link
    31:16   Latched state of Input Card 5; bits 16 through
            31 map to card inputs 0 through 15.             0x0000   R    R
    15:0    Latched state of Input Card 4; bits 0 through
            15 map to card inputs 0 through 15.             0x0000   R    R
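Since the registers are big-endian 32-bit words with two 16-bit card fields, reading one state register reduces to a byte-order-aware unpack and bit extraction. A minimal sketch; the function name and return shape are illustrative:

```python
# Sketch: unpacking one 32-bit input card state register (Section B.1
# layout, big-endian) into per-card channel bits. Names are illustrative.
import struct

def unpack_card_states(word_bytes):
    """word_bytes: 4 raw bytes from one state register.
    Returns (low_card, high_card) as 16-entry lists indexed by
    card input channel number (low card in bits 15:0)."""
    (word,) = struct.unpack(">I", word_bytes)  # registers are big-endian
    low = [(word >> i) & 1 for i in range(16)]         # e.g. card 0
    high = [(word >> (16 + i)) & 1 for i in range(16)]  # e.g. card 1
    return low, high

# Low card input 3 high, high card input 0 high:
low, high = unpack_card_states(struct.pack(">I", (1 << 16) | (1 << 3)))
assert low[3] == 1 and high[0] == 1
```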

B.2 Current Input Card State Registers

Table B.2. Input Card Status Register Values

    Value   Input Channel State
    0       Input is low or unplugged
    1       Input is high

B.2.1 Current State of Input Cards 0 and 1

Address 0x00000003

The current state of input cards 0 and 1. States follow signal levels after the signals are held constant for the input-specific debounce time. Bits 31:16 hold Input Card 1 and bits 15:0 hold Input Card 0.

    Bit     Description                                     Default  CPU  Link
    31:16   Current state of Input Card 1; bits 16 through
            31 map to card inputs 0 through 15.             0x0000   R    R
    15:0    Current state of Input Card 0; bits 0 through
            15 map to card inputs 0 through 15.             0x0000   R    R

B.2.2 Current State of Input Cards 2 and 3

Address 0x00000004

The current state of input cards 2 and 3. States follow signal levels after the signals are held constant for the input-specific debounce time. Bits 31:16 hold Input Card 3 and bits 15:0 hold Input Card 2.

    Bit     Description                                     Default  CPU  Link
    31:16   Current state of Input Card 3; bits 16 through
            31 map to card inputs 0 through 15.             0x0000   R    R
    15:0    Current state of Input Card 2; bits 0 through
            15 map to card inputs 0 through 15.             0x0000   R    R

B.2.3 Current State of Input Cards 4 and 5

Address 0x00000005

The current state of input cards 4 and 5. States follow signal levels after the signals are held constant for the input-specific debounce time. Bits 31:16 hold Input Card 5 and bits 15:0 hold Input Card 4.

    Bit     Description                                     Default  CPU  Link
    31:16   Current state of Input Card 5; bits 16 through
            31 map to card inputs 0 through 15.             0x0000   R    R
    15:0    Current state of Input Card 4; bits 0 through
            15 map to card inputs 0 through 15.             0x0000   R    R

B.3 Output Card Registers

B.3.1 Output Card Control

Address 0x00000006

The output card has eight digital output channels and four trigger outputs. Each digital output channel is controlled by two bits in this register; see Table B.3 for how these bit pairs map to output states. Each trigger output is controlled by one bit; see Table B.4.

Table B.3. Digital Output Control Values

    Value   Digital Output State
    00      Set low
    01      Set high
    10      Undefined
    11      Do not change

Table B.4. Trigger Control Values

    Value   Trigger Control
    0       No output
    1       Trigger

Bits 31:20 are unused; fields TC3-TC0 occupy bits 19:16 and fields OC7-OC0 occupy bit pairs 15:14 down to 1:0.

    Bit     Description                Default  CPU  Link
    31:20   Unused
    19      Trigger 3 control          0        R    R/W
    18      Trigger 2 control          0        R    R/W
    17      Trigger 1 control          0        R    R/W
    16      Trigger 0 control          0        R    R/W
    15:14   Output channel 7 control   00       R    R/W
    13:12   Output channel 6 control   00       R    R/W
    11:10   Output channel 5 control   00       R    R/W
    9:8     Output channel 4 control   00       R    R/W
    7:6     Output channel 3 control   00       R    R/W
    5:4     Output channel 2 control   00       R    R/W
    3:2     Output channel 1 control   00       R    R/W
    1:0     Output channel 0 control   00       R    R/W
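Building a control word from Tables B.3 and B.4 can be sketched as below. The helper is illustrative; the choice of defaulting unspecified channels to the "do not change" code 0b11 is an assumption that follows naturally from Table B.3.

```python
# Sketch: composing the Output Card Control word (address 0x00000006)
# from Tables B.3 and B.4. The helper and its default policy
# (unspecified channels -> "do not change") are assumptions.

SET_LOW, SET_HIGH, NO_CHANGE = 0b00, 0b01, 0b11

def control_word(channel_states, triggers=()):
    """channel_states: {channel 0-7: SET_LOW or SET_HIGH}.
    triggers: iterable of trigger numbers 0-3 to fire (bits 19:16)."""
    word = 0
    for ch in range(8):
        word |= channel_states.get(ch, NO_CHANGE) << (2 * ch)
    for t in triggers:
        word |= 1 << (16 + t)
    return word

# Drive channel 0 high, channel 1 low, fire trigger 2, leave the rest:
w = control_word({0: SET_HIGH, 1: SET_LOW}, triggers=[2])
assert w & 0b11 == SET_HIGH and (w >> 2) & 0b11 == SET_LOW
assert (w >> 18) & 1 == 1
```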

B.3.2 Output Card Status

Address 0x00000007

The output card has eight output channels. The output state of each channel is represented by one bit in this register; see Table B.5 for a description of register values.

Table B.5. Output Card Status Register Values

    Value   Output Channel State
    0       Output is low
    1       Output is high

Bits 31:8 are unused; fields OS7-OS0 occupy bits 7 down to 0.

    Bit    Description                         CPU  Link
    31:8   Unused
    7      Current output state of channel 7   R    R
    6      Current output state of channel 6   R    R
    5      Current output state of channel 5   R    R
    4      Current output state of channel 4   R    R
    3      Current output state of channel 3   R    R
    2      Current output state of channel 2   R    R
    1      Current output state of channel 1   R    R
    0      Current output state of channel 0   R    R

B.4 Input Card Channel Debounce Registers

Table B.6. Debounce Register Values

    Mode   Register Value   Debounce Time
    0      000              10 µs
    1      001              100 µs
    2      010              1 ms
    3      011              10 ms
    4      100              100 ms
    5      101              500 ms
    6      110              1 s
    7      111              5 s

B.4.1 Debounce Times of Input Card 0 Channels 0 through 7

Address 0x00000008

This register controls the desired debounce times for channels 0 through 7 of input card 0. See Table B.6 for the mapping of register values to debounce times. Bits 31:24 are unused; fields D007 (bits 23:21) through D000 (bits 2:0) each hold one 3-bit debounce mode.

    Bit     Description                               Default  CPU   Link
    31:24   Unused
    23:21   Debounce time of Input Card 0 Channel 7   000      R/W   R/W
    20:18   Debounce time of Input Card 0 Channel 6   000      R/W   R/W
    17:15   Debounce time of Input Card 0 Channel 5   000      R/W   R/W
    14:12   Debounce time of Input Card 0 Channel 4   000      R/W   R/W
    11:9    Debounce time of Input Card 0 Channel 3   000      R/W   R/W
    8:6     Debounce time of Input Card 0 Channel 2   000      R/W   R/W
    5:3     Debounce time of Input Card 0 Channel 1   000      R/W   R/W
    2:0     Debounce time of Input Card 0 Channel 0   000      R/W   R/W
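Since each debounce register packs eight 3-bit modes at bit offset 3 × channel, writing a register value can be sketched as:

```python
# Sketch: packing Table B.6 debounce modes into one B.4 register word.
# Channels 0-7 of a card occupy bits 2:0 up through 23:21, i.e.
# 3 bits per channel at offset 3 * channel. Helper name is illustrative.

def pack_debounce(modes):
    """modes: {channel 0-7: mode 0-7 from Table B.6}. Unlisted
    channels keep the default mode 000 (10 us)."""
    word = 0
    for ch, mode in modes.items():
        word |= (mode & 0b111) << (3 * ch)
    return word

# Channel 2 at mode 4 (100 ms), channel 7 at mode 7 (5 s):
w = pack_debounce({2: 4, 7: 7})
assert (w >> 6) & 0b111 == 4    # bits 8:6 hold channel 2
assert (w >> 21) & 0b111 == 7   # bits 23:21 hold channel 7
```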

B.4.2 Debounce Times of Input Card 0 Channels 8 through 15

Address 0x00000009

This register controls the desired debounce times for channels 8 through 15 of input card 0. See Table B.6 for the mapping of register values to debounce times. Bits 31:24 are unused; fields D015 (bits 23:21) through D008 (bits 2:0) each hold one 3-bit debounce mode.

    Bit     Description                                Default  CPU   Link
    31:24   Unused
    23:21   Debounce time of Input Card 0 Channel 15   000      R/W   R/W
    20:18   Debounce time of Input Card 0 Channel 14   000      R/W   R/W
    17:15   Debounce time of Input Card 0 Channel 13   000      R/W   R/W
    14:12   Debounce time of Input Card 0 Channel 12   000      R/W   R/W
    11:9    Debounce time of Input Card 0 Channel 11   000      R/W   R/W
    8:6     Debounce time of Input Card 0 Channel 10   000      R/W   R/W
    5:3     Debounce time of Input Card 0 Channel 9    000      R/W   R/W
    2:0     Debounce time of Input Card 0 Channel 8    000      R/W   R/W

B.4.3 Debounce Times of Input Card 1 Channels 0 through 7

Address 0x00000010

This register controls the desired debounce times for channels 0 through 7 of input card 1. See Table B.6 for the mapping of register values to debounce times. Bits 31:24 are unused; fields D107 (bits 23:21) through D100 (bits 2:0) each hold one 3-bit debounce mode.

    Bit     Description                               Default  CPU   Link
    31:24   Unused
    23:21   Debounce time of Input Card 1 Channel 7   000      R/W   R/W
    20:18   Debounce time of Input Card 1 Channel 6   000      R/W   R/W
    17:15   Debounce time of Input Card 1 Channel 5   000      R/W   R/W
    14:12   Debounce time of Input Card 1 Channel 4   000      R/W   R/W
    11:9    Debounce time of Input Card 1 Channel 3   000      R/W   R/W
    8:6     Debounce time of Input Card 1 Channel 2   000      R/W   R/W
    5:3     Debounce time of Input Card 1 Channel 1   000      R/W   R/W
    2:0     Debounce time of Input Card 1 Channel 0   000      R/W   R/W

B.4.4 Debounce Times of Input Card 1 Channels 8 through 15

Address 0x00000011

This register controls the desired debounce times for channels 8 through 15 of input card 1. See Table B.6 for the mapping of register values to debounce times. Bits 31:24 are unused; fields D115 (bits 23:21) through D108 (bits 2:0) each hold one 3-bit debounce mode.

    Bit     Description                                Default  CPU   Link
    31:24   Unused
    23:21   Debounce time of Input Card 1 Channel 15   000      R/W   R/W
    20:18   Debounce time of Input Card 1 Channel 14   000      R/W   R/W
    17:15   Debounce time of Input Card 1 Channel 13   000      R/W   R/W
    14:12   Debounce time of Input Card 1 Channel 12   000      R/W   R/W
    11:9    Debounce time of Input Card 1 Channel 11   000      R/W   R/W
    8:6     Debounce time of Input Card 1 Channel 10   000      R/W   R/W
    5:3     Debounce time of Input Card 1 Channel 9    000      R/W   R/W
    2:0     Debounce time of Input Card 1 Channel 8    000      R/W   R/W

B.4.5 Debounce Times of Input Card 2 Channels 0 through 7

Address 0x00000012

This register controls the desired debounce times for channels 0 through 7 of input card 2. See Table B.6 for the mapping of register values to debounce times. Bits 31:24 are unused; fields D207 (bits 23:21) through D200 (bits 2:0) each hold one 3-bit debounce mode.

    Bit     Description                               Default  CPU   Link
    31:24   Unused
    23:21   Debounce time of Input Card 2 Channel 7   000      R/W   R/W
    20:18   Debounce time of Input Card 2 Channel 6   000      R/W   R/W
    17:15   Debounce time of Input Card 2 Channel 5   000      R/W   R/W
    14:12   Debounce time of Input Card 2 Channel 4   000      R/W   R/W
    11:9    Debounce time of Input Card 2 Channel 3   000      R/W   R/W
    8:6     Debounce time of Input Card 2 Channel 2   000      R/W   R/W
    5:3     Debounce time of Input Card 2 Channel 1   000      R/W   R/W
    2:0     Debounce time of Input Card 2 Channel 0   000      R/W   R/W

B.4.6 Debounce Times of Input Card 2 Channels 8 through 15

Address 0x00000013

This register controls the desired debounce times for channels 8 through 15 of input card 2. See Table B.6 for the mapping of register values to debounce times. Bits 31:24 are unused; fields D215 (bits 23:21) through D208 (bits 2:0) each hold one 3-bit debounce mode.

    Bit     Description                                Default  CPU   Link
    31:24   Unused
    23:21   Debounce time of Input Card 2 Channel 15   000      R/W   R/W
    20:18   Debounce time of Input Card 2 Channel 14   000      R/W   R/W
    17:15   Debounce time of Input Card 2 Channel 13   000      R/W   R/W
    14:12   Debounce time of Input Card 2 Channel 12   000      R/W   R/W
    11:9    Debounce time of Input Card 2 Channel 11   000      R/W   R/W
    8:6     Debounce time of Input Card 2 Channel 10   000      R/W   R/W
    5:3     Debounce time of Input Card 2 Channel 9    000      R/W   R/W
    2:0     Debounce time of Input Card 2 Channel 8    000      R/W   R/W

B.4.7 Debounce Times of Input Card 3 Channels 0 through 7

Address 0x00000014

This register controls the desired debounce times for channels 0 through 7 of input card 3. See Table B.6 for the mapping of register values to debounce times. Bits 31:24 are unused; fields D307 (bits 23:21) through D300 (bits 2:0) each hold one 3-bit debounce mode.

    Bit     Description                               Default  CPU   Link
    31:24   Unused
    23:21   Debounce time of Input Card 3 Channel 7   000      R/W   R/W
    20:18   Debounce time of Input Card 3 Channel 6   000      R/W   R/W
    17:15   Debounce time of Input Card 3 Channel 5   000      R/W   R/W
    14:12   Debounce time of Input Card 3 Channel 4   000      R/W   R/W
    11:9    Debounce time of Input Card 3 Channel 3   000      R/W   R/W
    8:6     Debounce time of Input Card 3 Channel 2   000      R/W   R/W
    5:3     Debounce time of Input Card 3 Channel 1   000      R/W   R/W
    2:0     Debounce time of Input Card 3 Channel 0   000      R/W   R/W

B.4.8 Debounce Times of Input Card 3 Channels 8 through 15

Address 0x00000015

This register controls the desired debounce times for channels 8 through 15 of input card 3. See Table B.6 for the mapping of register values to debounce times. Bits 31:24 are unused; fields D315 (bits 23:21) through D308 (bits 2:0) each hold one 3-bit debounce mode.

    Bit     Description                                Default  CPU   Link
    31:24   Unused
    23:21   Debounce time of Input Card 3 Channel 15   000      R/W   R/W
    20:18   Debounce time of Input Card 3 Channel 14   000      R/W   R/W
    17:15   Debounce time of Input Card 3 Channel 13   000      R/W   R/W
    14:12   Debounce time of Input Card 3 Channel 12   000      R/W   R/W
    11:9    Debounce time of Input Card 3 Channel 11   000      R/W   R/W
    8:6     Debounce time of Input Card 3 Channel 10   000      R/W   R/W
    5:3     Debounce time of Input Card 3 Channel 9    000      R/W   R/W
    2:0     Debounce time of Input Card 3 Channel 8    000      R/W   R/W

B.4.9 Debounce Times of Input Card 4 Channels 0 through 7

Address 0x00000016

This register controls the desired debounce times for channels 0 through 7 of input card 4. See Table B.6 for the mapping of register values to debounce times. Bits 31:24 are unused; fields D407 (bits 23:21) through D400 (bits 2:0) each hold one 3-bit debounce mode.

    Bit     Description                               Default  CPU   Link
    31:24   Unused
    23:21   Debounce time of Input Card 4 Channel 7   000      R/W   R/W
    20:18   Debounce time of Input Card 4 Channel 6   000      R/W   R/W
    17:15   Debounce time of Input Card 4 Channel 5   000      R/W   R/W
    14:12   Debounce time of Input Card 4 Channel 4   000      R/W   R/W
    11:9    Debounce time of Input Card 4 Channel 3   000      R/W   R/W
    8:6     Debounce time of Input Card 4 Channel 2   000      R/W   R/W
    5:3     Debounce time of Input Card 4 Channel 1   000      R/W   R/W
    2:0     Debounce time of Input Card 4 Channel 0   000      R/W   R/W

B.4.10 Debounce Times of Input Card 4 Channels 8 through 15

Address 0x00000017

This register controls the desired debounce times for channels 8 through 15 of input card 4. See Table B.6 for the mapping of register values to debounce times. Bits 31:24 are unused; fields D415 (bits 23:21) through D408 (bits 2:0) each hold one 3-bit debounce mode.

    Bit     Description                                Default  CPU   Link
    31:24   Unused
    23:21   Debounce time of Input Card 4 Channel 15   000      R/W   R/W
    20:18   Debounce time of Input Card 4 Channel 14   000      R/W   R/W
    17:15   Debounce time of Input Card 4 Channel 13   000      R/W   R/W
    14:12   Debounce time of Input Card 4 Channel 12   000      R/W   R/W
    11:9    Debounce time of Input Card 4 Channel 11   000      R/W   R/W
    8:6     Debounce time of Input Card 4 Channel 10   000      R/W   R/W
    5:3     Debounce time of Input Card 4 Channel 9    000      R/W   R/W
    2:0     Debounce time of Input Card 4 Channel 8    000      R/W   R/W

B.4.11 Debounce Times of Input Card 5 Channels 0 through 7

Address 0x00000018

This register controls the desired debounce times for channels 0 through 7 of input card 5. See Table B.6 for the mapping of register values to debounce times. Bits 31:24 are unused; fields D507 (bits 23:21) through D500 (bits 2:0) each hold one 3-bit debounce mode.

    Bit     Description                               Default  CPU   Link
    31:24   Unused
    23:21   Debounce time of Input Card 5 Channel 7   000      R/W   R/W
    20:18   Debounce time of Input Card 5 Channel 6   000      R/W   R/W
    17:15   Debounce time of Input Card 5 Channel 5   000      R/W   R/W
    14:12   Debounce time of Input Card 5 Channel 4   000      R/W   R/W
    11:9    Debounce time of Input Card 5 Channel 3   000      R/W   R/W
    8:6     Debounce time of Input Card 5 Channel 2   000      R/W   R/W
    5:3     Debounce time of Input Card 5 Channel 1   000      R/W   R/W
    2:0     Debounce time of Input Card 5 Channel 0   000      R/W   R/W

B.4.12 Debounce Times of Input Card 5 Channels 8 through 15

Address 0x00000019

This register controls the desired debounce times for channels 8 through 15 of input card 5. See Table B.6 for the mapping of register values to debounce times. Bits 31:24 are unused; fields D515 (bits 23:21) through D508 (bits 2:0) each hold one 3-bit debounce mode.

    Bit     Description                                Default  CPU   Link
    31:24   Unused
    23:21   Debounce time of Input Card 5 Channel 15   000      R/W   R/W
    20:18   Debounce time of Input Card 5 Channel 14   000      R/W   R/W
    17:15   Debounce time of Input Card 5 Channel 13   000      R/W   R/W
    14:12   Debounce time of Input Card 5 Channel 12   000      R/W   R/W
    11:9    Debounce time of Input Card 5 Channel 11   000      R/W   R/W
    8:6     Debounce time of Input Card 5 Channel 10   000      R/W   R/W
    5:3     Debounce time of Input Card 5 Channel 9    000      R/W   R/W
    2:0     Debounce time of Input Card 5 Channel 8    000      R/W   R/W

B.4.13 Debounce Times of Input Card 6 Channels 0 through 7

Address 0x00000020

This register controls the desired debounce times for channels 0 through 7 of input card 6. See Table B.6 for the mapping of register values to debounce times. Bits 31:24 are unused; fields D607 (bits 23:21) through D600 (bits 2:0) each hold one 3-bit debounce mode.

    Bit     Description                               Default  CPU   Link
    31:24   Unused
    23:21   Debounce time of Input Card 6 Channel 7   000      R/W   R/W
    20:18   Debounce time of Input Card 6 Channel 6   000      R/W   R/W
    17:15   Debounce time of Input Card 6 Channel 5   000      R/W   R/W
    14:12   Debounce time of Input Card 6 Channel 4   000      R/W   R/W
    11:9    Debounce time of Input Card 6 Channel 3   000      R/W   R/W
    8:6     Debounce time of Input Card 6 Channel 2   000      R/W   R/W
    5:3     Debounce time of Input Card 6 Channel 1   000      R/W   R/W
    2:0     Debounce time of Input Card 6 Channel 0   000      R/W   R/W