RF Testing of a Single FPIX1 for BTeV
James Price, Wayne State University
08/24/2004
Performed at Fermi National Accelerator Laboratory
This summer I spent two and a half months working at the Fermi National Accelerator Laboratory in Batavia, Illinois. My project was part of BTeV, a new pixel detector to be installed on the synchrotron at the lab. Specifically, I ended up doing RF testing on an FPIX1. The FPIX1 is the latest version of the detector's heart and soul: the chip that binds a diode sensor and discriminators together and talks to the computers outside to build a picture of the collisions. I call it the latest version because the newest chip, the FPIX2, which I had hoped to work with, was never ready during the time I was there. Regardless, the FPIX2 and FPIX1 share very similar architectures, so my tests should still be a valid indicator of what to expect from newer models, and in my time there I learned a lot about actually working in the field. So, what exactly do I mean by RF testing? Well, if you've ever turned on your microwave while on a cordless phone, or pushed an unshielded speaker too close to a television or monitor with a cathode ray tube, then you have some experience with electromagnetic interference. Without getting into the physics of the beam in a collider, which is currently beyond my scope anyway, at its simplest the beam acts as a very powerful electrical current. This lets it interact with the detectors in BTeV and possibly generate noise, which could appear as aberrations in the data or simply make everything fuzzy. Such interference would be highly detrimental to the experiments, creating bad data that leads to incorrect conclusions and wastes time, effort, and money. So it fell on me to run simulations so we could predict how much noise there would be, to work on methods of recognizing and filtering noise, and to experiment with shielding methods to reduce the noise. The other portion of my job was organizing a document that would serve as a record of the testing.
[Schematic of the test apparatus: beam chamber, beam pipe, device under test (DUT), coaxial attenuator, scope, RF pulser, PMC/CPU, power supplies, PTA software, RF amplifier, inject pulser, high-voltage source, current source, trigger delay adjustment, data storage.]

This is the schematic I made of the apparatus I would be testing with. It was only the second time I had created anything in MS PowerPoint. Without any sophisticated photo-editing software, I had to learn to use MS Paint to adjust the size of the pictures and tweak them in any necessary ways. It was a good exercise in learning to be resourceful. Some of the pictures I was able to acquire from the manufacturer websites, but many I had to take myself with the digital camera I thankfully brought along. The schematic also provides a good visual aid for explaining just how I ran my simulations. Basically, the FPIX1 sat in the beam chamber where the DUT, or device under test, is indicated in the diagram. Electricity was pulsed into the cells of the chip to simulate hits from a collision, while a large current was generated by the equipment on the right, traveled through the beam pipe past the DUT, and generated the interference. The whole time this happened, the PMC and PTA would be interpreting the data sent back from the FPIX1 and saving it to the computer disk. Since this kind of thing could only be done by a computer, my job consisted mainly of starting the necessary programs and monitoring the process for glitches. Once the tests completed, the data was collected for analysis.

This is the graphical interface of the test program. The Gaussian shape is what I would look for; any dips in the bars, or the appearance of blue bars after the red bars had stabilized, indicated a problem in the test and flawed data. This early test shows a large blank space at the beginning and many red bars, because I had set the sweep amplitude at a much wider range than I needed, simply because I was unfamiliar with the way it would behave. By the end of the summer, I had the amplitude ranges down almost to a formula based on RF wattage and threshold voltage, and had significantly streamlined my test times to 21 minutes from the hour that the test above took. Once the data was collected from the test computer, I had to move it to my own laptop for analysis in Matlab. This was the first time I had ever worked with Matlab, so I was happy to learn some new software. The data collected from each test is manipulated by formulae and put into picture form. Unfortunately, my knowledge was not quite adequate for a full explanation of the data, but over the months I learned to recognize when it was good, when it was bad, and to predict where it should be. In all, I ran nearly 150 tests over the course of my stay.

[Example result plots from one test: a Gaussian fit of the threshold (µ = 4087.26, σ = 150.96), a Gaussian fit of the noise (µ = 93.84, σ = 7.52), and a per-pixel "Noise Matrix" map.]
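The numbers quoted on each pair of plots, a mean and a dispersion, are the center and width of the Gaussian describing the per-pixel distribution. As a rough sketch of that step of the analysis, here is how it might look in Python rather than the Matlab I actually used; the pixel-array shape and the data are invented for illustration, and for a well-behaved Gaussian sample the mean and standard deviation stand in for a full curve fit.

```python
import numpy as np

def gaussian_stats(values):
    """Center (mean) and width (dispersion) of a roughly Gaussian sample."""
    v = np.asarray(values, dtype=float)
    return v.mean(), v.std(ddof=1)

# Invented per-pixel threshold data, centered near the values shown in the
# example plots (mu ~ 4087, sigma ~ 151); the array shape is illustrative only.
rng = np.random.default_rng(1)
thresholds = rng.normal(loc=4087.0, scale=151.0, size=(18, 160))

mu, sigma = gaussian_stats(thresholds.ravel())
```

The noise plots would be produced the same way from the per-pixel noise figures instead of the thresholds.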
This is the data I collected: mean and dispersion for threshold and noise, and a noise map; three pictures for each test, seven tests per series, five series in a complete set. Each series consisted of seven tests at 0.05 V intervals between 1.95 V and 2.25 V thresholds, with series at no RF, 25 W, 50 W, 75 W, and 100 W, and variants of center position shielded, center position unshielded, and left position shielded. The upper left is the Gaussian fit of the threshold, the upper right is the Gaussian fit of the noise, and the lower right is a 2D representation of the noise in the FPIX1 by pixel location. This graphic is an explanation of what this data is and how it is generated.
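The standard test matrix described above (seven thresholds per series, five RF power levels, three position/shielding variants) multiplies out to 105 scripted tests; the controls described later account for the rest of the roughly 150 tests. A minimal sketch of that matrix, with labels of my own choosing rather than identifiers from the actual test software:

```python
# Standard test matrix: 7 thresholds x 5 RF powers x 3 variants = 105 tests.
# All names below are illustrative labels, not from the real test programs.
thresholds_v = [round(1.95 + 0.05 * i, 2) for i in range(7)]  # 1.95 .. 2.25 V
rf_powers_w = [0, 25, 50, 75, 100]                            # 0 = no RF
variants = ["center shielded", "center unshielded", "left shielded"]

test_matrix = [
    (variant, power, threshold)
    for variant in variants
    for power in rf_powers_w
    for threshold in thresholds_v
]
```

Each entry in the matrix corresponds to one test, i.e. one trio of plots.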
[Four panels: threshold mean data, threshold dispersion data, threshold mean trends, and threshold dispersion trends, each showing fifteen data series (no RF, 25 W, 50 W, 75 W, and 100 W, in the shielded, unshielded ("nos"), and left-position ("L") variants) with their linear trend lines.]

Initially, I only collected threshold mean data to put in graph form. As time progressed, however, the interesting behavior in the other data made it necessary to observe that as well. By the end, I was making graphs of mean and dispersion for both threshold and noise, as well as lines of best fit to show the trends in the data. All graphs have the same x axis, the threshold in volts. The trends represent the trend lines of the exact data points given in the data series. Means fall as threshold voltage increases and as RF power increases. Dispersion increases as threshold voltage increases and as RF power increases. As the previous graphic indicated, the means are the center points of the Gaussian generated by each 100 pulses, and the dispersions are the widths of the Gaussians.
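The lines of best fit on the trend plots are first-degree least-squares fits of each series against threshold voltage. A hedged sketch of how one such trend line can be produced, in Python with NumPy as a stand-in for the charting tool I actually used; the data points are invented:

```python
import numpy as np

# Common x axis for all graphs: threshold in volts, 1.95 V to 2.25 V.
thresholds = np.linspace(1.95, 2.25, 7)

# Invented threshold-mean series for one RF setting; the real values came
# from the Gaussian fits of the individual tests.
means = np.array([4500.0, 4150.0, 3750.0, 3400.0, 2950.0, 2550.0, 2200.0])

# First-degree polynomial fit = the "line of best fit" drawn on the graphs.
slope, intercept = np.polyfit(thresholds, means, 1)
trend = slope * thresholds + intercept
```

The negative slope reflects the observation above that means fall as threshold voltage increases.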
[Four panels: noise mean data, noise dispersion data, noise mean trends, and noise dispersion trends, with the same fifteen data series and their linear trend lines.]

Again, all graphs have the same x axis, threshold in volts, and the trends represent the trend lines of the exact data points given in the data series. Means fall as threshold voltage increases and can fluctuate as RF power increases, but they tend to settle into a standard range. Dispersion falls as voltage increases and as RF power increases, but also tends to settle into a standard range. Again, means are generated from the center points of the Gaussians, and dispersions from the widths. The rise or fall of the graphs was usually within a standard range from threshold to threshold, but as can be seen in the graphs, there were occasional variations that produced results that are not quite linear. The noise behavior in particular is interesting because it did not follow the linear trend I had anticipated. In each graph, it is easy to see the effectiveness of the shielding: the tests performed unshielded show wildly different noise results, and much lower means, than their shielded counterparts. Beyond the standard range of sets, I also performed a number of controls, and a series at 75 watts with only the board of the FPIX1 shielded and the wires unshielded. This data was very aberrant: tests would often fail, and the data I did get was not in line with the previous tests. Though mostly unusable and eventually scrapped, it did give some insight into how interference was arising in the chip. The lab I worked in had a broken air conditioner, and after a few days in the sweltering heat I began to wonder whether the high temperature and humidity would adversely affect the data, so I started recording humidity and temperature at the start of each test. Eventually it seemed there was little significant change arising from the heat or humidity, but I am confident in saying that I would rather have too much data than be missing something important. In the end, my work document, at the time titled "RF Testing," ended up being around 130 pages, mostly of data, graphs, and tables, but also including an introduction and explanations of the test stand, the test procedures, and the design and function of the FPIX1, PTA, and PMC. It is a very effective characterization of the performance of the FPIX1 as a function of threshold voltage and RF input power. It provides an expectation of how the chip should behave when placed in the beam, and will be a guide to setting the operational threshold in the running detector, telling how much noise to expect under running conditions. Because it will be hard to determine how much RF power the beam will be sending into BTeV, my guide will be invaluable when the detector is up and running. Looking at my guide, and comparing it with the detector
output at a given threshold, the operators can adjust the threshold to reach a noise level that lets them run the detector so that most of the output consists of real signals rather than spurious noise. I owe a great deal of thanks to the REU program for providing me the opportunity to work on such a great project, to Professor David Cinabro for facilitating my trip and helping me understand the project, and to my boss and mentor, Marcos Turqueti, for taking up so much of his time to teach me and to let me be a part of this wonderful experience.