Paper Title Other Keywords Page
MOMPMP01 Computational Beam Dynamics for SNS Commissioning and Operation simulation, proton, space-charge, linac 1
 
  • J. A. Holmes, S. M. Cousineau, V. V. Danilov, S. Henderson, D.-O. Jeon, M. A. Plum, A. P. Shishlo, Y. Zhang
    ORNL, Oak Ridge, Tennessee
  • D. A. Bartkoski
    UTK, Knoxville, Tennessee
  Funding: SNS is managed by UT-Battelle, LLC, under contract DE-AC05-00OR22725 for the U. S. Department of Energy.

The computational approach is providing essential guidance and analysis for the commissioning and operation of SNS. Computational models are becoming sufficiently realistic that detailed beam dynamics issues can now be studied quantitatively. Increasingly, we find that the biggest challenge in performing successful analyses is knowing and describing the machine and beam state accurately. Even so, successful benchmarks against both theoretical predictions and experimental results are leading to increased confidence in the capability of these models. With this confidence, computer codes are being employed in a predictive manner to guide machine operations. We illustrate these points with various examples taken from the SNS linac and ring.

 
 
MOM2IS02 Large Scale Parallel Wake Field Computations for 3D-Accelerator Structures with the PBCI Code simulation, vacuum, electromagnetic-fields, diagnostics 29
 
  • E. Gjonaj, X. Dong, R. Hampel, M. Kärkkäinen, T. Lau, W. F.O. Müller, T. Weiland
    TEMF, Darmstadt
  Funding: This work was partially funded by EUROTeV (RIDS-011899), EUROFEL (RIDS-011935), DFG (1239/22-3) and DESY Hamburg

The X-FEL project and the ILC require a high-quality beam with ultra-short electron bunches. In order to predict the beam quality in terms of both single-bunch energy spread and emittance, an accurate estimation of the short-range wake fields in the TESLA cryomodules, collimators and other geometrically complex accelerator components is necessary. We have presented earlier wake field computations for short bunches in rotationally symmetric components with the code ECHO. Most of the wake field effects in the accelerator, however, are due to geometrical discontinuities appearing in fully three-dimensional structures. For the purpose of simulating such structures, we have developed the Parallel Beam Cavity Interaction (PBCI) code. The new code is based on the full field solution of the Maxwell equations in the time domain for ultra-relativistic current sources. Using a specialized directional-splitting technique, PBCI produces particularly accurate results in wake field computations, owing to the dispersion-free integration of the discrete equations in the direction of bunch motion. One of the major challenges when simulating fully three-dimensional accelerator components is the huge computational effort needed to resolve both the geometrical details and the bunch extent on the computational grid. For this reason, PBCI implements massive parallelization in a distributed memory environment, based on a flexible domain decomposition method. In addition, PBCI uses the moving window technique, which is particularly well suited for wake potential computations in very long structures. As a particular example of such a structure, simulation results for a complete module of TESLA cavities with eight cells each, excited by a µm bunch, will be given.

 
 
MOA2IS01 The ORBIT Simulation Code: Benchmarking and Applications proton, simulation, space-charge, impedance 53
 
  • A. P. Shishlo, S. M. Cousineau, V. V. Danilov, J. Galambos, S. Henderson, J. A. Holmes, M. A. Plum
    ORNL, Oak Ridge, Tennessee
  Funding: SNS is managed by UT-Battelle, LLC, under contract DE-AC05-00OR22725 for the U. S. Department of Energy.

The contents, structure, implementation, benchmarking, and applications of ORBIT as an accelerator simulation code are described. Physics approaches, algorithms, and limitations for space charge, impedances, and electron cloud effects are discussed. ORBIT is a parallel code, and the scalability of the parallel algorithm implementations for the different physics modules is shown. ORBIT has a long history of benchmarking against both exactly solvable analytical problems and experimental data. The results of this benchmarking and the current usage of ORBIT are presented.

 
 
MOA2IS02 Simulations of Single Bunch Collective Effects Using HEADTAIL simulation, impedance, space-charge, single-bunch 59
 
  • G. Rumolo, E. Métral
    CERN, Geneva
  The HEADTAIL code is a very versatile tool that can be used for simulations of electron cloud induced instabilities as well as for Transverse Mode Coupling Instability and space charge studies. The effect of electron cloud and/or a conventional impedance (resonator or resistive wall) on a single bunch is modeled using a wake field approach. The code naturally allows either for dedicated studies of one single effect or for more complex studies of the interplay between different effects. Sample results from electron cloud studies (coherent and incoherent effects) and TMCI studies (e.g., for the PS and SPS) will be discussed in detail and compared, where possible, with results from other codes having similar features and/or with existing machine data.  
 
TUMPMP01 Simple Maps in Accelerator Simulations ion, proton, simulation, vacuum 81
 
  • S. Peggs, U. Iriso
    BNL, Upton, Long Island, New York
Difference systems (described by maps) exhibit much richer dynamical behavior than differential systems because of the emphasis they place on occasional "high-frequency" transient kicks. Thus, the standard map (with pulsed gravity) displays chaos, while the gravity pendulum does not. Maps also speed up simulations enormously by summarizing complex dynamics in short form. A new example of richer behavior, and of dramatic speed-up, comes from the representation of interacting electron clouds and ion clouds. Coupled maps are capable of demonstrating the first-order phase transitions (from cloud "off" to "on") that are sometimes observed in practice, and enable the extension of electron cloud simulation to include much more slowly evolving ion clouds.
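The contrast drawn in the abstract can be sketched in a few lines. Below is an illustrative implementation of the Chirikov standard map (generic textbook material, not code from the paper): for kick strength K = 0 the motion is an integrable rotation with conserved momentum, while for large K trajectories become predominantly chaotic.

```python
import math

def standard_map(theta, p, K, n):
    """Iterate the Chirikov standard map n times:
        p_{k+1}     = p_k + K * sin(theta_k)
        theta_{k+1} = (theta_k + p_{k+1}) mod 2*pi
    K = 0 recovers an integrable rotation; large K gives widespread chaos."""
    for _ in range(n):
        p = p + K * math.sin(theta)
        theta = (theta + p) % (2.0 * math.pi)
    return theta, p

# With K = 0 the "pulsed gravity" kick vanishes and p is an exact invariant.
theta, p = standard_map(1.0, 0.5, K=0.0, n=1000)
```

Each map iteration replaces an entire kick-plus-drift period of the underlying differential system, which is the source of the dramatic speed-up the abstract mentions.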
 
TUMPMP03 Beam Dynamics Studies for the High-Energy Storage Ring luminosity, target, antiproton, injection 96
 
  • A. Lehrach, B. Lorentz, R. Maier, D. Prasuhn, H. Stockhorst, R. Tölle, D. M. Welsch
    FZJ, Jülich
  • O. Boine-Frankenheim, R. W. Hasse, S. Sorge
    GSI, Darmstadt
  • F. Hinterberger
    Universität Bonn, Helmholtz-Institut für Strahlen- und Kernphysik, Bonn
  Funding: INTAS grant No. 03-54-5584 (Advanced Beam Dynamics for Storage Rings), EU FP6 contract No. 515873 (DIRAC Secondary-Beams)

The HESR is planned as an antiproton storage ring in the momentum range from 1.5 to 15 GeV/c. An important feature of this new facility is the combination of phase-space-cooled beams, utilizing electron and stochastic cooling, with dense internal targets (e.g. pellet targets). In this paper different beam dynamics issues such as closed orbit correction, the performance of cooled beams interacting with internal targets, and luminosity considerations are discussed with respect to the simulation codes utilized.

O. Boine-Frankenheim et al., Nucl. Instrum. Meth. A 560, 245 (2006).
A. Lehrach et al., Nucl. Instrum. Meth. A 561, 289 (2006).
F. Hinterberger, Jül-Report No. 4206 (2006), ISSN 0944-2952.

 
 
TUPPP01 DEE Voltage Calibration for the ACCEL Proton Therapy Cyclotron cyclotron, proton, vacuum, extraction 102
 
  • J. H. Timmer, P. A. Komorowski, H. Röcken
    ACCEL, Bergisch Gladbach
ACCEL Instruments GmbH has developed a superconducting cyclotron for use in proton therapy systems. An essential step during the commissioning of the medical cyclotron is the calibration and balancing of the DEE voltages. Using a very compact and low-cost X-ray detector, the bremsstrahlung spectrum of stray electrons accelerated by the four RF cavities has been measured. To determine the peak voltage, a regression analysis of the measured spectrum has been carried out using a non-linear multiple-convolution model taking into account the energy gain of the stray electrons between the liner and the DEE, the bremsstrahlung spectrum integrated over angle, the attenuation caused by the liner, and the limited detector resolution. The correlation between the model and the measurement was very good. A software tool enabling automatic spectrum acquisition and analysis, capable of online determination of the DEE voltages, has been developed in the LabVIEW graphical programming environment. Careful balancing of the DEE voltages resulted in better beam focusing and a cyclotron extraction efficiency larger than 80%. The absolute acceleration voltage has been confirmed by turn-separation measurements.
 
TUPPP10 Design and Modeling of Field-Emitter Arrays for a High Brilliance Electron Source emittance, cathode, simulation, space-charge 114
 
  • M. Dehler
    PSI, Villigen
The realization of compact Ångström-wavelength free electron lasers depends critically on the brilliance of their electron sources. Field emitters are attractive given their small emission surface and consequent high current density. The low emittance gun project (LEG) at PSI focuses on developing suitable field emitter arrays (FEA) with a dual-gate structure emitting a total current of 5.5 A out of a diameter of 500 microns with an emittance on the order of 50 nm rad. Simulations for idealized emitters show that, despite micron-scale variations of the charge density, a low emittance can be obtained by putting the FEA in a pulsed DC diode at 250 MV/m. The challenge lies in modelling all real-world effects in the individual field emitter and assembling these into a global emission model. Field emission is often labeled a cold emission process; nevertheless, quantum physical effects lead to a baseline energy spread on the order of 150 meV FWHM for the emitted electrons. Replenishing the conduction band with electrons from deep layers gives a further increase in the momentum spread. For the metallic field emitters used, surface roughness has an important influence on the emission properties. It typically gives an additional field enhancement factor of 2.5 to 3, resulting in lower required gate voltages, but it also has a detrimental effect on the transverse momentum spread. Work is in progress on obtaining numerical estimates for these effects using, among other things, secondary electron microscopy measurements. Furthermore, the extraction and focusing gates both give rise to nonlinear defocusing and focusing forces, which have to be minimized by careful geometric optimization. Combining all these effects gives a reliable parametrization of the individual emitters, which, together with a stochastic spatial distribution of emitter properties, is used in the global emission model.
 
TUPPP16 Integration of a Large-Scale Eigenmode Solver into the ANSYS(c) Workflow Environment resonance, free-electron-laser, laser, cyclotron 122
 
  • B. S.C. Oswald, A. Adelmann, M. Bopp, R. Geus
    PSI, Villigen
The numerical computation of eigenfrequencies and eigenmodal fields of large accelerator cavities, based on full-wave, three-dimensional models, has attracted considerable interest in the recent past. In particular, it is of vital interest to know the performance characteristics of such devices, such as resonance frequencies, quality factors and modal fields, prior to construction; given that the physical fabrication of a cavity is expensive and time-consuming, a device that does not comply with its specifications cannot be tolerated, and a robust and reliable digital prototyping methodology is therefore essential. Furthermore, modern cavity designs typically exhibit delicate and detailed geometrical features that must be considered for obtaining accurate results. At PSI a three-dimensional finite-element code, femaXX, has been developed to compute eigenvalues and eigenfields of accelerator cavities (*). While this code has been validated against experimentally measured cavity data, its usage has remained somewhat limited due to missing functionality for connecting it to industrial-grade modeling software. Such an interface allows creating advanced CAD geometries, meshing them in ANSYS, and eventually exporting and analyzing the design in femaXX. We have therefore developed pre- and postprocessing software which imports meshes generated in ANSYS for a femaXX run. A postprocessing step generates a result file that can be imported into ANSYS and further analyzed there. Thereby, we have integrated femaXX into the ANSYS workflow such that detailed cavity designs leading to large meshes can be analyzed with femaXX, taking advantage of its capability to address very large eigenvalue problems. Additionally, we have added functionality for parallel visualization to femaXX.
We present a practical application of the pre- and postprocessing codes and compare the results against experimental values, where available, and against other numerical codes where the model has no experimental counterpart.

* P. Arbenz, M. Becka, R. Geus, U. L. Hetmaniuk, and T. Mengotti,
"On a Parallel Multilevel Preconditioned Maxwell Eigensolver".
Parallel Computing, 32(2): 157-165 (2006).

 
 
TUPPP28 New 3D Space Charge Routines in the Tracking Code ASTRA space-charge, simulation, brightness, cathode 136
 
  • G. Pöplau
    Rostock University, Faculty of Engineering, Rostock
  • K. Floettmann
    DESY, Hamburg
  • U. van Rienen
    Rostock University, Faculty of Computer Science and Electrical Engineering, Rostock
  Funding: DESY Hamburg

Precise and fast 3D space-charge calculations for bunches of charged particles are of growing importance in recent accelerator designs. A widespread approach is the particle-mesh method, computing the potential of a bunch in its rest frame by means of Poisson's equation. Recently, new algorithms for solving Poisson's equation have been implemented in the tracking code ASTRA. These Poisson solvers are iterative algorithms solving the linear system of equations that results from the finite difference discretization of the Poisson equation. The implementation is based on the software package MOEVE (Multigrid Poisson Solver for Non-Equidistant Tensor Product Meshes) developed by G. Pöplau. The package contains a state-of-the-art multigrid Poisson solver adapted to space charge calculations. In this paper the basic concept of iterative Poisson solvers is described. It is compared to the established 3D FFT Poisson solver, a widely used method for space charge calculations that is also implemented in ASTRA. Advantages and disadvantages are discussed. Furthermore, the similarities and differences of both approaches are demonstrated with numerical examples.
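The iterative approach described above can be illustrated with a minimal 1D sketch (this is generic Jacobi relaxation, not the MOEVE implementation, which uses multigrid on non-equidistant 3D tensor-product meshes):

```python
import math

def jacobi_poisson_1d(f, h, iters):
    """Solve -u'' = f on (0,1) with u(0) = u(1) = 0 by Jacobi relaxation.
    The finite-difference system is (2u_i - u_{i-1} - u_{i+1})/h^2 = f_i;
    each sweep updates every point from its neighbors' previous values."""
    n = len(f)                    # number of interior grid points
    u = [0.0] * n
    for _ in range(iters):
        new = [0.0] * n
        for i in range(n):
            left = u[i - 1] if i > 0 else 0.0    # boundary value u = 0
            right = u[i + 1] if i < n - 1 else 0.0
            new[i] = 0.5 * (left + right + h * h * f[i])
        u = new
    return u

# Manufactured problem: f = pi^2 sin(pi x) has exact solution u = sin(pi x).
h = 1.0 / 16
f = [math.pi ** 2 * math.sin(math.pi * (i + 1) * h) for i in range(15)]
u = jacobi_poisson_1d(f, h, 3000)
```

Plain Jacobi needs very many sweeps to converge on fine grids; accelerating this convergence is exactly what multigrid (as in MOEVE) achieves, and what FFT-based solvers sidestep by solving in the spectral domain.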

 
 
TUPPP29 Charge Conservation for Split-Operator Methods in Beam Dynamics Simulations simulation, focusing, space-charge, electromagnetic-fields 140
 
  • T. Lau, E. Gjonaj, T. Weiland
    TEMF, Darmstadt
  Funding: DFG (1239/22-3) and DESY Hamburg

For devices in which the bunch dimensions are much smaller than the dimensions of the structure, the numerical field solution is typically hampered by spurious oscillations. The reason for these oscillations is the large numerical dispersion error of conventional schemes along the beam axis. Recently, several numerical schemes have been proposed which apply operator splitting to minimize and, under certain circumstances, eliminate the dispersion error in the direction of bunch motion. However, in comparison to the standard Yee scheme, the methods based on operator splitting do not conserve the standard discrete Gauss law. This contribution is dedicated to the construction of conserved discrete Gauss laws and conservative current interpolations for some of the split-operator methods. Finally, the application of the methods in PIC simulations is shown.
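The general principle of pairing the charge update with a matching current, so that a discrete continuity equation (and hence a discrete Gauss law) is conserved, can be illustrated in 1D. This is a hedged sketch of that principle only, not the interpolation constructed in the paper:

```python
def cic_density(x, q, n, dx):
    """Linear (cloud-in-cell) deposition of charge q at position x
    onto a periodic 1D grid with n cells of width dx."""
    rho = [0.0] * n
    s = x / dx
    i = int(s)
    w = s - i                      # fractional offset within cell i
    rho[i % n] += q * (1.0 - w) / dx
    rho[(i + 1) % n] += q * w / dx
    return rho

def conserving_current(rho_old, rho_new, dx, dt):
    """Face currents J[i] (between cells i and i+1) chosen so that
    (rho_new[i] - rho_old[i])/dt + (J[i] - J[i-1])/dx = 0 in every cell.
    Periodic closure works because total charge is conserved (the density
    changes sum to zero), so J[n-1] is ~0 and can stand in for J[-1]."""
    n = len(rho_old)
    J = [0.0] * n
    for i in range(n):
        prev = J[i - 1] if i > 0 else 0.0
        J[i] = prev - dx / dt * (rho_new[i] - rho_old[i])
    return J
```

If the field is then advanced with E[i] -= dt * J[i], the discrete divergence (E[i] - E[i-1])/dx changes by exactly rho_new[i] - rho_old[i], so a Gauss law satisfied initially stays satisfied at every step.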

 
 
TUAPMP04 Simulation of Secondary Electron Emission with CST Particle Studio(TM) simulation, electromagnetic-fields, vacuum, scattering 160
 
  • F. Hamme, U. Becker, P. Hammes
    CST, Darmstadt
In accelerator physics and high-power vacuum electronics, secondary electron emission (SEE) in many cases has an important influence on the physical behavior of the device. Since its analytical prediction is extremely cumbersome even for simple geometries, numerical simulation is essential to get a better understanding of the possible effects and of ideas for changing the design. The current paper introduces the implementation of SEE within the code CST Particle Studio (TM), an easy-to-use three-dimensional tool for the simulation of electromagnetic fields and charged particles. There are three basic types of secondary electrons: the elastically reflected, the rediffused, and the true secondary ones. The implemented SEE model is based on a probabilistic, mathematically self-consistent model developed by Furman and includes the three kinds of secondary electrons mentioned above. The paper presents simulation results with a focus on the SEE contribution to the absorbed power within an electron collector of a high-power tube. As a second example, the secondary emission process is studied within the superconducting TESLA cavity, which gives some hints for the understanding of multipactor effects in such cavity and filter structures.
 
WEMPMP01 Computational Needs for XFELS undulator, emittance, simulation, space-charge 164
 
  • M. Dohlus
    DESY, Hamburg
X-ray Free Electron Lasers (FELs) make use of the principle of Self-Amplified Spontaneous Emission (SASE), where electron bunches interact in an undulator with their own co-propagating radiation. They do not require optical resonators, and their frequency is therefore not limited by material properties such as the reflectivity of mirrors. The performance of X-ray SASE FELs depends exponentially on the beam quality of the electron bunch. Therefore, effects in the beamline before the undulator are as important as the particle-field interactions of the SASE FEL process. Critical components are the low-emittance electron source, the accelerating sections, the bunch compression system and the undulator. Due to the high peak currents and small beam dimensions, space charge (SC) effects have to be considered up to energies in the GeV range. Coherent synchrotron radiation (CSR) not only drives the FEL but is also emitted in dispersive sections such as bunch compressors. SC, CSR, and wake fields significantly affect longitudinal beam parameters (peak current, correlated and uncorrelated energy spread) and the transverse emittance. Start-to-end simulations use a sequence of various tracking codes (with or without SC, CSR and wake fields) and FEL programs. Usually the particle or phase space information has to be carefully converted at each transition from one tool to another. Parameter studies need many simulations of the complete system or parts of it; beyond that, calculations with several random seeds are necessary to account for the stochastic nature of the SASE FEL process.
 
WEPPP01 Recent Developments in IMPACT and Application to Future Light Sources simulation, linac, lattice, space-charge 182
 
  • I. V. Pogorelov, J. Qiang, R. D. Ryne, M. Venturini, A. Zholents
    LBNL, Berkeley, California
  • R. L. Warnock
    SLAC, Menlo Park, California
The Integrated Map and Particle Accelerator Tracking (IMPACT) code suite was originally developed to model beam dynamics in ion linear accelerators. It has been greatly enhanced and now includes a linac design code, a 3D rms envelope code and two parallel particle-in-cell (PIC) codes: IMPACT-T, a time-based code, and IMPACT-Z, a z-coordinate-based code. Recently, the code suite has been increasingly used in simulations of high-brightness electron beams for future light sources. These simulations, performed using up to 100 million macroparticles, include effects related to nonlinear magnetic optics, rf structure wake fields, 3D self-consistent space charge, and coherent synchrotron radiation (at present a 1D model). An application to the simulation of the microbunching instability is illustrated. We conclude with plans for further developments pertinent to future light sources.
 
WEPPP07 Phase Space Tomography Diagnostics at the PITZ Facility emittance, simulation, diagnostics, quadrupole 194
 
  • G. Asova, S. Khodyachykh, M. Krasilnikov, F. Stephan
    DESY Zeuthen, Zeuthen
  • P. Castro, F. Loehl
    DESY, Hamburg
  Funding: This work has partly been supported by the European Community, contract 011935 (EUROFEL)

A high phase-space density of the electron beam is obligatory for the successful operation of a Self-Amplified Spontaneous Emission Free Electron Laser (SASE FEL). Detailed knowledge of the phase-space density distribution is thus very important for characterizing the performance of the electron sources used. The Photo Injector Test Facility at DESY in Zeuthen (PITZ) was built to develop, operate and optimize electron sources for FELs. Currently a tomography module for PITZ is under design as part of the ongoing upgrade of the facility. This contribution studies the performance of the tomography module. Errors in the beam size measurements and their contribution to the calculated emittance will be studied using simulated data. As a practical application, the Maximum Entropy Algorithm (MENT) will be used to reconstruct data generated by an ASTRA simulation.

 
 
WEPPP15 Simulations of Pellet Target Effects with Program PETAG01 target, simulation, storage-ring, antiproton 216
 
  • A. Dolinskii, V. Gostishchev, M. Steck
    GSI, Darmstadt
  • O. A. Bezshyyko
    National Taras Shevchenko University of Kyiv, The Faculty of Physics, Kyiv
New internal targets play an important role in modern nuclear and high-energy physics research. One such target is the pellet target, a variant of a micro-particle internal target. This target has a number of very attractive features when used in a storage ring. The software package PETAG01 has been developed for modelling the pellet target and can be used for numerical calculations of the interaction of a circulating beam with the target in a storage ring. We present numerical calculations studying the beam dynamics of ions in a storage ring where strong cooling techniques are applied in combination with the pellet target. Important effects due to the target in combination with electron cooling, and their influence on the beam parameters, have been considered.
 
WEPPP17 Tracking Code with 3D Space Charge Calculations Taking into Account the Elliptical Shape of the Beam Pipe space-charge, simulation, damping, positron 220
 
  • A. Markovik, G. Pöplau, U. van Rienen
    Rostock University, Faculty of Computer Science and Electrical Engineering, Rostock
  • R. Wanzenberg
    DESY, Hamburg
  Funding: Work supported by DESY, Hamburg

The determination of electron cloud instability thresholds is a task with high priority among the ILC damping ring research and development objectives. Simulations of electron cloud instabilities are therefore essential. In this paper a new particle tracking program is presented which includes the Poisson solver MOEVE for space charge calculations. Recently, perfectly electrically conducting beam pipes with arbitrary elliptical cross sections have been implemented as boundary conditions in the Poisson solver package MOEVE. The 3D space charge algorithm taking into account a beam pipe of elliptical shape will be presented along with numerical test cases. The routine has also been implemented in the program code ASTRA, and we compare tracking results obtained with both routines.

 
 
WESEPP03 High-Order Algorithms for Simulation of Laser Wakefield Accelerators simulation, laser, emittance 230
 
  • D. L. Bruhwiler, J. R. Cary, D. A. Dimitrov, P. Messmer
    Tech-X, Boulder, Colorado
  • E. Esarey, C. G.R. Geddes
    LBNL, Berkeley, California
  • E. Kashdan
    Brown University, Providence, Rhode Island
  Funding: This work is funded by the US DOE Office of Science, Office of High Energy Physics, including use of NERSC.

Electromagnetic particle-in-cell (PIC) simulations of laser wakefield accelerator (LWFA) experiments have shown great success recently, qualitatively capturing many exciting features, like the production of ~1 GeV electron beams with significant charge, moderate energy spread and remarkably small emittance. Such simulations require large clusters or supercomputers for full-scale 3D runs, and all state-of-the-art codes use similar algorithms with 2nd-order accuracy in space and time. Very high grid resolution and, hence, a very large number of time steps are required to obtain converged results. We present preliminary results from the implementation and testing of 4th-order algorithms, which hold promise for dramatically improving the accuracy of future LWFA simulations.
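The accuracy gain at stake can be seen already in a short comparison of 2nd- and 4th-order central differences (a generic illustration of convergence order, not the authors' scheme): halving the step cuts the 2nd-order error by roughly 4x but the 4th-order error by roughly 16x.

```python
import math

def d1_2nd(f, x, h):
    # Second-order central difference: leading error ~ h^2 * f'''(x) / 6
    return (f(x + h) - f(x - h)) / (2.0 * h)

def d1_4th(f, x, h):
    # Fourth-order central difference: leading error ~ h^4 * f^(5)(x) / 30
    return (-f(x + 2*h) + 8*f(x + h) - 8*f(x - h) + f(x - 2*h)) / (12.0 * h)

exact = math.cos(1.0)   # derivative of sin at x = 1
e2 = [abs(d1_2nd(math.sin, 1.0, h) - exact) for h in (0.1, 0.05)]
e4 = [abs(d1_4th(math.sin, 1.0, h) - exact) for h in (0.1, 0.05)]
# e2[0]/e2[1] is close to 4; e4[0]/e4[1] is close to 16
```

The same scaling is why a higher-order field solve can reach converged LWFA results on a much coarser grid, at the cost of a wider stencil per update.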

 
 
WESEPP04 The ORBIT Simulation Code: Benchmarking and Applications simulation, space-charge 231
 
  • J. A. Holmes, S. M. Cousineau, V. V. Danilov, J. Galambos, S. Henderson, M. A. Plum, A. P. Shishlo
    ORNL, Oak Ridge, Tennessee
  The contents, structure, implementation, benchmarking, and applications of ORBIT as an accelerator simulation code are described. Physics approaches, algorithms, and limitations for space charge, impedances, and electron cloud effects are discussed. ORBIT is a parallel code, and the scalability of the parallel algorithm implementations for the different physics modules is shown. ORBIT has a long history of benchmarking against both exactly solvable analytical problems and experimental data. The results of this benchmarking and the current usage of ORBIT are presented.
 
WEA1MP01 Parallel Simulation of Coulomb Collisions for High-Energy Electron Cooling Systems ion, luminosity, heavy-ion, simulation 233
 
  • D. L. Bruhwiler
    Tech-X, Boulder, Colorado
  Funding: This work is funded by the US DOE Office of Science, Office of Nuclear Physics.

High-energy electron cooling requires co-propagation of relativistic electrons over many meters with the recirculating bunches of an ion collider ring. The expected increase of ion beam luminosity makes such systems a key component for proposed efforts like the RHIC luminosity upgrade* and the FAIR project**. Correctly simulating the dynamical friction of heavy ions, during brief interactions with low-density electron populations, in the presence of arbitrary electric and magnetic fields, requires a molecular dynamics approach that resolves close Coulomb collisions. Effective use of clusters and supercomputers is required to make such computations practical. Previous work*** will be reviewed. Recent algorithmic developments**** and future plans will be emphasized.

* http://www.bnl.gov/cad/ecooling
** http://www.gsi.de/GSI-Future/cdr
*** A. V. Fedotov et al., Phys. Rev. ST/AB (2006), in press.
**** G. I. Bell et al., AIP Conf. Proc. 821 (2006), p. 329.

 
 
WEA3MP02 Self-Consistent Simulations of High-Intensity Beams and E-Clouds with WARP POSINST simulation, ion, plasma, collider 256
 
  • J.-L. Vay
    LBNL, Berkeley, California
  • A. Friedman, D. P. Grote
    LLNL, Livermore, California
  Funding: Supported by U. S. Department of Energy under Contracts No. DE-AC02-05CH11231 and No. W-7405-Eng-48 and by US-LHC accelerator research program (LARP).

We have developed a new, comprehensive set of simulation tools aimed at modeling the interaction of intense ion beams and electron clouds (e-clouds). The set contains the 3-D accelerator PIC code WARP and the 2-D "slice" e-cloud code POSINST, as well as a merger of the two, augmented by new modules for impact ionization and neutral gas generation. The new capability runs on workstations or parallel supercomputers and contains advanced features such as mesh refinement, disparate adaptive time stepping, and a new "drift-Lorentz" particle mover for tracking charged particles in magnetic fields using large time steps. It is being applied to the modeling of ion beams (1 MeV, 180 mA, K+) for heavy ion inertial fusion and warm dense matter studies, as they interact with electron clouds in the High-Current Experiment (HCX). We describe the capabilities and present recent simulation results with detailed comparisons against the HCX experiment, as well as their application (in a different regime) to the modeling of e-clouds in the Large Hadron Collider (LHC).

 
 
WEA3MP03 Benchmarking of Space Charge Codes Against UMER Experiments space-charge, simulation, diagnostics, cathode 263
 
  • R. A. Kishek, G. Bai, B. L. Beaudoin, S. Bernal, D. W. Feldman, R. B. Fiorito, T. F. Godlove, I. Haber, P. G. O'Shea, C. Papadopoulos, B. Quinn, M. Reiser, D. Stratakis, D. F. Sutter, J. C.T. Thangaraj, K. Tian, M. Walter, C. Wu
    IREAP, College Park, Maryland
  Funding: This work is funded by US Dept. of Energy and by the US Dept. of Defense Office of Naval Research.

The University of Maryland Electron Ring (UMER) is a scaled electron recirculator using low-energy (10 keV) electrons to maximize the space charge forces for beam dynamics studies. We have recently circulated in UMER the highest-space-charge beam in a ring to date, achieving a breakthrough both in the number of turns and in the amount of current propagated. As of the time of submission, we have propagated 5 mA for at least 10 turns and, with some loss, for over 50 turns, meaning about 0.5 nC of electrons survive for 10 microseconds. This makes UMER an attractive candidate for benchmarking space charge codes in regimes of extreme space charge. This talk will review the UMER design and available diagnostics, provide examples of benchmarking the particle-in-cell code WARP on UMER data, and give an overview of the detailed information on our website. An open dialogue with interested code developers is solicited.

 
 
THM1MP02 Parallel Particle-In-Cell (PIC) Codes simulation, diagnostics, emittance, laser 290
 
  • F. Wolfheimer, E. Gjonaj, T. Weiland
    TEMF, Darmstadt
  Funding: This work has been partially supported by DESY Hamburg.

Particle-In-Cell (PIC) simulations are commonly used in the field of computational accelerator physics for modelling the interaction of electromagnetic fields and charged particle beams in complex accelerator geometries. However, the practicability of the method for real-world simulations is often limited by the huge size of accelerator devices and by the large number of computational particles needed to obtain accurate simulation results. Thus, parallelization of the computations becomes necessary to permit the solution of such problems in a reasonable time. Different algorithms allowing for efficient parallel simulation, which preserve an equal distribution of the computational workload across processes while minimizing interprocess communication, are presented. This includes some already known approaches based on a domain decomposition technique as well as novel schemes. The performance of the algorithms is studied in different computational environments with simulation examples, including a full 3D simulation of the PITZ injector [*].

* A. Oppelt et al., "Status and First Results from the Upgraded PITZ Facility", Proc. FEL 2005.
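One common way to keep the particle workload balanced in a domain-decomposed PIC code is to place subdomain boundaries at quantiles of the particle distribution. The sketch below is an illustrative 1D version of that idea under assumed interfaces, not one of the paper's schemes: it picks cut positions from a cumulative particle-count histogram so that each process receives roughly the same number of particles.

```python
def balanced_splits(positions, nproc, xmin, xmax, ncells=256):
    """Return nproc-1 cut positions along x so that each of the nproc
    subdomains holds roughly the same number of particles (the cuts
    are cell-granular, so the balance is approximate)."""
    dx = (xmax - xmin) / ncells
    hist = [0] * ncells
    for x in positions:                       # particle-count histogram
        i = min(int((x - xmin) / dx), ncells - 1)
        hist[i] += 1
    target = len(positions) / nproc           # ideal particles per process
    splits, cum, want = [], 0, target
    for i, h in enumerate(hist):
        cum += h                              # cumulative particle count
        while cum >= want and len(splits) < nproc - 1:
            splits.append(xmin + (i + 1) * dx)   # cut at this cell's edge
            want += target
    return splits
```

In a real parallel code the histogram would be reduced across processes, and the decomposition recomputed only when the measured imbalance exceeds a threshold, since each repartitioning forces particle migration between processes.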

 
 
THMPMP03 Accelerator Modeling under SciDAC: Meeting the Challenges of Next-Generation Accelerator Design, Analysis, and Optimization simulation, plasma, space-charge, booster 315
 
  • P. Spentzouris
    Fermilab, Batavia, Illinois
  Under the US DOE Scientific Discovery through Advanced Computing (SciDAC) initiative, a new generation of parallel simulation codes has been developed to meet the most demanding accelerator modeling problems of the DOE Office of Science (DOE/SC). Originally sponsored by DOE/SC's Office of High Energy Physics in collaboration with the Office of Advanced Scientific Computing Research, the new simulation capabilities have also been applied to other DOE projects and to international projects as well. The new software has been applied to many projects, including the Tevatron, PEP-II, LHC, ILC, the Fermilab Booster, SNS, the J-PARC project, the CERN SPL, many photoinjectors, and the FERMI@Elettra project. Codes have also been developed to model laser wakefield accelerators and plasma wakefield accelerators; these codes are being used both in support of advanced accelerator experiments and to provide insight into the physics of ultra-high-gradient accelerators. In this talk I will provide an overview of the computational capabilities that have been developed under our SciDAC project and describe our plans for code development under the next phase of SciDAC.