Paper | Title | Other Keywords | Page
---|---|---|---
MOM2IS01 | A Highly Accurate 3-D Magnetic Field Solver | multipole | 28

We present a new high-precision parallel three-dimensional magnetic field solver. This tool decomposes the problem of solving the Poisson equation into solving the Laplace equation and finding the magnetic field due to an arbitrary current distribution. The underlying theory for solving both problems using Differential Algebraic (DA) methods is developed, resulting in a local field expansion that can be computed to arbitrary order. Using the remainder differential algebraic approach, it is also possible to obtain fully rigorous and sharp estimates of the approximation errors. The method provides a natural multipole decomposition of the field, which is required for the computation of transfer maps, and also yields very accurate finite-element representations with very small numbers of cells. The method has the unique advantage of always producing purely Maxwellian fields, and it connects naturally to high-order DA-based map integration tools. We demonstrate the utility of this field solver in the design and analysis of a novel combined-function multipole with an elliptic cross section that can simplify the correction of aberrations in large-acceptance fragment separators for radioactive ion accelerators.
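The abstract's multipole decomposition of a Laplace-equation solution can be illustrated without the paper's DA machinery. The sketch below uses the classical 2-D special case: a harmonic field written as an analytic function of z = x + iy has its multipole coefficients given by the Fourier series of field samples on a circle. This is illustrative only (the field function, radius, and gradient value are made-up test inputs, not from the paper):

```python
import numpy as np

def multipole_coefficients(field, radius=0.01, n_samples=64, max_order=8):
    """Return c_n in B_y + i B_x = sum_n c_n z^n from samples on |z| = radius."""
    theta = 2 * np.pi * np.arange(n_samples) / n_samples
    z = radius * np.exp(1j * theta)          # sample points on the circle
    c = np.fft.fft(field(z)) / n_samples     # Fourier coefficients c_n * r^n
    n = np.arange(max_order)
    return c[:max_order] / radius**n         # undo the r^n scaling

# A pure quadrupole with gradient g: B_y + i B_x = g * z
g = 25.0  # T/m (arbitrary test value)
coeffs = multipole_coefficients(lambda z: g * z)
print(np.round(coeffs[:4].real, 6))  # only the n=1 (quadrupole) term survives
```

Because the sampled field is exactly harmonic, the extracted spectrum is pure: every coefficient except the quadrupole term vanishes to round-off, which is the 2-D analogue of the "purely Maxwellian" property the abstract emphasizes.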

Slides

TUMPMP01 | Simple Maps in Accelerator Simulations | electron, proton, simulation, vacuum | 81

Difference systems (described by maps) exhibit much richer dynamical behavior than differential systems because of the emphasis they place on occasional "high-frequency" transient kicks. Thus, the standard map (with pulsed gravity) displays chaos, while the gravity pendulum does not. Maps also speed up simulations enormously by summarizing complex dynamics in short form. A new example of richer behavior, and of dramatic speed-up, comes from the representation of interacting electron clouds and ion clouds. Coupled maps are capable of demonstrating the first-order phase transitions (from cloud "off" to "on") that are sometimes observed in practice, and they enable the extension of electron cloud simulation to include much more slowly evolving ion clouds.
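The standard (Chirikov) map the abstract contrasts with the gravity pendulum is short enough to state in full; this is a generic sketch with illustrative parameter values, not the coupled electron/ion-cloud maps of the paper:

```python
import numpy as np

def standard_map(theta, p, K, n_turns):
    """Iterate the kicked rotor: p -> p + K sin(theta), theta -> theta + p (mod 2pi)."""
    history = []
    for _ in range(n_turns):
        p = p + K * np.sin(theta)             # the pulsed-gravity kick
        theta = (theta + p) % (2 * np.pi)     # free drift between kicks
        history.append((theta, p))
    return history

# One map evaluation replaces integrating the kicked ODE over a full period,
# which is the source of the enormous speed-up the abstract describes.
orbit = standard_map(theta=0.5, p=0.2, K=0.9, n_turns=1000)
print(len(orbit))
```

For small kick strength K the motion stays regular; above K of order 1, chaotic trajectories appear, which is the richer behavior unavailable to the continuous pendulum.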

Slides

WEMPMP02 | Wish-List for Large-Scale Simulations for Future Radioactive Beam Facilities | simulation, heavy-ion, linac, diagnostics | 170

Funding: This work is supported by the U. S. Department of Energy under contract W-31-109-Eng-38.
As accelerator facilities become more complex and demanding and computational capabilities become ever more powerful, there is the opportunity to develop and apply very large-scale simulations to dramatically increase the speed and effectiveness of many aspects of the design, commissioning, and finally the operational stages of future projects. Next-generation radioactive beam facilities are particularly demanding and stand to benefit greatly from large-scale, integrated simulations of essentially all aspects or components. These demands stem from factors such as the increased complexity of the facilities, which will involve, for example, multiple-charge-state heavy-ion acceleration; stringent limits on beam halos and losses from high-power beams; thermal problems due to high power densities in targets and beam dumps; and radiological issues associated with component activation and radiation damage. Currently, many of the simulations necessary for design optimization are done by different codes, and even separate physics groups, so that the process proceeds iteratively for the different aspects. There is a strong need, for example, to couple the beam dynamics simulation codes with the radiological and shielding codes so that an integrated picture of their interactions emerges seamlessly and trouble spots in the design are identified easily. This integration is especially important in magnetic devices such as heavy-ion fragment separators that are subject to radiation and thermal damage. For complex, high-power accelerators there is also the need to fully integrate the control system and beam diagnostics devices with a real-time beam dynamics simulation to keep the tunes optimized without the need for continuous operator feedback. This will most likely require on-line peta-scale computer simulations. The ultimate goal is to optimize performance while increasing the cost-effectiveness and efficiency of both the design and operational stages of future facilities.

Slides

WEPPP10 | Implementation of the DYNAMION Code to the End-To-End Beam Dynamics Simulations for the GSI Proton and Heavy Ion Linear Accelerators | rfq, simulation, linac, emittance | 201

The advanced multi-particle code DYNAMION is suited to calculating beam dynamics in linear accelerators and transport lines under space-charge conditions with high accuracy. Special features, such as the inclusion of field measurements, misalignment and fabrication errors, and data from the real topology of the RFQ electrodes, drift tubes, and quadrupole lenses, lead to reliable beam dynamics simulations. End-to-end simulations of the whole linac (from ion source extraction to the synchrotron entrance) allow for the investigation and optimization of the overall machine performance, as well as for the calculation of the expected impact of different upgrade measures proposed to improve beam brilliance. Recently, the DYNAMION code has been applied to investigate the beam dynamics of the different GSI linacs: the high-current heavy-ion UNILAC, the high-current proton linac for the future Facility for Antiproton and Ion Research (FAIR) at Darmstadt, and the light-ion accelerator for cancer therapy (HICAT), to be commissioned in Heidelberg (Germany) in the near future. Recent results of beam dynamics simulations by means of the DYNAMION code are presented. The proposed upgrade measures, as well as tuning and optimization of the linacs, are discussed.
WEA1MP01 | Parallel Simulation of Coulomb Collisions for High-Energy Electron Cooling Systems | electron, luminosity, heavy-ion, simulation | 233

Funding: This work is funded by the US DOE Office of Science, Office of Nuclear Physics.
High-energy electron cooling requires co-propagation of relativistic electrons over many meters with the recirculating bunches of an ion collider ring. The expected increase of ion beam luminosity makes such systems a key component for proposed efforts like the RHIC luminosity upgrade* and the FAIR project**. Correctly simulating the dynamical friction of heavy ions, during brief interactions with low-density electron populations, in the presence of arbitrary electric and magnetic fields, requires a molecular dynamics approach that resolves close Coulomb collisions. Effective use of clusters and supercomputers is required to make such computations practical. Previous work*** will be reviewed. Recent algorithmic developments**** and future plans will be emphasized.
* http://www.bnl.gov/cad/ecooling
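The size of an individual Coulomb collision that such molecular-dynamics codes must resolve can be estimated with the textbook impulse approximation (straight-line trajectory, transverse momentum transfer 2 q_i q_e / (4 pi eps0 b v)). This is background only, not the paper's algorithm, and the ion species, impact parameter, and relative speed below are hypothetical illustrative inputs:

```python
import math

E_CHARGE = 1.602176634e-19   # elementary charge, C
EPS0 = 8.8541878128e-12      # vacuum permittivity, F/m
M_E = 9.1093837015e-31       # electron mass, kg

def perp_velocity_kick(Z_ion, b, v_rel):
    """Electron transverse velocity kick from an ion of charge Z_ion*e
    passing at impact parameter b (m) with relative speed v_rel (m/s)."""
    dp = 2 * Z_ion * E_CHARGE**2 / (4 * math.pi * EPS0 * b * v_rel)
    return dp / M_E

# Hypothetical example: a gold-like ion (Z = 79) passing an electron
dv = perp_velocity_kick(Z_ion=79, b=1e-6, v_rel=1e6)
print(f"{dv:.3e} m/s")
```

The kick scales as 1/(b v), so close, slow encounters dominate the dynamical friction; resolving those is what makes the problem expensive enough to need clusters and supercomputers, as the abstract notes.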

Slides

WEA3MP02 | Self-Consistent Simulations of High-Intensity Beams and E-Clouds with WARP POSINST | electron, simulation, plasma, collider | 256

Funding: Supported by U. S. Department of Energy under Contracts No. DE-AC02-05CH11231 and No. W-7405-Eng-48 and by US-LHC accelerator research program (LARP).
We have developed a new, comprehensive set of simulation tools aimed at modeling the interaction of intense ion beams and electron clouds (e-clouds). The set contains the 3-D accelerator PIC code WARP and the 2-D "slice" e-cloud code POSINST, as well as a merger of the two, augmented by new modules for impact ionization and neutral gas generation. The new capability runs on workstations or parallel supercomputers and contains advanced features such as mesh refinement, disparate adaptive time stepping, and a new "drift-Lorentz" particle mover for tracking charged particles in magnetic fields using large time steps. It is being applied to the modeling of ion beams (1 MeV, 180 mA, K+) for heavy ion inertial fusion and warm dense matter studies, as they interact with electron clouds in the High-Current Experiment (HCX). We describe the capabilities and present recent simulation results with detailed comparisons against the HCX experiment, as well as their application (in a different regime) to the modeling of e-clouds in the Large Hadron Collider (LHC).
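The "drift-Lorentz" mover mentioned above is a specialized large-time-step scheme; as background, the following sketch shows the standard Boris push that PIC codes conventionally build on (this is the generic textbook algorithm, not the paper's mover, and the field and particle values are illustrative):

```python
import numpy as np

def boris_push(v, E, B, q_over_m, dt):
    """Advance velocity v (m/s) by one step dt in fields E (V/m) and B (T)."""
    v_minus = v + 0.5 * q_over_m * dt * E        # first half electric kick
    t = 0.5 * q_over_m * dt * B                  # rotation vector
    s = 2 * t / (1 + np.dot(t, t))
    v_prime = v_minus + np.cross(v_minus, t)     # magnetic rotation,
    v_plus = v_minus + np.cross(v_prime, s)      # energy-conserving form
    return v_plus + 0.5 * q_over_m * dt * E      # second half electric kick

# Pure magnetic field: the push rotates v without changing its magnitude,
# the property that makes Boris integration robust for gyromotion.
v0 = np.array([1e5, 0.0, 0.0])
B = np.array([0.0, 0.0, 1.0])
E = np.zeros(3)
v1 = boris_push(v0, E, B, q_over_m=9.58e7, dt=1e-10)  # proton-like q/m
print(np.linalg.norm(v1))  # equals |v0| to machine precision
```

The explicit Boris step must resolve the gyration, i.e. keep q/m |B| dt small; the drift-Lorentz mover's purpose, per the abstract, is precisely to relax that restriction in strong magnetic fields.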

Slides