Paper | Title | Page |
---|---|---|
TUCOAAB01 | Status of the National Ignition Facility (NIF) Integrated Computer Control and Information Systems | 483 |
Funding: This work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344. #LLNL-ABS-631632 The National Ignition Facility (NIF) is operated by the Integrated Computer Control System, an object-oriented, CORBA-based system distributed among over 1800 front-end processors, embedded controllers and supervisory servers. At present, NIF operates 24x7 and conducts a variety of fusion, high-energy-density and basic science experiments. During the past year, the control system was expanded to include a variety of new diagnostic systems, programmable laser beam shaping, and parallel shot automation for more efficient shot operations. The system is also currently being expanded with an Advanced Radiographic Capability, which will provide short (<10 picoseconds), ultra-high-power (>1 petawatt) laser pulses that will be used for a variety of diagnostic and experimental capabilities. Additional tools have been developed to support experimental planning, experimental setup, facility configuration and post-shot analysis, using open-source software, commercial workflow tools, database and messaging technologies. This talk discusses the current status of the control and information systems to support a wide variety of experiments being conducted on NIF, including ignition experiments. |
Slides TUCOAAB01 [4.087 MB] | |
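The TUCOAAB01 abstract describes an object-oriented, CORBA-based control system distributed over front-end processors, embedded controllers and supervisory servers. As a rough illustration only, the Java sketch below models the kind of device contract a supervisory layer might call; the interface, method names, and settings are hypothetical and are not taken from the ICCS code base, which exposes comparable operations as CORBA remote objects rather than local Java classes.

```java
// Hypothetical device contract for a distributed, object-oriented control system.
// Names and operations are illustrative only; the real ICCS exposes comparable
// functionality through CORBA remote interfaces rather than local Java objects.
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

interface FrontEndProcessor {
    String name();
    String status();                              // e.g. "IDLE", "ARMED", "FAULT"
    void configure(Map<String, String> settings); // push setpoints to the device
    void arm();                                   // prepare the device for the next shot
}

class SimulatedFep implements FrontEndProcessor {
    private final String name;
    private final Map<String, String> settings = new ConcurrentHashMap<>();
    private volatile String status = "IDLE";

    SimulatedFep(String name) { this.name = name; }
    public String name() { return name; }
    public String status() { return status; }
    public void configure(Map<String, String> s) { settings.putAll(s); }
    public void arm() { status = "ARMED"; }
}

public class SupervisorDemo {
    public static void main(String[] args) {
        FrontEndProcessor fep = new SimulatedFep("preamp-module-07"); // invented name
        fep.configure(Map.of("pulseShape", "ramp-2ns", "gain", "0.82"));
        fep.arm();
        System.out.println(fep.name() + " -> " + fep.status());
    }
}
```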
TUPPC072 | Flexible Data Driven Experimental Data Analysis at the National Ignition Facility | 747 |
Funding: This work was performed under the auspices of the Lawrence Livermore National Security, LLC, (LLNS) under Contract No. DE-AC52-07NA27344. #LLNL-ABS-632532 After each target shot at the National Ignition Facility (NIF), scientists require analysis of data from ~50 diagnostic instrument systems within 30 minutes. To meet this goal, NIF engineers created the Shot Data Analysis (SDA) Engine based on the Oracle Business Process Execution Language (BPEL) platform. While this provided a very powerful and flexible analysis product, it still required engineers conversant in software development practices to create the configurations executed by the SDA engine. As more and more diagnostics were developed and the demand for analysis increased, the development staff was not able to keep pace. To solve this problem, the Data Systems team took the approach of creating a database-table-based scripting language that allows users to define an analysis configuration of inputs, feed the data into standard processing algorithms, and store the outputs in a database. The creation of the Data Driven Engine (DDE) has substantially decreased the development time for new analyses and simplified maintenance of existing configurations. The architecture and functionality of the Data Driven Engine will be presented along with examples. |
Poster TUPPC072 [1.150 MB] | |
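The TUPPC072 abstract describes analysis configurations defined as database rows that name inputs, standard algorithms, and output destinations. The Java sketch below shows one plausible way such a data-driven dispatch could work; the ConfigRow fields, algorithm names, and in-memory stand-ins for the database tables are illustrative assumptions, not the actual SDA/DDE schema.

```java
// Minimal sketch of a data-driven analysis dispatcher. The ConfigRow record stands
// in for a row of a (hypothetical) configuration table: input signal name, standard
// algorithm to apply, and the name under which the result is stored.
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Function;

public class DataDrivenEngineSketch {

    record ConfigRow(String inputName, String algorithm, String outputName) {}

    // Registry of standard processing algorithms keyed by name.
    static final Map<String, Function<double[], Double>> ALGORITHMS =
        Map.<String, Function<double[], Double>>of(
            "peak",     signal -> Arrays.stream(signal).max().orElse(Double.NaN),
            "mean",     signal -> Arrays.stream(signal).average().orElse(Double.NaN),
            "integral", signal -> Arrays.stream(signal).sum());

    public static void main(String[] args) {
        // Rows that would normally be read from the configuration database.
        List<ConfigRow> config = List.of(
            new ConfigRow("detector_A_trace", "peak", "detector_A_peak"),
            new ConfigRow("detector_A_trace", "integral", "detector_A_fluence"));

        // Raw data that would normally come from the shot archive.
        Map<String, double[]> shotData =
            Map.of("detector_A_trace", new double[] {0.1, 0.7, 1.9, 0.4});

        // Apply each configured algorithm; a real engine would persist the outputs.
        Map<String, Double> results = new LinkedHashMap<>();
        for (ConfigRow row : config) {
            Function<double[], Double> algo = ALGORITHMS.get(row.algorithm());
            results.put(row.outputName(), algo.apply(shotData.get(row.inputName())));
        }
        System.out.println(results); // prints the peak and integral keyed by output name
    }
}
```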
TUPPC126 | Visualization of Experimental Data at the National Ignition Facility | 879 |
Funding: * This work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344. #LLNL-ABS-633252 An experiment on the National Ignition Facility (NIF) may produce hundreds of gigabytes of target diagnostic data. Raw and analyzed data are accumulated into the NIF Archive database. The Shot Data Systems team provides alternatives for accessing data including a web-based data visualization tool, a virtual file system for programmatic data access, a macro language for data integration, and a Wiki to support collaboration. The data visualization application in particular adapts dashboard user-interface design patterns popularized by the business intelligence software community. The dashboard canvas provides the ability to rapidly assemble tailored views of data directly from the NIF archive. This design has proven capable of satisfying most new visualization requirements in near real-time. The separate file system and macro feature-set support direct data access from a scientist’s computer using scientific languages such as IDL, Matlab and Mathematica. Underlying all these capabilities is a shared set of web services that provide APIs and transformation routines to the NIF Archive. The overall software architecture will be presented with an emphasis on data visualization. |
Poster TUPPC126 [4.900 MB] | |
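TUPPC126 notes that a shared set of web services exposes the NIF Archive to the dashboard, the virtual file system, and users' own scripts. The following Java 11+ sketch shows generic programmatic access over HTTP in that spirit; the endpoint URL, path segments, and JSON payload are hypothetical, since the abstract does not document the real API.

```java
// Illustrative sketch of programmatic access to archived shot data through a
// REST web service. The URL and response layout are hypothetical examples.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class ArchiveQuerySketch {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create("https://archive.example/api/shots/N130927/diagnostics/neutron_yield"))
            .header("Accept", "application/json")
            .GET()
            .build();

        HttpResponse<String> response =
            client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println("HTTP " + response.statusCode());
        System.out.println(response.body());   // JSON payload to plot or post-process
    }
}
```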
WECOBA05 | Understanding NIF Experimental Results: NIF Target Diagnostic Automated Analysis Recent Accomplishments | 1008 |
Funding: This work was performed under the auspices of the Lawrence Livermore National Security, LLC, (LLNS) under Contract No. DE-AC52-07NA27344. #LLNL-ABS-632818 The National Ignition Facility (NIF) at the Lawrence Livermore National Laboratory is the most energetic laser system in the world. During a NIF laser shot, a 20-ns ultraviolet laser pulse is split into 192 separate beams, amplified, and directed to a millimeter-sized target at the center of a 10-m target chamber. To achieve the goals of studying energy science, basic science, and national security, NIF laser shot performance is being optimized around key metrics such as implosion shape and fuel mix. These metrics are accurately quantified after each laser shot using automated signal and image processing routines to analyze raw data from over 50 specialized diagnostics that measure x-ray, optical, and nuclear phenomena. Each diagnostic's analysis comprises a series of inverse problems, timing analysis, and specialized processing. This talk will review the framework for general diagnostic analysis, give examples of specific algorithms used, and review the diagnostic analysis team's recent accomplishments. The automated diagnostic analysis for x-ray, optical, and nuclear diagnostics provides accurate key performance metrics and enables NIF to achieve its goals. |
Slides WECOBA05 [3.991 MB] | |
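WECOBA05 describes each diagnostic's automated analysis as a series of steps (inverse problems, timing analysis, specialized processing) run within a common framework. As a minimal sketch under those assumptions, the Java fragment below chains simple stages over a signal; the stage names and operations are illustrative stand-ins, not the team's actual algorithms.

```java
// Sketch of a staged, per-diagnostic analysis pipeline. Each Stage transforms the
// working data and hands it to the next stage; the stages shown (pedestal removal,
// peak normalization) are illustrative stand-ins for real inverse-problem and
// timing-analysis steps.
import java.util.Arrays;
import java.util.List;
import java.util.function.UnaryOperator;

public class DiagnosticPipelineSketch {

    interface Stage extends UnaryOperator<double[]> {}

    static double[] run(double[] raw, List<Stage> stages) {
        double[] data = raw.clone();
        for (Stage stage : stages) {
            data = stage.apply(data);
        }
        return data;
    }

    public static void main(String[] args) {
        Stage subtractPedestal = d -> {
            double[] out = d.clone();
            for (int i = 0; i < out.length; i++) out[i] -= 0.05; // assumed pedestal level
            return out;
        };
        Stage normalizeToPeak = d -> {
            double peak = 0;
            for (double v : d) peak = Math.max(peak, v);
            double[] out = d.clone();
            if (peak > 0) for (int i = 0; i < out.length; i++) out[i] /= peak;
            return out;
        };

        double[] result = run(new double[] {0.06, 0.55, 1.25, 0.30},
                              List.of(subtractPedestal, normalizeToPeak));
        System.out.println(Arrays.toString(result)); // normalized, pedestal-free trace
    }
}
```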
THCOAAB05 | Rapid Application Development Using Web 2.0 Technologies | 1058 |
Funding: * This work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344. #LLNL-ABS-632813 The National Ignition Facility (NIF) strives to deliver reliable, cost-effective applications that can easily adapt to the changing business needs of the organization. We use HTML5, RESTful web services, AJAX, jQuery, and JSF 2.0 to meet these goals. WebGL and HTML5 Canvas technologies are being used to provide 3D and 2D data visualization applications. jQuery's rich set of widgets, along with technologies such as HighCharts and Datatables, allows for creating interactive charts, graphs, and tables. PrimeFaces enables us to utilize much of this Ajax and jQuery functionality while leveraging our existing knowledge base in the JSF framework. RESTful web services have replaced the traditional SOAP model, allowing us to easily create and test web services. Additionally, new software based on NodeJS and WebSocket technology is currently being developed that will augment the capabilities of our existing applications to provide a level of interaction with our users that was previously unfeasible. These Web 2.0-era technologies have allowed NIF to build more robust and responsive applications. Their benefits and details on their use will be discussed. |
Slides THCOAAB05 [0.832 MB] | |
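THCOAAB05 reports that RESTful web services replaced the SOAP model for application back ends. The sketch below shows what a small JAX-RS resource in that style can look like; it assumes a javax.ws.rs (JAX-RS) runtime and a JSON provider on the classpath, and the resource path, payload fields, and lookup logic are invented for illustration rather than taken from an actual NIF service.

```java
// Minimal JAX-RS resource sketch in the RESTful style described by the abstract.
// Requires a JAX-RS runtime; the path and payload are hypothetical examples.
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;

@Path("/shots")
public class ShotResource {

    // Simple payload object; a JSON provider (e.g. Jackson) serializes it.
    public static class ShotSummary {
        public String shotId;
        public String status;
        public ShotSummary(String shotId, String status) {
            this.shotId = shotId;
            this.status = status;
        }
    }

    @GET
    @Path("/{id}")
    @Produces(MediaType.APPLICATION_JSON)
    public ShotSummary getShot(@PathParam("id") String id) {
        // A real service would query the database; here we return a stub record.
        return new ShotSummary(id, "ARCHIVED");
    }
}
```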
THCOAAB07 | NIF Electronic Operations: Improving Productivity with iPad Application Development | 1066 |
Funding: This work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344. #LLNL-ABS-632815 In an experimental facility like the National Ignition Facility (NIF), thousands of devices must be maintained during day-to-day operations. Teams within NIF have documented hundreds of procedures, or checklists, detailing how to perform this maintenance. Until now, these checklists have been paper-based. NIF Electronic Operations (NEO) is a new web and iPad application for managing and executing checklists. NEO increases the efficiency of operations by reducing the overhead associated with paper-based checklists, and provides analysis and integration opportunities that were previously not possible. NEO's data-driven architecture allows users to manage their own checklists and provides checklist versioning, real-time input validation, detailed step timing analysis, and integration with external task tracking and content management systems. Built with mobility in mind, NEO runs on an iPad and works without the need for a network connection. When executing a checklist, users capture various readings, photos, measurements, and notes, which are then reviewed and assessed after the checklist is completed. NEO's design, architecture, iPad application and uses throughout the NIF will be discussed. |
Slides THCOAAB07 [1.237 MB] | |
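THCOAAB07 describes NEO's data-driven checklists with versioning and real-time input validation of captured readings. The Java sketch below models a versioned checklist whose steps carry simple range limits; the field names, limits, and example readings are assumptions made for illustration, not NEO's actual data model.

```java
// Hedged sketch of a data-driven checklist model: a versioned checklist definition
// plus validation of values captured at each step. Names and limits are illustrative.
import java.util.List;
import java.util.Optional;

public class ChecklistSketch {

    record Step(int number, String instruction, double min, double max) {
        // Real-time input validation: is a captured reading within limits?
        Optional<String> validate(double reading) {
            return (reading < min || reading > max)
                ? Optional.of("Step " + number + ": " + reading + " outside [" + min + ", " + max + "]")
                : Optional.empty();
        }
    }

    record Checklist(String name, int version, List<Step> steps) {}

    public static void main(String[] args) {
        Checklist maintenance = new Checklist("Amplifier cooling check", 3, List.of(
            new Step(1, "Record coolant flow rate (L/min)", 10.0, 14.0),
            new Step(2, "Record reservoir temperature (C)", 18.0, 24.0)));

        double[] readings = {12.4, 26.1};   // values a technician might enter
        for (int i = 0; i < readings.length; i++) {
            maintenance.steps().get(i).validate(readings[i])
                .ifPresent(System.out::println);   // flags the out-of-range reading
        }
    }
}
```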
FRCOAAB04 | Data Driven Campaign Management at the National Ignition Facility | 1473 |
Funding: * This work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344. #LLNL-ABS-633255 The Campaign Management Tool Suite (CMT) provides tools for establishing the experimental goals, achieving reviews and approvals, and ensuring readiness for a NIF experiment. Over the last 2 years, CMT has significantly increased the number of diagnostics that it supports to around 50. Meeting this ever-increasing demand for new functionality has resulted in a design whereby more and more of the functionality can be specified in data rather than coded directly in Java. To do this, support tools have been written that manage various aspects of the data and also handle potential inconsistencies that can arise from a data-driven paradigm. For example: drop-down menus are specified in the Part and Lists Manager; the Shot Setup reports that list the configurations for diagnostics are specified in the database; the review tool Approval Manager has a rules engine that can be changed without a software deployment; various template managers are used to provide predefined entry of hundreds of parameters; and finally a stale-data tool validates that experiments contain valid data items. The trade-offs, benefits and issues of adapting and implementing this data-driven philosophy will be presented. |
Slides FRCOAAB04 [0.929 MB] | |
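FRCOAAB04 notes that the Approval Manager's rules engine can be changed without a software deployment because the rules live in data. The sketch below evaluates rule rows (standing in for database records) against a proposed setup; the rule fields, setup parameters, and messages are illustrative assumptions, not CMT's actual rule schema.

```java
// Hedged sketch of a data-driven rules check: rules are plain data rows, so
// changing them requires no software deployment. Field names and values are
// illustrative only.
import java.util.List;
import java.util.Map;

public class RuleEngineSketch {

    // One rule row: which setup field to check and the range it must fall in.
    record Rule(String field, double min, double max, String message) {}

    static List<String> evaluate(Map<String, Double> setup, List<Rule> rules) {
        return rules.stream()
            .filter(r -> {
                Double v = setup.get(r.field());
                return v == null || v < r.min() || v > r.max();
            })
            .map(Rule::message)
            .toList();
    }

    public static void main(String[] args) {
        // Rows that would normally be read from the campaign-management database.
        List<Rule> rules = List.of(
            new Rule("laser_energy_MJ", 0.5, 1.9, "Requested laser energy outside approved range"),
            new Rule("target_fill_atm", 0.1, 30.0, "Target fill pressure missing or out of range"));

        Map<String, Double> proposedSetup = Map.of("laser_energy_MJ", 2.2);

        // Prints both findings: energy out of range, fill pressure missing.
        evaluate(proposedSetup, rules).forEach(System.out::println);
    }
}
```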