Slobodan Backovic
28/09/2015, 09:30
Andrey Khrgian
28/09/2015, 09:35
28/09/2015, 09:40
Dr
Livio Mapelli
(CERN)
28/09/2015, 10:30
Oral
Although the flagship of CERN physics is the Large Hadron Collider (LHC), the CERN scientific programme is broad and diversified. It extends to low-energy nuclear physics, antiproton experimentation and fixed-target experiments at intermediate energies.
After the Higgs discovery in 2012, intense activity has started to prepare for the future. While the highest priority still remains the LHC...
Dr
Christoph Schaefer
(CERN), Dr
Tadeusz Kurtyka
(CERN)
28/09/2015, 11:40
Dr
Massimo Lamanna
(CERN)
28/09/2015, 14:10
CERN IT operates the main storage resources for data taking and physics analysis, mainly via three systems: AFS, CASTOR and EOS. Managed disk storage amounts to about 100 PB (with relative ratios 1:10:30). EOS deploys disk resources evenly across the two CERN computer centres (Meyrin and Wigner). The physics data archive (CASTOR) contains about 100 PB so far. We are also providing sizeable...
Dr
Vladimir Korenkov
(JINR)
28/09/2015, 14:40
The report introduces the status and evolution of the information technologies at JINR. The objective of the Laboratory of Information Technologies is the further development of the JINR network and information infrastructure required by the research and production activity of JINR and its Member States, using the most advanced information technologies. The existing Central...
Dr
Dmitry Peshekhonov
(JINR)
28/09/2015, 15:10
The scientific programme and the current status of realization of the NICA project are presented in the report.
The new scientific project NICA (Nuclotron-based Ion Collider fAcility) is now under preparation at the Joint Institute for Nuclear Research (JINR) in Dubna. The project is aimed at two scientific programmes: the study of hot and dense baryonic matter under extreme conditions and at...
Mr
Alexander Paramonov
(Candidate of technical Science, MBA, ACC)
28/09/2015, 16:00
Dr
Alexey Struchenko
(Jet Infosystems)
28/09/2015, 16:20
Dr
Lubomir Dimitrov
(Institute for Nuclear Research and Nuclear Energy)
29/09/2015, 10:00
The higher energy and luminosity of the future High-Luminosity (HL) LHC determine a significant increase of the radiation background around the CMS subdetectors, especially in the high-pseudorapidity region. Under such harsh conditions, the RPCs (used in the muon trigger) most probably could not operate effectively. A possibly better solution is the so-called GEM (Gas Electron Multiplier)...
Dr
Oleg Strekalovsky
(JINR)
29/09/2015, 10:20
High-speed switched-capacitor waveform digitizers are increasingly used in studies of rare events in nuclear physics. Digitizers complement classic analog input systems or replace them completely. To start registration, a trigger signal that flags an interesting event is required.
The discriminator threshold levels are set individually via USB 2.0. Trigger signal generation...
Mr
Mikhail Buryakov
(JINR, LHEP)
29/09/2015, 10:35
A conceptual design of the MultiPurpose Detector (MPD) is proposed for a study of hot and dense baryonic matter in collisions of heavy ions over the atomic mass range A = 1–197 at a centre-of-mass energy up to √(s_NN) = 11 GeV (for Au79+). The MPD experiment is foreseen to be carried out at a future JINR accelerator complex facility for heavy ions – the Nuclotron-based Ion Collider fAcility...
Mr
Vladimir Borisov
(JINR)
29/09/2015, 10:50
The Nuclotron-based Ion Collider fAcility (NICA) is the new accelerator complex being constructed at JINR. More than 250 superconducting (SC) magnets will be assembled and tested at the new test facility in the Laboratory of High Energy Physics, JINR. The magnetic measurement system for the NICA booster dipole magnets was built and commissioned in late 2013. The first cryogenic measurements of ...
Aleksey Kuznetsov
(JINR)
29/09/2015, 11:25
Several setups for the synthesis of superheavy elements, including multi-detector spectrometers of nuclear reaction products, have been developed at FLNR: VASSILISSA, DGFRS (Dubna Gas-Filled Recoil Separator), MASHA, etc. The number of channels in such spectrometers grows continuously and now amounts to several hundred.
The electronics for such spectrometers should be...
Dr
NIKOLAY GORBUNOV
(JINR)
29/09/2015, 11:55
The purpose of the TUS space experiment is to study ultra-high-energy cosmic rays by registering the extensive air showers they generate, using a satellite in space. The concentrator located on the satellite is a Fresnel mirror directed toward the Earth's atmosphere, with a photodetector at its focus. The angle of view of the mirror is ± 50, which for the set...
Mr
Evgeny Gorbachev
(JINR)
29/09/2015, 14:00
The Nuclotron is a 6 GeV/n superconducting proton synchrotron operating at JINR, Dubna, since 1993. It will be the core of the future NICA accelerator complex, which is now under construction. A TANGO-based control system for the complex is under development. The report describes its structure, main features and present status.
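A rough illustration of the TANGO device model on which such a control system is built (not taken from the report; the device class, attribute and behaviour below are hypothetical placeholders) is the following minimal PyTango device server sketch:

    # Minimal TANGO device server sketch (PyTango); names are illustrative,
    # not taken from the Nuclotron/NICA control system itself.
    from tango import AttrWriteType
    from tango.server import Device, attribute, command, run

    class MagnetPowerSupply(Device):
        """Hypothetical power-supply device exposing one controllable attribute."""

        current = attribute(dtype=float,
                            access=AttrWriteType.READ_WRITE,
                            unit="A",
                            doc="Setpoint of the magnet current")

        def init_device(self):
            super().init_device()
            self._current = 0.0

        def read_current(self):
            return self._current

        def write_current(self, value):
            # A real server would forward this to the hardware controller.
            self._current = value

        @command
        def Reset(self):
            self._current = 0.0

    if __name__ == "__main__":
        run((MagnetPowerSupply,))

A TANGO client (a GUI, archiver or script) then reads and writes the "current" attribute by device name, which is the access pattern the control system exposes for every subsystem.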
Dr
Maxim Karetnikov
(All-Russia Research Institute of Automatics)
29/09/2015, 14:30
In the T(d,n)He4 reaction, each 14 MeV neutron is accompanied (tagged) by a 3.5 MeV alpha particle emitted in the opposite direction. A position- and time-sensitive alpha detector measures the time and coordinates of the associated alpha particle, which allows the time and direction of the neutron escape to be determined. A spectrum of gamma rays emitted in the interaction of tagged neutrons with nuclei of...
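The kinematics behind the tagging can be sketched in a few lines (a toy calculation under an assumed geometry, not parameters of the actual setup): the neutron direction is taken opposite to the target-to-alpha-hit direction, and its arrival time at a given flight path follows from the roughly 5.2 cm/ns speed of a 14 MeV neutron.

    # Toy sketch of the tagged-neutron geometry; target position, alpha-hit
    # coordinates and timing are illustrative values only.
    import numpy as np

    V_NEUTRON = 5.2e7  # approximate speed of a 14 MeV neutron, m/s

    def neutron_direction(alpha_hit, target=(0.0, 0.0, 0.0)):
        """Unit vector of the neutron, opposite to the target->alpha direction."""
        d = np.asarray(alpha_hit) - np.asarray(target)
        return -d / np.linalg.norm(d)

    def neutron_arrival_time(t_alpha, flight_path):
        """Expected neutron arrival time for a given flight path (m),
        neglecting the short alpha flight time to its detector."""
        return t_alpha + flight_path / V_NEUTRON

    # Example: alpha detected 10 cm from the target, slightly off axis.
    print(neutron_direction([0.02, 0.00, -0.10]))
    print(neutron_arrival_time(t_alpha=0.0, flight_path=0.5))  # seconds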
Mr
Dmitrii Monakhov
(JINR)
29/09/2015, 16:10
The betatron tune is one of the important beam parameters that must be known and controlled to avoid beam instabilities in a circular particle accelerator. A real-time method for betatron tune measurement at the Nuclotron and the NICA Booster was developed and tested. A band-limited noise source and a chirp (frequency sweep) were used for beam excitation. The transverse beam oscillation signals were...
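The measurement principle can be illustrated with a short, self-contained sketch (toy numbers, not Nuclotron parameters): excite the beam with a chirp, record turn-by-turn transverse positions, and read the fractional tune off the strongest line in the spectrum.

    import numpy as np

    n_turns = 4096
    true_tune = 0.21                       # assumed fractional betatron tune
    turns = np.arange(n_turns)

    # Chirp excitation sweeping over the expected tune range 0.1 ... 0.4
    sweep = 0.1 + 0.3 * turns / n_turns
    excitation = np.sin(2 * np.pi * np.cumsum(sweep))

    # Toy beam response: coherent oscillation at the tune plus excitation and noise.
    rng = np.random.default_rng(0)
    response = (np.sin(2 * np.pi * true_tune * turns)
                + 0.3 * excitation
                + 0.1 * rng.standard_normal(n_turns))

    # Fractional tune = frequency (in units of the revolution frequency)
    # of the highest spectral line, skipping the DC bin.
    spectrum = np.abs(np.fft.rfft(response * np.hanning(n_turns)))
    freqs = np.fft.rfftfreq(n_turns, d=1.0)
    measured_tune = freqs[np.argmax(spectrum[1:]) + 1]
    print(f"measured fractional tune ~ {measured_tune:.3f}")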
Ms
Svetlana Murashkevich
(JINR)
29/09/2015, 16:25
Frank Laboratory of Neutron Physics, Joint Institute for Nuclear Research, Dubna, Russia
Software for a data acquisition system of modern one- and two-dimensional position-sensitive detectors with delay-line readout, including a software interface to the new DeLiDAQ-2D electronic module with a USB interface, is presented. The new system, after successful tests on the test bench and on...
Mr
Vasily Andreev
(VBLHEP JINR)
29/09/2015, 16:40
TANGO Controls is the basis of the NICA control system. The report describes the software that integrates the Nuclotron beam slow-extraction subsystem into the TANGO system of NICA. The objects of control are the resonance lens power supplies and the extracted-beam spill controller. The software consists of the subsystem device server, a remote client and a web module for viewing the subsystem data. The...
Mr
Dmitriy Ponkin
(LHEP JINR)
29/09/2015, 16:55
The work is devoted to the study and development of a beam emittance measurement device for the ESIS KRION-6T using the sectional ion collector method. In the course of the work, the possibility of charge measurement with a multichannel ADC with current input was investigated. An MCU-based data acquisition system was designed, and system tests were carried out.
Andrey Yudin
(Vladimirovich)
29/09/2015, 17:25
This article presents the software and hardware parts of the project to automate the control of the focusing lenses of channel 8 of the Phasotron at DLNP, JINR. The article describes the goals, concepts and features of the software, developed with Python and Qt.
Mr
Ivan Slepov
(JINR)
29/09/2015, 17:40
Mr
Andrey Terletskiy
(JINR)
29/09/2015, 17:40
The report describes the structure of the data acquisition electronics at BM@N. It consists of three related parts.
The first is a short description of the electronic modules, their technical characteristics and functionality, and the detectors with which they were used.
The second describes the synchronization method that was used; in particular, the White Rabbit protocol and its...
Lidija Zivkovic
(Institute of Physics Belgrade, Belgrade, Serbia)
30/09/2015, 09:00
In high-energy physics experiments, online selection is crucial to select interesting collisions from the large data volume. ATLAS b-jet triggers are designed to identify heavy-flavour content in real-time and provide the only option to efficiently record events with fully hadronic final states containing b-jets. In doing so, two different, but related, challenges are faced. The physics goal...
Lee Sawyer
(Louisiana Tech University, USA)
30/09/2015, 09:15
The new centre of mass energy and high luminosity conditions during Run 2 of the Large Hadron Collider impose ever more demanding constraints on the ATLAS online trigger reconstruction and selection system. To cope with these conditions, the hardware-based Level-1 trigger now includes a Topological Processor and the software-based High Level Trigger has been redesigned, merging the two...
Ryan White
(Universidad Técnica Federico Santa María, Valparaíso, Chile)
30/09/2015, 09:30
Electron and photon triggers covering transverse energies from 5 GeV to several TeV are essential for signal selection in a wide variety of ATLAS physics analyses to study Standard Model processes and to search for new phenomena. Final states including leptons and photons had, for example, an important role in the discovery and measurement of the Higgs particle. Dedicated triggers are...
Dr
Yang Qin
(University of Manchester, UK)
30/09/2015, 10:05
The design and performance of the ATLAS Inner Detector (ID) trigger algorithms running online on the high level trigger (HLT) processor farm with the early LHC Run 2 data are discussed. During the 2013-15 LHC shutdown, the HLT farm was redesigned to run in a single HLT stage, rather than the two-stage (Level 2 and Event Filter) used in Run 1. This allowed a redesign of the HLT ID tracking...
Needa Asbah
(DESY, Hamburg, Germany)
30/09/2015, 10:20
The trigger system of the ATLAS experiment is designed to reduce the event rate from the LHC nominal bunch crossing at 40 MHz to about 1 kHz, at the design luminosity of 10^34 cm^-2 s^-1. After a successful period of data taking from 2010 to early 2013, the LHC is restarting in 2015 with much higher instantaneous luminosity and this will increase the load on the High Level Trigger system, the...
Mr
Tatsuya Mori
(The University of Tokyo)
30/09/2015, 10:35
The Large Hadron Collider (LHC) is foreseen to be upgraded during the shut-down period of 2018-2019 to deliver about 3 times the instantaneous design luminosity. Since the ATLAS trigger system at that time will not allow an increase of the trigger rate, an improvement of the trigger system is required. The ATLAS LAr Calorimeter read-out will therefore be modified and digital trigger signals...
Prof.
Dario Barberis
(University and INFN Genova (Italy))
30/09/2015, 11:10
The ATLAS experiment used for many years a large database infrastructure based on Oracle to store several different types of non-event data: time-dependent detector configuration and conditions data, calibrations and alignments, configurations of Grid sites, catalogues for data management tools, job records for distributed workload management tools, run and event metadata. The rapid...
Mr
Konstantin Gertsenberger
(JINR)
30/09/2015, 11:40
Today the use of databases is a prerequisite for qualitative management of and unified access to the data of modern high-energy physics experiments. The database described in this report is designed as a comprehensive data store for the ongoing sessions of the fixed-target experiment BM@N at the Joint Institute for Nuclear Research. The structure and purposes of the BM@N facility will...
Mr
SEBASTIAN BUKOWIEC
(CERN)
01/10/2015, 09:00
The continuous growth of luminosity in high-energy physics with the LHC restart in 2015 results in a larger amount of data to be analysed and a corresponding increase in the required computing power.
Given these challenges, we have adopted a number of open source projects used by other large scale deployments elsewhere and contributed to those communities.
In particular, OpenStack was chosen as the...
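One way to provision machines programmatically on an OpenStack cloud is through the openstacksdk client; a minimal sketch (the cloud name, image and flavor below are hypothetical placeholders, not CERN values) looks like this:

    # Minimal openstacksdk sketch; 'example-cloud', the image and the flavor
    # are placeholders resolved from a local clouds.yaml.
    import openstack

    conn = openstack.connect(cloud="example-cloud")

    image = conn.compute.find_image("cc7-base")        # hypothetical image name
    flavor = conn.compute.find_flavor("m2.medium")     # hypothetical flavor name

    server = conn.compute.create_server(
        name="batch-worker-001",
        image_id=image.id,
        flavor_id=flavor.id,
    )
    server = conn.compute.wait_for_server(server)
    print(server.name, server.status)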
Prof.
Alexander Bogdanov
(St.Petersburg State University)
01/10/2015, 09:30
Having the computing power of a large system at hand has long been a dream of computational scientists. There have been many very interesting proposals in that direction, but there were always bottlenecks that managed to ruin the original idea. We review some of those problems and argue that new technologies can bring solutions to at least a majority of them. The use of cloud technologies...
Prof.
Gennady Ososkov
(JINR)
01/10/2015, 10:30
The simulation concept for grid-cloud services of contemporary HENP experiments at the Big Data scale was formulated while exercising the simulation system developed at LIT JINR, Dubna. This system is intended to improve the efficiency of the design and development of a wide class of grid-cloud structures by using work quality indicators of a real system to design and predict its evolution. For...
Dr
Alexei Klimentov
(Brookhaven National Lab), Mr
Dimitrii Krasnopevtsev
(National Research Nuclear University MEPhI (RU))
01/10/2015, 11:10
After the early success in discovering a new particle consistent with the long-awaited Higgs boson, the Large Hadron Collider experiments are ready for the precision measurements and further discoveries that will be made possible by much higher LHC collision rates from spring 2015. A proper understanding of the detector performance at high-occupancy conditions is important for many ongoing...
Andreas-Joachim Peters
(CERN)
01/10/2015, 11:30
The EOS project at CERN provides large-scale storage systems to the LHC experiments and many other projects at CERN and beyond. In order to further increase the scalability and availability of the system we are investigating several new technologies, such as Ethernet-connected disk drives and non-volatile memory implementations, to further decrease the cost of ownership and the downtime after...
Dr
Andrei Tsaregorodtsev
(CPPM-IN2P3-CNRS)
01/10/2015, 12:00
Multiple research user communities need to pool their computing resources in common infrastructures in order to boost the efficiency of their usage. Various grid infrastructures try to help new users start doing computations by providing services that facilitate access to distributed computing resources. The DIRAC project provides software for creating and operating such...
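For orientation, a minimal job submission through the DIRAC Python API might look like the sketch below (the executable and job name are placeholders; an installed and configured DIRAC client with a valid proxy is assumed):

    # Minimal DIRAC job-submission sketch; assumes a configured DIRAC client.
    from DIRAC.Core.Base import Script
    Script.parseCommandLine()              # initialise the DIRAC client

    from DIRAC.Interfaces.API.Dirac import Dirac
    from DIRAC.Interfaces.API.Job import Job

    job = Job()
    job.setName("hello-dirac")
    job.setExecutable("/bin/echo", arguments="Hello from DIRAC")
    job.setCPUTime(3600)

    result = Dirac().submitJob(job)
    print(result)                          # {'OK': True, 'Value': <job id>} on success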
Mr
Jan Kundrát
(Institute of Physics of the AS CR and CESNET)
01/10/2015, 14:00
Dr
Petr Zrelov
(LIT JINR)
01/10/2015, 14:00
The paper reviews the present status and the development perspectives of the heterogeneous computing cluster HybriLIT (http://hybrilit.jinr.ru/), which was put into operation in 2014 at the Laboratory of Information Technologies of JINR. HybriLIT provides the possibility to carry out high-performance computing within the Multifunctional Information and Computing Complex at LIT JINR.
The...
Mr
Nichita Degteariov
(RENAM)
01/10/2015, 14:15
In recent years, distributed information processing and high-performance computing technologies (HPC, distributed Cloud and Grid computing infrastructures) for solving complex tasks with high demands on computing resources have been actively developing. In Moldova, work on the creation of high-performance and distributed computing infrastructures started relatively recently, owing to participation...
Mr
Vitaly Yermolchyk
(NC PHEP BSU)
01/10/2015, 14:30
The status of the NC PHEP BSU Tier 3 site is presented. The transition to rack-mounted servers has started. Due to the need for a more scalable and reliable platform that provides efficient resource utilization, the tier infrastructure was ported to a cloud with distributed storage. The choice and setup of the cloud are discussed.
Ms
Nataliia Kulabukhova
(Saint Petersburg State University)
01/10/2015, 14:30
In this work, by Virtual Accelerator we mean a set of services and tools enabling transparent execution of computational software for modeling beam dynamics in accelerators using distributed computing resources. The main use of the Virtual Accelerator is the simulation of beam dynamics by different packages, with the opportunity to match their results and the possibility to create pipelines of tasks...
Furano Fabrizio
(CERN IT/SDC)
01/10/2015, 15:05
The Dynamic Federations project ("dynafed") enables the deployment of scalable, distributed storage systems composed of independent storage endpoints. While the Uniform Generic Redirector at the heart of the project is protocol agnostic, we have focussed our effort on HTTP-based protocols, including S3 and WebDAV. The system has been deployed on testbeds covering the majority of ...
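Because the federation is exposed over standard HTTP/WebDAV, a client can browse it with ordinary tools; below is a small sketch (the endpoint URL is a hypothetical placeholder) of a Depth-1 PROPFIND listing using the Python requests library:

    # WebDAV directory listing against a (hypothetical) federation endpoint.
    import requests

    FEDERATION_URL = "https://federation.example.org/fed/atlas/"  # placeholder

    resp = requests.request(
        "PROPFIND",
        FEDERATION_URL,
        headers={"Depth": "1"},   # list only the first level of the namespace
        timeout=30,
    )
    resp.raise_for_status()
    print(resp.status_code)       # 207 Multi-Status for a WebDAV listing
    print(resp.text[:500])        # XML with one <D:response> per entry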
Dr
Yuri Pepelyshev
(JINR)
01/10/2015, 15:20
Pattern recognition methodologies and artificial neural networks have been widely used for reactor noise diagnostics. This is very important for the IBR-2M pulsed reactor of periodic operation (Dubna, Russia), which has a high sensitivity to reactivity fluctuations (40 times higher than stationary reactors with uranium fuel).
Cluster analysis allows a detailed study of the structure and...
Dr
Nikolay Kutovskiy
(JINR)
01/10/2015, 15:20
To fulfil JINR commitments in various national and international projects related to the use of modern information technologies, such as cloud and grid computing, as well as to provide the same tools to JINR users for their scientific research, a cloud infrastructure was deployed at the Laboratory of Information Technologies of the Joint Institute for Nuclear Research. The OpenNebula software was chosen...
Mr
Andrei Ivashchenko
(St.Petersburg State University)
01/10/2015, 15:35
This work aims to develop a system that effectively solves the problem of storing and analyzing files containing text data, using modern software development tools, techniques and approaches.
The main challenge of storing a large number of text documents, defined at the problem formulation stage, has to be addressed with functionality such as full-text search and document...
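As a toy illustration of the full-text-search functionality mentioned above (not the system described in the work; it merely assumes an SQLite build with the FTS5 extension enabled):

    # Minimal full-text search sketch with SQLite FTS5.
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("CREATE VIRTUAL TABLE docs USING fts5(title, body)")
    con.executemany(
        "INSERT INTO docs (title, body) VALUES (?, ?)",
        [("run log", "beam intensity dropped during the night shift"),
         ("meeting notes", "discussed storage and full text search back-ends")],
    )
    for (title,) in con.execute(
            "SELECT title FROM docs WHERE docs MATCH 'storage AND search'"):
        print(title)   # -> meeting notes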
Mr
Nikolai Iuzhanin
(SPbSU)
01/10/2015, 15:50
In this article, the problem of supporting scientific projects in a computer centre is considered throughout their lifecycle and in every aspect of support. The configuration management system plays a connecting role in the processes related to the provision and support of computer centre services. In view of the strong integration of IT infrastructure components with the use of virtualization,...
Mr
Igor Pelevanyuk
(JINR)
01/10/2015, 16:05
The BES-III experiment at the Institute of High Energy Physics (Beijing, China) is aimed at precision measurements of e+e- annihilation in the energy range from 2.0 to 4.6 GeV. The world's largest samples of J/psi and psi' events and unique samples of XYZ data have already been collected. The expected increase of the data volume in the coming years requires a significant evolution of the...
Mr
Ivan Gankevich
(Saint Petersburg State University)
01/10/2015, 16:05
Efficient distribution of high-performance computing resources according to actual application needs, along with comfortable and transparent access to these resources, has been an open question since HPC technologies became widely introduced. One of the application classes that require such functionality is physics applications. In this paper we discuss issues and approaches to manage resources...
Mr
Dmitry Guschansky
(St.Petersburg State University)
01/10/2015, 16:20
Modern information technologies have an impact on research in all possible areas of knowledge, and the humanities are not an exception. Some of them, such as psychology and sociology, can use observations of human behavior and the opinions of individuals and communities as a basis for research. One of the possible ways to acquire the data for this basis is from social networking services,...
Mr
Oleg Iakushkin
(Saint Petersburg State University)
01/10/2015, 16:35
Dr
Sergey Manoshin
(FLNP JINR)
01/10/2015, 16:35
Nowadays practically every new neutron spectrometer is simulated before construction or modernization, and its parameters are optimized using calculations on fast modern computers. Several leading neutron centres worldwide develop new, and support old, program packages (MCSTAS, VITESS, RESTRAX, NISP) based on the Monte Carlo method. In FLNP, modules for...
Dr
Charalampos Kouzinopoulos
(CERN)
01/10/2015, 16:50
The Hough Transform algorithm is a popular image analysis method that is widely used to perform global pattern recognition in images through the identification of local patterns in a suitably chosen parameter space. The algorithm can also be used to perform track reconstruction: to estimate the trajectories of individual particles as they pass through the sensitive elements of a detector volume....
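A compact illustration of the Hough idea for straight-line track finding (a toy sketch, not the authors' implementation): each hit votes for all (rho, theta) lines passing through it, and peaks in the accumulator correspond to track candidates.

    import numpy as np

    def hough_lines(hits, n_theta=180, n_rho=200, rho_max=100.0):
        """Fill a (rho, theta) accumulator from (x, y) hit positions."""
        thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
        acc = np.zeros((n_rho, n_theta), dtype=np.int32)
        for x, y in hits:
            rho = x * np.cos(thetas) + y * np.sin(thetas)
            bins = np.round((rho + rho_max) / (2 * rho_max) * (n_rho - 1)).astype(int)
            ok = (bins >= 0) & (bins < n_rho)
            acc[bins[ok], np.nonzero(ok)[0]] += 1
        return acc, thetas

    # Toy event: hits along the line y = 0.5 * x + 3 with some smearing.
    rng = np.random.default_rng(1)
    xs = np.linspace(0, 50, 20)
    hits = np.column_stack([xs, 0.5 * xs + 3 + rng.normal(0, 0.3, xs.size)])

    acc, thetas = hough_lines(hits)
    rho_idx, theta_idx = np.unravel_index(np.argmax(acc), acc.shape)
    print("peak votes:", acc[rho_idx, theta_idx], "theta:", round(thetas[theta_idx], 3))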
Dr
Alexander Kryukov
(SINP MSU)
01/10/2015, 17:05
Dr
Mohammad Al-Turany
(GSI/CERN)
02/10/2015, 09:00
The commonalities between the ALICE and FAIR experiments and their computing requirements led to the development of a common software framework in an experiment independent way; ALFA (ALICE-FAIR framework). ALFA is designed for high quality parallel data processing and reconstruction on heterogeneous computing systems. It provides a data transport layer and the capability to coordinate...
Dr
Ilija Vukotic
(University of Chicago)
02/10/2015, 09:30
The ATLAS Data analytics effort is focused on creating systems which provide the ATLAS ADC with new capabilities for understanding distributed systems and overall operational performance. These capabilities include: warehousing information from multiple systems (the production and distributed analysis system - PanDA, the distributed data management system - Rucio, the file transfer system,...
Dr
Alexei Klimentov
(Brookhaven National Lab)
02/10/2015, 10:00
The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe, and were recently credited for the discovery of a Higgs boson. ATLAS, one of the largest collaborations ever assembled...
Mr
Mikhail Borodin
(NRNU MEPHI, NRC KI)
02/10/2015, 10:30
The data processing and simulation needs of the ATLAS experiment at LHC grow continuously, as more data are collected and more use cases emerge. For data processing the ATLAS experiment adopted the data transformation approach, where software applications transform the input data into outputs. In the ATLAS production system, each data transformation is represented by a task, a collection of...
Dr
Patrick Fuhrmann
(DESY)
02/10/2015, 11:10
The availability of cheap, easy-to-use sync-and-share cloud services has split the scientific storage world into the traditional big data management systems and the very attractive sync-and-share services. With the former, the location of data is well understood while the latter is mostly operated in the Cloud, resulting in a rather complex legal situation.
Besides legal issues, those two...
Eygene Ryabinkin
(NRC "Kurchatov Institute")
02/10/2015, 11:40
A review of the current status and the programme for future development of the data-intensive high-performance/high-throughput computing complex for mega-science at NRC "Kurchatov Institute", supporting the priority scientific task “Development of mathematical models, algorithms and software for systems with extramassive parallelism for pilot science and technical areas”, is presented. Major upgrades...
Prof.
Alexander Degtyarev
(Professor)
02/10/2015, 12:10
Dealing with large volumes of data is tedious work which is often delegated to a computer, and more and more often this task is delegated not only to a single computer but also to a whole distributed computing system at once. As the number of computers in a distributed system increases, the amount of effort put into effective management of the system grows. When the system reaches some...
Prof.
Yury Panebrattsev
(JINR)
02/10/2015, 14:00
Modern education assumes a significant expansion of cooperation between universities and leading scientific centres for the training of highly qualified specialists. This report focuses on the MEPhI and JINR joint project for the STAR experiment at RHIC (Brookhaven National Laboratory).
The STAR experiment is one of the leading international collaborations in the field of modern nuclear physics. Many...
Mrs
Evgenia Cheremisina
(Dubna International University of Nature, Society and Man. State Scientific Centre «VNIIgeosystem».)
02/10/2015, 14:00
To ensure technological support of research and administrative activity in the sphere of environmental management, a specialized modular program complex was developed. Its components cover the three main stages of any such project:
- effective management of data and construction of information and analytical systems of varying complexity;
- complex analytical processing of...
Eygene Ryabinkin
(NRC "Kurchatov Institute")
02/10/2015, 14:20
An overview of Tier-1 operations during the beginning of LHC Run-2 will be presented. We will talk about three supported experiments, ALICE, ATLAS and LHCb: current status of resources and computing support, challenges, problems and solutions. Also we will give an overview of the wide-area networking situation and integration of our Tier-1 with regional Tier-2 centers.
Dr
Iurii Sakharov
(Dubna International University for Nature, Society and Man)
02/10/2015, 14:20
This report provides an insight into the transition of Russian higher education to standard 3+, the peculiarities of the bachelor programs according to standard 3, and the formation of educational programs under the new standard.
Special attention is paid to the optimization of the network of higher education institutions through consolidation, building a network of Russian universities able to be on the...
Ms
Ksenia Klygina
(JINR), Ms
Victoria Belaga
(JINR), Prof.
Yury Panebrattsev
(JINR)
02/10/2015, 14:40
One important aspect in the pedagogy of modern education is the integration of technological elements of modern science into the educational process. This integration has given rise to what has come to be referred to as blended learning. In this report we focus on the hardware-software complex “Virtual Laboratory of Nuclear Fission” as an example of incorporation of current scientific data...
Dr
Elena Tikhonenko
(JINR)
02/10/2015, 14:55
The Compact Muon Solenoid (CMS) is a high-performance general-purpose detector at the Large Hadron Collider (LHC) at CERN. The Russia and Dubna Member States (RDMS) CMS collaboration was founded in 1994. More than twenty institutes from Russia and the Joint Institute for Nuclear Research (JINR) are involved in the RDMS CMS Collaboration. RDMS CMS takes an...
Olga Tyatyushkina
(Dubna University)
02/10/2015, 15:10
We discuss the issues of updating educational programs according to the requirements of the labor market and the professional standards of the IT industry. We suggest e-learning technology through open educational resources to enable the participation of employers in the development of educational content and the intensification of practical training.
Prof.
Alexander SHARMAZANASHVILI
(Georgian Technical University), Mr
Niko Tsutskiridze
(Georgian Technical University)
02/10/2015, 15:10
The data vs Monte Carlo discrepancy is one of the most important fields of investigation in ATLAS simulation studies. There are several reasons for the above-mentioned discrepancies, but the primary interest falls on geometry studies and on investigating how adequately the geometry descriptions of the detector in the simulation represent the “as-built” descriptions. Shape consistency and level of detail are not...
Mr
Yury Samoylenko
(Dubna University)
02/10/2015, 15:25
The article describes modern approaches to creating educational environments, the main technologies for their creation and development, and gives examples of projects in this area, both in Russia and abroad. The needs of participants in the educational process were identified and formalized, and a concept of an adaptive educational environment in the IT field of study...
Ms
Victoriya Osipova
(Tomsk Polytechnic University, Tomsk, Russia)
02/10/2015, 15:40
Traditional relational databases (RDBMS) have been designed around normalized data structures. RDBMSs have served well for decades, but the technology is not optimal for data processing and analysis in data-intensive fields like social networks, the oil and gas industry, experiments at the Large Hadron Collider, etc. Several challenges have recently been raised about the scalability of data...
Dr
Alexander Karlov
(JINR)
02/10/2015, 15:40
The growing demand for qualified IT experts raises serious challenges for the education and training of young professionals who would answer the scientific, industrial and social problems of tomorrow. Virtualization has a great impact on education, allowing its efficiency to be increased, costs to be cut and the student audience to be expanded by abstracting users from the physical characteristics of computing resources....
Ms
Maria Grigorieva
(National Research Center “Kurchatov Institute”)
02/10/2015, 15:55
Scientific computing in the field of High Energy and Nuclear Physics (HENP) produces vast volumes of data. ATLAS, one of the largest collaborations ever assembled in the sciences, is at the forefront of research at the Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland; it daily runs up to 1.5 M jobs and submits them using the PanDA workload management system....
Nadezhda Tokareva
(Dubna University)
02/10/2015, 15:55
The authors describe the practice of implementing the Virtual Computer Laboratory at Dubna International University for Nature, Society and Man. The new generation of the Virtual Computer Laboratory has introduced game-changing technology that makes virtualization of professional 3D graphics applications easy to deliver and meets the performance expectations of students studying for...
Mr
Artem Petrosyan
(JINR)
02/10/2015, 16:30
PanDA (Production and Distributed Analysis System) is a workload management system widely used for data processing at experiments at the Large Hadron Collider (LHC) and elsewhere. COMPASS is a high-energy physics experiment at the Super Proton Synchrotron (SPS). Data processing for COMPASS has historically run locally at CERN, on lxbatch, with the data stored in CASTOR. In 2014 an idea to start...
Dr
Andrea Favareto
(University and INFN Genova (Italy))
02/10/2015, 16:45
The ATLAS experiment collects billions of events per year of data-taking and processes them to make them available for physics analysis in several different formats. In addition, an even larger number of events is simulated according to physics and detector models and then reconstructed and analysed to be compared to real events. The EventIndex is a catalogue of all events in each production...
Mr
Ignacio Barrientos Arias
(CERN)
02/10/2015, 17:00
The CERN IT Department provides configuration management services to LHC experiments and to the department itself for more than 17,000 physical and virtual machines in two data centres. The services are based on open-source technologies such as Puppet and Foreman. The presentation will give an overview of the current deployment, the issues observed during the last years, the solutions adopted,...
Mr
Serob Balyan
(Saint-Petersburg State University), Mr
Suren Abrahamyan
(Saint-Petersburg State University)
02/10/2015, 17:15
Nowadays the use of distributed collaboration tools is widespread in many areas of human activity. However, a lack of mobility and a certain equipment dependency create difficulties and slow the development and integration of such technologies. Mobile technologies allow individuals to interact with each other without the need for traditional office spaces and regardless of location. Hence,...
Valeriy Parubets
(National Research Tomsk Polytechnic University)
02/10/2015, 17:30
The work reviews the development of a mathematical solution for modeling heterogeneous distributed data storages. Different modeling approaches (Monte Carlo, agent-based modeling) are reviewed. A performance analysis of systems based on commercial Oracle solutions and free solutions (Cassandra, Hadoop) is provided. It is expected that the developed tool will help optimize data...
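A toy version of the Monte Carlo approach (invented node parameters and workload mix, purely for illustration, not the authors' tool) samples request latencies of a heterogeneous storage pool:

    # Toy Monte Carlo model of a heterogeneous storage pool.
    import random

    # (mean service time in ms, share of requests) per back-end -- invented values
    NODES = {"oracle": (8.0, 0.2), "cassandra": (4.0, 0.5), "hadoop": (15.0, 0.3)}

    def simulate(n_requests=100_000, seed=42):
        rng = random.Random(seed)
        names = list(NODES)
        weights = [NODES[n][1] for n in names]
        totals = {n: 0.0 for n in names}
        counts = {n: 0 for n in names}
        for _ in range(n_requests):
            node = rng.choices(names, weights=weights)[0]
            totals[node] += rng.expovariate(1.0 / NODES[node][0])  # exp. service time
            counts[node] += 1
        return {n: totals[n] / counts[n] for n in names}

    print(simulate())   # average observed latency per back-end, in ms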
Dr
Alexander Khilchenko
(Budker Institute, Novosibirsk), Dr
Igor Semenov
(Project Center ITER)
ITER (International Thermonuclear Experimental Reactor) is one of the most complex international mega-projects (Cadarache, France). It integrates more than 180 technical subsystems (vacuum, cooling, power supplies, cryogenics, plasma diagnostics, etc.), procured from different Participant Teams through their 7 Domestic Agencies (China, EU, India, Japan, Korea, RF, US).
COntrol, Data...