Mr
Mitchell Cox
(University of the Witwatersrand, Johannesburg)
02/07/2014, 14:30
Section 3 - Technology for storing, searching and processing of Big Data
sectional reports
Modern big-science projects are becoming so data intensive that offline processing of stored data is infeasible. Future projects will require high-data-throughput computing, or Data Stream Computing, to deal with terabytes of data per second that cannot be kept in long-term storage elements. Conventional data centres based on typical server-grade hardware are expensive...
Mr
Robert Reed
(University of the Witwatersrand)
02/07/2014, 14:50
Section 3 - Technology for storing, searching and processing of Big Data
sectional reports
Big-science projects such as the Large Hadron Collider (LHC) at CERN and the planned Square Kilometre Array (SKA) in South Africa are fast exceeding current data-throughput capabilities, making offline storage infeasible. A potential solution is to use low-cost, low-power ARM processors in large arrays to provide massive parallelisation for data stream computing (DSC). The main...
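As a rough, hedged illustration of the data-stream-computing idea above (not the authors' ARM cluster software), the sketch below fans incoming data blocks out to a pool of worker processes, the way an array of low-power nodes might each take a slice of the stream; the block size, worker count and the toy per-block computation are assumptions made for the example.

```python
# Minimal fan-out stream-processing sketch; the data source and the
# per-block computation are stand-ins, not the system described in the talk.
from multiprocessing import Pool
import numpy as np

BLOCK_SIZE = 1 << 20   # samples per block (arbitrary assumption)
N_WORKERS = 8          # stands in for an array of low-power nodes

def process_block(block: np.ndarray) -> float:
    """Toy per-block reduction; a real DSC system would run physics code here."""
    return float(np.sqrt((block.astype(np.float64) ** 2).mean()))

def stream(n_blocks: int):
    """Pretend data source: yields blocks instead of reading a detector."""
    rng = np.random.default_rng(0)
    for _ in range(n_blocks):
        yield rng.integers(0, 4096, BLOCK_SIZE, dtype=np.uint16)

if __name__ == "__main__":
    with Pool(N_WORKERS) as pool:
        # imap keeps only a window of blocks in flight, so the raw stream
        # never has to be written to long-term storage.
        for result in pool.imap(process_block, stream(64)):
            pass  # results would be forwarded downstream, not archived
```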
Mr
Thomas Wrigley
(University of the Witwatersrand, Johannesburg)
02/07/2014, 15:10
Section 3 - Technology for storing, searching and processing of Big Data
sectional reports
The computing requirements of Big Science are ever increasing, and the volume of data produced by some of its projects has long exceeded available offline storage capacity, necessitating innovative means of data capture. The volume of data expected from projects such as the Square Kilometre Array and the upgraded Large Hadron Collider will far exceed existing...
Mr
Alexander Novikov
(National Research Centre "Kurchatov Institute")
02/07/2014, 15:30
Section 3 - Technology for storing, searching and processing of Big Data
sectional reports
The report presents an approach to, and a software implementation of, a high-throughput parallel pipelined data-processing system, using remote sensing of the Earth from satellites (up to 1 Tb of data daily) as an example. The system is designed to process a potentially infinite data flow as it emerges in real time. The data flow can be processed by a set of subtask steps (a pipeline), with one or many required...
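As a generic sketch of the pipelined model described above (stage functions, queue sizes and the data source are invented for illustration and are not the system in the report), each subtask runs as its own thread and hands items to the next stage through a bounded queue, so a potentially infinite flow can be processed as it arrives:

```python
# Toy two-stage pipeline over an unbounded stream; everything here is an
# assumption for illustration, not the remote-sensing system of the report.
import queue
import threading

STOP = object()  # sentinel marking the end of the stream

def source(out_q, n_items):
    for i in range(n_items):            # stands in for a real-time feed
        out_q.put(i)
    out_q.put(STOP)

def stage(in_q, out_q, fn):
    while True:
        item = in_q.get()
        if item is STOP:
            out_q.put(STOP)
            return
        out_q.put(fn(item))             # one subtask step of the pipeline

def sink(in_q):
    while in_q.get() is not STOP:
        pass                            # deliver the finished product

if __name__ == "__main__":
    q1, q2, q3 = (queue.Queue(maxsize=16) for _ in range(3))  # bounded queues give backpressure
    threads = [
        threading.Thread(target=source, args=(q1, 1000)),
        threading.Thread(target=stage, args=(q1, q2, lambda x: x * 2)),
        threading.Thread(target=stage, args=(q2, q3, lambda x: x + 1)),
        threading.Thread(target=sink, args=(q3,)),
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
```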
Mr
Andrey Kiryanov
(PNPI)
02/07/2014, 16:30
Section 3 - Technology for storing, searching and processing of Big Data
sectional reports
One of the most widely used storage solutions in WLCG is the Disk Pool Manager (DPM), developed and supported by the SDC/ID group at CERN. DPM has recently gone through a massive overhaul to address the scalability and extensibility issues of the old code.
The new system is called DMLite. Unlike the old DPM, which was based on daemons, DMLite is arranged as a library that can be loaded directly by an...
Mr
Sergey Bobkov
(National Research Centre "Kurchatov Institute")
02/07/2014, 16:50
Section 3 - Technology for storing, searching and processing of Big Data
sectional reports
In this work we present a new method of feature-vector calculation for XFEL diffraction patterns. Existing methods of image feature-vector calculation developed for computer vision and pattern recognition are not effective for diffraction-pattern analysis, since they do not take into account the inner properties of diffraction physics. In our research we developed a new method based on the connection...
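The abstract is cut off before the method itself is described; purely as a hedged illustration of what a physics-aware descriptor for a diffraction image might look like (a standard radial intensity profile, not the authors' method), one can bin the pattern's intensity by distance from the beam centre:

```python
# Hypothetical example: radial intensity profile as a feature vector for a
# diffraction image. This is a textbook descriptor used only to illustrate
# the idea; it is NOT the method presented in the talk.
import numpy as np

def radial_feature_vector(image: np.ndarray, n_bins: int = 64) -> np.ndarray:
    """Mean intensity per radial bin around the image centre."""
    ny, nx = image.shape
    y, x = np.indices(image.shape)
    r = np.hypot(x - nx / 2.0, y - ny / 2.0)
    bins = np.linspace(0.0, r.max(), n_bins + 1)
    idx = np.clip(np.digitize(r.ravel(), bins) - 1, 0, n_bins - 1)
    sums = np.bincount(idx, weights=image.ravel(), minlength=n_bins)
    counts = np.bincount(idx, minlength=n_bins)
    return sums / np.maximum(counts, 1)

# Usage with a synthetic pattern:
pattern = np.random.poisson(5.0, size=(256, 256)).astype(float)
fv = radial_feature_vector(pattern)
print(fv.shape)  # (64,)
```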
Irina Filozova
(Joint Institute for Nuclear Research, Laboratory of Information Technologies / Dubna International University of Nature, Society and Man, Institute of System Analysis and Management)
02/07/2014, 17:10
Section 3 - Technology for storing, searching and processing of Big Data
sectional reports
Solving modern problems and tasks requires analysing, at high speed, large volumes of (weakly or well structured) information distributed across various sources. The time and resources available for solving these problems are, as a rule, limited and often incommensurate with the quality of existing mechanisms for searching, selecting and accumulating the required information. It is no coincidence that one of the...
Mr
Dmitriy Lotarev
(A.A. Kharkevich Institute for Information Transmission Problems, RAS)
02/07/2014, 17:50
sectional reports
The problem of allocating Steiner points in a Euclidean Steiner tree is considered. In spite of advances in wireless technologies, many computer networks use cables as the physical medium over which devices transfer data. Such networks are laid out on the Earth's surface, and the problem is to minimise the cost of the network that connects the computers. The cost of the network is the sum of the building costs and...
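For intuition about Steiner point allocation (a generic textbook special case, not the report's algorithm), the optimal Steiner point for three terminals is the Fermat point of the triangle, which a simple Weiszfeld-style iteration approximates; when one angle of the triangle is 120 degrees or more, the optimum degenerates onto that terminal.

```python
# Illustrative only: locate the single Steiner (Fermat) point for three
# terminals by Weiszfeld iteration, minimising the summed edge length.
# The terminal coordinates below are arbitrary example data.
import numpy as np

def fermat_point(terminals: np.ndarray, iters: int = 200) -> np.ndarray:
    """terminals: (3, 2) array of planar coordinates."""
    p = terminals.mean(axis=0)                     # start at the centroid
    for _ in range(iters):
        d = np.linalg.norm(terminals - p, axis=1)
        if np.any(d < 1e-12):                      # landed on a terminal
            break
        w = 1.0 / d
        p = (terminals * w[:, None]).sum(axis=0) / w.sum()
    return p

terminals = np.array([[0.0, 0.0], [4.0, 0.0], [2.0, 3.0]])
s = fermat_point(terminals)
length = np.linalg.norm(terminals - s, axis=1).sum()
print(s, length)  # Steiner point and total cable length
```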