We present a software architecture and framework that can be used to facilitate the development of data processing applications for High Energy Physics experiments. The development strategy follows an architecture-centric approach as a way of creating a resilient software framework that can withstand changes in requirements and technology over the long term …
We present the strategy that has been adopted for the development of the software system for the LHCb experiment. This strategy follows an architecture-centric approach as a way of creating a resilient software framework that can withstand changes in requirements and technology. The software architecture, called GAUDI, covers event data processing …
The Large Observatory For X-ray Timing (LOFT) is a mission concept that was proposed to ESA as an M3 and M4 candidate in the framework of the Cosmic Vision 2015-2025 program. Thanks to the unprecedented combination of effective area and spectral resolution of its main instrument and the uniquely large field of view of its wide field monitor, LOFT will be able …
The Large Observatory For X-ray Timing (LOFT) was studied within the ESA M3 Cosmic Vision framework and participated in the final down-selection for a launch slot in 2022-2024. Thanks to the unprecedented combination of effective area and spectral resolution of its main instrument, LOFT will study the behaviour of matter under extreme conditions, such as the …
The GAUDI software architecture, designed in the context of the LHCb experiment, maintains separate and distinct descriptions of the transient and persistent representations of the data objects. One of the motivations for this approach has been the requirement for a multi-technology persistency solution, such that the best-adapted technology can be used for …
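The transient/persistent split described in this abstract is essentially a converter pattern: algorithms manipulate in-memory objects only, while a separate converter per persistency technology produces or reads the stored form. The C++ sketch below illustrates the idea with hypothetical class names (TransientTrack, IConverter, PlainTextConverter); it is not the actual GAUDI interface.

```cpp
// Hypothetical sketch of the transient/persistent separation: the event
// model object knows nothing about storage, and one converter per
// persistency technology produces or reads the stored representation.
// Class names are illustrative, not the actual GAUDI interfaces.
#include <sstream>
#include <string>
#include <vector>

// Transient event-data object, used only in memory by algorithms.
struct TransientTrack {
    double px = 0, py = 0, pz = 0;   // momentum components
};

// Opaque handle to whatever a given persistency technology stores.
struct PersistentBlob {
    std::string technology;          // e.g. "ROOT", "ObjectDB"
    std::vector<char> bytes;         // serialized payload
};

// One converter per (class, technology) pair; the framework would pick
// the right one at run time, so algorithms never see storage details.
class IConverter {
public:
    virtual ~IConverter() = default;
    virtual PersistentBlob createRep(const TransientTrack& obj) = 0;
    virtual TransientTrack createObj(const PersistentBlob& rep) = 0;
};

// A deliberately trivial "technology": plain-text serialization.
class PlainTextConverter : public IConverter {
public:
    PersistentBlob createRep(const TransientTrack& obj) override {
        std::string s = std::to_string(obj.px) + ' ' +
                        std::to_string(obj.py) + ' ' +
                        std::to_string(obj.pz);
        return PersistentBlob{"PlainText", {s.begin(), s.end()}};
    }
    TransientTrack createObj(const PersistentBlob& rep) override {
        std::string text(rep.bytes.begin(), rep.bytes.end());
        std::istringstream in(text);
        TransientTrack obj;
        in >> obj.px >> obj.py >> obj.pz;
        return obj;
    }
};
```

Swapping in a different converter changes only how the bytes are produced, which is what makes a "best-adapted technology" choice possible without touching the transient event model.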
To assess stability against 1/f noise, the Low Frequency Instrument (LFI) on board the Planck mission will acquire data at a rate much higher than the data rate allowed by the science telemetry bandwidth of 35.5 kbps. The data are processed by an on-board pipeline, followed on ground by a decoding and reconstruction step, to reduce the volume of data to a …
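As a rough illustration of the kind of round trip such a pipeline performs, the sketch below quantizes and delta-encodes samples "on board" and reconstructs them "on the ground". The function names and the quantization-plus-delta scheme are assumptions for illustration only; they do not reproduce the actual LFI on-board processing.

```cpp
// Illustrative round trip for a bandwidth-limited downlink: quantize
// samples with a known step, delta-encode them on board, then invert
// both steps on the ground. This is a generic sketch, not the actual
// LFI on-board processing or its compression scheme.
#include <cmath>
#include <cstdint>
#include <vector>

// "On board": quantize to integers and keep only differences between
// consecutive samples; small deltas compress well in a later stage.
std::vector<int32_t> onboardEncode(const std::vector<double>& samples,
                                   double quantStep) {
    std::vector<int32_t> deltas;
    deltas.reserve(samples.size());
    int32_t prev = 0;
    for (double s : samples) {
        const int32_t q = static_cast<int32_t>(std::lround(s / quantStep));
        deltas.push_back(q - prev);
        prev = q;
    }
    return deltas;
}

// "On ground": undo the delta encoding and rescale; each reconstructed
// sample differs from the original by at most quantStep / 2.
std::vector<double> groundDecode(const std::vector<int32_t>& deltas,
                                 double quantStep) {
    std::vector<double> samples;
    samples.reserve(deltas.size());
    int32_t acc = 0;
    for (int32_t d : deltas) {
        acc += d;
        samples.push_back(acc * quantStep);
    }
    return samples;
}
```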