Floriano Zini

Computational Grids process large, computationally intensive problems on small data sets. In contrast, Data Grids process large computational problems that in turn require evaluating, mining and producing large amounts of data. Replication, creating geographically disparate identical copies of data, is regarded as one of the major optimisation techniques …
Computational Grids normally deal with large computationally intensive problems on small data sets. In contrast, Data Grids mostly deal with large computational problems that in turn require evaluating and mining large amounts of data. Replication is regarded as one of the major optimisation techniques for providing fast data access. Within this paper, …
Grid computing is fast emerging as the solution to the problems posed by the massive computational and data handling requirements of many current international scientific projects. Simulation of the Grid environment is important to evaluate the impact of potential data handling strategies before they are deployed on the Grid. In this paper, we look at the …
In large-scale Grids, the replication of files to different sites is an important data management mechanism which can reduce access latencies and improve the usage of resources such as network bandwidth, storage and computing power. In the search for an optimal data replication strategy, the Grid simulator OptorSim was developed as part of the European …
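For context, replication strategies in simulators of this kind are typically compared against simple baselines. The following is a minimal, purely illustrative sketch of an LRU-style replica store at a single site, not OptorSim code; the class name, capacity and file names are hypothetical.

```python
from collections import OrderedDict

class SiteReplicaStore:
    """Toy LRU replica store for one Grid site: when local storage is full,
    the least recently used replica is deleted to make room for a new one.
    Purely illustrative; capacities and file names are hypothetical."""

    def __init__(self, capacity_files):
        self.capacity = capacity_files
        self.replicas = OrderedDict()  # file name -> locally replicated

    def access(self, filename):
        """Access a file, replicating it locally if it is not already present."""
        if filename in self.replicas:
            self.replicas.move_to_end(filename)            # mark as recently used
            return "local hit"
        if len(self.replicas) >= self.capacity:
            self.replicas.popitem(last=False)               # evict LRU replica
        self.replicas[filename] = True
        return "remote fetch + replicate"

# Hypothetical access pattern at one site with room for two replicas.
store = SiteReplicaStore(capacity_files=2)
for f in ["fileA", "fileB", "fileA", "fileC", "fileB"]:
    print(f, store.access(f))
```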
Optimising the use of Grid resources is critical for users to effectively exploit a Data Grid. Data replication is considered a major technique for reducing data access cost to Grid jobs. This paper evaluates a novel replication strategy, based on an economic model, that optimises both the selection of replicas for running jobs and the dynamic creation of …
We are working on a system for the optimised access and replication of data on a Data Grid. Our approach is based on the use of an economic model that includes the actors and the resources in the Grid. Optimisation is obtained via interaction of the actors in the model, whose goals are maximising the profits and minimising the costs of data resource …
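The abstracts above only outline the economic model. As a rough illustration of the idea, and not the authors' actual algorithm, a site might decide whether to acquire a replica by weighing the predicted value of future local accesses against the cost of transferring and storing the file. Every function name, weight and cost figure below is hypothetical.

```python
# Toy "economic" replica-purchase decision: buy a local replica only when its
# predicted value exceeds the cost of obtaining and storing it.
# All parameters and cost figures here are made up for illustration.

def predicted_value(recent_accesses, horizon=100):
    """Estimate the value of a local replica from recent access counts."""
    access_rate = len(recent_accesses) / max(horizon, 1)
    return access_rate * horizon  # expected future local accesses

def replica_cost(file_size_mb, bandwidth_mb_s, storage_cost_per_mb=0.01):
    """Cost of acquiring a replica: transfer time plus a storage charge."""
    transfer_cost = file_size_mb / bandwidth_mb_s
    storage_cost = file_size_mb * storage_cost_per_mb
    return transfer_cost + storage_cost

def should_replicate(recent_accesses, file_size_mb, bandwidth_mb_s):
    """Replicate only if the predicted value exceeds the acquisition cost."""
    return predicted_value(recent_accesses) > replica_cost(file_size_mb, bandwidth_mb_s)

# Example: a 500 MB file accessed 12 times recently, reachable over a 10 MB/s link.
print(should_replicate(recent_accesses=list(range(12)), file_size_mb=500, bandwidth_mb_s=10))
```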
The remit of the European DataGrid (EDG) [2] project was to develop an infrastructure that could support the intensive computational and data handling needs of widely distributed scientific communities. An important part of managing the data present in such a Grid is file replication: placing copies of files at different sites in order to reduce data access …
Many current international scientific projects are based on large scale applications that are both computationally complex and require the management of large amounts of distributed data. Grid computing is fast emerging as the solution to the problems posed by these applications. To evaluate the impact of resource optimisation algorithms, simulation of the …
Nowadays, software applications are characterised by great complexity, which arises from the need to reuse existing components and integrate them properly. The distribution and heterogeneity of the entities involved make the adoption of agent-oriented technology very useful. The paper presents the state of the art of CaseLP, an experimental …