We report on careful implementations of seven algorithms for solving the problem of finding a maximum transversal of a sparse matrix. We analyze the algorithms and discuss the design choices. To the best of our knowledge, this is the most comprehensive comparison of maximum transversal algorithms based on augmenting paths. Previous papers with the same …
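As a point of reference, the following is a minimal sketch of the simplest augmenting-path approach to maximum transversal (a DFS-based search in the spirit of Kuhn's algorithm). It is not one of the seven tuned implementations studied in the paper; the row-wise adjacency representation and function names are illustrative.

```python
# Minimal augmenting-path (DFS-based) maximum transversal sketch.
# The sparse matrix is given as a row-wise adjacency list: adj[r] lists
# the columns with a nonzero entry in row r. Names are illustrative only;
# real implementations avoid recursion for large inputs.

def maximum_transversal(adj, n_cols):
    match_col = [-1] * n_cols  # match_col[c] = row matched to column c, or -1

    def augment(r, visited):
        # Try to match row r, possibly by rerouting previously matched rows.
        for c in adj[r]:
            if not visited[c]:
                visited[c] = True
                if match_col[c] == -1 or augment(match_col[c], visited):
                    match_col[c] = r
                    return True
        return False

    size = 0
    for r in range(len(adj)):
        if augment(r, [False] * n_cols):
            size += 1
    return size, match_col

if __name__ == "__main__":
    # 3x3 matrix with nonzeros (0,0), (0,1), (1,1), (2,2): full transversal.
    size, match_col = maximum_transversal([[0, 1], [1], [2]], 3)
    print(size)  # 3
```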
The betweenness centrality metric has long been intriguing for graph analyses and is used in various applications. Yet, it is one of the most computationally expensive kernels in graph mining. In this work, we investigate a set of techniques to make the betweenness centrality computations faster on GPUs as well as on heterogeneous CPU/GPU architectures. Our …
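For context, a plain sequential Brandes-style betweenness computation is sketched below; the paper's contribution lies in accelerating this kernel on GPUs and CPU/GPU systems, which this illustrative Python version does not attempt.

```python
# Sequential Brandes-style betweenness centrality for an unweighted graph
# given as an adjacency list. Shown only as the baseline kernel; the paper
# is about making this computation fast on GPUs and CPU/GPU platforms.
from collections import deque

def betweenness_centrality(adj):
    n = len(adj)
    bc = [0.0] * n
    for s in range(n):
        # BFS phase: shortest-path counts (sigma) and predecessor lists.
        dist = [-1] * n
        sigma = [0] * n
        preds = [[] for _ in range(n)]
        dist[s], sigma[s] = 0, 1
        order, q = [], deque([s])
        while q:
            v = q.popleft()
            order.append(v)
            for w in adj[v]:
                if dist[w] < 0:
                    dist[w] = dist[v] + 1
                    q.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]
                    preds[w].append(v)
        # Dependency accumulation in reverse order of discovery.
        delta = [0.0] * n
        for w in reversed(order):
            for v in preds[w]:
                delta[v] += (sigma[v] / sigma[w]) * (1.0 + delta[w])
            if w != s:
                bc[w] += delta[w]
    # For undirected graphs, each pair is counted from both endpoints,
    # so the values can be halved if the standard definition is desired.
    return bc
```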
The literature search has always been an important part of academic research. It greatly helps to improve the quality of the research process and output, and increases the efficiency of researchers in terms of their novel contribution to science. As the number of published papers increases every year, a manual search becomes more exhausting even with …
Centrality metrics have been shown to be highly correlated with the importance and loads of the nodes within the network traffic. In this work, we provide fast incremental algorithms for closeness centrality computation. Our algorithms efficiently compute the closeness centrality values upon changes in network topology, i.e., edge insertions and deletions. We …
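For illustration, the from-scratch computation that incremental algorithms try to avoid repeating looks roughly like the sketch below (one BFS per source, using one simple closeness variant for possibly disconnected graphs); the paper's incremental algorithms update these values after an edge insertion or deletion instead of recomputing everything.

```python
# From-scratch closeness centrality: one BFS per source vertex.
# This is the baseline an incremental algorithm avoids rerunning in full
# after a topology change; the disconnected-graph handling below is one
# simple variant, not necessarily the definition used in the paper.
from collections import deque

def closeness_centrality(adj):
    n = len(adj)
    cc = [0.0] * n
    for s in range(n):
        dist = [-1] * n
        dist[s] = 0
        q = deque([s])
        total, reached = 0, 0
        while q:
            v = q.popleft()
            for w in adj[v]:
                if dist[w] < 0:
                    dist[w] = dist[v] + 1
                    total += dist[w]
                    reached += 1
                    q.append(w)
        cc[s] = reached / total if total > 0 else 0.0
    return cc
```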
Result diversification has gained a lot of attention as a way to answer ambiguous queries and to tackle the redundancy problem in the results. In the last decade, diversification has been applied on or integrated into the process of PageRank- or eigenvector-based methods that run on various graphs, including social networks, collaboration networks in …
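As background, the kind of eigenvector-based ranking that such diversification methods post-process or modify is sketched below as a plain PageRank power iteration; the diversification step itself is not shown, and the damping factor and tolerance are illustrative defaults.

```python
# Plain PageRank power iteration on an adjacency list, shown only as the
# kind of eigenvector-based ranking that diversification techniques build
# on or modify. Damping factor, tolerance, and iteration cap are illustrative.
def pagerank(adj, damping=0.85, tol=1e-9, max_iter=100):
    n = len(adj)
    rank = [1.0 / n] * n
    out_deg = [len(nbrs) for nbrs in adj]
    for _ in range(max_iter):
        new = [(1.0 - damping) / n] * n
        for v, nbrs in enumerate(adj):
            if not nbrs:
                # Dangling node: spread its rank uniformly over all vertices.
                share = damping * rank[v] / n
                new = [r + share for r in new]
            else:
                share = damping * rank[v] / out_deg[v]
                for w in nbrs:
                    new[w] += share
        if sum(abs(a - b) for a, b in zip(new, rank)) < tol:
            return new
        rank = new
    return rank
```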
The betweenness metric has long been intriguing and is used in many analyses. Yet, it is one of the most computationally expensive kernels in graph mining. For that reason, making betweenness centrality computations faster is an important and well-studied problem. In this work, we propose the framework BADIOS, which compresses a network and shatters it …
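As an illustration of network compression in general, the sketch below peels degree-1 vertices, whose contribution to shortest paths is easy to account for separately. It is a generic example of this class of preprocessing, not the specific set of compression and shattering rules used by BADIOS.

```python
# Illustrative compression step: iteratively peel degree-1 vertices so the
# expensive centrality kernel runs on a smaller graph. The removed vertices
# and their attachment points are recorded so their contribution can be
# folded back in afterwards (that correction is not shown here).
def peel_degree_one(adj):
    # adj: dict mapping vertex -> set of neighbors; a modified copy is returned.
    g = {v: set(nbrs) for v, nbrs in adj.items()}
    removed = []
    stack = [v for v, nbrs in g.items() if len(nbrs) == 1]
    while stack:
        v = stack.pop()
        if v not in g or len(g[v]) != 1:
            continue
        (u,) = g[v]
        removed.append((v, u))  # remember the attachment point
        g[u].discard(v)
        del g[v]
        if u in g and len(g[u]) == 1:
            stack.append(u)
    return g, removed
```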
We propose a directed hypergraph model and a refinement heuristic to distribute communicating tasks among the processing units in a distributed-memory setting. The aim is to achieve load balance and minimize the maximum data sent by a processing unit. We also take two other communication metrics into account with a tie-breaking scheme. With this approach, …
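For comparison, a naive greedy baseline that targets the same two objectives (computational load balance and the maximum data sent by any unit) might look like the sketch below; the directed hypergraph model and refinement heuristic from the paper are not reproduced here, and the task weights and send list are hypothetical inputs.

```python
# Naive greedy baseline: assign each task to the currently least-loaded unit,
# then evaluate the maximum data sent by any unit across cross-unit messages.
# This is only a strawman for the objectives the paper optimizes; it ignores
# communication while assigning, unlike the directed hypergraph model.

def greedy_assign(task_loads, k):
    # task_loads: per-task computational weights; k: number of processing units.
    unit_load = [0.0] * k
    part = [0] * len(task_loads)
    # Place the heaviest tasks first on the least-loaded unit.
    for t in sorted(range(len(task_loads)), key=lambda t: -task_loads[t]):
        u = min(range(k), key=lambda i: unit_load[i])
        unit_load[u] += task_loads[t]
        part[t] = u
    return part, unit_load

def max_sent(part, sends, k):
    # sends: list of (src_task, dst_task, volume); only cross-unit traffic counts.
    sent = [0.0] * k
    for s, d, vol in sends:
        if part[s] != part[d]:
            sent[part[s]] += vol
    return max(sent)
```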
Analyzing networks requires complex algorithms to extract meaningful information. Centrality metrics have been shown to be correlated with the importance and loads of the nodes in network traffic. Here, we are interested in the problem of centrality-based network management. The problem has many applications, such as verifying the robustness of the networks and …