Publications
Transfer Feature Learning with Joint Distribution Adaptation
TLDR: We put forward a novel transfer learning solution, referred to as Joint Distribution Adaptation (JDA), to jointly adapt both the marginal and conditional distributions in a principled dimensionality reduction procedure and construct a new feature representation that is effective and robust under substantial distribution differences. (A minimal sketch of the distribution-matching criterion follows below.)
  • Citations: 698 · Influence: 183
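For orientation only: a minimal numpy sketch of the empirical maximum mean discrepancy (MMD) between projected source and target samples, the kind of marginal and class-conditional gap that JDA-style adaptation minimizes. The arrays Xs and Xt, the projection A, and the pseudo-labels are toy placeholders, not the paper's implementation.

```python
import numpy as np

def linear_mmd(Zs, Zt):
    """Empirical MMD under a linear kernel: squared distance between sample means."""
    return float(np.sum((Zs.mean(axis=0) - Zt.mean(axis=0)) ** 2))

rng = np.random.default_rng(0)
Xs = rng.normal(0.0, 1.0, size=(100, 20))   # toy source features
Xt = rng.normal(0.5, 1.2, size=(80, 20))    # toy target features (shifted)
A = rng.normal(size=(20, 5))                # placeholder projection (JDA learns this)

marginal_gap = linear_mmd(Xs @ A, Xt @ A)

# Conditional term: per-class gaps, using source labels and target pseudo-labels
# (random stand-ins here; JDA refines target pseudo-labels iteratively).
ys = rng.integers(0, 2, size=100)
yt = rng.integers(0, 2, size=80)
conditional_gap = sum(
    linear_mmd((Xs @ A)[ys == c], (Xt @ A)[yt == c]) for c in (0, 1)
)
print(marginal_gap, conditional_gap)
```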
Transfer Joint Matching for Unsupervised Domain Adaptation
TLDR: We propose a novel domain adaptation solution, referred to as Transfer Joint Matching (TJM), to jointly perform feature matching and instance reweighting across domains in a principled dimensionality reduction procedure. (A loose illustration of reweighted matching follows below.)
  • Citations: 377 · Influence: 87
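TJM learns its reweighting through a structured-sparsity term; the snippet below is only a loose stand-in showing how instance weights enter a mean-matching criterion under a linear kernel. The heuristic weights (inverse distance to the target centroid) are an assumption for illustration, not the paper's method.

```python
import numpy as np

def weighted_mean_discrepancy(Xs, Xt, w):
    """Squared distance between a weighted source mean and the target mean.
    Downweighting irrelevant source instances (small w_i) shrinks the gap."""
    w = np.asarray(w, dtype=float)
    w = w / w.sum()                          # normalize instance weights
    return float(np.sum((w @ Xs - Xt.mean(axis=0)) ** 2))

rng = np.random.default_rng(1)
Xs = rng.normal(0.0, 1.0, size=(60, 10))    # placeholder source features
Xt = rng.normal(0.8, 1.0, size=(40, 10))    # placeholder target features

uniform = np.ones(60)
# Crude heuristic: favor source points close to the target centroid.
heuristic = 1.0 / (1.0 + np.linalg.norm(Xs - Xt.mean(axis=0), axis=1))

print(weighted_mean_discrepancy(Xs, Xt, uniform))
print(weighted_mean_discrepancy(Xs, Xt, heuristic))   # typically smaller
```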
Transfer Sparse Coding for Robust Image Representation
TLDR: Sparse coding learns a set of basis functions such that each input signal can be well approximated by a linear combination of just a few of the bases. (A minimal sparse-coding sketch follows below.)
  • Citations: 158 · Influence: 33
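A minimal sketch of plain sparse coding via iterative soft-thresholding (ISTA) against a fixed random dictionary; the transfer-specific regularization that the paper adds is omitted, and D, lam, and the toy signal are placeholders.

```python
import numpy as np

def ista_sparse_code(D, x, lam=0.1, n_iter=200):
    """Minimize 0.5*||x - D a||^2 + lam*||a||_1 by iterative soft-thresholding."""
    L = np.linalg.norm(D, 2) ** 2            # Lipschitz constant of the gradient
    a = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ a - x)             # gradient of the quadratic term
        z = a - grad / L
        a = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return a

rng = np.random.default_rng(2)
D = rng.normal(size=(64, 128))               # overcomplete dictionary (placeholder)
D /= np.linalg.norm(D, axis=0)               # unit-norm atoms
a_true = np.zeros(128)
a_true[[3, 40, 99]] = [1.5, -2.0, 0.7]       # signal built from three atoms
x = D @ a_true + 0.01 * rng.normal(size=64)

a_hat = ista_sparse_code(D, x)
print(np.count_nonzero(np.abs(a_hat) > 1e-3))   # only a few active atoms
```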
Domain Invariant Transfer Kernel Learning
TLDR: We propose a transfer kernel learning (TKL) approach to learn a domain-invariant kernel by directly matching source and target distributions in the reproducing kernel Hilbert space. (A sketch of the RKHS matching criterion follows below.)
  • Citations: 119 · Influence: 22
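TKL itself constructs the domain-invariant kernel by extrapolating the target kernel's spectrum; the sketch below only illustrates the underlying criterion, the distance between source and target mean embeddings in an RBF-induced RKHS, on toy data.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    """Gaussian kernel matrix k(x, y) = exp(-gamma * ||x - y||^2)."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def mmd2_rkhs(Xs, Xt, gamma=0.5):
    """Squared distance between source and target mean embeddings in the RKHS."""
    return (rbf_kernel(Xs, Xs, gamma).mean()
            + rbf_kernel(Xt, Xt, gamma).mean()
            - 2.0 * rbf_kernel(Xs, Xt, gamma).mean())

rng = np.random.default_rng(3)
Xs = rng.normal(0.0, 1.0, size=(50, 8))      # placeholder source sample
Xt = rng.normal(1.0, 1.0, size=(50, 8))      # shifted target sample
print(mmd2_rkhs(Xs, Xs[:25]))                # small: same distribution
print(mmd2_rkhs(Xs, Xt))                     # larger: distributions differ
```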
A novel approach for process mining based on event types
TLDR: A novel approach for process mining based on two event types, i.e., START and COMPLETE, is proposed. (A toy interval-overlap sketch follows below.)
  • Citations: 154 · Influence: 12
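One recoverable idea behind using both event types is that START/COMPLETE pairs give activity execution intervals, and overlapping intervals suggest parallelism. The sketch below is a hedged toy reading of that idea; the log layout and field names are assumptions, not the paper's algorithm.

```python
from collections import defaultdict

# Toy event log: (case_id, activity, event_type, timestamp); layout is an assumption.
log = [
    ("c1", "A", "START", 1), ("c1", "A", "COMPLETE", 4),
    ("c1", "B", "START", 2), ("c1", "B", "COMPLETE", 5),   # B overlaps A
    ("c1", "C", "START", 6), ("c1", "C", "COMPLETE", 7),
]

# Pair each activity's START with its COMPLETE to get execution intervals per case.
intervals = defaultdict(dict)
for case, act, etype, ts in log:
    intervals[(case, act)][etype] = ts

by_case = defaultdict(list)
for (case, act), ev in intervals.items():
    by_case[case].append((act, ev["START"], ev["COMPLETE"]))

concurrent = set()
for case, items in by_case.items():
    for i, (a1, s1, e1) in enumerate(items):
        for a2, s2, e2 in items[i + 1:]:
            if s1 < e2 and s2 < e1:          # intervals overlap -> likely parallel
                concurrent.add(frozenset((a1, a2)))

print(concurrent)   # {frozenset({'A', 'B'})}
```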
A workflow net similarity measure based on transition adjacency relations
TLDR: We define the similarity and the distance based on firing sequences, using workflow nets (WF-nets) as the unified reference framework. (A small TAR-similarity sketch follows below.)
  • Citations: 94 · Influence: 11
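A small sketch of transition adjacency relations (TARs) extracted from firing sequences, with a Jaccard-style ratio of shared relations used as the similarity score; the exact formula and the toy firing sequences are assumptions for illustration, not the paper's definition verbatim.

```python
def tars(firing_sequences):
    """Transition adjacency relations: ordered pairs (a, b) such that b can
    directly follow a in some firing sequence of the net."""
    return {(seq[i], seq[i + 1])
            for seq in firing_sequences for i in range(len(seq) - 1)}

def tar_similarity(seqs1, seqs2):
    """Jaccard-style ratio of shared adjacency relations (assumed formula)."""
    t1, t2 = tars(seqs1), tars(seqs2)
    return len(t1 & t2) / len(t1 | t2) if (t1 | t2) else 1.0

# Firing sequences of two toy WF-nets (placeholders for real net semantics).
net1 = [("a", "b", "c"), ("a", "c", "b")]    # b and c in parallel
net2 = [("a", "b", "c")]                     # strictly sequential
print(tar_similarity(net1, net2))            # shared {(a,b),(b,c)} -> 2/4 = 0.5
```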
Mining process models with non-free-choice constructs
TLDR: We propose an algorithm that is able to deal with both kinds of causal dependencies between tasks, i.e., explicit and implicit ones.
  • Citations: 258 · Influence: 9
Deep Learning of Transferable Representation for Scalable Domain Adaptation
TLDR: We propose a unified deep adaptation framework for jointly learning a transferable representation and classifier to enable scalable domain adaptation, taking advantage of both deep learning and optimal two-sample matching.
  • Citations: 92 · Influence: 8
DLFuzz: differential fuzzing testing of deep learning systems
TLDR: In this paper, we propose DLFuzz, the first differential fuzzing testing framework designed to guide DL systems toward exposing incorrect behaviors. (A toy gradient-guided mutation sketch follows below.)
  • Citations: 59 · Influence: 7
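DLFuzz combines neuron-coverage and prediction-divergence objectives on real DNNs; the sketch below is a heavily simplified stand-in on a toy linear softmax model, showing only the gradient-guided mutation idea: nudge an input within an L-infinity budget until its predicted class flips. All names and constants are placeholders.

```python
import numpy as np

rng = np.random.default_rng(4)
W = rng.normal(size=(16, 3))                 # toy linear "model": logits = x @ W
x0 = rng.normal(size=16)                     # seed input
orig = int(np.argmax(x0 @ W))                # original prediction

def mutate(x0, W, orig, eps=0.5, step=0.02, max_iter=200):
    """Move x toward the runner-up class along the logit-difference gradient."""
    x = x0.copy()
    for _ in range(max_iter):
        logits = x @ W
        if int(np.argmax(logits)) != orig:   # behavior diverged from the seed
            return x
        runner_up = int(np.argsort(logits)[-2])
        grad = W[:, runner_up] - W[:, orig]  # d(logit_ru - logit_orig)/dx
        x = x + step * np.sign(grad)         # signed gradient step
        x = np.clip(x, x0 - eps, x0 + eps)   # stay within the L-inf budget
    return None

adv = mutate(x0, W, orig)
if adv is not None:
    print("prediction changed:", orig, "->", int(np.argmax(adv @ W)))
```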
Overlap-free Karatsuba-Ofman polynomial multiplication algorithms
TLDR: The authors describe how a simple way to split input operands allows for fast VLSI implementations of subquadratic GF(2)[x] Karatsuba-Ofman multipliers. (A software sketch of the classical recursion follows below.)
  • Citations: 70 · Influence: 6
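The paper's contribution is an overlap-free operand split aimed at VLSI; for context only, here is a software sketch of the classical Karatsuba-Ofman recursion over GF(2)[x] with the ordinary high/low split, packing polynomials into Python integers one bit per coefficient. The threshold and operand sizes are arbitrary choices.

```python
import random

def gf2_mul_schoolbook(a, b):
    """Carry-less multiplication: GF(2)[x] product of bit-packed polynomials."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        b >>= 1
    return r

def gf2_mul_karatsuba(a, b, threshold=32):
    """Subquadratic GF(2)[x] multiplication via the classical high/low split."""
    n = max(a.bit_length(), b.bit_length())
    if n <= threshold:
        return gf2_mul_schoolbook(a, b)
    m = n // 2
    mask = (1 << m) - 1
    a_lo, a_hi = a & mask, a >> m
    b_lo, b_hi = b & mask, b >> m
    lo = gf2_mul_karatsuba(a_lo, b_lo, threshold)
    hi = gf2_mul_karatsuba(a_hi, b_hi, threshold)
    mid = gf2_mul_karatsuba(a_lo ^ a_hi, b_lo ^ b_hi, threshold)
    # Over GF(2), addition and subtraction are both XOR.
    return (hi << (2 * m)) ^ ((mid ^ hi ^ lo) << m) ^ lo

random.seed(0)
a, b = random.getrandbits(233), random.getrandbits(233)   # e.g. GF(2^233)-sized operands
assert gf2_mul_karatsuba(a, b) == gf2_mul_schoolbook(a, b)
```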