Steven K. Tjoa

The widespread availability of photo editing software has made it easy to create visually convincing digital image forgeries. To address this problem, there has been much recent work in the field of digital image forensics. There has been little work, however, in the field of anti-forensics, which seeks to develop a set of techniques designed to fool …
Recently, a number of digital image forensic techniques have been developed which are capable of identifying an image's origin, tracing its processing history, and detecting image forgeries. Though these techniques are capable of identifying standard image manipulations, they do not address the possibility that anti-forensic operations may be designed and …
Recent developments in multimedia processing and network technologies have facilitated the distribution and sharing of multimedia over networks and increased the security demands on multimedia content. Traditional image content protection schemes use extrinsic approaches, such as watermarking or fingerprinting. However, under many circumstances, …
In non-intrusive forensic analysis, we wish to find information and properties about a piece of data without any reference to the original data prior to processing. An important first step in forensic analysis is the detection and estimation of block processing. Most existing work on block measurement makes strong assumptions about the data related to the block …
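The core idea behind block-processing detection can be illustrated with a minimal numpy sketch (not taken from the paper; the synthetic image and the 8-pixel block size are illustrative assumptions): an image processed in B×B blocks tends to show larger pixel differences across block boundaries than within blocks, so comparing the two difference populations reveals the block grid.

```python
import numpy as np

rng = np.random.default_rng(0)
B = 8
# Synthetic "block-processed" image: each 8x8 block is constant.
blocks = rng.uniform(0, 255, size=(16, 16))
img = np.kron(blocks, np.ones((B, B)))

# Absolute differences between horizontally neighboring pixels.
d = np.abs(np.diff(img, axis=1))           # shape (128, 127)
cols = np.arange(d.shape[1])
at_boundary = (cols % B) == B - 1          # diffs that straddle a block edge
boundary_score = d[:, at_boundary].mean()
interior_score = d[:, ~at_boundary].mean()
print(boundary_score > interior_score)     # block grid leaves a periodic signature
```

In practice the block size and grid offset are unknown, so a detector would sweep candidate periods and offsets and look for a statistically significant gap between the two scores.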
The area of non-intrusive forensic analysis has found many applications in the area of digital imaging. One unexplored area is the identification of source coding in digital images. In other words, given a digital image, can we identify which compression scheme was used, if any? This paper focuses on the aspect of transform coder classification, where we …
Dictionary learning through matrix factorization has become widely popular for performing music transcription and source separation. These methods learn a concise set of dictionary atoms which represent spectrograms of musical objects. However, there is no guarantee that the atoms learned will be perceptually meaningful, particularly when there exists …
Vein recognition is an emerging technology that has gained increasing importance in the biometric market in recent years. In this paper, a novel palm vein recognition system is presented which is based on the Enhanced Local Gabor Binary Patterns Histogram Sequence (ELGBPHS) algorithm. ELGBPHS is a well-established face recognition algorithm …
Factorization of polyphonic musical signals remains a difficult problem due to the presence of overlapping harmonics. Existing dictionary learning methods cannot guarantee that the learned dictionary atoms are semantically meaningful. In this paper, we explore the factorization of harmonic musical signals when a fixed dictionary of harmonic sounds is(More)
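The fixed-dictionary setting described above can be sketched in a few lines of numpy (a toy illustration, not the paper's method; the Gaussian harmonic templates, fundamental bins, and mixing weights are all invented for the example): with the dictionary W held fixed, only the nonnegative activations need to be estimated, e.g. by multiplicative updates.

```python
import numpy as np

F, n_notes = 256, 4
bins = np.arange(F)

def harmonic_atom(f0, n_harm=6, width=1.5):
    """Hypothetical harmonic template: Gaussian bumps at integer
    multiples of the fundamental bin f0, with 1/h amplitude decay."""
    atom = np.zeros(F)
    for h in range(1, n_harm + 1):
        atom += (1.0 / h) * np.exp(-0.5 * ((bins - h * f0) / width) ** 2)
    return atom / np.linalg.norm(atom)

# Fixed dictionary of harmonic sounds, one atom per note.
f0s = [10, 13, 17, 22]
W = np.stack([harmonic_atom(f) for f in f0s], axis=1)   # (F, n_notes)

# A "polyphonic" spectral frame mixing notes 0 and 2.
v = 1.0 * W[:, 0] + 0.5 * W[:, 2]

# With W fixed, the activations solve a nonnegative least-squares
# problem; a few multiplicative updates converge quickly here.
h = np.full(n_notes, 0.25)
for _ in range(500):
    h *= (W.T @ v) / (W.T @ W @ h + 1e-12)

print(np.argsort(h)[-2:])   # the two active notes dominate
```

Because the atoms are fixed and semantically labeled (one per note), the recovered activations are directly interpretable, which is exactly what unconstrained dictionary learning cannot guarantee.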
Super-resolution (SR) is a well-studied problem in signal processing, particularly with regard to image and video applications. SR techniques are useful because unlike simple interpolation, they create a high-resolution signal from a low-resolution input by generating new information that was not previously present. A growing body of research shows progress …
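The contrast with interpolation can be made concrete with a small numpy sketch (an illustration of the general principle, not of any specific SR method): ideal band-limited interpolation, implemented here by zero-padding the spectrum, upsamples a signal without adding any frequency content above the original band, whereas SR methods aim to synthesize exactly that missing high-frequency detail.

```python
import numpy as np

N, factor = 64, 4
n = np.arange(N)
x = np.sin(2 * np.pi * 5 * n / N)          # low-frequency input signal

# Upsample by zero-padding the spectrum (ideal sinc interpolation).
X = np.fft.fft(x)
Xup = np.zeros(N * factor, dtype=complex)
Xup[:N // 2] = X[:N // 2]
Xup[-N // 2:] = X[-N // 2:]
xup = np.fft.ifft(Xup).real * factor       # rescale for the longer IFFT

# The interpolated signal has no energy above the original band.
Sup = np.abs(np.fft.fft(xup))
high = Sup[N // 2 + 1 : -N // 2]
print(np.allclose(high, 0))
```

The interpolated signal passes exactly through the original samples but its high-frequency bins stay at zero; any "new" detail in an SR output must instead come from a prior, such as learned patch correspondences or a trained model.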
Nonnegative matrix factorization (NMF) is a widely-used tool for obtaining low-rank approximations of nonnegative data such as digital images, audio signals, textual data, financial data, and more. One disadvantage of the basic NMF formulation is its inability to control the amount of dependence among the learned dictionary atoms. Enforcing dependence …