# Finite-State Complexity and the Size of Transducers

```bibtex
@inproceedings{Calude2010FiniteStateCA,
  title     = {Finite-State Complexity and the Size of Transducers},
  author    = {Cristian S. Calude and Kai Salomaa and Tania K. Roblot},
  booktitle = {DCFS},
  year      = {2010}
}
```

Finite-state complexity is a variant of algorithmic information theory obtained by replacing Turing machines with finite transducers. We consider the state-size of transducers needed for minimal descriptions of arbitrary strings and, as our main result, we show that the state-size hierarchy with respect to a standard encoding is infinite. We consider also hierarchies yielded by more general computable encodings.
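In this setting, a description of a string is a pair (transducer, input) whose run outputs the string, and the string's finite-state complexity is the size of the smallest such description. A minimal sketch, assuming a hypothetical transition-table representation (the paper's standard encoding of transducers differs), shows a one-state transducer reproducing a 12-symbol string from a 6-bit input:

```python
# Illustrative sketch: a deterministic finite transducer as a transition table.
# delta maps (state, input_symbol) -> (next_state, output_string).
# A description of a string x is a pair (transducer, input) whose run outputs
# x; the finite-state complexity of x is the size of the smallest such pair.
# (Hypothetical encoding, for illustration only.)

def run_transducer(delta, start, inp):
    """Run the transducer from `start` on `inp`, concatenating outputs."""
    state, out = start, []
    for symbol in inp:
        state, piece = delta[(state, symbol)]
        out.append(piece)
    return "".join(out)

# A one-state transducer that outputs "ab" on 0 and "ba" on 1.
delta = {
    (0, "0"): (0, "ab"),
    (0, "1"): (0, "ba"),
}

# The 12-symbol string "abababbababa" is described by the 6-bit input "000111".
print(run_transducer(delta, 0, "000111"))  # abababbababa
```

The state-size hierarchy studied in the paper asks how many states such minimal transducers need as the strings to be described grow.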

## 9 Citations

### The Computation of Finite-State Complexity by Tania K. Roblot

- Computer Science
- 2011

This thesis proposes a new variant of Algorithmic Information Theory constructed around finite-state complexity, a computable counterpart to Kolmogorov complexity based on finite transducers rather than Turing machines, and presents a first attempt at applying finite-state complexity in a practical setting.

### A linearly computable measure of string complexity

- Computer Science, Mathematics
- Theor. Comput. Sci.
- 2012

A measure of string complexity, called I-complexity, computable in linear time and space, is presented, which counts the number of different substrings in a given string.
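The measure above counts distinct substrings. A naive quadratic sketch (not the paper's linear-time algorithm) makes the quantity being counted concrete:

```python
# Count the distinct non-empty substrings of s by enumerating all O(n^2)
# substrings into a set. The I-complexity paper achieves a linear-time,
# linear-space measure; this brute-force version only illustrates what is
# being counted.

def distinct_substring_count(s):
    subs = set()
    n = len(s)
    for i in range(n):
        for j in range(i + 1, n + 1):
            subs.add(s[i:j])
    return len(subs)

print(distinct_substring_count("aaaa"))  # 4: "a", "aa", "aaa", "aaaa"
print(distinct_substring_count("abab"))  # 7
```

Highly repetitive strings have few distinct substrings, so they score as low-complexity, matching the intuition behind the measure.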

### Advanced Topics on State Complexity of Combined Operations

- Computer Science
- 2010

This thesis discusses the state complexities of individual operations on regular languages, including union, intersection, star, catenation, reversal and so on, and introduces the concept of estimation and approximation of state complexity.

### Invariance and Universality of Complexity

- Computer Science, Mathematics
- Computation, Physics and Beyond
- 2012

Without any assumptions about encodings of functions and arguments, and without any assumptions about computability or computing models, the notion of complexity is introduced, a general invariance theorem is proved, and sufficient conditions for complexity to be computable are stated.

### Searching for Compact Hierarchical Structures in DNA by means of the Smallest Grammar Problem

- Computer Science
- 2011

It is proved that the number of smallest grammars can be exponential in the size of the sequence; the stability of the discovered structures across minimal grammar parsings is then analysed on real-life examples.

### A Review of Methods for Estimating Algorithmic Complexity and Beyond: Options, Challenges, and New Directions

- Computer Science
- 2020

It is explained how different approaches to algorithmic complexity can explore the relaxation of different necessary and sufficient conditions in their exploration of numerical applicability with some of those approaches taking greater risks than others in exchange for application relevancy.

### A Review of Methods for Estimating Algorithmic Complexity: Options, Challenges, and New Directions

- Computer Science
- Entropy
- 2020

It will be explained how different approaches to algorithmic complexity can explore the relaxation of different necessary and sufficient conditions in their pursuit of numerical applicability, with some of these approaches entailing greater risks than others in exchange for greater relevance.

## References

### Resource-Bounded Kolmogorov Complexity Revisited

- Mathematics
- SIAM J. Comput.
- 1997

A fresh look is taken at CD complexity, where CD^t(x) is the smallest program that distinguishes x from all other strings in time t(|x|), and CND complexity, a new nondeterministic variant of CD complexity, is introduced.

### Automaticity: Properties of a Measure of Descriptional Complexity

- Mathematics
- STACS
- 1994

The notion of automaticity is explored, which attempts to model how “close” a function f is to a finite-state function.

### Approximating the smallest grammar: Kolmogorov complexity in natural models

- Computer Science
- STOC '02
- 2002

The main result is an exponential improvement of the best proved approximation ratio: an O(log(n/g*)) approximation algorithm is given for the smallest non-deterministic finite automaton with advice that produces a given string.

### Automaticity I: Properties of a Measure of Descriptional Complexity

- Mathematics
- J. Comput. Syst. Sci.
- 1996

Let Σ and Δ be nonempty alphabets with Σ finite. Let f be a function mapping Σ* to Δ. We explore the notion of automaticity, which attempts to model how “close” f is to a finite-state function. Formally, the…

### Grammar Compression, LZ-Encodings, and String Algorithms with Implicit Input

- Computer Science
- ICALP
- 2004

Grammar compression is more convenient than LZ-encoding; its size differs from that of the LZ-encoding by at most a logarithmic factor, and the constructive proof is based on a concept similar to balanced trees.

### 𝒫𝒮-regular languages

- Computer Science, Linguistics
- Int. J. Comput. Math.
- 2011

A class of generalized regular languages, namely 𝒫𝒮-regular languages, is introduced, and some characterizations of such generalized regular languages are given.

### Approximation algorithms for grammar-based compression

- Computer Science
- SODA '02
- 2002

The approximation ratio is analyzed, that is, the maximum ratio between the size of the generated grammar and the smallest possible grammar over all inputs, for four previously-proposed grammar-based compression algorithms.
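A grammar-based compressor represents a string by a context-free grammar that generates exactly that string, and the approximation ratio above compares the produced grammar's size with the smallest one. A minimal sketch, using a hypothetical straight-line grammar chosen for illustration, expands such a grammar and measures its size as the total length of the right-hand sides:

```python
# A straight-line grammar generating exactly one string: every nonterminal
# (uppercase) has exactly one rule, and the rules are acyclic; lowercase
# symbols are terminals. Grammar size is the total length of all right-hand
# sides. (Hypothetical example grammar, for illustration only.)

def expand(rules, symbol):
    """Recursively expand a symbol into the terminal string it generates."""
    if symbol.islower():  # terminal symbol
        return symbol
    return "".join(expand(rules, s) for s in rules[symbol])

rules = {
    "S": "AAB",   # S -> A A B
    "A": "BB",    # A -> B B
    "B": "ab",    # B -> a b
}

grammar_size = sum(len(rhs) for rhs in rules.values())
print(expand(rules, "S"))  # ababababab (10 symbols from a size-7 grammar)
print(grammar_size)        # 7
```

An algorithm's approximation ratio is then the worst case, over all inputs, of its grammar size divided by the smallest achievable grammar size for that input.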

### Algorithmic information theory

- Computer Science
- Scholarpedia
- 2007

This article is a brief guide to the field of algorithmic information theory: its underlying philosophy, the major subfields, applications, and history are presented, together with a map of the field.