Artur Rataj

We propose a softening of a genetic program by so-called fractional instructions. Thanks to their adjustable strengths, a new instruction can be introduced into a program gradually, and the other instructions may gradually adapt to the new member. In this way, the transformation of one candidate into another can be continuous. Such an approach makes it…
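The gradual introduction described above can be sketched as a weighted blend between a program state and a new instruction's effect. This is only an illustrative toy (the names `apply_fractional`, `alpha`, and the example instruction are not from the paper): the instruction's strength `alpha` interpolates between "absent" and "fully applied".

```python
# Hypothetical sketch of a "fractional instruction": the effect of a newly
# introduced instruction is blended with the untouched value using an
# adjustable strength alpha in [0, 1].  alpha = 0 means the instruction is
# absent; alpha = 1 means it is fully applied.

def apply_fractional(value, instruction, alpha):
    """Blend the result of `instruction` with the untouched `value`."""
    return (1.0 - alpha) * value + alpha * instruction(value)

double = lambda x: 2.0 * x   # an example instruction

# Gradually raising alpha transforms the candidate continuously.
print(apply_fractional(10.0, double, 0.0))   # → 10.0 (instruction off)
print(apply_fractional(10.0, double, 0.5))   # → 15.0 (half strength)
print(apply_fractional(10.0, double, 1.0))   # → 20.0 (fully introduced)
```

Because the output varies continuously with `alpha`, neighbouring candidates in the search space differ smoothly rather than abruptly.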
The model checking tools Uppaal and VerICS accept a description of a network of Timed Automata with Discrete Data (TADDs) as input. Thus, to verify a concurrent program written in Java by means of these tools, a TADD model of the program must first be built. Therefore, we have developed the J2TADD tool, which translates a Java program to a network of TADDs…
We describe a GPGPU-based Monte Carlo simulator integrated with Prism. It supports Markov chains with discrete or continuous time and a subset of properties expressible in PCTL, CSL and their variants extended with rewards. The simulator allows an automated statistical verification of results obtained using Prism's formal methods.
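The kind of statistical verification mentioned above can be illustrated on a discrete-time Markov chain: sample many paths and estimate a PCTL-style reachability probability P(F target) from the fraction of paths that reach the target. This is a minimal sketch, not Prism's actual API; the chain, function names, and parameters are all illustrative.

```python
import random

# Minimal sketch: Monte Carlo estimation of a reachability probability
# P(F target) in a discrete-time Markov chain, the kind of property a
# statistical simulator can check against results of formal methods.

def simulate(transitions, start, target, steps, runs, rng):
    """Fraction of sampled paths that reach `target` within `steps` steps."""
    hits = 0
    for _ in range(runs):
        state = start
        for _ in range(steps):
            if state == target:
                hits += 1
                break
            succ, probs = zip(*transitions[state].items())
            state = rng.choices(succ, weights=probs)[0]
    return hits / runs

# Toy chain: from state 0, move to 1 (prob 0.3) or stay (prob 0.7); 1 absorbs.
chain = {0: {0: 0.7, 1: 0.3}, 1: {1: 1.0}}
rng = random.Random(42)
estimate = simulate(chain, start=0, target=1, steps=50, runs=2000, rng=rng)
```

For this chain the probability of not reaching state 1 within 50 steps is about 0.7^49, so the estimate should be very close to 1; the GPGPU version parallelises the independent runs.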
We show how to extrapolate an optimal policy controlling a model which is itself too large for the policy to be found directly using probabilistic model checking (PMC). In particular, we look for a globally optimal resolution of non-determinism in several small Markov Decision Processes (MDPs) using PMC. We then use the resolution to find a respective set of decision…
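Resolving non-determinism in a small MDP, as the abstract describes, amounts to picking the action in each state that maximises the checked objective. A hedged sketch follows, assuming a reachability objective and using plain value iteration (the states, action names, and probabilities are invented for illustration, not taken from the paper).

```python
# Illustrative sketch: value iteration on a tiny MDP with a goal state (2)
# and a trap state (3).  Each state maps action names to lists of
# (probability, successor) pairs.  The returned policy is the resolution
# of non-determinism that maximises the probability of reaching the goal.

def value_iterate(mdp, goal, sweeps=100):
    v = {s: (1.0 if s == goal else 0.0) for s in mdp}
    policy = {}
    for _ in range(sweeps):
        for s in mdp:
            if s == goal or not mdp[s]:
                continue  # terminal states have no choice to resolve
            best_a, best_v = max(
                ((a, sum(p * v[t] for p, t in outs))
                 for a, outs in mdp[s].items()),
                key=lambda av: av[1])
            v[s], policy[s] = best_v, best_a
    return v, policy

mdp = {
    0: {"safe": [(1.0, 1)], "risky": [(0.6, 2), (0.4, 3)]},
    1: {"go": [(0.9, 2), (0.1, 3)]},
    2: {},  # goal, absorbing
    3: {},  # trap, absorbing
}
values, policy = value_iterate(mdp, goal=2)
```

Here "safe" from state 0 yields reachability 0.9 against 0.6 for "risky", so the optimal resolution picks "safe"; a PMC tool computes the same maximal probabilities symbolically.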
Raster images can exhibit a range of distortions connected to their raster structure. Upsampling may substantially amplify the raster structure of the original image, an effect known as aliasing, and the upsampling itself may introduce aliasing into the upsampled image as well. The presented method attempts to remove the aliasing using frequency…
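The frequency-domain idea can be demonstrated on a 1-D signal: the raster-like artifact lives in high-frequency bins, so zeroing those bins in the discrete Fourier spectrum removes it. This is a generic low-pass sketch, not the paper's method; the cutoff, signal, and function name are illustrative.

```python
import numpy as np

# Illustrative sketch: suppressing a high-frequency, raster-like artifact
# by zeroing DFT bins above a cutoff and inverting the transform.

def lowpass(signal, cutoff):
    """Zero all real-FFT bins whose frequency index exceeds `cutoff`."""
    spectrum = np.fft.rfft(signal)
    spectrum[cutoff + 1:] = 0.0
    return np.fft.irfft(spectrum, n=len(signal))

n = 64
t = np.arange(n)
smooth = np.sin(2 * np.pi * 2 * t / n)        # low-frequency content (bin 2)
alias = 0.3 * np.sin(2 * np.pi * 20 * t / n)  # raster-like artifact (bin 20)
cleaned = lowpass(smooth + alias, cutoff=4)
```

Because the artifact occupies exactly bin 20, everything above the cutoff is discarded and `cleaned` recovers the smooth component up to numerical error; on real images the separation is of course less clean.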
This paper discusses the propagation of signals in generic, densely connected multilayered feedforward neural networks. It is concluded that the dense connectivity, combined with the hyperbolic tangent activation functions of the neurons, may cause a highly random, spurious generalization, which decreases the overall performance and reliability of a neural…
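One aspect of this behaviour is easy to reproduce: pushing a signal through randomly initialised dense tanh layers with large weights drives most activations into saturation near ±1, a regime where small input differences can flip outputs almost at random. A toy demonstration (not the paper's experiment; sizes, scale, and seed are arbitrary):

```python
import numpy as np

# Toy illustration: propagate a random input through stacked, densely
# connected layers with tanh activations and deliberately large weights.
# The pre-activations grow with the fan-in, so tanh saturates towards +/-1.

rng = np.random.default_rng(0)
x = rng.normal(size=(1, 32))
for _ in range(8):
    w = rng.normal(scale=2.0, size=(32, 32))  # large-variance dense weights
    x = np.tanh(x @ w)

saturated = float(np.mean(np.abs(x) > 0.99))  # fraction of saturated units
```

With fan-in 32 and weight scale 2, the pre-activation standard deviation is roughly 10, so most units land deep in the flat tails of tanh; `saturated` typically exceeds one half.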
We extend the fractional genetic programming scheme with data elements that are no longer scalar, but instead resemble probability density functions. The extension fits straightforwardly into fractional programming, in which data elements are blended from several values. In our previous work, the blend produced a single scalar value. The…
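One natural reading of blending density-like elements is a mixture weighted by the instructions' strengths: instead of interpolating scalars, the blend combines discretised densities bin by bin. A hedged sketch under that assumption (the representation and the `blend` helper are illustrative, not the paper's definitions):

```python
# Hypothetical sketch: a data element is a discretised probability density
# (a list of bin masses summing to 1), and a fractional blend of several
# elements is their mixture, weighted by the instructions' strengths.

def blend(densities, weights):
    """Mixture of discretised densities; weights are fractional strengths."""
    total = sum(weights)
    bins = len(densities[0])
    return [sum(w * d[i] for w, d in zip(weights, densities)) / total
            for i in range(bins)]

a = [0.7, 0.2, 0.1]   # density concentrated on the first bin
b = [0.1, 0.2, 0.7]   # density concentrated on the last bin
mixed = blend([a, b], weights=[0.75, 0.25])   # → [0.55, 0.2, 0.25]
```

The mixture of normalised densities is itself normalised, so the blended element remains a valid density, whereas the earlier scalar scheme would have collapsed it to a single value.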