Characteristic-dependent linear rank inequalities via complementary vector spaces

  • Victor Peña, Humberto Sarria
  • Mathematics
    Journal of Information and Optimization Sciences
  • pages 345–369
Abstract: A characteristic-dependent linear rank inequality is a linear inequality that holds for ranks of subspaces of a vector space over a finite field of a given characteristic, but does not in general hold over other characteristics. In this paper, we produce new characteristic-dependent linear rank inequalities by a technique alternative to Dougherty's usual inverse-function method [9]. We take up some ideas of Blasiak [4], applied to certain complementary vector spaces, in order…

How to Find New Characteristic-Dependent Linear Rank Inequalities using Binary Matrices as a Guide

This paper shows a method for producing characteristic-dependent linear rank inequalities using binary matrices with suitable ranks over different fields; some of the resulting inequalities imply those presented in [1,9].
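As a concrete illustration of a binary matrix whose rank depends on the field (a standard example, not one taken from the paper itself), the incidence matrix of the Fano plane has full rank 7 over the rationals but rank 4 over GF(2). A minimal sketch:

```python
from fractions import Fraction

# Incidence matrix of the Fano plane (rows = lines, columns = points).
# Illustrative example; the paper's own matrices may differ.
FANO = [
    [1, 1, 1, 0, 0, 0, 0],
    [1, 0, 0, 1, 1, 0, 0],
    [1, 0, 0, 0, 0, 1, 1],
    [0, 1, 0, 1, 0, 1, 0],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 1, 0, 0, 1],
    [0, 0, 1, 0, 1, 1, 0],
]

def rank_over_q(matrix):
    """Rank over the rationals via exact Gaussian elimination."""
    rows = [[Fraction(x) for x in row] for row in matrix]
    rank = 0
    for col in range(len(rows[0])):
        pivot = next((r for r in range(rank, len(rows)) if rows[r][col] != 0), None)
        if pivot is None:
            continue
        rows[rank], rows[pivot] = rows[pivot], rows[rank]
        for r in range(len(rows)):
            if r != rank and rows[r][col] != 0:
                factor = rows[r][col] / rows[rank][col]
                rows[r] = [a - factor * b for a, b in zip(rows[r], rows[rank])]
        rank += 1
    return rank

def rank_over_gf2(matrix):
    """Rank over GF(2): rows packed as bitmasks, elimination by XOR."""
    rows = [sum(bit << j for j, bit in enumerate(row)) for row in matrix]
    rank = 0
    for col in range(len(matrix[0])):
        pivot = next((r for r in range(rank, len(rows)) if rows[r] >> col & 1), None)
        if pivot is None:
            continue
        rows[rank], rows[pivot] = rows[pivot], rows[rank]
        for r in range(len(rows)):
            if r != rank and rows[r] >> col & 1:
                rows[r] ^= rows[rank]
        rank += 1
    return rank

print(rank_over_q(FANO), rank_over_gf2(FANO))
```

The rank gap between characteristics is exactly the kind of field-dependent behavior such a matrix can "guide" an inequality construction toward.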

How to Find New Characteristic-Dependent Linear Rank Inequalities using Secret Sharing

Using ideas from secret sharing, a theorem is shown that produces characteristic-dependent linear rank inequalities, which can be used to obtain lower bounds on information ratios in linear secret sharing.

On strongly α-topological vector spaces

  • H. Ibrahim
  • Mathematics
    Journal of Interdisciplinary Mathematics
  • 2022



Linear rank inequalities on five or more variables

It is proved that there are essentially new inequalities at each number of variables beyond four (a result also proved recently by Kinser), and a list of 24 inequalities is given which, together with the Shannon and Ingleton inequalities, generates all linear rank inequalities on five variables.

Computations of linear rank inequalities on six variables

  • R. Dougherty
  • Mathematics
    2014 IEEE International Symposium on Information Theory
  • 2014
Partial results of computations on six-variable linear rank inequalities are presented, showing that the number of sharp inequalities (those which cannot be generated from other inequalities) is more than one billion (counting variable-permuted forms).

Characteristic-Dependent Linear Rank Inequalities With Applications to Network Coding

Two characteristic-dependent linear rank inequalities are given for eight variables and applications of these inequalities to the computation of capacity upper bounds in network coding are demonstrated.

New inequalities for subspace arrangements

  • R. Kinser
  • Mathematics
    J. Comb. Theory, Ser. A
  • 2011

Networks, Matroids, and Non-Shannon Information Inequalities

The Vamos network is constructed, and it is proved that Shannon-type information inequalities are insufficient even for computing network coding capacities of multiple-unicast networks.

Lexicographic Products and the Power of Non-linear Network Coding

The technique uses linear programs to establish separations between combinatorial and coding-theoretic parameters and applies hypergraph lexicographic products to amplify these separations, showing a polynomial separation between linear and non-linear network coding rates.

Index Coding With Side Information

A measure on graphs, the minrank, is identified, which exactly characterizes the minimum length of linear and certain types of non-linear index codes; for natural classes of side information graphs, including directed acyclic graphs, perfect graphs, odd holes, and odd anti-holes, minrank is the optimal length of arbitrary index codes.
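To make the minrank notion concrete, here is a brute-force sketch (illustrative, not from the paper) for the smallest odd hole, the 5-cycle C5 over GF(2): minimize the rank over all matrices that "fit" the graph, i.e. have ones on the diagonal and zeros outside the diagonal and the edge set.

```python
from itertools import product

# Side information graph: the undirected 5-cycle C5 (an odd hole),
# taken as bidirected. Illustrative example only.
N = 5
EDGES = sorted({(i, (i + 1) % N) for i in range(N)} |
               {((i + 1) % N, i) for i in range(N)})

def rank_gf2(matrix):
    """Rank of a 0/1 matrix over GF(2), rows packed as bitmasks."""
    rows = [sum(bit << j for j, bit in enumerate(row)) for row in matrix]
    rank = 0
    for col in range(N):
        pivot = next((r for r in range(rank, N) if rows[r] >> col & 1), None)
        if pivot is None:
            continue
        rows[rank], rows[pivot] = rows[pivot], rows[rank]
        for r in range(N):
            if r != rank and rows[r] >> col & 1:
                rows[r] ^= rows[rank]
        rank += 1
    return rank

def minrank_gf2():
    """Minimum GF(2) rank over all matrices fitting the graph:
    M[i][i] = 1, and M[i][j] = 0 for i != j unless (i, j) is an edge."""
    best = N
    for bits in product((0, 1), repeat=len(EDGES)):
        m = [[1 if i == j else 0 for j in range(N)] for i in range(N)]
        for (i, j), b in zip(EDGES, bits):
            m[i][j] = b
        best = min(best, rank_gf2(m))
    return best

print(minrank_gf2())
```

For C5 the search returns 3, matching the known optimal index code length for this odd hole (the independence number 2 is a lower bound witness that linear codes cannot do better than 3 here).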

Inequalities for Shannon Entropy and Kolmogorov Complexity

An inequality for Kolmogorov complexities that implies Ingleton's inequality for ranks is presented, and another application is a new simple proof of one of Gács and Körner's results on common information.
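For context (a standard statement, not restated in this entry), Ingleton's inequality for four subspaces $V_1,\dots,V_4$ of a finite-dimensional vector space, with $r$ denoting dimension and $V_i + V_j$ the sum of subspaces, reads:

```latex
r(V_1) + r(V_2) + r(V_1{+}V_2{+}V_3) + r(V_1{+}V_2{+}V_4) + r(V_3{+}V_4)
\;\le\;
r(V_1{+}V_2) + r(V_1{+}V_3) + r(V_1{+}V_4) + r(V_2{+}V_3) + r(V_2{+}V_4)
```

It holds for ranks of subspaces over every field, but not for general entropies, which is why it serves as the benchmark linear rank inequality throughout this literature.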

Graph-Theoretical Constructions for Graph Entropy and Network Coding Based Communications

An undirected graph on all possible configurations of the digraph, referred to as the guessing graph, is introduced, which encapsulates the dependence among configurations; it is proved that the guessing number of a digraph is equal to the logarithm of the independence number of its guessing graph.

A First Course in Information Theory

This book provides the first comprehensive treatment of the theory of I-Measure, network coding theory, Shannon and non-Shannon type information inequalities, and a relation between entropy and group theory.