# Characteristic-dependent linear rank inequalities via complementary vector spaces

```bibtex
@article{Pea2019CharacteristicdependentLR,
  title   = {Characteristic-dependent linear rank inequalities via complementary vector spaces},
  author  = {Victor Pe{\~n}a and Humberto Sarria},
  journal = {Journal of Information and Optimization Sciences},
  year    = {2019},
  volume  = {42},
  pages   = {345--369}
}
```

Abstract. A characteristic-dependent linear rank inequality is a linear inequality that holds for ranks of subspaces of a vector space over a finite field of a given characteristic, but does not in general hold over fields of other characteristics. In this paper, we produce new characteristic-dependent linear rank inequalities by an alternative to Dougherty's usual inverse-function method [9]. We take up some ideas of Blasiak [4], applied to certain complementary vector spaces, in order…
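For contrast with the characteristic-dependent inequalities studied in the paper, the best-known non-Shannon linear rank inequality is the Ingleton inequality, which holds for subspaces $A, B, C, D$ of a vector space over *every* field (this standard statement is included here as background; it is not quoted from the paper):

```latex
\dim(A{+}B) + \dim(A{+}C) + \dim(A{+}D) + \dim(B{+}C) + \dim(B{+}D)
\;\ge\;
\dim A + \dim B + \dim(C{+}D) + \dim(A{+}B{+}C) + \dim(A{+}B{+}D)
```

A characteristic-dependent inequality, by contrast, is valid only when the underlying field has a prescribed characteristic.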

## 3 Citations

### How to Find New Characteristic-Dependent Linear Rank Inequalities using Binary Matrices as a Guide

- Mathematics, Computer Science · ArXiv
- 2019

This paper gives a method for producing characteristic-dependent linear rank inequalities from binary matrices with suitable ranks over different fields; some of the resulting inequalities imply those presented in [1,9].
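The phenomenon this citation relies on can be seen concretely: an integer matrix can have different ranks over fields of different characteristics. The sketch below is an illustration of that fact only (the matrix and helper function are our own examples, not taken from the cited paper), using Gaussian elimination over a prime field GF(p):

```python
def rank_mod_p(matrix, p):
    """Rank of an integer matrix over GF(p), p prime, via Gaussian elimination."""
    m = [[x % p for x in row] for row in matrix]
    rank, cols = 0, len(m[0]) if m else 0
    for col in range(cols):
        # Find a pivot row with a nonzero entry in this column.
        pivot = next((r for r in range(rank, len(m)) if m[r][col] != 0), None)
        if pivot is None:
            continue
        m[rank], m[pivot] = m[pivot], m[rank]
        inv = pow(m[rank][col], p - 2, p)  # multiplicative inverse in GF(p)
        m[rank] = [(x * inv) % p for x in m[rank]]
        # Eliminate the column from all other rows.
        for r in range(len(m)):
            if r != rank and m[r][col]:
                f = m[r][col]
                m[r] = [(a - f * b) % p for a, b in zip(m[r], m[rank])]
        rank += 1
    return rank

# The complement of the 3x3 identity has determinant 2, so it has full
# rank over GF(3) but drops rank over GF(2) (row1 + row2 = row3 mod 2).
M = [[0, 1, 1],
     [1, 0, 1],
     [1, 1, 0]]
print(rank_mod_p(M, 2))  # 2
print(rank_mod_p(M, 3))  # 3
```

Matrices like `M`, whose rank jumps between characteristics, are exactly the kind of "guide" objects the cited method starts from.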

### How to Find New Characteristic-Dependent Linear Rank Inequalities using Secret Sharing

- Computer Science, Mathematics · ArXiv
- 2021

Using ideas from secret sharing, a theorem is proved that produces characteristic-dependent linear rank inequalities, which can be used to obtain lower bounds on information ratios in linear secret sharing.

## References

Showing 1–10 of 23 references

### Linear rank inequalities on five or more variables

- Mathematics · ArXiv
- 2009

It is proved that there are essentially new inequalities at each number of variables beyond four (a result also proved recently by Kinser), and a list of 24 inequalities is given which, together with the Shannon and Ingleton inequalities, generates all linear rank inequalities on five variables.

### Computations of linear rank inequalities on six variables

- Mathematics · 2014 IEEE International Symposium on Information Theory
- 2014

Partial results of computations on six-variable linear rank inequalities are presented, showing that the number of sharp inequalities (those which cannot be generated from other inequalities) is more than one billion (counting variable-permuted forms).

### Characteristic-Dependent Linear Rank Inequalities With Applications to Network Coding

- Mathematics · IEEE Transactions on Information Theory
- 2015

Two characteristic-dependent linear rank inequalities are given for eight variables and applications of these inequalities to the computation of capacity upper bounds in network coding are demonstrated.

### Networks, Matroids, and Non-Shannon Information Inequalities

- Computer Science · IEEE Transactions on Information Theory
- 2007

The Vamos network is constructed, and it is proved that Shannon-type information inequalities are insufficient even for computing network coding capacities of multiple-unicast networks.

### Lexicographic Products and the Power of Non-linear Network Coding

- Computer Science · 2011 IEEE 52nd Annual Symposium on Foundations of Computer Science
- 2011

The technique uses linear programs to establish separations between combinatorial and coding-theoretic parameters, and applies hypergraph lexicographic products to amplify these separations, showing a polynomial separation between linear and non-linear network coding rates.

### Index Coding With Side Information

- Computer Science, Mathematics · IEEE Transactions on Information Theory
- 2011

A measure on graphs, the minrank, is identified which exactly characterizes the minimum length of linear and certain types of non-linear index codes; for natural classes of side-information graphs, including directed acyclic graphs, perfect graphs, odd holes, and odd anti-holes, minrank is the optimal length of arbitrary index codes.

### Inequalities for Shannon Entropy and Kolmogorov Complexity

- Computer Science · J. Comput. Syst. Sci.
- 2000

An inequality for Kolmogorov complexities that implies Ingleton's inequality for ranks is presented; another application is a new, simple proof of one of the Gács–Körner results on common information.

### Graph-Theoretical Constructions for Graph Entropy and Network Coding Based Communications

- Computer Science · IEEE Transactions on Information Theory
- 2011

An undirected graph on all possible configurations of the digraph, referred to as the guessing graph, is introduced; it encapsulates the essence of dependence among configurations, and it is proved that the guessing number of a digraph equals the logarithm of the independence number of its guessing graph.

### A First Course in Information Theory

- Computer Science
- 2002

This book provides the first comprehensive treatment of the theory of the I-Measure, network coding theory, Shannon-type and non-Shannon-type information inequalities, and a relation between entropy and group theory.