p4pktgen: Automated Test Case Generation for P4 Programs

@inproceedings{Ntzli2018p4pktgenAT,
  title={p4pktgen: Automated Test Case Generation for P4 Programs},
  author={Andres N{\"o}tzli and Jehandad Khan and Andy Fingerhut and Clark W. Barrett and Peter M. Athanas},
  booktitle={Proceedings of the Symposium on SDN Research},
  year={2018}
}
With the rise of programmable network switches, network infrastructure is becoming more flexible and more capable than ever before. Programming languages such as P4 lower the barrier for changing the inner workings of network switches and offer a uniform experience across different devices. However, this programmability also brings the risk of introducing hard-to-catch bugs at a level that was previously covered by well-tested devices with a fixed set of capabilities. Subtle discrepancies… 
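The core idea behind p4pktgen is to turn each control-flow path through a P4 program into a constraint and ask a solver for a concrete packet that exercises it (the tool itself uses the Z3 SMT solver). The sketch below is purely illustrative, not p4pktgen's code: a brute-force search over a small header field stands in for the solver, and the toy "parser" is a made-up example.

```python
# Illustrative sketch of path-based test generation (not p4pktgen's code).
# p4pktgen hands each path condition to an SMT solver (Z3); here a brute-force
# search over one small header field stands in for the solver.

def solve(path_constraint, bits=16):
    """Return a field value satisfying the path constraint, or None."""
    return next((v for v in range(2 ** bits) if path_constraint(v)), None)

# Toy parser: accept the packet only when ethertype == 0x0800 (IPv4).
paths = {
    "accept": lambda ethertype: ethertype == 0x0800,
    "reject": lambda ethertype: ethertype != 0x0800,
}

# One concrete test packet per control-flow path.
tests = {name: solve(cond) for name, cond in paths.items()}
print(tests)  # one satisfying ethertype value per path
```

Each solved value corresponds to one test case: a packet whose header bits drive execution down exactly that path, together with the expected parser outcome.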


p4v: practical verification for programmable data planes

This paper presents the design and implementation of p4v, a practical tool for verifying data planes described in the P4 programming language; p4v adds several key innovations, including a novel mechanism for incorporating assumptions about the control plane and domain-specific optimizations needed to scale to large programs.

Towards Runtime Verification of Programmable Switches

The key insight is that runtime verification with machine-learning-guided fuzzing can detect bugs, even those missed at compile time, with the aim of enabling more automated, real-time localization of bugs in P4 programs using software-testing techniques such as Tarantula.
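Tarantula, mentioned above, is a spectrum-based fault-localization technique: it ranks program statements by a suspiciousness score computed from how often each statement is covered by failing versus passing test runs. A minimal sketch of the standard Tarantula formula (the coverage numbers below are made up for illustration; this is not the paper's implementation):

```python
# Tarantula suspiciousness: statements covered mostly by failing tests rank
# highest. Coverage data below is invented for illustration.

def tarantula(failed_cov, passed_cov, total_failed, total_passed):
    """Suspiciousness of one statement from its pass/fail coverage counts."""
    fail_ratio = failed_cov / total_failed if total_failed else 0.0
    pass_ratio = passed_cov / total_passed if total_passed else 0.0
    denom = fail_ratio + pass_ratio
    return fail_ratio / denom if denom else 0.0

# statement -> (times covered by failing tests, times covered by passing tests)
coverage = {"s1": (3, 0), "s2": (3, 4), "s3": (0, 4)}
scores = {s: tarantula(f, p, total_failed=3, total_passed=4)
          for s, (f, p) in coverage.items()}
# s1 is covered only by failing runs (score 1.0); s3 only by passing (0.0)
```

A statement exercised only by failing runs scores 1.0 and is reported first, which is what makes the technique useful for narrowing down where a runtime bug lives.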

Verification of P4 programs in feasible time using assertions

This paper implements a prototype of a P4 verification approach and uses it to show the gains provided by three speed-up techniques (use of constraints, program slicing, and parallelization), and experiments with different compiler optimization choices.

Fix with P6: Verifying Programmable Switches at Runtime

P6 uses machine-learning-guided fuzzing to test a P4 switch non-intrusively, i.e., without modifying the P4 program, in order to detect runtime bugs; it significantly outperforms bug-detection baselines while generating fewer packets, and patches bugs in large P4 programs such as switch.p4 without triggering any regressions.

Debugging P4 programs with vera

Vera is a tool that verifies P4 programs using symbolic execution; it automatically uncovers a number of common bugs, including parsing/deparsing errors, invalid memory accesses, loops, and tunneling errors, and also verifies user-specified properties written in a novel language the authors call NetCTL.

FP4: Line-rate Greybox Fuzz Testing for P4 Switches

This paper presents the design and implementation of FP4, a fuzz-testing framework for P4 switches that achieves high expressiveness, coverage, and scalability; FP4 can validate both safety and stateful properties, improves efficiency over existing random packet-generation baselines, and reaches 100% coverage on a wide range of examples.

P4DB: On-the-Fly Debugging for Programmable Data Planes

This paper proposes P4DB, a general debugging platform that empowers operators to debug P4 programs at three levels of visibility with rich primitives, while introducing almost no performance overhead on Tofino, the hardware P4 target.

Finding hard-to-find data plane bugs with a PTA

This paper provides a taxonomy of data plane bugs, and uses the taxonomy to derive a Portable Test Architecture (PTA) which offers essential abstractions for testing on a variety of network hardware devices.

SwitchV: automated SDN switch validation with P4 models

This work describes experience with a new approach to developing switch software stacks that extend fixed-function ASICs with SDN capabilities, focusing on SwitchV, a system for automated end-to-end switch validation using fuzzing and symbolic analysis that evolves effortlessly with the switch specification.

P4 Switch Code Data Flow Analysis: Towards Stronger Verification of Forwarding Plane Software

This paper focuses on the P4 language and presents the design and implementation of p4-data-flow, a practical tool that uses data-flow analysis for verification of switch programs; experiments show that data-flow analysis can reveal defects from classes not yet covered by existing work, without demanding further programmer effort.
...

References

Showing 1-10 of 25 references

P4K: A Formal Semantics of P4 and Applications

This paper provides an executable formal semantics of the P4 language in the K framework, along with an interpreter and various analysis tools, including a symbolic model checker and a deductive program verifier for P4.

KLEE: Unassisted and Automatic Generation of High-Coverage Tests for Complex Systems Programs

This paper presents KLEE, a new symbolic execution tool capable of automatically generating tests that achieve high coverage on a diverse set of complex, environmentally intensive programs, significantly beating the coverage of the developers' own hand-written test suites.

A NICE Way to Test OpenFlow Applications

This paper proposes a novel way to augment model checking with symbolic execution of event handlers (to identify representative packets that exercise code paths on the controller), presents a simplified OpenFlow switch model (to reduce the state space), and describes effective strategies for generating event interleavings likely to uncover bugs.

Automatically verifying reachability and well-formedness in P4 Networks

This work provides an operational semantics for P4 constructs and uses it to compile P4 to Datalog, so that the verification model can be updated automatically as the network changes; it demonstrates this vision by compiling the mTag example from the P4 specification and by automatically detecting forwarding bugs.

DART: directed automated random testing

DART is a new tool for automatically testing software that combines three main techniques: automated extraction of the interface of a program with its external environment using static source-code parsing; random testing through that interface; and dynamic analysis of how the program behaves under random testing, with automatic generation of new test inputs to systematically direct execution along alternative program paths.
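The directed part of DART-style (concolic) testing can be sketched in a few lines: run the program on a concrete input, record the branch condition observed, then negate that condition and solve for an input that takes the other path. The sketch below is illustrative only; the toy program is invented, and a brute-force search stands in for DART's constraint solver.

```python
# Minimal concolic-style loop in the spirit of DART (illustrative sketch).
# A brute-force search over small integers stands in for the constraint solver.

def program(x):
    """Toy program under test: one branch, hence two paths."""
    return "big" if x > 100 else "small"

def solve(constraints):
    """Find an input satisfying all constraints (solver stand-in)."""
    return next((v for v in range(256) if all(c(v) for c in constraints)), None)

# Run 1: an arbitrary concrete input takes the "small" path (NOT x > 100).
x0 = 0
path0 = program(x0)
# Run 2: negate the recorded branch condition to force the "big" path.
x1 = solve([lambda v: v > 100])
paths_covered = {path0, program(x1)}
```

Iterating this negate-and-solve step over every recorded branch is what lets concolic tools enumerate program paths systematically instead of hoping random inputs stumble onto them.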

Automatic Test Packet Generation

This paper presents an automated and systematic approach for testing and debugging networks called "Automatic Test Packet Generation" (ATPG), which reads router configurations and generates a device-independent model; the authors find that a small number of test packets suffices to test all rules in the studied networks.

CUTE: a concolic unit testing engine for C

This paper develops a method to represent and track constraints that capture the behavior of a symbolic execution of a unit with memory graphs as inputs, and proposes an efficient constraint solver to facilitate incremental generation of such test inputs.

Software dataplane verification

The main insight is that packet-processing software is a good candidate for domain-specific verification because, for example, it typically consists of distinct pieces of code that share limited mutable state; these and other properties can be leveraged to sidestep fundamental verification challenges.

A SOFT way for openflow switch interoperability testing

This paper presents SOFT, an approach for testing the interoperability of OpenFlow switches that automatically identifies testing inputs that cause different OpenFlow agent implementations to behave inconsistently; it identified several inconsistencies between the publicly available Reference OpenFlow switch and Open vSwitch implementations.

Finding and understanding bugs in C compilers

The authors created Csmith, a randomized test-case generation tool, spent three years using it to find compiler bugs, and present a collection of qualitative and quantitative results about the bugs it found.