
Vera++ - Tcl API

The scripts (rules and transformations) are written in Tcl and are executed by an embedded interpreter that has access to the relevant state of the program. A set of commands is provided to enable easy read-only access to the information gathered by parsing the given source files.

The following Tcl commands are provided:


To process all lines from all source files, use the following code pattern:

foreach fileName [getSourceFileNames] {
    foreach line [getAllLines $fileName] {
        # ...
    }
}
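As a sketch of this pattern, the following hypothetical rule reports lines that end in whitespace; it relies only on the getSourceFileNames, getAllLines and report commands used elsewhere in this document, and the message text is purely illustrative:

```tcl
foreach fileName [getSourceFileNames] {
    set lineNumber 1
    foreach line [getAllLines $fileName] {
        # report emits a diagnostic for the given file and line
        if {[regexp {[ \t]+$} $line]} {
            report $fileName $lineNumber "trailing whitespace at end of line"
        }
        incr lineNumber
    }
}
```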

To process all tokens from all source files, use:

foreach fileName [getSourceFileNames] {
    foreach token [getTokens $fileName 1 0 -1 -1 {}] {
        set tokenValue [lindex $token 0]
        set lineNumber [lindex $token 1]
        set columnNumber [lindex $token 2]
        set tokenType [lindex $token 3]
        # ...
    }
}
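Each token is an ordinary Tcl list, so the four fields can also be unpacked in a single step with the standard lassign command (available since Tcl 8.5):

```tcl
foreach token [getTokens $fileName 1 0 -1 -1 {}] {
    # value, line, column and type assigned in one statement
    lassign $token tokenValue lineNumber columnNumber tokenType
    # ...
}
```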

To process only curly braces from the given source file, use:

foreach token [getTokens $fileName 1 0 -1 -1 {leftbrace rightbrace}] {
    # ...
}
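Building on this filter, here is a sketch of a hypothetical rule that checks whether the curly braces in each file are balanced; a depth counter is incremented on each leftbrace token and decremented on each rightbrace token:

```tcl
foreach fileName [getSourceFileNames] {
    set depth 0
    foreach token [getTokens $fileName 1 0 -1 -1 {leftbrace rightbrace}] {
        if {[lindex $token 3] eq "leftbrace"} {
            incr depth
        } else {
            incr depth -1
        }
    }
    # a non-zero depth at end of file means the braces do not pair up
    if {$depth != 0} {
        report $fileName 1 "unbalanced curly braces"
    }
}
```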

The complete rule script for verifying that lines are no longer than some limit (the limit can be provided as a parameter, but the default value is defined by the script itself):

# Line cannot be too long

set maxLength [getParameter "max-line-length" 100]

foreach f [getSourceFileNames] {
    set lineNumber 1
    foreach line [getAllLines $f] {
        if {[string length $line] > $maxLength} {
            report $f $lineNumber "line is longer than ${maxLength} characters"
        }
        incr lineNumber
    }
}

The above script is actually the implementation of rule L004.

Notes about line splicing

As required by the C++ ISO standard, line splicing (joining lines that end with a backslash) is performed before tokenizing. This means that the lists of tokens might not map exactly onto the lists of physical lines.

Due to the internal mechanisms of the parser, line splicing freezes the line counter and lets the column counter continue until the last line of the spliced block. As a result, some non-empty physical lines may appear to contain no tokens, and some tokens may have column numbers that exceed the length of their physical source line.
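For example, consider a macro definition spliced across two physical lines with a backslash:

```cpp
#define MAX(a, b) \
    ((a) > (b) ? (a) : (b))
```

Following the behavior described above, all tokens of this macro are reported with the line number of the first physical line, with column numbers that keep growing past the end of that line; the second physical line, although non-empty, appears to contain no tokens.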

Recognized token types

The following token types are recognized by the parser and can be used for filter selection in the getTokens command (some of these token types are related to compiler extensions):


There is a predefined rule named DUMP that prints to the screen all tokens with their types and positions. This rule can be helpful as a guideline for creating custom filtering criteria:

$ vera++ -rule DUMP myfile.cpp