blackduck-c-cpp

Name: blackduck-c-cpp
Version: 2.0.3
Summary: Scanning for C/C++ projects using Black Duck and Coverity tools
Upload time: 2024-03-27 13:25:42
Requires Python: >=3.7
            # blackduck-c-cpp

This code runs a C/C++ build wrapped by Coverity, capturing the source and binary files involved, and then uses a
variety of tools and methodologies to deliver BDIO and signatures to Black Duck.

## Overview

C and C++ projects don't have a standard package manager or method for managing dependencies, so it is more
difficult to create an accurate BOM for these projects. This leaves Software Composition Analysis tools with fewer
options than with other languages. The primary option available in this context is file system signatures. Black
Duck has a variety of old and new signatures which can be used to build a BOM. To use signatures effectively,
the tool first needs to know which files to take signatures from. In the past, SCA tools have pointed a scanner at a
build directory, taking signatures from a subset of files within the directory sub-tree. The problem with this approach
is that many environment variables, parameters, and switches provided to the build tools reference files outside of the
build directory that are included as part of the build. Further, there are commonly files within the build directory
which are not part of the build and can lead to false positives in the BOM.

The Black Duck C/C++ tool avoids the pitfalls described above by using a feature of Coverity called Build Capture.
Coverity Build Capture wraps your build, observing all invocations of compilers and linkers and storing the paths of
all compiled source code, included header files, and linked object files. These files are then matched using a variety
of methods described in the section of this document called "The BOM".

## Supported Platforms

Debian, Redhat, Ubuntu, openSUSE, Fedora, CentOS, macOS, and Windows are supported.

The signature scan and binary scan will be completed on all supported platforms as permitted by your Black Duck license.
Any Signature Scanner CLI parameters can be passed to the blackduck-c-cpp tool through the `additional_sig_scan_args`
parameter.

On Unix-like operating systems, a package manager scan will also be run. Since Windows doesn't have a supported package
manager, blackduck-c-cpp scans run on Windows won't include the package manager scan and won't produce a BDIO file.
Here, "package manager scan" refers to the use of O/S package managers such as yum, apt, etc.

## Installation

The minimum Black Duck version required is 2020.10.0.

To install from PyPI:

```
pip install blackduck-c-cpp
```

To install a specific version:

```
pip install blackduck-c-cpp==2.0.0
```

## Configuration

Prior to running your build, run any build-specific configuration needed. The blackduck-c-cpp tool can then be
configured using a .yaml file or with command line arguments.

Here is a sample, fully functional .yaml configuration: ardour-config.yaml

```
build_cmd: ../waf build
build_dir: /Users/theUser/myProject/ardour/build/
skip_build: False
verbose: True
project_name: ardour_mac
project_version: may-4-2021
bd_url: https://...
api_token: <token>
insecure: False
```

### API Token

Black Duck API tokens are generated on a per-user basis. To scan to a new project and view the results, the user who
generates the API token for blackduck-c-cpp must at minimum have the **Global Code Scanner**, **Global Project
Viewer**, and **Project Creator** roles assigned. To scan to an existing project and view the results, the user must at
minimum have the project assigned to them and have the **Project Code Scanner** role assigned. See Administration >
Managing Black Duck user accounts > Understanding roles in the Black Duck Help documentation for more details on user
roles. The Black Duck Help documentation is accessible through the Black Duck UI.

To generate an API token:

1. Go to the Black Duck UI and log in.
2. From the user menu located on the top navigation bar, select My Access Tokens.
3. Click Create New Token. The Create New Token dialog box appears.
4. Enter a name and description (optional), and select the scope for this token (to use with blackduck-c-cpp, it must
   be **read and write access**).
5. Click Create. The Access Token Name dialog box appears with the access token.
6. Copy the access token shown in the dialog box. This token can only be viewed here at this time. Once you close the
   dialog box, you cannot view the value of this token.
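
Rather than storing the token in a yaml file or passing it on the command line, you can supply it through the
`BD_HUB_TOKEN` environment variable (see the `--api_token` description under Details below). For example:

```
export BD_HUB_TOKEN=<token>
blackduck-c-cpp --config /Users/theUser/myProject/ardour-config.yaml
```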

### Bazel

Bazel is supported by Coverity versions 2022.3.0 and later, and by blackduck-c-cpp versions 1.0.13 and later.

To enable, use the `--bazel` switch (or set `bazel: True` in your yaml configuration file), but additional Coverity
setup is required as described below.

Bazel builds can be captured on the x86_64 versions of Windows, Linux, and macOS that are supported by Coverity
Analysis.

Compilers supported by Coverity analysis are supported here as well, but all compilers must be accessible and runnable
on the host system: remote cross-platform builds are not supported.

#### Bazel Setup

##### Modify project files

###### Workspace file

Like other Bazel integrations, the Coverity integration has an archive of rules to be used by the build.

blackduck-c-cpp will attempt to automatically update this file as required if it hasn't already been modified by the
user. If the automatic update fails, the failure will be logged and the user will need to complete the following steps
manually.

The WORKSPACE (or WORKSPACE.bazel) file defines the root of the Bazel project, and it needs to be modified to reference
the Coverity integration. If you are supplying your own Coverity installation, the Coverity integration can be found in
the Coverity Analysis installation at

```
<Coverity Analysis installation path>/bazel/rules_coverity.tar.gz
```

If you are using the mini package provided by blackduck-c-cpp, then by default the Coverity integration can be found at

```
<User home>/.synopsys/blackduck-c-cpp/cov-build-capture/bazel/rules_coverity.tar.gz
```

You can remove it from the installation and host it anywhere convenient.

Assuming the integration archive is available on a network share at `/mnt/network-share/rules_coverity.tar.gz`, append
the following snippet to your WORKSPACE file:

```
load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")
http_archive(
    name="rules_coverity",
    urls=["file:///mnt/network-share/rules_coverity.tar.gz"],
)

load("@rules_coverity//coverity:repositories.bzl", "rules_coverity_toolchains")
rules_coverity_toolchains()
```

You can use different URLs, depending on whether the integration archive is available locally, on a file share, or
through HTTP. The only part of the kit that is necessary for this is the integration archive, so it can be placed
wherever needed, independently of the rest of the kit. Bazel can fetch from "file://", "http://" and "https://" URLs.
The "urls" field is a list - multiple URLs can be specified, and fetching the integration from them will be attempted in
order.
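
For instance, since the `urls` field is a list, you could list an internal HTTPS mirror first with the network share as
a fallback (the mirror URL here is hypothetical):

```
http_archive(
    name="rules_coverity",
    urls=[
        # Attempted in order; the first reachable URL wins.
        "https://mirror.example.com/bazel/rules_coverity.tar.gz",
        "file:///mnt/network-share/rules_coverity.tar.gz",
    ],
)
```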

###### Build file

Unlike the WORKSPACE file, blackduck-c-cpp can't update the BUILD file automatically. This must be completed by the
user.

Bazel uses the BUILD (or BUILD.bazel) file to do the following:

- Mark a package boundary
- Declare what targets can be built in that package
- Specify how to build those targets

The Coverity-Bazel integration needs a new target added that depends on existing targets to generate a "build
description" of all the build commands that would have been executed in the building of those targets. If you had,
for example, a build with two separate targets that you wanted to capture, the BUILD file would start out looking
something like this:

```
load("@rules_cc//cc:defs.bzl", "cc_binary")​
cc_binary(name="foo", srcs=["foo.cc"])
cc_binary(name="bar", srcs=["bar.cc"])
```

To capture the files (including link files) used in building the targets :foo and :bar (foo.cc and bar.cc,
respectively), you would modify the BUILD file to be something like this:

```
load("@rules_cc//cc:defs.bzl", "cc_binary")
cc_binary(name="foo", srcs=["foo.cc"])
cc_binary(name="bar", srcs=["bar.cc"])
 
load("@rules_coverity//coverity:defs.bzl", "cov_enable_link", "cov_gen_script")
cov_enable_link(
    name = "enable_link",
    build_setting_default = True,
)
cov_gen_script(name="coverity-target", deps=[":foo", ":bar"], enable_link = ":enable_link",)
```

Here is an example using Google's open source abseil-cpp library (https://github.com/abseil/abseil-cpp):

Before:

```
package(default_visibility = ["//visibility:public"])

licenses(["notice"])  # Apache 2.0

# Expose license for external usage through bazel.
exports_files([
    "AUTHORS",
    "LICENSE",
])
```

After:

```
package(default_visibility = ["//visibility:public"])

licenses(["notice"])  # Apache 2.0

# Expose license for external usage through bazel.
exports_files([
    "AUTHORS",
    "LICENSE",
])

load("@rules_coverity//coverity:defs.bzl", "cov_enable_link", "cov_gen_script")
cov_enable_link(
    name = "enable_link",
    build_setting_default = True,
)
cov_gen_script(
    name="cov",
    deps = [
        "//absl/status:statusor",
        "//absl/status:status",
        "//absl/random:bit_gen_ref",
        "//absl/functional:bind_front",
        "//absl/flags:parse",
        "//absl/flags:usage",
        "//absl/flags:flag",
        "//absl/debugging:leak_check",
        "//absl/debugging:failure_signal_handler",
        "//absl/debugging:leak_check_disable",
        "//absl/container:node_hash_set",
        "//absl/container:hashtable_debug",
        "//absl/random:random",
        "//absl/random:seed_sequences",
        "//absl/random:seed_gen_exception",
        "//absl/random:distributions",
        "//absl/container:flat_hash_set",
        "//absl/types:any",
        "//absl/types:bad_any_cast",
        "//absl/container:btree",
        "//absl/types:compare",
        "//absl/cleanup:cleanup",
        "//absl/container:node_hash_map",
        "//absl/container:node_hash_policy",
        "//absl/flags:reflection",
        "//absl/container:flat_hash_map",
        "//absl/container:raw_hash_map",
        "//absl/container:raw_hash_set",
        "//absl/container:hashtablez_sampler",
        "//absl/container:hashtable_debug_hooks",
        "//absl/container:hash_policy_traits",
        "//absl/container:common",
        "//absl/container:hash_function_defaults",
        "//absl/strings:cord",
        "//absl/container:layout",
        "//absl/container:inlined_vector",
        "//absl/hash:hash",
        "//absl/types:variant",
        "//absl/types:bad_variant_access",
        "//absl/hash:city",
        "//absl/container:fixed_array",
        "//absl/container:compressed_tuple",
        "//absl/container:container_memory",
        "//absl/flags:marshalling",
        "//absl/strings:str_format",
        "//absl/numeric:representation",
        "//absl/functional:function_ref",
        "//absl/flags:config",
        "//absl/flags:commandlineflag",
        "//absl/types:optional",
        "//absl/types:bad_optional_access",
        "//absl/utility:utility",
        "//absl/synchronization:synchronization",
        "//absl/time:time",
        "//absl/debugging:symbolize",
        "//absl/strings:strings",
        "//absl/numeric:int128",
        "//absl/numeric:bits",
        "//absl/debugging:stacktrace",
        "//absl/types:span",
        "//absl/memory:memory",
        "//absl/algorithm:container",
        "//absl/meta:type_traits",
        "//absl/algorithm:algorithm",
    ],
    enable_link = ":enable_link",
)
```
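
With a capture target defined, you can point blackduck-c-cpp at the Bazel workspace. Below is a minimal yaml sketch
using the generic `coverity-target` from the earlier example; the build command and paths shown are assumptions about
your project, not something the tool mandates:

```
build_cmd: bazel build //:coverity-target
build_dir: /path/to/bazel/workspace
bazel: True
project_name: my_bazel_project
project_version: 1.0
bd_url: https://...
api_token: <token>
```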

###### Customization: compilation mnemonics

Which Bazel actions are treated as build commands is determined by the action's mnemonic. For now, the only
mnemonics treated as build commands by default are CppCompile, Javac, and Compile. These are the mnemonics
that the builtin cc_binary/cc_library rules, the builtin java_binary/java_library rules, and the standard
csharp_binary/csharp_library rules use for their compilation actions, respectively. If you have custom rules that
generate actions that should be treated as build commands, modify the BUILD file again, extending from this:

```
load("@rules_cc//cc:defs.bzl", "cc_binary")
cc_binary(name="foo", srcs=["foo.cc"])
cc_binary(name="bar", srcs=["bar.cc"])
 
load("@rules_coverity//coverity:defs.bzl", "cov_gen_script")
cov_gen_script(name="coverity-target", deps=[":foo", ":bar"])
```

to something like the following:

```
load("@rules_cc//cc:defs.bzl", "cc_binary")
cc_binary(name="foo", srcs=["foo.cc"])
cc_binary(name="bar", srcs=["bar.cc"])
 
load(
    "@rules_coverity//coverity:defs.bzl", 
    "cov_gen_script", 
    "cov_compile_mnemonics"
    )
cov_compile_mnemonics(
    name="extra_mnemonics", 
    build_setting_default=["FirstMnemonic", "SecondMnemonic"]
    )
cov_gen_script(
    name="coverity-target", 
    deps=[":foo", ":bar"], 
    extra_compile_mnemonics=":extra_mnemonics"
    )
```

### Details

usage: blackduck-c-cpp [-h] [-c CONFIG] [-bc build_cmd] -d BUILD_DIR [-Cov coverity_root] [-Cd cov_output_dir]
[-od output_dir] [-s [SKIP_BUILD]] [-v [verbose]] -proj PROJECT_NAME -vers PROJECT_VERSION [-Cl CODELOCATION_NAME]
-bd bd_url [-a api_token] [-as additional_sig_scan_args] [-i [insecure]] [-f [force]]
[-djs [DISABLE_BDIO_JSON_SPLITTER]] [-si SCAN_INTERVAL] [-jsl json_splitter_limit]
[-bsfl bdio_split_max_file_entries] [-bscn bdio_split_max_chunk_nodes] [-dg [debug]]
[-st [SKIP_TRANSITIVES]] [-sh [SKIP_INCLUDES]] [-sd [SKIP_DYNAMIC]] [-off [OFFLINE]] [-md modes]
[-uo [USE_OFFLINE_FILES]] [-sc scan_cli_dir] [-Cc cov_configure_args] [-ac additional_coverity_params]
[-es [EXPAND_SIG_FILES]] [-po PORT] [-ba [BAZEL]] [-pgn PROJECT_GROUP_NAME] [-scv set_coverity_mode]
[-fpc force_pull_coverity_vers]

arguments:

```
  -h, --help            show this help message and exit
  -c CONFIG, --config CONFIG
                        Configuration file path.
  -bc build_cmd, --build_cmd build_cmd
                        Command used to execute the build
  -d BUILD_DIR, --build_dir BUILD_DIR
                        Directory from which to run build
  -Cov coverity_root, --coverity_root coverity_root
                        Base directory for coverity. If not specified, blackduck-c-cpp downloads latest mini coverity package from GCP for authorized Black Duck customers
                        for Black Duck versions >= 2021.10. For downloading coverity package using GCP, you need to open connection toward *.googleapis.com:443. If you
                        don't have coverity package and your Black Duck version is < 2021.10, please contact sales team to get latest version of coverity package.
  -Cd cov_output_dir, --cov_output_dir cov_output_dir
                        Target directory for coverity output files. If not specified, defaults to user_home/.synopsys/blackduck-c-cpp/output/project_name
  -od output_dir, --output_dir output_dir
                        Target directory for blackduck-c-cpp output files. If not specified, defaults to user_home/.synopsys/blackduck-c-cpp/output/project_name.
                        output_dir should be outside of the build directory.
  -s [SKIP_BUILD], --skip_build [SKIP_BUILD]
                        Skip build and use previously generated build data. Make sure that your initial coverity wrapped build uses the --emit-link-units flag
  -v [verbose], --verbose [verbose]
                        Verbose mode selection
  -proj PROJECT_NAME, --project_name PROJECT_NAME
                        Black Duck project name
  -vers PROJECT_VERSION, --project_version PROJECT_VERSION
                        Black Duck project version
  -Cl CODELOCATION_NAME, --codelocation_name CODELOCATION_NAME
                        This controls the Black Duck's codelocation. The codelocation_name will overwrite any scans sent to the same codelocation_name, indicating that
                        this is a new scan of a previous code location. Use with care.
  -bd bd_url, --bd_url bd_url
                        Black Duck URL
  -a api_token, --api_token api_token
                        Black Duck API token.  Instead of specifying api_token value in command line or yaml file, use the BD_HUB_TOKEN environment variable to specify a Black Duck API token.
  -as additional_sig_scan_args, --additional_sig_scan_args additional_sig_scan_args
                        Any additional args to pass to the signature scanner. IndividualFileMatching is by default turned on. To pass multiple params, you can pass it like additional_sig_scan_args: '--snippet-matching --license-search'.
                        It accepts scan cli properties; Detect properties are not accepted here.
  -i [insecure], --insecure [insecure]
                        Disable SSL verification so self-signed Black Duck certs will be trusted
  -f [force], --force [force]
                        In case of GCP failure, force use of older version of Coverity (if present)
  -djs [DISABLE_BDIO_JSON_SPLITTER], --disable_bdio_json_splitter [DISABLE_BDIO_JSON_SPLITTER]
                        Disable the json splitter and always upload as a single scan. For using json/bdio splitter, dryrun is needed, so please run in offline mode first.
  -si SCAN_INTERVAL, --scan_interval SCAN_INTERVAL
                        Set the number of seconds to wait between scan uploads in case of multiple scans
  -jsl json_splitter_limit, --json_splitter_limit json_splitter_limit
                        Set the limit for a scan size in bytes. For using json/bdio splitter, dryrun is needed, so please run in offline mode first.
  -bsfl bdio_split_max_file_entries, --bdio_split_max_file_entries bdio_split_max_file_entries
                        Set the limit for maximum scan node entries per generated BDIO file
  -bscn bdio_split_max_chunk_nodes, --bdio_split_max_chunk_nodes bdio_split_max_chunk_nodes
                        Set the limit for maximum scan node entries per single bdio-entry file
  -dg [debug], --debug [debug]
                        Debug mode selection. Setting debug: True sends all the files we found to all matching types. By default, it will only send files not detected by
                        package manager to BDBA and Signature matching.
  -st [SKIP_TRANSITIVES], --skip_transitives [SKIP_TRANSITIVES]
                        Skipping all transitive dependencies
  -sh [SKIP_INCLUDES], --skip_includes [SKIP_INCLUDES]
                        Skipping all .h & .hpp files from all types of scan
  -sd [SKIP_DYNAMIC], --skip_dynamic [SKIP_DYNAMIC]
                        Skipping all dynamic (.so/.dll) files from all types of scan
  -off [OFFLINE], --offline [OFFLINE]
                        Store bdba and sig zip files, sig scan json, and raw_bdio.csv to disk if offline mode is true.
                        For scans over 5GB to use bdio/json splitter, please run in offline mode first. 
                        scan_cli_dir should be specified when run in offline mode to generate dryrun files. 
                        Once the dryrun files are generated, use_offline_files: True can be set to upload those to hub.
  -md modes, --modes modes
                        Comma separated list of modes to run - 'all'(default),'bdba','sig','pkg_mgr'
  -uo [USE_OFFLINE_FILES], --use_offline_files [USE_OFFLINE_FILES]
                        Use offline generated files for upload in online mode
  -sc scan_cli_dir, --scan_cli_dir scan_cli_dir
                        Scan cli directory. Ex: Providing scan_cli_dir as /home/../../Black_Duck_Scan_Installation/ instead of
                        /home/../../Black_Duck_Scan_Installation/scan.cli-2022.4.0/ works.
  -Cc cov_configure_args, --cov_configure_args cov_configure_args
                        Additional configuration commands to cov-configure for different compilers. Inputs taken are of format {"compiler":"compiler-type"}. There is a way
                        to use coverity template configuration to reduce number of template compiler configurations with wildcards: example: "--compiler *g++ --comptype
                        gcc" for adding x86_64-pc-linux-gnu-g++ can be passed as cov_configure_args: {"*g++":"gcc"}
  -ac additional_coverity_params, --additional_coverity_params additional_coverity_params
                        Any additional args to pass to coverity build command. example: "--record-with-source"
  -es [EXPAND_SIG_FILES], --expand_sig_files [EXPAND_SIG_FILES]
                        Use expand_sig_files for creating exploded directory instead of zip in sig scanner mode
  -po PORT, --port PORT
                        Set a custom Black Duck port
  -ba [BAZEL], --bazel [BAZEL]
                        Use if this is a bazel build - make sure you have followed the setup instructions for Coverity
  -pgn PROJECT_GROUP_NAME, --project_group_name PROJECT_GROUP_NAME
                        This is same as --detect.project.group.name in detect. Sets the 'Project Group' to assign the project to. Must match exactly to an existing project
                        group on Black Duck.
  -scv set_coverity_mode, --set_coverity_mode set_coverity_mode
                        Specify coverity mode as 'cov-build' to force a run with cov-build. cov-cli runs by default for coverity versions >= 2023.9 and cov-build for <
                        2023.9
  -fpc force_pull_coverity_vers, --force_pull_coverity_vers force_pull_coverity_vers
                        For linux platforms, force a pull of the 2022.9 or latest version of coverity if not auto-downloaded by blackduck-c-cpp correctly, by specifying 'old' or
                        'latest' respectively
```

#### blackduck-c-cpp 2.0.0

Here's what changed:

Version 2.0.0 uses cov-cli instead of cov-build by default. Coverity CLI uses cov-build under the hood;
it is a layer of automation on top of cov-build and other tools.
Instead of the user having to figure out the correct cov-configure options,
Coverity CLI guesses at the right options and runs the tools automatically.
It doesn't always work correctly, so there are options to fix things where needed.
You can also choose to run cov-build by setting the following option in the yaml file:
`set_coverity_mode: 'cov-build'`

On CentOS 7, the latest supported version of glibc is 2.17, while Coverity Build Capture 2022.12.0 and later require
glibc 2.18.
blackduck-c-cpp therefore tries to auto-download an older version of Coverity, 2022.9, on Linux platforms with glibc
2.17 or older.
If blackduck-c-cpp doesn't auto-download Coverity, you can force the download of an older version by specifying the
following parameter in the yaml file:
`force_pull_coverity_vers: 'old'`

#### Running

Once your blackduck-c-cpp tool is installed and configured as explained above, simply run the command:

`blackduck-c-cpp --config /Users/theUser/myProject/ardour-config.yaml`

To use snippet scanning, pass the snippet scanning parameters to the signature scanner using
--additional_sig_scan_args <snippet scanning parameter(s)>. Synopsys recommends using --snippet-matching. See Scanning
Components > Using the Signature Scanner > Running a component scan using the Signature Scanner command line in the
Black Duck Help Guide for more details.

To access the Black Duck server via a proxy, you must set a SCAN_CLI_OPTS environment variable prior to running the
scan. See Scanning Components > Using the Signature Scanner > Accessing the Black Duck server via a proxy in the Black
Duck Help Guide for details.
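
As a sketch, with placeholder host and port (see the Help Guide for the full set of proxy properties, including
authentication):

```
export SCAN_CLI_OPTS="-Dhttp.proxyHost=proxy.example.com -Dhttp.proxyPort=3128"
blackduck-c-cpp --config /Users/theUser/myProject/ardour-config.yaml
```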

#### The BOM

- Direct Dependencies - files which are linked into the built executable directly, or header files included by source
  code, as identified by Coverity Build Capture.
- Package Manager - the package manager of the Linux system is queried about the source of these files; if recognized,
  they are added to the BOM as "Direct Dependencies".
- Transitive Dependencies - files which are needed by the Direct Dependencies. LDD is used to list the dynamic
  dependencies of the Direct Dependencies; these files are then used to query the package manager, and the results are
  added to the BOM as "Transitive Dependencies".
- Binary Matches (BDBA) - any linked object files not identified by the package manager are sent to BDBA (Binary) for
  matching.
- Signature Matches - any linked object and header files not identified by the package manager, as well as all source
  code identified by Coverity Build Capture, are sent to the Knowledge Base for signature matching.

## CI Builds

This project's CI build is run through GitLab-CI Pipelines, within this repository. When changes are made on
the `master` (default) branch, the version will be appended with `b` and the pipeline number as metadata. For `release/`
branches, `rc` will be appended to the version with the pipeline number as metadata, and this will be published to
Artifactory. When changes are made to another branch (`dev/` or `bugfix/`, for example), `dev` will be appended to the
version with the pipeline number, and the commit hash will be appended as metadata.

For example:

* default branch: 1.0.0b3821+abcd1234
* release branch: 1.0.0rc4820+abcd1234
* dev branch: 1.0.0dev5293+abcd1234
* release: 1.0.0

Release jobs are also run through GitLab-CI Pipelines, when tagged as per below. The version will be uploaded to
Artifactory at the end of the pipeline.

# Releasing

To release this library, simply tag this repo with a tag of the format `vMM.mm.ff`, like `v1.0.1`. This version should
match the version (minus the `v`) in `setup.py`.

Be sure to increment the version in `setup.py` to the next fix version, or minor/major version as necessary. Do not add
any metadata or additional version information to the version, here.

The specific set of steps is:

- Ensure a full `python setup.py install` completes
- Commit changes
- Tag with `v##.##.##`, matching the version number in `setup.py`
- Push the change log changes, and tag, to GitLab
- Update the version number in `setup.py`
- Commit version change and push to GitLab

## FAQs

1. The BOM isn't capturing all expected components. What should I do?

Make sure you did a clean build. Run all clean commands and configure commands before running the blackduck-c-cpp tool
with the build command.
Also, if you are using custom compilers, you have to configure them as follows:
`cov_configure_args: {"gcc.cx.a.b-ac.mips64-linux":"gcc"}` where "gcc.cx.a.b-ac.mips64-linux" is the compiler and "gcc"
is the compiler type.
You can also set matchConfidenceThreshold to 0 in additional_sig_scan_args.
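
A yaml sketch of both settings together; the compiler name is just an example, and the exact flag syntax for
matchConfidenceThreshold is an assumption to verify against your Signature Scanner version:

```
cov_configure_args: {"gcc.cx.a.b-ac.mips64-linux": "gcc"}
additional_sig_scan_args: '--matchConfidenceThreshold 0'
```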

2. How do I run snippet scanning?

Add the below line to your yaml file:
`additional_sig_scan_args: '--snippet-matching'`
To run it from the command line, for example:
`blackduck-c-cpp -bc "make" -d "/apps/cpuminer-2.5.1/" -s False -v True -proj "cpuminer-cmd" -vers 1.0 -bd "https:<bd_url>" -a "<api_token>" -as "--snippet-matching --copyright-search" -i False`

3. Where can blackduck-c-cpp.log be found on the system?

By default, if --output_dir is not given, all output files will be in
`user_home/.synopsys/blackduck-c-cpp/output/project_name`.
Otherwise, all output files will be in output_dir.

4. How do I run blackduck-c-cpp?

Run with a config file where all arguments are set, or through the command line.
Example:
`blackduck-c-cpp -c /apps/../../cpuminer-config.yaml`
or, to run it from the command line:
`blackduck-c-cpp -bc "make" -d "/apps/cpuminer-2.5.1/" -s False -proj "cpuminer-cmd" -vers 1.0 -bd "https:<bd_url>" -a "<api_token>" -i False`

5. blackduck-c-cpp invokes BDBA. Do we need to be licensed for it? What happens if I don't have BDBA?

It reports `BDBA is not licensed for use with the current Black Duck instance -- will not be used` and moves on to the
next matching type.

6. Running blackduck-c-cpp throws import errors.

Check whether you installed blackduck-c-cpp from testpypi. If so, please uninstall it and install from PyPI so that
dependencies are automatically installed.
If you still see import errors, there may be an issue with multiple installations.
Try creating a virtual environment with Python >= 3.7. Uninstall blackduck-c-cpp outside the virtual environment
and install blackduck-c-cpp inside the virtual env. Otherwise, it may be looking at the wrong installation path
(visible in the stack trace).
In a Linux environment:

```
python3 -m venv venv
source venv/bin/activate
pip3 install blackduck-c-cpp
```

7. Where do I download the Coverity mini package?

If coverity_root is not specified, blackduck-c-cpp automatically downloads the latest mini Coverity package from GCP
for authorized Black Duck users on Black Duck versions >= 2021.10.
To download the Coverity package using GCP, you need to open a connection toward *.googleapis.com:443.
If you don't have a Coverity package and your Black Duck version is < 2021.10, please contact the sales team to get the
latest version of the Coverity package.

8. BDBA upload throws an error as follows:

```
    raise RemoteDisconnected("Remote end closed connection without"
http.client.RemoteDisconnected: Remote end closed connection without response
.......
requests.exceptions.ConnectionError: ('Connection aborted.', ConnectionResetError(104, 'Connection reset by peer'))
```

Check your requests-toolbelt library version with `pip show requests-toolbelt`. If you have a version older than 0.9.1,
install version 0.9.1 and try again.
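
For example, pinning the known-good version (assuming pip manages the environment you run blackduck-c-cpp from):

```
pip install requests-toolbelt==0.9.1
```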

9. Windows build - the blackduck-c-cpp process is stuck during a phase.

If the command prompt where it is stuck is still open, try giving keyboard input by pressing Enter or any other key. We
have noticed on Windows that programs sometimes get stuck when you click into the console and enter "selection" mode to
highlight/copy text from it.

10. Error:

```
headers.pop('Accept')
KeyError: 'Accept'
```

Run `pip show blackduck`. If you have a version < 1.0.4, install version 1.0.4 and try again.
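
For example, under the same assumption about the active pip environment:

```
pip install blackduck==1.0.4
```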

11. Windows error - `MemoryError`

Make sure you have the correct installation of Python (64-bit vs 32-bit) for your operating system.

12. Spaces in the path to the Coverity analysis

`/apps/.../cov\ 2021\ <vers>/bin/cov-build`
Coverity needs to be located in a directory whose path doesn't contain a space.

13. The signature scan is performed on a zip, and adding other sig scan arguments doesn't work. What should I do?

Set `expand_sig_files: True`

14. How do I uninstall blackduck-c-cpp?

`pip uninstall blackduck-c-cpp`

15. I already have a Coverity build for my project. Can I use the tool?

Yes, you can set --cov_output_dir to the path where your Coverity output files reside (build-log.txt and the emit
directory), then set `skip_build: True`.
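
A minimal yaml sketch of that reuse; the path and project names are placeholders:

```
cov_output_dir: /path/to/existing/coverity/output
skip_build: True
project_name: my_project
project_version: 1.0
bd_url: https://...
api_token: <token>
```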

16. How do I see more logging information for troubleshooting?

You can check the blackduck-c-cpp.log file in output_dir, or set verbose: True to see if it reveals any issues in
stdout.

17. I have custom compilers. What should I do?

If you are using custom compilers, you have to configure them as follows:
`cov_configure_args: {"gcc.cx.a.b-ac.mips64-linux":"gcc"}` where "gcc.cx.a.b-ac.mips64-linux" is the compiler and "gcc"
is the compiler type.

18. What is debug mode?

Setting `debug: True` sends all the files we found to all matching types. By default, only files not detected by the
package manager are sent to BDBA and Signature matching.

19. How do I run a specific matching type?

You can set modes to a comma-separated subset of sig, bdba, and pkg_mgr in the config file (for example,
`modes: sig,bdba`) to run specific ones.

20. I have already run blackduck-c-cpp once, in offline mode. Now I want to run in online mode. Do I need to do the
    full build again?

No, you can set `use_offline_files: True` and `skip_build: True` to use the already stored files and just upload them
to Black Duck.
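
In yaml, keeping the rest of the original run's configuration unchanged:

```
use_offline_files: True
skip_build: True
```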

21. I have already run blackduck-c-cpp once. I got a few errors after the build finished, which are now fixed. I want
    to run again. Do I need to do the full build again?

No, you can set `skip_build: True` to skip the build process.

22. How do I exclude a full directory in the signature scan method?
    Signature scan files are all placed under the ..\sig_scan\sig_files\ directory.
    Example: C:\Users\kakarlas.SYNOPSYS\.synopsys\blackduck-c-cpp\output\godot-windows-jul11-2\sig_scan\sig_files\
    It has the below paths in the sig_files folder:
    C:\Users\kakarlas.SYNOPSYS\.synopsys\blackduck-c-cpp\output\godot-windows-jul11-2\sig_scan\sig_files\Users\
    C:\Users\kakarlas.SYNOPSYS\.synopsys\blackduck-c-cpp\output\godot-windows-jul11-2\sig_scan\sig_files\Program Files\
    C:\Users\kakarlas.SYNOPSYS\.synopsys\blackduck-c-cpp\output\godot-windows-jul11-2\sig_scan\sig_files\godot\
    To exclude the Users and Program Files directories, pass these lines in excludes.txt:
    /Users/
    /Program\ Files/
    Then, in the yaml file, add the below line:
    additional_sig_scan_args: '--snippet-matching --exclude-from C:\Users\kakarlas.SYNOPSYS\Desktop\excludes.txt'

23. Should it be additional_sig_scan_args: '--individualFileMatching=BINARY' when I want to use the "Binary" value for
    individualFileMatching?
    individualFileMatching is turned on by default in the blackduck-c-cpp tool and can't be turned off.

24. The bdio splitter fails to split one part when uploading to the hub. How do I fix it?
    The splitter operates on node counts, not size, so when a dataset contains large archives, it performs suboptimally.
    There are parameters to tweak it: bdio_split_max_file_entries and bdio_split_max_chunk_nodes.
    By default, bdio_split_max_file_entries = 100000 and bdio_split_max_chunk_nodes = 3000.
    If the default values don't work, try reducing bdio_split_max_file_entries to about 80000 and see if the file size drops closer to 5GB.
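
In yaml (80000 is a starting point to experiment with, not a tuned value):

```
bdio_split_max_file_entries: 80000
```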
    


            

Raw data

            {
    "_id": null,
    "home_page": null,
    "name": "blackduck-c-cpp",
    "maintainer": null,
    "docs_url": null,
    "requires_python": ">=3.7",
    "maintainer_email": null,
    "keywords": null,
    "author": null,
    "author_email": null,
    "download_url": "https://files.pythonhosted.org/packages/8f/77/648d16797a6db4876c1359c71c68fcdd8baf7dc8f7ae3c14b24d6de3689a/blackduck-c-cpp-2.0.3.tar.gz",
    "platform": null,
    "description": "# blackduck-c-cpp\n\nThis code is responsible for running a c/cpp build wrapped by Coverity - capturing the source and binary files involved\nand then using the available tools to deliver BDIO and signatures to Black Duck using a variety of tools and\nmethodologies.\n\n## Overview\n\nC and CPP projects don't have a standard package manager or method for managing dependencies. It is therefore more\ndifficult to create an accurate BOM for these projects. This leaves Software Composition Analysis tools fewer options\nthan with other languages. The primary options which are available in this context are: file system signatures. Black\nDuck has a variety of old and new signatures which can be used to build a BOM. In order to effectively use signatures,\nthe tool first needs to know which files to take signatures from. In the past SCA tools have pointed a scanner at a\nbuild directory, getting signatures from a subset of files within the directory sub-tree. The problem with this approach\nis that there are many environmental variables, parameters and switches provided to the build tools, which make\nreference to files outside of the build directory to include as part of the build. Further, there are, commonly, files\nwithin the build directory, which are not part of the build and can lead to false positives within the BOM.\n\nThe new Black Duck C/CPP tool avoids the pitfalls described above by using a feature of Coverity called Build Capture.\nCoverity Build Capture, wraps your build, observing all invocations of compilers and linkers and storing the paths of\nall compiled source code, included header files and linked object files. These files are then matched using a variety of\nmethods described in the section of this document called \"The BOM\".\n\n## Supported Platforms\n\nDebian, Redhat, Ubuntu, openSUSE, Fedora, CentOS, macOS, and Windows are supported.\n\nThe signature scan and binary scan will be completed on all supported platforms as permitted by your Black Duck license.\nAny scan cli parameters can be used and passed to blackduck-c-cpp tool through the additional_sig_scan_args parameter.\n\nOn Unix-like operating systems, a package manager scan will also be run. Since Windows doesn't have a supported package\nmanager, blackduck-c-cpp scans run on Windows won't include the package manager scan and won't produce a BDIO file.\nHere, package manager scan refers to usage of O/S package managers such as yum, apt etc.\n\n## Installation\n\nMinimum version of Black Duck required is 2020.10.0\n\nTo install from pypi:\n\n```\npip install blackduck-c-cpp\n```\n\nTo install a specific version:\n\n```\npip install blackduck-c-cpp==2.0.0\n```\n\n## Configuration\n\nPrior to running your build, run any build specific configuration needed. Then the blackduck-c-cpp tool can either be\nconfigured using a .yaml file or with command line arguments.\n\nHere is a sample fully functional .yaml configuration: ardour-config.yaml\n\n```\nbuild_cmd: ../waf build\nbuild_dir: /Users/theUser/myProject/ardour/build/\nskip_build: False\nverbose: True\nproject_name: ardour_mac\nproject_version: may-4-2021\nbd_url: https://...\napi_token: <token>\ninsecure: False\n```\n\n### API Token\n\nBlack Duck API tokens are generated on a per-user basis. To scan to a new project and view the results, the user who\ngenerates the API token for blackduck-c-cpp must at minimum have the **Global Code Scanner**, **Global Project\nViewer**, and **Project Creator** roles assigned. 
To scan to an existing project and view the results, the user must at\nminimum have the project\nassigned to their user, and have the **Project Code Scanner** role assigned. See Administration > Managing Black Duck\nuser accounts > Understanding roles in the Black Duck Help documentation for more details on user roles. The Black Duck\nHelp documentation is accessible through the Black Duck UI.\n\nTo generate an API token:\n\n1. Go to the Black Duck UI and log in.\n2. From the user menu located on the top navigation bar, select My Access Tokens.\n3. Click Create New Token. The Create New Token dialog box appears.\n4. Enter a name, description (optional), and select the scope for this token (to use with blackduck-c-cpp, must be **\n   read and write access**).\n5. Click Create. The Access Token Name dialog box appears with the access token.\n6. Copy the access token shown in the dialog box. This token can only be viewed here at this time. Once you close the\n   dialog box, you cannot view the value of this token.\n\n### Bazel\n\nBazel is supported in Coverity starting in versions 2022.3.0+ and blackduck-c-cpp in versions 1.0.13+.\n\nTo enable, use the `--bazel` switch (or set `bazel: True` in your yaml configuration file), but additional Coverity\nsetup is required as described below.\n\nBazel builds can be captured on the x86_64 versions of Windows, Linux, and macOS that are supported by Coverity\nAnalysis.\n\nCompilers for Coverity analysis are supported, but all compilers must be accessible and runnable on the host system:\nRemote cross-platform builds are not supported.\n\n#### Bazel Setup\n\n##### Modify project files\n\n###### Workspace file\n\nLike other Bazel integrations, the Coverity integration has an archive of rules to be used by the build.\n\nblackduck-c-cpp will attempt to automatically update this file as required if it hasn't already been modified by the\nuser. If the automatic update fails, the failure will be logged and the user will need to complete the following steps\nmanually.\n\nThe WORKSPACE (or WORKSPACE.bazel) file defines the root of the Bazel project, and it needs to be modified to reference\nthe Coverity integration. If you are supplying your own Coverity installation, the Coverity integration can be found in\nthe Coverity Analysis installation at\n\n```\n<Coverity Analysis installation path>/bazel/rules_coverity.tar.gz\n```\n\nIf you are using the mini package provided by blackduck-c-cpp, then by default the Coverity integration can be found in\nthe Coverity Analysis installation at\n\n```\n`<User home>/.synopsys/blackduck-c-cpp/cov-build-capture/bazel/rules_coverity.tar.gz`\n```\n\nYou can remove it from the installation and host it anywhere convenient.\n\nAssuming the integration archive is available on a network share at `/mnt/network-share/rules_coverity.tar.gz,` append\nthe following snippet onto your WORKSPACE file:\n\n```\nload(\"@bazel_tools//tools/build_defs/repo:http.bzl\", \"http_archive\")\nhttp_archive(\n    name=\"rules_coverity\",\n    urls=[\"file:///mnt/network-share/rules_coverity.tar.gz\"],\n)\n  \n  \nload(\"@rules_coverity//coverity:repositories.bzl\", \"rules_coverity_toolchains\")\nrules_coverity_toolchains()\n```\n\nYou can use different URLs, depending on whether the integration archive is available locally, on a file share, or\nthrough HTTP. The only part of the kit that is necessary for this is the integration archive, so it can be placed\nwherever needed, independently of the rest of the kit. 
Bazel can fetch from \"file://\", \"http://\" and \"https://\" URLs.\nThe \"urls\" field is a list - multiple URLs can be specified, and fetching the integration from them will be attempted in\norder.\n\n###### Build file\n\nUnlike the WORKSPACE file, blackduck-c-cpp can't update the BUILD file automatically. This must be completed by the\nuser.\n\nBazel uses the BUILD (or BUILD.bazel) file to do the following:\n\n- Mark a package boundary\n- Declare what targets can be built in that package\n- Specify how to build those targets\n\nThe Coverity-Bazel integration needs a new target added that depends on existing targets to generate a \"build\ndescription\" of all the build commands that would have been executed in the building of those targets. If you had,\nfor example, a build with two separate targets that you wanted to capture, the BUILD file would start out looking\nsomething like this:\n\n```\nload(\"@rules_cc//cc:defs.bzl\", \"cc_binary\")\u200b\ncc_binary(name=\"foo\", srcs=[\"foo.cc\"])\ncc_binary(name=\"bar\", srcs=[\"bar.cc\"])\n```\n\nTo capture the files (including link files) used in the building of the targets :foo and :bar (foo.cc and bar.cc,\nrespectively), you would\nmodify the BUILD file to be something like this:\n\n```\nload(\"@rules_cc//cc:defs.bzl\", \"cc_binary\")\ncc_binary(name=\"foo\", srcs=[\"foo.cc\"])\ncc_binary(name=\"bar\", srcs=[\"bar.cc\"])\n \nload(\"@rules_coverity//coverity:defs.bzl\", \"cov_enable_link\", \"cov_gen_script\")\ncov_enable_link(\n    name = \"enable_link\",\n    build_setting_default = True,\n)\ncov_gen_script(name=\"coverity-target\", deps=[\":foo\", \":bar\"], enable_link = \":enable_link\",)\n```\n\nHere is an example using Google's open source abseil-cpp library (https://github.com/abseil/abseil-cpp):\n\nBefore:\n\n```\npackage(default_visibility = [\"//visibility:public\"])\n\nlicenses([\"notice\"])  # Apache 2.0\n\n# Expose license for external usage through bazel.\nexports_files([\n    \"AUTHORS\",\n    \"LICENSE\",\n])\n```\n\nAfter:\n\n```\npackage(default_visibility = [\"//visibility:public\"])\n\nlicenses([\"notice\"])  # Apache 2.0\n\n# Expose license for external usage through bazel.\nexports_files([\n    \"AUTHORS\",\n    \"LICENSE\",\n])\n\nload(\"@rules_coverity//coverity:defs.bzl\", \"cov_enable_link\", \"cov_gen_script\")\ncov_enable_link(\n    name = \"enable_link\",\n    build_setting_default = True,\n)\ncov_gen_script(\n    name=\"cov\",\n    deps = [\n        \"//absl/status:statusor\",\n        \"//absl/status:status\",\n        \"//absl/random:bit_gen_ref\",\n        \"//absl/functional:bind_front\",\n        \"//absl/flags:parse\",\n        \"//absl/flags:usage\",\n        \"//absl/flags:flag\",\n        \"//absl/debugging:leak_check\",\n        \"//absl/debugging:failure_signal_handler\",\n        \"//absl/debugging:leak_check_disable\",\n        \"//absl/container:node_hash_set\",\n        \"//absl/container:hashtable_debug\",\n        \"//absl/random:random\",\n        \"//absl/random:seed_sequences\",\n        \"//absl/random:seed_gen_exception\",\n        \"//absl/random:distributions\",\n        \"//absl/container:flat_hash_set\",\n        \"//absl/types:any\",\n        \"//absl/types:bad_any_cast\",\n        \"//absl/container:btree\",\n        \"//absl/types:compare\",\n        \"//absl/cleanup:cleanup\",\n        \"//absl/container:node_hash_map\",\n        \"//absl/container:node_hash_policy\",\n        \"//absl/flags:reflection\",\n        \"//absl/container:flat_hash_map\",\n        
\"//absl/container:raw_hash_map\",\n        \"//absl/container:raw_hash_set\",\n        \"//absl/container:hashtablez_sampler\",\n        \"//absl/container:hashtable_debug_hooks\",\n        \"//absl/container:hash_policy_traits\",\n        \"//absl/container:common\",\n        \"//absl/container:hash_function_defaults\",\n        \"//absl/strings:cord\",\n        \"//absl/container:layout\",\n        \"//absl/container:inlined_vector\",\n        \"//absl/hash:hash\",\n        \"//absl/types:variant\",\n        \"//absl/types:bad_variant_access\",\n        \"//absl/hash:city\",\n        \"//absl/container:fixed_array\",\n        \"//absl/container:compressed_tuple\",\n        \"//absl/container:container_memory\",\n        \"//absl/flags:marshalling\",\n        \"//absl/strings:str_format\",\n        \"//absl/numeric:representation\",\n        \"//absl/functional:function_ref\",\n        \"//absl/flags:config\",\n        \"//absl/flags:commandlineflag\",\n        \"//absl/types:optional\",\n        \"//absl/types:bad_optional_access\",\n        \"//absl/utility:utility\",\n        \"//absl/synchronization:synchronization\",\n        \"//absl/time:time\",\n        \"//absl/debugging:symbolize\",\n        \"//absl/strings:strings\",\n        \"//absl/numeric:int128\",\n        \"//absl/numeric:bits\",\n        \"//absl/debugging:stacktrace\",\n        \"//absl/types:span\",\n        \"//absl/memory:memory\",\n        \"//absl/algorithm:container\",\n        \"//absl/meta:type_traits\",\n        \"//absl/algorithm:algorithm\",\n    ],\n    enable_link = \":enable_link\",\n)\n```\n\n###### Customization: compilation mnemonics\n\nWhich Bazel actions are treated as build commands is determined by the mnemonic of the action. For now, the only\nmnemonics that are treated as a build commands by default are CppCompile, Javac and Compile. These are the mnemonics\nthat the builtin cc_binary/cc_library rules, the builtin java_binary/java_library rules and the standard\ncsharp_binary/csharp_library rules use for their compilation actions, respectively. 
If you have custom rules that\ngenerate actions that should be treated as build commands, modify the BUILD file again, extending from this:\n\n```\nload(\"@rules_cc//cc:defs.bzl\", \"cc_binary\")\ncc_binary(name=\"foo\", srcs=[\"foo.cc\"])\ncc_binary(name=\"bar\", srcs=[\"bar.cc\"])\n \nload(\"@rules_coverity//coverity:defs.bzl\", \"cov_gen_script\")\ncov_gen_script(name=\"coverity-target\", deps=[\":foo\", \":bar\"])\n```\n\nto something like the following:\n\n```\nload(\"@rules_cc//cc:defs.bzl\", \"cc_binary\")\ncc_binary(name=\"foo\", srcs=[\"foo.cc\"])\ncc_binary(name=\"bar\", srcs=[\"bar.cc\"])\n \nload(\n    \"@rules_coverity//coverity:defs.bzl\", \n    \"cov_gen_script\", \n    \"cov_compile_mnemonics\"\n    )\ncov_compile_mnemonics(\n    name=\"extra_mnemonics\", \n    build_setting_default=[\"FirstMnemonic\", \"SecondMnemonic\"]\n    )\ncov_gen_script(\n    name=\"coverity-target\", \n    deps=[\":foo\", \":bar\"], \n    extra_compile_mnemonics=\":extra_mnemonics\"\n    )\n```\n\n### Details\n\nusage: blackduck-c-cpp [-h] [-c CONFIG] [-bc build_cmd] -d\nBUILD_DIR [-Cov coverity_root] [-Cd cov_output_dir] [-od output_dir]\n[-s [SKIP_BUILD]] [-v [verbose]] -proj PROJECT_NAME -vers PROJECT_VERSION [-Cl CODELOCATION_NAME] -bd\nbd_url [-a api_token] [-as additional_sig_scan_args] [-i [insecure]] [-f [force]]\n[-djs [DISABLE_BDIO_JSON_SPLITTER]] [-si SCAN_INTERVAL] [-jsl json_splitter_limit]\n[-bsfl bdio_split_max_file_entries] [-bscn bdio_split_max_chunk_nodes] [-dg [debug]]\n[-st [SKIP_TRANSITIVES]] [-sh [SKIP_INCLUDES]] [-sd [SKIP_DYNAMIC]] [-off [OFFLINE]] [-md modes]\n[-uo [USE_OFFLINE_FILES]] [-sc scan_cli_dir] [-Cc cov_configure_args] [-ac additional_coverity_params]\n[-es [EXPAND_SIG_FILES]] [-po PORT] [-ba [BAZEL]] [-pgn PROJECT_GROUP_NAME]\n\narguments:\n\n```\n  -h, --help            show this help message and exit\n  -c CONFIG, --config CONFIG\n                        Configuration file path.\n  -bc build_cmd, --build_cmd build_cmd\n                        Command used to execute the build\n  -d BUILD_DIR, --build_dir BUILD_DIR\n                        Directory from which to run build\n  -Cov coverity_root, --coverity_root coverity_root\n                        Base directory for coverity. If not specified, blackduck-c-cpp downloads latest mini coverity package from GCP for authorized Black Duck customers\n                        for Black Duck versions >= 2021.10. For downloading coverity package using GCP, you need to open connection toward *.googleapis.com:443. If you\n                        don't have coverity package and your Black Duck version is < 2021.10, please contact sales team to get latest version of coverity package.\n  -Cd cov_output_dir, --cov_output_dir cov_output_dir\n                        Target directory for coverity output files.If not specified, defaults to user_home/.synopsys/blackduck-c-cpp/output/project_name\n  -od output_dir, --output_dir output_dir\n                        Target directory for blackduck-c-cpp output files.If not specified, defaults to user_home/.synopsys/blackduck-c-cpp/output/project_name. \n                        output_dir should be outside of the build directory.\n  -s [SKIP_BUILD], --skip_build [SKIP_BUILD]\n                        Skip build and use previously generated build data. 
Make sure that your initial coverity wrapped build uses the --emit-link-units flag\n  -v [verbose], --verbose [verbose]\n                        Verbose mode selection\n  -proj PROJECT_NAME, --project_name PROJECT_NAME\n                        Black Duck project name\n  -vers PROJECT_VERSION, --project_version PROJECT_VERSION\n                        Black Duck project version\n  -Cl CODELOCATION_NAME, --codelocation_name CODELOCATION_NAME\n                        This controls the Black Duck's codelocation. The codelocation_name will overwrite any scans sent to the same codelocation_name, indicating that\n                        this is a new scan of a previous code location. Use with care.\n  -bd bd_url, --bd_url bd_url\n                        Black Duck URL\n  -a api_token, --api_token api_token\n                        Black Duck API token.  Instead of specifying api_token value in command line or yaml file, use the BD_HUB_TOKEN environment variable to specify a Black Duck API token.\n  -as additional_sig_scan_args, --additional_sig_scan_args additional_sig_scan_args\n                        Any additional args to pass to the signature scanner. IndividualFileMatching is by default turned on. To pass multiple params, you can pass it like additional_sig_scan_args: '--snippet-matching --license-search'.\n                        It accepts scan cli properties; Detect properties are not accepted here.\n  -i [insecure], --insecure [insecure]\n                        Disable SSL verification so self-signed Black Duck certs will be trusted\n  -f [force], --force [force]\n                        In case of GCP failure, force use of older version of Coverity (if present)\n  -djs [DISABLE_BDIO_JSON_SPLITTER], --disable_bdio_json_splitter [DISABLE_BDIO_JSON_SPLITTER]\n                        Disable the json splitter and always upload as a single scan. For using json/bdio splitter, dryrun is needed, so please run in offline mode first.\n  -si SCAN_INTERVAL, --scan_interval SCAN_INTERVAL\n                        Set the number of seconds to wait between scan uploads in case of multiple scans\n  -jsl json_splitter_limit, --json_splitter_limit json_splitter_limit\n                        Set the limit for a scan size in bytes. For using json/bdio splitter, dryrun is needed, so please run in offline mode first.\n  -bsfl bdio_split_max_file_entries, --bdio_split_max_file_entries bdio_split_max_file_entries\n                        Set the limit for maximum scan node entries per generated BDIO file\n  -bscn bdio_split_max_chunk_nodes, --bdio_split_max_chunk_nodes bdio_split_max_chunk_nodes\n                        Set the limit for maximum scan node entries per single bdio-entry file\n  -dg [debug], --debug [debug]\n                        Debug mode selection. Setting debug: True sends all the files we found to all matching types. 
By default, it will only send files not detected by\n                        package manager to BDBA and Signature matching.\n  -st [SKIP_TRANSITIVES], --skip_transitives [SKIP_TRANSITIVES]\n                        Skipping all transitive dependencies\n  -sh [SKIP_INCLUDES], --skip_includes [SKIP_INCLUDES]\n                        Skipping all .h & .hpp files from all types of scan\n  -sd [SKIP_DYNAMIC], --skip_dynamic [SKIP_DYNAMIC]\n                        Skipping all dynamic (.so/.dll) files from all types of scan\n  -off [OFFLINE], --offline [OFFLINE]\n                        Store bdba and sig zip files, sig scan json, and raw_bdio.csv to disk if offline mode is true.\n                        For scans over 5GB to use bdio/json splitter, please run in offline mode first. \n                        scan_cli_dir should be specified when run in offline mode to generate dryrun files. \n                        Once the dryrun files are generated, use_offline_files: True can be set to upload those to hub.\n  -md modes, --modes modes\n                        Comma separated list of modes to run - 'all'(default),'bdba','sig','pkg_mgr'\n  -uo [USE_OFFLINE_FILES], --use_offline_files [USE_OFFLINE_FILES]\n                        Use offline generated files for upload in online mode\n  -sc scan_cli_dir, --scan_cli_dir scan_cli_dir\n                        Scan cli directory. Ex: Providing scan_cli_dir as /home/../../Black_Duck_Scan_Installation/ instead of\n                        /home/../../Black_Duck_Scan_Installation/scan.cli-2022.4.0/ works.\n  -Cc cov_configure_args, --cov_configure_args cov_configure_args\n                        Additional configuration commands to cov-configure for different compilers. Inputs taken are of format {\"compiler\":\"compiler-type\"}. There is a way\n                        to use coverity template configuration to reduce number of template compiler configurations with wildcards: example: \"--compiler *g++ --comptype\n                        gcc\" for adding x86_64-pc-linux-gnu-g++ can be passed as cov_configure_args: {\"*g++\":\"gcc\"}\n  -ac additional_coverity_params, --additional_coverity_params additional_coverity_params\n                        Any additional args to pass to coverity build command. example: \"--record-with-source\"\n  -es [EXPAND_SIG_FILES], --expand_sig_files [EXPAND_SIG_FILES]\n                        Use expand_sig_files for creating exploded directory instead of zip in sig scanner mode\n  -po PORT, --port PORT\n                        Set a custom Black Duck port\n  -ba [BAZEL], --bazel [BAZEL]\n                        Use if this is a bazel build - make sure you have followed the setup instructions for Coverity\n  -pgn PROJECT_GROUP_NAME, --project_group_name PROJECT_GROUP_NAME\n                        This is same as --detect.project.group.name in detect. Sets the 'Project Group' to assign the project to. Must match exactly to an existing project\n                        group on Black Duck.\n  -scv set_coverity_mode, --set_coverity_mode set_coverity_mode\n                        specify coverity mode to 'cov-build' to force run with cov-build. 
cov-cli runs by default for coverity versions >= 2023.9 and cov-build for <\n                        2023.9\n  -fpc force_pull_coverity_vers, --force_pull_coverity_vers force_pull_coverity_vers\n                        For linux platforms, force pull 2022.9 or latest version of coverity if not auto downloaded by blackduck-c-cpp correctly by specifying -'old' or\n                        'latest' respectively                        \n```\n\n#### blackduck-c-cpp 2.0.0\n\nHere's what changed:\n\n2.0.0 version uses cov-cli instead of cov-build by default. Coverity cli uses cov-build under the hood.\nIt is a layer of automation on top of cov-build and other tools.\nInstead of the user having to figure out the correct cov-configure options,\nCoverity CLI guesses at the right options and runs the tools automatically.  \nIt doesn't always work correctly, so there are options to fix things where needed.\nYou can also choose to run cov-build by setting following option in yaml file:\nset_coverity_mode: 'cov-build'\n\nIn CentOS7, the latest version of glibc supported is 2.17. Starting with coverity build capture 2022.12.0, glibc_2.18 is\na requirement.\nSo, we try to auto-download an older version of Coverity, 2022.9, on linux platforms with glibc 2.17 or older.\nIf blackduck-c-cpp doesn't auto-download Coverity, you can forcefully download an older version by specifying following\nparameter in yaml\nfile:\nforce_pull_coverity_vers: 'old'\n\n#### Running\n\nOnce your blackduck-c-cpp tool is installed and configured as explained above, simply run the command:\n\nblackduck-c-cpp --config /Users/theUser/myProject/ardour-config.yaml\n\nTo use snippet scanning, pass the snippet scanning parameters to the signature scanner using\n--additional_sig_scan_args <snippet scanning parameter(s)>. Synopsys recommends using --snippet-matching. See Scanning\nComponents > Using the Signature Scanner > Running a component scan using the Signature Scanner command line in the\nBlack Duck Help Guide for more details.\n\nTo access the Black Duck server via a proxy, you must set a SCAN_CLI_OPTS environment variable prior to running the\nscan. See Scanning Components > Using the Signature Scanner > Accessing the Black Duck server via a proxy in the Black\nDuck Help Guide for details.\n\n#### The Bom\n\nDirect Dependencies - These are files which are being linked in to the built executable directly or header files\nincluded by source code as identified by Coverity Build Capture.  \nPackage Manager - The Package Manager of the Linux system is queried about the source of the files - if recognized,\nthese are added to the BOM as \"Direct Dependencies\". Transitive Dependencies - These are files which are needed by the\nDirect Dependencies. LDD - LDD is used to List the files (Dynamic Dependencies) of the Direct Dependencies. These files\nare then used to query the package manager and results are added to the BOM as \"Transitive Dependencies\". Binary Matches\nBDBA - Any linked object files not identified by the package manager are sent to BDBA (Binary) for matching. Signature\nMatches - Any linked object and header files not identified by the package manager as well as all source code identified\nby Coverity Build Capture are then sent to the Knowledge Base for signature matching.\n\n## CI Builds\n\nThis projects CI build is run through GitLab-CI Pipelines, within this repository. When changes are made on\nthe `master` (default) branch, the version will be appended with `b` and the pipeline number as metadata. 
#### The Bom

* Direct Dependencies - files which are linked into the built executable directly, or header files included by source code, as identified by Coverity Build Capture.
* Package Manager - the package manager of the Linux system is queried about the source of the files; if recognized, these are added to the BOM as "Direct Dependencies".
* Transitive Dependencies - files which are needed by the Direct Dependencies.
* LDD - ldd is used to list the dynamic dependencies of the Direct Dependencies. These files are then used to query the package manager, and the results are added to the BOM as "Transitive Dependencies".
* Binary Matches (BDBA) - any linked object files not identified by the package manager are sent to BDBA (Binary) for matching.
* Signature Matches - any linked object and header files not identified by the package manager, as well as all source code identified by Coverity Build Capture, are sent to the Knowledge Base for signature matching.

## CI Builds

This project's CI build is run through GitLab-CI Pipelines within this repository. When changes are made on the `master` (default) branch, the version will be appended with `b` and the pipeline number as metadata. For `release/` branches, `-rc` will be appended to the version with the pipeline number as metadata, and this will be published to Artifactory. When changes are made to another branch (`dev/` or `bugfix/`, for example), `dev` will be appended to the version with the pipeline number, and the commit hash will be appended as metadata.

For example:

* default branch: 1.0.0b3821+abcd1234
* release branch: 1.0.0rc4820+abcd1234
* dev branch: 1.0.0dev5293+abcd1234
* release: 1.0.0

Release jobs are also run through GitLab-CI Pipelines when tagged as described below. The version will be uploaded to Artifactory at the end of the pipeline.

# Releasing

To release this library, tag this repo with a tag of the format `vMM.mm.ff`, like `v1.0.1`. This version should match the version (minus the `v`) in `setup.py`.

Be sure to increment the version in `setup.py` to the next fix version, or minor/major version as necessary. Do not add any metadata or additional version information to the version here.

The specific set of steps is (see the sketch after this list):

- Ensure a full `python setup.py install` completes
- Commit changes
- Tag with `v##.##.##`, matching the version number in `setup.py`
- Push the change log changes, and tag, to GitLab
- Update the version number in `setup.py`
- Commit version change and push to GitLab
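For example, releasing a hypothetical version 1.0.1 would look like the following. This is a sketch of the steps above, not an official release script.

```
# The tag must match the version in setup.py (minus the leading 'v').
git tag v1.0.1
git push origin master --tags
# Afterwards: bump the version in setup.py, commit, and push to GitLab.
```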
## FAQs

1. The BOM isn't capturing all expected components. What should I do?

Make sure you did a clean build: run all clean and configure commands before running the blackduck-c-cpp tool with the build command. Also, if you are using custom compilers, you have to configure them as follows: `cov_configure_args: {"gcc.cx.a.b-ac.mips64-linux":"gcc"}`, where "gcc.cx.a.b-ac.mips64-linux" is the compiler and "gcc" is the compiler type. You can also set matchConfidenceThreshold to 0 in additional_sig_scan_args.

2. How do I run snippet scanning?

Pass the below setting in your yaml file:
`additional_sig_scan_args: '--snippet-matching'`
To run it from the command line, for example:
`blackduck-c-cpp -bc "make" -d "/apps/cpuminer-2.5.1/" -s False -v True -proj "cpuminer-cmd" -vers 1.0 -bd "https:<bd_url>" -a "<api_token>" -as "--snippet-matching --copyright-search" -i False`

3. Where can blackduck-c-cpp.log be found on the system?

All output files will be in `user_home/.synopsys/blackduck-c-cpp/output/project_name` by default if --output_dir is not given. Otherwise, all output files will be in output_dir.

4. How do I run blackduck-c-cpp?

Run with a config file where all arguments are set, or through the command line.
Example:
`blackduck-c-cpp -c /apps/../../cpuminer-config.yaml`
or, from the command line:
`blackduck-c-cpp -bc "make" -d "/apps/cpuminer-2.5.1/" -s False -proj "cpuminer-cmd" -vers 1.0 -bd "https:<bd_url>" -a "<api_token>" -i False`

5. blackduck-c-cpp invokes BDBA. Do we need to be licensed for it? What happens if I don't have BDBA?

It logs `BDBA is not licensed for use with the current Black Duck instance -- will not be used` and moves on to the next matching type.

6. Running blackduck-c-cpp throws import errors.

Check whether you installed blackduck-c-cpp from testpypi. If so, please uninstall it and install from pypi so dependencies are installed automatically. If you still see import errors, there may be issues with multiple installations: try creating a virtual environment with a python >= 3.7 version, uninstall blackduck-c-cpp outside the virtual environment, and install blackduck-c-cpp inside the virtual env. Otherwise, it may be looking at the wrong installation path (this can be seen in the stacktrace). In a linux environment:

```
python3 -m venv venv
source venv/bin/activate
pip3 install blackduck-c-cpp
```

7. Where can I download the coverity mini package?

If coverity_root is not specified, blackduck-c-cpp automatically downloads the latest mini coverity package from GCP for authorized Black Duck users on Black Duck versions >= 2021.10. Downloading the coverity package from GCP requires an open connection to *.googleapis.com:443. If you don't have a coverity package and your Black Duck version is < 2021.10, please contact the sales team to get the latest version of the coverity package.

8. BDBA upload throws an error as follows:

```
    raise RemoteDisconnected("Remote end closed connection without"
http.client.RemoteDisconnected: Remote end closed connection without response
.......
requests.exceptions.ConnectionError: ('Connection aborted.', ConnectionResetError(104, 'Connection reset by peer'))
```

Check your requests-toolbelt library version with `pip show requests-toolbelt`. If you have a version older than 0.9.1, install version 0.9.1 and try again.

9. Windows build - the blackduck-c-cpp process is stuck during a phase.

Try giving keyboard input by pressing enter or any other key if you still have the command prompt open where it is stuck. We have noticed on Windows that programs sometimes get stuck when you click into the console and enter "selection" mode to highlight/copy text from it.

10. Error:

```
headers.pop('Accept')
KeyError: 'Accept'
```

Run `pip show blackduck`. If you have a version < 1.0.4, install version 1.0.4 and try again.

11. Windows error - `MemoryError`

Make sure you have the correct installation of python (64-bit vs 32-bit) for your operating system.

12. Spaces in the path to Coverity analysis, e.g. `/apps/.../cov\ 2021\ <vers>/bin/cov-build`

Coverity needs to be located in a directory that doesn't have a space in it.

13. Signature scan is performed on a zip, and adding other sig scan arguments is not working. What should I do?

Set `expand_sig_files: True`

14. How do I uninstall blackduck-c-cpp?

`pip uninstall blackduck-c-cpp`

15. I already have a coverity build for my project. Can I use the tool?

Yes, set --cov_output_dir to the path where your coverity output files reside (build-log.txt and the emit directory), then set `skip_build: True`. A sketch follows below.
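A minimal yaml sketch of this reuse scenario (the path is a placeholder):

```
# Point at an existing Coverity capture containing build-log.txt and the emit
# directory, and skip the build itself.
cov_output_dir: /home/theUser/myProject/coverity_output/
skip_build: True
```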
16. How can I see more logging information for troubleshooting?

Check the blackduck-c-cpp.log file in output_dir, or set verbose: True to see if it reveals any issues in stdout.

17. I have custom compilers. What should I do?

If you are using custom compilers, you have to configure them as follows: `cov_configure_args: {"gcc.cx.a.b-ac.mips64-linux":"gcc"}`, where "gcc.cx.a.b-ac.mips64-linux" is the compiler and "gcc" is the compiler type.

18. What is debug mode?

Setting `debug: True` sends all the files we found to all matching types. By default, the tool only sends files not detected by the package manager to BDBA and Signature matching.

19. How do I run a specific matching type?

You can select modes: sig, bdba, pkg_mgr in the config file to run specific ones.

20. I have already run blackduck-c-cpp once, in offline mode, and now want to run in online mode. Do I need to do the full build again?

No, you can set `use_offline_files: True` and `skip_build: True` to use the already stored files and just upload them to Black Duck.

21. I have already run blackduck-c-cpp once. I got a few errors after the build finished, which are fixed now, and I want to run again. Do I need to do the full build again?

No, you can set `skip_build: True` to skip the build process.

22. How do I exclude a full directory from the signature scan?

Signature scanning files are all placed under the ..\sig_scan\sig_files\ directory.
Example: C:\Users\kakarlas.SYNOPSYS\.synopsys\blackduck-c-cpp\output\godot-windows-jul11-2\sig_scan\sig_files\
It has the below paths in the sig_files folder:
C:\Users\kakarlas.SYNOPSYS\.synopsys\blackduck-c-cpp\output\godot-windows-jul11-2\sig_scan\sig_files\Users\
C:\Users\kakarlas.SYNOPSYS\.synopsys\blackduck-c-cpp\output\godot-windows-jul11-2\sig_scan\sig_files\Program Files\
C:\Users\kakarlas.SYNOPSYS\.synopsys\blackduck-c-cpp\output\godot-windows-jul11-2\sig_scan\sig_files\godot\
Pass this in excludes.txt:
/Users/
/Program\ Files/
In the yaml file, add the below setting:
additional_sig_scan_args: '--snippet-matching --exclude-from C:\Users\kakarlas.SYNOPSYS\Desktop\excludes.txt'

23. Should it be additional_sig_scan_args: '--individualFileMatching=BINARY' when I want to use the "Binary" value for individualFileMatching?

individualFileMatching is turned on by default in the blackduck-c-cpp tool and can't be turned off.

24. The bdio splitter fails to split one part when uploading to hub. How do I fix it?

The splitter operates on node numbers, not size, so when a dataset contains large archives it performs suboptimally. There are parameters to tweak it: bdio_split_max_file_entries and bdio_split_max_chunk_nodes, which default to 100000 and 3000 respectively. If the default values don't work, try reducing bdio_split_max_file_entries to about 80000 and see if the file size drops closer to 5GB. A sketch follows below.
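A yaml sketch of the splitter tuning described above, using the values from the answer:

```
# Defaults: bdio_split_max_file_entries: 100000, bdio_split_max_chunk_nodes: 3000.
# If a split part is still too large, reduce the file-entry cap, for example:
bdio_split_max_file_entries: 80000
bdio_split_max_chunk_nodes: 3000
```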
    "bugtrack_url": null,
    "license": null,
    "summary": "Scanning for c/c++ projects using blackduck and coverity tools",
    "version": "2.0.3",
    "project_urls": null,
    "split_keywords": [],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "4b449e1582b4772ba8a447f118dda96990764b71ba109a2b20b7bfbc740129a6",
                "md5": "b8f84f515ab0849203f2b6db04ba8e65",
                "sha256": "c2565d73174ee941082dce1d74c14ab7034652471994785d5f267663fef09ac4"
            },
            "downloads": -1,
            "filename": "blackduck_c_cpp-2.0.3-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "b8f84f515ab0849203f2b6db04ba8e65",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": ">=3.7",
            "size": 78439,
            "upload_time": "2024-03-27T13:25:40",
            "upload_time_iso_8601": "2024-03-27T13:25:40.303913Z",
            "url": "https://files.pythonhosted.org/packages/4b/44/9e1582b4772ba8a447f118dda96990764b71ba109a2b20b7bfbc740129a6/blackduck_c_cpp-2.0.3-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "8f77648d16797a6db4876c1359c71c68fcdd8baf7dc8f7ae3c14b24d6de3689a",
                "md5": "c45c072f808ace6958ce9bb838b9fc17",
                "sha256": "5dbcadc723f8da2022a37ca72e84332ee6fc3c18f983f421fb7586e909d6bd1b"
            },
            "downloads": -1,
            "filename": "blackduck-c-cpp-2.0.3.tar.gz",
            "has_sig": false,
            "md5_digest": "c45c072f808ace6958ce9bb838b9fc17",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": ">=3.7",
            "size": 89417,
            "upload_time": "2024-03-27T13:25:42",
            "upload_time_iso_8601": "2024-03-27T13:25:42.711935Z",
            "url": "https://files.pythonhosted.org/packages/8f/77/648d16797a6db4876c1359c71c68fcdd8baf7dc8f7ae3c14b24d6de3689a/blackduck-c-cpp-2.0.3.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2024-03-27 13:25:42",
    "github": false,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "lcname": "blackduck-c-cpp"
}
        
Elapsed time: 0.21514s