# runai inference

Manage inference workloads.

## Options

```
  -h, --help   help for inference
```

## Options inherited from parent commands

```
      --config-file string   config file name; can be set by environment variable RUNAI_CLI_CONFIG_FILE (default "config.json")
      --config-path string   config path; can be set by environment variable RUNAI_CLI_CONFIG_PATH
  -d, --debug                enable debug mode
  -q, --quiet                enable quiet mode, suppress all output except error messages
      --verbose              enable verbose mode
```
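Per the help text above, the config location can be set either with the `--config-path`/`--config-file` flags or through their environment variables. A minimal sketch, assuming a hypothetical config directory (the paths are illustrative, not defaults):

```shell
# Configuration via environment variables, equivalent to passing the
# --config-path and --config-file flags on every invocation
# (the directory below is a hypothetical example):
export RUNAI_CLI_CONFIG_PATH="$HOME/.runai"
export RUNAI_CLI_CONFIG_FILE="config.json"

# With these exported, any subcommand reads the same config, e.g.:
#   runai inference list --verbose
```

Flags passed on the command line take the usual precedence over values picked up from the environment.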

## SEE ALSO

* [runai](https://run-ai-docs.nvidia.com/self-hosted/2.21/reference/cli/runai) - Run:ai Command-line Interface
* [runai inference delete](https://run-ai-docs.nvidia.com/self-hosted/2.21/reference/cli/runai/runai_inference_delete) - delete an inference workload
* [runai inference describe](https://run-ai-docs.nvidia.com/self-hosted/2.21/reference/cli/runai/runai_inference_describe) - describe an inference workload
* [runai inference list](https://run-ai-docs.nvidia.com/self-hosted/2.21/reference/cli/runai/runai_inference_list) - list inference workloads
* [runai inference submit](https://run-ai-docs.nvidia.com/self-hosted/2.21/reference/cli/runai/runai_inference_submit) - submit an inference workload
* [runai inference update](https://run-ai-docs.nvidia.com/self-hosted/2.21/reference/cli/runai/runai_inference_update) - update an inference workload
