runai inference distributed
Runs multiple coordinated inference processes across multiple nodes. This is required for models that are too large to run on a single node.
Options

  -h, --help   help for distributed

Options inherited from parent commands

      --config-file string   config file name; can be set by environment variable RUNAI_CLI_CONFIG_FILE (default "config.json")
      --config-path string   config path; can be set by environment variable RUNAI_CLI_CONFIG_PATH
  -d, --debug                enable debug mode
  -q, --quiet                enable quiet mode, suppress all output except error messages
      --verbose              enable verbose mode

SEE ALSO
runai inference - inference management
runai inference distributed bash - open a bash shell in a distributed inference workload
runai inference distributed delete - delete a distributed inference workload
runai inference distributed describe - describe a distributed inference workload
runai inference distributed exec - execute a command in a distributed inference workload
runai inference distributed list - list distributed inference workloads
runai inference distributed logs - view logs of a distributed inference workload
runai inference distributed port-forward - forward one or more local ports to a distributed inference workload
runai inference distributed scale - scale a distributed inference workload
runai inference distributed submit - submit a distributed inference workload
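As a quick sketch of how the global flags and subcommands above combine on the command line (the workload name `my-llm` is a hypothetical placeholder, not from this page):

```shell
# Point the CLI at a config file via the documented environment
# variable (equivalent to passing --config-file on each invocation).
export RUNAI_CLI_CONFIG_FILE="config.json"

# List distributed inference workloads in the current context.
runai inference distributed list

# Stream logs from one workload with verbose output.
# "my-llm" is a placeholder workload name for illustration only.
runai inference distributed logs my-llm --verbose
```

The inherited flags (`--debug`, `--quiet`, `--verbose`, `--config-file`, `--config-path`) can be appended to any of the subcommands listed above in the same way.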