# Workloads in NVIDIA Run:ai

- [Introduction to Workloads](/self-hosted/workloads-in-nvidia-run-ai/introduction-to-workloads.md)
- [Workload Types and Features](/self-hosted/workloads-in-nvidia-run-ai/workload-types.md)
  - [NVIDIA Run:ai Native Workloads](/self-hosted/workloads-in-nvidia-run-ai/workload-types/native-workloads.md)
  - [Supported Workload Types](/self-hosted/workloads-in-nvidia-run-ai/workload-types/supported-workload-types.md)
  - [Extending Workload Support with Resource Interface](/self-hosted/workloads-in-nvidia-run-ai/workload-types/extending-workload-support.md)
    - [Defining a Resource Interface](/self-hosted/workloads-in-nvidia-run-ai/workload-types/extending-workload-support/defining-a-resource-interface.md)
    - [Quick Start Templates](/self-hosted/workloads-in-nvidia-run-ai/workload-types/extending-workload-support/quick-start-templates.md)
  - [Supported Features](/self-hosted/workloads-in-nvidia-run-ai/workload-types/supported-features.md)
- [Workloads](/self-hosted/workloads-in-nvidia-run-ai/workloads.md)
- [Workload Assets](/self-hosted/workloads-in-nvidia-run-ai/assets.md)
  - [Workload Assets](/self-hosted/workloads-in-nvidia-run-ai/assets/overview.md)
  - [Environments](/self-hosted/workloads-in-nvidia-run-ai/assets/environments.md)
  - [Data Sources](/self-hosted/workloads-in-nvidia-run-ai/assets/datasources.md)
  - [Data Volumes](/self-hosted/workloads-in-nvidia-run-ai/assets/data-volumes.md)
  - [Compute Resources](/self-hosted/workloads-in-nvidia-run-ai/assets/compute-resources.md)
  - [Credentials](/self-hosted/workloads-in-nvidia-run-ai/assets/credentials.md)
- [Workload Templates](/self-hosted/workloads-in-nvidia-run-ai/workload-templates.md)
  - [Workspace Templates](/self-hosted/workloads-in-nvidia-run-ai/workload-templates/workspace-templates.md)
    - [Workspace Templates (Legacy)](/self-hosted/workloads-in-nvidia-run-ai/workload-templates/workspace-templates/workspace-templates-legacy.md)
  - [Training Templates](/self-hosted/workloads-in-nvidia-run-ai/workload-templates/training-templates.md)
    - [Standard Training Templates](/self-hosted/workloads-in-nvidia-run-ai/workload-templates/training-templates/standard-training-templates.md)
    - [Distributed Training Templates](/self-hosted/workloads-in-nvidia-run-ai/workload-templates/training-templates/distributed-training-templates.md)
  - [Inference Templates](/self-hosted/workloads-in-nvidia-run-ai/workload-templates/inference-templates.md)
    - [Custom Inference Templates](/self-hosted/workloads-in-nvidia-run-ai/workload-templates/inference-templates/custom-inference-templates.md)
    - [NVIDIA NIM Inference Templates](/self-hosted/workloads-in-nvidia-run-ai/workload-templates/inference-templates/nvidia-nim-inference-templates.md)
    - [Hugging Face Inference Templates](/self-hosted/workloads-in-nvidia-run-ai/workload-templates/inference-templates/hugging-face-inference-templates.md)
- [Experiment Using Workspaces](/self-hosted/workloads-in-nvidia-run-ai/using-workspaces.md)
  - [Running Workspaces](/self-hosted/workloads-in-nvidia-run-ai/using-workspaces/running-workspace.md)
  - [Quick Starts](/self-hosted/workloads-in-nvidia-run-ai/using-workspaces/quick-starts.md)
    - [Running Jupyter Notebooks Using Workspaces](/self-hosted/workloads-in-nvidia-run-ai/using-workspaces/quick-starts/jupyter-quickstart.md)
- [Train Models Using Training](/self-hosted/workloads-in-nvidia-run-ai/using-training.md)
  - [Train Models Using a Standard Training Workload](/self-hosted/workloads-in-nvidia-run-ai/using-training/train-models.md)
  - [Train Models Using a Distributed Training Workload](/self-hosted/workloads-in-nvidia-run-ai/using-training/distributed-training-models.md)
  - [Best Practices: Checkpointing Preemptible Training Workloads](/self-hosted/workloads-in-nvidia-run-ai/using-training/checkpointing-preemptible-workloads.md)
  - [Quick Starts](/self-hosted/workloads-in-nvidia-run-ai/using-training/quick-starts.md)
    - [Run Your First Standard Training](/self-hosted/workloads-in-nvidia-run-ai/using-training/quick-starts/standard-training-quickstart.md)
    - [Run Your First Distributed Training](/self-hosted/workloads-in-nvidia-run-ai/using-training/quick-starts/distributed-training-quickstart.md)
- [Deploy Models Using Inference](/self-hosted/workloads-in-nvidia-run-ai/using-inference.md)
  - [NVIDIA Run:ai Inference Overview](/self-hosted/workloads-in-nvidia-run-ai/using-inference/nvidia-run-ai-inference-overview.md)
  - [Deploy a Custom Inference Workload](/self-hosted/workloads-in-nvidia-run-ai/using-inference/custom-inference.md)
  - [Deploy Inference Workloads with NVIDIA NIM](/self-hosted/workloads-in-nvidia-run-ai/using-inference/nim-inference.md)
  - [Deploy Inference Workloads from Hugging Face](/self-hosted/workloads-in-nvidia-run-ai/using-inference/hugging-face-inference.md)
  - [Deploy NVIDIA Cloud Functions (NVCF) in NVIDIA Run:ai](/self-hosted/workloads-in-nvidia-run-ai/using-inference/nvcf.md)
  - [Quick Starts](/self-hosted/workloads-in-nvidia-run-ai/using-inference/quick-starts.md)
    - [Run Your First Custom Inference Workload](/self-hosted/workloads-in-nvidia-run-ai/using-inference/quick-starts/inference-quickstart.md)
- [Submit Supported Workload Types via YAML](/self-hosted/workloads-in-nvidia-run-ai/submit-via-yaml.md)
