# Deploy Models Using Inference

- [Deploy a Custom Inference Workload](/self-hosted/2.20/workloads-in-nvidia-run-ai/using-inference/custom-inference.md)
- [Deploy Inference Workloads from Hugging Face](/self-hosted/2.20/workloads-in-nvidia-run-ai/using-inference/hugging-face-inference.md)
- [Deploy Inference Workloads with NVIDIA NIM](/self-hosted/2.20/workloads-in-nvidia-run-ai/using-inference/nim-inference.md)
- [Quick Starts](/self-hosted/2.20/workloads-in-nvidia-run-ai/using-inference/quick-starts.md)
  - [Run Your First Custom Inference Workload](/self-hosted/2.20/workloads-in-nvidia-run-ai/using-inference/quick-starts/run-your-first-custom-inference-workload.md)
