# Deploy Models Using Inference

- [Deploy a Custom Inference Workload](/self-hosted/2.22/workloads-in-nvidia-run-ai/using-inference/custom-inference.md)
- [Deploy Inference Workloads from Hugging Face](/self-hosted/2.22/workloads-in-nvidia-run-ai/using-inference/hugging-face-inference.md)
- [Deploy Inference Workloads with NVIDIA NIM](/self-hosted/2.22/workloads-in-nvidia-run-ai/using-inference/nim-inference.md)
- [Deploy NVIDIA Cloud Functions (NVCF) in NVIDIA Run:ai](/self-hosted/2.22/workloads-in-nvidia-run-ai/using-inference/nvcf.md)
- [Quick Starts](/self-hosted/2.22/workloads-in-nvidia-run-ai/using-inference/quick-starts.md)
  - [Run Your First Custom Inference Workload](/self-hosted/2.22/workloads-in-nvidia-run-ai/using-inference/quick-starts/inference-quickstart.md)
