Running Home Assistant on NVIDIA Jetson Orin Nano: A Complete Docker Setup Guide

Hey everyone! If you’re looking to combine the power of NVIDIA’s Jetson Orin Nano with Home Assistant, you’re in for a treat. The Jetson Orin Nano is an absolute beast for edge AI applications, and running Home Assistant on it opens up some incredible possibilities for smart home automation with on-device AI processing. In this guide, I’ll walk you through setting up Home Assistant in a Docker container on your Jetson Orin Nano.

Why Run Home Assistant on a Jetson Orin Nano?

Before we dive into the setup, let’s talk about why this combination makes sense. The Jetson Orin Nano packs some serious hardware into a compact form factor. We’re talking about up to 67 TOPS of AI performance, 1,024 CUDA cores, and 32 Tensor Cores built on NVIDIA’s Ampere architecture, all powered by a 6-core Arm Cortex-A78AE processor. That’s a lot of computational muscle for edge computing.

While Home Assistant itself doesn’t need this much power, having it on a Jetson opens doors for integrating local AI models for object detection, face recognition, voice processing, and other smart home automations that would normally require cloud services. You can run everything locally, keeping your data private and your latency low.

Prerequisites and Requirements

Before we get started, here’s what you’ll need:

  • NVIDIA Jetson Orin Nano (or Orin Nano Super) development kit
  • JetPack 6.x installed (either via SD card image or host machine flash)
  • Basic familiarity with Linux command line
  • Docker installed (more on this in a moment)
  • At least 16GB of storage space for Home Assistant and its configuration

Understanding the ARM64 Architecture

One important thing to note: the Jetson runs on ARM64/AArch64 architecture, not the x86-64 architecture you might be used to on desktop PCs. This means we need to use ARM64-compatible Docker images. Fortunately, Home Assistant provides official ARM64 images, so we’re covered there.
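
A quick way to confirm this on your board is to check the machine architecture; Docker then automatically pulls the matching linux/arm64 variant of any multi-arch image:

uname -m
# prints aarch64 on the Jetson Orin Nano; x86_64 would mean you’re on a different machine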

Setting Up Docker on JetPack 6

Here’s where things get interesting. If you flashed your Jetson using the SD card image, Docker comes pre-installed. However, starting with JetPack 6.2, Docker is no longer installed by default when flashing from a host machine. This is a change from previous versions, so don’t be surprised if you need to install it manually.

Installing Docker (if needed)

If Docker isn’t already on your system, you can install it following the community guides. The JetsonHacks repository on GitHub provides excellent step-by-step instructions for installing Docker on JetPack 6. The key thing here is to make sure you’re installing the correct version that’s compatible with JetPack.

Important warning: Stick with the docker.io package from Ubuntu that’s supported by JetPack. If you install docker-ce from Docker Inc. instead, you might break GPU passthrough due to different runtime configuration paths. Trust me, you don’t want to deal with that headache.
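
If you do need to install it yourself, a minimal sketch looks like the following (docker.io comes straight from the Ubuntu repositories; see the JetsonHacks guide for any JetPack-specific extras):

sudo apt-get update
sudo apt-get install -y docker.io
# let your user run docker without sudo (log out and back in for this to take effect)
sudo usermod -aG docker $USER
# make sure the daemon is running and starts on boot
sudo systemctl enable --now docker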

Installing NVIDIA Container Toolkit

To enable GPU access in Docker containers (which you’ll want for future AI integrations), you need the NVIDIA Container Toolkit. Install it with these commands:

sudo apt-get update
sudo apt-get install -y nvidia-container-toolkit
sudo systemctl restart docker

You can verify that the NVIDIA runtime is properly configured by running:

sudo docker info | grep nvidia

If everything’s set up correctly, you should see the NVIDIA runtime listed. This runtime allows Docker containers to access the Jetson’s GPU, which is essential for running AI workloads alongside Home Assistant in the future.
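
If nvidia doesn’t appear in the output, the runtime may simply not be registered with Docker yet. The toolkit ships an nvidia-ctk helper that can do this for you, after which a daemon restart should make it show up:

sudo nvidia-ctk runtime configure --runtime=docker
sudo systemctl restart docker
# check again
sudo docker info | grep -i runtime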

Deploying Home Assistant Container

Now for the main event! Home Assistant provides official Docker images for ARM64 architecture. The image we’ll use is ghcr.io/home-assistant/home-assistant:stable, which is hosted on GitHub Container Registry.

Creating a Configuration Directory

First, let’s create a directory to store Home Assistant’s configuration files:

mkdir -p ~/homeassistant/config

This directory will persist all your Home Assistant settings, automations, and configurations even if you recreate the container.

Running Home Assistant with Docker Run

The simplest way to get started is with a single docker run command:

docker run -d \
--name homeassistant \
--privileged \
--restart=unless-stopped \
--network=host \
-v ~/homeassistant/config:/config \
-v /etc/localtime:/etc/localtime:ro \
-v /run/dbus:/run/dbus:ro \
-e TZ=America/New_York \
ghcr.io/home-assistant/home-assistant:stable

Let me break down what each flag does:

  • -d: Runs the container in detached mode (in the background)
  • --name homeassistant: Gives the container a friendly name
  • --privileged: Allows the container to access hardware devices (needed for things like Zigbee sticks, Z-Wave controllers, etc.; a narrower per-device alternative is sketched after this list)
  • --restart=unless-stopped: Automatically restarts the container if it crashes or when the system reboots
  • --network=host: Uses the host’s network stack, which is important for device discovery protocols like mDNS
  • -v ~/homeassistant/config:/config: Mounts your config directory into the container
  • -v /etc/localtime:/etc/localtime:ro: Syncs the container’s time with your system
  • -v /run/dbus:/run/dbus:ro: Provides D-Bus access (optional in general, but required for the Bluetooth integration)
  • -e TZ=America/New_York: Sets your timezone (change this to match your location)
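
If you’d rather not hand the container full --privileged access, one common alternative is to map only the devices you actually use. This is just a sketch: /dev/ttyUSB0 is a placeholder for whatever path your Zigbee or Z-Wave stick shows up as (check with ls /dev/tty* after plugging it in):

docker run -d \
--name homeassistant \
--restart=unless-stopped \
--network=host \
--device /dev/ttyUSB0:/dev/ttyUSB0 \
-v ~/homeassistant/config:/config \
-v /etc/localtime:/etc/localtime:ro \
-v /run/dbus:/run/dbus:ro \
-e TZ=America/New_York \
ghcr.io/home-assistant/home-assistant:stable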

Using Docker Compose (Recommended)

For a cleaner, more maintainable setup, I recommend using Docker Compose. Create a file called docker-compose.yml in a directory of your choice:

version: '3'
services:
  homeassistant:
    container_name: homeassistant
    image: ghcr.io/home-assistant/home-assistant:stable
    volumes:
      - ./config:/config
      - /etc/localtime:/etc/localtime:ro
      - /run/dbus:/run/dbus:ro
    restart: unless-stopped
    privileged: true
    network_mode: host
    environment:
      TZ: America/New_York

Then simply run (or docker compose up -d if you’re using the newer Compose plugin rather than the standalone docker-compose binary):

docker-compose up -d

This approach makes it much easier to manage your configuration and add additional services later (like MQTT brokers, databases, or AI services).
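
As an example, here is roughly what adding a Mosquitto MQTT broker to the same docker-compose.yml could look like, appended under the existing services: key (eclipse-mosquitto is the official image; the volume layout is a suggestion, and Mosquitto 2.x still needs a mosquitto.conf in the config volume before it accepts outside connections):

  mosquitto:
    container_name: mosquitto
    image: eclipse-mosquitto:latest
    volumes:
      - ./mosquitto/config:/mosquitto/config
      - ./mosquitto/data:/mosquitto/data
      - ./mosquitto/log:/mosquitto/log
    ports:
      - "1883:1883"
    restart: unless-stopped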

ARM64-Specific Considerations

There’s one quirk you might encounter on ARM64 systems like the Jetson. Some ARM-based SoCs use a page size larger than 4K, which can cause issues with jemalloc (a memory allocator used by Home Assistant). If you see an error message like “<jemalloc>: Unsupported system page size”, you can disable jemalloc by adding this environment variable:

-e DISABLE_JEMALLOC=true

Or in your docker-compose.yml:

environment:
  TZ: America/New_York
  DISABLE_JEMALLOC: "true"
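
If you’re not sure whether your system is affected, you can check the kernel’s page size directly before changing anything:

getconf PAGESIZE
# 4096 is the standard 4K size; larger values (e.g. 16384 or 65536) are the ones that can trip up jemalloc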

Accessing Home Assistant

Once the container is up and running, give it about 30-60 seconds to initialize. You can check the logs with:

docker logs -f homeassistant

When you see messages indicating that Home Assistant is running, open a web browser and navigate to:

http://<your-jetson-ip>:8123

If you’re accessing it from the Jetson itself, you can use:

http://localhost:8123

You’ll be greeted with the Home Assistant onboarding process where you can create your account and start configuring your smart home!
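
If you’d rather confirm from the terminal first, a simple request against the frontend works too (assuming curl is available, which it is on most JetPack installs):

curl -s -o /dev/null -w "%{http_code}\n" http://localhost:8123
# 200 means the frontend is up; a connection error usually just means it is still starting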

Important Limitations to Know

Running Home Assistant in a Docker container has one significant limitation: you won’t have access to Home Assistant add-ons. Add-ons are only available with Home Assistant OS or the Supervised installation method.

However, this isn’t necessarily a deal-breaker. Most add-ons are just pre-packaged Docker containers anyway. You can run them separately as standalone containers and configure Home Assistant to connect to them. For example, you can run Mosquitto MQTT, Node-RED, or ESPHome as separate Docker containers on your Jetson.
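
As a concrete example, here is roughly how a standalone Node-RED container could be started; nodered/node-red is the official image, 1880 is its documented port, and ~/nodered-data is just a suggested location for its data volume:

docker run -d \
--name nodered \
--restart=unless-stopped \
-p 1880:1880 \
-v ~/nodered-data:/data \
nodered/node-red

Home Assistant then connects to it over the network just as it would to an add-on.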

Future Possibilities: AI Integration

Here’s where things get really exciting. With Home Assistant running on your Jetson, you can integrate powerful AI capabilities that run entirely locally:

  • Object detection using models like YOLOv8 for security camera analysis
  • Face recognition for personalized automations
  • Voice assistants running models like Whisper for speech-to-text and local LLMs for natural language processing
  • DeepStack integration for computer vision tasks accessible through REST APIs

The Jetson community has been developing projects that integrate Ollama (for running local LLMs), custom voice assistants, and even multi-phase Home Assistant AI Lab initiatives. The GPU acceleration available on the Jetson makes these integrations smooth and responsive.

When running AI containers alongside Home Assistant, remember to use the --runtime nvidia flag to enable GPU access:

docker run --runtime nvidia -it your-ai-container
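
For instance, the official ollama/ollama image can be launched this way to serve local LLMs on its default port 11434 (the host path for the model store is just a suggestion, and if the stock image misbehaves on the Jetson, the jetson-containers project publishes Jetson-specific builds):

docker run -d \
--runtime nvidia \
--name ollama \
-p 11434:11434 \
-v ~/ollama-data:/root/.ollama \
ollama/ollama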

Monitoring and Maintenance

To check if your Home Assistant container is running:

docker ps

To view real-time logs:

docker logs -f homeassistant

To update Home Assistant to the latest version:

docker stop homeassistant
docker rm homeassistant
docker pull ghcr.io/home-assistant/home-assistant:stable

Then rerun your docker run command or docker-compose up -d to recreate the container with the updated image. Your configuration will be preserved since it’s stored in the mounted volume.
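
If you went the Docker Compose route instead, the equivalent update is even shorter, since Compose recreates the container for you:

docker-compose pull
docker-compose up -d
# optionally reclaim space from old images
docker image prune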

Troubleshooting Tips

If Home Assistant isn’t starting or you’re encountering issues:

  • Check the logs with docker logs homeassistant for error messages
  • Verify permissions on your config directory: ls -la ~/homeassistant/config
  • Ensure port 8123 isn’t already in use by another service: sudo ss -tlnp | grep 8123 (netstat works too if you have net-tools installed)
  • Test the jemalloc fix if you see page size errors
  • Make sure you’re using the host network mode for proper device discovery

Final Thoughts

Running Home Assistant on the Jetson Orin Nano gives you a powerful platform for local smart home automation with the potential for cutting-edge AI integration. While the setup is slightly more involved than using Home Assistant OS, the flexibility and computational power you gain make it worthwhile, especially if you’re planning to leverage the Jetson’s AI capabilities.

The ARM64 architecture is well-supported by Home Assistant, and the Docker approach keeps everything clean and portable. As the Jetson community continues to develop AI integrations and tools, this setup positions you perfectly to take advantage of local AI processing for your smart home.

Whether you’re just getting started with home automation or you’re looking to add serious AI horsepower to your existing setup, this combination of Home Assistant and Jetson Orin Nano is a solid foundation to build on. Happy automating!
