We help our customers manage nearly one million edge devices across tens of thousands of fleets, all running one or more Docker containers. This gives us the ability to analyze the names of hundreds of thousands of containers to see which stacks are the most popular. So that’s what we’ve done here! Without further ado, here is an anonymized list of the most popular container technologies we see in Edge AI projects:
1. ROS (Robot Operating System) for Robotics
Some of the most repeated Docker container names on balena are ros and ros2. Obviously, if you are planning a robotics project, the ROS Docker container is essential: don’t reinvent the wheel. ROS (Robot Operating System) provides a framework for building robot applications, including integrating sensors, actuators, lidars, and cameras, as well as communication between these components.
ROS and ROS2 services are widely used in autonomous drones, autonomous vehicles, and industrial machinery such as robotic arms, among others.
How is ROS being used with balena?
Balena supports robotics developers through the community-maintained ROS blocks available on balenaHub. ROS developers can take advantage of the hardware integration and manage new deployments from balenaCloud while ensuring their robot fleets remain flexible, reliable, and easy to maintain. These blocks are open to collaboration, so feel free to update and improve them.
Running ROS or ROS2 in a Docker container with balena on NVIDIA Jetson, aarch64, armv6hf, or amd64 devices simplifies robotic control, real-time vision-based navigation at the edge, sensor data processing, and real-time decision making. Balena makes it easy to remotely update and upgrade these robotic devices from anywhere in the world.
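As a concrete illustration, a multi-container balena deployment is described in a docker-compose.yml at the root of the project. The sketch below is a minimal, hypothetical ROS 2 setup — the service names and the talker/listener demo nodes are illustrative, and ros:humble is one of the official ROS images on Docker Hub:

```yaml
# docker-compose.yml — minimal sketch of a two-service ROS 2 deployment on balena
# (service names and demo nodes are illustrative, not from a real fleet)
version: "2.1"
services:
  talker:
    image: ros:humble            # official ROS 2 Humble image on Docker Hub
    network_mode: host           # host networking so DDS discovery works across containers
    command: ros2 run demo_nodes_cpp talker
  listener:
    image: ros:humble
    network_mode: host
    command: ros2 run demo_nodes_cpp listener
```

Deploying this with `balena push <fleet>` builds and releases both services to every device in the fleet; in a real robot, the demo nodes would be replaced by your sensor drivers and control nodes.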
2. Video Analytics with DeepStream
The second most used Edge AI Docker container running on balena is NVIDIA’s DeepStream. DeepStream is a streaming analytics toolkit that takes streaming video from cameras or RTSP streams as input and uses computer vision to convert pixels into insights. It allows you to build powerful video analytics applications at the edge on NVIDIA GPU hardware such as the NVIDIA Jetson Xavier or NVIDIA Jetson AGX Orin.
With DeepStream you can build video analytics applications that use object detection, tracking, and action recognition. It is commonly used in applications where multiple video streams need to be analyzed simultaneously with AI at the edge.
How is DeepStream being used with balena?
Using a DeepStream Docker container, you can leverage NVIDIA’s GPU power to process and analyze data on your edge device remotely. This container is ideal for applications such as security surveillance, traffic management and retail analytics, where real-time video inference is important.
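NVIDIA distributes DeepStream containers for Jetson through its NGC registry. A hypothetical single-service sketch for a balena device might look like the following — the image tag and the config path are illustrative, and should be matched to your JetPack/L4T version and your own pipeline configuration:

```yaml
# docker-compose.yml — hypothetical DeepStream deployment on a Jetson device
version: "2.1"
services:
  deepstream:
    image: nvcr.io/nvidia/deepstream-l4t:6.4-samples  # example NGC tag; pick one matching your L4T version
    privileged: true          # simplest way to expose the GPU and /dev/video* devices on balenaOS
    network_mode: host        # receive RTSP streams from cameras on the local network
    command: deepstream-app -c /config/my_pipeline.txt   # path to your pipeline config (illustrative)
    volumes:
      - config:/config        # named volume holding the pipeline configuration
volumes:
  config: {}
```

The pipeline config referenced by deepstream-app defines the sources (cameras or RTSP URIs), the inference models, and the sinks, so the same container image can serve very different surveillance or analytics use cases.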
3. Machine Learning with TensorFlow and PyTorch
PyTorch and TensorFlow are highly popular Docker containers in balena fleets. For real-time AI inference, these frameworks can manage multiple models simultaneously, enabling a broad range of tasks, from multi-class object recognition to facial recognition, to be handled efficiently on a single Jetson device at the edge.
How is Machine Learning being used with balena?
PyTorch and TensorFlow are popular frameworks for AI inference at the edge. Balena makes it easy to push new AI models to your devices, either one at a time or to an entire fleet at once. You can find a sample application for running visual inference with PyTorch on NVIDIA Jetson Orin NX (Seeed Studio J4012) hardware.
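A PyTorch inference service on balena can be as simple as one container wrapping your model behind a small API. The sketch below is hypothetical — the image tag, port, and script name are illustrative (on a Jetson you would use one of NVIDIA’s l4t-pytorch images instead of the generic Docker Hub one):

```yaml
# docker-compose.yml — hypothetical single-container PyTorch inference service
version: "2.1"
services:
  inference:
    image: pytorch/pytorch:2.2.0-cuda12.1-cudnn8-runtime  # example Docker Hub tag; use an l4t-pytorch image on Jetson
    ports:
      - "8080:8080"             # hypothetical HTTP endpoint serving predictions
    volumes:
      - models:/models          # named volume so model weights survive container updates
    command: python /models/serve.py   # your own inference script (illustrative)
volumes:
  models: {}
```

Keeping model weights in a named volume means a new release can ship updated application code without re-downloading multi-gigabyte models to every device in the fleet.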
4. JupyterLab for Interactive Development
Surprisingly, JupyterLab is running in a decent number of fleets on balenaCloud. JupyterLab offers an interactive development environment for creating and running code on any edge device. The JupyterLab container is perfect for those who want to prototype and test their AI models before full deployment. It supports Python, which is widely used for AI development, and allows you to visualize data, modify models, and debug your code in an interactive notebook interface.
How Is JupyterLab Being Used with balena?
By deploying the JupyterLab container on balena devices, developers can accelerate AI model development and testing using JupyterLab notebooks. This setup is ideal for edge AI projects that require rapid prototyping and iterative improvement. Whether you are experimenting with computer vision, Generative AI, or video analytics, JupyterLab’s interactive environment allows you to visualize outputs and debug workflows efficiently.
Integrating JupyterLab or similar notebooks with balena using Docker containers lets developers focus on building and refining AI solutions while the platform handles deployment and operational complexities. This combination makes it easier to transition from development to production in edge AI environments. That said, we don’t see many production fleets running JupyterLab services, as it is better suited to the prototyping phase.
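A minimal JupyterLab service on a balena device could be sketched like this, using the official Jupyter Docker Stacks base image (the empty token is for prototyping on a trusted network only, and the volume name is illustrative):

```yaml
# docker-compose.yml — minimal sketch of a JupyterLab prototyping service
version: "2.1"
services:
  jupyter:
    image: jupyter/base-notebook          # official Jupyter Docker Stacks image
    command: start-notebook.sh --NotebookApp.token=''   # no auth token: prototyping on a trusted network only
    ports:
      - "8888:8888"                       # JupyterLab's default port
    volumes:
      - notebooks:/home/jovyan/work       # persist notebooks across container updates
volumes:
  notebooks: {}
```

With the device on the same network, browsing to port 8888 on the device’s IP opens the notebook interface directly on the edge hardware, so your experiments run against the real sensors and accelerators.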
5. Generative AI and LLMs
Finally, a decent number of Edge AI fleets run Generative AI and Large Language Model (LLM) Docker containers. GenAI and LLMs are transforming how we interact with technology, enabling applications like text and image generation or conversational AI at the edge. NVIDIA Jetson devices, x86 machines, and the Raspberry Pi 5 (8GB RAM) can run optimized frameworks for deploying these models.
How Are Generative AI and LLMs Being Used with balena?
Looking through balenaCloud services, we see various Docker containers tailored for Generative AI and LLM applications. For instance, projects like the balenafied Ollama Open WebUI and LlamaFile for balena demonstrate how these models can be containerized for efficient edge deployment on balena. These containers run LLMs directly on balena devices, enabling real-time text generation and AI-driven decision-making at the edge.
Using balena’s platform, you can manage these applications across fleets of GenAI devices seamlessly. I’m sure that we will see more of this in the future.
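For example, deploying the official Ollama image as a balena service takes only a few lines of compose configuration — a minimal sketch, with the volume name illustrative:

```yaml
# docker-compose.yml — minimal sketch of an Ollama LLM service on a balena device
version: "2.1"
services:
  ollama:
    image: ollama/ollama            # official Ollama image from Docker Hub
    ports:
      - "11434:11434"               # Ollama's default API port
    volumes:
      - ollama:/root/.ollama        # persist downloaded model weights across updates
volumes:
  ollama: {}
```

Once deployed, any client on the network can pull a model and generate text through Ollama’s REST API on port 11434, keeping the prompts and responses entirely on the device.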
Get Started Today
When deploying large-scale Edge AI solutions, selecting the right Docker containers can be a game-changer for performance and scalability. Balena has multiple fleets running popular containers like ROS, ROS2, DeepStream, Ollama, TensorFlow, PyTorch, and TensorRT, among others. For an in-depth discussion on the pros and cons of prebuilt versus custom Docker images, check out our blog post Docker Containers for Edge AI: Should You Use Pre-Built or Custom Images on NVIDIA Jetson Devices?
If you have further questions about device compatibility or customization, feel free to contact us through this form, e-mail us, or ask in our forums.
The post Top 5 Docker containers used in large scale edge AI deployments appeared first on balena Blog.