
AI & Data Science

Training & Inference on Cloud and Edge

Role in the Project

Model training runs on cloud infrastructure, while inference runs both in the cloud and on edge devices to keep latency low.

Strengths & Weaknesses

Strengths:

  • Cloud training allows for scalability and distributed computing.
  • Edge inference reduces latency and bandwidth costs.

Weaknesses:

  • Cloud training requires significant computing resources.
  • Edge devices have limited compute and memory, so models must be optimized (e.g., converted to FP16 or INT8) to run efficiently; see the export sketch after this list.
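
As a concrete illustration of that optimization path, the trained PyTorch model is typically exported to an interchange format such as ONNX before an edge runtime can optimize it. The sketch below is illustrative only: the network, input shape, and file name are placeholders rather than the project's actual artifacts.

```python
import torch

# Placeholder network standing in for the project's trained model.
model = torch.nn.Sequential(
    torch.nn.Linear(128, 64),
    torch.nn.ReLU(),
    torch.nn.Linear(64, 10),
).eval()

dummy_input = torch.randn(1, 128)  # assumed input shape

# Export to ONNX so an edge runtime (e.g. TensorRT) can consume the model.
torch.onnx.export(
    model,
    dummy_input,
    "model.onnx",
    input_names=["input"],
    output_names=["logits"],
    dynamic_axes={"input": {0: "batch"}},  # allow variable batch size
    opset_version=17,
)
```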

Available Technologies & Comparison

  • Cloud Training: AWS SageMaker vs. Google Vertex AI vs. On-Prem Kubernetes (Kubeflow).
  • Edge Inference: TensorRT (chosen) vs. OpenVINO (optimized for Intel hardware) vs. TFLite (lightweight, lower accuracy); an engine-build sketch follows this list.
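
Because TensorRT is the chosen edge runtime, an exported ONNX model would then be compiled into a serialized engine. The sketch below assumes the TensorRT 8.x Python API and placeholder file names; the bundled trtexec tool (e.g. `trtexec --onnx=model.onnx --saveEngine=model.plan --fp16`) achieves the same result from the command line.

```python
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

def build_engine(onnx_path: str, plan_path: str) -> None:
    builder = trt.Builder(TRT_LOGGER)
    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH)
    )
    parser = trt.OnnxParser(network, TRT_LOGGER)

    # Parse the ONNX file produced by the export step.
    with open(onnx_path, "rb") as f:
        if not parser.parse(f.read()):
            for i in range(parser.num_errors):
                print(parser.get_error(i))
            raise RuntimeError("failed to parse the ONNX model")

    config = builder.create_builder_config()
    config.set_flag(trt.BuilderFlag.FP16)  # FP16 cuts latency on most NVIDIA edge GPUs

    # Build and save the serialized engine ("plan") for deployment.
    plan = builder.build_serialized_network(network, config)
    with open(plan_path, "wb") as f:
        f.write(plan)

build_engine("model.onnx", "model.plan")
```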

Chosen Approach

  • Cloud-based PyTorch training with distributed computing (see the training sketch below).
  • Edge inference using TensorRT for NVIDIA devices.
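
On the cloud side, distributed PyTorch training is commonly structured around DistributedDataParallel and launched with torchrun. The sketch below uses a placeholder model and synthetic data purely to show the structure; it is not the project's actual training script.

```python
import os

import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP
from torch.utils.data import DataLoader, TensorDataset
from torch.utils.data.distributed import DistributedSampler

def main():
    # torchrun sets RANK, LOCAL_RANK and WORLD_SIZE for every worker process.
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    # Placeholder model and synthetic data standing in for the real ones.
    model = torch.nn.Linear(128, 10).cuda(local_rank)
    model = DDP(model, device_ids=[local_rank])

    dataset = TensorDataset(torch.randn(1024, 128), torch.randint(0, 10, (1024,)))
    sampler = DistributedSampler(dataset)  # shards the data across workers
    loader = DataLoader(dataset, batch_size=32, sampler=sampler)

    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
    loss_fn = torch.nn.CrossEntropyLoss()

    for epoch in range(2):
        sampler.set_epoch(epoch)  # reshuffle shards each epoch
        for x, y in loader:
            x, y = x.cuda(local_rank), y.cuda(local_rank)
            optimizer.zero_grad()
            loss = loss_fn(model(x), y)
            loss.backward()  # gradients are all-reduced across workers here
            optimizer.step()

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```

A run with four GPUs per node would be launched, for example, with `torchrun --nproc_per_node=4 train.py`.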

⚠️ All information provided here is in draft status and therefore subject to updates.

Consider it a work in progress, not the final word—things may evolve, shift, or completely change.

Stay tuned! 🚀