kubwa Inference: AI Inference Acceleration System
- Image / Video / Text
- Deep Learning Model (Python)
- Software: Model Converting & Optimization
- Hardware: Inference Acceleration
- kubwa Inference System (H/W + S/W)
Train
Deep Learning
• TensorFlow
• PyTorch
• Caffe
Optimization
Model Optimizer
• Converting
• Optimizing
• Preparing inference (see the conversion sketch below)
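A minimal sketch of the Converting step, assuming a trained PyTorch model is exported to ONNX as the intermediate optimized format; the kubwa Model Optimizer API is not shown in this document, so the export call, input shape, and file name below are illustrative stand-ins only.

# Illustrative stand-in for the Model Optimizer's converting step (not the kubwa API).
import torch
import torchvision

model = torchvision.models.resnet18(weights=None)   # a trained model would be loaded here
model.eval()

dummy_input = torch.randn(1, 3, 224, 224)           # example input shape used for tracing
torch.onnx.export(
    model,
    dummy_input,
    "model_optimized.onnx",                          # hypothetical output file name
    input_names=["input"],
    output_names=["output"],
    dynamic_axes={"input": {0: "batch"}},            # allow a variable batch size
)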
Inference
Inference Engine
Runs inference of the optimized model according to the user application (see the sketch below)
User Application → Inference Engine
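A minimal sketch of the inference step from a user application, assuming ONNX Runtime as a generic stand-in for the kubwa Inference Engine; the model path and input shape are hypothetical and carried over from the conversion sketch above.

# Illustrative stand-in for a user application calling the inference engine.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession("model_optimized.onnx")      # hypothetical optimized model
input_name = session.get_inputs()[0].name

frame = np.random.rand(1, 3, 224, 224).astype(np.float32)   # stand-in for image/video/text input
outputs = session.run(None, {input_name: frame})
print(outputs[0].shape)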
Heterogeneous
Multiple Acceleration
Accelerates inference engines on FPGA, GPU, and VPU systems (see the sketch below)
VPU / FPGA / GPU
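A minimal sketch of heterogeneous acceleration, again assuming ONNX Runtime: the session is built from a preference list of execution providers (GPU first, then an Intel VPU-capable provider, then CPU fallback). The provider names are assumptions about the deployment environment, not the kubwa system's actual backend selection API; FPGA backends would need a vendor-specific provider not shown here.

# Illustrative only: pick an accelerator backend by preference, with CPU fallback.
import onnxruntime as ort

preferred = [
    "CUDAExecutionProvider",      # NVIDIA GPU
    "OpenVINOExecutionProvider",  # Intel CPU/GPU/VPU targets, if the build includes it
    "CPUExecutionProvider",       # always-available fallback
]
available = ort.get_available_providers()
providers = [p for p in preferred if p in available]

session = ort.InferenceSession("model_optimized.onnx", providers=providers)
print("Running on:", session.get_providers())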
AI Inference Acceleration Applications
- Data Center
- Video/Image Processing
- DB & Data Analysis
- Computing Storage
- HPC Acceleration
- High-freq. Trading
- Telecommunications
- 5G Equipment
- Tele-data Acceleration
- Cyber Security
- Edge Computing
- Healthcare
- ML & AI Inference
- OS / RTOS
- Hypervisor / Container
- Medical Database
- Industry
- Security / Firewall
- Machine Vision
- IoT Solution Stack
- Embedded Vision
- Inspection Equip.
- AI Inference
- Deep Learning
- Auto Driving
kubwa AIoT Cloud Orchestration
• Provides an orchestration platform that builds data pipelines for ML & DL models for each application
• Implements kubwa MLOps in on-premise and cloud environments based on Docker and Kubernetes containers (see the deployment sketch below)
• Achieves the goals of openness, convergence, and flow with kubwa AIoT
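A minimal sketch of the container-based deployment idea, assuming the official Kubernetes Python client; the image name, labels, replica count, and namespace below are hypothetical illustrations, not the kubwa MLOps API.

# Illustrative only: deploy a containerized inference service with the Kubernetes Python client.
from kubernetes import client, config

config.load_kube_config()   # or config.load_incluster_config() when running inside a cluster

container = client.V1Container(
    name="inference-engine",
    image="registry.example.com/kubwa/inference:latest",   # hypothetical image
    ports=[client.V1ContainerPort(container_port=8080)],
)
template = client.V1PodTemplateSpec(
    metadata=client.V1ObjectMeta(labels={"app": "inference-engine"}),
    spec=client.V1PodSpec(containers=[container]),
)
spec = client.V1DeploymentSpec(
    replicas=2,
    selector=client.V1LabelSelector(match_labels={"app": "inference-engine"}),
    template=template,
)
deployment = client.V1Deployment(
    metadata=client.V1ObjectMeta(name="inference-engine"),
    spec=spec,
)

client.AppsV1Api().create_namespaced_deployment(namespace="default", body=deployment)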
