Model serving using KServe with Kubeflow on AWS

Configure InferenceService to access AWS services from KServe

Configure inference services to access AWS services, such as pulling container images from a private Amazon ECR repository and downloading models from an Amazon S3 bucket.
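As a sketch of what that configuration looks like, the InferenceService below pulls its model from S3 via a Kubernetes service account that carries the required AWS credentials (for example, an IAM role annotation or a linked S3 credentials secret). The resource name, service account name, and bucket path here are illustrative assumptions, not values from this guide:

```yaml
apiVersion: serving.kserve.io/v1beta1
kind: InferenceService
metadata:
  name: sklearn-s3-example            # hypothetical name
spec:
  predictor:
    # Service account assumed to be annotated with an IAM role (IRSA)
    # or linked to an S3 credentials secret granting read access.
    serviceAccountName: kserve-sa     # assumed to exist
    model:
      modelFormat:
        name: sklearn
      storageUri: s3://my-model-bucket/sklearn/model   # hypothetical bucket
```

Because the service account also determines the image pull identity, the same mechanism covers pulling predictor images from a private ECR repository.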

KServe with Kubeflow on AWS

Serve prediction requests using Knative Serving and an AWS Load Balancer
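Once an InferenceService is routable through the load balancer, clients call it with KServe's v1 inference protocol: a POST to `/v1/models/<name>:predict` with a JSON body of the form `{"instances": [...]}`, where Knative routes on the `Host` header. The helper below only builds such a request; the model name, hostname, and input values are illustrative assumptions:

```python
import json

def build_predict_request(model_name, host, instances):
    """Build the path, headers, and body for a KServe v1 predict call.

    The v1 protocol expects POST /v1/models/<name>:predict with a JSON
    body {"instances": [...]}. The Host header carries the Knative
    route's hostname so the load balancer can dispatch the request.
    """
    path = f"/v1/models/{model_name}:predict"
    headers = {
        "Content-Type": "application/json",
        "Host": host,  # hypothetical InferenceService hostname
    }
    body = json.dumps({"instances": instances})
    return path, headers, body

# Example: a single 4-feature input row for a hypothetical model.
path, headers, body = build_predict_request(
    "sklearn-s3-example", "sklearn-s3-example.example.com",
    [[5.1, 3.5, 1.4, 0.2]],
)
print(path)
```

Any HTTP client (curl, `requests`, etc.) can then send the request to the load balancer's address with these values.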

KServe with AWS Deep Learning Containers

Run inference with Kubeflow on AWS using AWS Deep Learning Containers
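One way to use a Deep Learning Container with KServe is to point a custom predictor container at the DLC image, as in the sketch below. The resource name, registry address, and image tag are placeholders standing in for the DLC image URI appropriate to your framework and region:

```yaml
apiVersion: serving.kserve.io/v1beta1
kind: InferenceService
metadata:
  name: pytorch-dlc-example           # hypothetical name
spec:
  predictor:
    containers:
      - name: kserve-container
        # AWS Deep Learning Container image for inference; registry,
        # framework, and tag below are illustrative placeholders.
        image: <aws-dlc-registry>/pytorch-inference:<tag>
```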