@cdklabs/generative-ai-cdk-constructs / bedrock_batch_stepfn / BedrockBatchSfnProps
readonly bedrockBatchInputBucket: IBucket
The S3 bucket from which the Bedrock Batch Inference Job reads the input manifests.
readonly bedrockBatchOutputBucket: IBucket
The S3 bucket where the Bedrock Batch Inference Job stores the output.
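For context, here is a minimal sketch of how these buckets might be supplied. It assumes the consuming construct is BedrockBatchSfn exported from the package root (the import path, bucket IDs, and timeout value are placeholders, not prescribed names):

import { Duration, Stack, StackProps } from 'aws-cdk-lib';
import * as s3 from 'aws-cdk-lib/aws-s3';
import { Construct } from 'constructs';
// Assumed import path; the construct may also be exposed under a module namespace.
import { BedrockBatchSfn } from '@cdklabs/generative-ai-cdk-constructs';

class BatchInferenceStack extends Stack {
  constructor(scope: Construct, id: string, props?: StackProps) {
    super(scope, id, props);

    // Bucket that will hold the input manifest files for the batch job.
    const inputBucket = new s3.Bucket(this, 'BedrockBatchInput');
    // Bucket where the batch job output will be written.
    const outputBucket = new s3.Bucket(this, 'BedrockBatchOutput');

    new BedrockBatchSfn(this, 'BedrockBatchSfn', {
      bedrockBatchInputBucket: inputBucket,
      bedrockBatchOutputBucket: outputBucket,
      timeout: Duration.hours(72), // optional; must be between 24 and 168 hours
    });
  }
}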
readonly optional bedrockBatchPolicy: IManagedPolicy
The IAM managed policy used for Bedrock batch processing.
The policy must have the following permissions for the models and inference profiles you plan to use:
import { Stack } from 'aws-cdk-lib';
import * as iam from 'aws-cdk-lib/aws-iam';

// Allows invoking foundation models and creating model invocation (batch)
// jobs against this account's inference profiles.
const bedrockBatchPolicy = new iam.ManagedPolicy(this, 'BedrockBatchPolicy', {
  statements: [
    new iam.PolicyStatement({
      sid: 'Inference',
      actions: ['bedrock:InvokeModel', 'bedrock:CreateModelInvocationJob'],
      resources: [
        'arn:aws:bedrock:*::foundation-model/*',
        Stack.of(this).formatArn({
          service: 'bedrock',
          resource: 'inference-profile',
          resourceName: '*',
        }),
      ],
    }),
  ],
});
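The policy created above can then be passed as the bedrockBatchPolicy value in the props.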
readonly optional inputPath: string
JSONPath expression to select part of the state to be the input to this state. May also be the special value JsonPath.DISCARD, which will cause the effective input to be the empty object {}.
Input schema:
{
  "job_name": string,           // Required. Name of the batch inference job
  "manifest_keys": string[],    // Required. List of S3 keys of the input manifest files
  "model_id": string            // Required. Model ID to use
}
Default: the entire task input (JSON path '$')
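As an illustration only (the state machine ARN, job name, manifest key, and model ID below are placeholders), an execution whose input matches this schema could be started with the AWS SDK for JavaScript v3:

import { SFNClient, StartExecutionCommand } from '@aws-sdk/client-sfn';

async function startBatchJob(stateMachineArn: string): Promise<void> {
  const client = new SFNClient({});
  // The execution input mirrors the documented input schema.
  await client.send(new StartExecutionCommand({
    stateMachineArn,
    input: JSON.stringify({
      job_name: 'nightly-claude-batch',
      manifest_keys: ['manifests/2024-06-01/part-0001.jsonl'],
      model_id: 'anthropic.claude-3-haiku-20240307-v1:0',
    }),
  }));
}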
readonly optional resultPath: string
JSONPath expression to indicate where to inject the state's output. May also be the special value JsonPath.DISCARD, which will cause the state's input to become its output.
Output schema:
{
  "status": string,        // Required. Status of the batch job. One of: "Completed" or "PartiallyCompleted"
  "bucket": string,        // Required. S3 bucket where the output is stored
  "keys": string[]         // Required. Array of S3 keys for the output files
}
Default: replaces the entire input with the result (JSON path '$')
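A downstream state in the same workflow could then fan out over these output keys, for example with a Map state (a sketch; the state IDs and per-item processing are placeholders):

import * as sfn from 'aws-cdk-lib/aws-stepfunctions';

// Iterate over the S3 keys of the batch job's output files.
const processOutputs = new sfn.Map(this, 'ProcessBatchOutputs', {
  itemsPath: sfn.JsonPath.stringAt('$.keys'),
  resultPath: sfn.JsonPath.DISCARD,
});
processOutputs.itemProcessor(new sfn.Pass(this, 'ProcessOneOutputFile'));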
readonly optional timeout: Duration
The timeout duration for the batch inference job. Must be between 24 hours and 1 week (168 hours).
Default: Duration.hours(48)