# MongoDB::Atlas::StreamProcessor

## Description

Resource for creating and managing [Stream Processors for an Atlas Stream Instance](https://www.mongodb.com/docs/api/doc/atlas-admin-api-v2/operation/operation-createstreamprocessor).

## Requirements

Set up an AWS profile to securely give CloudFormation access to your Atlas credentials.
For instructions on setting up a profile, [see here](/README.md#mongodb-atlas-api-keys-credential-management).
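
The linked instructions describe storing your Atlas API keys in AWS Secrets Manager under a profile-scoped secret name (e.g., `cfn/atlas/profile/default`). As a sketch only, the secret value is a JSON document along these lines; verify the exact secret name and key names against the linked README:

```json
{
  "PublicKey": "<ATLAS_PUBLIC_KEY>",
  "PrivateKey": "<ATLAS_PRIVATE_KEY>"
}
```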

## Attributes and Parameters

See the [resource docs](docs/README.md). Also refer to the [AWS security best practices for CloudFormation](https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/security-best-practices.html#creds) for managing credentials.

## CloudFormation Examples

See the [CFN Templates](/examples/atlas-streams/stream-processor/) for example resources:

- [Basic Stream Processor](/examples/atlas-streams/stream-processor/stream-processor.json)
- [Stream Processor with DLQ](/examples/atlas-streams/stream-processor/stream-processor-with-dlq.json)

## Prerequisites

Before creating a stream processor, you must have the following (the sketch after this list shows how they fit together):

- An existing Atlas Project
- An existing Stream Instance/Workspace (created via the `MongoDB::Atlas::StreamInstance` resource)
- At least one Stream Connection configured (created via the `MongoDB::Atlas::StreamConnection` resource)
  - A source connection (e.g., sample data source, cluster connection, or Kafka connection)
  - A sink connection (must be a cluster connection for merge operations)

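A minimal sketch of how these pieces connect in a template, using the property names referenced in this README (`WorkspaceName`, `ProcessorName`, `Pipeline`, `State`); whether `Pipeline` is passed as a JSON string, as assumed here, should be confirmed against the [resource docs](docs/README.md):

```json
{
  "Resources": {
    "StreamProcessor": {
      "Type": "MongoDB::Atlas::StreamProcessor",
      "Properties": {
        "Profile": "default",
        "ProjectId": "<YOUR_PROJECT_ID>",
        "WorkspaceName": "<YOUR_WORKSPACE_NAME>",
        "ProcessorName": "my-processor",
        "State": "CREATED",
        "Pipeline": "[{\"$source\": {\"connectionName\": \"sample_stream_solar\"}}, {\"$merge\": {\"into\": {\"connectionName\": \"<YOUR_CLUSTER_CONNECTION_NAME>\", \"db\": \"test\", \"coll\": \"output\"}}}]"
      }
    }
  }
}
```
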
## Deployment

### Deploy Basic Stream Processor

```bash
aws cloudformation deploy \
  --template-file examples/atlas-streams/stream-processor/stream-processor.json \
  --stack-name stream-processor-stack \
  --parameter-overrides \
    ProjectId=<YOUR_PROJECT_ID> \
    WorkspaceName=<YOUR_WORKSPACE_NAME> \
    ProcessorName=my-processor \
    SourceConnectionName=sample_stream_solar \
    SinkConnectionName=<YOUR_CLUSTER_CONNECTION_NAME> \
    SinkDatabase=test \
    SinkCollection=output \
    State=CREATED \
  --capabilities CAPABILITY_IAM \
  --region us-east-1
```

### Deploy Stream Processor with DLQ

```bash
aws cloudformation deploy \
  --template-file examples/atlas-streams/stream-processor/stream-processor-with-dlq.json \
  --stack-name stream-processor-dlq-stack \
  --parameter-overrides \
    ProjectId=<YOUR_PROJECT_ID> \
    WorkspaceName=<YOUR_WORKSPACE_NAME> \
    ProcessorName=my-processor-dlq \
    SourceConnectionName=sample_stream_solar \
    SinkConnectionName=<YOUR_CLUSTER_CONNECTION_NAME> \
    SinkDatabase=test \
    SinkCollection=output \
    DlqConnectionName=<YOUR_DLQ_CLUSTER_CONNECTION_NAME> \
    DlqDatabase=dlq \
    DlqCollection=dlq-messages \
    State=CREATED \
  --capabilities CAPABILITY_IAM \
  --region us-east-1
```

## Verification

After deployment, verify that the stream processor was created successfully using both the Atlas CLI and the Atlas UI.

### Atlas CLI Verification

```bash
# List all stream processors for a workspace
atlas streams processors list <WORKSPACE_NAME> --projectId <PROJECT_ID>

# Describe a specific stream processor
atlas streams processors describe <PROCESSOR_NAME> \
  --instance <WORKSPACE_NAME> \
  --projectId <PROJECT_ID>
```

### Expected CLI Output

The `atlas streams processors describe` command should return the following fields (an illustrative output shape follows this list):

- `id`: Unique identifier of the processor (matches the `Id` attribute in CloudFormation)
- `name`: Processor name (matches the `ProcessorName` parameter)
- `state`: Current state (`CREATED`, `STARTED`, `STOPPED`, or `FAILED`)
- `pipeline`: Array of pipeline stages matching your `Pipeline` configuration
- `options`: DLQ configuration, if provided (should match your `Options.Dlq` settings)
- `stats`: Processing statistics (available when the processor is `STARTED`)
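
As an illustrative sketch only, built from the fields listed above (placeholder values; the actual CLI output formatting may differ), the describe output for the basic example would look something like:

```json
{
  "id": "<PROCESSOR_ID>",
  "name": "my-processor",
  "state": "CREATED",
  "pipeline": [
    { "$source": { "connectionName": "sample_stream_solar" } },
    { "$merge": { "into": { "connectionName": "<YOUR_CLUSTER_CONNECTION_NAME>", "db": "test", "coll": "output" } } }
  ]
}
```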

### Verify Pipeline Configuration

The pipeline should match your CloudFormation template:

- The source connection name should match the `SourceConnectionName` parameter
- The merge target connection should match the `SinkConnectionName` parameter
- The database and collection should match the `SinkDatabase` and `SinkCollection` parameters

### Verify DLQ Configuration (if applicable)

For processors with a DLQ (a concrete shape follows this list):

- `options.dlq.connectionName` should match the `DlqConnectionName` parameter
- `options.dlq.db` should match the `DlqDatabase` parameter
- `options.dlq.coll` should match the `DlqCollection` parameter
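
Concretely, for the DLQ deployment example above, that portion of the describe output should look roughly like this (shape inferred from the field paths listed above):

```json
{
  "options": {
    "dlq": {
      "connectionName": "<YOUR_DLQ_CLUSTER_CONNECTION_NAME>",
      "db": "dlq",
      "coll": "dlq-messages"
    }
  }
}
```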

### Atlas UI Verification

1. Navigate to your Atlas project in the [Atlas UI](https://cloud.mongodb.com)
2. Go to the **Stream Processing** section
3. Select your stream workspace/instance
4. Verify the processor appears in the **Processors** tab with:
   - **Name**: Matches the `ProcessorName` from your CloudFormation template
   - **State**: Matches the `State` parameter (`CREATED`, `STARTED`, or `STOPPED`)
   - **Pipeline**: Click on the processor to view pipeline stages and verify:
     - Source connection matches your `SourceConnectionName` parameter
     - Merge target connection matches your `SinkConnectionName` parameter
     - Target database and collection match your `SinkDatabase` and `SinkCollection` parameters
5. For processors with a DLQ:
   - Verify the DLQ configuration is displayed in the processor details
   - Check that the DLQ connection, database, and collection match your parameters
6. If the processor is in the `STARTED` state:
   - Verify processing statistics are available
   - Check that messages are being processed (stats show input/output message counts; see the sketch after this list)
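
As a rough sketch of what those statistics convey (the field names below are hypothetical placeholders, not a documented schema; consult the Atlas Stream Processing documentation for the actual stats fields):

```json
{
  "stats": {
    "inputMessageCount": 1024,
    "outputMessageCount": 1020,
    "dlqMessageCount": 4
  }
}
```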

## Notes

- **AWS Only**: This resource is deployed and managed through AWS CloudFormation, so AWS is effectively the only supported provider.
- **WorkspaceName**: This field is the same as the `InstanceName` field used in other stream resources.
- **State Management**: When creating a processor, specify `State: STARTED` to start processing automatically, or `State: CREATED` to create it in a stopped state.
- **Long-Running Operations**: Creating and starting stream processors can take several minutes. The resource uses callback-based state management to handle these operations asynchronously.
- **Timeout Configuration**: Use `Timeouts.Create` to configure how long to wait for processor creation/startup (default: 20 minutes); a sketch follows below.
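
As a sketch of that override (both the placement of `Timeouts` under `Properties` and the duration format are assumptions here; confirm both against the [resource docs](docs/README.md)):

```json
{
  "Type": "MongoDB::Atlas::StreamProcessor",
  "Properties": {
    "ProjectId": "<YOUR_PROJECT_ID>",
    "WorkspaceName": "<YOUR_WORKSPACE_NAME>",
    "ProcessorName": "my-processor",
    "Timeouts": {
      "Create": "30m"
    }
  }
}
```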