
I have an SQS queue that receives a message containing the name of a file that has just been created in a target bucket. The process that sends the message is:

  1. A CSV file is uploaded into target_bucket.
  2. A message is sent to an SNS topic.
  3. The SNS topic triggers a Lambda function, and this Lambda function posts a message to an SQS queue that includes the name of the file that was just created (a sketch of such a function is shown after this list).
  4. To check whether messages are arriving in my queue, I do a simple poll from the console.
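For reference, step 3 might look roughly like the sketch below (Python/boto3). The queue URL and the group/project/version/environment/job names are placeholders, and the message body follows the JSON shape Matillion's SQS listener expects (group, project, version, environment, job, plus a variables map), so everything would need to be adjusted to the real setup:

```python
import json
import boto3

sqs = boto3.client("sqs")

# Placeholder URL of the queue that Matillion listens on
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/matillion-queue"

def lambda_handler(event, context):
    # The SNS notification delivers the original S3 event as a JSON string
    s3_event = json.loads(event["Records"][0]["Sns"]["Message"])
    object_key = s3_event["Records"][0]["s3"]["object"]["key"]

    # Tell Matillion which job to run and pass the file name as a variable.
    # Group/project/environment/job names below are placeholders.
    body = {
        "group": "MyGroup",
        "project": "MyProject",
        "version": "default",
        "environment": "MyEnvironment",
        "job": "load_new_file",
        "variables": {"file_to_load": object_key},
    }
    sqs.send_message(QueueUrl=QUEUE_URL, MessageBody=json.dumps(body))
```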

I know all the components are working because, by polling from the AWS web console, I can see the messages arrive. This is an example:

[screenshot: an example message returned by the console poll]
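The same check can also be scripted instead of done from the console; a minimal boto3 sketch (the queue URL is a placeholder) would be:

```python
import boto3

sqs = boto3.client("sqs")
queue_url = "https://sqs.us-east-1.amazonaws.com/123456789012/matillion-queue"  # placeholder

# Long-poll for up to 10 seconds and print any waiting message bodies.
# receive_message does not delete messages; they become visible again
# once the visibility timeout expires.
resp = sqs.receive_message(QueueUrl=queue_url, MaxNumberOfMessages=10, WaitTimeSeconds=10)
for msg in resp.get("Messages", []):
    print(msg["Body"])
```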

However, the intention is to connect this SQS queue to Matillion so that every time a new file is uploaded into my target_bucket, a job is executed. This job should read the data from the new file and load it into a Snowflake table.

I have connected my SQS queue to my Matillion project, but every time I load a new file into my target_bucket nothing happens. Here are the project configurations used for SQS:

I know Matillion can reach my queue because, as you can see from the final cell, I get a success message when testing the connection.

[screenshot: Matillion SQS configuration with a successful connection test]

Also, I added an environment variable (from Project > Manage Environment Variables) called file_to_load: [screenshot of the variable definition]

And finally, in the S3 Load component of my job, I also added file_to_load in the Pattern section: [screenshot of the S3 Load component properties]
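(A note on the variable reference: Matillion substitutes variables into component text properties with the ${...} syntax, so, assuming the screenshot shows it this way, the Pattern field would contain ${file_to_load} and expand to whatever path arrives in the SQS message.)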

  • Do you get any relevant information from your catalina.out (a.k.a. the Server Log)? SQS messages that can't be parsed are silently consumed, with an error written to the logfile, although the message body in your screenshot does look OK to me. Secondly, try turning on the Failure Queue: if something like the project name is misspelled, you will get a failure message on that queue explaining the reason. Commented Nov 25, 2022 at 16:07

1 Answer


Found the issue!

In the S3 Load component there is a parameter called S3 Object Prefix. Since I am providing the full path of the file within the bucket in the Pattern, the S3 Object Prefix should contain only the S3 URI of the bucket, for example:

  • S3 Object Prefix: s3://{bucket_name}/
  • Pattern: file/path/name.csv (received in the file_to_load variable from SQS)
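The component effectively combines the two values, so with the example above it ends up reading s3://{bucket_name}/file/path/name.csv.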

