Customizing a previously shared Action: changes not taking effect

Still getting up to speed on terminology, so if I misspeak when describing a component of Pipedream, please steer me in the right direction.

I believe I am looking at a previously shared custom action someone (dylburger) shared over 2 years ago, when I visit this URL:

https://pipedream.com/@dylburger/example-save-uploaded-file-to-amazon-s3-p_o7Cm9z/edit

I would like to:

  • make a copy of this Action
  • make 1 change, and
  • Save/use it in a workflow.
  1. I click “Copy” in the upper right-hand corner

  2. Next, I click into the “Code” window, at which point I get an alert that says: “CUSTOMIZE THIS ACTION? Changes you make will only apply to this Workflow” - I click “Yes”

  3. Then, I make 1 change to the code, here:

const urlResponse = await require("axios").get(fileUrl, { headers: {'Authorization': 'Bearer ' + bearerToken}, responseType: 'stream' })

  4. Next, in order to get bearerToken into the code block, I hover over the “Params” row to reveal the “edit params schema” link, and click it.

  5. Next, I scroll down to the schema code block and edit it to:

{
  "type": "object",
  "properties": {
    "fileUrl": {
      "type": "string"
    },
    "s3Bucket": {
      "type": "string"
    },
    "s3Key": {
      "type": "string"
    },
    "bearerToken": {
      "type": "string"
    }
  },
  "required": [
    "fileUrl",
    "s3Bucket",
    "s3Key",
    "bearerToken"
  ]
}

As expected, this adds a new field above the code block. (I can’t embed more than one image in this forum post, so I can’t show it here.)

  6. I then click the “edit params schema” link again to go back, and the form now shows bearerToken as a param that must be filled out.

  7. I enter the value and click Save…

The screen refreshes after the save completes… and the new param dialog disappears (again, I can’t show you a screenshot; the forum only allows me to post one image).

What am I doing wrong? If I’m not doing anything wrong and it is a known issue, is there a workaround?
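For context, the params schema above amounts to a required-field check. Here is a minimal sketch of that rule in plain Node (the `required` list is copied from the schema; exactly how the editor enforces it is my assumption):

```javascript
// The `required` list from the params schema above.
const required = ["fileUrl", "s3Bucket", "s3Key", "bearerToken"];

// Sketch: return the names of any required params that are missing,
// mimicking (by assumption) what the editor validates before running.
function missingParams(params) {
  return required.filter((name) => !(name in params));
}
```

With only `fileUrl` supplied, the other three names come back as missing, which is why the editor surfaces bearerToken as a field that must be filled out.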

Hi @shawn-wm , that example uses v1 of our workflow editor (we recently shipped the new version, which you’ve likely been working with). Try this instead:

  1. Create a new workflow
  2. Add a step to your workflow, search for “AWS”, then select the Use any AWS API action
  3. In the Node.js code step that appears, paste the following code:
import AWS from "aws-sdk"
import axios from 'axios'

export default defineComponent({
  props: {
    aws: {
      type: "app",
      app: "aws",
    },
    fileUrl: {
      type: "string",
      label: "File URL"
    },
    s3Bucket: {
      type: "string",
      label: "S3 Bucket Name"
    },
    s3Key: {
      type: "string",
      label: "S3 Key",
      description: "Key to upload the URL content to"
    },
  },
  async run({ steps, $ }) {
    const { fileUrl, s3Bucket, s3Key } = this
    const { accessKeyId, secretAccessKey } = this.aws.$auth
    const s3 = new AWS.S3({ accessKeyId, secretAccessKey })
    const urlResponse = await axios.get(fileUrl, { responseType: 'stream' })
    const s3Response = await s3.upload({
      Bucket: s3Bucket,
      Key: s3Key,
      ContentType: urlResponse.headers['content-type'],
      ContentLength: urlResponse.headers['content-length'],
      Body: urlResponse.data,
    }).promise()
    return (await s3Response).Location
  },
})
  4. Click the Refresh fields button above the code, and you should be prompted to enter your URL, bucket, and key. Connect an AWS account that has PutObject permissions on the target bucket.

Let me know if that works!

I jumped ahead and made my own Action with the CLI. It’s very close to yours. Here’s what I ended up pushing with `pd publish`:

import axios from "axios";
import AWS from "aws-sdk";

export default {
    name: "Secure URL to S3",
    description: "Fetches a secure URL and pipes into an S3 bucket",
    key: "secure_url_to_s3",
    version: "0.0.5",
    type: "action",
    props: {
        aws: {
            type: "app",
            app: "aws",
        },
        fileUrl: {
            type: "string",
            label: "File URL"
        },
        s3Bucket: {
            type: "string",
            label: "S3 Bucket"
        },
        s3Key: {
            type: "string",
            label: "S3 Key"
        },
        bearerToken: {
            type: "string",
            label: "Bearer Token"
        },        
    },
    async run({ steps, $ }) {
        const { accessKeyId, secretAccessKey } = this.aws.$auth
        const s3 = new AWS.S3({ accessKeyId, secretAccessKey });
        const response = await axios.get(this.fileUrl, {
            headers: { Authorization: `Bearer ${this.bearerToken}` },
            responseType: 'stream',
        });

        const s3Response = await s3.upload({
            Bucket: this.s3Bucket,
            Key: this.s3Key,
            ContentType: response.headers['content-type'],
            ContentLength: response.headers['content-length'],
            Body: response.data,
        }).promise();

        return s3Response.Location;
    },
}

How’s this in contrast?

It didn’t seem to like `return (await s3Response).Location`, stating that await has no effect on this type of expression. Thoughts on that?
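On that `await` warning: `s3Response` was already awaited when it was assigned, so by that point it holds a plain object, and awaiting it a second time is a no-op (hence the editor’s hint). A small stand-alone illustration (the object literal is just a stand-in for the S3 response):

```javascript
// `await s3.upload(...).promise()` resolves the promise at assignment time.
// Awaiting the resolved value again returns the same value unchanged:
async function demo() {
  const resolved = await Promise.resolve({ Location: "https://example.com/key" });
  const same = await resolved; // no-op: `resolved` is not a promise
  return same === resolved;
}
```

So `return s3Response.Location` is equivalent and avoids the warning.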

Hi @dylburger ,

I copied the code you posted in July '22 and set it up in my workflow following your steps. One thing I noticed: for the S3 bucket name, I can only type in the bucket name instead of picking from a drop-down list of the existing buckets in my S3 account. In the pre-made step for streaming a file to S3, I was able to pick from the drop-down list.

Anyhow, the issue I am facing is this error message. I have no idea what it means exactly.

How can I fix this error message so I can actually stream the file to my S3 bucket?

Thanks in advance

This seems to be related to the URL structure of the file I am uploading to S3. I removed the ContentType param from the code and it worked.
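If the root cause is a missing or malformed content-type header from the source URL, an alternative to dropping `ContentType` entirely is to set it only when the header is actually present. A hedged sketch (the header and S3 param names follow the code above; the guard itself is my suggestion, not something from the original action):

```javascript
// Build the s3.upload() params, including ContentType/ContentLength
// only when the source response actually supplied those headers.
function buildUploadParams(bucket, key, headers, body) {
  const params = { Bucket: bucket, Key: key, Body: body };
  if (headers["content-type"]) params.ContentType = headers["content-type"];
  if (headers["content-length"]) params.ContentLength = headers["content-length"];
  return params;
}
```

That way objects that do report a content type keep it in S3, while URLs with odd responses still upload cleanly.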