How do I modify the upload content before the upload?

LiveView’s upload is simple enough to use. However, I now have a need to modify the file content before the upload. Specifically, I want to scale down the image on the client side before sending a potentially very large file over the network. Is there any JavaScript hook that enables me to do that?


Did you find a way to do it?

You can achieve that with custom client-side JS; before submitting, you have the files available for manipulation.
Useful links: LiveView Upload docs

Sorry, I am not able to figure that out. How do I do it? By using hooks?
Where are the files stored? I would be very grateful if you could share a code sample.
Thanks.

I’m not sure what your use case is, but you can send events back and forth to perform some actions.
Here you can find example code for uploading files. You can put some logic in validate to send the client some data or whatever you need, then handle the push event on the client and perform the action on the images or something else.

Thank you for the input.
I just want to downsize large images.
I can send events and receive them on the client.
But I am not sure where the files that I need to modify are located.

I am thinking that for external uploaders, LiveView provides an Uploader with entries. That might be a place to modify the files.

I was just sharing with someone about this yesterday.

In short, yes, everything is done client-side via hooks.

In one example, I load an image with a wasm lib via a canvas on the DOM, which is used to resize.
In the other, I upload the file to a bucket, call a third-party scaling API with the bucket path, download the scaled image back to the client, and re-upload it with a new signed URL.

Edit: I should add that it’s easy to overload the browser in the canvas/wasm workflow.

On a phone, loading several 24 MB images into a canvas can crash a mobile browser really fast, and given the size of raw phone images now, it was a common issue I saw. The better of the two approaches I found was to use a signed URL to throw the image at a bucket, then trigger a worker to resize the image asynchronously and fetch it later, and/or move it from a temporary bucket to a permanent place. Uploading large files to a bucket is a non-issue, and it was better to have a backup of the source upload in my case too. I also found MinIO for self-hosting buckets, which made the object-store approach a lot more effective for me. That said, I would love it if there were a really good fully client-side solution to this.


Yes, I scale the image using client-side JS and store the result in a blob. Then I chunk the blob, base64-encode the chunks, and send them in-band with pushEvent.
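A minimal sketch of that chunk-and-base64 step, for anyone trying the same approach. The 64 KiB chunk size and the `"upload_chunk"` event name are my own assumptions, not the poster’s actual values:

```javascript
// Slice a Blob into fixed-size chunks and base64-encode each chunk,
// so they can be sent in-band over the LiveView socket.
const CHUNK_SIZE = 64 * 1024; // assumed size; tune to your payload limits

async function blobToBase64Chunks(blob) {
  const chunks = [];
  for (let offset = 0; offset < blob.size; offset += CHUNK_SIZE) {
    const slice = blob.slice(offset, offset + CHUNK_SIZE);
    const bytes = new Uint8Array(await slice.arrayBuffer());
    // btoa expects a binary string, so map bytes to char codes first.
    let binary = "";
    bytes.forEach((b) => { binary += String.fromCharCode(b); });
    chunks.push(btoa(binary));
  }
  return chunks;
}

// Inside a hook, each chunk could then be pushed to the server, e.g.:
// const chunks = await blobToBase64Chunks(resizedBlob);
// chunks.forEach((data, i) => {
//   this.pushEvent("upload_chunk", { index: i, total: chunks.length, data });
// });
```

On the server side you would reassemble and Base64-decode the chunks in the matching handle_event clause.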

See here for the js part:


To be clear, you have to do all these extra steps because you want to modify the file client-side and then upload it to your webserver and not to cloud storage, correct?

I wonder if LiveView should support a client-side callback for server uploads :thinking:


Yes. My way does not make the best use of the bandwidth, and it streams binary data from browser memory to server memory, so it won’t work for large files. On the other hand, it is fairly simple and there is no cleanup needed should the user navigate away.

The only thing I wish for from LiveView is support for binary messages without resorting to base64. I could use a raw socket, but then I would need to deal with coordination between two processes.

That would be very welcome. Especially for common use cases like resizing images before uploading (especially on mobile devices).

Are you using the default S3Uploader?

I was asking about this yesterday or the day before and am currently at this point:

def presign_upload_banner(entry, socket) do
  uploads = socket.assigns.uploads
  key = s3_key(entry)

  config = %{
    region: "eu-west-2",
    access_key_id: System.fetch_env!("AWS_ACCESS_KEY_ID"),
    secret_access_key: System.fetch_env!("AWS_SECRET_ACCESS_KEY")
  }

  {:ok, fields} =
    SimpleS3Upload.sign_form_upload(config, @bucket,
      key: key,
      content_type: entry.client_type,
      max_file_size: uploads[entry.upload_config].max_file_size,
      expires_in: :timer.hours(1)
    )

  meta = %{uploader: "S3Banner", key: key, url: s3_host(), fields: fields, canvas_height: 360}
  {:ok, meta, socket}
end

360 is a placeholder for a variable value. You can add values to the meta map and access them in the uploader JS using entry.meta.YOURVALUE

Uploaders.S3Banner = function(entries, onViewError) {
  entries.forEach(entry => {
    let formData = new FormData();
    let {url, fields} = entry.meta;
    Object.entries(fields).forEach(([key, val]) => formData.append(key, val));

    let img = new Image();
    img.onload = function() {
 
      const sliceHeight = entry.meta.canvas_height;
      let canvas = document.createElement('canvas');
      canvas.width = screen.width;
      canvas.height = sliceHeight;

      let ctx = canvas.getContext('2d');
      const cropStart = (img.height - sliceHeight) / 2;
      const cropEnd = cropStart + sliceHeight;

      ctx.drawImage(img, 0, cropStart, img.width, cropEnd - cropStart, 0, 0, canvas.width, canvas.height);

      // convert the canvas content to a Blob object and add it to the form data
      canvas.toBlob(function(blob) {
        formData.append("file", blob);

        // send the form data via XHR
        let xhr = new XMLHttpRequest();
        onViewError(() => xhr.abort());
        xhr.onload = () => xhr.status === 204 ? entry.progress(100) : entry.error();
        xhr.onerror = () => entry.error();
        xhr.upload.addEventListener("progress", (event) => {
          if(event.lengthComputable){
            let percent = Math.round((event.loaded / event.total) * 100);
            if(percent < 100){ entry.progress(percent); }
          }
        });

        xhr.open("POST", url, true);
        xhr.send(formData);
      });
    };
    img.src = URL.createObjectURL(entry.file);
  });
}

The above will take the uploaded image and create a screen width banner that is 360px in height. It will crop the image above and below to leave a “slice” of the original image.

You can obviously just change the JS to change the dimensions and resize as well though.
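The crop arithmetic from the snippet can also be pulled out into a pure helper, which makes the offsets easy to test. The function name is my own, and the Math.max/Math.min guards for images shorter than the slice are an addition the original snippet does not have:

```javascript
// Compute the source-rectangle y-offset and height for a centered
// horizontal "slice" crop, as used in the banner uploader above.
function centerSliceCrop(imageHeight, sliceHeight) {
  const cropStart = Math.max(0, (imageHeight - sliceHeight) / 2);
  const cropHeight = Math.min(imageHeight, sliceHeight);
  return { cropStart, cropHeight };
}

// e.g. a 1080px-tall photo cropped to a 360px banner slice:
// centerSliceCrop(1080, 360) → { cropStart: 360, cropHeight: 360 }
```

The returned values would feed the second and fourth arguments of the nine-argument drawImage call.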

What I haven’t figured out is how to validate the uploaded image size. I can add an if statement to check the dimensions and make the above return an error, but I have no idea how to send an event to the server yet, as I am still learning.


I am trying to upload to Digital Ocean Spaces which is similar to S3.
Will post the code once I am done.

One option is to use a hidden <.live_file_input /> and then use this.upload() on a client-side hook to enqueue the resized file. Here is a brief example:

In your HEEx template:

<!-- Users select files with this file input. -->
<input type="file" id="resize-target" phx-hook="ResizeInput" data-upload-target="images" />

<form phx-change="change" phx-submit="submit">
  <!-- The live file input is hidden because users will not interact with it directly. -->
  <.live_file_input upload={@uploads.images} style="display:none;" />
  <button type="submit">Upload</button>
</form>

…and the ResizeInput hook:

// assets/js/resize_input.js
import { convertResizeImageFiles } from './image_utils'

// This hook attaches to a custom input element to resize selected images.
// This hook attaches to a custom input element to resize selected images.
const ResizeInput = {
  getUploadTarget() { return this.el.dataset.uploadTarget },
  mounted() {
    this.el.addEventListener("change", (e) => {
      e.preventDefault()
      e.stopImmediatePropagation()

      if (this.el.files && this.el.files.length > 0) {
        convertResizeImageFiles(this.el.files, 300, (resizedImageBlob) => {
          // Enqueues the resized blob on the <.live_file_input />.
          this.upload(this.getUploadTarget(), [resizedImageBlob])
        })
      }
    })
  }
}

export default ResizeInput

I put the visible file input outside of the live form so that its change events don’t trigger additional messages to the server.

You can find a working demo in my live_upload_example repo: Add client-side resize demo · mcrumm/live_upload_example@5c797aa · GitHub

Hope that helps!

PS: Please forgive my JavaScript. The convertResizeImageFiles callback should really receive a list of files; that way, the upload function would only need to be invoked once per batch of selected files. However, I am currently on vacation, so this is the best I could do on short notice :smiley:


Here is the code that I am using now to upload to Digital Ocean Spaces

defp presign_upload(entry, socket) do
  filename = "#{entry.uuid}_#{entry.client_name}"

  # user_id = socket.assigns.user_id
  config = ExAws.Config.new(:s3)
  query_params = [{"x-amz-acl", "public-read"}]

  {:ok, url} =
    ExAws.S3.presigned_url(
      config,
      :put,
      "documents",
      "#{socket.assigns.current_student.username}/#{filename}",
      expires_in: 300,
      query_params: query_params
    )

  meta = %{uploader: "Document", url: url}
  {:ok, meta, socket}
end

Now the JS code in the app.js file

Uploaders.Document = function(entries, onViewError) {
  entries.forEach(entry => {
    let file = entry.file
    let url = entry.meta.url
    if (file.type.indexOf("image") >= 0) {
      var reader = new FileReader();
      reader.onload = function(readerEvent) {
        var image = new Image();
        image.onload = function(imageEvent) {
          var canvas = document.createElement('canvas'),
            max_width = 1200,// TODO : pull max size from a site config
            max_height = 800,// TODO : pull max size from a site config
            width = image.width,
            height = image.height;
          if (width > max_width) {
            height *= max_width / width;
            width = max_width;
          }
          if (height > max_height) {
            width *= max_height / height;
            height = max_height;
          }
          canvas.width = width;
          canvas.height = height;
          canvas.getContext('2d').drawImage(image, 0, 0, width, height);
          var dataUrl = canvas.toDataURL(file.type);
          let blob = dataURItoBlob(dataUrl);
          var xhr = new XMLHttpRequest()
          onViewError(() => xhr.abort())
          xhr.upload.addEventListener("loadstart", function() {
            // console.log("Load started")
          });

          xhr.upload.addEventListener("progress", function(e) {
            if (e.lengthComputable) {
              let percent = Math.round((e.loaded / e.total) * 100)
              entry.progress(percent)
            }
          })

          xhr.addEventListener('readystatechange', function(e) {
            if (xhr.readyState == 4 && xhr.status != 200) {
              entry.error("Upload Failed")
            }
          })
          xhr.open('PUT', url, true)
          xhr.setRequestHeader("x-amz-acl", "public-read")
          xhr.setRequestHeader("Content-Type", blob.type)
          xhr.send(blob)
        }
        image.src = readerEvent.target.result;
      }
      reader.readAsDataURL(file);
    } else {
      var xhr = new XMLHttpRequest()
      onViewError(() => xhr.abort())
      xhr.onload = () => xhr.status === 204 ? entry.progress(100) : entry.error()
      xhr.upload.addEventListener("loadstart", function() {
        console.log("Load started")
      });

      xhr.upload.addEventListener("progress", function(e) {
        if (e.lengthComputable) {
          let percent = Math.round((e.loaded / e.total) * 100)
          entry.progress(percent)
        }
      })

      xhr.addEventListener('readystatechange', function(e) {
        if (xhr.readyState == 4 && xhr.status != 200) {
          entry.error("Upload Failed")
        }
      })
      xhr.open('PUT', url, true)
      xhr.setRequestHeader("x-amz-acl", "public-read")
      xhr.setRequestHeader("Content-Type", file.type)
      xhr.send(file)
    }
  })
}
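The bounds arithmetic in the image branch above can be factored into a pure helper so the math is testable in isolation. The name fitWithin is my own, and the rounding to whole pixels for the canvas dimensions is an addition the original code leaves implicit:

```javascript
// Scale (width, height) down proportionally so it fits within
// (maxWidth, maxHeight), never scaling up. Mirrors the two if-blocks
// in the Document uploader's image branch.
function fitWithin(width, height, maxWidth, maxHeight) {
  if (width > maxWidth) {
    height *= maxWidth / width;
    width = maxWidth;
  }
  if (height > maxHeight) {
    width *= maxHeight / height;
    height = maxHeight;
  }
  return { width: Math.round(width), height: Math.round(height) };
}

// e.g. a 4000x3000 photo constrained to 1200x800:
// fitWithin(4000, 3000, 1200, 800) → { width: 1067, height: 800 }
```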

The dataURItoBlob function

function dataURItoBlob(dataURI) {
  var byteString = atob(dataURI.split(',')[1]);
  var mimeString = dataURI.split(',')[0].split(':')[1].split(';')[0]
  var ab = new ArrayBuffer(byteString.length);
  var ia = new Uint8Array(ab);

  for (var i = 0; i < byteString.length; i++) {
    ia[i] = byteString.charCodeAt(i);
  }

  return new Blob([ab], { type: mimeString });
}

This is very nice and can be used to consume files directly on the server instead of uploading to external storage.
Thank you so much for sharing (while on vacation, really appreciated).
Enjoy your vacation!
