
I am working on implementing a system that generates Google Docs and then exports them with JavaScript via the Google API. I can create the docs and make changes to them, but now I want to export them in bulk. Using the batch endpoint, I am able to make the request for multiple exports, but it always comes back with a redirect. I have not found a way to follow this redirect to export the documents, and since I need to export many documents at once, I want to do this in a single batch request.

I have tried a few variations of the batch request with different mimeTypes, and the only non-error response I have been able to get is a 302 redirect.

--batch_1
Content-Type: application/http
Content-Id: <export-{docId}>

GET /drive/v3/files/{docId}/export?mimeType=application/pdf

--batch_1
Content-Type: application/http
Content-Id: <export-{docId}>

GET /download/drive/v3/files/{docId}/export?mimeType=application/pdf

--batch_1

These are the two variations I thought had the best chance of success, but I only get a 302 with the first request and an error with the second.

If possible, I would like to make the request in the batch and get the data back without making multiple requests to the endpoint. If that is not possible, is there an alternative that reduces API requests?
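For what it's worth, Google's documentation states that batch requests do not support media transfers (uploads or downloads), which would explain why the export only ever comes back as a redirect. A fallback, under that assumption, is to run the exports as ordinary single requests but in parallel, with a small concurrency cap so the total wall-clock time stays low. This is only a sketch: `accessToken` is a placeholder for the OAuth token, and it assumes Node 18+ so the global `fetch` is available.

```javascript
// Sketch: export docs as parallel single requests instead of one batch.
// chunk() caps how many exports are in flight at once, to stay under rate limits.
function chunk(items, size) {
  const out = [];
  for (let i = 0; i < items.length; i += size) out.push(items.slice(i, i + size));
  return out;
}

async function exportAllAsPdf(docIds, accessToken, concurrency = 5) {
  const results = [];
  for (const group of chunk(docIds, concurrency)) {
    // All exports in this group run concurrently; groups run sequentially.
    const exported = await Promise.all(
      group.map(async (docId) => {
        const url =
          `https://www.googleapis.com/drive/v3/files/${docId}/export` +
          `?mimeType=${encodeURIComponent("application/pdf")}`;
        const res = await fetch(url, {
          headers: { Authorization: `Bearer ${accessToken}` },
        });
        if (!res.ok) throw new Error(`Export failed for ${docId}: ${res.status}`);
        return { docId, pdfData: Buffer.from(await res.arrayBuffer()) };
      })
    );
    results.push(...exported);
  }
  return results;
}
```

Because each group awaits as a single `Promise.all`, a batch of a few dozen docs resolves in roughly `docIds.length / concurrency` round trips rather than one per document, which may help with a short server timeout.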

Edit 1 - Rest of my Code:

// (axios is imported at the top of the component file)
props: {
    googleDrive, // how I connect to Google Drive; uses OAuth 2.0
    docIds: [], // a list of document ids to turn into PDFs
    folderId: "id", // a folder id to put the PDFs in
  },
async run({ steps, $ }) {
    const boundary = `batch_${Date.now()}`;
    const batchBody = [];

    const authHeader = {
      'Authorization': `Bearer ${this.googleDrive.$auth.oauth_access_token}`,
      "Content-Type": `multipart/mixed; boundary=${boundary}`,
    };

    for (const x of this.docIds) {
      const docId = x.googleDocId;
      const pdfFilename = `${docId}.pdf`;

      // Step 1: Export request (Convert Google Doc to PDF)
      batchBody.push(
        `--${boundary}`,
        `Content-Type: application/http`,
        `Content-Id: <export-${docId}>`,
        ``,
        `GET https://www.googleapis.com/drive/v3/files/${docId}/export?mimeType=application/pdf`,
        ``
      );

      // Step 2: Delete original document
      /*batchBody.push(
        `--${boundary}`,
        `Content-Type: application/http`,
        `Content-Id: <delete-${docId}>`,
        ``,
        `DELETE /drive/v3/files/${docId}`,
        ``
      );*/
    }

    batchBody.push(`--${boundary}--`);

    console.log(batchBody.join("\r\n"));

    try {
      // Step 1 & 2: Send batch request (Export & Delete)
      const batchResponse = await axios.post(
        "https://www.googleapis.com/batch/drive/v3",
        batchBody.join("\r\n"),
        { headers: authHeader, responseType: "arraybuffer" }
      );

      console.log("Batch Export & Delete Response:", batchResponse.data);

      const pdfFiles = this.extractPdfFiles(batchResponse.data);
      return pdfFiles; // NOTE: early return for debugging; the upload code below never runs
      // Step 3: Upload PDFs into the specified folder
      const uploadBatchBody = [];

      for (const {docId, docName, pdfData} of pdfFiles) {
        const pdfFilename = `${docName}.pdf`;

        // Upload metadata
        const metadata = {
          name: pdfFilename,
          mimeType: "application/pdf",
          parents: [this.folderId],
        };

        // BUG: FormData/Blob objects can't be serialized into the string-joined
        // batch body below; join("\r\n") turns them into "[object FormData]"
        const formData = new FormData();
        formData.append(
          "metadata",
          new Blob([JSON.stringify(metadata)], {type:"application/json"})
        );
        formData.append("file", new Blob([pdfData], {type:"application/pdf"}));

        uploadBatchBody.push(
          `--${boundary}`,
          `Content-Type: application/http`,
          `Content-Id: <upload-${docId}>`,
          ``,
          `POST https://www.googleapis.com/upload/drive/v3/files?uploadType=multipart`,
          `Content-Type: multipart/form-data`,
          ``,
          formData
        );
      }

      uploadBatchBody.push(`--${boundary}--`);

      // Step 3: Batch Upload PDFs
      const uploadBatchResponse = await axios.post(
        "https://www.googleapis.com/batch/drive/v3",
        uploadBatchBody.join("\r\n"),
        { headers: {...authHeader, "Content-Type": `multipart/mixed; boundary=${boundary}`} }
      );

      console.log("Batch Upload Response:", uploadBatchResponse.data);

      return { status: "Batch request completed" };
    } catch (error) {
      console.error("Batch request error:", error);
      return { error: "Failed to process batch request" };
    }
  },
  methods: {
    extractPdfFiles(batchResponseData){
      const pdfFiles = [];

      // Convert the ArrayBuffer response to text so the multipart parts can be inspected
      const textDecoder = new TextDecoder("utf-8");
      const batchResponseText = textDecoder.decode(batchResponseData);

      // TODO: split batchResponseText on the response boundary and extract each
      // part's body; currently this just returns the raw text for inspection
      return batchResponseText;
    },
  }

In this code, step 1 is where the issue lies: I can't get the export to respond correctly.
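For completeness, here is a minimal sketch of what `extractPdfFiles` would need to do to split a multipart/mixed batch response into its parts. The helper names (`splitBatchResponse`, `splitOnce`) are my own, and it assumes the boundary string has already been read from the response's `Content-Type` header (e.g. `multipart/mixed; boundary=batch_abc`).

```javascript
// Sketch: split a multipart/mixed batch response body into its parts.
// Each part is an application/http envelope: outer headers, a blank line,
// then the inner HTTP status line, inner headers, blank line, and inner body.
function splitBatchResponse(bodyText, boundary) {
  return bodyText
    .split(`--${boundary}`)
    .map((part) => part.trim())
    .filter((part) => part && part !== "--") // drop the preamble and closing marker
    .map((part) => {
      const [envelopeHeaders, rest] = splitOnce(part, "\r\n\r\n");
      const [innerHead, innerBody] = splitOnce(rest ?? "", "\r\n\r\n");
      const statusLine = (innerHead ?? "").split("\r\n")[0] ?? "";
      const status = Number(statusLine.split(" ")[1]);
      return { envelopeHeaders, status, body: innerBody ?? "" };
    });
}

// Split `text` on the first occurrence of `sep` only.
function splitOnce(text, sep) {
  const i = text.indexOf(sep);
  return i === -1 ? [text, undefined] : [text.slice(0, i), text.slice(i + sep.length)];
}
```

Checking each part's `status` this way would at least surface the 302s (and their `Location` headers) per document instead of one opaque blob.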

Also, I know this code is probably a nightmare to look at; I have changed it many times in hopes of getting it working, so it may be a little jumbled.
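On step 3: since `uploadType=multipart` expects a `multipart/related` body (JSON metadata part plus media part), and that body can be built as a plain string, here is a sketch of what each upload request could look like as its own request rather than inside the batch. `buildMultipartUpload` is a hypothetical helper; the PDF bytes are base64-encoded here so the body stays text.

```javascript
// Sketch: build a multipart/related body for a single Drive file upload.
// Part 1 is the JSON metadata; part 2 is the media, base64-encoded.
function buildMultipartUpload(metadata, pdfBase64, boundary) {
  return [
    `--${boundary}`,
    "Content-Type: application/json; charset=UTF-8",
    "",
    JSON.stringify(metadata),
    `--${boundary}`,
    "Content-Type: application/pdf",
    "Content-Transfer-Encoding: base64",
    "",
    pdfBase64,
    `--${boundary}--`,
  ].join("\r\n");
}

// Usage (token, folderId, and pdfBase64 are placeholders):
// await fetch("https://www.googleapis.com/upload/drive/v3/files?uploadType=multipart", {
//   method: "POST",
//   headers: {
//     Authorization: `Bearer ${token}`,
//     "Content-Type": `multipart/related; boundary=upload_1`,
//   },
//   body: buildMultipartUpload({ name: "doc.pdf", parents: [folderId] }, pdfBase64, "upload_1"),
// });
```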

  • What error you are getting exactly? it is rate limited or quota exceed?
    – BadPiggie
    Commented Jan 31 at 6:21
  • Aside from that any other things you tried?
    – Lime Husky
    Commented Jan 31 at 17:02
  • @BadPiggie The error for the second example is that the endpoint is not valid. The first responds with 302. I am not getting rate limited, but rather have a short timeout on the server hosting my javascript, so awaiting responses doesn't make sense. Commented Feb 1 at 0:46
  • @LimeHusky I haven't found many other approaches to try. Looping singular requests results in a timeout on my server, as I can possible end up awaiting multiple dozens of requests, and my server time out is unfortunately short. Commented Feb 1 at 0:49
  • Is that all the code blocks request you can share or you have other code for us to see? If so paste it up so that other people can help to solve the issue.
    – Lime Husky
    Commented Feb 1 at 0:55

