Problem description
I am trying to get an AI model based on transformers.js (which uses the ONNX Runtime) working in Deno. Transformers.js added support for Deno in v3. This works great when Deno runs locally, but it does not appear to work in Deno Deploy. When deployed, it fails with the following error:
The deployment failed: UNCAUGHT_EXCEPTION

Error: This API is not supported in this environment
    at Object.Module._extensions..node (node:module:790:21)
    at Module.load (node:module:655:32)
    at Function.Module._load (node:module:523:13)
    at Module.require (node:module:674:19)
    at require (node:module:801:16)
    at Object.<anonymous> (file:///node_modules/.deno/[email protected]/node_modules/onnxruntime-node/dist/binding.js:9:1)
    at Object.<anonymous> (file:///node_modules/.deno/[email protected]/node_modules/onnxruntime-node/dist/binding.js:11:4)
    at Module._compile (node:module:736:34)
    at Object.Module._extensions..js (node:module:757:11)
    at Module.load (node:module:655:32)
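The stack trace points at onnxruntime-node's binding.js, which require()s a native .node addon; loading native addons appears to be the unsupported API here. As a hypothetical guard (the property check is my assumption, not a transformers.js API), the two environments can be distinguished before touching the native backend:

```typescript
// Hedged sketch: feature-detect whether the runtime can load native code at
// all before importing onnxruntime-node. `Deno.dlopen` is the FFI entry
// point in local Deno; assuming it is absent on Deno Deploy (inferred from
// the stack trace above, not verified against the Deploy docs).
const supportsNativeAddons =
  typeof (globalThis as { Deno?: { dlopen?: unknown } }).Deno?.dlopen ===
    "function";

console.log(
  supportsNativeAddons
    ? "native onnxruntime-node backend may be usable"
    : "native addons unavailable; prefer a wasm backend",
);
```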
Steps to reproduce
Sample code to reproduce:
const quantized = false; // change to `true` for a much smaller model (e.g. 87mb vs 345mb for image model), but lower accuracy

import {
  AutoProcessor,
  CLIPVisionModelWithProjection,
  RawImage,
  AutoTokenizer,
  CLIPTextModelWithProjection,
} from "npm:@huggingface/transformers";

const imageProcessor = await AutoProcessor.from_pretrained(
  "Xenova/clip-vit-base-patch16"
);
const visionModel = await CLIPVisionModelWithProjection.from_pretrained(
  "Xenova/clip-vit-base-patch16",
  { quantized }
);
const tokenizer = await AutoTokenizer.from_pretrained(
  "Xenova/clip-vit-base-patch16"
);
const textModel = await CLIPTextModelWithProjection.from_pretrained(
  "Xenova/clip-vit-base-patch16",
  { quantized }
);

function cosineSimilarity(A: number[], B: number[]) {
  if (A.length !== B.length) throw new Error("A.length !== B.length");
  let dotProduct = 0,
    mA = 0,
    mB = 0;
  for (let i = 0; i < A.length; i++) {
    dotProduct += A[i] * B[i];
    mA += A[i] * A[i];
    mB += B[i] * B[i];
  }
  mA = Math.sqrt(mA);
  mB = Math.sqrt(mB);
  const similarity = dotProduct / (mA * mB);
  return similarity;
}

export async function getImageEmbedding(imageLocation: string) {
  const image = await RawImage.read(imageLocation);
  const imageInputs = await imageProcessor(image);
  const { image_embeds } = await visionModel(imageInputs);
  // console.log(image_embeds.data);
  return image_embeds.data;
}

Deno.serve(async () => {
  const embed = await getImageEmbedding("./image.png");
  console.log(embed);
  console.log(embed.length);
  return new Response(JSON.stringify(embed));
});
Note: this code has some TypeScript and lint errors, but it runs fine on my local machine. I usually focus on getting code functional before addressing type/lint errors that don't appear to cause issues.
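(Aside: the cosineSimilarity helper above can be sanity-checked on its own, with no model downloads; this standalone copy is just for illustration.)

```typescript
// Standalone copy of the repro's cosine-similarity helper for a quick check.
function cosineSimilarity(A: number[], B: number[]): number {
  if (A.length !== B.length) throw new Error("A.length !== B.length");
  let dotProduct = 0, mA = 0, mB = 0;
  for (let i = 0; i < A.length; i++) {
    dotProduct += A[i] * B[i];
    mA += A[i] * A[i];
    mB += B[i] * B[i];
  }
  return dotProduct / (Math.sqrt(mA) * Math.sqrt(mB));
}

console.log(cosineSimilarity([1, 0], [1, 0])); // 1 (identical vectors)
console.log(cosineSimilarity([1, 0], [0, 1])); // 0 (orthogonal vectors)
console.log(cosineSimilarity([1, 2], [2, 4])); // ≈ 1 (parallel vectors)
```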
- Create a Deno project with this code
- Deploy using deployctl
- Observe the error in the deployctl output or in the Deploy dashboard
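For reference, the deploy step can be run with deployctl roughly as follows ("my-project" is a placeholder, not a real project name):

```shell
# install deployctl (command from the Deno Deploy docs)
deno install -gArf jsr:@deno/deployctl

# deploy the entrypoint; "my-project" is a placeholder project name
deployctl deploy --project=my-project main.ts
```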
Expected behavior
Since this code runs without error in Deno locally, I would expect it to also run without error in Deno Deploy.
Environment
No response
Possible solution
No response
Additional context
A similar error can be reproduced by using the ONNX Runtime JS libraries directly, bypassing transformers.js. In that test, I was loading the model from a local file.
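One possible workaround, which I have not verified on Deploy: transformers.js v3 exposes a device option on from_pretrained, and requesting the wasm backend should sidestep the native onnxruntime-node binding entirely. Whether Deploy accepts this is an assumption on my part; a sketch:

```typescript
// Hedged sketch (unverified on Deno Deploy): request the wasm execution
// backend so transformers.js never require()s onnxruntime-node's native addon.
import { CLIPVisionModelWithProjection } from "npm:@huggingface/transformers";

const visionModel = await CLIPVisionModelWithProjection.from_pretrained(
  "Xenova/clip-vit-base-patch16",
  { device: "wasm" }, // `device` is a v3 option; whether "wasm" works on Deploy is an assumption
);
```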