Error while using model through Transformers.js

#2
by Arskeliss - opened

Hi Maintainers,

I am receiving the following error while trying to use the model with dtype: 'fp16' via the transformers.js library.
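For context, the load path that triggers the error looks roughly like this. This is only a sketch: the post does not name the model repo or the pipeline task, so 'MODEL_ID' and 'text-generation' below are placeholders, and the snippet assumes the @huggingface/transformers package is installed.

```javascript
// Hypothetical reproduction sketch: MODEL_ID and the task are placeholders,
// since the original post does not name them.
import { pipeline } from '@huggingface/transformers';

// Requesting half-precision weights; onnxruntime-node then fails while
// loading the fp16 ONNX graph with the "Subgraph output (logits) is an
// outer scope value" error shown below.
const generator = await pipeline('text-generation', 'MODEL_ID', {
  dtype: 'fp16',
});
```

The same call with the default dtype would exercise a different ONNX file in the repo, which is why the error appears to be specific to the fp16 export.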

/home/arskeliss/Dokumentumok/git/azrael-map/node_modules/onnxruntime-node/dist/backend.js:27
            __classPrivateFieldGet(this, _OnnxruntimeSessionHandler_inferenceSession, "f").loadModel(pathOrBuffer.buffer, pathOrBuffer.byteOffset, pathOrBuffer.byteLength, options);
                                                                                           ^

Error: Failed to load model with error: /onnxruntime_src/onnxruntime/core/graph/graph.cc:1471 void onnxruntime::Graph::InitializeStateFromModelFileGraphProto() This is an invalid model. Subgraph output (logits) is an outer scope value being returned directly. Please update the model to add an Identity node between the outer scope value and the subgraph output.

    at new OnnxruntimeSessionHandler (/home/arskeliss/Dokumentumok/git/azrael-map/node_modules/onnxruntime-node/dist/backend.js:27:92)
    at Immediate.<anonymous> (/home/arskeliss/Dokumentumok/git/azrael-map/node_modules/onnxruntime-node/dist/backend.js:64:29)
    at process.processImmediate (node:internal/timers:491:21)
