Actions: triton-inference-server/onnxruntime_backend

Actions

CI

Actions

Loading...
Loading

Show workflow options

Create status badge

Loading
74 workflow runs


Update CUDA archs in ORT
CI #413: Issue comment #291 (comment) created by pvijayakrish
January 15, 2025 21:46 11s
Update CUDA archs in ORT
CI #412: Issue comment #291 (comment) created by mc-nv
January 13, 2025 23:32 13s
January 12, 2025 11:51 13s
Built-in support for (custom?) decryption of model weights
CI #410: Issue comment #279 (comment) created by vadimkantorov
November 20, 2024 15:56 13s
cudnn_home not valid during build
CI #409: Issue comment #33 (comment) created by 631068264
November 11, 2024 10:48 15s
October 12, 2024 10:04 12s
Failed to allocated memory for requested buffer of size X
CI #407: Issue comment #249 (comment) created by DataXujing
October 12, 2024 10:01 10s
Enable support for TensoRT 10.5
CI #406: Issue comment #274 (comment) created by mc-nv
October 12, 2024 02:31 10s
Enable support for TensoRT 10.5
CI #405: Issue comment #274 (comment) created by pvijayakrish
October 12, 2024 02:19 10s
October 8, 2024 08:16 13s
October 7, 2024 18:00 12s
October 7, 2024 09:58 11s
October 7, 2024 09:37 11s
October 4, 2024 22:57 10s
October 4, 2024 22:31 10s
October 2, 2024 13:52 11s
Memory Leak When Using ONNXRuntime With OpenVino EP
CI #397: Issue comment #132 (comment) created by ogencoglu
September 5, 2024 12:25 15s
build: RHEL8 PyTorch Backend
CI #396: Issue comment #266 (comment) created by fpetrini15
August 26, 2024 17:10 15s
build: RHEL8 PyTorch Backend
CI #395: Issue comment #266 (comment) created by nv-kmcgill53
August 26, 2024 16:32 13s
CPU inference is much slower than with ONNX Runtime directly
CI #394: Issue comment #34 (comment) created by hanswang1
August 21, 2024 08:37 13s
CPU inference is much slower than with ONNX Runtime directly
CI #393: Issue comment #34 (comment) created by Mitix-EPI
August 19, 2024 12:48 13s
CPU inference is much slower than with ONNX Runtime directly
CI #392: Issue comment #34 (comment) created by hanswang1
August 19, 2024 06:16 12s
Hardcoding libonnxruntime.so.1 library name reference value
CI #391: Issue comment #264 (comment) created by pvijayakrish
August 15, 2024 21:46 10s
Hardcoding libonnxruntime.so.1 library name reference value
CI #390: Issue comment #264 (comment) created by mc-nv
August 15, 2024 19:17 12s
Hardcoding libonnxruntime.so.1 library name reference value
CI #389: Issue comment #264 (comment) created by pvijayakrish
August 15, 2024 19:13 12s