
Add InferX provider (OpenAI-compatible endpoints)#1719

Open
Prashanth-InferX wants to merge 2 commits into anomalyco:dev from Prashanth-InferX:add-inferx-provider

Conversation


@Prashanth-InferX Prashanth-InferX commented May 6, 2026

Adds InferX as an OpenAI-compatible provider.

InferX provides dedicated inference endpoints with no cold starts.

Base URL format:
https://model.inferx.net/funccall/{tenant}/endpoints/{endpoint}/v1

Example model:
Qwen/Qwen3.6-27B-FP8

Compatible with OpenAI-style clients (e.g., OpenCode, AI SDK).

Users provide their own endpoint and API key.
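Since the endpoints are OpenAI-compatible, a client only needs the base URL and an API key. A minimal sketch using the OpenAI Python SDK, where the tenant name `acme`, endpoint name `chat`, and the API key placeholder are hypothetical values a user would substitute:

```python
def inferx_base_url(tenant: str, endpoint: str) -> str:
    """Build an InferX base URL from a tenant and endpoint name."""
    return f"https://model.inferx.net/funccall/{tenant}/endpoints/{endpoint}/v1"

base_url = inferx_base_url("acme", "chat")
# → "https://model.inferx.net/funccall/acme/endpoints/chat/v1"

# With the OpenAI Python SDK, point the client at that URL
# (not executed here; requires a real endpoint and key):
#
# from openai import OpenAI
# client = OpenAI(base_url=base_url, api_key="YOUR_INFERX_API_KEY")
# resp = client.chat.completions.create(
#     model="Qwen/Qwen3.6-27B-FP8",
#     messages=[{"role": "user", "content": "Hello"}],
# )
```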

@Prashanth-InferX

Happy to add more models or adjust format if needed.

