
Support torch validation and add to spmd tests#3003

Draft
ethanglaser wants to merge 2 commits into uxlfoundation:main from ethanglaser:dev/eglaser-add-torch-spmd-validation

Conversation

@ethanglaser
Contributor

Description


Checklist:

Completeness and readability

  • I have commented my code, particularly in hard-to-understand areas.
  • I have updated the documentation to reflect the changes or created a separate PR with updates and provided its number in the description, if necessary.
  • Git commit message contains an appropriate signed-off-by string (see CONTRIBUTING.md for details).
  • I have resolved any merge conflicts that might occur with the base branch.

Testing

  • I have run it locally and tested the changes extensively.
  • All CI jobs are green or I have provided justification why they aren't.
  • I have extended testing suite if new functionality was introduced in this PR.

Performance

  • I have measured performance for affected algorithms using scikit-learn_bench and provided at least a summary table with measured data, if performance change is expected.
  • I have provided justification why performance and/or quality metrics have changed or why changes are not expected.
  • I have extended the benchmarking suite and provided a corresponding scikit-learn_bench PR if new measurable functionality was introduced in this PR.

@ethanglaser
Contributor Author

/intelci: run

@codecov

codecov bot commented Mar 7, 2026

Codecov Report

✅ All modified and coverable lines are covered by tests.

Flag Coverage Δ
azure 79.87% <ø> (ø)
github ?

Flags with carried forward coverage won't be shown.
see 3 files with indirect coverage changes


        xp = array_api_modules[target_df]
        return xp.asarray(obj)
    elif target_df == "torch":
        if hasattr(torch, "xpu") and torch.xpu.is_available():

should probably be documented to prevent confusion
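One way to address this review comment would be to spell out the silent CPU fallback in a docstring. A minimal sketch, assuming torch is installed; the function name `to_torch` is hypothetical and the actual helper in the PR may differ:

```python
import numpy as np
import torch  # assumed available in the test environment


def to_torch(obj, **kwargs):
    """Convert ``obj`` to a torch tensor.

    Note: if an XPU (Intel GPU) device is available, the tensor is
    placed on ``"xpu"``; otherwise it silently falls back to ``"cpu"``.
    The fallback is intentional so the same test suite can run on
    CPU-only hosts, but callers should not assume the result lives
    on a GPU.
    """
    if hasattr(torch, "xpu") and torch.xpu.is_available():
        return torch.as_tensor(obj, device="xpu", **kwargs)
    return torch.as_tensor(obj, device="cpu", **kwargs)
```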

@icfaust
Contributor

icfaust commented Mar 7, 2026

I assume the dpc runtime mismatch issue between torch and dpnp is solved? otherwise this may make things difficult. good addition

@david-cortes-intel
Contributor

/intelci: run

@david-cortes-intel
Contributor

The CI error:

        elif target_df == "torch":
            if hasattr(torch, "xpu") and torch.xpu.is_available():
                return torch.as_tensor(obj, device="xpu", *args, **kwargs)
            else:
>               return torch.as_tensor(obj, device="cpu", *args, **kwargs)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E               TypeError: as_tensor(): argument 'dtype' must be torch.dtype, not type

@ethanglaser
Contributor Author

I assume the dpc runtime mismatch issue between torch and dpnp is solved? otherwise this may make things difficult. good addition

If installing both torch and dpnp without oneapi deps, there are no issues

        return xp.asarray(obj)
    elif target_df == "torch":
        if hasattr(torch, "xpu") and torch.xpu.is_available():
            return torch.as_tensor(obj, device="xpu", *args, **kwargs)

It seems we are testing np.float32/np.float64 dtypes in convert_to_dataframe, and torch does not accept these; maybe we need to convert to a torch dtype before passing to as_tensor.

@icfaust
Copy link
Contributor

icfaust commented Mar 10, 2026

I assume the dpc runtime mismatch issue between torch and dpnp is solved? otherwise this may make things difficult. good addition

If installing both torch and dpnp without oneapi deps, there are no issues

The release schedules don't match, which caused issues in one of the previous oneAPI minor releases; this may break things for a bit at the 2026.0 release with respect to torch.


4 participants