
Conversation

@stas00 (Collaborator) commented May 28, 2025

This is a follow-up to #923.

My original code was copied from transformers, which has a different filesystem layout, and I missed that. This PR fixes it so that it actually does the right thing.

Now you can have multiple clones of DeepSpeed, and the tests will automatically use the local repo rather than a pre-installed deepspeed package.
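
For context, here is a minimal sketch of the mechanism being described: prepending the repo root to `sys.path` from the test suite so the local clone shadows any installed `deepspeed` package. The file location (e.g. a `tests/conftest.py`) and the names below are assumptions for illustration, not the PR's actual code.

```python
# Hypothetical sketch (not the PR's actual code): make `import deepspeed`
# resolve to the local clone instead of a pip-installed copy.
import os
import sys

# Assumption: this file lives one level below the repo root (e.g. tests/),
# so the root is the parent directory of this file's directory. Note that
# transformers uses a different filesystem layout, which is what the original
# copied code got wrong.
REPO_ROOT = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))

# Put the local clone first on sys.path so it shadows any pre-installed
# deepspeed package.
if REPO_ROOT not in sys.path:
    sys.path.insert(0, REPO_ROOT)
```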

@stas00 enabled auto-merge May 28, 2025 20:47
@stas00 added this pull request to the merge queue May 28, 2025
Merged via the queue into master with commit e5afb88 May 29, 2025
10 of 11 checks passed
@stas00 deleted the stas00-patch-1 branch May 29, 2025 00:59
deepcharm pushed a commit to deepcharm/DeepSpeed that referenced this pull request Jun 16, 2025
…ing tests (deepspeedai#7317)
Signed-off-by: Max Kovalenko <[email protected]>
Antlera pushed a commit to Antlera/DeepSpeed that referenced this pull request Jun 27, 2025
…ing tests (deepspeedai#7317)