fix torch package issue caused by heterogeneous planner #1905

Closed
wants to merge 1 commit
torchrec/distributed/planner/__init__.py: 1 addition & 4 deletions
@@ -21,9 +21,6 @@
 - automatically building and selecting an optimized sharding plan.
 """

-from torchrec.distributed.planner.planners import (  # noqa
-    EmbeddingShardingPlanner,
-    HeteroEmbeddingShardingPlanner,  # noqa
-)
+from torchrec.distributed.planner.planners import EmbeddingShardingPlanner  # noqa
 from torchrec.distributed.planner.types import ParameterConstraints, Topology  # noqa
 from torchrec.distributed.planner.utils import bytes_to_gb, sharder_name  # noqa
torchrec/distributed/shard.py: 2 additions & 5 deletions
@@ -15,11 +15,8 @@
 from torch.distributed._composable.contract import contract
 from torchrec.distributed.comm import get_local_size
 from torchrec.distributed.model_parallel import get_default_sharders
-from torchrec.distributed.planner import (
-    EmbeddingShardingPlanner,
-    HeteroEmbeddingShardingPlanner,
-    Topology,
-)
+from torchrec.distributed.planner import EmbeddingShardingPlanner, Topology
+from torchrec.distributed.planner.planners import HeteroEmbeddingShardingPlanner
 from torchrec.distributed.sharding_plan import (
     get_module_to_default_sharders,
     ParameterShardingGenerator,
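Taken together, the two hunks stop re-exporting HeteroEmbeddingShardingPlanner from the torchrec.distributed.planner package: only EmbeddingShardingPlanner keeps its package-level re-export, and shard.py now pulls the heterogeneous planner from the planners submodule where it is defined. A minimal sketch of the resulting import paths for downstream code (illustrative only, derived from this diff rather than from TorchRec documentation):

# Still re-exported at the package level after this PR.
from torchrec.distributed.planner import EmbeddingShardingPlanner, Topology

# No longer re-exported by torchrec/distributed/planner/__init__.py;
# import it from the submodule that defines it instead.
from torchrec.distributed.planner.planners import HeteroEmbeddingShardingPlanner

Keeping the heterogeneous planner out of the package __init__ means a plain `import torchrec.distributed.planner` no longer triggers that class's import chain, which is presumably the torch package issue the PR title refers to.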