fix torch package issue caused by heterogenous planner (#1905)
Summary:
Pull Request resolved: #1905

Fix the torch package issue caused by the heterogeneous planner: move the HeteroEmbeddingShardingPlanner import out of the planner package `__init__` and into its defining submodule.
Context: S410933

Reviewed By: IvanKobzarev

Differential Revision: D56360099

fbshipit-source-id: 538744068af1eb552f11cbf4f8e631e3dd4d7aee
gnahzg authored and facebook-github-bot committed Apr 19, 2024
1 parent f034281 commit 303e852
Showing 2 changed files with 3 additions and 9 deletions.
5 changes: 1 addition & 4 deletions torchrec/distributed/planner/__init__.py
@@ -21,9 +21,6 @@
 - automatically building and selecting an optimized sharding plan.
 """

-from torchrec.distributed.planner.planners import (  # noqa
-    EmbeddingShardingPlanner,
-    HeteroEmbeddingShardingPlanner,  # noqa
-)
+from torchrec.distributed.planner.planners import EmbeddingShardingPlanner  # noqa
 from torchrec.distributed.planner.types import ParameterConstraints, Topology  # noqa
 from torchrec.distributed.planner.utils import bytes_to_gb, sharder_name  # noqa
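The diff above follows a general pattern: a symbol re-exported from a package `__init__` is loaded eagerly by every `import` of the package, so dependency walkers such as torch.package pick up everything the re-exported symbol's module needs. Dropping the re-export means only call sites that import the submodule explicitly pay that cost. A minimal, self-contained sketch of the idea, using hypothetical `pkg` / `pkg.heavy` module names rather than real torchrec code:

```python
import types

# Hypothetical stand-ins (not torchrec code) for a "heavy" submodule whose
# class was re-exported from the package __init__.
heavy = types.ModuleType("pkg.heavy")
heavy.HeteroPlanner = type("HeteroPlanner", (), {})

# Before the fix: the package __init__ re-exports the symbol, so importing
# `pkg` alone reaches it -- and a dependency walker that follows the package's
# imports will drag in everything `pkg.heavy` depends on.
pkg_before = types.ModuleType("pkg")
pkg_before.HeteroPlanner = heavy.HeteroPlanner

# After the fix: the package __init__ exports only the stable symbol; code
# that needs the heterogeneous planner imports its submodule explicitly.
pkg_after = types.ModuleType("pkg")
pkg_after.EmbeddingPlanner = type("EmbeddingPlanner", (), {})

print(hasattr(pkg_before, "HeteroPlanner"))  # True
print(hasattr(pkg_after, "HeteroPlanner"))   # False
```

Call sites that still need the heterogeneous planner (as `shard.py` does in the second hunk below) switch to importing it from `torchrec.distributed.planner.planners` directly.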
7 changes: 2 additions & 5 deletions torchrec/distributed/shard.py
@@ -15,11 +15,8 @@
 from torch.distributed._composable.contract import contract
 from torchrec.distributed.comm import get_local_size
 from torchrec.distributed.model_parallel import get_default_sharders
-from torchrec.distributed.planner import (
-    EmbeddingShardingPlanner,
-    HeteroEmbeddingShardingPlanner,
-    Topology,
-)
+from torchrec.distributed.planner import EmbeddingShardingPlanner, Topology
+from torchrec.distributed.planner.planners import HeteroEmbeddingShardingPlanner
 from torchrec.distributed.sharding_plan import (
     get_module_to_default_sharders,
     ParameterShardingGenerator,
