Hi Johnnie, if I have a tensor network [...]
Once I have the path object, how can I iterate through it to do my contraction? Right now, I'm calling [...]
Hi @cotejer, here are a couple of very explicit methods, which might indeed be useful to include in quimb at some point... Some setup:

```python
import quimb.tensor as qtn

tn = qtn.HTN2D_classical_ising_partition_function(10, 10, beta=0.44)
output_inds = () # if hyper tensor network we need to specify these
path = tn.contraction_path(optimize="auto-hq", output_inds=output_inds)
tree = tn.contraction_tree(optimize="auto-hq", output_inds=output_inds)
```
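For orientation, `path` is a standard opt_einsum style list of pairwise contraction positions, while `tree` is a cotengra `ContractionTree`; the inspection calls below are a small sketch assuming cotengra's usual `contraction_cost` / `contraction_width` methods:

```python
# peek at the raw path: a sequence of position pairs to contract
print(path[:3])  # e.g. [(0, 1), (0, 2), ...], exact pairs depend on the optimizer

# cost / size estimates from the tree (assuming cotengra's ContractionTree API)
print(tree.contraction_cost())   # total number of scalar operations
print(tree.contraction_width())  # log2 size of the largest intermediate tensor
```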
The tree object has a lot more information (plot_tent, print_contractions etc.) and stores the tree/path in a, I think, more intuitive format:

```python
def contract_explicit_tree(self, tree, output_inds=None, inplace=False):
    tn = self if inplace else self.copy()
    # map between nodes in the tree and tensor ids in the network
    tidmap = dict(zip(tree.gen_leaves(), tn.tensor_map.keys()))
    for parent, left, right in tree.traverse():
        # get tensor ids
        tidl = tidmap.pop(left)
        tidr = tidmap.pop(right)
        # the following is like tn._contract_between_tids(tidl, tidr)
        # compute the new indices (only needed if hyper tensor network)
        new_inds = tn.compute_contracted_inds(
            tidl, tidr,
            # output_inds here are those of the global contraction
            output_inds=output_inds,
        )
        # pop the left and right tensors
        tl = tn._pop_tensor(tidl)
        tr = tn._pop_tensor(tidr)
        # do the contraction! (n.b. you could just do `tl @ tr` if the
        # tensor network only has 'standard' indices)
        tp = qtn.tensor_contract(
            tl, tr,
            # output_inds here are those of the local contraction
            output_inds=new_inds,
            # always wrap as a Tensor, even if scalar
            preserve_tensor=True,
        )
        # add tensor back specifically with tidr
        tn.add_tensor(tp, tid=tidr)
        # update the tensor map
        tidmap[parent] = tidr
    return tn
```
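For example, calling it directly as a function on the network built above (a minimal usage sketch, assuming the `tn`, `tree` and `output_inds` from the setup):

```python
tnc = contract_explicit_tree(tn, tree, output_inds=output_inds)
# after the full traversal a single (scalar) tensor remains
Z = tnc.tensors[0].data
print(Z)
```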
The equivalent routine driven by the original, opt_einsum style path format:

```python
def contract_explicit_path(self, path, output_inds=None, inplace=False):
    tn = self if inplace else self.copy()
    # opt_einsum path format treats tensors as a list we pop from
    tidlist = list(tn.tensor_map.keys())
    for p in path:
        # get tensor ids
        tids = [tidlist.pop(i) for i in sorted(p, reverse=True)]
        # compute the new indices (only needed if hyper tensor network)
        new_inds = tn.compute_contracted_inds(
            *tids,
            # output_inds here are those of the global contraction
            output_inds=output_inds,
        )
        # pop the tensors
        ts = list(map(tn._pop_tensor, tids))
        # do the contraction!
        tp = qtn.tensor_contract(
            *ts,
            # output_inds here are those of the local contraction
            output_inds=new_inds,
            # always wrap as a Tensor, even if scalar
            preserve_tensor=True,
        )
        # add tensor back specifically with tidp
        tidp = tids[-1]
        tn.add_tensor(tp, tid=tidp)
        # update the tensor list
        tidlist.append(tidp)
    return tn
```
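And a corresponding quick check for the path version (again just a sketch, reusing the same setup; it should give the same value as the tree-based contraction):

```python
tnc = contract_explicit_path(tn, path, output_inds=output_inds)
Z_path = tnc.tensors[0].data  # single remaining scalar tensor
print(Z_path)
```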
optimize="auto-hq",
output_inds=output_inds,
max_bond=None,
cutoff=0.0,
progbar=True,
# a lot can be achieved with the callbacks:
callback=lambda tn, tid: None,
callback_pre_compress=lambda tn, tids: None,
callback_post_compress=lambda tn, tids: None,
callback_pre_contract=lambda tn, tids: None,
callback_post_contract=lambda tn, tid: None,
)
# log2[SIZE]: 0.00/10.00: 100%|██████████| 179/179 [00:00<00:00, 1342.73it/s]
# 8.332563918608845e+38 |
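As one illustration of those callbacks (my own sketch, assuming only the `callback_post_contract(tn, tid)` signature shown above), you could record the size of each intermediate tensor as it is produced:

```python
sizes = []

def record_size(tn_, tid):
    # tid is the id of the tensor just created by this contraction step
    sizes.append(tn_.tensor_map[tid].size)

tn.contract_compressed(
    optimize="auto-hq",
    output_inds=output_inds,
    max_bond=None,
    cutoff=0.0,
    callback_post_contract=record_size,
)
print(max(sizes))  # size of the largest intermediate encountered
```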