Weird that two tensors originating from the same source have different mean values
Concatenating a list of 2D tensors along two different axes produces tensors A and B where A.T == B, but their mean values along the same axis differ slightly (A.T.mean(axis=0) != B.mean(axis=0)). Why? Theoretically they should be identical.
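The usual explanation is floating-point summation order: the two tensors hold identical values but have different memory layouts, so the reduction walks the elements in a different order and rounding errors accumulate differently. A minimal sketch of the setup (chunk shapes and seed are arbitrary, chosen only to make the effect visible):

```python
import torch

torch.manual_seed(0)
chunks = [torch.rand(5, 10000) for _ in range(3)]   # arbitrary illustrative shapes

# A concatenates the chunks along axis 0; B concatenates their transposes along
# axis 1, so A.T and B contain exactly the same values.
A = torch.cat(chunks, dim=0)                  # shape (15, 10000)
B = torch.cat([c.T for c in chunks], dim=1)   # shape (10000, 15)
print(torch.equal(A.T, B))                    # True: element-wise identical

# A.T is a non-contiguous view while B is contiguous, so the float32 reduction
# visits memory in a different order and the last few bits can disagree.
diff = (A.T.mean(axis=0) - B.mean(axis=0)).abs().max()
print(diff)   # often a tiny nonzero value (e.g. ~1e-8) rather than exactly 0
```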
PyTorch loses precision when converting numbers into tensors
How come PyTorch makes such a weird mistake in the higher digits of a division?
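Both of these usually trace back to the default dtype: torch.tensor() stores Python floats as float32, which keeps only about 7 significant decimal digits, so the "higher digits" are rounded away. A small sketch of the effect (the division by 3 is just an illustration):

```python
import torch

# torch.tensor() defaults to float32 (~7 significant decimal digits), so the
# later digits of the result are rounding artifacts, not a division bug.
x = torch.tensor(1.0) / 3.0
print(f"{x.item():.17f}")    # ~0.33333334..., float32 rounding in the later digits

# Requesting float64 keeps the full double precision of the Python float.
y = torch.tensor(1.0, dtype=torch.float64) / 3.0
print(f"{y.item():.17f}")    # ~0.33333333333333331
```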
Reshape concatenated tensor into batched tensor (PyTorch)
I have a concatenated tensor that represents a batch of point clouds with pointwise features.
Shape: (batch_size * num_points, feature_dim)
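Assuming the points of each cloud are stored consecutively (all of cloud 0's points first, then cloud 1's, and so on), a plain reshape/view recovers the batched layout without copying data; the shapes below are illustrative:

```python
import torch

batch_size, num_points, feature_dim = 4, 1024, 16   # illustrative sizes

# Flat tensor as described: the points of all clouds stacked along dim 0.
flat = torch.randn(batch_size * num_points, feature_dim)

# Because each cloud occupies a contiguous block of rows, view() splits the
# first dimension back into (batch, points) without moving any data.
batched = flat.view(batch_size, num_points, feature_dim)
print(batched.shape)   # torch.Size([4, 1024, 16])
```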
Batched PyTorch tensor indexing
I have a source tensor of size (3, 2) and an index tensor of size (3, 3) containing integer values 0, 1, or 2. In PyTorch I can index source[index]
to get a tensor of size (3, 3, 2). Example:
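The original example is truncated here; a minimal sketch of the indexing described, with arbitrary values, could look like this:

```python
import torch

source = torch.arange(6).reshape(3, 2)          # shape (3, 2)
index = torch.tensor([[0, 1, 2],
                      [2, 0, 1],
                      [1, 1, 0]])               # shape (3, 3), values in {0, 1, 2}

# Advanced indexing: each entry of `index` selects a whole row of `source`,
# so the result has shape index.shape + source.shape[1:] == (3, 3, 2).
out = source[index]
print(out.shape)   # torch.Size([3, 3, 2])
```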