
TypeError: __init__() missing 2 required positional arguments: 'sequence_name' and 'sequence_category' #8

Open
yty-sky opened this issue Jun 29, 2023 · 8 comments



yty-sky commented Jun 29, 2023

```
File "/home/sheepsky/miniconda3/envs/pts_diffusion/lib/python3.8/site-packages/accelerate/utils/operations.py", line 171, in send_to_device
    return type(tensor)(
TypeError: __init__() missing 2 required positional arguments: 'sequence_name' and 'sequence_category'
```


yty-sky commented Jun 29, 2023

How to solve this?

@lukemelas
Owner

This is "Common Issue #3" in https://github.com/lukemelas/projection-conditioned-point-cloud-diffusion#common-issues, the README has instructions on fixing it :)


yty-sky commented Jun 29, 2023

I have solved this. Replace the `send_to_device` function with the following (note: this is not the same as the solution in Common Issue #3):

```python
# Patched send_to_device in accelerate/utils/operations.py.
# Mapping and honor_type are already imported at the top of that file.
def send_to_device(tensor, device, non_blocking=False, skip_keys=None):
    """
    Recursively sends the elements in a nested list/tuple/dictionary of tensors to a given device.

    Args:
        tensor (nested list/tuple/dictionary of `torch.Tensor`):
            The data to send to a given device.
        device (`torch.device`):
            The device to send the data to.

    Returns:
        The same data structure as `tensor` with all tensors sent to the proper device.
    """
    if isinstance(tensor, (tuple, list)):
        return honor_type(
            tensor, (send_to_device(t, device, non_blocking=non_blocking, skip_keys=skip_keys) for t in tensor)
        )
    elif isinstance(tensor, Mapping):
        from pytorch3d.implicitron.dataset.data_loader_map_provider import FrameData
        if isinstance(skip_keys, str):
            skip_keys = [skip_keys]
        if skip_keys is None:
            skip_keys = []
        if isinstance(tensor, FrameData):
            # FrameData's __init__ takes keyword arguments, not a single dict,
            # so expand the mapping with ** instead of passing it positionally.
            return type(tensor)(
                **{
                    k: t if k in skip_keys else send_to_device(t, device, non_blocking=non_blocking, skip_keys=skip_keys)
                    for k, t in tensor.items()
                }
            )
        return type(tensor)(
            {
                k: t if k in skip_keys else send_to_device(t, device, non_blocking=non_blocking, skip_keys=skip_keys)
                for k, t in tensor.items()
            }
        )
    elif hasattr(tensor, "to"):
        try:
            return tensor.to(device, non_blocking=non_blocking)
        except TypeError:  # .to() doesn't accept non_blocking as kwarg
            return tensor.to(device)
    else:
        return tensor
```
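For context, here is a minimal, pytorch3d-free sketch of why the unpatched code raises this `TypeError` and why the `**`-expansion branch fixes it. The `Frame` dataclass below is a hypothetical stand-in for pytorch3d's `FrameData`:

```python
from dataclasses import dataclass, fields

@dataclass
class Frame:
    # Stand-in for FrameData: required fields, no single-dict constructor
    sequence_name: str
    sequence_category: str

    def items(self):
        # Mapping-style iteration, like FrameData provides
        return [(f.name, getattr(self, f.name)) for f in fields(self)]

frame = Frame("seq_01", "hydrant")

# accelerate's generic Mapping branch calls type(tensor)({...}),
# which passes one dict positionally and fails for such classes:
try:
    type(frame)({k: v for k, v in frame.items()})
except TypeError as e:
    print(e)  # missing required positional argument(s)

# The patched branch expands the dict into keyword arguments instead:
copy = type(frame)(**{k: v for k, v in frame.items()})
print(copy.sequence_name)  # → seq_01
```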

@surtantheta

Hi @yty-sky ,
I tried applying your solution but the issue still persists. Can you help ?

@ziyuz-vision

@yty-sky I encountered another issue after applying your solution. Any idea?

AttributeError: 'NoneType' object has no attribute 'sequence_point_cloud'


szy4017 commented Oct 27, 2023

@zzy0428 I met the same problem. Do you have any solution?
This is part of the error log.
```
Error executing job with overrides: ['dataset.category=tv', 'dataloader.batch_size=24', 'dataloader.num_workers=8', 'run.vis_before_training=True', 'run.val_before_training=True', 'run.name=train__hydrant__ebs_24']
Traceback (most recent call last):
  File "main.py", line 165, in main
    loss = model(batch, mode='train')
  File "/data/szy4017/miniconda3/envs/pcdiffusion/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1110, in _call_impl
    return forward_call(*input, **kwargs)
  File "/data/szy4017/miniconda3/envs/pcdiffusion/lib/python3.8/site-packages/accelerate/utils/operations.py", line 697, in forward
    return model_forward(*args, **kwargs)
  File "/data/szy4017/miniconda3/envs/pcdiffusion/lib/python3.8/site-packages/accelerate/utils/operations.py", line 685, in __call__
    return convert_to_fp32(self.model_forward(*args, **kwargs))
  File "/data/szy4017/miniconda3/envs/pcdiffusion/lib/python3.8/site-packages/torch/autocast_mode.py", line 12, in decorate_autocast
    return func(*args, **kwargs)
  File "/data/szy4017/code/projection-conditioned-point-cloud-diffusion/experiments/model/model.py", line 177, in forward
    pc=batch.sequence_point_cloud,
AttributeError: 'NoneType' object has no attribute 'sequence_point_cloud'

Set the environment variable HYDRA_FULL_ERROR=1 for a complete stack trace.
wandb: Waiting for W&B process to finish... (failed 1).
wandb: You can sync this run to the cloud by running:
wandb: wandb sync /data/szy4017/code/projection-conditioned-point-cloud-diffusion/experiments/outputs/train__hydrant__ebs_24/2023-10-27--22-11-43/wandb/offline-run-20231027_221147-k045nwtg
wandb: Find logs at: ./wandb/offline-run-20231027_221147-k045nwtg/logs
```

@alexdesko

Hi all, I have the same error as @zzy0428 and @szy4017, namely

```
get_num_points: return x.points_padded().shape[1]
AttributeError: 'NoneType' object has no attribute 'points_padded'
```

in model/model_utils.py, line 29. Has anyone found a fix for this?


013292 commented Mar 11, 2024

After following @yty-sky's solution, I've run into the same issue as @alexdesko:

> Hi all, I have the same error as @zzy0428 and @szy4017, namely get_num_points return x.points_padded().shape[1] AttributeError: 'NoneType' object has no attribute 'points_padded' in model/model_utils.py line 29. Has anyone found a fix for this?

@lukemelas Are there any hints for addressing this problem?
