
Sourcery refactored main branch #1

Open · wants to merge 1 commit into main from sourcery/main
Conversation

@sourcery-ai[bot] commented Jun 2, 2023

Branch main refactored by Sourcery.

If you're happy with these changes, merge this Pull Request using the Squash and merge strategy.

See our documentation here.

Run Sourcery locally

Reduce the feedback loop during development by using the Sourcery editor plugin.

Review changes via command line

To manually merge these changes, make sure you're on the main branch, then run:

git fetch origin sourcery/main
git merge --ff-only FETCH_HEAD
git reset HEAD^

Help us improve this pull request!

@sourcery-ai[bot] requested a review from dsx1986 · Jun 2, 2023 06:20
@sourcery-ai[bot] left a comment

Due to GitHub API limits, only the first 60 comments can be shown.

- if self.start_index == 2:
-     readout = (x[:, 0] + x[:, 1]) / 2
- else:
-     readout = x[:, 0]
+ readout = (x[:, 0] + x[:, 1]) / 2 if self.start_index == 2 else x[:, 0]

Function AddReadout.forward refactored with the following changes:
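The rewrite uses Python's conditional expression to collapse the four-line if/else into one assignment. As a standalone sketch (the function name and plain-list tokens below are illustrative, not taken from the PR's tensor code):

```python
def readout(tokens, start_index):
    # Conditional expression: average the first two entries when two
    # readout tokens are present, otherwise take the first entry alone.
    return (tokens[0] + tokens[1]) / 2 if start_index == 2 else tokens[0]
```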

- layer_1 = pretrained.act_postprocess1[0:2](layer_1)
- layer_2 = pretrained.act_postprocess2[0:2](layer_2)
- layer_3 = pretrained.act_postprocess3[0:2](layer_3)
- layer_4 = pretrained.act_postprocess4[0:2](layer_4)
+ layer_1 = pretrained.act_postprocess1[:2](layer_1)
+ layer_2 = pretrained.act_postprocess2[:2](layer_2)
+ layer_3 = pretrained.act_postprocess3[:2](layer_3)
+ layer_4 = pretrained.act_postprocess4[:2](layer_4)

Function forward_vit refactored with the following changes:

Comment on lines -190 to +187
- readout_oper = [
-     ProjectReadout(vit_features, start_index) for out_feat in features
- ]
+ readout_oper = [ProjectReadout(vit_features, start_index) for _ in features]

Function get_readout_oper refactored with the following changes:
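Renaming the unused loop variable to `_` is a convention rather than a language feature: it signals that the comprehension depends only on the number of elements, not on their values. A minimal sketch with made-up names:

```python
def constant_per_item(value, items):
    # `_` marks the loop variable as intentionally unused: we build
    # one copy of `value` per element, ignoring the elements themselves.
    return [value for _ in items]
```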

Comment on lines -318 to +313
- hooks = [5, 11, 17, 23] if hooks == None else hooks
+ hooks = [5, 11, 17, 23] if hooks is None else hooks

Function _make_pretrained_vitl16_384 refactored with the following changes:
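`is None` compares identity against the `None` singleton, whereas `== None` dispatches to `__eq__`, which a class may override arbitrarily; PEP 8 therefore mandates `is` for singleton comparisons. A small demonstration (the class is contrived purely for illustration):

```python
class AlwaysEqual:
    # Overridden equality claims to equal everything,
    # so `== None` gives a misleading answer.
    def __eq__(self, other):
        return True

def pick_hooks(hooks):
    # The corrected pattern: identity comparison against None.
    return [5, 11, 17, 23] if hooks is None else hooks

obj = AlwaysEqual()
misleading = (obj == None)  # True, despite obj not being None
correct = (obj is None)     # False
```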

Comment on lines -331 to +326
- hooks = [2, 5, 8, 11] if hooks == None else hooks
+ hooks = [2, 5, 8, 11] if hooks is None else hooks

Function _make_pretrained_vitb16_384 refactored with the following changes:

- state_dict = {}
- for k, v in checkpoint['state_dict'].items():
-     state_dict[k[6:]] = v
+ state_dict = {k[6:]: v for k, v in checkpoint['state_dict'].items()}

Function DPT.__init__ refactored with the following changes:
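The loop-to-dict-comprehension rewrite can be sketched in isolation; the six-character slice here assumes a fixed key prefix such as `model.` (an assumption for illustration, not confirmed by the PR):

```python
def strip_prefix(state_dict, prefix_len=6):
    # Build the renamed dict in one expression instead of
    # initialising an empty dict and filling it in a loop.
    return {k[prefix_len:]: v for k, v in state_dict.items()}
```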

Comment on lines -134 to +132
- print(f'[INFO] loading image...')
+ print('[INFO] loading image...')

Lines 134-165 refactored with the following changes:
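Several changes in this range drop the `f` prefix from strings that contain no placeholders: an f-string without interpolation is just a plain literal with misleading extra ceremony. The names below are illustrative:

```python
stage = "image"
# No placeholders: the plain literal is equivalent to the f-string.
plain = '[INFO] loading image...'
# With interpolation, the f-prefix earns its keep.
formatted = f'[INFO] loading {stage}...'
```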

- print('| {} |'.format(line.ljust(50)))
+ print(f'| {line.ljust(50)} |')

Lines 89-89 refactored with the following changes:

Comment on lines -22 to +28
- paths = sorted(glob.glob(r"%s\\Microsoft Visual Studio\\*\\%s\\VC\\Tools\\MSVC\\*\\bin\\Hostx64\\x64" % (program_files, edition)), reverse=True)
- if paths:
+ if paths := sorted(
+     glob.glob(
+         r"%s\\Microsoft Visual Studio\\*\\%s\\VC\\Tools\\MSVC\\*\\bin\\Hostx64\\x64"
+         % (program_files, edition)
+     ),
+     reverse=True,
+ ):

Function find_cl_path refactored with the following changes:
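The rewrite uses the walrus operator (`:=`, Python 3.8+) to bind the sorted glob result and test its truthiness in a single `if`. A file-system-free sketch of the same pattern, with illustrative names:

```python
def newest_version(candidates):
    # Bind and test in one expression: an empty list is falsy,
    # so the `if` body only runs when something was found.
    if versions := sorted(candidates, reverse=True):
        return versions[0]
    return None
```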

Comment on lines -23 to +29
- paths = sorted(glob.glob(r"%s\\Microsoft Visual Studio\\*\\%s\\VC\\Tools\\MSVC\\*\\bin\\Hostx64\\x64" % (program_files, edition)), reverse=True)
- if paths:
+ if paths := sorted(
+     glob.glob(
+         r"%s\\Microsoft Visual Studio\\*\\%s\\VC\\Tools\\MSVC\\*\\bin\\Hostx64\\x64"
+         % (program_files, edition)
+     ),
+     reverse=True,
+ ):

Function find_cl_path refactored with the following changes:

Comment on lines -21 to +27
- paths = sorted(glob.glob(r"%s\\Microsoft Visual Studio\\*\\%s\\VC\\Tools\\MSVC\\*\\bin\\Hostx64\\x64" % (program_files, edition)), reverse=True)
- if paths:
+ if paths := sorted(
+     glob.glob(
+         r"%s\\Microsoft Visual Studio\\*\\%s\\VC\\Tools\\MSVC\\*\\bin\\Hostx64\\x64"
+         % (program_files, edition)
+     ),
+     reverse=True,
+ ):

Function find_cl_path refactored with the following changes:

Comment on lines +197 to -204
+ if self.embeddings.grad is None:
+     raise ValueError('grad is None, should be called after loss.backward() and before optimizer.step()!')
+
  # level-wise meaned weight decay (ref: zip-nerf)

  B = self.embeddings.shape[0]  # size of embedding
  C = self.embeddings.shape[1]  # embedding dim for each level
  L = self.offsets.shape[0] - 1  # level

- if self.embeddings.grad is None:
-     raise ValueError('grad is None, should be called after loss.backward() and before optimizer.step()!')

Function GridEncoder.grad_weight_decay refactored with the following changes:

Comment on lines -22 to +28
- paths = sorted(glob.glob(r"%s\\Microsoft Visual Studio\\*\\%s\\VC\\Tools\\MSVC\\*\\bin\\Hostx64\\x64" % (program_files, edition)), reverse=True)
- if paths:
+ if paths := sorted(
+     glob.glob(
+         r"%s\\Microsoft Visual Studio\\*\\%s\\VC\\Tools\\MSVC\\*\\bin\\Hostx64\\x64"
+         % (program_files, edition)
+     ),
+     reverse=True,
+ ):

Function find_cl_path refactored with the following changes:

- print(f'[INFO] loading DeepFloyd IF-I-XL...')
+ print('[INFO] loading DeepFloyd IF-I-XL...')

Function IF.__init__ refactored with the following changes:

- embeddings = self.text_encoder(inputs.input_ids.to(self.device))[0]
-
- return embeddings
+ return self.text_encoder(inputs.input_ids.to(self.device))[0]

Function IF.get_text_embeds refactored with the following changes:

Comment on lines -328 to +346
- print(f'[INFO] loading model ...')
+ print('[INFO] loading model ...')
  zero123 = Zero123(device, opt.fp16, opt=opt)

- print(f'[INFO] running model ...')
+ print('[INFO] running model ...')

Lines 328-331 refactored with the following changes:

Comment on lines -42 to +43
- config = list(train_dir.rglob(f"*-project.yaml"))
- assert len(ckpt) > 0, f"didn't find any config in {train_dir}"
+ config = list(train_dir.rglob("*-project.yaml"))
+ assert ckpt, f"didn't find any config in {train_dir}"

Function load_training_dir refactored with the following changes:
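`assert ckpt` relies on empty sequences being falsy, making it equivalent to `assert len(ckpt) > 0` but shorter and more idiomatic. A sketch with an illustrative helper:

```python
def first_config(configs):
    # An empty list is falsy, so the assertion fires; otherwise
    # this behaves identically to `assert len(configs) > 0`.
    assert configs, "didn't find any config"
    return configs[0]

empty_is_falsy = not []
```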

Comment on lines -22 to +29
-         self.last_lr = lr
-         return lr
      else:
          t = (n - self.lr_warm_up_steps) / (self.lr_max_decay_steps - self.lr_warm_up_steps)
          t = min(t, 1.0)
          lr = self.lr_min + 0.5 * (self.lr_max - self.lr_min) * (1 + np.cos(t * np.pi))
-         self.last_lr = lr
-         return lr
+     self.last_lr = lr
+     return lr

Function LambdaWarmUpCosineScheduler.schedule refactored with the following changes:
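Both branches ended with the same two lines (`self.last_lr = lr; return lr`), so the rewrite hoists that common tail below the `if`/`else`. A runnable sketch of the same scheduler shape, with simplified assumptions (scalar parameters rather than per-cycle lists, `math` instead of `numpy`):

```python
import math

class CosineSchedule:
    def __init__(self, lr_min, lr_max, warm_up, max_steps):
        self.lr_min, self.lr_max = lr_min, lr_max
        self.warm_up, self.max_steps = warm_up, max_steps
        self.last_lr = 0.0

    def __call__(self, n):
        if n < self.warm_up:
            # Linear warm-up toward lr_max.
            lr = (self.lr_max - self.lr_min) / self.warm_up * n + self.lr_min
        else:
            # Cosine decay from lr_max back to lr_min.
            t = min((n - self.warm_up) / (self.max_steps - self.warm_up), 1.0)
            lr = self.lr_min + 0.5 * (self.lr_max - self.lr_min) * (1 + math.cos(t * math.pi))
        # Common tail written once, after the branch.
        self.last_lr = lr
        return lr
```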

Comment on lines -53 to -57
- interval = 0
- for cl in self.cum_cycles[1:]:
+ for interval, cl in enumerate(self.cum_cycles[1:]):
      if n <= cl:
          return interval
-     interval += 1

Function LambdaWarmUpCosineScheduler2.find_in_interval refactored with the following changes:
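`enumerate` replaces the manually initialised and incremented counter. A standalone version mirroring the shape of `find_in_interval` (the cumulative cycle boundaries below are illustrative):

```python
def find_in_interval(n, cum_cycles):
    # enumerate yields (index, boundary) pairs, so no separate
    # counter needs to be initialised and incremented.
    for interval, upper in enumerate(cum_cycles[1:]):
        if n <= upper:
            return interval
```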

Comment on lines -67 to +71
-         self.last_f = f
-         return f
      else:
          t = (n - self.lr_warm_up_steps[cycle]) / (self.cycle_lengths[cycle] - self.lr_warm_up_steps[cycle])
          t = min(t, 1.0)
          f = self.f_min[cycle] + 0.5 * (self.f_max[cycle] - self.f_min[cycle]) * (1 + np.cos(t * np.pi))
-         self.last_f = f
-         return f
+     self.last_f = f
+     return f

Function LambdaWarmUpCosineScheduler2.schedule refactored with the following changes:

-         self.last_f = f
-         return f
      else:
          f = self.f_min[cycle] + (self.f_max[cycle] - self.f_min[cycle]) * (self.cycle_lengths[cycle] - n) / (self.cycle_lengths[cycle])
-         self.last_f = f
-         return f
+     self.last_f = f
+     return f

Function LambdaLinearScheduler.schedule refactored with the following changes:

- txts = list()
+ txts = []

Function log_txt_as_img refactored with the following changes:

- return (len(x.shape) == 4) and (x.shape[1] == 3 or x.shape[1] == 1)
+ return len(x.shape) == 4 and x.shape[1] in [3, 1]

Function isimage refactored with the following changes:
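The chained `or` comparison collapses to a membership test. A sketch on plain tuples rather than tensors, so `x.shape` becomes an explicit `(N, C, H, W)` shape tuple:

```python
def is_image_shape(shape):
    # A 4-D batch whose channel dimension is RGB (3) or grayscale (1);
    # `in [3, 1]` replaces `shape[1] == 3 or shape[1] == 1`.
    return len(shape) == 4 and shape[1] in [3, 1]
```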

Comment on lines -126 to +137
- if not 0.0 <= lr:
-     raise ValueError("Invalid learning rate: {}".format(lr))
- if not 0.0 <= eps:
-     raise ValueError("Invalid epsilon value: {}".format(eps))
+ if lr < 0.0:
+     raise ValueError(f"Invalid learning rate: {lr}")
+ if eps < 0.0:
+     raise ValueError(f"Invalid epsilon value: {eps}")
  if not 0.0 <= betas[0] < 1.0:
-     raise ValueError("Invalid beta parameter at index 0: {}".format(betas[0]))
+     raise ValueError(f"Invalid beta parameter at index 0: {betas[0]}")
  if not 0.0 <= betas[1] < 1.0:
-     raise ValueError("Invalid beta parameter at index 1: {}".format(betas[1]))
+     raise ValueError(f"Invalid beta parameter at index 1: {betas[1]}")
- if not 0.0 <= weight_decay:
-     raise ValueError("Invalid weight_decay value: {}".format(weight_decay))
+ if weight_decay < 0.0:
+     raise ValueError(f"Invalid weight_decay value: {weight_decay}")
  if not 0.0 <= ema_decay <= 1.0:
-     raise ValueError("Invalid ema_decay value: {}".format(ema_decay))
+     raise ValueError(f"Invalid ema_decay value: {ema_decay}")

Function AdamWwithEMAandWings.__init__ refactored with the following changes:

- print("Deleting key {} from state_dict.".format(k))
+ print(f"Deleting key {k} from state_dict.")

Function VQModel.init_from_ckpt refactored with the following changes:

Comment on lines -348 to +346
- x = x.permute(0, 3, 1, 2).to(memory_format=torch.contiguous_format).float()
- return x
+ return x.permute(0, 3, 1, 2).to(memory_format=torch.contiguous_format).float()

Function AutoencoderKL.get_input refactored with the following changes:

Comment on lines -402 to +399
- log = dict()
+ log = {}

Function AutoencoderKL.log_images refactored with the following changes:

Comment on lines -438 to +435
- if self.vq_interface:
-     return x, None, [None, None, None]
- return x
+ return (x, None, [None, None, None]) if self.vq_interface else x

Function IdentityFirstStage.quantize refactored with the following changes:

- print("Deleting key {} from state_dict.".format(k))
+ print(f"Deleting key {k} from state_dict.")

Function NoisyLatentImageClassifier.init_from_ckpt refactored with the following changes:

Comment on lines -142 to +146
- for down in range(self.numd):
+ for _ in range(self.numd):
      h, w = targets.shape[-2:]
      targets = F.interpolate(targets, size=(h // 2, w // 2), mode='nearest')

  # targets = rearrange(targets,'b c h w -> b h w c')

Function NoisyLatentImageClassifier.get_conditioning refactored with the following changes:
