update docs
Signed-off-by: Zhiyuan Chen <[email protected]>
ZhiyuanChen committed Aug 21, 2024
1 parent e7c37b5 commit 5ae148c
Showing 3 changed files with 19 additions and 6 deletions.
8 changes: 7 additions & 1 deletion README.md
@@ -65,13 +65,19 @@ We also introduce [`Variable`][chanfig.Variable] to allow sharing a value across
[`FlatDict`][chanfig.FlatDict] supports variable interpolation.
Set a member's value to another member's name wrapped in `${}`, then call the [`interpolate`][chanfig.FlatDict.interpolate] method. The value of this member will be automatically replaced with the value of the referenced member.
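A minimal sketch of the interpolation flow described above (the constructor keywords and attribute-style access are assumptions based on the surrounding docs, not part of this commit):

```python
from chanfig import FlatDict

# Member `b` references member `a` via the `${}` syntax described above;
# calling `interpolate` is assumed to resolve the reference in place.
d = FlatDict(a=1, b="${a}")
d.interpolate()
print(d.b)  # expected to hold the value of `a`, i.e. 1
```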

[`dict`][dict] in Python is ordered since Python 3.7, but there isn't a built-in method to help you sort a [`dict`][dict]. [`FlatDict`][chanfig.FlatDict]supports [`sort`][chanfig.FlatDict.sort] to help you manage your dict.
[`dict`][dict] in Python is ordered since Python 3.7, but there isn't a built-in method to help you sort a [`dict`][dict]. [`FlatDict`][chanfig.FlatDict] supports [`sort`][chanfig.FlatDict.sort] to help you manage your dict.
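As a hedged illustration, a small sketch of [`sort`][chanfig.FlatDict.sort], assuming it orders entries by key in place by default:

```python
from chanfig import FlatDict

# Sort the dict by key; the default ordering is an assumption here.
d = FlatDict(banana=2, apple=1)
d.sort()
print(list(d.keys()))  # ['apple', 'banana'] under that assumption
```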

[`FlatDict`][chanfig.FlatDict] incorporates a [`merge`][chanfig.FlatDict.merge] method that allows you to merge a `Mapping`, an `Iterable`, or a path into the [`FlatDict`][chanfig.FlatDict].
Unlike the built-in [`update`][dict.update], [`merge`][chanfig.FlatDict.merge] assigns values instead of replacing them, which makes it work better with [`DefaultDict`][chanfig.DefaultDict].
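A short sketch of [`merge`][chanfig.FlatDict.merge] under the description above (the exact semantics for nested members are not shown here):

```python
from chanfig import FlatDict

# Merge a plain Mapping into an existing FlatDict; existing keys are assigned
# new values and unseen keys are added (per the paragraph above).
d = FlatDict(a=1, b=2)
d.merge({"b": 3, "c": 4})
print(d.a, d.b, d.c)  # 1 3 4 under these assumptions
```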

Moreover, [`FlatDict`][chanfig.FlatDict] comes with [`difference`][chanfig.FlatDict.difference] and [`intersect`][chanfig.FlatDict.intersect], which make it very easy to compare a [`FlatDict`][chanfig.FlatDict] with another `Mapping`, an `Iterable`, or a path.
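For illustration, a sketch of comparing a [`FlatDict`][chanfig.FlatDict] with a plain `Mapping`; the exact shape of the returned objects is an assumption:

```python
from chanfig import FlatDict

# `difference` is assumed to report entries that differ between the two,
# while `intersect` is assumed to report entries they share.
d = FlatDict(a=1, b=2, c=3)
other = {"b": 2, "c": 30, "d": 4}
print(d.difference(other))
print(d.intersect(other))
```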

#### Dataclass Operations

[`FlatDict`][chanfig.FlatDict] is compatible with [PEP 557](https://peps.python.org/pep-0557/): it inspects the type annotations in subclasses and adds them as members of the [`FlatDict`][chanfig.FlatDict].
It also features a [`validate`][chanfig.FlatDict.validate] method, called internally, that validates the types of the members.
Even better, if a member of the [`FlatDict`][chanfig.FlatDict] has a type annotation, the value will be automatically converted to the annotated type when it is set.
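A hypothetical subclass sketching the behaviour described above; the class name and fields are made up for illustration, and the conversion-on-assignment follows the paragraph rather than a verified signature:

```python
from chanfig import FlatDict

class ServerConfig(FlatDict):  # hypothetical subclass for illustration
    host: str = "127.0.0.1"
    port: int = 8080

config = ServerConfig()
config.port = "80"   # expected to be converted to int per the docs above
config.validate()    # normally called internally; shown here explicitly
print(config.host, config.port)
```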

#### ML Operations

[`FlatDict`][chanfig.FlatDict] supports a [`to`][chanfig.FlatDict.to] method similar to PyTorch Tensor's.
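A sketch of how [`to`][chanfig.FlatDict.to] might be used, assuming it mirrors `Tensor.to` for tensor-valued members and returns the converted dict; this usage is an assumption based on the comparison with PyTorch, not a verified signature:

```python
import torch
from chanfig import FlatDict

# Convert tensor members to a new dtype, mirroring Tensor.to (assumed).
d = FlatDict(weight=torch.randn(2, 2))
d = d.to(torch.float16)
print(d.weight.dtype)  # torch.float16 under that assumption
```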
6 changes: 6 additions & 0 deletions README.zh.md
@@ -68,6 +68,12 @@ Python's `dict` has been ordered since Python 3.7, but there is no built-in

Moreover, [`FlatDict`][chanfig.FlatDict] introduces [`difference`][chanfig.FlatDict.difference] and [`intersect`][chanfig.FlatDict.intersect], which make it very easy to compare a [`FlatDict`][chanfig.FlatDict] with another `Mapping`, an `Iterable`, or a path.

#### Dataclass Operations

[`FlatDict`][chanfig.FlatDict] follows [PEP 557](https://peps.python.org/pep-0557/): it inspects the type annotations of its subclasses and adds them as members of the [`FlatDict`][chanfig.FlatDict].
It also supports a [`validate`][chanfig.FlatDict.validate] method, called internally, that checks whether members conform to their type annotations.
Even better, if a member of the [`FlatDict`][chanfig.FlatDict] has a type annotation, it is automatically converted to the correct type when the value is assigned.

#### ML Operations

[`FlatDict`][chanfig.FlatDict] supports a [`to`][chanfig.FlatDict.to] method similar to PyTorch Tensor's.
11 changes: 6 additions & 5 deletions demo/config.py
@@ -20,22 +20,23 @@
from chanfig import Config, Variable


class DataloaderConfig:
class DataloaderConfig(Config):
    batch_size: int = 64
    num_workers: int = 4
    pin_memory: bool = True
    attribute = "None"  # this will not be copied to the config


class TestConfig(Config):
    name: str = "CHANfiG"
    seed: int = 1013
    activation: str = "GELU"
    dataloader: DataloaderConfig = DataloaderConfig()

    def __init__(self):
        super().__init__()
        dropout = Variable(0.1)
        self.name = "CHANfiG"
        self.seed = 1013
        self.activation = "GELU"
        self.optim.lr = 1e-3
        self.dataloader = DataloaderConfig()
        self.model.encoder.num_layers = 6
        self.model.decoder.num_layers = 6
        self.model.dropout = dropout
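For orientation, a hypothetical usage sketch (not part of this commit) that only touches attributes defined in the hunk above; the import path is an assumption for the demo module:

```python
from config import TestConfig  # hypothetical import path for demo/config.py

# Values follow directly from the class attributes and __init__ assignments
# shown in the diff above.
config = TestConfig()
print(config.name)                   # "CHANfiG"
print(config.dataloader.batch_size)  # 64
print(config.optim.lr)               # 0.001
```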
