Releases: speediedan/finetuning-scheduler
Fine-Tuning Scheduler Release 2.5.0
[2.5.0] - 2024-12-20
Added
- Support for Lightning and PyTorch `2.5.0`
- FTS support for PyTorch's composable distributed (e.g. `fully_shard`, `checkpoint`) and Tensor Parallelism (TP) APIs
- Support for Lightning's `ModelParallelStrategy`
- Experimental 'Auto' FSDP2 Plan Configuration feature, allowing application of the `fully_shard` API using module name/pattern-based configuration instead of manually inspecting modules and applying the API in `LightningModule.configure_model` (see the sketch after this list)
- FSDP2 'Auto' Plan Convenience Aliases, simplifying use of both composable and non-composable activation checkpointing APIs
- Flexible orchestration of advanced profiling combining multiple complementary PyTorch profilers with FTS's `MemProfiler`
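
For context on the 'Auto' plan configuration item above, below is a minimal, illustrative sketch of the manual FSDP2 workflow that feature is intended to simplify: inspecting modules and applying `fully_shard` by hand in `LightningModule.configure_model`. The module layout and names are hypothetical, the `fully_shard` import path may differ across PyTorch versions, and the distributed setup is assumed to be handled by Lightning's `ModelParallelStrategy`.

```python
# Illustrative sketch only -- NOT the FTS 'Auto' plan API itself.
# Shows the manual, per-module application of PyTorch's composable fully_shard API
# inside LightningModule.configure_model that the 'Auto' feature aims to replace.
import torch
import lightning.pytorch as pl
from torch.distributed._composable.fsdp import fully_shard  # import path may vary by PyTorch version


class LitToyModel(pl.LightningModule):
    def __init__(self) -> None:
        super().__init__()
        # Hypothetical model with repeated, shardable blocks
        self.model = torch.nn.Sequential(*[torch.nn.Linear(64, 64) for _ in range(4)])

    def configure_model(self) -> None:
        # Manually apply fully_shard to each block, then to the root module
        # (assumes a device mesh/process group established by the strategy).
        for block in self.model:
            fully_shard(block)
        fully_shard(self.model)
```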
Fixed
- Added logic to more robustly condition depth-aligned checkpoint metadata updates to address edge cases where `current_score` precisely equaled the `best_model_score` at multiple different depths. Resolved #15.
Deprecated
- As upstream PyTorch has deprecated official Anaconda channel builds, `finetuning-scheduler` will no longer be releasing conda builds. Installation of FTS via pip (irrespective of the virtual environment used) is the recommended installation approach.
- Removed support for PyTorch `2.1`
Thanks to the following users/contributors for their feedback and/or contributions in this release:
@CyprienRicque
Fine-Tuning Scheduler Release 2.4.0
[2.4.0] - 2024-08-15
Added
- Support for Lightning and PyTorch `2.4.0`
- Support for Python `3.12`
Changed
- Changed the default value of the FTS callback constructor's `frozen_bn_track_running_stats` option to `True`.
Deprecated
- Removed support for PyTorch `2.0`
- Removed support for Python `3.8`
Fine-Tuning Scheduler Patch Release 2.3.3
[2.3.3] - 2024-07-09
- Support for Lightning <= `2.3.3` (includes critical security fixes) and PyTorch <= `2.3.1`
Fine-Tuning Scheduler Release 2.3.2
[2.3.2] - 2024-07-08
- Support for Lightning <= `2.3.2` and PyTorch <= `2.3.1`
Thanks to the following users/contributors for their feedback and/or contributions in this release:
@josedvq
Fine-Tuning Scheduler Feature Teaser Release 2.3.0
Note
Because Lightning is not currently planning an official `2.3.0` release, this FTS release is marked as a pre-release and pins a lightning `2.3.0dev` commit. A return to the normal Lightning cadence is expected with `2.4.0`, and FTS will release accordingly. Installation of this FTS pre-release can either follow the normal installation from source or use the release archive, e.g.:
```bash
export FTS_VERSION=2.3.0 && \
wget https://github.com/speediedan/finetuning-scheduler/releases/download/v${FTS_VERSION}-rc1/finetuning_scheduler-${FTS_VERSION}rc1.tar.gz && \
pip install finetuning_scheduler-${FTS_VERSION}rc1.tar.gz
```
[2.3.0] - 2024-05-17
Added
- Support for Lightning and PyTorch `2.3.0`
- Introduced the `frozen_bn_track_running_stats` option to the FTS callback constructor, allowing the user to override the default Lightning behavior that disables `track_running_stats` when freezing BatchNorm layers (see the usage sketch below). Resolves #13.
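
A minimal, hedged usage sketch of the new option (the `Trainer` setup is illustrative and the import path assumes the standard `finetuning_scheduler` package namespace):

```python
from lightning.pytorch import Trainer
from finetuning_scheduler import FinetuningScheduler

# Keep track_running_stats enabled on frozen BatchNorm layers, overriding the
# default behavior described above (option and default per this release).
fts = FinetuningScheduler(frozen_bn_track_running_stats=True)
trainer = Trainer(callbacks=[fts])
```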
Deprecated
- Removed support for PyTorch `1.13`
Fine-Tuning Scheduler Patch Release 2.2.4
[2.2.4] - 2024-05-04
Added
- Support for Lightning `2.2.4` and PyTorch `2.2.2`
Fine-Tuning Scheduler Patch Release 2.2.1
[2.2.1] - 2024-03-04
Added
- Support for Lightning `2.2.1`
Fine-Tuning Scheduler Release 2.2.0
[2.2.0] - 2024-02-08
Added
- Support for Lightning and PyTorch `2.2.0`
- FTS now inspects any base `EarlyStopping` or `ModelCheckpoint` configuration passed in by the user and applies that configuration when instantiating the required FTS callback dependencies (i.e., `FTSEarlyStopping` or `FTSCheckpoint`); see the sketch below. Part of the resolution to #12.
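
For illustration, a hedged sketch of the configuration carry-over described above, using standard Lightning callbacks (monitor names and thresholds are hypothetical):

```python
from lightning.pytorch import Trainer
from lightning.pytorch.callbacks import EarlyStopping, ModelCheckpoint
from finetuning_scheduler import FinetuningScheduler

# Per this release, FTS inspects the base EarlyStopping/ModelCheckpoint
# configuration below and applies it when instantiating its FTSEarlyStopping
# and FTSCheckpoint dependencies.
trainer = Trainer(
    callbacks=[
        FinetuningScheduler(),
        EarlyStopping(monitor="val_loss", patience=3),
        ModelCheckpoint(monitor="val_loss", save_top_k=2),
    ],
)
```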
Changed
- Updated reference to the renamed `FSDPPrecision`
- Increased the `jsonargparse` minimum supported version to `4.26.1`
Fixed
- Explicitly `rank_zero_only`-guarded `ScheduleImplMixin.save_schedule` and `ScheduleImplMixin.gen_ft_schedule`. Some codepaths were incorrectly invoking them from non-`rank_zero_only`-guarded contexts. Resolved #11.
- Added a note in the documentation indicating more clearly the behavior of FTS when no monitor metric configuration is provided. Part of the resolution to #12.
Deprecated
- Removed support for PyTorch `1.12`
- Removed legacy FTS examples
Thanks to the following users/contributors for their feedback and/or contributions in this release:
@Davidham3 @jakubMitura14
Fine-Tuning Scheduler Patch Release 2.1.4
[2.1.4] - 2024-02-02
Added
- Support for Lightning `2.1.4`
Changed
- Bumped the `sphinx` requirement to `>5.0,<6.0`
Deprecated
- Removed deprecated lr `verbose` init param usage
- Removed deprecated `tensorboard.dev` references
Fine-Tuning Scheduler Release 2.1.3
[2.1.3] - 2023-12-21
Added
- Support for Lightning `2.1.3`