
Releases: filrg/split_learning

v3.1.0

20 Mar 08:49


What's Changed

New features:

  • Clustering by device performance (see the sketch after this list)
  • Remove hard-coded values
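The snippet below is a minimal sketch of the idea behind performance-based clustering, assuming each client reports a profiled per-batch training time. The function name, the single-feature choice, and the use of scikit-learn's KMeans are illustrative assumptions, not the repository's actual implementation.

```python
# Hypothetical sketch: group clients into performance clusters using
# k-means on their profiled per-batch training times.
import numpy as np
from sklearn.cluster import KMeans

def cluster_clients_by_performance(batch_times: dict, n_clusters: int = 2) -> dict:
    """Map each client id to a performance-cluster label."""
    client_ids = list(batch_times)
    features = np.array([[batch_times[c]] for c in client_ids])  # 1-D feature: seconds per batch
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(features)
    return {c: int(label) for c, label in zip(client_ids, labels)}

# Example: slow devices land in one cluster, fast devices in the other.
print(cluster_clients_by_performance({"pi-1": 0.92, "pi-2": 0.88, "gpu-1": 0.11, "gpu-2": 0.09}))
```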

Full Changelog: v3.0.0...v3.1.0

v3.0.0

11 Mar 04:44


New features:

  • Cluster clients by device
  • Multiple partition points for each cluster (see the sketch after this list)

Enhancements:

  • Change queue messages
  • Remove clustering by data
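The snippet below is a minimal sketch of per-cluster partition points, assuming a torch.nn.Sequential model and a hypothetical CUT_POINT_PER_CLUSTER mapping; it illustrates the idea rather than the repository's actual code.

```python
# Hypothetical sketch: each device cluster gets its own partition (cut) point,
# and the model is split there into a client part and a server part.
import torch.nn as nn

CUT_POINT_PER_CLUSTER = {0: 2, 1: 4}  # cluster label -> index of the first server-side layer

def split_model(model: nn.Sequential, cluster: int):
    cut = CUT_POINT_PER_CLUSTER[cluster]
    client_part = nn.Sequential(*list(model.children())[:cut])
    server_part = nn.Sequential(*list(model.children())[cut:])
    return client_part, server_part

model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 128), nn.ReLU(), nn.Linear(128, 10))
client_part, server_part = split_model(model, cluster=0)  # the slow cluster keeps only the first 2 layers
```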

Full Changelog: v2.0.1...v3.0.0

v2.0.0

17 Dec 07:23


New features:

  • Consolidate weights into a single *.pth file (see the sketch after this list).
  • Add event_time to support debugging events by time.
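The snippet below is a minimal sketch covering both features, assuming the client-side and server-side halves are torch modules; the key prefixes, checkpoint layout, and filename are assumptions, not the repository's actual format.

```python
# Hypothetical sketch: merge the client and server state_dicts into one *.pth
# file and record an event_time to help debug events by time.
import time
import torch

def save_full_checkpoint(client_part, server_part, path="model_full.pth"):
    checkpoint = {
        "state_dict": {
            **{f"client.{k}": v for k, v in client_part.state_dict().items()},
            **{f"server.{k}": v for k, v in server_part.state_dict().items()},
        },
        "event_time": time.time(),  # timestamp for time-based debugging
    }
    torch.save(checkpoint, path)
```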

Enhancements:

  • Modify the data loading mechanism to ensure all clients start training at the same time (see the sketch after this list).
  • Separate server-side validation into a dedicated class.
  • Remove client-side validation.
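The snippet below is a minimal sketch of the synchronized-start idea only: each client builds its DataLoader up front, reports readiness, and begins training once a start signal arrives. The `send_ready` and `wait_for_start` callables are hypothetical stand-ins for whatever queue messaging the project actually uses.

```python
# Hypothetical sketch: do the heavy data-loading setup before the barrier so
# every client enters the training loop at the same time.
from torch.utils.data import DataLoader

def run_client(dataset, send_ready, wait_for_start, train_one_epoch, epochs=1, batch_size=32):
    loader = DataLoader(dataset, batch_size=batch_size, shuffle=True)  # prepared before signalling
    send_ready()        # tell the server this client is ready
    wait_for_start()    # block until every client has reported ready
    for _ in range(epochs):
        train_one_epoch(loader)
```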

v1.8.0

01 Dec 14:55


Split Learning v1.8.0

  • Fix loading a state_dict from CPU to CUDA
  • Detect NaN loss
  • Add training and test results; retrain when a result is false
  • Average parameters weighted by data size (see the sketch after this list)
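The snippet below is a minimal sketch of data-size-weighted parameter averaging and NaN-loss detection; the function names are illustrative assumptions, and the final comment shows the standard PyTorch map_location argument relevant to the CPU/CUDA loading fix, not the repository's exact code.

```python
# Hypothetical sketch, not the repository's exact routines.
import torch

def weighted_average(state_dicts, data_sizes):
    """Average client state_dicts, weighting each client by its data size."""
    total = float(sum(data_sizes))
    return {
        key: sum(sd[key].float() * (n / total) for sd, n in zip(state_dicts, data_sizes))
        for key in state_dicts[0]
    }

def loss_is_valid(loss: torch.Tensor) -> bool:
    """Detect a NaN loss so the round can be rejected and retrained."""
    return not torch.isnan(loss).any().item()

# Loading a checkpoint onto a CUDA device uses map_location, e.g.:
# state = torch.load("model_full.pth", map_location="cuda")
```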

Full Changelog: v1.7.1...v1.8.0

v1.7.1

27 Nov 13:59


Split Learning v1.7.1

  • Split the data across clients (see the sketch after this list)
  • Log training time
  • Add a profiling and partition algorithm
  • Remove --num_layers on the client side
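The snippet below is a minimal sketch of splitting one dataset evenly across clients with torch.utils.data.random_split; the repository's actual splitter may differ, for example by using the profiling results to make non-uniform splits.

```python
# Hypothetical sketch: deterministic even split of one dataset across clients.
import torch
from torch.utils.data import random_split

def split_for_clients(dataset, num_clients, seed=42):
    sizes = [len(dataset) // num_clients] * num_clients
    sizes[-1] += len(dataset) - sum(sizes)  # give the remainder to the last client
    return random_split(dataset, sizes, generator=torch.Generator().manual_seed(seed))
```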

v1.6.1

24 Nov 17:00


Split Learning v1.6.1

  • Allow partition configuration
  • Server-only configuration
  • Select models by name (see the sketch after this list)
  • Refactor and package the server structure
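The snippet below is a minimal sketch of name-based model selection via a registry; the registry contents, model names, and architectures are illustrative assumptions, not the repository's actual model list.

```python
# Hypothetical sketch: look up a model constructor by name.
import torch.nn as nn

MODEL_REGISTRY = {
    "mlp": lambda: nn.Sequential(nn.Flatten(), nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 10)),
    "linear": lambda: nn.Sequential(nn.Flatten(), nn.Linear(784, 10)),
}

def build_model(name: str) -> nn.Module:
    try:
        return MODEL_REGISTRY[name]()
    except KeyError:
        raise ValueError(f"Unknown model '{name}'. Choose from: {sorted(MODEL_REGISTRY)}")
```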

v1.3.1

19 Nov 05:29


v1.3.1 Pre-release

The stable version of Split Learning

  • Manual configuration on the server and clients
  • Explicitly declare cut layer points for each model class (see the sketch after this list)
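The snippet below is a minimal sketch of declaring cut-layer points on a model class; the attribute name CUT_LAYERS and the example architecture are assumptions made for illustration, not the repository's actual classes.

```python
# Hypothetical sketch: each model class declares where it may be split.
import torch.nn as nn

class SmallCNN(nn.Module):
    CUT_LAYERS = [2, 4]  # indices within self.layers at which this class may be cut

    def __init__(self):
        super().__init__()
        self.layers = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.Flatten(), nn.Linear(32 * 28 * 28, 10),
        )

    def forward(self, x):
        return self.layers(x)
```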