
question about multilayer PFNM #1

Open
SC19099 opened this issue Aug 11, 2020 · 2 comments

SC19099 commented Aug 11, 2020

When I set the local networks to be multilayer (for example, two hidden layers with 100 neurons each) in experiment.py and run the script, I find that only the size of the first hidden layer of the global network is right; the sizes of the other hidden layers stay at 100! I have tried to find out why, but I can't work it out.
I would appreciate it if someone could help me with this problem!
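
For concreteness, here is a minimal sketch of the kind of local network I mean (illustrative PyTorch only, not the actual model code in this repo; the layer sizes follow the example above):

```python
# Illustrative sketch -- the real local model is built from experiment.py's
# --net_config argument; this just shows what "784, 100, 100, 10" means to me:
# 784 inputs, two hidden layers of 100 neurons each, 10 outputs.
import torch.nn as nn

net_config = [784, 100, 100, 10]

layers = []
for in_dim, out_dim in zip(net_config[:-1], net_config[1:]):
    layers.append(nn.Linear(in_dim, out_dim))
    layers.append(nn.ReLU())
local_net = nn.Sequential(*layers[:-1])  # drop the ReLU after the output layer

print(local_net)
```

My expectation is that after matching, every hidden layer of the global network can grow beyond the local width of 100, but only the first one does.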

MayankAgarwal (Member) commented

Hi, I'm not sure I understand the issue fully. Could you please provide the following to help me diagnose it:

  1. The command to reproduce the experiment, e.g. `python experiment.py --args....`
  2. The output log file, and
  3. The expected output and the output PFNM actually produces.

Thanks!

SC19099 commented Aug 14, 2020


Hello! First, thanks for your reply. When I run single-layer PFNM the test accuracy is quite good, but when I run multilayer PFNM the performance is very poor. I will answer your three questions in order below.

1. Set the number of hidden layers to 2: `python experiment.py --logdir "logs/mnist_test" --dataset "mnist" --datadir "data/mnist/" --net_config "784, 100, 100, 10" --n_nets 10 --partition "homo" --experiment "u-ensemble,pdm,pdm_iterative" --lr 0.01 --epochs 10 --reg 1e-6 --communication_rounds 1 --lr_decay 0.99 --iter_epochs 5`

2. The log file is attached.

3. We can see that the test accuracy is only 0.8345, and the number of neurons in the second hidden layer is still 100, which means it didn't get trained! Through experiments, I find that the performance degrades rapidly as the number of hidden layers increases (e.g., when net_config is "784, 100, 100, 100, 100, 100, 100, 10", the test accuracy is only about 0.1...).
experiment_log-0-1.log
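
In case it helps, this is roughly how I checked the layer sizes. It assumes the matched global weights come back as a flat list of NumPy arrays [W0, b0, W1, b1, ...]; the function and variable names below are mine, not the repo's:

```python
import numpy as np

def report_layer_widths(global_weights):
    """Print the shape of each weight/bias pair in a [W0, b0, W1, b1, ...] list."""
    for i in range(0, len(global_weights), 2):
        W = np.asarray(global_weights[i])
        b = np.asarray(global_weights[i + 1])
        print(f"layer {i // 2}: weight shape {W.shape}, bias shape {b.shape}")
```

With the command above I would expect the weight shapes of both hidden layers to grow past 100 after matching, but in the attached log only the first hidden layer does.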
