
Commit 63294dd

refactor

1 parent 9afda1f commit 63294dd

28 files changed: +114076 additions, -114030 deletions

__init__.py

Whitespace-only changes.

config.py

Lines changed: 11 additions & 3 deletions
@@ -1,4 +1,4 @@
-from yacs.config import CfgNode as CN
+from yacs.config import CfgNode as CN
 import os.path as osp
 
 _C = CN()
@@ -13,6 +13,7 @@
 
 _C.DATA = CN()
 _C.DATA.USE_DATASET = "cifar10"  # "cifar10", "nuswide81", "coco"
+_C.DATA.DATA_ROOT = "./data/cifar10"
 _C.DATA.LABEL_DIM = 10
 _C.DATA.DB_SIZE = 54000
 _C.DATA.TEST_SIZE = 1000
@@ -28,7 +29,7 @@
 _C.TRAIN = CN()
 _C.TRAIN.USE_PRETRAIN = False
 _C.TRAIN.BATCH_SIZE = 64
-_C.TRAIN.ITERS = 10000
+_C.TRAIN.ITERS = 100000
 _C.TRAIN.CROSS_ENTROPY_ALPHA = 5
 _C.TRAIN.LR = 1e-4  # Initial learning rate
 _C.TRAIN.G_LR = 1e-4  # 1e-4
@@ -37,9 +38,16 @@
 _C.TRAIN.SAVE_FREQUENCY = 20000  # How frequently to save model
 _C.TRAIN.ACGAN_SCALE = 1.0
 _C.TRAIN.ACGAN_SCALE_G = 0.1
-_C.TRAIN.WGAN_SCALE = 1.0
+_C.TRAIN.WGAN_SCALE = 1.0
 _C.TRAIN.WGAN_SCALE_G = 1.0
 _C.TRAIN.NORMED_CROSS_ENTROPY = True
 _C.TRAIN.FAKE_RATIO = 1.0
 
 config = _C
+
+
+def update_and_inference(cfg_file):
+    config.merge_from_file(cfg_file)
+
+    config.freeze()
+    return config
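The new `update_and_inference` helper follows yacs' usual merge-then-freeze flow. A minimal runnable sketch of that pattern is below; since yacs may not be installed, a tiny dict-based `MiniCfg` stands in for `yacs.config.CfgNode` (with the real library, `merge_from_file` and `freeze` play the same roles):

```python
class MiniCfg(dict):
    """Toy stand-in for yacs.config.CfgNode (assumption: not the real API)."""

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.frozen = False

    def merge_from_dict(self, other):
        # Overlay each section's keys onto the existing config,
        # mirroring what CfgNode.merge_from_file does with a YAML file.
        for section, values in other.items():
            self.setdefault(section, {}).update(values)

    def freeze(self):
        # After freezing, the real CfgNode rejects further mutation.
        self.frozen = True


# Defaults, as in config.py.
config = MiniCfg({"DATA": {"USE_DATASET": "cifar10", "LABEL_DIM": 10}})


def update_and_inference(overrides):
    """Merge overrides into the global config, then freeze and return it."""
    config.merge_from_dict(overrides)
    config.freeze()
    return config


# Example: overriding the dataset, as a YAML file from config_yaml/ would.
cfg = update_and_inference({"DATA": {"USE_DATASET": "coco", "LABEL_DIM": 80}})
print(cfg["DATA"]["USE_DATASET"])  # -> coco
```

With yacs itself, the helper would be called with a YAML path (e.g. one of the files under `config_yaml/`) rather than a dict.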

config_yaml/cifar_step_1.yaml

Lines changed: 2 additions & 0 deletions
@@ -0,0 +1,2 @@
+DATA:
+  USE_DATASET: "cifar10"

config_yaml/cifar_step_2.yaml

Lines changed: 30 additions & 0 deletions
@@ -0,0 +1,30 @@
+MODEL:
+  ARCHITECTURE: "ALEXNET"
+  PRETRAINED_MODEL_PATH: "/home/liubin/HashGAN/models/cifar10_20181119_163535_D_LR_0.0001_G_LR_0.0001_ARCHITECTURE_NORM_PRETRAIN_False_PARTIAL_True_REALFAKE_False_NORMEDTrue_WGAN_SCALE_1.0_ACGAN_SCALE_1.0_ALPHA_5_FAKE_RATIO_1.0/iteration_99999.ckpt"
+DATA:
+  USE_DATASET: "cifar10"  # "cifar10", "nuswide81", "coco"
+  LABEL_DIM: 10
+  DB_SIZE: 54000
+  TEST_SIZE: 1000
+  WIDTH_HEIGHT: 32
+  OUTPUT_DIM: 3072  # Number of pixels (32*32*3)
+  MAP_R: 54000
+  OUTPUT_DIR: "./output/cifar10_finetune"
+  IMAGE_DIR: "./output/cifar10_finetune/images"
+  MODEL_DIR: "./output/cifar10_finetune/models"
+  LOG_DIR: "./output/cifar10_finetune/logs"
+
+TRAIN:
+  USE_PRETRAIN: True
+  BATCH_SIZE: 128
+  ITERS: 10000
+  CROSS_ENTROPY_ALPHA: 5
+  LR: 1e-4  # Initial learning rate
+  G_LR: 0.0  # 1e-4
+  DECAY: True  # Whether to decay LR over learning
+  N_CRITIC: 1  # Critic steps per generator steps
+  SAVE_FREQUENCY: 2000  # How frequently to save model
+  ACGAN_SCALE: 1.0
+  ACGAN_SCALE_G: 0.0
+  WGAN_SCALE: 0.0
+  WGAN_SCALE_G: 0.0
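Several of the values in this fine-tuning config are linked: per the inline comments, `OUTPUT_DIM` is the flattened pixel count (`WIDTH_HEIGHT * WIDTH_HEIGHT * 3` channels), and MAP@R is computed over the retrieval database, so `MAP_R` should not exceed `DB_SIZE`. A small sanity-check sketch over the values above (the checks themselves are inferences from those comments, not part of the commit):

```python
# Values copied from config_yaml/cifar_step_2.yaml above.
cfg = {
    "WIDTH_HEIGHT": 32,
    "OUTPUT_DIM": 3072,
    "DB_SIZE": 54000,
    "TEST_SIZE": 1000,
    "MAP_R": 54000,
}


def check_config(cfg):
    """Verify the relationships implied by the config's inline comments."""
    # 32 * 32 * 3 = 3072 pixels per image.
    assert cfg["OUTPUT_DIM"] == cfg["WIDTH_HEIGHT"] ** 2 * 3
    # MAP@R is evaluated against the database split.
    assert cfg["MAP_R"] <= cfg["DB_SIZE"]
    # Query and database splits are disjoint sizes here.
    assert cfg["TEST_SIZE"] < cfg["DB_SIZE"]
    return True


check_config(cfg)  # raises AssertionError if the YAML values drift apart
```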

data/cifar10/database.txt

Lines changed: 54000 additions & 0 deletions
Large diffs are not rendered by default.

data/cifar10/database_nolabel.txt

Lines changed: 54000 additions & 0 deletions
Large diffs are not rendered by default.

data/cifar10/test.txt

Lines changed: 1000 additions & 0 deletions
Large diffs are not rendered by default.

data/cifar10/train.txt

Lines changed: 5000 additions & 0 deletions
Large diffs are not rendered by default.
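The list-file sizes line up with the config: `database.txt` has 54000 lines (`DATA.DB_SIZE`), `test.txt` has 1000 (`DATA.TEST_SIZE`), and `train.txt` has 5000. The per-line format is an assumption here, since the large diffs are not rendered: a common deep-hashing layout is one image path followed by a 0/1 label vector, with `database_nolabel.txt` carrying paths only. Under that assumption, a parser sketch (the filename in the example is hypothetical):

```python
def parse_list_line(line):
    """Split one assumed list-file line into (image_path, label_vector)."""
    parts = line.split()
    path = parts[0]
    labels = [int(x) for x in parts[1:]]  # empty for *_nolabel.txt lines
    return path, labels


# Hypothetical line for a CIFAR-10 image of class 0 (10-dim one-hot label).
path, labels = parse_list_line(
    "data/cifar10/img_0001.png 1 0 0 0 0 0 0 0 0 0"
)
```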

0 commit comments
