
We need to batch data. #9

Open
taeguk opened this issue Sep 4, 2016 · 3 comments

Comments

@taeguk
Member

taeguk commented Sep 4, 2016

Running the model on all of the train or test data at once can be heavy.
It can cause a resource error because of limited memory.
So, we need to use a BATCH_SIZE.

For example, suppose the number of train examples is 100000.
With BATCH_SIZE = 1000, we can process them in 100 batches.

See https://github.com/taeguk/tensorflow-study/blob/master/MNIST/cnn.py#L159,L171
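The batching loop described above can be sketched as follows. This is a minimal illustration, not the code from the linked file; `run_in_batches` and the `sum(batch)` placeholder are hypothetical stand-ins for actually feeding a batch to the model:

```python
BATCH_SIZE = 1000

def run_in_batches(data, batch_size=BATCH_SIZE):
    """Process `data` in fixed-size slices instead of feeding it all at once."""
    results = []
    for start in range(0, len(data), batch_size):
        batch = data[start:start + batch_size]
        results.append(sum(batch))  # stand-in for running the model on one batch
    return results

data = list(range(100000))
batch_results = run_in_batches(data)
print(len(batch_results))  # 100000 / 1000 = 100 batches
```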

@taeguk taeguk changed the title from "We need to consider BATCH SIZE." to "We need to batch data." Sep 4, 2016
@taeguk
Member Author

taeguk commented Sep 4, 2016

If BATCH_SIZE is big, a resource error can occur.
If BATCH_SIZE is small, training and testing are very slow.

So, we have to determine a proper BATCH_SIZE.
I think there is no clear way.

def find_proper_batch_size():
    # Binary search for the largest batch size that still fits in memory.
    lh = 1
    rh = DATA_LEN
    batch_size = 0
    while lh <= rh:
        mid = (lh + rh) // 2
        # Try to build/run the model to check whether `mid` is a usable batch size.
        available = try_batch_size(mid)  # hypothetical memory check
        if available:
            lh = mid + 1
            batch_size = mid
        else:
            rh = mid - 1
    batch_size = max(batch_size * 9 // 10, 1)    # back off ~10% for a stable batch_size
    if batch_size < 5:
        raise BatchSizeException
    return batch_size

But, I think we can consider this approach.

Hmm....
It could be a burdensome job for us, though, since we are busy.
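The binary-search idea above can be exercised end-to-end with a simulated memory check. Here, `fits_in_memory` is a hypothetical stand-in for actually building the model and catching an out-of-memory error, with an assumed limit of 4096:

```python
DATA_LEN = 100000

def fits_in_memory(batch_size, limit=4096):
    # Hypothetical stand-in: a real check would try building/running the
    # model with this batch size and catch an out-of-memory error.
    return batch_size <= limit

def find_proper_batch_size(data_len=DATA_LEN):
    # Binary search for the largest batch size that fits in memory.
    lh, rh = 1, data_len
    batch_size = 0
    while lh <= rh:
        mid = (lh + rh) // 2
        if fits_in_memory(mid):
            batch_size = mid
            lh = mid + 1
        else:
            rh = mid - 1
    # Back off ~10% for headroom, keeping at least 1.
    batch_size = max(batch_size * 9 // 10, 1)
    if batch_size < 5:
        raise ValueError("could not find a usable batch size")
    return batch_size

print(find_proper_batch_size())  # -> 3686 (largest fitting size 4096, minus ~10%)
```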

@ovekyc
Member

ovekyc commented Sep 4, 2016

It's true we have to consider that.
I ran the code on MNIST's 10000 rows without batching, and my Mac really struggled.
So we have to introduce a batch size.
But I think not now; we need to concentrate on building the whole project first.

@taeguk
Member Author

taeguk commented Sep 5, 2016

Okay, I don't think it is urgent right now either.
Let's think about this issue later.
