5
pfulop
7y

Waiting 3 days for your graphics card to get through all the training data, just to see if you wasted your life on a bad architecture.

Comments
  • 1
    @hube Even when it comes to generative adversarial networks? I thought small datasets would just generate garbage.
  • 1
    @hube I do understand batching (also my GPU wouldn't handle all the images at once), and I know how the loss should behave. The question is what the result of a small dataset (not batch) would be when test-training the GAN. I thought it would just generate garbage. The periodic losses that I am printing for the generator and discriminator seem to be going down at pretty much the same rate. But even this doesn't mean there is no error that would generate a bunch of nonsense.
  • 1
    @hube Yes, but they presented it as state of the art for the mini-ImageNet one-shot classification dataset. If I am correct, this only works if the discriminator has learned multiple classes (which is not my case). Also, they test it against classes that are unknown to the discriminator. All in all, I don't think this would work in my case, but I may be wrong. Please correct me, I love being pushed to research and look up more and more (anything so my little GPU doesn't have to suffer).
  • 1
    Just out of curiosity, which card do you have?

    And welcome to devRant!
  • 0
    @hube I am using BEGAN (I know it was tested on generating faces, but I wanted to give it a shot). And I am working with Google's unique UIs dataset http://interactionmining.org/rico . But what I am finding after the first results is that the discriminator learns that anything with a top bar, a bottom bar, and a blank screen in between is valid (either I should choose just one category from the dataset or use a bigger resolution).
  • 1
    @TheCapeGreek GTX 1060 3GB
  • 1
    @pfulop I see. I've been eyeing the 6GB for my new rig. Figured ML loads would still take a while.
  • 1
    @TheCapeGreek I would go for at least a 1070 Ti; the 1060 was a mistake.
  • 1
    @pfulop Really? I'll look at the benchmarks. The 1060 vs the rest seems like a minuscule increase for the price.
  • 0
    @TheCapeGreek I am working with images right now, and more memory is always better. Also, it has roughly double the CUDA cores. Can you share the benchmark?
  • 1
    @pfulop Yeah you're right, 1070 Ti is far better.

    1070 Ti vs 1060 6GB and 3GB
    https://videocardbenchmark.net/comp...

    1070 Ti vs 1080 and 1080 Ti
    https://videocardbenchmark.net/comp...

    So basically 1070 Ti is the sweet spot for performance right now, since the 1080 Ti is $250 more for roughly 10% more performance.
  • 1
    @TheCapeGreek Yep. I would skip the 1080 and consider the 1080 Ti only because of the memory. Btw, I wish they had the same prices for them in my country 😢
  • 1
    @pfulop I agree. Customs and exchange rates here jack prices up insanely as well.
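The "test-train on a small dataset" sanity check discussed above can be illustrated on a toy problem: before committing days of GPU time, verify that the training loop can at least push the generator toward obviously learnable data while both losses stay finite. A minimal NumPy sketch (the 1-D data, the linear generator/discriminator, and all hyperparameters are invented for illustration; a real BEGAN setup on the Rico images looks nothing like this):

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

# Toy 1-D "GAN": real data ~ N(4, 0.5); generator g(z) = a*z + b;
# discriminator D(x) = sigmoid(w*x + c). Gradients are written out by hand.
a, b = 1.0, 0.0               # generator parameters
w, c = 0.0, 0.0               # discriminator parameters
lr, batch, steps, eps = 0.02, 64, 1500, 1e-8

for step in range(steps):
    real = rng.normal(4.0, 0.5, batch)
    z = rng.normal(0.0, 1.0, batch)
    fake = a * z + b

    # Discriminator step: minimize -log D(real) - log(1 - D(fake)).
    d_real = sigmoid(w * real + c)
    d_fake = sigmoid(w * fake + c)
    g_real = -(1.0 - d_real)              # d(-log D)/du on real samples
    g_fake = d_fake                       # d(-log(1-D))/du on fake samples
    w -= lr * (np.mean(g_real * real) + np.mean(g_fake * fake))
    c -= lr * (np.mean(g_real) + np.mean(g_fake))

    # Generator step: non-saturating loss, minimize -log D(fake).
    fake = a * z + b
    d_fake = sigmoid(w * fake + c)
    grad_x = -(1.0 - d_fake) * w          # gradient through D w.r.t. fake x
    a -= lr * np.mean(grad_x * z)
    b -= lr * np.mean(grad_x)

    if step % 250 == 0:                   # the periodic loss printout
        loss_d = -np.mean(np.log(d_real + eps)) - np.mean(np.log(1 - d_fake + eps))
        loss_g = -np.mean(np.log(d_fake + eps))
        print(f"step {step:4d}  loss_D {loss_d:.3f}  loss_G {loss_g:.3f}  gen offset {b:.2f}")
```

If the generator offset never drifts toward the real mean, or a loss collapses to 0 or blows up, the bug is in the loop itself and no amount of extra data (or GPU hours) will fix it.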