
jslade


Total Posts: 1136
Joined: Feb 2007
 
Posted: 2014-09-07 01:24
I've skimmed the Mumford and Desolneux book (recommended here on NP), which I assume is the "lighter" version of the same stuff. This sort of thing strikes me more as an attempt at systematization of ideas than as something you could consider "an idea."

The idea of DL is to make this redundant, with some successes. This has always been the connectionist ambition: "brain in a can." Of course, your DL gizmo will work better if you stick a HOG descriptor (sort of a Mumfordian idea) in the data pipeline, but the idea is to discover new features without so much human intervention.

This foray has been extremely educational for me: I've mostly ignored connectionist ideas until now. Some of them are pretty good.

Crossed with sv507:
FWIW there are quite a few "successful" architectures at this point; LSTMs are certainly of interest to this audience (I haven't looked at these yet). Also, restricted Boltzmann machines. One could characterize Deep Learning as being "the techniques needed to successfully train recurrent networks." I agree with you that many of them are ridiculously fine-tuned for the problem at hand.
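
For anyone else who hasn't looked at LSTMs yet: the core of a single cell is just four gates mixing the previous hidden state with the new input. A toy numpy sketch of the standard gate equations follows; the shapes and initialization are made up for illustration, and this is not any particular library's implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM time step. W maps the concatenated [h_prev, x]
    to the four gate pre-activations; b is the matching bias."""
    z = W @ np.concatenate([h_prev, x]) + b
    n = h_prev.shape[0]
    i = sigmoid(z[:n])        # input gate
    f = sigmoid(z[n:2*n])     # forget gate
    o = sigmoid(z[2*n:3*n])   # output gate
    g = np.tanh(z[3*n:])      # candidate cell update
    c = f * c_prev + i * g    # new cell state
    h = o * np.tanh(c)        # new hidden state
    return h, c

# toy run: hidden size 4, input size 3, length-5 sequence
rng = np.random.default_rng(0)
n_h, n_x = 4, 3
W = rng.standard_normal((4 * n_h, n_h + n_x)) * 0.1
b = np.zeros(4 * n_h)
h, c = np.zeros(n_h), np.zeros(n_h)
for x in rng.standard_normal((5, n_x)):
    h, c = lstm_step(x, h, c, W, b)
```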

"Learning, n. The kind of ignorance distinguishing the studious."

gax


Total Posts: 17
Joined: Apr 2011
 
Posted: 2014-09-07 03:08
@sv507: Thanks for the link, I'll take a look.
@jslade: One thing that the Mumfordian idea seems to give you that deep learning doesn't is the ability to sample from the pdf. Though off the top of my head I can't see much use for it, apart from verifying your model.
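
For what it's worth, the restricted Boltzmann machine mentioned above is one connectionist model you can sample from: after training, you run block Gibbs sampling, alternating between the hidden and visible layers. A toy numpy sketch with made-up (untrained) weights, just to show the mechanics:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def rbm_gibbs_sample(W, b_v, b_h, n_steps=1000, rng=None):
    """Approximate sample from an RBM's p(v) via block Gibbs:
    alternately sample h | v and v | h (each layer is factorial
    given the other)."""
    rng = rng or np.random.default_rng()
    v = (rng.random(b_v.shape) < 0.5).astype(float)  # random start
    for _ in range(n_steps):
        p_h = sigmoid(W @ v + b_h)
        h = (rng.random(p_h.shape) < p_h).astype(float)
        p_v = sigmoid(W.T @ h + b_v)
        v = (rng.random(p_v.shape) < p_v).astype(float)
    return v

# toy example with arbitrary (untrained) parameters
rng = np.random.default_rng(0)
n_v, n_h = 6, 3
W = rng.standard_normal((n_h, n_v)) * 0.1
sample = rbm_gibbs_sample(W, np.zeros(n_v), np.zeros(n_h), rng=rng)
```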

jslade


Total Posts: 1136
Joined: Feb 2007
 
Posted: 2014-09-10 00:24
Oh yeah: one more framework I only learned about today: Caffe:
http://caffe.berkeleyvision.org/

"Learning, n. The kind of ignorance distinguishing the studious."

sv507


Total Posts: 165
Joined: Aug 2010
 
Posted: 2014-09-10 16:05
Caffe looks really nice [they aim to provide open implementations of all the winning deep NN designs], and NVIDIA has worked with them:


http://devblogs.nvidia.com/parallelforall/accelerate-machine-learning-cudnn-deep-neural-network-library/


What screwed me was the OS X 10.9-specific instructions:

In OS X 10.9, clang++ is the default C++ compiler and uses libc++ as the standard library. However, NVIDIA CUDA (even version 6.0) currently links only with libstdc++. This makes it necessary to change the compilation settings for each of the dependencies.
... never managed to relink Boost with libstdc++...

radikal


Total Posts: 259
Joined: Dec 2012
 
Posted: 2014-09-10 18:09
Yeah, trying to keep a normal workflow + CUDA workflow all in one OS X box makes me bang my head against the wall.

For some ideas, you might want to check out this

There are no surprising facts, only models that are surprised by facts

jslade


Total Posts: 1136
Joined: Feb 2007
 
Posted: 2014-09-10 22:01
It was that cuDNN article which alerted me to Caffe's existence.
OS X is nice for the desktop, but I have long since abandoned it as useless for numerics development work because of stuff like this. I am more interested in solving problems than in porting code to keep up with OS X fashions in shared objects or whatever.

"Learning, n. The kind of ignorance distinguishing the studious."

miniflowtrader


Total Posts: 12
Joined: Jul 2014
 
Posted: 2014-09-23 19:28
Theano + custom code can do it all.

I have some RNN implementations with Nesterov momentum and Hessian-free optimization. Message me and we can discuss further.
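
For reference, the Nesterov momentum update in isolation is just a gradient evaluated at a look-ahead point plus a momentum step. A generic numpy sketch on a toy least-squares problem (hyperparameters arbitrary, and separate from the RNN code mentioned above):

```python
import numpy as np

def nesterov_step(theta, velocity, grad_fn, lr=0.01, mu=0.9):
    """One Nesterov update: take the gradient at the look-ahead
    point theta + mu*velocity, then apply the momentum step."""
    grad = grad_fn(theta + mu * velocity)
    velocity = mu * velocity - lr * grad
    return theta + velocity, velocity

# toy problem: minimize 0.5 * ||A theta - y||^2
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
y = rng.standard_normal(20)
grad_fn = lambda t: A.T @ (A @ t - y)

theta, v = np.zeros(5), np.zeros(5)
for _ in range(500):
    theta, v = nesterov_step(theta, v, grad_fn)
print(np.linalg.norm(A @ theta - y))  # approaches the least-squares residual
```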

nikke


Total Posts: 5
Joined: Aug 2011
 
Posted: 2015-01-16 23:38
Facebook has now open-sourced some of their models, apparently combining Caffe and Torch.

tabris


Total Posts: 1255
Joined: Feb 2005
 
Posted: 2016-01-28 01:15
Pretty interesting stuff on AlphaGo using deep learning:

AlphaGo

Dilbert: Why does it seem as though I am the only honest guy on earth? Dogbert: Your type tends not to reproduce.

lmog


Total Posts: 134
Joined: Mar 2010
 
Posted: 2016-01-28 19:15
I got an intern résumé; just wondering what others think. I think the candidate has some experience in deep learning.

AB12358


Total Posts: 58
Joined: Apr 2014
 
Posted: 2016-01-29 01:28
Whoa. Very deep learning?

chiral3
Founding Member

Total Posts: 5061
Joined: Mar 2004
 
Posted: 2016-01-29 02:05
I really thought Go was going to hold out for another 100 years. Would be good to run a few more trials. That's kind of depressing.

Nonius is Satoshi Nakamoto. 物の哀れ

NeroTulip


Total Posts: 1013
Joined: May 2004
 
Posted: 2016-01-29 03:39
Bow to our new machine overlords!

It increasingly looks like humans are only the biological boot drive for machine intelligence. Humans cannot win a human/machine war; the only way to survive is to avoid the "us versus them" mentality and merge with technology.

Anyway, back to earth, what's next after Go?

Inflatable trader

EspressoLover


Total Posts: 333
Joined: Jan 2015
 
Posted: 2016-01-29 05:25
Even in chess, centaurs (human-computer teams) still exhibit quite an edge over pure machines. I'd imagine that a deep learning AI is much more "fragile" than the brute-force search AIs in chess. Centaurs should have an even larger and longer-lasting advantage in Go.

Good questions outrank easy answers. -Paul Samuelson

radikal


Total Posts: 259
Joined: Dec 2012
 
Posted: 2016-01-29 20:30
WE STILL HAVE STARCRAFT (Though probably not by year end)

There are no surprising facts, only models that are surprised by facts

AB12358


Total Posts: 58
Joined: Apr 2014
 
Posted: 2016-02-01 00:24
I'm quite looking forward to chess boxing with robots.

akimon


Total Posts: 566
Joined: Dec 2004
 
Posted: 2016-02-01 01:09
lmog, that résumé made my day.

Check out Boston Dynamics' robot doing household chores:

YouTube video of Boston Dynamics' Atlas

I guess the singularity isn't coming anytime soon.

sv507


Total Posts: 165
Joined: Aug 2010
 
Posted: 2016-02-01 18:19
FYI, Udacity is doing a deep learning course using Google's TensorFlow, taught by Vincent Vanhoucke, Principal Scientist at Google and technical lead in the Google Brain team.
I haven't got beyond setting up Docker on my Windows machine to use TensorFlow, but it looks good (succinct yet informative).

udacity deep learning
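
For a sense of what raw TensorFlow code looks like, here is a minimal logistic-regression sketch written against the 1.x graph-and-session API (function names have moved around across versions, so treat this as illustrative; the data is fake, just to show the placeholder/feed/run cycle):

```python
import numpy as np
import tensorflow as tf  # assumes the 1.x graph-and-session API

# toy logistic regression: 784-dim inputs, 10 classes
x = tf.placeholder(tf.float32, shape=[None, 784])
y = tf.placeholder(tf.float32, shape=[None, 10])
W = tf.Variable(tf.zeros([784, 10]))
b = tf.Variable(tf.zeros([10]))
logits = tf.matmul(x, W) + b
loss = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits(labels=y, logits=logits))
train_op = tf.train.GradientDescentOptimizer(0.5).minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # fake batch, just to exercise the feed/run cycle
    xb = np.random.rand(32, 784).astype(np.float32)
    yb = np.eye(10, dtype=np.float32)[np.random.randint(10, size=32)]
    for _ in range(10):
        _, l = sess.run([train_op, loss], feed_dict={x: xb, y: yb})
    print("loss:", l)
```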

AB12358


Total Posts: 58
Joined: Apr 2014
 
Posted: 2016-02-08 04:18
Google paper on deep nets for Go

lim_nick


Total Posts: 25
Joined: Jan 2011
 
Posted: 2016-02-11 06:50
Seems like TensorFlow is getting a lot of traction very quickly.

In case someone was wondering about the generality of some of these methods, here's a short but scary talk by Norvig: http://www.infoq.com/presentations/machine-learning-general-programming

radikal


Total Posts: 259
Joined: Dec 2012
 
Posted: 2016-02-11 18:49
@sv507 -- I just started the first unit of that course and am enjoying it. It's alternately embarrassingly trivial and "wow, that's pretty clever" -- it fills a strange niche. I use tflow indirectly (through Keras), but the class is a particularly clean way to get access to the API and is a more fun way to go through the docs than, well, the docs.

There are no surprising facts, only models that are surprised by facts

Jurassic


Total Posts: 152
Joined: Mar 2018
 
Posted: 2018-10-04 10:31
Does anyone else find TensorFlow very difficult to use?

eeng


Total Posts: 21
Joined: Dec 2014
 
Posted: 2018-10-04 12:21
@Jurassic, look up Keras as a front-end to TensorFlow.
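
To make that concrete, a small Keras model is only a few lines. A generic sketch with made-up layer sizes and fake data, assuming the standalone Keras 2.x-style API with the TensorFlow backend:

```python
import numpy as np
from keras.models import Sequential
from keras.layers import Dense

# tiny MLP classifier: 20 features in, 3 classes out
model = Sequential([
    Dense(64, activation='relu', input_shape=(20,)),
    Dense(64, activation='relu'),
    Dense(3, activation='softmax'),
])
model.compile(optimizer='adam',
              loss='categorical_crossentropy',
              metrics=['accuracy'])

# fake data, just to show the fit/predict cycle
X = np.random.rand(256, 20)
y = np.eye(3)[np.random.randint(3, size=256)]
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
preds = model.predict(X[:5])
```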

quantmatters.wordpress.com

Jurassic


Total Posts: 152
Joined: Mar 2018
 
Posted: 2018-10-04 12:46
Yeah, I found that and it's easy to use. I was wondering whether anyone uses TensorFlow alone then.