[FoRK] Science without explicit theory

Jeff Bone jbone at place.org
Sun Jul 6 10:45:30 PDT 2008


On Jul 6, 2008, at 12:37 PM, Jeffrey Winter wrote:

>
> The overall tone of the article is just wrong.  He seems to imply
> a sort of phase change occurring at some data set size, which just
> isn't the case - or at least he doesn't offer any real evidence that
> such a thing is happening.

There's a big difference between a data set with a million points and  
one with 100 billion.

I don't know if "phase change" is really an appropriate metaphor, but
I'd conjecture that the number of interesting questions you can ask,
and the number of good predictions you can make from extracted
models, both go up with the logarithm of the size of the data set,
measured in number of examples.
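
Not evidence, just intuition, but here's a toy sketch of the
diminishing per-example return I have in mind; the setup (a 1-D
least-squares fit, sample sizes a decade apart) is my own, not the
article's:

    import numpy as np

    rng = np.random.default_rng(0)

    def test_mse(n, noise=1.0, n_test=100_000):
        # true relationship: y = 2x - 1, plus Gaussian noise
        x = rng.uniform(-1.0, 1.0, n)
        y = 2.0 * x - 1.0 + rng.normal(0.0, noise, n)
        slope, intercept = np.polyfit(x, y, 1)   # fit on n examples
        xt = rng.uniform(-1.0, 1.0, n_test)
        yt = 2.0 * xt - 1.0 + rng.normal(0.0, noise, n_test)
        return float(np.mean((slope * xt + intercept - yt) ** 2))

    for n in (10**2, 10**3, 10**4, 10**5, 10**6):
        print(f"n = {n:>9,d}   test MSE = {test_mse(n):.5f}")

Each factor-of-ten jump in n buys a smaller improvement over the
noise floor, which is why I'd bet the payoff scales more like log(n)
than n.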

> While I can appreciate the notion that large data sets - when the
> right algorithms are applied - can offer interesting insights, really
> it's a matter of resolution.  I suppose there is something akin to
> a theory lurking in the Bayesian network of relationships among
> the data points, but isn't the Bayesian analysis
> done up front - the meta-theory if you will - the more interesting
> aspect of this?

The meta-theory ain't what's paying my bills. ;-)
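
That said, the "theory lurking in the network" can be awfully thin
in practice: fix the structure up front and what's left is counting.
A toy construction of mine (not anything from the article) with a
two-node network A -> B:

    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic world: A ~ Bernoulli(0.3), and B depends on A.
    n = 1_000_000
    a = rng.random(n) < 0.3
    b = np.where(a, rng.random(n) < 0.8, rng.random(n) < 0.1)

    # "Learning" the network's parameters is just conditional counting.
    for val in (0, 1):
        p = b[a == bool(val)].mean()
        print(f"P(B=1 | A={val}) estimated as {p:.3f}")

The structure choice - the part done up front - is the meta-theory;
the petabytes only sharpen the counts.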

> Yes, there are immensely larger data sets to work with, but the
> insights gained from them are part of a continuum; nothing
> fundamentally different is happening at this magical petabyte
> level - or at least the article doesn't show that it is.

It is weak on examples, I agree.


jb
