How big data and IoT initiatives render the ‘garbage in, garbage out’ theory invalid

October 01, 2019 / Michael Kanellos

Garbage in, garbage out: it’s one of the great truisms of technology. Sure, it’s been eclipsed by “software will eat the world” as the thing to say in a meeting when you really don’t know what’s going on, but it’s still probably uttered a few thousand times a month to explain away the failure of a recent technology initiative. But, like its sister expression “you can’t get fired for hiring IBM,” we’re learning in the big data era that the garbage doctrine is, well, garbage. The problem in most big data or IoT initiatives isn’t that the data is meaningless, inaccurate, vague, or worthless – data harvested from sensors is generally valid. Typically, the problem lies in the sheer volume of data, because data doesn’t naturally organize itself the way crystals do. The trucks and equipment at a single mining site can generate petabytes a day.