You don't have to be human to have knowledge
Stephen Bounds — Tue, 17/02/2015 - 09:16
The post title says it all, really, but I want to put a stake in the ground on this one.
Every time there is a debate within a KM forum on knowledge, someone will say that "knowledge is a human thing". Or, more specifically, they will claim that a computer can't have knowledge: a computer can only hold information; it is the humans who programmed it who have the knowledge.
But in a world where we have Siri and Cortana, Watson and Google Self-Driving Cars, the idea that computers can't possess knowledge is looking pretty shaky. When people are openly speculating about AI taking over the world, it's beginning to look dangerously retrograde.
I think that people resist the idea of computers having knowledge because they think it's the same thing as saying that computers are "alive" or that they behave in "human-like" ways. But that's neither necessary nor implied in claiming that they hold knowledge.
David Williams's AKI model correctly, in my view, ties knowledge explicitly to action. Any system that can perform autonomous actions in response to environmental cues is knowledgeable. The structures (biological, mechanical, and/or electronic) that determine its responses are its knowledge.
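To make that definition concrete, here is a minimal sketch (my own illustration, not from Williams's model itself) of a system whose "knowledge" is nothing more than the structures determining its responses. The `Thermostat` class, its `setpoint`, and its rule are all hypothetical names chosen for the example:

```python
# Illustrative only: a trivially "knowledgeable" system in the sense above.
# Its knowledge is the setpoint plus the response rule -- the structures
# that determine its autonomous action in response to an environmental cue.

class Thermostat:
    def __init__(self, setpoint):
        self.setpoint = setpoint  # part of the system's "knowledge"

    def act(self, temperature):
        # Autonomous action triggered by an environmental cue (temperature).
        if temperature < self.setpoint:
            return "heat on"
        return "heat off"

t = Thermostat(setpoint=20.0)
print(t.act(18.5))  # heat on
print(t.act(22.0))  # heat off
```

Nobody would call a thermostat alive or human-like, yet under the action-based definition it holds (very simple) knowledge; the interesting question is only how elaborate those response-determining structures are.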
(Sidebar: I suspect the idea of "knowledge as human thing" has evolved as yet another hangover from the DIKW fallacy. Just one more reason why this model needs to be permanently put to rest.)
I have a simple challenge to anyone who thinks that this position is unreasonable: can you tell me a single hypothesis, prediction, or system design choice that will be incorrect or even just worse if we free our definitions of knowledge from the shackles of the conscious mind? Because at this point -- I can't.