It's not Skynet -- not yet anyway -- but IBM has made a breakthrough in building a new type of CPU. It's calling the experimental silicon "cognitive computing chips" because they're designed and built to emulate the way the brain works. What does that mean for the channel, and more importantly, the world? Here are just a few thoughts ...

How does a "cognitive computing" CPU work, and what exactly is inside it? IBM actually explains it best ...

[Neurosynaptic Chips] contain no biological elements. IBM’s first cognitive computing prototype chips use digital silicon circuits inspired by neurobiology to make up what is referred to as a “neurosynaptic core” with integrated memory (replicated synapses), computation (replicated neurons) and communication (replicated axons).

It's essentially the ultimate system on a chip, because it's designed to work like you do.
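To make that synapse/neuron/axon mapping a little more concrete, here's a tiny, purely illustrative sketch in Python. It is not IBM's design -- just a toy spiking-neuron model where stored weights stand in for synapses (memory), threshold units stand in for neurons (computation), and spike fan-out stands in for axons (communication). Every name and number in it is made up.

```python
# Toy sketch only -- not IBM's actual chip design. It illustrates the mapping
# the quote describes: synapses as stored weights (memory), neurons as
# threshold units (computation), and axons as the fan-out that carries spikes
# onward (communication). All values here are invented for illustration.

import random

NUM_NEURONS = 8
THRESHOLD = 1.0
LEAK = 0.9  # membrane potential decays a little each time step

# "Synapses": a weight from every neuron to every other neuron (the memory).
synapses = [[random.uniform(0.0, 0.3) for _ in range(NUM_NEURONS)]
            for _ in range(NUM_NEURONS)]

# "Neurons": each one just accumulates input until it crosses a threshold.
potentials = [0.0] * NUM_NEURONS

def tick(external_input):
    """One time step: integrate input, fire, and propagate spikes over 'axons'."""
    spikes = []
    for i in range(NUM_NEURONS):
        potentials[i] = potentials[i] * LEAK + external_input[i]
        if potentials[i] >= THRESHOLD:      # the neuron "fires"
            spikes.append(i)
            potentials[i] = 0.0             # reset after firing
    # "Axons": deliver each spike to every downstream neuron, weighted by its synapse.
    for src in spikes:
        for dst in range(NUM_NEURONS):
            potentials[dst] += synapses[src][dst]
    return spikes

# Drive the core with random input for a few steps and watch which neurons fire.
for step in range(5):
    fired = tick([random.uniform(0.0, 0.5) for _ in range(NUM_NEURONS)])
    print(f"step {step}: neurons fired -> {fired}")
```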

But what real-world tasks actually call for cognitive computing? Learning tasks top the list: natural language comprehension, pattern detection, real-time navigation and, above all, complex decision-making based on many variable inputs. IBM gives the example of deploying such a chip inside a computer monitoring a traffic intersection. Based on input from the myriad stimuli at the location, the computer can determine what kind of incident has occurred (a big crash or a fender bender) and automatically flag it or send an alert at the appropriate response level.
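To show the shape of that kind of decision, here's a hypothetical sketch: a handful of made-up sensor inputs get fused into a rough severity score, and the score picks a response level. None of this reflects how IBM's chips would actually be programmed; the sensor names, thresholds and response tiers are invented for illustration.

```python
# Hypothetical sketch of the traffic-intersection example: fuse a few noisy,
# variable inputs into a severity score, then choose a response level.
# The sensors, thresholds and tiers are all made up -- IBM hasn't said how
# software for these chips would be written.

def classify_incident(impact_sound_db, vehicles_stopped, airbag_signals):
    """Combine a few noisy inputs into a rough severity score."""
    score = 0
    if impact_sound_db > 110:
        score += 2          # a loud impact suggests a serious collision
    elif impact_sound_db > 90:
        score += 1
    if vehicles_stopped >= 3:
        score += 2          # traffic backing up across multiple lanes
    if airbag_signals > 0:
        score += 3          # any airbag deployment escalates immediately
    return score

def respond(score):
    if score >= 5:
        return "dispatch emergency services"
    if score >= 2:
        return "flag for human review"
    return "log and ignore"

# A fender bender vs. a big crash:
print(respond(classify_incident(impact_sound_db=95, vehicles_stopped=1, airbag_signals=0)))
print(respond(classify_incident(impact_sound_db=120, vehicles_stopped=4, airbag_signals=2)))
```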

Now before we get all freaked out and assume self-aware, self-thinking computers (or robots!) are a hop, skip and a jump away, relax. IBM has only two of these chips in existence, and they're just now being tested. There's been no real discussion of how anyone would implement, deploy or even develop the software or hardware to use these chips, so it's anyone's guess as to when you'll see them. In fact, chances are you may not see them in your home for a long time, if ever. DARPA is helping fund IBM's endeavor, so these new chips will likely see the battlefield before your bedroom.

If and when these chips go mainstream, I would assume complicated jobs such as data storage optimization, VDI implementation, network load balancing and cloud provisioning (if those are even still in use by then) would become trivial, everyday tasks. So for the channel, and the world, IBM is offering a glimpse at the future, even if it's incredibly far off.

Check out the lengthy press release for all the nitty-gritty tech details of the new CPU, or visit IBM Research for a taste of what's coming.