[From the last episode: We reviewed our tour of IoT (the Internet of Things: a broad term covering many different applications where "things" are interconnected through the internet) and cloud computing.]
We’ve talked about conventional computing, so now we’re going to look at a new computing application that’s taking the industry by storm: machine learning, or ML. That’s a process by which machines can be trained to perform tasks that previously required humans; it’s based on the analysis of lots of data, and it may affect how some IoT devices work. We did a quick overview a long time ago; now we’re going to dig in further.
Let’s start by defining our terms. We hear a lot about “artificial intelligence,” or “AI.” That’s a really broad term for all kinds of approaches to getting machines (in our context, anything that isn’t human or living, including electronic equipment like computers and phones) to “think” like humans. It’s not at all new, and it’s also not very specific. The term includes everything that we have done before, everything we’re doing today, and all the ways we may try to do it in the future.
“Machine learning” is an approach to AI, but it’s also pretty broad. It deals with ways of training a machine to make decisions; once trained, the model performs its function, which is called inference. There are lots of ways to do that training and lots of ways to make those decisions. So it’s also a pretty vague term.
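To make the training/inference split concrete, here's a deliberately tiny sketch, not any particular framework or real product: the data, learning rate, and loop count are all made up for illustration. Training repeatedly nudges a number until the model fits the examples; inference then applies the trained model to new input.

```python
# Made-up example data: inputs paired with desired outputs (here, y = 2x).
data = [(1, 2), (2, 4), (3, 6)]

w = 0.0  # the entire "model": one trainable weight

# Training: adjust w a little at a time to shrink the prediction error.
for _ in range(200):
    for x, y in data:
        error = w * x - y     # how far off the current model is
        w -= 0.1 * error * x  # nudge w in the direction that reduces error

# Inference: use the trained model on an input it never saw during training.
print(round(w, 2))       # the learned weight, close to 2.0
print(round(w * 10, 1))  # prediction for x = 10, close to 20.0
```

The point isn't the arithmetic; it's that "learning" here is just repeated numerical adjustment, which is why real training at scale takes so much computing.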
If you’re watching the industry news, you’ll see references to “neural networks” (or “neural nets”): conceptual networks organized in a manner inspired by our evolving understanding of how the brain works. Those are a particular way to do machine learning. And there are lots of different types of neural networks, so it’s also a broad concept. The different types get confusing: DNNs, CNNs, RNNs, and many more. We’ll talk more about what these mean as we go.
“In the Form of the Brain”
There’s a word floating around with respect to neural nets that’s often used imprecisely, but it gets to the origins of machine learning: neuromorphic. It refers to systems that attempt to operate the same way the brain operates (spiking neural networks are the main commercial example), and it literally means “in the form of the brain or nervous system.” That is, after all, what AI is all about – learning to mimic the brain.
But there’s still lots we don’t know about the brain and how it works. Researchers are feverishly studying it, and we continue to learn more. We do know how some parts of the brain work, but it’s not yet easy to translate that into technology. At the many conferences that include AI topics, there are presentations on academic research work that reveals more about how the brain works or shows new approaches to applying that learning to machines.
Intel, for example, has a large research project on neuromorphic computing. There are no actual products at present; they’re making chips (electronic devices made on a piece of silicon, usually placed in some kind of package – “integrated circuit” and “IC” mean the same thing), but only for test and research purposes. They’re hoping this will lead to products a few years out.
So… if that’s the case, then what’s with all of this AI stuff flooding our inboxes and feeds on a daily basis? It turns out that some researchers found a way around the difficulties of literally mimicking the brain, and this has given rise to the abundance of what are sometimes called artificial neural networks, or ANNs: networks loosely inspired by biological neurons, but operating very differently. While they don’t look like the brain, they are easy to manage in a computer. Well, “easy” might not be the best word – they take enormous amounts of computing, so perhaps “workable” is a better word than “easy.”
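Here's roughly what one "neuron" in an ANN amounts to – a minimal sketch with arbitrary, illustrative weights and inputs, not any real library's API. Nothing biological is simulated; it's just a weighted sum pushed through a squashing function.

```python
import math

def neuron(inputs, weights, bias):
    # Weighted sum of the inputs, plus a bias term...
    total = sum(i * w for i, w in zip(inputs, weights)) + bias
    # ...squashed by a sigmoid so the output always lands between 0 and 1.
    return 1 / (1 + math.exp(-total))

# Arbitrary example values, chosen only to show the mechanics.
out = neuron([0.5, -1.0, 0.25], [0.8, 0.2, -0.5], bias=0.1)
print(out)  # some value between 0 and 1
```

A network is many of these stacked in layers, and "training" means adjusting all those weights – which is where the enormous computing bill comes from.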
Starting with Artificial
We’re going to go down this artificial path first, since that’s what’s happening right now. After that, we can start to explore other innovations and look at the paths different companies are going down. Remember as you read, however, that this is all leading-edge stuff. Things change every day, and what sounds like a great idea today may turn out not to work, or we may find an even better way in the future.
This is also an area where the math can get overwhelming. We’re mostly not going to worry about it except at a basic level. Even without the math, some of it may challenge your imagination – it certainly has mine. It’s not always easy to visualize what’s going on. So we’ll do our best, understanding that this is just one of the risks of venturing into this field.