[From the last episode: we looked at the differences between computing at the edge and in the cloud. (The edge is a slightly confusing term because, in different contexts, it means slightly different things. In networking, it generally refers to the part of the network where devices – computers, printers, etc. – are connected, in contrast to the core, the middle of the network where lots of traffic gets moved around. The cloud is a generic phrase for large numbers of computers located somewhere far away and accessed over the internet. For the IoT, computing may be local, done in the same system or building, or in the cloud, with data shipped up to the cloud and the result shipped back down.)]
We’ve looked at the differences between MCUs and SoCs, but the one major thing that they have in common is that they have a CPU. (CPU stands for “central processing unit.” Basically, it’s a microprocessor – the main one in the computer. Things get a bit more complicated because, these days, there may be more than one microprocessor, but you can safely think of all of them together as the CPU.) Now… anyone can define their own CPU if they want to, but that’s an awful lot of work – especially when there are companies out there designing CPUs for use by chip designers.
While you may think of a CPU as a chip that you could buy in a Fry’s (and it is that, in part), that’s not the whole story. Some companies don’t sell a manufactured CPU; instead, they sell the CPU design – complete with transistor connections – to companies that are going to build an MCU or SoC.
While our interest is mostly in CPUs, what we’re talking about here can apply to lots of circuits. The idea is not to reinvent everything every time you do a chip.
A Completely Custom House?
Let’s use a house as an analogy. You want to build one on your own. Are you going to hand-build a custom door for each entrance? Are you going to fashion hand-crafted custom windows from scratch where you want them? Are you going to cast your own bathtub out of molten iron, lovingly covering it with enamel after? Of course not; you’re going to buy those things pre-made.
That might seem obvious, but there was a time when, if you were building your little house on the prairie, you might be making your own nails – or at least having a blacksmith make nails for you. You made most everything from scratch. That would be a waste of time these days, unless you had a really good reason for spending a bunch of time and money on rolling your own.
The same thing goes for designing circuits. If you’re designing an MCU for sale to folks that need one, you’re not just selling it because of the CPU. You’re also not selling it because of the memory it contains, or other circuits that it may contain – like specialized I/O circuits (I/O stands for input/output: the signals that come into and depart from the computing system, as opposed to the signals running internal to it) or video-encoding circuits. You’re selling it because of the specific combination and arrangement of all of those things.
Differentiating
One of the keys to designing chips today is that you want to focus on the things that are going to make your chip different. (A chip is an electronic device made on a piece of silicon; these days, it could also involve a mechanical chip, but, to the outside world, everything looks electronic. The chip is usually in some kind of package, and that package might contain multiple chips. “Integrated circuit” and “IC” mean the same thing, but refer only to electronic chips, not mechanical ones.) If you’re designing your own house on a special property, you’re going to focus on the layout and the number of rooms and maybe a specialty kitchen. That’s where you want to spend your energy, not on doing your own doors and windows; those don’t contribute to what makes the house special.
Same thing with chips. Marketing folks talk about “differentiating”: this means doing something that makes your MCU (or whatever chip) better than some other one. Maybe you’ve got a specific combination of features that will make your MCU way better than someone else’s for some specific application. In that case, your differentiation isn’t the CPU itself, or any of the other individual circuits. It’s the way you connect them and arrange them that differentiates it. So you want to focus on that, not on creating your own custom CPU (which, to reiterate, is an absolute TON of work!).
What Might Make an IoT Device Special?
Same thing if you want to use an MCU in an IoT device. (IoT is the Internet of Things: a broad term covering many different applications where “things” are interconnected through the internet.) Perhaps you’ve got some ideas of how to take, oh, temperature and humidity measurements, along with weather reports available online, and generate a custom weather report for a specific place. Do you care what CPU you use?
Maybe, to some extent, but is it something that you can’t find, and therefore you have to do yourself? Almost assuredly not. And the connections between the CPU and memories and other bits are also probably not that critical. So you just want to buy an MCU that’s got the circuit you need and then start writing the code that’s going to do your weather prediction.
There’s a casual term they use in the industry for this: special sauce. In that application example, the MCU and the sensors (devices that can measure something about their environment – movement, light, color, moisture, pressure, and many more) are things you just buy and use; you don’t create your own. It’s the algorithms (ways of calculating something, typically as a step-by-step recipe – how we do long division on paper is a familiar one, and algorithms abound in the IoT) and the code you’re going to write that are the special sauce. You want to spend your time on the special sauce, which is what differentiates your product from others. It’s what makes it better – hopefully. You then don’t have to spend your time reinventing non-critical things that already exist.
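To make that concrete, here’s a minimal sketch of what the special-sauce code for the weather example might look like, assuming a hypothetical device with a temperature/humidity sensor and a downloaded regional forecast. The sensor-reading and forecast-fetching functions and the blending weights below are made-up stand-ins, not any real vendor’s API or a real prediction model; the point is just that the value lives in this code, not in the parts you bought.

```c
/* Hypothetical sketch: the "special sauce" for the custom-weather device.
   The MCU, its CPU, and the sensor are off-the-shelf; only this code is yours.
   All function names and numbers here are illustrative assumptions. */
#include <stdio.h>

/* Stand-ins for the bought parts: a temperature/humidity sensor on the MCU
   and a regional forecast pulled over the network. */
static float read_temperature_c(void)     { return 21.5f; }  /* from the sensor */
static float read_humidity_pct(void)      { return 64.0f; }  /* from the sensor */
static float online_forecast_temp_c(void) { return 19.0f; }  /* from the cloud */

int main(void)
{
    /* The differentiating part: blend the local reading with the regional
       forecast to get a site-specific prediction. The 70/30 weighting is
       an arbitrary illustration, not a real forecasting model. */
    float local_t   = read_temperature_c();
    float local_rh  = read_humidity_pct();
    float region_t  = online_forecast_temp_c();
    float predicted = 0.7f * local_t + 0.3f * region_t;

    printf("Local: %.1f C at %.0f%% RH; regional forecast: %.1f C\n",
           local_t, local_rh, region_t);
    printf("Site-specific prediction: %.1f C\n", predicted);
    return 0;
}
```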
Intellectual, not Physical
There’s a not-so-helpful name for circuit blocks – like a CPU – that aren’t chips themselves, but just designs that can then be included in a bigger design that will be turned into an actual chip. They call this intellectual property, or IP (not to be confused with the other IP, the Internet Protocol, which governs the addresses of sources and destinations on a network). In the computer-chip world, IP refers to parts of a chip design that have been built and optimized by one company, which then sells them to other companies that don’t want to design those blocks themselves. They’re not selling actual chips; they’re selling the design of a block that will be used within a chip. The idea is that, in the CPU example, it’s not a “physical” CPU because it hasn’t been built yet. It’s an “intellectual” CPU – the ideas are all there; it’s just that you need to plug it into the rest of the chip you’re building before it becomes something physical.
IP is used all over the place these days because it saves tons of time and money. Existing circuits can be reused instead of being redone over and over (with mistakes inevitably being made and corrected over and over). But, with CPUs in particular, there’s a particularly curious thing that happens… and we’ll talk about that next week.