Inside the Internet of Things - Bryon Moyer Blog
[From the last episode: We looked at the notion of sparsity and how it helps with the math.] We saw before that there are three main elements in a CNN: the convolution, the pooling, and the activation. Today we focus on activation. I’ll start by saying that the […]
[From the last episode: We looked at the reason for doing all the multiply-accumulate math with machine learning.] Before we leave this matrix multiplication thing, there’s one other notion floating about a lot these days. Anyone listening to conversations about how to make machine-learning (ML) algorithms faster will eventually hear […]
[From the last episode: We looked at the reasons for all of the multiply-accumulate hardware in machine-learning engines.] OK, we’ve spent a couple of weeks deep in the math used for machine learning (ML). Now let’s back up a second to look at what this all means in the bigger […]
[From the last episode: We looked at the convolution that defines the CNNs that are so popular for machine vision applications.] This week we’re going to do some more math, although, in this case, it won’t be as obscure and bizarre as convolution – and yet we will find some […]
[From the last episode: We looked at CNNs for vision as well as other neural networks for other applications.] We’re going to take a quick detour into math today. For those of you who have done advanced math, this may be a review, or it might even seem to be […]