
Inference at the Edge

By Bryon Moyer | August 7, 2020 | 0 Comments

[From the last episode: We looked at activation functions and what they're for.] We've talked about the structure of machine-learning (ML) models and much of the hardware and math needed to do ML work. But there are some practical considerations that mean we may not directly use the pristine model as […]
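The excerpt cuts off before saying which practical considerations it means, but one commonly discussed reason an edge device doesn't run the "pristine" trained model as-is is weight quantization. As a rough illustration only (an assumption on my part, not necessarily the technique this post goes on to cover), here is a minimal sketch of mapping float32 weights to 8-bit integers:

```python
# A minimal sketch of post-training weight quantization, one common reason an
# edge deployment doesn't use the pristine trained model directly. This is an
# illustrative assumption, not necessarily what the full post covers.
import numpy as np

def quantize_int8(weights):
    """Map float32 weights onto int8 with a simple symmetric scale."""
    scale = np.abs(weights).max() / 127.0  # largest magnitude maps to 127
    q = np.round(weights / scale).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights, to see how much precision was lost."""
    return q.astype(np.float32) * scale

w = np.random.randn(4, 4).astype(np.float32)
q, s = quantize_int8(w)
print("max quantization error:", np.abs(w - dequantize(q, s)).max())
```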


Activation Functions

By Bryon Moyer | July 31, 2020 | 0 Comments

[From the last episode: We looked at the notion of sparsity and how it helps with the math.] We saw before that there are three main elements in a CNN: the convolution, the pooling, and the activation. Today we focus on activation. I'll start by saying that the […]
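The excerpt doesn't say which activation functions the post goes on to discuss, but as a minimal sketch of the idea (assuming the widely used ReLU and sigmoid as examples), an activation is just a simple function applied elementwise to a layer's outputs:

```python
# A minimal sketch of two common activation functions, applied elementwise.
# ReLU and sigmoid are assumed here as examples; the excerpt doesn't say
# which functions the full post covers.
import numpy as np

def relu(x):
    """Zero out negatives, pass positives through unchanged."""
    return np.maximum(0.0, x)

def sigmoid(x):
    """Squash any real value into the range (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

x = np.array([-2.0, -0.5, 0.0, 1.5])
print(relu(x))     # [0.  0.  0.  1.5]
print(sigmoid(x))  # four values strictly between 0 and 1
```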


The Benefits of Sparsity

By Bryon Moyer | July 24, 2020 | 0 Comments

[From the last episode: We looked at the reason for doing all the multiply-accumulate math with machine learning.] Before we leave this matrix multiplication thing, there’s one other notion floating about a lot these days. Anyone listening to conversations about how to make machine-learning (ML) algorithms faster will eventually hear […]
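The excerpt stops short of the punchline, but the core intuition behind sparsity is easy to sketch: any multiply with a zero operand contributes nothing, so it can be skipped. A minimal illustration (my own example, not code from the post):

```python
# A minimal sketch of why sparsity helps with the math: multiplies against
# zero weights can be skipped entirely, so a mostly-zero vector needs far
# fewer multiply-accumulates (MACs).
import numpy as np

def sparse_dot(weights, activations):
    """Dot product that only multiplies where the weight is nonzero."""
    nz = np.nonzero(weights)[0]   # indices of nonzero weights
    return np.dot(weights[nz], activations[nz]), len(nz)

w = np.array([0.0, 0.0, 3.0, 0.0, -1.0, 0.0, 0.0, 2.0])
a = np.arange(8, dtype=np.float64)
result, macs = sparse_dot(w, a)
print(result, "using", macs, "MACs instead of", len(w))
```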


Why All the Multiplication for Machine Learning?

By Bryon Moyer | July 17, 2020 | 0 Comments

[From the last episode: We looked at the reasons for all of the multiply-accumulate hardware in machine-learning engines.] OK, we’ve spent a couple of weeks deep in the math used for machine learning (ML). Now let’s back up a second to look at what this all means in the bigger […]


Where Do the MACs Come From?

By Bryon Moyer | July 10, 2020 | 0 Comments

[From the last episode: We looked at the convolution that defines the CNNs that are so popular for machine vision applications.] This week we’re going to do some more math, although, in this case, it won’t be as obscure and bizarre as convolution – and yet we will find some […]
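Again the excerpt is truncated, but the question in the title has a standard answer worth sketching: every output element of a matrix multiply is a running sum of products, and each step of that sum is one multiply-accumulate. A minimal illustration (my own sketch, not code from the post):

```python
# A minimal sketch showing where the MACs come from: each output element of
# a matrix multiply is built by one multiply-accumulate per inner-loop step,
# so an (m x k) by (k x n) multiply costs m * k * n MACs.
import numpy as np

def matmul_with_mac_count(A, B):
    rows, inner = A.shape
    inner2, cols = B.shape
    assert inner == inner2
    C = np.zeros((rows, cols))
    macs = 0
    for i in range(rows):
        for j in range(cols):
            acc = 0.0
            for k in range(inner):
                acc += A[i, k] * B[k, j]  # one multiply-accumulate
                macs += 1
            C[i, j] = acc
    return C, macs

A = np.random.randn(3, 4)
B = np.random.randn(4, 2)
C, macs = matmul_with_mac_count(A, B)
print("MACs:", macs)                          # 3 * 4 * 2 = 24
print("matches np.matmul:", np.allclose(C, A @ B))
```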
