Hello! This is my personal page where I collect resources and materials I find interesting; feel free to use any of them. I am open to feedback of any kind, so if you have suggestions, do not hesitate to contact me through any of the channels listed on my GitHub account!

General purpose:

Multithreaded parallelism in Numpy

NumPy delegates its linear algebra to BLAS, a library that multithreads some of its routines. This article shows how to take advantage of that.
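
As a minimal sketch of the idea (assuming NumPy is linked against a multithreaded BLAS such as OpenBLAS or MKL, and that the optional threadpoolctl package is installed), you can compare a matrix product with BLAS pinned to one thread against the default thread count:

```python
# Minimal sketch: timing a BLAS-backed matrix product with different thread counts.
# Assumes NumPy is linked against a multithreaded BLAS (e.g. OpenBLAS or MKL)
# and that the optional `threadpoolctl` package is installed.
import time

import numpy as np
from threadpoolctl import threadpool_limits

a = np.random.rand(2000, 2000)
b = np.random.rand(2000, 2000)

def timed_matmul():
    start = time.perf_counter()
    a @ b  # dispatched to BLAS (gemm), which may run on several threads
    return time.perf_counter() - start

with threadpool_limits(limits=1):   # force single-threaded BLAS
    single = timed_matmul()

multi = timed_matmul()              # default: all available threads

print(f"1 thread: {single:.3f}s, default threads: {multi:.3f}s")
```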

Dispatch techniques for interpreters

This article presents some of the more popular dispatch techniques used in emulation. It was particularly useful to me while writing a RISC-V executor.
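
The article covers several techniques; as a toy illustration only, here is the simplest table-based dispatch written in Python (a dict of opcode handlers for a made-up stack machine, not the RISC-V executor itself):

```python
# Toy illustration of table-based dispatch: each opcode maps to a handler
# function, so the main loop avoids a long if/elif chain. This is only a
# sketch of the idea; a real emulator would decode actual instruction encodings.
def op_push(state, arg):
    state["stack"].append(arg)

def op_add(state, _):
    b, a = state["stack"].pop(), state["stack"].pop()
    state["stack"].append(a + b)

def op_print(state, _):
    print(state["stack"][-1])

DISPATCH = {0: op_push, 1: op_add, 2: op_print}  # opcode -> handler

def run(program):
    state = {"stack": []}
    for opcode, arg in program:
        DISPATCH[opcode](state, arg)  # one indirect call per instruction

run([(0, 2), (0, 3), (1, None), (2, None)])  # prints 5
```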

A radix sort paper #2

This paper is a direct continuation of the first one I included, written by another author, and it presents several optimizations to Pierre's implementation.
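
For context, here is a plain least-significant-digit radix sort in Python; it is only the textbook baseline the papers build on, not any of the optimized variants they describe:

```python
# Baseline LSD radix sort on non-negative integers, processed byte by byte.
# This is only the textbook starting point, not the optimized versions
# discussed in the papers.
def radix_sort(values, key_bytes=4):
    for shift in range(0, 8 * key_bytes, 8):
        buckets = [[] for _ in range(256)]          # one bucket per byte value
        for v in values:
            buckets[(v >> shift) & 0xFF].append(v)  # stable distribution pass
        values = [v for bucket in buckets for v in bucket]
    return values

print(radix_sort([170, 45, 75, 90, 802, 24, 2, 66]))
# [2, 24, 45, 66, 75, 90, 170, 802]
```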

Support vector machines:

MIT class on SVMs

This is a video of an MIT class where SVMs are presented and explained.

The softmax activation function:

Softmax derivative

This is a Stack Overflow thread with some really helpful advice on implementing the softmax derivative.
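
For reference, a quick sketch of the underlying math: for s = softmax(z), the Jacobian is diag(s) - s sᵀ, which is a one-liner in NumPy:

```python
# Sketch of the softmax Jacobian for a single input vector z:
# J[i, j] = s[i] * (delta_ij - s[j]), i.e. np.diag(s) - np.outer(s, s).
import numpy as np

def softmax(z):
    e = np.exp(z - np.max(z))   # shift for numerical stability
    return e / e.sum()

def softmax_jacobian(z):
    s = softmax(z)
    return np.diag(s) - np.outer(s, s)

z = np.array([1.0, 2.0, 3.0])
print(softmax_jacobian(z))      # 3x3 matrix; each row sums to ~0
```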

Softmax usage explained

This is a Stack Overflow thread that explains how to use softmax properly in neural networks. It helped me understand how to propagate the derivative.
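
As a rough sketch of the usual shortcut: when softmax feeds directly into a cross-entropy loss, the gradient with respect to the logits simplifies to the predicted probabilities minus the one-hot target, so the full Jacobian never has to be materialized:

```python
# Sketch of backpropagating through softmax + cross-entropy in one step:
# dL/dz = softmax(z) - y, where y is the one-hot target vector.
import numpy as np

def softmax(z):
    e = np.exp(z - np.max(z))
    return e / e.sum()

z = np.array([2.0, 1.0, 0.1])   # logits from the last layer
y = np.array([1.0, 0.0, 0.0])   # one-hot encoded target class

probs = softmax(z)
loss = -np.sum(y * np.log(probs))   # cross-entropy loss
grad_z = probs - y                  # gradient passed back to the last layer

print(loss, grad_z)
```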

Stable softmax

This article combines a softmax implementation with an explanation. I included it as well because it mentions a few tricks for keeping the implementation numerically stable, and it showcases how easily softmax integrates with the cross-entropy loss.
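
The standard trick for this, sketched below, is to subtract the maximum logit before exponentiating (softmax is shift-invariant), so the exponentials never overflow:

```python
# The max-subtraction trick: exp of large logits overflows, but softmax is
# shift-invariant, so subtracting max(z) gives the same result without inf/nan.
import numpy as np

z = np.array([1000.0, 1001.0, 1002.0])

naive = np.exp(z) / np.exp(z).sum()                 # overflows: nan values
stable = np.exp(z - z.max()) / np.exp(z - z.max()).sum()

print(naive)    # [nan nan nan] plus overflow warnings
print(stable)   # [0.09003057 0.24472847 0.66524096]
```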

Convolutional neural networks:

CNNs cheatsheet

This is a cheatsheet from Stanford that collects information about CNNs.

Graphics: