
GPU Programming: Glossary

Key Points

Introduction
Programming your GPU using CuPy
Programming your GPU using Numba
Closure Day 1
  • There is a large amount of freedom in distributing your computations over the GPU, but many configurations will leave the GPU mostly idle (see the sketch below)
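
The sketch below illustrates this point, assuming CuPy and a CUDA-capable GPU are available; the kernel and variable names are illustrative, not taken from the lesson. The same vector_add kernel is launched twice: first with a single block, which leaves most of the GPU idle and only processes the first 1024 elements, and then with enough blocks to cover the whole array.

    import cupy as cp

    size = 1024 * 1024

    # A CUDA kernel compiled at runtime with CuPy; each thread adds one element
    vector_add = cp.RawKernel(r'''
    extern "C" __global__
    void vector_add(const float * A, const float * B, float * C, const int size)
    {
        int item = blockIdx.x * blockDim.x + threadIdx.x;
        if ( item < size )
        {
            C[item] = A[item] + B[item];
        }
    }
    ''', "vector_add")

    a = cp.random.rand(size, dtype=cp.float32)
    b = cp.random.rand(size, dtype=cp.float32)
    c = cp.zeros(size, dtype=cp.float32)

    # Configuration 1: a single block of 1024 threads; only 1024 of the
    # 1048576 elements are computed and most of the GPU sits idle
    vector_add((1,), (1024,), (a, b, c, cp.int32(size)))

    # Configuration 2: enough blocks of 1024 threads to cover the whole array,
    # keeping many more streaming multiprocessors busy
    threads_per_block = 1024
    blocks = (size + threads_per_block - 1) // threads_per_block
    vector_add((blocks,), (threads_per_block,), (a, b, c, cp.int32(size)))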

Your First GPU Kernel
  • Precede your kernel definition with the __global__ keyword

  • Use the built-in variables threadIdx, blockIdx, gridDim, and blockDim to identify each thread (see the example below)
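
A minimal sketch of such a kernel, again using CuPy's RawKernel; the kernel name scale and the launch configuration are illustrative. The kernel is declared with __global__ and combines threadIdx, blockIdx, blockDim, and gridDim in a grid-stride loop so that each thread knows which elements it is responsible for.

    import cupy as cp

    scale = cp.RawKernel(r'''
    extern "C" __global__
    void scale(float * data, const float factor, const int size)
    {
        // Unique index of this thread within the whole grid
        int index = blockIdx.x * blockDim.x + threadIdx.x;
        // Total number of threads in the grid
        int stride = gridDim.x * blockDim.x;

        // Grid-stride loop: works for any combination of grid and block size
        for ( int item = index; item < size; item = item + stride )
        {
            data[item] = data[item] * factor;
        }
    }
    ''', "scale")

    size = 2048
    data = cp.arange(size, dtype=cp.float32)
    # 4 blocks of 256 threads: 1024 threads in total, each processing 2 elements
    scale((4,), (256,), (data, cp.float32(2.0), cp.int32(size)))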

Registers, Global, and Local Memory
  • Registers can be used to locally store data and avoid repeated memory operations

  • Global memory is the main memory space and is used to share data between the host and the GPU

  • Local memory is a particular type of memory, private to each thread, that can be used to store data that does not fit in registers (see the sketch below)
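
The sketch below shows all three memory spaces in one kernel; the kernel name and sizes are illustrative, not from the lesson. The arrays A and B live in global memory and are shared with the host, the scalar value is kept in a register and reused to avoid repeated reads, and the fixed-size per-thread array window may be placed in local memory if it does not fit in registers.

    import cupy as cp

    memory_spaces = cp.RawKernel(r'''
    extern "C" __global__
    void memory_spaces(const float * A, float * B, const int size)
    {
        int item = blockIdx.x * blockDim.x + threadIdx.x;
        if ( item >= size )
        {
            return;
        }

        float value = A[item];   // one read from global memory, kept in a register
        float window[3];         // per-thread array, may spill to local memory

        window[0] = value;
        window[1] = value * value;
        window[2] = value * value * value;

        B[item] = window[0] + window[1] + window[2];  // write back to global memory
    }
    ''', "memory_spaces")

    size = 1024
    a = cp.random.rand(size, dtype=cp.float32)
    b = cp.zeros(size, dtype=cp.float32)
    memory_spaces((size // 256,), (256,), (a, b, cp.int32(size)))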

Shared Memory and Synchronization
Constant Memory

Glossary

FIXME