Become a sponsor to Frank Odom
Your sponsorship helps support open-source artificial intelligence research. My recent work focuses on efficient Transformer-based architectures, especially for NLP, computer vision, and self-driving. All of this comes out of my personal time and is released under permissive licenses (e.g. MIT).
If you have benefited from my projects, please consider becoming a sponsor.
1 sponsor has funded fkodom’s work.
Featured work
- fkodom/fft-conv-pytorch
  Implementation of 1D, 2D, and 3D FFT convolutions in PyTorch. Much faster than direct convolutions for large kernel sizes.
  Python · 485 stars
- fkodom/transformer-from-scratch
  Code accompanying my blog post: https://fkodom.substack.com/p/transformers-from-scratch-in-pytorch
  Python · 90 stars
- fkodom/yet-another-retnet
  A simple but robust PyTorch implementation of RetNet from "Retentive Network: A Successor to Transformer for Large Language Models" (https://arxiv.org/pdf/2307.08621.pdf)
  Python · 103 stars
- fkodom/clip-text-decoder
  Generate text captions for images from their embeddings.
  Python · 103 stars
- fkodom/dilated-attention-pytorch
  (Unofficial) implementation of dilated attention from "LongNet: Scaling Transformers to 1,000,000,000 Tokens" (https://arxiv.org/abs/2307.02486)
  Python · 50 stars
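The speedup behind fft-conv-pytorch comes from the convolution theorem: convolution in the time domain is pointwise multiplication in the frequency domain, so for a kernel of size k the cost drops from roughly O(n·k) to O(n log n). A minimal 1-D sketch of the idea (using NumPy for illustration; this is not the fft-conv-pytorch API itself):

```python
import numpy as np

def fft_conv1d(signal, kernel):
    """1-D convolution via the convolution theorem: transform, multiply
    pointwise in the frequency domain, then transform back. Illustrative
    sketch only -- not the fkodom/fft-conv-pytorch implementation."""
    n = len(signal) + len(kernel) - 1  # length of the "full" convolution
    # rfft zero-pads both inputs to length n before transforming.
    return np.fft.irfft(np.fft.rfft(signal, n) * np.fft.rfft(kernel, n), n)

signal = np.random.rand(4096)
kernel = np.random.rand(512)

fast = fft_conv1d(signal, kernel)
direct = np.convolve(signal, kernel)  # O(n*k) sliding-window reference
assert np.allclose(fast, direct)      # same result, cheaper for large kernels
```

The advantage grows with kernel size, which is why the repo highlights large kernels in particular.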
0% towards 10 monthly sponsors goal
Be the first to sponsor this goal!
$5 a month
Basic
- Get a Sponsor badge on your profile
- Free paid subscription to my blog (fkodom.substack.com)
$25 a month
Platinum
- Logo or name goes in my project README
- Shoutout on LinkedIn or a platform of your choice
- Everything in the $5/month tier
$100 a month
Hero 🦸
- Will consider naming my next child after you (no promises 😉)