What I want to do at the Recurse Center between November 9th 2020 and February 12th 2021 (a list in progress)
Contents
- Courses
- Other areas to cover through some means
- Other courses that I mean to finish by the end of 2021
- Object Detection
- Tools and practices
- Miscellaneous topics, tasks and tutorials
- Non-technical or non-engineering
- Other goals
 
My overarching goal is to follow Paul Graham’s advice to turn myself into ‘the sort of person who can have organic startup ideas’*.
Courses
I don’t know how many of these it is feasible to finish in about 12 weeks, but I wish to do as many as possible.
Key:   ⬜ To Do   ✴️ In Progress   ✅ Done   🟨 Target is to finish by the end of RC / challenge deadline   🟢 Programming Parts Done   🔵 A self-contained section complete   [ ] Tasks for challenge
Machine learning
- Natural Language Processing with Deep Learning (Stanford CS224n) (5 x HW + 1 x project proposal + 1 x project milestone + 1 x final project)
  ✴️ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜
- Deep Reinforcement Learning (Berkeley’s CS285) (5 x HW + 1 x project milestone + 1 x final project; the course doesn’t specify a milestone but I’m including one analogously to the other ML courses here)
  ✴️ 🟨 🟨 🟨 ⬜ ⬜ ⬜
- Berkeley Deep Unsupervised Learning (4 x HW + 1 x project milestone + 1 x final project)
  ✅ ✅ ✅ ✅ ⬜ ⬜
  (HW1: Autoregressive Models is considered complete due to the creation of this tutorial, in addition to the successful use of autoregressive models in other homeworks)
- Stanford Deep Multi-Task and Meta Learning (4 x HW + 1 x project milestone + 1 x final project)
  ✅ 🔵 🟨 ⬜ ⬜ ⬜
- ✴️ Full Stack Deep Learning
- Probabilistic Graphical Models
- Reproducing kernel Hilbert spaces in Machine Learning (1 x assignment + 2 x practice exercises)
  ⬜ ⬜ ⬜
 
CS
- MIT Computation Structures AND / OR Nand2tetris
  - Nand2tetris (13 x HW) ✅ ✅ 🟨 🟨 🟨 ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜
- CUDA programming (Caltech CS179, Oxford)
  UPDATE: Going to focus on CS179, as its assignments are more challenging (no criticism of the Oxford course, which has good lectures that I have found useful, but it was designed as a crash course rather than a computer science course).
  - Caltech CS179 (6 x labs + 1 x project proposal + 1 x project) ✅ ✅ ✅ ✅ 🟨 🟨 ⬜ ⬜
  - Oxford (12 x practicals) ✅ ✅ ✴️ 🟨 🟨 🟨 ⬜ ⬜ ⬜ ⬜ ⬜
- Caltech CS171: Introduction to Computer Graphics (HW0 + HW1-6 + HW7.1 + HW7.2)
  ✅ ✅ ✅ 🟨 🟨 ⬜ ⬜ ⬜ ⬜
- MIT Distributed Systems (6.824) (Lab 1 + Lab 2A-C + Lab 3A-B + Lab 4A-B + project proposal + final project)
  ✅ ✅ ✅ ✅ 🟨 🟨 ⬜ ⬜ ⬜ ⬜
- Cryptography (Stanford course, maybe also Cryptopals)
  - Stanford course (6 x units + 1 exam) ✅ ✅ ✅ ✅ ✅ 🟨 🟨
- Stanford Programming Languages (8 x HW)
  ✅ ✅ 🟨 🟨 🟨 ⬜ ⬜ ⬜
- Stanford Compilers
- MIT Computer System Engineering AND / OR MIT Operating Systems Engineering
- Washington: The Hardware/Software Interface (4 x HW + 5 x labs + 2 x exams, which I may do for more practice but will not regard as exams)
  ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ❔ ❔
 
Miscellaneous study goals
- Stanford Convex Optimization (Stanford’s course on the edX platform; 11 units but only 9 assignments, so I am doing an equivalent number of exercises for the remaining two)
  ✅ ✅ ✅ ✅ ✅ ✅ ✅ 🟨 🟨 ⬜ ⬜
- MIT Discrete Mathematics (12 x problem sets, but I might skip the last 4 as they cover probability, which I have already studied)
  ✅ ✅ ✅ ✅ ✅ ⬜ ⬜ ⬜ ❔ ❔ ❔ ❔
- Geometric Folding Algorithms (5 x HW + 1 x project milestone + 1 x final project; the course doesn’t specify a milestone but I’m including one analogously to the ML courses above)
  ✅ 🟨 🟨 🟨 ⬜ ⬜ ⬜
- MIT Advanced Fluid Mechanics (12 x units)
  ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅
- edX Mastering Quantum Mechanics (14 assignments)
  ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅
 
Other areas to cover through some means
- Linear Algebra
  - Review basics more deeply (when I originally studied it I didn’t know much coding, and I think I will perceive it differently and more intuitively now that I have a lot of experience manipulating multi-dimensional arrays; see the sketch after this list)
  - ✴️ Eigenvectors and eigenvalues
  - ✴️ Singular Value Decomposition
- ✴️ Wavelets
- Causality (list of books discussed here)
- Quantum Computing and Machine Learning (UTQML101x, this book)
- Optimal Transport (potential sources to use: A Brief Introduction to Optimal Transport Theory, Computational Optimal Transport)
- Expectation Maximisation
- Chemistry AND / OR Biology topic each week (x 12)
  ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜
 
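For the linear algebra items above, a minimal NumPy sketch of the array-level review I have in mind (the matrix is an arbitrary illustrative example, not from any particular course):

```python
import numpy as np

# An arbitrary symmetric matrix, so the eigendecomposition is real-valued.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# Eigendecomposition: A @ v should equal lam * v for each eigenpair.
eigvals, eigvecs = np.linalg.eigh(A)
for lam, v in zip(eigvals, eigvecs.T):
    assert np.allclose(A @ v, lam * v)

# SVD: A = U @ diag(S) @ Vt, so multiplying the factors back together recovers A.
U, S, Vt = np.linalg.svd(A)
assert np.allclose(U @ np.diag(S) @ Vt, A)

# For a symmetric positive-definite matrix, the singular values coincide
# with the eigenvalues (up to ordering).
assert np.allclose(np.sort(S), np.sort(eigvals))
```
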
Other courses that I mean to finish by the end of 2021
- edX Computer Graphics (CSE167x)
 
Object Detection
- Read all the references and make a “depth-first” tree for:
  - MaskRCNN
  - DETR
- Implement:
  - MaskRCNN
  - RetinaNet
  - YOLO
  - SSD
  - CenterNet
  - DETR
  - VoxelNet (overlaps with the Lyft challenge model; see below)
  - BotNet
  - Others
- Come up with my own model
 
Tools and practices
- Deconstruct an existing version of each of the following and write my own (for the pip package, see the sketch after this list):
  - Makefile for CUDA
  - Makefile for C++
  - Dockerfile
  - Pip package
  - Bash script
  - requirements.txt
 
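For the pip package item, a minimal sketch of the sort of setup.py I would deconstruct and then rewrite; the package name toypkg and its metadata are hypothetical:

```python
# setup.py - minimal packaging sketch; "toypkg" and its metadata are made up for illustration.
from setuptools import setup, find_packages

setup(
    name="toypkg",
    version="0.1.0",
    description="Toy package for learning how pip packaging works",
    packages=find_packages(),      # finds any toypkg/ sub-packages containing __init__.py
    python_requires=">=3.6",
    install_requires=["numpy"],    # runtime dependencies, mirroring requirements.txt
)
```

Running `pip install -e .` (or `python setup.py sdist`) against a layout like this is then enough to see how metadata, package discovery and dependencies fit together.
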
Miscellaneous topics, tasks and tutorials
- ✴️ Hopfield Networks (see the sketch after this list)
- Study the tutorials in Scratchapixel
- All the exercises in Stanford Machine Learning (CS229)
- Topics from the CLRS Introduction to Algorithms book
  - ✴️ Chapter 26 Maximum Flow / MIT Lecture 13
- Topics from Stanford CS231A (covers some useful theoretical concepts)
- Depth First Learning AlphaGo Zero tutorial (6 parts)
  ✅ ✅ ✅ ✅ ✅ 🟨
- Depth First Learning WGAN tutorial (5 parts)
  ✴️ 🟨 🟨 ⬜ ⬜
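For the Hopfield Networks item above, a minimal NumPy sketch of the classical binary Hopfield network (Hebbian storage plus asynchronous sign updates); the stored patterns are arbitrary toy data, not from any specific tutorial:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two arbitrary +/-1 patterns of length 16 to store.
patterns = rng.choice([-1.0, 1.0], size=(2, 16))
N = patterns.shape[1]

# Hebbian rule: W = (1/N) * sum of outer products, with no self-connections.
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0.0)

def recall(state, steps=100):
    """Asynchronous updates: repeatedly set one random unit to sign(W @ state)."""
    state = state.copy()
    for _ in range(steps):
        i = rng.integers(N)
        state[i] = 1.0 if W[i] @ state >= 0 else -1.0
    return state

# Corrupt a stored pattern in a few positions; recall should usually restore it.
noisy = patterns[0].copy()
noisy[:3] *= -1
print(np.array_equal(recall(noisy), patterns[0]))
```
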
Non-technical or non-engineering
- Reading
- Language learning
- Art skills
- Accounting (I am going to use Accounting and Finance for Non-Specialists, 11th edition)
- Economics: MIT Microeconomics
- Valuation: NYU Stern course (Valuation MBA Spring 2020)
- Startup School
 
Other goals
- Think of ~10 ideas that are beyond my present level of skills or knowledge, i.e. where I have no idea what to do, and then try to come up with a plan to realise them 0️⃣ 3️⃣
  ✅ ✅ ✅ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜
- Write >= 12 technical blogposts whose main purpose is to practise articulating technical ideas rather than to be amazing 0️⃣ 3️⃣
- Read >= 101 machine learning papers (I get credit only when I have produced some output related to the paper) 0️⃣ 4️⃣ 8️⃣
 
✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅ 🟨 🟨 🟨 🟨 🟨 🟨 🟨 🟨 🟨 ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜
- Papers for CS330: Supervised multi-task learning and transfer learning (2)
 - Papers for CS294: Autoregressive Models (2)
 - Notes on NeRFs (1)
 - Weeks 1-4 at RC and ideas for week 5 (excluding the 3 also in the post below, 4)
- 101 Papers (30)
- Implement and train ~24 ML papers from scratch
  ✅ ✅ ✅ ✅ ✅ ✅ ✅ ✅ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜ ⬜
  - SNAIL
  - MAML for sinusoid data
  - Simple 2D data VAE and DC-GAN
  - Transformer for machine translation (this is a placeholder for now as I can’t remember which other model I implemented, and I only want to give myself credit when it is due)
  - MoCov1v2 (v1 with some v2 features by mistake)
  - MoCov1v2-TPU and MoCov1
  - Various simple Flow Models (counted as 1)
  - Various simple image VAE Models (counted as 1)
- Implement and train the following Kaggle prize-winning models