My daily notes have become sparser; note to self to be more verbose.
What is in my notes: reading Vicki Boykis' post "What we don't talk about when we talk about building AI apps". In particular, even as we move into this new paradigm of LLMs, we still have to contend with bloated Docker containers. The post describes images with deep learning libraries as large as 10 GB!
This reminded me of work I did at Airtable to reduce Docker image size by:
- Reordering the Dockerfile so that instructions that change less frequently sit higher in the file (see the sketch after this list),
- Removing unused Python modules, and
- Setting AWS CodeBuild to do a deep rather than a shallow git clone
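
Here's a minimal sketch of the reordering idea (not Airtable's actual Dockerfile; the base image, file names, and command are all hypothetical): dependency manifests change rarely, so copying and installing them before the application code lets Docker reuse the cached dependency layer across most rebuilds.

```dockerfile
# Hypothetical example: stable instructions first, volatile ones last,
# so Docker's layer cache is invalidated as late as possible.
FROM python:3.11-slim

WORKDIR /app

# The dependency manifest changes rarely; copying it by itself means the
# expensive pip install layer below stays cached across app-code edits.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Application code changes on nearly every commit; copying it last means
# only the layers from here down get rebuilt.
COPY . .

CMD ["python", "main.py"]
```

The second bullet compounds with this: every module dropped from the dependency manifest shrinks that cached install layer, which is usually the bulk of the image.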
I know, it's nothing fancy. The last one was particularly counter-intuitive, but the AWS rep said they use a Go git client and apparently that made the difference (it's also unclear whether that hack still works). All in all, though, a 60% reduction in image size.
Modal discusses building a Docker-compatible custom container runner, image builder and filesystem. Nice.