Well, I am teaching ELEC4630 Computer Vision and Deep Learning using Jeremy Howard’s fastai courses as the main teaching tool. My class uses devcontainers to run Jeremy’s notebook code on the UQ deep learning PCs or on their own personal computers, with or without GPU support.

This seems to be working out well, and the students are being introduced to the joys of Visual Studio Code, Unix system administration, and Jupyter notebooks - all using devcontainers.

Why do I prefer this method of teaching for deep learning? Well, the biggest advantage is that the students have an identical copy of my machine, including all the software configuration. All they have to do is get Windows Subsystem for Linux (WSL), Docker Desktop, and Visual Studio Code up and running, which takes about 30 minutes. Then they simply open the devcontainers from GitHub and can access their machine’s GPU. So there is no Unix configuration and there are no library incompatibilities to worry about. If there is a problem, I can fix the devcontainer image on GitHub myself, without asking the IT Group to correct it.
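To make that concrete, here is a minimal sketch of the kind of devcontainer.json I am describing. The image name, port, and extension list are illustrative assumptions, not the actual course configuration:

```jsonc
// A minimal devcontainer.json sketch - illustrative only, not the course's actual file.
{
    "name": "deep-learning-course",
    // Hypothetical pre-built image; in practice the course image is specified on GitHub.
    "image": "ghcr.io/example/elec4630-devcontainer:latest",
    // Pass the host GPU through to the container (needs Docker's NVIDIA GPU support);
    // students without a GPU can delete this line and the same container runs on the CPU.
    "runArgs": ["--gpus", "all"],
    // Forward Jupyter's default port so notebooks open in the local browser.
    "forwardPorts": [8888],
    "customizations": {
        "vscode": {
            "extensions": ["ms-python.python", "ms-toolsai.jupyter"]
        }
    }
}
```

Because the GPU pass-through is a single line, the same repository works on the lab machines and on GPU-less laptops.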

The IT Group only supports networked PCs running Windows, but deep learning really must be taught on Unix. Also, the students really do need root privileges to install certain libraries. This can be achieved simply by using WSL. However, the problem then is that students need to install so much software on WSL just to run a deep learning example notebook that supporting them becomes impossible. In addition, this requires students to understand Unix system administration, which they have never been taught, and no two student-configured systems would be alike. By running the devcontainer, they not only get Unix root access but also get all the necessary compatible libraries, because the complete devcontainer machine is specified on GitHub.
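As a small illustration of the root-access point, a student inside the running container can add a system package the normal Unix way. The package below is just a hypothetical example of something a vision notebook might need:

```bash
# Inside the devcontainer the student has root (or passwordless sudo),
# so missing system libraries can be installed without involving the IT Group.
# libgl1 is only a hypothetical example of a package a vision notebook might want.
sudo apt-get update
sudo apt-get install -y libgl1
```

Anything installed this way lives only in the running container; a rebuild returns the student to the known-good image defined on GitHub.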

They can also fork the repository and commit their modified source to GitHub for assignment submission. In general, the fork must be Private rather than Public, but I have figured out how to do this now.
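For anyone wanting to try the same thing, here is one common way to get a private copy of a public repository on GitHub (a public fork cannot simply be switched to private, so the usual trick is to duplicate the repository into a new private one). This is only a sketch, not necessarily the exact workflow used in the course, and the repository names and URLs are placeholders:

```bash
# One way to submit work from a private copy of a public course repository.
# Repository names and URLs are hypothetical placeholders.

# 1. Create an empty *private* repository on GitHub (e.g. my-assignment), then
#    mirror the public course repository into it.
git clone --bare https://github.com/example/course-repo.git
cd course-repo.git
git push --mirror https://github.com/student/my-assignment.git
cd .. && rm -rf course-repo.git

# 2. Work from the private copy and push commits for marking.
git clone https://github.com/student/my-assignment.git
cd my-assignment
# ...edit notebooks, run them in the devcontainer...
git commit -am "Assignment submission"
git push
```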

So why not simply use free public GPU resources such as Codespaces, Colab, or Kaggle? The trouble is that these resources require a credit card and will bill you if you use them too much. Some of my students may not have a credit card, and besides, I simply don’t want them paying anything. Also, Colab and Kaggle keep updating their engines to the latest library versions, which can cause notebook code to break at awkward times - such as when an assignment is due. Codespaces is OK, but it also charges students for GPU access. I simply can’t run a course where I have no control over the coding environment. Nor can I assist students on Kaggle and Colab, because I don’t have access to their accounts to see what is wrong - especially out of hours. This is also a major problem with AWS services.

With devcontainers, everyone starts with an identical disk image, which remains quite stable until I issue a new one, so I know exactly where to start if I want to help them. Also, students can refresh their Unix environment in about 10 seconds and get back to a sane coding environment that I am familiar with.

Devcontainers can be run for free in our UQ Deep Learning Lab. These machines are also available remotely, so students can access them from anywhere at any time. In addition, students can run the same GitHub code on their personal laptop or desktop, with or without a GPU.

Finally, at the end of the deep learning course, students can take this environment as far as they like. I have debugged all of Jeremy’s code so that it runs smoothly, which means students now have a coding environment capable of running all of his notebooks - and of supporting their learning journey well beyond the course.

Enjoy!

Happy coding on your devcontainers.

Brian
