## Pandemic blog 34: teaching on the screen

A small proportion of UW-Madison courses were being given in person (until last week, that is), but not mine. I’m teaching two graduate courses: introduction to algebra (which I’ve taught several times before) and introduction to algebraic number theory, which I’ve also taught before, but not for quite a few years. And I’m teaching them sitting in my chair at home. So I thought I’d write down a bit about what that’s like, since, depending on who you ask, either we’ll never do it again (in which case it’s good to record the memory) or this is the way we’ll all teach in the future (in which case it’s good to record my first impression).

First of all, it’s tiring. Just as tiring as teaching in the classroom, even though I don’t have to leave my chair. This surprised me! But, introspecting, I think I actually draw energy from the state of being in a room with people, talking at the board, walking around, interacting. I usually leave class feeling less tired than when I walked in.

On the screen, no. I teach lectures at 10 and 11 and at noon when both are done I’m wiped out.

My rig, settled on after other setups kept glitching out: Notability open on an iPad, where I write notes as if on a blackboard with the Apple Pencil; the iPad connected by a physical cable to the laptop, screensharing to a window on the laptop, which window I share in Microsoft Teams with the class while the laptop camera and mic capture my face and voice.

What I have not done:

• Gotten a pro-quality microphone
• Set up a curated “lecture space” from which to broadcast
• Recorded lecture videos in advance so I can use the lecture hour for discussion
• Used breakout rooms in Teams to let the students discuss among themselves

All of these seem like good ideas.

So far (but I am still in the part of both courses where the material isn’t too hard) the students and I seem to find this… OK. My handwriting is somewhat worse on the tablet than it is on the blackboard and it’s not great on the blackboard. The only student who has told me they prefer online is one who reports being too shy to speak in class, sometimes too shy even to attend, and who feels more able to participate by typing in the chat window with the camera turned off. That makes sense!

I have it easy — these courses have only thirty students each, so the logistical work of handling student questions, homework, etc. isn’t overwhelming. Teaching big undergraduate courses presents its own problems. What happens with calculus quizzes? In the spring it was reported that cheating was universal (there are lots of websites that will compute integrals for you in another window!). So we now have a system called Honorlock, which inhabits the student’s browser, watches IP traffic for visits to cheating sites, and commandeers the student’s webcam (!) to check whether their eye motions indicate cheating (!!). This sounds awful, and frankly kind of creepy, and not worth it. And the students, unsurprisingly, hate it. But then how does assessment work? The obvious answer is to give exams which are open book and which measure something more contentful about the material than can be tested by a usual quiz. I can think of two problems:

• Fluency with the basic manipulations (of both algebra and calculus) is actually one of the skills the class is meant to impart: yes, there are things a computer can do that it’s good to be able to do mentally. (I don’t think I’d place a complicated trig substitution in this category, but knowing that the integral of x^n is on the order of x^{n+1}, yes.)
• Tests that measured understanding would be different from and a lot harder than what students are used to! And this is a crappy time to be an undergraduate. I don’t think it’s a great idea for their calculus course to become, without warning, much more difficult than the one they signed up for.
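For the record, the basic fact that first bullet is gesturing at:

$\int x^n \, dx = \frac{x^{n+1}}{n+1} + C \quad (n \neq -1)$

That the answer is on the order of $x^{n+1}$ is exactly the kind of thing a student should know cold, even if a machine handles the constant.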

## Knuth, big-O calculus, implicit definitions (difficulty of)

Don Knuth says we should teach calculus without limits.

I would define the derivative by first defining what might be called a “strong derivative”: The function $f$ has a strong derivative $f'(x)$ at point $x$ if

$f(x+\epsilon)=f(x)+f'(x)\epsilon+O(\epsilon^2)$
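To see the definition in action, here’s a worked example (mine, not Knuth’s): take $f(x) = x^2$. Then

$f(x+\epsilon) = x^2 + 2x\epsilon + \epsilon^2 = f(x) + 2x \cdot \epsilon + O(\epsilon^2)$

so $f'(x) = 2x$, with no limit taken anywhere; the error term is literally $\epsilon^2$, which is certainly $O(\epsilon^2)$.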

I think this underestimates the difficulty for novices of implicit definitions.  We’re quite used to them:  “f'(x) is the number such that bla bla, if such a number exists, and, by the way, if such a number exists it is unique.” Students are used to definitions that say, simply, “f'(x) is bla.”

Now I will admit that the usual limit definition has hidden within it an implicit definition of the above kind; but I think the notion of limit is “physical” enough that the implicitness is hidden from the eyes of the student who is willing to understand the derivative as “the number the slope of the chord approaches as the chord gets shorter and shorter.”
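For what it’s worth, the $O(\epsilon^2)$ condition is at least easy to probe numerically, which might be one way to make it feel “physical” to a student. A minimal sketch (the function names are mine, and this is an illustration, not a proof):

```python
import math

# Probing the "strong derivative" condition f(x+eps) = f(x) + f'(x)*eps + O(eps^2).
# If the condition holds, the remainder divided by eps^2 stays bounded as eps
# shrinks; for f = sin at x = 1 it should hover near -sin(1)/2, about -0.42.

def remainder_ratio(f, fprime, x, eps):
    """Return (f(x+eps) - f(x) - f'(x)*eps) / eps^2."""
    return (f(x + eps) - f(x) - fprime(x) * eps) / eps**2

for eps in (1e-1, 1e-2, 1e-3):
    print(eps, remainder_ratio(math.sin, math.cos, 1.0, eps))
```

A student who runs this sees the ratio settle down to a constant rather than blow up, which is the whole content of the $O(\epsilon^2)$ clause.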

Another view — for many if not most calculus students, the definition of the derivative is a collection of formal rules, one for each type of “primitive” function (polynomials, trigonometric, exponential) together with a collection of combination rules (product rule, chain rule) which allow differentiation of arbitrary closed-form functions.  For these students, there is perhaps little difference between setting up “h goes to 0” foundations and “O(eps)” foundations.  Either set of foundations will be quickly forgotten.

The fact that implicit definitions are hard doesn’t mean we shouldn’t teach them to first-year college students, of course!  Knuth is right that the Landau notation is more likely to mesh with other things a calculus student will encounter, whether alongside calculus or in later years.  But Knuth seems to say that big-O calculus would be self-evidently easier and more intuitive, and I don’t think that’s evident at all.

Maybe we could get students over the hump of implicit definitions by means of Frost:

Home is the place where, when you have to go there,

They have to take you in.

(Though it’s not clear the implied uniqueness in this definition is fully justified.)

If I were going to change one thing about the standard calculus sequence, by the way, it would be to do much more Fourier series and much less Taylor series.
