Independently as Juniors and Seniors

Department of Physics, Lawrence University, Box 599, Appleton, WI 54911-5626

Voice: 920-832-6721; FAX: 920-832-6962; Email: david.m.cook@lawrence.edu

Text of Talk Delivered at the Winter Meeting of the American Association of Physics Teachers

San Diego, CA

8 January 2001

Most efforts using computers in physics programs
focus on introductory courses or individual upper-level courses.
In contrast, for a dozen years the Lawrence Department of Physics
has been orienting sophomores to the use of
general purpose graphical, symbolic, and numeric computational tools
so that they will be able to use these resources on their own
initiative in all their subsequent studies.
With support from NSF CCLI-EMD grant DUE-9952285,
the author is now six months into a two-year project whose main objective
is to recast numerous locally written materials into a
publication that can be easily customized for use with a variety of
hardware and software. The grant also supports faculty
workshops to be held at Lawrence University in the summers of 2001
and 2002 and modest beta testing in 2001-02. Current information about
this project can be found from a link at
`www.lawrence.edu/dept/physics`.

Many of you probably know that, for a dozen years or more, we at Lawrence University have been developing the computational dimensions of our upper-level curriculum. We have built a computational laboratory that makes a wide spectrum of hardware and software available to students and, concurrently, we have developed numerous documents introducing computational tools and describing prototypical applications. We have also developed an approach to introducing students to these resources and drafted several hundred pages of instructional materials to support that approach. Currently, I am on sabbatical. With support from the NSF-CCLI grant referenced in the abstract, I have embarked on a project whose main objective is the creation of an adaptable publication that I, the NSF, and Brooks-Cole (the publisher) all hope will be useful at other institutions seeking to increase the computational components of their curricula. In addition, to address both dissemination and assessment, the grant supports one faculty workshop at Lawrence in the coming summer, two faculty workshops in the summer of 2002, and modest beta testing of the materials in the 2001-2002 academic year. Today's talk expands on and updates what I said in a talk some of you may have heard last summer in Guelph. I want to

- lay out the underlying convictions that have emerged as guides to the Lawrence approach,
- briefly describe the structure of the Lawrence curricular components,
- tell you about the nature and current status of the publishing project in which I am currently engaged,
- invite you to communicate your suggestions and constructive criticism, to apply to participate in the workshop this summer, and/or to give thought to contributing as a "beta tester," and
- tell you how to get more information.

As I see it, the primary tasks of those parts of an undergraduate physics
program that focus on physics *majors* are, first, to awaken in our
students a full realization of the beauty, breadth, and power of
our discipline and, second (though a close second), to help
them develop not only a secure understanding of
fundamental concepts but also the skills to use a variety of tools in
applying those concepts. Among the tools, I would firmly
include the ability to use *computational* resources of
several sorts.

- I believe that our curricula must familiarize students
  - with the functions and capabilities of at least one operating system.
  - with the use of at least one good text editor (not a word processor).
  - with several types of computational tool, including
    - a spreadsheet like Excel.
    - resources like IDL, MATLAB, C or FORTRAN programming, ... for numerical processing of numbers and arrays.
    - resources like MACSYMA, MAPLE, MATHEMATICA, ... for symbolic manipulation of expressions.
    - resources like Kaleidagraph, IDL, Explorer, ... for graphical visualization of complex data.
    - resources like LaTeX for preparing reports and technical manuscripts.

- I believe that our curricula must familiarize students with several types of symbolic and numerical analyses, including solving ordinary and partial differential equations, evaluating integrals, finding roots, performing data analyses, fitting curves to experimental data, preparing technical reports, ....
- I believe that students must be introduced early to these tools in a way that helps them learn how to control the basic tool itself. Using canned exercises written by the instructor has its place, but I aim for more. I want students to develop the skill to work directly with the raw command structures of these several resources.
- I believe that use of computational resources must permeate the curriculum. Beyond curricula that provide the initial nudge in learning to use these tools, we must therefore provide the requisite facilities (hardware and software) and assure easy access at all times to those facilities, and we must structure much of the upper-level curriculum so that students have frequent opportunity to use these resources as they progress through the curriculum.
- I believe that the initial encounter with computational tools cannot be effectively accomplished as an aside to something that has higher priority. That encounter must occur in a context in which learning to use the tools themselves has first priority. Certainly, examples must be drawn from physics, but they must be chosen primarily because they illustrate features of the tools.
- I believe that an upper-level course in computational physics is a valuable curricular inclusion, but I also believe that students need to become acquainted with computational resources long before they have either the mathematical or the physical background to profit from a computational physics course. A course in computational physics should not be either their first or their only encounter with computational tools.

In the broadest of terms, we should---I argue---be structuring our
curricula so that, ultimately, students will recognize when a
computational approach may have merit and will have the personal
confidence to pursue that approach *on their own initiative*.

I think that the best way for me to
communicate the structure of the Lawrence
curriculum *vis-à-vis* computation is to track the computational
experience of an entering freshman physics major as she moves towards
graduation four years later. Our calendar is a three-term calendar, and
full-time students take three courses at a time, each of which
receives 1 Lawrence course credit, which---by the official rule
anyway---translates to 3-1/3 semester hours.

Prospective physics majors at Lawrence first encounter computational approaches in the introductory courses. Our laboratory is equipped with Macintosh computers, Vernier ULI cards, and a variety of sensors. Beyond LoggerPro (for data acquisition), students have access both in the laboratory and elsewhere on campus to Excel (for data analysis) and Kaleidagraph (for visualization). Exercises assigned in the laboratory routinely involve automated data acquisition, statistical analysis and curve fitting, and exercises assigned in lectures occasionally send students to the laboratory computers for graphing results or pursuing numerical solutions to Newton's laws with editable Excel templates. By the end of the freshman year, prospective majors have already developed some skills in the use of computational tools for physics, particularly skills of value in the laboratory.

Beyond the freshman year, students---of course---continue to use Excel and Kaleidagraph, but they also as sophomores have access to our Computational Physics Laboratory (CPL), which is equipped with six SGI UNIX workstations, monochrome and color printers, and software in all the categories I have already enumerated. Each student has an account in this departmental facility, and each is entitled to a key both to the CPL and to the building, so each has 24/7 access to the CPL.

To help sophomores become confident, regular users of the resources of
the CPL, we offer a course called *Computational Tools in Physics*, to
which---since it is the central focus of this talk---I will return in a moment.
Even those sophomores who
don't elect this course, however, encounter two short computational
workshops---one on IDL and the other on MACSYMA---in
our *required* sophomore mechanics course (Barger and Olsson)
and a couple of
computer-based exercises in our required sophomore electricity and magnetism
course (Griffiths). Thus, *all*
sophomores have at least a small, *forced* exposure to the CPL,
and some---but unfortunately not all---sophomores have a fully
comprehensive introduction to the available capabilities.

Subsequent theoretical and experimental courses alike offer students many opportunities to continue honing their computational skills and, depending on the instructor, some of these courses will direct students explicitly to the CPL for an occasional exercise. Most senior capstone projects will use the resources of the CPL and some, notably projects in fluid mechanics, musical acoustics, X-ray diffraction, multiphoton quantum transitions, and chaotic dynamics, make extensive use of these facilities. Some physics students use the CPL in conjunction with courses in other departments, particularly mathematics.

Let me return now to the *elective* course *Computational Tools
in Physics*, which
is the starting point in our nurturing of our students' abilities to take
full advantage of the resources of the CPL.
This full-credit course is
offered in three 1/3-credit segments, one in each of the three
terms of our academic year. Its topics are coordinated with the
sequence of *required* courses taken by sophomore physics majors.
The first term covers

- Tutorial Orientation to UNIX [1 week]
- Array Processing and Graphical Display (IDL) [2 weeks]
- Publishing Scientific Manuscripts (LaTeX and `tgif`) [1 week]
- Graphical Visualization (Explorer) [2 weeks]
- Symbolic Manipulations (MACSYMA) [2 weeks]
- Circuit Simulation (SPICE) [2 weeks]

The second term is coordinated with an intermediate course in classical mechanics, for which Barger and Olsson is the current text, and focuses on symbolic and numerical approaches to ordinary differential equations. In the first half of the term it covers

- Symbolic Solution of ODEs and Laplace Transforms (MACSYMA) [2 weeks]
- Numerical Solution of ODEs (IDL) [2 weeks]
- Numerical Solution of ODEs; Introduction to FORTRAN (LSODE) [1 week]
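
The stepping algorithms at the heart of these units can be sketched compactly. The fragment below is my own illustration in Python, rather than the IDL or FORTRAN actually used in the course; it applies a classical fourth-order Runge-Kutta step to the pendulum equation from the coordinated mechanics course, in units with g/L = 1:

```python
import math

def rk4_step(f, t, y, h):
    """Advance dy/dt = f(t, y) one step of size h with classical RK4."""
    k1 = f(t, y)
    k2 = f(t + h/2, [yi + h/2*ki for yi, ki in zip(y, k1)])
    k3 = f(t + h/2, [yi + h/2*ki for yi, ki in zip(y, k2)])
    k4 = f(t + h, [yi + h*ki for yi, ki in zip(y, k3)])
    return [yi + h/6*(a + 2*b + 2*c + d)
            for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]

def pendulum(t, y):
    """State y = [theta, omega] for theta'' = -sin(theta)."""
    theta, omega = y
    return [omega, -math.sin(theta)]

# Integrate a small-amplitude swing from t = 0 to t = 10.
h, steps = 0.01, 1000
y = [0.1, 0.0]
for i in range(steps):
    y = rk4_step(pendulum, i * h, y, h)
# For this amplitude the motion is nearly simple harmonic,
# so theta(10) should lie close to 0.1*cos(10).
```

With a step of 0.01 the scheme conserves the pendulum's energy to roughly one part in 10^7 over a thousand steps, exactly the kind of precision check we want students to make as a matter of habit.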

The third term is coordinated with an intermediate course in electricity and magnetism, for which Griffiths is the current text, and focuses on symbolic and numerical integration. In the first six weeks of the term, it covers

- Symbolic/Numerical Integration (MACSYMA/IDL) [2 weeks]
- Numerical Integration (FORTRAN, Numerical Recipes) [2 weeks]
- Root Finding [2 weeks]
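
The flavor of the root-finding unit can be conveyed with a short sketch. The following Python fragment is my own hypothetical example, not an excerpt from the course (which works in FORTRAN with Numerical Recipes); it applies bisection to the transcendental equation tan(x) = sqrt((x0/x)^2 - 1) that arises for the even bound states of a finite square well, with the well-strength parameter x0 = 4 chosen purely for illustration:

```python
import math

def bisect(f, a, b, tol=1e-12):
    """Find a root of f in [a, b] by bisection.

    Assumes f is continuous and f(a), f(b) differ in sign."""
    fa = f(a)
    while b - a > tol:
        m = 0.5 * (a + b)
        fm = f(m)
        if fa * fm <= 0:
            b = m          # sign change in [a, m]: keep the left half
        else:
            a, fa = m, fm  # sign change in [m, b]: keep the right half
    return 0.5 * (a + b)

# Hypothetical illustration: even bound states of a finite square well.
x0 = 4.0
f = lambda x: math.tan(x) - math.sqrt((x0 / x) ** 2 - 1)
root = bisect(f, 1.0, 1.5)   # the lowest state lies in this bracket
```

Bisection is slow but utterly robust, which makes it a natural first algorithm before students meet the faster but more temperamental Newton-type methods.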

The Lawrence approach to nurturing the abilities of students to use computational resources

- focuses on application programs with only small attention to traditional programming languages.
- uses numerous physical examples to motivate the study of tools and techniques.
- introduces a versatile text publishing system (LaTeX) for preparing problem assignments and reports.
- orients students to uses of computers both in theoretical contexts (solution of ODEs and PDEs, root finding, numerical integration, ...) and in laboratory contexts (data analysis, least squares fitting, image processing, ...).
- distributes the use of computational resources throughout the curriculum.
- pays close attention to assessing the precision of numerical results.
- *most importantly*, alerts students to computational resources early enough so that, subsequently, they are able to use these resources confidently, fluently, effectively, knowledgeably, and independently whenever they deem it appropriate.

Our approach is active; it compels students to play a
personal role in their own learning; it forces students
to defend their solutions in writing; it gives students practice
in preparing and delivering oral presentations; it encourages students
to work in groups; it permeates our curriculum;
and, more than any other objective, it
develops the students' abilities to operate in this
arena *on their own initiative*.

The grant referenced in the abstract and received last February from the Educational Materials Development track of the Course, Curriculum, and Laboratory Improvement program (CCLI-EMD) of the NSF provides support for converting the experience acquired and the extensive library of instructional materials developed at Lawrence into a flexible publication as a resource for other institutions. That we don't all use the same spectrum of hardware and software, however, poses a major challenge. The variety of options and combinations is so great that any single choice (or coordinated set of choices) is bound to limit the usefulness of the end result to a small subset of all potentially interested users. My effort to address that challenge involves assembling different incarnations of the basic materials from a wide assortment of components, some of which---the generic components---will be included in all incarnations and others of which---those specific to particular software packages---will be included only if the potential user requests them. I intend that the specific software and hardware treated in any particular incarnation will be microscopically "tailor-able" to the spectrum of resources available at the instructor's site. One incarnation, for example, may include the generic components and the components that discuss IDL, MAPLE, C, and LaTeX, while another may include the generic components and the components that focus on MATLAB, MATHEMATICA, and FORTRAN (including Numerical Recipes). Whatever components are included, chapters and pages will be numbered sequentially from the beginning without gaps, and the table of contents and index will contain information about only those components that were actually included.

I hope now to convey some of my
*present* conception of what the final product will look like.
Here, with the broadest brush, is my present tentative table of
contents:

- Overview of Materials
- Introduction to IDL *or* MATLAB *or* ...
- Introduction to MACSYMA *or* MAPLE *or* Mathematica *or* ...
- Introduction to Programming in FORTRAN *or* C
- Introduction to Numerical Recipes
- Solving ODEs
- Introduction to LSODE
- Evaluating Integrals
- Finding Roots
- Solving PDEs
- Data Analysis/Curve Fitting
- Fourier Analysis and Image Processing
- Introduction to UNIX *or* Windows *or* ...
- Introduction to LaTeX *or* Word *or* ...
- Introduction to TGIF *or* ...

- Chapter 1 stands alone; Chapters 2-5 introduce the general features of an array processor, a computer algebra system, a programming language, and the Numerical Recipes library; Chapters 6-12 address several important categories of computational processing; and the appendices (Chapters 13-15, since I don't know how to make HTML label them with the letters A, B, and C) introduce the use of an operating system, a publishing system, and a program for producing drawings.
- Options are indicated by enclosing the possibilities in square brackets and separating them with vertical bars. Especially within the later chapters, there are internal options that are not shown. Further, `or' should be read as `and/or' and, in some cases, selecting none of the options will also be possible.
- The order of presentation *in the book* does not compel any particular order of treatment *in a course or program of self-study*. While some later sections depend on some earlier sections, the linkages are not particularly tight. In my own use, for example, I might start with the IDL version of Chapter 2 and the MACSYMA version of Chapter 3, then move to the LaTeX version of Appendix B (Chapter 14), return to the IDL and MACSYMA portions of Chapter 8 on integration, then address the IDL and MACSYMA portions of Chapter 6 on ODEs, move to the FORTRAN portions of Chapters 4 and 5, and then return to Chapter 7 on LSODE and the FORTRAN portions of Chapters 6 and 8.

While the objective is for students to become fluent in the use of a spectrum of computational tools---and the chapters are organized by program or by computational technique---the focus throughout is on physical contexts.

To illustrate representative chapters, I have chosen here to describe Chapters 2 and 8 more fully. Chapter 2 represents chapters that introduce basic features of an application program, specifically a program for processing arrays of numbers and creating graphical visualizations of one-, two-, and three-dimensional data sets. The sections of the chapter dealing with MATLAB are tentatively titled

- Beginning a Session with MATLAB
- Basic Entities in MATLAB
- A Sampling of MATLAB Capabilities
- Properties, Objects, and Handles
- Saving/Retrieving a MATLAB Session
- Reading Data from a File
- On-line Help
- m-Files
- Eigenvalues and Eigenvectors
- Graphing Functions of One Variable
- Making Hard Copy
- Graphing Functions of Two Variables
- Graphing Functions of Three Variables
- Graphing Vector Fields
- Animation
- Advanced Graphing Features (Fonts, Drawing Space Curves, Using Multiple Windows, Customizing Axes, Working with Color, ...)
- Miscellaneous Occasionally Useful Tidbits
- References
- Exercises

The structure of Chapter 8 on evaluating integrals exemplifies the structure of all of the chapters on various computational techniques. Presumably, before approaching any particular section in this chapter, the student would have studied the relevant sections in earlier chapters. Tentatively, the sections in Chapter 8 are titled

- Sample Problems
- Evaluating Integrals Symbolically with MACSYMA *or* MAPLE *or* Mathematica *or* ...
- Algorithms for Numerical Integration
- Evaluating Integrals Numerically with IDL *or* MATLAB *or* ...
- Evaluating Integrals Numerically with MACSYMA *or* MAPLE *or* Mathematica *or* ...
- Evaluating Integrals Numerically with FORTRAN *or* C (including Numerical Recipes)
- Exercises

The first section, whose detail we will look at in a moment, sets several physical problems whose successful treatment benefits from the exploitation of a computational tool. The second section describes how one might use a symbolic tool in application to some of the problems set in the first section. Save for the last, the remaining sections describe suitable numerical algorithms generically and then illustrate how those algorithms can be invoked in a variety of ways. The final section lays out several exercises that students can use to hone their skills. Sections 8.1 (sample problems), 8.3 (numerical algorithms), and 8.7 (exercises) would be included in all versions of the chapter; each individual instructor would select only those other sections that are appropriate to that instructor's site.
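
As a concrete illustration of what a generic algorithm section might develop, here is composite Simpson's rule applied to the error function, one of the chapter's sample problems. The code is my own sketch in Python, not an excerpt from the materials (which would present the algorithm generically and then invoke it from IDL, MATLAB, or FORTRAN):

```python
import math

def simpson(f, a, b, n=100):
    """Composite Simpson's rule on n (even) subintervals of [a, b]."""
    if n % 2:
        n += 1                      # Simpson's rule needs an even count
    h = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3

# One of the chapter's sample problems, the error function:
# erf(x) = (2/sqrt(pi)) * integral of exp(-t^2) dt from 0 to x.
def erf(x):
    return 2 / math.sqrt(math.pi) * simpson(lambda t: math.exp(-t * t), 0.0, x)

approx = erf(1.0)   # agrees with the library value to better than 1e-8
```

With only 100 subintervals on [0, 1], the fourth-order convergence of Simpson's rule reproduces erf(1) to better than one part in 10^8, a point worth making explicitly to students when assessing the precision of numerical results.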

Let us step down one further level in the envisioned structure. Here is the present list of sample problems for Chapter 8:

- One-Dimensional Trajectories
- Center of Mass
- Moment of Inertia
- Large-Amplitude Pendulum (Elliptic Integrals)
- The Error Function
- The Cornu Spiral
- Electric/Magnetic Fields and Potentials
- Quantum Probabilities

They range over several subareas of physics and reveal that evaluation of integrals, perhaps as functions of one or more parameters, plays an important role in many areas of physics.
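
The large-amplitude pendulum entry, for example, reduces to a complete elliptic integral that must be evaluated as a function of the amplitude. A minimal Python sketch (again my own illustration, with the midpoint rule standing in for whatever quadrature a given incarnation would actually use) is:

```python
import math

def period(theta0, n=1000):
    """Period of a pendulum of amplitude theta0, in units with L/g = 1:
    T = 4 K(k) with k = sin(theta0/2), where the complete elliptic
    integral K(k) is evaluated by the midpoint rule on [0, pi/2]."""
    k = math.sin(theta0 / 2)
    h = (math.pi / 2) / n
    s = sum(1.0 / math.sqrt(1.0 - (k * math.sin((i + 0.5) * h)) ** 2)
            for i in range(n))
    return 4 * s * h

T_small = period(0.01)          # should approach 2*pi as theta0 -> 0
T_right = period(math.pi / 2)   # about 18% longer than the small-amplitude value
```

At an amplitude of 90 degrees the computed period exceeds the small-amplitude value 2*pi by about 18 percent, which makes the physical payoff of the numerical evaluation immediately visible.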

Even among sites that use the same spectrum of hardware and software,
however,
some aspects of local environments are still unique to individual
sites.
Rules of citizenship, user responsibilities,
practices and policies regarding accounts and passwords,
the features and elementary resources of the operating system,
the structuring of public directories, backup schedules,
after-hours access, licensing restrictions in force on proprietary
software, and numerous other aspects are
subject to considerable local variation. This book makes no attempt to
constrain local options in these matters. Throughout the
book, individual users are directed to a publication called the
*Local Guide* for site-specific particulars.
A suggested template for that guide, specifically the one used at
Lawrence, will be included in the supplementary materials available
to each user, but it will require editing to reflect local practices.

Finally, let me speak briefly about how I think I
can achieve the necessary
flexibility in publishing so individual users can select from a spectrum of
components that speak particularly to their specific hardware and software.
Beyond doing a superb job of formatting complex equations,
tables, arrays, and text, the package LaTeX
is able to create tables of contents and indices automatically.
One of the available packages (the `ifthen` package), when
invoked, adds a capability to include or exclude different files
depending on whether a controlling Boolean variable has the value `true'
or `false'. Thus, one

- places each module in its own file (each of which contains also the commands for the associated entries to the table of contents and to the index),
- creates a driving source file that contains little more than commands to set an assortment of Boolean flags, to create the table of contents and index, and to input the files selected by the states of the Boolean flags,
- processes that driving file a suitable number of times through LaTeX, invoking `makeindex` just before the last processing through LaTeX,
- submits the resulting `dvi` file to `dvips`, and
- prints the resulting PostScript file.

```latex
\documentclass{book}

% Input the packages for conditional inclusion and for the index.
\usepackage{ifthen}
\usepackage{makeidx}
\makeindex

% Set flags, e.g., IDL, to true or false.
\newboolean{IDL}
\setboolean{IDL}{true}
% ... similar flags for the other modules ...

\begin{document}

\tableofcontents

\ifthenelse{\boolean{IDL}}{\input{\head/IDL/IDL.tex}}{}
% ... similar conditional inputs for the other modules ...

\printindex

\end{document}
```

Here, we display the parts of the LaTeX code that effect in particular
the conditional input of modules and the making of the table of contents and
index. Early on, we must input the requisite packages
to define commands for conditional execution and for constructing the index, and
we must set a dozen or so flags, e.g., `IDL`,
to `true` or `false` to specify
which modules are to be included. Then, in the main body of the
processing, we request
the table of contents, input all possible modules (with each input statement surrounded
by a conditional controlled by the flag set earlier), and request the printing of
the index. Note that the flag `IDL`, for example,
not only controls whether the IDL chapter is included but also,
internal to many other chapters, controls whether the IDL discussions of
representative problems will be included.
Once the flags are set as desired, we simply process this file as described in the
previous transparency, and the job is done. Clearly, such
flexibility would be impossible were we not able to exploit features of
LaTeX, including the particular capabilities of the
`ifthen` and `makeidx` packages and the auxiliary
program `makeindex` to format the
PostScript files that subsequently can be printed to obtain each
incarnation.

Brooks-Cole is committed to participating in the process of refining this essential procedure so as to be able to make a commercially feasible product. Their editors claim that they will be able to produce the desired customization economically for orders of as few as ten copies. Further, once the structure has been fully worked out and debugged, there is no reason at all why other authors might not contribute components, so---over time---the product will expand to accommodate a wider and wider spectrum of hardware and software and maybe even to include broad topics not originally in the plan.

More detailed information about this summer's workshop, including an on-line application form for participation, is available at the URL

Beyond the workshops this summer and next, my grant supports a modest amount of beta testing during the 2001-2002 academic year. Because of the flexibility of the materials and the variety of contexts in which the materials might be used, testing may occur in formal courses or in informal contexts of self-study or minimally guided study. Further, any single beta-test site will almost surely be in a position to test only portions of the developing materials.

This project is still very much a work in progress, and I have no conclusions to report at this moment. By the completion of the first phase of this project a year from now in December, 2001, I will have devoted full time for the equivalent of one academic year and two summers to the writing, and one faculty workshop will have been completed. In the period from January, 2002, to June, 2002, I will be back teaching full time but will also be devoting time to polishing drafts and responding to comments and criticisms from beta testers. In the summer of 2002, I will conduct two more faculty workshops and finish the polishing. The target is to deliver the LaTeX source files for all of the modules and for representative solutions to the exercises to Brooks-Cole by September, 2002, for a January, 2003, publication date. Absent genuine conclusions, let me simply stop by inviting you to

- check the project website at the above URL.
- give me your suggestions and comments about ways in which the broad usefulness of the product I have described could be enhanced.
- apply for participation in the 2001 workshop.
- catch me after this session or later in the meeting---I will be here through Tuesday evening---to put yourself on an email distribution list or pick up a copy of the descriptive flyer.
- stop by the Brooks-Cole booth to pick up a copy of the flyer, add your name to an email distribution list, and/or examine the couple of copies I have placed there containing different selected subsets of materials as they exist at the moment.