Metrika (2000) 51: 91-104
© Springer-Verlag 2000
Design principles for technology-based statistics education

Paul F. Velleman
Cornell University, 358 Ives Hall, Ithaca NY 14853, USA (e-mail: [email protected])
Abstract. Statistics is widely recognized as a discipline well suited to technology-based education. Despite the appearance of a number of innovative contributions to computer-based statistics teaching, there have been few clear statements of appropriate design principles. This article begins the process of codifying consistent principles for the design of computer-based materials for teaching statistics. The principles emerged as part of an ongoing effort to develop multimedia-based statistics materials for the ActivStats project. We draw from that project to illustrate the principles with specific examples.

Key words: a

1 Introduction

The design of a learning environment for technology-mediated learning in statistics requires attention to several principles that have not been clearly articulated. These principles emerge from recognizing and embracing the challenges and opportunities that computer-based teaching provides. This article proposes specific principles, illustrates them with examples from a published computer-based statistics text, and discusses alternatives and consequences. More detail about multimedia instruction appears in Ambron and Hooper (1990) and Grabinger et al. (1990). Haykin et al. (1993) survey multimedia methods and technology more generally. Jensen and Sandlin (1995) examine current use of multimedia in higher education. Biehler (1993) suggests desirable features of software for teaching statistics. Web address to chapter 3: www.datadesk.com/ActivStats/.
2 Statistics and computer-based teaching

Statistics offers several advantages as a subject for computer-based teaching:
- Statistics is practiced with computers. Students learning how to apply statistics on computers are learning the skills they need to carry away from a course, in the manner in which they will need those skills. In formal education theory, this is sometimes called authentic instruction.
- Much of the development of probability and the argument for statistical inference rely on imagining many repetitions. Computers make such repetition possible, turning this aspect of the course into a participatory experimental science course rather than a dry mathematics course.
- Many of the insights of statistics can be illustrated graphically. Abstract concepts, from least squares to the relationship between density curves and probability to conditional probability and independence, can be made concrete with appropriately designed illustrations. When these illustrations can be animated or, even better, can interact with the student directly, the advantages of such visualization go far beyond anything possible on a static book page or blackboard.
- Modern statistics teaching has moved toward courses motivated by data analysis. The report of the joint ASA/MAA focus group on teaching statistics (Cobb 1992) specifically recommends "More data and concepts; less theory, fewer recipes." Computer-based teaching makes it practical to provide students with many real-world datasets to analyze, and tools for the analyses. The days of pounding a calculator as the primary activity in a statistics "laboratory" are (or should be) long past.
- Modern courses emphasize the practical value of statistics. Video stories about statistics applications and links to internet sites help to anchor the course, answering the students' key question: "Why am I taking this course?" The traditional answer ("It's required; shut up and sit down.") was never very good. Modern answers are based on showing students directly how statistics is valuable in science, industry, and policy formation.
- It is always difficult to find an ideal pace for a statistics course. Because statistics is not mathematics, even students with solid mathematics backgrounds may stumble over concepts like the Central Limit Theorem and the sampling distribution of a statistic. Computer-based teaching can allow each student to devote extra time to some topics while accelerating through others.
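The "many repetitions" point lends itself to a few lines of code. The sketch below is a generic Python illustration (not drawn from ActivStats) of the long-run regularity a computer can make tangible: the running proportion of successes in repeated Bernoulli trials wanders at first and then settles near the true probability.

```python
import random

def running_proportion(p, trials, seed=1):
    """Simulate Bernoulli(p) trials; return the running proportion of
    successes after each trial (the quantity a student would watch)."""
    rng = random.Random(seed)
    successes = 0
    proportions = []
    for i in range(1, trials + 1):
        successes += rng.random() < p
        proportions.append(successes / i)
    return proportions

# Early proportions wander; after many repetitions they settle near p.
path = running_proportion(0.3, 10_000)
```

A plot of `path` against trial number is exactly the picture described above: short-term unpredictability, long-run regularity.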
3 An example

ActivStats (Velleman 1997) is a multimedia presentation of the introductory statistics course. Many of the principles discussed here were developed and refined in the process of designing and writing ActivStats. Section 8 discusses experiences using ActivStats in teaching and suggests issues for future research. Information about ActivStats can be found at www.datadesk.com/ActivStats/.

ActivStats is designed to supplement a variety of statistics course formats. It is thus neither a distance-learning course itself, nor materials specifically designed for synchronous distance classes. Computer and communications technology can be used to facilitate synchronous communication among students and between students and teachers in different locations. Although such synchronous classes may enhance learning in some subjects, in which they can bring an expert or practitioner into the classroom at an affordable price, it is not clear that synchronous classes enhance statistics learning. It seems far more likely that the benefits of asynchronous learning modes, which let students advance at their own individual pace and review material as often as they wish, are of particular value to statistics students. This article thus focuses on ways to get the most out of technology in teaching statistics.

4 Design principles

As part of the ongoing ActivStats development project we have codified several design principles. The principles given here have been refined over the several years of the project, based on experience in designing, implementing, and teaching with computer-based teaching materials, and on student feedback.
- Use each multimedia channel appropriately. Velleman and Moore (1997) discuss specific multimedia channels and their best use in teaching statistics. For example, text read from a screen is not as effective or as easy to work with as text printed on a page. Simply producing a book-like manuscript in HTML is not an appropriate use of the technology. Similarly, video is known to be ineffective for exposition of new ideas but effective for changing attitudes. Thus, video clips of a lecturer are a poor use of the technology, but video that shows real-world applications of statistics can be quite effective.
- Address the subject one concept at a time. Each concept deserves the best possible presentation. In designing ActivStats, we partitioned the introductory course into individual concepts and methods. By rough count, the introductory statistics course covers almost 150 new concepts, definitions, and methods in a semester of work. To cover the material, a lecturer must present four or five new concepts in each lecture. But statistics methods are best learned by applying them. Traditional teaching methods thus ask students to absorb and understand four or five new ideas at once, when they can use their new knowledge for the first time only at some later date (typically the evening before the homework is due). Computer-based teaching makes it possible to motivate and explain a single concept, giving the student all the time he or she needs to become comfortable with the new knowledge. We can then immediately reinforce the new, fragile learning by offering opportunities to apply new methods, employ the new knowledge in an interactive simulation, or exercise it with a quiz or other study assistant.
- Grant control to the student. Students should feel in control of the pace and intensity of the learning experience. Indeed, one of the principal advantages that technology offers students of statistics is the ability to accelerate coverage of topics they find easy and slow down discussion of topics that they find difficult or confusing.
ActivStats provides this control by teaching individual concepts and by letting the student initiate each learning activity. The activities themselves last between two and fifteen minutes, with the longer ones involving extensive student interaction. In addition, a student can stop and review an activity at any time (Figure 1).
Fig. 1. The ActivStats Lesson book introduces topics. Students launch learning activities by clicking on the icons (three of which are shown on the page).
When students control the pace of presentation, they no longer feel that they are hanging on, trying to keep up with a course that moves at a pace out of their control. Some of our students work through activities rapidly. Others review them several times until they feel comfortable enough to go on to the next concept.

- Be consistent. Design consistency requires attention to the deepest of details. Computer-based learning places the student in an invented world. We all must be comfortable with our surroundings before we can turn to learning new things. The computer-based world must therefore be comfortable and consistent. Students shouldn't have to learn new ways of doing things with each tool, or even find changing color schemes and labeling conventions. For example, the vertical axis of all displays in ActivStats looks and works in the same way. The axis appears first in bar charts and histograms.
It appears again as the axis against which to display values from a single variable in a dotplot. The same dotplot then appears as a tool in which points dragged by the student dynamically change measures of center and spread, so students can see how individual values affect (or fail to affect) statistics such as the mean, median, and standard deviation. Subsequent activities add a horizontal axis, teaching about scatterplots. The same Cartesian display then hosts exercises in correlation (placing points on the plot to yield a specified correlation coefficient) and in fitting regression lines by least squares. Later in the course, displays of cumulative results from random trials are displayed in the same format to illustrate the Law of Large Numbers and the Central Limit Theorem. Throughout all of these displays, colors, dot sizes, type fonts and sizes, and menu commands for controlling and printing displays are consistent.

Internal consistency is so important that we have chosen to deviate from tradition in favor of consistency. Throughout ActivStats the variable to be analyzed, summarized, displayed, and understood is denoted y. This choice is consistent with the use of a single vertical axis to display a single variable in a dotplot or boxplot. More important, it is consistent with denoting the dependent variable in a regression y. Thus, throughout the course students are describing, displaying, and modeling a variable denoted y, whether as a single variable, as a variable measured in several groups, or as the dependent variable in a regression. The tradition of beginning with a variable denoted x for discussing means and standard deviations, then changing to y for discussing correlation and regression, introduces unnecessary confusion and inconsistency.

5 Functionality principles

Computer-based materials behave. That is, they cannot be considered static objects or text to be read.
Far more than a book or set of workgroup activities, computer-based tools create a learning environment in which students must live while they are trying to absorb the course material. The design of such environments requires attention to how they function. Invented worlds must still be consistent wherever possible and must follow a logic of behavior that helps students to feel acclimated and comfortable.

Functionality principles are more difficult to elucidate. They are probably better understood by example and, of course, best understood with examples that themselves exhibit the behavior. In a paper-based article, this last option is not available, but the illustrations provided attempt to give a sense of the functions described here.

Use multiple channels for exposition and explanation

The exposition and explanation of a new concept is best done with a combination of narration and display. Text, especially text on a computer screen, should be avoided; few students can read such material as an initial introduction. (Text can be quite valuable, but it is more likely to help students review and organize after they have already absorbed the concepts.) Recorded narration carries a human touch, but the narrator must be someone who understands both the statistics and how to teach these concepts. Much can be conveyed with appropriate nuances of phrasing and emphasis.

Narration alone is not sufficient. Synchronized with the narration, students should see pictures, displays, equations, words, and animations on the computer screen. The screen should not merely write what the narrator says. Rather, it should illustrate what the narrator is trying to convey. Thus, for example, as the narrator discusses the formula for a t-based confidence interval for the mean of a particular dataset, the values of y-bar, s, n, and the value from the t-table can each be seen to slide from their places in tables of results to their appropriate places in the formula. Our goal has been to convey the impression of a kindly tutor standing at the student's shoulder, pointing out interesting things on the screen while explaining the main ideas.

A variety of methods can be used to draw the student's attention to a part of the screen. The most successful methods involve some degree of animation and movement on the screen. Thus, for example, one can simply draw an arrow to point to something of interest. Alternatively, one can draw a circle or oval around something of interest. When the student's attention has been focused far from the point desired, we have had great success with an "iris" that begins as a translucent circle of color covering most of the screen and then shrinks to a smaller circle or ellipse around the item of interest.

Active learning with interactive animation solidifies understanding

Active learning calls for students to interact directly with tools that illustrate or depict concepts. Research has shown that the experience of working with newly learned concepts helps to consolidate the learning. Computers allow us to design interactive tools to facilitate such interaction.
However, this is one of the greatest challenges in developing computer-based learning tools. For each concept, the designer and author (who may be the same individual) must meet the challenge of teaching that particular concept with the best use of the available technology. Designers should maintain the focus on pedagogical effectiveness, even at the expense of technical wizardry.

For example, one of the more effective uses of computer technology for teaching statistics is in providing students experience with random behavior. The definition of a random phenomenon, as one whose short-term behavior is not predictable but that nevertheless demonstrates long-run regularity, is not natural to many students at first. We thus need an animated tool with random outcomes for students to work with. In designing such a tool, we determined that it must meet several criteria:
- It should not attempt to imitate a real-world object such as a coin, die, or roulette wheel. Such animations are fake, and appear to be fake to students.
- It should not have a numeric outcome. Numbers appear in many places in a statistics course; they should not appear where they are not needed. We settled on a color (red or blue) as the outcome, but other non-numeric values (shapes, sizes, names) could be equally effective.
- It should not rely on pseudorandom numbers. Students are sophisticated. They know that computers are deterministic. One of the greatest challenges was to devise a way to generate truly random values.
Fig. 2. A tool that generates random outcomes and satisfies the five stated criteria.
- The outcome probabilities should not, in general, be 50/50. Students (and, historically, mathematicians) first think that "random" means "equally likely." We wanted to knock down that conception immediately.
- Outcomes should be generated visibly, with a mechanism that is realistic. We coined the term object verity to refer to the need for the objects on the screen to follow reasonable laws of physics, or at least generally accepted laws of "cartoon physics."
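Why a fast cursor over thin stripes puts the outcome beyond deliberate control can be seen in a toy model (all parameters below are hypothetical, chosen only for illustration): at plausible cursor speeds the color under the cursor alternates every fraction of a millisecond, far below human timing precision.

```python
def stripe_color(t_release, speed=2000.0, stripe_width=1.0, bar_width=100.0):
    """Color under a wrapping cursor released at time t_release (seconds).

    The cursor sweeps `speed` units per second across a `bar_width`-unit bar
    of alternating red/blue stripes, each `stripe_width` units wide."""
    position = (t_release * speed) % bar_width
    return "red" if int(position // stripe_width) % 2 == 0 else "blue"

# At these (hypothetical) settings the color flips every 0.5 ms, so release
# times a human cannot distinguish give effectively random outcomes.
```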
Figure 2 shows a design that meets these criteria. Students initially are presented with a horizontal bar divided into a blue and a red region. A cursor tracks from left to right and, upon reaching the right margin, immediately begins at the left edge again. The cursor is under the student's control, moving as long as the mouse button is depressed and stopping when it is released. When the cursor stops, the dot to the side of the bar shows the color of the region under the cursor, and a copy of the dot is seen to fall into a bin below. The bins collect the outcomes: red dots in one bin, blue in the other. At first, students are able to control where the cursor stops. But by accelerating the cursor and dividing the bar into many thin stripes of alternating color, it is easy to present a task beyond the ability of even the best video-game expert to control, and thus a process whose outcome is random in the same sense that a coin flip is random: it is beyond the intentional control of the student.

The tool also offers a graph of the fraction of outcomes of a given color. (A click on the graph changes the color whose result is plotted and, of course, the corresponding color of the line used in the display.) Students readily agree that the fraction of red outcomes will converge on the true proportion of the bar that is red. Students understand that the tool is generating random outcomes, that each of them will have a different experience with the tool, and that the long-run regularity experienced by each student in the class individually is an important phenomenon.

Fig. 3. The tool configured to generate 25 i.i.d. random outcomes. When the mouse is clicked, each cursor starts at a random location and with a random speed. Cursors stop on mouse up, so the outcomes are truly random.

Unity of functionality unifies concepts

When concepts are related, the visualization of those concepts with animation and interaction should be similarly related. A unified presentation encourages students to see the underlying conceptual consistencies. Thus, for example, the tool that generates random outcomes can be generalized in several ways. First, we introduce multiple, identical bars (Figure 3). Each of them has a cursor that starts in a randomly assigned place and moves at a randomly assigned speed. The cursors still move when the mouse is depressed and stop when it is released, but now a single mouse release generates as many as 25 independent, identically distributed outcomes. Of course, object verity demands that each of the resulting color dots can be seen to drop into its appropriate results bin.

Random trials generated in this way are far more effective pedagogically than pseudorandom values generated out of the student's sight by a computer program. Experience over many years with a diverse range of students has shown that few of them understand random values generated automatically by computers. The lessons that we thought we were teaching with such random quantities were lost on many of our students. A visible random mechanism is more convincing and encourages students to think about the underlying concepts rather than the number-generation details.

Once generalized to generate many i.i.d. outcomes with a single click, the tool can be used to teach more advanced ideas. For example, students are presented with bars in which the colors are hidden (Figure 4). Clicking the
mouse to generate a trial shows the cursors moving, and the color dots then take on the resulting colors and fall into their appropriate results bins. Students are given a claim that the bars are half red and half blue and asked whether they believe the claim. They collect data until it becomes clear that the claim is false. (The true value is generated randomly, so each student has a different experience, but the value is kept away from 50%.) With enough data, students are willing to reject the claim on the grounds that the observations differ too much from what they would expect if 50% were the true fraction of red.

Fig. 4. Modifying the random tool to teach the reasoning of hypothesis testing. Students are asked to assess the claim that half of each bar is red. Here, with 175 trials run, the evidence has mounted against that hypothesis, but it is up to the student to decide when the evidence is sufficient to reject it. In the process, students find a personal P-value.

In the process, the students have, on their own, reasoned through a classical hypothesis test. They can even record the number of trials they ran before rejecting the claim. That will allow them, later in the course, to calculate the probability under the null hypothesis of the observed outcome that finally led them to reject that null hypothesis. This probability is their personal P-value: the probability of an event that they would personally declare to be rare, and therefore grounds for rejection. Experience has shown that many students find an event expected less than about 5% of the time to be sufficient evidence to reject the null hypothesis.

In another generalization, the endpoints of the bar are labeled 0 and 1, and the stopping place of the cursor generates a random value between 0 and 1. With numeric outcomes we can compute means, and students quickly discover the Law of Large Numbers for themselves. The bars can then be seen as Uniform densities, with the cursor passing over equal areas in equal amounts of time. By generalizing that principle, we can replace the uniform densities with Normal, skewed, and bimodal density shapes. The resulting tool (Figure 5) can draw i.i.d. random samples of sizes between 1 and 36 from any of four densities, all under student control.
The values in each sample are combined in a statistic (the mean, to start, but others are available), and the resulting statistic values are gathered in a histogram. Students can then discover the Central Limit Theorem and, by varying the size of the samples (the number of bars generating outcomes), even find the relationship between sample size and the standard deviation of the sampling distribution.
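Returning to the hypothesis-testing exercise above: the personal P-value students arrive at is, in effect, an exact binomial tail probability. A sketch follows; the observed count of 70 reds is a hypothetical example (only the 175-trials figure comes from the exercise as described).

```python
from math import comb

def binomial_p_value(successes, trials, p0=0.5):
    """Two-sided exact binomial P-value for H0: P(red) = p0, i.e. the
    probability of a count at least as far from trials * p0 as the one
    observed."""
    deviation = abs(successes - trials * p0)
    return sum(
        comb(trials, k) * p0**k * (1 - p0) ** (trials - k)
        for k in range(trials + 1)
        if abs(k - trials * p0) >= deviation
    )

# A student who saw, say, 70 reds in 175 trials before rejecting the claim
# would find a P-value well under the 5% threshold most students adopt.
pv = binomial_p_value(70, 175)
```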
Fig. 5. Generalizing the random tool to draw i.i.d. samples from a normal density and accumulate the means of the samples gives students a tool for learning about sampling distributions and the Central Limit Theorem.
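The experiment in Figure 5 is easy to approximate in ordinary code. The sketch below is generic Python, not the ActivStats implementation, with an exponential population standing in for the tool's skewed density: it accumulates sample means and shows their spread shrinking roughly as one over the square root of the sample size.

```python
import random
from statistics import mean, stdev

def sample_means(sample_size, n_samples=2000, seed=2):
    """Draw repeated i.i.d. samples from a skewed population (Exponential
    with mean 1) and collect the mean of each sample."""
    rng = random.Random(seed)
    return [
        mean(rng.expovariate(1.0) for _ in range(sample_size))
        for _ in range(n_samples)
    ]

# The histogram of means narrows as sample size grows: sd is about 1/sqrt(n).
sd_small = stdev(sample_means(4))    # roughly 1/2
sd_large = stdev(sample_means(36))   # roughly 1/6
```

Histogramming `sample_means(36)` shows the near-normal shape the Central Limit Theorem predicts, even though the population is strongly skewed.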
Other generalizations build on these same skills and insights. The bars can, for example, generate outcomes in up to four colors, and can also be shaded or not. A randomly generated outcome then consists of both a color and a shading state. If the shaded part of the bar has the same color distribution as the unshaded part, then the probability of, for example, a red outcome is independent of the shading outcome. If the shaded and unshaded regions are colored differently or in different proportions, then color is not independent of shading. The important point is that by providing consistent functionality and design, the underlying unity of the concepts is visualized in a unity of representation and function.

Links to the world make the course real

A course based on sophisticated animations can slip into a world of its own, divorced from practical applications. Statistics is a discipline with firm roots in practical applications. Technology offers several ways to tie what students learn to real-world applications and motivations.
Digitized video takes students out into the world. Video can condense a story that takes place over a period of time and in several locations into one that can be told effectively in two minutes. For example, a video in ActivStats (edited from a story in the Against All Odds telecourse) shows attempts in Florida to protect manatees from injury due to motorboats. The story covers many days and shows scenes underwater, in a laboratory, and in boats, but it is told in less than two minutes.

The Internet links students to real-world sites and provides current news. Students can follow up on the manatee story by visiting the Sea World Manatee Project site to learn more about efforts to protect manatees. Links to sites that post survey results regularly provide current data appropriate for discussing survey methods and for teaching conditional probability, independence, and inference about proportions.

Statistics students should learn to use a statistics package. When a technology-based course can assume that students are sitting at a computer, it becomes natural to provide many datasets and exercises in which students analyze the data and write up their conclusions. Where possible, the data should relate to other parts of the course. Thus, for example, after students view the video about protecting manatees, they are given data on manatee deaths from motorboat accidents and on the numbers of motorboats registered, and are led through a regression analysis. A later project introduces data on the size of the manatee population and asks students to think harder about the issues: might the problem be that the protection program has been successful, so that there are now more manatees to be killed?

6 Quality and professionalism

Authors of books and articles work with professional editors. Editors help to achieve a consistency and economy of style, and work with authors to eliminate unresolved references, anachronisms, inconsistent terminology or notation, and similar loose ends.
Far too much of the material published electronically has never been edited. Unfortunately, this oversight is painfully obvious even to a casual reader. Students deserve professional-quality materials. They should not encounter terms that have not been defined or have to figure out notation that changes during the course.

Materials should be technically professional as well. Narrations should be recorded in a studio with adequate rehearsal. Soundtracks for videos require appropriate music and sound effects ("Foley" in the jargon of the trade). Even the design of icons and buttons on the screen should be professional and consistent. Professionalism in technology-based materials includes professional user-interface design and professional programming. Neither of these aspects of technology-based materials can ordinarily be accomplished by amateurs, no matter how great their skills as teachers or how deep their understanding of statistics.

Finally, professionalism requires that the author of materials that teach statistics be trained as a statistician. It should go without saying that materials to teach statistics should be correct. Unfortunately, that is not true of many contributions posted on individual web sites. But pedagogical concerns run
deeper. Many professionals and academicians apply statistics as part of their own professions. Some teach statistics. But statistics is a subtle and profound discipline. Non-statisticians rarely address the material with the depth of understanding required for designing and writing technology-based presentations of individual concepts. Indeed, statistics is a discipline that is particularly easy to teach badly. Consumers (in this case, teachers) should check an author's credentials in statistics before relying on technology-based materials. This is especially true because, unlike a textbook, web-based materials are unlikely to have benefited from a review process.
7 Practical details

Materials of the kind described here use all of the capabilities of modern computers and strain the current capabilities of the internet. They thus require careful attention to technical compromise.

Sound: Full CD-quality sound is recorded at 44 kHz in stereo. We have found that for narrations, 22 kHz monophonic sound is not noticeably worse, but that any further degradation in sound quality is quite noticeable. Students may listen to many hours of narration during the course; sound quality matters.

Video: Full-screen, 24-bit color, 30 frame-per-second video can be played on modern computers, but requires files that are far too large to fit comfortably on a CD-ROM and cannot be transmitted over the internet in any large amounts. We work with quarter-screen pictures at roughly 15 fps. Even that rate can at present be delivered only by CD-ROM or internal high-speed local area networks.

Animation: For animations to achieve object verity, they must be smooth and must respond to user actions immediately. To guarantee such performance, we have programmed in C++ rather than in Java.

The practical result of these choices is that we deliver ActivStats on a CD-ROM. One happy aspect of current technology is that the sound and video files, which fill most of the disk, are suitable for all platforms we support, so only a small (about 10 MB) portion of the disk need be platform-specific. We can thus provide a hybrid disk that runs on Macintosh and Windows 95/98/NT systems.

Compatibility: We believe that students benefit from having a textbook as well as the electronic materials. We have designed ActivStats to work with virtually any textbook, providing ways to re-order the lessons to match the presentation order of most common texts.

Distance learning: The principles, methods, and materials discussed here are particularly appropriate for distance education. Several schools have offered distance courses using ActivStats, and a number of others are in planning stages.
Distance learning must pay close attention to the allocation of materials to media. Web sites, discussion boards, and email all provide communication among students and between students and teachers. When each student has a CD-ROM with the core course material, they can study off-line and yet link easily to the course web site. When homework is designed as a computer-based experience, it is natural to submit it electronically.
8 Experience and practice

ActivStats has been used in teaching introductory statistics classes at undergraduate, graduate, and high-school levels for four years. It has been used in small seminars and in large (300-student) lecture courses. It has been used in community colleges and Ivy League universities. Unfortunately, we have no systematic data on performance and success. We do, however, have a wealth of anecdotal information.

Many teachers working independently have found that a successful way to work with such material is to assign students to work with the computer-based material first and then to teach a lesson on the subject. Students come to class having seen the basic ideas, heard the new terminology, and knowing that the formulas are available. Class can then focus more on applications, on synthesis of the new ideas with others in the course, on discussion and question answering, and on helping students to integrate the new concepts and methods into their growing understanding of the subject. Students then are assigned homework exercises for each lesson. Most of these exercises require additional computer work, often with data that has been provided along with the other materials.

Across a wide range of situations, students working in this way report a median of two hours of computer work (preparation and homework) for each hour of class meeting. Some report spending as little as half that much time; others as much as twice that much.

In courses for which final examinations are standardized across many years, we have seen an improvement in performance. The greatest improvement has been in students who previously would have been below the middle of the class in grade rank. Briefly, the best students perform well in spite of the teaching methods. Conversely, a student who is determined to fail by skipping class and missing homework can still fail.
However, students who need to work hard to do well in statistics find that computer-based materials support their efforts more effectively. Such students have regularly been able to improve their performance throughout the course.
9 All the running you can do . . .

The Red Queen declares that ``here, you see, it takes all the running you can do, to keep in the same place.'' Those who maintain technology-based teaching materials think she must have had that task in mind. Each operating system release, and each improvement in video compression and digital sound processing, is likely to require some changes to the materials. New hardware can also present new problems. We have found it necessary to schedule annual releases of ActivStats simply to ensure compatibility and reliability. Of course, such a schedule also permits continual revision and quality improvement. The Continuous Quality Improvement attitude of seeking suggestions and criticisms is vital for such a project. Only by actively seeking out obscure bugs and inconsistencies can we improve over time. For that purpose we maintain a web site that invites comments, criticism, and suggestions from teachers and students. By running ``twice as fast,'' as the Red Queen advises, we have been able
to add new lessons, new video, and new web references, and to expand continually the archive of homework exercises and datasets. We hope to be able to continue the effort.

10 Conclusion

The world of technology-based education is changing rapidly. New technology, both hardware and software, becomes available at a dizzying rate. In the face of rapid change, we must focus on developing high-quality content that can be presented to students with whatever technology is currently most appropriate. The principles enunciated here apply to content independent of the technical details. They are presented in the hope that they will help others to develop high-quality materials. A greater selection of quality materials will benefit both students and teachers and will prompt still further improvement. However, poorly designed materials, materials with statistical errors, and materials limited by current technology hurt progress in this area. We can look forward to exciting advances and improvements in computer-based teaching, but we may not be able to predict exactly what form these may take. Regardless of the details, attention to quality, design, and functionality will improve the value and usefulness of new materials.

References

Ambron S, Hooper K (1990) Learning with Interactive Multimedia: Developing and Using Multimedia Tools in Education. Microsoft Press, Redmond, Washington
Biehler R (1993) Software tools and mathematics education: The case of statistics. In: Dörfler W, Keitel C, Ruthven K (eds) Learning from Computers: Mathematics Education and Technology. Springer, Berlin
Cobb G (1992) Teaching statistics. In: Steen LA (ed) Heeding the Call for Change: Suggestions for Curricular Action. Mathematical Association of America, Washington, DC
Grabinger RS, Wilson BW, Jonassen DH (1990) Building Expert Systems in Training and Education. Praeger, New York
Haykin R (ed) (1993) Demystifying Multimedia. Apple Computer, Inc., Cupertino, California
Jensen RE, Sandlin PK (1995) Electronic Teaching and Learning: Trends in Adapting to Hypertext, Hypermedia, and Networks in Higher Education. Available by anonymous FTP from pacioli.loyola.edu, by binary file transfer from the ``pub'' directory
Velleman PF, Moore DS (1996) Multimedia for teaching statistics: Promises and pitfalls. The American Statistician 50:217-225
Velleman PF (1997) ActivStats. Addison Wesley Longman, Reading, MA