Prism Magazine - February 2003

Teaching Toolbox

The Real Test

By Anna Mulrine

The ABET visits were a huge success at your school, but now that they're over, how do you put continuous improvement into practice?

ABET visits have never been known for inspiring a palpable sense of excitement around campus. In recent years, however, deans of engineering programs with ABET visits following closely on the heels of Engineering Criteria 2000 were downright nervous. "We approached the visit with some apprehension, because it was the first one under the new rules," says Peter Hudleston, associate dean for student affairs at the University of Minnesota–Twin Cities. Andrew Martinez, associate dean for undergraduate studies at Tulane University in New Orleans, concurs. "We were hearing horror stories all around us, the walking wounded," he says. "So we went into it a little unsure about what to expect."

Today, those campuses are breathing a sigh of relief. The general consensus? "It wasn't as traumatic as we thought it might be," says Hudleston. "It's a terrible lot of work, but the payoff is really tremendous," adds Martinez. But now those campuses are facing a new challenge: the implementation, or "continuous quality improvement," phase of ABET.

And after surviving their visits, the universities have some advice for programs preparing for ABET.

Many of the ongoing challenges of ABET are firmly rooted in the flexibility prescribed by EC 2000. "Back in the early 1990s, ABET was criticized for being overly prescriptive in its criteria, and for stifling innovation," says Dan Hodge, ABET's accreditation director. From that critique sprang EC 2000, with its twin aims: to address outcomes rather than input, and to encourage continuous quality improvement.

That ongoing improvement means that schools are evaluating data and overhauling departmental achievement criteria long after the ABET team leaves campus. "It used to be, after an ABET visit, you put the materials in a file and said, ‘I'll open that drawer six years from now,'" says Hodge. "It had no impact on your day-to-day operations." That, of course, has changed. "We couldn't just pull the old plan out, dust it off, and turn it in," says Martinez, only a bit wistfully. With EC 2000 has come more flexibility—but more work as well. "ABET has improved the format tremendously—it's harder on us, but better," Martinez adds. "Before, it was bean counting in the extreme," he says. "Now, it's still bean counting, but we get to say what the beans are."

Defining those beans, and in turn successfully jumpstarting improvements that will continue long after ABET team visits, begins years in advance of the actual ABET evaluation for most campuses. The University of Alabama, for example, kicked off its accreditation preparation with a 1,000-day countdown to ABET. The primary challenge in the beginning, says Kevin Whitaker, the university's associate dean for academic programs, was "getting people excited about it, because it can be a pretty dry process." Later they formed teams. "We had a team for just about everything except breathing," Whitaker adds. These teams, campuses agree, are the first step in a successful ABET visit—and set the stage for the continuous quality improvement that will continue on campus. "It's the preparation that's the key in making the final exam good," Martinez adds.

It is also an opportunity for each department to take a tough look at its objectives. At Tulane, for example, the electrical engineering department decided on new criteria for achievement—flexibility, creativity, and competence—then "built everything around trying to develop these characteristics," says Martinez. It was a step, he adds, that proved particularly valuable in allowing "each department to have its own character." Pre-reviewers they brought in before the ABET visit agreed that "it was good that each department looked different. They weren't just cookie cutter," Martinez explains. "They were completely unique."

The University of Washington staged a mock visit to prepare for its 2001 accreditation visit. "We actually invited a few people with a lot of experience and pretended that everything was real," says Chen-Ching Liu, associate dean for organizational infrastructure at the university. It was a helpful process, he says, that allowed them to hone their presentation and materials for the real ABET team. Campuses found that all of the steps they took to prepare laid the groundwork for future innovation. "ABET now provides a formal and robust structure in which to think about what you are as a college, what your program is all about," Thomas Blake, associate dean in the College of Engineering at the University of Massachusetts–Amherst, says. "And what measurements we're putting in place to assess students—and ourselves."

But as they were in the midst of readying their campuses for ABET, many of the universities made a surprising discovery. They found that it just might be possible to prepare a bit too much—in other words, to gather cartloads of information, but "after the fact, scratch their heads and say, ‘What am I going to do with it?' " explains Hodge. "The danger we found, and this happened in more than one department, is that it's easy to fall into the trap of data collection," Martinez says. "Even after asking lots of questions and polling people, you have to take that next step," he says. Whitaker agrees. "You really need to present complete cycles. You can't just show them a survey. You need to say, ‘This is what we used, this is the data we collected, and this is how we used the data to make a change.' "

 

What Students Think

That's not to say, however, that the data collection didn't help quite a bit with the ABET-prescribed ongoing quality improvement goals. For example, in a student survey conducted in preparation for ABET, the University of Minnesota discovered that students gave consistently low ratings to the advising process. The school is now focusing on how best to strengthen the links between departments and courses that help students choose and design their schedules.

The university also began placing a greater emphasis on international competitiveness and cooperation. As a result, it expanded study-abroad options for its engineering students, who had not been particularly encouraged to pursue those sorts of opportunities in the past. The university offered a three-credit seminar in Beijing on globalization of the software industry, and a course in manufacturing, held in Switzerland, for mechanical engineering students. "Both of them were oversubscribed," says Hudleston. "The fact that ABET emphasizes global awareness helps encourage faculty to offer things like this—and students to take it."

At the University of Massachusetts–Amherst, students in the civil engineering program indicated through ABET-inspired departmental surveys that they were interested in increasing the number of field trips to visit construction sites for specific classes, notes Alan Lutenegger, head of civil and environmental engineering there. The department also redesigned the capstone course, increased the number of team projects, and set aside a room where teams could meet and develop their projects.

Through department meetings at Tulane, the faculty discovered that although the department offered math courses that covered probability, it never actually required a dedicated probability course. "So it was tough—but possible—to sidestep the requirement," says Martinez. They created a required probability class and, in the process, discovered that the change improved the curriculum as well. "It allowed us more time, in an overfull course anyway, to develop more aspects of mathematics," he says.

Many of these improvements continue today. At Tulane, student surveys indicated that the undergraduates were interested in business classes. The university is now considering introducing a management sequence into the curriculum; a senior engineering student is further investigating student interest and logistics for her capstone project. "Very likely, this will happen," says Martinez. "Even if it doesn't, it's an opportunity that arose directly from this accreditation process."

Throughout this continuing improvement implementation, one of the challenges universities face post-ABET is knowing when they have met their improvement goals. "At what point would you say you've achieved the outcome in terms of improvement?" wondered the University of Washington's Liu before the school's ABET visit. "What documentation do you need to say you've achieved outcomes? Can you use a grade, for example? It might be helpful if there's a clearer description from ABET about at what point you've achieved the requirements for outcome assessment," he says.

And, of course, there are the categories where ability proves a bit more elusive to demonstrate. "Some of the EC 2000 criteria are fuzzier," concedes ABET's Hodge. "How to improve teamwork abilities, how to build team skills. And how does one instill a sense of lifelong learning in an individual—I think programs have struggled with that one." Whitaker at University of Alabama certainly has. "That's a tough one, and we're still struggling with how to show it better. How do we know if our students have an appreciation for lifelong learning?"

Tulane has grappled with how to demonstrate creativity. In fact, the school's electrical engineering department is seriously considering removing creativity from its aforementioned list of attributes with which all students should graduate—a list the department generated after ABET preparation meetings. "I mean, how do you count beans about creativity? We weren't entirely sure," says Martinez. "One of the ABET reviewers said, ‘This is fine—we like it and everything—but you might want to rethink the creativity goal.' Because what happens when someone comes in creative and leaves creative?"

The good news, universities say, is that the more they implement changes that arise out of surveys and faculty meetings, the easier closing the loop becomes. At the University of Massachusetts, says Blake, students are now noting that they would prefer to print directly in the computer lab, rather than go to a central facility as they've been doing. "That's the sort of thing we can fix easily." At Tulane, says Martinez, the problems are becoming less and less significant. "The last time we did a survey," he says, "they wanted more tables in the student lounge. We're getting down to the really manageable problems now. It's not ‘We want a new dean,' which is good."

 

Anna Mulrine is a freelance writer based in Washington, D.C. She can be reached at amulrine@asee.org.
