
Ask, and ye can Assess
By Vern R. Johnson

Questionnaires can be very helpful in assessing an engineering program's effectiveness. Here's how to develop and use one of your own.

Engineering educators have always thought it important to know how well their academic programs are doing. The Accreditation Board for Engineering and Technology's recent adoption of Engineering Criteria 2000 places an even greater premium on such information by requiring accurate, reliable data on all aspects of program performance.

Educators can generate this data using many different assessment tools. This article focuses on one such instrument: the questionnaire.

Why Questionnaires?

Questionnaires are survey instruments designed to collect the opinions of a specific audience. They can assess satisfaction with a wide range of topics and provide a general idea of what respondents sense as important. This flexible nature enables engineering educators to adapt questionnaires to suit a variety of assessment purposes.

A well-designed questionnaire can provide educators with valuable information on specific modifications they need to make to improve program effectiveness. For example, say the results of a spring survey of undergraduates show that freshmen want more "hands-on" application of engineering principles. When updating the first-year introduction-to-engineering courses, faculty members could then incorporate more of these activities.

Educators can also use questionnaires to help them refine their set of program outcomes (i.e., the attributes they want graduates to have). For example, one new outcome related to the addition of more hands-on experiences to the undergraduate curriculum could be the stipulation that graduates have a basic understanding of real-world engineering.

Developing a Questionnaire
There are 10 basic steps to developing and using an effective questionnaire.

  1. Select a process to assess.
  2. Identify the audience to be surveyed.
  3. Determine how the data will be used.
  4. Decide which specific attributes of the process will be assessed.
  5. Choose a scoring system.
  6. Construct the questionnaire.
  7. Conduct the survey and tabulate the data for analysis.
  8. Analyze and interpret the data.
  9. Use the interpreted data to muster stakeholder support.
  10. Implement appropriate improvement projects and monitor their progress.

Using a hypothetical engineering department, let's walk through an extended example of how engineering educators can develop a highly targeted, program-specific questionnaire using this 10-step procedure.

Steps 1-3: Establish Basic Parameters
A mechanical engineering department wants to assess its senior-year capstone design course (Step 1). The department decides to survey students who have already completed the course (Step 2) in the hope of gathering data it could use to improve the capstone experience (Step 3).

Step 4: Decide Which Attributes to Assess
Next, the team designing the questionnaire selects the course attributes that the survey will address (Step 4). The mechanics involved in completing this fourth step require a little additional explanation. In selecting the attributes the questionnaire will focus on, the design team has to weigh several factors. First, it must recognize that the department may have designed the capstone course to fulfill several of the 11 accreditation outcomes required under EC 2000. These outcomes stipulate that all engineering graduates have:

  • an ability to apply knowledge of mathematics, science, and engineering;
  • an ability to design and conduct experiments as well as to analyze and interpret data;
  • an ability to design a system, component, or process to meet desired needs;
  • an ability to function on multidisciplinary teams;
  • an ability to identify, formulate, and solve engineering problems;
  • an understanding of professional and ethical responsibility;
  • an ability to communicate effectively;
  • the broad education necessary to understand the impact of engineering solutions in a global and societal context;
  • a recognition of the need for and an ability to engage in lifelong learning;
  • a knowledge of contemporary issues; and
  • an ability to use the techniques, skills, and modern engineering tools necessary for engineering practice.

To help isolate which of these outcomes the capstone design course supports, the team recasts each of them as a shorter statement in verb-noun format. For example, the first accreditation outcome might become: apply knowledge of mathematics, science, and engineering.

The team then examines the abbreviated list and removes the outcomes the capstone design course does not meet. It then arranges the remaining outcomes that the course does meet, such as analyzing and interpreting data, in an order that is easy for the intended audience to understand. The list might look something like this:

  1. Function on a multidisciplinary team.
  2. Identify a customer or industrial need.
  3. Formulate descriptions of problems that when solved satisfy a need.
  4. Apply mathematics, science, and engineering to solve the problems.
  5. Design a component to meet desired needs.
  6. Analyze and interpret data.
  7. Communicate using oral progress reports.
  8. Communicate using written final reports.

In all likelihood, when the mechanical engineering department developed the capstone design course's objectives, it also incorporated feedback from other stakeholders, such as alumni and prospective employers, as well as from EC 2000 requirements. The stakeholders probably enumerated skills they believed students ought to acquire from such a class. Those desired skills might include:

  • the ability to use textual material to support project design;
  • the ability to obtain materials for project construction; and
  • the ability to pilot-test a component prior to full implementation.

The final list of attributes to be assessed in the questionnaire includes these three skills as well as the eight distinct course objectives identified by the team. It reads as follows:

  1. Function on a multidisciplinary team.
  2. Identify a customer or industrial need.
  3. Formulate descriptions of problems that when solved satisfy a need.
  4. Apply mathematics, science, and engineering to solve the problems.
  5. Design a component to meet desired needs.
  6. Use text materials to support project design.
  7. Obtain materials for project construction.
  8. Pilot-test a component prior to full implementation.
  9. Analyze and interpret data.
  10. Communicate using oral progress reports.
  11. Communicate using a final written report.

Ideally, questionnaires should focus on no more than 10 to 15 attributes. Respondents can usually evaluate a list of this size in a few minutes without losing interest.

Step 5: Choose a Scoring System
After identifying attributes to be assessed, the team chooses a scoring system for the questionnaire (Step 5). Scoring should be simple and clear, but it must also effectively measure respondents' satisfaction with the course's treatment of each attribute and provide an estimate of how respondents view each attribute's relative importance.

A scoring system can assess satisfaction and importance directly (see Undergraduate Spring Survey on page 27). Or, educators can have respondents assess satisfaction and then identify the three to six most important attributes (see Senior Capstone Design Experience Survey on page 27). This second approach works better with both faculty members and students: these groups often rate everything as almost equally important, but they can readily select the three to six attributes that matter most.

A scoring system of 1 to 5 or A to E is suitable for most questionnaires. Always include a score of NA for not applicable.
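
To make the scoring concrete, here is a minimal sketch, in Python, of one way a single completed form might be recorded under the second approach: a 1-to-5 satisfaction score (or NA) for each attribute, plus the respondent's picks for most important. The attribute list comes from the example above; the scores and data layout are purely illustrative, not the department's actual instrument.

    # One way to record a single completed form (illustrative only).
    ATTRIBUTES = [
        "Function on a multidisciplinary team",
        "Identify a customer or industrial need",
        "Formulate descriptions of problems that when solved satisfy a need",
        "Apply mathematics, science, and engineering to solve the problems",
        "Design a component to meet desired needs",
        "Use text materials to support project design",
        "Obtain materials for project construction",
        "Pilot-test a component prior to full implementation",
        "Analyze and interpret data",
        "Communicate using oral progress reports",
        "Communicate using a final written report",
    ]

    # Satisfaction: 1 (low) to 5 (high), or "NA" for not applicable.
    # most_important: 1-based positions of the respondent's three to six picks.
    response = {
        "satisfaction": [4, 5, 3, 4, 4, "NA", 3, 2, 4, 2, 3],
        "most_important": [1, 5, 10, 11],
    }

    assert len(response["satisfaction"]) == len(ATTRIBUTES)
    assert 3 <= len(response["most_important"]) <= 6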

Step 6: Construct the Questionnaire
After selecting an appropriate scoring system, the team constructs the questionnaire form itself (Step 6). The form should feature an opening statement that explains the survey's purpose (to better focus course improvement efforts) and how the department plans to use the results (to identify aspects of the course that need to be improved). The opening statement is followed by instructions on how to complete the questionnaire, the list of 11 attributes, and the scoring system.

Before releasing the questionnaire, the team pilot-tests the form with a group of representative students (perhaps members of a student professional society) to ensure that students will interpret the instructions correctly. Sometimes a discussion with the pilot-test group is helpful as well.

Lastly, team members draft instructions about how respondents should return the questionnaire to the department. Whenever possible, completed questionnaires should be collected at the time and place of distribution. When this is not possible, the form should tell respondents where to submit it; drop boxes in key locations on campus, mail, or fax are all options.

Step 7: Conduct the Survey and Tabulate the Data
A third party distributes the questionnaire to all graduating seniors who have completed the capstone design course (Step 7). To ensure useful results, a sufficiently large, statistically meaningful sample of students must complete the survey.

After collecting the completed questionnaires, team members begin tabulating the data (Step 7). They can use hand calculations or spreadsheets, but both are very limiting. Many prefer to use an electronic database with an analysis program, such as ASSESS 1.0. Engineering educators can download this free program from www.engr.arizona.edu/~acadaff/assess.zip or request the software via e-mail at vjohnson@arizona.edu. Designed specifically for the University of Arizona's engineering school but easily adapted for other schools, this program features a user-friendly interface and can process a large number of survey responses.
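
For departments that opt for spreadsheets or a small script rather than a packaged program, tabulation amounts to counting how many respondents gave each score to each attribute. The short Python sketch below shows the idea with made-up scores for two attributes; it is not part of ASSESS 1.0.

    from collections import Counter

    # Each list holds one attribute's scores across all respondents
    # (hypothetical numbers, two attributes shown for brevity).
    scores_by_attribute = {
        "Analyze and interpret data": [5, 4, 4, "NA", 3, 5],
        "Communicate using oral progress reports": [2, 3, 2, 3, "NA", 2],
    }

    for attribute, scores in scores_by_attribute.items():
        tally = Counter(scores)                      # counts per score, e.g. {4: 2, 5: 2, ...}
        answered = [s for s in scores if s != "NA"]  # drop NA before averaging
        print(attribute, dict(tally), "n =", len(answered))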

Step 8: Analyze and Interpret the Data
The team members can now begin analyzing and interpreting the data (Step 8). First, they calculate the average satisfaction and importance level for each attribute on the questionnaire. (Do not include the NA scores.) They then compute overall averages for satisfaction and importance by including all attributes. By graphing those items according to their importance and satisfaction coordinates, educators can isolate the attributes that have above-average importance and below-average satisfaction. These are the attributes that need the most attention.
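
The quadrant analysis itself is simple arithmetic. Assuming importance was scored directly on the same 1-to-5 scale, the Python sketch below (with hypothetical averages) flags attributes whose importance is above the overall average while satisfaction falls below it.

    # Per-attribute (satisfaction mean, importance mean), NA scores already excluded.
    # The numbers are illustrative, not actual survey results.
    attribute_means = {
        "Design a component to meet desired needs": (4.1, 4.0),
        "Analyze and interpret data": (3.9, 3.7),
        "Communicate using oral progress reports": (2.8, 4.3),
        "Communicate using a final written report": (3.0, 4.2),
    }

    avg_satisfaction = sum(s for s, _ in attribute_means.values()) / len(attribute_means)
    avg_importance = sum(i for _, i in attribute_means.values()) / len(attribute_means)

    # Attributes with above-average importance and below-average satisfaction
    # are the ones that need the most attention.
    needs_attention = [
        name for name, (sat, imp) in attribute_means.items()
        if imp > avg_importance and sat < avg_satisfaction
    ]
    print(needs_attention)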

In the hypothetical case described here, respondents reported overall satisfaction with their experience; however, they rated the two communication attributes above average in importance and below average in satisfaction. This finding reveals that the students believe that their capstone design course did not adequately prepare them to perform well in these important areas.

Step 9: Use the Data to Muster Stakeholder Support
The department can use the graphical analysis of the questionnaire's findings to gain institutional and stakeholder support for undertaking program improvements (Step 9). This type of data presentation is often very effective in winning support for change.

Step 10: Implement and Monitor Improvement Projects
Once the department has attained stakeholder support (financial or otherwise), it starts work to revamp and revitalize the senior capstone design course (Step 10). As part of this effort, faculty members might draft a project statement that describes what they hope to accomplish, such as creating learning units on how to present oral progress reports. The department then assembles a team interested in tackling the course redesign. This team's first actions would be to set a series of milestones for accomplishments and establish a reporting schedule.

Building Your Own Repertoire
Questionnaires are but one of the many tools (including portfolios, P.E. exam results, and alumni achievement data) that engineering educators can include in an effective assessment repertoire. When used in combination, these tools give educators a measure of how well they are doing and provide concrete information about specific steps they can take to improve their program.

Vern R. Johnson is associate dean for academic affairs in the College of Engineering and Mines at the University of Arizona.
