Master of Science Program in Computer Engineering
- Mission Statement
The mission of the Master of Science Program in Computer Engineering, in adherence to the core themes of the mission of Weber State University, is to provide students a high-quality graduate-level education in Computer Engineering. This education, which emphasizes advanced engineering principles coupled with hands-on experience, enables students to make significant contributions to society as professional engineers. The program stresses design and problem solving using math, science and advanced computer engineering principles.
- Student Learning Outcomes
- Graduate Certificates (Not Applicable)
- Associate Degrees (Not Applicable)
- Graduate Degree
At the end of their study at WSU, Master of Science in Computer Engineering students will:
- Demonstrate the ability to apply knowledge of math, science and engineering.
- Demonstrate the ability to design a system, component or process.
- Demonstrate the ability to identify, formulate and solve engineering problems.
- Demonstrate the ability to apply master’s level knowledge to the specialized area of computer engineering.
- Curriculum Grid
- Program and Contact Information
The WSU Master of Science in Computer Engineering (MSCE) program provides an avenue for students in Computer Science and Engineering to pursue a graduate degree in a high-demand and growing discipline. Further, it offers professionals in the local work force an opportunity to earn an advanced engineering degree, bolster innovation in the community, and thereby promote economic growth. Finally, for those students who are interested, this program provides the necessary preparation for doctoral programs at other institutions of higher learning.
Contact Information:
Dr. Fon Brown
Weber State University
1703 University Circle
Ogden, UT 84408-1703
(801) 626-7781
- Assessment Plan
The assessment plan is executed using two types of instruments, which are described below:
- The project defense assessment instrument works as follows: Faculty attending a final design review answer four questions corresponding to the four learning outcomes listed in Section C.
Responses from these questions fall into a four-point asymmetrical Likert scale:
4 = strongly agree
3 = agree
2 = mixed, and
1 = disagree.
- The student’s committee chair calculates the mean response for each question. These responses are recorded in the Project Defense Assessment Report, which the chair submits to the program director. The director computes a graduating cohort average for each of the four questions and enters those averages into the continuous improvement record. If the mean value for any question falls below 2.67, the program faculty must initiate action to address the unsatisfactory learning outcome result(s). Conversely, if all mean values are at or above 2.67, no action is initiated by the faculty. (An illustrative calculation of this trigger point appears at the end of this section.)
- Each course in the MSCE curriculum grid marked “H” has an associated assessment rubric that measures students’ performance with respect to the 4 student learning outcomes listed in Section C. Through the continuous use of these rubrics, assessment at both the course and program level is an ongoing process that provides a measurable means of program improvement.
- The course assessment rubric works as follows. At the end of each semester, the instructor scores each performance indicator (PI) for the course. A four-point scale is used. The rubrics are designed with a “trigger point.” If the score of a PI is 1 (unsatisfactory) or 2 (developing), the instructor initiates action to make course-level changes with respect to the applicable PI for the course. If the score of a PI is 3 (satisfactory) or 4 (exemplary), no action is taken by the instructor. Then, the mean PI score for each course and section* is transferred to a program-level “continuous course improvement” record, a document that summarizes the mean PI scores. This spreadsheet uses a trigger point of 2.67, and if a mean PI score falls below that point, the program faculty must make significant changes to the course or the program to remedy the problem. Thus, depending on the trigger points activated, both the instructor and the program faculty have input to the continuous improvement process.
* ECE 6010 assessment data are recorded in the continuous course improvement record only for the semester in which the student defends.
Project Defense Assessment:
- The project defense assessment is a direct assessment instrument that is completed by all faculty attending the final design review (defense) of a student’s project. This instrument assesses the student’s mastery of the program-level learning outcomes listed in Section C.
Course assessment rubrics:
- The course assessment rubric is a direct assessment instrument that articulates the expectations for student performance. Each rubric consists of dimensions (performance indicators), a scale (levels of performance) of 1, 2, 3, or 4, and descriptors (descriptions of the levels of performance).
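For concreteness, the following sketch illustrates the arithmetic behind the 2.67 trigger point used in the project defense assessment. It is a minimal illustration only: the student data, variable names, and output are hypothetical; only the four-point scale and the 2.67 trigger come from the plan described above.

```python
from statistics import mean

TRIGGER = 2.67  # program-level trigger point from the assessment plan

# Hypothetical example: each inner list holds one student's mean responses
# (as computed by the committee chair) for the four learning outcomes in Section C.
student_reports = [
    [3.50, 3.00, 3.25, 2.75],
    [4.00, 3.50, 3.00, 2.50],
    [3.00, 2.50, 3.50, 2.25],
]

# The program director's step: compute the graduating-cohort average for each
# question, then flag any outcome whose average falls below the trigger point.
cohort_averages = [mean(scores) for scores in zip(*student_reports)]
flagged_outcomes = [i + 1 for i, avg in enumerate(cohort_averages) if avg < TRIGGER]

print("Cohort averages:", [round(a, 2) for a in cohort_averages])
print("Outcomes requiring faculty action:", flagged_outcomes or "none")
```

The same comparison against 2.67 applies to the mean PI scores in the continuous course improvement record; only the inputs differ (course-level PI scores rather than defense responses).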
- Assessment Report Submissions
- 2021-2022
- 2019-2020
1) First year student success is critical to WSU’s retention and graduation efforts. We are interested in finding out how departments support their first-year students. Do you have mechanisms and processes in place to identify, meet with, and support first-year students? Please provide a brief narrative focusing on your program’s support of new students:
- Any first-year students taking courses in your program(s): All new students meet with the program director for an orientation.
- Students declared in your program(s), whether or not they are taking courses in your program(s): Students who are not registered receive a phone call or email to check on their status (first absent semester only).
2) A key component of sound assessment practice is the process of ‘closing the loop’ – that is, following up on changes implemented as a response to your assessment findings, to determine the impact of those changes/innovations. It is also an aspect of assessment on which we need to improve, as suggested in our NWCCU mid-cycle report. Please describe the processes your program has in place to ‘close the loop’.
- If any of the assessment triggers are activated, the faculty or a subcommittee meets to address the trigger. For example, in Fall 2017 the defense reports triggered a faculty response. The faculty met on 4/10/18 (minutes available on request) to address the problem. The decision was made to enforce a policy that limited the number of graduate students any one faculty member could mentor to one (unless all the faculty had a student). That policy seems to have been successful, inasmuch as defense assessments have steadily improved since then.
The full report is available for viewing.
- 2017
1) Based on your program’s assessment findings, what subsequent action will your program take?
Based on these findings, and on input from industry partners and the engineering industrial advisory board, the MSCE program has made or will make the following adjustments:
- The MSCE curriculum has been altered to include an advanced computer security course. The impetus for this change was a HAFB report indicating a need for graduates skilled in this area.
- The artificial intelligence course has been folded into the machine learning course to make room for the aforementioned computer security course.
- This assessment suggests that we need to raise the bar in terms of assessment criteria. To that end, next year the performance indicator scores will be based on the top 80% of students instead of the top 65%.
- Some courses in this assessment scored surprisingly high. An upper threshold will now be instituted to trigger a departmental review if the mean score of a course exceeds 3.5. This will allow the faculty to revisit a course and its outcomes to determine whether they are challenging enough. (An illustrative calculation of these revised criteria appears after this list.)
- Based on discussions with the engineering advisory board, a new course in digital testing is being developed. It will take the place of the advanced VLSI course (which has less industry support).
- Faculty will make changes to their individual courses as noted in Section F Subsection A.
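The sketch below shows one plausible reading of the two revised criteria above: a performance indicator score based on the top 80% of students rather than the top 65%, and a course flagged for departmental review when its mean score falls below the existing 2.67 trigger or above the new 3.5 upper threshold. The function names and sample scores are hypothetical and illustrative only.

```python
LOWER_TRIGGER = 2.67  # existing program-level trigger point
UPPER_TRIGGER = 3.5   # new upper threshold prompting a departmental review

def pi_score(student_scores, top_fraction=0.80):
    """Mean performance-indicator score over the top `top_fraction` of students."""
    ranked = sorted(student_scores, reverse=True)
    k = max(1, round(len(ranked) * top_fraction))
    return sum(ranked[:k]) / k

def review_needed(mean_score):
    """True if the course mean activates either the lower or the upper trigger."""
    return mean_score < LOWER_TRIGGER or mean_score > UPPER_TRIGGER

# Hypothetical scores for one performance indicator in one course section.
scores = [4, 4, 3, 3, 3, 2, 2, 1]
old_basis = pi_score(scores, 0.65)   # previous criterion: top 65% of students
new_basis = pi_score(scores, 0.80)   # revised criterion: top 80% of students
print(round(old_basis, 2), round(new_basis, 2), review_needed(new_basis))
```

Because the broader 80% basis pulls in lower scores, it tends to lower the computed PI score slightly, which is consistent with the stated goal of raising the bar for a satisfactory result.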
2) We are interested in better understanding how departments/programs assess their graduating seniors. Please provide a short narrative describing the practices/curriculum in place for your department/program. Please include both direct and indirect measures employed.
A project defense assessment is completed by faculty attending the final design review (defense) of a student’s project. This instrument assesses the student’s mastery of all program-level learning outcomes. The resulting data are then compiled for each graduating class as described in Section E. (The defense typically occurs in a student’s final semester, so this assessment reasonably gauges the program’s effectiveness for the graduating class.)
No indirect measures of assessment are employed.
The full report is available for viewing.
- 2016
1) Based on your program’s assessment findings, what subsequent action will your program take?
- 2016-2017 is the first full year of operation for this program, and no program or course assessment data exist on which to base action.
2) We are interested in better understanding how departments/programs assess their graduating seniors. Please provide a short narrative describing the practices/curriculum in place for your department/program. Please include both direct and indirect measures employed.
- A project defense assessment is completed by faculty attending the final design review (defense) of a student’s project. This instrument assesses the student’s mastery of all program-level learning outcomes. The resulting data are then compiled for each graduating class as described in Section E. (The defense typically occurs in a student’s final semester, so this assessment reasonably gauges the program’s effectiveness for the graduating class.)
No indirect measures of assessment are employed.
The full report is available for viewing.
- 2015
No report submitted.
- 2014
No report submitted.
- 2013
No report submitted.
- Program Review