Abstract
The success of the information systems program accreditation process depends on a crucial compliance requirement: assessing program outcomes and using these assessments to close the loop, introducing key adjustments to the program so that outcome levels rise above an acceptable benchmark. Furthermore, institutions are required to maintain these assessment processes throughout the accreditation period and to produce interim reports on their implementation. However, implementing outcomes assessment feedback is a complex and dynamic process that can produce unexpected and surprising results. This study investigates the different forms of the assessment process and examines its dynamics by developing a model of an academic program with outcomes assessment feedback. The model is grounded in genuine experience with information systems program assessment and successful accreditation by ABET. It is built using the system dynamics approach, which is well suited to modeling complex and dynamic socio-technical systems. The resulting system dynamics model and simulation help academic administrators study the consequences of assessment policy decisions in an interactive and dynamic way. Simulation runs revealed an inherent oscillation of program outcomes over the years. Both the model and the discovered oscillation of program outcomes are validated against actual assessment data collected over a five-year period. Attenuating this oscillation demands careful design of the assessment feedback policies, as well as of course delivery, to balance its effects. The simulation results lead to recommended guidelines for achieving effective assessment and raising program quality. This is a novel application of the rich system dynamics approach to the study of outcomes assessment processes.
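To make the oscillation mechanism concrete, the following is a minimal illustrative sketch in Python, not the authors' actual model: it assumes a hypothetical single "program outcomes" stock adjusted toward a benchmark target through an assessment signal perceived with a delay, the classic system dynamics structure in which delayed feedback produces damped oscillation. All parameter names and values (target, perception_delay, feedback_gain) are illustrative assumptions.

# Minimal sketch (hypothetical parameters, not the paper's calibrated model):
# a program outcomes stock is corrected toward a benchmark based on a
# delayed assessment perception; the delay in the loop yields oscillation.

def simulate(years=20, dt=0.25, target=0.80, perception_delay=2.0, feedback_gain=0.5):
    steps = int(years / dt)
    outcomes = 0.60          # initial outcomes level (fraction of full attainment)
    perceived = outcomes     # assessment result, lagging behind actual outcomes
    trajectory = []
    for _ in range(steps):
        # Closing the loop: corrective effort proportional to the perceived gap
        adjustment = feedback_gain * (target - perceived)
        outcomes += adjustment * dt
        # First-order delay: perception drifts toward the true outcomes level
        perceived += (outcomes - perceived) / perception_delay * dt
        trajectory.append(outcomes)
    return trajectory

if __name__ == "__main__":
    yearly = simulate()[3::4]  # sample once per simulated year (dt = 0.25)
    for year, level in enumerate(yearly, start=1):
        print(f"Year {year:2d}: outcomes level = {level:.3f}")

Under these assumed parameters the outcomes overshoot the benchmark and then oscillate with decreasing amplitude; shortening the perception delay or lowering the feedback gain damps the oscillation, which illustrates the abstract's point that assessment feedback policies must be designed carefully to balance their effects.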