Abstract
In recent years there has been an increasing focus internationally on the quality and impact of research outputs. Several countries, including the United Kingdom and New Zealand, have implemented schemes that base the funding of research on research quality. The Australian government plans to implement a Research Quality Framework (RQF) in the next few years that will have a major impact on the funding of research in Australian universities. A key issue for Australian researchers is how the quality and impact of research are defined and measured in their discipline areas. Although peer review is widely used to assess the quality of research outputs, it is expensive and labour intensive, so other surrogate quality measures are often used. This paper focuses on measuring the quality of research outputs in the information systems discipline. We argue that measures such as citation indexes are inappropriate for information systems and that the publication outlet is a more suitable indicator of quality. We present a ranking list of journals for the information systems discipline and describe the approach we have taken in developing it. We then discuss how the ranking list may be used in defining and measuring the quality of information systems research outputs, the limitations inherent in the approach, and the lessons we have learned in developing the list.