
Horses for Courses: Situation Specific Selection Revisited
Helena Kriel, Technikon Pretoria, September 1999


1. Introduction

Section 32(b) of the Constitution of the Republic of South Africa, Act No 108 of 1996, states that every person has the right 'to further education, which the state, through reasonable measures, must make progressively available and accessible'. In addition, the African National Congress's Reconstruction and Development Programme (1994:66) states that higher education systems represent a major resource for national development and contribute to the worldwide advance of knowledge. The central role higher education plays in the social, cultural and economic development of modern societies is emphasised by the White Paper on Higher Education (Department of Education, 1997:7).


2. Participation in Higher Education

In 1993 the overall participation rate in all post-grade 12 educational programmes at public and private education institutions was estimated at 20%. The participation rate of Whites at this time was slightly less than 70%, while that for Indians was about 40%, for Coloureds 13% and for Blacks about 12% (Stumph, 1997:1). This discrepancy in participation rates can partly be explained by the fact that scholastic achievement was traditionally used as the selection criterion for access to South African tertiary institutions. Kotze, Van der Merwe and Nel (1996:39) state that, 'mainly because of unequal educational opportunities, (this) placed many students at a severe disadvantage'. Although research in South Africa and elsewhere has clearly indicated that matriculation results remain the best predictor of success at tertiary level, it was also found that the matriculation results of matriculants from the previous Department of Education and Training and equivalent school systems, particularly at the lower ranges, are an inaccurate reflection of students' academic ability or potential for success at tertiary level. Van Aswegen (1997:14) supports the above-mentioned authors by referring to the current crisis faced by universities and Technikons in finding appropriate criteria for the admission of disadvantaged undergraduate black students.

From the above it is clear that an alternative - fair and reliable - method of managing access has become inevitable.


3. Increasing access by means of alternative/special admission methods

De Jager et al. (1997:2) state that the term 'special admission' is used internationally for processes applied in combination with regular admission procedures in order to accommodate applicants who would not qualify for admission if only the regular or 'traditional' academic criteria were applied. Special admission procedures are usually aimed at students whose matric results are below the standard required for admission to tertiary institutions, but who have the potential to benefit from higher education.

At the Technikon Pretoria, where this study was done, a system was developed whereby a prospective student is evaluated in terms of his or her potential, in order to determine possible success in the chosen course. The Technikon Pretoria Potential Assessment (TPPA), as the process is known, attempts to level the playing field and, by so doing, to give an equal chance of access to all applicants, irrespective of past academic achievement. In this way the Technikon Pretoria has taken the first step in the cycle of increasing black participation in Higher Education (Moody, 1994:1).


4. Situation Specific Selection

In 1985 the HSRC undertook a research project on the selection of tertiary students and recommended course-specific selection. The fields of study at universities, but especially at Technikons, impose distinctive requirements and challenges on students. The ideal would therefore be a course-specific selection procedure, or differential admission requirements for each course of study.

Furthermore, present-day society places its emphasis on specialisation and specialised roles. Cronbach (1990:9) holds that individuals should follow different training paths in order to fulfil these roles. At the Technikon Pretoria the option of psychometric testing was taken, as the measurement of relevant characteristics plays an important part in differentiating people in terms of specific roles. The Potential Index Batteries developed by Erasmus and Minaar were decided upon, as these batteries were developed in South Africa. They furthermore operate on the basis of situation-specific standardisation as well as situation-specific determination of validity and reliability. The batteries come with state-of-the-art techniques and procedures for standardisation, validity and reliability, and therefore constitute a unique approach to accommodating questions of diversity.

Legally speaking, the utilisation of psychometric testing has specific implications regarding the validity and reliability of the instruments involved, especially as how these reflect on the question of bias under conditions of diversity in terms of language and culture.


5. Method of Research

In order to adhere to the aforementioned guidelines and regulations set for assessment, an ongoing research project was launched by the Department of Student Counselling at the Technikon Pretoria in 1996. The database used for the study comprises 11 898 psychometric records. The following are continuously monitored:

- situation-specific reliability
- situation-specific item bias analysis
- situation-specific validity

For the purpose of this paper, an overview of the results obtained will be given.


6. Results

Different batteries of tests ('indices') are compiled for each group of prospective students as part of the Potential Index Batteries theory of 'tailor-made assessment'. The number of respondents assessed per index therefore differs and will be given in each case (e.g. prospective Journalism students are expected to prove their environmental awareness and are therefore, inter alia, assessed on the index for general knowledge to determine their competency in this area, while prospective Engineering students are assessed on areas such as non-verbal reasoning). The situation and its specific need - rather than considerations of potential generic to all human beings and what is seen as 'average' about people - determine the compilation of the battery for a particular assessment.


6.1 Reliability

The following situation-specific reliability coefficients were obtained for the PIB indices utilised in the assessment of the academic potential of prospective students. Where the possible answers to an item consist of a range of values, Cronbach's coefficient alpha was computed; where the possible answers were dichotomous, the Kuder-Richardson 20 (KR20) formula was used. [If all items are perfectly reliable and measure the same thing (true score), the reliability coefficient equals 1.0.] Generally a reliability coefficient of 0.75 for cognitive indices and 0.65 for emotional/social indices is considered acceptable.
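Both coefficients can be computed directly from a respondents-by-items score matrix. The following Python sketch is not part of the original study (which used the PIB's own standardisation software); it merely illustrates the two formulas. With population variances used throughout, KR20 is coefficient alpha with each item variance reduced to p(1-p):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's coefficient alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0)            # population variance per item
    total_var = items.sum(axis=1).var()      # variance of the total score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def kr20(items):
    """Kuder-Richardson 20 for dichotomous (0/1) items: the special case of
    alpha in which each item's variance reduces to p * (1 - p)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    p = items.mean(axis=0)                   # proportion answering each item correctly
    total_var = items.sum(axis=1).var()
    return (k / (k - 1)) * (1 - (p * (1 - p)).sum() / total_var)
```

On the same dichotomous data the two functions agree, and a matrix of perfectly parallel items yields 1.0, as the bracketed remark above indicates.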


Table 1: Reliability coefficients as computed for indices utilised in the assessment of the potential of prospective students

| Index | Method | N | Reliability coefficient |
|---|---|---|---|
| Composition of Wholes | KR20 | 3 353 | 0.84 |
| Spatial Reasoning and Perception | KR20 | 4 564 | 0.79 |
| Creativity | Cronbach | 4 525 | 0.75 |
| Reading Comprehension | KR20 | 11 454 | 0.83 |
| Mental Alertness | KR20 | 11 108 | 0.87 |
| Vocabulary | KR20 | 1 803 | 0.76 |
| Interpersonal Relations | Cronbach | 5 102 | 0.82 |
| Self-Image | Cronbach | 2 749 | 0.79 |
| Motivation | Cronbach | 2 687 | 0.71 |
| Stress Management | Cronbach | 1 585 | 0.92 |
| Assertiveness | Cronbach | 1 950 | 0.74 |

6.2 Item bias analysis

Legal and professional guidelines prohibit the utilisation of psychometric testing unless measures have been taken to ensure that the relevant assessment tool is culture-fair and not biased against members of a specific group. The debate over the terms fairness and bias has been going on for some time, and the difference between the two concepts is not always clearly understood. According to Murphy and Davidshoffer (1994:275), fairness refers to a value judgement regarding decisions or actions taken as a result of test scores. Bias, on the other hand, is a statistical characteristic of the test scores, or of the predictions based upon those scores. Thus, bias is said to exist when a test makes systematic errors in measurement or prediction (Murphy and Davidshoffer, 1994:275).

Brown (1983:224) defines test bias as follows: 'A test can be considered biased if it differentiates between members of various groups on bases other than the characteristic being measured. That is, a test is biased if its content, procedures, or use result in a systematic advantage (or disadvantage) to members of certain groups and if the basis of this differentiation is irrelevant to the test purpose.' According to Brown, it is important to realise that this definition does not imply that a test is biased merely because members of different groups perform differently. When different groups perform differently on a test and the scores reflect real differences in characteristics or traits, the test is not biased.

However, Brown concludes that as soon as a mean and/or distribution difference between groups is found, the possibility of test bias should be investigated. In order to ensure that the instrument used is culture-fair and not biased against any group of students, an item bias analysis is regularly performed on the total database. As the biographical data of the testee is obtained during registration for the assessment, information on gender and home language can be extracted for this computation. As the number of respondents differs from index to index, the number of testees in the particular population is given with each coefficient.

The following results were found with the indexes used by the Technikon Pretoria:


Table 2: The result of an item bias analysis conducted on Index 1 (General Knowledge) [N=2160]

| Group 1 | Group 2 | Correlation between z-scores | P |
|---|---|---|---|
| English | Northern Sotho | 0.957 | 0.000 |
| English | Zulu | 0.914 | 0.000 |
| English | Southern Sotho | 0.825 | 0.000 |
| English | Xhosa | 0.815 | 0.000 |
| English | Tswana | 0.935 | 0.000 |
| Afrikaans | Northern Sotho | 0.327 | 0.110 |
| Afrikaans | Zulu | 0.278 | 0.000 |
| Afrikaans | Tswana | 0.289 | 0.000 |
| Afrikaans | Southern Sotho | 0.360 | 0.000 |
| Afrikaans | Xhosa | 0.240 | 0.000 |

Some items from Index 1 showed definite bias as far as Afrikaans and the African languages are concerned. The items that showed bias are: item 8 (against group 1) and items 12, 1, 17, 20 and 23 (all against group 2).


Table 3: The result of an item bias analysis conducted on Index 3 (Reading Comprehension) [N=11 454]

| Group 1 | Group 2 | Correlation between z-scores | P |
|---|---|---|---|
| English | Northern Sotho | 0.993 | 0.000 |
| English | Zulu | 0.989 | 0.000 |
| English | Southern Sotho | 0.993 | 0.000 |
| English | Xhosa | 0.973 | 0.000 |
| English | Tswana | 0.995 | 0.000 |
| Afrikaans | Northern Sotho | 0.886 | 0.000 |
| Afrikaans | Zulu | 0.900 | 0.000 |
| Afrikaans | Tswana | 0.903 | 0.000 |

Table 4: The result of an item bias analysis conducted on Index 5 (Mental Alertness) [N=11 108]

| Group 1 | Group 2 | Correlation between z-scores | P |
|---|---|---|---|
| English | Afrikaans | 0.913 | 0.000 |
| English | Northern Sotho | 0.992 | 0.000 |
| English | Zulu | 0.989 | 0.000 |
| English | Southern Sotho | 0.981 | 0.000 |
| English | Xhosa | 0.974 | 0.000 |
| English | Tswana | 0.989 | 0.000 |
| English | Tsonga | 0.979 | 0.000 |
| Afrikaans | Northern Sotho | 0.878 | 0.000 |
| Afrikaans | Zulu | 0.904 | 0.000 |
| Afrikaans | Southern Sotho | 0.896 | 0.000 |
| Afrikaans | Tswana | 0.890 | 0.000 |

Table 5: The result of an item bias analysis conducted on Index 12 (Vocabulary) [N=1 803]

| Group 1 | Group 2 | Correlation between z-scores | P |
|---|---|---|---|
| English | Afrikaans | 0.792** | 0.000 |
| English | Northern Sotho | 0.949 | 0.000 |
| English | Zulu | 0.939 | 0.000 |
| English | Southern Sotho | 0.884 | 0.000 |
| English | Xhosa | 0.851 | 0.000 |
| English | Tswana | 0.956 | 0.000 |
| English | Tsonga | 0.931 | 0.000 |
| Afrikaans | Northern Sotho | 0.642 | 0.000 |
| Afrikaans | Zulu | 0.739 | 0.000 |
| Afrikaans | Southern Sotho | 0.701 | 0.000 |
| Afrikaans | Tswana | 0.740 | 0.000 |
| Afrikaans | Tsonga | 0.753 | 0.000 |

(** The English vocabulary test is administered to Afrikaans-speaking students.)


6.3 Predictive Validity

There seems to be consensus among writers that validity is the most important characteristic of a psychometric test (Brown, 1983:19; AERA, APA, NCME, 1985:9; Anastasi and Urbina, 1997:8). Validity is said to refer to the extent to which a test measures what it is designed or developed to measure (Brown, 1983:19; Walsh & Betz, 1985:56; Anastasi and Urbina, 1997:8; Kline, 1993:15). A number of theorists add that validity involves the extent to which appropriate and meaningful inferences can be made from test scores and other measurements (Sax, 1980:289; Brown, 1983:98; AERA, APA, NCME, 1985:9; Mehrens & Lehmann, 1991:265).

Cascio (1991:149) feels that the traditional view of validity, namely the extent to which a measurement procedure actually measures what it is designed to measure, is inadequate, as it implies that a procedure has only one validity, determined in a single study. In this regard Van Aswegen (1997:19) concludes that, particularly in selection research, many investigations are required to obtain a thorough understanding of the interrelationships between scores from a particular procedure and other related variables. According to Binning and Barret (as quoted by Van Aswegen, 1997:19), three strategies can be applied in this investigation process, namely content-related, construct-related and criterion-related evidence of validity.

Although construct- and content-related validity were also computed, this paper focuses on predictive or criterion-related validity. To put the validity coefficients found into perspective, reference is made to Roe and Greuter (1991), who reported a mean validity coefficient of 0.25 for tests of general mental ability and special aptitudes and of 0.15 for personality tests. The selection interview has a mean validity of 0.15; for reference checks 0.17 has been found; for academic achievement 0.17; and for expert recommendation 0.21.

The validity coefficients reported in this paper for the relevant indices of the Potential Index Batteries often far exceed these internationally reported mean validities, which were found with other, probably less situation-specific, assessment tools.
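In the tables that follow, R² is the proportion of variance in a subject mark explained jointly by the listed index scores. As a minimal sketch (the study itself would have used a statistical package, and the stepwise selection of predictors is omitted here), R² for a given predictor set can be computed from an ordinary least-squares fit:

```python
import numpy as np

def r_squared(predictors, criterion):
    """R^2 from an OLS fit of the criterion (e.g. a subject mark) on a
    matrix of predictor scores; an intercept column is added automatically."""
    X = np.column_stack([np.ones(len(criterion)), np.asarray(predictors, dtype=float)])
    y = np.asarray(criterion, dtype=float)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    ss_res = ((y - X @ beta) ** 2).sum()           # residual sum of squares
    ss_tot = ((y - y.mean()) ** 2).sum()           # total sum of squares
    return 1 - ss_res / ss_tot
```

An R² of 0.58, as reported below for the mathematically calculated average of Journalism students, would thus mean that 58% of the variance in that average is accounted for by the listed predictors.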


Table 6: Results of multiple regression analysis performed on data: predictors versus criteria (academic performance): Journalism students

| Criteria | Predictors | R² |
|---|---|---|
| Mathematically calculated average | Mental Alertness, Reading Comprehension, Self-Motivation, Spelling | 0.58 |
| Political Science (Major) | Time Management*, Spelling, Mental Alertness, Self-Motivation | 0.81 |
| Journalism Practice (Major) | General Knowledge, Reading Comprehension, Mental Alertness | 0.71 |
| Photo Journalism | Mental Alertness, Time Management*, Reading Comprehension | 0.59 |
| Broadcasting Journalism | Mental Alertness, Time Management*, General Knowledge | 0.53 |
| Publishing Technology | Time Management*, General Knowledge, Mental Alertness | 0.51 |
| Contemporary History | Time Management*, Spelling, Self-Motivation | 0.56 |
| Word Processing for Journalists | Mental Alertness | 0.42 |
| Afrikaans | Reading Comprehension*, Spelling, General Knowledge, Self-Motivation | 0.62 |

*Inverted relation


Table 7: Results of multiple regression analysis performed on data: predictors versus criteria (academic performance): Analytic Chemistry students

| Criteria | Predictors | R² |
|---|---|---|
| Analytical Chemistry | Reading Comprehension, Numerical Ability, Perception, General Knowledge, Vocabulary | 0.86 |
| Chemistry | Vocabulary, Reading Comprehension, Perception, Abstract Reasoning, Composition of Wholes | 0.90 |
| Physics | Vocabulary, Abstract Reasoning, Mental Alertness, Numerical Ability, Composition of Wholes | 0.90 |
| Mathematics | Vocabulary, General Knowledge, Spatial Reasoning, Mental Alertness | 0.50 |


Table 8: Results of multiple regression analysis performed on data: predictors versus criteria (academic performance): Biological Science students

| Criteria | Predictors | R² |
|---|---|---|
| Anatomy and Physiology | Mental Alertness, Reading Comprehension, Numerical Ability | 0.22 |
| Calculations and Statistics | Spatial Reasoning and Perception, Abstract Reasoning, Reading Comprehension, General Knowledge | 0.54 |
| Chemistry | Reading Comprehension, Spatial Reasoning and Perception, Abstract Reasoning | 0.31 |
| Physics | Spatial Reasoning and Perception, Abstract Reasoning, Reading Comprehension | 0.45 |
| Computer Literacy | Spatial Reasoning and Perception, Composition of Wholes | 0.45 |
| Haematology | Mental Alertness, Composition of Wholes, Vocabulary | 0.39 |
| Microbiology | Mental Alertness, Abstract Reasoning, Reading Comprehension | 0.30 |

Table 9: Results of multiple regression analysis performed on data: predictors versus criteria (academic performance): Drama students

| Criteria | Predictors | R² |
|---|---|---|
| History of Costume | General Knowledge, Self-Motivation, Mental Alertness | 0.40 |
| Text Analysis | Mental Alertness, Stress Management* | 0.57 |
| Practical Exercise in the use of Speech Sounds | General Knowledge, Self-Image* | 0.34 |
| Voice Production | General Knowledge, Self-Image*, Mental Alertness, Stress Management* | 0.63 |
| Practical Interpretation of Poetry and Prose | Reading Comprehension, Mental Alertness, General Knowledge | 0.57 |
| Acting | Mental Alertness, Stress Management* | 0.49 |
| Movement | Mental Alertness, Stress Management*, General Knowledge | 0.65 |
| Oral Interpretation | Stress Management*, Reading Comprehension, Mental Alertness | 0.70 |

*Inverted relation


Table 10: Results of multiple regression analysis performed on data: predictors versus criteria (academic performance): Performing Art Technology students

| Criteria | Predictors | R² |
|---|---|---|
| Textile Design 1A | Abstract Reasoning, Reading Comprehension, Spatial Reasoning and Perception | 0.37 |
| Costumes 1A | Reading Comprehension, Mental Alertness, Composition of Wholes | 0.33 |
| Make-Up 1A | Reading Comprehension, Abstract Reasoning, Composition of Wholes, Mental Alertness | 0.43 |
| History of Theatre | Mental Alertness, Spatial Reasoning and Perception, Reading Comprehension | 0.38 |
| Stage Technology | Reading Comprehension, Stress Management, Interpersonal Relations, Mental Alertness | 0.30 |
| Communication | Mental Alertness, Spatial Reasoning and Perception, Composition of Wholes | 0.35 |
| Communication Writing | Reading Comprehension, Abstract Reasoning, Interpersonal Relations, Spatial Reasoning and Perception | 0.39 |


Table 11: Results of multiple regression analysis performed on data: predictors versus criteria (academic performance): Mechanical Engineering (Pre-Tech) students

| Criteria | Predictors | R² |
|---|---|---|
| Mathematically calculated average | Creativity, Reading Comprehension, Spatial Reasoning and Perception | 0.42 |
| Mechanical Engineering Drawing | Creativity, Reading Comprehension, Mental Alertness | 0.45 |
| Applied Technology | Creativity, Spatial Reasoning and Perception | 0.45 |
| Mechanics | Creativity, Reading Comprehension, Composition of Wholes | 0.57 |
| Electrical Engineering | Creativity, Mental Alertness, Spatial Reasoning and Perception | 0.54 |
| Mathematics | Composition of Wholes, Creativity, Reading Comprehension | 0.37 |

7. Conclusion

It can be concluded from the above that, although there is always room for improvement, the current situation-specific potential assessment policy followed by the Technikon Pretoria fully adheres to the legal and professional guidelines set for psychometric evaluation. As many factors cause the process to be dynamic, the analysis of reliability, item bias and validity will continue, to ensure accountable selection decisions.


8. List of References

African National Congress. 1994. The Reconstruction and Development Programme: A policy framework. Johannesburg: Umyano Publications.

American Educational Research Association, American Psychological Association & National Council on Measurement in Education. 1985. Standards for educational and psychological testing. Washington, DC: American Psychological Association.

Anastasi, A. & Urbina, S. 1997. Psychological testing. 7th edition. New York: Macmillan Publishing Company.

Brown, F.G. 1983. Principles of Educational and Psychological testing. 3rd edition. USA: Holt, Rinehart and Winston.

Cascio, W.F. 1991. Applied Psychology in Personnel Management. 4th edition. Englewood Cliffs: Prentice Hall.

Cronbach, L.J. 1990. Essentials of psychological testing. 3rd edition. New York: Harper and Row.

De Jager, A.C.; Van Lingen, J.M. & Watson, A.S.R. 1997. Increasing access for Black students through a special admission procedure: The UPE experience. Paper presented at the 17th Annual Conference of the Society for Student Counselling in Southern Africa, Technikon Port Elizabeth.

Department of Education. 1997. A Programme for Higher Education Transformation. White Paper on Higher Education. Government Gazette, vol. 386. Pretoria: Government Printer.

Kline, P. 1993. The Handbook of Psychological Testing. London and New York: Routledge.

Kotze, N.; Nel, A. & Van der Merwe, D. 1996. Culture fair selection procedures: the case of Psychometrics. Paper presented at the 16th Annual Conference of the Society for Student Counselling in Southern Africa, PREMOS, Pretoria.

Mehrens, W.A. & Lehmann, I.J. 1991. Measurement and evaluation in Education and Psychology. 4th edition. USA: Harcourt Brace Jovanovich College Publishers.

Moody, C.D. 1994. Strategies for improving the representation and participation of Black faculty, students and staff in tertiary education. Paper presented at the Conference on Multiculturalism, Vaal Triangle Technikon.

Roe, R.A. & Greuter, M.A. 1991. Developments in Personnel Selection Methodology. In: Hambleton, R.K. & Zaal, J.N. (eds). Advances in Educational and Psychological Testing: Theory and Applications. Boston/London/Dordrecht: Kluwer Academic Publishers.

Sax, G. 1980. Principles of Educational and Psychological measurement and evaluation. 2nd edition. Belmont: Wadsworth Publishing Company.

Stumph. 1997. The White Paper on Higher Education: Implications for counselling and guidance of students particularly regarding access and retention issues. Paper presented at the 17th Annual Conference of the Society for Student Counselling in Southern Africa, Technikon Port Elizabeth.

Van Aswegen, M. 1997. The standardisation of a learning potential battery for the selection of poorly qualified employees. Unpublished thesis: University of Pretoria.

Walsh, W.B. & Betz, N.E. 1985. Tests and assessment. New Jersey: Prentice Hall.
