The Employee Aptitude Survey (EAS), used for more than 50 years in selection and career counseling, was developed to yield “maximum validity per minute of testing time” (Ruch, Stang, McKillip, & Dye, 1994, p. 9). Derived from earlier ability tests, it consists of 10 short tests that may be given singly or in any combination. Alternate forms exist for nine of the tests. The EAS is claimed to be easily administered, hand scored, and interpreted, although in practice advanced training or consultation is needed to interpret the EAS or to use it in selection. The Web site of the company that publishes the EAS (http://corporate.psionline.com/) offers brief suggestions for using and interpreting scores and notes that further technical support is available by phone or e-mail. The tests may be administered in paper-and-pencil form or online, and a technical report suggests that these modes of administration yield comparable results. The instructions treat group and individual administration as equivalent, but data on other ability tests suggest that group administration facilitates performance on highly speeded tests such as the EAS.
The tests themselves have not been revised since 1963. The Examiner’s Manual was recently revised (2005), but the Technical Manual (1994) and Supplemental Norms Report (1995) are dated. A validation table is available on the Web site, but closer inspection reveals that it is drawn from the 1994 manual, which itself is based largely on older reports. Sparse detail about the norm samples limits the EAS’s value in selection; local validation studies would often be required. Few EAS validity studies have been published in peer-reviewed journals. A brief example (apparently hypothetical) uses the EAS to increase selection validity and achieve cost savings, but the estimates are overly optimistic. Furthermore, the Supplemental Norms Report reveals ceiling problems on several tests, making the EAS unsuitable for some upper-level occupations.
The norm tables do not constitute validity data, since the occupational groups were not selected on any criterion of success. They may serve as rough guides in career counseling, showing clients where they stand relative to various occupational groups. Only the validity coefficients reported in the Technical Manual support such predictions; these are fewer in number and are grouped into eight broad occupational categories. A meta-analysis suggests the EAS has predictive validity across several occupational and educational groups, but evidence on differential validity (by gender, age, and race) is not included. As a result, only rough predictions about training or occupational success are possible.
Reliability data are limited to alternate-form coefficients from single-session administrations. Although the coefficients are impressive, they overestimate reliability because the tests are homogeneous in content and highly speeded. No internal consistency data are presented, although internal consistency estimates would likely be high for the same reasons. Except for the Manual Speed and Accuracy test, no test-retest (i.e., stability) data are presented.
The EAS has a long record of usefulness in industrial selection and career counseling. Despite these criticisms, the EAS compares favorably with other multifactor ability batteries (e.g., USES General Aptitude Test Battery, Differential Aptitude Test, Armed Services Vocational Aptitude Battery) when used to test clients seeking information about their abilities. The tests are short, easy to administer and score, and straightforward to complete. Interpretation can be a challenge.
References
- Ruch, W., & Stang, S. (1995). Employee Aptitude Survey supplemental norms report. Los Angeles: Psychological Services.
- Ruch, W., & Stang, S. (2005). Employee Aptitude Survey examiner’s manual (3rd ed.). Los Angeles: Psychological Services.
- Ruch, W., Stang, S., McKillip, R., & Dye, D. (1994). Employee Aptitude Survey technical manual (2nd ed.). Los Angeles: Psychological Services.