Nearly 40 subject matter experts, guided by competency-testing professionals, identified the job tasks performed across the disciplines that practice e-discovery. A global ACEDS field survey then gauged the importance and frequency of those tasks, settling on 15 major e-discovery areas of emphasis. Experts next wrote 220 questions, which other experts analyzed independently. More than 25 telephone conference calls followed, in which each item was reviewed for relevance. Finally, the testing experts reviewed the chosen questions for clarity, relevance and comprehensibility.
A comprehensive breakdown of the procedures ACEDS followed in constructing the examination is found below.
ACEDS™ created the Certified E-Discovery Specialist (CEDS) certification examination in 2010, building it according to strict psychometric principles of soundness, integrity and professionalism. The CEDS certification adheres to the high standards of the Institute for Credentialing Excellence (ICE), formerly known as the National Organization for Competency Assurance (NOCA).
The CEDS certification is independent and has no ties to any product, professional service or software. The CEDS certification examination is a fair, comprehensive, legally defensible, unambiguous test of knowledge and skill in e-discovery. Kryterion, a leading global psychometric testing firm engaged by ACEDS to assist with the development of the examination, assures testing soundness, security and integrity.
Assisted by Kryterion’s experts, ACEDS strives to assure that the results of the CEDS examination provide a reliable assessment of a candidate’s knowledge and skill in e-discovery best practices.
The CEDS examination development process was designed to comply with the legal and technical requirements for professional certification and licensure, as well as the requirements for accreditation by the American National Standards Institute (ANSI) and the National Commission for Certifying Agencies (NCCA).
The use of subject matter experts (SMEs) throughout the test development process is integral to the validity, legal defensibility and credibility of the CEDS certification exam. SMEs were chosen to be representative of the target population for the exam in terms of age, gender, educational and ethnic background, geographic location, work environment, and other factors that may influence how they perform their job.
The CEDS examination development process followed these steps:
1. Test Definition
2. Job Task Analysis Workshop
3. Online Test Blueprint Survey
4. Item Writing Workshop
5. Psychometric and Grammar Edit
6. Virtual Technical Review
7. Beta Testing
8. Psychometric Analysis
9. Passing/Cut Score Study
10. Scoring of Beta Test Takers
Kryterion conducted initial meetings with ACEDS stakeholders via conference calls and web meetings to fully define the purpose of the exam, the target audience, the minimal level of competence required to pass, and other important exam parameters. The exam stakeholders agreed upon the major parameters and outlined them in the Test Definition Document.
A job task analysis is a systematic process for collecting information regarding functions (i.e., responsibilities and duties) and tasks performed on a job as well as the knowledge and skills (i.e., competencies) required to perform those tasks. The results of the job task analysis describe the breadth and depth of knowledge and skills that must be covered by the certification exam in order for it to be deemed valid, credible, and useful.
Kryterion uses the results of the job task analysis to create test objectives for an exam. Test objectives help ensure that the knowledge and skills measured by the exam are the same as those used to perform the job (i.e., measure applied job knowledge).
Kryterion facilitated a two-day live focus group session in South Florida where 12 subject matter experts (SMEs) from around the United States gathered. They identified the primary tasks, subtasks, and supporting knowledge, skills and abilities expected of a CEDS certified professional. The result of the focus group was a list of close to 500 proposed test objectives.
The results of the job task analysis were used to create an online test blueprint survey of more than 272 items, allowing worldwide administration to a large population of e-discovery professionals. Approximately 500 people responded to the survey, and the job task analysis focus group used their responses to weight the proposed test objectives.
The survey participants were asked to rate the level of expertise, frequency, criticality, and importance to test for each of the objectives identified by the focus group. Each practice analysis criterion was rated on a five-point scale. ACEDS provided the names and email addresses of individuals to invite to participate in the survey. The online survey remained open for three weeks to accommodate different professional schedules.
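The survey-driven weighting can be illustrated with a short sketch. The objective names, rating values, and the equal-weight averaging formula below are all illustrative assumptions, not ACEDS's actual blueprint data or method:

```python
# Illustrative sketch: combining five-point survey ratings into objective
# weights. All objective names and numbers here are hypothetical.

# Mean survey ratings (1-5) per proposed objective on the four criteria.
objectives = {
    "Objective A": {"expertise": 4.2, "frequency": 4.5, "criticality": 4.8, "importance": 4.6},
    "Objective B": {"expertise": 3.9, "frequency": 4.1, "criticality": 4.0, "importance": 4.2},
    "Objective C": {"expertise": 2.1, "frequency": 1.8, "criticality": 2.4, "importance": 2.0},
}

def weight(ratings):
    # Simple average of the criterion means; a real blueprint
    # might weight the criteria unequally.
    return sum(ratings.values()) / len(ratings)

total = sum(weight(r) for r in objectives.values())
for name, ratings in objectives.items():
    share = weight(ratings) / total
    print(f"{name}: {share:.0%} of exam items")
```

Objectives rated low on every criterion receive a small share of exam items or are dropped from the blueprint entirely.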
Kryterion analyzed the survey response data and submitted a draft blueprint to ACEDS for review, comment, modification, and approval.
Once the test blueprint was finalized, item (test question) writing began. In an initial two-hour meeting, Kryterion trained the SMEs on the “Webassessor” online item-writing system. The SME item writers were each asked to write 3-5 items for review during two subsequent two-hour virtual training sessions. The goal of the item writing workshop was to create 230 items, or test questions, to be beta tested.
Kryterion then performed psychometric, sensitivity, and English grammar edits on all items. The psychometric edit verified that each item conformed to applicable psychometric item-writing standards. The sensitivity edit ensured that the items did not appear to favor any particular nationality, race, religion, or gender. The grammar edit corrected grammar, usage, readability, clarity, and consistency. After editing, ACEDS had SMEs review the items to ensure that the substance of each item had not been inadvertently altered during editing. More than 30 “item review” telephone conference sessions were held, with at least six subject matter experts participating in every call, until all 230 items were reviewed. Some individual items were discussed for 20 or 30 minutes.
ACEDS has calculated that more than 1,000 hours of professional time were devoted by the more than 40 volunteer SMEs who participated in the item writing and item review process.
At each step, Kryterion worked with ACEDS and the groups of SMEs to conduct a technical review of the items for:
• Congruence with test objective
• Technical accuracy
• Scoring accuracy
• Importance to practice
• Plausibility of incorrect options
Each review session was conducted by telephone conference call and virtual classroom and, in all, the reviews required dozens of hours to complete.
The purpose of beta testing is to collect data from Test Taker responses on each item in order to statistically analyze the item’s performance and determine whether it should be retained, discarded, or revised. During beta testing, the items were administered to a minimum of 60 examinees representing the target audience. The beta exam was administered at secure testing sites with certified proctors.
After beta testing was complete, Kryterion performed an item analysis, which provides valuable information on item difficulty, discrimination (i.e., how well an item distinguishes between high and low test performers), and answer-choice analysis (i.e., the proportion of Test Takers choosing each option). Kryterion compiled a psychometric analysis identifying items that exhibited unusual statistical performance. ACEDS made the final determination to retain, discard, rework, or retire items from the test bank.
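Two of the statistics named above, item difficulty and discrimination, can be sketched with classical test theory formulas. The response data below is hypothetical, and Kryterion's actual analysis is of course more extensive:

```python
# Illustrative classical item analysis on hypothetical beta-test data.
# 1 = correct, 0 = incorrect; each row is one examinee's responses.
from statistics import mean, pstdev

responses = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
    [0, 0, 1, 1],
]
total_scores = [sum(row) for row in responses]

def difficulty(item):
    """Proportion of examinees answering the item correctly (the p-value)."""
    return mean(row[item] for row in responses)

def discrimination(item):
    """Point-biserial correlation between item score and total score:
    positive values mean strong examinees tend to get the item right."""
    item_scores = [row[item] for row in responses]
    m_i, m_t = mean(item_scores), mean(total_scores)
    sd_i, sd_t = pstdev(item_scores), pstdev(total_scores)
    if sd_i == 0 or sd_t == 0:
        return 0.0  # item answered identically by everyone: no information
    cov = mean((x - m_i) * (y - m_t) for x, y in zip(item_scores, total_scores))
    return cov / (sd_i * sd_t)

for i in range(len(responses[0])):
    print(f"item {i}: difficulty={difficulty(i):.2f}, discrimination={discrimination(i):.2f}")
```

Items with near-zero or negative discrimination, or difficulty close to 0 or 1, are the ones a psychometric review would flag for rework or retirement.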
Kryterion performed a modified Angoff standard-setting procedure to provide a recommended passing (cut) score range. The procedure was conducted during two two-hour virtual meetings. In the first meeting, SMEs were trained in the modified Angoff rating method. In the Angoff method, SMEs are asked to predict the proportion of minimally competent examinees who would answer each item correctly. Each SME independently completed one round of predictions, which Kryterion compiled; the SMEs then reviewed the results in a second virtual meeting. After the second meeting, each SME independently reviewed the items and made a third and final prediction. Kryterion compiled and analyzed the final predictions and provided a recommended cut score range. ACEDS was responsible for determining the final cut score.
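The arithmetic at the heart of the Angoff method is simple to sketch. The SME labels and prediction values below are hypothetical, and this shows only the final averaging step, not the three-round discussion process:

```python
# Minimal sketch of an Angoff-style cut-score calculation on hypothetical
# final-round predictions: each SME's entry is the predicted proportion of
# minimally competent candidates answering each item correctly.
final_predictions = {
    "SME-1": [0.70, 0.55, 0.80, 0.60],
    "SME-2": [0.65, 0.60, 0.75, 0.65],
    "SME-3": [0.75, 0.50, 0.85, 0.55],
}

n_items = len(next(iter(final_predictions.values())))

# Each SME's implied cut is the mean of their item predictions;
# the panel recommendation averages across SMEs.
sme_cuts = [sum(preds) / n_items for preds in final_predictions.values()]
recommended_cut = sum(sme_cuts) / len(sme_cuts)

print(f"Recommended cut score: {recommended_cut:.1%} of items correct")
```

In practice the spread of the individual SME cuts is used to report a cut score *range* rather than a single number, which is why the final decision rests with ACEDS.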
After the exam was finalized, beta Test Takers were notified of their pass/fail status and topic-level scores.