EXAM DEVELOPMENT

New and Improved

Launching The Linux Foundation Certified IT Associate (LFCA) certification during a pandemic required a different (and better) approach to exam development

Certification Magazine, July 2021, Summer Edition

The global pandemic required many certification programs to shift to remote operations—not only delivering tests via remote proctoring, but also creating the certification exams through remote processes. While many test sponsors doubtless replaced the traditional exam development process with Zoom workshops, The Linux Foundation took a novel approach.

By using an asynchronous exam development platform to leverage a global group of subject matter experts (SMEs), the foundation was able to launch a brand-new credential—The Linux Foundation Certified IT Associate (LFCA)—in record time. Can this approach help other programs succeed in a post-pandemic world?

It turns out that’s a question for Clyde Seepersad, general manager of training and certification for The Linux Foundation. Here’s what Seepersad had to say about the process of getting a new Linux certification off the ground in record time.

Question: What was the impetus for creating the new LFCA credential?

Seepersad: The Linux Foundation wanted to create a credential that would be suitable for those who are new to the IT world and interested in becoming an administrator or an engineer. This entry-level credential, called The Linux Foundation Certified IT Associate (LFCA), was designed to be a stepping stone to more advanced credentials. We realized that there aren’t a lot of good options for folks who are trying to start an IT career in the age of cloud and in the age of continuous integration and deployment of code. So, seeing that gap in the market—and hearing that consistently from our customers—we wanted to get something out quickly, in order to help bring in that next generation of talent. However, we also needed to be able to recruit people from all the right demographics and geographies, and of course we were dealing with the global pandemic.

Question: How did you approach that task?

Seepersad: We wanted to bring this certification to market quickly because we saw that there was a demand, but we knew the traditional approach of trying to get folks together in person to develop the exam was not going to work. We needed a platform that would allow us to do everything remotely—pulling in people from a variety of geographies, industries, demographics, and age groups to effectively represent the communal take on what would be the right components of knowledge for a certification of this type. We wanted to move quickly despite the pandemic, but we really wanted to get a broad perspective on “What does an entry-level credential look like in the age of cloud computing?”

However, scheduling SMEs and working around participants’ bandwidth is always a challenge in exam development—and it’s one that invariably lengthens time-to-market. Considering all of this, we adopted a fully asynchronous model using the Certiverse platform, and it eliminated almost all of the scheduling and availability problems.

Question: How did you identify and recruit subject matter experts (SMEs)?

Seepersad: Thankfully, the global open source technology community is very strong—so identifying potential SMEs wasn’t as challenging as recruiting them during a pandemic. Traditionally, program sponsors have offered stipends and badges to recruit and incentivize subject matter experts, as well as covering all travel-related expenses to attend in-person workshops. In developing the LFCA, we took a completely different approach and instead offered a share of the exam revenue. We used a model similar to training sites like Udemy and Pluralsight, giving the SMEs the opportunity to earn money for their questions that were used in the exam. We also added a gamification piece to engage the SMEs’ competitive spirit and keep them motivated.

Question: Was that approach successful?

Seepersad: Yes, we were very pleased with the outcome. We successfully recruited two dozen SMEs from 13 different countries, which gave our project the truly global perspective we were looking for.

Question: In a traditional exam development process, everyone is gathered together in person—what challenges did you face in trying a new approach?  

Seepersad: Since we were developing exams using an online, distributed, mostly asynchronous process, we had several things to consider as we thought about compressing the timeline. One, of course, was keeping people engaged, right? I think we all know now, after a year of Zoom, that it’s sometimes hard to stay engaged with an online process. So part of our rationale in trying to compress the timeframe was to make sure that folks didn’t get pulled away by other things. It also helps keep the context fresh. That is, if you’re developing an item and don’t complete it in a single session, being able to move quickly through the process helps you remember why you were architecting the question the way you were when you come back to it. Compressing the exam timeline really helped make sure that the exam authors stayed on task.

Question: Was it difficult to train the subject matter experts?

Seepersad: Every one of the participants was new to using the platform, so we were really pleased by how little facilitation was required. Item writing is definitely the most challenging part of creating an exam. Certiverse uses AI and machine learning to guide participants through the process, and natural language processing (NLP) helps content authors with syntax, so the exam is written in a consistent voice. Despite having multiple authors, the end product doesn’t sound like “exam-by-committee.”

Certiverse surveyed the SMEs after development wrapped up, and 91 percent said they really liked using the platform.
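Certiverse hasn’t published how its NLP assistance works under the hood, but the idea of steering many authors toward one voice can be illustrated with a simple heuristic: flag draft items whose style (for instance, average sentence length) drifts far from the rest of the bank. The Python sketch below is purely hypothetical—an editor’s illustration of the principle, not the platform’s actual method.

```python
# Hypothetical sketch: flag draft exam items whose style drifts from the
# rest of the bank. This is NOT Certiverse's implementation; it only
# illustrates how simple NLP-style features can promote a consistent voice.
import re
from statistics import mean, stdev

def avg_sentence_length(text: str) -> float:
    """Average number of words per sentence in a draft item."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    return len(words) / max(len(sentences), 1)

def flag_style_outliers(items: list[str], z_cutoff: float = 2.0) -> list[int]:
    """Return indices of items whose sentence length is a statistical outlier."""
    lengths = [avg_sentence_length(t) for t in items]
    if len(lengths) < 2:
        return []
    mu, sigma = mean(lengths), stdev(lengths)
    if sigma == 0:
        return []
    return [i for i, x in enumerate(lengths) if abs(x - mu) / sigma > z_cutoff]

drafts = [
    "Which command lists files? Choose the best answer.",
    "Which command shows disk usage? Choose the best answer.",
    "A stem written as one extremely long meandering sentence that keeps "
    "going and going and buries the actual task in a pile of qualifiers.",
]
print(flag_style_outliers(drafts, z_cutoff=1.0))  # [2]: the long-winded draft
```

A real authoring-assist feature would look at far more than sentence length, but the principle is the same: measure each draft against the bank and surface the outliers to a human reviewer.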

Question: What other concerns did you have while going through the process?

Seepersad: One of our chief concerns was whether the crowdsourced content would really be high quality. We had put together this panel of folks and asked people to contribute—not just to the definition of the exam, but also to the authoring of the items. Typically, we would use 8-10 people as authors. In this case, we were using dozens of authors. So it was really more of a crowdsourcing model. While this was great in terms of increasing the diversity of the content authors, there was also some risk of lower quality: What would the questions come back looking like with a much larger pool of authors?

Question: How did you ensure quality?

Seepersad: The Certiverse platform uses AI designed by a psychometrician to guide participants in writing high-quality, valid items, but there’s a human expert component as well. One of the premises that went into designing the program was that we wouldn’t have single points of failure—people would be able to interact with each other to comment on, critique, shape, and hone the items. All of the SMEs were required to review other contributors’ items in order to be eligible for compensation, and every item had to make it through a minimum of two peer reviews. The SMEs also had the ability to chat with each other within the platform and discuss technical content. This cut out rounds of additional editing and rewrites and improved quality.
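The platform’s internal rules aren’t public, but the two requirements Seepersad describes—every SME must review other contributors’ items to be paid, and every item needs at least two peer reviews—map naturally onto a simple eligibility check. Here is a hypothetical Python sketch of those two gates; the names and data shapes are illustrative, not Certiverse’s API.

```python
# Hypothetical sketch of the two review rules described above: an item
# advances only after two or more peer reviews, and an SME is eligible
# for compensation only after reviewing other contributors' work.
# Illustrative only—not Certiverse's actual data model.
from dataclasses import dataclass, field

@dataclass
class Item:
    item_id: str
    author: str
    reviewers: set[str] = field(default_factory=set)

    def add_review(self, sme: str) -> None:
        if sme == self.author:
            raise ValueError("authors cannot peer-review their own items")
        self.reviewers.add(sme)

    @property
    def review_complete(self) -> bool:
        return len(self.reviewers) >= 2  # minimum of two peer reviews

def compensation_eligible(items: list[Item]) -> set[str]:
    """SMEs who have reviewed at least one other contributor's item."""
    return {sme for item in items for sme in item.reviewers}

bank = [Item("q1", "alice"), Item("q2", "bob")]
bank[0].add_review("bob")
bank[0].add_review("carol")
print(bank[0].review_complete)      # True: q1 has its two peer reviews
print(compensation_eligible(bank))  # bob and carol qualify; alice hasn't reviewed yet
```

Tying compensation to reviewing as well as authoring spreads the quality burden across the whole panel instead of concentrating it in a single editor—the “no single point of failure” premise in practice.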

Question: Were you able to balance quality with the speed you were looking for?

Seepersad: The total timeline from beginning to beta was four months. The job task analysis and survey were completed in three weeks, because we got a lot of interest from SMEs as a result of the revenue-sharing model. The SMEs were given four weeks to review and write items, and they completed the item goals laid out by the psychometrician well within that timeframe. The beta exam was launched in December. We got enough test takers to complete item analysis and standard-setting within two weeks, and we launched the live exam by January 15. Over 1,000 people registered for the exam in the first 100 days. That is very difficult to do—especially for a brand-new exam, and particularly an entry-level one, where folks can be a little bit skittish because they’re not fully committed to the career yet.

You know, we got great feedback. I knew a couple of the item writers from past work and was able to get some unfiltered feedback from them, which was all very positive about their experience and the process. It was a difficult year for all of us, so being able to get the exam out quickly—live, well-reviewed, with very little upfront cost—yeah, it’s been borderline miraculous.

Question: Do you have any advice for other certification programs that may be considering a process like this?

Seepersad: So many programs are in need of content right now, especially after pivoting to more remote-proctored testing as a result of the pandemic. When you give up some control on the testing side (as invariably happens with remote proctoring) you need to compensate for it by producing more items. Content is king. In other words, if you want to keep your exams secure, you need as many questions as you can get—because a larger item bank to pull from means less chance of item exposure.

This process provides a quick, easy, and affordable way to get that content built. So, for test sponsors whose exams aren’t performance-based and who need a large volume of items, this is a great approach. Likewise, for organizations looking to launch a credential—especially those without deep exam development expertise or an unlimited budget—I’d highly recommend it.

Widening the panel of contributing SMEs brings greater perspective, diversity, and stability to the exam development process, and the revenue-sharing model incentivizes them to become evangelists for the program. We all learned a lot during the launch of the LFCA credential and are developing additional tests using the Certiverse platform right now.
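Seepersad’s earlier point about bank size and exposure is easy to make concrete with back-of-the-envelope arithmetic: if each exam form draws k items at random from a bank of N, a given item appears on roughly k/N of administrations, so doubling the bank halves the expected exposure. The numbers below are invented for illustration—they are not the LFCA’s actual form length or bank size.

```python
# Illustrative arithmetic for the item-exposure point above: drawing k
# items per form from a bank of N gives each item an expected exposure
# rate of k / N. Figures are made up, not the LFCA's real parameters.
def exposure_rate(items_per_form: int, bank_size: int) -> float:
    """Expected fraction of exam administrations on which a given item appears."""
    return items_per_form / bank_size

for bank_size in (120, 240, 600):
    print(f"bank of {bank_size}: each item on "
          f"{exposure_rate(60, bank_size):.0%} of forms")
# bank of 120: each item on 50% of forms
# bank of 240: each item on 25% of forms
# bank of 600: each item on 10% of forms
```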

Clyde Seepersad is senior vice president & general manager of training and certification at The Linux Foundation. Over the past decade, Clyde has held senior leadership positions in the education space, most recently as head of operations at 360training.com and before that as a senior executive of Houghton Mifflin Harcourt, a global leader in education. 

 
