SCORM 1.2 vs. SCORM 2004: Which One Is Better?
Although SCORM got its most recent update in 2009, it’s still going strong and is widely used by the eLearning community to this day. A few versions of the standard were released over the years, of which only two have stood the test of time: SCORM 1.2 and SCORM 2004. But even though they’ve both been around for so long, it’s still hard to pinpoint the precise differences between them and their respective strengths and weaknesses.
In this article, we’ll take a look at these two most notable versions of SCORM so you can choose which one to go with.
How SCORM Began
In the year 2000, Sharable Courseware Object Reference Model 1.0 (SCORM for short) was introduced to the public as a new concept for interoperability in computer-based training. As we look back upon this event, it may seem nothing short of groundbreaking, since SCORM eventually became a household name in eLearning. However, its initial release didn’t exactly make a splash.
For one thing, the world of digital training already had interoperability standards: collectives like AICC and IMS had released specifications aimed at standardizing eLearning a few years before SCORM appeared. Also, at version 1.0, SCORM was more of a collection of disparate concepts than a specification that could be implemented to solve real-life problems in digital education.
But there was something special about SCORM that allowed it to eventually become the most ubiquitous specification in the eLearning world. You see, Advanced Distributed Learning (ADL), the US government’s initiative behind SCORM, didn’t build it from the ground up. Instead, it took the best ideas from a number of already existing standards, like AICC, and integrated them into a full-fledged solution.
In part, that’s why SCORM was named a Reference Model and not an actual standard – in its original embodiment, it referenced a number of existing specifications, rather than being a thing in and of itself.
Here are the specifications that were adopted in SCORM 1.0:
- AICC’s Computer Managed Instruction (CMI)
- AICC’s Course Structure Format (CSF)
- IEEE LTSC’s and ARIADNE’s Learning Object Metadata (LOM)
Some of these specifications were merely changed slightly in order to be used together, while others were stripped of many excessive elements that could complicate the implementation of the new “standard” for early adopters.
In 2001, a year after its initial release, SCORM saw versions 1.1 and 1.2 come out. And while the former improved the specification enough to turn it into an implementable solution, it was the latter stable release that really made SCORM famous in the industry.
SCORM 1.2: The Most Supported Version Among Authoring Tool and LMS Vendors
Released in October 2001, SCORM 1.2 became such a lasting de facto standard for eLearning that it’s found in over 70% of eLearning content to this day. The reason for this is that SCORM 1.2 was finally ready to be implemented by both Learning Management System (LMS) and authoring tool vendors. And when the industry found out how quickly they could ensure compatibility by supporting SCORM, it became an instant hit.
Suddenly, there was no need to reinvent the wheel every time a new company wanted to jump on the eLearning bandwagon and get their own LMS. All that was needed was to make their learning content conformant with SCORM and find an LMS that was compliant with it. The costs were orders of magnitude lower than when you needed to hire a bunch of software developers to create a proprietary computer-based training platform and get it to work with whatever training content you had.
For SCORM 1.2, a couple of new specifications were adopted – IMS’s content packaging and their joint work on metadata done with IEEE. Also, it was the first version of the standard that was broken up into two separate guidelines, or “books”, as ADL called them:
- Run-Time Environment (RTE): defines a JavaScript API that allows training content and the LMS to communicate and record stats, and describes the “verbs” that should be allowed in such communication (passed, failed, etc.) along with other service commands (see the sketch after this list).
- Content Aggregation Model (CAM): explains how separate resource files should be organized to form structured learning content. It also shows how content can be packaged for portability and what information should be used to describe the content (metadata).
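To give you an idea of what the RTE book describes, here is a minimal sketch of that communication from the content’s side. The API function names and data model elements below come from the SCORM 1.2 RTE book itself; the findAPI helper, the loop limit, and the sample values are purely illustrative.

```typescript
// Minimal SCORM 1.2 run-time communication sketch (content side).
// The LMS exposes an object named "API" on a parent or opener window.
interface Scorm12API {
  LMSInitialize(param: ""): "true" | "false";
  LMSGetValue(element: string): string;
  LMSSetValue(element: string, value: string): "true" | "false";
  LMSCommit(param: ""): "true" | "false";
  LMSFinish(param: ""): "true" | "false";
}

// Illustrative helper: walk up the frame hierarchy to locate the API object.
function findAPI(win: Window): Scorm12API | null {
  let current: Window = win;
  for (let hops = 0; hops < 10; hops++) {
    const candidate = (current as any).API as Scorm12API | undefined;
    if (candidate) return candidate;
    if (current.parent === current) break;
    current = current.parent;
  }
  return null;
}

const api = findAPI(window);
if (api && api.LMSInitialize("") === "true") {
  // Read where the learner stopped last time and report a result.
  const bookmark = api.LMSGetValue("cmi.core.lesson_location");
  console.log("Resuming from:", bookmark);
  api.LMSSetValue("cmi.core.score.raw", "85");
  api.LMSSetValue("cmi.core.lesson_status", "passed");
  api.LMSCommit("");
  api.LMSFinish("");
}
```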
Pros and cons
Let’s take a look at some of the pros and cons of using SCORM 1.2 in online training:
Pros
- Supported by a majority of authoring tools and LMSs. SCORM 1.2 compliance is still much more common in the eLearning landscape than support for any other standard.
- Simple and reliable. SCORM 1.2 is a rather small set of specifications compared to its successor, SCORM 2004. That’s why it’s simpler for developers to implement, and there are fewer possibilities for misinterpretation and conflicts between different software products that support SCORM 1.2.
Cons
- Can only be used with web content. This is a general downside of using SCORM, no matter which version. Since the JavaScript API that SCORM uses to record training stats only works in a web browser, you can’t easily implement SCORM to track learning activities in, say, a desktop virtual reality app.
- Not good for long training modules. SCORM 1.2 allocates a very small amount of space for storing course progress: only 4,096 characters. Hence, if there are quite a few interactions and slides in your training module, learners will have to complete it in one sitting. Otherwise, the module will resume not where they left off, but from the point at which the cmi.suspend_data variable’s limit was exceeded (see the sketch after this list).
- Can’t track slide views and quiz results simultaneously. Completion and success can’t be reported separately, meaning that your course can only have one status at a time: passed, completed, failed, incomplete, or browsed. The status is set in the cmi.core.lesson_status variable.
- Doesn’t have a variable to store interaction text. When you check quiz results in the LMS, you will see what the learner’s response was, but you won’t have the question text. Some authoring tools, like iSpring Suite, bypass this limitation (and a few others) of version 1.2 by putting the question text inside the interaction ID variable.
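To illustrate the suspend data and status limits from the list above, here is a hedged sketch of how a course might save its progress under SCORM 1.2. It reuses the Scorm12API interface from the earlier sketch; the JSON serialization and the saveProgress helper are made up for illustration, while the data model elements and the 4,096-character cap come from the standard.

```typescript
// Sketch: persisting progress under SCORM 1.2.
// "api" is the SCORM 1.2 API object located as in the earlier RTE sketch.
const SUSPEND_DATA_LIMIT = 4096; // cmi.suspend_data is capped at 4,096 characters

function saveProgress(api: Scorm12API, state: object): boolean {
  const serialized = JSON.stringify(state); // illustrative format; SCORM does not mandate one
  if (serialized.length > SUSPEND_DATA_LIMIT) {
    // Anything beyond the cap would be lost, so the course could not resume correctly.
    return false;
  }
  api.LMSSetValue("cmi.suspend_data", serialized);
  // Only one status can be held at a time: passed, completed, failed, incomplete, or browsed.
  api.LMSSetValue("cmi.core.lesson_status", "incomplete");
  api.LMSSetValue("cmi.core.exit", "suspend");
  return api.LMSCommit("") === "true";
}
```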
This version of SCORM was just what the eLearning world needed, so it’s no surprise that it was implemented so widely. At the same time, ADL received plenty of feedback from SCORM 1.2 adopters that pointed to certain shortcomings that needed to be addressed. And there were a couple of big features in the backlog that the creators of the standard themselves wanted to implement. So, work on the next generation of the standard soon began.
SCORM 2004: New Sequencing Capabilities and Detailed Data Reporting
In January 2004, the new version of SCORM was presented to the public. It was named SCORM 1.3 at first, but then the name was changed to SCORM 2004. It featured the same Run-Time and Content Aggregation books as SCORM 1.2, plus a new one by the name of Sequencing and Navigation (SN). This new book was based on a specification by IMS called Simple Sequencing.
While specifications that previously existed in SCORM 1.2 were simply revamped for the new version, the new book brought in entirely new capabilities. For one thing, a single content package could now include multiple SCOs, and it was possible to put them in a sequence, allowing content authors to fit entire courses into a single SCORM package. The new version also made it possible to have reusable SCOs – you could dynamically switch content bits in and out of your training modules to save time.
Unfortunately, ADL quickly discovered that, while the revamped portion of the old SCORM was doing great, the SN book had some integral flaws that didn’t allow the new specification to be implemented properly. Therefore, work on an update began.
2nd edition
Just 6 months after the original release of SCORM 2004, the 2nd edition was out. It addressed the pressing issues in the SN book, so this release of SCORM 2004 was actually the first one that could be implemented in full. It was also this version that, for some reason, had the suspend data limit (where course progress is stored) reduced to 4,000 characters, even less than SCORM 1.2 had.
3rd edition
The next update came out in 2006 and featured many changes across all books. For example, the suspend data limit was finally increased to 64,000 characters, which allowed content creators to build longer courses with more complicated interactions whose state could fit into the larger suspend data storage.
It was also the first version of SCORM to set certain requirements for the user interface of a SCORM-compliant LMS. These requirements came about because the new SN book featured so many complex rules for sequencing content inside a SCORM package that LMSs needed to comply with them too.
4th edition
It took ADL 3 years from the 3rd edition to release the final, 4th installment of SCORM 2004. The standard that came out in 2009 once again improved stability and included further additions to the SN book. But did that help SCORM 2004 reach the same popularity among adopters as its predecessor, SCORM 1.2? Not exactly.
There were a few reasons that stood in the way of SCORM 2004’s popularity. One was that the added SN book was often seen as complicated and irrelevant by eLearning vendors. Another was that around 2008, after the first iPhone was released, smartphones started looming on the horizon as a new type of device that people would use to consume digital training. This rising trend posed several unique challenges for SCORM, mostly because mobile devices might not always have a stable network connection. So, SCORM 2004, even in its last edition, still needed a large-scale rework if ADL wanted to make it ready for the mobile age. Additionally, there were security concerns raised by the community that stemmed from the nature of the JavaScript API, an inseparable part of SCORM.
All these issues led the industry to stick with the tried and true SCORM 1.2 rather than spend time and money implementing the fancy new SCORM 2004, whose biggest improvement was an unnecessarily complex specification for sequencing.
Pros and cons
Let’s check out the pros and cons of SCORM 2004, as compared to SCORM 1.2:
Pros
- New capabilities. A new specification of Sequencing and Navigation was added to support multi-SCO packages and regulate how learners could navigate between the SCOs.
- Improved performance. The Run-Time Environment book was thoroughly reviewed and some superfluous elements that were in SCORM 1.2 were removed.
- Separate completion and success tracking. For example, if a learner viewed all the slides in your module but failed the test, the module can report itself as Completed / Failed, whereas with SCORM 1.2 it could only report a single status (see the sketch after this list).
- Added variables for clearer data reporting. In SCORM 2004, each interaction (a test question, for example) can have a human-readable description in addition to the short alphanumeric ID that SCORM 1.2 allows. As a result, when you grade learners’ attempts in an LMS, you see not just their responses, but the question text as well.
- Increased suspend data limit. 64,000 characters as of the 3rd edition. Generally, using content that is compliant with SCORM 2004 results in somewhat more detailed statistics in the LMS, since the character limits were increased for a number of fields.
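To see what the separate statuses and richer interaction data look like in practice, here is a brief sketch of SCORM 2004 run-time calls from the content’s side. The API object name, function names, and data model elements follow the SCORM 2004 RTE book; the reportResults helper, the sample question, and the sample values are illustrative.

```typescript
// Sketch: reporting results under SCORM 2004.
// The LMS exposes an object named "API_1484_11" instead of SCORM 1.2's "API".
interface Scorm2004API {
  Initialize(param: ""): "true" | "false";
  GetValue(element: string): string;
  SetValue(element: string, value: string): "true" | "false";
  Commit(param: ""): "true" | "false";
  Terminate(param: ""): "true" | "false";
}

function reportResults(api: Scorm2004API): void {
  // Completion and success are now independent statuses.
  api.SetValue("cmi.completion_status", "completed"); // all slides were viewed...
  api.SetValue("cmi.success_status", "failed");       // ...but the quiz was not passed

  // Interactions can carry a human-readable description (e.g. the question text).
  api.SetValue("cmi.interactions.0.id", "q1");
  api.SetValue("cmi.interactions.0.type", "choice");
  api.SetValue("cmi.interactions.0.description", "Which SCORM book defines the run-time API?");
  api.SetValue("cmi.interactions.0.learner_response", "CAM");
  api.SetValue("cmi.interactions.0.result", "incorrect");

  api.Commit("");
}
```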
Cons
- Can only be used with web content. Same as SCORM 1.2: version 2004 didn’t change much about how content-LMS communication is handled. It still uses the same old JavaScript API, which limits SCORM to web browsers.
- The Sequencing and Navigation book was too voluminous for vendors to implement. The idea behind this specification was that instructional designers should be able to produce content as simple, reusable SCOs that can be sequenced inside a course. But most authoring tool vendors didn’t want to implement the specification, since it was marked as optional. Also, the same functionality started making its way into LMSs through Learning Paths, which allowed the sequencing of individual modules into training courses. So the industry basically agreed to ignore that specification.
- SCORM 2004 isn’t supported by as many eLearning products as SCORM 1.2. For example, Moodle LMS has great support for SCORM 1.2, whereas it doesn’t really work with version 2004. So, you won’t be able to take advantage of the improvements in the newer version if either your authoring tool or the LMS that you’re using doesn’t support it.
SCORM 1.2 vs. 2004: Head-To-Head Comparison
Now let’s compare the two standards “mano a mano:”
| Feature | SCORM 1.2 | SCORM 2004 |
|---|---|---|
| Separate completion and success statuses | – | + |
| Suspend data limit (characters) | 4,096 | 64,000 (3rd edition and later) |
| Meaningful descriptions for interactions (test question text, etc.) | – | + |
| One SCORM package can contain multiple SCOs | – | + |
| SCOs can be sequenced and have individual scores reported | – | + |
| Can be used outside of a web browser | – | – |
| Courses can be hosted outside of the LMS | – | – |
| Includes an optional sequencing specification that complicates the implementation of the standard for eLearning vendors | – | + |
| How many LMS and authoring tool vendors support it? | More than 90% | Less than 50% |
Conclusion
To conclude, we would like to say that SCORM 1.2 is certainly a lot more common, even to this day. That’s not because SCORM 2004 had critical flaws or wasn’t well thought out; it really did bring some important improvements over the previous incarnation. However, the complicated Sequencing and Navigation book made vendors either avoid SCORM 2004 or implement it in its most basic form, which doesn’t do justice to the really big changes that differentiate it from SCORM 1.2.
We suggest you go for SCORM 2004 if both your LMS and authoring tool support it; it provides enough advantages over 1.2 to make your life a little easier. For example, a larger suspend data limit will prevent issues with saving students’ progress in long training modules. Also, the separate completion and success statuses that 2004 supports will let you track slide views and quiz results independently, which will make the training stats in your LMS more informative and easier to assess.
Otherwise, stick with SCORM 1.2: it’s a time-proven community favorite that does have its own quirks, but they are all manageable, especially if you use an authoring tool like iSpring Suite that provides excellent support for the standard while helping bypass some of its limitations.
Along with version 1.2, iSpring Suite is also compliant with SCORM 2004. And for those in search of an LMS for your SCORM 1.2/2004 courses, check out iSpring Learn – it has a mobile app that will allow your learners to download SCORM modules and view them offline.