Tracking individual questions delivered randomly

Tracking individual questions delivered randomly

Postby jzoeller » Wed Oct 22, 2008 12:24 pm

We are getting ready to create an assessment with a bank of 100 questions. In the assessment we will select 50 questions. My question is how this will be handled by the LMS. Does each unique question have an assigned ID even though they are delivered randomly? We would like to be able to track which questions users have problems with.
Jim Zoeller
UPS Airlines
jzoeller@ups.com
jzoeller
 
Posts: 44
Joined: Mon Apr 02, 2007 5:56 am
Location: Louisville, KY

Re: Tracking individual questions delivered randomly

Postby Nav » Thu Oct 23, 2008 12:02 pm

Hi Jim,
I think that this depends on your LMS. I've created a lesson called "SCORM Interactions Example", located in Shared Library > Examples. If you download that course as a SCORM 2004 version and upload it into your LMS, see if it works (i.e. set the value and then get the value using the buttons). If that doesn't work, let me know. If it works, great, you've just started! Save a copy of the lesson to see how it works, but here's the general idea:

It's saving the data as an "interaction", which is a data model element in SCORM. You must give each interaction a name by doing the following:
SetLMSValue > "cmi.interactions.n.id", "name (probably the question number)"
where "n" is the index of the interaction. Indexes start at 0, so if you have 100 questions, you'll start at 0 and go to 99. You can also just create index 0 and ignore it after that, paying attention only to 1-100 to keep everything numbered the same.

Then, for each interaction that you've named, you need to set the result. So after they get question 42 right, you'll put in an action like the following:
SetLMSValue > "cmi.interactions.42.result", "correct"
And put that in a conditional statement with a "false" result if they get it wrong.
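Continuing the sketch above (same assumed api handle), the conditional result write could look like this; note that SCORM 2004's defined vocabulary for the result element is "correct"/"incorrect", so the "false" branch maps to "incorrect" here.

```typescript
// Sketch: recording the result for interaction 42, assuming the api handle
// from the previous snippet. "correct"/"incorrect" is the SCORM 2004 vocabulary.
declare const api: { SetValue(element: string, value: string): string };

function recordResult(n: number, learnerWasCorrect: boolean): void {
  api.SetValue(`cmi.interactions.${n}.result`,
               learnerWasCorrect ? "correct" : "incorrect");
}

recordResult(42, true); // learner answered question 42 correctly
```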

This example only tests to make sure that the information is being sent. I'm not sure how you will view this information, as that depends on your LMS's reporting ability.

The other thing is that since you have a HUNDRED questions, it's going to take forever to set all these actions up in SmartBuilder. It might be easier to use variables, appending, and whatnot to make the indexes more programmatic, and save this as a template. I can help you with this, but before I do, I want to make sure that the "SCORM Interaction Example" lesson works on your LMS, and you are able to view the interaction results. Let me know what you find out about these 2 items, or if anything was unclear in this explanation. Good luck!
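The "more programmatic" approach mentioned above amounts to building the element name from a question index variable instead of authoring 100 separate actions. A rough sketch of that idea outside SmartBuilder (the Question shape is hypothetical):

```typescript
// Sketch: reporting all delivered questions in one loop by appending the
// index into the element name. The Question interface is hypothetical;
// the api handle is assumed as in the earlier snippets.
declare const api: { SetValue(element: string, value: string): string };

interface Question {
  id: string;          // stable id from the 100-question bank, e.g. "Q042"
  wasCorrect: boolean; // learner's outcome for this delivery
}

function reportDeliveredQuestions(delivered: Question[]): void {
  delivered.forEach((q, i) => {
    const prefix = `cmi.interactions.${i}`;       // index built by appending
    api.SetValue(`${prefix}.id`, q.id);           // stable id survives random order
    api.SetValue(`${prefix}.result`, q.wasCorrect ? "correct" : "incorrect");
  });
}
```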

- Nav
Nav
 
Posts: 866
Joined: Mon Nov 05, 2007 2:58 pm

Re: Tracking individual questions delivered randomly

Postby jzoeller » Thu Dec 11, 2008 8:22 am

OK, I tested the SCORM Interactions sample. When I run it through the SCORM Conformance Test Suite, it seems to hang and never sets the Interaction ID in the Test Suite. I also created another lesson, Crew Scheduling Assessment - Fall, with the same concepts and have the same problem. As soon as I select an option, the lesson hangs. Please take a look and let me know what changes need to be made.
Jim Zoeller
UPS Airlines
jzoeller@ups.com
jzoeller
 
Posts: 44
Joined: Mon Apr 02, 2007 5:56 am
Location: Louisville, KY

Re: Tracking individual questions delivered randomly

Postby Nav » Thu Dec 11, 2008 11:30 am

Hi Jim,
Is your LMS using SCORM v2004 or SCORM v1.2? I am using testtrack.scorm.com to test this, and it seems to work fine(ish).

When I download the example as a v2004, everything works.

When I download the example as v1.2, it sets the interaction result and everything, but the "get" doesn't work to report it back into the lesson. I can look in the log and the report and see that the results are recorded; however, when I use the get buttons, they return empty strings (and I get a little popup that complains about cmi.interactions.results being write-only in v1.2).

If your LMS/conformance test is only v1.2 compatible, it may run into issues when you try to upload a v2004 lesson into it. I'm not sure, since testtrack seems to work with both v2004 and v1.2 and automatically detects what the lesson was downloaded as.
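For anyone hitting the same v1.2 behaviour: the SCORM 1.2 data model marks the interaction elements write-only, so a read attempt returns an empty string and raises an error instead of the stored value. A small sketch of the difference, assuming the standard LMS-provided API objects:

```typescript
// Sketch: why the "get" works under SCORM 2004 but not under SCORM 1.2.
// Assumes the standard LMS-provided API objects are already in scope.
declare const API: any;          // SCORM 1.2 run-time API
declare const API_1484_11: any;  // SCORM 2004 run-time API

// SCORM 2004: cmi.interactions.n.result is readable, so this returns the
// stored value (e.g. "correct").
const result2004 = API_1484_11.GetValue("cmi.interactions.0.result");

// SCORM 1.2: the interactions elements are write-only, so this returns an
// empty string and sets an error ("element is write only"), which is the
// popup described above.
const result12 = API.LMSGetValue("cmi.interactions.0.result");
const errorCode = API.LMSGetLastError(); // non-zero on a conformant LMS
```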

As for your particular lesson, I believe the hanging problem is in this screenshot:

[Attachment: screenshot.jpg]

It looks like you were going to put in feedback, but then deleted the set, leaving undefined nodes in the flow chart. This breaks the flow, and your lesson hangs because it doesn't know what to do next. A good way to test whether the SCORM calls are causing problems is to preview your lesson in SmartBuilder, where there is no SCORM communication going on, and see if it works there.
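Related to that preview tip, a defensive habit when building your own SCORM pieces is to guard every call so content running with no LMS present simply skips the communication instead of stalling. A minimal sketch (not how SmartBuilder does it internally):

```typescript
// Sketch: guarding SCORM calls so a lesson running with no LMS present
// (e.g. local preview) degrades gracefully instead of hanging.
// API discovery would normally walk parent frames as in the earlier snippet.
declare const window: any;

const api = typeof window !== "undefined" ? (window.API_1484_11 ?? null) : null;

function safeSetValue(element: string, value: string): void {
  if (!api) {
    // No LMS found: log and let the lesson flow continue.
    console.log(`[no LMS] would set ${element} = ${value}`);
    return;
  }
  api.SetValue(element, value);
}

safeSetValue("cmi.interactions.0.result", "correct");
```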

So, if you delete those 2 undefined nodes, it seems to work fine, and I was able to get it to report the results to the LMS. I've never really used the ADL SCORM Conformance Test Suite, and when I tried it, it was confusing and scary and I couldn't make it work. Running it directly on your LMS seems to be the most reliable way to see if things work, and testtrack comes a close second (probably just because I'm familiar with it and I'm not familiar with the test suite).

Hmm, so what are the next steps from here? I suppose, once you make that small change to your flowchart, I'd see if you can upload your lesson and the example lesson directly into the LMS you'll be delivering them from, and see how they behave there. If you have to use the test suite, then check the log to see if the interactions are being recorded. If they're not, let me know, and maybe we can set up a gotomeeting, look at what results are being reported, and try to see where the problems are. Let me know how it goes!

- Nav
Nav
 
Posts: 866
Joined: Mon Nov 05, 2007 2:58 pm

Re: Tracking individual questions delivered randomly

Postby jwafford » Wed Jan 21, 2009 9:07 am

Nav,

I have been working on this project with Jim, and I just picked it up after some time away from it. I started attempting to use the interaction settings as you had previously shown me. Unfortunately, when I upload the content into my LMS, or when I try to run the content through the ADL conformance test, it keeps hanging on the first Submit button.

The lesson is entitled CrewSchedulingAssessment-Spring. It works fine in preview mode, but I am having trouble diagnosing why it doesn't want to continue past that first Submit button. I need to get past that first display set to verify in my LMS and the Conformance tester whether the interactions are being transferred.

Can you take a look at it and see if you can tell me where my problem lies?

Thank you for any help you can provide!

Jeff
jwafford
 
Posts: 13
Joined: Thu Apr 24, 2008 4:47 am

Re: Tracking individual questions delivered randomly

Postby Nav » Wed Jan 21, 2009 11:34 am

Hi Jeff,
Are you using SCORM v1.2 or v2004? If you look around on your conformance test software, it should say somewhere in the title. And, more importantly, what is your LMS going to use, v1.2 or v2004?

We can also set up a gotomeeting and get this solved all in one go. Feel free to call me at 760.635.5700x207.

- Nav
Nav
 
Posts: 866
Joined: Mon Nov 05, 2007 2:58 pm

Re: Tracking individual questions delivered randomly

Postby jwafford » Wed Jan 21, 2009 11:52 am

Hello, Nav.

We are using SCORM 2004.
jwafford
 
Posts: 13
Joined: Thu Apr 24, 2008 4:47 am

Re: Tracking individual questions delivered randomly

Postby Nav » Wed Jan 28, 2009 11:30 am

Jeff,
Although we talked about this and found a resolution, I'm just going to post our conclusions here so that it might be helpful for others reading this post.

We found that the main reason your lesson seemed to do okay on the ADL SCORM Conformance Test but still failed when trying to submit data to the LMS was that your Flash security settings were too stringent (which is the default). This is often the case when viewing content locally that sends information out of the Flash piece; it is generally not a problem once the content is uploaded into the LMS. We usually recommend scorm.com's Test Track software for testing SCORM compliance (it also gives you detailed logs that help when debugging).

Changing the Flash security settings allowed you to view your content locally. Most people don't run into this issue because we offer an "off-line version download" that circumvents this (but does not use SCORM, because there's no LMS to deal with).

To change the Flash security settings, right-click on any Flash content, go to Settings..., make sure you are on the Security tab, and click Advanced. This opens a new window or tab on the Adobe site; on the left, click Global Security Settings Panel. You will see a panel embedded in the HTML page, which is the actual panel you need to make changes in (it is not an image). From there, select Always Allow. The settings should be saved as soon as you change the selection, but to confirm, it's a good idea to go to a different tab or link and then come back to the Global Security Settings to verify that your changes were saved.

I think that about closes up this issue. We're still in the process of resolving a separate issue regarding which SCORM calls are required by your LMS, but I don't think anyone else will really benefit from that information.

- Nav
Nav
 
Posts: 866
Joined: Mon Nov 05, 2007 2:58 pm

