
Eight Tips to Reduce Cheating in Your Online Class

Jeffrey Young talked to me for a story he was doing on online cheating for the Chronicle of Higher Education. I've given a lot of thought to the problem of cheating in online courses during my career building edu software. Here are eight tips I've gathered to help you design online courses that deter cheating.

1-Culture and Learning Design — The culture of the class and the learning design can have a major impact on cheating behaviors.  More constructivist activities like blogging, wiki creation and group projects tend to reward learning.  Foster a culture that rewards contributing to the corpus of knowledge within the class.  If students are recognized for individual contributions, they will pressure their peers to do their own work rather than copy each other.

2-Question Pools and Random Blocks — Instead of giving each student the same questions, create variations for each topic area and then use the "random block" feature to show each student a different set of questions of comparable difficulty. If you have a question bank from a publisher this is very easy to do, because questions are often tagged by level of difficulty and topic.

3-Randomize Question Ordering, Answer Order and Question-by-Question Display — In most systems you can set a quiz to go question by question instead of showing everything at once. You can also randomize the order of the questions and the order of the answer choices in multiple choice or matching questions. This makes copying from another student's screen a bit more difficult. (Tips 2 and 3 are sketched in code after this list.)

4-Change Quiz Feedback Display Options — For high-stakes exams, change the feedback display options to hold feedback until after grades are posted. You can also limit the time that feedback is available to students.

5-Use "Negative Marking" — You can assign negative points for a wrong answer. This penalizes students for guessing by lowering their overall score with each wrong answer. This feature was added to quizzes in Blackboard Learn SP8.

6-Use Calculated Questions — Calculated questions allow you to define a range of variables and a formula, ensuring that each student gets a unique problem set. (Tips 5 and 6 are sketched in code after this list.)

7-Allow Multiple Attempts and Use "Formative Assessments" — Use quizzes to help the learner understand the topic rather than for high-stakes summative testing. The goal here is to develop topic mastery by having students take the same quiz multiple times during the learning process and see their ongoing mastery of the subject. By lowering the stakes of the individual quiz attempt, the student is rewarded for learning rather than punished for failing.

8-Think Like a Video Game Designer — Consider video games where the player repeats the same level over and over until they master it. In really good games the mastery of the level is the reward, and using a cheat code makes the game boring and unplayable. "Watch a video, click next and take a quiz" style courses reward "cheating" and copying: completing the sequence is the reward, so people will do whatever gets them through the sequence fastest. Cheating may be a symptom that the learning design of the activities in the class needs to be revisited. For more on this read Punished by Rewards by Alfie Kohn.
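To make tips 2 and 3 concrete, here is a minimal sketch in Python of how an LMS-style "random block" might assemble a unique quiz per student. The pool structure and function names are my own illustration of the idea, not any particular system's API.

```python
import random

# A question pool tagged by topic and difficulty (tip 2). Publisher test
# banks usually ship with these tags already in place.
POOL = [
    {"topic": "photosynthesis", "difficulty": 2,
     "prompt": "Which organelle performs photosynthesis?",
     "choices": ["Chloroplast", "Mitochondrion", "Ribosome", "Nucleus"]},
    {"topic": "photosynthesis", "difficulty": 2,
     "prompt": "Which gas do plants consume during photosynthesis?",
     "choices": ["CO2", "O2", "N2", "CH4"]},
    # ... more variations per topic and difficulty ...
]

def random_block(pool, topic, difficulty, n, rng):
    """Draw n questions sharing a topic and difficulty (a 'random block')."""
    candidates = [q for q in pool
                  if q["topic"] == topic and q["difficulty"] == difficulty]
    return rng.sample(candidates, n)

def build_quiz(pool, blocks, student_id):
    """Assemble a per-student quiz, then randomize question order and
    answer order (tip 3)."""
    rng = random.Random(student_id)  # seeded per student, reproducible for regrading
    quiz = []
    for topic, difficulty, n in blocks:
        quiz.extend(random_block(pool, topic, difficulty, n, rng))
    rng.shuffle(quiz)  # randomize question order
    # Shuffle a copy of each question's choices so the shared pool stays untouched.
    return [{**q, "choices": rng.sample(q["choices"], len(q["choices"]))}
            for q in quiz]

quiz = build_quiz(POOL, blocks=[("photosynthesis", 2, 2)], student_id=42)
```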
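Tips 5 and 6 are also easy to sketch. The penalty weight and the question template below are illustrative values I chose for the example, not Blackboard's actual implementation.

```python
import random

def score_with_negative_marking(answers, key, points=1.0, penalty=0.25):
    """Tip 5: each wrong answer subtracts a fraction of a point. At
    penalty=0.25, blind guessing on a five-choice item has an expected
    value of zero: (1/5)(1.0) + (4/5)(-0.25) = 0."""
    score = 0.0
    for qid, correct in key.items():
        answer = answers.get(qid)
        if answer is None:
            continue  # blanks are not penalized in this sketch
        score += points if answer == correct else -penalty
    return max(score, 0.0)  # many systems floor the quiz total at zero

def calculated_question(rng=random):
    """Tip 6: random values substituted into one formula, so every
    student solves a unique instance of the same problem."""
    v = rng.randint(10, 99)  # speed in m/s
    t = rng.randint(2, 9)    # time in s
    prompt = f"A car travels at {v} m/s for {t} s. How far does it go, in meters?"
    return prompt, v * t     # (question text, correct answer)
```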

More thoughts on OERs and Open Content

I was reading Michael Feldstein's blog about OERs, rereading my earlier post on the subject, and following some discussion on the Openness Listserv. I'm afraid, very afraid, for the future of OERs, and I have no idea what to do about a problem I see looming. What happens when the front groups and big PR budgets figure out how much influence they could gain with the right kind of free textbook? Economics 101, written by a front group for (the opposite side of your political spectrum). I mean, Glenn Beck has a "university" now. You think edit wars on Wikipedia are tough? Consider this recent controversy combining free science materials and global warming. I'm mostly pondering this problem and hoping smart folks out there have some ideas for solutions. I got nothing, except maybe that we need more librarians.

Open Education and the Midterms

The US midterm election is today, and I wonder what it will mean for open education repositories and open textbooks. One interesting scenario would be if Harry Reid loses but the Democrats retain the majority in the Senate. Dick Durbin, one of the main contenders to succeed Reid in the leadership, sponsored S. 1714, which would fund grants for open and freely distributed textbooks. If you want to see more support in the Senate for funding these initiatives, then you should cheer educators like Patty Murray (Washington) and Michael Bennet (Colorado). You may also support Joe Sestak, running for the Senate in PA. Rep. Sestak was on the House committee that did the work on HR 3221, Section 505, which would have set up grants for the development of open textbooks.

Generally the Republicans have taken a skeptical view of public media like PBS and NPR, as well as institutions like the Department of Education. Comments from conservatives regarding S. 1714 and HR 3221 Section 505 were very negative; some see them as a plot to enshrine political positions in the textbooks.

On the other hand, with private money from the Gates, Hewlett and Mellon foundations, perhaps OER and open textbooks are better off without the heavy hand of the US government. I also fear that, just as we've seen with K-12 science curriculum standards, more direct federal involvement in the production of books may lead to some ugly fights. Imagine the debates while the DOE funds a biology, economics or civics textbook.

Edufountain II: Mad Scientists for Future of eLearning Seminar

The US Army hosted a conference called the Mad Scientists Future Technology Seminar 2008. The goal of the conference was to explore how the proliferation and development of speculative technologies could affect the battlefield in the next 10-25 years.

I think we need our own mad scientist series for education. I’ve spent the spring blogging about the future of Web 2.0 and education. I was able to use these notes in a white paper that was distributed at the BbWorld DevCon conference.

This paper focused on things I see having a major impact in the next few years and in near-horizon product cycles. The next set of topics I want to explore are more speculative applications of technologies.

As a parent of the class of 2022, I'm wondering what the high school experience will look like almost a decade out. What are some possible future scenarios for teaching and learning?

There are enormous strides being made in understanding cognitive development, human-computer interaction and nootropics. Quantum computers may allow us to perform multi-vector analysis and searches to solve problems currently beyond computer science and math. Breakthroughs in human health could make us stronger, healthier and longer lived. At the same time we could see dark futures with computers controlling and directing our lives, or bleak worlds where environmental and system collapse leaves us entering a new dark age.

My plan is to look at some far-flung ideas and think about how they might impact the VLE and education. I'm also going to seek out some mad scientists at my upcoming conferences and get their take on the future.

As an open thread: What ideas would you like to explore? Who would you like me to pull aside for Q&A? What questions do you have? Finally, are you a mad scientist out there in a secret volcano lair, laboring to reshape teaching and learning using technology? Drop me a line.

Some Thoughts On Standards and Vendors

Michael Feldstein has an interesting post up on his site about how to judge a vendor's support for standards. I'm going to weigh in with my personal views and then provide some comments on his article. The usual disclaimers apply: these are my personal thoughts and not those of any employer or group I work with, for, or around. Others are free to chime in with agreement or disagreement.

Personal Soapbox
I want to start by articulating my personal view as a participant in the standards community. I'm a member of the IMS Technology Advisory Board and a contributor to the IMS Learning Information Systems (LIS) working group. I also represent my company on the Common Cartridge Alliance Program Management Group (CCAPMG). I've done quite a bit of work on the bulk data exchange service for LIS, and I work with the SIF Association and the ADL Co-Lab from time to time. I've invested my professional and personal time in helping to evangelize for education technology standards. I've recently joined the OpenSocial working group and am working with some folks in the community to create a profile of OpenSocial focused on educational data.

It is my personal mission to use standards to create what I call the edutech commons. This would be a set of common code libraries usable by Virtual Learning Environments, Student Information Systems, Learning Tools and Learning Networks, allowing easy integration. In the same way that BSD's TCP/IP libraries became the de facto implementation used by most operating systems, I would like the code that connects up technology in .edu to be easily pluggable into many implementations. The whole point of object-oriented coding was that we should be able to create consumable objects that wrap business processes and share them across projects.

We've actually seen some great progress on this front with the Basic LTI project. Thanks to the hard work of folks like Dr. Chuck, Stephen Vickers and others, we now have a pretty good set of Basic LTI libraries out in the field.

Another great example of this is the Shindig project for OpenSocial. With Shindig any vendor can implement the OpenSocial framework by creating service provider interfaces (SPIs) to map their data and classes to the OpenSocial API. Because Shindig provides a common service layer, code can be moved between OpenSocial implementations more easily.
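Shindig itself is Java, but the SPI idea is easy to sketch in Python: the container programs against a fixed service interface, and each vendor supplies an adapter mapping its own schema onto it. The class and field names below are invented for illustration.

```python
from abc import ABC, abstractmethod

class PersonService(ABC):
    """The common side: what an OpenSocial-style container calls."""
    @abstractmethod
    def get_person(self, user_id: str) -> dict:
        ...

class AcmeLmsPersonService(PersonService):
    """The vendor side: an SPI implementation that maps a proprietary
    schema onto the shared API. Only this adapter changes per vendor;
    the container, and any social code written against it, does not."""
    def __init__(self, lms_db):
        self.lms_db = lms_db  # whatever data access object the vendor already has

    def get_person(self, user_id: str) -> dict:
        row = self.lms_db.lookup_user(user_id)  # vendor-specific call
        return {"id": row["pk"], "displayName": row["full_name"]}  # common shape
```

Because everything above the SPI line is shared, a gadget written against one container should behave the same against another.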

The alternative is a world of many clean-room implementations. We end up in the world of JavaScript, SCORM and other "standards" which have many quirks from implementation to implementation. The lack of a common code library results in multiple interpretations and levels of compliance, and this makes interoperability more difficult.

A second concept that is important to me personally is the notion of data transfer. It is my personal view that data should be written to a system once and then be able to move with limited restrictions throughout the network. I want data to be able to move easily through the network, with lower transfer costs for data consumers. It frustrates me that so much government information is not packaged in a way that makes it easily portable. Consider the very necessary work of the various companies who compile state education standards and resell the resulting databases. They must parse through different formats, Word documents, PDFs and sometimes typewritten documents with no electronic equivalent other than a TIFF file to compile the information into a computer-consumable database. Imagine if instead there were a semantic markup for educational standards. State governments could establish a mechanism for publishing standards in a way that these items could be easily parsed. I love the UK Government Data program's ambition to publish all government documents with semantic markup and metadata on a central site.
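To illustrate, here is a hypothetical slice of what semantically marked-up standards could look like, with the few lines of Python it would take to load one into a database. The element and attribute names are invented for the example.

```python
import xml.etree.ElementTree as ET

# Hypothetical semantic markup for one state education standard.
DOC = """
<standard jurisdiction="PA" subject="Mathematics" grade="8" code="M8.A.1.1">
  <statement>Represent numbers in scientific notation.</statement>
  <keyword>scientific notation</keyword>
  <keyword>exponents</keyword>
</standard>
"""

root = ET.fromstring(DOC)
record = {
    "code": root.get("code"),
    "grade": root.get("grade"),
    "subject": root.get("subject"),
    "statement": root.findtext("statement").strip(),
    "keywords": [k.text for k in root.findall("keyword")],
}
print(record)  # a clean database row, with no PDF or TIFF scraping required
```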

A Personal Reaction to the article on e-Literate

With my personal biases now on the table, let me offer some comments on the proposed rubric for vendors and standards. Michael starts by discussing drivers for standards, including the notion that

Standards tend to reduce the total amount of money that customers spend on integration, which generally means that somebody is going to make less money.

I strongly disagree with this. Integration is low-value consulting in my view. It is not strategic and does nothing to deepen your relationship with your customer. Customers generally have a limited budget for services with a vendor, and if you use it up on integration you aren't talking about how they can deploy your product more strategically, or providing customizations that better adapt the software to their business. Furthermore, by raising the cost of connecting the software to other business operations, you pull yourself out of the networks that create the next generation of value. My goal is to create standards that are easily consumable and that implement the highest-value functionality in a common format, maximizing portability. I'm hopeful that LIS won't launch a new wave of expensive products for customers to buy but will instead add capabilities to existing products and lower maintenance and development costs for customers and creators.

Conformance Profiles
Michael writes that it is important to understand how a vendor conforms to the standard, and that conformance profiles are useful in assessing how the features of the standard are implemented by the vendor. I think this whole experience is a sucky one and I loathe it. Most people never read the conformance profiles and fewer still actually understand them. I've been on many calls with developers working to understand specific nuances of a conformance profile on top of an already complex standard document. I've also worked with consultants in the field trying to parse a 200+ page spec document to figure out if the problem was a bug, a conformance issue or just some misunderstanding of the spec.

I just want the integrations to work. As a coder I don't want to spend time analyzing hundreds of different vendors' conformance documents to figure out how Company A implements the person ID. As a technology buyer I don't want to have to know the ins and outs of every standard implemented in the products I use. For enterprise integration I just want to read a list of sections, populate my database, and give you back some grades at the end of the semester. I want SOA that works, not some crazy new way to do EDI that requires another piece of expensive middleware just to make systems exchange data. In my dream world I point System A at System B and they generally just talk to each other with minimal fuss, because they both implement a standard. Just like I can call your phone number and have the call make its way through the network across hardware from Cisco, Lucent, Alcatel, and Motorola until I hear your voice on the other end of the line. As a phone company customer I didn't have to review the variances between the CDMA and WiMAX protocols to make the call happen. I'm very happy that Basic LTI has gotten to this point. It seems like Chuck sends me an email every week saying "test out this new thing we integrated with Basic LTI," and we enter our shared secrets, register the tool, and it just works.
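For the curious, here is roughly what "it just works" conceals. A Basic LTI launch is a form POST signed with OAuth 1.0a using the tool's key and shared secret. The sketch below is a simplified, stdlib-only illustration; real integrations should use the community LTI libraries mentioned earlier, and the key, secret and parameter values are made up.

```python
import base64, hashlib, hmac, time, uuid
from urllib.parse import quote

def sign_basic_lti_launch(launch_url, key, secret, params):
    """Sign a Basic LTI launch as an OAuth 1.0a HMAC-SHA1 form POST."""
    oauth = {
        "oauth_consumer_key": key,
        "oauth_nonce": uuid.uuid4().hex,
        "oauth_signature_method": "HMAC-SHA1",
        "oauth_timestamp": str(int(time.time())),
        "oauth_version": "1.0",
    }
    all_params = {**params, **oauth}
    # Signature base string: method, encoded URL, and the sorted,
    # individually encoded parameters joined with '=' and '&'.
    pairs = sorted((quote(k, safe=""), quote(v, safe=""))
                   for k, v in all_params.items())
    base = "&".join([
        "POST",
        quote(launch_url, safe=""),
        quote("&".join(f"{k}={v}" for k, v in pairs), safe=""),
    ])
    signing_key = quote(secret, safe="") + "&"  # no token secret in Basic LTI
    digest = hmac.new(signing_key.encode(), base.encode(), hashlib.sha1).digest()
    all_params["oauth_signature"] = base64.b64encode(digest).decode()
    return all_params  # POST these as form fields to the tool's launch URL

# Parameter names follow the Basic LTI spec; the values are illustrative.
fields = sign_basic_lti_launch(
    "https://tool.example.edu/launch", "mykey", "mysecret",
    {"lti_message_type": "basic-lti-launch-request",
     "lti_version": "LTI-1p0",
     "resource_link_id": "res-123",
     "user_id": "student-42"})
```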
To walk back my rant a bit, there are lots of reasons standards describe capabilities that are not used, or that are implemented differently by vendors. I'll also give Oracle some credit for getting a product out there that folks could look at, so at least there is some baseline for implementers. I'll also highlight that there are technologies that can be used to auto-negotiate a mutual level of conformance and capability based on an automated profile. A good example of this is in OpenSocial, where you can query an OpenSocial container for its features. This lets you write code to handle the case where a container doesn't have a feature. I hope we eventually see something like this in IMS standards.
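Here is a sketch of that capability-negotiation pattern, assuming a container that advertises its feature list over HTTP. The /features path is a placeholder I invented; real containers expose features through their own discovery mechanisms.

```python
import json
import urllib.request

def container_features(base_url):
    """Fetch the set of features a container claims to support.
    The /features path is hypothetical, for illustration only."""
    with urllib.request.urlopen(f"{base_url}/features") as resp:
        return set(json.load(resp))

def publish_activity(base_url, activity):
    """Branch on advertised capability instead of assuming full conformance."""
    if "activitystreams" in container_features(base_url):
        print("would POST", activity, "to the activities endpoint")
    else:
        print("container lacks activity support; logging locally:", activity)
```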

Conformance Testing
Good conformance tests make your life easy, but too often conformance tests don't mirror real-world implementations. Conformance tests can also drive standards implementation and adoption if done correctly. The best example of this is the Acid3 test for web browsers. This test has had a profound impact on getting browser vendors to improve their standards compliance and performance. At the same time, many standards come out with very low-threshold conformance tests. The result is a bunch of people with the certification logo, but no customers successfully going to production with the "standard". IMS has been getting much better at building conformance tests in the last couple of years.

Testing with other vendors
Michael argues that testing against other vendors is key. I will extend this by saying that the standard needs to be judged with multiple providers and consumers connecting. A single vendor can put a lot of energy into lining up a bunch of partners to create proprietary one-offs, but this doesn't demonstrate conformance. A measure of a standard and a vendor is whether you can swap out the component on either side and still have everything work. For example, in the Basic LTI world we've been able to hook up Blackboard, Moodle, D2L, Sakai, OLAT and other VLEs on one side, and multiple learning tools like Wimba, Learning Objects, etc. on the other. This shows a highly flexible and solid implementation. I'm excited to see SunGard moving into the LIS world, as this will create multiple providers to test against.

DevCon 2010 Preview

This Sunday (July 10) we will kick things off with the OSCELOT Open Source Day. This will be a fun day of code jamming and collaborating. I’m bringing my laptop and my IDE.
Monday and Tuesday we will have the official Blackboard-sponsored program. Anya Kamenetz will be our keynote speaker, followed by my own annual DevCon keynote in the afternoon. We're going to be joined by a special guest, Ray Henderson. I think Ray's willingness to co-present at the DevCon keynote signifies that he personally takes the openness of our platform very seriously. I've asked him to make some public comments and commitments regarding further opening of the Blackboard Learn™ platform and standards. As part of this commitment we will have a significant block of time at the conference dedicated to IMS standards. IMS staff will be presenting alongside Blackboard customers and partners, demonstrating how these standards can be used in Blackboard.

After our comments we will have some tremendous sessions, including a performance engineering workshop led by Steve Feldman, and other tracks focused on system administration, getting started with Building Blocks, database reporting and tools. We've also set up collaborative areas in the hallway where Blackboard experts will be standing by to provide advice and insights on building and extending Blackboard.

If you are missing DevCon and Open Source Day this year, I hope you'll follow along on Twitter and the blogs. I will do my best to get some posts up during the week with my own reflections. If you are covering DevCon via your blog or social media, post a comment below and let me know where I can follow your conversation.

Reflecting on Learning Impact 20Ten

I just got home from the 20Ten Learning Impact Conference in Long Beach, California. This was a great conference, and I enjoyed talking with so many leaders in the industry.

A few quick notes:

-Open Educational Repositories — Dr. Charles Reed, Chancellor of the CSU system, gave an amusing keynote on open educational repositories. Members of the CSU Digital Marketplace team were present in force. Dr. Reed described the success of the MERLOT project, and challenged publishers with the statement, "Algebra II hasn't changed enough to justify $120 a textbook."

-IMS GLC is becoming a cornerstone of the educational standards world. The presence of for-profit educational institutions, the military (a stronghold of SCORM), K-12 (including large virtual schools like Florida Virtual School), international participants (IMS Korea and JISC in the UK), government policy makers from the US Department of Education, and publishers and vendors from around the world gives the IMS credibility as the broadest organization in the currently overly fragmented world of education technology standards.

-A trend toward lighter, more implementable standards. I spoke with many TAB (Technical Advisory Board) members and IMS staff, and I believe there is now consensus toward simplified specification development focused on implementations over use-case volume. The Basic LTI/Common Cartridge approach has gained acceptance as superior to past approaches. This model of lightweight standards focused on primary use cases that can be rapidly adopted by industry is winning out over the old model of elaborating every possible use case and writing 400-page spec documents. One of my frustrations over many years in educational technology standards has been watching "standards" emerge that become so complicated that every vendor creates a unique profile. The result of so many profiles is that customers lose interoperability, which was the goal of standards in the first place. This new approach is winning over the membership. At the LTAC meeting on Thursday we saw demonstrations of Basic LTI and Common Cartridge from most major vendors; I'm happy that Blackboard was one of them.

Edufountain: Get Ready for MiFi in the Classroom

Educational institutions sit at both ends of the bandwidth spectrum. At one extreme we have Internet2 participants who have set up a high-speed research backbone, and at the other end we have schools still struggling to get broadband. Users in many places are still on dial-up and may not have many broadband options. At the low end of the spectrum, where schools face constrained budgets or the network infrastructure isn't built out for broadband, a new category of device may offer some assistance and give users greater choice and flexibility.
4G networks are now available in many areas and are expanding rapidly. This new class of cellular data service provides broadband connectivity in a mobile package, which has enabled the cellular providers to offer a new gadget for your backpack: the mobile hotspot, or MiFi. These devices give your growing array of WiFi-enabled gadgets a single broadband gateway and a local wireless network. I've recently dropped my home Internet connection for a pair of these devices (the Sprint Overdrive). As someone with multiple gadgets (phone, pad, laptop, etc.) that all consume WiFi, I dislike having to reconnect and reconfigure each device as I move between hotspots or arrive at a hotel while on the road. The ability to carry a personal network, and not have to navigate various WiFi authentication schemes, is very useful.
I see a lot of opportunities for these devices in education. Instructors could carry them and turn them on when needed. The network could go on the school bus to create a kind of mobile lab for student field trips. Since this uses cellular technology, places can be wired up as quickly as a tower can be raised, rather than having to wire up buildings or deploy WiFi hubs.
The mobile communications firms' economies of scale may enable them to lower the cost of broadband access on campus. This may also allow campus networks to be used more heavily for research purposes, pushing more consumer-directed activities off to the mobile provider's network. The MiFi form factor today is a standalone device, but this may only be a temporary model. A new generation of mobile phones promises to offer this feature as part of the phone. As the mobile providers explore how best to package this offering, we will get a better understanding of how the economics and service will play out.
I think there are some important elements of the user experience that may limit mainstream adoption of the MiFi concept. The first is that the configuration is probably beyond many users. I'm comfortable setting up a WiFi access point and choosing the encryption scheme, SSID and other options, but I imagine most users will find the technology about as easy as setting up a TV and Blu-ray player, or putting together the toys on Christmas Eve. When I purchased my hotspot the folks at the store offered to help me configure my gadgets if I brought them by, so the configuration issue may be moot. Still, I think the complexity of the offer (a portable hotspot) may limit uptake until more people see their geek friends carrying them.

The second is that there doesn't seem to be a way to easily configure these devices for rapid deployment in an organization. It would be great if IT could pre-configure the SSID and VPN settings so that an instructor could simply be assigned one like a laptop or any other digital device. I'm concerned that as this capability moves into the mobile phone it will become harder for IT to manage these access points.

Finally there is the issue of local policies and control. Today schools can enforce policies related to Internet usage by administering and configuring the primary network. Instructors and others rely on the security policies and content filters in place to enforce compliance with safe browsing and other requirements (especially in K-12). In the world of MiFi there is no central point of management. Many will argue that this is a good thing, but I'm sure others will find this a difficult technology to accommodate in its current form.
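Nothing like this exists off the shelf today, but as a sketch of the gap, here is the kind of provisioning profile I wish IT could push to a fleet of hotspots before handing them out. Every field and value below is hypothetical.

```python
from dataclasses import dataclass

@dataclass
class HotspotProfile:
    """A hypothetical IT-managed profile for classroom MiFi devices."""
    ssid: str
    wpa2_passphrase: str
    vpn_gateway: str            # route traffic back through campus
    content_filter_proxy: str   # enforce the same browsing policy as the LAN

CLASSROOM_PROFILE = HotspotProfile(
    ssid="district-mobile-07",
    wpa2_passphrase="rotate-me-quarterly",
    vpn_gateway="vpn.district.example.edu",
    content_filter_proxy="filter.district.example.edu:3128",
)
```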
My personal hope is that this technology will be embraced by institutions and supported as a means to get broadband connections into the classroom and on the go in an efficient, cost-effective manner. I also hope that institutions will welcome these devices and let early adopters explore the possibilities rather than ban them or restrict usage. I hope that students will explore the technology as well, using these devices to build on-the-fly networks for collaboration and sharing. As always, I'd love to hear from admins and edu-tech folks on campus about sightings of MiFi in the classroom and around the institution. Share your stories in the comments below.

Edufountain: Dealing with Volcano Like Events

I'm stuck in the middle of an unprecedented event in modern aviation. The whole of the UK and Northern Europe is shut down for an indeterminate time due to a volcano. I miss my family, but I'm not in any immediate difficulty other than living out of my suitcase far longer than I intended. I'm going to use this as a case study in how to manage, from an organization and leadership perspective, events that fall outside our normal experience. In education we are confronted with these situations; it might be H1N1 shutting down campus for two weeks, or simply a persistent issue in our network infrastructure that we are struggling to diagnose and resolve. With dropped calls on our phones, Gmail outages, even hard disk failures, we've developed many processes for handling disruptions, and in some cases perhaps consider them ordinary. Yet within these disruptions there can be events that overwhelm us because they are initially treated as a normal disruption but then grow beyond the organization's ability to manage without leadership.

Consider my crazy week. On Thursday, April 15th, I woke up in Swansea, Wales to the news that our flights were likely to be canceled as the result of a volcanic eruption. Delays are a normal part of air travel. Experienced travelers on our team immediately called to book hotels and rebook on the next day's flights. Of course the next morning we discovered that the flights were delayed yet again. We repeated this procedure one more day until we began to recognize that this was not an ordinary delay like a snowstorm or a mechanical issue.

This is where leadership makes a big difference. Culturally, human beings work from patterns based on experience. Organizations magnify this pattern-driven behavior, reacting to events they have previously encountered with solutions that previously worked. The larger the organization, the more this stimulus-response is codified. Many customer service operations at airlines are driven by expert systems and metrics that judge employee performance against expectations of desired outcomes from previous incidents and after-action reviews.

The key problem here is that leadership had to recognize that this event was not manageable by the normal processes. Consider the process for a typical cancellation. Travelers contact the airline and rebook or cancel their plans based on their individual needs. The airline accommodates them with a later flight, a refund, or otherwise according to the terms of their ticket. This process is built on the expectation that travelers will be able to book a later flight as the situation clears in a short period of time, or that only a single node of the travel network (e.g., one airport) will be impacted. In this circumstance there were several differences. First, a large number of nodes in the travel network were critically affected. This included not just airports but other modes of transit, as travelers moved to substitute options (e.g., taking the train). In normal airport disruptions the volume accommodated by alternate transit modes is short-lived, and congestion points naturally resolve themselves.

In the current crisis there was simply no leadership from the airlines or government transportation officials until the situation was totally and completely chaotic. Even now, as the airports reopen, one is left wondering whether there was ever a need for a disruption of this scale, or why passengers were left in endless cycles of re-booking and delay while watching their credit cards fill up with charges. Despite a robust multi-modal transportation infrastructure in Europe, there was no leadership from those in power to actually resolve the growing problems of stranded individuals. The organizations were paralyzed, despite the fact that many low-level people, from hotel clerks to airline reservation desks and pilots, could all see clear possible solutions.

Leaders must recognize moments of organizational paralysis and take decisive action. They must understand when the very patterns that have built their entire organization are simply not going to bring resolution. In this specific case the leaders must also weigh the safety and caution necessary to ensure that flights not only resume, but don’t end up crashing into the Atlantic.

Let me review our own team's response to this situation. Normally our travel arrangements are managed at an individual level, with travelers working with our corporate travel agency and airlines to resolve delays, cancellations and other change requests. In this circumstance it was clear by Saturday that this was not going to work. People were stuck and the route options were not clear. So our leadership gathered and formulated a number of options for stranded employees.

We explored three options: wait it out, arrange mass transit by ship to the US, or transport employees to an open airport. Staying in place offered ease but an unpredictable return time that many employees felt was too disruptive for their personal needs. This led us to the second option, taking a ship. Initially, as airports closed around Europe and it looked like there might not be any air options, the ship seemed viable. We discovered that the Queen Mary was launching a transatlantic voyage on the 22nd of April. However, we were not alone, and that ship sold out before we could secure passage. The other ship options would have been very small craft, not likely to be particularly pleasant for folks. Finally, based on wind charts and calls to various folks in airport operations centers, we discovered that Madrid was unlikely to be significantly impacted by the eruption and would likely remain open. The only issue was getting from the UK to Spain. Our team used its resources to secure travel via the Eurostar to Paris and was able to arrange automobile transport from there to Madrid.

Employees were given the choice to remain in place and wait things out, or travel home via the more certain exit, depending on their needs. Given my own personal travel anxieties and car sickness issues, I opted to wait in London. A PowerPoint was created to explain the options and alternatives to stranded team members.

So as a company I'd say we did fairly well. There are a few lessons here for anyone managing a complex system within an organization. First, recognize when a volcano-like disruption is occurring. Understand that the organization will be unable to respond using normal procedures and that leaders must step in quickly to establish the alternatives. Failing to do this will result in paralysis and various individual acts that might not be helpful for the organization as a whole. We immediately found out where people were and what solutions they were exploring (one person was already in line at the train station; another had made a spreadsheet of open airports and was calling about flights). As a result we could coordinate these actions and make use of them, and we didn't end up with ten people holding train tickets but no flights, or flights but no trains.

A second lesson was to establish support relationships before events occur, not after. As a company we had an emergency travel agency and preferred airline and hotel providers. People in our company had also worked to build personal relationships with these vendors, so that in a crisis we weren't just some anonymous person calling the helpline for the first time. Our vendors knew who we were. When the organizational models break, these relationships allow you to quickly establish a new model.

The third lesson is to quickly establish a new model for members of the organization to follow. Rather than simply letting everyone work out their own logistics, we collapsed the choices down into two categories, offered the alternatives and explained them to the team. Everyone knew their options and could make the decision that best suited their situation. They also understood the consequences of the decision (e.g., stay put and not know when you are getting home, or accept short-term discomfort but know your return time to within a day). We also saw that after our explanation of the choices some folks changed their minds. I initially had planned to join the Madrid group, but opted to stay put after discussing the alternatives with my peers and family.