Category Archives: Openness

Crackdown on Coursera

Bartender pouring drinks

Would you care for a stiff drink of free online learning?

It seems that some government officials in Minnesota have declared those offering MOOCs or other free online classes to be dangerous outlaws who must be prosecuted. What’s next? Will Minnesota go after Khan Academy? Hacker spaces and Instructables? The enterprising Barry Dahl suggests opening coffee shops over the border where students might freely learn. I suppose this beats the alternative of going to some underground speakeasy, or should it be “learneasy,” where bright bohemians will drink from the cup of free online learning through foreign proxy servers.

Eugene Volokh of the UCLA School of Law and the Volokh Conspiracy blog explores some of the constitutional and legal issues in play here, for those scratching their heads and wondering how this can be remotely constitutional.

Update: Slate reports that Minnesota has reconsidered its position and will cease the crackdown on free online courses. As FDR said upon repeal, now is a great time for a beer.

Blackboard CourseSites Goes Semantic

There is too much talk about open and openness these days. No one seems to agree on what open is, but everyone agrees it is important. We’ve descended into semantic chaos where people fight to claim they are really “open” while others accuse them of mere “openwashing”. I’m taking a break from the terms. Instead I’m just going to describe the technologies I’ve implemented and leave it to you, the reader, to decide whether you want to call it open, closed, or something else.

To start this new policy off let me describe one of our latest features and then I invite your comments and feedback.

On CourseSites I’m leading ongoing development to make it easy to share the course experience more broadly via social media and search. This capability is delivered using the emerging Semantic Web infrastructure put forth by the Learning Resource Metadata Initiative and Schema.org.

The first step is to create a public component of the course: a web page where anyone can drop by and ask to join, or browse as a guest (if the instructor allows). It links to a public instructor profile with a blog, where instructors can elect to describe themselves in a way that connects to the courses they teach. The course home page also acts as a place to share the educational materials from the class in both IMS Common Cartridge and Blackboard Learn archive formats. The materials are shared under the Creative Commons CC-BY license. This allows permissive reuse of the materials in other educational contexts while preserving attribution of the original authors.

These pages contain Semantic Web tags to describe the materials they contain. This makes them searchable, shareable, and otherwise useful to applications beyond Blackboard. For example, look at this course home page as rendered in the browser:

How the Human Sees the Course Homepage

Now consider how Google sees the same page:

How Google views the Course Homepage

Note how the information is encoded in a way that lets Google pull key details right from the page. Information such as “version” and file links is consumable by a third-party application. The descriptive scheme we use was developed by a broad set of search engine companies at Schema.org. This ensures that from the moment we launched this feature, Google and other search engines could consume the information.
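As a rough sketch of what this kind of markup looks like (the property names below come from the public Schema.org vocabulary; the course title, author, and URLs are placeholders, and the exact markup CourseSites emits may differ):

```html
<!-- Illustrative Schema.org microdata for a course page; values are placeholders. -->
<div itemscope itemtype="http://schema.org/CreativeWork">
  <h1 itemprop="name">Introduction to Biology</h1>
  <p itemprop="description">An open course covering cell structure and genetics.</p>
  <span itemprop="author">Jane Instructor</span>
  <!-- Downloadable package link a crawler or third-party app can consume -->
  <a itemprop="url" href="http://example.org/course.imscc">IMS Common Cartridge</a>
  <link itemprop="license" href="http://creativecommons.org/licenses/by/3.0/" />
</div>
```

Because the properties live inline on the visible page, a human sees an ordinary course description while a crawler extracts a structured record from the same HTML.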

We’re also experimenting with ways to make this page more accessible to social discovery. We include a standard “share” gadget that lets you publish the link to these materials to hundreds of different social media services. Also included on these pages is another Semantic Web technology, pushed by Facebook, called “Open Graph”. This allows the link you share to Facebook to carry smart data. Here is that same course home page viewed through Facebook.
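For reference, Open Graph data is expressed as meta tags in the page head; a minimal sketch might look like this (the URLs and text are placeholders, not CourseSites’ actual values):

```html
<!-- Open Graph tags live in the page <head>; values here are placeholders. -->
<meta property="og:title" content="Introduction to Biology" />
<meta property="og:type" content="website" />
<meta property="og:url" content="http://example.org/courses/intro-biology" />
<meta property="og:description" content="An open course covering cell structure and genetics." />
<meta property="og:image" content="http://example.org/courses/intro-biology/banner.png" />
```

When someone pastes the course link into Facebook, these tags are what pre-populate the title, description, and thumbnail in the share dialog.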

How Facebook sees the course home page. Link and title information are pre-populated.

This integration from Blackboard into Google, Bing, and other search engines, along with social media like Facebook and Twitter, was done completely through the Blackboard Building Blocks technology. One of my next projects will be to take the building block and work to make it available to other Blackboard installations. I hope that by participating in the adoption of a standards-driven technology supported by search engines and social media, we will encourage sharing, reuse, and remixing of educational resources that are linked into the LMS/VLE.

A Blackboard Standards Update

To follow up on Ray Henderson’s blog post earlier today:

I’ve been at the IMS Quarterly Meeting at Lone Star College in The Woodlands, Texas this week. My Blackboard colleagues and I have been showing off our progress on IMS standards. We are finishing testing of two technologies in our Blackboard Learn product: IMS Common Cartridge and Basic LTI. We’ve taken our integration of Common Cartridge and Basic LTI into the Bb Learn core and included support for Basic LTI links inside a Common Cartridge package. I’m also pleased that we will include both import and export of Common Cartridge within the core platform. This will do a lot for learning object repositories and sharing.

The Blackboard approach to Basic LTI actually extends Building Blocks technology in a powerful new way. We’ve made it possible to define a Basic LTI link within the bb-manifest.xml file, which means that Basic LTI links can be used within a number of workflows in the application. As far as I know, we are the only vendor to allow these more complex link placement options. We also make it possible for administrators to define trusted tool providers and enable course builders to create links to these providers as easily as one would paste in a URL. Finally, since we’ve integrated Common Cartridge and Basic LTI, a CC package that includes Basic LTI links can be used to define the placement of tools within the flow of course materials. For example, one could build a module that ends with a link to a simulation after some local training activities.
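Under the hood, a Basic LTI launch is just a signed form POST: the consumer signs the launch parameters with OAuth 1.0a HMAC-SHA1 using the tool’s key and shared secret. Here is a minimal sketch of that signing step (the URL, key, and secret are placeholders, not real endpoints; this is an illustration of the protocol, not Blackboard’s implementation):

```python
# Sketch of signing a Basic LTI launch (OAuth 1.0a HMAC-SHA1 over the POST body).
import base64
import hashlib
import hmac
import time
import uuid
from urllib.parse import quote

def pct(s):
    # OAuth percent-encoding: leave only unreserved characters unescaped
    return quote(str(s), safe="~-._")

def sign_launch(url, params, consumer_key, shared_secret):
    oauth = {
        "oauth_consumer_key": consumer_key,
        "oauth_signature_method": "HMAC-SHA1",
        "oauth_timestamp": str(int(time.time())),
        "oauth_nonce": uuid.uuid4().hex,
        "oauth_version": "1.0",
    }
    all_params = {**params, **oauth}
    # Normalize: percent-encode, sort, and join the parameters as k=v&k=v
    pairs = sorted((pct(k), pct(v)) for k, v in all_params.items())
    norm = "&".join(f"{k}={v}" for k, v in pairs)
    base = "&".join(["POST", pct(url), pct(norm)])
    # Basic LTI has no token secret, so the signing key ends with a bare "&"
    key = pct(shared_secret) + "&"
    digest = hmac.new(key.encode(), base.encode(), hashlib.sha1).digest()
    all_params["oauth_signature"] = base64.b64encode(digest).decode()
    return all_params  # form fields to POST to the tool provider

launch = sign_launch(
    "http://tool.example.org/launch",
    {"lti_message_type": "basic-lti-launch-request",
     "lti_version": "LTI-1p0",
     "resource_link_id": "module-3-simulation"},
    "example_key", "example_secret")
```

The tool provider repeats the same computation with its copy of the shared secret and accepts the launch only if the signatures match, which is why registering a tool is just a matter of exchanging a key and secret.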

Blackboard is also making progress on Shibboleth. We’ve joined the InCommon Federation and hope to set up Blackboard as an identity provider. We are working with a handful of customers to work through a few key use cases involving SAML user provisioning, synchronization, and sharing courses between institutions.

I’m not going to go into too much detail on our new SCORM partnership, but I think we’ll see our player improve dramatically as we move to the Rustici player.

Finally, I know that system admins and managers want to see better integration between Blackboard Learn and other campus systems. The administrative systems need to work better. I’ve been on the LIS working group within IMS for several years now, so I’m happy to report a couple of items. First, the working group has finally finished key elements of the core profile. We’ve worked to refine and simplify the API so that it will meet the needs of integrating administrative and VLE systems in both real-time and batch scenarios. This newly simplified profile was the last hurdle preventing us from completing our implementation. We’ve been working closely with SunGard over the last few months, and I’m optimistic we will have something to show at the next IMS meeting in January.

We’re still a bit of a way from shipping LIS, though. The workgroup still has some issues to work out in the Outcomes (grade exchange) profile and with authentication. The plan is to be testing all profiles except Outcomes by the end of January, then publish and finalize the grade exchange profile and, hopefully, be done with the specification by Learning Impact. I’m cautiously optimistic that we will meet these timelines. There is still a lot of work left within the workgroup: in addition to authentication and grade exchange, there is a set of fairly complex certification tests which we (the workgroup and IMS) will need to complete, test, and review. Still, it seems that 2011 will be the year that IMS LIS is finally ready for the industry.

Open Education and the Midterms

The US midterm election is today, and I wonder what it will mean for open education repositories and open textbooks. One interesting scenario would be if Harry Reid loses but the Democrats retain the majority in the Senate. Dick Durbin, one of the main contenders to succeed Reid in the leadership, sponsored S. 1714, which would fund grants for open and freely distributed textbooks. If you want to see more support in the Senate for funding these initiatives, then you should cheer educators like Patty Murray (Washington) and Michael Bennet (Colorado). You might also support Joe Sestak, running for the Pennsylvania Senate seat. Rep. Sestak was on the House committee that did the work on HR 3221, Section 505, which was to set up grants for the development of open textbooks.

Generally, the Republicans have taken a skeptical view of open media like PBS and NPR, as well as of institutions like the Department of Education. Comments from conservatives regarding S. 1714 and HR 3221, Section 505 were very negative; some see them as a plot to enshrine political positions in the textbooks.

On the other hand, with private money from the Gates, Hewlett, and Mellon foundations, perhaps OER and open textbooks are better off without the heavy hand of the US government. I also fear that, just as we’ve seen with K-12 science curriculum standards, more direct involvement by the Feds in the production of books may lead to some ugly fights. Imagine the debates if the DOE funded a biology, economics, or civics textbook.

Educause 2010 Reflections

I had a lot of very interesting conversations, attended some great sessions, and visited many booths at Educause 2010. What I learned will shape my personal technology advocacy and thinking. Let me highlight a few items for your reaction.

1) An IT Labor Shortage?
Is there a labor shortage in IT globally, and/or specifically in academic computing? Some CIOs, IT directors, and even a few software executives seemed to indicate that they were having trouble staffing positions. How will this shortage affect open source projects, commercial software adoption, and companies providing hosting services or SaaS-based models? There were lots of opinions from traditional advocates of the various models; I’ve heard that it could be very negative for any one of them, depending on what folks decide works best in the end. For example, if academic IT departments decide they can no longer maintain staff to support open source applications, they might go to a commercial provider like Blackboard, or perhaps a hosting company like rSmart. On the other hand, maybe the staff shortage stems from the perceived value of owning the whole technology and the role of commercial and service providers. My personal view is that it becomes harder and harder to justify staff-intensive solutions. Highly proficient technical staff are in demand across a number of industries, and global demand for IT talent is such that solutions need to focus on driving down their staff footprint on campus.

2) Shibboleth?
Based on the InCommon meetings and the Identity Management track, it seems to me that Shibboleth is finally gaining real traction. I heard about a lot of success stories. I was even excited to see that a number of Blackboard customers have found success using the lightweight support we’ve made available. As homework from the conference, I’m working to get Blackboard back into InCommon and working with our Technology Product Manager to provide a more detailed roadmap on how we should extend our Shibboleth support. I’ve been doing quite a bit with OpenID over the last year, but I heard pretty clearly that schools want Shibboleth for its perceived higher-quality security.

3) Open Database Is Making An Impact in Analytics?
I heard quite a bit about small projects to mine the VLE for data. John Fritz at UMBC ran a pre-conference workshop on Monday, about which I heard great things, on how they are doing data mining in their Blackboard system to improve outcomes and performance. SunGard announced a Signals building block for Blackboard which provides a nice dashboard with predictive information about student performance. I was interviewed by the guys at Action Analytics, and I’ll link to the video when it’s up on their site.

4) Campus Computing Report?
I was generally pleased to see that the #1, #2, and #3 trends (eBooks, mobile, and lecture capture) have been strongly supported by Blackboard’s technology strategy. In the summer of 2009 we released our first eReader integration (a simple building block that supported the Amazon Kindle); today we have partnerships with the major providers of eBooks, including BN and Follett. On the mobile front, Blackboard has now gone through two generations of integration with our Learn platform and is seeing remarkable adoption, while Mobile Central continues to expand its footprint to more campuses. On the lecture capture front, we have very strong partnerships with Echo 360 (provider of wired classrooms) and with ShareStream and Kaltura (video streaming and management companies), and, with the formation of Bb Collaborate, there are possibilities for recording online collaborations as well. It is good to see our strategy validated by the trends the industry sees as important.
What do you think?

Some Thoughts On Standards and Vendors

Michael Feldstein has an interesting post up on his site about how to judge a vendor’s support for standards. I’m going to weigh in with my personal views and then provide some comments on his article. The usual disclaimers apply: these are my personal thoughts, not those of any employer or group I work with or for. Others are free to chime in with agreement or disagreement.

Personal Soapbox
I want to start by articulating my personal view as a participant in the standards community. I’m a member of the IMS Technology Advisory Board and a contributor to the IMS Learning Information Systems (LIS) working group. I also represent my company on the Common Cartridge Alliance Program Management Group (CCAPMG). I’ve done quite a bit of work on the bulk data exchange service for LIS, and I work with the SIF Association and ADL from time to time. I’ve invested my professional and personal time in helping to evangelize education technology standards. I’ve recently joined the OpenSocial working group and am working with some folks in the community to create a profile of OpenSocial focused on educational data.

It is my personal mission to use standards to create what I call the edutech commons. This would be a set of common code libraries, usable by virtual learning environments, student information systems, learning tools, and learning networks, which would allow easy integration. In the same way that BSD’s libraries for TCP/IP have become the de facto implementation used by most operating systems, I would like the code that connects up technology in .edu to be easily pluggable into many implementations. The whole point of object-oriented coding was that we should be able to create consumable objects that wrap business processes and share them across projects.

We’ve actually seen some great progress on this front in the Basic LTI project. Thanks to the hard work of folks like Dr. Chuck, Stephen Vickers, and others, we now have a pretty good set of Basic LTI libraries out in the field.

Another great example of this is the Shindig project for OpenSocial. With Shindig, any vendor can implement the OpenSocial framework by creating service provider interfaces (SPIs) to map their data and classes to the OpenSocial API. Because Shindig creates a common service layer, code can be more easily moved between OpenSocial implementations.

The alternative is a world of many clean-room implementations. We end up in the world of JavaScript, SCORM, and other “standards” which have many quirks from implementation to implementation. The lack of a code library in the commons results in multiple interpretations and levels of compliance, and this makes interoperability more difficult.

A second concept that is important to me personally is the notion of data transfer. It is my personal view that data should be written to a system once and then be able to move with limited restrictions throughout the network. I want data creators to be able to move easily through the network, with lower transfer costs for data consumers. It frustrates me that so much government information is not packaged in a way that makes it easily portable. Consider the very necessary work of the various companies who compile state education standards and resell the resulting databases. They must parse through different formats (Word documents, PDFs, and sometimes typewritten documents with no electronic equivalent other than a TIFF file) to compile the information into a computer-consumable database. Imagine if instead there were a semantic markup for educational standards. State governments could establish a mechanism for publishing standards in a way that allows these items to be easily parsed. I love the UK Government Data program’s ambition to publish all government documents with semantic markup and metadata on a central site.
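As a purely hypothetical sketch of what such a format might look like (no such standard existed at the time; every element name and value here is invented for illustration):

```xml
<!-- Hypothetical, invented markup for a published state education standard. -->
<standard jurisdiction="US-MN" subject="Biology" grade="9-12" id="9.4.1.2">
  <statement>
    The student will describe the structure and function of the cell.
  </statement>
  <issued>2009-05-01</issued>
  <related idref="9.4.1.1"/>
</standard>
```

With a format like this published at a stable URL, a curriculum company could load every state’s standards with one parser instead of hand-transcribing PDFs and scans.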

A Personal Reaction to the article on e-Literate

With my personal biases known, let me offer some comments on the proposed rubric for vendors and standards. Michael starts by discussing drivers for standards, including the notion that

Standards tend to reduce the total amount of money that customers spend on integration, which generally means that somebody is going to make less money.

I strongly disagree with this. Integration is low-value consulting in my view. It is not strategic, and it does nothing to deepen your relationship with your customer. Customers generally have a limited budget for services with a vendor, and if you use it up on integration you aren’t talking about how they can deploy your product more strategically, or providing them with customizations that better adapt the software to their business. Furthermore, by raising the cost of connecting the software to other business operations, you pull yourself out of the networks that create the next generation of value. My goal is to create standards that are easily consumable and that implement the highest-value functionality in a common format, maximizing portability. I’m hopeful that LIS won’t launch a new wave of expensive products for customers to buy but will instead add capabilities to existing products and lower maintenance and development costs for customers and creators.

Conformance Profiles
Michael writes that it is important to understand how a vendor conforms to the standard, and that conformance profiles are useful for assessing how the features of the standard are implemented by the vendor. I think this whole experience is a sucky one, and I loathe it. Most people never read the conformance profiles, and fewer still actually understand them. I’ve been on many calls with developers working to understand specific nuances of a conformance profile on top of an already complex standard document. I’ve also worked with consultants in the field trying to parse a 200+ page spec document to figure out whether a problem was a bug, a conformance issue, or just some misunderstanding of the spec.

I just want the integrations to work. As a coder, I don’t want to spend time analyzing hundreds of different vendors’ conformance documents to figure out how company “A” implements the person ID. As a technology buyer, I don’t want to have to know the ins and outs of every standard implemented in the products I use. For enterprise integration, I just want to read a list of sections, populate my database, and give you back some grades at the end of the semester. I want SOA that works, not some crazy new way to do EDI that requires another piece of expensive middleware just to make systems exchange data. In my dream world I point System A at System B and they generally talk to each other with minimal fuss, because they both implement a standard; just as I can call your phone number and have the call make its way through the network, across hardware from Cisco, Lucent, Alcatel, and Motorola, until I hear your voice on the other end of the line. As a phone company customer, I didn’t have to review the variances between the CDMA and WiMAX protocols to make the call happen. I’m very happy that Basic LTI has gotten to this point. It seems like Chuck sends me an email every week saying “test out this new thing we integrated with Basic LTI,” and we enter our shared secrets, register the tool, and it just works.

To walk back my rant a bit, there are lots of reasons standards describe capabilities that are unused or implemented differently by vendors. I’ll also give Oracle some credit for getting a product out there that folks could look at, so there is at least a baseline for implementers. I’ll also highlight that there are technologies that can auto-negotiate a mutual level of conformance and capability based on an automated profile. A good example of this is OpenSocial, where you can query an OpenSocial container for its features; this lets you write code to handle the case where a container doesn’t have a feature. I hope we can see something like this in IMS standards eventually.

Conformance Testing
Good conformance tests make your life easy, but too often conformance tests don’t mirror real-world implementations. Conformance tests can also drive standards implementation and adoption if done correctly. The best example of this is the Acid3 test for web standards, which has had a profound impact in getting browser vendors to improve their standards compliance and performance. At the same time, many standards come out with very low-threshold conformance tests. The result is a bunch of people with the certification logo, but no customers successfully going into production with the “standard”. IMS has gotten much better at building conformance tests in the last couple of years.

Testing with other vendors
Michael argues that testing against other vendors is key. I will extend this by saying that the standard needs to be judged with multiple providers and consumers connecting. A single vendor can put a lot of energy into lining up a bunch of partners to create proprietary one-offs, but this doesn’t mean conformance. A measure of a standard and a vendor is whether you can swap out the component on either side and still have it work. For example, in the Basic LTI world we’ve been able to hook up Blackboard, Moodle, D2L, Sakai, OLAT, and other VLEs on one side, and multiple learning tools like Wimba, Learning Objects, etc. on the other. This shows a highly flexible and solid implementation. I’m excited to see SunGard moving into the LIS world, as this will create multiple providers to test against.