A bit of a ramble about why I think the LMS/VLE, or whatever you want to call it, will continue to thrive in 2012.
These tests were focused on driving Blackboard toward implementing three important industry standards that enable portability and interoperability between systems. I know that many customers want to be able to choose from a broad set of market options, and they recognize that standards enable that choice.
As of today this challenge is answered. Blackboard has integrated these technologies into its core. Blackboard is first to market with IMS Common Cartridge 1.1 (import and export) and demonstrated interoperability via IMS LIS with Oracle and SunGard at the 2011 Learning Impact Conference. BasicLTI has become an industry standard and is integrated deeply into Blackboard's technology. LTI powers our partnerships with companies like McGraw-Hill and enables Blackboard Collaborate to integrate with a variety of platforms.
As a person who has been working with IMS standards for a long time, I’m really happy to see this progress. It is personally quite rewarding to see talking about interoperability transform into doing interoperability. Let’s make 2011 the year we transformed this challenge into a benchmark. I await updates from the community on the state of standards in other VLEs such as Sakai and Moodle. Let’s see those benchmarks and meet them together.
mfeldstein has an interesting post up on his site about how to judge a vendor’s support for standards. I’m going to weigh in with my personal views and then provide some comments on his article. The usual disclaimers apply: these are my personal thoughts and not those of any employer or group I work with or for. Others are free to chime in with agreement or disagreement.
I want to start by articulating my personal view as a participant in the standards community. I’m a member of the IMS Technology Advisory Board and a contributor to the IMS Learning Information Systems (LIS) working group. I also represent my company on the Common Cartridge Alliance Program Management Group (CCAPMG). I’ve done quite a bit of work on the bulk data exchange service for LIS. I also work with the SIF Association and the ADL Co-Lab from time to time. I’ve invested my professional and personal time in helping to evangelize for education technology standards. I’ve recently joined the OpenSocial working group and am working with some folks in the community to create a profile of OpenSocial focused on educational data.
It is my personal mission to use standards to create what I call the edutech commons. This would be a set of common code libraries usable by Virtual Learning Environments, Student Information Systems, Learning Tools, and Learning Networks that would allow easy integration. In the same way that BSD’s libraries for TCP/IP have become the de facto implementation used by most operating systems, I would like the code that connects up technology in .edu to be easily pluggable into many implementations. The whole point of object-oriented coding was that we should be able to create consumable objects that wrap business processes and share them across projects.
We’ve actually seen some great progress on this front with the Basic LTI project. Thanks to the hard work of folks like Dr. Chuck, Stephen Vickers, and others, we now have a pretty good set of Basic LTI libraries out in the field.
Another great example of this is the Shindig project for OpenSocial. With Shindig, any vendor can implement the OpenSocial framework by creating service provider interfaces (SPIs) to map their data and classes to the OpenSocial API. Because Shindig provides a common service layer, code can be moved more easily between OpenSocial implementations.
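To make the SPI idea concrete: the real Shindig SPIs are Java interfaces (for example, its person service), but the shape of the pattern can be sketched in a few lines of Python. Every name below is illustrative, not Shindig's actual API.

```python
from abc import ABC, abstractmethod

class PersonService(ABC):
    """Common service layer: OpenSocial-facing code targets only this."""
    @abstractmethod
    def get_person(self, user_id: str) -> dict:
        """Return a person as OpenSocial-style fields."""

class VendorPersonService(PersonService):
    """Vendor-side SPI: maps the vendor's native records to the common API."""
    def __init__(self, vendor_db: dict):
        self.vendor_db = vendor_db  # vendor's own schema, whatever it is

    def get_person(self, user_id: str) -> dict:
        rec = self.vendor_db[user_id]
        # Map vendor field names onto OpenSocial-style field names.
        return {"id": user_id,
                "displayName": f"{rec['first']} {rec['last']}"}

# Code written against PersonService works unchanged when a different
# vendor plugs in its own SPI implementation.
service = VendorPersonService({"u1": {"first": "Ada", "last": "Lovelace"}})
print(service.get_person("u1"))  # {'id': 'u1', 'displayName': 'Ada Lovelace'}
```

The point is the seam: the container only ever sees `PersonService`, so swapping vendors means swapping the SPI class, not the application code.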
A second concept that is important to me personally is the notion of data transfer. It is my personal view that data should be written to a system once and then be able to move with limited restrictions throughout the network. I want data to move easily through the network, lowering transfer costs for data consumers. It frustrates me that so much government information is not packaged in a way that makes it easily portable. Consider the very necessary work of the various companies who compile state education standards and resell the resulting databases. They must parse through different formats, Word documents, PDFs, and sometimes typewritten documents with no electronic equivalent other than a TIFF scan, to compile the information into a computer-consumable database. Imagine if instead there were a semantic markup for educational standards. State governments could establish a mechanism for publishing standards in a way that lets these items be easily parsed. I love the UK Government Data program’s ambition to publish all government documents with semantic markup and metadata on a central site.
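To illustrate what that could buy us: no such shared markup for state standards existed, so the element names below are entirely invented, but a semantic format would reduce the compilers' job from scraping PDFs to a few lines of parsing.

```python
import xml.etree.ElementTree as ET

# Purely hypothetical markup for a state education standard -- the schema
# here is invented to show what machine-readable publication could enable.
doc = """
<standards state="VA" subject="Mathematics" year="2011">
  <standard id="VA.MATH.3.4">
    <statement>Estimate and solve single-step addition problems.</statement>
    <grade>3</grade>
  </standard>
</standards>
"""

root = ET.fromstring(doc)
# One pass turns the published document into database-ready rows.
rows = [(root.get("state"), s.get("id"), s.findtext("grade"),
         s.findtext("statement"))
        for s in root.findall("standard")]
print(rows)
# [('VA', 'VA.MATH.3.4', '3', 'Estimate and solve single-step addition problems.')]
```

Publish once in a parseable form, and every downstream consumer writes a loader like this instead of re-keying typewritten pages.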
A Personal Reaction to the article on e-Literate
Knowing my personal biases, let me offer some comments on the proposed rubric for vendors and standards. Michael starts by discussing drivers for standards, including the notion that
Standards tend to reduce the total amount of money that customers spend on integration, which generally means that somebody is going to make less money.
I strongly disagree with this. Integration is low-value consulting in my view. It is not strategic and does nothing to deepen your relationship with your customer. Customers generally have a limited budget for services with a vendor, and if you use it up on integration you aren’t talking about how they can deploy your product more strategically, or providing customizations that better adapt the software to their business. Furthermore, by raising the cost of connecting the software to other business operations, you pull yourself out of the networks that create the next generation of value. My goal is to create standards that are easily consumable and implement the highest-value functionality in a common format, maximizing portability. I’m hopeful that LIS won’t launch a new wave of expensive products for customers to buy, but will instead add capabilities to existing products and lower maintenance and development costs for customers and creators.
Michael writes that it is important to understand how a vendor conforms to the standard, and that conformance profiles are useful in assessing how the features of the standard are implemented by the vendor. I think this whole experience is a sucky one, and I loathe it. Most people never read the conformance profiles, and fewer still actually understand them. I’ve been on many calls with developers working to understand specific nuances of a conformance profile on top of an already complex standard document. I’ve also worked with consultants in the field trying to parse a 200+ page spec document to figure out whether a problem was a bug, a conformance issue, or just some misunderstanding of the spec.

I just want the integrations to work. As a coder I don’t want to spend time analyzing hundreds of different vendors’ conformance documents to figure out how company “A” implements the person ID. As a technology buyer I don’t want to have to know the ins and outs of every standard implemented in the products I use. For enterprise integration I just want to read a list of sections, populate my database, and give you back some grades at the end of the semester. I want SOA that works, not some crazy new way to do EDI that requires another piece of expensive middleware just to make systems exchange data. In my dream world I can point System A at System B and generally have them talk to each other with minimal fuss, because they both implement a standard. Just like I can call your phone number and have the call make its way through the network across hardware from Cisco, Lucent, Alcatel, and Motorola until I hear your voice on the other end of the line. As a phone company customer I didn’t have to review the variances between CDMA data networks and WiMAX protocols to make the call happen. I’m very happy that BasicLTI has gotten to this point.
It seems like Chuck sends me an email every week saying “test out this new thing we integrated with BasicLTI”; we enter our shared secrets, register the tool, and it just works.
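What “enter the shared secrets and it just works” means under the hood is that a Basic LTI launch is an OAuth 1.0 HMAC-SHA1 signed form POST, and both sides compute the same signature from the same secret. Here is a rough sketch of the consumer side; the `lti_*` parameter names follow the spec, but this is illustrative, not a production implementation, and the example URL, key, and secret are invented.

```python
import base64
import hashlib
import hmac
import time
import uuid
from urllib.parse import quote

def oauth_escape(s):
    # OAuth percent-encoding: only unreserved characters left bare.
    return quote(str(s), safe="~")

def sign_lti_launch(url, params, consumer_key, shared_secret):
    """Sign a Basic LTI launch (an OAuth 1.0 HMAC-SHA1 signed form POST)."""
    oauth = {
        "oauth_consumer_key": consumer_key,
        "oauth_nonce": uuid.uuid4().hex,
        "oauth_timestamp": str(int(time.time())),
        "oauth_signature_method": "HMAC-SHA1",
        "oauth_version": "1.0",
    }
    all_params = {**params, **oauth}
    # Signature base string: METHOD & encoded URL & encoded sorted pairs.
    pairs = "&".join(f"{oauth_escape(k)}={oauth_escape(v)}"
                     for k, v in sorted(all_params.items()))
    base = "&".join(["POST", oauth_escape(url), oauth_escape(pairs)])
    key = oauth_escape(shared_secret) + "&"  # no token secret in LTI
    digest = hmac.new(key.encode(), base.encode(), hashlib.sha1).digest()
    all_params["oauth_signature"] = base64.b64encode(digest).decode()
    return all_params

# The consumer POSTs these form fields; the provider recomputes the
# signature from its copy of the shared secret and compares.
launch = sign_lti_launch(
    "https://tool.example.com/launch",
    {"lti_message_type": "basic-lti-launch-request",
     "lti_version": "LTI-1p0",
     "resource_link_id": "rl-1"},
    consumer_key="key123", shared_secret="secret456")
print(launch["oauth_signature"])
```

Because everything reduces to one shared secret and one well-defined signing rule, registering a tool really is just exchanging a key and secret.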
To walk back my rant a bit, there are lots of reasons standards describe capabilities that are unused or implemented differently by vendors. I’ll also give Oracle some credit for getting a product out there that folks could look at, so at least implementers have some baseline. And I’ll highlight that there are technologies that can auto-negotiate a mutual level of conformance and capability based on an automated profile. A good example of this is OpenSocial, where you can query an OpenSocial container for its features. This lets you write code to handle the case where a container doesn’t have a feature. I hope we eventually see something like this in IMS standards.
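The feature-query pattern itself is simple. In OpenSocial the actual check is done in JavaScript via the gadgets feature API; the Python sketch below just shows the shape of the pattern, with an invented feature name, not any real container's feature list.

```python
# Capability negotiation sketch: branch on what the container advertises
# instead of assuming every implementation supports every feature.

def render_roster(container_features: set, people: list) -> str:
    """Degrade gracefully based on the advertised feature set."""
    if "profile-links" in container_features:
        # Rich path: the container can link names to full profiles.
        return ", ".join(f"[{p}]" for p in people)
    # Fallback path: plain names only.
    return ", ".join(people)

print(render_roster({"profile-links", "messaging"}, ["Ada", "Alan"]))  # [Ada], [Alan]
print(render_roster(set(), ["Ada", "Alan"]))  # Ada, Alan
```

The same gadget then runs in both rich and minimal containers, which is exactly the property I wish more IMS specs made easy.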
Good conformance tests make your life easy, but too often conformance tests don’t mirror real-world implementations. Done correctly, conformance tests can also drive standards implementation and adoption. The best example of this is the Acid3 test for web browsers. That test has had a profound impact on getting browser vendors to improve their standards compliance and performance. At the same time, many standards come out with very low-threshold conformance tests. The result is a bunch of people with the certification logo, but no customers successfully going into production with the “standard”. IMS has been getting much better at building conformance tests over the last couple of years.
Testing with other vendors
Michael argues that testing against other vendors is key. I will extend this by saying that the standard needs to be judged with multiple providers and consumers connecting. A single vendor can put a lot of energy into lining up a bunch of partners to create proprietary one-offs, but this doesn’t demonstrate conformance. A measure of a standard and a vendor is whether you can swap out the component on either side and still have it work. For example, in the BasicLTI world we’ve been able to hook up Blackboard, Moodle, D2L, Sakai, OLAT, and other VLEs on one side, and multiple learning tools like Wimba, Learning Objects, etc., on the other. This shows a highly flexible and solid implementation. I’m excited to see SunGard moving into the LIS world, as this will create multiple providers to test against.
This Sunday (July 10) we will kick things off with the OSCELOT Open Source Day. This will be a fun day of code jamming and collaborating. I’m bringing my laptop and my IDE.
Monday and Tuesday we will have the official Blackboard-sponsored program. Anya Kamenetz will be our keynote speaker, followed by my own annual DevCon keynote in the afternoon. We’re going to be joined by a special guest, Ray Henderson. I think Ray’s willingness to co-present the DevCon keynote signifies that he personally takes openness of our platform very seriously. I’ve asked him to make some public comments and commitments regarding further opening of the Blackboard Learn(TM) platform and standards. As part of this commitment we will have a significant block of time at the conference dedicated to IMS standards. IMS staff will present alongside Blackboard customers and partners, demonstrating how these standards can be used in Blackboard.
After our comments we will have some tremendous sessions, including a performance engineering workshop led by Steve Feldman, and other tracks focused on System Administration, Getting Started with Building Blocks, and Database Reporting and Tools. We’ve also set up collaborative areas in the hallway where Blackboard experts will be standing by, ready to provide advice and insights on building and extending Blackboard.
If you are missing DevCon and Open Source Day this year, I hope you’ll follow along on Twitter and the blogs. I will do my best to get some posts up during the week with my own reflections. If you are covering DevCon via your blog or social media, post a comment below and let me know where I can follow your conversation.