Sticking with Blackboard

Over the last year, our college underwent a thorough review of its learning management system. After considering all the options and running a trial of Instructure’s Canvas, we decided to stick with Blackboard Learn. Blackboard has established a dominant position in the market over the years, often by buying out its rivals. Canvas, our trial system, has mounted a substantial challenge to Blackboard, mainly by taking a fresh approach to this category of software: it adopted a truly cloud-based platform and focused its design on the user experience. The LMS market has been both fluid and complex, as illustrated by this infographic created by Phil Hill.

LMS market trends
It has been interesting to see the changes that have taken place.

The reasons our college decided to remain with Blackboard, despite having had some problems with it over the years, are varied. Partly it may have been a feeling of better the devil you know, but there was also undoubtedly the influence of positive change at Blackboard under its new management (see A new Blackboard?). In particular, there seems to be a reaction to competition from Canvas in the area of user experience.

Last month I attended Blackboard’s world conference for the first time to get a sense of its path forward. I was impressed with the scale of the event and the new emphasis on the user experience. A user experience lab was accessible throughout the conference so participants could test forthcoming improvements intended to make the system “friendlier”. This is a substantial and real change from Blackboard’s previously clunky approach to interface design, where doing something simple could take multiple clicks. The recent hiring of Jon Kolko, who is well known within the human-computer interaction and design communities, illustrates Blackboard’s new approach. Jon will focus on improving the company’s mobile apps.

Blackboard’s move to the cloud was a major announcement (see this video for a short non-technical explanation of the cloud). During our review of Canvas vs Blackboard there was some confusion over what was meant by a cloud-based system. Blackboard was thought to be on the cloud because it was hosted on Blackboard’s servers rather than on a College server. In fact, a hosted service gives each customer its own copy of the Blackboard product (often at different versions) running on dedicated resources. A true cloud system is one in which users have separate data and custom views but everyone shares the same version of the product. Blackboard will run its new cloud version on Amazon Web Services, just as Canvas does.
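
To make the distinction concrete, here is a minimal sketch, in Python, of the difference between a hosted (single-tenant) deployment and a multi-tenant cloud platform. All class names, versions and fields are invented for illustration and are not Blackboard’s actual architecture:

```python
# Hypothetical illustration of hosted (single-tenant) vs cloud (multi-tenant) deployment.

# Hosted model: every institution runs its own copy, potentially at its own version.
class HostedDeployment:
    def __init__(self, institution, version):
        self.institution = institution   # one customer per deployment
        self.version = version           # versions can drift apart over time
        self.data = {}                   # data lives with this particular copy

# Cloud (multi-tenant) model: one shared application at one version;
# each institution keeps separate data and its own configuration/branding.
class CloudPlatform:
    CURRENT_VERSION = "2014.7"           # everyone is always on the same release

    def __init__(self):
        self.tenants = {}                # institution -> {data, custom view}

    def add_tenant(self, institution, branding):
        self.tenants[institution] = {"data": {}, "branding": branding}


# Hosted: two colleges can end up on different versions.
a = HostedDeployment("College A", "9.1 SP10")
b = HostedDeployment("College B", "9.1 SP14")

# Cloud: both colleges share the same version but keep separate data and views.
cloud = CloudPlatform()
cloud.add_tenant("College A", branding="blue theme")
cloud.add_tenant("College B", branding="green theme")
```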

Blackboard has a number of products (57 in all). In addition to being an LMS vendor, it offers a leading tool for synchronous learning (Collaborate) and tools for analytics. These tools have been sold separately and operated under separate business units but will in the future be merged and integrated. The next versions of Blackboard products will be sold under four levels of integrated services:

Learning Core: The Learn LMS (with some added tools)
Learning Essentials: Learn plus Collaborate
Learning Insight: Adds Analytics for Learn to track learning outcomes
Learning Insight & Student Retention: Basically everything Blackboard offers.

In addition to the many presentations on Blackboard’s products and how they are being used, the conference featured two very inspiring keynote speakers. One is posted online.

A shorter version of the other can be found in this TED talk.

For another perspective on the conference see “Blackboard’s Big News that Nobody Noticed”, posted on July 18, 2014 by Michael Feldstein.

Common Challenges

I recently participated in a conference on health care informatics. A number of the speakers discussed the main challenges in health care: cost, quality, integration and including a patient (end-user) voice. These could apply equally to education.

Some presenters spoke about topics I have included in my recent blog entries. For example, in my February post on the curriculum problem, I noted how it is increasingly difficult for the curriculum in most disciplines to keep pace with the accelerated rate of knowledge generation brought about by the internet age. One speaker, a physician, called on members of his profession to stop pretending they can be current with the latest information about their discipline. He talked of the knowledge processing-capacity gap and the need to provide contextual information support at the point of care.

We have the capability to produce checklists, reminders, alerts, warnings and identification of alternatives (automatically updated based on the latest knowledge) for electronic records and treatment protocol forms. However, such technologies are not in widespread use. As long as we are relying solely on knowledge in the heads of medical professionals rather than on just-in-time knowledge embedded in the technology infrastructure, the gap will get wider.
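
As a rough illustration of what such embedded, just-in-time support looks like, here is a toy sketch of a rule-based alert layer. The rules, drug names and record fields are all invented; a real clinical decision-support system is vastly more sophisticated:

```python
# Toy sketch of point-of-care decision support: rules are maintained centrally
# (so they can be updated as knowledge changes) and checked against each record.

# Hypothetical, centrally updated knowledge base of interaction rules.
INTERACTION_RULES = [
    {"pair": {"drug_x", "drug_y"},
     "warning": "Known interaction: consider an alternative to drug_y."},
]

CHECKLIST = ["allergies reviewed", "current medications reconciled"]

def review_record(record):
    """Return reminders and warnings for one (toy) patient record."""
    alerts = []

    # Reminders: checklist items not yet completed.
    for item in CHECKLIST:
        if item not in record.get("completed", []):
            alerts.append(f"Reminder: {item}")

    # Warnings: prescriptions matching a known-interaction rule.
    meds = set(record.get("medications", []))
    for rule in INTERACTION_RULES:
        if rule["pair"] <= meds:
            alerts.append(f"Warning: {rule['warning']}")

    return alerts

print(review_record({"medications": ["drug_x", "drug_y"],
                     "completed": ["allergies reviewed"]}))
# ['Reminder: current medications reconciled', 'Warning: Known interaction: ...']
```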

In my March entry on the scientific method, I noted how much educational practice is informed by unsubstantiated belief instead of educational science and effectiveness data. A conference presenter referred to a study of over 1,000 physicians in the UK that revealed that only 3 percent regularly reviewed effectiveness data about their practice, and 55 percent had never attempted to collect and review such data. The majority just have faith that they are making a difference.

There are few virtuous loops in health care that measure patient outcomes, analyze data to determine what is working and instantly share the results with care providers. Large-scale data collection and analysis is required to make this happen, and a lack of data standards creates a barrier. For example, a project in Massachusetts to create a data interchange is targeting only the 20 electronic health record systems that have substantial market presence, meaning that in one state there are so many incompatible systems in use that it is not possible to cover them all.

A whole industry has been created by IT companies that provide translational services between different health-related data systems. This problem has been an issue with educational software systems too, although there has been progress in getting many vendors to adopt some common standards for data interchange. Organizations such as the IMS Global Learning Consortium, with its Learning Tools Interoperability standard, have been helpful in this regard.
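
A sketch of what a translation layer does in principle: map one system’s record layout onto a common vocabulary. The field names below are invented; real interchange standards (HL7 in health care, IMS/LTI in education) are far richer:

```python
# Minimal sketch of a translation service: rename one system's fields into a
# common vocabulary so that otherwise incompatible systems can exchange data.

SYSTEM_A_TO_COMMON = {
    "pt_name": "patient_name",
    "dob": "date_of_birth",
    "rx": "medications",
}

def translate(record, field_map):
    """Map a source system's field names onto the shared vocabulary."""
    return {field_map.get(key, key): value for key, value in record.items()}

record_from_a = {"pt_name": "Doe, Jane", "dob": "1990-01-01", "rx": ["drug_x"]}
print(translate(record_from_a, SYSTEM_A_TO_COMMON))
# {'patient_name': 'Doe, Jane', 'date_of_birth': '1990-01-01', 'medications': ['drug_x']}
```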

Despite these problems there was a great deal of optimism among conference participants that technology has the potential to radically transform and improve the health care system. In addition to large-scale data analytics to help improve health outcomes, there were a number of presentations on the use of mobile apps and providing greater patient access and involvement through personal health records and virtual visits.

This short video provides an example of the impact of large-scale (big) data analytics in healthcare. It is possible to argue for similar analytics being applied to education.

The User Experience in Education

Usability is a term usually associated with the interface design of technology-related systems. Its aim is to improve a system’s ease of use. The classic definition of usability is relative rather than absolute. One cannot say a design is usable, user-friendly or has good usability. One can say that design A is relatively more usable than design B based on some measure of effectiveness (user goal completion), efficiency (time taken to complete the task, number of errors, and number of times assistance is sought), satisfaction (the user’s rating of the experience), or learnability (the amount of instruction or study required). Numerous evaluation techniques have emerged to measure these factors.
It is also possible to argue that, similar to specific products, organizational systems and services are designed with more or less usability. For example, the design of signage to facilitate navigation in a public library contributes to the usability of the library as a service.
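
As a minimal sketch of the kind of relative comparison described above, the following compares two hypothetical designs on the classic measures. The numbers are invented; real evaluations use controlled tasks and many participants:

```python
# Toy comparison of two designs on the classic usability measures.
# Scores are invented for illustration only.

designs = {
    "A": {"completion_rate": 0.92, "mean_time_s": 48, "errors": 1.2, "satisfaction": 4.1},
    "B": {"completion_rate": 0.71, "mean_time_s": 95, "errors": 3.8, "satisfaction": 2.9},
}

def measures_won(a, b):
    """Relative judgment only: count the measures on which design a beats design b."""
    wins = 0
    wins += a["completion_rate"] > b["completion_rate"]   # effectiveness
    wins += a["mean_time_s"] < b["mean_time_s"]           # efficiency (time on task)
    wins += a["errors"] < b["errors"]                     # efficiency (errors)
    wins += a["satisfaction"] > b["satisfaction"]         # satisfaction
    return wins

print("Design A beats B on", measures_won(designs["A"], designs["B"]), "of 4 measures")
```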

library

The term user experience (UX) has become more commonly used. UX recognizes that while usability is important and an often neglected part of design, the holistic nature of design is such that a successful design requires balancing all of its different aspects (usability, functionality, aesthetics, etc.). It recognizes that a finished design is a gestalt—the whole is greater than the sum of its parts. UX initially was mainly applied to technology (e.g. web site design) but is starting to gain traction for evaluating and improving higher education; for example, the library experience and first-year experience.

Design decisions dominated by those not trained in UX design create a barrier to achieving a good user experience. As illustrated in the above photo, many designers will design with their own perspective of what is important and assume the user is just like them. Another problem is that designers will sometimes categorize the advocates, purchasers or managers of what is being designed as the users instead of identifying and understanding variations among actual users. Design should be centered on the end-user.

The development of new textbooks is one example of how user-centered design of the UX has yet to influence education. Publishers tend to assess the quality of a new textbook by asking peers of the authors (potential advocates for purchase) instead of conducting studies with students (the end-users) to determine whether the book supports and inspires learning.

Reference: A Textbook Example of What’s Wrong with Education by Tamim Ansary

This video presentation by Paul Bennett illustrates user-centered design of the user experience in action:

Don’t Wear Size 10.5 Shoes

I once met a senior professor who wrote everything on a chalk board during class because he thought that the act of writing on the board had a positive effect on student learning. He was a professor of science who, when it came to teaching, believed in magic. The world of education is full of untested beliefs implemented by teachers and, on a larger scale, by politicians and administrators. In previous blog entries I have described the magical belief in the use of technology alone to transform education.
 
Definitions of science are often split between science as a body of knowledge and science as a process for discovering and evaluating knowledge (the scientific method).  The scientific method should not be compartmentalized as applying only to the traditional areas of science knowledge (physics, biology and chemistry); rather, it can apply to all areas of knowledge. It requires much more creativity but no less rigor when applied to the nontraditional areas. 
 
The lack of application of scientific method to education is explored in this article: “Why Can’t a Teacher Be More Like a Scientist? Science, Pseudoscience and the Art of Teaching” (Mark Carter, Kevin Wheldall, Australasian Journal of Special Education, Vol. 32, Iss. 1, 2008). 
 
The possibility that individuals are predisposed to believe in way-out explanations instead of applying scientific method is covered entertainingly in this TED talk by Michael Shermer, “Why People Believe Weird Things”:

There are many areas, not just education, where scientific method is underemphasized and untested beliefs take hold. The media often plays a negative role in this by reporting on a single study showing that people doing X had a greater risk of Y (where Y is something bad) (see Ransohoff D., Ransohoff R. 2001 for a discussion of this issue). This creates two problems. First, a single study proves nothing; there must be a large weight of evidence (many studies) supporting a theory before it can be accepted as equivalent to fact. Second, correlation of two things does not mean there is a causal relationship. When discussing science as a process, I always ask my students who is wearing size 10.5 shoes. I then tell them that if they are male they should change their shoes immediately, since more men die wearing size 10.5 shoes than any other size. This helps them view studies that report correlations with a more critical perspective.
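
A simple simulation makes the shoe-size point concrete: if size 10.5 is simply the most common size (an assumed distribution below), it will also be the most common size among those who die, even when shoe size has no effect on mortality at all:

```python
import random
from collections import Counter

random.seed(1)

# Toy model: shoe size has no effect on mortality; 10.5 is just the most common size.
SIZES = [9.0, 9.5, 10.0, 10.5, 11.0, 11.5]
SIZE_WEIGHTS = [0.10, 0.15, 0.20, 0.30, 0.15, 0.10]   # assumed distribution
DEATH_RATE = 0.01                                     # identical for every size

population = random.choices(SIZES, weights=SIZE_WEIGHTS, k=100_000)
deaths = [size for size in population if random.random() < DEATH_RATE]

print(Counter(deaths).most_common(3))
# Size 10.5 tops the "deaths by shoe size" list even though it causes nothing.
```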
 
Until we all do a better job of integrating the scientific method into education and promoting it to the general public, untested beliefs will continue to have more influence than they should.

References:
 
Ransohoff D, Ransohoff R. Sensationalism in the media: when scientists and journalists may be complicit collaborators. Eff Clin Pract. 2001;4:185-188.
 
This video contains a reasonable summary of the scientific method.

The Curriculum Problem

The rapid growth in the amount of knowledge students are expected to acquire, which has led to ever-expanding curricula, is not being fully addressed in many disciplines. Instructors are trying to pack more and more material into the same amount of time. For students, this is akin to attempting to drink water from a fire hydrant: knowledge is sprayed at them faster than they can absorb it.

drinking from fire hydrant

The first universities had a limited curriculum sanctioned by the Catholic Church. However, there was from the beginning the notion of a “tree of knowledge” growing outward with different branches, as illustrated by Scott B. Weingart on his website.

During the Age of Enlightenment, more secular influences led to the establishment of several distinct disciplines, which over time split into new branches. Empiricism and natural philosophy developed into natural science with three main branches: physics, biology and chemistry. As knowledge expanded, each again branched into many sub-disciplines, e.g. astrophysics and biochemistry. This trend accelerated in the twentieth century, with entirely new disciplines emerging regularly. For example, computer science evolved from mathematics and first appeared as a degree program in the early 1950s. In a relatively short time it has developed many offshoots, including information technology, software engineering and web science.

In traditional curriculum design there is the notion of a core body of knowledge associated with each discipline, that is, certain things that graduates should know, understand and be able to do. In the past, if experts in a particular discipline did not have this core knowledge, they would not be able to function: filling in gaps in their knowledge would require the time-consuming task of visiting the library.

The growth in knowledge over the last century is seen in the traditional outlets for dissemination (books and journals) and is now being dramatically boosted through the internet via blogs, web sites and online videos [1]. Previously, the world learned about discoveries mainly when they were published in a journal (assuming one had access to a local library) or presented at a conference. Today, collaboration and sharing among knowledge creators happens rapidly and on a global scale through the internet.

In many disciplines, core knowledge is now a fast-moving target. What is known is changing so rapidly that what was held to be core 10 years ago may now be outdated or wrong. At the same time, internet and mobile technology enables us to obtain most available knowledge instantly when needed, provided we have the skills to know how and where to look.

The challenge is to revise the notion of curriculum design, rebalancing it more toward cognitive skills and less toward memorization of a relatively fixed body of knowledge. A sense of tradition often plays a part here: I remember the medical and dental students at my college struggling to memorize Gray’s Anatomy. I had the feeling that this was more a tribal rite of passage than an essential component of becoming an effective physician.

The modern curriculum designer should distinguish “need to know” from “nice to know”. The skill of acquiring just-in-time knowledge using technology should be emphasized. A curriculum should focus on a deep understanding of enduring skills, concepts and principles. Content that is traditionally memorized needs ruthless editing.  This presents a problem as most contributors to curriculum design find it easier to add items than to remove them. The necessary culture change in curriculum design brought about by the Internet age may take some time to achieve.

[1] The rate of growth in scientific publication and the decline in coverage provided by Science Citation Index, an article in the journal Scientometrics.

The Myth of Multitasking

Everyone has no doubt heard something along the lines of “young people today are more adept at multitasking”. You may also have heard more specific variations, such as claims that students studying a certain subject or people from a certain country are better at multitasking than others.
 
The problem with such statements is that they fly in the face of evolutionary biology. It takes many generations of a species’ evolution before real changes begin to emerge. Young people, students of certain subjects and people from certain countries are working with essentially the same brains as the rest of us. Brains have not magically changed in one generation.
 
Genetic factors are one thing, but what about the environment? That also influences behavior. And of course the big thing that has changed in the environment is the technologies we use. So the argument may go that because someone has become proficient with a technology, they have in effect supplemented their brain power in a way that makes them better multitaskers.
 
One of the first scientists to sound the alarm on this kind of thinking was Clifford Nass. He talks about the problem and its relationship with technology in this NPR interview.

The basic message from the scientific research is that rather than multitask, we rapidly switch between tasks, processing them serially. There is an extra load involved in performing each switch. Those who think they are good at multitasking are actually fooling themselves: attempting to multitask demonstrably diminishes performance on each individual task. This is something that instructors may need to emphasize with their students, particularly when it comes to studying for exams.
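
A toy back-of-the-envelope model (the numbers are assumptions, not measurements) shows why constant switching costs time even when the total amount of work is unchanged:

```python
# Toy model of serial work vs constant task switching.
# All numbers are assumptions chosen for illustration.

WORK_PER_TASK_MIN = 30          # minutes of actual work each task needs
SWITCH_COST_MIN = 0.5           # overhead to re-focus after each switch
TASKS = 3

# Doing tasks one at a time: only two switches in total.
serial_time = TASKS * WORK_PER_TASK_MIN + (TASKS - 1) * SWITCH_COST_MIN

# Interleaving in 5-minute slices: many more switches, same total work.
slice_min = 5
switches = (TASKS * WORK_PER_TASK_MIN) // slice_min - 1
interleaved_time = TASKS * WORK_PER_TASK_MIN + switches * SWITCH_COST_MIN

print(f"Serial: {serial_time} min, interleaved: {interleaved_time} min")
# Serial: 91.0 min, interleaved: 98.5 min; research suggests error rates rise as well.
```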

Additional reading and listening:

You Say Multitasking Like It’s a Good Thing by Charles J. Abaté
http://199.223.128.53/assets/img/PubThoughtAndAction/TAA_08_02.pdf
 
Think you are multitasking? NPR audio
http://www.npr.org/templates/story/story.php?storyId=95256794

Death By PowerPoint

The most ubiquitous tool in modern education is PowerPoint. Education is not alone in the widespread adoption of this tool. When I worked in research with the U.S. military, I was surprised at how much PowerPoint was used to capture and disseminate knowledge, a fact lamented in a New York Times article, “We Have Met the Enemy and He Is PowerPoint”.

In a recent article in Faculty Focus, a newsletter by Magna Publications, “Improve Your PowerPoint Design with One Simple Rule”, John Orlando states, “… 90% of the problem can be solved by following one simple rule: No bullet points.” Visuals that illustrate ideas or concepts are fine, but text that echoes what you say or reminds you to say it is not. At worst, PowerPoint becomes just an autocue for what a lecturer wants to say, in which case it would be better replaced by a well-scripted online recording of the lecture.

An argument for live lectures is that they can be inspirational if delivered by a skilled orator. This has prompted humorists to speculate how PowerPoint might have influenced the great speeches of history: I Have a Dream, the Gettysburg Address.

In working with students doing presentations over the years, I have found many of them use PowerPoint as a crutch in what is often a nerve-racking experience for them. Left to their own devices, they may have 30 or more slides for a 15-minute talk. They worry about finishing too early but more often run out of time. They would often rather look at the screen and read the slides than talk directly to the audience. I deliberately restrict their use of PowerPoint to no more than five slides for a 15-minute talk, and a slide may contain only pictures and diagrams, no bullets.

I have tried to change how I use PowerPoint (more recently Apple Keynote) over the years. Understanding how easy it is to use it poorly, I restrict my use of it and focus on teaching more interactively. For example, I teach design to technology students. There are principles of design, e.g. consistency. In the past I would have listed the principles as bullet points on a slide and discussed each one. I now use web links to examples of good and bad web page design and ask the students to discuss them together in groups. They must establish criteria to rate them and provide reasons for why one is better than the other. This task engages the students in analysis, discussion and collaboration.

I find the students are fully engaged in the problem and in the process they discover most of the design principles for themselves. It takes a little more time to teach this way but the students learn more and are fully involved in the class. Unlike with a traditional PowerPoint presentation, they cannot pretend to be listening while actually being more attentive to what is happening on their phone screens.

Finally, I will leave you with comedian Don McMillan presenting a humorous expose of issues with PowerPoint use in his “Life After Death by PowerPoint”: