OK, let’s get this out of the way: you’re reading a blog post about Educause written by a commercial solutions provider who was there.  We know that you know we have a bias about our product and about our presentations at events like this.

We get it.  But we hope you’ll read on anyway, because it may pleasantly surprise you.

Educause had a lot to be excited about this year.  We likely can’t do it justice the way @mfeldstein67 can, since he is seen by most as an impartial observer.  His piece, “What I Didn’t See At Educause,” was strong, but I really, really, REALLY wish he had come by our booth and asked us his questions.  Sigh.

Maybe he felt anyone in ‘Start-Up Alley’ was too immature to answer an efficacy question effectively.  It’s hard to say.  But knowing Michael, had he come over, we could have had a really good conversation.  Why?

For a few reasons.  First, he might have been surprised to see a “start-up” (we’re really not a start-up, but sitting in the graduate row of Start-Up Alley would lead one to suspect we are) drawing so much traffic.  Even the other graduates were asking why so many people were coming to see our stuff.

The answer is simple.  Our partners are not just happy with us; they’re genuinely pleased with the early results of our implementations and connective-tissue assistance.

True, they are not yet in a position to run the A/B tests or randomized trials that we are encouraging them all to attempt.  Almost all of them assure us that they will, but for now they are simply excited to be implementing a product with people who speak their language and know their political and bureaucratic landscape.

But the second reason we think Mr. Feldstein would have enjoyed a conversation with us is that we have an answer to his question, “How do you know it works?”  Two of our executives ran exactly these kinds of trials and experiments before they left their private university to work with us.  In fact, the results of those experiments were part of the reason they were so excited to join our team and share this information with every school in the land.

In one case, a communication experiment was performed.  Just about everyone today knows that students loathe email.  Some reports suggest that fewer than 30% of students open school email at all, treating it just like spam.  Wow.  So, prior to the launch of Campus at their institution, the Office of Innovation sent an “all-hands” message to all faculty, staff, and students.  The message contained a trackable link to information the recipients actually needed.  Yet only 19% of users clicked it.

About four months later, the Campus platform was up and running, connecting people to every tool but also funneling notifications to people by cluster, stakeholder type, and so on.  A very similar message was created and sent out, again with a trackable link, but this time an in-platform notice alerted people to the message.  And guess what?  67% of all faculty, staff, and students clicked the link.
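For the statistically minded, here’s a minimal sketch of how one could check that a jump from 19% to 67% click-through isn’t just noise.  To be clear, this is our illustration, not the original analysis, and the recipient counts below are placeholders rather than the actual campaign sizes.

```python
# Illustrative only: a two-proportion z-test on the pre- vs. post-launch
# click-through rates. Recipient counts are placeholders, not real data.
from statsmodels.stats.proportion import proportions_ztest

n_pre, n_post = 10_000, 10_000                      # assumed campaign sizes
clicks = [round(0.19 * n_pre), round(0.67 * n_post)]  # observed click counts
recipients = [n_pre, n_post]

z, p = proportions_ztest(clicks, recipients)
print(f"z = {z:.1f}, p = {p:.2g}")
# At anything like these sample sizes, a 48-point swing is far beyond chance;
# the interesting follow-up is ruling out confounds such as message timing.
```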

In experiment number two, new data was added to the school’s “success platform” algorithms as an A/B experiment.  Two groups of 200 students were randomly assigned to the trial, one set using Campus and the other not.  (It hadn’t been turned on for all students yet.)  The digital footprint left by Campus-using students included a lot of information that the success platform did not (or, more accurately, could not) access: Campus captures data on almost every action outside the LMS, whereas most success platforms measure behaviors only from within the LMS.

So, while the typical success system depends on the LMS for grades and scores, comparing them against other student behaviors and an individual student’s historical markers, this experiment added another aggregated score.  It folded in affective data from students clicking in and around the digital campus: making posts, liking posts, commenting, asking questions, and clicking into systems other than the LMS.  The result?  “At-risk” students surfaced about 2.2 weeks earlier than with the success system alone.  That lead time also meant getting in front of students before grades dropped, which any success adviser will tell you is usually the death knell for drops.
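To make the “aggregated score” idea concrete, here is a simplified sketch of the general technique: rolling weighted counts of non-LMS engagement events into a single number a success platform could ingest alongside its LMS-derived signals.  The event types and weights below are invented for illustration; this is not Campus’s actual algorithm.

```python
# Hypothetical sketch: weight non-LMS engagement events into one weekly score.
from collections import Counter

# Invented weights for event types captured outside the LMS.
EVENT_WEIGHTS = {
    "post_created": 3.0,
    "post_liked": 1.0,
    "comment": 2.0,
    "question_asked": 2.5,
    "tool_click": 0.5,   # clicking into systems other than the LMS
}

def engagement_score(events):
    """Sum weighted counts of one student's events for one week."""
    counts = Counter(e["type"] for e in events)
    return sum(EVENT_WEIGHTS.get(t, 0.0) * n for t, n in counts.items())

week = [
    {"type": "post_created"},
    {"type": "comment"},
    {"type": "tool_click"},
    {"type": "tool_click"},
]
print(engagement_score(week))  # 6.0
```

A score like this, tracked week over week, is the kind of leading indicator that can flag disengagement before it ever shows up in LMS grades.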

We hope you see why a conversation about the efficacy of the platform would have been really enjoyable.  Perhaps we can have that very conversation with you.

But, without tooting our own horn too much, it was incredibly reassuring to see four to five times more booth traffic this year, driven mostly by word-of-mouth from our current partners.  There is a reason we tripled in size since last year’s Educause, and there is a reason we will likely do so again before next year.  This stuff works.

We have plenty of science and studies behind the world-class design and infrastructure, but we also have more than 75 years of formal higher-education experience that helps our choices pass the “sniff” test as we move forward.

So as we come out of Educause, we hope you’ll take a look around the website.  Then shoot us an email or request a demo.  We would love to show you what everyone was so excited about in Chicago.  But most of all, we would love to help you help your students.  That’s why we do this, after all…
