Behold the good bot and more surprises from our undergrads

Our society hinges on the teachings in university classrooms. Thankfully, there's good news to report

[Image: students studying in a group with laptops. Credit: Flickr by pixbymaia]

Win some, lose some. I like to poke fun at our institutes of higher learning, especially when they spend grant money on real-world research like zombie safe zones. I suppose that’s funny if it’s not April 15.

But then I overheard an undergrad from a New York City university that shall remain nameless attempt to explain to his study partner that Tönnies’ Gemeinschaft-Gesellschaft dichotomy referred to two large German mine shafts that split apart from a single tunnel. That spawned actual pain right behind my cornea, as did the fact that study groups seem to have moved from libraries and coffee shops to bars. At least I can now immediately quiet my frustration with a third shot of scotch.

Today, at least, our college population has scored 50 percent with me. True, it was a failing grade when I was in school, but I’m looking at it as an even swap: abject fail vs. hopeful vision -- finally.

Academic underachievement

Our letdown comes from Texas A&M Galveston, where Professor Irwin Horwitz decided to fail his entire Strategic Management class due to unacceptable behavior that included lying, cheating, backstabbing, rumormongering, and physical threats. In a show of support typical of academic administrators, A&M immediately overruled the good professor, citing that the class had not yet ended and therefore no final grades could be assigned. The university then quickly removed Horwitz as instructor and replaced him with a minion of its own.

You’d think this would be cited as a resounding success story for real-world business education, and most of the InterWebs drew that conclusion when the story went viral. After all, Silicon Valley is today’s MBA mecca where lying, cheating, backstabbing, and baby chewing are considered baseline survival tools. Getting senior management to bail you out of a jam could even be seen as extra credit -- magna cum sleazy.

But the folks who painted these students as unsung academic heroes didn’t use the Internet to its full potential. If they had, they’d have looked up Prof. Irwin Horwitz on Texas A&M Galveston’s website and discovered that he’s an “Instructional Associate Professor” of not only pompous redundancy in job titles but also A&M’s Department of Maritime Administration. They’d have also seen a staff photo they might have mistaken for a birthday clown.

The course description reads as you would expect, full of terms like “top management decision making,” “industry analysis,” “organizational mission and objectives,” and “strategic implementation and control” -- exactly the kind of chum needed to attract tomorrow’s business sharks. Horwitz also cited “social responsibility,” which is probably where he lost it.

That’s not the epic fail. The fail is that he was teaching the art of Silicon Valley business to a bunch of people who might someday have to pull together to survive the high seas. A&M needs to realize that if those students take a three-hour cruise, they’ll wind up eating Gilligan, pimping Ginger, and exploiting the Professor for patents. Then again, if Somali pirates start hiring at the management level, they’re probably golden.

Bot breakthroughs we can use

Enough fail -- our success story comes from the University of Washington. In an uncharacteristically useful experiment that showed actual foresight, UofW researchers have tested the security measures of robotic surgery systems. I’ve whined about us speeding toward an Internet of completely unsecured things before, so knowing that a few academic luminaries are eschewing zombies and Satanic business practices to perform research in the interest of public safety gets a big thumbs-up from me.

Robotic surgeons aren’t currently slated for autonomous operation (I know, Google ... yet); they’re envisioned more as remote-controlled medbots for use in isolated areas or emergency scenarios like large natural disasters or battlefields. A doctor sits in her office and uses a computer interface to pilot the robodoc while it removes that irritating steel girder from your brain pan, perhaps even using her Apple Watch.

While these doctor drones are remotely driven by (hopefully) qualified humans, that doesn’t change the fact that they’re twirling Ginsu-sharp implements around your innards while being directly controlled by a computer. Now imagine if some 14-year-old gets bored with swatting and decides it might be more fun to cut into this feed and replace your nose with your colon. Or the NSA decides to add a script of its own to every invasive procedure that tucks a microtracker behind our livers -- for our own good. According to the researchers at UofW, neither scenario is difficult because the security around robotic surgical systems sucks, like blink-and-you-get-root-access sucks.

I’ve said it before: the Internet of things is to security as a politician is to shame. It doesn’t have any. However, that well-established fact and weekly headlines covering an endless variety of dastardly digital deeds have done nothing to slow down IoT. Maybe the thought of a Horwitz student grabbing control of that robo scalpel about to dig into your appendix will open some eyes. Go Huskies.
