We are all in the gutter, but some of us are looking at the stars. – Oscar Wilde
I confess that at times I feel whiplash from swinging back and forth between reviewing the new release of a tool and making broad prognostications about society and culture, but the Oscar Wilde quote at the top is a good backdrop to my feelings about that.
This past year has been a time of great change and turbulence in the tools, practices, and thinking around the rapidly shifting world of work. Social technologies have been a key part of that, but practices like remote work — thrust into the spotlight by Marissa Mayer’s decision to drastically curtail it at Yahoo — and new approaches to management occupied as much of the 2013 news hole here at Gigaom Research as the release of new devices and software.
As a result, like an astrologer looking into the canopy above with my feet in the gutter, I won’t spend much time discussing individual stars in the 2014 skies, but instead I’ve come up with four constellations where I expect to see a lot of action in the coming year.
The consumerization of work
One trend that has recently upended the business world has been the adoption of consumer technologies — and their architectures and user experience — either directly into business use or as a forcing function leading to redesign and replacement of older hardware and software.
This has been most obvious with regard to the adoption of companion devices like smartphones and tablets in the enterprise. This Bring Your Own Device phenomenon offers companies the possibility of real cost savings on purchase and provisioning of “computications” tools. But it also terrifies the risk-averse IT staff of most firms.
I wrote earlier this year that this trend might better be called Bring Your Own Mind, since we become so dependent on the companion tools we use that leaving them at home would be something like a lobotomy:
The capabilities of tablets and smart phones are growing so rapidly it is hard to even recall what it was like a few years back, using dumb phones. [...] These devices are a central aspect of personal productivity and identity. People want to choose these tools based on how they do their work.
So BYOD should really be considered a shift of the boundary where the company’s control over the way we work — which equates to the way we think — is receding.
BYOM is a trend that will accelerate — if that is even possible — in 2014, as devices become more capable, and a new generation of apps and services are rolled out that increasingly counter the security concerns of the CIO.
The deepest and broadest aspect of the consumerization of work is the central role that file sync-and-share — as implemented by a growing list of companies like Box, Dropbox, Hightail, Intralinks, and SugarSync — now plays in the way that work is done. I have written a great deal about what I call the distributed-core architecture (see “Hightail raises $35M: the file sync-and-share market is red hot“), where file sync-and-share acts as a virtual distributed file system. This is, in fact, plugging a hole in the operating systems that dominate our world today (OS X, Windows, iOS, and Android), all of which treat the web basically as an afterthought.
These and related trends will continue to converge on an increasingly consumerized workplace and workforce, where consumer technologies and practices will displace entrenched alternatives. Consider Salesforce’s decision to drop its own file sync-and-share application, Chatterbox, to partner more closely with Box (see “Salesforce drops Chatterbox, announces Salesforce Files“), as just one past example.
(Disclosure: Hightail is backed by Alloy Ventures, a venture capital firm that invests in the parent company of Gigaom Research, Giga Omni Media.)
Dominance of mobile OS and the emergence of social OS
The real growth area for hardware continues to be companion devices (smartphones, tablets, and wearables), and so we are moving to a mobile-first world of work, as well. Already we have seen groundbreaking consumerized work-related products rolled out first or exclusively on mobile devices, including Mailbox on iOS (whose parent company, Orchestra, was acquired by Dropbox almost immediately) and Anchor by Tomfoolery.
So we can expect that the most innovative and disruptive notions will appear there first.
I am expecting to see someone roll out a phone in 2014 where the distributed-core file system is the file system built into the phone’s OS, where sharing of files, folders, and messages is a built-in aspect of the system, not a bunch of apps loaded later. This will most likely occur as some variant of Android and from a small-fry upstart like Jolla. But it could also be from a market disruptor like Amazon, which is rumored to be planning a smartphone roll-out in 2014.
Consider the disruptive power of social connection built into the operating system, as opposed to 10,000 siloed apps. And consider also the explosion of productivity that came from the emergence of email standards that allowed interoperability of email back in the 1990s. We’ll see some rumblings of this tectonic shift in 2014, I’ll bet.
Quantified self and the “me-ization” of productivity and performance
The Quantified Self is a trend precipitated by consumerized companion devices and the growing ability to track personal performance on a daily basis. Originally kicked off by the desire to track physical activity (e.g., the Nike FuelBand and other dedicated devices), this trend is becoming more mainstream as smartphones and other wearables become capable of tracking location, motion, velocity, and whom we are interacting with.
Consider the work of Sandy Pentland, the director of the Human Dynamics Laboratory at MIT, who has pioneered the tracking of people geographically within business offices using customized ‘e-badges’ that transmit data:
As he [Pentland] described last year in the Harvard Business Review, he tried the badges out on about 2,500 people, in 21 different organizations, and learned a number of interesting lessons. About a third of team performance, he discovered, can usually be predicted merely by the number of face-to-face exchanges among team members. (Too many is as much of a problem as too few.) Using data gathered by the badges, he was able to predict which teams would win a business-plan contest, and which workers would (rightly) say they’d had a “productive” or “creative” day. Not only that, but he claimed that his researchers had discovered the “data signature” of natural leaders, whom he called “charismatic connectors” and all of whom, he reported, circulate actively, give their time democratically to others, engage in brief but energetic conversations, and listen at least as much as they talk.
His group is developing apps to allow team members to view their own metrics more or less in real time, so that they can see, relative to the benchmarks of highly successful employees, whether they’re getting out of their offices enough, or listening enough, or spending enough time with people outside their own team.
Pentland’s work shows the potential for tools of this sort, but the advance is more likely to come from individuals opting to use low-cost apps on conventional companion devices, with the community’s data pooled so that everyone can get at the key factors for being more creative and productive, and for learning the skills of becoming a charismatic connector.
It will arise from an almost obsessive “me-ization” around productivity and performance rather than from company-imposed tracking of the sort that Pentland applied, I think, but it might be both at once.
Algorithmic science displaces folklore: AI in the workplace
It’s astonishing to realize how bad the results are from many common, everyday business practices. Consider hiring: Even a company like Google, filled with some of the world’s smartest people and nearly unlimited resources to spend on hiring, had for years applied a notorious brainteaser-based approach to winnowing out job candidates that, in point of fact, did a lousy job of predicting future performance.
Ultimately, sanity has prevailed, and they no longer ask people, “How many basketballs can fit in a school bus?” or the like. Their own research found that riddles don’t work, and also that no one was particularly good at hiring, as outlined in this piece:
Years ago, we did a study to determine whether anyone at Google is particularly good at hiring. We looked at tens of thousands of interviews, and everyone who had done the interviews and what they scored the candidate, and how that person ultimately performed in their job. We found zero relationship. It’s a complete random mess, except for one guy who was highly predictive because he only interviewed people for a very specialized area, where he happened to be the world’s leading expert.
No surprise, perhaps, that it’s better to make hiring more formalized and oriented toward “behavioral interviewing” where candidates are asked questions that place them in a specific context and where they spell out their perceptions, like “give me an example of when you solved an analytically difficult problem.”
The reality remains that the conventional approach to interviewing people for jobs is a total mess: complete folklore, and ungrounded in any predictive way.
However, there is a great deal of new science showing that analytic tools can do a very good job of predicting success in well-defined domains, so long as a large body of data is available for algorithmic analysis. For example, Gild has developed software that can read the open-source submissions of programmers, evaluate how good the code is likely to be, and cross-correlate that with the programmer’s social media involvement. Based on the manner in which other programmers treat a candidate, and the first-order assessment of their coding, Gild has pretty good success determining who is the real deal and who is the wannabe.
Most importantly, even for those who have not contributed to open-source projects, Gild can make inferences about a candidate with considerable certainty simply based on the way other programmers relate to the candidate.
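As a sketch of the idea (not Gild’s actual model; the feature names, weights, and function below are invented purely for illustration), a candidate score might blend a first-order code assessment with saturating peer-interaction signals, so that community standing can compensate for a thin portfolio:

```python
# Toy illustration of blending code quality with peer signals.
# All features and weights here are hypothetical, not Gild's model.

def candidate_score(code_quality, peer_mentions, review_requests):
    """Weighted blend of a first-order code assessment and social signals.

    code_quality    -- 0..1 score from analysis of the candidate's public code
    peer_mentions   -- how often other programmers reference the candidate
    review_requests -- how often peers ask the candidate to review code
    """
    # A saturating transform: a handful of strong peer signals counts,
    # but raw volume alone cannot dominate the score.
    def saturate(x, scale=10.0):
        return x / (x + scale)

    return (0.5 * code_quality
            + 0.3 * saturate(peer_mentions)
            + 0.2 * saturate(review_requests))

# A candidate with solid code and strong peer signals outranks one
# with slightly better code but no community footprint.
a = candidate_score(code_quality=0.7, peer_mentions=40, review_requests=25)
b = candidate_score(code_quality=0.8, peer_mentions=1, review_requests=0)
print(a > b)  # True: the peer signals dominate here
```

The saturating transform is the design choice worth noting: it mirrors the intuition in the paragraph above, that *how* other programmers relate to a candidate is informative even when the candidate’s own contributions are sparse.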
Obviously, these capabilities could be directed at the developers within a company, to identify the plodders and the superstars, as well as at candidates. And increasingly, people will be less involved in the nuts and bolts of evaluations like this, leaving it to “engines of meaning” — that is, AI and big data algorithms — to operate in a spiderish, bottom-up way, and we’ll put the top-down, cognitively biased approaches of the past in the trashcan, and click on “empty.”
A final thought
Things are changing so quickly we may start to suffer from something like the optical illusion that comes from looking at a fast-moving car’s hubcaps: At certain speeds, they can appear to be rolling backwards. The equivalent in our case is to start to imagine that just because we are accelerating quickly these days we can see farther into the future than we could in the past: that the future is closer just because we are moving faster. Lamentably, this is not the truth. It is just sooner, not closer.
And in fact, we have to accept the increased levels of risk inherent in hurtling blindly into an unknown future, because the old techniques didn’t work even when everything was moving 10 times slower.