More tangible, less “smart”

First came the prediction: “The whole ‘smart’ thing is about to get real.” Really? What made this apparent insider say so? The prediction, given in a forum where other predictions flew in the breeze, was on the record. Then two months later, it had gone off the record.

What happened? Apparently, there had been a corporate-wide retreat from “smart” — as explained to me, a retreat from the aspirational to the tangible.

That’s unlike conventional marketing, which blows by the tangible in its rush toward aspiration. Does this portend a wider retreat from the term “smart”? Is it a shift toward a market that prefers the tangible over the dream? Or is it just dumb?

Tableau is the new Apple again

Fourteen thousand people looked on earlier this month as Tableau’s new CEO, Adam Selipsky, plodded onto the stage. The stage was 10 times wider than at the first conference, the audience 70 times bigger, and the CEO not even a quarter as fiery as the first one. But he seems to be a good fit for Tableau’s new era.

I’ve watched this show every year since 2008, when founding-CEO Christian Chabot paced the 20-foot stage and put on his first tent-revival style keynote. Back then, I talked to many of the 200 or so attendees in the jammed hallways at Seattle’s Edgewater Inn and heard story after story — most of which went something like this: We had data we didn’t understand, and then someone in our group said we should try this funny tool he’d downloaded. By the end of the day, we found something we had to tell the boss about.

Chabot’s pitch reminded me of Steve Jobs just after Jobs returned to Apple, and I wrote here that Tableau was “the new Apple.” The whole conference exuded warmth and humanity. Even the food was good.

If Chabot was the new Steve Jobs, Selipsky is the new Tim Cook. The question now is whether the product’s underlying humanity is safe with Selipsky. Is the product’s essence — what Selipsky has called “the golden egg” — safe from corruption by newer, bigger market pressures?

Until now, we felt assured that the three founders stood watch. No matter how they expressed the Tableau essence, they were always watching and guiding. But none of the three even showed up to the conference this year. They’ve ordained an executive who’s more suited to manage Tableau at its new maturity than I assume any of those three are.

At a Q&A session for industry analysts, I asked how he could assure Tableau users that the “golden egg” would remain intact.

Excerpts of his answer:

The honest answer is I’m not 100% sure. It’s the hardest job that I have. I feel that very personally. … There are some things you know. You know when you understand the product. You know when you understand how we go to market. How do you know when you understand the culture and what is valuable about it and what also needs to evolve about it? It’s something that has been an acute topic of thought for me over the past 13 months. … I kind of [feel like I’m in] a room where [we are] moving furniture around and doing a bunch of remodeling in the room. And there’s a golden egg somewhere in there. And we’re not 100 percent sure where it is; you can’t always see it. The thing we want to do is to make sure we don’t smash it as we’re doing a bunch of remodeling.

He’s sincere. He’ll identify the egg and respect it. A committee that chief product officer Francois Ajenstat described to me will care for it, cultivate it, and ensure that it has food, water, and warmth.

The funniest part of all this is that there never was any such egg or essence. I’ll bet that from the beginning the Tableau priesthood just felt their way in the dark. The product, the company, the competition, and the market are all too complex and too subtle for anything else. Those who look for any kind of true north will go crazy as poles shift.

In time, all that will remain of Tableau the company is yet another business story of innovation, disruption, and a final decline. The real, enduring legacy will be that, thanks to Christian Chabot, Chris Stolte, Pat Hanrahan, Jock MacKinlay and others, the data in those stories can be visualized, beautifully and meaningfully — with some tool or other.

So carry on, Adam Selipsky. Build a still bigger, higher, more secure nest. Next year, try to plod over more of the stage, deliver your lines better, and make sure the welcome reception serves edible food.

A lucky dog and what he showed about doing good with data

Jasper the dog’s story, in which luck turned from bad to good, was a clear-cut case of doing good with data.

He was found along a road in a rural part of Southern California dragging his back half in the dust. He’d been injured. Someone brought him to a shelter — which is where his luck might have turned very bad. He could have spent a short stint on pain meds as he waited for adoption, then a quick shot of pentobarbital.

Jill Dyché told the story at the recent Pacific Northwest BI and Analytics Summit in her presentation on doing good with data — which arose from her work with Los Angeles-area dog shelters. By day, she is SAS vice president of best practices and at other times she is an advocate for modern data practices to improve dog rescue.

Bad KPIs

You can imagine how a shelter manager would figure Jasper’s fate, based on no data, only impressions from years of running barebones operations: He couldn’t walk, he wasn’t photogenic, and he had a high risk of post-traumatic stress. To those sad KPIs, the manager would add the cost of patience: He occupied scarce cubic yards of kennel space, ate a cup of food twice a day, drank precious California water, and required a friendly pat once or twice a day. Even so, the dog was in a “great mood,” as Jill described him. The shelter gave him a break.

Real luck struck, though, when Jill did a video of him, one of many she’s done since 2014 and posted on Facebook. Jill sits on the grass with a dog, extolling the dog’s playful friendliness, high IQ, and good looks. Why bother? Because videotaped dogs get adopted far more often than those who aren’t — contrary to belief in the shelter community. She has data to prove it.

Good attracts more good

Does the cool-eyed business community care about doing good with data? Apparently it does. Several of the assembled two dozen data-industry experts chimed in with stories of the appeal. Chris Twogood, vice president of marketing at Teradata, told how several candidates for a high-level position wanted to know about the company’s program. Josh Good, director of product marketing at Qlik, told how Qlik people donate their data skills for good causes. Others had similar stories.

The rough and tumble of doing good

Jill broadened out from dog rescue to find other ways organizations were doing good with data. She found four main categories — with examples of doing good that are not as clear-cut as Jasper’s.

• Organizations have been using data to rethink old problems, such as declining fisheries. The Sustainable Fisheries Group, based at University of California Santa Barbara, aided by Stanford ChangeLabs and funded by the Rockefeller Foundation, hopes to help thousands of small fish harvesters improve their practices. As wild salmon stocks decline, for example, these business people can get help finding other fish to harvest as salmon take a rest.

• Give one, get one. Buy a pair of Tom’s Shoes and another pair goes to someone in need of shoes who can’t afford them. Smile-Squared gives toothbrushes away. Soapbox, selling “soap that matters,” uses analytics to monitor its suppliers as it gives away bacteria-fighting soap around the U.S. and the world.

• Profit for good. Elon Musk, though a villain to some, supports open data, such as data in support of solar power. On the other hand, there’s Indian gaming and its benefits for about as many people as can crowd around a craps table.

• Government — for me the most interesting. This is where the rough and tumble of city politics really puts “good” to the test. The nascent “smart cities” movement, which puts networked sensor data to work on city priorities, forces tricky questions: What exactly are the priorities and, having decided them, what data do we look at? Traffic flow? Sustainability? Neighborhood sociability and good restaurants? In my own favorite and unfavorite city, Berkeley CA, they say the hell with good traffic flow. Let’s have good food! I’m all for that except when I’m driving across town.

It’s all so complicated. Our love for Jasper and the goodness he means to us is so much easier than the “good” of most other things. Still, we have no choice but to try. As I think I’ve heard Jill say, one dog at a time.

Data: it’s just notation, not reality

The always fascinating Donald Farmer, former Qlik exec and now Treehive Strategy principal, has news for the data business: “Data isn’t the real world.” It’s just a reflection that’s framed by stories we tell ourselves.

Stories come first, contrary to the data industry’s dubious vision. Data, the marketing likes to imply, is a divine compass from a virginal birth. Just get some and you’ll know the way.

There was no virginal birth for data, but, as Donald Farmer illustrates, there is jazz. In his presentation to two dozen industry experts at this year’s annual Pacific Northwest BI and Analytics Summit, he offered this experiment: force a John Coltrane song into musical notation, then give it to 10 jazz musicians. They’ll produce 10 different songs, and not one will be Coltrane’s.

“People say we’re recording [business events],” said Donald, “but we’re not. We’re notating it. It’s a representation. Sampling is more like it.” He’s not the first one to say such things, but it takes someone of Donald’s authority to win much notice.

Data needs interpretation, and that’s always based on assumptions. “When we say we ‘lost an opportunity,'” he said, “that’s just a story we tell ourselves.” Sales people often come back from meetings gloomy about lost sales. “They say, ‘I’m going to miss my quarterly target, or my girlfriend will leave me because I couldn’t give her the vacation I promised.’ We think that’s the real world.”

The “lost” sale may not be lost for long, such as when the prospect comes back in six months after the competitor failed to deliver. The salesperson may also cultivate a trusted-advisor role and win in the long run. And the girlfriend leaving just because she couldn’t go to Cancun, well, maybe that’s a good thing.

Even in IoT (Internet of Things), what’s assumed to be pure data, hot off the sensor, was configured based on beliefs. What is “just a binary signal” is limited, for example, to a given spectrum.
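Farmer’s point can be sketched in code. The toy example below (entirely hypothetical values and function names, not any real sensor API) shows how even a “pure” binary signal is a product of configured beliefs: the sampling interval decides what gets discarded, and the threshold decides what counts as an event. Change either assumption and the same world is notated into different data.

```python
# Toy illustration: a "binary" sensor signal already embeds configured beliefs.
# The threshold and sampling interval are hypothetical choices, not ground truth.

def to_binary_signal(readings, threshold=25.0, every_nth=10):
    """Notate raw analog readings into the 'pure' binary data a pipeline sees.

    threshold -- the belief about what counts as a meaningful event
    every_nth -- the sampling rate: everything between samples is discarded
    """
    sampled = readings[::every_nth]          # sampling throws information away
    return [1 if r > threshold else 0 for r in sampled]

# A fabricated analog trace, notated under two different sets of assumptions:
raw = [20.0 + (i % 17) for i in range(100)]
print(to_binary_signal(raw, threshold=25.0, every_nth=10))  # one story
print(to_binary_signal(raw, threshold=30.0, every_nth=25))  # a different story
```

Neither output is “the real world”; each is a representation shaped by whoever set the configuration.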

What’s a business person to do? Farmer suggests that data users “walk back down the ladder” and inspect any unconsciously adopted limits. There, on the lower rungs of the mind, you might find unfounded assumptions, stories, and alternate premises.

Donald’s observations stirred up concerns, of course. Suzanne Hoffman, veteran BI software executive now with ZenOptics, asked about the effect of too many individual interpretations. “That’s chaos,” she said. “You can’t have that.” Donald replied that that’s just competitive advantage: “Businesses do things in different ways,” he said. Suzanne: “Isn’t the goal of methodology to accept thinking ‘outside the box’?” Donald: “Methodology can get in the way of doing that.”

Merv Adrian, vice president of research at Gartner, said, “It’s the difference between implicit and explicit…We live every day in the implicit set of choices and the ideology that represents. … If we can deconstruct how we got here, we might make different choices.”

Ideology is embedded even in the design of analysis tools. Tableau makes certain things easy for its assumed users, skilled analysts (at least according to Qlik dogma). They are different from the users Qlik assumes it serves, everyday business people. Qlik’s users, less skilled in analytics, won’t have to face statistics-laden trend lines, Donald explains, though he hasn’t yet said what Qlik offers instead.

The first half of Donald’s forthcoming book will go into far more depth on the subject; the second half will address handling ambiguity. He expects it to be out in the second quarter of 2018.

Data lake: compositional or architectural?

Is the data lake following the typical path for new technology? Merv Adrian, research VP for data management and integration at Gartner, was talking about data lakes and big data projects at the just-concluded Pacific Northwest BI and Analytics Summit. Josh Good, senior director of product marketing at Qlik, asked the question.

Merv’s answer:

That’s a terrific question. We’re talking about a phenomenon of some recency which is the notion of the new platform sell. [It’s] not a new application, not a new function, but a new platform designed to replace existing ones or supplement them (usually the first until they figure out that’s not practical). And that, I think, is the larger market failure … or the blunting of the thrust that there’s this new opportunity to build new platforms.

I’m relatively convinced that people coming into the market now are not thinking about the replacement of the end to end. They are looking for parts. If they’ve gotten at all sophisticated or knowledgeable about how to achieve the outcome that they presumably have defined, then they have put together in their head at least some sort of chart they can draw on the wall, which is a bunch of boxes that connect to one another with flows, and they’re identifying the APIs among them.

That’s becoming an issue especially as we move to the cloud and people start talking about services-based architectures and are thinking about the way they want to get to where they want to go is a composition exercise, not an architecture one.