Month: January 2010

Marco looks to BI for help

My friend Marco’s spam-bait operation was down last year, and he’s been asking me what business intelligence can do for him. He had just read one of TDWI’s promo emails last night when he called me again.

“I like Vegas. Should I go?” he asked from somewhere that sounded far away. I said it all depended on what he wanted to learn. Is making sense of his data important? If yes, go. But there seemed to be more to his question.

He’s gone through one shady business after another since the early ’60s, when as a teenager he sold drugs on the street. Now he sells fake email addresses in huge blocks to Eastern European spammers. All his customers have had a good education, he tells me, yet most retain some of their families’ traditional ways. He describes them picking over his blocks of email addresses as if over oranges in a bin, rejecting one, taking another. They seem to rely entirely on feel, and Marco makes sure each new batch feels “fresh” and authentic year after year.

“Cool. My data’s real, real important to me,” said Marco. “So’s my know-how, my experiments, my research. Those experts in Vegas dish on how to manage all that, man?”

Definitely the data, I said, but not much on the qualitative end of his research. He was disappointed. 

“You know, you got me going on this insight thing, man,” he said. “And then you change the story. This business intelligence takes care of only some of my insight? Only some of it? What do they think, data’s the only way you get insight?”

He had a point. I thought fast. I said he should think of his operation like a speeding car. He liked that. I said he needed a “dashboard” to let him know how he was doing. He liked that, too. There was a course on Tuesday, I said, all about that.

“Cool, man. But what about my research? I got these journals I keep with my results and theories and shit like that. What about all that? I keep losing track of it all.”

I said I thought he was talking about knowledge management or something. 

“Yeah, that sounds like what I want. Knowledge management. They don’t do that there?” 

I explained that data was this event’s main focus. Other events … but he cut me off.

“No, man. Here’s what it’s about,” he said. “It’s all about marketing. I don’t know much about business intelligence, but I bet that every benefit, feature, whatever comes from a different tool. Each comes from a different vendor,” he said in a tired sing-song, “and the producers of this event have a line on a certain kind of vendor. To protect their game, they make up a category. Get hip, man. It’s always like that.” 

He quickly added, as if he had already bored himself, “How’s the food there? Can a guy score somethin’ to eat?”

The best Caesar’s can offer, I said. Then he had to go answer the door. I heard urgent knocking.

Bring in the shrinks for decision analysis

Now comes the hard part in business intelligence: figuring out how the humans can make better use of all our data and tools for decision making, writes Wayne Eckerson, director of TDWI Research. Let’s bring in the shrinks.

When Wayne points to a trend, it’s news even if others might have already foreseen it. He’s one of the industry’s most thoughtful observers, and one of the most deliberate.

In Tuesday morning’s blog post, he suggests improving BI by enlisting those who study how people make decisions.

To take BI to the next level, we need better insights into human behavior and perception. In other words, it’s time to recruit psychologists onto our BI teams.

He gave an example of one place that could have benefited from visits to the shrink’s couch.

A recent article in the Boston Globe called “Think Different, CIA” provides some instructive lessons for companies using BI tools to make decisions. The article describes a phenomenon that psychologists call “premature cognitive closure” to explain how humans in general, and intelligence analysts in particular, can get trapped by false assumptions, which can lead to massive intelligence failures. It turns out that humans over the course of eons have become great at filtering lots of data quickly to make sense of a situation. Unfortunately, those filters often blind us to additional evidence — or its absence — that would disprove our initial judgment or “theory.” In other words, humans rush to judgment and are blinded by biases. Of course, we all know this, but rarely do organizations implement policies and procedures to safeguard against such behaviors and prevent people from making poor decisions.

See his full post here.

Be sure to see the comments, too. He writes in reply to questions, “Like data governance, we need some principles for approaching and managing decisions. Maybe we should start a decision governance institute!?”

I can’t help noticing: an institute.

See “CIA’s insights on the psychology of analysis” on Datadoodle.

“Streetlights and Shadows”

Some of the books Stephen Few reviews may at first glance seem to have little to do with data analysis. On second glance, though, they have everything to do with it. He often goes into the essence of thinking, insight, and decision making — core knowledge for BI practitioners.

See his latest, posted yesterday afternoon, on Gary Klein’s Streetlights and Shadows.

Mapping the many faces of “retention”

Everybody knows what “retention” means until they have to design a metric. Ken Rudin, once of LucidEra and now general manager of analytics at the games site Zynga, thought that he and his team could “put something together” quickly — but it actually took “four solid weeks of discussion and debate.”

About 50 million people play Zynga games every day. It’s the leading online social gaming platform, according to Ken, and it’s grown from zero in 2007 to annual revenues of “a few” hundred million dollars. Every day, the company captures 20 to 30 billion records of data, and Ken and his team use that data to improve revenue, viral marketing — and customer retention.

Zynga players play free. The revenue comes in a few dollars at a time for “virtual goods.” In the popular game FarmVille, for example, a player might get tired of the old-fashioned plow. The tractor upgrade costs $2.

“There are tons of different ways you can think about retention,” he laughs, “and which one should we use?”

How do you know when a customer has left? “Unless we get a note saying, ‘Hi, we’re no longer playing,’ how do we know?”

Of course, no player’s going to make it that easy, so how long should Zynga wait before considering the player gone? A week? A man could have dropped his virtual pitchfork for a real vacation — or he could have plowed the last row.

Ken dealt with analytics all the time at LucidEra, but games were new to him. He’s learned a few things.

“It turns out, as you might imagine, that it depends on the game,” he says. The average simulation-game player tends to visit frequently, for example. Poker players, though, are much more likely to come back after, say, a three-month gap.

The retention curve also varies by the length of each player’s tenure. A new player who stays away 30 days is much less likely to return than a player who’s been at Zynga for years. Ken now puts users in three basic tenure buckets: “new,” “mature,” and “elder.”
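Ken’s two ideas — tenure buckets and game-dependent churn windows — can be sketched in a few lines. This is a hypothetical illustration only: the cutoffs and windows below are invented for the example (Zynga’s actual thresholds aren’t given in the post), and only the bucket names “new,” “mature,” and “elder” come from Ken.

```python
from datetime import date

# Hypothetical tenure cutoffs in days; the labels are Ken's, the numbers are not.
TENURE_BUCKETS = [(30, "new"), (365, "mature"), (float("inf"), "elder")]

# Hypothetical inactivity windows per game type. Poker players often return
# after long gaps, so their churn window is much longer than a sim player's.
CHURN_WINDOW_DAYS = {"simulation": 14, "poker": 90}

def tenure_bucket(first_play: date, today: date) -> str:
    """Classify a player by how long they've been at the site."""
    tenure_days = (today - first_play).days
    for cutoff, label in TENURE_BUCKETS:
        if tenure_days <= cutoff:
            return label
    return "elder"

def is_churned(last_play: date, today: date, game_type: str) -> bool:
    """A player counts as churned once their inactivity gap exceeds
    the window for the kind of game they play (default 30 days)."""
    gap_days = (today - last_play).days
    return gap_days > CHURN_WINDOW_DAYS.get(game_type, 30)
```

For example, a 45-day absence would count as churn for a simulation player but not for a poker player — which is exactly why “retention” took four weeks to pin down.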

Whatever question you try to answer, the answer has to be actionable. “There are metrics, and there are metrics that matter,” he says. If volume plunges, were the missing players mostly new ones? If so, it could indicate frustration; perhaps the games need better tutorials or less functionality at the beginning. Or were most of the missing the long-term customers? If so, perhaps the games haven’t offered enough challenge.

Ken expects growth when the economy improves. “When we look at what happens over holidays, such as July Fourth and Thanksgiving, usage really drops. Then it picks up as people go back to work,” he says. “[The games] are part of their routine. On vacation, players break their routines. They sleep late and spend more time with family. They don’t play the game.”

“It’s fascinating,” says Ken. “In analytics, so much of the problem is figuring out what the question really is.”

I think he means that it’s a great game.

Rolling heads can’t think

Wolf Blitzer calls for heads to roll after the Christmas Day attack. But Jill Dyché is a data pro, and she’d rather let the heads think.

“Who should get fired?” is the same conversation that follows screwups in corporations, writes Dyché, principal at Baseline Consulting.

Instead, the government should be addressing process issues. Indeed, the real conversation should be how to move forward. These questions should be asked now: “How should we bring identifying data together? What are the key sources? How should integration, access, and usage policies be formulated? What would a sustainable process look like?” Those questions aren’t “who” questions, they’re “how” questions, and they should be front-and-center in the national security conversation.

Read the full blog post.