Case

Damballa: A Startup Horror Story

In 2006, Merrick Furst was the undergraduate dean of the College of Computing at Georgia Tech. He’d had a remarkable career: he was one of the co-inventors of probabilistic circuit analysis, was dean of the graduate program in computer science at Carnegie Mellon, and then president of the International Computer Science Institute at UC Berkeley, before moving to Atlanta. Furst also had an entrepreneurial bent: in between academic stints, he’d founded several companies. The most notable was Essential Surfing Gear, which was likely the first company to provide apps for web browsing; it was sold in 2000.

In 2005, eBay’s chief information security officer (CISO) Howard Schmidt visited Georgia Tech for a board meeting. Schmidt wasn’t just a CISO; he had served at the White House as a cybersecurity coordinator in the Executive Office of the President, where he was known as one of the foremost experts in cybersecurity. This was early enough in the history of the web that the term ‘CISO’ may not yet have even been coined. Furst met with Schmidt during his visit; he was excited to talk to Schmidt about some new cybersecurity technology researchers had been working on at Georgia Tech.

At the time, computer viruses were mostly viewed as annoyances. Sure, they slowed down computer performance, popped up obscene messages, or crashed individual infected systems. But they weren’t regarded as professionalised threats — most viruses of the time were not designed to steal identities, auth credentials or financial information, nor were they built by criminal groups or state actors. Many viruses were the work of hobbyists — ‘script kiddies’, as the term went. But there were early signs that a new threat was materialising, and the team at Georgia Tech had noticed. What Furst wanted to show Schmidt was a solution to this emerging threat. The group had observed that a variety of cybersecurity attacks were increasingly being perpetrated through networks of compromised machines. The name they initially used for the threat was ‘bot armies’. The name that eventually stuck was ‘botnet’. Already, botnets were taking control of vast numbers of computers without their owners’ knowledge, and the malware that made up the botnets was becoming capable of increasingly problematic attacks at the bidding of sophisticated ‘bot-masters’.

The fact that a settled name did not yet exist was a demonstration of how early this all was: to most of the folks doing business on the Internet, this seemed like something out of a cyberpunk novel. They barely knew the shape of the threat, or the potential danger.

Schmidt understood the threat immediately. He thought that Georgia Tech’s solution was relevant to eBay’s commercial interests, and invited Furst out to San Jose to present. Years later, Furst and his business partner Matt Chanoff would write:

The meeting seemed like a spectacular success. Howard and his team already knew that botnets were busy ripping off eBay and its customers. Computers around the world were already impersonating humans and, for example, setting up fraudulent sellers and posting fake reviews so that buyers would trust them. Howard described this as “trust fraud.” They were also subverting the advertising revenue model with fake clicks. At least as worrying as all that, bots were appearing on internal eBay computers and doing who-knew-what.

And they were ubiquitous, estimated at the time to be lodged on 17 percent of all computers worldwide (emphasis added). The Georgia Tech team’s technology (…) appeared to be a revolutionary solution for a huge and promising market.

Howard and his fraud team did some calculations right in front of Merrick and said, “If you can stop this kind of trust fraud, it can save eBay $40 million per year. How much will you sell it for?” (emphasis added) Merrick, who didn’t have an actual product yet, let alone a pricing plan, did what experienced entrepreneurs do—he made up a plausible number and said, “$150,000 per year or so, to start.” Howard jumped on it. His next question was “How soon can you deliver?” (emphasis added)

To Furst, Chanoff — and eventually their investors — this was clear proof of demand. eBay wasn’t the only company that responded eagerly: the Georgia Tech team heard similar things from dozens of prospective customers. Even before they formed a company to commercialise the tech, they sold a rudimentary data feed to a large security company for $100k a year.

Furst and Chanoff founded Damballa in 2006. They negotiated IP rights from Georgia Tech and got started converting the tech into production-ready software. Furst became the company’s initial CEO, with the intention of handing it off to other execs as the company gained momentum. Thanks to anecdotes like eBay’s, and Furst and Chanoff’s illustrious backgrounds, the Damballa team raised money easily and on good terms: their initial raise was $2.5 million on a $5 million valuation — considered remarkable for 2006. The two VC funds they raised from began helping Furst and Chanoff with building out the team. 

Six months later, Damballa was ready with a product for eBay. They turned up at the company asking, in effect, “Who should we talk to, and where do you sign?” But then, strangely, eBay began dragging its feet. Schmidt delegated the project to a subordinate. There were many polite conversations that never led anywhere. The signs of demand — so strong, so remarkable, so clear at the beginning — suddenly seemed illusory. Damballa never sold a trust fraud or click fraud solution to eBay … or to anyone else.

More than a decade later, Furst and Chanoff would reflect on what happened next:

Everyone at Damballa believed that all the elements that made up demand were in place. The product would save customers a large amount of money, it worked the way they needed it to, and we had a competent team and sufficient capital to operate. Most important, we all had a fixed idea in our heads that we never questioned: companies would not tolerate their machines being compromised. Bots hiding secretly on the company computers led to all sorts of risks. Click fraud, trust fraud, stealing passwords, eavesdropping on company emails, stealing proprietary data or customer information—we made up examples, and we heard examples from people like Howard. We didn’t feel stuck. Even as sales lagged expectations, we always felt that we saw the problem and could move forward by fixing it. Maybe our software increased processing time. Maybe putting third-party software like ours inside customers’ firewalls was too risky for them. Maybe putting it outside their firewalls made them feel vulnerable. Maybe the particular examples of bad things bots could do weren’t hitting home and we had to change the marketing.

These were all obvious, reasonable actions. The team found a replacement CEO and Furst moved himself to the board. Over the next several years, as sales to other companies fell well short of expectations, internal company and board conversations kept revolving around the same issues. They did everything a company on the cusp of a major breakthrough would do: they replaced key management personnel, improved the product, raised more money — repeatedly, over many years, eventually deploying $69 million in venture capital. The technology was novel, the target market was large and lucrative … what was wrong?!

As management and the board worked to get the company on track, they addressed all the conventional issues. They believed in a very clear idea of why there should be demand, and worked on the basis of that belief. They thought the customers were compromised by bots and that they would buy things to fix that — that not buying was unthinkable, because companies couldn’t allow themselves to be compromised. That view stayed at the root of all the company’s plans and tactics, and it didn’t budge. Internally, there were variations on that basic belief. Some people thought that money was the issue: customers would buy because being compromised cost them money. Others thought customers would be afraid that their own customers would be scared away or displaced by bots. Still others thought risk was the issue: customers would buy because they were otherwise vulnerable to fraud allegations.

With hindsight we can see that these just aren’t effective ways to understand customer demand. The right question ought to have been, “What ever gave us the impression that eBay would be a customer?” On what basis did we believe that our preferred value proposition would actually drive sales? (emphasis added)

Furst would later describe those years as “living in a waking dream.” In The Heart of Innovation, the 2023 book he co-wrote with Chanoff that answers many of these questions, Furst wrote that conversations during this period were extremely painful, because everyone felt like they knew what they were talking about and yet nothing they said had reliable predictive power. Even when they disagreed, their core premises were all the same: “Customers bought due to a value proposition. They bought based on certain properties that they possessed, and that the product possessed.”

But Damballa’s value proposition was perfect, and yet it did not lead to sustained demand. Furst observed that such logic didn’t work — at least not universally — which meant that it wasn’t very useful.

In the end, through a great deal of hard work, Damballa grew to $12 million in annual sales. This was itself some signal of demand, though not the level the founders and investors expected. In an interview with James Altucher in November 2023, Furst reflected (emphasis added):

Furst: What [Damballa] did sell into — and this is me making up a little bit of the story, but I have some experience here — [Damballa’s sales reps] would go to companies like Federal Express, and they would say: “We think there are machines on [your] internal network that are compromised. A hundred percent of the time when we’ve put in a board on the internal network and monitor traffic, we’re able to identify machines that are fraudulently controlled.” And people would say: “Well, I don’t really believe it.” And so we’d say: “Okay, well, can we put in a board, and you’ll pay us some amount of money at the end of the month, and then you can decide if you want to buy?” So it’s like proof of concept. At the end of the month, one hundred percent of the time, we’d be able to show them that their networks are compromised. And everybody who’s working in [Damballa] and [who invested] is sort of convinced, well, if you really can show every large company that they’re compromised, every large company is gonna buy from you. That just seemed impossible to imagine that people would be okay with that. But there were times when our [chuckles] salespeople would install one of these boards, come back, like, two or three weeks later and say: “Okay, you have a treasurer who's got a machine that’s hooked up to the bank accounts of the company, and his credentials are being transferred in plain text to bad guys in the former Soviet Union.” And they would go: “Oh, my gosh, that’s crazy. That’s crazy,” and they still wouldn’t buy.

Altucher: And why was that?

Furst: No, so I’ll go the other way around. You’re asking the wrong question. You’re imagining that there’s like, “Why don't they do it?” I’m just saying it’s obviously okay for them not to do it because their company is doing just fine now, even though their treasurer credentials have already been stolen. You can’t imagine that it wouldn’t be okay, but [clearly it is].

Altucher: So okay, so here’s what I would do as a business strategist. You were selling that proof of concept, right? Like you were putting in those boards. But this is a numbers game. So what you’re telling me is that some percentage of companies that are infiltrated by bots, [for those subset of companies] it’s not ok with them that they’re infiltrated, and so they’ll buy. So you just need to give for free the proof of concept, so you would have as many proof of concepts as possible [to find the companies that were not ok being infiltrated].

Furst: So yes and no. [It turns out] it’s a much smaller market. That’s why I’m saying maybe there’s a hundred million dollars total market you could sell for people that actually cared about whether or not ... that cared in the sense of it’s not okay for them to not do something about it. Now, there were some companies that would absolutely swear by [our product]. They had to have it. They couldn’t imagine us taking it out, and it turns out, I’m convinced, in retrospect, of course — [and] we can’t go back in hindsight — they were doing something else with the product. They weren’t simply identifying the machines internally that were compromised because that software could also prioritise which machines they could go fix. So if you looked at the companies that bought, there were some companies that had an internal staff that they would send out every morning to go wipe the machines they were worried about. For those companies that had such an internal staff, there were more machines to wipe than they could actually have people go wipe. So in the morning, they had to figure out how they were gonna allocate their staff to go to those machines. Those companies, when they bought this product, they couldn’t not have it. The rest of them, not so much. So you see, that market was probably the hundred million dollar market ... it’s like, you know, how many people can actually go without knowing which machines are the ones that are most compromised? The answer is a relatively small number. You see what I’m saying? And so, because the company didn’t understand what the authentic demand was, and it fooled itself to think the authentic demand was something else, the company raised too much money and made bad decisions.

Furst and Chanoff write, years later: “In hindsight, it’s arguable that Damballa did uncover an authentic demand, but because we never figured out its precise nature, we never understood the situations where it occurred or their frequency, so we were overoptimistic about the addressable market size. That led to financing the company unsustainably.”

In 2016, a decade after its founding, Damballa was sold to a consortium of investors for a mere $9 million. The consortium in turn sold the company to Roswell-based Core Security, in what was described as a ‘fire sale’. Furst and Chanoff report that all but the last round of investors lost money. It was a bad outcome for a decade-long journey.
