I was very privileged to have the opportunity to speak recently at the Commonwealth Club of California, one of the country’s top public affairs forums. I faced pretty tough competition that evening, with former Secretary of Labor Robert Reich speaking at the club at the very same time. Still, I managed to draw a respectable crowd to The Dark Side of Social Media: Privacy, Manipulation and Terms of Use. I spoke about the issues that compelled me to write my novel, as well as the eerie parallels between the book’s fictional presidential campaign and the contest unfolding between Donald Trump and Hillary Clinton. You can listen to the audio here. I’d like to thank Sarah Granger for her fantastic questions, as well as the Commonwealth Club for hosting Sarah and me.

One more thing: During Q&A, I referred to a well-known (in some circles) quotation about the ineffectiveness of traditional advertising. I had not prepared for this, so I pulled the quote from the recesses of my memory and I erroneously attributed it to F.W. Woolworth, founder of the Woolworth Company. In fact, it was another American retail pioneer named John Wanamaker who is credited with the following: “Half the money I spend on advertising is wasted; the trouble is I don’t know which half.” My bad.



Author: Scott Allan Morrison

Reposted from The Big Thrill

By Kieran Crowley

Scott Allan Morrison spent nearly 20 years digging up stories for various news wires covering politics, business, and technology. From Silicon Valley, he reported on everything from the darkest days of the dot-com crash to the Web 2.0 boom and covered most of the world’s top tech companies. This experience prepared him well for writing the tech-savvy thriller, TERMS OF USE.

Tell us about TERMS OF USE.

TERMS OF USE is a thriller about the dark side of social media, where big Internet companies use your data to control your thoughts and actions. Some people call it a techno-thriller, but I think of TERMS OF USE as an unnerving commentary on the bargain our society has struck with the companies that give us “free” Internet services. And to ensure TERMS OF USE is relevant to all readers, I settled on a political story line, with a dollop of geopolitical intrigue, a menacing hit man, and a hint of romance for good measure.

What motivated you to write your novel?

During my years as a Silicon Valley reporter, I often wondered whether we could and should trust the companies that amass, store and analyze our data. What could happen if the wrong people took control of these companies (and every electronic bread crumb we’ve left behind)? How bad could it get? As you see in TERMS OF USE, things could get pretty wild.

With the impact of social media on our lives today, do you see the events you write about as possible in our near future?

Yes. Maybe not this specific story, but I wouldn’t be at all surprised to see an Internet corporation abuse its power in the future. Everything in my novel is possible today. All that’s missing is motive.

Do you think society has become too dependent on social media and the Internet in general?

Let me be clear, I love the power of the Internet to connect us, to teach us, to help us become more creative, make better decisions and be more productive. Are we very dependent on the Internet? Definitely. That’s not necessarily a bad thing as long as we can be sure the profit-driven companies that control the Internet, the new gatekeepers of information, act responsibly and in a manner that benefits and safeguards users. As of today, their terms of use (or terms of service) give them wide latitude to exploit—and profit from—our data as they see fit, and there are very few checks on that power.

What do you think would happen if suddenly social media were unavailable?

Mass chaos. Anarchy…. Ok, maybe it wouldn’t be quite that bad. But people would be really, really pissed.

How did working as a journalist inform/guide your fiction writing?

I wouldn’t have been in a position to write this story if not for my career. And once I decided to go for it, I was able to draw on my industry knowledge, sense of story, and reporting skills to shape the plot. It was much harder to allow myself to let go as a writer, but I eventually managed to overcome my reticence.

What would you say is the single most dangerous threat to society today, technologically, physically or otherwise?

A black swan—something totally unexpected. I’m spouting off about social media, scientists are warning us about global warming, and pundits have us terrified of ISIS. But I’m guessing we haven’t given much thought, if any at all, to the single most dangerous threat to society. History has shown that it’s usually the unforeseen events that throw us for a loop.

Do you have any exciting adventures from your reporting career you’d like to share?

Roaming around the southern Mexican state of Chiapas in a VW bug, dodging military checkpoints as we tried to locate Subcommandante Marcos during the Zapatista rebellion in January 1994. We never found him.

Your bio mentions you speak “rusty Mandarin.” In my book, any Mandarin would be exceptionally good. Where did you learn it?

I taught English in Taiwan for a few years. Once I (finally) developed an ear for Mandarin’s tones (it ain’t easy), I managed to pick things up fairly quickly. By the time I left, I could get by in day-to-day conversation. It’s been so long now that “rusty” is a charitable description of my Mandarin.

And now for the deep philosophical question that seeks to discover the true thoughts and imaginations of Scott Allan Morrison. You are marooned on a desert island with only coconuts and raw fish for your sustenance. What two books, one beverage, and one other thing small enough to fit in your pocket would you want to have with you?

Catch-22, In Cold Blood, a margarita (I insist on being marooned on a tropical desert island), and a fold-up, pocket-sized machete (they have those, right?)


Whenever I pitched Terms of Use to an agent, publisher or book reviewer, I was always quick to point out I’d been a reporter for almost two decades. They’d have to take me somewhat seriously because everyone knows journalists can write. Right? If only that were true.

Oh sure, I could whip up a 400-word news story with my eyes closed and one hand on my flask (just kidding). But as I waded into my novel, I came to appreciate how poorly prepared I was to tackle long-form fiction. The imagination, intellectual stamina, and emotional commitment required to write a novel were nothing like anything newspaper journalism had demanded of me.

The biggest challenge was getting my head around the sheer scale of my project. To get to 100,000 words (about the length of my novel), I’d need to string together the equivalent of 250 news stories, all flowing seamlessly from one to the next in a way that excited, challenged and ultimately satisfied the reader. How the hell was I going to do that? It was telling that, save for the people closest to me, I did not reveal to anyone what I had set out to do.

I decided the only way forward was to break my task down into smaller goals — not chapters, but goals. I’d start writing to see what I could come up with. If in a few months my wife (who loves thrillers) decided it was worthwhile, I’d keep going. When I was accepted for the Community of Writers at Squaw Valley annual workshop, it was another sign to stick with it. Every step of the way, I received just enough positive feedback to keep moving forward. It was not until I signed with an agent that I knew I would finish.

The many years I spent in newsrooms did pay off in some respects. I am a strong grammarian (most good writers are), which gave me a huge advantage over the many Walter Mittys who want to write a book but don’t have a clue how to piece together a sentence. Perhaps more importantly, years of reporting helped me develop a strong sense of story. That may sound simple enough, but I’ve run across quite a few writers (journalists and aspiring novelists) who tend to get lost in their ideas and jumbled narratives.

I leaned heavily on my reporting skills and news sense as I plotted key storylines in Terms of Use. I was fortunate that I could call on dozens of Silicon Valley insiders to keep me from going off the rails. Collectively, these coders, network architects, security ninjas, cryptography experts, tech entrepreneurs, venture capitalists and IT consultants helped me come up with many of the realistic scenarios that make Terms of Use so unnerving. I also interviewed a doctor and several law enforcement agents. After many months and scores of interviews, I’d written the first draft of a pretty good plot-driven story. Then I got stuck.

As much as I liked my first draft, I knew it wasn’t close to being good enough. My characters were only vaguely sketched out and the story was full of narrative gaps. My reporting skills were no longer enough, and the notion that I could just make stuff up still seemed somewhat foreign to me. Maybe that was because I didn’t quite understand how to harness my imagination. I didn’t know how to summon ideas on demand.

I knew I had an unwieldy idea generator in my head. This black box occasionally came to life at 3am, spitting out random ideas that I’d remember only if I was half awake. This is how Terms of Use was conceived. But my idea machine worked on a random schedule; many days, weeks, even months, could go by before it cranked out anything of value.

A funny thing happened to me over the next several months. I didn’t realize it at first, but I slowly found myself living in my characters’ world. I’d often heard actors talk about this phenomenon, but it had always seemed like gibberish to me. The more waking hours I spent refining my characters, toying with dialogue and chewing on problems, the more my thoughts began to intrude on, and interrupt, my slumber. Soon I was waking up almost every night at 3am, ideas bursting from my head. I knew then I was over the creative hump.

I still had to free myself in one other respect. In all my years as a business journalist, I rarely had the opportunity to flex my writing muscles, at least not in the way fiction writers do when they describe a scene, create convincing characters, convey emotion, illustrate action and pull readers to the edge of their seats. I struggled mightily with this challenge at first, often erring on the side of melodrama. Fortunately, I found a fantastic writing group, and they set me straight with valuable feedback and suggestions. It was like having an editor again, and despite all my earlier bluster about knowing how to write, I certainly needed one, or in this case, seven.

I imagine my story is not all that different from that of any other writer. We all have our strengths, and undoubtedly a few weaknesses. Journalism helped me develop a sense of story, strong interview skills and a familiarity with words. It turns out I also had a bit more creativity, perseverance and emotional stamina than I realized. It just took me a little while to figure that out.



Reposted from Fresh Fiction

The cover of my new novel Terms of Use looks nothing like I had envisioned – and, boy, am I glad.

I remember the very first time I stopped to consider the cover. My publisher Thomas & Mercer had sent a questionnaire in which I was asked, among other things, to describe my ideal cover and the mood I wished it to convey. They also asked me to submit images or other art to guide the cover designer.

Terms of Use is a thriller about the dark side of social media, so I knew immediately the cover should feel foreboding. And my story revolves around a hero who goes on the run to save himself and, it turns out, pretty much everyone else. So I thought my cover should feature a man looking over his shoulder as he escapes a group of people linked together in a network of spokes and nodes. As a second option, I suggested a collage of a woman’s face, suggestive of the socialbots in my story.

Many weeks passed as I lost myself in copy editing and proofreading my story. Then one day I received an email with two cover concepts in an attachment. I fumbled with my mouse as excitement got the better of me. I clicked on the attachment and the covers popped onto my screen... Ugh!

The first cover was an assault on the senses, starting with the x-rayed images of people superimposed on a network of colorful nodes and connecting lines that gave way to a big black hole in the center of the frame. The title and my name were laid out within this circle. I hated it. The second concept featured a collection of pixelated faces, washed in a blue-green tint that felt cold and distant. A total turn off.

The strange thing was that even though I viscerally disliked the images, I could totally understand how the designer came up with those covers. They certainly fit with the vague notions and art samples I’d submitted on the questionnaire.

I was worried. As a new author, I didn’t know how much I could or should push back. Making matters worse, my agent really liked the first concept. In the end I decided to speak up. We could do better, I said, and to T&M’s credit they quickly agreed to go back to the drawing board.

A second round of covers, concepts three through five, arrived several weeks later. These were better but still unsatisfying. Concept three featured a man running through a tunnel, but the image simply didn’t resonate with me. Number four took a minimalist approach: nothing more than big block letters superimposed over a metallic blue background. Concept five came completely out of left field. It featured a tangle of bright green lines wrapped around the title. The lines, I was told, were supposed to represent computer cables. But to me, they looked more like strands of algae, and it didn’t help that the background was also deep green. I couldn’t help but think I was staring at the cover of some bio-thriller, or perhaps a horror story about the swamp monster.

I hopped on the phone with the T&M team a short while later and tried to diplomatically share my thoughts. At one point, however, the phrase “swamp monster” slipped from my lips and I quickly tried to soften my criticism by adding: “It might work… if the lines and background weren’t all green.”

I didn’t know it at the time, but that throwaway line sealed the deal. A day later, on a second call, the T&M team declared that it was time to come together behind a concept and move forward. Their clear favorite was the “swamp monster” – with a new color scheme. And since I’d suggested it, I couldn’t really back away from the idea.

I endured a very uneasy week, worried I’d be saddled with a cover I didn’t like. But when it arrived, I found myself staring at an image that was at once very similar and yet completely transformed. The cables were now various shades of green, yellow and blue and the once green background was predominantly charcoal black. It looked darn good. In fact, the more I looked at it the better I liked it. I couldn’t have asked for more.



I’m not too concerned that Skynet will unleash its army of Terminator robots on us. But to hear Bill Gates and Elon Musk tell it, we all probably have good reason to worry that computers will one day become too smart for our own good.


That day might seem far off for most of us, but companies like Facebook and Google are already developing artificial intelligence technologies to expand their "deep learning" capabilities. These new technologies will be used to mine our data to assess who we are and what we want, and – to hear the Internet giants tell it – deliver elegantly tailored experiences that help us better understand and interact with the world around us.

There are different terms and examples to describe and illustrate this new capability. David Lazer, an authority on social networks at Northeastern University, refers to it as the rise of the social algorithm and says this represents an epic paradigm shift that is fraught with social and policy implications. Cardozo School of Law’s Brett Frischmann calls it techno-social engineering, and he too is wary about potential consequences.

There would be nothing inherently wrong with techno-social engineering if we could be absolutely certain the Internet companies that collect and analyze our data acted only in our best interests. But if not, we could all be susceptible to manipulation by powerful systems we couldn’t possibly understand. Frischmann and R.I.T.’s Evan Selinger question whether we are moving into an age in which “humans become machine-like and pervasively programmable.”

We already know the Internet is segmenting us into distinct groups based on economic, social, educational, regional, political and behavioral classifiers, among others. Internet titans rely on these classifiers to “filter” the world for us – which really means they are deciding what stories and opinions we read, which ads and offers we see, and the type of opportunities we receive. They also decide for us what we don’t see.

Nicholas Carr recently highlighted how social networks regulate the political messages we receive – as well as our responses. “They shape, through the design of their apps and their information-filtering regimes, the forms of our discourse,” he wrote, before adding that we may soon discover the filters applied to our expression and dialogue by these “new gatekeepers” are more restrictive than ever.

Take this a step further and we get to some very uncomfortable questions: What might happen if and when market forces pressure these profit-driven “gatekeepers” to exploit our data in unexpected or unforeseen ways? For example, might it one day be possible for a political aspirant to surreptitiously “buy” favorable coverage on a social network’s feed, so that users saw a disproportionately positive stream of stories and comments about that candidate? Harvard’s Jonathan Zittrain outlined a similar scenario here.

Such hypotheticals might sound outlandish today, but there are few constraints on the manner in which Internet giants can use our data to develop more capable algorithms, which could in turn underpin new services not necessarily built with users in mind. As Frischmann writes, this is powerful technology and it can significantly concentrate power.

“We need to ask who is doing the thinking as we increasingly use and depend on mind-extending technologies,” he says. “Who controls the technology? Who directs the architects?”

Right now, it’s the profit-driven companies that dominate the Internet. These companies insist the trust of their users is of paramount importance to them. But they are the same companies that keep moving privacy goalposts and rewriting their terms of use (or service) to ensure they enjoy wide latitude and broad legal protection to use our data as they see fit.



I began writing Terms of Use as a thought exercise focused on the trade-offs we all make when we provide our data to companies that offer us “free” Internet services.


There are countless companies that fit this description, but most of all, we’re talking about the titans of the Internet: Facebook, Google, Yahoo, Twitter, etc. Never before has it been possible to collect the huge reservoirs of information that today’s Internet giants have amassed on each and every one of us – one search, one page view, one comment, one “like,” one photo, one purchase at a time.

That’s the obvious stuff, but they are also collecting “passive” information: how long you linger or look at something, where you come from on the web, the times of day you surf the web, etc. This is the most insidious type of information, because it can be reassembled to help these companies enrich their understanding of you in ways you do not expect. In short, they know far more about us than we realize and it shouldn’t have surprised anyone when Edward Snowden revealed the NSA had been siphoning a “high volume” of our data from Google and Yahoo.

Privacy advocates went nuts, but most of us simply buried our heads in the sand and said, hopefully: “I’m not a terrorist, so they are not interested in me.” It may (or may not) be true that the NSA isn’t interested in you. But the Facebooks and the Googles are most certainly interested in everything you do online. By correlating all your data with their data streams, tech companies have developed intimate user profiles that include all sorts of personal details, opinions, habits, tastes, preferences, relationships and affiliations. It’s virtually impossible for a nontechnical user to browse the web without leaving rich metadata about yourself everywhere you go.

Information is power; it is also extremely valuable to these companies. We already accept that they mine our data to uncover patterns, make recommendations and bombard us with advertising. In the last few years they’ve made huge strides with their ability to anticipate events that haven’t happened yet. What happens as these companies become more sophisticated in their ability to effectively manipulate us and influence outcomes?

This was my starting point for Terms of Use: How might a large Internet company with advanced data mining and predictive analytics capabilities use – or more to the point, misuse – our data in the future? It was not very hard to spin out all sorts of scenarios and writing a novel seemed like a good way to have some fun with a very serious issue.

I was well into my novel by the time Facebook revealed in an academic paper last year that it had manipulated the news feeds of almost 700,000 users to see if it could affect people’s emotions. The company declared success; and based on its decision to publish those results, it would seem Facebook was quite proud of its achievement. So while I started writing Terms of Use as a thought exercise, it turns out that Facebook has been doing the real experiments.

