The two futures of media

Technology is a powerful force for making things better and cheaper. But it is not without problems. It is, after all, a tool wielded by humans, and we occasionally have our failings. Technology can make drugs that save or destroy lives; it can create energy for peace, or for destruction. So inquiring about the innate goodness or badness of a given technology is kind of beside the point. Like any tool, it amplifies the power of the people who use it, in competence and error, for good and evil. A more helpful question might be: better and cheaper according to whom?

The same principle holds true for the technology that is transforming media. Internet-scale networks have made distributing media much better and much cheaper. Algorithms have made media discovery better and cheaper. Computers, smartphones, and now generative AI have made producing media better and cheaper. Technology is driving the cost of content creation toward zero, and we are now reckoning with media abundance. Is that a good thing? Well, that depends on who it serves.

I see two divergent media futures emerging, each based on networks that serve very different groups.

In the first future, our networks serve the technologists who build them, the advertisers that pay for them, and the governments that control them. These networks compete for users in a war for attention by making systems that spit out superficially compelling content. In these networks, the creators serve the algorithm. They are always replaceable, and get a tiny, fluctuating slice of the revenue, or nothing at all. In this future, AI can replace human creativity with zero negative consequence for the people the networks serve. These kinds of networks view media as just content, making media like a drug.

These kinds of networks already are, and will probably remain, extremely popular. You use one of these networks because it gives you what you desire, but your desires are not fixed. It’s the network’s job to feed your desire, and it wants you to always want more. It drags you to the lowest common denominator, erodes your attention span, and saps your agency. You get what you crave, but the price is a degraded culture and a life that passes you by without your even noticing. I call this media’s “drug future”: a wire to your head that drips dopamine, co-opts your mind, and steals your life.

In the second future, the networks serve the people who use them. They compete for the hearts and minds of creators and audiences by offering a better social contract. Creators own their platform, their work, and their connection with their audience. They can make real money and keep the vast majority of the revenue their work brings to the network. In this future, AI amplifies and assists human creativity rather than superseding it, because it is built on actual human relationships: when one human likes the work of another human, and chooses to reward it. In turn, the network values what the users value, instead of only what keeps them glued to the screen.

The consumers in these systems get to choose their own heroes. Their time and money become votes for the kind of culture they want to thrive. They get to directly fund the work of the creators they most trust and love based on things like quality and depth, instead of the cheapest hit for their attention. An influx of independent voices will do the kind of work that fills your soul instead of just your time, and in turn they will be well funded for that service.
Consumers and creators together will build a media ecosystem that helps you take back your mind. I call this media’s “culture future”: driven by direct relationships, built on trust.

Which of these futures will come to pass? Actually, both are already here. The only question is in what proportion they influence the future. In the next decade, how many people will live in the drug future versus the culture future? The answers will come from the two different business models that support these networks.

The business model of the drug future is well established and has been very successful: you grow a giant network and centralize the economics. The primary customer is the advertiser, and the money flows directly to the platform overlord to distribute (or not) as they see fit. This is the “algorithm knows best” model, and it is very hard for platforms to resist its pull. Both Instagram and YouTube, for instance, have at various points of their existence prized the relationships between creators and their audiences, but as soon as TikTok came along to offer a better drug, those platforms had to change quickly to keep up. The creator gets pushed to the back, subsumed by whatever the algorithm believes will capture the most attention.

The business model for the culture future, however, is only beginning to emerge. This model is what we mean when we say Substack is building an economic engine for culture. It doesn’t need to match the scale or addictiveness of the drug future’s model, because that’s not the way it is funded. The new model offers the best creators and their audiences a better deal, the same way all good deals happen: because incentives are aligned. The algorithm serves the people, not the other way around. The platform can only win when the creators win; the creators can only win when their subscribers win. That’s why we make it completely free for anyone to publish to any audience of any size, why 90% of all subscription dollars go directly to the creators, and why creators can export their mailing lists and leave with their audience at any time.

We started Substack in 2017 and brought this idea to a single writer. Since then, the platform has grown to tens of thousands of creators making money from subscriptions, and hundreds of thousands of creators using Substack to publish their words, audio, images, and video to tens of millions of people. Substack is now unquestionably home to some of the smartest culture on the internet, and every day more creators from the biggest networks in the world are joining, bringing their audiences with them.

We can still get high on the drug future, and probably will. Many of us, including me, will continue to visit (and enjoy) the dopamine carnivals. But building a life around a drug future doesn’t seem like the best option. We believe that once they get a taste of this model, creators and audiences will fall deeply in love with the culture future: a system that gives them more ownership, more control, and more value while restoring the primacy of direct, real relationships. Every new participant in this ecosystem is a soldier in the fight to reclaim our attention, a little weight on the scale to help tip the balance back in favor of culture and away from digital drugs.

Our modest goal with Substack is to help accelerate and amplify the advent of the culture future through a better media system.

Culture is not just about getting what you want; it is about learning what to want.
It shapes our values, beliefs, identity, taste, and how we relate to other people. And in turn, we shape it when we choose what to pay attention to, what to elevate, and what to contribute back. Culture is nothing less than our personal and shared quest for the meaning of life. The networks that support culture should serve the people who live that life.