News on the Internet and the Extension of Reportage
The Background on the Need to Communicate
The history of news gathering and dissemination goes back to the era of 'exploration' and 'colonization' of known and unknown peoples and lands. Explorers sent their news back to the 'mother country' for a motley crew of readers and would-be investors. The decision as to what was news and what was not took shape through the different stages of news reportage and dissemination from the beginning of journalism. This was the origin of the spin doctrine that is so commonplace today: the news was designed to spark public interest and a following, which meant distorting and exaggerating the events being reported. In those days, reporters were men of all sorts, with different interests and pursuits. The same nations that controlled physical transportation, one of the means by which imperial powers fought for further acquisitions of land and trading systems, were the first to construct the news networks that sold information to the world's newspapers. The systems of transportation by shipping, and of information gathering and selling, were important in helping define what counted as news, and they helped pattern the relationship between colonizer and colonized. These agencies built their news routes and branch offices in the colonial world as they 'opened it up'. This enabled them to collect their own intelligence, and demand for commercial information peaked: on stocks, currencies, commodities, harvests and extractive processes. In Europe, pigeons and horses were used to join the various cable systems; cable was laid down, for instance, to provide a faster flow of intelligence and information around central Europe. This widespread growth in communications technology extended the frontiers of knowledge and transformed the environment in which we live. The application of the communications process varies from country to country.
The old international information order meant that the power to disseminate information was a domestic matter, and the media in developing nations were left to function in a non-governmental, autonomous sphere. UNESCO noted that the media of the richer nations could serve as a means of dominating world public opinion and as a source of moral and cultural pollution. Culture, and how it is organized, influences the way a country handles its messages and their content.
The growth of political ideology and the hardening of attitudes led to the confrontation between East and West and to the political independence of countries formerly subjugated by colonial rule. The goodwill to remedy and redress these wrongs, however, was tempered by a liberal dosage of national self-interest. In the process, news and wire services were placed on the computer in digital form, which facilitated communication with other sources of information and distribution. Dependence on information, which had created a social hierarchy, broke down, and people now find themselves exposed to a profusion of TV information channels. This new technology has also permanently altered social relations and opened wider national debate. The conflict between newly independent countries and their former rulers is not only about the plight of the past; it has to do with the reality of the present and the concerns of the day as reflected in the news. Race has been an ever-present part of that state of affairs.
Packaging News in the Age of Technology
The news program is structured like a newspaper. The day's most important story is the lead, and the first two sections are generally devoted to the other important hard news of the day. Most of these stories are domestic news, usually about political or economic happenings, much of it originating from Washington. Features, which take up the remaining sections, are more often on topics of social importance or interest, such as health; and television journalists like to end the program with an amusing human-interest anecdote of the 'man bites dog' genre.
There are people, or actors, who populate the news, and activities that become newsworthy. Journalists say the news ought to be about individuals rather than groups or social processes, and by and large they achieve their aim. Most news is about individuals, although they may be in conflict with groups or with impersonal forces such as "inflation" or "communism". National news is by definition about the nation, and so the most frequent actors in the news are inevitably individuals who play a role in national activities (e.g., Tiger Woods). They may be well-known people, or ordinary people prototypical of the groups or aggregates that make up the nation. The Knowns, furthermore, may be political, economic, social, or cultural figures; they may be holders of official positions or powers behind thrones who play official roles. The Knowns are a varied group. Some are assumed by journalists to be familiar names among the audience; others have appeared frequently in the news and are therefore well known to the journalists; some are not necessarily known by name but occupy well-known positions, like governor of a large state or mayor of a troubled city. (Gans)
The news has dealt with race because the primary social division in the news has been racial, although this was largely a consequence of racial flare-ups in the 1960s and 1970s. Racial news featuring whites reflected a dichotomy, with public officials and upper-middle-class citizens who sought to advance racial integration, and less affluent whites who demonstrated against it, deemed most noteworthy. The news also paid attention to racial differences, but did not often deal with income differences among people, or even with people as earners of income. Some of the news dealt with stories about the successful entry of women into previously all-male occupations and institutions. Ideology was deemed significant in communist nations and among parties and adherents of the Left and Right, both overseas and domestically. Although the news distinguishes between conservative, liberal and moderate politicians and party wings, these are perceived as shades of opinion; being flexible, they are not considered ideologies. The news was decided on the basis of race, sex, ideology, class and age, and these are still what makes, or is decided as, news.
The recent introduction and improvement of the technology of reporting, for example shorthand, telegraphy, photography, microphones, satellites, cell phones, laptops, Twitter, the Internet, email, YouTube, video-casting and so on, have increased rather than simplified the theoretical problems of objectivity in the news. News has acquired a new and powerful authority from the size and scope of the increasingly vast contemporary audience; the business of governments, long a focus of journalism, is now framed by the issues which journalism selects for salience and priority. Every step taken towards enlightenment involves carrying the burden of misconceptions and past conceptions of the civilizations being observed. Neither the explorers then nor the reporters today work alone, because each brings with him the baggage and totality of past observation which has become part of his culture and his conceptual apparatus.
Information is also closely related to economic hegemony and the extension of power and influence; countries with ample means of communication use information to further national, economic and political objectives. In TV news, this power is harnessed in such a way that it holds us hostage to the channel we are watching. A tease or commercial is used to keep the viewer watching the news. During the program the teasing does not stop, because 'bumpers' and 'teases' are used to keep one watching through promises of 'exclusive stories' and 'tape', 'good-looking anchors', 'helicopters', 'team coverage', 'hidden cameras' and better journalism. And when the news is finished, you are pleaded with: 'do not touch that dial'. Whether we know it or not, we are programmed to watch the news by programmers. Even with a remote in one's hand, we are likely to stay tuned to the channel we have been watching. This is why the best news program may not have ratings as high as a news program with a strong lead-in.
In the 1990s, corporate control of the media created a great deal of concern. Then came the notion that the Internet, or digital communications, would set us free. This is hardly unprecedented, because every major new electronic media technology of the last century, from film, AM radio, shortwave radio, FM radio and facsimile broadcasting to terrestrial television, cable TV and satellite broadcasting, spawned similar utopian notions. Viewers and listeners were told how these new technologies would crush the existing monopolies over media, culture and knowledge and open the way for a more egalitarian and just social order.
In 1998, the CEO of Cisco Systems went even further, saying that the Internet would have the same impact as the Industrial Revolution, except that instead of happening over 100 years it would happen in seven. Writers from Nicholas Negroponte and Douglas Rushkoff to George Gilder and Newt Gingrich informed us that we were entering a period of fundamental social change such as we had never seen before. The claims for what the Internet would do to media and communications were no less sweeping. Negroponte went as far as to say that the Internet would be the most enabling technology of all media: TV, radio, newspapers, magazines and so on. The Internet, it was claimed, had the potential to undermine corporate and commercial control of the media. Perry Barlow neatly captured this when he dismissed concerns about media mergers and concentration by saying that the big media firms were 'merely rearranging deck chairs on the Titanic'. But the commercial system has merely donned a new set of clothes: the Internet is run by big commercial firms, and its content is subject to their whims and profit. The more things change, the more they stay the same.
Converging Divergence and the New Spin
As most media analysts, from Marshall McLuhan to Noam Chomsky, have shown, television and print news cater to the corporate and political entities who created them, and make sure to keep them in business. But even if the original intention of the media was to manipulate the American psyche by deadening our senses and winning over our hearts and minds to prepackaged ideologies, this strategy has by now backfired. The signification carried out by the bosses of the media is that they will continue to control content in order to decide what is worthy of representation, and of profit. But the use of the media during the recent presidential election deconstructed those ideas thoroughly.
Human Rights, Executives and Profits
Television and the newsmagazines diverge, however, in their treatment of the middle-class population, for the individuals who appear in the newsmagazines are more often of the upper middle class, while those on television are frequently of the lower middle class. Most news is about affluent people, almost by definition, since the main actors in the news are public officials. Public officials are distinguished by their geographical, racial, ethnic and religious background more often than by their economic background. Generally speaking, then, the national news features middle-class and upper-middle-class blacks who have 'overcome' racial, economic and especially political obstacles, with less affluent blacks more often newsworthy as protesters, criminals and victims. Racial news featuring whites reflects a similar dichotomy, with public officials and upper-middle-class citizens who seek to advance racial integration getting more coverage, and less affluent whites who demonstrate against it deemed not most newsworthy, but slightly covered nonetheless.
If one were to apply the same way of looking and knowing to how news is disseminated through TV and other outlets, the very same pattern emerges. Those who want to improve and promote the well-being of all Americans are shown in a positive light, while those opposed to the general health and welfare of poor Americans are given newsworthiness because of the opposition and violence they engender. Because the news is dominated by stories about conflict (racism, health care, abortion, gun rights, immigration, etc.), and because of its concern with unity and consensus, or the lack thereof, the overall picture is of a conflicted nation and society. But is that a true picture of the contemporary US?
Old news, New technologies
Many other such complexes exist, some of long standing. Each is often the subject of magazine cover stories and television documentaries, and also serves as a lead to more routine news stories, with actors, activities or statistics becoming newsworthy by virtue of shedding some light on the condition of one or another complex. The networks have always been largely concerned with making money, but at an earlier time they felt obligated to operate first-class news departments. The technical demands of television are so complex and unrelenting that everyone concerned is preoccupied with getting matters right, and frequently it is a case of technique triumphing over substance. TV executives are very sensitive to public criticism, and their principal consideration in responding to it is profit and loss. The news director who manages to arrange public matters so that criticism is kept to a minimum and profits remain high rarely gets fired. News and pseudo-news shows fixate people's attention on what is peripheral to an understanding of their lives, and may even disable them from distinguishing what is relevant from what is not.
The viewer must come with a prepared mind that has information, opinions, a sense of proportion and an articulate value system. To the TV viewer or news listener lacking such mental preparedness, a news program is only a kind of rousing light show: here a falling building, there a five-alarm fire, and everywhere a world that becomes an object without meaning, connection or continuity. News gathering and dissemination has not changed very much. The motives remain manipulation, selection, signification, representation, ideology and profit. This maxim remains true of all old, new and emerging media: with converging techniques and technologies, the medium is not the message; rather, profit and the control of people's attitudes and behaviors is the norm.
Approximately 99 percent of Americans own televisions, 70 percent of whom subscribe to cable; 100 percent own radios; and 77 percent subscribe to a newspaper. In most homes, the television set is on at least seven hours per day, though studies find children watch about eight hours a day. Americans listen to the radio 2.5 hours a day and spend 45 minutes reading their daily newspaper. In addition, Americans receive most of their news from television and often believe what they see and hear on the news (Douglas Kellner). African Americans watch and listen to more broadcast media than these averages suggest. For example, African Americans spend more than 70 hours a week watching television, 20-35 percent more than whites (Reston, 1994).
The power of the media is profound. It sets agendas, interprets meaning, confers status, and, in the worst case, endorses destructive behavior. Its most powerful impact is on children, who frame definitions of and draw conclusions about the world through the messages they receive. Studies conducted in the 1990s show that children across all races associate positive characteristics with the minority characters they see. Although children believe that all races are shown doing both good and bad things on the news, they agree that the news media tend to portray both African American and Latino people more negatively than White or Asian individuals. African American children feel that entertainment media represent their race more fairly than the news media (47 percent to 25 percent). Asian children feel the opposite, favoring the news media (36 percent to 28 percent), while White and Latino children are split between the two. All ages and races expressed faith that the media could help bring people together by showing individuals of varying races interacting together. (Children Now, 1998)
Yet, as a capitalist enterprise, the main purpose of the Eurocentric media - media created by and reflecting the worldview of people of European descent - is to create and maintain consumers of all ages. And the Eurocentric media are experiencing steady growth and rapid consolidation. The top media corporations that have "dominant" power over American culture have shrunk to only ten (Ben Bagdikian). Although one of them, AOL Time Warner, is now headed by Richard Parsons, a Black man, the top echelon is almost completely White. Not surprisingly, the product - whether packaged in magazines or television shows - is oriented toward a white audience. Meanwhile, the Afrocentric media - media created by and reflecting the worldview of people of African descent - are generally struggling to keep afloat.
News Media Are Targeted But Audiences Are Not
A Contemporary View On Multiperspective Journalism
As I have stated above, the power of the media is profound. It sets agendas, interprets meaning, confers status, and, in the worst case, endorses destructive behavior. We learn from Herbert Gans that "In the past and in my writings I have urged that the news become more multiperspectival, that national news depend less on top-down news from high-level government and other official authoritative sources. Instead, news media should do more reporting from and about other levels and sectors of society and how these see and interpret the country and its problems." The mainstream news media, as well as the economy and polity in which the news media are embedded, have changed over the past decades, and the arrival of the Internet offers a chance to add different kinds of news. These changes justify a revisit to multiperspectival news, which focuses particularly on the journalists' role in representative democracy.
Gans was asked in an interview what, since he wrote his book "Deciding What's News" in 1979, he would change and what he would leave intact from the original. Gans replied:
"I wish someone would do such a study, because both [network TV news and newsmagazines] now have smaller audiences and smaller budgets, but the network evening news has barely changed in format or content in the last 40 years, while the newsmagazines are changing drastically. I would want to do such a study how they decide how and what to change - and what to keep - and what direct and indirect roles news organizations' budgets and news audiences play in these changes and in the shape of national news generally."
Gans goes on to recall how he defined "multiperspectival news" in 1979, and explains why he thinks it is still a relevant need today, given the challenges facing legacy news outlets: "When I did my original research, national news was limited mostly to what I might now call 'monoperspectival news', sometimes also called 'stenographic news'. I oversimplify somewhat here, but national news was dominated by journalists reporting what authoritative sources, especially top government officials, told them - or, when these disagreed, what 'both sides' (usually Republican and Democratic) were claiming."
Gans adds: "Multiperspectival news reporting is more diverse. It seeks news about other subjects that are newsworthy for the variety of audiences in the total news audience; it obtains news from many other sources, including ordinary citizens; and it reports a variety of political, ideological, and social viewpoints (or perspectives). Here's my favorite example. Poor audiences need business news like everyone else, but not about investing in the stock market or the latest newsworthy act, legal or illegal, by corporate bigwigs. They need to know about businesses in which they can afford to shop and the ones that will hire them, as well as the charitable and public agencies that can help them when they are jobless and in need. Today, thanks to cable news and the Internet, the news is much more multiperspectival than it was in the past, but it reaches a far smaller audience than traditional legacy news."
Gans offers his perspective on whether newsrooms as they function today can adapt to make these changes in their perspective. He responds: "When I published 'Deciding What's News' in 1979, I suggested increasing the number and variety of news media. Today, cable, the Web and other technologies have made that happen, and we are at a stage in the innovation process in which further new journalistic formats and ideas are being tried out all the time. What can be monetized and what can be supplied free of charge remains to be seen, but if the needed monies and audiences are there, the journalists and the newsrooms will come to stay."
Gans has argued that journalists are stuck in the box of their own world views, which often reflect their own social status and the ensuing beliefs about nation and society, and he gives an idea as to how we can get around that: "I find the idea of journalists as representatives intriguing, in part because the U.S. is an upscale democracy, the politics of which is dominated by corporate campaign funders and the upper-middle-income population that votes and participates more actively than the rest. As a result, U.S. politics does a poor job of representing the remainder of the citizenry, especially those earning below the median income and various numerical minorities."
Gans further adds: "Journalists are not elected officials, and they cannot be political representatives or advocates, but they can represent people in a variety of other ways, for example by turning their experiences and problems into news, and by asking politicians and other authoritative sources questions to which unrepresented and poorly represented citizens need answers. Journalists can also pay more attention to (now) poorly represented political and other ideas. I think J-schools and news organizations can either recruit journalists who can get journalists unstuck from their own boxes or teach them to do so. Moreover, multiperspectival news - and representative journalism - would require a greater diversity of journalists, especially those coming from and familiar with the lives of poorly represented citizens and ideas."
This is what Gans has to say about who he sees as today's journalist, since the 'type' of journalist that existed when he wrote his book in 1979 has changed in the 21st-century technological media environment. Gans elaborates: "I don't think professional journalists have changed all that much since the book came out, other than that they are more educated and professionally better trained. Newsrooms have changed, as have all other workplaces, but I don't see a big difference in today's news judgements. Even the inverted pyramid has not yet been torn down."
Gans goes on to point out: "True, there are amateurs who supply photos, videos, and even news stories to supplement what is gathered by professional journalists - but the latter still provide most of the actually-consumed news. It's just that much of the news and opinion we once told each other face-to-face and in small groups is now visible to so many other people in the Web's so-called social media. However, even if it is visible, that does not necessarily mean that it is seen. One must always remember that ordinary people do not pay the same kind of attention to news as do journalists and media researchers, whether it is Facebook's personal news or the TV networks' professional news."
It is also important to take note when Gans speaks about citizen news, which he sees as helping citizens become more politically educated, and of where changes such as the citizen journalism movement and hyperlocal news fit into this vision. Gans informs us: "Journalists need to pay more attention to what citizens are doing politically, and to what their elected representatives do and don't do for them. Conversely, elected representatives should know more about their constituents, especially the silent ones. Reporting more such news would incorporate the citizenry a little more into the political process, and would also offer citizens seeking to be more active examples of citizen activity. Since such news will probably always be of lower priority to professional journalists, help from the citizen journalism movement and supporters of local news would be desirable."
Gans proceeds to explain a little about the relationship between the general news audience and the targeted audience, and whether there is enough overlap to create a common conversation: "News media are targeted, but the audiences are not. There are news media which seek to communicate with a particular audience, which may be targeted by gender, education, race, etc. (we discussed the race factor of news gathering and media projection above, before we delved into Gans's take on multiperspectival news); and there are other news media - most, in fact - which try to attract everyone. However, audiences head for where they want to go, and many people turn to both general and targeted media. But I wouldn't imagine much overlap or a common conversation. I am not even sure that many people converse very often about what's in the news media, other than journalists and media researchers."
In contemporary technological times, media, information and reporting have taken on a new tack, and we shall elaborate further by looking at other, alternative takes on who and what decides the news, who the audiences are, and what the effects are on users. Below we look into this new phenomenon of sharing and disseminating information.
We will defer at this stage to Ian Crouch who wrote the following article:
How Viral Culture Is Changing How We Learn, Share, Create and Interact:
[New Ways Of an Emerging Reporting/Reportage and Viral Journalism]
Bill Wasik’s deceptively slim book, "And Then There's This: How Stories Live and Die in Viral Culture", is packed with anecdotes, theories, and arguments about contemporary media culture. It’s part memoir from Wasik, the merry prankster who created the 'flash mob' craze in 2003. And it’s part cultural inquiry, complete with clever social experiments and searing commentary.
While Wasik admits he is often tempted “to lionize viral culture as a people-powered paradise,” he thoroughly and persuasively argues that most of what we see, read, and discuss with one another is disposable by design, and ultimately corrosive. Let’s consider some of Wasik’s larger arguments.
Does the name Blair Hornstine ring a bell? Probably not, though that’s fine with Wasik. He suspects that if you remember anything about her, it will be her brief notoriety as the “girl who sued to become valedictorian” of her graduating class in 2003. She had been forced to share the top spot with a classmate due to a technicality, and rather than graciously share the honor, she decided instead to take her case to federal court, where the judge awarded her sole rights to the position and a hefty chunk of cash in punitive damages. Hornstine’s tale might have ended there, had the local paper she often wrote for not discovered that many of her stories contained extensive cases of plagiarism. Harvard rescinded her acceptance and talking heads rushed to label her emblematic of all that was wrong with America’s success-obsessed youth.
Wasik uses this unsavory story to introduce a new term, the nanostory — the “media pileons that surge and die off within a matter of months, days, even hours.” Once Hornstine became a big story — in major American newspapers and across cable and the Internet — she ceased to be a person, or even a name. She instead became a titillating and easily digestible modern fable, modern not only in its meaning but in its arc. All stories have a fixed lifespan, but the nanostory is so named both for its brief, bright existence and its false aura of social importance:
We allow ourselves to believe that a narrative is larger than itself, that it holds some portent for the long-term future; but soon enough we come to our senses, and the story, which cannot bear the weight of what we have heaped upon it, dies almost as suddenly as it emerged.
These stories get huge initial buzz, but then suffer from nearly simultaneous backlash — as if fame and backlash are not only inseparable but adjacent on a timeline. The stories and people involved are, writes Wasik, “gobbled up into the mechanical maw of the national conversation, masticated thoroughly, and spat out.” Susan Boyle was a nanostory. So was Miss California. You’ll find a couple new ones on most news aggregators every day.
These are silly news stories that break big. But Wasik also argues that many nanostories are generated and promoted by the subjects themselves. Take Amber Lee Ettinger, last fall’s Obama Girl, who parlayed her role in the YouTube hit “Crush On Obama” into political commentator appearances on CNN and elsewhere. She recently "tossed out the first pitch" at the Brooklyn Cyclones’ Obama night.
Viral culture and the media mind
But you might argue that these bits of triviality have always been around. We’ve been distracting ourselves with gossip, nonsense, and superficial oddities for ages. What makes the nanostory different?
Wasik says the nanostory thrives today because we live in a viral culture. That culture labels ideas and stories as culturally significant almost instantaneously; it rewards shamelessness and confers attention for the briefest of moments. But Wasik admits that these same factors have been in place throughout the television age. The difference, he writes, is the audience. Wasik argues that people now operate with a collective media mind: that we are all savvy marketers of ourselves and eager to reward such initiative in others.
Having been sold culture for so many years, in so many sophisticated ways, consumers have now been handed the tools to sell themselves and they are doing so with great gusto.
Central to that self-marketing is access to data. Just as corporate marketers measure success and make predictions based on sophisticated analysis, so do individuals who put themselves out there online. Wasik argues that the ubiquity of user-behavior data on sites such as Technorati and Alexa gives individuals tools that once cost corporations millions. Not only can we monitor the performance of a blog post or uploaded video, but we can use data to predict what new content might make a poster famous.
And Wasik argues that fame seems more attainable than ever. Anyone posting on YouTube, Twitter, or Facebook is making a considered presentation of themselves to the world at large. They are acting out a role in the public sphere. And that act, Wasik argues, “changes what you say, how you act, how you see yourself.” Wasik never says it explicitly, but when he writes about the “hordes of supposed naifs out there writing their blogs,” he is primarily talking about a certain demographic: the always-coveted 18-25s and 26-35s. In Wasik’s viral culture, these demographics are no longer just consumers of media and advertising; they’ve seized the means of cultural production as well. While it might be a democratic triumph to have the power to create media wrested away from a select group of culture makers, the new products created by that democracy leave Wasik dismayed.
The flash mob
Wasik knows firsthand about the allure of new media fame. He caused a stir in 2006 when he outed himself in Harper’s as the until-then anonymous architect of flash mobs, a social spectacle that hit New York in 2003 and spread around the globe — even seeping into the world of corporate advertising. Born out of what Wasik describes as a sort of existential boredom, the flash mob was a supposedly spontaneous assembly of people in a public place — a Claire’s accessory shop, the lobby of a Grand Hyatt — orchestrated beforehand by a series of email instructions. Wasik’s descriptions of these new media capers make for great reading — flash mob attendees wander through the rug department of Macy’s looking for the perfect “love rug,” or form a line blocks long ending at St. Patrick’s Cathedral, waiting for Strokes tickets that will never turn up. While another writer may have smugly observed the nonsense he had created, the strength of these tales comes from Wasik’s obvious bewilderment and dismay. How had he done this?
Again, Wasik argues that the prominence of his nanostory emerged from the Internet’s unique role as an archive of data and content. While television swamped us with stories, we had few ways to measure who cared about what. Now, as social networks and traffic monitors reveal popularity on a nearly minute-to-minute scale, we are more susceptible than ever to herd behavior or the bandwagon effect. Wasik recalls watching with a mix of delight and horror as he transformed into the shrouded cult-figure “Bill,” venerated by the hordes newly at his disposal. The crowds grew as e-mails were forwarded on and on, and soon the conventional media began to take notice. Wasik did interviews with hundreds of media outlets, themselves eager to be seen reporting on the cutting edge of culture.
Wasik was interviewed by scores of media outlets from around the world, but he singles out reporter Amy Harmon of The New York Times as particularly emblematic of the way that members of the traditional media approached the story. Harmon contacted Wasik (still anonymous at that point) by phone long after many other outlets had covered the story. Harmon said she knew the Times was late to the story but planned to devote prominent space in the “Week in Review” section to identifying flash mobbing as a fascinating current trend. Instead, Harmon’s piece focused on growing backlash against the mobs on the web. But Wasik writes that the backlash hadn’t even happened yet — the Times was simply pushing this nanostory along to the next logical stop on its path to irrelevancy. The evidence marshaled by Harmon, Wasik writes, “hardly constituted a ‘backlash’ against this still-growing, intercontinental fad, but what I think Harmon and the Times rightly understood was that a backlash was the only avenue by which they could advance the story, i.e., find a new narrative.”
Wasik never makes clear whether he feels the Times story led to the backlash that followed, but eventually it did come, and the mobs were soon over. He notes that they were ultimately unsustainable, since they were the very definition of a nanostory: they didn’t mean anything, and soon people came to realize it.
But mobs live on. Wasik describes the surreal experience of attending a “flash mob” for the Ford Fusion at City Hall Plaza in Boston, orchestrated entirely by a marketing company. The event is a bust; it feels staged and obviously commercial. People walk by, disregarding the scene as the marketing stunt it clearly was. Like traditional media, which seemed both eager and clumsy while covering the story, corporate America had fallen for the gag but somehow missed the point.
Following his unnerving success with flash mobs, Wasik set out to perform a series of similar experiments in the new media world. Fascinated by the star-making power of the music-review site Pitchfork on the indie scene, Wasik attempts to destroy the Swedish band Peter Bjorn and John as they threaten to emerge at the 2007 SXSW festival. He starts the blog Stop Peter Bjorn and John and arranges a mock protest. Within a week, EW.com and Mother Jones, among others, reported on a growing backlash against the group. Soon this innocuous pop group had earned the label “controversial” on the San Francisco Chronicle’s culture blog. But not all his experiments build buzz: his oppodepot.com, which attempted to aggregate dirt on all the 2008 Presidential candidates, attracted little traffic — because the site failed to take a partisan angle, Wasik says.
So what? Conclusions and ideas looking forward
What conclusions could one draw from reading Wasik describe — and experiment with — these new cultural products? Here are a few:
— The Internet is a false cure for boredom. Wasik says he began the flash mob project out of a sort of existential boredom; he admits he is often bored. The web, with its instant access to information and entertainment, is the ultimate place to “do something” without doing much at all, Wasik argues. He writes that the web has not eliminated our boredom, just distracted us from it.
— Viral culture rewards narrow thinking. Each of Wasik’s social experiments was founded on a clearly defined meme, a narrow and finely executed idea. We see that blogs that tend to gain notoriety (or book deals) do "one thing really well", while blogs with wide focus often fall by the wayside. But serious writing or good political commentary thrive on complexity, exactly what viral culture rejects as too complicated.
— Culture is infected with a virus. He writes that a meme or nanostory is like an “independent agent loosed into the world, where it travels from mind to mind, burrowing into each, colonizing all as widely and ruthlessly as it can.” Wasik argues that these trivial items are choking us, blinding us, and making us both stupid and crazy.
— Everything is a meme. Though Wasik lumps both news items and self-promoted projects under the same “nanostory” or “meme” headings, it seems that these two categories are fundamentally different. He devotes too little energy to sorting out who continues to control and disseminate information, and grants too much power to individuals and too little to still powerful media conglomerates and corporations.
— Crowds are not as wise as they seem. Wasik suggests that culture has devolved into a popularity contest, with news outlets mistaking clickthroughs, pageviews, and most-emailed lists for reliable indicators of quality and worth. He cites a Columbia study that found people tasked with downloading and evaluating music relied largely on the popularity of the songs among other respondents.
— Nanostories are killing us. In the book’s final section, Wasik offers this rousing plea: “We want reason in our politics, greatness in our art, and we see that these are incompatible with our feckless, churning conversation. We must learn how to neuter our nanostories, or at least cut off their food supply.” Viral culture, he seems to argue, is at perhaps permanent odds with seriousness and quality.
— We might be doomed. While there’s much to agree with in Wasik’s arguments, he offers us few specifics on how to “neuter” these viral stories. He mentions Jake Silverstein’s idea of an Internet Ramadan, during which participants go offline for a month, or Intel’s flirtation with offline “quiet time” one morning a week. Rather than offer specifics, Wasik focuses on individual choices, the familiar idea of unplugging ourselves from the constant flow of information — or, more elegantly, that “we must become judicious controllers of our own contexts, making careful and self-reflective choices about what we read, watch, consume.”
Wasik asks hard questions to which there are no simple answers. But if we are experiencing a moment of cultural catastrophe, shouldn’t the remedies extend beyond such relatively simple, personal decisions? After all, we can only consume what the culture makes available. A constructive question going forward, it seems, might be: rather than simply cut ourselves off, can we use the apparatus of digital media to produce and enjoy quality content?
What Makes a Story Newsworthy?
Media College informs us in this way:
News can be defined as "Newsworthy information about recent events or happenings, especially as reported by news media". But what makes news newsworthy?
There is a list of five factors, detailed below, which are considered when deciding if a story is newsworthy. When an editor needs to decide whether to run with a particular story, s/he will ask how well the story meets each of these criteria. Normally, a story should perform well in at least two areas.
Naturally, competition plays a part. If there are a lot of newsworthy stories on a particular day then some stories will be dropped. Although some stories can be delayed until a new slot becomes available, time-sensitive news will often be dropped permanently.
1. Timeliness
The word news means exactly that - things which are new. Topics which are current are good news. Consumers are used to receiving the latest updates, and there is so much news about that old news is quickly discarded.
A story with only average interest needs to be told quickly if it is to be told at all. If it happened today, it's news. If the same thing happened last week, it's no longer interesting.
2. Significance
The number of people affected by the story is important. A plane crash in which hundreds of people died is more significant than a crash killing a dozen.
3. Proximity
Stories which happen near to us have more significance. The closer the story to home, the more newsworthy it is. For someone living in France, a major plane crash in the USA has a similar news value to a small plane crash near Paris.
Note that proximity doesn't have to mean geographical distance. Stories from countries with which we have a particular bond or similarity have the same effect. For example, Australians would be expected to relate more to a story from a distant Western nation than a story from a much closer Asian country.
4. Prominence
Famous people get more coverage just because they are famous. If you break your arm it won't make the news, but if the Queen of England breaks her arm it's big news.
5. Human Interest
Human interest stories are a bit of a special case. They often disregard the main rules of newsworthiness; for example, they don't date as quickly, they need not affect a large number of people, and it may not matter where in the world the story takes place.
Human interest stories appeal to emotion. They aim to evoke responses such as amusement or sadness. Television news programs often place a humorous or quirky story at the end of the show to finish on a feel-good note. Newspapers often have a dedicated area for offbeat or interesting items.
Enigma Codebreakers Spawned Modern-Day Computers
Joe Miller Wrote:
Joan Clarke's ingenious work as a codebreaker during WW2 saved countless lives, and her talents were formidable enough to command the respect of some of the greatest minds of the 20th Century, despite the sexism of the time.
But while Bletchley Park hero Alan Turing - who was punished by a post-war society where homosexuality was illegal and died at 41 - has been treated more kindly by history, the same cannot yet be said for Clarke.
The only woman to work in the nerve centre of the quest to crack German Enigma ciphers, Clarke rose to deputy head of Hut 8, and would be its longest-serving member.
She was also Turing's lifelong friend and confidante and, briefly, his fiancée.
Her story has been immortalized by Keira Knightley in The Imitation Game, out in UK cinemas this week.
In 1939, Clarke was recruited into the Government Code and Cypher School (GCCS) by one of her supervisors at Cambridge, where she gained a double first in mathematics, although she was prevented from receiving a full degree, which women were denied until 1948.
As was typical for girls at Bletchley, (and they were universally referred to as girls, not women) Clarke was initially assigned clerical work, and paid just £2 a week - significantly less than her male counterparts.
Within a few days, however, her abilities shone through, and an extra table was installed for her in the small room within Hut 8 occupied by Turing and a couple of others.
In order to be paid for her promotion, Clarke needed to be classed as a linguist, as Civil Service bureaucracy had no protocols in place for a senior female cryptanalyst. She would later take great pleasure in filling in forms with the line: "grade: linguist, languages: none".
The navy ciphers decoded by Clarke and her colleagues were much harder to break than other German messages, and largely related to U-boats that were hunting down Allied ships carrying troops and supplies from the US to Europe.
Her task was to break these ciphers in real time, one of the most high-pressure jobs at Bletchley, according to Michael Smith, author of several books on the Enigma project.
The messages Clarke decoded would result in some military action being taken almost immediately, Mr Smith explains.
U-boats would then either be sunk or circumnavigated, saving thousands of lives.
Turing 'kissed me'
During this time, Clarke and Turing became ever closer, co-ordinating their days off in order to spend more time together. In 1941, he proposed, although the engagement was ultimately short-lived.
"We did do some things together, perhaps went to the cinema and so on, but certainly, it was a surprise to me when he said... 'Would you consider marrying me?'," Clarke recounted in an interview for a BBC Horizon documentary, aired in 1992.
"But although it was a surprise, I really didn't hesitate in saying yes, and then he knelt by my chair and kissed me, though we didn't have very much physical contact.
"Now next day, I suppose we went for a bit of a walk together, after lunch. He told me that he had this homosexual tendency.
"Naturally, that worried me a bit, because I did know that was something which was almost certainly permanent, but we carried on."
However, just a few months later, Turing broke off the engagement, believing that the marriage would ultimately fail.
Nonetheless, Clarke and Turing remained close friends until his death in 1954.
Graham Moore, who wrote the screenplay for The Imitation Game, says he saw similarities between the two cryptanalysts, and he believes this brought them together.
"They were both such outsiders, and that gave them some common ground," he explains.
"They were able to see things in a different way to others."
Indeed, they shared each other's passions, such as chess, puzzles, botany and even, on one occasion, knitting.
Forgotten by history
Because of the secrecy that still surrounds events at Bletchley Park, the full extent of Clarke's achievements remains unknown.
Although she was appointed MBE in 1947 for her work during WW2, Clarke, who died in 1996, never sought the spotlight, and rarely contributed to accounts of the Enigma project.
But the esteem in which she was held by her colleagues, and the fact that "her equality with the men was never in question, even in those unenlightened days", as Michael Smith writes, are a tribute to her remarkable abilities.
Morten Tyldum, the director of The Imitation Game, emphasises that Clarke succeeded as a female in cryptanalysis at a time "when intelligence wasn't really appreciated in women".
There were a handful of other female codebreakers at Bletchley, notably Margaret Rock, Mavis Lever and Ruth Briggs, but as Kerry Howard - one of the few people to research their roles in GCCS - explains, their contributions are hardly noted anywhere.
"Up until now the main focus has been on the male professors who dominated the top level at Bletchley," she says. In order to find any information on the women involved, you have "to dig much deeper".
"There are a lot of people in this story who should have their place in history," says Keira Knightley.
"Joan is certainly one of them."
People have a variety of motivations for receiving news online. Surveys show that the single largest reason for getting news online is convenience. This has not always been the case. In 1996, the 53 percent of Americans who cited the inability to get all they want from "traditional news sources" was greater than the 45 percent who cited convenience as a reason for going online for election news. By 2000, convenience had not just passed the inadequacy of traditional news sources as a reason for getting online news, but overwhelmed it, by 56 to 29 percent.
Another reason is that online news consumption did not represent a typical social and demographic cross-section of America. The receipt of online news varies with socioeconomic factors. Both use of the Internet and consumption of news generally increase with education. It is, therefore, not surprising that those receiving news online are generally better educated than the average American.
Internet news-reading has also displaced much newspaper-reading among the young. The Internet is one medium where young people are more likely than the elderly to get their news.
There is a huge gulf between established media and citizen perceptions of online news. The gulf persisted even after the Internet had become a majority medium. This is clear from a survey comparing media personnel with the online news public. While 77 percent of media elites said that the public was less comfortable with the reliability of online news, only 28 percent of the online public said it was. And while only 27 percent of the online public agreed that "there is too much news on the Internet to sort through and make sense of it all," over half of media elites said that "there is too much news on the Internet for the public to sort through and make sense of it all."
In sum, citizens feel empowered, while media elites are somewhat threatened, by political journalism on the Internet. It seems likely that people will be comforted by the Internet's influence on political journalism. Thomas Jefferson put it this way: "Our citizens may be deceived for a while, and have been deceived, but as long as the presses can be protected, we may trust them for light." In present-day life that light can be found coming from a glowing monitor, but now with the public choosing what is news.
Converging and Submerging Media
Convergence Of Old Traditional Media In the Web
The viewer and consumer of news is the one who decides what is news and what is not newsworthy. This point is made clearer by Seib, who notes:
"The technology has progressed to the point that it allows the viewer to see more of the process of gathering news. ... People are seeing news as it develops. And I'm not sure that's bad. It kind of hits at some of the criticism of the media for slanting the news. You can't say it was slanted when it's live.
"When a major story breaks, partnerships usually give way to battling for unique story angles. The resulting news product, however, should not be regarded as merely a trophy by the competing journalists. Most important is the effect on the audience - how people use the information they receive.
"... One reason for the emphasis on speed was competition from the Internet. Rather than leave the public to its own devices and let the people get the news on their own in this way, television news executives decided to try to match the Internet's speed as information provider. The problem with this approach is that merely delivering raw information is not journalism.
"The distinction could be seen in newspapers' coverage of many important reports like the Starr Report. Many presented lengthy excerpts and a few, such as the New York Times, printed the entire document. But their news stories were carefully considered and structured: emphasis was placed on what was judged most important, caveats were offered about the unproven nature of allegations, and limits were placed on how much of the sexually graphic material was included.
"The Starr Report's release was a turning point for the Internet in its relationship with other information media. For many people, the Web was no longer merely an ocean on which to surf for the news, but had become a primary source. According to Web traffic tracker RelevantKnowledge, approximately 24.7 million people saw the Starr Report during the first two days it was online. That exceeds the circulation of America's fifty largest daily newspapers.
"Another step forward with the Starr Report was improved screen format. Software companies are making material easier to read by combining text with a table of contents and scrolling footnotes at the bottom of the screen. The great duel between the computer screen and the printed page centers on 'convenience' (as duly noted above) for the information consumer.
"For the Internet site designer, the goal is to minimize the amount of clicking and other maneuvering required to read a document. Graphics should always be helpful and not distracting. The trial-and-error approach is quickly moving toward making the Internet a much more reader-friendly medium."
In my reckoning, the Internet is more than reader/viewer-friendly, if one were to mull this point a bit. The software constantly being updated on one's computer from unknown sources, and the fact that cable TV, like Time Warner's channels and system of delivery, is constantly being updated, are among the many technological developments in which gizmos and information are wrapped with a technique that shapes both their function and their effect on the viewer.
Another way of looking at it is to understand what options and choices of convenience are being presented to the news- and information-consuming Internet polity. There are some choices for the viewer/reader, and there are designed and structured rules of operation dictated by the gizmos of the Internet and by the techniques of information storage, accumulation and disbursement.
We know from Seib that "radio and television" were not limited to reporting what had already happened. As years passed, increasingly sophisticated technology made this kind of reporting easier and more common. Going 'live' became the trademark of broadcast and cable news.
"Cable and satellite carriers have fostered a proliferation of television offerings, and the pace of daily life - including the interminable 'drive time' - has reinforced radio's popularity. But journalistic standards have not always kept up with technological advances. As the continuum of electronic news stretches into the future, significant doubts exist about the quality of the news product.
"Now that is changing. Even print news organizations are delivering their product electronically as well as on paper. The World Wide Web is the next major news medium. On the Web, newspapers and magazines can go live. By doing so, they give up time that was a friend to judgement. They also find that part of their traditional culture has become obsolete. The system of the print newsroom was built upon a regular publishing schedule.
"For journalists crafting stories, the news cycle - daily for newspapers, longer for magazines - was as reliable as a metronome. No more. Morning newspapers such as the Washington Post are producing online editions at other times in the day and delivering updated reports at a moment's notice. The public has little tolerance for any sluggishness in regard to Web news.
"Television and radio are also adapting to the realities of the Internet, creating their own Web sites as the online audience grows. For electronic news organizations, the Web allows expansion of real-time offerings. Networks and stations use their Web sites to offer 'Netcasts' - initially supplements to regular news programming (such as expanded reporting of election night vote tallies), with unique content soon to follow. The sites also feature new and archived video and audio on demand."
Despite the nearly infinite capacity of the Internet, the online world is already getting crowded. The Web world of news is cluttered with sites delivering similar products. Only the most resolute Web junkies will partake of all this, and the new medium's economics are proving harsh for sites that lack a well-known brand name or other appealing features. Many offerings will disappear as others merge with former competitors. On the Web, the products of the New York Times and ABC News will be very much alike in their mix of text, audio and video. The logical step is the joint enterprise. This is "convergence" - news organizations from different media coming together in the new medium. The viral streaming media ecology has tipped the balance of traditional news, presenting new ways of doing media and a new environment for surfing the informational Web.
Teaching Mass Media Communication and Mass Communication
Will Online News Kill Print?
Analogue Journalism: Deciding News The Old Way
“How did Céline Dion’s backyard merit a front-page photo,” reader Michelle Guilmette asked this week. “Is there nothing else newsworthy out there?”
The day before, reader Bill Archibald emailed to inquire why the Star had devoted time and space to the story of Ludwig the cat who went missing at Pearson airport. Why, he asked, is a lost cat news?
In any given week, readers of the Star are apt to ask some variation of the essential questions at the heart of those emails: What is news? Who decides what the Star pays attention to — and what it ignores? What runs on Page 1 and on the home page of thestar.com?
Readers are quick to weigh in on what the Star covers as well as what it doesn’t cover.
I often hear from those of you who are disappointed that the Star did not cover an event in which you have a particular interest. For example, this week a reader wondered why he could not find news about the Princess Patricia's Regiment anniversary celebrations in the Star.
Another longtime reader, a 70-year-old man who told me he was sexually abused in his childhood, wrote an impassioned letter imploring the Star to provide more coverage of the serious questions raised in the final report (released last December) of Ontario’s public inquiry into sexual abuse allegations in Cornwall.
The Oxford Canadian Dictionary defines news as “information about important or interesting recent events.” There’s broad scope in that for judgment about what is “important” — information you need to know — and what is “interesting” — stuff you might want to know.
Deciding what’s news is the core work of the media. As the renowned journalist and media critic Walter Lippmann once said: “All the reporters in the world, working all the hours of the day, could not witness all the happenings in the world.”
Journalism is, by necessity, the art of selection, of deciding what matters and how to present that to audiences. While the Internet and the emergence of “citizen journalism” and social media have made it easier to connect and communicate within our global village, leading some to argue that journalism’s role as a “gatekeeper” is not necessary, there’s a case to be made that the barrage of accessible information makes the editor’s job of selection more vital.
The Star’s senior editors strive to provide a mix of what they believe readers need to know and what you might want to know. Clearly, on any given day, their news judgment won’t be in accord with that of all readers — or even all Star journalists. “Why is that news?” is a sentiment as apt to be expressed in the newsroom as in the public editor’s email box.
Indeed, such was the case with Thursday’s Page 1 play of Céline Dion’s $20 million new estate. For my part, I’m with reader Keung Lui who wrote: “I am happy for Céline and her 8-year-old son that they could afford a $20 million play house. But is this so-called news worth the front page of the Toronto Star? Don’t you have some real and more important news to report?”
How do journalists decide what is news? Is news simply determined by an editor’s whim, as expressed by the oft-cited cliché of the powerful editor who declares, “News is what I say it is”?
Textbook definitions of news that aim to teach aspiring journalists how to develop “news judgment” are of little practical use in the daily, and increasingly online, hourly, fray of deciding what’s news. For example, few editors ever consciously consider what one text tells us: “News is information about a break from the normal flow of events, an interruption in the expected” (practical translation: Dog bites man: not news. Man bites dog: news).
Stanley Walker, the famous editor of the now-defunct New York Herald Tribune, defined news as the three W’s — “women, wampum and wrongdoing” (practical translation: sex, money and crime). That’s sexist, to be sure. How far off is it, though? Consider how those universal elements figure in many important and interesting news stories.
Journalism textbooks define the factors of newsworthiness as the impact of information on citizens, whether conflict and controversy are involved, timeliness, the prominence of those involved and proximity to the audience.
Novelty and oddity also factor in. Many successful editors, striving to appeal to readers, have long defined news as that which makes a reader say, “Gee whiz!”
For most journalists, deciding what’s news is instinctive, rooted in experience and their perceptions of what readers want. Practical factors such as space, reporting resources, the mix of hard news and softer features, the number of events competing for attention, as well as the availability of compelling photos to illustrate the news, are also at play.
All these theories aside, there is one overriding consideration that helps explain the daily puzzle of what is news: What’s newsworthy on a “slow news day” is far different than what you’ll read when a natural disaster happens or a parliamentary scandal breaks.
It’s a safe bet that Céline Dion’s water-park would not have made such a splash on the day a tsunami struck or there was a tidal wave of earth-shaking news.
Internet News is Different from Newspaper News in many ways
Holding Out On News And Deciding What's News: Seniors On News Selection
Bridget Cagney’s mother read to her every night when she was a small child. “Then when I learnt to read books and newspapers, and you had to learn quickly in those days, it was the greatest joy of my childhood,” said the Queens, N.Y., resident about growing up in Ireland’s County Cork. “And reading still is a great joy.”
Cagney buys the New York Times about every second day and all three of the Irish weeklies published in New York City. “And Jim gets the Post,” she said of her husband, who emigrated with her in 1967.
Most Seniors Online—But Fewer to Read News
The Cagneys don’t own or use a computer, which puts them in a minority among those 65 and older in the United States. Last year, the Pew Research Center for the Internet and American Life announced that for the first time a majority of seniors (53 percent) use e-mail or the Internet.
But a previous Pew survey revealed that most of the older set doesn’t get news from any online source. The study found that only four in 10 of those aged 65-74 ever go online for news, and merely one in six members of the “Greatest Generation” (75 and over) do so.
Paul Finnegan, executive director of the New York Irish Center in Long Island City, which encourages seniors to acquire computer skills, said his observations coincide with the Pew Center’s findings.
In an informal survey he conducted in early May of those who attend the center’s Wednesday seniors’ lunch, 40 people said they preferred newspapers as a source of news, while five indicated TV or radio was best for them. Only four chose the Internet.
“That TV/radio figure is a surprise,” Finnegan said. He wasn’t surprised, though, that all four of those who voted for online news are enthusiastic stalwarts of the center’s Saturday morning computer class.
Center regular Julia Anastasio, who sometimes goes online, is one of those who favor print media. “I get the Daily News every day and the Irish Echo every week,” said the native of County Offaly, Ireland. “The Irish Independent [Ireland’s largest-circulation daily] opens up on my computer. I sometimes go to the computer class and I’m getting better. I know how to Google.”
Anastasio’s favorite website is that of the Offaly-based Midland Radio 103, where she can read death notices and local sports news, as well as listen to music.
Even more enthusiastic computer users interviewed for this article regard online sources as supplemental, not as a replacement for print media.
“I'm computer fluent,” said Neil Hickey, a journalist for more than 50 years. He subscribes, though, to the print editions of the New York Times, Wall Street Journal and several periodicals.
“There are huge advantages to the digital revolution,” said Hickey, an adjunct professor at the Columbia University Graduate School of Journalism and a former editor-at-large for the Columbia Journalism Review. “I couldn't live without Google and e-mail. The whole world of information is at your fingertips.
“YouTube,” he added, “is a great joy and a phenomenal resource.”
But, Hickey said, “I tell students that, for me, at least, reading news online is unsatisfying and insufficient to my needs.”
Views of Three Former Teachers
Three former teachers interviewed expressed contrasting positions about the Internet. But all, like Hickey, said that for them print news is primary.
Patrick O’Sullivan, who spent his career teaching Spanish, commented, “You could spend hours at the computer.” The New Jersey resident, who has a second career as a realtor and follows the stock market as a hobby, continued, “But I go online for what I can’t read in the New York Times and Barron’s.”
O’Sullivan is unimpressed with the news he sees online. Unlike the rich writing he finds in the Times, he said, “There’s no great beauty to it.”
Joan Monsoury of Manhattan said she relies on the Times, NPR News and PBS. She said she doesn’t feel “motivated” to acquire a computer: “If anything happens, I hear about it several times a day.”
Former English teacher Pat McGivern is someone who might be expected to take to the online experience more easily than others. Although she is a typist who worked with computers in classrooms before retiring just over a decade ago, she doesn’t own one. Instead, she checks and responds to e-mail at her local library on Long Island.
"Computers are a nuisance,” said McGivern, who still clips out newspaper articles to give to friends and family members.
She explained that reading e-mails causes her eyestrain after a while. That is not a problem when it comes to print, she noted, but lack of time is.
McGivern, who is studying the Irish language at Lehman College, said she hardly has time these days for the Times’ extensive arts articles she likes, plus the Irish Echo’s coverage of music and arts, “and to know what's going on."
‘Watered Down’ Print
If the Times contains too much, McGivern finds that other print media offer too little. She recently dropped her subscription to Time magazine. “It's too dull and watered down,” she said. She had similar complaints about Long Island’s Catholic paper, which recently changed to a magazine format.
“We had very intellectual Catholic publications coming into the house in the 1950s. Now, they’re all very watered down. There’s not much in the way of theology,” she said.
Maurice “Mickey” Carroll stated, “There’s a lot of garbage passing around as news.” He should know. Carroll was a reporter for the defunct New York Herald Tribune who was in the basement of Dallas police headquarters when Lee Harvey Oswald was shot dead on live national television following the Kennedy assassination.
Today, said Carroll, who worked for nine newspapers, the Times among them, “You’re getting blogs, opinion, amateurish stuff. It’s neatly printed. It looks the same.”
Now the director of the Quinnipiac University Polling Institute, Carroll stressed that in the past, “You knew how to behave with facts. It was in your blood. Even with new digital media, you hope that they will absorb the same standards.”
Optimistic about the economic viability of professional journalism, though, Carroll said, “Fingers crossed, say a prayer, it will sort itself out.”
Still, Carroll worries that the rise of cable news and the multiplicity of sources online means that people can cherry pick the evidence to suit their argument, a development he feels undermines the national conversation.
“TV is a big trap for seniors, particularly male seniors,” said Pat McGivern. “My friends in the Midwest are more liberal, but my friends in New York, some of them listen to the guys who rant and rave.” She added that a member of her family believes that NPR is under the control of communists.
Newspapers’ ‘Serendipitous Aspect’
Carroll said he “surfs the headlines” online. “Every now and then I look at Politico,” he said. But he believes that looking through a newspaper yields better results. “The serendipitous aspect,” he said. “That’s lost [online].
“I’ve got to have a newspaper in my hands. But that’s because I’m old,” Carroll said, with a laugh. His friend Francis X. Clines, a member of the Times editorial board, told him that he’s typically the only person in the elevator at work with the newspaper under his arm. “None of the kids have it,” he said.
For some seniors, it is more than a case of what they’re used to; it’s what they like.
“I love the feel of the paper,” said Bridget Cagney, who sets aside time to read at the end of the day. “I get a great sense of warmth when I look at headlines in [the Hudson Newsstand] Grand Central.”
Cagney emphasized, “I can’t imagine giving up the paper. I deplore the day that we have to.”
Some Unintended Consequences of Online News Reportage
On Monday afternoon, visitors to ABCNews.com who wanted to learn more about the latest news on the tragedy in Newtown, Conn., had a range of Web video options to choose among. But before they could watch, say, “Newtown Shooting: Teachers and Parents Turn to School Security,” visitors had to first sit through a playful ad for skin-care products. At CBSNews.com, watching “60 Minutes Reports: Tragedy in Newtown” came with a pre-roll pitch for insurance. On Yahoo! News, a video for “Newtown: Mourning and Grief” was coupled with an ad for batteries.
The juxtaposition of heart-wrenching news coverage with cheery holiday jingles can be particularly jarring online, which is a more active viewing experience—and a more intimate one. Also, without the presence of an anchor to ease the transition from news to advertising and back, the viewer can be watching singing dogs one moment and crying children the next.
This represents a tricky and growing challenge for news organizations. According to eMarketer, online video is the fastest growing category of Web ads; spending is expected to skyrocket from $2.93 billion in 2012 to $8.04 billion in 2016. When disaster strikes, be it a mass shooting, a terrorist attack, or a deadly storm, broadcasters attempt to strike a balance between making money from the surge in online viewers and managing advertisers’ reluctance to be seen alongside tragic news. In the worst-case scenario, the broadcaster and advertiser end up repelling the viewers they most seek to court.
While some brands (such as airlines) have contractual stipulations to halt their ads from appearing before videos about a tragedy in their industry (such as airline crashes), the decision is often left up to the discretion of the publisher.
Annie Rohrs, a spokesperson for CBSNews.com (CBS), says the site’s policy is to pull not only pre-roll ads but also display ads from news stories involving tragedies. “We removed all ads from our Newtown coverage on Friday,” Rohrs explained via e-mail on Monday night. “Due to a technical glitch, pre-roll ads were briefly run today in some video coverage.”
Julie Townsend, a spokesperson for ABC News (DIS), says that the news organization’s policy is to remove as many ads as possible from stories involving tragedies such as Sandy Hook, but that technical considerations make removing all pre-roll videos on a breaking news story more difficult than pulling down all the banner ads. Yahoo News (YHOO) declined to comment.
On NBCNews.com, viewers shouldn’t see any pre-roll ads in front of stories about the shooting in Newtown—for now. “On this story, very early in the coverage last week, we decided to turn the ads off,” says Stokes Young, executive producer for multimedia at NBCNews.com. “It was an editorial decision that is in line with our long-standing practice of considering the interests of the subjects of our stories, the viewers, and our advertisers, and whether playing a pre-roll ad in front of a video about such a horrific story would be appropriate for any of those three groups.”
Young says that at NBC News every video producer has the ability to turn off the pre-roll ad on a specific video—and also to escalate the question of appropriateness to the website’s top editors. Ultimately, it’s the site’s editorial team that decides whether a blanket policy of shutting off the ads is needed, due to the sensitivity of a particular news event. “Then we’ll let our partners in ad sales on the business side know,” says Young. “Generally, they’re supportive.”
Eventually, as the news cycle progresses from reporting on the initial victims of the tragedy to, say, exploring the long-term political ramifications of the event, the editorial team will talk about whether to turn the ads back on. On Monday afternoon at NBC News, said Young, that moment still seemed a long way off.
Why News Is Like Sugar For Your Brain
Is There Value in News as We Receive It from Newspapers, TV/Cable, or the Internet?
Czabe posted this piece on his blog:
This is a rather interesting story about how bad chronic consumption of modern, television-driven, internet-delivered electronic news can be for your mind and even your health.
It's As Bad As Sugar is For Your Body
And I'll admit: I have a hard time not reaching for this limitless supply of "brain sugar." News. Opinion articles. Politics. World affairs. And of course, sports news. News, news, news.
I often chide my own father and father-in-law for watching cable and network news shows way too much.
My dad DVRs all three network newscasts each night, then proceeds to watch each one in succession, alternately yelling at the screen about their horrible and obvious liberal bias (no debate there) and fending off my mother's yelling at him about how they all "have the same damn stories!" (She's right, too!)
My father-in-law, when he visits, parks himself in front of Fox News and CNN for hours and hours at a time. I sometimes have to gently chide him to turn it off, because all cable news does is "anger up the blood," as the great Satchel Paige once said about fried meats.
Here are a few of the reasons why this author says we'd all be better off severely cutting back on our "news" consumption:
In the past few decades, the fortunate among us have recognised the hazards of living with an overabundance of food (obesity, diabetes) and have started to change our diets. But most of us do not yet understand that news is to the mind what sugar is to the body. News is easy to digest. The media feeds us small bites of trivial matter, tidbits that don't really concern our lives and don't require thinking. That's why we experience almost no saturation. Unlike reading books and long magazine articles (which require thinking), we can swallow limitless quantities of news flashes, which are bright-coloured candies for the mind. Today, we have reached the same point in relation to information that we faced 20 years ago in regard to food. We are beginning to recognise how toxic news can be.
News is irrelevant. Out of the approximately 10,000 news stories you have read in the last 12 months, name one that – because you consumed it – allowed you to make a better decision about a serious matter affecting your life, your career or your business. The point is: the consumption of news is irrelevant to you. But people find it very difficult to recognise what's relevant. It's much easier to recognise what's new. The relevant versus the new is the fundamental battle of the current age. Media organisations want you to believe that news offers you some sort of a competitive advantage. Many fall for that. We get anxious when we're cut off from the flow of news. In reality, news consumption is a competitive disadvantage. The less news you consume, the bigger the advantage you have.
News works like a drug. As stories develop, we want to know how they continue. With hundreds of arbitrary storylines in our heads, this craving is increasingly compelling and hard to ignore. Scientists used to think that the dense connections formed among the 100 billion neurons inside our skulls were largely fixed by the time we reached adulthood. Today we know that this is not the case. Nerve cells routinely break old connections and form new ones. The more news we consume, the more we exercise the neural circuits devoted to skimming and multitasking while ignoring those used for reading deeply and thinking with profound focus. Most news consumers – even if they used to be avid book readers – have lost the ability to absorb lengthy articles or books. After four, five pages they get tired, their concentration vanishes, they become restless. It's not because they got older or their schedules became more onerous. It's because the physical structure of their brains has changed.
News wastes time. If you read the newspaper for 15 minutes each morning, then check the news for 15 minutes during lunch and 15 minutes before you go to bed, then add five minutes here and there when you're at work, then count distraction and refocusing time, you will lose at least half a day every week. Information is no longer a scarce commodity. But attention is. You are not that irresponsible with your money, reputation or health. Why give away your mind?
News makes us passive. News stories are overwhelmingly about things you cannot influence. The daily repetition of news about things we can't act upon makes us passive. It grinds us down until we adopt a worldview that is pessimistic, desensitised, sarcastic and fatalistic. The scientific term is "learned helplessness". It's a bit of a stretch, but I would not be surprised if news consumption at least partially contributes to the widespread disease of depression.
The recent Boston Marathon bombing has been a godsend for the cable networks, even though they would sanctimoniously deny it if pressed. It has given them hours and hours of sensational footage already, and with the capture of that one dummy alive, will provide weeks and months of additional "programming."
And there's nothing "wrong" per se about keeping up with what happens to him now, and what we might learn as to his terrorist connections.
But will it affect your life in any meaningful way? Will it deepen your knowledge of something in life that will be useful or give you happiness going forward?
No. Not a chance.
I am going to try to resist getting suckered in as best I can, because I know it's easy to do and I'm far from perfect. Even replacing the news with silly, staged shows like Duck Dynasty is a far better choice. At least Phil saying "happy-happy-happy" makes me smile and relaxes me.
Something Wolf Blitzer has never done."
In the age of social media, there is no such thing as broadcasting – everything is a conversation.
Mobile Devices Are Changing Community Information Environments
The analysis above of the mass media environments and news is not a simple matter that can be glossed over. What I mean by saying so is that there has been an evolution, a change and shift of paradigm in news gathering, dissemination, presentation and consumption, in every way we can imagine, from the past up to the present technological society. This is important, for the media are constantly changing even as I write this Hub. This Hub is about the news and channels of discourse. The news has morphed into what people make and decide is and can be news, and the old news organizations are facing a challenge they never really anticipated: their consumers do not only consume the news that these agencies produce, but themselves produce and make news, and do so with new and emerging gizmos within new and converging/emerging media environments.
The coming of the digital framework and environment has changed the news and information paradigm into a new entity, one which we shall have to look at much more carefully than we have hitherto done. One author who seems to capture this sense with alarming alacrity is Douglas Rushkoff, who states:
“I am much less concerned with whatever it is technology may be doing to people than what people are choosing to do to one another through technology,” Mr. Rushkoff writes. “Facebook’s reduction of people to predictively modeled profiles and investment banking’s convolution of the marketplace into an algorithmic battleground were not the choices of machines.” They were made by human intelligence, because present shock’s ways of targeting, pinpointing and manipulating aren’t just shocking. They’re very lucrative too.
What I am trying to point out here is that the changes wrought by these new merging and emerging technologies, techniques and their environments need to be understood much better, and looked at from the old to the new. Maybe then we can begin to wrap our heads around what is happening to our information/data base and environment, and more especially, what our means of discourse are and how we decide what is news today. Rushkoff informs us thusly:
"In my book I describe the present shock of politicians who — thanks to the 24/7 coverage ushered in by “the CNN effect” that began in the 1980s — “cannot get on top of issues, much less get ahead of them.” I note that both the political left (MSNBC, with its slogan “Lean Forward”) and right (conservatism devoted to reviving traditional values) share this goal: They’re trying to escape the present."
It is, therefore, with this mindset that I shall, below, begin to look at the media from the perspective of other people (writers), so as to give the reader a sense of what is really happening; maybe they can pick up from it some important insights as to what the contemporary media is all about.
The State of the News Media
By Tom Rosenstiel and Amy Mitchell of the Project for Excellence in Journalism
By several measures, the state of the American news media improved in 2010.
After two dreadful years, most sectors of the industry saw revenue begin to recover. With some notable exceptions, cutbacks in newsrooms eased. And while still more talk than action, some experiments with new revenue models began to show signs of blossoming.
Among the major sectors, only newspapers suffered continued revenue declines last year—an unmistakable sign that the structural economic problems facing newspapers are more severe than those of other media. When the final tallies are in, we estimate 1,000 to 1,500 more newsroom jobs will have been lost—meaning newspaper newsrooms are 30% smaller than in 2000.
Beneath all this, however, a more fundamental challenge to journalism became clearer in the last year. The biggest issue ahead may not be lack of audience or even lack of new revenue experiments. It may be that in the digital realm the news industry is no longer in control of its own future.
News organizations — old and new — still produce most of the content audiences consume. But each technological advance has added a new layer of complexity—and a new set of players—in connecting that content to consumers and advertisers.
In the digital space, the organizations that produce the news increasingly rely on independent networks to sell their ads. They depend on aggregators (such as Google) and social networks (such as Facebook) to bring them a substantial portion of their audience. And now, as news consumption becomes more mobile, news companies must follow the rules of device makers (such as Apple) and software developers (Google again) to deliver their content. Each new platform often requires a new software program. And the new players take a share of the revenue and in many cases also control the audience data.
That data may be the most important commodity of all. In a media world where consumers decide what news they want to get and how they want to get it, the future will belong to those who understand the public’s changing behavior and can target content and advertising to snugly fit the interests of each user. That knowledge — and the expertise in gathering it — increasingly resides with technology companies outside journalism.
In the 20th century, the news media thrived by being the intermediary others needed to reach customers. In the 21st, increasingly there is a new intermediary: Software programmers, content aggregators and device makers control access to the public. The news industry, late to adapt and culturally more tied to content creation than engineering, finds itself more a follower than leader shaping its business.
Meanwhile, the pace of change continues to accelerate. Mobile has already become an important factor in news. A new survey released with this year’s report, produced with the Pew Internet and American Life Project in association with the Knight Foundation, finds that nearly half of all Americans (47%) now get some form of local news on a mobile device. What they turn to most there is news that serves immediate needs – weather, information about restaurants and other local businesses, and traffic. And the move to mobile is only likely to grow. By January 2011, 7% of Americans reported owning some kind of electronic tablet. That was nearly double the number just four months earlier.
The migration to the web also continued to gather speed. In 2010 every news platform saw audiences either stall or decline — except for the web. Cable news, one of the growth sectors of the last decade, is now shrinking, too. For the first time in at least a dozen years, the median audience declined at all three cable news channels.
For the first time, too, more people said they got news from the web than newspapers. The internet now trails only television among American adults as a destination for news, and the trend line shows the gap closing. Financially the tipping point also has come. When the final tally is in, online ad revenue in 2010 is projected to surpass print newspaper ad revenue for the first time. The problem for news is that by far the largest share of that online ad revenue goes to non-news sources, particularly to aggregators.
In the past, much of the experimentation in new journalism occurred locally, often financed by charitable grants, usually at small scale. Larger national online-only news organizations focused more on aggregation than original reporting. In 2010, however, some of the biggest new media institutions began to develop original newsgathering in a significant way. Yahoo added several dozen reporters across news, sports and finance. AOL had 900 journalists, 500 of them at its local Patch news operation (it then let go 200 people from the content team after the merger with the Huffington Post). By the end of 2011, Bloomberg expects to have 150 journalists and analysts for its new Washington operation, Bloomberg Government. News Corp. has hired between 100 and 150 people, depending on the press reports, for its new tablet newspaper, The Daily, though not all may be journalists. Together these hires come close to matching the jobs we estimate were lost in newspapers in 2010, the first time we have seen this kind of substitution.
A report in this year’s study also finds that new community media sites are beginning to put as much energy into securing new revenue streams — and refining audiences to do so — as creating content. Many also say they are doing more to curate user content.
Traditional newsrooms, meanwhile, are different places than they were before the recession. They are smaller, their aspirations have narrowed and their journalists are stretched thinner. But their leaders also say they are more adaptive, younger and more engaged in multimedia presentation, aggregation, blogging and user content. In some ways, new media and old, slowly and sometimes grudgingly, are coming to resemble each other.
The result is a news ecology full of experimentation and excitement, but also one that is uneven, has uncertain financial underpinning and some clear holes in coverage. Even in Seattle, one of the most vibrant places for new media, “some vitally important stories are less likely to be covered,” said Diane Douglas who runs a local civic group and considers the decentralization of media voices a healthy change. “It’s very frightening to think of those gaps and all the more insidious because you don’t know what you don’t know.” Some also worry that with lower pay, more demands for speed, less training, and more volunteer work, there is a general devaluing and even what scholar Robert Picard has called a “de-skilling” of the profession.
Among the features in this, the eighth edition of the State of the News Media produced by the Pew Research Center’s Project for Excellence in Journalism, is a report on how American Newspapers fare relative to those in other countries, two reports on the status of community media, a survey on mobile and paid content in local news, and a report on African American Media. The chapters this year have also been reorganized and streamlined: each is made up now of a Summary Essay and a longer, separate By the Numbers section where all the statistical information is more easily searchable and interactive.
Each year, this report also identifies key trends. In addition to the growing significance of third-party players in shaping the future of news, six stand out entering 2011:
The news industry is turning to executives from outside. The trend has a scattered history. The complex revenue equation of news — that it was better to serve the audience even to the irritation of advertisers that paid most of the bills — tended to trip up outsiders. It spelled the end, for instance, of Mark Willes at Times Mirror when he let advertisers dictate content. With the old revenue model broken, more companies are again looking to outsiders for leadership. One reason is new owners. Seven of the top 25 newspapers in America are now owned by hedge funds, which had virtually no role a few years ago. The age of publicly traded newspaper companies is winding down. And some of the new executives are blunt in their assessments. John Paton, the new head of Journal Register newspapers, told a trade group in December: “We have had nearly 15 years to figure out the web and, as an industry, we newspaper people are no good at it.” A question is how much time these private equity owners will give struggling news operations to turn around. One of these publishers told PEJ privately that he believed he had two years.
Less progress has been made charging for news than predicted, but there are some signs of willingness to pay. The leading study on the subject finds that so far only about three dozen newspapers have moved to some kind of paid content on their websites. Of those, only 1% of users opted to pay. And some papers that moved large portions of content to subscription gave up the effort. A new survey released for this report suggests that under certain circumstances the prospects for charging for content could improve. If their local newspaper would otherwise perish, 23% of Americans said they would pay $5 a month for an online version. To date, however, even among early adopters only 10% of those who have downloaded local news apps paid for them (this doesn’t include apps for non-local news or other content). At the moment, the only news producers successfully charging for most of their content online are those selling financial information to elite audiences — the Financial Times is one, the Wall Street Journal is another, Bloomberg is a third — which means they are not a model that will likely work for general interest news.
If anything, the metrics of online news have become more confused, not less. Many believe that the economics of the web, and particularly of online news, cannot really progress until the industry settles on how to measure audience. There is no consensus on what is the most useful measure of online traffic. Different rating agencies do not even agree on how to define a “unique visitor.” Does that denote different people, or does the same person visiting a site from different computers get counted more than once? The numbers from one top rating agency, comScore, are in some cases double and even triple those of another, Nielsen. More audience research data exist about each user than ever before. Yet in addition to confusion about what it means, it is almost impossible to get a full sense of consumer behavior across sites, platforms, and devices. That leaves potential advertisers at a loss about how to connect the dots. In March 2011, three advertising trade groups, supported by other media associations, announced an initiative to improve and standardize confusing digital media metrics, called Making Measurement Make Sense, but the task will not be easy.
Local news remains the vast untapped territory. Most traditional American media, and much of U.S. ad revenue, are local. The dynamics of that market online are still largely undefined. The potential, though, is clear. Already 40% of all online ad spending is local, up from 30% just a year earlier. But the market at the local level is different than nationally and requires different strategies, both in content creation and economics. Unlike at the national level, at the local level display advertising — the kind that news organizations rely on — is bigger than search, market researchers estimate. And the greatest local growth area last year was in highly targeted display ads that many innovators see as key to the future. Even Google, the king of search, sees display as “our next big business,” as Eric Schmidt, its CEO, told the New York Times in September.
The nature of local news content is also in many ways undefined. While local has been the area of greatest ferment for nonprofit startups, no one has yet cracked the code for how to produce local news effectively at a sustainable level. The first major concept in more traditional venues, the push toward so-called “hyperlocalism,” proved ill-conceived, expensive and insufficiently supported by ads. Yahoo’s four-year-old local news and advertising consortium has shown some success for certain participants but less for others. There are some prominent local news aggregators such as Topix and Examiner.com, and now AOL has entered the field with local reporting through Patch. Whether national networks will overtake small local startups or local app networks will mix news with a variety of other local information, the terrain here remains in flux.
The new conventional wisdom is that the economic model for news will be made up of many smaller and more complex revenue sources than before. The old news economic model was fairly simple. Broadcast television depended on advertising. Newspapers on circulation revenue and a few basic advertising categories. Cable was split half from advertising and half from cable subscription fees. Online, most believe there will be many different kinds of revenue. This is because no one revenue source looks large enough and because money is divided among so many players. In the biggest new revenue experiment of 2010, the discount sales coupon business led by Groupon, revenue can be split three ways when newspapers are involved. On the iPad, Apple gets 30% of the subscription revenue and owns the audience data. On the Android system, Google takes 10%. News companies are trying to push back. One new effort involves online publishers starting their own ad exchanges, rather than having middlemen to do it for them. NBC, CBS and Forbes are among those launching their own, tired of sharing revenue and having third parties take their audience data.
The bailout of the auto industry helped with the media's modest recovery in 2010. One overlooked dimension in the year past: a key source of renewed revenue in news in 2010 was the recovery in the auto industry, aided by the decision to lend federal money to save U.S. carmakers. Auto advertising jumped 77% in local television, 22% in radio and 17% in magazines. The other benefactor of the news industry, say experts, was the U.S. Supreme Court: Its Citizens United decision allowing corporations and unions to buy political ads for candidates helped boost political advertising spending on local television to an estimated $2.2 billion, a new high for a midterm campaign year.
Local news is going mobile. Nearly half of all American adults (47%) report that they get at least some local news and information on their cellphone or tablet computer.
What they seek out most on mobile platforms is information that is practical and in real time: 42% of mobile device owners report getting weather updates on their phones or tablets; 37% say they get material about restaurants or other local businesses. These consumers are less likely to use their mobile devices for news about local traffic, public transportation, general news alerts or to access retail coupons or discounts.
One of the newest forms of on-the-go local news consumption, the mobile application, is just beginning to take hold among mobile device owners.
Compared with other adults, these mobile local news consumers are younger, live in higher-income households, are newer residents of their communities, live in nonrural areas, and tend to be parents of minor children. Adults who get local news and information on mobile devices are more likely than others to feel they can have an impact on their communities, to use a variety of media platforms, to feel more plugged into the media environment than they did a few years ago, and to use social media:
- 35% of mobile local news consumers feel they can have a big impact on their community (vs. 27% of other adults)
- 65% feel it is easier today than five years ago to keep up with information about their community (vs. 47% of nonmobile connectors)
- 51% use six or more different sources or platforms monthly to get local news and information (vs. 21%)
- 75% use social network sites (vs. 42%)
- 15% use Twitter (vs. 4%)
Tablets and smartphones have also brought with them news applications or “apps.” One-quarter (24%) of mobile local news consumers report having an app that helps them get information or news about their local community. That equates to 13% of all device owners and 11% of the total American adult population. Thus while nearly 5 in 10 get local news on mobile devices, just 1 in 10 use apps to do so. Call it the app gap.
These mobile app users skew young and Hispanic. They are also much more active news consumers than other adults, using more sources regularly and “participating” in local news by doing such things as sharing or posting links to local stories, commenting on or tagging local news content, or contributing their own local content online.
Many news organizations are looking to mobile platforms to provide new ways to generate revenue in local markets. The survey suggests there is a long way to go before that happens. Currently, only 10% of adults who use mobile apps to connect to local news and information pay for those apps. This amounts to just 1% of all adults.
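The chain of survey percentages in the passages above can be checked with simple arithmetic. A minimal sketch (the input figures are the survey numbers quoted in the text; treating each share as a straightforward fraction of the preceding group is an assumption made here for illustration):

```python
# Survey figures quoted in the text (shares expressed as fractions of adults):
mobile_local_news = 0.47  # U.S. adults who get local news/info on a mobile device
app_users_share   = 0.24  # share of those consumers who have a local-news app
paying_share      = 0.10  # share of app users who pay for those apps

# Chained fractions of the total adult population
app_users_of_adults = mobile_local_news * app_users_share
payers_of_adults    = app_users_of_adults * paying_share

print(round(app_users_of_adults, 2))  # 0.11 -> the "11% of the total adult population"
print(round(payers_of_adults, 2))     # 0.01 -> the "just 1% of all adults" who pay
```

This is the "app gap" in numbers: nearly half of adults get local news on mobile, but multiplying through the chain leaves only about one adult in a hundred actually paying for a local-news app.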
When it comes to payments for news more broadly, 36% of adults say they pay for local news content in some form – be it for their local print newspaper, for an app on their mobile device or for access to special content online. The vast majority of those who pay for local news, 31% in all, are paying for local print newspaper subscriptions and only a fraction are paying for apps or for access online to local material.
One question in the news industry is whether the willingness to pay for online content would grow if people faced the prospect of their local media not surviving otherwise. Pressed on the value of online access to their local newspaper, 23% of survey respondents say they would pay $5 a month to get full access to local newspaper content online. When asked if they would pay $10 per month, 18% of adults say yes. Both figures are substantially higher than the percentage of adults (5%) who currently pay for online local news content. Nonetheless, roughly three-quarters say they would not pay anything.
Asked the value of their local newspaper, respondents are divided. Just under a third (28%) say the loss of the local newspaper would have a major impact on their ability to keep up with local information. Another 30% say it would have a minor impact. But the plurality, 39%, say the loss of the newspaper would have no impact.
The constant and reliable readership of news has been broken and fragmented into many parts by emerging technologies, techniques, mediums and gadgets, including new modes of reportage and of disseminating information, all caught up in the viral stream. The power of the Internet, together with constantly emerging and merging devices and their refined techniques, is reshaping the readership and consumers of media and data into a new kind of consumer, one who decides what news and information suits them and what does not. In the process, these consumers shape the news itself and its gathering and dissemination: the consumer becomes an informer, receiving and recycling what the old agencies churn out. Deciding what's news has taken on a new form and operation, and as the technologies evolve, so will the decision as to what counts as news be affected and reflected by the consumers and disseminators of the technological societies we now live in.
At the same time, we recall what Tom Rosenstiel and Amy Mitchell stated above:
"News organizations — old and new — still produce most of the content audiences consume. But each technological advance has added a new layer of complexity—and a new set of players—in connecting that content to consumers and advertisers.
"In the digital space, the organizations that produce the news increasingly rely on independent networks to sell their ads. They depend on aggregators (such as Google) and social networks (such as Facebook) to bring them a substantial portion of their audience. And now, as news consumption becomes more mobile, news companies must follow the rules of device makers (such as Apple) and software developers (Google again) to deliver their content. Each new platform often requires a new software program. And the new players take a share of the revenue and in many cases also control the audience data.
"That data may be the most important commodity of all. In a media world where consumers decide what news they want to get and how they want to get it, the future will belong to those who understand the public’s changing behavior and can target content and advertising to snugly fit the interests of each user. That knowledge — and the expertise in gathering it — increasingly resides with technology companies outside journalism.
"In the 20th century, the news media thrived by being the intermediary others needed to reach customers. In the 21st, increasingly there is a new intermediary: Software programmers, content aggregators and device makers control access to the public. The news industry, late to adapt and culturally more tied to content creation than engineering, finds itself more a follower than leader shaping its business."
This remains the mainstay of the business of news and information services, and this has adjusted according to the demands of the new and emerging media and audiences...
Lessons in Media Ecology: Strategies + Eulogies for Big Media
Understanding Media Ecology
It is at this juncture that we take an in-depth look into Media Ecology's archaeological infrastructure and structure to enhance the discourse about these viral streaming ecologies. As Goddard writes: "In this article the emergent paradigm of media ecologies is distinguished from the 'actually existing' media ecology emerging out of the work of McLuhan, Postman and the media ecology association. The appearance of Fuller's book was understandably unsettling for members of the latter and certainly marks at least a profound rupture in the media ecological paradigm, if not a total break." (Goddard)
We further learn from Goddard in the extensively cited piece below that:
"While Matthew Fuller’s book entitled Media Ecologies has had a considerable impact on research into new media, digital art, alternative media and other spheres, it still remains relatively little-known in mainstream media studies and contains great potential for further development in relation to many fields of media research.
Media Ecology is a term that has existed for some time at the peripheries of media studies and theories, and is notably associated with the celebrated media theorist Marshall McLuhan. There is, however, a certain perhaps necessary confusion around the deployment of the term ‘Media Ecologies’ in Fuller’s book, partly because of the differences in this deployment from the already existing field of research known as ‘Media Ecology’, a US-based post-McLuhan stream of media research of which the most well-known figure is undoubtedly Neil Postman.
The following essay will therefore touch upon these differences, before giving a different genealogy of Media Ecologies via the encounter between the rethinking of Ecology or rather Ecologies undertaken by Felix Guattari and the free radio movement in the 1970s, focusing especially on Radio Alice.
The Differences Between Fuller’s Media Ecologies and ‘Actually Existing’ Media Ecology
That the contrast between Media Ecologies and the abovementioned school of Media Ecology is not some exercise in Derridean hair-splitting is made abundantly clear by reading the review of the book that was published in Afterimage entitled ‘Taking Issue’, by Lance Strate, who is a central participant in the media ecology movement. Strate quotes the old saying that a rose by any other name would smell as sweet and as a good McLuhanite feels compelled to reject its wisdom: ‘If, on the other hand, you believe that the medium is the message, and that a good name is better than riches, then you may understand my concern over the title of Matthew Fuller’s new book, Media Ecologies’ (Strate, 2005: 55).
Strate goes on to add that Fuller’s book has little to do with Media Ecology, for which he gives a useful history, stating that it came out of conversations between Marshall McLuhan, Eric McLuhan and Neil Postman, dating back to 1967. He also points out that Fuller’s treatment of this tradition amounts to four pages of the introduction to Media Ecologies (2-5) and that Fuller fails to make any reference to any of its key texts. In many ways it is unsurprising that Strate would feel put out by Fuller’s book and feel the need to provide a corrective history of the term with which he has been working for some time.
His review makes abundantly clear how alien the book Media Ecologies is to this tendency and it is clear that it is coming from quite different theoretical sources and significantly operates within an equally different discursive universe. Beyond the quibbling over history is a real disagreement about media ecologies themselves that, as Fuller rightly points out, are treated by the media ecology tradition through an amalgam of humanism and technological determinism.
While the work of McLuhan can and has given rise to numerous possible interpretations ranging from a literary, anecdotal and metaphorical anthropocentrism to Friedrich Kittler’s radical machinic anti-humanism, the work of at least some of the media theorists associated with the media ecology school retreats from the more radical implications of McLuhan’s work into a type of liberal humanism, an operation that has both conceptual and political implications.
Consider, for example, the work of Neil Postman. In both Amusing Ourselves to Death (1987) and the more recent Technopoly (1993), Postman adopts a form of populist technophobia that only seems to maintain from McLuhan his anecdotal style and love of metaphor and whose only antidote to the Behemoth of technological domination seems to be a quite conservative notion of pedagogy. In other words, it is an approach to media that would be better characterised as pre- rather than post-McLuhanite (in the art historical sense of pre-Raphaelite) in that the full co-implications of human beings and technology are treated in a monolithic, rather than in a complex way.
This is strangely reminiscent of the Frankfurt School culture industry model of mass culture, whose one-sided and somewhat paranoid account of mass media has been the subject of important critiques. I would not extend this criticism to all practitioners of ‘actually existing media ecology’, some of whom seem to be relatively insightful scholars of McLuhan and the other theorists who Fuller characterises as a ‘vivid set of resources’ (Fuller, 2005: 4). But the point I would like to make is that Fuller’s book is a much needed intervention into this field, which in some respects can be seen as so many footnotes to McLuhan’s original and still important insight that the medium is the message.
As opposed to both the humanist conservative environmentalism of the media ecology school, Kittler’s anti-humanist technological determinism and the creative industries invocation of information ecologies as a free market strategy, Fuller injects a much needed materialism, politics and complexity into the term media ecologies as he uses it:
The book asks: what are the different kinds of [material] qualities in media systems with their various and particular or shared rhythms, codes, politics, capacities, predispositions and drives, and how can these be said to mix, to interrelate and to produce patterns, dangers and potentials? Crucial to such an approach is an understanding that an attention to materiality is most fruitful where it is often deemed irrelevant, in the immaterial domains of electronic media. (2)
What is crucial in this passage is the emphasis on the materiality of the supposedly immaterial components of media systems, including digital ones, and the association of this with politics since this not only distinguishes media ecologies from media ecology but from a good deal of media and specifically new media theory as well, precisely by proposing a material politics of media. In fact this is really the key reason why there is such a distance between media ecologies and media ecology: whereas the latter is closer to environmentalism, that is, the consideration of media systems as parts of relatively stable environments for which normative ideas about human beings form the centre, ‘media ecologies’ is closer to ecological movements. As Fuller describes this difference:
Echoing the differences in life sciences and various Green political movements, ‘environmentalism’ possesses a sustaining vision of the human and wants to make the world safe for it. Such environmentalism also often suggests … a state of equilibrium … Ecologists focus more on dynamic systems in which any one part is always multiply connected, acting by virtue of these connections and always variable, so that it can be regarded as a pattern rather than simply an object. (4)
This ecological as opposed to environmental conception of media ecologies (and the plural is also essential here) is necessarily activist, intervening in established knowledges about media systems and tracking the radical dynamisms that constitute them, however stable they might appear to be. This goes some way to explaining why the subsequent chapters of the book have varying methodological approaches and are engaged with radically diverse objects ranging from a single piece of Net Art, ‘The Camera that Ate Itself’ (55-84) to the London pirate radio network (13-54) that is perhaps the most systematic and recognisable ‘application’ of the concept of media ecologies.
The second part of this essay will therefore switch from discussing what Media Ecologies is not, in other words the media ecology movement, to one key source for what it is, that is a radically material and political intervention into established approaches to media including that of media ecology that, as Fuller acknowledges, draws substantially on the work of Felix Guattari.
The Three Ecologies and the Free Radios
Fuller acknowledges Guattari as a key reference not only for rethinking ecology but also media ecologies in the following terms: ‘Guattari’s use of the term ecology is worth noting here, first, because, the stakes he assigns to media are rightly perceived as being profoundly political or ethico-aesthetic at all scales. Aligning such political processes with creative powers of invention that demand “laboratories of thought and experimentation for future forms of subjectivation” (Guattari’s words), also poses a demand for the inventive rigor with which life among media must be taken up’ (5).
At the risk of leaping ahead to the conclusion of this essay, I would argue that at the very least, Fuller’s book is a fine example of applying just such an experimental attitude and just such inventive rigor to the field of media in order to, in Deleuzian terms, create a new concept of media ecologies, while nevertheless drawing productively but never slavishly on existing resources such as Guattari’s rethinking of ecologies as part of what he calls ecosophy.
Guattari was increasingly drawn towards ecology in his later writings, most explicitly in his essay The Three Ecologies which begins with the often quoted phrase from Gregory Bateson: ‘There is an ecology of bad ideas, just as there is an ecology of weeds’ (Guattari, 2000: 19). In the context of this essay, one might also be tempted to add the hypothesis of an ecology of bad media systems.
The point is, first of all, that ecology should not be limited to the physical systems studied by environmental science but ought to include (at least) two other levels, namely a social ecology of social relations and a mental ecology of subjectivity or rather the production of subjectivity. Guattari was well aware of the suspicion that tended to be applied to this third level whether from the ‘hard’ sciences or ‘hard’ politics, but for him this dimension is key to any truly ecosophic project. His treatment of these objections to taking seriously the incorporeal but material dimension of mental ecology in which sensibilities, intelligence and processes of desire take place, what Guattari referred to as vectors of subjectivation, is worth quoting in full:
I know that it remains difficult to get people to listen to such arguments, especially in those contexts where there is still a suspicion—or even an automatic rejection—of any specific reference to subjectivity. In the name of the primacy of infrastructures, of structures or systems, subjectivity still gets bad press, and those who deal with it, in practice or theory, will generally only approach it at arm’s length, with infinite precautions, taking care never to move too far away from pseudo-scientific paradigms, preferably borrowed from the hard sciences: thermodynamics, topology, information theory, systems theory, linguistics etc. … In this context, it appears crucial to me that we rid ourselves of all scientistic references and metaphors in order to forge new paradigms that are instead ethico-aesthetic in inspiration. (Guattari, 2000: 25)
Among other things, this dimension of subjectivation is crucial as it is the actual site where politics takes place, where new modes of sensibility and intelligence can be experimented with, mutate and transform themselves. No amount of dire warnings, backed up as they may be by hard empirical evidence, about such phenomena as global warming, for example, are ever going to result in the slightest political change without addressing these vectors of subjectivation, especially if they are merely imposed as part of a larger culture of fear and the cultivation of toxic and paranoid forms of subjectivity. Subjective ecologies and social ecologies are indissociable from physical environments and exist in complex relations of co-determination which any truly media ecological or even ecological practice needs to take fully into account.
But Guattari’s rethinking of ecology is not merely relevant for this reason but also because it was itself intimately involved with a rethinking of media themselves, which function for Guattari as just such vectors of subjectivation and perhaps the most important ones in contemporary societies.
As I stated earlier, Guattari was profoundly affected by his encounter with and participation in the Free Radio movements in Italy and France. In The Three Ecologies as in elsewhere in his work this encounter forms the basis for thinking what he referred to as the post-media era that he saw as potentially emerging from the rubble of mass media society: ‘An essential programmatic point for social ecology will be to encourage capitalist societies to make the transitions from the mass-media age to a post-media era in which the media will be appropriated by a multitude of subject-groups capable of directing its resingularisation.
Despite the seeming impossibility of such an eventuality, the current unparalleled level of media alienation is in no way an inherent necessity. It seems to me that media fatalism equates to a misunderstanding of a number of factors’ (Guattari, 2000: 40). The most relevant of these factors for our purposes is the third one Guattari mentions which is ‘the technological evolution of the media and its possible use for non capitalist goals, in particular through a reduction in costs and through miniaturisation’ (41).
From a contemporary perspective it is hard not to see everything from digital video to activist cybercultural projects such as Indymedia to digital networks in general to the various forms of social software as some kind of technological realisation of this call for a post-media era, that seems to have become at once less impossible and less utopian.
However, as I have argued elsewhere, this would be a far too technologically determinist understanding of Guattari’s concept of ecologies that pays too little attention to the crucial domain of mental ecology. In fact today’s miniaturised media are highly unstable ecologies where there is a clash of incompossible forces and unpredictable vectors, ranging from the reformulation of capitalism as cognitive to the experimentation with new mediatised modes of subjectivation. What this shows is that far from being utopian or too abstract, Guattari’s conception of a post-media era is at once perfectly real and in need of further complexification, which is just what Fuller’s concept and practice of media ecologies sets out to do.
Therefore rather than examining the contemporary media ecologies referred to above, the last part of this essay will focus in more detail on the Free Radio movement of the 1970s, specifically to bring out its impact on Guattari’s concept of a post-media era that is in turn influential on Fuller’s book. Nevertheless, much of what Guattari was able to discern in free radio stations like Radio Alice is of great relevance to the media ecologies of contemporary new media forms, as Fuller’s account of London pirate radio in Media Ecologies amply demonstrates.
Millions and Millions of Alices in Power
In the late 1970s Guattari devoted several texts to the phenomena of popular free radio and especially that taking place in Italy. ‘Why Italy’ (Guattari, 1996a: 79-84) is the essay that gives the clearest indication of why he considered this such an important phenomenon. First of all there is the concrete context: he had been asked to introduce the French translation of Alice è il diavolo, the principal documentation of this radio station and its political trajectory. Radio Alice interested him since it was a station of explicitly situationist and Deleuzo-Guattarian inspiration, thereby constituting an auto-referential feedback loop between his own rhizomatic thought and media subversion.
More importantly, Radio Alice and its conflict with the apparatuses of state control that eventually resulted in a massive wave of repression demonstrates very clearly how the media are a key site of struggle over the contemporary production of subjectivity; in Guattari’s terms, despite its apparent economic and technological backwardness at that time, Italy was the future of England, France and Germany. The molar aspect of this is that the polarising of politics into the mutually reinforcing duality of state violence and terrorism was developed first of all in Italy before being applied elsewhere and could be seen as an embryonic form of the global economy of fear under which we live today. However, what was behind this polarisation was the emergence of a new regime of consensus or control in which all previously existing forms of resistance such as trade unions or the communist party would be tolerated provided they fit into the overall regime of consensual control, for which they provide very useful tools for subjective reterritorialisation: the historic compromise between the Italian communist party and the social democrats being just one example of this process.
Guattari does not really go into detail about the specific political history of the Italian far left which had its roots in the 1960s development of Operaismo or ‘Workerism’, then developed via the interactions between an increasing radicalisation of both proletarian forms of action and workerist theory, the emergence of the student movement in the late 1960s, accompanied by the political expression of new subjectivities such as the feminist and gay liberation movements and ultimately the emergence of what became known as Autonomia or the ‘area of autonomy.’
According to Guattari, the groups associated with this tendency and that still advocated violent rupture with the consensus embodied in the historic compromise would be hunted down and eliminated, with no pretence of liberal models of justice or legal rights, which was indeed what happened first in Italy and then in Germany. But Guattari was less interested in terror or state repression, while considering them important issues demanding responses on a ‘molar’ or representational political level.
His primary interest in this essay is in the molecular revolution that was taking place around Radio Alice, one that the emerging consensual state apparatus was not able to tolerate. For Guattari, this is not a mere shift away from traditional apparatuses of struggle such as the communist party which have become completely compromised with the state in favour of new micropolitical groupings such as gay liberation or the women’s movement; these new groupings are no less susceptible to becoming reterritorialisations, finding their institutional place in the manufacture of consensus.
As he puts it, ‘there is a miniaturisation of forms of expression and of forms of struggle, but no reason to think that one can arrange to meet at a specific place for the molecular revolution to happen’ (82). While Guattari does not state it explicitly here, this corresponds very closely to the rejection of even micropolitical identities or political forms such as organisational Autonomia enacted by Radio Alice; it was not just a question of giving space for excluded and marginalised subjects such as the young, homosexuals, women, the unemployed and others to speak but rather of generating a collective assemblage of enunciation allowing for the maximum of transversal connections and subjective transformations between all these emergent subjectivities.
Guattari refers to Alice as ‘a generalised revolution, a conjunction of sexual, relational, aesthetic and scientific revolutions all making cross-overs, markings and currents of deterritorialisation’ (84). Rather than pointing to a new revolutionary form, the experimentation of Radio Alice was a machine for the production of new forms of sensibility and sociability, the very intangible qualities constitutive of both the molecular revolution and the post-media era.
Guattari is somewhat more specific about these practices in the essay ‘Popular Free Radio’ (1996a: -78). In this essay he poses, instead of the question of why Italy, that of why radio: why not Super 8 film or cable TV? The answer, for Guattari, is not technical but rather micropolitical. If media in their dominant usages can be seen as massive machines for the production of consensual subjectivity, then it is those media that can constitute an alternate production of subjectivity that will be the most amenable to a post-media transformation.
Radio at this time had not only the technical advantage of lightweight replaceable technology but more importantly was able to be used to create a self-referential feedback loop of political communication between producers and receivers, tending towards breaking down the distinctions between them: ‘the totality of technical and human means available must permit the establishment of a veritable feedback loop between the auditors and the broadcast team: whether through direct intervention by phone, through opening studio doors, through interviews or programmes based on listener made cassettes’ (75).
Again the experience of Radio Alice was exemplary in this regard: ‘We realise [with Radio Alice] that radio constitutes but one central element of a whole range of communication means, from informal encounters in the Piazza Maggiore, to the daily newspaper—via billboards, mural paintings, posters, leaflets, meetings, community activities, festivals etc’ (75). In other words, it is less the question of the subversive use of a technical media form than the generation of a media or rather post-media ecology, that is, a self-referential network for an unforeseen processual production of subjectivity amplifying itself via technical means.
As Guattari points out this is miles away both from ideas of local or community radio in which groups should have the possibility on radio to represent their particular interests and from conventional ideas of political radio in which radio should be used as a megaphone for mobilising the masses. In contrast, on Alice, serious political discussions were likely to be interrupted by violently contradictory, humorous and poetico-delirious interventions and this was central to its unique micropolitics.
It was even further removed from any modernist concern with perfecting either the technical form of radio (for example through concerns with perfecting sound quality) or its contents (the development and perfection of standard formats); listening to the tapes of Radio Alice is more than enough to convince about this last point. All of these other approaches to alternative radio, that is the local, the militant and the modernist, share an emphasis on specialisation; broadcasters set themselves up as specialists of contacts, culture and expression yet for Guattari, what really counts in popular free radio are ‘collective assemblages of enunciation that absorb or traverse specialities’ (75).
What this meant in practice was that on Alice an extreme heterogeneity of materials was broadcast tending towards a delirious flow of ‘music, news, blossoming gardens, rants, inventions, … messages, massages, lies’ (Berardi et al 2009: 82). Innovations of Radio Alice included the instantaneous reporting of news in the form of callers telephoning directly into the radio broadcasts from demonstrations and other political events and the lack of centralised control over what voices or ideas could be expressed, a philosophy of openness that would later be taken up by Independent Media Centres in the digital era.
This meant in practice that calls denouncing the radio producers as ‘filthy communists’ coexisted with calls to support a current demonstration, and with the caller who rang up just to declare that whoever stole his bicycle is a ‘son of a bitch’ (82). In short, there was a delirious flow of expression that disturbed the social order less through its content than by opening up channels of expression and feedback between this free expression and current political events, culminating in the radio becoming a key actor in the explosive political events of Bologna in March 1977, at the climax of which the radio station itself was targeted by the police and several of its key animators were arrested.
What this type of radio achieved most of all was the short-circuiting of representation in both the aesthetic sense of representing social realities and in the political sense of the delegate or the authorised spokesperson, in favour of generating a space of direct communication in which, as Guattari put it, ‘it is as if, in some immense, permanent meeting place—given the size of the potential audience—anyone, even the most hesitant, even those with the weakest voices, suddenly have the possibility of expressing themselves whenever they wanted. In these conditions, one can expect certain truths to find a new matter of expression’ (76). In this sense, Radio Alice was also an intervention into the language of media; the transformation from what Guattari calls the police languages of the managerial milieu and the University to a direct language of desire:
"Direct speech, living speech, full of confidence, but also hesitation, contradiction, indeed even absurdity, is charged with desire. And it is always this aspect of desire that spokespeople, commentators and bureaucrats of every stamp tend to reduce, to filter. … Languages of desire invent new means and tend to lead straight to action; they begin by ‘touching,’ by provoking laughter, by moving people, and then they make people want to ‘move out,’ towards those who speak and toward those stakes of concern to them." (76-77)
It is this activating dimension of popular free radio that most distinguishes it from the usual pacifying operations of the mass media, and that also posed the greatest threat to the authorities. If people were just sitting at home listening to strange political broadcasts, or being urged to participate in conventional, organised political actions such as demonstrations, that would be tolerable; but once a massive and unpredictable political affectivity and subjectivation is mobilised that is autonomous, self-referential and self-reinforcing, this becomes a cause for panic on the part of the forces of social order, as was amply demonstrated in Bologna in 1977. Finally, in the much more poetic and manifesto-like preface with which Guattari introduces the translation of texts and documents from Radio Alice, he comes to a conclusion which can perhaps stand as an embryonic formula for the emergence of the post-media era as anticipated by Radio Alice and the Autonomia movement more generally:
"In Bologna and Rome, the thresholds of a revolution without any relation to the ones that have overturned history up until today have been illuminated, a revolution that will throw out not only capitalist regimes but also the bastions of bureaucratic socialism … a revolution, the fronts of which will perhaps embrace entire continents but which will also be concentrated sometimes on a specific neighbourhood, a factory, a school. Its wagers concern just as much the great economic and technological choices as attitudes, relations to the world and singularities of desire. Bosses, police officers, politicians, bureaucrats, professors and psycho-analysts will in vain conjugate their efforts to stop it, channel it, recuperate it, they will in vain sophisticate, diversify and miniaturise their weapons to the infinite, they will no longer succeed in gathering up the immense movement of flight and the multitude of molecular mutations of desire that it has already unleashed. The police have liquidated Alice—its animators are hunted, condemned, imprisoned, their sites are pillaged—but its work of revolutionary deterritorialisation is pursued ineluctably right up to the nervous fibres of its persecutors." (Guattari, 1978: 11)
This is because the revolution unleashed by Alice was not reducible to a political or media form but was rather an explosion of mutant desire capable of infecting the entire social field because of its slippery ungraspability and irreducibility to existing sociopolitical categories. It leaves the forces of order scratching their heads because they don’t know where the crack-up is coming from since it did not rely on pre-existing identities or even express a future programme but rather only expressed its own movement of auto-referential self-constitution, the proliferation of desires capable of resonating even with the forces of order themselves, which now have to police not only these dangerous outsiders but also their own desires. This shift from fixed political subjectivities and a specified programme is the key to the transformation to a post-political politics and indeed to a post-media era in that politics becomes an unpredictable, immanent process of becoming rather than the fulfilment of a transcendent narrative. In today’s political language one could say that what counts is the pure potential that another world is possible and the movement towards it rather than speculation as to how that world will be organized.
Apart from anticipating many of the subsequent problematics of the counter-globalization movement, what this citation tells us most of all about the post-media era is that it is not something that can be given in advance; it is instead a process of the production of subjectivity, the becoming of a collective assemblage of enunciation whose starting point is the emptiness and coerciveness of the normalizing production of subjectivity that the mass media currently enact. This already gives us some indications as to which aspects of digital network culture might be able to contribute to this emergence of a post-media sensibility, and which elements, in contrast, merely help to add sophistication and diversity to normalisation processes under the guise of interactivity.
Guattari’s engagement with free radio was not, however, limited to Radio Alice but was also played out in relation to a range of free radio initiatives in France from 1977 to 1981. In fact, it was the events surrounding Radio Alice and its repression that led to Guattari’s first involvement with Radio Verte. According to Thierry Lefebvre, a press conference set up by Guattari on the 11th of July, 1977, in order to denounce the imprisonment of Franco Berardi, who was coincidentally provisionally released that very day, was instead used to announce that Radio Verte would begin broadcasting the next day at 7 AM (Lefebvre, 2008: 115).
The next day a few people showed up in a borrowed office with the minimum of equipment necessary to begin broadcasting: two microphones, a turntable, a small mixing desk and a 100 watt transmitter. The transmission was oriented more to spontaneity than professionalism and went out live; three of the people present were Italians formerly involved with Radio Alice, directly linking the experiment to the recent experience of free radio in Italy, a link reinforced by making this the topic of the first broadcast: ‘They spoke of Franco Berardi, about the conditions of his arrest, the situation in Bologna, the appeal of intellectuals against repression in Italy.
Little by little the discussion turned towards the necessity for the breaking up of the monopoly of the airwaves, on the problem of the right to speech of immigrant workers’ (Le Matin de Paris, July 1977, cited in Lefebvre 2008: 116-117). Guattari’s involvement with French free radio was not limited to this particular station: he was also involved with Radio Libre Paris and later Radio Tomate, amongst others, and contributed to the organization of the free radio movement association, ALO. This was not without controversy, with some radio animators claiming that Guattari and his collaborators were attempting to impose an Italian political model on the French radio experience before a similarly radicalized political plane effectively existed in France.
As the ALO became increasingly closely aligned with the nascent emergence of commercial radio initiatives, Guattari became disillusioned with the experience of free radio in France, concluding in 1980 that ‘[Today] the fanatics of radio for radio’s sake, the mythomaniacs of “new communications”, occupy centre stage. A new sickness, benign but tenacious, “radio-maniacal” narcissism, is spreading like an epidemic’ (334). If the experience of French free radio, for Guattari, became less a radio of the movement than a movement for radio fetishists, it nevertheless demonstrated Guattari’s pragmatic and active involvement in the field of radio as a potentially radical media ecological practice.
It also demonstrated the ecological interdependence of radio experimentation and its socio-political context. In particular, it pointed to the marked differences between the radical political and social movements of Autonomia in Italy, and their equally drastic repression, and the far more middle-of-the-road political situation in France, epitomised by the election of the Socialist party of François Mitterrand, an election supported by several intellectuals formerly associated with the far left, such as Régis Debray, who would later, somewhat ironically, reinvent himself as the founder of ‘mediology.’
The 1980s, with the ascendancy of global neo-liberal policies on both the right and the left, and a concomitant deregulation, commercialisation and globalisation of the entire mediascape, including radio, marked the end of a certain political conception of free radio: a fairly bitter result for those involved with radical free radio movements, who saw their efforts to break state monopolies over the airwaves succeed for the benefit of a new generation of transnational commercial media operators, perhaps one of the key reasons that Guattari referred to the early years of this decade as ‘the years of winter.’
Nevertheless, the desire to appropriate the airwaves for other forms of expression was one that would be continually reactivated in different forms in a variety of contexts, including the experience of London pirate radio that Matthew Fuller engages with in Media Ecologies.
While London pirate radio is not based on any leftist political agenda, in other respects it fully embodies Felix Guattari’s call for a micropolitical radio, facilitating the expression of subjectivities, in this case largely but not exclusively Afro-Caribbean youth, who are otherwise excluded from expression via the mainstream media. Referring to Simon Reynolds’ account of pirate radio in Energy Flash (1998), Fuller points to the way that pirate radio operated as a feedback loop between the creative chaos of the radio transmissions themselves and the ‘hardcore massive’ at home who were directly integrated into the radio transmissions via call-ins, SMS messaging and a range of extra radio phenomena including clubs, parties, flyers and graffiti, drugs and new modes of DJing and musical expression.
Part of what Fuller does is to provide an inventory of all the elements, whether technological, subjective or environmental, out of which pirate radio is constituted, as well as a mapping of their material relations. While far more detailed in dealing with technical devices such as turntables or mobile phones than Guattari’s writings on free radios, Fuller nevertheless provides an analysis that similarly shows the interdependence of radiophonic and extra-radiophonic elements, including the surrounding urban environment that made London pirate radio possible. For Fuller, the combinations between the various components that make up pirate radio constitute a machinic phylum with a tendency to become self-organising, a tendency that was no less evident in the case of Radio Alice.
The sound of pirate radio is not reducible to its technical and social components: it also ‘articulates them, gives them sensual, rhythmic and material force’ (Fuller, 2005: 19). Fuller also shows how a media ecological approach, while not excluding ‘content’, has to locate this content in the multiple connections of the media ecology considered as a mega-machine that articulates different technologies, humans, voices, subjectivities, experiences, radio waves, laws and regulations, digital networks, money and the relations and feedback between all these elements. In summary, pirate radio is, for Fuller, ‘always more than it is supposed to be … it is made and makes itself, by its always awesome capacity to flip into lucid explosions of beats, rhythms, and life’ (53).
In this way there is a direct ‘transmission’ between the 1970s experience of political free radios as engaged with by Guattari and the very different experience of contemporary pirate radio, linked less by any similar content or political aspirations than by a related machinic phylum able to crystallise a production and expression of subjectivity in a specific socio-political environment.
Guattari’s account of Radio Alice as a media ecology serves as an exemplary statement of media ecological practice, emphasising its political, subjective and ethico-aesthetic dimensions: in other words, Guattari’s conception of media ecology, and I would also argue Fuller’s, is less the question of the subversive use of a technical media form than the generation of a media or rather post-media assemblage, that is a self-referential network for an unforeseen processual and political production of subjectivity amplifying itself via technical means.
The post-media field envisaged by Guattari is today being realised in complex ways in a number of domains ranging from media art projects operating on a largely aesthetic register to politically motivated media labs to reinventions of the potentials of earlier media forms such as television, radio and journalism.
Usefully, Joanne Richardson, in her introduction to the Anarchitexts collection of essays on global digital resistance, distinguishes at least three post-media domains: tactical media, sovereign media and autonomous media culture. In her definition of the second of these territories of post-media praxis, she provides a description highly resonant with the project of media ecologies as formulated both by Guattari and more recently by Fuller:
"Tactical media knows the pleasures of media-in-itself and recognises the value of participation, but is still focused on a message and aims to reach an audience, however alternative. By contrast, sovereign media have learned to feign ignorance, ignore the demand for usefulness and the oppressive category of the audience. They mediate no information and are not the condition of possibility for any exchange. They communicate themselves, not to an audience of spectators but to a peer of equals, partners engaged in the same activity." (Richardson 2003: 11-12)
This is not to argue that sovereign media should be the 21st-century media ecological paradigm par excellence, but to emphasise that the media ecological or post-media era envisaged by Guattari is now a complex and diverse reality, characterised by a multiplicity of bifurcating projects, as expressed by the range of contributions to the Anarchitexts collection itself, which contains more than fifty contributions from at least as many post-media projects. This complexity and liveliness of contemporary media ecological praxis is also what this current issue of Fibreculture aims to make its own critical contribution to.
Marshall McLuhan And The Discussion Of The Implications Of Modern Day Media
The Perception Of The Media According To McLuhan
Existing Ecological And Environmental Gadgets, Now And Into The Future
Sebastian Anthony reports:
According to a new report that looks at how continuing improvements to artificial intelligence and robotics will impact society, “robotic sex partners will become commonplace” by 2025. A large portion of the report also focuses on how AI and robotics will impact both blue- and white-collar workers, with about 50% of the polled experts stating that robots will displace more human jobs than they create by 2025.
The report, called “AI, Robotics, and the Future of Jobs” and published by Pew Research, is a 66-page monster [PDF]. The report basically consists of a bunch of experts waxing lyrical about what the world will look like in 2025 if robots and AI continue to advance at the same scary pace of the last few years. Almost every expert agreed that robots and AI will no longer be constrained to repetitive tasks on a production line, and will permeate “wide segments of daily life by 2025.” The experts are almost perfectly split on whether these everyday robots will be a boon or a menace to society, though — but more on that at the end of the story.
While the report is full of juicy sound bites from experts such as Vint Cerf, danah boyd, and David Clark, one quote by GigaOM Research’s Stowe Boyd caught my eye. By 2025, according to Boyd, “Robotic sex partners will be a commonplace, although the source of scorn and division, the way that critics today bemoan selfies as an indicator of all that’s wrong with the world.”
Back In 2012
These robo-partners won’t necessarily have human-level intelligence (that’s still another 10+ years away, I think), but they’ll look, move, and feel a lot like real humans. At a bare minimum, if the study of human-robot relationships advances far enough, there could be some wide-ranging repercussions.
But, back to the bigger story: Will advanced AI and robots make the world a better place or not? Basically everyone agrees that robotics and AI are going to displace a lot of jobs over the next few years as the general-purpose robot comes of age. Even though these early general-purpose bots (such as Baxter in the video below) won’t be as fast or flexible as humans, they will be flexible enough that they can perform various menial tasks 24/7 — and cost just a few cents of electricity, rather than minimum wage. Likewise, self-driving vehicles will replace truck drivers, taxis, pizza delivery kids, and so on.
Displacing jobs with robots isn’t necessarily a bad thing, though. Historically, robots have been a net creator of jobs, as they free up humans to work on more interesting things — and invent entirely new sectors to work in. More robots also means less drudgery — less tilling the fields, less stop-start commute driving — and in theory more time spent playing games, interacting with your family, etc.
On the other hand, the robot jobocalypse is likely to happen very quickly — so fast that our economic, education, and political systems may struggle to keep up. Previously robots mostly replaced blue-collar workers, but this next wave will increasingly replace skilled/professional white-collar workers. A lot of these specialized workers may find themselves without a job, and without the means to find a new one. We may suddenly see a lot of 50-year-olds going back to university.
Robert K. Logan on The Origin and Evolution of Language
THE IMPACT OF GLOBALIZATION ON SOCIETY: Globalization and Technology
Rather than analyzing the decline of the humanities narrowly in terms of turf battles among disciplines, one ought to situate it within the larger social and historical panorama. Astonishingly, the debate proceeds with scant reference to the massive presence and continuing expansion of the technological system. Despite the postmodern belief in the continuity of academe with the "real world," vestiges of the ivory tower mentality may have induced us to think we are protected from the system, at least from its worst excesses. Nonetheless, the nature and impact of technology have been examined closely by such writers as Lewis Mumford, Jacques Ellul, Roderick Seidenberg, Gilbert Simondon, and Siegfried Giedion, not to mention Heidegger and Marcuse. They and others have tried to comprehend the unparalleled shift in adaptive behavior that has happened within the space of a hundred-odd years, from industrial to technological society.
Students of technology may be placed within two broad groupings: instrumentalists and substantivists. Instrumentalists believe that technologies are single tools that lie ready to hand as in a toolbox, and that tools are neutral or value-free means to chosen ends. Typically, instrumentalists speak of technologies rather than technology, thinking they can pick and choose among options while keeping their hands on the reins of power. For them, technology is indifferent to politics.6 A car is a car and a computer is a computer in any social or political context, and top-down management, bureaucratic expertise, and quality-control are the same everywhere. On the instrumentalist view, technology differs from law and religion, "which cannot be readily transferred to new social contexts because they are so intertwined with other aspects of the societies in which they originate."7
By contrast, the substantivists, a minority that includes such figures as Ellul and the later Mumford, argue that technology is a monolithic phenomenon vastly greater than the sum of its parts. For Heidegger, human beings are mere "standing reserves," raw materials to serve the system. Far from being neutral, technology has become the substance informing more and more of life, like an implacable bureaucracy at the core of things that directs decisions at every turn. Thus, choosing technology entails "unwitting cultural choices": instrumentalists might defend fast food as the most efficient way of getting calories, saving time, and avoiding social complexities; substantivists would recall the ritualistic aspects of the dinner hour, lament the breakdown of the family, and denounce the coarsening of taste.8 They would decry the fact that French children prefer what they affectionately call the "MacDo" to French cuisine.
The technological paradigm of Jacques Ellul is admittedly extreme, but its very extremism focuses the issues in their clearest light.9 According to Ellul, modern technology began with the machine, abstracted principles from it, then outstripped it, became independent, and finally turned itself into a political, economic, and social reality. For the essential concept and its all-embracing referent, Ellul uses the term "technique" (la technique), defined as "the totality of methods rationally arrived at and having absolute efficiency (for a given stage of development) in every field of human activity."10 Technique has five major features. The "prime characteristic"—indeed, the "supreme imperative"—is the principle of least effort or efficient ordering.11 This includes rationalization, measurement, standardization (e.g., of the production process), linearity, segmentation, simplification, minimum waste, and speed. Human values are filtered out except where they facilitate the technical means, which are omnipotent and often "unfriendly," thereby requiring the user-friendly convention. No real choice exists among technical methods: after all the necessary calculations are factored, the decision is obvious because technique dictates the one best means or least effort.12 Rival technology signifies that the principle has yet to make its latest judgment on a case, which will not be final because improvements and breakthroughs are always in the offing.13 If mistakes occur, technique intervenes to remove the defect and a new pathway is opened.
A second feature of technology is self-augmentation: machines keep making more and more machines. "Everything occurs as if the technological system were growing by an internal, intrinsic force, without decisive human intervention."14 Progress is irreversible and unceasing, and the progression is geometric as opposed to arithmetic. A breakthrough in one field brings solutions on all sides, like the internal combustion engine, the laser, or the computer; "these solutions in turn create even more problems which in turn demand ever more technical solutions."15 Paratechnologies quickly develop in response.16
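Ellul's contrast between geometric and arithmetic progression is a precise mathematical distinction: arithmetic growth adds a fixed increment at each step, while geometric growth multiplies by a fixed ratio, so the latter soon dwarfs the former. A minimal illustrative sketch (the starting values, step, and ratio here are arbitrary assumptions chosen for illustration, not figures from Ellul):

```python
def arithmetic_growth(start, step, n):
    """n terms of an arithmetic progression: add a fixed increment each step."""
    return [start + step * i for i in range(n)]

def geometric_growth(start, ratio, n):
    """n terms of a geometric progression: multiply by a fixed ratio each step."""
    return [start * ratio ** i for i in range(n)]

if __name__ == "__main__":
    terms = 10
    linear = arithmetic_growth(1, 1, terms)       # 1, 2, 3, ... 10
    exponential = geometric_growth(1, 2, terms)   # 1, 2, 4, ... 512
    # After only ten terms the geometric series has far outrun the arithmetic one.
    print(linear[-1], exponential[-1])
```

After ten terms the arithmetic series has reached 10 while the geometric one has reached 512, which is the shape of the claim that breakthroughs compound rather than accumulate.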
A third characteristic of technology is monism. The parts of the system are united to one another and recombine easily because they do not vary in their essentials. Technique is acultural, ahistorical, a-geographical; there is no Eastern or Western technology. We live inside a "transnational and multi-polar, interdependent, and highly interactive" order.17 Monism imposes the good with the bad uses of technique. At the point when atomic energy had been harnessed, it was bound to be used for a bomb. Information-gathering services can be applied to scholarship or surveillance. "Technique never observes the distinction between moral and immoral use. It tends, on the contrary, to create a completely independent technical morality."18 Robert Merton labels it the morality of "know-how": "Technique transforms ends into means. What had been prized in its own right now becomes worthwhile only if it helps achieve something else. And, conversely, technique turns means into ends. ‘Know-how’ takes on an ultimate value."19 Further, monism entails linkage: techniques of communication combine with techniques of administration and militarism—to produce propaganda, which becomes a new technique that can be applied elsewhere, as in advertising.
Fourth, technique implies universalism. It grows on all sides, across the planet, and into space, and everyone wants it, and more and more of it, from the richest to the poorest nations, from the capitalist nations to the socialist nations, from democratic regimes to totalitarian regimes.
For Harry Braverman, Ellul is a bourgeois ideologist, "fetishizing" technology, treating it independently of social relations, and not seeing it as a weapon in the hands of capitalists (Labor and Monopoly Capital: The Degradation of Work in the Twentieth Century [New York: Monthly Review Press, 1974], 229). But Ellul points out that technology had won over not only the capitalists, but the workers; that "a common will developed to exploit the possibilities of technique to the maximum, and groups of the most conflicting interests (state and individual, bourgeois and working class) united to hymn its praises" (Technological Society, 54-55). Moreover, he concedes that technique had improved the lot of labor in many ways, e.g., by reducing the working day and revolutionizing medicine. However, Ellul’s whole argument is that, if the workers were suddenly to come to power, their state would not want any less technology than the one they overthrew, so that the problem of technology would not disappear. The state is the major supporter of the technological system, and hence a frequent target of Ellul’s; from a strictly technological standpoint it does not so much matter what kind of state, as long as its ideology does not interfere, or interferes as little as possible, with the technological imperatives. Among reasons given for the collapse of the Soviet Union was its inability to keep up with the technological revolution.
John McMurtry defends Marx’s belief in technology as an extension of human nature against Ellul’s position that technology is in contradiction with it: Marx would construe such "neo-Luddite . . . distress at technology as, demystified, distress at the capitalist law of utilization of technology." But he never gets to the core of Ellul’s critique any more than Braverman does.
Technique, once a part of the whole, is now its envelope. Moreover, one can never do with just a little technology: a commitment to some of it inevitably brings in the rest; like a "universal language," it "shapes the total way of life."
In September 1997 China tried to justify its plan to privatize major industries by calling it "socialism with Chinese characteristics." No one was taken in by such propaganda. Ideology is mere window-dressing and tends to interfere with the smooth functioning of a world-wide system. Morning business news in the West begins with reports on the closing of the Hong Kong market. Technique subdues nature, a good example being the tentacular suburb which invades the environment and subjugates it.
One is hard pressed to think of a single aspect of human activity that has not been subjected to "a reflection of technological orientation": sports, entertainment, "speeded tests," sex, personal relations, religion, how-to books of myriad number (Ellul). The technological phenomenon crosses class lines, universalizes taste, and creates a global civilization. Mega-events such as the Olympic Games, the Soccer Championship, and Princess Diana’s funeral, impossible without technology, are watched by people numbering over a billion—and this is only the beginning.
The Role of Information and Communication Technologies
The Role of Science and Technology in Future Design
Jerome Karle wrote the following article:
The role of science and technology in future design will be discussed from the perspective of someone who has lived all his life in the United States and whose scientific experience has spanned the years since the late 1930s. It is likely that the reader will find in my discussion characteristics that apply to many developed countries and developing ones. Inasmuch as scientific progress is highly dependent on financial support and, in modern times, on general societal support, it is appropriate to discuss the interaction of science and society.
Using the United States as an example, some of the topics to be discussed are the views of public officials who influence the distribution of research funds, the response of funding agencies and the views of scientists. Finally, we shall look at the co-evolution of science and society and attempt to draw some conclusions concerning their related future and the implications for the future of technology.
Views of Public Officials
Public officials who are involved in setting or influencing science policy have expressed opinions that indicate that they intend to change the basis for supporting research and development. They speak in terms of a "paradigm shift" based on some new perception of the role of science in society. The word paradigm has several meanings, but in the way it is used here the words "pattern" or "model" may be good substitutes. In other words, the public officials wish to alter somewhat the pattern of funding for science. Their motivation is to orient research more toward programs that, for example, ensure a stronger economy and improvements in the environment. It is becoming increasingly apparent that those public officials who control public funds will be reluctant to fund research programs that they consider unrelated to national needs.
An example of priority-setting by public officials was the vote in the House of Representatives against further construction of the high energy accelerator known as the superconducting super collider. This shift in spending priorities implies that nuclear physics may receive less support in the future if it continues to be viewed as less related to the new national priorities than other scientific disciplines.
Views of Funding Agencies
The effect of the intention of federal officials to shift public research funds toward research programs that serve the national priorities has already affected the nature of the funding available at the funding agencies. For example, at the National Science Foundation, a small increase in funding for the chemistry division is directed toward so-called strategic research initiatives that involve, for example, advanced materials and processing, biotechnology, environmental chemistry and high-performance computing.
It is likely that this trend will continue. The Federal Coordinating Council on Science, Engineering and Technology identified the current national priority areas as high-performance computing, advanced materials, manufacturing research and education, biotechnology and global change. The expressed intention is to get more effort into those areas, but not to have them be entirely exclusive.
Views of Scientists
Many questions arose in the scientific community as a consequence of the use of words such as "new paradigm," "strategic areas," "priorities," and "national competitiveness" in statements concerning the future funding of science. The questions concerned many aspects of the support of science: Is the paradigm really new? Who decides which areas are strategic, and who sets the priorities? Are the important contributions of curiosity-driven basic research to be largely sacrificed?
The indications so far are quite clear that the government expects to shift publicly funded research activity into the areas that are deemed strategic. Is this a new paradigm or merely a shift in emphasis? Quite apparently there has been, over the years, heavy funding and much research in the strategic (priority) areas. There also has been in the United States a major Industry-University cooperative research program conducted by the National Science Foundation, which celebrated its 20th year of operation in January 1994.
An account of this very successful and extensive program has been presented in the January 24, 1994 issue of Chemical and Engineering News published by the American Chemical Society. The motivation of this cooperative program is to develop and transfer industrially relevant technologies from the university into practice. There are currently more than 50 active centers involving about 1,000 faculty members, about 1,000 graduate students and 78 universities.
More than 700 organizations sponsor the centers, including government agencies, national laboratories and about 500 industrial firms. A table in the article lists 55 research topics covering a broad array of technologies. It is pointed out that the success rate is very high, namely only 6% of the centers have failed. Major investments have been made by sponsor organizations, based on center technologies. There are also many other industry-university collaborations that are not part of the National Science Foundation program.
Do we really have a "new paradigm" and, if so, what is it? Performing research in the interest of national needs is not new. Cooperating with industry is not new. Setting priorities is not new. What could be new? What appears to be new is that, through the control of public funds, curiosity-driven research is to be curtailed to some unspecified degree in favor of research perceived to be in the national interest.
This, I believe, is the source of the apprehension among scientists. The major developments in science and technology generally derive from curiosity-driven research, and these developments have had, over time, great impact on the national interest, enriching the country with whole new industries and making contributions to the health, welfare, comfort and security of society. Is curtailing curiosity-driven research in the national interest?
The Impact of Curiosity-Driven Basic Research
Many scientific groups have produced literature that describes, in terms of many examples, how curiosity-driven research has led to important developments in the interest of society. The October 1993 issue of Physics Today celebrated the one hundredth anniversary of the journal Physical Review. A major part of this issue was devoted to the matter of basic research. An article by Robert K. Adair and Ernest M. Henley pointed out that "a century of fundamental physics research has appeared in the Physical Review.
Such research is the seed corn of the technological harvest that sustains modern society." In an article on the laser, Nicolaas Bloembergen points out that "the first paper reporting an operating laser was rejected by Physical Review Letters in 1960. Now lasers are a huge and growing industry, but the pioneers' chief motivation was the physics." In an article on fiber optics, Alastair M. Glass notes that "fundamental research in glass science, optics and quantum mechanics has matured into a technology that is now driving a communications revolution."
In an article on superconductivity, Theodore H. Geballe states that "it took half a century to understand Kamerlingh Onnes' discovery, and another quarter-century to make it useful. Presumably we won't have to wait that long to make practical use of the new high-temperature superconductors." Other articles concerned nuclear magnetic resonance, semiconductors, nanostructures and medical cyclotrons, all subjects of great technological and medical importance that originated in basic physical research.
In a preface for a publication of the American Chemical Society, Science and Serendipity, the President of the ACS in 1992, Ernest L. Eliel, writes about "The Importance of Basic Research." He writes that "many people believe - having read about the life of Thomas Edison - that useful products are the result of targeted research, that is, of research specifically designed to produce a desired product. But the examples given in this booklet show that progress is often made in a different way.
Like the princes of Serendip, researchers often find different, sometimes greater, riches than the ones they are seeking. For example, the tetrafluoroethylene cylinder that gave rise to Teflon was meant to be used in the preparation of new refrigerants. And the anti-AIDS drug AZT was designed as a remedy for cancer." He goes on to say that "most research stories are of a different kind, however.
The investigators were interested in some natural phenomenon, sometimes evident, sometimes conjectured, sometimes predicted by theory. Thus, Rosenberg's research on the potential effects of electric fields on cell division led to the discovery of an important cancer drug; Kendall's work on the hormones of the adrenal gland led to an anti-inflammatory substance; Carothers' work on giant molecules led to the invention of Nylon; Bloch and Purcell's fundamental work in the absorption of radio frequency by atomic nuclei in a magnetic field led to MRI.
Development of gene splicing by Cohen and Boyer produced, among other products, better insulin. Haagen-Smit's work on air pollutants spawned the catalytic converter. Reinitzer's discovery of liquid crystals is about to revolutionize computer and flat-panel television screens, and the discovery of the laser - initially a laboratory curiosity - is used in such diverse applications as the reattachment of a detached retina and the reading of barcodes in supermarkets.
All of these discoveries are detailed in this booklet (Science and Serendipity). Ernest Eliel goes on to point out that "the road from fundamental discovery to practical application is often quite long, ranging from about 10 years in the example of Nylon to some 80 years in the case of liquid crystals." He concludes that "if we stop doing fundamental research now, the 'well' that supplies the applications will eventually run dry. In other words, without continuing fundamental research, the opportunities for new technology are eventually going to shrink."
Some of the other topics in the brochure on Science and Serendipity, included to document further the importance of basic research, concerned several examples of the impact of chemistry on medicine. There are, in fact, countless such examples. The Federation of American Societies for Experimental Biology (FASEB), in its Newsletter of May 1993, considered basic biomedical research and its benefits to society.
I quote from the FASEB Public Affairs Bulletin of May, 1993. "There have been recent suggestions that tighter linkage between basic research and national goals should become a criterion for research support. Concerns also have been raised that science is being practiced for its own sake, and that it would be better for the nation if research were oriented more toward specific industrial applications."
They go on to point out that "the available evidence, however, clearly indicates that the desired linkage already exists. Indeed, a majority of scientists are intimately involved in the study and treatment of common human diseases and collaborate closely with clinical scientists. Industries involved in biomedical development have been remarkably efficient in commercial application of treatment modalities based on discoveries resulting from fundamental research funded primarily by the federal government.
"A critical factor in sustaining the competitive position of biomedical-based industries is for basic research to continue to provide a stream of ideas and discoveries that can be translated into new products. It is essential to provide adequate federal support for a broad base of fundamental research, rather than shifting to a major emphasis on directed research, because the paths to success are unpredictable and subject to rapid change.
"History has repeatedly demonstrated that it is not possible to predict which efforts in fundamental research will lead to critical insights about how to prevent and treat disease; it is therefore essential to support a sufficient number of meritorious projects in basic research so that opportunities do not go unrealized. Although its primary aim is to fill the gaps in our understanding of how life processes work, basic research has borne enormous fruit in terms of its practical applications.
We recognize that during a time when resources are constrained, it may be tempting to direct funding to projects that appear likely to provide early practical returns, but we emphasize that support for a wide-ranging portfolio of untargeted research has proven to be the better investment. This provides the broader base of knowledge from which all new medical applications arise. Decisions regarding what research to fund must be based on informed judgments about which projects represent the most meritorious ideas."
FASEB continues with a discussion of economic benefits and a number of examples of basic research-driven medical breakthroughs. "Society reaps substantial benefit from basic research. Technologies derived from basic research have saved millions of lives and billions of dollars in health care costs.
According to an estimate by the National Institutes of Health on the economic benefits of 26 recent advances in the diagnosis and treatment of disease, some $6 billion in medical costs are saved annually by those innovations alone. The significance of these basic research-derived developments, however, transcends the lowering of medical costs: the lives of children as well as adults are saved, and our citizens are spared prolonged illness or permanent disability. Fuller, more productive lives impact positively on the nation's economic and social progress."
FASEB continues with thirteen examples of contributions by basic research to the diagnosis and treatment of numerous diseases, most of them very serious. Also noted in this Public Affairs Bulletin is that "our ability to know in advance all that is relevant is very poor" (Robert Frosch) and that, in suggesting new ideas for the management of funding for science, "the serious consequences of harming the system" were never considered.
Up to this point, we have been concerned with basic science and its support by government funds in a modern society. Although there is also some support by private institutions established for that purpose, as well as some industrial investment in generally product-oriented basic research, the greatest amount of support by far comes from public funds. One of the ways that the public is repaid for its support is through the technology that fundamental research generates.
I suspect that the economic return from technology alone more than compensates for the monies expended for the entire basic research effort. I have no estimate, however, of whether my suspicion is true or not. It should be noted that the public gains much more than the economic value of technology. It gains culture, comfort, convenience, security, recreation, health and the extension of life. What monetary value can be put on the triumphs of health over debilitating or fatal disease? The monetary value has to be higher than the purely economic savings that were noted above in the 26 examples referred to in the FASEB Bulletin.
The word "technology" means industrial science and is usually associated with major activities such as manufacturing, transportation and communication. Technology has, in fact, been closely associated with the evolution of man, starting with tools, clothing, fire, shelter and various other basic survival items. The co-evolution persists and, since basic science is now very much a part of developing technologies, the term "co-evolution of science and society" that is used at times very much implies the co-evolution of both basic science and industrial science with society.
Advances in technology are generally accompanied by social changes as a consequence of changing economies and ways of carrying out life's various activities. An important question arises concerning how basic scientific discoveries eventually lead to new technologies and what that may mean to the rational support of basic research and the future of science and technology in the developed and developing world.
There are great uncertainties in the process that starts with basic research and ends with an economically successful technology. The discovery of a new research development that appears to have technological significance does not ensure the economic success of technologies that may be based on it.
Nathan Rosenberg of Stanford University, in a speech, "Uncertainty and Technological Change", before the National Academy of Sciences (April, 1994), pointed out that there are great uncertainties regarding economic success even in research that is generally directed toward a specific technological goal.
He notes that uncertainties derive from many sources: for example, failure to appreciate the extent to which a market may expand from future improvement of the technology, the fact that technologies arise with characteristics that are not immediately appreciated, and failure to comprehend the significance of improvements in complementary inventions, that is, inventions that enhance the potential of the original technology.
Rosenberg also points out that many new technological regimes take many years before they replace an established technology and that technological revolutions are never completed overnight. They require a long gestation period. Initially it is very difficult to conceptualize the nature of entirely new systems that develop by evolving over time.
Rosenberg goes on to note that major or "breakthrough" innovations induce other innovations and their "ultimate impact depends on identifying certain specific categories of human needs and catering to them in novel or more cost effective ways. New technologies need to pass an economic test, not just a technological one."
What does this mean with regard to government managed research? I quote from Rosenberg's speech.
"I become distinctly nervous when I hear it urged upon the research community that it should unfurl the flag of 'relevance' to social and economic needs. The burden of much of what I said is that we frequently simply do not know what new findings may turn out to be relevant, or to what particular realm of human activity that relevance may eventually apply.
Indeed, I have been staking the broad claim that a pervasive uncertainty characterizes, not just basic research, where it is generally acknowledged, but the realm of product design and new product development as well - i.e., the D of R&D.
Consequently, early precommitment to any specific, large-scale technology project, as opposed to a more limited, sequential decision-making approach, is likely to be hazardous - i.e., unnecessarily costly. Evidence for this assertion abounds in such fields as weapons procurement, the space program, research on the development of an artificial heart, and synthetic fuels.
"The pervasiveness of uncertainty suggests that the government should ordinarily resist the temptation to play the role of a champion of any one technological alternative, such as nuclear power, or any narrowly concentrated focus of research support, such as the War on Cancer. Rather, it would seem to make a great deal of sense to manage a deliberately diversified research portfolio, a portfolio that will illuminate a range of alternatives in the event of a reordering of social or economic priorities.
My criticism of the federal government's postwar energy policy is not that it made a major commitment to nuclear power that subsequently turned out to be problem-ridden. Rather, the criticism is aimed at the single-mindedness of the focus on nuclear power that led to a comparative neglect of many other alternatives, including not only alternative energy sources but improvements in the efficiency of energy utilization."
To these words, I add those (noted by FASEB) of Bruce Ferguson, Executive Vice President of Orbital Sciences Corporation, a space technology firm. Ferguson said, "The federal government should focus its research and development spending on those areas for which the benefits are diffuse and likely to be realized over many years, rather than areas for which benefits are concentrated on particular products or firms over a few years. These areas are not well covered by corporate investment, yet are vital to the long-term economic strength of the country."
Some reactions to "strategic" research are recounted in an article in Nature of February 10, 1994 (Vol. 367, pp. 495-496), from which I quote some passages. The concept of strategic research "is not an unfamiliar cry, witness last year's debate in Britain about harnessing of research to 'wealth creation.'
Nor, of course, is the objective in any way disreputable; what scientist would not be cheered to know that his or her research won practical benefits for the wider world as well as a modicum of understanding? The difficulties are those of telling in advance which particular pieces of research will lead to 'new technologies' and then to 'jobs'.
"The recent past is littered with examples of adventurous goal-directed programmes of research and development which have failed for intrinsic reasons or which, alternatively, have been technically successful, but unusable for economic or other reasons."
The article goes on to say that the affection for strategic research in the United States may prove short-lived. "In Britain, much the same seems to be happening. Having pinned its reorganization of research on the doctrine of science for wealth-creation, the government appears now to be more conscious of the problems it has undertaken to solve.
Indeed, the prime minister, John Major, seemed to be suggesting in a speech last week that the British part of the research enterprise deserves respect of the kind accorded to other social institutions at the heart of his 'back to basics' rhetoric. After more than a decade of needless damage-doing, that would be only prudent."
As a final remark, the article ends with the statement: "On the grander questions, on both sides of the Atlantic, it seems likely that the first flush of enthusiasm for turning research into prosperity will be abated by the reality of the difficulties of doing so. When governments discover in the course of seeking radical reorganization that the best they can do with their parts of the research enterprise is to cherish them, the lessons are likely to be remembered. If the outcome in the research community is a more vivid awareness of how much the world at large looks to research for its improvement, so much the better."
The Future of Science, Technology and Society
In discussing the future of science (including industrial science) and society, it is valuable to recount some of the important points that emerged from the previous discussion.
1. As a consequence of recognizing the economic benefits that derive from the development of novel, successful technologies, governments have been attempting to direct research, supported with public funds, toward subjects that are perceived as national priorities. This contrasts with broad-based "curiosity" oriented basic research.
2. The views of scientists, a distinguished economist, some industrial leaders and an editorial comment in a distinguished science journal provide very strong indications that governmental management of goal-oriented research is replete with uncertainties and pitfalls and, although well-motivated, may cause serious damage to the scientific culture. This, of course, would defeat the original purpose, since the co-evolution of science and society is a well-documented and irrefutable phenomenon.
3. Strong arguments are presented in this article by individuals and groups that support the current system of governmental funding of a very broad range of scientific efforts as probably being as close to optimal with regard to national priorities as is possible. No one can predict with any certainty what the most successful inventions and technologies will be in the future. The economic return on federally supported funding was the subject of a report by the Council of Economic Advisors to President Clinton, released in November 1995. It documents high returns to the economy and the importance of governmental involvement.1
4. By any measure, basic scientific research has made monumental contributions to technology and national priorities. The bond between basic research and the development of both novel and current technologies has been and is well in place.
There is no question that science and society will continue to co-evolve. The nature of this evolution will certainly be affected by the extent to which governments set funding priorities. Societies whose governments recognize the dependence of the development of successful novel technologies on broadly supported basic research are more likely to be healthier and economically prosperous in the future than those that do not. Because of the unpredictability of the details of the new science and technology that will evolve, the details of social evolution are also unpredictable.
Technology and Art: Engineering the Future
From the Present Technological Society into the Future
In his essay, Benjamin Pope tries to peer into the long-term human future by looking at the types of institutions that survive across centuries and even millennia: universities, “churches”, economic systems (such as capitalism) and potentially multi-millennial, species-wide projects, namely space colonization.
I liked Pope’s essay a lot, but there are parts of it I disagreed with. For one, I wish he had included cities. These are the longest-lived of human institutions and, unlike Pope’s other choices, are political, yet they manage to far outlive other political forms, namely states and empires. Rome far outlived the Roman Empire, and my guess is that many American cities, as long as they are not underwater, will outlive the United States.
Pope’s read on religion might be music to the ears of some.
Even the very far future will have a history, and this future history may have strong, path-dependent consequences. Once we are at the threshold of a post-human society the pace of change is expected to slow down only in the event of collapse, and there is a danger that any locked-in system not able to adapt appropriately will prevent a full spectrum of human flourishing that might otherwise occur.
Pope seems to lean toward the negative take on the role of religion in promoting “a full spectrum of human flourishing” and, “as a worst-case scenario, may lock out humanity from futures in which peace and freedom will be more achievable.”
The image mankind calls ‘the present’ has been written in the light, but the material future has not been built. Now it is the mission of people like Grace, and the human species, to build a future. Success will be measured by the contentment, health, altruism, high culture and creativity of its people. As a species, Homo sapiens sapiens are hackers of nature’s solutions presented by the tree of life, which has evolved over millions of years.
We can’t help seeing the future of technology as nearly synonymous with the future of our own civilization, and a civilization, when boiled down to its essence, amounts to a set of questions a particular group of human beings keeps asking, and its answers to those questions. The questions in the West and globally are things like: What is the right balance between social order and individual freedom? What is the relationship between the external and internal (mental/spiritual) worlds, including the question of the meaning of Truth? How might the most fragile thing in existence, and for us the most precious (the individual) survive across time? What is the relationship between the man-made world and culture vis-à-vis nature, and which is most important to the identity and authenticity of the individual?
The progress of science and technology intersects with all of these questions, but what we often forget is that we have sown the seeds of science and technology elsewhere, and the environment in which they grow can be very different; hence their application and understanding will differ, based as they are on a whole different set of questions and answers encountered by a distinct civilization.
. . . And as for how Humanity is steered . . .
Searle wrote: “… one of the ways we might succeed in facing these hurdles is by recovering the ability to imagine what an ideal society, Utopia, might look like.”
I totally agree.
Except for manipulative commercial and propaganda purposes, society has ignored most of the jewels of insight that the understanding of human psychology has provided since the 1930s. Humanity is swept along by ongoing competition among the obsolete and demented Ur-myths that various of our world cultures have heretofore childishly, habitually imprinted.
Once suitably harnessed, most citizens reflexively accept, and if goaded and spurred, will hysterically strain with the 'tribe' to propagate and justify their accepted biases. Humanity has wounded itself, crippled and burdened by self-inflicted misdeeds and character flaws.
Seen in the latest lights of rationality, has the climb toward “progress” merely positioned us at a higher ledge to consider falling from? Whether based upon bible, koran, techno-computer-logic, paleo-archaic retreats, USA exceptionalism, DMT invoked conversations with “machine elves”, etc., neither our traditional, nor trendy, cultural maps appear accurate or attractive enough to be intelligently applied to various species-incriminating trials.
Individually secured, for what it’s worth, I mention the autobiographical myth of Olaf Stapledon, “Star Maker”, and its closest derivative documents, as the best map likely to be available to this, or any other, species.
FB Media Convention
Media Center reports:
In his 1995 book Being Digital, Nicholas Negroponte predicted that in the future, online news would give readers the ability to choose only the topics and sources that interested them.
“The Daily Me,” as Negroponte called it, worried many guardians of traditional journalism. To actively allow a reader to narrow the scope of coverage, observed some, could undermine the “philosophical underpinnings of traditional media.”1
The vision that seemed cutting edge and worrisome eight years ago seems to have come partly true. The Wall Street Journal, MSNBC.com, The Washington Post and CNN, to name a few, all offer readers some degree of personalization on the front pages of their sites.
Millions of Yahoo members customize their MyYahoo personal news portal with the same news wire reports that editors use in daily newspapers across the globe. Google’s news page uses a computer algorithm to select headlines from thousands of news sites — creating a global newsstand, of sorts.
And media outlets from Fox News and the Drudge Report to individual weblogs offer the kind of opinionated slant to the news that Negroponte envisioned.
But is the future of online news simply a continued extrapolation of this trend – news a la carte? Does greater personalization necessarily mean greater understanding for a democracy?
In the view of futurist and author Watts Wacker, the question is not about greater personalization but about greater perspectives. According to Wacker, the world is moving faster than people can keep up with it. As a result, there are fewer common cultural references that can be agreed upon. Ideas, styles, products and mores accelerate their way from the fringe to the mainstream with increasing speed.
To combat the confusion, consumers are seeking more perspectives, Wacker says.2 They research an automobile for purchase by spending time online and reading both professional and amateur reviews alike.
But what are they doing when it comes to news?
And what will they be doing in the future?
To understand that, Wacker advises, you must seek out people from the future today and study them.3 How do you find people from the future? Locate early adopters — people who are using and appropriating technology in new ways.
In South Korea, it looks like one future of online news has arrived a few years early. OhmyNews.com is the most influential online news site in that country, attracting an estimated 2 million readers a day. What’s unusual about OhmyNews.com is that readers not only can pick and choose the news they want to read – they also write it.
With the help of more than 26,000 registered citizen journalists, this collaborative online newspaper has emerged as a direct challenge to established media outlets in just four years.4
Unlike its competitors, OhmyNews has embraced the speed, responsiveness and community-oriented nature of the Web.
Now, it appears, the vision of “The Daily Me” is being replaced by the idea of “The Daily We.”
The rise of “we media”
The venerable profession of journalism finds itself at a rare moment in history where, for the first time, its hegemony as gatekeeper of the news is threatened by not just new technology and competitors but, potentially, by the audience it serves. Armed with easy-to-use Web publishing tools, always-on connections and increasingly powerful mobile devices, the online audience has the means to become an active participant in the creation and dissemination of news and information. And it’s doing just that on the Internet:
• According to the Pew Internet Project, the terrorist attacks of Sept. 11, 2001, generated the most traffic to traditional news sites in the history of the Web. Many large news sites buckled under the immense demand, and people turned to e-mail, weblogs and forums “as conduits for information, commentary, and action related to 9/11 events.”5 The response on the Internet gave rise to a new proliferation of “do-it-yourself journalism.” Everything from eyewitness accounts and photo galleries to commentary and personal storytelling emerged to help people collectively grasp the confusion, anger and loss felt in the wake of the tragedy.
• During the first few days of the war in Iraq, Pew found that 17 percent of online Americans used the Internet as their principal source of information about the war, a level more than five times greater than those who got their news online immediately after the Sept. 11 terrorist attacks (3 percent). The report also noted that “weblogs (were) gaining a following among a small number of Internet users (4 percent).”6
- Immediately after the Columbia shuttle di- saster, news and government organizations, in particular The Dallas Morning News and NASA, called upon the public to submit eye- witness accounts and photographs that might lead to clues to the cause of the spacecraft’s disintegration.7
- ABCNews.com’s The Note covers 2004 politi- cal candidates and gives each an individual we- blog to comment back on what was reported.8 In addition, presidential candidate Howard Dean guest-blogged on Larry Lessig’s weblog for a week in July 2003. (A future president of the United States might be chosen not only on his or her merits, charisma, experience or voting record but on the basis of how well he or she blogs.)
- College coaches, players and sports media outlets keep constant vigil on numerous fan forum sites, which have been credited with everything from breaking and making news to rumor-mongering. “You can’t go anywhere or do anything and expect not to be seen, be- cause everyone is a reporter now,” says Steve Patterson, who operates ugasports.com, a Web site devoted to University of Georgia sports.9
Before the Iraq war, the BBC knew it couldn’t possibly deploy enough photojournalists to cover the millions of people worldwide who marched in anti-war demonstrations. Reaching out to its audience, the BBC News asked readers to send in images taken with digital cameras and cell phones with built-in cameras, and it published the best ones on its Web site.10
Weblogs come of age
The Internet, as a medium for news, is maturing. With every major news event, online media evolve. And while news sites have become more responsive and better able to handle the growing demands of readers and viewers, online communities and personal news and information sites are playing an increasingly diverse and important role that, until recently, has operated without significant notice from mainstream media.
While there are many ways that the audience is now participating in the journalistic process, which we will address in this report, weblogs have received the most attention from mainstream media in the past year.
Weblogs, or blogs as they are commonly known, are the most active and surprising form of this participation. These personal publishing systems have given rise to a phenomenon that shows the markings of a revolution — giving anyone with the right talent and energy the ability to be heard far and wide on the Web.
Weblogs are frequently updated online journals, with reverse-chronological entries and numerous links, that provide up-to-the-minute takes on the writer’s life, the news, or on a specific subject of interest. Often riddled with opinionated commentary, they can be personally revealing (such as a college student’s ruminations on dorm life) or straightforward and fairly objective (Romenesko). (We discuss weblogs in greater detail in Chapter 3.)
The growth of weblogs has been largely fueled by greater access to bandwidth and low-cost, often free, software. These simple, easy-to-use tools have enabled new kinds of collaboration unrestricted by time or geography. The result is an advance of new social patterns and means for self-expression. Blog-like communities such as Slashdot.org have allowed a multitude of voices to participate while managing a social order and providing a useful filter on discussion.
Weblogs have expanded their influence by attracting larger circles of readers while at the same time appealing to more targeted audiences. “Blogs are in some ways a new form of journalism, open to anyone who can establish and maintain a Web site, and they have exploded in the past year,” writes Walter Mossberg, technology columnist for the Wall Street Journal.
“The good thing about them is that they introduce fresh voices into the national discourse on various topics, and help build communities of interest through their collections of links. For instance, bloggers are credited with helping to get the mainstream news media interested in the racially insensitive remarks by Sen. Trent Lott (R.-Miss.) that led to his resignation as Senate majority leader.”
Mossberg’s description of weblogs as a new kind of journalism might trouble established, traditionally trained journalists. But it is a journalism of a different sort, one not tightly confined by the traditions and standards adhered to by the traditional profession.
These acts of citizens engaging in journalism are not limited to weblogs. They can be found in newsgroups, forums, chat rooms, collaborative publishing systems and peer-to-peer applications like instant messaging. As new forms of participation have emerged through new technologies, many have struggled to name them. By default, the name is usually borrowed from the enabling technology (e.g., weblogging, forums and Usenet).
The term we use — participatory journalism — is meant to describe the content and the intent of online communication that often occurs in collaborative and social media. Here’s the working definition that we have adopted:
Participatory journalism: The act of a citizen, or group of citizens, playing an active role in the process of collecting, reporting, analyzing and disseminating news and information. The intent of this participation is to provide independent, reliable, accurate, wide-ranging and relevant information that a democracy requires.
Participatory journalism is a bottom-up, emergent phenomenon in which there is little or no editorial oversight or formal journalistic workflow dictating the decisions of a staff. Instead, it is the result of many simultaneous, distributed conversations that either blossom or quickly atrophy in the Web’s social network (see Figure 1.1 – Top-down vs. Bottom-up).
While the explosion of weblogs is a recent phenomenon, the idea of tapping into your audience for new perspectives, or turning readers into reporters or commentators, is not. Many news organizations have a long history of experimenting with such participation. In the early 1990s, newspapers experimented with the idea of civic journalism, which sought participation from readers and communities in the form of focus groups, polls and reaction to daily news stories. Most of these early projects centered around election coverage. Later, newspapers sought to involve communities in major deliberations on public problems such as race, development and crime.
According to a report from the Pew Center for Civic Journalism, at least 20 percent of the 1,500 daily U.S. newspapers practiced some form of civic journalism between 1994 and 2001. Nearly all said it had a positive effect on the community.12
Civic journalism has a somewhat controversial reputation, and not everyone is convinced of its benefits. While civic journalism actively tries to encourage participation, the news organization maintains a high degree of control by setting the agenda, choosing the participants and moderating the conversation. Some feel that civic journalism is often too broad, focusing on large issues such as crime and politics, and not highly responsive to the day-to-day needs of the audience.13
Yet, the seed from which civic journalism grows is dialogue and conversation. Similarly, a defining characteristic of participatory journalism is conversation. However, there is no central news organization controlling the exchange of information. Conversation is the mechanism that turns the tables on the traditional roles of journalism and creates a dynamic, egalitarian give-and-take ethic.
The fluidity of this approach puts more emphasis on the publishing of information rather than the filtering. Conversations happen in the community for all to see. In contrast, traditional news organizations are set up to filter information before they publish it. It might be collaborative among the editors and reporters, but the debates are not open to public scrutiny or involvement.
John Seely Brown, chief scientist of Xerox Corp., further elaborates on participatory journalism in the book The Elements of Journalism: “In an era when anyone can be a reporter or commentator on the Web, ‘you move to a two-way journalism.’ The journalist becomes a ‘forum leader,’ or a mediator rather than simply a teacher or lecturer. The audience becomes not consumers, but ‘pro-sumers,’ a hybrid of consumer and producer.”14
Seely Brown’s description suggests a symbiotic relationship, which we are already seeing. But participatory journalism does not show evidence of needing a classically trained “journalist” to be the mediator or facilitator. Plenty of weblogs, forums and online communities appear to function effectively without one.
This raises some important questions: If participatory journalism has risen without the direct help of trained journalists or news industry initiatives, what role will mainstream media play? And are mainstream media willing to relinquish some control and actively collaborate with their audiences? Or will an informed and empowered consumer begin to frame the news agenda from the grassroots? And, will journalism’s values endure?
Morphing Of FB Into Social Media Ecosystem...
f8 2014 Announcement Summary
At f8, the quantity and quality of Facebook’s announcements were notable. While aimed at developers, the implications of course significantly affect consumers, brands, and advertisers.
Let’s run through the list of Facebook’s news, organized by its Build, Grow, Monetize focal points…
• The new Facebook Login: letting people select what information to share with apps.
• Anonymous Login: a way for people to log in to apps without sharing personal information from Facebook with developers.
• A two-year stability guarantee for Facebook’s core developer products.
• Easier linking between apps.
• Updated Parse pricing, making apps less expensive to build, plus new tools for developers to build Parse apps that work offline.
• Message Dialog: letting people share content from apps with friends through Facebook.
• Mobile Like Button: letting people Like the Pages or content of individual apps through a native, mobile Like button.
• A new program to help mobile startups grow through a package of resources and tools provided by industry leaders.
• Send to Mobile: an easy way for people to send an app to their phone after visiting a web site and logging in with Facebook.
• Audience Network: a new way for developers to effectively monetize mobile apps.
Was f8 a 10?
If you judge success by audience response, the cheers and applause that followed each announcement were certainly encouraging. While each announcement is important in its own right, in aggregate they represent something far more profound.
The new “big blue” of the social economy has grown from a global social network with 1.2 billion users into a full-fledged social ecosystem. As Mark Zuckerberg put it, Facebook aims to build a “cross-platform platform.”
What does that mean?
It’s another way of saying social ecosystem.
What’s that mean?
In simpler terms, it means that Facebook as a platform is becoming incredibly portable and universal in pursuit of ubiquity.
Whether or not people are actually in Facebook at any one moment, with developers building apps upon the network of new tools, Facebook users carry their digital life with them. As a “cross-platform platform,” Zuckerberg’s vision is that developers can carry the power and appeal of the social graph seamlessly across iOS, Android, Microsoft and other platforms.
Take a look at Facebook’s list of new products again and what follows will make even more sense.
People First, Or Users' Data And Profits First...
What Google search is to AdWords, and what surfing and cookies are to ad targeting and re-targeting, Facebook’s social graph and social expressions are becoming: an omniscient index of personal connections and preferences. Now developers, brands, and anyone with something to say or sell can find people based on psychographics, not just demographics, as long as they are willing to 1) pay Facebook, directly or indirectly, and, most importantly, 2) think carefully about who they are really trying to reach and what makes their message both unique and relevant to that audience.
In many ways, like Twitter, Facebook is moving toward an Interest Graph and away from a straightforward social graph. Users will discover new content, apps, and ads based not only on who they know but also on what they like and express inside and outside of the network. And while that isn’t exactly new, the cloud in which this data resides has now been packaged, productized and presented to the market for experimentation.
Indeed, Facebook is still a social network. That’s its edge of the wedge play.
But the rest of the wedge, and its new revenue generating pillars include:
- Social Data
- Data as a Service
In a way, this is all very 1984. But at the same time, we as users have more control than ever before to fine-tune the signal of content and marketing that finds us. Meanwhile, to actually reach us, developers and marketers have to rethink their approach to grab our attention, pique our curiosity, and reward us for our time. In turn, we’ll reward our suitors with the very things they value. See, this isn’t just business; it’s personal. This is what sets Facebook apart from Google: a more natural approach using a human algorithm rather than just a sophisticated technological one.
Best of all, what Facebook did well at this year’s f8 was lead by example. With each of the company’s announcements and overall, the company itself demonstrated how to build, grow, and monetize.
Media/Viral Streaming/Social Media Ecosystem Watch...
User Multi-Tasking While Doing Something Else: A New Media Ecosystem For Users..
Television And Social Media: A Shortened View..
1. Multi-Tasking Is Now An Endemic Part Of Media Consumption.
2. Multi-Tasking Is Driven By Device Penetration.
3. Television Is The Primary Medium At The Heart Of Most Multi-Tasking.
4. Different Devices Drive Different Behaviour.
In this post I will summarize four further conclusions.
Again, all of them are based on extensive research and analysis.
But these conclusions are, arguably, much more profound.
And their implications are enormous.
Let me explain, starting with my fifth conclusion:
5. Most Multi-Tasking Is Not Television-Centric; But It Is Centred On Television.
Most Multi-Tasking activity is NOT directly related to TV content.
In fact, Ipsos’s research suggests that the vast majority of Multi-Tasking done while watching TV is not related to the TV content itself.
But – even though most Multi-Tasking is not related directly to Television content, it is always focused on one medium above all others.
Multi-Taskers say that the majority of their attention is still on the Television, even whilst they are Multi-Tasking:
6. Most TV Multi-Tasking Is Communication-Related.
The majority of the Multi-Tasking time is related to some form of communication activity.
Most of that communication is with close friends, often one-on-one – particularly among Smartphone Multi-Taskers.
Social Media comes second.
So most communication tends to be relatively personal.
And the most common communication channel remains Email, ahead of IM/Chatting and Social Media.
Let’s look at Social Media Multi-Tasking for a moment.
Because Social Media Users also Multi-Task a lot, according to last month’s research from Credit Donkey, published by eMarketer.
And it’s really interesting to see how they Multi-Task.
For one simple, fundamental reason:
7. Most Social Media Multi-Tasking is Television-related.
Social Media Users Multi-Task everywhere.
They Multi-Task when they’re driving, which sounds suicidal.
They Multi-Task when they’re exercising, which sounds risky.
They Multi-Task when they’re shopping, which most men will understand.
They Multi-Task when they’re drunk, which we all regret.
They Multi-Task when they’re on the toilet, which I don’t want to think about.
They Multi-Task when they’re traveling and at work, which seems sensible.
But 84% of Facebook Users and 67% of Twitter Users Multi-Task with one other Medium more than any other:
Yes, it’s true:
The majority of Social Media Multi-Tasking involves good, old-fashioned Television.
And what’s the natural result?
8. Television Drives Much Of The Conversation On Social Media.
If you don’t believe me, ask Twitter.
An amazing fact about Twitter was recently announced by ThinkBox – the marketing body for commercial TV in the UK.
It’s almost unbelievable, but it’s true:
During prime time, 40% of all tweets are related to Television content.
That’s incredible, isn’t it? But it’s a fact.
It’s no surprise that Nielsen, the research company, has recently struck a deal with Twitter:
They will create a new television ratings measurement in the U.S. that will monitor TV conversations on the social network.
These “Nielsen Twitter TV Ratings”, as they’re called, will launch in time for the Fall 2013 TV season.
And there’s more.
If you look at the most popular categories on Facebook and Twitter worldwide, they’re dominated by just three categories.
All three are based on Mass, Popular Entertainment.
And all are built by Mass Media:
They are Music, Television and Sport.
And this connectivity between Mass Media and Social Media is truly global.
In December 2012, for example, the Pew Research Center published the findings from its enormous survey of Social Media habits across 21 countries.
In the survey, they asked which subjects Social Users around the world most communicate about, and the answer was simple.
It was Mass, Popular Entertainment.
And ThinkBox UK’s recent POETIC research came to two clear conclusions:
i Television creates more Word Of Mouth for Brands than any other Medium:
ii Television creates more Brand Conversations than any other Medium:
And in France, AT Kearney recently quoted a Havas study which showed an equally powerful statistic:
Television advertising generates a barely-believable 53% of all ‘earned’ media:
And all of this is happening, of course, before the inevitable growth of Social TV that will be driven by the dramatic increase of Connected Televisions around the world over the next 3-4 years.
Just take a look at the latest forecasts, if you want to have a guess at its impact:
TV Research Charts
Everything is developing as I write – and it’s critical that we try to monitor what’s happening every day.
For more insights on the impact of Social TV on Media and Communications, I recommend that you keep close to Wharton’s Social TV Lab.
Because we all have much more to learn as the new Communications Ecosystem develops.
But – on the basis of current information, at this current moment – the data is undeniable, and the conclusions are clear:
a. Television is at the heart of most Media Multi-Tasking.
b. Most Multi-Tasking involves Communication – either personal communication or broader communication on Social Networks.
c. When Multi-Taskers communicate, their primary focus is always on the Television content – even if their communication is not always directly related to it.
d. So Multi-Tasking will frequently result in communication that is either directly or indirectly influenced by Television content.
e. As a result, old-fashioned Television has massive influence on Social Media, and a disproportionate impact on the most powerful form of Brand Communications on earth: Word Of Mouth.
I will be looking at the issue of Word Of Mouth again in one of my next blogs.
In the meantime, it’s important to remind ourselves of one of the most surprising developments in the world of New Media:
For years everyone has been talking about the Death of Television.
On the contrary – it looks like New Media is making Television more powerful than ever.
TV; Social Media; Mass Media And Emerging Gizmos...
The Future Of Journalism in The Viral Stream
Various short articles have been written by different people, a few of which I will sample below. The format was inspired by a collection edited by Bob Franklin, which he termed:
The Future of Journalism: Developments And Debates
Aggregation, Content Farms and Huffinization: The rise of low-pay and no-pay journalism
The business model of gathering, producing and distributing news is changing rapidly. Producing content is not enough; moderation and curation by ‘news workers’ is at least as important. There is growing pressure on news organizations to produce more inexpensive content for digital platforms, resulting in new models of low-cost or even free content production. Subscription, advertising revenues and non-profit funding are in many cases insufficient to sustain a mature news organization. Aggregation, whether by humans or machines, is gaining importance. At ‘content farms’, freelancers, part-timers and amateurs produce articles that are expected to end up high in web searches. Apart from this low-pay model, a no-pay model – The Huffington Post – emerged, where bloggers write for no compensation at all. We analyse the background to all this, the consequences for journalists and journalism, and the implications for online news organizations. We investigate aggregation services, content farms and no-pay or low-pay news websites.
Crowdfunding and Non-Profit Media: The emergence of new models for public interest journalism
Miguel Carvajal, José A. García-Avilés and José L. González
The media environment has changed dramatically in the last few years. Audience fragmentation and online advertising atomisation have transformed existing business models and put into question traditional media management practices. Now more than ever, policy makers and editors are concerned about the future of newspapers. In this changing scenario, there are new media models that attempt to promote and preserve public interest journalism. Among them, non-profit institutions and community-funded platforms are the most innovative and relevant alternatives. They promote audience involvement using what is known as crowdfunding, or they are funded by grants received from wealthy millionaires. For these new models, profit margins and income are unwelcome. Despite the fact that they could be regarded as non-business models, they are actually changing the paradigm of public interest journalism while providing fresh ideas for traditional media. The aim of this paper is to explain the nature of crowdfunding by describing the context in which it takes place and considering its impact on journalism. We have created a database to identify all the crowdfunding initiatives around the world. The results highlight the emergence of these platforms and other systems that make possible crowdfunded journalism and investigative reporting. Transparency, user involvement and control over where their money goes tend to be the success factors of these initiatives.
The Algorithms behind the Headlines: How machine-written news redefines the core skills of human journalists
Arjen van Dalen
With the introduction of machine-written news, the automation of journalism has entered a new phase. Algorithms can now automatically generate news stories on the basis of statistical information and a set of stock phrases, without interference from human journalists. This paper analyzes reactions to the launch of a network of machine-written sport websites to see how this new technology forces journalists to re-examine their own skills. In their response to these technological developments, journalists define their profession by the tasks that are fulfilled rather than the persons who possess the skills and knowledge to fulfil them. Responding to automated news content, journalists highlight analytical skills, personality, creativity and the ability to write linguistically complex sentences as important skills defining journalism, rather than factuality, objectivity, simplification and speed. Journalists see “robot journalism” as an opportunity to make journalism more human. When routine tasks can be automated, journalists will have more time for in-depth reporting. This view is discussed in the light of the commercialization of news and of previous studies on the impact of technological developments on journalistic labour.
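The mechanics described above (news generated "on the basis of statistical information and a set of stock phrases") can be illustrated with a minimal sketch. This is not the system analyzed in the paper; the team names, scores and phrases are invented for illustration.

```python
# A toy machine-written sports recap: pick a stock phrase based on the
# score margin and fill in the statistics. All names and phrases invented.

STOCK_PHRASES = {
    "blowout": "{winner} crushed {loser} {ws}-{ls} in a one-sided affair.",
    "close":   "{winner} edged out {loser} {ws}-{ls} in a nail-biter.",
    "normal":  "{winner} beat {loser} {ws}-{ls}.",
}

def write_recap(home, away, home_score, away_score):
    """Generate one sentence of 'robot journalism' from raw statistics."""
    if home_score == away_score:
        return f"{home} and {away} drew {home_score}-{away_score}."
    winner, ws = (home, home_score) if home_score > away_score else (away, away_score)
    loser, ls = (away, away_score) if home_score > away_score else (home, home_score)
    margin = ws - ls
    key = "blowout" if margin >= 20 else "close" if margin <= 3 else "normal"
    return STOCK_PHRASES[key].format(winner=winner, loser=loser, ws=ws, ls=ls)

print(write_recap("Rovers", "United", 98, 72))
# → Rovers crushed United 98-72 in a one-sided affair.
```

The routine, formulaic nature of this output is exactly why the journalists quoted in the study fall back on analysis, personality and creativity as the skills that distinguish their work.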
Tweets and Truth: Journalism as a discipline of collaborative verification
This paper examines how social media are influencing the core journalistic value of verification. Through the discipline of verification, the journalist establishes jurisdiction over the ability to objectively parse reality to claim a special kind of authority and status. Social media question the individualistic, top-down ideology of traditional journalism. The paper considers journalism practices as a set of literacies, drawing on the theoretical framework of new literacies to examine the shift from a focus on individual intelligence, where expertise and authority are located in individuals and institutions, to a focus on collective intelligence where expertise and authority are distributed and networked. It explores how news organizations are negotiating the tensions inherent in a transition to a digital, networked media environment, considering how journalism is evolving into a tentative and iterative process where contested accounts are examined and evaluated in public in real-time.
Sociability, Speed and Quality in the Changing News Environment
Speed and quality used to be considered the twin pillars of ‘good’ journalism. Now there is a third pillar: sociability. It is no longer enough to be ‘first with the news’, nor is it sufficient to be comprehensive and trustworthy. It is now increasingly considered necessary to ensure that news is produced in a form that is capable of spreading virally. This paper considers the way in which ‘viral’ transmission is impacting on the work of news journalists and news organisations.
Twitter Links between politicians and journalists
This article analyses a Twitter network of 150 Dutch journalists and politicians in 2010 and shows that Twitter networks have an underlying structure that is more detailed than one would expect from a simple list of followers and following. In-Degree (followers) measures a user’s popularity as a news source, while Out-Degree (following) measures a user’s openness and newsgathering; together they give insight into the structure of this underlying network. The bridging function (being a hub or link) in this network can also be clearly identified, and shows how important a person is as a networker. Ego-network analysis may specify these positions and reveal how a single user has links and bridges to central network positions. From the network connections on Twitter between politicians and journalists it cannot be concluded that there is a closed elite, such as a fully connected group of users controlling information. Journalists and politicians are mutually dependent on each other, and how this dependency is constructed is shown by various network centrality measures, specifying each actor’s role (source versus news gatherer) and position in the network (networker or not). Consociationalism used to be one of the main characteristics of the Dutch political and media structures. Twitter network analysis at the sub-group level shows that contacts on Twitter between reporters and politicians are no longer influenced by the religious or ideological identity of parties and media. Finding news and spreading news is the driving force in the Twitter network between politicians and journalists.
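The two centrality measures the study relies on can be sketched in a few lines. This is a minimal illustration on a tiny invented follower network, not the 150-account dataset from the study; the account names are hypothetical.

```python
# In-Degree / Out-Degree on a toy Twitter-style follower network.
# An edge (a, b) means "account a follows account b". Names invented.
from collections import Counter

follows = [
    ("reporter1", "minister"), ("reporter2", "minister"),
    ("reporter1", "reporter2"), ("minister", "reporter2"),
]

# In-Degree: how many accounts follow you -> popularity as a news source.
in_degree = Counter(target for _, target in follows)
# Out-Degree: how many accounts you follow -> openness / newsgathering.
out_degree = Counter(source for source, _ in follows)

print(in_degree["minister"])    # followed by both reporters
print(out_degree["reporter1"])  # follows two accounts
```

The bridging ("hub") role the abstract mentions would additionally require a measure such as betweenness centrality, which counts how often an account lies on the shortest paths between others.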
The Journalistic Hyperlink: Prescriptive discourses about linking in online news
Juliette De Maeyer
Hypertextuality has been a fundamental characteristic of the web since its inception. It also impacts on journalism: the ability to link pages, sites and documents stands out as one of the features that essentially differentiates online news from other media. This paper investigates how prescriptive discourses about online journalism deal with hypertextuality. Focusing on hyperlinks as a concrete embodiment of the vague notion of hypertextuality, this project discusses how hyperlinks have been incorporated within the body of journalistic shared knowledge.
We draw on a qualitative content analysis of journalism textbooks, as well as interviews with journalism educators in French-speaking Belgium. Analysing them qualitatively, we discuss how different traditional journalistic values are invoked and articulated when it comes to giving guidelines about the ideal use of hyperlinks. Results highlight inherent contradictions between the values that are summoned, but we argue that such inconsistencies are constructive and crucial for journalistic collective identities.
Breaking News Online: How news stories are updated and maintained around-the-clock
This article examines the consequences of ‘around-the-clock’ news cycles online for the product of news. It argues that as a result of increased emphasis on continuous deadlines, the ‘news story’ is diversified into a fluid, always updated/corrected product challenging existing notions of news as a set piece of work. In this context, ‘time’ becomes an even more important factor for news production and blurs further pre-existing news formats. The ‘continuously updated news story’ can change many times during the day and challenges the idea of news as the finished product of journalistic work. This research studies six UK news websites and monitors how specific news stories are broken and updated during the course of a day. It specifically focuses on the frequency of updates, the amount and type of information added as well as their sources in order to investigate patterns of news updating in each organisation. The patterns of news updating that emerge suggest that we need to rethink the ‘news story’ as a fixed entity which has been associated with the distinct news cycles of traditional media. Although the daily cycles are not completely abolished, the news stories are rarely finalised.
Freelance Journalists as a Flexible Workforce in Media Industries
Maria Edstrom and Martina Ladendorf
Economic cutbacks in the media sector diminish the chances of employment for journalists, and consequently the number of atypical workers in the media industry, such as freelancers, is growing worldwide. This study of Swedish freelancers is grounded in both quantitative and qualitative data. The quantitative data are taken from ongoing surveys conducted by researchers at the University of Gothenburg, based on representative samples of practising journalists made in 1989, 1994, 1999 and 2005. Around 2000 journalists were included in each survey. The qualitative material consists of 13 biographical interviews with freelancers in northern Sweden. The results will be compared with international studies. The choice to work as a freelance journalist is connected to lifestyle, and the idea of ‘life as a project’, as well as entrepreneurialism, in ways that are connected to the societal processes of individualization and flexibilization.
“We Used to be Queens and Now We Are Slaves”: Working conditions and career strategies in the journalistic field
Roman Hummel, Susanne Kirchhoff and Dimitri Prandner
Following Pierre Bourdieu’s theory of the social field (Bourdieu 1984; Benson/Neveu 2005) we examine how changes in the media have affected the career strategies, journalistic practice and role images of women and men working in Austrian news media. By employing Bourdieu’s theory, information on role models and self-perception gathered by surveys can be interpreted within their proper contexts and give insight to the structure and constitution of contemporary Austrian Journalism. Problems arising from the diversification of journalism and gender-related disparities in career opportunities become more accessible empirically. In addition, field theory sheds light on the ongoing changes of the field’s properties, such as developments in the actual routines of news gathering and production. Finally, by applying Bourdieu’s theory to journalism research one gains a deeper insight into the specific set of stakes that are shaping both the perception and practice within the journalistic field. In this context, we discuss the benefits and drawbacks in the profession as a part of the field’s “illusio”, which still holds a strong attraction for newcomers.
Russian and Swedish Journalists: Professional roles, ideals and daily reality
Gunnar Nygren and Elena Degtereva
Striving for autonomy is a key part of professional journalistic culture, although the degree of autonomy varies between countries and media systems. A survey distributed to 100 journalists in Sweden and Russia explores their views on journalistic autonomy: the professional duties of journalists, the degree of autonomy they enjoy in their day-to-day work, as well as journalists’ opinions about the development of press freedom. The findings reveal that journalists in both countries share many professional values but also feel pressures on their professional autonomy – in Sweden mostly commercial pressure, and in Russia predominantly political pressure but also the commercial interests of owners and advertisers. There are also some clear differences: Russian journalists have less independence in their daily work and face more frequent obstacles to publishing – and they take a negative view of the development of press freedom.
The 4C’s of Mobile News: Channels, conversation, content and commerce
François Nel and Oscar Westlund
Newspapers are in flux. Having seen their traditional businesses battered by forces that include structural changes fuelled by the rapid growth of networked digital technologies and cyclical shifts in the economy, mainstream news publishers have intensified efforts to adapt their journalism processes and products. However, growing digital revenue streams to match, if not surpass, the losses in print circulation and advertising incomes has proved difficult. A bright – or at least not quite so dim – spot glows from mobile devices. Drawing on data from an annual audit conducted in 2008, 2009, 2010 and 2011, this article examines how 66 metropolitan newspapers in England, Scotland, Wales and Northern Ireland have performed with respect to the channels, content, conversation and commerce (4C’s) of mobile news. While findings show the expansion of newspapers’ mobile endeavours, these are uneven and characterised by repurposing existing content and duplicating traditional commercial models.
The Re-birth of the 'Beat': A hyperlocal newsgathering model
Scholars have long lamented the death of the 'beat' in news journalism. Today's journalists generate more copy than they used to, and a deluge of PR releases often keeps them in the office and away from their communities. Consolidation in the industry has dislodged some journalists from their local sources. Yet hyperlocal online activity is thriving, if journalists have the time and inclination to engage with it. This paper proposes an exploratory, normative schema intended to help local journalists systematically map and monitor their own hyperlocal online communities and contacts, with the aim of re-establishing local news beats online as networks. The model is, in part, technologically independent, and it encompasses proactive and reactive news-gathering as well as forward-planning approaches. The schema builds on news-gathering frameworks suggested in the literature, distilled into an iterative, replicable model for local journalism. This model was then used to map out two real-world 'beats' for local news-gathering, and journalists working within these local beats were invited to trial the models created. It is hoped that this research will empower journalists by improving their information auditing, and could help re-define journalists' relationship with their online audiences.
Foreign Reporting in the Sphere of Network Journalism
This article explores the relationship between foreign reporting and information provision through social media channels. Drawing upon globalization debates and research on foreign news coverage, it discusses the emergence of a new kind of reporting from afar. Within a complex, global communication space, layers of information and interpretation frameworks for news stories are multifaceted. As we witness the evolution of a sphere of ‘network journalism’, journalists gather news while bloggers, Twitterers or Facebook users contribute to the information flow. Taken together, the material provided by traditional journalists and alternative information sources form a global news map. Case examples from the Arab Spring assist to demarcate some characteristics of this communication sphere and suggest that seizing interactive communication tools could assist to strengthen news coverage in favor of what Berglez refers to as a ‘global outlook’ on news.
The Newsroom of the Future: Newsroom convergence models in China
Shixin Ivy Zhang
This study uses four news organizations and their online services based in Beijing to identify newsroom convergence models in China. It finds that there is a gulf between central-level and local news sites’ convergence efforts. At the macro level are policy barriers such as TV licensing, licensing to distribute news on the Web and the issuance of press cards. At the meso level, the websites have five main revenue streams: state subsidy, advertising, wireless value-added services, sales of data and website construction. State subsidy is only available to key state news sites like Xinhuanet and Peoplesnet. These central-level sites can afford to experiment with multiplatform and multimedia services as well as original products and services. By contrast, local news sites struggle to stay in the market and their strategies focus on providing local news and information while forging strategic partnerships with big businesses to sustain their advertising platform.
The Convergence Process in Public Audiovisual Groups: The case of Basque public radio television (EITB)
Ainara Larrondo, José Larrañaga, Koldo Meso and Irati Agirreazkuenaga
The digitization and diversification prompted by the development of web divisions has situated media groups at a decisive point, requiring strategies of adaptation that necessarily involve multimedia convergence. This key term for understanding communication today alludes to a gradual process which has the integration of newsrooms as its goal and is making itself felt in different interrelated fields. In Europe, public audiovisual corporations such as the BBC (Great Britain), SVT (Sweden), NRK (Norway), DR (Denmark) or YLE (Finland) have provided some relevant cases of convergence to date. In Spain, such adaptation is still moderate and it is the regional media that are showing a particular predisposition to change. In this context, this essay analyses the experience of one of the pioneering public groups in the Spanish state, the public radio television of the Basque Autonomous Community, Euskal Irrati Telebista (EITB). In line with other studies with similar characteristics, it employs a mixed methodology incorporating quantitative and qualitative procedures. The results make it possible to argue that EITB is slowly advancing towards convergence, setting out from strategies typical of the initial phases of this process, such as grouping newsrooms together in the same physical space, cross-media promotion, taking advantage of synergies of multiplatform distribution or basic editorial coordination, which places this group midway between digitization and convergence.
Contents and Abstracts of August 2012
(Vol 6 - No 4)
News on New Devices: Is Multiplatform News Consumption a Reality?
H. Iris Chyi and Monica Chadha
News organizations worldwide now deliver content through multiple electronic devices such as computers, smartphones, e-readers, and tablets. While multiplatform news delivery is widely prevalent, is multi-platform news “consumption” a reality? This study examines the extent to which people own, use, and enjoy these electronic devices. Results of a national survey of U.S. Internet users suggested that despite the excitement about newer, more portable devices, not all devices are equally “newsful.” Most people use only one electronic device for news purposes on a weekly basis. We identified the predictors of device ownership and multi-platform news consumption and discussed the implications for multi-platform news publishing.
A Paleontology of Style: The evolution of the Middle East in the AP Stylebook, 1977–2010
This article looks at the commonly understood rules and guidelines, which are set out and regularly modified in the Associated Press Stylebook, for how news about the Middle East “ought” to look and sound for US readers. By examining official news language longitudinally across a period from before the Iranian revolution to the second decade of the “war on terror,” it finds patterns that shed light not only on the normal evolution of news language but on the particularly Orientalized features of news about the Middle Eastern “other.” These findings are especially relevant in an era of shrinking newsroom resources in which centrally determined features of language are, increasingly, national decisions.
Elimination of Ideas and Professional Socialisation: Lessons learned at newsroom meetings
Gitte Gravengaard and Lene Rimestad
This article investigates and interprets social and cultural production and reproduction as we turn our attention to an important part of routinised practice in the newsroom: the early newsroom meetings. These meetings are essential sites for the building of the craft ethos and professional vision. Our aim is to study how this building of expertise takes place at meetings, with a particular focus on the decision-making process concerning ideas for new news stories. To do this, we perform linguistic analysis of news production practices, investigating how journalists’ ideas for potential news stories are eliminated by the editor at the daily newsroom meetings. These eliminations of ideas for news stories are not just eliminations; they are also corrections of culturally undesirable behaviour, producing and reproducing the proper perception of an important object of knowledge – what constitutes ‘a good news story’ – in this community of practice.
Promoting Aesthetic Tourism: Transgressions between generalist and specialist subfields in cultural journalism
Cultural departments in newspapers are reported to be encountering increasing pressures towards the production of news and more ‘journalistic’ expression. However, journalistic values do not fit unproblematically into the dualistic professionalism of cultural journalism. The article elaborates an insight into the practice of arts journalism as an act of framing between the epistemic paradigms of the journalistic and aesthetic traditions. The traditions are inspected as Bourdieusian fields, and the boundary-crossing between the fields is clarified by close-reading texts that deviate from the typical framings, accompanied by social tensions and indignation among the actors of the artistic field. By investigating the reconciliation between the two epistemic frames of cultural journalism, the journalistic and the aesthetic paradigm, certain restrictions between the generalist and specialist role shifts can be found. This distinctive boundary between the fields leads to the question of how the newsworthiness of culture could be defined without losing sensitivity to the characteristics of arts and culture, which operate by their own logic.
Envisioning Journalism Practice as Research
As pressure grows on journalism academics to publish scholarly outputs and attract external research income, many express frustration over the uncertain status of journalism practice in relation to the requirement for making a contribution to knowledge (Harcup, 2011). Simultaneously, work in education theory has highlighted contextual shifts in arts and humanities education that signify a pressing need for journalism studies, as well as other disciplines, to define their position regarding practice within research. Recent reflections on practice and research within journalism education (Niblock, 2007) suggest the discipline is seeking forms of scholarship that cohere better with its industry-facing character. This paper seeks to originate both a methodological framework and an epistemological perspective that acknowledge practitioner perspectives as accumulated knowledge. Drawing on concepts of reflexivity and habitus, it will negotiate and evaluate a range of potential models of practice as research, and discuss their implications for furthering the profile of journalism scholarship.
Covering “Financial Terrorism”: The Greek debt crisis in U.S. news media
The 2010 Greek financial crisis marks an important chapter in an era in which the underlying maneuvers of private financial entities figure centrally in the wherewithal of Western nation states. Utilizing framing research, this study examines representation of the Greek crisis by US news media from December 2009 to July 2010. In contrast to initial coverage in the European and business press, which attributed the crisis to speculation and the manipulation of Greek debt, major US news media presented the event through specific event-driven frames that obscured knowledge of deeper causes. By drawing attention to dramatic events in Athens and the American stock markets, US outlets presented the financial crisis in narrow terms that blamed the event on alleged character flaws and ineptitudes of a nation and its people. This reportage legitimized proposals of economic austerity as reparation. In the midst of excessive business and financial-related information, the ability of US journalism to explain how and for whom transnational economic processes proceed remains provisional. Journalism prompting public discourse on such dynamics is crucial at present, as the formulas hastening the Greek crisis now threaten industrialized countries throughout the West.
News Accuracy in Switzerland and Italy: A transatlantic comparison with the U.S. press
Colin Porlezza, Stephan Russ-Mohl and Scott Maier
Nearly 80 years of accuracy research in the United States has documented that the press frequently errs, but empirical study of news accuracy elsewhere in the world is absent. This article presents an accuracy audit of Swiss and Italian daily regional newspapers. Replicating U.S. research, the study offers a trans-Atlantic perspective on news accuracy. To compare newspaper accuracy in Switzerland and Italy to longitudinal accuracy research in the United States, the study followed closely the methodology pioneered by Charnley (1936) and adapted by Maier (2005). News sources found factual inaccuracy in 60 percent of the Swiss newspaper stories they reviewed, compared to 48 percent of the U.S. and 52 percent of the Italian newspapers examined. The results show that newspaper inaccuracy – and its corrosive effect on media credibility – transcends national borders and journalism cultures. Nowadays, digitization offers new ways of implementing correction policies. Media organizations need, however, to adapt to these changes, and in particular to adapt their structures to new forms of participative and interactive two-way communication.
Don’t Feed the Trolls! Managing troublemakers in magazines’ online communities
‘Trolling’ and other negative behaviour on magazine websites is widespread, ranging from subtly provocative behaviour to outright abuse. Publishers have sought to develop lively online communities with high levels of user-generated content. Methods of building sites have developed quickly, but methods of managing them have lagged behind. Some publishers have consequently felt overwhelmed by the size and behaviour of the communities they have created. This paper considers the reasons behind trolling and the tools digital editors have developed to manage their communities, taking up the role of Zygmunt Bauman’s gardeners in what they sometimes refer to as “walled gardens” within the internet’s wild domains. Interviews were conducted with online editors at the front line of site management at Bauer, Giraffe, IPC, Natmags, RBI and the Times. This article shows how publishers are designing sites that encourage constructive posting and are taking a more active part in site management. Web 2.0 and the spread of broadband, which have made management of fast-growing communities difficult, may themselves bring positive change. As uploading material becomes technically easier, “ordinary” citizens can outnumber those who, lacking social skills or with little regard for social norms, originally made the Internet their natural habitat.
Communicative Action of Journalists and Public Information Officers: Habermas revisited
Judith McIntosh White
Public information officers (PIOs) represent a type of communications professional distinct from public relations practitioners (PRPs). From a structural functionalist viewpoint, journalists and PIOs share goals: Both see themselves as facilitating the information flow into the public sphere. Habermas’ communicative action models defining journalists as committed to revealing the ‘whole truth’ to the public, but PRPs as enmeshed in advocating private interests, do not adequately describe PIOs. Although journalists’ and PIOs’ goals are similar, barriers exist to inhibit their cooperation in achieving those mutual goals. Such barriers arise from academic ideal types fostering inaccurate perceptions of each other, perceptions reinforced by adaptive structuration within their respective organizations’ cultures. Empirical data support that PIOs’ and journalists’ divergent attitudes about their professional praxis combine with ideal-type constructions and organizational cultures to produce communication disconnects between the two.
“The People’s Debate”: The CNN/YouTube debates and the demotic voice in political journalism
Matt Carlson and Eran N. Ben-Porath
Popular participation lies at the core of democratic governance, yet the mediated conversation of politics has largely been limited to elite sources and professional journalists. New technological alternatives to the mass communication model create opportunities for the non-elite “demotic voice” in the mediated public sphere. However, the expansion of who speaks is not without tensions or efforts by traditionally powerful voices to retain control over communication channels. This article analyzes struggles surrounding the growing role of the demotic voice through a case study analysis of the 2007 CNN/YouTube debates among candidates for the US presidency.
Remediating: Journalistic strategies used in positioning citizen-made snapshots and text bites from the 2009 Iranian post-election conflict
Rune Saugman Andersen
When mass protests and a violent crackdown followed the 2009 Iranian presidential election, western mass media found themselves in a precarious situation: eager to report on the unfolding events, but without access to them, save through snapshots and text bites posted to content-sharing sites by unknown users. Basing news coverage on such content challenged journalistic understandings of credibility as produced by professional routines, thus disturbing the foundation of epistemic authority on which professional journalism builds. Neglecting it, however, would challenge journalism’s ability to portray anything at all. This article investigates how the positioning of citizen micro-journalism was textually negotiated in news reports by attributing different degrees of epistemic authority to citizen-made content. It argues that the strategies used greatly privilege the unknown image vis-à-vis the unknown verbal report. While verbal reports are treated as illustrative of communication practices or attributed doubt, images are allowed to represent the crisis, and frequently made indistinguishable from professionally produced content. Furthermore, in attributing citizen-made content to news agencies and mediation channels, the incorporation practices treat intermediation as a source of credibility. Deconstructing the process of constructing epistemologically authoritative news thus highlights how mediation, news values, source practices, and image conventions are relied on to perform credibility.
Online Journalism and Election Reporting in India
The news media scenario in India has been transformed substantially in the post-liberalization period, as privatization and deregulation have facilitated cross-border flows of capital and technology. Online news media, a new yet popular segment, has emerged in the past decade in the wake of India’s rapid integration into the global economy. This article focuses on online news reporting of the last general election in India: the 2009 Lok Sabhā Election. Although there are an impressive number of studies regarding online social networking or new media in the global context, scant attention has been paid to the Indian subcontinent and to the involvement of Indian politicians and political journalists with online media. Considering these aspects, this article explores how online media in India are changing the established political culture, albeit in a limited manner, and raises issues that interweave notions of modernity, class-consciousness, and emerging participatory practices. The article seeks to make sense of how Indian journalism is transforming through social media use by analysing three different stakeholders during the Indian general election: politicians, political journalists and ordinary citizens. The very fact of “being” or “using” social media, it argues, becomes an “enticing” aspect for politicians to relate to the young, urban, upwardly mobile middle-class citizens of India, and becomes pivotal in the discursive construction of a binary between the “old” politics/politicians and the “new” politics/politicians in present-day India.
A Journey through Ten Countries: Online election coverage in Africa
Ben Akoh and Kwami Ahiabenu
The African Elections Project (www.africanelections.org) was established with the vision of enhancing the ability of journalists, citizen journalists and the news media to provide more timely and relevant election information and knowledge, while undertaking monitoring of specific and important aspects of elections using social media tools and ICT applications. Elections are the cornerstone of democracy and the media have a key role to play in deepening democracy by providing impartial coverage of elections. In addition to traditional election coverage, online election reporting on the Africa continent has been experiencing growth in recent years. It takes the form of special election websites that incorporate elements of citizen journalism or crowdsourcing and is mostly driven by mobile phones. It is mashed up with blogs, interactive maps and social media tools such as Twitter, Flickr, YouTube and Facebook among others. This article chronicles the African Elections Project’s field experiences based on the elections it has covered in ten countries: Ghana, Cote d'Ivoire, Guinea, Mauritania, Malawi, Mozambique, Namibia, Botswana, Togo and Niger, showing the similarities and importance of online election coverage in these countries. The internet is gradually providing new sets of tools for journalists which could be relevant and applicable for reporting elections. The paper concludes by showing the difficulties journalists encounter in the practice of reporting elections and offers suggestions for future research.
“Second-order” Elections and Online Journalism: A comparison of the 2009 EP elections’ coverage in Greece, Sweden and the UK
European Parliament elections are often classified as ‘second-order’ and there are few pan-European media outlets through which European Union (EU) elites can address voters directly. Given these conditions, can online journalism help broaden the scope of European political communication, facilitate interaction across the borders and refocus European Parliamentary election coverage on EU issues? Using an analytical model based on the public sphere, this article assesses online reporting of the 2009 European Parliamentary elections in Greece, Sweden and the United Kingdom, on three levels: publicization; participation; and public opinion formation. The results show that despite the differences between the selected countries in terms of online communication infrastructure and the maturity of the online public sphere, cross-national patterns of European Parliamentary election coverage emerge. This allows for reserved optimism regarding the role of online journalism in the building of a European public sphere.
(Not) the Twitter Election: The dynamics of the conversation in relation to the Australian media ecology
Jean Burgess and Axel Bruns
This paper draws on a larger study of the uses of Australian user-created content and online social networks to examine the relationships between professional journalists and highly engaged Australian users of political media within the wider media ecology, with a particular focus on Twitter. It uses an analysis of topic-based conversation networks using the hashtag on Twitter around the 2010 federal election to explore the key themes and issues addressed by this Twitter community during the campaign, and finds that Twitter users were largely commenting on the performance of mainstream media and politicians rather than engaging in direct political discussion. The often critical attitude of Twitter users towards the political establishment mirrors the approach of news and political bloggers to political actors, nearly a decade earlier, but the increasing adoption of Twitter as a communication tool by politicians, journalists, and everyday users alike makes a repetition of the polarisation experienced at that time appear unlikely.
Social Media as Beat: Tweets as news source during the 2010 British and Dutch elections
Marcel Broersma and Todd Graham
While the newspaper industry is in crisis and less time and fewer resources are available for newsgathering, social media turn out to be a convenient and cheap beat for (political) journalism. This article investigates the use of Twitter as a source for newspaper coverage of the 2010 British and Dutch elections. Almost a quarter of the British and nearly half of the Dutch candidates shared their thoughts, visions, and experiences on Twitter, and these tweets were increasingly quoted in newspaper coverage. We present a typology of the functions tweets have in news reports: they were either considered newsworthy as such, were a reason for further reporting, or were used to illustrate a broader news story. We also show why politicians were successful in producing quotable tweets. While this paper, which is part of a broader project on how journalists (and politicians) use Twitter, focuses upon the coverage of election campaigns, our results indicate a broader trend in journalism. In the future, the reporter who attends events, gathers information face-to-face, and asks critical questions might instead aggregate information online and reproduce it in journalistic discourse, thereby altering the balance of power between journalists and sources.
Film Review: Tale of Two Dragons
The Promise of Computational Journalism
Terry Flew, Christina Spurgeon, Anna Daniel and Adam Swift
Computational journalism involves the application of software and technologies to the activities of journalism, and it draws from the fields of computer science, the social sciences, and media and communications. New technologies may enhance the traditional aims of journalism, or may initiate greater interaction between journalists and information and communication technology (ICT) specialists. The enhanced use of computing in news production is related in particular to three factors: larger government data sets becoming more widely available; the increasingly sophisticated and ubiquitous nature of software; and the developing digital economy. Drawing upon international examples, this paper argues that computational journalism techniques may provide new foundations for original investigative journalism and increase the scope for new forms of interaction with readers. Computational journalism provides a major opportunity to enhance the production of original investigative journalism, and to attract and retain readers online.
Journalistic Discourse in the Mood of Change: A Comparative study of European MICE magazines
Jan Richard Baerug and Halliki Harro
The border between journalism and marketing communication is diminishing and media such as television and magazines are especially vulnerable to the colonisation of traditional journalistic genres by promotional information. From the point of view of audience perception, grouping certain media channels or discourses into ‘journalistic’ and others into ‘promotional’ or ‘mixed’ would provide a certain level of predictability, as well as a basis for their judgement of information. However, we argue here that category confusion takes place even inside sub-sectors of niche magazines. The objective of this international comparative research is to analyse the editorial ideologies and discursive practices concerning the hybridisation of media discourse in one media sector: the Meetings, Incentives, Conferences and Events (MICE) sub-sector of niche magazines. Can these magazines be identified as ‘journalistic’, ‘advert’ or ‘mixed’ oriented media? The empirical research is focused on the production process (the implementation of editorial principles) in key MICE magazines mainly in European countries.
The Politics of Journalistic Creativity: Expressiveness, authenticity and de-authorization
This article begins with the assertion that creativity in journalism has moved from being a matter of guile and ingenuity to being about expressiveness, and that this reflects a broader cultural shift from professional expertise to the authenticity of personal expression as dominant modes of valorization. It then seeks to unpack the normative baggage that underpins the case for creativity in the cultural industries. First, there is a prioritization of agency, which does not stand up against the phenomenological argument that we do not own our own practices. Second, creative expression is not necessarily more free, simply alternately structured. As with Judith Butler’s performativity model, contemporary discourses of creativity assume it to have a unique quality by which it eludes determination (relying on tropes of fluidity), whereas it can be countered that it is in spontaneous, intuitive practice that we are at our least agencical. Third, the article argues against the idea that by authorizing journalists (and audiences) to express themselves, creativity is democratizing, since the always-already nature of recognition means that subjects can only voice their position within an established terrain rather than engage active positioning.
More News for Less: How the professional values of 24/7 journalism reshaped Norway’s TV2 newsroom
Maria Konow Lund
Using an ethnographic case study of the Newschannel at TV2 Norway, this article reveals ways in which the assembly-line mentality required by 24/7 news production nevertheless encourages reporters to negotiate a certain autonomy over their work and the routines required to produce it. By reorganizing its staff’s use of time, space, and resources, TV2 was able to generate roughly eighteen hours of live news coverage a day during the article’s research period from 2007 to 2009. This production process is framed in terms of Schlesinger’s (1978) “reactive” mode, here qualified as “reactive-active”, because it allows for the possibility of broadcasting “live” and gathering news at the same time. The article also revisits the concept of “professionalism” with regard to a traditional broadcaster’s implementation of a 24/7 news channel within its existing newsroom. As a result of this process, more news—and more content concerning that news—is produced more efficiently while the tenets of traditional journalism remain operative.
Opening the Gates: Interactive and multimedia elements of newspaper websites in Latin America
Ingrid Bachmann and Summer Harlow
In the light of newspapers’ struggle to maintain readers and viability in the digital era, this study aims to understand better how newspapers in Latin America are responding to this shift toward user-generated and multimedia content. Using a content analysis of 19 newspapers from throughout Latin America, this study found that newspaper websites are bringing citizens into the virtual newsroom on a limited basis, allowing them to interact with each other and with the newspaper but only to a modest degree. For example, while all newspaper websites have some multimedia content and most have Facebook and Twitter accounts, few allow readers to report errors, submit their own content, or even contact reporters directly. Further, most online newspaper articles include photos, but video, audio and hyperlinks rarely are used. These results further our understanding of how online interactivity is changing the traditional role of journalists and how Latin America is responding to the challenge.
Old Turf, New Neighbors: Journalists' perspectives on their new shared space
Online news readers’ comments have been the subject of intense debate in newsrooms across the United States. Caught unexpectedly as hosts of this new public space, journalists face a conundrum: upholding the traditional ideal of providing a space for dialogue with their public, while not wanting to create a space for hate in online news readers’ comments sections. This research examines the perspectives of 30 journalists in the United States through in-depth interviews regarding this new online public space. The interviews illuminate the experiences journalists are undergoing in their new roles, how they balance their responsibilities, and the vision they hold for the future of this space as they learn to navigate, unguided, through this new electronic landscape.
Newsroom Practices and Letters-to-the-Editor: An analysis of selection criteria
Marisa Torres da Silva
Letters-to-the-editor provide a significant forum for public debate, enabling the exchange of information, ideas and opinions between different groups of people. Since journalistic work is central to the processes of citizenship, this article observes the social context surrounding letters-to-the-editor in four Portuguese press publications. Keeping in mind the existence of a set of selection criteria, based on newsroom practices, it is possible to characterize the debate that takes place in the letters’ section as a construction. As with any other editorial content, the published letters are also a result of a selection, editing and framing process, shaped by journalistic routines and subject to limitations such as space and time.
Managed Mediocrity? Regional journalists’ perspectives on the “pampers generation” of Russian reporters
Elina Viktorovna Erzikova and Wilson Lowrey
This paper adopts Bourdieu’s field theory to examine a possible professional gap between young and experienced Russian regional journalists. In-depth interviews revealed that experienced journalists have a negative view of their young colleagues, seeing them as unskilled, poorly motivated, mediocre, and submissive to authority. For their part, beginners see the older generation as lacking dynamism and the dedication to help young reporters master professional skills. It appeared that younger reporters tend to choose different professional priorities, to pursue sources of “capital” that derive from beyond the journalistic field, and to follow different historical trajectories than older journalists. Because of the media’s dependency on the state and the governmental reward for mediocrity, older study participants tended to doubt that young reporters would seek or obtain a measure of journalistic autonomy.
Film Review: Et Tu, George? Campaign journalism and the politics of compromise
Broadening the focus: why lifestyle journalism is an important field of study
Guest Editor’s Introduction
This introduction to the special issue outlines the case for an increased focus on studying lifestyle journalism, an area of journalism which, despite its rapid rise over recent decades, has not received much attention from scholars in journalism studies. Criticised for being antithetical to public interest and watchdog notions of journalism, lifestyle journalism is still ridiculed by some as being unworthy of being associated with the term journalism. However, in outlining the field’s development and a critique of definitions of journalism, this paper argues that there are a number of good reasons for broadening the focus. In fact, lifestyle journalism – here defined as a distinct journalistic field that primarily addresses its audiences as consumers, providing them with factual information and advice, often in entertaining ways, about goods and services they can use in their daily lives – has much to offer for scholarly inquiry and is of increasing relevance for society.
Lifestyle Journalism as Popular Journalism: Strategies for Evaluating its Public Role
This essay argues that lifestyle journalism, which is often considered trivial, should be analyzed for its public potential. I delineate how lifestyle journalism’s dimensions of review, advice, and commercialism can be transformed into strategies for research that probe the social, cultural and economic context of this media output. Then I discuss how its discourse is worth analyzing for its ideological connection. John Fiske’s (1989) ideas on “popular news” and Irene Costera Meijer’s concept of “public quality” (2001) are presented as guidelines for interrogating the public relevance of this type of journalism. Findings from studies on the globalization discourse in travel journalism and music journalism are used to exemplify this research framework.
“Lifestyle journalism – blurring boundaries”
Nete Nørgaard Kristensen & Unni From
The article argues that, in contemporary journalism, the boundaries between lifestyle journalism and cultural journalism are blurring. The discussions of the article are based on comprehensive empirical studies: first, a content analysis of the coverage of lifestyle, culture and consumption in the Danish printed press during the twentieth century and the first decade of the twenty-first; and second, interviews with Danish cultural journalists and editors. The studies reveal that the coverage of lifestyle is expanding and that culture, lifestyle and consumption are today contiguous – sometimes even inseparable – subject matters, which even journalists find difficult to separate. The findings are interpreted particularly in the light of Jansson’s (2002) approach to the mediatization of consumption as an expression of more general socio-cultural and media-cultural transformations of society.
Travel Journalism, Cosmopolitan Concern and the Politics of Destination Branding
While journalism scholars’ interest in the impact of public relations on hard news has grown in recent years, little attention has been paid to attempts by elite sources to influence soft journalism. In an effort to better understand what can, in fact, be complex interactions between travel journalists and public relations practitioners, this paper tracks one destination’s brand over an extended period of cosmopolitan concern. It finds that in times of conflict, government tourism public relations may become politically instrumental, as public relations practitioners seek simultaneously to promote the destination and shield it from media scrutiny. At such times, travel journalists may subvert traditional expectations of their genre by exposing contradictions in the brand. The paper concludes that the power of travel journalism derives not only from its authors’ capacity to communicate through their texts but also from their tendency to be enmeshed in the interactivity of the brand.
Bread and Circuses: Food Meets Politics in the Singapore Media
Andrew Duffy and Yang Yuhong Ashley
While there has been consistent academic interest in the link between the media and politics, this attention has mostly bypassed lifestyle journalism. Yet lifestyle journalism can reflect the political and social realities of a country, albeit less clearly than more overtly political coverage. This paper seeks to demonstrate how the Singapore government has used food to help construct a national identity and how the local print media have been a partner in this. It analyses how food has been represented in the Singapore press in relation to attitudes that contribute to nation-building. The findings suggest that the food-related articles studied usually reflected a culture of self-improvement, an ethnic-cultural element and cosmopolitan attitudes, all of which were identified as touchstones of Singapore’s government-approved national identity. There is also marginally more press coverage of cosmopolitan and foreign food than of local food, in concurrence with government initiatives to position the country as a globalised hub.
Health and lifestyle to serve the public: A case study of a cross-media programme series from the Norwegian Broadcasting Corporation
This study traces the changes in a prime time health and lifestyle television programme (Puls) on the main channel of the Norwegian Broadcasting Corporation (NRK) across the last decade. The overall pattern of change is that, as a public-service broadcaster, the NRK continues to broadcast factual programmes in prime time, which incorporate different entertainment and interactive elements to guarantee their popularity and relevance to viewers. The integration of television and online activity has lessened over the years. The television content has become more lifestyle-oriented, while the Internet content places more stress on factual information and news. The use of reality elements has led to a strong emphasis on motivating the viewer to adopt an active lifestyle, while the viewers’ possibilities to participate actively in the programme have diminished.
Hypertextuality and Remediation in the Fashion Media: The case of fashion blogs
Since their appearance in the early noughties, fashion blogs have established themselves as a central platform for the circulation of fashion-related news and information. Often the creation of fashion outsiders, they have entered the mainstream fashion media, bringing to light the shifting nature of fashion journalism. The paper discusses the rise of the fashion blogosphere and the impact of new technologies on the mediation of fashion. Drawing on the notions of hypertextuality and remediation, it contributes to a recurring question in academic studies of digital culture: how new are new media? The paper looks at the ways fashion blogs define themselves in relation to traditional fashion journalism and the traditional fashion press. Their relation of co-dependence and mutual influence is unpacked to shed light on the contemporary field of the fashion media, and the role of new technologies in the production, circulation and consumption of fashion-related news.
Service Journalism as Community Experience: Personal Technology and Personal Finance at The New York Times
This paper examines service journalism and its evolution into a community platform via blog comments and social media, through a case study of two sections of The New York Times’ business section: the personal finance section and the personal technology section. The paper proceeds through a discussion of the importance of networked journalism, and relies on in-depth qualitative interviews with the journalists closest to the decisions being made about how service journalism at the Times becomes a participatory experience for readers. The article argues that a Web 2.0 world facilitates a community experience that changes the one-to-many relationship that journalists have with their readers; instead, journalists make decisions about coverage and engage in conversations with readers in response to this new relationship.
A New Generation of Lifestyle Magazine Journalism in China: the Professional Approach
This study examines the new approach and working standards of Chinese journalists who are working in international consumer magazines in China. In the past 15 years, Chinese lifestyle journalists have reoriented their multiple functions to present their social role as an ‘information vehicle’, ‘serving the rising class’, with ‘independence from media ownership and commercial forces’ and ‘contributing consumerism to culture and traditional society’. The elements involved in this new genre of journalism include financial and operational autonomy from the State and editorial independence from their international parent magazine companies. This professional approach also utilizes a working strategy that lays bare each individual’s unique journalistic identity. The professionalization of these lifestyle magazine journalists is the result of a global and local media cultural collision; a product of the reconciliation of commercialism and professionalism.
Farewell to Journalism? Time for Rethinking
We learn from Piet Bakker that:
The business model of gathering, producing and distributing news is changing rapidly. Producing content is not enough; moderation and curation by “news workers” is at least as important. There is a growing pressure on news organizations to produce more inexpensive content for digital platforms, resulting in new models of low-cost or even free content production. Subscription, advertising revenues and non-profit funding are in many cases insufficient to sustain a mature news organization. Aggregation, either by humans or machines, is gaining importance. At “content farms” freelancers, part-timers and amateurs produce articles that are expected to end up high in Web searches. Apart from this low-pay model a no-pay model—the Huffington Post—emerged where bloggers write for no compensation at all. We analyse the background to all this, the consequences for journalists and journalism, and the implications for online news organizations. We investigate aggregation services, content farms and no-pay or low-pay news websites.
Keywords: aggregation, content farms, Huffington Post, journalism
Journalists are expected to work for different platforms, develop techniques to gather and check online information, acquire skills to become multi-media professionals, and learn how to deal with the contributions of amateurs. Producing content is not enough; moderation and curation are the new buzz-words of the trade while gate-watching is at least as important as gate-keeping, “to harvest content from volunteers and act as curator, correcting and editing copy” is the new role for professional journalists according to Jones and Salter (2012, p. 100). This has led to higher demands in terms of output—in quantity—because revenues from traditional media decline while income from digital operations is still marginal and competition is fierce (Picard, 2010a). A possible solution is producing more content at lower costs for these new platforms. This has led to new models of low-cost or even free content production.
Many of these enterprises are successful. The Huffington Post, mainly relying on free work from bloggers and one of the most visited websites in the United States, was sold for $315 million to America Online (AOL) at the beginning of 2011. Google News, relying on algorithms and search robots, has versions in more than 60 countries. Pulse, aggregating RSS-feeds (Really Simple Syndication) from media websites, is one of the most downloaded apps for the iPad. Their ways of newsgathering, however, have been disputed as well. The Huffington Post was threatened by a lawsuit on behalf of 9000 unpaid bloggers, Pulse was summoned by the New York Times to remove its RSS-feeds from the service, and Google News was forced by Belgian French-language publishers to remove all their content.
The Economics of Online Content Production
Cases: Low-pay, No-pay and Aggregation
Generating income from digital operations has been troublesome for most news organizations to say the least. In the last decade of the twentieth century most content was offered for free while it was expected that advertising, e-commerce and marketing (offering subscriptions through the website) could make up for the costs and maybe even provide a small profit. Growing competition—resulting in low advertising rates, and the economic crises of the twenty-first century—made media companies move to other models: paid and metered access, registration, premium access for paid subscribers, pdf-versions, paid archives, pay-walls and paid access through apps on iPads and other tablets. Currently three streams of revenues dominate online news operations.
Subscription and single copy sales—meaning direct payment by users for content—are used by specialized publications such as the Wall Street Journal and the Financial Times, and by news organizations with a more general focus like The (London) Times and some local papers. Subscriptions to pdf-papers and iPad apps also fall within this category. Notwithstanding the 30 per cent cut Apple takes from all sales through its App Store, many newspapers and magazines now embrace this model. Prices, however, are generally lower than for the physical products.
Advertising and e-commerce, such as banner ads, pop-ups, sale of products, pre-roll video ads and other advertising formats, are used by almost every news organization. Blogs carry “ads by Google” and social networks like Facebook and LinkedIn carry “targeted” advertising. Media websites also promote other activities such as workshops, travel and seminars.
Sponsoring, donations and non-profit public funding are the most-used non-commercial options for sustaining a news website. Public broadcasters are sponsored by their parent organizations; non-profit organizations (like Amnesty International, universities, governments) provide news on their websites. Public funding is gaining in importance, demonstrated by examples like Spot.us, The Bay Citizen, SF Public Press, ProPublica (United States), helpmeinvestigate.com (United Kingdom) and De Nieuwe Reporter (The Netherlands). Downie and Schudson state: “Financial support for reporting now comes not only from advertisers and subscribers but also from foundations, individual philanthropists, academic and government budgets, special interests, and voluntary contributions from readers and viewers” (2010, p. 56).
All digital operations compete for audience attention while most of them also compete for advertising. Few news organizations can rely on subscription as the main source of revenue, except for specialized (financial) news providers. The iPad has made digital subscription an option for news organizations, but it is too early yet to judge whether this can grow into a substantial revenue stream. Operations that are able to ask users for (high) subscription prices can usually also ask advertisers for higher advertising rates as they provide access to a specific, usually affluent, audience.
Picard (2010a) argues that online news providers would benefit from specialization, targeting niches and providing high-end quality journalism at high prices. This may be so, but for many news organizations it is not a viable option. They do not have the audience, the reputation, the brand, the staff or the resources to embrace that strategy. Their basic approach is to go for a larger audience with general news offered for free and produced at low cost. Lewis et al. (2008) and Davies (2008) provided examples of this behaviour by UK newspapers. A smaller staff is asked to produce more output in less time, not only for the printed newspaper but also to keep the website up to date. Whether local audiences can be defined as a niche with possibilities of online subscription and higher ad rates depends on the number of competitors. If users can choose between substitutes offering more or less the same content, the chances of them being willing to pay for local news are not very high.
More Content at Lower Costs
News organizations have three options when it comes to controlling costs and increasing output; they can save on staff or have staff members work differently; they can make compromises with regard to content or go for cheaper content; and they can employ technologies to replace staff, produce content or increase their audience.
Staff options are employed with the overall objective of having fewer (or less expensive) staff members produce more content; “increasing the volume of news and information” is the first strategy of news organizations according to Picard (2010a, p. 84).
Reducing staff is a phenomenon that can be observed worldwide. Alterman (2011) estimates that between 1990 and 2008 a quarter of newspaper jobs in the United States disappeared. Website Papercuts (newspaperlayoffs.com) keeps track of all recent changes in US newspapers—it shows dozens of lay-offs every month. A job lost at the newspaper also affects the online operation as most news media have an integrated news room.
Replacing journalistic staff by non-unionized staff, technicians or community managers works particularly well in countries where unionized staff have better pay and working conditions. In the United States, the United Kingdom and France, conflicts around this issue already date from the pre-Internet era. Online it means that technicians, re-writers and staff whose duty it is to stimulate others (bloggers, volunteers) to contribute, are replacing journalists.
De-integration of print and online operations occurs because online staff are usually paid less than “regular” journalists. In France, La Depeche du Midi replaced journalists working for the online edition with a separate re-write staff with technical knowledge of Search Engine Optimization (SEO) (Smyrnaios and Bousquet, 2011). The Belgian De Persgroep, owner of four Dutch newspapers, has taken all online duties away from the journalists of the papers and formed a new combined online newsroom for the websites of all titles. Younger, and less expensive, staff are employed for the operation.
Paying less for items, particularly for photos. The prices freelancers receive per photo have dropped. Dutch news magazine HP/De Tijd decided in 2011 to pay 40 per cent less (€300) for photo features, Dutch newspaper publisher Wegener (part of the Mecom group) pays less than €50, and De Telegraaf Media Group—the largest publisher in the Netherlands—pays €18 for most freelance pictures. The newspaper AD pays €120 for half a day of work, in which the photographer is expected to take as many pictures as possible. Reviewers of concerts, plays and movies have also seen their earnings drop.
Demanding more output from staff members is a general trend. For the United Kingdom, Lewis et al. (2008) showed that staff members were expected to produce more content. AOL moulded this strategy into a system of “scaled production” for which it was calculated that the cost per item would decrease from $99 to $84 while page views would increase. Writers for the online news service Seed were expected to use the system first, whereby the cost of a written article would not be more than $25 because of increased production.
Replacing regular staff with freelancers could result in a 50 per cent cost saving on articles. Dutch newspapers have been focusing more on the core business of news; meaning that features, culture and specials are outsourced to freelancers (Bakker et al., 2011). Picard argues that “there is a widespread and growing use of freelance journalists, heavy reliance on acquired content from news, video, and feature services” (2010a, p. 83).
Cutting down on international reporting is done by replacing foreign correspondents with freelance stringers. The largest Dutch newspaper, De Telegraaf, closed all but three foreign bureaus (New York, London, Brussels) and started to employ stringers instead. According to Starr (2010), the number of American newspaper foreign correspondents dropped by 30 per cent between 2002 and 2006. This will result in less online coverage as well.
Outsourcing and off-shoring production mean that page-layout, ICT, photo-editing, moderating of comments, subediting and content production are handled by external parties, either domestic (outsourcing) or foreign (off-shoring). Australian publisher Fairfax as well as several UK newspapers have outsourced subediting to an independent company. Using foreign subeditors can lead to cost reduction because it avoids expensive night-shift work. The Pasadena (United States) website Pasadena Now hired reporters from Mumbai and Bangalore to cover the Pasadena City Council. The Boston Globe, The San Francisco Chronicle and NPR have outsourced comment moderation to a Canadian firm. Dutch newspaper De Telegraaf also employs a private company to moderate comments.
Using user-generated content (comments, photos, video) from amateurs, bloggers or social media can result in substantial cost reductions, as this content is usually free. Almost all websites now have this feature, although getting the audience to send in useful material can be expensive.
Employing “amateur” bloggers or volunteers can result in free content although in some cases these contributors get paid on the basis of the revenue the articles generate. The US Examiner local network uses this last model. In some cases contributors are offered a fixed price, usually a very low price. Content farm Demand Media pays $15 for an item. UK-based SnackMedia pays £10 for a 1000-word article. When Roy Greenslade reported on this offer on his Guardian blog, one commenter posted: “It's ten quid more than many offer”. On a more positive note, Downie and Schudson state that “the ranks of news gatherers now include not only newsroom staffers but also freelancers, university faculty members, students, and citizens” (2011, p. 56). The Huffington Post and the Guardian's “Comment is Free” website do not pay bloggers at all.
Media companies used to have a preference for producing content in-house as much as possible. The new realities of failing online revenues have challenged this paradigm. Non-original content is no longer a no-go area.
Re-publishing content or “repurposing and reutilizing existing news and information” (Picard, 2010a, p. 84) has been an often-used strategy whereby items for the printed or broadcast version were reused online, sometimes in an abbreviated format. Earlier this model was referred to as “shovel-ware” (Chyi and Sylvie, 1998).
Using more syndicated content and content from press agencies (see Davis, 2008) has been employed for some years already. In the Netherlands the most successful news website nu.nl has a small staff that is mainly charged with copy/pasting and rewriting articles from national press agency ANP.
Using more PR material and press releases from commercial and non-profit organizations, including video, will result in serious cost reductions. The UK website churnalism.com tracks the behaviour of newspaper and broadcast websites: the number of press releases they use and whether they rewrite the received copy or just copy/paste it. Churnalism is defined as “a news article that is published as journalism, but is essentially a press release without much added”. Results show that many of these items are published with few or no alterations.
Using stock-photos instead of original material can be a serious cost-saver. Dreamstime.com (available in seven languages) offers “high quality Royalty-Free images for as low as 0.17 €/image or free”. Some media also use pictures from Wikipedia and Flickr for news items on their websites.
Material “found” online can be used, slightly rewritten, cited, linked to, translated, “enriched”, combined and sometimes researched further. This practice comes close to pure “aggregation”.
The shovelling journalist is sometimes replaced by robots, algorithms and search engines. Technology can help to produce more content, make content production cheaper or to increase readership and revenue. Some technologies must be handled by humans; others function more or less independently. All strategies (including aggregation, social media, curation and using RSS-feeds) use material from other media and provide links to the original source; driving traffic to the original website is the justification for the use of third-party content.
Aggregation is the general heading for strategies whereby (automated) Web searches result in relevant articles on specific subjects. Google News is probably the best-known example but many other services exist, often using Google search as their main tool. Google News depends on searches in news, grouping results into categories (national, international, sports, entertainment, etc.) and presenting links—sometimes with the first paragraph—to the original news items. The most important (most found) subjects are placed on top of Google News. Legacy news media are not too fond of automatic aggregation. Belgian publishers took Google News to court; Dutch aggregator kranten.com was sued by publisher PCM, while US wire service Associated Press announced in 2009 they would “pursue legal and legislative actions against aggregators who use content without permission” (Abel, 2009).
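The grouping of fetched items into sections described above can be illustrated with a minimal sketch. This is not Google News's actual algorithm (which is proprietary and far more sophisticated); the categories, keyword lists and headlines below are invented for the example, which simply assigns each headline to the first category whose keyword set it overlaps.

```python
# Toy keyword-based categorizer, illustrating how an aggregator might
# sort harvested headlines into sections. Categories and keywords are
# invented for this sketch, not taken from any real service.
CATEGORIES = {
    "sports": {"match", "league", "cup", "goal"},
    "business": {"shares", "market", "profit", "merger"},
    "politics": {"election", "parliament", "minister", "vote"},
}

def categorize(headline: str) -> str:
    """Return the first category whose keywords overlap the headline."""
    words = set(headline.lower().split())
    for category, keywords in CATEGORIES.items():
        if words & keywords:
            return category
    return "general"

headlines = [
    "Shares rally as market rebounds",
    "Minister survives parliament vote",
    "Local cup match ends in draw",
]
grouped = {}
for h in headlines:
    grouped.setdefault(categorize(h), []).append(h)
```

A real aggregator would add ranking (placing the "most found" subjects on top) and de-duplication of near-identical items from different sources.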
Curation combines automatic aggregation with human labour. WorldCrunch and Europe Today (translating international news into English), Presseurop and News360 (translating news into different languages) offer material from international media. Selection and rewriting is part of the aggregation process. Several “content curation tools” like Scoop.it are available to assist the aggregator. Also US news website Newser combines search engines with work by humans. Although all items appear to be written by “Newser staff”, the content of the presented links makes it clear that other media are used as sources.
Material from social media can be embedded on news websites using special “widgets” whereby only tweets that contain specific words or phrases are selected. The iPad app Flipboard—one of the most popular news apps—mainly uses updates from Twitter, Facebook, Tumblr and blogs.
RSS is used by websites to promote and distribute content, but these RSS-feeds themselves can be used by other services as a source. The Pulse iPad app, for instance, uses RSS-feeds from various news sources. The company employed 12 people in the beginning of 2011: one CEO, one marketing employee and 10 programmers. Pulse started as a paid service but moved to free distribution as it sought other revenue streams like sponsored promoted content. The Zite iPad app also only uses news from other sources (TechCrunch, Businessinsider, blogs, newspaper websites) through Twitter posts, RSS-feeds and social bookmarking. The News.me app is based on what other people read. Mobile news service Ongo offers a paid service—using news from other sources.
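Services like Pulse can build on RSS because the format is a simple, standardized XML document. The following sketch parses a feed with Python's standard library; the feed content and URLs are hypothetical (a real aggregator would fetch the XML over HTTP from each source site).

```python
import xml.etree.ElementTree as ET

# Hypothetical RSS 2.0 feed content, inlined for the example; an
# aggregator would download this from a news site's feed URL.
RSS = """<?xml version="1.0"?>
<rss version="2.0"><channel>
  <title>Example News</title>
  <item><title>Budget approved</title><link>http://example.com/1</link></item>
  <item><title>Storm hits coast</title><link>http://example.com/2</link></item>
</channel></rss>"""

def parse_feed(xml_text):
    """Return (title, link) pairs for every <item> in an RSS 2.0 feed."""
    root = ET.fromstring(xml_text)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]

items = parse_feed(RSS)
```

That the headline and link of every story are machine-readable in this way is exactly what makes RSS-based aggregation so cheap: twelve employees, ten of them programmers, sufficed to run Pulse.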
Promotion through social media means that items are automatically posted on Twitter, Facebook or LinkedIn. Readers can share the URL of the item on these and other services like Google+, Digg or Delicious as well. The AddThis service “Boost your site's traffic” offers dozens of options.
SEO is the process whereby articles are written in ways that make it likely that they will end up high—preferably on the first page—in a Google, Bing or Yahoo! search. Content farms such as Demand Media use this strategy to instruct authors. It means the use of simple and often-used words in headings and body text. Google Analytics and Google AdWords are helpful tools when it comes to implementing an SEO strategy.
Introducing automatic features like “best-read” or “most-commented” lists can increase the number of pages people read on a website, and polls are used to engage audiences. The first two options are fully automatic; polls involve some human input, but comments and results can be used for news items as well.
One use of technology for journalism that borders on science fiction is software that writes articles. StatSheet, a US company that offers sports statistics, experimented with software that turns game statistics into “automated content” (see van Dalen's essay in this issue; van Dalen, 2012). Narrative Science, a spin-off from Northwestern University, can “cost-effectively produce” sports stories and financial reports as well as local community content: “our proprietary artificial intelligence platform produces reports, articles, summaries and more that are automatically created from structured data sources” (www.narrativescience.com).
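At its simplest, "automated content" means mapping structured data onto language templates. The sketch below shows the bare idea, far simpler than what StatSheet or Narrative Science do; all team and player names are invented.

```python
# Bare-bones automated content: render structured game statistics
# into a recap sentence with a fill-in template.
def game_recap(stats):
    """Turn a dict of game statistics into a one-sentence recap."""
    template = ("{winner} beat {loser} {winner_score}-{loser_score}, "
                "led by {top_scorer} with {points} points.")
    return template.format(**stats)

recap = game_recap({
    "winner": "Springfield", "loser": "Shelbyville",
    "winner_score": 88, "loser_score": 80,
    "top_scorer": "J. Smith", "points": 31,
})
```

Commercial systems add variation (many templates, selected by how the game unfolded) so that the output does not read as boilerplate, but the input remains structured data rather than reporting.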
A Model for the Online News Business
When skilled professionals produce unique content that is highly valued by an affluent audience, an online news business model will include paid access and high advertising rates. With a general audience and news that can also be obtained elsewhere, options are limited: free access and maximizing the audience is the likely course, as advertising rates will be low. Strategies of extreme cost reduction (staff, content, technology) can also be followed. Not all the strategies mentioned above, however, have to result in lower-quality journalism. Freelancers do not produce inferior quality compared to staff members, an Australian sub-editor can work just as well as one from the United Kingdom, a story aggregated from The Guardian is a good story everywhere, pictures from eyewitnesses can be newsworthy, bloggers can provide valuable insights, and good SEO may result in more revenue. Nevertheless, most strategies will result in having fewer professionals produce more—but less original—content, while “outsourced content tends to create lower value” (Picard, 2010a, p. 84).
The choices publishers have are summarized in Figure 1. Instead of the linear value chain model, a circular model is proposed. The nature of the content (unique, specialized, general, local) has a direct influence on the audience and the willingness to pay for this content, given the availability of substitutes. The nature of the audience has consequences for the revenues (subscription, advertising, public funding) and the business model. Policies on VAT rates, tax exemption for non-commercial operations and copyright protection also influence the model. The way of content production—the number and nature of the staff members and the technologies used—is based on the expected revenues. While the value chain is an open-ended model based on value creation, the model in Figure 1 is based on choices of audiences, expected revenues and options for content production.
Cases: Low-pay, No-pay and Aggregation
Below, three cases are presented, each making different choices in its business model. All of them, however, have steered away from the traditional model that most legacy media use, particularly when it comes to employing experienced—and therefore expensive—staff. Content farms produce non-competitive content at low cost, the Huffington Post produces competitive content with unpaid contributors, and aggregators use technology to “produce” non-competitive content.
The “content farm” employs freelancers, non-journalists, bloggers, part-time writers and amateurs who produce articles on subjects that are expected to end up high in searches and generate traffic to the websites served by these companies. Its strategy is a combination of low pay for writers and the use of SEO technology. Advertising is the main revenue source.
Demand Media employs more than 12,000 authors and publishes between 4000 and 6000 articles per day at a rate of $15 for a 500-word story—copy-editors receive $2.50 per article (Frank, 2011). Stories end up on Demand's eHow, Cracked, Trails or LiveStrong websites, but Demand also provides the websites of USA Today, The San Francisco Chronicle and the Houston Chronicle with news (Shaver, 2010). Authors are presented with ready-made headlines based on keywords that are likely to turn up in Web searches. Stories are checked for plagiarism by software but also by editors (Blanda, 2010). In 2011, after a change in Google's search algorithm resulted in lower rankings for Demand's content, the company announced that it would introduce quality content and began paying writers up to $350 for longer (850-word) articles.
Various content farms exist, although none particularly likes being called a content farm. Any service that offers cheaply produced free content from undisclosed authors, however, technically falls under that heading. Besides Demand Media there are services like Helium (“where knowledge rules”), iNeedaGreatStory, Suite101 (Canada), StudioNow and Seed (both AOL), Associated Content (Yahoo!), Ask.com and About.com. Some websites offer writers a share of the advertising revenues depending on the number of clicks a page gets; the US network of local websites Examiner.com employs this strategy. In April 2011, 240 local Examiner sites were available, publishing around 3000 articles a day, usually by amateurs and local bloggers. AOL's local network Patch—targeting 800 smaller communities—employs full-time editors but also freelancers who are paid up to $50 for stories. The media economist Robert Picard writes about content farms on his blog The Media Business:
These enterprises are providing high quantity, low quality material on topics designed to produce many search hits and driven by the desire to make money from advertising received as high traffic sites … These producers and a whole range of similar organizations are producing material in content farms that rely on freelancers who are paid as little as $1 an article or get no payment except for number of page views for their specific work. It is a throwback to the penny-a-word days of journalism in the 19th century. (Picard, 2010b)
Some services have professional journalists and bloggers write for them for no compensation at all. In terms of content, however, there is an emphasis on original opinionated content. Promotion through social media is used extensively.
The best-known example of such an operation is the Huffington Post, a popular website that employs around 9000 unpaid bloggers. Well-known authors, celebrities and politicians are invited to blog and use the website as a platform for their opinions. Bloggers also cross-post content to promote their own blogs. The Huffington Post was launched in 2005 and was one of the most-visited news websites in the United States in 2011. Criticism concerned not only content—particularly when medical or scientific information was published—but also the fact that bloggers were not compensated for their efforts. This criticism increased after the $315 million sale to AOL. Original content might be the focus of the website, but in 2011 a Huffington Post writer was suspended for “over-aggregation” of an item from Ad Age. Websites like The Daily Beast and the Guardian's Comment is Free use more or less the same model.
Websites that do little more than find news and republish it—either in full, as a digest, or as a heading with a link to the original source—are usually called aggregators. The best-known international example is Google News, but national, local and subject-specific variants exist as well.
A study of 20 Dutch local communities (Bakker et al., 2010) found an average of 26 news channels per community; two-thirds of these channels (345) were digital media, and of those, 136 (40 per cent) were local news aggregators while 75 were Twitter feeds that only contained links to articles published elsewhere. A typical example is hetnieuws.in—a service that offers news for all 400 Dutch communities and is funded by advertising. Every community has a page with news aggregated from other media websites (often other aggregators), presented as a heading with two sentences and a link to the source. Users can subscribe to a community-specific Twitter feed, “like” the service on Facebook, embed the community stream in a website or receive the news by mail.
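The hetnieuws.in-style presentation, a heading, two sentences and a link back to the source, is straightforward to mechanize, which is exactly why such aggregators are cheap to run. A minimal sketch (with naive sentence splitting and invented example content):

```python
# Reduce a source article to heading + first two sentences + link,
# the digest format described for hetnieuws.in-style aggregators.
import re

def make_digest(heading, body, url, n_sentences=2):
    """Build an aggregator entry from a source article."""
    # Naive split: a sentence ends at . ! or ? followed by whitespace.
    sentences = re.split(r"(?<=[.!?])\s+", body.strip())
    snippet = " ".join(sentences[:n_sentences])
    return {"heading": heading, "snippet": snippet, "source": url}

digest = make_digest(
    "Council approves new bridge",
    "The council voted 7-2 on Tuesday. Work starts in May. Costs are unclear.",
    "http://example.com/bridge",
)
```

In production such a service would also need de-duplication across sources and per-community routing, but no editorial labour is inherent in the format itself.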
In 2011 the hyperlocal network Dichtbij.nl (owned by De Telegraaf Media Group) also launched in all 400 Dutch communities; journalists are employed as “community managers” who select, analyse and “enrich” items from other sources and users, and write articles for the target group, often from a human-interest or commercial point of view. Users can contribute items themselves as well.
News organizations would probably rather employ specialized professionals who write highly valued content for a paying audience, or distribute current affairs news to a mass audience so advertisers can pick up the bill, than produce low-quality content in digital sweatshops or rely on news gathered by machines crawling the World Wide Web. But these options are not open to everybody, and the models presented in this article are those that cover the area between specialized high-quality news production and distributing news to mass audiences. When subscription is not an option and advertising offers only limited revenue, solutions have to be found elsewhere. These strategies represent business models based on low-pay, no-pay and “automated” journalism.
Instead of only denouncing these operations and their content for its assumed low quality and lack of originality, it should be emphasized that these services can also do the opposite: offer original, quality journalism, as the Huffington Post, niche publications and some local models demonstrate. Aggregators also produce societal value because they distribute news to larger audiences.
The models presented above are not necessarily the dominant models in newsgathering, but they reflect the dominant logic of online news: sustainable online revenue models are very difficult to build. Online advertising rates will remain low in competitive markets; Google will take the lion's share of the money spent online, while pay-walls and paid tablet apps will not generate an income that can sustain a full-size news department for most media. Subsidizing the online operation with offline money is an option used by many legacy media, but start-ups and entrepreneurs will opt for a low-cost model with a focus on aggregation, user-generated content and contributions from volunteers.
The main drawback of the models mentioned above is that some genres and some sorts of journalism will be underrepresented. Investigative journalism, background information, foreign reporting and systematic community coverage cannot be expected from such operations. It is hard to envisage how volunteers can cover communities, local governments or state departments on a regular basis without resources, legal protection and support from an organization.
From Newspapers to Laptops and The Viral Stream
Online Journalism and the Promises of New Technology
A critical review and look ahead by Steen Steensen.
Research about online journalism has been dominated by a discourse of technological innovation. The “success” of online journalism is often measured by the extent to which it utilizes technological assets like interactivity, multimedia and hypertext. This paper critically examines the technologically oriented research about online journalism in the second decade of its existence. The aim is twofold. First, to investigate to what degree online journalism, as it is portrayed in empirical research, utilizes new technology more than previously. Second, the paper points to the limitations of technologically oriented research and suggests alternative research approaches that might be more effective in explaining why online journalism develops as it does.
Whenever new technology emerges which is expected to play a major role in the evolution of media, researchers, scholars, business executives and practitioners alike all participate in a game of prophesying revolution. Mosco (2004) argues that the entry of such new technologies has always been surrounded by myths about their revolutionary powers. The telephone, radio, television and the computer have all been surrounded by mythical pronouncements on how they might prompt “the end of history, the end of geography and the end of politics” (2004, p. 13).
Needless to say, these technological inventions did change the world dramatically, but not in such a quick and radical fashion as the fortune-tellers seemed to believe.
Similar myths dominated the introduction of the Internet, and the research into online journalism was not left untouched. The 1990s saw several publications predicting, for instance, “the end of journalism” (Bromley, 1997; Hardt, 1996) due to the implementation of digital technology, while others, like Pavlik (2001), were profoundly optimistic about the future of journalism in new media. According to Boczkowski (2004) and Domingo (2006), these first online journalism analysts were driven by technological determinism. Domingo argues that research about online journalism in the first decade of its existence was partly paralyzed by what he labels “utopias of online journalism” (2006, p. 54).
These utopias were especially related to how hypertext, multimedia and interactivity would foster innovative approaches that would revolutionize journalism. Domingo labels these normative investigations the first wave of online journalism research. He then argues that the subsequent research about online journalism falls into two new waves: a descriptive and empirical wave of research focusing on the degree to which the wonders of the new technology described by the first-wave researchers actually materialized; and a wave of research that takes a constructivist rather than a technological determinist approach to researching online journalism.
However, this third wave of online journalism research is still a modest ripple compared to the “tsunami” of research taking a technological approach, which has continued to flood the literature on online journalism. Moreover, many of the studies labelled by Domingo as constructivist research are dominated by an initial desire to investigate the impacts of technology on online journalism. In this paper I critically assess the contributions of this “techno-approach” to research on online journalism. The aim is twofold: first, to review this body of empirical research in order to find out whether online journalism is more technologically innovative today than at the turn of the millennium; second, to point to the limitations of such a technological approach to the research about online journalism. What exactly can such an approach tell us about the reality of online journalism?
What other approaches might be considered?
I have chosen to limit the review to studies conducted in the United States and Europe that are published in either acclaimed peer-reviewed academic journals or presented at peer-reviewed academic conferences. Some reports and books are also included. These studies were found through searches in Google Scholar and to some extent limited to the access provided by my academic institution. From the references of this first body of studies, additional studies were found, which in turn became the source of a few additional studies. Studies published/presented prior to 2000 have not been included unless they provide some relevance as context for more recent studies.
The Assets of New Technology
The techno-approach to research on online journalism has been dominated by investigations of the three assets of new technology that are generally considered to have the greatest potential impact on online journalism: hypertext, interactivity and multimedia (Dahlgren, 1996; Deuze, 2003, 2004; Deuze and Paulussen, 2002; Domingo, 2006; Paulussen, 2004; among others). The general assumption of the “techno-researchers” has been that an innovative approach to online journalism implies utilizing these assets of new technology. Exploring the innovative possibilities of these assets hence became the goal of the techno-approach to research on online journalism.
Several researchers have made attempts to widen the list of technological assets for online journalism. Dahlgren (1996) added archival and figurational, Harper (1998), Lasica (2002) and others spoke of personalization in some way or other, inspired by the (in the second half of the 1990s) much hyped concept of “the Daily Me”, introduced by Negroponte (1995). Pavlik (2001) added contextualization and ubiquity, and in recent years much attention has been given to the asset of immediacy (see for instance Domingo, 2006).
Zamith (2008) extended the list to a compilation of seven assets: interactivity, hypertextuality, multimediality, immediacy, ubiquity, memory and personalization. In addition, the literature on technology and online journalism is flooded by a sea of different concepts that describe similar or even the same phenomenon or asset—concepts like convergence, transparency, hypermedia, user-generated content, participatory journalism, citizen journalism, wiki-journalism and crowdsourcing.
However, most of these (additional) assets can be treated as concretizations of interactivity, hypertext and multimedia, depending of course on how these three concepts are defined. I will base the following review on rather broad definitions of these three concepts, offered in the introduction of each of the following sections. In Table 1 I lay out the different concepts that flood the literature to make visible how I understand their relation to hypertext, interactivity and multimedia.
TABLE 1 Assets of new technology in online journalism in relation to hypertext, interactivity and multimedia
It must, however, be noted that techno-approach research lacks commonly accepted definitions of hypertext, interactivity and multimedia. This creates some confusion as to what these characteristics represent and how they differ from one another. What some label “interactivity”, others label “hypertext”. In fact, both hypertext and multimedia can be characterized (and are often characterized) as “interactivity”.
Hypertext is generally understood as a computer-based non-linear group of texts (i.e. written text, images etc.) that are linked together with hyperlinks. The term was first coined by Nelson (1965), who described it rather roughly as “a series of text chunks connected by links which offer the reader different pathways” (cited in Tsay, 2009, p. 451). Most scholars researching hypertext in online journalism rely on what Aarseth labels a “computer industrial rhetoric” (1997, p. 59), i.e. an understanding of hypertext as a technological function (made visible by the electronic link) rather than an observable practice of interaction between text and reader.
Researchers interested in hypertext as a text–reader practice are more likely to label the object of study a practice of interactivity rather than a practice of hypertext.
The general assumption of researchers interested in hypertextual online journalism is that if hypertext is used innovatively it would provide a range of advantages over print journalism: no limitations of space, the possibility to offer a variety of perspectives, no finite deadline, direct access to sources, personalized paths of news perception and reading, contextualization of breaking news, and simultaneous targeting of different groups of readers—those only interested in the headlines and those interested in the deeper layers of information and sources (Dahlgren, 1996; Deuze, 1999; Engebretsen, 2000, 2001; Fredin, 1997; Gunter, 2003; Huesca, 2000; Jankowski and van Selm, 2000; Kawamoto, 2003; among others).
Empirical research on the presence and relevance of hypertext in online journalism tends to rely on the methodology of quantitative content analysis, counting the number of links present on online news sites. The findings are generally (but with many variations) categorized according to the three different types of hypertext identified by Shipley and Fish (1996): target links (links within documents), relative links (links to other pages within a site), and external links (links from one site to another site) (cited in Wood and Smith, 2005).
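The coding step in such a content analysis is essentially a classification rule over each link's address. A sketch of how Shipley and Fish's three types could be operationalized, assuming fragment-only links count as target links and same-domain links as relative links (the example domain is invented):

```python
# Classify a hyperlink as target / relative / external, following
# Shipley and Fish's (1996) three types of hypertext.
from urllib.parse import urlparse

def classify_link(href, site_domain):
    """Assign a link to one of the three hyperlink types."""
    if href.startswith("#"):
        return "target"        # link within the same document
    host = urlparse(href).netloc
    if host == "" or host.endswith(site_domain):
        return "relative"      # another page on the same site
    return "external"          # a link to a different site

kinds = [classify_link(h, "news.example.com")
         for h in ["#section2", "/politics/story.html",
                   "http://other-site.org/report"]]
```

Real coding schemes vary, which is one reason the studies reviewed below are hard to compare: whether, say, a subdomain or an in-text link to an archived story counts as "relative" or "target" is a definitional choice.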
Most of the content analysis studies of hyperlinks in online journalism are snapshots of a situation at specific moments in time. A few of them are larger, cross-country studies, like Kenny et al. (2000), who investigated 100 online newspapers (62 from the United States and 38 from “other countries”) at the end of the millennium and found that 33 percent of them offered links within news stories (target links) and only 52 percent offered some kind of hyperlink at all. Jankowski and van Selm (2000) investigated 13 online news sites in the United States, The Netherlands and Canada and found similar results. A few years later, van der Wurff and Lauf presented studies of 72 European online newspapers and found that hyperlinks were the least developed “internet feature” (2005, p. 37).
In their research on the front pages of 26 leading online newspapers in 17 countries worldwide in 2003, Dimitrova and Neznanski (2006) found that the use of hyperlinks had become “an established feature of online news”, but that the majority of the links were relative links (within-site links, mostly to archived material). Compared to these studies, Quandt (2008) found in a study of 10 online news sites in the United States, France, Germany, the United Kingdom and Russia that hyperlinks were used to a somewhat greater extent: 73 percent of the 1600 full-text articles he analyzed had relative links, 14.3 percent had target links and 24.7 percent had external links.
Other, more nation-specific studies conclude that hyperlinks/hypertext is not utilized to its potential in online journalism, especially concerning the use of target links and external links (in Scandinavia: Engebretsen, 2006; in Slovenia: Oblak, 2005; in Ireland: O'Sullivan, 2005; in Flanders: Paulussen, 2004; in the United States: Pitts, 2003; in Spain: Salaverria, 2005).
A common explanation in these studies for the perceived lack of hypertext in the online news sites investigated is that a majority of the stories published online are little more than shovelware, i.e. stories that were originally published in print. Only a few studies offer more theoretically informed explanations of the findings, and even fewer offer a longitudinal approach. One study that does both is Tremayne's (2006) analysis of the front pages of 10 online newspapers in the United States over a period of six years (1999–2004). He found that the number of external links decreased during these years, while relative links increased. He explained this with network theory:
[a]s each organization builds up its own archive of Web content, this material is being favoured over content that is off-site. This is just one example of preferred attachment, which is the driving principle of network theory. (Tremayne, 2006, p. 60).
Preferred attachment may be the result of a protectionist strategy aiming at keeping readers on-site, even though it is not portrayed as such in network theory. Such a strategy conflicts with the utilization of hypertext technology.
Surveys and Experimental Studies
While content analysis has been the preferred method to investigate hyperlinks/hypertext in online journalism, other methods have also been utilized. Quinn and Trench (2002) conducted a survey amongst 138 “media professionals” engaged in online news production in Denmark, France, Ireland and the United Kingdom. The respondents agreed that providing hyperlinks could make stories more valuable to the readers. However, they were sceptical as to whether the readers “should be left to make their own judgment about the relevance of links, rather than … having the news services provide guidance to users” (Quinn and Trench, 2002, p. 35).
O'Sullivan (2005) interviewed Irish online journalists and found that few of them considered hyperlinks an important feature of online journalism. On the contrary, they expressed concerns as to whether (external) hyperlinks would lead readers away from their site. Krumsvik found in his case studies of online news production at CNN and the Norwegian public broadcaster NRK that hypertext was utilized only to a small extent—external links were “ignored” (2009, p. 145).
In an experimental study of how readers in the United States evaluate in-text (target) links in news stories, Eveland et al. (2004) found that only experienced Web users found such hypertext-structured news stories valuable. For inexperienced users, the hypertext structure was a disadvantage. Sundar (2009) found similar results in his experimental study. However, users seem to be satisfied with relative links, according to a survey amongst readers of Flemish online newspapers (Beyers, 2005).
Based on these studies, it seems that relative hyperlinks, i.e. hyperlinks to other stories within the online news site, are the most common form of hypertext structure found in online journalism, while target links (links within stories) and external links are used to a lesser degree. A protectionist attitude might prevent utilization of external links, while utilization of target links may be obstructed by a high degree of shovelware material and uncertainty as to whether users actually benefit from such links.
Like hypertext, interactivity is a slippery concept that is used to describe numerous processes related to communication in general and practices like online journalism in particular. Based on a review of the “history” of interactivity, Jensen arrives at this definition: interactivity is “a measure of a media's potential ability to let the user exert an influence on the content and/or form of the mediated communication” (1998, p. 201). Jensen separates interaction from interactivity and his definition is therefore mainly a technological one. Interaction refers to the social dimension of interactivity, and McMillan argues for an incorporation of this dimension as well. Accordingly, she has identified nine different understandings of interactivity along two different axes (McMillan, 2002, 2005) (see Figure 1).
All these kinds of interactivity may be found in an online newspaper. However, the Human-to-Computer axis (or “navigational interactivity”, as Deuze (2001) labels it) is similar to what, in the previous section, I categorized as hypertext. The research covering this axis was therefore included there. Of the six notions of interactivity that remain, only the two human-to-human variants (both features and processes) seem to have occupied researchers of interactivity in online journalism to a great extent. This research is dominated by questions such as the degree to which users are allowed to interact with online newsrooms/online journalists through email; the extent to which online news sites offer discussion forums; and whether users are allowed to comment on stories or in other ways be involved in the production process.
As with hypertext, the research on interactivity in online journalism is dominated by content analysis, even though a greater body of this research also relies on surveys and interviews with journalists. Kenny et al. (2000) concluded that only 10 percent of the online newspapers in their study offered “many opportunities for interpersonal communication” and noted that little had changed since the introduction of Videotex1 25 years earlier: “Videotex wanted to push news electronically into people's homes, and so do today's online papers”.
Similar findings and conclusions are found in Pitts’ (2003), Jankowski and van Selm's (2000) and Dimitrova and Neznanski's (2006) studies of news sites in the United States; in van der Wurff and Lauf's (2005) investigations of European online newspapers; in Quandt's (2008) analysis of news sites in the United States, France, the United Kingdom, Germany and Russia; in Paulussen's (2004) investigation of Flemish online newspapers; Oblak's (2005) study of Slovenian online news sites; O'Sullivan's (2005) research on Irish online newspapers; Fortunati et al.'s (2005) study of online newspapers in Bulgaria, Estonia, Ireland and Italy; and Spyridou and Veglis’ (2008) study of Greek online newspapers.
Comparisons between these studies are, however, difficult to make, reflecting differences in both methodological approaches and theoretical understandings of what constitutes interactivity. However, it might seem that European online newspapers tend to offer slightly less interactivity than online newspapers in the United States.
In a longitudinal study of 83 online news sites in the United States, Greer and Mensing (2006) found a slight increase in interactive features from 1997 to 2003. The possibility of customizing news, however, decreased during the same period. Li and Ye (2006) found that 39.2 percent of 120 online newspapers in the United States provided discussion forums—twice as many as in Kenny et al.'s study six years earlier. Hermida and Thurman found “substantial growth” (2008, p. 346) in user-generated content in 12 British online newspapers from 2005 to 2006 (concerning features like comments on stories and “have your say”). In an analysis of the level of participatory journalism in 16 online newspapers in the United States, the United Kingdom, Spain, France, Germany, Belgium, Finland, Slovenia and Croatia, Domingo et al. concluded that interactive options promoting user participation “had not been widely adopted” (2008, p. 334).
However, their findings suggest a distinct increase in most such interactive options compared to earlier studies, especially regarding the possibility for users to comment on stories, which 11 of the 16 online newspapers allowed. The process of selecting and filtering news, however, remains the most closed area of journalistic practice, allowing the authors to conclude that: “[t]he core journalistic role of the ‘gatekeeper’ who decides what makes news remained the monopoly of professionals even in the online newspapers that had taken openness to other stages beyond interpretation” (Domingo et al., 2008, p. 335).
Some content analysis studies offer insights into how interactive features such as discussion forums are used. Fortunati et al. concluded that users “prefer to remain anonymous and silent” (2005, p. 426). Li and Ye (2006) found similar results, and Thurman (2008) found that the BBC News website's comments system “Have Your Say” attracted contributions from not more than 0.05 percent of the site's daily users.
Some studies focus on interactivity in so-called j-blogs, e.g. weblogs written by journalists and published on their online newspapers’ site. Singer (2005) found, in her research focused on 20 j-blogs in the United States, that the journalists “are … sticking to their traditional gatekeeper function even with a format that is explicitly about participatory communication” (2005, p. 192). However, two other studies of j-blogs offer alternative findings. Wall (2005) investigated US j-blogs on the Iraq war in 2003 and found that these j-blogs emphasized audience participation to a much greater extent than the online newspapers in general. Robinson (2006) investigated 130 US j-blogs and found similar results.
Surveys and Interviews
Studies relying on surveys and interviews with journalists yield findings similar to those of the content analysis studies. Riley's qualitative interviews with journalists at a metropolitan US newspaper in the late 1990s offer some interesting insights into the attitude towards interactivity at the time. According to Riley et al. (1998), most reporters were “horrified at the idea that readers would send them e-mail about a story they wrote and might even expect an answer”.
Heinonen (1999) found similar attitudes in his interviews with Finnish journalists during the same period. However, this attitude seems to have changed. Schultz (2000) found a slightly more positive attitude towards interactivity among journalists at The New York Times, as did Quinn and Trench (2002) in their interviews with journalists in 24 online news organizations in Denmark, France, Ireland and the United Kingdom. More recent studies suggest an even broader acceptance of interactivity among online journalists. In a survey of journalists in 11 European countries, O'Sullivan and Heinonen (2008) found that 60 percent of the respondents agreed that linking with the audience is an important benefit of online journalism. O'Sullivan's (2005) study in Ireland, Paulussen's (2004) in Flanders, and Quandt et al.'s (2006) study in Germany and the United States all found similar results.
In a broad-scale study relying on 89 in-depth interviews with editors and journalists in newspapers and broadcasting stations in 11 European countries, Metykova (2008) found that the relationship between journalists and their audience had indeed become more interactive, especially regarding email and text message interaction. However, this increase in interactivity “tended to be seen as empowering journalists to do their jobs better rather than blurring the distinction between content producers and content consumers” (Metykova, 2008, p. 56).
In interviews with website producers nominated for the Online Journalism Award in the United States, Chung (2007) found, as did O'Sullivan (2005), that online journalists, web producers and editors find it difficult to implement interactive features, even though they express a willingness to do so. O'Sullivan (2005) offers an interesting perspective: the use of freelancers may obstruct interactive features, because freelancers cannot be expected to interact with readers to the same degree as in-house editorial staff. Freelancers are generally not paid to participate in discussions with readers or to initiate other kinds of interactivity.
Surveys of online newspaper users in Europe found that users lacked interest in participating in discussion forums and similar features (in Sweden: Bergström, 2008; in Flanders: Beyers, 2004, 2005; in Finland: Hujanen and Pietikäinen, 2004; in Germany: Rathmann, 2002). According to these survey studies, the most important facility of online newspapers seems to be that they are continuously updated. Already in the mid-1990s, Singer (1997) found, in interviews with 27 journalists in the United States, that those journalists who were positive towards the Internet and new technology emphasized the importance of immediacy in online journalism. Quandt et al. (2006) found that online journalists in Germany and the United States valued immediacy as the most important feature of online journalism. O'Sullivan found that immediacy was the “big thing” and that frequent updates were “the great strength of online media” (2005, p. 62).
To summarize the research on interactivity in online journalism, it seems clear that online news sites are becoming more and more interactive, first and foremost regarding human-to-human interactivity. Users are allowed to contribute to content production by submitting photos and videos, by commenting on stories and by participating in discussion forums. However, users are seldom allowed to participate in the selection and filtering of news. The traditional norm of gatekeeping is thus still very much in place in the practice of online journalism. As Fortunati et al. concluded: “the power relation between media organisations and readers is not in play” (2005, p. 428).
Furthermore, the research reveals that online journalists and editors are becoming more eager to interact with readers, but organizational constraints like time pressure and the utilization of freelancers prevent them to a certain degree from doing so. Last, but not least, user studies suggest an overwhelming indifference to interactivity—it seems that people prefer to be passive consumers, not active producers.
However, the picture might be slightly different when online newspapers report on major breaking news events, like natural disasters and other types of crisis events. Several studies in recent years that focus on citizen journalism, such as Allan and Thorsen's compilation of case studies from around the world (2009), have demonstrated a boost in user participation and interactivity in the coverage of such events. In other words, it may seem that when crises strike, gatekeeping is to a certain degree abandoned.
Deuze (2004, p. 140) argues that the concept of multimedia in online journalism studies is generally understood in either of two ways: (1) as the presentation of a news story package in which two or more media formats are utilized (e.g. text, audio, video, graphics, etc.), or (2) as the distribution of a news story packaged through different media (e.g. newspaper, website, radio, television, etc.).
Most research on multimedia in online journalism deals with the first understanding. When I use the term multimedia in the following, I will refer to this understanding, albeit in a slightly more pragmatic sense that better fits the empirical research on multimedia in online journalism. Since an online news story with text and a photo is generally not considered to be multimedia, I will use the term to refer to stories and websites where more than two media are utilized. I will also let the term include not only the presentation of news, but also the production of news.
As with hypertext and interactivity, most studies of multimedia in online journalism rely on content analysis of websites. Schultz (1999) found that only 16 percent of online newspapers in the United States had multimedia applications in the late 1990s. Two more qualitatively oriented content analysis studies revealed a similar lack of multimedia (in the United States, Canada and the Netherlands: Jankowski and van Selm, 2000; in the United States: Dibean and Garrison, 2001). Jankowski and van Selm concluded that of all the supposed added-value facilities of online journalism, multimedia “is perhaps the most underdeveloped” (2000, p. 7).
However, online news sites affiliated with TV stations were more prone to utilize multimedia, according to the same study. Yet, in a more extensive investigation of TV broadcasters’ online news sites in the United States, Pitts lamented: “[t]he majority of stations provide text-only stories, thus failing to use the multimedia capabilities of the web” (2003, p. 5). In their extensive investigation of European online journalism, van der Wurff and Lauf (2005) found that print newspapers offered as much multimedia as online newspapers. Quandt (2008) found that 84.5 percent of the 1600 stories he analyzed in 10 online news sites in the United States, the United Kingdom, Germany, France and Russia were strictly text-based. In Scandinavia, Engebretsen (2006) found that online newspapers used a bit more multimedia, but still not more than found in previous studies in the United States.
Dimitrova and Neznanski's (2006) study of the coverage of the Iraq war in 2003 in 17 online newspapers from the United States and elsewhere showed no increase in the use of video and audio in US newspapers compared to Schultz's study published seven years earlier. Furthermore, they found minimal difference between the international and the US online newspapers (slightly more use of multimedia in the US online newspapers). However, Greer and Mensing (2006) found a significant increase in multimedia use during the same period (1997–2003) in their longitudinal study of online newspapers in the United States.
Interviews and Surveys
Studies relying on interviews and surveys with online journalists and editors reveal some of the possible reasons for the lack of multimedia in online journalism found in the content analysis studies. According to Jackson and Paul (1998) in the United States and Neuberger et al. (1998) in Germany, online journalists and editors had a positive attitude towards utilizing multimedia technology, but problems related to lack of staff, inadequate transmission capacity and other technical issues obstructed the materialization of multimedia content. Later studies indicate that online journalists and editors downplay the value of multimedia content: Quandt et al. (2006) found that multimedia was considered to be the least important feature of Web technology for online journalism.
O'Sullivan (2005) found similar results in his qualitative interviews with Irish online journalists. Thurman and Lupton interviewed 10 senior editors and managers affiliated with British online news providers and found that the general sentiment was that “text was still core” (2008, p. 15). However, Krumsvik, in interviews with CNN and NRK (Norwegian public broadcaster) executives, found a much more positive attitude towards multimedia than towards interactivity and hypertext (2009, p. 145). And a recent case study of multimedia content on the BBC online concludes that video content has increased tremendously (Thorsen, 2010).
There are few studies investigating users' attitudes towards multimedia news online. In an experimental study, Sundar (2000) found that those who read text-only versions of a story gained more insight into its topic than those who read/viewed multimedia versions of the same story. Beyers (2005) found that only 26.4 percent of the Flemish online newspaper readers in his survey considered the added value of multimedia an important reason to read online newspapers.
To summarize the findings of the research on multimedia in online journalism deriving from the techno-approach, it seems that multimedia remains the least developed of the assets offered to journalism by Internet technology. Online journalism is mostly about producing, distributing and consuming written text in various forms, even though some studies describe an increase in the use of multimedia, especially in broadcasting stations’ online news sites in recent years. However, it seems that practitioners are struggling to cope with multimedia, and the users seem to be quite indifferent.
The review leaves the impression that online journalism has been left behind by the technological developments in new media. Linear text is preferred over hypertext and multimedia (hypermedia). Traditional norms of gatekeeping are preferred over participatory journalism and alternative flows of information, although interactivity seems to play a larger role in how major breaking news events, like crisis events, are researched and covered. Journalists and editors seem, at least to some extent, eager to embrace change brought about by new technology, while users do not seem to care. All in all, it seems that technology may not be the main driving force of developments in online journalism. The question is therefore: how can research focused on online journalism better grasp why online journalism develops as it does?
Some researchers suggest that ethnography and a closer look at the practices and routines of online news production is the answer. Brannon (1999), Boczkowski (2004), Domingo (2006), Küng (2008), Steensen (2009a) and the case studies presented in Paterson and Domingo (2008) all utilize the methodology of ethnography, even though their approaches to a large extent are still dominated by the technological discourse. Some other studies also utilize ethnographic methodology, but from a broader, albeit still technology-oriented, approach that aims at finding out how the convergence of newsrooms affects the production of journalism (Dupagne and Garrison, 2006; Erdal, 2009; Klinenberg, 2005; Lawson-Border, 2006).
These studies provide valuable insights into the complexity of online journalism production and put forward findings that shed light on why technology is not utilized to the degree that has previously been postulated. Steensen (2009a), building on Boczkowski (2004), found, for instance, that newsroom autonomy, newsroom work culture, the role of management, the relevance of new technology and innovative individuals are vital factors in how innovative online newsrooms are; and Domingo (2006) found that striving for immediacy hindered the use of other assets of new technology in the newsrooms he researched.
Notwithstanding the significant contributions of these studies, the research on online journalism still has many shortcomings. I will conclude this paper with six suggestions for further research.
First, studies of online journalism could benefit from a broader contextualization. Mitchelstein and Boczkowski (2009) argue that the research on online journalism lacks historical dimensions. Relating online journalism to developments in journalism prior to the Internet boom could therefore be a suggestion. Viewing online journalism in relation to media theory and how media and media products transform over time could be another.
Mitchelstein and Boczkowski also identify a need for more cross-national studies, and for online journalism researchers to look beyond the newsroom and the news industry and take into account structural factors such as the labor market and comparable processes in other industries in order to better understand “who gets to produce online news, how that production takes place, and what stories result from these dynamics” (2009, p. 576). It should, however, be noted that the works of Deuze (2007) and Marjoribanks (2000, 2003) and their joint editing of a special issue of the journal Journalism to some extent address these shortcomings.
Second, the research on online journalism is flooded with a range of theoretical concepts that are either interchangeable or are interpreted differently by different researchers. Concepts like interactivity, hypertext and multimedia are understood in different ways, and other concepts, like genre and innovation, are generally used without any theoretical discussion of what they represent and how they might inform the research on online journalism.
A stronger emphasis on conceptualization is therefore needed.
Third, most of the research in the field of online journalism is limited to a focus on the presentation and to some degree the production and reception of hard/breaking news and the rhetoric of online news sites’ front pages.
The development of other genres therefore seems to have been downplayed in the research, even though some studies have been conducted on online feature journalism (Boczkowski, 2009; Steensen, 2009b, 2009c). Furthermore, sections and stories that are reached by means other than links from the front page (e.g. traffic to stories and sections generated by search engines) seem to be under-represented in the research. A stronger emphasis on the diversification of online journalism is therefore needed.
Fourth, research on online journalism could benefit from a greater recognition of and reflection on the text as a research unit. Although most research on online journalism deals with text in one way or another, there is a striking neglect of theoretical and methodological reflections on what texts are, how they facilitate communication, how they relate to media, and how they connect media with society. Genre theory and discourse analysis could for instance be valuable tools to establish research approaches that aim at investigating online journalism as communication. Lüders et al. (2010), for instance, show how the concept of genre provides vital insights into the emergence of new media like the personal weblog.
Fifth, although some of the research mentioned above makes longitudinal claims, the empirical material is seldom of longitudinal character. This seems to be a flaw considering the swift development of online journalism and the lack of common theoretical and methodological approaches, which makes comparisons between findings difficult.
Sixth, research about online journalism suffers from a methodological deficiency. The research is dominated by content analysis, surveys and interviews.
Qualitative approaches are rarely utilized, even though ethnographic news production studies seem to be gaining popularity. However, given the limited number of cases that can be investigated with such a methodology, more ethnographic research is needed. Furthermore, content analysis should to a greater extent be combined with qualitative textual analysis of online journalism texts, all in order to uncover the complexity of online journalism.
Twittering the News
The emergence of ambient journalism, by Alfred Hermida
This paper examines new para-journalism forms such as micro-blogging as “awareness systems” that provide journalists with more complex ways of understanding and reporting on the subtleties of public communication. Traditional journalism defines fact as information and quotes from official sources, which have been identified as forming the vast majority of news and information content. This model of news is in flux, however, as new social media technologies such as Twitter facilitate the instant, online dissemination of short fragments of information from a variety of official and unofficial sources.
This paper draws from computer science literature to suggest that these broad, asynchronous, lightweight and always-on systems are enabling citizens to maintain a mental model of news and events around them, giving rise to awareness systems that the paper describes as ambient journalism. The emergence of ambient journalism brought about by the use of these new digital delivery systems and evolving communications protocols raises significant research questions for journalism scholars and professionals.
This research offers an initial exploration of the impact of awareness systems on journalism norms and practices. It suggests that one of the future directions for journalism may be to develop approaches and systems that help the public negotiate and regulate the flow of awareness information, facilitating the collection and transmission of news.
Keywords: awareness systems, Internet, journalism, micro-blogging, social media, Twitter
Twitter is one of a range of new social media technologies that allow for the online and instant dissemination of short fragments of data from a variety of official and unofficial sources. The micro-blogging service emerged as a platform to help organize and disseminate information during major events like the 2008 California wildfires, the 2008 US presidential elections, the Mumbai attacks and the Iranian election protests of 2009 (Lenhard and Fox, 2009).
Twitter's emergence as a significant form of communication was reflected in the request by the US State Department asking Twitter to delay routine maintenance during the Iranian poll as the service was an important tool used by Iranians to coordinate protests (Shiels, 2009). Media restrictions led websites of The New York Times, the Guardian and others to publish a mix of unverified accounts from social media as “amateur videos and eyewitness accounts became the de facto source for information” (Stelter, 2009).
The micro-blogging service illustrates what Hayek described years before the invention of the Internet as “the knowledge of particular circumstances of time and place” (1945, p. 519). He proposed that ignorance could be conquered, “not by the acquisition of more knowledge, but by the utilisation of knowledge which is and remains widely dispersed among individuals” (Hayek, 1979, p. 15). At that time, he could not have predicted the development of a system that has created new modes of organizing knowledge that rely on large, loosely organized groups of people working together electronically.
A variety of terms have been used to describe this: crowd-sourcing, wisdom of crowds, peer production, wikinomics (Benkler, 2006; Howe, 2008; Surowiecki, 2004; Tapscott and Williams, 2006). Malone et al. (2009) suggest that the phrase “collective intelligence” is the most useful to describe this phenomenon, which they broadly define as groups of individuals doing things collectively that seem intelligent. I suggest that micro-blogging systems that enable millions of people to communicate instantly, share and discuss events are an expression of collective intelligence.
This paper examines micro-blogging as a new media technology that enables citizens to “obtain immediate access to information held by all or at least most, and in which each person can instantly add to that knowledge” (Sunstein, 2006, p. 219). It argues that new para-journalism forms such as micro-blogging are “awareness systems”, providing journalists with more complex ways of understanding and reporting on the subtleties of public communication. Traditional journalism defines fact as information and quotes from official sources, which in turn has been identified as forming the vast majority of news and information content.
This news model is in a period of transition, however, as social media technologies like Twitter facilitate the immediate dissemination of digital fragments of news and information from official and unofficial sources over a variety of systems and devices. This paper draws from literature on new communications technologies in computer science to suggest that these broad, asynchronous, lightweight and always-on communication systems are creating new kinds of interactions around the news, and are enabling citizens to maintain a mental model of news and events around them, giving rise to what this paper describes as ambient journalism.1
Definition of Micro-blogging
Micro-blogging has been defined as a new media technology that enables and extends our ability to communicate, sharing some similarities with broadcast. It allows “users to share brief blasts of information (usually in less than 200 characters) to friends and followers from multiple sources including websites, third-party applications, or mobile devices” (DeVoe, 2009). Several services including Twitter, Jaiku and Tumblr provide tools that enable this form of communication, although status updates embedded within websites such as Facebook, MySpace, and LinkedIn offer similar functionality.
One of the most popular micro-blogging platforms is Twitter. Between April 2008 and April 2009, the number of Twitter accounts rose from 1.6 million to 32.1 million (Vascellaro, 2009). This growth was partially fuelled by increased media attention to Twitter as celebrities such as Oprah Winfrey adopted the service (Cheng et al., 2009).
Despite the rapid uptake, Twitter is still only used by a select number of people. In the United States, 11 percent of American adults use Twitter or similar tools (Lenhard and Fox, 2009) and research suggests that 10 percent of prolific Twitter users account for more than 90 percent of messages (Heil and Piskorski, 2009). However, Twitter users tend to be the people who are interested in and engaged with the news. Studies show that the largest single group of tweeters, making up 42 percent, are between the ages of 35 and 49, and that the average Twitter user is two to three times more likely to visit a news website than the average person (Farhi, 2009).
Twitter is a flexible system that routes messages sent from a variety of devices to people who have chosen to receive them in the medium they prefer. It asks users the question: “What are you doing?” Messages are limited to 140 characters because the system was designed to be compatible with SMS messages, but there is no limit on the number of updates a user can send.
The “tweets” can be shared publicly or within a social network of followers. Users have extended their use of Twitter to more than just answering the initial question. The service has been described as an example of end-user innovation (Johnson, 2009) as users have embraced the technology and its affordances to develop conventions such as the use of hashtags and the @ reply.
Twitter and Journalism
Twitter has been rapidly adopted in newsrooms as an essential mechanism to distribute breaking news quickly and concisely, or as a tool to solicit story ideas, sources and facts (Farhi, 2009; Posetti, 2009). UK national newspapers had 121 official Twitter accounts by July 2009, with more than one million followers (Coles, 2009). In a sign of how far Twitter has come, the UK-based Sky News appointed a Twitter correspondent in March 2009 who would be “scouring Twitter for stories and feeding back, giving Sky News a presence in the Twittersphere” (Butcher, 2009).
The relative newness of micro-blogging means there is limited academic literature on the impact on journalism. Studies such as the one by Java et al. (2007) have looked at the motivation of users, concluding that micro-blogging fulfils a need for a fast mode of communication that “lowers users’ requirement of time and thought investment for content generation” (Java et al., 2007, p. 2).
In their analysis of user intentions, they found that people use Twitter for four reasons: daily chatter, conversation, sharing information and reporting news. At least two of these—sharing information and reporting news—can be considered as relevant to journalism, though arguably so could daily chatter and conversation around current events. Two of the three main categories of users on Twitter defined by Java et al.—information source and information seeker—are also directly relevant to journalism.
When Twitter is discussed in the mainstream media, it is framed within the context of established journalism norms and values.
There has been a degree of bewilderment, scepticism and even derision from seasoned journalists. New York Times columnist Maureen Dowd (2009) described it as “a toy for bored celebrities and high-school girls”. There has also been discussion on whether the breadth and depth of news reporting would suffer as more reporters sign up to Twitter (Wasserman, 2009).
Of particular concern has been how journalists should adopt social media within existing ethical norms and values (Posetti, 2009), leading news organisations such as the New York Times (Koblin, 2009), Wall Street Journal (Strupp, 2009), and Bloomberg (Carlson, 2009) to institute Twitter policies to bring its use in line with established practices.
Micro-blogging has been considered in the context of citizen journalism, where individuals perform some of the institutionalized communication functions of the professional journalist, often providing the first accounts, images or video of a news event (Ingram, 2008).
The value of user-generated content is assessed by professional norms and values that are presumed to guarantee the quality of the information (Hermida and Thurman, 2009). The issue commonly discussed in media commentaries on Twitter and journalism is the veracity and validity of messages. Concerns by journalists that many of the messages on Twitter amount to unsubstantiated rumours and wild inaccuracies are raised when there is a major breaking news event, from the Mumbai bombings to the Iranian protests to Michael Jackson's death (Arrington, 2008; Sutter, 2009).
The unverified nature of the information on Twitter has led journalists to comment that “it's like searching for medical advice in an online world of quacks and cures” (Goodman, 2009) and “Twitter? I won't touch it. It's all garbage” (Stelter, 2009).
The professional and cultural attitudes surrounding Twitter have their roots in the working routines and entrenched traditional values of a journalistic culture which defines the role of the journalist as providing a critical account of daily events, gathered, selected, edited and disseminated by a professional organization (Schudson, 2003; Tuchman, 2002). It reflects the unease in adopting a platform which appears to be at odds with journalism as a “professional discipline for verifying information” (Project for Excellence in Journalism, nd).
However, there are indications that journalism norms are bending as professional practices adapt to social media tools such as micro-blogging. During the Iranian election protests of June 2009, news organisations published “minute-by-minute blogs with a mix of unverified videos, anonymous Twitter messages and traditional accounts from Tehran” (Stelter, 2009).
Six months earlier, the BBC included unverified tweets filtered by journalists alongside material from correspondents in its breaking news coverage of the Mumbai bombings (BBC, 2008). The BBC justified its decision on the grounds that there was a case “for simply monitoring, selecting and passing on the information we are getting as quickly as we can, on the basis that many people will want to know what we know and what we are still finding out” (Herrmann, 2009). This approach means journalists adopt an interpretive standpoint concerning the utility of a tweet around a news event or topic, making a choice as to what to exclude or include.
By filtering and selecting what tweets to publish, the gatekeeper role is maintained and enforced. Journalists apply normative news values to determine if a specific tweet is newsworthy, dismissing content that might be considered as “snark and trivia” (Farhi, 2009).
Social media technologies like Twitter are part of a range of Internet technologies enabling the disintermediation of news and undermining the gatekeeping function of journalists. Micro-blogging can be seen as a form of participatory or citizen journalism, where citizens report without recourse to institutional journalism. It forms part of a trend in journalism that Deuze has described as a shift from “individualistic, ‘top-down’ mono-media journalism to team-based, ‘participatory’ multimedia journalism” (Deuze, 2005).
However, while a micro-blogging service such as Twitter can be situated within the trend of citizen journalism, it should also be considered a system of communication with its own media logic, shapes and structures. While Twitter can be used to crowdsource the news, where a large group of users come together to report on a news event (Niles, 2007), this paper argues that the institutionally structured features of micro-blogging are creating new forms of journalism, representing one of the ways in which the Internet is influencing journalism practices and, furthermore, changing how journalism itself is defined.
Micro-blogging presents a multi-faceted and fragmented news experience, marking a shift away from the classical paradigm of journalism as a framework to provide reports and analyses of events through narratives, producing an accurate and objective rendering of reality (Dahlgren, 1996). Services like Twitter are a challenge to a news culture based on individual expert systems and group think over team work and knowledge-sharing (Singer, 2004). As Malone et al. (2009, p. 2) suggest, “to unlock the potential of collective intelligence, managers instead need a deeper understanding of how these systems work”.
This paper seeks to contribute an understanding of Twitter by introducing the concept of ambient journalism. I see new media forms of micro-blogging as “awareness systems”, providing journalists with more complex ways of understanding and reporting on the subtleties of public communication. Established journalism is based on a content-oriented communication, whereas Twitter adds an additional layer that can be considered as what has been referred to as connectedness-oriented communication (Kuwabara et al., 2002).
In an awareness system, value is defined less by each individual fragment of information, which may be insignificant on its own or of limited validity, than by the combined effect of the communication.
Micro-blogging as Ambient Journalism
Drawing on the literature in the field of human–computer interaction, this paper suggests that broad, asynchronous, lightweight and always-on communication systems such as Twitter are enabling citizens to maintain a mental model of news and events around them. In this context, Twitter can be considered as an awareness system. Awareness systems are computer-mediated communication systems “intended to help people construct and maintain awareness of each others’ activities, context or status, even when the participants are not co-located” (Markopoulos et al., 2009).
Awareness systems have largely been discussed in the context of Computer-Supported Cooperative Work, with a focus on the notion of connecting remote co-workers by audio/video links (Bly et al., 1993). But there have also been critiques of the benefits of awareness (Gross et al., 2005) and even criticism of the term awareness as vague and problematic, often used in contradictory ways in the literature (Schmidt, 2002).
The emergence of the Web, coupled with increasingly affordable and ubiquitous information communication technologies, has helped foster a renewed research interest in awareness systems.
One focus of research is awareness systems for use in personal settings, where lightweight, informal communication systems help people maintain awareness of each other (Hindus et al., 2001; Markopoulos et al., 2003).
These systems are always-on and move from the background to the foreground as and when a user feels the need to communicate. Scholars suggest that awareness systems represent the next step in the evolution of communication technologies that have increased the frequency and amount of information transfer, offering “tremendous potential for innovation, with a wide range of forms and contexts for transforming the space around us” (Markopoulos et al., 2009, p. vii).
This paper adopts the definition of awareness proposed by Chalmers as “the ongoing interpretation of representations i.e. of human activity and of artefacts” (2002, p. 389). I suggest that this definition can be applied to social media networks such as Twitter, with messages considered as both the representations of human activity and as artefacts. Twitter becomes a system where news is reported, disseminated and shared online in short, fast and frequent messages. It creates an ambient media system that displays abstracted information in a space occupied by the user. In this system, a user receives information in the periphery of their awareness.
An individual tweet does not require the cognitive attention of, for example, an e-mail. The value does not lie in each individual fragment of news and information, but rather in the mental portrait created by a number of messages over a period of time. I describe this as ambient journalism—an awareness system that offers diverse means to collect, communicate, share and display news and information, serving diverse purposes.
The system is always-on but also works on different levels of engagement, creating an ecosystem where “a single user may have multiple intentions or may even serve different roles in different communities” (Java et al., 2007, p. 8). The question for journalism professionals and researchers is how individuals assign meaning to information from others, how they selectively attend to this information and how intentions are assigned to the information (Markopoulos et al., 2009).
In the literature on ambient media, scholars talk about improving people's quality of life by creating the desired atmosphere and functionality through intelligent, personalized, interconnected digital systems and services, with intelligent devices embedded in everyday objects (Aarts, 2005; Ducatel et al., 2001). In his discussion of ambient media, Lugmayr (2006) argues that today's technology is too complex, dominated by an individual's struggle to command the technology to do what they want. Instead, he suggests, we should aim to create media systems that can know what an individual desires and act autonomously on their behalf. If we consider Twitter as a form of ambient journalism, then the issue becomes the development of systems that can identify, contextualize and communicate news and information from a continuous stream of 140-character messages to meet the needs of an individual.
In their concept of calm technology, Weiser and Brown (1996) talk about the need for systems that allow information to attract attention at different levels of awareness, be it at the centre or periphery of our attention. With Twitter, such an approach would enable users to remain aware of ambient information in the periphery, while also allowing that information to move from the periphery into the centre of attention as required.
Suggested Approaches in Ambient Journalism
As an initial exploration into the impact of awareness systems on journalism norms and practices, this section examines the implications of Twitter as ambient journalism. This paper has considered how the first reports of a news event are now coming from people at the scene in the form of a 140-character message. But as an awareness system, Twitter goes beyond being just a network for the rapid dissemination of breaking news from individuals.
Rather, it can be seen as a system that alerts journalists to trends or issues hovering under the news radar. As Gillmor (quoted in Farhi, 2009) argues, journalists should view Twitter as a collective intelligence system that provides early warnings about trends, people and news. The immediacy and velocity of these micro-bursts of data, as well as their potentially poor signal-to-noise ratio, present challenges for the established practice of relying on the journalist as the filter for this information. During the Iranian election protests, the volume of tweets mentioning Iran peaked at 221,774 in one hour, up from a flow of between 10,000 and 50,000 an hour (Parr, 2009).
The need to reduce, select and filter increases as the volume of information grows, suggesting a need for information systems to aid in the representation, selection and interpretation of shared information.
The growing volume of content on micro-blogging networks suggests that one of the future directions for journalism may be to develop approaches and systems that help the public negotiate and regulate this flow of awareness information, facilitating the collection and transmission of news. The purpose of these systems would be to identify the collective sum of knowledge contained in the micro-fragments and bring meaning to the data. Bradshaw (2008) discusses some of the systems used to aggregate tweets at the time of the Chinese earthquake in 2008, with the development of Web applications that aim to detect and highlight news trends in real time. These applications rely on a journalistic interpretative standpoint as to the utility of, or interest in, a topic, based on choices about what to include and exclude, suggesting there is a filtering mechanism at work, albeit at the systems design level.
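The systems-level filtering described above can be illustrated with a minimal sketch. The code below is a hypothetical illustration, not a description of any of the applications Bradshaw cites: it flags a term as "trending" when its frequency in the current window of short messages spikes well above its running historical average. The function name, thresholds and tokenization are all assumptions made for the example.

```python
from collections import Counter

def trending_terms(window_msgs, baseline_counts, baseline_windows,
                   min_count=3, spike_factor=3.0):
    """Flag terms whose frequency in the current window spikes above baseline.

    window_msgs: list of short messages (strings) in the current time window.
    baseline_counts: Counter of term frequencies accumulated over past windows.
    baseline_windows: number of past windows summed into the baseline.
    """
    # Count terms in the current window; skip very short words as a
    # crude stand-in for stop-word filtering.
    current = Counter(
        word.lower()
        for msg in window_msgs
        for word in msg.split()
        if len(word) > 3
    )
    trends = []
    for term, count in current.items():
        if count < min_count:
            continue
        # Historical average frequency of the term per window.
        expected = baseline_counts.get(term, 0) / max(baseline_windows, 1)
        # A term trends when it appears far more often than its average.
        if count > spike_factor * max(expected, 1):
            trends.append((term, count))
    return sorted(trends, key=lambda t: -t[1])
```

Even in this toy form, the editorial judgement is visible in the code itself: the choice of `min_count`, `spike_factor` and the tokenization rules all decide what counts as news, which is precisely the point that the filtering mechanism operates at the systems design level.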
Considering Twitter as an awareness system also represents a shift in the consumption of news and information. In such systems, completeness of awareness is not the goal, as it would be if an individual were actively pursuing an interest in a specific news event in print, broadcast or online. Instead of overwhelming an individual with an endless stream of tweets, Twitter as an always-on, asynchronous awareness system informs but does not overburden.
This notion draws on ideas advanced by Weiser and Brown (Weiser, 1991; Weiser and Brown, 1996) in which technology advances to the stage where it becomes embedded and invisible in people's lives. The extent to which such systems of ambient journalism allow citizens to maintain an awareness of the news events would be a fertile area for future study.
The trend to share links on Twitter provides a mechanism for what Johnson (2009) describes as a customized newspaper, “compiled from all the articles being read that morning by your social network”.
In this context, tweets provide a diverse and eclectic mix of news and information, as well as an awareness of what others in a user's network are reading and consider important. The information transmitted is content-oriented but also provides a context for the news-seeking activities of others on the network, which may make “visible the structure of implied communities” (Sarno, 2009). There are concerns that this may lead to a “private echo chamber” (Johnson, 2009) but, as Sunstein (2006) argues, such a position may be too simplistic.
This is an area that merits further exploration as part of the discussion about whether Internet technologies are creating a “Daily Me” or a “Daily Us”. Approaching networks such as Twitter as awareness systems can, I suggest, help to contextualize the processes of the production, content, reception and circulation of news.
The link-based nature of many tweets, and the trend to re-send the links as a “retweet”, can be analysed as both a form of data sharing and as a system for creating a shared conversation.
This conversation can be considered as a form of ambient journalism. Since the retweets are not restricted by physical space, time or a delineated group, this creates what Boyd et al. (2010) argue is a distributed conversation that allows others to be aware of the content, without being actively part of it. They suggest that Twitter messages allow individuals to be peripherally aware of discussions without being contributors. This is significant in the context of engaging with audiences through the notion of journalism as a conversation (Gillmor, 2004). Awareness systems can be conceived as networks that engender information interactions and the development of a shared culture, which is particularly important for groups distributed across geography (Dourish and Bly, 1992; Kraut et al., 1990).
Research is needed to determine how far Twitter, as an awareness system for news, is contributing to the creation or strengthening of social bonds. For example, the mass outpouring of tweets following the death of Michael Jackson in July 2009 has been described as an immediate and public “collective expression of loss” (Cashmore, 2009).
As with most media technologies, there is a degree of hyperbole about the potential of Twitter, with proclamations that “every major channel of information will be Twitterfied” (Johnson, 2009). Furthermore, social media services are vulnerable to shifting and ever-changing social and cultural habits of audiences. While this paper has discussed micro-blogging in the context of Twitter, it is possible that a new service may replace it in the future. However, it is important to explore in greater depth the qualities of micro-blogging—real-time, immediate communication, searching, link-sharing and the follower structure—and their impact on the way news and information is communicated.
The emergence of ambient journalism through new digital delivery systems and evolving communications protocols, in this case Twitter, raises significant research questions for journalism scholars and professionals. This paper offers an initial exploration of the relationship between awareness systems and shifting journalism norms and practices. Twitter is, due to the speed and volume of tweets, a “noisy” environment, where messages arrive in the order received by the system.
A future direction for journalism may be to develop approaches and systems that help the public negotiate and regulate the flow of awareness information, providing tools that take account of this new mode for the circulation of news. Journalists would be seen as sense-makers, rather than just reporters of the news. This broadens the journalist's role as proposed by Bardoel and Deuze of a professional “who serves as a node in a complex environment between technology and society, between news and analysis, between annotation and selection, between orientation and investigation” (2001, p. 101). In the case of ambient journalism, the role may lie in designing the tools that can analyse, interpret and contextualise a system of collective intelligence, rather than in the established practice of selecting and editing content.
Micro-blogging, and Twitter specifically, are in the early stages of development. The significance of Twitter as a news and information platform will be largely influenced by its adoption, both in journalism and other spheres. As Harrison and Dourish (1996) suggest, the richness and utility of a place increases as people build up a past that involves it and a record of experiences.
The challenge for researchers is to understand how this place becomes, in the words of Harrison and Dourish, “the understood reality” through a conversational and collaborative user experience. Examining Twitter as an awareness system, creating ambient journalism, provides a framework to analyse the emergent patterns of human behaviour and data interaction that offer an understanding of this place. It shifts the journalistic discourse on micro-blogging away from a debate about raw data to a discussion of contextualized, significant information based on the networked nature of asynchronous, lightweight and always-on communication systems.
Exploring the Media Ecology
According to Mason's Musings:
Of all the media theorists, Marshall McLuhan is perhaps the most famous; in the 1960s there was perhaps no better-known academic figure in the entire communication discipline. McLuhan’s ideas have stood the test of time, yet at the time of their conception they were widely dismissed by the scientific community, for reasons we will return to later (Scolari, 2012). In recent years the theory most often attributed to McLuhan, media ecology, has enjoyed a resurgence, with organizations such as the Media Ecology Association (MEA) leading the way. This theory, as Neil Postman proposed in a 1975 address, focuses not on specialization but on making more generalized, bigger-picture connections (Salas, 2007). Media ecology can best be viewed as a framework, a way of looking at the world through the lens that mediums and technology are far more influential than the content of the messages they provide. This is the basic concept behind the phrase that epitomizes McLuhan’s contributions to the theory, “the medium is the message” (McLuhan, 1964, p. 7). Before we delve further into the tenets of and contributions to media ecology theory, it is useful to look at the metaphor around which it is organized, that of an ecology.
The ecology metaphor
While the majority of the credit for creating the framework of media ecology goes to Marshall McLuhan, the actual use of the ecology metaphor in public discourse can be traced back to a speech Neil Postman made to the National Council of Teachers of English in 1968 (Scolari, 2012). While Postman gives credit to McLuhan for introducing the term in private conversation, this was the first time it was used in a way that was recorded for posterity (Scolari, 2012). At this conference, Postman defined media ecology as “the study of media as environments” (Scolari, 2012). This definition lends the framework a biological metaphor, as Robert Logan points out in his study The Biological Foundation of Media Ecology (Logan, 2010). The words ecology and environment lend this media theory a sense of interconnectedness. Much like the biological implications of the terms, the theory looks at how mediums shape the structure and content of the environment and their impact on the people within it (Logan, 2010). Much like a biological environment, the media environment is in constant flux; like adding a new species to an ecology, “a new medium does not add something; it changes everything” (Postman, 1998). Also like a biological environment, the media species within the media ecology interact and evolve with each other. This has been a focus of recent research in media ecology, which has examined the idea of media convergence (Jenkins, 2006). Media ecology theorists such as Harold Innis and Jenkins trace certain developments in co-evolutionary terms; for example, Innis tracked “the parallel development of railroads and telegraphy in the nineteenth century” (Scolari, 2012, p. 209). These intermedia relationships, as Scolari (2012) calls them, provide prime evidence that mediums and media can be studied through a lens similar to the one through which we study the interaction of species in biological ecosystems.
So, according to Logan (2007), media ecology seeks to examine the interaction between three domains, media, technology, and language, which together form a living media ecosystem. At this point I believe it may be useful to distinguish the terms ‘medium’ and ‘technology’ because, while similar, they have implicitly different meanings.
What is a medium?
The distinction between a ‘medium’ and a ‘technology’ is a somewhat slippery one. While mediums are technologies, not all technologies are mediums. In Amusing Ourselves to Death, Neil Postman uses a comparison between the ‘brain’ and the ‘mind’ to help illustrate this distinction: “Like the brain, technology is a physical apparatus. Like the mind, a medium is a use to which a physical apparatus is put” (Postman, 1985, p. 84). This distinction means that a medium is more than just a machine (as a technology is); it is rather the “social and intellectual environment a machine creates” (p. 84). This is not to say that these technologies are without bias; in fact, they exhibit a great degree of bias. Most technologies carry with them a predisposition for some kind of use. To borrow from Postman again: television carries with it a bias towards engaging the visual medium (Postman, 1985). The television as a piece of technology could be used for any number of purposes, from a light for the room to a radio, but that was not how it was adopted, because its most novel feature was the broadcast of the visual medium (Postman, 1985). With this distinction firmly in place we can begin to examine the central tenet of media ecology, “the medium is the message”.
“The medium is the message”
Of all the quotes associated with the framework of media ecology, perhaps none is better known or provides a better summation of its ideas than McLuhan’s famous phrase “the medium is the message”. This idea comes from perhaps McLuhan’s most famous work, Understanding Media (1964). The phrase stresses the importance of the mediums that produce messages over the messages they produce. As McLuhan (1964) wrote, “’the medium is the message’ because it is the medium that shapes and controls the scale and form of human association and action. The content or uses of such media are as diverse as they are ineffectual in shaping the form of human association” (McLuhan, 1964, p. 9). An excellent example of this, as Strate (2008) points out, is art. An artistic rendering of a subject will have an entirely different effect depending on its medium: a sculpture is different from an oil painting, which is different from a screen print, and even playing the same song on different instruments will yield a completely different piece (Strate, 2008). This is why McLuhan and other media ecologists stress the importance of the medium over the content of the messages it provides. In creating this theory, as with any ecology, it was important to consider the historical development of the environment. This is exactly what McLuhan considered in his 1962 book, The Gutenberg Galaxy, which laid the framework for his later conclusion that “the medium is the message”.
In The Gutenberg Galaxy (1962) McLuhan outlines what can best be described as the four epochs of history as defined by the media ecology. These four epochs: the tribal age, the age of literacy, the print age, and the electronic age are each defined by a different technology, which has influenced the social and intellectual environments of society (thereby making them mediums as defined before).
The tribal age consists of the early ages of man, before the existence of the written word. In this stage all history is oral and there is an emphasis on non-visual senses such as hearing and smelling, because they provided a greater sense of what we cannot see, which, understandably, is an important skill in a hunter-gatherer tribal community. Within this age there was a greater sense of community, in which privacy was not emphasized; this community, McLuhan believed, had a greater awareness of its surrounding existence (Griffin, 2012). Because the spoken word only exists in the moment it is heard, there is little analysis and people are likely to believe what they hear. In the tribal age, people lived more in the moment.
This all changed with the invention of the phonetic alphabet. With its inception, the visual sense took over as the most important, which dramatically shifted the symbolic environment (Griffin, 2012). It now became possible to manipulate words out of context, and it jarred the community away from a collective tribal involvement into a ‘civilized’ type of private detachment. The written word meant that people no longer needed to congregate for information, which meant proximity became less important (hence why we began to spread out). Furthermore, McLuhan posits that the phonetic alphabet, with its organizational structure, resulted in a more linear, logical line of progressive thought (McLuhan, 1962). He believed that the invention of the alphabet birthed and fostered philosophy, mathematics, and science (Laughey, 2007). It also created a line of informational and intellectual superiority and control between those who could read and write and those who could not. This created the aristocratic society that was common before the dawn of the next big technological advancement.
The invention of the printing press in 1450 made the visual dependence brought about by the phonetic alphabet widespread. Perhaps its most important aspect was the ability to replicate the same message and type over and over again, ensuring the integrity of the message (Griffin, 2012). McLuhan believed that the mass-production capabilities of the printing press were the forerunner of the industrial revolution (Griffin, 2012). As individuals and groups turned to the written word for instruction and education, the era of detribalization set in. It was no longer necessary for people to live, speak, listen, and be governed in the intimacy of tribal gatherings now that the written word could be mass-produced and widely distributed. The ability to spread a singular message over large distances helped to unify national languages and was followed closely by the rise of nationalism, which was the result of a better-informed populace (McLuhan, 1964). McLuhan (1964) offers the example of the French Revolution: “it was the printed word that, achieving cultural saturation in the eighteenth century, had homogenized the French nation… The typographic principles of uniformity, continuity, and linearity had overlaid the complexities of ancient feudal and oral society” (p. 14). The aristocracy mentioned in the previous age controlled the dissemination of written information to the public. As printing became more accessible the common man was given a voice, and that common man answered with a resounding “No cake for us, thanks”. One other major effect of the print age was a rise in isolation and individualization: because the printing press created portable books, people were able to absorb knowledge privately.
The electronic age began with the invention of the telegraph in 1838. The telegraph shifted the media ecology back toward sound and touch (the two senses most closely associated with the telegraph). McLuhan, who was a very big proponent of electronic technology, believed this represented a retribalization of the human race, creating the global village (Griffin, 2012).
The global village is perhaps one of the most interesting ideas to emerge from McLuhan’s theory. The global village is defined as a worldwide community connected by electronic mediums, which is similar to a tribe because everyone is aware of everyone else’s business (Griffin, 2012). We no longer live in tribal villages in the literal sense, but in the metaphorical sense electronic media have expanded our horizons to such an extent that we feel a vicarious intimacy with people and places all over the world (Griffin, 2012). Within the electronic age, constant contact with the world becomes a daily reality. It is worth noting that what McLuhan essentially imagined here was the World Wide Web, and he imagined it thirty years before its realization. McLuhan believed the Internet would act as an extension of one’s consciousness. Within it everyone has access to knowledge about anything, and for people living within this age privacy has become a luxury at best and for most is a thing of the past (again, remember that he predicted this in the early 1960s). This positive reaction to the advent of the electronic culture was not echoed in Neil Postman’s sentiments. He believed that the speed of the electronic media, championed by the telegraph, was an affront to the literate culture created by print media, introducing “a large scale [of] irrelevance, impotence, and incoherence” (Postman, 1985, p. 66; as quoted in Laughey, 2007, p. 37). As fast as the telegraph carried messages, it came with a lack of depth due to the medium’s bias towards short, truncated messages.
Modern Communications: Present Day
In today’s society it is clear that McLuhan was spot on with his idea of the global village. I do not have a source for this, but I would not doubt that the advent of the Internet as a societal force in the 1990s was one of the major reasons that led to the creation of the MEA in 1998. The Internet has changed our society in many, many ways, and only now are we starting to be able to study its effects on the media ecology. One of the major things that recent researchers have focused on is the idea of media convergence. Henry Jenkins discusses media convergence in his (2006) book, Convergence Culture. Using The Matrix as an archetypal example, he discusses how the modern media ecology has created texts that are too grand to be contained in a singular medium, creating what he calls “transmedia franchises” (Jenkins, 2006, p. 98).
This same convergence is being realized in the creation of new mediums. This was the focus of the joint study by Logan & Scolari (2010), mCommunication, which set out to examine the emergence of the new mobile Internet mediums that have entered the media ecosystem. Logan & Scolari (2010) define mCommunication as “the convergence of the mobile devices and access to the Internet” (p. 170). Any communications process involves three distinct elements: the sender, the receiver, and the message/information (Logan & Scolari, 2010). Just as the telegraph represented the mobility of messages, the Internet represents the mobility of information. When one adds the element of mobile communications devices (i.e. smartphones, tablets, etc.), it extends that mobility to both the sender and the receiver (Logan & Scolari, 2010). By ‘unplugging’ and utilizing technologies such as Wi-Fi that bridge the telephone and the Internet, users are able to access the wealth of information on the Internet, and all the vast communicative possibilities contained within it, from the palm of their hand, in an instant. The phenomenon we can observe in today’s society of people being constantly absorbed in their smartphones, or pulling them out to look up any bit of random information, shows just how this new medium is acting as an extension of ourselves, and consequently an extension of the ecology into a new epoch, which Logan & Scolari (2010) have deemed mCommunication. McLuhan’s vision of the global village has undoubtedly been realized, but this extension is far enough beyond it to warrant a new epoch.
Criticisms of McLuhan’s Media Ecology
As I mentioned at the beginning of this paper, McLuhan’s ideas were largely dismissed by the scientific community at the time of their creation (Scolari, 2012). George Gordon, for example, was quoted as denouncing McLuhan’s work as “McLuhanacy” (Griffin, 2012, p. 329). A number of different critiques have been made of this theory, but they tend to center on one of three major lines of criticism: technological determinism, technological utopianism, and nonscientific methodology (Chandler, 2011).
Perhaps the most often used line of criticism is the theory’s perceived reliance on technological determinism, a line of thought that makes many people uncomfortable. Technological determinism purports that “the development of society is directed by its technology” (Chandler, 2011, p. 281). This essentially means that technology controls the development of society and free will is minimized to the point of non-existence. One can readily see why this criticism could be applied to media ecology, but I believe that, especially in today’s media-centric society, we may in fact be under the deterministic power of technology, unstoppably cascading towards a convergence of man and technology that Ray Kurzweil has dubbed “the singularity” (Kurzweil, 2005). The second critique often made is that McLuhan’s views represent a utopian view of technology, in which he perhaps looks only for its most positive impacts. Finally, the third critique is that of nonscientific methodology. As I stated before, I think that media ecology is far more reminiscent of a framework than a theory; it is a very generalized way of looking at the world and making connections, as Postman said. Because of this, however, the theory faces scrutiny, because McLuhan’s way of thinking (which is incredibly in line with my own thoughts towards research) is based on observation alone; as Eric McLuhan (his son) said in 2008, “[he] start[s] with – and stick[s] with – observation” (McLuhan, 2008). While the scientific community may take issue with this methodological approach, I tend to side with my personal feeling on the matter, which is “if you are right, you are right”, and I think McLuhan hit the nail on the head.
The media ecology is ever changing, just like our actual ecology. We have witnessed countless technologies converge, opening completely new and interesting avenues. In the near future, the media ecology is poised to have another seismic addition: the convergence of virtual realities and the physical world. We see this beginning to permeate our culture with augmented reality technologies such as Google Glass, and with 3D printing, a technology that allows for transference between the digital and the physical (a relationship which previously had operated only in the other direction). A number of technology leaders are making this push, but perhaps none provides a better example of convergence than Elon Musk’s recently revealed rocket-parts manufacturing design system. Musk and his team utilized a number of pre-existing technologies which have recently entered the ecology: the Leap Motion controller (which allows for naturalistic interaction with the visual data on the screen), the Oculus Rift (which creates a fully immersive virtual environment), 3D printing, and a number of other technologies, to create a new way of designing and manufacturing rocket parts for his private space program SpaceX (Space.com, 2013). As a brief aside, when the article calls Elon Musk “a real-life Tony Stark” I could not agree more; this man is my idol and is setting out to change the world. This convergence of technologies could bring the mobility and speed associated with the digital world to the creation of physical objects, essentially combining and revolutionizing the design and manufacturing process. Developments and revolutions like this one would not be possible if it weren’t for the interactions and convergences within the media ecology.
Media Ecological Environmental Education
This is what Antonio Lopez has to add about the media environment:
Remember, all worldviews are environmental worldviews, whether they are based on exploitation or harmonious engagement. Media are a kind of environmental education. They teach us how to act upon the world, encouraging a particular attitude towards living systems. Mediamakers have a tremendous responsibility to incorporate a more holistic and ecologically intelligent perspective into how they mediate the world. Just as mediamakers increasingly have become sensitive to the stereotyping of genders, cultures, nationalities, and sexual orientation we now have to make a turn towards planetary ecology to become aware of how our forms of mediation impact living systems.
Minimarts of the Mind
Our current world system has made the production and reproduction of a certain form of
passive, consumerist consciousness its primary product. Subsequently, the world system has colonized the collective unconscious as it preys upon the living world. Following the brilliant Indian theorist and activist Vandana Shiva, we can understand this system as a mental model, a “monoculture of the mind.” Monoculture is an agricultural term for single crop farming, such as corn, palm oil or soya, that requires external inputs, like chemical pesticides, petroleum-based fertilizers and genetically altered seedlings. According to Shiva, monoculture is a “cognitive space,” one that sees food or agriculture for what it offers for the market. Whereas a local knowledge-based perspective looks at the nurturing characteristics of a given ecosystem, such as nutrition, soil, water and life,
a monocultural knowledge system sees ecosystems as resources that can be commoditized, privatized and controlled. The monocultural approach views life from a rational, scientific perspective. As agriculture, monoculture is not designed for local use or consumption, but for export and transport across the globe. It’s mass produced and generic to the point of being without context—it can grow anywhere and be done by anyone who buys into the system (sometimes by choice, sometimes as enforced policy). The monocultural mind has a totalizing effect that extends beyond food systems to larger forms of social and economic organization, expanding to the implementation of technology and media. Monoculture has a way of crowding out alternatives in order to promote standardization, masking itself in the rhetoric of development. We categorize nations and people according to whether or not they conform to this “development” scheme. Mass marketed media are designed to work within this standardized system, to the extent that multinational media corporations promote and lobby for laws that favor their products in the global marketplace, protecting their interests against unauthorized uses and competition. They then use media to advocate for their particular position in the marketplace.
To see this phenomenon at work as a coherent system, one only needs to go to a random “convenience” minimart that populates North American highways. These are access points to the monocultural mindset and embody in an extreme way the volatility of the system. Minimarts service the various addictions of our society: oil, alcohol, tobacco, polyunsaturated fats, caffeine, empty carbs, sugar, porn and packaged media. Some also incorporate fast food kiosks. Here the totalizing spectacle of the world system masks its severe volatility: a breakdown in one key ingredient—oil, either in terms of scarcity or
price—and the whole system crashes. These portals do not depend on a single local input except for labor and utilities, and even these are often imported. In many poor and rural communities food markets no longer exist, just these vortices of cultural toxins, and as such, they represent monoculture’s ideal dream space. Constructed in meadows, clearcut forest tracts, and on fragile desert soil, each minimart island becomes a desolate outpost of the dominator complex. The misnaming of these portals as “convenient” reveals the anthropocentric and selfish character of unsustainable economic patterns that encourage personal satisfaction over planetary health. Likewise, if global media corporations have their way, the Internet and other media distribution channels will become minimarts of the mind.
Goddess of Light
Advertising is the dream life of corporations. And Pepsi has dreamt up Shakira, a high
priestess of the world system. A Colombian-born singer of Lebanese descent, her name means “goddess of light,” an appropriate name for someone who is primarily experienced through electricity. Her function became apparent to me around ten years ago when I was working on a media literacy project in northern New Mexico. I was given the task of finding ads for a Spanish language media and health curriculum that would use media samples to tackle issues like body image, tobacco and alcohol abuse, gender identity, and violence. I taped hours of Spanish language TV from satellite, hunting for concrete examples of nefarious media to teach with. It wasn’t difficult. From underaged girls cage dancing on a children’s show to mass murdering gangsters strangling women with ropes, Mexican TV is an open laboratory of ritual abuse. But nothing prepared me for what I found.
I missed it on the first run through the tapes. But as I re-watched them, the
bleached blond goddess of the Latin Pop Matrix reasserted herself in a 30-second spot for Pepsi. Believing at first that this was just a Rorschach test for the litany of social ills I had set out to find, I rewound the commercial over and over again to make sure my senses weren’t deceiving me.
The ad opens with a glowing, indigo hued hall exaggerated by linear perspective, as if we are peering beyond the guts of a television set. Along the frame’s edges are circular portal-like windows, as if it were the hull of a spacecraft. A concert stage forms the horizon line. Above it hovers a bulbous red, white and blue sphere. Below the sphere Shakira emerges to face concert goers. She belts a jingle devoted to global freedom, stalking along a catwalk, slithering, cooing, teasing her way towards the camera. Then a long shot reveals the stage’s actual shape: a crucifix. The portal windows now resemble stained glass windows, and long columns look eerily like the pillars of a great cathedral. If you are Latin American and Roman Catholic, the allusion is unmistakable. Shakira is performing mass.
As the concert progresses, her moves are ritualistically mirrored by an audience of clean-cut, adoring youth. In the final shot a Pepsi bottle materializes in Shakira’s hand, appearing from a flash of light. Shakira, and then the crowd of teen followers, together imbibe the Black Water of Imperialism. In sync they perform a transubstantiation of the world system: the indigenous colonized are transformed and purified by the Blood of Capitalism in order to go to Heaven to become White People.
As a spiritual sermon, here the world system represents itself as the dominant
planetary religion. The advertisement is a mini-ritual designed to educate Latin Americans that in order to better their lives, they must transform themselves into what Shakira has become. Already she’d altered her identity to join the planetary cult: she transitioned from her once dark-haired and distinct Latina identity to a blond angelic archetype typical of world system media: an incubus. Like her predecessors Britney Spears, Christina Aguilera and Madonna, she is a leather clad, blonde vixen set out to train youths to become proper aliens. Her divinity is bequeathed by the red, white and blue sphere. Like the pied piper, she beckons youth to leave behind their traditions in order to board her fun-filled spacecraft. To transform ourselves from the old world into the new, the magical, transformative elixir is, of course, Pepsi.
Where’s the Beef?
Corporate media and hamburgers, like soda, have a lot in common: they succeed
because they stimulate the pleasure centers of our brains. As an example of how they converge, in 2008 Burger King launched a viral marketing campaign called “Whopper Virgins.” The idea was to take the Whopper burger to remote regions of the world and to film how people reacted to it in a taste “test” against the McDonald’s Big Mac.
To create the campaign, Burger King enlisted skater/filmmaker Stacy Peralta, director of Dogtown and Z-Boys, which documented the skater counterculture in LA during the 1970s. He took a crew to Thailand, Romania and Greenland, where he filmed in mockumentary style. It had all the signs of a legitimate documentary, with shaky cameras, interviews, and “realism,” but to any keen-eyed, media literate observer it was clearly a farce. It portrayed the North American film crew as “normal” in order to make
the regional cultures appear absurd and strange, a technique going back to early circuses. The crew addresses us as if the hamburger were a kind of religious article of faith that of course anyone should like and adapt their culture to. Not only that, it should be Burger King that provides the access point to this product.
In terms of promoting green consciousness, hamburgers are one of the least ecologically sustainable food products. Making them requires an obscene amount of natural resources, from water to clear-cutting forests for ranch lands. Moreover, hamburger production is highly automated, technological and centralized. For these reasons the hamburger is closely tied to climate change and symbolizes perfectly the monocultural mindset.
In order to propagate the hamburger, the ad needs to scramble our common sense. It does this through its pseudo claim to authenticity, incorporating Peralta’s street cred and a fake documentary style that gives it a sense of verisimilitude—a feeling of “reality.” The growth of reality TV techniques is not confined to TV programming, but also extends to marketing and viral media. This demonstrates how corporate media survive by eating reality: whenever possible they have to harvest shreds of the real to claim legitimacy. It’s a very sketchy, sneaky and unethical game. But that is what is afoot.
Monoculture succeeds in post-traditional societies where identities are flexible commodities that have to adapt to consumption and constant change. In this respect, we think of ourselves as “free”— freed from tradition, free to choose who we want to be. On the surface this seems like a good thing—we are always told that the past is oppressive— but in practice the kinds of traditions that are eliminated by capitalist enclosure could also contribute to our well being, helping us reconnect with living systems. Losing
traditional knowledge is a dangerous game, and though people in rich countries deride and expel immigrants based on a fantasy of cultural and racial purity, they depend on imported laborers who are still skilled in farming and ranching to maintain their food systems. Traditional knowledge also enables us to remain autonomous from the world system. Yes, we have the choice of being skaters, hip hoppers, ravers, preppies, etc., but increasingly many in the world don’t have the luxury to shop for identities at the mall or on the Internet. They are being displaced from the cultures, languages and lands that have shaped their identities, and are being forced into permanent migration. And media provide the selling points to justify this disruptive process.
Traditions are broken so that we become dependent on external forces for all our needs. In some cases laws are written and lives are brokered through trade agreements. In others we are seduced by cheap consumer goods. Eventually we are sucked into a kind of spiritual dependency in which the corporation, as evidenced in the Pepsi ad, becomes the mediator between us and the cosmos. The vision that advertising offers us is that we cannot understand ourselves or our place in the universe, or with nature for that matter, without the intervention and mediation of the world system. So though Shakira croons about the virtues of freedom, the system is designed to turn us into serfs.
(Re)Mediating Ecological Worldviews
A mixture of carbonated water and high fructose corn syrup sweetener produced through
monocultural crop production, Pepsi is not just an innocuous soft drink: it’s the world system in a disposable plastic container, a holon of a monocultural system of food production and consumption that includes genetically modified corn and its syrupy byproduct. Pepsi
is not in the business of traditional diets or healthy lifestyle choices. For many indigenous people, it’s a recipe for diabetes. Soda is liquid candy, and the average American drinks over 50 gallons of it a year.
Not to overstate the obvious, but by the time we consume products from companies like Pepsi, they are so far removed from their source of production that they might as well have been delivered by spacecraft (piloted by Shakira, of course). Like the smartphone, Pepsi’s marketing image is pure, and whatever waste, toxic byproducts or health effects come from its production process remain out of view to the average consumer. The Shakira ad, then, in fact propagates an ecological worldview, a mental model for how we engage (or, as is most often the case, fail to engage) our living systems.
As stated previously, all worldviews are environmental, because they determine how we act upon Earth. Consequently, the world system is first and foremost an ecological worldview, and all its other effects are symptoms of this fact.
Viral Soul Regulations
Media Law: Some Musings on the Law's Modus Operandi
Finally, grounded in the systemic backdrop of social inequality, this chapter encourages readers to begin the task of thinking critically and reflecting on how each of us, as individuals and as members of local communities, nations and the world, assuages or reproduces the structurally derived inequalities that the globalization of communication and technical systems, and interaction in a global environment, manifests.
Theoretically Contextualizing Communicative Interactions In Social Institutions
Lessons learned from 20th Century social theorists can be applied to explore how changes in epistemology have engendered social change in communication and interaction norms in our new millennium. By prioritizing the social construction of everyday social interactions, Peter Berger and Thomas Luckmann (1967), two well-known sociologists, challenged classical structural-functionalism, or social systems theory, particularly the ‘sociology of knowledge’ put forth by Talcott Parsons. Classical systems theory (see Parsons, 1937) articulated that a dialectical relationship exists between individuals and social systems, and proposed that effective social interactions resulted from successful socialization and integration of socially approved norms, values and beliefs. In Durkheimian terms, the ‘social glue’ holding society together is created by the commonality of language and shared meanings of common socio-cultural ideas.
In 1967, Berger and Luckmann amalgamated tenets of grand social theory, such as the ideas put forth by Talcott Parsons, Karl Marx and Emile Durkheim, with insights from existential philosophers, stemming from the works of Jean-Paul Sartre and Albert Camus. The marriage of metaphysical activity, particularly the creation of meaning from an essentially meaningless world, with sociologically informed activities, i.e. social interactions, social systems and social institutions, highlighted the tendency for social institutions to establish normative expectations and individuals’ propensity to conform or rebel. The term ‘reification’ (Berger & Luckmann, 1967) was coined to describe the capacity of the social order to operate in ways that seemed supra-human, or beyond interactions and influence. When individuals encounter seemingly outdated institutionalized practices, the opportunity for social change emerges. According to Berger and Luckmann (1967), this occurred largely through intergenerational conflict, such as when newer generations questioned the norms and values of older generations.
Applying this theoretical lens, we can see social interactions achieve much more than that which occurs during any single interaction. Social interactions not only exist as sites for communication exchange, they serve as an effective mechanism for stabilizing the existing social order and ‘flow’ of institutions and societies. In Berger and Luckmann’s (1967) terminology, our collective symbols and everyday reality legitimate our subjective experiences. Social interactions may also be sites of protest, enabling social change via the creation of new modes of thinking and behaving which challenge the status quo. Thus, the strength of this theory lies in its capacity to establish quasi-universal principles which, on the surface may appear intrinsically individualistic and focused on furthering the capacity of individual action, yet simultaneously assert the power and authority of social institutions. In brief, social interactions require communication. Communicative events, although comprised of individuals, encompass a totality which surpasses the aggregation of individuals.
Today traditional boundaries blur as a result of new media that create an increasingly convergent communications environment. Media law traditionally focused on critical First Amendment issues related to the press, to journalists, and to content. Communications law generally treated communications as a type of regulated industry, where content is not center stage but the focus instead involves content delivery systems, entry into the market, pricing, and a myriad of other structural regulations that govern content delivery systems. Today provocative, cutting-edge developments in cyberspace provide both media and communications law practitioners an array of contemporary problems to solve.
Law and Etiquette for Using Photos Online
Sarah F. Hawkins wrote the following article:
Most SEO experts suggest using at least one photo in every blog post. From an aesthetic perspective it’s a good idea, especially when the photo has something to do with the content. Photos and images are especially important for food blogs. And, of course, there is that “A picture is worth a thousand words” adage.
I always thought everyone knew that copying and pasting photos found on the internet was a definite no-no given that nearly every image created in the last 30 years is still protected by copyright, whether here in the US or from another country extending such rights. Boy was I wrong! When I spoke at Blissdom, one of the questions I asked of the audience was how many people have had a photo stolen. Nearly every hand in the room went up. WOW! We’re talking about fifty-some people (probably more). I went on to ask how many people have used Google Images to find photos. Quite a few hands went up.
Today I want to discuss using photos found online. I will not talk about using images from a brand’s website. The focus is on those images and photos found by searching the internet and coming up with page after page of images that may be suitable for your needs.
What is Copyright? Copyright is protection created by the US Constitution that gives virtually every author the exclusive right to use or reproduce their work. This is a federal law and therefore uniform across all states. And, as the US Government has signed on to a variety of international copyright agreements, protection is essentially worldwide.
US Copyright is a protection that applies to original works of authorship fixed in a tangible medium. “Original” means that an author produced a work by his or her own intellectual effort instead of copying or modifying it from an existing work. “Fixed in a tangible medium” means that the work is able to be perceived, reproduced, or otherwise communicated. Your blog is the necessary ‘tangible medium’. (17 USC 102)
Nearly every photo taken gives the author (the one who takes the photo) a protectable right to prevent others from using or reproducing that image. Of course there are exceptions, but generally, the photographer owns the copyright. This is actually very important to know should you ever hand your camera to someone else to take a photo. That’s a completely different discussion, but don’t get offended if you ask your photographer friend to use her camera and she says no.
How do I get a Copyright? Copyright is automatic upon creation of an original work of authorship. With regard to photography, with few exceptions, every image is accepted to be covered by copyright upon saving the photo onto a hard drive or similar device.
If the photo is only on your hard drive there really is no significant issue regarding unauthorized copying. It’s when you upload your photo to a photo sharing site, your website, your blog, Twitter, Facebook, or other social media platform when the potential for someone to use your image comes into play.
There is often a misconception that you have to ‘do something’ to get a copyright. That is not true. And no, you don’t have to mail yourself a copy (often referred to as the “poor man’s copyright”). The current version of the Copyright Act does not require any filings to obtain a copyright.
However, if you wish to pursue an infringement lawsuit you will first need to obtain a registration with the Copyright Office of the Library of Congress. (I will not discuss this process here; that’s for another day.)
Can I Use Photos or Images from the Internet? NO! Well, maybe. Possibly. As a general rule, just assume that if you find an image on the internet that it is covered by copyright. Do not just ‘right click/save’ and put it on your website or blog or other social media platform or even use it on print materials. If you can find the source of the image you can then determine if they grant a license, such as creative commons, or offer it in the public domain. If they do offer a license, either free or for a fee, comply with the license and follow their rules and you’re good to go. Just know what you must do.
NOTE: finding something on the internet DOES NOT mean it is in the public domain. “Public domain” is a term of art and refers to a legal rule that means a work is no longer covered by copyright.
Can I Give a Link Back and Use the Photo? Uh, NO! Often referred to as the ‘hat tip’ or ‘shout out’, many feel that if they give the photographer credit of some form then they’re good to go. WRONG! Of course you need to give credit if that is what the license requires, but then you actually have permission. Just telling people who took the photo will not protect you if the author did not give you permission to use the image.
There is a big misconception that people want you to share their photos with your friends, family, readers, etc. Not always true. And while the majority of photographers really won’t mind, there are many who do and many who will not hesitate to take down your site for using their images.
If I’ll Get Caught Maybe I Just Won’t Link Back Even worse! Now you’re claiming it as your own, and that is sure to anger the photographer. If you don’t want to link or give credit, either take the photo yourself or find images that are in the public domain.
But the Photo Did Not have a Copyright Notice On It! Then, if you want to use the photo, that should alert you to do some extra work to find out who owns the image. Copyright laws do not require the author to include a copyright notice. Yes, having one makes it easier to find out to whom you need to go for licensing. However, the lack of a copyright notice does not mean it is in the public domain or yours for the taking.
Where Can I Find Quality Images I Can Legally Use? There are a number of stock photo sources that offer free or low cost options. The following is not a comprehensive list, just the ones I like to use. Read over the rules so you know exactly what type of attribution is required.
Flickr Creative Commons Group
User Beware! Many photographers embed their copyright information into the image file’s metadata, so even if you crop out copyright notices, crop the photo to a size you want, right click instead of download, take a screen cap, or use other ways of saving, understand that the author may still be able to track your posting of their image online. In addition, just like that game Six Degrees of Kevin Bacon, you never know who knows someone, and you’d be surprised how protective people are of their photos.
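For the technically curious, that embedded copyright information typically lives in a photo’s EXIF metadata, which in a JPEG file is stored in an APP1 segment near the start of the file. The sketch below is my own illustration (not from Hawkins’ article, and deliberately simplified): a few lines of Python that scan a JPEG’s raw bytes for an EXIF segment, using synthetic sample bytes rather than a real photo.

```python
def has_exif_segment(jpeg_bytes: bytes) -> bool:
    """Return True if the JPEG data contains an EXIF APP1 segment.

    EXIF metadata (which can hold Copyright and Artist fields) lives in
    an APP1 marker (0xFFE1) whose payload begins with b"Exif\x00\x00".
    """
    if not jpeg_bytes.startswith(b"\xff\xd8"):  # SOI marker: not a JPEG
        return False
    i = 2
    while i + 4 <= len(jpeg_bytes):
        if jpeg_bytes[i] != 0xFF:
            break  # malformed stream; stop scanning
        marker = jpeg_bytes[i + 1]
        if marker == 0xDA:  # start-of-scan: pixel data begins, no EXIF found
            break
        # Segment length is a big-endian 16-bit value that includes itself.
        length = int.from_bytes(jpeg_bytes[i + 2:i + 4], "big")
        if marker == 0xE1 and jpeg_bytes[i + 4:i + 10] == b"Exif\x00\x00":
            return True
        i += 2 + length  # skip marker bytes plus segment payload
    return False


# Synthetic sample bytes for illustration (not a full, viewable image).
exif_payload = b"Exif\x00\x00" + b"MM\x00\x2a"  # EXIF signature + TIFF stub
app1 = b"\xff\xe1" + (len(exif_payload) + 2).to_bytes(2, "big") + exif_payload
with_exif = b"\xff\xd8" + app1 + b"\xff\xda"
without_exif = b"\xff\xd8\xff\xda"
```

In practice, cropping or re-saving an image may or may not carry this metadata along, but the point stands: copying a file faithfully copies the author’s embedded claim with it.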
Conclusion Before taking any photo off the internet, get permission! Whether it be via a free license such as creative commons, paying for the license through a stock photo site, or using images known to be in the public domain get permission to use the image. It doesn’t take a lot of time to find a quality image. It surely takes much less time than what you’d have to spend if you get a cease and desist or DMCA take down notice.
Disclosure: While I am a lawyer, I am not offering legal advice. Posts on legal matters are intended to provide legal information and do not create an attorney/client relationship.
Right Or Wrong - Legal Or Illegal: Downloading Music
Sharing Music or Stealing Music Is The Question
Is Piracy Really Killing the Music Industry?
For me it’s sharing, not stealing! What do you think? There really is a cultural gap between the music industry and the downloader. Sharing music is arguably a technical innovation, one that people like me and other downloaders have adapted to. This proliferation of downloading culture and access to music content is a battle that the music industry will not win.
I want to first start by saying that I am in no way promoting illegal downloading or file sharing. This week’s topic is Framing versus “Transversality”, and I will concentrate on these two topics in terms of music and piracy.
Framing was described by Andrew Murphie as ‘beliefs, attitudes, values and mental models that we use to identify or understand a situation’.
Transversality: “Simply put a transversal is a line that cuts across other lines, perhaps across entire fields – bringing the fields together in a new way, recreating fields as something else” (Murphie 2006). Therefore, in correlation with our lecture today, Murphie argues that framing and transversality are interdependent.
Have you ever downloaded any file illegally? Come on, tell the truth! I’m pretty sure everyone has downloaded something illegally at least once in their life. This week’s reading tackled one of the most persistent issues surrounding today’s music industry: music piracy.
The reading ‘Music piracy war: are the big labels wasting their time?’ by Asher Moses looks at the issues and the possible solutions that the music industry has tried in order to overcome music piracy. Honestly, they are just wasting their time! This consistent need to frame music is a loop system. We live in a networked society, where it is normal to share files, share our thoughts, comments, our very essence. As McLuhan argues, the basic premise is that all technologies are extensions of human capacities. Tools and implements are extensions of manual skills; the computer is an extension of the brain (Murphie and Potts 2003). It would be very hard to alter this downloading habit, as it is an extension of ourselves.
Take into account the activist Lawrence Lessig. He is best known as a proponent of reduced legal restrictions on copyright, trademark, and radio frequency spectrum, particularly in technology applications (Wikipedia 2011). In a way, the music industry is framing our creativity! Lessig can arguably be seen as a pioneer of transversality.
We live in a remix culture, where we take an idea and make it our own. Music is the same: we take that art and listen to it. It becomes an extension of ourselves. Not everyone can pay for albums and everything our favourite artist introduces to the market; I certainly couldn’t afford that. Just because we don’t pay for the music doesn’t mean the artist loses money; what they miss is that they are gaining more fans. They are gaining more viewership and followers.
Colin Jacobs, chair of the online users’ lobby group Electronic Frontiers Australia, said “evolving their business to fit the times, not illegal downloading, was the problem the music industry needed to focus on” (Moses 2003). Jacobs makes a valid point: instead of changing, or trying to change, the downloading habits of our society, an approach which has obviously failed, the industry should simply adapt or work around the problem. It is inevitable that downloading and sharing will increase in the future, and it will not only be downloading movies or music; it will extend to other art and ideas. Access to the internet is so easy that a kid can download without even realising it; to stop this is simply impossible, unless they want to cancel the internet itself. “Music is the universal language of mankind” (Henry Wadsworth Longfellow), and breaking the contemporary restraints and frames is the only solution to this ongoing issue of music piracy. Instead of showing pointless piracy advertisements that are ineffective and basically seen as a joke (take a look at this parody), I can’t stress enough that transversality is the modern movement, and the right one at that.
Jacobs made another valid point: “the digital revolution offers phenomenal opportunities for distribution and promotion, and the industry is leaving these opportunities on the table” (Moses 2003). The music industry needs to break from the ‘frames’ that are holding it back. As Murphie argues, “transversality is the unavoidable discipline we must follow in new media studies” (Murphie 2006). Transversality is arguably essential for today’s music industry; as mentioned earlier, it is recreating fields into something else. Instead of the music industry trying to control and punish illegal downloaders, why not adapt to the contemporary downloading habits of today’s users? Wouldn’t it be a win-win situation?
The music industry needs to break free from the frames that structure or enclose it. The industry should use the ‘habit’ instead of seeing it as a disadvantage, use it as a positive for boosting business: ‘INVOLVE THE HABIT’. We live in a data-sharing revolution.
I want to leave you with the question from earlier: ‘Do you think downloading music is sharing or stealing?’
Stolen Works: What Legal Action to Take
Blogging The Right And Legal Way
Loren Bartley, in the following review, informs us with this article titled:
Blogging & Tweeting Without Getting Sued
I have recently finished reading Blogging and Tweeting without Getting Sued: A global guide to the law for anyone writing online (affiliate link) by media law expert, Mark Pearson. I was attracted to this book in the first place, as I have witnessed firsthand quite a bit of unsavoury online behaviour of late both personally and in the media and was keen to get a bit more information as to where that imaginary line in the sand is and to be able to share that information with fellow online users. This post provides a summary of the main issues discussed in this book.
This book is a handy guide that explains in plain English how you can get your message across online without getting yourself in trouble with the law, a lesson that anyone conducting business in the online and/or social media environment should be aware of. If you are sharing content online then you are effectively a self-publisher, and as such you fall under laws that were originally produced for traditional media publishers, many of which have not yet kept pace with the fast changing world of online publishing.
A range of international examples of the legal risks you are taking as an online writer are provided throughout the book, whilst at the same time demonstrating how cyberlaws may vary in different jurisdictions throughout the world. Issues covered include defamation, contempt of court, privacy breaches, identity theft, confidentiality, court orders, hate speech, state secrets, breach of copyright, trademarks, false advertising and sedition.
The overwhelming message throughout the book was that you can never quite be sure where your words, symbols, still and moving images, sounds, illustrations, headlines, captions and links might finish up, and which laws will therefore apply. However, despite the different international laws, most countries have the same principles. That is, there is a level of expectation that when you post online you will “refrain from committing a crime, destroying someone’s reputation, interfering with justice, insulting minorities, endangering national security or stealing other people’s words or images”.
Defamation, the legal term for damage to reputation, is the most common area of litigation for publishers, whether that be online or using more traditional forms of print media. Blogs, tweets, Facebook comments, hyperlinks, emails and even retweets or “Likes” might make you liable for defamation. As soon as you say something nasty about someone online you have defamed the victim of your comments. The damage that can come from such comments can range from embarrassing to devastating (and expensive) depending on many factors, including the content, trauma caused and who and how many people see what you have published.
Defamation is referred to as “libel” in its permanently published form or “slander” when spoken and is usually actioned as a ‘tort’, where someone can file a law suit against another person over a “civil wrong” that has been done to them. Courts usually award a sum of damages as compensation or make other orders, such as stopping the publication or forcing an apology.
The best advice offered here is to think twice before you write about a person or business in a negative way online, even if you are only using 140 characters.
Republication, Retweets & Shares
Whilst the initial author is responsible for the original publication of any dubious material they post online, any third party that shares those messages may also face damages, particularly if they add more inflammatory material of their own. Therefore, you must also be careful when retweeting or sharing the work of others, as in doing so you are republishing that material under your name and may share the legal liability with the original publisher. Courts will also consider the extent of republication when assessing damages, so should you publish anything unlawful, this is one time when you may wish that your content doesn’t go viral.
Should you choose to remain anonymous online by posting under a pseudonym or alter ego, you are still subject to the same laws, and you are probably more identifiable than you might think. There have been many cases where Internet Service Providers (ISPs) and social media platforms such as Facebook and Twitter have been ordered to reveal the identities of anonymous users.
There are two types of pseudonyms: noms de plume (‘pen names’) and noms de guerre (‘names of war’). The first allows you to take on a different persona, as many great authors have done in the past, whereas the latter is used as a cover for attacks upon others and is a form of cowardice. Using the veil of anonymity to engage in character assassination or vent bile at the expense of others is reprehensible and something that courts in most countries will not tolerate. In addition, by setting up fake profiles you are automatically in breach of the terms of service of social media platforms such as Facebook, and most countries have laws forbidding setting up fake social media accounts or websites for mischievous purposes.
The best advice offered here is: don’t publish anything under a pseudonym that you would not be prepared to take responsibility for if you were exposed at a later stage. Another thing to consider is that using your own name can sometimes afford you some legal protection as part of a defamation defence, as under your real name you can argue that you were just expressing an opinion. It might be harder to prove that the opinion of your online persona was one you honestly held in real life.
Often people may be more liberal in the type of content they share when posting within the perceived privacy of an online group, such as a “Secret” Facebook group. This is all well and good, but what if that content is later shared? Re-posting that content in a public forum might be classified as a breach of confidence or the disclosure of embarrassing facts and could result in civil action. Sharing content sourced from a private or restricted-access group requires use of your moral compass and ethical judgement. If your compass is intact, you would never disclose embarrassing private facts about another that were sourced in confidence.
On the flip side, never share any content online that you would not be prepared to say to someone’s face, or that you would be unhappy to see plastered across the evening news with your name attached to it. In other words, if you want something to remain private, it is best to keep it that way and never post it online, regardless of how strict you perceive the privacy settings to be.
Cyberstalking & Bullying
A lot of blogging and social sharing involves requesting opinions and provoking discussion. Sometimes this can get pretty heated. Whilst most people know when to walk away from an online disagreement, there are some people who either just won’t drop it or who seek out and thrive on such altercations. Sometimes this can turn nasty in the form of cyberstalking and/or bullying. Many people are unaware that stalking laws extend to digital intimidation in many countries. If it is your personality type to hold onto issues like a dog with a bone, then perhaps the online environment is not for you.
I love this quote from the movie The Social Network: “The internet is written in ink”. Gone are the days when we could destroy the evidence of our written work by shredding publications. The minute you push that Post, Publish or Send button when publishing online, you make your work instant and irrevocable. Even if you delete your work shortly after publishing it, you can never be quite sure who has already seen your content, taken a screenshot, downloaded it or shared it. This leaves little room for impulsiveness, carelessness or publishing under the influence, things that we should avoid as online publishers.
Should you publish dubious material, the best policy is to take all steps to withdraw it as soon as possible. If others choose to forward it or republish it, then it may well become their problem rather than yours.
Digital theft of creative work is rampant on the Internet and social media, with intellectual property laws varying markedly throughout the world. As a publisher, you have a moral obligation to your fellow creators to make full acknowledgement of the original author of the content you share.
Intellectual property laws are in place to protect our right to the exclusive use of any creative outputs we may produce. The most relevant law in this instance is copyright law. With copyright, the original creator holds the rights to what they have created and has the legal power to license others to copy it as they see fit. Copyright covers creative work such as writing, music and images and may also extend to computer programs and databases in some countries. Copyright does not protect an idea alone, but rather the form of expression used to convey the idea.
In most countries, work does not have to display the copyright symbol to be protected. However, using this symbol on digital work such as websites and blogs is still encouraged as best practice, as it signals your claim of authorship to anyone who might think you have given up your rights just because you have posted it on the Internet.
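As a purely illustrative aside (not from the book), the conventional notice is simply the symbol, the year or year range of publication, and the owner's name. A minimal Python sketch, with a hypothetical helper name of my own choosing:

```python
import datetime

def copyright_notice(owner: str, first_published: int) -> str:
    """Return a conventional copyright line for a site footer.

    Shows a year range when the work spans multiple years,
    or a single year when first published this year.
    """
    current = datetime.date.today().year
    if first_published >= current:
        years = str(first_published)
    else:
        years = f"{first_published}-{current}"
    return f"\u00a9 {years} {owner}. All rights reserved."

print(copyright_notice("Example Blog", 2010))
```

Whether the notice carries legal weight varies by jurisdiction; as the book notes, in most countries protection exists with or without it, but displaying it signals your claim of authorship.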
The best advice given here is to seek legal advice on your exposure to a host of commercial law issues in your jurisdiction and get the permission of the creator to reuse their material. If you cannot source the original author, then you should consider creating your own material.
Bloggers should also look into the details of their contracts and/or local laws relating to any work they have been paid to produce for employers or clients, to determine where the copyright entitlement lies and to what extent. You also have the moral right as an author to receive attribution for your work and to object to changes that may damage your integrity as the creator.
In short, if you are being paid to report on, promote, review or analyse something, then you should disclose it. These types of arrangements are considered endorsements, and therefore the material connections you share with the seller must be disclosed. As well as being required under trade practices and advertising laws, disclosure is best practice, particularly if you are trying to develop a level of trust within your online community. It allows your readers to determine for themselves whether the advice you are giving is unbiased and prevents you from being accused of misleading or deceptive conduct.
Whilst I have provided a summary of the key points of this book that I feel are most relevant to small businesses that use blogging and/or social media as part of their business, I have only touched the surface of the wealth of valuable information this book contains. I would highly recommend it not only to businesses, but to anyone with an online presence. My (non-legal) advice is to read Blogging and Tweeting without Getting Sued: A global guide to the law for anyone writing online (affiliate link) and then seek further legal advice (if necessary) on any aspects of cyberlaw that you are unsure about, or where you feel you may have stepped close to, or over, that imaginary line in the sand.
A Lot to Do About Avoiding Infringing on Others' Intellectual Property
The Technicalities of Avoiding Being Sued Whilst Blogging...
Blogger Roni Loren shared a story on her blog of being sued for misusing photographs that she pulled from another site. She thought that by giving credit to the original source she was covering herself from liability. She was wrong.
In today’s digital world bloggers are everywhere. They write about anything from sugar-free cooking to fashion to automotive technologies. Behind many of these blogs are writers who have made this their livelihood. What used to be “online diaries” and “scrapbooks” for these bloggers have become sources of income, but along with the proliferation of bloggers come legal responsibilities for intellectual property.
How bloggers get sued and how to avoid it:
1. Cite your sources. This should be an obvious one. We’ve learned our entire lives to cite sources in school, and writing on paper versus typing a blog is no different. When you are stating facts and/or quotes that are not general knowledge and were created by someone else, you should give them credit. Be aware though, linking to a source is not the same as citing it — links can be broken, so citations should come in the form of in-copy attributions even if a link is also used, like so: “One thing to bear in mind when quoting text from someone else’s website, however, is that many companies carry content usage guidelines that will let you know if they do or do not want you using their content,” Corey Eridon, Hubspot.
2. Give photo credit to images that you don’t own. I’ve mentioned this before in a previous post. If you want to use images in your blog posts, there is a safe, easy way to go about it. There’s a great website called Creative Commons that I use to find blog pictures in a safe, responsible manner. Just go to its search page, check the boxes for images that you can “use for commercial purposes” and “modify, adapt, or build upon,” and then search one of the five image-related options. Personally, I am a big fan of Flickr—its users tend to be very clear about how, and when, you can use individual pictures. Each image comes with an accompanying description of its copyright status, so you can be sure of whether or not (or how) you may use it legally. Just because a photo shows up in Creative Commons does not mean you may use it without attributing it to the original owner.
3. Disclose your paid endorsements. Many people in the blogging world are unaware that they should tell their audience when something they write is actually a paid advertisement. This is especially true now that the Federal Trade Commission recognizes blogging as more than a hobby. For more information check out the FTC FAQ Page. You can still provide useful information while getting paid, so there is no reason not to level with your audience about it. Hiding the truth usually hurts you in the long run, so be transparent about advertising.
4. Defamation and freedom of speech – there’s a difference. The proliferation of gossip blogs is at an all-time high. Kids in high school, college students, adult socialites and celebrity gossipers are a dime a dozen. Regardless, you need to be cautious of the things you say about others (regardless of their public figure status) to avoid getting hit with a lawsuit. Anything you write that harms a person’s reputation and is untrue can expose you to scary amounts of liability. Even if you write something true, which by definition would not be libelous, if it harms a person who gets mad enough to sue you over it, you could spend a fortune defending yourself. Lawyers are expensive! You can disagree with people in a professional manner, and you can share your opinions about other people’s actions, but try to avoid mean-spirited personal attacks and gossip.
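Point 1's advice, that an in-copy attribution should survive even if a hyperlink breaks, can be captured in a small helper. This is a hypothetical sketch in Python, not anything prescribed by the sources above; the function name and output format are my own:

```python
def in_copy_citation(quote: str, author: str, outlet: str, url: str = "") -> str:
    """Build an in-copy attribution that survives even if a hyperlink breaks.

    The quote is always paired with the author's name and outlet in the text
    itself; the URL, when given, is appended as a convenience, not a substitute.
    """
    citation = f'"{quote}" - {author}, {outlet}'
    if url:
        citation += f" ({url})"
    return citation

print(in_copy_citation(
    "Many companies carry content usage guidelines.",
    "Corey Eridon",
    "HubSpot",
))
```

Because the URL is appended rather than relied on as the citation itself, a dead link never strips the quote of its credit.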
Whether you’re getting paid for your blogging expertise or just posting your thoughts online for fun, you need to make sure that you are following the rules of the web. Protect yourself by (1) citing your sources, including giving proper credit for images you don’t own, (2) disclosing when you are being paid to write something, and (3) staying away from badmouthing others.
The Following Article was co-edited by Magda Torres for paper.li
Content Creation: Know Your Rights, Keep It Legal...
Q: What’s more painful than having to pay $8,000 in image copyright infringement penalties?
A: Being sued for $150,000.
Not too long ago I was asked by someone if they could re-use an image that I had used in a recent blog post. Without thinking twice, I said yes — as long as they credited the original source.
A couple of hours later I stumbled across this post by Roni Loren on image copyright and it got me thinking: what if I actually didn’t have the right to use that image to begin with? I asked permission, but what if the image didn’t actually belong to the “owner”? What if I am violating a copyright?
That led me to research a bit further, and I ran across these two stories: one about the use of a low-grade image of Nebraska by the Content Factory, a company that consults for small businesses, and the other about a community fan photo posted on Goodreads.
The scary truth about Roni’s story, as well as the other two incidents, is that they are not far-fetched, nor something you would shake your head at while saying “duh, they should have known better”. Those involved were going along with what most of us believe to be best practices: attributing where needed and, for the most part, being “thoughtful” about where the images came from.
Images are crucial in today’s marketing mix, and research shows that posts with visuals drive 180% more engagement than those without. As marketers shift from longer-form content to using more visuals in posts, ebooks and white papers, the good old “use and give attribution” isn’t enough. Your images need to be your own, licensed under Creative Commons, or outright purchased.
Visual content aside, the pressure to create any kind of “more entertaining and engaging content” increases the risk of copyright infringement and plagiarism. After all, creating good content is hard work; taking someone else’s content and calling it your own isn’t. Imagine how surprised Orbit Media was when they found out their entire website had been knocked off. An entire website!
So where is this leading?
If you are a content creator, it’s your business to know the following:
What is legal, and what isn’t, when you use other people’s content or images in your posts or on your website
What you can do to protect your images, graphics, posts and website in case you find your content being used on someone else’s blog or website without your consent
This Tuesday on Tweet Chat, 2pm EST, we’re thrilled to have attorney, writer, educator and content marketing expert Kerry Gorgone as our Tweet-Guest to guide us through what we need to know to keep our content creation, and use, legal.
We’ll be covering copyright, attribution, curation, plagiarism and any other questions you have from the viewpoint of content creator and curator. If you’re a hobby blogger, solopreneur or small business owner this chat is for you!
The articles hyperlinked in this post make great pre-reading, as well as this post by Kerry on Protecting your Creative Works Online.
Gay Talese - Writer...
Gay Talese on the shortfalls of modern-day journalism
Jeremy Barr writes:
Legendary New York Times and Esquire writer Gay Talese visited New York University last night to discuss his book on the building of the Verrazano-Narrows Bridge, which was recently republished. In a Q&A session after the event, Talese expounded on the problems of modern-day journalism and modern-day journalists.
“They do not understand the working class, because they don't spend any time with the working class,” he told the crowd of mostly students. “They don’t understand the special needs of special people. Journalism today doesn’t have people … from the outside.”
Talese suggested that his generation of journalists—the post-World War II generation—had more of an “outsider’s” point of view.
“We weren’t blasé about anything,” he said. “I think the technology of people—each day sitting behind their laptop, spending their days communicating with the technology and never really seeing people, but sort of seeing the world through their little laptop, their little screen—has a narrowing effect. I think it’s not broadening. It’s easy. It’s fast.”
The journalists of today, he said, "are better educated ... in one sense, but not really curious about a world different than their own."
Talese also argued that New York is ripe with story possibilities.
“New York is the world,” he said. “You just go around the five boroughs, and you’ll find touches of the world in every manifestation—the people there, the neighbors, and what they think. So I don’t think you have to have airplane trips to far-away places. You can explore with depth this city, and find such great stories to do.”
Talese said that he shuns cellphones to this day.
"I don't want to talk to people. I want to see them," he said.
He bought a computer, however, when he began writing for The New Yorker three years ago.
"They said, 'You have to email us. You cannot type it and mail it in to us.' ... Okay, so I had to learn how to use email. I had to get a computer. And I did. I had to 'sell out.' I wanted to be in The New Yorker."
Gay With Boxes
Gay Talese and a city of permanent change
The Following article was written by Abigail Carney:
Gay Talese tells me that he does not have dire notions about the future of New York. He has lived here since 1953 and has seen the city, in many ways, attacked. But a lot of streets have not changed, and what makes New York, New York, has not changed either.
“It’s a city of optimism and city of change and even bad news changes very quickly here,” he says.
He came to New York after graduating from the University of Alabama. The New York Times hired him as a copy boy. Gay says that in a way it was the most important job that he ever had at the paper, where he would later work as a staff reporter.
When you are a reporter, Gay explains, you have to go out into the city and interview people, chase down the mayor, watch a strike, talk to firemen who have hosed down a burning building. But as a copy boy, you are free to observe the secretaries, clerks, publishers, reporters, editors, advertising directors, floor sweepers, window washers, and elevator operators.
“This is perfect material for me,” he says. “Because I am very curious about ordinary people, not the people who make the news.”
The stories of the people at the New York Times would become his first bestseller, The Kingdom and the Power.
Gay Talese left his job as a New York Times reporter when he was 33 years old. He had already published his first book, New York: A Serendipiter’s Journey, about the people he saw on the streets, and his second, about the building of the Verrazano-Narrows Bridge. Gay did not find another place of employment. Instead, he worked out of his house, or rather, below his house. From there, he wrote profiles for Esquire that would become classics and books that would become bestsellers, and helped to define literary journalism.
Gay opens his front door—which is being repainted by a man dressed in white who asks who I am here to see—with a hat in his hand. He offers me a seat, and then a drink, and then suggests that we visit what he calls his bunker.
We walk out of the house, then down the steps to the sidewalk where we say hello to the painter. Gay unlocks a door at the house’s right corner. He tells me to slam the door behind me, to be careful on the steps, to hold the banister.
There are no windows in the bunker. There is no telephone. You cannot hear the cars on the street. Gay comes here at 9 or 10 in the morning, and then stays and works until 2 or 3. He follows this routine.