Mis(s)-Understanding Media

By Mark Fretz of Scribe Inc.


The New York Times recently reminded us that Marshall McLuhan’s classic Understanding Media turned fifty this year. The book is famous for popularizing the idea that the electronic image had supplanted the written word and for phrases such as “global village” and “the medium is the message.”1 In the world of pop culture, fifty years is longer than anyone can remember (fifteen minutes is about all most people get), whereas in the history of humankind, fifty years is but the blink of an eye. That anyone outside of McLuhan’s devoted followers remembers his book is remarkable. Even more remarkable, that anyone still considers it relevant is a testament to the prescience of McLuhan’s ideas in relation to the current tumultuous trends in that slice of the information and media sector of the economy called the publishing industry.

In his lifetime and afterward, people criticized McLuhan because he presented his ideas in such a disconnected fashion. The apparent disconnectedness of his discourse, however, further illustrated McLuhan’s assertion that “the medium is the message.” Without calling it such, McLuhan had transformed his discourse into hypertext, in sync with the electronic media about which he wrote. Lineal, sequential presentation of information gave way, in electronic media, to what McLuhan termed hyperreality—the simultaneous availability and experience of content in multiple formats and different media. The challenge of producing electronic media requires an understanding of what they are, a mastery of the technology required to produce content in electronic formats, and an appreciation of how electronic media shape humanity on the most fundamental level. For the publishing industry, it starts with knowing where we stand in historical context.

Whence Cometh Electronic Media?

Media, as McLuhan observed in the early 1960s, had, by the mid-twentieth century, become electronic: a new reality inaugurated more than a century before by the invention of the telegraph. That was the third of humanity’s great revolutions, along with the phonetic alphabet and the introduction of moveable type. It is instructive to look briefly at these transformative events in the evolution of humanity.

  1. The introduction of the phonetic alphabet. McLuhan, drawing on Karl Popper and others, traced the first revolutionary shift in human history (admittedly from a Western perspective) to the transition from closed to open societies. With the introduction of the phonetic alphabet in the sixth century BCE, Greece experienced a shift from operational to classified wisdom. Phonetic writing sparked a fundamental cultural shift from dependence on the ear to dependence on the eye, which brought with it radical changes in Western societies. Although we could quibble about who invented the phonetic alphabet, where it was first widely used, and about McLuhan’s associating it with Plato’s system of educating the Greeks, the evidence clearly shows that by the late fifth century BCE Greece had experienced radical cultural shifts that can be associated with the use of a phonetic alphabet, which gave rise to widespread writing and never-before-seen forms of written communication.
  2. The introduction of moveable type.2 Johannes Gutenberg is credited with introducing moveable type to Europe and the West in 1450. Gutenberg’s printing press took the earlier shift from ear to eye to a whole new level by distinguishing form from content. Each individual character in Gutenberg’s press was independent of all the others, but the entire collection of letters (capital, lowercase, various sizes), spaces, and eventually punctuation had to be presented on the page (i.e., designed) in some fashion. Separating form from content was not new to the Western world, since the Greeks had invented and implemented that distinction in the philosophical realm two millennia before. However, Gutenberg changed the landscape of printed media, for the world of the eye, by making a distinction between the phonetic alphabet and the media in which the alphabet was used.
  3. The introduction of electronic communication. The telegraph celebrates its 170th birthday this year. According to McLuhan, the third human revolution—electric media—dawned with the invention of the telegraph, which opened the floodgates to all subsequent electronic communication. Years before the first successful electrical transmission of a communication from Washington, DC, to Baltimore on May 24, 1844 (Samuel F. B. Morse’s prescient message, “What hath God wrought?”), Joseph Henry, then a professor of natural philosophy at the College of New Jersey (eventually renamed Princeton University) and later the first secretary of the Smithsonian Institution, had researched and published articles on electromagnetism and electromagnetic induction. Morse learned of Henry’s experiments, saw the implications for telegraphic communication, and began his own research and development of what eventually became the telegraph in the United States; simultaneous efforts were being pursued in Europe.

Electronic communication took the then nearly four-hundred-year-old printing industry into territory never before imagined. On the one hand, it brought communication back full circle to being an ear-oriented event by converting the alphabet into electric impulses that could be translated into sound (i.e., oral speech → phonetic alphabet → moveable type in print → electronic rendering of the alphabet in audible and digital forms). On the other hand, it further disembodied communication from being anything tangible at all, because in its electronic form, communication was imperceptible without being reembodied for human consumption, whether aimed at the eye, the ear, or any of the human senses. This revolution continues to proliferate cultural changes in media and to challenge the industry that has depended on both the alphabet and Gutenberg’s moveable type more than anything else—the publishing industry.

Misunderstanding and Missing Understanding

Ironically, the publishing industry has not only misunderstood electronic media; it has actually missed an understanding of the very media in which it functions. If, fifty years ago, McLuhan attempted to help us understand media, how could the publishing industry have squandered half a century both by getting electronic media all wrong and by completely missing the importance of electronic media for how it produces content?

Misunderstanding (i.e., getting something wrong) can result from lack of information or incorrect comprehension of whatever information we do have. For a long time, the publishing industry misunderstood what electronic media were and their implications for publishing, and the industry’s actions reflected this misunderstanding.

At first, the industry moved from manual typewriters to electric typewriters. By the 1990s, personal computers were seen as an expensive replacement for electric typewriters—nothing more. Compounding this limited comprehension of the technology, the industry was mired in general confusion about what we are actually looking at on the printed page. Many people don’t distinguish between structure and rendering, basic components of the “Gutenberg Galaxy,” when in fact electronic publishing requires this fundamental distinction.

Structure pertains to the identity and function of any given element within a document (i.e., What is the function of this paragraph or word or character in relation to the whole document?), whereas rendering pertains to the appearance of any element, regardless of its structural identity and function (i.e., What should that element look like?). Block quotes or extracts illustrate this point quite well. A paragraph may be formatted so that its first line appears indented, flush left, or flush right, followed by the remaining lines of the paragraph appearing flush left and ragged right, flush right and ragged left, or justified. The entire paragraph may be indented both left and right, with the first line indented further, or flush left. Font attributes (e.g., typeface, size, and color) can be combined in a wide variety of ways. Leading, especially a line space above and below an extract, can be used as the visual clue that this material is indeed set off from the preceding and following content. Kerning can be handled in numerous ways.

When we see a paragraph indented both left and right, with spacing above and below separating it from the surrounding paragraphs, we might call that a block quote or extract based solely on the way it looks on the page. However, a block quote or extract can be made to appear in countless other ways and still function structurally within the document as a block quote or extract. The structure and the rendering of an element are distinct from one another, even though publishers may have developed a tradition of presenting certain elements using specific conventions, which induces readers to associate the appearance of those paragraphs with certain elements (i.e., It must be a block quote because of the space above and the left and right indentation).
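To make this distinction concrete, consider a minimal sketch (mine, not any publisher’s actual system) in which one structural element, a block quote, is given two different renderings while its structural identity never changes; the names and conventions here are purely illustrative:

    # Structure (the element's role in the document) stays fixed,
    # while rendering can vary by output medium or house style.
    from dataclasses import dataclass

    @dataclass
    class Element:
        role: str   # structural identity, e.g., "block-quote" or "paragraph"
        text: str   # the content itself, independent of appearance

    def render_for_print(el: Element) -> str:
        # One print convention: approximate a left and right indent with a margin.
        if el.role == "block-quote":
            return "\n".join("    " + line for line in el.text.splitlines())
        return el.text

    def render_for_screen(el: Element) -> str:
        # A screen convention: the same structure maps to a semantic tag instead.
        if el.role == "block-quote":
            return "<blockquote>" + el.text + "</blockquote>"
        return "<p>" + el.text + "</p>"

    quote = Element(role="block-quote", text="The medium is the message.")
    print(render_for_print(quote))   # indented lines for the page
    print(render_for_screen(quote))  # tagged markup for a display

Either rendering can change tomorrow without touching the element’s structural identity, which is precisely what makes downstream reuse of the content possible.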

Similarly, our misunderstanding of the electronic age of publishing is reflected in how we have continued to produce books using a mechanistic process rather than adopting an electronic process. The introduction of electronic technology to produce books and journals changed the way those products should be produced. Printers no longer set metal type but instead use electronic files to produce the image on the paper, which becomes the pages of the book or journal. For printers to receive these files in an electronic format, publishers must use electronic technology to produce their electronic printer-ready files. Doing so requires a different production process and workflow than was employed before the advent of electronic media. However, some publishers have been slow to comprehend the implications of these shifts in technology for their workflow and for the markets they wish to reach with their products.

Some in the industry may not have read or known about Understanding Media and therefore may have missed McLuhan’s forewarnings. Even those familiar with the book may have failed to see its relevance for publishing (i.e., the understanding completely passed them by), or if they did, they might have been uncertain about how to apply what they understood to what they did day-to-day in the publishing world. People miss things for various reasons:

  • They are not paying attention.
  • They are paying attention to the wrong thing.
  • They are paying attention but lack categories for comprehending the information they see and hear.
  • They are paying attention, but habit blinds them from seeing what is right in front of them.
  • They are incapable of paying attention, which prevents them from even knowing there is something to pay attention to and understand.

Our failure to adapt as an industry is as much the result of missed understanding as of misunderstanding electronic media.

Understanding Electronic Media

“As the means of input increase,” says McLuhan in the preface to the third printing of Understanding Media, “so does the need for insight or pattern recognition.” Although he was referring to the volume of data needing to be processed—what we today call “big data”—this may be a valuable clue to altering the current state of affairs in the publishing industry. If we agree that the means of input have increased in the publishing industry, as they have fairly universally done in the electronic age, then we must increase our ability to recognize patterns. Through pattern recognition, we enable ourselves to gain insight into not only media in general but, more importantly, the production process itself. Pattern recognition brings us back to the aforementioned distinction between structure and rendering.

We can draw several practical conclusions from McLuhan’s ideas for various areas of the publishing industry.

Ingestion Requires Consistency

Ingestion of content at any stage of the production process depends on consistency and pattern recognition (i.e., structure). As employed here, ingestion is a functional result of interoperability. Let me clarify these two concepts. By ingestion, I mean the ability of any system to import and access content in a way that allows it to be manipulated reliably. Interoperability pertains to the ability of computer systems and software to exchange data based on their ability to parse the code and formatting of that data. Both ingestion and interoperability depend on the data being marked or coded (i.e., structured) consistently and conforming to agreed-upon rules of markup or tagging conventions. We can have blind spots when it comes to understanding the need to ingest content. For instance, publishers may not think about ingestion and interoperability until they have produced and printed a book and become interested in distributing it in an electronic version. At that point, the structural decisions (or lack thereof) have already been implemented.

When the publisher contacts a distributor about getting the book to market, the distributor has to consider the varied requirements of the numerous channels of distribution they serve. For the distributor to ingest the publisher’s book into its system, the content first must be formatted in a way that the distributor’s system can recognize. Then the distributor must be able to manipulate the content to conform to the requirements of each channel of distribution so that those channels can in turn ingest the publisher’s book into their systems and then export it to end users. A prerequisite of this sequence of ingestion is for the content to be marked up consistently, a process that begins with pattern recognition. Publishers should think about ingestion from the moment they decide to publish a book. Before the author writes the first word of a manuscript, a publisher can conceive of how best to make that manuscript consistent so that ingestion of the book at the end of the production process is as efficient and cost effective as possible.
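A minimal sketch, assuming a hypothetical in-house tag set, suggests what such upstream consistency can look like in practice: before a file is handed to a distributor, every element is checked against the agreed-upon structural vocabulary, so inconsistencies surface early rather than at the end of production. The tag names and data shape here are illustrative only:

    # Hypothetical tagging convention; a real house style would define its own.
    ALLOWED_TAGS = {"chapter-title", "paragraph", "block-quote", "note"}

    def validate_for_ingestion(elements):
        """Return a list of problems; an empty list means the file is consistent."""
        problems = []
        for i, el in enumerate(elements):
            tag = el.get("tag")
            if tag not in ALLOWED_TAGS:
                problems.append(f"element {i}: unknown or missing tag {tag!r}")
            if not el.get("text", "").strip():
                problems.append(f"element {i}: empty content")
        return problems

    manuscript = [
        {"tag": "chapter-title", "text": "Understanding Electronic Media"},
        {"tag": "blockquote", "text": "The medium is the message."},  # drifted tag name
    ]
    for problem in validate_for_ingestion(manuscript):
        print(problem)

The point is not the particular code but the habit it represents: pattern recognition applied early, so that the same consistency a distributor’s system will demand is built into the manuscript from the start.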

Structure Is Editorial

Determining structure is an editorial function; thus, structure should be imposed on a document in the editorial phase of production. Returning to the example of ingestion by distributors, we must move structural decisions as far upstream in the production process as possible to gain the greatest efficiency from pattern recognition and consistent structuring of a document. Rather than foist the task of determining structure on the typesetter, publishers are better served by recognizing that editors are making decisions about structure from the moment they first read a manuscript (if not before, during the conception and developmental phases of a project).

Electronic Should Be Electronic

Electronic media require electronic input, which of necessity relies on the electronic technology best suited to an electronic publishing process. To stick doggedly with a mechanical process introduced in the fifteenth century along with the printing press, when the world has been transformed by electronic media, is counterproductive at best. The publishing industry has come into this electronic environment dragging its feet, and segments of the industry still refuse to run with the new technology. Companies that pursue electronic media commit themselves to using all available electronic means to produce and distribute their content and products, even when those products take physical or print form.

Typesetting Is about Perception

Typesetting best illustrates this transition from mechanical to electronic means of production. Although we typeset (literally “set type”) for the printed page, these days we do so using electronic technology. This applies to the preparation of the manuscript for typesetting, the typesetting process itself, and the printing process; all phases of production depend on electronic technology. The purpose of typesetting is to enhance the comprehension and appreciation of the content through visual presentation. That same content could be presented as an undifferentiated series of characters, as it would have been during the manuscript era prior to the advent of moveable type—no horizontal or vertical spacing, no punctuation, no annotation, and so on. However, the art of designing and typesetting books is concerned with the visual presentation of content, both textual and graphic, within the parameters of a predefined unit of presentation: the trim size of the page. Ultimately, therefore, typesetting is about perception: how the textual and graphic content is visually received and perceived.

Asked why he thought people reacted so negatively to his work, McLuhan explained that “Any new demand on human perception, any new pressure to restructure the habits of perception, is the occasion for outraged response. Literary people prefer to deal with their world without disturbance to the perceptual life.” Two aspects of this statement relate to publishing. First, the book publishing industry must realize that publishing is a perception industry. It involves not only the content on the printed page, computer screen, or e-reading device but also the consumer’s perception of that content. Second, how the consumer perceives the attempted communication is affected by the chosen medium or media; the two go hand in hand. Perception is shaped by the medium as well as by the content that medium delivers. Media choices are imposed on the publishing industry, as they are on all industries, by media availability. Publishers don’t invent media; they simply adopt them as means and channels of delivery. When new media are introduced to the world and adopted by the publishing industry, those media force consumers to adapt their perceptions.

If typesetting is about visual perception of content on the printed page, then replacing the fixed physical printed page with a dynamic electronic display eliminates the need for typesetting. No printed page, no typesetting. Electronic delivery of content can maximize perception outside the confines of the printed page, including animated video, as well as means of perception other than the human eye, such as audio. In the electronic environment, perception is hyperreal and not bound by the size of the printed page. But people accustomed to the printed page as the unit of delivery must “restructure habits of perception,” as McLuhan would say. Only then can they actually perceive what is being communicated via the electronic media.

Conclusion

I hope this essay prompts publishing professionals to think critically about the media in which they operate and publish content. More important, I hope people examine what they are doing, and how, from their various perspectives as editors, designers, typesetters, e-book developers, distributors, marketers, salespeople, and so forth. I hope this essay stimulates dialogue and debate in editorial and production meetings and forces those engaged at all stages in the publishing chain to consider how well their workflow is integrated with the media used to produce and distribute their products. Finally, I hope it reduces the misunderstanding of media and their effects while helping people get a better grasp of how media shape the publishing industry, especially its production processes.


NOTES

1. The New York Times Sunday Book Review recently ran a Bookends piece by Dana Stevens and Rivka Galchen, “Has the Electronic Image Supplanted the Written Word?” (June 17, 2014), which revisits the ideas and influence of Marshall McLuhan on the fiftieth anniversary of the publication of his book Understanding Media (1964).

2. McLuhan’s earlier work, The Gutenberg Galaxy (1962), examines the revolution sparked by Gutenberg’s moveable type. In particular, that book outlines the connections between the introduction of new technology, such as the moveable-type printing press, and all aspects of human life. The “tentative probes” offered in Understanding Media derive from arguments laid out in The Gutenberg Galaxy.