Blog

Save the date—PubPro 2013

When I worked in house, I saw the rights people flying off to book fairs, the sales and marketing folks going to trade fairs, and the digital people attending conferences to geek out and speculate about where publishing is headed. Never did I see the managing editor or production staff—those who actually made the books real—meeting professionals in the same role from other houses to talk shop. That was a real pity, because I’m sure we all had a lot to learn from one another.

On Saturday, April 13, 2013, I’ll be facilitating an event that I hope will redress that imbalance. EAC-BC and the Canadian Centre for Studies in Publishing will be co-hosting PubPro 2013, an unconference-style professional development event for managing editors, production editors, publication directors, editorial coordinators, and everyone else who performs essentially the same role: publications project management. The agenda is set at the start of the day by the participants, who are invited to consider giving a presentation or leading a discussion on a topic of interest.

You’ll find more information about the event on the EAC-BC site here. In the meantime, save the date, spread the word (official hashtag #PubPro2013) and, if you’re interested in getting event updates, contact me to be added to a mailing list. I promise not to spam you.

British Columbia: A New Historical Atlas event coming up

Derek Hayes will be giving a talk about his new book, British Columbia: A New Historical Atlas, on Monday, December 3, at the Vancouver Public Library’s central branch. This free event runs from 7pm to 8:30pm, and there will be books for sale. More information is available on the VPL’s event calendar.

Introduction to information graphics and data visualization

Alberto Cairo, author of The Functional Art, found my review of his book and wrote me a very gracious note to let me know about an online course he’s teaching about infographics and data visualization. The first offering of the six-week course began October 28 and saw an enrolment of 2,000 students; a second offering is scheduled for January 2013. It’s completely free, and you can register here.

Cairo tells me that although he designed the course with journalists in mind, the diversity of backgrounds among his current students is huge, from epidemiology to statistics to cartography—as well as journalism.

The course, he says, will help fill in the gaps I identified in The Functional Art—namely, the details of how to put the theory he presents into practice. Further, he’s at work on another book that focuses more on that aspect of information visualization. If it’s as cogent and clear as The Functional Art, it will be a valuable reference indeed.

Crimes involving words: some recent cases

Dr. Lorna Fadden captivated the audience at Wednesday evening’s November EAC-BC meeting with her fascinating talk about forensic linguistics. Dr. Fadden is an assistant professor of linguistics at SFU and also runs a consultancy as a forensic linguist, analyzing language evidence for law enforcement and legal counsel in criminal or civil cases.

Examples of crimes or offences involving language include extortion, ransom, solicitation, harassment, Internet luring, threats, coercion, perjury, hate speech, defamation, bribery, and plagiarism, and language evidence can turn up from a spectrum of sources, from emails, text messages, and letters to police interviews and emergency phone calls. Forensic linguists may analyze linguistic form (e.g., grammatical structure and word choice) or linguistic function (e.g., social context).

Modern forensic linguistics, Fadden explained, has its origins in the Evans Statements. In 1949 Timothy Evans of London was accused of murdering his wife and daughter. He was tried, convicted, hanged, then posthumously pardoned—an inquiry found that a neighbour had killed Evans’s family—and this miscarriage of justice caused the UK to throw out its death penalty. In 1968, Jan Svartvik, a Swedish professor of English, reviewed Evans’s statements, which were allegedly a verbatim transcription of what he had said. Evans had the vocabulary of a fourteen-year-old, but Svartvik found portions of his statements that showed the grammar and word choice of someone with a much higher level of education. As a result, Svartvik concluded that the police had made up portions of the confession, and the case put forensic linguistics on the map.

Fadden then took us through four of her cases.

Case 1

Fadden was asked to study a transcript of a 911 call, in which the caller gave an elaborate backstory before, five or six sentences in, telling the operator the reason for the call—that he saw a man with a gun. Most calls to 911, she said, tend to follow a script:

  • the operator takes the call,
  • the caller identifies the problem (“My wife is choking!”) or makes a request (“Send an ambulance!”) in the first few words, and
  • the operator solicits details as needed.

She added that criminals who call in their own crimes often go off script, because to them, there is no emergency and no urgency.

Fadden concluded that the transcript she analyzed didn’t meet our expectations for what generally happens in a 911 call, and although she had her theories as to why, it wasn’t up to her to make that determination.

Case 2

Fadden’s second case involved a young child being interviewed by family services for an investigation into allegations that his father sexually assaulted him. The lawyer who hired Fadden wanted to know if there was any evidence of coaching. In cases of coaching, you might see vocabulary or sentence structures that look out of place; children the age of the alleged victim tend to use common nouns and verbs and fewer adjectives and adverbs; they tend to stick to neutral terms or higher-frequency words:

  • look rather than leer or ogle
  • touch rather than fondle or grope
  • creepy, funny rather than lewd, salacious

Fadden concluded that the child’s vocabulary in this case didn’t support the theory of coaching, but she continued her analysis of the interview by coding the questions by type:

  • assertions (to be accepted or rejected)—e.g., Your dad touched your penis?
  • high-specificity questions—e.g., Did your dad touch your penis?
  • closed alternative questions—e.g., Did your dad touch your penis or your bum?
  • open alternative questions—e.g., Did your dad touch your penis or your bum or something else?
  • low-specificity questions—e.g., What did your dad do?
  • wide open questions—e.g., What can you tell me?

Fadden studied the alternative questions closely, because in those cases, details might be generated not by the witness but by the person asking the questions. She found a consistent pattern in the way the witness answered those kinds of questions (e.g., always selecting the first option in open alternative questions) and noted that those answers did not generate any new information. She questioned the credibility of those answers, contrasting them with portions of the interview in which the witness supplied completely unprompted details.

Case 3

Fadden analyzed correspondence from a man, the owner of a small business, who had met a CEO of a large firm at a public event and began sending him a series of letters. The CEO had never solicited or responded to the correspondence, but it kept coming, and it contained unsettling language. Did the letters constitute stalking?

Stalking letters, Fadden explained, have particular hallmarks:

  • expressing frustration with unrequited feelings or with being ignored or overlooked
  • berating the target for the target’s transgressions
  • offering forgiveness for those transgressions
  • alluding to a relationship that doesn’t exist

Even though the correspondence in this case didn’t involve any kind of romantic angle, it nevertheless had all of the characteristics of stalking and was legitimate grounds for a cease and desist order.

Case 4

This final case has concluded, and so Fadden was at liberty to share details with us. On March 19, 2009, Justine Winter had a fight with her boyfriend. After dropping him off, she started to drive home, all the while exchanging text messages with him in which she threatened to kill herself. She crashed her car, and although she survived, a thirty-year-old woman and her thirteen-year-old son were killed. The prosecutors hired Fadden to determine whether these text messages constituted a suicide note; their assertion was that Justine’s death wish was a criminal act because two others died as she carried it out.

A suicide note, Fadden said, tends to have certain characteristics:

  • saying good-bye
  • claiming that life is too difficult to go on
  • expressing a desire to end suffering
  • imploring others to go on
  • expressing remorse or regret
  • expressing revenge
  • apologizing

Although Winter’s text messages had some of these characteristics, they also showed features—such as bargaining—that you wouldn’t typically find. “People who write suicide notes are way past bargaining,” said Fadden. As a result, Fadden concluded that the messages weren’t a suicide note.

But how credible was Winter’s threat to crash? To be a credible threat, Fadden explained,

  • it must be communicated
  • the subject must be motivated
  • the subject must have the means by which to carry it out

Analyzing the text messages, Fadden asserted that Winter’s threat to hurt herself was credible. Winter was convicted and is now appealing her sentence.

***

Fadden’s talk was thoroughly engaging and entertaining, and she’s clearly passionate about her work. What I found interesting was that she was very clear about where her role as a forensic linguist begins and ends. Her job is to assess the language; she can identify if something doesn’t fit the script, but it’s up to psychologists and other professionals to discover why.

Dr. Fadden will be moderating a Philosophers’ Café session, “Is language changing for the better or worse?” on December 5, 2012, at 7pm, at the McGill Branch of the Burnaby Public Library.

Information Mapping: models, templates, and standards

Today I attended my second webinar by Information Mapping, and it dealt with content standards, templates, and models.

A corporate content standard is a set of guidelines for everyone in an organization to follow to ensure content is written, formatted, and stored in ways that make it easy to retrieve, understand, and repurpose. Content standards facilitate team authoring, updates and revisions, compliance (particularly if the content will have to be audited), and migration to content management systems.

A content standard forms the basis for templates and model documents. Templates outline the format and content requirements for a specific type of document (fill-in-the-blank kinds of documents, good for simpler content), whereas models are basically prototype documents that serve as a standard for creating subsequent documents (good for more complex content, like engineering reports).

To create a content standard, you need to understand

  • your users (e.g., Who are they? What do they know? What do they need to know? How do they access information?),
  • your content (e.g., How complex is the content? Are there graphics involved? Do you need to include special warnings?), and
  • the technologies used to access the content (e.g., Will it be paper based or online?).

Once you’ve created the content standard, you need to deploy it. Training will be involved, at all levels of your organization, and you may have to overcome an institutional resistance to change. Encouraging the shift in mindset among writers from creating full manuals, say, to topic-based authoring will be key.

In the Information Mapping content standard, the content is modularized into blocks, which are separated visually with lines and white space. On paper, a two-column grid is used, where labels are set off to the left, allowing users to easily scan and find what they need. As a result, the lines of text are short, which reduces eye fatigue. The standard makes use of bullets to highlight important information and tables to present structured information. Information Mapping’s FS Pro software is a Microsoft Word plug-in that helps authors create content to the Information Mapping standard.

We were shown some examples of documents using the Information Mapping standard—or some modification thereof. A major advantage of the standard is its flexibility and adaptability for different types of content and presentations. One example I particularly liked was a set of job aid cards used to help workers troubleshoot problems on a light rail transit system. Each card guides the user through solving one problem and features an illustration and clear instructions. The cards are colour-coded for easy recognition and retrieval, and should a procedure change, a single card can be revised without having to replace the whole set.

This webinar reinforced many of the topics introduced in the last one I attended, and again, although it was essentially an infomercial, it offered a lot of solid suggestions for content creators and editors. What I appreciated about the notion of a content standard is that it’s more than a style guide for how to write text—it emphasizes the importance of uniformity in formatting and file naming and hierarchy for easy information retrieval. These are areas that have the capacity to vastly reduce redundancy in content creation, regardless of the size of your organization.

This webinar, along with others in the Information Mapping series, is archived on the company’s website.

Introduction to Information Mapping

On Tuesday I attended a free webinar led by David Singer, content development manager at Information Mapping. The company does clear communication consulting, training, and implementation—for a host of clients across different industries—based on a method developed by psychologist Robert E. Horn.

The method provides a systematic way for authors to create structured, modular content that’s easy for users to find and understand. Singer demonstrated, with a before-and-after exercise, how presenting information within a paragraph often buries it, whereas a table, for example, can make retrieval of certain kinds of information much more efficient.

Singer noted that although people think clear communication and plain language is all about lines, labels, and white space to break up information and make it easier to read and digest, the presentation aspect is really just the tip of the iceberg; before the information can be presented, it must be analyzed to ascertain the best way to organize it.

The Information Mapping method is a set of best practices with three major components. It uses

  • the theory of information types to allow you to analyze your material,
  • information management principles to help you organize your content in a modular and hierarchical way, and
  • units of information that allow you to present your content for quick retrieval and understanding.

Information types

Most information falls into one of six information types, as identified by Robert Horn:

  • procedure—e.g., instructions on how to do something
  • process—e.g., description of how something works
  • principle—e.g., description of a standard or a convention
  • concept—e.g., description of a new idea or object
  • structure—e.g., description of an object’s components
  • fact—e.g., empirical information

Using information types helps writers work efficiently, making it easy to see contradictions, redundancies, and gaps. Different information types are best presented in different ways, so by classifying content into information types, writers can easily decide how to present information, and users quickly recognize what they’re looking for.

Information management

Information management is based on three principles: chunking, relevance, and labelling.

  • Chunking: group information into small, manageable chunks.
  • Relevance: limit each group or “unit of information” to a single topic, purpose, or idea.
  • Labelling: give each unit of information a meaningful name.

Miller’s Law states that our short-term memory can typically store 7±2 items. By grouping information into smaller chunks and labelling each group, we can vastly increase recall. The label primes your user to expect and be receptive to the content.

Units of information

Singer demonstrated that for a lot of information out there—business information is a particular example—narrative paragraphs are inefficient at conveying an idea quickly. Information Mapping supports the notion of information blocks, each of which encompasses a single main idea. Each of these blocks might consist of sentences, a list, a table, a graphic, or multimedia, and they are labelled and visually separated from one another (by a horizontal rule, say).

These blocks are put together into an information map, maps are grouped into topics, and, finally, topics into documents. Having information in modular blocks allows for easy storage and quick retrieval, and individual blocks are easy to revise and update.
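To make that hierarchy concrete, here is a minimal sketch in Python of how blocks might nest into maps, topics, and documents. The class and field names are my own, offered purely as an illustration; they aren’t Information Mapping’s terminology or software.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Block:
    """A unit of information: one labelled idea (sentences, a list, a table, a graphic)."""
    label: str    # meaningful name that primes the reader
    content: str

@dataclass
class Map:
    """An information map: a group of related blocks."""
    title: str
    blocks: List[Block] = field(default_factory=list)

@dataclass
class Topic:
    """A topic: a group of related maps."""
    title: str
    maps: List[Map] = field(default_factory=list)

@dataclass
class Document:
    """A document assembled from topics."""
    title: str
    topics: List[Topic] = field(default_factory=list)

# Revising one block leaves every other block in the document untouched.
doc = Document(
    title="Troubleshooting guide",
    topics=[Topic(
        title="Power problems",
        maps=[Map(
            title="Device will not start",
            blocks=[Block(label="Resetting the device",
                          content="Hold the power button for ten seconds.")],
        )],
    )],
)
```

Modelling content this way is also what makes it possible, at least in principle, to revise, reuse, or index a single block on its own.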

***

Although this webinar was largely a marketing exercise for Information Mapping (the fact that the company refers to its technique as “The Method” did make me feel a bit like a cult recruit)—and, of course, I knew it wouldn’t be giving away the farm by divulging all of its secrets in a free session—there was a good deal of sensible information in it. We’ve been using narrative paragraphs for so much of our lives that it’s easy to forget they’re often not the best way to transmit information.

What I’m curious to learn more about is how each of those blocks of information is best indexed and stored for easy retrieval by writers hoping to reuse and repurpose content.

Information Mapping’s free webinars are archived here. In addition to the informational one that I attended, “Information Mapping: What Is It? How Can It Help Me?”, there are others addressing managing and reusing content and writing in plain language. Another free webinar will take place November 20, covering standards and templates.

Book review: Science in Print

After reviewing Darcy Cullen’s Editors, Scholars, and the Social Text, which offered an insightful introduction to the world of scholarly publishing in the humanities, I found myself wondering which principles and practices within that book also applied to publishing in the sciences. I was hopeful that Science in Print: Essays on the History of Science and the Culture of Print, edited by Rima D. Apple, Gregory J. Downey, and Stephen L. Vaughn (published by the University of Wisconsin Press), might shed some light on the issue.

In 2008 the Center for the History of Print and Digital Culture at the University of Wisconsin-Madison sponsored an international conference on the culture of print in science, technology, engineering, and medicine; nine of the conference sessions were chosen to be included in Science in Print, released earlier this fall. The essays include

  • Meghan Doherty’s piece on how William Faithorne’s The Art of Graveing and Etching, a manual on the engraver’s craft, reflected standards of accuracy that he also applied to engravings for the Royal Society, which in turn reinforced scientific rigour among Royal Society members;
  • Robin E. Rider’s look at the importance of typography in late-eighteenth-century and early-nineteenth-century mathematical textbooks;
  • Lynn K. Nyhart’s overview of a decades-long series of publications, all arising from a German expedition to sample plankton in the world’s oceans;
  • Bertrum H. MacDonald’s tribute to the Smithsonian Institution’s role in scientific publication and information interchange between Canadian and American scientists in the late 1800s;
  • Jennifer J. Connor’s semi-biographical piece on George M. Gould, who in the late nineteenth century edited several medical journals and advanced ideas of editorial autonomy within medical journal publishing;
  • Kate McDowell’s probe of how evolution was presented in children’s science books between 1892 and 1922;
  • Sally Gregory Kohlstedt’s look at how textbooks and teacher resource books approached the burgeoning interest in nature study in the early twentieth century;
  • Rima D. Apple’s investigation into the influence of various publications, particularly government dietary guidelines, on fostering the primacy of meat in the American diet;
  • Cheryl Knott’s comparison between the reaction to Stewart Udall’s environmental treatise, The Quiet Crisis, published in 1963, and the reception to the book’s twenty-fifth-anniversary edition, published in 1988.

Being a bit of a math and typography nerd, I found resonance in Robin Rider’s essay, in which she says,

The visual culture of mathematics, done well, offers “enormous advantages of seeing,” as Edward Tufte would say. Readers learn much from the way mathematics is presented in type. Good typography highlights and reinforces ideas; indifferent typography (or worse) obscures ideas and stymies the reader. (p. 38)

—particularly since that last sentence applies just as well to non-mathematical texts.

Although not addressed as a specific topic in the book, the issue of the motivation behind academic publishing does rear its head in more than one essay. Both Lynn Nyhart and Jennifer Connor remark that the contributors to scientific and medical journals are generally not paid for their contributions. Writing about medical editor George M. Gould, Connor says,

After [publisher] William Wood of New York refused him permission that same year to reprint articles from its medical journals in his Year-Book—a digest of material that reached, according to Gould, thousands of readers—he distributed a circular about the relations between the medical profession and “lay publishing firms of medical journals.” Publishers do not pay physicians for their contributions, he noted, although they presumably profit from them; and, in this case, no other publisher—even those who do pay contributors—had objected to reprinting extracts. But above all, this publisher’s decision was wrong because it prevented the dissemination of medical knowledge. (p. 116)

Lynn Nyhart argues that publishing itself motivated scientific progress:

Maintaining the commitment to publish, I would suggest, was in fact what made these projects successful and important as science. (Conversely, the lack of a strong commitment to publishing following many voyages often resulted in the collected specimens languishing in boxes for years without ever being analyzed.) (p. 67)

Science in Print also looks beyond the academic realm at trade and popular science publishing, and the closing chapter by Cheryl Knott makes reference to Priscilla Coit Murphy’s book What a Book Can Do: The Publication and Reception of Silent Spring, saying

According to Murphy, it is the book (as opposed to the author) that launches social and political movements as it takes on a life of its own in ways the author and publisher could not have foreseen. (p. 201)

Knott reinforces this concept by showing how the evolution of the environmental movement and a changing political climate affected the success of The Quiet Crisis, an environmental book by former U.S. Secretary of the Interior Stewart Udall. It became a best-seller after it was first published in 1963 but saw a tepid reception when it was expanded, updated, and reissued in 1988. Knott discovered that readers often cite and recommend the original edition, even if they’d clearly read the newer one. She notes, “Such mix-ups indicate that many readers do not make the careful distinctions between editions that collectors, bibliographers, and librarians make.” (p. 217) In my experience, although publishers are aware of this reality, they are sometimes in denial about it as they try to find new ways of repackaging and marketing existing content. How do you capitalize on the cachet of a successful original edition while offering readers the new information they need?

***

Although Science in Print did offer me some new perspectives and gave historical context to the development of scientific publishing, particularly in North America, I have to say that I didn’t enjoy the experience of reading the book as much as I would have wanted, for a variety of reasons. I’ve been struggling for weeks to write a cohesive review of this book (and some may remark that I’ve failed), likely because I found that Science in Print itself lacks cohesion. I’m no stranger to reading and reviewing anthologies; despite being an assembly of contributions from different authors, they must still have an internal rhythm and logic—like a good album put together from a collection of singles. Science in Print takes too much of a scattergun approach, attempting to present numerous topics ostensibly connecting science and print culture that are really quite disparate. Perhaps a more effective approach would have been to select more of the conference sessions to publish but to group them by topic or genre and issue each of these as a separate volume, which would have allowed for more meaningful comparisons among contributors’ viewpoints.

And although I understand that scholarly presses generally don’t do much substantive editing, this is one instance in which a manuscript really could have benefited from a skilled stylistic editor’s hand. Take, for instance, this opening to one of the essays:

Educators in the early twentieth century faced the dilemma of how to build the skills of teachers so that they could teach directly from nature in a new progressive pedagogy emerging in the late nineteenth century known as nature study. (p. 156)

Most stylistic editors would be able to offer at least a couple of suggestions to make that sentence more engaging and approachable while conveying exactly the same information. (I should say that I don’t mean to pick on this one contributor—whose content was otherwise pretty interesting—I just wanted to offer an example.)

Finally, one aspect of the book that may have contributed to my discomfort while reading is the design (ironic, given Robin Rider’s astute analysis of the importance of good typography): the pages are dense, the type is small, and the lines are long. Robert Bringhurst, in The Elements of Typographic Style, writes, “Anything from 45 to 75 characters is widely regarded as a satisfactory length of line for a single-column page set in a serifed text face in a text size… A line that averages more than 75 or 80 characters is likely to be too long for continuous reading.” (v. 2.4, pp. 26–27) Science in Print definitely falls into the latter category. I would suggest that readers try the ebook and reflow the text to a comfortable line length, but it appears that the only available ebook version is a fixed-layout PDF. I haven’t read any other books published by University of Wisconsin Press, but if this book is based on a standard design template, the press may benefit from revisiting that template and revising it for readability.

Salute to a fallen Canadian cultural institution

I was going to make this week’s post a self-indulgent look back at the past year on my bloggiversary (as the kids call it), but given the sad news that D&M Publishers has filed for creditor protection, I wanted to say a few words about the company—and the people—that made my years in book publishing so rewarding.

I started at D&M during my Master of Publishing degree as a lowly intern (though pretty much everyone there did their best not to make me feel lowly at all), doing all manner of random tasks, from sending out review copies and archiving editorial material to staffing the front desk while the receptionist was away. Getting to spend time in several departments gave me a solid appreciation for the effort everyone was making. It really was, as Brenda Feist, sales and marketing assistant at the time, said, “amazing to see how many people it takes to make a book happen.”

My main tasks, though, were editorial—inputting proof corrections, proofreading books and marketing materials, and a bit of indexing. I learned from the best: Nancy Flight and Lucy Kenward patiently showed me the ropes, insisting on the highest standards and gently but firmly nudging me to improve myself. From Managing Editor Susan Rana I learned the best practices in book production as I watched her shepherd project after project through multiple hands and to tight deadlines. The company’s art department was also an inspiration: headed by Peter Cocking, D&M’s team of designers produced gorgeous books that routinely swept the Alcuin Awards.

During my internship, I embarked on a project to produce an informational handbook for authors to guide them through the editorial process, explaining the steps and the people involved in transforming a manuscript into a finished book. Little did I know that working on the handbook would sow the seeds of my interest in editorial efficiencies and systems. Later I would carve a niche role within the company of improving documentation and communication with authors and freelancers and developing quality-control methods to continue the company’s tradition of high editorial standards.

D&M offered me a contract to stay on once my internship was over, and I gladly accepted. There I was exposed to brilliant, inspiring authors and to books on a wide-ranging array of topics, from Aboriginal art to Vancouver architecture, from mouth-watering cookbooks to eye-opening biographies of influential Canadians, from history to current affairs and public policy, from environment to sport. I wish I’d retained more of what I read over those years.

To Scott McIntyre, thank you for all you have done. Thank you for trusting me with some of your best authors, thank you for recommending me to your friends and colleagues once I decided to strike out on my own, and thank you for giving me the opportunity to learn and develop alongside some of the best editors in the country. I can only imagine how heartbreaking this development must be—perhaps it feels like the loss of a child or the loss of a legacy. But please know that your fervent passion for and enormous contributions to Canadian culture endure—in the fine books that you’ve published, in the authors you’ve fostered and encouraged, in the people who’ve been able to learn from you by working for you.

What I value most from my time at D&M are the relationships I’ve forged with some of the smartest, funniest, hardest-working people I’ve ever met. To my good friends at D&M—who are too many to name here—please stay in touch. Now that I can no longer come into the office for the occasional visit, I’ll try to do my part and be better at reaching out in other ways.

Sorry; I guess this post did end up being self-indulgent after all. I didn’t think I would be as emotional about this turn of events as I am. I feel deeply for all of D&M’s employees and authors, and I’m here to offer my help wherever and whenever it’s needed.

Look out—the market’s about to be flooded by some amazingly talented people.

STC networking event, November 20

At Wednesday’s EAC-BC meeting, Mellissa Ruryk, president of the Society for Technical Communication’s Canada West Coast Chapter, invited all of us to attend a session featuring a panel of recruitment specialists discussing networking, followed by about thirty minutes of speed networking. The event will begin at 7 pm on November 20 at the YWCA on Hornby Street. More information here.

Ebooks

Lara Smith gave a captivating and hugely informative presentation about ebooks at Wednesday’s EAC-BC meeting. Having gone to Greg Ioannou’s conference talk about e-publishing, I wondered if there’d be a lot of overlap in the content of the two talks. There wasn’t—and after the meeting BC Branch Chair Peter Moskos suggested to me that Lara probably had enough material to fill a full seminar.

Ebooks are often thought to be electronic versions of print books, Lara began, but many titles today are just born digital. Ebooks come in two main formats: PDF and EPUB. The ebook PDFs aren’t just your regular PDFs—they’re Universal PDFs, which are optimized for screen viewing. Chapters are bookmarked, the table of contents is linked, URLs are live, and the files include some metadata.

In the early days of ebooks, there were many different ebook formats; every e-reader developer wanted to create a device with a proprietary format, which led to a very fractured market. The International Digital Publishing Forum set out a standard known as EPUB—a set of rules that everyone could follow to build an ebook. All devices now have the capacity to read EPUB files. We’re not sure what the future will be for EPUB, though, because device manufacturers still like to add on proprietary bells and whistles to their EPUB files.

EPUBs can have fixed layouts or be flowable. Fixed-layout EPUBs look a bit like PDFs, but they have a lot more capability behind the scenes (e.g., accessibility features like text to speech). They’re much more complicated to create. Fixed-layout EPUBs are good for visual books, such as coffee-table books or cookbooks, but they’re really meant to be read on a tablet device. Lara demonstrated how impractical it is to read a fixed-layout EPUB on a smartphone.

By contrast, flowable EPUBs can be read on a phone—not to mention e-readers and browsers—since the type can be enlarged as needed. Flowable EPUBs make up the bulk of the ebooks out there.

An EPUB, Lara explained, is really just a ZIP file. Change the .epub extension to .zip, and you can decompress the folder to see what’s inside. There may be a folder for images, and the text is broken up into chapters, each an HTML file. There’s a style sheet that controls how the tagged text looks to the human reader. She’s found that the best strategy to ensure that the ebook looks good on all devices is to keep styling to a minimum. “We’re not trying to replicate the print book,” she said. “We really have to reconceptualize it. We can’t control type in the same way.”
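If you’d rather not rename the file, a few lines of Python will show the same thing, since the standard library reads ZIP archives directly. This is only an illustrative sketch, where sample.epub stands in for whatever EPUB you have on hand.

```python
import zipfile

# An EPUB is an ordinary ZIP archive, so Python's standard zipfile module can open it.
with zipfile.ZipFile("sample.epub") as book:  # placeholder filename
    for name in book.namelist():
        print(name)  # typically chapter HTML files, an images folder, CSS, metadata files
    # By convention, the first entry is an uncompressed file named "mimetype"
    # that identifies the archive as an EPUB.
    print(book.read("mimetype").decode("utf-8"))  # normally "application/epub+zip"
```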

Lara works mostly with books that are destined for both print and digital, so she exports from InDesign. But she notes that you can build an EPUB from scratch in a text editor, and there’s conversion software that will transform Word files into EPUBs (although they don’t look very good). The simpler your original files, she said, the better the EPUB will look. (For example, never justify your text; on many devices, the text will look hideous and gappy.)

When publishers convert books to EPUBs, they have the option of using a conversion service, which is inexpensive and may be appropriate for converting large numbers of files (e.g., the publisher’s backlist), but the results can look pretty rough. Another option is in-house conversion, which allows for more control over quality, style, and timelines but requires an investment in a dedicated individual or team of people who must learn how to use the software and prepare the files for the market. Editors working with individual authors to create single ebooks may be able to dedicate more resources to fine-tuning the EPUBs for specific devices and taking full advantage of enhancements like audio and video.

Lara also mentioned vendor conversion tools, including iBooks Author, Kindle Direct Publishing, and Kobo Writing Life, which are free tools to use but restrict you to selling within those particular streams, and DIY options (what she referred to as “device-agnostic options”), such as Smashwords, PressBooks by WordPress, and Vook, which charge for creating the ebooks, whether through an upfront fee or through royalties. She noted that all of these options have a learning curve and a real cost.

Once you’ve got your ebook made, you then have to sell it. How are people going to find it? The answer is metadata—information attached to your book including title, author, publisher, ISBN, price, description, author bio, reviews, etc.—that will populate distributors’ and retailers’ databases. Metadata is key to discoverability.

Lara then moved on to the contentious issue of digital rights management (DRM), which puts a lock on EPUB files and prevents copying, editing, and reselling but also limits legitimate sharing of books and device switching. It pits readers’ freedoms against authors’ and publishers’ right to profit. The debate seems to be heading in two directions: digital media may be licensed to readers (where they can read but don’t actually own the book), or publishers may decide not to use DRM at all. (O’Reilly Media, in fact, has declared that it won’t be using DRM on any of its books.)

Another issue facing publishers is that EPUBs have the capability to incorporate a variety of assistive technologies, such as text to speech, alternative text, phonetic text, media overlays, dyslexic reading aids, conversion to braille, etc., and international accessibility organizations are pushing publishers to include all of these features. Of course, for the publisher, doing so means a lot more investment into editorial and production resources.

Lara was careful to note the distinction between apps and ebooks. Apps are self-contained applications, and they can be interactive and include all sorts of multimedia features. There are book apps—kids’ books work really well as apps, because they don’t have a lot of content but can support a lot of interactivity. Apps take more development than an ebook, and you need to involve a programmer.

So what are the editorial concerns surrounding e-publishing? First, the publisher must have the digital rights—including for the images that are to appear in the book. Next, the publisher should look at the content and figure out the best way to present the book (fixed or flowable) and decide whether to add enhancements.

Challenges for ebook publishers include elements like sidebars, which you want to place at section or chapter breaks so that they don’t interrupt the flow of the text. Lara noted that ebooks are read in a linear way; it becomes tedious to have to skip over what could turn into pages of sidebar content to get back to the main text, especially if you’re reading on a small screen. Footnotes are also a problem, because the foot of a page is no longer well defined. Indexes are similarly challenging. (See my summary of Jan Wright’s discussion of ebook indexes from this past spring’s ISC conference.)

On the flip side are the many advantages that ebooks offer. For example, endnotes can be linked, as can in-text references. Photo sections can go anywhere within the book, not necessarily just between printed signatures. You can make URLs in the book (and the references, especially) live, and you can add audio or video enhancements. Finally, there are no page limits, and you can really play around with the concept of what a book is. Lara warns, however, that the more fun stuff you put in, the greater the risk that something will break, and broken links or videos, for example, can frustrate readers.

Lara’s talk was phenomenal. I learned a huge amount, though I will probably eventually have to resign myself to the fact that she knows more about e-publishing than I ever will.