Blog

EAC Conference 2012, Day 2—The new libel defence: responsible communication

Ian Stauffer, a specialist in civil litigation, gave an overview of defamation and its defences, including a relatively new defence—responsible communication—which the Supreme Court of Canada recognized in late 2009.

Defamation was defined in the 1950 case of Willows v. Williams as follows:

A defamatory statement is one which has a tendency to injure the reputation of the person to whom it refers. It lowers him or her in the estimation of right-thinking members of society generally and causes him or her to be regarded with feelings of hatred, contempt, ridicule, fear, dislike or disesteem.

To make a case for defamation, one must prove

  • the words were published or spoken to a third party
  • the words referred to the plaintiff
  • the words were defamatory

Stauffer said that in the world of defamation, nothing is definitive. It’s hard to predict how much, if anything, a client might receive in a defamation case, and there is no scale for awarding damages. Pursuing a defamation case is also risky because the offending words are likely to be republished, and more will be said. “It’s not easy to put the genie back in the bottle,” Stauffer said.

Usually, he explained, the client will initially request an apology and retraction. Whether apologies are issued and how they are worded can affect the damages potentially awarded later on.

Defamation can be classified as slander, which is usually spoken and more ephemeral, or libel, which is typically written or otherwise recorded. Traditional defences to libel are truth (i.e., justification), privilege (absolute or qualified), and fair comment. Now there is a fourth defence: responsible communication.

Absolute privilege refers to remarks made in a chamber such as the House of Commons or Senate; qualified privilege includes performance reviews, letters of reference, etc. Fair comment refers to a comment made in good faith, without malice, on a matter of public interest. It must be identifiable as a comment rather than a statement of fact.

Responsible communication refers to reportage on matters of public interest in which the publisher has been diligent in verifying an allegation and the reliability of the source. A jury in a case in which responsible communication is used as a defence would also weigh whether the plaintiff’s side of the story was sought out and whether the inclusion of the defamatory statement was justifiable. This new defence lifts the chilling effect on reporters and frees them to write about potentially contentious matters of public interest.

Stauffer handed out copies of a paper he authored, “Defamation, responsible communication and cyberspace,” which elaborates on the above issues, as well as their application to Internet-related cases, and offers examples and specific case studies.

EAC Conference 2012, Day 2—LGBTQ: getting it right

Luna Allison, a queer journalist, editor, playwright, and performer, offered her perspectives on some of the do’s and don’ts when writing about the LGBTQ communities, in the hopes, as she says, of “building knowledge and cultural competency.” Mainstream media approaches to LGBTQ issues can come off as ignorant and offensive; the key is to develop the discipline to dial back our curiosity and focus on the actual issues.

Don’t

  • use a person’s sexuality or sexual orientation in combination with their occupation (e.g., “gay MP”), unless they’ve explicitly stated that’s how they identify
  • make an assumption about a person’s gender based on how they look or sound
  • ask about a person’s surgical status
  • use a person’s pre-transition name—this is rude and exposing. And never use the term “tranny” to refer to a transsexual or transgendered person

Do

  • ask how a person identifies
  • ask what pronoun a person prefers. If you can’t ascertain this, try structuring your sentences without using pronouns (pluralizing often helps) and use gender-neutral terms
  • research to understand correct cultural usage and cultural history of particular terms (e.g., “butch,” “femme,” “queer”)
  • understand that transsexual individuals may change how they identify post-transition
  • refer to the LGBTQ communities in the plural; even within the “gay community,” for example, there are multiple communities

Allison also clarified the distinction between transsexual (someone who feels born in the wrong body and wants to transition) and transgendered (an umbrella term often used to describe someone who may be transsexual, genderfluid, genderqueer, or gender neutral). She emphasized the need to respect someone’s gender identity, which can be hard in a culture where the male–female dichotomy is so deeply engrained (for example, the first question that usually comes up when finding out someone has had a baby is, “Is it a boy or a girl?”).

In sensationalist stories in mainstream media, a lot of dormant assumptions tend to bubble up, Allison says, referencing the Luka Magnotta case in particular. His sexuality was often mentioned in close proximity to his alleged criminal activities, and journalists and editors have to be sensitive to the impression such proximity could leave on readers. She also cautioned that transgendered individuals are often characterized as being “in disguise” or otherwise trying to deceive. It’s this feeling of being fooled that has led to a lot of violence against transgendered people (and is why there is an international transgender day of remembrance).

As writers and editors, we have to be aware of the perceptions that our work might generate in our readership and the misconceptions it might feed.

EAC Conference 2012, Day 1—Whose words are these anyway? Translating, editing, and avoiding the Gallicism trap

Barb Collishaw and Traci Williams jointly presented a session about translation, with Collishaw focusing on the similarities and differences between editing, translation, and revision and Williams offering some insight into Gallicisms, particularly in Quebec English.

Collishaw works at the Parliament of Canada, helping to produce the Hansard, which, of course, must be translated so that it is complete in both official languages. Translators translate text (as opposed to interpreters, who translate speech) from a source language transcript into the target language in a way that accurately reflects the content, meaning, and style of the original. Revisers—who work exclusively in house—then edit the translated text.

Drawing upon the EAC’s Professional Editorial Standards, Collishaw compared the roles of translators and revisers to the role of an editor, noting that translators use virtually all of the stylistic editing, copy editing, and proofreading skills listed in the PES. Translation requires an eye for detail, a good command of both source and target language, and an understanding of where and how to check facts.

Collishaw emphasized the importance of keeping the audience in mind and of making room in the production schedule for translation and revision. Editors and managers sometimes forget that translation takes time, and because it comes in near the end of the process, translators often end up being under severe deadline pressure.

Translators get to choose the words they use, within the range of meaning of the source language words, so awkward or offensive terms can be smoothed over. However, this may not be what the author intended. Collishaw gave the example of “unparliamentary language”: sometimes translators soften such words or phrases, but this may not be wise, since an MP may object on a point of order later on, and revisers then have to go and restore the “mistake” to preserve logic.

Fact checking can be tricky, since translators don’t often get to query the author and ask people what they meant or how to spell someone’s name. Translators use tools such as Termium Plus, a terminology data bank, and TransSearch, a bilingual concordancer of past translations, to help them in their work, and are expected to compile glossaries. After they finish translating, translators are expected to proof their own work, checking against the source language.

Revisers check again, making sure that nothing has been left out and that meaning hasn’t been inadvertently changed, paying particular attention to details like numbers and dates. They also edit for style, imposing consistency on text from different translators. (To complicate matters, the House and Senate have different style guides, and revisers have to keep it all straight!)

I asked Collishaw if translators or revisers get to see transcripts of the interpreters as a reference, and she laughed, saying, “No, but I wish we would!” It seems that what the interpreters say isn’t transcribed, and the translators and revisers don’t have access to it.

***

Traci Williams is originally from Ontario but now works as a translator and editor in Quebec. She became fascinated by the influence of French on the English language and began to document Gallicisms—words or terms borrowed from French.

Originally, English was a rather limited language, composed primarily of one- or two-syllable words, Williams explained. The first Gallicisms appeared after the Norman Invasion in 1066, initially in law, warfare, and church language; afterwards, they began to pervade clothing- and food-related vocabulary (as seen in animals versus their meats: “pig” vs. “pork,” “cow” vs. “beef,” “deer” vs. “venison”). Between 1100 and 1500, English absorbed about 10,000 French words. Before the seventeenth century, French words appearing in English were anglicized (e.g., chimney, change, charge); afterwards, hints of the French were retained (e.g., chevron, champagne, chaperone).

In Quebec, the first major wave of English speakers was made up of British loyalists; by 1841, English speakers of British descent were the largest population in Montreal. When rural French Quebeckers began moving to Montreal in the 1860s, they were expected to learn English, which, until 1975, was considered the language of prestige by both the French and the English. During that period, a steady stream of Anglicisms seeped into French. Yet after the PQ was voted in, in 1976, French began to influence English. At first, Gallicisms appeared in colloquial speech, but today educated professionals will use them without even realizing it. Between 1990 and 1999, the number of Gallicisms tripled, and Oxford University has now officially recognized Quebec English as a distinct dialect.

Some Gallicisms are perfectly acceptable: “encore,” “fiancé,” and “en route” are examples. Cooking, dancing, and law feature many Gallicisms. And English has often retained words of both Germanic and French origin, with slightly different connotations (e.g., “ask” vs. “question,” “holy” vs. “sacred”), or has kept nouns of Germanic origin but used the French adjectives (e.g., “finger” but “digital,” “book” but “literary”). What editors need to be aware of are the unacceptable Gallicisms that arise from false cognates: words that are formally similar to words in the native language but have different meanings (e.g., “animator” rather than “instructor,” “conference” rather than “lecture,” “manifestation” rather than “demonstration”). The delicate aspect of editing Quebec English for an audience outside Quebec is that an author, perfectly fluent in English, may be unaware that he or she is inappropriately using Gallicisms.

Williams emphasizes the importance of continuing to read, read, read. She suggests reading sources of English outside of where you live to make sure that you have a solid perspective on language quirks that might be a local peculiarity and may not translate to a wider audience. Williams has started a newsletter about Gallicisms and related topics. Contact her via Semantech Communications to sign up.

EAC Conference 2012, Day 1—E-publishing essentials for editors

Greg Ioannou, president of EAC and publisher of Iguana Books, gave an overview of some of the things editors should know about ebooks, beginning with a bit of history: the first ebook was a computerized index of Thomas Aquinas’s works and was released in the 1940s. In the 1960s hypertext was used to format ebooks so that they could be read using different window sizes and monitors on IBM mainframes. The first ereader was Sony’s Data Discman, which displayed ebooks stored on CD.

Although there are hundreds of types of e-readers, many of them with proprietary file formats, the most common formats are EPUB, EPUB2, MOBI, and PDF. Most ebooks are basically just HTML files with metadata that helps bookstores categorize them (e.g., title, author, description, ISBN, publication date, keywords, etc.). The editor [ed—or perhaps an indexer?] is in the best position to know what keywords should be included in the metadata file.
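Those metadata fields typically live in the EPUB package’s .opf file as Dublin Core elements. Below is a minimal sketch of how such a block could be generated; the title, author, ISBN, and other values are invented placeholders, not taken from any real book.

```python
import xml.etree.ElementTree as ET

DC = "http://purl.org/dc/elements/1.1/"
OPF = "http://www.idpf.org/2007/opf"
ET.register_namespace("dc", DC)

# Build the <metadata> block of an EPUB package file; every value here
# is an invented placeholder.
metadata = ET.Element("{%s}metadata" % OPF)
for tag, text in [
    ("title", "A Sample Book"),
    ("creator", "A. N. Author"),
    ("identifier", "urn:isbn:9780000000000"),
    ("date", "2012-06-01"),
    ("subject", "editing"),  # keywords go into dc:subject elements
    ("description", "A short description for the bookstore listing."),
]:
    ET.SubElement(metadata, "{%s}%s" % (DC, tag)).text = text

print(ET.tostring(metadata, encoding="unicode"))
```

Bookstores read these elements when listing a title, which is why the keyword choices the editor (or indexer) makes here directly affect discoverability.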

At Iguana, the creation sequence is as follows:

For simple books

  • edit and style in Word
  • create PDF from Word (Iguana has discovered that they have to produce at least one print-on-demand copy for the author or, more often, as Ioannou says, the author’s mother)
  • create EPUB file using Sigil
  • create MOBI file using Calibre

For complex books

  • edit and style in Word
  • create PDF from InDesign
  • create EPUB file from InDesign
  • clean up EPUB in Sigil
  • create MOBI file using Calibre
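In both workflows the last step, producing the MOBI, can be scripted: Calibre ships a command-line converter called ebook-convert that infers the output format from the target file’s extension. A minimal sketch, with placeholder filenames:

```python
# Sketch: scripting the final EPUB-to-MOBI step with Calibre's
# ebook-convert command-line tool. Filenames are placeholders.
def build_convert_cmd(source, target):
    """Build the argv list for Calibre's ebook-convert tool."""
    return ["ebook-convert", source, target]

cmd = build_convert_cmd("book.epub", "book.mobi")
print(" ".join(cmd))
# To actually run it on a machine with Calibre installed:
# import subprocess; subprocess.run(cmd, check=True)
```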

Once you’ve created your files, Ioannou said, you should actually look at the ebook on the device(s) it’s destined for; looking at it on just the computer can be deceiving. Right now InDesign’s EPUB export doesn’t actually work very well, so the outputs have to be cleaned up quite a bit.

Ioannou then described the many devices on which ebooks could be read, including tablets, phones, computers, Kindles, and other e-readers (e.g., Nook, Kobo, Sony Reader, etc.). Only the Kindles can read MOBI files, whereas the other devices can all read EPUB files. All can display PDFs, although only tablets, smartphones, and computers can display colour and play videos.

Since EPUB/MOBI files are reflowable and may be read on very narrow devices like a smartphone, editors should keep the following in mind when editing for an ebook:

  • Make sure that there are spaces before and after dashes.
  • Opt for hyphenating a compound rather than using a closed compound; however, avoid hyphenation when it could lead to odd line breaks (e.g., choose “ereader” over “e-reader”).
  • Make sure all quotes are smart quotes; this is relatively easy to do in Word but much more difficult to code in Sigil or Calibre.
  • Books without chapters don’t work very well as ebooks—the large file size can significantly slow down an e-reader. If possible, break a book down into chapters of ideally between 3,000 and 5,000 words. This structure also makes navigating an ebook much easier.
  • As for formatting, keep it simple. Tables and columns look terrible on an e-reader, and images won’t display in some older e-readers. Most e-readers are black and white only, and many older e-readers can’t handle large files (e.g., files with embedded images and videos).
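On the smart-quotes point: if a file does reach Sigil or Calibre with straight quotes, a rough search-and-replace pass can be scripted. This is a simplified sketch only; real ebook text needs care around markup, nested quotes, and leading apostrophes.

```python
# Rough straight-to-smart-quote pass for plain (markup-free) text.
# Simplified illustration, not production-ready.
import re

def smarten(text):
    # A double quote at the start of the text or after whitespace or an
    # opening bracket is an opener; any double quote left over is a closer.
    text = re.sub(r"(^|[\s(\[])\"", "\\1\u201c", text)
    text = text.replace('"', "\u201d")
    # Same logic for single quotes; leftovers are closers or apostrophes.
    text = re.sub(r"(^|[\s(\[])'", "\\1\u2018", text)
    text = text.replace("'", "\u2019")
    return text

print(smarten('She said, "It\'s ready."'))  # → She said, “It’s ready.”
```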

Ioannou noted that e-readers are primitive machines and that the technology’s rapidly changing. His caveat: “Most of what I say here will not be true a year from now, and practically none of it will be true two years from now.”

EAC Conference 2012, Day 1—Making the Language Portal of Canada work for you

The Translation Bureau launched the Language Portal of Canada in 2009 as a gateway to allow everyone free access to the translation tool Termium, a terminology and linguistic data bank in English, French, Spanish, and Portuguese. Despite its translation roots, however, the Language Portal is packed with news, tools, and references that appeal to a much wider audience of editors, writers, educators, and anyone interested in language.

The Language Portal exists in both French and English, but the sites aren’t merely translations of one another. There are different types of language problems in French and English, so although there is parallelism in the tools available to users on the two sites, the content is different.

Robin Kilroy creates and curates much of what’s on the Language Portal, and she took EAC conference attendees on a tour of the site.

Headlines

These link to language-related stories gleaned from external sources. Two headlines are posted each week and then are archived for a year.

My Portal

This is broken down for readers “At school,” “At work,” and “At home,” which link to specific resources for students and educators, professionals who have to write or edit (as Kilroy says, so many people now are “functional writers” who have to write for their jobs, though they may not consider themselves professional writers), and the general reader, respectively.

Resources include “Linguistic Recommendations and Reminders,” which offers tidbits of advice about grammar and style.

From Our Contributors

The Language Portal’s partner organizations (including the Editors’ Association of Canada) contribute language-related articles for this section, which are then edited and translated in house. They are all archived by organization name.

Discover

“Discover,” on the left-hand sidebar, and “Discover Coast to Coast,” at the bottom centre, link to the same resources but are organized differently. These are a collection of links, external to the Translation Bureau, to such resources as dictionaries and information about language training and language professions.

Well Written, Well Said

On the left-hand sidebar, this section links to Termium, Writing Tools, Gateway to English, and Quizzes (on everything from spelling and punctuation to Canadian authors and proverbs). Editors may find Writing Tools particularly useful, because it provides access to such resources as The Canadian Style (much more up to date than the print edition) and Peck’s English Pointers, among many others.

EAC Conference 2012, Day 1—Opening keynote address

Charlotte Gray, award-winning biographer and historian, kicked off the EAC conference with her thoughtful—and thought-provoking—keynote address. She praised the editor for saving her “from my own hideous mistakes.” Although she hears some writers complain that editors took out their voice, she says she recognizes that “my voice can either be a strength or a weakness.” The editor, she says, is “not only the first professional reader—but the best,” because he or she aims to help and support the writer, whereas the second professional reader—the reviewer—often approaches the text with the opposite goal.

As a writer of popular history, Gray also reflected on the malleability of history, acknowledging that words are themselves living artifacts. “In shaping history,” she wondered, “am I pulling it out of shape?” Memoirs are often assumed to be nonfiction until proven otherwise, she said, whereas John Updike’s view was that “biographies are really just novels with indexes.” She went on to describe how carefully and rigorously she seeks out primary sources for her work, walking the fine line between imagining and inventing as she uses novelistic techniques to flesh out a historical narrative.

Gray described the research and writing process for her book Gold Diggers: Striking It Rich in the Klondike, in which she yet again sheds light on the role of women in Canadian history, this time in Dawson—a setting she called a “pioneer Petri dish.” The book focuses on six people, including two women—a journalist and a businesswoman—and Gray colourfully recounted the “war dance” that she did at Library and Archives Canada every time she found solid evidence that her characters had actually met, allowing her to weave together their stories into a coherent narrative.

In an age where we’re constantly bombarded with information of all sorts, readers are generally less trusting, but that’s not necessarily a reflection of the veracity or integrity of sources we find today. History, Gray concluded, has always been—and likely will remain—malleable.

ISC Conference 2012, Day 2—Hands-on ebooks

David Ream and Jan Wright once again took to the stage to elaborate on indexing of digital files. Ream said that there aren’t a lot of usability studies that compare search versus indexing. BNA’s “Using Online Indexes” is one, but it would be interesting to get more universities involved in this kind of research to generate more data.

Ream then gave an overview of EPUB 3.0. It’s open source, is based on existing standards—such as XHTML, CSS3, JavaScript, and SVG—and was created ahead of the industry (i.e., tools and reading systems), meaning that we can all avoid costly format wars. It provides navigation and packaging information and incorporates global language support (i.e., for languages that are read left to right, right to left, or vertically). It is backwards compatible with EPUB 2.0 and has modular components and working groups.

EPUB 3.0 files will have rich metadata—Dublin Core for publication information, ONIX for supply chain information, and MARC for libraries. The metadata will be key to a digital file’s discoverability—and hence to its sale. Implications for indexers include the following:

  • no page or line limitations
  • potentially having to index rich media (e.g., time codes)
  • potentially having to index interactive ebook features (scripts)
  • potentially having to supply semantics of headings and locators (e.g., show only the statutes, show only the people, etc.)
  • being able to provide index data in multiple ways
  • cumulative indexing—of series, mashups, etc.
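The “semantics of locators” idea can be pictured as index entries that each carry a type, letting a reading system filter what it displays. A toy sketch with invented entries and type names:

```python
# Toy sketch of semantically tagged index locators: each locator carries
# a type, so a reading system could show, say, only the statutes.
# Entries, types, and filenames are invented for illustration.
index = [
    ("Evidence Act", "statute", "ch02.xhtml#p004"),
    ("Atwood, Margaret", "person", "ch03.xhtml#p017"),
    ("Criminal Code", "statute", "ch05.xhtml#p112"),
]

def only(kind):
    """Return (term, locator) pairs whose semantic type matches."""
    return [(term, loc) for term, t, loc in index if t == kind]

print(only("statute"))
```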

Jan Wright then explained the workflow for inserting anchors into EPUBs at the paragraph level. She and Olav Martin Kvern developed scripts that create identifiers for individual paragraphs in InDesign, which can then be used as part of the locator in standalone indexing programs like CINDEX or SKY Index to create a hyperlink from an index locator to the paragraph. For now, this is the most realistic way of creating a paragraph-level index that will work for both print and ebook, because InDesign strips out any embedded index entries when it exports to EPUB. The Digital Trends Task Force is talking to Adobe separately about this issue, but a fix may be some time away.
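The effect of those paragraph-level identifiers can be sketched in miniature: once each paragraph has a stable id, an index locator becomes a hyperlink target. The ids, filename, and entry below are invented for illustration; the real workflow relies on InDesign scripts plus a standalone indexing program.

```python
# Toy sketch: turning paragraph-level ids into a hyperlinked index
# entry in an EPUB's XHTML. All ids and names are invented.
def index_entry_html(term, ids):
    """Render one index entry whose locators link to paragraph anchors."""
    links = ", ".join(
        '<a href="text.xhtml#%s">%s</a>' % (pid, pid) for pid in ids
    )
    return "<p>%s, %s</p>" % (term, links)

print(index_entry_html("anchors", ["p0012", "p0347"]))
```

In a real index the link text would be a page number or paragraph label rather than the raw id, but the anchor mechanism is the same.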

Wright and Ream then allowed conference attendees to play around with various devices, from the Kindle, Kobo, and Nook to the iPad, to see the current state of the art of ebook indexing.

ISC Conference, Day 2—To award or not to award?

The Indexing Society of Canada is planning to establish a new award to recognize excellence in indexing. Indexing societies elsewhere in the world have their own awards, but Canadians aren’t eligible for most of them. At the ISC conference, Max McMaster of the Australian and New Zealand Society of Indexers, Jan Wright of the American Society for Indexing, and the ISC’s Christine Jacobs discussed the considerations that should go into the award criteria and judging process.

McMaster, a three-time winner of the ANZSI medal for outstanding indexing, has also been a judge for the medal. The judging panel consists of three experienced indexers, but they will consult an outside expert if the subject matter of a submitted work is too esoteric for them to understand. He suggests that the ISC be careful not to limit the types of works that can be eligible for the award and to provide a certificate or plaque to the publisher of the winning work as well—good PR for indexing. Publishers should be encouraged to submit their books for consideration, but more often it will be indexers who submit their own work. McMaster warns indexers that they’re at the mercy of the book’s editor and typesetter: he had once submitted an index he thought was award worthy before realizing that the publisher had inadvertently removed all of the index’s indentation, severely compromising the final index’s usability.

Jan Wright won the ASI’s H.W. Wilson Award for her index to Real World InDesign CS3. She hasn’t been a judge but has spoken to the judges on the panel that granted her her award. Submissions for the award go to the ASI’s chapters around the country, and they are anonymized. Cheryl Landes remarked that many indexers would be willing to pay a high entry fee to submit their work if it meant that they would receive feedback on their submissions.

Christine Jacobs gave an outline of what the ISC’s award might look like. The awards committee plans to announce the award at the next AGM and issue a call for nominations. They hope to accept submissions in either English or French and both print and online works. Currently there’s no cash prize attached to the award, but the awards committee is taking a step-by-step approach, and it may be part of the award later on. Jacobs emphasized how the award can be a form of validation for a winning indexer and that it would help raise the profile of the profession and encourage high standards.

ISC Conference 2012, Day 2—New standard, more interoperability

Michèle Hudon, associate professor at l’École de bibliothéconomie et des sciences de l’information at l’Université de Montréal, spoke at the ISC conference about ISO25964, a new standard for thesauri.

ISO25964 will replace ISO2788 and ISO5964, out-of-date standards for monolingual and multilingual thesauri, respectively. These standards don’t address subject headings or taxonomies and are ill-suited to a networked environment of linked data, so in 2008, an international working group was struck to create the new standard based on the content of BSI8723, created by British information specialists. The working group, of which Dr. Hudon is a part, includes both practitioners and researchers.

Officially titled “Information and Documentation—Thesauri and interoperability with other vocabularies,” the ISO25964 project consists of two parts, the first of which, “Thesauri for information retrieval,” was published in August 2011 and covers general principles for developing and managing monolingual and multilingual thesauri. Part 2, “Interoperability with other vocabularies,” addresses crosswalks and mappings and is scheduled to be published at the end of this year.

“Interoperability”—a bit of a mouthful, as Hudon admits—refers to an ability to “act together coherently, effectively, and efficiently to achieve common objectives.” In the world of information science, it means the “capability of agents, services, systems and computer applications to exchange data, information and knowledge while preserving their integrity and full meaning.” (Zeng and Chan, 2009)

Thesauri are great tools for information retrieval for local users, but there may be multiple thesauri on the same topic that have different classification schemes and subject headings and thus can’t talk to one another. Having multilingual thesauri adds another layer of complexity.

In traditional information systems, thesauri allow a searcher to use the same search terms and strategies to search several databases and provide an efficient way to cross the language barrier. With the web, semantic interoperability becomes even more relevant. It allows for effective searching in several situations, including with the same language in different countries, two or more natural languages, a natural language and a language of specialty, a natural language and an indexing and retrieval language, and one or more indexing and retrieval languages (e.g., Library of Congress and Dewey).

Interoperability implies equivalence, but many would argue that absolute equivalence, particularly between distinct languages, doesn’t exist. Part 2 of the standard gives recommendations for establishing and maintaining mappings between multiple thesauri or between thesauri and other types of vocabularies. As Hudon said in her talk, “Because it necessarily exists in a particular cultural, social, professional and linguistic context, semantic and terminological interoperability of indexing languages depends on compromises to compensate for the lack of absolute equivalence between concepts and between terms.” She also emphasizes that semantic equivalence is dynamic and ever evolving.
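One way to picture a crosswalk between vocabularies is as a mapping from source terms to target terms, each carrying a degree of equivalence. The terms and two-letter labels below are invented for illustration; the standard defines its own, richer set of mapping types.

```python
# Toy sketch of a one-way crosswalk between two indexing vocabularies.
# Terms and the "EQ"/"BM" labels (exact equivalence / broader match)
# are illustrative only.
crosswalk = {
    "automobiles": ("cars", "EQ"),     # exact equivalence
    "station wagons": ("cars", "BM"),  # maps to a broader target term
}

def translate(term):
    """Look up a source-vocabulary term in the target vocabulary."""
    return crosswalk.get(term, (None, None))

print(translate("station wagons"))
```

The compromises Hudon mentions show up immediately in even a toy like this: “station wagons” has no exact target, so the mapping must settle for a broader term and record that it did so.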

Mini-update

Apologies to readers who are waiting for the rest of my conference notes. I’ve just had a heap of deadlines run into one another, but I’m hoping to get caught up by the end of next week.

In other news, I had an inspiring meeting with EAC-BC programs co-chairs Micheline Brodeur and Frances Peck this week to plan out this upcoming season of meeting topics and speakers. I’m so thrilled to be working with them, and it’s looking as though we’re going to have a great year, with a little something for everyone.

In other, other news, publishers have generously sent me copies of some editing-related books to read, and I’ll be starting up book reviews for this site early this summer. In the queue so far are Editors, Scholars and the Social Text, edited by Darcy Cullen, and The Only Grammar & Style Workbook You’ll Ever Need by Susan Thurman. Stay tuned!