EAC Conference 2012, Day 1—Whose words are these anyway? Translating, editing, and avoiding the Gallicism trap

Barb Collishaw and Traci Williams jointly presented a session about translation, with Collishaw focusing on the similarities and differences between editing, translation, and revision and Williams offering some insight into Gallicisms, particularly in Quebec English.

Collishaw works at the Parliament of Canada, helping to produce the Hansard, which, of course, must be translated so that it is complete in both official languages. Translators translate text (as opposed to interpreters, who translate speech) from a source language transcript into the target language in a way that accurately reflects the content, meaning, and style of the original. Revisers—who work exclusively in house—then edit the translated text.

Drawing upon the EAC’s Professional Editorial Standards, Collishaw compared the roles of translators and revisers to the role of an editor, noting that translators use virtually all of the stylistic editing, copy editing, and proofreading skills listed in the PES. Translation requires an eye for detail, a good command of both source and target language, and an understanding of where and how to check facts.

Collishaw emphasized the importance of keeping the audience in mind and of making room in the production schedule for translation and revision. Editors and managers sometimes forget that translation takes time, and because it comes in near the end of the process, translators often end up under severe deadline pressure.

Translators get to choose the words they use, within the range of meaning of the source language words, so awkward or offensive terms can be smoothed over. However, this may not be what the author intended. Collishaw gave the example of “unparliamentary language”: sometimes translators soften such words or phrases, but this may not be wise, since an MP may object on a point of order later on, and revisers then have to go and restore the “mistake” to preserve logic.

Fact checking can be tricky, since translators don’t often get to query the author to ask what was meant or how to spell someone’s name. Translators use tools such as Termium Plus, a terminology data bank, and TransSearch, a bilingual concordancer of past translations, to help them in their work, and are expected to compile glossaries. After they finish translating, translators are expected to proof their own work, checking against the source language.

Revisers check again, making sure that nothing has been left out and that meaning hasn’t been inadvertently changed, paying particular attention to details like numbers and dates. They also edit for style, imposing consistency on text from different translators. (To complicate matters, the House and Senate have different style guides, and revisers have to keep it all straight!)

I asked Collishaw if translators or revisers get to see transcripts of the interpreters as a reference, and she laughed, saying, “No, but I wish we would!” It seems that what the interpreters say isn’t transcribed, and the translators and revisers don’t have access to it.

***

Traci Williams is originally from Ontario but now works as a translator and editor in Quebec. She became fascinated by the influence of French on the English language and began to document Gallicisms—words or terms borrowed from French.

Originally, English was a rather limited language, composed primarily of one- or two-syllable words, Williams explained. The first Gallicisms appeared after the Norman Invasion in 1066, initially in the language of law, warfare, and the church; afterwards, they began to pervade clothing- and food-related vocabulary (as seen in animals versus their meats—“pig” vs. “pork,” “cow” vs. “beef,” “deer” vs. “venison”). Between 1100 and 1500, English absorbed about 10,000 French words. Before the seventeenth century, French words appearing in English were anglicized (e.g., chimney, change, charge); afterwards, hints of the French were retained (e.g., chevron, champagne, chaperone).

In Quebec, the first major wave of English speakers consisted of British loyalists; by 1841, English speakers of British descent were the largest population in Montreal. When rural French Quebeckers began moving to Montreal in the 1860s, they were expected to learn English, which, until 1975, was considered the language of prestige by both the French and the English. During that period, a steady stream of Anglicisms seeped into French. Yet after the PQ was voted in, in 1976, French began to influence English. At first, Gallicisms appeared in colloquial speech, but today educated professionals use them without even realizing it. Between 1990 and 1999, the number of Gallicisms tripled, and Oxford University has now officially recognized Quebec English as a distinct dialect.

Some Gallicisms are perfectly acceptable—“encore,” “fiancé,” and “en route” are examples. Cooking, dancing, and law feature many Gallicisms. And English has often retained words of both Germanic and French origin, with slightly different connotations (e.g., “ask” vs. “question,” “holy” vs. “sacred”) or has kept nouns of Germanic origin but has used the French adjectives (e.g., “finger” but “digital,” “book” but “literary”). What editors need to be aware of are the unacceptable Gallicisms that arise as a result of false cognates—words that are formally similar to words in the native language but have different meanings (e.g., “animator” rather than “instructor,” “conference” rather than “lecture,” “manifestation” rather than “demonstration”). The delicate aspect of editing Quebec English for an audience outside of Quebec is that an author—perfectly fluent in English—may be unaware that he or she is inappropriately using Gallicisms.

Williams emphasized the importance of continuing to read, read, read. She suggests reading English-language sources from outside your region to maintain perspective on quirks that may be local peculiarities and may not translate to a wider audience. Williams has started a newsletter about Gallicisms and related topics; contact her via Semantech Communications to sign up.

EAC Conference 2012, Day 1—E-publishing essentials for editors

Greg Ioannou, president of EAC and publisher of Iguana Books, gave an overview of some of the things editors should know about ebooks, beginning with a bit of history: the first ebook was a computerized index of Thomas Aquinas’s works and was released in the 1940s. In the 1960s hypertext was used to format ebooks so that they could be read using different window sizes and monitors on IBM mainframes. The first ereader was Sony’s Data Discman, which displayed ebooks stored on CD.

Although there are hundreds of types of e-readers, many with proprietary file formats, the most common formats are EPUB, EPUB2, MOBI, and PDF. Most ebooks are basically just HTML files with metadata that helps bookstores categorize them (e.g., title, author, description, ISBN, publication date, and keywords). The editor [ed—or perhaps an indexer?] is in the best position to know what keywords should be included in the metadata file.
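
That metadata lives in the EPUB’s OPF package file as Dublin Core elements. As a rough sketch of what gets generated (the title, author, ISBN, and keywords below are invented for illustration, and a real OPF file carries more than this), building such a block with Python’s standard library might look like:

```python
import xml.etree.ElementTree as ET

# Dublin Core namespace used for metadata in EPUB package (OPF) files
DC = "http://purl.org/dc/elements/1.1/"
ET.register_namespace("dc", DC)

def build_metadata(title, author, isbn, keywords):
    """Build a minimal <metadata> element like the one in an EPUB's OPF file."""
    metadata = ET.Element("metadata")
    ET.SubElement(metadata, f"{{{DC}}}title").text = title
    ET.SubElement(metadata, f"{{{DC}}}creator").text = author
    ET.SubElement(metadata, f"{{{DC}}}identifier").text = isbn
    # Keywords become dc:subject elements -- the bookstore-facing categories
    for kw in keywords:
        ET.SubElement(metadata, f"{{{DC}}}subject").text = kw
    return metadata

opf = build_metadata("Example Title", "A. Author", "978-0-00-000000-0",
                     ["editing", "publishing"])
print(ET.tostring(opf, encoding="unicode"))
```

The dc:subject elements are where the editor’s (or indexer’s) keyword choices would end up.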

At Iguana, the creation sequence is as follows:

For simple books

  • edit and style in Word
  • create PDF from Word (Iguana has found that it has to produce at least one print-on-demand copy—for the author or, more often, as Ioannou says, for the author’s mother)
  • create EPUB file using Sigil
  • create MOBI file using Calibre

For complex books

  • edit and style in Word
  • create PDF from InDesign
  • create EPUB file from InDesign
  • clean up EPUB in Sigil
  • create MOBI file using Calibre

Once you’ve created your files, Ioannou said, you should actually look at the ebook on the device(s) it’s destined for; looking at it on just the computer can be deceiving. Right now InDesign’s EPUB export doesn’t actually work very well, so the outputs have to be cleaned up quite a bit.

Ioannou then described the many devices on which ebooks can be read, including tablets, phones, computers, Kindles, and other e-readers (e.g., Nook, Kobo, Sony Reader). Only Kindles can read MOBI files, whereas the other devices can all read EPUB files. All can display PDFs, although only tablets, smartphones, and computers can display colour and play videos.

Since EPUB and MOBI files are reflowable and may be read on very narrow devices like a smartphone, editors should keep the following in mind when editing for an ebook:

  • Make sure that there are spaces before and after dashes.
  • Opt for a hyphenated compound rather than a closed one; however, avoid hyphens when they could lead to odd line breaks (e.g., choose “ereader” over “e-reader”).
  • Make sure all quotes are smart quotes; this is relatively easy to do in Word but much more difficult to code in Sigil or Calibre.
  • Books without chapters don’t work very well as ebooks—the large file size can significantly slow down an e-reader. If possible, break a book down into chapters, ideally of between 3,000 and 5,000 words each. This structure also makes navigating an ebook much easier.
  • As for formatting, keep it simple. Tables and columns look terrible on an e-reader, and images won’t display on some older e-readers. Most e-readers are black and white only, and many older e-readers can’t handle large files (e.g., files with embedded images and videos).
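
Ioannou’s point about smart quotes being easy in Word but hard to code later can be pictured with a minimal straight-to-smart conversion. This is only a sketch: real converters handle more edge cases, such as apostrophes at the start of a word (’tis) or nested quotations.

```python
import re

def smarten_quotes(text):
    """Convert straight quotes to typographic ("smart") quotes.

    A simplification: a quote after start-of-string, whitespace, or an
    opening bracket is treated as opening; everything else as closing.
    """
    # Double quotes
    text = re.sub(r'(^|[\s(\[])"', '\\1\u201c', text)
    text = text.replace('"', '\u201d')
    # Single quotes; remaining straight singles become apostrophes
    text = re.sub(r"(^|[\s(\[])'", '\\1\u2018', text)
    text = text.replace("'", '\u2019')
    return text

print(smarten_quotes('She said, "it\'s done."'))  # → She said, “it’s done.”
```

Running a pass like this before conversion spares you hand-fixing quotes in Sigil or Calibre afterwards.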

Ioannou noted that e-readers are primitive machines and that the technology’s rapidly changing. His caveat: “Most of what I say here will not be true a year from now, and practically none of it will be true two years from now.”

EAC Conference 2012, Day 1—Making the Language Portal of Canada work for you

The Translation Bureau launched the Language Portal of Canada in 2009 as a gateway to allow everyone free access to the translation tool Termium, a terminology and linguistic data bank in English, French, Spanish, and Portuguese. Despite its translation roots, however, the Language Portal is packed with news, tools, and references that appeal to a much wider audience of editors, writers, educators, and anyone interested in language.

The Language Portal exists in both French and English, but the sites aren’t merely translations of one another. There are different types of language problems in French and English, so although there is parallelism in the tools available to users on the two sites, the content is different.

Robin Kilroy creates and curates much of what’s on the Language Portal, and she took EAC conference attendees on a tour of the site.

Headlines

These link to language-related stories gleaned from external sources. Two headlines are posted each week and then are archived for a year.

My Portal

This is broken down for readers “At school,” “At work,” and “At home,” which link to specific resources for students and educators, professionals who have to write or edit (as Kilroy says, so many people now are “functional writers” who have to write for their jobs, though they may not consider themselves professional writers), and the general reader, respectively.

Resources include “Linguistic Recommendations and Reminders,” which offers tidbits of advice about grammar and style.

From Our Contributors

The Language Portal’s partner organizations (including the Editors’ Association of Canada) contribute language-related articles for this section, which are then edited and translated in house. They are all archived by organization name.

Discover

“Discover,” on the left-hand sidebar, and “Discover Coast to Coast,” at the bottom centre, link to the same resources but are organized differently. These are a collection of links, external to the Translation Bureau, to such resources as dictionaries and information about language training and language professions.

Well Written, Well Said

On the left-hand sidebar, this section links to Termium, Writing Tools, Gateway to English, and Quizzes (on everything from spelling and punctuation to Canadian authors and proverbs). Editors may find Writing Tools particularly useful, because it provides access to such resources as The Canadian Style (much more up to date than the print edition) and Peck’s English Pointers, among many others.

EAC Conference 2012, Day 1—Opening keynote address

Charlotte Gray, award-winning biographer and historian, kicked off the EAC conference with her thoughtful—and thought-provoking—keynote address. She praised the editor for saving her “from my own hideous mistakes.” Although she hears some writers complain that editors took out their voice, she says she recognizes that “my voice can either be a strength or a weakness.” The editor, she says, is “not only the first professional reader—but the best,” because he or she aims to help and support the writer, whereas the second professional reader—the reviewer—often approaches the text with the opposite goal.

As a writer of popular history, Gray also reflected on the malleability of history, acknowledging that words are themselves living artifacts. “In shaping history,” she wondered, “am I pulling it out of shape?” Memoirs are often assumed to be nonfiction until proven otherwise, she said, whereas John Updike’s view was that “biographies are really just novels with indexes.” She went on to describe how carefully and rigorously she seeks out primary sources for her work, walking the fine line between imagining and inventing as she uses novelistic techniques to flesh out a historical narrative.

Gray described the research and writing process for her book Gold Diggers: Striking It Rich in the Klondike, in which she yet again sheds light on the role of women in Canadian history, this time in Dawson—a setting she called a “pioneer Petri dish.” The book focuses on six people, including two women—a journalist and a businesswoman—and Gray colourfully recounted the “war dance” that she did at Library and Archives Canada every time she found solid evidence that her characters had actually met, allowing her to weave together their stories into a coherent narrative.

In an age where we’re constantly bombarded with information of all sorts, readers are generally less trusting, but that’s not necessarily a reflection of the veracity or integrity of sources we find today. History, Gray concluded, has always been—and likely will remain—malleable.

ISC Conference 2012, Day 2—Hands-on ebooks

David Ream and Jan Wright once again took to the stage to elaborate on indexing of digital files. Ream said that there aren’t a lot of usability studies that compare search versus indexing. BNA’s “Using Online Indexes” is one, but it would be interesting to get more universities involved in this kind of research to generate more data.

Ream then gave an overview of EPUB 3.0. It’s open source, is based on existing standards—such as XHTML, CSS3, JavaScript, and SVG—and was created ahead of the industry (i.e., ahead of tools and reading systems), meaning that we can all avoid costly format wars. It provides navigation and packaging information and incorporates global language support (i.e., for languages that are read left to right, right to left, or vertically). It is backwards compatible with EPUB 2.0 and has modular components and working groups.

EPUB 3.0 files will have rich metadata—Dublin Core for publication information, ONIX for supply chain information, and MARC for libraries. The metadata will be key to a digital file’s discoverability—and hence to its sale. Implications for indexers include the following:

  • no page or line limitations
  • potentially having to index rich media (e.g., time codes)
  • potentially having to index interactive ebook features (scripts)
  • potentially having to supply semantics of headings and locators (e.g., show only the statutes, show only the people)
  • being able to provide index data in multiple ways
  • cumulative indexing—of series, mashups, etc.
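
The “semantics of headings and locators” point above can be pictured concretely: if each index entry carries a type tag, a reading system could filter the display on demand. A hypothetical sketch (the data model, tag names, and entries are all invented for illustration, not drawn from the EPUB 3.0 specification):

```python
# Hypothetical index entries tagged with a semantic type, so a reading
# system could show, say, only the statutes or only the people
entries = [
    {"heading": "Copyright Act", "type": "statute", "locator": "ch03.xhtml#p012"},
    {"heading": "Wright, Jan", "type": "person", "locator": "ch01.xhtml#p044"},
    {"heading": "Indexing Act", "type": "statute", "locator": "ch05.xhtml#p007"},
]

def filter_index(entries, semantic_type):
    """Return only the entries of one semantic type, sorted by heading."""
    return sorted((e for e in entries if e["type"] == semantic_type),
                  key=lambda e: e["heading"])

statutes = filter_index(entries, "statute")
print([e["heading"] for e in statutes])  # → ['Copyright Act', 'Indexing Act']
```

The indexer’s new job in this scenario is supplying the type tags; the reading system does the filtering.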

Jan Wright then explained the workflow for inserting anchors into EPUB files at the paragraph level. She and Olav Martin Kvern developed scripts that create identifiers for individual paragraphs in InDesign, which can then be used as part of the locator in standalone indexing programs like CINDEX or SKY Index to create a hyperlink from index locator to paragraph. For now, this is the most realistic way of creating a paragraph-level index that will work for both print and ebook, because InDesign strips out any embedded index entries when it exports to EPUB. The digital trends task force is talking to Adobe separately about this issue, but a fix may be some time away.
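
Conceptually, the end product of that workflow is an index locator that hyperlinks to a paragraph-level anchor in the XHTML. A hedged sketch of the idea (the ID scheme and filenames here are invented, not the actual output of Wright and Kvern’s scripts):

```python
def locator_link(chapter_file, para_num, display_text):
    """Build an XHTML index locator that links to a paragraph-level anchor.

    Assumes each paragraph in the ebook carries an id like "para0042",
    as an anchor-insertion script might generate.
    """
    anchor = f"para{para_num:04d}"
    return f'<a href="{chapter_file}#{anchor}">{display_text}</a>'

# An index entry whose locator points into chapter 3, paragraph 42
print(locator_link("ch03.xhtml", 42, "42"))
# → <a href="ch03.xhtml#para0042">42</a>
```

Because the anchor identifies a paragraph rather than a page, the same locator data can serve both the print index and the reflowable ebook.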

Wright and Ream then allowed conference attendees to play around with various devices, from the Kindle, Kobo, and Nook to the iPad, to see the current state of the art of ebook indexing.

ISC Conference 2012, Day 2—To award or not to award?

The Indexing Society of Canada is planning to establish a new award to recognize excellence in indexing. Indexing societies elsewhere in the world have their own awards, but Canadians aren’t eligible for most of them. At the ISC conference, Max McMaster of the Australian and New Zealand Society of Indexers, Jan Wright of the American Society for Indexing, and the ISC’s Christine Jacobs discussed the considerations that should go into the award criteria and judging process.

McMaster, a three-time winner of the ANZSI medal for outstanding indexing, has also been a judge for the medal. The judging panel consists of three experienced indexers, but they will consult an outside expert if the subject matter of a submitted work is too esoteric for them to understand. He suggests that the ISC be careful not to limit the types of works eligible for the award and that it provide a certificate or plaque to the publisher of the winning work as well—good PR for indexing. Publishers should be encouraged to submit their books for consideration, but more often it will be indexers who submit their own work. McMaster warns indexers that they’re at the mercy of the book’s editor and typesetter: he had once submitted an index he thought was award-worthy, only to realize that the publisher had inadvertently removed all of the index’s indentation, severely compromising its usability.

Jan Wright won the ASI’s H.W. Wilson Award for her index to Real World InDesign CS3. She hasn’t been a judge but has spoken to the judges on the panel that granted her her award. Submissions for the award go to the ASI’s chapters around the country, and they are anonymized. Cheryl Landes remarked that many indexers would be willing to pay a high entry fee to submit their work if it meant that they would receive feedback on their submissions.

Christine Jacobs gave an outline of what the ISC’s award might look like. The awards committee plans to announce the award at the next AGM and issue a call for nominations. They hope to accept submissions in either English or French and both print and online works. Currently there’s no cash prize attached to the award, but the awards committee is taking a step-by-step approach, and it may be part of the award later on. Jacobs emphasized how the award can be a form of validation for a winning indexer and that it would help raise the profile of the profession and encourage high standards.

ISC Conference 2012, Day 2—New standard, more interoperability

Michèle Hudon, associate professor at l’École de bibliothéconomie et des sciences de l’information at l’Université de Montréal, spoke at the ISC conference about ISO25964, a new standard for thesauri.

ISO25964 will replace ISO2788 and ISO5964, out-of-date standards for monolingual and multilingual thesauri, respectively. These standards don’t address subject headings or taxonomies and are ill-suited to a networked environment of linked data, so in 2008, an international working group was struck to create the new standard based on the content of BSI8723, created by British information specialists. The working group, of which Dr. Hudon is a part, includes both practitioners and researchers.

Officially titled “Information and Documentation—Thesauri and interoperability with other vocabularies,” the ISO25964 project consists of two parts, the first of which, “Thesauri for information retrieval,” was published in August 2011 and covers general principles for developing and managing monolingual and multilingual thesauri. Part 2, “Interoperability with other vocabularies,” addresses crosswalks and mappings and is scheduled to be published at the end of this year.

“Interoperability”—a bit of a mouthful, as Hudon admits—refers to an ability to “act together coherently, effectively, and efficiently to achieve common objectives.” In the world of information science, it means the “capability of agents, services, systems and computer applications to exchange data, information and knowledge while preserving their integrity and full meaning.” (Zeng and Chan, 2009)

Thesauri are great tools for information retrieval for local users, but there may be multiple thesauri on the same topic that have different classification schemes and subject headings and thus can’t talk to one another. Having multilingual thesauri adds another layer of complexity.

In traditional information systems, thesauri allow a searcher to use the same search terms and strategies to search several databases and provide an efficient way to cross the language barrier. With the web, semantic interoperability becomes even more relevant. It allows for effective searching in several situations, including with the same language in different countries, two or more natural languages, a natural language and a language of specialty, a natural language and an indexing and retrieval language, and one or more indexing and retrieval languages (e.g., Library of Congress and Dewey).

Interoperability implies equivalence, but many would argue that absolute equivalence, particularly between distinct languages, doesn’t exist. Part 2 of the standard gives recommendations for establishing and maintaining mappings between multiple thesauri or between thesauri and other types of vocabularies. As Hudon said in her talk, “Because it necessarily exists in a particular cultural, social, professional and linguistic context, semantic and terminological interoperability of indexing languages depends on compromises to compensate for the lack of absolute equivalence between concepts and between terms.” She also emphasizes that semantic equivalence is dynamic and ever evolving.
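
One way to picture such a mapping, and the compromises it involves, is a lookup table that records not just a target term but the degree of equivalence. A toy sketch (the vocabularies, terms, and mapping labels below are invented for illustration, loosely echoing the kinds of exact and inexact mappings the standard describes):

```python
# A toy mapping between a source and a target vocabulary; where no exact
# equivalent exists, the compromise is a broader target term
MAPPINGS = {
    # source term: (target term, mapping type)
    "automobiles": ("cars", "exact"),
    "poultry": ("birds", "broader"),        # no exact target; map upward
    "maple syrup": ("sweeteners", "broader"),
}

def map_term(term):
    """Map a source-vocabulary term to the target vocabulary.

    Returns (target term, mapping type), or (None, None) when no
    mapping has been established.
    """
    return MAPPINGS.get(term, (None, None))

print(map_term("poultry"))   # → ('birds', 'broader')
print(map_term("igloos"))    # → (None, None)
```

The “broader” rows are exactly the compromises Hudon describes: retrieval still works, but some precision is traded away for interoperability.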

Mini-update

Apologies to readers who are waiting for the rest of my conference notes. I’ve just had a heap of deadlines run into one another, but I’m hoping to get caught up by the end of next week.

In other news, I had an inspiring meeting with EAC-BC programs co-chairs Micheline Brodeur and Frances Peck this week to plan out this upcoming season of meeting topics and speakers. I’m so thrilled to be working with them, and it’s looking as though we’re going to have a great year, with a little something for everyone.

In other, other news, publishers have generously sent me copies of some editing-related books to read, and I’ll be starting up book reviews for this site early this summer. In the queue so far are Editors, Scholars and the Social Text, edited by Darcy Cullen, and The Only Grammar & Style Workbook You’ll Ever Need by Susan Thurman. Stay tuned!

ISC Conference 2012, Day 2—The re-indexing dilemma

Max McMaster, an award-winning indexer and representative of the Australian and New Zealand Society of Indexers (ANZSI), spoke at the ISC conference about reindexing. For a new edition of a book, a subsequent annual report, or a bilingual document, do you adapt what has already been done, or do you index from scratch? Further, are there ethical issues in reusing an existing index?

In Australia, trade publishers will often buy foreign rights and revise a book for the Australian market. In many cases, this means that although the content of the two editions is similar, terminology can vary substantially. Moreover, North American indexers tend to produce lengthier indexes than Australians are used to: when a book is re-indexed for the Australian edition, the index may end up 30% shorter.

If a new edition of an existing book is just a repagination, it’s most efficient to reuse the index headings. (McMaster adds 1000 to all the old page numbers to keep track of what he’s changed and what he hasn’t.) If changes to a new edition are minimal and you created the first index, reusing the existing index, with necessary revisions, may be easiest. If changes are substantial, however, it’s much more efficient to start from scratch. McMaster is emphatic, however, in dispelling the myth that re-indexing, in whatever form, is easier than creating an index for a new work.

If you didn’t create the first index and want to re-index a book, it’s still useful to see the existing index. The previous indexer may have found a way to solve a problem that will save you a lot of time, or you may spot weaknesses that you should avoid.

In Australia, government annual reports are all required to have an index, so this is a boon for indexers in that country. Since the design and the components of an annual report rarely change from year to year, re-indexing is a snap. Basically, once you land a contract to do one annual report, you’ve got it for life. (McMaster has co-authored a guide for non-indexers on how to index annual reports.)

For bilingual documents, you can’t reuse pagination, since the structure and length of the two languages will be different. One possibility is translated embedded indexing. However, Heather Ebbs pointed out that translating an index doesn’t really work, since there are cultural and contextual differences.

As for ethical issues, McMaster once had a publisher reuse his index for a book that he did for Australia that was then repackaged, with a different title, in New Zealand. In Australia, because an indexer is under contract, he or she doesn’t retain copyright of the work; however, McMaster would have appreciated being notified at the very least that his index would be reused (and a bit of additional compensation wouldn’t have hurt, either). Mary Newberry said that in Canada, copyright of the index does belong to the indexer.

McMaster’s presentation brought up the issue of credit; in one of his anecdotes he mentioned that his name was on a book’s copyright page, which led me to ask him whether crediting an indexer is standard practice in Australia. He says that an indexer is credited only maybe 5% of the time. Christine Jacobs had an interesting approach to the credit issue: she invoices for a credit line (and, incidentally, for a copy of the finished book). She asks for a credit on the copyright page, in the acknowledgements, or in the index itself, and lists this as a separate line item on her invoice. In cases where she doesn’t approve of the changes an editor, author, or publisher has made to the index, she simply removes that item, and her name doesn’t appear.

ISC Conference 2012, Day 2—What is the future of indexing?

Cheryl Landes is a technical writer and indexer who sees a changing role for indexers—one that is rife with possibilities.

Today people are consuming content in four main ways: through print, on e-readers, on tablets, and on smartphones. In the past year, more people have been moving towards tablets and smartphones rather than e-readers, since the former devices offer colour and other functionality. Many vendors of authoring tools are adding outputs to accommodate tablets, and more and more companies are publishing technical documentation that can be read on tablets or smartphones (for example, Alaska Airlines replaced forty pounds of paper pilots’ manuals with iPads). Despite the movement towards mobile devices, however, Landes doesn’t believe that print will ever go away.

Digital content means users are able to search, but searching doesn’t yield the speed of information retrieval or context that an index offers. Indexers have to be proactive about educating others about the utility and importance of indexes, and emerging technologies are providing many opportunities for indexers to apply their skills beyond the scope of traditional back-of-the-book indexing.

Partnering with content strategists

Indexers can serve as consultants about taxonomies and controlled vocabularies, which are key to finding content. (An example of a taxonomy is the Legislative Assembly of British Columbia’s Index to Debates.)

Database indexing

Growth in this area is anticipated as more companies move their catalogues online, particularly in retail.

Embedded indexing

Embedded indexing tags content directly in a file and allows for single-sourcing, which is ideal for publishers who want print and digital outputs for their content. (Landes echoes Jan Wright in saying that for the past decade technical communicators have been grappling with issues trade publishers are facing now, yet they’re not talking to each other. How do we start that conversation?)

Search engine optimization

Indexers understand what kinds of search terms certain target audiences use. Acting as consultants, they can create strategies for keywording in metadata.

Blog and wiki indexing

This area is likely to grow because more companies are turning to blogs to promote products and services, and they are using wikis for technical documentation.

Social media

Possible consulting opportunities abound in this quickly changing field. Facebook’s Timeline and Twitter’s hashtags are both attempts at indexing in social media, but one can envision the need for more sophisticated methods of retrieving information as more and more content is archived on these platforms.