For information on why we joined the Open TM2 initiative, please follow this link: http://www.lisa.org/globalizationinsider/2010/07/open_tm2.html
I have also included the interview below.
Smith
What is it about the Open TM2 initiative that motivated Welocalize to get involved?
Most translators use some type of translation workbench tool. Most clients use some type of content management tool, and most vendors use some type of translation management tool. To make it even more interesting, add machine translation tools, authoring tools and a variety of content types. Now combine all of those users and their various tools and try to pass the content type you want translated between each of them, and tell everyone they have half the budget, time and staff to do it!
Yes, I have exaggerated a bit to make a point, but the basic elements of this challenge are what I am hearing from clients, vendors and translators. Traditional methods across our translation supply chain are just not up to the task of the now always-on velocity of end user demands.
In order to increase velocity across the translation supply chain, we need to increase automation, which implies more integration, interoperability, extensibility – and standards. We are by no means the first industry to confront this challenge, so why not borrow what has worked elsewhere? At the heart of every sophisticated and mature supply chain is a consistently followed set of standards. As Craig Barrett, former Chairman of Intel, stated, "The world is getting smaller on a daily basis. Hardware, software and content move independent of, and irrespective of, international boundaries. As that increasingly happens, the need to have commonality and interoperability grows. You need standards so that the movie made in China or India plays in the equipment delivered in the United States, or the Web site supporting Intel in the United States plays on the computer in China."
What sort of progress do you think has been made in the area of standards, and what work remains?
Unicode has probably been the most successful standard related to our industry. Unicode specifies a standard for the representation of text in just about any language across software products and systems. Before Unicode, there were hundreds of different encoding systems, and they often conflicted with each other. The significant problem was potential corruption in the passing of text representation data between different encodings or platforms. Thus the Unicode Consortium was formed, and to its credit, Unicode now “enables a single software product or a single website to be targeted across multiple platforms, languages and countries without re-engineering. It allows data to be transported through many different systems without corruption.”
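The corruption problem described above can be seen in a few lines of Python. This is an illustrative sketch (the encodings chosen here, Latin-1 and Windows-1251, are just two examples of the many legacy systems that conflicted): the same bytes decode to different characters depending on which legacy encoding is assumed, while UTF-8 round-trips the text intact.

```python
# Legacy encodings assigned conflicting meanings to the same byte values,
# so text passed between systems with different assumptions was corrupted.
text = "café"

legacy_bytes = text.encode("latin-1")        # b'caf\xe9' in one legacy encoding
garbled = legacy_bytes.decode("cp1251")      # wrong assumption: 0xE9 is now "й"

# With a Unicode encoding such as UTF-8, the text survives the round trip.
roundtrip = text.encode("utf-8").decode("utf-8")

print(garbled)     # mojibake: "cafй"
print(roundtrip)   # intact:   "café"
```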
Other standards, such as TMX, have not been as successful. We need to understand why this has been the case. As Bill Sullivan, IBM Globalization Executive, stated, “There is a recognized and growing need for standards in the localization industry. Despite our best intentions, however, standards themselves can often be vague and open to multiple interpretations. What is needed are reference implementations and reference platforms that serve as concrete and unambiguous models in support of the standard.”
This is the work that remains. We need to demonstrate more tangible benefits for adhering to a standard in typical use case scenarios and integrations. How can a client easily integrate the translation assets of an acquisition? How can a client plug-and-play what they deem as the best tool components? How can a client change tools? These are the simple questions I hear. To get closer to the answers, the Open TM2 Steering Committee is working on a Joomla (content management), Open TM2 (translator’s workbench) and GlobalSight (translation management system) integration. The goal is to develop a viable data exchange standard which works seamlessly in this three-way environment and then extend it to other integrations in the translation supply chain.
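To make the data-exchange idea concrete, here is a minimal sketch of what a TMX exchange file looks like and how a tool might read it, using only Python's standard library. The document and its translation unit are hypothetical examples, not taken from the Steering Committee's actual integration work.

```python
import xml.etree.ElementTree as ET

# A minimal, hypothetical TMX 1.4 document with one translation unit (tu)
# holding an English source segment and a German target segment.
tmx = """<?xml version="1.0" encoding="UTF-8"?>
<tmx version="1.4">
  <header creationtool="example" creationtoolversion="1.0"
          srclang="en-US" adminlang="en-US"
          datatype="plaintext" segtype="sentence" o-tmf="none"/>
  <body>
    <tu>
      <tuv xml:lang="en-US"><seg>Hello world</seg></tuv>
      <tuv xml:lang="de-DE"><seg>Hallo Welt</seg></tuv>
    </tu>
  </body>
</tmx>"""

XML_NS = "{http://www.w3.org/XML/1998/namespace}"  # namespace of xml:lang

root = ET.fromstring(tmx)
for tu in root.iter("tu"):
    # Map each language variant to its segment text.
    pairs = {tuv.get(XML_NS + "lang"): tuv.findtext("seg")
             for tuv in tu.iter("tuv")}
    print(pairs)   # {'en-US': 'Hello world', 'de-DE': 'Hallo Welt'}
```

Because TMX is plain XML with a published DTD, any tool in the chain – workbench, CMS connector, or TMS – can read the same file; the interoperability problems arise in the details each tool layers on top.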
LISA will document and publicize the resultant standards. However, neither the Open TM2 initiative nor LISA alone can make the greater vision a reality. As the Unicode initiative demonstrated, broad participation and support across the industry is necessary to achieve success. The Unicode Consortium includes corporate, institutional, individual, NGO and public sector members all collaborating with a unified purpose.
What is different about the Open TM2 initiative?
Open source and “free” are often found in the same sentence. Yes, there is no charge to download an open source product such as Open TM2 or GlobalSight, but there is a cost associated with support, training and customization to specific needs. Open source is not a “free lunch”, but it is an opportunity to engage, integrate and customize at a much deeper level and at a faster pace. The result is potentially a product that is more suited to one’s needs, more easily integrated with other products and has a lower total cost of ownership. But what you get out of it depends on what you put into it. As an ancient Chinese proverb reads, “Talk does not cook rice.” We need people willing to take action. These concepts apply to all open source projects.
What I think is different, and exciting, in this Open TM2 initiative is an increasing alignment of broader interests. Industries typically do not change significantly until the market forces them to change (look at the American auto industry). I think there are some market mega trends in play right now (cloud computing, mobile computing, social computing, open source) and those who don’t adapt to these trends will quickly be left behind. The “translation project” as we knew it traditionally is rapidly morphing into on-demand translation. SimShip is rapidly morphing into SimStream (simultaneous streaming releases). Translation tools and platforms are rapidly morphing into “mash-ups” (combinations of different tools with the sum benefits being significantly greater than the individual benefits). The translation service on the whole is rapidly morphing into a utility inside a broader and more deeply integrated global content supply chain. RFPs now have pages and pages of interoperability, integration and optimization questions. And according to Gartner, "The number of open-source projects doubles every 14 months. By 2012, 90% of companies with IT services will use open-source products of some type.”
So, I think the timing is right. Many, certainly not all, clients, LSPs, tool providers and translators alike are realizing that it is in the best interest of the supply chain as a whole to collaborate to achieve something on the scale of what was achieved with the Unicode standard. “Do not go where the path may lead, go instead where there is no path and leave a trail.” – Ralph Waldo Emerson ...