OASIS XML Localisation Interchange File Format (XLIFF) TC


XLIFF 2.0 spec - dF Issue #01 - Extensibility and processing requirements

  • 1.  XLIFF 2.0 spec - dF Issue #01 - Extensibility and processing requirements

    Posted 10-25-2012 13:37
    Dear Yves, all,

    Yves proposed some time ago the general processing requirements for extensibility. As a result, the current spec, as part of the currently used processing requirements, forbids the deletion of extensions. If we insist on forbidding the deletion of extensions, we effectively undermine the other normative statement that we make, i.e. that tools must not rely on custom extensions for merging back. IMHO the general processing requirements should only protect modules [obviously including the mda module], but not extensions. It seems odd to enforce preservation of 3rd party extensions.

    If someone wants to effectively use extensions for broader interoperability as opposed to an internal processing aid [which is fully OK], they should seek to warrant their survival with other means than a "carte blanche" from the XLIFF TC. For instance, preservation of ITS based extensions should be warranted by the W3C recommendation (and its implementers) rather than the OASIS standard and its implementers [the implementer groups can obviously overlap]. Anyway, all extension owners, including W3C WGs and similar, can seek to promote their extensions as XLIFF modules, as long as these do not compete with core or other module features [general principle]. Another method is to simply agree contractually on the usage of extensions within a supply chain or similar.

    Promoting (to-be) widely used extensions as modules would have merits:
    1) Better protection of features should be an incentive for proposing commonly applied modules rather than relying on private extensions.
    2) Extensions will be vetted for hidden conflicts if submitted as module proposals to the TC, which should prevent the laissez-faire situation we have with 1.2 extensibility.
    3) Growing usage of modules as opposed to random extensions will gradually broaden the public interoperability area [thanks to the general non-compete principle].

    Thanks for your attention,
    dF

    Dr. David Filip
    =======================
    LRC CNGL LT-Web CSIS University of Limerick, Ireland
    telephone: +353-6120-2781 cellphone: +353-86-0222-158 facsimile: +353-6120-2734
    mailto: david.filip@ul.ie
    www.davidf.org, http://www.linkedin.com/in/davidfatdavidf


  • 2.  RE: [xliff] XLIFF 2.0 spec - dF Issue #01 - Extensibility and processing requirements

    Posted 10-25-2012 18:02
    Hi David, all,

    As we mentioned at the face-to-face meeting, the processing requirements for extensions certainly need to be worked on. The ones in the specification are, as you noted, just the initial proposal.

    As for whether or not user agents should be able to delete extensions: one major point to remember is that a custom extension ABC is not distinguishable from an official XLIFF module XYZ for the user agents that do not support that module. They are both just some elements/attributes in an unknown namespace.

    You can see this from a different direction: forget about custom extensions. Think only about XLIFF modules from the viewpoint of tools supporting only the core. We must have some processing expectations to preserve such modules. This is what we need to define. Then, when it's done, we can just say the same applies to custom extensions, because there is no logical reason to treat them differently.

    > If someone wants to effectively use extensions for broader
    > interoperability as opposed to internal processing aid [which is
    > fully OK], they should seek to warrant their survival with other
    > means than a "carte blanche" from XLIFF TC.
    >
    > For instance, preserving ITS based extensions should be warranted
    > by the W3C recommendation (and its implementers) rather than the
    > OASIS standard and its implementers [the implementer groups can
    > obviously overlap].

    I'm afraid that makes no sense to me: XLIFF tools' common behavior is driven by the processing requirements set in the XLIFF specification, not by anything else.

    Let's say we have a custom element for some ITS data category, we try it out, etc. Then 4 months later, we decide it works fine, it's stable, and we'd like to make it an official XLIFF module. We go through that process and the extension becomes an official module. Why in the world should a tool that supports only the core behave differently before and after the extension becomes a module?
    For all we know at this point, the element may even keep the same namespace. From the tool viewpoint, once again, there is no difference between a module it does not support and a custom extension. So I think we should stop thinking about the word "extension" and focus on getting the PRs done. If it helps, think of an extension as an "unsupported module". I'm going to use that term from now on :)

    And with that in mind, I'd very much like the unsupported modules to be preserved whenever possible. We may have PRs related to how to behave with unsupported modules in specific places too. For example, it may be good to define what a tool does when removing an <mrk> element that has a ref attribute pointing to some element in the same unit: that element should probably be removed as well (assuming it's not used by another <mrk>).

    Cheers,
    -yves
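The <mrk> clean-up described above can be sketched in a few lines. This is only an illustration, not spec text: the sample markup, the "#id" reference syntax, and the unprefixed element names are assumptions (real XLIFF 2.0 elements live in a namespace omitted here for brevity).

```python
# Sketch of the PR Yves suggests: when removing an <mrk> whose ref
# attribute points at another element in the same <unit>, also remove
# the referenced element, unless a remaining <mrk> still points at it.
# Sample markup and the "#id" reference syntax are assumptions; real
# XLIFF 2.0 elements carry a namespace that is omitted here.
import xml.etree.ElementTree as ET

UNIT_XML = """\
<unit id="u1">
 <notes>
  <note id="n1">still referenced by m1 and m3</note>
  <note id="n2">orphaned once m2 goes away</note>
 </notes>
 <segment>
  <source><mrk id="m1" ref="#n1">one</mrk> <mrk id="m2" ref="#n2">two</mrk> <mrk id="m3" ref="#n1">three</mrk></source>
 </segment>
</unit>
"""

def unwrap(parent, child):
    """Remove child but splice its text and tail back into the parent's flow."""
    idx = list(parent).index(child)
    text = (child.text or "") + (child.tail or "")
    if idx == 0:
        parent.text = (parent.text or "") + text
    else:
        parent[idx - 1].tail = (parent[idx - 1].tail or "") + text
    parent.remove(child)

def remove_mrk_and_orphans(unit, mrk_id):
    """Unwrap the <mrk>, then drop its referent if no other <mrk> uses it."""
    parents = {c: p for p in unit.iter() for c in p}
    target = next(m for m in unit.iter("mrk") if m.get("id") == mrk_id)
    ref = target.get("ref")
    unwrap(parents[target], target)
    if ref and ref.startswith("#"):
        if not any(m.get("ref") == ref for m in unit.iter("mrk")):
            ref_id = ref[1:]
            orphan = next((e for e in unit.iter() if e.get("id") == ref_id), None)
            if orphan is not None:
                parents[orphan].remove(orphan)

unit = ET.fromstring(UNIT_XML)
remove_mrk_and_orphans(unit, "m2")
```

After the call, note n2 is gone while n1 survives (m1 and m3 still reference it), and the text "two" stays in the source flow even though its <mrk> wrapper was removed.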


  • 3.  Re: [xliff] XLIFF 2.0 spec - dF Issue #01 - Extensibility and processing requirements

    Posted 10-25-2012 22:26
    Yves, same as Rodolfo, I think that it is a substantial difference that XLIFF modules have an OASIS-warranted published schema, whereas extensions don't. This is also what I mean when I say that support of extensions can be negotiated outside the XLIFF TC. Some authorities behind namespaces (and the underlying semantics!) are better than others, so some people might be happy to validate against a W3C-published schema rather than against a schema published by a private initiative or a single tools vendor. And why should people bother at all to preserve stuff that has no published schema at all? Still, a corporate buyer or an LSP can negotiate within their supply chain that their extensions won't be deleted. They might even negotiate that they will be supported, ideally as an OASIS-warranted XLIFF module :-)

    I see the point that applications supporting only core might have trouble discerning between real modules and "unsupported modules". Still, I do not subscribe to this terminology, as it might obscure the substantial difference between a module and an extension. This must be resolved by a specific requirement not to delete modules, based on schema validation. Obviously a schema is not enough for processing, and sure enough we do not want the core-only applications to process the modules; we only tell them to preserve them, and to that end the schema should be enough. Besides, if they are lazy, they can decide not to validate non-core content and preserve everything at points where modules or extensions are allowed. There is no harm in it. On the other hand, if someone multiplies the size of an XLIFF file by inclusion of some verbose proprietary markup, it is totally legitimate to delete it to preserve one's own storage resources (the use case might be e.g. building an MT training corpus, or simply unwillingness to clutter my translators' machines with rubbish, and rubbish it can be if it is ANY extension).
    If extensions have the same level of protection as modules, implementers will rely on extensions rather than push modules into the common space. And this is fundamentally bad from my point of view. Extensions can compete for feature coverage with other extensions, but MUST NOT compete with features covered by core AND modules. Modules are documented in the same spec as core and other modules, along with a schema that allows them to be recognized as protected.

    I believe the following is a substantial question: what good can come from preserving an extension if you must not rely on it for merging back?

    The general Processing Requirements should go like this:
    Core MUST be processed.
    Module MUST be processed if supported, or otherwise preserved.
    Extension MAY be processed, preserved, or deleted. [my preferred way]
    Or we might want to say something like:
    Extensions SHOULD NOT be deleted if their schema is declared in the XLIFF document AND publicly accessible.
    Otherwise Extensions MAY be processed, preserved, or deleted.

    I believe that SHOULD NOT is the right normative word here, as in most cases it is good not to delete, but people can still decide to delete if they have their reasons. It is like us using : for a non-namespace prefix; we have a good reason for it, or do we? :-)

    Rgds
    dF
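The distinction David draws here rests on a tool being able to recognize module namespaces published with the spec even when it does not support them. A minimal sketch of that idea follows; the core URI matches XLIFF 2.0's published namespace, but the module URIs and all function names are my own illustrative assumptions.

```python
# Sketch of the module/extension distinction David argues for: an agent
# that ships the list of namespaces published with the spec can classify
# any element, supported or not. The core URI is XLIFF 2.0's; the module
# URIs below are illustrative assumptions, not authoritative values.
CORE_NS = "urn:oasis:names:tc:xliff:document:2.0"
MODULE_NS = {
    "urn:oasis:names:tc:xliff:metadata:2.0",  # mda module (assumed URI)
    "urn:oasis:names:tc:xliff:matches:2.0",   # matches module (assumed URI)
}

def classify(tag):
    """Classify a Clark-notation tag '{ns-uri}local' as core/module/extension."""
    ns = tag[1:].partition("}")[0] if tag.startswith("{") else ""
    if ns == CORE_NS:
        return "core"
    if ns in MODULE_NS:
        return "module"
    return "extension"
```

A core-only tool could then unconditionally preserve anything classified as "module" and apply whatever policy the PRs settle on to the rest, which is exactly the asymmetry under debate in this thread.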


  • 4.  RE: [xliff] XLIFF 2.0 spec - dF Issue #01 - Extensibility and processing requirements

    Posted 10-26-2012 11:15
    Hi David, Rodolfo, all,

    > Same as Rodolfo, I think that it is a substantial
    > difference that XLIFF modules have an OASIS warranted
    > published schema, whereas extensions don't.

    I agree, it is an important difference. It allows you to validate a vocabulary, or not. And that's it. A schema doesn't tell me what a tool should or should not do when manipulating the elements/attributes of a namespace it does not know. The processing requirements do that. They dictate what the schema can or cannot do, not the reverse.

    > This is also what I mean when I say that support of
    > extensions can be negotiated outside XLIFF TC.
    > Some authorities behind namespaces (and underlying semantics!)
    > are better than others, so some people might be happy to
    > validate against a W3C published schema rather than against
    > a schema published by a private initiative or a single tools vendor.
    > And why should people bother at all to preserve stuff that has
    > no published schema at all?

    I guess we have vastly different expectations. As an XLIFF user, I don't give a hoot whether the extension XYZ has a schema or not. I just want XLIFF to have processing requirements that allow any tool to deal with that extension. And I expect that extension to abide by those processing requirements too.

    > Still a Corporate buyer or an LSP can negotiate within their
    > supply chain that their extensions won't be deleted. They might
    > even negotiate that they will be supported, ideally as an OASIS
    > warranted XLIFF module :-)

    Sorry, but that makes no sense to me. XLIFF must work out of the box, without taking into account negotiation between LSP and tools vendor, or whether a schema is provided or not. The most common use of XLIFF is this one: tool ABC generates an XLIFF document, and the translator uses tool XYZ to translate it. There is no relation whatsoever between the two tools. The only thing that can make interoperability work is that both tools adhere to the processing requirements set in the specification.
    > Obviously schema is not enough for processing, and sure enough
    > we do not want the core only applications to process the modules,
    > we only tell them to preserve them and for that end, schema
    > should be enough.

    I'm sorry, but again, as an implementer I don't see how a schema can help me in preserving or not preserving elements/attributes. Making sure an application can store and re-provide non-core markup has nothing to do with validating that same markup.

    > On the other hand if someone multiplies the size of an xliff file
    > by inclusion of some verbose proprietary mark up it is totally
    > legitimate to delete it to preserve own storage resources.

    If you are worried about size, you should work on having the WG use <seg>, <ign>, <src>, <trg> instead of <segment>, <ignorable>, <source> and <target>. That's where you'll get your best savings.

    > Extensions can compete for feature coverage with other extensions,
    > but MUST NOT compete with features covered by core AND modules.
    > Modules are documented in the same spec as core and other
    > modules, along with a schema that allows to recognize
    > them as protected.

    a) Again, a schema doesn't allow any optional thing (modules and extensions are always optional) to be "protected". The processing requirements do.

    b) How modules are going to be defined after the core is released is a whole different (and important) topic that deserves its own thread. There are many unknowns we have never talked about yet. For example, I'm still not clear on things like:
    - how do we add modules without releasing a new version of XLIFF? or do we?
    - how does extension ABC that existed in 2013 suddenly become non-compliant because it does the same thing as module XYZ defined in 2014?
    - can we add module elements/attributes in core elements where extensions are forbidden, if we need to?
    - etc.

    > I believe the following is a substantial question:
    > What good can come from preserving an extension if you
    > must not rely on it for merging back?
    Extensions can be relied on for merging when they are in the skeleton. I think the problem with extensions placed elsewhere is that they may get deleted depending on the manipulation of the data (re-segmenting, removing <mrk>, etc.).

    > The general Processing Requirements should go like this
    > Core MUST be processed.
    > Module MUST be processed if supported, or otherwise preserved.
    > Extension MAY be processed, preserved, or deleted. [my preferred way]
    > Or we might want to say something like:
    > Extensions SHOULD NOT be deleted if their schema is declared
    > in the XLIFF document AND publicly accessible.
    > Otherwise Extensions MAY be processed, preserved, or deleted.

    Binding the processing requirements for extensions to the presence or absence of a schema makes absolutely no sense to me. Using 'SHOULD NOT' may be OK, but there is no reason to make a difference between modules and extensions. How do you know a module is a module if you don't support it? And even if you do know, why make a distinction? If you can preserve a module you don't support, you can also preserve a custom namespace in exactly the same way. Why encourage breaking people's processes?

    Cheers,
    -yves
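The naming remark above is easy to quantify with a toy comparison. The segment content is invented and the comparison ignores attributes and the rest of the XLIFF structure; it only measures the element names under discussion.

```python
# Back-of-the-envelope support for Yves' naming remark: the same segment
# serialized with the long working names vs. the short ones he proposes.
# Content is invented; this ignores attributes and real XLIFF structure.
long_form  = "<segment><source>Hello</source><target>Bonjour</target></segment>"
short_form = "<seg><src>Hello</src><trg>Bonjour</trg></seg>"

saved = len(long_form) - len(short_form)  # bytes saved per segment (ASCII)
ratio = saved / len(long_form)            # roughly 30% on this toy segment
```

The per-segment saving is fixed per element pair, so it compounds linearly with the number of segments, which is presumably why MT corpus producers notice it.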


  • 5.  Re: [xliff] XLIFF 2.0 spec - dF Issue #01 - Extensibility and processing requirements

    Posted 10-26-2012 12:33
    Yves, only a few short answers inline, as we are beginning to go in circles...

    dF

    On Fri, Oct 26, 2012 at 12:15 PM, Yves Savourel <ysavourel@enlaso.com> wrote:

    > Hi David, Rodolfo, all,
    >
    >> Same as Rodolfo, I think that it is a substantial
    >> difference that XLIFF modules have an OASIS warranted
    >> published schema, whereas extensions don't.
    >
    > I agree, it is an important difference. It allows you to validate or not a vocabulary.

    One thing we agree on in this thread.

    > And that's it. A schema doesn't tell me what a tool should or should not do when manipulating the elements/attributes of a namespace it does not know. The processing requirements do that. They dictate what the schema can or cannot do, not the reverse.
    >
    >> This is also what I mean when I say that support of
    >> extensions can be negotiated outside XLIFF TC.
    >> Some authorities behind namespaces (and underlying semantics!)
    >> are better than others, so some people might be happy to
    >> validate against a W3C published schema rather than against
    >> a schema published by a private initiative or a single tools vendor.
    >> And why should people bother at all to preserve stuff that has
    >> no published schema at all?
    >
    > I guess we have vastly different expectations.
    >
    > As an XLIFF user I don't give a hoot if the extension XYZ has a schema or not.
    > I just want XLIFF to have processing requirements that allow any tool to deal with that extension. And I expect that extension to abide by those processing requirements too.
    >
    >> Still a Corporate buyer or an LSP can negotiate within their
    >> supply chain that their extensions won't be deleted. They might
    >> even negotiate that they will be supported, ideally as an OASIS
    >> warranted XLIFF module :-)
    >
    > Sorry, but that makes no sense to me.
    >
    > XLIFF must work out of the box, without taking into account negotiation between LSP and tools vendor, or whether a schema is provided or not.
    >
    > The most common use of XLIFF is this one: tool ABC generates an XLIFF document, and the translator uses tool XYZ to translate it. There is no relation whatsoever between the two tools. The only thing that can make interoperability work is that both tools adhere to the processing requirements set in the specification.

    I agree that PRs do the trick; I just do not want them to protect extensions, as that would undermine the whole idea of normative modules.

    >> Obviously schema is not enough for processing, and sure enough
    >> we do not want the core only applications to process the modules,
    >> we only tell them to preserve them and for that end, schema
    >> should be enough.
    >
    > I'm sorry, but again as an implementer, I don't see how a schema can help me in preserving or not preserving elements/attributes. Making sure an application can store and re-provide non-core markup has nothing to do with validating that same markup.
    >
    >> On the other hand if someone multiplies the size of an xliff file
    >> by inclusion of some verbose proprietary mark up it is totally
    >> legitimate to delete it to preserve own storage resources.
    >
    > If you are worried about size, you should work on having the WG use <seg>, <ign>, <src>, <trg> instead of <segment>, <ignorable>, <source> and <target>. That's where you'll get your best savings.

    Going for the short names might make sense; I am getting exactly this feedback from MT producers. The reason why I was not pushing this is that I am also concerned with human readability. We might want to discuss separately whether or not we care. But you are avoiding the argument here. The extension can be anything, and you want to force innocent people to preserve it no matter what. Do I need to accept it if it makes the file ten times bigger? Do I need to accept it if I am able to tell that it is neither core nor module?
    >> Extensions can compete for feature coverage with other extensions,
    >> but MUST NOT compete with features covered by core AND modules.
    >> Modules are documented in the same spec as core and other
    >> modules, along with a schema that allows to recognize
    >> them as protected.
    >
    > a) Again, schema doesn't allow any optional thing (modules and extensions are always optional) to be "protected". The processing requirements do.

    Yes, and they should only protect modules, not extensions.

    > b) How modules are going to be defined after the core is released is a whole different (and important) topic that deserves its own thread.
    > There are many unknowns we have never talked about yet. For example, I'm still not clear on things like:
    > - how do we add modules without releasing a new version of XLIFF? or do we?

    It seems clear to me that a full new version must be published every time a (number of) module(s) is added. But these will be minor versions (2.x) that do not break backward compatibility of core or modules.

    > - how extension ABC that existed in 2013 suddenly becomes non-compliant because it does the same thing as module XYZ defined in 2014?

    Surely this extension will become illegal. If the owner of the extension had enough sense, she took part in specifying the module that made it obsolete, so that her needs are catered for by the module.

    > - can we add module elements/attributes in core elements where extensions are forbidden if we need to?

    I do not understand this one; can you elaborate?

    > - etc.
    >
    >> I believe the following is a substantial question:
    >> What good can come from preserving an extension if you
    >> must not rely on it for merging back?
    >
    > Extensions can be relied on for merging when they are in the skeleton. I think the problem with extensions placed elsewhere is that they may get deleted depending on the manipulation of the data (re-segmenting, removing <mrk>, etc.).
    OK, I am not talking about deleting extensions in skeletons; the skeleton is so underspecified that extensions must be protected there, I agree, and I did not reflect that in the previous strawman. There is a general problem with metadata support on segments due to re-segmenting. We might need to look into detailed ways to maximize metadata survival on re-segmenting. But again, I do not want to protect extensions.

    >> The general Processing Requirements should go like this
    >> Core MUST be processed.
    >> Module MUST be processed if supported, or otherwise preserved.

    [Inserting skeleton provision] Skeleton (if present) MUST be preserved, including extensions, until re-merge.
    [Modifying due to the skeleton provision] Extensions outside the skeleton MAY be processed, preserved, or deleted.

    >> Extension MAY be processed, preserved, or deleted. [my preferred way]
    >> Or we might want to say something like:
    >> Extensions SHOULD NOT be deleted if their schema is declared
    >> in the XLIFF document AND publicly accessible.
    >> Otherwise Extensions MAY be processed, preserved, or deleted.
    >
    > Binding the processing requirements for extensions to the presence or not of a schema makes absolutely no sense to me.
    >
    > Using 'SHOULD NOT' may be ok but there is no reason to make a difference between modules and extensions.

    So you would be OK with deleting modules?

    > How do you know a module is a module if you don't support it?

    I do not need to support it to know it is a module.

    > And even if you do know, why make a distinction?

    Because extensions are not part of the spec and can be anything.

    > If you can preserve a module you don't support you can also preserve a custom namespace exactly the same way.

    I have the option to preserve; I do not want my option to delete to be taken away. I will opt to preserve, and even process, its: and related okp: and dc: etc.
    extensions even before they become part of an officially approved OASIS XLIFF module, because the authorities behind the related specs and namespaces are good enough for me.

    > Why encourage to break people's processes?

    Because guaranteeing the lifecycle of private extensibility will kill interoperability in the long run. Core is the smallest common denominator; the plan is to expand the public, spec-based interoperability by adding modules. If extensions have the same level of protection as modules, there will be no incentive to grow the lowest common denominator. Extensions are by definition for private use. If the TC tries to warrant their survival, it undermines its own merit and kills the idea of a modular standard that can be enhanced with official modules. No one will strive to specify commonly used modules if their extensions are protected no matter what. I MUST preserve a module because it is a part of the OASIS spec; I do not want to be forced to preserve extensions, as they can be ANYTHING. I am NOT going to sign a "carte blanche".
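David's strawman processing requirements, together with the skeleton provision he inserts above, read naturally as a small decision table. The sketch below assumes a tool can already categorize a node; the function name, action names, and parameters are my own, not spec text.

```python
# Sketch of David's strawman processing requirements, including the
# skeleton provision: the set of actions an agent may take on a node.
# Function, parameter, and action names are mine, not spec wording.
def allowed_actions(category, supported=False, in_skeleton=False):
    """category is 'core', 'module', or 'extension'."""
    if category == "core":
        return {"process"}                       # Core MUST be processed
    if category == "module":
        # Module MUST be processed if supported, otherwise preserved
        return {"process"} if supported else {"preserve"}
    if in_skeleton:
        # Skeleton MUST be preserved, including extensions, until re-merge
        return {"preserve"}
    # Extensions outside the skeleton MAY be processed, preserved, or deleted
    return {"process", "preserve", "delete"}
```

Note how Yves' counter-position collapses the second and last branches into one: without a way (or a reason) to distinguish the two categories, unsupported modules and extensions would get the same action set.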


  • 6.  RE: [xliff] XLIFF 2.0 spec - dF Issue #01 - Extensibility and processing requirements

    Posted 10-28-2012 11:14
    Hi David,

    I see that your main argument is not a technical one: we should encourage people to treat custom extensions as second-class citizens because they undermine the idea of normative modules. But I think this is the wrong way to look at it: extensions do not undermine the whole idea of normative modules. They are part of the whole idea.

    Official modules are not going to just appear out of thin air. They'll start as extensions, and get tested and improved as they are used. Then they'll migrate to become official. I don't think it would be good to see company XYZ come to the TC and say: "Here is a proposal for a module", and have the TC (which has very little bandwidth) take a quick look at it and say "OK, let's add it". We need real feedback, real implementations, real users looking at and using the module. And the most natural way to have all that is through extensions. In order to have such an evolution, they need to have the same processing requirements as a module. Some extensions may end up becoming modules without even changing their namespaces, as we want to promote re-use and interoperability. Parts of ITS are possible examples of this.

    I'm OK with "SHOULD NOT be deleted" rather than "MUST be preserved" (the 'must' comes from the discussion the TC had, not directly from me). But regardless of what the processing requirements end up being, they must apply to both modules and extensions. Otherwise XLIFF is detrimental to users who want to make their needs/solutions evolve into official modules. Sure, not all extensions will become modules, but we can't control that. What we can control is the basic behavior that all tools should have when manipulating elements/attributes of namespaces they don't know.

    Cheers,
    -yves


  • 7.  Re: [xliff] XLIFF 2.0 spec - dF Issue #01 - Extensibility and processing requirements

    Posted 11-01-2012 01:46
    Yves, I think we are getting closer.

    @All: I'd like to see more opinions regarding this issue; I believe it is crucial.

    @Yves: you are right that my reasons to allow deletion of extensions are, to a large extent, not technical. [My reasons can be prioritized as follows: 1. general standardization principles (why should the TC sign a "carte blanche" for any extension producer), 2. underpinning the importance of TC-approved modules in the standard's architecture (closely intertwined with reason 1), 3. technical issues like storage size of verbose markup, or constraints on the number of namespaces in generic XML tools.]

    I guess my approach to extensions making it to modules is naked Darwinism. If someone produces extensions that others take the trouble to track down and delete, they will probably have trouble making it to an official module. 1.2 does not have processing requirements, so I guess it is admissible to delete extensions, as you will not produce an invalid XLIFF file by just deleting them. Still, some extensions are getting support among third-party implementers for whatever reasons, as our survey showed.

    Same as you, I would be unhappy with the TC approving member submissions for modules without proper vetting. But I would not be too afraid of that, as [as explained in the last round, the nature of our standard will force us to re-publish the whole spec each time, as minor 2.x versions, whenever a module or batch of modules is added] there will always be a full-blown committee and OASIS process when adding them.

    I am happy with saying that extensions SHOULD NOT be deleted (without further qualifications), rather than making the deletion truly optional (MAY). Still, I do not see why modules should not be unconditionally protected. As we know what is in modules, we can vouch for them, and we absolutely do not want to undermine them by letting users decide whether they might need to delete them.
    Valid behaviors that core-only implementers can adopt are practically two:
    1) Process core and preserve everything that is on extension points (modules or extensions).
    2) Process core AND validate the whole document against all official XLIFF schemas, then, if needed, cut parts of the tree that are NOT protected AND are causing them trouble (i.e. if they have valid reasons not to follow the SHOULD).

    Most implementers probably won't bother to track down AND delete the unprotected parts UNLESS these cause bigger trouble by being there. This seems exactly the intended behavior to me.

    BTW, I discussed the general behavior of extensions in XML-based standards with Jirka Kosek today [taking advantage of him being at W3C TPAC]. He said that it is best practice to preserve extensions that have the attribute mustUnderstand set to 'yes'. The rationale for having such an attribute is to indicate whether or not an extension changed the semantics of the core, so that the core does not make sense without the extension. IMHO we implicitly prohibit such behavior of extensions, although I believe there are such extensions in 1.2 [bad]. We might want to be more specific and prohibit this explicitly. IMHO it follows from prohibiting reliance on extensions for the roundtrip; still, being more explicit cannot hurt, I guess.

    Generally, Jirka would see no harm in deleting extensions in XML formats, except for the above case, which we can safely rule out for XLIFF. Moreover, he pointed out that it might be better to delete them than to create an ungrounded suspicion that the extension actually was understood and processed. Mind you, this argument cannot be turned against preserving modules, because implementers MUST use modules (and not extensions) if they have functionality covered by modules, so if they do not do anything to modules it follows that the modules are always in the right state from the last tool that made use of them.
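The best practice Jirka mentions could look like the following inside a deletion routine. The attribute name follows the message, but its unprefixed placement and the sample namespaces are assumptions for illustration; a real extension would define such a flag in its own namespace.

```python
# Sketch of the check implied by Jirka's remark: before deleting an
# unknown-namespace element, honor a mustUnderstand='yes' flag by
# refusing to drop it. Attribute name and placement are assumptions.
import xml.etree.ElementTree as ET

def safe_to_delete(element):
    """False if the extension declares it must be understood."""
    return element.get("mustUnderstand", "no").lower() != "yes"

DOC = ET.fromstring(
    '<xliff xmlns:x="urn:example:ext">'
    '<x:meta mustUnderstand="yes"/>'
    '<x:note/>'
    '</xliff>'
)
verdicts = {el.tag: safe_to_delete(el) for el in DOC.findall("*")}
```

Under the position taken in this thread, an XLIFF agent would never see the False branch in practice, since such semantics-changing extensions would be prohibited outright.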
As I said before, I would be happy to protect extensions in the skeleton, which is underspecified, so that we basically encourage implementers to use magic in the skeleton, if they are using it at all. Cheers dF   Dr. David Filip ======================= LRC CNGL LT-Web CSIS University of Limerick, Ireland telephone: +353-6120-2781 cellphone: +353-86-0222-158 facsimile: +353-6120-2734 mailto: david.filip@ul.ie On Sun, Oct 28, 2012 at 11:14 AM, Yves Savourel < ysavourel@enlaso.com > wrote: Hi David, I see that your main argument is not a technical one: we should encourage people to treat custom extensions as second-class citizens because they undermine the idea of normative modules. But I think this is the wrong way to look at it: extensions do not undermine the whole idea of normative modules. They are part of the whole idea. Official modules are not just going to appear out of thin air. They'll start as extensions, get tested, and be improved as they are used as extensions. Then they'll migrate to become official. I don't think it would be good to see company XYZ come to the TC and say: "Here is a proposal for a module," and have the TC (which has very little bandwidth) take a quick look at it and say "OK, let's add it." We need real feedback, real implementations, real users looking at and using the module. And the most natural way to have all that is through extensions. In order to have such an evolution, they need to have the same processing requirements as a module. Some extensions may end up becoming modules without even changing their namespaces, as we want to promote re-use and interoperability. Parts of ITS are possible examples of this. I'm OK with "SHOULD NOT be deleted" rather than "MUST be preserved" (the 'must' came from the discussion the TC had, not directly from me). But regardless of what the processing requirements end up being, they must apply to both modules and extensions. 
Otherwise XLIFF is detrimental to users who want to make their needs/solutions evolve into official modules. Sure, not all extensions will become modules, but we can't control that. What we can control is the basic behavior that all tools should have when manipulating elements/attributes of namespaces they don't know. Cheers, -yves --------------------------------------------------------------------- To unsubscribe, e-mail: xliff-unsubscribe@lists.oasis-open.org For additional commands, e-mail: xliff-help@lists.oasis-open.org


  • 8.  RE: [xliff] XLIFF 2.0 spec - dF Issue #01 - Extensibility and processing requirements

    Posted 11-01-2012 09:30
    Hi,

In the latest published version of the specification document I modified the general processing expectations (section 2.1) to this:

• A tool processing a valid XLIFF document that contains XLIFF-defined elements that it cannot handle must preserve those elements.
• A tool processing a valid XLIFF document that contains custom elements that it cannot handle should preserve those elements.
• When a <target> child is added to a <segment> element, the value of its xml:space attribute must be set to preserve if the xml:space attribute of the sibling <source> element is set to preserve.

The change covers one request from David regarding spaces and separates the treatment of extensions and XLIFF elements.

This is just a starting point; please propose new concrete wording if you want changes.

I think it is time to change the format of our discussions. Everyone should read the specification and propose concrete changes to the text wherever necessary. We should move from generalizations to detailed pieces of text.

Regards,
Rodolfo
--
Rodolfo M. Raya       rmraya@maxprograms.com
Maxprograms       http://www.maxprograms.com
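[Editor's illustration] The xml:space expectation proposed above is mechanical enough to sketch in code. The snippet below is not part of the spec text; `add_target` is a hypothetical helper showing one way a tool could satisfy the rule when it adds a <target> next to a <source> that carries xml:space="preserve".

```python
# Sketch of the proposed rule: when adding a <target> to a <segment>,
# copy xml:space="preserve" from the sibling <source> if set there.
import xml.etree.ElementTree as ET

XLIFF_NS = "urn:oasis:names:tc:xliff:document:2.0"
# ElementTree exposes xml:space under the XML namespace in Clark notation.
XML_SPACE = "{http://www.w3.org/XML/1998/namespace}space"

def add_target(segment: ET.Element, text: str) -> ET.Element:
    """Append a <target> to a segment, propagating xml:space from <source>."""
    source = segment.find(f"{{{XLIFF_NS}}}source")
    target = ET.SubElement(segment, f"{{{XLIFF_NS}}}target")
    target.text = text
    if source is not None and source.get(XML_SPACE) == "preserve":
        target.set(XML_SPACE, "preserve")  # the proposed MUST
    return target

seg = ET.fromstring(
    f'<segment xmlns="{XLIFF_NS}">'
    f'<source xml:space="preserve">  Hello  </source></segment>'
)
tgt = add_target(seg, "  Bonjour  ")
assert tgt.get(XML_SPACE) == "preserve"
```

Propagating the attribute at creation time keeps significant whitespace in the target aligned with the source, which is the point of the request.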


  • 9.  RE: [xliff] XLIFF 2.0 spec - dF Issue #01 - Extensibility and processing requirements

    Posted 11-02-2012 02:32
    Thanks Rodolfo, this wording indeed covers my requests. @All, As I believe that this is a matter of high importance and profound impact, I'd like to propose an electronic ballot to approve this wording. Hereby proposing and looking for a second. The ballot is intended as a yes/no ballot about Rodolfo's entire wording of the general processing requirements based on my 2 change requests. This ballot is not intended as a ballot about the previous wording of the section. If this ballot does not pass I'd say that the section is unstable and requires discussion in TC at large. Thanks and regards dF


  • 10.  Fwd: RE: [xliff] XLIFF 2.0 spec - dF Issue #01 - Extensibility and processing requirements

    Posted 11-02-2012 02:41
    sending from the proper e-mail address. ---------- Forwarded message ---------- From: "David Filip" < davidf@davidf.org > Date: Nov 2, 2012 2:31 AM Subject: RE: [xliff] XLIFF 2.0 spec - dF Issue #01 - Extensibility and processing requirements To: "Rodolfo M. Raya" < rmraya@maxprograms.com > Cc: < xliff@lists.oasis-open.org >


  • 11.  RE: [xliff] XLIFF 2.0 spec - dF Issue #01 - Extensibility and processing requirements

    Posted 11-02-2012 03:39
    Hi,   David: I don’t think it’s healthy to have to go through a ballot each time we change two sentences. Especially when the new wording of section 2.1 contradicts the wording of 2.7.2.   No matter how you look at it, there is no technical ground to treat custom extensions and un-supported modules differently.   Before we resolve whether or not they should be treated differently based on philosophical reasons, we need to define how un-supported elements/attributes should be handled in operations like re-segmentation. Blanket statements are not making sense.   Regards, -yves       From: xliff@lists.oasis-open.org [mailto:xliff@lists.oasis-open.org] On Behalf Of Rodolfo M. Raya Sent: Thursday, November 01, 2012 3:30 AM To: xliff@lists.oasis-open.org Subject: RE: [xliff] XLIFF 2.0 spec - dF Issue #01 - Extensibility and processing requirements   Hi,   In latest published version of the specification document I modified the general processing expectations (section 2.1) to this:     • A tool processing a valid XLIFF document that contains XLIFF-defined elements that it cannot handle must preserve those elements. • A tool processing a valid XLIFF document that contains custom elements that it cannot handle should preserve those elements. • When a <target> child is added to a <segment> element, the value of its xml:space attribute must set to preserve if the xml:space attribute of the sibling <source> element is set to preserve .     The change covers one request from David regarding spaces and separates the treatment of extensions and XLIFF elements.   This is just a starting point, please propose a new concrete wording if you want changes.   I think it is time to change the format of our discussions. Everyone should read the specification and propose concrete changes to the text wherever it is necessary. We should move from generalizations to detailed pieces of text.   Regards, Rodolfo -- Rodolfo M. 
Raya       rmraya@maxprograms.com Maxprograms       http://www.maxprograms.com   From: xliff@lists.oasis-open.org [ mailto:xliff@lists.oasis-open.org ] On Behalf Of Dr. David Filip Sent: Wednesday, October 31, 2012 11:45 PM To: Yves Savourel Cc: xliff@lists.oasis-open.org Subject: Re: [xliff] XLIFF 2.0 spec - dF Issue #01 - Extensibility and processing requirements   Yves, I think we are getting closer. @All I'd like to see more opinions regarding this issue, I believe it is crucial..   @Yves, you are right that my reasons to allow deletion of extensions are not technical to a large extent. [my reasons can be prioritized as follows: 1. general standardization principles (why should the TC sign a "carte blanche" for any extension producers), 2. underpinning the importance of TC approved modules in the standard's architecture (closely intertwined with reason 1.). 3. technical issues like storage size of verbose markup, constraints on number of namespaces in generic XML tools]   I guess my approach to extensions making it to modules is naked Darwinism. If someone produces extensions that others take the trouble to track down and delete, they will probably have trouble to make it to an official module. 1.2 does not have processing requirements so I guess it is admissible to delete extensions, as you will not produce an invalid XLIFF file by just deleting extensions. Still some extensions are getting support among third party implementers for whatever reasons, as our survey showed.   Same as you, I would be unhappy with TC approving member submissions for modules without proper vetting. But I would not be too afraid of that, as [as explained in the last round, the nature of our standard will force us to re-publish the whole spec each time for minor 2.x versions, that is whenever a module or batch of modules is added] there will always be a full blown committee and OASIS process whenever adding them.   
I am happy with saying that extensions SHOULD NOT be deleted (without further qualifications), rather than making the deletion or not truly optional (MAY). Still I do not see why modules should not be unconditionally protected. As we know what is in modules, we can vouch for them and we absolutely do not want to undermine them by letting users to decide if they might need to delete them.    Valid behaviors that core only implementers can adopt are practically two:   1) Process core and preserve everything that is on extension points (modules or extensions).   2) Process core AND validate the whole document against all official XLIFF schemas. Eventually, cut parts of the tree that are NOT protected AND causing them trouble (i.e. if they have valid reasons not to follow the SHOULD).   Most of the implementers probably won't bother to track down AND delete the unprotected parts UNLESS these are causing bigger trouble by being there.   This seems exactly the intended behavior to me..   BTW, I discussed the general behavior of extensions in XML based standards with Jirka Kosek today [took advantage of him being at W3C TPAC].   He said that it is best practice to preserve extensions that have attribute mustUnderstand set to 'yes'. The rationale of having such attribute is to indicate whether or not an extension changed the semantics of the core, so that it does not make sense without the extension. IMHO we implicitly prohibit such behavior of extensions, although I believe there are such extensions on 1.2 [bad]. We might want to be more specific and prohibit this explicitly. IMHO it follows from prohibiting to rely on extensions for the roundtrip, still being more explicit cannot hurt I guess.   
Generally, Jirka would see no harm in deleting extensions in XML formats, except for the above case, which we can safely rule out for XLIFF; moreover, he pointed out that it might be better to delete them than to eventually create an ungrounded suspicion that the extension actually was understood and processed.   Mind you, this argument cannot be turned against preserving modules, because implementers MUST use modules (and not extensions) if they have functionality covered by modules; so if they do not do anything to modules, it follows that the modules are always in the right state from the last tool that made use of them.   As I said before, I would be happy to protect extensions in the skeleton, which is underspecified, so that we basically encourage implementers to use magic in the skeleton, if they are using it at all.   Cheers dF   Dr. David Filip ======================= LRC CNGL LT-Web CSIS University of Limerick, Ireland telephone: +353-6120-2781 cellphone: +353-86-0222-158 facsimile: +353-6120-2734 mailto: david.filip@ul.ie   On Sun, Oct 28, 2012 at 11:14 AM, Yves Savourel < ysavourel@enlaso.com > wrote: Hi David, I see that your main argument is not a technical one: We should encourage people to treat custom extensions as second-class citizens because they undermine the idea of normative modules. But I think this is the wrong way to look at it: Extensions do not undermine the whole idea of normative modules. They are part of the whole idea. Official modules are not just going to appear out of thin air. They'll start as extensions, get tested, and be improved as they are used as extensions. Then they'll migrate to become official. I don't think it would be good to see company XYZ come to the TC and say: "Here is a proposal for a module." And have the TC (which has very little bandwidth) take a quick look at it and say "OK, let's add it". We need real feedback, real implementations, real users looking at and using the module. 
And the most natural way to have all that is through extensions. In order to have such evolution, they need to have the same processing requirements as a module. Some extensions may end up becoming modules without even changing their namespaces, as we want to promote re-use and interoperability. Parts of ITS are possible examples of this. I'm OK with "SHOULD NOT be deleted" rather than "MUST be preserved" (the 'must' came from the discussion the TC had, not directly from me). But, regardless of what the processing requirements end up being, they must apply to both modules and extensions. Otherwise XLIFF is detrimental to users who want to make their needs/solutions evolve into official modules. Sure, not all extensions will become modules, but we can't control that. What we can control is the basic behavior that all tools should have when manipulating elements/attributes of namespaces they don't know. Cheers, -yves --------------------------------------------------------------------- To unsubscribe, e-mail: xliff-unsubscribe@lists.oasis-open.org For additional commands, e-mail: xliff-help@lists.oasis-open.org  
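The behavior the two posts converge on (a core-only agent that simply leaves unknown-namespace content alone during a round trip) can be sketched in a few lines. This is a minimal illustration, not implementation guidance from the spec; the `abc` extension namespace and element names are invented for the example.

```python
# Sketch: a core-only agent round-trips an XLIFF document and preserves
# whatever sits on extension points, whether module or custom extension.
# The extension namespace "http://example.com/abc" is hypothetical.
import xml.etree.ElementTree as ET

SAMPLE = """<xliff xmlns="urn:oasis:names:tc:xliff:document:2.0"
       xmlns:abc="http://example.com/abc" version="2.0">
 <file id="f1">
  <unit id="u1">
   <abc:meta>custom extension data</abc:meta>
   <segment>
    <source>Hello</source>
   </segment>
  </unit>
 </file>
</xliff>"""

def roundtrip(xml_text):
    """Parse and re-serialize without touching unknown namespaces."""
    root = ET.fromstring(xml_text)
    # A conformant agent would edit core content here (e.g. fill targets)
    # while leaving module and extension nodes in place.
    return ET.tostring(root, encoding="unicode")

reparsed = ET.fromstring(roundtrip(SAMPLE))
kept = reparsed.findall(".//{http://example.com/abc}meta")
print(len(kept))  # the extension element survives the round trip
```

With a generic tree-based parser, unknown-namespace nodes travel along with the document by default, so preservation is the path of least resistance; deletion would require actively tracking the nodes down, which is David's point about "taking the trouble" to delete.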


  • 12.  Re: [xliff] XLIFF 2.0 spec - dF Issue #01 - Extensibility and processing requirements

    Posted 11-02-2012 09:06
    Yves, I agree that it would be bad to have a ballot every time two sentences are changed. Yet these sentences are not just any two sentences. My goal is to be able to put together a conformance test suite, which cannot be achieved unless PRs are sorted out throughout the spec. And it is only natural to try and fix the PRs starting with the general processing requirements. Sure enough, there is a contradiction with 2.7.2, but IMHO this contradiction should simply be fixed (with no need for a ballot) by changing 2.7.2 [replace MUST with SHOULD] in case the proposed general principle is approved by the TC ballot. I formulated the ballot proposal as a proposal about wording, as I agree with Rodolfo that at this point we should have very specific discussions about the current draft wording and how to change it to make the spec better and closer to a final TC draft. I believe there will be a few more ballots before the bulk of the spec can go for a full ballot resolution. Also, I proposed the ballot as it seems the only way to figure out the preference of the TC at large, because so far the only people active in this discussion are you, Rodolfo, and myself, which does not provide much material for consensus sensing. @All, still looking for a second, or alternatively for a competing/clarifying wording proposal for the general PRs. Cheers dF Dr. David Filip ======================= LRC CNGL LT-Web CSIS University of Limerick, Ireland telephone: +353-6120-2781 cellphone: +353-86-0222-158 facsimile: +353-6120-2734 mailto: david.filip@ul.ie On Fri, Nov 2, 2012 at 3:38 AM, Yves Savourel < ysavourel@enlaso.com > wrote: Hi,   David: I don’t think it’s healthy to have to go through a ballot each time we change two sentences. Especially when the new wording of section 2.1 contradicts the wording of 2.7.2.   No matter how you look at it, there is no technical ground to treat custom extensions and un-supported modules differently.   
Before we resolve whether or not they should be treated differently on philosophical grounds, we need to define how un-supported elements/attributes should be handled in operations like re-segmentation. Blanket statements do not make sense.   Regards, -yves       From: xliff@lists.oasis-open.org [mailto:xliff@lists.oasis-open.org] On Behalf Of Rodolfo M. Raya Sent: Thursday, November 01, 2012 3:30 AM To: xliff@lists.oasis-open.org Subject: RE: [xliff] XLIFF 2.0 spec - dF Issue #01 - Extensibility and processing requirements   Hi,   In the latest published version of the specification document I modified the general processing expectations (section 2.1) to this:
  • A tool processing a valid XLIFF document that contains XLIFF-defined elements that it cannot handle must preserve those elements.
  • A tool processing a valid XLIFF document that contains custom elements that it cannot handle should preserve those elements.
  • When a <target> child is added to a <segment> element, the value of its xml:space attribute must be set to preserve if the xml:space attribute of the sibling <source> element is set to preserve.
The change covers one request from David regarding spaces and separates the treatment of extensions and XLIFF elements.   This is just a starting point; please propose new concrete wording if you want changes.   I think it is time to change the format of our discussions. Everyone should read the specification and propose concrete changes to the text wherever necessary. We should move from generalizations to detailed pieces of text.   Regards, Rodolfo -- Rodolfo M. Raya       rmraya@maxprograms.com Maxprograms       http://www.maxprograms.com  
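Rodolfo's third bullet (propagating xml:space="preserve" from the sibling <source> when a <target> child is created) can be sketched as follows. The helper name `add_target` and the sample content are mine; only the element and attribute names come from the draft.

```python
# Sketch of the proposed xml:space rule: a newly added <target> inherits
# xml:space="preserve" from the sibling <source>, so significant
# whitespace in the translation is not re-wrapped downstream.
import xml.etree.ElementTree as ET

# ElementTree expands the built-in xml: prefix to this namespace URI.
XML_SPACE = "{http://www.w3.org/XML/1998/namespace}space"

def add_target(segment, text):
    """Add a <target> to a <segment>, copying whitespace handling."""
    source = segment.find("source")
    target = ET.SubElement(segment, "target")
    target.text = text
    if source is not None and source.get(XML_SPACE) == "preserve":
        target.set(XML_SPACE, "preserve")
    return target

seg = ET.fromstring(
    '<segment><source xml:space="preserve">  two  spaces  </source></segment>'
)
tgt = add_target(seg, "  deux  espaces  ")
print(tgt.get(XML_SPACE))  # -> preserve
```

This is exactly the kind of requirement Yves argues belongs next to the definition of <target> rather than in the general PRs, since it concerns one specific element.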


  • 13.  RE: [xliff] XLIFF 2.0 spec - dF Issue #01 - Extensibility and processing requirements

    Posted 11-02-2012 09:40
    Hi Yves,   I'm looking for concrete proposals on the text. My goal is to advance so we can have a specification ready for ballot in December.   Should I adjust the text in 2.7.2 to match the changes in 2.1, or should I make changes in 2.1 instead?   Regards, Rodolfo -- Rodolfo M. Raya       rmraya@maxprograms.com Maxprograms       http://www.maxprograms.com  


  • 14.  RE: [xliff] XLIFF 2.0 spec - dF Issue #01 - Extensibility and processing requirements

    Posted 11-05-2012 13:10
    Hi all, > In the latest published version of the specification document > I modified the general processing expectations (section 2.1) > to this: > • A tool processing a valid XLIFF document that contains XLIFF-defined > elements that it cannot handle must preserve those elements. > • A tool processing a valid XLIFF document that contains custom elements > that it cannot handle should preserve those elements. > • When a <target> child is added to a <segment> element, the value of > its xml:space attribute must be set to preserve if the xml:space attribute > of the sibling <source> element is set to preserve. > > The change covers one request from David regarding spaces and separates the > treatment of extensions and XLIFF elements. > > This is just a starting point, please propose a new concrete wording > if you want changes. I disagree with the proposal for the following reasons: 1) Not all core/module elements should always be preserved when manipulating the document. We need to define this per element in some cases. Re-segmentation is an obvious example. 2) I assume "XLIFF-defined" means the element is part of a namespace defined by the XLIFF TC. But I don't think this is the same as being a "module". If, for example, part of ITS becomes a module, it's not "XLIFF-defined", but it's still a module. 3) The processing requirements about <target> belong in the section that defines <target>. So, instead, I propose this as the general PRs: ----------- - All user agents MUST follow the processing requirements associated with the core elements and attributes of XLIFF. Those processing requirements are defined throughout the specification. - A user agent that does not support a given non-core element or attribute SHOULD preserve it, except when a core processing requirement specifies otherwise. (See, for example, section 2.8.3 Segmentation Modification). ----------- I'm also going to post an email with an initial proposal for the PRs associated with re-segmentation. 
Regards, -yves
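Yves's proposed general rule treats modules and extensions identically: keep anything you do not support unless a specific core PR says otherwise. A minimal sketch of that decision, assuming an illustrative core namespace URI and a simple boolean flag for "a core PR specifies otherwise":

```python
# Sketch of the proposed general PRs: core content follows its own
# processing requirements; every non-core node is kept by default,
# overridden only by a specific core PR (e.g. segmentation modification).
# The namespace URI and the override flag are illustrative assumptions.
CORE_NS = "urn:oasis:names:tc:xliff:document:2.0"

def preserve_unsupported(namespace_uri, core_pr_overrides=False):
    """Return True if an unsupported node in this namespace should be kept."""
    if namespace_uri == CORE_NS:
        raise ValueError("core nodes follow their own processing requirements")
    # Module or extension: SHOULD preserve, with no distinction between them.
    return not core_pr_overrides

print(preserve_unsupported("http://example.com/abc"))                   # True
print(preserve_unsupported("http://example.com/abc", core_pr_overrides=True))  # False
```

Note that the function deliberately takes no "is this a module?" input: under this wording, that distinction has no effect on behavior, which is precisely what David objects to below.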


  • 15.  RE: [xliff] XLIFF 2.0 spec - dF Issue #01 - Extensibility and processing requirements

    Posted 11-05-2012 18:43


  • 16.  RE: [xliff] XLIFF 2.0 spec - dF Issue #01 - Extensibility and processing requirements

    Posted 11-05-2012 21:08
    Hi, > XLIFF-defined should be read as "defined by the XLIFF TC". > The ITS group can propose an enhancement but cannot define > an XLIFF module, it has to be defined (probably based on an ITS > proposal) and approved by the XLIFF TC. Any module will have > to use a namespace defined by the XLIFF TC following the > rules stated by OASIS. Could you point us to those rules? I'd like to read them and likely provide some feedback to OASIS. Why would OASIS have any rule on what a module should be made of? We use, for example, xml:lang in the core and in one module. We could use the ITS namespace in another module; as long as the core schema defines what goes where, I don't see a reason for not using the ITS namespace itself in a module. Not using it would go against the very idea of interoperability. Regards, -yves


  • 17.  RE: [xliff] XLIFF 2.0 spec - dF Issue #01 - Extensibility and processing requirements

    Posted 11-05-2012 23:28
    OASIS has rules for namespaces: http://docs.oasis-open.org/specGuidelines/namingGuidelines/resourceNaming.html . The namespaces we publish must be conformant. We cannot publish a namespace from W3C as our own. Regards, Rodolfo -- Rodolfo M. Raya rmraya@maxprograms.com Maxprograms http://www.maxprograms.com


  • 18.  RE: [xliff] XLIFF 2.0 spec - dF Issue #01 - Extensibility and processing requirements

    Posted 11-06-2012 12:09
    Hi, > OASIS has rules for namespaces: > http://docs.oasis-open.org/specGuidelines/namingGuidelines/resourceNaming.html > The namespaces we publish must be conformant. > We cannot publish a namespace from W3C as our own. Thanks for pointing to the documentation, Rodolfo. I see rules on how OASIS URIs and names in general should be formed, but as far as I can tell there is nothing that says we cannot use attributes of a non-OASIS namespace in a specification. In this case we're not creating a new schema/namespace; we are defining a module that says how to use an existing standard. >> Any module will have to use >> a namespace defined by the XLIFF TC following >> the rules stated by OASIS. If we were to define a new namespace we would have to follow the OASIS rules for its URI, but I didn't see anything in the document that states that modules must be defined through an OASIS namespace. Even our own statement of what a module is says (section 1.1.3): "A module is an optional set of XML elements and attributes that stores information about a process applied to an XLIFF document and the data incorporated into the document as result of that process. Each official module defined for XLIFF 2.0 has its grammar defined in an independent XML Schema with a separate namespace." There is nothing there that says the namespace must be defined by the TC, or that it cannot be the namespace of an existing standard. We can update the core schema exactly as for the Glossary or the Match modules by telling where the attributes/elements of the module are allowed. What the URI of the namespace is has no bearing on this. Not using the ITS namespace would make the markup non-interoperable. It would also go against one of the take-aways that the TC got from its users at the first XLIFF symposium: re-use existing standards. We would be re-defining something that already exists and that could be used as-is. Regards, -yves
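Yves's interoperability point is that markup carried in the original W3C ITS namespace stays recognizable to any ITS-aware tool, because the namespace URI itself identifies the vocabulary. The `its:translate` attribute and its namespace URI below come from the W3C ITS recommendation; the surrounding <unit> markup is a hypothetical example, not sanctioned module syntax:

```python
# Sketch: an attribute re-used from the W3C ITS namespace keeps its
# original namespace URI after parsing, so ITS-aware tools can match
# on that URI regardless of what XLIFF schema allowed the attribute.
import xml.etree.ElementTree as ET

ITS_NS = "http://www.w3.org/2005/11/its"  # namespace URI from the ITS spec

frag = ET.fromstring(
    '<unit xmlns:its="http://www.w3.org/2005/11/its" its:translate="no" id="u1"/>'
)
value = frag.get("{%s}translate" % ITS_NS)
print(value)  # -> no
```

Minting a TC-owned copy of the same attribute under a different URI would force ITS processors to special-case XLIFF, which is the "re-defining something that already exists" cost Yves describes.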


  • 19.  RE: [xliff] XLIFF 2.0 spec - dF Issue #01 - Extensibility and processing requirements

    Posted 11-06-2012 12:27
    Hi Yves, As long as the XLIFF TC is not the one that maintains ITS, anything from ITS has to be considered a custom extension. If you want to define a module that mimics ITS, that's fine, as we will be the ones in control. If you want to embed ITS in XLIFF, then the agreed restrictions for elements that allow custom extensions should apply. There should not be anything from ITS inside <segment>. Regards, Rodolfo -- Rodolfo M. Raya rmraya@maxprograms.com Maxprograms http://www.maxprograms.com


  • 20.  Re: [xliff] XLIFF 2.0 spec - dF Issue #01 - Extensibility and processing requirements

    Posted 11-05-2012 21:13
    All, Rodolfo, Yves, I agree that the target-related PR belongs there and not in the general PRs. I obviously have an issue with not protecting modules as a general rule, a rule that will of course have exceptions when removing core elements according to core processing requirements. Module data are not essential in the same way as core, but unlike extensions they are the essential and only possible way to cover their specific functionality. So, as a general rule, they must have a higher level of protection. Otherwise there is no substantial difference between a module and an extension. In normative theory, special rules always beat general rules. And maybe we have so many special rules that there is little value left in setting out general processing requirements, as all implementers must obviously follow the core PRs and all implementers of modules must follow the modules' PRs. Still, in the general section we must at least say what to do with modules that you do not support and what to do with extensions that you do not support, in cases where it does not follow from core manipulations. The general principle, of course, is that when a core element is gone and there is no clear successor, the dependent module or custom data not only need not be preserved but simply cannot be preserved, as there is no carrier left. There is no harm in saying:  > - All user agents MUST follow the processing requirements associated with > the core elements and attributes of XLIFF. Those processing requirements > are defined throughout the specification. Even though it is tautological, there is no harm in reiterating it as a general principle. This, however, has the fundamental issue of not protecting modules better than extensions: > - A user agent that does not support a given non-core element or attribute > SHOULD preserve it, except when a core processing requirement specifies > otherwise. (See, for example, section 2.8.3 Segmentation Modification). 
So I counter-propose to have the following two paragraphs instead: - A user agent that does not support a given module element or attribute MUST preserve it, except when a core processing requirement specifies otherwise. (See, for example, section 2.8.3 Segmentation Modification). - A user agent that does not support a given custom namespace element or attribute SHOULD preserve it, except when a core processing requirement specifies otherwise. (See, for example, section 2.8.3 Segmentation Modification, or 2.2.2.3 skeleton). This general rule should be overridden by a specific rule in section 2.2.2.3 skeleton: - Because the skeleton, if present, is essential for merging back translatable content, the skeleton, including its custom elements, MUST be preserved.   We will also need a definition of "support" in section 1.1.2 Definitions if we go with this wording. It should say that to support an element or attribute means to read, understand, process, and write it according to its processing requirements. Best regards dF Dr. David Filip ======================= LRC CNGL LT-Web CSIS University of Limerick, Ireland telephone: +353-6120-2781 cellphone: +353-86-0222-158 facsimile: +353-6120-2734 mailto: david.filip@ul.ie
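David's counter-proposal amounts to a three-tier preservation policy: MUST for modules, SHOULD for custom extensions, and an unconditional MUST for anything inside the skeleton. A minimal sketch, assuming a registry of module namespace URIs (the URNs below are illustrative, not the official 2.0 module namespaces):

```python
# Sketch of the counter-proposal as a preservation policy. The module
# namespace registry and URNs are assumptions for illustration only.
MODULE_NAMESPACES = {
    "urn:oasis:names:tc:xliff:matches:2.0",   # hypothetical module namespace
    "urn:oasis:names:tc:xliff:glossary:2.0",  # hypothetical module namespace
}

def preservation_level(namespace_uri, inside_skeleton=False):
    """Return the RFC 2119 keyword governing preservation of a node."""
    if inside_skeleton:
        return "MUST"   # skeleton content is essential for merging back
    if namespace_uri in MODULE_NAMESPACES:
        return "MUST"   # modules are protected unless a core PR says otherwise
    return "SHOULD"     # custom extensions may be dropped for valid reasons

print(preservation_level("urn:oasis:names:tc:xliff:matches:2.0"))       # MUST
print(preservation_level("http://example.com/abc"))                     # SHOULD
print(preservation_level("http://example.com/abc", inside_skeleton=True))  # MUST
```

Unlike Yves's wording, this policy needs a maintained list of module namespaces, which is exactly the governance point in dispute: only the TC can add entries to that set.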


  • 21.  RE: [xliff] XLIFF 2.0 spec - dF Issue #01 - Extensibility and processing requirements

    Posted 11-06-2012 11:20
    FWIW, yesterday I added processing requirements to the definition of <skeleton> stating that it must be kept unchanged.

Regards,
Rodolfo
--
Rodolfo M. Raya       rmraya@maxprograms.com
Maxprograms       http://www.maxprograms.com

From: xliff@lists.oasis-open.org [mailto:xliff@lists.oasis-open.org] On Behalf Of Dr. David Filip
Sent: Monday, November 05, 2012 7:12 PM
To: Rodolfo M. Raya
Cc: xliff@lists.oasis-open.org
Subject: Re: [xliff] XLIFF 2.0 spec - dF Issue #01 - Extensibility and processing requirements

[quoted message trimmed; identical to the preceding post]


  • 22.  RE: [xliff] XLIFF 2.0 spec - dF Issue #01 - Extensibility and processing requirements

    Posted 11-06-2012 12:02
    Hi David, all,

> - A user agent that does not support a given module element or
> attribute MUST preserve it, except when a core processing requirement
> specifies otherwise.

It makes no sense to me that an optional module MUST be preserved. It's not optional anymore. Furthermore, if core-only tools cannot remove a module, I assume the tools aware of the module can. So we would have the absurd situation where a tool that doesn't support module ABC must preserve it, but the tool XYZ that supports that module can remove it.

> Module data are not essential in the same way as core, but unlike
> extensions they are the essential and only possible way to cover
> their specific functionality.

That doesn't make them special. I want tools to be able to get rid of any non-core constructs if they see a need for it. The reasons you listed for tools to get rid of the elements of a custom namespace are just as valid for the elements of a module. As the PRs I've posted earlier illustrate, you can treat all non-core elements exactly the same when doing, for example, re-segmentation. Why create an artificial difference (that would actually require more work)?

The distinction between modules and custom namespaces exists:
- modules always have a schema.
- modules are documented and backed by the TC.
- modules could exist in places extensions could not.

Those are all incentives for tool vendors to migrate their extensions to modules.

Regards,
-yves


  • 23.  RE: [xliff] XLIFF 2.0 spec - dF Issue #01 - Extensibility and processing requirements

    Posted 11-06-2012 12:20
    Hi,

I tend to agree with Yves, it should be possible to discard non-core data. Let me give you an example: an XLIFF file with 10,000 segments is not unusual these days; after being fully translated and populated with TM and MT matches, the file may become 10 times larger than it was originally. Removing all matches before merging can reduce the size of the file to one third and improve merging speed considerably. TM and MT data is not required for merging and is useless after all translations are marked as final.

I recently saw a good case: the original XLIFF (source only) was 1.5 MB; the full file (translations + TM data) was 27 MB; the trimmed file (source and target only) was 2.5 MB. Merging 2.5 MB was much faster than merging 27 MB.

Regards,
Rodolfo
--
Rodolfo M. Raya
rmraya@maxprograms.com
Maxprograms
http://www.maxprograms.com
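The trimming step Rodolfo describes can be sketched in a few lines. This is only an illustration, not anything defined in the spec: it assumes the Matches module namespace urn:oasis:names:tc:xliff:matches:2.0 from the XLIFF 2.0 drafts, and simply strips every matches container from the document before merging:

```python
import xml.etree.ElementTree as ET

# Assumed Matches module namespace (XLIFF 2.0 drafts); adjust if it differs.
MTC_NS = "urn:oasis:names:tc:xliff:matches:2.0"

def trim_matches(xliff_xml: str) -> str:
    """Remove every matches container, e.g. before merging a finished file."""
    root = ET.fromstring(xliff_xml)
    for parent in list(root.iter()):   # snapshot before mutating the tree
        for child in list(parent):     # copy: we remove while iterating
            if child.tag == f"{{{MTC_NS}}}matches":
                parent.remove(child)
    return ET.tostring(root, encoding="unicode")

# Tiny hypothetical document to exercise the function.
doc = (
    '<xliff xmlns="urn:oasis:names:tc:xliff:document:2.0">'
    '<file id="f1"><unit id="u1">'
    '<matches xmlns="urn:oasis:names:tc:xliff:matches:2.0"><match/></matches>'
    '<segment><source>Hello</source><target>Bonjour</target></segment>'
    '</unit></file></xliff>'
)
trimmed = trim_matches(doc)
```

After the call, the source and target content survives while all match candidates (and their namespace declaration) are gone, which is exactly the size reduction described above.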


  • 24.  Re: [xliff] XLIFF 2.0 spec - dF Issue #01 - Extensibility and processing requirements

    Posted 11-06-2012 14:03
    Rodolfo, this is a good point that should be considered for a set of special PRs pertaining to matches, overriding the general protection of modules. While it makes sense to delete matches before merging, it does not make sense to delete them before translation. Even if you are reading the matches into an accompanying TMX, or sucking them into your proprietary TM, this should not be a valid reason for deleting the matches, as tools downstream, including the merger/extractor, might still need them, e.g. for review, engineering checks, edit distance checks, etc.

We should say something like:

Matches for a source element MUST NOT be trimmed until its target element is populated with translation.
Matches for a source element SHOULD NOT be trimmed until its target element is marked final.

If the merger wants to be able to delete matches before merging, it should at least understand the above PRs. It is quite common with corporate merging tools that they check the status of targets before performing the merge, and if they want to send an XLIFF failing the final target criteria back to the service provider, the matches will still be useful.

Cheers
dF

Dr. David Filip
=======================
LRC CNGL LT-Web CSIS
University of Limerick, Ireland
telephone: +353-6120-2781
cellphone: +353-86-0222-158
facsimile: +353-6120-2734
mailto: david.filip@ul.ie

On Tue, Nov 6, 2012 at 12:20 PM, Rodolfo M. Raya < rmraya@maxprograms.com > wrote:

[quoted message trimmed; identical to post 23 above]
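The two proposed trimming PRs can be read as a simple gate. Here is a hypothetical sketch; the state values are illustrative only, loosely following the target-state attribute discussed for XLIFF 2.0:

```python
def may_trim_matches(target_text, target_state):
    """Gate for trimming matches per the two proposed PRs.

    Returns (permitted, recommended):
      permitted   is False while the target is unpopulated (MUST NOT trim);
      recommended is False until the target is marked final (SHOULD NOT trim).
    """
    populated = bool(target_text and target_text.strip())
    if not populated:
        return (False, False)   # MUST NOT trim yet
    return (True, target_state == "final")

print(may_trim_matches("", "initial"))            # (False, False)
print(may_trim_matches("Bonjour", "translated"))  # (True, False)
print(may_trim_matches("Bonjour", "final"))       # (True, True)
```

A corporate merger that checks target status before merging, as described above, would simply refuse to trim while permitted is False and would warn while recommended is False.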


  • 25.  RE: [xliff] XLIFF 2.0 spec - dF Issue #01 - Extensibility and processing requirements

    Posted 11-06-2012 15:52
    Hi David, all,

> We should say something like
> Matches for a source element MUST NOT be trimmed until its target
> element is populated with translation.
> Matches for a source element SHOULD NOT be trimmed until its
> target element is marked final.

No. You cannot come up with PRs that cover all use cases like that. Take this customer of ENLASO: they send us XLIFF documents with the source, no target, and 5 different entries for alt-trans, each for a different language. Their logic is: we just have to create one XLIFF and you send us 5 back; sometimes the alt-trans for language ABC could be helpful for the translator XYZ. It's logical. But it makes the file humongous and we can't work with it. So we create one file per language, with the corresponding alt-trans. With PRs like the ones above we could not technically do it. We obviously would do it anyway.

My point is: let's not define PRs that users may be forced to not respect. That is as bad as not having any PRs at all, like in 1.2.

Regards,
-yves
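The per-language split described above is easy to sketch. The following is illustrative only: the candidate element name is hypothetical, and it assumes the translation candidates carry an xml:lang attribute (as alt-trans targets do in XLIFF 1.2). It produces one copy of the document per language, dropping the candidates for all other languages:

```python
import copy
import xml.etree.ElementTree as ET

# ElementTree exposes xml:lang under the predeclared XML namespace.
XML_LANG = "{http://www.w3.org/XML/1998/namespace}lang"

def per_language_copies(xliff_xml, languages):
    """One document per language, keeping only that language's candidates."""
    out = {}
    for lang in languages:
        root = ET.fromstring(xliff_xml)      # fresh tree per output file
        for parent in list(root.iter()):     # snapshot before mutating
            for child in list(parent):
                child_lang = child.get(XML_LANG)
                if child_lang is not None and child_lang != lang:
                    parent.remove(child)
        out[lang] = ET.tostring(root, encoding="unicode")
    return out

# Hypothetical unit with candidates in two languages.
doc = (
    '<unit id="u1">'
    '<candidate xml:lang="fr">Bonjour</candidate>'
    '<candidate xml:lang="de">Hallo</candidate>'
    '<source>Hello</source>'
    '</unit>'
)
copies = per_language_copies(doc, ["fr", "de"])
```

Each resulting file keeps the shared source and only one language's candidates, which is the size reduction the customer scenario above is about.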


  • 26.  Re: [xliff] XLIFF 2.0 spec - dF Issue #01 - Extensibility and processing requirements

    Posted 11-06-2012 13:37
    Yves, inline..

Cheers
dF

Dr. David Filip
=======================
LRC CNGL LT-Web CSIS
University of Limerick, Ireland
telephone: +353-6120-2781
cellphone: +353-86-0222-158
facsimile: +353-6120-2734
mailto: david.filip@ul.ie

On Tue, Nov 6, 2012 at 12:01 PM, Yves Savourel < ysavourel@enlaso.com > wrote:

> Hi David, all,
>
> > - A user agent that does not support a given module element or
> > attribute MUST preserve it, except when a core processing requirement
> > specifies otherwise.
>
> It makes no sense to me that an optional module MUST be preserved.
> It's not optional anymore.

It is optional, as you do not need to support it; see my previous definition of support as read, understand, process, and write as per its PRs.

> Furthermore, if core-only tools cannot remove a module, I assume the
> tools aware of the module can. So we would have the absurd situation
> where a tool that doesn't support module ABC must preserve it, but
> the tool XYZ that supports that module can remove it.

This situation is not at all absurd. Module-aware tools can remove module elements AS PER the module's PRs. Module-unaware tools MUST NOT do it, because they do not understand (do not need to understand) the module's PRs, AND, as per our definitions of module and extension, they also CANNOT POSSIBLY have a VALID reason to modify modules, because if they had one they would have to support the module, i.e. understand its PRs.

> > Module data are not essential in the same way as core, but unlike
> > extensions they are the essential and only possible way to cover
> > their specific functionality.
>
> That doesn't make them special. I want tools to be able to get rid of
> any non-core constructs if they see a need for it.

This is surprising to me, as you first wanted to protect unconditionally both modules and extensions.

> The reasons you listed for tools to get rid of the elements of a custom
> namespace are just as valid for the elements of a module.

I do not think they are.
The modules are TC-vetted (they are not ANYTHING that can come through a random custom extension), and module roundtrip cannot be guaranteed without unconditional protection from tools that do not support/understand them. This invalidates the idea of the module and prevents further development of the standard by adding modules. Who would care to add modules if the roundtrip was not guaranteed? Modules are optional only as long as you do not cover their functionality. If you do, they are as non-negotiable as the core is.

> As the PRs I've posted earlier illustrate, you can treat all non-core
> elements exactly the same when doing, for example, re-segmentation.
> Why create an artificial difference

The difference is not artificial and does not need to be created, just honored. As I said before, special rules always take precedence over general rules, and describing non-core handling without making the distinction in a specific situation is not a counter-point here. If a module or extension has to be removed because the carrier core element is gone, so be it (I think this is a general principle and common sense for all core manipulations, not just resegmenting). If there is a way to preserve (in case the successor carrier element can be identified), the module MUST be preserved and the extension SHOULD be, as per the general rule, if the special rule does not say otherwise.

> (that would actually require more work)?

This will only create more work if you care to delete extensions, i.e. have valid reasons to delete them. Suppose a lean core-only implementer who implements a behavior that always preserves modules and extensions (as long as they do not need to be removed as per a specific core PR). He will have honored both the MUST preserve for modules and the SHOULD preserve for extensions. He will only bother with identifying and deleting extensions if they actually cause trouble..
So more work is needed IFF the implementer wants to identify and delete an extension.

> The distinction between modules and custom namespaces exists:
> - modules always have a schema.
> - modules are documented and backed by the TC.
> - modules could exist in places extensions could not.

agreed

> Those are all incentives for tool vendors to migrate their
> extensions to modules.

The effect of these incentives is thwarted if the modules are not protected from tools that do not understand them, or if they have the same level of protection as extensions.

> Regards,
> -yves
---------------------------------------------------------------------
To unsubscribe, e-mail: xliff-unsubscribe@lists.oasis-open.org
For additional commands, e-mail: xliff-help@lists.oasis-open.org


  • 27.  RE: [xliff] XLIFF 2.0 spec - dF Issue #01 - Extensibility and processing requirements

    Posted 11-06-2012 15:43
    Hi David, all,

>> It makes no sense to me that an optional module MUST
>> be preserved. It's not optional anymore.
>
> It is optional, as you do not need to support it, see my previous
> definition of support as read, understand, process, write
> as per its PRs.

If a tool must do something with an element, by definition, it does not have an option. Here you are forcing all the tools to at least read and write, which, according to your own definition, is part of "support".

>> Furthermore, if core-only tools cannot remove a module, I assume the
>> tools aware of the module can. So we would have the absurd situation
>> where a tool that doesn't support module ABC must preserve it, but
>> the tool XYZ that supports that module can remove it.
>
> This situation is not at all absurd. Module aware tools can remove module
> elements AS PER its PRs.

The PRs regarding deletion apply only when you re-segment. Any tool must be able to remove an element like a <matches>, even if there are no PRs that say it should. Same for glossaries, etc.

> AND as per our definitions of module and extension, they also CANNOT POSSIBLY
> have a VALID reason to modify modules, because if they had it they would
> have to support it, i.e. understand its PRs.

Wrong: there are many valid reasons to remove a non-core element even if you don't understand it. Rodolfo described an example. I can provide you with others.

>> That doesn't make them special. I want tools to be able to get
>> rid of any non-core constructs if they see a need for it.
>
> This is surprising to me, as you first wanted to protect
> unconditionally both modules and extensions.

Indeed. And I was wrong: SHOULD preserve is better. If I get enough logical arguments to change my opinion, I do. Maybe you should consider that too.

>> The reasons you listed for tools to get rid of the elements of a custom namespace
>> are just as valid for the elements of a module.
>
> I do not think they are.
> The modules are TC vetted (they are not ANYTHING that can
> come through a random custom extension)

No. They are "anything" for a tool that doesn't support the module. For that tool a module is as "random" (whatever that means) as an extension. It's not because the TC thinks the module is fine that the users of a tool should have the same opinion. There are many reasons they may want to not read those modules: speed, storage in DBs, size, etc.

> and module roundtrip cannot be guaranteed without unconditional protection
> from tools that do not support/understand it. This invalidates the idea of the
> module and prevents further development of the standard by adding modules.
> Who would care for adding modules if the roundtrip was not guaranteed.
> Modules are optional only as long as you do not cover their functionality.
> If you do, they are as non-negotiable as the core is.

No module is guaranteed to do a round-trip; where did you see that? Modules are optional: tools *should* do their best to preserve them, but it should not be mandatory.

> If there is a way to preserve (in case the successor carrier element can
> be identified), the module MUST be preserved and the extension SHOULD as
> per the general rule, if the special rule does not say otherwise.

This is the crux of the problem: you simply cannot force tools to preserve something that is optional and that they may have a valid reason to get rid of. The TC cannot define what a valid reason is, only the tools can.

>> The distinction between modules and custom namespaces exists:
>> - modules always have a schema.
>> - modules are documented and backed by the TC.
>> - modules could exist in places extensions could not.
>> Those are all incentives for tool vendors to migrate their
>> extensions to modules.
>
> The effect of these incentives is thwarted if the modules are not
> protected from tools that do not understand them, or if they
> have the same level of protection as extensions.
The fact that you could have a module where you cannot have an extension has nothing to do with protecting it or not; neither does having a schema or not, having documentation or not, or having interoperability or not: all those things are incentives that have nothing to do with whether the module MUST or SHOULD be preserved.

The bottom line IMO is:
- We cannot use "MUST preserve" for something that is not core.
- There are no technical reasons to have separate generic PRs (e.g. PRs for core-only tools) for modules and custom extensions.

Regards,
-yves