I have recently become involved in selecting technologies (not vendors, mind you!) for distributed systems. While this is highly interesting work, I am now faced with the age-old issue of interoperability and claimed adherence to standards. We all know the games that companies and standards organizations have been playing: loosely specified standards with too many degrees of freedom, proprietary “extensions”, etc. What happens often enough is that implementations of relatively new standards (say, less than 10 years of commercially or freely available products) have significant interoperability issues. Over time, these issues disappear, but not necessarily at the speed that customers or even the industry would like. This can have significant detrimental effects, including delays in necessary technology upgrades (e.g. IPv6), market distortion (PAC data in authZ data fields in W2Kx), or even non-adoption.
The SAML commercial community has developed a process that is very useful to technology consumers: through Liberty, Drummond Group International operates a testing program that verifies standards compliance of SAML products against the SAML 2.0 static conformance requirements. Backed by a rigorous testing process, the results are quite helpful for source selection – if only to get a quick overview of the capabilities of the different products without having to wade through piles of marketing collateral and technical documentation. As a customer, I am particularly pleased about this arrangement, since the vendors pay for the testing themselves. While this does not eliminate interoperability problems completely, it puts the burden of proving interoperability on the vendor rather than on the customer.
On the other hand, Microsoft and a number of other vendors have in the past performed informal cross-matrix interoperability testing in the form of the ws-builder plugfests or the OSIS InfoCard test rounds. The lack of formalism is offset here by a very low barrier to entry, so that open source projects and small companies have the opportunity to participate as well.
Combining these two approaches would yield a useful process: having commercial vendors and at least some open source projects participate in a formalized, vendor-initiated cross-matrix interoperability certification (VICMIC, for all the acronym lovers out there) would give enterprise architects and developers a powerful tool for source selection. The participation of open source projects could be sponsored through stipends awarded by the testing organization based on criteria such as feature completeness, overall quality, etc.
If I had my way (yeah, I know, I will not … still, you can DREAM), all technologies wanting to be considered for public projects would have to go through such a process – that’s a MUST in RFC 2119 speak. Where they do not, the acquisition process should really require it.