Standards-based disruptive MDM technology

Question: what do you call a consensus of best practice?
Answer: an international standard

Here I am writing a blog about disruptive and innovative master data management (MDM) solutions, so why have I started by defining such a constraining document as a standard?  Ask your MDM software supplier a simple question: which data quality standard is your software based on? Is it the standard that allows you to exchange multiple-language specifications (portable data, in other parlance)? Is it the standard that allows for interoperability through the exchange of digital data? Is it the standard that enables the semantic web by creating open, computer-interpretable data?

Why are these particular qualities important?  The lack of portable digital data is held up as a key constraining factor in four areas:

  1. the move towards interoperability of public authorities according to a recent EU report [1];
  2. the move towards interoperability of smart cities according to the publicly available standard produced by BSI [2];
  3. the move towards interoperability in the oil and gas sector according to the (soon to be published) ISO standard on that subject [3];
  4. the move towards industrial interoperability outlined in the Industrie 4.0 (I40) [4] initiative promoted by the German government.

It might seem a paradox that the disruptive solution to these issues is based on the international data quality standard, ISO 8000 [5]; such documents are not normally seen as disruptive. The disruption comes because so few MDM solutions conform to the standard.  In addition to addressing the issues outlined above, ISO 8000 insists that messages are exchanged in a resolvable format that allows the receiver to be assured the data is trustworthy, and it addresses data quality "from the bottom up" by concentrating on accurate property values and units of measure.   As all experienced data quality managers know, the overuse of character, string or text fields in systems is a major cause of subsequent data quality problems.
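The "bottom up" point is easiest to see side by side. Below is an illustrative sketch, not KOIOS code: the concept identifiers are made-up placeholders in the style of an open technical dictionary, and the checking function is my own addition. It contrasts a free-text catalogue entry with the structured property values and explicit units of measure that ISO 8000 favours.

```python
# A typical free-text catalogue entry: all meaning is locked in a string.
free_text = "HEX BOLT M10X50 SS"

# The same item as property/value/unit triples. The identifiers below are
# placeholders; in a conforming exchange they would resolve to definitions
# in a registered open technical dictionary.
structured_item = {
    "class": {"id": "0161-1#01-000001", "label": "hexagon head bolt"},
    "properties": [
        {"id": "0161-1#02-000010", "label": "thread size", "value": "M10"},
        {"id": "0161-1#02-000011", "label": "length", "value": 50, "unit": "mm"},
        {"id": "0161-1#02-000012", "label": "material", "value": "stainless steel"},
    ],
}

def units_declared(item):
    """True only if every numeric property value carries an explicit unit."""
    return all(
        "unit" in prop
        for prop in item["properties"]
        if isinstance(prop["value"], (int, float))
    )
```

The free-text form cannot be checked at all; the structured form can be validated mechanically, which is the whole point of "accurate property values and units of measure".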

The KOIOS Master Data cloud software conforms to ISO 8000.  The software positively discourages the use of string fields, encouraging the user to create lists of values based on authoritative sources from the global concept dictionary that KOIOS has compiled.  Where that approach is not suitable, the software encourages the use of a "representation" to constrain the value to its correct syntax, ensuring data quality is locked in at the lowest level. Because all properties used to create specifications must have definitions, not just user guides, the chance of loss of meaning when data is exchanged is dramatically reduced; this again is one of the key pillars of ISO 8000.
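A minimal sketch of those two constraint styles, assuming hypothetical field names, an invented list of values and an invented syntax pattern (none of this is taken from KOIOS itself): a value is accepted only if it comes from an authoritative list or matches a declared representation, never as an unconstrained string.

```python
import re

# Constraint style 1: a closed list of values from an authoritative source.
ALLOWED_MATERIALS = {"stainless steel", "carbon steel", "brass"}

# Constraint style 2: a "representation" -- a syntax rule the value must match.
PART_NUMBER_SYNTAX = re.compile(r"^[A-Z]{2}-\d{4}$")

def validate(field, value):
    """Accept a value only via a list of values or a syntax rule."""
    if field == "material":
        return value in ALLOWED_MATERIALS
    if field == "part_number":
        return bool(PART_NUMBER_SYNTAX.match(value))
    return False  # unconstrained free-text fields are simply not accepted
```

Because every field is constrained at entry, quality problems are caught before the data is stored, rather than cleaned up afterwards.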

For manufacturers, KOIOS enables you to post a single version of the digital representation of your product specifications in the cloud and to control how your customers view your brand and product details; you can even control which fields you share with each customer.

For end-users, cataloguing at source (C@S) now comes alive.  As an end-user you are able to import product specifications created by the very people that manufactured the item, without third parties manipulating the data.  These descriptions can be imported into your "PO text" field in full, ensuring your PO text is always understood by your supply chain, and enabling you to create consistent "short descriptions" for those items, reducing search times for users of your system.
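As a rough illustration of how a manufacturer-supplied specification could drive both the full PO text and a consistent short description (the schema and field names here are invented for the example, not drawn from KOIOS or the standards):

```python
# A manufacturer-published specification, in an illustrative schema.
spec = {
    "class": "hexagon head bolt",
    "manufacturer": "Acme Fasteners",
    "properties": [
        ("thread size", "M10"),
        ("length", "50 mm"),
        ("material", "stainless steel"),
    ],
}

def po_text(spec):
    """Full description for the purchase-order text field."""
    props = "; ".join(f"{name}: {value}" for name, value in spec["properties"])
    return f"{spec['class'].upper()} ({spec['manufacturer']}) - {props}"

def short_description(spec, limit=40):
    """Short search description built by the same rule every time."""
    text = spec["class"] + " " + " ".join(value for _, value in spec["properties"])
    return text[:limit].upper()
```

Because both descriptions are derived from the same structured source by fixed rules, every item of the same class is described the same way, which is what makes searching faster.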

Data cleaning is thus turned on its head by the use of data that is trustworthy.  Now that is disruptive!

Bibliography

[1] New European Interoperability Framework: Promoting seamless services and data flows for European public administrations, NO-07-16-042-EN-N, 2017.
[2] PAS 181, Smart city framework – Guide to establishing strategies.
[3] ISO/TS 18101, Oil and gas interoperability.
[4] Digital Transformation Monitor, Germany: Industrie 4.0, January 2017.
[5] ISO 8000 (all parts), Data quality – Framework and the exchange of characteristic data.

The new paradigm for managing product master data

The management of product master data is undergoing a revolution. The data quality standards ISO 22745 and ISO 8000, from the International Organization for Standardization (ISO) in Geneva, Switzerland, have changed everything.

In order to adapt, organisations need to adopt a new mindset, new tools and new processes, and, importantly, people need education and training. Getting this right will lead to significant productivity improvements and an array of other benefits, including: more accurate ordering and a reduction in purchase errors; less operational downtime spent hunting for the source of supply for spares; greater detail and consistency of product data on eCommerce web sites; shared product specifications throughout the supply chain; less exposure to fraud and counterfeiting through the use of authorized legal identifiers; and many more.

Data cleaning is now dead, as is the use of noun-modifiers to define product specifications. Cataloguing at source is the new paradigm. The best entity to describe a product is the manufacturer who designs and builds it, and their product data should be used throughout the supply chain. Doing so means everyone in the supply chain can share the correct product data; load it into their ERP, eCommerce, and/or Punch-out systems; order the right parts from the right supplier at the right time; and cut out expensive, and often inaccurate, data cleaning work. It means purchasing errors are significantly reduced, or eliminated entirely, and the risk of downtime whilst spares are sourced is minimized.

The charts below lay out the key success factors organisations need to implement in order to benefit fully. Find out more at www.kspir.cloud