OLTP Head

Whatever happened to accurate data and reliable software?

Back in the day, when working on OLTP systems, there was a simple rule.

All data presented to end-users had to be up to date and, if it was distributed data, fully synchronised. No exceptions.

The software infrastructure that supported this level of data integrity, in environments that might well include multiple vendors' databases and multiple different OSs, was the XA standard: 2PC (two-phase commit) coordinated by TPMs (Transaction Processing Monitors).
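As a refresher, the core guarantee of 2PC is the voting protocol: the transaction manager asks every resource manager to prepare, and commits only if all of them vote yes; any no vote aborts the transaction everywhere. A minimal sketch of that logic, assuming illustrative `Participant` and `Coordinator` classes (not any real TPM's API):

```python
class Participant:
    """A resource manager (e.g. one vendor's database) in the transaction."""
    def __init__(self, name, will_commit=True):
        self.name = name
        self.will_commit = will_commit  # how this participant will vote
        self.state = "idle"

    def prepare(self):
        # Phase 1: durably record the pending changes, then vote.
        self.state = "prepared" if self.will_commit else "aborted"
        return self.will_commit

    def commit(self):
        self.state = "committed"

    def rollback(self):
        self.state = "aborted"


class Coordinator:
    """Drives the two phases across all participants."""
    def run(self, participants):
        # Phase 1 (voting): every participant must vote yes.
        if all(p.prepare() for p in participants):
            # Phase 2 (completion): unanimous yes, so commit everywhere.
            for p in participants:
                p.commit()
            return "committed"
        # Any no vote (or failure to respond) aborts the whole transaction.
        for p in participants:
            p.rollback()
        return "aborted"
```

A production XA implementation adds what this sketch omits: durable logging of the coordinator's decision and recovery of in-doubt transactions after a crash, which is precisely the job the TPM does.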

It is very much to be hoped that the back-end systems that still execute massive volumes of the really critical financial transactions continue to use this infrastructure today. They are certainly not using over-hyped technology such as blockchain.

Current front-end, user-facing systems, meanwhile, have given up all pretence of supporting synchronised transactional integrity. Being presented with unsynchronised and inaccurate data is now such a common user experience that engaging with the human beings in technical support or customer services (should you be offered that choice, which is increasingly rare) is just a waste of time. You will quickly be recruited into the software's obviously unfinished testing effort, with a blizzard of questions about browsers, OSs, data caches, cookies, and so on.

This is becoming the new normal and it's a very regrettable and retrograde phenomenon.

