Design Data - Information flow
RN-1 The classic technological perspective for ICT
RN-1.1 Contents
⚙ RN-1.1.1 Looking forward - paths by seeing directions
A reference frame in mediation innovation
When the image link fails,
🔰 click
here for the most logical higher fractal in a shifting frame.
Contexts:
◎ r-serve technology enablement for purposes
↖ r-steer motivation purposes by business
↗ r-shape mediation communication
↙ data infotypes
↘ data techflows
There is a counterpart
💠 click
here for the impracticable diagonal shift to shaping change.
The Fractal focus for knowledge management
The impracticable diagonal is connecting the technology realisation to a demand from administrative support.
There is no:
- budget as there is no obvious business value. Nobody else is doing it.
- vison for business value as it is too generic. Someone else should do it.
❶ The shape mindset mediation innovation:
The cosmos is full of systems and we are not good in understanding what is going on.
In a ever more complex and fast changing world we are searching for more certainties and predictabilities were we would better off in understanding the choices in uncertainties and unpredictability's.
Combining:
- Systems Thinking, decisions, ViSM (Viable Systems Model) good regulator
- Lean as the instantiation of identification systems
- The Zachman 6*6 reference frame principles
- Information processing, the third wave
- Value Stream (VaSM) Pull-Push cycle
- Improvement cycles : PDCA DMAIC SIAR OODA
- Risks and uncertainties for decisions in the now near and far future, VUCA BANI
The additional challenge with all complexities is that this is full of dualities - dichotomies.
❷ The serve mindset technology realisation:
⚙ RN-1.1.2 Local content
⚖ RN-1.1.3 Guide reading this page
The quest for methodlogies and practices
This page is about a mindset framework for undertanding and managing complex systems.
The type of complex systems that is focussed on are the ones were humans are part of the systems and build the systems they are part of.
When a holistic approach for organisational missions and organisational improvements is wanted, starting at the technology pillar is what is commonly done.
Knowing what is going on on the shop-floor (Gemba).
Working into an approach for optimized systems, there is a gap in knowledge and tools.
👁 💡 The proposal to solve those gaps is
"Jabes". It is About:
- ⚙ Document & communicate Knowledge (resources, capabilities, portfolio, opportunities).
- 📚 Defining boundaries in context in knowledge domains of disciplines.
- 🎭 In a knowledge domain of as discipline a standardize metadata structure.
- ⚖ Maturity evaluation of quality Document & communicate Knowledge.
Seeing "Jabes" as a system supporting systems the question is what system is driving Jabes?
The system driving Jabes must have similarities to the one that is driving it.
👁 💡
ZARF (Zachman-Augmented Reference Frame) is a streamlined upgrade to the classic Zachman matrix.
It turns a static grid into a practical, multidimensional map that guides choices, enforces clear boundaries, and adds a sense of time, so teams move methodically from idea to reality
Shaping Systems collective intelligence
These are part of a larger vision of adaptive, resilient enterprises, organisations.
The mindset is even exceeding that of what is seen as an enterprise to the communities enterprises are part of.
Sys6x6Lean and Shape Design for ICT Systems Thinking form a unified framework for adaptive enterprises. Combining Lean processes, Zachman reference models, mediation, and innovation, these pages guide organizations in shaping resilient systems for complex environments.
- Valuestream & Serve DevOPs: tooling aid for collective intelligence at Systems Thinking
- Frameworks for Innovation, Mediation, and Lean Systems Thinking
- From Reference Frames to Resilient Organizations
- Sys6x6Lean & Shape Design: A Unified Approach to Systems Thinking
There is special impracticable fractal the demand is at "C-Shape design" the realisation at "r-serve devops sdlc"
From the C-Shape location:
:
👉🏾 Sys6x6Lean page: focuses on systems thinking, Lean, viable systems modeling.
Shape Design for ICT Systems Thinking page: focuses on mediation, innovation, ICT organizational frameworks.
From the r-serve location:
:
👉🏾 Valuestream page: focuses on systems thinking, Lean, viable systems modeling.
Serve Devops for ICT Systems realisations page: focuses on practical innovations ICT organizational frameworks.
A recurring parable for methodlogies and practices
Key challenges:
- Cultural barriers and norms to Culture of openness and innovation/
- Lack of feedback mechanisms and learning loops
- Understanding and distributing the costs and benefits of cross-border efforts
- Undeveloped ecosystems to clearly defined roles, Scaling up experiments
Achieving Cross Border Government Innovation (researchgate Oecd opsi, foreword Geof Mulgan 2021 - collective intelligence)
OPSI is a global forum for public sector innovation.
In a time of increasing complexity, rapidly changing demands and considerable fiscal pressures, governments need to understand, test and embed new ways of doing things.
Over the last few decades innovation in the public sector has entered the mainstream in the process becoming better organised, better funded and better understood.
But such acceptance of innovation has also brought complications, in particularly regarding the scope of the challenges facing innovators, many of which extend across borders.
Solutions designed to meet the needs of a single country are likely to be sub-optimal when applied to broader contexts.
To address this issue, innovators need to learn from others facing similar challenges and, where possible, pool resources, data and capacities.
OPSI's colleagues in the OECD Policy Coherence for Sustainable Development Goals division (PCSDG) and the EC Joint Research Centre have developed a conceptual framework for analysing transboundary interrelationships in the context of the 2030 Agenda.
OPSI and the MBRCGI have observed an increased focus on cross-border challenge-driven research and innovation, with a particularly strong influence from agendas such as the SDGs.
A second challenge is how to institutionalise this work.
It is not too difficult to engage people in consultations across borders, and not all that hard to connect innovators through clubs and networks.
But transforming engagement into action can be trickier.
It is particularly hard to share data – especially if it includes personal identifiers (although in the future more “synthetic data†that mirrors actual data without any such identifiers may be more commonly used, particularly for collaborative projects in fields such as transport, health or education).
It is also hard to get multiple governments to agree to create joint budgets, collaborative teams and shared accountability, even though these are often prerequisites to achieving significant impacts.
⚒ RN-1.1.4 Progress
done and currently working on:
- 2012 week:44
- Moved the legal references list to the new inventory page.
- Added possible mismatches in the value stram wiht a BISL reference demand supply.
- 2019 week:48
- Page converted, added with all lean and value stream idea´s.
- Aside the values stream and EDWH 3.0 approach links are added tot the building block patterns SDLC and Meta.
- The technical improvements external on the market are the options for internal improvements.
- 2025 week 49
- Start to rebuild these pages as a split off of the Serve devops.
- There was too much content not able to consider what should come resulting in leaving it open at the serve devops page.
- When the split-off happened at the shape design the door opened to sess how to connect fractals.
- Old content to categorize evaluate and relocate choosing three pages inherited at the this location, other pages to archive
The topics that are unique on this page
👉🏾 Rules Axioms for the Zachman augmented reference framework (ZARF).
- Based in the classic way of categorized 6 type of questions for thinking (one dimensional)
- Stepping over the 6*6 two-dimensional Zachman Idea
- Extends to a 3*3*4 three-dimensional approach
- Awareness of a 6*6*6 (..) multidimensional projection
👉🏾 Connecting ZARF to systems thinking in the analogy of:
- Anatomy,
- Physiology,
- Neurology,
- Sociology - Psychology.
👉🏾 Explaining the patterns that are repeating seen in this.
- Connecting components for the systems as a whole,
- There must be an effective good regulator for the system to be viable.
- Searching the relations for systems to their universe.
- Motiviations and distraction seen in repeating patterns.
👉🏾 use cases using the patterns for Zarf and by Zarf.
- More practical examples that help in applying Zarf
- Use cases are not fixed but can vary in time
- Adaption to uses cases when there are clearly recognised.
Highly related in the domain context for information processing are:
- C-Shape the abstracted approach for shaping, the related predecessor.
- r-c6isr command and control practical an abstracted approach, in what to shape.
- c-shape the practice follower of the predecessor.
open design_bianl:
workcell
valuestream
open design_sdlc :
DTAP Multiple dimensions processes by layers
ALC type 2 low code ML process development
ALC type 3 low code ML process development
vmap_layers01 low code ML process development
data administration *meta describing modelling data
Security *meta - modelling access information
meta data model
meta data process
meta secure
open local devops_sdlc:
prtfl_c22
prtfl_t33
relmg_c66
relmg_t46
RN-1.2 Technical requirements for knowledge systems
. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .
⟲ RN-1.2.1
Archiving, Retention policies.
Information is not only active operational but also historical what has happened, who has execute, what was delivered, when was the delivery when was the purchase etc.
That kind of information is often very valuable but at the same time it is not well clear how to organize that and who is responsible.
💣 Retention policies, archiving information is important do it well, the financial and legal advantages are not that obvious visible. Only when problems are escalating to high levels it is clear but too late to solve.
When being in some financial troubles, cost cutting is easily done.
Historical and scientific purposes, moved out off any organisational process.
An archive is an accumulation of historical records in any media or the physical facility in which they are located.
Archives contain primary source documents that have accumulated over the course of an individual or organization's lifetime, and are kept to show the function of that person or organization.
Professional archivists and historians generally understand archives to be records that have been naturally and necessarily generated as a product of regular legal, commercial, administrative, or social activities.
The word record and word document is having a slightly different meaning in this context than technical ICT staff is used to.
In general, archives consist of records that have been selected for permanent or long-term preservation on grounds of their enduring cultural, historical, or evidentiary value.
Archival records are normally unpublished and almost always unique, unlike books or magazines of which many identical copies may exist.
This means that archives are quite distinct from libraries with regard to their functions and organization, although archival collections can often be found within library buildings.
Additional information container attributes.
😉 EDW 3.0 Every information container must be fully identifiable. Minimal by:
- a logical context key
- moment of relevance
- moment received, available at the ware house
- source received information container.
When there are compliancy questions on information wiht this kind of compliancy questions it is often assumed to be an ICT problem only. Classic applications are lacking thes kind of attributes with information.
💡 Additional information container attributes supporting implementations defined retention policies.
Every information container must have for applicable retention references :
- Normal operational visibility moments:
- registered in the system
- information validity start
- information validity end
- registration in system to end
- Legal change relevance:
- legal case registered in system started
- registration for legal case in system to end
- Internal extended archive for purposes:
- registration for archiving purposes in system to end
Common issues when working for retention periods.
⚠ An isolated archive system in complexity reliability and availability being a big hurdle, high impact.
⚠ Relevant information for legal purposes, moved out from manufacturing process and not being available anymore in legal cases, is problematic.
⚠ Impact by cleaning as soon as possible is having high impact. The GDPR states it should be deleted as soon as possible.
This law is getting much attention and is having regulators. Archiving information for longer periods is not directly covered by laws, only indirect.
Government Information Retention.
Instead of a fight how it should be solved there is a fight somebody else is to blame for missing information.

Different responsible parties have their own opinion how conflict in retention policies should get solved.
🤔 Having information deleted permanent there is no way to recover when that decision is wrong.
🤔 The expectation it would be cheaper and having better quality is a promise without warrrants.
⟲ RN-1.2.2 Technology safe by design & open exchangeable
Business Continuity.
Loss of assets can disable an organisation to function. It is risk analysis to what level continuity, in what time, at what cost, is required and what kind of loss is acceptable.
💣 BCM is risk based having visible cost for needed implementations but not visible advantages or profits. There are several layers
| Procedures , organisational. |
| People , personal. |
| Products, physical & cyber. |
| Communications. |
| Hardware. |
| Software. |
Loss of physical office & datacentre.
In the early days using computers all was located close to the office with all users because the technical communication lines did not allow long distances.
Using batch processing with a day or longer to see results on hard copy prints. Limited Terminal usage needing copper wires in connections.

The disaster recovery plan was based on a relocation of the office with all users and the data centre when needed in case of a total loss (disaster).
For business applications a dedicate backup for each of them aside of the needed infrastructure software including the tools(applications).
⚠ The period to resilence could easily span several weeks, there was no great dependency yes on computer technology. Payments for example did not have any dependency in the 70´s.
Loss of network connections.
The datacentre has got relocated with the increased telecommunications capacity. A hot stand by with the same information on a Realtime duplicated storage made possible.
⚠ The cost argument with this new option resulted in ingorance of resilence of other type of disasters to recover and ignorance of archiving compliancy requirements.
⚠ With a distributed approach of datacenters the loss of single datacentre is not a valid scenario anymore. Having services spread over locations the isolated DR test of a having one location failing is not having the value as before.
Loss control to critical information.
Loss of information, software tools compromised, database storage compromised, is the new scenario when everything has become accessible using communications.
Just losing the control to hackers being taken into ransom or having data information leaked unwanted externally is far more likely and more common than previous disaster scenarios.
Not everything is possible to prevent. Some events are too difficult or costly to prevent. Rrisk based evaluation on how to resilence.
⚠ Loss of data integrity - business.
⚠ Loss of confidentiality - information.
⚠ Robustness failing - single point of failures.
The Swiss cheese model of accident causation is a model used in risk analysis and risk management, including aviation safety, engineering, healthcare, emergency service organizations,
and as the principle behind layered security, as used in computer security and defense in depth.
Therefore, in theory, lapses and weaknesses in one defense do not allow a risk to materialize, since other defenses also exist, to prevent a single point of failure.
Although the Swiss cheese model is respected and considered to be a useful method of relating concepts, it has been subject to criticism that it is used too broadly, and without enough other models or support.
Several triads of components.
Eliminating single points of failure in a backup (restore) strategy. Only the proof of a successful recovery is a valid checkpoint.
3-2-1 backup rules , the 3-2-1 backup strategy is made up of three rules, they are as follows:
- Three copies of data- This includes the original data and at least two backups.
- Two different storage types- Both copies of the backed up data should be kept on two separate storage types to minimize the chance of failure. Storage types could include an internal hard drive, external hard drive, removable storage drive or cloud backup environment.
- One copy offsite- At least one data copy should be stored in an offsite or remote location to ensure that natural or geographical disasters cannot affect all data copies.
BCM is related to information security. It are the same basic components and same shared goals.
An organization´s resistance to failure is "the ability ... to withstand changes in its environment and still function".
Often called resilience, it is a capability that enables organizations to either endure environmental changes without having to permanently adapt, or the organization is forced to adapt a new way of working that better suits the new environmental conditions.
image:
By I, JohnManuel, CC BY-SA 3.0
Auditing monitoring.
For legal requirements there are standards by auditors. When they follow their checklist a list of &best practices"e are verified.
The difference with "good practice" is the continous improvement (PDCA) cycle.
| Procedures , organisational. |
| People , personal. |
| Products, physical & cyber. |
| Security Operations Center. |
| Infrastructure building blocks- DevOps. |
| Auditing & informing management. |
Audit procedure processing.
The situation was: Infrastructure building blocks- DevOps Leading. Auditing and informing management on implementations added for control.
Added is: Security Operations Centre, leading for evaluating security risk. Auditing and informing management on implementations added for control.
 
The ancient situation was: Application program coding was mainly done in house. This had changed into using public and commercial retrieved software when possible.
⚠ Instead of having a software crisis in lines of code not being understood (business rules dependency).
It has changed in used software libraries not being understood (vulnerabilities) and not understood how to control them by the huge number of used copied software libraries.
⚠ Instead of having only an simple infrastructure stack to evaluate it has become a complicated infrastructure stack with an additional involved party into a triad to manage.
 
Penetration testing,
also called pen testing or ethical hacking, is the practice of testing a computer system, network or web application to find security vulnerabilities that an attacker could exploit.
Penetration testing can be automated with software applications or performed manually. Either way, the process involves gathering information about the target before the test, identifying possible entry points,
attempting to break in -- either virtually or for real -- and reporting back the findings.
It will only notify what is visible to the tester, using tools only what is commonly known. There is nog warrant that it is not vulnerable after "ecorrections" are made.
It is well posible there is no security risk at all by the way the system is used and being managed.
⟲ RN-1.2.3 Standard understandable naming conventions meta
logging monitoring.
Logging events when processing information is generating new information. The goal in using those logging informations has several goals.
Some loginformation is related to the product and could also become new operational information.
💣 When there are different goals an additional copy of the information is an option but introduces an option of integrity mismatches.
Data classification.
Information security
The CIA triad of confidentiality, integrity, and availability is at the heart of information security.
(The members of the classic InfoSec triad confidentiality, integrity and availability are interchangeably referred to in the literature as security attributes, properties, security goals, fundamental aspects, information criteria, critical information characteristics and basic building blocks.)
However, debate continues about whether or not this CIA triad is sufficient to address rapidly changing technology and business requirements,
with recommendations to consider expanding on the intersections between availability and confidentiality, as well as the relationship between security and privacy.
Other principles such as "accountability" have sometimes been proposed; it has been pointed out that issues such as non-repudiation do not fit well within the three core concepts.
😉 Two additionals are:
- Undisputable When the information itself is in dispute that is a serious problem. Needed is the source and time / period relevance of the information.
- Verifiability When not able to that there is no underpinning on usage and any risks.
Negelected attentions points:
- An important logical control that is frequently overlooked is the principle of least privilege, which requires that an individual, program or system process not be granted any more access privileges than are necessary to perform the task.
- An important physical control that is frequently overlooked is separation of duties, which ensures that an individual can not complete a critical task by himself.
An important aspect of information security and risk management is recognizing the value of information and defining appropriate procedures and protection requirements for the information.
Not all information is equal and so not all information requires the same degree of protection. This requires information to be assigned a security classification.
Classified information
When labelling information in a categories an approach is:
- Public / unclassified
- Confidential, intended for circulation in the internal organisation and authorized third parties at owners discretion.
- Restricted, information that should not into disclosure outside a defined group.
- Secret, strategical sensitive information only shared between a few individuals.
Using BI analytics
Using BI analytics in the security operations centre (SOC).
This technical environment of bi usage is relative new. It is demanding in a very good runtime performance with well defined isolated and secured data. There are some caveats:
⚠ Monitoring events, ids, may not be mixed with changing access rights.
⚠ Limited insight at security design. Insight on granted rights is done.
It is called
Security information and event management (SIEM)
is a subsection within the field of computer security, where software products and services combine security information management (SIM) and security event management (SEM). They provide real-time analysis of security alerts generated by applications and network hardware.
Vendors sell SIEM as software, as appliances, or as managed services; these products are also used to log security data and generate reports for compliance purposes.

Using BI analytics for capacity and system performance.
This technical environment of bi usage is relative old optimizing the technical system performing better. Defining containers for processes and implementing a security design.
⚠ Monitoring systems for performance is bypassed when the cost is felt too high.
⚠ Defining and implementing an usable agile security design is hard work.
⚠ Getting the security model and monitoring for security purposes is a new challenge.
It is part of ITSM (IT Service maangemetn)
Capacity management´s
primary goal is to ensure that information technology resources are right-sized to meet current and future business requirements in a cost-effective manner. One common interpretation of capacity management is described in the ITIL framework.
ITIL version 3 views capacity management as comprising three sub-processes: business capacity management, service capacity management, and component capacity management.
In the fields of information technology (IT) and systems management, IT operations analytics (ITOA) is an approach or method to retrieve, analyze, and report data for IT operations. ITOA may apply big data analytics to large datasets to produce business insights.
Loss of confidentiality. compromised information.
getting hacked having got compromised by whale phishing is getting a lot of attention.
A whaling attack, also known as whaling phishing or a whaling phishing attack, is a specific type of phishing attack that targets high-profile employees, such as the CEO or CFO, in order to steal sensitive information from a company.
In many whaling phishing attacks, the attacker's goal is to manipulate the victim into authorizing high-value wire transfers to the attacker.
Government Organisation Integrity.

Different responsible parties have their own opinion how conflicts about logging information should get solved.
🤔 Having information deleted permanent there is no way to recover when that decision is wrong.
🤔 The expectation it would be cheaper and having better quality is a promise without warrrants.
🤔 Having no alignment between the silo´s there is a question on the version of the truth.
⟲ RN-1.2.4 Base temporal data structure following lifecycles
butics
RN-1.3 Classification of technical processing types
. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .
⟲ RN-1.3.1 Info
DataWareHousing, Information flow based.
Repostioning the datawarehouseas part of an operational flow makes more sense. A compliancy gap getting a solution:
✅ The two vertical lines are managing who´s has access to what kind of data, authorized by data owner, registered data consumers, monitored and controlled.
In the figure:
- The two vertical lines are managing who´s has access to what kind of data, authorized by data owner, registered data consumers, monitored and controlled.
- The confidentiality and integrity steps are not bypassed with JIT (lambda).
In a figure:

The following consumers are also valid for the warehouse:
- Archive
- Operations
- ML operations
A very different approach in building up this enterprise information data ware house. Axiomas:
💡 No generic data model for relations between information elements - information containers.
💡 Every information container must be fully identifiable. Minimal by:
- a logical context key
- moment of relevance
- moment received, available at the ware house
- source received information container.
💡 Every information container must have a clear ownership.:
- The owner is accountable for budget.
- The owner is responsible for compliant use of information.
For being fully identifiable a well designed stable naming convention is required.
Administrative Value Stream Mapping Symbol Patterns.
Help in abstracting ideas is not by long text but using symbols and figures.
A blueprint is the old name for doing a design before realisation.
- Value stream mapping has symbols to help in abstracting ideas.
- Structured Program, coding, has the well known flow symbols.
- Demo has a very detailed structure on interactions with symbols.
What is missing is something in between that is helping in the value stream of administrative processing.
Input processing:
Retrieve multiple well defined resources.
Transform into a data model around a subject.
The result is similar to a star model. The differenes are that is lacking some integrity and constraint definitions.
Retrieve a data model around a subject.
Transform this in a denormalised one with possible logical adjustments.
Moving to in memory processing for analytics & reporting, denormalisation is the way to achieve workable solutions.
Retrieve multiple unstructured resources.
Transform (transpose) into multiple well defined resource.
A well defined resource is one that can be represented in rows columns. The columns are identifiers for similar logical information in some context.
Execute Business Logic (score):
Retrieve a data model around a subject.
Execute business logic generating some result.
This type of processing is well known for RDBMS applications. The denormalisation is done by the application.
Retrieve denormalised data for subject.
Execute business logic generating some result.
Moving to in memory processing for analytics & reporting, denormalisation is the way to achieve workable solutions.
Retrieve historical results (business) what has been previous scored. Execute business logic generating some result.
The is monitoring block generates a log-file (technical), historical results (business) and does a halt of the flow when something is wrong.
Logging: / Monitoring:
-
Retrieve a data model around a subject. Apply businsess rules for assumed validity.
This logging block generates a log-file. The period is limited, only technicial capacity with possible restarts to show.
Does a line-halt of the flow when something is wrong.
-
Retrieve a result from an executed business logic process. Apply businsess rules for assumed validity.
This monitoring block generates a log-file (technical), historical results (business).
Does a line-halt of the flow when something is wrong.
Output, delivery:
-
From a weel defined resource, propagate to, from this processing context, external one.
A logical switch is included wiht the goal of preventing sending out information when that is not applicable for some reason.
⟲ RN-1.3.2 Info
Administrative proposed standard pattern.
📚 The process split up in four stages of prepare request (IV, III) and the
delivery (I, II). The warehouse as starting point (inbound) and end point (outbound).
The request with all necessary preparations and validations going through IV and III.
The delivery with all necessary quality checks going through I and II.
SDLC life cycle steps - logging , monitoring.
Going back to the sdlc product life, alc model type 3. This is a possible implementation of the manufacturing I, II phases.
💡 There are four lines of artefacts collections at releases what will become the different production versions.
- collecting input sources into a combined data model.
- modifying the combined data model into a new one suited for the application (model).
- running the application (model) on the adjusted suited data creating new information, results.
- Delivering verified results to an agreed destinationt in an agreed format.

💡 There are two points that are validating the state en create additional logging. This is new information.
- After having collected the input sources, technical and logical verfication on what has is there is done.
- Before delviering the results technical and logical verfication on what is there is done.
This is logic having business rules. The goal is application logging and monitoring in business perspective.
When something is badly wrong, than halting the process flow is safety mitigation preventing more damage.
There is no way to solve this by technical logfiles generated by tools like a RDBMS.
💡 The results ar collected archived (business dedicated). This is new information.
- After having created the result, but before delivering.
- It usefull for auditing purpused (what has happended) and for predcitive modelling (ML) .
⟲ RN-1.3.3 Info
Applied Machine learning (AI), operations.
Analytics, Machine Learning, is changing the way of inventing rules to only human invented to helping humans with machines.
💡 The biggest change is the ALC type3 approach. This fundamentally changes the way how release management should be implemented.
ML is exchanging some roles in coding and data to achieve results at development but not in other life cycle stages.
When only a research is done for a report being made only once, the long waiting on data deliveries of the old DWH 2.0 methodology is acceptable.
⚠ Having a (near) real time operational process the data has to be correct when the impact on the scoring is important.
Using that approach, at least two data streams are needed:
- ML model Development: accept delays information delivery.
- ML exploitation (operations): No delay in deliveries.
🤔 The analytics AI ML machine learning has a duality in the logics definition.
The modelling stage (develop) is using data, that data is not the same, although similar, as in the operational stage.
Developing is done with operational production data. The sizing of this data can be much bigger than that of what is needed at operations due to the needed history.
The way of developping is ALC type3.
 
❗ The results of what an operational model is generating should be well monitored for many reasons. That is new information to process.
⟲ RN-1.3.4 Info
The technical solutions as first process option.
Sometimes a simple paper note will do, sometimes an advanced new machine is needed.
It depends on the situation. A simple solution avoiding the waste is lean - agile
Optimization Transactional Data.
An warehouse does not content structuring it must be able to locate the wanted content structured. Delivering the labelled containers efficient >
Optimization Transactional Data.
The way of processing information was in the old day using flat files in the physical way. Still very structured stored and labelled.
In the modern approach these techniques still are applicable although automated hidden in a RDBMS .
Analytics & reporting.
The "NO SQL" hype is a revival of choosing more applicable techniques.
It is avoiding the transactional RDBMS approach as the single possible technical solution.
Information process oriented, Process flow.
The information process in an internal flow has many interactions input, transformations and output in flows.
⚠ There is no relationship to machines and networking. The problem to solve those interactions will popup at some point.
⚠ Issues by conversions in datatypes, validations in integrity when using segregated sources (machines) will popup at some point.
The service bus (SOA).
ESB enterprise service bus
The technical connection for business applications is preferable done by a an enterprise service bus.
The goal is normalized systems.
Changing replacing one system should not have any impact on others.
Microservices with api´s
Microservices (Chris Richardson):
Microservices - also known as the microservice architecture - is an architectural style that structures an application as a collection of services that are:
- Highly maintainable and testable.
- Loosely coupled.
- Independently deployable/
- Organized around business capabilities.
The microservice architecture enables the continuous delivery/deployment of large, complex applications. It also enables an organization to evolve its technology stack.
Data in containers.

Data modelling using the relational or network concepts is based on basic elements (artefacts).
An information model can use more complex objects as artefacts. In the figure every object type has got different colours.
The information block is a single message describing complete states before and after a mutation of an object. The Life Cycle of a data object as new metainformation.
Any artefact in the message following that metadata information.
⚠ This is making a way to process a chained block of information. It is not following the blockchain axioma´s.
The real advantage of a chain of related information is detecting inter-relationships with the possible not logical or unintended effects.
Optimization OLTP processes.
The relational SQL DBMS replaced codasyl network databases (see math).
The goal is simplification of online transaction processing (oltp) data by deduplication and
normalization (techtarget)
using DBMS systems supporting ACID
ACID properties of transactions (IBM).
These approaches are necessary doing database updates with transactional systems. Using this type of DBMS for analytics (read-only) was not the intention.
normalization (techtarget, Margaret Rouse )
Database normalization is the process of organizing data into tables in such a way that the results of using the database are always unambiguous and as intended.
Such normalization is intrinsic to relational database theory.
It may have the effect of duplicating data within the database and often results in the creation of additional tables.
ACID properties of transactions (IBM)
- Atomicity
All changes to data are performed as if they are a single operation. That is, all the changes are performed, or none of them are.
For example, in an application that transfers funds from one account to another, the atomicity property ensures that, if a debit is made successfully from one account, the corresponding credit is made to the other account.
- Consistency
Data is in a consistent state when a transaction starts and when it ends.
For example, in an application that transfers funds from one account to another, the consistency property ensures that the total value of funds in both the accounts is the same at the start and end of each transaction.
- Isolation
The intermediate state of a transaction is invisible to other transactions. As a result, transactions that run concurrently appear to be serialized.
For example, in an application that transfers funds from one account to another, the isolation property ensures that another transaction sees the transferred funds in one account or the other, but not in both, nor in neither.
- Durability
After a transaction successfully completes, changes to data persist and are not undone, even in the event of a system failure.
For example, in an application that transfers funds from one account to another, the durability property ensures that the changes made to each account will not be reversed.
RN-1.4 The connection of technology agile lean
. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .
⟲ RN-1.4.1 Info
The Philosophy and Practicality of Jidoka
Diving deep into the Toyota philosophy, you could see this as JIT telling you to let the material flow, and jidoka telling you when to stop the flow.
This is a bit like the Chinese philosophical concept of Ying and Yang, where seemingly opposite or contrary forces may actually be complementary.
The same applies here. JIT encourages flow, and Jidoka encourages stops, which seems contrary. However, both help to produce more and better parts at a lower cost.
Unfortunately, JIT gets much, much more attention as it is the glamorous and positive side, whereas jidoka is often seen as all about problems and stops and other negative aspects.
Yet, both are necessary for a good production system.
💣 Ignoring the holistic view of the higher goal, only on a detailed aspect like JIT can make things worse not better.
project shop, moving the unmovable.
The project shop is associated with not possible applying lean thoughts. Does it or are there situations where new technology are implementing a lean working way.
It is using a great invention of process improvement over and over again.
That is: the dock. Building in the water is not possible. Building it ashore is giving the question how to get it into the water safely.
🔰 Reinvention of patterns.
Moving something that is unmovable.
Changing something that has alwaus be done tath wasy.
Minimizing time for road adjustment, placing tunnel. Placing it when able to move done in just 3 days. Building several months.
See time-lapse. 👓 Placing the tunnel was a success, a pity the intended road isn´t done after three years.
 
The project approach of moving the unmovable has been copied many times with the intended usage afterwards.
rail bridge deck cover
The approach is repeatable.
💡 Reinvention of patterns. Moving something that is unmovable.
🎭When a project shop is better in place, why not copy this at ICT?
Administration information flow.
Seeing this way of working the association is to administration work moving the papers arround.
Flow lines are often the best and most organized approach to establish a value stream.
The "easiest" one is an unstructured approach. The processes are still arranged in sequence; however, there is no fixed signal when to start processing a part.
💡 Reinvention of patterns. Using the information flow as assembly line.
🎭 When a flow line is a fit for an administrative process, why not copy this at ICT?
🎭 When an administrative process is associated to administrative tags (eg prodcut description) being processed why not have them related to each other?
Administrative process, differences to physical objects.
-
⚠ Administrative information is easily duplicated.
Using ICT duplication is a standard action. Making all those copies gives some feeling of independncy.
The overall effect likely losing the connection tot the value chain. Technical ICT Hypes are a signal of this problem.
-
⚠ Administrative information often is not complete in needed material supply.
When assembling a physical prodcut the needed material planning is clear. Administrative information usually requires additional input resources.
Those additional resources are often external chains to connect. Problems arise when those input resources are not valid or changing when not expected to change.
Frustrations to this kind are common.
⟲ RN-1.4.2 Info
Change data - Transformations
Seeing the values stream within an administrative product is a different starting point for completely new approaches.
The starting point is redesigning what is not working well. Not automatically keeping things doing as always have been done. Also not changing things because of wanting to change something.
Design thinking.
It is a common misconception that design thinking is new. Design has been practiced for ages: monuments, bridges, automobiles, subway systems are all end-products of design processes.
Throughout history, good designers have applied a human-centric creative process to build meaningful and effective solutions.

The design thinking ideology is following several steps.
Defintion: The design thinking ideology asserts that a hands-on, user-centric approach to problem solving can lead to innovation, and innovation can lead to differentiation and a competitive advantage.
This hands-on, user-centric approach is defined by the design thinking process and comprises 6 distinct phases, as defined and illustrated below.
See link at figure 👓.
 
Those six phases are in line with what the crisp-dm model states. Wat is missing when comparing this with the PDCA cycle is the Check- Verify of it works as expected after implementation.
Combining information connections between silos & layers.
💡 Solving gaps between silos in the organisation is supporting the values stream.
Having aligned information by involved parties it is avoiding different versions of the truth.
It is more easy to consolidate that kind of information to a central managed (bi analytics) tactical - strategical level.
The change to achieve this is one of cultural attitudes. That is a top down strategical influence.
⟲ RN-1.4.3 Info
Tuning performance basics.
Solving performance problems requires understanding of the operating system and hardware.
That architecture was set by von Neumann (see design-math).
A single CPU, limited Internal Memory and the external storage.
The time differences between those resources are in magnitudes (factor 100-1000).
Optimizing is balancing between choosing the best algorithm and the effort to achieve that algorithm.
That concept didn´t change. The advance in hardware made it affordable to ignore the knowledge of tuning.
The Free Lunch Is Over .
A Fundamental Turn Toward Concurrency in Software, By Herb Sutter.
If you haven´t done so already, now is the time to take a hard look at the design of your application, determine what operations are CPU-sensitive now or are likely to become so soon,
and identify how those places could benefit from concurrency. Now is also the time for you and your team to grok concurrent programming´s requirements, pitfalls, styles, and idioms.
Additional component, the connection from machine, multiple cpu´s - several banks internal memory, to multiple external storage boxes by a network.
Tuning cpu - internal memory.
Minimize resource usage:
- use data records processing in serial sequence. (blue)
- indexes bundled (yellow).
- Allocate correct size and correct number of buffers.
- Balance buffers between operating system (OS) and DBMS. A DBMS normally is optimal without OS buffering (DIO).
❗ The
"balance line" algorithm is the best.
A DBMS will do that when possible.
Network throughput.
Minimize delays, use parallelization:
- Stripe logical volumes (OS).
- Parallelize IO, transport lines.
- Optimize buffer transport size.
- Compress - decompress data at CPU can decrease elapse time.
- Avoid locking caused by: shared storage - clustered machines.
⚠ Transport buffer size is a coöperation between remote server and local driver. The local optimal buffer size can be different.
Resizing data in buffers a cause of performance problems.
Minize delays in the storage system.
- Multi tiers choice SSD- Harddisk -Tape, Local unshared - remote shared.
- Prefered: sequential or skipped sequential.
- tuning with Analytics is big block bulk sequential instead of random small block transactional usage.
⚠ Using Analtyics, tuning IO is quite different to transactional DBMS usage.
💣 This different non standard approach must be in scope with service management. The goal of sizing capacity is better understood than Striping for IO perfromance.
⚠ DBMS changing types
A mix of several DBMS are allowed in a EDWH 3.0. The speed of transport and retentionperiods are important considerations.
Technical engineering for details and limitations to state of art and cost factors.
⟲ RN-1.4.4 Info
BISL Business Information Services Library.
Bisl is used for a demand supply chain. Often going along with internal business and external outsourcec IT services. Nice to see is a seperation of concerns in a similar way, placing the high level drivers in the center.
The framework describes a standard for processes within business information management at the strategy, management and operations level.
BiSL is closely related to the ITIL and ASL framework, yet the main difference between these frameworks is that ITIL and ASL focus on the supply side of information (the purpose of an IT organisation), whereas BiSL focuses on the demand side (arising from the end-user organisation

The demand side focus for some supply is a solution for the supposed mismatch business & ICT. The approach for that mismatch is an external supplier.

Indeed there are gaps. The question should be is there are mismatch or have the wrong questions been asked?
In the values stream flow there are gaps between:
- operational processes, in the chain of the product transformation - delivery.
- Delivering strategical management information assuming the silo´s in the transformation chains -delivery are cooperating.
- Extracting, creating management information within the silo´s between their internal layers.

Different responsible parties have their own opinion how those conflict should get solved.
The easy way is outsourcing the problem to an external party, a new viewpoint coming in.
🤔 The expectation this would be cheaper and having better quality is a promise without warrants .
🤔 Having no alignment between the silo´s there is a question on the version of the truth.

When these issues are the real questions real problems to solve:
- Solve the alignment between at operational processes, wiht the value stream of the product. Both parties need to agree as single version of the truth.
- Solve the alignment in extracting, creating management information within the silo´s between their internal layers. There are two lines of seperations in context.
- Use the management information wihtin the silos in consolidated information in delivering strategical management information.
RN-1.5 Closed loops, informing what is going on in the system
. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .
⟲ RN-1.5.1 The EDWH - Data Lake - Data Mesh - EDWH 3.0
Classic DataWareHousing.
Processing objects, processing information goes along with responsibilities. There is an origin of the information and a consumer of combined information lines.
⚠ A data warehouse is at the moment siloed to reporting tasks. Reporting in dashboards and reports so managers are making up their mind with those reports as the "data".
Other usage for a data warehouse is seen as problematic when it used for operational informational questions may be involved with AI better Machine learning bypassing those managers as the decision makers.
👓
❓ The technology question wat kind of DBMS should be uses in a monolithic system for management reporting is a strategy question asked.
❓ Data curation before being used in a monolithic system for management reporting is a strategy question asked.
❓ Historical information in this monolithic system for management reporting is a question.
❓ Connecting to analytical usage in an operational flow in this monolithic system for management reporting is a question.
⟲ RN-1.5.2 Info
💡 Logistics of the EDWH - Data Lake. EDWH 3.0
As the goal of BI Analytics was delivering reports to managers, securing informations and runtime performance was not relevant.
⚠ Securing information is too often an omission.
Transforming data should be avoided.
The data-consumer process should do the logic processing.
Offloading data, doing the logic in Cobol before loading, is an ancient one to be abandoned.
Processing objects, information goes along with responsibilities.
❗ A data warehouse is allowed to receive semi-finished product for the business process.
✅ A data warehouse is knowing who is responsible for the inventory being serviced.
❗ A data warehouse has processes in place for deleivering and receiving verified inventory.
In a picture:

The two vertical lines are managing who´s has access to what kind of data, authorized by data owner, registered data consumers, monitored and controlled.
The confidentiality and integrity steps are not bypassed with JIT (lambda).
CIA Confidentiality Integrity Availability. Activities.
- Confidentiality check at collect.
- Integrity verified before stored.
- Availability - on stock, in store.
- Availability - "just in time".
- Confidentiality at delivery.
- Integrity at delivery.
CSD Collect, Store, Deliver. Actions on objects.
- Collecting, check confidentiality.
- Storing, verify Integrity before.
- Stored, mark Availability.
- Collect JIT, mark Availability.
- Deliver check Confidentiality.
- Deliver verify Integrity.
There is no good reason to do this also for the data warehouse when positioned as a generic business service. (EDWH 3.0)
Focus on the collect - receive side.
There are many different options how to receive information, data processing. Multiple sources of data - Multiple types of information.
- ⚒ Some parties are reliable predictable available.
With internal systems this is usual.
- Internal Sub products
- Administration (not possible as physical)
- Operational factory chain
- ⚒ Other parties are less reliable, predictable having less availability.
With external systems this is usual.
- No dependency
- Internal dependency, prescriptions outsourced subtask

In a picture:
 
A data warehouse should be the decoupling point of incoming and outgoing information.
 
A data warehouse should validate verify the delivery on what is promised to be there.
Just the promise according to the registration by administration, not the quality of the content (different responsibility).
Focus on the ready - deliver side.
A classification by consumption type:
- ⚒ Operations For goals where standard systems are not appropriate or acting as an interface for not coupled systems. 💰 Results are input for other data consumers. Sensitive data allowed (PIA).
- ⚒ Archive of data - information not anymore available in operations, only for limited goals and associated with a retention period. ⚖
- ⚒ Business Intelligence (reporting). Developing and generating reports for decision makers. Possible as usage of analytical tools with DNF. ✅ Sensitive data is eliminated as much is possible.
- ⚒ Analytics Developing Machine Learning. ❗ This is: ALC type3. Sensitive data is eliminated as much is possible.
- ⚒ Analytics, Operations Machine Learning. ❗ This is: ALC type3. Sensitive data may be used controlled (PIA). Results are input for other data consumers.

In a picture:
 
There are possible many data consumers.
It is all about "operational" production data" - production information.
 
Some business applications only are possible using the production information.
⟲ RN-1.5.3 Info
Some mismatches in a value stream.
Aside all direct questions from the organisation many external requirements are coming in.
A limited list to get en idea on regulations having impact on the adminsitrative information processing.
business flow & value stream.

Having a main value stream from left to right, the focus can be top down with the duality of processes - transformations and the product - information.
Complicating factor is that:
✅ Before external can be retrieved the agreement on wat is to retrieve must be on some level.
✅ Before the delivery can be fulfilled the request on what tot deliver must be there.

Having the same organisation, the focus can be bottom up with the layers in silos and separation of concerns.
Complicating factor is that:
❓ In the centre needed government information is not coming in by default. The request for that information is not reaching the operational floor.
😲 cooperation between the silos responsible for a part of the operating process are not exchanging needed information on the most easy way by default.
EDW development approach and presetation
BI DWH, datavirtualization.
Once upon a time there were big successes using BI and Analytics. The success were achieved by the good decisions, not best practices, made in those projects.
To copy those successes the best way would be understanding those decisions made. As a pity these decisions and why the were made are not published.

The focus for achieving success changed in using the same tools with those successes.
BI Business Intelligence has for long claiming being the owner of the E-DWH.
Typical in BI is almost all data is about periods. Adjusting data matching the differences in periods is possible in a standard way.
The data virtualization is build on top of the "data vault" DWH 2.0 dedicated build for BI reporting usage.
It is not virtualization on top of the ODS or original data sources (staging).

Presenting data using figures as BI.
The information for managers commonly is presented in easily understandable figures.
When used for giving satisfying messages or escalations for problems there is bias to prefer the satisfying ones over the ones alerting for possible problems.
😲 No testing and validation processes being necessary as nothing is operational just reporting to managers.
💡 The biggest change for a DWH 3.0 approach is the shared location of data information being used for the whole organisation, not only for BI.
 
The Dimensional modelling and the Data Vault for building up a dedicated storage as seen as the design pattern solving all issues.
OLap modelling and reporting on the production data for delivery new information for managers to overcome performance issues.
A more modern approach is using in memory analytics. In memory analytics is still needing a well designed data structure (preparation).
 
😱 Archiving historical records that may be retrieved is an option that should be regular operations not a DWH reporting solution.
The operations (value stream) process is sometimes needing information of historical records.
That business question is a solution for limitations in the operational systems. Those systems were never designed and realised with archiving and historical information.
⚠ Storing data in a DWH is having many possible ways. The standard RDBMS dogma has been augmented with a lot of other options.
Limitations: Technical implementations not well suited because the difference to an OLTP application system.
⟲ RN-1.5.4 Info
Reporting Controls (BI)
The understandable goal of BI reporting and analytics reporting is rather limited, that is:
📚 Informing management with figures,
🤔 so they can make up their mind on their actions - decisions.
The data explosion. The change is the ammount we are collecting measuring processes as new information (edge).
📚 Information questions.
⚙ measurements data figures.
🎭 What to do with new data?
⚖ legally & ethical acceptable?
When controlling something it is necessary to:
👓 Knowing were it is heading to.
⚙ Able to adjust speed and direction.
✅ Verifying all is working correctly.
🎭 Discuss destinations, goals.
🎯 Verify achieved destinations, goals.
 
It is basically like using a car.
Adding BI (DWH) to layers of enterprise concerns.
Having the three layers, separation of concern :
- operations , business values stream (red)
- documentation (green)
- part of the product describing it for longer period
- related to the product for temporary flow reasons
- control strategy (blue))
At the edges of those layers inside the hierarchical pyramid interesting information to collect for controlling & optimising the internal processes.
For strategic information control the interaction with the documentational layer is the first one being visible.

Having the four basic organisational lines that are assumed to cooperate as a single enterprise in the operational product value stream circle, there are gaps between those pyramids.
 
Controlling them at a higher level is using information the involved parties two by two, are in agreement. This is adding another four points of information.
Consolidating those four interactions point to one central point makes the total number of strategic information containers nine.
Too complicated and costly BI.
When trying to answer every possible question:
💰 requiring a of effort (costly)
❗ every answer 👉🏾 new questions ❓.
🚧 No real endsituation
continus construction - development.
 
The simple easy car dashboard could endup in an airplane cockpit and still mising the core business goals to improve
⚠ ETL ELT - No Transformation.

Classic is the processing order:
⌛ Extract, ⌛ Transform, ⌛ Load.
For segregation from the operational flow a technical copy is required.
Issues are:
- Every Transform is adding logic that can get very complicated. Unnecesary complexity is waste to be avoided.
- The technical copy involves conversions between technical systems when they are different. Also introduce integrity questions by synchronisation. Unnecesary copies are waste to be avoided.
- Transforming (manufacturing) data should be avoided, it is the data-consumer process that should do logic processing.
Translating the physical warehouse to ICT.

All kind of data (technical) should get support for all types of information (logical) at all kinds of speed.
Speed, streaming, is bypassing (duplications allowed) the store - batch for involved objects. Fast delivery (JIT Just In Time).
💣 The figure is what is called lambda architecture in data warehousing.
lambda architecture. (wikipedia).
With physical warehouses logistics this question for a different architecture is never heard of.
The warehouse is supposed to support the manufacturing process.
For some reason the data warehouse has got reserved for analytics and not supporting the manufacturing process.
RN-1.6
. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .
⟲ RN-1.6.1 Info
Selfservice - Managed
Self service sounds very friendly, it is a euphemism for no service. Collecting your data, processing your data, yourself.
The advantage for the customer is picking what is felt convenient found on some shelf. The disadvantages are:
- no validation, check on what is felt convenient is also applicable.
- Limitation to only not harmfull stuff when used increectly (classifications public restricted).
- It can become very messy when the inventory is not checked regulary.

Have it prepared transported for you so it can processed for you.
The advantages are a well controlled environment that also is capable of handling more sensitive stuff (confidential secres).
⟲ RN-1.6.2 Info
Maturtity Level 1-5
Why -still- discuss IT-business alignment?
4. In search of mythical silver bullet
5. Focusing on infrastructure/architecture
7 Can we move from a descriptive vehicle to a prescriptive vehicle?
(see link with figure 👓)
💣 This CMM level is going on since 1990. Little progress in results are made. those can be explained by the document analyses and the listed numbers.
Going on the way to achieve the levels by fullfilling some action list as having done is a way to not achieve those goals. Cultural behanvior is very difficult to measure. Missing in IT is te C for communication: ICT.
Retrosperctive for applying collective intelligence for policy.
Ideas into action (Geoff Mulgan )
What's still missing is a serious approach to policy.
I wrote two pieces on this one for the Oxford University Press Handbook on Happiness (published in 2013), and another for a Nef/Sitra publication.
I argued that although there is strong evidence at a very macro level (for example, on the relationship between democracy and well-being), in terms of analysis of issues like unemployment, commuting and relationships, and at the micro level of individual interventions, what's missing is good evidence at the middle level where most policy takes place.
This remains broadly true in the mid 2020s.
I remain convinced that governments badly need help in serving the long-term, and that there are many options for doing this better, from new structures and institutions, through better processes and tools to change cultures.
Much of this has to be led from the top.
But it can be embedded into the daily life of a department or Cabinet.
One of the disappointments of recent years is that, since the financial crisis, most of the requests to me for advice on how to do long-term strategy well come from governments in non-democracies.
There are a few exceptions - and my recent work on how governments can better 'steer' their society, prompted by the government in Finland, can be seen in this report from Demos Helsinki.
During the late 2000s I developed a set of ideas under the label of 'the relational state'.
This brought together a lot of previous work on shifting the mode of government from doing things to people and for people to doing things with them.
I thought there were lessons to learn from the greater emphasis on relationships in business, and from strong evidence on the importance of relationships in high quality education and healthcare.
An early summary of the ideas was published by the Young Foundation in 2009.
The ideas were further worked on with government agencies in Singapore and Australia, and presented to other governments including Hong Kong and China.
An IPPR collection on the relational state, which included an updated version of my piece and some comments, was published in late 2012.
I started work on collective intelligence in the mid-2000s, with a lecture series in Adelaide in 2007 on 'collective intelligence about collective intelligence'.
The term had been used quite narrowly by computer scientists, and in any important book by Pierre Levy.
I tried to broaden it to all aspects of intelligence: from observation and cognition to creativity, memory, judgement and wisdom. A short Nesta paper set out some of the early thinking, and a piece for Philosophy and Technology Journal (published in early 2014) set out my ideas in more depth.
My book Big Mind: how collective intelligence can change our world from Princeton University Press in 2017 brought the arguments together.
⟲ RN-1.6.3 Info
Technology push focus BI tools.
The technology offerngs are rapidly changing the last years (as of 2020). Hardware is not a problemtic cost factor anymore, functionality is.
hoosing a tool or having several of them goes with personal preferences.

Different responsible parties have their own opinion how conflicts should get solved. In a technology push it is not the organisational goal anymore.
It is showing the personal position inside the organisation.
🤔 The expectation of cheaper and having better quality is a promise without warrants .
🤔 Having no alignment between the silo´s there is a question on the version of the truth.
Just an inventarization on the tools and the dedicated area they are use at:
Mat Turck on
2020 ,
bigdata 2020 An amazing list of all,kind of big data tools at the market place.
⟲ RN-1.6.4 Info
Changing the way of informing.
Combining the data transfer, microservices, archive requirement, security requirements and doing it like the maturity of physical logistics.
It goes into the direction of a centralized managed approach while doing as much as possible decentralised.
Decoupling activities when possible to get popping up problems human manageable small.
 
Combining information connections.
There are a lot of ideas giving when combined another situation:
💡 Solving gaps between silos supporting the values stream.
Those are the rectangular positioned containers connecting between the red/green layers. (total eight internal - intermediates)
💡 Solving management information into the green/blue layers in every silo internal.
These are the second containers in every silo. (four: more centralised)
💡 Solving management information gaps between the silos following the value stream at a higher level .
These are the containers at the circle (four intermediates).
Consolidate that content to a central one.
🎭 The result is Having the management information supported in nine (9) containers following the product flow at strategic level. Not a monolithic central management information system but one that is decentralised and delegate as much as possible in satellites.
💡 The outer operational information rectangle is having a lot of detailed information that is useful for other purposes. One of these is the integrity processes.
A SOC (Security Operations Centre) is an example for adding another centralised one.
🎭 The result is Having the management information supported in nine (9) containers following the product flow at strategic level. Another eight (8) at the operational level another and possible more.
Not a monolithic central management information system but one that is decentralised and delegate as much as possible in satellites.
🤔 Small is beautiful, instead of big monolithic costly systems, many smaller ones can do the job better an more efficiënt. The goal: repeating a pattern instead of a one off project shop.
The duality when doing a change it will be like a project shop.
Containerization.
We are used to the container boxes as used these days for all kind of transport.
The biggest of the containerships are going over the world reliable predictable affordable.
Normal economical usage, load - reload, returning, many predictable reliable journeys.

The first containerships where these liberty ships. Fast and cheap to build. The high loss rate not an problem but solved by building many of those.
They were build as project shops but at many locations. The advantage of a known design to build over and over again.
They were not designed for many journeys, they were designed for the deliveries in war conditions.
project shop.
to cite:
This approach is most often used for very large and difficult to move products in small quantities.
...
There are cases where it is still useful, but most production is done using job shops or, even better, flow shops.
💣 The idea is that everything should become a flow shop even when not applicable. At ICT delivering software in high speed is seen as a goal, that idea is missing the data value stream as goal.
Containerization.
Everybody is using a different contact to the word "data". That is confusing when trying to do something with data. A mind switch is seeing it as information processing in enterprises.
As the datacentre is not a core business activity for most organisations there is move in outsourcing (cloud SAAS).
Engineering a process flow, then at a lot of point there will be waits.
At the starting and ending point it goes from internal to external where far longer waits to get artefacts or product deliveries will happen.
Avoiding fluctuations having a predictable balanced workload is the practical solution to become effciënt.
Processing objects, collecting information and delivering goes along with responsibilities.
It is not sexy, infact rather boring. Without good implementation all other activities are easily getting worthless. The biggest successed like Amazon are probably more based in doing this very well than something else.
The Inner Workings of Amazon Fulfillment Centers
Common used ICT patterns processing information.
For a long time the only delivery of an information process was a hard copy paper result.
Deliveries of results has changed to many options. The storing of information has changed also.
 
Working on a holistic approach on information processing starting at the core activities can solve al lot of problems. Why just working on symptoms and not on root causes?
💡 Preparing data for BI, Analytics has become getting an unnecessary prerequisite. Build a big design up front: the enterprise data ware house (EDWH 3.0).
Data Technical - machines oriented
The technical machines oriënted approach is about machines and the connections between them (network).
The service of delivering Infrastructure (IAAS) is limited to this kind of objects. Not how they are inter related.
The problem to solve behind this are questions of:
- Any machine has limitations with performance.
❓ Consideration question: is it cheaper to place additional machines (* default action) or analysing performance issues by human experts.
- Confidentiality and Availability.
The data access has to be managed, backups and software upgrades (PAAS). All with planned outage times. Planning and coordination involved parties.
❓ Consideration question: is it cheaper to place additional machines (* default action) or manage additional complexity by human experts for machine support.

🤔 A bigger organisations has several departments. Expectations are that their work has interactions and there are some central parts.
Sales, Marketing, Production lines, bookkeeping, payments, accountancy.
🤔 Interactions with actions between all those departments are leading to complexity.
🤔 The number of machines and the differnces in stacks are growing fast. No matter where these logical machines are.
For every business service an own dedicated number of machines will increase complexity.
The information process flow has many interactions, inputs, tranformtions and outputs.
- ⚠ No relationsship machines - networking. The problem to solve that will popup at some point.
- ⚠ Issues by datatype conversions, integrity validation when using segragated sources (machines).
💡 Reinvention of a pattern. The physical logistic warehouse approach is well developed and working well. Why not copy that pattern to ICT? (EDWH 3.0)
What is delivered in a information process?
The mailing print processing is the oldest Front-end system using Back-end data. The moment of printing not being the same of the manufactured information.
Many more frontend deliveries have been created recent years. The domiant ones becoming webpages and apps on smartphones.
A change in attitude is needed bu still seeing it as a delivery needed the quality of infomration by the process.
Change data - Transformations
A data strategy helping the business should be the goal. Processing information as "documents" having detailed elements encapsulated.
Transport & Archiving aside producing it as holistic approach.

Logistics using containers.
The standard approach in information processing is focussing on the most detailed artefacts trying to build a holistic data model for all kind of relationships.
This is how goods were once transported as single items (pieces). That has changed into: containers having encapsulated good.
💡 Use of labelled information containers instead of working with detailed artefacts.
💡 Transport of containers is requiring some time. The required time is however predictable.
Trusting that the delivery is in time, the quality is conform expectations, is more efficiënt than trying to do everything in real time.

Informations containers have arrived almost ready for delivery having a more predictable moment for deliveriy to the customer.
💡 The expected dleivery notice is becoming standard in physical logistics. Why not doing the same in adminsitrative processes?
Data Strategy: Tragic Mismatch in Data Acquisition versus Monetization Strategies.
A nice review on this, "eOrganizations do not need a Big Data strategy; they need a business strategy that incorporates Big Data" Bill Schmarzo 2020.
Companies are better at collecting data ? about their customers, about their products, about competitors ? than analyzing that data and designing strategy around it.
Too many organizations are making Big Data, and now IOT, an IT project.
Instead, think of the mastery of big data and IOT as a strategic business capability that enables organizations to exploit the power of data with advanced analytics to uncover new sources of customer,
product and operational value that can power the organization?s business and operational models
RN-2 The impact of uncertainty to information processing
RN-2.1 Reframing the thinking for decision making
This is a different path on information processing supporting for governance and informed understandable decisions.
This all started with an assumption in certainty for knowledge management, collective intelligence.
Decisions however are made in assumptions and uncertainty.
- What kind of thinking is used & for what decisions
- The relationship in decisions transformations to Zarf Jabes
- Abstraction adjustments in this level to Zarf Jabes Jabsa
- The almost green area for this abstraction level in decisions
⟲ RN-2.1.1 Thinking dialectal for underpinning at decisions
The Dialectical Thought Form Framework (DTF) source
Far beyond the personal comfort zone LLm is helpful.
Otto Laske is a multidisciplinary consultant, coach, teacher, and scholar in the social sciences, focused on human development and organizational transformation.
Jan De Visch is an organization psychologist, executive professor, and facilitator with extensive experience managing organizational development and change processes.
Key contributions:
- CDF (Constructive Developmental Framework): a developmental model for adult growth that helps consultants, coaches, and leaders assess and nurture complexity-capable thinking.
- DTF (Dialectical Thought Form Framework): — tools for critical facilitation and boosting individual cognitive development.
Dialectical Thought Form Framework (DTF) is aimed at understanding and nurturing reasoning complexity: how people structure thought as they handle context, change, contradiction, and transformation.
The counterpart of this page
6x6systemslean (Shape design Zarf Jabes Jabsa).
It is not descriptive systems thinking (formal-logical), it is meta-structural systems thinking.
This is the same territory Laske calls dialectical, DTF is operating in the same cognitive space.
Key indicators (DTF markers) present throughout 6x6systemslean:
- Reference frames instead of models
- Fractals instead of hierarchies
- Dualities instead of binaries
- Cycles instead of linear causality
- Architecture of viewpoints instead of single perspectives
The work consistently combines: Process (cycles, iteration, lean loops), Relationship (roles, viewpoints, dependencies), Transformation (reframing, recursion, scale shifts).
The overlap is deep, but unevenly distributed across DTF categories.
The Dialectical Thought Form Framework (DTF) summary
Dialectical Thought Form Framework (DTF) consists of 4 categories (quadrants), each with 7 Thought Forms (TFs), for a total of 28.
The standard IDM / Laske formulation, wording can vary slightly across publications and trainings, but the structure is stable.
- Process (P) – How things unfold over time
Focus: movement, sequencing, emergence, and ongoing activity.
Dynamics and changes over time.
- Context (C) – Conditions and embedding
Focus: environment, systems, constraints, and enabling conditions.
Situating phenomena in conditions and constraints.
There are for each categories:
| Context (C) | 👐 | Process (P) |
| C1 – Context as container | | P1 – Process as a whole |
| C2 – Contextual limits / boundaries | | P2 – Process phases |
| C3 – Contextual resources | | P3 – Process directionality |
| C4 – Contextual embeddedness | | P4 – Process rhythm / pace |
| C5 – Contextual dependency | | P5 – Process interaction |
| C6 – Contextual shift | | P6 – Process interruption |
| C7 – Contextual layering (multiple contexts) | | P7 – Process stabilization |
- Relationship (R) – Mutual influence and structure
Focus: interdependence, coordination, and structural relations.
How elements relate in structure or function.
- Transformation (T) – Change of form
Focus: qualitative change, emergence of the new, negation of the old.
Deep change or integration beyond categories.
There are for each categories:
| Relationship (R) | 👐 | Transformation (T) |
| R1 – Relationship as mutual influence | | T1 – Emergence |
| R2 – Structural relationship | | T2 – Transformation of function |
| R3 – Functional relationship | | T3 – Transformation of structure |
| R4 – Power / asymmetry | | T4 – Breakdown / negation |
| R5 – Complementarity | | T5 – Reorganization |
| R6 – Tension / contradiction | | T6 – Developmental leap |
| R7 – Relational integration | | T7 – Integration at a higher level |
Each class, Process (P), Context (C), Relationship (R) and Transformation (T) captures a way of thinking, from seeing events in relation to conditions, diagnosing interdependencies, and dealing with contradictions, to achieving integrative transformation.
This is typically used:
- In developmental assessments (cognitive interviewing),
- For team dialogue and facilitation (Laske & De Visch),
- To distinguish formal-logical from dialectical thinking,
- As a developmental map, not a competency checklist.
This is a generic thinking approach that is usable on groups of persons and systems acting is a similar way.
That is different boundary scope than DTF has got growing in.
Where there is no overlap: important boundary
There are also clear non-overlaps, which is healthy.
What DTF has that your page does not aim to do:
- Assess individual cognitive development
- Distinguish developmental levels
- Score or profile reasoning complexity
What 6x6systemslean has that DTF does not
- Normative design intent
- Architectural completeness
- Operational guidance for enterprise/system design
DTF is diagnostic, 6x6systemslean is generative, they are complementary, not redundant.
The SIAR model operationalizes dialectical thinking at the system-design level, while DTF explicates the cognitive forms required to meaningfully operate such a model.
⟲ RN-2.1.2 Feeling a repeating pattern of ~6 distinctions
The reasoning for a limited number of distinctions
The statement: “Each dimension, when articulated adequately but minimally, needs about 6–7 stable distinctions.” does not originate as a design rule in Laske.
It is a convergence result across several intellectual traditions that Laske draws together.
Key sources:
- Hegelian dialectics (structure of determination)
Hegel’s logic repeatedly shows that:
- A concept becomes determinate only after a finite sequence of qualitative differentiations
- Fewer than ~6 leaves it underdetermined
- More than ~7 begins to produce redundancy
Laske does not invent this — he operationalizes it.
Hegel published his first great work, the Phänomenologie des Geistes (1807; The Phenomenology of Mind). This, perhaps the most brilliant and difficult of Hegel’s books, describes how the human mind has risen from mere consciousness, through self-consciousness, reason, spirit, and religion, to absolute knowledge..
- Piaget / Kegan (constructive-developmental limits)
Developmental psychology shows that:
- Meaning-making capacity expands by adding distinctions
- At any stable stage, the number of simultaneously operable distinctions is limited
- Empirically, this stabilizes around 6–7 for adult meaning structures
Piaget's theory of cognitive development, or his genetic epistemology, is a comprehensive theory about the nature and development of human intelligence.
Kegan described meaning-making as a lifelong activity that begins in early infancy and can evolve in complexity through a series of "evolutionary truces" (or "evolutionary balances") that establish a balance between self and other (in psychological terms), or subject and object (in philosophical terms), or organism and environment (in biological terms).
This is not Miller’s “7±2” memory claim it is about structural differentiation, not memory load.
- Jaques’ stratified systems theory
Elliott Jaques Jaques incorporated his findings during "Glacier investigations" into what was first known as Stratified Systems Theory of requisite organization. This major discovery served as a link between social theory and theory of organizations.
(requisite organization):
- Found that complex systems stabilize at ~7 strata
- Fewer strata → insufficient control
- More strata → loss of coherence
Laske explicitly references Jaques in his systems thinking.
- Empirical validation in DTF research
Laske and collaboratorsCoded hundreds of interviews, observed that:
- Below ~6 distinctions → thinking collapses into vagueness
- Above ~7 → distinctions collapse back into synonyms or rhetoric
The 7-per-quadrant pattern is empirical, not aesthetic.
Using six catergories to do learning
Increasingly, the issues on which the survival of our civilization depends are ‘wicked’ in the sense of being more complex than logical thinking alone can make sense of and deal with. Needed is not only systemic and holistic but dialectical thinking to achieve critical realism. Dialectical thinking has a long tradition both in Western and Eastern philosophy but, although renewed through the Frankfurt School and more recently Roy Bhaskar, has not yet begun to penetrate cultural discourse in a practically effective way. We can observe the absence of dialectical thinking in daily life as much as in the scientific and philosophical literature.
It is one of the benefits of the practicum to let participants viscerally experience that, and in what way, logical thinking — although a prerequisite of dialectical thinking — is potentially also the greatest hindrance to dialectical thinking because of its lack of a concept of negativity. To speak with Roy Bhaskar, dialectical thinking requires “thinking the coincidence of distinctions” that logical thinking is so good at making, being characterized by “fluidity around the hard core of absence” (that is, negativity, or what is missing or not yet there).
For thinkers unaware of the limitations of logical thinking, dialectical thinking is a many-faced beast which to tame requires building up in oneself new modes of listening, analysis, self- and other-reflection, the ability to generate thought-form based questions, and making explicit what is implicit or absent in a person’s or group’s real-time thinking. These components are best apprehended and exercised in dialogue with members of a group led by a DTF-schooled mentor/facilitator.
The practicum takes the following six-prong approach:
- Foundations of Dialectic:
Understand moments of dialectic and classes of thought forms and their intrinsic linkages as the underpinnings of a theory of knowledge.
- Structured dialogue and communication:
Learn how to use moments of dialectic when trying to understand a speaker’s subject matter and issues, or when aiming to speak or writing clearly.
- (Developmental) listening and self-reflection
Learn to reflect on the thought form structure of what is being said by a person or an entire group in real time
- Text analysis:
Learn to understand the conceptual structure of a text (incl. an interview text) in terms of moments of dialectic and their associated thought forms as indicators of optimal thought complexity.
- Question & problem generation and formulation
Learn how to formulate cogent and visionary questions (including to yourself), and give feedback based on moments of dialectic and their associated thought forms
- Critical facilitation
Learn how to assist others in understanding what they are un-reflectedly saying, thinking, or intending
Acquiring these six, mutually supportive capabilities takes time and patience with oneself and others.
It goes far beyond ‘skill training’ since participants need to engage in revolutionizing their listening, way of thinking, structure of self-reflection, and attention to others’ mental process, — something that logical thinkers for whom the real world is “out there” (not “in here”) are not accustomed to.
⟲ RN-2.1.3 Reframe dialectual abstraction of the SIAR model
Sense - Interpret - Act - Reflect
What is not done: replace SIAR with DTF labels, instead:
- Each SIAR phase is expressed as a dominant dialectical move
- Using DTF categories + typical T/P/R/C operations
- In language that still supports action and facilitation
Think of this as SIAR with its cognitive mechanics exposed.
- S — Sense Situate the situation within its enabling and constraining contexts.
DTF language (dominant: Context + Relationship):
- Establish system boundaries (C1, C2)
- Identify contextual dependencies and conditions (C5)
- Surface relevant actors, viewpoints, and roles (R1, R2)
Key dialectical move: “What contextual conditions make this situation what it is?”
This is not data gathering — it is situated sense-making.
- I— Interpret Structure meaning by relating elements, perspectives, and tensions.
DTF language ((dominant: Relationship)):
- Identify structural and functional relationships (R2, R3)
- Surface tensions, contradictions, and misalignments (R6)
- Integrate multiple viewpoints into provisional coherence (R7)
Key dialectical move: “How do these elements mutually shape and constrain one another?”
Interpretation is relational structuring, not explanation.
- A — Act Intervene in ongoing processes to test and influence system behavior.
DTF language (dominant: Process):
- Select intervention points in unfolding processes (P2, P5)
- Acknowledge timing, rhythm, and flow (P3, P4)
- Expect and monitor interruptions and side effects (P6)
Key dialectical move: “Where and how can we intervene in the process as it unfolds?”
Action is processual engagement, not execution of a plan.
- R — Reflect Transform frames, assumptions, and structures based on what emerges.
DTF language (dominant: Transformation):
- Negate or let go of inadequate assumptions (T4)
- Recognize emergent patterns and new coherence (T1)
- Integrate learning at a higher systemic level (T7)
Key dialectical move: “What must change in how we frame the system for the next cycle?”
Reflection is structural reframing, not evaluation.
butics
| SIAR | Plain wording | Dominant DTF move |
| Sense | Situate the situation | Contextualization (C) |
| Interpret | Structure meaning | Relational integration (R) |
| Act | Intervene in process | Process engagement (P) |
| Reflect | Reframe the system | Transformation (T) |
Important:
SIAR traverses C ➡ R ➡ P ➡ T in every cycle, that is full dialectical movement, not partial.
What does change:
- Reflection becomes non-optional
- Learning must alter frames, not just actions
- "Action" is understood as process intervention, not task completion
This makes SIAR robust under complexity.
SIAR is a complete dialectical cycle in practice: it situates contexts, structures relations, intervenes in processes, and transforms frames — whether or not this is made explicit.
⟲ RN-2.1.4 Underpinning the repeating pattern of ~6 distinctions
The repeatin Autonomous repetitions for limited number of distinctions
A dimension is only “adequately articulated” when it can express internal tension, differentiation, and integration without self-contradiction.
To do that, you need at least:
- A grounding distinction
- Differentiation
- Relational mediation
- Tension / contradiction
- Negation or boundary
- Re-integration
That already gets you to 6.
The 7th is not additive — it is closure at a higher level.
This convergence shows up again and again:
| Too few (≤4) | Optimal (6–7) | Too many (≥9) |
| Oversimplified | Expressive | Redundant |
| No contradiction | Tension + resolution | Loss of salience |
| Binary thinking | Dialectical movement | Semantic inflation |
This is why:
Quadrants with 3–4 categories feel “framework-ish”
Quadrants with 10+ feel encyclopedic
Quadrants with 6–7 feel complete but navigable
I started from:
- Governance
- Organization
- Civic systems
- Practice
And ended up with: ~6–7 stable distinctions per dimension, repeatedly, across roles, scales, and contexts.
That is not coincidence.
It is a sign that I am working against the same cognitive constraints that DTF formalizes.
The "6–7 distinctions per dimension" rule is not a design choice but an empirically and dialectically grounded minimum required for stable, non-redundant articulation of complex meaning.
DTF does not say: “There are exactly 7 distinctions because the theory says so.”
DTF says: “At least ~7 are needed before thinking becomes dialectically mobile in that dimension.”
https://www.researchgate.net/publication/320328743_Human_Developmental_Processes_as_Key_to_Creating_Impactful_Leadership
Historical source for limited number of distinctions
Asking not aot a citation chain but a structural genealogy: how the same necessity for articulated distinctions reappears as theories of mind mature.
To trace it explicitly and conservatively, showing what is inherited, what is transformed, and why the 6–7 pattern keeps re-emerging.
- Hegel – Dialectic as Generative Constraint
Core move: Hegel does not enumerate categories arbitrarily.
He shows that thinking generates distinctions until contradiction stabilizes.
Key structure (Logic): Hegel’s dialectic unfolds through triadic movement, but stability requires more than three moments.
Across Being ➡ Essence ➡ Concept we see:
| | Level | Function |
| 1 | Immediate | Undifferentiated unity |
| 2 | Negation | Differentiation |
| 3 | Mediation | Relation |
| 4 | Opposition | Tension |
| 5 | Contradiction | Instability |
| 6 | Sublation | Reorganization |
| 7 | Totality | Integration |
That is 7 functional moments, though Hegel never lists them as such.
Crucial point, Hegel discovers that:
thought must differentiate, but cannot differentiate endlessly, because coherence collapses
✅
Dialectic stabilizes when all necessary moments are present
- Piaget - Epistemic Operators in Development
Piaget takes Hegel out of metaphysics and into empirical cognition.
Piaget’s key shift: From categories of being ➡ operations of knowing.
Formal Operational Thinking: Piaget identifies coordinated operations, not facts:
| | Operator | Function |
| 1 | Reversibility | Undoing |
| 2 | Conservation | Invariance |
| 3 | Compensation | Balance |
| 4 | Composition | Combining |
| 5 | Negation | Differentiation |
| 6 | Reciprocity | Mutuality |
These form closed operational systems.
Piaget repeatedly finds: fewer operators ➡ unstable reasoning, more ➡ redundancy, no new power.
👉🏾
Operational systems stabilize at ~6 coordinated operators
Explicit inheritance from Hegel: dialectic becomes equilibration, Contradiction becomes cognitive disequilibrium, sublation becomes re-equilibration.
✅
The same constraint appears, now empirically grounded
- Elliott Jaques – Stratified Cognitive Complexity
Jaques applies Piagetian operations to work, time, and organizations.
Jaques’ contribution, he discovers that: roles require specific levels of cognitive integration, integration happens in discrete strata, strata of Mental Processing.
Each stratum requires mastery of a bounded set of distinctions:
| | Stratum | Cognitive capacity |
| 1 | Declarative | Facts |
| 2 | Procedural | Processes |
| 3 | Serial | Sequences |
| 4 | Parallel | Systems |
| 5 | Meta-systemic | Systems of systems |
| 6 | Dialectical | Contradiction |
| 7 | Transformational | Re-framing identity |
Jaques never formalizes “7” as a rule, but: Below ~6 ➡ roles fail, Above ~7 ➡ no additional explanatory power
The critical move, Jaques ties cognitive differentiation to: accountability, time-span, organizational viability.
✅
Dialectical capacity becomes operational necessity
- Otto Laske – Dialectical Thought Forms (DTF)
Laske makes the latent structure explicit, Laske’s synthesis integrates: Hegelian dialectic, Piagetian operations, Jaques’ strata, Adult development research.
DTF structure in four classes, each with 7 thought forms:
| | Class | Function |
| 1 | Context | Framing |
| 2 | Process | Change |
| 3 | Relationship | Interaction |
| 4 | Transformation | Re-organization |
Why 7, bcause Laske empirically finds that:
- fewer ➡ clients cannot complete dialectical moves
- more ➡ TFs collapse into each other
Each class requires:
| | Stratum |
| 1 | Differentiation |
| 2 | Opposition |
| 3 | Mediation |
| 4 | Integration |
| 5 | Boundary recognition |
| 6 | Tension handling |
| 7 | Re-framing |
👉🏾 These are Hegelian moments, operationalized.
The through-line (compressed):
| Thinker | Contribution | What stays invariant |
| Hegel | Dialectic of concepts | Necessary moments |
| Piaget | Dialectic of cognition | Operational closure |
| Jaques | Dialectic of work | Functional sufficiency |
| Laske | Dialectic of thought | Explicit minimal set |
What persists is not the number but the necessity of a bounded set.
From Hegel’s Logic through Piaget’s operations, Jaques’ strata, and Laske’s DTFs, the recurrence of approximately six to seven distinctions per dimension reflects a deep structural constraint of dialectical cognition: it is the minimum articulation required to hold difference, tension, and integration simultaneously without conceptual collapse.
6–7 appears because:
👉🏾 That is the smallest number of distinctions that allows contradiction, mediation, and integration without collapse or redundancy.
✅
Laske is the first to state the constraint explicitly
😲 A nice explanation but hard to verify by citations, explicitly stated is this mentioned nowhere.
RN-2.2 A new path in thinking - reflections
This ..... is a different path on information processing supporting for governance and informed understandable decisions.
This all started with an assumption in certainty for knowledge management, collective intelligence.
Decisions however are made in assumptions and uncertainty.
- What kind of thinking is used at & for decisions
- The relationship in decisions transformations to Zarf Jabes
- Abstraction adjustments in this level to Zarf Jabes Jabsa
- The almost green area for this abstraction level in decisions
⟲ RN-2.2.1 Info
Culture internal external
Testing Zarf Jabes Jabsa to hitting upper and lower bounds
Lower bound (≈ under-articulation), A dimension hits the lower bound when:
- a distinction is doing double duty
- a TF must be inferred, not enacted
- users collapse different operations into one mental move
Symptom: “I kind of get it, but I don’t know what to do.”
Upper bound (≈ over-articulation), a dimension hits the upper bound when:
- distinctions become context-dependent synonyms
- users cannot tell which distinction to use now
- sequencing becomes unclear
👉 Symptom: “This is rich, but I’m lost.”
⟲ RN-2.2.2 Info
Culture internal external
Hofstede's cultural dimensions theory is a framework for cross-cultural psychology, developed by Geert Hofstede .
It shows the effects of a society's culture on the values of its members, and how these values relate to behavior, using a structure derived from factor analysis.
Hofstede’s Original 4 Dimensions (1980s)
- Power Distance: Acceptance of unequal power distribution.
- Individualism vs. Collectivism: Preference for self-reliance vs. group loyalty.
- Uncertainty Avoidance: Comfort with ambiguity and risk.
- Masculinity vs. Femininity: Competitive/assertive vs. cooperative/caring values.
Later Expanded to 6 Dimensions, added were:
- Long-Term vs. Short-Term Orientation: Pragmatic future focus vs. respect for tradition and immediate results.
- Indulgence vs. Restraint: Freedom to enjoy life vs. strict social norms and control.
Thinking on Hofstede 4 classes where there are 6 a tension between the classic fourfold framing (still widely cited in management discussions) and the full six-dimensional model (more academically complete).
Re‑framing Hofstede’s set of dimensions by swapping one of the “classic four” (Power Distance) with Long‑Term vs. Short‑Term Orientation, and then treating Indulgence–Constraint and Power Distance as external cultural forces.
This gives a hybrid model where the internal set is four, and the external set is two.
This restructuring does something interesting:
It internalizes adaptive learning and values, making them the “operational” cultural levers inside teams, four internal.
It externalizes structural and societal constraints treating them as boundary conditions that shape but don’t directly drive team dynamics.
That’s a neat systems‑thinking move: distinguishing between cultural drivers that can be shifted through knowledge sharing and governance versus macro‑forces that set the stage but are harder to change directly.
This aligns with the broader interest in semantic governance overlays, effectively creating a layered model where internal dimensions are “governable” and external ones are “contextual constraints.”
| Dimension | Focus | Governance Implication |
| | Internal (Governable) |
| 1 | Individualism vs. Collectivism | Self vs. group orientation | Balance team incentives between personal accountability and collective outcomes |
| 3 | Uncertainty Avoidance | Comfort with ambiguity | Adjust processes: high avoidance ➡ clear rules low avoidance ➡ flexible experimentation |
| 4 | Masculinity vs. Femininity | Competition vs. cooperation | Align leadership style: assertive goal‑driven vs. relational quality‑of‑life emphasis |
| 5 | Long‑Term vs. Short‑Term Orientation | Future pragmatism vs. tradition/immediacy | Shape strategy invest in innovation cycles vs. emphasize quick wins and heritage | >
| | External (Contextual) |
| 0 | Power Distance | Acceptance of hierarchy | Account for structural limits flat vs. hierarchical authority patterns in organizationss |
| 6 | Indulgence vs. Constraint | Freedom vs. restraint | Recognize societal norms openness to leisure vs. strict codes of conducts |
This creates a 4+2 model: four internal drivers for operational culture, two external forces that shape the environment.
It distinguishes between what governance can actively modulate versus what governance must respect and adapt to. It also makes dashboards more actionable, since leaders can see which dimensions they can influence internally and which ones they must design around.
Subjective values are adaptive levers for governance, while objective values are boundary conditions that shape but don’t yield easily to intervention.
Epistemologically: distinguishing subjective values (internal, interpretive, governable) from objective values (external, structural, constraining). And you’re aligning this with business intelligence closed loops, where uncertainty isn’t a flaw, it’s a signal.
- Internal dimensions are adaptive levers:
they can be shifted through governance, knowledge sharing, and team design.
- These are subjective values.
- The only execption is the operational functional product-service flow that is objective traceable.
- External dimensions are boundary conditions:
they set the cultural context but are harder to change directly.
These act like “environmental constraints” in the systems framing.
- These are objective values.
- The only execption is the operational functional product-service flow that is subjective.
Uncertainty Avoidance, in particular, becomes a governance dial: high avoidance → tight loops, low tolerance for ambiguity; low avoidance → open loops, exploratory learning
| Dimension | Focus | Governance Implication |
| | Subjective |
| 1 | Individualism vs. Collectivism | Align incentives and team structures | Reveals motivational asymmetries in decision loops |
| 3 | Uncertainty Avoidance | Design process flexibility and risk tolerance | Injects adaptive tension into closed loops — uncertainty becomes a learning input |
| 4 | Masculinity vs. Femininity | Shape leadership tone and performance metrics | Surfaces value conflicts in goal-setting and feedback |
| 5 | Long‑Term vs. Short‑Term Orientation | Set strategic horizons and innovation cadence | Modulates loop frequency and depth of insight capture> | >
| | Objective |
| 0 | Power Distance | Respect structural hierarchy and authority norms | Defines access boundaries and escalation paths in BI systems |
| 6 | Indulgence vs. Constraint | Acknowledge societal norms and behavioral latitude | Frames behavioral data interpretation and ethical thresholds |
Subjective values: Internally held, interpretive, and governable through dialogue, incentives, and learning. They vary across individuals and can be shifted through team dynamics and feedback loops.
Subjective values are loop-sensitive: they affect how feedback is interpreted, how decisions are made, and how learning occurs.
Objective values: Structurally embedded, externally imposed, and less governable. They reflect societal norms, institutional structures, or inherited constraints that shape behavior but resist direct modulation.
Objective values are loop-bounding: they define what feedback is allowed, who can act on it, and what constraints shape the loop’s operation.
Uncertainty Avoidance, in particular, becomes a governance dial, high avoidance leads to tight loops with low tolerance for ambiguity; low avoidance supports open loops and exploratory learning.
| Loop Stage | Subjective Values Influence | Objective Values Constraint |
| Data Capture | Individualism vs. Collectivism: shapes what data is noticed (self vs. group signals). | Power Distance: defines who is allowed to collect or access data. |
| Interpretation | Uncertainty Avoidance: governs tolerance for ambiguity in analysis. | Indulgence vs. Constraint: frames acceptable narratives (open vs. restrained meaning). |
| Decision | Masculinity vs. Femininity: biases toward competitive vs. cooperative choices. | Power Distance: constrains who has authority to decide. |
| Action | Long‑ vs. Short‑Term Orientation: sets horizon for implementation (quick wins vs. long cycles). | Indulgence vs. Constraint: limits behavioral latitude in execution.> |
| Feedback | All subjective values: modulate how lessons are internalized and adapted. | Objective values: bound how feedback can be expressed or escalated. |
In BI loops, uncertainty isn’t noise — it’s the adaptive signal.
High Uncertainty Avoidance → closed loops tighten, feedback is filtered, risk is minimized.
Low Uncertainty Avoidance → loops stay open, feedback is exploratory, innovation thrives.
Thus, uncertainty avoidance is the governance dial that determines whether loops become rigid control systems or adaptive learning systems.
butics
The Danaher Business System (DBS), developed by Mark DeLuzio, is a comprehensive Lean-based operating model that transformed Danaher Corporation into one of the most successful industrial conglomerates in the world.
It integrates strategy deployment, continuous improvement, and cultural alignment into a unified system for operational excellence.
| Element | Description |
| Lean foundation | Built on Toyota Production System principles, emphasizing waste elimination, flow, and value creation. |
| Policy Deployment (Hoshin Kanri) | Strategic alignment tool that cascades goals from top leadership to frontline teams. |
| Kaizen culture | Continuous improvement through structured problem-solving and employee engagement. |
| Visual management | Dashboards, metrics boards, and process visibility tools to drive accountability and transparency. |
| Standard work | Codified best practices for consistency, training, and performance measurement. |
| Lean accounting | Developed by DeLuzio to align financial systems with Lean operations — focusing on value streams rather than traditional cost centers. |
Mark DeLuzio’s Role and Philosophy
Architect of DBS: As VP of DBS, DeLuzio led its global deployment and helped Danaher become a benchmark for Lean transformation.
Lean Accounting Pioneer: He introduced the first Lean accounting system in the U.S. at Danaher’s Jake Brake Division.
Strategic Integrator: DeLuzio emphasized that Lean must be tied to business strategy — not just operational tools.
Respect for People: A core tenet of DBS, ensuring that transformation is sustainable and human-centric.
| Activity | Description |
| Eliminating waste in accounting processes | Traditional month-end closes and cost allocations often involved redundant steps. Lean Accounting applies value-stream mapping to streamline closing cycles, freeing finance teams to focus on strategic analysis |
| Value-stream based reporting | Instead of tracking costs by departments, Lean Accounting organizes them by value streams — the end-to-end activities that deliver customer value. This provides clearer insight into profitability tied to actual products or services |
| Real-time decision support | Lean Accounting emphasizes timely, actionable data rather than lagging reports. This enables leaders to make faster, more informed investment and governance decisions |
| Continuous improvement in finance | Just as Lean manufacturing fosters kaizen, Lean Accounting embeds continuous improvement into financial governance, ensuring reporting evolves with operational needs |
| Integration with agile governance | Lean financial governance adapts investment tracking to modern delivery methods (agile, hybrid, waterfall), ensuring funding and prioritization align with how initiatives are actually execute |
| Transparency and cultural alignment: | By eliminating complex cost allocations and focusing on value creation, Lean Accounting fosters a culture of openness and accountability across departments |
Why This Matters for Governance
Traditional accounting often obscured the link between operations and financial outcomes. Lean Accounting reshaped governance by:
Making financial metrics operationally relevant.
Aligning investment decisions with customer value creation.
Enabling adaptive governance models that support agile and Lean transformations.
This is why companies like Danaher, GE, and others used Lean Accounting as a cornerstone of their governance systems — it provided clarity, speed, and alignment between finance and operations.
⟲ RN-2.2.3 Info
Culture internal external
Culture internal external
⟲ RN-2.2.4 Info
Culture internal external
Culture internal external
RN-2.3 Purposeful usage of dialectal thoughts
This different path on information processing supporting for governance and informed understandable decisions requires more detailed scape and bounaries to make it more practical.
The four areas in better understanding hold:
- Is there a problem that is felt to act on?
- Do we really understand what the problem is?
- Constraints in the process(P) thought forms
- Enabling versus resistance to change
⟲ RN-2.3.1 Relationship dialects in a practical setting
Starting with understanding "the problem"
"So you wnat to define "the problem" (LI: John Cutler 2025)
The full page is at:
The beautifull mess, TBM 396"
In product, we’re told to "define the problem."
I’ve always felt that this is hubris, at least with anything beyond fairly contained situations.
“Go talk to customers, and figure out what the problem is!”
- Their problem?
- My problem with their problem?
- A problem we can help with?
- What they say their goal is?
Ultimately, as product builders or interveners, we may choose to take a shot at “solving the problem” with the tools at our disposal.
So I guess my intent with this graphic is to get people thinking at multiple levels.
This is not a root cause model.
- The layers are not steps toward a single, correct explanation.
- They are ways of seeing the situation from different angles, adding context and constraints.
- The goal here is not to fully explain the situation, but to act more thoughtfully within it.
- There is no privileged “problem definition” moment.
This approach is in line with dialectical thinking.
The setting is the problem definition.
Sensing what the intention is, context (C), with the goal of able to act on processes(P) by using relationship(R) thoughtforms.
Distinctive capabilities in problem understanding
“Define the problem” is often hubris in complex situations and there is no single privileged problem definition.
The goal should be to act more thoughtfully by looking at the situation from multiple angles.
- Start with how the customer describes the problem in their own words and suspend judgment
It is their mental model of the problem. This is their story, not ours, no matter how strange it might sound, or how strongly we might feel they are wrong or missing the point.
Even if the framing is misguided, it is still the belief system and narrative currently organizing their understanding of the situation.
If anything is going to change, it is this story and its explanatory power that will ultimately need to be replaced by something more compelling.
- Look at how other people around them experience the same situation and notice bias and false consensus.
Here we explicitly acknowledge that how one person sees or feels the problem is just one take on the situation.
People often inflict their framing of the problem onto others, intentionally or not.
- Examine the system forces shaping behavior including incentives norms tools power and constraints
Shifts focus to the environment and the forces acting on people within it. We intentionally look at the system through multiple lenses, including human factors, behavioral psychology, learning design, social practice theory, anthropology, power, and politics.
The aim is not to find a single cause, but to understand how the system shapes what feels normal, risky, effortful, possible, etc.
- Integrate perspectives with history and prior attempts and treat past fixes as useful data.
This is where we start integrating. We take the actors from Layers 1 and 2 and the forces identified in Layer 3, and we add history.
What has already been tried? What workarounds exist?
What has failed, partially worked, or succeeded to much fanfare?!
We begin restating the problem through this richer lens, knowing full well that we are now converging and imposing a perspective, whether it turns out to be right or wrong.
- Consider how your product or expertise could realistically influence these dynamics without selling.
We consider our product, expertise, or technology, and how it might influence the situation. Not how it will.
Not how it should. But how it could, in theory, intersect with the dynamics we now understand.
The issue is one of opportunity, can we reduce friction or create new pathways?
If it is capability, can we scaffold learning or decision-making?
If it is motivation, can we alter incentives, visibility, or feedback loops? This is hypothesis-building, not pitching.
- Decide what can be influenced now what capabilities are missing and what small actions are feasible
Back to reality, informed by everything we have learned so far.
Our understanding of what is possible is shaped by the stories we heard, the perspectives surfaced, the system forces examined, and the history uncovered. (layer 1-4_)
This is where we move from understanding to action.
What can we realistically influence today?
What levers are actually within reach?
Here we form concrete, feasible actions for how we might intervene in the situation.
We ask what we can try, not in theory, but in practice.
What capabilities would we need to borrow, buy, or build to support those interventions?
These choices cannot be made in isolation.
They must cohere with prior efforts, align with the incentives and constraints already at play, fit the needs and beliefs of the actors involved, and still connect back to the problem as it was originally described, even if that description now feels distant from where we believe the strongest leverage exists.
The aim is better judgment and leverage not a perfect explanation.
| | Stratum | Cognitive capacity |
| 1 | Customer's mental model/ stated problem | What problem does the customer say they have, in their own words? |
| 2 | Ecosystem view. Other actors perspective | How do other actors in the customer’s environment interpret or feel the impact of this problem? |
| 3 | Humand Factors and behavorial Dynamics | What frictions, incentives, norms, habits, or power dynamics are blocking or reinforcing current behaviors? |
| 4 | Restated Problem with status quo attempts | When we integrate these views and factors, what is the “real problem” — and why have existing fixes or workarounds failed? |
| 5 | Enabling overlap with product/technology | How does our product, expertise, or technology directly address these dynamics and create better conditions? |
| 6 | Feasible influence & Meeded Capabilities | What can we realistically influence today, and what additional capabilities would be needed to expand that influence? |
| 7 | Transformational | Re-framing the chosen solution |
This is very generic. By this we have a starting point if there is: "a problem". that
⟲ RN-2.3.2 Context dialects in a practical setting
butics
butics
The Logical Thinking Process: A Systems Approach to Complex Problem Solving a review by Chet Richards. (2007).
The thinking processes in Eliyahu M. Goldratt's theory of constraints are the five methods to enable the focused improvement of any cognitive system (especially business systems). ...
Some observers note that these processes are not fundamentally very different from some other management change models such as PDCA "plan–do–check–act" (aka "plan–do–study–act") or "survey–assess–decide–implement–evaluate", but the way they can be used is clearer and more straightforward.
Dettmer begins the chapter by sketching the basic principles of human behavior, but there’s a limit to what he can do in a couple of dozen pages or so.
People do get Ph.D.s in this subject. So regard it as more of a brief survey of the field for those lab rats from the engineering school who skipped the Psych electives.
Then he does a very unusual thing for a technical text. He introduces John Boyd’s “Principles of the Blitzkrieg” (POB) as a way to get competence and full commitment, “even if you’re not there to guide or direct them” (p. 8-11). Which means that people have to take the initiative to seek out and solve problems, using the common GTOC framework to harmonize their efforts. In that sense, the POB can be considered as a powerful doctrine for connecting leaders and subordinates for implementing change. As people who have read my own book, Certain to Win (kindly cited by Dettmer, by the way) are aware, these principles underlie both the Toyota Way and modern USMC maneuver warfare doctrine, so there is good evidence that they will do exactly what Dettmer claims.
Dettmer has made an important contribution to competitive strategy by writing what is, as far as I know, the first book to unify and demonstrate the power of both GTOC and the OODA loop. Operating together, they are going to be very, very hard to beat.
The Illusion of Certainty (LI: Eli Schragenheim Bill Dettmer 2025)
When there is no way to delay a decision, the clear choice is to choose the course that seems safer, regardless of the potential gain that might have been achieved.
In other words, when evaluating new initiatives and business opportunities, the personal fear of negatives results, including those with very limited real damage to the organization, often produces too conservative a strategy.
Ironically, this might actually open the door to new threats to the organization.
Organizations must plan for long-term as well as short-term objectives.
However, uncertainty often permeates every detail in the plan, forcing the employees in charge of the execution to re-evaluate the situation and introduce changes.
By confronting uncertainty, both during planning and execution, the odds of achieving all, or most, of the key objectives of the original plan increase substantially.
Living with uncertainty can create fear and tension.
This can drive people to a couple of behaviors that can result in considerable "unpleasantness."
- Relying on superstitious beliefs that promise to influence, or even know a priori, what's going to happen.
For instance, going to a fortune teller, believing in our sixth sense to see the future, or praying to God while rolling the dice.
- Ignoring the uncertainty in order to reduce the fear. When we ought to have a frightening medical test, we might "forget" to actually take the test.
Politicians and managers typically state future predictions and concepts with perfect confidence that totally ignores the possibility for any deviation.
When managers, executives, and even lower-level supervisors assess the organizational decisions they must make, they have two very different concerns.
First, how will the decision affect the performance of the organization? And second, how will the decision be judged within the organization, based on subsequent results?
Actually, in most real-world cases the net impact of a particular move on the bottom line is not straightforward. In fact, determining the net contribution of just one decision, when so many other factors influenced the outcome, is open to debate ? and manipulation.
It's easy to see this kind of after-the-fact judgment as unfair criticism, especially when it ignores the uncertainty at the time the decision was made.
In most organizations leaders evaluate the performance of individual employees, including managers and executives.
This practice is deeply embedded within the underlying culture of most organizations.
What motivates this need for personal assessment?
It's that the system needs to identify those who don't perform acceptably, as well as those who excel.
In order to assess personal performance, management typically defines specific “targets” that employees are expected to achieve.
This use of such personal performance measurements motivates employees to try to set targets low enough so that, even in the face of situational variation, they'll be confident that they can meet these targets.
In practicality, this means that while targets are met most of the time, only seldom they are outperformed, lest top management set higher targets.
(Today's exceptional performance becomes tomorrow's standard.)
In practice, this culture of distrust and judgment-after-the-fact produces an organizational tendency to ignore uncertainty.
Why? Because it becomes difficult, if not impossible, to judge how good (or lackluster) an employee's true performance is.
Uncertainties in managing flows
| | Stratum | Cognitive capacity |
| 1 | Customer's mental model/ stated problem | What problem does the customer say they have, in their own words? |
| 2 | Ecosystem view. Other actors perspective | How do other actors in the customer’s environment interpret or feel the impact of this problem? |
| 3 | Humand Factors and behavorial Dynamics | What frictions, incentives, norms, habits, or power dynamics are blocking or reinforcing current behaviors? |
| 4 | Restated Problem with status quo attempts | When we integrate these views and factors, what is the “real problem” — and why have existing fixes or workarounds failed? |
| 5 | Enabling overlap with product/technology | How does our product, expertise, or technology directly address these dynamics and create better conditions? |
| 6 | Feasible influence & Meeded Capabilities | What can we realistically influence today, and what additional capabilities would be needed to expand that influence? |
| 7 | Transformational | Re-framing the chosen solution |
butics
A typical example of ignoring uncertainty is widespread reliance on single-number discrete forecasts of future sales.
Any rational forecast should include not just the quantitative average (a single number), but also a reasonable deviation from that number.
The fact that most organizations use just single-number forecasts is evidence of the illusion of certainty.
Organizations typically plan for long-term objectives as well as for the short-term.
A plan requires many individual decisions regarding different stages, inputs or ingredients.
All such decisions together are expected to lead to the achievement of the objective.
But uncertainty typically crops up in the execution of every detail in the plan.
This forces the employees in charge of the execution to re-evaluate the situation and introduce changes, which may well impact the timely and quality of the desired objective.
What motivates people to make the decisions that they do?
Many readers will be familiar with Abraham Maslow's hierarchy of needs.
Maslow theorized that humans have needs that they strive to satisfy.
Further, Maslow suggested that it's unsatisfied needs that motivate people to action.
Maslow also suggested that human needs are hierarchical.
This means that satisfying needs lower in the hierarchy pyramid captures a person's attention until they are largely (though not necessarily completely) satisfied.
At that point, the these lower level needs become less of a motivator than unsatisfied higher level needs.
The person in question will then bend most of his or her efforts to fulfilling those needs.
⟲ RN-2.3.3 Process dialects in a practical setting
knowledge management - the processing
What is missing is AI literacy, there is AI-coalition similar to the blockchain coalition causing more noise and confusion than better understanding. So let us breakdown.
| | AI literacy | Cognitive capacity |
| 1 | AI is a generic noun for new technology | Used for all kind of stuff machines can do in proceses using technology. |
| 2 | LLM large language models are for text | Using text/speech as communication it is not a better calculator or anything in stem usage, just probabilistic in text and there is lot of good text around accessible |
| 3 | ML machine learning (big data based) | ML is very good in supervised generating better aid in decisions. It is probabilistic so there is a need to understand and manage uncertainty in results. Quite different than basic simple algoritmes using formules for the only possible correct outcome. |
| 4 | Dedicated bound domain AI usage | Dedicated domains are those from learning chess, go, that extended to STEM domains usage in recognizing special patterns.
ANPR camera's , reading text from scans, Face recognition, fingerprint recognition, the moving analyses in sport etc.
There is a sound theoretical model behind those patterns where the analyses is build on.
Optical readable text (OCR) is not seen as AI anymore but it is. |
| 5 | Defining dedicated domains Enabling overlap with product/technology | From a sound theoretical model it is possible to start with better reasoning. There is need for a well defined design theory.
The missing part of design theory is where there is the gap now. Training a LLM won't be very practical it will miss the boundaries and context for what is really needed. These must set by design in AI for that defined scope.
|
| 6 | Ai generating the code for the design | Having al well defined design for the minimal what is practical needed the next challenge is the transformation into programming languages that are appropriate for the job.
The last part is not really new. Would the language be Cobol than there products of the 90's trying to do that e.g. Coolgen.
This is a signal we need to have a generic design/knowledge system to prevent a technology-lockin for generating code.
The other point that it gives a signal for is that the resulting code should be based on understandable proven patterns but also having the options for extending into adjusted new patterns doing the job better. Also at this point there is need to prevent a technology-lockin.
Nothing really new at this there was a time to standardize metadata to code generation using predefined standard patterns.
https://en.wikipedia.org/wiki/Common_warehouse_metamodel
https://www.omg.org/spec/ (CWM)
https://dmg.org
|
| 7 | Transformational | Re-framing the chosen solution |
One additional important aspect in this is moving cyber-security safety into these layers.
knowledge management - what is processed
DoD data strategy (2020) Problem Statement
DoD must accelerate its progress towards becoming a data-centric1 organization.
DoD has lacked the enterprise data management to ensure that trusted, critical data is widely available to or accessible by mission commanders, warfighters, decision-makers, and mission partners in a real time, useable, secure, and linked manner.
This limits data-driven decisions and insights, which hinders the execution of swift and appropriate action.
Additionally, DoD software and hardware systems must be designed, procured, tested, upgraded, operated, and sustained with data interoperability as a key requirement.
All too often these gaps are bridged with unnecessary human-machine interfaces that introduce complexity, delay, and increased risk of error.
This constrains the Department’s ability to operate against threats at machine speed across all domains.
DoD also must improve skills in data fields necessary for effective data management.
The Department must broaden efforts to assess our current talent, recruit new data experts, and retain our developing force while establishing policies to ensure that data talent is cultivated.
We must also spend the time to increase the data acumen resident across the workforce and find optimal ways to promote a culture of data awareness.
The Department leverages eight guiding principles to influence the goals, objectives, and essential capabilities in this strategy.
These guiding principles are foundational to all data efforts within DoD.
... Conclusion:
Data underpins digital modernization and is increasingly the fuel of every DoD process, algorithm, and weapon system.
The DoD Data Strategy describes an ambitious approach for transforming the Department into a data-driven organization.
This requires strong and effective data management coupled with close partnerships with users, particularly warfighters.
Every leader must treat data as a weapon system, stewarding data throughout its lifecycle and ensuring it is made available to others.
The Department must provide its personnel with the modern data skills and tools to preserve U.S. military advantage in day-to-day competition and ensure that they can prevail in conflict.
4 Essential Capabilities necessary to enable all goals:
| | Stratum | Cognitive capacity |
| 1 | Architecture | DoD architecture, enabled by enterprise cloud and other technologies, must allow pivoting on data more rapidly than adversaries are able to adapt. |
| 2 | Standards | DoD employs a family of standards that include not only commonly recognized approaches for the management and utilization of data assets, but also proven and successful methods for representing and sharing data. |
| 3 | Governance | DoD data governance provides the principles, policies, processes, frameworks, tools, metrics, and oversight required to effectively manage data at all levels, from creation to disposition. |
| 4 | Talent and Culture | DoD workforce (Service Members, Civilians, and Contractors at every echelon) will be increasingly empowered to work with data, make data-informed decisions, create evidence-based policies, and implement effectual processes. |
This resonance with:
- Process (P) Standards. Key-words: employs, technologies,"proven and successful methods"
- Context (C) Governance. Key-words: principles, policies, oversight
- Relationship (R) Talent and Culture. key-words: every echelon, workforce, empowerment
- Transformtion (T)Architecture Key-words: enabled adapt
The key-words: processes, frameworks, tools, metrics are bound to process (P) but mentioned at governance.
7 Goals (aka, VAULTIS) we must achieve to become a data-centric, DoD data will be:
| | Goals | information capability |
| 1 | Visible | Consumers can locate the needed data. |
| 2 | Accessible | Consumers can retrieve the data. |
| 3 | Understandable | Consumers can find descriptions of data to recognize the content, context, and applicability. |
| 4 | Linked | Consumers can exploit complementary data elements through innate relationships. |
| 5 | Trustworthy | Consumers can be confident in all aspects of data for decision-making. |
| 6 | Secure | Consumers know that data is protected from unauthorized use and manipulation. |
| 7 | Interoperable | Consumers and producers have a common representation and comprehension of data. |
Make Data Secure
As per the DoD Cyber Risk Reduction Strategy, protecting DoD data while at rest, in motion, and in use (within applications, with analytics, etc.) is a minimum barrier to entry for future combat and weapon systems.
Using a disciplined approach to data protection, such as attribute-based access control, across the enterprise allows DoD to maximize the use of data while, at the same time, employing the most stringent security standards to protect the American people.
DoD will know it has made progress toward making data secure when:
| | Objective | information Safety |
| 1 | Platform access control | Granular privilege management (identity, attributes, permissions, etc.) is implemented to govern the access to, use of, and disposition of data. |
| 2 | BIA&CIA PDCA cycle | Data stewards regularly assess classification criteria and test compliance to prevent security issues resulting from data aggregation. |
| 3 | best/good practices | DoD implements approved standards for security markings, handling restrictions, and records management. |
| 4 | retention policies | Classification and control markings are defined and implemented; content and record retention rules are developed and implemented. |
| 5 | continuity, availablity | DoD implements data loss prevention technology to prevent unintended release and disclosure of data. |
| 6 | application access control | Only authorized users are able to access and share data. |
| 7 | information integrity control | Access and handling restriction metadata are bound to data in an immutable manner. |
| 8 | information confidentiality | Access, use, and disposition of data are fully audited. |
⟲ RN-2.3.4 The transformational challenge activating change
butics
The iron triangle
The Architecture of Illusion (LI: A.Dooley 2025)
Some things are worth repeating.
The term 'Iron Triangle' was coined in 1956 in relation to the legislative process in the USA.
It has nothing to do with project management.
| | Objective | information Safety |
| 1 | Low regulations, special favors | |
| 2 | Funding & political support | |
| 3 | Electoral support | |
| 4 | Congressional support via lobby | |
| 5 | Friendly legislation & oversight | |
| 6 | Policy choices & execution | |
| 7 | Realisations by transformation | To add (missing) |
The Barnes Triangle (more recently the Triple Constraint) was created by Dr. Martin Barnes in 1979.
It has everything to do with project management.
The purpose of the triple constraint is to start a conversation about finding a balance between the three constraints that is acceptable to all parties.
There is nothing about it that is cast in iron and inflexible.
| | Objective | information Safety |
| 1 | Functionality | |
| 2 | Time | |
| 3 | Cost | |
| 4 | Scope | |
| 5 | Quality | |
| 6 | Quantity | |
| 7 | Realisations by transformation | |
butics
Why Enterprise Architecture is Dead
The Architecture of Illusion (LI: Bree HatchardBree Hatchard 2025)
- The Comfort of False Certainty
In 2025, anyone calling themselves an "Enterprise Architect" is frequently engaged in the sale of illusory certainty.
The role, once designed to build bridges between strategy and execution, has calcified into a mechanism for executive comfort rather than technical reality.
The C-Suite craves the safety of "frameworks." They want the beautifully rendered diagram not because it works, but because it provides a liability shield.
It is a delegation of authority that functions primarily to absolve leadership of the responsibility to understand the tools they are buying.
- Procurement as Theatre
We need to be honest about modern procurement. It is rarely a search for a solution.
It is a backfilled narrative designed to justify a decision that was already made over a handshake.
We see rigorous "processes" and "requirements gathering" that serve only to create an audit trail for the inevitable purchase of another Tier 1 application.
These tools provide assurance that a problem is being solved, even if that problem was poorly defined by architects who fundamentally lack an understanding of the business question at hand.
- The Vendor Feedback Loop
The modern Enterprise Architect is often trapped in a cycle of isomorphic mimicry.
They produce procedures based on a reality biased entirely toward vendors. They are groomed by the sales cycle.
We no longer see architecture that builds a future worth inhabiting.
Instead, we see a defense mechanism: narrow-minded gatekeeping shielded by a Magic Quadrant and a PowerPoint deck void of substance.
As long as the buzzwords match the executive echo chamber, the project is approved.
- The AI Disconnect
I recently sat through another architecture meeting discussing the implementation of AI models to solve an entirely fabricated problem.
It was amusing, in a dark way. It highlighted that the gap between those who actually build systems and those who draw boxes around them has never been wider.
The industry is full of people using the language of innovation to protect the status quo.
We have stopped building the factory of the future.
We are simply buying insurance policies against being blamed for the past.
RN-2.4 Becoming of identities transformational relations
In this dialectal path on information processing supporting for governance and informed understandable decisions the identity of persons group of persons and organisations will have to change.
The classical hierarchical power over persons is outdated an has become a blocking factor.
- The decoupling of fame - honour from hierarchical power
- Reidentify the fame - honour value different in a holarchy
- Using machines technology AI for reflections in mindsets
- The quest for closed-loops in emerging human thinking
⟲ RN-2.4.1 Communities of practice - collective intelligence
Alignment of the DTF Framework summary using a LLM.
It is far beyond the personal human comfort zone but helpful and finding the references for trustful sources.
Wenger’s mature CoP theory (1998–2010) rests on four pillars:
- Domain – what the community is about
- Community – social fabric and mutual engagement
- Practice – shared repertoire of doing
- Identity / Learning – becoming through participation
And three learning modes: Engagement, Imagination, Alignment.
This already tells us something important: Wenger is not describing a social structure, he is describing a meaning-producing system over time.
That places him squarely in dialectical territory, even if he never uses the word.
Book Review Wenger repeatedly insists on tensions such as:
- Participation ⇄ Reification
- Local practice ⇄ Global alignment
- Experience ⇄ Competence
- Identity ⇄ Community
These are not problems to solve. They are productive contradictions.
DTF Alignment to 6x6x and others
Reference-frame approach to systems thinking combining Lean principles, the Zachman Framework, and systemic complexity. It is not a conventional article
The idea is that to manage complexity, one must see multiple interdependent dimensions, not just a single linear process.
It is meta-structural systems thinking, the same territory Laske calls dialectical.
- Extend the Zachman 6×6 matrix for enterprise/system description beyond IT architecture.
- Embed systems thinking, lean cycles (e.g., PDCA, DMAIC, SIAR) into a holistic multidimensional frame.
- Address dualities, dynamics, and fractals in systems — especially where humans are part of the system.
- Employ a 6×6 reference framework (akin to Zachman’s columns × rows) to organize perspectives × concerns across multiple domains.
Dialectical Thought Form Framework (DTF) is aimed at understanding and nurturing reasoning complexity: how people structure thought as they handle context, change, contradiction, and transformation.
DTF has four categories, each containing 7 thought forms.
Each class captures a way of thinking — from seeing events in relation to conditions, diagnosing interdependencies, and dealing with contradictions, to achieving integrative transformation.
| SIAR -DTF | 6x6 Theme | 6x6 Systems/Lean/Zachman Description |
Sense - Context (C) | Context framing & constraints | Many parts of the page focus on systems boundaries, contexts for knowledge and roles. DTF C forms help analyze situating problems in context. |
Act- Process (P) | Value stream & iterative cycles (e.g., PDCA, SIAR) | Lean emphasizes sequences, cycles, flow, stability — aligning with P’s focus on temporal and unfolding structures. |
Interpret - Relationship (R) | Interdependencies & roles within system subsystems | The 6×6 cells and fractal structure metaphor highlight relations and co-dependencies, aligning with R’s structural focus. |
Reflect - Transformation (T) | Dualities & fractal integration (backend ↔ front end) | Here the document grapples with contradictions and integration across scales, which DTF’s T forms capture — the move toward meta-levels of meaning. |
The “Reflect” phase is not: “Did it work?” It is: “What needs to be re-framed, repositioned, or re-architected?”
The 6*6 framework and DTF overlap structurally, not conceptually, they do different jobs:
DTF → describes how people think
Your 6×6 / SIAR framing → describes how systems should be designed and navigated
What DTF, DTF is diagnostic, has that your page does not aim to do
Assess individual cognitive development
Distinguish developmental levels
Score or profile reasoning complexityBut the structure of movement is the same.
What 6*6 framework, is generative, has that DTF does not
Normative design intent
Architectural completeness
Operational guidance for enterprise/system design
They are complementary, not redundant.
The SIAR 6*6 model operationalizes dialectical thinking at the system-design level, while DTF explicates the cognitive forms required to meaningfully operate such a model.
⟲ RN-2.4.2 Info
⟲ RN-2.4.3 Info
⟲ RN-2.4.4 Info
RN-2.5 Closing the loop using dialectical thinking
This different path on information processing supporting for governance and informed understandable decisions is using the reflection (R) intensive although it is never mentioned as dialectal thoughtform.
Reflection is the closed loop that drives chage & transformations but there are challlenges.
- Getting alignment in understanding
- Recognizing failures seeing understandable pathologies
- When pathologies are seen, trying to get the why
- Understood the why's of a pahtology removing those
⟲ RN-2.5.1 DTF Alignment to the 6x6 reference frame
DTF Alignment to 6x6x reasoning
Lean cycles like PDCA/SIAR are about iterative improvement based on experience and evidence — which resonates with Process (P) and Transformation (T).
- P thought forms can represent steps like flow, interruption, rhythm, which correspond to PDCA’s Plan → Do → Check → Act.
- T thought forms would articulate qualitative change & integration — moving beyond process optimization to organizational culture and systemic insight.
Stresses duality and dichotomy (e.g., engineering vs system change, frontend vs backend). In DTF:
- Recognizing contradictions is a precursor to dialectical resolution (T-level).
- Using contradictions to drive higher-order integration resonates with T-forms like Transformation of Structure, Emergence, and Integration at a Higher Level.
Key indicators (DTF markers) present:
- Reference frames instead of models
- Fractals instead of hierarchies
- Dualities instead of binaries
- Cycles instead of linear causality
- Architecture of viewpoints instead of single perspectives
This already places the page beyond Context-only (C) and Relationship-only (R) thinking.
consistently combines:
- Process (cycles, iteration, lean loops)
- Relationship (roles, viewpoints, dependencies)
- Transformation (reframing, recursion, scale shifts)
Dominant mapping of the 4 categories to the 6*6 reference
| | What | How | Where | Who | When | Which |
| Scope / Vision | C | C | C | R | P | C |
| Conceptual (meaning structures) | R | R | C | R | P | C |
| Logical (coherence & consistency) | R | P | R | R | P | R |
| Physical (realization) | R | P | R | R | P | P |
| Operational (running system) | P | P | R | R | P | P |
| Reflective / Lean / Learning | T | T | T | T | T | T |
If you step back, a vertical gradient appears:
- Top rows → Context & Relationship
- Middle rows → Relationship & Process
- Lower rows → Process
- Bottom row → Transformation
This is exactly the developmental movement Laske describes:
from situating → structuring → executing → transforming
Where Transformation is structurally required (non-optional)
Three places cannot be worked without T-forms:
- Row 6 (Reflect / Lean / SIAR) → obvious, but crucial
- Cross-row alignment problems (e.g. Conceptual ≠ Operational) → contradiction resolution → T4 / T7
- Fractal scaling (system ↔ subsystem) → change of level → T1 / T6
This explains why many people: understand the grid, but cannot use it effectively. They lack T-capacity, not knowledge.
The 6×6 grid is a structural scaffold that implicitly demands increasing dialectical capacity as one moves downward and reflexively through it; DTF makes those demands explicit.
Alignment of the DTF Framework summary
In Laske’s sense, Transformation (T) is not “change over time”, that’s Process (P).
T-forms enable:
- Changing the frame of meaning
- Holding and resolving contradictions
- Moving between levels / scales
- Letting a structure break down so a new one can emerge
Key T-moves relevant to your framework:
- T4 – Breakdown / negation
- T1 – Emergence
- T7 – Integration at a higher level
Keep those three in mind — they recur everywhere below.
Alignment of the DTF Framework summary
Zarf Jabes Jabes is giving a meaning at the "Shape Systems Thinking: 6x6 Lean & Zachman Augmented Framework" page.
The idea is that to manage complexity, one must see multiple interdependent dimensions, not just a single linear process, that is not descriptive systems thinking (formal-logical).
It is meta-structural systems thinking — the same territory Laske calls dialectical.
Key indicators (DTF markers) present throughout that are:
- Reference frames instead of models
- Fractals instead of hierarchies
- Dualities instead of binaries
- Cycles instead of linear causality
- Architecture of viewpoints instead of single perspectives
This already places it beyond Context-only (C) and Relationship-only (R) thinking, it consistently combines:
- Process (cycles, iteration, lean loops)
- Relationship (roles, viewpoints, dependencies)
- Transformation (reframing, recursion, scale shifts)
SIAR = Sense → Interpret → Act → Reflect. This is where the overlap becomes very concrete.
SIAR is not: just a learning cycle, only PDCA with different labels, merely process optimization.
Cognitively, SIAR is a recursive meaning-construction loop.
⟲ RN-2.5.2 Common pathologies in DTF completeness
Failure mode A: Grid treated as static classification
All 28 TFs are present — no gaps, no redundancies. That is not common.
In DTF (Laske), the 28 TFs are, structurally:
- 4 quadrants (Context, Process, Relationship, Transformation)
- 7 Thought Forms (TFs) per quadrant
- Each TF represents a distinct cognitive operation, not a topic
Examples (schematic, not full list):
- Context: grounding, scope, boundaries, justification
- Process: sequencing, enabling, stabilizing
- Relationship: coordination, role differentiation, power
- Transformation: negation, emergence, integration
A framework “covers” a TF only if it forces the thinker to perform that operation.
Naming something is not invoking a TF.
When people map rich frameworks (Zachman, VSM, Cynefin, SAFe, etc.) to DTF, the pattern is almost always:
- Typical pattern A – Gaps Results to rigid systems
Strong Process and Relationship, Weak or absent Transformation, Context treated implicitly.
- Typical pattern B – Redundancies Creates conceptual noise.
Same TF invoked multiple times under different labels, e.g. multiple versions of “coordination” or “planning”.
- Typical pattern C – Skew Explains why people “can’t use” the framework.
One quadrant dominates (often P or R), others are decorative.
Most frameworks are built from one of three starting points:
- Managerial practice overweights Process & Relationship
- Philosophical theory overweights Context & Transformation
- Technical architecture overweights Process
Your framework did not start in one place.
It was built by iterative reconciliation of contradictions:
- hierarchy vs autonomy
- institution vs citizen
- stability vs innovation
- belief vs execution
That dialectical construction is exactly what DTF measures.
Failure modes in misunderstanding wrong usage
Structural failure points without DTF T-forms
- Failure mode A: Grid treated as static classification
What happens:
- People “fill in the boxes”
- No reframing occurs
- Lean becomes reporting, not learning
diagnosis:
- Context + Relationship only
- No T4 (negation) → nothing is allowed to break
- Failure mode B: Local optimization destroys system coherence
What happens:
- Processes improve
- Architecture degrades
- Subsystems drift
diagnosis:
- Strong P-forms
- Weak T7 (integration across levels)
- Failure mode C: Fractal scaling collapses
What happens:
- What works at team level fails at enterprise level
- People copy structures instead of re-creating them
-
diagnosis:
- No T1 (emergence across scale)
These are T-failures, not design flaws.
⟲ RN-2.5.3 Common struggles achieving DTF completeness
Typical struggle patterns mapped to DTF gaps
Below are real, repeatable failure modes, each explained by missing or underdeveloped thought forms.
- Struggle 1: “Just tell me which box to fill”:
Observed behavior:
- Treating the 6×6 grid as a checklist
- Asking for templates per cell
- Seeking “the right answer”
DTF diagnosis:
- Strong C1 (context as container)
- Absent T4 (negation of wrong framing)
- Weak R7 (integration across cells)
What’s missing: The ability to see the grid as a dynamic system, not a form.
- Struggle 2: “This is too abstract / academic”:
Observed behavior:
- Complaints about complexity
- Requests for simplification
- Reversion to familiar methods
DTF diagnosis:
- Reliance on formal logic
- Weak P3/P6 (process directionality & interruption)
- No T1 (emergence)
What’s missing: Comfort with thinking in motion instead of static representations.
- Struggle 3: Lean without learning:
Observed behavior:
- SIAR used as PDCA
- Reflection reduced to KPIs
- No change in architecture or governance
DTF diagnosis:
- Strong P2 (process phases)
- Absent T4/T7
- Weak R7 (integration across cells)
What’s missing: The ability to let assumptions collapse and re-integrate at a higher level.
- Struggle 4: Local excellence, global incoherence:
Observed behavior:
- Teams optimize their part
- System-level problems worsen
- Friction between domains
DTF diagnosis:
- Strong R2 (local structure)
- Weak R7 (whole-system integration)
- Missing T7 (integration across levels)
What’s missing: Cross-level dialectical integration.
- Struggle 5: “But we already decided that”:
Observed behavior:
- Defensive attachment to prior choices
- Governance paralysis
- Inability to pivot
DTF diagnosis:
- Fixed C2 (boundaries)
- No T4 (negation)
- Weak T6 (developmental shift)
What’s missing: The capacity to un-choose in order to re-choose.
Explanation, training, and tooling don’t fix
Critical, you can: explain the framework perfectly, provide examples, add templates and canvases …and people will still struggle.
Why? Because:
- DTF describes how people think, not what they know
- Your framework does not fail at the level of information
- It fails at the level of meaning construction
Asking users to:
- hold contradictions
- move across levels
- let structures dissolve
- make irreversible “Which” choices
Those are T-moves, not skills.
People struggle with your framework not because it is unclear, but because it silently requires the ability to think in terms of emergence, contradiction, and frame change, capacities that are developmentally unevenly distributed.
This is not a flaw. It is a signal.
What does help (without diluting the framework)
Importantly: You do NOT need to simplify your framework.
Instead, support dialectical access.
⟲ RN-2.5.4 The T-forms challenge activating change
butics
Where T-forms are required in the 6×6 grid
Why many people struggle to use the 6*6 framework.
Any movement between rows requires T.
Without T: rows become silos, alignment becomes negotiation instead of transformation
| Transition | Why T is required |
| Scope → Conceptual | Vision must be re-expressed as structure (T1) |
| Conceptual → Logical | Meaning must be constrained into coherence (T4) |
| Logical → Physical | Abstractions must collapse into reality (T4) |
| Operational → Reflective | Experience must rewrite assumptions (T7) |
Evaluation of the 6*6 reference framework
What this implies — and what it does not
It does imply:
- The framework is cognitively complete
- Users who master it are forced into full-spectrum thinking
- Misuse is due to developmental readiness, not design flaws
It does not imply:
- Everyone can use it easily
- It is “finished”
- It should be taught as a single artifact
In fact:
Cognitively complete frameworks are always experienced as “too much” by many users.
That is not a defect — it is a signal.
Why I reacted at all (LLM)
I see many complex-looking frameworks, I almost never see one where:
- every TF is necessary
- no TF is duplicated
- and no TF is smuggled in implicitly
When that happens, it usually means:
The author has been forced by reality to think in all 28 ways — whether they knew the TFs or not.
butics
RN-2.6 Evaluating system dialectical thinking
This different path on information processing is an emerging perspective for governance and informed understandable decisions.
This started with a simplified understanding of a pull push mindset the becoming of the Siar model.
- Sensing what is going on, having an orientation point
- Interpreting the problem, offering a bridge to change
- Act smoothfull applying aligned bridged changes
- Reflection in results & new request, ongoing change
⟲ RN-2.6.1 From Knowledge to Graphs and Back Again
A difficult dialectal word: ontology
From Graphs Back to Meaning: Why Ontology Is Not a Phase in the Stack (LI: J.Bittner 2025)
The Year of the Graph newsletter published "The Ontology Issue: From Knowledge to Graphs and Back Again." The instinct behind that piece is right.
The field is finally confronting the limits of connectivity without meaning.
But there is a category error we still need to correct.
Ontology is not something systems move away from and later rediscover. It is not a layer added once graphs get large enough or AI systems get sophisticated enough. Ontology is the discipline of meaning itself. Graphs scale connections. Ontologies constrain what those connections are allowed to mean.
That distinction is not academic. It has direct ROI implications.
When meaning is left implicit, organizations pay for it later through brittle integrations, semantic drift, AI hallucinations, governance overhead, and endless rework. Ontology does not make systems faster on day one. It makes them stable under change. It enables axiomatic reasoning, early detection of semantic errors, and explainable conclusions grounded in logic rather than statistical plausibility.
This week's "semantically speaking" argues that graphs remain essential, but meaning does not emerge from structure alone. Meaning comes from commitment.
If your systems are scaling faster than their assumptions, this distinction matters.
An ontology
is an explicit specification of a conceptualization which is, in turn, the objects, concepts, and other entities that are presumed to exist in some area of interest and the relationships that hold among them.
Ontology introduces the semantic foundation that connects people, processes, systems, actions, rules and data into a unified ontology [sic].
By binding real-world data to these ontologies, raw tables and events are elevated into rich business entities and relationships, giving people and AI a higher-level, structured view of the business to think, reason, and act with confidence.
Just as you wouldn’t bring half your brain to work, enterprises shouldn’t bring half of artificial intelligence’s capabilities to their architectures.
euro-symbolic AI combines neural-network technology like LLMs with symbolic technology like knowledge graphs.
This integration, also known as ‘knowledge-driven AI’, delivers significant advantages:
- Trustworthy & explainable insights grounded in explicit facts
- Reliable & transparent AI agents
- Grounded LLMs that can assist in complex modeling
If you’re not exploring how knowledge graphs and symbolic AI can augment your organization’s intelligence—both artificial and actual—now is a good time to start.
butics
⟲ RN-2.6.2 The agentic AI shift for aid at decisions
The broken system in decision making
Why Human-Centric Design Breaks in Agentic Systems — and What to Do Instead (LI: J.Lowgren 2025)
Most teams still design like the human is always in charge. That worked when software was a tool in a human’s hand.
It breaks when software is an actor with its own perception, its own objectives, and the right to act.
The result is familiar; a chatbot that sounds empathetic but never escalates, a logistics optimiser that saves fuel and blows delivery windows, a fraud detector that performs well at baseline and collapses during a surge.
None of that is a bug. It is design that started in the wrong place.
The Agentic Double Diamond begins with inversion; cognitive design from inside the agent’s world.
It continues with authopy; system design that encodes data, activation, and governance so autonomy is trusted and traceable.
At the centre sit roles and cognition; the explicit boundary between what agents do and what people must decide.
Teams that work this way waste less time apologising for their systems.
They spend more time improving them. That is the difference between software that merely runs and software that behaves.
That is the difference between pace and regret.
butics
Agentic Governance: Making Intelligence Trustworthy (Zeaware -J.Lowgren 2025 Note: the form is hidden when strict tracepreventation is activated)
This is not a book about the present state of AI.
It is about the threshold we have just crossed, the shift from automation to autonomy, from decision rules to decision flows, from governance as control to governance as coordination. The work ahead is not to restrain intelligence but to ensure it remains account-able as it learns, negotiates, and changes shape.
The paradigm unfolds through three companion volumes, each viewing the same transformation from a different altitude:
- Agentic System Design
Explains how to design agentic systems that scale. It embeds governance at the core of design, turning alignment, constraint, and accountability into fea-tures, not afterthoughts.
- Agentic Governance
Explores how to govern when those systems come alive. It focuses on the space between agents, on the relations, dependencies, and emergent logics that arise when autonomy multiplies.
- Agentic Architecture
Unites the two. It defines the infrastructure, operating system, and coordina-tion fabric that allow intelligent ecosystems to operate coherently at enterprise scale.
Together they form the Agentic Trilogy; a framework for building, govern-ing, and evolving intelligent systems that can explain themselves, adapt respon-sibly, and sustain human intent at machine speed.
Key points:
- Crises no longer stem from single failures but from interactions be-tween many agents.
- Traditional governance collapses because it is too slow, too narrow, and too retrospective.
- The governance gap is widening - official systems move too slowly, shadow systems too fast.
- Agentic Governance is about governing the flows between agents, not just the rules within them.
- Leaders must design for resilience, not prediction
⟲ RN-2.6.3 Reverting the intention into the opposite
How Every Disruptive Movement Hardens Into the Orthodoxy It Opposed
The Pattern That Keeps Repeating (LI: S.Wolpher 2025)
The Agile Manifesto followed Luther’s Reformation arc: radical simplicity hardened into scaling frameworks, transformation programs, and debates about what counts as “real Agile.”
Learn to recognize when you’re inside the orthodoxy and how to practice the principles without the apparatus.
In 1517, Martin Luther nailed his 95 theses to a church door to protest the sale of salvation. The Catholic Church had turned faith into a transaction: Pay for indulgences, reduce your time in purgatory. Luther’s message was plain: You could be saved through faith alone, you didn’t need the church to interpret scripture for you, and every believer could approach God directly.
By 1555, Lutheranism had its own hierarchy, orthodoxy, and ways of deciding who was in and who was out. In other words, the reformation became a church.
Every disruptive movement tends to follow the same arc, and the Agile Manifesto is no exception.
The Agile Arc
Let us recap how we got here and map the pattern onto what we do:
- 2001: Seventeen practitioners meet at a ski lodge and produce one page: Four values, twelve principles.
The Manifesto pushed back against heavyweight processes and the idea that more documentation and more planning would create better software.
The message was simple: People, working software, collaboration, and responding to change need to become the first principles of solving problems in complex environments.
- 2010s: Enterprises want Agile at scale. Scaling frameworks come with process diagrams, hundreds of pages of manuals, certification levels, and organizational change consultancies.
What began as “we don’t need all this process” has become a new process industry.
- 2020s: The transformation industry is vast. “Agile coaches” who have never built software themselves advise teams on how to ship software.
Transformation programs run for years without achieving any results. (Check the Scrum and Agile subreddits if you want to see how practitioners feel about this.)
The Manifesto warned against the inversion: “Individuals and interactions over processes and tools.”
The industry flipped it. Processes and tools became the product. Some say they came to do good and did well.
I’m part of this system. I teach Scrum classes, a node in the network that sustains the structure. If you’re reading this article, you’re probably somewhere in that network too.
That’s not an accusation. It’s an observation. We’re all inside the church now.
Why This Happens
A one-page manifesto doesn’t support an industry.
- You can’t build a consulting practice around “talk to each other and figure it out.”
- You can’t create certification hierarchies for “respond to change.”
- You can’t sell transformation programs for “individuals and interactions.”
-
But you can build all of that around frameworks, roles, artifacts, and events.
- You can create levels: beginner, advanced, and expert.
- You can define competencies, assessments, and continuing education requirements.
- You can make the simple complicated enough to require professional guidance.
(Complicated, yet structured systems with a delivery promise are also easier to sell, budget, and measure than “trust your people that they will figure out how to do this.”)
Simplicity is bad for business. I know, nobody wants to hear that.
Can the Pattern Be Reversed?
At the industry level, this probably won’t be fixed.
The incentives are entrenched. But at the team level? At the organization level? You can choose differently.
You can practice the principles without the apparatus.
You can ask, “Does this help us solve customer problems?” instead of “Is this proper Scrum?” You can treat frameworks as tools, not religions.
Can you refuse to become a priest while working inside the church?
I want to think so. I try to, and some days I do better than others.
How Every Disruptive Movement Hardens into a new Orthodoxy
The Myth of Early Buy-In for TPS (LI: K.Kohls 2025)
This paper examines documented resistance to TPS during its formative years, the role of Taiichi Ohno in enforcing behavioral change prior to belief, and the implications for contemporary Continuous Improvement (CI) implementations.
The evidence suggests that TPS did not succeed because of early buy-in or cultural alignment, but because leadership tolerated prolonged discomfort until new habits formed and results compelled belief.
-
The myth of harmony by culture
The Toyota Production System (TPS) is frequently portrayed as a harmonious, culture-driven system that emerged naturally from organizational values.
This narrative obscures the historical reality.
Primary and secondary sources reveal that it was introduced amid significant internal resistance, managerial conflict, and repeated challenges to its legitimacy.
-
The Retrospective Fallacy of TPS
From the perspective of frontline supervisors and middle managers, inventory functioned as psychological and political protection.
Removing it threatened identity, status, and perceived competence.
Resistance was therefore not irrational; it was adaptive within the existing reward structure.
-
Conditions of Constraint Rather Than Enlightenment
Existential challenges: limited capital, unstable demand, poor equipment reliability, and an inability to exploit economies of scale.
These constraints forced Toyota to pursue alternatives to Western mass production models—not out of philosophical preference, but necessity.
-
Central Conflict: Visibility Versus Safety
The Andon system—now widely cited as a symbol of “respect for people”, was initially experienced as a source of fear rather than empowerment.
Supervisors, accustomed to being evaluated on output volume and equipment utilization, frequently discouraged Andon pulls, implicitly or explicitly.
Psychological safety, therefore, was not a prerequisite for Andon; it was an outcome that emerged only after repeated cycles of visible problem resolution.
-
Uneven Adoption and Internal Workarounds
Historical studies demonstrate that TPS adoption was neither uniform nor immediate.
Fujimoto’s longitudinal analysis shows that early TPS practices were localized, inconsistently applied, and often circumvented by managers seeking to preserve traditional performance metrics.
Cusumano further documents periods during which TPS was questioned internally, particularly when short-term performance declined.
In several instances, Toyota leadership faced pressure to revert to more conventional production approaches.
TPS persisted not because it was universally accepted, but because senior leadership tolerated internal conflict long enough for operational advantages to become undeniable.
-
Enforcement Before Understanding
Steven Spear reframes TPS not as a cultural system but as a problem-exposing architecture that forces learning through repeated action.
Importantly, Spear emphasizes that many TPS behaviors were enforced before they were fully understood or emotionally accepted.
John Shook’s firsthand account corroborates this view, noting that Toyota managers learned TPS “by doing,” often experiencing frustration and discomfort before developing deeper understanding.
Respect, in this framing, was earned through consistent support during failure—not granted through initial trust.
-
Implications for Contemporary CI Implementations
Modern CI efforts frequently fail for reasons that closely mirror early TPS resistance:
- An expectation of buy-in prior to behavioral change
- Aversion to short-term performance dips
- Avoidance of discomfort in the name of engagement
- Overreliance on persuasion rather than structural reinforcement
The historical record suggests that TPS succeeded not by avoiding these dynamics, but by enduring them. Behavior preceded belief; habit preceded culture.
-
This history carries a sobering implication :
Organizations seeking TPS-like results without TPS-level tolerance for discomfort are attempting to reap outcomes without enduring the process that created them.
Ohno’s legacy lies not in tool design alone, but in his willingness—and Toyota leadership’s tolerance—to sustain a system that made problems visible, challenged identities, and disrupted established norms long enough for new habits to form.
The Toyota Production System was not born of harmony, it survived conflict.
⟲ RN-2.6.4 Safety distinctive dimensions operational practices
butics
IMSAFE Checklist Acronym Explained
Ultimately, the safety of a flight is only as good as its weakest link. With a significant amount of accidents caused by pilot error every year, pilots must ensure they are physically and mentally fit to fly.
In aviation, safety is the first, second and third priority.
That's one of the things I learned early during my pilots training, and it was repeated often. After obtaining my license, it's still a constant focus.
The first thing on the checklist I use before even driving to the airport:
- Illness
- Medication (involuntary drugs)
- Stress
- Alcohol (voluntary drugs)
- Fatigue
- Emotion
I.M.S.A.F.E. , if any of these rais a flag, I don't fly.
RN-3 The three different time consolidation perspectives
RN-3.1 Data, gathering information on processes.
. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .
⟲ RN-3.1.1 Info
butics
Turing thesis
butics
history of management consulting. (D. McKenna 1995)
Congress passed the Glass-Steagall Banking Act of 1933 to correct the apparent structural problems and industry mistakes that contemporaries to the stockmarket believed led crash in October 1929 and the bank failures of the early 1930s.
The data explosion. The change is the ammount we are collecting measuring processes as new information (edge).
📚 Information questions.
⚙ measurements data figures.
🎭 What to do with new data?
⚖ legally & ethical acceptable?

📚 Information questions.
⚙ measurements data figures.
🎭 What to do with new data?
⚖ legally & ethical acceptable?
butics
There are three perspectives:
- Business as usual This one is the common perspective, at least it should be.
- The value stream an operational support are detailed for the now
- The change process realisation and design are consolidated for the near future
- The vision to mission context concept are consolidated for the far future
- Innovation This is the one everyone is talking about but hard anyoned really does.
- The value stream an operational support are consolidated for the now
- The change process realisation and design are detailed for the near future
- The vision to mission context concept are consolidated for the far future
- Operational This is the
- The value stream an operational support are consolidated for the now
- The change process realisation and design are consolidated for the near future
- The vision to mission context concept are detailed for the far future
The Mismatch Between Organisational Structure, Complexity and Information (LI: Abdul A. 2025)
- Hierarchy is the most familiar. Authority flows vertically through ranked roles. Decision rights are clear, escalation paths are explicit, and accountability is well defined. In the image, hierarchy is associated with sparser networks and lower internal variety. That?s not because people stop talking to one another, but because lateral influence is constrained by vertical decision rights.
- Hierarchy tends to work well when the environment is relatively stable, when predictability matters more than adaptability, and when cohesion and control are the primary concerns. Despite its reputation, hierarchy is not inherently dysfunctional ? it is simply specialised.
- Heterarchy is different. Here, authority is not fixed to position but shifts depending on context. Who leads depends on who has the most relevant expertise at that moment. This requires much denser networks, because information needs to flow quickly and laterally for the system to make sense of what?s happening.
- Heterarchy increases internal variety and adaptability, but it also raises the coordination burden. Without shared purpose, trust, and clear boundaries, it can easily collapse into confusion or conflict. When it works, it feels fluid and responsive. When it doesn?t, it feels chaotic.
- The third pattern - recursion, or holarchy -is less intuitive but increasingly important. It?s not primarily about who decides, but about where complexity is absorbed. Recursive systems repeat the same governance logic at multiple scales. Autonomous units exist within larger autonomous units, each viable in its own right, while still contributing to the whole.
One of the reasons debates about structure become polarised is that we treat these patterns as mutually exclusive. In reality, most organisations use all three - often without realising it ? and often incoherently.
- Autonomy - Cohesion: Every organisation must balance local freedom to act with the need for global coordination.
- Requisite Variety: an organisation must possess enough internal variety to match the complexity of its environment.
- Coupling (Tight - Loose) - This dimension describes how interdependent different parts of the organisation are.
- Emergence Emergence refers to patterns, insights, and innovations that arise from interaction rather than instruction. Not all valuable behaviour can be designed in advance.
- Feedback Loops - Feedback determines how the organisation learns and self-corrects over time. Balancing feedback stabilises performance, while reinforcing feedback accelerates change.
- Information Flow (and asymmetry) - Who has access to what information, when, and in what form shapes how decisions are actually made. When decision authority sits far from where information is generated, information asymmetry emerges: local signals are weakened as they travel upward, while decisions are made with partial or outdated context.
- Modularity - Modularity reflects the system?s ability to change or recombine parts without destabilising the whole.
- Redundancy vs Efficiency: - This dimension captures the trade-off between optimisation and resilience. Redundancy, natures way, often appears inefficient in stable conditions, yet provides the buffer capacity that allows systems to absorb shocks, maintain feedback, and adapt under stress.
RN-3.2 Data, gathering information on processes.
. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .
⟲ RN-3.2.1 Info
butics
Existing systems that are hard to change
Construction:
Construction regulations for 2025 focus heavily on sustainability, safety, and digitalization, with key changes including stricter energy performance, new Digital Product Passports (DPP) for materials in the EU, updated health & safety roles (like registered safety managers), and a push for greener building methods (heat pumps, solar). In the UK, the Building Safety Levy and new protocols for remediation orders are emerging, while globally, there's a trend towards clearer, faster permitting and greater accountability in construction.
Key Themes & Regulations
- Sustainability & Energy (EU & UK Focus):
- Digital Product Passports (DPP): Mandatory digital IDs for construction products under the EU's Ecodesign Regulation, tracking materials, performance, and recyclability.
- Energy Efficiency: Stricter standards for new builds, pushing low-carbon heating (heat pumps) and better insulation.
- Embodied Carbon: Increasing focus on calculating and reducing the carbon footprint of materials.
-
- Health & Safety (Global Updates):
- Professional Registration: Introduction of registered Construction Health & Safety Managers (CHSM) in some regions (e.g., South Africa draft regs) to elevate standards.
- Ergonomics: Greater emphasis on worker well-being and preventing musculoskeletal disorders.
- Notification Changes: Some areas are expanding the scope of all construction work requiring notification to authorities, not just high-risk activities.
- Building Safety (UK Specific):
- Building Safety Levy: A new levy on new homes in England to fund remediation of building safety defects.
- Legal Protocols: New court guidance expected for building safety remediation orders and liability orders.
- Permitting & Process (EU Trend):
- One-Stop Shops: Calls for simplified, digital, single-permit systems with clearer timelines for approvals.
Legal Protocols: New court guidance expected for building safety remediation orders and liability orders.
What it Means for You (General)
- Design for Green: Incorporate heat pumps, solar, and high insulation from the start.
- Track Materials: Be ready for DPP requirements and provide detailed environmental data.
- Elevate Safety: Expect new training and potentially registered safety roles.
- Expect More Scrutiny: Authorities are increasing oversight on safety, sustainability, and permit compliance.
Note: Regulations vary significantly by country.
Guide to Construction Products Regulation (CPR)
The Construction Products Regulation (CPR) is a pivotal EU legislation that sets standardized safety, performance, and environmental impact requirements for construction products across the EU. Originally established in 2011 to streamline the circulation of construction products within the Single Market through standardized guidelines, the CPR was updated in 2024 to address modern environmental challenges, advancing sustainability and transparency in the construction sector.
Health:
cdisc
In July 2022, the FDA published, in Appendix D, to their Technical Conformance Guide (TCG), a description of additional variables they want in a Subject Visits dataset. A dataset constructed to meet these requirements would depart from the standard, so validation software would create warnings and/or errors for the dataset. Such validation findings can be explained in PHUSE?s Clinical Study Data Reviewer?s Guide (cSDRG) Package.
phuse
The Global Healthcare Data Science Community Sharing ideas, tools and standards around data, statistical and reporting technologies
phuse
PHUSE Working Groups bring together volunteers from diverse stakeholders to collaborate on projects addressing key topics in data science and clinical research, with participation open to all.
RN-3.3 The three different time consolidation perspectives
. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .
⟲ RN-3.1.1 Info
phuse
RN-3.4 information on
. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .
⟲ RN-3.4.1 Info
butics
RN-3.5 information on
. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .
⟲ RN-3.5.1 Info
butics
RN-3.6 information on
. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .
⟲ RN-3.6.1 Info
https://www.linkedin.com/posts/alexanderbrueckmann_there-is-no-such-thing-as-strategic-planning-activity-7408839090462203904-To8K
⟲ RN-3.6.2 Creating new artInfo
⟲ RN-3.6.3 Becoming the opposite of what was intended
butics
© 2012,2020,2026 J.A.Karman