
Design Data - Information flow




RN-1 The classic technological perspective for ICT


RN-1.1 Contents

RN-1.1.1 Looking forward - paths by seeing directions
A reference frame in mediation innovation.
When the image link fails, 🔰 click here for the most logical higher fractal in a shifting frame.
Contexts:
r-serve technology enablement for purposes
r-steer motivation purposes by business
r-shape mediation communication
data infotypes
data techflows
There is a counterpart 💠 click here for the impracticable diagonal shift to shaping change.


The Fractal focus for knowledge management
The impracticable diagonal connects the technology realisation to a demand from administrative support. There is no such connection. The shape mindset, mediation innovation:
The cosmos is full of systems, and we are not good at understanding what is going on. In an ever more complex and fast-changing world we search for more certainty and predictability, where we would be better off understanding the choices under uncertainty and unpredictability.
Combining:
  1. Systems Thinking, decisions, ViSM (Viable Systems Model), the good regulator
  2. Lean as the instantiation of identification systems
  3. The Zachman 6*6 reference frame principles
  4. Value Stream Mapping (VaSM), Pull-Push cycle
  5. Improvement cycles: PDCA, DMAIC, SIAR, OODA
  6. Risks and uncertainties for decisions in the now, near and far future: VUCA, BANI
An additional challenge with all these complexities is that they are full of dualities and dichotomies.
The serve mindset technology realisation:

RN-1.1.2 Local content
Reference Squad Abbreviation
RN-1 The classic technological perspective for ICT
RN-1.1 Contents contents Contents
RN-1.1.1 Looking forward - paths by seeing directions
RN-1.1.2 Local content
RN-1.1.3 Guide reading this page
RN-1.1.4 Progress
RN-1.2 Knowledge shoulders for the 6x6 RFW bsiarflw_02 Frame-ref
RN-1.2.1 ....................................... right questions
RN-1.2.2 .......................................nd replies
RN-1.2.3 ......................................., fame, honor
RN-1.2.4 .......................................d dichotomies
RN-1.3 Augmented axioms: Anatomy Physiology ZARF bsiarflw_03 ZarfTopo
RN-1.3.1 ............................................ame
RN-1.3.2 ............................................imensions
RN-1.3.3 ............................................ons
RN-1.3.4 ............................................ations
RN-1.4 Augmented axioms: Neurology Sociology ZARF bsiarflw_04 ZarfRegu
RN-1.4.1 ..................................s
RN-1.4.2 ..................................ology: 1* dimensions
RN-1.4.3 .................................. & implications
RN-1.4.4 .................................. & implications
RN-1.5 Insight for intelligence in viable systems bsiarflw_05 SmartSystem
RN-1.5.1 ........................................xt
RN-1.5.2 ........................................ good regulator
RN-1.5.3 ........................................-abstraction
RN-1.5.4 ........................................ns to clear
RN-1.6 Learning systems maturity from 6x6 RFW's bsiarflw_06 ReLearn
RN-1.6.1 ............................................. model
RN-1.6.2 .............................................res
RN-1.6.3 .............................................ment
RN-1.6.4 ............................................. crisis
RN-2 The impact of uncertainty to information processing
RN-2.1 Reframing the thinking for decision making bsiarsys_01 Knowium
RN-2.1.1 Thinking dialectal for underpinning at decisions
RN-2.1.2 Feeling a repeating pattern of ~6 distinctions
RN-2.1.3 Reframe dialectual abstraction of the SIAR model
RN-2.1.4 Underpinning the repeating pattern of ~6 distinctions
RN-2.2 A new path in thinking - reflections bsiarsys_02 P&S-ISFlw
RN-2.2.1 ...............................................ions
RN-2.2.2 ...............................................missions
RN-2.2.3 ...............................................ons
RN-2.2.4 ...............................................issions
RN-2.3 Purposeful usage of dialectal thoughts bsiarsys_03 P&S-ISMtr
RN-2.3.1 Relationship dialects in a practical setting
RN-2.3.2 Context dialects in a practical setting
RN-2.3.3 Process dialects in a practical setting
RN-2.3.4 The transformational challenge activating change
RN-2.4 Becoming of identities transformational relations bsiarsys_04 P&S-Pltfrm
RN-2.4.1 Communities of practice - collective intelligence
RN-2.4.2 ....................................................s
RN-2.4.3 ....................................................ions
RN-2.4.4 ....................................................ons
RN-2.5 Closing the loop using dialectical thinking bsiarsys_05 Fractals
RN-2.5.1 DTF Alignment to the 6x6 reference frame
RN-2.5.2 Common pathologies in DTF completeness
RN-2.5.3 Common struggles achieving DTF completeness
RN-2.5.4 The T-forms challenge activating change
RN-2.6 Evaluating system dialectical thinking bsiarsys_06 Learn-I
RN-2.6.1 From Knowledge to Graphs and Back Again
RN-2.6.2 The agentic AI shift for aid at decisions
RN-2.6.3 Reverting the intention into the opposite
RN-2.6.4 Safety distinctive dimensions operational practices
RN-3 The three different time consolidation perspectives
RN-3.1 Using the understanding continuum practical siaragil_01 Know_npk
RN-3.1.1 ....................................................rns
RN-3.1.2 .................................................... shifts
RN-3.1.3 ....................................................s
RN-3.1.4 ....................................................s
RN-3.2 Using the emergence pragnanz gestalt siaragil_02 Gestium
RN-3.2.1 ....................................................tterns
RN-3.2.2 ....................................................actions
RN-3.2.3 ....................................................tions
RN-3.2.4 ....................................................ents
RN-3.3 Using the "center of gravity" in value streams siaragil_03 Stravity
RN-3.3.1 .................................................patterns
RN-3.3.2 .................................................ons
RN-3.3.3 .................................................ions
RN-3.3.4 .................................................nts
RN-3.4 Human Capital in systems for capabilities siaragil_04 Human-cap
RN-3.4.1 ................................................
RN-3.4.2 ................................................implify
RN-3.4.3 ................................................ractals
RN-3.4.4 ................................................efs
RN-3.5 Changing systems information age C&C siaragil_05 Evo-InfoAge
RN-3.5.1 ..........................................pes
RN-3.5.2 ..........................................mergent types
RN-3.5.3 ..........................................vations
RN-3.5.4 ..........................................in systems
RN-3.6 Touching transcendental boundaries in learning siaragil_06 Learn-@2
RN-3.6.1 ............................................... a whole?
RN-3.6.2 ...............................................tomy
RN-3.6.3 Becoming the opposite of what was intended
RN-3.6.4 ...............................................ystems

RN-1.1.3 Guide reading this page
The quest for methodologies and practices
This page is about a mindset framework for understanding and managing complex systems. The focus is on complex systems where humans are part of the systems and build the systems they are part of.
When a holistic approach for organisational missions and improvements is wanted, starting at the technology pillar is what is commonly done: knowing what is going on on the shop floor (Gemba). Working towards an approach for optimised systems, there is a gap in knowledge and tools.
👁 💡 The proposal to solve those gaps is "Jabes". Seeing "Jabes" as a system supporting systems, the question is what system is driving Jabes. The system driving Jabes must have similarities to the systems it is driving.
👁 💡 ZARF (Zachman-Augmented Reference Frame) is a streamlined upgrade to the classic Zachman matrix. It turns a static grid into a practical, multidimensional map that guides choices, enforces clear boundaries, and adds a sense of time, so teams move methodically from idea to reality
Shaping Systems collective intelligence
These are part of a larger vision of adaptive, resilient enterprises and organisations. The mindset even exceeds what is seen as an enterprise, extending to the communities enterprises are part of.
Sys6x6Lean and Shape Design for ICT Systems Thinking form a unified framework for adaptive enterprises. Combining Lean processes, Zachman reference models, mediation, and innovation, these pages guide organizations in shaping resilient systems for complex environments. There is a special impracticable fractal: the demand is at "C-Shape design", the realisation at "r-serve devops sdlc". From the C-Shape location:
: 👉🏾 Sys6x6Lean page: focuses on systems thinking, Lean, viable systems modeling.
Shape Design for ICT Systems Thinking page: focuses on mediation, innovation, ICT organizational frameworks.
From the r-serve location:
: 👉🏾 Valuestream page: focuses on systems thinking, Lean, viable systems modeling.
Serve Devops for ICT Systems realisations page: focuses on practical innovations ICT organizational frameworks.

A recurring parable for methodologies and practices
Key challenges: Achieving Cross Border Government Innovation (researchgate Oecd opsi, foreword Geof Mulgan 2021 - collective intelligence)
OPSI is a global forum for public sector innovation. In a time of increasing complexity, rapidly changing demands and considerable fiscal pressures, governments need to understand, test and embed new ways of doing things.
Over the last few decades innovation in the public sector has entered the mainstream, in the process becoming better organised, better funded and better understood. But such acceptance of innovation has also brought complications, in particular regarding the scope of the challenges facing innovators, many of which extend across borders. Solutions designed to meet the needs of a single country are likely to be sub-optimal when applied to broader contexts. To address this issue, innovators need to learn from others facing similar challenges and, where possible, pool resources, data and capacities.
OPSI's colleagues in the OECD Policy Coherence for Sustainable Development Goals division (PCSDG) and the EC Joint Research Centre have developed a conceptual framework for analysing transboundary interrelationships in the context of the 2030 Agenda.
OPSI and the MBRCGI have observed an increased focus on cross-border challenge-driven research and innovation, with a particularly strong influence from agendas such as the SDGs.
A second challenge is how to institutionalise this work. It is not too difficult to engage people in consultations across borders, and not all that hard to connect innovators through clubs and networks. But transforming engagement into action can be trickier.
It is particularly hard to share data – especially if it includes personal identifiers (although in the future more “synthetic data” that mirrors actual data without any such identifiers may be more commonly used, particularly for collaborative projects in fields such as transport, health or education). It is also hard to get multiple governments to agree to create joint budgets, collaborative teams and shared accountability, even though these are often prerequisites to achieving significant impacts.
OPSI double four
RN-1.1.4 Progress
done and currently working on:

The topics that are unique on this page
👉🏾 Rules and axioms for the Zachman augmented reference framework (ZARF). 👉🏾 Connecting ZARF to systems thinking in the analogy of: 👉🏾 Explaining the repeating patterns that are seen in this.
👉🏾 Use cases using the patterns for ZARF and by ZARF. Highly related in the domain context for information processing are:
open design_bianl:
workcell
valuestream
open design_sdlc :
DTAP Multiple dimensions processes by layers
ALC type 2 low code ML process development
ALC type 3 low code ML process development
vmap_layers01 low code ML process development
data administration *meta describing modelling data
Security *meta - modelling access information
meta data model
meta data process
meta secure
open local devops_sdlc:
prtfl_c22
prtfl_t33
relmg_c66
relmg_t46

RN-1.2 Technical requirements for knowledge systems

. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .
RN-1.2.1
Archiving, Retention policies.
Information is not only active and operational but also historical: what has happened, who has executed it, what was delivered, when was the delivery, when was the purchase, etc. That kind of information is often very valuable, but at the same time it is not clear how to organise it and who is responsible for it.
💣 Retention policies and archiving information are important to do well, but the financial and legal advantages are not obviously visible. Only when problems escalate to high levels does it become clear, but then it is too late to solve. When in financial trouble, cost cutting here is easily done.
Historical and scientific purposes, moved out of any organisational process.
An archive is an accumulation of historical records in any media or the physical facility in which they are located. Archives contain primary source documents that have accumulated over the course of an individual or organization's lifetime, and are kept to show the function of that person or organization. Professional archivists and historians generally understand archives to be records that have been naturally and necessarily generated as a product of regular legal, commercial, administrative, or social activities.
The words record and document have a slightly different meaning in this context than technical ICT staff are used to.
In general, archives consist of records that have been selected for permanent or long-term preservation on grounds of their enduring cultural, historical, or evidentiary value. Archival records are normally unpublished and almost always unique, unlike books or magazines of which many identical copies may exist. This means that archives are quite distinct from libraries with regard to their functions and organization, although archival collections can often be found within library buildings.

Additional information container attributes.
😉 EDW 3.0: every information container must be fully identifiable. Minimal by:
With this kind of compliance question on information, it is often assumed to be an ICT problem only. Classic applications lack these kinds of attributes with the information.
compliancy_sdlc.jpg 💡 Additional information container attributes support implementing defined retention policies. Every information container must have the applicable retention references:
Common issues when working with retention periods.
An isolated archive system, with its complexity, reliability and availability, is a big hurdle with high impact.
Relevant information for legal purposes, moved out of the manufacturing process and no longer available in legal cases, is problematic.
Cleaning as soon as possible has high impact. The GDPR states data should be deleted as soon as possible; this law gets much attention and has regulators. Archiving information for longer periods is not directly covered by laws, only indirectly.
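The retention references such a container could carry can be sketched as a minimal data structure. This is a hedged illustration only: the field names (owner, legal basis, retention period) are an assumed minimal set, not taken from any existing Jabes or EDW specification.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class RetentionReference:
    container_id: str  # full identification of the container
    owner: str         # accountable party for the information
    legal_basis: str   # e.g. "GDPR", "tax law", "archive law"
    created: date      # start of the retention clock
    retain_years: int  # retention period in years

    def expires(self) -> date:
        # naive year arithmetic; leap-day handling is ignored in this sketch
        return self.created.replace(year=self.created.year + self.retain_years)

    def must_delete(self, today: date) -> bool:
        # GDPR-style rule: delete as soon as the retention period has passed
        return today >= self.expires()

ref = RetentionReference("inv-2015-0042", "finance", "tax law", date(2015, 3, 1), 7)
print(ref.expires())                      # 2022-03-01
print(ref.must_delete(date(2024, 1, 1)))  # True
```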

compliancy_bpmbia.jpg
Government Information Retention.
Instead of a fight over how it should be solved, there is a fight over who else is to blame for missing information. This has nothing to do with hard facts and everything with things like my turf and your fault. Different responsible parties have their own opinion on how conflicts in retention policies should be solved.
🤔 When information is deleted permanently, there is no way to recover when that decision turns out wrong.
🤔 The expectation that it would be cheaper and of better quality is a promise without warrants.

RN-1.2.2 Technology safe by design & open exchangeable
Business Continuity.
Loss of assets can disable an organisation from functioning. It is a risk analysis to what level continuity is required, in what time, at what cost, and what kind of loss is acceptable. 💣 BCM is risk based, having visible costs for the needed implementations but no visible advantages or profits. There are several layers:
Procedures, organisational.
People, personal.
Products, physical & cyber.
Communications.
Hardware.
Software.

Loss of physical office & datacentre.
In the early days of using computers, everything was located close to the office with all users, because the technical communication lines did not allow long distances. Batch processing took a day or longer before results were seen on hard-copy prints. Limited terminal usage needed copper wires for connections.
etl-elt_01.png The disaster recovery plan was based on a relocation of the office with all users and the data centre when needed, in case of a total loss (disaster).
For business applications there was a dedicated backup for each of them, aside from the needed infrastructure software including the tools (applications).
The period to resilience could easily span several weeks; there was no great dependency yet on computer technology. Payments, for example, did not have any such dependency in the 70s.

Loss of network connections.
The datacentre got relocated as telecommunications capacity increased. A hot standby with the same information on real-time duplicated storage became possible.
etl-elt_01.png The cost argument for this new option resulted in ignorance of resilience to other types of disasters and ignorance of archiving compliance requirements.
With a distributed approach to datacentres, the loss of a single datacentre is not a valid scenario anymore. With services spread over locations, the isolated DR test of having one location fail no longer has the value it had before.

Loss control to critical information.
Loss of information, software tools compromised, database storage compromised: this is the new scenario now that everything has become accessible over communications. Losing control to hackers, being held for ransom, or having information leaked externally is far more likely and more common than the previous disaster scenarios.
Swiss_cheese_model.png Not everything is possible to prevent; some events are too difficult or costly to prevent. A risk-based evaluation on how to build resilience:
Loss of data integrity - business.
Loss of confidentiality - information.
Robustness failing - single point of failures.
The Swiss cheese model of accident causation is a model used in risk analysis and risk management, including aviation safety, engineering, healthcare, emergency service organizations, and as the principle behind layered security, as used in computer security and defense in depth. Therefore, in theory, lapses and weaknesses in one defense do not allow a risk to materialize, since other defenses also exist, to prevent a single point of failure. Although the Swiss cheese model is respected and considered to be a useful method of relating concepts, it has been subject to criticism that it is used too broadly, and without enough other models or support.

Several triads of components.
Eliminating single points of failure in a backup (restore) strategy: only the proof of a successful recovery is a valid checkpoint. The 3-2-1 backup strategy is made up of three rules, as follows:
  1. Three copies of data: this includes the original data and at least two backups.
  2. Two different storage types: both copies of the backed-up data should be kept on two separate storage types to minimize the chance of failure. Storage types could include an internal hard drive, external hard drive, removable storage drive or cloud backup environment.
  3. One copy offsite: at least one data copy should be stored in an offsite or remote location to ensure that natural or geographical disasters cannot affect all data copies.
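The three rules above can be checked mechanically. A minimal sketch, assuming a backup plan is represented as a list of (storage type, location) pairs; the plan structure and the sample values are invented for illustration:

```python
# Check a backup plan against the 3-2-1 rule.
def satisfies_3_2_1(copies):
    """copies: list of (storage_type, location) tuples, original included."""
    enough_copies = len(copies) >= 3                      # 3 copies of the data
    storage_types = {storage for storage, _ in copies}
    two_media = len(storage_types) >= 2                   # on 2 storage types
    offsite = any(loc == "offsite" for _, loc in copies)  # 1 copy offsite
    return enough_copies and two_media and offsite

plan = [("internal-disk", "onsite"),
        ("external-disk", "onsite"),
        ("cloud", "offsite")]
print(satisfies_3_2_1(plan))      # True
print(satisfies_3_2_1(plan[:2]))  # False: too few copies, none offsite
```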
wikepedia_informationsecurity.png
BCM is related to information security: they share the same basic components and the same goals.
An organization's resistance to failure is "the ability ... to withstand changes in its environment and still function". Often called resilience, it is a capability that enables organizations to either endure environmental changes without having to permanently adapt, or the organization is forced to adapt a new way of working that better suits the new environmental conditions.
image:By I, JohnManuel, CC BY-SA 3.0
Auditing monitoring.
For legal requirements there are standards used by auditors. When they follow their checklist, a list of "best practices" is verified. The difference with a "good practice" is the continuous improvement (PDCA) cycle.
Procedures , organisational.
People , personal.
Products, physical & cyber.
Security Operations Center.
Infrastructure building blocks- DevOps.
Auditing & informing management.

Audit procedure processing.
The situation was: infrastructure building blocks (DevOps) leading; auditing and informing management on implementations added for control.
Added is: the Security Operations Centre, leading in evaluating security risk; auditing and informing management on implementations added for control.
 
The old situation was that application program coding was mainly done in house. This has changed into using publicly and commercially obtained software where possible.
Instead of a software crisis of lines of code not being understood (business-rules dependency), it has changed into used software libraries not being understood (vulnerabilities), with no understanding of how to control them given the huge number of copied software libraries in use.
Instead of only a simple infrastructure stack to evaluate, it has become a complicated infrastructure stack with an additional involved party: a triad to manage.
 
Penetration testing, also called pen testing or ethical hacking, is the practice of testing a computer system, network or web application to find security vulnerabilities that an attacker could exploit. Penetration testing can be automated with software applications or performed manually. Either way, the process involves gathering information about the target before the test, identifying possible entry points, attempting to break in -- either virtually or for real -- and reporting back the findings.
It will only report what is visible to the tester, using tools that cover only what is commonly known. There is no warrant that the system is not vulnerable after corrections are made. It is also well possible there is no security risk at all, given the way the system is used and managed.
RN-1.2.3 Standard understandable naming conventions meta
Logging, monitoring.
Logging events when processing information generates new information. Using that logging information serves several goals. Some log information is related to the product and could also become new operational information.
💣 When there are different goals, an additional copy of the information is an option, but it introduces a risk of integrity mismatches.
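The integrity risk of keeping an additional copy can be watched for by comparing content digests of both copies. A minimal sketch with invented log lines; the variable names and data are illustrative only:

```python
import hashlib

# Detect integrity mismatches between an original log stream and its copy
# by comparing SHA-256 digests of the concatenated content.
def digest(lines):
    h = hashlib.sha256()
    for line in lines:
        h.update(line.encode("utf-8"))
    return h.hexdigest()

original = ["2024-05-01 user=a action=read", "2024-05-01 user=b action=write"]
copy_ok  = list(original)
copy_bad = original[:1] + ["2024-05-01 user=b action=delete"]  # silently altered

print(digest(original) == digest(copy_ok))   # True: copies still match
print(digest(original) == digest(copy_bad))  # False: integrity mismatch
```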
Data classification.
Information security
The CIA triad of confidentiality, integrity, and availability is at the heart of information security. (The members of the classic InfoSec triad confidentiality, integrity and availability are interchangeably referred to in the literature as security attributes, properties, security goals, fundamental aspects, information criteria, critical information characteristics and basic building blocks.) However, debate continues about whether or not this CIA triad is sufficient to address rapidly changing technology and business requirements, with recommendations to consider expanding on the intersections between availability and confidentiality, as well as the relationship between security and privacy. Other principles such as "accountability" have sometimes been proposed; it has been pointed out that issues such as non-repudiation do not fit well within the three core concepts.
😉 Two additions are: neglected attention points: An important aspect of information security and risk management is recognizing the value of information and defining appropriate procedures and protection requirements for the information. Not all information is equal, and so not all information requires the same degree of protection. This requires information to be assigned a security classification.
Classified information
When labelling information into categories, an approach is:
  1. Public / unclassified.
  2. Confidential: intended for circulation in the internal organisation and authorized third parties at the owner's discretion.
  3. Restricted: information that should not be disclosed outside a defined group.
  4. Secret: strategically sensitive information only shared between a few individuals.
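The four labels above can be modelled as an ordered scale, so access rules can compare levels. A sketch in which the numeric ordering and the clearance rule are assumptions for illustration:

```python
from enum import IntEnum

class Classification(IntEnum):
    PUBLIC = 1        # public / unclassified
    CONFIDENTIAL = 2  # internal circulation, authorized third parties
    RESTRICTED = 3    # no disclosure outside a defined group
    SECRET = 4        # strategically sensitive, few individuals

def may_read(clearance: Classification, label: Classification) -> bool:
    # a reader needs a clearance at least as high as the information's label
    return clearance >= label

print(may_read(Classification.CONFIDENTIAL, Classification.PUBLIC))  # True
print(may_read(Classification.CONFIDENTIAL, Classification.SECRET))  # False
```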

etl-elt_01.png
Using BI analytics
Using BI analytics in the security operations centre (SOC).
This technical environment of BI usage is relatively new. It demands very good runtime performance, with well-defined, isolated and secured data. There are some caveats:
Monitoring events (IDS) may not be mixed with changing access rights.
There is limited insight into the security design; insight into granted rights is available.
It is called Security Information and Event Management (SIEM):
a subsection within the field of computer security, where software products and services combine security information management (SIM) and security event management (SEM). They provide real-time analysis of security alerts generated by applications and network hardware. Vendors sell SIEM as software, as appliances, or as managed services; these products are also used to log security data and generate reports for compliance purposes.

etl-elt_01.png Using BI analytics for capacity and system performance.
This technical environment of BI usage is relatively old: optimising the technical system to perform better, defining containers for processes, and implementing a security design.
Monitoring systems for performance is bypassed when the cost is felt to be too high.
Defining and implementing a usable agile security design is hard work.
Getting the security model and the monitoring for security purposes combined is a new challenge.
It is part of ITSM (IT Service Management). Capacity management's
primary goal is to ensure that information technology resources are right-sized to meet current and future business requirements in a cost-effective manner. One common interpretation of capacity management is described in the ITIL framework. ITIL version 3 views capacity management as comprising three sub-processes: business capacity management, service capacity management, and component capacity management.
In the fields of information technology (IT) and systems management, IT operations analytics (ITOA) is an approach or method to retrieve, analyze, and report data for IT operations. ITOA may apply big data analytics to large datasets to produce business insights.


Loss of confidentiality: compromised information.
Getting hacked, being compromised by whale phishing, is getting a lot of attention.
A whaling attack, also known as whaling phishing or a whaling phishing attack, is a specific type of phishing attack that targets high-profile employees, such as the CEO or CFO, in order to steal sensitive information from a company. In many whaling phishing attacks, the attacker's goal is to manipulate the victim into authorizing high-value wire transfers to the attacker.

Government Organisation Integrity.
This has nothing to do with hard facts and everything with things like my turf and your fault. Different responsible parties have their own opinion on how conflicts about logging information should be solved.
🤔 When information is deleted permanently, there is no way to recover when that decision turns out wrong.
🤔 The expectation that it would be cheaper and of better quality is a promise without warrants.
🤔 With no alignment between the silos, there is a question about which version of the truth holds.

RN-1.2.4 Base temporal data structure following lifecycles

RN-1.3 Classification of technical processing types

. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .
RN-1.3.1 Info
DataWareHousing, Information flow based.
Repositioning the data warehouse as part of an operational flow makes more sense. A compliance gap gets a solution:
The two vertical lines manage who has access to what kind of data: authorized by the data owner, registered data consumers, monitored and controlled.
In the figure:
df_csd01.jpg The following consumers are also valid for the warehouse. A very different approach to building up this enterprise information data warehouse. Axioms:
💡 No generic data model for relations between information elements - information containers.
💡 Every information container must be fully identifiable. Minimal by: 💡 Every information container must have a clear ownership. For being fully identifiable, a well-designed stable naming convention is required.
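The naming-convention requirement can be sketched as a small helper that composes a fully identifying container name. The chosen parts (owner, domain, content name, version, date) and the separator are illustrative assumptions, not a prescribed standard:

```python
# Build a stable, fully identifying container name from fixed parts.
def container_id(owner: str, domain: str, name: str, version: int, ymd: str) -> str:
    parts = [owner, domain, name, f"v{version:03d}", ymd]
    for p in parts:
        # a stable convention must forbid empty parts and the separator itself
        if not p or "." in p:
            raise ValueError("parts may not be empty or contain the separator")
    return ".".join(parts)

print(container_id("finance", "invoices", "monthly_total", 7, "20240501"))
# finance.invoices.monthly_total.v007.20240501
```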

Administrative Value Stream Mapping Symbol Patterns.
Help in abstracting ideas comes not from long text but from using symbols and figures. A blueprint is the old name for doing a design before realisation. What is missing is something in between that helps in the value stream of administrative processing.
Input processing:
A well defined resource is one that can be represented in rows and columns. The columns are identifiers for similar logical information in some context.
Execute Business Logic (score):
Logging: / Monitoring:
Output, delivery:
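The notion of a well defined resource above (rows sharing the same columns) can be sketched as a simple check; the column names and sample rows are invented for illustration:

```python
# A resource is well defined when every row supplies exactly the expected columns.
def is_well_defined(rows, columns):
    return all(set(row) == set(columns) for row in rows)

columns = ["request_id", "customer", "amount"]
rows = [{"request_id": 1, "customer": "acme", "amount": 100},
        {"request_id": 2, "customer": "bbco", "amount": 250}]

print(is_well_defined(rows, columns))                        # True
print(is_well_defined(rows + [{"request_id": 3}], columns))  # False: missing columns
```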

RN-1.3.2 Info
Administrative proposed standard pattern.
📚 The process splits up into four stages: the request preparation (IV, III) and the delivery (I, II). The warehouse is the starting point (inbound) and the end point (outbound).
The request with all necessary preparations and validations going through IV and III.
The delivery with all necessary quality checks going through I and II.
lean procesoriented single workstation adddwh

SDLC life cycle steps - logging , monitoring.
Going back to the SDLC product life cycle, ALC model type 3. This is a possible implementation of the manufacturing I, II phases. 💡 There are four lines of artefact collections at releases, which will become the different production versions.
  1. Collecting input sources into a combined data model.
  2. Modifying the combined data model into a new one suited for the application (model).
  3. Running the application (model) on the adjusted, suited data, creating new information: results.
  4. Delivering verified results to an agreed destination in an agreed format.
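The four numbered steps above can be sketched as one pipeline run. Every function body here is a placeholder; only the step order follows the text, and all names and sample data are invented:

```python
def collect(sources):
    # 1. collect input sources into a combined data model
    return {name: data for name, data in sources.items()}

def adapt(combined):
    # 2. modify the combined model into one suited for the application
    return [row for data in combined.values() for row in data]

def run_model(suited):
    # 3. run the application (model), creating new information: results
    return {"rows_in": len(suited), "score": sum(suited)}

def deliver(results, destination):
    # 4. deliver verified results to an agreed destination and format
    return f"{destination}: {results}"

sources = {"crm": [1, 2], "erp": [3]}
print(deliver(run_model(adapt(collect(sources))), "warehouse"))
# warehouse: {'rows_in': 3, 'score': 6}
```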
SDLC life cycle steps - logging, monitoring. 💡 There are two points that validate the state and create additional logging. This is new information.
  1. After having collected the input sources, technical and logical verification of what is there is done.
  2. Before delivering the results, technical and logical verification of what is there is done.
This is logic carrying business rules. The goal is application logging and monitoring from a business perspective. When something is badly wrong, halting the process flow is a safety mitigation preventing more damage.
There is no way to solve this with technical log files generated by tools like an RDBMS.
💡 The results are collected and archived (business dedicated). This is new information.
  1. After the result has been created, but before delivering.
  2. It is useful for auditing purposes (what has happened) and for predictive modelling (ML).

RN-1.3.3 Info
df_dlv_alctp3.jpg
Applied Machine learning (AI), operations.
Analytics and Machine Learning are changing the way rules are invented: from rules invented only by humans to machines helping humans.
💡 The biggest change is the ALC type 3 approach. This fundamentally changes the way release management should be implemented. ML exchanges some roles of coding and data to achieve results at development, but not in the other life cycle stages.
When research is done for a report made only once, the long wait for data deliveries of the old DWH 2.0 methodology is acceptable.
⚠ With a (near) real-time operational process the data has to be correct when the impact on the scoring is important. Using that approach, at least two data streams are needed. 🤔 Analytics (AI, ML, machine learning) has a duality in the logic definition. The modelling stage (develop) uses data that is similar to, but not the same as, the data in the operational stage. Developing is done with operational production data. The size of this data can be much bigger than what is needed at operations, due to the required history. The way of developing is ALC type 3.
 
❗ The results of what an operational model is generating should be well monitored for many reasons. That is new information to process.
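Monitoring an operational model's results could, for example, compare recent scores against the development baseline. A minimal sketch, assuming a simple mean-with-tolerance check; the function name and tolerance are assumptions.

```python
from statistics import mean

def monitor_scores(scores, baseline_mean, tolerance=0.2):
    """Flag drift: is the observed mean within a tolerance band of the
    baseline established during the modelling (develop) stage?"""
    observed = mean(scores)
    ok = abs(observed - baseline_mean) <= tolerance * abs(baseline_mean)
    return ok, observed

# Operational scores compared with an (assumed) development baseline of 0.50.
ok, observed = monitor_scores([0.52, 0.48, 0.50], baseline_mean=0.50)
```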

RN-1.3.4 Info
The technical solutions as first process option.
Sometimes a simple paper note will do, sometimes an advanced new machine is needed. It depends on the situation. A simple solution avoiding waste is lean - agile.
archive documents nosql Optimization Transactional Data. A warehouse does not structure the content; it must be able to locate the wanted content in a structured way, delivering the labelled containers efficiently.
Optimization Transactional Data. The way of processing information in the old days used flat files in a physical way, still very structured, stored and labelled. In the modern approach these techniques are still applicable, although automated and hidden in an RDBMS.
Analytics & reporting. The "NO SQL" hype is a revival of choosing more applicable techniques.
It avoids the transactional RDBMS approach as the single possible technical solution.

etl-reality.jpg
Information process oriented, Process flow.
The information process in an internal flow has many interactions input, transformations and output in flows.
There is no relationship to machines and networking. The problem to solve those interactions will popup at some point.
Issues by conversions in datatypes, validations in integrity when using segregated sources (machines) will popup at some point.

The service bus (SOA).
SD_enterpriseservicebus.jpg ESB enterprise service bus The technical connection for business applications is preferably done by an enterprise service bus. The goal is normalized systems.
Changing or replacing one system should not have any impact on others.

Microservice_Architecture.png
Microservices with api´s
Microservices (Chris Richardson):
Microservices - also known as the microservice architecture - is an architectural style that structures an application as a collection of services that are: The microservice architecture enables the continuous delivery/deployment of large, complex applications. It also enables an organization to evolve its technology stack.

Data in containers.
informatie_mdl_imkad11.jpg Data modelling using the relational or network concepts is based on basic elements (artefacts).
An information model can use more complex objects as artefacts. In the figure every object type has got different colours.
The information block is a single message describing the complete states before and after a mutation of an object. The life cycle of a data object is new meta-information; every artefact in the message follows that metadata.
This makes it possible to process a chained block of information. It does not follow the blockchain axioms. The real advantage of a chain of related information is detecting inter-relationships with possibly illogical or unintended effects.
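Such a chained information block could be sketched as follows; the structure and the gap-detection rule are assumptions for illustration, not the document's prescribed format.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class InfoBlock:
    """One message holding the complete state before and after a mutation,
    linked to the previous block (not following blockchain axioms)."""
    object_id: str
    before: dict                          # complete state before the mutation
    after: dict                           # complete state after the mutation
    prev: Optional["InfoBlock"] = None    # link to the previous block

def detect_gap(block: InfoBlock) -> bool:
    """True when the chain shows a possibly unintended effect: this block's
    'before' state does not match the previous block's 'after' state."""
    return block.prev is not None and block.prev.after != block.before

b1 = InfoBlock("order-1", before={"status": "new"}, after={"status": "paid"})
b2 = InfoBlock("order-1", before={"status": "paid"}, after={"status": "shipped"}, prev=b1)
b3 = InfoBlock("order-1", before={"status": "new"}, after={"status": "cancelled"}, prev=b2)
```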

olap_star01.jpg
Optimization OLTP processes.
The relational SQL DBMS replaced CODASYL network databases (see math). The goal is simplification of online transaction processing (OLTP) data by deduplication and normalization (techtarget), using DBMS systems supporting the ACID properties of transactions (IBM).
These approaches are necessary doing database updates with transactional systems. Using this type of DBMS for analytics (read-only) was not the intention.
normalization (techtarget, Margaret Rouse ) Database normalization is the process of organizing data into tables in such a way that the results of using the database are always unambiguous and as intended. Such normalization is intrinsic to relational database theory. It may have the effect of duplicating data within the database and often results in the creation of additional tables.
ACID properties of transactions (IBM)
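The atomicity part of ACID can be illustrated with Python's built-in sqlite3: either both updates of a transfer commit, or neither does. A sketch under an assumed toy account table, not tied to any particular production DBMS.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE account (name TEXT PRIMARY KEY, balance INTEGER)")
con.execute("INSERT INTO account VALUES ('a', 100), ('b', 0)")
con.commit()

try:
    with con:  # one transaction: commits on success, rolls back on error
        con.execute("UPDATE account SET balance = balance - 150 WHERE name = 'a'")
        con.execute("UPDATE account SET balance = balance + 150 WHERE name = 'b'")
        row = con.execute("SELECT balance FROM account WHERE name = 'a'").fetchone()
        if row[0] < 0:
            raise ValueError("insufficient funds")  # forces a rollback
except ValueError:
    pass  # the whole transfer was rolled back, not half of it

balances = dict(con.execute("SELECT name, balance FROM account"))
```

Because the business rule failed, neither account changed: that is exactly the transactional guarantee an analytics (read-only) workload does not need.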

RN-1.4 The connection of technology agile lean

. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .
RN-1.4.1 Info
The Philosophy and Practicality of Jidoka
allaboutlean: The Three Fundamental Ways to Decouple Fluctuations Diving deep into the Toyota philosophy, you could see this as JIT telling you to let the material flow, and jidoka telling you when to stop the flow. This is a bit like the Chinese philosophical concept of Yin and Yang, where seemingly opposite or contrary forces may actually be complementary.
The same applies here. JIT encourages flow, and Jidoka encourages stops, which seems contrary. However, both help to produce more and better parts at a lower cost. Unfortunately, JIT gets much, much more attention as it is the glamorous and positive side, whereas jidoka is often seen as all about problems and stops and other negative aspects. Yet, both are necessary for a good production system.
💣 Ignoring the holistic view of the higher goal and focusing only on a detailed aspect like JIT can make things worse, not better.

project shop, moving the unmovable.
The project shop is associated with lean thinking being impossible to apply. Is that so, or are there situations where new technology implements a lean way of working?
allaboutlean projectshop - building ship
It uses a great invention of process improvement over and over again: the dock. Building in the water is not possible; building ashore raises the question of how to get the ship into the water safely.
🔰 Reinvention of patterns.
Moving something that is unmovable.
Changing something that has always been done that way.

 Time-lapse - sliding in a tunnel section (A12). Minimizing the time for road adjustment by placing a prebuilt tunnel: placing it, once it could be moved, was done in just 3 days; building it took several months.
See the time-lapse. 👓 Placing the tunnel was a success; a pity the intended road wasn't finished after three years.
 
The project approach of moving the unmovable has been copied many times, with the intended usage afterwards. rail bridge deck cover The approach is repeatable.
💡 Reinvention of patterns. Moving something that is unmovable.
🎭When a project shop is better in place, why not copy this at ICT?

Administration information flow.
Seeing this way of working, the association is with administrative work moving papers around.
Unstructured and Pulse Line Flow lines are often the best and most organized approach to establish a value stream.
The "easiest" one is an unstructured approach. The processes are still arranged in sequence; however, there is no fixed signal when to start processing a part.
kantoortuin (open-plan office) 💡 Reinvention of patterns. Using the information flow as an assembly line.
🎭 When a flow line is a fit for an administrative process, why not copy this at ICT?
🎭 When an administrative process is associated with administrative tags (e.g. product description) being processed, why not have them related to each other?
Administrative process, differences to physical objects.
RN-1.4.2 Info
Change data - Transformations
Seeing the value stream within an administrative product is a different starting point for completely new approaches. The starting point is redesigning what is not working well: not automatically keeping on doing things as they have always been done, and also not changing things merely for the sake of change.
Design thinking.
It is a common misconception that design thinking is new. Design has been practiced for ages: monuments, bridges, automobiles, subway systems are all end-products of design processes. Throughout history, good designers have applied a human-centric creative process to build meaningful and effective solutions.
BISL gap Business ICT The design thinking ideology is following several steps.
Definition: The design thinking ideology asserts that a hands-on, user-centric approach to problem solving can lead to innovation, and innovation can lead to differentiation and a competitive advantage. This hands-on, user-centric approach is defined by the design thinking process and comprises 6 distinct phases, as defined and illustrated below.
See link at figure 👓.
 
Those six phases are in line with what the CRISP-DM model states. What is missing when comparing this with the PDCA cycle is the Check: verifying it works as expected after implementation.

many partitioned dws-s process cycle demo
Combining information connections between silos & layers.
💡 Solving gaps between silos in the organisation supports the value stream.
Having information aligned by the involved parties avoids different versions of the truth. It is easier to consolidate that kind of information to a centrally managed (BI analytics) tactical - strategical level.
The change to achieve this is one of cultural attitudes. That is a top down strategical influence.

RN-1.4.3 Info
Tuning performance basics.
Solving performance problems requires understanding of the operating system and hardware. That architecture was set by von Neumann (see design-math).
vonNeumann_perftun01.jpg
A single CPU, limited Internal Memory and the external storage.
The time differences between those resources are in magnitudes (factor 100-1000).

Optimizing is balancing between choosing the best algorithm and the effort to achieve that algorithm.

vonNeumann_perftun02.jpg
That concept didn´t change. The advance in hardware made it affordable to ignore the knowledge of tuning.

The Free Lunch Is Over .
A Fundamental Turn Toward Concurrency in Software, By Herb Sutter.
If you haven´t done so already, now is the time to take a hard look at the design of your application, determine what operations are CPU-sensitive now or are likely to become so soon, and identify how those places could benefit from concurrency. Now is also the time for you and your team to grok concurrent programming´s requirements, pitfalls, styles, and idioms.

An additional component: the connection from the machine (multiple CPUs, several banks of internal memory) to multiple external storage boxes by a network.

Perftun_EtL01.jpg
Tuning cpu - internal memory.
Minimize resource usage: ❗ The "balance line" algorithm is the best. A DBMS will do that when possible.
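The balance-line algorithm mentioned above is, in essence, a single sequential pass over two sorted inputs (what a DBMS effectively does with a sort-merge join). A minimal sketch, assuming both inputs are already sorted on an assumed `key` field.

```python
def balance_line(masters, transactions):
    """Merge sorted masters and sorted transactions on 'key' in one pass,
    instead of doing a lookup per transaction."""
    matched, i, j = [], 0, 0
    while i < len(masters) and j < len(transactions):
        mk, tk = masters[i]["key"], transactions[j]["key"]
        if mk < tk:
            i += 1                       # master without a transaction
        elif mk > tk:
            j += 1                       # transaction without a master
        else:
            matched.append({**masters[i], **transactions[j]})
            j += 1                       # next transaction; master may match again
    return matched

masters = [{"key": 1, "name": "a"}, {"key": 2, "name": "b"}, {"key": 4, "name": "d"}]
trans = [{"key": 2, "amount": 10}, {"key": 3, "amount": 5}, {"key": 4, "amount": 7}]
out = balance_line(masters, trans)
```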

Perftun_EtL02.jpg
Network throughput.
Minimize delays, use parallelization:
⚠ Transport buffer size is a cooperation between the remote server and the local driver. The local optimal buffer size can be different; resizing data in buffers is a cause of performance problems.
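A sketch of both points: copying in explicitly sized buffers, and retrieving several parts in parallel. The buffer size and the stand-in `fetch` function are assumptions; real code would negotiate the size with the remote side and go over the network.

```python
from concurrent.futures import ThreadPoolExecutor
import io

BUFFER_SIZE = 64 * 1024  # assumed negotiated transport buffer size

def copy_buffered(src: io.BytesIO, dst: io.BytesIO) -> int:
    """Copy a stream in fixed-size buffers; returns the number of buffers used."""
    buffers = 0
    while chunk := src.read(BUFFER_SIZE):
        dst.write(chunk)
        buffers += 1
    return buffers

def fetch(part_id: int) -> bytes:
    """Stand-in for one remote retrieval (hypothetical)."""
    return bytes([part_id]) * 10

# Minimize delays by parallelizing independent retrievals.
with ThreadPoolExecutor(max_workers=4) as pool:
    parts = list(pool.map(fetch, range(4)))

dst = io.BytesIO()
n = copy_buffered(io.BytesIO(b"x" * (BUFFER_SIZE * 2 + 1)), dst)
```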

Perftun_EtL03.jpg
Minimize delays in the storage system.
⚠ For analytics, tuning IO is quite different from transactional DBMS usage.
💣 This different, non-standard approach must be in scope with service management. The goal of sizing capacity is better understood than striping for IO performance.

DBMS changing types
A mix of several DBMS types is allowed in an EDWH 3.0. The speed of transport and retention periods are important considerations. Technical engineering for details and limitations, according to the state of the art and cost factors.
dbmsstems_types01.png
RN-1.4.4 Info
BISL Business Information Services Library.
BiSL is used for a demand-supply chain, often going along with internal business and externally outsourced IT services. Nice to see is a separation of concerns in a similar way, placing the high-level drivers in the centre.
The framework describes a standard for processes within business information management at the strategy, management and operations level. BiSL is closely related to the ITIL and ASL frameworks, yet the main difference between these frameworks is that ITIL and ASL focus on the supply side of information (the purpose of an IT organisation), whereas BiSL focuses on the demand side (arising from the end-user organisation).
Business Process The demand-side focus for some supply is a solution for the supposed mismatch between business & ICT. The approach for that mismatch is an external supplier.
Business Process Indeed there are gaps. The question should be: is there a mismatch, or have the wrong questions been asked?
In the value stream flow there are gaps between:
  1. operational processes, in the chain of the product transformation - delivery.
  2. delivering strategical management information, assuming the silos in the transformation chains - delivery are cooperating.
  3. extracting and creating management information within the silos, between their internal layers.

This has nothing to do with hard facts but everything with things like my turf and your fault. Different responsible parties have their own opinion on how those conflicts should get solved. The easy way is outsourcing the problem to an external party, a new viewpoint coming in.
🤔 The expectation that this would be cheaper and of better quality is a promise without guarantees.
🤔 Having no alignment between the silo´s there is a question on the version of the truth.

Business Process When these issues are the real questions real problems to solve:
  1. Solve the alignment between operational processes, with the value stream of the product. Both parties need to agree on a single version of the truth.
  2. Solve the alignment in extracting and creating management information within the silos, between their internal layers. There are two lines of separation in context.
  3. Use the management information within the silos as consolidated information in delivering strategical management information.


RN-1.5 Closed loops, informing what is going on in the system

. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .
RN-1.5.1 The EDWH - Data Lake - Data Mesh - EDWH 3.0
Classic DataWareHousing.
Processing objects, processing information goes along with responsibilities. There is an origin of the information and a consumer of combined information lines.
A data warehouse is at the moment siloed to reporting tasks: reporting in dashboards and reports, so managers make up their minds with those reports as the "data".
Other usage of a data warehouse is seen as problematic when it is used for operational information questions, possibly involving AI (or better, machine learning), bypassing those managers as the decision makers. 👓  data-lake-to-data-marketplace
The technology question of what kind of DBMS should be used in a monolithic system for management reporting is a strategy question.
Data curation before data is used in a monolithic system for management reporting is a strategy question.
Historical information in this monolithic system for management reporting is a question.
Connecting to analytical usage in an operational flow in this monolithic system for management reporting is a question.

RN-1.5.2 Info
💡 Logistics of the EDWH - Data Lake. EDWH 3.0
As the goal of BI Analytics was delivering reports to managers, securing information and runtime performance were not considered relevant.
Securing information is too often an omission.
Transforming data should be avoided.
The data-consumer process should do the logic processing.
Offloading data, doing the logic in Cobol before loading, is an ancient pattern to be abandoned. Processing objects and information goes along with responsibilities.
❗ A data warehouse is allowed to receive semi-finished product for the business process.
✅ A data warehouse knows who is responsible for the inventory being serviced.
❗ A data warehouse has processes in place for delivering and receiving verified inventory.
In a picture:
df_csd01.jpg The two vertical lines manage who has access to what kind of data: authorized by the data owner, registered data consumers, monitored and controlled.
The confidentiality and integrity steps are not bypassed with JIT (lambda).

CIA Confidentiality Integrity Availability. Activities.

CSD Collect, Store, Deliver. Actions on objects.

There is no good reason not to do this also for the data warehouse when it is positioned as a generic business service. (EDWH 3.0)
Focus on the collect - receive side.
There are many different options for how to receive information and data processing: multiple sources of data, multiple types of information.
df_collect01.jpg In a picture:
 
A data warehouse should be the decoupling point of incoming and outgoing information.
 
A data warehouse should validate and verify the delivery against what is promised to be there: just the promise according to the registration by administration, not the quality of the content (a different responsibility).
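Such a promise-only check could be sketched as a manifest comparison; the manifest layout (dataset name mapped to promised row count) and all names are assumptions for illustration.

```python
def validate_delivery(manifest: dict, delivered: dict) -> list:
    """Compare a delivery against its registered promise; return discrepancies.
    Only the administrative promise is checked, not the content quality."""
    issues = []
    for name, promised_rows in manifest.items():
        if name not in delivered:
            issues.append(f"missing: {name}")
        elif delivered[name] != promised_rows:
            issues.append(f"row count mismatch: {name} "
                          f"(promised {promised_rows}, got {delivered[name]})")
    for name in delivered:
        if name not in manifest:
            issues.append(f"unregistered: {name}")
    return issues

manifest = {"customers": 1200, "orders": 5400}     # the registered promise
delivered = {"customers": 1200, "orders": 5399, "extra": 10}
issues = validate_delivery(manifest, delivered)
```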

Focus on the ready - deliver side.
A classification by consumption type:
df_delivery01.jpg In a picture:
 
There are possible many data consumers.
It is all about "operational" production data - production information.
 
Some business applications are only possible using the production information.

RN-1.5.3 Info
Some mismatches in a value stream.
Aside from all direct questions from the organisation, many external requirements are coming in. A limited list to get an idea of regulations having impact on administrative information processing.
business flow & value stream.
Business Process top down Having a main value stream from left to right, the focus can be top down with the duality of processes - transformations and the product - information.
Complicating factor is that:
✅ Before external data can be retrieved, the agreement on what to retrieve must exist at some level.
✅ Before the delivery can be fulfilled, the request on what to deliver must be there.
Business Process bottom up Having the same organisation, the focus can be bottom up with the layers in silos and separation of concerns.
Complicating factor is that:
❓ In the centre, needed government information does not come in by default. The request for that information is not reaching the operational floor.
😲 The silos responsible for a part of the operating process do not exchange needed information in the easiest way by default.


EDW development approach and presentation
BI DWH, datavirtualization.
Once upon a time there were big successes using BI and Analytics. The successes were achieved by the good decisions, not best practices, made in those projects.
To copy those successes, the best way would be understanding the decisions made. Regrettably, these decisions and why they were made are not published.
Lans_datavirtualise.jpg The focus for achieving success changed in using the same tools with those successes.
BI Business Intelligence has long claimed to be the owner of the E-DWH. Typical in BI is that almost all data is about periods. Adjusting data to match the differences in periods is possible in a standard way. The data virtualization is built on top of the "data vault" DWH 2.0, dedicatedly built for BI reporting usage. It is not virtualization on top of the ODS or the original data sources (staging).

dashboard BI Presenting data using figures as BI.
The information for managers commonly is presented in easily understandable figures.
When used for giving satisfying messages or escalations for problems there is bias to prefer the satisfying ones over the ones alerting for possible problems.
😲 No testing and validation processes are deemed necessary, as nothing is operational, just reporting to managers.

df_dlv_bi-anl.jpg 💡 The biggest change for a DWH 3.0 approach is the shared location of data information being used for the whole organisation, not only for BI.
 
Dimensional modelling and the Data Vault, building up a dedicated storage, are seen as the design pattern solving all issues. OLAP modelling and reporting on the production data deliver new information for managers while overcoming performance issues. A more modern approach is in-memory analytics, which still needs a well-designed data structure (preparation).
 
😱 Archiving historical records that may be retrieved is an option that should be regular operations, not a DWH reporting solution.
The operations (value stream) process sometimes needs information from historical records. That business question compensates for limitations in the operational systems: those systems were never designed and realised with archiving and historical information in mind.
⚠ Storing data in a DWH can be done in many possible ways. The standard RDBMS dogma has been augmented with a lot of other options. Limitation: technical implementations are not always well suited, because of the difference from an OLTP application system.

RN-1.5.4 Info
many partitioned dws-s process cycle demo
Reporting Controls (BI)
The understandable goal of BI reporting and analytics reporting is rather limited, that is:
📚 Informing management with figures,
🤔 so they can make up their mind on their actions - decisions.
The data explosion. The change is the amount we are collecting, measuring processes as new information (edge).
📚 Information questions.
⚙ measurements data figures.
🎭 What to do with new data?
⚖ legally & ethical acceptable?
dashboard classic
When controlling something it is necessary to:
👓 Knowing where it is heading.
⚙ Able to adjust speed and direction.
✅ Verifying all is working correctly.
🎭 Discuss destinations, goals.
🎯 Verify achieved destinations, goals.
 
It is basically like using a car.
Adding BI (DWH) to layers of enterprise concerns.
Having the three layers, separation of concerns: at the edges of those layers, inside the hierarchical pyramid, there is interesting information to collect for controlling & optimising the internal processes. For strategic information control, the interaction with the documentational layer is the first one being visible.

many partitioned dws-s process cycle demo Having the four basic organisational lines that are assumed to cooperate as a single enterprise in the operational product value stream circle, there are gaps between those pyramids.
 
Controlling them at a higher level uses information on which the involved parties, two by two, are in agreement. This adds another four points of information. Consolidating those four interaction points to one central point makes the total number of strategic information containers nine.

dashboard_airbus_a380.jpg
Too complicated and costly BI.
When trying to answer every possible question:
💰 requiring a lot of effort (costly)
❗ every answer 👉🏾 new questions ❓.
🚧 No real end situation,
continuous construction - development.
 
The simple, easy car dashboard could end up as an airplane cockpit and still miss the core business goals to improve.

ETL ELT - No Transformation.
etl-elt_01.png Classic is the processing order:
⌛ Extract, ⌛ Transform, ⌛ Load. For segregation from the operational flow a technical copy is required. Issues are:
Translating the physical warehouse to ICT.
Diagram_of_Lambda_Architecture_generic_.jpg All kinds of data (technical) should get support for all types of information (logical) at all kinds of speed. Speed, streaming, bypasses (duplications allowed) the store - batch for involved objects. Fast delivery (JIT Just In Time).
💣 The figure is what is called lambda architecture in data warehousing.
lambda architecture (wikipedia). In physical warehouse logistics this question for a different architecture is never heard of: the warehouse is supposed to support the manufacturing process. For some reason the data warehouse has been reserved for analytics and does not support the manufacturing process.

RN-1.6

. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .
RN-1.6.1 Info
wrh_selfsrvc-01.jpg
Selfservice - Managed
Self service sounds very friendly, but it is a euphemism for no service: collecting your data and processing your data yourself.
The advantage for the customer is picking whatever is felt convenient, found on some shelf. The disadvantages are: wrh_cntr_stor.jpg Having it prepared and transported for you so it can be processed for you. The advantages are a well-controlled environment that is also capable of handling more sensitive material (confidential secrets).
RN-1.6.2 Info
Maturity Level 1-5
IT-Business Strategic Alignment Maturity- Jerry Luftman
Why -still- discuss IT-business alignment?
4. In search of mythical silver bullet
5. Focusing on infrastructure/architecture
7 Can we move from a descriptive vehicle to a prescriptive vehicle?

(see link with figure 👓)
💣 This CMM level discussion has been going on since 1990. Little progress in results has been made; that can be explained by the document analyses and the listed numbers.
Trying to achieve the levels by ticking off some action list as done is a way to not achieve those goals. Cultural behaviour is very difficult to measure. Missing in IT is the C for communication: ICT.

Retrospective on applying collective intelligence for policy.
Ideas into action (Geoff Mulgan )
What's still missing is a serious approach to policy. I wrote two pieces on this one for the Oxford University Press Handbook on Happiness (published in 2013), and another for a Nef/Sitra publication. I argued that although there is strong evidence at a very macro level (for example, on the relationship between democracy and well-being), in terms of analysis of issues like unemployment, commuting and relationships, and at the micro level of individual interventions, what's missing is good evidence at the middle level where most policy takes place. This remains broadly true in the mid 2020s.
I remain convinced that governments badly need help in serving the long-term, and that there are many options for doing this better, from new structures and institutions, through better processes and tools to change cultures. Much of this has to be led from the top. But it can be embedded into the daily life of a department or Cabinet. One of the disappointments of recent years is that, since the financial crisis, most of the requests to me for advice on how to do long-term strategy well come from governments in non-democracies. There are a few exceptions - and my recent work on how governments can better 'steer' their society, prompted by the government in Finland, can be seen in this report from Demos Helsinki.
During the late 2000s I developed a set of ideas under the label of 'the relational state'. This brought together a lot of previous work on shifting the mode of government from doing things to people and for people to doing things with them. I thought there were lessons to learn from the greater emphasis on relationships in business, and from strong evidence on the importance of relationships in high quality education and healthcare. An early summary of the ideas was published by the Young Foundation in 2009. The ideas were further worked on with government agencies in Singapore and Australia, and presented to other governments including Hong Kong and China. An IPPR collection on the relational state, which included an updated version of my piece and some comments, was published in late 2012.
I started work on collective intelligence in the mid-2000s, with a lecture series in Adelaide in 2007 on 'collective intelligence about collective intelligence'. The term had been used quite narrowly by computer scientists, and in an important book by Pierre Levy. I tried to broaden it to all aspects of intelligence: from observation and cognition to creativity, memory, judgement and wisdom. A short Nesta paper set out some of the early thinking, and a piece for Philosophy and Technology Journal (published in early 2014) set out my ideas in more depth. My book Big Mind: how collective intelligence can change our world from Princeton University Press in 2017 brought the arguments together.

RN-1.6.3 Info
Technology push focus BI tools.
The technology offerings have been changing rapidly in recent years (as of 2020). Hardware is not a problematic cost factor anymore; functionality is. Choosing a tool, or having several of them, goes with personal preferences.
This has nothing to do with hard facts but everything with things like my turf and your fault. Different responsible parties have their own opinion on how conflicts should get solved. In a technology push it is not the organisational goal anymore; it is showing the personal position inside the organisation.
🤔 The expectation of cheaper and better quality is a promise without guarantees.
🤔 Having no alignment between the silos, there is a question on the version of the truth.

Just an inventory of the tools and the dedicated areas they are used in: Matt Turck on 2020, bigdata 2020. An amazing list of all kinds of big data tools in the market place.
2019 Matt Turck Big Data Landscape

RN-1.6.4 Info
Changing the way of informing.
Combining data transfer, microservices, archive requirements, security requirements, and doing it with the maturity of physical logistics: it goes in the direction of a centrally managed approach while doing as much as possible decentralised. Decoupling activities when possible keeps the problems that pop up small enough to be humanly manageable.
 
Combining information connections.
There are a lot of ideas that, when combined, give another situation: many partitioned dws-s process cycle demo 💡 Solving gaps between silos supports the value stream. Those are the rectangularly positioned containers connecting between the red/green layers (total eight internal intermediates).
💡 Solving management information into the green/blue layers in every silo internally. These are the second containers in every silo (four: more centralised).
💡 Solving management information gaps between the silos following the value stream at a higher level. These are the containers at the circle (four intermediates).
Consolidate that content to a central one.
🎭 The result is having the management information supported in nine (9) containers following the product flow at strategic level: not a monolithic central management information system but one that is decentralised, delegating as much as possible to satellites.
💡 The outer operational information rectangle has a lot of detailed information that is useful for other purposes. One of these is the integrity processes; a SOC (Security Operations Centre) is an example of adding another centralised one.
🎭 The result is having the management information supported in nine (9) containers following the product flow at strategic level, another eight (8) at the operational level, and possibly more: not a monolithic central management information system but one that is decentralised, delegating as much as possible to satellites.
🤔 Small is beautiful: instead of big monolithic costly systems, many smaller ones can do the job better and more efficiently. The goal: repeating a pattern instead of a one-off project shop. The duality: while doing a change it will be like a project shop.

shp_cntr_load-2.jpg
Containerization.
We are used to the container boxes as used these days for all kinds of transport. The biggest container ships go over the world reliably, predictably, affordably.
Normal economical usage: load - reload, returning, many predictable, reliable journeys.

shp_cntr_liberty.jpg The first of these cargo ships were the Liberty ships. Fast and cheap to build. The high loss rate was not a problem; it was solved by building many of them. They were built as project shops, but at many locations, with the advantage of a known design to build over and over again.
They were not designed for many journeys; they were designed for deliveries in war conditions.

allaboutlean projectshop - building ship project shop.
to cite:
This approach is most often used for very large and difficult to move products in small quantities.
...
There are cases where it is still useful, but most production is done using job shops or, even better, flow shops.
💣 The idea is that everything should become a flow shop, even when that is not applicable. In ICT, delivering software at high speed is seen as a goal; that idea misses the data value stream as the goal.

Containerization.
Everybody attaches a different context to the word "data". That is confusing when trying to do something with data. A mind switch is to see it as information processing in enterprises. As the datacentre is not a core business activity for most organisations, there is a move towards outsourcing (cloud, SaaS).
When engineering a process flow, there will be waits at many points. At the starting and ending points the flow crosses from internal to external, where far longer waits for artefacts or product deliveries will occur. Avoiding fluctuations and having a predictable, balanced workload is the practical way to become efficient.
Processing objects, collecting information, and delivering go along with responsibilities. It is not sexy; in fact it is rather boring. Without good implementation, all other activities easily become worthless. The biggest successes, like Amazon, are probably based more on doing this very well than on anything else. The Inner Workings of Amazon Fulfillment Centers
Commonly used ICT patterns for processing information. For a long time the only delivery of an information process was a hard-copy paper result. Delivery of results has changed to many options. The storing of information has changed as well.
 
Working on a holistic approach to information processing, starting at the core activities, can solve a lot of problems. Why work only on symptoms and not on root causes?
💡 Preparing data for BI and analytics has become an unnecessary prerequisite: building a big design up front, the enterprise data warehouse (EDWH 3.0).

Data Technical - machines oriented
The technical, machine-oriented approach is about machines and the connections between them (the network). The service of delivering infrastructure (IaaS) is limited to these kinds of objects, not to how they are interrelated.
The problems to solve behind this are questions of:

df_machines.jpg 🤔 A bigger organisation has several departments. Expectations are that their work has interactions and that there are some central parts.
Sales, marketing, production lines, bookkeeping, payments, accountancy.
🤔 Interactions between all those departments lead to complexity.
🤔 The number of machines and the differences in stacks are growing fast, no matter where these logical machines are.
A dedicated set of machines for every business service will increase complexity further.

The information process flow has many interactions, inputs, transformations, and outputs.
💡 Reinvention of a pattern: the physical logistics warehouse approach is well developed and works well. Why not copy that pattern to ICT? (EDWH 3.0)

printing delivery line
What is delivered in an information process?
Mailing print processing is the oldest front-end system using back-end data. The moment of printing is not the same as the moment the information was manufactured.

Many more front-end deliveries have been created in recent years, the dominant ones being web pages and apps on smartphones.
A change in attitude is needed, but it should still be seen as a delivery that needs the quality of information produced by the process.

Change data - Transformations
A data strategy helping the business should be the goal. Process information as "documents" with detailed elements encapsulated, keeping transport and archiving apart from producing it, as a holistic approach.
shp_cntr_clct.jpg Logistics using containers.
The standard approach in information processing focusses on the most detailed artefacts, trying to build a holistic data model for all kinds of relationships. This is how goods were once transported: as single items (pieces). That has changed into containers holding encapsulated goods.
💡 Use labelled information containers instead of working with detailed artefacts.

shp_cntr_store.jpg 💡 Transport of containers requires some time; the required time, however, is predictable. Trusting that the delivery arrives in time and that the quality conforms to expectations is more efficient than trying to do everything in real time.

shp_cntr_dlv.jpg Information containers that have arrived are almost ready for delivery, with a more predictable moment of delivery to the customer.
💡 The expected-delivery notice is becoming standard in physical logistics. Why not do the same in administrative processes?

Data Strategy: Tragic Mismatch in Data Acquisition versus Monetization Strategies.
A nice review on this: "Organizations do not need a Big Data strategy; they need a business strategy that incorporates Big Data." (Bill Schmarzo, 2020)
value for the money
Companies are better at collecting data (about their customers, about their products, about competitors) than analyzing that data and designing strategy around it. Too many organizations are making Big Data, and now IoT, an IT project. Instead, think of the mastery of big data and IoT as a strategic business capability that enables organizations to exploit the power of data with advanced analytics to uncover new sources of customer, product, and operational value that can power the organization's business and operational models.

🔰 Contents Frame-ref ZarfTopo ZarfRegu SmartSystem ReLearn 🔰
  
🚧  Knowium P&S-ISFlw P&S-ISMtr P&S-Pltfrm Fractals Learn-I 🚧
  
🎯 Know_npk Gestium Stravity Human-cap Evo-InfoAge Learn-@2 🎯


RN-2 The impact of uncertainty to information processing


dual feeling

RN-2.1 Reframing the thinking for decision making

This is a different path on information processing, supporting governance and informed, understandable decisions. It all started with an assumption of certainty for knowledge management and collective intelligence. Decisions, however, are made under assumptions and uncertainty.

RN-2.1.1 Thinking dialectically to underpin decisions
The Dialectical Thought Form Framework (DTF) source
Far beyond the personal comfort zone, an LLM is helpful.
Otto Laske is a multidisciplinary consultant, coach, teacher, and scholar in the social sciences, focused on human development and organizational transformation. Jan De Visch is an organization psychologist, executive professor, and facilitator with extensive experience managing organizational development and change processes.
Key contributions: Dialectical Thought Form Framework (DTF) is aimed at understanding and nurturing reasoning complexity: how people structure thought as they handle context, change, contradiction, and transformation.
The counterpart of this page is 6x6systemslean (Shape design Zarf Jabes Jabsa). It is not descriptive systems thinking (formal-logical); it is meta-structural systems thinking. This is the same territory Laske calls dialectical: DTF operates in the same cognitive space.
Key indicators (DTF markers) present throughout 6x6systemslean: The work consistently combines: Process (cycles, iteration, lean loops), Relationship (roles, viewpoints, dependencies), Transformation (reframing, recursion, scale shifts). The overlap is deep, but unevenly distributed across DTF categories.

The Dialectical Thought Form Framework (DTF) summary
Dialectical Thought Form Framework (DTF) consists of 4 categories (quadrants), each with 7 Thought Forms (TFs), for a total of 28. This is the standard IDM / Laske formulation; wording can vary slightly across publications and trainings, but the structure is stable. The thought forms for the first two categories are:
Context (C):
C1 – Context as container
C2 – Contextual limits / boundaries
C3 – Contextual resources
C4 – Contextual embeddedness
C5 – Contextual dependency
C6 – Contextual shift
C7 – Contextual layering (multiple contexts)
Process (P):
P1 – Process as a whole
P2 – Process phases
P3 – Process directionality
P4 – Process rhythm / pace
P5 – Process interaction
P6 – Process interruption
P7 – Process stabilization

And for the remaining two categories:
Relationship (R):
R1 – Relationship as mutual influence
R2 – Structural relationship
R3 – Functional relationship
R4 – Power / asymmetry
R5 – Complementarity
R6 – Tension / contradiction
R7 – Relational integration
Transformation (T):
T1 – Emergence
T2 – Transformation of function
T3 – Transformation of structure
T4 – Breakdown / negation
T5 – Reorganization
T6 – Developmental leap
T7 – Integration at a higher level

Each class, Process (P), Context (C), Relationship (R), and Transformation (T), captures a way of thinking: from seeing events in relation to conditions, diagnosing interdependencies, and dealing with contradictions, to achieving integrative transformation. This is a generic thinking approach, usable on groups of persons and on systems acting in a similar way. That is a different boundary scope than the one DTF grew up in.
Where there is no overlap: an important boundary.
There are also clear non-overlaps, which is healthy: what DTF has that this page does not aim to do, and what 6x6systemslean has that DTF does not. DTF is diagnostic, 6x6systemslean is generative; they are complementary, not redundant.
The SIAR model operationalizes dialectical thinking at the system-design level, while DTF explicates the cognitive forms required to meaningfully operate such a model.

RN-2.1.2 Feeling a repeating pattern of ~6 distinctions
The reasoning for a limited number of distinctions
The statement: “Each dimension, when articulated adequately but minimally, needs about 6–7 stable distinctions.” does not originate as a design rule in Laske.
It is a convergence result across several intellectual traditions that Laske draws together.
Key sources:
  1. Hegelian dialectics (structure of determination) Hegel’s logic repeatedly shows that:
    • A concept becomes determinate only after a finite sequence of qualitative differentiations
    • Fewer than ~6 leaves it underdetermined
    • More than ~7 begins to produce redundancy
    Laske does not invent this — he operationalizes it.
    Hegel published his first great work, the Phänomenologie des Geistes (1807; The Phenomenology of Mind). This, perhaps the most brilliant and difficult of Hegel's books, describes how the human mind has risen from mere consciousness, through self-consciousness, reason, spirit, and religion, to absolute knowledge.
  2. Piaget / Kegan (constructive-developmental limits) Developmental psychology shows that:
    • Meaning-making capacity expands by adding distinctions
    • At any stable stage, the number of simultaneously operable distinctions is limited
    • Empirically, this stabilizes around 6–7 for adult meaning structures
    Piaget's theory of cognitive development, or his genetic epistemology, is a comprehensive theory about the nature and development of human intelligence.
    Kegan described meaning-making as a lifelong activity that begins in early infancy and can evolve in complexity through a series of "evolutionary truces" (or "evolutionary balances") that establish a balance between self and other (in psychological terms), or subject and object (in philosophical terms), or organism and environment (in biological terms). This is not Miller's "7±2" memory claim; it is about structural differentiation, not memory load.
  3. Jaques' stratified systems theory: Elliott Jaques incorporated his findings from the "Glacier investigations" into what was first known as Stratified Systems Theory. This major discovery served as a link between social theory and the theory of organizations.
    (requisite organization):
    • Found that complex systems stabilize at ~7 strata
    • Fewer strata → insufficient control
    • More strata → loss of coherence
    Laske explicitly references Jaques in his systems thinking.
  4. Empirical validation in DTF research: Laske and collaborators coded hundreds of interviews and observed that:
    • Below ~6 distinctions → thinking collapses into vagueness
    • Above ~7 → distinctions collapse back into synonyms or rhetoric
The 7-per-quadrant pattern is empirical, not aesthetic.

Using six categories for learning
Increasingly, the issues on which the survival of our civilization depends are ‘wicked’ in the sense of being more complex than logical thinking alone can make sense of and deal with. Needed is not only systemic and holistic but dialectical thinking to achieve critical realism. Dialectical thinking has a long tradition both in Western and Eastern philosophy but, although renewed through the Frankfurt School and more recently Roy Bhaskar, has not yet begun to penetrate cultural discourse in a practically effective way. We can observe the absence of dialectical thinking in daily life as much as in the scientific and philosophical literature. It is one of the benefits of the practicum to let participants viscerally experience that, and in what way, logical thinking — although a prerequisite of dialectical thinking — is potentially also the greatest hindrance to dialectical thinking because of its lack of a concept of negativity. To speak with Roy Bhaskar, dialectical thinking requires “thinking the coincidence of distinctions” that logical thinking is so good at making, being characterized by “fluidity around the hard core of absence” (that is, negativity, or what is missing or not yet there).
For thinkers unaware of the limitations of logical thinking, dialectical thinking is a many-faced beast which to tame requires building up in oneself new modes of listening, analysis, self- and other-reflection, the ability to generate thought-form based questions, and making explicit what is implicit or absent in a person’s or group’s real-time thinking. These components are best apprehended and exercised in dialogue with members of a group led by a DTF-schooled mentor/facilitator.
The practicum takes the following six-prong approach:
  1. Foundations of Dialectic: Understand moments of dialectic and classes of thought forms and their intrinsic linkages as the underpinnings of a theory of knowledge.
  2. Structured dialogue and communication: Learn how to use moments of dialectic when trying to understand a speaker’s subject matter and issues, or when aiming to speak or writing clearly.
  3. (Developmental) listening and self-reflection: Learn to reflect on the thought-form structure of what is being said by a person or an entire group in real time.
  4. Text analysis: Learn to understand the conceptual structure of a text (incl. an interview text) in terms of moments of dialectic and their associated thought forms as indicators of optimal thought complexity.
  5. Question & problem generation and formulation: Learn how to formulate cogent and visionary questions (including to yourself), and give feedback based on moments of dialectic and their associated thought forms.
  6. Critical facilitation: Learn how to assist others in understanding what they are un-reflectedly saying, thinking, or intending.
Acquiring these six, mutually supportive capabilities takes time and patience with oneself and others. It goes far beyond ‘skill training’ since participants need to engage in revolutionizing their listening, way of thinking, structure of self-reflection, and attention to others’ mental process, — something that logical thinkers for whom the real world is “out there” (not “in here”) are not accustomed to.

RN-2.1.3 Reframing the dialectical abstraction of the SIAR model
Situation Input Actions Results, SIAR lean structured processing
Sense - Interpret - Act - Reflect
What is not done: replacing SIAR with DTF labels. Instead, think of this as SIAR with its cognitive mechanics exposed.
SIAR | Plain wording | Dominant DTF move
Sense | Situate the situation | Contextualization (C)
Interpret | Structure meaning | Relational integration (R)
Act | Intervene in process | Process engagement (P)
Reflect | Reframe the system | Transformation (T)

Important: SIAR traverses C ➡ R ➡ P ➡ T in every cycle, that is full dialectical movement, not partial.
What does change: This makes SIAR robust under complexity. SIAR is a complete dialectical cycle in practice: it situates contexts, structures relations, intervenes in processes, and transforms frames — whether or not this is made explicit.
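The SIAR traversal described above can be made concrete with a small sketch. All function names and the toy "world" are hypothetical illustrations; the only claim taken from the text is that one cycle passes through the four DTF moves C ➡ R ➡ P ➡ T:

```python
# Illustrative sketch (not from the source): one SIAR pass expressed as
# the four DTF moves it traverses. Function names are hypothetical.
def sense(world):        # C: situate the situation (contextualization)
    return {"context": world}

def interpret(framed):   # R: structure meaning (relational integration)
    return {**framed, "relations": sorted(framed["context"])}

def act(model):          # P: intervene in the process (process engagement)
    return {**model, "action": f"adjust {model['relations'][0]}"}

def reflect(outcome):    # T: reframe the system (transformation)
    return {**outcome, "reframed": True}

def siar_cycle(world):
    # Every cycle traverses C -> R -> P -> T: full dialectical movement.
    return reflect(act(interpret(sense(world))))

result = siar_cycle(["supply", "demand"])
print(result["action"], result["reframed"])  # -> adjust demand True
```

The composition `reflect(act(interpret(sense(...))))` is the whole point: none of the four moves can be skipped without breaking the cycle.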

RN-2.1.4 Underpinning the repeating pattern of ~6 distinctions
Autonomous repetition of a limited number of distinctions
A dimension is only "adequately articulated" when it can express internal tension, differentiation, and integration without self-contradiction. The minimal set of moments needed for that already gets you to 6. The 7th is not additive; it is closure at a higher level.
This convergence shows up again and again:
Too few (≤4) | Optimal (6–7) | Too many (≥9)
Oversimplified | Expressive | Redundant
No contradiction | Tension + resolution | Loss of salience
Binary thinking | Dialectical movement | Semantic inflation

This is why:
quadrants with 3–4 categories feel "framework-ish",
quadrants with 10+ feel encyclopedic,
quadrants with 6–7 feel complete but navigable.
Starting from my own framework design, I ended up with ~6–7 stable distinctions per dimension, repeatedly, across roles, scales, and contexts.
That is not coincidence. It is a sign that I am working against the same cognitive constraints that DTF formalizes. The "6–7 distinctions per dimension" rule is not a design choice but an empirically and dialectically grounded minimum required for stable, non-redundant articulation of complex meaning.
DTF does not say: “There are exactly 7 distinctions because the theory says so.”
DTF says: “At least ~7 are needed before thinking becomes dialectically mobile in that dimension.”
https://www.researchgate.net/publication/320328743_Human_Developmental_Processes_as_Key_to_Creating_Impactful_Leadership
Historical sources for the limited number of distinctions
The question is not a citation chain but a structural genealogy: how the same necessity for articulated distinctions reappears as theories of mind mature. The aim is to trace it explicitly and conservatively, showing what is inherited, what is transformed, and why the 6–7 pattern keeps re-emerging.
Core move: Hegel does not enumerate categories arbitrarily. He shows that thinking generates distinctions until contradiction stabilizes.
Key structure (Logic): Hegel’s dialectic unfolds through triadic movement, but stability requires more than three moments.
Across Being ➡ Essence ➡ Concept we see:
Level Function
1 Immediate Undifferentiated unity
2 Negation Differentiation
3 Mediation Relation
4 Opposition Tension
5 Contradiction Instability
6 Sublation Reorganization
7 Totality Integration

That is 7 functional moments, though Hegel never lists them as such. Crucial point: Hegel discovers that thought must differentiate, but cannot differentiate endlessly, because coherence collapses.
Dialectic stabilizes when all necessary moments are present. Piaget takes Hegel out of metaphysics and into empirical cognition. Piaget's key shift: from categories of being ➡ operations of knowing.
Formal operational thinking: Piaget identifies coordinated operations, not facts:
Operator Function
1 Reversibility Undoing
2 Conservation Invariance
3 Compensation Balance
4 Composition Combining
5 Negation Differentiation
6 Reciprocity Mutuality

These form closed operational systems. Piaget repeatedly finds: fewer operators ➡ unstable reasoning; more ➡ redundancy, no new power.
👉🏾 Operational systems stabilize at ~6 coordinated operators. Explicit inheritance from Hegel: dialectic becomes equilibration, contradiction becomes cognitive disequilibrium, sublation becomes re-equilibration.
The same constraint appears, now empirically grounded. Jaques applies Piagetian operations to work, time, and organizations.
Jaques' contribution: he discovers that roles require specific levels of cognitive integration, and that integration happens in discrete strata, the strata of Mental Processing.
Each stratum requires mastery of a bounded set of distinctions:
Stratum Cognitive capacity
1 Declarative Facts
2 Procedural Processes
3 Serial Sequences
4 Parallel Systems
5 Meta-systemic Systems of systems
6 Dialectical Contradiction
7 Transformational Re-framing identity

Jaques never formalizes "7" as a rule, but: below ~6 ➡ roles fail; above ~7 ➡ no additional explanatory power.
The critical move: Jaques ties cognitive differentiation to accountability, time-span, and organizational viability.
Dialectical capacity becomes an operational necessity. Laske makes the latent structure explicit; his synthesis integrates Hegelian dialectic, Piagetian operations, Jaques' strata, and adult development research.
DTF structure in four classes, each with 7 thought forms:
Class Function
1 Context Framing
2 Process Change
3 Relationship Interaction
4 Transformation Re-organization

Why 7? Because Laske empirically finds that each class requires: 1 Differentiation, 2 Opposition, 3 Mediation, 4 Integration, 5 Boundary recognition, 6 Tension handling, 7 Re-framing.
👉🏾 These are Hegelian moments, operationalized.
The through-line (compressed):
Thinker Contribution What stays invariant
Hegel Dialectic of concepts Necessary moments
Piaget Dialectic of cognition Operational closure
Jaques Dialectic of work Functional sufficiency
Laske Dialectic of thought Explicit minimal set

What persists is not the number but the necessity of a bounded set. From Hegel’s Logic through Piaget’s operations, Jaques’ strata, and Laske’s DTFs, the recurrence of approximately six to seven distinctions per dimension reflects a deep structural constraint of dialectical cognition: it is the minimum articulation required to hold difference, tension, and integration simultaneously without conceptual collapse.
6–7 appears because:
👉🏾 That is the smallest number of distinctions that allows contradiction, mediation, and integration without collapse or redundancy.
Laske is the first to state the constraint explicitly
😲 A nice explanation, but hard to verify by citations; it is explicitly stated nowhere.

feel order

RN-2.2 A new path in thinking - reflections

This, again, is a different path on information processing, supporting governance and informed, understandable decisions. It all started with an assumption of certainty for knowledge management and collective intelligence. Decisions, however, are made under assumptions and uncertainty.

RN-2.2.1 Info
Culture internal external

Testing Zarf Jabes Jabsa against upper and lower bounds
Lower bound (≈ under-articulation), A dimension hits the lower bound when: Symptom: “I kind of get it, but I don’t know what to do.”
Upper bound (≈ over-articulation), a dimension hits the upper bound when: 👉 Symptom: “This is rich, but I’m lost.”

RN-2.2.2 Info
Culture internal external
Hofstede's cultural dimensions theory is a framework for cross-cultural psychology, developed by Geert Hofstede. It shows the effects of a society's culture on the values of its members, and how these values relate to behavior, using a structure derived from factor analysis.
Hofstede's original 4 dimensions (1980s) were later expanded to 6; added were Long-Term vs. Short-Term Orientation and Indulgence vs. Constraint. Thinking of Hofstede in 4 classes where there are 6 creates a tension between the classic fourfold framing (still widely cited in management discussions) and the full six-dimensional model (more academically complete). Re-framing Hofstede's set of dimensions by swapping one of the "classic four" (Power Distance) with Long-Term vs. Short-Term Orientation, and then treating Indulgence vs. Constraint and Power Distance as external cultural forces, gives a hybrid model where the internal set is four and the external set is two.
This restructuring does something interesting: it internalizes adaptive learning and values, making them the "operational" cultural levers inside teams (four internal), and it externalizes structural and societal constraints, treating them as boundary conditions that shape but do not directly drive team dynamics. That is a neat systems-thinking move: distinguishing between cultural drivers that can be shifted through knowledge sharing and governance, versus macro-forces that set the stage but are harder to change directly. This aligns with the broader interest in semantic governance overlays, effectively creating a layered model where internal dimensions are "governable" and external ones are "contextual constraints."
Dimension | Focus | Governance Implication
Internal (Governable)
1 Individualism vs. Collectivism | Self vs. group orientation | Balance team incentives between personal accountability and collective outcomes
3 Uncertainty Avoidance | Comfort with ambiguity | Adjust processes: high avoidance ➡ clear rules; low avoidance ➡ flexible experimentation
4 Masculinity vs. Femininity | Competition vs. cooperation | Align leadership style: assertive, goal-driven vs. relational, quality-of-life emphasis
5 Long-Term vs. Short-Term Orientation | Future pragmatism vs. tradition/immediacy | Shape strategy: invest in innovation cycles vs. emphasize quick wins and heritage
External (Contextual)
0 Power Distance | Acceptance of hierarchy | Account for structural limits: flat vs. hierarchical authority patterns in organizations
6 Indulgence vs. Constraint | Freedom vs. restraint | Recognize societal norms: openness to leisure vs. strict codes of conduct

This creates a 4+2 model: four internal drivers for operational culture, two external forces that shape the environment. It distinguishes between what governance can actively modulate versus what governance must respect and adapt to. It also makes dashboards more actionable, since leaders can see which dimensions they can influence internally and which ones they must design around.
Subjective values are adaptive levers for governance, while objective values are boundary conditions that shape but do not yield easily to intervention. Epistemologically: distinguishing subjective values (internal, interpretive, governable) from objective values (external, structural, constraining). This aligns with business intelligence closed loops, where uncertainty is not a flaw; it is a signal.
Uncertainty Avoidance, in particular, becomes a governance dial: high avoidance → tight loops, low tolerance for ambiguity; low avoidance → open loops, exploratory learning.
Dimension | Focus | Governance Implication
Subjective
1 Individualism vs. Collectivism | Align incentives and team structures | Reveals motivational asymmetries in decision loops
3 Uncertainty Avoidance | Design process flexibility and risk tolerance | Injects adaptive tension into closed loops: uncertainty becomes a learning input
4 Masculinity vs. Femininity | Shape leadership tone and performance metrics | Surfaces value conflicts in goal-setting and feedback
5 Long-Term vs. Short-Term Orientation | Set strategic horizons and innovation cadence | Modulates loop frequency and depth of insight capture
Objective
0 Power Distance | Respect structural hierarchy and authority norms | Defines access boundaries and escalation paths in BI systems
6 Indulgence vs. Constraint | Acknowledge societal norms and behavioral latitude | Frames behavioral data interpretation and ethical thresholds

Subjective values: Internally held, interpretive, and governable through dialogue, incentives, and learning. They vary across individuals and can be shifted through team dynamics and feedback loops.
Subjective values are loop-sensitive: they affect how feedback is interpreted, how decisions are made, and how learning occurs. Objective values: Structurally embedded, externally imposed, and less governable. They reflect societal norms, institutional structures, or inherited constraints that shape behavior but resist direct modulation.
Objective values are loop-bounding: they define what feedback is allowed, who can act on it, and what constraints shape the loop’s operation.
Loop Stage | Subjective Values Influence | Objective Values Constraint
Data Capture | Individualism vs. Collectivism: shapes what data is noticed (self vs. group signals). | Power Distance: defines who is allowed to collect or access data.
Interpretation | Uncertainty Avoidance: governs tolerance for ambiguity in analysis. | Indulgence vs. Constraint: frames acceptable narratives (open vs. restrained meaning).
Decision | Masculinity vs. Femininity: biases toward competitive vs. cooperative choices. | Power Distance: constrains who has authority to decide.
Action | Long- vs. Short-Term Orientation: sets horizon for implementation (quick wins vs. long cycles). | Indulgence vs. Constraint: limits behavioral latitude in execution.
Feedback | All subjective values: modulate how lessons are internalized and adapted. | Objective values: bound how feedback can be expressed or escalated.

In BI loops, uncertainty isn’t noise — it’s the adaptive signal. High Uncertainty Avoidance → closed loops tighten, feedback is filtered, risk is minimized. Low Uncertainty Avoidance → loops stay open, feedback is exploratory, innovation thrives. Thus, uncertainty avoidance is the governance dial that determines whether loops become rigid control systems or adaptive learning systems.
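The 4+2 split and the governance dial can be sketched in a few lines of code. The classification of dimensions follows the tables above; the numeric threshold (50) and the loop-mode labels are illustrative assumptions only, not from the source:

```python
# Hedged sketch of the 4+2 model and the uncertainty-avoidance dial.
# Assumptions (not from the source): a 0-100 score scale, a threshold
# of 50, and the "tight/control" vs "open/learning" labels.
GOVERNABLE = {"individualism", "uncertainty_avoidance",
              "masculinity", "long_term_orientation"}   # internal drivers
CONTEXTUAL = {"power_distance", "indulgence"}           # external forces

def is_governable(dimension: str) -> bool:
    """Internal dimensions can be modulated; external ones are constraints."""
    return dimension in GOVERNABLE

def loop_mode(uncertainty_avoidance: int) -> str:
    """High avoidance -> tight, filtered loops; low -> open, exploratory."""
    return "tight/control" if uncertainty_avoidance >= 50 else "open/learning"

print(loop_mode(80), loop_mode(30))     # -> tight/control open/learning
print(is_governable("power_distance"))  # -> False (contextual constraint)
```

The design choice mirrored here is that governance only gets a dial for the four internal dimensions; the two external ones can only be detected and designed around.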
The Danaher Business System (DBS), developed by Mark DeLuzio, is a comprehensive Lean-based operating model that transformed Danaher Corporation into one of the most successful industrial conglomerates in the world. It integrates strategy deployment, continuous improvement, and cultural alignment into a unified system for operational excellence.
Element Description
Lean foundation Built on Toyota Production System principles, emphasizing waste elimination, flow, and value creation.
Policy Deployment (Hoshin Kanri) Strategic alignment tool that cascades goals from top leadership to frontline teams.
Kaizen culture Continuous improvement through structured problem-solving and employee engagement.
Visual management Dashboards, metrics boards, and process visibility tools to drive accountability and transparency.
Standard work Codified best practices for consistency, training, and performance measurement.
Lean accounting Developed by DeLuzio to align financial systems with Lean operations — focusing on value streams rather than traditional cost centers.

Mark DeLuzio’s Role and Philosophy Architect of DBS: As VP of DBS, DeLuzio led its global deployment and helped Danaher become a benchmark for Lean transformation. Lean Accounting Pioneer: He introduced the first Lean accounting system in the U.S. at Danaher’s Jake Brake Division. Strategic Integrator: DeLuzio emphasized that Lean must be tied to business strategy — not just operational tools. Respect for People: A core tenet of DBS, ensuring that transformation is sustainable and human-centric.
Activity Description
Eliminating waste in accounting processes Traditional month-end closes and cost allocations often involved redundant steps. Lean Accounting applies value-stream mapping to streamline closing cycles, freeing finance teams to focus on strategic analysis.
Value-stream based reporting Instead of tracking costs by departments, Lean Accounting organizes them by value streams: the end-to-end activities that deliver customer value. This provides clearer insight into profitability tied to actual products or services.
Real-time decision support Lean Accounting emphasizes timely, actionable data rather than lagging reports. This enables leaders to make faster, more informed investment and governance decisions.
Continuous improvement in finance Just as Lean manufacturing fosters kaizen, Lean Accounting embeds continuous improvement into financial governance, ensuring reporting evolves with operational needs.
Integration with agile governance Lean financial governance adapts investment tracking to modern delivery methods (agile, hybrid, waterfall), ensuring funding and prioritization align with how initiatives are actually executed.
Transparency and cultural alignment: By eliminating complex cost allocations and focusing on value creation, Lean Accounting fosters a culture of openness and accountability across departments
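The contrast between department-based and value-stream-based cost reporting can be sketched in a few lines. The cost records, department names, and value streams below are hypothetical; a real Lean accounting system tracks far richer data than a single cost figure:

```python
from collections import defaultdict

# Hypothetical cost records: each entry carries both the traditional
# department label and the value stream it actually supports.
cost_records = [
    {"department": "Machining", "value_stream": "Brake Systems", "cost": 1200.0},
    {"department": "Assembly",  "value_stream": "Brake Systems", "cost": 800.0},
    {"department": "Machining", "value_stream": "Engine Retarders", "cost": 950.0},
    {"department": "Finance",   "value_stream": "Engine Retarders", "cost": 150.0},
]

def costs_by(records, key):
    """Aggregate costs under the chosen reporting dimension."""
    totals = defaultdict(float)
    for record in records:
        totals[record[key]] += record["cost"]
    return dict(totals)

# Traditional view: costs per department (cost centers).
by_department = costs_by(cost_records, "department")
# Lean view: the same costs, re-cut along end-to-end value streams.
by_value_stream = costs_by(cost_records, "value_stream")
```

The totals are identical either way; only the cut changes, which is exactly why the value-stream view links profitability to actual products and services.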

Why This Matters for Governance
Traditional accounting often obscured the link between operations and financial outcomes. Lean Accounting reshaped governance by:
  1. Making financial metrics operationally relevant.
  2. Aligning investment decisions with customer value creation.
  3. Enabling adaptive governance models that support agile and Lean transformations.
This is why companies like Danaher, GE, and others used Lean Accounting as a cornerstone of their governance systems: it provided clarity, speed, and alignment between finance and operations.
RN-2.2.3 Info
Culture internal external

RN-2.2.4 Info
Culture internal external


RN-2.3 Purposeful usage of dialectical thoughts

This different path on information processing, supporting governance and informed understandable decisions, requires a more detailed scope and boundaries to make it practical. The four areas for better understanding are:

RN-2.3.1 Relationship dialects in a practical setting
Starting with understanding "the problem"
"So you want to define 'the problem'" (LI: John Cutler 2025). The full page is at: The Beautiful Mess, TBM 396. In product, we’re told to "define the problem."
I’ve always felt that this is hubris, at least with anything beyond fairly contained situations. “Go talk to customers, and figure out what the problem is!” Ultimately, as product builders or interveners, we may choose to take a shot at “solving the problem” with the tools at our disposal. So I guess my intent with this graphic is to get people thinking at multiple levels. This is not a root cause model.
This approach is in line with dialectical thinking.
The setting is the problem definition: sensing what the intention is, context (C), with the goal of being able to act on processes (P) by using relationship (R) thoughtforms.

Distinctive capabilities in problem understanding
“Define the problem” is often hubris in complex situations and there is no single privileged problem definition. The goal should be to act more thoughtfully by looking at the situation from multiple angles.
  1. Start with how the customer describes the problem in their own words and suspend judgment
    It is their mental model of the problem. This is their story, not ours, no matter how strange it might sound, or how strongly we might feel they are wrong or missing the point. Even if the framing is misguided, it is still the belief system and narrative currently organizing their understanding of the situation. If anything is going to change, it is this story and its explanatory power that will ultimately need to be replaced by something more compelling.
  2. Look at how other people around them experience the same situation and notice bias and false consensus.
    Here we explicitly acknowledge that how one person sees or feels the problem is just one take on the situation. People often inflict their framing of the problem onto others, intentionally or not.
  3. Examine the system forces shaping behavior including incentives norms tools power and constraints
    Shifts focus to the environment and the forces acting on people within it. We intentionally look at the system through multiple lenses, including human factors, behavioral psychology, learning design, social practice theory, anthropology, power, and politics. The aim is not to find a single cause, but to understand how the system shapes what feels normal, risky, effortful, possible, etc.
  4. Integrate perspectives with history and prior attempts and treat past fixes as useful data.
    This is where we start integrating. We take the actors from Layers 1 and 2 and the forces identified in Layer 3, and we add history. What has already been tried? What workarounds exist? What has failed, partially worked, or succeeded to much fanfare?!
    We begin restating the problem through this richer lens, knowing full well that we are now converging and imposing a perspective, whether it turns out to be right or wrong.
  5. Consider how your product or expertise could realistically influence these dynamics without selling.
    We consider our product, expertise, or technology, and how it might influence the situation. Not how it will. Not how it should. But how it could, in theory, intersect with the dynamics we now understand.
    The issue is one of opportunity, can we reduce friction or create new pathways? If it is capability, can we scaffold learning or decision-making? If it is motivation, can we alter incentives, visibility, or feedback loops? This is hypothesis-building, not pitching.
  6. Decide what can be influenced now what capabilities are missing and what small actions are feasible
    Back to reality, informed by everything we have learned so far. Our understanding of what is possible is shaped by the stories we heard, the perspectives surfaced, the system forces examined, and the history uncovered (Layers 1-4).
    This is where we move from understanding to action. What can we realistically influence today? What levers are actually within reach? Here we form concrete, feasible actions for how we might intervene in the situation. We ask what we can try, not in theory, but in practice. What capabilities would we need to borrow, buy, or build to support those interventions? These choices cannot be made in isolation. They must cohere with prior efforts, align with the incentives and constraints already at play, fit the needs and beliefs of the actors involved, and still connect back to the problem as it was originally described, even if that description now feels distant from where we believe the strongest leverage exists.
The aim is better judgment and leverage not a perfect explanation.
Stratum Cognitive capacity
1 Customer's mental model/ stated problem What problem does the customer say they have, in their own words?
2 Ecosystem view. Other actors perspective How do other actors in the customer’s environment interpret or feel the impact of this problem?
3 Human Factors and Behavioral Dynamics What frictions, incentives, norms, habits, or power dynamics are blocking or reinforcing current behaviors?
4 Restated Problem with status quo attempts When we integrate these views and factors, what is the “real problem” — and why have existing fixes or workarounds failed?
5 Enabling overlap with product/technology How does our product, expertise, or technology directly address these dynamics and create better conditions?
6 Feasible influence & Needed Capabilities What can we realistically influence today, and what additional capabilities would be needed to expand that influence?
7 Transformational Re-framing the chosen solution

This is very generic, but it gives us a starting point whenever there is "a problem".
RN-2.3.2 Context dialects in a practical setting
The Logical Thinking Process: A Systems Approach to Complex Problem Solving, a review by Chet Richards (2007).
The thinking processes in Eliyahu M. Goldratt's theory of constraints are the five methods to enable the focused improvement of any cognitive system (especially business systems). ... Some observers note that these processes are not fundamentally very different from some other management change models such as PDCA "plan–do–check–act" (aka "plan–do–study–act") or "survey–assess–decide–implement–evaluate", but the way they can be used is clearer and more straightforward.
Dettmer begins the chapter by sketching the basic principles of human behavior, but there’s a limit to what he can do in a couple of dozen pages or so. People do get Ph.D.s in this subject. So regard it as more of a brief survey of the field for those lab rats from the engineering school who skipped the Psych electives.
Then he does a very unusual thing for a technical text. He introduces John Boyd’s “Principles of the Blitzkrieg” (POB) as a way to get competence and full commitment, “even if you’re not there to guide or direct them” (p. 8-11). Which means that people have to take the initiative to seek out and solve problems, using the common GTOC framework to harmonize their efforts. In that sense, the POB can be considered as a powerful doctrine for connecting leaders and subordinates for implementing change. As people who have read my own book, Certain to Win (kindly cited by Dettmer, by the way) are aware, these principles underlie both the Toyota Way and modern USMC maneuver warfare doctrine, so there is good evidence that they will do exactly what Dettmer claims.
Dettmer has made an important contribution to competitive strategy by writing what is, as far as I know, the first book to unify and demonstrate the power of both GTOC and the OODA loop. Operating together, they are going to be very, very hard to beat.
The Illusion of Certainty (LI: Eli Schragenheim Bill Dettmer 2025)

When there is no way to delay a decision, the clear choice is to choose the course that seems safer, regardless of the potential gain that might have been achieved. In other words, when evaluating new initiatives and business opportunities, the personal fear of negatives results, including those with very limited real damage to the organization, often produces too conservative a strategy. Ironically, this might actually open the door to new threats to the organization.
Organizations must plan for long-term as well as short-term objectives. However, uncertainty often permeates every detail in the plan, forcing the employees in charge of the execution to re-evaluate the situation and introduce changes. By confronting uncertainty, both during planning and execution, the odds of achieving all, or most, of the key objectives of the original plan increase substantially.
Living with uncertainty can create fear and tension. This can drive people to a couple of behaviors that can result in considerable "unpleasantness." When managers, executives, and even lower-level supervisors assess the organizational decisions they must make, they have two very different concerns. First, how will the decision affect the performance of the organization? And second, how will the decision be judged within the organization, based on subsequent results?
Actually, in most real-world cases the net impact of a particular move on the bottom line is not straightforward. In fact, determining the net contribution of just one decision, when so many other factors influenced the outcome, is open to debate, and to manipulation. It's easy to see this kind of after-the-fact judgment as unfair criticism, especially when it ignores the uncertainty at the time the decision was made.
In most organizations leaders evaluate the performance of individual employees, including managers and executives. This practice is deeply embedded within the underlying culture of most organizations. What motivates this need for personal assessment? It's that the system needs to identify those who don't perform acceptably, as well as those who excel. In order to assess personal performance, management typically defines specific “targets” that employees are expected to achieve. The use of such personal performance measurements motivates employees to set targets low enough so that, even in the face of situational variation, they'll be confident that they can meet them. In practice, this means that while targets are met most of the time, they are only seldom outperformed, lest top management set higher targets. (Today's exceptional performance becomes tomorrow's standard.)
In practice, this culture of distrust and judgment-after-the-fact produces an organizational tendency to ignore uncertainty. Why? Because it becomes difficult, if not impossible, to judge how good (or lackluster) an employee's true performance is.

Uncertainties in managing flows


A typical example of ignoring uncertainty is widespread reliance on single-number discrete forecasts of future sales. Any rational forecast should include not just the quantitative average (a single number), but also a reasonable deviation from that number. The fact that most organizations use just single-number forecasts is evidence of the illusion of certainty.
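The point can be sketched with a minimal example. The sales history below is invented; the single number is the mean, and one standard deviation is just one reasonable choice for the deviation band:

```python
import statistics

# Hypothetical monthly sales history (units sold).
history = [102, 98, 110, 95, 105, 99, 108, 101]

# The single-number forecast most organizations use: just the average.
point_forecast = statistics.mean(history)

# A more honest forecast: the average plus a reasonable deviation band,
# here +/- one sample standard deviation.
spread = statistics.stdev(history)
forecast_range = (point_forecast - spread, point_forecast + spread)
```

Reporting the range instead of only the point forecast makes the uncertainty explicit and discourages judging execution against a number that was never certain.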
Organizations typically plan for long-term objectives as well as for the short-term. A plan requires many individual decisions regarding different stages, inputs or ingredients. All such decisions together are expected to lead to the achievement of the objective. But uncertainty typically crops up in the execution of every detail in the plan. This forces the employees in charge of the execution to re-evaluate the situation and introduce changes, which may well impact the timeliness and quality of the desired objective.
What motivates people to make the decisions that they do? Many readers will be familiar with Abraham Maslow's hierarchy of needs. Maslow theorized that humans have needs that they strive to satisfy. Further, Maslow suggested that it's unsatisfied needs that motivate people to action. Maslow also suggested that human needs are hierarchical. This means that satisfying needs lower in the hierarchy pyramid captures a person's attention until they are largely (though not necessarily completely) satisfied. At that point, these lower-level needs become less of a motivator than unsatisfied higher-level needs. The person in question will then bend most of his or her efforts to fulfilling those needs.
RN-2.3.3 Process dialects in a practical setting
knowledge management - the processing
What is missing is AI literacy; there is an AI coalition, similar to the blockchain coalition, causing more noise and confusion than understanding. So let us break it down.
AI literacy Cognitive capacity
1 AI is a generic noun for new technology Used for all kinds of things machines can do in processes using technology.
2 LLM large language models are for text Using text/speech as communication, an LLM is not a better calculator or a STEM tool; it is probabilistic over text, and there is a lot of good, accessible text around.
3 ML machine learning (big data based) ML is very good at supervised learning, generating better aid in decisions. It is probabilistic, so there is a need to understand and manage uncertainty in results. This is quite different from basic simple algorithms that use formulas for the only possible correct outcome.
4 Dedicated bound domain AI usage Dedicated domains are those that started with learning chess and go, and extended to STEM usage in recognizing special patterns: ANPR cameras, reading text from scans, face recognition, fingerprint recognition, movement analysis in sports, etc. There is a sound theoretical model behind those patterns on which the analysis is built. Optical character recognition (OCR) is no longer seen as AI, but it is.
5 Defining dedicated domains Enabling overlap with product/technology From a sound theoretical model it is possible to start with better reasoning. There is a need for a well-defined design theory; that missing design theory is where the gap is now. Training an LLM won't be very practical; it will miss the boundaries and context for what is really needed. These must be set by design in AI for that defined scope.
6 AI generating the code for the design Having a well-defined design for the minimum that is practically needed, the next challenge is the transformation into programming languages appropriate for the job. That last part is not really new: were the language Cobol, there were products in the 90's trying to do that, e.g. CoolGen. This is a signal that we need a generic design/knowledge system to prevent technology lock-in for generating code. It also signals that the resulting code should be based on understandable, proven patterns, while keeping options for extending into adjusted new patterns that do the job better; here too technology lock-in must be prevented. There is nothing really new in this: there was a time of standardizing metadata-to-code generation using predefined standard patterns. https://en.wikipedia.org/wiki/Common_warehouse_metamodel (CWM) https://www.omg.org/spec/ https://dmg.org
7 Transformational Re-framing the chosen solution
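Metadata-to-code generation with a predefined pattern can be illustrated with a minimal sketch, in the spirit of CWM-style standardization. The metadata record, field names, and template below are invented for this example:

```python
# A toy metadata record describing one entity. In CWM-style generation
# such records would come from a standardized metadata repository;
# this dict is a stand-in invented for the sketch.
entity = {
    "name": "Customer",
    "fields": [("customer_id", "int"), ("full_name", "str")],
}

# One predefined, understandable pattern: metadata -> Python dataclass.
TEMPLATE = "from dataclasses import dataclass\n\n@dataclass\nclass {name}:\n{body}\n"

def generate_dataclass(meta):
    """Render the standard pattern for one entity's metadata."""
    body = "\n".join(f"    {field}: {ftype}" for field, ftype in meta["fields"])
    return TEMPLATE.format(name=meta["name"], body=body)

source = generate_dataclass(entity)

# The generated pattern is itself runnable code.
namespace = {}
exec(source, namespace)
Customer = namespace["Customer"]
```

Because the pattern lives in a template rather than in a tool's internals, it can be inspected, proven, and extended, which is the lock-in-avoidance point made above.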

One additional important aspect is moving cyber-security and safety into these layers.
knowledge management - what is processed
DoD data strategy (2020) Problem Statement
DoD must accelerate its progress towards becoming a data-centric organization. DoD has lacked the enterprise data management to ensure that trusted, critical data is widely available to or accessible by mission commanders, warfighters, decision-makers, and mission partners in a real time, useable, secure, and linked manner. This limits data-driven decisions and insights, which hinders the execution of swift and appropriate action.
Additionally, DoD software and hardware systems must be designed, procured, tested, upgraded, operated, and sustained with data interoperability as a key requirement. All too often these gaps are bridged with unnecessary human-machine interfaces that introduce complexity, delay, and increased risk of error. This constrains the Department’s ability to operate against threats at machine speed across all domains.
DoD also must improve skills in data fields necessary for effective data management. The Department must broaden efforts to assess our current talent, recruit new data experts, and retain our developing force while establishing policies to ensure that data talent is cultivated. We must also spend the time to increase the data acumen resident across the workforce and find optimal ways to promote a culture of data awareness.
The Department leverages eight guiding principles to influence the goals, objectives, and essential capabilities in this strategy. These guiding principles are foundational to all data efforts within DoD.
... Conclusion: Data underpins digital modernization and is increasingly the fuel of every DoD process, algorithm, and weapon system. The DoD Data Strategy describes an ambitious approach for transforming the Department into a data-driven organization. This requires strong and effective data management coupled with close partnerships with users, particularly warfighters. Every leader must treat data as a weapon system, stewarding data throughout its lifecycle and ensuring it is made available to others. The Department must provide its personnel with the modern data skills and tools to preserve U.S. military advantage in day-to-day competition and ensure that they can prevail in conflict.
4 Essential Capabilities necessary to enable all goals:
Stratum Cognitive capacity
1 Architecture DoD architecture, enabled by enterprise cloud and other technologies, must allow pivoting on data more rapidly than adversaries are able to adapt.
2 Standards DoD employs a family of standards that include not only commonly recognized approaches for the management and utilization of data assets, but also proven and successful methods for representing and sharing data.
3 Governance DoD data governance provides the principles, policies, processes, frameworks, tools, metrics, and oversight required to effectively manage data at all levels, from creation to disposition.
4 Talent and Culture DoD workforce (Service Members, Civilians, and Contractors at every echelon) will be increasingly empowered to work with data, make data-informed decisions, create evidence-based policies, and implement effectual processes.

This resonates with the frame: the keywords processes, frameworks, tools, and metrics are bound to process (P) but are mentioned under governance.
7 Goals (aka VAULTIS) we must achieve to become data-centric; DoD data will be:
Goals information capability
1 Visible Consumers can locate the needed data.
2 Accessible Consumers can retrieve the data.
3 Understandable Consumers can find descriptions of data to recognize the content, context, and applicability.
4 Linked Consumers can exploit complementary data elements through innate relationships.
5 Trustworthy Consumers can be confident in all aspects of data for decision-making.
6 Secure Consumers know that data is protected from unauthorized use and manipulation.
7 Interoperable Consumers and producers have a common representation and comprehension of data.
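The seven VAULTIS goals could be tracked per dataset with a simple catalog record. The field names and scoring below are an illustrative sketch, not a DoD schema:

```python
from dataclasses import dataclass

# Hypothetical catalog record: one flag per VAULTIS goal.
# (Field names and the scoring method are illustrative only.)
@dataclass
class DatasetRecord:
    name: str
    visible: bool = False        # consumers can locate the data
    accessible: bool = False     # consumers can retrieve the data
    understandable: bool = False # content and context are described
    linked: bool = False         # complementary data is related
    trustworthy: bool = False    # quality and lineage are asserted
    secure: bool = False         # protected from unauthorized use
    interoperable: bool = False  # common representation is shared

    def vaultis_score(self) -> int:
        """Count how many of the seven goals this dataset currently meets."""
        return sum((self.visible, self.accessible, self.understandable,
                    self.linked, self.trustworthy, self.secure,
                    self.interoperable))
```

A record like this makes the goals measurable per dataset instead of leaving VAULTIS as an abstract slogan.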

Make Data Secure: As per the DoD Cyber Risk Reduction Strategy, protecting DoD data while at rest, in motion, and in use (within applications, with analytics, etc.) is a minimum barrier to entry for future combat and weapon systems. Using a disciplined approach to data protection, such as attribute-based access control, across the enterprise allows DoD to maximize the use of data while, at the same time, employing the most stringent security standards to protect the American people. DoD will know it has made progress toward making data secure when:
Objective information Safety
1 Platform access control Granular privilege management (identity, attributes, permissions, etc.) is implemented to govern the access to, use of, and disposition of data.
2 BIA&CIA PDCA cycle Data stewards regularly assess classification criteria and test compliance to prevent security issues resulting from data aggregation.
3 best/good practices DoD implements approved standards for security markings, handling restrictions, and records management.
4 retention policies Classification and control markings are defined and implemented; content and record retention rules are developed and implemented.
5 continuity, availability DoD implements data loss prevention technology to prevent unintended release and disclosure of data.
6 application access control Only authorized users are able to access and share data.
7 information integrity control Access and handling restriction metadata are bound to data in an immutable manner.
8 information confidentiality Access, use, and disposition of data are fully audited.
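Attribute-based access control can be sketched as a decision function over subject, resource, and action attributes rather than a role list. The clearance levels, rule set, and attribute names here are invented for illustration:

```python
# Minimal attribute-based access control (ABAC) sketch.
# Levels and rules are illustrative, not any real policy.
LEVELS = {"public": 0, "restricted": 1, "secret": 2}

def abac_permit(subject: dict, resource: dict, action: str) -> bool:
    # Rule 1: the subject's clearance must dominate the classification.
    if LEVELS[subject["clearance"]] < LEVELS[resource["classification"]]:
        return False
    # Rule 2: writes additionally require matching data stewardship.
    if action == "write" and subject["unit"] != resource["steward_unit"]:
        return False
    return True

analyst = {"clearance": "restricted", "unit": "J2"}
report = {"classification": "restricted", "steward_unit": "J6"}
```

The granular-privilege objective in the table above is exactly this: decisions computed from identity attributes and permissions per request, not from a static access list.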


RN-2.3.4 The transformational challenge activating change
The iron triangle The Architecture of Illusion (LI: A.Dooley 2025)
Some things are worth repeating. The term 'Iron Triangle' was coined in 1956 in relation to the legislative process in the USA. It has nothing to do with project management.
Element of the Iron Triangle
1 Low regulations, special favors
2 Funding & political support
3 Electoral support
4 Congressional support via lobby
5 Friendly legislation & oversight
6 Policy choices & execution
7 Realisations by transformation To add (missing)

The Barnes Triangle (more recently the Triple Constraint) was created by Dr. Martin Barnes in 1979. It has everything to do with project management.
The purpose of the triple constraint is to start a conversation about finding a balance between the three constraints that is acceptable to all parties. There is nothing about it that is cast in iron and inflexible.
Element of the Triple Constraint
1 Functionality
2 Time
3 Cost
4 Scope
5 Quality
6 Quantity
7 Realisations by transformation

Dooley triplet triangle

Why Enterprise Architecture is Dead The Architecture of Illusion (LI: Bree Hatchard 2025)
We are simply buying insurance policies against being blamed for the past.


RN-2.4 Becoming of identities transformational relations

In this dialectical path on information processing, supporting governance and informed understandable decisions, the identity of persons, groups of persons, and organisations will have to change. The classical hierarchical power over persons is outdated and has become a blocking factor.

RN-2.4.1 Communities of practice - collective intelligence
Alignment of the DTF Framework summary using an LLM. It is far beyond the personal human comfort zone, but helpful in finding references to trustworthy sources.
Wenger’s mature CoP theory (1998–2010) rests on four pillars, and on three learning modes: Engagement, Imagination, Alignment.
This already tells us something important: Wenger is not describing a social structure, he is describing a meaning-producing system over time. That places him squarely in dialectical territory, even if he never uses the word.
Book Review: Wenger repeatedly insists on tensions. These are not problems to solve; they are productive contradictions.

DTF Alignment to 6x6x and others
Reference-frame approach to systems thinking combining Lean principles, the Zachman Framework, and systemic complexity. It is not a conventional article. The idea is that to manage complexity, one must see multiple interdependent dimensions, not just a single linear process. It is meta-structural systems thinking, the same territory Laske calls dialectical. The Dialectical Thought Form Framework (DTF) is aimed at understanding and nurturing reasoning complexity: how people structure thought as they handle context, change, contradiction, and transformation. DTF has four categories, each containing 7 thought forms. Each class captures a way of thinking: from seeing events in relation to conditions, diagnosing interdependencies, and dealing with contradictions, to achieving integrative transformation.
SIAR-DTF 6x6 Theme (Systems/Lean/Zachman) Description
Sense - Context (C) Context framing & constraints Many parts of the page focus on systems boundaries, contexts for knowledge and roles. DTF C forms help analyze situating problems in context.
Act - Process (P) Value stream & iterative cycles (e.g., PDCA, SIAR) Lean emphasizes sequences, cycles, flow, stability, aligning with P’s focus on temporal and unfolding structures.
Interpret - Relationship (R) Interdependencies & roles within system subsystems The 6×6 cells and fractal structure metaphor highlight relations and co-dependencies, aligning with R’s structural focus.
Reflect - Transformation (T) Dualities & fractal integration (backend ↔ frontend) Here the document grapples with contradictions and integration across scales, which DTF’s T forms capture: the move toward meta-levels of meaning.

The “Reflect” phase is not: “Did it work?” It is: “What needs to be re-framed, repositioned, or re-architected?”
The 6*6 framework and DTF overlap structurally, not conceptually; they do different jobs:
  DTF describes how people think.
  The 6×6 / SIAR framing describes how systems should be designed and navigated.
What DTF has that this page does not aim to do (DTF is diagnostic):
  Assess individual cognitive development
  Distinguish developmental levels
  Score or profile reasoning complexity
But the structure of movement is the same.
What the 6*6 framework has that DTF does not (it is generative):
  Normative design intent
  Architectural completeness
  Operational guidance for enterprise/system design
They are complementary, not redundant. The SIAR 6*6 model operationalizes dialectical thinking at the system-design level, while DTF explicates the cognitive forms required to meaningfully operate such a model.
RN-2.4.2 Info
RN-2.4.3 Info
RN-2.4.4 Info

RN-2.5 Closing the loop using dialectical thinking

This different path on information processing, supporting governance and informed understandable decisions, uses reflection (R) intensively, although reflection is never mentioned as a dialectical thoughtform. Reflection is the closed loop that drives change & transformations, but there are challenges.

RN-2.5.1 DTF Alignment to the 6x6 reference frame
DTF Alignment to 6x6x reasoning
Lean cycles like PDCA/SIAR are about iterative improvement based on experience and evidence, which resonates with Process (P) and Transformation (T). The page stresses duality and dichotomy (e.g., engineering vs system change, frontend vs backend). The DTF key indicators (markers) present already place the page beyond Context-only (C) and Relationship-only (R) thinking; it consistently combines them. Dominant mapping of the 4 categories to the 6*6 reference:
What How Where Who When Which
Scope / Vision C C C R P C
Conceptual (meaning structures) R R C R P C
Logical (coherence & consistency) R P R R P R
Physical (realization) R P R R P P
Operational (running system) P P R R P P
Reflective / Lean / Learning T T T T T T

If you step back, a vertical gradient appears. This is exactly the developmental movement Laske describes: from situating → structuring → executing → transforming.
Where Transformation is structurally required (non-optional): three places cannot be worked without T-forms. This explains why many people understand the grid but cannot use it effectively: they lack T-capacity, not knowledge.
The 6×6 grid is a structural scaffold that implicitly demands increasing dialectical capacity as one moves downward and reflexively through it; DTF makes those demands explicit.

Alignment of the DTF Framework summary
In Laske’s sense, Transformation (T) is not “change over time”; that is Process (P). T-forms enable the key T-moves relevant to this framework. Keep those three in mind; they recur everywhere below.

Alignment of the DTF Framework summary
Zarf Jabes is giving meaning to the "Shape Systems Thinking: 6x6 Lean & Zachman Augmented Framework" page. The idea is that to manage complexity, one must see multiple interdependent dimensions, not just a single linear process; that is not descriptive (formal-logical) systems thinking but meta-structural systems thinking, the same territory Laske calls dialectical. The key indicators (DTF markers) present throughout already place it beyond Context-only (C) and Relationship-only (R) thinking; it consistently combines them. SIAR = Sense → Interpret → Act → Reflect. This is where the overlap becomes very concrete. SIAR is not just a learning cycle, not only PDCA with different labels, and not merely process optimization. Cognitively, SIAR is a recursive meaning-construction loop.
RN-2.5.2 Common pathologies in DTF completeness
Failure mode A: Grid treated as static classification
All 28 TFs are present: no gaps, no redundancies. That is not common. In DTF (Laske), the 28 TFs are structurally defined; examples here are schematic, not a full list. A framework “covers” a TF only if it forces the thinker to perform that operation. Naming something is not invoking a TF.
When people map rich frameworks (Zachman, VSM, Cynefin, SAFe, etc.) to DTF, the pattern is almost always the same: most frameworks are built from one of three starting points. This framework did not start in one place; it was built by iterative reconciliation of contradictions. That dialectical construction is exactly what DTF measures.

Failure modes from misunderstanding and wrong usage
Structural failure points without DTF T-forms: these are T-failures, not design flaws.
RN-2.5.3 Common struggles achieving DTF completeness
Typical struggle patterns mapped to DTF gaps
Below are real, repeatable failure modes, each explained by missing or underdeveloped thought forms.
Explanation, training, and tooling don’t fix it
Critically: you can explain the framework perfectly, provide examples, and add templates and canvases, and people will still struggle. Why? Because what the framework asks of its users are T-moves, not skills. People struggle with the framework not because it is unclear, but because it silently requires the ability to think in terms of emergence, contradiction, and frame change, capacities that are unevenly distributed developmentally. This is not a flaw. It is a signal.
What does help (without diluting the framework)? Importantly, you do not need to simplify the framework.
Instead, support dialectical access.

RN-2.5.4 The T-forms challenge activating change

Where T-forms are required in the 6×6 grid
Why many people struggle to use the 6×6 framework:
Any movement between rows requires T. Without T, rows become silos and alignment becomes negotiation instead of transformation.
Transition Why T is required
Scope → Conceptual Vision must be re-expressed as structure (T1)
Conceptual → Logical Meaning must be constrained into coherence (T4)
Logical → Physical Abstractions must collapse into reality (T4)
Operational → Reflective Experience must rewrite assumptions (T7)
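The transition table above can be encoded directly as data. A sketch: the T-form codes and descriptions are taken from the table, while the dictionary layout and function are assumptions for illustration:

```python
# Row transitions of the 6x6 grid and the T-form each one requires,
# encoded from the table above (T-form codes as given there).
ROW_TRANSITIONS = {
    ("Scope", "Conceptual"): ("T1", "Vision must be re-expressed as structure"),
    ("Conceptual", "Logical"): ("T4", "Meaning must be constrained into coherence"),
    ("Logical", "Physical"): ("T4", "Abstractions must collapse into reality"),
    ("Operational", "Reflective"): ("T7", "Experience must rewrite assumptions"),
}

def required_t_form(src, dst):
    """Look up which T-form a row transition demands, if listed."""
    return ROW_TRANSITIONS.get((src, dst))
```

Encoding the table this way makes the claim testable: any tooling that moves artifacts between rows can check whether the corresponding T-move was made explicit.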


Evaluation of the 6×6 reference framework
What this implies, and what it does not: cognitively complete frameworks are always experienced as “too much” by many users. That is not a defect; it is a signal.
Why I reacted at all (LLM): I see many complex-looking frameworks, but I almost never see one where the full range of thought forms is genuinely exercised. When that happens, it usually means the author has been forced by reality to think in all 28 ways, whether they knew the TFs or not.


BI life

RN-2.6 Evaluating system dialectical thinking

This different path for information processing is an emerging perspective for governance and for informed, understandable decisions. It started with a simplified understanding of a pull-push mindset, which became the SIAR model.

RN-2.6.1 From Knowledge to Graphs and Back Again
A difficult dialectical word: ontology
From Graphs Back to Meaning: Why Ontology Is Not a Phase in the Stack (LI: J.Bittner 2025) The Year of the Graph newsletter published "The Ontology Issue: From Knowledge to Graphs and Back Again." The instinct behind that piece is right. The field is finally confronting the limits of connectivity without meaning. But there is a category error we still need to correct.
Ontology is not something systems move away from and later rediscover. It is not a layer added once graphs get large enough or AI systems get sophisticated enough. Ontology is the discipline of meaning itself. Graphs scale connections. Ontologies constrain what those connections are allowed to mean.
That distinction is not academic. It has direct ROI implications. When meaning is left implicit, organizations pay for it later through brittle integrations, semantic drift, AI hallucinations, governance overhead, and endless rework. Ontology does not make systems faster on day one. It makes them stable under change. It enables axiomatic reasoning, early detection of semantic errors, and explainable conclusions grounded in logic rather than statistical plausibility. This week's "semantically speaking" argues that graphs remain essential, but meaning does not emerge from structure alone. Meaning comes from commitment. If your systems are scaling faster than their assumptions, this distinction matters.
An ontology is an explicit specification of a conceptualization which is, in turn, the objects, concepts, and other entities that are presumed to exist in some area of interest and the relationships that hold among them.
Ontology introduces the semantic foundation that connects people, processes, systems, actions, rules and data into a unified ontology [sic]. By binding real-world data to these ontologies, raw tables and events are elevated into rich business entities and relationships, giving people and AI a higher-level, structured view of the business to think, reason, and act with confidence.
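The distinction that graphs scale connections while ontologies constrain what those connections may mean can be made concrete with a tiny domain/range check. A minimal sketch; the class names, relation names, and dictionary layout are all invented for illustration:

```python
# Sketch: an ontology as domain/range constraints over graph edges.
# A plain graph accepts any triple; the ontology rejects triples whose
# meaning is not licensed. Class and relation names are illustrative.

ONTOLOGY = {
    # relation: (allowed subject class, allowed object class)
    "employs":   ("Organization", "Person"),
    "locatedIn": ("Organization", "City"),
}

TYPES = {
    "AcmeCorp": "Organization",
    "Alice": "Person",
    "Berlin": "City",
}

def admissible(subject, relation, obj):
    """A triple is admissible only if the ontology commits to its meaning."""
    constraint = ONTOLOGY.get(relation)
    if constraint is None:
        return False  # unknown relation: no committed meaning
    dom, rng = constraint
    return TYPES.get(subject) == dom and TYPES.get(obj) == rng
```

A plain graph store would happily accept both `("AcmeCorp", "employs", "Alice")` and `("Alice", "employs", "Berlin")`; only the first is admissible here. This is the "early detection of semantic errors" the text describes, in miniature.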
Just as you wouldn’t bring half your brain to work, enterprises shouldn’t bring half of artificial intelligence’s capabilities to their architectures. Neuro-symbolic AI combines neural-network technology like LLMs with symbolic technology like knowledge graphs. This integration, also known as ‘knowledge-driven AI’, delivers significant advantages. If you’re not exploring how knowledge graphs and symbolic AI can augment your organization’s intelligence (both artificial and actual), now is a good time to start.

RN-2.6.2 The agentic AI shift for aid at decisions
The broken system in decision making
Why Human-Centric Design Breaks in Agentic Systems — and What to Do Instead (LI: J.Lowgren 2025) Most teams still design like the human is always in charge. That worked when software was a tool in a human’s hand. It breaks when software is an actor with its own perception, its own objectives, and the right to act. The result is familiar: a chatbot that sounds empathetic but never escalates, a logistics optimiser that saves fuel and blows delivery windows, a fraud detector that performs well at baseline and collapses during a surge. None of that is a bug. It is design that started in the wrong place.
JLowgren_doublediamond.png
The Agentic Double Diamond begins with inversion: cognitive design from inside the agent’s world. It continues with authopy: system design that encodes data, activation, and governance so autonomy is trusted and traceable. At the centre sit roles and cognition: the explicit boundary between what agents do and what people must decide.
Teams that work this way waste less time apologising for their systems. They spend more time improving them. That is the difference between software that merely runs and software that behaves. That is the difference between pace and regret.
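The explicit boundary between what agents do and what people must decide can be sketched as a simple routing rule. Everything here (the mandate set, the confidence threshold, the log format) is an assumed illustration, not Lowgren's design:

```python
# Sketch: an agent decision router with an explicit escalation boundary.
# Actions inside the mandate, taken with enough confidence, run
# autonomously; everything else is escalated to a human. Every decision
# is logged so autonomy stays traceable. All values are illustrative.

AGENT_MANDATE = {"reroute_shipment", "flag_transaction"}  # assumed scope
CONFIDENCE_FLOOR = 0.85                                   # assumed threshold

def route_decision(action, confidence, audit_log):
    """Return who decides: the agent or a human."""
    if action in AGENT_MANDATE and confidence >= CONFIDENCE_FLOOR:
        decider = "agent"
    else:
        decider = "human"  # outside mandate or not confident: escalate
    audit_log.append((action, round(confidence, 2), decider))
    return decider
```

The design choice worth noting: escalation is structural (a whitelist plus a floor), not a plea in a prompt, so the boundary survives a confidence surge or an out-of-scope request.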

Agentic Governance: Making Intelligence Trustworthy (Zeaware - J.Lowgren 2025. Note: the form is hidden when strict tracking prevention is activated)
This is not a book about the present state of AI. It is about the threshold we have just crossed: the shift from automation to autonomy, from decision rules to decision flows, from governance as control to governance as coordination. The work ahead is not to restrain intelligence but to ensure it remains accountable as it learns, negotiates, and changes shape. The paradigm unfolds through three companion volumes, each viewing the same transformation from a different altitude. Together they form the Agentic Trilogy: a framework for building, governing, and evolving intelligent systems that can explain themselves, adapt responsibly, and sustain human intent at machine speed.
Key points:


RN-2.6.3 Reverting the intention into the opposite
How Every Disruptive Movement Hardens Into the Orthodoxy It Opposed
The Pattern That Keeps Repeating (LI: S.Wolpher 2025)
The Agile Manifesto followed Luther’s Reformation arc: radical simplicity hardened into scaling frameworks, transformation programs, and debates about what counts as “real Agile.” Learn to recognize when you’re inside the orthodoxy and how to practice the principles without the apparatus.
In 1517, Martin Luther nailed his 95 theses to a church door to protest the sale of salvation. The Catholic Church had turned faith into a transaction: Pay for indulgences, reduce your time in purgatory. Luther’s message was plain: You could be saved through faith alone, you didn’t need the church to interpret scripture for you, and every believer could approach God directly. By 1555, Lutheranism had its own hierarchy, orthodoxy, and ways of deciding who was in and who was out. In other words, the reformation became a church. Every disruptive movement tends to follow the same arc, and the Agile Manifesto is no exception.

The Agile Arc
Let us recap how we got here and map the pattern onto what we do: The Manifesto warned against the inversion: “Individuals and interactions over processes and tools.” The industry flipped it. Processes and tools became the product. Some say they came to do good and did well.
I’m part of this system. I teach Scrum classes, a node in the network that sustains the structure. If you’re reading this article, you’re probably somewhere in that network too.
That’s not an accusation. It’s an observation. We’re all inside the church now.

Why This Happens
A one-page manifesto doesn’t support an industry, but you can build one around frameworks, roles, artifacts, and events. (Complicated yet structured systems with a delivery promise are also easier to sell, budget, and measure than “trust your people; they will figure out how to do this.”)
Simplicity is bad for business. I know, nobody wants to hear that.

Can the Pattern Be Reversed?
At the industry level, this probably won’t be fixed. The incentives are entrenched. But at the team level? At the organization level? You can choose differently.
You can practice the principles without the apparatus. You can ask, “Does this help us solve customer problems?” instead of “Is this proper Scrum?” You can treat frameworks as tools, not religions.
Can you refuse to become a priest while working inside the church? I want to think so. I try to, and some days I do better than others.

How Every Disruptive Movement Hardens into a New Orthodoxy
The Myth of Early Buy-In for TPS (LI: K.Kohls 2025) This paper examines documented resistance to TPS during its formative years, the role of Taiichi Ohno in enforcing behavioral change prior to belief, and the implications for contemporary Continuous Improvement (CI) implementations. The evidence suggests that TPS did not succeed because of early buy-in or cultural alignment, but because leadership tolerated prolonged discomfort until new habits formed and results compelled belief.
  1. The myth of harmony by culture The Toyota Production System (TPS) is frequently portrayed as a harmonious, culture-driven system that emerged naturally from organizational values. This narrative obscures the historical reality. Primary and secondary sources reveal that it was introduced amid significant internal resistance, managerial conflict, and repeated challenges to its legitimacy.
  2. The Retrospective Fallacy of TPS From the perspective of frontline supervisors and middle managers, inventory functioned as psychological and political protection. Removing it threatened identity, status, and perceived competence. Resistance was therefore not irrational; it was adaptive within the existing reward structure.
  3. Conditions of Constraint Rather Than Enlightenment Existential challenges: limited capital, unstable demand, poor equipment reliability, and an inability to exploit economies of scale. These constraints forced Toyota to pursue alternatives to Western mass production models, not out of philosophical preference but out of necessity.
  4. Central Conflict: Visibility Versus Safety The Andon system, now widely cited as a symbol of “respect for people”, was initially experienced as a source of fear rather than empowerment. Supervisors, accustomed to being evaluated on output volume and equipment utilization, frequently discouraged Andon pulls, implicitly or explicitly. Psychological safety, therefore, was not a prerequisite for Andon; it was an outcome that emerged only after repeated cycles of visible problem resolution.
  5. Uneven Adoption and Internal Workarounds Historical studies demonstrate that TPS adoption was neither uniform nor immediate. Fujimoto’s longitudinal analysis shows that early TPS practices were localized, inconsistently applied, and often circumvented by managers seeking to preserve traditional performance metrics.
    Cusumano further documents periods during which TPS was questioned internally, particularly when short-term performance declined. In several instances, Toyota leadership faced pressure to revert to more conventional production approaches. TPS persisted not because it was universally accepted, but because senior leadership tolerated internal conflict long enough for operational advantages to become undeniable.
  6. Enforcement Before Understanding Steven Spear reframes TPS not as a cultural system but as a problem-exposing architecture that forces learning through repeated action. Importantly, Spear emphasizes that many TPS behaviors were enforced before they were fully understood or emotionally accepted.
    John Shook’s firsthand account corroborates this view, noting that Toyota managers learned TPS “by doing,” often experiencing frustration and discomfort before developing deeper understanding. Respect, in this framing, was earned through consistent support during failure, not granted through initial trust.
  7. Implications for Contemporary CI Implementations Modern CI efforts frequently fail for reasons that closely mirror early TPS resistance:
    • An expectation of buy-in prior to behavioral change
    • Aversion to short-term performance dips
    • Avoidance of discomfort in the name of engagement
    • Overreliance on persuasion rather than structural reinforcement
    The historical record suggests that TPS succeeded not by avoiding these dynamics, but by enduring them. Behavior preceded belief; habit preceded culture.
  8. This history carries a sobering implication: Organizations seeking TPS-like results without TPS-level tolerance for discomfort are attempting to reap outcomes without enduring the process that created them. Ohno’s legacy lies not in tool design alone, but in his willingness, and Toyota leadership’s tolerance, to sustain a system that made problems visible, challenged identities, and disrupted established norms long enough for new habits to form. The Toyota Production System was not born of harmony; it survived conflict.

RN-2.6.4 Safety distinctive dimensions operational practices
IMSAFE Checklist Acronym Explained
Ultimately, the safety of a flight is only as good as its weakest link. With a significant number of accidents caused by pilot error every year, pilots must ensure they are physically and mentally fit to fly. In aviation, safety is the first, second and third priority. That's one of the things I learned early during my pilot training, and it was repeated often. After obtaining my license, it is still a constant focus. The first thing on the checklist I use, before even driving to the airport, is I.M.S.A.F.E.: Illness, Medication, Stress, Alcohol, Fatigue, Emotion. If any of these raises a flag, I don't fly.
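The go/no-go logic of the checklist is all-or-nothing: a single raised flag grounds the flight. A minimal sketch; the six item names are the standard IMSAFE expansion, while the boolean interface is an assumption for illustration:

```python
# IMSAFE pre-flight self-check: if any item raises a flag, do not fly.
# The items are the standard acronym expansion; the dict interface is
# an illustrative choice, not an aviation standard.
IMSAFE_ITEMS = ["Illness", "Medication", "Stress", "Alcohol", "Fatigue", "Emotion"]

def fit_to_fly(flags):
    """flags maps an IMSAFE item to True when that item raises a concern.
    Returns (go, raised): go is False if any single item is flagged."""
    raised = [item for item in IMSAFE_ITEMS if flags.get(item, False)]
    return (len(raised) == 0, raised)
```

Note the weakest-link semantics: the function never weighs or averages the items, mirroring the text's point that one flag is enough to cancel.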
🎯 Know_npk Gestium Stravity Human-cap Evo-InfoAge Learn-@2 🎯
  
🚧  Knowium P&S-ISFlw P&S-ISMtr P&S-Pltfrm Fractals Learn-I 🚧
  
🔰 Contents Frame-ref ZarfTopo ZarfRegu SmartSystem ReLearn 🔰


RN-3 The three different time consolidation perspectives


RN-3.1 Data, gathering information on processes.

. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .
RN-3.1.1 Info
Turing thesis
History of management consulting (D. McKenna 1995): Congress passed the Glass-Steagall Banking Act of 1933 to correct the apparent structural problems and industry mistakes that contemporaries believed led to the stock market crash of October 1929 and the bank failures of the early 1930s.
The data explosion: the change is the amount of data we are collecting when measuring processes as new information (at the edge).

📚 Information questions.
⚙ measurements data figures.
🎭 What to do with new data?
⚖ legally & ethical acceptable?
BI life

There are three perspectives. The Mismatch Between Organisational Structure, Complexity and Information (LI: Abdul A. 2025): One of the reasons debates about structure become polarised is that we treat these patterns as mutually exclusive. In reality, most organisations use all three, often without realising it, and often incoherently.
dual feeling

RN-3.2 Data, gathering information on processes.

. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .
RN-3.2.1 Info

Existing systems that are hard to change
Construction:
Construction regulations for 2025 focus heavily on sustainability, safety, and digitalization. Key changes include stricter energy performance requirements, new Digital Product Passports (DPP) for materials in the EU, updated health & safety roles (such as registered safety managers), and a push for greener building methods (heat pumps, solar). In the UK, the Building Safety Levy and new protocols for remediation orders are emerging, while globally there is a trend towards clearer, faster permitting and greater accountability in construction. Note: regulations vary significantly by country.
Guide to the Construction Products Regulation (CPR): The CPR is a pivotal piece of EU legislation that sets standardized safety, performance, and environmental-impact requirements for construction products across the EU. Originally established in 2011 to streamline the circulation of construction products within the Single Market through standardized guidelines, the CPR was updated in 2024 to address modern environmental challenges, advancing sustainability and transparency in the construction sector.
Health:
CDISC: In July 2022, the FDA published, in Appendix D of their Technical Conformance Guide (TCG), a description of additional variables they want in a Subject Visits dataset. A dataset constructed to meet these requirements would depart from the standard, so validation software would create warnings and/or errors for the dataset. Such validation findings can be explained in PHUSE's Clinical Study Data Reviewer's Guide (cSDRG) Package.
PHUSE: The Global Healthcare Data Science Community, sharing ideas, tools and standards around data, statistical and reporting technologies. PHUSE Working Groups bring together volunteers from diverse stakeholders to collaborate on projects addressing key topics in data science and clinical research, with participation open to all.
 legal

RN-3.3 The three different time consolidation perspectives

. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .
RN-3.3.1 Info
phuse
idea lightbulb

RN-3.4 information on

. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .
RN-3.4.1 Info

 legal

RN-3.5 information on

. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .
RN-3.5.1 Info

 horse sense

RN-3.6 information on

. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .
RN-3.6.1 Info
https://www.linkedin.com/posts/alexanderbrueckmann_there-is-no-such-thing-as-strategic-planning-activity-7408839090462203904-To8K
RN-3.6.2 Creating new artInfo
RN-3.6.3 Becoming the opposite of what was intended


🎯 Know_npk Gestium Stravity Human-cap Evo-InfoAge Learn-@2 🎯
  
🚧  Knowium P&S-ISFlw P&S-ISMtr P&S-Pltfrm Fractals Learn-I 🚧
  
🔰 Contents Frame-ref ZarfTopo ZarfRegu SmartSystem ReLearn 🔰

© 2012,2020,2026 J.A.Karman
📚 data logic types Information Frames data tech flows 📚

🎭 Concerns & Indices Elucidation 👁 Summary Vitae 🎭