
From Information to Knowledge, Wisdom


🎭 Summary & Indices Elucidation 👁 Foreword Vitae 🎭

👐 C-Steer C-Serve C-Shape 👁 I-C6isr I-Jabes I-Know👐
👐 r-steer r-serve r-shape 👁 r-c6isr r-jabes r-know👐

🔰 Contents Mesh ABCs Control ALC-V* Polestar 🔰
  
🚧  Variety Act on Cyber Change ALC-V* Knowit 🚧
  
🎯 Algol Interact Tenets Change Volatile North 🎯


Q-1 Processes & Organisations


Q-1.1 Contents

Q-1.1.1 Global content
Technical mathematical transformations: devops bpm design, data devops math, devops data, devops meta. This is the most technical part. Being enablers, there should be no dependencies for business processes.
There are three subtopics:

The way to the business processes is via floor operations.
🔰 Too fast .. previous.
Zachman's 6 W's: there is no W for which technology.
🎭 Q-1.1.2 Guide reading this page
Break-up: Logic, Concept, Context
The words Logic, Concept, Context are from the Zachman framework.
Easier to understand: only three levels are grouped here, one group for the design plan and another for the realisations.
The six paragraphs for each of the three chapters are aligned with the 5W1H questions.
Knowledge needed to read this page
Basic knowledge: this page is just a knowledge area; there are no references to Jabes.
6w 1 how
Why is Jabes interesting?
Everybody is looking for a solution to manage the challenges of information processing.
As far as I know there is nothing on the market that solves those challenges holistically. There are many tools for detailed topics, but none covering all the interactions.

Q-1.1.3 Local content
Reference Squad Abbreviation
Q-1 Processes & Organisations
Q-1.1 Contents contents Contents
Q-1.1.1 Global content
Q-1.1.2 Guide reading this page
Q-1.1.3 Local content
Q-1.1.4 Progress
Q-1.2 Communication - Interactions knowpr_02 Mesh
Q-1.2.1 Processing Information
Q-1.2.2 Managing Processes
Q-1.2.3 Steer Shape Serve
Q-1.3 Historical evolvements knowpr_03 ABCs
Q-1.3.1 Optimization in the industrial era (I)
Q-1.3.2 Optimization in the 20th century (II)
Q-1.3.3 Optimization in the 20th century (III)
Q-1.4 Processing flows VSM - assembly lines knowpr_04 Control
Q-1.4.1 Control function - closed loop
Q-1.4.2 Co-ordinating function - Plant value stream
Q-1.4.3 Co-ordinating function - divide work
Q-1.4.4 Study of the tasks - the plant, value stream
Q-1.5 Processes Building Blocks Basics knowpr_05 ALC-V*
Q-1.5.1 Process approaches at the shop floor
Q-1.5.2 ALC-V1 process details
Q-1.5.3 ALC-V2 process details
Q-1.5.4 ALC-V3 process details
Q-1.6 Organisation & Business knowledge knowpr_06 Polestar
Q-1.6.1 What is done servicing administrative/cyber technology
Q-1.6.2 What could be serviced by administrative/cyber technology
Q-1.6.3 What could be realised by administrative/cyber technology
Q-2 How, serve technology
Q-2.1 Miscellaneous Information Technology knowit_01 Variety
Q-2.1.1 Technology communication- sideways
Q-2.1.2 Technology Logical access - sideways
Q-2.1.3 Technology software crisis
Q-2.1.4 Programming languages
Q-2.2 Communication - Interactions knowit_02 Act on
Q-2.2.1 Information Communication Technology
Q-2.2.2 Information representations
Q-2.2.3 Decisions & uncertainties expansions
Q-2.3 Historical evolvement ICT knowit_03 Cyber
Q-2.3.1 Computer Technology Basics
Q-2.3.2 Information Communication Technology
Q-2.3.3 Information Technology Fundaments
Q-2.4 Processing flows VSM - Change knowit_04 Change
Q-2.4.1 Product management processes
Q-2.4.2 Delivering data products in a cycle
Q-2.4.3 Three stages in realisation
Q-2.4.4 Control & Compliance
Q-2.5 Technical understanding processes knowit_05 ALC-V*
Q-2.5.1 Process approaches at the shop floor
Q-2.5.2 ALC-V1 process details
Q-2.5.3 ALC-V2 process details
Q-2.5.4 ALC-V3 process details
Q-2.6 Organisation & Business Understanding knowit_06 Knowit
Q-2.6.1 Retrospective creating this page
Q-2.6.2 Retrospective creating pillar pages
Q-2.6.3 Retrospective building reference pages
Q-2.6.4 External references
Q-2.6.5 ICT & Philosophy
Q-3 Decisions in Wisdom
Q-3.1 Miscellaneous Knowledge Wisdom knowld_01 Algol
Q-3.1.1 Distractors from knowledge
Q-3.1.2 Different paths, philosophy
Q-3.1.3 Impact technology changes
Q-3.2 Communication - Interactions knowld_02 Interact
Q-3.2.1 Goals feeding decisions
Q-3.2.2 Enterprise Culture vision
Q-3.2.3 Decisions with the aid of machines
Q-3.3 Historical evolvement Wisdom knowld_03 Tenets
Q-3.3.1 Logical constructs information processing
Q-3.3.2 Using Data Analytics statistics
Q-3.3.3 Using Big Data, forgotten histories
Q-3.4 Processing flows VSM - Control knowld_04 Change
Q-3.4.1 Defining priority for changes
Q-3.4.2 Indispensable security, safety
Q-3.4.3 Data, Information Quality
Q-3.4.4 Impact on persons
Q-3.4.5 Business Rules, Algorithm
Q-3.5 Processes Building Blocks - Control knowld_05 Volatile
Q-3.5.1 Vocabulary gaps functional technical
Q-3.5.2 Control Decisions
Q-3.5.3 Understanding data - information
Q-3.5.4 Solving volatility, uncertainty, complexity, ambiguity
Q-3.5.5 Logic - Algebra - Decision
Q-3.6 Controlling Organisation & Business knowld_06 North
Q-3.6.1 Information categorisation for engineering
Q-3.6.2 Controlling an Enterprise, Organisation
Q-3.6.3 Pretending being in control
Q-3.6.4 Culture, learning from others
Q-3.6.5 Following steps

Q-1.1.4 Progress
done and currently working on:

Ai missing ML

Q-1.2 Communication - Interactions

Working with machines that process information is a relatively new topic of science. Human communication and interaction are classic topics.
The concept of the "information" container is not that clear and simple.
Information is an abstract concept that refers to that which has the power to inform. At the most fundamental level, information pertains to the interpretation (perhaps formally) of that which may be sensed, or their abstractions. Any natural process that is not completely random and any observable pattern in any medium can be said to convey some amount of information.
🎭 Q-1.2.1 Processing Information
generic communication
Describing the properties of information (data) in some metadata approach goes back to the ancient philosophers and their "Universals".
"Semiotics" A sign is anything that communicates a meaning, that is not the sign itself, to the interpreter of the sign. The meaning can be intentional such as a word uttered with a specific meaning, or unintentional, such as a symptom being a sign of a particular medical condition. Signs can communicate through any of the senses, visual, auditory, tactile, olfactory, or gustatory. The semiotic tradition explores the study of signs and symbols as a significant part of communications.

Triangle_of_reference connected Having two parties in the communication:
There is a sender and there is a receiver of information. Both parties, sender & receiver, are processing in a "Triangle_of_reference" .
 
Both parties are translating, along two lines, an intention into a symbol/word and vice versa.
There are a lot of opportunities for well-aligned communication to fail.
Communication: Organisation ⇄ Technology
Strategic alignment Venkatraman ea
12 manage venkatraman
Venkatraman et al. argued in 1993 that the difficulty in realizing value from IT investments is:
❶ firstly due to the lack of alignment between the business and IT strategy of the organizations that are making investments, and
❷ secondly due to the lack of a dynamic administrative process to ensure continuous alignment between the business and IT domains.
  1. (yellow) Strategy Execution: Business is strategy formulator, IT is implementor follower.
  2. (red) Technology Potential: Business strategy drives IT strategy, the organisation follows.
  3. (green) Competitive Potential: emerging IT capabilities drives new business strategy.
  4. (blue) Service Level: The role of business strategy is indirect. "It should work."

The four options for who is in the lead and who is following each result in different opportunities. New questions arise: when there are many technology stacks, the situation becomes complex, and in those complex situations there is no simple solution for all.
🎭 Q-1.2.2 Managing Processes
The organization missions, core business, with the processes: why doing ICT?
A main asset for an organization is "information". Processing information is the way to create value. From the BRMBoK, a framework named BiSL: a nice split between information and technology.
BiSL framework, "BiSL in three minutes": the Relationship Management Institute´s BRMBoK© with the ASL BiSL Foundation´s Business Information Services Library (BiSL©).

The explicit distinction between information and technology emphasizes that the business needs information, and that technology is the enabler.

Information and technology are intimately intertwined, yet each needs to be managed in its own right.

For more BISL details: Digital Leadership: The Objective-Subjective Dichotomy of Technology Revisited (2016)
When looking for guidance on how to fulfil the responsibility of business information management, the BiSL framework seems to be the only available framework for business information management.
BiSL framework
🎭 Q-1.2.3 Steer Shape Serve
Strategic alignment - Solve conflict of interests, roles
The Amsterdam Information Model (AIM) has the goal of defining the roles more clearly.
Besides the nine green planes there are four intermediate areas.
The hierarchy of control authority is clear: top-down.
4qinn_9vlaks.jpg
In a figure:
Vertical split:
Strategic,
Tactical,
Operational.

Horizontal split:
Business, Steer
Information, Shape
Technology, Serve


Strategic alignment - Overhauled
Changing some words, avoiding the word "Information", and adding symbols from the Jabes framework with the Jabes application.
Three pillars but the activities are mixed:
Information accountability clearly lies at "steer", the business organisation.
The field of communication is a combination across all the different types of activities.
dtap layers application
The figure,
See right side:

❓ A question: what kind of shape would it be when closing the horizontal layers and closing the vertical pillars?
👁 Answer: you will get a donut (torus) or, when the shape is more stretched, a pipe.
That is a complicated three-dimensional surface.

❓ Next question: is it possible to reduce complexity for all communications?
👁 Answer: when going around circularly, in the visible two dimensions, and focusing on never more than two coordinated interactions, this is possible.

rethink what has happened TN

Q-1.3 Historical evolvements

Information is often processed iteratively: Data available at one step are processed into information to be interpreted and processed at the next step.
For example, in written text each symbol or letter conveys information relevant to the word it is part of, each word conveys information relevant to the phrase it is part of, each phrase conveys information relevant to the sentence it is part of, and so on until at the final step information is interpreted and becomes knowledge in a given domain.

📚 Q-1.3.1 Optimization in the industrial era (I)
Start of industrialisation, computerization
We presume that using computers and machines is something of very recent years. That assumption is not correct. The industrialisation of textile had an enormous impact on the way of living. Weaving is the well-known first example: before the Industrial Revolution, weaving was a manual craft and wool was the principal staple.
The invention in France of the Jacquard loom, patented in 1804, enabled complicated patterned cloths to be woven, by using punched cards to determine which threads of coloured yarn should appear on the upper side of the cloth.
.. The perceived threat of the power loom led to disquiet and industrial unrest. Well known protests movements such as the Luddites and the Chartists had handloom weavers amongst their leaders. ...
😱 The social state of the workers: seen as easily replaceable, cheap resources.
Low payments and hard work were the standard. Work shifts up to 16 hours in gruelling conditions, child labour, low wages, lack of rights.
Jacuard_loom
Jacquard Loom. Optimization of the workforce started with industrialisation.

Programming machines saved on costly, hard manual work. The Jacquard loom was the first example.

With this in mind, a lot has changed that nobody these days is worried about or is even aware of.
Winslow Taylor
Frederick Winslow Taylor
Taylor was widely known for his methods to improve industrial efficiency. He was one of the first management consultants.
... Taylor's scientific management consisted of four principles:
  1. Replace rule-of-thumb work methods with methods based on a scientific study of the tasks.
  2. Scientifically select, train, and develop each employee rather than passively leaving them to train themselves.
  3. Provide "Detailed instruction and supervision of each worker in the performance of that worker´s discrete task" (Montgomery 1997: 250).
  4. Divide work nearly equally between managers and workers, so that the managers apply scientific management principles to planning the work and the workers actually perform the tasks.
Within the setting of a factory, full control of workers is possible. This rigid approach of seeing human workers as inhuman robots caused aversion, as it is not a correct approach.
That social gap to the working class was, however, the common, usual way at the time.
Henri Fayol
Henri Fayol
Henri Fayol A French mining engineer, mining executive, author and director of mines who developed a general theory of business administration that is often called Fayolism. Like his contemporary Frederick Winslow Taylor, he is widely acknowledged as a founder of modern management methods.
... While Fayol came up with his theories almost a century ago, many of his principles are still represented in contemporary management theories.
... Fayol divided the range of activities undertaken within an industrial undertaking into six types:
❶ technical, ❷ commercial, ❸ financial,
❹ security, ❺ accounting, ❻ managerial.
His five primary functions were:
❶ Planning, ❷ Organizing, ❸ Commanding, ❹ Co-ordinating, ❺ Controlling.
The control function, from the French contrôler, is used in the sense that a manager must receive feedback about a process in order to make necessary adjustments and must analyze the deviations.
Principles of management:
  1. Division of work = Different levels of expertise can be distinguished.
  2. Authority = gives the management the right to give orders to the subordinates.
  3. Discipline = about obedience.
  4. Unity of command - Every employee should receive orders from only one superior.
  5. Subordination of Individual Interest = The interests of any one employee or group of employees should not take precedence over the interests of the organization as a whole.
  6. Remuneration = All Workers must be paid a fair wage for their services.
  7. Centralisation and decentralisation = Centralisation refers to the degree to which subordinates are involved in decision making.
  8. Line of authority from top management to the lowest ranks represents the scalar chain.
  9. Order = There should be a specific place for every employee in an organization.
  10. Equity = Managers should be kind and fair to their subordinates.
  11. Stability of tenure of personnel = High employee turnover is inefficient.
  12. Initiative = Employees who are allowed to originate and carry out plans will exert high levels of effort.
  13. Esprit de corps = Promoting team spirit will build harmony and unity.
In a mining setting, an autonomous, responsible team is required. They operate in a dangerous environment. Micromanagement is not an option. Generals keep away from those locations.

📚 Q-1.3.2 Optimization in the 20th century (II)
Ford_Pocline
Assembly line - Henry Ford
Henry Ford, faster & cheaper manufacturing. Henry Ford is credited as a pioneer in making automobiles affordable for middle-class Americans through the Fordism system. ... Workers are paid higher "living" wages so that they can afford to purchase the products they make.
Assembly line 1913 Experimenting with mounting body on Model T chassis. Ford tested various assembly methods to optimize the procedures before permanently installing the equipment.

Abraham Wald plane
Operations research - Abraham Wald
Abraham Wald is seen as one of the founders of Operations research (wikipedia).

Wald noted that the study only considered the aircraft that had survived their missions—the bombers that had been shot down were not present for the damage assessment.
Wald proposed that the Navy instead reinforce the areas where the returning aircraft were unscathed, since those were the areas that, if hit, would cause the plane to be lost.


There are many caveats when using Machine Learning. Biased data and the correct meaning of data are some of them.
Understanding the uncertainties and the effect on the whole process while being fair to outliers are others, among a long list.

Edwards_Deming
PDCA - W. Edwards Deming
Toyota made W. Edwards Deming famous. pdca dmaic
PDCA (plan–do–check–act or plan–do–check–adjust) is an iterative four-step management method used in business for the control and continuous improvement of processes and products. It is also known as the Deming circle/cycle/wheel

Far more was and is done by Toyota.
The Difference between the Toyota Production System and Lean Manufacturing The Toyota Production System (TPS) is the archetype of lean manufacturing. Lean is often used as a synonym for the Toyota Production System, and that is generally quite accurate.
All too often, lean is seen as some tool that can be bought and then delegated to someone in the lower ranks of hierarchy.

📚 Q-1.3.3 Optimization in the 20th century (III)
Peter_Drucker
Culture - Peter Drucker
Peter_Drucker An Austrian American management consultant, educator, and author, whose writings contributed to the philosophical and practical foundations of modern management theory. ... Drucker coined the term "knowledge worker", and later in his life considered knowledge-worker productivity to be the next frontier of management.
Best known for his quotes:
❶ Management is doing things right; leadership is doing the right things.
❷ The most important thing in communication is hearing what isn't said.
❸ The best way to predict the future is to create it.
❹ Rank does not confer privilege or give power. It imposes responsibility.
❺ Efficiency is doing things right; effectiveness is doing the right things.
❻ Unless commitment is made, there are only promises and hopes... but no plans.
❼ The aim of marketing is to know and understand the customer so well the product or service fits him and sells itself.
❽ Knowledge has to be improved, challenged, and increased constantly, or it vanishes.
❾ There is nothing so useless as doing efficiently that which should not be done at all.

Rik_Maes.jpg
AIM nine plane - Rik Maes
Visie op informatie-management ("Vision on information management"): the AIM, Amsterdam Information Model, "Amsterdamse raamwerk voor informatiemanagement". Many see this as a static situation. Too often only the strategic level is considered; too often a siloed Taylorian reorganisation is the hidden action.
Every layer important: The tactical level sets goals and preconditions of the strategic domain into: concrete, realizable objectives, responsibilities, authorizations, frameworks, and guidelines for operations.

SIAR cycle

Q-1.4 Processing flows VSM - assembly lines

A swarm organisation, a self-organisation, is a networked structure without leadership, using some shared goal.
⚠ Challenges: have a shared goal, have a good shared goal.
The organisation structure is a hierarchical line of command. Formation in groups using leaders is human nature.
⚠ Challenges: avoiding leadership going into micro details.
Expected is that authority and accountability for a product are in place.
😱 Administrative/cyber setting: seems to have got lost.
Q-1.4.1 Control function - closed loop
Control, Feedback, closed loop, PDCA
The fundamental approach in all historical evolvements: the feedback, the verification of results against intentions and goals, is the beating heart of real lean using PDCA (Plan-Do-Check-Act). 😱 The PDCA cycle is a closed loop. Sadly, how to apply it in processes got lost. The SIAR model combines: PDCA (Deming), DMAIC (Lean Six Sigma), lean pull-push and the value stream.
A closed loop reference:
BIDM
Whether BI analytics is integrated in the business process or not can strongly affect the decision-making process. Hence, we consider this category to be a very important one when delimiting a maturity stage.
  1. initiation (user driven - activity initiated by the user, process driven - activity initiated by a process)
  2. process integration (data centric - BI analytics is usually supported by a data warehouse, process centric - BI analytics is integrated in the business processes)
  3. processing model (store and analyze; analyze and store)
  4. event stream processing
  5. "closed-loop" environment
Business Intelligence Development Model

Although written with a BI (Business Intelligence) mindset, it is very generic.
PID process control
PID control. When there is a measurement, control by adjustment becomes well-described theory. However, this theory is not simple at all.
PID control In theory, a controller can be used to control any process that has a measurable output (PV), a known ideal value for that output (SP), and an input to the process (MV) that will affect the relevant PV.
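To make the PV/SP/MV terms concrete, below is a minimal sketch of a textbook discrete PID controller in Python; the gains and the simulated process are made-up illustration values, not part of the source.

    # Minimal textbook PID controller sketch (illustrative values only).
    def pid_step(sp, pv, state, kp=1.0, ki=0.2, kd=0.1, dt=1.0):
        """One PID update: returns the new MV and the updated controller state."""
        error = sp - pv                              # deviation of PV from SP
        integral = state["integral"] + error * dt
        derivative = (error - state["prev_error"]) / dt
        mv = kp * error + ki * integral + kd * derivative
        return mv, {"integral": integral, "prev_error": error}

    # Tiny stand-in for the real plant: the PV slowly follows the MV.
    pv, state = 0.0, {"integral": 0.0, "prev_error": 0.0}
    for _ in range(20):
        mv, state = pid_step(sp=10.0, pv=pv, state=state)
        pv = pv + 0.1 * (mv - pv)
    print(round(pv, 2))                              # PV has moved close to the setpoint of 10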
Using this kind of control there is an effect that, in the beginning, things get worse before improvements are seen. What would be the reason for that?
Some possible options:
Q-1.4.2 Co-ordinating function - Plant value stream
First line, second line supervisors
Looking for the administrative/cyber physical equivalent of who is accountable for the value stream flow. Where and how to find this role when the product is administrative/cyber?
😱 The hierarchical implementation is easily missed.
The accountability and authority for a product:
Training Within Industry—Second-Line Supervisor Job Instructions Training Within Industry and its modules Job Instructions, Job Relations, and Job Methods are well known. ...
Job Instructions for Second-Line Supervisors (nowadays called managers). This is a hierarchy level higher, and the goal is to support and guide the shop floor supervisors on how to use job instructions.

Line management Lean
Q-1.4.3 Co-ordinating function - divide work
Physical: Waterstrider, Scrum master, Product owner
In the physical world anybody can see what is going on. In the administrative/cyber world we need a tapping point: the coordinator of the work actions on the floor.
😱 This hierarchical implementation and the importance is easily missed.

Optimizing work of the workers:
Introduction to Point-of-Use Providers (or Mizusumashi)
The point-of-use provider takes care of the "last mile" (or more precisely last few meters) of the material transport. This is often for assembly lines, as there is a lot of material arriving.
...
do not create an additional kanban loop between the supermarket and the assembly line. The effort would heavily outweigh the benefit, making the whole idea pointless. Instead, the point-of-use provider is close enough to the line to keep an overview about what is needed.

Cyber/administrative: Waterstrider, Scrum master, Product owner Logical role, tasks:
lean water spider
Pros and cons of lean water spiders
There are a number of benefits to creating a lean water spider position; however, it can also come with some downsides if the role is not utilized properly.
Lean water spider job responsibilities
When looking for a lean water spider position, job seekers may see postings that express specific responsibilities and look for certain characteristics.
Lean water spider position growth
While the water spider position may sometimes be seen as less important than it really is, the position does also hold a lot of potential for growth. An individual in this position will learn much about the production floor and how the organization they're working for operates. Ideally, they will get to know the people there and the individual challenges in day-to-day work.
This role is a good experience to have in order to become a future manager, supervisor or team leader. Knowing what work is like on the floor before moving into a leadership role gives previous lean water spiders an appreciation for the work process everyone needs to go through, what the workflow is currently like and how it can be kept lean.

Q-1.4.4 Study of the tasks - the plant, value stream
From: "Want to do a process mining project" slides and videos (vdaalst). The scientific approach understanding and managing processes. 😱 Although it is a fundament it is hardly seen being used. The idea of using data, transformed into information for seeing what is going on the shop floor.
Process mining W.vanAalst
In a figure:
See right side

Having processes grouped in the value stream, not all process events will follow the expectation from the value stream map.
Process mining W.vanAalst
In a figure:
See right side
elephant-blind-men

Q-1.5 Processes Building Blocks Basics

The term elephant test refers to situations in which an idea or thing, "is hard to describe, but instantly recognizable when spotted"
A process life cycle building block, the ALC life cycle, is very generic and simplistic. There are only three possible approaches.
To solve:
😱 PM: project management is ❌ NOT product management.
😱 ALC life cycles are made complicated by blame games.
🎭 Q-1.5.1 Process approaches at the shop floor
Fully human, immediate impact: ALC-V1
Change immediate human invented only ⚖ This simple job-shop approach is: ⚒ Instructions for processing: ⚙ The craftsmanship of the workers is decisive for what is delivered at what cost in some time window.

Delegated but human, validation before change: ALC-V2
Change human invented only algorithmic ⚖ This advanced approach is: ⚒ Creating instructions for processing: ⚙ The craftsmanship of the building, engineering, is decisive for what process is delivered at what cost in some time window.
⚙ The craftsmanship, education, of the operational workers is part of the process creating instructions and specifications.
⚙ The operational transformation is predictable in what is delivered at what cost in what time.

Computer aided decision making, validation before change: ALC-V3
Change with algorithmic support human guidance ⚖ This sophisticated approach is: ⚒ Creating instructions for processing: ⚙ The craftsmanship of the building, engineering, is decisive for what process is delivered at what cost in some time window.
⚙ The craftsmanship, education, of the operational workers is part of the process creating instructions and specifications.
⚙ The operational transformation is predictable in what is delivered at what cost in what time.

🎭 Q-1.5.2 ALC-V1 process details
⚖ This simple job-shop approach is: a figure of a classic one-off process, operations:
one off proces, immediate running

🎭 Q-1.5.3 ALC-V2 process details
⚖ This advanced approach is: a figure of this classic process: develop, test - operations:
Develop & Test, POC, before running

A personal figure of an ML (machine learning) process: develop, test - operations:
BPM AI ML proces

Data monetizing journey

Q-1.6 Organisation & Business knowledge

Once Dorothy and her colleagues made the journey to OZ, they quickly found out that there was no there, there. The Wizard simply told her what she really should have known all along.
Dorothy and her companions just had to figure out how to reframe their own perceived shortcomings and recast them as strengths to achieve real transformation.

Q-1.6.1 What is done servicing administrative/cyber technology
Retrospective ICT service, modelling
SAS analytics lifecycle
Looking around what others are posting and what the direction of the opinions is .... (ALC-v3)

The life cycle is a hot topic (2019). Just modelling and inventing new processes while not being able to operationalize them doesn´t bring the expected value. There are many more issues to solve than just becoming aware of this.

The algorithm using data to create a model transforms that data into a source. When auditing or reviewing the model, that data is a source component to be archived as evidence.


Retrospective ICT service, Software building
The agile manifesto redirected all attention to building software. That is weird, because software isn´t the key factor; the value stream processes are. There are many situations where this is the best approach because there are no better options.
Saying that failing fast is the goal is also weird. One of the best lean agile projects was the "race to the moon". In the Apollo project everything was tested and validated; only the unforeseen was a problem.
Retrospective ICT service, Service desk
One of the aspects that is copied is how the ICT service desk is managed. It has become part of the ITIL framework. The Information Technology Infrastructure Library (ITIL) is a set of practices and a framework for IT activities.
Apollo project, man to the moon. There are many companies offering an ITIL course. The common idea is: an Apollo 13 ITSM game. 💣 Building and operating in a reliable, predictable way is not covered. Better prioritizing would avoid that ITIL process as much as possible.
Q-1.6.2 What could be serviced by administrative/cyber technology
There is a long, not exhaustive, list of areas where administrative/cyber work is applicable.
ML_Agriculture
Optimization: Agriculture
Food production and demand on a global basis, with special attention paid to the major producers, such as China, India, Brazil, the US and the EU.
Agricultural_science (wikipedia)
ML_smarthome.jpg
Optimization: your home
A Smart Home is one that provides its home owners comfort, security, energy efficiency (low operating costs) and convenience at all times.

ML_logistic
Optimization: transport
Logistics Transport (wikipedia)
ML_travel
Optimization: personal travelling
Travelling (wikipedia)

ML_deepblue kasparov
Optimization: decisions
Playing chess is making decisions in a timely manner. With the advice of a machine anyone can beat the master. Note the man behind the computer screen; he is moving the pieces.

ML_health
Optimization: health
Health services research (wikipedia)
Q-1.6.3 What could be realised by administrative/cyber technology
analyst viewpoint
Even the nice life cycle approach is seen in analyst reports: Forrester Business Clear Implementation Cycle
Many companies still struggle to realize value from predictive analytics despite considerable investments in technology and human capital. This is largely due to the insights-to-action gap, the disconnect between analytical insights and operational processes Source: Close The Insights-To-Action Gap With A Clear Implementation Plan Forrester report (Brandon Purcell 2017)
A pity that even the visual and the intro went behind a paywall. At the time of first publication in 2015 it was freely accessible.
ALC-V3 process line, closed loops - circular change
The life cycle of the ALC-V3 in a circular visual:
When the need for change in information processing is high, a devops perspective using several closed loops gives other attention points:
The whole process is a closed loop in itself. At every stage all three involved parties have activities to be coordinated and to be aligned.


The data driven process in a figure:




🔰 Contents Mesh ABCs Control ALC-V* Polestar 🔰
  
🚧  Variety Act on Cyber Change ALC-V* Knowit 🚧
  
🎯 Algol Interact Tenets Change Volatile North 🎯


Q-2 How, serve technology


meandering path

Q-2.1 Miscellaneous Information Technology

The technology of using computers has several lines of evolvement. The hardware has become faster, better, cheaper. Application software has a few basic fundaments in logic, by flows and constructs. The problem to solve has moved from the purely technical aspect of how to run machines into how to process information in a technical way.

Q-2.1.1 Technology communication- sideways
signal flags
Signal flags. The associated illustration does not have a real meaning other than that there is a party, something to celebrate.
International maritime signal flags are various flags used to communicate with ships. The principal system of flags and associated codes is the International Code of Signals. Various navies have flag systems with additional flags and codes, and other flags are used in special uses, or have historical significance. There are various methods by which the flags can be used as signals:

Optical telegraph
optical telegraph
An optical telegraph is a line of stations, typically towers, for the purpose of conveying textual information by means of visual signals. There are two main types of such systems; the semaphore telegraph which uses pivoted indicator arms and conveys information according to the direction the indicators point, and the shutter telegraph which uses panels that can be rotated to block or pass the light from the sky behind to convey information.
The illustration is from the Napoleonic era. This kind of system was widely used all over the world in earlier times.
Invisibility: technical communications.
New technology is still evolving.
In telecommunications, 5G is the fifth generation technology standard for cellular networks, which cellular phone companies began deploying worldwide in 2019, the planned successor to the 4G networks which provide connectivity to most current cellphones. Like its predecessors, 5G networks are cellular networks, in which the service area is divided into small geographical areas called cells.

Not understanding technology brings old types of conspiracies
During the COVID-19 pandemic, several conspiracy theories circulating online posited a link between severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) and 5G. This has led to dozens of arson attacks being made on telecom masts in the Netherlands (Amsterdam, Rotterdam, etc.), Ireland (Cork, etc.), Cyprus, the United Kingdom (Dagenham, Huddersfield, Birmingham, Belfast and Liverpool), Belgium (Pelt), Italy (Maddaloni), Croatia (Bibinje) and Sweden. It led to at least 61 suspected arson attacks against telephone masts in the United Kingdom alone and over twenty in The Netherlands.
When steam trains were introduced a lot of conspiracies were made in trying to halt change.
When industrialisation replaced the at-home work looms, a lot of resistance was put up in trying to halt change.
When vaccination was invented for preventive healthcare a lot of conspiracies were made in trying to halt change.

Q-2.1.2 Technology Logical access - sideways.
CODASYL, Transactional usage.
The DBTG group, CODASYL, with Charles_Bachman as the greatest person doing the data network modelling, set the DBMS standard before SQL. (wikipedia) CODASYL, the Conference/Committee on Data Systems Languages, was a consortium formed in 1959 to guide the development of a standard programming language that could be used on many computers. This effort led to the development of the programming language COBOL and other technical standards.

Almost forgotten: a network model database is a NoSQL one.
In October 1969 the DBTG published its first language specifications for the network database model which became generally known as the CODASYL Data Model. (wiki) This specification in fact defined several separate languages: a data definition language (DDL) to define the schema of the database, another DDL to create one or more subschemas defining application views of the database; and a data manipulation language (DML) defining verbs for embedding in the COBOL programming language to request and update data in the database.
von Neumann perftun01
Technology Backend (von Neumann)
Conceptually this topic has not changed since the beginning.
❶ The improvements in what is possible to realise with hardware and software are, however, changing fast. The principles of segregation by processing speed and parallel processing are still valid.
❷ The increase in speed of communication lines enables doing a lot over large physical distances.
vonNeumann_perftun02.jpg
Unlimited capacity and unlimited speed will never be reality.
Performance & Tuning hardware:
❶ Knowing approximate time costs for requests
    👉🏾 optimizing means making trade-off decisions between resources.
❷ minimize resource usage consuming most time. ⌛
    👉🏾 better algorithm
❸ trying to run processes parallel instead of serial. ⏳
    👉🏾 scheduling
❹ preventing overload 🚧 conditions in any resource used in the chain.
    👉🏾 load balancing

Networkspeed.jpg
Technology Network - endpoints
Local hardware speed is not the bottleneck; we are moving into an always-connected situation. The possible speed is growing fast. In 1991, two years after its invention by Tim Berners-Lee, researchers at CERN in Switzerland released the World Wide Web (WWW) to other institutions and then later that year to the general public. Back then, it had a global average connection speed of just 14.4 kilobits per second (kbit/s). Only a handful of pages could be visited, all with static HTML and limited functionality.
... Today, there is a wide gap between the most advanced nations and those with less developed infrastructure. In Singapore, for example, you can expect nearly 300 Mbit/s as standard, while in Turkmenistan the average is just 4.3 Mbit/s. As of 2022, the global average is approximately 100 Mbit/s and growing by 20% each year.

Technology Frontend
Frontend and backend , In software engineering, the terms frontend and backend (sometimes written as back end or back-end) refer to the separation of concerns between the presentation layer (frontend), and the data access layer (backend) of a piece of software, or the physical infrastructure or hardware. In the client–server model, the client is usually considered the frontend and the server is usually considered the backend, even when some presentation work is actually done on the server itself.
Q-2.1.3 Technology software crisis
software crisis in the 1960
The software crisis in the 60´s led to organizing conferences. The NATO reports (Brian Randell) are a beautiful document: NATO 1969, E. Dijkstra and d´Agapeyeff´s inverted pyramid. Dependencies in layers where you don´t want to be dependent; decoupling by interfaces (middleware).
agapeyeff_pyramid This is because no matter how good the manufacturer´s software for items like file handling it is just not suitable; it´s either inefficient or inappropriate.
We usually have to rewrite the file handling processes, the initial message analysis and above all the real-time schedulers, because in this type of situation the application programs interact and the manufacturers, software tends to throw them off at the drop of a hat, which is somewhat embarrassing.
On the top you have a whole chain of application programs.
The point about this pyramid is that it is terribly sensitive to change in the underlying software such that the new version does not contain the old as a subset. It becomes very expensive to maintain these systems and to extend them while keeping them live.

d´Agapeyeff: (from Reducing the cost of software) Programming is still too much of an artistic endeavour. We need a more substantial basis to be taught and monitored in practice on the:
  1. structure of programs and the flow of their execution.
  2. shaping of modules and an environment for their testing.
  3. simulation of run time conditions.

software crisis in the 2020s
Stating that virtual machines, Docker containers, or going to the cloud are cheap is not correct. The cost is in supporting and maintaining what is on top of that.
Nothing has really changed since those 1969 days.

flowchart drawer
Technology for logic
Any process should have a design when it is to be adequate, correct and elegant. Ordering thoughts about the design of a process has a simple approach: making drawings like flowcharts.
The quick & dirty approach of trial and error, where failing fast and breaking things is accepted, is only acceptable when the risk assessment allows it.
Q-2.1.4 Programming languages
Algol -semicolon based languages.
Algol 60, as a semicolon based language, bypassed the requirements of using Hollerith cards. ALGOL 60 was used mostly by research computer scientists in the United States and in Europe. Its use in commercial applications was hindered by the absence of standard input/output facilities in its description and the lack of interest in the language by large computer vendors. ALGOL 60 did however become the standard for the publication of algorithms and had a profound effect on future language development. Some modern languages are using these concepts: PowerShell one-liners. Many programming and scripting languages require a semicolon at the end of each statement.
Cobol - column based languages.
Cobol, a language survivor, is a compiled English-like computer programming language designed for business use. It is an imperative, procedural and, since 2002, object-oriented language. There are cultural similarities to Java or Python.
Python - indentation based languages.
Python is based on columns, by indentation.
Python uses whitespace indentation, rather than curly brackets or keywords, to delimit blocks. An increase in indentation comes after certain statements; a decrease in indentation signifies the end of the current block. Thus, the program´s visual structure accurately represents the program´s semantic structure. This feature is sometimes termed the off-side rule, which some other languages share, but in most languages indentation doesn´t have any semantic meaning.
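A tiny illustration of that off-side rule (the values are made up): block membership is determined purely by indentation, with no semicolons or braces.

    # Indentation delimits the block; dedenting ends it.
    values = [3, 1, 4, 1, 5]
    total = 0
    for v in values:
        total += v          # inside the for-block because it is indented
    print(total)            # outside the block again: prints 14 once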

Hadoop, Elasticsearch - NoSQL
These are recent products, recent approaches that are going back to old concepts.
3GL languages going into more advanced ones: low coding.
Procedural programming languages leave the details of accessing objects fully to the programmer. Examples are Cobol, Java and Python. When processing many objects in a similar way, much of the coding of repetitions can be handed over to the language. RPG and SAS (base) are examples.

🎯 New goals:
Ai missing ML

Q-2.2 Communication - Interactions

Working with machines that process information is a relatively new topic of science. Human communication and interaction are classic topics.
The concept of the "information" container is not that clear and simple.
Information is an abstract concept that refers to that which has the power to inform. At the most fundamental level, information pertains to the interpretation (perhaps formally) of that which may be sensed, or their abstractions. Any natural process that is not completely random and any observable pattern in any medium can be said to convey some amount of information.
Q-2.2.1 Information Communication Technology
Technology start at communication
The change in information processing did not start with machines doing calculations. It started with using technology to exchange information: communication. Instead of humans using a messenger, with something written or just spoken, it went digital with the telegraph.
cooperating understanding encoding These systems led to new telegraph codes, starting with the Baudot code. However, telegrams were never able to compete with the letter post on price, and competition from the telephone, which removed their speed advantage, drove the telegraph into decline from 1920 onwards.
The few remaining telegraph applications were largely taken over by alternatives on the internet towards the end of the 20th century.

decoding machine
Technology start by computers
The first machine classified as a computer had its roots in analysing communication. Colossus was a set of computers developed by British codebreakers in the years 1943–1945 to help in the cryptanalysis of the Lorenz cipher. Colossus used thermionic valves (vacuum tubes) to perform Boolean and counting operations. Colossus is thus regarded as the world's first programmable, electronic, digital computer, although it was programmed by switches and plugs and not by a stored program.
Confused focus in subject
There is a strange misunderstanding in concepts.
  1. When referring to the science of information processing: people start to think of just programming languages. That is however a not very interesting technology component.
    💣 Issue: Ignoring the process of how information should get understood.
  2. When referring to communications for information processing: people start to think of programming languages.
    🤔 Ignoring the communication, human interaction for decisions.
    😱 Assuming the technical enablement of data exchange in a technical network is all what there is. Ignoring what is really "communications": the information for decisions.
  3. When referring to the technology of information processing: people go for building databases and programs to deliver some numbers and figures.
    🚧 Ignoring the process of how information for decisions should get treated explained and presented while not ignoring all uncertainties and ethical aspects.
Is the technology part about basic technology, or is it about the decision maker, the imperator?
Without a better understanding on the concepts we are condemned to make the same mistakes over and over again.

Q-2.2.2 Information representations
Data encoding decoding.
👓 As soon as information could be reliably encoded and decoded, resulting in meaningful information, the question arose how to prevent it being seen by those who should not know that information. The battle of encryption - decryption of information adds an additional layer on top of the encoding - decoding.
The Colossus was an automated machine helping in that information war. It was not a generic computer system.
Technology: fixed and variable length of encoded messages.
For ease of technical implementation, fixed-length sizing is preferred. For ease of text messages, variable lengths are preferred. For reliable technical data transfer, after a number of signals at the same level the opposite level is required. Adding another reversed-level signal quickly adds some length.
💣 Mixing up several conventions can cause an unwanted, unclear encoding of information. In a comma-separated file, is the comma part of a number or is it really a separation of fields?
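A small sketch of that comma ambiguity, using Python's standard csv module (the values are made up): quoting is what keeps a decimal comma from being read as a field separator.

    import csv, io

    # "1,5" is meant as one numeric value with a decimal comma, not two fields.
    buffer = io.StringIO()
    csv.writer(buffer).writerow(["sensor-a", "1,5", "ok"])
    print(buffer.getvalue().strip())          # sensor-a,"1,5",ok  -> the value is quoted

    # Reading it back keeps the three fields intact.
    fields = next(csv.reader(io.StringIO(buffer.getvalue())))
    print(fields)                             # ['sensor-a', '1,5', 'ok']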
ASCII - EBCDIC - Single byte
These are encodings: interpretations of a single byte (8 bits, 256 different values).
Properties:
unicode
Unicode character representation - Multi byte
Many versions, several implementations. Unicode is becoming the de facto standard.
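A minimal sketch of the single-byte versus multi-byte difference (the example string is chosen for illustration): the number of characters and the number of encoded bytes are no longer the same.

    text = "Größe"                      # 5 characters, 2 of them outside ASCII
    print(len(text))                     # 5 code points
    print(len(text.encode("utf-8")))     # 7 bytes: ö and ß each take 2 bytes in UTF-8
    print(len(text.encode("latin-1")))   # 5 bytes in a single-byte encoding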

Floating, Math in ICT
Slide rule. Basic issues in ICT with math: inaccuracy in calculations, floating-point numbers.
Using a slide rule you had to think about that, and by that you knew the possible impact. Using a computer everybody trusts the results, until surprisingly wrong results get complaints through feedback responses.

IEEE 754
The standard provides for many closely related formats, differing in only a few details. Five of these formats are called basic formats, and others are termed extended precision formats and extendable precision format. Three formats are especially widely used in computer hardware and languages:
The common format in programming is the double format. Using a GPU (NVIDIA) it is often half precision. A difference of less than 1 per mille is, for image processing, seen as sufficient.
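A minimal sketch of that inaccuracy with ordinary double-precision floats (standard Python, no extra libraries): the decimal values simply have no exact binary representation.

    # Doubles cannot represent 0.1 exactly, so small errors accumulate.
    print(0.1 + 0.2 == 0.3)              # False
    print(0.1 + 0.2)                     # 0.30000000000000004
    print(sum([0.1] * 10) == 1.0)        # False: ten times 0.1 is not exactly 1.0

    import math
    print(math.isclose(0.1 + 0.2, 0.3))  # True: compare with a tolerance instead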
Noise & synchronisation - error detection
Although everything is said to be digital, the transmissions use real-world phenomena in an analog world. Shannon-Hartley
In information theory, the Shannon–Hartley theorem tells the maximum rate at which information can be transmitted over a communications channel of a specified bandwidth in the presence of noise. It is an application of the noisy-channel coding theorem to the archetypal case of a continuous-time analog communications channel subject to Gaussian noise.
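As a small worked example of the theorem (the bandwidth and signal-to-noise figures below are illustrative assumptions, roughly a classic voice-band telephone line): C = B · log2(1 + S/N).

    import math

    def channel_capacity(bandwidth_hz, snr_linear):
        """Shannon-Hartley: maximum error-free bit rate of a noisy analog channel."""
        return bandwidth_hz * math.log2(1 + snr_linear)

    bandwidth = 3100             # Hz, assumed voice-band telephone channel
    snr_db = 30                  # assumed signal-to-noise ratio in dB
    snr = 10 ** (snr_db / 10)    # convert dB to a linear power ratio (= 1000)
    print(round(channel_capacity(bandwidth, snr)))   # roughly 30900 bit/s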
High-Level Data Link Control is for synchronizing clocks without transmitting clock signals.
This bit-stuffing serves a second purpose, that of ensuring a sufficient number of signal transitions. On synchronous links, the data is NRZI encoded, so that a 0-bit is transmitted as a change in the signal on the line, and a 1-bit is sent as no change. Thus, each 0 bit provides an opportunity for a receiving modem to synchronize its clock via a phase-locked loop. If there are too many 1-bits in a row, the receiver can lose count. Bit-stuffing provides a minimum of one transition per six bit times during transmission of data, and one transition per seven bit times during transmission of a flag.
This is really technical, but at the low technical level it is necessary and in use in all kinds of common devices.
This "change-on-zero" is used by High-Level Data Link Control and USB. They both avoid long periods of no transitions (even when the data contains long sequences of 1 bits) by using zero-bit insertion. HDLC transmitters insert a 0 bit after 5 contiguous 1 bits (except when transmitting the frame delimiter "01111110"). USB transmitters insert a 0 bit after 6 consecutive 1 bits. The receiver at the far end uses every transition-both from 0 bits in the data and these extra non-data 0 bits — to maintain clock synchronization.

Q-2.2.3 Decisions & uncertainties expansions
Risk guidelines.
Risk Assessment
The level of risk can be estimated by using statistical analysis and calculations combining impact and likelihood. Any formulas and methods for combining them must be consistent with the criteria defined when establishing the Risk Management context.

Keep it stupid simple (KISS)
Simplifying is possible in several ways.
Occam´s_razor Occam´s razor, Ockham´s razor, Ocham´s razor (Latin: novacula Occami) or law of parsimony (Latin: lex parsimoniae) is the problem-solving principle that "entities should not be multiplied without necessity." The idea is attributed to English Franciscan friar William of Ockham (c. 1287-1347), a scholastic philosopher and theologian who used a preference for simplicity to defend the idea of divine miracles. It is variously paraphrased by statements like "the simplest explanation is most likely the right one".
KISS principle The KISS principle states that most systems work best if they are kept simple rather than made complicated; therefore, simplicity should be a key goal in design, and unnecessary complexity should be avoided.
The only question not asked and not answered is what, in a given situation, is simple. What is a simple step for one person may be felt by another person as very complicated.

Chaotic systems.
Chaos_theory Chaos theory is a branch of mathematics focusing on the study of chaos states of dynamical systems whose apparently-random states of disorder and irregularities are often governed by deterministic laws that are highly sensitive to initial conditions.
💣 Education focuses on predictable, deterministic systems, assuming that when you know all inputs you can predict the outcome with a defined certainty. This is not the truth.
Small differences in initial conditions, such as those due to errors in measurements or due to rounding errors in numerical computation, can yield widely diverging outcomes for such dynamical systems, rendering long-term prediction of their behavior impossible in general. This can happen even though these systems are deterministic, meaning that their future behavior follows a unique evolution and is fully determined by their initial conditions, with no random elements involved. In other words, the deterministic nature of these systems does not make them predictable. This behavior is known as deterministic chaos, or simply chaos. The theory was summarized by Edward Lorenz.
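A classic, minimal illustration of that sensitivity is the logistic map x(n+1) = r·x(n)·(1−x(n)); in the sketch below two almost identical starting values drift far apart after a few dozen fully deterministic steps (the parameter and start values are just illustrative).

    # Logistic map in its chaotic regime (r = 4): deterministic, yet unpredictable.
    r = 4.0
    a, b = 0.300000, 0.300001           # two nearly identical initial conditions
    for step in range(40):
        a = r * a * (1 - a)
        b = r * b * (1 - b)
    print(abs(a - b))                    # typically the tiny 1e-6 difference has grown to the order of the values themselves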

Getting many choices as options in acceptable time
Monte_Carlo
The underlying concept is to use randomness to solve problems that might be deterministic in principle. They are often used in physical and mathematical problems and are most useful when it is difficult or impossible to use other approaches. Monte Carlo methods are mainly used in three problem classes: optimization, numerical integration, and generating draws from a probability distribution.
Las_Vegas  
Las Vegas algorithms were introduced by Laszlo Babai in 1979, in the context of the graph isomorphism problem, as a dual to Monte Carlo algorithms. Babai introduced the term "Las Vegas algorithm" alongside an example involving coin flips: the algorithm depends on a series of independent coin flips, and there is a small chance of failure (no result). However, in contrast to Monte Carlo algorithms, the Las Vegas algorithm can guarantee the correctness of any reported result.
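A minimal sketch of the Monte Carlo idea described above (estimating π by sampling random points in a unit square; the sample size is arbitrary): the answer is approximate, but the method needs nothing more than randomness.

    import random

    random.seed(42)                       # fixed seed so the run is repeatable
    n = 100_000
    inside = sum(
        1 for _ in range(n)
        if random.random() ** 2 + random.random() ** 2 <= 1.0
    )
    print(4 * inside / n)                 # close to pi (roughly 3.14 +/- 0.01)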

Random numbers
Generating good random numbers is an everlasting question; the realisations also keep changing. Mersenne_twister
The Mersenne Twister was developed in 1997 by Makoto Matsumoto and Takuji Nishimura. It was designed specifically to rectify most of the flaws found in older PRNGs.
The most commonly used version of the Mersenne Twister algorithm is based on the Mersenne prime 2**19937-1. The standard implementation of that, MT19937, uses a 32-bit word length. There is another implementation (with five variants) that uses a 64-bit word length, MT19937-64; it generates a different sequence.
Pros and cons. A note on why there are shortcomings: (B. A. Wichmann). Another shortcoming still present is seed values being predictable when derived from the system clock.
Algorithm AS 183, Hill and Wichmann (1982) and Wichmann and Hill (1982) resulted. It has had a "good innings" but its cycle length of about 7x10**12 must now be considered inadequate. It has been reported (McCullough and Wilson, 2005) as having failed some tests at a probability level of less than 10**-15, which surely is indicative of a major failing. Computing developments over the last quarter of a century now make a better version both possible and desirable. In particular, there does not now seem to be a need for the 16-bit restriction, as 32-bit availability is almost universal.
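Python's standard random module uses exactly this MT19937 generator; the tiny sketch below shows that the whole sequence is fixed by the seed, which is why a predictable seed (for example the system clock) makes the output predictable too.

    import random

    random.seed(20240101)                 # deterministic seed: reproducible sequence
    first = [random.random() for _ in range(3)]

    random.seed(20240101)                 # same seed again
    second = [random.random() for _ in range(3)]

    print(first == second)                # True: the seed fixes the whole stream
    # random.seed() with no argument falls back to OS entropy (or the time),
    # so for security-sensitive use the secrets module is the better choice.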
 thinking guessing
Aristotle & Plato
These old Greek philosophers already stated the problem with analytics: the meaning and the concepts. theory_of_universals, Aristotle to Platonic realism, Socrates.
Plato and Xenophon's dialogues provide the main source of information on Socrates's life and thought. These writings are the Sokratikoi logoi, or Socratic dialogues, which consist of reports of conversations apparently involving Socrates.
The most famous is the allegory of the cave. Any disruptive change, although being the real truth, is not automatically an acceptable option for all involved. The difficult decision is what to do in those kinds of situations.

🎯 New goals: Decisions on information change by perspectives.

rethink what has happened TN

Q-2.3 Historical evolvement ICT

Information is often processed iteratively: Data available at one step are processed into information to be interpreted and processed at the next step.
For example, in written text each symbol or letter conveys information relevant to the word it is part of, each word conveys information relevant to the phrase it is part of, each phrase conveys information relevant to the sentence it is part of, and so on until at the final step information is interpreted and becomes knowledge in a given domain.

Q-2.3.1 Computer Technology Basics.
Astrolabe
Analog computing.
A quick, acceptable result is a good option when using analog devices. The problem with those is their single purpose and limited accuracy. The advantage of using dedicated scale models is seeing issues not visible in any other way. The oldest kind of application is navigation.
The astrolabe is an elaborate inclinometer, and can be considered an analog calculator capable of working out several different kinds of problems in astronomy.
Lovelace Babbage
Ada Lovelace - Charles_Babbage
Ada Lovelace: between 1842 and 1843, Ada translated an article by Italian military engineer Luigi Menabrea on the calculating engine, supplementing it with an elaborate set of notes, simply called Notes. These notes contain what many consider to be the first computer program, that is, an algorithm designed to be carried out by a machine.
Charles Babbage Considered by some to be a "father of the computer".
Babbage is credited with inventing the first mechanical computer that eventually led to more complex electronic designs, though all the essential ideas of modern computers are to be found in Babbage´s analytical engine.
Samuel Morse
Morse
Samuel Morse: The Morse system for telegraphy, which was first used in about 1844, was designed to make indentations on a paper tape when electric currents were received. ...
Morse code was developed so that operators could translate the indentations marked on the paper tape into text messages. In his earliest code, Morse had planned to transmit only numerals and to use a codebook to look up each word according to the number which had been sent. However, the code was soon expanded by Alfred Vail in 1840 to include letters and special characters so it could be used more generally.
Vail estimated the frequency of use of letters in the English language.
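That frequency idea can be sketched with a handful of letters from the later standardized international Morse code (an illustration of the principle, not Vail's original table): the most frequent letters, E and T, get the shortest codes.

    # A small subset of international Morse code; frequent letters have short codes.
    MORSE = {"E": ".", "T": "-", "A": ".-", "I": "..", "N": "-.",
             "S": "...", "O": "---", "Q": "--.-"}

    def encode(text):
        """Encode text letter by letter; letters outside the subset are skipped."""
        return " ".join(MORSE[c] for c in text.upper() if c in MORSE)

    print(encode("notes"))   # -. --- - . ...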

Explanation choices
This collection for algorithms and computers shows that information processing is a combination of these three parts.
Q-2.3.2 Information Communication Technology.
Heinrich Rudolf Hertz
Heinrich Hertz
Heinrich Rudolf Hertz Between 1886 and 1889 Hertz would conduct a series of experiments that would prove the effects he was observing were results of Maxwell's predicted electromagnetic waves. Starting in November 1887 with his paper "On Electromagnetic Effects Produced by Electrical Disturbances in Insulators", Hertz would send a series of papers to Helmholtz at the Berlin Academy, including papers in 1888 that showed transverse free space electromagnetic waves traveling at a finite speed over a distance.
Tesla_thinker
Nikola Tesla
Nikola Tesla The three big firms, Westinghouse, Edison, and Thomson-Houston, were trying to grow in a capital-intensive business while financially undercutting each other.
There was even a "war of currents&qut propaganda campaign going on with Edison Electric trying to claim their direct current system was better and safer than the Westinghouse alternating current system.
Competing in this market meant Westinghouse would not have the cash or engineering resources to develop Tesla´s motor and the related polyphase system right away.

Guglielmo Marconi
Marconi
Guglielmo Marconi Late one night, in December 1894, Marconi demonstrated a radio transmitter and receiver to his mother, a set-up that made a bell ring on the other side of the room by pushing a telegraphic button on a bench. Supported by his father, Marconi continued to read through the literature and picked up on the ideas of physicists who were experimenting with radio waves. He developed devices, such as portable transmitters and receiver systems, that could work over long distances, turning what was essentially a laboratory experiment into a useful communication system. Marconi came up with a functional system with many components
Explanation choices
This collection for Computer Technology shows that information processing is a combination of these three parts.
Going from vacuum tubes to transistors and on to chips was an evolution building on this basic knowledge. The number of information elements has grown dramatically. Storage measured in tebibytes (2**40 bytes) has become normal, one byte having 8 bits. Communication speeds of 100 Mbit/s (where 1 Mibit is 2**20 bits per second) have become normal. These measurements increase in factors of 1024 (2**10). byte
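The arithmetic behind those binary prefixes, as a quick sketch:

    # Binary prefixes grow in factors of 1024 (2**10).
    KIB = 2 ** 10                # kibibyte: 1 024 bytes
    TIB = 2 ** 40                # tebibyte: 1 099 511 627 776 bytes
    print(TIB, "bytes in one tebibyte")
    print(TIB * 8, "bits in one tebibyte (8 bits per byte)")
    print(100 * 2 ** 20, "bits per second for a 100 Mibit/s link")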
Q-2.3.3 Information Technology Fundaments.
von Neumann
Hardware design - John von Neumann
von Neumann
The term "von Neumann architecture" has evolved to mean any stored-program computer in which an instruction fetch and a data operation cannot occur at the same time because they share a common bus.
This is referred to as the von Neumann bottleneck and often limits the performance of the system.
The design of a von Neumann architecture machine is simpler than a Harvard architecture machine which is also a stored-program system but has one dedicated set of address and data buses for reading and writing to memory, and another set of address and data buses to fetch instructions.
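A toy sketch of the stored-program idea (my own minimal example, not any real instruction set): instructions and data sit in the same memory, and every instruction fetch or data access goes through that single shared memory, which is exactly the bottleneck described above.

    # Toy stored-program machine: one memory holds both instructions and data.
    memory = [
        ("LOAD", 6),    # 0: acc = memory[6]
        ("ADD", 7),     # 1: acc = acc + memory[7]
        ("STORE", 8),   # 2: memory[8] = acc
        ("HALT",),      # 3: stop
        0, 0,           # 4-5: unused
        40, 2, 0,       # 6-8: data cells
    ]

    pc, acc = 0, 0
    while True:
        instr = memory[pc]          # instruction fetch uses the shared memory ...
        pc += 1
        if instr[0] == "LOAD":
            acc = memory[instr[1]]  # ... and so does every data access (the bottleneck)
        elif instr[0] == "ADD":
            acc += memory[instr[1]]
        elif instr[0] == "STORE":
            memory[instr[1]] = acc
        elif instr[0] == "HALT":
            break

    print(memory[8])  # 42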


Grace Hopper (Yale)
Mother of 3 GL languages - Grace Hopper
Grace Hopper (Yale edu) In addition to their work for the Navy, Hopper and her colleagues also completed calculations for the army and -ran numbers- used by John von Neumann in developing the plutonium bomb dropped on Nagasaki, Japan. ...
Though the term "bug" had been used by engineers since the 19th century to describe a mechanical malfunction, Hopper was the first to refer to a computer problem as a "bug" and to speak of "debugging" a computer. ...
As the number of computer languages proliferated, the need for a standardized language for business purposes grew. In 1959 COBOL (short for "common business-oriented language") was introduced as the first standardized general business computer language. Although many people contributed to the "invention" of COBOL, Hopper promoted the language and its adoption by both military and private-sector users.
Edgar_F_Codd
Data, Information structuring.- Edgar F Codd
Set the terms for relational data and transactional usage. Edgar F. Codd, the man who killed CODASYL.
An English computer scientist who, while working for IBM, invented the relational model for database management, the theoretical basis for relational databases and relational database management systems.
He made other valuable contributions to computer science, but the relational model, a very influential general theory of data management, remains his most mentioned, analyzed and celebrated achievement. .... He published his 12 rules to define what constituted a relational database. This made his position at IBM increasingly difficult. Codd´s 12 rules

Rule 0 The foundation rule:
For any system that is advertised as, or claimed to be, a relational data base management system, that system must be able to manage data bases entirely through its relational capabilities.
Rule 1 The information rule:
All information in a relational data base is represented explicitly at the logical level and in exactly one way – by values in tables.
Rule 2: The guaranteed access rule:
Each and every datum (atomic value) in a relational data base is guaranteed to be logically accessible by resorting to a combination of table name, primary key value and column name.
Rule 3: Systematic treatment of null values:
Null values (distinct from the empty character string or a string of blank characters and distinct from zero or any other number) are supported in fully relational DBMS for representing missing information and inapplicable information in a systematic way, independent of data type.
Rule 4: Dynamic online catalog based on the relational model:
The data base description is represented at the logical level in the same way as ordinary data, so that authorized users can apply the same relational language to its interrogation as they apply to the regular data.
Rule 5: The comprehensive data sublanguage rule:
A relational system may support several languages and various modes of terminal use (for example, the fill-in-the-blanks mode). However, there must be at least one language whose statements are expressible, per some well-defined syntax, as character strings and that is comprehensive in supporting all of the following items:
  1. Data definition.
  2. View definition.
  3. Data manipulation (interactive and by program).
  4. Integrity constraints.
  5. Authorization.
  6. Transaction boundaries (begin, commit and rollback).
Rule 6: The view updating rule:
All views that are theoretically updatable are also updatable by the system.
Rule 7: Relational Operations Rule / Possible for high-level insert, update, and delete:
The capability of handling a base relation or a derived relation as a single operand applies not only to the retrieval of data but also to the insertion, update and deletion of data.
Rule 8: Physical data independence:
Application programs and terminal activities remain logically unimpaired whenever any changes are made in either storage representations or access methods.
Rule 9: Logical data independence:
Application programs and terminal activities remain logically unimpaired when information-preserving changes of any kind that theoretically permit unimpairment are made to the base tables.
Rule 10: Integrity independence:
Integrity constraints specific to a particular relational data base must be definable in the relational data sublanguage and storable in the catalog, not in the application programs.
Rule 11: Distribution independence:
The end-user must not be able to see that the data is distributed over various locations. Users should always get the impression that the data is located at one site only.
Rule 12: The nonsubversion rule:
If a relational system has a low-level (single-record-at-a-time) language, that low level cannot be used to subvert or bypass the integrity rules and constraints expressed in the higher level relational language (multiple-records-at-a-time).
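A small illustration of rules 2 and 4 using Python's built-in sqlite3 module (only a sketch; SQLite is not a fully Codd-compliant DBMS, but its sqlite_master table plays the role of a catalog that is queried with the same language as ordinary data):

    import sqlite3

    con = sqlite3.connect(":memory:")
    cur = con.cursor()
    cur.execute("CREATE TABLE employee (id INTEGER PRIMARY KEY, name TEXT, dept TEXT)")
    cur.execute("INSERT INTO employee VALUES (1, 'Codd', 'Research')")

    # Rule 2: every atomic value is reachable via table name + primary key + column name.
    name = cur.execute("SELECT name FROM employee WHERE id = 1").fetchone()[0]
    print(name)  # Codd

    # Rule 4 (in spirit): the catalog itself is queryable with the relational language.
    for row in cur.execute("SELECT name, sql FROM sqlite_master WHERE type = 'table'"):
        print(row)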

Explanation choices
This collection for Computer Technology shows that information processing is a combination of these three parts.
🎯 New horizons:
SIAR cycle

Q-2.4 Processing flows VSM - Change

A swarm organisation, or self-organisation, is a networked structure without leadership, working from some shared goal.
⚠ Challenges: have a shared goal, have a good shared goal.
The organisation structure is a hierarchical line of command.
📚 Challenges:
Q-2.4.1 Product management processes
Administrative/cyber processing similar to a factory
There are four stages: process cycle value stream
In a figure
See right side
Decisions with control on processes.
process cycle SIAR A different position of control hierarchy:
The controls for each stage are communicated to a central position. Central position in the middle.
The most important controls are those that are related to the handovers in the cycle, the diagonals.
The second level in the hierarchy is positioned slightly differently from the shop floor. The coverage at the second level is on the basic vertical-horizontal positions.
At the floor they are at decision points. (Enable-Assess)


Situation Input Actions Results, SIAR lean structured processing
Mindset prerequisites: Siar model - static
The model covers all of:

Ideate - Assess, Plan - Enable, Demand - Backend, Frontend - Delivery
Mindset prerequisites: Siar model - dynamic
The cube representation of the model shows a lot for categorisation; the static information elements are easy to place in it. Processing, transforming, is however a dynamic activity, and a circular representation is a better fit.
The cycle:
Customer interaction: bottom right side.
Supply chain interaction: bottom left side.
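A minimal sketch of that circular reading, assuming nothing beyond what is stated above: the four SIAR stages visited as a repeating cycle, with the stage-to-stage handovers as the interesting control points.

    from itertools import cycle, islice

    STAGES = ["Situation", "Input", "Actions", "Results"]  # SIAR, read as a cycle

    def handovers(rounds=1):
        """List the stage-to-stage handovers for a number of full cycles."""
        stages = list(islice(cycle(STAGES), rounds * len(STAGES) + 1))
        return list(zip(stages, stages[1:]))

    for src, dst in handovers():
        print(f"handover: {src} -> {dst}")   # includes the wrap-around Results -> Situation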

Q-2.4.2 Delivering data products in a cycle
Combining the factory approach with the four basic administrative/cyber steps: data driven processing.
Jabes generic process
A figure:
See right side

Q-2.4.3 Three stages in realisation
Materials retrieval
Requirements: Jabes process Assurance
A figure:
See right side

Processing Materials into a product
Requirements: Jabes process Assurance
A figure:
See left side

Processing Materials into a product
Requirements:
Jabes process Assurance
A figure:
See right side

Q-2.4.4 Control & Compliancy
Requirements to set, document & validate: Jabes process Assurance
In a figure:
See left side
elephant-blind-men

Q-2.5 Processes Building Blocks Realisations

The term elephant test refers to situations in which an idea or thing, "is hard to describe, but instantly recognizable when spotted"
A process life cycle building block, the ALC life cycle, is very generic and simplistic. There are only three possible approaches.
To solve:
😱 Understanding goals, bringing value is hampered by culture.
😱 Technology driven is failing to align to product values streams.
🎭 Q-2.5.1 Process approaches at the shop floor
Fully human, immediate impact: ALC-V1
Change immediate, human invented only ⚙ This simple job-shop approach is: ⚖ This simple job-shop approach is used for: These attributes result in avoiding ICT staff (technology) for the work of building those reports.
⚒ ➡ Alignment for information processing:
Delegated but human, validation before change: ALC-V2
Change human invented, only algorithmic ⚙ This advanced approach is: ⚖ This advanced approach is used for: ⚒ ➡ Alignment for information processing:
Computer aided decision making, validation before change: ALC-V3
Change with algorithmic support under human guidance ⚖ This sophisticated approach is: ⚖ This sophisticated approach is used for: ⚒ ➡ Alignment for information processing:
🎭 Q-2.5.2 ALC-V1 process details
⚖ This simple job-shop approach is: The simplicity is also the loophole. Using a spreadsheet and not having independent validations is a pathway for spurious results.
A figure of a classic one-off process, operations:
one-off process, immediate running

🎭 Q-2.5.3 ALC-V2 process details
⚖ This advanced approach is: A figure of this classic process: develop, test - operations:
Develop & Test, POC, before running

🎭 Q-2.5.4 ALC-V3 process details
⚖ This sophisticated approach is: A personal figure of an ML (machine learning) process: develop, test - operations:
BPM AI ML proces

Data monetizing journey

Q-2.6 Organisation & Business Understanding

Once Dorothy and her colleagues made the journey to OZ, they quickly found out that there was no there, there. The Wizard simply told her what she really should have known all along.
Dorothy and her companions just had to figure out how to reframe their own perceived shortcomings and recast them as strengths to achieve real transformation.

🎭 Q-2.6.1 Retrospective creating this page
🕳 Categorizing content, start with the why
I assumed that collecting the old information into this knowledge page would be easy. How wrong I was: there was no story line, no plan, no direction for the categorisation. I was forced to restart with Start with Why (Simon Sinek, 2009). Sinek says people are inspired by a sense of purpose (or "Why"), and that this should come first when communicating, before "How" and "What".
The Why for this knowledge page is: This failed for some aspects and is successful with other ones.

❓ The question: why is this happening?
🤔 Assumptions were made. 👁 Conclusion: these assumptions are false.
The reality, there is:
🎭 Q-2.6.2 Retrospective creating pillar pages
Steer Shape Serve in pillars, the visualisation.
Avoiding the word "Information", Added symbols from the Jabes framework with the Jabes application.
Three pillars but the activities are mixed:
Information accountability is clearly at "steer", business organisation.
Communication interactions are a combination spread over the different types of activities.
dtap layers application
The figure,
See right side:

Steer Shape Serve in pillars, categorizing content, start with the why
Succeeded: Putting technology at a distance results in a gap in supporting the organisation's simple processes with technology.
👁 These pillars are conforming an organisation hierarchy.
👉🏾 Troublesome is the missing awareness for aligning technology and product management in strategy and tactics (I-Steer, I-Shape, I-Serve). This results in gaps at operations.

🎭 Q-2.6.3 Retrospective building reference pages
Reference pages, categorizing content, start with the why
Succeeded: 👁 These reference pages have no conforming organisation hierarchy.
Rotating the overhauled nine-plane ordering into Steer Serve Shape. Replacing the hierarchically associated words strategy, tactical, operations with: basic, competent, advanced.
Three levels of growing skills, the activities are mixed:
Putting technology in the middle makes it possible to show acting on simple adjustments (Steer-Serve) and on ones with more impact (Serve-Shape).
dtap layers application
The figure,
See right side:


This page, categorizing content, start with the why
Succeeded: Changing and solving all issues will not be easy.
A tool can help but is not decisive for the needed cultural change.
6w 1 how
Categorisation of 5W1H
Having six paragraphs, there is a building up from a request, via a goal, into a result.
A lazy categorisation:
📚 Q.2.6.4 External references
Many less important external references are part of the text.
Relevant links, a limited list:
link , newstopic interest who, source date
The Four Dimensions of Semantic Quality
Ronald G. Ross 2020
A Philosophy of Security Architecture Design
Geir M. Køien 2020-04
Digital Leadership: The Objective-Subjective Dichotomy of Technology Revisited
Benny De Waal, Pascal Ravesteyn, Frank van Outvorst 2016
BIDM - The Business Intelligence Development Model
Catalina Sacu, Marco Spruit 2010-06

🎭 Q-2.6.5 ICT & Philosophy
Why Philosophy?
A Philosophy of Security Architecture Design
There certainly are many technical aspects of modern information and communications technology (ICT) systems and the associated security architectures. Indeed, most of the aspect of how to achieve the goals tend to be of a technical nature. However, questions concerning why need not be technical at all. That is, on a systems level, the end goal of a security architecture are normally not technical in nature. Instead, the goals tend to be more philosophical. They may be framed in a context of moral and ethics, and sometimes in the framework of legislation and societal rules.
The distinction between the technical and concrete aspects and the philosophical aspects can be conceptualized as the difference between verification and validation. Verification is largely about checking that something is done the correct way, whereas validation is concerned about whether one is doing the right thing. It is of little use to do something correctly, if one is indeed doing the wrong thing in the first place.

... taleb-fragile_antifragile
The need for security, safety and privacy is in many ways self-evident. Large-scale critical infrastructures is essential to society, and so the level of security, safety and privacy becomes a question about what kind of society one wants to have. We shall not dive into safety and privacy in this paper. However, we argue that strong security is a necessary condition for both safety and privacy. This puts emphasis on the importance of an effective and comprehensive security architecture. Informally, the differences and relationships between security, safety and privacy can be stated as follows: ...
The Incerto is a set of books and essays on uncertainty. Touted as an investigation of opacity, luck, uncertainty, probability, human error, risk, and decision making, the main body of the Incerto consists of five books by Nassim Nicolas Taleb. ...
The design of the proactive parts of a security architecture will be explicitly specified. These are “design phase” requirements. It will be possible to have a complete and consistent design for the proactive measures. The dynamic/reactive parts of the security architecture, which will be dealing with incident detection and response, recovery, etc., will likely not be completely specified (if at all). ...
There does not seem to be much work done on security architecture designs for large-scale critical infrastructures. Frederick Brooks, of “The Mythical Man-Month” fame, has written extensively about system designs in “The Design of Designs” . ...


🎯 Algol Interact Tenets Change Volatile North 🎯
  
🚧  Variety Act on Cyber Change ALC-V* Knowit 🚧
  
🔰 Contents Mesh ABCs Control ALC-V* Polestar 🔰


Q-3 Decisions in Wisdom


meandering path

Q-3.1 Miscellaneous Knowledge Wisdom

Computer technology has several lines of evolution. Hardware has become faster, better, cheaper. Application software has a few basic fundaments in logic: flows and constructs. The problem to solve has moved from the purely technical question of how to run machines to how to process information in a technical way.

Q-3.1.1 Distractors from knowledge
Aristotle & Plato
These old Greek philosophers already stated the core problem of analytics: meaning and concepts. theory_of_universals Aristotle to Platonic realism Socrates Plato and Xenophon's dialogues provide the main source of information on Socrates's life and thought. These writings are the Sokratikoi logoi, or Socratic dialogues, which consist of reports of conversations apparently involving Socrates.
The most famous is the allegory of the cave. Any disruptive change, although it may be the real truth, is not automatically an acceptable option for those involved. Decisions on information change by perspectives.

Algorithm
Al-Khwarizmi: the origin of the term "algorithm". An algorithm is a simplified recipe to solve a known type of problem. algorist vs abacist usage
Al-Khwarizmi´s contributions to mathematics, geography, astronomy, and cartography established the basis for innovation in algebra and trigonometry.
His systematic approach to solving linear and quadratic equations led to algebra, a word derived from the title of his book on the subject, "The Compendious Book on Calculation by Completion and Balancing".
On the Calculation with Hindu Numerals written about 820, was principally responsible for spreading the Hindu-Arabic numeral system throughout the Middle East and Europe. It was translated into Latin as Algoritmi de numero Indorum. Al-Khwarizmi, rendered as (Latin) Algoritmi, led to the term "algorithm". Some of his work was based on Persian and Babylonian astronomy, Indian numbers, and Greek mathematics.

📚 These once-incomprehensible types of algorithms are basic school maths lessons these days. How to solve quadratic equations: just follow the recipe to get the answer.
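The recipe itself, as a short sketch:

    import cmath

    def solve_quadratic(a, b, c):
        """Return the two roots of a*x**2 + b*x + c = 0 (complex if needed)."""
        d = cmath.sqrt(b * b - 4 * a * c)   # square root of the discriminant
        return (-b + d) / (2 * a), (-b - d) / (2 * a)

    print(solve_quadratic(1, -3, 2))   # ((2+0j), (1+0j))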

colossus machine
Encrypting decrypting machines.
Once machines enabled communication and the earlier technical limits were solved, a new problem arose: how to keep information secret when it is sent over transmission channels that are easily tapped by others?
Colossus
Colossus was a set of computers developed by British codebreakers in the years 1943-1945 to help in the cryptanalysis of the Lorenz cipher. Colossus used thermionic valves (vacuum tubes) to perform Boolean and counting operations. Colossus is thus regarded as the world´s first programmable, electronic, digital computer, although it was programmed by switches and plugs and not by a stored program.
This was not the decryption of the Enigma machine but of the mechanically enciphered telex machines (Lorenz machines - Tunny).

Decrypting Enigma, Turing
British bombe
Bombe machine The British bombe was an electromechanical device designed by Alan Turing soon after he arrived at Bletchley Park in September 1939. Harold "Doc" Keen of the British Tabulating Machine Company (BTM) in Letchworth (35 kilometres (22 mi) from Bletchley) was the engineer who turned Turing´s ideas into a working machine—under the codename CANTAB. Turing´s specification developed the ideas of the Poles´ bomba kryptologiczna but was designed for the much more general crib-based decryption.

The Enigma cipher itself was never broken mathematically; sloppy procedures leaking basic conventions reduced the number of options to check, enough to get sufficient material decrypted in time. The impact of reading what another did not want to be known was huge.
While Germany introduced a series of improvements to Enigma over the years, and these hampered decryption efforts to varying degrees, they did not ultimately prevent Britain and its allies from exploiting Enigma-encoded messages as a major source of intelligence during the war. Many commentators say the flow of communications intelligence from Ultra´s decryption of Enigma, Lorenz and other ciphers shortened the war significantly and may even have altered its outcome.
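For contrast with those machines, a toy illustration of substitution encryption (a plain Caesar shift, far simpler than a rotor machine): a fixed, predictable substitution, like a sloppy operating procedure, is exactly what gives a codebreaker a way in.

    import string

    ALPHABET = string.ascii_uppercase

    def caesar(text, shift):
        """Shift every letter by a fixed amount; non-letters pass through unchanged."""
        table = str.maketrans(ALPHABET, ALPHABET[shift:] + ALPHABET[:shift])
        return text.upper().translate(table)

    secret = caesar("ATTACK AT DAWN", 3)
    print(secret)              # DWWDFN DW GDZQ
    print(caesar(secret, -3))  # ATTACK AT DAWN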

Q-3.1.2 Different paths, philosophy
Portrait_of_Niccolò_Machiavelli.jpg
Politics philosophy - Machiavelli
The Prince (Il Principe), written around 1513 but not published until 1532, five years after his death.
After his death Machiavelli's name came to evoke unscrupulous acts of the sort he advised most famously in his work, The Prince. He claimed that his experience and reading of history showed him that politics have always been played with deception, treachery, and crime. He also notably said that a ruler who is establishing a kingdom or a republic, and is criticized for his deeds, including violence, should be excused when the intention and the result are beneficial to him.[9][10][11] Machiavelli's Prince has been surrounded by controversy since it was published. Some consider it to be a straightforward description of political reality. Others view The Prince as a manual, teaching would-be tyrants how they should seize and maintain power.
...
That a community has different components whose interests must be balanced in any good regime is an idea with classical precedents, but Machiavelli's particularly extreme presentation is seen as a critical step towards the later political ideas of both a division of powers or checks and balances, ideas which lay behind the US constitution, as well as many other modern state constitutions.
Charles_Darwin_seated_crop.jpg
Philosophy Change Nature - Charles Darwin
Darwin published his theory of evolution with compelling evidence in his 1859 book On the Origin of Species . By the 1870s, the scientific community and a majority of the educated public had accepted evolution as a fact. However, many favoured competing explanations that gave only a minor role to natural selection, and it was not until the emergence of the modern evolutionary synthesis from the 1930s to the 1950s that a broad consensus developed in which natural selection was the basic mechanism of evolution. Darwin's scientific discovery is the unifying theory of the life sciences, explaining the diversity of life. ...
He saw that European colonisation would often lead to the extinction of native civilisations, and "tried to integrate colonialism into an evolutionary history of civilization analogous to natural history". ...
The term Darwinism was used for the evolutionary ideas of others, including Spencer's "survival of the fittest" as free-market progress, and Ernst Haeckel's polygenistic ideas of human development. Writers used natural selection to argue for various, often contradictory, ideologies such as laissez-faire dog-eat-dog capitalism, colonialism and imperialism. However, Darwin's holistic view of nature included "dependence of one being on another"; thus pacifists, socialists, liberal social reformers and anarchists such as Peter Kropotkin stressed the value of co-operation over struggle within a species. Darwin himself insisted that social policy should not simply be guided by concepts of struggle and selection in nature. ...

Sociology - Melvin Kranzberg
Decision making: a limited philosophy on those aspects.
Kranzberg is known for his laws of technology.
Melvin Kranzberg is best known for his six laws of technology. He served in the U.S. Army in Europe during World War II. He received a Bronze Star for interrogating captured German prisoners and learning the location of Nazi gun emplacements. He was one of two interrogators out of nine in Patton's army who were not killed during the conflict.

Political theorist - Langdon Winner
Technology and politics: a limited philosophy on those aspects.
To the question he poses "Do Artifacts Have Politics?", Winner identifies two ways in which artifacts can have politics.
The first, involving technical arrangements and social order, concerns how the invention, design, or arrangement of artifacts or the larger system becomes a mechanism for settling the affairs of a community. This way "transcends the simple categories of 'intended' and 'unintended' altogether", representing "instances in which the very process of technical development is so thoroughly biased in a particular direction that it regularly produces results heralded as wonderful breakthroughs by some social interests and crushing setbacks by others".
This second way in which artifacts can have politics can be further articulated as consisting of four 'types' of artifacts: those requiring a particular internal sociological system, those compatible with a particular internal sociological system, those requiring a particular external sociological system, and those compatible with a particular external sociological system.
Certain features of Winner's thesis have been criticized by other scholars, including Bernward Joerges.
Over the years one focus of Winner's criticism has been the excessive use of technologies in the classroom, both in K-12 schools and higher education. Winner's critique is well explained in his article "Information Technology and Educational Amnesia,"and expressed in his satirical lecture, "The Automatic Professor Machine."
Philosophy of Science, Technology & Society - Peter-Paul Verbeek
Theory of Technological Mediation: a limited philosophy on those aspects.
Verbeek presents the purpose of his theory of technological mediation as systematically analyzing the influence of technology on human behavior in terms of the role technology plays in human-world relations. In his original theory, a number of different human-technology-world relations are stipulated (the first four based on the philosophy of Don Ihde). A unique feature of Verbeek's philosophy of technology is its basis in an empirical analysis of technologies. Instead of generating an overarching framework by which the universal features of specific technologies can be analyzed, Verbeek takes the technology itself as point of departure; this is for example illustrated by his analysis of ultrasound technology.
Q-3.1.3 Impact technology changes
Change - food, climate
Changes over time/space and lessons for future food safety The emergence of city-states has been a major driver of food system changes, bringing together large populations within defined boundaries and requiring complex governance to deliver sufficient quantities and quality of food. Advances in food storage, with sealed containers and curing methods, the use of animal transport, sailing ships, and trains to move larger volume than can be carried by individuals; trade in ingredients like salt as well as live animals and agricultural products; and increasing political and military conflict for resources all have been developments of the city-state.
Early impact of Mesoamerican goods in Iberian society. The early impact of Mesoamerican goods on Iberian society had a unique effect on European societies, particularly in Spain and Portugal. The introduction of American "miracle foods" was instrumental in pulling the Iberian population out of the famine and hunger that was common in the 16th century. Maize (corn), potatoes, turkey, squash, beans, and tomatoes were all incorporated into existing Spanish and Portuguese cuisine styles. Equally important was the impact of coffee and sugar cane growing in the New World (despite having already existed in the Old World). Along with the impact from food, the introduction of new goods (such as tobacco) also altered how Iberian society worked.

Change - Social structure
Age of Enlightenment The Enlightenment featured a range of social ideas centered on the value of knowledge learned by way of rationalism and of empiricism and political ideals such as natural law, liberty, and progress, toleration and fraternity, constitutional government and the formal separation of church and state.
... The Enlightenment was preceded by the Scientific Revolution and the work of Francis Bacon and John Locke, among others.
... Others cite the publication of Isaac Newton's Principia Mathematica (1687) as the culmination of the Scientific Revolution and the beginning of the Enlightenment.
... Philosophers and scientists of the period widely circulated their ideas through meetings at scientific academies, Masonic lodges, literary salons, coffeehouses and in printed books, journals, and pamphlets. The ideas of the Enlightenment undermined the authority of the monarchy and religious officials and paved the way for the political revolutions of the 18th and 19th centuries.

Change - Shrinking world
Global village describes the phenomenon of the entire world becoming more interconnected as the result of the propagation of media technologies throughout the world. The term was coined by Canadian media theorist Marshall McLuhan in his books The Gutenberg Galaxy: The Making of Typographic Man (1962) and Understanding Media (1964).
🎯 New goals:
Ai missing ML

Q-3.2 Communication - Interactions

Working with machines that process information is a relatively new topic of science. Human communication and interaction is classic.
The concept of the "information" container is not that clear and simple.
Information is an abstract concept that refers to that which has the power to inform. At the most fundamental level, information pertains to the interpretation (perhaps formally) of that which may be sensed, or their abstractions. Any natural process that is not completely random and any observable pattern in any medium can be said to convey some amount of information.
Q-3.2.1 Goals feeding decisions
Yellow brick road
Monetizing Data: Follow the Yellow Brick Road (Mark Katz 2018) Firms can undergo the same kind of journey, only to find out that there is indeed no “magic” to solving data monetization challenges. While the tools have vastly improved, and the power of BI buttressed by AI and Machine Learning has helped greatly with incorporating challenges like unstructured and disparate data (internal and external), that Yellow Brick Road journey still requires cultural and operational steps including empowerment of associated teams. There is not a lot of room for autocracy in achieving the best results. Foundational elements work best, and collaboration is a must.
...
In my experience around toolsets, it is often a mistake to think that monetizing data is as easy as dropping tools into internal users or customer’s hands and you have a data product. That approach can be myopic, ultimately damaging a firm’s brand, causing the data journey to move sideways.
Firms should avoid building process and foundational data strategies around software, hoping for an easy answer—that will indeed be an expensive mistake.

The need for decision making
Decision making is necessary when there are relationships with others. As soon as a conflict arises, the choice is how to solve it. I used the words that are common in organizing agile ICT.
🤔 Questions:
Ethicals: Volatility, uncertainty, complexity, ambiguity
VUCA world.
The deeper meaning of each element of VUCA serves to enhance the strategic significance of VUCA foresight and insight as well as the behaviour of groups and individuals in organizations. It discusses systemic failures and behavioural failures, which are characteristic of organisational failure.

Ethicals around decision making.
Review: The Design of Design (F. Brooks 2010, author of The Mythical Man-Month). While seeking new silver bullets for organising information processing, old questions keep coming back.
How to make Decisions in uncertain Times (Wolfgang Goebl 2015)
It is generally assumed that collaboration is, in and of itself, a "good thing." "Plays well with others" is high praise from kindergarten onward. "All of us are smarter than any of us." "The more participation in design, the better." Now, these attractive propositions are far from self-evident. I will argue that they surely are not universally true. Most great works of the human mind have been made by one mind, or two working closely.
This is true of most of the great engineering feats of the 19th and early 20th centuries. But now, team design has become the modern standard, for good reasons. The danger is the loss of conceptual integrity in the product, a very grave loss indeed. So the challenge is how to achieve conceptual integrity while doing team design, and at the same time to achieve the very real benefits of collaboration. [F. Brooks: 'The Design of Designs', 2010]
What you should do: The Architectural Thinking Framework defines the following: To deal with the challenges of the VUCA world, many companies experiment with shifting the idea of agility, as broadly used in software engineering practices in form of e.g. SCRUM to the whole organization. Browsing through approaches about scaled Agile, some of their proponents seem to propose that all decisions should be made decentralized by autonomous teams. Use the knowledge of the many and you will get the right solutions.
But that is far from true.
All approaches proposing the agile enterprise do not take one thing into account: architecture. Building solutions in a sound architectural form needs common elements and ‘conceptual integrity’. This means that the concepts and structures of the business (capabilities, value streams, products & services, business objects) and IT (technology components) must play together in a way that maximizes simplicity, consistency, agility and thus business value.

Q-3.2.2 Enterprise Culture vision
Deming's legacy and the Toyota way
25 Years after W. Edwards Deming
He greatly influenced the management of quality in Japan, where he is still revered as one of the great gurus in manufacturing.
Through his influence on Toyota, his ideas are now common in the lean world.

⚖ Lean culture, PDCA
❓ what is real lean about?
  1. Create constancy of purpose toward improvement of product and service, with the aim to become competitive, to stay in business and to provide jobs.
  2. Adopt the new philosophy. We are in a new economic age.
    Western management must awaken to the challenge, must learn their responsibilities, and take on leadership for change.
  3. Cease dependence on inspection to achieve quality. Eliminate the need for massive inspection by building quality into the product in the first place.
  4. End the practice of awarding business on the basis of a price tag.
    Instead, minimize total cost. Move towards a single supplier for any one item, on a long-term relationship of loyalty and trust.
  5. Improve constantly and forever the system of production and service, to improve quality and productivity, and thus constantly decrease costs.
  6. Institute training on the job.
  7. Institute leadership . The aim of supervision should be to help people and machines and gadgets do a better job. Supervision of management is in need of overhaul, as well as supervision of production workers.
    (see Point 12 and Ch. 8 of Out of the Crisis).
  8. Drive out fear, so that everyone may work effectively for the company.
    (See Ch. 3 of Out of the Crisis)
  9. Break down barriers between departments.
    People in research, design, sales, and production must work as a team, to foresee problems of production and usage that may be encountered with the product or service.
  10. Eliminate slogans, exhortations, and targets for the work force asking for zero defects and new levels of productivity. Such exhortations only create adversarial relationships, as the bulk of the causes of low quality and low productivity belong to the system and thus lie beyond the power of the work force.
    1. Eliminate work standards (quotas) on the factory floor. Substitute with leadership.
    2. Eliminate management by objective. Eliminate management by numbers and numerical goals. Instead substitute with leadership.
  11. Remove barriers that rob the hourly worker of his right to pride of workmanship. The responsibility of supervisors must be changed from sheer numbers to quality.
  12. Remove barriers that rob people in management and in engineering of their right to pride of workmanship. This means, inter alia, abolishment of the annual or merit rating and of management by objectives (See Ch. 3 of Out of the Crisis).
  13. Institute a vigorous program of education and self-improvement.
  14. Put everybody in the company to work to accomplish the transformation. The transformation is everybody´s job.
⚖ Lean, Deadly Diseases
Well that is real lean, very sensible, far more than that PDCA cycle.
❓ What should be avoided with real lean?
He also created a list of the “Seven Deadly Diseases,” which are also sensible.
  1. Lack of constancy of purpose
  2. Emphasis on short-term profits
  3. Evaluation by performance, merit rating, or annual review of performance
  4. Mobility of management
  5. Running a company on visible figures alone
  6. Excessive medical costs
  7. Excessive costs of warranty, fueled by lawyers who work for contingency fees

⚖ Abstraction forces in the organisation:
There are more lines of power in an organisation. Some of those:
  1. Finance-based management. Goal: profits, or at least enough budget for the tasks.
  2. Core business. Goal: Fulfilling the operations for tasks of the organisation.
  3. Green fields. Goal: Improvement, product research, customer relations.
💣 The powers are not equally balanced. The core business (operations) is commonly the line with the least influence at the strategic level. The risk is a total loss of all the tasks the business was positioned to do. Going back to the basics of Fayol.
A proposal for a generic approach balancing powers. CEO The responsibilities of an organization´s CEO are set by the organization´s board of directors or other authority, depending on the organization´s structure. They can be far-reaching or quite limited, and are typically enshrined in a formal delegation of authority regarding business administration.
Typically, responsibilities include being an active decision-maker on business strategy and other key policy issues, leader, manager, and executor. The communicator role can involve speaking to the press and to the public, as well as to the organization´s management and employees; the decision-making role involves high-level decisions about policy and strategy. The CEO is tasked with implementing the goals, targets and strategic objectives as determined by the board of directors.

Q-3.2.3 Decisions with the aid of machines
Usability: classic explainable Machine Learning
Recognizing a cat or a dog in a Rorschach setting: the result is one that will fulfil the algorithmic requirements, but that does not guarantee a correct result in real-world perception. The world of deep learning AI produces results categorizing images, sound and more in a mostly, but not absolutely, correct classification.
The only thing that gets underpinned by using information (data) is the decisions that were previously made on "good human feeling". This is not negative advice against using machines; it is positive advice to use machines wisely.
Qualities: The problems in doing that usually exposed poor human decision making. Blaming the machine for exposing that kind of human issue is human nature.

Usability: Deep Learning
optical vase-humans, optical young-old woman. These kinds of decisions are partial inputs to another decision. Looking for a vase or looking for persons talking to each other are very different questions. Asking whether there is a young or an old woman, and seeing that the same picture contains both of them.
The fulfilment by the same image is counterintuitive.
 
What kinds of problems are good candidates for this type of automated machine classification decision?
Validating whether it is a cat or a dog is easily done by a human, but not by running the same machine model again.
Cleaning up harvested natural goods using machines could be automated with image recognition. Whether the fall-out is segregated well enough is easy for a human to see, but correcting the image selection when it is not appropriate is a difficult change.
The hyped biometric recognition in computer technology I avoided in this list. Probably machines are already better at recognizing humans than humans are under the same limitations.
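A minimal sketch of that human-in-the-loop validation (hypothetical labels and predictions, no real model): measure how often the classifier is right and route only the disagreements to a human reviewer.

    # Hypothetical ground truth and model output for eight images.
    truth     = ["cat", "dog", "cat", "cat", "dog", "dog", "cat", "dog"]
    predicted = ["cat", "dog", "dog", "cat", "dog", "cat", "cat", "dog"]

    hits = sum(t == p for t, p in zip(truth, predicted))
    print(f"accuracy: {hits}/{len(truth)}")     # mostly, but not absolutely, correct

    # Re-running the same model reproduces the same errors; the human check is the validation.
    for i, (t, p) in enumerate(zip(truth, predicted)):
        if t != p:
            print(f"image {i}: model said {p}, human review says {t}")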

🎯 New goals: Decisions on information change by perspectives.
rethink what has happened TN

Q-3.3 Historical evolvement Wisdom

Information is often processed iteratively: Data available at one step are processed into information to be interpreted and processed at the next step.
For example, in written text each symbol or letter conveys information relevant to the word it is part of, each word conveys information relevant to the phrase it is part of, each phrase conveys information relevant to the sentence it is part of, and so on until at the final step information is interpreted and becomes knowledge in a given domain.

Q-3.3.1 Logical constructs information processing.
Alan turing
Decisions, problems solving, encryption - Turing
Alan-Turing Honored by: Turing award
What mathematicians called an "effective" method for solving a problem was simply one that could be carried out by a human mathematical clerk working by rote. In Turing´s time, those rote-workers were in fact called "computers," and human computers carried out some aspects of the work later done by electronic computers. The Entscheidungsproblem sought an effective method for solving the fundamental mathematical problem of determining exactly which mathematical statements are provable within a given formal mathematical system and which are not. A method for determining this is called a decision method. In 1936 Turing and Church independently showed that, in general, the Entscheidungsproblem has no resolution, proving that no consistent formal system of arithmetic has an effective decision method.
It was in the course of his work on the Entscheidungsproblem that Turing invented the universal Turing machine, an abstract computing machine that encapsulates the fundamental logical principles of the digital computer. ...
Turing was a founding father of artificial intelligence and of modern cognitive science, and he was a leading early exponent of the hypothesis that the human brain is in large part a digital computing machine. He theorized that the cortex at birth is an "unorganised machine" that through "training" becomes organized into a universal machine or something like it. Turing proposed what subsequently became known as the Turing test as a criterion for whether an artificial computer is thinking (1950).
To be or not to be? Turing's proof It was the second proof (after Church´s theorem) of the conjecture that some purely mathematical yes-no questions can never be answered by computation; more technically, that some decision problems are "undecidable" in the sense that there is no single algorithm that infallibly gives a correct "yes" or "no" answer to each instance of the problem. In Turing´s own words: "...what I shall prove is quite different from the well-known results of Gödel ... I shall now show that there is no general method which tells whether a given formula U is provable in K [Principia Mathematica]..." (Undecidable, p. 145).

Edsger_Wybe_Dijkstra
Science information processing - EW Dijkstra
Quality Correctness Elegance.
Edsger Dijkstra: the question of software quality through mathematical abstraction.
One of the most influential figures of computing science's founding generation, Dijkstra helped shape the new discipline from both an engineering and a theoretical perspective. His fundamental contributions cover diverse areas of computing science, including compiler construction, operating systems, distributed systems, sequential and concurrent programming, programming paradigm and methodology, programming language research, program design, program development, program verification, software engineering principles, graph algorithms, and philosophical foundations of computer programming and computer science. Many of his papers are the source of new research areas. Several concepts and problems that are now standard in computer science were first identified by Dijkstra or bear names coined by him.
Structured programming.
Jackson structured programming, Nassi-Shneiderman diagrams and Algol were all basic elements in education touching software design in the first years thereafter.
"The revolution in views of programming started by Dijkstra's iconoclasm led to a movement known as structured programming, which advocated a systematic, rational approach to program construction. Structured programming is the basis for all that has been done since in programming methodology, including object-oriented programming."

Edward_lorenz
Chaos theory - Edward_Norton_Lorenz
Edward_lorenz By the late 1950s, Lorenz was skeptical of the appropriateness of the linear statistical models in meteorology, as most atmospheric phenomena involved in weather forecasting are non-linear. It was during this time that his discovery of deterministic chaos came about.

In 1961, Lorenz was using a simple digital computer, a Royal McBee LGP-30, to simulate weather patterns by modeling 12 variables, representing things like temperature and wind speed. He wanted to see a sequence of data again, and to save time he started the simulation in the middle of its course. He did this by entering a printout of the data that corresponded to conditions in the middle of the original simulation. To his surprise, the weather that the machine began to predict was completely different from the previous calculation. The culprit: a rounded decimal number on the computer printout. The computer worked with 6-digit precision, but the printout rounded variables off to a 3-digit number, so a value like 0.506127 printed as 0.506. This difference is tiny, and the consensus at the time would have been that it should have no practical effect. However, Lorenz discovered that small changes in initial conditions produced large changes in long-term results.
Lorenz's discovery, which gave its name to Lorenz attractors, showed that even detailed atmospheric modelling cannot, in general, make precise long-term weather predictions. His work on the topic, assisted by Ellen Fetter, culminated in the publication of his 1963 paper "Deterministic Nonperiodic Flow"
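A small sketch of that sensitivity, using the three-variable system from the 1963 paper rather than the original 12-variable model, and reusing the rounding from the anecdote (0.506127 versus 0.506) as the only difference between the two runs:

    def lorenz_run(x, y, z, steps=3000, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
        """Integrate the Lorenz 1963 equations with a simple Euler scheme."""
        for _ in range(steps):
            dx = sigma * (y - x)
            dy = x * (rho - z) - y
            dz = x * y - beta * z
            x, y, z = x + dx * dt, y + dy * dt, z + dz * dt
        return x, y, z

    a = lorenz_run(0.506127, 1.0, 1.0)   # full-precision starting value
    b = lorenz_run(0.506, 1.0, 1.0)      # the rounded printout value
    print(a)
    print(b)                             # after 30 time units the two runs no longer agree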
Explanation choices
This collection for processing information shows: it is remarkable that this basic theory of information processing is that recent.
Q-3.3.2 Using Data Analytics, statistics.
Bayes, 18th century.
Thomas_Bayes Thomas Bayes (wikipedia): one of the founders of probability theory.
Bayesian probability is the name given to several related interpretations of probability as an amount of epistemic confidence " the strength of beliefs, hypotheses etc." rather than a frequency. This allows the application of probability to all sorts of propositions rather than just ones that come with a reference class. "Bayesian" has been used in this sense since about 1950. Since its rebirth in the 1950s, advancements in computing technology have allowed scientists from many disciplines to pair traditional Bayesian statistics with random walk techniques. The use of the Bayes theorem has been extended in science and in other fields.
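A worked example of the theorem itself (the numbers are illustrative, not from any source): a test with 99% sensitivity and 95% specificity applied to a condition with 1% prevalence.

    prevalence  = 0.01   # P(condition)
    sensitivity = 0.99   # P(positive | condition)
    specificity = 0.95   # P(negative | no condition)

    # Bayes' theorem: P(condition | positive) = P(pos | cond) * P(cond) / P(pos)
    p_positive = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
    posterior = sensitivity * prevalence / p_positive
    print(round(posterior, 3))   # about 0.167: a positive test is far from certainty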

Laplace, 19th century.
Laplace is more generic; some of his theory is used in spectral signal processing. What is these days called "Bayesian" more likely comes from Laplace.
Pierre-Simon,_marquis_de_Laplace_(1745-1827) laplace (wikipedia)
In 1812, Laplace issued his Théorie analytique des probabilités in which he laid down many fundamental results in statistics. The first half of this treatise was concerned with probability methods and problems, the second half with statistical methods and applications. Laplace´s proofs are not always rigorous according to the standards of a later day, and his perspective slides back and forth between the Bayesian and non-Bayesian views with an ease that makes some of his investigations difficult to follow, but his conclusions remain basically sound even in those few situations where his analysis goes astray.

Fisher, 20th century.
Youngronaldfisher2.jpg Ronald Fisher (wikipedia) In 1925 he published Statistical Methods for Research Workers, one of the 20th century´s most influential books on statistical methods. Fisher´s method is a technique for data fusion or "meta-analysis" (analysis of analyses). This book also popularized the p-value, and plays a central role in his approach. Fisher proposes the level p=0.05, or a 1 in 20 chance of being exceeded by chance, as a limit for statistical significance, and applies this to a normal distribution (as a two-tailed test), thus yielding the rule of two standard deviations (on a normal distribution) for statistical significance.
The basics of statistics are mostly practiced descriptively; the only prediction is extrapolation from a small sample to a complete population.
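Fisher's p = 0.05 and the "two standard deviations" rule can be checked directly with the Python standard library (a sketch using the normal distribution only):

    from statistics import NormalDist

    z = NormalDist().inv_cdf(1 - 0.05 / 2)   # critical value for a two-tailed test
    print(round(z, 2))                       # 1.96, roughly two standard deviations

    p = 2 * (1 - NormalDist().cdf(1.96))     # and back again: the p-value at z = 1.96
    print(round(p, 3))                       # 0.05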

Explanation choices
This collection for processing information shows: it is remarkable that this basic theory of information processing is that old. Only recently did this kind of knowledge become usable through computer technology, known as Artificial Intelligence and Machine Learning (AI, ML).
Q-3.3.3 Using Big Data, forgotten histories.
Age of Discovery Age of Discovery
"It saw also the first major victories of empirical inquiry over authority, the beginnings of that close association of science, technology, and everyday work which is an essential characteristic of the modern western world." ...
The Portuguese began systematically exploring the Atlantic coast of Africa in 1418, under the sponsorship of Infante Dom Henrique (Prince Henry). In 1488, Bartolomeu Dias reached the Indian Ocean by this route. ...
Technological advancements that were important to the Age of Exploration were the adoption of the magnetic compass and advances in ship design. ...
Indian Ocean trade routes were sailed by Arab traders. Between 1405 and 1421, the Yongle Emperor of Ming China sponsored a series of long range tributary missions under the command of Zheng He (Cheng Ho). The fleets visited Arabia, East Africa, India, Maritime Southeast Asia and Thailand. But the journeys, reported by Ma Huan, a Muslim voyager and translator, were halted abruptly after the emperor's death and were not followed up, as the Chinese Ming dynasty retreated in the haijin, a policy of isolationism, having limited maritime trade. ...

Florence Nightingale Florence Nightingale
Nightingale was a pioneer in statistics; she represented her analysis in graphical forms to ease drawing conclusions and actionables from data. She is famous for usage of the polar area diagram, also called the Nightingale rose diagram, equivalent to a modern circular histogram. This diagram is still regularly used in data visualisation. ...
she was simply opposed to a precursor of germ theory known as contagionism. This theory held that diseases could only be transmitted by touch. Before the experiments of the mid-1860s by Pasteur and Lister, hardly anyone took germ theory seriously; even afterwards, many medical practitioners were unconvinced. Bostridge points out that in the early 1880s Nightingale wrote an article for a textbook in which she advocated strict precautions designed, she said, to kill germs.

Lt._Matthew_Maury Matthew_Fontaine_Maury
Lieutenant Maury published his Wind and Current Chart of the North Atlantic, which showed sailors how to use the ocean's currents and winds to their advantage, drastically reducing the length of voyages. His Sailing Directions and Physical Geography of the Seas and Its Meteorology remain standard. Maury's uniform system of recording synoptic oceanographic data was adopted by navies and merchant marines around the world and was used to develop charts for all the major trade routes. ...
Maury became convinced that adequate scientific knowledge of the sea could be obtained only through international cooperation. He proposed that the United States invite the maritime nations of the world to a conference to establish a "universal system" of meteorology, and he was the leading spirit of a pioneer scientific conference when it met in Brussels in 1853. Within a few years, nations owning three-fourths of the shipping of the world were sending their oceanographic observations to Maury at the Naval Observatory, where the information was evaluated and the results were given worldwide distribution.
Explanation choices
This collection for processing information shows that information processing is a combination of these three parts.
🎯 New horizons:
SIAR cycle

Q-3.4 Processing flows VSM - Control

A swarm organisation, or self-organisation, is a networked structure without leadership, working from some shared goal.
⚠ Challenges: have a shared goal, have a good shared goal.
The organisation structure is a hierarchical line of command. Group formation using leaders is human nature.
⚠ Challenges: avoiding leadership to micro details, bad goals.
To solve:
😱 Every topic needs improvement proposals for how to address it.

Q-3.4.1 Defining priority for changes
Hoshin Kanri
😉 The start and reason of any change. There must be an associated goal of the organisation.
Part 1: The To-Do List x-matrix 4 seasons
... you will sooner or later come across an X-Matrix. It is a visually very impressive tool, but I am in serious doubt about its usefulness. It focuses on the creation of the Hoshin items, but to me this approach is overkill, and – even worse – may distract the user from actually following the PDCA, especially the Check and Act parts. ...
Setting the right goals and filtering them through the organization is important in Hoshin Kanri. In my first post I talked in detail about this as the "to-do list." ...
Like the “normal” Hoshin Kanri, this document is done at different levels in the hierarchy, starting with the top executive. These are named rather straightforward as top-level matrix, second-level matrix, and third-level matrix.

Hoshin Kanri X-matrix
A figure:
See right side

Criticism:
Most Hoshin Kanri documents that I know cover one year. This is usually a good duration, since one year allows for quite a bit of improvement activity. This duration is also long enough to see the results and review the outcome.
The fit with the SIAR model and PDCA/DMAIC is remarkably good. It solves the "who and why" of going from "knowledge" (Jabes stage proposals backlog) into initiating activities.

Q-3.4.2 Indispensable security, safety
PDCA used by SOC Security Operations Center
SOC_aas.jpeg A marvellous figure for security: it has a feedback loop to improve VAS and TIS.
Starting with a BIA (Business Impact Analysis) for risk probability and impact, it goes into the CIA triad (Confidentiality, Integrity, Availability). 💡 A way to look at improvements and mitigations:
VAS A versatile authentication server or service (VAS) is a single authentication product or service supporting user authentication in multiplatform environments (on-premises and in the cloud).
For TIS (Trusted Identity Services, the key symbol) the core is PAM: controlling, monitoring, and auditing privileges and privileged access for employees, vendors, systems, applications, IoT, and everything else that touches your IT environments. This is essential for protecting against both external and internal threat vectors, and for meeting a growing list of compliance requirements.
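As an illustration of how a BIA could feed the CIA view, a minimal sketch in Python; the asset names, scales and the probability-times-impact scoring are assumptions made up for this illustration, not part of any cited standard:

  # Minimal BIA-style scoring sketch (invented assets and scales).
  # Each asset gets a likelihood (0..1) and an impact score (1..5) per
  # Confidentiality / Integrity / Availability; risk = likelihood * impact.
  assets = {
      "customer database": (0.3, {"C": 5, "I": 4, "A": 3}),
      "public website":    (0.6, {"C": 1, "I": 2, "A": 4}),
  }

  def risk_per_dimension(likelihood, impacts):
      # risk score per CIA dimension for one asset
      return {dim: round(likelihood * score, 2) for dim, score in impacts.items()}

  for name, (likelihood, impacts) in assets.items():
      scores = risk_per_dimension(likelihood, impacts)
      worst = max(scores, key=scores.get)   # first candidate for mitigation
      print(name, scores, "-> mitigate first:", worst)

The feedback loop of the figure would then be re-scoring after a mitigation and checking whether the worst dimension actually improved.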
💡 Extended SOC Security Operations Centre
😉 Solving organisational impediments.
There is a serious problem in managing security, see T-2.5.3 Identity Access (I-Serve).
Moving the operational activities (onboarding and offboarding, and those for middleware and infrastructure) to the SOC is a way out of these problems. Generic patterns for security are easier to define and roll out. Accountability and insight for control still remain with the organisation.

SOC_vistriad.gif
Standard SOC Security Operations Center
The most likely source: Dacoso. Standard competent activities: Soc aas
Advanced activities require more intelligent decision making. The challenges in that decision making are:
Technical
Words change although the concepts stay the same: Securing Machine Learning Algorithms

Q-3.4.3 Data, Information Quality
functional
😉 Solving organisational impediments. Impossible choice:
An Overview of Data Quality Frameworks (2019)
Nowadays, the importance of achieving and maintaining a high standard of data quality is widely recognized by both practitioners and researchers. Based on its impact on businesses, the quality of data is commonly viewed as a valuable asset. The literature comprises various techniques for defining, assessing, and improving data quality. However, requirements for data and their quality vary between organizations. Due to this variety, choosing suitable methods that are advantageous for the data quality of an organization or in a particular context can be challenging.
Technical
😉 Solving organisational impediments. Start with the why: Simple data management (Robert Borkes)
The 9-Square Data Management Model offers a robust framework tailored to foster strategic alignment across diverse facets of business operations. Specifically crafted to address the challenges faced by executives grappling with data-related issues, this model provides targeted solutions and guidance. By serving as a guiding principle, it facilitates the synchronization of business and data strategies, leading to enhanced efficiency in organizational decision-making processes.
With strategic goals centered around maximizing “return on data”, and clear objectives aimed at improving decision-making, optimizing organizational structure, and delivering exceptional services, this model empowers organizations to harness the full potential of their data assets.
R.Borkes 9-plane
in a figure:
See left side

Q-3.4.4 Impact on persons
functional
Information about a person is not the same as the person being the owner of that information. 😱 A lot goes wrong just because of this wrong assumption.
Personal_data
In the GDPR, personal data is defined as:
Any information relating to an identified or identifiable natural person ('data subject'); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person.


When a decision has an impact on a person, the common assumption is that such decisions cannot have a different impact in similar situations. In complex systems that is a very wrong assumption. Chaos theory, based on very simple equations, only became known in the sixties (1961). The only way to mitigate those effects is to avoid complex systems and go for anti-fragile simplicity.
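A small numerical sketch of that sensitivity; the logistic map below is only an illustrative stand-in for those simple chaotic equations, not a model of any real decision process:

  # Logistic map x(n+1) = r * x(n) * (1 - x(n)), chaotic for r = 4.
  # Two almost identical starting situations drift completely apart,
  # which is the point about similar situations with different impact.
  def trajectory(x, r=4.0, steps=40):
      values = []
      for _ in range(steps):
          x = r * x * (1.0 - x)
          values.append(x)
      return values

  a = trajectory(0.2000000)
  b = trajectory(0.2000001)          # differs by one part in ten million
  for step in (5, 10, 20, 40):
      print(step, round(a[step - 1], 4), round(b[step - 1], 4))

After a handful of steps the two runs are still indistinguishable; after a few dozen they have nothing in common anymore.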

Technical
Ethical principles in machine learning and artificial intelligence
AI became a self-standing discipline in the year 1955 (McCarthy et al., 2006) with significant development over the last decades. ...
France's Digital Republic Act gives the right to an explanation regarding decisions on an individual made through the use of administrative algorithms (Edwards and Veale, 2018). This law touches upon several aspects. Sensitive governmental areas, such as national security and defence, and the private sector (the largest user and producer of ML algorithms by far) are excluded from this document.
The complexity in this is that rules written as laws are not included in this kind of review. The human is the weakest link. There are a lot of statements with doubtful assumptions, leaving everyone in doubt or in anger.
😱 Just describing how it works in documents could be a way out. There is nothing well evaluated in place (2024).
AI faces the difficulty of translating overarching principle into practices. Even its current setting of seeking maximum speed, efficiency and profit clashes with the resource and time requirements of an ethical assessment and/or counselling.
Q-3.4.5 Business Rules, Algorithm
functional
😉 Solving organisational impediments. Forgotten basics.
The Business Rules Manifesto
A Brief History of the Business Rule Approach
The Business Rules Manifesto, coverage:

Technical
😉 Solving organisational impediments. Sophisticated technology for using basics.
When a technical approach for defining and storing is preferred: Semantics of Business Vocabulary and Business Rules
This specification defines the vocabulary and rules for documenting the semantics of business vocabularies and business rules for the exchange of business vocabularies and business rules among organizations and between software tools. This specification is interpretable in predicate logic with a small extension using modal operators. It supports linguistic analysis of text for business vocabularies and business rules, with the linguistic analysis itself being outside the scope of this specification.
elephant-blind-men

Q-3.5 Processes Building Blocks - Control

The term elephant test refers to situations in which an idea or thing, "is hard to describe, but instantly recognizable when spotted"
A process life cycle building block, the ALC life cycle, is very generic and simplistic. There are only three possible approaches.
To solve:
😱 Aside from the basics and the realisation, control is required.
😱 Communication is made complicated by neglect.
Q-3.5.1 Vocabulary gaps functional - technical.
⚒ Context confusing: business - cyber technology
There is a lot of misunderstanding between normal organisational humans and their cyber colleagues. That culture gap is not necessary and should be eliminated. It already starts with the words describing the organisation.
A translation of words to start:
              ICT - Business              ICT - Business                  ICT - Business
Strategy      Control - Functional        Target-Value - Confidentiality  People
Tactical      Orchestration - Compliancy  Communication - Integrity       Processes
Operational   Realization - Technical     Information - Availability      Machines

Note that the asset "Information" is a business asset, not something to be pushed off as incomprehensible to the "cyber" guys. Being an important business asset, accountability and responsibility for "Information" lie with the product management staff of the organisation.
Confusing: ICT Business
A figure:
See right side

Q-3.5.2 Control decisions
Marionette feeling.
Real working people's feelings.
Decision making: a limited philosophy on those aspects.
Decision-making can be regarded as a problem-solving activity yielding a solution deemed to be optimal, or at least satisfactory. It is therefore a process which can be more or less rational or irrational and can be based on explicit or tacit knowledge and beliefs. Tacit knowledge is often used to fill the gaps in complex decision making processes. Usually both of these types of knowledge, tacit and explicit, are used together in the decision-making process.
The decisions are made by control: controllers, imperators.
That is a very select, small group of people. Real working people often experience being treated like marionettes. They are used like machines and commonly excluded from decisions. That is not enablement of the workforce.

Devils triangle ICT
Decisions in the ICT devil's triangle.
With three parties involved, it is difficult to manage all three well.

Information overload
The advent of modern information technology has been a primary driver of information overload on multiple fronts: in quantity produced, ease of dissemination, and breadth of the audience reached. Longstanding technological factors have been further intensified by the rise of social media including the attention economy, which facilitates attention theft. In the age of connective digital technologies, informatics, the Internet culture (or the digital culture), information overload is associated with over-exposure, excessive viewing of information, and input abundance of information and data.

tsar caesar
T - Tsar replacement Technology.
With the goal of decision making, Technology is just a detail. The importance is in serving the Tsar; "Tsar" would be a better description for the abbreviation "T" in ICT.
The title tsar is derived from the Latin title for the Roman emperors, caesar. In comparison to the corresponding Latin word imperator, the Byzantine Greek term basileus was used differently depending on whether it was in a contemporary political context or in a historical or Biblical context. In the history of the Greek language, basileus had originally meant something like "potentate". It gradually approached the meaning of "king" in the Hellenistic Period, and it came to designate "emperor" after the inception in the Roman Empire.
As a consequence, Byzantine sources continued to call the Biblical and ancient kings "basileus" even when that word had come to mean "emperor" when referring to contemporary monarchs, while it was never applied to Western European kings, whose title was transliterated from Latin rex, or to other monarchs, for whom designations ("leader", "chieftain") were used.

Q-3.5.3 Understanding data - information
Creating Data: More To It Than You Think.
Normally we think of communication as either direct conversation or (in the spirit of the times) a flurry of text messages exchanged more or less in real time with people we know. In either case there's usually a shared context within which the meaning of the messages can be interpreted, as well as more or less real-time exchange of clarifications. What's distinct about creating data is that you're almost certainly not going to be face-to-face with the recipients of the message or connected live with them via an interactive network. That fact rules out body language (e.g., raised eyebrows or emoticons) and dialog (including grunts and groans — or more emoticons ) to clarify what you mean. In that sense the communication is blind.
...
As a consequence, the data a worker creates literally needs to speak for itself. The emphasis needs to be on the effectiveness of communication — that is, on semantic quality.
Unfortunately, typical data quality measures in current use focus on the health of the content of the data/system architecture rather than on the semantic quality of the original business messages. That focus serves a purpose for data management but misses the mark almost entirely in clarifying what practices produce good business communications in the first place. Typical data quality dimensions (e.g., completeness, uniqueness, timeliness, etc.) are: ... Worst of all, typical data quality dimensions implicitly remove responsibility off the shoulders of those who create the data.

The quality of data in a data/system architecture can never be any better than the quality of the business communications that produced it. A systematic means to manage data at rest simply does not guarantee the vitality — the semantic health — of the business communications it supports. Sometimes IT professionals focus so intently on software development that the importance of the point escapes them. (Many data professionals do understand the point, but do not know quite how to articulate it or feel powerless to do much about it.)
To make the point differently, it is entirely possible to assess your data quality as outstanding even though the business communications that produced the data were confusing, contradictory, unintelligible, or otherwise ineffective. Rating data quality high when communication is poor is nonsense!
GRoss_SemanticQuality
Q-3.5.4 Solving volatility, uncertainty, complexity, ambiguity
VUCA
Vuca
Within VUCA, several thematic areas of consideration emerge, providing a framework for introspection and evaluation (multiple aspects: ambiguity, complexity). Within the VUCA system of thought, an organization's ability to navigate these challenges is closely tied to its foundational beliefs, values, and aspirations. Those enterprises that consider themselves prepared and resolved align their strategic approach with VUCA's principles, signaling a holistic awareness.
The VUCA world of the 2000/2010s
The term was first coined by the U.S. Army War College to describe the challenges of operating in a post-Cold War world. From there, the acronym made its way into management and leadership literature and business school lecture halls at the turn of the millennium.
BANI
The BANI model of the 2020s
The BANI model goes a step further and helps companies consider the chaotic and completely unpredictable effects that can have a major impact on their operations.

Q-3.5.5 Logic - Algebra - Decision
binary - multiple value
"Boolean" Algebra has been fundamental in the development of digital electronics, and is provided for in all modern programming languages. It is also used in set theory and statistics.
💣 This focus on just two possible outcomes, True/False, is not how real life works; it is not even what is used, or should be used, in information technology. Not understanding which logic applies in an information system will cause unexpected errors when it gets used.
"Many-valued logic": the Priest P3 logic is used in a relational DBMS, where undefined is noted as NULL.
Pareto principle, statistics & decisions.
🤔 Having 80% going well is good enough ... Is it really good enough?
Go for the low-hanging fruit, and do not bother that you will have up to 20% failures or things going wrong.
Pareto-principle
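A tiny worked example of the 80/20 idea, with invented numbers only:

  # Ten failure causes, already sorted by contribution (invented numbers).
  failures_per_cause = [500, 300, 80, 40, 25, 20, 15, 10, 6, 4]
  total = sum(failures_per_cause)            # 1000 failures in total
  top_two = failures_per_cause[:2]           # the top 20% of the causes
  share = sum(top_two) / total
  print(f"top 20% of causes explain {share:.0%} of the failures")   # 80%

The catch of "low hanging fruit only" is the remaining 20%: the long tail of causes that never gets addressed.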

Statistical relevant <-> decisions.
🤔 Accepting 5% (1 out of 20 being a mistake or failure): acceptable or not?
😱 Is searching for cases that fulfil the statistical significance test a correct approach or not?
P-value In 2016, the American Statistical Association (ASA) published a statement on p-values, saying that "the widespread use of 'statistical significance' (generally interpreted as 'p ≤ 0.05') as a license for making a claim of a scientific finding (or implied truth) leads to considerable distortion of the scientific process".
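A small synthetic sketch of why such searching is risky; the data below is pure random noise, so any "finding" is spurious by construction:

  # With pure noise and a 0.05 threshold, roughly 1 in 20 tests still looks
  # "statistically significant"; whoever searches long enough finds a hit.
  import math, random, statistics

  random.seed(1)
  TRIALS, SAMPLES, ALPHA = 1000, 30, 0.05

  def p_value_mean_zero(sample):
      # two-sided z-test of "mean equals zero" (normal approximation)
      z = statistics.mean(sample) / (statistics.stdev(sample) / math.sqrt(len(sample)))
      return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

  hits = sum(
      1 for _ in range(TRIALS)
      if p_value_mean_zero([random.gauss(0, 1) for _ in range(SAMPLES)]) < ALPHA
  )
  print(hits, "of", TRIALS, "pure-noise experiments pass p <", ALPHA)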
Lift, accuracy, confusion matrix
Explainable AI and better understandable ML (Machine Learning) for automatic decisions are searched for but still lacking. Confusion matrix
In predictive analytics, a table of confusion (sometimes also called a confusion matrix), is a table with two rows and two columns that reports the number of false positives, false negatives, true positives, and true negatives. This allows more detailed analysis than mere proportion of correct classifications (accuracy). Accuracy will yield misleading results if the data set is unbalanced; that is, when the numbers of observations in different classes vary greatly.
In real life the impact of a wrong decision should be another, ethical, dimension to evaluate. Avoiding a single wrong decision with a catastrophic impact matters more than avoiding many wrong decisions with little impact.
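A minimal sketch of that accuracy trap on an unbalanced set, with invented numbers only:

  # 1000 cases, only 20 real (catastrophic) positives.
  # A model that always predicts "negative" scores 98% accuracy
  # while missing every single case that actually matters.
  actual    = [1] * 20 + [0] * 980
  predicted = [0] * 1000                      # the lazy "always fine" classifier

  tp = sum(a == 1 and p == 1 for a, p in zip(actual, predicted))
  fn = sum(a == 1 and p == 0 for a, p in zip(actual, predicted))
  fp = sum(a == 0 and p == 1 for a, p in zip(actual, predicted))
  tn = sum(a == 0 and p == 0 for a, p in zip(actual, predicted))

  accuracy = (tp + tn) / len(actual)                   # 0.98, looks excellent
  recall   = tp / (tp + fn) if (tp + fn) else 0.0      # 0.0, every real case missed
  print("confusion matrix:", {"TP": tp, "FP": fp, "FN": fn, "TN": tn})
  print("accuracy:", accuracy, "recall:", recall)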


Data monetizing journey

Q-3.6 Controlling Organisation & Business

Once Dorothy and her colleagues made the journey to OZ, they quickly found out that there was no there, there. The Wizard simply told her what she really should have known all along.
Dorothy and her companions just had to figure out how to reframe their own perceived shortcomings and recast them as strengths to achieve real transformation.

🎭 Q-3.6.1 Information categorisation for engineering
Seven Common Myths About the Zachman Architecture Framework
By some kind of evolution I found myself doing the categorisation in this approach. Premise & Conclusion (Ronald G. Ross, Gladys S.W. Lam 2015)
Widely misunderstood and misrepresented, the Zachman Architecture Framework is simply a thinking tool, not a methodology of any kind. Its being fundamentally neutral with respect to methodology, in fact, is the secret to its power and the reason it has proven so enduring. The Framework can, of course, be applied to create a methodology, but that's a different matter. ...
Zachman 6W-s no W for which technology
The Zachman Architecture Framework is the classification scheme, or ontology, created by John Zachman for engineering things of complexity. Such solutions don't happen by accident — they require deliberate engineering. Zachman simply points out, like it or not, what such 'deliberate engineering' necessarily involves.
The classic scheme has only five rows. The middle one, "logic", is duplicated by a context switch.
Zachman's basic premise is that whenever you engineer anything of complexity, no matter what — a complex machine, a skyscraper, a microchip, a spacecraft, a product, a business (an enterprise), or some part of a business (a business capability) — there are two basic aspects that need to be addressed. These two aspects correspond to the columns and rows of the Framework.
Myths:

Enterprise Architecture Defined: Architecture Abstractions
Enterprise Architecture (J.A.Zachman 2021)
You can classify the set of descriptive representations of anything (buildings, airplanes, locomotives, battleships, computers, etc.) in a two-dimensional classification structure, a "schema."
One dimension of the classification I call "Abstractions" … I chose to call this dimension of the classification Abstractions because you can abstract out, or separate out, or factor out a set of six single, independent variables or focal points of descriptions that are common to every architected object.
The architectural descriptions of anything will include:
  1. Bills of Material,
  2. Functional Specs,
  3. Drawings (or Geometry),
  4. Operating Instructions,
  5. Timing Diagrams, and
  6. Design Objectives.
It is not mysterious why the people who build buildings, airplanes, battleships, locomotives, computers, all the Industrial Age products that are sufficiently complex to warrant Architecture came up with that set of description representations. They are answering the six primitive interrogatives that constitute the total set of questions that have to be answered to have a complete description of anything: What, How, Where, Who, When, and Why. ...
This goes back about 7,000 years to the origins of language … and by the way, I did not invent this classification. It has been well-exercised by humanity for thousands of years. If you don't answer all six primitive interrogatives it means that your description is incomplete.
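A hedged sketch of that classification as a simple data structure; the pairing of the six descriptions with the six interrogatives follows the usual reading of Zachman's columns and is given only as an illustration:

  # The six description types from the quote, keyed by the primitive
  # interrogative they answer (illustrative pairing, not an official artefact).
  abstractions = {
      "What":  "Bills of Material",
      "How":   "Functional Specs",
      "Where": "Drawings (or Geometry)",
      "Who":   "Operating Instructions",
      "When":  "Timing Diagrams",
      "Why":   "Design Objectives",
  }

  def completeness_check(descriptions):
      # a description set is complete only when all six interrogatives are answered
      missing = [q for q in abstractions if q not in descriptions]
      return "complete" if not missing else "incomplete, missing: " + ", ".join(missing)

  print(completeness_check({"What": "...", "How": "...", "Why": "..."}))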

Q-3.6.2 Controlling an Enterprise, Organisation
PDCA changing the way of changing the enterprise
Using the Jabes framework with Jabes tooling gives a remarkable visualisation. Instead of missions and visions or organisation value streams, the process of changing processes is the change. devops Jabes
Controller data literacy
Controller data literacy is the ability to select and connect the right sources based on strategic, tactical and operational process information flows so that an actual, complete trustworthy information position is created with which a substantiated decision can be made based on facts.
Q-3.6.3 Pretending being in control
Cargo Cult
Just following: "There is nothing quite so useless, as doing with great efficiency, something that should not be done at all."
The question: Cargo Cult Agile or a true Agile Mindset? (Kasia Bartos)
Jabes generic process
When it is hard to notice the difference between the Daily Scrum and the classical “status update for the manager”, we can feel that something is not right.
When team members are complaining that “Scrum Events take up so much time, while we have work to do”, then it is easy to figure out, that there is no buy-in among employees for the whole idea of Scrum and some important ingredient is missing: the Agile Mindset needs to be developed. When Scrum is done without promoting the real Agile Values, we might be dealing with Cargo Cult Agile.
I would add that there is possibly something wrong with the promoted agile behaviour.

Source: Feynman, Richard P. (June 1974). "Cargo Cult Science":
In the South Seas there is a cargo cult of people. During the war they saw airplanes land with lots of good materials, and they want the same thing to happen now. So they've arranged to imitate things like runways, to put fires along the sides of the runways, to make a wooden hut for a man to sit in, with two wooden pieces on his head like headphones and bars of bamboo sticking out like antennas -he's the controller- and they wait for the airplanes to land.
They're doing everything right.
- The form is perfect.
- It looks exactly the way it looked before.
- But it doesn't work. No airplanes land.
So I call these things cargo cult science, because they follow all the apparent precepts and forms of scientific investigation, but they're missing something essential, because the planes don't land.

If you expect motivated, proficient, autonomous staff while doing micromanagement, there is a contradiction. The real question for help would be solving the essential issues.
Decisions and ethical challenges.
With decisions and leadership, there are many styles of leadership. There is an optimistic approach with the goal of collecting as much as possible from the positive aspects. On positive aspects there is openness. Negative aspects also exist in the real world; the goal would be to avoid as many of them as possible. On negative aspects there is no openness.
Devils triangle, dark triad
Dark triad
All three dark triad traits are conceptually distinct although empirical evidence shows them to be overlapping. They are associated with a callous-manipulative interpersonal style. Dark triad: a limited philosophy on those aspects.
A new model of Machiavellianism based in organizational settings consists of three factors:
  1. maintaining power.
  2. harsh management tactics.
  3. manipulative behaviors.

Q-3.6.4 Culture, learning from others
Using knowledge of the founders - EW Dijkstra
Dijkstra left many notes. He had a great distrust that things were being done correctly.
TV Interview EWD Quality Correctness Elegance
Edsger W. Dijkstra Archive (texas university)
The essence: in IT, as in no other sector, the adage is valid: "We don't have time to do it right, so we will do it over later". And so the laws of the economy have determined with unrelenting cruelty that until the day of his death Dijkstra had to watch himself being proved right in the form of a relentless stream of messages about faulty software and failed automation projects.
Flow! The Route to Panta Rhei [1] - J van Ede
TWI I
There are many methods for process improvement, ranging from Lean to Agile, and from TPM to Six Sigma. There are differences in their improvement approaches, but much more similarities. For example structured problem solving, and visualising work streams. The most commonly shared characteristic is the realisation of flow! Which improvement method increases throughput (flow) the best, depends on the specific situation, but also on the desired pace of improvement, the gear with which you wish to 'cycle uphill'.
TWI II
The concept Respect for People expresses that managers need the expertise and help of production workers for ongoing improvement. Craftsmanship is not seen as something that 'lower' educated people do, but as a thing to be proud of, and a skill that many 'higher' educated people do not possess. Craftsmanship is much more appreciated in Japan than in the west. Japanese words like monozukuri, the art of making things, and Takumi, an honorary title for an expert in his or her production step, highlight this.
Q-3.6.5 Following steps.
Missing link devops math design meta devops data devops meta devops math
These are design meta modelling concepts, others:

Switching context to more practice (devops)
Practical Math, 👓 next
bianl Theoretical Math.



Others are operational realisations: 👓
data meta -& security in practice- math
What is not here: 👓 bpm in practice.


🎯 Algol Interact Tenets Change Volatile North 🎯
  
🚧  Variety Act on Cyber Change ALC-V* Knowit 🚧
  
🔰 Contents Mesh ABCs Control ALC-V* Polestar 🔰

© 2012,2020,2024 J.A.Karman
🎭 Summary & Indices Elucidation 👁 Foreword Vitae 🎭