This is the most technical part.
Being enablers, they should have no dependencies on business processes.
There are three subtopics:
1/ information - computers
2/ encryption, the meaning of communication
3/ statistics, data exploration
The route to business processes runs through floor operations. 🔰 Too fast? Go back to the previous page.
🎭 Q-1.1.2 Guide to reading this page
Break-up: Logic, Concept, Context
The words Logic, Concept, Context are from the Zachman framework.
Easier to understand:
What, Information, Logic: The organisation.
How, Knowledge, Concept: Technology.
Why, Wisdom, Context: Changing processes.
Only three levels are used, grouped in this: one group for the design plan and another for the realisations.
The six paragraphs of each of the three chapters are aligned to the 5W1H questions.
Knowledge needed to read this page
Basic knowledge:
Logic of Information processing for an organisation.
Basic understanding of computers and their ICT technology.
Understanding what the impact of changing processes can be.
This page is just a knowledge area; there are no references to Jabes.
Why is Jabes interesting?
Everybody is looking for a solution to manage the challenges of information processing.
As far as I know there is nothing on the market that solves those challenges holistically:
existing portfolio management
portfolio change management
support design in change management
support validation in change management
There are many tools for detailed topics, but no one covering all the interactions.
BPM - Steer content moved to here for the generic knowledge parts.
A first draft version of this was published.
Chapter Steer Q-1.* seems to have become suitable.
2024 week 15
Other content was copied and moved here for the generic knowledge parts.
Got blocked in Q-2.* and Q-3.* because the structure didn't have a good fit.
Restructuring after doing a retrospective helped to find a way out.
2024 week 16
A draft version of this page was finished.
Sub pages deactivated, relevant content moved to devops_math (r-know).
Q-1.2 Communication - Interactions
Working with machines that process information is a relatively new topic of science.
Human communication and interaction is a classic one.
The concept of the "information" container is not that clear and simple.
Information is an abstract concept that refers to that which has the power to inform.
At the most fundamental level, information pertains to the interpretation (perhaps formally) of that which may be sensed, or their abstractions.
Any natural process that is not completely random and any observable pattern in any medium can be said to convey some amount of information.
🎭 Q-1.2.1 Processing Information
generic communication
Describing the properties of information (data) in some metadata approach goes back to the philosophers of ancient history with:
"Universals" .
"Semiotics"
A sign is anything that communicates a meaning, that is not the sign itself, to the interpreter of the sign.
The meaning can be intentional such as a word uttered with a specific meaning, or unintentional, such as a symptom being a sign of a particular medical condition.
Signs can communicate through any of the senses, visual, auditory, tactile, olfactory, or gustatory.
The semiotic tradition explores the study of signs and symbols as a significant part of communications.
Having two parties in the communication:
There is a sender and there is a receiver of information.
Both parties, sender & receiver, are processing in a "Triangle_of_reference" .
 
Both parties are translating along two lines: from intention to a symbol/word, and vice versa.
There are a lot of opportunities for failing at well-aligned communication.
Venkatraman et al. argued in 1993 that the difficulty of realising value from IT investments is:
❶ firstly due to the lack of alignment between the business and IT strategy of the organizations that are making investments, and
❷ secondly due to the lack of a dynamic administrative process to ensure continuous alignment between the business and IT domains.
(yellow) Strategy Execution: Business is strategy formulator, IT is implementor follower.
(red) Technology Potential: Business strategy drives IT strategy; the organisation follows.
(green) Competitive Potential: emerging IT capabilities drive new business strategy.
(blue) Service Level: The role of business strategy is indirect. "It should work."
The four options for who is in the lead and who is following result in different possible opportunities.
New questions are:
Is the "IT domain" capable of being in the lead?
Having maintenance done conforming to specifications would be the expected baseline to be in place.
Giving options for competitive advantage will require more from staff and more budget than minimal maintenance.
Is the "Business domain" capable of being in the lead?
Having ideas on which technology options could give advantages requires special technology knowledge within the "Business domain".
The choice of change the organisation to follow technology or adjust technology to the organisation has consequences.
When there are many technology stacks, there is a complex situation.
In those complex situations there are no simple solutions that fit all.
Strategic alignment - Solve conflict of interests, roles
The Amsterdam Information Model (AIM) has the goal of defining the roles more clearly.
Aside from the nine green planes there are four intermediate areas.
The hierarchy of control authority is clear: top-down.
In a figure:
Vertical split: ❶ Strategic, ❷ Tactical, ❸ Operational.
Some words were changed, avoiding the word "Information", and symbols from the Jabes framework with the Jabes application were added.
Three pillars, but the activities are mixed:
Steer in the organisation: this pillar holds the basic core competencies in the holistic whole.
Shape in the organisation: this pillar assures the future fitness of the organisation.
Serve in the organisation: this pillar holds the technology connections for processes.
Information accountability clearly sits at "steer", the business organisation.
The field of communication is a combination spanning all the different types of activities.
The figure, see right side:
❓ A question: what kind of shape would it be when closing the horizontal layers and closing the vertical pillars?
👁 Answer: you will have a donut (torus) or, when the shape is more stretched, a pipe.
That is a complicated three-dimensional surface.
❓ Next question: is it possible to reduce complexity for all communications?
👁 Answer: when going around circularly, in the visible two dimensions, and focusing on never more than two coordinated interactions, it is possible.
Q-1.3 Historical evolvements
Information is often processed iteratively: Data available at one step are processed into information to be interpreted and processed at the next step.
For example, in written text each symbol or letter conveys information relevant to the word it is part of, each word conveys information relevant to the phrase it is part of,
each phrase conveys information relevant to the sentence it is part of, and so on until at the final step information is interpreted and becomes knowledge in a given domain.
📚 Q-1.3.1 Optimisation in the industrial era (I)
Start of industrialisation, computerisation
We presume that using computers and machines is a development of very recent years. That assumption is not correct.
The industrialisation of textile had an enormous impact on the way of living.
The Weaving
Before the Industrial Revolution, weaving was a manual craft and wool was the principal staple; it is the well-known first example.
The invention in France of the Jacquard loom, patented in 1804, enabled complicated patterned cloths to be woven, by using punched cards to determine which threads of coloured yarn should appear on the upper side of the cloth.
.. The perceived threat of the power loom led to disquiet and industrial unrest. Well known protests movements such as the Luddites and the Chartists had handloom weavers amongst their leaders. ...
😱 The social state of the workers: seen as easily replaceable cheap resources.
Low payments and hard work were the standard.
Work shifts up to 16 hours in gruelling conditions, child labour, low wages, lack of rights.
Jacquard Loom
Optimisation of the workforce started with industrialisation.
Programming machines saved on costly, hard manual work.
The Jacquard loom was the first example.
With this in mind, a lot has changed that nobody these days is worried about or even aware of.
Frederick Winslow Taylor
Taylor
He was widely known for his methods to improve industrial efficiency. He was one of the first management consultants.
... Taylor's scientific management consisted of four principles:
Replace rule-of-thumb work methods with methods based on a scientific study of the tasks.
Scientifically select, train, and develop each employee rather than passively leaving them to train themselves.
Provide "Detailed instruction and supervision of each worker in the performance of that worker´s discrete task" (Montgomery 1997: 250).
Divide work nearly equally between managers and workers, so that the managers apply scientific management principles to planning the work and the workers actually perform the tasks.
⚠ Within the setting of a factory, full control of workers is possible.
This rigid approach of seeing human workers as soulless robots caused aversion; it is not a correct approach.
That social gap with the working class was, however, the common, usual way at the time.
Henri Fayol
Henri Fayol
A French mining engineer, mining executive, author and director of mines who developed a general theory of business administration that is often called Fayolism.
Like his contemporary Frederick Winslow Taylor, he is widely acknowledged as a founder of modern management methods.
...
While Fayol came up with his theories almost a century ago, many of his principles are still represented in contemporary management theories.
... Fayol divided the range of activities undertaken within an industrial undertaking into six types:
❶ technical, ❷ commercial, ❸ financial,
❹ security, ❺ accounting, ❻ managerial.
His five primary functions were:
❶ Planning, ❷ Organizing, ❸ Commanding, ❹ Co-ordinating, ❺ Controlling. The control function, from the French contrôler, is used in the sense that a manager must receive feedback about a process in order to make necessary adjustments and must analyze the deviations.
Principles of management:
Division of work = Different levels of expertise can be distinguished.
Authority = gives the management the right to give orders to the subordinates.
Discipline = about obedience.
Unity of command = Every employee should receive orders from only one superior.
Subordination of Individual Interest = The interests of any one employee or group of employees should not take precedence over the interests of the organization as a whole.
Remuneration = All Workers must be paid a fair wage for their services.
Centralisation and decentralisation = Centralisation refers to the degree to which subordinates are involved in decision making.
Scalar chain = The line of authority from top management to the lowest ranks.
Order = There should be a specific place for every employee in an organization.
Equity = Managers should be kind and fair to their subordinates.
Stability of tenure of personnel = High employee turnover is inefficient.
Initiative = Employees who are allowed to originate and carry out plans will exert high levels of effort.
Esprit de corps = Promoting team spirit will build harmony and unity.
In a mining setting, an autonomous, responsible team is required. They operate in a dangerous environment.
Micromanagement is not an option. Generals keep away from those locations.
📚 Q-1.3.2 Optimisation in the 20th century (II)
Assembly line - Henry Ford
Henry Ford: faster & cheaper manufacturing
Henry Ford is credited as a pioneer in making automobiles affordable for middle-class Americans through the Fordism system. ...
Workers are paid higher "living" wages so that they can afford to purchase the products they make. Assembly line:
1913 Experimenting with mounting body on Model T chassis. Ford tested various assembly methods to optimize the procedures before permanently installing the equipment.
Survivorship bias - Abraham Wald: studying damage on bombers returning from missions, Wald noted that the study only considered the aircraft that had survived their missions; the bombers that had been shot down were not present for the damage assessment.
Wald proposed that the Navy instead reinforce the areas where the returning aircraft were unscathed, since those were the areas that, if hit, would cause the plane to be lost.
There are many caveats in using Machine Learning. Biased data and the correct meaning of data are some of them.
Understanding the uncertainties and the effect on the whole process, while being fair to outliers, are others among a long list.
PDCA - W. Edwards Deming
Toyota made W. Edwards Deming famous.
PDCA (plan–do–check–act or plan–do–check–adjust) is an iterative four-step management method used in business for the control and continuous improvement of processes and products. It is also known as the Deming circle/cycle/wheel
Far more was and is done by Toyota.
The Difference between the Toyota Production System and Lean Manufacturing
The Toyota Production System (TPS) is the archetype of lean manufacturing.
Lean is often used as a synonym for the Toyota Production System, and that is generally quite accurate.
All too often, lean is seen as some tool that can be bought and then delegated to someone in the lower ranks of hierarchy.
📚 Q-1.3.3 Optimisation in the 20th century (III)
Culture - Peter Drucker
Peter_Drucker
An Austrian American management consultant, educator, and author, whose writings contributed to the philosophical and practical foundations of modern management theory. ...
Drucker coined the term "knowledge worker", and later in his life considered knowledge-worker productivity to be the next frontier of management.
Best known for his quotes:
❶ Management is doing things right; leadership is doing the right things.
❷ The most important thing in communication is hearing what isn't said.
❸ The best way to predict the future is to create it.
❹ Rank does not confer privilege or give power. It imposes responsibility.
❺ Efficiency is doing things right; effectiveness is doing the right things.
❻ Unless commitment is made, there are only promises and hopes... but no plans.
❼ The aim of marketing is to know and understand the customer so well the product or service fits him and sells itself.
❽ Knowledge has to be improved, challenged, and increased constantly, or it vanishes.
❾ There is nothing so useless as doing efficiently that which should not be done at all.
AIM nine plane - Rik Maes
Vision on information management (AIM, Amsterdam Information Model, "Amsterdamse raamwerk voor informatiemanagement").
Many see this as a static situation. Too often only the strategic level is considered; too often a siloed Taylorian reorganisation is the hidden action.
Every layer is important:
Strategy is worthless until it is adopted by tactics and operations.
Tactics needs to define projects aligned to strategy.
Operations: projects to implement and include in daily activities.
The tactical level sets goals and preconditions of the strategic domain into: concrete, realizable objectives, responsibilities, authorizations, frameworks, and guidelines for operations.
Q-1.4 Processing flows VSM - assembly lines
A swarm organisation, self-organisation, is a networked structure without leadership, using some shared goal.
⚠ Challenges: have a shared goal, have a good shared goal.
The organisation structure is a hierarchical line of command.
Formation in groups using leaders is human nature.
⚠ Challenges: avoiding leadership going into micro details.
Expected is that authority and accountability for a product are in place.
😱 Administrative/cyber setting: seems to have got lost.
⚖ Q-1.4.1 Control function - closed loop
Control, Feedback, closed loop, PDCA
The fundamental approach in all historical evolvements:
Learn from the feedback to rethink improvements.
Use information in a scientific way to learn what improvements are possible.
Build up a skilled staff for the operational work.
The feedback, verification of results with intentions, goals, is the beating heart of real lean using PDCA (Plan-Do-Check-Act).
😱 The PDCA cycle is a closed loop. Sadly, how to apply it in processes got lost.
In the SIAR model these combine: PDCA (Deming), DMAIC (Lean Six Sigma), lean pull-push and value stream.
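The closed loop itself can be made concrete in a few lines. A minimal sketch in Python, where the toy process and its adjustment rule are invented placeholders (the real Plan-Do-Check-Act steps are organisational, not code):

```python
# Minimal PDCA closed-loop sketch; the process model is a toy stand-in.
class Process:
    def __init__(self):
        self.setting = 0.0
    def run(self):                 # Do: execute with the current setting
        return 0.8 * self.setting  # toy response: output lags the setting
    def adjust(self, gap):         # Act: feed the deviation back
        self.setting += gap

target = 10.0                      # Plan: the intended result
proc = Process()
for cycle in range(20):
    result = proc.run()            # Do
    gap = target - result          # Check: compare result with the goal
    if abs(gap) < 0.01:            # stable: intention and result aligned
        break
    proc.adjust(gap)               # Act
print(f"stable after {cycle} cycles at {result:.2f}")
```

The point of the sketch is the feedback edge: without the Check-Act pair the process would keep delivering 80% of the intention forever.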
A closed loop reference: BIDM
Whether BI analytics is integrated in the business process or not can strongly affect the decision-making process.
Hence, we consider this category to be a very important one when delimiting a maturity stage
initiation (user driven - activity initiated by the user, process driven - activity initiated by a process)
process integration (data centric - BI analytics is usually supported by a data warehouse, process centric - BI analytics is integrated in the business processes)
processing model (store and analyze; analyze and store)
event stream processing
"closed-loop" environment
Although the mindset here is set for BI (Business Intelligence), it is very generic.
PID process control
When there is measurement, control adjustment becomes a known theory. However, this theory is not simple at all. PID control:
In theory, a controller can be used to control any process that has a measurable output (PV), a known ideal value for that output (SP), and an input to the process (MV) that will affect the relevant PV.
Using this kind of control there is an effect that, in the beginning, things get worse before improvements are seen.
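A minimal discrete PID controller in Python may make the PV/SP/MV terms concrete. The gains and the toy process response below are invented for illustration; tuning real gains is exactly where the theory gets hard:

```python
# Textbook discrete PID step; gains kp/ki/kd and the process model are
# hypothetical example values, not tuned for any real system.
def pid_step(sp, pv, state, kp=0.6, ki=0.1, kd=0.05, dt=1.0):
    error = sp - pv                            # deviation of PV from SP
    state["integral"] += error * dt            # I term: accumulated error
    derivative = (error - state["prev"]) / dt  # D term: rate of change
    state["prev"] = error
    return kp * error + ki * state["integral"] + kd * derivative  # MV

pv, state = 0.0, {"integral": 0.0, "prev": 0.0}
for _ in range(60):
    mv = pid_step(sp=50.0, pv=pv, state=state)
    pv += 0.3 * mv                 # toy process: PV responds partially to MV
print(f"process value: {pv:.1f}")  # oscillates first, settles near SP 50
```

Note the behaviour matches the remark above: the loop overshoots and oscillates before it settles, which is the "worse before better" effect.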
What would be the reason for that?
Some possible options:
knowing the effects of change options instead of trial/error attempts
The announced change is micro managed harming the existing operations
The announced change is not correct, not wanted and being disputed
⚒ Q-1.4.2 Co-ordinating function - Plant value stream
First line, second line supervisors
Looking for the administrative/cyber equivalent of the physical role that is accountable for the value stream flow.
Where and how to find this role when the product is administrative/cyber?
😱 The hierarchical implementation is easily missed.
The accountability and authority for a product: Training Within Industry—Second-Line Supervisor Job Instructions
Training Within Industry and its modules Job Instructions, Job Relations, and Job Methods are well known. ...
Job Instructions for Second-Line Supervisors (nowadays called managers).
This is a hierarchy level higher, and the goal is to support and guide the shop floor supervisors on how to use job instructions.
In the physical world anybody can see what is going on. In the administrative/cyber world we need a tapping point.
The coordinator of the work actions at the floor.
😱 This hierarchical implementation and the importance is easily missed.
The point-of-use provider takes care of the "last mile" (or more precisely last few meters) of the material transport.
This is often for assembly lines, as there is a lot of material arriving.
...
do not create an additional kanban loop between the supermarket and the assembly line.
The effort would heavily outweigh the benefit, making the whole idea pointless.
Instead, the point-of-use provider is close enough to the line to keep an overview about what is needed.
There are a number of benefits to creating a lean water spider position; however, it can also come with some downsides if they are not utilized properly.
Some benefits of the position include the following:
reduces wasted time spent by floor workers;
makes tasks easier to standardize;
helps enable lean manufacturing;
optimizes complex production processes; and
decreases variations.
Some potential downsides to water spiders, however, may include the following:
Managers, if not properly informed, may see the role as less important than it is. If they see it as secondary to production, then additional tasks may be added to their process. This will make for a less productive output.
In addition, with less time to dedicate to their main tasks, the productivity of other workers will be affected.
Water spiders may also sometimes make empty rounds. This could be due to unseen production inefficiencies.
Larger operations may be more difficult to support if a water spider has to maintain multiple work areas.
Lean water spider job responsibilities
When looking for a lean water spider position, job seekers may see postings that express specific responsibilities and look for certain characteristics. For example, a company looking for a lean water spider may seek someone who will carry out the following tasks:
Take various materials, parts and assemblies from a warehouse or staging location to production.
Prepare those materials for proper use.
Use enterprise resource planning (ERP) software.
Use continuous improvement methods to decrease any errors.
Replenish materials and tools as needed or if found defective.
Manage kanbans.
Ensure proper communication among teams.
Employers may also look for key characteristics, including the following:
The water spider must be knowledgeable of specific processes.
The water spider needs to be process-driven and take a systematic approach in completing their rounds and processes.
The water spider also needs to be perpetually on the move, as they have to move materials from one place to another and travel between workstations.
Lean water spider position growth
While the water spider position may sometimes be seen as less important than it really is, the position does also hold a lot of potential for growth.
An individual in this position will learn much about the production floor and how the organization they're working for operates.
Ideally, they will get to know the people there and the individual challenges in day-to-day work.
This role is a good experience to have in order to become a future manager, supervisor or team leader.
Knowing what work is like on the floor before moving into a leadership role gives previous lean water spiders an appreciation for the work process everyone needs to go through, what the workflow is currently like and how it can be kept lean.
⚙ Q-1.4.4 Study of the tasks - the plant, value stream
From: "Want to do a process mining project" slides and videos (vdaalst).
The scientific approach to understanding and managing processes.
😱 Although it is fundamental, it is hardly ever seen in use.
The idea: using data, transformed into information, for seeing what is going on at the shop floor.
In a figure: see right side.
Having processes grouped in the value stream, not all process events will follow the expectation from the value stream map.
In a figure: see right side.
Q-1.5 Processes Building Blocks Basics
The term elephant test refers to situations in which an idea or thing, "is hard to describe, but instantly recognizable when spotted"
A process life cycle building block, the ALC life cycle, is very generic and simplistic.
There are only three possible approaches.
To solve:
😱 PM: project management is ❌ NOT product management.
😱 ALC life cycles are made complicated by blame games.
🎭 Q-1.5.1 Process approaches at the shop floor
Fully human, immediate impact: ALC-V1
⚖ This simple job-shop approach is:
Changing is done in production;
there is no develop/test stage.
Applicable for a one-off delivery.
⚒ Instructions for processing:
Input:
Gather instructions for the transformation
Gather input materials conforming to instructions: quantity & quality
Transform
Result:
Verify results conform specifications
Archive documents, knowledge
⚙ The craftsmanship of the workers is decisive for what is delivered at what cost in some time window.
Delegated but human, validation before change: ALC-V2
⚖ This advanced approach is:
Preparing and designing in a develop/test stage before executing transformations in production: planned, predictable.
Applicable for deliveries done in series, mass production.
⚒ Creating instructions for processing:
Input - requirements for a process:
gather requirements for instructions
materials instructions: quantity & quality
Transform - Create tools supporting activities
Result - validations of a created process:
Verify that result specifications conform to requirements.
Archive documents, knowledge of creating the process.
⚙ The craftsmanship of the building, engineering, is decisive for what process is delivered at what cost in some time window.
⚙ The craftsmanship, education, of the operational workers is part of the process creating instructions and specifications.
⚙ The operational transformation is predictable in what is delivered at what cost in what time.
Computer aided decision making, validation before change: ALC-V3
⚖ This sophisticated approach is:
Preparing designing in a develop test stage.
When deployed to production, production transformations are planned, predictable.
Applicable for deliveries done in series, mass production and for one-off research projects.
⚒ Creating instructions for processing:
Input - requirements for a process:
gather requirements for instructions
gather operational information (data)
materials instructions: quantity & quality
Transform - Use tools supporting activities
Result - evaluations & validations of a created process:
Verify results in information loop-backs
Verify that result specifications conform to requirements.
Archive documents, knowledge of creating the process
⚙ The craftsmanship of the building, engineering, is decisive for what process is delivered at what cost in some time window.
⚙ The craftsmanship, education, of the operational workers is part of the process creating instructions and specifications.
⚙ The operational transformation is predictable in what is delivered at what cost in what time.
🎭 Q-1.5.2 ALC-V1 process details
⚖ This simple job-shop approach is:
The management in the lead has all the knowledge of what to do and how to do it.
The figure shows a classic one-off process, operations:
🎭 Q-1.5.3 ALC-V2 process details
⚖ This advanced approach is:
The management in the lead is able to define globally what to do and how to do it.
A delegated management team is assumed to have all the knowledge of what to do and how to do what lead management ordered.
There are separated environments to build & validate the new processes and run the operations.
Running operational is done by a dedicated team.
The figure shows this classic process: develop, test - operations:
🎭 Q-1.5.4 ALC-V3 process details
⚖ This sophisticated approach is:
The management in the lead is able to define globally what to do and how to do it.
The management in the lead is accountable for the process although they don't know all the logical and technical details.
A delegated management team is not assumed to have all the knowledge of what to do and how to do what lead management ordered.
There is loop-back control to the delegated management on what is going on in the operational process results.
These are used for monitoring signals of the process.
There are loop-back controls to the engineers building, on what is going on in the operational process results.
These operational results are sources to research for issues and improvements in the development environment.
From the engineers building the process it is expected that they make decisions that conform to requirements.
They explain those decisions to the delegated management for impact and risks.
There are separated environments to build & validate the new processes and run the operations.
Running operational is done by a dedicated team.
A personal figure of an ML (machine learning) process: develop, test - operations:
Q-1.6 Organisation & Business knowledge
Once Dorothy and her colleagues made the journey to OZ, they quickly found out that there was no there, there.
The Wizard simply told her what she really should have known all along.
Dorothy and her companions just had to figure out how to reframe their own perceived shortcomings and recast them as strengths to achieve real transformation.
⚙ Q-1.6.1 What is done servicing administrative/cyber technology
Retrospective ICT service, modelling
Looking around at what others are posting and what the direction of the opinions is .... (ALC-V3)
The life cycle is a hot topic (2019).
Just modelling and inventing new processes while not being able to operationalise them doesn't bring the expected value.
There are many more issues to solve than just becoming aware of this.
The algorithm using data to create a model transforms that data into a source.
When auditing or reviewing the model, that data is a source component to be archived as evidence.
Retrospective ICT service, Software building
The agile manifesto redirected all attention to building software.
That is weird, because software isn't the key factor where the value stream processes are.
There are many situations where this is the best approach because there are no better options.
Saying failing fast is the goal is also weird. One of the best lean agile projects was the "race to the moon".
In the Apollo project everything was tested and validated. Only the unforeseen was a problem.
Retrospective ICT service, Service desk
One of the aspects that is copied is how the ICT service desk is managed. It has become part of the ITIL framework.
The Information Technology Infrastructure Library (ITIL) is a set of practices and a framework for IT activities.
There are many companies offering an ITIL course. The common idea is:
known defined services requests to plan.
responding on events that are not planned.
An Apollo 13 ITSM - game
💣 Building and operating in a reliable predictable way is not covered.
A better prioritisation would be to avoid that ITIL process as much as possible.
⚙ Q-1.6.2 What could be serviced by administrative/cyber technology
There is a long, not exhaustive, list of areas where administrative/cyber work is applicable.
Optimisation: agriculture
Food production and demand on a global basis, with special attention paid to the major producers, such as China, India, Brazil, the US and the EU. Agricultural_science (wikipedia)
Optimisation: your home
A smart home is one that provides its home owners comfort, security, energy efficiency (low operating costs) and convenience at all times.
Playing chess is making decisions in a timely manner. With the advice of a machine anyone can beat the master. Note the man behind the computer screen; he is moving the pieces.
⚙ Q-1.6.3 What could be realised by administrative/cyber technology
analyst viewpoint
Even the nice life cycle approach is seen in analyst reports:
Many companies still struggle to realize value from predictive analytics despite considerable investments in technology and human capital. This is largely due to the insights-to-action gap, the disconnect between analytical insights and operational processes
Source: Close The Insights-To-Action Gap With A Clear Implementation Plan Forrester report (Brandon Purcell 2017)
A pity that even the visual and intro went behind a paywall. At the time of first publication in 2015 it was freely accessible.
ALC-V3 process line, closed loops - circular change
The life cycle of the ALC-V3 in a circular visual:
It adds compliance checkpoints.
The model evaluation is done as acceptance testing in the building of the model.
The evaluation and decision to deploy is in coordination with the plant management.
When the need for change in information processing is high a devops perspective using several closed loops gives other attention points:
Model monitoring supported by the model builders.
Data provision for developers, delivered by operational support.
Business change requests, reviewed on impact before deployment.
Compliance with regulations and security, advised before & during development, evaluated regularly after deployment.
The whole process is a closed loop in itself.
At every stage all three involved parties have activities to be coordinated and to be aligned.
The technology using computers has several lines of evolvement. The hardware has become faster, better, cheaper.
Application software has a few basic fundaments: logic by flows and constructs.
The problem to solve has moved from the purely technical aspect of how to run machines into how to process information in a technical way.
⚙ Q-2.1.1 Technology communication - sideways
signal flags
The associated illustration does not have a real meaning, other than that there is something to celebrate.
International maritime signal flags
are various flags used to communicate with ships. The principal system of flags and associated codes is the International Code of Signals.
Various navies have flag systems with additional flags and codes, and other flags are used in special uses, or have historical significance.
There are various methods by which the flags can be used as signals:
A series of flags can spell out a message, each flag representing a letter
Individual flags have specific and standard meanings
One or more flags form a code word whose meaning can be looked up in a code book held by both parties
In yacht racing and dinghy racing, flags have other meanings; for example, the P flag is used as the "preparatory" flag to indicate an imminent start.
optical telegraph
An optical telegraph is a line of stations, typically towers, for the purpose of conveying textual information by means of visual signals. There are two main types of such systems;
the semaphore telegraph which uses pivoted indicator arms and conveys information according to the direction the indicators point, and the shutter telegraph which uses panels that can be rotated to block or pass the light from the sky behind to convey information.
The illustration is from the Napoleonic era. This kind of system was used widely all over the world in earlier times.
Invisible technical communications.
New technology still is evolving.
In telecommunications, 5G is the fifth generation technology standard for cellular networks, which cellular phone companies began deploying worldwide in 2019, the planned successor to the 4G networks which provide connectivity to most current cellphones.
Like its predecessors, 5G networks are cellular networks, in which the service area is divided into small geographical areas called cells.
Not understanding technology brings back old types of conspiracy theories.
During the COVID-19 pandemic, several conspiracy theories circulating online posited a link between severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) and 5G.
This has led to dozens of arson attacks being made on telecom masts in the Netherlands (Amsterdam, Rotterdam, etc.), Ireland (Cork, etc.), Cyprus, the United Kingdom (Dagenham, Huddersfield, Birmingham, Belfast and Liverpool), Belgium (Pelt), Italy (Maddaloni), Croatia (Bibinje) and Sweden.
It led to at least 61 suspected arson attacks against telephone masts in the United Kingdom alone and over twenty in The Netherlands.
When steam trains were introduced, a lot of conspiracy theories were made up in trying to halt change.
When industrialisation replaced the at-home looms, a lot of resistance arose in trying to halt change.
When vaccination was invented for preventive healthcare, a lot of conspiracy theories were made up in trying to halt change.
⚙ Q-2.1.2 Technology logical access - sideways.
CODASYL, transactional usage.
The DBTG group, CODASYL, with Charles_Bachman as the great mind behind network data modelling, was the DBMS standard before SQL. (wikipedia)
CODASYL, the Conference/Committee on Data Systems Languages, was a consortium formed in 1959 to guide the development of a standard programming language that could be used on many computers.
This effort led to the development of the programming language COBOL and other technical standards.
Almost forgotten: a network model database is a NoSQL one.
In October 1969 the DBTG published its first language specifications for the network database model which became generally known as the CODASYL Data Model. (wiki)
This specification in fact defined several separate languages: a data definition language (DDL) to define the schema of the database, another DDL to create one or more subschemas defining application views of the database;
and a data manipulation language (DML) defining verbs for embedding in the COBOL programming language to request and update data in the database.
Technology backend (von Neumann)
This topic did not change in concepts from the beginning.
❶ The improvements in what is possible to realise with hardware and software are, however, changing fast.
The principles of segregation by processing speed and of parallel processing are still valid.
❷ The change in speed of communication lines enables doing a lot across large physical distances.
Unlimited capacity and unlimited speed will never be reality.
Performance & Tuning hardware:
❶ Knowing approximate time costs for requests
    👉🏾 going to optimize is making choice decisions between resources.
❷ minimize resource usage consuming most time. ⌛
    👉🏾 better algorithm
❸ trying to run processes parallel instead of serial (see the sketch after this list). ⏳
    👉🏾 scheduling
❹ preventing overload 🚧 conditions in any resourced used in the chain.
    👉🏾 load balancing
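A small illustration of point ❸, sketched in Python with a hypothetical I/O-bound task (the 0.2 s sleep stands in for waiting on a remote resource):

```python
# Serial versus parallel execution of independent, I/O-bound requests.
import time
from concurrent.futures import ThreadPoolExecutor

def request(n):
    time.sleep(0.2)            # placeholder for waiting on a remote resource
    return n * n

t0 = time.perf_counter()
serial = [request(n) for n in range(8)]           # ~8 x 0.2 s
t1 = time.perf_counter()
with ThreadPoolExecutor(max_workers=8) as pool:   # overlap the waiting
    parallel = list(pool.map(request, range(8)))  # ~1 x 0.2 s
t2 = time.perf_counter()
print(f"serial {t1 - t0:.2f}s, parallel {t2 - t1:.2f}s")
```

The gain only appears because the tasks are independent and dominated by waiting; CPU-bound work would need processes instead of threads.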
Technology Network - endpoints
Local hardware speed is not the bottleneck; we are moving into an always-connected situation.
The possible speed is growing fast.
In 1991, two years after its invention by Tim Berners-Lee, researchers at CERN in Switzerland released the World Wide Web (WWW) to other institutions and then later that year to the general public.
Back then, it had a global average connection speed of just 14.4 kilobits per second (kbit/s). Only a handful of pages could be visited, all with static HTML and limited functionality.
...
Today, there is a divide between the most advanced nations and those with less developed infrastructure.
In Singapore, for example, you can expect nearly 300 Mbit/s as standard, while in Turkmenistan the average is just 4.3 Mb/s.
As of 2022, the global average is approximately 100 Mbit/s and growing by 20% each year.
Technology Frontend
Frontend and backend,
In software engineering, the terms frontend and backend (sometimes written as back end or back-end) refer to the separation of concerns between the presentation layer (frontend), and the data access layer (backend) of a piece of software, or the physical infrastructure or hardware. In the client–server model, the client is usually considered the frontend and the server is usually considered the backend, even when some presentation work is actually done on the server itself.
⚙ Q-2.1.3 Technology software crisis
Software crisis in the 1960s
The software crisis in the '60s, organising conferences.
NATO reports (Brian Randell), a beautiful document:
NATO 1969, E. Dijkstra and d'Agapeyeff's inverted pyramid.
Dependencies in layers where you don't want to be dependent; decoupling by interfaces (middleware).
This is because no matter how good the manufacturer's software for items like file handling is, it is just not suitable; it's either inefficient or inappropriate.
We usually have to rewrite the file handling processes, the initial message analysis and above all the real-time schedulers,
because in this type of situation the application programs interact and the manufacturer's software tends to throw them off at the drop of a hat, which is somewhat embarrassing.
On the top you have a whole chain of application programs.
The point about this pyramid is that it is terribly sensitive to change in the underlying software such that the new version does not contain the old as a subset.
It becomes very expensive to maintain these systems and to extend them while keeping them live.
d'Agapeyeff (from Reducing the cost of software): Programming is still too much of an artistic endeavour.
We need a more substantial basis to be taught and monitored in practice on the:
structure of programs and the flow of their execution.
shaping of modules and an environment for their testing.
simulation of run time conditions.
Software crisis in the 2020s
Stating that virtual machines, Docker containers, and going to the cloud are cheap is not correct.
The cost is in supporting and maintaining what is on top of that.
Nothing has really changed since those 1969 days.
Technology for logic
Any process should have a design that is adequate, correct, elegant.
Ordering thoughts about the design of a process has a simple approach: making drawings like flowcharts.
The quick & dirty approach of trial and error, where failing fast and breaking things is accepted, is only acceptable when the risk assessment allows it.
⚙ Q-2.1.4 Programming languages
Algol - semicolon-based languages.
Algol 60, as a semicolon-based language, bypassed the requirements of using Hollerith cards.
ALGOL 60 was used mostly by research computer scientists in the United States and in Europe.
Its use in commercial applications was hindered by the absence of standard input/output facilities in its description and the lack of interest in the language by large computer vendors.
ALGOL 60 did however become the standard for the publication of algorithms and had a profound effect on future language development.
Some modern languages are using these concepts.
Powershell one-liners.
Many programming and scripting languages require a semicolon at the end of each line.
Cobol - column based languages.
Cobol, a language survivor,
is a compiled English-like computer programming language designed for business use. It is an imperative, procedural and, since 2002, object-oriented language.
There are cultural similarities to Java and Python.
Python uses whitespace indentation, rather than curly brackets or keywords, to delimit blocks.
An increase in indentation comes after certain statements; a decrease in indentation signifies the end of the current block.
Thus, the program's visual structure accurately represents the program's semantic structure.
This feature is sometimes termed the off-side rule, which some other languages share, but in most languages indentation doesn't have any semantic meaning.
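A small self-made fragment showing that off-side rule in Python: the block under the `if` exists only by virtue of its indentation, where a semicolon- or brace-based language would use explicit delimiters:

```python
# Indentation delimits the blocks; no braces or semicolons are needed.
def classify(value):
    if value < 0:
        sign = "negative"        # this line belongs to the if-block
        value = -value           # so does this one
    else:
        sign = "non-negative"
    return sign, value           # dedented: back at function level

print(classify(-42))             # ('negative', 42)
```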
Hadoop, Elasticsearch - NoSQL
These are recent products, recent approaches, that go back to old concepts.
3GL languages going into more advanced ones: low code.
Procedural programming languages leave the details of accessing objects fully to the programmer. Examples are Cobol, Java, Python.
When processing many objects in a similar way, much of the coding of repetitions can be handed over to the language. RPG and SAS (base) are examples.
🎯 New goals:
Moving from technology-driven into business-driven.
Going into low code: handing over to a graphical click approach instead of a language.
Q-2.2 Communication - Interactions
Working with machines that process information is a relatively new topic of science.
Human communication and interaction is a classic one.
The concept of the "information" container is not that clear and simple.
Information is an abstract concept that refers to that which has the power to inform.
At the most fundamental level, information pertains to the interpretation (perhaps formally) of that which may be sensed, or their abstractions.
Any natural process that is not completely random and any observable pattern in any medium can be said to convey some amount of information.
⚙ Q-2.2.1 Information Communication Technology
Technology started with communication
The change in information processing did not start with machines doing calculations.
It started with using technology to exchange information: communication.
Instead of humans using a messenger, with something written or just spoken, it went digital with the telegraph.
These systems led to new telegraph codes, starting with the Baudot code. However, telegrams were never able to compete with the letter post on price, and competition from the telephone, which removed their speed advantage, drove the telegraph into decline from 1920 onwards.
The few remaining telegraph applications were largely taken over by alternatives on the internet towards the end of the 20th century.
Technology started with computers
The first machine classified as a computer had its roots in analysing communication.
Colossus
was a set of computers developed by British codebreakers in the years 1943–1945 to help in the cryptanalysis of the Lorenz cipher.
Colossus used thermionic valves (vacuum tubes) to perform Boolean and counting operations.
Colossus is thus regarded as the world's first programmable, electronic, digital computer, although it was programmed by switches and plugs and not by a stored program.
Confused focus in subject
There is a strange misunderstanding in concepts.
When referring to the science of information processing, people start to think of just programming languages.
That is, however, not an interesting technology component.
💣 Issue: Ignoring the process of how information should get understood.
When referring to communications for information processing, people also start to think of programming languages.
🤔 Ignoring the communication, human interaction for decisions.
😱 Assuming the technical enablement of data exchange in a technical network is all there is.
Ignoring what "communications" really is: the information for decisions.
When referring to the technology of information processing, people go for building databases and programs to deliver some numbers and figures.
🚧 Ignoring the process of how information for decisions should get treated, explained and presented, while not ignoring all uncertainties and ethical aspects.
❓ Is the technology part about basic technology, or is it about the decision maker, the imperator?
Without a better understanding on the concepts we are condemned to make the same mistakes over and over again.
⚙ Q-2.2.2 Information representations
Data encoding, decoding.
👓 As soon as information could be reliably encoded and decoded into meaningful information, the question arose how to prevent it being seen by those who should not know that information.
The battle of encryption and decryption of information adds an additional layer on top of the encoding and decoding.
The Colossus was an automated machine helping in that information war. It was not a generic computer system.
Technology: fixed and variable length of encoded messages.
For ease of technical implementation, fixed-length sizing is preferred.
For ease of text messages, variable lengths are preferred.
For reliable technical data transfer, after a number of signals at the same level the opposite level is required. Adding another reversed-level signal quickly adds some length.
💣 Mixing up several conventions can cause an unwanted, unclear encoding of information. In a comma-separated file, is the comma part of a digit or is it really a segregation of fields?
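A short Python illustration of that comma ambiguity; the csv module resolves it by quoting, which is exactly such a convention that sender and receiver must share:

```python
# The value "1,5" (a decimal comma) would be ambiguous in a CSV line;
# quoting is the shared convention that keeps it one field.
import csv, io

buffer = io.StringIO()
csv.writer(buffer).writerow(["item", "1,5", "EUR"])
print(buffer.getvalue().strip())   # item,"1,5",EUR -> the comma is protected

row = next(csv.reader(io.StringIO(buffer.getvalue())))
print(row)                         # ['item', '1,5', 'EUR'] -> 3 fields, not 4
```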
ASCII - EBCDIC - single byte
These are encodings: interpretations of a single byte (8 bits, 256 different values).
Properties:
Every byte can be processed by the computer as a valid byte/character.
Not all interpretations exist in defined ASCII tables or EBCDIC tables.
Transcoding without loss of information is not always possible.
For the Unicode multi-byte encodings the properties differ:
Not every byte can be processed by the computer as valid.
A character is made up of between 1 and 4 bytes.
UTF-8 - variable, 1-4 bytes; the lower range (0-127) follows the old ASCII interpretation.
The first 32 are control characters, not used as text characters.
UTF-16LE, UTF-16BE - 2 or 4 bytes; BE big-endian or LE little-endian.
The endianness follows the byte order of the preferred hardware chip architecture: (LE) least significant byte first or (BE) most significant byte first.
UTF-32LE, UTF-32BE - 4 bytes; BE big-endian or LE little-endian.
Unicode is becoming the de facto standard.
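A short Python check of these properties, using a sample text that mixes plain ASCII with a character outside it:

```python
# Byte lengths of the same text in different encodings.
text = "Ada€"                          # 'A','d','a' are ASCII; '€' is not

print(len(text.encode("utf-8")))       # 6  : 3 x 1 byte + 3 bytes for '€'
print(len(text.encode("utf-16-le")))   # 8  : 4 code units of 2 bytes
print(len(text.encode("utf-32-le")))   # 16 : 4 code units of 4 bytes

b = bytes([0x80])                      # a lone 0x80 is not valid UTF-8 ...
print(repr(b.decode("latin-1")))       # ... but any byte is valid single-byte
try:
    b.decode("utf-8")
except UnicodeDecodeError as e:        # "not every byte is valid" in UTF-8
    print("invalid UTF-8:", e.reason)
```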
Floating point, math in ICT
A basic issue in ICT with math: inaccuracy in calculations, floating-point numbers.
Using a slide rule you had to think about that, and by that you knew the possible impact.
Using a computer everybody trusts the results, until surprisingly wrong results draw complaints in feedback responses.
The standard provides for many closely related formats, differing in only a few details. Five of these formats are called basic formats, and others are termed extended precision formats and extendable precision format.
Three formats are especially widely used in computer hardware and languages:
Single precision, has a precision of 24 bits (about 7 decimal digits)
Double precision, has a precision of 53 bits (about 16 decimal digits).
Double precision.
The common format in programming is the double format. Using a GPU (NVIDIA) it is often half precision; a difference of less than 1 per mille is seen as sufficient for image processing.
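A minimal Python demonstration of that inaccuracy; the digits shown are standard IEEE 754 double-precision behaviour:

```python
# IEEE 754 doubles cannot represent most decimal fractions exactly.
print(0.1 + 0.2)                 # 0.30000000000000004, not 0.3
print(0.1 + 0.2 == 0.3)          # False

total = sum(0.1 for _ in range(10))
print(total)                     # 0.9999999999999999: the error accumulates

import math
print(math.isclose(total, 1.0))  # True: compare with a tolerance, not ==
```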
Noise & synchronisation - error detection
Although everything is said to be digital, the transmissions use real-world phenomena in an analog world.
Shannon-Hartley
In information theory, the Shannon–Hartley theorem gives the maximum rate at which information can be transmitted over a communications channel of a specified bandwidth in the presence of noise.
It is an application of the noisy-channel coding theorem to the archetypal case of a continuous-time analog communications channel subject to Gaussian noise.
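The theorem itself is one line: C = B · log2(1 + S/N), with capacity C in bit/s, bandwidth B in Hz and S/N the linear signal-to-noise ratio. A small sketch with illustrative numbers (the classic telephone-line case):

```python
# Shannon-Hartley channel capacity: C = B * log2(1 + S/N).
import math

def capacity(bandwidth_hz, snr_linear):
    return bandwidth_hz * math.log2(1 + snr_linear)

# A classic telephone line: ~3000 Hz bandwidth, SNR of 30 dB (1000x linear).
snr = 10 ** (30 / 10)
print(f"{capacity(3000, snr):.0f} bit/s")   # roughly 30 kbit/s
```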
Bit-stuffing serves a second purpose, that of ensuring a sufficient number of signal transitions.
On synchronous links, the data is NRZI encoded, so that a 0-bit is transmitted as a change in the signal on the line, and a 1-bit is sent as no change.
Thus, each 0 bit provides an opportunity for a receiving modem to synchronize its clock via a phase-locked loop.
If there are too many 1-bits in a row, the receiver can lose count. Bit-stuffing provides a minimum of one transition per six bit times during transmission of data, and one transition per seven bit times during transmission of a flag.
This is really technical but on the low technical level necessary and in use at all kind of common devices.
This "change-on-zero" is used by High-Level Data Link Control and USB.
They both avoid long periods of no transitions (even when the data contains long sequences of 1 bits) by using zero-bit insertion.
HDLC transmitters insert a 0 bit after 5 contiguous 1 bits (except when transmitting the frame delimiter "01111110"). USB transmitters insert a 0 bit after 6 consecutive 1 bits.
The receiver at the far end uses every transition, both from 0 bits in the data and these extra non-data 0 bits, to maintain clock synchronization.
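A small Python sketch of the HDLC-style zero-bit insertion described above; after five consecutive 1 bits a 0 is stuffed in, so the flag pattern 01111110 can never occur inside the payload:

```python
# HDLC-style bit stuffing: insert a 0 after five consecutive 1 bits.
def stuff(bits):
    out, run = [], 0
    for b in bits:
        out.append(b)
        run = run + 1 if b == 1 else 0
        if run == 5:                 # five 1s in a row: force a transition
            out.append(0)
            run = 0
    return out

def unstuff(bits):
    out, run, i = [], 0, 0
    while i < len(bits):
        out.append(bits[i])
        run = run + 1 if bits[i] == 1 else 0
        if run == 5:                 # the next bit is the stuffed 0: skip it
            i += 1
            run = 0
        i += 1
    return out

data = [1, 1, 1, 1, 1, 1, 1, 0, 1]       # seven 1s in a row
stuffed = stuff(data)
print(stuffed)                           # [1, 1, 1, 1, 1, 0, 1, 1, 0, 1]
assert unstuff(stuffed) == data          # lossless round trip
```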
The level of risk can be estimated by using statistical analysis and calculations combining impact and likelihood.
Any formulas and methods for combining them must be consistent with the criteria defined when establishing the Risk Management context.
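A trivial sketch of such a combination, computing risk as expected loss; the scenario names and numbers are invented for illustration, not taken from any real risk register:

```python
# Risk as expected loss = likelihood x impact (invented example figures).
scenarios = {
    "disk failure":   (0.10, 20_000),    # (probability per year, cost)
    "data breach":    (0.02, 500_000),
    "operator error": (0.30, 5_000),
}
for name, (likelihood, impact) in scenarios.items():
    print(f"{name:>15}: expected loss {likelihood * impact:,.0f} per year")
```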
Keep it simple, stupid (KISS)
Simplifying is possible in several ways. Occam's_razor:
Occam's razor, Ockham's razor, Ocham's razor (Latin: novacula Occami) or law of parsimony (Latin: lex parsimoniae) is the problem-solving principle that "entities should not be multiplied without necessity."
The idea is attributed to English Franciscan friar William of Ockham (c. 1287-1347), a scholastic philosopher and theologian who used a preference for simplicity to defend the idea of divine miracles.
It is variously paraphrased by statements like "the simplest explanation is most likely the right one".
KISS principle
The KISS principle states that most systems work best if they are kept simple rather than made complicated; therefore, simplicity should be a key goal in design, and unnecessary complexity should be avoided.
⚠ The only question not asked and not answered is what, in a given situation, is simple.
What is a simple step for one person may be felt by another person to be very complicated.
Chaotic systems.
Chaos_theory
Chaos theory is a branch of mathematics focusing on the study of chaos states of dynamical systems whose apparently-random states of disorder and irregularities are often governed by deterministic laws that are highly sensitive to initial conditions.
💣 Education focuses on predictable deterministic systems, assuming that when you know all inputs you can predict the outcome with a defined certainty. This is not the truth.
Small differences in initial conditions, such as those due to errors in measurements or due to rounding errors in numerical computation,
can yield widely diverging outcomes for such dynamical systems, rendering long-term prediction of their behavior impossible in general.
This can happen even though these systems are deterministic, meaning that their future behavior follows a unique evolution and is fully determined by their initial conditions, with no random elements involved.
In other words, the deterministic nature of these systems does not make them predictable. This behavior is known as deterministic chaos, or simply chaos. The theory was summarized by Edward Lorenz.
Monte Carlo methods: the underlying concept is to use randomness to solve problems that might be deterministic in principle.
They are often used in physical and mathematical problems and are most useful when it is difficult or impossible to use other approaches.
Monte Carlo methods are mainly used in three problem classes: optimization, numerical integration, and generating draws from a probability distribution.
Las Vegas algorithms were introduced by Laszlo Babai in 1979, in the context of the graph isomorphism problem, as a dual to Monte Carlo algorithms.
Babai introduced the term "Las Vegas algorithm" alongside an example involving coin flips: the algorithm depends on a series of independent coin flips, and there is a small chance of failure (no result).
However, in contrast to Monte Carlo algorithms, the Las Vegas algorithm can guarantee the correctness of any reported result.
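A compact Python contrast of the two families: the Monte Carlo estimate always finishes but is only approximately right, while the Las Vegas search is always right but takes a random amount of time:

```python
import random

# Monte Carlo: fixed effort, approximate answer (estimating pi here).
def monte_carlo_pi(samples=100_000):
    inside = sum(random.random() ** 2 + random.random() ** 2 <= 1.0
                 for _ in range(samples))
    return 4 * inside / samples          # close to 3.14, never guaranteed exact

# Las Vegas: the result is guaranteed correct, only the runtime is random.
def las_vegas_find(items, wanted):
    while True:                          # keep guessing positions at random
        i = random.randrange(len(items))
        if items[i] == wanted:
            return i                     # a returned index is always correct

print(monte_carlo_pi())
print(las_vegas_find([3, 1, 4, 1, 5, 9], 5))   # always prints 4
```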
Random numbers
Generating good random numbers is an everlasting question; the realisations also keep changing.
Mersenne_twister
The Mersenne Twister was developed in 1997 by Makoto Matsumoto and Takuji Nishimura. It was designed specifically to rectify most of the flaws found in older PRNGs.
The most commonly used version of the Mersenne Twister algorithm is based on the Mersenne prime 2**19937-1. The standard implementation of that, MT19937, uses a 32-bit word length.
There is another implementation (with five variants) that uses a 64-bit word length, MT19937-64; it generates a different sequence.
A note on why there are shortcomings: (B. A. Wichmann). Another shortcoming still present is seed values being predictable from the system clock.
Algorithm AS 183, Hill and Wichmann (1982) and Wichmann and Hill (1982) resulted.
It has had a "good innings" but its cycle length of about 7x10**12 must now be considered inadequate.
It has been reported (McCullough and Wilson, 2005) as having failed some tests at a probability level of less than 10**-15, which surely is indicative of a major failing.
Computing developments over the last quarter of a century now make a better version both possible and desirable.
In particular, there does not now seem to be a need for the 16-bit restriction, as 32-bit availability is almost universal.
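Python's random module uses exactly this MT19937 generator underneath; a small sketch of the reproducibility a fixed seed gives, and of the alternative when a guessable seed (such as the system clock) is a risk:

```python
import random, secrets

# CPython's random module is an MT19937 Mersenne Twister underneath.
rng = random.Random(19937)          # a fixed seed reproduces the stream ...
print([rng.randint(0, 99) for _ in range(5)])
rng = random.Random(19937)
print([rng.randint(0, 99) for _ in range(5)])   # ... identical output

# Determinism is a feature for simulations, but a weakness when the seed
# is guessable; for security-related randomness use the OS CSPRNG instead.
print(secrets.token_hex(8))         # not reproducible, not MT19937
```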
Aristotle & Plato
These ancient Greek philosophers stated the problem with analytics: the meaning and the concepts.
From the theory_of_universals (Aristotle) to Platonic realism (Socrates).
Plato and Xenophon's dialogues provide the main source of information on Socrates's life and thought.
These writings are the Sokratikoi logoi, or Socratic dialogues, which consist of reports of conversations apparently involving Socrates.
The most famous: the allegory of the cave.
Any disruptive change, although being the real truth, is not automatically an acceptable option for all involved.
The difficult decision is what to do in those kinds of situations.
🎯 New goals:
Understanding, accepting and acting on all kinds of uncertainties in decisions.
Managing the options in decisions under uncertainty with the help of technology.
Eliminating the misunderstandings caused by uncertainties in decisions.
Decisions on information change by perspective.
Q-2.3 Historical evolvement ICT
Information is often processed iteratively: Data available at one step are processed into information to be interpreted and processed at the next step.
For example, in written text each symbol or letter conveys information relevant to the word it is part of, each word conveys information relevant to the phrase it is part of,
each phrase conveys information relevant to the sentence it is part of, and so on until at the final step information is interpreted and becomes knowledge in a given domain.
⚙ Q-2.3.1 Computer Technology Basics.
Analog computing.
A quick, acceptable result is a good option using analog devices.
The problem with those is their single purpose and limited accuracy.
The advantage of using dedicated scale models is seeing issues not visible in another way.
The oldest kind of application is navigation: the astrolabe,
an elaborate inclinometer, which can be considered an analog calculator capable of working out several different kinds of problems in astronomy.
Ada Lovelace - Charles_Babbage
Ada Lovelace
Between 1842 and 1843, Ada translated an article by Italian military engineer Luigi Menabrea on the calculating engine, supplementing it with an elaborate set of notes, simply called Notes.
These notes contain what many consider to be the first computer program, that is, an algorithm designed to be carried out by a machine.
Charles Babbage, considered by some to be a "father of the computer".
Babbage is credited with inventing the first mechanical computer that eventually led to more complex electronic designs,
though all the essential ideas of modern computers are to be found in Babbage´s analytical engine.
Morse
Samuel Morse
The Morse system for telegraphy, which was first used in about 1844, was designed to make indentations on a paper tape when electric currents were received. ...
Morse code was developed so that operators could translate the indentations marked on the paper tape into text messages.
In his earliest code, Morse had planned to transmit only numerals and to use a codebook to look up each word according to the number which had been sent.
However, the code was soon expanded by Alfred Vail in 1840 to include letters and special characters so it could be used more generally.
Vail estimated the frequency of use of letters in the English language.
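As a small illustration of such an encoding, a Python sketch with a fragment of the International Morse table (deliberately incomplete); note how Vail's frequency estimate shows: the common E gets the shortest code.

    # Fragment of the International Morse code table.
    MORSE = {
        "E": ".",  "T": "-",   "A": ".-",  "I": "..",
        "N": "-.", "S": "...", "O": "---", "D": "-..",
    }

    def encode(text: str) -> str:
        # Encode known letters, separating them with a space.
        return " ".join(MORSE[ch] for ch in text.upper() if ch in MORSE)

    print(encode("sos"))   # ... --- ...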
Explanation of choices
This collection for algorithms and computers is:
First: an advanced analog approach with a dedicated goal that works, with all the approximations and guesses of real-world problems, is a neglected awareness.
The decisions to make are the most important goal here.
Second: Ada & Babbage are very common when reviewing computer history.
Third: encoding information in technology for communication is a basic requirement.
Information processing is a combination of these three parts.
⚙ Q-2.3.2 Information Communication Technology.
Heinrich Hertz
Heinrich Rudolf Hertz
Between 1886 and 1889 Hertz would conduct a series of experiments that would prove the effects he was observing were results of Maxwell's predicted electromagnetic waves.
Starting in November 1887 with his paper "On Electromagnetic Effects Produced by Electrical Disturbances in Insulators",
Hertz would send a series of papers to Helmholtz at the Berlin Academy, including papers in 1888 that showed transverse free space electromagnetic waves traveling at a finite speed over a distance.
Nikola Tesla
Nikola Tesla
The three big firms, Westinghouse, Edison, and Thomson-Houston, were trying to grow in a capital-intensive business while financially undercutting each other.
There was even a "war of currents" propaganda campaign going on, with Edison Electric trying to claim their direct current system was better and safer than the Westinghouse alternating current system.
Competing in this market meant Westinghouse would not have the cash or engineering resources to develop Tesla´s motor and the related polyphase system right away.
Marconi
Guglielmo Marconi
Late one night, in December 1894, Marconi demonstrated a radio transmitter and receiver to his mother, a set-up that made a bell ring on the other side of the room by pushing a telegraphic button on a bench.
Supported by his father, Marconi continued to read through the literature and picked up on the ideas of physicists who were experimenting with radio waves.
He developed devices, such as portable transmitters and receiver systems, that could work over long distances, turning what was essentially a laboratory experiment into a useful communication system.
Marconi came up with a functional system with many components.
Explanation of choices
This collection for communication technology is:
First: Hertz, the theoretical enabling of communication.
Second: Tesla, making electric energy technology a commodity, commonly available. Electric energy is the enabler for machines, including computers.
Third: Marconi, turning communication into useful applications. The radio operator (Dutch: marconist) on ships using Morse code is an example.
Information processing is a combination of these three parts.
Going from vacuum tubes to transistors and then chips was an evolution building on this basic knowledge.
The number of information elements has grown dramatically. Storage measured in tebibytes (2**40 bytes) has become normal, one byte being 8 bits.
Communication speeds of 100 Mbit/s and more have become normal. Binary storage measures increase in factors of 1024 (2**10); network speeds are usually quoted in decimal units, where 100 Mbit/s is 10**8 bits/second.
byte
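The difference between binary and decimal prefixes is easy to check in a few lines of Python (a worked example, nothing more):

    # Binary prefixes grow in factors of 1024 (2**10).
    KIB, MIB, TIB = 2**10, 2**20, 2**40

    print(1 * TIB)        # 1 TiB = 1099511627776 bytes
    print(10**12)         # 1 TB (decimal) = 1000000000000 bytes

    # Network speeds are usually decimal: 100 Mbit/s.
    bits_per_second = 100 * 10**6
    print(round(bits_per_second / 8 / MIB, 1))   # about 11.9 MiB/s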
⚙ Q-2.3.3 Information Technology Fundaments.
Hardware design - John von Neumann
von Neumann
The term "von Neumann architecture" has evolved to mean any stored-program computer in which an instruction fetch and a data operation cannot occur at the same time because they share a common bus.
This is referred to as the von Neumann bottleneck and often limits the performance of the system.
The design of a von Neumann architecture machine is simpler than a Harvard architecture machine which is also a stored-program system but has one dedicated set of address and data buses for reading and writing to memory, and another set of address and data buses to fetch instructions.
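A toy illustration of the shared-memory idea in Python (the three-word instruction format is invented for this sketch): program and data live in the same memory, and every instruction fetch goes through the same path as every data access.

    # One memory holds both instructions and data: the von Neumann idea.
    # Invented instruction format: (opcode, address, address).
    memory = [
        ("ADD", 4, 5),   # 0: mem[4] = mem[4] + mem[5]
        ("PRN", 4, 0),   # 1: print mem[4]
        ("HLT", 0, 0),   # 2: stop
        0,               # 3: unused
        2, 40,           # 4, 5: data words
    ]

    pc = 0
    while True:
        op, a, b = memory[pc]   # instruction fetch, over the shared bus
        pc += 1
        if op == "ADD":
            memory[a] = memory[a] + memory[b]   # data access, same memory
        elif op == "PRN":
            print(memory[a])                    # prints 42
        elif op == "HLT":
            break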
Mother of 3GL languages - Grace Hopper
Grace Hopper (Yale edu)
In addition to their work for the Navy, Hopper and her colleagues also completed calculations for the Army and "ran numbers" used by John von Neumann in developing the plutonium bomb dropped on Nagasaki, Japan.
...
Though the term "bug" had been used by engineers since the 19th century to describe a mechanical malfunction, Hopper was the first to refer to a computer problem as a "bug" and to speak of "debugging" a computer.
...
As the number of computer languages proliferated, the need for a standardized language for business purposes grew. In 1959 COBOL (short for "common business-oriented language") was introduced as the first standardized general business computer language.
Although many people contributed to the "invention" of COBOL, Hopper promoted the language and its adoption by both military and private-sector users.
Data, Information structuring - Edgar F. Codd
Set the terms for relational data, transactional usage.
Edgar F. Codd, the man who killed CODASYL.
An English computer scientist who, while working for IBM, invented the relational model for database management, the theoretical basis for relational databases and relational database management systems.
He made other valuable contributions to computer science, but the relational model, a very influential general theory of data management, remains his most mentioned, analyzed and celebrated achievement.
.... He published his 12 rules to define what constituted a relational database. This made his position at IBM increasingly difficult.
Codd´s 12 rules
Rule 0, the foundation rule: For any system that is advertised as, or claimed to be, a relational data base management system, that system must be able to manage data bases entirely through its relational capabilities.
Rule 1, the information rule: All information in a relational data base is represented explicitly at the logical level and in exactly one way – by values in tables.
Rule 2, the guaranteed access rule: Each and every datum (atomic value) in a relational data base is guaranteed to be logically accessible by resorting to a combination of table name, primary key value and column name.
Rule 3, systematic treatment of null values: Null values (distinct from the empty character string or a string of blank characters and distinct from zero or any other number) are supported in fully relational DBMS for representing missing information and inapplicable information in a systematic way, independent of data type.
Rule 4, dynamic online catalog based on the relational model: The data base description is represented at the logical level in the same way as ordinary data, so that authorized users can apply the same relational language to its interrogation as they apply to the regular data.
Rule 5, the comprehensive data sublanguage rule: A relational system may support several languages and various modes of terminal use (for example, the fill-in-the-blanks mode). However, there must be at least one language whose statements are expressible, per some well-defined syntax, as character strings and that is comprehensive in supporting all of the following items:
Data definition.
View definition.
Data manipulation (interactive and by program).
Integrity constraints.
Authorization.
Transaction boundaries (begin, commit and rollback).
Rule 6, the view updating rule: All views that are theoretically updatable are also updatable by the system.
Rule 7, relational operations rule (high-level insert, update, and delete): The capability of handling a base relation or a derived relation as a single operand applies not only to the retrieval of data but also to the insertion, update and deletion of data.
Rule 8, physical data independence: Application programs and terminal activities remain logically unimpaired whenever any changes are made in either storage representations or access methods.
Rule 9, logical data independence: Application programs and terminal activities remain logically unimpaired when information-preserving changes of any kind that theoretically permit unimpairment are made to the base tables.
Rule 10, integrity independence: Integrity constraints specific to a particular relational data base must be definable in the relational data sublanguage and storable in the catalog, not in the application programs.
Rule 11, distribution independence: The end-user must not be able to see that the data is distributed over various locations. Users should always get the impression that the data is located at one site only.
Rule 12, the nonsubversion rule: If a relational system has a low-level (single-record-at-a-time) language, that low level cannot be used to subvert or bypass the integrity rules and constraints expressed in the higher level relational language (multiple-records-at-a-time).
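Rule 2 in particular is easy to make concrete. A minimal sketch with Python's standard sqlite3 module (table and values invented for the example): every atomic value is reachable through table name, primary key value and column name.

    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT, city TEXT)")
    con.execute("INSERT INTO customer VALUES (1, 'Ada', 'London')")

    # Rule 2, guaranteed access: table name + primary key value + column name.
    row = con.execute("SELECT city FROM customer WHERE id = 1").fetchone()
    print(row[0])   # London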
Explanation of choices
This collection for information technology fundaments is:
First: von Neumann, the theoretical and practical enabling of computers. More curious is:
He had perhaps the widest coverage of any mathematician of his time, integrating pure and applied sciences and making major contributions to many fields, including mathematics, physics, economics, computing, and statistics.
Second: Hopper is the clear and usual choice for programming logic, bringing it to a next level. With Java and Python not that much has changed; the quality of the work might even have gone down since the beginning.
Outside of academia, she organized myriad workshops and conferences to promote understanding of programming and expand the community of computer programmers.
Third: Edgar F. Codd; the decoupling of data using SQL was and is a game changer for a comfort zone.
The most important effect is the decoupling of procedural logic from the data storage. Only by exception is there a need to combine the two, when moving out of comfort zones.
New technologies still go back to those comfort zones as they mature.
Information processing is a combination of these three parts.
🎯 New horizons:
Including: risk management for all kinds of uncertainties
Avoiding complexity by accepting uncertainties with inequality for decisions
Developing and realising tools for real information processing, bpm & brm
Aligning: business process management (bpm) and business rules management (brm)
Q-2.4 Processing flows VSM - Change
A swarm organisation, a self-organisation, is a networked structure without leadership, working from some shared goal.
⚠ Challenges: have a shared goal, have a good shared goal.
The organisation structure is a hierarchical line of command.
📚 Challenges:
Have leadership that supports and enables shared goals.
There is a difference between product management and project management; let the product be in the lead.
⚙ Q-2.4.1 Product management processes
Administrative/cyber processing similar to a factory
There are four stages:
Any of the four process phases might get split up in sub-processes.
There is a very detailed version of the business process value stream (2,3,4) (5,6).
In that detailed version the administration and business controls are visible (Center).
A clockwise cycle. The pull (request) starts at IV (0,1). Push (delivery) ends at II (9,0).
In a figure
See right side
Decisions with control on processes.
A different position of the control hierarchy: ❶ The controls for each stage are communicated to a central position, in the middle. ❷ The most important controls are those that are related to the handovers in the cycle, the diagonals. ❸ The second level in the hierarchy is positioned slightly differently from the shop floor.
The coverage of the second level is on the basic vertical-horizontal positions. ❹ At the floor they are at decision points (Enable-Assess).
Mindset prerequisites: Siar model - static
The model covers all of:
simple processes: 0 - 9
The duality between processes, transformations, and information, data
four quadrants:
Push Pull,
lean agile requests deliveries
realistic human interaction & communication.
nine plane:
Steer Shape Serve
Strategy, Tactics, Operational
Accountabilities, responsibilities, roles
value stream: left to right
PDCA, DMAIC, lean agile improvements
Mindset prerequisites: Siar model - dynamic
The cubicle representation of the model shows a lot for categorisation.
The static element, information, is easy to place.
Processing, transforming, is a dynamic activity.
A circular representation is a better fit.
The cycle:
Ideate - Assess (pull)
Plan - Enable (pull)
Demand, Backend (push)
Frontend , Delivery (push)
Customer interaction: bottom right side.
Supply chain interaction: bottom left side.
⚙ Q-2.4.2 Delivering data products in a cycle
Combining the factory approach with the four basic administrative/cyber steps: data-driven processing.
A figure:
See right side
⚙ Q-2.4.3 Three stages in realisation
Materials retrieval
Requirements:
Know for whom the processing is done
Know who is accountable for the retrieved information
get from correct agreed locations
agreed quality of information
A figure:
See right side
Processing Materials into a product
Requirements:
Know what to process and how to do the transformations
Have tools in place for doing the transformations
A figure:
See left side
Delivering the product
Requirements:
Know for whom the processing is done
Know who is accountable for the delivered information
put to correct agreed locations
agreed quality of information
A figure:
See right side
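The three stages above can be put into one small Python sketch (all names are hypothetical, only mirroring the requirements listed): retrieve from an agreed location with a known accountable party, transform with the tools in place, deliver to an agreed location with a quality check.

    # Hypothetical sketch of the three realisation stages.
    def retrieve(source: str, accountable: str) -> dict:
        # Know the agreed location and who is accountable for the input.
        return {"source": source, "accountable": accountable, "rows": [1, 2, 3]}

    def transform(material: dict) -> dict:
        # Know what to transform and how, with the agreed tools.
        material["rows"] = [r * 2 for r in material["rows"]]
        return material

    def deliver(product: dict, target: str) -> None:
        # Put to the agreed location, with an agreed quality check.
        assert product["rows"], "quality check failed: empty delivery"
        print("delivered", product["rows"], "to", target)

    deliver(transform(retrieve("agreed://input", "data-owner")), "agreed://output")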
⚙ Q-2.4.4 Control & Compliancy
Requirements to set, document & validate:
Indispensable security, safety. See: Q-3.4.2
Data, Information Quality. See: Q-3.4.3
Impact on persons. See: Q-3.4.4
Business Rules, Algorithm. See: Q-3.4.5
In a figure:
See left side
Q-2.5 Processes Building Blocks Realisations
The term elephant test refers to situations in which an idea or thing "is hard to describe, but instantly recognizable when spotted".
A process life cycle building block, the ALC life cycle, is very generic and simplistic.
There are only three possible approaches.
To solve:
😱 Understanding goals and bringing value is hampered by culture.
😱 A technology-driven approach fails to align with product value streams.
🎭 Q-2.5.1 Process approaches at the shop floor
Fully human, immediate impact: ALC-V1
⚙ This simple job-shop approach is:
Changing in production,
there is no develop test stage
Applicable for a one-off delivery
⚖ This simple job-shop approach is used for:
Mandatory reports to regulators, annual and quarterly periods.
Highly sensitive information that becomes public at regular publication moments.
The reporting is subject to change: changes by regulators, input changes, situation changes.
The knowledge required to do this demands highly educated craftsmen for the dedicated topics.
These attributes result in avoiding ICT staff (technology) for the work of building those reports.
⚒ ➡ Alignment for information processing:
To deliver needed information input:
A specified safe location for the specialists to work in, according to risk analyses.
From specified sources to safe specified locations.
Tools:
Advanced tools that are useable for work by the specialists.
The spreadsheet is the usual escape when adequate tools are not available.
Endpoints that are classified safe to work with the classified information.
Delegated but human, validation before change: ALC-V2
⚙ This advanced approach is:
Preparing and designing in a develop-test stage before going into operational production.
In use for value streams in an organisation. To administer: one current version and a possible new one being rolled out.
When the product is created by a supplier for many customers, there can be many current versions to support.
Monitoring results vs expectations to population distributions
Specifications of the product tool
🎭 Q-2.5.2 ALC-V1 process details
⚖ This simple job-shop approach is:
The management in the lead has all the knowledge of what to do and how to do it.
Delegation of the knowledge to specified specialists by management.
The simplicity is also the loophole. Using a spreadsheet and not having independent validations is a pathway for spurious results.
The figure shows a classic one-off process, operations:
🎭 Q-2.5.3 ALC-V2 process details
⚖ This advanced approach is:
The management in the lead is able to define globally what to do and how to do it.
A delegated management team is assumed to have all the knowledge of what to do and how to do what lead management ordered.
There are separated environments to build & validate the new processes and run the operations.
Running operational is done by a dedicated team.
This approach is the only one seen as standard by technology staff people.
The figure shows this classic process: develop, test - operations:
🎭 Q-2.5.4 ALC-V3 process details
⚖ This sophisticated approach is:
The management in the lead is able to define globally what to do and how to do it.
The management in the lead is accountable for the process although they don´t know all the logical and technical details.
A delegated management team is not assumed to have all the knowledge of what to do and how to do what lead management ordered.
There is loop back control to the delegated management for what is going on in the operational process results.
These are used for monitoring signals of the process.
There are loop-back controls to the engineers building the process for what is going on in the operational process results.
These operational results are sources to research for issues and improvements in the development environment.
The engineers building the process are framed: they are expected to make decisions that conform to the requirements.
They explain those decisions to the delegated management in terms of impact and risks.
There are separated environments to build & validate the new processes and run the operations.
There could be the usual levels: develop - test - acceptance - production.
This imposes a challenge for having real operational data and doing integration testing.
Running operational is done by a dedicated team.
The technology capacity for development is far higher than that for operations. GPUs are a recent add-on for increased capacity in modelling.
This is different from ALC-V2, where the common assumption is that the production environment needs more capacity than development.
When the source must be shown, the training data is an important asset.
A personal figure shows an ML (machine learning) process: develop, test - operations:
Q-2.6 Organisation & Business Understanding
Once Dorothy and her colleagues made the journey to OZ, they quickly found out that there was no there, there.
The Wizard simply told her what she really should have known all along.
Dorothy and her companions just had to figure out how to reframe their own perceived shortcomings and recast them as strengths to achieve real transformation.
🎭 Q-2.6.1 Retrospective creating this page
🕳 Categorizing content, start with the why
I assumed that collecting the old information into this knowledge page would be easy.
How wrong I was: there was no story line, no plan, no direction for the categorisation.
I was forced to restart with:
Start with Why (Simon Sinek, 2009)
Sinek says people are inspired by a sense of purpose (or "Why"), and that this should come first when communicating, before "How" and "What"
The Why for this knowledge page is:
Why: Getting the evolutions and the interactions, relations becoming more understandable
How: by getting the information, knowledge categorised
What: gathering the information for applying information processing
This failed for some aspects and is successful with other ones.
❓ The question: why is this happening?
🤔Assumptions were made.
There is always logical incremental growth, a linear evolution.
Systems will always behave linearly.
Systems are always predictable from known input.
👁 Conclusion: These assumptions are false.
The reality, there is:
NO guaranteed logical growth, evolution. Not a logical timeline, possible jumps.
NO Boolean logical mathematical space for decisions. Other possible geometries.
NO exact outcome with mass systems for similar cases, a normal or other distribution.
Chaotic systems have well-defined algorithms but generate very different outcomes from slight differences in the predictors.
🎭 Q-2.6.2 Retrospective creating pillar pages
Steer Shape Serve in pillars, the visualisation.
Avoiding the word "Information"; added symbols from the Jabes framework with the Jabes application.
Three pillars but the activities are mixed:
Steer in the organisation: this pillar holds the basic core competencies in the holistic whole.
Shape in the organisation: this pillar assures the future fitness of the organisation.
Serve in the organisation: this pillar holds the technology connections for processes.
Information accountability is clearly at "steer", business organisation.
Communication interactions is a combination of all over the different type of activities.
The figure,
See right side:
Steer Shape Serve in pillars, categorizing content, start with the why
Succeeded:
I-Serve page, There are three subjects for: What (T-1), How (T-2) into Why (T-3).
There are topics overlapping with I-Steer, but at different layers with different content and context.
I-Steer page, There are three subjects for: What (B-1), How (B-2) into Why (B-3).
I-Shape page, Nice three subjects for: What (A-1), How (A-2) into Why (A-3).
The six topics for every subject are nicely similar, building up to the goal of using Jabes.
Putting technology at a distance results in a gap in supporting the organisation's simple processes with technology.
👁 These pillars are conforming an organisation hierarchy.
👉🏾 Troublesome is the missing awareness for aligning technology and product management in strategy and tactics (I-Steer, I-Shape, I-Serve).
This results in gaps at operations.
🎭 Q-2.6.3 Retrospective building reference pages
Reference pages, categorizing content, start with the why
Succeeded:
I-Data page, unknown as yet, to do.
I-Jabes page, Nice three subjects for: What (Y-1), How (Y-2) into Why (Y-3).
I-Know page, got the question, looking for: What (Q-1), How (Q-2) into Why (Q-3).
👁 These reference pages have no conforming organisation hierarchy.
Rotating the overhauled nine-plane, reordering into Steer Serve Shape.
Replacing the hierarchically associated words strategy, tactics, operations with: basic, competent, advanced.
Three levels of growing skills, the activities are mixed:
Basic: Well known activities, keeping the lights on.
Competent: Foresight, able to act on changes, start changes.
Advanced: Insight, planning for running and changing with wisdom.
Putting technology in the middle allows showing action on simple adjustments (Steer-Serve) and on ones with more impact (Serve-Shape).
The figure,
See right side:
This page, categorizing content, start with the why
Succeeded:
Started with the what, focus on product management (Q-1).
🎭 The scientific fundaments of controlling & running organisations are recent (Q-1.3)
👉🏾 Troublesome is the missing awareness for product management: Q-1.4
👉🏾 Troublesome is the missing awareness for process types ALC_V*: Q-1.5
Started with the how, focus on technology and other services (Q-2).
🎭 The scientific fundaments of information processing are too technical (Q-2.3)
💣 Interactions, communication are completely different contexts: Q-2.2 vs Q-1.2
👉🏾 Troublesome is awareness, alignment process types ALC_V*: Q-2.5 vs Q-1.5
👉🏾 Troublesome is the shift of decisions into developing processes ALC_V3. See Q-3
Started with the goal, why (Q-3). This adds philosophy alongside technology and organisation.
🎭 The scientific fundaments of information processing are too complicated (Q-3.3)
💣 Interactions, communication are different context: Q-3.2 vs Q-2.2 vs Q-1.2
👉🏾 Troublesome is organizing all process types ALC_V*: Q-3.5, Q-2.5, Q-1.5
👉🏾 Troublesome: responsibilities shift in ALC_V3 processes while old accountabilities remain
Changing and solving all issues will not be easy.
A tool can help but is not decisive for the needed cultural change.
Categorisation of 5W1H
With six paragraphs there is a build-up towards a goal, from a request into a result.
A lazy categorisation:
Organising scopes: Which (Q-*.1), Where (Q-*.6)
Mostly a technology scope: What (Q-*.4), How (Q-*.5)
Planning, knowledge scope: When (Q-*.2), Who (Q-*.3)
📚 Q-2.6.4 External references
Many less important external references are part of the text.
Relevant links, a limited list:
A Philosophy of Security Architecture Design
There certainly are many technical aspects of modern information and communications technology (ICT) systems and the associated security architectures.
Indeed, most of the aspects of how to achieve the goals tend to be of a technical nature. However, questions concerning why need not be technical at all.
That is, on a systems level, the end goals of a security architecture are normally not technical in nature.
Instead, the goals tend to be more philosophical. They may be framed in a context of moral and ethics, and sometimes in the framework of legislation and societal rules.
The distinction between the technical and concrete aspects and the philosophical aspects can be conceptualized as the difference between verification and validation.
Verification is largely about checking that something is done the correct way, whereas validation is concerned about whether one is doing the right thing.
It is of little use to do something correctly, if one is indeed doing the wrong thing in the first place.
...
The need for security, safety and privacy is in many ways self-evident.
Large-scale critical infrastructures are essential to society, and so the level of security, safety and privacy becomes a question about what kind of society one wants to have. We shall not dive into safety and privacy in this paper.
However, we argue that strong security is a necessary condition for both safety and privacy.
This puts emphasis on the importance of an effective and comprehensive security architecture. Informally, the differences and relationships between security, safety and privacy can be stated as follows:
Security is about protection of the system.
Safety is about protection of people.
Security versus Safety.
For critical infrastructures, one cannot have credible safety without having strong security.
Privacy is about information control (related to persons).
Security versus Privacy.
One cannot have credible privacy without having strong security.
...
The Incerto is a set of books and essays on uncertainty. Touted as an investigation of opacity, luck, uncertainty, probability, human error, risk, and decision making, the main body of the Incerto consists of five books by Nassim Nicholas Taleb.
...
The design of the proactive parts of a security architecture will be explicitly specified.
These are “design phase” requirements. It will be possible to have a complete and consistent design for the proactive measures.
The dynamic/reactive parts of the security architecture, which will be dealing with incident detection and response, recovery, etc., will likely not be completely specified (if at all).
...
There does not seem to be much work done on security architecture designs for large-scale critical infrastructures. Frederick Brooks, of “The Mythical Man-Month” fame, has written extensively about system designs in “The Design of Designs” .
...
The technology using computers has several lines of evolvement. The hardware has become faster, better, cheaper.
Application software has a few basic fundaments in logic, by flows and constructs.
The problem to solve has moved from the purely technical aspect of how to run machines into how to process information in a technical way.
⚙ Q-3.1.1 Distractors from knowledge
Aristotle & Plato
These ancient Greek philosophers already stated the problem with analytics: meaning and concepts.
theory_of_universals Aristotle to
Platonic realism Socrates
Plato and Xenophon's dialogues provide the main source of information on Socrates's life and thought.
These writings are the Sokratikoi logoi, or Socratic dialogues, which consist of reports of conversations apparently involving Socrates.
The most famous allegory of the cave
Any disruptive change, although it may be the real truth, is not automatically an acceptable option for those involved. Decisions on information change by perspectives.
Algorithm
Al-Khwarizmi: the origin of the term "algorithm". An algorithm is a simplified recipe to solve a known type of problem.
Al-Khwarizmi´s contributions to mathematics, geography, astronomy, and cartography established the basis for innovation in algebra and trigonometry.
His systematic approach to solving linear and quadratic equations led to algebra, a word derived from the title of his book on the subject,
"The Compendious Book on Calculation by Completion and Balancing".
On the Calculation with Hindu Numerals written about 820, was principally responsible for spreading the Hindu-Arabic numeral system throughout the Middle East and Europe.
It was translated into Latin as Algoritmi de numero Indorum. Al-Khwarizmi, rendered as (Latin) Algoritmi, led to the term "algorithm".
Some of his work was based on Persian and Babylonian astronomy, Indian numbers, and Greek mathematics.
📚 These once impenetrable types of algorithms are basic school maths lessons these days.
How to Solve Quadratic Equations
Just follow the recipe to get the answer.
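That recipe, the quadratic formula for a·x² + b·x + c = 0, takes only a few lines of Python (cmath keeps it valid when the roots are complex):

    import cmath

    def solve_quadratic(a, b, c):
        # Roots of a*x**2 + b*x + c = 0 via the quadratic formula.
        d = cmath.sqrt(b * b - 4 * a * c)   # square root of the discriminant
        return (-b + d) / (2 * a), (-b - d) / (2 * a)

    print(solve_quadratic(1, -3, 2))   # ((2+0j), (1+0j)): x = 2 and x = 1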
Encrypting decrypting machines.
With the ability to communicate using machines, and more technical realisations solving earlier limits, a new problem arose.
How to keep information secret when it is sent over transmission channels easily tapped by others? Colossus
Colossus was a set of computers developed by British codebreakers in the years 1943-1945 to help in the cryptanalysis of the Lorenz cipher.
Colossus used thermionic valves (vacuum tubes) to perform Boolean and counting operations.
Colossus is thus regarded as the world´s first programmable, electronic, digital computer, although it was programmed by switches and plugs and not by a stored program.
This was not the decryption of the Enigma machine but of the mechanically transformed telex machines (Lorenz machines - Tunny).
The British bombe was an electromechanical device designed by Alan Turing soon after he arrived at Bletchley Park in September 1939.
Harold "Doc" Keen of the British Tabulating Machine Company (BTM) in Letchworth (35 kilometres (22 mi) from Bletchley) was the engineer who turned Turing´s ideas into a working machine—under the codename CANTAB.
Turing´s specification developed the ideas of the Poles´ bomba kryptologiczna but was designed for the much more general crib-based decryption.
The Enigma cipher itself was never broken; sloppy procedures leaking basic conventions decreased the number of options to verify, enough to get sufficient messages decrypted in time.
The impact of reading what another did not want known was huge.
While Germany introduced a series of improvements to Enigma over the years, and these hampered decryption efforts to varying degrees,
they did not ultimately prevent Britain and its allies from exploiting Enigma-encoded messages as a major source of intelligence during the war.
Many commentators say the flow of communications intelligence from Ultra´s decryption of Enigma, Lorenz and other ciphers shortened the war significantly and may even have altered its outcome.
⚙ Q-3.1.2 Different paths, philosophy
Politics philosophy - Machiavelli
The Prince (Il Principe), written around 1513 but not published until 1532, five years after his death.
After his death Machiavelli's name came to evoke unscrupulous acts of the sort he advised most famously in his work, The Prince.
He claimed that his experience and reading of history showed him that politics have always been played with deception, treachery, and crime.
He also notably said that a ruler who is establishing a kingdom or a republic, and is criticized for his deeds, including violence, should be excused when the intention and the result are beneficial to him. Machiavelli's Prince has been surrounded by controversy since it was published. Some consider it to be a straightforward description of political reality. Others view The Prince as a manual, teaching would-be tyrants how they should seize and maintain power.
...
That a community has different components whose interests must be balanced in any good regime is an idea with classical precedents, but Machiavelli's particularly extreme presentation is seen as a critical step towards the later political ideas of both a division of powers or checks and balances, ideas which lay behind the US constitution, as well as many other modern state constitutions.
Philosophy Change Nature - Charles Darwin
Darwin published his theory of evolution with compelling evidence in his 1859 book On the
Origin of Species .
By the 1870s, the scientific community and a majority of the educated public had accepted evolution as a fact.
However, many favoured competing explanations that gave only a minor role to natural selection, and it was not until the emergence of the modern evolutionary synthesis from the 1930s to the 1950s that a broad consensus developed in which natural selection was the basic mechanism of evolution.
Darwin's scientific discovery is the unifying theory of the life sciences, explaining the diversity of life.
...
He saw that European colonisation would often lead to the extinction of native civilisations, and "tried to integrate colonialism into an evolutionary history of civilization analogous to natural history".
...
The term Darwinism was used for the evolutionary ideas of others, including Spencer's "survival of the fittest" as free-market progress, and Ernst Haeckel's polygenistic ideas of human development.
Writers used natural selection to argue for various, often contradictory, ideologies such as laissez-faire dog-eat-dog capitalism, colonialism and imperialism. However, Darwin's holistic view of nature included "dependence of one being on another"; thus pacifists, socialists, liberal social reformers and anarchists such as Peter Kropotkin stressed the value of co-operation over struggle within a species.
Darwin himself insisted that social policy should not simply be guided by concepts of struggle and selection in nature.
...
Sociology - Melvin Kranzberg
Decision making: a limited philosophy on those aspects.
Kranzberg is known for his laws of technology.
Melvin Kranzberg's six laws of technology state:
Technology is neither good nor bad; nor is it neutral.
Invention is the mother of necessity.
Technology comes in packages, big and small.
Although technology might be a prime element in many public issues, nontechnical factors take precedence in technology-policy decisions.
All history is relevant, but the history of technology is the most relevant.
Technology is a very human activity – and so is the history of technology
He served in the U.S. Army in Europe during World War II.
He received a Bronze Star for interrogating captured German prisoners and learning the location of Nazi gun emplacements.
He was one of two interrogators out of nine in Patton's army who were not killed during the conflict.
Political theorist - Langdon Winner
Technology and politics: a limited philosophy on those aspects.
To the question he poses "Do Artifacts Have Politics?", Winner identifies two ways in which artifacts can have politics.
The first, involving technical arrangements and social order, concerns how the invention, design, or arrangement of artifacts or the larger system becomes a mechanism for settling the affairs of a community. This way "transcends the simple categories of 'intended' and 'unintended' altogether", representing "instances in which the very process of technical development is so thoroughly biased in a particular direction that it regularly produces results heralded as wonderful breakthroughs by some social interests and crushing setbacks by others".
The second way in which artifacts can have politics can be further articulated as consisting of four 'types' of artifacts: those requiring a particular internal sociological system, those compatible with a particular internal sociological system, those requiring a particular external sociological system, and those compatible with a particular external sociological system.
Certain features of Winner's thesis have been criticized by other scholars, including Bernward Joerges.
Over the years one focus of Winner's criticism has been the excessive use of technologies in the classroom, both in K-12 schools and higher education. Winner's critique is well explained in his article "Information Technology and Educational Amnesia,"and expressed in his satirical lecture, "The Automatic Professor Machine."
Philosophy of Science, Technology & Society - Peter-Paul Verbeek
Theory of Technological Mediation: a limited philosophy on those aspects.
Verbeek presents the purpose of his theory of technological mediation as systematically analyzing the influence of technology on human behavior, in terms of the role technology plays in human-world relations.
In his original theory, a number of different human-technology-world relations are stipulated (the first four based on the philosophy of Don Ihde):
Embodiment: in which the technology does not call attention to itself but to aspects of the world given through it (e.g. glasses)
Hermeneutic: in which the technology represents a certain aspect of the world (e.g. a thermometer)
Background: in which technology shapes the experiential context, going beyond conscious experience (e.g. room temperature through a central heating system)
Alterity: in which technology presents itself as quasi other to the subject (e.g. an ATM)
Cyborg: in which technology merges with the human (e.g., brain implants)
Immersion: in which technology forms an interactive context (e.g., smart homes)
Augmentation: in which technology mediates and alters our experience of the world, e.g., Google Glass.
A unique feature of Verbeek's philosophy of technology is its basis in an empirical analysis of technologies.
Instead of generating an overarching framework by which the universal features of specific technologies can be analyzed, Verbeek takes the technology itself as point of departure; which is for example illustrated by his analysis of ultrasound technology
⚙ Q-3.1.3 Impact technology changes
Change - food, climate
Changes over time/space and lessons for future food safety
The emergence of city-states has been a major driver of food system changes, bringing together large populations within defined boundaries and requiring complex governance to deliver sufficient quantities and quality of food. Advances in food storage, with sealed containers and curing methods, the use of animal transport, sailing ships, and trains to move larger volume than can be carried by individuals; trade in ingredients like salt as well as live animals and agricultural products; and increasing political and military conflict for resources all have been developments of the city-state.
Early impact of Mesoamerican goods in Iberian society
The early impact of Mesoamerican goods on Iberian society had a unique effect on European societies, particularly in Spain and Portugal.
The introduction of American "miracle foods" was instrumental in pulling the Iberian population out of the famine and hunger that was common in the 16th century. Maize (corn), potatoes, turkey, squash, beans, and tomatoes were all incorporated into existing Spanish and Portuguese cuisine styles.
Equally important was the impact of coffee and sugar cane growing in the New World (despite having already existed in the Old World).
Along with the impact from food, the introduction of new goods (such as tobacco) also altered how Iberian society worked.
Change - Social structure
Age of Enlightenment
The Enlightenment featured a range of social ideas centered on the value of knowledge learned by way of rationalism and of empiricism and political ideals such as natural law, liberty, and progress, toleration and fraternity, constitutional government and the formal separation of church and state. ...
The Enlightenment was preceded by the Scientific Revolution and the work of Francis Bacon and John Locke, among others. ...
Others cite the publication of Isaac Newton's Principia Mathematica (1687) as the culmination of the Scientific Revolution and the beginning of the Enlightenment. ...
Philosophers and scientists of the period widely circulated their ideas through meetings at scientific academies, Masonic lodges, literary salons, coffeehouses and in printed books, journals, and pamphlets. The ideas of the Enlightenment undermined the authority of the monarchy and religious officials and paved the way for the political revolutions of the 18th and 19th centuries.
Change - Shrinking world
Global village describes the phenomenon of the entire world becoming more interconnected as the result of the propagation of media technologies throughout the world. The term was coined by Canadian media theorist Marshall McLuhan in his books The Gutenberg Galaxy: The Making of Typographic Man (1962) and Understanding Media (1964).
🎯 New goals:
Moving from only financial goals to including social goals
Avoiding: manipulations, inconsistencies by what is known, locked in mindsets
Q-3.2 Communication - Interactions
Working with machines that process information is a relatively new topic of science.
Human communication and interaction is classic.
The concept of the "information" container is not that clear and simple.
Information is an abstract concept that refers to that which has the power to inform.
At the most fundamental level, information pertains to the interpretation (perhaps formally) of that which may be sensed, or their abstractions.
Any natural process that is not completely random and any observable pattern in any medium can be said to convey some amount of information.
⚙ Q-3.2.1 Goals feeding decisions
Yellow brick road
Monetizing Data: Follow the Yellow Brick Road (Mark Katz 2018)
Firms can undergo the same kind of journey, only to find out that there is indeed no “magic” to solving data monetization challenges.
While the tools have vastly improved, and the power of BI buttressed by AI and Machine Learning has helped greatly with incorporating challenges like unstructured and disparate data (internal and external),
that Yellow Brick Road journey still requires cultural and operational steps including empowerment of associated teams.
There is not a lot of room for autocracy in achieving the best results.
Foundational elements work best, and collaboration is a must.
...
In my experience around toolsets, it is often a mistake to think that monetizing data is as easy as dropping tools into internal users or customer’s hands and you have a data product.
That approach can be myopic, ultimately damaging a firm’s brand, causing the data journey to move sideways.
Firms should avoid building process and foundational data strategies around software, hoping for an easy answer—that will indeed be an expensive mistake.
The need for decision making
Decision making is necessary when there are relationships with others. As soon as a conflict arises, the choice is to solve it by:
forcing it by the most powerful one, survival of the fittest. There is a personal view and others depending on a group interest (chapter, squad, guild).
having it solved within the tribe by the leader of the tribe, assuming the leader of the tribe protects all against other tribes.
I used words that are common in organizing agile ICT. 🤔 Questions:
Why referring to pre-historic human cultures for implementing new modern organisations?
Guilds tribes squads chapters Agile
When implementing some process for decision making, why is mentioning that goal avoided?
Your goal is not to be Spotify, but to leverage their model to improve how your organization works together.
The deeper meaning of each element of VUCA serves to enhance the strategic significance of VUCA foresight and insight as well as the behaviour of groups and individuals in organizations. It discusses systemic failures and behavioural failures, which are characteristic of organisational failure.
Volatility: the nature and dynamics of change, and the nature and speed of change forces and change catalysts.
Uncertainty: the lack of predictability, the prospects for surprise, and the sense of awareness and understanding of issues and events.
Complexity: the multiplex of forces, the confounding of issues, no cause-and-effect chain and confusion that surrounds organization.
Ambiguity: the haziness of reality, the potential for misreads, and the mixed meanings of conditions; cause-and-effect confusion.
It is generally assumed that collaboration is, in and of itself, a "good thing." "Plays well with others" is high praise from kindergarten onward.
"All of us are smarter than any of us."
"The more participation in design, the better." Now, these attractive propositions are far from self-evident.
I will argue that they surely are not universally true.
Most great works of the human mind have been made by one mind, or two working closely.
This is true of most of the great engineering feats of the 19th and early 20th centuries.
But now, team design has become the modern standard, for good reasons.
The danger is the loss of conceptual integrity in the product, a very grave loss indeed.
So the challenge is how to achieve conceptual integrity while doing team design, and at the same time to achieve the very real benefits of collaboration. [F. Brooks: 'The Design of Designs', 2010]
What you should do:
The Architectural Thinking Framework defines the following:
Integrate IT governance in other governance processes of the company
Establish a 'Digital Governance Board' that consists of Top-level executives
Establish boards for each top-level capability that consist of the corresponding business unit leads and managers of these business units
Define 'Architecture Owner' roles that work within the autonomous solution teams AND draw the architecture maps needed for decisions by the boards. They are the bridge between autonomous- and centralized decisions
To deal with the challenges of the VUCA world, many companies experiment with shifting the idea of agility, as broadly used in software engineering practices in form of e.g. SCRUM to the whole organization.
Browsing through approaches about scaled Agile, some of their proponents seem to propose that all decisions should be made decentralized by autonomous teams.
Use the knowledge of the many and you will get the right solutions.
But that is far from true.
All approaches proposing the agile enterprise do not take one thing into account: architecture. Building solutions in a sound architectural form needs common elements and ‘conceptual integrity’.
This means that the concepts and structures of the business (capabilities, value streams, products & services, business objects) and IT (technology components) must play together in a way that maximizes simplicity, consistency, agility and thus business value.
⚙ Q-3.2.2 Enterprise Culture vision
Deming's legacy and the Toyota way
25 Years after W. Edwards Deming
He greatly influenced the management of quality in Japan, where he is still revered as one of the great gurus in manufacturing.
Through his influence on Toyota, his ideas are now common in the lean world.
⚖ Lean culture, PDCA
❓ What is real lean about?
Create constancy of purpose toward improvement of product and service, with the aim to become competitive, to stay in business and to provide jobs.
Adopt the new philosophy. We are in a new economic age.
Western management must awaken to the challenge, must learn their responsibilities, and take on leadership for change.
Cease dependence on inspection to achieve quality. Eliminate the need for massive inspection by building quality into the product in the first place.
End the practice of awarding business on the basis of a price tag. Instead, minimize total cost.
Move towards a single supplier for any one item, on a long-term relationship of loyalty and trust.
Improve constantly and forever the system of production and service, to improve quality and productivity, and thus constantly decrease costs.
Institute training on the job.
Institute leadership .
The aim of supervision should be to help people and machines and gadgets do a better job. Supervision of management is in need of overhaul, as well as supervision of production workers.
(see Point 12 and Ch. 8 of Out of the Crisis).
Drive out fear, so that everyone may work effectively for the company. (See Ch. 3 of Out of the Crisis)
Break down barriers between departments.
People in research, design, sales, and production must work as a team, to foresee problems of production and usage that may be encountered with the product or service.
Eliminate slogans, exhortations, and targets for the work force asking for zero defects and new levels of productivity.
Such exhortations only create adversarial relationships, as the bulk of the causes of low quality and low productivity belong to the system and thus lie beyond the power of the work force.
Eliminate work standards (quotas) on the factory floor. Substitute with leadership.
Eliminate management by objective. Eliminate management by numbers and numerical goals. Instead substitute with leadership.
Remove barriers that rob the hourly worker of his right to pride of workmanship. The responsibility of supervisors must be changed from sheer numbers to quality.
Remove barriers that rob people in management and in engineering of their right to pride of workmanship. This means, inter alia, abolishment of the annual or merit rating and of management by objectives (See Ch. 3 of Out of the Crisis).
Institute a vigorous program of education and self-improvement.
Put everybody in the company to work to accomplish the transformation. The transformation is everybody´s job.
⚖ Lean, Deadly Diseases
Well that is real lean, very sensible, far more than that PDCA cycle.
❓ What should be avoided with real lean?
He also created a list of the “Seven Deadly Diseases,” which are also sensible.
Lack of constancy of purpose
Emphasis on short-term profits
Evaluation by performance, merit rating, or annual review of performance
Mobility of management
Running a company on visible figures alone
Excessive medical costs
Excessive costs of warranty, fueled by lawyers who work for contingency fees
⚖ Abstraction forces in the organisation:
There are more lines of power in an organisation. Some of those:
Financial based management. Goal: profits at least enough budget for tasks.
Core business. Goal: Fulfilling the operations for tasks of the organisation.
Green fields. Goal: Improvement, product research, customer relations.
💣 The powers are not equally balanced. The core business (operations) is commonly the line with the least influence at the strategic level.
The result of that could be (risk) a total loss of all tasks the business was positioned to do. Going back to the basics of Fayol.
A proposal for a generic approach balancing powers.
Steer:
dynamic: operational value stream
static: financial and other values
public: promotion, communication
Serve:
technology enablement: value stream - monitoring - feed back
CEO
The responsibilities of an organization´s CEO are set by the organization´s board of directors or other authority, depending on the organization´s structure.
They can be far-reaching or quite limited, and are typically enshrined in a formal delegation of authority regarding business administration.
Typically, responsibilities include being an active decision-maker on business strategy and other key policy issues, leader, manager, and executor.
The communicator role can involve speaking to the press and to the public, as well as to the organization´s management and employees; the decision-making role involves high-level decisions about policy and strategy.
The CEO is tasked with implementing the goals, targets and strategic objectives as determined by the board of directors.
⚙ Q-3.2.3 Decisions with the aid of machines
Usability: classic explainable machine learning
Recognizing a cat or a dog in a Rorschach setting: the result is one that will fulfil the algorithmic requirements, but it does not guarantee a correct result in real-world perception.
The world of deep learning AI achieves results in categorizing images, sound and more, in a mostly but not absolutely correct classification.
The only thing that gets underpinned by using information (data) is the decisions that were previously made by "good human feeling".
This is not negative advice against using machines; instead it is positive advice to use machines wisely.
Qualities:
clearly described goals
validation options are understandable and possible to process in the same activity.
dependencies and incomplete or invalid information have correction options.
The problems in doing this usually exposed bad practice by human decision makers. Blaming the machine for showing that kind of human issue is human nature.
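A minimal sketch of what "classic explainable" means, using scikit-learn (assuming it is installed; the toy data is invented): a decision tree can be trained and then read back, so a human can validate the decision rules in the same activity.

    # A decision tree is a classic, explainable classifier: the learned
    # rules can be printed and validated by a human.
    from sklearn.tree import DecisionTreeClassifier, export_text

    # Invented toy data: [weight_kg, ear_length_cm] -> 0 = cat, 1 = dog.
    X = [[4.0, 6.0], [3.5, 5.5], [30.0, 12.0], [25.0, 10.0]]
    y = [0, 0, 1, 1]

    model = DecisionTreeClassifier(max_depth=2).fit(X, y)
    print(model.predict([[5.0, 6.5]]))   # -> [0], cat-like
    print(export_text(model, feature_names=["weight_kg", "ear_length_cm"]))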
Usability: deep learning
These kinds of decisions are partial inputs for another decision. Looking for a vase or looking for persons talking to each other are very different questions.
Asking whether there is a young or an old woman, and seeing the same picture contain both of them:
the fulfilment by the same image is counterintuitive.
 
What kind of problems are a good candidate for this type of automated machine classification decisions?
clearly described goals
validation options are understandable but NOT possible in the same process activity.
dependencies and incomplete or invalid information have correction options.
Whether something is a cat or a dog can easily be validated by a human, but not by running the same machine model again.
Cleaning up harvested natural goods using machines could be automated with image recognition. Whether the fall-out is segregated well enough is easy for a human to see, but correcting the image selection when it is not appropriate is a difficult change.
The hyped biometric recognition in computer technology I avoided in this list. Machines are probably already better at recognizing humans than humans are, within the same limitations.
🎯 New goals:
Understanding and accepting that the world we live in is constantly changing.
Managing the options, avoiding unnecessary risks from uncertainties.
Eliminating the fear of new tools for real information processing, bpm & brm
Decisions on information change by perspectives.
Q-3.3 Historical evolvement Wisdom
Information is often processed iteratively: Data available at one step are processed into information to be interpreted and processed at the next step.
For example, in written text each symbol or letter conveys information relevant to the word it is part of, each word conveys information relevant to the phrase it is part of,
each phrase conveys information relevant to the sentence it is part of, and so on until at the final step information is interpreted and becomes knowledge in a given domain.
⚙ Q-3.3.1 Logical constructs information processing.
Decisions, problems solving, encryption - Turing
Alan-Turing
Honored by: Turing award
What mathematicians called an "effective" method for solving a problem was simply one that could be carried out by a human mathematical clerk working by rote.
In Turing´s time, those rote-workers were in fact called "computers," and human computers carried out some aspects of the work later done by electronic computers.
The Entscheidungsproblem sought an effective method for solving the fundamental mathematical problem of determining exactly which mathematical statements are provable within a given formal mathematical system and which are not.
A method for determining this is called a decision method. In 1936 Turing and Church independently showed that, in general, the Entscheidungsproblem has no resolution, proving that no consistent formal system of arithmetic has an effective decision method.
It was in the course of his work on the Entscheidungsproblem that Turing invented the universal Turing machine, an abstract computing machine that encapsulates the fundamental logical principles of the digital computer.
...
Turing was a founding father of artificial intelligence and of modern cognitive science, and he was a leading early exponent of the hypothesis that the human brain is in large part a digital computing machine.
He theorized that the cortex at birth is an "unorganised machine" that through "training" becomes organized into a universal machine or something like it.
Turing proposed what subsequently became known as the Turing test as a criterion for whether an artificial computer is thinking (1950).
To be or not to be? Turing's proof
It was the second proof (after Church´s theorem) of the conjecture that some purely mathematical yes-no questions can never be answered by computation;
more technically, that some decision problems are "undecidable" in the sense that there is no single algorithm that infallibly gives a correct "yes" or "no" answer to each instance of the problem.
In Turing´s own words: "...what I shall prove is quite different from the well-known results of Gödel ...
I shall now show that there is no general method which tells whether a given formula U is provable in K [Principia Mathematica]..." (Undecidable, p. 145).
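A minimal Python sketch of the diagonal argument behind this result (my paraphrase, not Turing's notation): assume a general decision method halts(f, x) exists, then a program can be built that contradicts any answer it gives.

```python
# Sketch of Turing's diagonal argument (illustrative, not a working decider).

def halts(f, x):
    # The assumed general decision method; Turing proved it cannot exist.
    raise NotImplementedError("no general algorithm can decide this")

def paradox(f):
    # If halts() says f(f) stops, loop forever; if it says f(f) loops, stop.
    if halts(f, f):
        while True:
            pass
    return "halted"

# Asking halts(paradox, paradox) contradicts whatever answer it would return,
# so no consistent implementation of halts() can exist.
```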
Science of information processing - EW Dijkstra
Quality, Correctness, Elegance. Edsger Dijkstra: the question of software quality through mathematical abstraction.
One of the most influential figures of computing science's founding generation, Dijkstra helped shape the new discipline from both an engineering and a theoretical perspective.
His fundamental contributions cover diverse areas of computing science, including compiler construction, operating systems, distributed systems, sequential and concurrent programming, programming paradigm and methodology, programming language research, program design, program development, program verification, software engineering principles, graph algorithms, and philosophical foundations of computer programming and computer science.
Many of his papers are the source of new research areas.
Several concepts and problems that are now standard in computer science were first identified by Dijkstra or bear names coined by him.
Structured programming, Jackson structured programming,
Nassi-Shneiderman diagrams and Algol were all basic elements of education touching software design in the first years thereafter.
"The revolution in views of programming started by Dijkstra's iconoclasm led to a movement known as structured programming, which advocated a systematic, rational approach to program construction. Structured programming is the basis for all that has been done since in programming methodology, including object-oriented programming."
Chaos theory - Edward_Norton_Lorenz
Edward_lorenz
By the late 1950s, Lorenz was skeptical of the appropriateness of the linear statistical models in meteorology, as most atmospheric phenomena involved in weather forecasting are non-linear.
It was during this time that his discovery of deterministic chaos came about.
In 1961, Lorenz was using a simple digital computer, a Royal McBee LGP-30, to simulate weather patterns by modeling 12 variables, representing things like temperature and wind speed. He wanted to see a sequence of data again, and to save time he started the simulation in the middle of its course.
He did this by entering a printout of the data that corresponded to conditions in the middle of the original simulation. To his surprise, the weather that the machine began to predict was completely different from the previous calculation. The culprit: a rounded decimal number on the computer printout. The computer worked with 6-digit precision, but the printout rounded variables off to a 3-digit number, so a value like 0.506127 printed as 0.506. This difference is tiny, and the consensus at the time would have been that it should have no practical effect. However, Lorenz discovered that small changes in initial conditions produced large changes in long-term results.
Lorenz's discovery, which gave its name to Lorenz attractors, showed that even detailed atmospheric modelling cannot, in general, make precise long-term weather predictions.
His work on the topic, assisted by Ellen Fetter, culminated in the publication of his 1963 paper "Deterministic Nonperiodic Flow"
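The effect is easy to reproduce. A minimal Python sketch (my reconstruction using the modern Lorenz 1963 equations, not the original 12-variable LGP-30 program): integrate the same system twice, once from the full state and once from the 3-digit rounded printout value.

```python
# Sensitive dependence on initial conditions, Lorenz-style (illustrative).

def step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    # One crude Euler step of the Lorenz 1963 system.
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

a = (0.506127, 1.0, 1.0)        # "true" state
b = (round(a[0], 3), 1.0, 1.0)  # state re-entered from the rounded printout

for i in range(1, 3001):
    a, b = step(*a), step(*b)
    if i % 1000 == 0:
        print(i, round(abs(a[0] - b[0]), 3))  # the tiny difference keeps growing
```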
Explanation choices
This collection for processing Information is:
For the first: Turing is usually classified as a computer man. The difference, understanding and seeing the goals of information processing, is decisive.
For the second: E. Dijkstra is also usually classified under computer science. Clearly the same difference: not the technology but the logic of information processing.
For the third: E.N. Lorenz might be a surprise. He stands for the uncertainties in clearly defined information processing systems.
It is this area of chaotic behaviour in very similar predictors that has difficulty getting accepted.
Remarkably, this basic theory of information processing is that recent.
⚙ Q-3.2.2 Using Data Analytics statistics.
Bayes, 18th century.
Thomas_Bayes (wikipedia) One of the founders of probability theory.
Bayesian probability is the name given to several related interpretations of probability as an amount of epistemic confidence, "the strength of beliefs, hypotheses etc.", rather than a frequency.
This allows the application of probability to all sorts of propositions rather than just ones that come with a reference class. "Bayesian" has been used in this sense since about 1950.
Since its rebirth in the 1950s, advancements in computing technology have allowed scientists from many disciplines to pair traditional Bayesian statistics with random walk techniques.
The use of the Bayes theorem has been extended in science and in other fields.
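A worked example of the theorem itself, with made-up illustrative numbers: the probability of a rare condition given a positive result from a good but imperfect test.

```python
# Bayes' theorem with illustrative numbers (assumed, not from the source).

p_disease = 0.01           # prior: 1% of the population affected
p_pos_if_disease = 0.95    # test sensitivity
p_pos_if_healthy = 0.05    # false-positive rate

p_pos = p_pos_if_disease * p_disease + p_pos_if_healthy * (1 - p_disease)
p_disease_if_pos = p_pos_if_disease * p_disease / p_pos

print(round(p_disease_if_pos, 3))  # ~0.161: far lower than intuition suggests
```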
Laplace, 19th century.
Laplace is more generic. Some of his theory is used in spectral signal processing science. What is these days called "Bayesian" more likely comes from Laplace. laplace (wikipedia)
In 1812, Laplace issued his Théorie analytique des probabilités, in which he laid down many fundamental results in statistics.
The first half of this treatise was concerned with probability methods and problems, the second half with statistical methods and applications.
Laplace´s proofs are not always rigorous according to the standards of a later day, and his perspective slides back and forth between the Bayesian and non-Bayesian views with an ease that makes some of his investigations difficult to follow,
but his conclusions remain basically sound even in those few situations where his analysis goes astray.
Fisher, 20th century.
Ronald Fisher (wikipedia)
In 1925 he published Statistical Methods for Research Workers, one of the 20th century´s most influential books on statistical methods.
Fisher´s method is a technique for data fusion or "meta-analysis" (analysis of analyses). This book also popularized the p-value, and plays a central role in his approach.
Fisher proposes the level p=0.05, or a 1 in 20 chance of being exceeded by chance, as a limit for statistical significance, and applies this to a normal distribution (as a two-tailed test), thus yielding the rule of two standard deviations (on a normal distribution) for statistical significance.
The basics of statistics are mostly practiced descriptively; the only prediction is extrapolation from a small sample to a complete population.
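Fisher's 5% limit and the two-standard-deviations rule can be checked in a few lines of Python (my sketch; a normal distribution and a two-tailed test are assumed):

```python
# Two-tailed p-value under a normal distribution (illustrative sketch).

from math import erf, sqrt

def two_tailed_p(z):
    # Probability of landing at least |z| standard deviations from the mean.
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

print(round(two_tailed_p(1.96), 3))  # ~0.05, Fisher's significance limit
print(round(two_tailed_p(2.00), 3))  # ~0.046, the "two standard deviations" rule
```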
Explanation choices
This collection for processing Information is:
For the first: Bayes is the common reference.
For the second: Laplace is also common.
For the third: Fisher set what is known as "statistically relevant".
The usage of the p-value, however, has grown into something "not without doubts".
Remarkably, this basic theory of information processing is that old.
Only recently has this kind of knowledge become usable through computer technology, known as Artificial Intelligence and Machine Learning (AI, ML).
⚙ Q-3.2.3 Using Big Data, forgotten histories.
Age of Discovery
"It saw also the first major victories of empirical inquiry over authority, the beginnings of that close association of science, technology, and everyday work which is an essential characteristic of the modern western world."
...
The Portuguese began systematically exploring the Atlantic coast of Africa in 1418, under the sponsorship of Infante Dom Henrique (Prince Henry). In 1488, Bartolomeu Dias reached the Indian Ocean by this route.
...
Technological advancements that were important to the Age of Exploration were the adoption of the magnetic compass and advances in ship design.
...
Indian Ocean trade routes were sailed by Arab traders.
Between 1405 and 1421, the Yongle Emperor of Ming China sponsored a series of long range tributary missions under the command of Zheng He (Cheng Ho).
The fleets visited Arabia, East Africa, India, Maritime Southeast Asia and Thailand.
But the journeys, reported by Ma Huan, a Muslim voyager and translator, were halted abruptly after the emperor's death and were not followed up, as the Chinese Ming dynasty retreated in the haijin, a policy of isolationism, having limited maritime trade.
... Florence Nightingale
Nightingale was a pioneer in statistics; she represented her analysis in graphical forms to ease drawing conclusions and actionables from data.
She is famous for usage of the polar area diagram, also called the Nightingale rose diagram, equivalent to a modern circular histogram.
This diagram is still regularly used in data visualisation.
...
she was simply opposed to a precursor of germ theory known as contagionism.
This theory held that diseases could only be transmitted by touch.
Before the experiments of the mid-1860s by Pasteur and Lister, hardly anyone took germ theory seriously; even afterwards, many medical practitioners were unconvinced.
Bostridge points out that in the early 1880s Nightingale wrote an article for a textbook in which she advocated strict precautions designed, she said, to kill germs.
Matthew_Fontaine_Maury
Lieutenant Maury published his Wind and Current Chart of the North Atlantic, which showed sailors how to use the ocean's currents and winds to their advantage, drastically reducing the length of voyages.
His Sailing Directions and Physical Geography of the Seas and Its Meteorology remain standard. Maury's uniform system of recording synoptic oceanographic data was adopted by navies and merchant marines around the world and was used to develop charts for all the major trade routes.
...
Maury became convinced that adequate scientific knowledge of the sea could be obtained only through international cooperation.
He proposed that the United States invite the maritime nations of the world to a conference to establish a "universal system" of meteorology, and he was the leading spirit of a pioneer scientific conference when it met in Brussels in 1853.
Within a few years, nations owning three-fourths of the shipping of the world were sending their oceanographic observations to Maury at the Naval Observatory, where the information was evaluated and the results were given worldwide distribution.
Explanation choices
This collection for processing Information is:
For the first: The Age of Discovery used prescribed documenting of information, using the gathered information to get new insight.
Isolated and humiliated westerners succeeded in travelling the world repeatably and predictably, bypassing the cut-off known eastern routes.
The Arabs, Indians and Chinese could have done the same but made different choices. African peoples (south of the Sahara) and native Americans were isolated.
The migrations are an evolution by the natural logic described by Darwin.
For the second: Nightingale broke through social-cultural awareness. Soldiers were expected to fight and die in battles, but not to die unnecessarily before those battles.
Seeing human beings as some cheap lower class was common all over the world: leaders held power, and the others were there to please the ones with power.
For the third: Maury took a new step by not only knowing how to travel and reach destinations,
but using the available information to answer which of the options gives the best result in getting to the destination.
Information processing is a combination of these three parts.
🎯 New horizons:
Including: risk management for all kind of uncertainties
Avoiding complexity by accepting uncertainty and inequality in decisions
Developing and realising tools for real information processing, bpm & brm
Aligning: business process management (bpm) and business rules management (brm)
Q-3.4 Processing flows VSM - Control
A swarm organisation, self-organisation, is a networked structure without leadership, working from some shared goal.
⚠ Challenges: have a shared goal, have a good shared goal.
The organisation structure is a hierarchical line of command. Group formation around leaders is human nature.
⚠ Challenges: avoiding leadership to micro details, bad goals.
To solve:
😱 Every topic needs proposals for how to improve each of them.
⚙ Q-3.4.1 Defining priority for changes
Hoshin Kanri
😉 The start and reason of any change. There must be an associated goal of the organisation.
There should be only three to six main points
these items should be based on a process, not on a target
... you will sooner or later come across an X-Matrix. It is a visually very impressive tool, but I am in serious doubt about its usefulness. It focuses on the creation of the Hoshin items, but to me this approach is overkill, and – even worse – may distract the user from actually following the PDCA, especially the Check and Act parts. ...
Setting the right goals and filtering them through the organization is important in Hoshin Kanri. In my first post I talked in detail about this as the "to-do list."
...
Like the “normal” Hoshin Kanri, this document is done at different levels in the hierarchy, starting with the top executive. These are named rather straightforward as top-level matrix, second-level matrix, and third-level matrix.
(Figure: the Hoshin Kanri matrix levels, shown at the right side of the original page.)
Criticism:
Long-term goals not long-term enough
Often redundant focus on numeric goals
Diluting responsibilities
Where´s the PDCA?
Most Hoshin Kanri documents that I know cover one year. This is usually a good duration, since one year allows for quite a bit of improvement activity.
This duration is also long enough to see the results and review the outcome.
The fit with the SIAR model and PDCA/DMAIC is almost too good.
It solves: "who and why" going from "knowledge" (Jabes stage proposals backlog) into initiating activities.
⚙ Q-3.4.2 Indispensable security, safety
PDCA used by SOC Security Operations Center
A marvelous figure for security; it has a feedback loop to improve VAS and TIS.
Reactive, what has happened:
Detect
Prevent
Proactive, what could happen:
Respond
Improve
Starting with a BIA (Business Impact Analysis) for risk probability and impact, it goes into the CIA (Confidentiality, Integrity, Availability).
💡 A way to look at improvement, mitigations: VAS
A versatile authentication server or service (VAS) is a single authentication product or service supporting user authentication in multiplatform environments (on-premises and in the cloud).
For TIS, Trusted Identity Services, the key symbol is PAM.
Controlling, monitoring, and auditing privileges and privileged access—for employees, vendors, systems, applications, IoT, and everything else that touches your IT environments is essential for protecting against both external and internal threat vectors, and for meeting a growing list of compliance requirements.
💡 Extended SOC Security Operations Centre
😉 Solving organisational impediments.
There is a serious problem in managing security, see T-2.5.3 Identity Access (I-Serve).
Moving the operational activities, onboarding - offboarding and those for middleware and infrastructure, to the SOC is a way out of the problems.
Generic patterns for security are easier to define and roll out. The accountabilities and the insight for control are still left at the organisation.
Standard SOC Security Operations Center
The most likely source: Dacoso. Standard competent activities:
SOC aaS
SIEM: Security vulnerabilities are detected through log file analyses and cyber security attacks are averted in time.
NDR: Alarms and threats in the network. Machine learning is used to automate the detection of attacks.
EDR: identifies and blocks attacks on endpoints, with the help of machine learning and artificial intelligence.
Advanced activities requiring more intelligent decision making:
VAS: A vulnerability scanner checks your IT systems for weak points
TIS: Threat Intelligence Service, Does the Darknet have your company in its sights?
The challenges in the decision making are:
Not every possible risk from a technical escalation is a real risk from a functional perspective.
Technical escalations may mention issues that are out of the control of the functionally accountable persons, although there is influence on the suppliers.
Risk-based means evaluating the option of stopping primary operations preventively versus being stopped by the damage once created.
😉 Solving organisational impediments. Impossible choice: An Overview of Data Quality Frameworks (2019)
Nowadays, the importance of achieving and maintaining a high standard of data quality is widely recognized by both practitioners and researchers.
Based on its impact on businesses, the quality of data is commonly viewed as a valuable asset.
The literature comprises various techniques for defining, assessing, and improving data quality.
However, requirements for data and their quality vary between organizations.
Due to this variety, choosing suitable methods that are advantageous for the data quality of an organization or in a particular context can be challenging.
Technical
😉 Solving organisational impediments. Start with the why:
Simple data management (Robert Borkes)
The 9-Square Data Management Model offers a robust framework tailored to foster strategic alignment across diverse facets of business operations.
Specifically crafted to address the challenges faced by executives grappling with data-related issues, this model provides targeted solutions and guidance.
By serving as a guiding principle, it facilitates the synchronization of business and data strategies, leading to enhanced efficiency in organizational decision-making processes.
With strategic goals centered around maximizing “return on data”, and clear objectives aimed at improving decision-making, optimizing organizational structure, and delivering exceptional services, this model empowers organizations to harness the full potential of their data assets.
(Figure: the 9-Square Data Management Model, shown at the left side of the original page.)
⚙ Q-3.4.4 Impact on persons
Functional
Information about a person is not the same as that person being the owner of the information.
😱 There is a lot going wrong just because of this wrong assumption. Personal_data
In the GDPR, personal data is defined as:
Any information relating to an identified or identifiable natural person ('data subject'); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person.
When a decision has impact on a person, the common assumption is that such decisions will not have a different impact in similar situations.
In complex systems this is a very wrong assumption. Chaos theory, based on very simple equations, only became known in the sixties (1961).
The only way to mitigate those effects is avoiding complex systems, going for anti-fragile simplicity.
Technical
Ethical principles in machine learning and artificial intelligence
AI became a self-standing discipline in the year 1955 (McCarthy et al., 2006) with significant development over the last decades. ...
The France’s Digital Republic Act gives the right to an explanation as regards decisions on an individual made through the use of administrative algorithms (Edwards and Veale, 2018).
This law touches upon several aspects including:
how and to what extent the algorithmic processing contributed to the decision-making;
which data was processed and its source;
how parameters were treated and weighted;
which operations were carried out in the treatment.
Sensitive governmental areas, such as national security and defence, and the private sector (the largest user and producer of ML algorithms by far) are excluded from this document.
The complexity in this is that rules written as laws are not included in this kind of review. The human is the weakest link.
There are a lot of statements with doubtful assumptions, leaving everyone in doubt and in anger.
😱 Just telling how it works in documents could be a way out. There is nothing well evaluated in place (2024).
AI faces the difficulty of translating overarching principle into practices. Even its current setting of seeking maximum speed, efficiency and profit clashes with the resource and time requirements of an ethical assessment and/or counselling.
From the Business Rules Manifesto (Business Rules Group):
Article 2. Separate From Processes, Not Contained In Them
Article 3. Deliberate Knowledge, Not A By-Product
Article 4. Declarative, Not Procedural
Article 5. Well-Formed Expression, Not Ad Hoc
Article 6. Rule-Based Architecture, Not Indirect Implementation
Article 7. Rule-Guided Processes, Not Exception-Based Programming
Article 8. For the Sake of the Business, Not Technology
Article 9. Of, By and For Business People, Not IT People
Article 10. Managing Business Logic, Not Hardware/Software Platforms
Technical
😉 Solving organisational impediments. Sophisticated technology for using basics.
When a technical approach for defining and storing is preferred:
Semantics of Business Vocabulary and Business Rules
This specification defines the vocabulary and rules for documenting the semantics of business vocabularies and business rules for the exchange of business vocabularies and business rules among organizations and between software tools.
This specification is interpretable in predicate logic with a small extension using modal operators.
It supports linguistic analysis of text for business vocabularies and business rules, with the linguistic analysis itself being outside the scope of this specification.
Q-3.5 Processes Building Blocks - Control
The term elephant test refers to situations in which an idea or thing, "is hard to describe, but instantly recognizable when spotted"
A process life cycle building block, the ALC life cycle, is very generic and simplistic.
There are only three possible approaches.
To solve:
😱 Besides the basics and the realisation, control is required.
😱 Communication is made complicated by neglect.
⚙ Q-3.5.1 Vocabulary gaps functional - technical.
⚒ Context confusing: business - cyber technology
There is a lot of misunderstanding between normal organisational humans and their cyber colleagues.
That culture is not necessary and should be eliminated. This already starts with the words describing the organisation.
A translation of words to start:
ICT             | Business  | ICT         | Business      | ICT          | Business
Strategy        | Control   | -           | Functional    | Target-Value | -
Confidentiality | People    | Tactical    | Orchestration | -            | Compliancy
Communication   | -         | Integrity   | Processes     | Operational  | Realization
-               | Technical | Information | -             | Availability | Machines
Note that the asset "Information" is a business asset, not something to be pushed off as incomprehensible to the "cyber" guys.
Being an important business asset, accountability and responsibility for "Information" lies with the product management staff of the organisation.
Decision-making can be regarded as a problem-solving activity yielding a solution deemed to be optimal, or at least satisfactory.
It is therefore a process which can be more or less rational or irrational and can be based on explicit or tacit knowledge and beliefs.
Tacit knowledge is often used to fill the gaps in complex decision making processes.
Usually both of these types of knowledge, tacit and explicit, are used together in the decision-making process.
The decisions are made by control: controllers, imperators.
That is a very select, small group of people.
Real working people experience feelings of being marionettes.
They are used like machines and are commonly excluded from decisions. That is not enablement of the workforce.
Decisions in the ICT devil triangle.
Having three parties:
Organisation, product value stream (I-Steer), Accountable for "Information"
Services, Software, processes, ICT (I-Steer), Accountable for "Technology"
Change, projects, (I-Shape), Accountable for "Communication"
The advent of modern information technology has been a primary driver of information overload on multiple fronts: in quantity produced, ease of dissemination, and breadth of the audience reached. Longstanding technological factors have been further intensified by the rise of social media including the attention economy, which facilitates attention theft.
In the age of connective digital technologies, informatics, the Internet culture (or the digital culture), information overload is associated with over-exposure, excessive viewing of information, and input abundance of information and data.
T - Tsar replacement Technology.
With the goal of decision making, Technology is just a detail. The importance is in serving the Tsar.
Tsar: a better description for the abbreviation "T" in ICT.
The title tsar is derived from the Latin title for the Roman emperors, caesar.
In comparison to the corresponding Latin word imperator, the Byzantine Greek term basileus was used differently depending on whether it was in a contemporary political context or in a historical or Biblical context.
In the history of the Greek language, basileus had originally meant something like "potentate".
It gradually approached the meaning of "king" in the Hellenistic Period, and it came to designate "emperor" after the inception in the Roman Empire.
As a consequence, Byzantine sources continued to call the Biblical and ancient kings "basileus" even when that word had come to mean "emperor" when referring to contemporary monarchs, while it was never applied to Western European kings, whose title was transliterated from Latin rex, or to other monarchs, for whom designations ("leader", "chieftain") were used.
⚙ Q-3.5.3 Understanding data - information
Creating Data: More To It Than You Think.
Normally we think of communication as either direct conversation or (in the spirit of the times) a flurry of text messages exchanged more or less in real time with people we know.
In either case there's usually a shared context within which the meaning of the messages can be interpreted, as well as more or less real-time exchange of clarifications.
What's distinct about creating data is that you're almost certainly not going to be face-to-face with the recipients of the message or connected live with them via an interactive network.
That fact rules out body language (e.g., raised eyebrows or emoticons) and dialog (including grunts and groans — or more emoticons ) to clarify what you mean.
In that sense the communication is blind. ...
As a consequence, the data a worker creates literally needs to speak for itself.
The emphasis needs to be on the effectiveness of communication — that is, on semantic quality.
Unfortunately, typical data quality measures in current use focus on the health of the content of the data/system architecture rather than on the semantic quality of the original business messages. That focus serves a purpose for data management but misses the mark almost entirely in clarifying what practices produce good business communications in the first place. Typical data quality dimensions (e.g., completeness, uniqueness, timeliness, etc.) are:
Retroactive rather than proactive
Quantitative rather than qualitative
Systemic rather than semantic
Worst of all, typical data quality dimensions implicitly remove responsibility off the shoulders of those who create the data.
The quality of data in a data/system architecture can never be any better than the quality of the business communications that produced it.
A systematic means to manage data at rest simply does not guarantee the vitality — the semantic health — of the business communications it supports.
Sometimes IT professionals focus so intently on software development that the importance of the point escapes them.
(Many data professionals do understand the point, but do not know quite how to articulate it or feel powerless to do much about it.)
To make the point differently, it is entirely possible to assess your data quality as outstanding even though the business communications that produced the data were confusing, contradictory, unintelligible, or otherwise ineffective.
Rating data quality high when communication is poor is nonsense!
VUCA
Within VUCA, several thematic areas of consideration emerge, providing a framework for introspection and evaluation:
Knowledge management and sense-making: An exploration into how we organize and interpret information.
Planning and readiness considerations: A reflection on our preparedness for unforeseen challenges.
Process management and resource systems: A contemplation on our efficiency in resource utilization and system deployment.
Functional responsiveness and impact models: Understanding our capacity to adapt to changes.
Recovery systems and forward practices: An inquiry into our resilience and future-oriented strategies.
Systemic failures: A philosophical dive into organizational vulnerabilities.
Behavioural failures: Exploring the human tendencies that lead to mistakes.
Within the VUCA system of thought, an organization's ability to navigate these challenges is closely tied to its foundational beliefs, values, and aspirations.
Those enterprises that consider themselves prepared and resolved align their strategic approach with VUCA's principles, signaling a holistic awareness.
The VUCA world of the 2000/2010s
The term was first coined by the U.S. Army War College to describe the challenges of operating in a post-Cold War world.
From there, the acronym made its way into management and leadership literature and business school lecture halls at the turn of the millennium.
Volatility - Don´t expect standard values applicable to all. Fluctuations, diverse, dynamic.
Uncertainty - Don´t expect situations to be stable or immutable. Instability.
Complexity - Dependencies on external parties. Tied together, non-transparent.
Ambiguity - It depends: never black or white, anything can be interpreted in myriad ways.
BANI
The BANI model goes a step further and helps companies consider the chaotic and completely unpredictable impacts that can have a major impact on their operations.
The BANI model of the 2020s
Brittle - it is about a sudden and unforeseen shock to or even the destruction of a seemingly stable system, which may lead to a global ripple effect.
Anxious - feelings of power- and helplessness, turning people rigid with fear. Anxiety can also be triggered by misinformation and fake news.
Non-linear - there is no more law of cause and effect – these things are either completely uncoupled or disproportionate.
Incomprehensible - The human mind is no longer able to grasp the complexity of information and occurrences in their entirety.
⚙ Q-3.5.5 Logic - Algebra - Decision
binary - multiple value
"Boolean"
Algebra has been fundamental in the development of digital electronics, and is provided for in all modern programming languages. It is also used in set theory and statistics.
💣 This focus on just two possible outcomes True/False is not what real life is, even it is not what is used and should used in information technology.
Not understanding what logic in an information system is applicable will cause unexpected errors when getting used. "Many valued logic" The Priest P3 logic is used in a relational DBMS where undefined is noted as NULL.
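A small truth-table sketch in Python (my illustration; None plays the role of the SQL NULL / undefined value):

```python
# Three-valued logic as a relational DBMS treats NULL (illustrative sketch).

def and3(a, b):
    # False dominates AND; otherwise unknown (None) wins over True.
    if a is False or b is False:
        return False
    if a is None or b is None:
        return None
    return True

def or3(a, b):
    # True dominates OR; otherwise unknown (None) wins over False.
    if a is True or b is True:
        return True
    if a is None or b is None:
        return None
    return False

for a in (True, None, False):
    for b in (True, None, False):
        print(a, "AND", b, "=", and3(a, b), " | ", a, "OR", b, "=", or3(a, b))
# Exactly why "WHERE x = NULL" matches nothing: the comparison is unknown, not True.
```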
Pareto principle, statistics & decisions.
🤔 Having 80% going well is good enough ... Is it really good enough?
Go for the low-hanging fruit; do not bother that you will have up to 20% failures or things going wrong. Pareto-principle
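A tiny sketch with made-up numbers of what the Pareto principle looks like in data (illustrative only):

```python
# Pareto principle on synthetic failure counts (numbers are made up).

failures_per_cause = [400, 250, 150, 80, 40, 30, 20, 15, 10, 5]  # sorted descending

total = sum(failures_per_cause)         # 1000
top_20pct = failures_per_cause[:2]      # the top 20% of ten causes
print(f"top 20% of causes -> {sum(top_20pct) / total:.0%} of failures")
# 65% here; in many real datasets the ratio is close to the famous 80/20.
```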
Statistical relevance <-> decisions.
🤔 Accepting 5% (1 out of 20 being a mistake / failure): acceptable or not?
😱 Is searching for cases that fulfil the statistical relevance test a correct approach or not? P-value
In 2016, the American Statistical Association (ASA) published a statement on p-values, saying that
"the widespread use of -statistical significance- (generally interpreted as -p le 0.05-) as a license for making a claim of a scientific finding (or implied truth) leads to considerable distortion of the scientific process"
Lift, accuracy, confusion matrix
Explainable AI, better understandable ML (Machine Learning) automatic decisions, is searched for but still lacking.
Confusion matrix
In predictive analytics, a table of confusion (sometimes also called a confusion matrix), is a table with two rows and two columns that reports the number of false positives, false negatives, true positives, and true negatives.
This allows more detailed analysis than mere proportion of correct classifications (accuracy).
Accuracy will yield misleading results if the data set is unbalanced; that is, when the numbers of observations in different classes vary greatly.
In real life, the impact of a wrong decision should be another, ethical, dimension to evaluate.
A single wrong decision with a catastrophic impact is better to avoid than many wrong decisions with a little impact.
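A minimal sketch with made-up numbers of how accuracy misleads on an unbalanced set: a classifier that always answers "negative" on a 95/5 split scores 95% accuracy while missing every positive.

```python
# Confusion matrix vs accuracy on unbalanced data (numbers are made up).

actual    = [0] * 95 + [1] * 5   # 95 negatives, 5 positives
predicted = [0] * 100            # lazy classifier: always predicts "negative"

tp = sum(a == 1 and p == 1 for a, p in zip(actual, predicted))
tn = sum(a == 0 and p == 0 for a, p in zip(actual, predicted))
fp = sum(a == 0 and p == 1 for a, p in zip(actual, predicted))
fn = sum(a == 1 and p == 0 for a, p in zip(actual, predicted))

print(f"TP={tp} FP={fp} FN={fn} TN={tn}")           # TP=0 FP=0 FN=5 TN=95
print(f"accuracy = {(tp + tn) / len(actual):.0%}")  # 95%, yet every positive missed
```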
Q-3.6 Controlling Organisation & Business
Once Dorothy and her colleagues made the journey to OZ, they quickly found out that there was no there, there.
The Wizard simply told her what she really should have known all along.
Dorothy and her companions just had to figure out how to reframe their own perceived shortcomings and recast them as strengths to achieve real transformation.
🎭 Q-3.6.1 Information categorisation for engineering
Seven Common Myths About the Zachman Architecture Framework
By some kind of evolution I found myself doing the categorisation in this approach.
Premise & Conclusion (Ronald.G.Ross, Gladys S.W.Lam 2015)
Widely misunderstood and misrepresented, the Zachman Architecture Framework is simply a thinking tool, not a methodology of any kind.
Its being fundamentally neutral with respect to methodology, in fact, is the secret to its power and the reason it has proven so enduring.
The Framework can, of course, be applied to create a methodology, but that's a different matter. ...
The Zachman Architecture Framework is the classification scheme, or ontology, created by John Zachman for engineering things of complexity.
Such solutions don't happen by accident — they require deliberate engineering.
Zachman simply points out, like it or not, what such 'deliberate engineering' necessarily involves.
The classic scheme has only five rows. The middle one, "logic", is duplicated by a context switch.
Zachman's basic premise is that whenever you engineer anything of complexity, no matter what — a complex machine, a skyscraper, a microchip, a spacecraft, a product, a business (an enterprise), or some part of a business (a business capability) — there are two basic aspects that need to be addressed.
These two aspects correspond to the columns and rows of the Framework.
The columns represent the primitives of engineering problems and correspond to the six interrogatives (business engineering questions): what, how, where, who, when, and why.
(The order doesn't matter.)
If an artifact is not primitive, then it's a composite and inevitably more complex and resistant to change.
The rows represent reifications in the sense of MWUD (Merriam-Webster Unabridged Dictionary) [reify]: convert mentally into something concrete or objective : give definite content and form to : MATERIALIZE.
In engineering, an object is created for a particular audience with a certain perspective, set of needs, and agenda.
The Framework recognizes six such reifications or audiences. (Their order does matter.)
Six primitives times six reifications (audiences) equals 36 cells in the Framework.
You can think of those 36 cells as covering the universe of discourse for engineering things of complexity, a fundamental scheme for understanding and assessing completeness.
Myths:
Myth #1. The Framework requires you to create an artifact for each and every cell. Wrong. It's not a methodology, it's a classification scheme.
Different methodologies emphasize problems of different kinds, so in practice some cells are likely to play a less prominent role than others.
Myth #2. The Framework can be applied only at the enterprise level. Wrong.
It can be applied for an engineering problem of any size (scope) deemed meaningful.
Myth #3. The Framework discourages or precludes prototyping. Wrong. Again, the Framework isn't a methodology.
Much can be learned about the best solution for any given audience by prototyping alternative approaches.
Myth #4. The rows in the Framework are about increasing level of detail. Wrong.
Each successive row represents a transform of the previous reification into a new reification. The new reification serves a new purpose for a distinct audience.
Myth #5. The Framework doesn't recognize there are connections between the primitives. Wrong.
A key question, in fact, is how the primitives are 'tied together' (configured) at any point in time to create a complete and workable solution.
Myth #6. The Framework somehow induces complexity. Wrong.
Engineering problems are inherently complex, with business engineering being perhaps the most complex of all.
The complexity already exists, the trick is to engage with it most effectively.
Myth #7. The Framework slows you down. Wrong. That's not our experience at all.
Asking the right questions of the right audiences at the right times in the right ways doesn't slow you down, it speeds you up (or avoids costly dead ends).
Enterprise Architecture (J.A. Zachman, 2021)
You can classify the set of descriptive representations of anything (buildings, airplanes, locomotives, battleships, computers, etc.) in a two-dimensional classification structure, a "schema."
One dimension of the classification I call "Abstractions" … I chose to call this dimension of the classification Abstractions because you can abstract out, or separate out, or factor out a set of six single, independent variables or focal points of descriptions that are common to every architected object.
The architectural descriptions of anything will include:
Bills of Material,
Functional Specs,
Drawings (or Geometry),
Operating Instructions,
Timing Diagrams, and
Design Objectives.
It is not mysterious why the people who build buildings, airplanes, battleships, locomotives, computers, all the Industrial Age products that are sufficiently complex to warrant Architecture came up with that set of description representations.
They are answering the six primitive interrogatives that constitute the total set of questions that have to be answered to have a complete description of anything: What, How, Where, Who, When, and Why. ...
This goes back about 7,000 years to the origins of language … and by the way, I did not invent this classification. It has been well-exercised by humanity for thousands of years. If you don't answer all six primitive interrogatives it means that your description is incomplete.
⚖ Q-3.6.2 Controlling an Enterprise, Organisation
PDCA changing the way of change the enterprise
Using the Jabes framework with Jabes tooling gives a remarkable visualisation.
Instead of missions - visions or organisation value streams, the process of changing processes is the change.
Still having nine planes, showing the SIAR model at several levels:
Goal: enabling controlled data-driven processes at (IV)
Enabling & planning: by understanding the nine planes at (III)
Designing, building, validating the Jabes tool/product at (I)
Delivery of the Jabes tool, change in the framework for processes at customers at (II)
Controller data literacy
Controller data literacy is the ability to select and connect the right sources, based on strategic, tactical and operational process information flows, so that an actual, complete, trustworthy information position is created with which a substantiated decision can be made based on facts.
⚒ Q-3.6.3 Pretending being in control
Cargo Cult
Just following: "There is nothing quite so useless, as doing with great efficiency, something that should not be done at all."
The question: Cargo Cult Agile or a true Agile Mindset? (Kasia Bartos)
When it is hard to notice the difference between the Daily Scrum and the classical “status update for the manager”, we can feel that something is not right.
When team members are complaining that “Scrum Events take up so much time, while we have work to do”, then it is easy to figure out,
that there is no buy-in among employees for the whole idea of Scrum and some important ingredient is missing: the Agile Mindset needs to be developed.
When Scrum is done without promoting the real Agile Values, we might be dealing with Cargo Cult Agile.
I would add that there is possibly something wrong with the promoted agile behaviour.
Source: Feynman, Richard P. (June 1974). "Cargo Cult Science":
In the South Seas there is a cargo cult of people.
During the war they saw airplanes land with lots of good materials, and they want the same thing to happen now.
So they´ve arranged to imitate things like runways, to put fires along the sides of the runways, to make a wooden hut for a man to sit in,
with two wooden pieces on his head like headphones and bars of bamboo sticking out like antennas -he´s the controller- and they wait for the airplanes to land.
They´re doing everything right.
- The form is perfect.
- It looks exactly the way it looked before.
- But it doesn´t work. No airplanes land.
So I call these things cargo cult science, because they follow all the apparent precepts and forms of scientific investigation,
but they´re missing something essential, because the planes don´t land.
If you expect motivated, proficient, autonomous staff while doing micromanagement, there is a contradiction.
The real question for help would be solving the essential issues.
Decisions and ethical challenges.
With decisions and leadership, there are many styles of leadership.
There is an optimistic approach that has collecting as much as possible of the positive aspects as the goal. On positive aspects there is openness.
Negative aspects also exist in the real world. The goal would be to avoid as many of them as possible. On negative aspects there is no openness.
Dark triad
All three dark triad traits are conceptually distinct although empirical evidence shows them to be overlapping. They are associated with a callous-manipulative interpersonal style.
The dark triad is a limited philosophy on those aspects.
A new model of Machiavellianism based in organizational settings consists of three factors:
maintaining power.
harsh management tactics.
manipulative behaviors.
⚒ Q-3.6.4 Culture, learning from others
Using knowledge of the founders - EW Dijkstra
Dijkstra left many notes. He did have great distrust in things being done correctly. TV Interview EWD Quality Correctness Elegance Edsger W. Dijkstra Archive (University of Texas)
The essence:
In IT, as in no other sector, the adage is valid: "We don´t have time to do it right, so we will do it over later".
And so the laws of the economy determined, with unrelenting cruelty, that until the day of his death Dijkstra had to stare at his own rightness in the form of a relentless stream of messages about faulty software and failed automation projects.
Flow! The Route to Panta Rhei [1] - J van Ede
TWI I
There are many methods for process improvement, ranging from Lean to Agile, and from TPM to Six Sigma.
There are differences in their improvement approaches, but much more similarities.
For example structured problem solving, and visualising work streams.
The most commonly shared characteristic is the realisation of flow!
Which improvement method increases throughput (flow) the best, depends on the specific situation, but also on the desired pace of improvement, the gear with which you wish to 'cycle uphill'.
TWI II
The concept Respect for People expresses that managers need the expertise and help of production workers for ongoing improvement.
Craftsmanship is not seen as something that 'lower' educated people do, but as a thing to be proud of, and a skill that many 'higher' educated people do not possess.
Craftsmanship is much more appreciated in Japan than in the West. Japanese words like monozukuri, the art of making things, and Takumi, an honorary title for an expert in his or her production step, highlight this.