
jakarman - ICT - My profession




D-1 Who I am, the early years


D-1.1 Contents

D-1.1.1 What is on this page
The information factory
The mindset of a circular flow must always have been in my mind. As in a factory, the value stream runs left to right, while the question of what to do runs right to left.
Only rather recently I made this visualisation. This kind of passion is not an easy position; driving change instead of blindly following is not appreciated in social groups.
(Figure: JST/JCL value stream, politics, impact; see the right side.)
Pull: 0, 1, 2, 3 (demand, request).
Push: 4, 5, 6, 7, 8, 9 (delivery, result).
Value stream, materials: left to right.

D-1.1.2 Local content
Reference Anchor Abbreviation
D-1 Who I am, the early years
D-1.1 Contents contents Contents
D-1.2 Entering ICT areas (I) wrkbgn_02 Enter(I)
D-1.3 Entering ICT areas (II) wrkbgn_03 Enter(II)
D-1.4 Lean processing TOC = JST: Job submit tool (I) wrkbgn_04 TocJST(I)
D-1.5 Structuring practices = coding guidelines, JCL2000 (I) wrkbgn_05 Struct(I)
D-1.6 Notes: the good bad and ugly - early years wrkbgn_06 GBU(3)
D-2 Internal at a big company
D-2.1 Operations planning & executing services wrkbig_01 Enabled
D-2.2 Supporting ICT areas (I) wrkbig_02 Support(I)
D-2.3 Supporting ICT areas (II) wrkbig_03 Support(II)
D-2.4 Lean processing TOC = JST: Job submit tool (II) wrkbig_04 TocJST(II)
D-2.5 Structuring practices = coding guidelines, JCL2000 (II) wrkbig_05 Struct(II)
D-2.6 Notes: Good bad and ugly - big enterprise wrkbig_06 GBU(4)
D-3 Wandering contract roles
D-3.1 Creation of new innovative ideas wrkmdl_01 Ideation
D-3.2 Understanding ICT areas (I) wrkmdl_02 Insight(I)
D-3.3 Understanding ICT areas (II) wrkmdl_03 Insight(II)
D-3.4 Lean linkage: the virtual shopfloor wrkmdl_04 Shopfloor
D-3.5 Lean linkage: understanding flow value wrkmdl_05 FlowValue
D-3.6 Notes: Good bad and ugly - wandering - visionary wrkmdl_06 GBU(5)

D-1.1.3 Progress


D-1.1.4 jakarman, ICT - My way of thinking
working life - experiences
My working lifetime has many periods; the technical details changed, and so did the attention to the issues I wanted to get solved.
There is a gap between the passion and what it was possible to get done. No longer being dependent on payments gives the freedom to share proposals that do not conform to the usual commercial interests.
👓 link for proposals introduction.

D-1.2 Entering ICT areas (I)

D-1.2.1 IVO (ISS, Individual Sales Support)
📚 The business goal was delivering PCs to the sales people (600) at their homes, having all the information they needed for their customers (1984).
🎭 The available technology was: What was done:
D-1.2.2 Performance & Tuning, Mainframe
📚 The goal was setting up and delivering Management Information (MIS, EIS) on system resource usage, by applications and tools.
Multiple goals: 🎭 The available technology was: Having done:
D-1.2.3 JST, structuring JCL (Job Control Language)
📚 The goal: a structured practice enabling JST.
JCL is the scripting language on a classic IBM mainframe used to run processes for the business. Before you can do anything to optimize the JCL running business processes, you must structure the way JCL is coded.
It has a huge advantage: disconnecting the physical data reference from the source code into this JCL storage location (originally punch cards). What you do by that is define which data should be used at the moment the job is run, JIT (Just In Time).
Any business software, any logical program, can be reused in the DTAP setting without modified application code. Avoiding unnecessary modifications to business software (production, acceptance) is a quality and compliance requirement.
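As an illustration of that disconnect (a sketch in Python with hypothetical names; on the mainframe the mechanism is the JCL DD statement), the program only knows a logical name, and the environment binds it to a physical dataset at run time:

    # Sketch of the JCL DD principle: business logic references logical
    # names only; the run-time environment binds the physical dataset (JIT).
    DATASET_BINDINGS = {
        "DEV":  {"CUSTOMER": "DEV.CUSTOMER.DATA"},
        "TEST": {"CUSTOMER": "TST.CUSTOMER.DATA"},
        "ACC":  {"CUSTOMER": "ACC.CUSTOMER.DATA"},
        "PROD": {"CUSTOMER": "PRD.CUSTOMER.DATA"},
    }

    def business_program(environment: str) -> None:
        # Identical in every DTAP environment; only the binding differs,
        # so no modification of the application code is needed.
        dataset = DATASET_BINDINGS[environment]["CUSTOMER"]
        print(f"processing {dataset}")

    for env in ("DEV", "TEST", "ACC", "PROD"):
        business_program(env)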
🎭 Available technology and information source: Having done:
D-1.2.4 JST, a Generic approach for automated testing
📚 It automates the manual work of several IT-staff lines.
This is a very unusual part of IT optimization, as it is internal to IT. After all these years I still consider the approach revolutionary.
Experiences: 🎭 Used technology and limitations: The solution design and realization:

D-1.3 Entering ICT areas (II)

D-1.3.1 Database IDD - IDMS DB/DC
IDMS used an advanced approach with "Integrated Data Dictionaries" (IDDs). A more modern term would be multiple contextual metadata databases. An integrated backend (DB, database) with a frontend (DC, data communication), classified as a tool, middleware.
These dictionaries were set up in: Each of them having definitions: Cullinet was the preferred supplier. In earlier years no front end was available from any supplier; an in-house built front-end and middleware system was still running (VVTS).
📚 Goal: operational support (system programmer) in a small team.
🎭 Used technology: suppliers Cullinet, CA (1990s), now (2020+) Broadcom, and the IBM toolset.
What was done:
D-1.3.2 Security (ACF2 RACF - MS AD )
Security is bringing structured access to resources by roles: organizing who is allowed which role, and verifying that a natural person really is who he says he is. There is a master security administrator task (design) and a restricted security administrator task (executing requests). This segregation of duties is a compliance requirement.
The old common approach was validating user input before handing over to the tool. This was done for lack of integrated security.
The dark background has become very modern again in recent years (2020). The limitation was the number of lines × columns; often-used sizes were 80×24 and 132×27.
The screen in the image is an example of how TSO looked. The user interaction of every application had to be intuitive, meeting user expectations in the same approach. Any modification to this kind of tailored menu had to be organised.

📚 Goals implementing security with tools:
🎭 Used technology: suppliers "ADR" later "CA"(Broadcom), IBM, Microsoft (AD).
What was done:
D-1.3.3 Scheduling (UCC7 TWS - homebuild)
Scheduling jobs in the old times was hard manual work. Operators had to carry all the physical punch cards to the readers in time to execute them. Planning was done on paper as preparation, using previous experience of load and durations. The operator was the person you should be friends with.
This all changed when that work became automated and parts of it shifted to persons inside the business (functional support, production support).
There is a big shift over time impacting jobs. Many kinds of jobs have gone and been replaced by others in the era of applying computers, AI, ML.
📚 Goals:
🎭 Used technology and limitations: The realization was adjusted to the level of acceptance at the departments.

D-1.4 Lean processing TOC = JST: Job submit tool (I)

D-1.4.1 🚧 The situation, analyse the floor for improvement
The JCL JOB guidance Form (JBF)
Working on changes to the middleware, I found myself blocked by some running jobs.
I could not figure out who was running them. At that moment I knew the Operations (OPS) and "production support" (PB) teams well, so I went to them to ask what those jobs were. That is the moment I got to know the work of the "permanent test group" (PTG). There were not that many PTG persons; one person of the PB team was busy there.
The PTG persons had to fill in a hardcopy paper "test request form" (TRF), using a lot of correction paint. The PB person used that, making notes on it, to create jobs to run for that group. The throughput limit was about 100 jobs a day.
There was another hardcopy paper coming in from the Development department that described how the programs were supposed to be used, with a lot of detail on how the data was defined (the JBF). When a paper was found to contain wrong information, it had to go back to be corrected before further processing.
There was no business analyst for processes at ICT.

JCL JOB guidance Form (JBF)
I found this form serving as guidance for the workers, and it raised these questions:
  1. Why?
  2. Possible improvements?
  3. Who can help?
  4. What can I do?
  5. When to do it?
The answers I gave myself, to start the change by taking the initiative:
  1. It had grown historically
  2. Remove the form by automating what is in it
  3. Permission from managers and team leads at the involved departments,
    cooperation with the workers at those departments for the changes
  4. I could build a tool for submitting those jobs.
  5. Just start and do what is acceptable to most of the involved persons

D-1.4.2 💡 Starting simple improvements
The often used jobs for testers (Pareto)
I got some time and cooperation for doing a few small things.
PTG was willing to do more themselves, starting with the generic data preparation and reporting jobs mainly used as test tools.
Interesting discussions started on whether:
Structuring JCL, a requirement along with operations (JCL2000)
PTG needed more time to accept new things, but that other paper form was bothering me. I found a willing ear at the PB department to improve that process.
If all jobs were built in a standard common job approach, it could be automated. The question was what the best and acceptable option for a new standard of JCL coding would be. We made several examples of the possibilities, in steps of normalisation. Showing these to other persons, the most structured one was, surprisingly, the acceptable preferred one.
👁 Explaining it to another PB person: as soon as he understood the intention, he wanted to start changing. We were not ready at that moment.
Of course not everybody liked it, but we could go on.

D-1.4.3 Starting simple improvements
All kinds of batch jobs for testers, additional tools (compare)
With PTG running the tests rather well, additional questions were asked. Some of those:
Handy tool
Backup restore tools for test activities.
These come with the operating system; as system support we had delivered backup & restore to the OPS department. Using those tools at the test group as well was not that difficult at that moment.
Comparing data is included in the TSO/ISPF environment. The compare tool could be automated when the data, the results, are in datasets.

Automation for the automation engineers.
Covering all of DTAP (Development, Test, Acceptance, Production), combined with release management by headers and trailers. The organisation worked with a DBMS. Release management tools and managed releases were there and got replaced.

D-1.5 Structuring practices = coding guidelines, JCL2000 (I)

D-1.5.1 🚧 The situation, analyse the floor for improvement
The JCL script pattern challenge
The main technical issue and question was structuring JCL so it could be automated and reused. The tool was built with TSO/ISPF using REXX, at no additional cost with those commonly available tools. The structured approach is very generic; it is applicable to Quality Assurance.
A disadvantage: naming conventions and a standard way of working are organisation-specific characteristics, so a technical solution is not easily copied into another organisation.

The JCL script pattern glossary
Defining all globals, job environment settings by artifacts:
symbol | meaning | explanation
header / trailer | Header or Tail | Defines all globals, job environment settings; defines actions, jobs after this one
body frame | Bodies - Frames | Business logic: software, application programs; used by business functional owners
procedure body | Common procs | Technicals you can reuse very often; a limited number of types of JCL proc steps
dd I/O (1-3) | Data Definitions | DBMS definitions; input, output, report and log for every step

Somehow, defining the datasets, output, print, mail, databases, and transfers to be used is needed. These definitions must not be hard-coded in the business logic code. Why not? A sketch of the assembly idea follows below.
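A sketch of that assembly idea (Python for illustration; the original tool was REXX under TSO/ISPF, and all names here are hypothetical): a job is composed of a header, bodies with the business logic, separately supplied data definitions, and a trailer.

    # Sketch: a job is assembled from a header (globals), bodies (business
    # logic), data definitions (never hard-coded in the bodies), and a trailer.
    from dataclasses import dataclass

    @dataclass
    class JobArtifacts:
        header: str                # globals, job environment settings
        bodies: list               # business logic program steps
        data_definitions: dict     # logical name -> physical dataset
        trailer: str               # actions, jobs after this one

    def assemble_job(a: JobArtifacts) -> str:
        dd_lines = [f"//{name} DD DSN={dsn}"
                    for name, dsn in a.data_definitions.items()]
        return "\n".join([a.header, *a.bodies, *dd_lines, a.trailer])

    print(assemble_job(JobArtifacts(
        header="//TESTJOB JOB (ACCT),'PTG RUN'",
        bodies=["//STEP1 EXEC PGM=BUSINESS1"],
        data_definitions={"INPUT1": "TST.CUSTOMER.DATA"},
        trailer="//* trigger follow-up job",
    )))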
D-1.5.2 💡 Starting simple patterns
Structured scripting
First basic conversion to automation
Simple: using a monolithic application flow.
Head : Application flow, procedure & IO
Tail :

Structured scripting
Normalised basic conversion into automation patterns
A modular proc step in this way is like an API (program interface) with options. Normalising the data into segregated components creates new artefacts.
Head :

D-1.6 Notes: the good bad and ugly - early years

D-1.6.1 😉 Experienced positivity by activities
ISS, Individual Sales Support
This was a revolutionary approach:
Job Submittool - Lean process toc (I)
It survived, over the years, several reorganizations, the year-2000 testing, the Euro testing, and outsourcing to India. Created around 1996, it was still in use after more than 20 years, mostly untouched.
Job Submittool - Lean process toc (II)
After more than 20 years it stumbled over: removing constraints, being Agile. It explains what I had done a long time ago. The Bottleneck Rules (Clarke Ching, 2018) has a nice story on a development team.
There was a waiting room for work in progress in the flow. Testers could not work faster.

It is mura (unbalance) in the system.
Unbalance and muri (overburdening) are too often neglected, with only waste (muda) being mentioned.
Developers changed tools for testers to improve their throughput.

The end situation: work got managed by what the testers could finish. Improving the throughput of testers with better tooling is also what JST did.
Even more, there were other bottlenecks, e.g. at the operations delivery department, and some hidden at development.
D-1.6.2 😱 Experienced negativity by activities
ISS, Individual Sales Support
This was a revolutionary approach; it harmed those in the culture of "work as it was always done". The resistance was:
Job Submittool - Lean process toc
Although the goal was helping everyone in their daily activities, the resistance by some was tremendous.
Structuring Security
Changing the security from input validation into using standard tools:
D-1.6.3 🤔 Ambiguity complex uncertain volatile emotions at activities
ISS, Individual Sales Support
This was a revolutionary approach, but it never achieved its full potential:
Job Submittool - Lean process toc
Because of the business optimization that is done:
Job Submittool - Test Methodology
All work at JST - JCL was done according to the documentation found at ISTQB.
🤔 However, it was done years before the ISTQB organization existed.
Only the mainframe approach is what I have documented here. It follows the ISTQB design-concept chapters. The JST approach can be used with any tool in any environment.

Structuring Security
Roscoe was a multi-user mainframe integrated development environment, a program editor. The Roscoe administrator tried to implement security by parsing a documented command and then rejecting or allowing it. It never succeeded in becoming reliable; that was in 1985.
Question: what is new? In modern times we are parsing code, preventing code injection, creating dedicated APIs, trying to secure the web browser interface (2024).



D-2 Internal at a big company

D-2.1 Operations planning & executing services

D-2.1.1 🎭 Personal tools & coding preferences
Coding when required
I never had a preference for coding tools or middleware. The only preference is reusing as much as possible of what is already in place and sufficiently applicable to the case. It has been: In project management it is different; there is a step after understanding the customer's problem where the selection of tools and middleware is done. This is the opportunity for external suppliers to get a stronghold without any responsibility for results.
Learning PowerShell was only done recently (2022); I felt comfortable once I realized what the similarities to SAS were. With SAS, the metadata for data is included in the middleware system, as part of the dataset or in a dedicated repository. For PowerShell it is part of the system; there is no dedicated repository. PowerShell has system administration as its goal and is not tuned for mass data processing.

Limiting the area scope but extending the scope area
Through reorganisations I was forced to concentrate and specialize more on the usage of SAS tools. An interesting area because:
D-2.1.2 🎭 Data modelling
Extending into more structured data flows
The stability of delivering lies in automating what information comes in into what can be used by the information consumers (push). What should be delivered is a reaction to what the information consumers are expecting (pull). When the demand variety is high, requests for changes in the delivery will be high.
Thinking of an information value stream from left to right, the information request & delivery in a full pull-push cycle is a little bit different.
The numbering does not look very logical; there is a divergence between a linear and a circular order. In four steps: the push was already there, numbered I, II from left to right.
The pull for a flow is added, right to left: IV, III.
The full operational cycle: IV, III, I, II
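A small sketch of that ordering (the stage descriptions are my hypothetical shorthand): the stages keep their linear numbers, while one full operational cycle visits them pull first, push second.

    # Linear numbering I..IV versus the circular operating order.
    STAGES = {"I": "source / ingest", "II": "deliver result",
              "III": "specify demand", "IV": "request"}
    CYCLE = ["IV", "III", "I", "II"]   # pull: IV -> III, push: I -> II
    for step in CYCLE:
        print(step, "-", STAGES[step])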

Extending to continuous & disruptive improvements
In the material flow from left to right, both DMAIC (counter-clockwise) and PDCA (clockwise) have their place.
➡ possible disruptive change PDCA: III, I, II, IV
It doesn't make sense to start with Plan without first getting insight into where to act. The most logical start is evaluating at IV: A(PDC). Small incremental changes can use the PDCA path.
➡ small improvement DMAIC: II, I, III, IV, IV
The Control step goes together with the next iteration.
D-2.1.3 🎭 Process scheduling
Running information transformation processes
Running the operational processes for the business lines, planned such that everything is delivered in time. The mindset for this is completely different from that of those trying to make changes. Planning the operations means using tuned applications in an automated time frame without overloading the system. The service delivery moment is agreed with the intended usage. Several plannings are possible to spread the load over a day.
Example of a material operational flow: This was extended later in more detail, sometimes having the tools and sometimes missing them.
D-2.1.4 🎭 Deliveries by missions sales, marketing
Organising delivery of results.
Aside from the daily operational planning there is more:

D-2.2 Supporting ICT areas (I)

D-2.2.1 Hosting, multi tenancy - *AAS - stacks
Software As A Service (SaaS) is a great idea. Implement it yourself if your requirements are stricter than can be fulfilled with SAS on demand, e.g. cloud services. Do it yourself when you are big enough to do a SaaS implementation yourself.
Another reason can be having multiple business lines (tenants) needing the same solution, the same application. Sharing computer resources can bring huge benefits; there is a whole industry based on that. When done wrong, the risks are also high.
Solving the SAS environment challenges with all my knowledge and experience over the years brought it to a much higher level than is common practice. The approach is valid for all kinds of middleware and tools.
📚 Goals
🎭 The used technology doesn't really matter; it is about all kinds of dependencies in the full stack.
Having implemented:
D-2.2.2 Release management (versioning)
Within information technology, guidelines and techniques are evolving fast. However, release management for business applications, middleware and tools, and ICT infrastructure is still hardly well understood, let alone acceptably implemented. A simple example is that Excel, maintained by Microsoft, is the "application". Question: when Excel is the application, what kind of artifact is a worksheet usable in Excel?
Many tools for release management came and went along with the technology. Endevor was the one with a lot of resulting issues. Recently hyped tools like Git are getting the most attention.
A generic DTAP approach, seeing the levelled three layers, is far more important.
Being in a silo you have just one layer: your own layer, your own mindset.
📚 Layered DTAP Release management
🎭 The used technology doesn't really matter as long as the goal of release management is met.
Having done and being involved with:
D-2.2.3 Disaster recovery, data retention policies
Disaster recovery
Disaster recovery, a fall-back system, and more for availability are part of delivering a technology service for middleware and tools.
📚 Verifying the assumed DR infrastructure service:
🎭 Used technology isn't important, it is about the disaster recovery goal.
Having done and being involved with:
Archiving - retention policies
Being compliant with data retention policies is too often ignored. These are basic information requirements that should be part of the system.
📚 Verifying the assumed retention requirements:
  1. How long must information be kept available at the first line?
  2. What and how must information be archived at a second or third line?
  3. When should information be destroyed?
🎭 Used technology isn't important, it is about the retention goal.
Having done and being involved with:
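A minimal sketch of how those three retention questions can be encoded as an explicit policy (the tier names and periods are made up for illustration):

    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class RetentionPolicy:
        first_line_days: int    # 1. how long kept directly available
        archive_days: int       # 2. how long kept at a second/third line
                                # 3. after both periods: destroy

        def tier(self, created: date, today: date) -> str:
            age = (today - created).days
            if age <= self.first_line_days:
                return "first line (online)"
            if age <= self.first_line_days + self.archive_days:
                return "archive (second/third line)"
            return "destroy"

    policy = RetentionPolicy(first_line_days=365, archive_days=6 * 365)
    print(policy.tier(date(2015, 1, 1), date(2024, 1, 1)))   # -> destroy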
D-2.2.4 Operational Risk (OpRisk)
The OpRisk department and its work are interesting. OpRisk does things like the Advanced Measurement Approach. Using Monte Carlo simulation modelling with publicly known situations is the way to go; a sketch follows below.
Required: release management, quality testing, DR, data retention, and security policies. Delivery deadlines (quarterly) are critical moments.
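As an illustration of that kind of modelling (a sketch only: the distributions and parameters here are invented, not the ones used there), a loss-distribution Monte Carlo run combines an event-frequency draw with a severity draw per event:

    import random

    def yearly_loss(rng, lam=3.0, mu=10.0, sigma=1.2):
        """One simulated year: crude Poisson event count, lognormal severities."""
        events = sum(rng.random() < lam / 1000 for _ in range(1000))
        return sum(rng.lognormvariate(mu, sigma) for _ in range(events))

    rng = random.Random(42)
    losses = sorted(yearly_loss(rng) for _ in range(10_000))
    print("99.9% quantile (capital estimate):",
          round(losses[int(0.999 * len(losses))]))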
📚 Goals
🎭 Technology used and limitations: Having done:

D-2.3 Supporting ICT areas (II)

D-2.3.1 DWH, BI, Data connections, Unix
Getting into the analytics data flow cycle: in a growing environment these topics became the only working area.
The first problem to be solved was a generic desktop roll-out for SAS clients, as the desktop got a new standard.
The next one was the addition and consolidation of midrange servers using SAS (see Hosting, multi tenancy - *AAS - stacks).
This was a Unix environment (AIX) using a SAN (not a NAS), not that different in technology approach from Linux. It made the set of experiences with different types of operating systems complete: mainframe, Windows, Unix (Linux).
📚 Supporting DWH BI Goals:
🎭 Used technology: Implemented:
D-2.3.2 Policies - Sox-404 ITSM Cobit IEC/ISO 27001, 27002 - GDPR
Policies and standards are becoming mandatory (a legal requirement), but there is a lot of work to do.
A valuable figure because it is very generic; on the internet the figure is lost. McAfee merged with FireEye; the new name is Trellix (2021).
Needed: internal agreement and archived knowledge on what & how to do safety at the organisation.
Top down: you need to know the safety goals and then how to get those implemented.
Bottom up: the technology has a history of doing things a certain way. External suppliers have their own "best practices"; sometimes they are very bad practices.
📚 Goals
🎭 It is about compliant processes, not the details in technology.
Having worked on:
D-2.3.3 Data mining, Customer Intelligence (CI)
Data mining, data science, was hyped in 2016 and is still hyped in 2024. The CI department, Customer Intelligence, was in my early years (1990s) one of the business lines to support with tools.
Cross-selling, customer segmentation, churn rate, and more are words they are using.
Development is indicated with the word modelling; operational usage of a model uses the word scoring. That is another language for communicating about known processes.
A weird fundamental difference in data usage: in the operational plane, the operational systems, normalisation is the norm. In the analytical plane denormalisation is required.
The same questions as always with analytics.
➡ hindsight: What happened? Why did it happen? Evaluate, consider, think: What could happen?
➡ insight: Can we prevent it? What should we do?
Marketing, Customer Intelligence, commonly uses more data sources than are available internally: geo locations and external open and closed data as input, brought into correlation with internal business processes.
Bringing these marketing operations into departments executing the normal classic mass operations is a challenge. The "Analytics Life Cycle" (ALC) is not settled yet, not in 2016, not in 2024.
📚 Goals
🎭 Used technology: Having done:

D-2.4 Lean processing TOC = JST: Job submit tool (II)

D-2.4.1 🚧 The situation, analyse the floor for improvement
Interactive Transactional Systems
There are two approaches to processing; information processing in an enterprise is no different. Batch versus online processing is full of emotions.
Technical descriptions of what batch is and what interactive is remain indeterminate. The functional characteristics are easier to understand when accepting that both of them may be used in products / services. Mainframe workloads: batch and online transaction processing.
Most workloads fall into one of two categories: batch processing or online transaction processing, which includes web-based applications.
Examples: perform end-of-quarter processing and produce reports that are necessary for customers or the government, or generate and consolidate, during nightly hours, reports for review by managers.
In contrast to batch processing, transaction processing occurs interactively with the end user.

Optimizing throughput, optimizing resource usage
Understanding the technology components in information processing is full of emotions.
There are three important geographical levels:
  1. Local Machine: (fast) memory, processing units (CPU GPU)
  2. Communication lines to the machines
  3. Massive remote storage reachable by the machines
Valid for: "on premise" datacenters and cloud services.
Balancing load over technology components optimizes resource usage and reduces elapsed time (clock time). The total time used by several processing units easily exceeds the clock time. Humans are educated to fulfil tasks sequentially. Parallel processing, with planning and scheduling, is a more difficult concept for getting reliable, predictable results.
Balancing load between interactive usage by operators and users and planned batch processes requires understanding which activities by operators and users can be expected at which moment, and what the options for the other processes are.
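A small sketch of the clock-time versus total-used-time distinction (illustrative only): four independent one-second tasks cost four seconds of processing time either way, but run in parallel the elapsed time is roughly one second.

    import time
    from concurrent.futures import ThreadPoolExecutor

    def task(_):
        time.sleep(1.0)          # stands in for one second of real work

    start = time.perf_counter()
    for i in range(4):           # serial: elapsed ~4 s
        task(i)
    print("serial elapsed:", round(time.perf_counter() - start, 1))

    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=4) as pool:
        list(pool.map(task, range(4)))   # parallel: elapsed ~1 s
    print("parallel elapsed:", round(time.perf_counter() - start, 1))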

Integrated Data Dictionary
All kinds of master data and metadata should be easily accessible and maintainable.
The IDD (IDMS, Cullinet) is a good example of how this works: a central IDD (e.g. 11) referring to full DTAP realisations (e.g. 66, 88, 77, 01), each of them having staged versions of a dictionary.
Release management for an IDD is based on promotion management. From the CA IDMS Reference, Software Management:
"Promotion Management is the process wherein entities are moved from environment to environment within the software development life cycle. When these environments consist of multiple dictionaries, application development typically involves staged promotions of entities from one dictionary to another, such as Test to Quality Assurance to Production. This movement can be in any direction, from a variety of sources."
Release management for an IDD integrating with commercial tools is based on promotion management. The choice of tools is very limited, and a lot of scripting (programming) is needed for all the dedicated naming conventions and requirements. Endevor, Endevor-DB, was the only commercial external option with IDMS.
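A sketch of the promotion idea (the mapping of dictionary numbers to DTAP levels is my assumption based on the example numbering above): an entity definition is copied, staged, from one dictionary to another, in any direction.

    # Staged promotion between dictionaries (hypothetical mapping:
    # 66=Test, 88=QA, 77=Acceptance, 01=Production).
    DICTIONARIES = {"66": {}, "88": {}, "77": {}, "01": {}}

    def promote(entity, source, target):
        """Copy one entity definition from source to target dictionary."""
        DICTIONARIES[target][entity] = DICTIONARIES[source][entity]

    DICTIONARIES["66"]["CUSTOMER-REC"] = {"version": 3}
    promote("CUSTOMER-REC", source="66", target="88")   # Test -> QA
    promote("CUSTOMER-REC", source="88", target="01")   # QA  -> Production
    print(DICTIONARIES["01"])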
D-2.4.2 💡 Using holistic patterns QA testing
Enabling a holistic environment using controlled release management and underpinned Quality Assurance is the holy grail. The big problems in achieving this are the assumed complexity and the financial cost.
The first steps:
D-2.4.3 Release management, parallel testing & development
Integration of release management tools (Endevor)
With the DTAP approach, home-grown solutions are simply built and can be used for many years. What should go into a release, what is verified (QA), and what is rolled out are the real questions to be solved by the organisation.
⚠ No tool is able to solve the real organisational questions in release management.
⚠ Those home-grown approaches are similar to the usage of Git: copying and retrieving to a location is done by a generic tool.
⚠ Disappointing: release management is not the same as running QA tests, validating, and archiving test results.
➡ This list of points of attention is a root cause of accumulating technical debt.

D-2.5 Structuring practices = coding guidelines, JCL2000 (II)

D-2.5.1 🚧 The situation, analyse the floor for improvement
The JCL script pattern challenge continued
Quality Assurance is testing the whole system holistically. Just running a partial process, or only reviewing how the coding is done, gives no insight into quality.
Doing the same process in a test environment, not only the partial batch processes but all their possible interrelationships and the interaction with the partial OLTP systems, is the real challenge. A complete environment simulating operational production is a prerequisite.
➡ Simulating is not the same as making physical duplicates.
The JCL script pattern glossary
Defining all globals, job environment settings by artifacts:
symbol | meaning | explanation
body frame | Infra-Process | Used by testers; complete test environment (image) backup & restore
body frame | Infra-Process, transactional | Used by testers; complete test environment (image) backup & restore


D-2.5.2 💡 Advanced patterns
Structured scripting, dedicated DBMS
Dedicated DBMS scripts D, T, A
This feature is essential to D, T, A test environments when transactional systems are involved.
Head :

D-2.6 Notes: Good bad and ugly - big enterprise

D-2.6.1 😉 Experienced positivity by activities
Used concepts & practical knowledge
Reconsidering, all these projects were nice because they have so many connections to a diverse set of activities. Details connecting into this: the security association may not be that obvious; it is embedded in all activities.
Good practice:
Operational Risk (OpRisks)
A nice team, staff of excellent people. I worked for their shared goal.
Marketing, CI, Credit risk, Actuary and others
Nice people to collaborate with, helping them do their work.
working with colleagues, full circle
Job Submittool - Lean process toc
Words of thanks to my former colleagues. We had a lot of fun in stressful times working this out on the mainframe in the nineties. The uniqueness, and the difficulty of porting it anywhere else, made us realise what a big challenge that would be.
A few names: Many more people deserve a mention; sorry if I didn't.
(notes: 2008)

What about my colleagues?
I had many of them; mixed feelings.
What about my team leaders, management?
I had many of them; mixed feelings.
D-2.6.2 😱 Experienced negativity by activities
What about my team leaders, management?
💣 The merger with another organisation in 1998 was an example of how it should not be done. There was no vision or insight on the goal; cost saving by accounting was the leading strategy, and there was nothing more than that. A battle of different cultures and different ideas was the result. At the merger, our old management was set aside and that of the other organisation got the power, with the instruction to abandon their technology and use ours.
Even 25 years later, when nothing of it existed anymore, former colleagues were still badly hurt by the wrong behaviour of the other side.
Printing services, Hardware consolidation
Printing was done in house on the on-prem mainframe. Chosen printer technology: Siemens.
🤔 This technology was already outdated when the machines were moved from the office to another location (1996). The investment to solve that technical debt could not be justified on financial costs alone.
😱 Worse: the cost-saving argument for the relocation and consolidation failed dramatically for all involved parties. The technical debt for printing was solved around 2010, allowing the print services to be outsourced. The technology was changed to the one used at other machines.

Job Submittool - Release management
💣 With the merger, the other company's technology opinions came in. One of those was that purchasing Endevor would solve everything about release management, development, and QA testing. That was a very costly customer journey for the organisation, driven by wrong perceptions. Worse: the failure got used in personal blame, claiming it was the fault of the JST tool.
Technology options - organisational consolidation
💣 With the consolidation other people came in with their technology opinions. Instead of the claimed better …
Financial crisis, mandatory organisational split
💣 After growing by mergers and consolidations, the result of the financial crisis was an implosion and a split, forcing people out in any way possible. I was lost, not belonging anywhere.
There was too much of a toxic attitude by too many.

D-2.6.3 🤔 Ambiguity complex uncertain volatile emotions at activities
Job Submittool - Release management
The effect of this perception mismatch around Endevor was that the JST tool got rolled out to the development (DEV) and acceptance environments. In the end a full development line, even for the operations department, was in place.

Operational Risk (OpRisk)
• Several ugly issues:
  1. Issue 1: the internal cost assessment became inexplicably high. The cause: Oracle licenses and the machines involved.
  2. Issue 2: doing a DR test successfully was almost impossible due to too many involved machines. The cause: the Oracle database location.
  3. A strange quirk in dynamic prompting caused difficulties running for the different lines (tenants) having their own internal data.



D-3 Wandering contract roles

D-3.1 Creation of new innovative ideas

D-3.1.1 🎭 Information processing, DWH, Data Lake (I)
SoC, Security Operations Center, and Computer Operations
This one, from the beginning of the information age, has influenced a lot of how I see information processing. All kinds of questions: what is really needed, what is possibly available, what is in scope.
Integrated Computer System Information (CSI)
The basics of information processing using logs and other sources. The DWH (data warehouse), data lake, for a CSI is able to support security operations and more. This started at the same moment as those other topics for Executive Information Systems (EIS): a spreadsheet (these days Excel is the standard) and an easier presentation on a personal computer. Spreadsheets and websites are a very simplified approach, only having that interface for security access.
System Management Facilities (SMF) collects and records system and job-related information that your installation can use. Those are a lot of goals with different kinds of usage. Technically it is: Extract Load Transform (ELT), and when selecting types: Extract Transform Load (ETL). Building a DWH, data lake (a sketch follows after this list):
1. a landing area
2. add more meaningful context in a staging set
3. get a semantic area from the staging
4. create the valuable knowledge in a databank
Having a generic system (SMF) avoids the need to define a data pipeline for every subsystem. It was not complete; not everything is included in that logging.
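A minimal sketch of those four stages as one pipeline (the stage functions are hypothetical placeholders): a generic input such as SMF means the same pipeline serves many subsystems.

    # Four-stage sketch: landing -> staging -> semantic -> databank.
    def land(raw_records):            # 1. landing area: keep data as-is
        return list(raw_records)

    def stage(landed):                # 2. staging: add meaningful context
        return [{"record": r, "source": "SMF"} for r in landed]

    def semantic(staged):             # 3. semantic area: business terms
        return [{"event": s["record"], "origin": s["source"]} for s in staged]

    def databank(rows):               # 4. databank: the valuable knowledge
        return {i: row for i, row in enumerate(rows)}

    print(databank(semantic(stage(land(["job A ended", "job B started"])))))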
(Figure: adding data and information for a basic SoC CSI; see the right side.)
CSI disintegration due to separation of interests
The Security Operations Center (SoC) specialized in processing logs for the limited cyber-security safety goal. Logs and events for Windows, log systems in Unix: the modern products.
⚠ Issues: when words like "operator" are used, functionality and security easily come into conflict through different interpretations of usage and goal.
D-3.1.2 🎭 Information processing, DWH 3.0, Data Lake (II)
The warehouse similarity
Business Intelligence & Analytics (BI&A), EIS, and the operational information flow are two complementary topics. Commonly accepted practice: copying from the operational information flow into a DWH, data lake, for the sake of reporting.
The difference in interest:
1. The operational core value stream (flow) is for the mission of the organisation.
2. Understanding what is going on in the core value stream is BI&A.
Both of these complementary topics use ELT / ETL in the same four stages:
1. a landing area
2. add more meaningful context in a staging set
3. get a semantic area from the staging
4. create the valuable knowledge in a databank
⚠ Issues: 💡 A simplified lean design: data lake, data warehouse 3.0.
(Figure: DWH 3.0; see the right side.)
D-3.1.3 🎭 Lean design: Initialisation, Termination, Control
Measurements of operational processes
How to get detailed information when monitoring operational processes.
For example, how to implement a watchdog on extract and load steps (E-L ➡ T); a sketch follows below.
Extracting information results in one or more datasets, tables.

💡 Detailed control & balance requires additional knowledge in the initialisation and termination logic.
This logic is tailored & configured to the organisation's in-house process standards. It contains knowledge of the designed functionality.
⚠ 🚧 Mission impossible: expecting this to get solved by generic tools.
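A sketch of such a watchdog (hypothetical names; the real logic is tailored to the in-house standards): initialisation and termination events around an extract or load step, with a control & balance check on record counts.

    import logging
    logging.basicConfig(level=logging.INFO, format="%(levelname)s %(message)s")

    def watched_step(name, step, records):
        """Run one E-L step with initialisation / termination control."""
        logging.info("INIT %s: %d records in", name, len(records))
        result = step(records)
        logging.info("TERM %s: %d records out", name, len(result))
        if len(result) != len(records):          # control & balance
            logging.warning("BALANCE %s: in/out counts differ", name)
        return result

    rows = watched_step("extract", lambda r: [x.upper() for x in r], ["a", "b"])
    rows = watched_step("load", lambda r: r[:1], rows)   # drops one -> warning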
Lean design, full process control design
Any process, whether physical or in cyberspace, goes through a circle.
Processing the value stream (LTR):
➡ request: IV ➡ III (pull)
➡ delivery: I ➡ II (push)

💡 Two supporting processes for control: one delivering the needed components and another monitoring usage.
👉🏾 This idea is the same as the "operational plane" and "analytical plane" of Data Mesh. The experience and control planes are not mentioned there, but they are needed for completeness.
⚠ 🚧 Common distraction: from the "value stream" to machines, tools, software, COTS.

Importance of Naming Conventions, Data Administration
Data Administration or Master Data Management (MDM): the goal is a clear understanding of meaning, context, intentions, and goals. Gartner: "Master data management (MDM) is a technology-enabled discipline in which business and IT work together to ensure the uniformity, accuracy, stewardship, semantic consistency and accountability of the enterprise's official shared master data assets.
Master data is the consistent and uniform set of identifiers and extended attributes that describes the core entities of the enterprise including customers, prospects, citizens, suppliers, sites, hierarchies and chart of accounts."

💡 The importance of clear naming conventions, with the goal of clear understanding, and of using them for simple technical solutions, cannot be overstated.

D-3.2 Understanding ICT areas (I)

D-3.2.1 ML, Machine Learning, Scoring, Explainable AI (I)
The environment was about sensitive information (2016): governmental, a big organisation.
The first question was a technical one, going into functionality support.
In the technical design the assumption was made that scaling out was the only possible approach.
📚 Goals:
🎭 Used technologies: Solutions design and realization:
The second question was supporting and solving functionality for a dedicated purpose. In the technical design the assumption was made that scaling out was the only possible approach.
It should be replaced by updated versions. A generic description.
📚 Goals:
🎭 Technology used and limitations: The solution design and realization:
D-3.2.2 ML, Machine Learning, Scoring, Explainable AI (II)
The environment used sensitive information. The company was a public enterprise that had a relation with a bigger financial institute (2017). Some approach details follow.
In the technical design, scaling up was the only allowed approach: not an assumption that was made, but a limitation set by the defined external technical resource support.
📚 Goals:
🎭 Used technologies: Solutions and realization:
Patterns are basic building blocks and should match some issue to solve. 📚 Goals:
🎭 Technology used and limitations: The solution design and realization:
D-3.2.3 Data governance, variability, volatility
I had a period at a healthcare insurance company. What I learned there were several of the interesting information flows for claims and costs, from details into overviews, sent to governmental regulators.
I got a project at a governmental regulator for healthcare cost. The environment is based on sensitive information, fully anonymised by aggregation. Some details of the problem and solutions follow.
📚 Goals:
🎭 Used technologies: Solutions and realization: 📚 Defining and naming all elements was a unique learning experience. The challenge was not the amount of data (there were approx. 30 suppliers delivering) but the complexity in the volatility and the massive number of information elements. The translation made was handling request and delivery in the same kind of request-and-delivery cycle.
The available information (input) has a fixed layout. Outputs, results serving a goal, are in a relational format using columns.
Processing the value stream (LTR):
➡ request: IV ➡ III (pull)
➡ delivery: I ➡ II (push)

💡 Controlling the mass of involved elements by using naming conventions in logical names (32 positions); a sketch follows below.
👉🏾 This idea supports using Excel where it has strengths: remote, validated, controlled data using summaries, better than uncontrolled CSV or ASCII files.
⚠ 🚧 Common distraction: from the "value stream" to machines, tools, software, COTS.
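A sketch of such a convention (the segment layout is hypothetical): a logical name of at most 32 positions, built from fixed segments, so that every element can be validated and grouped mechanically.

    import re

    # Hypothetical convention: <domain>_<supplier>_<element>_<version>,
    # upper case, at most 32 positions in total.
    NAME_RE = re.compile(r"^[A-Z]{2,4}_[A-Z0-9]{2,6}_[A-Z0-9]{2,12}_V[0-9]{2}$")

    def valid_logical_name(name: str) -> bool:
        return len(name) <= 32 and NAME_RE.fullmatch(name) is not None

    print(valid_logical_name("COST_SUP03_DAYPRICE_V01"))   # True
    print(valid_logical_name("freeform name"))             # False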

D-3.3 Understanding ICT areas (II)

D-3.3.1 XML messages, object life cycle data model
The environment was about sensitive information (2016): governmental, a big organisation.
The third, very interesting quest was an attempt at processing a very complicated data model. Data modelling is a confusing challenge. The requirement is understanding how information is delivered and what information is needed. These can be completely different worlds of context.
The classic DWH approach is based on modelling very detailed elements, optimizing the transactional database process and saving as much storage as possible. The disadvantage is the complexity of the relationships.
Blockchain is a hyped approach (2018) for archiving and processing all detailed information and all history in chained blocks (a ledger). This complicated model is another approach to processing, and knowing, changes over time. Ownership of the information as a governmental task is the required underpinning of trust.
In practice, contracts, legal agreements, fully describe the most recent situation. Their history of changes is relevant only in special cases.
Those special cases are the most interesting ones when the goal is being able to detect illegal activity or fraud. Use case "real estate":
📚 Goals
🎭 Technology used and limitations: The solution design and realization:
D-3.3.2 Grid computing - performance, load balance
This is a hot topic for reasons of performance, effectiveness, and cost of business solutions. The question of choices comes back every time. Only for simple, not distinctive, commodity solutions is this not a relevant topic.
"The Free Lunch Is Over: A Fundamental Turn Toward Concurrency in Software" (Herb Sutter, 2005). The conclusion:
Applications will increasingly need to be concurrent if they want to fully exploit continuing exponential CPU throughput gains. Efficiency and performance optimization will get more, not less, important.
One approach is scaling out
Another approach is scaling up
No matter which one, parallel processing, not serial, should be the mindset.
📚 Goals
🎭 Technology, limitations: The solution design and realization:
D-3.3.3 Technical & functional support, using a data lake, DWH for analytics
The environment used sensitive information. The company is a semi-public enterprise that has its main activity in healthcare insurance but also some banking and insurance lines (2020).
Storing objects for use at some later moment is warehousing. Just collecting a lot of things without knowing whether you will use them is another approach: data hoarding.
The words "data warehouse" and "data lake" are confusing in their associations with their physical counterparts. The physical ones are in the operational plane.
The data warehouse (Bill Inmon 1970, Ralph Kimball 1996) does not have its goal in the operational plane but in the analytical plane. In essence, the data warehousing concept was intended to provide an architectural model for the flow of data from operational systems to decision support environments.
What it describes is the functional equivalent of a quality assurance laboratory and/or a research laboratory. The goal of this is allowing JIT (Just In Time) processing, going for lean realisations.
All kinds of issues and real problems I had seen since the beginning in the 80s were stacked up in their environment. The goals and topics changed over time, solving several of those issues while getting some new ones.
📚 Goals (I)
🎭 Used technologies, Solutions (I): Handing over knowledge resulted in extensive documentation while discussing what BI&A is about. The question of sharing information in a component approach resulted in the following figure:
(Figure: process artifact sharing.)
There was a good fit with the colleague. The SIAR model for agility was accepted without discussion. A project was running for storage migration; it had been started several years before. The old hardware and old storage in use were getting obsolete.
📚 Goals (II)
Solutions (II): The need for an update of the processing servers was expected.
📚 Goals (III)
Solutions (III): There was time to develop that structured approach in requirements, design plan, verification plan. Reviewing it while building it, the notion arose that it is very generic and could be a new disruptive product with a framework. 👉🏾 Jabes.

D-3.4 Lean linkage: the virtual shopfloor

D-3.4.1 🚧 Situation: analyse the floor for improvement
CPO, Chief Product Officer
When the product or service is what it is about, the CPO has a pivotal role.
A CPO is more than just another chief role. It is the linking pin for middle-management coordination: product, profitability, quality, safety, for predictably adding value.
X Issues in control (blue diagonals):
Accountability | clarification
Product Manager | The product manager's role is not in place; it is a recovery proposal
Quality Control | Quality control is too often missing; it is a recovery proposal
Delivery / Safety | Safety for the environment is too often isolated; it is a recovery proposal
Account Manager | The one for decisions achieving profitability by quantities

+ Issues in advisory & support (green vertical, horizontal):
Advisory & Support | clarification
Sales Manager | The well-known persons for contacts with external customers
Business Analyst | The well-known persons for functional management
Water Strider | The ignored person that keeps the processes going
Safety Analyst | Safety for the environment is too often isolated; it is a relocation proposal


Change & Control - 3×3 plane visual
Using these C-roles (existing, recovery, relocation & newbie) in a 3×3 plane changes the Command & Control mindset.
A visual is helpful for understanding.
(Figure: CPO shop-floor, the fancy 3×3 plane.)


D-3.4.2 💡 The Product / Service quantum - floor
Product / Service quantum
Product / Service flow
When the product / service is covered by a vision and mission of the organisation, a quantum exists with many lines of several types. See the figure aside. The product / service flow, operational plane:
Closed loops
There are several closed loops for different stakeholders doing a balancing act. The PM, product manager, has a pivotal role in functionality. Several roles have multiple tasks:
Supply process
There is a positional context change when the product / service is operationally executed, under change, or in development.
(Figure: CISO shop-floor; see the right side.)

D-3.4.3 Lean, agile, optimizing - floor

Understanding Agile, Lean
The understanding of Agile is limited to what the hierarchy allows to be realised by lean. Avoiding the three evils is hardly possible on the floor. Being under the control of hierarchical constraints limits the lean options.
Empowering people is often mentioned, but micromanagement is more likely to happen.
(Figure: Lean, Agile; see the right side.)

Going Agile, Lean
What the floor is able to do is understand the improvement cycles PDCA, DMAIC, and what is possible within its circle of influence.
(Figure: DMAIC, PDCA; see the right side.)

D-3.5 Lean linkage: understanding flow value

D-3.5.1 🚧 Situation: analyse the boardroom for improvement
CPO, Chief Product Officer
When the product or service is what it is about, the CPO has a pivotal role.
A CPO is more than the COO (Chief Operations Officer); it includes innovation (Chief Innovation Officer), technology alignment, and involving empowered people.
X Issues in control (Chief _ Officer):
Accountability | clarification
CFO finance | The well-known one for financial stability and predictable profitability.
FM/HR facilities | Facilities, working place, human resources, and more are essential assets.
CRO Risk | Existing; redefined into all kinds of risks, what is done and what is not done.
CEO Executive | The well-known one for decisions for the enterprise, organisation.

+ Issues in advisory & support (Chief _ Officer):
Advisory & Support | clarification
CAIO A&I | Analytics & Intelligence is about understanding information (new).
CSO Safety | Safety, including the well-known cyber security (new).
CTO Technology | Technology attention is at the cutting edges. Continuity lies between those.
CDO Data | Redefined into: human communication, understanding by using a shared language.


Change & Control - 3×3 plane visual
Using these C-roles (existing, recovery, relocation & newbies) in a 3×3 plane changes the Command & Control mindset.
A visual is helpful for understanding.
(Figure: CPO hierarchy top, the fancy 3×3 plane.)


D-3.5.2 💡 The Product / Service quantum - strategy
Product / Service quantum
Product / Service flow
When the product / service is covered by a vision and mission of the organisation, a quantum exists with many lines of several types. See the figure aside. The product / service flow, operational plane (horizontal):
Closed loops
There are several closed loops for different stakeholders doing a balancing act. This is applied analytics, intelligence, with the goal of understanding the performance. The connection lines are in the analytical plane (vertical). It would help a lot in simplifying what data, what information, and what analytics intelligence are important. How to do the knowledge provisioning is simplified by a clearer understanding of the intended goals.
😉 This would be simple if everything were processed internally and there were no external dependencies for a product / service.
Supply chain
The reality is different: multiple products / services are combined into what is the product / service for the (external) customer, consumer.
(Figure: supply chain; see the right side.)
The challenge is this:
D-3.5.3 Lean, agile, optimizing - strategy
Going Agile, Lean
Everyone wants this, seeing the advantages, but at the same time most attempts fail dramatically. What is going on; why is this happening? When asking what lean is, what agile is, a good generically applicable answer is not found.
Trying to understand how to recognize lean, agile, no good generically applicable answer is found either. There is a famous commercial example that many try to copy by its similarities: the Toyota Production System (TPS). This famous example, however, was not the only one moving in that direction. The Theory of Constraints (TOC, 1984) was built on top of that. Looking from a distance it is a natural evolutionary development in the "Information Age".

Understanding Agile, Lean
Trying to understand lean, agile, and trying to categorize a complicated list of requirements, with definitions, technical requirements, and behaviour requirements, resulted in something that could be helpful in recognizing lean, agile.
(Figure: lean value chain.)
In a visualisation:
The areas in this 3×3 plane are not at random places; they are positioned deliberately. The pillars are interesting for understanding lean and what has happened.
1. The left vertical pillar is about mindset. The famous TPS example had its place in a culture that did not have many conflicts with that. The Toyoda family had an important role in this.
2. The middle pillar is about methodology and practices. In the famous TPS example the dissonant Taiichi Ohno was the master of this.
3. The right pillar is about measuring, understanding, insight, adjusting in closed loops. Deming had an important role in this one.
4. In the middle there are threats, obstacles. When not managed they will rip down the others. For the TPS system Eiji Toyoda was the master.
If we accept this as what has happened, and accept why it is happening, at least there is a way to recognize that big lean elephant, no longer being lost in details like the blind men trying to understand the elephant.

D-3.6 Notes: Good bad and ugly - wandering - visionary

D-3.6.1 😉 Experienced positivity by activities
Used concepts & practical knowledge
Reconsidering, all these projects were nice because they have so many connections to a diverse set of activities. Details connecting into this: the security association became more obvious; it is embedded in all activities.

Working in a big government environment.
Experiences were positive because: the three periods were good; although a huge organisation, a lot was possible.
working with colleagues, full circle
Mid-lance construction
Not being an entrepreneur, I worked on several projects in a mid-lance construction. I got several projects for longer periods. When issues arose I was kept out of the heat.
The small company changed over time. This stopped abruptly on reaching a certain age.

Working for credit scoring
Experiences were positive because: the challenge of better, predictable quality was needed in their market competition.
Working for health insurance
Experiences were positive because:
Working for a health regulator
Experiences were positive because: the challenge of rebuilding the information inquiries arose from technical and functional debt.
Working at technology for a conglomerate including health insurance
Positive experiences because: Challenge: technical debt, functionalities not being well understood while outsourcing tasks.
D-3.6.2 😱 Experienced negativity by activities
Working at a big government environment.
There were many internal frictions, mostly hidden. Those probably escalated at some moment into a public, distorted rumour, resulting in new reorganizations and a regression reverting innovation.
Complexity while changing a pivotal orientation; I blew up. Due to those technical challenges it was not possible to hand over the knowledge.
Working at technology for a conglomerate including health insurance
A mistake in the organisational and political setting of security: detect and defend activities had a lot of staff; expecting deter and prevent at the same level was wrong.
D-3.6.3 🤔 Ambiguity complex uncertain volatile emotions at activities
Working for health insurance
There was no good fit in working culture; for me it was far too fragmented, missing the vision behind the missions.
Working at technology for a conglomerate including health insurance
Although I am positive in the end, there were weird situations and activities.
D-3.6.4 Lean reflection into the future
McKinsey 7S model (1970s)
Starting point: overall organisational strategy designed to achieve goals using shared values. See: 7S.
S | clarification
Skills | Capabilities, competencies possessed by employees and the organisation as a whole
Staff | Human resources, their allocation and deployment within the organisation
Structure | Organisational structure and hierarchy
Systems | Processes and procedures that govern how work is done and decisions are made
Style | Leadership and management style exhibited within the organisation.


Small - Big
The Information Age
What have we learned, not only in our own circle, and where is it going? From "Power to the Edge" (Alberts & Hayes, 2003):
"With the cost of information and its dissemination dropping dramatically, information has become a dominant factor in the value chain for almost every product or service. As the costs drop, so do the barriers to entry. Hence, competitors in many domains are seizing on the opportunity provided by "cheap" information and communications to redefine business processes and products." (p. 73)
"Organizations that are products of Industrial Age thinking are not well suited for significant improvements in interoperability or agility." (p. 56)
Outputs are the products/services valued and paid for by the business customers.
McKinsey 7S extended into 12S, 9-plane
💡 Adding five additional S's, managed by a "servant leader" for strategy by shared values.
S | clarification
Social-intelligence | the ability to understand one's own and others' actions
Supply-chain | a network involved in the production and delivery of a product or service
Suppliers | a company or individual that provides goods or services to another company
Service/Product | a named collection of business capabilities valuable to a defined customer segment
Serving-customers | doing useful work for them

Service/product flow (pull, push): the 9-plane lean model, applicable to each of the 5 rectangles.
(Figure: 7S extended to 12S McKinsey; see the right side.)


© 2012,2020,2024 J.A.Karman