There are various systems development methods and concepts that address some of the issues within the complex environment of "Systems of Systems". Think of DevOps, Scrum, XP, Kanban, RAD, DAD, ... When faced with the challenge of developing a system under time, budget and people constraints, it is common sense to tailor a method that fits your challenge. This is generally done by synthesising concepts from existing methods.
If the synthesis is well thought out, a development and operations environment is created that harmonises people, tools, processes and other resources. Otherwise the environment gears up for schizophrenia. Environments gearing up for schizophrenia are commonplace, with symptoms such as "people with overloaded, conflicting roles", "information overload and reload" and "multiple inconsistent pieces of information communicated over multiple communication channels".
In contrast to schizophrenic environments, the desired harmonious environments are created by carefully mapping and sourcing people, tools, processes, accessories and other resources within the given constraints. In such harmonious environments, collaborators have an overview perspective from which they can zoom in and out of the details to facilitate consistent and coherent communication. A key challenge in these environments is to sustainably maintain information consistency and coherency. This can be safely addressed with SAFe, in a healthy, mature and experienced manner (bringing the right people together).
Getting the right people, processes and tools to the right place at the right time is easier said than done. The discipline, time, money, knowledge and attitude required will not be available at all times. And in the context of continuous improvement, this reality is responsible for some skewed decisions screwing up various enterprises.
There is no silver bullet to address these challenges. Every team will have to forge its own bullet and hope it is well formed enough to ensure the development and protection of the desired environment.
Tuesday, 6 September 2016
Friday, 8 April 2016
Fitting "Cloud Computing" and "Internet of Things - IoT" into your Enterprise Architecture
Fundamental to Enterprise Architecture is describing the enterprise landscapes of products, services, information, responsibilities, tools, technologies, processes and methodologies, as well as the transitions within and across these landscapes. Understanding these fundamentals from the perspectives of Ontology and Methodology raises the question of how to fit cloud computing and IoT into one's Enterprise Architecture. This question is inevitable, despite the current slow uptake of these "new" ways of processing information.
The difficulty in responding to the question becomes obvious vis-à-vis the various organisational scopes and constellations of enterprise architecture practices. In some enterprises, architecture is explicit only from a business capability perspective. In others, it is exclusively practiced within the IT function/department. In some rare cases one may find an enterprise with an explicit architecture practice covering the entire enterprise landscape and clearly mapping its business capabilities to its various functions. The responsibility for a corporate-wide architecture practice should, however, be taken by corporate management (IMHO, by the CEO).
Depending on your scope and constellation, you may be limited. In terms of cloud computing, you may be limited to the infrastructure (IaaS), the platform (PaaS and/or iPaaS), and/or the application (SaaS). In terms of IoT, you may be limited to the things that need to be connected (physical and cyber things), the connectivity (e.g. WiFi, Bluetooth LE, 6LoWPAN and ZigBee), and/or the storage and processing of the information flowing between the connected things, which will likely take you back to cloud computing.
To address the question, irrespective of your organisational scope and constellation, you should fit these primitives into your ontology framework and adapt your methodology framework to incorporate the relevant processes for synthesising the relevant architecture deliverables. The principles, standards and frameworks of enterprise architecture should always be leveraged to stream value.
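As a purely illustrative sketch of what "fitting the primitives into your ontology framework" could look like, the Python fragment below registers cloud and IoT primitives as catalogue entries; the layer names, categories and owners are assumptions made for the example, not taken from any specific metamodel such as TOGAF or Zachman.

```python
# Illustrative sketch only: registering cloud and IoT primitives as entries
# in an ontology catalogue. Layer names, categories and owners are assumed
# for the example and are not taken from any specific metamodel.
from dataclasses import dataclass

@dataclass
class OntologyEntry:
    name: str        # the primitive being classified
    layer: str       # the architecture layer it is assigned to
    category: str    # e.g. cloud service model, IoT connectivity, IoT thing
    owner: str       # the function/role accountable for it

catalogue = [
    OntologyEntry("IaaS",           "Technology",  "cloud service model",  "IT Infrastructure"),
    OntologyEntry("PaaS / iPaaS",   "Technology",  "cloud service model",  "IT Platform"),
    OntologyEntry("SaaS",           "Application", "cloud service model",  "Business & IT"),
    OntologyEntry("Physical thing", "Technology",  "IoT thing",            "Operations"),
    OntologyEntry("6LoWPAN link",   "Technology",  "IoT connectivity",     "IT Infrastructure"),
    OntologyEntry("Sensor data",    "Information", "IoT information flow", "Data Management"),
]

# A simple query the methodology processes could build on:
for entry in catalogue:
    if entry.category == "cloud service model":
        print(f"{entry.name}: {entry.layer} layer, owned by {entry.owner}")
```

Once such a catalogue exists, the processes added to the methodology framework can query it when synthesising the architecture deliverables.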
Sunday, 21 February 2016
Data Analytics in Enterprise Architecture
In the prezi "Relating TOGAF ADM to other IT related Methods" I highlighted the relation between TOGAF and other IT standards (COBIT, ITIL, PRINCE2 and CMMI). Unfortunately, the current software tools supporting these standards are not seamlessly integrated. Information locked in one tool is not easily accessible to the other tools. Hence the data cannot be analysed appropriately, despite the correlation and synergy of the data stored in the various tools.
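To illustrate the kind of correlation that this missing integration makes hard, here is a minimal sketch that joins hypothetical flat-file exports from an EA repository, an ITSM tool and a project portfolio tool; the file and column names are invented for the example and do not correspond to any vendor's actual export format.

```python
# Illustrative sketch only: correlating flat-file exports from tools that
# each hold part of the picture. File and column names are invented for
# the example and do not match any vendor's actual export format.
import pandas as pd

# Hypothetical exports, each keyed on a shared application identifier.
ea_apps  = pd.read_csv("ea_repository_applications.csv")  # app_id, capability, lifecycle
itsm     = pd.read_csv("itsm_incidents.csv")              # app_id, incidents_open, mttr_hours
projects = pd.read_csv("project_portfolio.csv")           # app_id, project, budget, status

# Join the partial views into one integral, correlated dataset.
landscape = (ea_apps
             .merge(itsm, on="app_id", how="left")
             .merge(projects, on="app_id", how="left"))

landscape.to_csv("correlated_landscape.csv", index=False)
print(landscape.head())
```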
With the acquisition of Alfabet by Software AG and of Troux by Planview, as well as the current offerings of other independent software tool manufacturers supporting Enterprise Architecture, one can hope to see improvements towards integral tools supporting the various IT standards mentioned above.
As this tool integration process matures, the value of Enterprise Analytics can be leveraged to collect integral, correlated data, facilitating the development of an advanced CIO dashboard. Leveraging Enterprise Data Analytics at the core of IT (in the architecture of any enterprise) will change the game of Business/IT alignment.
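Building on the correlated dataset sketched above, a first cut of such CIO dashboard figures could be aggregated per business capability; again, the column names are illustrative assumptions only.

```python
# Illustrative sketch only: aggregating the correlated dataset per business
# capability into first-cut CIO dashboard figures. Column names are the
# assumed ones from the previous sketch.
import pandas as pd

landscape = pd.read_csv("correlated_landscape.csv")

dashboard = landscape.groupby("capability").agg(
    applications=("app_id", "nunique"),       # size of the application portfolio
    open_incidents=("incidents_open", "sum"), # operational health
    mean_mttr_hours=("mttr_hours", "mean"),   # responsiveness
    project_budget=("budget", "sum"),         # change-the-business spend
)

print(dashboard.sort_values("open_incidents", ascending=False))
```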
While IT has been providing enterprise data analytics to other departments such as marketing, customer and partner relations management, IT has not been at the forefront of using Enterprise Data Analytics in driving the business of IT within and/or across enterprises and industries. This should be at the core of any implementation of the IT4IT™ Standard towards realising the Vision of Boundaryless Information Flow™.
Saturday, 13 February 2016
Another challenge in the uptake of modelling in the era of IoTx
The development of modelling standards has reached a maturity level where Model-Driven Development (MDD) can drive productivity across the entire System Development Life Cycle (SDLC), but the uptake of these standards in industrial production has been lagging, despite the current wave of developing the Internet of Things (IoTx).
The evolution of IoTx is driving and will drive industrial development for at least the next twenty years. And the value of modelling in the era of IoTx has been emphasised in the many reports on "Industrie 4.0" (cyber-physical systems). The lag in the uptake of modelling in the integral development of products and systems is, in my humble opinion, due to the challenge of promoting knowledge of modelling amongst managers and directors (decision makers).
In the light of quarterly thinking (short-term evaluations), investments in long-term value generators, such as taking up modelling standards, are generally postponed or cancelled in organisations managed and directed by decision makers lacking adequate knowledge of modelling. The consequence is a crawling pace in unleashing the value of modelling in an era of highly computerised production of cyber-physical systems.
In this IoTx era, standardisation across the entire production chain is the catalyst to speed up production. If a link in the chain is not aligned with the standards used, a bottleneck is created that impacts production time. Projecting this understanding onto the introduction of state-of-the-art modelling standards into established business-as-usual processes, it becomes obvious that decision makers must be involved in managing and directing this feat. But what can you expect when directors and managers are not knowledgeable in modelling and lack the will to focus resources on developing that knowledge?
After a demonstration of some IoTx use case scenarios, a director asked a question that sheds light on this challenge: "How long does it take to model such a use case, and can it be done using MS Office tools?" In side conversations another question was frequently asked: "How can I leverage IoTx in my organisation? I cannot get a sponsor without showing the ROI."
To the first question, my response was: the speed of modelling depends on proficiency in the modelling language, as well as on knowledge of the modelling tools and of the (system) context being modelled. In elaborating on the response, it became evident that the current versions of the MS Office tools are not appropriate for unleashing the value of modelling in the era of IoTx.
To the second question, my response was: a collaborative effort will be required to develop a common understanding of the respective organisational context before an adequate value proposition can be tailored. The key question is: who is going to commit human and financial resources, and when, to address this challenge?