Many companies today align their software development with the DevOps model. But saving time alone is not enough: security must be built in from the very beginning – the keyword is DevSecOps.
The second article in my series “Time to Production” showed the optimization potential that system operation and administration can unlock through close cooperation with development, QA and product managers, and through consistent use of tools. However, it would be short-sighted to see the deployment pipeline merely as a way to make new code available to users in real time: the pipeline should also deliver a secure application into a secure and monitored system environment.
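The idea of security as a first-class pipeline concern can be illustrated with a minimal sketch. This is my own simplified model, not the pipeline described in the series; all stage names and check functions are hypothetical placeholders for real tools (test runners, dependency scanners, static analyzers):

```python
# Minimal sketch of a deployment pipeline in which security checks
# are ordinary gating stages, not an afterthought (DevSecOps).
# All stage names and check functions are hypothetical examples.
from typing import Callable, List, Tuple

def unit_tests() -> bool:          # functional quality gate
    return True

def dependency_audit() -> bool:    # e.g. scan third-party libraries for known CVEs
    return True

def static_analysis() -> bool:     # e.g. flag insecure code patterns
    return True

def deploy_to_production() -> bool:
    return True

# The pipeline is an ordered list of (name, stage) pairs; a failing
# stage stops the run, so an insecure build never reaches production.
PIPELINE: List[Tuple[str, Callable[[], bool]]] = [
    ("unit tests", unit_tests),
    ("dependency audit", dependency_audit),
    ("static analysis", static_analysis),
    ("deploy", deploy_to_production),
]

def run_pipeline() -> bool:
    for name, stage in PIPELINE:
        if not stage():
            print(f"Pipeline stopped at stage: {name}")
            return False
    return True
```

The point of the sketch is the ordering: the security stages sit before deployment in the same pipeline, so a change can only reach users in real time if it also passes the security gates.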
The DevOps model brings developers and administrators closer together and enables agile methods in both areas.
In the first article of my “Time to Production” series, I highlighted best practices for optimizing the time a new feature takes from the product owner’s request to the completion of development. With the completion of software development, however, the feature is not yet available to users in the production system. The idea of also speeding up the second part of the supply chain – after a change has been handed over to the operator, and without any compromise in availability, security or other quality attributes – suggests itself. Here, too, applying agile principles is promising. DevOps – a compound of the terms development and operations – is a model, not a role. It means neither that developers take over the tasks of operators and administrators, nor vice versa. At the core of the DevOps model are cross-departmental cooperation between development, quality assurance and operations, and the application of software development best practices to service management and system administration.
Agile methods can help to increase the productivity and quality of processes as well as of the developed software. A look at the success criteria.
Productivity is an important key figure in software development. It indicates how much software an organization can create with a given amount of effort while meeting defined quality criteria. We have already published several articles about our experiences with measuring and optimizing it. However, high productivity in software development alone says nothing about how quickly a new feature of an application becomes usable for users in production – the so-called Time to Production. My new series of articles deals with this topic.
In software development, a single point-in-time measurement of productivity can be very deceptive. Valid statements require observation over a longer period.
In one of my previous posts, I described how the management model used at PASS for several years works. In brief: it is based on three key performance indicators (productivity, costs and quality) and corresponding measurement methods. Continue reading
Cyclically repeating measurement, evaluation and optimization enables a targeted improvement of productivity and quality in software development. Error analysis is an essential part of this process.
When handling an error, the focus is naturally on correcting it so that users can fully use the system again. In many cases, agreed response or resolution times force a quick focus on a solution, even if it is only a workaround that does not prevent the error from recurring. Continue reading
How can productivity and quality in software development be managed effectively and controlled sustainably? A KPI-based management model provides an answer.
In my last article, “Three levers for higher productivity in software development”, I used empirical values to show how industrial methods such as standardization and automation affect productivity. Industrialized software development does not aim at the mass production of similar products; on the contrary, it enables a high degree of individualization through the small-scale standardization of business and technical components. Continue reading
Costs down – quality up: productivity in software development is a triad of development standards, automation and reuse.
In my article “Individual Standard: Mass Customization in Software Development”, I showed how optimization potential can be exploited by reusing fine-grained business and technical components. The development of a travel management system served as an example, with a total of 2,286 man-days saved. Given this impressive figure, the question arises: to what extent can productivity in the development of individual systems be increased?
From practice, for practice: today I present the second book in my series “Increasing Productivity of Software Development”. It covers the application, evaluation and optimization of the KPIs productivity, costs and quality.
While the first book presented experiences with different measurement methods, my second book, entitled “Management Model, Cost Estimation and KPI Improvement”, describes a management model based on key performance indicators. Continue reading
Complex requirements, ever shorter development periods and increasing cost pressure – productivity is an issue of increasing importance in software development. This is the starting point for my book project “Productivity and Performance Measurement – Measurability and Methods”.
As Head of the Competence Center Project Governance, I am responsible for process engineering in software development at PASS. For several years we have been using key performance indicators as a major information source for our IT management – in more than ten different application environments with more than 500 customers and approximately 250,000 users. In some areas, our regularly measured KPIs show significantly increased delivery productivity. Continue reading
Industrialization vs. individualization = mass vs. individual production – an equation that has largely lost its validity today. In the industrial sector as well as in software development, the future belongs to customized mass production (mass customization).
For a long time, mass production allowed optimum utilization of operating assets and thus higher quantities, shorter delivery times and lower product prices than individual production, which in turn is characterized by more attractive and usable products. Continue reading
Mitigating risks: for every business, IT security must be a crucial part of the overall strategy. In this article, I outline its cornerstones and how tools can support the risk management process.
The use of information technology introduces risks – risks, however, that can be kept at bay by suitable measures. The following areas are crucial and worth considering in more detail: software development, technical protection measures, staff training and risk management. Continue reading
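A common building block of tool-supported risk management is scoring each risk by likelihood and impact and ranking the results. The sketch below is a generic illustration of that idea; the scales, threshold and example risks are hypothetical, not taken from the article:

```python
# Hypothetical risk register: score = likelihood (1-5) * impact (1-5),
# with a threshold above which mitigation measures are mandatory.
from dataclasses import dataclass

@dataclass
class Risk:
    name: str
    likelihood: int  # 1 (rare) .. 5 (almost certain)
    impact: int      # 1 (negligible) .. 5 (critical)

    @property
    def score(self) -> int:
        return self.likelihood * self.impact

def needs_mitigation(risk: Risk, threshold: int = 12) -> bool:
    """Risks at or above the threshold require explicit countermeasures."""
    return risk.score >= threshold

register = [
    Risk("SQL injection in legacy module", likelihood=3, impact=5),
    Risk("Phishing against staff", likelihood=4, impact=3),
    Risk("Data-centre power outage", likelihood=1, impact=4),
]

# Rank risks so that limited budget goes to the highest scores first.
for r in sorted(register, key=lambda r: r.score, reverse=True):
    print(r.name, r.score, "mitigate" if needs_mitigation(r) else "accept")
```

The value of a tool here is less the arithmetic than the discipline: every identified risk gets recorded, scored on the same scale, and periodically re-evaluated.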
Cost estimation methods must be quickly applicable – even without expert knowledge – to plan and control development projects reliably.
Recently in a development project in Aschaffenburg:
In order to plan, in good time, the team size needed to develop a new release, a cost estimate is required. A colleague decides to perform an expert estimation. Continue reading
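One quickly applicable estimation technique that builds on expert judgment is the three-point (PERT) estimate, which weights optimistic, most likely and pessimistic values. Whether this is the method the colleague used is not stated in the article, and the figures below are invented for illustration:

```python
# Three-point (PERT) effort estimate:
#   expected = (optimistic + 4 * most_likely + pessimistic) / 6
# The numbers are purely illustrative; the release in the anecdote
# is not quantified in the article.

def pert_estimate(optimistic: float, most_likely: float, pessimistic: float) -> float:
    """Beta-distribution-weighted expected effort, e.g. in person-days."""
    return (optimistic + 4 * most_likely + pessimistic) / 6

# Example: optimistic 40, most likely 60, pessimistic 110 person-days
print(pert_estimate(40, 60, 110))  # 65.0
```

Because the formula needs only three expert inputs and no historical database, it fits the requirement of being quickly applicable even without deep estimation expertise, though it inherits the biases of the experts providing the inputs.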
Software engineering by start-ups – chaos and bugs instead of structure and quality? This was the title of a conference held in mid-April by BITKOM, the German federal association for the ITC industry. Its purpose was an exchange between established ITC companies and start-ups.
In particular, the key questions were: how chaotic are the conditions under which start-ups actually develop software, and how important is quality management to them? The venue was well suited to these issues: the Hasso-Plattner-Institute for software systems engineering (HPI) in Potsdam, which has already supported numerous start-ups in the IT business. Continue reading
After Safe Harbor: A “Privacy Shield” shall protect the data of European citizens in the US.
In my last post I asked: “After Safe Harbor: Where is the legal certainty promised by the EC?” Now the follow-up agreement regulating transatlantic data transfers, which had been expected by the end of January, is at least in sight. Yesterday the USA and the EC agreed at political level on a new framework with the promising name “EU-US Privacy Shield”. Continue reading
The Safe Harbor vacuum: the legal certainty promised by the EC for the end of January is still pending, and the future of transatlantic data transfers thus remains unclear. Are politicians really leaving the IT industry in the lurch?
In the European Community, the handling of personal data is regulated by Directive 95/46/EC, which is considered one of the strictest data protection standards in the world. It prohibits the transfer of personal data from the EC to countries without a comparable level of protection. Continue reading