Digitalisation as a lever for cost reductions

Updated: Oct 19

Many companies struggle to present true digitalisation as a cost-saving measure. But what do we mean by true digitalisation? We do not mean the mere mapping of manual processes into software, but the redesign of processes based on data!


But why is it so difficult to present real digitalisation as a tool for reducing costs? There are usually several reasons:


  • Companies must first improve their data quality, because machine learning is not feasible on the data available today

  • Data analysts are unavailable, or not available in sufficient numbers, to draw conclusions from the data that possible or necessary software recommendations would require

  • The digitalisation of know-how fails because companies do not manage to convert their existing know-how into an algorithm that enables digitalisation on the one hand and machine learning based on the data on the other

  • In addition, the initial investment in personnel, data analysis and a digital infrastructure is not only high but often hard to calculate, which makes a detailed ROI calculation in advance difficult


When redesigning processes based on data, the assumption is that the data provide additional insights and that better decisions can be made based on them.



A classic example where this methodology was successfully implemented a few years ago is Google Maps. Who can still imagine using a navigation device that does not draw on real-time data for precise route recommendations?

Now imagine a procurement map that shows you the way to the most efficient negotiation and, if you accept the proposal, implements it immediately. This not only saves you a lot of energy; you also reach your goal with the best possible result.

The reasons above for why digitalisation is so difficult thus no longer apply. Since the algorithm has already been developed, there is no need to laboriously extract your own know-how and build up a database. On the contrary: the algorithm is based on game-theoretic principles and is therefore not only tried and tested in practice but also scientifically verifiable. Existing data quality does not matter here either, since an existing database from many thousands of negotiations is used.

Since there is no need to start a large IT project, you can start quickly and easily and gain initial experience immediately.

This results in several levels at which savings can be achieved:

  • Process change: In the area of so-called tail-end spend, many companies follow the "3 bids and a buy" strategy: three offers are obtained and the cheapest one is commissioned without negotiation, on the assumption that the (process) costs of a negotiation would exceed the additional savings. If software covers exactly this process without additional effort, every percentage point of savings is an additional hard saving.


  • Quality: But even when a negotiation does take place, it is still not certain that the buyer will choose the best negotiating strategy. If you could ensure that every buyer, in every region and regardless of experience level, always selected the most efficient negotiation method, you could certainly achieve additional savings.


  • Adaptability: Some companies operate in specific markets where choosing the most efficient strategy is not so easy. This is where the "self-learning" algorithm comes into play: it adapts precisely to these special framework conditions and adjusts its recommendations accordingly.


  • Process cost reduction: Of course, such a tool can be used not only for better negotiations, i.e. to increase negotiation quality, but simply to reduce process costs. Where strategies used to be laboriously worked out and implemented semi-automatically with suppliers, this is now possible fully automatically. Note that the sourcing tools available on the market today only support and partially automate the solicitation of offers; they cannot support the negotiation itself with recommendations or carry it out independently.


  • Usability: This brings us to the last aspect of successful digitalisation: optimising the user interface to the point where end users can handle it easily and without training. After all, what good is digitalisation if it is simply not adopted or processes are not successfully adapted?
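To make the tail-end-spend point above concrete, here is a minimal sketch with purely hypothetical figures: it compares simply awarding to the cheapest of three bids ("3 bids and a buy") with letting software negotiate an additional discount, and shows why every percentage point becomes a hard saving.

```python
# Hypothetical tail-end-spend example: three bids, cheapest wins without negotiation.
bids = [10_500.0, 10_200.0, 9_800.0]  # EUR, illustrative figures only

cheapest = min(bids)  # "3 bids and a buy": award without negotiating

# If software negotiates automatically (no extra process cost),
# every percentage point of discount is an additional hard saving.
negotiated_discount = 0.03  # assumed 3% achieved by automated negotiation
negotiated_price = cheapest * (1 - negotiated_discount)

hard_saving = cheapest - negotiated_price
print(f"Cheapest bid:            {cheapest:,.2f} EUR")
print(f"Negotiated price:        {negotiated_price:,.2f} EUR")
print(f"Additional hard saving:  {hard_saving:,.2f} EUR")
```

The figures and the 3% discount are assumptions for illustration, not results from the nnamu database.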
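The post does not describe how the "self-learning" algorithm adapts its recommendations; one generic way such adaptation can work is an epsilon-greedy bandit that tries negotiation formats, observes the savings achieved, and gradually favours whatever works best in a company's specific market. The strategy names and feedback loop below are hypothetical illustrations, not nnamu's actual method.

```python
import random

# Hypothetical negotiation formats the recommender can choose between.
STRATEGIES = ["english_auction", "dutch_auction", "sealed_bid"]

counts = {s: 0 for s in STRATEGIES}    # times each strategy was tried
totals = {s: 0.0 for s in STRATEGIES}  # cumulative savings (%) observed

def choose(epsilon=0.1):
    """Mostly pick the strategy with the best average saving; sometimes explore."""
    if random.random() < epsilon or not any(counts.values()):
        return random.choice(STRATEGIES)
    return max(STRATEGIES,
               key=lambda s: totals[s] / counts[s] if counts[s] else 0.0)

def update(strategy, saving_pct):
    """Feed back the savings achieved, so future recommendations adapt."""
    counts[strategy] += 1
    totals[strategy] += saving_pct

# Toy simulation of a specific market where sealed-bid yields the most savings.
random.seed(0)
true_mean = {"english_auction": 2.0, "dutch_auction": 2.5, "sealed_bid": 4.0}
for _ in range(2000):
    s = choose()
    update(s, random.gauss(true_mean[s], 1.0))

best = max(STRATEGIES, key=lambda s: totals[s] / counts[s])
print("Recommended strategy:", best)
```

The point of the sketch is the feedback loop: recommendations are not fixed but shift as observed results from the company's own framework conditions accumulate.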

All this is offered by the nnamu negotiation bot developed by TWS Partners. TWS Partners, the leading purchasing consultancy in Europe when it comes to game theory and negotiation strategy, has developed the bot's algorithm over the past three years based on its own project database, the NEX data & Indexes, into which data from more than 3,000 projects has flowed.
