Thank you also to all participants! We enjoyed your excellent questions and the discussions.
Summary of the sessions
1. How to de-risk corporate-startup innovations while improving speed and cost? By Josemaria Siota
Josemaria Siota, Executive Director of IESE Business School, explained how corporations can engage in venturing, drawing on his latest research report (available here). The session was facilitated by Nikhil Chouguley, Global Head of Product Governance & ESG Oversight at Deutsche Bank. Click here for a summary of the session.
2. GAIA-X: The future of the European datacloud by Hubert Tardieu
On the second day of our Inclusive Digital Innovation in Financial Services & Insurance (FSI) event, we took a look at GAIA-X, a European moon-shot project. For this topic we were happy to welcome Hubert Tardieu, chairman of the Board of GAIA-X, who highlighted how the project will shape the future of the financial services and insurance market in Europe by “creating a next generation data ecosystem for Europe with a global aspiration”.
The kick-off summit of GAIA-X in 2020 consequently focused on two major foundations for a project of such scale. First, the key concepts for achieving the envisioned cloud penetration of the European market were presented. These form the five pillars of GAIA-X:
Supporting policy rules derived from requirements of a European single market
Supporting federated data infrastructures (a methodology to synthesize different frameworks)
Ensuring interoperability, sovereignty, portability of data
Providing testable compliance to GAIA-X Architecture of standards
Acknowledging the open standard-setting processes laid out in GAIA-X's internal rules
Second, the project's governance structure was outlined. Both points reveal what is at the heart of GAIA-X: creating a digital business ecosystem for open innovation.
GAIA-X – digital business ecosystem by design
The professional literature highlights several design principles for successfully setting up a healthy digital business ecosystem. The summit was therefore essential to growing legitimacy, as it showed that GAIA-X is shaping its governance based on these interdependent design principles:
Demand orientation: stating a mission allows enthusiastic actors to push into the ecosystem instead of having to be pulled in, securing proactive, responsive behaviour for joint value creation
Openness: a transparent environment enabling easy access
Self-organization: enabling participants to act autonomously to increase commitment
Loose coupling: allowing participants to join freely and engage in open relationships, so that no heavy dependencies determine the success of joint projects
Domain clustering: enabling the grouping of participants in projects based on shared interests
Advantages of GAIA-X as a digital business ecosystem
Mr. Tardieu sees a major benefit of GAIA-X in the Europe-wide collaboration it enables between the private and public sector. First, the initiative jump-starts the development of digital competence among European companies, and thus in Europe as a whole. Second, it enables European-centred research hand-in-hand with practitioners, which in turn enables more precise policy-making. Third, the collaboration allows for a holistic approach, since GAIA-X is the only project addressing all the necessary elements together, from root to tip. The initiative captures the alignment of technical standards and services for interoperability and portability (the roots), through the federated trust and sovereignty services (the trunk), up to the definition of ontologies, APIs and technology standards for compliance (the leaves).
“Gaia-X may seem gigantic but we don’t think that the big issues we are trying to tackle can be ‘sliced’ into smaller parts. In Financial Services and Insurance, this is especially true at a time when cloud adoption needs to accelerate and there is so much change within the industry.”
Based on these principles, the concept of open strategy, nested in a digital business ecosystem, offers advantages for both affiliated producers and consumers. Since GAIA-X participants are often both provider and user of data, the approach taken by GAIA-X is particularly functional. Benefits for data providers are lowered development and launch costs, quality improvements due to a joint development environment, and increased speed to market. This, in turn, translates into benefits for data users, since the reduced costs are reflected in the price (up to being open source), as are the direct incorporation of feedback and the implementation of specifications in the development cycle.
Challenges identified, faced and tackled
Nevertheless, open strategizing also presents those involved with the challenge of losing established business models of value appropriation. In the case of GAIA-X, two factors endanger participants' common business models:
The lower costs of developed services or products (should any be charged) are passed on directly to the user of the data, significantly reducing achievable profit.
Differing ownership of input data, managed through data sharing agreements and data use statements, impedes the distribution of the benefits achieved.
Hence, participants have to find new ways to appropriate value within the value chain and thus generate profit from their engagement in the digital business ecosystem. According to Mr. Tardieu, the GAIA-X participants are fully aware of this challenge:
“Of course, our concern is how do we create data spaces in which those involved will be able to further their own interests too. We also think about how those who put the most effort in from the start don’t lose out to those who might join later when the hard work is done. We reject the idea of selling data. It is an old-fashioned way of thinking.”
Part of this process is the integration of researchers, as they are developing novel ideas to tackle this challenge. A recently explored idea is the approach of ‘Tickenomics’, originating at the University of Toulouse. Mr. Tardieu illustrates the underlying mechanism through an analogy: “One example [of Tickenomics] would be to suppose you are in a place with no transportation system. You are selling tickets (or lots) to travellers, to towns and to whole regions. At some point, you will have enough money to create the transportation system. And that is when the tickets become valuable. It might be slow getting started but as soon as everything is in place, it takes off quickly. So we are looking at ways we might introduce ‘tickets’ without the possibility of these leading to monopolies.”
Furthermore, open strategizing allowed the participants of GAIA-X to identify the following barriers which hinder the successful realisation of the visionary mission:
The absence of portability (also known as vendor lock-in, or the risk of ‘mainframe syndrome’) – preventing companies from committing to the cloud due to future risk.
The potential lack of interoperability – whereby differences in technical infrastructure may hamper data sharing or even render it impossible.
The importance of data sovereignty – as otherwise companies would refrain from moving to the cloud due to the risk of misappropriation of shared data.
At the same time the mission-driven initiative also produced a mode of operation to tackle these barriers according to Mr. Tardieu: “The challenges of data portability, interoperability, and common commercial and legal frameworks have different implications for different industries. That is where GAIA-X is helping participants come together to define use cases for their industries and to share information that can make industry data spaces possible. This collaboration is important.”
GAIA-X enables mission-oriented work in industry-specific projects
With the mission, design and barriers of the digital business ecosystem fleshed out, the question naturally arises of how GAIA-X will manifest itself through the realisation of projects. Mr. Tardieu pointed out that domain clustering is important here, as it defines the groups that create “data spaces” at an industry level. The FSI industry is a prime pilot because its high level of regulation requires collaboration between public- and private-sector participants when tackling the challenges of data portability, interoperability, and common commercial as well as legal frameworks.
“By working together, they (public & private actors) can increase the chance of success. And this isn’t just about sharing data, remember. It’s also about infrastructure too. Especially where regulations dictate compliance at a local level. You can’t just do it at the application level. This is something GAIA-X is working on.”
Furthermore, he pointed out that the FSI industry “is ‘ahead of the pack’ because of PSD2 (for a brief overview of PSD2 see this blogpost). We wouldn’t have seen the huge development of FinTechs without it. But this is only half of the work. Data ontologies are key and you will soon see the priority use cases from the financial services and insurance sector start to emerge based on GAIA-X projects.”
The first pilot project – the safe Financial Big Data Cluster, investigating the use case of a joint platform to fuel artificial intelligence services – is currently being developed by participants from the private and public sector with the involvement of FINDER (for an introduction see this blogpost).
Added benefit to the FSI industry through GAIA-X
While there are other initiatives (for instance, the EU Alliance for Industrial Data and Cloud), Mr. Tardieu sees the distinct benefit of GAIA-X in its holistic approach: as described above, it is the only initiative addressing all the necessary elements together, from root to tip.
We will be watching how GAIA-X and its pilots develop and, in doing so, change the European Financial Services & Insurance industry. Stay tuned for more.
Acquisitions are common across industries, but not all acquisitions succeed, and those that fail often have a negative effect on the acquirer. A recent publication in Long Range Planning further explores the multiple levels of risk involved in acquisitions and the importance of signaling the ‘why’ and ‘where’ of said acquisitions.
Dr Rick Aalbers (Associate Professor in Strategy and Innovation at Radboud University Nijmegen and coordinating team member of the FINDER Project) teamed up with Dr Killian J. McCarthy (Associate Professor of Innovation at University of Groningen) and Prof. Dr Koen Heimeriks (Professor of Strategy at Warwick Business School) to deep-dive into the matter, resulting in a joint publication.
As acquisitions are risky events, but not all acquisitions involve the same levels of risk, the authors suggest that the announced acquisition motive – the ‘why’ of the acquisition – is an important risk signal. In the paper they categorize acquisition motives, distinguishing between acquisitions with ‘pure explore’ and ‘pure exploit’ motives. Recognizing that most acquisitions have multiple motives, they also identify acquisitions with ‘ambidextrous’ motives – different combinations of explorative and exploitative motives. Building on recent contributions to signaling theory, they argue that the ‘why’ matters more if the ‘where’ pertains to a high-risk setting, measured using target-to-acquirer industry relatedness. Three findings stand out:
The market reacts more positively to pure acquisitions, aimed at exploration or exploitation, compared to ambidextrous acquisitions.
The market reacts more positively to ambidextrous acquisitions oriented towards exploitation than to ambidextrous acquisitions oriented towards exploration.
Relatedness moderates this relationship, in that the market is more willing to tolerate exploration in a related industry.
The authors' core contribution is to the literatures on acquisition motives and ambidexterity. They provide new insights into the incidence of specific motives, the ways in which they are mixed, and the market's reaction to their announcement. In addition, they contribute to the emerging literature that takes a behavioral perspective on market reactions by showing that the ‘why’ and ‘where’ of an acquisition matter.
Explainable AI refers to making the decision-making process of a machine-learning model transparent and understandable for a human observer. This includes which data has been used as an input and which variables are proportionally contributing to a model’s decision.
Why do we need explainable AI?
There are multiple reasons why explainable AI is needed. Firstly, we need to know whether input data is biased, because biased data leads to bias-reproducing AI. Secondly, we need to know which variables the model attributes the most weight to, since these could be variables that discriminate against particular groups of people. Thirdly, an explainable AI model enables companies to address accountability and to be prepared for regulatory reporting.
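To make this concrete, here is a minimal sketch of what such an explanation can look like for a simple linear scoring model, where each variable's contribution to the decision can be read off directly. The feature names and weights are illustrative assumptions, not a real insurance model:

```python
# Minimal sketch of explainability for a linear scoring model.
# For linear models, each variable's contribution is simply
# weight * value, and the contributions sum to the final score.
weights = {"age": 0.4, "premium": -0.2, "claims_filed": 0.7}

def explain(features):
    """Return the model's score and each variable's contribution to it."""
    contributions = {name: weights[name] * value
                     for name, value in features.items()}
    score = sum(contributions.values())
    return score, contributions

score, parts = explain({"age": 1.0, "premium": 2.0, "claims_filed": 1.0})
print(f"score={score:.2f}")  # the overall model decision
print(parts)                 # the per-variable attribution
```

Because the contributions sum exactly to the score, a human reviewer can inspect which variables drive a decision, for instance to check whether a potentially discriminatory variable carries excessive weight. For non-linear models, techniques such as permutation importance or SHAP values play the analogous role.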
How to implement explainable AI in insurance?
The prominent business cases that AI in insurance addresses are cross-selling and up-selling, targeted recommendations, and churn prevention. Explainable AI in insurance (compared to non-explainable AI) enables customers to have increased trust in the AI system, validate the business relevance of the model, discover new insights in the data, check for variables that should be excluded, and use it for regulatory purposes.
If you would like to learn more about explainable AI in insurance, please reach out to Jeremie Abiteboul.
On Wednesday, March 17th, S. James Ellis gave a talk concerning ecosystem dominance at the weeklong Inclusive Digital Innovation event hosted by Atos. This discussion comes in the wake of a white paper currently in development centered around the same topic.
The paper views dominance through three different lenses in order to prescribe what incumbents and startups should focus on to gain a dominant edge in digital, data-driven ecosystems. “Dominance,” in this sense, is given a fair amount of room for interpretation, but it hinges on the idea that in an ecosystem where a business’ stakeholders seek sustainable revenue going forward, there exists the possibility to adapt to ecosystem changes while simultaneously gaining some measure of influence over how a company’s peers in the ecosystem engage with each other. This all centers on a core tenet of ecosystems: the variety of interactions between members.
The first argument asserts that customer access – distinct from customer engagement – is a path to focus on when seeking a dominant position in ecosystems. While many companies do indeed prioritize interaction with their customers as a general objective, this point of view suggests that building the material or conceptual infrastructure to own engagement with the customer is key to gaining a dominant advantage. This could be actualized, for instance, through building “vessel offerings”, where the focal company bundles its own offerings alongside complementary companies’ offerings. The example James gave was that of Internet companies that bundle television companies’ offerings in with their own services, thereby owning access to both the Internet and the television customer. As the customer, in this perspective, is assumed to be the leading force in ecosystem innovation, this begets an advantage in seizing customer-led innovation opportunities – and thus, a measure of dominance.
The second viewpoint, similar to hallmark resource-based approaches, asserts that access to key resources is the key to finding a dominant position in ecosystems. However, and somewhat particular to data-driven ecosystems, these key resources are proportionately interrelated. That is, a company must achieve an interlinked balance of capital, talent, and data in order to most effectively advance its position in its ecosystem. This viewpoint further posits that an overage of any of these resources without a correlated gain in the other two will result in an inefficient operating position, which could slow the company down enough to jeopardize its dominant advantage.
The final viewpoint asserts that a company that systematically pursues the most ecosystem connections, thus centralizing itself among participants, stands to gain a dominant edge among peers. By establishing material linkages with other companies, such as supply chain redundancies, formalized partnerships in joint offerings, and the like, this central and centralizing company begins to insulate itself from the inevitable failures and disruptions that occur in ecosystems, and especially those experiencing the turbulence of broadscale innovation.
The white paper will be available through Atos’ Thought Leadership publications later this year.
FINDER and Atos joined with practitioners, academics, and policy-makers to discuss how to yield benefits from these developments by re-positioning banks in the ecosystem, using Artificial Intelligence in insurance, mitigating risks in new venture collaborations and exploring the opportunities of the European GAIA-X project.
On Monday 15th March, the first day’s sessions took place as part of FINDER and Atos’ ‘Inclusive Digital Innovation in Financial Services & Insurance Event Week’. Atos CTO Remco Neuteboom (https://atos.net/en/expert/remco-neuteboom) and Rick Aalbers (https://www.ru.nl/personen/aalbers-h/), Associate Professor of Strategy and Innovation at Radboud University, hosted the sessions. They were joined by Josemaria Siota – Executive Director of the IESE Business School – who presented findings from a new corporate venturing report. The discussion was moderated by Nikhil Chouguley, the Global Head of Product Governance & ESG Oversight at Deutsche Bank.
The first half of the session focused on new research on corporate venturing. Josemaria Siota presented the latest findings on the new role of the corporate venturing ‘enabler’: “An institution or individual, within an innovation ecosystem, that facilitates a resource or activity in the collaboration between an established corporation and a startup, in order for the corporation to attract and adopt innovation.” There are many types of enabler, including private accelerators and incubators, research institutions, venture capital firms or investors, governments and even other corporations. The enabler's role is to help determine the innovation gap, explore the options for building innovation capacity or partnering with others, and facilitate any partnership.
The second half of the session was open to the audience. Host Nikhil Chouguley introduced himself and explained how he was interested in both sides of the relationship: he is responsible for governance at a major corporate in his day job, but he also operates his own fintech startup. Nikhil was particularly interested in the role of enablers and the relatively new concept of corporate venturing squads. Noting that the 25% of collaborations that had succeeded still represented a huge positive, he invited audience questions for Josemaria Siota on the research and its implications for the market.
Josemaria Siota explained how the research points to five crucial conclusions for corporates:
1) Protect your company’s core business when running corporate venturing through an enabler
2) Choose capabilities rather than ‘packaging’ to filter potential enablers. For example, working with partners via a local enabler that has a deep understanding of a specific sector in a specific country
3) Remember that enablers are not just consulting firms – the reality is far richer as they bring databases, events and other ways to connect organizations
4) These opportunities offer you a completely new revenue stream: enabling other partnerships through corporate venturing ‘squads’
5) Every day, the company is becoming less and less unique; enablers can improve your value proposition
There was also time for one key conclusion for potential enablers: A proven capability is the most frequent aspect considered by partners. So always under-sell and over-deliver.
FINDER is a Marie Curie Research and Training Program funded by the European Commission and has received funding from the European Union’s Horizon 2020 research and innovation program under the Marie Sklodowska-Curie grant agreement No 813095. This session took place as part of Atos’ ‘Inclusive Digital Innovation in Financial Services & Insurance Event Week’ (15th to the 18th March 2021). This is part of Atos and Radboud University’s joint initiative FINDER (https://thefinderproject.eu/), funded by the European Commission.
Life-fulfillment platforms refer to a vision for retail banking. They represent a business model in which the customer interacts through the banking platform with various ecosystems to fulfill diverse real-world needs – for instance, purchasing train tickets, filing insurance claims, or organizing a move. Of course, the platform also offers the core financial services provided by almost all retail banks today. The general idea is that life-fulfillment platforms cover all of the customer’s needs that are related to financial transactions. These customer needs can be summarized into four cornerstones of the vision:
Pay and spend
Save and borrow
Invest and protect
Receive and earn
Which roles can banks play in this model?
The overall vision is to create a one-stop-shop solution, meaning one bank becomes the exclusive entry point for customers to fulfill various needs (see above for the categories of customer needs). In the short and mid term, banks need to position themselves with regard to their function as links between the customer and various ecosystems. For instance, banks can decide to act as advisors for ecosystems where they have expertise but do not want to get directly involved, or they can aggregate services and products in their offerings. The role banks can play hinges on their competencies and prospects. The life-fulfillment services vision sees the four roles below as the most promising options for banks:
Advisor: the bank consults the customer on what to do, when to do it, how to do it, and with whom to do it.
Facilitator: the bank provides, orchestrates, and curates a platform for different stakeholder groups to not only find each other but also interact and transact.
Aggregator: the bank will package and integrate homegrown and third-party solutions.
Initiator: the bank offers direct access through bank distribution channels to specifically supported, externally sourced services or products.
What did we learn in our discussion with academia and practitioners on the topic of life-fulfillment services?
Banks are already working towards the implementation of similar models. The risk of disintermediation and the lower margins in various fields of retail banking require banks to shift their business models. Hence, becoming a provider of new services is an appealing vision for banks.
Banks do not need to fulfill all the roles proclaimed in the model; that strongly depends on their customers and their position in the ecosystems. Which relationships can be leveraged? The customer journey for individual use cases also shows which role to play.
Banks are well-equipped for the data-driven operational model that is needed to become life-fulfillment service platforms. They have the customers’ trust and access to rich financial data. However, the analytic capability might be something that needs further development, and that can be achieved through partnerships.
If you would like to learn more about life-fulfillment services, please reach out to Eddy Claessens for further dialog on this topic.
Business ecosystems are distinguished from random collections of companies by the high degree of interactivity between ecosystem participants. In a purely ecological sense, this is the difference between a random collection of penguins, chimpanzees, and grizzly bears in a zoo – which have little if any interaction with each other, being confined to separate enclosures and having no natural connection otherwise – and a collection of lesser long-nosed bats and the night-blooming cacti of the Sonoran Desert. In the more interesting cases, interactivity is beneficial to at least one party and ideally beneficial – but perhaps inert or even damaging – to the other. In the case of the bat and cactus, the relationship might even be a matter of survival.
Businesswise, we see this in, for instance, the smart lighting industry. Providers of smart lighting platforms provide the lights and a basic infrastructure for third parties to provide add-on services. Those complementors then come in and jazz up an otherwise mundane chandelier with services like music synchronization to make the entire ecosystem – platform, complementors, and all – more attractive to customers. Scale these systems up from individual users to smart lighting systems for entire cities, and the potential for ecosystem diversity becomes immense.
However, unreigned chaos – in the constructive rather than destructive sense – rarely produces efficient market outcomes. In the smart lighting example, platform providers have some de facto control over the ecosystem of complementors that amasses around their platforms, and they might use this to nudge certain outcomes. This is not always the case: especially when the “platform” around which complementors come together is conceptual rather than a tangible product, the roles of who provides the platform and who really guides where it is going become disentangled. In these settings, a strategically advantageous position to take is that of an orchestrator, which involves putting one’s self or firm at the center of many others and attempting to order and interlace those others’ capabilities and offerings.
This either requires or hopefully provides a panoptical view of the ecosystem, which can then be harnessed to create things of value. Information asymmetry, which we won’t get into, and intentional ambiguity, which we will, can affect what one can do from that panopticon, but in this post, we’ll first discuss orchestration generally as a concept before getting into some literature that addresses the challenges that can burden the orchestrator. Don’t worry though: we’ll also get to some strategic fixes and recommendations to get around those, which will likely become more useful over time as we see ecosystem cooperation – and thus orchestration – rise in importance.
Defining an Orchestrator
Rarely are business leaders and their stakeholders – employees concerned with career growth, investors looking for substantial returns, and so on – content with being in the passenger seat while unplanned chaos drives. Ecosystems, despite their difficulty to gain unilateral control of, are steerable. They can be steered by the institutions that oversee them from the beginning – such as regulatory authorities providing tax incentives for companies working towards certain Sustainable Development Goals – or they can be steered by actors within the ecosystem – such as firms attempting to establish themselves as industry leaders by enlisting other ecosystem actors to work towards a collaborative, groundbreaking innovation.
Orchestrators, to borrow and slightly adjust the oft-cited definition of Dhanaraj and Parkhe, are central firms that create value by ordering components of an ecosystem into sequences more valuable than the sum of their parts, and then extract value by selling those sequences as products or services. That this is exactly what happens in a professional orchestra is a painfully obvious statement, but also one I’ve not seen anywhere in all this literature, so there it finally is.
In a paper I recently submitted to a few conferences, I focus on Atos as an orchestrator amid the financial services ecosystem. Over the past five years or so, they’ve worked to create a system whereby they search for promising fintechs, daisy-chain those fintechs’ offerings alongside those of other fintechs as well as the ones Atos itself can provide, then sell those solutions to clients. The benefits are clear in this win-win-win situation: fintechs (those being orchestrated) gain market exposure especially to large clients, clients (those for whom the orchestrator is orchestrating) are able to purchase their innovation goals, and Atos (the orchestrator) draws in revenue as the broker of the deal with minimal costs in terms of production.
Orchestration is not necessarily a new theoretical concept. The previously cited Dhanaraj and Parkhe article, which seems to be the root article in much of the management literature on the topic, was published in 2006. It’s been a long time since then, with 2020 accounting for roughly half of it. However, one of our colleagues right here at Radboud University co-authored an article on orchestration that was published in Organization Studies this year, and the insights are particularly valuable for practitioners in likely any industry who seek to achieve a similar role. The above win-win-win dynamic, after all, is about more than generating profit for shareholders: it’s about improving the health of the ecosystem. Whether in an ecological or a market sense of ecosystems, that is hard to argue against. I’ll also reference an article co-authored by a partner of the FINDER project, Dr. Miriam Wilhelm, which is relevant in its discussion of how a central firm must apply different, and more specifically ambidextrous, approaches when dealing with other firms contributing to its outputs.
The difficulty of constantly being in tune with all orchestrated components – being in the panopticon – is no small factor, and it not only requires many sets of eyes to monitor what’s going on in many places at once, but it also involves many sets of hands to address various issues and concerns among the various project participants. Even more importantly, it requires an intuition for when to apply hands-on, dominant solutions and when to only provide a gentle nudge before letting the consensus figure out the rest.
Broadly speaking, the study focuses on interfirm orchestration, in contrast to orchestration that occurs between different units within a firm, which I’ll cover in a future post. In their paper, Reypens, Lievens, and Blazevic assess a project with a large collection of stakeholders to explore how orchestrators go about mobilizing agents across a variety of firms to work towards the same objective. They adopt the view from previous literature that there are two modes of orchestration: dominant and consensus-based. These are fairly self-explanatory: in the former, one entity attempts to centrally govern most processes within the endeavor, putting other entities in a de facto subordinate role; in the latter, governance and management are decentralized or revolving. The authors then assert that both modes can be employed dynamically in a given project and by a given entity.
It is along this line of thought that they commence their analysis, and their study lays out in great detail the dynamics that occurred between stakeholders through a four-year project. Specifically, they narrate how orchestrators of the project danced between dominant and consensus-based orchestration based on environmental conditions, the growing capabilities and interaction of the network participants, and so on. The paper – cited in full at the bottom – contains insights that would likely be useful for any manager at the head of a collaborative project, and thus is worth a fuller read. I’ll use the remainder of this piece to discuss how key aspects of their abstraction can turn into specific strategic methods for practitioners. In the following section, I’ll refer to orchestrated projects, but keep in mind that this can be scaled up to long-term, international events or scaled down to embedded units within a single company. If you find exceptions to that, feel free to engage in the comments section of whatever medium through which you found this post. For the speed-readers out there, I’ve put the main takeaways in bold.
This section briefly extracts a few points that are practically relevant for managers finding themselves at the beginning of or in the midst of populated projects. The authors also included a chart in their work for this, which I’ve included below, that discusses specific orchestration practices that address the plurality as well as the diversity of stakeholders – again, the paper is worth a look for a more comprehensive explanation.
To start, I’m going to momentarily reach out to a different theoretical topic before coming back to this paper. You might’ve heard the term “ambidexterity” lately in contexts not referring to what people can do with their hands; as a theoretical topic, it’s a contemporary darling in management literature, and not without reason. At its core, it refers to the basic idea of doing two different things (well) at once. In this paper, the authors suggest that orchestrating dominantly and orchestrating by consensus must be dynamically balanced over time to account for stakeholder diversity. The link between these concepts is clear, but we can make it clearer if we compress the four-year medical project they researched into one event1. As such, these modes occur ambidextrously and across three episodes the authors define: connecting members, facilitating their work, and governing the process.
To tie this in with the article co-authored by Dr. Wilhelm, the orchestrator should make it a goal from the beginning to gain a comprehensive understanding of each orchestrated member’s own capabilities and how intrinsically motivated they are to accelerate or modify those capabilities. While a complex task to pull off, it can really pay dividends: having in-depth knowledge of how certain KPI-driven members respond to ambiguous versus very specific task guidance sounds intuitive, but it is also overlooked to a disappointing extent. Consider, if nothing else, how this knowledge might be used to motivate those members to optimize their own processes without repetitive external pressure from the orchestrator.
To borrow an example from the above-cited paper, Toyota sought cost-reduction behaviors from its supply network partners while also maintaining the quality of parts delivered. On one hand, Toyota demarcated clear, measurable cost-reduction goals for all of its suppliers; on the other, it offered coaching in the production-optimization practice of kaizen2 to individual suppliers without explicitly forcing them to follow it or micromanaging how they optimize their practices. This lateral freedom allows those suppliers to explore their own potential for improvement, and extending it to members of an orchestration project at every ripe opportunity is a key strategy managers should keep at the top of their toolboxes.
For business leaders near the starting line of projects like those described so far, the connection step is important. Especially as the ongoing COVID-19 pandemic has largely scattered the workforce out of centralized working locations such as corporate offices or construction jobsites, bringing members back together is necessary to prevent project members from feeling disconnected from their peers. In a material sense, this disconnection can have resounding consequences for the serendipitous generation of new ideas that could make a good project even greater. I beg of you, however, to mitigate effects such as “Zoom fatigue” (a review of that linked article being a good first step).
Shifting tracks slightly from connection of members to facilitation of their work, being a present and connected orchestrator goes a long way. “Work” of course means different things in different arenas, but I focus here on the type of work where various members of a larger project have relative freedom in the ways in which they go about performing their tasks. In other words, they’re able to deliberate, think of alternative methods, and perhaps implement them even if it slightly shifts the course of the entire project. This stands in contrast to, for instance, assembly line work, where workers (be they human or machine) perform highly specialized tasks without much room for on-the-spot improvisation.
Members of an orchestrated project might find themselves hitting the proverbial “writer’s block,” or perhaps straying from original objectives – especially given the tendency for these workers to enter states where their field of vision narrows to what they, and only their direct links in the project’s system, are concerned with on a daily basis. When given ambiguous guidance per the earlier recommendation, this is especially likely in large projects with a diversity of stakeholders. Orchestrators, however, have an extremely valuable bird’s-eye view of the project even when it might seem chaotically dense. How can they leverage this to refuel, restart, and realign their agents? By making the objectives, and especially the interdependencies, of other components in the project chain known to straying or stalled participants, giving them a reference point to guide their own way forward.
The nexus of this paper, and the final point I’ll discuss here despite there being much more worth a look in the paper itself, is that the orchestration mode is dynamic through time. Sounds intuitive, doesn’t it? But considering why that insight needs an entire research paper points to the counterintuitive, and perhaps counterproductive, dynamics of projects with too many cooks in the kitchen, so to speak.
The project they researched showed that orchestration moved from dominant to consensus-based because “as ambiguity decreases and relationships form, the reliance on formal structures decreases.” It’s not difficult to imagine why this crucial step goes missed in, for example, old-school dinosaur companies that have opted for a community-based innovation approach in trying to leapfrog past their advancing competitors. Relinquishing control, even for the health of the initiative itself, is difficult for high-level managers in these companies who might perceive doing so as jeopardizing their professional reputation.
– S. James Ellis, ESR
The original paper co-authored by our Radboud colleague, Dr. Vera Blazevic:
Reypens, C., Lievens, A., & Blazevic, V. (2019). Hybrid Orchestration in Multi-stakeholder Innovation Networks: Practices of mobilizing multiple, diverse stakeholders across organizational boundaries. Organization Studies, 42(1), 61–83. https://doi.org/10.1177/0170840619868268
The paper co-authored by Dr. Miriam Wilhelm, a member of the broader FINDER team:
Aoki, K., & Wilhelm, M. (2017). The Role of Ambidexterity in Managing Buyer–Supplier Relationships: The Toyota Case. Organization Science, 28(6), 1080–1097. https://doi.org/10.1287/orsc.2017.1156
1: “Why would you do that though?” Good question. In process research methods, and more specifically in researching Markov processes (which I do not claim to be an expert about, so take the following with a grain of salt), occurrences (such as the collaborative writing of one work package that is a small component of a larger project) stack into events (such as the combination, assignment, and fulfillment of these work packages to achieve project outcomes); events then stack into states (such as the project shifting from incomplete to complete). This is not absolute, but rather a good framework through which one can comprehend how long-term processes can be systematically divided up for incremental analysis.
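The occurrence → event → state hierarchy described in this footnote can be made concrete with a small sketch. Everything below is illustrative: the class names, the work-package examples, and the completion rule are my own assumptions, not constructs from the paper or from process research methodology.

```python
# Hypothetical sketch of the occurrence -> event -> state hierarchy:
# occurrences (small units of work) stack into events (work packages),
# and events stack into a project-level state. Names are illustrative.

from dataclasses import dataclass, field


@dataclass
class Event:
    name: str
    # Occurrences are (label, done) pairs, e.g. the collaborative
    # writing of one component of a work package.
    occurrences: list = field(default_factory=list)

    def complete(self) -> bool:
        # An event is fulfilled once all of its occurrences are done.
        return all(done for _, done in self.occurrences)


def project_state(events) -> str:
    # Events stack into a state: the project flips from "incomplete"
    # to "complete" only when every event is fulfilled.
    return "complete" if all(e.complete() for e in events) else "incomplete"


events = [
    Event("work package A", [("draft", True), ("review", True)]),
    Event("work package B", [("draft", True), ("review", False)]),
]
print(project_state(events))  # "incomplete": one occurrence is still open

events[1].occurrences[1] = ("review", True)
print(project_state(events))  # "complete": all events are now fulfilled
```

The point of the rollup is the one the footnote makes: long-term processes can be divided into nested increments, so an analyst can ask at which level (occurrence, event, or state) a change in orchestration mode happened.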
2: Kaizen, per Dr. Katsuki Aoki (the co-author of Dr. Wilhelm’s paper), is “a term generally and broadly used in Japanese manufacturing industries to refer to activity that is implemented onsite by recognizing and bridging the gap between ideal and actual conditions and applying ideas to improve a production situation.”
Atos and the FINDER team are hosting an online event week on Inclusive Digital Innovation in Financial Services & Insurance from the 15th until the 18th of March, every day at 16:00 CET (Thursday already at 15:00 CET). To see the agenda and register for the event, go to https://digitalevents.atos.net/Digital-Innovation-in-FSI/home
The event consists of five sessions with presentations by world-leading speakers:
GAIA-X: The future of the European datacloud (Hubert Tardieu, Chairman of the Board of GAIA-X)
How to de-risk corporate-startup innovations, while improving speed and cost? (Josemaria Siota, Executive Director of IESE Business School’s Entrepreneurship and Innovation Center)
Ecosystem dominance (Ivo Luijendijk, Group Industry Director at Atos and S. James Ellis, FINDER PhD candidate)
Retail Banking transforms into Life-fulfilment services (Eddy Claessens, Group Industry Director at Atos and Jonas Röttger, FINDER PhD candidate)
Enabling next generation customer insights & interactions in insurance through explainable AI (Jérémie Abiteboul, Chief Technology Advisor at DreamQuark)
About the event
The COVID-19 pandemic has been a catalyst for digital adoption across various aspects of our private and professional life. In the financial services and insurance industry, processes are increasingly tackled by leveraging data, machine-learning, and Fintechs/InsurTechs. Atos has joined forces with practitioners, academics, and policy-makers to discuss how to yield benefits from these developments by re-positioning banks in the ecosystem, using Artificial Intelligence in insurance, mitigating risks in new venture collaborations and exploring the opportunities of the European GAIA-X project. This event week is part of Atos and Radboud University’s joint initiative FINDER (https://thefinderproject.eu/), funded by the European Commission. Five independent sessions will allow you to listen to expert presentations and discuss with the presenters and your peers your thoughts, ideas and questions. Please see below for our world-leading speakers.
CEOs helming an acquisition are commonly expected to convey confidence in the outcome of their recent strategic decision to pair up with others for the future. However, too much confidence, also known as CEO overconfidence, can jeopardize the value creation of deals through a higher likelihood of overpayment: overconfident CEOs believe they possess superior capabilities in deriving synergy from acquisitions, leading them to make higher bids than more rational CEOs would.
Overconfidence is a widespread human phenomenon. It inflates people’s belief in their capabilities and in the precision of their judgment. For instance, people often believe they are better-than-average car drivers, which violates a rational conception of an average. People in powerful positions are even more prone to overconfidence, since their appointment itself signals superiority. Hence, it is not surprising to find overconfidence among CEOs.
In the context of mergers and acquisitions, overconfident CEOs represent a risk to shareholders. While it is common to observe the acquirer’s stock plummet upon acquisition announcement, this reaction is especially pronounced for acquisitions helmed by overconfident acquirer CEOs. So how do firms led by more overconfident CEOs communicate acquisition announcements so that investors do not start selling their shares?
We conducted a study on acquisitions by S&P 500 constituents between 2014 and 2020. Using an automated linguistic analysis of acquirer press statements, we found that investors react more positively to acquisitions by overconfident CEOs if the firm’s announcement press release conveys less confidence in the deal. That is an exciting finding, since conveying confidence in a strategic decision usually represents a positive signal for investors to draw on. It seems, however, that the effect depends on who is signaling the confidence. In the case of an overconfident CEO, investors appear to prefer a bit less confidence, perhaps because it reflects a more realistic view of the deal, which in turn reassures investors that the acquirer is on the right track.
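To give a flavor of what an automated linguistic analysis of press releases can look like, here is a minimal dictionary-based sketch. The word lists and scoring rule below are my own simplified assumptions for illustration; they are not the tool or dictionaries used in the study.

```python
# Hypothetical sketch of a dictionary-based "confidence" score for a press
# release: count confident vs. tentative words and normalize by length.
# Word lists and the scoring rule are illustrative assumptions only.

import re

CONFIDENT = {"confident", "certain", "assured", "definitely", "will", "superior"}
TENTATIVE = {"may", "might", "perhaps", "possibly", "could", "appears"}


def confidence_score(text: str) -> float:
    """Confident minus tentative word counts, per total word count."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    confident = sum(w in CONFIDENT for w in words)
    tentative = sum(w in TENTATIVE for w in words)
    return (confident - tentative) / len(words)


assertive = "We are confident this acquisition will definitely create value."
hedged = "This acquisition may create value and could strengthen our position."
print(confidence_score(assertive) > confidence_score(hedged))  # True
```

Commercial and academic tools (e.g. validated psycholinguistic dictionaries) are far more sophisticated, but the underlying idea is the same: map word choices in the announcement onto a measurable tone dimension that can then be related to investor reactions.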
While the linguistic analysis of firm communication is nothing new for business analysts or researchers, the interaction of CEO characteristics (i.e., CEO overconfidence) and firm communication has received little scrutiny so far. It is therefore something for marketing and public relations departments to consider when announcing deals to the public: accounting for the past performance and press portrayal of the CEO might be valuable when writing press releases.