Recent News

Retail banking transforms into life-fulfillment services – A session of the FINDER inclusive digital innovation week

On Thursday, our event week on inclusive digital innovation in FS&I reached its final day. It was exciting to bring together perspectives from academia and practice. On our last day, we discussed how banks could transform into life-fulfillment services platforms. Eddy Claessens, Industry Director at Atos, gave us insight into his hyper-customer-centric retail banking vision, the so-called life-fulfillment platforms. The session was hosted and facilitated by Jonas Röttger, a Ph.D. candidate on the FINDER project.

What are life-fulfillment services platforms?

Life-fulfillment platforms refer to a vision of retail banking: a business model in which the customer interacts through the banking platform with various ecosystems to fulfill diverse real-world needs – for instance, purchasing train tickets, filing insurance claims, or organizing a move. Of course, the platform also offers the core financial services provided by almost all retail banks today. The general idea is that life-fulfillment platforms cover all of the customer's needs that relate to financial transactions. These customer needs can be summarized into four cornerstones of the vision:

  1. Pay and spend
  2. Save and borrow
  3. Invest and protect
  4. Receive and earn

Which roles can banks play in this model?

The overall vision is to create a one-stop-shop solution, meaning one bank becomes the exclusive entry point for customers to fulfill various needs (see above for categories of customer needs). In the short and medium term, banks need to position themselves with respect to their function as links between the customer and various ecosystems. For instance, banks can decide to act as advisors for ecosystems where they have expertise but do not want to get directly involved, or they can aggregate services and products into their offerings. The role banks can play hinges on their competencies and prospects. The life-fulfillment services vision sees the four roles below as the most promising options for banks:

  1. Advisor: the bank consults the customer on what to do, when to do it, how to do it, and with whom to do it.
  2. Facilitator: the bank provides, orchestrates, and curates a platform for different stakeholder groups to not only find each other but also interact and transact.
  3. Aggregator: the bank will package and integrate homegrown and third-party solutions.
  4. Initiator: the bank offers direct access through bank distribution channels to specifically supported, externally sourced services or products.

What did we learn in our discussion with academia and practitioners on the topic of life-fulfillment services?

  • Banks are already working towards the implementation of similar models. The risk of disintermediation and the lower margins in various fields of retail banking require banks to shift their business models. Hence, becoming a provider of new services is an appealing vision for banks.
  • Banks do not need to fulfill all roles proclaimed in the model. It strongly depends on their customers and their position in the ecosystems. Which relationships can be leveraged? Also, the customer journey for individual use cases shows which role to play.
  • Banks are well-equipped for the data-driven operational model that is needed to become life-fulfillment service platforms. They have the customers’ trust and access to rich financial data. However, the analytic capability might be something that needs further development, and that can be achieved through partnerships.

If you would like to learn more about life-fulfillment services, please reach out to Eddy Claessens for further dialog on this topic.

Orchestration: Dynamic Control from the Panopticon

Business ecosystems are distinguished from random collections of companies by the high degree of interactivity between ecosystem participants. In a purely ecological sense, this is the difference between a random collection of penguins, chimpanzees, and grizzly bears in a zoo – which have little if any interaction with each other, being confined to separate enclosures and having no natural connection otherwise – and a collection of lesser long-nosed bats and the night-blooming cacti of the Sonoran Desert. In these more interesting cases, interactivity is beneficial to at least one party and ideally beneficial – but perhaps inert or even damaging – to the other. In the case of the bat and cactus, the relationship might even be a matter of survival.

Businesswise, we see this in, for instance, the smart lighting industry. Providers of smart lighting platforms provide the lights and a basic infrastructure for third parties to provide add-on services. Those complementors then come in and jazz up an otherwise mundane chandelier with services like music synchronization to make the entire ecosystem – platform, complementors, and all – more attractive to customers. Scale these systems up from individual users to smart lighting systems for entire cities, and the potential for ecosystem diversity becomes immense.

However, unreined chaos – in the constructive rather than destructive sense – rarely provides efficient market outcomes. In the smart lighting example, platform providers have some sort of de facto control over the ecosystem of complementors that amass around their platforms, and they might use this to nudge certain outcomes. This is not always the case. Especially when the “platform” around which complementors come together is more conceptual than tangible, the roles of who provides the platform and who really guides where it’s going become disentangled. In these settings, a strategically advantageous position to take is that of an orchestrator, which involves putting oneself or one’s firm in the center of many others and attempting to order and interlace those others’ capabilities and offerings.

This either requires or hopefully provides a panoptical view of the ecosystem, which can then be harnessed to create things of value. Information asymmetry, which we won’t get into, and intentional ambiguity, which we will, can affect what one can do from that panopticon, but in this post, we’ll first discuss orchestration generally as a concept before getting into some literature that addresses the challenges that can burden the orchestrator. Don’t worry though: we’ll also get to some strategic fixes and recommendations to get around those, which will likely become more useful over time as we see ecosystem cooperation – and thus orchestration – rise in importance.

The panopticon is a dynamic literary symbol, commenting on cultures of surveillance and the vulnerability of those surveilled. Ideally, this post will strike a more positive tone. Artwork created by and used with permission from Adam Simpson.

Defining an Orchestrator

Rarely are business leaders and their stakeholders – employees concerned with career growth, investors looking for substantial returns, and so on – content with being in the passenger seat while unplanned chaos drives. Ecosystems, despite being difficult to control unilaterally, are steerable. They can be steered by the institutions that oversee them from the beginning – such as regulatory authorities providing tax incentives for companies working towards certain Sustainable Development Goals – or they can be steered by actors within the ecosystem – such as firms attempting to establish themselves as industry leaders by enlisting other ecosystem actors to work towards a collaborative, groundbreaking innovation.

Orchestrators, to borrow and slightly adjust the oft-cited definition of Dhanaraj and Parkhe, are central firms that create value by ordering components of an ecosystem into sequences more valuable than the sum of their parts, and that then extract value by selling those sequences as products or services. That this is exactly what happens in a professional orchestra is a painfully obvious statement, but also one I’ve not seen anywhere in all this literature, so there it finally is.

In a paper I recently submitted to a few conferences, I focus on Atos as an orchestrator amid the financial services ecosystem. Over the past five years or so, they’ve worked to create a system whereby they search for promising fintechs, daisy-chain those fintechs’ offerings alongside those of other fintechs as well as the ones Atos itself can provide, then sell those solutions to clients. The benefits are clear in this win-win-win situation: fintechs (those being orchestrated) gain market exposure especially to large clients, clients (those for whom the orchestrator is orchestrating) are able to purchase their innovation goals, and Atos (the orchestrator) draws in revenue as the broker of the deal with minimal costs in terms of production.

Orchestration is not, as a theoretical concept, necessarily new. The previously cited Dhanaraj and Parkhe article, which seems to be the root article in a lot of management literature concerning the topic, was published in 2006. It’s been a long time since then, with 2020 accounting for roughly half of it. However, one of our colleagues right here at Radboud University co-authored an article on orchestration that was published in Organization Studies this year, and the insights are particularly valuable for practitioners in likely any industry who seek to achieve a similar role. The above win-win-win dynamic, after all, is about more than generating profit for shareholders: it’s about improving the health of the ecosystem. Whether in an ecological or a market sense of ecosystems, it’s hard to argue against that. I’ll also reference an article co-authored by a partner of the FINDER project, Dr. Miriam Wilhelm, which is relevant here for its discussion of how a central firm must apply different, and more specifically ambidextrous, approaches when dealing with other firms contributing to its outputs.

The difficulty of constantly being in tune with all orchestrated components – being in the panopticon – is no small factor, and it not only requires many sets of eyes to monitor what’s going on in many places at once, but it also involves many sets of hands to address various issues and concerns among the various project participants. Even more importantly, it requires an intuition for when to apply hands-on, dominant solutions and when to only provide a gentle nudge before letting the consensus figure out the rest.


Two Modes

Broadly speaking, the study focuses on interfirm orchestration. This is in contrast to orchestration that occurs between different units within a firm, which I’ll cover in a future post. In their paper, Reypens, Lievens, and Blazevic assess a project with a large collection of stakeholders to explore how orchestrators go about mobilizing agents in a variety of firms to work towards the same objective. They adopt the view from previous literature that there are two modes of orchestration: dominant and consensus-based. These are fairly self-explanatory: in the former, one entity attempts to centrally govern most processes that happen within the endeavor, putting other entities in a de facto subordinate role. In the latter, governance and management are decentralized or revolving. The authors then assert that both of these modes can be employed dynamically in a given project and by a given entity.

It is along this line of thought that they commence their analysis, and their study lays out in great detail the dynamics that occurred between stakeholders through a four-year project. Specifically, they narrate how orchestrators of the project danced between dominant and consensus-based orchestration based on environmental conditions, the growing capabilities and interaction of the network participants, and so on. The paper – cited in full at the bottom – contains insights that would likely be useful for any manager at the head of a collaborative project, and thus is worth a fuller read. I’ll use the remainder of this piece to discuss how key aspects of their abstraction can turn into specific strategic methods for practitioners. In the following section, I’ll refer to orchestrated projects, but keep in mind that this can be scaled up to long-term, international events or scaled down to embedded units within a single company. If you find exceptions to that, feel free to engage in the comments section of whatever medium through which you found this post. For the speed-readers out there, I’ve put the main takeaways in bold.


Strategic Recommendations

This section briefly extracts a few points that are practically relevant for managers finding themselves at the beginning of or in the midst of populated projects. The authors also included a chart in their work for this, which I’ve included below, that discusses specific orchestration practices that address the plurality as well as the diversity of stakeholders – again, the paper is worth a look for a more comprehensive explanation.

To start, I’m going to momentarily reach out to a different theoretical topic before coming back to this paper. You might’ve heard the term “ambidexterity” lately in contexts not referring to what people can do with their hands; as a theoretical topic, it’s a contemporary darling in management literature, and not for no reason. At its core, it refers to the basic idea of doing two different things (well) at once. In this paper, the authors suggest that orchestrating dominantly and orchestrating through consensus must be dynamically balanced over time to account for stakeholder diversity. The link between these concepts is clear, but we can make it clearer if we compress the four-year period over which they researched the medical project at the center of their study into one event1. As such, these modes occur ambidextrously and across three episodes the authors define: connecting members, facilitating their work, and governing the process.

To tie this in with the article co-authored by Dr. Wilhelm, the orchestrator should make it a goal from the beginning to gain a comprehensive understanding of each orchestrated member’s own capabilities and how motivated they are of their own accord to accelerate or modify those capabilities. While a complex task to pull off, it can really pay dividends: having an in-depth knowledge of how certain, KPI-driven members respond to ambiguous versus very specific task guidance sounds intuitive but is also overlooked to a disappointing extent. Consider, if nothing else, how this knowledge might be used to motivate those members to optimize their own processes without repetitive external pressure (from the orchestrator).

To borrow an example from the above-cited paper, Toyota sought cost-reduction behaviors from its supply network partners. However, Toyota also was interested in maintaining quality of parts delivered. While on one hand demarcating clear, measurable cost-reduction goals to all of its suppliers, Toyota on the other hand offered coaching in the production-optimization practice of kaizen2 to individual suppliers without explicitly forcing them to follow it or micromanaging how those suppliers optimize their practices. This lateral freedom allows those suppliers to explore their own potential for improvement, and giving that to members of an orchestration project at every ripe opportunity is a key strategy managers should keep at the top of their toolboxes.

For business leaders finding themselves near the starting line of projects that resonate so far, the connection step is important. Especially as the ongoing COVID-19 pandemic has largely scattered the workforce out of centralized working locations such as corporate offices or construction jobsites, bringing members back together is necessary to prevent a situation where project members feel like they’re disconnected from their peers. In a material sense, this can have resounding consequences for the serendipitous generation of new ideas that could make a good project even greater. I beg of you, however, to mitigate effects such as “Zoom fatigue” (a review of that linked article being a good first step).

Shifting tracks slightly from connection of members to facilitation of their work, being a present and connected orchestrator goes a long way. “Work” of course means different things in different arenas, but I focus here on the type of work where various members of a larger project have relative freedom in the ways in which they go about performing their tasks. In other words, they’re able to deliberate, think of alternative methods, and perhaps implement them even if it slightly shifts the course of the entire project. This stands in contrast to, for instance, assembly line work, where workers (be they human or machine) perform highly specialized tasks without much room for on-the-spot improvisation.

Members of an orchestrated project – especially due to the tendency for these workers to get into states where their field of vision narrows to what they and only the direct links in the project’s system are concerned with on a daily basis – might find themselves hitting the proverbial “writer’s block,” or perhaps straying away from original objectives. Especially when given ambiguous guidance per the earlier recommendation, this is likely in large projects with a diversity of stakeholders. Orchestrators, however, have an extremely valuable bird’s-eye view of the project even when it might seem chaotically dense. How can they leverage this to refuel, restart, and realign their agents? By making the objectives and especially the interdependencies of other components in the project chain known to straying or stalled participants, giving them a reference point to guide their own way forward.

The nexus of this paper, and the final point I’ll discuss here despite there being much more that’s worth a look in the paper itself, is in discussing the orchestration mode as dynamic through time. Sounds intuitive, doesn’t it? But considering the reasons why that might need an entire research paper to cover alludes to the instinctive and perhaps counterproductive nature of projects with too many cooks in the kitchen, so to speak.

The project they researched showed that orchestration moved from dominating to consensus-based because “as ambiguity decreases and relationships form, the reliance on formal structures decreases.” It’s not difficult to imagine why this crucial step goes missed in, for example, old-school dinosaur companies that have opted for a community-based innovation approach in trying to leapfrog past their advancing competitors. Relinquishing control, even if for the health of the initiative itself, is a difficult thing to do for high-level managers in these companies who might perceive doing so as jeopardizing their professional reputation.

– S. James Ellis, ESR


The original paper co-authored by our Radboud colleague, Dr. Vera Blazevic:

Reypens, C., Lievens, A., & Blazevic, V. (2019). Hybrid Orchestration in Multi-stakeholder Innovation Networks: Practices of mobilizing multiple, diverse stakeholders across organizational boundaries. Organization Studies, 42(1), 61–83. https://doi.org/10.1177/0170840619868268

The paper co-authored by Dr. Miriam Wilhelm, a member of the broader FINDER team:

Aoki, K., & Wilhelm, M. (2017). The Role of Ambidexterity in Managing Buyer–Supplier Relationships: The Toyota Case. Organization Science, 28(6), 1080–1097. https://doi.org/10.1287/orsc.2017.1156


1: “Why would you do that though?” Good question. In process research methods, and more specifically in researching Markov processes (which I do not claim to be an expert about, so take the following with a grain of salt), occurrences (such as the collaborative writing of one work package that is a small component of a larger project) stack into events (such as the combination, assignment, and fulfillment of these work packages to achieve project outcomes); events then stack into states (such as the project shifting from incomplete to complete). This is not absolute, but rather a good framework through which one can comprehend how long-term processes can be systematically divided up for incremental analysis.

2: Kaizen, per Dr. Katsuki Aoki (the co-author of Dr. Wilhelm’s paper), is “a term generally and broadly used in Japanese manufacturing industries to refer to activity that is implemented onsite by recognizing and bridging the gap between ideal and actual conditions and applying ideas to improve a production situation.”

Atos and FINDER to host online event week on digital innovation in financial services (15th until 18th of March)

Atos and the FINDER team are hosting an online event week on Inclusive Digital Innovation in Financial Services & Insurance from the 15th until the 18th of March, every day at 16:00 CET (Thursday already at 15:00 CET). To see the agenda and register for the event, go to https://digitalevents.atos.net/Digital-Innovation-in-FSI/home

The event consists of five sessions with presentations by world-leading speakers:

  • GAIA-X: The future of the European datacloud (Hubert Tardieu, Chairman of the Board of GAIA-X)
  • How to de-risk corporate-startup innovations, while improving speed and cost? (Josemaria Siota, Executive Director of IESE Business School’s Entrepreneurship and Innovation Center)
  • Ecosystem dominance (Ivo Luijendijk, Group Industry Director at Atos and S. James Ellis, FINDER PhD candidate)
  • Retail Banking transforms into Life-fulfilment services (Eddy Claessens, Group Industry Director at Atos and Jonas Röttger, FINDER PhD candidate)
  • Enabling next generation customer insights & interactions in insurance through explainable AI (Jérémie Abiteboul, Chief Technology Advisor at DreamQuark)

About the event

The COVID-19 pandemic has been a catalyst for digital adoption across various aspects of our private and professional life. In the financial services and insurance industry, processes are increasingly tackled by leveraging data, machine-learning, and Fintechs/InsurTechs. Atos has joined forces with practitioners, academics, and policy-makers to discuss how to yield benefits from these developments by re-positioning banks in the ecosystem, using Artificial Intelligence in insurance, mitigating risks in new venture collaborations and exploring the opportunities of the European GAIA-X project. This event week is part of Atos and Radboud University’s joint initiative FINDER (https://thefinderproject.eu/), funded by the European Commission. Five independent sessions will allow you to listen to expert presentations and discuss with the presenters and your peers your thoughts, ideas and questions. Please see below for our world-leading speakers.

M&A announcements: How much confidence to convey if you are considered overconfident?

Photo by Sharon McCutcheon on Unsplash

CEOs helming an acquisition are commonly expected to convey confidence in the outcome of their strategic decision to pair up with another firm for the future. However, too much confidence by the CEO, also known as CEO overconfidence, can jeopardize the value creation of deals due to a higher likelihood of overpayment: CEOs who are overconfident believe they possess superior capabilities in deriving synergy from acquisitions, leading them to make higher bids than more rational CEOs.

Overconfidence is a widespread human phenomenon. It affects humans’ belief in their capabilities and the precision of their judgment. For instance, people often believe they are better-than-average car drivers, which violates a rational conception of an average. People in powerful positions are even more prone to fall victim to overconfidence since their appointment itself signals superiority. Hence, it is not surprising to find overconfidence among CEOs.

In the context of mergers and acquisitions, overconfident CEOs represent a risk to shareholders. While it is common to observe the acquirer’s stock drop upon an acquisition announcement, this reaction is especially pronounced for acquisitions that will be helmed by overconfident acquirer CEOs. So how do firms helmed by more overconfident CEOs communicate acquisition announcements so that investors do not start selling their shares?

We conducted a study on acquisitions by S&P500 constituents between 2014 and 2020. Using an automated linguistic analysis on acquirer press statements, we found that investors react more positively to acquisitions by overconfident CEOs if the firm’s announcement press release conveys less confidence in the deal. That represents an exciting finding since usually conveying confidence in a strategic decision represents a positive signal for investors to draw on. However, it seems that the effect depends on who is signaling the confidence. In the case of an overconfident CEO, it appears investors prefer a bit less confidence, maybe because that shows a more realistic view of a given deal, which evokes confidence in investors that the acquirer is on the right track.
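To make the method more concrete, below is a minimal, purely illustrative sketch of dictionary-based tone scoring of an announcement press release. The word lists, function name, and example text are placeholders of my own and not the lexicon or software actually used in our study.

```python
# Minimal sketch of dictionary-based tone scoring for an acquisition press release.
# The word lists are illustrative placeholders, not the lexicon used in the study.
import re

CONFIDENT_WORDS = {"confident", "certain", "assured", "strong", "definitely", "will"}
TENTATIVE_WORDS = {"may", "might", "could", "possibly", "appears", "expect", "believe"}

def confidence_score(text: str) -> float:
    """Return the share of confident minus tentative words per total words."""
    tokens = re.findall(r"[a-z']+", text.lower())
    if not tokens:
        return 0.0
    confident = sum(t in CONFIDENT_WORDS for t in tokens)
    tentative = sum(t in TENTATIVE_WORDS for t in tokens)
    return (confident - tentative) / len(tokens)

release = (
    "We are confident this acquisition will deliver strong synergies, "
    "although integration costs may be higher than expected."
)
print(f"Net confidence score: {confidence_score(release):.4f}")
```

In practice, a score like this would be computed for each press release and then related to the announcement-day stock reaction and a measure of CEO overconfidence.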

While the linguistic analysis of firm communication is not a novelty for business analysts or researchers, the interaction of CEO characteristics (e.g., CEO overconfidence) and firm communication has received little scrutiny so far. It is therefore also something marketing and public relations departments should consider when announcing deals to the public. Considering the past performance and press portrayal of the CEO might be valuable when writing press releases.

– Jonas Röttger, ESR

Collaboration between FINDER and TechQuartier for the project ‘Financial Big Data Cluster’

On 1 January 2021, the initiative Financial Big Data Cluster was launched with the research project “safeFBDC” as a solution for a technology-driven development of the European financial sector. To this end, safeFBDC assembles a consortium of public and private collaborating partners, managed by TechQuartier, to leverage knowledge in the areas of artificial intelligence, machine learning, and business model development.

The initiative is thus a response to increasing structural change fuelled by technological innovation, with participants reacting to the challenge of adaptation at increasing speed.[1] As “banking is unbreakably connected with the use of information technology”[2], the financial sector is a prime example of the importance of technological innovation. While US and Chinese actors have been predominant in adopting technological innovation in the financial sector, European actors have to step up their game. Their engagement is important to secure data sovereignty and thus obtain a competitive position when it comes to data-driven financial services. To achieve this, collaboration between the private and public sectors is essential. This calls for applicable, Europe-centric research to understand and thus enable innovations and the environment they need. Providing such research will in turn enable the proactive engagement of practitioners.

Thus the safeFBDC project is set up to deliver on these necessities by aligning three major goals:

  1. Increasing research output through the development of new AI systems and analysis of new, information-rich data sets.
  2. Enhancing financial stability by facilitating the exercise of oversight and supervisory functions by public authorities.
  3. Promoting the development of new data-based products, services and business models, and increasing the transfer of knowledge from research to business.

Collaboration on the research of new business models driven by technology

To facilitate applicable, Europe-centric research on the financial sector, TechQuartier and FINDER have decided to join forces. Together we want to use the opportunity the safeFBDC provides to study collaboration driven by technological change. To do so, Luisa Kruse from TechQuartier and I will work together on this project. The aim of our collaboration is to study the underlying organizational mechanisms driving this flagship project. By doing so we generate value in three important ways. First, we facilitate applicable research to enable practitioners. Second, we gain a better understanding of how technology affects opportunities for innovation. Third, we establish a new venture of research on the European financial sector. The progress of our collaboration will subsequently be covered in my blog posts, culminating in a collaborative whitepaper.

– Jonas Geisen, ESR


[1] Schwab, K. (2017). The fourth industrial revolution. Currency.

[2] Thalassinos, E. (2008). Trends and Developments in the European Financial Sector. European Financial and Accounting Journal, 3(3), p. 58

How banks should harvest their internal data

Data fuels decision-making. Banks are well-equipped with the financial data of their customers. Experts often point out that consolidating internal financial data with other data sources (e.g. behavioral data, macro-economic data, etc.) will unfold data’s full potential. Yet, banks’ rich internal data is regularly overlooked as an opportunity that can be used to fuel decision-making. Banks need a solid data-gathering strategy and advanced data analytic skills to leverage their internal data.

How should banks approach internal data?

Data needs to be gathered with a clear purpose. Hence, the journey towards a data-fueled operating model starts with defining clear use cases. Subsequently, the use cases have to be checked against reality. To that end, banks’ internal data should first be inventoried and categorized. It is crucial to define a timeframe for which data collection is performed (depending on the use case, data collection for the last three to ten years could be most suitable). Subsequently, the data can be put to work through, e.g., model building. While harvesting data with the goal of implementing use cases is crucial, the strategy should also spell out how to manage data in the future. Harvesting data from legacy architectures demonstrates the potential of data in general but is very inefficient for future endeavors. Here, breaking down data silos and building data lakes represents a robust solution. Currently, banks are still struggling with small projects that only reach the proof-of-concept stage and large projects that are abandoned due to overwhelming complexity. Incremental progress on projects of medium complexity offers the largest potential to thrive.
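As a purely illustrative sketch of the inventory-and-timeframe step described above (the table names, columns, and the five-year window are hypothetical assumptions, not a prescription):

```python
# Minimal sketch: inventory internal data sources, then restrict a chosen use case
# to a defined collection window. Table and column names are hypothetical.
import pandas as pd

# Step 1: inventory and categorize available internal data sources.
inventory = pd.DataFrame({
    "table":    ["card_transactions", "mortgage_applications", "call_center_logs"],
    "category": ["spending", "lending", "service"],
    "rows":     [120_000_000, 450_000, 2_300_000],
})
print(inventory)

# Step 2: for a concrete use case (e.g. churn prediction), keep only the
# agreed collection window, here the last five years.
transactions = pd.read_parquet("card_transactions.parquet")  # hypothetical file
transactions["booking_date"] = pd.to_datetime(transactions["booking_date"])
cutoff = pd.Timestamp.today() - pd.DateOffset(years=5)
use_case_data = transactions[transactions["booking_date"] >= cutoff]
```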

Too much of a good thing: why data frugality is important

Occam’s razor is the idea that in problem-solving, the simplest solution is usually the right one. This principle is widely adopted in data science for several reasons. Firstly, a model’s appetite for data increases the risk of having unobserved data points, which negatively affects the predictive power of a model. Secondly, more data increases the training time for models. More training time means more energy and consequently higher costs. Thirdly, more data can lead to impaired explicability of a model, as a complex model’s results are harder to interpret. This is especially the case if deep learning methods are applied (which remain to a large extent black boxes). The low explicability of models prevents their application as part of automated decision-making due to GDPR regulations. Moreover, low explicability could make the model unstable in times of new, hitherto unseen data: users will have difficulty explaining why and with what accuracy the model is adapting to the new circumstances. In general, parsimony is an important criterion banks have to optimize for when using their data.
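To illustrate, here is a minimal sketch of what such a parsimony check could look like in practice. The synthetic data and the improvement threshold are arbitrary assumptions for illustration, not a recipe from any particular bank.

```python
# Minimal sketch of a parsimony check: prefer the smaller feature set unless the
# richer model improves cross-validated performance by a meaningful margin.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))          # 20 candidate features
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=1000) > 0).astype(int)

small = cross_val_score(LogisticRegression(max_iter=1000), X[:, :2], y, cv=5).mean()
full  = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5).mean()

# Occam's razor in practice: only accept the extra 18 features if they clearly pay off.
chosen = "full model" if full - small > 0.01 else "parsimonious model"
print(f"2 features: {small:.3f}, 20 features: {full:.3f} -> keep the {chosen}")
```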

Keeping data in the loop

Oftentimes, it is argued that data evolves from simple data to information to knowledge. While that is true for many use cases, it should be pointed out that data-fueled decision-making does not always require intense computation to become knowledge. Depending on the level of human-in-the-loop involvement or the affordances of a decision, very simple data points can be highly informative. However, if data is processed in a time-consuming and complicated manner to derive knowledge (e.g., in the form of a report), this knowledge should be kept in the loop. Hence, the results of data processing should become part of the data storage.
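As a small, hypothetical sketch of keeping knowledge in the loop (file paths and column names are assumptions of mine): a computed aggregate is written back next to the raw data so that later decisions can reuse it instead of recomputing it or leaving it buried in a report.

```python
# Minimal sketch of "keeping knowledge in the loop": a derived result is stored
# alongside the raw data for reuse. Paths and column names are hypothetical.
import pandas as pd

transactions = pd.read_parquet("card_transactions.parquet")
monthly_spend = (
    transactions
    .assign(month=pd.to_datetime(transactions["booking_date"]).dt.strftime("%Y-%m"))
    .groupby(["customer_id", "month"], as_index=False)["amount"].sum()
)

# Store the derived knowledge as part of the data storage, not only in a report.
monthly_spend.to_parquet("monthly_spend.parquet", index=False)
```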

– Jonas Röttger, ESR

Volatility Precedes Standardization

The financial services ecosystem is experiencing innovation at breakneck speed, as can be seen within the walls of fintech-heavy startup incubators such as TechQuartier. Regulations – PSD2 and MiFID II for instance – from the topside constrain the direction of innovation, usually with consumer protection as the driving force. However, a third force is equally in play: standards. Standards are independent from regulations, in that “regulations stem primarily from a top-down approach, while formal standards are typically the result of a market-driven process (Büthe and Mattli, 2011)”1.

Ecosystems are groups of interacting firms, where interaction is largely of collaborative and/or interdependent natures2. The standards of interaction especially in digital ecosystems are critical: APIs must be able to interact, programming languages must be mutually intelligible, the data that certain services rely on to provide value must be created and packaged in workable ways, and so forth. A lack of adherence to these standards would mean, for instance, that the smartphone in your pocket that you use for mobile banking, equities trading, and payment processing would figuratively fall apart.

However, standards, especially in uncertain environments, take time to formulate. During this process, multiple parties might attempt to control the outcome of standardization, likely in their and their stakeholders’ interests. As Dr. Philipp Tuertscher commented in a recent FINDER meeting, standards are fairly mundane once enacted, but their formation is highly political and an interesting phenomenon to observe.

An easy opportunity to watch this process is in the standardization of corporate ESG data reporting for investors. In the financial services ecosystem, this is a huge step ahead of MiFID II’s full implementation. In this essay, we’ll briefly cover these terms and discuss what’s happened thus far, which will set a baseline for a future series of essays covering key events, lessons learned, and theoretical takeaways from data collection during the ESG data standards-setting process.

ESG Data

ESG stands for environmental, social, and governance. This category of data has experienced a proliferation of importance alongside corporate social responsibility, or CSR, initiatives. The three subcategories of data, when considering a company, cover aspects such as gender wage gaps, environmental waste protocols, and anti-corruption protections.

Until recently, the disclosure of ESG data has been generally voluntary, with some exceptions. As such, industrialized ESG data production itself has not been a heavily regimented practice, so it has largely been the efforts of NGOs, watchdog groups, investor discretion, and so forth that have pushed companies to publish ESG data. ESG data has also not been directly monetizable (a familiar trait of all so-called “alternative data,” a category to which ESG has historically been ascribed).

However, self-generated reports of CSR performance are riddled with inconsistencies and gaps for obvious reasons. To address this, agencies and companies such as MSCI and Sustainalytics began publishing independent ESG ratings on mostly publicly listed companies, and over time this practice has gained enough importance with institutional investors and asset managers that there are now entire ecosystems of sustainability ratings agencies, most of which have their own unique methodologies and outputs.

While the proliferation of ESG reporting is generally good, there are obvious problems. Asset managers on the hunt for comprehensive data regarding a given publicly listed firm’s CSR performance are confronted with a blurry landscape of reporting and rating methodologies. Even as the agencies consolidate over time, the lack of industry-wide standards in ESG data reporting has asset managers significantly concerned over the loss of reporting and rating quality.3

Regulations & Standards

“No classification system currently exists at EU level which clarifies what constitutes an environmentally-sustainable economic activity. Market-led initiatives that have emerged in recent years are not comprehensive enough and do not sufficiently reflect all EU environmental sustainability priorities.” – European Commission

The European Commission has introduced MiFID II, a sustainability-incorporating revision of the original Markets in Financial Instruments Directive from earlier this century, and a battery of sustainable finance directives installing, among other things, a taxonomy of sustainable economic activity. This combination pushes asset managers and institutional investors to bring ESG data closer to the core of their and their clients’ financial decision-making and affairs.

The benefits of a clear and concise data reporting methodology, which is only one of the foci of this push, are clear. It takes the burden of figuring out which metrics matter off asset managers and institutional investors, it allows companies all along a value chain to assess each other and exclude any proverbial bad apples, and it in turn gives the end consumer the ability to knowledgeably avoid below-threshold products and services.

However, as these initiatives come from a regulating body, they’re a bit top-down, and therefore the problem of standardization remains: who is in control? Who gains from the way this will eventually pan out? Who loses? These are just a few of the vivid questions that we ask as this process wears on.

Through interviews, participant observation, and content analysis, interesting angles of the standardization process will become apparent. We hope that these will have theoretical implications that go beyond sustainable finance, so please follow the FINDER blog and feel free to weigh in with your own insights – all perspectives are welcome. I can be reached for questions and comments at s.ellis@fm.ru.nl.

– S. James Ellis, ESR


  1. Blind, K., Petersen, S. S., & Riillo, C. A. F. (2017). The impact of standards and regulation on innovation in uncertain markets. Research Policy, 46(1), 249–264. https://doi.org/10.1016/j.respol.2016.11.003
  2. Jacobides, M. G., Cennamo, C., & Gawer, A. (2018). Towards a theory of ecosystems. Strategic Management Journal, 39(8), 2255–2276. https://doi.org/10.1002/smj.2904
  3. Avetisyan, E., & Hockerts, K. (2017). The Consolidation of the ESG Rating Industry as an Enactment of Institutional Retrogression. Business Strategy and the Environment, 26(3), 316–330. https://doi.org/10.1002/bse.1919

Open Banking – an opportunity within grasp

Disclaimer:
The content of the FINDER blog is not an expression of Commerzbank AG, nor created on behalf of Commerzbank AG. The content is created and contributed by private persons.

On 5 November 2020 we tuned in to the Open Banking Summit held by Commerzbank in cooperation with the Business Engineering Institute St. Gallen. Based on this use case, we will have a look at the Summit’s keynotes to see how the realisation of Open Banking is progressing – which opportunities may arise and which challenges remain.

Open Banking became more prominent in 2016 when the United Kingdom announced its Open Banking Standard and the European Union published its Revised Payment Services Directive (PSD2). However, it only gained momentum in 2018 when these drafted pieces of legislation came into effect. Simplified, these laws require banks to open up their IT infrastructure. Technologically, this is done through application programming interfaces (APIs), which allow different IT infrastructures to communicate with each other. In the case of Open Banking, APIs enable third parties to connect to banks’ existing IT infrastructure and thereby access and use the data gathered – say goodbye to data silos guarded by banks.
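For illustration only, this is roughly what a third-party provider’s call against a PSD2-style account-information API could look like. The base URL, headers, and response fields are placeholders and do not describe any specific bank’s actual interface.

```python
# Minimal sketch of a third party reading account data over a PSD2-style
# account-information API. Endpoint, headers, and token are placeholders.
import requests

BASE_URL = "https://api.examplebank.com/psd2/v1"  # placeholder sandbox URL
ACCESS_TOKEN = "..."  # obtained after the customer's explicit consent (e.g. OAuth2)

response = requests.get(
    f"{BASE_URL}/accounts",
    headers={
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "X-Request-ID": "3dc3d5b3-7023-4848-9853-f5400a64e80f",  # example request UUID
    },
    timeout=10,
)
response.raise_for_status()
for account in response.json().get("accounts", []):
    print(account.get("iban"), account.get("currency"))
```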

The backbone of the Summit was a whitepaper, The Future of Collaboration in Corporate Banking, in which Joerg Hessenmueller (Commerzbank AG) defined

“API [as] a crucial technology that enables communication between IT-systems with enough flexibility to address the complexity of today’s world [based on] closer collaboration among different parties leveraging on their different capabilities to create value for the customer”.

Resulting from that, one can draw the conclusion of David Kauer (PostFinance AG) that

“Open Banking is a fundamental strategic and architectural question. Banks do not just do Open Banking – Open Banking is a framework that requires a 360-degree view of business and corporate clients and their needs. Banks, thus, have to decide wisely about the order of actions they take to follow such an approach.”

So what has been achieved so far?

As the Commerzbank use case depicts, cooperation is key to identifying and leveraging the options available. Slowly, new networks are emerging. First attempts at opening up have been made, though these are still in their infancy. An example is the developer portal: this sandbox provides developers with documentation and the option to play around and get used to the APIs provided by Commerzbank. Looking at the opportunities and challenges, however, it is clear that this is only a small first step in the right direction.

What are the outstanding opportunities?

The approach envisioned by PSD2 is to fundamentally change banking in the European Union. Its implementation is aimed at enhancing the value proposition of financial organisations. The basic framework is set to achieve a higher degree of cooperation and co-innovation between banks and third parties, for example FinTechs. This is highly dependent on the ability of banks to think beyond their organisational borders. If this outward opening happens, the most valuable opportunity can be realised:

Building a new digital ecosystem marked by new business models and driven by customer expectations.

Technologically, such an ecosystem would be enabled through the opening of banks’ APIs. Cooperation and innovative ideas could facilitate user value by enhancing consumer protection and the security of internet payments as well as account access within the EU and EEA. Accordingly, the opportunity for customers is access to enhanced services within one digital ecosystem. Such services would greatly enhance banks’ attractiveness by increasing their value proposition. At the same time, FinTechs have the opportunity to grow by gaining access to a greater market reach or even providing the B2C side of banking. Another actor in such an ecosystem would be BigTechs which, according to David Kauer, could take the role of technological orchestrators. In that case, banks would probably occupy the B2B part of such a B2B2C banking ecosystem. To avoid being pressed into the role of an anonymous back-office service provider, banks have to seek a pro-active role. In general, then, Open Banking should not be understood by banks as a threat or a zero-sum game but as an opportunity. In that sense, all actors would profit in the banking B2B2C ecosystem.

Which challenges is the industry still facing?

However, the transformation is still facing challenges that need to be tackled for a digital ecosystem to emerge. As the banking sector opens up for everyone offering financial services, a mindset of collaboration is important. Customer centricity should be the focus, flanked by the provisioning of the necessary infrastructure – for example, in innovation labs. An optimal setup is completed by a bank’s readiness to identify partnerships and then leverage resources to seize the presented opportunities.

Technologically, there are still some hurdles that hinder a collaborative approach to adapting to structural change. Technological readiness is one challenge: the adoption of key technologies differs strongly across the industry and may, in its current state, make collaboration more difficult. Tightly connected to this is the missing standardization of APIs – heterogeneous architectures for the same services make fast and approachable cooperation across organisations fairly difficult.

The future will show whether all potential actors can overcome these challenges and thus provide the necessary prerequisites to foster an ecosystem marked by innovative ideas combined with industry-specific know-how.

–  Jonas Geisen, ESR

FINDER hosts the SMS Berkeley digital PhD workshop on Reshaping Firms in Digital Ecosystems: Designing the Future

Thursday, November 5, 2020 
18:00 CET (UTC+1) / 09:00 PST (UTC−7)
Virtual Workshop on Zoom

This workshop was originally intended to take place during the SMS Special Conference in Berkeley “Designing the Future: Strategy, Technology, and Society in the 4th Industrial Revolution”, which was canceled due to COVID-19. However, as we all learn to adapt to the challenges presented by this health crisis, we are proud to offer you this workshop in a different and virtual format.

We relabeled the title to “Reshaping Firms in Digital Ecosystems: Designing the Future”.

With the rise of digital technologies, new demands and challenges have emerged that require the attention of practitioners and scholars alike. While the world has grown familiar with digital ecosystems as a platform for future growth, little is still known about the ways firms restructure, reshape, and adapt – proactively or reactively – in response to sudden disruption of the emergent digital ecosystems they are part of. The doctoral workshop will have an interactive intent and will also reflect on the impact of COVID-19 on strategy and innovation: how resilient are we when facing a crisis?

The main objective of the Doctoral Workshop is to foster interaction among leading faculty scholars and doctoral students on various aspects of research and preparing for a professional career in academia. The doctoral student participants will be offered the opportunity to broaden their academic network with senior faculty from around the world and develop a better understanding of the particularities of the academic career. 

Leading researchers in the field brought their experience into the discussion to further develop participants’ insight into the key themes of this conference.

Participants will be able to pitch their research, practicing how to anticipate, critically reflect on, and nuance their contribution as part of the academic debate. The intent is to strengthen and effectively position their research (e.g. the research question, the importance of the research gap, theoretical reasoning, and selection of target journals). Throughout the workshop, they get a chance to present their work and engage in a constructive dialogue with senior faculty and peers.

What is preventing incumbent banks from monetizing their data?

Banks are often described as possessing a huge pile of customer data but being unable or unwilling to leverage it. We confronted five industry experts with this statement, asking what is hindering banks from monetizing their rich data reservoirs. Here are their answers and recommendations on how banks could overcome these hurdles.

 IT legacy – banks’ IT systems are not in shape to allow state-of-the-art data analytics

An often-described hurdle to leveraging data is the legacy IT systems of incumbent banks. While the mere size of the data banks own could be a rich resource, the IT systems are not (yet) consolidated data pools that can provide information. Even in collaboration with fintech companies that have developed efficient algorithms to perform smart data inquiry, implementation often fails after a successful proof-of-concept stage. The data is not structured and stored in ways that allow for relevant and timely data consultation. So, where to start?

The unique data banks hold is spending data. One expert recommendation is to stratify spending data according to customer demographics for a time horizon of the past five years. Some experts recommended that effective and efficient usage of data would only be possible if banks built new systems from scratch and subsequently migrated carefully selected data (e.g. the last five years).
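A minimal sketch of the stratification the experts describe, using pandas; the file names, column names, demographic fields, and the five-year cutoff are hypothetical assumptions for illustration only.

```python
# Minimal sketch: spending over the past five years, stratified by customer
# demographics. All table and column names are hypothetical.
import pandas as pd

spending = pd.read_parquet("spending.parquet")      # transaction-level data
customers = pd.read_parquet("customers.parquet")    # demographics per customer

spending["booking_date"] = pd.to_datetime(spending["booking_date"])
cutoff = pd.Timestamp.today() - pd.DateOffset(years=5)
recent = spending[spending["booking_date"] >= cutoff]

strata = (
    recent.merge(customers, on="customer_id")
          .groupby(["age_band", "income_band", "region"])["amount"]
          .agg(["count", "sum", "mean"])
)
print(strata.head())
```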

Talent turnover – culture and demands are not attractive for young high potential IT workforce

Banks’ IT systems present opportunities for young and ambitious IT workers: they are embedded in huge, well-paying organizations and require plenty of work. While banks communicate externally that they are particularly looking for IT employees with a disruptive mindset, the reality is often very different: a highly regulated and risk-averse culture is skeptical of incrementally built and improved IT solutions. No IT system is released flawlessly today. Systems are optimized, catered towards customer needs, or improved in terms of security standards while they are already in the market. Banks expect a bullet-proof solution from the get-go. In addition, banks are not particularly interested in functionality that does not yet have a clear use case. Industry expertise is needed in combination with data analytics skills to develop promising use cases that appeal to strategy-setting executives. This represents a key to stretching banks’ risk-averse culture and providing young IT employees with interesting challenges.

Value chain positioning – highly-regulated back-end vaults vs. life-fulfillment platforms

Big tech companies are entering the financial services market. While companies like Apple and Google are partially interested in gaining access to spending data via financial products, their main interest is to extend their portfolio with yet another revenue stream. However, because of their data analytics skills and their business model, tech companies can offer a level of convenience and pricing (e.g. freemium) banks are unable to provide. The question is whether banks are willing to play the role of highly regulated institutions that manage the back end of financial services while tech companies own the customer relationships.

Tech companies are increasingly becoming targets of supervisory and regulatory bodies (especially in Europe), and it is at least unclear whether they are motivated to become as regulated as banks. This represents a competitive advantage for banks, which are well accustomed to thriving in a regulated environment.

Moreover, if banks want to act proactively to defend their customer relationships, data analytics is necessary to design platforms that offer financial services going beyond today’s banking products. A banking platform should provide internal, external, and integrated financial services that facilitate everyday life (e.g. buying public transportation tickets) or rare, life-changing financial decisions (e.g. buying property). The challenge is that not every bank can turn into a platform, given that platform economics usually produce natural oligopolies.

Data monetization can be direct or indirect – which path to choose?

Direct data monetization refers to trading data in exchange for value, whereas indirect data monetization refers to using data to enable, improve, or maintain revenue streams (without trading the data itself). While trading data could be lucrative for banks on a short- to mid-term scale, it could also jeopardize their reputation as highly trusted institutions. Hence, pursuing indirect data monetization by using customer data to design tailor-made solutions seems to be the golden route. However, for services and solutions to be highly relevant in content and timing, banks still have a long way to go.

by Jonas Röttger, FINDER ESR