For institutional investors in Sweden

Global Perspectives: Addressing the most essential questions around AI

In this episode, Portfolio Manager Denny Fish takes a deep dive into the current state of artificial intelligence (AI), including the latest advancements, its potential to propel economic growth, and the rise of agentic AI and its impact on software business models. He also shares insights from a recent research trip to China.


6 May 2026
28 minute listen

Key takeaways:

  • Despite market skepticism about the longevity of the AI theme, we have seen step-function improvements in AI models, with new models evolving rapidly and continually leapfrogging those of competitors.
  • This ongoing advancement is a key driver of aggressive capital expenditure (CapEx) by companies, the benefits of which are just beginning to emerge.
  • Compared to previous technology cycles, AI is diffusing across industries much more rapidly; as a result, we believe the impact it will have on economies, society, and global security will be much more profound.

IMPORTANT INFORMATION

Artificial intelligence (“AI”) focused companies, including those that develop or utilize AI technologies, may face rapid product obsolescence, intense competition, and increased regulatory scrutiny. These companies often rely heavily on intellectual property, invest significantly in research and development, and depend on maintaining and growing consumer demand. Their securities may be more volatile than those of companies offering more established technologies and may be affected by risks tied to the use of AI in business operations, including legal liability or reputational harm. 

Technology industries can be significantly affected by obsolescence of existing technology, short product cycles, falling prices and profits, competition from new market entrants, and general economic conditions. A concentrated investment in a single industry could be more volatile than the performance of less concentrated investments and the market as a whole. 

On-premises (“on-prem”) software is installed and operated from a company’s own physical servers and data centers, rather than in the cloud.  It provides complete control over data security, customization, and maintenance, making it popular for high-security environments. Key benefits include data sovereignty, while downsides involve high upfront costs, and responsibility for all updates. 

S&P 500® Index reflects U.S. large-cap equity performance and represents broad U.S. equity market performance.

Dan Block: Hello, and thank you for joining this episode of Global Perspectives, a Janus Henderson podcast created to share insights from our investment professionals and the implications they have for investors. I’m your host for the day, Dan Block, Equity Client Portfolio Manager.

Markets have been anything but quiet. AI optimism continues to shape equity leadership, while rapid technological change is raising new questions around software business models, capital spending, and global competition. Beneath the surface, investors are trying to separate durable innovation from short-term hype and understand where the economic impact of AI will truly emerge.

To discuss what’s really driving technology markets today and what matters most looking ahead, I’m thrilled to be joined by Denny Fish, Portfolio Manager of the Janus Henderson Global Technology and Innovation Fund and our Technology Sector Lead. Denny also recently returned from China, offering a first-hand perspective on how the global tech landscape is evolving. Denny, thank you for being here.

Denny Fish: Yeah, it’s my pleasure.

Block: So, from where you sit today, what is the single most important thing investors misunderstand about the current technology environment?

Fish: Yeah, I think it is interesting if we think about what’s happened in the last three years in terms of where the leadership’s been for technology investing: It really has been rinse and repeat each year. And what I mean by that is, it’s been AI infrastructure, in various forms. It clearly started out with GPUs in the ’23 timeframe, but then it really expanded into data center infrastructure, liquid cooling, electrification, memory, optics, power, kind of everything associated with building out and laying the groundwork for AI.

And there’s constantly this narrative in the market of, when is this going to end? When is it going to roll over? And I think the single biggest thing that is missed by those that have not participated in this, or are not participating, is that, if anything, each year we’ve continued to see advancements far in excess of what I would have imagined. And it’s true right now. If you look at the advancement in the models, whether it’s OpenAI 5.3 and now 5.4, or Claude Opus 4.4, then 4.5, and now 4.6, or even, you know, Meta finally getting their act together with their Muse Spark model, we’ve just seen a lot of step-function advancements, and that’s why you’ve witnessed the desire to invest.

Like, let’s think about what the market was doing before the war. Okay, before the war, it was rinse and repeat: AI infrastructure very strong, other areas of tech weaker, particularly software, which tends to be in the eye of disruption. And then we had the war, things sell off a little bit, we start to get a little more comfort there, the market does anyway in the near term. And what comes right back? It’s that leadership. And I think that’s what the market continues to underappreciate: the exponential, step-function changes that we’re seeing in these models as they continue to advance and leapfrog each other. And that is very, very healthy.

Block: You know, within the models, and I think maybe it’s a little bit misunderstood out there, these advances you’re talking about, all of them thus far, until the most recent Claude Mythos model, have been trained on Nvidia’s older Hopper architecture. The new Claude was the first one fully trained on Blackwell. What does this mean going forward?

Fish: I think it’s a really good point. And obviously we all speculate because they don’t actually disclose exactly what each one was trained on. But if true, which we tend to believe it is, Mythos being trained on Blackwell is a huge step-function change, just continuing to support the idea that this is still a compute-driven problem, and that covers a lot of areas of compute.

And if we even look at the latest iteration of OpenAI’s models, those were clearly partially trained on Blackwell as well. And we also saw step function changes with things like Codex, which now competes very effectively with Claude code, for example. And so, it’s just more evidence that the more compute you can throw at the scaling laws in an efficient way, we continue to get excess output, right? And as long as that holds, these companies will continue to invest aggressively, because not investing is not optional at this point for anybody, because the pace of change continues to accelerate, if anything.

Block: And so, we’re talking about that investing and this CapEx spend that’s going on. When does this shift to actual economic return? Are we seeing it? And what evidence do we need to see to confirm these economic gains?

Fish: I’ll point to three things. Widely reported, Anthropic’s doing $30 billion in annual recurring revenue, okay? Up meaningfully from like [a month ago], or even an $8 or $9 billion run rate at the end of 2025. OpenAI is doing somewhere in the $25 to $30 billion range. Now, they account for things differently, gross versus net, and so there’s some nuances, but nonetheless, these are exponential growth curves.

And then I would point you to Amazon’s shareholder letter that Andy Jassy just put out, where he clearly defines the returns that Amazon is getting and expects to get on the compute. You can look at the ramp of things like Copilot at Microsoft. I think maybe the two biggest examples that everyone listening can appreciate are AI Summaries from Google with Gemini; they are monetizing that much more effectively. And boy, it’s just a better experience. It’s great.

If you look at what Meta’s doing internally, both the ability to create content with AI, which is really accelerating the amount of content that gets put onto the platform, for consumption as well as for ads, and the targeting they can do with the investments they’ve made into artificial intelligence, you’ve seen their ARPU [average revenue per user] and engagement continue to go up. And so, for those that suggest these companies are not getting ROI [return on investment], I would argue exactly the opposite, because you can look at all the companies that are spending the most on CapEx and point to very tangible benefits they’re getting. And we’re early, really early.

Block: So, you’re saying my neighbors will even more so keep telling me how their phone is listening to them?

Fish: Possibly. I mean, that is one of the things, like, if you think about the devices that have been reported to be developed right now, there’s a lot around this idea of ambient computing, where should you choose, you have a device that’s always listening and is there to take actions on your behalf or collect data on your behalf to improve your life, right?

Block: So, let’s shift over to agentics. We’re hearing a lot about this. Explain what is meant by that, so everyone understands what an agent is, and what role can these tools play in helping users enhance their productivity?

Fish: Yeah, so the fundamental difference is … I’m assuming everyone listening has used ChatGPT or Claude or Gemini and you have a text box that you put your prompt into, and you ask it a question, and then you get an answer back. And that seems very basic, but it’s actually very hard, you know, to train that model to do that. But that was like the big kind of AI moment in November 2022 that we were able to do that. Since then, we’ve advanced, you know, AI to the point where it can reason and it can perform what are called agentic tasks. And what that means is, you’re not putting in a prompt into something like ChatGPT; you have effectively instructed an agent to go out and act on your behalf in the background.

And so, agents, and you can have many of them: Whether you’re at work and they’re performing many of the tasks that maybe were routine, repeatable, rules-based, rote things, an agent can just be in the background taking care of that, right? And serving up the most important information that requires human judgment at the time it’s needed. And, you know, for consumers, think about an agent constantly working in the background for you, adjusting your calendar, checking on flights because maybe you have a trip coming up. The agent knows when to notify your wife if you’re going to be late, ahead of even you knowing it, because it can tell where you are at a point in time.

There’s so many things, and why this is really important is because the agents can reason and they can take action based on that reason and that’s why it’s so powerful. Now, there are a lot of implications and a lot of things we have to think about in terms of governance and guardrails and security, and we can come back to that. But 2026 is firmly the year of the agent. We have moved from the chatbot to the agent. And if you think about how much more effective an agent can be in the background operating on a human’s behalf, they don’t work from 9 to 5; they work 24-7. And if you think about, if you have three or four agents per person, and I think that’s bare minimum, as we’ll move forward, and those communicating with the agents of everyone else, there are networking laws that, as each node of the network gets added, it’s exponential in terms of the amount of traffic that goes back and forth.

Once again, reinforcing the idea of the amount of compute infrastructure that’s going to be required to perform what’s called inference. And that’s where the agents actually take the action and where the agents actually perform that action on the compute relative to these big training clusters that are used for training the models.

Block: Well, so within those agents, and a lot of fear has come over all software ever since Anthropic’s December release of Claude Code. How does AI disrupt the software economics, and where do you think maybe it’s mostly cosmetic?

Fish: I think it’s really hard. I think it’s easier to paint the disruptive case just because, when you watch how powerful these tools are, how quickly they can spin up a bespoke application or workflow, I mean, that’s what teams of developers used to do over months that, you know, Claude Code or Codex from OpenAI can do in days, if not a day. And so that in itself is a big threat, because the barriers to software development are now zero. They are effectively zero. It is the cost of electricity and compute to support the model. That’s your cost for software development. And so, in a world, you can see the number of applications … it doesn’t mean software’s dead. It just means the barriers to entry are so low now that you’re going to see the number of software applications explode worldwide. And that just creates a lot more competition, it makes the terminal values of the companies more uncertain.

And then to your point, you used the word cosmetic. I think the companies that have a chance, the legacy companies or incumbents that have a chance to get to the other side have maybe a number of important attributes. Number one, they clearly need to be some sort of really sticky system of record that’s not going to go away anytime soon because it handles the general ledger, supply chain, or the most important elements of a business. But it’s got to be more than that; it’s got to be a system of action as well, meaning that the systems require agents to actually interact with it on an ongoing basis. Because if they’re really low-touch systems, they effectively just become a dumb database. And there’s not a whole lot of value in being a dumb database.

And so, I think those companies that have really valuable data, systems of record, are actively used, and agents can consume that data and get a lot of value out of it, and the companies that are that system of record and system of action and can develop agents to actually perform the function that all these bespoke software applications will perform, those are the ones that have a chance to get to the other side. But right now, we’re taking what I call a rifle shot approach in software versus a shotgun approach.

Block: Sure. So, what would you say then is the biggest risk in software? I hear people talking about, oh, this company is just going to use a bunch of agents and write their own code. Or is it really what you touched on, of just the ease of new software companies entering the arena?

Fish: Yeah, I think it’s the latter. I mean, most companies, while they can write their own applications … I mean, we can go back 40 years of software development, and companies have always written their own applications, but they’ve only written their own applications because there’s been nothing else available off the shelf. Once there’s something available off the shelf, they’ll use it. It’s because someone needs to be managing the security, the compliance, the governance, all the things. There could be statutory changes that are required to get updated through systems like HR and financials, for example. And so, enterprises just want to buy software, they want it to work, they want it to be efficient, they want to get a lot of value out of it. And that’s what these tools are now allowing just a lot more companies to create that type of software.

Block: So, as we approach the end of this year, nine months from now, what should we be measuring in software companies that most people are currently ignoring?

Fish: I don’t know if it’s what people are ignoring, but most importantly, software companies need to prove a few things. One is what their contribution to growth is from agentic applications that augment their core base. And then they also need to convince investors what the new operating model looks like. Because, you know, with on-prem [on-premises] software, you ship software out at 99% gross margin, and then you charge maintenance on it, which was also, you know, a high-margin business, though it had a high cost of sale associated with it.

And then SaaS was kind of 80-ish percent gross margins. That was because you had data center costs and you had to pay for, say, a database license and things like that. And so, the operating margins kind of got to the same ballpark, maybe a little bit less for a lot of them than on-prem, but your gross margins compressed by about 2,000 basis points. Now, one of the fears is, if you actually have to use third-party models and token costs go up a lot, maybe the gross margin profile of these businesses is, I don’t know, then, like 65% to 70%.

And now let’s say that to sell your agentic applications, you need very expensive forward-deployed engineers. Maybe that means this is like a 55% to 60% gross margin business. And I’m going kind of extreme here, but nonetheless, they need to prove they can actually monetize AI, that it’s incremental to a core business which isn’t deteriorating, and then show investors what the economic model is going to look like, for people to get comfortable again with why software was a great business for 30 years.

Block: So, I think I know the answer to this, but there was a lot of disruption when we shifted from that on-prem model to the SaaS model. And we’re going through a lot of disruption now to new models. Are there going to be really strong opportunities within the software space as we go forward, whether it’s near term, probably a little bit longer-term in nature?

Fish: So, maybe. We don’t know. I think one of the differences is … one of the shortcomings of on-prem software was the way you updated it, the customization, and the data sat with each customer inside their four walls. If you think about SaaS, it’s all updated together. You should be able to layer agentic into it at your own pace. All the data is centralized. So even though the customers own the data, the data is sitting within the application of the software provider. Those are kind of natural advantages that an incumbent would have.

On the flip side, during every transition, whether it was mainframe to minicomputer, minicomputer to PC, PC to internet to cloud, there’s just been a new breed of winners, because you fundamentally need a new architecture. And it’s the classic innovator’s dilemma: It’s easier to create that architecture if you’re starting from ground zero versus trying to, as I call it, change a flat tire going 60 miles an hour. It’s just really hard to do. Maybe there are a few people that can do it, but not too many.

Block: Shifting over, we’ve talked a lot about software, so maybe shifting over to semiconductors. Many people think of semiconductors as one consistent blob, treating every company within the ecosystem the same. Is there risk in assuming that every layer of the semiconductor ecosystem participates equally?

Fish: Completely, because you have different industry structures at every layer. Nvidia has been in this fortunate position that they’re primarily the only game in town for the highest-performing GPUs for doing training and, you know, some degree of inference. You’ve got certain use cases with ASICs [Application-Specific Integrated Circuits], and there have been a few companies like Google that have been developing their own silicon, and that’s great. That continues to scale. You look at memory; it’s a very rationalized industry: Three players dominate. And as a result, it takes a long time to bring capacity on, and it’s generally pretty rationalized how you bring it on. So that can go on for some period of time. You have subcategories, like certain areas of optics and lasers, that are dependent on certain materials; there’s only so much material and so much manufacturing capacity, and that creates a bottleneck.

And so really, the way I think about it is, how significant is the bottleneck? How long would it take to bring on capacity that would relieve that bottleneck, given our assumptions for what growth is going to look like? And that’s the thing: Capacity for most of these subsectors can’t keep up with the amount of growth. And so, if you’ve got low competitive intensity and an extreme bottleneck, so supply is very short and you only have one or two players, or maybe three, that means you’re going to have pricing power. And that’s what we’ve seen. We’ve seen pricing go up a lot in DRAM [Dynamic Random-Access Memory], particularly because of HBM [high-bandwidth memory], but also core DRAM. We’ve seen it in NAND. We’ve seen it in optics. And it’s just because they’re all constrained, even more so than GPUs were constrained a year ago. They’re going through that cycle.

And then if you want to bring it all back, at the end of the day, everyone can ask for as much as they want, but we’re only going to get as much as TSMC can produce. And then to a lesser extent, Intel and Samsung, whether it’s memory or, you know, third-party chips at Samsung or CPUs, which are increasingly becoming more important in an agentic world, where we thought it was all a GPU world. Now, you know, there’s a bull case for CPUs now, too. And the thing that’s interesting about that is, if you’re TSMC, you want to allocate as much capacity as humanly possible to GPUs because they’re a lot more expensive. You make a lot more money, okay? And minimize the amount that you actually allocate to CPUs or mobile processing chips because you don’t make nearly as much.

Block: So, you talked about the amount of growth. And every once in a while, I get questions from clients worried about the overbuild of everything. And it seems to me that the market is kind of underestimating the durability of the growth involved in this. Would you agree with that?

Fish: Yeah, I would. And I think there are two reasons for that. One is just the nature of scaling laws continuing to go exponential. And now the move to agentic, just that in itself is going to put a lot of pressure on everyone for more compute infrastructure. I mean, effectively, what we’re talking about is a human economy augmented by compute, right? I mean, that’s where we’re going. And then at the same time, you can only move so fast because of power, because of labor, because of foundry capacity. Now we even have NIMBYism around data centers in certain communities and so forth.

And so, the pace that you can move at, you just can’t overbuild in a reasonable period of time. This is so different than the dot-com bubble, where it was so easy to overbuild because you could lay down dark fiber anywhere you had right-of-way. You could throw a cell tower up pretty much anywhere. I mean, I could have put a tower on my roof and started charging rents if I wanted to at that point in time. It’s very different here. It’s much more complex. The capital needed, the labor constraints, energy. There’s just a lot that’s going to create a natural governor on just how fast this can build out.

Block: So, shifting gears again, and I mentioned in the intro, you recently made a research trip to China. What stood out most to you on that trip that maybe you didn’t fully appreciate before you went?

Fish: Yeah, number one, how far the Chinese models have advanced. Now, it’s important to note the reason they’ve advanced the way they have is because they do what’s called distillation on Western models. They also access GPUs through offshore data centers for certain aspects of what they’re doing. Those two things combined are allowing them to move at a fast speed, not nearly as fast as the Western models, but there are a lot of really good models accomplishing a lot of good things over there. And so that was fascinating. They’re making a lot of progress. Still years behind on chip design and chip manufacturing, but nonetheless, they continue to have strong ambitions there. And there are many companies that are funded and just trying to possibly get somewhat as efficient as, say, a five-year-old Nvidia Hopper GPU, right? That’s kind of the ambition, which would be helpful for them.

But I think the bigger takeaway was seeing the progress that China’s making in physical AI: driving around in robotaxis, seeing the progress they’ve made on autonomous driving, which looks and feels a lot like a Waymo, and then robotics. And because they’re a manufacturing-driven economy, there is no shortage of companies focused on every bottleneck within the supply chain that can actually be addressed with robotics. And then there are their ambitions around humanoids, much like Tesla’s ambitions around Optimus, and what that could mean as well. Even though it’s years off, they’re moving as fast as we are, if not faster.

Block: Okay, and then to finish off, you touched a little on the difference between the current environment and the tech bubble. But what do you think is the biggest distinction between today’s platform shift to AI and prior platform shifts, like mobile or cloud? You know, there have been numerous big tech regime changes that have been incredible investment opportunities. What’s different as you see it today?

Fish: I think two things are fundamentally different. With every one of those technology cycles, each cycle created a much bigger technology market, and it diffused faster than the prior cycle. And that is exactly what’s happening right now with AI. It’s diffusing much, much faster, and across every industry. The dawn of the commercial internet and cloud moved fast in some ways and slow in others; this is moving really fast, and it’s going to continue to move fast. I think that’s the fundamental difference. And also, this is more profound in terms of the impact it’s going to have on economies, society, and global security.

Block: And we haven’t even truly seen it across the S&P 500, for example, across companies using it to improve their efficiency, improve their margins.

Fish: Yeah, and we’re just seeing it. We’re just seeing it. It’s starting to happen.

Block: Denny, thank you so much for being here. It’s what I appreciate about our research effort here at Janus Henderson: We’re not making top-down bets on themes. Our focus is really understanding what’s actually changing and identifying the companies best positioned for that change. Your perspectives on AI, software, and what you observed firsthand in China are great examples of that approach, and we really appreciate you sharing them with the audience.

Fish: My pleasure. Thanks for having me.

Block: Hopefully you enjoyed the conversation. And as technology continues to evolve and markets become more discerning, consider how active fundamental research can help separate durable winners from short-term excitement as we continue into 2026 and beyond.

For more insights from Janus Henderson, you can download additional episodes of Global Perspectives wherever you get your podcasts or visit Janushenderson.com. I’ve been your host for the day, Dan Block. Thank you for listening, and we’ll see you next time.

These are the views of the author at the time of publication and may differ from the views of other individuals/teams at Janus Henderson Investors. References made to individual securities do not constitute a recommendation to buy, sell or hold any security, investment strategy or market sector, and should not be assumed to be profitable. Janus Henderson Investors, its affiliated advisor, or its employees, may have a position in the securities mentioned.

 

Past performance does not predict future returns. The value of an investment and the income from it can fall as well as rise and you may not get back the amount originally invested.

 

The information in this article does not qualify as an investment recommendation.

 

There is no guarantee that past trends will continue, or forecasts will be realised.

 

Marketing Communication.

 
