On 18th September, T-Mobile US hosted a Capital Markets Day at which it shared a number of interesting perspectives on the industry and on applying new technology to its operations. Now that we’re a few weeks down the line, I’ve had time to digest some of the discussion topics and to triangulate with, for instance, my attendance this week at MWC Las Vegas.
I don’t want to go over how bullish T-Mobile is regarding revenue targets, free cash flow and so on, other than to note that the financials look good; I’m not a financial analyst, so that’s not really what I look at. Nor do I really want to discuss fixed or mobile customer growth. It’s a neat turnaround story for T-Mobile overall, but I’m more fixated on the use of disruptive technology.
I’m focused on what’s new and interesting. If T-Mobile has the mission to have the best network in the world, what does that mean for its ability to address IoT and other advanced services? Having the best spectrum and network assets would be a waste if it were just to support Fixed Wireless Access (important though I’m sure those services are).
Much of the coverage of the event has focused on AI, not least because of the invited guests, Jensen Huang and Sam Altman. This was certainly one of the more intriguing areas explored. There were three critical elements, which I’ll take in turn.
One of the particularly interesting seams of discussion during the event was the idea of ‘customer-driven coverage’. T-Mobile has divided the US into hexagons ranging from 165 metres in rural areas to 27 metres in urban areas. (Any recovering Civ 6 addicts will be getting itchy mouse fingers at this point.) The aim is to map customer experience, and the impact that network capabilities have on it, in a highly granular way, with a subsequent extension into using AI to do predictive modelling of outcomes and then feeding that back into network build decisions, all in a targeted way based on customer impact and commercial implications.
The logic is inescapable. Rendering the decision-making process down to the micro level means that resources are more effectively deployed. While this might be couched in terms of improving customer experience, the key aspect is that it is done in a highly targeted way. Providing perfect mobile network coverage is easy: you just place a transmitter every 100 metres. Doing it in a cost-effective way is more difficult and inevitably requires targeting. Everyone does that. This is an even more granular approach, based on business outcomes rather than simply where people are.
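As a rough sketch of what hexagon-level, outcome-based prioritisation could look like, here’s a minimal example. The cell IDs, metrics and thresholds are hypothetical and this is not T-Mobile’s actual model; the point is simply that aggregating experience per hexagon and ranking by commercial impact is a small, mechanical step once the data exists.

```python
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class ExperienceSample:
    hex_id: str              # ID of the hexagon the sample falls in (e.g. an H3-style cell index)
    throughput_mbps: float
    dropped_session: bool
    monthly_revenue: float   # revenue attributable to the customer generating the sample

def rank_build_candidates(samples, throughput_target=25.0):
    """Aggregate experience per hexagon and rank cells by revenue at risk.

    A hexagon scores highly when many high-value customers fall below the
    experience target, i.e. where a build decision has the largest commercial
    impact, not simply where the most people are.
    """
    by_hex = defaultdict(list)
    for s in samples:
        by_hex[s.hex_id].append(s)

    scores = {}
    for hex_id, cell_samples in by_hex.items():
        below_target = [s for s in cell_samples
                        if s.throughput_mbps < throughput_target or s.dropped_session]
        scores[hex_id] = sum(s.monthly_revenue for s in below_target)

    # Highest revenue-at-risk hexagons first: these are the build priorities.
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
```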
One adjacent example is the fallow capacity approach that T-Mobile has taken to its Fixed Wireless Access offering. Check where the network has spare capacity and sell FWA there. If there’s no spare capacity, don’t. Same principle.
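That principle reduces to a simple gate. The sketch below uses hypothetical utilisation figures, not T-Mobile’s actual planning rule:

```python
def can_sell_fwa(sector_peak_utilisation: float, fwa_load_estimate: float,
                 utilisation_ceiling: float = 0.85) -> bool:
    """Offer FWA in a sector only if the estimated FWA load still leaves the
    sector below its utilisation ceiling at peak. Thresholds are illustrative."""
    return sector_peak_utilisation + fwa_load_estimate <= utilisation_ceiling
```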
One of the things we’ve noticed about the use of AI by technology companies is that it tends to focus on pretty mundane use cases. That’s probably to be expected since the mundane use cases are fairly common across different verticals (invoice analysis, for instance) and so there are large data sets that can be pointed at a problem. The more esoteric the issue, the longer it’ll probably take for AI to be applicable. I’m generalising massively there, but it's probably a reasonable rule of thumb. You can learn more about one manifestation of this trend in the report ‘How are MNOs and MVNOs harnessing AI for their IoT operations?’.
So it’s perhaps no surprise that one of the areas T-Mobile has identified as an opportunity for AI is to point it at the enormous amount of customer data with a view to doing churn analysis. In amongst all that customer data is a breadcrumb trail explaining why customers churned, and until now no-one has really done anything with it.
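To make that concrete, the simplest form of ‘pointing AI at customer data for churn’ is a propensity model over per-customer features. The sketch below uses scikit-learn with invented features and labels purely as an illustration; it is not a description of T-Mobile’s approach.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Hypothetical per-customer features: tenure (months), support calls (90 days),
# average data usage (GB/month). Labels: 1 = churned, 0 = stayed.
rng = np.random.default_rng(0)
X = rng.normal(loc=[24, 2, 15], scale=[12, 2, 8], size=(5000, 3))
# Toy label rule: short tenure and many support calls raise churn probability.
churn_logit = -0.05 * X[:, 0] + 0.6 * X[:, 1] - 0.02 * X[:, 2]
y = (rng.random(5000) < 1 / (1 + np.exp(-churn_logit))).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)

# Score the base and surface the customers most likely to churn, which is
# where retention offers (or targeted network fixes) have the most impact.
churn_risk = model.predict_proba(X_test)[:, 1]
highest_risk = np.argsort(churn_risk)[::-1][:10]
```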
T-Mobile is working with NVIDIA, Nokia and Ericsson, including through setting up a new research center in Bellevue, on the concept of AI-RAN (not to be confused with the popular Middle Eastern yoghurt drink). Put simply, this is a vehicle for pointing AI at the network and for making use of T-Mobile’s network assets to support AI.
The idea of using AI to run a real-time, self-optimising network, trained initially on virtual cities, is pretty intuitive. There are doubtless efficiency savings to be made.
The second aspect, and the more interesting one, is the idea of using the network to put compute closer to the customer. Specifically, this involves running non-RAN workloads on network equipment. Given that networks are provisioned for peak demand, there is inherently spare capacity at off-peak times, and that capacity can be used for other purposes, i.e. running AI, and running it closer to the customer.
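A minimal sketch of that off-peak idea: admit AI jobs only into hours where the forecast RAN load leaves headroom, and push any overflow back to the cloud. The capacity figures and thresholds are placeholders, not anything T-Mobile has described.

```python
from dataclasses import dataclass

@dataclass
class AIJob:
    name: str
    gpu_hours: float
    deadline_hour: int  # latest hour of day by which the job must finish

def schedule_off_peak(jobs, ran_load_forecast, capacity_gpu_hours_per_hour=10.0,
                      headroom=0.2):
    """Place AI jobs into hours where forecast RAN load leaves spare capacity.

    ran_load_forecast: dict of hour -> expected fraction of compute consumed by
    RAN workloads. RAN traffic always takes priority; AI jobs only use what is
    left, minus a safety headroom.
    """
    # Spare GPU-hours per hour of day after RAN load and the headroom.
    spare = {hour: max(0.0, 1.0 - load - headroom) * capacity_gpu_hours_per_hour
             for hour, load in ran_load_forecast.items()}
    schedule = {}
    for job in sorted(jobs, key=lambda j: j.deadline_hour):
        remaining = job.gpu_hours
        for hour in sorted(h for h in spare if h < job.deadline_hour):
            take = min(spare[hour], remaining)
            if take <= 0:
                continue
            schedule.setdefault(hour, []).append((job.name, take))
            spare[hour] -= take
            remaining -= take
            if remaining <= 0:
                break
        if remaining > 0:
            # Not enough local off-peak capacity; fall back to centralised cloud
            # rather than contend with RAN traffic.
            schedule.setdefault("cloud", []).append((job.name, remaining))
    return schedule
```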
To be honest, we would hope that ‘using up spare processing capacity on RAN equipment’ isn’t the extent of the aspiration for this aspect. At Transforma Insights, another slowly emerging opportunity we see for network operators is as an orchestrator of AI. Running AI is most efficiently done if you apply a principle of subsidiarity to it, i.e. the thing is done at the level (from centralised cloud down to edge device) at which it is most appropriate to do it, based on trade-offs of efficiency, power consumption, data transmission capability, latency and so forth. Someone needs to manage that process. We discuss this topic in the Position Paper ‘Connected-by-Design: Optimising Device-to-Cloud Connectivity’.
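As a toy illustration of that subsidiarity trade-off, one could score each tier (centralised cloud, network edge, device) against a workload’s latency budget and data volume and pick the cheapest feasible tier. The tiers, costs and scaling factors below are invented for illustration; this is not a Transforma Insights or T-Mobile algorithm.

```python
from dataclasses import dataclass

@dataclass
class Tier:
    name: str
    round_trip_ms: float        # typical latency to the workload's data source
    cost_per_gb_moved: float    # cost of shipping input data to this tier
    energy_per_inference_j: float

@dataclass
class Workload:
    latency_budget_ms: float
    input_gb_per_day: float

# Illustrative tiers from centralised cloud down to the device itself.
TIERS = [
    Tier("central_cloud", round_trip_ms=80, cost_per_gb_moved=0.05, energy_per_inference_j=0.5),
    Tier("network_edge",  round_trip_ms=15, cost_per_gb_moved=0.02, energy_per_inference_j=1.0),
    Tier("device",        round_trip_ms=1,  cost_per_gb_moved=0.0,  energy_per_inference_j=5.0),
]

def place(workload: Workload, inferences_per_day: int = 10_000) -> str:
    """Pick the cheapest tier that still meets the latency budget."""
    feasible = [t for t in TIERS if t.round_trip_ms <= workload.latency_budget_ms]
    if not feasible:
        raise ValueError("no tier meets the latency budget")

    def daily_cost(t: Tier) -> float:
        data_cost = t.cost_per_gb_moved * workload.input_gb_per_day
        energy_cost = t.energy_per_inference_j * inferences_per_day * 1e-4  # illustrative $/J scale
        return data_cost + energy_cost

    return min(feasible, key=daily_cost).name

# A latency-tolerant analytics job lands in the cloud; a 20 ms control loop is
# forced out to the network edge.
print(place(Workload(latency_budget_ms=100, input_gb_per_day=2)))
print(place(Workload(latency_budget_ms=20, input_gb_per_day=2)))
```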
I probably should have led with this topic as the most potentially impactful aspect of AI. It’s also, at this stage, the most speculative. There is a lot still to be worked out, and there weren’t unequivocal answers to some questions. But it’s early days, so to be expected.
In contrast to the network AI cloud topic, there was a lot more clarity around the 5G proposition, including some specific announcements.
It has been a noticeable trend this year that carriers are starting to get more serious about their 5G Standalone capabilities and translating that into service offerings, as predicted in our Transition Topics at the start of the year. It’s going to take a while, but with the SA switch-ons happening recently we’re now starting to see that turning into products that actually make use of 5G functionality. One of those products is T-Priority, a dedicated network slice for first responders offering lower latency, higher bandwidth and traffic prioritisation. The press release is here.
I should note that when launching a new product, having an anchor customer is always good practice. In this case it’s the City of New York, which is, to say the least, credible.
My perspective on network slicing has always been that the focus would be on providing sets of capabilities aimed at types of applications rather than specific users. Emergency services is one of the most obvious application types, and the one with the greatest immediate need. And, frankly, it’s also associated with specific users, so it’s not an either/or question.
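To illustrate what ‘capabilities aimed at types of applications’ might look like in practice, think of slice profiles keyed by application class rather than by individual user. The classes and parameter values below are purely illustrative and are not T-Priority’s actual specification.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SliceProfile:
    priority: int           # lower number = higher scheduling priority
    latency_target_ms: int
    min_downlink_mbps: float

# Profiles keyed by application class, not by individual user; a first
# responder's bodycam and a city sensor each map onto a class.
SLICE_PROFILES = {
    "public_safety_voice_video": SliceProfile(priority=1, latency_target_ms=20, min_downlink_mbps=10),
    "critical_iot_telemetry":    SliceProfile(priority=2, latency_target_ms=50, min_downlink_mbps=1),
    "best_effort_broadband":     SliceProfile(priority=5, latency_target_ms=100, min_downlink_mbps=0),
}

def profile_for(application_class: str) -> SliceProfile:
    """Resolve the slice profile from the application class, falling back to
    best effort for anything unclassified."""
    return SLICE_PROFILES.get(application_class, SLICE_PROFILES["best_effort_broadband"])
```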
For more on our reports on 5G IoT, check out our 5G IoT hot topics page.
I should note that there was precious little IoT in there, other than a passing reference and, of course, the fact that the aforementioned T-Priority is going to be used for IoT devices. The T-Mobile approach is completely 5G-focused, and the overlap between 5G and IoT today is concentrated in just a few very high-value areas, covered by the Advanced Network Solutions portfolio. In that environment, IoT more broadly, which is still overwhelmingly a non-5G space (I’m not going to count NB-IoT/LTE-M as 5G in this context, despite the fact that they’re officially 5G mMTC technologies), clearly isn’t the priority. T IoT, the initiative between Deutsche Telekom and T-Mobile, for instance, got the smallest of references. The company’s attention is elsewhere.
T-Mobile is one of the 25 MNOs and MVNOs profiled in our Communications Service Provider (CSP) IoT Peer Benchmarking report, Transforma Insights’ 147-page annual review of the capabilities and strategies of the leading CSPs in IoT. The report is available to Corporate subscribers to Transforma Insights’ Advisory Service.