Phases of Division: Navigating Vendor Pressure and Emerging Innovations in Cloud Computing
As the earth and moon continue to drift ever further apart, so too do the cloud computing options of SaaS and Customer Managed Deployments, widening the cosmic gap between AI solutions and services. Although the move to cloud computing is a promising, cost-reducing embrace of AI, the increasingly divided global cloud ecosystem is leaving banks and brokerages nervous about which direction, if any, to choose.
Although 91% of financial services executives report that their firms began exploring a cloud-computing strategy in the last year, most vendors still aim to pigeon-hole firms into one of the two camps.
Companies now find themselves weighing the pros and cons of privacy vs ease of access, reduced total cost of ownership vs hiring independent cloud computing teams, and community model fixes and updates vs tailored security and authentication needs. All of this comes on top of working out how agile the company’s IT infrastructure is and how easy it would be to switch, and leaves many wondering whether aiming for AI-enabled cloud computing is a matter of howling at the moon.
Weighing up the risks and rewards of this polarised environment is made all the more difficult by the launch of China’s new AI “All-In-One” Boxes for companies to implement on their own premises. A large step beyond Customer Managed Deployments, these “boxes” bring the advances of generative AI to private cloud set-ups, pairing them with large language models that can be trained independently, so that each private cloud gains its very own data set and learning abilities. Large projects utilising these “All-In-One” Boxes include China National Nuclear, along with government and state-owned groups who hope their private clouds will each be able to handle “inferencing”: using an already trained model to respond to user queries and shape its output, in some cases with models running to a whopping 10 billion parameters.
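For readers wondering what “inferencing” against a privately deployed model looks like in practice, the short Python sketch below shows a query being sent to a model hosted entirely inside a firm’s own environment. It is a minimal sketch under assumptions: the endpoint URL, payload fields, model name and response shape are hypothetical placeholders, not a reference to any particular box or vendor API.

```python
import json
import urllib.request

# Hypothetical endpoint for a model hosted inside the firm's own
# private cloud or "All-In-One" box; the prompt never leaves the premises.
PRIVATE_ENDPOINT = "https://llm.internal.example-bank.com/v1/generate"


def infer(prompt: str, max_tokens: int = 256) -> str:
    """Send a user query to the privately hosted model and return its answer.

    Inference uses a model that has already been trained: nothing here
    retrains the system, it simply shapes a response to the query.
    """
    payload = json.dumps({
        "model": "in-house-10b",   # illustrative name for a ~10bn-parameter model
        "prompt": prompt,
        "max_tokens": max_tokens,
    }).encode("utf-8")

    request = urllib.request.Request(
        PRIVATE_ENDPOINT,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read())["text"]   # assumed response shape


if __name__ == "__main__":
    print(infer("Summarise today's settlement exceptions."))
```

The design point is simply that both the query and the response stay within the firm’s own infrastructure; pointing the same call at a public provider’s URL is what moves that traffic, and the data it carries, outside.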
Last week’s Harvest Supermoon appeared bigger and brighter than normal (up to 14% bigger and 30% brighter, to be precise!), but our tendency to gravitate towards new innovations leaves us wondering whether these new “All-In-One” Boxes are a distorted Supermoon phenomenon too. The pace at which these technological advances can be made relies heavily on companies’ capacity to keep up. With each individual “All-In-One” Box requiring data centres with vast processing power, there is a risk that, despite the privacy and security they offer, they will be less efficient than the public cloud or than using APIs to connect to LLMs and community models.
Speaking to this issue, Dylan Patel, Chief Analyst at the SemiAnalysis research group, has said: “Usage is going to be very sporadic, which means you’re going to have all this very expensive AI hardware that is not utilised properly.” Many companies have decided that this is just the price of progress, as the Chinese market for “All-In-One” Boxes is estimated to hit Rmb16.8bn (£1.8bn) this year and Rmb450bn (£49.05bn) by 2027.
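To make Patel’s point concrete, the back-of-envelope sketch below compares the effective cost per query of owning dedicated AI hardware that sits mostly idle against paying per query for a shared public service. Every figure in it is an illustrative assumption rather than a quoted price; the only takeaway is how quickly sporadic usage inflates the per-query cost of owned hardware.

```python
# Back-of-envelope comparison of owned AI hardware vs pay-per-use access.
# All figures below are illustrative assumptions, not quoted prices.

HARDWARE_COST_PER_DAY = 500.0           # assumed amortised daily cost of an on-prem AI box
QUERIES_PER_DAY_AT_CAPACITY = 100_000   # assumed throughput if the box were fully busy
API_COST_PER_QUERY = 0.01               # assumed pay-per-use price of a shared service


def owned_cost_per_query(utilisation: float) -> float:
    """Effective cost per query of the owned box at a given utilisation (0 to 1)."""
    queries_served = QUERIES_PER_DAY_AT_CAPACITY * utilisation
    return HARDWARE_COST_PER_DAY / queries_served


for utilisation in (1.0, 0.25, 0.05):
    print(f"utilisation {utilisation:>4.0%}: "
          f"owned ${owned_cost_per_query(utilisation):.3f}/query "
          f"vs pay-per-use ${API_COST_PER_QUERY:.3f}/query")

# With these made-up numbers the box beats pay-per-use only when kept busy
# ($0.005 vs $0.010 at full utilisation); at 5% utilisation the same box
# costs $0.100 per query, ten times the shared-service price.
```

Firms accepting that gap as “the price of progress” are, in effect, paying for idle capacity in exchange for privacy and control.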
The differences between private clouds and private AI servers with their own learning data sets are vast. With technological advancements such as those we see in China aiming to eclipse the private cloud deployments they once stemmed from, it is no wonder that picking a deployment method is keeping key leaders up at night.
With the future of the cloud computing cosmos splitting irrevocably into two camps, urged on by the innovation of “All-In-One” Boxes and the pressure-filled response of vendors, we expect to see more nervousness around cloud-computing selection. Both cloud application types have their silver linings, and the choice is a deeply personal one based on a company’s needs. Yet company feedback suggests that firms frequently feel wrongly shoe-horned into one option, told that SaaS is the future and therefore the only way forward.
Rather than vendors pushing all businesses to pick one cloud computing method and cram themselves, more often than not, into a SaaS-shaped hole, we advocate for flexible vendors who understand a firm’s needs and wishes. SaaS works beautifully for smaller, more agile banks, but those who feel they have missed out on rocketing towards cloud computing because of their scale or desire to stay on premises should look for a vendor who supports their decision and does not shy away from launching Customer Managed Deployments.
The race towards the future moon and stars of AI should not leave banks feeling pressured to conform or to change methods just to fit in, for fear of being left behind. To keep to our lunar metaphor, we suggest these larger banks should not fear forgoing the giant leap to SaaS, but instead use Customer Managed Deployments, and a vendor willing to launch with them, as they aim for their cloud-computing perilune: the closest point to the moon in an object’s orbit around it.