
Interview, AI and Machine Learning

A FAIR Approach to Generative AI: An Interview with Rackspace CTO Srini Koushik

By Stephen C2C | June 1, 2023

Generative AI is taking the world by storm. Whether it sees it as an opportunity or as a threat, every organization, in the tech world and beyond, is preparing in some way for the next phase of the generative AI boom. With speculation running wild, what most organizations need now is a clear-headed approach to using this technology effectively and sustainably. C2C partner Rackspace is providing just that. The Foundry for Generative AI by Rackspace is the company's new suite of services that enables customers to implement AI with the right systems in place and the right values in mind. To learn more about this solution and what it can offer, C2C sat down with Rackspace CTO Srini Koushik for a long and wide-ranging discussion about the risks and the possibilities of "releasing AI into the wild."

 

Tell us about your background and your role at Rackspace.

 

I'm Srini Koushik. I've been at Rackspace for about eighteen months as the chief technology officer. I've got almost forty years in this industry. I got my start back in 1987 in India. My master's degree thesis was on artificial intelligence, in an area called "frames," which was about giving context to the AI you're building. At that time, it was very rudimentary. '87 was a long, long time ago, but when you fast-forward and see how these incremental developments have built on one another over the past forty years, I feel as excited as a twenty-something entering this new age of AI.

 

What is FAIR?

 

Most people, if they're in technology, have heard of Generative AI. One clear thing about generative AI is that it's real and here to stay. If you go back and look at Rackspace's 25-year history, we've been there whenever there's been a technology shift where our customers needed help. At the start of the century, it was managed hosting when the web was taking off, and later the first public cloud with OpenStack when cloud computing emerged in 2009. We think Generative AI is such a massive opportunity that we must be here for our customers. We've been working with Generative AI for Rackspace's internal applications since the start of 2023, and we feel we have compelling offerings that we are ready to bring to market. FAIR is the global practice that we've set up to co-create these solutions with our customers.

A little bit about the name: we chose the word FAIR because it stands for Foundry for Generative AI by Rackspace, but what we liked about the name were a few things. Number one, the word foundry. If you go back to the Industrial Revolution, the foundry was where you brought raw materials together with machinery and had skilled professionals create things that had value to customers. That's precisely where we are right now. We have the materials, which is the data; we have the machinery, which is the large language models; and we have the skilled practitioners, our Rackers, coming together to develop AI-powered solutions that are valuable to our customers.

At this point, many services companies in our industry have discussed the hundreds of use cases they've identified. We have focused on converting those ideas into reality. So that's what FAIR is. The other reason we loved the name FAIR was that it's a guiding principle for us to focus on the responsible and sustainable adoption of AI. This isn't AI for AI's sake; it's a responsible approach to AI that's equitable to people, is secure, protects privacy and intellectual property rights, and consumes the planet's resources in a way that promotes sustainability.

 

What does it mean to use AI fairly and sustainably, and how does FAIR accomplish that?

 

We decided to take many of our internal systems to Google Cloud three years ago. That was before my time, but when I got in, I was very pleased that that was the platform they picked. I'm certified as an architect on all three hyperscalers, but the first one I got certified on was Google Cloud. Google has been a leader with its stance on sustainability and its approach to open source, and these are the same core values Rackspace was built on, so it was a great fit.

The IT function within Rackspace reports to me, so being on Google Cloud ourselves gave us an opportunity to be a pioneer with Generative AI. We've been in preview programs for many of the products Google has released, which allows us to learn by doing, building solutions that help our business. We have had to learn how to build these solutions, select the appropriate large language model, tune the model, and secure and protect the privacy of data. As a mid-sized global organization, we also had to learn how to do these things frugally.

People ask me about sustainability, and I don't say it lightly, but I say, "The only green technology is the one you don't use." Anything that you use is going to consume electricity and consume resources. However, if you are very responsible about how you consume it and treat that as a non-functional requirement of any solution you're building, you're going to end up reaping the benefits of that solution.

Rackspace Intelligent Co-pilot for the Enterprise (Rackspace ICE) is one of the first solutions we're rolling out. If we know what it looks like when we deploy it to twenty people, we know the best way to take it and deploy it to a thousand people across the globe. Where do you deploy the models, and in which Google Cloud regions? How do you tie it to clean energy? We're not only producing the outcomes we're looking for, but we're also trying to make sustainability a business outcome, and that's critical.
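
To make the region question concrete, here is a minimal Python sketch of how a deployment script might rank candidate Google Cloud regions by their share of carbon-free energy (CFE). The region list, latency flags, and CFE figures are placeholder assumptions for illustration, not Rackspace's actual deployment logic; Google publishes real per-region carbon data that would replace them.

# Illustrative only: rank candidate regions by carbon-free energy share.
# The CFE values below are made-up placeholders, not Google's published figures.
CANDIDATE_REGIONS = {
    # region: (meets_latency_requirement, assumed CFE share)
    "us-central1": (True, 0.90),
    "europe-west1": (True, 0.75),
    "asia-south1": (False, 0.20),
}

def pick_region(candidates):
    """Return the region with the highest CFE share among those that meet
    the latency requirement."""
    viable = {region: cfe for region, (ok, cfe) in candidates.items() if ok}
    if not viable:
        raise ValueError("no candidate region meets the latency requirement")
    return max(viable, key=viable.get)

print(pick_region(CANDIDATE_REGIONS))  # -> "us-central1" with these placeholder numbers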

 

What are some key use cases for FAIR? How can Google Cloud customers use it? How are Rackspace customers already using it?

 

We started with these cross-domain use cases. We had two solutions that we started with. One is called RITA (Rackspace Intelligent Technology Assistant), and the other is Rackspace Intelligent Co-pilot for the Enterprise (Rackspace ICE). RITA is precisely what it sounds like. It's an intelligent chatbot that uses intent-driven automation to automate and simplify the provisioning of IT services within Rackspace. Rackspace IT doesn't have anyone answering the phones anymore. All the level-one support is done through automation, and then the second-level support goes to our engineers. It's been very helpful because RITA automates the toil, freeing up our engineers to step in and become problem solvers. This is a case where AI is not replacing people but giving them an opportunity to move up in their careers. As Google Cloud continues to enhance its products, it opens up many new possibilities for us; for example, we can leverage language translation to make RITA multilingual, so Rackers across the globe can converse with her in their native language.
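
As a hedged illustration of that multilingual idea, the Python sketch below wraps a chatbot reply with the Google Cloud Translation API (the google-cloud-translate package). RITA itself and its intent-driven automation are Rackspace-internal, so the function name and flow here are hypothetical.

# Minimal sketch: translate a chatbot reply into the user's language using
# the Cloud Translation API. Requires Application Default Credentials.
from google.cloud import translate_v2 as translate

client = translate.Client()

def localize_reply(reply_text, user_language):
    """Return the reply in the user's language, skipping the API call when
    the user already speaks the bot's default language (English)."""
    if user_language.lower().startswith("en"):
        return reply_text
    result = client.translate(reply_text, target_language=user_language)
    return result["translatedText"]

print(localize_reply("Your ticket has been resolved.", "es"))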

The other use case, Rackspace ICE, is essentially what Google Cloud calls enterprise search. It's, "How do you take these islands of information that sit within an enterprise and start connecting and correlating them, and expand access to this wealth of context-rich information through a friendly natural language interface so that you start to unlock solutions you didn't even know existed?"
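
For readers who want to see the shape of such a query, here is a hedged sketch against Google Cloud's Vertex AI Search using the google-cloud-discoveryengine client. The project ID, data store name, and query are hypothetical; Rackspace ICE's actual implementation is not public.

# Illustrative enterprise search query with Vertex AI Search.
# Project and data store names are placeholders.
from google.cloud import discoveryengine_v1 as discoveryengine

PROJECT_ID = "my-project"          # hypothetical project
DATA_STORE = "contracts-store"     # hypothetical data store of indexed documents

client = discoveryengine.SearchServiceClient()
serving_config = (
    f"projects/{PROJECT_ID}/locations/global/collections/default_collection/"
    f"dataStores/{DATA_STORE}/servingConfigs/default_config"
)

response = client.search(
    discoveryengine.SearchRequest(
        serving_config=serving_config,
        query="Which supplier contracts include a 30-day termination clause?",
        page_size=5,
    )
)
for result in response:
    # Each result wraps a matching document from the data store.
    print(result.document.id)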

 

"The emergence of Generative AI is not unlike the invention of the Gutenberg press."

 

The minute we start showing those capabilities, you start unlocking the possibilities in other places. I spent time with our chief legal officer yesterday, and he asked, "Can we go search our contracts? I've got to be able to do the same thing. I want our lawyers to focus on being lawyers and not spend the majority of their time looking for information that is relevant to what they are working on." You can imagine that within any enterprise, so many of these areas have been underinvested in over the years, and they've grown up as silos: HR, finance, legal, and marketing. We can see Rackspace ICE solving these problems in all of these domains.

Those two use cases are essential for making us more effective, and every one of them applies to every customer we have. As we work with our customers, we can address challenges from a position of experience, because we have already dealt with the issues they are likely to encounter in their journey: cloud platform setup, securing AI, security and privacy controls, policies, guardrails, and governance.

While Google Cloud has made the technology much easier, implementing it within an enterprise is much more involved. We've been advising companies on how to do that. Three months ago, we created a generative AI policy that governs the responsible use of AI within Rackspace. Now we're applying the policy as we create these solutions, and we're finding that it was a good start, but we'll probably have to keep adding to it. This is the learning process, and our customers can benefit from all our work in each of these domains.

 

A new technology emerges every year. Why a foundry for generative AI?

 

There's a new technology every six months these days, not even every year, but we think the emergence of Generative AI is not unlike the invention of the Gutenberg press. The Gutenberg press revolutionized the world by transforming the way ideas were communicated and knowledge was disseminated. With movable type and mechanized printing, the press made books more accessible, accelerating the spread of information. This breakthrough democratized knowledge, fueling the Renaissance, the Reformation, and the Enlightenment, ultimately shaping the course of human history. Just as the Gutenberg press disrupted the dissemination of knowledge, Generative AI is redefining how we create and interact with information. Like the press, Generative AI will reshape industries, foster new ideas, and democratize artistic expression, opening doors to a future limited only by our imagination.

With FAIR, we cut through all the complexities of AI and aim to make working with Generative AI easy for our customers. FAIR does three things: ideate, incubate, and industrialize. In the ideation phase, we're trying to determine how desirable AI is to your organization. How ready are you as an organization for the advent of AI? Do you have the right policies, governance, and guardrails? We start with our database of use cases and work with customers to determine which use cases apply to them, which one they need to work on first, and whether they have access to the data they need to get started.

In the incubation phase, we move from establishing the desirability of AI to determining whether it's feasible to implement the use case in the organization. You may want to do this, but if you don't have all the data or you don't have the skills, you're going to run into different constraints. Feasibility is all about identifying those constraints and figuring out how you would overcome them. At the end of the incubate phase, you have something that you can take to the board. You can demonstrate results based on your own data and get the buy-in of the board and the leadership to drive this forward.

The last step in our approach is the Industrialize phase, which I call "releasing AI into the wild." In the Incubate phase, the solution was available to a handful of people, but if you want to release it to your entire organization, you need to build new processes and techniques to manage and govern AI to ensure the desired outcomes.

We're working with our customers to co-create that journey for them and do that iteratively, and Google Cloud has allowed us to do this with the innovative products they are releasing at a breakneck pace. I'm excited about it; I go to bed, and when I wake up, they've released something new, and those products open up different solutions that we can co-create with our customers. We're thrilled to be a Google Cloud partner for generative AI and data, and as we move forward and get our customers through the incubation phase, you'll see a flurry of customer testimonials from FAIR.

 


