AWS Launches $100M Program to Help Customers Implement Generative AI


Amazon Web Services (AWS) has launched a $100 million program to help customers accelerate their implementation of generative artificial intelligence (AI).

The new AWS Generative AI Innovation Center will connect the company’s AI and machine learning (ML) experts with customers and partners around the globe, AWS said in a Thursday (June 22) press release.

“The Generative AI Innovation Center is part of our goal to help every organization leverage AI by providing flexible and cost-effective generative AI services for the enterprise, alongside our team of generative AI experts to take advantage of all this new technology has to offer,” Matt Garman, senior vice president of sales, marketing and global services at AWS, said in the release.

The new program will offer customers no-cost workshops, engagements and training that will help them imagine use cases for generative AI in their businesses, according to the release.

It will also enable customers to work closely with experts from AWS and its partners to develop and launch solutions that use the technology, the release said.

“Together with our global community of partners, we’re working with business leaders across every industry to help them maximize the impact of generative AI in their organizations, creating value for their customers, employees and bottom line,” Garman said in the release.

The program's launch follows several other recent AI-related announcements from AWS.

In April, AWS launched new services to help companies build generative AI tools.

One new offering, Amazon Bedrock, gives customers access to foundation models developed by AWS and other companies, letting them choose the model best suited to their needs and use it to build their own generative AI applications.
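For readers wondering what "choosing a model and building on it" looks like in practice, the sketch below shows a minimal call to a Bedrock-hosted foundation model through the AWS SDK for Python (boto3). The region, model ID and request body schema shown are illustrative assumptions drawn from general Bedrock usage, not details from AWS's announcement.

```python
import json
import boto3

# Minimal sketch of invoking a Bedrock-hosted foundation model via boto3.
# The model ID and request/response schema below are illustrative assumptions;
# check the Bedrock documentation for the models enabled in your account.
client = boto3.client("bedrock-runtime", region_name="us-east-1")

response = client.invoke_model(
    modelId="amazon.titan-text-express-v1",  # assumed model ID for this sketch
    contentType="application/json",
    accept="application/json",
    body=json.dumps({
        "inputText": "Summarize our Q2 customer feedback in three bullet points.",
        "textGenerationConfig": {"maxTokenCount": 256, "temperature": 0.5},
    }),
)

# The response body is a streaming object; read and decode the JSON payload.
result = json.loads(response["body"].read())
print(result["results"][0]["outputText"])
```

Swapping in a different foundation model is, in this sketch, a matter of changing the model ID and matching its expected request body, which is the flexibility the Bedrock announcement emphasizes.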

Also in April, AWS announced the general availability of server resources designed to lower the cost and energy consumption of running generative AI workloads.

A month earlier, in March, AWS and NVIDIA said they are collaborating on “next-generation” AI infrastructure that will be optimized for training large language models (LLMs) and developing generative AI applications.

The capabilities delivered by this partnership are meant to power question answering, code generation, video and image generation, speech recognition and other demanding generative AI applications.