UiPath has launched a new Integration Service Connector that gives customers access to Amazon Bedrock, a fully managed service that offers foundation models (FMs) through an API.
UiPath’s connector for Amazon Bedrock enables automation and citizen developers to seamlessly integrate Generative AI directly into their UiPath Studio and Studio Web automations, using the model of their choice within Amazon Web Services (AWS).
The connector supports text and chat capabilities through Amazon Titan FMs and several other FMs from AI providers via Amazon Bedrock, including the Jurassic-2 family of multilingual LLMs from AI21 Labs. These models follow natural language instructions to generate text in Spanish, French, German, Portuguese, Italian and Dutch.
FMs are large models pre-trained on a broad range of data. According to AWS, they can “perform so many more tasks because they contain such a large number of parameters that make them capable of learning complex concepts”.
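The article does not show the connector’s configuration, but the underlying Bedrock API can be illustrated directly. The sketch below, using the AWS SDK for Python (boto3), builds the JSON request body that Amazon Titan text models expect and invokes the model through Bedrock’s runtime API; the model ID and prompt are illustrative assumptions, and the call itself requires AWS credentials and Bedrock access.

```python
import json

# Illustrative model ID for an Amazon Titan text model on Bedrock.
MODEL_ID = "amazon.titan-text-express-v1"


def build_titan_request(prompt: str, max_tokens: int = 256) -> str:
    """Build the JSON request body expected by Amazon Titan text models."""
    return json.dumps({
        "inputText": prompt,
        "textGenerationConfig": {
            "maxTokenCount": max_tokens,
            "temperature": 0.5,
        },
    })


def invoke_titan(prompt: str) -> str:
    """Send a prompt to Bedrock and return the generated text.

    Requires AWS credentials and Bedrock model access in the account.
    """
    import boto3  # AWS SDK for Python

    client = boto3.client("bedrock-runtime")
    response = client.invoke_model(
        modelId=MODEL_ID,
        body=build_titan_request(prompt),
        contentType="application/json",
        accept="application/json",
    )
    payload = json.loads(response["body"].read())
    return payload["results"][0]["outputText"]
```

The UiPath connector wraps this request/response cycle in a Studio activity, so developers configure the model and prompt rather than writing the SDK call themselves.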
Customers also have the choice of Claude, Anthropic’s LLM, which can perform various conversational and text-processing tasks.
Graham Sheldon, chief product officer at UiPath, said: “The UiPath connector for Amazon Bedrock is simple to use and brings the power of foundation models to all UiPath customers so they can accelerate building their own Generative AI applications.
“With its open, flexible and responsible approach, UiPath provides organizations with a comprehensive platform for implementing and harnessing the power of AI-powered automation. This functionality complements our vision for helping customers innovate faster with Generative AI.”
UiPath emphasized that all data is encrypted and does not leave a customer’s Virtual Private Cloud.
Earlier this year, UiPath also announced that data science teams using Amazon SageMaker can now quickly connect SageMaker-hosted ML models into UiPath business processes without complex coding or manual effort. As a result, users can interact with deployed models via the connector and use their outputs in their workflows.
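Behind that SageMaker integration sits the standard endpoint-invocation pattern, which the connector abstracts away. As a minimal sketch (the endpoint name and feature layout are hypothetical assumptions, not taken from the article), a hosted model is called via the SageMaker runtime API:

```python
import json

# Hypothetical name of a deployed SageMaker endpoint.
ENDPOINT_NAME = "customer-churn-model"


def build_payload(features: list) -> str:
    """Serialize one feature row as a JSON inference request."""
    return json.dumps({"instances": [features]})


def invoke_model(features: list) -> dict:
    """Call the hosted model and return its parsed prediction.

    Requires AWS credentials and a deployed endpoint with this name.
    """
    import boto3  # AWS SDK for Python

    client = boto3.client("sagemaker-runtime")
    response = client.invoke_endpoint(
        EndpointName=ENDPOINT_NAME,
        ContentType="application/json",
        Body=build_payload(features),
    )
    return json.loads(response["Body"].read())
```

The connector exposes this call as a workflow activity, so the model’s output can feed subsequent automation steps without any SDK code.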