AI21 Labs (Amazon Bedrock) analytics source
Use the AI21 Labs (Amazon Bedrock) analytics connector to communicate with AI21 Labs, enriching your Qlik Sense apps with contextual and analytical depth from generative AI and large language model (LLM) technology.
With the AI21 Labs (Amazon Bedrock) analytics connector, you can send data from app consumer input, or from data loaded in your script, to AI21 Labs. You can connect to this analytics source from the Create page in the Analytics activity center, the Script, or within an app.
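For example, after you create a connection from this analytics source, you can call it from the load script with the EXTENSION clause. The following is a minimal sketch: the connection name AI21 Bedrock connection, the table Prompts, and the field prompt are placeholder names for illustration, and the fields returned depend on how the connection is configured.

```
// Hypothetical source table with one prompt per row.
[Prompts]:
LOAD * INLINE [
prompt
Summarize the quarterly sales trend in one sentence.
];

// Send the Prompts table to the connection created from this analytics source.
// 'AI21 Bedrock connection' is a placeholder - replace it with the name of your connection.
[Responses]:
LOAD * EXTENSION endpoints.ScriptEval('{"RequestType":"endpoint", "endpoint":{"connectionname":"AI21 Bedrock connection"}}', Prompts);
```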
The AI21 Labs (Amazon Bedrock) connector supports the following foundation model variants:
- Jurassic-2 Ultra
- Jurassic-2 Mid
For more information about these models, see AI21 Studio (AI21 documentation portal).
Prerequisites
To work with this connector, you must be an AWS user with an Access Key and Secret Key. You must also have the bedrock:InvokeModel permission.
Enabling ML endpoints in Qlik Cloud
To work with this connector, machine learning endpoints must be enabled in the Administration activity center. The switch is located under Feature control in the Settings section.
For more information, see Enabling analytic connections for machine learning endpoints.
Limitations
- This connector has a request limit of 25 rows per request, and rows are sent to the model one at a time (maximum batch size of one row).
- The resources available on the service where the model is deployed affect and limit the performance of Qlik Sense reloads and chart responsiveness.
- If an app is reloaded regularly, it is best practice to cache the machine learning predictions in a QVD file and send only the new rows to the endpoint. This improves reload performance of the Qlik Sense app and reduces the load on the model endpoint. A sketch of this caching pattern follows this list.
- If you use a relative connection name and you move your app from one shared space to another, or from a shared space to your personal space, it will take some time for the analytic connection to be updated to reflect the new space location.
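As a sketch of the QVD caching practice mentioned above, the following load script keeps earlier predictions in a QVD and sends only unseen prompts to the endpoint. The file paths under lib://DataFiles, the connection name, and the field names are assumptions for illustration; adjust them to your own data and connection.

```
// 1. Load previously cached predictions, if the QVD already exists.
IF Not IsNull(FileSize('lib://DataFiles/AI21_Predictions.qvd')) THEN
    [Predictions]:
    LOAD * FROM [lib://DataFiles/AI21_Predictions.qvd] (qvd);
END IF

// 2. Load only source rows whose prompt has not been scored yet.
//    Exists(prompt) matches against prompt values already loaded from the cache.
[NewPrompts]:
LOAD prompt
FROM [lib://DataFiles/SourcePrompts.qvd] (qvd)
WHERE Not Exists(prompt);

// 3. Send only the new rows to the model endpoint and append the results.
//    'AI21 Bedrock connection' is a placeholder connection name. Auto-concatenation
//    assumes the returned fields match the fields stored in the cached QVD.
IF NoOfRows('NewPrompts') > 0 THEN
    [Predictions]:
    LOAD * EXTENSION endpoints.ScriptEval('{"RequestType":"endpoint", "endpoint":{"connectionname":"AI21 Bedrock connection"}}', NewPrompts);
    DROP TABLE [NewPrompts];
END IF

// 4. Persist the combined results for the next reload.
IF NoOfRows('Predictions') > 0 THEN
    STORE [Predictions] INTO [lib://DataFiles/AI21_Predictions.qvd] (qvd);
END IF
```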