AI
Create an Ockam Portal from any application, to any AI model, in any environment.
In each example, we connect a Node.js app in one private network with an AI service in another private network.
Each company’s network is private, isolated, and doesn't expose ports. To learn how end-to-end trust is established, please read: “How does Ockam work?”
Please select an example to dig into:
The Amazon EC2 example uses a LLaMA model and the Amazon Bedrock example uses an Amazon Titan model. However, the same setup works with any other AI model: GPT, Claude, LaMDA, etc.
Amazon EC2
We connect a Node.js app in an AWS virtual private network with a LLaMA model provisioned on an EC2 instance in another AWS virtual private network. The example uses the AWS CLI to instantiate AWS resources.
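Once the portal is running, the Ockam inlet next to the app makes the model's API appear on a local address, so the app calls it as if the model were running on the same machine. The following is a minimal sketch, assuming the LLaMA model is served through an Ollama-style HTTP API on the EC2 instance and that the inlet listens on localhost:11434; both the address and the API shape are assumptions to adjust for your setup.

```javascript
// A minimal sketch of the app side of the portal (Node.js 18+, global fetch).
// Assumptions: the LLaMA model is exposed through an Ollama-style HTTP API on
// the EC2 instance, and the local Ockam inlet listens on localhost:11434.
async function ask(prompt) {
  // The inlet forwards this request, end-to-end encrypted, to the outlet
  // running next to the model in the other private network.
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "llama3", prompt, stream: false }),
  });
  const { response } = await res.json();
  return response;
}

ask("What is Ockam?").then(console.log).catch(console.error);
```

From the app's point of view nothing changes when the model moves: it keeps talking to the same local address while the portal carries traffic to wherever the outlet runs.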
Amazon Bedrock
We connect a Node.js app in an AWS virtual private network with an Amazon Bedrock API in another AWS virtual private network. The example uses the AWS CLI to instantiate AWS resources.
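As in the EC2 example, the inlet exposes the remote Bedrock endpoint on a local address. Here is a minimal sketch using the AWS SDK for JavaScript, assuming the inlet listens on localhost:8000 and the Amazon Titan text model is used; the inlet address, region, and request body are assumptions to adapt to your environment.

```javascript
// A minimal sketch of calling Amazon Bedrock through a local Ockam inlet.
// Assumptions: the inlet listens on http://localhost:8000 and the portal
// forwards to the bedrock-runtime endpoint in the other private network.
const {
  BedrockRuntimeClient,
  InvokeModelCommand,
} = require("@aws-sdk/client-bedrock-runtime");

const client = new BedrockRuntimeClient({
  region: "us-east-1",               // region used to sign the request
  endpoint: "http://localhost:8000", // hypothetical local inlet address
});

async function ask(prompt) {
  const response = await client.send(
    new InvokeModelCommand({
      modelId: "amazon.titan-text-express-v1",
      contentType: "application/json",
      accept: "application/json",
      body: JSON.stringify({ inputText: prompt }),
    })
  );
  // Titan text models return generated text in results[0].outputText.
  const result = JSON.parse(new TextDecoder().decode(response.body));
  return result.results[0].outputText;
}

ask("What is Ockam?").then(console.log).catch(console.error);
```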