Supported LLM Providers

EasyLLM supports finetuning LLMs from multiple providers: OpenAI, Azure OpenAI, GCP Vertex AI, Google AI Studio, AWS Bedrock, and Cohere.

Info

Finetuning jobs are created through each provider's API using your own API keys/credentials, so you can find the jobs in that provider's dashboard. Once a job completes, you can use the finetuned LLM directly with the provider's API or SDK by simply providing the finetuned model name.

We use a single-tenancy approach to perform all finetuning jobs securely.
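
As a quick illustration of that last point, a finetuned model can be called by name with the provider's own SDK once the job completes. The sketch below uses the OpenAI Python SDK; the finetuned model name and API key are hypothetical placeholders, and the other providers listed here expose equivalent SDKs.

    from openai import OpenAI

    client = OpenAI(api_key="YOUR_OPENAI_API_KEY")

    response = client.chat.completions.create(
        model="ft:gpt-4o-mini-2024-07-18:your-org::abc123",  # hypothetical finetuned model name
        messages=[{"role": "user", "content": "Hello!"}],
    )
    print(response.choices[0].message.content)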

OpenAI

To finetune LLMs in OpenAI, you need to provide your OpenAI API key. You can create your API key by following these steps:

  1. Login to OpenAI Platform
  2. Click on Dashboard
  3. Click on "API keys" on the left sidebar
  4. Click on "Create new secret key"
  5. Configure the permissions for the API key. The permissions required for finetuning OpenAI LLMs in EasyLLM are: Models - Read, Model capabilities - Write, Fine-tuning - Write, Files - Write
  6. Click on "Create secret key"
  7. Copy the API key and paste it in the EasyLLM profile settings.
Info

Fine-Tuning and Files permissions are required for finetuning LLMs. Model capabilities and Models permissions are required for evaluation of finetuned LLMs.
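
If you want to sanity-check the key before saving it, a minimal script like the one below (using the official openai Python package) should succeed with the permissions listed above; the key value is a placeholder.

    from openai import OpenAI

    client = OpenAI(api_key="YOUR_OPENAI_API_KEY")

    # Models - Read: listing models should work
    print([m.id for m in client.models.list().data][:5])

    # Fine-tuning - Write: listing fine-tuning jobs should work (empty at first)
    for job in client.fine_tuning.jobs.list(limit=5):
        print(job.id, job.status)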

Google AI Studio

  1. Login to Google AI Studio Platform
  2. Click on "Get API Key" on the left sidebar
  3. Click on "Create API Key"
  4. Select the GCP Project and click on "Create API Key in existing project"
  5. Copy the API key and paste it into the EasyLLM profile settings.
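
To confirm the key works before saving it, you can list the models visible to it with the google-generativeai package, roughly as sketched below; the key value is a placeholder.

    import google.generativeai as genai

    genai.configure(api_key="YOUR_GOOGLE_AI_STUDIO_API_KEY")

    # Listing available models confirms the key is accepted
    for m in genai.list_models():
        print(m.name)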

Cohere

  1. Login to Cohere Platform
  2. Click on "API Keys" on the left sidebar
  3. Click on "New Production Key" or "New Trial Key"
  4. Copy the API key and paste it in the EasyLLM profile settings.
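
A minimal call with the cohere Python SDK, sketched below, can confirm the key is valid before you save it; the key value is a placeholder.

    import cohere

    co = cohere.Client("YOUR_COHERE_API_KEY")

    # A small chat call to confirm the key works
    response = co.chat(message="Say hello in one word.")
    print(response.text)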

Azure OpenAI Service

We recommend using a separate Azure OpenAI Service resource for finetuning LLMs with EasyLLM. Follow these steps to set up Azure OpenAI Service:

  1. Login to Azure Portal
  2. Go to Azure OpenAI Service
  3. Click on "All Resources" on top and then click on "Create new Azure OpenAI Service resource"
  4. Make sure to choose North Central US or Sweden Central as the Region (only these regions are supported for finetuning in Azure OpenAI Service). Fill in the other required fields accordingly and complete the resource creation.
  5. Once the resource is created, change the current resource to the newly created resource.
  6. Copy and paste the necessary details (API Key, Endpoint, Resource Group Name, and Subscription ID) in the EasyLLM profile settings.
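
To verify those details before saving them, you can point the openai package's AzureOpenAI client at the new resource and list its (initially empty) fine-tuning jobs, roughly as below; the endpoint, key, and API version are placeholders.

    from openai import AzureOpenAI

    client = AzureOpenAI(
        api_key="YOUR_AZURE_OPENAI_API_KEY",
        azure_endpoint="https://your-resource-name.openai.azure.com",  # the Endpoint value
        api_version="2024-02-01",  # assumption: any API version that supports fine-tuning
    )

    # Lists fine-tuning jobs on this resource (empty for a new resource)
    for job in client.fine_tuning.jobs.list():
        print(job.id, job.status)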

An Azure Service Principal is required for EasyLLM to deploy the finetuned LLM and run evaluations. To create an Azure Service Principal, follow these steps:

  1. Login to Azure Portal
  2. Open the Azure Cloud Shell
  3. Run the following command with the appropriate values to create a new Service Principal:
    az ad sp create-for-rbac --name=ft --role owner --scopes /subscriptions/{subscription-id}/resourceGroups/{resource-group}/providers/Microsoft.CognitiveServices/accounts/{resource-name}
    
  4. Copy the output JSON and paste it into Service Principal field in the EasyLLM profile settings.
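
If you want to confirm the Service Principal JSON is valid before pasting it, the fields it contains (appId, password, tenant) can be used with the azure-identity package to request a token, roughly as sketched below; the file name and scope are illustrative.

    import json
    from azure.identity import ClientSecretCredential

    # sp.json holds the output of the az ad sp create-for-rbac command above
    with open("sp.json") as f:
        sp = json.load(f)

    credential = ClientSecretCredential(
        tenant_id=sp["tenant"],
        client_id=sp["appId"],
        client_secret=sp["password"],
    )

    # Acquiring a management-plane token confirms the credentials work
    token = credential.get_token("https://management.azure.com/.default")
    print("Token acquired; expires at", token.expires_on)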

GCP Vertex AI

To use GCP Vertex AI, you need to provide a GCP Service Account JSON. Follow these steps:

  1. Login to GCP Console
  2. Click on "IAM & Admin" on the left menu
  3. Click on "Service Accounts" on the left sidebar
  4. Click on "Create Service Account"
  5. Fill in the required name and id fields and click on "Create and Continue"
  6. In the "Grant this service account access to the project" section, add the following roles:
    • Vertex AI User
    • Storage Object User
  7. Click on "Continue" and then click on "DONE"
  8. From the list of service accounts, click on the one you have just created. Click on "Keys", then "ADD KEY" and "Create new key". Select "JSON" as the key type and click on "Create"
  9. Open the downloaded JSON file and copy the Service Account JSON. Paste it into GCP Service Account JSON field in the EasyLLM profile settings.
  10. Create a new bucket in GCP Storage for storing datasets and paste the bucket name into GS Bucket Name field in the EasyLLM profile settings.
  11. In the Location field, provide the location (region), e.g. us-central1
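
A rough sketch of how these values fit together with the google-cloud-aiplatform SDK is shown below; the key file name, bucket name, and region are placeholders, and this is only a local sanity check, not something EasyLLM requires you to run.

    from google.cloud import aiplatform
    from google.oauth2 import service_account

    # service-account.json is the key file downloaded in step 8
    credentials = service_account.Credentials.from_service_account_file("service-account.json")

    aiplatform.init(
        project=credentials.project_id,                 # project from the key file
        location="us-central1",                         # the Location from step 11
        credentials=credentials,
        staging_bucket="gs://your-finetuning-bucket",   # the GS Bucket Name from step 10
    )

    # Listing custom jobs (empty at first) confirms the Vertex AI User role is in place
    print(aiplatform.CustomJob.list())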

AWS Bedrock

Info

Make sure you have access to the Foundation Models. Follow this guide to get access: Getting Access to Bedrock Foundation Models

To use AWS Bedrock, you need to provide your AWS Bedrock Config details. Please follow these steps:

  • Login to AWS Console
  • Create two new S3 buckets, one for storing datasets and another for storing output, for example finetuning-data and finetuning-output
  • Go to Identity and Access Management (IAM)

Create Service Role

We need to create a service role for model customization in order to run finetuning jobs in Bedrock.

  1. Click on "Policies" on the left sidebar -> Click on "Create policy" -> Click on "JSON" tab -> Add the following JSON. Modify the bucket names accordingly -> Click on "Next" -> Provide the Name (For example S3AccessForBedrock) -> Click on "Create Policy"
    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": [
                    "s3:GetObject",
                    "s3:ListBucket"
                ],
                "Resource": [
                    "arn:aws:s3:::{dataset-bucket-name}",
                    "arn:aws:s3:::{dataset-bucket-name}/*"
                ]
            },
            {
                "Effect": "Allow",
                "Action": [
                    "s3:GetObject",
                    "s3:PutObject",
                    "s3:ListBucket"
                ],
                "Resource": [
                    "arn:aws:s3:::{output-bucket-name}",
                    "arn:aws:s3:::{output-bucket-name}/*"
                ]
            }
        ]
    }
    
  2. Click on "Roles" on the left sidebar -> Click on "Create role" -> Click on "Custom trust policy" -> Paste the below JSON into Custom trust policy field-> Click on "Next" -> Click on "Next"
    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Principal": {
                    "Service": "bedrock.amazonaws.com"
                },
                "Action": "sts:AssumeRole"
            }
        ]
    }
    
  3. In the Add permissions section, Search for the policy you just created in the previous step and Select it. Then, Click on "Next".
  4. Provide the Role Name (For example FinetuneServiceRole) and Click on "Create role"
  5. Copy the Role ARN and paste it into AWS Bedrock Config: Role ARN field in the EasyLLM profile settings.
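
For context, the sketch below shows roughly how a Bedrock model customization job uses the role ARN and the two buckets together via boto3; the job name, model names, account ID, base model, region, and bucket names are all hypothetical placeholders. EasyLLM creates the job for you, so this is purely illustrative.

    import boto3

    bedrock = boto3.client("bedrock", region_name="us-east-1")  # assumption: your Bedrock region

    bedrock.create_model_customization_job(
        jobName="easyllm-demo-job",                      # hypothetical
        customModelName="easyllm-demo-model",            # hypothetical
        roleArn="arn:aws:iam::123456789012:role/FinetuneServiceRole",  # the Role ARN from step 5
        baseModelIdentifier="amazon.titan-text-express-v1",
        trainingDataConfig={"s3Uri": "s3://finetuning-data/train.jsonl"},
        outputDataConfig={"s3Uri": "s3://finetuning-output/"},
        hyperParameters={"epochCount": "2", "batchSize": "1", "learningRate": "0.00001"},
    )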

Create User

Info

You need to create a new user to access Bedrock API from EasyLLM. Only the necessary permissions for finetuning and running evaluations of finetuned LLMs are given to this user.

  1. Click on "Policies" on the left sidebar -> Click on "Create policy" -> Click on "JSON" tab -> Add the following JSON. Modify the bucket names accordingly -> Click on "Next" -> Provide the Name (For example FinetunePermissions) -> Click on "Create Policy"
    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": [
                    "bedrock:GetFoundationModel",
                    "bedrock:GetFoundationModelAvailability",
                    "bedrock:ListFoundationModels",
                    "bedrock:GetCustomModel",
                    "bedrock:ListCustomModels",
                    "bedrock:InvokeModel",
                    "bedrock:GetModelCustomizationJob",
                    "bedrock:ListModelCustomizationJobs",
                    "bedrock:CreateModelCustomizationJob",
                    "bedrock:StopModelCustomizationJob",
                    "bedrock:GetProvisionedModelThroughput",
                    "bedrock:CreateProvisionedModelThroughput",
                    "bedrock:ListProvisionedModelThroughputs",
                    "bedrock:UpdateProvisionedModelThroughput",
                    "bedrock:DeleteProvisionedModelThroughput",
                ],
                "Resource": "*"
            },
            {
                "Effect": "Allow",
                "Action": [
                    "s3:PutObject",
                    "s3:GetObject",
                    "s3:DeleteObject",
                    "s3:ListBucket"
                ],
                "Resource": [
                    "arn:aws:s3:::{dataset-bucket-name}/*",
                    "arn:aws:s3:::{dataset-bucket-name}"
                ]
            },
            {
                "Effect": "Allow",
                "Action": [
                    "s3:PutObject",
                    "s3:GetObject",
                    "s3:ListBucket"
                ],
                "Resource": [
                    "arn:aws:s3:::{output-bucket-name}/*",
                    "arn:aws:s3:::{output-bucket-name}"
                ]
            },
            {
                "Effect": "Allow",
                "Action": [
                    "iam:PassRole"
                ],
                "Resource": [
                    "{data-access-role-arn}"
                ]
            }
        ]
    }
    
  2. Click on "Users" on the left sidebar -> Click on "Create user" -> Provide the User Name (For example FinetuneUser) -> Click on "Next"
  3. In the Set permissions section, Choose "Attach policy directly" for Permissions options -> Search for the policy you just created in the previous step and Select it -> Click on "Next" -> Click on "Create user"
  4. Click and open the user you just created -> Click on "Create access key" -> Click on "Create access key" -> In "Access key best practices & alternatives" section, Choose "Third party service" -> Select the Confirmation checkbox -> Click on "Next" -> Click on "Create access key"
  5. Copy the Access Key ID and Secret Access Key and paste them into the AWS Bedrock Config: Access Key ID and Secret Access Key fields in the EasyLLM profile settings.
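
To confirm the new access key works, a quick boto3 check like the one below should succeed with the permissions granted above; the region and key values are placeholders.

    import boto3

    bedrock = boto3.client(
        "bedrock",
        region_name="us-east-1",                     # assumption: your Bedrock region
        aws_access_key_id="YOUR_ACCESS_KEY_ID",
        aws_secret_access_key="YOUR_SECRET_ACCESS_KEY",
    )

    # Maps to the bedrock:ListFoundationModels permission granted above
    print([m["modelId"] for m in bedrock.list_foundation_models()["modelSummaries"]][:5])

    # Maps to bedrock:ListModelCustomizationJobs (empty until a job is created)
    print(bedrock.list_model_customization_jobs()["modelCustomizationJobSummaries"])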