
I am using the DSPy framework (v2.6.4), which uses liteLLM (v1.63.7) to connect to LLMs.

While connecting to Azure OpenAI via liteLLM (v1.63.7) using the method below (Azure AD Token Refresh - DefaultAzureCredential),

from litellm import completion
from azure.identity import DefaultAzureCredential, get_bearer_token_provider

token_provider = get_bearer_token_provider(
    DefaultAzureCredential(), "https://cognitiveservices.azure.com/.default"
)
response = completion(
    model="azure/<your deployment name>",   # azure/<your deployment name>
    api_base="<api-url>",                   # azure api base
    api_version="<api-version>",            # azure api version
    azure_ad_token_provider=token_provider,
    messages=[{"role": "user", "content": "good morning"}],
)

I am getting the below error,

litellm\litellm_core_utils\exception_mapping_utils.py", line 2001, in exception_type
 raise AuthenticationError(
litellm.exceptions.AuthenticationError: litellm.AuthenticationError: AzureException AuthenticationError - Access denied due to invalid subscription key or wrong API endpoint. Make sure to provide a valid key for an active subscription and use a correct regional API endpoint for your resource.

Connecting without liteLLM (via openai.AzureOpenAI) works fine, but when the same credential is used through liteLLM, I get the authentication error.

Code that works

import openai
from azure.identity import DefaultAzureCredential, get_bearer_token_provider

token_provider = get_bearer_token_provider(
    DefaultAzureCredential(), "https://cognitiveservices.azure.com/.default"
)
openai_client = openai.AzureOpenAI(
    api_version="<---version--->",
    azure_endpoint="<---endpoint--->",
    azure_deployment="<---deployment_name--->",
    azure_ad_token_provider=token_provider,
)

def interact_with_model():
    try:
        response = openai_client.chat.completions.create(
            model="gpt-4o",
            messages=[
                {"role": "system", "content": "You are a helpful assistant that helps me with my math homework!"},
                {"role": "user", "content": "Hello! Could you solve 20 x 5?"},
            ],
            max_tokens=100,
        )
        print(response)
        return response.choices[0].message.content
    except Exception as e:
        return f"Error: {e}"

if __name__ == "__main__":
    response = interact_with_model()
    print(f"Response from the model: {response}")

Anyone faced similar issues? Am I missing something here?

asked Mar 27, 2025 at 8:08
  • Check that the token is correct by running credential = DefaultAzureCredential() and then token = credential.get_token("https://cognitiveservices.azure.com/.default").token (see the sketch after these comments). Commented Mar 27, 2025 at 9:29
  • Thanks. I made the suggested changes but I'm still getting the same error. What else could be wrong? Commented Mar 27, 2025 at 12:49
  • Make sure you have proper RBAC role to access the resource (Azure OpenAI). Commented Mar 27, 2025 at 12:55
  • but the code using openai.AzureOpenAI works well. I have added the code in the question. Commented Mar 28, 2025 at 5:32
  • It worked after upgrading the version of LiteLLM to 1.65.1 Commented Apr 1, 2025 at 10:31
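For anyone debugging the same setup, here is the token check from the first comment as a small standalone sketch. It only confirms that DefaultAzureCredential can obtain a bearer token for the Cognitive Services scope; it does not involve liteLLM at all.

from azure.identity import DefaultAzureCredential

# Sanity check: can DefaultAzureCredential obtain a token for the
# Cognitive Services scope? Run this before debugging liteLLM itself.
credential = DefaultAzureCredential()
token = credential.get_token("https://cognitiveservices.azure.com/.default")
print(token.token[:20] + "...")          # print only a prefix; never log the full token
print("expires_on:", token.expires_on)   # Unix timestamp at which the token expires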

1 Answer

litellm.AuthenticationError: AzureException AuthenticationError - Access denied due to invalid subscription key or wrong API endpoint. Make sure to provide a valid key for an active subscription and use a correct regional API endpoint for your resource.

The above error may be caused by passing a wrong access token, or by passing the token to the endpoint in a way that the installed liteLLM version does not support.

You can use the code below; it fetches the response through liteLLM in Python by passing the Azure AD token directly.

Code:

from litellm import completion
from azure.identity import DefaultAzureCredential
import json

# Get an Azure AD token
credential = DefaultAzureCredential()
token = credential.get_token("https://cognitiveservices.azure.com/.default").token

# Call liteLLM with the token
response = completion(
    model="azure/<deployment name>",
    api_base="<Resource endpoint>",
    api_version="2023-05-15",
    azure_ad_token=token,  # Use azure_ad_token, not azure_ad_token_provider
    messages=[{"role": "user", "content": "good morning"}],
)
print(json.dumps(response.model_dump(), indent=4))

Output:

{
    "id": "cxxxcmpl-xxxxx",
    "created": 1743074722,
    "model": "xxxxxx",
    "object": "chat.completion",
    "system_fingerprint": "xxx",
    "choices": [
        {
            "finish_reason": "stop",
            "index": 0,
            "message": {
                "content": "Good morning! How can I assist you today?",
                "role": "assistant",
                "tool_calls": null,
                "function_call": null
            }
        }
    ],
    "usage": {
        "completion_tokens": 11,
        "prompt_tokens": 9,
        "total_tokens": 20,
        "completion_tokens_details": {
            "accepted_prediction_tokens": 0,
            "audio_tokens": 0,
            "reasoning_tokens": 0,
            "rejected_prediction_tokens": 0
        },
        "prompt_tokens_details": {
            "audio_tokens": 0,
            "cached_tokens": 0
        }
    },
    "service_tier": null
}
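One caveat with this approach: get_token returns a static bearer token, and Azure AD access tokens typically expire after roughly an hour. For a long-running process you may want to fetch a fresh token before each call. A minimal sketch along those lines, reusing the placeholders from the snippet above:

from litellm import completion
from azure.identity import DefaultAzureCredential

credential = DefaultAzureCredential()
SCOPE = "https://cognitiveservices.azure.com/.default"

def azure_completion(messages):
    # Fetch a fresh token before each call so it is never expired;
    # recent azure-identity releases cache tokens in memory, so this
    # should not add a network round-trip every time.
    token = credential.get_token(SCOPE).token
    return completion(
        model="azure/<deployment name>",
        api_base="<Resource endpoint>",
        api_version="2023-05-15",
        azure_ad_token=token,
        messages=messages,
    )

print(azure_completion([{"role": "user", "content": "good morning"}]))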


Reference: Azure OpenAI | liteLLM
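Since the question mentions DSPy: dspy.LM passes extra keyword arguments through to liteLLM, so once the plain litellm.completion call above works, the same arguments should work there too. A minimal sketch, assuming DSPy 2.6 and the same placeholders as above:

import dspy
from azure.identity import DefaultAzureCredential

# Fetch the Azure AD token the same way as above
token = DefaultAzureCredential().get_token(
    "https://cognitiveservices.azure.com/.default"
).token

lm = dspy.LM(
    "azure/<deployment name>",
    api_base="<Resource endpoint>",
    api_version="2023-05-15",
    azure_ad_token=token,  # forwarded to litellm.completion
)
dspy.configure(lm=lm)

print(lm("good morning"))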

answered Mar 27, 2025 at 11:31
