Openai_ex - OpenAI API client library

@rched

It appears that the second authorization method mentioned here (Microsoft Entra ID) results in an API call that’s identical to the OpenAI call. You just have to use the Entra ID token in place of your OpenAI API key in OpenaiEx.new(apikey). You would not have to make any changes to the library code.
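For illustration, a minimal sketch of what that looks like, assuming you have already obtained an Entra ID bearer token by some external means (the environment variable name here is just an example):

```elixir
# Sketch: an Entra ID token goes exactly where an OpenAI API key would.
# Acquiring the token (e.g. via the Azure CLI or an MSAL client) is
# outside the scope of the library; the env var name is illustrative.
entra_token = System.fetch_env!("AZURE_ENTRA_ID_TOKEN")

openai = OpenaiEx.new(entra_token)
```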

The second change in your commit alters the actual URL path. It may be possible to maintain a static mapping of OpenAI URLs to Azure (or other third-party) URLs that is applied before the API call. I’m not terribly keen to do this, but I will consider it if it’s possible to keep the existing call signatures undisturbed.
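As a rough sketch of the idea (this is not library code, and the Azure path layout shown is only indicative), such a mapping could be little more than a function from OpenAI paths to provider-specific paths:

```elixir
# Illustrative only: a static mapping from OpenAI URL paths to
# Azure-style paths, applied just before the request is sent.
# The Azure path format below is indicative, not authoritative.
defmodule PathMap do
  def to_azure("/chat/completions", deployment, api_version) do
    "/openai/deployments/#{deployment}/chat/completions?api-version=#{api_version}"
  end

  # Any path without a known mapping passes through unchanged.
  def to_azure(path, _deployment, _api_version), do: path
end
```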

If Microsoft has its own separate OpenAPI spec, instead of reusing the OpenAI one, that’s a sign that they intend to go their own way in the future. In that case, it may be better to create a separate azureai_ex library right now, instead of forcing this one to perform double duty.

@rched I have created an issue for this.

As outlined above, I am calling a path mapping function to determine the URL just before making the API call. I have enabled this for the chat completion API in this issue branch.

Please try this out and see if this approach solves your problem.

If it does, we can apply it to all the API calls. The call signatures and semantics remain unchanged for everyone who doesn’t use the path mapping function.
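Concretely, an existing call like the sketch below keeps working untouched for anyone who doesn’t opt into the mapping (module and function names are as I recall them from the 0.5.x docs; verify against your installed version):

```elixir
# Sketch: a standard chat completion call whose signature is unaffected
# by the optional path mapping. Module names per the 0.5.x-era docs;
# check your installed version.
openai = OpenaiEx.new(System.fetch_env!("OPENAI_API_KEY"))

chat_req =
  OpenaiEx.ChatCompletion.new(
    model: "gpt-3.5-turbo",
    messages: [OpenaiEx.ChatMessage.user("Hello!")]
  )

OpenaiEx.ChatCompletion.create(openai, chat_req)
```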

I will not explicitly commit to supporting Azure, but as long as the API doesn’t deviate too far from OpenAI, this should continue to work.

If this works, it would be great if you could contribute a livebook to document how the Azure configuration should be done.

@restlessronin I’ll give that a try if I get some time to figure out Entra IDs, though they seem to add significant friction to the setup process, which may make it tough to rely on for my use case.


@rched If it’s that much of a pain, let me see what I can do to provide an alternative. I’ll try to come up with something today. Since I don’t use Azure myself, it would be great if you could test and provide feedback, and possibly contribute a livebook on the setup that works for you.

@rched I have created another approach that should work. Please try it and let me know if it’s working.

@rched I have now packaged what I believe to be basic Azure support in a for_azure function. All you should have to do is call it.
I do not work with Azure, so can you please test this to ensure that it works?
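For anyone picking this up, here is roughly what I expect the setup to look like. The for_azure arguments shown (resource name, deployment id, API version) are my assumption of what it needs, so treat the user guide as the authoritative reference:

```elixir
# Rough sketch of the intended usage. The for_azure argument list
# (resource name, deployment id, API version) is an assumption; the
# user guide has the authoritative call.
azure_api_key = System.fetch_env!("AZURE_OPENAI_API_KEY")

openai =
  OpenaiEx.new(azure_api_key)
  |> OpenaiEx.for_azure("my-resource", "my-deployment", "2023-05-15")

# After this, the usual API calls are made on `openai` exactly as
# against OpenAI itself.
```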

If someone other than @rched is interested in Azure support, please try this and see if this is a satisfactory solution.

@restlessronin Yep, works great. Thanks for adding this. I’d be happy to open a PR with a livebook if you’d like.

@rched thanks for testing. I’ve added some (non-executable) documentation to the main user guide. I’m not sure if a separate livebook adds any value over that. WDYT?

I have released v0.5.7 with Azure OpenAI support.

Shoutout to @rched for initiating the work, providing references and code samples, and testing.

I agree. A separate livebook seems unnecessary.


Published v0.5.8, which includes this PR.

Thanks to GitHub user @kernel-io for the PR.