
Use AzureOpenAI client and clean up env config in chat & image apps #36

Closed

jack-williams wants to merge 7 commits into main from fix/use-azure-openai-client

Conversation

@jack-williams
Contributor

Fixes a few issues in how the chat & image apps configure their OpenAI client.

Changes

Use the dedicated Azure OpenAI client when a base URL is set

Previously the code passed both apiKey and a manual defaultHeaders: { "api-key": apiKey } to the OpenAI SDK whenever a base URL was set, which sent the key twice on Azure requests (once as Authorization: Bearer and once as api-key). That's redundant on Azure and breaks non-Azure OpenAI-compatible endpoints.

The PR switches to the SDK's AzureOpenAI class when VITE_OPENAI_BASE_URL is configured, which uses the correct api-key auth header and api-version query param. Otherwise it falls back to the standard OpenAI client.
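The switch described above can be sketched as a small decision function (names like `chooseClient` and the default API version are illustrative, not verbatim code from the PR). The `"azure"` branch would feed `new AzureOpenAI({ endpoint, apiKey, apiVersion })`, the other `new OpenAI({ apiKey })`:

```typescript
// Illustrative sketch of the client selection; the real code lives in
// ChatService.ts / ImageService.ts. When a base URL is configured, the
// Azure client is used, which sends the key in the `api-key` header and
// appends an `api-version` query param; otherwise the standard client
// authenticates with `Authorization: Bearer`.

type Env = {
  VITE_OPENAI_API_KEY?: string;
  VITE_OPENAI_BASE_URL?: string;
  VITE_OPENAI_API_VERSION?: string;
};

type ClientChoice =
  | { kind: "azure"; endpoint: string; apiKey: string; apiVersion: string }
  | { kind: "openai"; apiKey: string };

function chooseClient(env: Env): ClientChoice {
  const apiKey = env.VITE_OPENAI_API_KEY ?? "";
  if (env.VITE_OPENAI_BASE_URL) {
    return {
      kind: "azure",
      endpoint: env.VITE_OPENAI_BASE_URL,
      apiKey,
      // assumed fallback version; match it to your Azure deployment
      apiVersion: env.VITE_OPENAI_API_VERSION ?? "2024-10-21",
    };
  }
  return { kind: "openai", apiKey };
}
```

Keeping the branch in one place means there is exactly one spot where the key is attached, so it can never be sent twice as it was with the manual `defaultHeaders` workaround.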

Drop the dead process.env.OPENAI_* fallbacks

vite.config.ts defines process.env as {}, so every process.env.OPENAI_* reference in ChatService / ImageService was being inlined as undefined at build time and contributed nothing. The fallbacks looked plausible (a user might think setting OPENAI_BASE_URL would work) but silently did nothing. Configuration is now read only from import.meta.env.VITE_OPENAI_*, the supported way to surface env vars to a Vite client bundle.
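A before/after sketch of the config read (identifiers are assumptions for illustration, not the repo's exact code). The old fallback compiled to `undefined` because `define: { "process.env": {} }` in `vite.config.ts` inlines `process.env` as an empty object:

```typescript
// Before (dead fallback, always undefined in the built bundle):
//   const baseURL = import.meta.env.VITE_OPENAI_BASE_URL
//     ?? process.env.OPENAI_BASE_URL; // inlined as ({}).OPENAI_BASE_URL
//
// After: read only the Vite-exposed VITE_* variables.

type ViteEnv = { VITE_OPENAI_BASE_URL?: string; VITE_OPENAI_MODEL?: string };

function readConfig(env: ViteEnv) {
  return {
    baseURL: env.VITE_OPENAI_BASE_URL,         // undefined unless set in .env
    model: env.VITE_OPENAI_MODEL ?? "gpt-4.1", // code default per this PR
  };
}
```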

Tidy .env.example for both apps

Optional VITE_OPENAI_BASE_URL / VITE_OPENAI_API_VERSION / VITE_OPENAI_MODEL lines are commented out instead of having placeholder values. Previously, copying .env.example to .env verbatim would pass a placeholder string like your_openai_base_url_here as a truthy baseURL, activating the Azure path with a bogus endpoint.
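The tidied `.env.example` might look like the following (a sketch; the exact comments, ordering, and placeholder values in the repo may differ):

```ini
# Required
VITE_OPENAI_API_KEY=your_openai_api_key_here

# Optional: uncomment only if you use an Azure OpenAI (or compatible) endpoint.
# Left commented, the standard OpenAI client stays active.
# VITE_OPENAI_BASE_URL=https://your-resource.openai.azure.com
# VITE_OPENAI_API_VERSION=2024-10-21
# VITE_OPENAI_MODEL=gpt-4.1
```

Because the optional lines are comments rather than placeholder values, copying the file verbatim no longer yields a truthy `baseURL`.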

Apply the same fixes to the image app

apps/promptions-image/src/services/ImageService.ts had the same dead process.env fallback. It now uses the same AzureOpenAI switch and reads model/base URL/api-version from import.meta.env. streamChat uses the configured chat model instead of the hardcoded gpt-4.1.

Document the optional config in the README

A new "Optional configuration" section covers VITE_OPENAI_MODEL, VITE_OPENAI_BASE_URL, and VITE_OPENAI_API_VERSION for both apps.

Background

This started as feedback on #31 (now superseded by #23 on main). Co-authoring with @based-jace since they did the original groundwork in #31.

Verification

Jace Medlin (AIM Consulting Addison Group) and others added 7 commits April 29, 2026 02:29
Co-authored-by: Copilot <copilot@github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Signed-off-by: Jack Williams <4489219+jack-williams@users.noreply.github.com>
Replace manual headers/defaultQuery hacks on the base OpenAI client with the dedicated AzureOpenAI client when a baseURL is configured. This avoids sending the API key twice (once via apiKey -> Authorization, once via defaultHeaders.api-key) and uses the SDK's supported Azure auth path.

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
The project's vite.config.ts defines process.env as {}, so process.env.OPENAI_* references were inlined as undefined at build time and never did anything. Drop the dead fallbacks and read configuration only from import.meta.env.VITE_OPENAI_*, which is the supported way to surface env vars to a Vite client bundle.

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
Comment out optional VITE_OPENAI_BASE_URL / _API_VERSION / _MODEL placeholders in .env.example so a copied file does not silently activate the Azure OpenAI client with a bogus endpoint, and align the example model with the code default (gpt-4.1).

Add an Optional configuration section to the README documenting the new env vars used by the chat app.

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
Apply the same fixes to apps/promptions-image:
- Drop the dead process.env.OPENAI_API_KEY fallback (Vite defines process.env as {} so it never resolved).
- Read optional VITE_OPENAI_BASE_URL / _API_VERSION / _MODEL from import.meta.env, with the same AzureOpenAI client switch when a base URL is configured.
- Use the configured chat model in streamChat instead of the hardcoded gpt-4.1.
- Mirror the .env.example layout (commented-out optional vars) so copying it does not silently activate the Azure path with a placeholder URL.
- Update vite-env.d.ts with the new optional types.

Generalize the README Optional configuration section to cover both apps.

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
…client

# Conflicts:
#	apps/promptions-chat/.env.example
#	apps/promptions-chat/src/services/ChatService.ts
#	apps/promptions-image/.env.example
#	apps/promptions-image/src/services/ImageService.ts
@jack-williams jack-williams deleted the fix/use-azure-openai-client branch April 30, 2026 11:08