
Dev Proxy v0.28 Introduces Telemetry for LLM Usage and Cost Analysis


Jun 13, 2025 · 2 min read

The .NET team has released Dev Proxy version 0.28, introducing new capabilities to improve observability, plugin extensibility, and integration with AI models. A central feature of this release is the OpenAITelemetryPlugin, which, as reported, allows developers to track usage and estimated costs of OpenAI and Azure OpenAI language model requests within their applications.

The plugin intercepts requests and records details such as the model used, token counts (prompt, completion, and total), per-request cost estimates, and grouped summaries per model.
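Enabling the plugin follows Dev Proxy's usual configuration pattern. The sketch below is illustrative only: the `pluginPath` value and the watched URL are assumptions, and the exact plugin assembly name may differ in v0.28.

```json
{
  "plugins": [
    {
      "name": "OpenAITelemetryPlugin",
      "enabled": true,
      "pluginPath": "~appFolder/plugins/dev-proxy-plugins.dll"
    }
  ],
  "urlsToWatch": [
    "https://api.openai.com/*"
  ]
}
```

With this in place, Dev Proxy intercepts matching requests and emits the per-model token counts and cost estimates described above.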

According to the announcement, this plugin supports deeper visibility into how applications interact with LLMs, which can be visualized using external tools like OpenLIT to understand usage patterns and optimize AI-related expenses.

The update also supports Microsoft’s Foundry Local, a high-performance local AI runtime stack introduced at the Build conference last month. Foundry Local enables developers to redirect cloud-based LLM calls to local environments, reducing cost and enabling offline development.

As stated, Dev Proxy can now be configured to use local models; the dev team writes:

Our initial tests show significant improvements using Phi-4 mini on Foundry Local compared to other models we’ve used in the past. We’re planning to integrate with Foundry Local by default, in the future versions of Dev Proxy.

To configure Dev Proxy with Foundry Local, developers can specify the local model and endpoint in the languageModel section of the proxy’s configuration file. This integration offers a cost-effective alternative for developers working with LLMs during local development.
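A minimal sketch of such a configuration is shown below. The model identifier and the endpoint URL (including the port) are placeholders: Foundry Local assigns its endpoint at startup, so the actual values should be taken from the local Foundry instance.

```json
{
  "languageModel": {
    "enabled": true,
    "model": "Phi-4-mini",
    "url": "http://localhost:5273/v1"
  }
}
```

Once pointed at the local endpoint, LLM-backed Dev Proxy features run against the local model instead of a cloud service.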

For .NET Aspire users, a preview version of Dev Proxy extensions is now available. These extensions simplify integration with Aspire applications, allowing Dev Proxy to run either locally or via Docker with minimal setup. As reported, this enhancement improves portability and simplifies the configuration process for distributed development teams.

In addition, support for OpenAI payloads has been expanded. As stated, Dev Proxy was previously limited to text completions; it now supports a wider range of completion types, increasing compatibility with OpenAI APIs.

The release also brings enhancements to TypeSpec generation. In line with TypeSpec v1.0 updates, the plugin now supports improved PATCH operation generation, using MergePatchUpdate to clearly define merge patch behavior.
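As a rough illustration of what such generated output can look like, the TypeSpec sketch below wraps a PATCH body in `MergePatchUpdate<T>` from the TypeSpec HTTP library, which marks it as a JSON merge-patch document; the `Widget` model and route are hypothetical, and the exact shape Dev Proxy emits may differ.

```tsp
import "@typespec/http";

using Http;

model Widget {
  id: string;
  name: string;
  color: string;
}

@route("/widgets/{id}")
@patch
op update(@path id: string, @body body: MergePatchUpdate<Widget>): Widget;
```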

As noted in the release, Dev Proxy now supports JSONC (JSON with comments) across all configuration files. This addition enables developers to add inline documentation and annotations, which can aid in team collaboration and long-term maintenance.
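For example, a configuration file can now carry inline notes without breaking parsing; the snippet below is a hypothetical fragment, not taken from the release:

```jsonc
{
  // watch only the APIs this app actually calls
  "urlsToWatch": [
    "https://api.openai.com/*" // added for LLM telemetry
  ]
}
```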

Concurrency improvements have also been made in logging and mocking. These changes ensure that logs for parallel requests are grouped accurately, helping developers trace request behavior more effectively.

Two breaking changes are included in this release. First, the GraphConnectorNotificationPlugin has been removed, following the deprecation of Graph connector deployment via Microsoft Teams.

Furthermore, the --audience flag in the devproxy jwt create command has been renamed to --audiences, while the shorthand alias -a remains unchanged.
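In practice, only the long-form flag changes; the audience value below is a placeholder:

```shell
# before v0.28
devproxy jwt create --audience "https://api.example.com"

# v0.28 and later: long flag renamed, shorthand alias unchanged
devproxy jwt create --audiences "https://api.example.com"
devproxy jwt create -a "https://api.example.com"
```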

The CRUD API plugin has been updated with improved CORS handling and consistent JSON responses, enhancing its reliability in client-side applications.

Finally, the Dev Proxy Toolkit for Visual Studio Code has been updated to version 0.24.0. This release introduces new snippets and commands, including support for the aforementioned OpenAITelemetryPlugin, improved Dev Proxy Beta compatibility, and better process detection.

For interested readers, full release notes are available in the official repository, providing a complete overview of features, changes, and guidance for this version.

About the Author

Almir Vuk
