
Dev Proxy Reaches v1.0 with AI Failure Simulation, Token Rate Limiting, and Integration Enhancements

Aug 13, 2025 2 min read


Dev Proxy has reached a significant milestone with the release of version 1.0, introducing a range of new features aimed at helping developers build more reliable AI-powered applications. As reported in the announcement, the update focuses on realistic simulation of language model behavior, advanced resource tracking, and improvements to integration tools.

This marks the first major version of Dev Proxy, which the team announced as follows:

We’re excited to announce the first major version of Dev Proxy! Over the last few years, we shipped functionality that we believe helps developers build more robust apps. After the recent refactorings, we’ve reached what we believe is a solid foundation for our future work. That said, we keep improving our code base and are open to any changes. Moving forward, we’re going to use SemVer to communicate the scope of changes in each release. We’ll keep publishing regular releases and should we ship some breaking changes, we’ll clearly communicate what’s changed and how it affects you.

One of the most notable additions is the LanguageModelFailurePlugin, which enables developers to test how their applications respond to unpredictable AI output. According to the announcement, the plugin can simulate 15 common failure types, including hallucinations, bias, misinterpretations, contradictory statements, and ambiguous responses. The team also notes that developers can define custom failure scenarios, helping ensure their systems remain robust when handling unreliable AI-generated content.
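Dev Proxy plugins are enabled through its JSON configuration file. As a sketch of what registering the new plugin might look like, assuming it follows Dev Proxy's usual plugin-registration pattern (the schema URL, plugin path, and watched URL below are illustrative, not taken from the announcement):

```jsonc
{
  // Illustrative schema reference; the actual v1.0.0 schema URL may differ
  "$schema": "https://raw.githubusercontent.com/dotnet/dev-proxy/main/schemas/v1.0.0/rc.schema.json",
  "plugins": [
    {
      // Simulates hallucinations, bias, contradictions, and other failure types
      "name": "LanguageModelFailurePlugin",
      "enabled": true,
      "pluginPath": "~appFolder/plugins/DevProxy.Plugins.dll"
    }
  ],
  // Example endpoint for local language-model traffic to intercept
  "urlsToWatch": [
    "http://localhost:11434/*"
  ]
}
```

With a configuration along these lines, Dev Proxy would inject simulated failures into responses from the matched endpoint, letting the application's error handling be exercised without a real model misbehaving.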

Another key enhancement is the LanguageModelRateLimitingPlugin, which introduces token-based rate limiting simulation. This feature mirrors how large language model providers enforce limits, allowing developers to set different thresholds for input and output tokens within configurable time windows. According to the development team, this capability helps simulate realistic performance boundaries and budget constraints for AI integrations.
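A plugin like this is typically paired with a configuration section holding its settings. The property names below are hypothetical illustrations of the described capability (separate input and output token budgets per time window), not the plugin's documented schema:

```jsonc
{
  "plugins": [
    {
      "name": "LanguageModelRateLimitingPlugin",
      "enabled": true,
      "pluginPath": "~appFolder/plugins/DevProxy.Plugins.dll",
      "configSection": "languageModelRateLimiting"
    }
  ],
  // Hypothetical property names illustrating per-window token budgets
  "languageModelRateLimiting": {
    "promptTokenLimit": 5000,
    "completionTokenLimit": 5000,
    "resetTimeWindowSeconds": 60
  }
}
```

Once the simulated budget is exhausted within a window, the proxy can respond as a real provider would when a token quota is exceeded, which makes retry and backoff logic testable locally.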

(Simulating exceeding a token limit for an LLM request, Source: Official Microsoft announcement)

The OpenAITelemetryPlugin has also been improved, now supporting token usage tracking from streamed responses. The plugin can generate detailed cost and usage summaries in Markdown, JSON, or plain text formats, enabling developers to better monitor testing activity and forecast production expenses.
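Enabling the telemetry plugin follows the same registration pattern. The configuration-section name and settings below are hypothetical, sketched only to illustrate the cost tracking and output formats the announcement describes:

```jsonc
{
  "plugins": [
    {
      "name": "OpenAITelemetryPlugin",
      "enabled": true,
      "pluginPath": "~appFolder/plugins/DevProxy.Plugins.dll",
      "configSection": "openAITelemetry"
    }
  ],
  // Hypothetical settings illustrating the reported summary options
  "openAITelemetry": {
    "includeCosts": true,
    "summaryFormat": "markdown" // or "json" / "text", per the article
  }
}
```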

OpenAPI specification generation has been refined through updates to the OpenApiSpecGeneratorPlugin. The enhancements include the ability to exclude response types—particularly useful for AI agents that ignore such metadata—and to capture default parameter values, improving the accuracy of automated API usage by AI tools.

(Generating an OpenAPI spec in JSON format from intercepted requests and responses, Source: Microsoft Documentation)
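The two enhancements above suggest configuration switches on the generator plugin. The option names in this sketch are hypothetical, chosen only to illustrate dropping response metadata and capturing default parameter values:

```jsonc
{
  "plugins": [
    {
      "name": "OpenApiSpecGeneratorPlugin",
      "enabled": true,
      "pluginPath": "~appFolder/plugins/DevProxy.Plugins.dll",
      "configSection": "openApiSpecGenerator"
    }
  ],
  // Hypothetical option names illustrating the described enhancements
  "openApiSpecGenerator": {
    "includeResponseTypes": false,
    "captureDefaultParameterValues": true
  }
}
```

Omitting response types keeps the generated spec lean for AI agents that ignore that metadata, while recorded defaults give tools a better chance of calling the API correctly without extra guidance.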

Alongside the main application, several Dev Proxy tools have been updated. The Dev Proxy Toolkit, a Visual Studio Code extension, now supports v1.0.0 schemas, includes new configuration snippets for language model plugins, adds JSONC support for diagnostics, and offers quick actions for setting essential configuration flags.

The Visual Studio Code Tasks integration has been enhanced to automatically start and stop Dev Proxy during debug sessions. GitHub Actions integration has been simplified for easier CI/CD workflows, and .NET Aspire extensions have been updated with .NET 8 support.

The release also features an improved Dev Proxy MCP Server, which provides coding agents with direct access to updated documentation, schemas, and best practice guidance for building configurations. According to the team, these additions have already resulted in improved outcomes when using AI-assisted configuration editing.

Additional changes include enhanced streaming response handling in Chrome DevTools, expanded authentication compatibility through custom OIDC metadata URLs, streamlined Linux installation defaults, improved configuration validation, and more resilient error handling.

For interested readers, full release notes with a complete list of features, improvements, and bug fixes are available in the official announcement.

About the Author

Almir Vuk

