System.Text.Json deserializes JSON values into Dictionary<string, object> as JsonElement wrappers rather than native strings. The previous code used `attr.Value as string` which returned null for JsonElement values, causing targeting rules to fail. This fix properly extracts string values from JsonElement, enabling targeting rule evaluation to work correctly.
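The extraction described above can be sketched as follows. This is a minimal, standalone illustration, not the actual controller code; the helper name `GetAttributeAsString` is hypothetical, and the non-string branches are one plausible way to coerce other value kinds:

```csharp
using System;
using System.Collections.Generic;
using System.Text.Json;

public static class AttributeDemo
{
    // Hypothetical helper: unwrap a JsonElement attribute value into a native string.
    public static string? GetAttributeAsString(object? value)
    {
        if (value is JsonElement element)
        {
            return element.ValueKind switch
            {
                JsonValueKind.String => element.GetString(),
                JsonValueKind.Number => element.GetRawText(),
                JsonValueKind.True => "true",
                JsonValueKind.False => "false",
                _ => null,
            };
        }
        return value as string; // already a native string (or null)
    }

    public static void Main()
    {
        var attrs = JsonSerializer.Deserialize<Dictionary<string, object>>(
            "{\"country\": \"US\"}")!;

        // The old pattern yields null, because the dictionary value is a JsonElement:
        Console.WriteLine(attrs["country"] as string ?? "<null>"); // <null>

        // Unwrapping the JsonElement recovers the native string:
        Console.WriteLine(GetAttributeAsString(attrs["country"])); // US
    }
}
```

Running this shows why `attr.Value as string` silently failed: the `as` cast returns null for a `JsonElement`, so every string comparison in the targeting rules missed.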
{
    value = request.VariationType?.ToUpper() switch
    {
        "BOOLEAN" => await _client.GetBooleanValueAsync(request.Flag, Convert.ToBoolean(request.DefaultValue), context),
Problem
The EvaluateRequest class has a DefaultValue property typed as object:

public class EvaluateRequest
{
    public object DefaultValue { get; set; } // Can be bool, int, string, etc.
}

When ASP.NET Core's System.Text.Json deserializes JSON into an object property, it does not convert to native .NET types. Instead, it wraps the value in a JsonElement:
{ "defaultValue": true }    // Becomes JsonElement, not bool
{ "defaultValue": 42 }      // Becomes JsonElement, not int
{ "defaultValue": "hello" } // Becomes JsonElement, not string

The original code assumed native types:

Convert.ToBoolean(request.DefaultValue) // Throws! DefaultValue is JsonElement

Convert.ToBoolean() has no knowledge of JsonElement and throws InvalidCastException.
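A minimal repro of the wrapping behavior, as a standalone console program (the EvaluateRequest shape is taken from the snippet above; the property name in the JSON matches the C# property exactly, since raw JsonSerializer is case-sensitive by default):

```csharp
using System;
using System.Text.Json;

public class EvaluateRequest
{
    public object? DefaultValue { get; set; }
}

public static class Repro
{
    public static void Main()
    {
        var req = JsonSerializer.Deserialize<EvaluateRequest>(
            "{\"DefaultValue\": true}")!;

        // The runtime type is JsonElement, not bool.
        Console.WriteLine(req.DefaultValue!.GetType().Name); // JsonElement

        try
        {
            // JsonElement does not implement IConvertible, so this throws.
            Convert.ToBoolean(req.DefaultValue);
        }
        catch (InvalidCastException ex)
        {
            Console.WriteLine(ex.GetType().Name); // InvalidCastException
        }
    }
}
```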
Why Previous Tests Didn't Fail
The original code had a catch block that swallowed all exceptions:
try
{
value = ... Convert.ToBoolean(request.DefaultValue) ...
}
catch (Exception ex)
{
value = request.DefaultValue;
reason = "ERROR";
}

Previous tests (test_dynamic_evaluation.py, test_exposures.py) were asserting on:
- The evaluated flag value (returned by the endpoint)
- The reason field in the response
When InvalidCastException was thrown, the catch block returned reason = "ERROR" and value = request.DefaultValue. If a test expected an error case or didn't care about the reason, it would pass.
Why Flag Eval Metrics Tests Failed
The metrics tests need OpenFeature hooks to fire. The exception was thrown before calling the OpenFeature API:

- Convert.ToBoolean(JsonElement) throws
- The catch block catches it and returns "ERROR"
- OpenFeature is never called → hooks never fire → no metrics emitted
The tests assert on metrics received by the OTLP intake, not on the HTTP response. No OpenFeature call = no hooks = no metrics = test failure.
The Fix
Check if the value is a JsonElement and extract the actual value:
private static bool GetDefaultValueAsBool(object defaultValue)
{
if (defaultValue is JsonElement jsonElement)
{
return jsonElement.ValueKind switch
{
JsonValueKind.True => true,
JsonValueKind.False => false,
JsonValueKind.String => bool.Parse(jsonElement.GetString()),
_ => false
};
}
return Convert.ToBoolean(defaultValue); // Fallback for non-JsonElement
}
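Exercising the helper from the fix above with both a JsonElement and a native CLR bool, as a standalone sketch (the helper body is the one shown above, made public here only so it can be called directly):

```csharp
using System;
using System.Text.Json;

public static class Demo
{
    // Same logic as the fix above; public here for direct use in the demo.
    public static bool GetDefaultValueAsBool(object defaultValue)
    {
        if (defaultValue is JsonElement jsonElement)
        {
            return jsonElement.ValueKind switch
            {
                JsonValueKind.True => true,
                JsonValueKind.False => false,
                JsonValueKind.String => bool.Parse(jsonElement.GetString()!),
                _ => false,
            };
        }
        return Convert.ToBoolean(defaultValue); // Fallback for non-JsonElement
    }

    public static void Main()
    {
        // JsonElement produced by parsing, as deserialization would
        using var doc = JsonDocument.Parse("true");
        Console.WriteLine(GetDefaultValueAsBool(doc.RootElement)); // True

        // Native CLR bool still works via the Convert.ToBoolean fallback
        Console.WriteLine(GetDefaultValueAsBool(false)); // False

        // String-encoded boolean inside a JsonElement
        using var strDoc = JsonDocument.Parse("\"true\"");
        Console.WriteLine(GetDefaultValueAsBool(strDoc.RootElement)); // True
    }
}
```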
Remove irrelevant skips for:
- Test_FFE_Eval_Metric_Parse_Error_Invalid_Regex
- Test_FFE_Eval_No_Config_Loaded

The .NET SDK uses a managed evaluator (not libdatadog), and now correctly returns PARSE_ERROR for invalid regex patterns and PROVIDER_NOT_READY when no config is loaded.

Related: DataDog/dd-trace-dotnet#8367
- Add local NuGet source from binaries folder if nupkg files present
- Update OpenFeature to 2.3.0 (required by Datadog.FeatureFlags.OpenFeature 2.0.1)
- Use wildcard version for Datadog.FeatureFlags.OpenFeature to pick up local builds
The flag eval metrics feature is implemented in dd-trace-dotnet PR #8367 but not yet released. Mark as missing_feature until v3.42.0 is published.
Leaving the test disabled for now because the changes in DataDog/dd-trace-dotnet#8367 haven't been published yet. Tests pass when run locally: DataDog/dd-trace-dotnet#8367 (comment). Copy-pasted from the referenced comment:
RUN apt-get update && DEBIAN_FRONTEND=noninteractive apt-get install -y curl

# install dd-trace-dotnet (must be done before setting LD_PRELOAD)
COPY utils/build/docker/dotnet/install_ddtrace.sh binaries/ /binaries/
Why run COPY binaries/ commands twice?
Stage 1: build-app (SDK image)
FROM mcr.microsoft.com/dotnet/sdk:8.0 AS build-app
# NEW: Copy binaries for NuGet package resolution
COPY binaries/ /binaries/
RUN if ls /binaries/*.nupkg 1> /dev/null 2>&1; then \
dotnet nuget add source /binaries --name local-packages; \
fi
# NuGet packages are resolved HERE
RUN dotnet restore
RUN dotnet publish

Purpose: Build the weblog application. NuGet packages (like Datadog.FeatureFlags.OpenFeature.*.nupkg) must be available as a NuGet source before dotnet restore runs.
Stage 2: runtime (ASP.NET image)
FROM mcr.microsoft.com/dotnet/aspnet:8.0 AS runtime
# EXISTING: Copy binaries for tracer installation
COPY utils/build/docker/dotnet/install_ddtrace.sh binaries/ /binaries/
RUN /binaries/install_ddtrace.sh

Purpose: Install the Datadog tracer into the runtime image. The install_ddtrace.sh script uses:

- Native profiler: Datadog.Trace.ClrProfiler.Native.so
- Managed tracer DLLs: net6.0/Datadog.Trace.dll, etc.
nccatoni left a comment:
LGTM (for @DataDog/system-tests-core) but you should get a review from someone more familiar with the weblog



Motivation
Prepare the .NET weblog for test_flag_eval_metrics.py tests to validate the new feature_flag.evaluations OTel metric implementation.

Related PRs:
Changes
Weblog (utils/build/docker/dotnet/weblog/):

- Fix JsonElement handling in FeatureFlagEvaluatorController.cs
  - System.Text.Json deserializes object properties as JsonElement, not native types
  - Extract native values from JsonElement before calling OpenFeature APIs
  - Fixes the InvalidCastException that was occurring before OpenFeature hooks could fire
- Upgrade OpenFeature packages in app.csproj:
  - OpenFeature: 2.0.0 → 2.3.0 (required for the FinallyAsync hook signature with FlagEvaluationDetails<T>)
  - Datadog.FeatureFlags.OpenFeature: Use wildcard version to pick up local/CI builds

Note: The tests/ffe/test_flag_eval_metrics.py tests remain marked as missing_feature in the manifest until dd-trace-dotnet v3.42.0 is released with flag eval metrics support.

Workflow
🚀 Once your PR is reviewed and the CI is green, you can merge it!
🛟 #apm-shared-testing 🛟
Reviewer checklist
- tests/ or manifests/ is modified? I have the approval from R&P team
- A build-XXX-image label is present