Description
System: macOS Sequoia 15.6
Unity version: 6000.0.32f1
com.unity.xr.arkit version: 6.1.0
com.unity.xr.arfoundation version: 6.1.0
com.unity.xr.core-utils version: 2.5.1
Expected behavior: ARFoundationQRTracker.OnCameraFrameReceived executes qrDetector.DetectMultipleAsync without errors
Observed behavior: ARFoundationQRDetector.DetectMultipleAsync throws "ArgumentException: TextureFormat not supported" at line 37.
Why: that line calls XRCpuImage.ConvertAsync with ConversionParams whose outputFormat is TextureFormat.R8, which, judging by what the debugger shows, XRCpuImage no longer supports.
Consequences: OnCameraFrameReceived is never called again, because the exception aborts it before qrDetectionProcessing is set back to false, so no further images are processed.
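Independently of the format fix, the tracker could be made resilient to this by resetting the flag in a finally block. A minimal sketch; qrDetectionProcessing, qrDetector, and DetectMultipleAsync are from this report, while the method name and handler around them are assumptions:

```csharp
// Sketch only: ProcessFrameAsync and HandleResults are hypothetical names,
// not from the repo.
private async UniTaskVoid ProcessFrameAsync(XRCpuImage image)
{
    qrDetectionProcessing = true;
    try
    {
        var results = await qrDetector.DetectMultipleAsync(image);
        HandleResults(results); // hypothetical result handler
    }
    finally
    {
        image.Dispose();
        qrDetectionProcessing = false; // reset even when DetectMultipleAsync throws
    }
}
```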
Question: what format should be used instead? The raw data received from ConvertAsync is passed to this line:
var luminanceSource = new NativeArrayLuminanceSource(rawData, width, height);
which then throws, because the array it receives is three times the length it expects (RGB24 is three bytes per pixel, R8 is one byte per pixel).
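To make the mismatch concrete, here is the buffer-size arithmetic for a hypothetical frame (the dimensions are illustrative, not from the report):

```csharp
using System;

class BufferSizeDemo
{
    static void Main()
    {
        // Hypothetical frame dimensions for illustration.
        int width = 1920, height = 1440;
        int rgb24Length = width * height * 3; // what ConvertAsync returns for RGB24
        int r8Length = width * height;        // what NativeArrayLuminanceSource expects
        Console.WriteLine(rgb24Length / r8Length); // prints 3
    }
}
```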
Temporary solution:
- Add a function to ARFoundationQRDetector that compresses an RGB24 raw data array into an R8 raw data array by averaging the three channel bytes of each pixel:
public NativeArray<byte> ConvertRGB24ToR8(NativeArray<byte> input)
{
    // Using a plain C# array for the intermediate buffer because per-element
    // NativeArray writes seemed to lag here.
    byte[] resultArr = new byte[input.Length / 3];
    for (int i = 0; i < input.Length; i += 3)
    {
        // Average the R, G, and B bytes of one pixel into one grayscale byte.
        resultArr[i / 3] = (byte)((input[i] + input[i + 1] + input[i + 2]) / 3);
    }
    return new NativeArray<byte>(resultArr, Allocator.TempJob);
}
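A possible refinement, not required for the fix: a plain average weights all channels equally, while luminance conversion conventionally weights green more heavily. A sketch using integer BT.601-style coefficients, written over byte[] so it runs outside Unity (the class and method names here are mine):

```csharp
using System;

static class Luma
{
    // BT.601-style integer weights: 77 + 150 + 29 = 256, so >> 8 normalizes.
    public static byte[] ConvertRgb24ToLuma(byte[] rgb)
    {
        byte[] luma = new byte[rgb.Length / 3];
        for (int i = 0, j = 0; i < rgb.Length; i += 3, j++)
        {
            luma[j] = (byte)((77 * rgb[i] + 150 * rgb[i + 1] + 29 * rgb[i + 2]) >> 8);
        }
        return luma;
    }
}
```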
- Patch ARFoundationQRDetector.DetectMultipleAsync so rawData is converted to R8 before building the luminance source, like so:
public async UniTask<Result[]> DetectMultipleAsync(XRCpuImage image)
{
    XRCpuImage.AsyncConversion request = default;
    try
    {
        request = image.ConvertAsync(new XRCpuImage.ConversionParams
        {
            inputRect = new RectInt(0, 0, image.width, image.height),
            outputDimensions = new Vector2Int(image.width, image.height),
-           outputFormat = TextureFormat.R8,
+           outputFormat = TextureFormat.RGB24, // R8 no longer supported by XRCpuImage
        });
        await UniTask.WaitUntil(() => request.status.IsDone());
        if (request.status != XRCpuImage.AsyncConversionStatus.Ready)
        {
            Debug.LogErrorFormat("Request failed with status {0}", request.status);
            return Array.Empty<Result>();
        }
        await UniTask.SwitchToThreadPool();
        NativeArray<byte> rawData = request.GetData<byte>();
+       // Collapse the three RGB channels into one grayscale channel by averaging.
+       // Note: rawData is a view into the conversion's buffer and must not be
+       // disposed itself; request.Dispose() in the finally block releases it.
+       NativeArray<byte> rawDataR8 = ConvertRGB24ToR8(rawData);
        var width = request.conversionParams.outputDimensions.x;
        var height = request.conversionParams.outputDimensions.y;
-       var luminanceSource = new NativeArrayLuminanceSource(rawData, width, height);
+       var luminanceSource = new NativeArrayLuminanceSource(rawDataR8, width, height);
        var results = barcodeReader.DecodeMultiple(luminanceSource);
+       // Dispose only after decoding, in case the luminance source reads the
+       // native buffer directly rather than copying it.
+       rawDataR8.Dispose();
        await UniTask.SwitchToMainThread();
        return results ?? Array.Empty<Result>();
    }
    finally
    {
        request.Dispose();
    }
}
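An alternative worth testing that avoids the conversion and the averaging pass entirely: ARKit delivers camera frames in a biplanar YpCbCr format, and plane 0 is already 8-bit luminance, which is exactly the single-channel data the decoder wants. A sketch; CopyLuminancePlane is a name I made up, while GetPlane, Plane.data, and rowStride are the real XRCpuImage API:

```csharp
// Sketch, assuming plane 0 is the Y (luminance) plane, which holds for the
// biplanar YpCbCr formats ARKit provides. The XRCpuImage must stay alive
// (undisposed) while the plane data is read.
public NativeArray<byte> CopyLuminancePlane(XRCpuImage image, Allocator allocator)
{
    XRCpuImage.Plane yPlane = image.GetPlane(0);
    var result = new NativeArray<byte>(image.width * image.height, allocator);
    // rowStride may include padding beyond width, so copy row by row.
    for (int row = 0; row < image.height; row++)
    {
        NativeArray<byte>.Copy(yPlane.data, row * yPlane.rowStride,
                               result, row * image.width, image.width);
    }
    return result;
}
```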