
I want to use the Android camera to report lighting and colour information from a sampled patch on the image preview. The CameraX preview pipeline delivers ImageProxy frames (via ImageAnalysis), and I can get the average LUV data for a patch. I would like to turn this data into absolute light levels using the exposure information and the camera white balance. The exposure data is in the Exif information, and perhaps the white balance information is too.

I would like this information, however we get it. Exif seems a very likely route, but any other non-Exif solutions are welcome.

At first sight, it looks as if Exif is always read from a file. However, ExifInterface can also be created from an InputStream, and one of the streamType options is STREAM_TYPE_EXIF_DATA_ONLY. This looks promising: it implies that something can produce a stream of just the Exif data, and a camera preview could plausibly do exactly that. Or maybe we can get Exif from the ImageProxy somehow.
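
For reference, a minimal sketch of that stream route, assuming something (not shown here) can hand over an InputStream containing just the Exif segment; the stream name, function name, and choice of tags are my guesses at what would be useful:

    import android.util.Log
    import androidx.exifinterface.media.ExifInterface
    import java.io.InputStream

    // Sketch only: `exifStream` is a hypothetical InputStream delivering just the
    // Exif segment, which is what STREAM_TYPE_EXIF_DATA_ONLY expects.
    fun readExposureExif(exifStream: InputStream) {
        val exif = ExifInterface(exifStream, ExifInterface.STREAM_TYPE_EXIF_DATA_ONLY)
        val exposureTime = exif.getAttributeDouble(ExifInterface.TAG_EXPOSURE_TIME, -1.0)  // seconds
        val iso = exif.getAttributeInt(ExifInterface.TAG_PHOTOGRAPHIC_SENSITIVITY, -1)
        val whiteBalance = exif.getAttributeInt(ExifInterface.TAG_WHITE_BALANCE, -1)       // 0 = auto, 1 = manual
        Log.d("ExifProbe", "Exif: exposure=$exposureTime s, ISO=$iso, WB=$whiteBalance")
    }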

I found many old threads on how to get at Exif data to find out the camera orientation. About 4 years ago these people were saying Exif is only read from a file. Is this still so?

Reply to comment: With due misgiving, I attach my dodgy code...

    // LuvListener is assumed to be a function type, e.g. typealias LuvListener = (luv: DoubleArray) -> Unit
    private class LuvAnalyzer(private val listener: LuvListener) : ImageAnalysis.Analyzer {

        private fun ByteBuffer.toByteArray(): ByteArray {
            rewind()    // Rewind the buffer to zero
            val data = ByteArray(remaining())
            get(data)   // Copy the buffer into a byte array
            return data // Return the byte array
        }

        override fun analyze(image: ImageProxy) {
            // Sum for 1/5 width square of YUV_420_888 image
            val YUV = DoubleArray(3)

            val w = image.width
            val h = image.height
            val sq = kotlin.math.min(h,w) / 5
            val w0 = ((w - sq)/4)*2
            val h0 = ((h - sq)/4)*2

            var ySum = 0
            var uSum = 0
            var vSum = 0

            val y = image.planes[0].buffer.toByteArray()
            val stride = image.planes[0].rowStride

            var offset = h0*stride + w0
            for (row in 1..sq) {
                var o = offset
                for (pix in 1..sq) { ySum += y[o++].toInt() and 0xFF }
                offset += stride
            }
            YUV[0] = ySum.toDouble()/(sq*sq).toDouble()

            // Chroma: assumes planes[1] is interleaved U,V (pixelStride == 2),
            // and uses that plane's own row stride rather than the luma stride
            val uv = image.planes[1].buffer.toByteArray()
            val uvStride = image.planes[1].rowStride
            offset = (h0/2)*uvStride + w0
            for (row in 1..sq/2) {
                var o = offset
                for (pix in 1..sq/2) {
                    uSum += uv[o++].toInt() and 0xFF
                    vSum += uv[o++].toInt() and 0xFF
                }
                offset += uvStride
            }
            YUV[1] = uSum.toDouble()/(sq*sq/4).toDouble()
            YUV[2] = vSum.toDouble()/(sq*sq/4).toDouble()

            // val exif = Exif.createFromImageProxy(image)
            listener(YUV)

            image.close()
        }
    }

    private fun startCamera() {
        val cameraProviderFuture = ProcessCameraProvider.getInstance(this)

        cameraProviderFuture.addListener({
            // Used to bind the lifecycle of cameras to the lifecycle owner
            val cameraProvider: ProcessCameraProvider = cameraProviderFuture.get()

            // Preview
            val preview = Preview.Builder()
                .build()
                .also {
                    it.setSurfaceProvider(binding.viewFinder.surfaceProvider)
                }

            imageCapture = ImageCapture.Builder()
                .build()

            // Image analyser
            val imageAnalyzer = ImageAnalysis.Builder()
                .build()
                .also {
                    it.setAnalyzer(cameraExecutor, LuvAnalyzer { LUV ->
                        // Log.d(TAG, "Average LUV: %.1f %.1f %.1f".format(LUV[0], LUV[1], LUV[2]))
                        luvText = "Average LUV: %.1f %.1f %.1f".format(LUV[0], LUV[1], LUV[2])
                    })
                }

            // Select back camera as a default
            val cameraSelector = CameraSelector.DEFAULT_BACK_CAMERA

            try {
                // Unbind use cases before rebinding
                cameraProvider.unbindAll()

                // Bind use cases to camera
                cameraProvider.bindToLifecycle(
                    this, cameraSelector, preview, imageCapture, imageAnalyzer)

            } catch(exc: Exception) {
                Log.e(TAG, "Use case binding failed", exc)
            }

        }, ContextCompat.getMainExecutor(this))
    }

I am doing my image averaging from an ImageProxy. I am currently trying to get the Exif data from the same ImageProxy, because I am not saving images to files; this is intended to provide a continuous stream of colour values. There is also an intriguing Exif.createFromImageProxy(image) (now commented out) which I discovered after writing the original note, but I can't get it to do anything.

I might get the Exif information if I saved an image to a .jpg file and then read it back in again. The camera is putting out a stream of preview images, and the exposure settings may be changing all the time, so I would have to save a stream of images. If I was really stuck, I might try that. But I feel there are enough Exif bits and pieces to get the information live from the camera.
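
For completeness, here is roughly what that fallback might look like. The helper name and temporary file are mine, and it is meant to live in the same activity as startCamera(), so TAG and the usual CameraX / androidx ExifInterface imports are assumed:

    // Fallback sketch: capture one frame to a temporary .jpg and read its Exif.
    // This samples the exposure occasionally rather than per preview frame.
    fun probeExifOnce(imageCapture: ImageCapture, photoFile: File, executor: Executor) {
        val outputOptions = ImageCapture.OutputFileOptions.Builder(photoFile).build()
        imageCapture.takePicture(outputOptions, executor,
            object : ImageCapture.OnImageSavedCallback {
                override fun onImageSaved(output: ImageCapture.OutputFileResults) {
                    val exif = ExifInterface(photoFile)
                    val exposureTime = exif.getAttributeDouble(ExifInterface.TAG_EXPOSURE_TIME, -1.0)
                    val iso = exif.getAttributeInt(ExifInterface.TAG_PHOTOGRAPHIC_SENSITIVITY, -1)
                    Log.d(TAG, "Saved-frame Exif: exposure=$exposureTime s, ISO=$iso")
                }
                override fun onError(exc: ImageCaptureException) {
                    Log.e(TAG, "Exif probe capture failed", exc)
                }
            })
    }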

  • Update

The people on the Google camerax-developers group suggest getting the exposure information using the Camera2 interop Extender (Camera2Interop.Extender). I have got it working well enough to see the numbers go up and down roughly as they should. This feels a lot better than the Exif route.

I am tempted to mark this as the solution, as it is the solution for me, but I shall leave it open as my original question in the title may have an answer.

    val previewBuilder = Preview.Builder()
    val previewExtender = Camera2Interop.Extender(previewBuilder)

    // Turn AWB off by pinning it to the daylight preset, so the colour gains stay constant
    previewExtender.setCaptureRequestOption(
        CaptureRequest.CONTROL_AWB_MODE,
        CaptureRequest.CONTROL_AWB_MODE_DAYLIGHT)

    previewExtender.setSessionCaptureCallback(
        object : CameraCaptureSession.CaptureCallback() {
            override fun onCaptureCompleted(
                session: CameraCaptureSession,
                request: CaptureRequest,
                result: TotalCaptureResult
            ) {
                // Exposure and colour data for the frame just captured
                result.get(CaptureResult.SENSOR_EXPOSURE_TIME)       // nanoseconds
                result.get(CaptureResult.SENSOR_SENSITIVITY)         // ISO
                result.get(CaptureResult.COLOR_CORRECTION_GAINS)
                result.get(CaptureResult.COLOR_CORRECTION_TRANSFORM)
            }
        }
    )
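
To connect this back to the original aim, the following is a sketch of how I would combine those callback values with the analyzer's Y average. The volatile fields, the scale factor k, and the assumption that Y is roughly linear are all mine; this gives a relative light level, not a calibrated absolute one:

    // Sketch: stash the latest exposure values from onCaptureCompleted (these
    // would be members of the same activity) and fold them into the Y average.
    @Volatile private var exposureTimeNs = 0L
    @Volatile private var sensitivityIso = 0

    // In onCaptureCompleted above:
    //   exposureTimeNs = result.get(CaptureResult.SENSOR_EXPOSURE_TIME) ?: 0L
    //   sensitivityIso = result.get(CaptureResult.SENSOR_SENSITIVITY) ?: 0

    private fun relativeLightLevel(yAverage: Double): Double {
        if (exposureTimeNs <= 0L || sensitivityIso <= 0) return 0.0
        val exposureSeconds = exposureTimeNs / 1e9   // SENSOR_EXPOSURE_TIME is in nanoseconds
        val k = 1.0                                  // would need calibration for absolute units
        return k * yAverage / (exposureSeconds * sensitivityIso)
    }
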
  • "However, ExifInterface can be created from an InputStream" Well, only if the stream delivers a .jpg or another file that has an Exif header. You did not tell us what you have available.
    – blackapps
    Commented Jun 3, 2022 at 12:03

