Twilight vision glasses. Android Camera2 API from the Maker, Part 5



Living in an era of technological breakthroughs, watching Musk's and Bezos's rockets rush into the sky, we ordinary people with a higher technical education often fail to notice that a breakthrough can be made not out there in space, but right here next to us, literally without getting up from the couch.

Judge for yourself what a discovery can grow out of reading an ordinary article about modern smartphones. I will not name the source, so as not to share future income.
The gist of it was roughly this: Google Pixel phones lean heavily on computational photography, combining RAW capture, HDR stacking and neural-network scene recognition, and on the Pixel 4 the "Night Sight" mode lets you shoot in almost complete darkness. Point the camera into the night and you get a picture, not noise.

The trouble is that walking around at night peering at a phone screen, even in night mode, is somehow awkward. And then my eye happened to fall on a VR headset for a smartphone lying on a shelf. The breakthrough was at hand! All that remained was to use it, together with the knowledge accumulated over four posts about the Android Camera2 API, to pipe the "Night Sight" picture straight into the eyes. That also leaves the hands free to catch a black cat in a dark room. Without light, of course, it will not work; photons are needed, at least a few. But we should at least reach (or even surpass) the level of a cat's eyes in the dark.

So, to learn how to see in the dark, we need:

1: a virtual reality headset for a smartphone (the cheapest one possible)



2: a smartphone that supports Google's modern camera features (this one definitely won't be the cheapest)



3: knowledge of the basics of the Android Camera2 API (which we already have):

part one
part two
part three
part four

We open a new project in Android Studio and start sculpting the code.

The first thing to do is put together the VR layout itself, the two surfaces that will shine in the headset.

Layout
<?xml version="1.0" encoding="utf-8"?>
<androidx.constraintlayout.widget.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:background="#03061B"
    tools:context=".MainActivity">

    <TextureView
        android:id="@+id/textureView"
        android:layout_width="240dp"
        android:layout_height="320dp"
        android:layout_marginTop="28dp"
        app:layout_constraintEnd_toEndOf="parent"
        app:layout_constraintHorizontal_bias="0.497"
        app:layout_constraintStart_toStartOf="parent"
        app:layout_constraintTop_toTopOf="parent" />

    <TextureView
        android:id="@+id/textureView3"
        android:layout_width="240dp"
        android:layout_height="320dp"
        android:layout_marginTop="16dp"
        app:layout_constraintEnd_toEndOf="parent"
        app:layout_constraintStart_toStartOf="parent"
        app:layout_constraintTop_toBottomOf="@+id/textureView" />

    <LinearLayout
        android:layout_width="165dp"
        android:layout_height="40dp"
        app:layout_constraintBottom_toBottomOf="parent"
        app:layout_constraintEnd_toEndOf="parent"
        app:layout_constraintStart_toStartOf="parent"
        app:layout_constraintTop_toBottomOf="@+id/textureView3"
        app:layout_constraintVertical_bias="0.838">

        <Button
            android:id="@+id/button1"
            android:layout_width="wrap_content"
            android:layout_height="36dp"
            android:backgroundTint="#3F51B5"
            android:text=""
            android:textColor="#1A87DD" />

        <Button
            android:id="@+id/button3"
            android:layout_width="wrap_content"
            android:layout_height="37dp"
            android:backgroundTint="#3F51B5"
            android:text=""
            android:textColor="#2196F3" />
    </LinearLayout>


</androidx.constraintlayout.widget.ConstraintLayout>

The output should look something like this:



As you can see, the textures came out rectangular, while in any VR toy for smartphones the developers somehow manage to make them round. But we won't dwell on that for long; experience will show that it works fine as is.
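
Incidentally, if the rectangles offend the eye, since API 21 any View can be clipped to an oval outline, and that should include a TextureView. A quick sketch, not tested in the headset (ViewOutlineProvider and Outline come from android.view and android.graphics):

// An untested sketch: clip a TextureView to an oval outline (API 21+).
TextureView textureView = findViewById(R.id.textureView);
textureView.setOutlineProvider(new ViewOutlineProvider() {
    @Override
    public void getOutline(View view, Outline outline) {
        outline.setOval(0, 0, view.getWidth(), view.getHeight());
    }
});
textureView.setClipToOutline(true);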

Now we write a modest Activity, in which everything is already familiar from the previous articles.

package com.example.twovideosurfaces;
import androidx.annotation.RequiresApi;
import androidx.appcompat.app.AppCompatActivity;
import androidx.core.content.ContextCompat;
import android.Manifest;
import android.content.Context;
import android.content.pm.ActivityInfo;
import android.content.pm.PackageManager;
import android.graphics.SurfaceTexture;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CameraManager;
import android.hardware.camera2.CaptureRequest;
import android.os.Build;
import android.os.Bundle;
import android.os.Handler;
import android.os.HandlerThread;
import android.os.StrictMode;
import android.util.Log;
import android.view.Surface;
import android.view.TextureView;
import android.view.View;
import android.widget.Button;
import java.util.Arrays;
public class MainActivity extends AppCompatActivity  {
    public static final String LOG_TAG = "myLogs";
    public static Surface surface1 = null;
    public static Surface surface2 = null;
    CameraService[] myCameras = null;
    private CameraManager mCameraManager = null;
    private final int CAMERA1 = 0;
    private Button mOn = null;
    private Button mOff = null;
    public static TextureView mImageViewUp = null;
    public static TextureView mImageViewDown = null;
    private HandlerThread mBackgroundThread;
    private Handler mBackgroundHandler = null;
    private void startBackgroundThread() {
        mBackgroundThread = new HandlerThread("CameraBackground");
        mBackgroundThread.start();
        mBackgroundHandler = new Handler(mBackgroundThread.getLooper());
    }
    private void stopBackgroundThread() {
        mBackgroundThread.quitSafely();
        try {
            mBackgroundThread.join();
            mBackgroundThread = null;
            mBackgroundHandler = null;
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    }
    @RequiresApi(api = Build.VERSION_CODES.M)
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        StrictMode.ThreadPolicy policy = new StrictMode.ThreadPolicy.Builder().permitAll().build();
        StrictMode.setThreadPolicy(policy);
        setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_PORTRAIT);
        setContentView(R.layout.activity_main);
        Log.d(LOG_TAG, "onCreate");
        if (checkSelfPermission(Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED
                ||
                (ContextCompat.checkSelfPermission(MainActivity.this, Manifest.permission.WRITE_EXTERNAL_STORAGE) != PackageManager.PERMISSION_GRANTED)
        ) {
            requestPermissions(new String[]{Manifest.permission.CAMERA, Manifest.permission.WRITE_EXTERNAL_STORAGE}, 1);
        }
        mOn = findViewById(R.id.button1);
        mOff = findViewById(R.id.button3);
        mImageViewUp = findViewById(R.id.textureView);
        mImageViewDown = findViewById(R.id.textureView3);
        mOn.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {
                if (myCameras[CAMERA1] != null) { // if the camera exists, open it
                    if (!myCameras[CAMERA1].isOpen()) myCameras[CAMERA1].openCamera();
                }
            }
        });
        mOff.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {
                // close the camera if it is open
                if (myCameras[CAMERA1] != null && myCameras[CAMERA1].isOpen()) {
                    myCameras[CAMERA1].closeCamera();
                }
            }
        });
        mCameraManager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
        try {
            // get the list of cameras from the camera manager
            myCameras = new CameraService[mCameraManager.getCameraIdList().length];
            for (String cameraID : mCameraManager.getCameraIdList()) {
                Log.i(LOG_TAG, "cameraID: " + cameraID);
                int id = Integer.parseInt(cameraID);
                // create a handler for each camera
                myCameras[id] = new CameraService(mCameraManager, cameraID);
            }
        } catch (CameraAccessException e) {
            Log.e(LOG_TAG, e.getMessage());
            e.printStackTrace();
        }
    }
    public class CameraService {
        private String mCameraID;
        private CameraDevice mCameraDevice = null;
        private CameraCaptureSession mSession;
        private CaptureRequest.Builder mPreviewBuilder;
        public CameraService(CameraManager cameraManager, String cameraID) {
            mCameraManager = cameraManager;
            mCameraID = cameraID;
        }
        private CameraDevice.StateCallback mCameraCallback = new CameraDevice.StateCallback() {
            @Override
            public void onOpened(CameraDevice camera) {
                mCameraDevice = camera;
                Log.i(LOG_TAG, "Open camera  with id:" + mCameraDevice.getId());
                startCameraPreviewSession();
            }
            @Override
            public void onDisconnected(CameraDevice camera) {
                mCameraDevice.close();
                Log.i(LOG_TAG, "disconnect camera  with id:" + mCameraDevice.getId());
                mCameraDevice = null;
            }
            @Override
            public void onError(CameraDevice camera, int error) {
                Log.i(LOG_TAG, "error! camera id:" + camera.getId() + " error:" + error);
            }
        };
        private void startCameraPreviewSession() {
            // one camera, two targets: each TextureView gets its own Surface
            SurfaceTexture texture = mImageViewUp.getSurfaceTexture();
            texture.setDefaultBufferSize(1280, 1024);
            surface1 = new Surface(texture);
            SurfaceTexture texture2 = mImageViewDown.getSurfaceTexture();
            texture2.setDefaultBufferSize(1280, 1024);
            surface2 = new Surface(texture2);
            try {
                mPreviewBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_RECORD);
                mPreviewBuilder.addTarget(surface1);
                mPreviewBuilder.addTarget(surface2);
                mCameraDevice.createCaptureSession(Arrays.asList(surface1,surface2),
                        new CameraCaptureSession.StateCallback() {
                            @Override
                            public void onConfigured(CameraCaptureSession session) {
                                mSession = session;

                                try {
                                    mSession.setRepeatingRequest(mPreviewBuilder.build(), null, mBackgroundHandler);
                                } catch (CameraAccessException e) {
                                    e.printStackTrace();
                                }
                            }
                            @Override
                            public void onConfigureFailed(CameraCaptureSession session) {
                            }
                        }, mBackgroundHandler);
            } catch (CameraAccessException e) {
                e.printStackTrace();
            }
        }
        public boolean isOpen() {
            return mCameraDevice != null;
        }
        public void openCamera() {
            try {
                if (checkSelfPermission(Manifest.permission.CAMERA) == PackageManager.PERMISSION_GRANTED) {
                    mCameraManager.openCamera(mCameraID, mCameraCallback, mBackgroundHandler);
                }
            } catch (CameraAccessException e) {
                Log.i(LOG_TAG, e.getMessage());
            }
        }
        public void closeCamera() {
            if (mCameraDevice != null) {
                mCameraDevice.close();
                mCameraDevice = null;
            }
        }
    }
    @Override
    public void onPause() {
        if (myCameras != null && myCameras[CAMERA1] != null && myCameras[CAMERA1].isOpen()) {
            myCameras[CAMERA1].closeCamera();
        }
        stopBackgroundThread();
        super.onPause();
    }
    @Override
    public void onResume() {
        super.onResume();
        startBackgroundThread();
    }
}

Yes, and don't forget about the manifest.

Manifest
<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.example.twovideosurfaces">
    <uses-permission android:name="android.permission.CAMERA" />
    <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
    <uses-permission android:name="android.permission.INTERNET"/>
    <application
        android:allowBackup="true"
        android:icon="@mipmap/ic_launcher"
        android:label="@string/app_name"
        android:roundIcon="@mipmap/ic_launcher_round"
        android:supportsRtl="true"
        android:theme="@style/Theme.AppCompat.NoActionBar"
        >
        <activity android:name=".MainActivity">
            <intent-filter>
                <action android:name="android.intent.action.MAIN" />
                <category android:name="android.intent.category.LAUNCHER" />
            </intent-filter>
        </activity>
    </application>
</manifest>


Now we push the smartphone into the VR headset and walk around the house, enjoying cyborg vision at a resolution of 1280 x 1024 per eye. The sensation is strange, of course, with a loss of depth perception, but still cool. The one problem is that the picture looks a bit dark, because the translucent front panel of the headset gets in the way, so you have to cut a hole in it in front of the smartphone's camera. Then again, on the most budget VR models such a panel may not exist at all, and you may well be spared the manual labor.

All that is left now is to convince the Google camera API that it is pitch dark around us and that it would do well to switch on the night mode, and with it all that RAW, HDR stacking and neural-network scene recognition.

To do this, it is enough to write into the session code (in startCameraPreviewSession(), before the request is built):

mPreviewBuilder.set(CaptureRequest.CONTROL_SCENE_MODE,
        CaptureRequest.CONTROL_SCENE_MODE_NIGHT);
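
Strictly speaking, not every device advertises the night scene mode, so before setting it you can ask the camera characteristics whether it is there at all. A minimal sketch, reusing mCameraManager and mCameraID from the Activity above (imports for CameraCharacteristics and CameraMetadata assumed):

// A sketch: check that this camera actually lists the night scene mode.
try {
    CameraCharacteristics characteristics = mCameraManager.getCameraCharacteristics(mCameraID);
    int[] sceneModes = characteristics.get(CameraCharacteristics.CONTROL_AVAILABLE_SCENE_MODES);
    boolean nightSupported = false;
    if (sceneModes != null) {
        for (int mode : sceneModes) {
            if (mode == CameraMetadata.CONTROL_SCENE_MODE_NIGHT) {
                nightSupported = true;
                break;
            }
        }
    }
    Log.i(LOG_TAG, "CONTROL_SCENE_MODE_NIGHT supported: " + nightSupported);
} catch (CameraAccessException e) {
    e.printStackTrace();
}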


and crank the exposure time and sensitivity up to the maximum:

mPreviewBuilder.set(CaptureRequest.CONTROL_AE_MODE,
        CaptureRequest.CONTROL_AE_MODE_OFF);                           // auto exposure off
mPreviewBuilder.set(CaptureRequest.SENSOR_EXPOSURE_TIME, 100000000L); // 100 ms in nanoseconds
mPreviewBuilder.set(CaptureRequest.SENSOR_SENSITIVITY, 30000);        // ISO 30000
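
The hard-coded 100 ms and ISO 30000 will not suit every sensor, so it is safer to read the legal ranges from the camera characteristics and clamp the desired values to them. Another sketch under the same assumptions (mCameraManager, mCameraID and mPreviewBuilder from the Activity above, Range from android.util):

// A sketch: clamp manual exposure and sensitivity to what the sensor allows.
try {
    CameraCharacteristics cc = mCameraManager.getCameraCharacteristics(mCameraID);
    Range<Long> expRange = cc.get(CameraCharacteristics.SENSOR_INFO_EXPOSURE_TIME_RANGE);
    Range<Integer> isoRange = cc.get(CameraCharacteristics.SENSOR_INFO_SENSITIVITY_RANGE);
    long exposureNs = 100000000L; // the 100 ms we asked for
    int sensitivity = 30000;      // the ISO we asked for
    if (expRange != null) exposureNs = expRange.clamp(exposureNs);
    if (isoRange != null) sensitivity = isoRange.clamp(sensitivity);
    mPreviewBuilder.set(CaptureRequest.SENSOR_EXPOSURE_TIME, exposureNs);
    mPreviewBuilder.set(CaptureRequest.SENSOR_SENSITIVITY, sensitivity);
} catch (CameraAccessException e) {
    e.printStackTrace();
}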


Oh, I'm blind!



This, it turns out, is roughly what the cat sees when he is kicked out of the bedroom, where he keeps people from having sex, into the living room.

But of course this is only a crude first pass; the parameters (there are a great many of them in the API, and only a couple are shown here) will need to be tweaked by experiment later.
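
For such experiments it is convenient to change a key on the existing builder and simply re-issue the repeating request. A hypothetical helper (the name updateSensitivity is mine, not from the original code), assuming mSession, mPreviewBuilder and mBackgroundHandler from the Activity above:

// Hypothetical helper for tweaking parameters on a live preview:
// change one key on the builder and restart the repeating request.
private void updateSensitivity(int iso) {
    if (mSession == null || mPreviewBuilder == null) return;
    mPreviewBuilder.set(CaptureRequest.SENSOR_SENSITIVITY, iso);
    try {
        mSession.setRepeatingRequest(mPreviewBuilder.build(), null, mBackgroundHandler);
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}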

Now we can only wait for night. Not a moonless one with dense cloud cover somewhere in the taiga, of course, but an ordinary night with the occasional stray photon. And here is what comes of it...

Even though with an ordinary shot almost nothing is visible.



But modern cameras work wonders and still find a black cat ...



Now you can take walks at night, since during the day you can't because of the quarantine. In theory you can't at night either, but who is going to spot someone stalking through the darkness with a VR headset on their head...
