After some research on Google, I finally found an article about sockets that shows how to send requests between a client and a server located on the same network [source].
So I decided to implement it to hold a little conversation between the client and the server. Obviously, the phone will be the server and the PC the client, because it is the PC that will be requesting images from the phone.
So I implemented my ServerSocket thread on the phone and a ClientSocket on the PC, but it failed: the connection was never opened and nothing happened. After some more research, I finally found the solution [source].
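The server/client pair can be sketched in plain Java. This is a minimal loopback demo, not the actual project code: the class and message strings are my own, and on the real setup the client would connect to the phone's Wi-Fi IP address instead of 127.0.0.1.

```java
import java.io.*;
import java.net.*;

public class SocketSketch {
    // Runs a one-shot server and client on the loopback interface and
    // returns the reply the client receives.
    public static String exchange() throws Exception {
        ServerSocket server = new ServerSocket(0);      // port 0: pick any free port
        Thread serverThread = new Thread(() -> {
            try (Socket client = server.accept()) {     // blocks until the client connects
                BufferedReader in = new BufferedReader(
                        new InputStreamReader(client.getInputStream()));
                PrintWriter out = new PrintWriter(client.getOutputStream(), true);
                out.println("echo: " + in.readLine());  // reply to the client's message
            } catch (IOException e) { e.printStackTrace(); }
        });
        serverThread.start();

        // Client side (the PC would use the phone's Wi-Fi IP instead of 127.0.0.1).
        String reply;
        try (Socket socket = new Socket("127.0.0.1", server.getLocalPort())) {
            PrintWriter out = new PrintWriter(socket.getOutputStream(), true);
            BufferedReader in = new BufferedReader(
                    new InputStreamReader(socket.getInputStream()));
            out.println("ping");
            reply = in.readLine();
        }
        serverThread.join();
        server.close();
        return reply;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(exchange()); // prints "echo: ping"
    }
}
```

On Android, the common pitfall is that the ServerSocket must run off the UI thread (hence the thread here) and the app needs the INTERNET permission.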
I now have a working dialog between the phone and the PC. It's now time to send an image from the camera instead of just some text, and this is where things get tricky.
After some research, I found how to get the bytes of the current preview image thanks to a callback function [source]. So I thought that everything was solved: I had a byte array corresponding to an image, and my connection class was made to send byte arrays! So I put everything together quite quickly, created a simple GUI on the PC to display the picture on the client, and finally initiated the connection.
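A connection class that sends byte arrays needs some framing so the receiver knows where one image ends and the next begins. A common approach, sketched here in plain Java with my own class and method names, is a 4-byte length prefix before each payload:

```java
import java.io.*;

public class FrameCodec {
    // Write one byte-array frame: a 4-byte length prefix, then the payload.
    public static void writeFrame(OutputStream raw, byte[] payload) throws IOException {
        DataOutputStream out = new DataOutputStream(raw);
        out.writeInt(payload.length);
        out.write(payload);
        out.flush();
    }

    // Read one frame back: the length first, then exactly that many bytes.
    public static byte[] readFrame(InputStream raw) throws IOException {
        DataInputStream in = new DataInputStream(raw);
        int length = in.readInt();
        byte[] payload = new byte[length];
        in.readFully(payload);          // loops internally until all bytes arrive
        return payload;
    }

    public static void main(String[] args) throws IOException {
        // Round-trip a fake image through an in-memory "wire".
        byte[] fakeImage = {1, 2, 3, 4, 5};
        ByteArrayOutputStream wire = new ByteArrayOutputStream();
        writeFrame(wire, fakeImage);
        byte[] received = readFrame(new ByteArrayInputStream(wire.toByteArray()));
        System.out.println(received.length); // prints 5
    }
}
```

`readFully` matters here: a plain `read` on a socket may return fewer bytes than requested, which silently truncates large images.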
The connection works and the bytes are transferred, but the client just won't rebuild the image from them. After a lot of research, I finally found the beginning of a solution here [source, source]. In fact, even if we set the preview format to JPEG, the bytes provided by the Android camera PreviewCallback are always in YUV format. So to solve the problem, we have to convert each frame from YUV to RGB, encode it as JPEG, and finally send it as a byte array [source]. (Note: this solution is quite heavy for a phone; I should perhaps convert the data on the client instead.)
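The per-pixel YUV-to-RGB step can be sketched with one common integer approximation (the BT.601 video-range formula). This is an illustration, not the project's actual code; the class and method names are mine, and note that Android's default preview format (NV21) stores a full-resolution Y plane followed by interleaved V/U samples shared by 2x2 pixel blocks:

```java
public class YuvToRgb {
    // Convert one YUV sample triple to a packed 0xRRGGBB value using the
    // BT.601 integer approximation (Y in [16,235], U/V centered on 128).
    public static int yuvPixelToRgb(int y, int u, int v) {
        int c = y - 16, d = u - 128, e = v - 128;
        int r = clamp((298 * c + 409 * e + 128) >> 8);
        int g = clamp((298 * c - 100 * d - 208 * e + 128) >> 8);
        int b = clamp((298 * c + 516 * d + 128) >> 8);
        return (r << 16) | (g << 8) | b;
    }

    private static int clamp(int x) { return x < 0 ? 0 : (x > 255 ? 255 : x); }

    public static void main(String[] args) {
        // U = V = 128 means zero chroma, so the result is a pure gray.
        System.out.printf("%06x%n", yuvPixelToRgb(128, 128, 128));
    }
}
```

On Android itself, `android.graphics.YuvImage.compressToJpeg` can do the NV21-to-JPEG conversion in one call, which avoids hand-rolling this loop on the phone.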
I thought that everything was solved now, but no! Still not: the bitmap was transferred the right way, and I believed in the right format, but on the computer screen I got an unclear, striped image.
So now I was getting quite annoyed by this callback function, which seemed completely buggy. I decided to change my approach: instead of getting the image from the preview callback, I would get it from the takePicture callback (fired every time a picture is taken), which delivers JPEG data directly. So I commented out my YUV decoding code, sent the bytes obtained from the takePicture callback as-is, and finally wrote a perpetual loop that takes pictures continuously.
And it is a success! I finally have my first working image transfer from the phone's camera to the computer screen over Wi-Fi. But I still have one major problem: a frame rate of roughly one image every 10 seconds.
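The PC-side rebuild that finally worked boils down to decoding the received JPEG bytes back into an image. A minimal sketch with my own class name, using the standard `javax.imageio` API and simulating the phone side by encoding a test image in memory:

```java
import javax.imageio.ImageIO;
import java.awt.image.BufferedImage;
import java.io.*;

public class JpegRoundTrip {
    // Rebuild a BufferedImage from raw JPEG bytes, as the PC client does
    // after reading one frame off the socket.
    public static BufferedImage decode(byte[] jpegBytes) throws IOException {
        return ImageIO.read(new ByteArrayInputStream(jpegBytes));
    }

    public static void main(String[] args) throws IOException {
        // Stand-in for the phone side: encode a small test image to JPEG bytes.
        BufferedImage original = new BufferedImage(32, 24, BufferedImage.TYPE_INT_RGB);
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        ImageIO.write(original, "jpg", bytes);

        BufferedImage rebuilt = decode(bytes.toByteArray());
        System.out.println(rebuilt.getWidth() + "x" + rebuilt.getHeight()); // prints 32x24
    }
}
```

The decoded `BufferedImage` can then be handed straight to a Swing component (e.g. drawn in a `JPanel`) to display each frame in the GUI.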