If you do not need the recording and playback functionality inside your own app, using an off-the-shelf streaming app and player is a reasonable choice.
If you do need them in your app, however, you will have to look into the MediaRecorder API (for the server/camera app) and the MediaPlayer API (for the client/player app).
Quick sample code for the server:
// this is your network socket
ParcelFileDescriptor pfd = ParcelFileDescriptor.fromSocket(socket);
mCamera = getCameraInstance();
mMediaRecorder = new MediaRecorder();
mCamera.unlock();
mMediaRecorder.setCamera(mCamera);
mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
mMediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
// 8 is the unofficially supported MPEG-2 TS format, suitable for streaming (Android 3.0+);
// newer API levels expose this value as MediaRecorder.OutputFormat.MPEG_2_TS
mMediaRecorder.setOutputFormat(8);
mMediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.DEFAULT);
mMediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.DEFAULT);
mMediaRecorder.setOutputFile(pfd.getFileDescriptor());
mMediaRecorder.setPreviewDisplay(mPreview.getHolder().getSurface());
mMediaRecorder.prepare();
mMediaRecorder.start();
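The snippet above assumes socket is already a TCP socket connected to the other device. One way to obtain it on the camera side is to listen and accept the player's connection; a minimal sketch (the port 8888 is just an example, and accept() must run off the UI thread) could look like this:
// wait for the player device to connect (java.net.ServerSocket / java.net.Socket)
ServerSocket serverSocket = new ServerSocket(8888);
Socket socket = serverSocket.accept(); // blocks until the player connects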
The player side is a bit trickier; you could try this:
// this is your network socket, connected to the server
ParcelFileDescriptor pfd = ParcelFileDescriptor.fromSocket(socket);
mMediaPlayer = new MediaPlayer();
mMediaPlayer.setDataSource(pfd.getFileDescriptor());
mMediaPlayer.prepare();
mMediaPlayer.start();
Unfortunately MediaPlayer tends not to like this, so you have a couple of options: (a) save the data arriving on the socket to a file and, once a bit of it has been buffered, play that file with MediaPlayer, or (b) run a tiny HTTP proxy locally that accepts MediaPlayer's GET request, replies with HTTP headers, and then copies the data from the remote server into the response. For (a) you create the MediaPlayer with a file path or file URL; for (b) you give it an http URL pointing at your proxy (see the sketch below).
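Here is a rough sketch of the proxy idea in (b), not a drop-in implementation: the class name LocalStreamProxy, the port 9999, and the video/mp2t content type are assumptions for illustration, and error handling is left out (imports needed: java.io.*, java.net.*).
class LocalStreamProxy implements Runnable {
    private final Socket remote; // socket connected to the camera/server device

    LocalStreamProxy(Socket remote) { this.remote = remote; }

    public void run() {
        try {
            // MediaPlayer will connect to this local port
            ServerSocket local = new ServerSocket(9999);
            Socket player = local.accept();
            // read (and ignore) MediaPlayer's GET request
            player.getInputStream().read(new byte[8192]);
            // minimal HTTP response headers, then the raw MPEG-TS bytes follow
            OutputStream out = player.getOutputStream();
            out.write(("HTTP/1.0 200 OK\r\n"
                    + "Content-Type: video/mp2t\r\n"
                    + "Connection: close\r\n\r\n").getBytes("UTF-8"));
            // copy bytes from the remote socket straight to MediaPlayer
            InputStream in = remote.getInputStream();
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
        } catch (IOException e) {
            // log / clean up as appropriate
        }
    }
}
You would then start the proxy on a background thread and point MediaPlayer at the local URL instead of the socket's file descriptor:
new Thread(new LocalStreamProxy(socket)).start();
mMediaPlayer.setDataSource("http://127.0.0.1:9999/");
mMediaPlayer.prepare(); // or prepareAsync() to avoid blocking the caller
mMediaPlayer.start();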
See also:
Stream live video from phone to phone using socket fd
MediaPlayer stutters at start of mp3 playback