Android Video Capture

http://www.rosoo.net/a/201111/15259.html


TAG: Android, video capture

 

The complete source code can be downloaded here:

http://bbs.rosoo.net/forum.php?mod=viewthread&tid=8669

Updated on the evening of 2010-10-13: I have added a fairly practical program that captures video on Android in real time and displays it on a PC, including both the PC-side and Android-side programs. It is based on Android 1.5 and was tested on an HTC G3. The code appears after the divider below.

I searched online for a long time without finding a way to intercept the Android video stream. Eventually I discovered that the frames can be captured during the camera preview: a callback is invoked every time a new frame arrives.

My development platform is Android 1.5. This program obtains the video stream and, as a simple demonstration, writes the 20th frame to a file so that it can be copied to a PC for analysis.

See the code below for details:

package com.sunshine;

import java.io.File;
import java.io.RandomAccessFile;

import android.app.Activity;
import android.content.res.Configuration;
import android.graphics.PixelFormat;
import android.hardware.Camera;
import android.os.Bundle;
import android.util.Log;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.Window;
import android.view.WindowManager;
import android.view.SurfaceHolder.Callback;

public class AndroidVideo extends Activity implements Callback,
        Camera.PictureCallback {
    private SurfaceView mSurfaceView = null;
    private SurfaceHolder mSurfaceHolder = null;
    private Camera mCamera = null;
    private boolean mPreviewRunning = false;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        getWindow().setFormat(PixelFormat.TRANSLUCENT);
        requestWindowFeature(Window.FEATURE_NO_TITLE);
        getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN,
                WindowManager.LayoutParams.FLAG_FULLSCREEN);

        setContentView(R.layout.main);

        mSurfaceView = (SurfaceView) this.findViewById(R.id.surface_camera);
        mSurfaceHolder = mSurfaceView.getHolder();
        mSurfaceHolder.addCallback(this);
        mSurfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
    }

    @Override
    public void onPictureTaken(byte[] data, Camera camera) {
        try {
            Log.v("System.out", "get it!");
            File file = new File("/sdcard/camera.jpg");
            RandomAccessFile raf = new RandomAccessFile(file, "rw");
            raf.write(data);
            raf.close();
        } catch (Exception ex) {
            Log.v("System.out", ex.toString());
        }
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width,
            int height) {
        if (mPreviewRunning) {
            mCamera.stopPreview();
        }
        Camera.Parameters p = mCamera.getParameters();
        // Note: the preview size should be one the camera actually supports.
        p.setPreviewSize(width, height);
        mCamera.setPreviewCallback(new StreamIt());
        mCamera.setParameters(p);
        try {
            mCamera.setPreviewDisplay(holder);
        } catch (Exception ex) {
            Log.v("System.out", ex.toString());
        }
        mCamera.startPreview();
        mPreviewRunning = true;
    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        mCamera = Camera.open();
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        mCamera.stopPreview();
        mPreviewRunning = false;
        mCamera.release();
    }

    @Override
    public void onConfigurationChanged(Configuration newConfig) {
        super.onConfigurationChanged(newConfig);
        // Orientation changes could be handled here if needed.
    }
}

class StreamIt implements Camera.PreviewCallback {
    private int tick = 1;

    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        // Write exactly one frame (the 20th) to a file for offline analysis.
        if (tick == 20) {
            System.out.println("data len: " + data.length);
            try {
                File file = new File("/sdcard/pal.pal");
                if (!file.exists())
                    file.createNewFile();
                RandomAccessFile raf = new RandomAccessFile(file, "rw");
                raf.write(data);
                raf.close();
            } catch (Exception ex) {
                Log.v("System.out", ex.toString());
            }
        }
        tick++;
    }
}

The XML layout file:

<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="fill_parent"
    android:layout_height="fill_parent"
    android:orientation="vertical">
    <SurfaceView
        android:id="@+id/surface_camera"
        android:layout_width="fill_parent"
        android:layout_height="fill_parent" />
</LinearLayout>

Note that the camera permission must also be declared in AndroidManifest.xml:

<uses-permission android:name="android.permission.CAMERA" />

From the documentation I found that each Android preview frame is delivered in YUV420SP (NV21) format by default.

Below is a function that converts YUV420SP to RGB:

static public void decodeYUV420SP(byte[] rgbBuf, byte[] yuv420sp,
        int width, int height) {
    final int frameSize = width * height;
    if (rgbBuf == null)
        throw new NullPointerException("buffer 'rgbBuf' is null");
    if (rgbBuf.length < frameSize * 3)
        throw new IllegalArgumentException("buffer 'rgbBuf' size "
                + rgbBuf.length + " < minimum " + frameSize * 3);

    if (yuv420sp == null)
        throw new NullPointerException("buffer 'yuv420sp' is null");

    if (yuv420sp.length < frameSize * 3 / 2)
        throw new IllegalArgumentException("buffer 'yuv420sp' size "
                + yuv420sp.length + " < minimum " + frameSize * 3 / 2);

    int i = 0, y = 0;
    int uvp = 0, u = 0, v = 0;
    int y1192 = 0, r = 0, g = 0, b = 0;

    for (int j = 0, yp = 0; j < height; j++) {
        uvp = frameSize + (j >> 1) * width;
        u = 0;
        v = 0;
        for (i = 0; i < width; i++, yp++) {
            y = (0xff & ((int) yuv420sp[yp])) - 16;
            if (y < 0) y = 0;
            if ((i & 1) == 0) {
                v = (0xff & yuv420sp[uvp++]) - 128;
                u = (0xff & yuv420sp[uvp++]) - 128;
            }

            y1192 = 1192 * y;
            r = (y1192 + 1634 * v);
            g = (y1192 - 833 * v - 400 * u);
            b = (y1192 + 2066 * u);

            if (r < 0) r = 0; else if (r > 262143) r = 262143;
            if (g < 0) g = 0; else if (g > 262143) g = 262143;
            if (b < 0) b = 0; else if (b > 262143) b = 262143;

            rgbBuf[yp * 3] = (byte) (r >> 10);
            rgbBuf[yp * 3 + 1] = (byte) (g >> 10);
            rgbBuf[yp * 3 + 2] = (byte) (b >> 10);
        }
    }
}
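As a quick sanity check of the fixed-point math above, here is a small runnable sketch (the class name `Yuv420Demo` is mine, for illustration only) that decodes a tiny 2x2 NV21 frame whose chroma bytes are neutral (128), so every output pixel should come out mid-gray:

```java
public class Yuv420Demo {
    // Same fixed-point YUV420SP -> RGB conversion as in the article,
    // condensed; clamping via Math.min/max is behavior-equivalent.
    public static void decodeYUV420SP(byte[] rgbBuf, byte[] yuv420sp,
            int width, int height) {
        final int frameSize = width * height;
        for (int j = 0, yp = 0; j < height; j++) {
            int uvp = frameSize + (j >> 1) * width, u = 0, v = 0;
            for (int i = 0; i < width; i++, yp++) {
                int y = (0xff & yuv420sp[yp]) - 16;
                if (y < 0) y = 0;
                if ((i & 1) == 0) { // one V,U pair per two pixels
                    v = (0xff & yuv420sp[uvp++]) - 128;
                    u = (0xff & yuv420sp[uvp++]) - 128;
                }
                int y1192 = 1192 * y;
                int r = Math.max(0, Math.min(262143, y1192 + 1634 * v));
                int g = Math.max(0, Math.min(262143, y1192 - 833 * v - 400 * u));
                int b = Math.max(0, Math.min(262143, y1192 + 2066 * u));
                rgbBuf[yp * 3] = (byte) (r >> 10);
                rgbBuf[yp * 3 + 1] = (byte) (g >> 10);
                rgbBuf[yp * 3 + 2] = (byte) (b >> 10);
            }
        }
    }

    public static void main(String[] args) {
        // 2x2 NV21 frame: 4 luma bytes, then one interleaved V,U pair.
        byte[] yuv = { (byte) 126, (byte) 126, (byte) 126, (byte) 126,
                (byte) 128, (byte) 128 }; // V = U = 128 -> no chroma
        byte[] rgb = new byte[2 * 2 * 3];
        decodeYUV420SP(rgb, yuv, 2, 2);
        // (126 - 16) * 1192 >> 10 = 128: a mid-gray pixel
        System.out.println(rgb[0] & 0xff); // 128
    }
}
```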

The code comes from http://chenweihuacwh.javaeye.com/blog/571223

Thanks to cwh643.

----------------------------- Divider -------------------------------------------

----------------------------- 2010-10-13 Update -------------------------------

Android side. (Note: the layout for this version must also define the remoteIP EditText and connect Button referenced below; that layout file was not included in the original post.)

package com.sunshine;

import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.net.Socket;

import android.app.Activity;
import android.content.res.Configuration;
import android.graphics.PixelFormat;
import android.hardware.Camera;
import android.os.Bundle;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.View;
import android.view.Window;
import android.view.WindowManager;
import android.view.SurfaceHolder.Callback;
import android.view.View.OnClickListener;
import android.widget.Button;
import android.widget.EditText;

public class AndroidVideo extends Activity implements Callback, OnClickListener {
    private SurfaceView mSurfaceView = null;
    private SurfaceHolder mSurfaceHolder = null;
    private Camera mCamera = null;
    private boolean mPreviewRunning = false;

    // Connection-related widgets
    private EditText remoteIP = null;
    private Button connect = null;
    private String remoteIPStr = null;

    // Video data
    private StreamIt streamIt = null;
    public static Kit kit = null;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        getWindow().setFormat(PixelFormat.TRANSLUCENT);
        requestWindowFeature(Window.FEATURE_NO_TITLE);
        getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN,
                WindowManager.LayoutParams.FLAG_FULLSCREEN);

        setContentView(R.layout.main);

        mSurfaceView = (SurfaceView) this.findViewById(R.id.surface_camera);
        mSurfaceHolder = mSurfaceView.getHolder();
        mSurfaceHolder.addCallback(this);
        mSurfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);

        remoteIP = (EditText) this.findViewById(R.id.remoteIP);
        connect = (Button) this.findViewById(R.id.connect);
        connect.setOnClickListener(this);
    }

    public void surfaceChanged(SurfaceHolder holder, int format, int width,
            int height) {
        if (mPreviewRunning) {
            mCamera.stopPreview();
        }
        Camera.Parameters p = mCamera.getParameters();
        p.setPreviewSize(width, height);
        streamIt = new StreamIt();
        kit = new Kit();
        mCamera.setPreviewCallback(streamIt);

        mCamera.setParameters(p);
        try {
            mCamera.setPreviewDisplay(holder);
        } catch (Exception ex) {
            ex.printStackTrace();
        }
        mCamera.startPreview();
        mPreviewRunning = true;
    }

    public void surfaceCreated(SurfaceHolder holder) {
        mCamera = Camera.open();
    }

    public void surfaceDestroyed(SurfaceHolder holder) {
        mCamera.stopPreview();
        mPreviewRunning = false;
        mCamera.release();
    }

    @Override
    public void onConfigurationChanged(Configuration newConfig) {
        super.onConfigurationChanged(newConfig);
    }

    class Kit implements Runnable {
        private boolean run = true;
        // dataLen would be 307200, 230400, 76800 or 57600 depending on size
        private final int tt = 28800;

        public void run() {
            try {
                Socket socket = new Socket(remoteIPStr, 8899);
                DataOutputStream dos = new DataOutputStream(socket.getOutputStream());
                DataInputStream dis = new DataInputStream(socket.getInputStream());
                while (run) {
                    byte[] frame = streamIt.yuv420sp;
                    if (frame == null) { // no preview frame captured yet
                        Thread.sleep(50);
                        continue;
                    }
                    // Send one frame (2 * tt = 57600 bytes) in two chunks.
                    dos.write(frame, 0, tt);
                    dos.write(frame, tt, tt);

                    dis.readBoolean(); // wait for the PC to acknowledge
                    Thread.sleep(155);
                }
            } catch (Exception ex) {
                run = false;
                ex.printStackTrace();
            }
        }
    }

    @Override
    public void onClick(View view) {
        if (view == connect) { // connect button
            remoteIPStr = remoteIP.getText().toString();
            new Thread(AndroidVideo.kit).start();
        }
    }
}

class StreamIt implements Camera.PreviewCallback {
    public byte[] yuv420sp = null;

    public void onPreviewFrame(byte[] data, Camera camera) {
        // Keep a reference to the latest preview frame (YUV420SP/NV21).
        yuv420sp = data;
    }
}
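The hard-coded 28800 in Kit is half of one frame: the PC side expects 240x160 NV21 frames, and an NV21 frame occupies width * height * 3/2 bytes. A quick check of that arithmetic (class name `FrameSize` is mine, for illustration):

```java
public class FrameSize {
    public static void main(String[] args) {
        int width = 240, height = 160;     // size expected by the PC side
        int lumaBytes = width * height;    // one Y byte per pixel
        int nv21Bytes = lumaBytes * 3 / 2; // plus V/U at quarter resolution each
        System.out.println(nv21Bytes);     // 57600 -> dataLen on the PC side
        System.out.println(nv21Bytes / 2); // 28800 -> tt, one write() chunk
    }
}
```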

PC side

import java.awt.Frame;
import java.awt.Graphics;
import java.awt.Point;
import java.awt.Transparency;
import java.awt.color.ColorSpace;
import java.awt.image.BufferedImage;
import java.awt.image.ComponentColorModel;
import java.awt.image.DataBuffer;
import java.awt.image.DataBufferByte;
import java.awt.image.PixelInterleavedSampleModel;
import java.awt.image.Raster;
import java.awt.image.SampleModel;
import java.awt.image.WritableRaster;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.net.ServerSocket;
import java.net.Socket;

public class FlushMe extends Frame {
    private static final long serialVersionUID = 1L;
    private BufferedImage im;
    // Image parameters (must match the Android preview size)
    // private static final int width = 480;
    // private static final int height = 320;
    private static final int width = 240;
    private static final int height = 160;
    private static final int numBands = 3;
    private static final int dataLen = 57600; // width * height * 3 / 2
    private static final int tt = 28800; // chunk size per read

    // Image buffers
    private byte[] byteArray = new byte[width * height * numBands]; // RGB
    private byte[] yuv420sp = new byte[dataLen]; // YUV

    private static final int[] bandOffsets = new int[] { 0, 1, 2 };
    private static final SampleModel sampleModel = new PixelInterleavedSampleModel(
            DataBuffer.TYPE_BYTE, width, height, 3, width * 3, bandOffsets);
    // ColorModel
    private static final ColorSpace cs = ColorSpace.getInstance(ColorSpace.CS_sRGB);
    private static final ComponentColorModel cm = new ComponentColorModel(cs,
            false, false, Transparency.OPAQUE, DataBuffer.TYPE_BYTE);

    public FlushMe() {
        super("Flushing");
        updateIM();
        setSize(480, 320);
        // Exit when the window is closed
        this.addWindowListener(new java.awt.event.WindowAdapter() {
            public void windowClosing(java.awt.event.WindowEvent e) {
                System.exit(0);
            }
        });
        // Center the window
        this.setLocationRelativeTo(null);
        this.setResizable(false);
        this.setVisible(true);
        this.getData();
    }

    public void update(Graphics g) {
        paint(g);
    }

    public void paint(Graphics g) {
        g.drawImage(im, 0, 0, 480, 320, this);
    }

    public void getData() {
        try {
            ServerSocket server = new ServerSocket(8899);
            Socket socket = server.accept();
            DataInputStream dis = new DataInputStream(socket.getInputStream());
            DataOutputStream dos = new DataOutputStream(socket.getOutputStream());
            while (true) {
                for (int i = 0; i < dataLen / tt; i++) {
                    // readFully blocks until the whole chunk has arrived;
                    // a plain read() may return a partial buffer
                    dis.readFully(yuv420sp, i * tt, tt);
                }
                // Update the display as soon as a frame arrives
                updateIM();
                im.flush();
                repaint();

                dos.writeBoolean(true); // acknowledge the frame
            }
        } catch (Exception ex) {
            ex.printStackTrace();
        }
    }

    private void updateIM() {
        try {
            // Decode YUV into RGB
            decodeYUV420SP(byteArray, yuv420sp, width, height);
            // The second argument is the buffer size in bytes
            DataBuffer dataBuffer = new DataBufferByte(byteArray, byteArray.length);
            WritableRaster wr = Raster.createWritableRaster(sampleModel,
                    dataBuffer, new Point(0, 0));
            im = new BufferedImage(cm, wr, false, null);
        } catch (Exception ex) {
            ex.printStackTrace();
        }
    }

    private static void decodeYUV420SP(byte[] rgbBuf, byte[] yuv420sp,
            int width, int height) {
        final int frameSize = width * height;
        if (rgbBuf == null)
            throw new NullPointerException("buffer 'rgbBuf' is null");
        if (rgbBuf.length < frameSize * 3)
            throw new IllegalArgumentException("buffer 'rgbBuf' size "
                    + rgbBuf.length + " < minimum " + frameSize * 3);

        if (yuv420sp == null)
            throw new NullPointerException("buffer 'yuv420sp' is null");

        if (yuv420sp.length < frameSize * 3 / 2)
            throw new IllegalArgumentException("buffer 'yuv420sp' size "
                    + yuv420sp.length + " < minimum " + frameSize * 3 / 2);

        int i = 0, y = 0;
        int uvp = 0, u = 0, v = 0;
        int y1192 = 0, r = 0, g = 0, b = 0;

        for (int j = 0, yp = 0; j < height; j++) {
            uvp = frameSize + (j >> 1) * width;
            u = 0;
            v = 0;
            for (i = 0; i < width; i++, yp++) {
                y = (0xff & ((int) yuv420sp[yp])) - 16;
                if (y < 0)
                    y = 0;
                if ((i & 1) == 0) {
                    v = (0xff & yuv420sp[uvp++]) - 128;
                    u = (0xff & yuv420sp[uvp++]) - 128;
                }

                y1192 = 1192 * y;
                r = (y1192 + 1634 * v);
                g = (y1192 - 833 * v - 400 * u);
                b = (y1192 + 2066 * u);

                if (r < 0)
                    r = 0;
                else if (r > 262143)
                    r = 262143;
                if (g < 0)
                    g = 0;
                else if (g > 262143)
                    g = 262143;
                if (b < 0)
                    b = 0;
                else if (b > 262143)
                    b = 262143;

                rgbBuf[yp * 3] = (byte) (r >> 10);
                rgbBuf[yp * 3 + 1] = (byte) (g >> 10);
                rgbBuf[yp * 3 + 2] = (byte) (b >> 10);
            }
        }
    }

    public static void main(String[] args) {
        Frame f = new FlushMe();
    }
}
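The send-two-chunks-then-wait-for-ack handshake between Kit and FlushMe can be exercised on a single machine with plain sockets. This is only a sketch (the class name `AckProtocolDemo` and the use of an ephemeral port instead of 8899 are my choices, not from the article); it sends one dummy frame and returns the acknowledgement:

```java
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.net.ServerSocket;
import java.net.Socket;

public class AckProtocolDemo {
    static final int TT = 28800;     // chunk size from the article
    static final int FRAME = TT * 2; // one 240x160 NV21 frame

    static boolean roundTrip() throws Exception {
        final ServerSocket server = new ServerSocket(0); // ephemeral port

        // "PC side": read one frame in TT-byte chunks, then acknowledge.
        Thread pc = new Thread(() -> {
            try (Socket s = server.accept()) {
                DataInputStream dis = new DataInputStream(s.getInputStream());
                DataOutputStream dos = new DataOutputStream(s.getOutputStream());
                byte[] frame = new byte[FRAME];
                for (int i = 0; i < FRAME / TT; i++)
                    dis.readFully(frame, i * TT, TT); // no short reads
                dos.writeBoolean(true); // ack
            } catch (Exception e) {
                e.printStackTrace();
            }
        });
        pc.start();

        // "Android side": send one dummy frame, wait for the ack.
        boolean ack;
        try (Socket s = new Socket("127.0.0.1", server.getLocalPort())) {
            DataOutputStream dos = new DataOutputStream(s.getOutputStream());
            DataInputStream dis = new DataInputStream(s.getInputStream());
            byte[] dummy = new byte[FRAME];
            dos.write(dummy, 0, TT);
            dos.write(dummy, TT, TT);
            ack = dis.readBoolean();
        }
        pc.join();
        server.close();
        return ack;
    }

    public static void main(String[] args) throws Exception {
        System.out.println("ack=" + roundTrip());
    }
}
```

The ack doubles as crude flow control: the sender never pushes a second frame until the receiver has fully consumed the first, which is why the article's Kit thread also sleeps between frames.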

A screenshot followed here in the original post (image not preserved).

(sundos)
