Streaming video from Android camera to server
I've seen plenty of info about how to stream video from the server to an Android device, but not much about the other way, à la Qik. Could someone point me in the right direction, or give me some advice on how to approach this?
I have hosted an open source project that turns an Android phone into an IP camera:
http://code.google.com/p/ipcamera-for-android
The raw video data is fetched from a LocalSocket, and the MDAT/MOOV atoms of the MP4 are checked before streaming. The live video is packed into FLV format and can be played in a Flash video player via the built-in web server :)
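A minimal sketch of the LocalSocket trick, assuming a socket name of "camera_stream" and H.264/MP4 output (the project's own code differs in details, and error handling plus the preview surface normally required by MediaRecorder are omitted here):

```java
import android.media.MediaRecorder;
import android.net.LocalServerSocket;
import android.net.LocalSocket;
import android.net.LocalSocketAddress;

public class LocalSocketRecorder {

    public void startRecording() throws Exception {
        // Create a local (UNIX-domain) socket pair; MediaRecorder will write
        // into one end while our code reads raw bytes from the other.
        LocalServerSocket server = new LocalServerSocket("camera_stream");
        LocalSocket sender = new LocalSocket();
        sender.connect(new LocalSocketAddress("camera_stream"));
        LocalSocket receiver = server.accept();

        // Point MediaRecorder's output at the sender's file descriptor
        // instead of a file on disk.
        MediaRecorder recorder = new MediaRecorder();
        recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
        recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
        recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
        recorder.setOutputFile(sender.getFileDescriptor());
        recorder.prepare();
        recorder.start();

        // The MP4 written this way is not seekable, so the MOOV atom is not
        // available up front; the streamer has to deal with the MDAT header
        // and repackage the elementary stream (e.g. into FLV or RTP) itself.
        java.io.InputStream in = receiver.getInputStream();
        byte[] buffer = new byte[4096];
        int read;
        while ((read = in.read(buffer)) != -1) {
            // forward buffer[0..read) to the packetizer / web server here
        }
    }
}
```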
Took me some time, but I finally managed to make an app that does just that. Check out the Google Code page if you're interested: http://code.google.com/p/spydroid-ipcamera/ I added loads of comments in my code (mainly, look at CameraStreamer.java), so it should be pretty self-explanatory. The hard part was actually understanding RFC 3984 and implementing a proper algorithm for the packetization process. (This algorithm turns the MPEG-4/H.264 stream produced by the MediaRecorder into a proper RTP stream, according to the RFC.)
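To give an idea of what that packetization involves, here is a hedged sketch of RFC 3984 FU-A fragmentation (not the project's actual code): a single H.264 NAL unit larger than the MTU is split across several RTP payloads. The RTP header itself (sequence numbers, timestamps, SSRC) is left out, and the 1400-byte payload limit is just an assumed MTU value:

```java
import java.util.ArrayList;
import java.util.List;

public class FuAPacketizer {

    private static final int MAX_PAYLOAD = 1400; // assumed MTU budget

    /** Splits a single NAL unit (start code already stripped) into FU-A payloads. */
    public static List<byte[]> packetize(byte[] nal) {
        List<byte[]> payloads = new ArrayList<>();

        if (nal.length <= MAX_PAYLOAD) {
            // Small enough: send as a single NAL unit packet.
            payloads.add(nal);
            return payloads;
        }

        byte header = nal[0];
        byte fuIndicator = (byte) ((header & 0xE0) | 28); // keep NRI bits, type 28 = FU-A
        byte nalType = (byte) (header & 0x1F);

        int offset = 1;                      // skip the original NAL header byte
        boolean first = true;
        while (offset < nal.length) {
            int chunk = Math.min(MAX_PAYLOAD - 2, nal.length - offset);
            boolean last = (offset + chunk == nal.length);

            byte fuHeader = nalType;
            if (first) fuHeader |= 0x80;     // S bit: start of fragmented NAL
            if (last)  fuHeader |= 0x40;     // E bit: end of fragmented NAL

            byte[] payload = new byte[chunk + 2];
            payload[0] = fuIndicator;
            payload[1] = fuHeader;
            System.arraycopy(nal, offset, payload, 2, chunk);
            payloads.add(payload);

            offset += chunk;
            first = false;
        }
        return payloads;
    }
}
```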
Bye
I'm looking into this as well, and while I don't have a good solution for you I did manage to dig up SIPDroid's video code:
http://code.google.com/p/sipdroid/source/browse/trunk/src/org/sipdroid/sipua/ui/VideoCamera.java