Thursday, June 28, 2012

start a service on boot on Android

If you want some service or app to start automatically at boot time, a few steps are needed:
1. In the manifest, add the permission android.permission.RECEIVE_BOOT_COMPLETED (via a <uses-permission> element).

2.<receiver android:name="com.example.MyBroadcastReceiver">
    <intent-filter>
        <action android:name="android.intent.action.BOOT_COMPLETED" />
    </intent-filter>
</receiver>

3. Add the BroadcastReceiver:
package com.example;

import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;

public class MyBroadcastReceiver extends BroadcastReceiver {
    @Override
    public void onReceive(Context context, Intent intent) {
        if (Intent.ACTION_BOOT_COMPLETED.equals(intent.getAction())) {
            // Launch the background service once booting has finished
            Intent startServiceIntent = new Intent(context, MyService.class);
            context.startService(startServiceIntent);
        }
    }
}
That's it!

PS. On HTC phones you may also need to add <action android:name="android.intent.action.QUICKBOOT_POWERON" /> to the intent filter.
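MyService above is just whatever service you want to start; it is not shown in the original code, so here is a minimal sketch (the class name and log tag are placeholders). Remember it also has to be declared in the manifest with a <service> element.

package com.example;

import android.app.Service;
import android.content.Intent;
import android.os.IBinder;
import android.util.Log;

public class MyService extends Service {
    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        Log.d("MyService", "started after boot");
        return START_STICKY; // ask the system to restart the service if it gets killed
    }

    @Override
    public IBinder onBind(Intent intent) {
        return null; // not a bound service
    }
}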

Ref:
Trying to start a service on boot on Android
Boot Receiver not work

Wednesday, June 27, 2012

get android status (notification) bar height & system resolution

One of the most annoying things about Android is the sheer number of screen resolutions, so programs sometimes have to convert values dynamically based on the resolution.
This time I needed the height of the status bar, and it took me quite a while of searching to find out how... I'm really not good at this (sigh).

Example:
// called from inside a View; from an Activity, use getWindow().getDecorView() as the root view
View rootView = getRootView();
Rect r = new Rect();
rootView.getWindowVisibleDisplayFrame(r);
int statusBarHeight = r.top;
Only one small thing to note: which side of r you read depends on the device. On phones it is top; on tablets you might have to read bottom instead, but I have not tried that.
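An alternative that is not in the original post but that I am fairly sure works: the framework exposes its status bar height as an internal dimen resource that can be looked up by name.

// look up the platform's internal "status_bar_height" dimension
int resId = getResources().getIdentifier("status_bar_height", "dimen", "android");
int statusBarHeight = 0;
if (resId > 0) {
    statusBarHeight = getResources().getDimensionPixelSize(resId);
}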

While I am at it, here is how to get the system resolution:
WindowManager wm = (WindowManager) getSystemService(Context.WINDOW_SERVICE);
Display localDisplay = wm.getDefaultDisplay();
DisplayMetrics localDisplayMetrics = new DisplayMetrics();
localDisplay.getMetrics(localDisplayMetrics);
float scale = localDisplayMetrics.density;
Log.d("Resolution", localDisplayMetrics.widthPixels + " " + localDisplayMetrics.heightPixels + " " + scale);
Again there is just one small thing to note: the width/height in landscape and in portrait are swapped. For example, 800/480 in landscape becomes 480/800 in portrait, so watch out when you use the values in calculations.
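If you want values that do not depend on the current orientation (my own habit, not part of the original post), just take the min and max:

// orientation-independent screen size
int shortSide = Math.min(localDisplayMetrics.widthPixels, localDisplayMetrics.heightPixels);
int longSide = Math.max(localDisplayMetrics.widthPixels, localDisplayMetrics.heightPixels);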

float window for android

I have been practicing writing some Android apps lately and came across an example of a floating window on the web.
I tried it myself and the effect is quite good; it can be used for things like a plug-in style menu.

Here are the key parts. Example:
public class WindowParam extends Application {
    // one LayoutParams instance shared by the whole app (see note 1 below)
    private WindowManager.LayoutParams wmParams = new WindowManager.LayoutParams();

    public WindowManager.LayoutParams getMywmParams() {
        return wmParams;
    }
}

public class TestActivity extends Activity {
    ......
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        try {
            WindowManager wm = (WindowManager) getSystemService(Context.WINDOW_SERVICE);
            // fetch the shared LayoutParams from the Application object
            WindowManager.LayoutParams wmParams =
                    ((WindowParam) getApplicationContext()).getMywmParams();
            wmParams.width = width;      // desired size of the floating view
            wmParams.height = height;
            wmParams.x = 0;
            wmParams.y = 0;
            wmParams.type = WindowManager.LayoutParams.TYPE_PHONE;
            wmParams.flags = WindowManager.LayoutParams.FLAG_NOT_TOUCH_MODAL
                    | WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE;
            wmParams.format = PixelFormat.RGBA_8888;
            wmParams.gravity = Gravity.LEFT | Gravity.TOP;

            btn = new MyFloatView(this);
            wm.addView(btn, wmParams);

        } catch (Exception e) {
            Log.d("shaw error", e.toString());
        }
    }
    ......
}
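One thing the original example does not show (this part is my own addition): the floating view should be removed when the Activity is torn down, otherwise the window leaks. Something like this inside TestActivity:

@Override
protected void onDestroy() {
    super.onDestroy();
    if (btn != null) {
        // detach the floating view from the WindowManager
        ((WindowManager) getSystemService(Context.WINDOW_SERVICE)).removeView(btn);
        btn = null;
    }
}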

public class MyFloatView extends TextView {
    private float mTouchStartX;
    private float mTouchStartY;
    private float x;
    private float y;

    private WindowManager wm = (WindowManager) getContext()
            .getApplicationContext().getSystemService(Context.WINDOW_SERVICE);
    // must be the same LayoutParams instance that was passed to addView()
    private WindowManager.LayoutParams wmParams =
            ((WindowParam) getContext().getApplicationContext()).getMywmParams();

    public MyFloatView(Context context) {
        super(context);
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        x = event.getRawX();
        y = event.getRawY() - 25;   // 25 = status bar height, see note 3 below
        Log.i("currP", "currX" + x + "====currY" + y);
        switch (event.getAction()) {
            case MotionEvent.ACTION_DOWN:
                mTouchStartX = event.getX();
                mTouchStartY = event.getY();
                Log.i("startP", "startX" + mTouchStartX + "====startY" + mTouchStartY);
                break;
            case MotionEvent.ACTION_MOVE:
                updateViewPosition();
                break;
            case MotionEvent.ACTION_UP:
                updateViewPosition();
                mTouchStartX = mTouchStartY = 0;
                break;
        }
        return true;
    }

    private void updateViewPosition() {
        wmParams.x = (int) (x - mTouchStartX);
        wmParams.y = (int) (y - mTouchStartY);
        wm.updateViewLayout(this, wmParams);
    }
}


Things to watch out for:
1. wmParams needs to live in a global object (or be passed around correctly); the complete wmParams must be supplied when the view is updated, otherwise a large icon appears and blocks the window. Most examples on the web share it through a global object, as done here with the Application subclass (which also has to be registered in the manifest via android:name).

2. wmParams.type = WindowManager.LayoutParams.TYPE_PHONE; determines where this window sits in the system-wide z-order. See WindowManager.LayoutParams in the official documentation for the available window types.

3. y = event.getRawY() - 25; the 25 here is the status bar height, which varies with the device's resolution, so it really should be obtained dynamically (see the previous post). The value affects touch behaviour: if the screen has a status bar and you do not subtract the correct offset, the view's Y coordinate will be wrong when the system renders it.

4. Add android.permission.SYSTEM_ALERT_WINDOW to the manifest with a uses-permission element, as shown below.
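For reference, the manifest line looks like this:

<uses-permission android:name="android.permission.SYSTEM_ALERT_WINDOW" />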

Saturday, June 23, 2012

Build FFMpeg for Android & example app

I have been writing some video-related apps recently and wanted to see how FFmpeg performs on the Android platform. I looked at how other people build ffmpeg, tried it myself, and it turned out to be fairly simple, so here is a short write-up.

This assumes the Android SDK + NDK are already installed; my NDK version is r7b.

1. Download ffmpeg
I cloned it with git; you can also grab a snapshot from the bottom of the ffmpeg download page:
http://www.ffmpeg.org/download.html
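For example (the git URL below is the official one listed on that page; adjust it if the repository has moved):

git clone git://source.ffmpeg.org/ffmpeg.git ffmpeg
cd ffmpeg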

2. Create a build script
There are many approaches floating around; in the end I used a build script (from RockPlayer) to do it.
It lets you pick exactly the features you want, and it is genuinely convenient. Recommended! Remember to put the build script inside the ffmpeg directory.
Reference build script: http://roman10.net/src/build_android_r5b.txt
My build script:
#!/bin/bash
######################################################
# Usage:
# put this script in top of FFmpeg source tree
# ./build_android
# It generates binary for following architectures:
# ARMv6 
# ARMv6+VFP 
# ARMv7+VFPv3-d16 (Tegra2) 
# ARMv7+Neon (Cortex-A8)
# Customizing:
# 1. Feel free to change ./configure parameters for more features
# 2. To adapt other ARM variants
# set $CPU and $OPTIMIZE_CFLAGS 
# call build_one
######################################################

NDK=/home/shaw/work/Android/android-sdk-linux/android-ndk-r7b
PLATFORM=$NDK/platforms/android-8/arch-arm/
PREBUILT=$NDK/toolchains/arm-linux-androideabi-4.4.3/prebuilt/linux-x86
function build_one
{
./configure --target-os=linux \
    --prefix=$PREFIX \
    --enable-cross-compile \
    --enable-shared \
    --enable-static \
    --extra-libs="-lgcc" \
    --arch=arm \
    --cc=$PREBUILT/bin/arm-linux-androideabi-gcc \
    --cross-prefix=$PREBUILT/bin/arm-linux-androideabi- \
    --nm=$PREBUILT/bin/arm-linux-androideabi-nm \
    --sysroot=$PLATFORM \
    --extra-cflags=" -O3 -fpic -DANDROID -DHAVE_SYS_UIO_H=1 -Dipv6mr_interface=ipv6mr_ifindex -fasm -Wno-psabi -fno-short-enums -fno-strict-aliasing -finline-limit=300 $OPTIMIZE_CFLAGS -I/usr/local/include" \
    --extra-ldflags="-Wl,-rpath-link=$PLATFORM/usr/lib -L $PLATFORM/usr/lib -nostdlib -lc -lm -ldl -llog -L/usr/local/lib " \
    --enable-gpl \
    --disable-everything \
    --enable-demuxer=mov \
    --enable-demuxer=h264 \
    --enable-muxer=h264 \
    --enable-muxer=mp4 \
    --enable-muxer=flv \
    --enable-muxer=mov \
    --disable-ffplay \
    --enable-protocol=file \
    --enable-avformat \
    --enable-avcodec \
    --enable-decoder=rawvideo \
    --enable-decoder=mjpeg \
    --enable-decoder=h263 \
    --enable-decoder=h264 \
    --enable-decoder=mpeg4 \
    --enable-encoder=mjpeg \
    --enable-encoder=h263 \
    --enable-encoder=mpeg4 \
    --enable-encoder=h264 \
    --enable-parser=h264 \
    --disable-network \
    --enable-zlib \
    --disable-avfilter \
    --disable-avdevice \
    $ADDITIONAL_CONFIGURE_FLAG

#    --enable-libx264 \
make clean
make  -j4 install
$PREBUILT/bin/arm-linux-androideabi-ar d libavcodec/libavcodec.a inverse.o
$PREBUILT/bin/arm-linux-androideabi-ld -rpath-link=$PLATFORM/usr/lib -L$PLATFORM/usr/lib  -soname libffmpeg.so -shared -nostdlib  -z,noexecstack -Bsymbolic --whole-archive --no-undefined -o $PREFIX/libffmpeg.so libavcodec/libavcodec.a libavformat/libavformat.a libavutil/libavutil.a libswscale/libswscale.a -lc -lm -lz -ldl -llog  --warn-once  --dynamic-linker=/system/bin/linker $PREBUILT/lib/gcc/arm-linux-androideabi/4.4.3/libgcc.a
}

#CPU=armv7-a
#OPTIMIZE_CFLAGS="-mfloat-abi=softfp -mfpu=vfpv3-d16 -marm -march=$CPU "
#PREFIX=./android/$CPU
#ADDITIONAL_CONFIGURE_FLAG=
#build_one


#arm v6
#CPU=armv6
#OPTIMIZE_CFLAGS="-marm -march=$CPU"
#PREFIX=./android/$CPU 
#ADDITIONAL_CONFIGURE_FLAG=
#build_one

#arm v7vfpv3
#CPU=armv7-a
#OPTIMIZE_CFLAGS="-mfloat-abi=softfp -mfpu=vfpv3-d16 -marm -march=$CPU "
#PREFIX=./android/$CPU
#ADDITIONAL_CONFIGURE_FLAG=
#build_one

#arm v7vfp
CPU=armv7-a
OPTIMIZE_CFLAGS="-mfloat-abi=softfp -mfpu=vfp -marm -march=$CPU "
PREFIX=/home/shaw/work/Android/ffmpeg/android/$CPU
ADDITIONAL_CONFIGURE_FLAG=
build_one

#arm v7n
#CPU=armv7-a
#OPTIMIZE_CFLAGS="-mfloat-abi=softfp -mfpu=neon -marm -march=$CPU -mtune=cortex-a8"
#PREFIX=/home/shaw/work/Android/ffmpeg/android/$CPU
#ADDITIONAL_CONFIGURE_FLAG=--enable-neon
#build_one

#arm v6+vfp
#CPU=armv6
#OPTIMIZE_CFLAGS="-DCMP_HAVE_VFP -mfloat-abi=softfp -mfpu=vfp -marm -march=$CPU"
#PREFIX=./android/${CPU}_vfp 
#ADDITIONAL_CONFIGURE_FLAG=
#build_one

A few things to watch out for:

<1> The three variables at the top (NDK, PLATFORM, PREBUILT) must point to the corresponding paths on your machine.

<2> Enable the features you need in configure; I plan to use the muxers, so I turned them on.

<3> Pick the right CPU variant, otherwise the app will complain at run time that the .so cannot be found; what actually happens is that the loader fails while reading the .so, so it cannot be executed.

<4> If you want to build the armv7+neon variant, remember to add --enable-mdct and --enable-fft; only with these options are some of the NEON-optimized functions compiled in, and without them the build dies near the end with missing-function errors.

3. Build ffmpeg
After running the build script, an android directory is created inside the ffmpeg tree, and inside it is libffmpeg.so. That .so is the ffmpeg library we want; ffmpeg for Android is built!
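For example (assuming the script was saved as build_android.sh at the top of the ffmpeg tree; the file name is my own choice):

chmod +x build_android.sh
./build_android.sh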

4. Write a simple test program
I modified the NDK's samples/hello-jni directly. Before changing anything, remember to import the sample into Eclipse first; without the import the JNI part fails to build. I have not dug into why, but my guess is that Eclipse writes out some of the related environment settings during the import.
First, change the contents of hello-jni.c:
/*
 * Copyright (C) 2009 The Android Open Source Project
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *      http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 *
 */
#include <string.h>
#include <stdio.h>
#include <jni.h>
#include <libavcodec/avcodec.h>

/* This is a trivial JNI example where we use a native method
 * to return a new VM String. See the corresponding Java source
 * file located at:
 *
 *   apps/samples/hello-jni/project/src/com/example/HelloJni/HelloJni.java
 */
jstring
Java_com_example_hellojni_HelloJni_stringFromJNI( JNIEnv* env,
                                                  jobject thiz )
{
 char str[64];   /* large enough for the formatted version string */
 sprintf(str, "av codec version is %d", avcodec_version());
 return (*env)->NewStringUTF(env, str);
}
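The Java side is basically the stock HelloJni sample; the only thing to watch (my own note, double-check against your imported sample) is that libffmpeg.so has to be loaded before libhello-jni.so:

package com.example.hellojni;

import android.app.Activity;
import android.os.Bundle;
import android.widget.TextView;

public class HelloJni extends Activity {
    static {
        // libffmpeg.so must be loaded first so libhello-jni.so can resolve its symbols
        System.loadLibrary("ffmpeg");
        System.loadLibrary("hello-jni");
    }

    public native String stringFromJNI();

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        TextView tv = new TextView(this);
        tv.setText(stringFromJNI());   // should show "av codec version is <number>"
        setContentView(tv);
    }
}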
Next, create Android.mk & Application.mk (optional) under the jni directory.
Android.mk:
# Copyright (C) 2009 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
LOCAL_PATH := $(call my-dir)

include $(CLEAR_VARS)
PATH_TO_FFMPEG_SOURCE := $(LOCAL_PATH)/ffmpeg
LOCAL_C_INCLUDES += $(PATH_TO_FFMPEG_SOURCE)
LOCAL_LDLIBS := -lffmpeg
LOCAL_MODULE    := hello-jni
LOCAL_SRC_FILES := hello-jni.c

include $(BUILD_SHARED_LIBRARY)
Application.mk:
# The ARMv7 is significantly faster due to the use of the hardware FPU
APP_ABI := armeabi-v7a
APP_PLATFORM := android-8

Next, move the whole ffmpeg source tree we just built into the jni directory, and remember to rename the directory to ffmpeg.

5. ndk-build
Once everything is set up, run ndk-build in the hello-jni project directory. When the build finishes, a libs directory is created containing libhello-jni.so; copy the libffmpeg.so we built earlier into the same directory (libs/armeabi-v7a) so that it gets packaged into the apk.
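Roughly like this (the ffmpeg output path comes from the PREFIX in my build script):

cd <path-to-hello-jni-project>
ndk-build
cp /home/shaw/work/Android/ffmpeg/android/armv7-a/libffmpeg.so libs/armeabi-v7a/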

6. Test
Once all the steps above are done, run it on a device. If it works, a line with the codec version number shows up on screen, and the basic test is complete!

That is as far as I have tested for now; I will continue when I have time.