
Tuesday, November 22, 2016

Building Racket for Android

I've been using Racket for academic purposes and decided to try running it on Android. I managed to get it built. This post details the effort and also serves as a sort of log entry.

Download the source from github. Use this script to compile it on linux/mac.

Compiling it for Android is simple. The real trouble lies in getting it to run: the binary needs all the startup files, and it needs to know where to find them. So the first time you run it on an Android device it shows this error message:

Welcome to Racket v6.7.0.3.
standard-module-name-resolver: collection not found
  for module path: (submod (lib "racket/init") configure-runtime)
  collection: "racket"
  in collection directories:
   /data/.racket/6.7.0.3/collects
   /home/regnarts/workspace/scheme/android/armv5/racket/collects
standard-module-name-resolver: collection not found
  for module path: racket/interactive
  collection: "racket"
  in collection directories:
   /data/.racket/6.7.0.3/collects
  /home/regnarts/workspace/scheme/android/armv5/racket/collects
standard-module-name-resolver: collection not found
  for module path: racket/base
  collection: "racket"
  in collection directories:
   /data/.racket/6.7.0.3/collects
   /home/regnarts/workspace/scheme/android/armv5/racket/collects

This is expected. You can fix it in two ways: either use '-S' to specify the path where the interpreter should look, or, as the README suggests, edit the binary to change the default collects directory path! You might want to go through the README and this post to get it working.

Once that is done, the interpreter starts up and behaves as it would on any other platform. The boot time, however, is notoriously long, and I still have to figure out a way to reduce it. This isn't a main project, just something I did to kill time, so I'm not going to look into it now. It would be really nice to get something like DrRacket to work on Android; with moderate effort this could be made into an app. But that's a story for another day.

Finally, I got my own interpreter to run on Racket running on Android.

Cheers!!

Friday, January 9, 2015

The Androux project

Finally, I've completed Linux support for the ports. I've decided to call the project Androux, a play on the words Android and Linux. You can grab the source from GitHub. "make" now works, and the compile script runs on Linux x86_64. The build scripts still won't run on a 32-bit machine; I'll try to add support for that in the future. The Android shell is not fit to run the configure scripts, so a port of a few utilities like sed and the bash shell is required. I'm working on it.

As of now you can create a Makefile and compile your projects on the device. I have tried a basic test and make works fine.

Visit github for instructions on compiling.

After compiling, copy the generated system folder to the system folder of your device. You also need to copy the include and lib folders from the platform directory corresponding to the Android version your device runs.

The final system image will be quite big, and I'm quite sure the partitions created on your phone by default are not large enough to hold it all. There are two ways around this: either extend the system partition, if the built-in flash can hold it, or create a new partition on the sdcard and change the mount scripts to mount that partition as system instead.

When writing install scripts in a Makefile, we need to make sure the system partition is mounted read/write; by default it is read-only on Android. Alternatively, choose another install location.

I still haven't had the time or resources to test on an actual device. Please let me know if any of you can do so, and share the results.

Cheers ! :-)
Stay Awesome  

Wednesday, January 7, 2015

MAKE compile script added

Quick repo update: I've added the compile script for make.

Have fun!! Cheers :-)

c++ and g++ compile scripts fixed to build on mac OS X

I have finally found time to fix the scripts that build c++ and g++, and I've pushed the code to GitHub. You can grab a snapshot here - https://github.com/heliumfire/androux. Alternatively, do

git clone --depth=1 https://github.com/heliumfire/androux

I am also working on getting the Linux scripts to work so that building is supported on Ubuntu. The next plan is to fix make and bash so that everything can run smoothly on the device.

The future plan is to grab the AOSP source and integrate into it. I was thinking of first trying it on GB. Still haven't done it yet.

As for putting it up on a real device: I am trying to procure an Android device, though unfortunately it's an old one running GB (Xperia Play). I need help trying it out on other devices. Let me know if anyone is willing to.

Saturday, January 3, 2015

Source for linux utilities on Android

Hi all,

I have finally made time to push the source of my ports to GitHub. Access the source at https://github.com/heliumfire/androux. To get the source, clone the repo by
git clone https://github.com/heliumfire/androux. As of now the source only compiles binutils, gmp, mpc and mpfr, and will only work on Linux. I will put up more stuff as time permits.

Cheers, Have fun :)

Tuesday, December 30, 2014

Make for Android and associated problems

NOTE: Your phone, your responsibility. (all stuff from previous posts)

I managed to compile make for Android, but unfortunately it is not foolproof. Download it here and push the contents to the /system folder in the emulator. There is one slight problem: make uses a function called ttyname(), which is supposed to be defined in libc and returns the name of the terminal device, like "/dev/pts/1". But Android doesn't run on a standard libc; it uses Bionic, where ttyname() has been left unimplemented as a stub. I think the best way around this would be to implement it in Bionic's stubs code itself, but that would require replacing libc, and I don't like the sound of that. Another way would be to implement it in make, but I wasn't going to spend the night doing that, so instead I hardcoded it to return "root".

And fortunately make seems to work fine without it. I wrote a small makefile and compiled it as a check. Sorry, I didn't include that in the package, but have fun trying it out by writing your own.

Let me know if you have any thoughts on implementation of ttyname.

PS: I have made progress on putting together the scripts for the build system. Stay tuned.

System image [android-19] with gcc, g++, file and vim

I have uploaded the system image so that you can simply download and try it out without having to remount or push. Copy the android-19 directory to the "system-images" directory of your SDK and create a new AVD for android-19 (or use an old one, it doesn't matter). Just remember to close any AVDs that are open and restart them.

I have also included two hello world source files, one for C and another for C++; they are in the "/system/bin" directory. Check the system by compiling them.


Cheers, Have a nice day !!!

Edit: I've also included the file and vim binaries, check them out.

Monday, December 29, 2014

C C++ programming on Android [added g++ and vim to the package] (ARM specific)

NOTE: Same as the old one. If you are dumb enough to spoil your device or emulator (lol), don't blame me. Try at your own risk.

For a long time we have shied away from using Android as a development machine. That might be about to change.

Yesterday I successfully compiled a gcc that runs on Android, and today, without much delay, g++ is here. I've also added vim from my older post. Now you can push all the binaries to the phone and have a nice programming package on it. I have done a basic sanity test on the binaries and they seem to be working fine. I've also included three sample programs in the bin directory; if possible, remove them before pushing to the device. (For details on how to push them to the emulator, see my last post.)

This system creates binaries that should run on all 32-bit ARM Android devices. That means you can compile on your phone, send the result to another phone, and it would work (lol, what am I writing. Sorry, sleepy :( ). It creates binaries similar to those of your NDK, so basically anything you could compile with the NDK, you can now compile on the phone.

Now to the downsides. It takes up a lot of space (around 500 MB). Most system partitions are smaller than this, so the partition would need to be resized and remounted. I could probably make a flashable zip, but because of this problem it wouldn't work on most devices: you need to edit the system partition size, and I have no idea how to do that because I don't have an Android phone. If there is anyone willing to try this, please let me know how it turns out.

I've seen a friend of mine try using vim on his phone, and trust me, it sucks. We will need a better editor and also a better terminal. For now I would suggest using a terminal emulator to try it out, but it will not be as good as a personal computer. I wonder how it would be on an Android device with a physical keyboard. I will search for my old HTC Dream and hopefully it will work; I also have memory issues on that to take care of (half a GB of extra system storage on the HTC Dream, you've got to be kidding me). I will let you guys know how that turns out. Meanwhile, have fun trying it out.

And yes, I almost forgot: download it here.

PS: I did try to make a nice build system out of this, but due to my laziness and non-linear working style it has become difficult to put together a script that does the whole build as I initially intended. I promise that in the near future I will try to put it up on GitHub or something, after I get it to work.

Sunday, December 28, 2014

GCC on Android [ARM Specific]

NOTE: I am not responsible for whatever damage you might cause to your emulator or device. If you wish to try it out, it's of your own will.

Last week I compiled binutils for Android. This week I have another Linux utility for you: GCC. I haven't fully tested any of these, only checked their sanity, and the fact that I do not own an Android device makes it more problematic; I've only tested on the emulator. I expect it to work on devices and will be more than thankful if anyone can confirm it. The package is very big (around a 154 MB zip). I will probably write up how I achieved it some time in the future. Unzip the file and copy the contents of the "sysroot" directory to "/system/" in your emulator. You might need to do

adb remount

before that in order to mount the /system/ partition as read/write. If you are willing to try it on a mobile device, open the device prompt with "adb shell" and then

# mount
rootfs / rootfs ro 0 0
tmpfs /dev tmpfs rw,nosuid,mode=755 0 0
devpts /dev/pts devpts rw,mode=600 0 0
proc /proc proc rw 0 0
sysfs /sys sysfs rw 0 0
none /acct cgroup rw,cpuacct 0 0
tmpfs /mnt/asec tmpfs rw,mode=755,gid=1000 0 0
tmpfs /mnt/obb tmpfs rw,mode=755,gid=1000 0 0
none /dev/cpuctl cgroup rw,cpu 0 0
/dev/block/mtdblock0 /system yaffs2 rw 0 0
/dev/block/mtdblock1 /data yaffs2 rw,nosuid,nodev 0 0
/dev/block/mtdblock2 /cache yaffs2 rw,nosuid,nodev 0 0

will display the various devices and their mount points. Now you can remount the "/system" directory. In my case this is

mount -o remount,rw /dev/block/mtdblock0 /system

because my system files are contained in "/dev/block/mtdblock0". Now that you have remounted, push all the contents of the "sysroot" directory to "/system". (Note that this won't work if you don't have enough space in the system partition; you will need to find a way to increase it. If you are using an emulator, start it with the "emulator" binary in the SDK "/tools" directory by invoking "./emulator -partition-size 1024 @<avd-name>"; the partition size can be adjusted at will. You will need to first create an AVD for this; use the GUI in Eclipse to do so.)

adb push  sysroot/ /system/

[NOTE: This command might not work on all systems; adb push doesn't handle directories. It works on my MacBook, but not on my Ubuntu machine. If it doesn't work, find a better way to handle this, write a script to copy the contents, or you are stuck copying each file manually.]
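One possible workaround, sketched here as a dry run: walk the tree and push each file individually. The toy file tree and the ADB variable are illustrative; set ADB=adb and cd into the real sysroot to actually push.

```shell
# Dry-run sketch of a file-by-file push, for when `adb push` refuses
# to recurse into directories. ADB is set to `echo adb` so the script
# only prints what it would do.
ADB="echo adb"
tmp=$(mktemp -d)
mkdir -p "$tmp/sysroot/bin"
touch "$tmp/sysroot/bin/gcc"          # toy stand-in for the real tree
cd "$tmp/sysroot"
find . -type f | while read -r f; do
  $ADB push "$f" "/system/${f#./}"
done
```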

Now you can use the vim that I compiled and posted months ago in conjunction with gcc to write C programs on your mobile and compile them on the device. I could have included the vim binary in the package, but unfortunately I forgot.

Somewhere in the bin directory I've included two hello world programs by mistake: one written in C and the other in assembly. Compile them to test that the system works.

Above is the output from my adb. And yes, I forgot to include a newline character in the hello.c program; get over it.

I expect everything to go fine. There are two versions of the toolchain in there: one that produces code for "arm-linux-androideabi" (the default), and a second, "arm-none-eabi", which can in theory be used to produce OS-independent code, e.g. for an ARM microcontroller.

I am working on another stupid idea now. I'll let you guys know of when that happens :)

Wednesday, December 17, 2014

Binutils on Android

This is a quick post. I managed to compile binutils to run on Android. I have only run a sanity test on them and am still not sure of their behaviour, but as of now everything seems to run smoothly. Once again, I haven't had much time to test. The assembler seems to be working: I verified it by assembling a hello world program on the device, pulling it back, and linking it with the NDK. The compiled program, sure enough, spat out "hello, world!". You can download the compiled package here.

I shall keep you posted on the future developments.

PS: Please don't question why the port

Sunday, May 18, 2014

Android Kernel Compile | Galaxy Fit S5670 | ARM Basics

Very recently I was asked by one of my professors to compile a custom kernel for his phone; he gave me an old Samsung Galaxy Fit. This post documents the process.

The phone runs an ARMv6-compatible processor rev 5 (v6l) and supports features like



swp [SWaP] & swpb [SWaPByte]

This instruction is deprecated as of ARMv7, and even in some later versions of ARMv6.

These instructions were used to implement exclusive access to the semaphores themselves.
Semaphores are used to manage access to a shared resource. Depending on the type of semaphore, one or more clients may be granted access.
Before accessing a resource, a client must read the semaphore value and check that it indicates whether the client can proceed, or whether it must wait. When the client is about to proceed it must change the semaphore value to inform other clients.
A fundamental issue with semaphores is that they themselves are shared resources, which – as we just learned – must be protected by semaphores.
SWP (Swap) and SWPB (Swap Byte) provide a method for software synchronization that does not require disabling interrupts. This is achieved by performing a special type of memory access, reading a value into a processor register and writing a value to a memory location as an atomic operation. The example code below shows the implementation of simple mutex functions using the SWP instruction. SWP and SWPB are not supported in the Thumb instruction set, so the example must be assembled for ARM.
binary mutex functions
    EXPORT lock_mutex_swp
lock_mutex_swp PROC
    LDR r2, =locked
    SWP r1, r2, [r0]       ; Swap R2 with location [R0], [R0] value placed in R1
    CMP r1, r2             ; Check if memory value was 'locked'
    BEQ lock_mutex_swp     ; If so, retry immediately
    BX  lr                 ; If not, lock successful, return
    ENDP

    EXPORT unlock_mutex_swp
unlock_mutex_swp PROC
    LDR r1, =unlocked
    STR r1, [r0]           ; Write value 'unlocked' to location [R0]
    BX  lr
    ENDP
In the SWP instruction in the example above, R1 is the destination register that receives the value from the memory location, and R2 is the source register that is written to the memory location. You can use the same register for destination and source. For example
        SWP r2, r2, [r0]       ; Swap R2 with location [R0], [R0] value placed in R2
is a valid instruction.

But there are certain limitations to this. If an interrupt triggers while a swap operation is taking place, the processor must complete both the load and the store part of the instruction before taking the interrupt, increasing interrupt latency. Because Load-Exclusive and Store-Exclusive are separate instructions, this effect is reduced when using the new synchronization primitives. In our source for Galaxy Fit we will find that swp or swpb aren't used to create atomic operations but Load-Exclusive and Store-Exclusive. Here is an example 

static inline void atomic_add(int i, atomic_t *v)
{
    unsigned long tmp;
    int result;

    __asm__ __volatile__("@ atomic_add\n"
    "1: ldrex   %0, [%3]\n"
    "   add     %0, %0, %4\n"
    "   strex   %1, %0, [%3]\n"
    "   teq     %1, #0\n"
    "   bne     1b"
    : "=&r" (result), "=&r" (tmp), "+Qo" (v->counter)
    : "r" (&v->counter), "Ir" (i)
    : "cc");
}

This is taken from atomic.h of /arch/arm/include/asm/ directory.
In a multi-core system, preventing access to main memory for all processors for the duration of a swap instruction can reduce overall system performance. This is especially true in a multi-core system where processors operate at different frequencies but share the same main memory.

half [Half-Precision Floating Point Support]

In the ARM's VFP [Vector Floating Point Co-Processor] the support for 'half' means that it supports 16 bit floating point numbers and conversions between 16-32 bit floating point numbers. Half-precision floating-point numbers are provided as an optional extension to the VFPv3 architecture.
Half-precision floating-point format

Where:
   S (bit[15]):      Sign bit
   E (bits[14:10]):  Biased exponent
   T (bits[9:0]):    Mantissa.
The meanings of these fields depend on the format that is selected.

thumb [The Thumb Instruction set]

The Thumb instruction set is a subset of the most commonly used 32-bit ARM instructions. Thumb instructions are each 16 bits long, and have a corresponding 32-bit ARM instruction that has the same effect on the processor model. Thumb instructions operate with the standard ARM register configuration, allowing excellent interoperability between ARM and Thumb states.
On execution, 16-bit Thumb instructions are transparently decompressed to full 32-bit ARM instructions in real time, without performance loss.
Thumb has all the advantages of a 32-bit core:
  • 32-bit address space
  • 32-bit registers
  • 32-bit shifter, and Arithmetic Logic Unit (ALU)
  • 32-bit memory transfer.
Thumb therefore offers a long branch range, powerful arithmetic operations, and a large address space.
Thumb code is typically 65% of the size of ARM code, and provides 160% of the performance of ARM code when running from a 16-bit memory system.
The availability of both 16-bit Thumb and 32-bit ARM instruction sets gives designers the flexibility to emphasize performance or code size on a subroutine level, according to the requirements of their applications. For example, critical loops for applications such as fast interrupts and DSP algorithms can be coded using the full ARM instruction set and then linked with Thumb code.

fastmult [Fast Multiplication]

This refers to the fact that the processor can do fast 32-bit multiplication: it provides hardware to perform a 32-bit x 32-bit = 64-bit operation. This is a common feature in almost all processors nowadays, but the guys who built the kernel still felt the need to mention it.

vfp [Vector Floating Point Co-Processor]

The VFP coprocessor supports floating-point arithmetic operations and is a functional block within the processor.
The VFP has its own bank of 32 registers for single-precision operands that you can:
  • use in pairs for double-precision operands
  • perform loads and stores of VFP registers in parallel with arithmetic operations.
The VFP supports a wide range of single and double precision operations, including ABS, NEG, COPY, MUL, MAC, DIV, and SQRT, and effectively executes most of these in a single cycle. Sometime in the future I promise to do a tutorial on this and the NEON instructions.

edsp [DSP extensions]

The ARM DSP instruction set extensions increase the DSP processing capability of ARM solutions in high-performance applications, while offering the low power consumption required by portable, battery-powered devices. DSP extensions are optimized for a broad range of software applications including servo motor control, Voice over IP (VOIP) and video & audio codecs, where the extensions increase the DSP performance to enable efficient processing of the required tasks.

Features

  • Single-cycle 16x16 and 32x16 MAC implementations
  • 2-3 x DSP performance improvement over ARM7™ processor-based CPU products
  • Zero overhead saturation extension support
  • New instructions to load and store pairs of registers, with enhanced addressing modes
  • New CLZ instruction improves normalization in arithmetic operations and improves divide performance
  • Full support in the ARMv5TE, ARMv6 and ARMv7 architectures

Applications

  • Audio encode/decode (MP3: AAC, WMA)
  • Servo motor control (HDD/DVD)
  • MPEG4 decode
  • Voice and handwriting recognition
  • Embedded control
  • Bit exact algorithms (GSM-AMR)

java [Jazelle]

Jazelle DBX (Direct Bytecode eXecution) allows some ARM processors to execute Java bytecode in hardware as a third execution state alongside the existing ARM and Thumb modes. Jazelle functionality was specified in the ARMv5TEJ architecture and the first processor with Jazelle technology was the ARM926EJ-S. Jazelle is denoted by a 'J' appended to the CPU name, except for post-v5 cores where it is required (albeit only in trivial form) for architecture conformance.
These are the extra features that the processor supports over the regular ones. The kernel can only be compiled on Linux or Mac; for this one I will be using Linux. First we need to install a few prerequisites and then download the source.


You will find the following files in the kernel source. Next, install the programs required to compile the source by running the following line in your terminal.

sudo apt-get install git-core gnupg flex bison gperf libsdl-dev libesd0-dev libwxgtk2.6-dev build-essential zip curl libncurses5-dev zlib1g-dev valgrind
Extract the kernel source, and also download the toolchain. Extract the toolchain and place it in a folder. You'll find the following in the toolchain folder.

Go to the kernel source and edit the Makefile so that CROSS_COMPILE points to the toolchain. It must look like this:


Make sure the path is right. For me the toolchain was in /home/regnarts/S3/arm-2011.03; check the full path before updating.
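For reference, the edit amounts to two variables in the kernel Makefile. The path below is the one from this post, but the arm-none-eabi- binary prefix is a guess on my part; check what your toolchain's bin/ directory actually contains.

```make
# sketch only: the toolchain path is from the post, the binary prefix is assumed
ARCH          ?= arm
CROSS_COMPILE ?= /home/regnarts/S3/arm-2011.03/bin/arm-none-eabi-
```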

Now there is a small problem with building this kernel. The original kernel source released by Samsung didn't have any support for the Ext filesystems, because the original ROM for this phone runs on Samsung's proprietary RFS file system. If you only want to run the stock ROM on top of the kernel, skip this part; otherwise you will need to enable the support in the config file. Find /arch/arm/configs/beni_rev01_defconfig and change the File System settings as shown below.
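As a rough sketch, the Ext options to switch on in beni_rev01_defconfig would look something like the following; exact option names vary between kernel versions, so treat these as assumptions and check your tree.

```
CONFIG_EXT2_FS=y
CONFIG_EXT3_FS=y
CONFIG_EXT4_FS=y
```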


There is another important thing: the phone uses proprietary Wi-Fi/wireless drivers, so their source code is not provided and we have to reuse the existing binary modules. The problem is that insmod checks the kernel version before loading a module; if there is a mismatch, the kernel won't load it. Therefore you'll need to name the kernel the way the module expects to see it. Run

make menuconfig

This opens up the menu to configure the kernel.

In General setup, append "-pref-CL928291" to the Local version. Now you are good to go. Run the make script that came with the kernel, make_kernel_GT-S5670.sh, or run

make beni_rev01_defconfig
make

That should give us the kernel; you will find it at /arch/arm/boot/zImage. Congratulations, you've built the kernel for your phone. However, you still need to get it onto the phone. For this we shall make Android's very popular "recovery flashable" zip file.

Download a sample file and extract the zip; you'll find this:

Now, boot.img is the kernel+ramdisk file. To build a boot.img you first need a ramdisk; we shall take the one from the sample downloaded previously. Use boot-tools to unpack and rebuild the boot.img. The following should help.
To unpack the boot.img

tools/unpackbootimg -i source_img/boot.img -o unpack

To unpack the ramdisk (if you want to):

gzip -dc ../unpack/boot.img-ramdisk.gz | cpio -i

Replace boot.img-zImage with your own zImage and then run

tools/mkbootimg --kernel unpack/boot.img-zImage --ramdisk unpack/boot.img-ramdisk-new.gz -o target_img/boot.img --base `cat unpack/boot.img-base`

With the new boot.img in place, put it back into update.zip and zip it again.
This will flash in most recoveries if you turn off signature verification. If your recovery can't do that, you'll need to sign the zip. Use this to sign it from Linux:

export CLASSPATH=$CLASSPATH:./lib.jar
java testsign update.zip update-finished.zip

Now flash the zip file from recovery.
I'll discuss overclocking this phone in the next blog post.






Sunday, December 8, 2013

FFMPEG on Android | Command line tool

I got the ffmpeg command line tool working on Android today. I was working on a screenshot binary for Android, and the way you take a screenshot on Android is to cat /dev/graphics/fb0. This file is in RGB565 format (on the emulator). Initially I managed by pulling the file to my computer and using ffmpeg to convert it to jpg. Porting ffmpeg to Android was not my initial idea, and it has been done before, but I didn't find any ffmpeg binaries online that I could directly download and run on my phone. So I decided to build it from source, and it took me a couple of hours to get it working. The binary runs all the user commands except common options like "-h" and "-v" (for some reason there were no definitions covering those in the source I used). You can download the binaries here (you need to copy the library 'libffmpeg.so' to the '/system/lib' folder). I even copied the binary to my '/system/bin' so that I can comfortably run it from the shell.
To finish off the work, I wrote a shell script to do the rest:

#Shell script to take a screenshot
cat /dev/graphics/fb0 > /sdcard/raw-input
ffmpeg -vcodec rawvideo -f rawvideo -pix_fmt rgb565 -s 480x800 -i /sdcard/raw-input -f image2 -vcodec mjpeg /sdcard/screenshot.jpg

rm /sdcard/raw-input
echo "Screenshot saved at /sdcard/screenshot.jpg"


I do know that there is a built-in screenshot binary on Android that does this without the lousy process above. But I am working on something more interesting, to be revealed in the days to come.

Have fun with ffmpeg

Here are some screenshots that were pulled from the device
PS: I am not sure if it works on all versions of android. I've only tried it on KitKat...

Tuesday, October 15, 2013

OpenSLES | Android audio interface through c

OpenSL ES (Open Sound Library for Embedded Systems) is a software library created by the Khronos Group. It is a royalty-free, cross-platform, hardware-accelerated C-language 2D and 3D audio API. It has been the standard on Android since API level 9 (Gingerbread), and it follows an object and interface model. The rest of this post details creating a simple C program to play sound.

I will be using the 8-bit PCM file that came with the Android NDK as my sound source. We can play other popular streams too, but for this article the program will just say 'hello android'. The API calls for an 'engine' object to be created first. This diagram illustrates the required relationship:

Before we start anything let's include the necessary headers


#include <stdio.h>
#include <assert.h>
#include <string.h>
#include <SLES/OpenSLES.h>
#include <SLES/OpenSLES_Android.h>


The PCM values to say hello are encoded in the file hello_clip.h (found in one of the examples in the Android NDK). The values look like this:

"\x01\x00\x03\x00\x06\x00\x0a\x00\x0a\x00\x09\x00\x04\x00\x04\x00"
"\x06\x00\x06\x00\x00\x00\xff\xff\x05\x00\x08\x00\x01\x00\xfe\xff"
"\xff\xff\x03\x00\x04\x00\xfe\xff\xf9\xff\xfd\xff\x04\x00\xfe\xff"
"\x03\x00\x04\x00\x01\x00\xfb\xff\xfb\xff\xfc\xff\xfb\xff\x03\x00"
"\xfc\xff\xf9\xff\xfc\xff\x01\x00\x06\x00\x00\x00\xf9\xff\xfa\xff"
"\x04\x00\x06\x00\xfe\xff\xfa\xff\xfd\xff\x01\x00\xfe\xff\xfe\xff"
"\xfe\xff\xfd\xff\xfd\xff\xfd\xff\xfe\xff\xff\xff\xfd\xff\xfa\xff"
"\xfe\xff\x00\x00\x03\x00\xfe\xff\xfc\xff\xfb\xff\xfe\xff\x01\x00"
        .............................................................

So let's create an engine object and realize it. Don't worry, it's not that complicated. assert() is used to check that everything is all right on creation of the engine and its realisation.

     SLresult result;

     static SLObjectItf engineObject = NULL;
     
     //First we need to create an engine and then check if it was successful
     result = slCreateEngine(&engineObject, 0, NULL, 0, NULL, NULL);
     assert(SL_RESULT_SUCCESS == result);
     
     //We realize the engine object
     result = (*engineObject)->Realize(engineObject, SL_BOOLEAN_FALSE);
     assert(SL_RESULT_SUCCESS == result);

Now we get the engine interface (observe that we need two things to control the engine: the SLObjectItf and the SLEngineItf).

     static SLEngineItf engineEngine;
     
     //Get the engine's interface
     result = (*engineObject)->GetInterface(engineObject, SL_IID_ENGINE, &engineEngine);
     assert(SL_RESULT_SUCCESS == result);

Now we create the output mix

     static SLObjectItf outputMixObject = NULL;

     const SLInterfaceID ids[1] = {SL_IID_ENVIRONMENTALREVERB};
     const SLboolean req[1] = {SL_BOOLEAN_FALSE};
     result = (*engineEngine)->CreateOutputMix(engineEngine, &outputMixObject, 1, ids, req);
     assert(SL_RESULT_SUCCESS == result);

Observe that the output mix is used to control the quality of the output; it maintains the equalisation, virtualiser, and so on. The ids and req arrays tell the function what kind of output mix we want; here we tell it that we don't require the environmental reverb interface. We also need to realise this:

     // realize the output mix
     result = (*outputMixObject)->Realize(outputMixObject, SL_BOOLEAN_FALSE);
     assert(SL_RESULT_SUCCESS == result);

Next we create a buffer queue, set the data format, and configure the source.

     // configure audio source
     SLDataLocator_AndroidSimpleBufferQueue loc_bufq =   {SL_DATALOCATOR_ANDROIDSIMPLEBUFFERQUEUE, 2};
     SLDataFormat_PCM format_pcm = {SL_DATAFORMAT_PCM, 1, SL_SAMPLINGRATE_8,
         SL_PCMSAMPLEFORMAT_FIXED_16, SL_PCMSAMPLEFORMAT_FIXED_16,
         SL_SPEAKER_FRONT_CENTER, SL_BYTEORDER_LITTLEENDIAN};
     SLDataSource audioSrc = {&loc_bufq, &format_pcm};

Next create an audio sink

     SLDataLocator_OutputMix loc_outmix = {SL_DATALOCATOR_OUTPUTMIX, outputMixObject};
     SLDataSink audioSnk = {&loc_outmix, NULL};

Now it's time to create our audio player. We specify that the buffer queue, effect send and volume interfaces are required.

     static SLObjectItf bqPlayerObject = NULL;

     const SLInterfaceID ids2[3] = {SL_IID_BUFFERQUEUE, SL_IID_EFFECTSEND,
         SL_IID_VOLUME};
     const SLboolean req2[3] = {SL_BOOLEAN_TRUE, SL_BOOLEAN_TRUE,
         SL_BOOLEAN_TRUE};
     result = (*engineEngine)->CreateAudioPlayer(engineEngine, &bqPlayerObject, &audioSrc, &audioSnk,
                                                 3, ids2, req2);
     assert(SL_RESULT_SUCCESS == result);

We need to realise this too.

     result = (*bqPlayerObject)->Realize(bqPlayerObject, SL_BOOLEAN_FALSE);
     assert(SL_RESULT_SUCCESS == result);

Now we get the play interface and the buffer queue interface:

     result = (*bqPlayerObject)->GetInterface(bqPlayerObject, SL_IID_PLAY, &bqPlayerPlay);
     assert(SL_RESULT_SUCCESS == result);
     
     // get the buffer queue interface
     result = (*bqPlayerObject)->GetInterface(bqPlayerObject, SL_IID_BUFFERQUEUE,
                                              &bqPlayerBufferQueue);
     assert(SL_RESULT_SUCCESS == result);

When the player finishes a buffer, the next one has to be loaded. OpenSL ES offers a callback for exactly this: it fires every time the queue finishes a buffer, so we can enqueue the next one from there. Let us write the callback first:

void bqPlayerCallback(SLAndroidSimpleBufferQueueItf bq, void *context)
{
    assert(bq == bqPlayerBufferQueue);
    assert(NULL == context);
    if (NULL != nextBuffer && 0 != nextSize) {
        SLresult result;
        result = (*bqPlayerBufferQueue)->Enqueue(bqPlayerBufferQueue, nextBuffer, nextSize);
        assert(SL_RESULT_SUCCESS == result);
        nextBuffer = NULL;
        nextSize = 0;
    }
}
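
The pattern here is a hand-rolled double buffer: the main code enqueues the first clip and parks the second in nextBuffer, and the callback promotes the parked clip when the queue drains. A minimal, platform-free sketch of that handoff (plain Java; the Player class is hypothetical and stands in for the OpenSL ES buffer queue):

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical stand-in for the OpenSL ES buffer queue: "playing" a clip
// immediately triggers the completion callback, which promotes the parked clip.
public class Player {
    private final List<String> played = new ArrayList<>();
    private String nextClip; // analogue of nextBuffer/nextSize

    public void enqueue(String clip) {
        played.add(clip);   // pretend the clip plays instantly
        onFinished();       // OpenSL ES would invoke bqPlayerCallback here
    }

    // Analogue of bqPlayerCallback: promote the parked clip, then clear it.
    private void onFinished() {
        if (nextClip != null) {
            String clip = nextClip;
            nextClip = null;    // clear before enqueueing, like nextBuffer = NULL
            enqueue(clip);
        }
    }

    public void setNext(String clip) { nextClip = clip; }
    public List<String> played() { return played; }

    public static void main(String[] args) {
        Player p = new Player();
        p.setNext("android");   // park the second clip, like nextBuffer = android
        p.enqueue("hello");     // play the first clip
        System.out.println(p.played()); // prints [hello, android]
    }
}
```

The second enqueue happens inside the callback, exactly as the real code re-calls Enqueue from bqPlayerCallback; once the parked clip is gone, the callback does nothing and playback stops.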

We will use this to play a second clip and then stop. The second clip says 'android' and was taken from the same example. We register the callback:

     result = (*bqPlayerBufferQueue)->RegisterCallback(bqPlayerBufferQueue, bqPlayerCallback, NULL);
     assert(SL_RESULT_SUCCESS == result);


Then we set the player state to playing

     result = (*bqPlayerPlay)->SetPlayState(bqPlayerPlay, SL_PLAYSTATE_PLAYING);
     assert(SL_RESULT_SUCCESS == result);

Now we enqueue the first buffer and park the second:

     nextBuffer = (short *) hello;
     nextSize = sizeof(hello);
     result = (*bqPlayerBufferQueue)->Enqueue(bqPlayerBufferQueue, nextBuffer, nextSize);
     assert(SL_RESULT_SUCCESS == result);
     nextBuffer = (short *) android;
     nextSize = sizeof(android);

     sleep(2);

This plays 'hello' first; when it finishes, the callback fires and enqueues 'android', which was parked as the next buffer. The sleep(2) gives playback time to complete before we tear everything down.

     if (bqPlayerObject != NULL) {
         (*bqPlayerObject)->Destroy(bqPlayerObject);
         bqPlayerObject = NULL;
         bqPlayerPlay = NULL;
         bqPlayerBufferQueue = NULL;
     }
     

     if (outputMixObject != NULL) {
         (*outputMixObject)->Destroy(outputMixObject);
         outputMixObject = NULL;
         outputMixEnvironmentalReverb = NULL;
     }
     
     if (engineObject != NULL) {
         (*engineObject)->Destroy(engineObject);
         engineObject = NULL;
         engineEngine = NULL;
     }

We destroy all the objects created, and then exit the program.

Download the binaries and code from here. You will have to run it through either adb or a terminal emulator, after copying the binaries to a convenient place like /data/local/tmp. Try it out, and let me know if the code or binary doesn't work. It should work flawlessly on a Jelly Bean device; I haven't checked it on any others.



Thursday, October 18, 2012

Android app building | Android Camera

There have been a lot of posts on this blog relating to Android, but none of them about Android application development; this is the first one. Yes, that's right: ".apk". Though it's customary to build a "Hello, World!" program whenever you start learning something new, I took a small detour to show you how easy it is to use real hardware on your phone. This post outlines an app that takes pictures with your phone's camera (yes, it does the same work as your default camera app).

If you don't have the Android SDK downloaded and installed along with Eclipse, and the ADT plugin configured with it, do that first.

Now fire up your IDE and create a new project with an appropriate name. Since we want to create a camera app, let's see what Google has to offer: Android has a page describing how to build a camera app. Nice!

The first snippet it gives detects whether the device has a camera:


/** Check if this device has a camera */
private boolean checkCameraHardware(Context context) {
    if (context.getPackageManager().hasSystemFeature(PackageManager.FEATURE_CAMERA)){
        // this device has a camera
        return true;
    } else {
        // no camera on this device
        return false;
    }
}

This is a genuinely useful check, but I decided to leave it out of my program because I know my phone has a camera (lol). If you want the check, run it in your program. The next snippet gets an instance of the camera; you'll obviously need it:


/** A safe way to get an instance of the Camera object. */
public static Camera getCameraInstance(){
    Camera c = null;
    try {
        c = Camera.open(); // attempt to get a Camera instance
    }
    catch (Exception e){
        // Camera is not available (in use or does not exist)
    }
    return c; // returns null if camera is unavailable
}

Now here is something you must know: Android exposes the camera as an object. This code calls Camera.open() to acquire one; the call throws an exception if the camera doesn't exist or the resource isn't available (some other program is using it), in which case the method returns null. Your layout will typically look like this in Eclipse:


We need to change this (obviously!) to display our camera. We'll add a frame to the layout to serve as a viewfinder.


<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent" >
    
     <FrameLayout
         android:id="@+id/camera_preview"
         android:layout_width="fill_parent"
         android:layout_height="fill_parent"
         android:layout_alignParentLeft="true"
         android:layout_alignParentTop="true"
         />
     
      <Button
             android:id="@+id/button_capture"
             android:layout_width="wrap_content"
             android:layout_height="wrap_content"
             android:gravity="center|left"
             android:text="capture" />

   
</RelativeLayout>

Now you have a button that says capture and a frame to display the camera view. It should look pretty much like this.

We'll come back to refining this later. Now let us add the code to the main program. Eclipse has created the initial code for me, which looks like this:

package com.regnartstranger.cam;

import android.os.Bundle;
import android.app.Activity;
import android.view.Menu;

public class Cam extends Activity {

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_cam);
    }

    @Override
    public boolean onCreateOptionsMenu(Menu menu) {
        getMenuInflater().inflate(R.menu.activity_cam, menu);
        return true;
    }
}


Adding our code to get the camera resource, the file now looks like this.


Observe that we haven't declared the variable camera, which is giving us an error. So create a field to hold our camera. We'll call it camera itself (yes, because we have already started using the name).


package com.regnartstranger.cam;

import android.hardware.Camera;
import android.os.Bundle;
import android.app.Activity;
import android.util.Log;
import android.view.Menu;

public class Cam extends Activity {

private String TAG = "Cam";
private Camera camera;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_cam);
    }

    @Override
    public boolean onCreateOptionsMenu(Menu menu) {
        getMenuInflater().inflate(R.menu.activity_cam, menu);
        return true;
    }
    
    public void getCameraResource(){
    try{
        camera = Camera.open();
        }catch(Exception ioe)
        {
        Log.d(TAG,ioe.getMessage());
        }
    }
}



Notice that I've also added a TAG and log the error (like a good programmer should). Also note the extra import of android.hardware.Camera that Eclipse has done for me (thank you!). Now, to create a preview, the developer website lists a preview class:


/** A basic Camera preview class */
public class CameraPreview extends SurfaceView implements SurfaceHolder.Callback {
    private SurfaceHolder mHolder;
    private Camera mCamera;

    public CameraPreview(Context context, Camera camera) {
        super(context);
        mCamera = camera;

        // Install a SurfaceHolder.Callback so we get notified when the
        // underlying surface is created and destroyed.
        mHolder = getHolder();
        mHolder.addCallback(this);
        // deprecated setting, but required on Android versions prior to 3.0
        mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
    }

    public void surfaceCreated(SurfaceHolder holder) {
        // The Surface has been created, now tell the camera where to draw the preview.
        try {
            mCamera.setPreviewDisplay(holder);
            mCamera.startPreview();
        } catch (IOException e) {
            Log.d(TAG, "Error setting camera preview: " + e.getMessage());
        }
    }

    public void surfaceDestroyed(SurfaceHolder holder) {
        // empty. Take care of releasing the Camera preview in your activity.
    }

    public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
        // If your preview can change or rotate, take care of those events here.
        // Make sure to stop the preview before resizing or reformatting it.

        if (mHolder.getSurface() == null){
          // preview surface does not exist
          return;
        }

        // stop preview before making changes
        try {
            mCamera.stopPreview();
        } catch (Exception e){
          // ignore: tried to stop a non-existent preview
        }

        // set preview size and make any resize, rotate or
        // reformatting changes here

        // start preview with new settings
        try {
            mCamera.setPreviewDisplay(mHolder);
            mCamera.startPreview();

        } catch (Exception e){
            Log.d(TAG, "Error starting camera preview: " + e.getMessage());
        }
    }
}


To display the picture on the screen we use SurfaceView, an Android class that can paint a canvas onto the screen. To use it we create a new class that extends SurfaceView and implements SurfaceHolder.Callback; the methods surfaceCreated, surfaceDestroyed, and surfaceChanged implement those callbacks. We initialise the class by storing the camera object, getting a holder for the surface, and adding our callback to that holder.
Pulling the same code into our app, we get:


package com.regnartstranger.cam;

import java.io.IOException;

import android.hardware.Camera;
import android.os.Bundle;
import android.app.Activity;
import android.content.Context;
import android.util.Log;
import android.view.Menu;
import android.view.SurfaceHolder;
import android.view.SurfaceView;

public class Cam extends Activity {

private String TAG = "Cam";
private Camera camera;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_cam);
    }

    @Override
    public boolean onCreateOptionsMenu(Menu menu) {
        getMenuInflater().inflate(R.menu.activity_cam, menu);
        return true;
    }
    /** A basic Camera preview class */
    public class CameraPreview extends SurfaceView implements SurfaceHolder.Callback {
        private SurfaceHolder mHolder;
        private Camera mCamera;

        public CameraPreview(Context context, Camera camera) {
            super(context);
            mCamera = camera;
            mHolder = getHolder();
            mHolder.addCallback(this);
            mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
        }

        public void surfaceCreated(SurfaceHolder holder) {
            try {
                mCamera.setPreviewDisplay(holder);
                mCamera.startPreview();
            } catch (IOException e) {
                Log.d(TAG, "Error setting camera preview: " + e.getMessage());
            }
        }

        public void surfaceDestroyed(SurfaceHolder holder) {
        }

        public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {

            if (mHolder.getSurface() == null){
              return;
            }

            try {
                mCamera.stopPreview();
            } catch (Exception e){
            }


            try {
                mCamera.setPreviewDisplay(mHolder);
                mCamera.startPreview();

            } catch (Exception e){
                Log.d(TAG, "Error starting camera preview: " + e.getMessage());
            }
        }
    }
    
    public void getCameraResource(){
    try{
        camera = Camera.open();
        }catch(Exception ioe)
        {
        Log.d(TAG,ioe.getMessage());
        }
   
    }
}



Now, to show the preview on the screen, we create a CameraPreview object and attach it to the FrameLayout. These three lines of code set up the camera view:


 mPreview = new CameraPreview(this, camera);
 FrameLayout preview = (FrameLayout) findViewById(R.id.camera_preview);
 preview.addView(mPreview);

The whole code looks like this now


package com.regnartstranger.cam;

import java.io.IOException;
import android.hardware.Camera;
import android.os.Bundle;
import android.app.Activity;
import android.content.Context;
import android.util.Log;
import android.view.Menu;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.widget.FrameLayout;

public class Cam extends Activity {

private String TAG = "Cam";
private Camera camera;
public CameraPreview mPreview;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_cam);
        getCameraResource();
        mPreview = new CameraPreview(this, camera);
        FrameLayout preview = (FrameLayout) findViewById(R.id.camera_preview);
        preview.addView(mPreview);
    }

    @Override
    public boolean onCreateOptionsMenu(Menu menu) {
        getMenuInflater().inflate(R.menu.activity_cam, menu);
        return true;
    }
    /** A basic Camera preview class */
    public class CameraPreview extends SurfaceView implements SurfaceHolder.Callback {
        private SurfaceHolder mHolder;
        private Camera mCamera;

        public CameraPreview(Context context, Camera camera) {
            super(context);
            mCamera = camera;
            mHolder = getHolder();
            mHolder.addCallback(this);
            mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
        }

        public void surfaceCreated(SurfaceHolder holder) {
            try {
                mCamera.setPreviewDisplay(holder);
                mCamera.startPreview();
            } catch (IOException e) {
                Log.d(TAG, "Error setting camera preview: " + e.getMessage());
            }
        }

        public void surfaceDestroyed(SurfaceHolder holder) {
        }

        public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {

            if (mHolder.getSurface() == null){
              return;
            }

            try {
                mCamera.stopPreview();
            } catch (Exception e){
            }


            try {
                mCamera.setPreviewDisplay(mHolder);
                mCamera.startPreview();

            } catch (Exception e){
                Log.d(TAG, "Error starting camera preview: " + e.getMessage());
            }
        }
    }
    
    public void getCameraResource(){
    try{
        camera = Camera.open();
        }catch(Exception ioe)
        {
        Log.d(TAG,ioe.getMessage());
        }
   
    }
}

At this point, if you just add android.permission.CAMERA to the manifest file, you'll be able to run the program.


The camera now looks like this. If you want to capture a photograph, all you need to do is implement a callback that saves the photograph to a file in storage. The developer website describes the callback as:

private PictureCallback mPicture = new PictureCallback() {

    @Override
    public void onPictureTaken(byte[] data, Camera camera) {

        File pictureFile = getOutputMediaFile(MEDIA_TYPE_IMAGE);
        if (pictureFile == null){
            Log.d(TAG, "Error creating media file, check storage permissions");
            return;
        }

        try {
            FileOutputStream fos = new FileOutputStream(pictureFile);
            fos.write(data);
            fos.close();
        } catch (FileNotFoundException e) {
            Log.d(TAG, "File not found: " + e.getMessage());
        } catch (IOException e) {
            Log.d(TAG, "Error accessing file: " + e.getMessage());
        }
    }
};

Add this to your code. It finally looks like this:


package com.regnartstranger.cam;

import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import java.text.SimpleDateFormat;
import java.util.Date;
import android.hardware.Camera;
import android.hardware.Camera.PictureCallback;
import android.os.Bundle;
import android.os.Environment;
import android.app.Activity;
import android.content.Context;
import android.util.Log;
import android.view.Menu;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.View;
import android.widget.Button;
import android.widget.FrameLayout;

public class Cam extends Activity {

private String TAG = "Cam";
private Camera camera;
public CameraPreview mPreview;
public Button captureButton;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_cam);
        getCameraResource();
        captureButton = (Button) findViewById(R.id.button_capture);
        mPreview = new CameraPreview(this, camera);
        FrameLayout preview = (FrameLayout) findViewById(R.id.camera_preview);
        preview.addView(mPreview);
        captureButton.setOnClickListener(
           new View.OnClickListener() {
               public void onClick(View v) {
                   // get an image from the camera
                   camera.takePicture(null, null, mPicture);
               }
           }
        );
    }

    @Override
    public boolean onCreateOptionsMenu(Menu menu) {
        getMenuInflater().inflate(R.menu.activity_cam, menu);
        return true;
    }
    private PictureCallback mPicture = new PictureCallback() {

        public void onPictureTaken(byte[] data, Camera camera) {
        File mediaStorageDir = new File(Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_PICTURES), "MyCameraApp");
        if (!mediaStorageDir.exists() && !mediaStorageDir.mkdirs()){
            Log.d(TAG, "Error creating media directory, check storage permissions");
            return;
        }
        String timeStamp = new SimpleDateFormat("yyyyMMdd_HHmmss").format(new Date());
        File mediaFile = new File(mediaStorageDir.getPath() + File.separator + "IMG_" + timeStamp + ".jpg");

            try {
                FileOutputStream fos = new FileOutputStream(mediaFile);
                fos.write(data);
                fos.close();
            } catch (FileNotFoundException e) {
                Log.d(TAG, "File not found: " + e.getMessage());
            } catch (IOException e) {
                Log.d(TAG, "Error accessing file: " + e.getMessage());
            }
        }
    };
    /** A basic Camera preview class */
    public class CameraPreview extends SurfaceView implements SurfaceHolder.Callback {
        private SurfaceHolder mHolder;
        private Camera mCamera;

        public CameraPreview(Context context, Camera camera) {
            super(context);
            mCamera = camera;
            mHolder = getHolder();
            mHolder.addCallback(this);
            mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
        }

        public void surfaceCreated(SurfaceHolder holder) {
            try {
                mCamera.setPreviewDisplay(holder);
                mCamera.startPreview();
            } catch (IOException e) {
                Log.d(TAG, "Error setting camera preview: " + e.getMessage());
            }
        }

        public void surfaceDestroyed(SurfaceHolder holder) {
        }

        public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {

            if (mHolder.getSurface() == null){
              return;
            }

            try {
                mCamera.stopPreview();
            } catch (Exception e){
            }


            try {
                mCamera.setPreviewDisplay(mHolder);
                mCamera.startPreview();

            } catch (Exception e){
                Log.d(TAG, "Error starting camera preview: " + e.getMessage());
            }
        }
    }
    
    public void getCameraResource(){
    try{
        camera = Camera.open();
        }catch(Exception ioe)
        {
        Log.d(TAG,ioe.getMessage());
        }
   
    }
}
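
The callback names each saved file with a timestamp from SimpleDateFormat. As a standalone sketch of just the naming scheme (plain Java, outside Android; the class name MediaNames is mine):

```java
import java.text.SimpleDateFormat;
import java.util.Date;

// Reproduces the IMG_<timestamp>.jpg naming scheme used in onPictureTaken.
public class MediaNames {
    public static String imageName(Date when) {
        String timeStamp = new SimpleDateFormat("yyyyMMdd_HHmmss").format(when);
        return "IMG_" + timeStamp + ".jpg";
    }

    public static void main(String[] args) {
        // e.g. IMG_20121018_143005.jpg for Oct 18, 2012 at 14:30:05
        System.out.println(imageName(new Date()));
    }
}
```

Second-resolution timestamps mean two captures within the same second would collide on the same file name; appending a counter or using milliseconds would avoid that.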


Add the permission to write to the SD card [android.permission.WRITE_EXTERNAL_STORAGE]. Now you are good to go: run the app and you should be able to take pictures.

To make the app look more formal, add android:theme="@android:style/Theme.NoTitleBar.Fullscreen" under application in the manifest file, and android:screenOrientation="landscape" under activity. It finally looks like this:


<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.regnartstranger.cam"
    android:versionCode="1"
    android:versionName="1.0" >

    <uses-sdk
        android:minSdkVersion="8"
        android:targetSdkVersion="15" />
    <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
    <uses-permission android:name="android.permission.CAMERA" />
    

    <application
        android:icon="@drawable/ic_launcher"
        android:label="@string/app_name"
        android:theme="@android:style/Theme.NoTitleBar.Fullscreen" >
        <activity
            android:name=".Cam"
            android:label="@string/title_activity_cam"
            android:screenOrientation="landscape" >
            <intent-filter>
                <action android:name="android.intent.action.MAIN" />

                <category android:name="android.intent.category.LAUNCHER" />
            </intent-filter>
        </activity>
    </application>

</manifest>


This gives it a decent camera look. Run the program to get your basic camera app.

Have a nice day!!