Friday, October 23, 2009

UI framework changes in Android 1.6

Android 1.6 introduces numerous enhancements and bug fixes in the UI framework. Today, I'd like to highlight three improvements in particular.

Optimized drawing

The UI toolkit introduced in Android 1.6 is aware of which views are opaque and can use this information to avoid drawing views that the user will not be able to see. Before Android 1.6, the UI toolkit would sometimes perform unnecessary operations by drawing a window background when it was obscured by a full-screen opaque view. A workaround was available to avoid this, but the technique was limited and required work on your part. With Android 1.6, the UI toolkit determines whether a view is opaque by simply querying the opacity of the background drawable. If you know that your view is going to be opaque but that information does not depend on the background drawable, you can simply override the method called isOpaque():


@Override
public boolean isOpaque() {
    return true;
}


The value returned by isOpaque() does not have to be constant and can change at any time. For instance, the implementation of ListView in Android 1.6 indicates that a list is opaque only when the user is scrolling it.

Update: Our apologies, we spoke too soon about isOpaque(). It will be available in a future update to the Android platform.

More flexible, more robust RelativeLayout

RelativeLayout is the most versatile layout offered by the Android UI toolkit and can be successfully used to reduce the number of views created by your applications. This layout used to suffer from various bugs and limitations, sometimes making it difficult to use without having some knowledge of its implementation. To make your life easier, Android 1.6 comes with a revamped RelativeLayout. This new implementation not only fixes all known bugs in RelativeLayout (let us know when you find new ones) but also addresses its major limitation: the fact that views had to be declared in a particular order. Consider the following XML layout:


<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="fill_parent"
    android:layout_height="64dip"
    android:padding="6dip">

    <TextView
        android:id="@+id/band"
        android:layout_width="fill_parent"
        android:layout_height="26dip"
        android:layout_below="@+id/track"
        android:layout_alignLeft="@id/track"
        android:layout_alignParentBottom="true"
        android:gravity="top"
        android:text="The Airborne Toxic Event" />

    <TextView
        android:id="@id/track"
        android:layout_marginLeft="6dip"
        android:layout_width="fill_parent"
        android:layout_height="26dip"
        android:layout_toRightOf="@+id/artwork"
        android:textAppearance="?android:attr/textAppearanceMedium"
        android:gravity="bottom"
        android:text="Sometime Around Midnight" />

    <ImageView
        android:id="@id/artwork"
        android:layout_width="56dip"
        android:layout_height="56dip"
        android:layout_gravity="center_vertical"
        android:src="@drawable/artwork" />

</RelativeLayout>


This code builds a very simple layout—an image on the left with two lines of text stacked vertically. This XML layout is perfectly fine and contains no errors. Unfortunately, Android 1.5's RelativeLayout is incapable of rendering it correctly, as shown in the screenshot below.


[Screenshot: the layout rendered incorrectly by Android 1.5's RelativeLayout]

The problem is that this layout uses forward references. For instance, the "band" TextView is positioned below the "track" TextView but "track" is declared after "band" and, in Android 1.5, RelativeLayout does not know how to handle this case. Now look at the exact same layout running on Android 1.6:


[Screenshot: the same layout rendered correctly on Android 1.6]

As you can see, Android 1.6 now handles forward references correctly. The result on screen is exactly what you would expect when writing the layout.

Easier click listeners

Setting up a click listener on a button is a very common task, but it requires quite a bit of boilerplate code:


findViewById(R.id.myButton).setOnClickListener(new View.OnClickListener() {
    public void onClick(View v) {
        // Do stuff
    }
});


One way to reduce the amount of boilerplate is to share a single click listener between several buttons. While this technique reduces the number of classes, it still requires a fair amount of code and it still requires giving each button an id in your XML layout file:


View.OnClickListener handler = new View.OnClickListener() {
    public void onClick(View v) {
        switch (v.getId()) {
            case R.id.myButton:
                // Do stuff
                break;
            case R.id.myOtherButton:
                // Do stuff
                break;
        }
    }
};

findViewById(R.id.myButton).setOnClickListener(handler);
findViewById(R.id.myOtherButton).setOnClickListener(handler);


With Android 1.6, none of this is necessary. All you have to do is declare a public method in your Activity to handle the click (the method must have one View argument):


public class MyActivity extends Activity {
    public void myClickHandler(View target) {
        // Do stuff
    }
}


And then reference this method from your XML layout:


<Button android:onClick="myClickHandler" />
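The snippet above shows only the new attribute. In a real layout the Button would also need the usual layout attributes; a hypothetical complete declaration (the id and label here are made up for illustration) might look like:

```xml
<Button
    android:id="@+id/myButton"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:text="Click me"
    android:onClick="myClickHandler" />
```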


This new feature reduces both the amount of Java and XML you have to write, leaving you more time to concentrate on your application.

The Android team is committed to helping you write applications in the easiest and most efficient way possible. We hope you find these improvements useful and we're excited to see your applications on Android Market.

Wednesday, October 21, 2009

Bantul-Style Earthquake-Resistant Houses

Earthquake-resistant houses and buildings can, it turns out, be built cheaply. In Bantul, Yogyakarta, a number of residents have been developing earthquake-resistant houses costing between Rp 8 million and Rp 9 million per unit.

The walls use only a small amount of brick, while the frame is made of wood rather than steel and concrete.

The foundation system adopts the construction of the traditional Javanese Joglo house, using umpak, or stone bases, to support the pillars. The upper walls, up to the roof, are made of asbestos sheeting.

Besides requiring few building materials, the construction process is simple and can be carried out cooperatively by the community. Residents say they feel at ease living in these houses, which take only five days to build.

In Ngibikan village, at least a hundred residents have already built these inexpensive earthquake-resistant houses. The model could well be adopted in other earthquake-prone regions of the country.

Thursday, October 15, 2009

Maxima Denies Kicking Raditya Dika Off the Film

Maxima Pictures, the production house behind the film Menculik Miyabi, has denied kicking novelist Raditya Dika off the film, which involves the Japanese adult-film star Maria Ozawa, alias Miyabi.

"No, there's nothing like that," said Maxima Pictures General Manager Sediadi in a conversation with okezone on Tuesday (13/10/2009).

He emphasized that Maxima Pictures still has a good relationship with the boyfriend of former child singer Sherina. "There's no way we would kick Raditya out," he said.

The man known as Adi admitted that his company does not yet know exactly why Raditya left as both actor and writer. What is certain, he said, is that Maxima will continue to respect Raditya Dika as the originator of the story.

Earlier, Raditya Dika withdrew from the lead role of Raffan, which he had been slated to play. He is also no longer the film's main scriptwriter, having been replaced by Raditya Mangunsong.

His departure has fueled gossip that Maxima Pictures was upset about the leak of its plan to invite Miyabi to Jakarta. News of Miyabi's visit first surfaced in an interview Raditya gave to a national newspaper on September 12, 2009.

As a result, Maxima Pictures has faced protests and demonstrations demanding that it abandon its plan to bring Miyabi to Jakarta.

Friday, October 9, 2009

Menculik Miyabi

For some people, the word "Miyabi" is very familiar. Those in the know usually smile as soon as they hear it; those who don't are left wondering why their friends are smiling. Lately, the word "Miyabi" has become widely known, no longer the preserve of a select few, following news of the planned visit of Japanese actress Maria Ozawa (23).
Miyabi is Maria Ozawa's alias. In Japan, she is known as a porn star. A film titled Menculik Miyabi is to be produced in Indonesia. It is, of course, not a porn film, but Miyabi will appear as a cast member in the film, which is being made by the production house Maxima Pictures.

Quietly, Miyabi fever has spread. Many people are curious about what her adult films are like. It turns out that finding "Miyabi" is not hard, even for a "beginner." The word "Miyabi" has become an icon in Glodok, West Jakarta, known as the center of the pirated-film trade, where Miyabi's films and other adult titles are openly displayed.

Entering the Glodok building, your eyes immediately land on rows of Miyabi DVDs with her sweet face on the covers. The deeper you go into the corridors of Jakarta's largest electronics shopping center, the easier her films are to find. Dozens of Miyabi DVDs are lined up on racks measuring 5 meters by 1 meter, while the stall owners' eyes roam in search of potential buyers.

Support for additional screen resolutions and densities in Android

You may have heard that one of the key changes introduced in Android 1.6 is support for new screen sizes. This is one of the things that has me very excited about Android 1.6 since it means Android will start becoming available on so many more devices. However, as a developer, I know this also means a bit of additional work. That's why we've spent quite a bit of time making it as easy as possible for you to update your apps to work on these new screen sizes.

To date, all Android devices (such as the T-Mobile G1 and Samsung I7500, among others) have had HVGA (320x480) screens. The essential change in Android 1.6 is that we've expanded support to include three different classes of screen sizes:


  • small: devices with a screen size smaller than the T-Mobile G1 or Samsung I7500, for example the recently announced HTC Tattoo

  • normal: devices with a screen size roughly the same as the G1 or I7500.

  • large: devices with a screen size larger than the G1 or I7500 (such as a tablet-style device).

Any given device will fall into one of those three groups. As a developer, you can control if and how your app appears to devices in each group by using a few tools we've introduced in the Android framework APIs and SDK. The documentation at the developer site describes each of these tools in detail, but here they are in a nutshell:


  • new attributes in AndroidManifest for an application to specify what kind of screens it supports,

  • framework-level support for using image drawables/layouts correctly regardless of screen size,

  • a compatibility mode for existing applications, providing a pseudo-HVGA environment, and

  • descriptions of compatible device resolutions and minimum diagonal sizes.
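For example, the new manifest attributes are declared through a supports-screens element. The values below are purely illustrative; which screen classes you enable depends on your app:

```xml
<!-- Illustrative excerpt from AndroidManifest.xml: this app opts out of
     small screens but supports normal and large screens at any density. -->
<supports-screens
    android:smallScreens="false"
    android:normalScreens="true"
    android:largeScreens="true"
    android:anyDensity="true" />
```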

The documentation also provides a quick checklist and testing tips for developers to ensure their apps will run correctly on devices of any screen size.

Once you've upgraded your app using the Android 1.6 SDK, you'll need to make sure your app is only available to users whose phones can properly run it. To help you with that, we've also added some new tools to Android Market.

Until the next time you upload a new version of your app to Android Market, we will assume that it works for normal-class screen sizes. This means users with normal-class and large-class screens will have access to these apps. Devices with "large" screens simply run these apps in a compatibility mode, which simulates an HVGA environment on the larger screen.

Devices with small-class screens, however, will only be shown apps which explicitly declare (via the AndroidManifest) that they will run properly on small screens. In our studies, we found that "squeezing" an app designed for a larger screen onto a smaller screen often produces a bad result. To prevent users with small screens from getting a bad impression of your app (and reviewing it negatively!), Android Market makes sure that they can't see it until you upload a new version that declares itself compatible.

We expect small-class screens, as well as devices with the additional resolutions listed in Table 1 of the developer document, to hit the market in time for the holiday season. Note that not all devices will be upgraded to Android 1.6 at the same time; a significant number of users will still have Android 1.5 devices. To target both Android 1.5 and Android 1.6 devices with the same apk, build your app using the Android 1.5 SDK and test it on both the 1.5 and 1.6 system images to make sure it continues to work well on both types of devices. If you want to target small-class devices like the HTC Tattoo, build your app using the Android 1.6 SDK. Note that if your application requires Android 1.6 features but does not support a particular screen class, you need to set the appropriate attributes to false. To use optimized assets for normal-class, high-density devices such as WVGA, or for low-density devices, use the Android 1.6 SDK as well.
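As a sketch of what this dual targeting looks like in the manifest: Android 1.5 corresponds to API level 3 and Android 1.6 to API level 4, so an app built against the 1.6 SDK that still runs on 1.5 devices might declare:

```xml
<!-- Illustrative: installable on Android 1.5 (API 3) and later,
     while declaring awareness of Android 1.6 (API 4) behavior. -->
<uses-sdk
    android:minSdkVersion="3"
    android:targetSdkVersion="4" />
```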

Tuesday, October 6, 2009

ADC 2 Round 1 Scoring Complete



The response to round one of the Android Developer Challenge 2 has been phenomenal! We originally expected that it would take two weeks to gather all the data necessary to complete scoring. Instead, over the last 10 days, more than 26,000 Android users reviewed the submissions and reached our target of over 100 scores per application. Thanks to this enthusiastic support from the Android community, we are closing the first round of ADC 2 judging today.

We will now be reviewing the results and preparing for round 2. Please stay tuned for information about round 2, where the community, combined with a panel of judges, will narrow down the top 20 applications in each category to determine the final winners. Until then, users with the ADC 2 judging application currently installed will get a notice saying that round 1 is over. When round 2 opens, the judging application will resume giving out new submissions to score. We look forward to seeing the results of the final round and hope that you choose to help us score these top apps as well!

Monday, October 5, 2009

Gestures on Android 1.6

Touch screens are a great way to interact with applications on mobile devices. With a touch screen, users can easily tap, drag, fling, or slide to quickly perform actions in their favorite applications. But it's not always that easy for developers. With Android, it's easy to recognize simple actions, like a swipe, but handling more complicated gestures is difficult and requires a lot of code. That's why we have decided to introduce a new gestures API in Android 1.6. This API, located in the new android.gesture package, lets you store, load, draw, and recognize gestures. In this post I will show you how you can use the android.gesture API in your applications. Before going any further, you should download the source code of the examples.

Creating a gestures library

The Android 1.6 SDK comes with a new application pre-installed on the emulator, called Gestures Builder. You can use this application to create a set of pre-defined gestures for your own application. It also serves as an example of how to let the user define his own gestures in your applications. You can find the source code of Gestures Builder in the samples directory of Android 1.6. In our example we will use Gestures Builder to generate a set of gestures for us (make sure to create an AVD with an SD card image to use Gestures Builder). The screenshot below shows what the application looks like after adding a few gestures:


[Screenshot: Gestures Builder after adding a few gestures]

As you can see, a gesture is always associated with a name. That name is very important because it identifies each gesture within your application. The names do not have to be unique; in fact, it can be very useful to have several gestures with the same name to increase the precision of the recognition. Every time you add or edit a gesture in Gestures Builder, a file is generated on the emulator's SD card, /sdcard/gestures. This file contains the description of all the gestures, and you will need to package it inside your application, in the /res/raw resources directory.

Loading the gestures library

Now that you have a set of pre-defined gestures, you must load it inside your application. This can be achieved in several ways but the easiest is to use the GestureLibraries class:


mLibrary = GestureLibraries.fromRawResource(this, R.raw.spells);
if (!mLibrary.load()) {
    finish();
}


In this example, the gesture library is loaded from the file /res/raw/spells. You can easily load libraries from other sources, like the SD card, which is very important if you want your application to be able to save the library; a library loaded from a raw resource is read-only and cannot be modified. The following diagram shows the structure of a library:


[Diagram: the structure of a gesture library]

Recognizing gestures

To start recognizing gestures in your application, all you have to do is add a GestureOverlayView to your XML layout:


<android.gesture.GestureOverlayView
    android:id="@+id/gestures"
    android:layout_width="fill_parent"
    android:layout_height="0dip"
    android:layout_weight="1.0" />


Notice that the GestureOverlayView is not part of the usual android.widget package. Therefore, you must use its fully qualified name. A gesture overlay acts as a simple drawing board on which the user can draw his gestures. You can tweak several visual properties, like the color and the width of the stroke used to draw gestures, and register various listeners to follow what the user is doing. The most commonly used listener is GestureOverlayView.OnGesturePerformedListener which fires whenever a user is done drawing a gesture:


GestureOverlayView gestures = (GestureOverlayView) findViewById(R.id.gestures);
gestures.addOnGesturePerformedListener(this);


When the listener fires, you can ask the GestureLibrary to try to recognize the gesture. In return, you will get a list of Prediction instances, each with a name - the same name you entered in the Gestures Builder - and a score. The list is sorted by descending scores; the higher the score, the more likely the associated gesture is the one the user intended to draw. The following code snippet demonstrates how to retrieve the name of the first prediction:


public void onGesturePerformed(GestureOverlayView overlay, Gesture gesture) {
    ArrayList<Prediction> predictions = mLibrary.recognize(gesture);

    // We want at least one prediction
    if (predictions.size() > 0) {
        Prediction prediction = predictions.get(0);
        // We want at least some confidence in the result
        if (prediction.score > 1.0) {
            // Show the spell
            Toast.makeText(this, prediction.name, Toast.LENGTH_SHORT).show();
        }
    }
}


In this example, the first prediction is taken into account only if its score is greater than 1.0. The threshold you use is entirely up to you, but know that scores lower than 1.0 are typically poor matches. And this is all the code you need to create a simple application that can recognize pre-defined gestures (see the source code of the GesturesDemo project):
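The score-threshold logic itself has nothing Android-specific about it. As an illustration, here is a plain-Java sketch that uses a hypothetical stand-in for the Prediction class (just a name and a score, mirroring the fields used above) to pick the best confident match:

```java
import java.util.ArrayList;
import java.util.List;

public class PredictionPicker {
    // Minimal stand-in for android.gesture.Prediction: a name and a score.
    static class Prediction {
        final String name;
        final double score;
        Prediction(String name, double score) {
            this.name = name;
            this.score = score;
        }
    }

    // Returns the name of the highest-scoring prediction above the given
    // threshold, or null if no prediction is confident enough.
    static String bestMatch(List<Prediction> predictions, double threshold) {
        // GestureLibrary.recognize() already returns results sorted by
        // descending score; we sort here so the sketch is self-contained.
        predictions.sort((a, b) -> Double.compare(b.score, a.score));
        if (!predictions.isEmpty() && predictions.get(0).score > threshold) {
            return predictions.get(0).name;
        }
        return null;
    }

    public static void main(String[] args) {
        List<Prediction> results = new ArrayList<>();
        results.add(new Prediction("action_delete", 0.4));
        results.add(new Prediction("action_add", 3.2));
        System.out.println(bestMatch(results, 1.0)); // prints "action_add"
    }
}
```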


[Screenshot: the GesturesDemo sample application recognizing a gesture]

Gestures overlay

In the example above, the GestureOverlayView was used as a normal view, embedded inside a LinearLayout. However, as its name suggests, it can also be used as an overlay on top of other views. This can be useful to recognize gestures in a game or just anywhere in the UI of an application. In the second example, called GesturesListDemo, we'll create an overlay on top of a list of contacts. We start again in Gestures Builder to create a new set of pre-defined gestures:


[Screenshot: Gestures Builder with the new set of pre-defined gestures]

And here is what the XML layout looks like:


<android.gesture.GestureOverlayView
    xmlns:android="http://schemas.android.com/apk/res/android"
    android:id="@+id/gestures"
    android:layout_width="fill_parent"
    android:layout_height="fill_parent"
    android:gestureStrokeType="multiple"
    android:eventsInterceptionEnabled="true"
    android:orientation="vertical">

    <ListView
        android:id="@android:id/list"
        android:layout_width="fill_parent"
        android:layout_height="fill_parent" />

</android.gesture.GestureOverlayView>


In this application, the gestures view is an overlay on top of a regular ListView. The overlay also specifies a few properties that we did not need before:



  • gestureStrokeType: indicates whether we want to recognize gestures made of a single stroke or multiple strokes. Since one of our gestures is the "+" symbol, we need multiple strokes.


  • eventsInterceptionEnabled: when set to true, this property tells the overlay to steal the events from its children as soon as it knows the user is really drawing a gesture. This is useful when there's a scrollable view under the overlay, to avoid scrolling the underlying child as the user draws his gesture.


  • orientation: indicates the scroll orientation of the views underneath. In this case the list scrolls vertically, which means that any horizontal gestures (like action_delete) can immediately be recognized as a gesture. Gestures that start with a vertical stroke must contain at least one horizontal component to be recognized. In other words, a simple vertical line cannot be recognized as a gesture since it would conflict with the list's scrolling.

The code used to load and set up the gestures library and overlay is exactly the same as before. The only difference is that we now check the name of the predictions to know what the user intended to do:


public void onGesturePerformed(GestureOverlayView overlay, Gesture gesture) {
    ArrayList<Prediction> predictions = mLibrary.recognize(gesture);
    if (predictions.size() > 0 && predictions.get(0).score > 1.0) {
        String action = predictions.get(0).name;
        if ("action_add".equals(action)) {
            Toast.makeText(this, "Adding a contact", Toast.LENGTH_SHORT).show();
        } else if ("action_delete".equals(action)) {
            Toast.makeText(this, "Removing a contact", Toast.LENGTH_SHORT).show();
        } else if ("action_refresh".equals(action)) {
            Toast.makeText(this, "Reloading contacts", Toast.LENGTH_SHORT).show();
        }
    }
}


The user is now able to draw his gestures on top of the list without interfering with the scrolling:


[Screenshot: drawing a gesture on top of the contacts list without scrolling it]

The overlay even gives visual clues as to whether the gesture is considered valid for recognition. In the case of a vertical overlay, for instance, a single vertical stroke cannot be recognized as a gesture and is therefore drawn with a translucent color:


[Screenshot: a single vertical stroke drawn with a translucent color, indicating it will not be recognized as a gesture]

It's your turn

Adding support for gestures in your application is easy and can be a valuable addition. The gestures API does not even have to be used to recognize complex shapes; it will work equally well to recognize simple swipes. We are very excited by the possibilities the gestures API offers, and we're eager to see what cool applications the community will create with it.