Android Development: Missing API

This article is part 2 of 11 in the series Android Development

Not for the first time, I came across a missing API in Android yesterday. It’s not like you can’t live without it; in most cases you certainly can. But there’s a bit of functionality missing from Android’s View class that would be very handy when trying to extend existing widgets.

Maybe you’ve come across this before, and maybe you’ll notice it if I ask you to take a look at the sorts of events you can listen for on views:

  • Clicks
  • Context menu creation
  • Change of focus
  • Key presses
  • Long clicks
  • Touches

Several of these — clicks, key presses and touches — clearly belong to the same group of events, that is, events triggered directly by the user interacting with input devices. There’s a bit of translation going on for you, such as delivering a key press not as a raw key code, but as the symbol the code stands for with the user’s keyboard settings taken into account. Other than that, you get pretty raw information about what the user does.

By contrast, change of focus and context menu creation are clearly high-level events, where user input is interpreted, context is taken into account, and an event is launched that describes a change in application state. Long clicks fall somewhere in between the two types: they’re not quite as raw as clicks, but by taking time into account, some form of very simple user gesture is recognized for you, which takes the event to a higher level of abstraction.

Well, not quite. Clicks aren’t exactly the act of clicking — or, in the case of Android’s touch interfaces, tapping. Strictly speaking, clicks describe the moment when the finger that pressed down on a UI element is lifted (or, if you think of mice, when the mouse button is released).

As such, clicks require almost as much interpretation as long clicks. And when you compare change of focus to both, you’ll realize that it belongs in the same category: all three describe changes in the UI’s state. We end up with three different levels at which events are generated.

  1. Input events: events generated by user input, i.e. key presses and touches.
  2. UI events: events signalling changes in state significant to UI development, i.e. clicks, long clicks and focus changes.
  3. Application events: events signalling changes in state significant to application development, i.e. selection of subviews or context menus being created.

The lines between these categories may be blurrier than I make them out to be, but the distinction helps illustrate what’s missing. Here’s a hint: I’m talking about the second of the three categories.

If you haven’t figured it out yet, focus (haha) on the focus changes. The View class can at all times inform you of its focused state via the handy isFocused() function. Similarly, isSelected() informs you — where appropriate — whether a view within a group of related views is selected. There’s also a function to figure out whether the user is currently pressing down on a view, which is of special importance to buttons: isPressed().

Notice that you can listen to focus change events and selection events, but there is no way you can listen to pressed events. Why is that?

More importantly, why should you care?

Well, I recently had need of what’s essentially a regular ToggleButton, except that it needed a bit more than just text as its content. There’s a limitation here which I’ll rant about a bit soon. Forget about it for now.

The thing is, you can’t add content to a ToggleButton — what you can do, however, is overlay views on top of it. The only problem is that those overlaid views won’t enter the pressed state when you tap the button. You somehow have to capture that state off the button and propagate it to the overlay.
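The workaround boils down to overriding the button’s setPressed() and pushing the same state onto the overlay. Here’s a framework-free sketch of that pattern: the Android classes are replaced by a minimal stand-in (PressableView) so it runs without the SDK; in real code the subclass would extend ToggleButton, and setPressed() is the actual View method you’d override.

```java
// Minimal stand-in for the relevant slice of android.view.View:
// a view that tracks its own pressed state.
class PressableView {
    private boolean pressed;

    public void setPressed(boolean pressed) {
        this.pressed = pressed;
    }

    public boolean isPressed() {
        return pressed;
    }
}

// Hypothetical button subclass: whenever its own pressed state changes,
// it forwards the same state to the view overlaid on top of it.
class ForwardingToggleButton extends PressableView {
    private final PressableView overlay;

    ForwardingToggleButton(PressableView overlay) {
        this.overlay = overlay;
    }

    @Override
    public void setPressed(boolean pressed) {
        super.setPressed(pressed);
        overlay.setPressed(pressed); // propagate pressed state to the overlay
    }
}
```

This keeps the overlay’s pressed state in lockstep with the button’s, so the overlaid content can render its pressed appearance at the right moments.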

The whole thing would be a breeze if View offered functionality for listening to changes in its pressed state, but that’s not the case.
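For illustration, here’s a sketch of what such a listener API might look like, modeled after the existing focus change listener. Everything here is hypothetical — none of these names exist in Android — and ViewStandIn only models the relevant slice of View’s behavior:

```java
// Hypothetical listener interface, in the style of View.OnFocusChangeListener.
interface OnPressedChangeListener {
    void onPressedChanged(ViewStandIn view, boolean pressed);
}

// Stand-in modeling only the pressed-state portion of a view.
class ViewStandIn {
    private boolean pressed;
    private OnPressedChangeListener listener;

    public void setOnPressedChangeListener(OnPressedChangeListener listener) {
        this.listener = listener;
    }

    public void setPressed(boolean pressed) {
        if (this.pressed == pressed) {
            return; // only report actual changes, as other listeners do
        }
        this.pressed = pressed;
        if (listener != null) {
            listener.onPressedChanged(this, pressed);
        }
    }

    public boolean isPressed() {
        return pressed;
    }
}
```

With something like this on View, the overlay problem above would reduce to registering one listener on the button and calling setPressed() on the overlay from the callback.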