LeafySeadragon: Cetacean-Human Network

   
By Dana Nourie with Serge Masse, July 2004  

Oceans are constantly moving because of currents and wave action. The water is dense, high in salinity, and visibility is frequently low due to plankton blooms and decaying plant and animal matter. This is why cetaceans (whales, dolphins, and porpoises) have developed a way of navigating, communicating, and finding food through the use of sound. Nearly 80 species of cetaceans use varying levels and frequencies of sound.

The meanings of the sounds that different cetaceans produce are still not well understood, but scientists agree that the sounds cetaceans make are critical to their survival. Without sound, these aquatic mammals couldn't find food, navigate, attract mates, keep track of their young, avoid collisions with unseen obstacles, or avoid predators. The LeafySeadragon system is designed for cetacean research and for setting up a worldwide, Internet-based cetacean listening and two-way acoustic interaction system (human-to-cetacean and vice versa).


The developer notes, "I chose the name LeafySeadragon for this project because it has the word sea in it, and because of the word dragon, which is a strong word, full of imagery, mythology and power. Another characteristic is that seahorses are an endangered species and need our help. So, a Leafy Sea Dragon is an excellent mascot as well as a living model for Leafy.dev.java.net. The Leafy Sea Dragon has a cousin species called a Weedy Sea Dragon, and I am using this name for a modified version of the system for use between human divers, with or without surface vessels."

The system converts underwater cetacean whistles into above-water audio and into text form for human users, and also converts text messages into underwater whistles, in near real time. LeafySeadragon is an open source grid prototype that sends simulated cetacean sounds over the Internet and plays them back on an ordinary PC or handheld device. It is built on the generic Leafy API for peer-to-peer applications using sockets.

Leafy is a generic API for distributed applications using mobile Java 2 Micro Edition (J2ME) and non-mobile Java 2 Standard Edition (J2SE) devices, while LeafySeadragon is a complete application built on top of Leafy. This version also includes a generic grid package, org.smgridlayer0, that can be used for any type of IP-socket-based distributed application.

Because this project is open source, you can help in its development.

Leafy and LeafySeadragon Architecture

All of the software is written in the Java programming language: signal acquisition, audio synthesis, pattern recognition, the human user interface, and the TCP socket connections between multiple backbone nodes and end-point nodes. The J2SE components of Leafy and LeafySeadragon require J2SE 1.4 or higher to run.

When working with the LeafySeadragon and Leafy code, files are organized in five top-level directories, or rather five projects: three for Leafy and two for LeafySeadragon. This decentralized structure supports a mix of J2SE and J2ME code with a minimum of duplication and therefore a maximum of inheritance (the .class file structures are omitted for brevity):

Legend:
c = cetacean
h = human
ls = LeafySeadragon
j2 = parent to components using j2me and j2se
pro = properties
app = application
comm = communication
dir = directory


Leafy and Nodes

Leafy is the generic API for developing robust networked applications. It is composed of three subprojects, or packages: leafyj2, leafyj2me, and leafyj2se. To use them in a J2SE application, jar leafyj2 and leafyj2se together.

A node is an object from the Leafy generic API whose job is to manage communication with other nodes; there may be more than one node per VM. For each peer node that a given application is expected to communicate with, the node creates a CommControl object that handles two-way communication with that peer. There is one CommControl for each peer, remote or not, and each CommControl manages the Comm objects.

Each CommControl creates a Comm object for the required type of communication with the peer, such as a socket, a local procedure call (for a local peer), or, in the future, HTTP (with or without web services standards), JXTA, or some other protocol. Nodes can also have a CommControl for managing a server socket if their security level is public or protected. Security levels in Leafy are analogous to the visibility qualifiers on the Java platform.

A public node can be accessed by any other Leafy-compatible node. A protected node can only be accessed by a node identified in its property file. A private node does not have a server socket, so it cannot be connected to from a remote node. Nodes and the communication classes they use are designed to be easy to set up because they have minimal properties, kept in a single property file per VM, and they are designed to be resilient to communication failures.
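
The layering can be pictured roughly as follows. This is an illustrative sketch only, not the actual Leafy source: the constructors, fields, and method names below are assumptions made for the example.

    // Illustrative sketch of the Node / CommControl / Comm layering; not the real Leafy classes.
    import java.util.HashMap;
    import java.util.Map;

    interface Comm {                       // one transport strategy per peer
        void send(String message);
    }

    class SocketComm implements Comm {     // remote peer: TCP socket stream
        private final String host;
        private final int port;
        SocketComm(String host, int port) { this.host = host; this.port = port; }
        public void send(String message) { /* write the message to the socket stream */ }
    }

    class LocalComm implements Comm {      // peer in the same VM: ordinary method invocation
        public void send(String message) { /* call the peer node directly */ }
    }

    class CommControl {                    // owns the Comm objects for one peer
        private final Comm comm;
        CommControl(Comm comm) { this.comm = comm; }
        void sendToPeer(String message) { comm.send(message); }
    }

    class Node {                           // manages one CommControl per peer
        private final Map controls = new HashMap();   // peer name -> CommControl

        void addRemotePeer(String name, String host, int port) {
            controls.put(name, new CommControl(new SocketComm(host, port)));
        }
        void addLocalPeer(String name) {
            controls.put(name, new CommControl(new LocalComm()));
        }
        void send(String peerName, String message) {
            ((CommControl) controls.get(peerName)).sendToPeer(message);
        }
    }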

LeafySeadragon Nodes and Applications

      
 

"Scientists study cetaceans by recording and analyzing their sounds to understand all of these uses of sound," says David K. Mellinger Assistant Professor, Senior Research Cooperative Institute for Marine Resources Studies. "For instance, our lab here analyzes sounds to try to understand whale movements, distributions, habitats, and populations. Other researchers look at cetacean behavior, social groups, interactions, echolocation (sonar), and so on."

 

LeafySeadragon contains three types of applications: c, c2h, and h. They communicate over the network by using the node classes from the Leafy API. For example, the HRelay application is a SeadragonMain class that uses an HNode class for its communication, where HNode extends the generic Node class from Leafy.

LeafySeadragon's three basic types of applications are:

  • Cetacean Interface Application (c): This type of application manages the communication with the hardware that captures and emits sounds underwater. In the dry simulation mode, for testing the network without a real cetacean, this application can generate voltage numbers internally to simulate the voltages that would be received from hydrophones picking up underwater sound. The voltage numbers are converted to frequency numbers, which are sent as XML text messages by the CNode to a remote peer C2hNode on a TCP socket (a simplified sketch of this dry-simulation path follows this list). A Cetacean Interface is composed of a SeadragonMain class using a CNode for communication.
  • Converter Application (c2h): Converts cetacean signals to human formats and, in the other direction, human formats to cetacean signals. A Converter is composed of a SeadragonMain class using a C2hNode for communication.
  • Human Interface Application (h): Plays and displays the cetacean signals, writes the text forms of the signals, and also reads and sends human signals to cetaceans, through the Converter and Cetacean Interface applications. LeafySeadragon presently contains these types of Human Interface Applications, all children of SeadragonMain and using HNode: HBackbone, HRelay, HApplet, and HMidlet.
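
The dry simulation path described in the Cetacean Interface item above can be pictured with a small stand-alone sketch. This is hypothetical code: the real CNode API, the XML element names, and the conversion LeafySeadragon actually uses are not shown in this article, so the names and numbers below are invented for illustration.

    // Hypothetical dry-simulation sketch; not the actual LeafySeadragon CNode code.
    import java.util.Random;

    public class DrySimulationSketch {
        public static void main(String[] args) {
            double sampleRate = 200000.0;        // 200 kHz sampling covers a 1-100 kHz signal band
            double[] voltages = simulateHydrophone(12000.0, sampleRate, 2048);
            double hz = estimateFrequency(voltages, sampleRate);
            String xml = "<signal><hz>" + Math.round(hz) + "</hz></signal>";   // invented element names
            System.out.println("would send to a C2hNode over TCP: " + xml);
        }

        /** Simulated hydrophone voltages: a pure tone plus a little noise. */
        static double[] simulateHydrophone(double hz, double sampleRate, int n) {
            Random rnd = new Random();
            double[] v = new double[n];
            for (int i = 0; i < n; i++) {
                v[i] = Math.sin(2 * Math.PI * hz * i / sampleRate) + 0.05 * rnd.nextGaussian();
            }
            return v;
        }

        /** Crude dominant-frequency estimate from zero crossings (two crossings per cycle). */
        static double estimateFrequency(double[] v, double sampleRate) {
            int crossings = 0;
            for (int i = 1; i < v.length; i++) {
                if ((v[i - 1] < 0) != (v[i] < 0)) crossings++;
            }
            double seconds = v.length / sampleRate;
            return (crossings / 2.0) / seconds;
        }
    }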

The applications can be configured to run on individual hosts or on the same host and same VM. In the latter configuration, the applications sharing a VM communicate by ordinary method invocation instead of through socket streams. This communication feature is implemented in the Comm family of classes, whose ancestors are in the leafyj2 package.

 

The current version of LeafySeadragon, version 0.5.2, contains extensive logic to set up the nodes of the application with minimal user input and to detect and report anomalies that may occur. Once the properties file is edited for the specific local features, such as the security level of local nodes and the IP address of peer nodes, the application sets up the local nodes and their communication without user intervention, other than launching the batch file or selecting an icon representing a jar file.

The minimum configuration is a network of three applications, one of each type: c, c2h, and h. These could all be in the same VM, on individual VMs, or a mix. The minimum configuration is called a backbone. For example, the main class of an h application in a backbone is the HBackbone class, a child of SeadragonMain.

For a specific network configuration, one could add more applications to the network. For example, the current version supports any number of h applications, implemented as applets, MIDlets (MIDP 2.0), and desktop applications. These are the HApplet, HMidlet, and HRelay classes. The HApplet and HMidlet applications are end points and can only connect to an HBackbone for security reasons, while the HRelay can be used as a JXTA-like node and relay messages to and from other h applications, thus implementing a peer-to-peer configuration.

The applications and nodes in this release are prototypes with basic functionality. For example, there are three predefined signals, and one must write Java code to edit them or create new signals. The Converter needs to be expanded into a grid of inexpensive computers to perform pattern recognition analysis on a large set of signals in the lexicon (the Lex class). The system is also designed to have many c hosts so that multiple remote cetaceans could be involved in communication sessions at the same time, over the Internet.

The application detects the readiness of nodes and possible anomalies.

Java Sound API and Java USB API (JSR-80)

The Java Sound API specification provides low-level support for audio operations in the human frequency range (0-20 kHz) such as audio playback and capture (recording), mixing, MIDI sequencing, and MIDI synthesis in an extensible, flexible framework.

The currently used simulation signals, in class Lex, contain predefined frequencies ranging from 9 kHz to 50 kHz. The lexicon is basically a list of Signal instances.

 
Using Java Sound to emit cetacean signals in human frequencies

A Signal contains all parameters needed to use the data in another group or system. A future version may use XML files to store Signal data and load the data into RAM at once, in the form of Signal instances in a Hashtable. Clients use Lex.IT.get(signalName) to get a Signal from the lexicon.

A C2hNode's methods handle the signal format conversions and rely on Lex methods for the details of the conversion and for the storage of signals. Lex signals are stored in a hashtable (with the text as key) in order to optimize h2c conversion and storage by not having duplicates.
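
A minimal sketch of that lookup pattern is shown below. It assumes a Signal class with only a text form and a frequency list; the real Signal fields, and the way Lex actually populates its table, are not shown in this article, and the example values are invented.

    // Simplified sketch of the lexicon lookup pattern; not the real Lex and Signal classes.
    import java.util.Hashtable;

    class Signal {
        final String text;          // text form, used as the hashtable key
        final double[] hzList;      // time-ordered frequency values
        Signal(String text, double[] hzList) { this.text = text; this.hzList = hzList; }
    }

    class Lex {
        static final Lex IT = new Lex();                      // singleton, as in Lex.IT.get(signalName)
        private final Hashtable signals = new Hashtable();    // text -> Signal, so there are no duplicates

        private Lex() {
            put(new Signal("hello", new double[] { 9000, 12000, 15000 }));   // invented example signal
        }
        void put(Signal s) { signals.put(s.text, s); }
        Signal get(String signalName) { return (Signal) signals.get(signalName); }
    }

A client then calls Lex.IT.get("hello") to retrieve the Signal and read its frequency values.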

Both cSignals and hzList instances are time-ordered to optimize the positioning of new Hz values in a multithreaded design.

Until a USB device capable of supporting a 1-100 kHz frequency range becomes available, LeafySeadragon will probably use the Java Sound API to communicate with hydrophones connected to the microphone and earphone jacks of a laptop, and will thus be limited to 1-19 kHz. This may be sufficient for some species.

LeafySeadragon is designed to work with acquired signals ranging from 1 kHz to 100 kHz, and this frequency range is flexible: it can be lowered if the host CPU speed is insufficient for this range, or raised if the CPU can support a higher limit.

In the c2h application on a backbone, recognized signals are assigned their text form, such as a word or id number, which is obtained from the matched signal in the lexicon, while unrecognized signals are given a system-generated text form and added to the lexicon. The signals, with their text form and frequency numbers, are sent to h nodes for reading and hearing by people. H node classes are in the h package. For hearing by people, the frequency numbers of a signal are converted into sound in air by using the Java Sound API.
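
That last step, turning a frequency number into audible sound in air, needs nothing beyond the standard Java Sound classes. The following is a minimal stand-alone sketch, not the LeafySeadragon playback code; the sample rate, tone, and duration are arbitrary choices for the example.

    import javax.sound.sampled.AudioFormat;
    import javax.sound.sampled.AudioSystem;
    import javax.sound.sampled.DataLine;
    import javax.sound.sampled.LineUnavailableException;
    import javax.sound.sampled.SourceDataLine;

    /** Plays one pure tone through the default audio device using the Java Sound API. */
    public class ToneSketch {
        public static void main(String[] args) throws LineUnavailableException {
            playTone(9000.0, 500);                                 // a 9 kHz tone for 500 ms
        }

        static void playTone(double hz, int millis) throws LineUnavailableException {
            float sampleRate = 44100f;
            // 16-bit signed mono linear PCM, little-endian
            AudioFormat format = new AudioFormat(sampleRate, 16, 1, true, false);
            DataLine.Info info = new DataLine.Info(SourceDataLine.class, format);
            SourceDataLine line = (SourceDataLine) AudioSystem.getLine(info);
            line.open(format);
            line.start();
            int samples = (int) (sampleRate * millis / 1000);
            byte[] buf = new byte[samples * 2];
            for (int i = 0; i < samples; i++) {
                short v = (short) (0.8 * Short.MAX_VALUE
                        * Math.sin(2 * Math.PI * hz * i / sampleRate));
                buf[2 * i] = (byte) (v & 0xff);                    // low byte first (little-endian)
                buf[2 * i + 1] = (byte) ((v >> 8) & 0xff);
            }
            line.write(buf, 0, buf.length);
            line.drain();
            line.close();
        }
    }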

The Java Sound API provides the lowest level of sound support on the Java platform. It provides application programs with a great amount of control over sound operations, and it is extensible. For example, the Java Sound API supplies mechanisms for installing, accessing, and manipulating system resources such as audio mixers, MIDI synthesizers, other audio or MIDI devices, file readers and writers, and sound format converters. The Java Sound API does not include sophisticated sound editors or graphical tools, but it provides capabilities upon which such programs can be built. It emphasizes low-level control beyond that commonly expected by the end user.

Java Sound is used in LeafySeadragon for the repetition of cetacean sounds in the human audio range, that is up to 20 kHz, while the LeafySeadragon components that handle cetacean sounds go up to 100 kHz and possibly higher because some cetacean species can emit sounds up to 300 kHz. The frequency range required for effective communication for any species is still not known.

The Java Sound API includes two PCM encodings that use linear quantization of amplitude, and signed or unsigned integer values. Linear quantization means that the number stored in each sample is directly proportional (except for any distortion) to the original sound pressure at that instant and similarly proportional to the displacement of a loudspeaker or eardrum that is vibrating with the sound at that instant.

A frame contains the data for all channels at a particular time. For PCM-encoded data, the frame is simply the set of simultaneous samples in all channels, for a given instant in time, without any additional information. In this case, the frame rate is equal to the sample rate, and the frame size in bytes is the number of channels multiplied by the sample size in bits, divided by the number of bits in a byte.
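
As a concrete check of that arithmetic, for 16-bit stereo PCM the frame size works out to 2 channels * 16 bits / 8 = 4 bytes, and AudioFormat reports the same values directly (a small stand-alone check, not project code):

    import javax.sound.sampled.AudioFormat;

    public class FrameSizeCheck {
        public static void main(String[] args) {
            // 44.1 kHz, 16 bits per sample, 2 channels, signed, little-endian PCM
            AudioFormat fmt = new AudioFormat(44100f, 16, 2, true, false);
            int computed = fmt.getChannels() * fmt.getSampleSizeInBits() / 8;   // 2 * 16 / 8 = 4 bytes
            System.out.println("frame size: " + fmt.getFrameSize() + " bytes, computed: " + computed);
            System.out.println("frame rate: " + fmt.getFrameRate() + ", sample rate: " + fmt.getSampleRate());
        }
    }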

The communication interface with the hydrophones (transducers) uses its own custom acoustic Java software coupled with the USB 2.0 API (JSR-80), because serial or parallel communications are not fast enough for cetacean acoustics without an expensive hardware interface. Code for the hydrophone interface is in progress and will probably be an individual subproject in Leafy, because it could be reused in other applications that are not using Leafy or LeafySeadragon.

Java Logging API

The Java Logging APIs, introduced in package java.util.logging, capture information such as security failures, configuration errors, performance bottlenecks, and/or bugs in the application or platform. The core package includes support for delivering plain text or XML-formatted log records to memory, output streams, consoles, files, and sockets. In addition, the logging APIs are capable of interacting with logging services that already exist on the host operating system.

The LeafySeadragon session reports contain all the communication that occurred during a given session with cetaceans, to be used offline by researchers. Because the reports use XML tags (based on the plain logger.dtd that comes with the SDK), another application can easily read them afterwards and replay a previous session. This makes it possible to try to reproduce sessions, an important research feature of LeafySeadragon.

The trace log contains technical trace texts to be used by the administrator of the system, such as for debugging.

These two types of reports use the same logging API, but they have very different written formats.
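
As an illustration of how one logging API can produce both formats, the sketch below writes XML session records with an XMLFormatter and plain-text trace records with a SimpleFormatter. The file names, logger names, and message contents are invented; the real LeafySeadragon report layout is not shown in this article.

    import java.io.IOException;
    import java.util.logging.FileHandler;
    import java.util.logging.Level;
    import java.util.logging.Logger;
    import java.util.logging.SimpleFormatter;
    import java.util.logging.XMLFormatter;

    public class LoggingSketch {
        public static void main(String[] args) throws IOException {
            // Session report: XML records (logger.dtd format) that another program can parse and replay.
            Logger session = Logger.getLogger("session");
            FileHandler sessionFile = new FileHandler("session-report.xml");
            sessionFile.setFormatter(new XMLFormatter());
            session.addHandler(sessionFile);

            // Trace log: plain text for the system administrator.
            Logger trace = Logger.getLogger("trace");
            FileHandler traceFile = new FileHandler("trace.log");
            traceFile.setFormatter(new SimpleFormatter());
            trace.addHandler(traceFile);
            trace.setLevel(Level.FINE);

            session.info("received signal hz=12000 text=sig_0001");   // invented message content
            trace.fine("CNode connected to C2hNode on port 9000");    // invented message content
        }
    }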

Leafy and LeafySeadragon J2SE-with-J2ME

Packages are organized so that classes can be reused in both J2ME and J2SE with a minimum of duplication.

Roadmap Summary

There are still many important features to be implemented, and developers have a variety of areas in which to put their programming skills to use.

 

  • Add graphics to the h applications for the user interface, probably using SVG and JSR-226 for J2ME
  • Use Java USB API (JSR-80) to connect hydrophones using a USB ADC-DAC device
  • Improve C2hNode to use a grid (of a type yet to be determined) to improve the recognition of acquired signals (sounds from a cetacean) and convert them to human forms. The current version does the conversion, but on a single node, and the process is CPU-intensive.
  • Add cetacean sounds on handheld devices, including cellphones (J2ME MIDP 2.0)
  • Add communication research functionalities, such as exchange of data and components between research teams, and support session replication
  • Add a grammatical theories manager (a grammar is a layer of structure above the signal level) with ergonomic features for researchers to modify grammars without changing the source code, and to exchange grammars with other research teams
  • Upgrade the threading techniques to use JSR-166 API in the J2SE classes
  • Complete the JavaDoc
  • Test and debug the application further
  • Set up more backbone networks, running live demos on the Internet

This project is hosted on the java.net site, which provides a common area for interesting conversations and innovative development projects related to Java technologies.

Developer and project owner Serge Masse is President and Chief Technology Officer of Simplecode, Inc., and he is speaking at the 2004 JavaOne session for this project.

For More Information

Leafy API
LeafySeadragon
Java Logging API
Java Sound API
java.net web site
