
Workshop Program

Workshop on Scientific Instruments and Sensors on the Grid

Session 1: Plenary Session
David De Roure ‘Instruments, Sensors and the Semantic Grid’
Marimuthu Palaniswami ‘Sensors on the Grid’
Session 2
Tomas Molina ‘A Generalized Service-Oriented Architecture for Remote Control of Scientific Imaging Instruments.’
Tharaka Devadithya, Kenneth Chiu, Donald F. (Rick) McMullen, and Kia Huffman ‘The Common Instrument Middleware Architecture: Overview of Goals and Implementation.’
Andrew Stephen McGough ‘The GRIDCC Project.’
Session 3
Frederic Villeneuve-Seguier ‘Reprocessing D0 data with SAMGrid.’
Jeremy G Frey and Jamie Michael Robinson ‘Sensor Networks and Grid Middleware for Laboratory Monitoring.’
Ian Atkinson ‘Sensor Networks on the Great Barrier Reef: Overview and Progress.’

Session 4

Claudio Vuerli ‘Monitoring and remote control of scientific instrumentation through the Grid.’
Roberto Pugliese ‘Elettra Virtual Collaboratory: the evolution of a Virtual Laboratory Software from a simple web application to the GRIDCC.’
Mike Hursthouse and Simon Coles ‘Grid-enabling an existing instrument-based national service.’


Session 2

Title: A Generalized Service-Oriented Architecture for Remote Control of Scientific Imaging Instruments
Authors: Tomas Molina, George Yang, Abel Lin, Steven Peltier and Mark Ellisman.
National Center for Microscopy and Imaging Research, University of California at San Diego, 9500 Gilman Drive, BSB 1000, La Jolla, CA 92093-0608.
Abstract. Scientific imaging instruments are used in a variety of disciplines to gather vital data for research and study. In the biomedical field in particular, various types of biological imaging instruments, such as electron microscopes and light microscopes, are used every day to acquire 2D and 3D datasets for further understanding of biological structures. Remote operation or “tele-operation” of instruments has become a popular solution for research scientists who need to acquire and share data across research domains separated by geographical barriers. This paper presents a generalized software architecture that uses emerging software technologies to provide a reusable framework for integrating instruments for remote operation in a safe and secure fashion. Web services have emerged as a popular technology for providing software applications with a framework for interoperability and integration with other applications.
This generalized software architecture was developed to take advantage of web-service middleware technology and to provide a solution for easily plugging in scientific imaging instruments for tele-operation. The architecture also incorporates Grid technology to achieve a more scalable and robust solution for handling the enormous datasets produced by these instruments. Finally, a set of client libraries is presented to demonstrate a useful API with which developers can quickly build a graphical user interface to communicate with and acquire data from these instruments.
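The reusable-framework idea described above — a uniform service interface behind which any instrument can be plugged in — can be sketched as follows. This is an illustrative sketch only, not the paper's actual API: the class and method names (`ImagingInstrument`, `InstrumentService`, `move_stage`, `acquire_image`) are assumptions made for the example.

```python
from dataclasses import dataclass, field

# Sketch of the service-facade pattern: each instrument is wrapped behind a
# uniform interface so that middleware can expose the same operations for
# any microscope. All names here are illustrative, not from the paper.

@dataclass
class ImagingInstrument:
    """Minimal stand-in for an instrument driver."""
    name: str
    position: tuple = (0.0, 0.0)
    acquired: list = field(default_factory=list)

    def move_stage(self, x: float, y: float) -> tuple:
        self.position = (x, y)
        return self.position

    def acquire_image(self) -> dict:
        # A real driver would return pixel data; here we record metadata only.
        frame = {"instrument": self.name, "at": self.position,
                 "frame_id": len(self.acquired)}
        self.acquired.append(frame)
        return frame

class InstrumentService:
    """Generic facade: one interface for any registered instrument."""
    def __init__(self):
        self._instruments = {}

    def register(self, instrument: ImagingInstrument) -> None:
        self._instruments[instrument.name] = instrument

    def invoke(self, name: str, operation: str, *args):
        # Dispatch by instrument name and operation, mimicking how a
        # web-service endpoint routes a remote call to a driver.
        return getattr(self._instruments[name], operation)(*args)

service = InstrumentService()
service.register(ImagingInstrument("EM-300kV"))
service.invoke("EM-300kV", "move_stage", 1.5, -2.0)
frame = service.invoke("EM-300kV", "acquire_image")
print(frame["at"])  # stage position recorded with the acquired frame
```

The point of the dispatch layer is that adding a new instrument requires only a new driver class, not changes to the service interface — the reuse property the abstract emphasises.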

Title: The Common Instrument Middleware Architecture: Overview of Goals and Implementation
Authors: Tharaka Devadithya, Kenneth Chiu, Donald McMullen and Kia Huffman.
Indiana University and SUNY Binghamton.
Abstract. Instruments and sensors and their accompanying actuators are essential to the conduct of scientific research. In many cases they provide observations in electronic format and can be connected to computer networks with varying degrees of remote interactivity. These devices vary in their architectures and the type of data they capture, and may generate data at various rates. In this paper we present an overview of the design goals and initial implementation of the Common Instrument Middleware Architecture (CIMA), a framework for making instruments and sensors network accessible in a standards-based, uniform way, and for interacting remotely with instruments and the data they produce. Some of the issues CIMA addresses include: flexibility in network transport, efficient and high-throughput data transport, the availability (or lack) of computational, storage and networking resources at the instrument or sensor platform, evolution of instrument design, and reuse of data acquisition and processing.

Title: The GRIDCC Project
Authors: David Colling and Andrew Stephen McGough.
Department of Computing, Imperial College London, London, SW7 2BZ, UK.
Abstract. The GRIDCC project is integrating remote interaction with instruments into the Grid, along with distributed control and real-time interaction. The GRIDCC middleware is being designed with use cases from a very diverse set of applications, so the GRIDCC architecture provides access to instruments in as generic a way as possible. The middleware will be validated on a representative subset of these applications. GRIDCC is also developing an adaptable user interface and a mechanism for performing complex workflows, in order to increase both the usability and the usefulness of the system. Wherever possible the GRIDCC middleware builds on top of other middleware stacks, allowing the effort to be concentrated on the more novel elements of the project. The GRIDCC project is a collaboration between 10 organizations in 4 different countries and is funded by the European Union.


Session 3

Title: Reprocessing D0 data with SAMGrid
Author: Frederic Villeneuve-Seguier, Imperial College London.
High Energy Physics Department, The Blackett Laboratory, Prince Consort Road, London SW7 2BW, UK.
Abstract. The Dzero experiment studies proton-antiproton collisions at the Tevatron collider based at Fermilab. Processing, managing and distributing the large amount of real data coming from the detector, as well as generating sufficient Monte Carlo data, are some of the challenges faced by the Dzero collaboration. SAMGrid combines the SAM data handling system with the necessary job and information management, allowing us to use the distributed computing resources in the various worldwide computing centres. This is one of the first large-scale grid applications in High Energy Physics. After successful Monte Carlo production and a limited data reprocessing in the winter of 2003/04, the next milestone is the reprocessing of the full current RunII data set. This consists of ~500 TB of data, encompassing one billion events; reprocessing started in April 2005, and more than 650 million events have already been reprocessed.

Title: Sensor Networks and Grid Middleware for Laboratory Monitoring
Authors: Jamie Michael Robinson, Jeremy G Frey, Andy J Stanford-Clark, Andrew D Reynolds and Bharat V Bedi, School of Chemistry, University of Southampton, SOUTHAMPTON, SO17 2HJ, England.
Abstract. By combining automatic environment sensing and experimental data collection with broker-based messaging middleware, a system has been produced for the real-time monitoring of experiments whilst away from the lab. Changes in the laboratory environment are encapsulated as simple XML messages, which are published using an MQTT-compliant broker. Clients subscribe to the MQTT stream and perform a data transform on the messages; this may be to produce a user display or to change the format of the message for republishing. For example, an MQTT client written for the Java MIDP platform can be run on a smart-phone with a GPRS Internet connection, freeing us from the constraints of the network. We present an overview of the technologies used, and how these are helping chemists make the best use of their time.
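The publish/subscribe flow the abstract describes — a sensor reading wrapped as a small XML message, pushed to a broker, and transformed by subscribers — can be sketched as follows. A real deployment would use an MQTT broker (the abstract's system does); here a tiny in-memory broker stands in so the example is self-contained, and the topic name and XML element names are invented for illustration.

```python
import xml.etree.ElementTree as ET
from collections import defaultdict

class Broker:
    """Minimal in-memory stand-in for an MQTT-style topic broker."""
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)

    def publish(self, topic, payload):
        # Deliver the payload to every subscriber of this topic.
        for cb in self._subs[topic]:
            cb(topic, payload)

def sensor_message(sensor, value, units):
    """Encapsulate one environment reading as a simple XML message."""
    msg = ET.Element("reading", sensor=sensor, units=units)
    msg.text = str(value)
    return ET.tostring(msg, encoding="unicode")

received = []
broker = Broker()
# A client subscribes and transforms the message (here: parse out the value,
# as a display client or republisher would).
broker.subscribe("lab/fumehood/temperature",
                 lambda topic, payload: received.append(ET.fromstring(payload).text))
broker.publish("lab/fumehood/temperature",
               sensor_message("temperature", 21.4, "C"))
print(received)  # the parsed value seen by the subscriber
```

The decoupling is the key design point: the sensor publishes once, and any number of clients (a lab display, a phone client, a republisher) subscribe independently without the sensor knowing about them.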

Title: Sensor Networks on the Great Barrier Reef: Overview and Progress
Author: Ian Atkinson, VeRG Lab, School of Information Technology, James Cook University, Townsville, Queensland, 4811, Australia.
Abstract. The environmental dynamics of marine systems such as the Great Barrier Reef (GBR) are highly complex. With over 3,200 reefs extending over 280,000 km², and fluctuations ranging from kilometres (oceanic mixing) to millimetres (inter-skeletal currents), understanding the GBR presents challenges at many levels. To manage anthropogenic stresses on the GBR effectively, an observational knowledge base orders of magnitude larger than that presently available is required. Many parts of the GBR will remain under-sampled as a result of the economically unviable manual sampling methods currently in use.

Clearly the only form of observational system that has the capacity to meet this requirement is some form of remote sensor network system. We are therefore in the process of developing a pilot sensor network to test and research the concept in our environment, with the long-term goal of a complete GBR observation system. Our basic environmental sensor platform measures temperature, salinity and light, and initial site locations are Davies Reef, Magnetic Island and Heron Island in North Queensland. We hope the gathered data will give us a better understanding of the relationship between various environmental parameters, the impact of temperature changes on coral reefs and the impact of global warming on the GBR system.

As far as possible, we are building the sensor network to use standard network protocols and hardware. The sensor packages are IP-based and spatially aware, and will eventually be able to adapt to the conditions they are monitoring. There are several major challenges in the rollout of a sensor network across the GBR:
• Establishing long range communications to outer reefs (>100 km)
• Local short range/ad hoc sensor networks for reef deployment
• Sensor design, packaging, development and power supply
• Handling large numbers of independent data streams
• Data curation and management

In addition, deployment of sensors in marine environments raises particular problems. One of the major ones is how to establish a low-power, low-cost data link from remote reefs to the shore. Fortunately, we can exploit the phenomenon of humidity ducts to channel low-power microwave transmission over long ranges. However, this technique is not consistently reliable: bandwidth and availability vary with local climatic conditions (humidity, rainfall, etc.). We are developing a multi-scale, network-adaptable approach to the sensor network to cope with this unreliability.

While traditional approaches to coping with large volumes of real-time sensor data involve the deployment of systems such as object ring buffers (ORBs), we have used an ORB-less approach in order to reduce complexity and cost. Data in our environment is streamed directly into the Storage Resource Broker (SRB) data federation system, where it is remotely quality-assured. Because of link reliability issues, we use a store-and-forward approach.
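The store-and-forward approach mentioned above can be sketched as follows: readings are buffered locally while the reef-to-shore link is down and flushed into the archive once it returns. This is an illustrative sketch only — the class name, reading fields, and the simple list standing in for SRB are all assumptions, not the project's actual implementation.

```python
from collections import deque

class StoreAndForward:
    """Buffer sensor readings locally; forward them when the link is up."""
    def __init__(self, upload):
        self._buffer = deque()
        self._upload = upload  # callable that pushes one reading to storage

    def record(self, reading, link_up: bool):
        # Always buffer first, so nothing is lost if the upload fails later.
        self._buffer.append(reading)
        if link_up:
            self.flush()

    def flush(self):
        # Drain the buffer in arrival order, preserving the data stream.
        while self._buffer:
            self._upload(self._buffer.popleft())

archive = []  # stand-in for the SRB data federation
node = StoreAndForward(archive.append)
node.record({"site": "Davies Reef", "temp_C": 26.1}, link_up=False)
node.record({"site": "Davies Reef", "temp_C": 26.3}, link_up=False)
node.record({"site": "Davies Reef", "temp_C": 26.2}, link_up=True)
print(len(archive))  # all buffered readings delivered once the link returns
```

The design choice matches the unreliable humidity-duct link: the sensor platform needs only enough local storage to ride out an outage, and the downstream archive sees a complete, ordered stream regardless of when the link was available.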

Session 4

Title: Monitoring and remote control of scientific instrumentation through the Grid
Authors: Claudio Vuerli, Giuliano Taffoni, Igor Coretti, Fabio Pasian and Paolo Santin
INAF - Osservatorio Astronomico di Trieste, Via Tiepolo 11, I-34131 Trieste, Italy.
Abstract. Grid infrastructures currently in use for production purposes are strongly computing-oriented, suitable for scientific communities whose applications require intensive computation on a relatively small amount of data. Middleware implementations underlying such infrastructures support the sharing and distribution of Grid-embedded computational resources well, but problems arise when such Grids are used to share data-oriented and service-oriented resources. The Grid middleware model does not allow the embedding of a meta-computing machine.
Some scientific communities are therefore strongly limited in using such Grid infrastructures for their applications; they have a wider perception of the Grid, and their applications require not only traditional computation but also access to complex data repositories and services, as well as mixed distributed computations. The astrophysical community certainly has this perception of the Grid.
This work concentrates on the interoperability aspects between the Grid and the scientific instrumentation. The new IE (Instrument Element) Grid Element has been designed, built and tested for this purpose.

The IE makes it possible to monitor and remotely control any scientific instrumentation. The first implementation of the IE focuses on monitoring: astronomers with access to a Grid infrastructure through a Grid-UI can interface with the observing facility where their observing runs are in progress and check the telemetric data, as well as the scientific data, during acquisition. Future releases of the IE will be extended to remote control, so that remote working sessions using remote astronomical instrumentation will also be possible.

This work is part of a wider project (including the Query Element) whose goal is to exploit Grid technology to build a homogeneous astronomical working environment where scientific data are acquired, checked, compared with data from other databases, processed and stored.

Title: Elettra Virtual Collaboratory: the evolution of a Virtual Laboratory Software from a simple web application to the GRIDCC
Authors: Roberto Pugliese, Alessandro Busato, Alessio Curri, Enrico Mariotti, Daniele Favretto, Valentina Chenda, Fulvio Billh, Michele Turcinovich, Roberto Borghes, Lawrence Iviani, Fabio Asnicar and Laura Del Cano.
Sincrotrone Trieste S.C.p.A. di interesse nazionale, Strada Statale 14 - km 163,5 in AREA Science Park, 34012 Basovizza, Trieste, Italy.
Abstract. The Elettra Virtual Collaboratory (EVC) is an example of a virtual laboratory, a system that allows a team of researchers distributed anywhere in the world to perform a complete experiment on the beamlines and experimental stations of ELETTRA. The creation and introduction of effective CSCW systems aims to bring the following main advantages: providing remote access to expensive and hard-to-duplicate equipment; increasing the effectiveness of experimental activity, since more experts can participate in experiments, give useful hints and solve problems; and facilitating multi-institutional consortium collaborations on large-scale projects.

Experience and know-how acquired during the development of the first release of EVC were exploited in the EU-funded FP6 projects in which ELETTRA is currently involved. In the BIOXHIT project, which will develop an integrated platform for high-throughput structure determination, ELETTRA is developing the Virtual Collaboratory System, a Virtual Organization (VO) connecting all the European laboratories doing research in the field of structural genomics.

In the EUROTeV project, the design study for the International Linear Collider, ELETTRA is developing the Multipurpose Virtual Laboratory, the core tool to implement the Global Accelerator Network, a VO connecting all the international laboratories doing research in the field of accelerators. Remote control of an accelerator facility has the potential to revolutionize the mode of operation and the degree of exploitation of large experimental physics facilities. In May 2005, the first prototype of the system allowed remote control of the ELETTRA storage ring from DESY.

The GRIDCC project (Grid Enabled Instrumentation with Distributed Control and Computation) has the goal of extending the Grid by introducing the handling of real-time constraints and interactive response into the existing Grid middleware. GRIDCC will introduce the concept of a Grid-enabled sensor, which is extremely important for industrial applications.

The paper describes the status of the Elettra Virtual Collaboratory as it has evolved under the pressure of the above-mentioned projects and presents development plans for the future.

The EVC software has been improved over the years, moving from a simple single-facility web application to a multi-facility integration platform based on web services. We are currently refactoring EVC in order to migrate to the GRIDCC MCE middleware. EVC can now be considered another testbed application of the GRIDCC project.

Most of this work is funded by the EC through the BIOXHIT, EUROTeV and particularly the GRIDCC project (IST-511382).

Title: Grid-enabling an existing instrument-based national service
Authors: Simon Coles, Jeremy Frey, Mike Hursthouse, Mark Light, Mike Surridge, Ken Meacham, Hugo Mills, Dave DeRoure and Ed Zaluska, School of Chemistry, University of Southampton, Southampton SO17 1BJ, United Kingdom.
Abstract. Recent work by the UK National Crystallography Service (NCS) has integrated the service environment into a Grid environment. The existing high-throughput crystallography facility is enhanced by on-line feedback and the ability to monitor and steer diffraction experiments remotely. Grid-based security mechanisms are used to determine authorisation attributes and hence to allow user interaction at appropriate stages, together with access to a database recording the status of the submitted samples. Users can see the position of their samples, be alerted at all stages from submission through experiment to analysis, visualise raw data as it is generated, be involved in the key decision-making during the parameterisation and initialisation of the experiment, and may then monitor the data collection to ensure its successful completion. Results data are staged to a secure area and made available for download (either as raw diffraction data or as a refined structure generated by NCS staff).
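The sample lifecycle the abstract describes — a tracked status from submission through experiment to analysis, with the remote user alerted at each transition — can be sketched as a simple state machine. The stage names and identifiers below are invented for illustration; they are not the NCS's actual vocabulary or database schema.

```python
# Illustrative stage names, not the NCS's actual terminology.
STAGES = ["submitted", "queued", "diffraction", "data-collection",
          "analysis", "results-available"]

class Sample:
    """Track one submitted sample through the service pipeline."""
    def __init__(self, sample_id, notify):
        self.sample_id = sample_id
        self.stage = STAGES[0]
        self._notify = notify  # callback alerting the remote user

    def advance(self):
        # Move to the next stage and fire a user alert, mirroring the
        # alert-at-every-stage behaviour described in the abstract.
        i = STAGES.index(self.stage)
        if i + 1 < len(STAGES):
            self.stage = STAGES[i + 1]
            self._notify(self.sample_id, self.stage)

alerts = []
sample = Sample("NCS-0421", lambda sid, stage: alerts.append((sid, stage)))
while sample.stage != "results-available":
    sample.advance()
print(alerts[-1])  # final alert: results staged for download
```

Keeping the stage list explicit makes the status database trivial to query ("which stage is my sample in?") and makes every transition an observable event that the alerting and authorisation layers can hook into.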



Dec. 5 - 8, 2005, Melbourne, Australia