Distributed Applications, Web Services, Tools and GRID Infrastructures for Bioinformatics

Home Page NETTAB 2006


Call for papers

The Call for papers is CLOSED


Bioinformatics has to cope with one of the largest bodies of information in any knowledge domain. This information grows exponentially in volume and is distributed over the network, with heterogeneous data structures, information systems and semantics. The biomedical research domain is also one of the fastest evolving research areas: new knowledge and hypotheses accumulate daily, creating new data and processing needs that are often peculiar to a single application domain. The development of general-purpose applications is therefore difficult and limited to basic tools.

Two main approaches to the problem of efficiently accessing such a huge amount of data have been proposed: gathering all information in a single data warehouse, or carrying out all processing remotely. Each approach has pros and cons. The main limitation of the data warehouse approach is the overhead of setting up the warehouse and keeping its data and analysis tools up to date, whereas the limitations of the network-based approach mainly concern dealing with heterogeneous formats and network bottlenecks. Remote access to this amount of information requires complex search and retrieval software. Data integration is concerned with the semantics of the information, in particular with how to link data and how to pipe together retrieval and analysis steps.

A workflow is "a computerized facilitation or automation of a business process, in whole or part". Its main goal is the implementation of data analysis processes in standardized environments; its main advantages are effectiveness, reproducibility, reusability and traceability. Implementation of workflow management software has already started.
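The piping of retrieval and analysis steps mentioned above can be sketched as a minimal workflow: each step is a function, and a small enactor runs the steps in order while recording intermediate results for traceability. The step names, accession and sequence below are hypothetical, for illustration only.

```python
# Minimal workflow sketch: hypothetical retrieval and analysis steps,
# chained by a tiny enactor that records each step for traceability.

def retrieve(accession):
    # Hypothetical retrieval step: a real workflow would query a
    # remote database here; we return a canned record instead.
    return {"id": accession, "sequence": "MKTAYIAKQR"}

def analyse(record):
    # Hypothetical analysis step: here, just the sequence length.
    return {"id": record["id"], "length": len(record["sequence"])}

def enact(steps, data):
    """Run the steps in order, keeping a trace of intermediate results."""
    trace = []
    for step in steps:
        data = step(data)
        trace.append((step.__name__, data))
    return data, trace

result, trace = enact([retrieve, analyse], "P12345")
```

Because the trace pairs every step name with its output, the same run can be inspected, reproduced or reused later, which is exactly what the reproducibility and traceability advantages refer to.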
Enactment of workflows over the network has been greatly simplified by the development of XML-related standards and by the adoption of more powerful RPC mechanisms, such as those provided by Web Services. Moreover, high-performance network infrastructures such as the GRID now offer an improved environment in which remote applications can be carried out effectively. In summary, the advantages of remote data analysis over the warehouse approach are clear, and both the available technologies and tools and their diffusion in the biomedical domain are improving quickly. This workshop will focus on the technologies, tools and applications that can now be applied in bioinformatics.
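To illustrate the XML-related standards underlying such RPC mechanisms, the sketch below builds a SOAP 1.1 request envelope using only Python's standard library. The service namespace, operation and parameter names are invented for the example, and no request is actually sent over the network.

```python
import xml.etree.ElementTree as ET

# Standard SOAP 1.1 envelope namespace.
SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
# Hypothetical service namespace and operation, for illustration only.
SVC_NS = "http://example.org/blast-service"

# Build Envelope > Body > runBlast > sequence, as a SOAP client would
# before POSTing the document to the service endpoint.
envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
op = ET.SubElement(body, f"{{{SVC_NS}}}runBlast")
seq = ET.SubElement(op, f"{{{SVC_NS}}}sequence")
seq.text = "MKTAYIAKQR"

request = ET.tostring(envelope, encoding="unicode")
```

In practice a Web Services toolkit generates such envelopes from a WSDL description, but the wire format remains plain, self-describing XML, which is what makes interoperable enactment across heterogeneous systems feasible.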


  • Technologies and technological platforms:
    • Standards and protocols for Web Services
    • Choreography and Orchestration of Web Services
    • Comparison of available technologies, limitations, pros and cons
    • Knowledge representation and knowledge modeling tools for biological data
    • Ontologies, databases and applications of semantics in bioinformatics
    • Semantic Web technologies
  • New tools for bioinformatics:
    • Web Services and related tools
    • Workflow management systems and enactment portals
    • Technologies for GRID infrastructures
    • Semantic Web tools
  • Applications in bioinformatics:
    • Case studies, scenarios and use cases
    • Remote applications for life sciences analysis
    • Workflows for actual data analysis in life sciences
    • Data-intensive applications on GRID solutions
    • Remote biological data mining
Download the "Call for papers" brochure here.