Monday, 12 December 2011

ScienceCloud 2012 3rd Workshop on Scientific Cloud Computing

Computational and data-driven sciences have become the third and fourth pillars of empirical science, alongside experimentation and theory. Scientific computing has already begun to change how science is done, enabling breakthroughs through new kinds of experiments that would have been impossible only a decade ago. Today's science generates datasets that grow exponentially in both complexity and volume, making their analysis, archival, and sharing one of the grand challenges of the 21st century. Support for data-intensive computing is critical to advancing modern science, as the gap between storage capacity and storage bandwidth has widened more than 10-fold over the last decade, and there is a growing need for advanced techniques to manipulate, visualize, and interpret large datasets. Scientific computing is key to solving "grand challenges" in many domains and to delivering breakthroughs in new knowledge, and it comes in many shapes and forms: high-performance computing (HPC), which focuses on compute-intensive applications; high-throughput computing (HTC), which uses many computing resources over long periods of time to accomplish its computational tasks; many-task computing (MTC), which aims to bridge the gap between HPC and HTC by using many resources over short periods of time; and data-intensive computing, which focuses on data distribution and on harnessing data locality by scheduling computations close to the data.

The 3rd Workshop on Scientific Cloud Computing (ScienceCloud) will provide the scientific community a dedicated forum for discussing new research, development, and deployment efforts in running these kinds of scientific computing workloads on cloud computing infrastructures. The workshop will focus on the use of cloud-based technologies to meet new compute-intensive and data-intensive scientific challenges that are not well served by current supercomputers, grids, and HPC clusters. It will aim to address questions such as: What architectural changes to current cloud frameworks (hardware, operating systems, networking, and/or programming models) are needed to support science? Dynamic information derived from remote instruments and coupled simulations, and sensor ensembles that stream data for real-time analysis, are important emerging techniques in scientific and cyber-physical engineering systems; how can cloud technologies enable and adapt to these new, dynamic scientific approaches? How are scientists using clouds? Are there scientific HPC/HTC/MTC workloads that are suitable candidates to exploit emerging cloud computing resources with high efficiency? Commercial public clouds give scientists easy access to cloud infrastructure: what are the gaps in commercial cloud offerings, and how can they be adapted for running existing and novel eScience applications? What benefits does the cloud model offer over clusters, grids, or supercomputers? What factors limit cloud use, or would make clouds more usable and efficient?

This workshop encourages interaction and cross-pollination among those developing applications, algorithms, software, hardware, and networking for scientific computing on cloud platforms. We believe the workshop will be an excellent venue to help the community assess the current state of the field, determine future goals, and define architectures and services for future science clouds.

Topics of Interest

We invite the submission of original work related to the topics below. Papers may be either short (5-page) position papers or long (10-page) research papers. Topics of interest include (in the context of cloud computing):

  • Scientific application case studies on cloud infrastructure
  • Performance evaluation of cloud environments and technologies
  • Fault tolerance and reliability in cloud systems
  • Data-intensive workloads and tools on clouds
  • Use of programming models such as MapReduce and its implementations
  • Storage cloud architectures
  • I/O and Data management in the cloud
  • Workflow and resource management in the cloud
  • Use of cloud technologies (e.g., NoSQL databases) for scientific applications
  • Data streaming and dynamic applications on clouds
  • Application of cloud concepts in HPC environments
  • High performance parallel file systems and interconnects in virtual environments
  • Research and best practices in Cloud security
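To illustrate the MapReduce programming model named in the topics above, here is a minimal, self-contained word-count sketch in Python. Plain functions stand in for a distributed runtime such as Hadoop; the function names (`map_phase`, `shuffle`, `reduce_phase`) are illustrative, not from any specific framework:

```python
from collections import defaultdict

def map_phase(documents):
    # Map step: emit a (word, 1) pair for every word in every input document.
    for doc in documents:
        for word in doc.split():
            yield word.lower(), 1

def shuffle(pairs):
    # Shuffle step: group intermediate values by key, as a MapReduce
    # framework would do between the map and reduce phases.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # Reduce step: sum the emitted counts for each word.
    return {word: sum(counts) for word, counts in grouped.items()}

docs = ["cloud computing for science", "science in the cloud"]
counts = reduce_phase(shuffle(map_phase(docs)))
# counts["cloud"] == 2, counts["science"] == 2, counts["computing"] == 1
```

In a real deployment the map and reduce functions run in parallel across many nodes, and the shuffle is handled by the framework; the structure of the user code, however, remains this simple.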

For more details, please contact Yogesh Simmhan (simmhan@usc.edu).

Download Flyer (PDF)


ScienceCloud is one of the workshops held in conjunction with HPDC'12:
Short Name          Full Workshop Name

Application Domain
  Astro-HPC 2012      Workshop on High-Performance Computing for Astronomy
  ECMLS 2012          Emerging Computational Methods for the Life Sciences Workshop
  ScienceCloud 2012   Workshop on Scientific Cloud Computing
  SocMP 2012          Workshop on Social Media Processing

Data-Intensive Processing
  DIDC 2012           Workshop on Data-Intensive Distributed Computing
  ISDP 2012           In-Situ Data Processing Technologies

General
  LSAP 2012           Workshop on Large-scale Systems and Applications Performance

Infrastructure
  MapReduce'12        Workshop on MapReduce and its Applications
  VTDC-2012           Workshop on Virtualization Technologies in Distributed Computing
