Yu Fang

Peking University, Beijing, China

Publications (55), 6.05 total impact

  • ABSTRACT: Three-dimensional (3D) virtual environments and immersive simulation can be used to improve students' understanding of complex material in the science curriculum. Over the last decade, the virtual field trip (VFT) has become popular with both university students and teachers as a means of learning and teaching. This paper presents the 3D construction and visualization of complex geologic environments based on the Virtual Geographic Environment (VGE), which enables us to improve geologic field practice and make field-practice education more efficient and effective. We introduce the technology of VGE and present a VGE-based project, the Huyu Virtual Geologic Field Practice System. The approach provides a flexible interaction platform and a distributed, collaborative learning system for students' learning and virtual practice. To some extent, it can serve as a replacement for traditional geologic field practice, which is of great educational importance for innovative forms of virtual field trips.
    Geoinformatics (GEOINFORMATICS), 2013 21st International Conference on; 01/2013
  • ABSTRACT: This paper addresses the application of the Internet of Things (IoT) and WebGIS in precision agriculture. After analyzing the current development of precision agriculture in China and weighing its advantages and shortcomings, we take an ecological farm as an example to build a new precision agriculture management system (PAMS) based on these two techniques. PAMS comprises four parts: the spatial information infrastructure platform, the IoT infrastructure platform, the agriculture management platform and the mobile client. Users can monitor and manage agricultural production through PAMS. Moreover, modular integration and open-source software help reduce development cost and improve system efficiency.
    Geoinformatics (GEOINFORMATICS), 2013 21st International Conference on; 01/2013
  • ABSTRACT: With the rapid development of remote sensing and computer technologies, the volume of remote sensing image data obtained by satellite is increasing dramatically [1]. It already exceeds one TB per day and will clearly grow further. Managing it efficiently is a problem because traditional approaches are expensive and difficult to scale, so a scalable, parallel processing model is needed; HBase and MapReduce meet this need naturally. In this paper, we propose a method to store massive image data in HBase and process it using MapReduce. Experimental results show that the speeds of data import and data processing increase markedly as the HBase cluster grows.
    Geoinformatics (GEOINFORMATICS), 2013 21st International Conference on; 01/2013
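    The storage idea behind the abstract above can be sketched in a few lines. The tile size and row-key layout below are assumptions for illustration only, not the paper's actual design; the point is that zero-padded tile coordinates in the row key keep lexicographic order equal to scan order, so a MapReduce job can read one image as a contiguous HBase key range.

    ```python
    # Sketch: split a large image into fixed-size tiles, one HBase row per
    # tile, with the row key encoding (image id, tile row, tile column).
    # Tile size and key format are invented for illustration.

    TILE = 256  # tile edge length in pixels (assumed)

    def tile_row_keys(image_id, width, height, tile=TILE):
        """Return one row key per tile, in row-major scan order."""
        keys = []
        for ty in range((height + tile - 1) // tile):
            for tx in range((width + tile - 1) // tile):
                # Zero padding makes lexicographic order match scan order.
                keys.append(f"{image_id}:{ty:05d}:{tx:05d}")
        return keys

    # A 1000x500 image with 256px tiles yields 4 x 2 = 8 tile rows:
    keys = tile_row_keys("scene42", width=1000, height=500)
    ```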
  • ABSTRACT: In wireless sensor networks, location information is essential for monitoring activities: accessing the locations of events and determining the locations of mobile nodes are basic functions of such networks. Besides the sensed data themselves, sensor nodes should also provide their position information, so a reliable localization algorithm is necessary. GPS (Global Positioning System) technology is a good way to fix position in many fields, and high precision and performance can be obtained outdoors; however, its high energy consumption and device volume make it unsuitable for low-cost, self-organizing sensor networks. Some researchers have applied the Monte Carlo Localization (MCL) algorithm to mobile node localization and shown that good localization results can be obtained. However, current MCL-based approaches need a large number of samples to achieve good precision, while a node's energy is limited and cannot last long. In this paper, we propose a new method that applies a genetic algorithm to improve MCL for localization in mobile sensor networks (MSNs). Experimental results show that our method outperforms the standard Monte Carlo localization algorithm.
    Geoinformatics (GEOINFORMATICS), 2012 20th International Conference on; 01/2012
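    The combination of Monte Carlo Localization with a genetic algorithm described above can be sketched in miniature. Everything below, including the 1-D setting, the single anchor, the elite fraction and the mutation noise, is invented for illustration and is not the paper's algorithm: particles are scored against a noisy range measurement, the fittest quarter is kept, and new particles are bred by crossover (averaging) plus Gaussian mutation.

    ```python
    import random

    # Toy 1-D MCL with a genetic-style refinement step (illustrative only).
    def mcl_ga(measured_dist, anchor=0.0, n=200, generations=15, seed=7):
        rng = random.Random(seed)
        particles = [rng.uniform(0.0, 100.0) for _ in range(n)]
        for _ in range(generations):
            # Fitness: how well the particle explains the range measurement.
            particles.sort(key=lambda p: abs(abs(p - anchor) - measured_dist))
            elite = particles[: n // 4]
            children = []
            while len(elite) + len(children) < n:
                a, b = rng.choice(elite), rng.choice(elite)
                # Crossover (average of two elite particles) + mutation.
                children.append((a + b) / 2.0 + rng.gauss(0.0, 1.0))
            particles = elite + children
        return sum(particles) / len(particles)

    # True node position: 30 m from the anchor.
    est = mcl_ga(measured_dist=30.0)
    ```

    The GA step replaces plain importance resampling, which is what lets a much smaller particle set concentrate quickly around positions consistent with the measurement.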
  • ABSTRACT: The Cloud is considered an important platform for the execution of scientific applications such as scientific workflows. Complex spatial analysis and decision applications demand effective planning and management. Geospatial workflow provides a scientific solution to managing such geospatial processes; it can define them precisely to solve particular spatially referenced issues in GIS applications. In this paper we explore the use of cloud computing for geospatial workflow applications, focusing on a well-known geospatial application: weights-of-evidence metallogenic prediction. We propose an architecture for geospatial workflow applications in the Cloud as well as across Clouds, and elaborate each component of the framework, including how the application can be deployed in a cloud environment, the required structural changes, and useful tools and services. Finally, we discuss the challenges of deploying and executing geospatial workflows in the cloud, and whether widely used scientific workflow management systems can support Sky computing by executing a single geospatial workflow across multiple Cloud infrastructures at the same time.
    Geoinformatics (GEOINFORMATICS), 2012 20th International Conference on; 01/2012
  • ABSTRACT: Fire is among the most frequent of disasters, and its highly destructive action and the considerable number of victims it can claim give it the full character of a disaster. Fire rescue is therefore essential in emergency response, and disaster planning and response require ever more scientific elaboration and technological support. This paper builds applications for fire disaster simulation and virtual fire training on our Collaborative Virtual Geographic Environment (CVGE) platform, CySim, which is developed from the open-source OpenSimulator server and the Second Life client. The approach links fire simulation and human behaviour rehearsal to the virtual environment, providing a flexible interaction platform and a distributed, collaborative learning network for disaster knowledge learning and virtual training. Trainees can repeat scenarios until they get them right, so the system can serve as a replacement for live fire-disaster simulation training.
    Geoinformatics (GEOINFORMATICS), 2012 20th International Conference on; 01/2012
  • ABSTRACT: Over recent years, massive geospatial information has been produced at a prodigious rate, and is usually geographically distributed across the Internet. Grid computing, as a recent development in the landscape of distributed computing, is deemed a good solution for distributed geospatial data management and manipulation. Grid computing technology can thus be applied to integrate various distributed resources into a ‘super-computer’ that enables efficient distributed geospatial query processing. To realize this vision, an effective mechanism for building the distributed geospatial query workflow in the Grid environment needs to be carefully designed. The workflow-building technology aims to automatically transform a global geospatial query into an equivalent distributed query process in the Grid. Toward this goal, detailed steps and algorithms for building the distributed geospatial query workflow in the Grid environment are discussed in this article. Moreover, we develop corresponding software tools that enable Grid-based geospatial queries to be run against multiple data resources. Experimental results demonstrate that the proposed methodology is feasible and correct.
    International Journal of Geographical Information Science 07/2011; 25:1117-1145. · 1.61 Impact Factor
  • ABSTRACT: The large gap between supply and demand for mineral resources creates an urgent need for better mineral exploration technology and methods, in which metallogenic prediction plays a leading role. Given the software platforms currently in use and the characteristics of specific metallogenic prediction methods, the efficiency and quality of metallogenic prediction are not yet satisfactory. This article describes a technical framework based on open-source spatial information software and how it is used to develop a metallogenic prediction system. The architecture of the framework is clearly stated, based on a study of the principles of GIS-based metallogenic prediction. An application to copper mineral prediction in the Yangtze River metallogenic belt is discussed, executed with both the proposed framework and one of the existing GIS software platforms. The results show that our framework significantly improves the efficiency of metallogenic prediction while keeping the prediction results accurate, providing a free and convenient way to perform metallogenic prediction.
    01/2011;
  • ABSTRACT: Nearest neighbor analysis is one of the classic methods for detecting the spatial tendency of an observed point dataset. With the explosion of spatial data, the conventional implementation of nearest neighbor analysis cannot deliver high performance on large datasets. In this paper, a parallel implementation of nearest neighbor analysis is therefore proposed, parallelizing the computation of each point's nearest-neighbor distance. Compared with the CPU, the GPU now provides more powerful floating-point throughput and more multiprocessors for parallel processing, so we develop the parallel nearest neighbor analysis program with CUDA (Compute Unified Device Architecture) in the spirit of GPGPU (General-Purpose computing on Graphics Processing Units). In our experiments, when the number of points is large, the parallel implementation achieves a speedup of more than 10 over the conventional CPU implementation.
    01/2011;
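    The statistic being parallelised above is the classic nearest-neighbour index of Clark and Evans: the mean observed nearest-neighbour distance divided by the distance expected under complete spatial randomness, 0.5 / sqrt(n / A). A brute-force CPU reference makes clear why the computation maps naturally onto one GPU thread per point: each point's nearest-neighbour search is independent. This is a sketch, not the paper's implementation.

    ```python
    import math

    def nn_index(points, area):
        """Nearest-neighbour index: <1 clustered, ~1 random, >1 dispersed."""
        n = len(points)
        total = 0.0
        for i, (x1, y1) in enumerate(points):
            # Each point's NN distance is independent -> one GPU thread each.
            total += min(
                math.hypot(x1 - x2, y1 - y2)
                for j, (x2, y2) in enumerate(points) if j != i
            )
        observed = total / n
        expected = 0.5 / math.sqrt(n / area)
        return observed / expected

    # A perfectly regular 10x10 unit grid is strongly dispersed:
    r = nn_index([(x, y) for x in range(10) for y in range(10)], area=100.0)
    # → 2.0 (observed NN distance 1.0, expected 0.5)
    ```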
  • ABSTRACT: This paper reviews virtual environment and GIS technologies available and suitable for virtual city development and urban planning, focusing on the 3D virtual city as a form for representing realistic urban environments. Several fields are discussed in terms of their applicability to Web-based applications relating to the urban environment: interactive mapping and collaborative modelling, solid and geometric modelling, photospatial panoramic views, and multi-user virtual worlds. An approach integrating virtual environments and GIS to develop immersive 3D virtual cities is proposed, and virtual planning applications are presented. The resulting functionality supports practising, simulating, visualizing and conceptualizing issues relating to medium- to large-scale urban environments.
    2011 IEEE International Geoscience and Remote Sensing Symposium, IGARSS 2011, Vancouver, BC, Canada, July 24-29, 2011; 01/2011
  • ABSTRACT: Since the introduction of CUDA (Compute Unified Device Architecture), GPUs (Graphics Processing Units) have rapidly been adopted in many fields. Some researchers have applied GPU computing to remote sensing image processing and reported speedups of up to one hundred times. However, current GPU-based approaches need to load all the image data at once before processing, while computer and GPU memory are limited and not large enough for remote sensing images, which are typically massive. Hence, existing GPU-based image processing approaches cannot be applied directly to remote sensing. This paper therefore proposes a dual-parallel processing mechanism for massive remote sensing image data, based on GPU and POSIX thread technologies. Experimental results show that our methodology can not only handle massive remote sensing image data but also greatly improve processing efficiency.
    01/2011;
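    The dual-parallel mechanism above can be illustrated as chunk streaming plus a worker pool: the image never has to fit in memory at once, while several workers process chunks concurrently. In the paper the per-chunk work runs on the GPU via CUDA under POSIX threads; below, plain Python threads and a toy per-pixel function stand in for both, and the chunk size and kernel are assumptions.

    ```python
    from concurrent.futures import ThreadPoolExecutor

    def read_chunks(n_pixels, chunk=4):
        """Simulate streaming a huge 1-D image band from disk, chunk by chunk."""
        for start in range(0, n_pixels, chunk):
            yield list(range(start, min(start + chunk, n_pixels)))

    def kernel(chunk):
        """Stand-in for a GPU kernel, e.g. radiometric scaling of each pixel."""
        return [2 * p for p in chunk]

    def process_image(n_pixels, workers=4):
        out = []
        with ThreadPoolExecutor(max_workers=workers) as pool:
            # pool.map preserves chunk order even when workers finish out of order.
            for result in pool.map(kernel, read_chunks(n_pixels)):
                out.extend(result)
        return out

    pixels = process_image(10)
    ```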
  • ABSTRACT: High-performance parallel computation is nowadays deemed a good solution to the complicated processing of massive spatial data, and partitioning the data is an important precondition for making the most of this technology. In this paper, we discuss general strategies for spatial data partitioning and summarize its principles: good spatial proximity, balanced data load, small data redundancy and short partitioning time. Analyzing current partitioning algorithms, we find problems such as uneven space division and unbalanced load: if some nodes hold far more data or tasks than others, the longest-running node determines the total processing time, which then exceeds the average level, so a balanced data or task load must be established across computing nodes at partition time. To solve these problems, we present a new partitioning algorithm based on statistical clustering, which achieves better spatial proximity and load balance than traditional algorithms. Because task partitioning research is still at a primary stage, we focus on finding a better data partitioning method.
    01/2011;
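    The partitioning idea in the abstract above, grouping nearby features so each compute node receives a spatially compact and roughly balanced share, can be sketched with a minimal k-means. The clustering method, deterministic seeding and toy data are stand-ins; the paper's actual statistical clustering may differ.

    ```python
    import math

    def kmeans_partition(points, k, iters=20):
        """Assign each point to the nearest of k centers; return the k groups."""
        centers = points[:k]  # deterministic seeding: first k points (assumed)
        for _ in range(iters):
            groups = [[] for _ in range(k)]
            for p in points:
                i = min(range(k), key=lambda c: math.dist(p, centers[c]))
                groups[i].append(p)
            # Recompute each center as the mean of its group.
            centers = [
                (sum(x for x, _ in g) / len(g), sum(y for _, y in g) / len(g))
                if g else centers[i]
                for i, g in enumerate(groups)
            ]
        return groups

    # Two well-separated clumps of features should map to two partitions:
    pts = [(0, 0), (1, 0), (0, 1), (10, 10), (11, 10), (10, 11)]
    parts = kmeans_partition(pts, k=2)
    ```

    Each group would then be shipped to one node, which is how clustering serves both the spatial-proximity and load-balance principles the abstract lists.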
  • ABSTRACT: DGIP (Distributed Geographic Information Processing) has recently become a new trend in GIS (Geographic Information Systems). DGIP focuses on how to organize and process geographic resources in a distributed computing environment, and existing research is mainly carried out from a global point of view. It is noticeable, however, that each computing node carries a heavier load as data volumes grow. This paper therefore concentrates on making each computing node fulfill its subtask more quickly, achieving efficient local acceleration. We design a prototype for distributed remote sensing image processing and achieve local acceleration in each computing node with CUDA (Compute Unified Device Architecture). The paper first introduces the distributed procedure of the prototype and gives an overview of the CUDA architecture and programming model. It then takes the mean filter as an example, designing and implementing a parallel CUDA program to accelerate remote sensing image processing in each node. To evaluate the local acceleration, we carry out comparative tests between the parallel CUDA implementation and the conventional implementation. The results demonstrate that the local acceleration with CUDA runs more than 20 times faster than the conventional process.
    Geoinformatics, 2010 18th International Conference on; 07/2010
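    The mean filter used as the example workload above is a natural fit for CUDA because every output pixel depends only on the input image, so one GPU thread can compute one pixel with no synchronisation. Below is a pure-Python reference version of a 3x3 mean filter, not the paper's CUDA code; border handling (averaging only the neighbours that exist) is an assumption.

    ```python
    def mean_filter(img):
        """3x3 mean filter; each output pixel is independent of the others."""
        h, w = len(img), len(img[0])
        out = [[0.0] * w for _ in range(h)]
        for y in range(h):
            for x in range(w):
                # Gather the valid 3x3 neighbourhood, clipped at the borders.
                vals = [
                    img[yy][xx]
                    for yy in range(max(0, y - 1), min(h, y + 2))
                    for xx in range(max(0, x - 1), min(w, x + 2))
                ]
                out[y][x] = sum(vals) / len(vals)
        return out

    # A constant image is unchanged by mean filtering:
    flat = mean_filter([[5, 5, 5], [5, 5, 5], [5, 5, 5]])
    ```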
  • ABSTRACT: Most GIS (Geographic Information System) applications tend to have heterogeneous and autonomous geospatial information resources, and the availability of these local resources is unpredictable and dynamic in a distributed computing environment. To make use of these local resources together to solve larger, globally scoped geospatial information processing problems, this paper proposes, with the support of peer-to-peer computing technologies, a geospatial data distributed computing mechanism that involves loosely coupled geospatial resource directories and a construct termed the Equivalent Distributed Program of global geospatial queries. First, a geospatial query process schema for distributed computing is presented, together with a method for equivalent transformation from a global geospatial query to distributed local queries at the SQL (Structured Query Language) level, to solve the coordination problem among heterogeneous resources. Second, peer-to-peer technologies are used to maintain a loosely coupled network of autonomous geospatial information resources, achieving decentralized and consistent synchronization among global geospatial resource directories and carrying out distributed transaction management of local queries. Finally, based on the developed prototype system, example applications of simple and complex geospatial distributed queries illustrate the procedure of global geospatial information processing.
    Computers & Geosciences 07/2010; · 1.83 Impact Factor
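    The "equivalent transformation" described above, rewriting a global query into identical local SQL queries executed at each autonomous source and merging the partial results, can be sketched with in-memory SQLite databases standing in for distributed, heterogeneous GIS nodes. The schema, table and predicate below are invented for illustration.

    ```python
    import sqlite3

    def make_node(rows):
        """One in-memory database standing in for an autonomous GIS node."""
        db = sqlite3.connect(":memory:")
        db.execute("CREATE TABLE cities (name TEXT, pop INTEGER)")
        db.executemany("INSERT INTO cities VALUES (?, ?)", rows)
        return db

    nodes = [
        make_node([("Beijing", 21), ("Tianjin", 13)]),
        make_node([("Shanghai", 24), ("Suzhou", 10)]),
    ]

    def global_query(nodes, min_pop):
        # The same local query is shipped to every node (the 'equivalent
        # distributed program'), then the partial answers are unioned.
        local_sql = "SELECT name FROM cities WHERE pop >= ?"
        results = []
        for db in nodes:
            results += [r[0] for r in db.execute(local_sql, (min_pop,))]
        return sorted(results)

    big = global_query(nodes, min_pop=13)
    ```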
  • ABSTRACT: The Barn project, a spatial data grid prototype currently under development, provides a grid environment for applications of nationwide geological survey data. Within Barn, in order to reduce data access latency, avoid single points of failure, and increase the performance and load balance of the system, we have designed and implemented a set of replica management services. These services, based on WSRF, offer users a convenient way to partition a large logical layer into multiple fragments, and enable replication and management of the tiled spatial data, maintenance of replica consistency and catalog consistency, and replica synchronization. This paper describes the framework and design of the replica management services and evaluates their performance in the context of the Barn project.
    Computational Intelligence and Software Engineering, 2009. CiSE 2009. International Conference on; 01/2010
  • Zhou Huang, Yu Fang, Mao Pan
    ABSTRACT: Grid computing is deemed a good solution for the digital earth infrastructure: various geographically dispersed geospatial resources can be connected and merged into a ‘supercomputer’ using grid-computing technology. Geosensor networks, on the other hand, offer a new perspective for collecting physical data dynamically and modeling a real-time virtual world. Integrating geosensor networks and grid computing in a geosensor grid can be compared to equipping the geospatial information grid with ‘eyes’ and ‘ears’: real-time information from the physical world can be processed, correlated, and modeled to enable complex and advanced geospatial analyses on a geosensor grid with high-performance computation capability. Several issues and challenges need to be overcome before the geosensor grid becomes a reality. In this paper, we propose an integrated framework, comprising the geosensor network layer, the grid layer and the application layer, to address these design issues. Key technologies of the geosensor grid framework are discussed, and a geosensor grid testbed is set up to illustrate the proposed framework and improve our geosensor grid design.
    Int. J. Digital Earth. 01/2010; 3:207-216.
  • ABSTRACT: To develop an effective technical methodology for environmental pollution data management and modeling, this paper proposes an integrated framework for pollution modeling based on a GIS database. Three layers make up the complete framework: the pollution data management layer, the modeling and analysis layer, and the application layer. Design issues of the framework are discussed, and a detailed example illustrates how the framework works. The framework proves feasible and can benefit future GIS applications in environmental pollution data management, modeling and analysis.
    01/2010;
  • ABSTRACT: Geospatial data is the core of spatial information systems, and how to realize the sharing and integration of distributed geospatial data has long been a research focus. OGC has developed a number of web service specifications that enable the interoperation of heterogeneous geospatial data sources, while the Grid is the technology enabling resource sharing and coordinated problem solving in dynamic, distributed environments. In this paper, we study the integration of distributed geospatial data based on OGC standards-compliant geospatial grid services, combining Grid and OGC technologies. We first introduce the OGC Web Services standards and grid technology, then design the framework of an integration system for distributed geospatial data based on such services, and finally investigate the main technologies involved in the integration.
    The 18th International Conference on Geoinformatics: GIScience in Change, Geoinformatics 2010, Peking University, Beijing, China, June, 18-20, 2010; 01/2010
  • Zhou Huang, Yu Fang
    ABSTRACT: GIS software is entering the era of Grid GIS, which, as the representative of next-generation GIS, has become a frontline, hot issue in both the academic community and the industrial sector. Implementing Grid GIS, however, faces many challenges, among which Grid-based geospatial computational task processing is the key issue. This paper proposes a new conceptual framework for Grid-based geospatial computational task processing: a mechanism that efficiently processes Grid-based geospatial computational tasks submitted by users and obtains reliable results, so as to improve geospatial information sharing and cooperative computation capability. Design issues for the proposed framework are discussed, followed by concluding remarks.
    IEEE International Geoscience & Remote Sensing Symposium, IGARSS 2010, July 25-30, 2010, Honolulu, Hawaii, USA, Proceedings; 01/2010
  • ABSTRACT: Energy shortages bring the challenge of "high efficiency, high quality" to the evaluation of mineral resources. Owing to practical constraints, general-purpose software for mineral resource evaluation is mainly used in stand-alone environments, while geological spatial data are often cross-regional, cross-platform and stored in a distributed manner. With the rapid development of grid computing, various distributed resources can be interconnected and merged into a “super-computer”. To realize rapid evaluation of mineral resources in a network environment, we apply grid computing to mineral resource evaluation, building a mineral resources evaluation platform in a grid computing environment. Based on a theoretical study of the basic principles of weight-of-evidence analysis and the features of grid computing, this paper proposes the architecture and method for weight-of-evidence evaluation of mineral resources in a grid computing environment. An application to the evaluation of copper mineral resources in the Yangtze River metallogenic belt is discussed: a small-scale grid computing environment is set up, and distributed spatial analysis is executed on the platform following the weight-of-evidence evaluation process to obtain evaluation results. The results prove that our method can not only significantly shorten calculation time but also improve evaluation quality.
    The 18th International Conference on Geoinformatics: GIScience in Change, Geoinformatics 2010, Peking University, Beijing, China, June, 18-20, 2010; 01/2010