Ebook: Information Technology in Geo-Engineering
Information technology continues to evolve and remains central to all aspects of geo-engineering. Key issues are the effective use and re-use of data, particularly within Building Information Modelling (BIM) frameworks; the use of smart monitoring; and artificial intelligence and data processing techniques. All of these contribute to improvements in design processes, greater construction efficiency and more cost-effective maintenance.
This book presents the proceedings of the 2nd International Conference on Information Technology in Geo-Engineering (ICITG 2014), held in Durham, United Kingdom, in July 2014. The topics of the conference covered the full range of information technology applications in geotechnical and geo-environmental engineering, as well as engineering geology. The focus of the papers in this book is on geotechnical data, specifically dealing with issues related to data standards and data exchange. The wider issues of managing data and data sharing through global web portals are also addressed. Also included are papers on artificial intelligence applications, and the use of expert (knowledge-based) systems, artificial neural networks and data mining techniques, particularly as applied to the identification of properties of geo-materials. The use of web-based materials for education, data processing techniques, and the numerical modelling of tunnels, piles and anchors are also discussed.
This book will be of interest to the geo-engineering community and is the second in a series of proceedings designed to keep practitioners and researchers abreast of the developments in information technology which relate to their work.
The Second International Conference on Information Technology in Geo-Engineering (ICITG 2014) was held in Durham, UK, in July 2014. It follows the first conference, held in Shanghai in 2010.
Toll, D.G., Zhu, H. & Li, X. (eds.) (2010) Information Technology in Geo-Engineering. 1st International Conference on Information Technology in Geo-Engineering, Shanghai. Amsterdam: IOS Press (Millpress), ISBN: 9781607506164, 745 pp.
The conference covered the full range of information technology applications in geotechnical and geo-environmental engineering, as well as engineering geology. The first day of the conference focused on “Geotechnical Data”, dealing with issues related to data standards and data exchange. There was a particular focus on the AGS format for data exchange, which is widely used in the UK and increasingly internationally. Wider issues of managing “big data” and data sharing through global web portals were also addressed. Consideration was given to new developments in data acquisition and monitoring for slope stability and tunnelling applications in the field (using Gigapan photography, LiDAR terrestrial scanning and wireless sensors). Data acquisition and control systems for use in the laboratory were also discussed.
The second day focussed on Artificial Intelligence applications. The use of expert (knowledge-based) systems, artificial neural networks and data mining techniques all featured, particularly for the identification of properties of geo-materials. The use of web-based materials for education was discussed, targeting both university students and public bodies. Data processing techniques using Fast Fourier Transforms and Response Surface Analysis were also considered. Numerical modelling of tunnels, piles and anchors then completed the extensive discussions covered within the two days.
The editors hope you find the contents of this volume useful in your work and professional practice, as information technology continues to play a central role within all aspects of our work. This second conference has now established an international conference series that we hope will continue into the future, keeping the geo-engineering community abreast of the incredible power of information technology and how it can be applied.
Data exchange, sometimes referred to as data interchange, is critical to making full use of the complex but valuable data from geotechnical and geoenvironmental engineering projects. Data exchange is not only about preserving data between source and target databases but also about building up meaningful data sets during projects. The paper identifies four major approaches for exchanging geotechnical and geoenvironmental data, namely the relational, XML, GIS/GML, and metadata approaches. It describes at a high level their main characteristics and differences and weighs their respective advantages and disadvantages.
Over the past twenty-five years there has been an almost exponential rise in the quantities of data generated during construction projects. The base data behind this huge volume of traffic is surprisingly small but is generally handled in inefficient ways, leading to duplication and double handling. Many organisations have attempted to improve the management of information and communication, but with mixed results. The paper reviews the history of construction data management for geotechnical jobs in Asia, including Australia, comments on successes and failures and considers the driving forces behind these. The paper concludes by offering a road map for the future management of information which avoids the potholes and focuses on the delivery of improved risk management within complex contractual and technical project environments.
Intelligent rock mechanics (IRM) has become an important aspect of rock engineering design, as one of the most popular non-1:1 mapping techniques [1]. Since its advent in 1993, IRM has aimed to provide an alternative research scheme for complicated rock mechanics problems. With successful applications in rock engineering, IRM has over the past 20 years progressed in new directions, both in rock mechanics modelling and in rock engineering design. This work reviews these 20 years of development of IRM and covers some of the most important intelligent modelling methods and integrated intelligent design concepts that have been developed, with a focus on the evolution of integrated intelligent design concepts as applied to various rock engineering projects in China.
This paper addresses one of the main issues related to Jet Grouting (JG) technology: the design of the mechanical properties of the soil-cement mixture. One of the most powerful Data Mining (DM) algorithms, the Support Vector Machine (SVM), is applied towards the development of a new and more accurate approach for Uniaxial Compressive Strength (UCS) and stiffness prediction of both Jet Grouting Laboratory Formulations (JGLF) and soilcrete mixtures. The results obtained show that the SVM algorithm can be used to accurately predict both the strength and the stiffness of JGLF. For soilcrete mixtures, it is shown that the SVM algorithm, despite some difficulties encountered, can make an important contribution to a better understanding of JG technology. Based on a detailed Sensitivity Analysis (SA), some important observations were made which should contribute to improving the technical and economic efficiency of JG.
The AGS format is a digital data interchange format for the geotechnical community, consisting of a data dictionary, rules and the file format itself. The Association of Geotechnical and Geoenvironmental Specialists (AGS) in the UK set up a Working Party in 1991 to reduce the proliferation of data formats and establish a format to transfer data between systems. The AGS Format was first published in 1991 and has been updated and extended in response to industry requirements ever since. AGS 3.1, published in December 2004, added monitoring and instrumentation data. The latest version, AGS 4.0, contains a significant range of additional fields to cover the requirements of EC7, mainly for QA/QC purposes, and adds a totally reworked environmental section. The AGS Format continues with an ASCII “commas and quotes” format, although other file types have been studied and are being investigated for future versions. The format has been adopted worldwide in the practising construction industry: in Australasia, China and Singapore in the Far East, through India and the Gulf States in the Middle East, across Eastern and Western Europe, and it is beginning to be used in the USA. The paper discusses the origins of the format, its maintenance and continued changes and, in particular, the methodology used for its governance and development.
Since a lack of data standards would limit the generality of test data, the ISRM and its Commission on Testing Methods promoted the standardization of rock mechanics test data by establishing the Joint Working Group on Representing ISRM Suggested Methods in Electronic Form (RISMEF) in 2007. Prior to that, the AGS had published the first edition of its interchange format in 1992 to achieve a similar goal. The applicability of the AGS format to representing rock mechanics test data obtained using the ISRM Suggested Methods is therefore discussed in this paper. The hierarchy of how the AGS4 Groups are organized is reviewed. Based on a comparison of the AGS4 Rock Uniaxial Compressive Strength and Deformability Tests (RUCS) Group and Uniaxial Compressive Test (UCT) results, the validity of using AGS4 to describe UCT results is investigated. In addition, the AGS4 Groups which record the data necessary for the Rock Mass Rating (RMR) system are analyzed. This study demonstrates that the AGS format can be extended to accord with the ISRM Suggested Methods, and potentially used to standardize ISRM rock mechanics test data.
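The RMR system mentioned above is, at its core, a sum of parameter ratings. As a minimal sketch (the rating values and field names below are hypothetical illustrations in the style of RMR89, not data from the paper), the basic computation looks like this:

```python
# Minimal sketch of a Rock Mass Rating (RMR89-style) computation.
# The five parameter ratings and the orientation adjustment are
# assumed to have been derived already (e.g. from AGS4 group data);
# the key names and numbers here are hypothetical.

def rock_mass_rating(ratings, orientation_adjustment=0):
    """Sum the five RMR parameter ratings and apply the
    discontinuity-orientation adjustment (zero or negative)."""
    required = {"ucs", "rqd", "spacing", "condition", "groundwater"}
    missing = required - ratings.keys()
    if missing:
        raise ValueError(f"missing ratings: {sorted(missing)}")
    return sum(ratings[k] for k in required) + orientation_adjustment

example = {"ucs": 7, "rqd": 13, "spacing": 10,
           "condition": 20, "groundwater": 10}
rmr = rock_mass_rating(example, orientation_adjustment=-5)
print(rmr)  # prints 55 (basic RMR of 60 with a -5 adjustment)
```

Recording each input rating as a separate field, as the AGS4 Groups do for the underlying test data, is what makes a standardized computation like this possible across data sets.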
In 1998 ‘the geotechnical data journey’ laid out a workflow to aid the efficiency of the site investigation reporting process. This essentially represented a digital version of a previously paper-driven process. However, developments in technology and elsewhere suggest that a more evolved process is required to suit the current requirements of industry. This paper reviews and refines ‘the geotechnical data journey’ for the present day.
The AGS data format, which has evolved over the last 20 years, provides a good mechanism to transfer technical data between different geotechnical partners. Historically, data have been transferred from the contractors to the consultants at the end of the site investigation as a one-off transfer of information.
Improvements in information and communication technologies and the growing pressures on time and budgets placed on recent projects have imposed new requirements. Partners, in particular on large projects, are now commonly expected to exchange data on an almost real-time basis. In order to keep track of the status of any test (i.e. the degree of confidence in the level of understanding achieved so far), the latest version of the AGS format (4.0) has introduced a new required TRAN group to convey this information.
However, despite this improvement, the sheer number of transmissions required to achieve this quasi-real-time data transfer, and the shortcomings of currently available software, lead to specific operational challenges which are examined in detail.
Real life examples of solutions which are being used routinely to mitigate these limitations are presented. In view of current practices, the likely implications of new regulatory framework, such as BIM, will be briefly discussed.
Is Building Information Modelling (BIM) a case of the emperor's new clothes, or are there real benefits to it for the geotechnical industry? The basic premise of BIM is for organisations to collaborate and share information through the whole project, from initial conception through design, build and maintenance, but what this translates to, how it is achieved and what the benefits are remain unclear to a number of geotechnical consultants. This paper explores what BIM means to the geotechnical industry by looking at real-world examples of geotechnical involvement in BIM.
The paper proposes an XML data exchange format based on the UK AGS 4.0 data exchange format for site investigation data. The case is made for updating the existing AGS text based (CSV) format to XML. This allows opportunities for central control of versions of the data standard by reference to a centrally maintained XML schema. It also allows use of XML validation functions to ensure files conform to the agreed standard. Data structures and stylesheets have been developed for borehole data and field testing data. A simple structure was adopted that can be used for importing data into a spreadsheet if required.
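The transformation argued for above can be sketched with the standard library. The "GROUP"/"HEADING"/"DATA" record descriptors below follow the AGS "commas and quotes" convention, but the XML element layout is a hypothetical illustration, not the schema proposed in the paper, and the data values are invented:

```python
# Sketch of converting AGS-style "commas and quotes" group data to XML.
# The record descriptors (GROUP, HEADING, DATA) follow the AGS
# convention; the XML layout and data values are hypothetical.
import csv
import io
import xml.etree.ElementTree as ET

ags_text = '''"GROUP","PROJ"
"HEADING","PROJ_ID","PROJ_NAME"
"DATA","121415","Example Road Scheme"
'''

def ags_group_to_xml(text):
    """Build an XML element tree from one AGS-style group."""
    rows = list(csv.reader(io.StringIO(text)))
    group = ET.Element("group", name=rows[0][1])   # "GROUP","PROJ"
    headings = rows[1][1:]                         # heading names
    for row in rows[2:]:
        if row and row[0] == "DATA":
            record = ET.SubElement(group, "data")
            for heading, value in zip(headings, row[1:]):
                ET.SubElement(record, heading).text = value
    return group

print(ET.tostring(ags_group_to_xml(ags_text)).decode())
```

Once the data are in XML, an XML Schema published at a central location can be used to validate files automatically, which is the version-control benefit the paper highlights.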
Borehole information is widely used for many purposes, since the data provides essential and accurate geological information about underground and surface conditions. Borehole data is generally provided in the form of boring logs, each of which contains the data for a single borehole, so many boring logs are required to produce further information about a target area. We developed application software for the iPad that enables a user to look into the underground from the surface using borehole information and augmented reality techniques. The borehole data, from a remote data server or the internal storage of the iPad, is processed internally and then overlaid on a camera screen or a 2-dimensional map, so the user can navigate the borehole data visually. Various geological and geotechnical information (i.e. the USCS rating and explanation of each layer, bed rock type, N-value obtained from the standard penetration test, and groundwater level), as well as positional information (i.e. depth, coordinates, altitude) of the underground, can be identified by touching the searched borehole data. The user can also modify existing data or add a new borehole in the field. A field survey was also carried out using more than 100,000 borehole records to validate the accuracy and reliability of the system.
The paper presents a Norwegian standardization project from 2008 to 2011 whose main output was an application schema describing geotechnical borehole data, including the most common sounding and sampling methods, in situ tests and laboratory tests. The schema defines a reference structure for underground observations, suitable for at least three different use cases: site investigations on land, offshore gravity coring and offshore borehole mode. The Norwegian Mapping Authority (NMA) has approved the schema and included it in the national feature catalog, a vital part of the Norwegian Spatial Data Infrastructure (SDI).
The standard is the result of contributions and teamwork from several organizations, including the Norwegian Geotechnical Society (NGF), a member of the International Society for Soil Mechanics and Geotechnical Engineering (ISSMGE). The Norwegian Geotechnical Institute (NGI) has used the schema to implement a Geographic Information System (GIS), which has improved the availability and reusability of borehole data.
EarthServer, an ongoing e-Infrastructure EU-FP7 project ending in August 2014, is developing capabilities for access to, and ad-hoc processing of, large Earth science data sets using Open Geospatial Consortium standard web service interfaces for coverage data. At project close it is intended that at least 200TB of data, in total, will be available online through a number of services, covering the Cryospheric, Atmospheric, Planetary science, Oceanographic, and Geological domains, the latter of which is being provided by the British Geological Survey (BGS).
Currently BGS has a number of services available, serving both our internal users and the public. Applications to date include: (1) spatial selection, editing and display of Landsat and aerial photographic imagery, including band selection and contrast stretching; (2) a 3-D visualization client that allows the draping of remote sensing data over a DTM; (3) a 3-D model of the Glasgow area which can be displayed, with thickness calculations performed between geological surfaces; and (4) an application querying superficial thickness floating-point data to show regions within a thickness range as an image.
In the final year we will be working with members of the ASK (Accessing Subsurface Knowledge) consortium to help provide a methodology for the display and query of geological model data by non-domain specialists, developing the 3-D viewer and thickness calculation applications further. We will also be working with archived hyperspectral data (CASI and HyMap) from the MINEO project to develop an application showing band reflectance signatures that characterize mineral assemblages, to help with the management of mine waste and with resource mapping.
Shield tunneling has become an attractive method in the development of underground space. In recent years, engineers have faced pressure to reduce tunnel maintenance fees. Thus, this paper presents a web-based digital platform as a management tool to manage, query and visualize shield tunnel operation data. First, a data standard covering monitoring, inspection and maintenance is proposed, based on existing engineering data standards and shield tunnel projects. The data standard is needed to collect data and design the database. Then a web-based digital platform is developed based on WebGIS software and Web Service technology. Users can obtain all the engineering information through this platform. A novel feature of this system is the ability to share, update and analyze data through the internet in a visual way. The functions implemented in the platform include monitoring point positioning, historical monitoring data queries, alarm monitoring, maintenance data queries and maintenance data analysis. A case study of the Shanghai Metro Line No. 13 is presented. Compared with traditional tunnel operation management systems, the digital platform provides a more effective way to manage the massive data set.
Developments in remote sensing have opened up new techniques to map and quantify rockfall in massive rockfaces such as those in the western Norwegian fjords. Gigapan photogrammetry enables visualization of the entire slope in detail, with the ability to zoom into the image to see individual features, while terrestrial LiDAR scanning makes it possible to quantify the findings in the Gigapan photos. At Lysebotn this technique was used with success, and 19 rockfall sources threatening the construction site of Lyse's new electrical power plant were identified. Prime lenses with a focal length matched to the size of, and distance to, the rockface should be used when shooting the Gigapans. Care should also be taken when selecting scanning positions to avoid wind and other factors that could influence the data quality.
While temporary timber structures have been extensively used in underground construction processes, their current design is often conservative and over-engineered. Understanding the real performance of mixed timber-steel supporting systems will be beneficial both in reducing the health and safety risks to workers and in cutting construction cost and time. In this paper, we introduce a novel technology called the Wireless SmartPlank Network (WSPN) to monitor timber structures in underground constructions. A SmartPlank is a wooden beam equipped with a streamlined wireless sensor node, two thin foil strain gauges and two temperature sensors, which enable us to measure the bending strain and temperature experienced by both sides of the beam, and to transmit this information in real time over an IPv6 (6LoWPAN) multi-hop wireless mesh network and the Internet while the underground construction is taking place. The SmartPlank was carefully tested and calibrated in the lab. Four SmartPlanks were deployed at Tottenham Court Road (TCR) station during the Stair 14 excavation, together with seven relay nodes and a gateway node to provide connectivity to the Internet; this monitoring will last for one and a half years, until the Central Line possession in 2015. The captured temperature data show that the cement hydration temperature rose to 33 degrees around 10 hours after grouting, while the measured strain data indicate that the planks were not experiencing pure bending, with very large compression on the top surface during grouting but much less strain on the bottom side. The converted vertical earth pressure is much smaller than the design load. Some challenges we encountered in the real site deployment are also highlighted.
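The conclusion that the planks were not in pure bending follows from the paired gauges on the two faces of each beam: pure bending produces equal-and-opposite face strains, so any common component is axial. A minimal sketch of that decomposition (the sign convention and the strain values are illustrative assumptions, not measurements from the TCR deployment):

```python
# Sketch of separating bending from axial strain using paired gauges
# on the two faces of a beam. Sign convention: tension positive,
# compression negative. Readings below are illustrative only.

def decompose_strain(eps_top, eps_bottom):
    """Split face strains (microstrain) into axial and bending parts.

    Pure bending gives equal-and-opposite face strains, so a large
    axial component signals that the beam is also in compression."""
    axial = (eps_top + eps_bottom) / 2.0
    bending = (eps_top - eps_bottom) / 2.0
    return axial, bending

axial, bending = decompose_strain(eps_top=-500.0, eps_bottom=-100.0)
print(axial, bending)  # -300.0 (net axial compression) and -200.0 (bending)
```

Large compression on the top face with much less strain on the bottom, as reported above, yields a large negative axial term: the signature of combined compression and bending rather than bending alone.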
The shield tunnels in Shanghai, China are going through the transition from the construction period to the maintenance period now that they have been put into operation. The current maintenance policies for shield tunnels are passive, and are neither effective nor efficient. In order to maintain the service performance of shield tunnels above a certain level through active maintenance policies, a framework for a shield tunnel management system is proposed in this paper. It consists of six main parts: identification of the elements to be maintained, performance requirements, condition assessment, performance prediction, optimal maintenance strategies and maintenance operation. Each part of the system is elaborated in this paper. With this system, the serviceability of shield tunnels can be maintained and controlled in an effective and efficient way.
A common requirement of laboratory and field data collection is for automated data logging software to control instrumentation and report on the results. Large projects may require a range of different instrumentation, each with its own preferred method for storing data, which increases overheads during analysis. The prototype software described herein attempts to address some of these issues by providing a general framework for controlling any serial port device and organising data centrally in a cloud-based data storage solution. The benefits include a fully queryable results database, storage of settings in the cloud allowing for easy relocation of hardware, and an advanced serial port controller in which nearly any physical system can be described and functions automated with real-time feedback to the user.
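The core pattern here is a generic parser for line-oriented instrument output feeding a central, queryable store. A minimal sketch, in which an in-memory sqlite3 database stands in for the cloud store and the "CHANNEL:VALUE" line format is a hypothetical serial protocol rather than any real instrument's:

```python
# Minimal sketch of the logging pattern described above: a generic
# parser for line-oriented instrument output feeding a queryable
# store. sqlite3 stands in for the cloud database; the "TEMP:23.5"
# line format is a hypothetical serial protocol.
import sqlite3

def parse_reading(line):
    """Parse a 'CHANNEL:VALUE' text line from a serial device."""
    channel, _, value = line.strip().partition(":")
    return channel, float(value)

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE readings (channel TEXT, value REAL)")

# In the real framework these lines would arrive from a serial port.
for raw in ["TEMP:23.5", "PRESSURE:101.3", "TEMP:23.7"]:
    db.execute("INSERT INTO readings VALUES (?, ?)", parse_reading(raw))

# The central store is fully queryable, e.g. a mean per channel:
for channel, mean in db.execute(
        "SELECT channel, AVG(value) FROM readings GROUP BY channel"):
    print(channel, round(mean, 2))
```

Because every instrument's output is normalised into the same table, analysis queries no longer depend on each device's own storage format, which is the overhead the paper identifies.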
Automated multi-electrode resistivity systems now allow rapid and efficient acquisition of electrical resistivity measurements to address a wide range of hydrogeological, geotechnical and environmental problems. In this work, a flexible data acquisition and control software package has been developed to acquire resistivity data using configurable electrode arrangements. The user-friendly and reliable acquisition software has been used to integrate the hardware and to control the resistivity data collection process. The user can remotely set the current and voltage, and multiplex 64 electrodes interchangeably as current and voltage electrodes in a fully automated procedure to collect real-time resistivity data. The package developed has been tested using a wide range of high-precision reference resistors and compacted clay soils, and the results have been correlated with those acquired with commercial standard instruments. The results demonstrate high precision, accuracy, and resolution of the collected data, with a maximum error of 0.8%.
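For each electrode configuration, acquisition software of this kind reduces a measured voltage/current pair to an apparent resistivity via the array's geometric factor. A sketch for the common Wenner array (equal electrode spacing a, geometric factor 2πa); the numbers are illustrative, not results from the paper:

```python
# Sketch of the per-reading calculation behind multi-electrode
# resistivity acquisition: apparent resistivity for a Wenner array.
# Values below are illustrative, not from the paper.
import math

def wenner_apparent_resistivity(spacing_m, voltage_v, current_a):
    """Apparent resistivity (ohm-m) for a Wenner array with equal
    electrode spacing a: rho_a = 2 * pi * a * V / I."""
    if current_a == 0:
        raise ValueError("injected current must be non-zero")
    return 2.0 * math.pi * spacing_m * voltage_v / current_a

rho = wenner_apparent_resistivity(spacing_m=1.0, voltage_v=0.05,
                                  current_a=0.01)
print(round(rho, 2))  # 2*pi*1.0*0.05/0.01 = 31.42 ohm-m
```

Other arrangements (e.g. dipole-dipole) use the same V/I reading with a different geometric factor, which is why a configurable electrode multiplexer is central to the package described above.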
Different types of soil stress measuring cells have been successfully used in model tests under monotonic loading after following a strict calibration exercise to simulate field conditions. This paper explores the potential of one particular type of strain-gauged cell (type TML–PDA–200kPa) to determine the magnitudes of stresses generated at soil-structure interfaces and also within the soil medium for repeating load tests on models at 1 g. Difficulties associated with the use and calibration of these cells, including hysteresis, are discussed. In the present investigation, repeating load tests performed on dry sand prepared at different density states indicated that the cell response was strongly dependent on sand density, load amplitude, loading frequency and number of load cycles.