Search results for: continuous data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26710

24220 Detect Cable Force of Cable-Stayed Bridge from Accelerometer Data of SHM in Real Time

Authors: Nguyen Lan, Le Tan Kien, Nguyen Pham Gia Bao

Abstract:

The cable-stayed bridge belongs to the combined structural system, in which the cables are a major structural element. Cable-stayed bridges with large spans are often equipped with structural health monitoring (SHM) systems to collect data for bridge health diagnosis. Cable tension monitoring is one component of such structural monitoring. It is common to measure cable tension either directly with a force sensor or indirectly with a cable vibration accelerometer, inferring the tension from the cable vibration frequency. Translating cable vibration acceleration data into real-time tension requires some calculation and programming. This paper introduces the algorithm and LabVIEW program that convert cable vibration acceleration data into real-time tension. The research results are applied to the monitoring system of Tran Thi Ly cable-stayed bridge and Song Hieu cable-stayed bridge in Vietnam.
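
As a rough illustration of the frequency-to-tension conversion described above (not the authors' LabVIEW implementation), the sketch below assumes the taut-string relation T = 4·m·L²·(f/n)², with hypothetical values for the cable mass per unit length and free length; the dominant frequency is taken from an FFT of the acceleration record.

```python
import numpy as np

def cable_tension_from_acceleration(accel, fs, mass_per_length, free_length, mode=1):
    """Estimate cable tension (N) from a vibration acceleration record.

    Assumes the taut-string relation T = 4 * m * L**2 * (f_n / n)**2,
    where f_n is the n-th natural frequency picked as the dominant FFT peak.
    """
    n = len(accel)
    spectrum = np.abs(np.fft.rfft(accel - np.mean(accel)))
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    # skip the near-DC bin before picking the dominant peak
    peak = np.argmax(spectrum[1:]) + 1
    f_dominant = freqs[peak]
    tension = 4.0 * mass_per_length * free_length**2 * (f_dominant / mode)**2
    return f_dominant, tension

# hypothetical example: 60 s record sampled at 100 Hz
fs = 100.0
t = np.arange(0, 60, 1.0 / fs)
accel = np.sin(2 * np.pi * 1.8 * t) + 0.1 * np.random.randn(len(t))
f, T = cable_tension_from_acceleration(accel, fs, mass_per_length=80.0, free_length=120.0)
print(f"dominant frequency {f:.2f} Hz, estimated tension {T / 1e3:.0f} kN")
```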

Keywords: cable-stayed bridge, cable force, structural health monitoring (SHM), fast Fourier transform (FFT), real time, vibrations

Procedia PDF Downloads 71
24219 Impacts of Building Design Factors on Auckland School Energy Consumption

Authors: Bin Su

Abstract:

This study focuses on the impact of school building design factors on winter extra energy consumption, which mainly includes space heating, water heating and other appliances related to winter indoor thermal conditions. A number of Auckland schools were randomly selected for the study, which introduces a method of using real monthly energy consumption data for a year to calculate the winter extra energy data of school buildings. The study seeks to identify the relationships between the winter extra energy data and the school building design data describing the main architectural features, building envelope and elements of the sample schools. These relationships can be used to estimate the approximate saving in winter extra energy consumption that would result from a changed design datum in future school development, and to identify any major energy-efficient design problems. The relationships are also valuable for developing passive design guides for school energy efficiency.
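
The design-to-energy relationships described above are, in essence, regression relationships. The sketch below shows one hedged way such a relationship could be fitted; the design factors (window-to-wall ratio, envelope-to-floor-area ratio, roof U-value) and the energy figures are illustrative assumptions, not the study's data.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# hypothetical design data for sample schools: window-to-wall ratio,
# envelope area per floor area, and roof U-value (all illustrative)
X = np.array([
    [0.35, 1.8, 0.60],
    [0.25, 1.5, 0.45],
    [0.45, 2.1, 0.70],
    [0.30, 1.6, 0.50],
])
# winter extra energy per floor area (kWh/m2), illustrative values
y = np.array([28.0, 19.5, 34.2, 22.1])

model = LinearRegression().fit(X, y)
print("coefficients:", model.coef_)                        # effect of each design factor
print("predicted:", model.predict([[0.28, 1.55, 0.48]]))   # estimate for a changed design
```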

Keywords: building energy efficiency, building thermal design, building thermal performance, school building design

Procedia PDF Downloads 444
24218 Mathematical Model for Flow and Sediment Yield Estimation on Tel River Basin, India

Authors: Santosh Kumar Biswal, Ramakar Jha

Abstract:

Soil erosion is a slow and continuous process and one of the prominent problems across the world, leading to many serious consequences such as loss of soil fertility, loss of soil structure, poor internal drainage and sediment deposition. In this paper, remote sensing and GIS based methods have been applied for the determination of soil erosion and sediment yield. The Tel River basin, the second largest tributary of the river Mahanadi, lying between latitudes 19° 15' 32.4"N and 20° 45' 0"N and longitudes 82° 3' 36"E and 84° 18' 18"E, was chosen for the present study. The catchment was discretized into approximately homogeneous sub-areas (grid cells) to overcome catchment heterogeneity. The gross soil erosion in each cell was computed using the Universal Soil Loss Equation (USLE). The various USLE parameters were determined as functions of land topography, soil texture, land use/land cover, rainfall erosivity and crop management and practice in the watershed. The concept of transport-limited accumulation was formulated and transport capacity maps were generated. The gross soil erosion was routed to the catchment outlet. This study can help in recognizing critical erosion-prone areas of the study basin so that suitable control measures can be implemented.
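
For reference, the cell-by-cell USLE computation mentioned above takes the standard multiplicative form A = R·K·LS·C·P. The sketch below applies it to small illustrative factor grids; the values are placeholders, not the Tel basin data.

```python
import numpy as np

def usle_gross_erosion(R, K, LS, C, P):
    """Gross soil loss A per grid cell from the Universal Soil Loss Equation,
    A = R * K * LS * C * P, with each factor supplied as a raster array."""
    return R * K * LS * C * P

# illustrative 2x2 grids for each USLE factor
R  = np.array([[520.0, 540.0], [530.0, 510.0]])   # rainfall erosivity
K  = np.array([[0.30, 0.28], [0.25, 0.32]])       # soil erodibility
LS = np.array([[1.2, 2.5], [0.8, 3.1]])           # slope length-steepness
C  = np.array([[0.45, 0.20], [0.10, 0.60]])       # cover management
P  = np.array([[1.0, 0.8], [1.0, 0.7]])           # support practice

A = usle_gross_erosion(R, K, LS, C, P)
print(A)   # gross erosion per cell, to be routed to the catchment outlet afterwards
```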

Keywords: Universal Soil Loss Equation (USLE), GIS, land use, sediment yield

Procedia PDF Downloads 308
24217 The Meta-Evaluation of Master's Degree Theses in the Science Program of Evaluation Methodology, Srinakharinwirot University

Authors: Panwasn Mahalawalert

Abstract:

The objective of this study was the meta-evaluation of master's degree theses in the Science Program of Evaluation Methodology at Srinakharinwirot University published during 2008-2011. This was a summative meta-evaluation that covered all theses of the master's degree in the Science Program of Evaluation Methodology. Data were collected using a thesis-characteristics recording form and a meta-evaluation checklist. The collected data were analyzed in two parts: 1) quantitative data were analyzed by descriptive statistics presented as frequencies, percentages, means, and standard deviations, and 2) qualitative data were analyzed by content analysis. The results on thesis characteristics revealed that most of the theses were published in 2011. The largest group of thesis researchers were female and came from government offices. The evaluation model of all theses was the Decision-Oriented Evaluation Model, and the objective of all theses was to evaluate a project or curriculum. The most common sampling technique was multistage random sampling, and the most common data-gathering tools were questionnaires. All of the theses were analysed using descriptive statistics. The meta-evaluation results revealed that most theses were rated fair on the Utility and Feasibility Standards and good on the Propriety and Accuracy Standards.

Keywords: meta-evaluation, evaluation, master degree theses, Srinakharinwirot University

Procedia PDF Downloads 536
24216 Re-Stating the Origin of Tetrapods Using Measures of Phylogenetic Support for Phylogenomic Data

Authors: Yunfeng Shan, Xiaoliang Wang, Youjun Zhou

Abstract:

Whole-genome data from two lungfish species, along with other species, present a valuable opportunity to re-investigate the longstanding debate regarding the evolutionary relationships among tetrapods, lungfishes, and coelacanths. However, the use of bootstrap support has become outdated for large-scale phylogenomic data. Without robust phylogenetic support, the phylogenetic trees become meaningless. Therefore, it is necessary to re-evaluate the phylogenies of tetrapods, lungfishes, and coelacanths using novel measures of phylogenetic support specifically designed for phylogenomic data, as the previous phylogenies were based on 100% bootstrap support. Our findings consistently provide strong evidence favoring lungfish as the closest living relative of tetrapods. This conclusion is based on high internode certainty, relative gene support, and high gene concordance factor. The evidence stems from five previous datasets derived from lungfish transcriptomes. These results yield fresh insights into the three hypotheses regarding the phylogenies of tetrapods, lungfishes, and coelacanths. Importantly, these hypotheses are not mere conjectures but are substantiated by a significant number of genes. Analyzing real biological data further demonstrates that the inclusion of additional taxa leads to more diverse tree topologies. Consequently, gene trees and species trees may not be identical even when whole-genome sequencing data is utilized. However, it is worth noting that many gene trees can accurately reflect the species tree if an appropriate number of taxa, typically ranging from six to ten, are sampled. Therefore, it is crucial to carefully select the number of taxa and an appropriate outgroup, such as slow-evolving species, while excluding fast-evolving taxa as outgroups to mitigate the adverse effects of long-branch attraction and achieve an accurate reconstruction of the species tree. This is particularly important as more whole-genome sequencing data becomes available.

Keywords: novel measures of phylogenetic support for phylogenomic data, gene concordance factor confidence, relative gene support, internode certainty, origin of tetrapods

Procedia PDF Downloads 60
24215 Predicting Daily Patient Hospital Visits Using Machine Learning

Authors: Shreya Goyal

Abstract:

The study aims to build user-friendly software to understand patient arrival patterns and compute the number of potential patients who will visit a particular health facility for a given period by using a machine learning algorithm. The underlying machine learning algorithm used in this study is the Support Vector Machine (SVM). Accurate prediction of patient arrival allows hospitals to operate more effectively, providing timely and efficient care while optimizing resources and improving patient experience. It allows for better allocation of staff, equipment, and other resources. If there's a projected surge in patients, additional staff or resources can be allocated to handle the influx, preventing bottlenecks or delays in care. Understanding patient arrival patterns can also help streamline processes to minimize waiting times for patients and ensure timely access to care for patients in need. Another big advantage of using this software is adhering to strict data protection regulations such as the Health Insurance Portability and Accountability Act (HIPAA) in the United States, as the hospital will not have to share the data with any third party or upload it to the cloud, because the software can read data locally from the machine. The data needs to be arranged in a particular format, and the software will be able to read the data and provide meaningful output. Using software that operates locally can facilitate compliance with these regulations by minimizing data exposure. Keeping patient data within the hospital's local systems reduces the risk of unauthorized access or breaches associated with transmitting data over networks or storing it in external servers. This can help maintain the confidentiality and integrity of sensitive patient information. Historical patient data is used in this study. The input variables used to train the model include patient age, time of day, day of the week, seasonal variations, and local events. The algorithm uses a supervised learning method to optimize the objective function and find the global minimum. The algorithm stores the values of the local minima after each iteration and, at the end, compares all the local minima to find the global minimum. The strength of this study is the transfer function used to calculate the number of patients. The model has an output accuracy of >95%. The method proposed in this study could be used for better management planning of personnel and medical resources.
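
A minimal sketch of the kind of SVM-based arrival prediction described above, using scikit-learn's SVR; the feature encoding, kernel and hyperparameters here are assumptions for illustration, not the study's configuration, and the data are invented.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

# hypothetical historical records: [hour of day, day of week, month, local event flag]
X = np.array([
    [9,  0, 1, 0],
    [14, 2, 1, 0],
    [10, 5, 7, 1],
    [18, 3, 12, 0],
    [11, 6, 12, 1],
])
y = np.array([42, 35, 61, 48, 70])   # observed patient arrivals (illustrative)

# SVR with scaled inputs; the study's exact kernel and feature set are not specified here
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=1.0))
model.fit(X, y)

print(model.predict(np.array([[12, 4, 7, 0]])))   # expected arrivals for a new time slot
```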

Keywords: machine learning, SVM, HIPAA, data

Procedia PDF Downloads 65
24214 Analyzing Keyword Networks for the Identification of Correlated Research Topics

Authors: Thiago M. R. Dias, Patrícia M. Dias, Gray F. Moita

Abstract:

The production and publication of scientific works have increased significantly in recent years, with the Internet as the main means of access and distribution of these works. Faced with this, there is a growing interest in understanding how scientific research has evolved, in order to explore this knowledge to encourage research groups to become more productive. Therefore, the objective of this work is to explore repositories containing data from scientific publications and to characterize the keyword networks of these publications, in order to identify the most relevant keywords and to highlight those that have the greatest impact on the network. To do this, each article in the study repository has its keywords extracted, and the network is characterized in this way, after which several social network analysis metrics are applied to identify the highlighted keywords.
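
A minimal sketch of the keyword co-occurrence network and the social-network metrics mentioned above, using NetworkX; the keyword lists are invented, and the choice of degree and betweenness centrality is illustrative rather than the authors' exact metric set.

```python
import itertools
import networkx as nx

# hypothetical keyword lists extracted from four publications
papers = [
    ["machine learning", "bibliometrics", "networks"],
    ["bibliometrics", "scientometrics", "networks"],
    ["data analysis", "machine learning", "networks"],
    ["scientometrics", "data analysis"],
]

G = nx.Graph()
for kws in papers:
    for a, b in itertools.combinations(sorted(set(kws)), 2):
        # co-occurrence edge; weight counts how often the pair appears together
        w = G[a][b]["weight"] + 1 if G.has_edge(a, b) else 1
        G.add_edge(a, b, weight=w)

# social-network metrics used to highlight the most relevant keywords
degree = nx.degree_centrality(G)
betweenness = nx.betweenness_centrality(G, weight="weight")
print(sorted(degree.items(), key=lambda kv: -kv[1])[:3])
print(sorted(betweenness.items(), key=lambda kv: -kv[1])[:3])
```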

Keywords: bibliometrics, data analysis, extraction and data integration, scientometrics

Procedia PDF Downloads 258
24213 The Extraction and Stripping of Hg(II) from Produced Water via Hollow Fiber Contactor

Authors: Dolapop Sribudda, Ura Pancharoen

Abstract:

The separation of Hg(II) from produced water by hollow fiber contactors (HFC) was investigated. The system consisted of two hollow fiber modules connected in series, the first module used for the extraction reaction and the second module for the stripping reaction. The Aliquat 336 extractant was fed from the organic reservoir into the shell side of the first hollow fiber module and continued to the shell side of the second module; the organic liquid was continuously recirculated back to the reservoir. The feed solution was pumped into the lumen (tube side) of the first hollow fiber module. Simultaneously, the stripping solution was pumped in the same way into the tube side of the second module, so that the feed and stripping solutions flowed counter-currently. Samples were collected at the outlets of the feed and stripping solutions over 1 hour, and the concentration of Hg(II) was characterized by Inductively Coupled Plasma Atomic Emission Spectroscopy (ICP-AES). The feed solution was produced water from the Gulf of Thailand. The extractant was Aliquat 336 dissolved in kerosene diluent. The stripping solutions used were nitric acid (HNO3) and thiourea (NH2CSNH2). The effects of carrier concentration and type of stripping solution were investigated. Results showed that the best conditions were 10% (v/v) Aliquat 336 and 1.0 M NH2CSNH2. At the optimum condition, the extraction and stripping of Hg(II) were 98% and 44.2%, respectively.

Keywords: Hg(II), hollow fiber contactor, produced water, wastewater treatment

Procedia PDF Downloads 403
24212 A New Approach towards the Development of Next Generation CNC

Authors: Yusri Yusof, Kamran Latif

Abstract:

Computer Numerical Control (CNC) machines have been widely used in industry since their inception. Currently, CNC technology is used for various operations such as milling, drilling, packing and welding. With the rapid growth of the manufacturing world, the demand for flexibility in CNC machines has rapidly increased. Previously, commercial CNC failed to provide flexibility because its structure was of a closed nature that does not provide access to the inner features of the CNC. Also, the CNC's operating ISO data interface model was found to be limited. Therefore, to overcome these problems, Open Architecture Control (OAC) technology and the STEP-NC data interface model were introduced. At present, the Personal Computer (PC) has been the best platform for the development of open-CNC systems. In this paper, ISO data interface model interpretation, its verification and its execution are highlighted with the introduction of the new techniques. The proposed system is composed of ISO data interpretation, 3D simulation and machine motion control modules. The system is tested on an old 3-axis CNC milling machine. The results are found to be satisfactory in performance. This implementation has successfully enabled a sustainable manufacturing environment.

Keywords: CNC, ISO 6983, ISO 14649, LabVIEW, open architecture control, reconfigurable manufacturing systems, sustainable manufacturing, Soft-CNC

Procedia PDF Downloads 516
24211 A Study on the Establishment of a 4-Joint Based Motion Capture System and Data Acquisition

Authors: Kyeong-Ri Ko, Seong Bong Bae, Jang Sik Choi, Sung Bum Pan

Abstract:

A simple method for testing the posture imbalance of the human body is to check for differences in the bilateral shoulder and pelvic heights of the target. In this paper, to check for spinal disorders, the authors have studied ways to establish a motion capture system that obtains and expresses the motions of 4 joints, and to acquire data based on this system. The 4 sensors are attached to both shoulders and the pelvis. To verify the established system, the normal and abnormal postures of targets listening to a lecture were obtained using the established 4-joint based motion capture system. From the results, it was confirmed that the motions taken by the target were identical to the 3-dimensional simulation.

Keywords: inertial sensor, motion capture, motion data acquisition, posture imbalance

Procedia PDF Downloads 515
24210 Predictive Analytics in Oil and Gas Industry

Authors: Suchitra Chnadrashekhar

Abstract:

Earlier viewed as a support function in an organization, information technology has now become a critical utility for managing daily operations. Organizations are processing huge amounts of data that would have been unimaginable a few decades ago. This has opened the opportunity for the IT sector to help industries across domains handle data in the most intelligent manner. The presence of IT has been a leverage for the oil and gas industry to store, manage and process data in the most efficient way possible, thus deriving economic value in day-to-day operations. Proper synchronization between the operational data system and the information technology system is the need of the hour. Predictive analytics supports oil and gas companies by addressing the challenge of critical equipment performance, life cycle, integrity and security, and increases their utilization. Predictive analytics goes beyond early warning by providing insights into the roots of problems. To reach their full potential, oil and gas companies need to take a holistic or systems approach towards asset optimization and thus have functional information at all levels of the organization in order to make the right decisions. This paper discusses how the use of predictive analysis in the oil and gas industry is redefining the dynamics of this sector. The paper is also supported by real-time data and evaluation of the data for a given oil production asset on an application tool, SAS. The reason for using SAS for our analysis is that SAS provides an analytics-based framework to improve uptimes, performance and availability of crucial assets while reducing the amount of unscheduled maintenance, thus minimizing maintenance-related costs and operational disruptions. With state-of-the-art analytics and reporting, we can predict maintenance problems before they happen and determine root causes in order to update processes for future prevention.

Keywords: hydrocarbon, information technology, SAS, predictive analytics

Procedia PDF Downloads 360
24209 Urban Change Detection and Pattern Analysis Using Satellite Data

Authors: Shivani Jha, Klaus Baier, Rafiq Azzam, Ramakar Jha

Abstract:

In India, people generally migrate from rural areas to urban areas for better infrastructural facilities, a higher standard of living, good job opportunities and advanced transport/communication availability. In fact, unplanned urban development due to this migration causes serious damage to land use, pollutes water and strains available water resources. In the present work, an attempt has been made to use satellite data of different years for urban change detection of the Chennai metropolitan city, along with pattern analysis to generate a future scenario of urban development using buffer zoning in a GIS environment. In the analysis, SRTM (30 m) elevation data and IRS-1C satellite data for the years 1990, 2000, and 2014 are used. The flow accumulation, aspect, flow direction and slope maps developed using the SRTM 30 m data are very useful for finding suitable urban locations for industrial setups and urban settlements. The normalized difference vegetation index (NDVI) and Principal Component Analysis (PCA) have been used in ERDAS Imagine software for change detection in the land use of the Chennai metropolitan city. It has been observed that the urban area has increased exponentially in the Chennai metropolitan city, with a significant decrease in agricultural and barren lands. However, the water bodies located in the study region are protected and being used as freshwater for drinking purposes. Using buffer zone analysis in a GIS environment, it has been observed that development has taken place significantly in the south-west direction and will continue to do so in future.
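
For reference, the NDVI mentioned above is computed per pixel as (NIR - Red) / (NIR + Red). The sketch below shows a hedged change-detection step on invented reflectance grids; the -0.2 change threshold is illustrative, not from the study.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized difference vegetation index, NDVI = (NIR - Red) / (NIR + Red)."""
    nir = nir.astype("float64")
    red = red.astype("float64")
    return (nir - red) / np.maximum(nir + red, 1e-9)

# illustrative reflectance grids for two dates
nir_1990 = np.array([[0.45, 0.50], [0.40, 0.48]])
red_1990 = np.array([[0.10, 0.12], [0.09, 0.11]])
nir_2014 = np.array([[0.30, 0.22], [0.38, 0.20]])
red_2014 = np.array([[0.15, 0.18], [0.10, 0.17]])

change = ndvi(nir_2014, red_2014) - ndvi(nir_1990, red_1990)
urbanized = change < -0.2   # strong NDVI drop flagged as possible built-up conversion
print(change)
print(urbanized)
```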

Keywords: urban change, satellite data, the Chennai metropolis, change detection

Procedia PDF Downloads 408
24208 HelpMeBreathe: A Web-Based System for Asthma Management

Authors: Alia Al Rayssi, Mahra Al Marar, Alyazia Alkhaili, Reem Al Dhaheri, Shayma Alkobaisi, Hoda Amer

Abstract:

We present in this paper a web-based system called “HelpMeBreathe” for managing asthma. The proposed system provides analytical tools which allow a better understanding of the environmental triggers of asthma, hence better support of data-driven decision making. The developed system provides warning messages to a specific asthma patient if the weather in his/her area might cause any difficulty in breathing or could trigger an asthma attack. HelpMeBreathe collects, stores, and analyzes individuals’ moving trajectories and health conditions as well as environmental data. It then processes and displays the patients’ data through an analytical tool that leads to effective decision making by physicians and other decision makers.

Keywords: asthma, environmental triggers, map interface, web-based systems

Procedia PDF Downloads 294
24207 Extraction of Road Edge Lines from High-Resolution Remote Sensing Images Based on Energy Function and Snake Model

Authors: Zuoji Huang, Haiming Qian, Chunlin Wang, Jinyan Sun, Nan Xu

Abstract:

In this paper, a strategy for extracting double road edge lines from the acquired road stripe image was explored. The workflow is as follows: the road stripes are first acquired by a probabilistic boosting tree algorithm and a morphological algorithm, and road centerlines are then detected by a thinning algorithm, so that initial road edge lines can be acquired along the road centerlines. We then refine the results where the local curvature of the centerlines varies greatly. Specifically, the energy function of the edge line is constructed from gradient features and spectral information, and Dijkstra's algorithm is used to optimize the initial road edge lines. A Snake model is constructed to solve the fracture problem at intersections, and a discrete dynamic programming algorithm is used to solve the model. After that, we obtain the final road network. Experimental results show that the strategy proposed in this paper can be used to extract continuous and smooth road edge lines from high-resolution remote sensing images with an accuracy of 88% in our study area.
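
A simplified sketch of the Dijkstra optimization step described above; the cost image here is a placeholder (a single low-cost column standing in for a strong gradient response) rather than the paper's gradient-plus-spectral energy function.

```python
import heapq
import numpy as np

def dijkstra_path(cost, start, goal):
    """Minimum-cost 8-connected path over a per-pixel cost image (Dijkstra)."""
    h, w = cost.shape
    dist = np.full((h, w), np.inf)
    prev = {}
    dist[start] = cost[start]
    pq = [(cost[start], start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            break
        if d > dist[r, c]:
            continue
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                nr, nc = r + dr, c + dc
                if (dr or dc) and 0 <= nr < h and 0 <= nc < w:
                    nd = d + cost[nr, nc]
                    if nd < dist[nr, nc]:
                        dist[nr, nc] = nd
                        prev[(nr, nc)] = (r, c)
                        heapq.heappush(pq, (nd, (nr, nc)))
    # backtrack from goal to start
    path, node = [], goal
    while node != start:
        path.append(node)
        node = prev[node]
    path.append(start)
    return path[::-1]

# illustrative cost image: low cost along a strong-gradient "edge" down column 2
cost = np.ones((5, 5))
cost[:, 2] = 0.1
print(dijkstra_path(cost, (0, 2), (4, 2)))
```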

Keywords: road edge lines extraction, energy function, intersection fracture, Snake model

Procedia PDF Downloads 338
24206 Design and Characterization of a Smart Composite Fabric for Knee Brace

Authors: Rohith J. K., Amir Nazemi, Abbas S. Milani

Abstract:

In Paralympic sports, athletes often depend on some form of equipment to enable competitive sporting, and most of this equipment only allows passive physiological support and discrete physiological measurements. Active-feedback physiological support and continuous detection of performance indicators, without time or space constraints, would enable more effective training and performance measures for Paralympic athletes. Moreover, athletes occasionally suffer from fatigue and muscular strains due to improper monitoring systems. The latter challenges can be overcome by using smart composites technology when manufacturing, e.g., knee braces and other sports wearables, where the sensors are fused into the fabric and an assistive system actively supports the athlete. This paper shows how different sensing functionalities may be created by intrinsic and extrinsic modifications of different types of composite fabrics, depending on the level of integration and the employed functional elements. Results demonstrate that fabric sensors can be well tailored to measure muscular strain and be used in the fabrication of a smart knee brace as a sample potential application. Materials, connectors, fabric circuits, interconnects, encapsulation and fabrication methods associated with such smart fabric technologies prove to be customizable and versatile.

Keywords: smart composites, sensors, smart fabrics, knee brace

Procedia PDF Downloads 178
24205 Dynamical Relation of Poisson Spike Trains in Hodgkin-Huxley Neural Ion Current Model and Formation of Non-Canonical Bases, Islands, and Analog Bases in DNA, mRNA, and RNA at or near the Transcription

Authors: Michael Fundator

Abstract:

A groundbreaking application of biomathematical and biochemical research on neural network processes to the formation of non-canonical bases, islands, and analog bases in DNA and mRNA at or near transcription, which contradicts the long-anticipated statistical assumptions for the distribution of bases and analog base compounds, is implemented through statistical and stochastic methods with the addition of quantum principles, where the usual transience of the Poisson spike train becomes a very instrumental tool for finding even almost-periodic types of solutions to the Fokker-Planck stochastic differential equation. The present article develops new multidimensional methods of finding solutions to stochastic differential equations based on a more rigorous approach to the mathematical apparatus through the Kolmogorov-Chentsov continuity theorem, which allows stochastic processes with jumps, under certain conditions, to have a γ-Hölder continuous modification that is used as the basis for finding analogous parallels in the dynamics of neural networks and in the formation of analog bases and transcription in DNA.
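
For reference, the one-dimensional Fokker-Planck equation the abstract refers to has the standard drift-diffusion form below; the article's own multidimensional formulation is not reproduced here.

```latex
% One-dimensional Fokker-Planck equation for dX_t = \mu(X_t,t)\,dt + \sigma(X_t,t)\,dW_t
\frac{\partial p(x,t)}{\partial t}
  = -\frac{\partial}{\partial x}\bigl[\mu(x,t)\,p(x,t)\bigr]
  + \frac{1}{2}\,\frac{\partial^{2}}{\partial x^{2}}\bigl[\sigma^{2}(x,t)\,p(x,t)\bigr]
```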

Keywords: Fokker-Planck stochastic differential equation, Kolmogorov-Chentsov continuity theorem, neural networks, translation and transcription

Procedia PDF Downloads 406
24204 The Optimal Location of Brickforce in Brickwork

Authors: Sandile Daniel Ngidi

Abstract:

A brickforce is a product consisting of two main parallel wires joined by in-line welded cross wires. Embedded in the normal thickness of the brickwork joint, the wires are manufactured to a flattened profile to simplify location in the mortar joint without steel build-up problems at lap positions, corners/junctions or when used in conjunction with wall ties. Brickforce has been in continuous use since 1918. It is placed in the cement between courses of bricks. Brickforce is used in every course of the foundations and in every course above lintel height; otherwise, it is used every fourth course between the foundations and lintel height, or between a concrete slab and lintel height. Brickforce strengthens and stabilizes the wall, especially when building on unstable ground, and provides the brickwork with increased resistance to tensile stresses. Brickforce uses high-tensile steel wires, which can withstand high forces with very little stretch. This helps to keep crack widths to a minimum. Recently, a debate has opened about the purpose of using brickforce in single-storey buildings. The debate has been compounded by the fact that there is no consensus about the spacing of brickforce in brickwork or masonry. In addition, very little information has been published on the relative merits of using the same size of brickforce for the different atmospheric conditions in South Africa. This paper aims to compare the different types of brickforce systems used in different countries. Conclusions are made to identify the placement and location of brickforce that optimize the system.

Keywords: brickforce, masonry concrete, reinforcement, strengthening, wall panels

Procedia PDF Downloads 230
24203 Project Management Practices and Operational Challenges in Conflict Areas: Case Study Kewot Woreda North Shewa Zone, Amhara Region, Ethiopia

Authors: Rahel Birhane Eshetu

Abstract:

This research investigates the complex landscape of project management practices and operational challenges in conflict-affected areas, with a specific focus on Kewot Woreda in the North Shewa Zone of the Amhara region in Ethiopia. The study aims to identify essential project management methodologies, the significant operational hurdles faced, and the adaptive strategies employed by project managers in these challenging environments. Utilizing a mixed-methods approach, the research combines qualitative and quantitative data collection. Initially, a comprehensive literature review was conducted to establish a theoretical framework. This was followed by the administration of questionnaires to gather empirical data, which was then analyzed using statistical software. This sequential approach ensures a robust understanding of the context and challenges faced by project managers. The findings reveal that project managers in conflict zones encounter a range of escalating challenges. Initially, they must contend with immediate security threats and the presence of displaced populations, which significantly disrupt project initiation and execution. As projects progress, additional challenges arise, including limited access to essential resources and environmental disruptions such as natural disasters. These factors exacerbate the operational difficulties that project managers must navigate. In response to these challenges, the study highlights the necessity for project managers to implement formal project plans while simultaneously adopting adaptive strategies that evolve over time. Key adaptive strategies identified include flexible risk management frameworks, change management practices, and enhanced stakeholder engagement approaches. These strategies are crucial for maintaining project momentum and ensuring that objectives are met despite the unpredictable nature of conflict environments. The research emphasizes that structured scope management, clear documentation, and thorough requirements analysis are vital components for effectively navigating the complexities inherent in conflict-affected regions. However, the ongoing threats and logistical barriers necessitate a continuous adjustment to project management methodologies. This adaptability is not only essential for the immediate success of projects but also for fostering long-term resilience within the community. Concluding, the study offers actionable recommendations aimed at improving project management practices in conflict zones. These include the adoption of adaptive frameworks specifically tailored to the unique conditions of conflict environments and targeted training for project managers. Such training should focus on equipping managers with the skills to better address the dynamic challenges presented by conflict situations. The insights gained from this research contribute significantly to the broader field of project management, providing a practical guide for practitioners operating in high-risk areas. By emphasizing sustainable and resilient project outcomes, this study underscores the importance of adaptive management strategies in ensuring the success of projects in conflict-affected regions. The findings serve not only to enhance the understanding of project management practices in Kewot Woreda but also to inform future research and practice in similar contexts, ultimately aiming to promote stability and development in areas beset by conflict.

Keywords: project management practices, operational challenges, conflict zones, adaptive strategies

Procedia PDF Downloads 15
24202 Geographic Information Systems and Remotely Sensed Data for the Hydrological Modelling of Mazowe Dam

Authors: Ellen Nhedzi Gozo

Abstract:

The unavailability of adequate hydro-meteorological data has always limited the analysis and understanding of the hydrological behaviour of several dam catchments, including Mazowe Dam in Zimbabwe. The problem of insufficient data for the Mazowe Dam catchment analysis was solved by extracting catchment characteristics and areal hydro-meteorological data from ASTER, LANDSAT and Shuttle Radar Topography Mission (SRTM) remote sensing (RS) images using ILWIS, ArcGIS and ERDAS Imagine geographic information system (GIS) software. Available observed hydrological and meteorological data complemented the remotely sensed information. Ground-truth land cover was mapped using a Garmin eTrex global positioning system (GPS). This information was then used to validate the land cover classification detail obtained from the remote sensing images. A bathymetry survey was conducted using a SONAR system connected to the GPS. Hydrological modelling using the HBV model was then performed to simulate the hydrological processes of the catchment in an effort to verify the reliability of the derived parameters. The model output shows a high Nash-Sutcliffe coefficient, close to 1, indicating that the parameters derived from remote sensing and GIS can be applied with confidence in the analysis of the Mazowe Dam catchment.
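
For reference, the Nash-Sutcliffe coefficient reported above is computed as in the sketch below; the flow values are illustrative, not the Mazowe Dam record.

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - sum((Qobs - Qsim)^2) / sum((Qobs - mean(Qobs))^2).
    Values close to 1 indicate that the model reproduces observed flows well."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

# illustrative monthly inflows (m3/s): observed vs. HBV-simulated
q_obs = [12.1, 30.4, 55.2, 40.8, 22.3, 10.7]
q_sim = [11.5, 28.9, 57.0, 39.2, 24.0, 11.3]
print(round(nash_sutcliffe(q_obs, q_sim), 3))
```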

Keywords: geographic information systems, hydrological modelling, remote sensing, water resources management

Procedia PDF Downloads 336
24201 Facile Design of Combined Photoelectrochemical-Fenton Coupling Nanocomposites for Antibiotic Elimination

Authors: Xinyong Li

Abstract:

A new coupling system was constructed by combining a photo-electrochemical cell with an electro-Fenton cell (PEC-EF). The electrode material in this system was derived from a MnyFe₁₋yCo Prussian-Blue analog (PBA). Mn₀.₄Fe₀.₆Co₀.₆₇-N@C spin-coated on carbon paper served as the gas diffusion cathode, and Mn₀.₄Fe₀.₆Co₀.₆₇O₂.₂ spin-coated on fluorine-doped tin oxide (FTO) glass served as the anode. The two separate cells could degrade sulfamethoxazole (SMX) simultaneously, and some of the coupling mechanisms by which PEC and EF enhance the degradation efficiency were investigated. The continuous on-site generation of H₂O₂ at the cathode through an oxygen reduction reaction (ORR) was realized over a rotating ring-disk electrode (RRDE). The electron transfer number (n) of the ORR with Mn₀.₄Fe₀.₆Co₀.₆₇-N@C was 2.5 in the selected potential and pH range. The photo-electrochemical properties of Mn₀.₄Fe₀.₆Co₀.₆₇O₂.₂ were systematically studied and displayed a good response towards visible light. The photo-induced electrons at the anode can transfer to the cathode for further use. Efficient photo-electro-catalytic performance was observed in degrading SMX: almost 100% SMX removal was achieved in 120 min. This work not only provides a highly effective technique for antibiotic treatment but also reveals the synergic effect between PEC and EF.

Keywords: Electro-Fenton, photo-electrochemical, synergic effect, sulfamethoxazole

Procedia PDF Downloads 142
24200 A Bayesian Model with Improved Prior in Extreme Value Problems

Authors: Eva L. Sanjuán, Jacinto Martín, M. Isabel Parra, Mario M. Pizarro

Abstract:

In Extreme Value Theory, inference on the parameters of the distribution is made employing only a small part of the observed values. When block maxima are taken, many data points are discarded. We developed a new Bayesian inference model to exploit all the information provided by the data, introducing informative priors and using the relations between baseline and limit parameters. Firstly, we studied the accuracy of the new model for three baseline distributions that lead to a Gumbel extreme distribution: Exponential, Normal and Gumbel. Secondly, we considered mixtures of Normal variables, to simulate practical situations in which the data do not adjust to pure distributions because of perturbations (noise).
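
As a simple illustration of the baseline-to-limit parameter relations the model exploits, the block maxima of an Exponential(λ) baseline converge to a Gumbel law whose location and scale are functions of λ and the block size n; the Normal and mixture cases used in the study follow analogous, though less compact, relations.

```latex
% Gumbel limit for block maxima of an Exponential(\lambda) baseline (illustrative of the
% relations between baseline and limit parameters exploited by the informative prior):
F_{M_n}(x) = \bigl(1 - e^{-\lambda x}\bigr)^{n} \;\approx\;
  \exp\!\Bigl[-\exp\!\Bigl(-\tfrac{x-\mu_n}{\sigma}\Bigr)\Bigr],
\qquad \mu_n = \frac{\ln n}{\lambda}, \quad \sigma = \frac{1}{\lambda}.
```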

Keywords: bayesian inference, extreme value theory, Gumbel distribution, highly informative prior

Procedia PDF Downloads 198
24199 Serviceability of Fabric-Formed Concrete Structures

Authors: Yadgar Tayfur, Antony Darby, Tim Ibell, Mark Evernden, John Orr

Abstract:

Fabric formwork is a technique for casting concrete structures with the great advantage of saving up to 40% of concrete material. This technique is particularly associated with optimized concrete structures, which usually have smaller cross-section dimensions than equivalent prismatic members. However, this can leave the structural system produced from these members with smaller serviceability safety margins. Therefore, it is very important to understand the serviceability of non-prismatic concrete structures. In this paper, an analytical computer-based model to optimize concrete beams and to predict the load-deflection behaviour of both prismatic and non-prismatic concrete beams is presented. The model was developed based on the method of sectional analysis and the integration of curvatures. Results from the analytical model were compared to the load-deflection behaviour of a number of beams with different geometric and material properties reported by other researchers. The comparison shows that the analytical program can accurately predict the load-deflection response of concrete beams with medium reinforcement ratios, although it over-estimates deflection values for lightly reinforced specimens. Finally, the analytical program acceptably predicted the load-deflection behaviour of non-prismatic concrete beams.
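
A rough sketch of the "integration of curvatures" step: once the sectional analysis yields the curvature distribution along the span, deflections follow from double numerical integration with the support conditions enforced. The curvature profile and span below are invented for illustration, and a simply supported beam is assumed.

```python
import numpy as np

def deflection_from_curvature(x, kappa):
    """Deflection of a simply supported beam by double integration of curvature.
    v''(x) = kappa(x); the linear term is chosen so that v(0) = v(L) = 0."""
    theta = np.concatenate(([0.0], np.cumsum(0.5 * (kappa[1:] + kappa[:-1]) * np.diff(x))))
    v = np.concatenate(([0.0], np.cumsum(0.5 * (theta[1:] + theta[:-1]) * np.diff(x))))
    v -= v[-1] * x / x[-1]   # enforce zero deflection at both supports
    return v

# illustrative curvature distribution (1/m) along a 6 m span of a non-prismatic beam
x = np.linspace(0.0, 6.0, 61)
kappa = 4e-3 * np.sin(np.pi * x / 6.0) * (1.0 + 0.2 * np.cos(2 * np.pi * x / 6.0))
v = deflection_from_curvature(x, kappa)
print(f"midspan deflection of about {abs(v[30]) * 1000:.1f} mm")
```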

Keywords: fabric-formed concrete, continuous beams, optimisation, serviceability

Procedia PDF Downloads 372
24198 Quantitative, Preservative Methodology for Review of Interview Transcripts Using Natural Language Processing

Authors: Rowan P. Martnishn

Abstract:

During the execution of a National Endowment for the Arts grant, approximately 55 interviews were collected from professionals across various fields. These interviews were used to create deliverables: historical connections for creations that began as art and evolved entirely into computing technology. With dozens of hours' worth of transcripts to be analyzed by qualitative coders, a quantitative methodology was created to sift through the documents. The initial step was to both clean and format all the data. First, a basic spelling and grammar check was applied, as well as a Python script for normalized formatting which used an open-source grammatical formatter to make the data as coherent as possible. Ten documents were randomly selected for manual review, where words often incorrectly transcribed during transcription were recorded and replaced throughout all other documents. Then, to remove all banter and side comments, the transcripts were split into paragraphs (separated by change in speaker) and all paragraphs with fewer than 300 characters were removed. Secondly, a keyword extractor, a form of natural language processing in which significant words in a document are selected, was run on each paragraph for all interviews. Every proper noun was put into a data structure corresponding to its respective interview. From there, a Bidirectional and Auto-Regressive Transformer (B.A.R.T.) summary model was applied to each paragraph that included any of the proper nouns selected from the interview. At this stage, the information to review had been reduced from about 60 hours' worth of data to 20. The data was further processed through light manual observation: any summaries which proved to fit the criteria of the proposed deliverable were selected, along with their locations within the document. This narrowed the data down to about 5 hours' worth of processing. The qualitative researchers were then able to find 8 more connections in addition to our previous 4, exceeding our minimum quota of 3 to satisfy the grant. Major findings of the study and subsequent curation of this methodology raised a conceptual point crucial to working with qualitative data of this magnitude. In the use of artificial intelligence, there is a general trade-off in a model between breadth of knowledge and specificity. If the model has too much knowledge, the user risks leaving out important data (too general). If the tool is too specific, it has not seen enough data to be useful. Thus, this methodology proposes a solution to this trade-off. The data is never altered outside of grammatical and spelling checks. Instead, the important information is marked, creating an indicator of where the significant data is without compromising its purity. Secondly, the data is chunked into smaller paragraphs, giving specificity, and then cross-referenced with the keywords (allowing generalization over the whole document). This way, no data is harmed, and qualitative experts can go over the raw data instead of using highly manipulated results. Given the success in deliverable creation as well as the circumvention of this trade-off, this methodology should stand as a model for synthesizing qualitative data while maintaining its original form.
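
A minimal sketch of the keyword-extraction-plus-summarization stage described above, assuming spaCy for proper-noun extraction and the Hugging Face implementation of BART for summarization; the study's exact tooling and thresholds (beyond the 300-character cutoff) are not specified, and the transcript text here is invented.

```python
import spacy
from transformers import pipeline

nlp = spacy.load("en_core_web_sm")                     # small English model, assumed available
summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

paragraphs = [
    "In 1984 the studio's plotter drawings were digitized, and that pipeline later "
    "became the rendering backend adopted by the lab in Cambridge.",   # invented transcript text
    "We mostly talked about lunch.",                   # banter that should be filtered out
]

# keep paragraphs long enough to be substantive (the study used a 300-character cutoff;
# a shorter cutoff is used here only so the toy example survives filtering)
paragraphs = [p for p in paragraphs if len(p) >= 60]

for para in paragraphs:
    proper_nouns = {tok.text for tok in nlp(para) if tok.pos_ == "PROPN"}
    if not proper_nouns:
        continue                                       # only summarize paragraphs with named entities
    summary = summarizer(para, max_length=40, min_length=10, do_sample=False)[0]["summary_text"]
    print(proper_nouns, "->", summary)
```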

Keywords: B.A.R.T. model, keyword extractor, natural language processing, qualitative coding

Procedia PDF Downloads 29
24197 Culture and Commodification: A Study of William Gibson's the Bridge Trilogy

Authors: Aruna Bhat

Abstract:

Culture can be placed within the social structure that embodies both the creation of social groups and the manner in which they interact with each other. As many critics have pointed out, culture in the postmodern context has often been considered a commodity, and indeed it shares many attributes with commercial products. Popular culture follows many patterns of behavior derived from economics, from the simple principle of supply and demand to the creation of marketable demographics which fit certain criteria. This trend is clearly visible in contemporary fiction, especially in contemporary science fiction, and in cyberpunk fiction in particular, which is an offshoot of pure science fiction. William Gibson is one such author who portrays such a scenario in his works, and in his Bridge Trilogy he adds another level of interpretation to this state of affairs by describing a world centered on industrialization of a new kind, one that revolves around data in cyberspace. In this new world, data has become the most important commodity, and man has become nothing but a nodal point in a vast ocean of raw data, resulting in the commodification of everything, including culture. This paper attempts to study the presence of the above-mentioned elements in William Gibson's The Bridge Trilogy. The theories applied are Postmodernism and Cultural Studies.

Keywords: culture, commodity, cyberpunk, data, postmodern

Procedia PDF Downloads 505
24196 Impact of Safety and Quality Considerations of Housing Clients on the Construction Firms’ Intention to Adopt Quality Function Deployment: A Case of Construction Sector

Authors: Saif Ul Haq

Abstract:

The current study intends to examine the safety and quality considerations of clients of housing projects and their impact on the adoption of Quality Function Deployment (QFD) by construction firms. A mixed-method research technique has been used to collect and analyze the data, wherein a survey was conducted to collect data from 220 clients of housing projects in Saudi Arabia. Then, telephonic and Skype interviews were conducted to collect data from 15 professionals working in the top ten real estate companies of Saudi Arabia. Data were analyzed using partial least squares (PLS) and thematic analysis techniques. Findings reveal that today's customers prioritize the safety and quality requirements of their houses and, as a result, construction firms adopt QFD to address the needs of customers. The findings are of great importance for the clients of housing projects as well as for construction firms, as they could apply QFD in housing projects to address the safety and quality concerns of their clients.

Keywords: construction industry, quality considerations, quality function deployment, safety considerations

Procedia PDF Downloads 125
24195 Assisted Prediction of Hypertension Based on Heart Rate Variability and Improved Residual Networks

Authors: Yong Zhao, Jian He, Cheng Zhang

Abstract:

Cardiovascular diseases caused by hypertension are extremely threatening to human health, and early diagnosis of hypertension can save a large number of lives. Traditional hypertension detection methods require special equipment and cannot easily track continuous blood pressure changes. In this regard, this paper first analyzes the principle of heart rate variability (HRV) and introduces a sliding window and power spectral density (PSD) to analyze the time-domain and frequency-domain features of HRV. Secondly, it designs an HRV-based hypertension prediction network that combines a ResNet, an attention mechanism, and a multilayer perceptron: frequency-domain features are extracted through a modified ResNet18, fused with time-domain features through an attention mechanism, and used for auxiliary prediction of hypertension through a multilayer perceptron. Finally, the network was trained and tested using the publicly available SHAREE dataset on PhysioNet, and the test results showed that this network achieved 92.06% prediction accuracy for hypertension and outperformed K-Nearest Neighbor (KNN), Bayes, Logistic, and traditional Convolutional Neural Network (CNN) models in prediction performance.
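
A simplified PyTorch sketch of the architecture described above: a small 1-D convolutional branch stands in for the modified ResNet18, an attention layer fuses frequency- and time-domain HRV features, and an MLP head produces the prediction. Layer sizes, the attention formulation and the input dimensions are illustrative assumptions, not the paper's exact design.

```python
import torch
import torch.nn as nn

class HRVHypertensionNet(nn.Module):
    """Simplified sketch: a small 1-D convolutional branch (standing in for the
    modified ResNet18) encodes frequency-domain HRV features, an attention layer
    fuses them with time-domain HRV features, and an MLP predicts hypertension."""

    def __init__(self, psd_bins=128, time_feats=6, hidden=64):
        super().__init__()
        self.freq_branch = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=7, padding=3), nn.BatchNorm1d(16), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.BatchNorm1d(32), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(), nn.Linear(32, hidden),
        )
        self.time_branch = nn.Linear(time_feats, hidden)
        self.attn = nn.MultiheadAttention(embed_dim=hidden, num_heads=4, batch_first=True)
        self.head = nn.Sequential(nn.Linear(hidden, 32), nn.ReLU(), nn.Linear(32, 2))

    def forward(self, psd, time_feats):
        f = self.freq_branch(psd.unsqueeze(1))        # (B, hidden) from the PSD window
        t = self.time_branch(time_feats)              # (B, hidden) from time-domain indices
        tokens = torch.stack([f, t], dim=1)           # (B, 2, hidden)
        fused, _ = self.attn(tokens, tokens, tokens)  # self-attention over the two feature sets
        return self.head(fused.mean(dim=1))           # logits: normotensive vs. hypertensive

# illustrative forward pass on random data (sliding-window PSD + time-domain HRV indices)
model = HRVHypertensionNet()
psd = torch.randn(8, 128)
time_feats = torch.randn(8, 6)
print(model(psd, time_feats).shape)   # torch.Size([8, 2])
```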

Keywords: feature extraction, heart rate variability, hypertension, residual networks

Procedia PDF Downloads 106
24194 Implementation of Lean Manufacturing in Some Companies in Colombia: A Case Study

Authors: Natalia Marulanda, Henry González, Gonzalo León, Alejandro Hincapié

Abstract:

Continuous improvement tools are the result of a set of studies that developed theories and methodologies. These methodologies enable organizations to increase their levels of efficiency, effectiveness, and productivity. Based on these methodologies, the lean manufacturing philosophy, which rests on the optimization of resources, the elimination of waste, and the generation of value in products and services, was developed. Lean adoption has been massive globally, but Colombian companies have adopted it only incipiently. Therefore, the purpose of this article is to identify the impacts generated by the implementation of lean manufacturing tools in five companies located in Colombia, in the Medellín metropolitan area. It also seeks to compare the results obtained from the implementation of the lean philosophy and the Theory of Constraints. The methodology is both qualitative and quantitative and is based on case-study interviews with the leaders of the processes that used lean tools. The tools most used by the studied companies are 5S, with 100%, and TPM, with 80%. The least used tool is synchronous production, with 20%. The main reason for the implementation of lean was supply chain management, with 83.3%. For the application of lean and TOC, we did not find significant differences in impact in terms of methodology, areas of application, staff initiatives, supply chain management, planning, and training.

Keywords: business strategy, lean manufacturing, theory of constraints, supply chain

Procedia PDF Downloads 354
24193 Customers’ Acceptability of Islamic Banking: Employees’ Perspective in Peshawar

Authors: Tahira Imtiaz, Karim Ullah

Abstract:

This paper aims to incorporate bank employees’ perspective on the acceptability of Islamic banking by the customers of Peshawar. A qualitative approach is adopted, for which six in-depth interviews with employees of Islamic banks are conducted. The employees were asked to share their experience regarding customers’ acceptance of Islamic banking. The collected data were analyzed through a thematic analysis technique and synthesized with the current literature. Through the data analysis, a theoretical framework is developed which highlights the factors that drive customers towards Islamic banking, as witnessed by the employees. The practical implication of the analyzed data is that a new model could be developed on the basis of four determinants of human preference, namely inner satisfaction, time, faith and market forces.

Keywords: customers’ attraction, employees’ perspective, Islamic banking, Riba

Procedia PDF Downloads 333
24192 Customized Design of Amorphous Solids by Generative Deep Learning

Authors: Yinghui Shang, Ziqing Zhou, Rong Han, Hang Wang, Xiaodi Liu, Yong Yang

Abstract:

The design of advanced amorphous solids, such as metallic glasses, with targeted properties through artificial intelligence signifies a paradigmatic shift in physical metallurgy and materials technology. Here, we developed a machine-learning architecture that facilitates the generation of metallic glasses with targeted multifunctional properties. Our architecture integrates the state-of-the-art unsupervised generative adversarial network model with supervised models, allowing the incorporation of general prior knowledge derived from thousands of data points across a vast range of alloy compositions, into the creation of data points for a specific type of composition, which overcame the common issue of data scarcity typically encountered in the design of a given type of metallic glasses. Using our generative model, we have successfully designed copper-based metallic glasses, which display exceptionally high hardness or a remarkably low modulus. Notably, our architecture can not only explore uncharted regions in the targeted compositional space but also permits self-improvement after experimentally validated data points are added to the initial dataset for subsequent cycles of data generation, hence paving the way for the customized design of amorphous solids without human intervention.
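
A heavily simplified sketch of a generative-adversarial setup over alloy-composition vectors, in the spirit of the architecture described above; the network sizes, the softmax composition constraint and the placeholder "real" data are assumptions for illustration and do not reproduce the authors' model or its supervised property predictors.

```python
import torch
import torch.nn as nn

# Simplified GAN sketch over alloy-composition vectors (fractions of a fixed element set).
N_ELEMENTS, LATENT = 8, 16

generator = nn.Sequential(
    nn.Linear(LATENT, 64), nn.ReLU(),
    nn.Linear(64, N_ELEMENTS), nn.Softmax(dim=-1),   # fractions sum to 1
)
discriminator = nn.Sequential(
    nn.Linear(N_ELEMENTS, 64), nn.LeakyReLU(0.2),
    nn.Linear(64, 1),                                # real-vs-generated logit
)

opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

real = torch.rand(32, N_ELEMENTS)
real = real / real.sum(dim=1, keepdim=True)          # placeholder "real" compositions

for step in range(100):
    # discriminator update: distinguish real compositions from generated ones
    fake = generator(torch.randn(32, LATENT)).detach()
    d_loss = bce(discriminator(real), torch.ones(32, 1)) + \
             bce(discriminator(fake), torch.zeros(32, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()
    # generator update: fool the discriminator
    fake = generator(torch.randn(32, LATENT))
    g_loss = bce(discriminator(fake), torch.ones(32, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

print(generator(torch.randn(1, LATENT)))             # one candidate composition vector
```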

Keywords: metallic glass, artificial intelligence, mechanical property, automated generation

Procedia PDF Downloads 56
24191 Effect of Planting Techniques on Mangrove Seedling Establishment in Kuwait Bay

Authors: L. Al-Mulla, B. M. Thomas, N. R. Bhat, M. K. Suleiman, P. George

Abstract:

Mangroves are halophytic shrubs that inhabit the intertidal zones of the tropics and subtropics, forming a complex and highly dynamic coastal ecosystem. Historical evidence indicates the existence and subsequent extinction of mangroves in Kuwait; hence, continuous projects have been established to reintroduce this plant to the marine ecosystem. One of the major challenges in establishing large-scale mangrove plantations in Kuwait is the very high rate of seedling mortality, which should ideally be less than 20%. This study was conducted at three selected locations in Kuwait Bay during 2016-2017 to evaluate the effect of four planting techniques on mangrove seedling establishment. The coir-pillow, comp-mat, and anchored container planting techniques were compared with the conventional planting method. The study revealed that the planting techniques significantly affected the establishment of mangrove seedlings in the initial stages of growth. Location-specific differences in seedling establishment were also observed during the course of the study. However, irrespective of the planting technique employed, high seedling mortality was observed at all planting locations towards the end of the study, which may be attributed to the physicochemical characteristics of the mudflats selected.

Keywords: Avicennia marina (Forsk.) Vierh, coastal pollution, heavy metal accumulation, marine ecosystem, sedimentation, tidal inundation

Procedia PDF Downloads 152