Search results for: dielectric methods
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 15162

13722 Application of Transportation Linear Programming Algorithms to Cost Reduction in Nigeria Soft Drinks Industry

Authors: Salami Akeem Olanrewaju

Abstract:

The transportation models or problems are primarily concerned with the optimal (best possible) way in which a product produced at different factories or plants (called supply origins) can be transported to a number of warehouses or customers (called demand destinations). The objective in a transportation problem is to fully satisfy the destination requirements within the operating production capacity constraints at the minimum possible cost. The objective of this study is to determine ways of minimizing transportation cost in order to maximize profit. Data were gathered from the records of the Distribution Department of 7-Up Bottling Company Plc. Ilorin, Kwara State, Nigeria. The data were analyzed using SPSS (Statistical Package for the Social Sciences) while applying the three standard methods of solving a transportation problem. The three methods produced the same results; therefore, any of the methods can be adopted by the company in transporting its final products to the wholesale dealers in order to minimize total production cost.
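As an illustrative sketch of one of the classical solution methods the abstract refers to, the snippet below computes an initial basic feasible solution with the northwest-corner rule on a small balanced instance; the supply, demand, and cost figures are hypothetical, not the company's data:

```python
def northwest_corner(supply, demand):
    """Initial basic feasible solution via the northwest-corner rule."""
    supply, demand = supply[:], demand[:]      # work on copies
    alloc = [[0] * len(demand) for _ in supply]
    i = j = 0
    while i < len(supply) and j < len(demand):
        q = min(supply[i], demand[j])          # ship as much as possible
        alloc[i][j] = q
        supply[i] -= q
        demand[j] -= q
        if supply[i] == 0:
            i += 1                             # plant exhausted: move down
        else:
            j += 1                             # depot satisfied: move right
    return alloc

def total_cost(alloc, cost):
    """Total shipping cost of an allocation."""
    return sum(a * c for row_a, row_c in zip(alloc, cost)
               for a, c in zip(row_a, row_c))

# Balanced toy instance: 2 plants, 3 depots (hypothetical numbers).
supply = [30, 40]
demand = [20, 25, 25]
cost = [[4, 6, 8],
        [5, 3, 7]]
alloc = northwest_corner(supply, demand)       # → [[20, 10, 0], [0, 15, 25]]
```

The least-cost and Vogel's approximation methods would produce their own starting allocations from the same data; all are then refined toward the optimum.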

Keywords: cost minimization, resources utilization, distribution system, allocation problem

Procedia PDF Downloads 238
13721 Heart Ailment Prediction Using Machine Learning Methods

Authors: Abhigyan Hedau, Priya Shelke, Riddhi Mirajkar, Shreyash Chaple, Mrunali Gadekar, Himanshu Akula

Abstract:

The heart is the coordinating centre of the body's major endocrine glandular structure, producing hormones that profoundly affect the operations of the body, and diagnosing cardiovascular disease is a difficult but critical task. By extracting knowledge and information about the disease from patient data, data mining is a practical technique to help doctors detect disorders. We use a variety of machine learning methods here, including logistic regression, support vector classifiers (SVC), k-nearest neighbour (KNN) classifiers, decision tree classifiers, random forest classifiers and gradient boosting classifiers. These algorithms are applied to patient data containing 13 different factors to build a system that predicts heart disease in less time and with greater accuracy.
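For illustration, a minimal k-nearest-neighbours classifier of the kind listed above can be written from scratch; the two "clinical features" and labels below are toy values, not the 13-factor dataset used in the study:

```python
import math
from collections import Counter

def knn_predict(train_X, train_y, x, k=3):
    """Classify x by majority vote among its k nearest training points."""
    dists = sorted((math.dist(p, x), label)
                   for p, label in zip(train_X, train_y))
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Toy data: two hypothetical clinical features; label 1 = disease present.
train_X = [(1.0, 1.0), (1.2, 0.8), (0.9, 1.1),
           (4.0, 4.2), (4.1, 3.9), (3.8, 4.0)]
train_y = [0, 0, 0, 1, 1, 1]
pred = knn_predict(train_X, train_y, (4.0, 4.0))  # query falls in cluster 1
```

A real system would also standardise the 13 features and tune k by cross-validation.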

Keywords: logistic regression, support vector classifier, k-nearest neighbour, decision tree, random forest and gradient boosting

Procedia PDF Downloads 31
13720 Analysis of DC/DC Converter of Photovoltaic System with MPPT Algorithms Comparison

Authors: Badr M. Alshammari, Mohamed A. Khlifi

Abstract:

This paper presents an analysis of the DC/DC converter, including a comparative study of control methods to extract the maximum power and to track the maximum power point (MPP) of photovoltaic (PV) systems under changing environmental conditions. Two maximum power point tracking (MPPT) algorithms are proposed, one based on perturb and observe (P&O) control and the other on first-order incremental conductance (IC). The MPPT system ensures that the solar cells deliver the maximum possible power to the load. Here we compare the two algorithms and simulate the photovoltaic system with each of them. The algorithms control the duty cycle of a DC-DC converter in order to boost the output voltage of the PV generator and guarantee operation of the solar panels at the MPP. Simulation and experimental results show that the proposed algorithms can effectively improve the efficiency of a photovoltaic array output.
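A minimal sketch of the P&O algorithm mentioned above, hill-climbing on a toy single-peak P-V curve; the curve shape and its MPP near 17 V are hypothetical, not measured panel data:

```python
def pv_power(v):
    """Toy PV power-voltage curve with a single maximum near v = 17.0 V."""
    return max(0.0, -0.5 * (v - 17.0) ** 2 + 60.0)

def perturb_and_observe(v=12.0, step=0.1, iterations=200):
    """Keep perturbing the operating voltage in the direction that
    increased power; reverse direction whenever power drops."""
    p_prev = pv_power(v)
    direction = 1.0
    for _ in range(iterations):
        v += direction * step
        p = pv_power(v)
        if p < p_prev:            # power fell: reverse the perturbation
            direction = -direction
        p_prev = p
    return v

v_mpp = perturb_and_observe()     # settles into oscillation around the MPP
```

In a converter, the perturbed quantity is the duty cycle rather than the voltage directly, and the step size trades tracking speed against steady-state oscillation.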

Keywords: solar cell, DC/DC boost converter, MPPT, photovoltaic system

Procedia PDF Downloads 181
13719 Icephobic and Hydrophobic Behaviour of Laser Patterned Transparent Coatings

Authors: Bartłomiej Przybyszewski, Rafał Kozera, Anna Boczkowska, Maciej Traczyk, Paulina Kozera, Malwina Liszewska, Daria Pakuła

Abstract:

The goal of this work was to reduce or completely eliminate the accumulation of dirt, snow and ice on transparent coatings by achieving self-cleaning and icephobic properties. The research involved laser surface texturing of chemically modified coatings from the epoxy materials group, and their hybrids, used to protect glass surfaces. For this purpose, two methods of surface structuring were used, preceded by volumetric modification of the chemical composition with proprietary organosilicon compounds and/or mineral additives. An attractive aspect of the work was the development of efficient and, most importantly, durable coatings with self-cleaning and icephobic properties that reduce or prevent dirt build-up and the adhesion of water, snow and ice. With a view to future industrial application of the developed technologies, all methods meet the requirements for large-scale practical use.

Keywords: icephobic coatings, hydrophobic coatings, transparent coatings, laser patterning

Procedia PDF Downloads 88
13718 BIM Application Research Based on the Main Entrance and Garden Area Project of Shanghai Disneyland

Authors: Ying Yuken, Pengfei Wang, Zhang Qilin, Xiao Ben

Abstract:

Based on the main entrance and garden area (ME&G) project of Shanghai Disneyland, this paper introduces the application of BIM technology in this kind of low-rise comprehensive building with complex facade system, electromechanical system and decoration system. BIM technology is applied to the whole process of design, construction and completion of the whole project. With the construction of BIM application framework of the whole project, the key points of BIM modeling methods of different systems and the integration and coordination of BIM models are elaborated in detail. The specific application methods of BIM technology in similar complex low-rise building projects are sorted out. Finally, the paper summarizes the benefits of BIM technology application, and puts forward some suggestions for BIM management mode and practical application of similar projects in the future.

Keywords: BIM, complex low-rise building, BIM modeling, model integration and coordination, 3D scanning

Procedia PDF Downloads 150
13717 Q Slope Rock Mass Classification and Slope Stability Assessment Methodology Application in Steep Interbedded Sedimentary Rock Slopes for a Motorway Constructed North of Auckland, New Zealand

Authors: Azariah Sosa, Carlos Renedo Sanchez

Abstract:

The development of a new motorway north of Auckland (New Zealand) includes steep rock cuts, from 63 up to 85 degrees, in an interbedded sandstone and siltstone rock mass of the geological unit Waitemata Group (Pakiri Formation), which shows sub-horizontal bedding planes, various sub-vertical joint sets, and a diverse weathering profile. In this kind of rock mass, which can be classified as a weak rock, the definition of the maximum stable geometry is not governed only by the discontinuities and defects evident in the rock; it is also important to consider the global stability of the rock slope, including in the analysis the rock mass characterisation, the influence of groundwater, the geological evolution, and the weathering processes. Depending on the weakness of the rock and the processes it has suffered, the global stability could, in fact, be a more restrictive element than the potential instability of individual blocks along discontinuities. This paper discusses the elements that govern the stability of rock slopes constructed in a rock formation with favourable bedding and distribution of discontinuities (horizontal and vertical) but with weak behaviour in terms of global rock mass characterisation. In this context, classifications such as Q-Slope and the slope stability assessment methodology (SSAM) have proven to be important tools which complement the assessment of global stability, together with the analytical tools related to wedge-type failures and limit equilibrium methods. The paper focuses on the applicability of these two new empirical classifications for evaluating slope stability in 18 already excavated rock slopes in the Pakiri Formation, through comparison between the predicted and observed stability issues and by reviewing the outcome of analytical methods (Rocscience slope stability software suite) against the expected stability determined from these rock classifications.
This exercise will help validate such findings and correlations arising from the two empirical methods in order to adjust the methods to the nature of this specific kind of rock mass and provide a better understanding of the long-term stability of the slopes studied.

Keywords: Pakiri formation, Q-slope, rock slope stability, SSAM, weak rock

Procedia PDF Downloads 195
13716 Developing Ergonomic Prototype Testing Method for Manual Material Handling

Authors: Yusuf Nugroho Doyo Yekti, Budi Praptono, Fransiskus Tatas Dwi Atmaji

Abstract:

There is as yet no ergonomic prototype testing method for manual material handling. This study was carried out to demonstrate a comprehensive ergonomic assessment. Ergonomic assessment is important for improving the safety of products and for ensuring the usefulness of the product. The prototype testing involved a few intended users as well as ordinary people. In this study, four operators participated in several tests, and 30 ordinary people joined the usability test. None of the ordinary participants had ever performed manual material handling or used a material handling device. The methods used in the tests are Rapid Entire Body Assessment (REBA), Recommended Weight Limit (RWL), and Cardiovascular Load (%CVL), in addition to a usability test and questionnaire. The proposed testing methods cover comprehensive ergonomic aspects, i.e., the physical, mental, and emotional aspects of the human operator.
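As a sketch of the RWL method mentioned above, the function below implements the metric form of the NIOSH lifting equation as commonly published; the frequency and coupling multipliers default to 1.0 here rather than being looked up from the NIOSH tables, and the posture values are purely illustrative:

```python
def recommended_weight_limit(H, V, D, A, FM=1.0, CM=1.0):
    """NIOSH lifting equation (metric): RWL = LC*HM*VM*DM*AM*FM*CM.

    H: horizontal hand distance (cm), V: vertical origin height (cm),
    D: vertical travel distance (cm), A: asymmetry angle (degrees).
    FM and CM normally come from the NIOSH frequency/coupling tables.
    """
    LC = 23.0                             # load constant, kg
    HM = min(1.0, 25.0 / H)               # horizontal multiplier
    VM = 1.0 - 0.003 * abs(V - 75.0)      # vertical multiplier
    DM = min(1.0, 0.82 + 4.5 / D)         # distance multiplier
    AM = 1.0 - 0.0032 * A                 # asymmetry multiplier
    return LC * HM * VM * DM * AM * FM * CM

# Ideal posture: all multipliers equal 1, so RWL equals the 23 kg constant.
rwl = recommended_weight_limit(H=25, V=75, D=25, A=0)
```

The lifting index (actual load divided by RWL) then flags tasks exceeding 1.0 as candidates for redesign.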

Keywords: ergonomic, manual material handling, prototype testing, assessment

Procedia PDF Downloads 502
13715 Effects of Stokes Shift and Purcell Enhancement in Fluorescence Assisted Radiative Cooling

Authors: Xue Ma, Yang Fu, Dangyuan Lei

Abstract:

Passive daytime radiative cooling is an emerging technology which has attracted worldwide attention in recent years due to its huge potential for cooling buildings without the use of electricity. Various coating materials with different optical properties have been developed to improve daytime radiative cooling performance. However, commercial cooling coatings comprising functional fillers with optical bandgaps within the solar spectral range suffer from severe intrinsic absorption, limiting their cooling performance. Fortunately, it has recently been demonstrated that introducing fluorescent materials into polymeric coatings can convert the absorbed sunlight into fluorescent emission and hence increase the effective solar reflectance and cooling performance. In this paper, we experimentally investigate the key factors for fluorescence-assisted radiative cooling with TiO2-based white coatings. The surrounding TiO2 nanoparticles, which enable spatial and temporal light confinement through multiple Mie scattering, lead to Purcell enhancement of the phosphors in the coating. The photoluminescence lifetimes of two phosphors (BaMgAl10O17:Eu2+ and (Sr, Ba)SiO4:Eu2+) exhibit significant reductions of ~61% and ~23%, indicating Purcell factors of 2.6 and 1.3, respectively. Moreover, smaller Stokes shifts of the phosphors are preferred to further diminish solar absorption. A field test of the fluorescent cooling coatings demonstrates an improvement of ~4% in solar reflectance for the BaMgAl10O17:Eu2+-based fluorescent cooling coating. However, maximizing solar reflectance through multiple Mie scattering by a broad size distribution of fillers produces a white appearance, which is visually monotonous and aesthetically unappealing. Besides, most colored pigments absorb visible light significantly and convert it to non-radiative thermal energy, offsetting the cooling effect. Therefore, current colored cooling coatings face a compromise between color saturation and cooling effect.
To solve this problem, we introduced colored fluorescent materials into a white coating based on SiO2 microspheres as a top layer covering a white cooling coating based on TiO2. Compared with colored pigments, fluorescent materials can re-emit the absorbed light, reducing the solar absorption introduced by coloration. Our work investigated the scattering properties of SiO2 dielectric spheres of different diameters and discussed in detail their impact on the PL properties of the phosphors, paving the way for the application and industrialization of colored fluorescence-assisted cooling coatings.
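The Purcell factors quoted above follow directly from the measured lifetime reductions, on the usual assumption that F = τ_free / τ_coating; a two-line check:

```python
def purcell_factor(lifetime_reduction):
    """Purcell factor inferred from a fractional PL lifetime reduction:
    F = tau_free / tau_coating = 1 / (1 - reduction)."""
    return 1.0 / (1.0 - lifetime_reduction)

f_bam = purcell_factor(0.61)   # BaMgAl10O17:Eu2+, ~61% reduction
f_sbs = purcell_factor(0.23)   # (Sr,Ba)SiO4:Eu2+, ~23% reduction
```

Rounded to one decimal, these reproduce the abstract's Purcell factors of 2.6 and 1.3.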

Keywords: solar reflection, infrared emissivity, mie scattering, photoluminescent emission, radiative cooling

Procedia PDF Downloads 71
13714 Wastes of Oil Drilling: Treatment Techniques and Their Effectiveness

Authors: Abbas Hadj Abbas, Hacini Massaoud, Aiad Lahcen

Abstract:

In the Hassi-Messoud oil industry, water-based mud systems (WBM) are generally used for drilling the first phase. For the rest of the well, oil-based mud systems (OBM) are employed. In the field of oil exploration, a panoply of chemical products is employed in the formulation of drilling fluids. These components, of different natures and with ill-defined toxicity and biodegradability, are nevertheless discharged into nature. In addition to the hydrocarbons (HC, such as diesel) that are a major constituent of oil-based mud, spills of a variety of other products and additives can also be observed on drilling sites. These wastes are usually stored in places called crud wastes. They may cause major problems to the ecosystem. To treat these wastes, we considered two methods: chemical solidification/stabilization and thermal treatment. So that we could evaluate the treatment techniques, a series of analyses was performed on dozens of waste specimens before treatment. After that, on the basis of these analyses, we opted for diagnostics of the pollution before and after solidification/stabilization. Finally, we performed analyses before and after the thermal treatment to check the efficiency of the methods followed in the study.

Keywords: wastes treatment, the oil pollution, the norms, wastes drilling

Procedia PDF Downloads 273
13713 Non-Invasive Imaging of Human Tissue Using NIR Light

Authors: Ashwani Kumar

Abstract:

The use of NIR light to image biological tissue and quantify its optical properties is a good choice over invasive methods. Optical tomography involves two steps. One is the forward problem and the other is the reconstruction problem. The forward problem consists of finding the measurements of transmitted light through the tissue from source to detector, given the spatial distribution of absorption and scattering properties. The second step is the reconstruction problem. In X-ray tomography, there are standard reconstruction methods, namely the filtered back-projection method and the algebraic reconstruction methods. These methods cannot be applied as such in optical tomography, due to the highly scattering nature of biological tissue. A hybrid algorithm for reconstruction has been implemented in this work, which takes into account the highly scattered paths taken by photons while back-projecting the forward data obtained from Monte Carlo simulation. The reconstructed image suffers from blurring due to the point spread function.
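To illustrate the flavour of the Monte Carlo forward simulation mentioned above, the sketch below estimates ballistic transmission through a purely absorbing slab and compares it with the analytical Beer-Lambert value; a real tissue simulation would also need scattering and anisotropy, and the coefficients here are arbitrary:

```python
import math
import random

def mc_transmission(mu_a, thickness, n_photons=100_000, seed=1):
    """Estimate ballistic transmission through a purely absorbing slab.
    Each photon's free path is drawn from an exponential distribution
    with rate mu_a; it is transmitted if the path exceeds the slab."""
    rng = random.Random(seed)
    transmitted = sum(
        1 for _ in range(n_photons)
        if rng.expovariate(mu_a) > thickness
    )
    return transmitted / n_photons

t_mc = mc_transmission(mu_a=1.0, thickness=1.0)
t_analytic = math.exp(-1.0)          # Beer-Lambert: exp(-mu_a * d)
```

With 100,000 photons, the Monte Carlo estimate agrees with exp(-μa·d) to within about one percentage point.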

Keywords: NIR light, tissue, blurring, Monte Carlo simulation

Procedia PDF Downloads 476
13712 Ending Communal Conflicts in Africa: The Relevance of Traditional Approaches to Conflict Resolution

Authors: Kindeye Fenta Mekonnen, Alagaw Ababu Kifle

Abstract:

The failure of international responses to armed conflict to address local preconditions for national stability has recently attracted what has been called the 'local turn' in peace building. This 'local turn' amplified a renewed interest in traditional/indigenous methods of conflict resolution, a field that had hitherto been dominated by anthropologists, with their focus on the procedures and rituals of such approaches. This notwithstanding, there is still limited empirical work on the relevance of traditional methods of conflict resolution for ending localized conflicts vis-à-vis hybrid and modern approaches. The few exceptions to this generally draw their conclusions from very few (almost all successful) cases, which makes it difficult to judge the validity and cross-case applicability of their results. This paper seeks to fill these gaps by undertaking a quantitative analysis of the trends and applications of different communal conflict resolution initiatives, their potential to usher in long-term peace, and the extent to which their outcomes are influenced by the intensity and scope of a conflict. The paper makes the following three tentative conclusions. First, traditional mechanisms and traditional actors still dominate the communal conflict resolution landscape, either individually or in combination with other methods. Second, traditional mechanisms of conflict resolution tend to be more successful in ending a conflict and preventing its recurrence than hybrid and modern arrangements. This notwithstanding, and probably due to the scholarly call for a local turn in peace building, contemporary communal conflict resolution approaches are becoming less and less reliant on traditional mechanisms alone and therefore less effective. Third, there is as yet inconclusive evidence on whether hybridization is an asset or a liability in the resolution of communal conflicts, and on the extent to which this might be mediated by the intensity of a conflict.

Keywords: traditional conflict resolution, hybrid conflict resolution, communal conflict, relevance, conflict intensity

Procedia PDF Downloads 60
13711 Detection and Identification of Antibiotic Resistant UPEC Using FTIR-Microscopy and Advanced Multivariate Analysis

Authors: Uraib Sharaha, Ahmad Salman, Eladio Rodriguez-Diaz, Elad Shufan, Klaris Riesenberg, Irving J. Bigio, Mahmoud Huleihel

Abstract:

Antimicrobial drugs have played an indispensable role in controlling the illness and death associated with infectious diseases in animals and humans. However, the increasing resistance of bacteria to a broad spectrum of commonly used antibiotics has become a global healthcare problem. Many antibiotics have lost their effectiveness since the beginning of the antibiotic era because many bacteria have adapted defenses against them. Rapid determination of the antimicrobial susceptibility of a clinical isolate is often crucial for the optimal antimicrobial therapy of infected patients and in many cases can save lives. The conventional methods for susceptibility testing require the isolation of the pathogen from a clinical specimen by culturing on the appropriate media (this first culturing stage lasts 24 h). Chosen colonies are then grown on media containing antibiotic(s), using micro-diffusion discs, in order to determine bacterial susceptibility (this second culturing also takes 24 h). Other approaches, including genotyping methods, the E-test and automated systems, have also been developed for testing antimicrobial susceptibility. Most of these methods are expensive and time-consuming. Fourier transform infrared (FTIR) microscopy is a rapid, safe, effective and low-cost method that has been widely and successfully used in different studies for the identification of various biological samples, including bacteria; nonetheless, its true potential in routine clinical diagnosis has not yet been established. Modern infrared (IR) spectrometers with high spectral resolution enable the measurement of unprecedented biochemical information from cells at the molecular level. Moreover, the development of new bioinformatics analyses combined with IR spectroscopy yields a powerful technique that enables the detection of structural changes associated with resistance.
The main goal of this study is to evaluate the potential of FTIR microscopy in tandem with machine learning algorithms for rapid and reliable identification of bacterial susceptibility to antibiotics within a few minutes. The urinary tract infection (UTI) E. coli samples, which were identified at the species level by MALDI-TOF and examined for their susceptibility by the routine assay (micro-diffusion discs), were obtained from the bacteriology laboratories of Soroka University Medical Center (SUMC). These samples were examined by FTIR microscopy and analyzed by advanced statistical methods. Our results, based on 700 E. coli samples, are promising and show that by using infrared spectroscopy together with multivariate analysis, it is possible to classify the tested bacteria as sensitive or resistant with a success rate higher than 90% for eight different antibiotics. Based on these preliminary results, it is worthwhile to continue developing FTIR microscopy as a rapid and reliable method for identifying antibiotic susceptibility.

Keywords: antibiotics, E.coli, FTIR, multivariate analysis, susceptibility, UTI

Procedia PDF Downloads 161
13710 Hydrodynamic Behaviour Study of Fast Mono-Hull and Catamaran Vessels in Calm Waters Using Free Surface Flow Analysis

Authors: Mohammad Sadeghian, Mohsen Sadeghian

Abstract:

In this paper, the resistance and trim of planing catamaran and mono-hull vessels in calm water were considered. A hydrodynamic analysis of a fast mono-hull planing vessel was also carried out. For hull form geometry optimization, numerical methods with different parameters were used for this type of vessel. The hull material was selected as carbon fiber composite. Exact architectural aspects were specified and stability calculations were performed as well. Hydrodynamic calculations to extract the resistance force were carried out using semi-analytical methods and numerical modeling. A free-surface numerical analysis of the vessel at the design draft, using the finite volume method and a two-phase flow model, was evaluated and verified by experimental tests.

Keywords: fast vessel, hydrostatic and hydrodynamic optimization, free surface flow, computational fluid dynamics

Procedia PDF Downloads 267
13709 Equal Channel Angular Pressing of Al1050 Sheets: Experimental and Finite Element Survey

Authors: P. M. Keshtiban, M. Zdshakoyan, G. Faragi

Abstract:

Different severe plastic deformation (SPD) methods are the most successful ways to produce nano-structured materials from coarse-grained samples without changing the cross-sectional area. One of the most widely used SPD processes is equal channel angular pressing (ECAP). In this paper, the ECAP process on Al1050 sheets was evaluated at room temperature by both experiments and the finite element method. Since one of the main objectives of SPD processes is to achieve a high equivalent plastic strain (PEEQ) in one cycle, the values of PEEQ were obtained by finite element simulation. The force-displacement curve was also obtained by FEM. To study the changes in mechanical properties, micro-hardness tests were conducted on the samples and the improvement in mechanical properties was investigated. Results show good agreement among the FEM, theoretical and experimental results.
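For reference, the theoretical equivalent strain per ECAP pass is commonly estimated with the Iwahashi et al. relation; the sketch below assumes a 90° channel angle and a sharp outer corner, which are illustrative values, not necessarily the die geometry used in the study:

```python
import math

def ecap_strain(phi_deg, psi_deg, passes=1):
    """Equivalent plastic strain after N ECAP passes (Iwahashi et al.):
    eps = N/sqrt(3) * [2*cot((phi+psi)/2) + psi*cosec((phi+psi)/2)],
    with phi the channel angle and psi the outer corner angle."""
    phi = math.radians(phi_deg)
    psi = math.radians(psi_deg)
    half = (phi + psi) / 2.0
    return passes / math.sqrt(3) * (2.0 / math.tan(half) + psi / math.sin(half))

eps = ecap_strain(90, 0)   # 90 degree die, sharp corner, one pass
```

For this geometry the formula gives roughly 1.15 equivalent strain per pass, which is why ECAP accumulates large strains in only a few passes.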

Keywords: AL1050, experiments, finite element method, severe plastic deformation

Procedia PDF Downloads 400
13708 Numerical Simulation of Footing on Reinforced Loose Sand

Authors: M. L. Burnwal, P. Raychowdhury

Abstract:

Earthquakes have adverse effects on buildings resting on soft soils. Mitigating the response of shallow foundations on soft soil with different methods reduces settlement and provides foundation stability. Methods such as the rocking foundation (used in performance-based design), deep foundations, prefabricated drains, grouting, and vibro-compaction are used to control the pore pressure and enhance the strength of loose soils. One problem with these methods is that the settlement is uncontrollable, leading to differential settlement of the footings and, further, to the collapse of buildings. The present study investigates the utility of geosynthetics as a potential improvement of the subsoil to reduce the earthquake-induced settlement of structures. A steel moment-resisting frame building resting on loose, liquefiable, dry soil, subjected to the Uttarkashi 1991 and Chamba 1995 earthquakes, is used for the soil-structure interaction (SSI) analysis. The continuum model can simultaneously simulate the structure, soil, interfaces, and geogrids in the OpenSees framework. The soil is modeled with the PressureDependentMultiYield (PDMY) material model using Quad elements, which provide stress-strain at Gauss points, and is calibrated to predict the behavior of Ganga sand. The model, analyzed with tied-degree-of-freedom contact, reveals that the system responses align with shake table experimental results. An attempt is made to study the responses of the footing, structure, and geosynthetics with unreinforced and reinforced bases with varying parameters. The results show that geogrid reinforcement of the shallow foundation effectively reduces the settlement by 60%.

Keywords: settlement, shallow foundation, SSI, continuum FEM

Procedia PDF Downloads 178
13707 Adaption Model for Building Agile Pronunciation Dictionaries Using Phonemic Distance Measurements

Authors: Akella Amarendra Babu, Rama Devi Yellasiri, Natukula Sainath

Abstract:

While human beings can easily learn and adopt pronunciation variations, machines need training before being put into use. Humans also keep a minimal vocabulary, with pronunciation variations stored at the front of memory for ready reference, whereas machines keep the entire pronunciation dictionary for ready reference. Supervised methods used for the preparation of pronunciation dictionaries take large amounts of manual effort, cost and time, and are not suitable for real-time use. This paper presents an unsupervised adaptation model for building agile and dynamic pronunciation dictionaries online. These methods mimic the human approach of learning new pronunciations in real time. A new algorithm for measuring sound distances, called Dynamic Phone Warping, is presented and tested. Performance of the system is measured using an adaptation model, and the precision is found to be better than 86 percent.
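A minimal sketch in the spirit of Dynamic Phone Warping: a dynamic-programming alignment of two phone sequences in which acoustically close phones substitute at reduced cost. The phone labels and the substitution cost below are hypothetical, not the paper's trained distances:

```python
def phone_distance(seq_a, seq_b, sub_cost):
    """Dynamic-programming alignment of two phone sequences.
    sub_cost maps phone pairs to substitution costs (1.0 when unlisted);
    insertions and deletions cost 1.0."""
    m, n = len(seq_a), len(seq_b)
    d = [[0.0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        d[i][0] = float(i)
    for j in range(1, n + 1):
        d[0][j] = float(j)
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            a, b = seq_a[i - 1], seq_b[j - 1]
            sub = 0.0 if a == b else sub_cost.get((a, b),
                                                  sub_cost.get((b, a), 1.0))
            d[i][j] = min(d[i - 1][j] + 1.0,        # deletion
                          d[i][j - 1] + 1.0,        # insertion
                          d[i - 1][j - 1] + sub)    # match/substitution
    return d[m][n]

# Two "tomato" pronunciations differing in one vowel.
cost = {("ey", "aa"): 0.4}   # hypothetical cost for acoustically close vowels
dist = phone_distance("t ah m ey t ow".split(),
                      "t ah m aa t ow".split(), cost)
```

A small distance like this lets an online system treat the new pronunciation as a variant of an existing entry rather than a new word.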

Keywords: pronunciation variations, dynamic programming, machine learning, natural language processing

Procedia PDF Downloads 156
13706 A Network Approach to Analyzing Financial Markets

Authors: Yusuf Seedat

Abstract:

The necessity to understand global financial markets has increased following the unfortunate spread of the recent financial crisis around the world. Financial markets are considered to be complex systems consisting of highly volatile movements whose indexes fluctuate without any clear pattern. Analytic methods of stock prices have been proposed in which financial markets are modeled using common network analysis tools and methods. It has been found that two key components of social network analysis are relevant to modeling financial markets, allowing us to forecast accurate predictions of stock prices within the financial market. Financial markets have a number of interacting components, leading to complex behavioral patterns. This paper describes a social network approach to analyzing financial markets as a viable way of studying how complex stock markets function. We also look at how social network analysis techniques and metrics are used to gauge an understanding of the evolution of financial markets, as well as how community detection can be used to qualify and quantify influence within a network.

Keywords: network analysis, social networks, financial markets, stocks, nodes, edges, complex networks

Procedia PDF Downloads 175
13705 Application of Post-Stack and Pre-Stack Seismic Inversion for Prediction of Hydrocarbon Reservoirs in a Persian Gulf Gas Field

Authors: Nastaran Moosavi, Mohammad Mokhtari

Abstract:

Seismic inversion is a technique which has been in use for years, and its main goal is to estimate and model the physical characteristics of rocks and fluids. Generally, it is a combination of seismic and well-log data. Seismic inversion can be carried out through different methods; we have conducted and compared post-stack and pre-stack seismic inversion methods on real data from one of the fields in the Persian Gulf. Pre-stack seismic inversion can transform seismic data into rock physics properties such as P-impedance, S-impedance and density, while post-stack seismic inversion can only estimate P-impedance. These parameters can then be used in reservoir identification. Based on the results of inverting the seismic data, a gas reservoir was detected in one of the hydrocarbon fields in the south of Iran (Persian Gulf). By comparing post-stack and pre-stack seismic inversion, it can be concluded that pre-stack seismic inversion provides more reliable and detailed information for the identification and prediction of hydrocarbon reservoirs.
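The core of post-stack (recursive) impedance inversion can be sketched in a few lines: each reflection coefficient relates adjacent impedances by r = (Z2 - Z1)/(Z2 + Z1), so the impedance profile is recovered layer by layer once the first impedance is known. The impedance values below are hypothetical:

```python
def reflectivity(impedance):
    """Reflection coefficients from a layered P-impedance profile."""
    return [(z2 - z1) / (z2 + z1)
            for z1, z2 in zip(impedance, impedance[1:])]

def invert(z0, refl):
    """Recursive (trace-integration) inversion:
    Z_{i+1} = Z_i * (1 + r_i) / (1 - r_i), starting from Z_0."""
    z = [z0]
    for r in refl:
        z.append(z[-1] * (1 + r) / (1 - r))
    return z

layers = [4500.0, 5200.0, 4800.0, 6100.0]   # hypothetical P-impedances
r = reflectivity(layers)
recovered = invert(layers[0], r)            # reproduces the profile
```

In practice the reflectivity comes from the seismic trace after wavelet removal, and a low-frequency model from well logs supplies the missing absolute trend.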

Keywords: density, p-impedance, s-impedance, post-stack seismic inversion, pre-stack seismic inversion

Procedia PDF Downloads 304
13704 A Data-Driven Monitoring Technique Using Combined Anomaly Detectors

Authors: Fouzi Harrou, Ying Sun, Sofiane Khadraoui

Abstract:

Anomaly detection based on Principal Component Analysis (PCA) has been studied intensively and largely applied to multivariate processes with highly cross-correlated process variables. Monitoring metrics such as Hotelling's T2 and the Q statistic are usually used in PCA-based monitoring to elucidate pattern variations in the principal and residual subspaces, respectively. However, these metrics are ill-suited to detect small faults. In this paper, Exponentially Weighted Moving Average (EWMA) schemes based on the T2 and Q statistics, T2-EWMA and Q-EWMA, were developed for detecting faults in the process mean. The performance of the proposed methods was compared with that of the conventional PCA-based fault detection method using synthetic data. The results clearly show the benefit and effectiveness of the proposed methods over the conventional PCA method, especially for detecting small faults in highly correlated multivariate data.
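A minimal sketch of the Q-EWMA idea described above: smooth the monitoring statistic with an EWMA and flag exceedances of the asymptotic upper control limit. The in-control level, shift size, and smoothing constant below are illustrative, and the limit is estimated from the first half of the record, assumed fault-free:

```python
def ewma_alarms(stats, lam=0.2, L=3.0):
    """Flag samples whose EWMA-smoothed statistic exceeds the asymptotic
    upper control limit mean + L * sigma * sqrt(lam / (2 - lam))."""
    n0 = len(stats) // 2                     # first half taken as in-control
    mean = sum(stats[:n0]) / n0
    var = sum((x - mean) ** 2 for x in stats[:n0]) / n0
    limit = mean + L * (var * lam / (2 - lam)) ** 0.5
    z, alarms = mean, []
    for i, x in enumerate(stats):
        z = lam * x + (1 - lam) * z          # EWMA recursion
        if z > limit:
            alarms.append(i)
    return alarms

# In-control Q statistic around 1.0, then a small sustained mean shift.
q_stat = [1.0, 0.9, 1.1, 1.0, 0.95, 1.05, 1.0, 0.9, 1.1, 1.0,
          1.4, 1.45, 1.5, 1.4, 1.45, 1.5, 1.4, 1.45, 1.5, 1.4]
alarms = ewma_alarms(q_stat)
```

The EWMA accumulates the small shift and raises an alarm on the first shifted sample here, which a raw three-sigma check on individual values could miss for smaller shifts.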

Keywords: data-driven method, process control, anomaly detection, dimensionality reduction

Procedia PDF Downloads 278
13703 Study for Establishing a Concept of Underground Mining in a Folded Deposit with Weathering

Authors: Chandan Pramanik, Bikramjit Chanda

Abstract:

Large metal mines operated with open-cast methods must transition to underground mining at the conclusion of open-pit operations; however, this entails a difficult period in which production suffers due to interference between the two mining methods. A transition model with collaborative mining operations is presented and established in this work, based on the case of the South Kaliapani Underground Project, to address the technical issues of inadequate production security and other mining challenges during the transition phase and beyond. By integrating the technology of the small-scale drift-and-fill method with highly productive sublevel open stoping in the deeper section, this hybrid mining concept aims to eliminate major bottlenecks and offers an optimized production profile with safe and sustainable operation. Considering every geo-mining aspect, this study offers a genuine and precise technical deliberation for the transition from open pit to underground mining.

Keywords: drift and fill, geo-mining aspect, sublevel open stoping, underground mining method

Procedia PDF Downloads 86
13702 Research of the Three-Dimensional Visualization Geological Modeling of Mine Based on Surpac

Authors: Honggang Qu, Yong Xu, Rongmei Liu, Zhenji Gao, Bin Wang

Abstract:

Today's mining industry is advancing steadily toward digitization and visualization. Three-dimensional visualization geological modeling of mines is the digital characterization of mineral deposits and one of the key technologies of digital mining. Three-dimensional geological modeling combines geological spatial information management, geological interpretation, geological spatial analysis and prediction, geostatistical analysis, entity content analysis, and graphic visualization in a computer-based three-dimensional environment for geological analysis. In this paper, a three-dimensional geological model of an iron mine is constructed using Surpac, the weighting differences between the inverse distance weighting method and ordinary kriging are studied, and the ore body volume and reserves are simulated and calculated with both methods. Compared with the actual mine reserves, the results are relatively accurate, providing a scientific basis for mine resource assessment, reserve calculation, mining design, and so on.
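A minimal sketch of the inverse distance weighting step that such block-model grade estimation rests on is shown below; the coordinates, grades, block size, and ore density are invented toy values, and ordinary kriging would replace the distance-based weights with weights derived from a fitted variogram model.

```python
import numpy as np

def idw_estimate(samples_xyz, grades, block_center, power=2.0, eps=1e-9):
    """Inverse distance weighting (IDW) grade estimate for one block.

    samples_xyz: (n, 3) drillhole composite coordinates
    grades:      (n,)   measured grades (e.g. Fe %)
    power:       distance exponent (2 = inverse distance squared)
    """
    d = np.linalg.norm(samples_xyz - block_center, axis=1)
    w = 1.0 / (d + eps) ** power   # closer samples get larger weights
    w /= w.sum()                   # normalize to a convex combination
    return float(w @ grades)

# Toy data: estimate the grade of one block from nearby composites
rng = np.random.default_rng(1)
composites = rng.uniform(0, 100, size=(50, 3))
grades = 30 + 5 * np.sin(composites[:, 0] / 20) + rng.normal(0, 1, 50)

block = np.array([50.0, 50.0, 50.0])
est = idw_estimate(composites, grades, block)

# Reserve for one block from its volume and an assumed density
block_volume = 10 * 10 * 10        # m^3
density = 3.2                      # t/m^3, assumed for illustration
tonnes = block_volume * density
```

Because the weights are normalized and non-negative, the IDW estimate always lies within the range of the input grades, a useful sanity check when validating a block model.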

Keywords: three-dimensional geological modeling, geological database, geostatistics, block model

Procedia PDF Downloads 62
13701 Implementation and Comparative Analysis of PET and CT Image Fusion Algorithms

Authors: S. Guruprasad, M. Z. Kurian, H. N. Suma

Abstract:

Medical imaging modalities are becoming life-saving components. These modalities are essential to doctors for proper diagnosis, treatment planning, and follow-up. Some modalities provide anatomical information, such as Computed Tomography (CT), Magnetic Resonance Imaging (MRI), and X-rays, while others, such as Positron Emission Tomography (PET), provide only functional information. A single-modality image therefore does not give complete information. This paper presents the fusion of the structural information in CT with the functional information in PET images. The fused image is valuable for detecting the stage and location of abnormalities and is particularly needed in oncology for improved diagnosis and treatment. We implemented and compared pyramid, wavelet, and principal component fusion methods, along with a hybrid method combining the Discrete Wavelet Transform (DWT) and Principal Component Analysis (PCA). The performance of the algorithms is evaluated quantitatively and qualitatively. The system was implemented and tested using MATLAB. Based on MSE, PSNR, and entropy analysis, the PCA and DWT-PCA methods showed the best results over all experiments.
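The PCA fusion step can be sketched in numpy for two registered, same-size grayscale slices; the random arrays below merely stand in for real CT and PET data, and taking the absolute value of the dominant eigenvector is a sign convention assumed for this sketch.

```python
import numpy as np

def pca_fuse(img_a, img_b):
    """PCA-based fusion of two registered, same-size grayscale images.

    The dominant eigenvector of the 2x2 covariance of the flattened
    images gives the fusion weights, so the source with more variance
    (more detail) contributes more to the fused result.
    """
    data = np.stack([img_a.ravel(), img_b.ravel()])
    cov = np.cov(data)
    eigvals, eigvecs = np.linalg.eigh(cov)
    v = np.abs(eigvecs[:, np.argmax(eigvals)])  # dominant eigenvector
    w = v / v.sum()                             # normalized weights
    return w[0] * img_a + w[1] * img_b, w

# Toy stand-ins for registered CT (structural) and PET (functional) slices
rng = np.random.default_rng(2)
ct = rng.random((64, 64))
pet = 0.5 * ct + 0.5 * rng.random((64, 64))

fused, weights = pca_fuse(ct, pet)
mse = float(np.mean((fused - ct) ** 2))  # one of the quality metrics used
```

In the hybrid DWT-PCA variant, the same weighting would be applied to wavelet subbands rather than to the raw pixels, which preserves detail at multiple scales.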

Keywords: image fusion, pyramid, wavelets, principal component analysis

Procedia PDF Downloads 268
13700 Geometric, Energetic and Topological Analysis of (Ethanol)₉-Water Heterodecamers

Authors: Jennifer Cuellar, Angie L. Parada, Kevin N. S. Chacon, Sol M. Mejia

Abstract:

The purification of bio-ethanol through distillation is an unresolved issue in the biofuel industry because of the formation of the ethanol-water azeotrope, which increases the number of steps in the purification process and consequently the production costs. Understanding the nature of the mixture at the molecular level could therefore provide new insights for improving current methods and/or designing new, more efficient purification methods. For that reason, the present study focuses on the evaluation and analysis of (ethanol)₉-water heterodecamers, the systems with the minimum molecular proportion that represents the azeotropic concentration (96% m/m in ethanol). The computational modelling was carried out at the B3LYP-D3/6-311++G(d,p) level in Gaussian 09. Initial explorations of the potential energy surface were done through two methods, simulated annealing runs and molecular dynamics trajectories, besides intuitive structures obtained from smaller (ethanol)ₙ-water heteroclusters, n = 7, 8 and 9. The energetic ordering of the seven stable heterodecamers identifies the most stable heterodecamer (Hdec-1) as a structure forming a bicyclic geometry through O-H---O hydrogen bonds (HBs) in which the water acts as a double proton donor. Hdec-1 combines 1 water molecule with 3 molecules of each ethanol conformer (3 trans, 3 gauche-1, and 3 gauche-2); its abundance is 89%, and its decamerization energy is -80.4 kcal/mol, i.e., 13 kcal/mol more stable than the least stable heterodecamer. In addition, as a way to understand why methanol does not form an azeotropic mixture with water, analogous systems ((ethanol)₁₀, (methanol)₁₀, and (methanol)₉-water) were optimized. Topological analysis of the electron density reveals that Hdec-1 forms 33 weak interactions in total: 11 O-H---O, 8 C-H---O, and 2 C-H---C hydrogen bonds, and 12 H---H interactions.
The strength and abundance of the most unconventional interactions (H---H, C-H---O, and C-H---C) seem to explain the preference of ethanol for forming heteroclusters instead of clusters. Moreover, the O-H---O HBs present a significant covalent character according to topological parameters such as the Laplacian of the electron density and the ratio of potential to kinetic energy densities evaluated at the bond critical points, which take negative values and values between 1 and 2, respectively.

Keywords: ADMP, DFT, ethanol-water azeotrope, Grimme dispersion correction, simulated annealing, weak interactions

Procedia PDF Downloads 91
13699 Beyond Informality: Relocation from a Traditional Village 'Mit Oqbah' to Masaken El-Barageel and the Role of ‘Urf in Governing Built Environment, Egypt

Authors: Sarah Eldefrawi, Maike Didero

Abstract:

In Egypt, residents' urban interventions (colloquially, a'hali's interventions) are routinely framed by the government, scholars, and the media as encroachment (taeadiyat), chaotic (a'shwa'i), or informal (gheir mokanan) practices. This paper argues that those interventions cannot simply be described as encroachment on public space or chaotic behaviour. We claim that they are rooted in the traditional governing methods ('urf) that governed Arab cities for many decades. Through an in-depth field study conducted in a public housing project in the city of Giza called 'Masaken El-Barageel', we traced the urban transformations manifested in private and public spaces. To understand these transformations, we used a wide range of qualitative research methods, such as semi-guided and informal interviews, observation, and mapping of the built environment and the newly added interventions. The study was further strengthened by the author's contributions to studying nine sectors developed by a'hali in six districts of Greater Cairo. The results indicate that a culturally and socially sensitive framework must relate individual actions to the spatial and social structures as well as to culturally transmitted views and meanings connected with 'urf. The study traced three crucial principles of 'urf that influenced these interventions: the elimination of harm (al-marafiq wa man' al-darar), the appropriation of space (haqq el-intefa'), and the public interest (maslaha a'ma). Our findings open the discussion of the (il)legitimacy of a'hali governing methods in contemporary cities.

Keywords: Urf, urban governance, public space, public housing, encroachments, chaotic, Egyptian cities

Procedia PDF Downloads 119
13698 Architectural Visualization: From Ancient Civilizations to the Roman Empire

Authors: Matthias Stange

Abstract:

Architectural visualization has been practiced for as long as there have been buildings. Visualization (from Latin visibilis, "visible") generally refers to bringing abstract data and relationships into a graphically comprehensible form. In particular, it refers to the process of translating relationships that are difficult to formulate linguistically or logically into visual media (e.g., drawings or models) to make them comprehensible. Building owners have always been interested in knowing how their building will look before it is built. In the empirical part of this study, the roots of architectural visualization are examined, from the ancient civilizations to the end of the Roman Empire. Extensive literature research on architectural theory and architectural history forms the basis for this analysis. The focus is basic research spanning from the emergence of the first two-dimensional drawings in the Neolithic period to the triggers of significant further developments in architectural representation, their importance for subsequent methods, and the transmission of knowledge over the following epochs. The analysis concentrates on the development of analog methods of representation, from the first Neolithic house floor plans to detailed Greek stone models and paper drawings in the Roman Empire. In particular, the question of socio-cultural, socio-political, and economic changes as possible triggers for the development of representational media and methods is analyzed. The study shows that the development of visual building representation has been driven by scientific, technological, and social developments since the emergence of the first civilizations more than 6000 years ago, beginning with the change in human subsistence strategy from food appropriation by hunting and gathering to food production by agriculture and livestock, and the sedentary lifestyle this required.

Keywords: ancient Greece, ancient orient, Roman Empire, architectural visualization

Procedia PDF Downloads 96
13697 Optimization of the Administration of Intravenous Medication by Reduction of the Residual Volume, Taking User-Friendliness, Cost Efficiency, and Safety into Account

Authors: A. Poukens, I. Sluyts, A. Krings, J. Swartenbroekx, D. Geeroms, J. Poukens

Abstract:

Introduction and Objectives: It has been known for many years that with the administration of intravenous medication, a rather significant part of the infusion solution intended for administration, the residual volume (the volume that remains in the IV line and/or infusion bag), does not reach the patient and is wasted. This can result in underdosage and a diminished therapeutic effect. Despite its important impact on the patient, the reduction of residual volume lacks attention. An optimized and clearly stated protocol for reducing the residual volume in an IV line is necessary for each hospital. As described in the author's thesis for the degree of Master in Hospital Pharmacy, the administration of intravenous medication can be optimized by reducing the residual volume, taking effectiveness, user-friendliness, cost efficiency, and safety into account. Material and Methods: Through a literature study and an online questionnaire sent to all Flemish hospitals and hospitals in the Dutch province of Limburg, current flush methods were mapped. In laboratory research, possible flush methods aiming to reduce the residual volume were measured. Furthermore, a self-developed experimental method to reduce the residual volume was added to the study. The current flush methods and the self-developed experimental method were compared based on cost efficiency, user-friendliness, and safety. Results: There is a major difference between the Flemish hospitals and those in the Dutch province of Limburg in the approach to and method of flushing IV lines after administration of intravenous medication. The residual volumes were measured, and laboratory research showed that if flushing was performed with at least one equivalent of the residual volume, 95 percent of the glucose was flushed through.
Based on the comparison, it became clear that flushing with a pre-filled syringe would be the most cost-efficient, user-friendly, and safest method. According to the laboratory research, the self-developed experimental method is feasible and has the advantage that the remaining fraction of the medication can be administered to the patient at unchanged concentration, without dilution. Furthermore, this technique can be applied regardless of the size of the residual volume. Conclusion and Recommendations: It is advisable to revise the current infusion systems and flushing methods in most hospitals. Aside from educating hospital staff and aligning on a uniform, substantiated protocol, an optimized and clear policy on the reduction of residual volume is necessary for each hospital. It is recommended to flush all IV lines with at least a volume of rinsing fluid equivalent to the residual volume. Further laboratory and clinical research on the self-developed experimental method is needed before it can be implemented clinically in a broader setting.

Keywords: intravenous medication, infusion therapy, IV flushing, residual volume

Procedia PDF Downloads 113
13696 Retrospective Evaluation of Vector-borne Infections in Cats Living in Germany (2012-2019)

Authors: I. Schäfer, B. Kohn, M. Volkmann, E. Müller

Abstract:

Introduction: Blood-feeding arthropods transmit parasitic, bacterial, and viral pathogens to domestic animals and wildlife. Vector-borne infections are gaining significance due to the increase in travel, the import of domestic animals from abroad, and the changing climate in Europe. Aims of the study: The main objective of this retrospective study was to assess the prevalence of vector-borne infections in cats for which a 'Feline Travel Profile' had been requested. Material and Methods: This retrospective study included test results from cats for which a 'Feline Travel Profile' established by LABOKLIN had been requested by veterinarians between April 2012 and December 2019. This profile comprises direct detection methods via polymerase chain reaction (PCR) for Hepatozoon spp. and Dirofilaria spp. as well as indirect detection methods via immunofluorescence antibody test (IFAT) for Ehrlichia spp. and Leishmania spp.; from July 2015 onwards, it was expanded to include an IFAT for Rickettsia spp. The prevalence of the different vector-borne infectious agents was calculated. Results: A total of 602 cats were tested using the 'Feline Travel Profile'. Positive test results were as follows: Rickettsia spp. IFAT 54/442 (12.2%), Ehrlichia spp. IFAT 68/602 (11.3%), Leishmania spp. IFAT 21/602 (3.5%), Hepatozoon spp. PCR 51/595 (8.6%), and Dirofilaria spp. PCR 1/595 cats (0.2%). Co-infections with more than one pathogen were detected in 22/602 cats. Conclusions: 170/602 cats (28.2%) tested positive for at least one vector-borne pathogen. Infections with multiple pathogens were detected in 3.7% of the cats. The data emphasize the importance of considering vector-borne infections as potential differential diagnoses in cats.

Keywords: arthropod-transmitted infections, feline vector-borne infections, Germany, laboratory diagnostics

Procedia PDF Downloads 156
13695 Prediction of Remaining Life of Industrial Cutting Tools with Deep Learning-Assisted Image Processing Techniques

Authors: Gizem Eser Erdek

Abstract:

This study investigates the prediction of the remaining life of industrial cutting tools used in production processes with deep learning methods. As the life of a cutting tool decreases, it damages the raw material it is processing. The aim of this study is to predict the remaining life of the cutting tool based on the damage the tool causes to the raw material. To this end, hole photos were collected from a hole-drilling machine over 8 months. The photos were labeled in 5 classes according to hole quality, transforming the problem into a classification problem. Using the prepared data set, a model was created with convolutional neural networks, a deep learning method. In addition, the VGGNet and ResNet architectures, which have been successful in the literature, were tested on the data set. A hybrid model using convolutional neural networks and support vector machines was also used for comparison. When all models were compared, the model using convolutional neural networks gave the best results, with a 74% accuracy rate. In preliminary studies, the data set was restricted to only the best and worst classes, and the resulting binary classification model reached an accuracy of about 93%. The results of this study showed that the remaining life of cutting tools can be predicted by deep learning methods based on the damage to the raw material. The experiments demonstrate that deep learning methods can serve as an alternative for cutting tool life estimation.
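The feature extraction at the heart of such a convolutional network can be illustrated with a single convolution + ReLU + max-pooling stage in plain numpy; the toy "hole image" and the edge-detection kernel below are illustrative assumptions, not the trained filters from the study.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2D cross-correlation, the core CNN operation."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    """Non-linearity applied after each convolution."""
    return np.maximum(x, 0)

def max_pool(x, size=2):
    """Downsample by keeping the maximum in each size x size patch."""
    h, w = x.shape
    h, w = h - h % size, w - w % size
    return x[:h, :w].reshape(h // size, size, w // size, size).max(axis=(1, 3))

# Toy "hole photo": a bright ring (hole rim) on a dark background
yy, xx = np.mgrid[:32, :32]
r = np.hypot(yy - 16, xx - 16)
img = ((r > 8) & (r < 11)).astype(float)

edge_kernel = np.array([[-1, -1, -1],
                        [-1,  8, -1],
                        [-1, -1, -1]], dtype=float)  # simple edge detector

feature_map = max_pool(relu(conv2d(img, edge_kernel)))
```

A trained CNN learns many such kernels per layer from the labeled hole photos instead of using a fixed edge detector, and stacks these stages before the final classification layer.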

Keywords: classification, convolutional neural network, deep learning, remaining life of industrial cutting tools, ResNet, support vector machine, VGGNet

Procedia PDF Downloads 57
13694 Vulnerability of People to Climate Change: Influence of Methods and Computation Approaches on Assessment Outcomes

Authors: Adandé Belarmain Fandohan

Abstract:

Climate change has become a major concern globally, particularly in rural communities that must find rapid coping solutions. Several vulnerability assessment approaches have been developed in recent decades. This brings a higher risk that different methods yield different conclusions, making comparisons difficult and decision-making inconsistent across areas. The effect of methods and computational approaches on estimates of people's vulnerability was assessed using data collected from the Gambia. Twenty-four indicators reflecting the vulnerability components (exposure, sensitivity, and adaptive capacity) were selected for this purpose. Data were collected through household surveys and key informant interviews. One hundred and fifteen respondents were surveyed across six communities and two administrative districts. Results were compared across three computational approaches: maximum-value transformation normalization, z-score transformation normalization, and simple averaging. Regardless of the approach used, communities with high exposure to climate change and extreme events were the most vulnerable. Furthermore, vulnerability was strongly related to the socio-economic characteristics of farmers. The survey evidenced variability in vulnerability among communities and administrative districts. Comparing outputs across approaches, people in the study area were, overall, found to be highly vulnerable under simple averaging and maximum-value transformation, but only moderately vulnerable under the z-score transformation. It is suggested that such approach-induced discrepancies be accounted for in international debates aimed at harmonizing and standardizing assessment approaches so that outputs become comparable across regions. This would also increase the relevance of decision-making for adaptation policies.
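The three computational approaches can be sketched for a hypothetical indicator matrix; the random scores below are stand-ins for the 24 survey indicators, and the sketch omits indicator weighting that real indices often apply.

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical indicator scores: 6 communities x 4 indicators
scores = rng.uniform(1, 5, size=(6, 4))

# Approach 1: maximum-value transformation, x / max per indicator (range (0, 1])
max_norm = scores / scores.max(axis=0)

# Approach 2: z-score transformation, (x - mean) / std per indicator
z_norm = (scores - scores.mean(axis=0)) / scores.std(axis=0)

# Composite vulnerability index: average across indicators
vi_max = max_norm.mean(axis=1)
vi_z = z_norm.mean(axis=1)

# Approach 3: simple averaging of the raw scores
vi_raw = scores.mean(axis=1)
```

The relative ordering of communities tends to be similar across approaches, but the absolute index values land on different scales (bounded positive for max-value, centered on zero for z-score), so any fixed "high" versus "moderate" cut-off classifies the same community differently, which is the discrepancy the study documents.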

Keywords: maximum value transformation, simple averaging, vulnerability assessment, West Africa, z-score transformation

Procedia PDF Downloads 87
13693 Quantified Metabolomics for the Determination of Phenotypes and Biomarkers across Species in Health and Disease

Authors: Miroslava Cuperlovic-Culf, Lipu Wang, Ketty Boyle, Nadine Makley, Ian Burton, Anissa Belkaid, Mohamed Touaibia, Marc E. Surrette

Abstract:

Metabolic changes are a major factor in the development of a variety of diseases in various species. The metabolism of agricultural plants is altered following infection with pathogens, sometimes contributing to resistance; at the same time, pathogens use metabolites for infection and progression. In humans, for example, altered metabolism is a hallmark of cancer development. Quantified metabolomics data, combined with other omics or clinical data and analyzed using various unsupervised and supervised methods, can lead to better diagnosis and prognosis. It can also provide information about resistance and contribute knowledge of compounds significant for disease progression or prevention. In this work, different methods for metabolomics quantification and analysis from Nuclear Magnetic Resonance (NMR) measurements, used for the investigation of disease development in wheat and human cells, will be presented. One-dimensional 1H NMR spectra are used extensively for metabolic profiling due to their high reliability, wide range of applicability, speed, trivial sample preparation, and low cost. This presentation describes a new method for metabolite quantification from NMR data that aligns spectra of standards to sample spectra and then optimizes, by multivariate linear regression, the fit of the assigned metabolite spectra to the samples' spectra. Several alignment methods were tested, and the multivariate linear regression results were compared with other quantification methods. Quantified metabolomics data can be analyzed in a variety of ways, and we will present different clustering methods used for phenotype determination, network analysis providing knowledge about the relationships between metabolites through the metabolic network, and biomarker selection providing novel markers.
These analysis methods have been utilized for the investigation of Fusarium head blight resistance in wheat cultivars as well as for analysis of the effect of estrogen receptor and carbonic anhydrase activation and inhibition on breast cancer cell metabolism. Metabolic changes in spikelets of the wheat cultivars FL62R1, Stettler, MuchMore, and Sumai3 following Fusarium graminearum infection were explored. Extensive 1D 1H and 2D NMR measurements provided information for detailed metabolite assignment and quantification, leading to possible metabolic markers discriminating resistance levels among wheat subtypes. The quantification data are compared to results obtained using other published methods. Fusarium-induced metabolic changes in different wheat varieties are discussed in the context of the metabolic network and resistance. Quantitative metabolomics has also been used to investigate the effect of targeted enzyme inhibition in cancer. In this work, the effect of 17β-estradiol and ferulic acid on the metabolism of ER+ breast cancer cells has been compared to their effect on ER- control cells. The effect of carbonic anhydrase inhibitors on the metabolic changes resulting from ER activation has also been determined. Metabolic profiles were studied using 1D and 2D metabolomic NMR experiments, combined with the identification and quantification of metabolites, and the results are annotated in the context of biochemical pathways.
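The regression step of such quantification can be sketched as fitting a sample spectrum with a linear combination of standard spectra; the Lorentzian line shapes, peak positions, and concentrations below are invented for illustration, and the published method additionally aligns the standards to the sample before regressing.

```python
import numpy as np

rng = np.random.default_rng(4)
n_points = 400                       # digitized chemical-shift axis

def lorentzian(center, width=4.0):
    """Idealized NMR line shape at a given position on the axis."""
    x = np.arange(n_points)
    return width**2 / ((x - center)**2 + width**2)

# Reference spectra of three assumed metabolite standards
standards = np.stack([lorentzian(80), lorentzian(200), lorentzian(310)])

# Synthetic sample: known concentrations plus measurement noise
true_conc = np.array([2.0, 0.5, 1.2])
sample = true_conc @ standards + 0.01 * rng.standard_normal(n_points)

# Multivariate linear regression: concentrations that best reproduce the
# sample spectrum as a linear combination of the aligned standards
conc, *_ = np.linalg.lstsq(standards.T, sample, rcond=None)
```

With well-separated peaks the least-squares fit recovers the concentrations closely; real 1H spectra have heavy peak overlap, which is why the alignment step and careful metabolite assignment matter so much in practice.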

Keywords: metabolic biomarkers, metabolic network, metabolomics, multivariate linear regression, NMR quantification, quantified metabolomics, spectral alignment

Procedia PDF Downloads 325