Search results for: hydrodynamic methods
14228 Wastes of Oil Drilling: Treatment Techniques and Their Effectiveness
Authors: Abbas Hadj Abbas, Hacini Massaoud, Aiad Lahcen
Abstract:
In the Hassi Messaoud oil industry, water-based mud systems (WBM) are generally used for drilling the first phase of a well. For the remaining phases, oil-based mud systems (OBM) are employed. In the field of oil exploration, a panoply of chemical products is employed in the formulation of drilling fluids. These components are of different natures, and their toxicity and biodegradability are ill-defined; they are nevertheless discharged into nature. In addition to the hydrocarbons (HC, such as diesel) that are a major constituent of oil-based mud, spills of a variety of other products and additives can also be observed on drilling sites. These wastes are usually stored in places called 'crud wastes' and may cause major problems for the ecosystem. To treat these wastes, we considered two methods: solidification/stabilization (chemical) and thermal treatment. In order to evaluate the treatment techniques, a series of analyses was performed on dozens of waste specimens before treatment. After that, and on the basis of our analyses of the wastes, we carried out diagnostic assessments of the pollution before and after solidification/stabilization. Finally, we performed analyses before and after the thermal treatment to check the efficiency of the methods followed in the study.
Keywords: waste treatment, oil pollution, norms, drilling wastes
14227 Non-Invasive Imaging of Human Tissue Using NIR Light
Authors: Ashwani Kumar
Abstract:
The use of NIR light to image biological tissue and to quantify its optical properties is a good choice over other, invasive methods. Optical tomography involves two steps. One is the forward problem and the other is the reconstruction problem. The forward problem consists of finding the measurements of transmitted light through the tissue from source to detector, given the spatial distribution of absorption and scattering properties. The second step is the reconstruction problem. In X-ray tomography, there are standard methods for reconstruction, such as the filtered back projection method or algebraic reconstruction methods. These methods cannot be applied as such in optical tomography due to the highly scattering nature of biological tissue. A hybrid algorithm for reconstruction has been implemented in this work, which takes into account the highly scattered paths taken by photons while back projecting the forward data obtained during Monte Carlo simulation. The reconstructed image suffers from blurring due to the point spread function.
Keywords: NIR light, tissue, blurring, Monte Carlo simulation
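The forward problem described above is typically solved by randomly walking photons through the scattering medium. The following is a minimal sketch of such a photon-stepping kernel in Python; the optical coefficients and the isotropic scattering rule are illustrative assumptions, not values or choices taken from this abstract.

```python
import numpy as np

# Hypothetical tissue parameters (per mm); a real study would use values
# appropriate to the tissue being imaged. These are for illustration only.
MU_A, MU_S = 0.01, 10.0          # absorption / scattering coefficients
MU_T = MU_A + MU_S               # total interaction coefficient
ALBEDO = MU_S / MU_T

def propagate_photon(rng, max_steps=1000):
    """Random-walk one photon and return its path and final weight."""
    pos = np.zeros(3)
    direction = np.array([0.0, 0.0, 1.0])    # launched along +z
    weight, path = 1.0, [pos.copy()]
    for _ in range(max_steps):
        step = -np.log(rng.random()) / MU_T   # sample free path length
        pos = pos + step * direction
        path.append(pos.copy())
        weight *= ALBEDO                      # deposit the absorbed fraction
        if weight < 1e-4:                     # photon considered terminated
            break
        # isotropic scattering (a real code would sample Henyey-Greenstein)
        cos_t = 2.0 * rng.random() - 1.0
        phi = 2.0 * np.pi * rng.random()
        sin_t = np.sqrt(1.0 - cos_t ** 2)
        direction = np.array([sin_t * np.cos(phi), sin_t * np.sin(phi), cos_t])
    return np.array(path), weight

rng = np.random.default_rng(0)
paths = [propagate_photon(rng) for _ in range(100)]
```

The recorded paths are what a hybrid reconstruction can use to weight the back projection instead of assuming straight-line propagation.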
14226 High Efficiency Electrolyte Lithium Battery and RF Characterization
Authors: Wei Quan, Liu Chao, Mohammed N. Afsar
Abstract:
The dielectric properties and ionic conductivity of novel "ceramic state" polymer electrolytes for high capacity lithium batteries are characterized by radio-frequency and microwave methods in two broad frequency ranges, from 50 Hz to 20 kHz and from 4 GHz to 40 GHz. This innovative solid polymer electrolyte is highly ionically conductive (10⁻³ S/cm at room temperature) from -40 °C to +150 °C and can be used in any battery application. Such a polymer exhibits properties more like a ceramic than a polymer. The various applied measurement methods produced accurate dielectric results for a comprehensive analysis of the electrochemical properties and the ion transport mechanism of this newly invented polymer electrolyte. Two techniques and instruments, employing air gap measurement with a capacitance bridge and in-waveguide measurement with a vector network analyzer, are applied to measure the complex dielectric spectra. The complex dielectric spectra are used to determine the complex alternating current electrical conductivity and thus the ionic conductivity.
Keywords: polymer electrolyte, dielectric permittivity, lithium battery, ionic relaxation, microwave measurement
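The last step, going from the measured complex permittivity to an AC conductivity, usually relies on the standard relation σ'(ω) = ω·ε₀·ε''(ω). A small sketch, with purely illustrative numbers rather than the measured spectra from this work:

```python
import numpy as np

EPS0 = 8.8541878128e-12  # vacuum permittivity, F/m

def ac_conductivity(freq_hz, eps_imag):
    """AC conductivity (S/m) from the imaginary part of relative permittivity."""
    omega = 2 * np.pi * np.asarray(freq_hz)
    return omega * EPS0 * np.asarray(eps_imag)

# Hypothetical dielectric-loss values at three frequencies (not measured data):
freqs = np.array([50.0, 1e3, 20e3])         # Hz
eps_loss = np.array([5e6, 2.5e5, 1.2e4])    # epsilon'' at those frequencies
print(ac_conductivity(freqs, eps_loss))     # -> sigma(omega) in S/m
```

The ionic (DC) conductivity is then read off from the frequency-independent plateau of σ'(ω).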
14225 Ending Communal Conflicts in Africa: The Relevance of Traditional Approaches to Conflict Resolution
Authors: Kindeye Fenta Mekonnen, Alagaw Ababu Kifle
Abstract:
The failure of international responses to armed conflict to address local preconditions for national stability has recently attracted what has been called the 'local turn' in peace building. This 'local turn' in peace building amplified a renewed interest in traditional/indigenous methods of conflict resolution, a field that has hitherto been dominated by anthropologists with their focus on the procedures and rituals of such approaches. This notwithstanding, there is still limited empirical work on the relevance of traditional methods of conflict resolution for ending localized conflicts vis-à-vis hybrid and modern approaches. The few exceptions to this generally draw their conclusions from very few (almost all successful) cases, which makes it difficult to judge the validity and cross-case application of their results. This paper seeks to fill these gaps by undertaking a quantitative analysis of the trend and applications of different communal conflict resolution initiatives, their potential to usher in long-term peace, and the extent to which their outcomes are influenced by the intensity and scope of a conflict. The paper makes the following three tentative conclusions. First, traditional mechanisms and traditional actors still dominate the communal conflict resolution landscape, either individually or in combination with other methods. Second, traditional mechanisms of conflict resolution tend to be more successful in ending a conflict and preventing its re-occurrence compared to hybrid and modern arrangements. This notwithstanding, and probably due to the scholarly call for a local turn in peace building, contemporary communal conflict resolution approaches are becoming less and less reliant on traditional mechanisms alone and (therefore) less effective. Third, there is as yet inconclusive evidence on whether hybridization is an asset or a liability in the resolution of communal conflicts and the extent to which this might be mediated by the intensity of a conflict.
Keywords: traditional conflict resolution, hybrid conflict resolution, communal conflict, relevance, conflict intensity
14224 Detection and Identification of Antibiotic Resistant UPEC Using FTIR-Microscopy and Advanced Multivariate Analysis
Authors: Uraib Sharaha, Ahmad Salman, Eladio Rodriguez-Diaz, Elad Shufan, Klaris Riesenberg, Irving J. Bigio, Mahmoud Huleihel
Abstract:
Antimicrobial drugs have played an indispensable role in controlling illness and death associated with infectious diseases in animals and humans. However, the increasing resistance of bacteria to a broad spectrum of commonly used antibiotics has become a global healthcare problem. Many antibiotics have lost their effectiveness since the beginning of the antibiotic era because many bacteria have adapted defenses against these antibiotics. Rapid determination of the antimicrobial susceptibility of a clinical isolate is often crucial for the optimal antimicrobial therapy of infected patients and in many cases can save lives. The conventional methods for susceptibility testing require the isolation of the pathogen from a clinical specimen by culturing on the appropriate media (this first culturing stage lasts 24 h). Then, chosen colonies are grown on media containing antibiotic(s), using micro-diffusion discs (the second culturing time is also 24 h), in order to determine the bacterial susceptibility. Other methods, such as genotyping methods, the E-test and automated methods, were also developed for testing antimicrobial susceptibility. Most of these methods are expensive and time-consuming. Fourier transform infrared (FTIR) microscopy is a rapid, safe, effective and low-cost method that has been widely and successfully used in different studies for the identification of various biological samples, including bacteria; nonetheless, its true potential in routine clinical diagnosis has not yet been established. The new modern infrared (IR) spectrometers with high spectral resolution enable measuring unprecedented biochemical information from cells at the molecular level. Moreover, the development of new bioinformatics analyses combined with IR spectroscopy becomes a powerful technique, which enables the detection of structural changes associated with resistance. The main goal of this study is to evaluate the potential of FTIR microscopy in tandem with machine learning algorithms for rapid and reliable identification of bacterial susceptibility to antibiotics in a time span of a few minutes. The UTI E. coli bacterial samples, which were identified at the species level by MALDI-TOF and examined for their susceptibility by the routine assay (micro-diffusion discs), were obtained from the bacteriology laboratories in Soroka University Medical Center (SUMC). These samples were examined by FTIR microscopy and analyzed by advanced statistical methods. Our results, based on 700 E. coli samples, were promising and showed that by using the infrared spectroscopic technique together with multivariate analysis, it is possible to classify the tested bacteria into sensitive and resistant with a success rate higher than 90% for eight different antibiotics. Based on these preliminary results, it is worthwhile to continue developing the FTIR microscopy technique as a rapid and reliable method for identifying antibiotic susceptibility.
Keywords: antibiotics, E. coli, FTIR, multivariate analysis, susceptibility, UTI
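A minimal sketch of the kind of multivariate workflow described here (spectral preprocessing, dimensionality reduction, and a classifier with cross-validation), written in Python with scikit-learn; the data shapes and the specific PCA/SVM choice are assumptions for illustration, not the authors' exact pipeline.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical data: rows are FTIR absorbance spectra of isolates,
# labels are 1 = resistant, 0 = sensitive for one antibiotic.
rng = np.random.default_rng(0)
X = rng.normal(size=(700, 900))      # 700 isolates x 900 wavenumber points
y = rng.integers(0, 2, size=700)

model = make_pipeline(
    StandardScaler(),                # normalise each wavenumber channel
    PCA(n_components=20),            # compress the correlated spectral features
    SVC(kernel="rbf"),               # discriminate sensitive vs resistant
)
scores = cross_val_score(model, X, y, cv=5)
print("cross-validated accuracy:", scores.mean())
```

With real spectra instead of random numbers, the cross-validated accuracy plays the role of the >90% success rate quoted in the abstract.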
14223 Equal Channel Angular Pressing of Al1050 Sheets: Experimental and Finite Element Survey
Authors: P. M. Keshtiban, M. Zdshakoyan, G. Faragi
Abstract:
Severe plastic deformation (SPD) methods are among the most successful ways to produce nano-structured materials from coarse-grained samples without changing the cross-sectional area. One of the most widely used methods in SPD processing is equal channel angular pressing (ECAP). In this paper, the ECAP process on Al1050 sheets was evaluated at room temperature by both experiments and the finite element method. Since one of the main objectives of SPD processes is to achieve a high equivalent plastic strain (PEEQ) in one cycle, the values of PEEQ were obtained by finite element simulation. The force-displacement curve was also obtained by FEM. To study the changes in mechanical properties, micro-hardness tests were conducted on the samples and the improvement in mechanical properties was investigated. Results show that there is good agreement between the FEM, theoretical and experimental results.
Keywords: Al1050, experiments, finite element method, severe plastic deformation
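For context, the equivalent plastic strain accumulated in ECAP is often estimated analytically with the Iwahashi relation below, where Φ is the channel intersection angle, Ψ the outer corner angle, and N the number of passes; this is the standard textbook estimate, not necessarily the expression used in the paper.

\[
\varepsilon_{N} = \frac{N}{\sqrt{3}}\left[2\cot\!\left(\frac{\Phi}{2}+\frac{\Psi}{2}\right)+\Psi\,\csc\!\left(\frac{\Phi}{2}+\frac{\Psi}{2}\right)\right]
\]

For the common die geometry Φ = 90° and Ψ = 0°, this gives an equivalent strain of roughly 1.15 per pass, which is the kind of value the simulated PEEQ can be checked against.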
14222 Numerical Simulation of Footing on Reinforced Loose Sand
Authors: M. L. Burnwal, P. Raychowdhury
Abstract:
Earthquakes lead to adverse effects on buildings resting on soft soils. Mitigating the response of shallow foundations on soft soil with different methods reduces settlement and provides foundation stability. A few methods, such as the rocking foundation (used in performance-based design), deep foundations, prefabricated drains, grouting, and vibro-compaction, are used to control the pore pressure and enhance the strength of loose soils. One of the problems with these methods is that the settlement is uncontrollable, leading to differential settlement of the footings, which can further lead to the collapse of buildings. The present study investigates the utility of geosynthetics as a potential improvement of the subsoil to reduce the earthquake-induced settlement of structures. A steel moment-resisting frame building resting on loose, liquefiable, dry soil, subjected to the Uttarkashi 1991 and Chamba 1995 earthquakes, is used for the soil-structure interaction (SSI) analysis. The continuum model can simultaneously simulate the structure, soil, interfaces, and geogrids in the OpenSees framework. The soil is modeled with the PressureDependentMultiYield (PDMY) material model and Quad elements, which provide stress-strain at Gauss points, and is calibrated to predict the behavior of Ganga sand. The model, analyzed with tied degree-of-freedom contact, reveals that the system responses align with the shake table experimental results. An attempt is made to study the responses of the footing, structure, and geosynthetics with unreinforced and reinforced bases and with varying parameters. The results show that geogrid reinforcement of the shallow foundation effectively reduces the settlement by 60%.
Keywords: settlement, shallow foundation, SSI, continuum FEM
14221 Adaption Model for Building Agile Pronunciation Dictionaries Using Phonemic Distance Measurements
Authors: Akella Amarendra Babu, Rama Devi Yellasiri, Natukula Sainath
Abstract:
Where human beings can easily learn and adopt pronunciation variations, machines need training before being put into use. Also, humans keep a minimal vocabulary, and its pronunciation variations are stored at the front end of their memory for ready reference, while machines keep the entire pronunciation dictionary for ready reference. Supervised methods are used for the preparation of pronunciation dictionaries, which takes large amounts of manual effort, cost and time, and is not suitable for real-time use. This paper presents an unsupervised adaptation model for building agile and dynamic pronunciation dictionaries online. These methods mimic the human approach to learning new pronunciations in real time. A new algorithm for measuring sound distances, called Dynamic Phone Warping, is presented and tested. The performance of the system is measured using an adaptation model, and the precision metric is found to be better than 86 percent.
Keywords: pronunciation variations, dynamic programming, machine learning, natural language processing
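The Dynamic Phone Warping algorithm itself is specific to this work, but the keywords point to a dynamic-programming alignment between phone sequences. A hedged sketch of that general idea, with a purely hypothetical phone-to-phone substitution cost:

```python
import numpy as np

def phone_warping_distance(seq_a, seq_b, sub_cost, indel=1.0):
    """Edit-distance style DP alignment between two phone sequences.

    sub_cost(p, q) is a hypothetical acoustic distance between phones p and q.
    """
    n, m = len(seq_a), len(seq_b)
    d = np.zeros((n + 1, m + 1))
    d[:, 0] = np.arange(n + 1) * indel
    d[0, :] = np.arange(m + 1) * indel
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d[i, j] = min(
                d[i - 1, j] + indel,                               # deletion
                d[i, j - 1] + indel,                               # insertion
                d[i - 1, j - 1] + sub_cost(seq_a[i - 1], seq_b[j - 1]),
            )
    return d[n, m] / max(n, m)   # normalise by the longer sequence

# Toy cost: identical phones cost 0, different phones cost 1.
toy_cost = lambda p, q: 0.0 if p == q else 1.0
print(phone_warping_distance(["g", "r", "iy", "s", "iy"],
                             ["g", "r", "iy", "z", "iy"], toy_cost))
```

A small alignment distance between a newly heard pronunciation and a stored one is what lets the dictionary absorb the variant without manual supervision.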
14220 A Network Approach to Analyzing Financial Markets
Authors: Yusuf Seedat
Abstract:
The necessity to understand global financial markets has increased following the unfortunate spread of the recent financial crisis around the world. Financial markets are considered to be complex systems consisting of highly volatile movements whose indexes fluctuate without any clear pattern. Analytic methods for stock prices have been proposed in which financial markets are modeled using common network analysis tools and methods. It has been found that two key components of social network analysis are relevant to modeling financial markets, allowing us to make accurate predictions of stock prices within the financial market. Financial markets have a number of interacting components, leading to complex behavioral patterns. This paper describes a social network approach to analyzing financial markets as a viable approach to studying the way complex stock markets function. We also look at how social network analysis techniques and metrics are used to gauge an understanding of the evolution of financial markets, as well as how community detection can be used to qualify and quantify influence within a network.
Keywords: network analysis, social networks, financial markets, stocks, nodes, edges, complex networks
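A minimal sketch of how such a stock network is commonly built and analysed, assuming a correlation-thresholded graph and modularity-based community detection; the tickers, returns, and threshold below are illustrative assumptions, not data from the paper.

```python
import numpy as np
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Hypothetical daily returns: two "sectors" of three stocks each, so that a
# correlation structure (and hence communities) exists by construction.
rng = np.random.default_rng(1)
tickers = ["AAA", "BBB", "CCC", "DDD", "EEE", "FFF"]
factors = rng.normal(size=(250, 2))                        # 250 trading days
loadings = np.array([[1, 0], [1, 0], [1, 0], [0, 1], [0, 1], [0, 1]])
returns = factors @ loadings.T + 0.5 * rng.normal(size=(250, 6))

corr = np.corrcoef(returns.T)

# Keep an edge only where the correlation is strong enough.
G = nx.Graph()
G.add_nodes_from(tickers)
for i in range(len(tickers)):
    for j in range(i + 1, len(tickers)):
        if abs(corr[i, j]) >= 0.3:
            G.add_edge(tickers[i], tickers[j], weight=abs(corr[i, j]))

communities = greedy_modularity_communities(G, weight="weight")
print([sorted(c) for c in communities])      # groups of co-moving stocks
influence = nx.degree_centrality(G)          # one simple "influence" metric
```

Community detection here recovers the groups of stocks that move together, and centrality measures give one way to quantify influence within the network.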
14219 Application of Post-Stack and Pre-Stack Seismic Inversion for Prediction of Hydrocarbon Reservoirs in a Persian Gulf Gas Field
Authors: Nastaran Moosavi, Mohammad Mokhtari
Abstract:
Seismic inversion is a technique which has been in use for years, and its main goal is to estimate and to model the physical characteristics of rocks and fluids. Generally, it is a combination of seismic and well-log data. Seismic inversion can be carried out through different methods; we have conducted and compared post-stack and pre-stack seismic inversion methods on real data in one of the fields in the Persian Gulf. Pre-stack seismic inversion can transform seismic data into rock physics parameters such as P-impedance, S-impedance and density, while post-stack seismic inversion can only estimate P-impedance. These parameters can then be used in reservoir identification. Based on the results of inverting the seismic data, a gas reservoir was detected in one of the hydrocarbon fields in the south of Iran (Persian Gulf). By comparing post-stack and pre-stack seismic inversion, it can be concluded that pre-stack seismic inversion provides more reliable and detailed information for the identification and prediction of hydrocarbon reservoirs.
Keywords: density, P-impedance, S-impedance, post-stack seismic inversion, pre-stack seismic inversion
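For readers unfamiliar with post-stack inversion, the simplest (recursive) form of the reflectivity-to-impedance relation is sketched below; real workflows, presumably including the one used here, add wavelet removal, a low-frequency model from well logs, and constraints, so this is only an illustration.

```python
import numpy as np

def recursive_impedance(reflectivity, z0):
    """Classic recursive (post-stack style) inversion:
    Z[i+1] = Z[i] * (1 + r[i]) / (1 - r[i]),
    turning a reflectivity series into acoustic (P-)impedance."""
    z = [z0]
    for r in reflectivity:
        z.append(z[-1] * (1.0 + r) / (1.0 - r))
    return np.array(z)

# Hypothetical reflectivity trace and a starting impedance from a well log.
refl = np.array([0.02, -0.01, 0.15, -0.12, 0.03])
z_trace = recursive_impedance(refl, z0=6.0e6)   # impedance in kg/(m^2 s), illustrative
print(z_trace)
```

Pre-stack inversion extends the same idea by fitting amplitude-versus-offset behaviour, which is what yields S-impedance and density in addition to P-impedance.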
14218 A Data-Driven Monitoring Technique Using Combined Anomaly Detectors
Authors: Fouzi Harrou, Ying Sun, Sofiane Khadraoui
Abstract:
Anomaly detection based on Principal Component Analysis (PCA) has been studied intensively and largely applied to multivariate processes with highly cross-correlated process variables. Monitoring metrics such as Hotelling's T² and the Q statistic are usually used in PCA-based monitoring to elucidate the pattern variations in the principal and residual subspaces, respectively. However, these metrics are ill-suited to detect small faults. In this paper, Exponentially Weighted Moving Average (EWMA) charts based on the Q and T² statistics, T²-EWMA and Q-EWMA, were developed for detecting faults in the process mean. The performance of the proposed methods was compared with that of the conventional PCA-based fault detection method using synthetic data. The results clearly show the benefit and the effectiveness of the proposed methods over the conventional PCA method, especially for detecting small faults in highly correlated multivariate data.
Keywords: data-driven method, process control, anomaly detection, dimensionality reduction
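A compact sketch of the combination described here, computing the PCA residual (Q) statistic and smoothing it with an EWMA so that small, persistent mean shifts accumulate; the synthetic data, number of components, and smoothing factor are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA

def q_statistic(X, pca):
    """Squared prediction error (Q) of each sample in the residual subspace."""
    X_hat = pca.inverse_transform(pca.transform(X))
    return np.sum((X - X_hat) ** 2, axis=1)

def ewma(x, lam=0.2):
    """Exponentially weighted moving average of a monitoring statistic."""
    z = np.zeros_like(x, dtype=float)
    z[0] = x[0]
    for t in range(1, len(x)):
        z[t] = lam * x[t] + (1 - lam) * z[t - 1]
    return z

# Hypothetical highly correlated process data with a small mean shift at t = 150.
rng = np.random.default_rng(0)
base = rng.normal(size=(300, 1))
X = np.hstack([base + 0.1 * rng.normal(size=(300, 1)) for _ in range(6)])
X[150:] += 0.3                                    # small fault in the mean

pca = PCA(n_components=2).fit(X[:100])            # train on fault-free data
q = q_statistic(X, pca)
q_ewma = ewma(q)          # the smoothed statistic lets the small shift accumulate
```

The same EWMA treatment applied to Hotelling's T² gives the T²-EWMA detector, and both are compared against control limits estimated from the fault-free training data.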
14217 Study for Establishing a Concept of Underground Mining in a Folded Deposit with Weathering
Authors: Chandan Pramanik, Bikramjit Chanda
Abstract:
Large metal mines operated with open-cast mining methods must transition to underground mining at the conclusion of the open-pit operation; however, this involves a difficult period in which production is constrained by interference between the two mining methods. A transition model with collaborative mining operations is presented and established in this work, based on the case of the South Kaliapani Underground Project, to address these technical issues of inadequate production security and other mining challenges during the transition phase and beyond. By integrating the technology of the small-scale Drift and Fill method with highly productive Sub Level Open Stoping in the deep section, this hybrid mining concept tries to eliminate major bottlenecks and offers an optimized production profile with safe and sustainable operation. Considering every geo-mining aspect, this study offers a genuine and precise technical deliberation on the transition from open pit to underground mining.
Keywords: drift and fill, geo-mining aspect, sublevel open stoping, underground mining method
14216 Research of the Three-Dimensional Visualization Geological Modeling of Mine Based on Surpac
Authors: Honggang Qu, Yong Xu, Rongmei Liu, Zhenji Gao, Bin Wang
Abstract:
Today's mining industry is advancing gradually in a digital and visual direction. Three-dimensional visual geological modeling of a mine is the digital characterization of mineral deposits and is one of the key technologies of digital mining. Three-dimensional geological modeling is a technology that combines geological spatial information management, geological interpretation, geological spatial analysis and prediction, geostatistical analysis, entity content analysis and graphic visualization in a three-dimensional environment with computer technology, and it is used in geological analysis. In this paper, a three-dimensional geological model of an iron mine is constructed using Surpac, the weighting difference between the two estimation methods, the inverse distance power method and ordinary kriging, is studied, and the ore body volume and reserves are simulated and calculated using these two methods. Compared with the actual mine reserves, the results are relatively accurate, so they provide a scientific basis for mine resource assessment, reserve calculation, mining design and so on.
Keywords: three-dimensional geological modeling, geological database, geostatistics, block model
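A minimal sketch of the inverse-distance estimator mentioned above, applied to one block centroid of the block model; the drill-hole coordinates and grades are hypothetical, and ordinary kriging would instead derive the weights from a fitted variogram under an unbiasedness constraint.

```python
import numpy as np

def idw_estimate(xy_samples, grades, xy_block, power=2.0, eps=1e-12):
    """Inverse-distance-weighted grade estimate for one block centroid."""
    d = np.linalg.norm(xy_samples - xy_block, axis=1) + eps
    w = 1.0 / d ** power
    return np.sum(w * grades) / np.sum(w)

# Hypothetical drill-hole composites (x, y) and Fe grades (%).
samples = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
grades = np.array([32.0, 35.5, 30.2, 38.1])
block = np.array([4.0, 6.0])
print(idw_estimate(samples, grades, block))   # grade assigned to this block
```

Summing grade times block tonnage over all estimated blocks is what yields the reserve figures that the paper compares against the actual mine reserves.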
14215 Implementation and Comparative Analysis of PET and CT Image Fusion Algorithms
Authors: S. Guruprasad, M. Z. Kurian, H. N. Suma
Abstract:
Medical imaging modalities are becoming life-saving components. These modalities are very much essential to doctors for proper diagnosis, treatment planning and follow-up. Some modalities provide anatomical information, such as Computed Tomography (CT), Magnetic Resonance Imaging (MRI) and X-rays, and some provide only functional information, such as Positron Emission Tomography (PET). Therefore, a single-modality image does not give complete information. This paper presents the fusion of the structural information in CT and the functional information present in the PET image. This fused image is essential in detecting the stages and location of abnormalities and, in particular, is very much needed in oncology for improved diagnosis and treatment. We have implemented and compared image fusion techniques such as pyramid, wavelet and principal component fusion methods, along with a hybrid method of DWT and PCA. The performances of the algorithms are evaluated quantitatively and qualitatively. The system is implemented and tested using MATLAB software. Based on the MSE, PSNR and entropy analysis, the PCA and DWT-PCA methods showed the best results over all experiments.
Keywords: image fusion, pyramid, wavelets, principal component analysis
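The paper's implementation is in MATLAB; purely as an illustration of the DWT-PCA idea, here is a hedged Python sketch that fuses two registered slices by PCA-weighting the approximation bands and taking the maximum-absolute detail coefficients, followed by one of the quoted metrics. The wavelet choice, fusion rules, and data are assumptions, not the authors' settings.

```python
import numpy as np
import pywt

def pca_weights(a, b):
    """First-principal-component weights for two coefficient sets."""
    cov = np.cov(np.vstack([a.ravel(), b.ravel()]))
    vals, vecs = np.linalg.eigh(cov)
    pc = np.abs(vecs[:, np.argmax(vals)])
    return pc / pc.sum()

def dwt_pca_fuse(img_a, img_b, wavelet="db2"):
    """Fuse two registered images (e.g. CT and PET slices):
    PCA-weighted average of approximation bands, max-abs rule for details."""
    ca, (ch_a, cv_a, cd_a) = pywt.dwt2(img_a, wavelet)
    cb, (ch_b, cv_b, cd_b) = pywt.dwt2(img_b, wavelet)
    w_a, w_b = pca_weights(ca, cb)
    pick = lambda x, y: np.where(np.abs(x) >= np.abs(y), x, y)
    fused = (w_a * ca + w_b * cb,
             (pick(ch_a, ch_b), pick(cv_a, cv_b), pick(cd_a, cd_b)))
    return pywt.idwt2(fused, wavelet)

# Hypothetical 128x128 registered slices standing in for CT and PET data.
rng = np.random.default_rng(0)
ct, pet = rng.random((128, 128)), rng.random((128, 128))
fused = dwt_pca_fuse(ct, pet)
mse = np.mean((fused - ct) ** 2)                 # one of the quoted metrics
psnr = 10 * np.log10(1.0 / mse) if mse > 0 else np.inf
```

Entropy of the fused image, the third quoted metric, measures how much information the fusion retains from both inputs.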
14214 Geometric, Energetic and Topological Analysis of (Ethanol)₉-Water Heterodecamers
Authors: Jennifer Cuellar, Angie L. Parada, Kevin N. S. Chacon, Sol M. Mejia
Abstract:
The purification of bio-ethanol through distillation methods is an unresolved issue in the biofuel industry because of the ethanol-water azeotrope formation, which increases the number of steps in the purification process and subsequently increases the production costs. Therefore, understanding the nature of the mixture at the molecular level could provide new insights for improving the current methods and/or designing new and more efficient purification methods. For that reason, the present study focuses on the evaluation and analysis of (ethanol)₉-water heterodecamers, as the systems with the minimum molecular proportion that represents the azeotropic concentration (96% m/m in ethanol). The computational modelling was carried out with B3LYP-D3/6-311++G(d,p) in Gaussian 09. Initial explorations of the potential energy surface were done through two methods, simulated annealing runs and molecular dynamics trajectories, besides intuitive structures obtained from smaller (ethanol)n-water heteroclusters, n = 7, 8 and 9. The energetic order of the seven stable heterodecamers identifies the most stable heterodecamer (Hdec-1) as a structure forming a bicyclic geometry with the O-H---O hydrogen bonds (HBs), where the water is a double proton donor molecule. Hdec-1 combines 1 water molecule and the same quantity of every ethanol conformer, that is, 3 trans, 3 gauche-1 and 3 gauche-2; its abundance is 89%, and its decamerization energy is -80.4 kcal/mol, i.e., 13 kcal/mol more stable than the least stable heterodecamer. Besides, as a way to understand why methanol does not form an azeotropic mixture with water, analogous systems ((ethanol)₁₀, (methanol)₁₀, and (methanol)₉-water) were optimized. Topological analysis of the electron density reveals that Hdec-1 forms 33 weak interactions in total: 11 O-H---O, 8 C-H---O and 2 C-H---C hydrogen bonds and 12 H---H interactions. The strength and abundance of the most unconventional interactions (H---H, C-H---O and C-H---C) seem to explain the preference of ethanol for forming heteroclusters instead of clusters. Besides, the O-H---O HBs present a significant covalent character according to topological parameters such as the Laplacian of the electron density and the relationship between the potential and kinetic energy densities evaluated at the bond critical points, which give negative values and values between 1 and 2, respectively, for those two topological parameters.
Keywords: ADMP, DFT, ethanol-water azeotrope, Grimme dispersion correction, simulated annealing, weak interactions
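The decamerization energy quoted above is presumably the usual supermolecular association energy; this is stated here as an assumption rather than as the paper's exact definition (which may also include zero-point or counterpoise corrections):

\[
\Delta E_{\mathrm{dec}} = E_{(\mathrm{EtOH})_{9}\cdot\mathrm{H_{2}O}} \;-\; 9\,E_{\mathrm{EtOH}} \;-\; E_{\mathrm{H_{2}O}}
\]

where each term is the energy of the separately optimized cluster or monomer at the same level of theory, so that a more negative value indicates a more stable heterodecamer.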
14213 Beyond Informality: Relocation from a Traditional Village 'Mit Oqbah' to Masaken El-Barageel and the Role of ‘Urf in Governing Built Environment, Egypt
Authors: Sarah Eldefrawi, Maike Didero
Abstract:
In Egypt, residents' urban interventions (colloquially named A'hali's interventions) are routinely treated by government, scholars, and media as encroachment (taeadiyat), chaotic (a'shwa'i) or informal (gheir mokanan) practices. This paper argues that those interventions cannot be simply described as an encroachment on public space or chaotic behaviour. We claim here that they are relevant to traditional governing methods ('Urf) that governed Arab cities for many decades. Through an in-depth field study conducted in a real estate public housing project in the city of Giza called 'Masaken El-Barageel', we traced the urban transformations demonstrated in private and public spaces. To understand those transformations, we used a wide range of qualitative research methods, such as semi-guided and informal interviews, observations, and mapping of the built environment and the newly added interventions. The study was also strengthened by the author's contributions in studying nine sectors created by A'hali in six districts of Greater Cairo. The results of this study indicate that a culturally and socially sensitive framework has to be related to the individual actions toward the spatial and social structures, as well as to the culturally transmitted views and meanings connected with 'Urf. The study could trace three crucial principles in 'Urf that influenced these interventions: the elimination of harm (Al-Marafiq wa Man' al-Darar), the appropriation of space (Haqq el-Intefa'), and the public interest (maslaha a'ma). Our findings open the discussion on the (il)legitimacy of A'hali governing methods in contemporary cities.
Keywords: Urf, urban governance, public space, public housing, encroachments, chaotic, Egyptian cities
14212 Architectural Visualization: From Ancient Civilizations to the Roman Empire
Authors: Matthias Stange
Abstract:
Architectural visualization has been practiced for as long as there have been buildings. Visualization (from Latin visibilis, "visible") generally refers to bringing abstract data and relationships into a graphically and visually comprehensible form. In particular, visualization refers to the process of translating relationships that are difficult to formulate linguistically or logically into visual media (e.g., drawings or models) to make them comprehensible. Building owners have always been interested in knowing how their building will look before it is built. In the empirical part of this study, the roots of architectural visualization are examined, starting from the ancient civilizations and reaching to the end of the Roman Empire. Extensive literature research on architectural theory and architectural history forms the basis for this analysis. The focus of the analysis is basic research, from the emergence of the first two-dimensional drawings in the Neolithic period to the triggers of significant further developments of architectural representation, as well as their importance for subsequent methods and the transmission of knowledge over the following epochs. The analysis focuses on the development of analog methods of representation, from the first Neolithic house floor plans to the detailed Greek stone models and paper drawings in the Roman Empire. In particular, the question of socio-cultural, socio-political and economic changes as possible triggers for the development of representational media and methods is analyzed. The study has shown that the development of visual building representation has been driven by scientific, technological and social developments since the emergence of the first civilizations more than 6000 years ago, beginning with the change in humans' subsistence strategy from food appropriation by hunting and gathering to food production by agriculture and livestock, and the sedentary lifestyle this required.
Keywords: ancient Greece, ancient orient, Roman Empire, architectural visualization
14211 Optimization of the Administration of Intravenous Medication by Reduction of the Residual Volume, Taking User-Friendliness, Cost Efficiency, and Safety into Account
Authors: A. Poukens, I. Sluyts, A. Krings, J. Swartenbroekx, D. Geeroms, J. Poukens
Abstract:
Introduction and Objectives: It has been known for many years that with the administration of intravenous medication, a rather significant part of the infusion solution planned to be administered, the residual volume (the volume that remains in the IV line and/or infusion bag), does not reach the patient and is wasted. This could possibly result in underdosage and a diminished therapeutic effect. Despite the important impact on the patient, the reduction of the residual volume lacks attention. An optimized and clearly stated protocol concerning the reduction of the residual volume in an IV line is necessary for each hospital. As described in my Master's thesis for the degree of Master in Hospital Pharmacy, the administration of intravenous medication can be optimized by reduction of the residual volume. Herewith, effectiveness, user-friendliness, cost efficiency and safety were taken into account. Material and Methods: By means of a literature study and an online questionnaire sent out to all Flemish hospitals and hospitals in the Netherlands (province of Limburg), current flush methods could be mapped out. In laboratory research, possible flush methods aiming to reduce the residual volume were measured. Furthermore, a self-developed experimental method to reduce the residual volume was added to the study. The current flush methods and the self-developed experimental method were compared to each other based on cost efficiency, user-friendliness and safety. Results: There is a major difference between the Flemish hospitals and the hospitals in the Netherlands (province of Limburg) concerning the approach and method of flushing IV lines after the administration of intravenous medication. The residual volumes were measured, and laboratory research showed that if flushing was done with at least one time the equivalent of the residual volume, 95 percent of the glucose would be flushed through. Based on the comparison, it became clear that flushing by use of a pre-filled syringe would be the most cost-efficient, user-friendly and safest method. According to the laboratory research, the self-developed experimental method is feasible and has the advantage that the remaining fraction of the medication can be administered to the patient in unchanged concentration, without dilution. Furthermore, this technique can be applied regardless of the level of the residual volume. Conclusion and Recommendations: It is recommendable to revise the current infusion systems and flushing methods in most hospitals. Aside from the education of the hospital staff and alignment on a uniform, substantiated protocol, an optimized and clear policy on the reduction of the residual volume is necessary for each hospital. It is recommended to flush all IV lines with rinsing fluid of at least the equivalent of the residual volume. Further laboratory and clinical research on the self-developed experimental method is needed before this method can be implemented clinically in a broader setting.
Keywords: intravenous medication, infusion therapy, IV flushing, residual volume
14210 Retrospective Evaluation of Vector-borne Infections in Cats Living in Germany (2012-2019)
Authors: I. Schäfer, B. Kohn, M. Volkmann, E. Müller
Abstract:
Introduction: Blood-feeding arthropods transmit parasitic, bacterial, or viral pathogens to domestic animals and wildlife. Vector-borne infections are gaining significance due to the increase in travel, the import of domestic animals from abroad, and the changing climate in Europe. Aims of the study: The main objective of this retrospective study was to assess the prevalence of vector-borne infections in cats in which a 'Feline Travel Profile' had been conducted. Material and Methods: This retrospective study included test results from cats for which a 'Feline Travel Profile' established by LABOKLIN had been requested by veterinarians between April 2012 and December 2019. This profile contains direct detection methods via polymerase chain reaction (PCR) for Hepatozoon spp. and Dirofilaria spp. as well as indirect detection methods via immunofluorescence antibody test (IFAT) for Ehrlichia spp. and Leishmania spp. This profile was expanded to include an IFAT for Rickettsia spp. from July 2015 onwards. The prevalence of the different vector-borne infectious agents was calculated. Results: A total of 602 cats were tested using the 'Feline Travel Profile'. Positive test results were as follows: Rickettsia spp. IFAT 54/442 (12.2%), Ehrlichia spp. IFAT 68/602 (11.3%), Leishmania spp. IFAT 21/602 (3.5%), Hepatozoon spp. PCR 51/595 (8.6%), and Dirofilaria spp. PCR 1/595 cats (0.2%). Co-infections with more than one pathogen could be detected in 22/602 cats. Conclusions: 170/602 cats (28.2%) tested positive for at least one vector-borne pathogen. Infections with multiple pathogens could be detected in 3.7% of the cats. The data emphasize the importance of considering vector-borne infections as potential differential diagnoses in cats.
Keywords: arthropod-transmitted infections, feline vector-borne infections, Germany, laboratory diagnostics
14209 Prediction of Remaining Life of Industrial Cutting Tools with Deep Learning-Assisted Image Processing Techniques
Authors: Gizem Eser Erdek
Abstract:
This study is research on predicting the remaining life of industrial cutting tools used in the industrial production process with deep learning methods. When the life of cutting tools decreases, they cause damage to the raw material they are processing. This study aims to predict the remaining life of the cutting tool based on the damage caused by the cutting tools to the raw material. For this, hole photos were collected from the hole-drilling machine for 8 months. The photos were labeled in 5 classes according to hole quality. In this way, the problem was transformed into a classification problem. Using the prepared data set, a model was created with convolutional neural networks, which is a deep learning method. In addition, the VGGNet and ResNet architectures, which have been successful in the literature, were tested on the data set. A hybrid model using convolutional neural networks and support vector machines was also used for comparison. When all models are compared, it has been determined that the model in which convolutional neural networks are used gives successful results, with a 74% accuracy rate. In preliminary studies, the data set was arranged to include only the best and worst classes, and the study gave ~93% accuracy when the binary classification model was applied. The results of this study showed that the remaining life of cutting tools can be predicted by deep learning methods based on the damage to the raw material. Experiments have proven that deep learning methods can be used as an alternative for cutting tool life estimation.
Keywords: classification, convolutional neural network, deep learning, remaining life of industrial cutting tools, ResNet, support vector machine, VGGNet
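A minimal Keras sketch of a 5-class CNN for hole-quality images; the layer sizes, input shape, and training settings are assumptions for illustration and are not the architectures (CNN, VGGNet, ResNet, CNN+SVM) evaluated in the study.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Hypothetical input shape and class count matching the 5 hole-quality labels.
NUM_CLASSES, IMG_SHAPE = 5, (128, 128, 3)

model = models.Sequential([
    layers.Input(shape=IMG_SHAPE),
    layers.Conv2D(16, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.GlobalAveragePooling2D(),
    layers.Dropout(0.3),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_images, train_labels, validation_split=0.2, epochs=20)
```

The hybrid variant mentioned in the abstract would replace the softmax head with an SVM trained on the features produced by the convolutional layers.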
14208 Vulnerability of People to Climate Change: Influence of Methods and Computation Approaches on Assessment Outcomes
Authors: Adandé Belarmain Fandohan
Abstract:
Climate change has become a major concern globally, particularly in rural communities that have to find rapid coping solutions. Several vulnerability assessment approaches have been developed in the last decades. This comes along with a higher risk that different methods will result in different conclusions, thereby making comparisons difficult and decision-making inconsistent across areas. The effect of methods and computational approaches on estimates of people's vulnerability was assessed using data collected from the Gambia. Twenty-four indicators reflecting the vulnerability components (exposure, sensitivity, and adaptive capacity) were selected for this purpose. Data were collected through household surveys and key informant interviews. One hundred and fifteen respondents were surveyed across six communities and two administrative districts. Results were compared over three computational approaches: maximum value transformation normalization, z-score transformation normalization, and simple averaging. Regardless of the approach used, communities that have high exposure to climate change and extreme events were the most vulnerable. Furthermore, vulnerability was strongly related to the socio-economic characteristics of farmers. The survey evidenced variability in vulnerability among communities and administrative districts. Comparing outputs across approaches, overall, people in the study area were found to be highly vulnerable using the simple average and maximum value transformation, whereas they were only moderately vulnerable using the z-score transformation approach. It is suggested that assessment approach-induced discrepancies be accounted for in international debates to harmonize/standardize assessment approaches to the end of making outputs comparable across regions. This will also likely increase the relevance of decision-making for adaptation policies.
Keywords: maximum value transformation, simple averaging, vulnerability assessment, West Africa, z-score transformation
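A small sketch of the two normalization schemes being compared, applied to a hypothetical household-by-indicator matrix before averaging into a composite vulnerability index; the data, the unweighted aggregation, and the exact form of the "maximum value transformation" (here simply division by the column maximum) are assumptions, not the study's precise index construction.

```python
import numpy as np

def max_value_norm(x):
    """Scale each indicator by its maximum observed value."""
    x = np.asarray(x, dtype=float)
    return x / x.max(axis=0)

def z_score_norm(x):
    """Standardize each indicator to zero mean and unit variance."""
    x = np.asarray(x, dtype=float)
    return (x - x.mean(axis=0)) / x.std(axis=0)

# Hypothetical indicator matrix: rows = respondents, columns = indicators.
rng = np.random.default_rng(0)
indicators = rng.random((115, 24))

# Composite vulnerability index as a simple (unweighted) average of indicators,
# computed under the two normalization schemes compared in the study.
vi_max = max_value_norm(indicators).mean(axis=1)
vi_z = z_score_norm(indicators).mean(axis=1)
print(vi_max.mean(), vi_z.mean())   # different scales -> different "levels"
```

Because the z-score index is centred on zero while the max-value index lives on a 0-1 scale, the same households can land in different vulnerability categories, which is exactly the discrepancy the abstract reports.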
14207 Quantified Metabolomics for the Determination of Phenotypes and Biomarkers across Species in Health and Disease
Authors: Miroslava Cuperlovic-Culf, Lipu Wang, Ketty Boyle, Nadine Makley, Ian Burton, Anissa Belkaid, Mohamed Touaibia, Marc E. Surrette
Abstract:
Metabolic changes are one of the major factors in the development of a variety of diseases in various species. The metabolism of agricultural plants is altered following infection with pathogens, sometimes contributing to resistance. At the same time, pathogens use metabolites for infection and progression. In humans, altered metabolism is a hallmark of cancer development, for example. Quantified metabolomics data combined with other omics or clinical data and analyzed using various unsupervised and supervised methods can lead to better diagnosis and prognosis. It can also provide information about resistance as well as contribute knowledge of compounds significant for disease progression or prevention. In this work, different methods for metabolomics quantification and analysis from Nuclear Magnetic Resonance (NMR) measurements that are used for the investigation of disease development in wheat and human cells will be presented. One-dimensional 1H NMR spectra are used extensively for metabolic profiling due to their high reliability, wide range of applicability, speed, trivial sample preparation and low cost. This presentation will describe a new method for metabolite quantification from NMR data that combines the alignment of spectra of standards to sample spectra, followed by multivariate linear regression optimization of the spectra of assigned metabolites to the samples' spectra. Several different alignment methods were tested, and the multivariate linear regression result has been compared with other quantification methods. Quantified metabolomics data can be analyzed in a variety of ways, and we will present different clustering methods used for phenotype determination, network analysis providing knowledge about the relationships between metabolites through the metabolic network, as well as biomarker selection providing novel markers. These analysis methods have been utilized for the investigation of fusarium head blight resistance in wheat cultivars as well as the analysis of the effect of estrogen receptor and carbonic anhydrase activation and inhibition on breast cancer cell metabolism. Metabolic changes in spikelets of the wheat cultivars FL62R1, Stettler, MuchMore and Sumai3 following Fusarium graminearum infection were explored. Extensive 1D 1H and 2D NMR measurements provided information for detailed metabolite assignment and quantification, leading to possible metabolic markers discriminating the resistance level in wheat subtypes. The quantification data are compared to results obtained using other published methods. Fusarium infection-induced metabolic changes in different wheat varieties are discussed in the context of the metabolic network and resistance. Quantitative metabolomics has also been used for the investigation of the effect of targeted enzyme inhibition in cancer. In this work, the effect of 17β-estradiol and ferulic acid on the metabolism of ER+ breast cancer cells has been compared to their effect on ER- control cells. The effect of inhibitors of carbonic anhydrase on the observed metabolic changes resulting from ER activation has also been determined. Metabolic profiles were studied using 1D and 2D metabolomic NMR experiments, combined with the identification and quantification of metabolites, and the annotation of the results is provided in the context of biochemical pathways.
Keywords: metabolic biomarkers, metabolic network, metabolomics, multivariate linear regression, NMR quantification, quantified metabolomics, spectral alignment
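The quantification step described above, fitting assigned reference spectra to a sample spectrum by multivariate linear regression, can be sketched as a constrained least-squares problem; the spectra below are synthetic placeholders, and the non-negativity constraint is an assumption, not necessarily the exact optimization used by the authors.

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical setup: each row of `references` is the (aligned) 1H NMR spectrum
# of one pure metabolite standard at unit concentration; `sample` is the
# measured mixture spectrum on the same chemical-shift grid.
rng = np.random.default_rng(0)
n_points, n_metabolites = 2000, 5
references = np.abs(rng.normal(size=(n_metabolites, n_points)))
true_conc = np.array([1.2, 0.0, 0.5, 3.1, 0.0])
sample = true_conc @ references + 0.01 * rng.normal(size=n_points)

# Multivariate linear regression with a non-negativity constraint:
# find concentrations c >= 0 minimising ||references.T @ c - sample||.
conc, residual = nnls(references.T, sample)
print(np.round(conc, 2))   # estimated relative concentrations
```

The estimated concentration vector per sample is then the input to the clustering, network, and biomarker-selection analyses mentioned in the abstract.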
14206 Credit Risk Evaluation Using Genetic Programming
Authors: Ines Gasmi, Salima Smiti, Makram Soui, Khaled Ghedira
Abstract:
Credit risk is considered one of the important issues for financial institutions. It provokes great losses for banks. To this end, numerous methods for credit risk evaluation have been proposed. Many evaluation methods are black-box models that cannot adequately reveal the information hidden in the data. However, several works have focused on building transparent rule-based models. For credit risk assessment, the generated rules must be not only highly accurate but also highly interpretable. In this paper, we aim to build an accurate and transparent credit risk evaluation model which proposes a set of classification rules. In fact, we consider credit risk evaluation as an optimization problem which uses a genetic programming (GP) algorithm, where the goal is to maximize the accuracy of the generated rules. We evaluate our proposed approach on the German and Australian credit datasets. We compared our findings with some existing works; the results show that the proposed GP outperforms the other models.
Keywords: credit risk assessment, rule generation, genetic programming, feature selection
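A deliberately simplified, hedged sketch of the idea of evolving classification rules with accuracy as the fitness function. For brevity it evolves flat threshold rules with mutation only, rather than full tree-based genetic programming with crossover, and the feature names and data are purely hypothetical.

```python
import random

# Toy applicant features, assumed to be normalised to [0, 1].
FEATURES = ["age", "income", "debt_ratio", "n_defaults"]

def random_rule():
    """A rule is a conjunction of simple (feature, op, threshold) conditions."""
    return [(random.choice(FEATURES), random.choice(["<", ">"]),
             round(random.random(), 2)) for _ in range(random.randint(1, 3))]

def predict(rule, x):
    return int(all((x[f] < t) if op == "<" else (x[f] > t) for f, op, t in rule))

def fitness(rule, data):
    return sum(predict(rule, x) == y for x, y in data) / len(data)  # accuracy

def mutate(rule):
    rule = list(rule)
    i = random.randrange(len(rule))
    f, op, _ = rule[i]
    rule[i] = (f, op, round(random.random(), 2))   # perturb one threshold
    return rule

def evolve(data, pop_size=50, generations=30):
    pop = [random_rule() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda r: fitness(r, data), reverse=True)
        survivors = pop[: pop_size // 2]
        pop = survivors + [mutate(random.choice(survivors)) for _ in survivors]
    return max(pop, key=lambda r: fitness(r, data))

# Hypothetical labelled records (1 = good credit, 0 = bad credit).
data = [({f: random.random() for f in FEATURES}, random.randint(0, 1))
        for _ in range(200)]
best = evolve(data)
print(best, fitness(best, data))
```

The evolved rule is directly readable as a set of conditions on applicant attributes, which is the interpretability advantage the abstract emphasises over black-box scorers.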
14205 Biosensors as Analytical Tools in Legume Processing
Authors: S. V. Ncube, A. I. O. Jideani, E. T. Gwata
Abstract:
The plight of food insecurity in developing countries has led to renewed interest in underutilized legumes. Their nutritional versatility, desirable functionality, pharmaceutical value and inherent bioactive compounds have drawn the attention of researchers. This has provoked the development of value-added products with the aim of commercially exploiting their full potential. However, processing of these legumes leads to changes in nutritional composition, as affected by processing variables like pH, temperature and pressure. There is therefore a need for process control and quality assurance during the production of the value-added products. However, conventional methods for microbiological and biochemical identification are labour-intensive and time-consuming. Biosensors offer rapid and affordable methods to assure the quality of the products. They may be used to quantify nutrients and anti-nutrients in the products while manipulating and monitoring variables such as pH, temperature, pressure and oxygen that affect the quality of the final product. This review gives an overview of the types of biosensors used in the food industry, their advantages and disadvantages, and their possible application in legume processing.
Keywords: legume processing, biosensors, quality control, nutritional versatility
14204 Investigating the Contribution of Road Construction on Soil Erosion, a Case Study of Engcobo Local Municipality, Chris Hani District, South Africa
Authors: Yamkela Zitwana
Abstract:
Soil erosion along roads and/or road riparian areas has become the norm in the Eastern Cape. Soil erosion refers to the detachment and transportation of soil from one area (onsite) to another (offsite). This displacement or removal of soil can be caused by water, wind and sometimes gravity. This study focuses on accelerated soil erosion, which is the result of human interference with the environment. Engcobo Local Municipality falls within the Eastern Cape Province, on the eastern side of the Chris Hani District Municipality. The focus road is the R61, running from the outskirts of Engcobo town along the Nyanga SSS on the way to Umtata, although the study will cover only a few kilometers away from Engcobo. This research aims at examining the contribution made by road construction to soil erosion. Steps to achieve this will involve revisiting the phases of road construction through unstructured interviews, identifying the types of soil erosion evident in the area by means of a checklist, checking the materials, utensils and equipment used for road construction, and assessing the contribution of road construction through stratified random sampling of soil color and texture. This research will use a pragmatic approach, which combines related methods and considers the flaws of each method so as to ensure validity, precision and accuracy. Both qualitative and quantitative methods will be used. Statistical methods and GIS analysis will be used to analyze the collected data.
Keywords: soil erosion, road riparian, accelerated soil erosion, road construction, sampling, universal soil loss model, GIS analysis, focus groups, qualitative, quantitative method, research, checklist questionnaires, unstructured interviews, pragmatic approach
14203 Voyage Analysis of a Marine Gas Turbine Engine Installed to Power and Propel an Ocean-Going Cruise Ship
Authors: Mathias U. Bonet, Pericles Pilidis, Georgios Doulgeris
Abstract:
A gas turbine-powered cruise liner is scheduled to transport pilgrim passengers from Lagos, Nigeria, to the Islamic port city of Jeddah in Saudi Arabia. Since the gas turbine is an air-breathing machine, changes in the density and/or mass flow at the compressor inlet due to variations in weather conditions induce negative effects on the performance of the power plant during the voyage. In practice, all deviations from the reference atmospheric conditions of 15 °C and 1.013 bar tend to affect the power output and other thermodynamic parameters of the gas turbine cycle. Therefore, this paper seeks to evaluate how a simple-cycle marine gas turbine power plant would react under a variety of scenarios that may be encountered during a voyage as the ship sails across the Atlantic Ocean and the Mediterranean Sea before arriving at its designated port of discharge. It is also an assessment that focuses on the effect of varying aerodynamic and hydrodynamic conditions, which degrade the efficient operation of the propulsion system through an increase in resistance resulting from projected levels of ship hull fouling. The investigated passenger ship is designed to run at a service speed of 22 knots and cover a distance of 5787 nautical miles. The performance evaluation consists of three separate voyages that cover a variety of weather conditions in the winter, spring and summer seasons. Real-time daily temperatures and the sea states for the selected transit route were obtained and used to simulate the voyage under the aforementioned operating conditions. Changes in engine firing temperature and power output, as well as the total fuel consumed per voyage and other performance variables, were separately predicted under both calm and adverse weather conditions. The collated data were obtained online from the UK Meteorological Office and the UK Hydrographic Office websites, while adopting the Beaufort scale for determining the magnitude of sea waves resulting from rough weather situations. The simulation of the gas turbine performance and the voyage analysis were carried out through the use of the integrated Cranfield-University-developed computer codes known as 'Turbomatch' and 'Poseidon'. It is a project that is aimed at developing a method for predicting the off-design behavior of the marine gas turbine when installed and operated as the main prime mover for both propulsion and the powering of all other auxiliary services onboard a passenger cruise liner. Furthermore, it is a techno-economic and environmental assessment that seeks to enable the forecast of the marine gas turbine part- and full-load performance as it relates to the fuel requirement for a complete voyage.
Keywords: cruise ship, gas turbine, hull fouling, performance, propulsion, weather
14202 Speech Intelligibility Improvement Using Variable Level Decomposition DWT
Authors: Samba Raju, Chiluveru, Manoj Tripathy
Abstract:
Intelligibility is an essential characteristic of a speech signal; it determines how well the information in the speech signal can be understood. Background noise in the environment can deteriorate the intelligibility of recorded speech. In this paper, we present a simple variance-subtracted, variable-level discrete wavelet transform, which improves the intelligibility of speech. The proposed algorithm does not require an explicit estimation of the noise, i.e., prior knowledge of the noise; hence, it is easy to implement, and it reduces the computational burden. The proposed algorithm decides a separate decomposition level for each frame based on signal-dominant and noise-dominant criteria. The performance of the proposed algorithm is evaluated with the speech intelligibility measure STOI, and the results obtained are compared with universal Discrete Wavelet Transform (DWT) thresholding and Minimum Mean Square Error (MMSE) methods. The experimental results revealed that the proposed scheme outperformed the competing methods.
Keywords: discrete wavelet transform, speech intelligibility, STOI, standard deviation
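A hedged sketch of frame-wise wavelet denoising with a per-frame decomposition level. The level-selection rule below (deeper decomposition for low-variance, noise-dominant frames) and the universal soft threshold are illustrative stand-ins for the paper's signal-dominant/noise-dominant criterion, not its actual rule.

```python
import numpy as np
import pywt

def denoise_frame(frame, max_level=5, wavelet="db4"):
    """Pick a decomposition level per frame from a simple variance criterion,
    then soft-threshold the detail coefficients (illustrative rule only)."""
    level = int(np.clip(max_level - np.log10(np.var(frame) + 1e-12), 1, max_level))
    coeffs = pywt.wavedec(frame, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # noise estimate
    thr = sigma * np.sqrt(2 * np.log(len(frame)))
    coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(frame)]

# Hypothetical noisy speech-like signal processed frame by frame (512 samples).
rng = np.random.default_rng(0)
n = 512 * 32
t = np.arange(n) / 16000.0
noisy = np.sin(2 * np.pi * 200 * t) + rng.normal(scale=0.1, size=n)
frames = noisy.reshape(-1, 512)
enhanced = np.concatenate([denoise_frame(f) for f in frames])
```

The enhanced signal would then be scored against the clean reference with STOI to obtain the intelligibility figures reported in the comparison.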
14201 Parenting Styles and Their Relation to Videogame Addiction
Authors: Petr Květon, Martin Jelínek
Abstract:
We try to identify the role of various aspects of parenting style in the phenomenon of videogame playing addiction. Relevant self-report questionnaires were part of a wider set of methods focused on the constructs related to videogame playing. The battery of methods was administered in school settings in paper-and-pencil form. The research sample consisted of 333 elementary and high school students (166 males, 167 females) aged between 10 and 19 years (m=14.98, sd=1.77). Using stepwise regression analysis, we assessed the influence of demographic variables (gender and age) and parenting styles. Age and gender together explained 26.3% of the game addiction variance (F(2,330)=58.81, p<.01). By adding four aspects of parenting style (inconsistency, involvement, control, and warmth), another 10.2% of the variance was explained (∆F(4,326)=13.09, p<.01). The significant predictors were the gender of the respondent, with males scoring higher on the game addiction scale (B=0.70, p<.01); age (β=-0.18, p<.01), with younger children showing a higher level of addiction; and parental inconsistency (β=0.30, p<.01), where the higher the inconsistency in upbringing, the more developed the game playing addiction.
Keywords: gender, parenting styles, video games, addiction
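The hierarchical (stepwise-block) regression reported here, demographics in step one and parenting-style aspects in step two, can be reproduced in outline as follows; the simulated data and coefficients are purely illustrative assumptions standing in for the survey variables.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical data standing in for the survey variables.
rng = np.random.default_rng(0)
n = 333
gender = rng.integers(0, 2, n)                       # 0 = female, 1 = male
age = rng.uniform(10, 19, n)
inconsistency, involvement, control, warmth = rng.normal(size=(4, n))
addiction = 0.7 * gender - 0.15 * age + 0.3 * inconsistency + rng.normal(size=n)

# Step 1: demographics only.
step1 = sm.OLS(addiction, sm.add_constant(np.column_stack([gender, age]))).fit()
# Step 2: add the four parenting-style aspects.
X2 = sm.add_constant(np.column_stack([gender, age, inconsistency,
                                      involvement, control, warmth]))
step2 = sm.OLS(addiction, X2).fit()

delta_r2 = step2.rsquared - step1.rsquared
# F-change test for the block of four added predictors.
df1, df2 = 4, int(step2.df_resid)
f_change = (delta_r2 / df1) / ((1 - step2.rsquared) / df2)
print(round(step1.rsquared, 3), round(delta_r2, 3), round(f_change, 2))
```

The ΔR² and F-change values computed this way correspond to the "another 10.2% of variance explained (∆F(4,326)=13.09)" figures quoted in the abstract.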
14200 Application of the State of the Art of Hydraulic Models to Manage Coastal Problems, Case Study: The Egyptian Mediterranean Coast Model
Authors: Al. I. Diwedar, Moheb Iskander, Mohamed Yossef, Ahmed ElKut, Noha Fouad, Radwa Fathy, Mustafa M. Almaghraby, Amira Samir, Ahmed Romya, Nourhan Hassan, Asmaa Abo Zed, Bas Reijmerink, Julien Groenenboom
Abstract:
Coastal problems are stressing the coastal environment due to its complexity. The dynamic interaction between the sea and the land, in addition to human interventions and activities, results in serious problems that threaten coastal areas worldwide. This makes the coastal environment highly vulnerable to natural processes like flooding and erosion and to the impact of human activities such as pollution. Protecting and preserving this vulnerable coastal zone with its valuable ecosystems calls for addressing the coastal problems. This, in the end, will support the sustainability of the coastal communities and sustain current and future generations. Consequently, applying suitable management strategies and sustainable development that consider the unique characteristics of the coastal system is a must. The coastal management philosophy aims to solve the conflicts of interest between human development activities and this dynamic nature. Modeling emerges as a successful tool that provides support to decision-makers, engineers, and researchers for better management practices. Modeling tools have proved to be accurate and reliable in prediction. With their capability to integrate data from various sources, such as bathymetric surveys, satellite images, and meteorological data, they offer engineers and scientists the possibility to understand this complex dynamic system and to examine in depth the interaction between natural and human-induced factors. This enables decision-makers to make informed choices and develop effective strategies for sustainable development and risk mitigation of the coastal zone. The application of modeling tools supports the evaluation of various scenarios by affording the possibility to simulate and forecast different coastal processes, from the hydrodynamic and wave actions to the resulting flooding and erosion. The state-of-the-art application of modeling tools in coastal management allows for better understanding and prediction of coastal processes, optimizing infrastructure planning and design, supporting ecosystem-based approaches, assessing climate change impacts, managing hazards, and finally facilitating stakeholder engagement. This paper emphasizes the role of hydraulic models in enhancing the management of coastal problems by discussing the diverse applications of modeling in coastal management. It highlights the role of modeling in understanding complex coastal processes and predicting outcomes, and the importance of informing decision-makers with modeling results, which gives technical and scientific support for achieving sustainable coastal development and protection.
Keywords: coastal problems, coastal management, hydraulic model, numerical model, physical model
14199 Nickel-Titanium Endodontic Instruments: The Evolution
Authors: Fadwa Chtioui
Abstract:
The field of endodontics has witnessed constant advancements in treatment methods and instrument design, particularly for nickel-titanium (NiTi) files. Despite these developments, it remains crucial for clinicians to have a thorough understanding of their characteristics and behavior to choose the appropriate instruments for different clinical and anatomical situations. Research Aim: The aim of this work is to study and discuss the impact of heat treatment developments on the properties of endodontic NiTi files, with the ultimate goal of providing ways to adapt these files to the anatomical features of dental roots. Methodology: This study involves both clinical cases and extensive bibliographic research. Findings: The study highlights the importance of heat treatment in the design and manufacture of NiTi files, as it significantly affects their physical and mechanical properties. It also provides insights into the ways in which NiTi files can be adapted to the complex geometries of dental roots for more effective endodontic treatments. Theoretical Importance: Theoretical implications of this study include a better understanding of the relationship between heat treatment and the properties of NiTi files, leading to improvements in both their manufacturing methods and clinical applications. Data Collection and Analysis Procedures: The data for this study was collected through clinical cases and an extensive review of relevant literature. Analysis was performed through qualitative and quantitative methods, examining the impact of heat treatment on the physical and mechanical properties of NiTi files. Questions Addressed: This study aims to answer questions concerning the properties of NiTi files and the impact of heat treatment on their behavior. It also seeks to examine ways in which these files can be adapted to complex dental root geometries for more effective endodontic treatments. Conclusion: In conclusion, this study emphasizes the importance of heat treatment in the design and manufacture of NiTi files, as it significantly impacts their physical and mechanical properties. Further research is necessary to explore additional methods for adapting NiTi files to the unique anatomies of dental roots to improve endodontic treatments further. Ultimately, this study provides valuable insights into the continued evolution of endodontic treatment and instrument design.
Keywords: endodontic files, nickel-titanium, tooth anatomy, heat treatment