Search results for: Automatic keyphrase extraction.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1320

90 A Text Mining Technique Using Association Rules Extraction

Authors: Hany Mahgoub, Dietmar Rösner, Nabil Ismail, Fawzy Torkey

Abstract:

This paper describes a text mining technique for automatically extracting association rules from collections of textual documents. The technique, called Extracting Association Rules from Text (EART), relies on keyword features to discover association rules among the keywords labeling the documents. The EART system ignores the order in which words occur, focusing instead on the words and their statistical distributions in documents. The main contributions of the technique are that it integrates XML technology with an information retrieval scheme (TF-IDF) for keyword/feature selection, which automatically selects the most discriminative keywords for use in association rule generation, and that it uses a data mining technique for association rule discovery. It consists of three phases: a text preprocessing phase (transformation, filtration, stemming and indexing of the documents), an association rule mining (ARM) phase (applying our algorithm for Generating Association Rules based on a Weighting scheme, GARW) and a visualization phase (visualization of results). Experiments were carried out on web news documents related to the outbreak of the bird flu disease. The extracted association rules contain important features and describe the informative news included in the document collection. The performance of the EART system was compared with that of a system using the Apriori algorithm, in terms of execution time and the quality of the extracted association rules.
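
As a minimal illustrative sketch (not the authors' EART code), the TF-IDF keyword-selection step that feeds association rule generation might look as follows; the documents, the threshold value, and all variable names are hypothetical:

```python
# Sketch of TF-IDF keyword selection preceding association rule mining.
# The documents and the 0.3 threshold are illustrative assumptions.
from sklearn.feature_extraction.text import TfidfVectorizer

docs = [
    "bird flu outbreak reported in poultry farms",
    "vaccine research accelerates after new bird flu cases",
    "poultry trade suspended amid outbreak fears",
]

# TF-IDF scores each word by how discriminative it is across the collection.
vectorizer = TfidfVectorizer(stop_words="english")
tfidf = vectorizer.fit_transform(docs)
terms = vectorizer.get_feature_names_out()

# Keep, per document, the keywords whose TF-IDF weight exceeds the threshold;
# these keyword sets are the "transactions" later mined for association rules.
threshold = 0.3
transactions = [
    {terms[j] for j in row.nonzero()[1] if row[0, j] >= threshold}
    for row in tfidf
]
print(transactions)
```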

Keywords: Text mining, data mining, association rule mining

89 An Analysis of the Representation of the Translator and Translation Process in Brazilian Social Networking Groups

Authors: Érica Lima

Abstract:

In the digital era, in which we face an avalanche of information, it is not new that the Internet has brought new modes of communication and knowledge access. Characterized by a multiplicity of discourses, opinions, beliefs and cultures, the web is a space of political-ideological dimensions where people (who often do not know each other) interact, create representations, deconstruct stereotypes, and redefine identities. Today the translator needs to be able to deal with digital spaces ranging from specific software to social media, which inevitably impact his or her professional life. One of the most impactful ways of being seen in cyberspace is participation in social networking groups. In addition to their ability to disseminate information among participants, social networking groups allow significant personal and social exposure, owing to the visibility each participant achieves not only on a personal profile page, but also in each comment or post the person makes in the groups. The objective of this paper is to study the representations of translators and the translation process on the Internet, more specifically in publications in two highly influential Brazilian Facebook groups: "Translators/Interpreters" and "Translators, Interpreters and Curious". These groups represent the changes the network has brought to the profession, including the way translators are seen and see themselves. The analyzed posts allowed a reading of what common sense seems to think about the translator, as opposed to what translators seem to think about themselves as a professional class. The analysis leads to the conclusion that these two positions are antagonistic and sometimes represent a conflict of interests: on the one hand, society in general considers the translator's work easy and therefore not deserving of good remuneration; on the other, translators know how complex the translation process is and how much it takes to become a good professional. The results also reveal that social networking sites such as Facebook provide more visibility, but that a more active role is required of the translator to achieve greater appreciation of the profession and more recognition of the translator's role, especially in the face of the increasing development of automatic translation programs.

Keywords: Facebook, social representation, translation, translator.

88 Dynamic Features Selection for Heart Disease Classification

Authors: Walid MOUDANI

Abstract:

The healthcare environment is generally perceived as being information rich yet knowledge poor, and there is a lack of effective analysis tools to discover hidden relationships and trends in the data. Valuable knowledge can, however, be discovered by applying data mining techniques to healthcare systems. In this study, a proficient methodology is presented for extracting significant patterns from coronary heart disease data warehouses for heart attack prediction; heart disease unfortunately continues to be a leading cause of mortality worldwide. For this purpose, we propose to enumerate dynamically the optimal subsets of the reduced features of high interest by using a rough sets technique combined with dynamic programming, and to validate the classification using a Random Forest (RF) decision tree to identify the risky heart disease cases. This work is based on a large amount of data collected from several clinical institutions, based on the medical profiles of patients. Moreover, experts' knowledge in this field has been taken into consideration in order to define the disease and its risk factors, and to establish significant knowledge relationships among the medical factors. A computer-aided system is developed for this purpose based on a population of 525 adults. The performance of the proposed model is analyzed and evaluated against a set of benchmark techniques applied to this classification problem.
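
A minimal sketch of the validation step described above: train a Random Forest on a reduced feature subset and compare it with the full feature set. The data, the feature count, and the particular reduct are hypothetical stand-ins, not the authors' clinical data or rough-set procedure:

```python
# Compare a Random Forest on all features vs. a rough-set-style reduct.
# The synthetic data (525 patients, 12 factors) is an illustrative assumption.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(525, 12))                       # 525 adults, 12 medical factors
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(size=525) > 0).astype(int)

reduced = [0, 3, 5]                                  # e.g., a hypothetical reduct
full = cross_val_score(RandomForestClassifier(random_state=0), X, y, cv=5).mean()
red = cross_val_score(RandomForestClassifier(random_state=0), X[:, reduced], y, cv=5).mean()
print(f"all features: {full:.3f}  reduct: {red:.3f}")
```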

Keywords: Multi-Classifier Decision Trees, Feature Reduction, Dynamic Programming, Rough Sets.

87 Utilization of Whey for the Production of β-Galactosidase Using Yeast and Fungal Culture

Authors: Rupinder Kaur, Parmjit S. Panesar, Ram S. Singh

Abstract:

Whey is the lactose-rich by-product of the dairy industry and a good reservoir of nutrients, the most abundant being lactose, soluble proteins, lipids and mineral salts. Its disposal is a major issue for the many milk plants that lack proper pre-treatment systems, and can entail a significant loss of a potential food and energy source. Whey has therefore been explored as a substrate for the synthesis of different value-added products such as enzymes. β-galactosidase is one of the important enzymes and has become a major focus of research due to its ability to catalyze both hydrolytic and transgalactosylation reactions simultaneously. The enzyme is widely used in the dairy industry, as it catalyzes the transformation of lactose to glucose and galactose, making products suitable for lactose-intolerant people. The enzyme is intracellular in both bacteria and yeast, whereas in molds it has an extracellular location. The present work was carried out to utilize whey for the production of β-galactosidase using both yeast and fungal cultures. The yeast isolate Kluyveromyces marxianus WIG2 and various fungal strains were used in the present study. Different disruption techniques were also investigated for the extraction of the enzyme produced intracellularly in yeast cells. Among the methods tested for the disruption of yeast cells, SDS-chloroform gave the maximum β-galactosidase activity. Among the tested fungal cultures, Aureobasidium pullulans NCIM 1050 was the maximum extracellular enzyme producer.

Keywords: β-galactosidase, fungus, yeast, whey.

86 Application of KL Divergence for Estimation of Each Metabolic Pathway Genes

Authors: Shohei Maruyama, Yasuo Matsuyama, Sachiyo Aburatani

Abstract:

Development of methods to estimate gene function is an important task in bioinformatics. One approach to annotation is identifying the metabolic pathway in which a gene is involved. Since gene expression data reflect various intracellular phenomena, they are considered to be related to gene function. However, it has been difficult to estimate gene function with high accuracy, in part because gene expression is difficult to measure precisely: even under the same condition, measured expression levels usually vary. In this study, we propose a feature extraction method that focuses on the variability of gene expression in order to estimate a gene's metabolic pathway accurately. First, we estimated the distribution of each gene's expression from replicate data. Next, we calculated the similarity between all gene pairs by KL divergence, a measure of the dissimilarity between distributions. Finally, we used the similarity vectors as feature vectors and trained a multiclass SVM to identify each gene's metabolic pathway. To evaluate the method, we applied it to budding yeast and trained a multiclass SVM to identify seven metabolic pathways. The resulting accuracy was higher than that obtained from the raw gene expression data. Thus, the developed method, combined with KL divergence, is useful for identifying a gene's metabolic pathway.
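
A minimal sketch of the pairwise-divergence step, under the assumption (ours, not stated by the authors) that each gene's replicate expressions are modelled as a univariate Gaussian; the replicate values and gene names are hypothetical:

```python
# Score gene pairs by symmetrised KL divergence between fitted Gaussians;
# the resulting similarity vectors serve as SVM feature vectors.
import numpy as np

def kl_gauss(m1, s1, m2, s2):
    # KL( N(m1, s1^2) || N(m2, s2^2) ), closed form for univariate Gaussians
    return np.log(s2 / s1) + (s1**2 + (m1 - m2)**2) / (2 * s2**2) - 0.5

def sym_kl(a, b):
    m1, s1 = a.mean(), a.std(ddof=1)
    m2, s2 = b.mean(), b.std(ddof=1)
    return kl_gauss(m1, s1, m2, s2) + kl_gauss(m2, s2, m1, s1)

replicates = {                      # hypothetical replicate measurements
    "geneA": np.array([1.0, 1.2, 0.9]),
    "geneB": np.array([1.1, 1.3, 1.0]),
    "geneC": np.array([3.0, 2.7, 3.2]),
}
genes = list(replicates)
# Feature vector for each gene: its divergence to every other gene.
features = {g: [sym_kl(replicates[g], replicates[h]) for h in genes] for g in genes}
print(features["geneA"])
```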

Keywords: Metabolic pathways, gene expression data, microarray, Kullback–Leibler divergence, KL divergence, support vector machines, SVM, machine learning.

85 A Simple and Empirical Refraction Correction Method for UAV-Based Shallow-Water Photogrammetry

Authors: I GD Yudha Partama, A. Kanno, Y. Akamatsu, R. Inui, M. Goto, M. Sekine

Abstract:

Aerial photogrammetry of shallow-water bottoms has the potential to be an efficient high-resolution survey technique for shallow-water topography, thanks to the advent of convenient UAVs and automatic image processing techniques (Structure-from-Motion (SfM) and Multi-View Stereo (MVS)). However, it suffers from systematic overestimation of the bottom elevation due to light refraction at the air-water interface. In this study, we present an empirical method, using common software, to correct for the effect of refraction after the usual SfM-MVS processing. The presented method uses the empirical relation between the measured true depth and the estimated apparent depth to derive an empirical correction factor, which is then used to convert apparent water depths into refraction-corrected (real-scale) water depths. To examine its effectiveness, we applied the method to two river sites and compared the RMS errors in the corrected bottom elevations with those obtained by three existing methods. The results show that the presented method is more effective than two of the existing methods: the method that applies no correction factor and the method that uses the refractive index of water (1.34) as the correction factor. In comparison with the remaining existing method, which adds an offset term after calculating the correction factor, the presented method performs better at Site 2 and worse at Site 1. However, we found that linear regression method to be unstable when the training data used for calibration are limited, and, according to our numerical experiment, it suffers from a large negative bias in the correction factor when the estimated apparent water depth is affected by noise. Overall, the accuracy of a refraction correction method depends on various factors such as location, image acquisition, and GPS measurement conditions, and the most effective method can be selected by statistical model selection (e.g., leave-one-out cross-validation).
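
A minimal sketch, with made-up depth values, of how such an empirical correction factor could be fitted as a regression through the origin of true depth on apparent depth (one plausible reading of the method, not the authors' exact procedure):

```python
# Fit a single empirical correction factor from calibration depths, then
# rescale apparent SfM-MVS depths. All numbers are illustrative assumptions.
import numpy as np

apparent = np.array([0.30, 0.45, 0.60, 0.80])    # apparent depths from SfM-MVS (m)
true = np.array([0.41, 0.60, 0.82, 1.07])        # field-measured true depths (m)

# Least-squares slope through the origin: factor = sum(a*t) / sum(a*a)
factor = (apparent @ true) / (apparent @ apparent)
corrected = factor * apparent
print(f"correction factor: {factor:.3f}")        # ~1.34 would match pure refraction
```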

Keywords: Bottom elevation, multi-view stereo, river, structure-from-motion.

84 Removal of Volatile Organic Compounds from Contaminated Surfactant Solution Using Co-Current Vacuum Stripping

Authors: Pornchai Suriya-Amrit, Suratsawadee Kungsanant, Boonyarach Kitiyanan

Abstract:

There has been growing interest in utilizing surfactants in remediation processes to separate hydrophobic volatile organic compounds (HVOCs) from aqueous solution. One attractive process is cloud point extraction (CPE), which utilizes nonionic surfactants as the separating agent. Since surfactant cost is a key determinant of the economic viability of the process, it is important that the surfactants are recycled and reused. This work studies the performance of co-current vacuum stripping using a packed column for HVOC removal from contaminated surfactant solution. Six HVOCs were selected as contaminants. The studied surfactant was a branched secondary alcohol ethoxylate (AE), Tergitol TMN-6 (C14H30O2). The volatility and solubility of the HVOCs in the surfactant system are expressed as an apparent Henry's law constant and a solubilization constant, respectively, and the HVOC removal efficiency of the vacuum stripping column is assessed in terms of the percentage of HVOCs removed and the overall liquid-phase volumetric mass transfer coefficient. The apparent Henry's law constants of benzene, toluene, and ethylbenzene were 7.00×10⁻⁵, 5.38×10⁻⁵ and 3.35×10⁻⁵, respectively, and their solubilization constants were 1.71, 2.68 and 7.54, respectively. HVOC removal was around 90% for all solutes.

Keywords: Apparent Henry’s law constant, Branched secondary alcohol ethoxylates, Vacuum Stripping.

83 Automatic Distance Compensation for Robust Voice-based Human-Computer Interaction

Authors: Randy Gomez, Keisuke Nakamura, Kazuhiro Nakadai

Abstract:

Distant-talking voice-based HCI systems suffer from performance degradation due to the mismatch between the acoustic speech at runtime and the acoustic model obtained in training. The mismatch is caused by changes in the power of the speech signal as observed at the microphones, which are greatly influenced by changes in distance, affecting the speech dynamics inside the room before the signal reaches the microphones. Moreover, as the speech signal is reflected, its acoustical characteristics are also altered by the room properties. In general, power mismatch due to distance is a complex problem. This paper presents a novel approach to dealing with distance-induced mismatch by intelligently sensing instantaneous voice power variation and compensating the model parameters. First, the distant-talking speech signal is processed through microphone array processing, and the corresponding distance information is extracted. Distance-sensitive Gaussian Mixture Models (GMMs), pre-trained to capture both speech power and room properties, are used to predict the optimal distance of the speech source. Pre-computed statistical priors corresponding to that distance are then selected to correct the statistics of the generic model, which was frozen during training. The model combinatorics are thus post-conditioned to match the power of the instantaneous speech acoustics at runtime, which results in an improved likelihood of predicting the correct speech command at farther distances. We experimented using real data recorded inside two rooms. Experimental evaluation shows that voice recognition using our method is more robust to changes in distance than the conventional approach. In our experiments, under the most acoustically challenging environment (Room 2 at 2.5 meters), our method achieved a 24.2% improvement in recognition performance over the best-performing conventional method.
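
A minimal sketch of one way the distance-selection step could work (our assumption about the mechanics, not the authors' system): fit one GMM per training distance, then at runtime pick the distance whose GMM assigns the highest likelihood to the observed features. All data and parameters are synthetic:

```python
# Select the pre-trained distance-sensitive GMM that best explains runtime
# features; that distance's pre-computed priors would then be applied.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
distances = [0.5, 1.5, 2.5]                      # metres, training positions
gmms = {}
for k, d in enumerate(distances):
    # Hypothetical training features whose power falls with distance
    feats = rng.normal(loc=-k, scale=1.0, size=(500, 4))
    gmms[d] = GaussianMixture(n_components=4, random_state=0).fit(feats)

runtime = rng.normal(loc=-2, scale=1.0, size=(50, 4))   # utterance features
best = max(distances, key=lambda d: gmms[d].score(runtime))
print(f"estimated speaker distance: {best} m")
```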

Keywords: Human Machine Interaction, Human Computer Interaction, Voice Recognition, Acoustic Model Compensation, Acoustic Speech Enhancement.

82 A Neurofuzzy Learning and its Application to Control System

Authors: Seema Chopra, R. Mitra, Vijay Kumar

Abstract:

A neurofuzzy approach for a given set of input-output training data is proposed in two phases. First, the data set is partitioned automatically into a set of clusters, and a fuzzy if-then rule is extracted from each cluster to form a fuzzy rule base. Second, a fuzzy neural network is constructed accordingly and its parameters are tuned to increase the precision of the fuzzy rule base. The network is able to learn and optimize the rule base of a Sugeno-like fuzzy inference system using a hybrid learning algorithm that combines gradient descent and the least mean squares algorithm. The proposed neurofuzzy system has the advantages of determining the number of rules automatically, reducing the number of rules, decreasing computational time, learning faster and consuming less memory. The authors also investigate how neurofuzzy techniques can be applied in control theory to design a fuzzy controller for linear and nonlinear dynamic systems modelled from a set of input/output data. A simulation analysis is carried out on a wide range of processes, including the online identification of nonlinear components in a control system and a benchmark problem involving the prediction of a chaotic time series. Furthermore, well-known examples of linear and nonlinear systems are simulated in the Matlab/Simulink environment. The combination is also illustrated by modeling the relationship between automobile trips and demographic factors.
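
One standard way to realise the automatic partitioning phase is subtractive clustering (named in the keywords below); each found centre seeds one fuzzy rule. The following is a minimal sketch with assumed radii, stopping threshold, and random data, not the authors' implementation:

```python
# Subtractive clustering: every point's "potential" is the summed Gaussian
# influence of all points; repeatedly pick the highest-potential point as a
# cluster centre and subtract its influence. Parameters are assumptions.
import numpy as np

def subtractive_clustering(X, ra=0.5, rb=0.75, eps=0.15):
    alpha, beta = 4 / ra**2, 4 / rb**2
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # pairwise sq. dists
    P = np.exp(-alpha * d2).sum(1)                        # initial potentials
    centers, p_first = [], P.max()
    while P.max() > eps * p_first:
        i = P.argmax()
        centers.append(X[i])
        # Subtract the chosen centre's influence from every potential
        P -= P[i] * np.exp(-beta * ((X - X[i]) ** 2).sum(1))
    return np.array(centers)

X = np.random.default_rng(2).random((200, 2))
print(subtractive_clustering(X).shape)   # each centre seeds one fuzzy rule
```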

Keywords: Fuzzy control, neuro-fuzzy techniques, fuzzy subtractive clustering, extraction of rules, and optimization of membership functions.

81 Large Scale Production of Polyhydroxyalkanoates (PHAs) from Wastewater: A Study of Techno-Economics, Energy Use and Greenhouse Gas Emissions

Authors: Cora Fernandez Dacosta, John A. Posada, Andrea Ramirez

Abstract:

The biodegradable polymer family of polyhydroxyalkanoates is an interesting substitute for conventional fossil-based plastics. However, the manufacturing and environmental impacts associated with their production via intracellular bacterial fermentation depend strongly on the raw material used and on the energy consumed during the extraction process, limiting their potential for commercialization. Industrial wastewater is studied in this paper as a promising alternative feedstock for waste valorization. Based on results from laboratory and pilot-scale experiments, a conceptual process design, techno-economic analysis and life cycle assessment are developed for the large-scale production of the most common polyhydroxyalkanoate, polyhydroxybutyrate. Intracellular polyhydroxybutyrate is obtained via fermentation of the microbial community present in industrial wastewater, and the downstream processing is based on chemical digestion with surfactant and hypochlorite. The economic-potential and environmental-performance results help identify bottlenecks and the best opportunities to scale up the process prior to industrial implementation. The outcome of this research indicates that fermenting wastewater to PHB has advantages over traditional PHA production from sugars because of the null environmental burdens and financial costs of the raw material in the bioplastic production process. Nevertheless, process optimization is still required to compete with the petrochemical counterparts.

Keywords: Circular economy, life cycle assessment, polyhydroxyalkanoates, waste valorization.

80 Improving Activity Recognition Classification of Repetitious Beginner Swimming Using a 2-Step Peak/Valley Segmentation Method with Smoothing and Resampling for Machine Learning

Authors: Larry Powell, Seth Polsley, Drew Casey, Tracy Hammond

Abstract:

Human activity recognition (HAR) systems have shown positive performance when recognizing repetitive activities like walking, running, and sleeping. Water-based activities are a relatively new area for activity recognition; however, work on water-based activity recognition has largely focused on supporting the elite and competitive swimming population, which already has excellent coordination and proper form. Beginner swimmers are not perfect, and activity recognition needs to support their individual motions to help them. Activity recognition algorithms are traditionally built around short segments of timed sensor data, but using a time-window input can cause performance issues in the machine learning model: the window can be too small or too large, requiring careful tuning and precise data segmentation. In this work, we present a method that uses a time window for the initial segmentation and then separates the data based on changes in the sensor values. Our system uses a multi-phase segmentation method that pulls all peaks and valleys from each axis of an accelerometer placed on the swimmer's lower back. This yields high recognition performance using leave-one-subject-out validation in our study of 20 beginner swimmers: the model optimized on our final dataset achieves an F-score of 0.95.
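
A minimal sketch of the smoothing, peak/valley segmentation, and fixed-length resampling pipeline on one accelerometer axis; the synthetic signal, filter settings, and segment length of 32 are illustrative assumptions, not the authors' parameters:

```python
# Smooth one accelerometer axis, cut it at peaks and valleys, and resample
# each variable-length segment to a fixed length for the classifier.
import numpy as np
from scipy.signal import find_peaks, savgol_filter

t = np.linspace(0, 10, 500)
accel_z = np.sin(2 * np.pi * 0.8 * t) + 0.2 * np.random.default_rng(3).normal(size=t.size)

smoothed = savgol_filter(accel_z, window_length=21, polyorder=3)
peaks, _ = find_peaks(smoothed)            # stroke tops
valleys, _ = find_peaks(-smoothed)         # stroke bottoms

# Split the stream into segments between consecutive extrema, then resample
# each segment to a fixed length so the model sees uniform inputs.
cuts = np.sort(np.concatenate([peaks, valleys]))
segments = [smoothed[a:b] for a, b in zip(cuts[:-1], cuts[1:])]
fixed = [np.interp(np.linspace(0, 1, 32), np.linspace(0, 1, len(s)), s) for s in segments]
print(len(fixed), "segments of length 32")
```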

Keywords: Time window, peak/valley segmentation, feature extraction, beginner swimming, activity recognition.

79 An Overview of Electronic Waste as Aggregate in Concrete

Authors: S. R. Shamili, C. Natarajan, J. Karthikeyan

Abstract:

Rapid growth of the world population and widespread urbanization have remarkably increased the development of the construction industry, creating a huge demand for sand and gravel. Environmental problems occur when the rate of extraction of sand, gravel, and other materials exceeds the rate at which natural resources are regenerated, so an alternative source is essential to replace the materials used in concrete. Nowadays, electronic products have become an integral part of daily life, providing more comfort, security, and ease of information exchange. The resulting electronic waste (E-Waste) raises serious human health concerns and requires extreme care in its disposal to avoid adverse impacts. Disposal or dumping of E-Waste is also a major issue because it is highly complex to handle and often contains highly toxic chemicals such as lead, cadmium, mercury, beryllium, brominated flame retardants (BFRs), polyvinyl chloride (PVC), and phosphorus compounds. Hence, E-Waste can be incorporated into concrete to help create a sustainable environment. This paper deals with the composition, preparation, properties and classification of E-Waste; the processes described avoid dumping to landfills while conserving natural aggregate resources and providing a better environmental option. The paper also provides a detailed literature review on the behaviour of concrete incorporating E-Waste. Much research shows a strong possibility of using E-Waste as a substitute for aggregates, which would ultimately reduce the use of natural aggregates in concrete.

Keywords: Disposal, electronic waste, landfill, toxic chemicals.

78 Optimized Brain Computer Interface System for Unspoken Speech Recognition: Role of Wernicke Area

Authors: Nassib Abdallah, Pierre Chauvet, Abd El Salam Hajjar, Bassam Daya

Abstract:

In this paper, we propose an optimized brain computer interface (BCI) system for unspoken speech recognition, based on the fact that the construction of unspoken words relies strongly on the Wernicke area, situated in the temporal lobe. Our BCI system has four modules: (i) an EEG acquisition module based on a non-invasive headset with 14 electrodes; (ii) a preprocessing module that removes noise and artifacts using the Common Average Reference method; (iii) a feature extraction module using the Wavelet Packet Transform (WPT); and (iv) a classification module based on a one-hidden-layer artificial neural network. The present study compares the recognition accuracy for 5 Arabic words when using all the headset electrodes versus only the 4 electrodes situated near the Wernicke area, as well as the effect of selecting among the subbands produced by the WPT module. After applying the artificial neural network to the produced database, we obtain, on the test dataset, an accuracy of 83.4% with all the electrodes and all the subbands of the 8-level WPT decomposition. However, by using only the 4 electrodes near the Wernicke area and the 6 middle subbands of the WPT, we obtain a large reduction of the dataset size, to approximately 19% of the total dataset, with an accuracy of 67.5%. This reduction is particularly important for the design of a low-cost, simple-to-use BCI trained for several words.
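
A minimal sketch of the WPT feature-extraction step for one EEG channel. The signal is synthetic, and the db4 wavelet, energy features, and the particular "middle subband" slice are our assumptions; the 8-level decomposition matches the abstract:

```python
# Decompose one EEG channel with an 8-level Wavelet Packet Transform and
# keep the energies of a band of middle subbands as features.
import numpy as np
import pywt

eeg = np.random.default_rng(4).normal(size=2048)      # one channel, one trial

wp = pywt.WaveletPacket(data=eeg, wavelet="db4", maxlevel=8)
nodes = wp.get_level(8, order="freq")                 # 256 subbands, low -> high

middle = nodes[len(nodes) // 2 - 3 : len(nodes) // 2 + 3]   # 6 middle subbands
features = np.array([np.sum(n.data**2) for n in middle])    # subband energies
print(features)
```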

Keywords: Brain-computer interface, speech recognition, electroencephalography (EEG), Wernicke area, artificial neural network.

77 Destination Port Detection for Vessels: An Analytic Tool for Optimizing Port Authorities Resources

Authors: Lubna Eljabu, Mohammad Etemad, Stan Matwin

Abstract:

Port authorities face many challenges in congested ports when allocating their resources to provide safe and secure loading/unloading procedures for cargo vessels. Selecting a destination port is the decision of a vessel master, based on many factors such as weather, wavelength and changes of priorities. Having access to a tool that leverages Automatic Identification System (AIS) messages to monitor vessels' movements and accurately predict their next destination port promotes an effective resource allocation process for port authorities. In this research, we propose a method, Reference Route of Trajectory (RRoT), to assist port authorities in predicting inflow and outflow traffic in their local environment by monitoring AIS messages. Our RRoT method creates a reference route based on historical AIS messages and uses trajectory similarity measures to identify a vessel's destination from its recent movement. We evaluated five similarity measures: Discrete Fréchet Distance (DFD), Dynamic Time Warping (DTW), Partial Curve Mapping (PCM), Area between two curves (Area) and Curve Length (CL). Our experiments show that the method identifies the destination port with an accuracy of 98.97% and an F-measure of 99.08% using the Dynamic Time Warping (DTW) similarity measure.
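
A minimal sketch of scoring a live vessel track against stored reference routes with Dynamic Time Warping, the best-performing measure above; the coordinates, port names, and route data are hypothetical:

```python
# Classic O(n*m) DTW over (lat, lon) sequences; predict the destination whose
# reference route the live track matches best (smallest DTW distance).
import numpy as np

def dtw(a, b):
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

track = np.array([[44.60, -63.55], [44.55, -63.50], [44.50, -63.45]])
ref_routes = {
    "Halifax": np.array([[44.61, -63.56], [44.54, -63.49], [44.49, -63.44]]),
    "Saint John": np.array([[45.20, -66.10], [45.10, -66.00], [45.00, -65.90]]),
}
best = min(ref_routes, key=lambda k: dtw(track, ref_routes[k]))
print(best)
```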

Keywords: Spatial temporal data mining, trajectory mining, trajectory similarity, resource optimization.

76 Determination of Physicochemical Properties, Bioaccessibility of Phenolics and Antioxidant Capacity of Mineral Enriched Linden Herbal Tea Beverage

Authors: Senem Suna, Canan Ece Tamer, Ömer Utku Çopur

Abstract:

In this research, dried linden (Tilia argentea) leaves and blossoms were used as the raw material for the production of a mineral-enriched herbal tea beverage. To this end, 1% dried linden was infused with boiling water (100 °C) for 5 minutes. After cooling, sucrose, citric acid, ascorbic acid, natural lemon flavor and natural mineral water were added. Beverage samples were plate-filtered, filled into 200-mL glass bottles, capped, and pasteurized at 98 °C for 15 minutes. Water-soluble dry matter, titratable acidity, ascorbic acid, pH, minerals (Fe, Ca, Mg, K, Na), color (L*, a*, b*), turbidity, bioaccessible phenolics and antioxidant capacity were analyzed. Water-soluble dry matter, titratable acidity, and ascorbic acid were determined as 7.66±0.28 g/100 g, 0.13±0.00 g/100 mL, and 19.42±0.62 mg/100 mL, respectively. The pH was measured as 3.69. The Fe, Ca, Mg, K and Na contents of the beverage were 0.12±0.00, 115.48±0.05, 34.72±0.14, 48.67±0.43 and 85.72±1.01 mg/L, respectively. Color was measured as 13.63±0.05, -4.33±0.05, and 3.06±0.05 for the L*, a*, and b* values. Turbidity was 0.69±0.07 NTU. Bioaccessible phenolics were determined as 312.82±5.91 mg GAE/100 mL. The antioxidant capacities of chemical (MeOH:H2O:HCl) and physiological extracts (in vitro digestive enzymatic extraction) were also evaluated with the DPPH (27.59±0.53 and 0.17±0.02 μmol trolox/mL), FRAP (21.01±0.97 and 13.27±0.19 μmol trolox/mL) and CUPRAC (44.71±9.42 and 2.80±0.64 μmol trolox/mL) methods. As a result, enrichment with natural mineral water is proposed for the development of functional and nutritional value, together with a good potential for commercialization.

Keywords: Antioxidant capacity, bioaccessibility, herbal tea beverage, linden.

75 Development of a Robot Assisted Centrifugal Casting Machine for Manufacturing Multi-Layer Journal Bearing and High-Tech Machine Components

Authors: Mohammad Syed Ali Molla, Mohammed Azim, Mohammad Esharuzzaman

Abstract:

Centrifugal casting machines are used to manufacture special machine components, such as the multi-layer journal bearings used in internal combustion engines, steam and gas turbines and aircraft turbo-engines, where isotropic properties and high precision are desired. Moreover, such machines can be used to manufacture thin-walled, high-tech machine components like the cylinder liners and piston rings of IC engines, and other parts such as sleeves and bushes. Heavy-duty components like railway wheels can also be prepared by centrifugal casting. Considerable technological development of the casting process is required for the production of well-cast machine bodies and parts. Defects like blowholes, surface roughness and chilled surfaces are commonly found in sand-cast machine parts, but these can be avoided with a centrifugal casting machine using a rotating metallic die. Moreover, die rotation, die temperature control, and good pouring practice contribute to casting quality, because the soundness of a casting depends in large part on how the metal enters the mold or die and solidifies. Poor pouring practice leads to a variety of casting defects such as temperature loss, low-quality casting, excessive turbulence, and over-pouring. Besides this, handling molten metal is very unsafe and dangerous for workers. To eliminate these problems, an automatic pouring device is needed. In this research work, a robot-assisted pouring device and a centrifugal casting machine were designed, developed, constructed and tested experimentally, and found to work satisfactorily. The robot-assisted pouring device was further modified and developed for use in the actual metal casting process. Many settings and tests are required to control the system, and ultimately it can be used to automate the centrifugal casting machine to produce high-tech machine parts with the desired precision.

Keywords: Casting, cylinder liners, journal bearing, robot.

74 Assessment of Conventional Drinking Water Treatment Plants as Removal Systems of Virulent Microsporidia

Authors: M. A. Gad, A. Z. Al-Herrawy

Abstract:

Microsporidia comprise various pathogenic species that can infect humans by means of water, and chlorine disinfection of drinking water has limitations against these protozoan pathogens. A total of 48 water samples were collected over a one-year period from two drinking water treatment plants with two different filtration systems (a slow sand filter and a rapid sand filter). Samples were collected from the inlet and outlet of each plant and separately filtered through a nitrocellulose membrane (142 mm, 0.45 µm), then eluted and centrifuged. The pellet obtained from each sample was subjected to DNA extraction and then amplification using a genus-specific primer for microsporidia. Each microsporidia-PCR-positive sample was further analyzed with two species-specific primers for Enterocytozoon bieneusi and Encephalitozoon intestinalis. The results showed that the removal of microsporidia through the different treatment processes reached its highest rate at the station using slow sand filters (100%), while removal by the rapid sand filter system was 81.8%. Statistically, the two different drinking water treatment plants (slow and rapid) had a significant effect on the removal of microsporidia. Molecular identification of the microsporidia-PCR-positive samples using the two primers showed the presence of both species in the inlet water of the two stations, while only Encephalitozoon intestinalis was detected in the outlet water. In conclusion, the appearance of virulent microsporidia in treated drinking water may pose a potential health threat.

Keywords: Removal, efficacy, microsporidia, drinking water treatment plants, PCR.

73 Verification of Sr-90 Determination in Water and Spruce Needles Samples Using IAEA-TEL-2016-04 ALMERA Proficiency Test Samples

Authors: S. Visetpotjanakit, N. Nakkaew

Abstract:

Determination of ⁹⁰Sr in environmental samples has been widely developed using several radioanalytical methods and radiation measurement techniques, since ⁹⁰Sr is one of the most hazardous radionuclides produced in nuclear reactors. A liquid extraction technique using di-(2-ethylhexyl) phosphoric acid (HDEHP) to separate and purify ⁹⁰Y, combined with Cherenkov counting in a liquid scintillation counter to determine ⁹⁰Y in secular equilibrium with ⁹⁰Sr, was developed and performed at our institute, the Office of Atoms for Peace. The approach is inexpensive, non-laborious, and fast for analysing ⁹⁰Sr in environmental samples. To validate our analytical performance against accuracy and precision criteria, ⁹⁰Sr was determined in the IAEA-TEL-2016-04 ALMERA proficiency test samples for statistical evaluation. The experiment used two spiked tap water samples and one naturally contaminated spruce needles sample from Austria, collected shortly after the Chernobyl accident. All three analyses passed both the accuracy and precision criteria, obtaining "Accepted" statuses. The two water samples gave measured results of 15.54 Bq/kg and 19.76 Bq/kg, with relative biases of 5.68% and -3.63% against Maximum Acceptable Relative Biases (MARB) of 15% and 20%, respectively, and the spruce needles sample gave a measured result of 21.04 Bq/kg, with a relative bias of 23.78% against a MARB of 30%. These results confirm our analytical performance in determining ⁹⁰Sr in water and spruce needles samples using the developed method.
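
For clarity, the accuracy check above compares the relative bias against the MARB. A worked example with the first water sample's numbers (the target value of about 14.70 Bq/kg is inferred here from the reported bias, not stated in the text):

```latex
% Relative bias used for the ALMERA accuracy criterion:
\[
  \text{bias}(\%) = \frac{A_{\text{measured}} - A_{\text{target}}}{A_{\text{target}}} \times 100,
  \qquad
  \frac{15.54 - 14.70}{14.70} \times 100 \approx 5.7\% < \text{MARB} = 15\%.
\]
```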

Keywords: ALMERA proficiency test, Cherenkov counting, determination of 90Sr, environmental samples.

72 Image Analysis for Obturator Foramen Based on Marker-Controlled Watershed Segmentation and Zernike Moments

Authors: Seda Sahin, Emin Akata

Abstract:

The obturator foramen is a specific structure in pelvic bone images, and its recognition is a new concept in medical image processing. Segmentation of bone structures such as the obturator foramen plays an essential role in clinical research in orthopedics. In this paper, we present a novel method to analyze the similarity between the substructures of the imaged region and a hand-drawn template, as a preprocessing step for the computation of pelvic bone rotation on hip radiographs. The method integrates marker-controlled watershed segmentation with the Zernike moment feature descriptor to detect the obturator foramen accurately: marker-controlled watershed segmentation separates the obturator foramen from the background effectively, and the Zernike moment feature descriptor then matches the binary template image to the segmented binary image for the final extraction of the obturator foramens. Finally, the pelvic bone rotation rate is calculated automatically for each hip radiograph in order to select or eliminate radiographs for further studies that depend on pelvic bone angle measurements. The proposed method was tested on 100 randomly selected hip radiographs. The experimental results demonstrate that it is able to segment the obturator foramen with 96% accuracy.
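
A minimal sketch pairing marker-controlled watershed segmentation (scikit-image) with Zernike moment descriptors for template matching. The use of the mahotas library, the synthetic images, the marker positions, and the radius are all our assumptions, not the authors' implementation:

```python
# Segment a region with marker-controlled watershed, then compare its
# Zernike moments (rotation-invariant shape descriptors) with a template's.
import numpy as np
from mahotas.features import zernike_moments
from scipy import ndimage as ndi
from skimage.segmentation import watershed

image = np.zeros((128, 128))
image[40:90, 30:80] = 1.0                      # synthetic stand-in for the foramen

# Markers: one seed inside the structure, one in the background.
markers = np.zeros_like(image, dtype=int)
markers[64, 55] = 1
markers[5, 5] = 2
labels = watershed(-ndi.gaussian_filter(image, 3), markers)
region = (labels == 1)

template = np.zeros((128, 128), bool)
template[38:92, 28:82] = True                  # hand-drawn template stand-in

z_region = zernike_moments(region.astype(np.uint8), radius=40)
z_templ = zernike_moments(template.astype(np.uint8), radius=40)
print(np.linalg.norm(z_region - z_templ))      # small distance = good match
```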

Keywords: Medical image analysis, marker-controlled watershed segmentation, segmentation of bone structures on hip radiographs, pelvic bone rotation rate, Zernike moment feature descriptor.

71 Reduction of False Positives in Head-Shoulder Detection Based on Multi-Part Color Segmentation

Authors: Lae-Jeong Park

Abstract:

The paper presents a method that utilizes figure-ground color segmentation to extract an effective global feature for false-positive reduction in head-shoulder detection. Conventional detectors that rely on local features such as HOG, chosen for real-time operation, suffer from false positives. The color cue in an input image provides salient information about a global characteristic, which is necessary to alleviate the false positives of local-feature-based detectors. An effective approach that uses figure-ground color segmentation has previously been presented as a way to reduce false positives in object detection. In this paper, an extended version of that approach is presented, which adopts separate multi-part foregrounds instead of a single prior foreground and performs figure-ground color segmentation with each of them. The multi-part foregrounds include the parts of the head-shoulder shape plus additional auxiliary foregrounds optimized by a search algorithm. A classifier is constructed on a feature consisting of the set of resulting segmentations. Experimental results show that the presented method can reject more false positives than both the single-prior-shape-based classifier and detectors based on local features. The improvement is possible because the presented approach can reduce false positives that have the same colors in the head and shoulder foregrounds.

Keywords: Pedestrian detection, color segmentation, false positives, feature extraction.

70 Interoperable CNC System for Turning Operations

Authors: Yusri Yusof, Stephen Newman, Aydin Nassehi, Keith Case

Abstract:

The changing economic climate has made global manufacturing a growing reality over the last decade, forcing companies from east and west and all over the world to collaborate beyond geographic boundaries in the design, manufacture and assembly of products. The ISO 10303 and ISO 14649 standards (STEP and STEP-NC) have been developed to introduce interoperability into manufacturing enterprises, to meet the challenge of responding to production on demand. This paper describes and illustrates a STEP-compliant CAD/CAPP/CAM system for the manufacture of rotational parts on CNC turning centers. The information models that support the proposed system, together with the data models defined in the ISO 14649 standard and used to create the NC programs, are also described. A structured view of a STEP-compliant CAD/CAPP/CAM system framework supporting the next generation of intelligent CNC controllers for turn/mill component manufacture is provided. Finally, a proposed computational environment for a STEP-NC compliant system for turning operations (SCSTO) is described. SCSTO is the experimental part of the research, supported by the specification of information models and constructed using a structured methodology and object-oriented methods. It was developed to generate a Part 21 file based on machining features and to support the interactive generation of process plans utilizing feature extraction. A case study component has been developed to prove the concept of using the milling and turning parts of ISO 14649 to provide a turn-mill CAD/CAPP/CAM environment.

69 An Automated Approach to the Nozzle Configuration of Polycrystalline Diamond Compact Drill Bits for Effective Cuttings Removal

Authors: R. Suresh, Pavan Kumar Nimmagadda, Ming Zo Tan, Shane Hart, Sharp Ugwuocha

Abstract:

Polycrystalline diamond compact (PDC) drill bits are extensively used in the oil and gas industry as well as the mining industry, and industry engineers continually improve PDC drill bit designs and hydraulic conditions. Optimized injection nozzles play a key role in improving the drilling performance and efficiency of these ever-changing PDC drill bits. In the first part of this study, computational fluid dynamics (CFD) modelling is performed to investigate the hydrodynamic characteristics of drilling fluid flow around the PDC drill bit. The open-source CFD software OpenFOAM simulates the flow around the drill bit based on field input data, and a specifically developed console application integrates the entire CFD process, including domain extraction, meshing, solving the governing equations, and post-processing. The results from the OpenFOAM solver were compared with those from the ANSYS Fluent software, and the data from the two programs agree. The second part of the paper describes a parametric study of the PDC drill bit nozzle to determine the effect of parameters such as the number of nozzles, nozzle velocity, and nozzle radial position and orientation on the flow field characteristics and bit washing patterns. After analyzing a series of nozzle configurations, the best configuration is identified and recommendations are made for modifying the PDC bit design.

Keywords: ANSYS Fluent, computational fluid dynamics, nozzle configuration, OpenFOAM, PDC drill bit.

68 Image-Based UAV Vertical Distance and Velocity Estimation Algorithm during the Vertical Landing Phase Using Low-Resolution Images

Authors: Seyed-Yaser Nabavi-Chashmi, Davood Asadi, Karim Ahmadi, Eren Demir

Abstract:

The landing phase of a UAV is critical, as it involves many uncertainties that can easily lead to a hard landing or even a crash. In this paper, the estimation of the relative distance and velocity to the ground, one of the most important processes during the landing phase, is studied. Accurate measurement sensors can be very expensive (e.g., LIDAR) or have a limited operational range (e.g., ultrasonic sensors), and absolute positioning systems like GPS or IMU cannot provide the distance to the ground independently. The focus of this paper is to determine whether the relative distance and velocity between the UAV and the ground can be measured in the landing phase using only low-resolution images taken by a monocular camera. The Lucas-Kanade feature detection technique is employed to extract the most suitable features from a series of images taken during the UAV landing. Two approaches based on the Extended Kalman Filter (EKF) are proposed, and their performance in estimating the relative distance and velocity is compared. The first approach uses the kinematics of the UAV as the process model and the calculated optical flow as the measurement. The second approach uses the feature's projection on the camera plane (pixel position) as the measurement, while employing both the kinematics of the UAV and the dynamics of the projected point's variation as the process model, to estimate both relative distance and relative velocity. To verify the results, a sequence of low-quality images taken by a camera moving on a specifically developed testbed was used to compare the performance of the two algorithms. The case studies show that the image quality introduces considerable noise, which reduces the performance of the first approach, whereas using the projected feature position is much less sensitive to the noise and estimates the distance and velocity with relatively high accuracy. The second approach can also be used to predict the future projected feature position, which can drastically decrease the computational workload, an important criterion for real-time applications.
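
A minimal sketch, in the spirit of the second approach but under our own simplifying assumptions (not the authors' exact filter): the state is [height, descent rate], and the measurement is the pixel position of a ground feature at a known lateral offset, y = f·X/h, which is nonlinear in h and therefore handled with an EKF. The focal length, offset, noise levels, and trajectory are all made up:

```python
# EKF for height/descent-rate from a pixel measurement y = f*X/h.
import numpy as np

f, X = 800.0, 1.0            # focal length (px) and feature offset (m), assumed
dt, q, r = 0.05, 0.1, 2.0    # time step, process noise, pixel noise (assumed)

x = np.array([10.0, 0.0])                 # initial guess: 10 m height, 0 m/s
P = np.diag([4.0, 1.0])
F = np.array([[1, dt], [0, 1]])           # constant-velocity kinematics
Q = q * np.array([[dt**3 / 3, dt**2 / 2], [dt**2 / 2, dt]])

rng = np.random.default_rng(5)
true_h, true_v = 8.0, -0.5
for _ in range(100):
    true_h += true_v * dt
    z = f * X / true_h + rng.normal(scale=np.sqrt(r))   # noisy pixel measurement

    x, P = F @ x, F @ P @ F.T + Q                        # predict
    H = np.array([[-f * X / x[0] ** 2, 0.0]])            # Jacobian of y = f*X/h
    y = z - f * X / x[0]                                 # innovation
    S = H @ P @ H.T + r
    K = (P @ H.T) / S
    x, P = x + (K * y).ravel(), (np.eye(2) - K @ H) @ P  # update

print(f"height ~ {x[0]:.2f} m, velocity ~ {x[1]:.2f} m/s")
```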

Keywords: Automatic landing, multirotor, nonlinear control, parameter estimation, optical flow.

67 Current Status and Future Trends of Mechanized Fruit Thinning Devices and Sensor Technology

Authors: Marco Lopes, Pedro D. Gaspar, Maria P. Simões

Abstract:

This paper reviews the different concepts that have been investigated for the mechanization of fruit thinning, as well as multiple working principles and solutions that have been developed for feature extraction of horticultural products, both in the field and in industrial environments. Research should be directed towards selective methods, which inevitably need to incorporate some kind of sensor technology. Computer vision often comes out as an obvious solution for unstructured detection problems, although leaves frequently occlude fruits regardless of the chosen point of view, so further research on non-traditional sensors capable of object differentiation is needed. Ultrasonic and near-infrared (NIR) technologies have been investigated for applications related to horticultural produce and show potential to satisfy this need while simultaneously providing spatial information, as time-of-flight sensors do. Light Detection and Ranging (LIDAR) technology also shows huge potential, but it implies much greater costs and the related equipment is usually much larger, making it less suitable for the portable devices that may serve a purpose in smaller unstructured orchards. Concerning sensor methods for on-tree fruit detection, the major challenge is to overcome the problem of fruit occlusion by leaves and branches; hence, non-traditional sensors capable of providing some type of differentiation should be investigated.

Keywords: Fruit thinning, horticultural field, portable devices, sensor technologies.

66 Computer Countenanced Diagnosis of Skin Nodule Detection and Histogram Augmentation: Extracting System for Skin Cancer

Authors: S. Zith Dey Babu, S. Kour, S. Verma, C. Verma, V. Pathania, A. Agrawal, V. Chaudhary, A. Manoj Puthur, R. Goyal, A. Pal, T. Danti Dey, A. Kumar, K. Wadhwa, O. Ved

Abstract:

Background: Skin cancer is now a pressing topic in medical science, and its spread is drastically affecting health and well-being worldwide. Methods: The extracted image of a skin tumor cannot be used directly for diagnosis, since the stored image contains irregularities. The presented approach locates the relevant part of the extracted skin image, and image partitioning models are applied to sort out the disturbance in the picture. Results: After partitioning, feature extraction is performed using a genetic algorithm (GA), and finally classification is carried out between the training and test data to evaluate a large set of images, helping doctors towards the right prediction. To improve on the existing system, we set our objectives with an analysis: the efficiency of the natural selection process and histogram enrichment are essential in that respect, and the GA is applied to reduce the false-positive rate while maintaining accuracy. Conclusions: The objective of this work is to improve effectiveness, and the GA accomplishes its task of bringing down the false-positive rate. The paper combines deep learning and medical image processing, which provides superior accuracy, and the proportional types of handling create reusability without errors.

Keywords: Computer-aided system, detection, image segmentation, morphology.

65 Participation in IAEA Proficiency Test to Analyse Cobalt, Strontium and Caesium in Seawater Using Direct Counting and Radiochemical Techniques

Authors: S. Visetpotjanakit, C. Khrautongkieo

Abstract:

Radiation monitoring in the environment and foodstuffs is one of the main responsibilities of the Office of Atoms for Peace (OAP), the nuclear regulatory body of Thailand. The main goal of the OAP is to assure the safety of the Thai people and environment in any radiological incident. Various radioanalytical methods have been developed to monitor radiation and radionuclides in environmental and foodstuff samples. To validate our analytical performance, several proficiency test exercises from the International Atomic Energy Agency (IAEA) have been performed. Here, the results of a proficiency test exercise referred to as the Proficiency Test for Tritium, Cobalt, Strontium and Caesium Isotopes in Seawater 2017 (IAEA-RML-2017-01) are presented. All radionuclides except ³H were analysed using various radioanalytical methods, i.e., direct gamma-ray counting to determine ⁶⁰Co, ¹³⁴Cs and ¹³⁷Cs, and developed radiochemical techniques to analyse ¹³⁴Cs and ¹³⁷Cs using an AMP pre-concentration technique and ⁹⁰Sr using a di-(2-ethylhexyl) phosphoric acid (HDEHP) liquid extraction technique. The analysis results were submitted to the IAEA. All results passed the IAEA criteria, i.e., accuracy, precision and trueness, and obtained 'Accepted' statuses, confirming the data quality from the OAP environmental radiation laboratory for monitoring radiation in the environment.

Keywords: International Atomic Energy Agency, proficiency test, radiation monitoring, seawater.

64 New Simultaneous High Performance Liquid Chromatographic Method for Determination of NSAIDs and Opioid Analgesics in Advanced Drug Delivery Systems and Human Plasma

Authors: Asad Ullah Madni, Mahmood Ahmad, Naveed Akhtar, Muhammad Usman

Abstract:

A new and cost-effective RP-HPLC method was developed and validated for the simultaneous analysis of the non-steroidal anti-inflammatory drugs diclofenac sodium (DFS) and flurbiprofen (FLP) and the opioid analgesic tramadol (TMD) in advanced drug delivery systems (liposomes and microcapsules), marketed brands and human plasma. An isocratic system was employed, with a mobile phase consisting of 10 mM sodium dihydrogen phosphate buffer and acetonitrile in a molar ratio of 67:33, adjusted to pH 3.2. The stationary phase was a Hypersil ODS column (C18, 250×4.6 mm i.d., 5 μm) at a controlled temperature of 30 °C. DFS in liposomes, microcapsules and marketed drug products was determined in the range of 99.76-99.84%; FLP and TMD in microcapsule and brand formulations were 99.78-99.94% and 99.80-99.82%, respectively. A single-step liquid-liquid extraction procedure using a combination of acetonitrile and trichloroacetic acid (TCA) as the protein-precipitating agent was employed. The detection limits (at S/N ratio 3) for quality control solutions and plasma samples were 10, 20, and 20 ng/ml for DFS, FLP and TMD, respectively. The assay was linear over the dynamic range, and all other validation parameters were within the limits of the FDA and ICH method validation guidelines. The proposed method is sensitive, accurate and precise, and is applicable to routine analysis in the pharmaceutical industry as well as to human plasma samples for bioequivalence and pharmacokinetic studies.

Keywords: Diclofenac sodium, flurbiprofen, tramadol, HPLC-UV detection, validation.

63 Molecular Identification of ESBL Genes blaGES-1, blaVEB-1, blaCTX-M, blaOXA-1, blaOXA-4, blaOXA-10 and blaPER-1 in Pseudomonas aeruginosa Strains Isolated from Burn Patients by PCR, RFLP and Sequencing Techniques

Authors: Fereshteh Shacheraghi, Mohammad Reza Shakibaie, Hanieh Noveiri

Abstract:

Forty-one strains of ESBL-producing P. aeruginosa, previously isolated from burn patients in the Kerman University general hospital, Iran, were subjected to PCR, RFLP and sequencing in order to determine the types of extended-spectrum β-lactamases (ESBLs), the restriction digestion patterns and the possibility of mutation among the detected genes. DNA extraction was carried out by the phenol-chloroform method. PCR for the detection of bla genes was performed using a specific primer for each gene. Restriction Fragment Length Polymorphism (RFLP) analysis of the ESBL genes was carried out using the EcoRI, NheI, PvuII, EcoRV, DdeI, and PstI restriction enzymes, and the PCR products were subjected to direct sequencing of both strands to identify the ESBL genes. The blaCTX-M, blaVEB-1, blaPER-1, blaGES-1, blaOXA-1, blaOXA-4 and blaOXA-10 genes were detected in 2.43% (n=1), 100% (n=41), 68.3% (n=28), 24.4% (n=10), 70.7% (n=29), 17.1% (n=7) and 92.7% (n=38) of the ESBL-producing isolates, respectively. The RFLP analysis showed that each ESBL gene had an identical digestion pattern among the isolated strains. Sequencing of the ESBL genes confirmed the authenticity of the PCR products and revealed no mutations in the restriction sites of the above genes. From the results of the present investigation it can be concluded that blaVEB-1 and blaCTX-M were, respectively, the most and the least frequently isolated ESBL genes among the P. aeruginosa strains isolated from burn patients. The RFLP and sequencing analyses revealed that the same clones of the bla genes indeed existed among the antibiotic-resistant strains.

Keywords: ESBL genes, PCR, RFLP, sequencing, P. aeruginosa

62 Intelligent Assistive Methods for Diagnosis of Rheumatoid Arthritis Using Histogram Smoothing and Feature Extraction of Bone Images

Authors: SP. Chokkalingam, K. Komathy

Abstract:

Advances in the field of image processing envision a new era of evaluation techniques and procedures in many different fields, one of which is the biomedical field, for the prognosis as well as diagnosis of diseases. This plethora of methods provides a wide range of options to select from, but it also creates confusion in selecting the apt process and in finding which one is more suitable. Our objective is to apply a series of techniques to bone scans so as to detect the occurrence of rheumatoid arthritis (RA) as accurately as possible. Among the techniques existing in the field, our proposed system tends to be more effective, as it depends on new methodologies that have proved to be better and more consistent than others; computer-aided diagnosis provides a more accurate and reliable level of consistency that helps to improve the efficiency of the system. The image first undergoes histogram smoothing and specification, a morphing operation, boundary detection by an edge-following algorithm and finally image subtraction to determine the presence of rheumatoid arthritis in a more efficient and effective way. In preprocessing, noise is removed from the images; using segmentation, the region of interest is found, and histogram smoothing is applied to a specific portion of each image. Gray-level co-occurrence matrix (GLCM) features such as mean, median, energy and correlation, along with bone mineral density (BMD), are then computed and stored in a database. This dataset, labeled with inflamed and non-inflamed values, is used to train a neural network, with whose help all new images are properly checked for their status, and rough set theory is implemented for further feature reduction.
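
A minimal sketch of the GLCM texture-feature step on a (synthetic) bone-scan patch; the distance/angle settings are assumptions, mean and median are computed separately since they are plain intensity statistics rather than GLCM properties, and BMD would come from clinical data rather than the image alone:

```python
# Compute a grey-level co-occurrence matrix and read out standard texture
# properties plus simple intensity statistics for the feature database.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

patch = np.random.default_rng(6).integers(0, 256, size=(64, 64), dtype=np.uint8)

glcm = graycomatrix(patch, distances=[1], angles=[0], levels=256,
                    symmetric=True, normed=True)
features = {p: graycoprops(glcm, p)[0, 0]
            for p in ("energy", "correlation", "contrast", "homogeneity")}
features["mean"] = patch.mean()
features["median"] = np.median(patch)
print(features)    # stored per image, then fed to the neural network
```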

Keywords: Computer Aided Diagnosis, Edge Detection, Histogram Smoothing, Rheumatoid Arthritis.

61 Accelerating Quantum Chemistry Calculations: Machine Learning for Efficient Evaluation of Electron-Repulsion Integrals

Authors: Nishant Rodrigues, Nicole Spanedda, Chilukuri K. Mohan, Arindam Chakraborty

Abstract:

A crucial objective in quantum chemistry is the computation of the energy levels of chemical systems. This task requires electron-repulsion integrals as inputs, and the steep computational cost of evaluating these integrals poses a major numerical challenge for the efficient implementation of quantum chemistry software. This work presents a moment-based machine learning approach for the efficient evaluation of electron-repulsion integrals, which are approximated using linear combinations of a small number of moments. Machine learning algorithms were applied to estimate the coefficients in the linear combination. A random forest approach with recursive feature elimination was used to identify promising features; it performed best for learning the sign of each coefficient, but not the magnitude. A neural network with two hidden layers was then used to learn the coefficient magnitudes, along with an iterative feature-masking approach to compress the input vector, identifying a small subset of orbitals whose coefficients suffice for the quantum state energy computation. Finally, a small ensemble of neural networks (with a median rule for decision fusion) was shown to improve results compared to a single network.
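
A minimal sketch of the median-rule ensemble step with two-hidden-layer networks; the synthetic data stands in for the moment features and coefficient magnitudes, and the layer sizes and ensemble size are assumptions:

```python
# Train a small ensemble of two-hidden-layer regressors and fuse their
# predictions with a median rule, comparing against the held-out targets.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(7)
X = rng.normal(size=(2000, 16))                             # moment-based features
y = X @ rng.normal(size=16) + 0.1 * rng.normal(size=2000)   # coefficient magnitudes

ensemble = [
    MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=500,
                 random_state=s).fit(X[:1500], y[:1500])
    for s in range(5)
]
preds = np.stack([m.predict(X[1500:]) for m in ensemble])
fused = np.median(preds, axis=0)              # median rule for decision fusion
rmse = np.sqrt(np.mean((fused - y[1500:]) ** 2))
print(f"ensemble RMSE: {rmse:.3f}")
```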

Keywords: Quantum energy calculations, atomic orbitals, electron-repulsion integrals, ensemble machine learning, random forests, neural networks, feature extraction.
