Search results for: Template Knowledge Models

1105 Volatile Organochlorine Compounds Emitted by Temperate Coniferous Forests

Authors: Jana Doležalová, Josef Holík, Zdeněk Wimmer, Sándor T. Forczek

Abstract:

Chlorine is one of the most abundant elements in nature and undergoes a complex biogeochemical cycle. Chlorine bound in some substances is partly responsible for atmospheric ozone depletion and for the contamination of some ecosystems. As the anthropogenic burden of volatile organochlorines (VOCls) in the atmosphere decreases due to international regulations, natural sources (plants, soil, abiotic formation) are expected to dominate VOCl production in the near future. Examples of plant VOCl production are methyl chloride and bromide emission from (sub)tropical ferns, and chloroform, 1,1,1-trichloroethane and tetrachloromethane emission from temperate forest ferns and mosses. Temperate forests have been found to emit, in addition to the previous compounds, tetrachloroethene and brominated volatile compounds. VOCls can be taken up and further metabolized in plants. The aim of this work is to identify and quantitatively analyze the VOCls formed in temperate forest ecosystems by a cryofocusing/GC-ECD detection method, thereby filling a gap of knowledge in the biogeochemical cycle of chlorine.

Keywords: chloroform, cryofocusing-GC-ECD, ozone depletion, volatile organochlorines

1104 Review and Classification of the Indicators and Trends Used in Bridge Performance Modeling

Authors: S. Rezaei, Z. Mirzaei, M. Khalighi, J. Bahrami

Abstract:

Bridges, as an essential part of road infrastructure, are affected by various deterioration mechanisms over time, which lead to changes in their performance. As changes in performance can have many negative impacts on society, it is essential to be able to evaluate and measure the performance of bridges throughout their life. This evaluation includes the development or choice of appropriate performance indicators, which, in turn, are measured based on the selection of appropriate models for the existing deterioration mechanisms. The purpose of this article is a statistical study of the indicators and deterioration mechanisms of bridges in order to discover further research capacities in bridge performance assessment. For this purpose, some of the most common indicators of bridge performance, including reliability, risk, vulnerability, robustness, and resilience, were selected. The research performed on each indicator, based on the deterioration mechanisms and hazards of interest, was comprehensively reviewed. In addition, the formulation of the indicators and their relationships with each other were studied. The research conducted on the mentioned indicators was classified according to deterministic or probabilistic method, the level of study (element level, object level, etc.), and the type of hazard and deterioration mechanism of interest. For each of the indicators, a number of challenges and recommendations were presented based on the review of previous studies.

Keywords: Bridge, deterioration mechanism, lifecycle, performance indicator.

1103 The Public Law Studies: Relationship between Accountability, Environmental Education and Smart Cities

Authors: Aline Alves Bandeira, Luís Pedro Lima, Maria Cecília de Paula Silva, Paulo Henrique de Viveiros Tavares

Abstract:

Nowadays, the study of public policies regarding management efficiency is essential. Public policy concerns what governments do or do not do; it is an area that has grown worldwide, contributing knowledge of technologies and methodologies that monitor and evaluate the performance of public administrators. The information published on official government websites needs to provide for the transparency and responsiveness of managers. Thus, transparency is a primordial factor for the exercise of accountability, providing services to citizens through the expansion of transparent, efficient, and democratic information that values administrative eco-efficiency. The ecologically balanced management of a Smart City must optimize environmental education, building a fairer society that brings about equality in the use of quality environmental resources. Smart Cities add value to the construction of public management, enabling interaction between people, enhancing environmental education and the practical applicability of administrative eco-efficiency, fostering economic development and improving the quality of life.

Keywords: Accountability, environmental education, new public administration, smart cities.

1102 Production Throughput Modeling under Five Uncertain Variables Using Bayesian Inference

Authors: Amir Azizi, Amir Yazid B. Ali, Loh Wei Ping

Abstract:

Throughput is an important measure of the performance of a production system. Analyzing and modeling production throughput is complex in today's dynamic production systems because of their uncertainties. The main reason is that uncertainties materialize when the production line faces changes in setup time, machinery breakdown, manufacturing lead time, and scrap. Besides, demand fluctuates from time to time for each product type. These uncertainties affect production performance. This paper proposes Bayesian inference for throughput modeling under five production uncertainties. The Bayesian model uses prior distributions that capture previous information about the uncertainties, while likelihood distributions are associated with the observed data. The Gibbs sampling algorithm, as a robust Markov chain Monte Carlo procedure, was employed for sampling the unknown parameters and estimating the posterior means of the uncertainties. The Bayesian model was validated with respect to convergence and the efficiency of its outputs. The results showed that the proposed Bayesian model was capable of predicting production throughput with an accuracy of 98.3%.
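
The abstract does not specify the exact priors or likelihoods; as a minimal sketch of the kind of Gibbs sampling described, the following estimates the posterior mean of a single uncertain variable (setup time) under an assumed conjugate Normal/Inverse-Gamma model, with synthetic data and illustrative prior parameters rather than the authors' model.

```python
import numpy as np

# Minimal Gibbs sampler for one uncertain variable (e.g. setup time),
# assuming x_i ~ Normal(mu, sigma^2) with a Normal prior on mu and an
# Inverse-Gamma prior on sigma^2 (illustrative priors, not the paper's).
rng = np.random.default_rng(0)
x = rng.normal(12.0, 2.0, size=200)          # observed setup times (synthetic)
n, xbar = len(x), x.mean()

mu0, tau0_sq = 10.0, 25.0                    # prior mean and variance for mu
a0, b0 = 2.0, 2.0                            # Inverse-Gamma prior for sigma^2

mu, sigma_sq = xbar, x.var()
draws = []
for it in range(5000):
    # mu | sigma^2, data  ~ Normal (conjugate update)
    prec = 1.0 / tau0_sq + n / sigma_sq
    mean = (mu0 / tau0_sq + n * xbar / sigma_sq) / prec
    mu = rng.normal(mean, np.sqrt(1.0 / prec))
    # sigma^2 | mu, data  ~ Inverse-Gamma (conjugate update)
    a = a0 + n / 2.0
    b = b0 + 0.5 * np.sum((x - mu) ** 2)
    sigma_sq = 1.0 / rng.gamma(a, 1.0 / b)
    draws.append((mu, sigma_sq))

burned = np.array(draws[1000:])              # discard burn-in
print("posterior mean of setup time:", burned[:, 0].mean())
```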

Keywords: Bayesian inference, Uncertainty modeling, Monte Carlo Markov chain, Gibbs sampling, Production throughput

1101 Centre Of Mass Selection Operator Based Meta-Heuristic For Unbounded Knapsack Problem

Authors: D. Venkatesan, K. Kannan, S. Raja Balachandar

Abstract:

In this paper, a new genetic algorithm based on a heuristic operator and a centre-of-mass selection operator (CMGA) is designed for the unbounded knapsack problem (UKP), which is an NP-hard combinatorial optimization problem. The proposed genetic algorithm is based on a heuristic operator that utilizes problem-specific knowledge. This centre-of-mass operator, when combined with other genetic operators, forms an algorithm competitive with existing ones. Computational results show that the proposed algorithm is capable of obtaining high-quality solutions for standard randomly generated knapsack instances. A comparative study of CMGA with a simple GA on unbounded knapsack instances of size up to 200 shows the superiority of CMGA. Thus, CMGA is an efficient tool for solving the UKP and is also competitive with other genetic algorithms.
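
The centre-of-mass selection operator itself is not described in the abstract; the sketch below is a plain genetic algorithm for the unbounded knapsack problem with a weight-repair heuristic and standard tournament selection substituted for the paper's operator, on made-up item values and weights.

```python
import random

values  = [6, 10, 12]          # illustrative item values
weights = [1, 2, 3]            # illustrative item weights
CAP = 15                       # knapsack capacity

def repair(counts):
    # Heuristic repair: drop items until the weight constraint holds.
    while sum(c * w for c, w in zip(counts, weights)) > CAP:
        i = random.choice([j for j, c in enumerate(counts) if c > 0])
        counts[i] -= 1
    return counts

def fitness(counts):
    return sum(c * v for c, v in zip(counts, values))

def random_solution():
    return repair([random.randint(0, CAP // w) for w in weights])

def tournament(pop, k=3):
    return max(random.sample(pop, k), key=fitness)

pop = [random_solution() for _ in range(50)]
for gen in range(200):
    child_pop = []
    for _ in range(len(pop)):
        p1, p2 = tournament(pop), tournament(pop)
        cut = random.randrange(len(values))              # one-point crossover
        child = p1[:cut] + p2[cut:]
        if random.random() < 0.2:                        # mutation
            child[random.randrange(len(values))] += random.choice([-1, 1])
            child = [max(0, c) for c in child]
        child_pop.append(repair(child))
    pop = child_pop
print("best value found:", fitness(max(pop, key=fitness)))
```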

Keywords: Genetic Algorithm, Unbounded Knapsack Problem, Combinatorial Optimization, Meta-Heuristic, Center of Mass

1100 Predictive Analysis for Big Data: Extension of Classification and Regression Trees Algorithm

Authors: Ameur Abdelkader, Abed Bouarfa Hafida

Abstract:

Since its inception, predictive analysis has revolutionized the IT industry through its robustness and decision-making facilities. It involves the application of a set of data processing techniques and algorithms in order to create predictive models. Its principle is based on finding relationships between explanatory variables and predicted variables: past occurrences are exploited to predict and derive the unknown outcome. With the advent of big data, many studies have suggested the use of predictive analytics to process and analyze big data. Nevertheless, they have been curbed by the limits of classical methods of predictive analysis when dealing with large amounts of data. In fact, because of its volume, its nature (semi-structured or unstructured) and its variety, it is impossible to analyze big data efficiently via classical methods of predictive analysis. The authors attribute this weakness to the fact that predictive analysis algorithms do not allow the parallelization and distribution of computation. In this paper, we propose to extend the predictive analysis algorithm Classification And Regression Trees (CART) in order to adapt it for big data analysis. The major changes to this algorithm are presented, and then a version of the extended algorithm is defined in order to make it applicable to huge quantities of data.
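
As a point of reference for the kind of computation being parallelized, the core of a classical CART split is shown below: for one feature, the class counts on each side of a candidate threshold (a sufficient statistic that could be aggregated across data partitions) determine the weighted Gini impurity. This is illustrative code, not the authors' extended algorithm.

```python
import numpy as np

def gini(counts):
    # Gini impurity from class counts.
    total = counts.sum()
    if total == 0:
        return 0.0
    p = counts / total
    return 1.0 - np.sum(p ** 2)

def best_split(x, y, n_classes):
    # Find the threshold on one feature that minimises weighted Gini impurity.
    order = np.argsort(x)
    x, y = x[order], y[order]
    left = np.zeros(n_classes)
    right = np.bincount(y, minlength=n_classes).astype(float)
    best = (np.inf, None)
    for i in range(len(x) - 1):
        left[y[i]] += 1
        right[y[i]] -= 1
        if x[i] == x[i + 1]:
            continue                      # no valid threshold between equal values
        w = (i + 1) / len(x)
        score = w * gini(left) + (1 - w) * gini(right)
        if score < best[0]:
            best = (score, (x[i] + x[i + 1]) / 2.0)
    return best                           # (impurity, threshold)

x = np.array([2.0, 1.0, 3.5, 4.0, 0.5, 3.0])
y = np.array([0, 0, 1, 1, 0, 1])
print(best_split(x, y, n_classes=2))
```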

Keywords: Predictive analysis, big data, predictive analysis algorithms, CART algorithm.

1099 Localization of Anatomical Landmarks in Head CT Images for Image to Patient Registration

Authors: M. Ovinis, D. Kerr, K. Bouazza-Marouf, M. Vloeberghs

Abstract:

The use of anatomical landmarks as a basis for image to patient registration is appealing because the registration may be performed retrospectively. We have previously proposed the use of two anatomical soft tissue landmarks of the head, the canthus (corner of the eye) and the tragus (a small, pointed, cartilaginous flap of the ear), as a registration basis for an automated CT image to patient registration system, and described their localization in patient space using close range photogrammetry. In this paper, the automatic localization of these landmarks in CT images, based on their curvature saliency and using a rule based system that incorporates prior knowledge of their characteristics, is described. Existing approaches to landmark localization in CT images are predominantly semi-automatic and primarily for localizing internal landmarks. To validate our approach, the positions of the landmarks localized automatically and manually in near isotropic CT images of 102 patients were compared. The average difference was 1.2mm (std = 0.9mm, max = 4.5mm) for the medial canthus and 0.8mm (std = 0.6mm, max = 2.6mm) for the tragus. The medial canthus and tragus can be automatically localized in CT images, with performance comparable to manual localization, based on the approach presented.

Keywords: Anatomical Landmarks, CT, Localization.

1098 Market Segmentation and Conjoint Analysis for Apple Family Design

Authors: Abbas Al-Refaie, Nour Bata

Abstract:

A distributor of Apple products experiences numerous difficulties in developing marketing strategies for new and existing mobile product entries that maximize customer satisfaction and the firm's profitability. This research therefore integrates market segmentation into platform-based product family design and conjoint analysis to identify iSystem combinations that increase customer satisfaction and business profits. First, the enhanced market segmentation grid is created. Then, the estimated demand model is formulated. Finally, the profit models are constructed and used to determine the ideal product family design that maximizes profit. Conjoint analysis is used to explore customer preferences and their satisfaction levels. A total of 200 surveys are collected on customer preferences. Then, simulation is used to determine the importance values for each attribute. Finally, sensitivity analysis is conducted to determine the product family design that maximizes both objectives. In conclusion, the results of this research shall provide great support to Apple distributors in determining the best marketing strategies that enhance their market share.
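
As a rough illustration of the conjoint step, part-worth utilities can be estimated by regressing stated preference ratings on dummy-coded attribute levels; the attributes, profiles and ratings below are invented, not the 200-survey data set used in the paper.

```python
import numpy as np

# Dummy-coded attribute levels for hypothetical product profiles
# (columns: screen=large, storage=256GB, price=high; baseline levels omitted).
X = np.array([
    [1, 0, 0],
    [1, 1, 0],
    [0, 1, 1],
    [0, 0, 1],
    [1, 1, 1],
    [0, 0, 0],
], dtype=float)
ratings = np.array([8.0, 9.0, 5.0, 4.0, 7.0, 6.0])   # synthetic respondent ratings

# Ordinary least squares: ratings ~ intercept + part-worths.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, ratings, rcond=None)
intercept, partworths = coef[0], coef[1:]

# Relative attribute importance from the range of part-worths
# (with one dummy per attribute the range is just |part-worth|).
importance = np.abs(partworths) / np.abs(partworths).sum()
print("part-worths:", partworths.round(2))
print("relative importance:", importance.round(2))
```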

Keywords: Market segmentation, conjoint analysis, market strategies, optimization.

1097 Perceptions of Climate Change and Adaptation of Climate-Smart Technology by the Paddy Farmers: A Case Study of Kandy District in Sri Lanka

Authors: W. A. D. P. Wanigasundera, P. C. B. Alahakoon

Abstract:

Kandy district in Sri Lanka has small-scale, rain-fed paddy farming and is highly vulnerable to climate change. In this study, the status of climate change was assessed using meteorological data and compared with the perceptions of the paddy farming community. Factors affecting the adaptation to climate-smart farming were also assessed.

Meteorological data for 33 years were collected, and the changes over time were compared with the perceptions of farmers. The temperature, rainfall and number of rainy days have increased in both locations. The onset of rains has also shifted. The perceptions of the majority of the farmers were in line with the actual changes. Knowledge and attitudes about the causes of climate change and adaptation were at a medium level and related to the level of adoption. Formulating effective communication strategies and a collaborative approach involving the state, the private sector, and civil society to make Sri Lankan agriculture ‘climate-smart’ are urgently needed.

Keywords: Adaptation of climate-smart technology, climate change, perception, rain-fed paddy.

1096 Noise Reduction in Image Sequences using an Effective Fuzzy Algorithm

Authors: Mahmoud Saeidi, Khadijeh Saeidi, Mahmoud Khaleghi

Abstract:

In this paper, we propose a novel spatiotemporal fuzzy-based algorithm for noise filtering of image sequences. Our proposed algorithm uses adaptive weights based on a triangular membership function, and a median filter is used to suppress noise. Experimental results show that when the images are corrupted by high-density salt-and-pepper noise, our fuzzy-based algorithm for noise filtering of image sequences is much more effective in suppressing noise and preserving edges than previously reported algorithms such as [1-7]. Indeed, the weights assigned to noisy pixels are highly adaptive, so they make good use of the correlation between pixels. On the other hand, motion estimation methods are error-prone, and in high-density noise they may degrade the filter performance. Therefore, our proposed fuzzy algorithm does not need any estimation of the motion trajectory. The proposed algorithm removes noise admissibly without any knowledge of the salt-and-pepper noise density.
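
A minimal, spatial-only illustration of fuzzy-weighted median filtering with a triangular membership on the deviation from the local median is sketched below; the membership parameters, test image and noise level are assumptions, and the authors' spatiotemporal weighting is not reproduced.

```python
import numpy as np
from scipy.ndimage import median_filter

def triangular(d, a, b):
    # Triangular membership: 0 below a, 1 above b, linear in between.
    return np.clip((d - a) / float(b - a), 0.0, 1.0)

def fuzzy_median(frame, a=15, b=60, size=3):
    med = median_filter(frame.astype(float), size=size)
    dev = np.abs(frame - med)                 # deviation from local median
    w = triangular(dev, a, b)                 # weight -> 1 for likely noise pixels
    return w * med + (1.0 - w) * frame        # blend median and original pixel

rng = np.random.default_rng(1)
frame = np.tile(np.arange(64.0), (64, 1))     # smooth synthetic test image
noisy = frame.copy()
mask = rng.random(frame.shape) < 0.1          # 10% salt-and-pepper noise
noisy[mask] = rng.choice([0.0, 255.0], size=mask.sum())
print(np.abs(fuzzy_median(noisy) - frame).mean() < np.abs(noisy - frame).mean())
```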

Keywords: Image Sequences, Noise Reduction, fuzzy algorithm, triangular membership function

1095 A Study of Panel Logit Model and Adaptive Neuro-Fuzzy Inference System in the Prediction of Financial Distress Periods

Authors: E. Giovanis

Abstract:

The purpose of this paper is to present two different approaches to financial distress pre-warning models appropriate for risk supervisors, investors and policy makers. We examine a sample of the financial institutions and electronic companies of the Taiwan Security Exchange (TSE) market from 2002 through 2008. We present a binary logistic regression with panel data analysis. With the pooled binary logistic regression, we can build a model including more variables than with random effects, while the in-sample and out-of-sample forecasting performance is higher with random effects estimation than with pooled regression. On the other hand, we estimate an Adaptive Neuro-Fuzzy Inference System (ANFIS) with Gaussian and Generalized Bell (Gbell) membership functions, and we find that ANFIS significantly outperforms the logit regressions in both in-sample and out-of-sample periods, indicating that ANFIS is a more appropriate tool for financial risk managers and for economic policy makers in central banks and national statistical services.

Keywords: ANFIS, Binary logistic regression, Financial distress, Panel data

1094 Cost Sensitive Feature Selection in Decision-Theoretic Rough Set Models for Customer Churn Prediction: The Case of Telecommunication Sector Customers

Authors: Emel Kızılkaya Aydogan, Mihrimah Ozmen, Yılmaz Delice

Abstract:

In recent years, the telecommunications sector has seen continuous change and development in the global market. In this sector, churn analysis techniques are commonly used for analysing why some customers terminate their service subscriptions prematurely. Customer churn is of utmost significance in this sector since it causes considerable business losses, and many companies carry out various studies in order to prevent losses while increasing customer loyalty. Although a large quantity of accumulated data is available in this sector, its usefulness is limited by data quality and relevance. In this paper, a cost-sensitive feature selection framework is developed with the aim of obtaining the feature reducts needed to predict customer churn. The framework is an optional cost-based pre-processing stage that removes redundant features for churn management. In addition, this cost-based feature selection algorithm is applied in a telecommunication company in Turkey, and the results obtained with this algorithm are reported.
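
The decision-theoretic rough set reduct computation is not given in the abstract; the sketch below shows only a much simpler cost-aware greedy selection that trades cross-validated accuracy gain against a hypothetical feature acquisition cost, on synthetic data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=500) > 0).astype(int)
costs = np.array([1.0, 5.0, 2.0, 4.0, 1.0, 3.0])   # hypothetical acquisition costs

selected, remaining = [], list(range(6))
best_score = 0.5                                    # baseline accuracy (roughly balanced classes)
while remaining:
    gains = []
    for f in remaining:
        feats = selected + [f]
        score = cross_val_score(LogisticRegression(max_iter=1000),
                                X[:, feats], y, cv=5).mean()
        gains.append(((score - best_score) / costs[f], f, score))
    gain, f, score = max(gains)
    if gain <= 0:            # stop when no feature improves accuracy per unit cost
        break
    selected.append(f)
    remaining.remove(f)
    best_score = score
print("selected features:", selected, "cv accuracy:", round(best_score, 3))
```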

Keywords: Churn prediction, data mining, decision-theoretic rough set, feature selection.

1093 2D-Modeling with Lego Mindstorms

Authors: Miroslav Popelka, Jakub Nožička

Abstract:

This work is based on the possibility of using Lego Mindstorms robotics systems to reduce costs. Lego Mindstorms consists of a wide variety of hardware components necessary to simulate, programme and test robotics systems in practice. The development environment supplied with the kit was used to programme the algorithm, which maps the space using the ultrasonic sensor. Matlab was used to render the values after they were measured by the ultrasonic sensor. The algorithm created for this paper uses theoretical knowledge from the area of signal processing. The data processed by the algorithm are collected by the ultrasonic sensor, which scans the 2D space in front of it. The ultrasonic sensor is placed on the moving arm of the robot, which provides horizontal movement of the sensor, while vertical movement is provided by the wheel drive. The robot follows a map in order to obtain correct positioning of the measured data. Based on these findings, Lego Mindstorms can be considered a capable, low-cost kit for real-time modelling.
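
The mapping step itself reduces to converting each (arm angle, measured distance) reading into a Cartesian point; a small sketch of that conversion with made-up readings is shown below.

```python
import math

def scan_to_points(readings, sensor_offset=(0.0, 0.0)):
    """Convert (angle_deg, distance_cm) ultrasonic readings into 2D points."""
    ox, oy = sensor_offset
    points = []
    for angle_deg, dist_cm in readings:
        theta = math.radians(angle_deg)
        points.append((ox + dist_cm * math.cos(theta),
                       oy + dist_cm * math.sin(theta)))
    return points

# Example sweep of the sensor arm from 0 to 180 degrees (synthetic distances).
readings = [(a, 50.0 + 10.0 * math.sin(math.radians(a))) for a in range(0, 181, 15)]
for x, y in scan_to_points(readings):
    print(f"{x:7.2f} {y:7.2f}")
```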

Keywords: LEGO Mindstorms, ultrasonic sensor, Real-time modeling, 2D object, low-cost robotics systems, sensors, Matlab, EV3 Home Edition Software.

1092 Objective Assessment of Psoriasis Lesion Thickness for PASI Scoring using 3D Digital Imaging

Authors: M.H. Ahmad Fadzil, Hurriyatul Fitriyah, Esa Prakasa, Hermawan Nugroho, S.H. Hussein, Azura Mohd. Affandi

Abstract:

Psoriasis is a chronic inflammatory skin condition that affects 2-3% of the population around the world. The Psoriasis Area and Severity Index (PASI) is the gold standard for assessing psoriasis severity as well as treatment efficacy. Although a gold standard, PASI is rarely used because it is tedious and complex. In practice, the PASI score is determined subjectively by dermatologists; therefore, inter- and intra-rater variations in assessment can occur even among expert dermatologists. This research develops an algorithm to assess psoriasis lesions for PASI scoring objectively. The focus of this research is thickness assessment, one of the four PASI parameters besides area, erythema and scaliness. Psoriasis lesion thickness is measured by averaging the total elevation from the lesion base to the lesion surface. Thickness values of 122 3D images taken from 39 patients are grouped into 4 PASI thickness scores using K-means clustering. Validation of the lesion base construction is performed using twelve body curvature models and shows good results, with a coefficient of determination (R²) equal to 1.
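
A minimal sketch of the clustering step — grouping mean lesion elevations into four thickness scores with K-means and ordering the clusters so that a higher elevation maps to a higher score — is shown below on synthetic elevations, not the study's 3D measurements.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Synthetic mean lesion elevations (mm) for a set of 3D images.
thickness = np.concatenate([rng.normal(m, 0.05, 30) for m in (0.1, 0.3, 0.6, 1.0)])

km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(thickness.reshape(-1, 1))

# Map clusters to PASI-style thickness scores 1..4 in order of centre elevation.
order = np.argsort(km.cluster_centers_.ravel())
score_of_cluster = {cluster: rank + 1 for rank, cluster in enumerate(order)}
scores = np.array([score_of_cluster[c] for c in km.labels_])
print(np.bincount(scores)[1:])   # number of images per thickness score
```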

Keywords: 3D digital imaging, base construction, PASI, psoriasis lesion thickness.

1091 The Modification of the Mixed Flow Pump with Respect to Stability of the Head Curve

Authors: Roman Klas, František Pochylý, Pavel Rudolf

Abstract:

This paper is focused on the CFD simulation of a radiaxial (i.e. mixed flow) pump with the aim of detecting the reasons for the instability of the Y-Q characteristic. The main reasons for pressure pulsations were detected by means of an analysis of the velocity and pressure fields within the pump combined with a theoretical approach. Consequently, modifications of the spiral case and pump suction area were made based on the knowledge of the flow conditions and the shape of the dissipation function. The primary design of the pump geometry was used as the base model for comparing the influences of the individual modifications; basic experimental data are available for this geometry. This approach replaced the calculation for compressible liquid flow, which is more complicated and, with respect to the convergence of all computational tasks, more difficult. The modification of the primary pump consisted of inserting three types of fins. Subsequently, the evaluation of pressure pulsations, specific energy curves and visualization of velocity fields were chosen as the criteria for a successful design.

Keywords: CFD, radiaxial pump, spiral case, stability

1090 Automated Heart Sound Classification from Unsegmented Phonocardiogram Signals Using Time Frequency Features

Authors: Nadia Masood Khan, Muhammad Salman Khan, Gul Muhammad Khan

Abstract:

Cardiologists perform cardiac auscultation to detect abnormalities in heart sounds. Since accurate auscultation is a crucial first step in screening patients with heart diseases, there is a need to develop computer-aided detection/diagnosis (CAD) systems to assist cardiologists in interpreting heart sounds and to provide second opinions. In this paper, different algorithms are implemented for automated heart sound classification using unsegmented phonocardiogram (PCG) signals. A support vector machine (SVM), an artificial neural network (ANN) and a Cartesian genetic programming evolved artificial neural network (CGPANN), without the application of any segmentation algorithm, have been explored in this study. The signals are first pre-processed to remove any unwanted frequencies. Both time- and frequency-domain features are then extracted for training the different models. The different algorithms are tested in multiple scenarios, and their strengths and weaknesses are discussed. Results indicate that SVM outperforms the rest with an accuracy of 73.64%.
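
A stripped-down version of this pipeline — a handful of time- and frequency-domain features per unsegmented recording followed by an SVM — is sketched below on synthetic signals; the sampling rate, feature set and toy classes are assumptions, not the paper's.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

FS = 1000  # sampling rate in Hz (assumed)

def features(sig):
    # Time-domain: energy, variability, zero-crossing rate.
    zcr = np.mean(np.abs(np.diff(np.sign(sig))) > 0)
    # Frequency-domain: spectral centroid and bandwidth from the magnitude spectrum.
    spec = np.abs(np.fft.rfft(sig))
    freqs = np.fft.rfftfreq(len(sig), d=1.0 / FS)
    centroid = np.sum(freqs * spec) / np.sum(spec)
    bandwidth = np.sqrt(np.sum(((freqs - centroid) ** 2) * spec) / np.sum(spec))
    return [np.mean(sig ** 2), np.std(sig), zcr, centroid, bandwidth]

rng = np.random.default_rng(0)
samples, labels = [], []
for _ in range(100):
    abnormal = rng.integers(0, 2)
    f0 = 40 + 60 * abnormal                     # toy spectral difference between classes
    t = np.arange(2 * FS) / FS
    sig = np.sin(2 * np.pi * f0 * t) + 0.3 * rng.normal(size=t.size)
    samples.append(features(sig))
    labels.append(abnormal)

X, y = np.array(samples), np.array(labels)
print("cv accuracy:", cross_val_score(SVC(kernel="rbf"), X, y, cv=5).mean())
```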

Keywords: Pattern recognition, machine learning, computer aided diagnosis, heart sound classification, feature extraction.

1089 Global Kinetics of Direct Dimethyl Ether Synthesis Process from Syngas in Slurry Reactor over a Novel Cu-Zn-Al-Zr Slurry Catalyst

Authors: Zhen Chen, Haitao Zhang, Weiyong Ying, Dingye Fang

Abstract:

The direct synthesis of dimethyl ether (DME) from syngas in slurry reactors is considered promising because of its advantages in heat transfer. In this paper, the influences of operating conditions (temperature, pressure and weight hourly space velocity) on the conversion of CO and the selectivity of DME and methanol were studied in a stirred autoclave over a Cu-Zn-Al-Zr slurry catalyst, which is far more suitable for the liquid-phase dimethyl ether synthesis process than commercial bifunctional catalysts. A Langmuir-Hinshelwood-type global kinetics model for liquid-phase direct DME synthesis, based on methanol synthesis models and a methanol dehydration model, has been investigated by fitting our experimental data. The model parameters were estimated with a MATLAB program based on genetic algorithms and the Levenberg-Marquardt method. The model fits the experimental data well, and its reliability was verified by statistical tests and residual error analysis.
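
The Langmuir-Hinshelwood rate expressions cannot be reconstructed from the abstract; the sketch below illustrates only the fitting step, estimating the parameters of a hypothetical power-law rate model from synthetic data with the Levenberg-Marquardt option of SciPy's least-squares routine.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical power-law rate model: r = k0 * exp(-Ea / (R * T)) * p_CO**a * p_H2**b
R = 8.314

def rate(params, T, p_co, p_h2):
    k0, Ea, a, b = params
    return k0 * np.exp(-Ea / (R * T)) * p_co ** a * p_h2 ** b

def residuals(params, T, p_co, p_h2, r_obs):
    return rate(params, T, p_co, p_h2) - r_obs

# Synthetic "measurements" generated from known parameters plus noise.
rng = np.random.default_rng(0)
T = rng.uniform(493, 533, 40)            # K
p_co = rng.uniform(0.5, 2.0, 40)         # MPa
p_h2 = rng.uniform(1.0, 4.0, 40)         # MPa
true = (5.0e4, 6.0e4, 0.6, 1.2)
r_obs = rate(true, T, p_co, p_h2) * (1 + 0.02 * rng.normal(size=40))

fit = least_squares(residuals, x0=(1.0e4, 5.0e4, 1.0, 1.0),
                    args=(T, p_co, p_h2, r_obs), method="lm")
print("estimated parameters:", fit.x)
```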

Keywords: alcohol/ether fuel, Cu-Zn-Al-Zr slurry catalyst, global kinetics, slurry reactor

1088 Personal Health Assistance Service Expert System (PHASES)

Authors: Chakkrit Snae, Michael Brueckner

Abstract:

In this paper, the authors present the framework of a system for assisting users through counseling on personal health, the Personal Health Assistance Service Expert System (PHASES). Personal health assistance systems need Personal Health Records (PHR), which support wellness activities, improve the understanding of personal health issues, enable access to data from providers of health services, strengthen health promotion, and in the end improve the health of the population. This is especially important in societies where health costs increase at a higher rate than the overall economy. The most important elements of a healthy lifestyle are related to food (such as balanced nutrition and diets), activities for body fitness (such as walking, sports and fitness programs), and medical treatments (such as massage and drug prescriptions). The PHASES framework uses an ontology of food, which includes nutritional facts, an expert system keeping track of personal health data that are matched with medical treatments, and a comprehensive data transfer between patients and the system.

Keywords: Personal health assistance service, expert system, ontologies, knowledge management, information technology.

1087 Wear Measuring and Wear Modelling Based On Archard, ASTM, and Neural Network Models

Authors: A. Shebani, C. Pislaru

Abstract:

Wear measurement and wear modelling are fundamental issues in the industrial field, mainly correlated with economy and safety. Therefore, there is a need to study wear measurement and wear estimation. The pin-on-disc test is the most common test used to study wear behaviour. In this paper, a pin-on-disc rig (AEROTECH UNIDEX 11) is used to investigate the effects of normal load and material hardness on wear under dry sliding conditions. In the pin-on-disc rig, two specimens were used: a pin made of steel with a tip, positioned perpendicular to the disc, and a disc made of aluminium. The pin wear and disc wear were measured using the following instruments: a Talysurf profilometer, a digital microscope, and an Alicona instrument. The Talysurf profilometer was used to measure the pin/disc wear scar depth, the digital microscope was used to measure the diameter and width of the wear scar, and the Alicona was used to measure the pin wear and disc wear. After that, the Archard model, the American Society for Testing and Materials (ASTM) model, and a neural network model were used for pin/disc wear modelling. The simulations were implemented in Matlab. This paper focuses on how the Alicona can be used for wear measurements and how a neural network can be used for wear estimation.
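
Of the models compared, the Archard relation is simple enough to state directly: wear volume is proportional to the normal load and the sliding distance and inversely proportional to hardness, V = K·F·s/H. The values below are illustrative, not the paper's measurements.

```python
def archard_wear_volume(K, load_N, sliding_distance_m, hardness_Pa):
    """Archard wear law: V = K * F * s / H (volume in cubic metres)."""
    return K * load_N * sliding_distance_m / hardness_Pa

# Illustrative pin-on-disc case: aluminium disc, 20 N load, 100 m sliding distance.
K = 1e-4                 # dimensionless wear coefficient (assumed)
H = 0.3e9                # hardness of aluminium in Pa (approximate)
V = archard_wear_volume(K, load_N=20.0, sliding_distance_m=100.0, hardness_Pa=H)
print(f"predicted wear volume: {V * 1e9:.3f} mm^3")
```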

Keywords: Wear measuring, Wear modelling, Neural Network, Alicona.

1086 Project Base Learning for IT Personnel Resources Development using TVML

Authors: Tansuriyavong Suriyon, Endo Takanobu, Boonmee Choompol

Abstract:

Using animated videos as teaching materials is an effective learning method. However, an even more effective learning method is for learners to produce the teaching videos themselves: learners who act as producers must learn and understand the material well in order to produce and present a teaching video to others. The purpose of this study is to propose a project-based learning (PBL) technique based on co-producing videos of IT (information technology) teaching materials. We used the T2V player to produce the videos based on TVML, a TV program description language. With the proposed method, we assigned the learners to produce animated videos for the "National Examination for Information Processing Technicians (IPA examination)" in Japan, in order to have them learn various knowledge and skills in the IT field. Experimental results showed that a learning effect occurred during the video production process, which is useful for IT personnel resources development.

Keywords: TVML, T2V Player, animation made as learning materials, National Examination for Information Processing Technicians, IT Education, Problem Based Learning

1085 Advance in Monitoring and Process Control of Surface Roughness

Authors: Somkiat Tangjitsitcharoen, Siripong Damrongthaveesak

Abstract:

This paper presents an advance in the monitoring and process control of surface roughness on CNC machines for turning and milling processes. An integration of in-process monitoring and process control of surface roughness during machining is proposed and developed, using the cutting force ratio. The author's previously developed surface roughness models for turning and milling are adopted to predict the in-process surface roughness; they consist of the cutting speed, the feed rate, the tool nose radius, the depth of cut, the rake angle, and the cutting force ratio. The cutting force ratios obtained from turning and milling are utilized to estimate the in-process surface roughness. Dynamometers are installed on the tool turret of the CNC turning machine and on the table of a 5-axis machining center to monitor the cutting forces. The in-process control of surface roughness has been developed and proposed to control the predicted surface roughness. Cutting tests proved that the proposed integrated system of in-process monitoring and process control can be used to check the surface roughness during cutting by utilizing the cutting force ratio.
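
The abstract does not give the model coefficients; in-process roughness models of this kind are commonly written as a power law in the cutting conditions and the cutting force ratio, whose exponents can be fitted by linear regression on logarithms, as the sketch below illustrates on synthetic data (the model form and variables are assumptions, not the authors' equations).

```python
import numpy as np

# Hypothetical power-law in-process roughness model:
#   Ra = C * v^a * f^b * r^c * d^e * (F_ratio)^g
# Taking logarithms turns it into a linear regression problem.
rng = np.random.default_rng(0)
n = 60
v, f, r = rng.uniform(100, 300, n), rng.uniform(0.05, 0.3, n), rng.uniform(0.4, 1.2, n)
d, fr = rng.uniform(0.5, 2.0, n), rng.uniform(0.3, 0.9, n)
true = np.array([0.5, -0.2, 1.8, -0.9, 0.1, 0.7])       # ln C and exponents (made up)
lnRa = (true[0] + true[1] * np.log(v) + true[2] * np.log(f) + true[3] * np.log(r)
        + true[4] * np.log(d) + true[5] * np.log(fr) + 0.05 * rng.normal(size=n))

A = np.column_stack([np.ones(n), np.log(v), np.log(f), np.log(r), np.log(d), np.log(fr)])
coef, *_ = np.linalg.lstsq(A, lnRa, rcond=None)
print("fitted exponents:", coef.round(2))

# Predicting the in-process roughness for one new cutting condition:
new = np.array([1.0, np.log(200), np.log(0.1), np.log(0.8), np.log(1.0), np.log(0.5)])
print("predicted Ra:", np.exp(new @ coef).round(3), "(illustrative units)")
```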

Keywords: Turning, milling, monitoring, surface roughness, cutting force ratio.

1084 Formulation and in vitro Evaluation of Sustained Release Matrix Tablets of Levetiracetam for Better Epileptic Treatment

Authors: Nagasamy Venkatesh Dhandapani

Abstract:

The objective of the present study was to develop sustained release oral matrix tablets of the antiepileptic drug levetiracetam. The sustained release matrix tablets of levetiracetam were prepared using the hydrophilic matrix hydroxypropyl methylcellulose (HPMC) as a release retarding polymer by the wet granulation method. Prior to compression, FTIR studies were performed to understand the compatibility between the drug and excipients. The study revealed that there was no chemical interaction between the drug and the excipients used in the study. The tablets were characterized by physical and chemical parameters, and the results were found to be within acceptable limits. An in vitro release study was carried out for the tablets in 0.1 N HCl for 2 hours and in phosphate buffer pH 7.4 for the remaining time, up to 12 hours. The effect of polymer concentration was studied. Different dissolution models were applied to the drug release data in order to evaluate release mechanisms and kinetics. The drug release data fit well to zero order kinetics, and the release mechanism was found to be a complex mixture of diffusion, swelling and erosion.
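
Zero-order kinetics means the cumulative amount released grows linearly with time, Q(t) = k0·t; a quick check of this fit on dissolution data (synthetic values, not the study's measurements) looks like the following.

```python
import numpy as np

# Synthetic cumulative % drug released at sampling times (hours).
t = np.array([1, 2, 4, 6, 8, 10, 12], dtype=float)
Q = np.array([8.5, 16.8, 33.9, 50.1, 66.0, 82.7, 98.5])

# Zero-order model through the origin: Q = k0 * t.
k0 = np.sum(Q * t) / np.sum(t ** 2)
Q_hat = k0 * t
ss_res = np.sum((Q - Q_hat) ** 2)
ss_tot = np.sum((Q - Q.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
print(f"k0 = {k0:.2f} %/h, R^2 = {r2:.4f}")
```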

Keywords: Levetiracetam, sustained-release, hydrophilic matrix tablet, HPMC grade K 100 MCR, wet granulation, zero order release kinetics.

1083 On Developing an Automatic Speech Recognition System for Standard Arabic Language

Authors: R. Walha, F. Drira, H. El-Abed, A. M. Alimi

Abstract:

Automatic Speech Recognition (ASR) applied to the Arabic language is a challenging task. This is mainly related to the language's specificities, which confront researchers with multiple difficulties such as insufficient linguistic resources and the very limited number of available transcribed Arabic speech corpora. In this paper, we are interested in the development of an HMM-based ASR system for the Standard Arabic (SA) language. Our fundamental research goal is to select the most appropriate acoustic parameters describing each audio frame, acoustic models and speech recognition unit. To achieve this purpose, we analyze the effect of varying the frame windowing (size and period), the number of acoustic parameters resulting from feature extraction methods traditionally used in ASR, the speech recognition unit, the number of Gaussians per HMM state, and the number of embedded re-estimations of the Baum-Welch algorithm. To evaluate the proposed ASR system, a multi-speaker SA connected-digits corpus is collected, transcribed and used throughout all experiments. A further evaluation is conducted on a speaker-independent continuous SA speech corpus. The phoneme recognition rate is 94.02%, which is relatively high compared with another ASR system evaluated on the same corpus.

Keywords: ASR, HMM, acoustical analysis, acoustic modeling, Standard Arabic language

1082 CFD Investigation of Turbulent Mixed Convection Heat Transfer in a Closed Lid-Driven Cavity

Authors: A. Khaleel, S. Gao

Abstract:

Both steady and unsteady turbulent mixed convection heat transfer in a 3D lid-driven enclosure, which has a constant heat flux on the middle of the bottom wall and isothermal moving sidewalls, are reported in this paper for a working fluid with Prandtl number Pr = 0.71. The other walls are adiabatic and stationary. The dimensionless parameters used in this research are the Reynolds number, Re = 5000, 10000 and 15000, and the Richardson number, Ri = 1 and 10. The simulations have been performed using different turbulence modelling approaches such as RANS, URANS, and LES. The effects of using different k-ε models, namely the standard, RNG and Realizable k-ε models, are investigated. Interesting behaviours of the thermal and flow fields are observed when the Re or Ri number is changed. Isotherm and turbulent kinetic energy distributions and the variation of the local Nusselt number at the hot bottom wall are studied as well. The local Nusselt number is found to increase with increasing either the Re or the Ri number. In addition, the turbulent kinetic energy is discernibly affected by increasing the Re number. Moreover, the LES results have shown the good ability of this method to predict more detailed flow structures in the cavity.

Keywords: Mixed convection, Lid-driven cavity, Turbulent flow, RANS model, URANS model, Large eddy simulation.

1081 Structuring and Visualizing Healthcare Claims Data Using Systems Architecture Methodology

Authors: Inas S. Khayal, Weiping Zhou, Jonathan Skinner

Abstract:

Healthcare delivery systems around the world are in crisis. The need to improve health outcomes while decreasing healthcare costs has led to an urgent call to action to transform the healthcare delivery system. While bioinformatics and biomedical engineering have primarily focused on biological-level data and biomedical technology, there is clear evidence of the importance of the delivery of care for patient outcomes. Classic singular decomposition approaches from reductionist science are not capable of explaining complex systems. Approaches and methods from systems science and systems engineering are utilized to structure healthcare delivery system data. Specifically, systems architecture is used to develop a multi-scale and multi-dimensional characterization of the healthcare delivery system, defined here as the Healthcare Delivery System Knowledge Base. This paper is the first to contribute a new method of structuring and visualizing a multi-dimensional and multi-scale healthcare delivery system using systems architecture in order to better understand healthcare delivery.

Keywords: Health informatics, systems thinking, systems architecture, healthcare delivery system, data analytics.

1080 Low-Latency and Low-Overhead Path Planning for In-band Network-Wide Telemetry

Authors: Penghui Zhang, Hua Zhang, Jun-Bo Wang, Cheng Zeng, Zijian Cao

Abstract:

With the development of software-defined networks and programmable data planes, in-band network telemetry (INT) has become an emerging technology in communications because it can obtain accurate and real-time network information. However, due to the expansion of network scale, existing telemetry systems, to the best of the authors’ knowledge, have difficulty in meeting the common requirements of low overhead, low latency and full coverage for traffic measurement. This paper proposes a network-wide telemetry system with low-latency, low-overhead path planning (INT-LLPP). This paper builds a mathematical model to analyze the telemetry overhead and latency of INT systems. Then, we adopt a greedy-based path planning algorithm to reduce the overhead and latency of the network telemetry while keeping full network coverage. The simulation results show that network-wide telemetry is achieved and that the telemetry overhead can be reduced significantly compared with existing INT systems. INT-LLPP controls the system latency so that real-time network information can be obtained.
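
The INT-LLPP algorithm itself is not detailed in the abstract; the sketch below only conveys the general flavour of greedy path planning for telemetry coverage, repeatedly growing a probe path along links that no earlier path has covered until every link is covered, on a toy topology.

```python
from collections import defaultdict

# Toy topology as an undirected adjacency list.
edges = [("s1", "s2"), ("s2", "s3"), ("s3", "s4"), ("s2", "s4"), ("s1", "s3")]
adj = defaultdict(set)
for u, v in edges:
    adj[u].add(v)
    adj[v].add(u)

uncovered = {frozenset(e) for e in edges}
paths = []
while uncovered:
    # Start a new probe path at an endpoint of any uncovered link.
    u, _ = tuple(next(iter(uncovered)))
    path = [u]
    current = u
    # Greedily extend the path along uncovered links while possible.
    while True:
        nxt = next((w for w in adj[current] if frozenset((current, w)) in uncovered), None)
        if nxt is None:
            break
        uncovered.discard(frozenset((current, nxt)))
        path.append(nxt)
        current = nxt
    paths.append(path)

print("probe paths covering all links:", paths)
```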

Keywords: Network telemetry, network monitoring, path planning, low latency.

1079 Analysis of GI/M(n)/1/N Queue with Single Working Vacation and Vacation Interruption

Authors: P. Vijaya Laxmi, V. Goswami, V. Suchitra

Abstract:

This paper presents a finite-buffer renewal input queue with a single working vacation and vacation interruption, with state-dependent services and state-dependent vacations, which has a wide range of applications in several areas including manufacturing and wireless communication systems. Service times during the busy period and during the vacation period, as well as the vacation times, are exponentially distributed and state dependent. As a result of the finite waiting space, state-dependent services and state-dependent vacation policies, the analysis of these queueing models needs special attention. We provide a recursive method using the supplementary variable technique to compute the stationary queue length distributions at pre-arrival and arbitrary epochs. An efficient computational algorithm for the model is presented, which is fast, accurate and easy to implement. Various performance measures are discussed. Finally, some special cases and numerical results are presented in the form of tables and graphs.

Keywords: State Dependent Service, Vacation Interruption, Supplementary Variable, Single Working Vacation, Blocking Probability.

1078 The Socio-Economic Impact of the English Leather Glove Industry from the 17th Century to Its Recent Decline

Authors: Frances Turner

Abstract:

Gloves are significant physical objects and one of the oldest forms of dress. Glove culture is part of every facet of life; its extraordinary history encompasses practicality and symbolism, reflecting a wide range of social practices. The survival of not only the gloves but also associated articles makes it possible to analyse real lives; however, so far this area has been largely neglected. Limited information is available to students, researchers, or those involved with the design and making of gloves. There are several museums and independent collectors in England that hold collections of gloves (some from as early as the 16th century), machinery, tools, designs and patterns, marketing materials and significant archives which demonstrate the rich heritage of English glove design and manufacturing, being of national significance and worthy of international interest. Through a research glove network, which now exists thanks to research grant funding, there is potential for the holders of glove collections to make connections and explore links between these resources to promote a stronger understanding of the significance, breadth and heritage of the English glove industry. The network takes an interdisciplinary approach to bring together interested parties from academia, museums and manufacturing, with expert knowledge of the production, collection, conservation and display of English leather gloves. Academics from diverse arts and humanities disciplines benefit from the opportunities to share research and discuss ideas with network members from non-academic contexts including museums and heritage organisations, industry, and contemporary designers. The fragmented collections, when considered in their entirety, provide an overview of English glove making since the earliest times and of those who wore gloves. This paper makes connections and explores links between these resources to promote a stronger understanding of the significance, breadth and heritage of the English glove industry. The following areas are explored: the current content and status of the individual museum collections; potential links; the sharing of information on histories, social and cultural aspects, and the relationship to the history of fashion design, manufacturing and materials; approaches to maintenance and conservation; access to the collections; and strategies for future understanding of their national significance. The facilitation of knowledge exchange and the exploration of the collections through the network inform organisations’ future strategies for the maintenance, access and conservation of their collections. By involving industry in the network, it is possible to ensure a contemporary perspective on glove-making in addition to the input from heritage partners. The slow fashion movement, awareness of artisan craft, and how these can be preserved and adopted for glove and accessory design are also addressed. Artisan leather glove making was a skilled and significant industry in England that has now declined to the point where there is little production remaining that utilises the specialist skills that have hardly changed since the earliest times. Unless this heritage is identified and preserved for future generations, the rich cultural history of gloves may be lost.

Keywords: Artisan glove making skills, English leather gloves, glove culture, glove network.

1077 User-Based Cannibalization Mitigation in an Online Marketplace

Authors: Vivian Guo, Yan Qu

Abstract:

Online marketplaces are not only digital places where consumers buy and sell merchandise; they are also destinations for brands to connect with real consumers at the moment when customers are in a shopping mindset. For many marketplaces, brands have been important partners through advertising. There can be, however, a risk of advertising affecting a consumer's shopping journey if it hurts the user experience or takes the user away from the site. Both could lead to a loss of transaction revenue for the marketplace. In this paper, we present user-based methods for cannibalization control that selectively turn off ads to users who are likely to be cannibalized by ads, subject to business objectives. We present ways of measuring the cannibalization of advertising in the context of an online marketplace and propose novel ways of measuring cannibalization through purchase propensity and uplift modeling. A/B testing has shown that our methods can significantly improve user purchase and engagement metrics while operating within business objectives. To our knowledge, this is the first paper that addresses cannibalization mitigation at the user level in the context of advertising.
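
A common way to operationalise this is the two-model ("T-learner") uplift approach: fit purchase-propensity models for ad-exposed and unexposed users and suppress ads where the estimated uplift is negative. The sketch below shows only that generic pattern on synthetic data, not the production system described.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000
X = rng.normal(size=(n, 4))                       # user features (synthetic)
treated = rng.integers(0, 2, n)                   # 1 = saw ads, 0 = did not
# Synthetic outcome: ads help most users but cannibalize users with high X[:, 0].
base = 0.2 + 0.1 * (X[:, 1] > 0)
effect = 0.05 - 0.15 * (X[:, 0] > 1.0)
purchase = (rng.random(n) < base + treated * effect).astype(int)

m_t = LogisticRegression(max_iter=1000).fit(X[treated == 1], purchase[treated == 1])
m_c = LogisticRegression(max_iter=1000).fit(X[treated == 0], purchase[treated == 0])

uplift = m_t.predict_proba(X)[:, 1] - m_c.predict_proba(X)[:, 1]
suppress_ads = uplift < 0                         # users likely cannibalized by ads
print("share of users with ads turned off:", suppress_ads.mean().round(3))
```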

Keywords: Cannibalization, machine learning, online marketplace, revenue optimization, yield optimization.

1076 The Effect of e-learning on the Promotion of Optoelectronics Technology and Daily Livings Literacy among Students in Universities of Technology

Authors: Chin-Pin Chen, David W.S. Tai, Wen-Jong Chen, Hui-Min Lai

Abstract:

This study aims to analyze the effect of e-learning on photonics technology and daily livings literacy among college students. The course contents of photonics technology and daily livings were first drafted based on research discussions and expert interviews. After three rounds of expert questionnaires using the Delphi technique, the knowledge units and items for the course on photonics technology and daily livings were established. The e-learning materials and the drafts of the instructional strategies, academic achievement, and learning attitude scales were then developed. After expert inspection, reliability and validity tests, and experimental instruction, the scales and the materials were further revised. Finally, formal instruction was implemented to test the effect of different instructional methods on the academic achievement of photonics technology and daily livings among students in universities of technology. The research results show that e-learning can effectively promote academic achievement and learning attitude, and that the students with e-learning clearly outperform those with traditional instruction.

Keywords: E-learning, Photonics Technology and Daily Livings, Academic Achievement
