Search results for: continuous process
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6042

5742 Measurement Tools of the Maturity Model for IT Service Outsourcing in Higher Education Institutions

Authors: Victoriano Valencia García, Luis Usero Aragonés, Eugenio J. Fernández Vicente

Abstract:

Nowadays, the successful implementation of ICTs is vital for almost any kind of organization. Good governance and ICT management are essential for delivering value, managing technological risks, managing resources and performance measurement. In addition, outsourcing is a strategic IT service solution which complements IT services provided internally in organizations. This paper proposes the measurement tools of a new holistic maturity model based on standards ISO/IEC 20000 and ISO/IEC 38500, and the frameworks and best practices of ITIL and COBIT, with a specific focus on IT outsourcing. These measurement tools allow independent validation and practical application in the field of higher education, using a questionnaire, metrics tables, and continuous improvement plan tables as part of the measurement process. Guidelines and standards are proposed in the model for facilitating adaptation to universities and achieving excellence in the outsourcing of IT services.

Keywords: IT Governance, IT Management, IT Services, Maturity Model, Measurement Tools, Outsourcing.

5741 Line Balancing in the Hard Disk Drive Process Using Simulation Techniques

Authors: Teerapun Saeheaw, Nivit Charoenchai, Wichai Chattinnawat

Abstract:

A simulation model is an easy way to build up models that represent real-life scenarios, to identify bottlenecks and to enhance system performance. Using a valid simulation model may give several advantages in creating a better manufacturing design in order to improve system performance. This paper presents the results of implementing a simulation model to design a hard disk drive manufacturing process, applying line balancing to improve both the productivity and the quality of the process. The balanced line showed an 86% decrease in work in process, an average 80% increase in output, an 86% decrease in average time in the system and a 90% decrease in waiting time.
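As a brief illustration of the line-balancing measure referred to above, the following sketch computes the textbook line (balance) efficiency and balance delay from station task times; the task times, cycle time and station assignment are hypothetical and are not data from the study.

```python
# Minimal line-balancing metrics on hypothetical data (not from the study).
task_times = [12.0, 9.5, 14.0, 8.0, 11.5]   # station work content, seconds
cycle_time = max(task_times)                 # bottleneck station sets the pace
n_stations = len(task_times)

# Line (balance) efficiency: total work content / (stations * cycle time).
efficiency = sum(task_times) / (n_stations * cycle_time)
balance_delay = 1.0 - efficiency             # idle fraction of the line

print(f"Line efficiency: {efficiency:.1%}, balance delay: {balance_delay:.1%}")
```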

Keywords: Line balancing, Arena, hard disk drive process, simulation, work in process (WIP)

5740 Integration Process of Industrial Design and Engineering Design

Authors: Kazuhide Sugiyama, Hiroshi Osada

Abstract:

Lately, management strategies that put Industrial Design (ID) at their core have been recognized as increasingly important, since technology and price alone cannot differentiate a product. The need to shorten product development time also shortens the ID development period, which necessitates ID process management. This research analyzes the integration process of ID and Engineering Design (ED) for office equipment, which requires collaboration between ID and ED, in order to clarify the issues affecting development efficiency and to propose solutions.

Keywords: Industrial Design (ID), Engineering Design (ED), Integration process, Office equipment

5739 Using Perspective Schemata to Model the ETL Process

Authors: Valeria M. Pequeno, Joao Carlos G. M. Pires

Abstract:

Data Warehouses (DWs) are repositories which contain the unified history of an enterprise for decision support. The data must be Extracted from information sources, Transformed and integrated to be Loaded (ETL) into the DW, using ETL tools. These tools focus on data movement, and the models are only used as a means to this end. From a conceptual viewpoint, the authors innovate the ETL process in two ways: 1) making the compatibility between models explicit in a declarative fashion, using correspondence assertions, and 2) identifying the instances from different sources that represent the same real-world entity. This paper presents an overview of the proposed framework to model the ETL process, which is based on the use of a reference model and perspective schemata. This approach provides the designer with a better understanding of the semantics associated with the ETL process.
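To make the idea of a declarative correspondence assertion concrete, the following is a minimal sketch of how mappings between source schemas and a DW schema might be expressed and applied, together with a toy entity-resolution rule; the schema names, attributes and matching rule are hypothetical and are not taken from the paper.

```python
# Hypothetical declarative correspondences between source schemas and a DW schema.
correspondences = [
    # (source attribute, DW attribute, transformation)
    ("crm.client.full_name", "dw.customer.name",  lambda v: v.strip().title()),
    ("crm.client.birth_dt",  "dw.customer.birth", lambda v: v[:10]),   # keep ISO date part
    ("erp.cust.cust_name",   "dw.customer.name",  lambda v: v.strip().title()),
]

def same_entity(rec_a: dict, rec_b: dict) -> bool:
    """Toy entity-resolution rule: two source records denote the same
    real-world customer if name and birth date agree after normalization."""
    return (rec_a["name"].lower() == rec_b["name"].lower()
            and rec_a["birth"] == rec_b["birth"])

src = {"crm.client.full_name": "  ada lovelace ", "crm.client.birth_dt": "1815-12-10T00:00"}
loaded = {target: fn(src[source]) for source, target, fn in correspondences if source in src}
print(loaded)   # {'dw.customer.name': 'Ada Lovelace', 'dw.customer.birth': '1815-12-10'}
```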

Keywords: conceptual data model, correspondence assertions, data warehouse, data integration, ETL process, object relational database.

5738 Scaling Strategy of a New Experimental Rig for Wheel-Rail Contact

Authors: Meysam Naeimi, Zili Li, Rolf Dollevoet

Abstract:

A new small-scale test rig was developed for rolling contact fatigue (RCF) investigations in wheel-rail materials. This paper presents the scaling strategy of the rig, based on dimensional analysis and mechanical modelling. The new experimental rig is a spinning frame structure with multiple wheel components running over a fixed rail-track ring, capable of simulating continuous wheel-rail contact at laboratory scale. The paper describes the dimensional design of the rig in order to derive its overall scaling strategy and to determine the specifications of the key elements. Finite element (FE) modelling is used to simulate the mechanical behavior of the rig with two sample scale factors, 1/5 and 1/7. The results of the FE models are compared with the actual railway system to assess the effectiveness of the chosen scales. The mechanical properties of the components and the variables of the system are finally determined through the design process.
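As an illustration of the kind of similarity relations used in such a scaling strategy, the sketch below derives a few scaled quantities from a geometric scale factor under the common assumption that contact stresses are kept equal between model and full scale, so that forces scale with area; the full-scale values and the stress-equality assumption are illustrative, not the paper's actual design choices.

```python
# Illustrative similarity scaling (not the paper's actual design values).
# Assumption: geometric scaling by factor s with equal contact stress at
# model and full scale, so forces scale with area (s**2).
full_scale = {"wheel_radius_m": 0.46, "wheel_load_N": 100e3}

for s in (1/5, 1/7):
    radius = full_scale["wheel_radius_m"] * s        # lengths scale with s
    load   = full_scale["wheel_load_N"] * s**2       # forces scale with s**2 (equal stress)
    print(f"scale 1/{round(1/s)}: wheel radius {radius*1e3:.0f} mm, wheel load {load/1e3:.1f} kN")
```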

Keywords: New test rig, rolling contact fatigue, rail, small scale.

5737 Design of a CMOS Highly Linear Front-end IC with Auto Gain Controller for a Magnetic Field Transceiver

Authors: Yeon-kug Moon, Kang-Yoon Lee, Yun-Jae Won, Seung-Ok Lim

Abstract:

This paper describes a low-voltage and low-power channel selection analog front end with continuous-time low pass filters and a highly linear programmable gain amplifier (PGA). The filters were realized as balanced Gm-C biquadratic filters to achieve a low current consumption. High linearity and a constant wide bandwidth are achieved by using a new transconductance (Gm) cell. The PGA has a voltage gain varying from 0 to 65 dB, while maintaining a constant bandwidth. A filter tuning circuit that requires an accurate time base but no external components is presented. With a 1-Vrms differential input and output, the filter achieves -85 dB THD and a 78 dB signal-to-noise ratio. Both the filter and the PGA were implemented in a 0.18 µm 1P6M n-well CMOS process. They consume 3.2 mW from a 1.8 V power supply and occupy an area of 0.19 mm².
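For context on the Gm-C biquad and the PGA gain range mentioned above, a minimal sketch follows using the textbook relation that, for equal transconductances and capacitances, the biquad pole frequency reduces to f0 = Gm/(2πC); the Gm and C values are hypothetical and are not taken from the reported design.

```python
import math

# Textbook Gm-C biquad pole frequency for equal elements, f0 = Gm / (2*pi*C),
# with hypothetical component values (not those of the reported design).
Gm = 20e-6      # transconductance, S (assumed)
C  = 4e-12      # integrating capacitance, F (assumed)

f0 = Gm / (2 * math.pi * C)
print(f"pole frequency ~ {f0/1e6:.2f} MHz")

# A PGA gain range expressed in dB for a few linear voltage gains.
for av in (1, 10, 100, 1778):
    print(f"Av = {av:5d}  ->  {20*math.log10(av):5.1f} dB")
```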

Keywords: Channel selection filters, DC offset, programmable gain amplifier, tuning circuit

5736 Energy Management System and Interactive Functions of Smart Plug for Smart Home

Authors: Win Thandar Soe, Innocent Mpawenimana, Mathieu Di Fazio, Cécile Belleudy, Aung Ze Ya

Abstract:

Intelligent electronic equipment and automation networks are the brain of high-tech energy management systems and play a critical role in the spread of smart homes. A smart home integrates technology for greater comfort, autonomy, reduced cost and energy saving. These services can be provided to home owners for managing their home appliances locally or remotely, and consequently allow them to automate their consumption intelligently and responsibly through individual or collective control systems. In this study, three smart plugs are described and one of them is tested on typical household appliances. This article proposes to collect data from the wireless technology and to extract smart data for the energy management system. The smart data quantifies three kinds of load: intermittent load, phantom load and continuous load. Phantom load is a wasted, often unnoticed, power draw of an appliance when it is connected to the mains but not in active use. Intermittent load and continuous load take into consideration the power and usage time of home appliances. By classifying the loads, this smart data reduces the communication required of the wireless sensor network used by the energy management system.
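A minimal sketch of how per-appliance power samples might be classified into the three load types discussed above; the thresholds, sampling assumptions and classification rule are illustrative and are not the scheme used in the study.

```python
from statistics import mean

# Hypothetical classifier of per-appliance load type from power samples (W)
# taken at a fixed interval. Thresholds are illustrative assumptions.
PHANTOM_W = 5.0         # below this, treat as standby/phantom draw
ON_FRACTION_CONT = 0.9  # "continuous" if drawing real power most of the time

def classify_load(samples_w):
    on = [p for p in samples_w if p > PHANTOM_W]
    if not on:
        return "phantom"                      # never exceeds standby level
    on_fraction = len(on) / len(samples_w)
    return "continuous" if on_fraction >= ON_FRACTION_CONT else "intermittent"

fridge = [90, 88, 3, 92, 95, 91, 89, 90, 94, 92]                 # compressor mostly on
kettle = [0, 0, 1800, 1750, 0, 0, 0, 0, 0, 0]                    # short bursts
tv_off = [2.1, 2.0, 2.2, 2.1, 2.0, 2.1, 2.0, 2.1, 2.2, 2.0]      # standby only

for name, s in [("fridge", fridge), ("kettle", kettle), ("tv_off", tv_off)]:
    print(name, classify_load(s), f"avg {mean(s):.1f} W")
```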

Keywords: Energy management, load profile, smart plug, wireless sensor network.

5735 Assessing Community Participation in Decision-Making Process under Co-Management: A Case Study on Hail Haor, Bangladesh

Authors: R. Ferdous

Abstract:

Power and responsibility sharing and democratic decision-making are the central ethos of co-management. It is assumed that involving the local community in the decision-making process can create a sense of ownership and responsibility in that community and motivate it towards collective action. This paper demonstrates, however, that the process of involving the local community is neither simple nor straightforward, as it is influenced by structural aspects, power relations among the actors, and socially embedded institutions. These factors shape who participates, how they participate, and how the local community maneuvers its agency in the decision-making process. To grasp the complexities that materialize in the process of participation, and to understand its inclusionary and exclusionary nature, this paper examines the subjective understandings of different stakeholders concerning participation and observes the enabling or constraining factors that affect the community's exercise of its agency.

Keywords: Participation, social embeddedness, power, structure.

5734 Quality Based Approach for Efficient Biologics Manufacturing

Authors: Takashi Kaminagayoshi, Shigeyuki Haruyama

Abstract:

To improve the manufacturing efficiency of biologics, such as antibody drugs, a quality engineering framework was designed. Within this framework, critical steps and parameters in the manufacturing process were studied. Identifying these critical steps and parameters allows a deeper understanding of manufacturing capability and lets process control standards based on actual manufacturing capability be proposed to the process development department as part of a PDCA (plan-do-check-act) cycle. This cycle can be applied to each manufacturing process so that it can be standardized, reducing the time needed to establish each new process.

Keywords: Antibody drugs, biologics, manufacturing efficiency, PDCA cycle, quality engineering.

5733 Developing Manufacturing Process for the Graphene Sensors

Authors: Abdullah Faqihi, John Hedley

Abstract:

Biosensors play a significant role in the healthcare sector and in scientific and technological progress. Developing electrodes that are easy to manufacture and deliver better electrochemical performance is advantageous for diagnostics and biosensing; such electrodes can be applied extensively in analytical tasks such as drug discovery, food safety, medical diagnostics, process control, security and defence, and environmental monitoring. A biosensor is a device that inspects the biological and chemical reactions generated by a biological sample: it carries out biological detection via a linked transducer and converts the biological response into an electrical signal. Stability, selectivity and sensitivity are the dynamic and static characteristics that dictate the quality and performance of biosensors. In this research, an experimental study of the laser scribing technique for processing graphene oxide inside a vacuum chamber is presented. The reduction of graphene oxide (GO) was achieved using laser scribing under two conditions: atmosphere and vacuum. GO solution was coated onto a LightScribe DVD, and the laser scribing technique was applied to reduce the GO layers and generate rGO. The morphological microstructures of rGO and GO were examined using scanning electron microscopy (SEM) and Raman spectroscopy. The first electrode was a traditional graphene-based electrode fabricated under normal atmospheric conditions, whereas the second was fabricated under vacuum using a vacuum chamber, the purpose being to control conditions such as air pressure and temperature during fabrication. The parameters assessed include the layer thickness and the processing environment. The results show high accuracy and repeatability, achieving low-cost production.

Keywords: Laser scribing, LightScribe DVD, graphene oxide, scanning electron microscopy.

5732 Comprehensive Assessment of Energy Efficiency within the Production Process

Authors: S. Kreitlein, N. Eder, A. Syed-Khaja, J. Franke

Abstract:

The importance of energy efficiency within production processes is increasing steadily. Unfortunately, no tools yet exist for a comprehensive assessment of energy efficiency within the production process. The Institute for Factory Automation and Production Systems at the Friedrich-Alexander-University Erlangen-Nuremberg has therefore developed two methods aimed at transparency and a quantitative assessment of energy efficiency, namely the EEV (Energy Efficiency Value) and the EPE (Energetic Process Efficiency). This paper describes the basics and the state of the art as well as the developed approaches.
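For a sense of what such a quantitative indicator can look like, the sketch below computes an energy-efficiency ratio of the minimum (theoretically required) energy to the actually consumed energy for a few process steps; the formula and the numbers are an assumed, common formulation used only for illustration and are not necessarily the EEV or EPE definitions used by the authors.

```python
# Illustrative energy-efficiency indicator: minimum required energy divided by
# energy actually consumed (assumed formulation, not necessarily the paper's EEV).
process_steps = {
    # step: (theoretical minimum energy [kJ], measured energy consumption [kJ])
    "joining":  (12.0,  48.0),
    "drying":   (250.0, 610.0),
    "handling": (0.5,   9.0),
}

for step, (e_min, e_actual) in process_steps.items():
    ratio = e_min / e_actual
    print(f"{step:9s}: efficiency indicator = {ratio:.2f}")
```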

Keywords: Energy efficiency, energy efficiency value, energetic process efficiency, production.

5731 Mobile Velocity Based Bidirectional Call Overflow Scheme in Hierarchical Cellular System

Authors: G. M. Mir, Moinuddin, N. A. Shah

Abstract:

In the age of global communications, heterogeneous networks are seen as the best strategy to ensure continuous and uninterruptible services, allowing a mobile terminal to stay connected even while migrating between coverage segments through the handoff process. With the increase of teletraffic demand in mobile cellular systems, hierarchical cellular systems have been adopted extensively for more efficient channel utilization and better QoS (Quality of Service). This paper presents a bidirectional call overflow scheme between two layers of microcells and macrocells, where handoffs are decided by the velocity of the mobile making the call. To ensure that handoff calls are given higher priority, it is assumed that guard channels are assigned in both macrocells and microcells. A hysteresis value introduced on the mobile velocity allows the mobile to remain in the same cell if its velocity changes back within the set threshold values. This reduces the number of handoffs, thereby reducing processing overhead and enhancing the quality of service to the end user.
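A minimal sketch of a velocity-based layer selection with hysteresis, in the spirit of the scheme described; the threshold values and the decision rule are illustrative assumptions rather than the paper's parameters.

```python
# Illustrative velocity-based macro/micro layer selection with hysteresis.
# Threshold values are assumptions, not the paper's parameters.
V_UP   = 15.0   # m/s: overflow to the macrocell above this speed
V_DOWN = 10.0   # m/s: return to the microcell only below this speed

def next_layer(current_layer: str, velocity: float) -> str:
    if current_layer == "micro" and velocity > V_UP:
        return "macro"              # fast mobile overflows up to the macrocell
    if current_layer == "macro" and velocity < V_DOWN:
        return "micro"              # drop back only once clearly slow again
    return current_layer            # inside the hysteresis band: stay put

layer = "micro"
for v in [5, 12, 17, 13, 12, 9, 16]:
    layer = next_layer(layer, v)
    print(f"v = {v:4.1f} m/s -> {layer}")
```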

Keywords: Hierarchical cellular systems, hysteresis, overflow, threshold.

5730 Nonlinear Control of a Continuous Bioreactor Based on Cell Population Model

Authors: Mahdi Sharifian, Mohammad Ali Fanaei

Abstract:

Saccharomyces cerevisiae (baker's yeast) can exhibit sustained oscillations during operation in a continuous bioreactor, which adversely affect its stability and productivity. Because of the heterogeneous nature of cell populations, cell population balance models can be used to capture the dynamic behavior of such cultures. In this paper an unstructured, segregated model based on the population balance equation (PBE) is used; for the simulation, the fourth-order Runge-Kutta method is used for the time dimension and three methods, finite difference, orthogonal collocation on finite elements and the Galerkin finite element method, are used for discretization of the cell-mass domain. The results indicate that orthogonal collocation on finite elements not only predicts the oscillating behavior of the cell culture but also requires much less computation time, and it is therefore preferred over the other methods. In the next step two controllers, a globally linearizing controller (GLC) and a conventional proportional-integral (PI) controller, are designed to control the total cell mass per unit volume, and the performances of these controllers are compared through simulation. The results show that although the PI controller has the simpler structure, the GLC achieves better performance.
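A minimal sketch of the control idea, in which a lumped biomass balance stands in for the full population balance model and a PI controller adjusts the dilution rate to hold the total cell mass at a setpoint, integrated with the classical fourth-order Runge-Kutta scheme mentioned above; all parameter values are illustrative assumptions, and this is neither the GLC design nor the discretized PBE of the paper.

```python
# Toy continuous-bioreactor control sketch: a lumped biomass balance
# dX/dt = (mu - D) * X stands in for the full population balance model,
# and a PI controller adjusts the dilution rate D to hold the total cell
# mass X at a setpoint. All parameter values are illustrative assumptions.
mu, X_sp = 0.3, 5.0          # specific growth rate [1/h], setpoint [g/L]
Kp, Ki   = 0.2, 0.05         # PI gains (assumed)
dt, t_end = 0.05, 60.0

def dXdt(X, D):
    return (mu - D) * X

X, integ = 2.0, 0.0
for _ in range(int(t_end / dt)):
    err = X_sp - X
    integ += err * dt
    D = max(0.0, mu - Kp * err - Ki * integ)   # lower D while X is below setpoint

    # Classical fourth-order Runge-Kutta step for the biomass balance.
    k1 = dXdt(X, D)
    k2 = dXdt(X + 0.5 * dt * k1, D)
    k3 = dXdt(X + 0.5 * dt * k2, D)
    k4 = dXdt(X + dt * k3, D)
    X += dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

print(f"final biomass: {X:.2f} g/L (setpoint {X_sp})")
```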

Keywords: Bioreactor, cell population balance, finite difference, orthogonal collocation on finite elements, Galerkin finite element, feedback linearization, PI controller.

5729 Using Combination of Optimized Recurrent Neural Network with Design of Experiments and Regression for Control Chart Forecasting

Authors: R. Behmanesh, I. Rahimi

Abstract:

The recurrent neural network (RNN) is an efficient tool for modeling production control processes as well as services. In this paper an RNN was combined with a regression model, and the combination was checked to see whether the data obtained from the model, compared with the actual data, are valid for a variables control chart. A maintenance process in a workshop of the Esfahan Oil Refining Co. (EORC) was taken for illustration. First, a regression was fitted to predict the response time of the process from the determined factors; then the error between the actual and predicted response time was used as the output of the RNN, with the same factors as its input. Finally, the data predicted by the combined model were checked against test values in statistical process control to judge whether the forecasting efficiency is acceptable. Meanwhile, a design of experiments was set up for the training process of the RNN so as to optimize it.
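The two-stage idea (regression for the mean response, an RNN for the remaining error) can be sketched as follows; the synthetic data, network size and training settings are assumptions for illustration and do not reproduce the optimized RNN or the design of experiments used in the study.

```python
import numpy as np
import torch
import torch.nn as nn
from sklearn.linear_model import LinearRegression

# Hypothetical data: process factors X and measured response times y.
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(200, 3))
y = 5 + X @ np.array([2.0, -1.0, 0.5]) + 0.3 * np.sin(10 * X[:, 0]) + rng.normal(0, 0.1, 200)

# Stage 1: regression predicts response time from the factors.
reg = LinearRegression().fit(X, y)
residual = y - reg.predict(X)          # error the RNN is asked to explain

# Stage 2: a small RNN maps the same factors to the regression residual.
class ResidualRNN(nn.Module):
    def __init__(self, hidden=8):
        super().__init__()
        self.rnn = nn.RNN(input_size=3, hidden_size=hidden, batch_first=True)
        self.out = nn.Linear(hidden, 1)
    def forward(self, x):                       # x: (batch, seq_len=1, 3)
        h, _ = self.rnn(x)
        return self.out(h[:, -1, :]).squeeze(-1)

model = ResidualRNN()
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
xb = torch.tensor(X, dtype=torch.float32).unsqueeze(1)
yb = torch.tensor(residual, dtype=torch.float32)
for _ in range(300):                            # brief full-batch training loop
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(xb), yb)
    loss.backward()
    opt.step()

# Combined forecast = regression prediction + RNN-estimated residual.
forecast = reg.predict(X) + model(xb).detach().numpy()
print("combined forecast for first 3 observations:", forecast[:3].round(2))
```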

Keywords: RNN, DOE, regression, control chart.

5728 Experimental Investigation on Freeze-Concentration Process Desalting for Highly Saline Brines

Authors: H. Al-Jabli

Abstract:

The aim of this paper was to use the freeze-melting process for disposing of highly saline brines and to confirm the performance estimates for the treatment system. A laboratory bench-scale freezing test unit was designed, constructed and tested at the Doha Research Plant (DRP) in Kuwait. The principal unit operations considered in the laboratory study are ice crystallization, separation, washing and melting. The applied process is characterized as secondary-refrigerant indirect freezing, which utilizes the normal freezing concept. Highly saline brine, with an average TDS of 250,000 ppm, was used as the feed water, and brines from Kuwait desalination plants were used in the experimental study to measure the performance of the proposed treatment system. The experimental analysis shows that the freeze-melting process is capable of dropping the TDS of the feed water from 249,482 ppm to 56,880 ppm over the two phases, with overall recovery, salt passage and salt rejection of 31.11%, 19.05% and 80.95%, respectively. The freeze-melting process is therefore encouraging for the proposed application, as the results confirm the capability of the process to reduce a major portion of the dissolved salts of the highly saline brine at a reasonable recovery. This process may compare reasonably with other brine disposal processes.

Keywords: High saline brine, freeze-melting process, ice crystallization, brine disposal process.

5727 Overriding Moral Intuitions – Does It Make Us Immoral? Dual-Process Theory of Higher Cognition Account for Moral Reasoning

Authors: Michał Białek, Simon J. Handley

Abstract:

Moral decisions are considered to be an intuitive process, with conscious reasoning used mostly to justify those intuitions. This view is described in several dual-process theories of mind developed, e.g., by Frederick and Kahneman, Stanovich, and Evans. Those theories have recently evolved into tri-process theories, proposing an additional process that makes the ultimate decision or allows paraformal processing with a focal bias. The presented experiment compares observed decision patterns with the implications of those models. Participants (n=179) considered different aspects of the trolley dilemma or its footbridge version and then decided. In the control group, 70% of participants decided to use the lever to redirect the running trolley, and 20% chose to push the fat man onto the tracks. In contrast, after the experimental manipulation almost no one decided to act, and the decision-time difference between the dilemmas disappeared. The result supports the idea of three co-working processes: an intuitive process (TASS), a paraformal process (the reflective mind) and an algorithmic process.

Keywords: Moral reasoning, moral decision, reflection, trolley problem, dual-process theory of reasoning, tri-process theory of cognition.

5726 Aspen Plus Simulation of Saponification of Ethyl Acetate in the Presence of Sodium Hydroxide in a Plug Flow Reactor

Authors: U. P. L. Wijayarathne, K. C. Wasalathilake

Abstract:

This work presents the modelling and simulation of the saponification of ethyl acetate in the presence of sodium hydroxide in a plug flow reactor using the Aspen Plus simulation software. Plug flow reactors are widely used in industry due to their non-mixing property, and their use becomes significant when there is a need for continuous large-scale or fast reactions. Plug flow reactors have a high volumetric unit conversion, as the occurrence of side reactions is minimal. In this research Aspen Plus V8.0 was used to simulate the plug flow reactor. In order to simulate the process as accurately as possible, the HYSYS Peng-Robinson EOS package was used as the property method. The results obtained from the simulation were verified against an experiment carried out in the EDIBON plug flow reactor module. The correlation coefficient (r²) was 0.98, indicating that the simulation results fit the experimental data satisfactorily. The developed model can be used as a guide for understanding the reaction kinetics of a plug flow reactor.
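For orientation, the sketch below integrates the isothermal plug-flow mole balance for this second-order saponification and compares it with the analytic result; the rate constant, feed concentration and reactor size are assumed values, not those of the Aspen Plus model or the EDIBON rig.

```python
# Illustrative isothermal PFR balance for the second-order saponification
# NaOH + CH3COOC2H5 -> products, rate r = k*Ca*Cb. With equal feed
# concentrations the balance reduces to dCa/dtau = -k*Ca**2.
k   = 0.11      # L/(mol*s), assumed rate constant
Ca0 = 0.05      # mol/L, equal NaOH and ester feed concentration (assumed)
V   = 0.5       # L, reactor volume (assumed)
q   = 0.01      # L/s, volumetric flow rate (assumed)
tau = V / q     # space time, s

# March the mole balance along the reactor with a simple Euler step
# and compare against the analytic second-order result.
n, Ca = 1000, Ca0
for _ in range(n):
    Ca += (tau / n) * (-k * Ca * Ca)

Ca_analytic = Ca0 / (1 + k * tau * Ca0)
print(f"outlet Ca: numeric {Ca:.4f} mol/L, analytic {Ca_analytic:.4f} mol/L")
print(f"conversion: {1 - Ca/Ca0:.1%}")
```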

Keywords: Aspen Plus, Modelling, Plug Flow Reactor, Simulation.

5725 Continuous Flow Experimental Set-Up for Fouling Deposit Study

Authors: A. L. Ho, N. Ab. Aziz, F. S. Taip, M. N. Ibrahim

Abstract:

The study of fouling deposition from pink guava juice (PGJ) is relatively new compared with research on milk fouling deposits. In this work, a new experimental set-up, a continuous-flow heat exchanger rig, was developed to imitate fouling formation in a heat exchanger. The set-up was operated at the industrial pasteurization temperature of PGJ, 93°C, while the flow rates, based on the experimental capacity, were 0.5 and 1 liter/min and the pasteurization period was set to 1 hour. The fouling deposit was characterized by several methods: the microstructure of the deposits was examined using ESEM; proximate analyses were performed to determine the moisture, fat, protein, fiber, ash and carbohydrate content; the hardness and stickiness of the deposit were measured using a texture analyzer; and the presence of seedstones in the pink guava juice was analyzed using a particle analyzer. The findings show that the seedstones in pink guava juice range from 168 to 200 μm and that carbohydrate is the major component (47.7% of the fouling deposit). A comparison of the hardness and stickiness of the deposits at the two flow rates showed that the fouling deposits were harder and denser at the higher flow rate. The findings from this work provide basic knowledge for further studies on fouling and cleaning of PGJ.

Keywords: Pink guava juice, fouling deposit, heat exchanger.

5724 A Study on Stochastic Integral Associated with Catastrophes

Authors: M. Reni Sagayaraj, S. Anand Gnana Selvam, R. Reynald Susainathan

Abstract:

We analyze stochastic integrals associated with a mutation process. Specifically, we describe the cell population process and derive the differential equations for the joint generating functions of the number of mutants and of their integrals, together with applications of these generating functions. We obtain the first-order moments of the two-way mutation process, i.e. the first-order moment structure of X(t) and Y(t), and the second-order moments of a one-way mutation process. We also obtain the limiting behaviour of the integrals, i.e. the limiting distributions of X(t) and Y(t).
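For concreteness, the integral functional referred to above is typically the time integral of the population process, whose first moment follows from Fubini's theorem; this is a standard relation in generic notation, not a result specific to the paper.

```latex
% Standard definition of the integral functional of a population process X(t)
% and its first moment (by Fubini's theorem); notation is generic.
Y(t) \;=\; \int_{0}^{t} X(s)\,\mathrm{d}s,
\qquad
\mathbb{E}\,[\,Y(t)\,] \;=\; \int_{0}^{t} \mathbb{E}\,[\,X(s)\,]\,\mathrm{d}s .
```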

Keywords: Stochastic integrals, single–server queue model, catastrophes, busy period.

5723 Rapid Data Acquisition System for Complex Algorithm Testing in Plastic Molding Industry

Authors: A. Tellaeche, R. Arana

Abstract:

Injection molding is a very complicated process to monitor and control. With its high complexity and many process parameters, the optimization of these systems is a very challenging problem. To meet the requirements and costs demanded by the market, there has been intense development and research with the aim of keeping the process under control. This paper outlines the latest advances in the algorithms needed for plastic injection process monitoring, and also a flexible data acquisition system, the main topic of this paper, that allows rapid implementation of complex algorithms to assess their correct performance and can be integrated into the quality control process. Finally, to demonstrate the performance achieved by this combination, a real use case is presented.

Keywords: Plastic injection, machine learning, rapid complex algorithm prototyping.

5722 Development of Performance Measures for the Implementation of Total Quality Management in Indian Industry

Authors: Perminderjit Singh, Sukhvir Singh

Abstract:

Total Quality Management (TQM) refers to management methods used to enhance quality and productivity in business organizations, and has become a frequently used term in discussions concerning quality. TQM has raised the demands on organizational policy, and customers have gained more importance in the organization's focus. TQM is considered an important management tool that helps organizations to satisfy their customers. The present research studies critical success factors including management commitment, customer satisfaction, continuous improvement, work culture and environment, supplier quality management, training and development, employee satisfaction and product/process design. A questionnaire was developed to examine the implementation of these critical success factors of total quality management in Indian industry, and was filled in by different industrial organizations. The data collected from the questionnaires are analyzed using descriptive statistics and importance indexes.

Keywords: Total quality management, critical success factor, employee satisfaction.

5721 Continuous Feature Adaptation for Non-Native Speech Recognition

Authors: Y. Deng, X. Li, C. Kwan, B. Raj, R. Stern

Abstract:

The current speech interfaces in many military applications may be adequate for native speakers, but the recognition rate drops considerably for non-native speakers (people with foreign accents), mainly because non-native speakers show large temporal and intra-phoneme variations when they pronounce the same words. The problem is further complicated by the presence of large environmental noise such as tank noise, helicopter noise, etc. In this paper we propose a novel continuous acoustic feature adaptation algorithm for on-line accent and environmental adaptation. Implemented by incremental singular value decomposition (SVD), the algorithm captures local acoustic variation and runs in real time. This feature-based adaptation method is then integrated with the conventional model-based maximum likelihood linear regression (MLLR) algorithm. Extensive experiments were performed on the NATO non-native speech corpus with a baseline acoustic model trained on native American English. The proposed feature-based adaptation algorithm improved the average recognition accuracy by 15%, while MLLR model-based adaptation achieved an 11% improvement; the corresponding word error rate (WER) reductions were 25.8% and 2.73%, compared to no adaptation. The combined adaptation achieved an overall recognition accuracy improvement of 29.5% and a WER reduction of 31.8%, compared to no adaptation.
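As a rough illustration of SVD-based feature adaptation, the sketch below projects incoming acoustic feature frames onto the dominant singular directions of a sliding window of recent frames; this batch-SVD-per-window stand-in only approximates a true incremental SVD update, and the window size, rank and synthetic features are assumptions.

```python
import numpy as np

# Rough stand-in for SVD-based feature adaptation: keep a sliding window of
# recent feature frames, take the top-k left singular directions of the
# (mean-removed) window, and reconstruct each new frame from that local
# subspace. Window size, rank and the synthetic features are assumptions;
# a true on-line method would update the SVD incrementally instead.
rng = np.random.default_rng(1)
dim, window, rank = 13, 50, 4             # e.g. 13-dim MFCC-like frames

frames = rng.normal(size=(300, dim))      # synthetic feature stream
buffer, adapted = [], []

for x in frames:
    buffer.append(x)
    if len(buffer) > window:
        buffer.pop(0)
    W = np.array(buffer)
    mu = W.mean(axis=0)
    # SVD of the centred window; rows of Vt span the local variation directions.
    _, _, Vt = np.linalg.svd(W - mu, full_matrices=False)
    B = Vt[:rank].T                       # (dim, rank) basis of local variation
    adapted.append(mu + B @ (B.T @ (x - mu)))

adapted = np.array(adapted)
print("adapted feature stream shape:", adapted.shape)
```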

Keywords: speaker adaptation; environment adaptation; robust speech recognition; SVD; non-native speech recognition

5720 Modeling of Processes Running in Radical Clusters Formed by Ionizing Radiation with the Help of Continuous Petri Nets and Oxygen Effect

Authors: J. Barilla, M. Lokajíček, H. Pisaková, P. Simr

Abstract:

The final biological effect of ionizing particles may be strongly influenced by some chemical substances present in cells, mainly in the case of low-LET radiation. The influence of oxygen may be particularly important because oxygen is always present in living cells. The corresponding processes run mainly in the chemical stage of the radiobiological mechanism.

The radical clusters formed by the densely ionizing ends of primary or secondary charged particles are mainly responsible for the final biological effect. The damage then depends on the radical concentration at the time when a cluster meets a DNA molecule, and it may be strongly influenced by the oxygen present in a cell, as oxygen may act in different directions: at small concentrations the interaction with hydrogen radicals prevails, while at higher concentrations additional efficient oxygen radicals may be formed.

The basic radical concentration in individual clusters diminishes under the influence of two parallel processes: chemical reactions and diffusion of the corresponding clusters. This simultaneous evolution may be modeled and analyzed well with the help of continuous Petri nets, and the influence of other substances present in cells during irradiation may be studied as well. Some results concerning the impact of oxygen content will be presented.

Keywords: DSB formation, chemical stage, Petri nets, radiobiological mechanism.

5719 Sparse-View CT Reconstruction Based on Nonconvex L1 − L2 Regularizations

Authors: Ali Pour Yazdanpanah, Farideh Foroozandeh Shahraki, Emma Regentova

Abstract:

The reconstruction from sparse-view projections is one of the important problems in computed tomography (CT), limited by the availability or feasibility of obtaining a large number of projections. Traditionally, convex regularizers have been exploited to improve the reconstruction quality in sparse-view CT, and the convex constraint in those problems leads to an easy optimization process. However, convex regularizers often result in a biased approximation and inaccurate reconstruction in CT problems. Here, we present a nonconvex, Lipschitz continuous and non-smooth regularization model. The CT reconstruction is formulated as a nonconvex constrained L1 − L2 minimization problem and solved through a difference of convex algorithm and the alternating direction method of multipliers, which generates a better result than L0 or L1 regularizers in CT reconstruction. We compare our method with previously reported high-performance methods which use convex regularizers such as TV, wavelet, curvelet, and curvelet+TV (CTV) on test phantom images. The results show that there are benefits in using the nonconvex regularizer in sparse-view CT reconstruction.
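To indicate how a difference-of-convex algorithm handles the L1 − L2 penalty, one standard linearization step is sketched below; the notation is generic (A the system matrix, b the projection data, λ the regularization weight) and this is the textbook DCA update, not necessarily the exact scheme of the paper.

```latex
% Generic DCA step for  min_x (1/2)||Ax - b||_2^2 + \lambda(||x||_1 - ||x||_2):
% the concave part -\lambda||x||_2 is linearized at the current iterate x^k,
% leaving a convex L1-regularized subproblem (solvable e.g. by ADMM).
x^{k+1} \;=\; \arg\min_{x}\;
  \tfrac{1}{2}\,\|Ax - b\|_2^2
  \;+\; \lambda\,\|x\|_1
  \;-\; \lambda\,\Big\langle x,\; \tfrac{x^{k}}{\|x^{k}\|_2} \Big\rangle ,
\qquad x^{k}\neq 0 .
```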

Keywords: Computed tomography, sparse-view reconstruction, L1 −L2 minimization, non-convex, difference of convex functions.

5718 The Development of the Quality Management Processes for the Building and Environment of the Basic Education Schools

Authors: Suppara Charoenpoom

Abstract:

The objectives of this research were to design and develop a quality management process for school buildings and environment. A mixed quantitative and qualitative research methodology was used. The population sample included 14 directors of primary schools. Two research tools were used: the first consisted of an in-depth interview and a questionnaire; the second consisted of the Quality Business Process, the Quality Work Procedure, and a Key Performance Indicator for each activity. The statistics used were the mean and the standard deviation. The developed quality management process for building and environment administration of basic education schools consists of one quality business process (QBP) and seven quality work processes (QWP). The experts' evaluation revealed that the process and its implementation passed the inspection with consensus, implying that the quality management process for school buildings and environment is suitable for implementation. Moreover, the level of agreement on the feasibility of implementation had means in the range of 0.64-1.00, suggesting that the design of the new plan is acceptable.

Keywords: Process, Building, Environment.

5717 Roundness Deviation Measuring Strategy at Coordinate Measuring Machines and Conventional Machines

Authors: Lenka Ocenasova, Bartosz Gapinski, Robert Cep, Linda Gregova, Branimir Barisic, Jana Novakova, Lenka Petrkovska

Abstract:

Today's technological processes make possible the surface control of produced parts, which is needed to guarantee product quality. The geometrical structure of a part's surface includes form, proportion, accuracy of shape, accuracy of size, alignment and surface topography (roughness, waviness, etc.). All these parameters depend on the technology, the production machine parameters and the material properties, but also on the human operator, etc. Every parameter contributes to the total accuracy of the part, that is, to its accuracy of shape. One of the most important accuracy-of-shape elements is roundness. This paper deals with a comparison of roundness deviations measured on coordinate measuring machines and on special single-purpose machines. It describes measurement by the discrete (discontinuous) method and by the scanning (continuous) method on coordinate measuring machines, and confronts them with the reference method used on single-purpose machines.
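As a concrete illustration of how a roundness deviation can be evaluated from discretely sampled points, the sketch below fits a least-squares reference circle and reports the radial spread; the sampled profile is synthetic, and the least-squares-circle criterion is only one of the reference-circle definitions used in practice.

```python
import numpy as np

# Roundness deviation from discretely sampled profile points, using a
# least-squares reference circle (one of several reference-circle criteria).
# The sampled profile below is synthetic, for illustration only.
theta = np.linspace(0, 2 * np.pi, 360, endpoint=False)
r_true = 25.0 + 0.004 * np.cos(3 * theta)             # 25 mm part with 8 um lobing
x = r_true * np.cos(theta) + 0.10                      # centring offset, mm
y = r_true * np.sin(theta) - 0.05

# Algebraic least-squares circle fit: x^2 + y^2 = 2*cx*x + 2*cy*y + c.
A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
cx, cy, c = np.linalg.lstsq(A, x**2 + y**2, rcond=None)[0]
R = np.sqrt(c + cx**2 + cy**2)

radii = np.hypot(x - cx, y - cy)
roundness_dev = radii.max() - radii.min()              # peak-to-valley about the LSC
print(f"centre ({cx:.3f}, {cy:.3f}) mm, radius {R:.3f} mm, "
      f"roundness deviation {roundness_dev*1000:.1f} um")
```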

Keywords: Coordinate Measuring Machines (CMM), Measuring Strategy, Roundness Deviation, Accuracy.

5716 Hazard Rate Estimation of Temporal Point Process, Case Study: Earthquake Hazard Rate in Nusatenggara Region

Authors: Sunusi N., Kresna A. J., Islamiyati A., Raupong

Abstract:

Hazard rate estimation is one of the important topics in forecasting earthquake occurrence. Forecasting earthquake occurrence belongs to statistical seismology, where the main subject is the point process. Generally, the earthquake hazard rate is estimated from the point process likelihood equation, called the Hazard Rate Likelihood of Point Process (HRLPP) approach. In this research we have developed an estimation method called hazard rate single decrement (HRSD), adapted from estimation methods used in actuarial studies. Here, each individual is associated with an earthquake whose inter-event time is exponentially distributed, and the epicenter and occurrence-time information are used to estimate the hazard rate. Finally, a case study of the earthquake hazard rate in the Nusa Tenggara region is given, and the hazard rates obtained with the HRLPP and HRSD methods are compared.
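For the exponential inter-event-time assumption mentioned above, the hazard rate is constant and its maximum-likelihood estimate is the number of events divided by the total observed inter-event time; this is the standard exponential result, stated in generic notation, not the HRSD estimator itself.

```latex
% Exponential inter-event times T ~ Exp(\lambda): constant hazard and its MLE,
% where t_1, ..., t_n are the observed inter-event times.
h(t) \;=\; \frac{f(t)}{S(t)} \;=\; \frac{\lambda e^{-\lambda t}}{e^{-\lambda t}} \;=\; \lambda ,
\qquad
\hat{\lambda} \;=\; \frac{n}{\sum_{i=1}^{n} t_i } .
```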

Keywords: Earthquake forecast, Hazard Rate, Likelihood point process, Point process.

5715 Comparison of Router Intelligent and Cooperative Host Intelligent Algorithms in a Continuous Model of Fixed Telecommunication Networks

Authors: Dávid Csercsik, Sándor Imre

Abstract:

The performance of state-of-the-art worldwide telecommunication networks strongly depends on the efficiency of the applied routing mechanism, and game-theoretical approaches to this problem offer new solutions. In this paper a new continuous network routing model is defined to describe data transfer in fixed telecommunication networks with multiple hosts. The nodes of the network correspond to routers whose latency is assumed to be traffic dependent. We propose that the whole traffic of the network can be decomposed into a finite number of tasks, which belong to various hosts. To describe their different latency sensitivities, a utility function is defined for each task. The model is used to compare router-intelligent and host-intelligent types of routing methods, corresponding to various data transfer protocols. We analyze host-intelligent routing as a transferable utility cooperative game with externalities. The main aim of the paper is to provide a framework in which the efficiency of various routing algorithms can be compared and in which the transferable utility game arising in the cooperative case can be analyzed.

Keywords: Routing, Telecommunication networks, Performance evaluation, Cooperative game theory, Partition function form games

5714 Framework for the Modeling of the Supply Chain Collaborative Planning Process

Authors: D. Pérez, M. M. E. Alemany

Abstract:

In this work, a framework to model the Supply Chain (SC) Collaborative Planning (CP) process is proposed. The main contributions of this framework are 1) the presentation of the decision view, the most important one given the characteristics of the process, alongside the physical, organisation and information views, and 2) the simultaneous consideration of the spatial and temporal integration among the different supply chain decision centres. This framework provides the basis for a realistic and integrated perspective of the supply chain collaborative planning process and also for the analytical modeling of each of its decisional activities.

Keywords: Collaborative Planning, Decision View, Distributed Decision-Making, Framework.

5713 The Auto-Tuning PID Controller for Interacting Water Level Process

Authors: Satean Tunyasrirut, Tianchai Suksri, Arjin Numsomran, Supan Gulpanich, Kitti Tirasesth

Abstract:

This paper presents an approach to designing an auto-tuning PID controller for an interacting water level process using the integral step response. The Integral Step Response (ISR) method models a dynamic process easily, conveniently and very efficiently, which makes it advantageous for designing the auto-tuned PID controller. Our scheme uses the root locus technique to design the PID controller. MATLAB is used for modeling and testing of the control system. The experimental results on the interacting water level process satisfactorily illustrate both the transient response and the steady-state response.
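To give a feel for the kind of loop being controlled, the sketch below simulates an interacting two-tank level process under a PI(D) law acting on the inlet flow; the tank areas, valve coefficients, gains and setpoint are illustrative assumptions, not the parameters of the experimental coupled-tank rig or the auto-tuned controller.

```python
import math

# Toy interacting two-tank level loop with a PI controller (the derivative
# term of the PID is set to zero here) acting on the inlet flow. All values
# are illustrative assumptions.
A1 = A2 = 0.01             # tank cross-sections, m^2
c12 = c2 = 2.0e-4          # valve coefficients (assumed)
Kp, Ki, Kd = 2.0e-4, 5.0e-6, 0.0
h_sp = 0.30                # level setpoint for tank 2, m

dt, t_end = 0.1, 3000.0
h1 = h2 = 0.05
integ, prev_err = 0.0, h_sp - h2

for _ in range(int(t_end / dt)):
    err = h_sp - h2
    integ += err * dt
    deriv = (err - prev_err) / dt
    prev_err = err
    q_in = max(0.0, Kp * err + Ki * integ + Kd * deriv)   # inlet flow, m^3/s

    # Interacting-tank mass balances: the inter-tank flow depends on both levels.
    dh = h1 - h2
    q12 = c12 * math.copysign(math.sqrt(abs(dh)), dh)
    q_out = c2 * math.sqrt(max(h2, 0.0))
    h1 += dt * (q_in - q12) / A1
    h2 += dt * (q12 - q_out) / A2

print(f"final levels: h1 = {h1:.3f} m, h2 = {h2:.3f} m (setpoint {h_sp} m)")
```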

Keywords: Coupled-Tank, Interacting water level process, PID Controller, Auto-tuning.
