Search results for: measure valued process
6098 Prediction of Solidification Behavior of Al Alloy in a Cube Mold Cavity
Authors: N. P. Yadav, Deepti Verma
Abstract:
This paper focuses on the mathematical modeling of the solidification of an Al alloy in a cube mold cavity to study the solidification behavior of the casting process. The parametric investigation of the solidification process inside the cavity was performed using a computational solidification/melting model coupled with a volume of fluid (VOF) model. An implicit filling algorithm is used in this study to understand the overall process, from the filling stage to solidification, in a model metal casting process. The model is validated against past studies under the same conditions. The solidification process is analyzed by including the effects of pouring velocity, natural convection from the wall, and the geometry of the cavity. These studies show the possibility of various defects arising during the solidification process.
Keywords: Buoyancy driven flow, natural convection driven flow, residual flow, secondary flow, volume of fluid.
6097 Clustering Protein Sequences with Tailored General Regression Model Technique
Authors: G. Lavanya Devi, Allam Appa Rao, A. Damodaram, GR Sridhar, G. Jaya Suma
Abstract:
Cluster analysis divides data into groups that are meaningful, useful, or both. Analysis of biological data is creating a new generation of epidemiologic, prognostic, diagnostic, and treatment modalities. Clustering of protein sequences is one of the current research topics in the field of computer science. Linear relations are valuable in rule discovery for a given data set, such as "if value X goes up by 1, value Y will go down by 3", etc. Classical linear regression models the linear relation of two sequences perfectly. However, if we need to cluster a large repository of protein sequences into groups whose sequences have strong linear relationships with each other, it is prohibitively expensive to compare sequences one by one. In this paper, we propose a new technique named the General Regression Model Technique Clustering Algorithm (GRMTCA) to handle the problem of clustering linear sequences. GRMT gives a measure, GR*, that tells the degree of linearity of multiple sequences without having to compare each pair of them.
Keywords: Clustering, General Regression Model, Protein Sequences, Similarity Measure.
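As a minimal illustration of the idea behind this abstract (not the GR* measure itself, whose definition is not reproduced here), a least-squares fit and its coefficient of determination quantify how linear the relation between two numeric sequences is; the data below are placeholders.

```python
import numpy as np

def linearity_r2(x, y):
    """Coefficient of determination of a least-squares line fit of y on x.
    Values near 1 indicate a strong linear relation between the sequences."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    slope, intercept = np.polyfit(x, y, 1)          # classical linear regression
    residuals = y - (slope * x + intercept)
    ss_res = np.sum(residuals ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

# Example: "if x goes up by 1, y goes down by 3" is a perfectly linear relation.
x = np.arange(10)
print(linearity_r2(x, -3 * x + 5))   # 1.0
```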
6096 Empirical Exploration of Correlations between Software Design Measures: A Replication Study
Authors: Jehad Al Dallal
Abstract:
Software engineers apply different measures to quantify the quality of software design. These measures consider artifacts developed at low or high level software design phases. The results are used to point to design weaknesses and to indicate design points that have to be restructured. Understanding the relationship among the quality measures, and among the design quality aspects considered by these measures, is important for interpreting the impact of a measure of one quality aspect on other potentially related aspects. In addition, exploring the relationship between quality measures helps to explain the impact of different quality measures on external quality aspects, such as reliability and maintainability. In this paper, we report a replication study that empirically explores the correlation between six well-known and commonly applied design quality measures. These measures consider several quality aspects, including complexity, cohesion, coupling, and inheritance. The results indicate that inheritance measures are weakly correlated with the other measures, whereas complexity, coupling, and cohesion measures are, for the most part, strongly correlated.
Keywords: Quality attribute, quality measure, software design quality, Spearman correlation.
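For reference, a Spearman correlation matrix of the kind reported in this abstract could be computed as sketched below; the measure names and values are illustrative placeholders, not the study's data.

```python
import pandas as pd

# Hypothetical per-class values of four design measures; names are placeholders.
df = pd.DataFrame({
    "complexity":        [12, 7, 30, 22, 5],
    "coupling":          [4, 2, 9, 8, 1],
    "cohesion":          [0.6, 0.8, 0.3, 0.4, 0.9],
    "inheritance_depth": [1, 2, 1, 3, 1],
})

# Rank-based (Spearman) correlation between every pair of measures.
print(df.corr(method="spearman").round(2))
```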
6095 Quantification of Heart Rate Variability: A Measure Based on Unique Heart Rates
Authors: V. I. Thajudin Ahamed, P. Dhanasekaran, A. Naseem, N. G. Karthick, T. K. Abdul Jaleel, Paul K. Joseph
Abstract:
It is established that the instantaneous heart rate (HR) of healthy humans keeps changing. Analysis of heart rate variability (HRV) has become a popular non-invasive tool for assessing the activities of the autonomic nervous system. Depressed HRV has been found in several disorders characterised by autonomic nervous dysfunction, such as diabetes mellitus (DM) and coronary artery disease. A new technique, which searches for pattern repeatability in a time series, is proposed specifically for the analysis of heart rate data. The resulting indices, termed the pattern repeatability measure and the pattern repeatability ratio, are compared with approximate entropy and sample entropy. In our analysis, based on the method developed, it is observed that heart rate variability is significantly different for DM patients, particularly for patients with diabetic foot ulcer.
Keywords: Autonomic nervous system, diabetes mellitus, heart rate variability, pattern identification, sample entropy
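Since the proposed indices are compared with approximate entropy and sample entropy, a compact sample entropy formulation for an RR-interval series is sketched below; this is a standard textbook-style implementation with synthetic data, not the authors' code.

```python
import numpy as np

def sample_entropy(series, m=2, r_factor=0.2):
    """Sample entropy SampEn(m, r) of a 1-D series (e.g. RR intervals).
    The tolerance r is a fraction of the series standard deviation."""
    x = np.asarray(series, dtype=float)
    r = r_factor * x.std()
    n = len(x)

    def count_matches(length):
        # Build overlapping templates of the given length and count pairs whose
        # Chebyshev distance is within the tolerance r (self-matches excluded).
        templates = np.array([x[i:i + length] for i in range(n - length)])
        count = 0
        for i in range(len(templates)):
            dist = np.max(np.abs(templates - templates[i]), axis=1)
            count += np.sum(dist <= r) - 1   # exclude the self-match
        return count

    b, a = count_matches(m), count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else float("inf")

rr = np.random.default_rng(0).normal(0.8, 0.05, 300)   # synthetic RR intervals (s)
print(sample_entropy(rr))
```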
6094 Comparison between Beta Wavelets Neural Networks, RBF Neural Networks and Polynomial Approximation for 1D, 2D Functions Approximation
Authors: Wajdi Bellil, Chokri Ben Amar, Adel M. Alimi
Abstract:
This paper proposes a comparison between wavelet neural networks (WNN), RBF neural networks, and polynomial approximation in terms of 1-D and 2-D function approximation. We present a novel wavelet neural network, based on Beta wavelets, for 1-D and 2-D function approximation. Our purpose is to approximate an unknown function f: R^n → R from scattered samples (x_i, y_i = f(x_i)), i = 1, ..., n, where, first, we have little a priori knowledge of the unknown function f (it lives in some infinite-dimensional smooth function space) and, second, the approximation process is performed iteratively: each new measurement of the function (x_i, f(x_i)) is used to compute a new estimate of f. Simulation results are presented to validate the generalization ability and efficiency of the proposed Beta wavelet network.
Keywords: Beta wavelets networks, RBF neural network, training algorithms, MSE, 1-D, 2-D function approximation.
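A minimal sketch of the general setting described above (approximating a scalar function from scattered samples and scoring the fit by MSE) is given below; it uses a Gaussian RBF expansion as a stand-in baseline, not the Beta wavelet network, and the target function, centers, and width are illustrative choices.

```python
import numpy as np

# Fit a Gaussian RBF expansion to scattered samples (x_i, f(x_i)) by least squares.
rng = np.random.default_rng(1)
x = rng.uniform(-3, 3, 50)                      # scattered sample locations
y = np.sinc(x) + rng.normal(0, 0.02, x.size)    # noisy measurements of an assumed f

centers = np.linspace(-3, 3, 15)
width = 0.5
Phi = np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2 * width ** 2))
weights, *_ = np.linalg.lstsq(Phi, y, rcond=None)

def f_hat(t):
    """Estimate of f at points t from the fitted RBF expansion."""
    return np.exp(-((np.atleast_1d(t)[:, None] - centers) ** 2)
                  / (2 * width ** 2)) @ weights

mse = np.mean((f_hat(x) - y) ** 2)              # MSE criterion, as in the keywords
print(f"MSE = {mse:.5f}")
```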
6093 Q-Test of Undergraduate Epistemology and Scientific Thought: Development and Testing of an Assessment of Scientific Epistemology
Authors: Matthew J. Zagumny
Abstract:
The QUEST is an assessment of scientific epistemic beliefs and was developed to measure students' intellectual development with regard to beliefs about knowledge and knowing. The QUEST utilizes Q-sort methodology, which requires participants to rate the degree to which statements describe them personally. As a measure of personal theories of knowledge, the QUEST instrument is described, with the Q-sort distribution and scoring explained. A preliminary demonstration of the QUEST assessment is described, with two samples of undergraduate students (novice/lower-division compared to advanced/upper-division students) being assessed and their average QUEST scores compared. The usefulness of an assessment of epistemology is discussed in terms of the principle that assessment tends to drive educational practice and university mission. The critical need for university and academic programs to focus on the development of students' scientific epistemology is briefly discussed.
Keywords: Scientific epistemology, critical thinking, Q-sort method, STEM undergraduates.
6092 A Context-Sensitive Algorithm for Media Similarity Search
Authors: Guang-Ho Cha
Abstract:
This paper presents a context-sensitive media similarity search algorithm. One of the central problems in media search is the semantic gap between the low-level features computed automatically from media data and the human interpretation of them. This is because the notion of similarity is usually based on high-level abstraction, but the low-level features do not always reflect human perception. Many media search algorithms have used the Minkowski metric to measure similarity between image pairs. However, such functions cannot adequately capture the characteristics of the human visual system or the nonlinear relationships in the contextual information given by images in a collection. Our search algorithm tackles this problem by employing a similarity measure and a ranking strategy that reflect the nonlinearity of human perception and the contextual information in a dataset. Similarity search in an image database based on this contextual information shows encouraging experimental results.
Keywords: Context-sensitive search, image search, media search, similarity ranking, similarity search.
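For reference, the Minkowski-metric ranking that this abstract contrasts its method against could look like the sketch below; the feature vectors and image names are hypothetical, and feature extraction is assumed to have happened elsewhere.

```python
import numpy as np

def minkowski(a, b, p=2.0):
    """Minkowski distance between two feature vectors (p=2 Euclidean, p=1 Manhattan)."""
    return np.sum(np.abs(np.asarray(a) - np.asarray(b)) ** p) ** (1.0 / p)

# Hypothetical low-level feature vectors for a query image and a small collection.
query = np.array([0.1, 0.7, 0.2])
collection = {"img_a": np.array([0.2, 0.6, 0.1]),
              "img_b": np.array([0.9, 0.1, 0.5]),
              "img_c": np.array([0.1, 0.8, 0.3])}

ranking = sorted(collection, key=lambda k: minkowski(query, collection[k], p=2))
print(ranking)   # images ordered by increasing distance from the query
```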
6091 Lean Impact Analysis Assessment Models: Development of a Lean Measurement Structural Model
Authors: Catherine Maware, Olufemi Adetunji
Abstract:
The paper is aimed at developing a model to measure the impact of Lean manufacturing deployment on organizational performance. The model will help industry practitioners assess the impact of implementing Lean constructs on organizational performance. It will also harmonize the measurement models of Lean performance with the house of Lean, which seems to have become the industry standard. The sheer number of measurement models for impact assessment of Lean implementation makes it difficult for new adopters to select an appropriate assessment model or deployment methodology. A literature review is conducted to classify the Lean performance models. Pareto analysis is used to select the Lean constructs for the development of the model. The model is further formalized through the use of Structural Equation Modeling (SEM) to define the underlying latent structure of a Lean system. The resulting impact assessment measurement model can be used to measure Lean performance and can be adopted by different industries.
Keywords: Impact measurement model, lean bundles, lean manufacturing, organizational performance.
6090 Bi-axial Stress Effects on Barkhausen Noise
Authors: G. Balogh, I. A. Szabó, P. Z. Kovács
Abstract:
Mechanical stress has a strong effect on the magnitude of the Barkhausen noise in structural steels. Because the measurements are performed at the surface of the material, for a sample sheet the full effect can be described by a biaxial stress field. The measured Barkhausen noise depends on the orientation of the exciting magnetic field relative to the axes of the stress tensor. Sample inhomogeneities, including residual stress, also modify the angular dependence of the measured Barkhausen noise. We have developed a laboratory device with a cross-like specimen for biaxial bending. The measuring head allowed excitation in two orthogonal directions. We could excite the two directions independently or simultaneously with different amplitudes. The simultaneous excitation of the two coils could be performed in phase or with a 90-degree phase shift. In principle, this allows measuring the Barkhausen noise in an arbitrary direction without moving the head, or measuring the Barkhausen noise induced by a rotating magnetic field, provided a linear superposition of the two fields can be assumed.
Keywords: Barkhausen noise, Bi-axial stress, Stress dependency, Stress measuring.
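The last two sentences of the abstract rest on superposing the fields of the two orthogonal coils; a minimal numerical sketch of that argument is given below, with arbitrary illustrative amplitudes, frequency, and phase rather than values from the described device.

```python
import numpy as np

# Two orthogonal coils driven sinusoidally: in-phase drive gives a field of fixed
# direction set by the amplitude ratio; equal amplitudes with a 90-degree phase
# shift give a rotating field (assuming linear superposition).
t = np.linspace(0, 1e-2, 1000)          # s
f = 100.0                               # Hz excitation (illustrative)
A, B, phase = 1.0, 1.0, np.pi / 2       # set phase = 0 for a fixed direction

Hx = A * np.cos(2 * np.pi * f * t)
Hy = B * np.cos(2 * np.pi * f * t + phase)

angle = np.degrees(np.arctan2(Hy, Hx))  # instantaneous field direction
print(angle[:5])    # with the 90-degree shift the direction sweeps continuously;
                    # with phase = 0 it stays at +/- arctan(B/A)
```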
6089 Optimization of Surface Roughness in Additive Manufacturing Processes via Taguchi Methodology
Authors: Anjian Chen, Joseph C. Chen
Abstract:
This paper studies a case where the targeted surface roughness of a fused deposition modeling (FDM) additive manufacturing process is improved. The process is designed to reduce or eliminate defects and to improve the process capability indices Cp and Cpk for an FDM additive manufacturing process. The baseline Cp is 0.274 and Cpk is 0.654. This research utilizes the Taguchi methodology to eliminate defects and improve the process. The Taguchi method is used to optimize the additive manufacturing process and the printing parameters that affect the targeted surface roughness of FDM additive manufacturing. The Taguchi L9 orthogonal array is used to organize the effects of the parameters (four controllable parameters and one non-controllable parameter) on the FDM additive manufacturing process. The four controllable parameters are nozzle temperature [°C], layer thickness [mm], nozzle speed [mm/s], and extruder speed [%]. The non-controllable parameter is the environmental temperature [°C]. After the optimization of the parameters, a confirmation print was produced to prove that the results reduce the amount of defects and improve the process capability index Cp from 0.274 to 1.605 and Cpk from 0.654 to 1.233 for the FDM additive manufacturing process. The final results confirmed that the Taguchi methodology is sufficient to improve the surface roughness of the FDM additive manufacturing process.
Keywords: Additive manufacturing, fused deposition modeling, surface roughness, Six-Sigma, Taguchi method, 3D printing.
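The Cp and Cpk values quoted in the abstract follow the standard capability index definitions; a short sketch of that calculation is given below, with hypothetical roughness readings and specification limits rather than the paper's data.

```python
import numpy as np

def process_capability(samples, lsl, usl):
    """Standard short-term capability indices for a two-sided specification."""
    samples = np.asarray(samples, dtype=float)
    mu, sigma = samples.mean(), samples.std(ddof=1)
    cp = (usl - lsl) / (6 * sigma)                      # spread vs. tolerance
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)         # also penalizes off-centering
    return cp, cpk

# Hypothetical surface-roughness readings (um) and specification limits.
ra = np.array([6.1, 5.8, 6.4, 6.0, 5.9, 6.2, 6.3, 6.1])
cp, cpk = process_capability(ra, lsl=5.0, usl=7.0)
print(f"Cp = {cp:.3f}, Cpk = {cpk:.3f}")
```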
6088 Supply Chain Modeling and Improving Manufacturing Industry in Developing Countries: A Research Agenda
Authors: F.B. Georgise, K. D. Thoben, M. Seifert
Abstract:
This paper presents a research agenda on SCOR model adaptation. The SCOR model is designed to measure supply chain performance and logistics impact across the boundaries of individual organizations. It is at the growing stage of its life cycle and is enjoying the leverage of becoming the industry standard. The SCOR model has been developed and widely used in the context of developed countries. This research focuses on SCOR model adaptation for the manufacturing industry in developing countries. With the necessary understanding of the characteristics, difficulties, and problems of the manufacturing industry supply chains in developing countries, we will design an adapted model with its building blocks: a business process model, performance measures, and best practices.
Keywords: Developing countries, manufacturing industry, SCOR model adaptation.
6087 Line Balancing in the Hard Disk Drive Process Using Simulation Techniques
Authors: Teerapun Saeheaw, Nivit Charoenchai, Wichai Chattinnawat
Abstract:
Simulation modeling is an easy way to build up models that represent real-life scenarios, identify bottlenecks, and enhance system performance. Using a valid simulation model may give several advantages in creating a better manufacturing design in order to improve system performance. This paper presents the results of implementing a simulation model to design a hard disk drive manufacturing process, applying line balancing to improve both the productivity and the quality of the hard disk drive process. With improved line balance efficiency, work in process decreased by 86%, output increased by an average of 80%, average time in the system decreased by 86%, and waiting time decreased by 90%.
Keywords: Line balancing, Arena, hard disk drive process, simulation, work in process (WIP).
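Line balance efficiency, as commonly defined for assembly lines, is the ratio of total work content to line capacity (number of stations times cycle time); a minimal sketch of that calculation follows, with illustrative station times rather than values from the hard disk drive study.

```python
# Line balance efficiency = total work content / (stations x cycle time).
# Station times below are illustrative, not data from the study.
station_times = [54.0, 48.5, 57.0, 51.0]     # seconds of work at each station
cycle_time = max(station_times)              # the slowest station paces the line

efficiency = sum(station_times) / (len(station_times) * cycle_time) * 100
print(f"Line balance efficiency = {efficiency:.1f} %")
```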
6086 Integration Process of Industrial Design and Engineering Design
Authors: Kazuhide Sugiyama, Hiroshi Osada
Abstract:
Lately, management strategies that put Industrial Design (ID) at their core have been recognized as more important, as technology and price alone cannot differentiate a product. The need to shorten product development time also shortens the ID development period, which necessitates management of the ID process. This research analyzes the status of the integration process of ID and Engineering Design (ED) for office equipment, which requires the collaboration of ID and ED, to clarify the issues affecting development efficiency and to propose solutions.
Keywords: Industrial Design (ID), Engineering Design (ED), Integration process, Office equipment
6085 Using Perspective Schemata to Model the ETL Process
Authors: Valeria M. Pequeno, Joao Carlos G. M. Pires
Abstract:
Data Warehouses (DWs) are repositories which contain the unified history of an enterprise for decision support. The data must be Extracted from information sources, Transformed and integrated to be Loaded (ETL) into the DW, using ETL tools. These tools focus on data movement, where the models are only used as a means to this aim. From a conceptual viewpoint, the authors want to innovate the ETL process in two ways: 1) to make compatibility between models explicit in a declarative fashion, using correspondence assertions, and 2) to identify the instances from different sources that represent the same entity in the real world. This paper presents an overview of the proposed framework to model the ETL process, which is based on the use of a reference model and perspective schemata. This approach provides the designer with a better understanding of the semantics associated with the ETL process.
Keywords: conceptual data model, correspondence assertions, data warehouse, data integration, ETL process, object relational database.
6084 An Experimental Method for Measuring Clamping Force in Bolted Connections and Effect of Bolt Threads Lubrication on Its Value
Authors: E. Hemmati Vand, R. H. Oskouei, T. N. Chakherlou
Abstract:
In this paper, the details of an experimental method to measure the clamping force at bolted connections, produced by the wrenching torque applied to tighten the nut, are presented. A simplified bolted joint consisting of a plate with a single bolt through a hole was considered for the experiments. The method was designed based on Hooke's law, measuring the compressive axial strain of a steel bush placed between the nut and the plate. In the experimental procedure, the clamping force values were calculated for seven different levels of applied torque, and this process was repeated three times for each torque level. Moreover, the effect of thread lubrication on the clamping force was studied using the same method. For both conditions (dry and lubricated threads), the relation between the torque and the clamping force is displayed in graphs.
Keywords: Clamping force, Bolted joints, Experimental method, Lubrication.
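The Hooke's-law step described in the abstract amounts to F = E·A·ε for the instrumented bush; a minimal sketch of that calculation is shown below, with illustrative bush dimensions, modulus, and strain rather than the paper's measured values.

```python
import math

# Clamping force from the measured axial strain of the steel bush (Hooke's law).
# Geometry, modulus and strain below are illustrative placeholder values.
E = 200e9            # Pa, Young's modulus of steel
d_outer = 0.020      # m, bush outer diameter
d_inner = 0.0105     # m, bush bore (clearance over an assumed M10 bolt)
strain = 350e-6      # measured compressive axial strain

area = math.pi / 4 * (d_outer**2 - d_inner**2)   # annular cross-section of the bush
clamping_force = E * area * strain               # F = E * A * epsilon
print(f"Clamping force ≈ {clamping_force / 1e3:.1f} kN")
```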
6083 Supremacy of Differential Evolution Algorithm in Designing Multiplier-Less Low-Pass FIR Filter
Authors: Abhijit Chandra, Sudipta Chattopadhyay
Abstract:
In this communication, we have attempted to design a multiplier-less low-pass finite impulse response (FIR) filter with the aid of various mutation strategies of the Differential Evolution (DE) algorithm. The impulse response coefficients of the designed FIR filter have been represented as sums or differences of powers of two. Performance of the proposed filter has been evaluated in terms of its frequency response and associated hardware cost. The supremacy of our approach has been substantiated by comparing our result with many existing multiplier-less filter design algorithms of recent interest. It has also been demonstrated that the DE-optimized filter outperforms a Genetic Algorithm (GA) based design by a large margin. The hardware efficiency of our algorithm has further been validated by implementing the filters on a Field Programmable Gate Array (FPGA) chip.
Keywords: Convergence speed, Differential Evolution (DE), error histogram, finite impulse response (FIR) filter, total power of two (TPT), zero-valued filter coefficient (ZFC).
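To illustrate the coefficient representation named in the abstract (sums or differences of powers of two), the sketch below greedily quantizes a single coefficient into signed power-of-two terms; it is independent of the DE optimization itself, and the coefficient value and term budget are placeholders.

```python
import math

def signed_powers_of_two(coeff, n_terms=3, min_exp=-8):
    """Greedily approximate a filter coefficient as a sum/difference of powers
    of two, e.g. 0.40625 -> +2^-1 -2^-3 +2^-5. Illustrative sketch only; the
    paper obtains such representations through DE optimization."""
    terms, residual = [], float(coeff)
    for _ in range(n_terms):
        if residual == 0:
            break
        exp = max(min_exp, round(math.log2(abs(residual))))
        term = math.copysign(2.0 ** exp, residual)   # nearest signed power of two
        terms.append(term)
        residual -= term
    return terms, sum(terms)

terms, approx = signed_powers_of_two(0.40625)
print(terms, approx)   # [0.5, -0.125, 0.03125] 0.40625
```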
6082 A Complexity Measure for Java Bean based Software Components
Authors: Sandeep Khimta, Parvinder S. Sandhu, Amanpreet Singh Brar
Abstract:
Traditional software product and process metrics are neither suitable nor sufficient for measuring the complexity of software components, which is ultimately necessary for quality and productivity improvement within organizations adopting CBSE. Researchers have proposed a wide range of complexity metrics for software systems. However, these metrics are not sufficient for components and component-based systems, being restricted to module-oriented and object-oriented systems. In this study, it is proposed to determine the complexity of JavaBean software components as a reflection of their quality, so that a component can be adapted accordingly to make it more reusable. The proposed metric involves only the design issues of the component and does not consider packaging and deployment complexity. In this way, the complexity of software components can be kept within certain limits, which in turn helps in enhancing quality and productivity.
Keywords: JavaBean Components, Complexity, Metrics, Validation.
6081 Assessing Community Participation in Decision-Making Process under Co-Management: A Case Study on Hail Haor, Bangladesh
Authors: R. Ferdous
Abstract:
Power and responsibility sharing and democratic decision-making are the central ethos of co-management. It is assumed that involving the local community in the decision-making process can create a sense of ownership and responsibility in that community and motivate it towards collective action. But this paper demonstrates that the process of involving the local community is not simple and straightforward, as it is influenced by structural aspects, power relations among the actors, and socially embedded institutions. These factors shape who participates, how they participate, and how the local community maneuvers its agency in the decision-making process. To grasp the complexities that materialize in the process of participation and to understand the inclusionary and exclusionary nature of participation, this paper examines the subjective understandings of different stakeholders concerning participation and, furthermore, observes the enabling or constraining factors that affect the community's exercise of agency.
Keywords: Participation, social embeddedness, power, structure.
6080 Quality Based Approach for Efficient Biologics Manufacturing
Authors: Takashi Kaminagayoshi, Shigeyuki Haruyama
Abstract:
To improve the manufacturing efficiency of biologics, such as antibody drugs, a quality engineering framework was designed. Within this framework, critical steps and parameters in the manufacturing process were studied. Identification of these critical steps and critical parameters allows a deeper understanding of manufacturing capabilities and suggests process control standards, based on actual manufacturing capabilities, to the process development department as part of a PDCA (plan-do-check-act) cycle. This cycle can be applied to each manufacturing process so that it can be standardized, reducing the time needed to establish each new process.
Keywords: Antibody drugs, biologics, manufacturing efficiency, PDCA cycle, quality engineering.
6079 A Study on Creation of Human-Based Co-Design Service Platform
Authors: Chiung-Hui Chen
Abstract:
With the approach of the digital era, various interactive service platforms and systems support human beings' needs in daily life through different contents and measures. Design strategies have gradually turned from function-based to user-oriented, and are often customized. In other words, how designers include users' value reactions in creation becomes the goal. Creative design service for interior design requires positive interaction and communication to allow users to obtain full design information, recognize the style and process of their personal needs, develop a creative service design, lower communication time and cost, and satisfy users' sense of achievement. Thus, by constructing a co-design method based on the communication between interior designers and users, this study recognizes users' real needs and provides a co-design measure for designers and users.
Keywords: Co-Design, Customized, Design Service, Interactive Genetic Algorithm, Interior Design.
6078 Time Series Regression with Meta-Clusters
Authors: Monika Chuchro
Abstract:
This paper presents a preliminary attempt to apply classification of time series using meta-clusters in order to improve the quality of regression models. In this case, clustering was performed to obtain subgroups with normal distributions from time series data on the inflow into a wastewater treatment plant, composed of several groups differing by mean value. Two simple algorithms, K-means and EM, were chosen as clustering methods. The Rand index was used to measure similarity. After this simple meta-clustering, a regression model was fitted for each subgroup. The final model was a sum of the subgroup models. The quality of the obtained model was compared with a regression model built using the same explanatory variables but with no clustering of the data. The results were compared using the determination coefficient (R2), a measure of prediction accuracy, the mean absolute percentage error (MAPE), and comparison on a linear chart. Preliminary results allow us to foresee the potential of the presented technique.
Keywords: Clustering, Data analysis, Data mining, Predictive models.
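A minimal sketch of the pipeline described above (cluster the observations into subgroups by mean value, fit one regression per subgroup, and score the combined predictions by MAPE) is given below; the synthetic inflow data and the single explanatory variable are placeholders, not the plant data.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

# Synthetic data with two regimes of different mean value, standing in for the inflow series.
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(300, 1))                      # explanatory variable
y = np.where(X[:, 0] < 5, 20 + 2 * X[:, 0], 80 + 3 * X[:, 0]) \
    + rng.normal(0, 1, 300)

# Meta-clustering on the response values separates the subgroups by mean.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(y.reshape(-1, 1))

y_pred = np.empty_like(y)
for k in np.unique(labels):
    mask = labels == k
    model = LinearRegression().fit(X[mask], y[mask])       # one regression per subgroup
    y_pred[mask] = model.predict(X[mask])

mape = np.mean(np.abs((y - y_pred) / y)) * 100             # prediction accuracy measure
print(f"MAPE = {mape:.2f} %")
```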
6077 Comprehensive Assessment of Energy Efficiency within the Production Process
Authors: S. Kreitlein, N. Eder, A. Syed-Khaja, J. Franke
Abstract:
The importance of energy efficiency within production processes is increasing steadily. Unfortunately, no tools exist or have yet been developed for a comprehensive assessment of energy efficiency within the production process. Therefore, the Institute for Factory Automation and Production Systems at the Friedrich-Alexander-University Erlangen-Nuremberg has developed two methods with the goal of achieving transparency and a quantitative assessment of energy efficiency, namely the EEV (Energy Efficiency Value) and the EPE (Energetic Process Efficiency). This paper describes the basics and state of the art as well as the developed approaches.
Keywords: Energy efficiency, energy efficiency value, energetic process efficiency, production.
6076 Theoretical Considerations for Software Component Metrics
Authors: V. Lakshmi Narasimhan, Bayu Hendradjaya
Abstract:
We have defined two suites of metrics, which cover static and dynamic aspects of component assembly. The static metrics measure the complexity and criticality of component assembly, wherein complexity is measured using the Component Packing Density and Component Interaction Density metrics. Further, four criticality conditions, namely Link, Bridge, Inheritance, and Size criticalities, have been identified and quantified. The complexity and criticality metrics are combined to form a Triangular Metric, which can be used to classify the type and nature of applications. Dynamic metrics are collected during the runtime of a complete application. Dynamic metrics are useful for identifying super-components and for evaluating the degree of utilisation of various components. In this paper, both static and dynamic metrics are evaluated using Weyuker's set of properties. The result shows that the metrics provide a valid means to measure issues in component assembly. We relate our metrics suite to McCall's Quality Model and illustrate their impact on product quality and on the management of component-based product development.
Keywords: Component Assembly, Component Based Software Engineering, CORBA Component Model, Software Component Metrics.
6075 Using Combination of Optimized Recurrent Neural Network with Design of Experiments and Regression for Control Chart Forecasting
Authors: R. Behmanesh, I. Rahimi
Abstract:
A recurrent neural network (RNN) is an efficient tool for modeling production control processes as well as services. In this paper, an RNN was combined with a regression model, and the combination was used to check whether the data obtained by the model, in comparison with actual data, are valid for a variable process control chart. A maintenance process in a workshop of the Esfahan Oil Refining Co. (EORC) was taken to illustrate the models. First, a regression was fitted to predict the response time of the process from the determined factors, and then the error between the actual and predicted response times was used as the output, with the same factors as inputs, in the RNN. Finally, using the predicted data from the combined model, it is examined whether the forecasting efficiency is acceptable for test values in statistical process control. Meanwhile, in the training process of the RNN, a design of experiments was set up so as to optimize the RNN.
Keywords: RNN, DOE, regression, control chart.
6074 Examining Herzberg's Two-Factor Theory in a Large Chinese Chemical Fiber Company
Authors: Ju-Chun Chien
Abstract:
The validity of Herzberg's Two-Factor Theory of Motivation was tested empirically by surveying 2372 chemical fiber employees in 2012. In the valid sample of 1875 respondents, the degree of overall job satisfaction was more than moderate. The most highly valued components of job satisfaction were "corporate image," "collaborative working atmosphere," and "supervisor's expertise," whereas the lowest mean score, 34.65, was for "job rotation and promotion." The top three job retention options rated by the participants were "good image of the enterprise," "good compensation," and "workplace is close to my residence." The overall evaluation of the level of thriving facilitation in the workplace reached almost "mostly agree." Participants who chose at least one motivator among their job retention options had significantly greater job satisfaction than those who chose only hygiene factors as their retention options. Therefore, Herzberg's Two-Factor Theory of Motivation was proven valid in this study.
Keywords: Employee job satisfaction, Job retention, Traditional business, Two-factor theory of motivation.
6073 Meta-requirements that Model Change
Authors: Gouri Prakash
Abstract:
One of the common problems encountered in software engineering is addressing and responding to the changing nature of requirements. Several approaches have been devised to address this issue, ranging from instilling resistance to changing requirements in order to mitigate the impact on project schedules, to developing an agile mindset towards requirements. The approach discussed in this paper conceptualizes the delta in requirements and models it, in order to plan a response to it. To provide some context, change is first formally identified and categorized as either formal change or informal change. While agile methodology facilitates informal change, the approach discussed in this paper seeks to develop the idea of facilitating formal change. Collecting and documenting meta-requirements that represent the phenomenon of change would be a proactive measure towards building a realistic cognition of the requirements entity, which can further be harnessed in the software engineering process.
Keywords: Change Management, Agile methodology, Meta-requirements.
6072 A Development of a Weight-Balancing Control System Based on Android Operating System
Authors: Rattanathip Rattanachai, Piyachai Petchyen, Kunyanuth Kularbphettong
Abstract:
This paper describes the development of a Weight-Balancing Control System based on the Android Operating System. It provides recommendations on ways of balancing a user's weight based on the daily metabolism process and individual needs, so that the user can make informed decisions on his or her weight control. The system also presents detailed nutrition information. Furthermore, it was designed to suggest to users what kinds of food they should eat and how to exercise in the right ways. We describe the design methods and functional components of this prototype. To evaluate the system performance, questionnaires for system usability and Black Box Testing were used to measure expert and user satisfaction. The results were satisfactory: mean scores for experts and users were 3.94 and 4.07, respectively.
Keywords: Weight-Balancing Control, Android Operating System, daily metabolism, Black Box Testing.
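The abstract refers to the user's daily metabolism; one widely used estimate of basal metabolic rate is the Mifflin-St Jeor equation, sketched below purely as an illustration of such a calculation, not as the formula actually used by the described Android application.

```python
def mifflin_st_jeor_bmr(weight_kg, height_cm, age_years, sex):
    """Basal metabolic rate (kcal/day) by the Mifflin-St Jeor equation; shown
    only as one common way to estimate daily metabolism, not necessarily the
    method implemented in the described system."""
    base = 10 * weight_kg + 6.25 * height_cm - 5 * age_years
    return base + (5 if sex == "male" else -161)

print(mifflin_st_jeor_bmr(70, 175, 30, "male"))   # ~1648.75 kcal/day
```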
6071 Variable Rough Set Model and Its Knowledge Reduction for Incomplete and Fuzzy Decision Information Systems
Authors: Da-kuan Wei, Xian-zhong Zhou, Dong-jun Xin, Zhi-wei Chen
Abstract:
Information systems with incomplete attribute values and fuzzy decisions commonly exist in practical problems. On the basis of the notion of the variable precision rough set model for incomplete information systems and the rough set model for incomplete and fuzzy decision information systems, the variable rough set model for incomplete and fuzzy decision information systems is constructed; it is the generalization of both the variable precision rough set model for incomplete information systems and the rough set model for incomplete and fuzzy decision information systems. The knowledge reduction and a heuristic algorithm, built on the method and theory of precision reduction, are proposed.
Keywords: Rough set, Incomplete and fuzzy decision information system, Limited valued tolerance relation, Knowledge reduction, Variable rough set model.
6070 A Case Study of an Online Assignment Submission System at UOM
Authors: V. Ramnarain-Seetohul, J. Abdool Karim, A. Amir
Abstract:
Almost all universities include some form of assignment in their courses. The assignments are carried out either in groups or individually. To manage these submitted assignments effectively, a well-designed assignment submission system is needed, hence the need for an online assignment submission system to facilitate the distribution and collection of assignments on due dates. The objective of such a system is to facilitate the interaction of lecturers and students for assessment and grading purposes. The aim of this study was to create a web-based online assignment submission system for the University of Mauritius. The system was created to eliminate the traditional process of handing out an assignment and collecting the answers. Lecturers can also create automated assessments to assess students online. Moreover, the online submission system includes an automatic mailing system which acts as a reminder for students about the deadlines of posted assignments. The system was tested to measure its acceptance rate among both students and lecturers.
Keywords: Assignment, assessment, online, submission
6069 Overriding Moral Intuitions – Does It Make Us Immoral? Dual-Process Theory of Higher Cognition Account for Moral Reasoning
Authors: Michał Białek, Simon J. Handley
Abstract:
Moral decisions are considered an intuitive process, while conscious reasoning is mostly used only to justify those intuitions. This problem is described in a few different dual-process theories of mind, developed, e.g., by Frederick and Kahneman, Stanovich, and Evans. Those theories have recently evolved into tri-process theories, with a proposed process that makes the ultimate decision or allows paraformal processing with focal bias. The presented experiment compares decision patterns to the implications of those models. In the presented study, participants (n=179) considered different aspects of the trolley dilemma or its footbridge version and then decided. Results show that in the control group 70% of people decided to use the lever to change the tracks of the running trolley, and 20% chose to push the fat man down onto the tracks. In contrast, after the experimental manipulation almost no one decided to act. Also, the decision time difference between the dilemmas disappeared after the experimental manipulation. The results support the idea of three co-working processes: an intuitive process (TASS), a paraformal process (the reflective mind), and an algorithmic process.
Keywords: Moral reasoning, moral decision, reflection, trolley problem, dual-process theory of reasoning, tri-process theory of cognition.