Search results for: pragmatic factors; Wason selection task

1648 An Attribute-Centre Based Decision Tree Classification Algorithm

Authors: Gökhan Silahtaroğlu

Abstract:

Decision tree algorithms occupy a very important place among the classification models of data mining. In the literature, algorithms use the entropy concept or the Gini index to form the tree. The shape of the classes and their closeness to each other are among the factors that affect the performance of the algorithm. In this paper we introduce a new decision tree algorithm which employs a data (attribute) folding method and the variation of the class variables over the branches to be created. A comparative performance analysis has been carried out between the proposed algorithm and C4.5.
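
As a point of reference for the baselines named above, the sketch below shows the standard entropy and Gini impurity measures and the split gain they induce; the labels and split are hypothetical, and the proposed attribute-folding criterion itself is not reproduced here.

```python
import numpy as np

def gini(labels):
    """Gini index of a label array: 1 - sum(p_k^2)."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def entropy(labels):
    """Shannon entropy of a label array: -sum(p_k * log2(p_k))."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def split_gain(parent, left, right, impurity=gini):
    """Impurity reduction achieved by splitting `parent` into `left` and `right`."""
    n = len(parent)
    weighted = (len(left) / n) * impurity(left) + (len(right) / n) * impurity(right)
    return impurity(parent) - weighted

# Hypothetical binary labels and a candidate split that separates the classes fairly well.
parent = np.array([0, 0, 0, 1, 1, 1, 1, 0])
left, right = parent[:4], parent[4:]
print(split_gain(parent, left, right, gini), split_gain(parent, left, right, entropy))
```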

Keywords: Classification, decision tree, split, pruning, entropy, gini.

1647 A New Distribution Network Reconfiguration Approach using a Tree Model

Authors: E. Dolatdar, S. Soleymani, B. Mozafari

Abstract:

Power loss reduction is one of the main targets in the power industry, so in this paper the problem of finding the optimal configuration of a radial distribution system for loss reduction is considered. Optimal reconfiguration involves the selection of the best set of branches to be opened, one from each loop, to reduce resistive line losses and relieve overloads on feeders by shifting load to adjacent feeders. However, since there are many candidate switching combinations in the system, feeder reconfiguration is a complicated problem. In this paper a new approach is proposed based on a simple optimum loss calculation that determines optimal trees of the given network. From graph theory, a distribution network can be represented by a graph consisting of a set of nodes and branches. The problem can thus be viewed as one of determining an optimal tree of the graph that simultaneously ensures the radial structure of each candidate topology. In this method a refined genetic algorithm is also set up, and some improvements are made to the chromosome coding. An implementation of the algorithm presented in [7], with modifications to the load flow program, is applied and compared with the proposed method. In [7] an algorithm is proposed in which the choice of the switches to be opened is based on simple heuristic rules; it reduces the number of load flow runs, reduces the switching combinations to a smaller number, and gives the optimum solution. To demonstrate the validity of these methods, computer simulations with the PSAT and MATLAB programs are carried out on the 33-bus test system. The results show that the proposed method performs better than the method of [7] as well as other methods.
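
A toy sketch of the spanning-tree framing described above, assuming a hypothetical 4-bus meshed network with invented line resistances and branch currents and the NetworkX package; it enumerates radial (tree) configurations and scores approximate resistive losses, and it is not the authors' refined genetic algorithm or the heuristic of [7].

```python
import itertools
import networkx as nx

# Hypothetical meshed network: nodes are buses, edges carry (resistance, assumed branch current).
edges = {  # (u, v): (resistance_ohm, branch_current_A)
    (0, 1): (0.10, 40), (1, 2): (0.15, 25),
    (2, 3): (0.12, 20), (3, 0): (0.20, 30), (1, 3): (0.18, 15),
}
G = nx.Graph()
G.add_edges_from(edges)

def losses(tree_edges):
    """Approximate resistive loss sum(I^2 * R) over the branches kept closed."""
    return sum(edges[e][1] ** 2 * edges[e][0] for e in tree_edges)

best = None
# Enumerate candidate configurations: keep |nodes|-1 branches and require a radial (tree) topology.
for keep in itertools.combinations(edges, len(G.nodes) - 1):
    T = nx.Graph(list(keep))
    if T.number_of_nodes() == len(G.nodes) and nx.is_tree(T):
        cand = (losses(keep), keep)
        best = cand if best is None or cand < best else best

print(best)  # (loss, kept branches) of the lowest-loss radial configuration found
```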

Keywords: Distribution System, Reconfiguration, Loss Reduction , Graph Theory , Optimization , Genetic Algorithm

1646 Supplementation of Vascular Endothelial Growth Factor during in vitro Maturation of Porcine Cumulus Oocyte Complexes and Subsequent Developmental Competence after Parthenogenesis and in vitro Fertilization

Authors: D. Biswas, Sang H. Hyun

Abstract:

In the mammalian reproductive tract, the oviduct secretes a large number of growth factors and cytokines that create an optimal micro-environment for the initial stages of preimplantation embryos. Secretion of these growth factors is stage-specific. Among them, VEGF is a potent mitogen for vascular endothelium and stimulates vascular permeability. Apart from angiogenesis, VEGF in the oviduct may be involved in regulating oocyte maturation and the subsequent developmental process during embryo production in vitro. Experiment 1 was designed to evaluate the effect of VEGF during IVM of porcine COCs and the subsequent developmental ability after PA and SCNT; the results indicated that maturation rates among the different VEGF concentrations were not significantly different. In experiment 2, total intracellular GSH concentrations of oocytes matured with VEGF (5-50 ng/ml) were increased significantly compared to the control and the 500 ng/ml VEGF group. In experiment 3, the blastocyst formation rates and total cell number per blastocyst after parthenogenesis of oocytes matured with VEGF (5-50 ng/ml) were increased significantly compared to the control and the 500 ng/ml VEGF group. Similarly, in experiment 4, the blastocyst formation rate and total cell number per blastocyst after SCNT and IVF of oocytes matured with VEGF (5 ng/ml) were significantly higher than those of oocytes matured without VEGF. In experiment 5, the pronuclear formation rate was evaluated 10 hours after the onset of IVF. Monospermy was significantly higher in VEGF-matured oocytes than in the control, while polyspermy and sperm penetration per oocyte were significantly higher in the control group than in the VEGF-matured oocytes. Supplementation with VEGF during IVM significantly improved male pronuclear formation compared with the control. In experiment 6, type III cortical granule distribution in oocytes was more common in VEGF-matured oocytes than in the control. In conclusion, the present study suggests that supplementation of VEGF during IVM may enhance the developmental potential of porcine in vitro embryos through an increase in the intracellular GSH level, higher MPN formation, and an increased fertilization rate as a consequence of improved cytoplasmic maturation.

Keywords: angiogenesis, GSH, monospermy, VEGF

1645 Factors Affecting Test Automation Stability and Their Solutions

Authors: Nagmani Lnu

Abstract:

Test automation is a vital requirement for any organization that wants to release products to its customers faster. In most cases, an organization has an approach to developing automation but struggles to maintain it, resulting in an increased number of flaky tests and reduced return on investment and stakeholder confidence. The challenges multiply when the automation covers User Interface (UI) behaviors. This paper describes the approaches taken to identify the root causes of automation instability in an extensive payments application, and the best practices used to address them through processes, tools, and technologies, resulting in a 75% reduction of effort.

Keywords: Automation stability, test stability, flaky test, test quality, test automation quality.

1644 Property of Polyurethane: from Soy-derived Phosphate Ester

Authors: Flora Elvistia Firdaus

Abstract:

Polyurethane foams (PUF) were formed by the chemical reaction of a polyol and an isocyanate. The polyol was manufactured by ring-opening hydrolysis of epoxidized soybean oil in the presence of phosphoric acid under varying experimental conditions. Other factors in the foam formulation, such as water content and surfactant, were kept constant. The effect of the amount of solvents, phosphoric acid, and their derivatives in the foam formulation on the properties of the polyurethane foams was studied. The properties of the material were characterized by a number of parameters: the water content of the prepared polyol, the polymer density, and the cellular structure.

Keywords: soy, polyurethane, phosphoric acid

1643 A Probabilistic Reinforcement-Based Approach to Conceptualization

Authors: Hadi Firouzi, Majid Nili Ahmadabadi, Babak N. Araabi

Abstract:

Conceptualization strengthens intelligent systems in generalization skill, effective knowledge representation, real-time inference, and managing uncertain and indefinite situations, in addition to facilitating knowledge communication for learning agents situated in the real world. Concept learning introduces a form of abstraction by which the continuous state space is organized into entities called concepts, which are connected to the action space and thus provide a compact representation of that complex space. Among computational concept learning approaches, action-based conceptualization is favored because of its simplicity and its mirror-neuron foundations in neuroscience. In this paper, a new biologically inspired concept learning approach based on a probabilistic framework is proposed. This approach exploits and extends the mirror neuron's role in conceptualization for a reinforcement learning agent in nondeterministic environments. In the proposed method, instead of building a huge body of numerical knowledge, the concepts are learnt gradually from rewards through interaction with the environment. Moreover, the probabilistic formation of the concepts is employed to deal with the uncertain and dynamic nature of real problems, in addition to providing the ability to generalize. These characteristics as a whole distinguish the proposed learning algorithm from both a pure classification algorithm and typical reinforcement learning. Simulation results show the advantages of the proposed framework in terms of convergence speed as well as generalization and asymptotic behavior, owing to its use of both successful and failed attempts through the received rewards. Experimental results, on the other hand, show the applicability and effectiveness of the proposed method in continuous and noisy environments for a real robotic task such as maze navigation, as well as the benefits of implementing an incremental learning scenario in artificial agents.

Keywords: Concept learning, probabilistic decision making, reinforcement learning.

1642 Angiographic Evaluation of ETT (Treadmill) Positive Patients in a Tertiary Care Hospital of Bangladesh

Authors: Syed Dawood Md. Taimur, Saidur Rahman Khan, Farzana Islam

Abstract:

The aim was to evaluate the factors that predict coronary artery disease in patients with a positive Exercise Tolerance Test (ETT, treadmill) result, together with the corresponding coronary angiographic findings. This descriptive study was conducted at the Department of Cardiology, Ibrahim Cardiac Hospital & Research Institute, Dhaka, Bangladesh, from 1st January 2014 to 31st August 2014. All patients who had undergone ETT (treadmill) for the diagnosis of chest pain were studied. One hundred and four patients underwent coronary angiography after a positive treadmill result. Patients were divided into two groups depending on the angiographic findings: patients with a positive treadmill test who have coronary artery involvement are called true positive, and those without involvement are called false positive. The two groups were compared with each other. Out of 104 patients, 81 (77.9%) had a true positive ETT and 23 (22.1%) had a false positive ETT. The mean age of patients with a positive ETT was 53.46±8.06 years; the mean age was 53.63±8.36 years for males and 52.87±7.0 years for females. Sixty-nine (85.19%) male patients and twelve (14.81%) female patients had a true positive ETT, whereas 15 (65.21%) males and 8 (34.79%) females had a false positive ETT; the difference between the sexes in the comparison of true and false positive ETT was statistically significant (p<0.032). Risk factors such as diabetes mellitus, hypertension, dyslipidemia, family history, and smoking were assessed in these patients. Hypertension was significantly associated with a true positive result (p<0.004), as were diabetes and dyslipidemia (p<0.032 and p<0.030, respectively). Among true positive patients, 68 (83.95%) had a family history and 52 (64.20%) were smokers; family history was statistically significant between the two groups (p<0.017), as was smoking (p<0.012). Forty-six true positive patients achieved the target heart rate (THR), which was not statistically significant (p<0.138), and 79 true positive patients had an abnormal resting ECG, which was significant (p<0.036). Amongst the vessels involved, the most common was the LAD in 55 patients (67.90%), followed by the LCX in 42 (51.85%), the RCA in 36 (44.44%), and the LMCA in 9 (11.11%). Forty patients (49.38%) had SVD, 26 (30.10%) had DVD, 15 (18.52%) had TVD, and 23 had normal coronary arteries. It can be concluded that female patients who have a positive ETT with a normal resting ECG and who achieved the target heart rate are likely to have a false positive test result. Conversely, male patients with an abnormal resting ECG who did not achieve the THR on a symptom-limited ETT and who have hypertension, diabetes, dyslipidemia, a family history, or smoking are likely to have a true positive treadmill test result.

Keywords: Exercise tolerance test, Coronary artery disease, Coronary angiography, True positive, False positive.

1641 The Effect of Impinging WC-12Co Particles Temperature on Thickness of HVOF Thermally Sprayed Coatings

Authors: M. Jalali Azizpour, H. Mohammadi Majd

Abstract:

In this paper, the effect of WC-12Co particle temperature in the HVOF thermal spraying process on the coating thickness has been studied. The statistical results show that the spray distance and the oxygen-to-fuel ratio are factors that influence particle characteristics and the thickness of HVOF thermally sprayed coatings. A Spray Watch diagnostic system, scanning electron microscopy (SEM), X-ray diffraction and a thickness measuring system were used for this purpose.

Keywords: HVOF, Temperature, Thickness, Velocity, WC-12Co.

1640 Parametric Approach for Reserve Liability Estimate in Mortgage Insurance

Authors: Rajinder Singh, Ram Valluru

Abstract:

The Chain Ladder (CL) method, the Expected Loss Ratio (ELR) method and the Bornhuetter-Ferguson (BF) method, in addition to more complex transition-rate modeling, are commonly used actuarial reserving methods in general insurance. There is limited published research about their relative performance in the context of Mortgage Insurance (MI). In our experience, these traditional techniques pose unique challenges and do not provide stable claim estimates for medium to longer term liabilities. The relative strengths and weaknesses among the various alternative approaches revolve around: stability of the recent loss development pattern, sufficiency and reliability of loss development data, and agreement or disagreement between reported losses to date and the ultimate loss estimate. The CL method results in volatile reserve estimates, especially for accident periods with little development experience. The ELR method breaks down especially when ultimate loss ratios are not stable and predictable. While the BF method provides a good tradeoff between the loss development approach (CL) and ELR, it generates claim development and ultimate reserves that are disconnected from the ever-to-date (ETD) development experience for some accident years that have more development experience. Further, BF is based on a subjective a priori assumption. The fundamental shortcoming of these methods is their inability to model exogenous factors, like the economy, which impact various cohorts at the same chronological time but at staggered points along their lifetime development. This paper proposes an alternative approach of parametrizing the loss development curve and using logistic regression to generate the ultimate loss estimate for each homogeneous group (accident year or delinquency period). The methodology was tested on an actual MI claim development dataset where various cohorts followed a sigmoidal trend, but levels varied substantially depending upon the economic and operational conditions during the development period spanning many years. The proposed approach provides the ability to incorporate such exogenous factors indirectly and produce more stable loss forecasts for reserving purposes compared to the traditional CL and BF methods.
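
As a rough illustration of the parametric idea, one can fit a logistic (sigmoidal) development curve to a cohort's cumulative losses and read off the implied ultimate; the data, starting values and parameter names below are invented, SciPy is assumed available, and this is not the authors' production model.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, ultimate, k, t0):
    """Sigmoidal development curve: cumulative loss at development age t."""
    return ultimate / (1.0 + np.exp(-k * (t - t0)))

# Hypothetical ever-to-date cumulative losses for one accident-year cohort.
dev_age = np.arange(1, 9)                       # development periods observed so far
cum_loss = np.array([5, 14, 30, 52, 70, 82, 88, 91], dtype=float)

params, _ = curve_fit(logistic, dev_age, cum_loss, p0=[100.0, 1.0, 4.0])
ultimate, k, t0 = params
print(f"fitted ultimate ~ {ultimate:.1f}, implied reserve ~ {ultimate - cum_loss[-1]:.1f}")
```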

Keywords: Actuarial loss reserving techniques, logistic regression, parametric function, volatility.

1639 How CATV Survive in the Era of Convergence?

Authors: J. Park, J. Song, B. Lee

Abstract:

The purpose of this paper is to analyze the case of Pivot in the U.S. and to suggest an appropriate model, including entry strategies and success factors, for the QPS of cable TV operators. Telecommunication companies have been operating QPS, including IPTV service, which enables them to cross over into broadcasting. In these circumstances, cable TV operators are concerned and are planning to add QPS with a mobile service. Based on Porter's five forces model, an analytical framework is proposed for MVNOs in the cable TV industry in the United States. As a result of this study, an MVNO in the cable TV industry must have a clear killer application supported by sufficient content. Finally, a direction for the future cable TV industry is proposed.

Keywords: CATV, MVNO, Pivot, QPS

1638 Long-Term Deformations of Concrete Structures

Authors: A. Brahma

Abstract:

Drying is a phenomenon that accompanies the hardening of hydraulic materials. This study is concerned with the modelling of the drying shrinkage of hydraulic materials and the prediction of the rate of spontaneous deformation of hydraulic materials during hardening. The model developed takes into consideration the main factors affecting drying shrinkage. There was agreement between the drying shrinkage predicted by the developed model and the experimental results. Finally, we show that the developed model correctly describes the evolution of the drying shrinkage of high-performance concretes.

Keywords: Drying, hydraulic concretes, shrinkage, modeling, prediction.

1637 Named Entity Recognition using Support Vector Machine: A Language Independent Approach

Authors: Asif Ekbal, Sivaji Bandyopadhyay

Abstract:

Named Entity Recognition (NER) aims to classify each word of a document into predefined target named entity classes and is nowadays considered fundamental for many Natural Language Processing (NLP) tasks such as information retrieval, machine translation, information extraction, question answering systems and others. This paper reports the development of a NER system for Bengali and Hindi using Support Vector Machines (SVM). Though this state-of-the-art machine learning technique has been widely applied to NER in several well-studied languages, its use for Indian languages (ILs) is very new. The system makes use of the different contextual information of the words along with a variety of features that are helpful in predicting the four named entity (NE) classes: Person name, Location name, Organization name and Miscellaneous name. We have used annotated corpora of 122,467 tokens of Bengali and 502,974 tokens of Hindi tagged with the twelve different NE classes defined as part of the IJCNLP-08 NER Shared Task for South and South East Asian Languages (SSEAL). In addition, we have manually annotated 150K wordforms of the Bengali news corpus, developed from the web archive of a leading Bengali newspaper. We have also developed an unsupervised algorithm to generate lexical context patterns from a part of the unlabeled Bengali news corpus. These lexical patterns have been used as features of the SVM in order to improve the system performance. The NER system has been tested with gold standard test sets of 35K and 60K tokens for Bengali and Hindi, respectively. Evaluation has demonstrated recall, precision, and f-score values of 88.61%, 80.12%, and 84.15%, respectively, for Bengali and 80.23%, 74.34%, and 77.17%, respectively, for Hindi. Results show an improvement in the f-score of 5.13% with the use of context patterns. A statistical analysis, ANOVA, is also performed to compare the performance of the proposed NER system with that of an existing HMM-based system for both languages.
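
A toy illustration of the general setup, context-window features fed to an SVM classifier, assuming scikit-learn is available; the two-sentence corpus, feature template and tag set below are invented and are far smaller than the IJCNLP-08 scheme and corpora used in the paper.

```python
from sklearn.feature_extraction import DictVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

def token_features(tokens, i):
    """Very small context window: current word plus immediate neighbours."""
    return {
        "w0": tokens[i].lower(),
        "w-1": tokens[i - 1].lower() if i > 0 else "<s>",
        "w+1": tokens[i + 1].lower() if i < len(tokens) - 1 else "</s>",
        "is_title": tokens[i].istitle(),
    }

# Hypothetical two-sentence training corpus with IOB-style NE tags.
sents = [(["Rabindranath", "lived", "in", "Kolkata"], ["B-PER", "O", "O", "B-LOC"]),
         (["Meera", "works", "at", "Infosys"],        ["B-PER", "O", "O", "B-ORG"])]

X = [token_features(toks, i) for toks, tags in sents for i in range(len(toks))]
y = [tag for _, tags in sents for tag in tags]

model = make_pipeline(DictVectorizer(), LinearSVC())
model.fit(X, y)
print(model.predict([token_features(["She", "visited", "Delhi"], 2)]))
```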

Keywords: Named Entity (NE), Named Entity Recognition (NER), Support Vector Machine (SVM), Bengali, Hindi.

1636 An Incomplete Factorization Preconditioner for LMS Adaptive Filter

Authors: Shazia Javed, Noor Atinah Ahmad

Abstract:

In this paper an efficient incomplete factorization preconditioner is proposed for the Least Mean Squares (LMS) adaptive filter. The proposed preconditioner is approximated from a priori knowledge of the factors of the input correlation matrix using an incomplete strategy, motivated by the sparsity pattern of the upper triangular factor in the QRD-RLS algorithm. The convergence properties of the IPLMS algorithm are comparable with those of the transform domain LMS (TDLMS) algorithm. Simulation results show the efficiency and robustness of the proposed algorithm with reduced computational complexity.
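
A schematic of the general preconditioned-LMS idea on a correlated (Markov-like) input; note that the incomplete factor of the proposed IPLMS is replaced here by an exact Cholesky factor of an assumed correlation model, purely for illustration, so this is not the IPLMS algorithm itself.

```python
import numpy as np

rng = np.random.default_rng(0)
N, n_taps, mu = 4000, 8, 0.05

# Correlated (first-order Markov) input and a known FIR system to identify.
x = np.zeros(N)
for t in range(1, N):
    x[t] = 0.9 * x[t - 1] + rng.standard_normal()
h_true = rng.standard_normal(n_taps)
d = np.convolve(x, h_true)[:N] + 0.01 * rng.standard_normal(N)

# Assumed input correlation model and its (here: complete) Cholesky factor,
# standing in for the incomplete factor of the proposed preconditioner.
R = np.array([[0.9 ** abs(i - j) for j in range(n_taps)] for i in range(n_taps)])
L = np.linalg.cholesky(R)
P = np.linalg.inv(L @ L.T)                 # preconditioner, approximately R^{-1}

w = np.zeros(n_taps)
for t in range(n_taps, N):
    u = x[t - n_taps + 1:t + 1][::-1]      # current tap-input vector
    e = d[t] - w @ u
    w += mu * e * (P @ u)                  # preconditioned LMS update
print(np.round(w - h_true, 3))             # residuals close to zero indicate convergence
```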

Keywords: Autocorrelation matrix, Cholesky's factor, eigenvalue spread, Markov input.

1635 Effect of Environmental Changes in Working Heart Rate among Industrial Workers: An Ergonomic Interpretation

Authors: P. Mukhopadhyay, N. C. Dey

Abstract:

Occupational health hazards are a very common concern in every emerging country. Along with the unorganized sector, most organized sectors, including government industries, suffer from this affliction. In addition to workload, seasonal changes also have some impact on the working environment. With this focus in mind, one hundred male industrial workers directly involved in the task of Periodic Overhauling (POH) in a fabricating workshop in the public domain were selected for this research work. They were studied during work periods throughout the different seasons of a year. For each season, the participants' working heart rate (WHR) was measured and compared with the standards given by different nationally and internationally recognized agencies, i.e., the World Health Organization (WHO) and the American Conference of Governmental Industrial Hygienists (ACGIH). The environmental parameters, i.e., dry bulb temperature (DBT), wet bulb temperature (WBT), globe temperature (GT), natural wet bulb temperature (NWB), relative humidity (RH), wet bulb globe temperature (WBGT), air velocity (AV) and effective temperature (ET), were recorded throughout the seasons to critically observe the effect of seasonal changes on the WHR of the workers. The effect of environmental changes on the WHR of the workers is striking: the percentages of workers who fall into the 'very heavy' workload category are 83.33%, 66.66% and 16.66% in the summer, rainy and winter seasons, respectively. Continuing in this type of job profile pushes the workers towards occupational disorders and absenteeism. This results in lower production rates, while costs due to medical claims also weaken the industry's economic condition. In this circumstance, the authors focus on remedial measures from an ergonomic angle by proposing a new work/rest regimen and introducing engineering controls along with management controls which may help the workers and, consequently, the management as well.
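
For reference, the WBGT index listed among the recorded parameters is conventionally computed as a weighted sum of the measured temperatures. The weightings below are the commonly cited ISO 7243 forms, assumed here since the abstract does not state which form was applied, and the readings are hypothetical.

```python
def wbgt_outdoor(nwb, gt, dbt):
    """WBGT with solar load: 0.7*NWB + 0.2*GT + 0.1*DBT (commonly cited ISO 7243 form)."""
    return 0.7 * nwb + 0.2 * gt + 0.1 * dbt

def wbgt_indoor(nwb, gt):
    """WBGT without solar load: 0.7*NWB + 0.3*GT."""
    return 0.7 * nwb + 0.3 * gt

# Hypothetical summer readings inside the workshop (degrees Celsius).
print(wbgt_indoor(nwb=28.5, gt=34.0))   # about 30.2, a level typically limiting for heavy work
```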

Keywords: Environmental changes, industrial worker, working heart rate, workload, occupational health hazard.

1634 The Impact Factors of the Environmental Pollution and Workers Health in Printing Industry

Authors: Kiurski J., Marić B., Djaković V., Adamović S., Oros I., Krstić J.

Abstract:

This paper presents a study of the parameters affecting environmental protection in the printing industry. The paper also compares LCA studies performed within the printing industry in order to identify common practices, limitations, areas for improvement, and opportunities for standardization. This comparison is focused on the data sources and methodologies used in the register of printing pollutants. The presented concepts, methodology and results represent a contribution to sustainable development management. Furthermore, the paper analyzes the results of the quantitative identification of hazardous substances emitted by the printing industry of Novi Sad.

Keywords: LCA, parameters of pollution, printing industry, register

1633 3D Modeling Approach for Cultural Heritage Structures: The Case of Virgin of Loreto Chapel in Cusco, Peru

Authors: Rony Reátegui, Cesar Chácara, Benjamin Castañeda, Rafael Aguilar

Abstract:

Nowadays, Heritage Building Information Modeling (HBIM) is considered an efficient tool to represent and manage information on Cultural Heritage (CH). The basis of this tool relies on a 3D model generally obtained from a Cloud-to-BIM procedure. There are different methods to create an HBIM model, ranging from manual modeling based on the point cloud to the automatic detection of shapes and the creation of objects. The selection of these methods depends on the desired Level of Development (LOD), Level of Information (LOI) and Grade of Generation (GOG), as well as on the availability of commercial software. This paper presents the 3D modeling of a stone masonry chapel using Recap Pro, Revit and the Dynamo interface, following a three-step methodology. The first step consists of the manual modeling of simple structural elements (e.g., regular walls, columns, floors, wall openings, etc.) and architectural elements (e.g., cornices, moldings and other minor details) using the point cloud as reference. Then, Dynamo is used for the generative modeling of complex structural elements such as vaults, infills and domes. Finally, semantic information (e.g., materials, typology, state of conservation, etc.) and pathologies are added within the HBIM model as text parameters and generic model families, respectively. The application of this methodology allows the documentation of CH following a relatively simple process that ensures adequate LOD, LOI and GOG levels. In addition, the easy implementation of the method, as well as the fact that only one BIM software package, with its respective plugin, is used for the scan-to-BIM modeling process, means that this methodology can be adopted by a larger number of users with intermediate knowledge and limited resources, since the BIM software used has a free student license.

Keywords: Cloud-to-BIM, cultural heritage, generative modeling, HBIM, parametric modeling, Revit.

1632 Cold Flow Investigation of Primary Zone Characteristics in Combustor Utilizing Axial Air Swirler

Authors: Yehia A. Eldrainy, Mohammad Nazri Mohd. Jaafar, Tholudin Mat Lazim

Abstract:

This paper presents a cold flow simulation study of a small gas turbine combustor performed using a laboratory-scale test rig. The main objective of this investigation is to obtain physical insight into the main vortex, responsible for the efficient mixing of fuel and air. Such models are necessary for the prediction and optimization of real gas turbine combustors. An air swirler can control combustor performance by assisting in the fuel-air mixing process and by producing a recirculation region which can act as a flame holder and influences residence time. Thus, proper selection of a swirler is needed to enhance combustor performance and to reduce NOx emissions. Three different axial air swirlers were used, with vane angles of 30°, 45°, and 60°. The three-dimensional, viscous, turbulent, isothermal flow characteristics of the combustor model operating at room temperature were simulated via a Reynolds-Averaged Navier-Stokes (RANS) code. The model geometry was created using a solid modeler, and the meshing was done using the GAMBIT preprocessing package. Finally, the solution and analysis were carried out in the FLUENT solver. This serves to demonstrate the capability of the code for the design and analysis of real combustors. The effects of the swirlers and of the mass flow rate were examined. Details of the complex flow structure, such as vortices and recirculation zones, were obtained from the simulation model. The computational model predicts a major recirculation zone in the central region immediately downstream of the fuel nozzle and a second recirculation zone in the upstream corner of the combustion chamber. It is also shown that changes in swirler angle have significant effects on the combustor flowfield as well as on pressure losses.

Keywords: cold flow, numerical simulation, combustor, turbulence, axial swirler.

1631 How to Use E-Learning to Increase Job Satisfaction in Large Commercial Bank in Bangkok

Authors: Teerada Apibunyopas, Nithinant Thammakoranonta

Abstract:

Many organizations use e-Learning as a tool in their training and human development departments. It is becoming more popular because it provides easy access to knowledge at any time and offers rich content, which can develop employees' skills efficiently. This study focuses on the factors that affect the efficient use of e-Learning and thereby increase job satisfaction. Questionnaires were sent to employees of large commercial banks located in Bangkok that use e-Learning. The results from a multiple linear regression analysis showed that employees' characteristics, the characteristics of e-Learning, and learning and growth have an influence on job satisfaction.

Keywords: e-Learning, Job Satisfaction, Learning and growth.

1630 Usability Evaluation Framework for Computer Vision Based Interfaces

Authors: Muhammad Raza Ali, Tim Morris

Abstract:

Human computer interaction has progressed considerably from the traditional modes of interaction. Vision-based interfaces are a revolutionary technology, allowing interaction through human actions and gestures. Researchers have developed numerous accurate techniques; however, with a few exceptions, these techniques are not evaluated using standard HCI methods. In this paper we present a comprehensive framework to address this issue. Our evaluation of a computer vision application shows that, in addition to accuracy, it is vital to address human factors.

Keywords: Usability evaluation, cognitive walkthrough, think aloud, gesture recognition.

1629 The Impact of Leadership Style on Innovative Work Behavior

Authors: Dyah P. Srirahayu, Esti Putri Anugrah, Amelia Firdaus

Abstract:

The library today exists to meet the complex needs of its users. However, the human resources in the library are often a source of problems in service, and this is influenced by the leadership style in each library. This study aims to analyze the impact of leadership style on innovative work behavior. The research method used is a quantitative approach, with the analysis performed using SPSS. The findings of this study illustrate that leadership style has an influence on innovative work behavior, which is of course also shaped by various other existing factors.

Keywords: Leadership, public libraries, innovative behavior, transactional leadership.

1628 SAF: A Substitution and Alignment Free Similarity Measure for Protein Sequences

Authors: Abdellali Kelil, Shengrui Wang, Ryszard Brzezinski

Abstract:

The literature reports a large number of approaches for measuring the similarity between protein sequences. Most of these approaches estimate this similarity using alignment-based techniques that do not necessarily yield biologically plausible results, for two reasons. First, for the case of non-alignable (i.e., not yet definitively aligned and biologically approved) sequences such as multi-domain, circular permutation and tandem repeat protein sequences, alignment-based approaches do not succeed in producing biologically plausible results. This is due to the nature of the alignment, which is based on the matching of subsequences in equivalent positions, while non-alignable proteins often have similar and conserved domains in non-equivalent positions. Second, the alignment-based approaches lead to similarity measures that depend heavily on the parameters set by the user for the alignment (e.g., gap penalties and substitution matrices). For easily alignable protein sequences, it's possible to supply a suitable combination of input parameters that allows such an approach to yield biologically plausible results. However, for difficult-to-align protein sequences, supplying different combinations of input parameters yields different results. Such variable results create ambiguities and complicate the similarity measurement task. To overcome these drawbacks, this paper describes a novel and effective approach for measuring the similarity between protein sequences, called SAF for Substitution and Alignment Free. Without resorting either to the alignment of protein sequences or to substitution relations between amino acids, SAF is able to efficiently detect the significant subsequences that best represent the intrinsic properties of protein sequences, those underlying the chronological dependencies of structural features and biochemical activities of protein sequences. Moreover, by using a new efficient subsequence matching scheme, SAF more efficiently handles protein sequences that contain similar structural features with significant meaning in chronologically non-equivalent positions. To show the effectiveness of SAF, extensive experiments were performed on protein datasets from different databases, and the results were compared with those obtained by several mainstream algorithms.
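
SAF itself is not specified in the abstract, so the sketch below only illustrates the general alignment-free idea it belongs to: comparing subsequence (k-mer) profiles directly, with no alignment and no substitution matrix, so that conserved segments in non-equivalent positions still contribute. The sequences and the cosine measure are invented for illustration and are not the SAF algorithm.

```python
from collections import Counter
from math import sqrt

def kmer_profile(seq, k=3):
    """Frequency profile of overlapping k-mers in a protein sequence."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def cosine_similarity(p, q):
    """Cosine similarity between two k-mer profiles (1.0 = identical composition)."""
    dot = sum(p[w] * q[w] for w in p if w in q)
    norm = sqrt(sum(v * v for v in p.values())) * sqrt(sum(v * v for v in q.values()))
    return dot / norm

# Hypothetical sequences sharing a conserved motif in non-equivalent positions.
a = "MKVLAAGGTRWQDPLKHEA"
b = "GGTRWQDPMKVLAAHEALK"
print(round(cosine_similarity(kmer_profile(a), kmer_profile(b)), 3))
```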

Keywords: Protein, Similarity, Substitution, Alignment.

1627 Contextual Variables Affecting Frustration Level in Reading: An Integral Inquiry

Authors: Mae C. Pavilario

Abstract:

This study employs a sequential explanatory mixed method. Quantitatively, it investigated the profile of grade VII students; qualitatively, the prevailing contextual variables that affect their frustration level were sought from their own perspective and from that of their parents and teachers. These students were categorized as frustration-level readers based on the word-list data of the Philippine Informal Reading Inventory (Phil-IRI). The researcher-made reading factor instrument, translated into the local dialect (Hiligaynon), was subjected to cross-cultural translation to address content, semantic, technical, criterion, and conceptual equivalence; open-ended questions and one unstructured interview were also utilized. In the profile of the 26 participants, the 12 males were categorized at grade II and grade III frustration levels. The prevailing contextual variables are personal ("having no interest in reading", "being ashamed and afraid of having to read in front of others") for the extremely high frustration level; social-environmental ("having no regular reading schedule at home") for the very high frustration level; and personal ("having no interest in reading") for the high frustration level. The Kendall tau inferential statistical tool was used to test for a significant relationship in the prevailing contextual variables affecting frustration-level readers when grouped according to perspective. Results showed that a significant relationship exists between the students' and parents' perspectives; however, there is no significant relationship between the students' and teachers' or the parents' and teachers' perspectives. The themes in the participants' narratives on frustration-level readers are the existence of speech defects, undesirable attitudes, an insufficient amount of reading materials, lack of close supervision from parents, and losing time and focus on task. An intervention was designed.
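
The Kendall tau test mentioned above can be reproduced in outline with SciPy; the paired ratings below are hypothetical, and only the statistical call mirrors the described analysis.

```python
from scipy.stats import kendalltau

# Hypothetical ratings of the same contextual variables by students and parents (1-5 scale).
students = [5, 4, 4, 3, 5, 2, 4, 3, 5, 4]
parents  = [4, 4, 5, 3, 4, 2, 4, 2, 5, 3]

tau, p_value = kendalltau(students, parents)
print(f"tau = {tau:.2f}, p = {p_value:.3f}")  # p < 0.05 would indicate a significant association
```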

Keywords: Contextual variables, frustration-level readers, perspective, inquiry.

1626 Design of Experiment and Computational Fluid Dynamics Used to Optimize Hydrodynamic Characteristics of the Marine Propeller

Authors: Rohit Suryawanshi

Abstract:

In this study, the commercial Computational Fluid Dynamics (CFD) code ANSYS Fluent has been used to optimize a marine propeller with the design of experiments (DOE) method. At the initial stage, different propeller parameters were selected at three levels. The four characteristic factors are: number of blades, camber value, pitch delta and chord at the hub. CAD modelling is then performed for the selected factors and levels. In this investigation, a total of 9 test models are simulated with the Reynolds-Averaged Navier-Stokes (RANS) equations. The standard, realizable

Keywords: Marine propeller, Computational Fluid Dynamics, optimization, DOE, propeller thrust.
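
For reference, the standard L9 (3^4) orthogonal array behind a nine-run, four-factor, three-level plan such as the one above can be written out directly; the factor names follow the abstract, while the level assignments are the generic L9 layout rather than the authors' specific design table.

```python
# Standard Taguchi L9 (3^4) orthogonal array: 9 runs, 4 factors, 3 levels each.
L9 = [
    (1, 1, 1, 1), (1, 2, 2, 2), (1, 3, 3, 3),
    (2, 1, 2, 3), (2, 2, 3, 1), (2, 3, 1, 2),
    (3, 1, 3, 2), (3, 2, 1, 3), (3, 3, 2, 1),
]
factors = ["blade count", "camber", "pitch delta", "chord at hub"]

for run, levels in enumerate(L9, start=1):
    settings = ", ".join(f"{f}=L{lv}" for f, lv in zip(factors, levels))
    print(f"model {run}: {settings}")   # each row defines one CFD test geometry
```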

1625 Model of Community Management for Sustainable Utilization

Authors: Luedech Girdwichai, Witthaya Mekhum

Abstract:

This research intended to develop a model of community management for sustainable utilization by investigating two population groups: the family heads and the community management teams. The former group consisted of family heads from 511 families in 12 areas, who completed the questionnaires, of which 479 sets were returned. The latter group consisted of the community management teams of the 12 areas, with one representative from each area giving an interview. The questionnaire for the family heads consisted of two main parts: general information, such as occupation, in checklist form; and data on self-reliant community development based on the 4P framework, i.e., People (human resource) development, Place (area) development, Product (economic and income source) development, and Plan (community plan) development, in rating-scale form. Data in the first part were analyzed for frequency and percentage, while those in the second part were analyzed for the arithmetic mean and SD. Data from the second group, the community management teams, were derived from a focus group to find the factors influencing successful management, together with in-depth interviews, and were analyzed by descriptive statistics. The results showed that, for the 479 family heads, the implementation of community plans for self-reliant community activities based on the Sufficiency Economy Philosophy and the 4P framework was rated at an average of 3.28, or a moderate level. Considered in detail, the highest-rated aspect was area development, with a mean of 3.71 (high level), followed by human resource development with a mean of 3.44 (moderate level), then economic and income source development with a mean of 3.09 (moderate level). The last aspect was community plan development, with a mean of 2.89. The results from the small group discussion revealed the following factors and guidelines for successful community management: 1) on People (human resource) development, there was a project to support and develop community leaders; 2) on Place (area) development, there was development of conservation tourism areas; 3) on Product (economic and income source) development, the community leaders promoted the setting up of an occupational group, a saving group, and a product processing group; and 4) on Plan (community plan) development, there was prioritization through public hearings.

Keywords: Model of community management, sustainable utilization.

1624 Multi-Sensor Image Fusion for Visible and Infrared Thermal Images

Authors: Amit Kr. Happy

Abstract:

This paper is motivated by the importance of multi-sensor image fusion, with specific focus on Infrared (IR) and Visible image (VI) fusion for various applications including military reconnaissance. Image fusion can be defined as the process of combining two or more source images into a single composite image with extended information content that improves visual perception or feature extraction. These images can come from different modalities, such as a visible camera and an IR thermal imager. While visible images are captured from reflected radiation in the visible spectrum, thermal images are formed from thermal (IR) radiation that may be reflected or self-emitted. A digital color camera captures the visible source image and a thermal IR camera acquires the thermal source image. In this paper, image fusion algorithms based upon the Multi-Scale Transform (MST) and a region-based selection rule with consistency verification are proposed and presented. This research includes an implementation of the proposed image fusion algorithm in MATLAB, along with a comparative analysis to decide the optimum number of MST levels and the coefficient fusion rule. The results are presented, and several commonly used evaluation metrics are used to assess the suggested method's validity. Experiments show that the proposed approach is capable of producing good fusion results. While developing our approach, we observed several challenges with the popular image fusion methods: the high computational cost and complex processing steps that give them accurate fused results also make them hard to deploy in systems and applications that require real-time operation, high flexibility and low computational capability. The methods presented in this paper offer good results with minimal time complexity.
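
A compact sketch of MST-based fusion in the spirit described above, assuming the PyWavelets package and registered grayscale inputs; the max-absolute detail rule and averaged base band here are a simplified stand-in for the paper's region-based, consistency-verified rule.

```python
import numpy as np
import pywt

def fuse_mst(visible, infrared, wavelet="db2", levels=3):
    """Fuse two registered grayscale images with a simple wavelet (MST) rule."""
    cA_v, *details_v = pywt.wavedec2(visible, wavelet, level=levels)
    cA_i, *details_i = pywt.wavedec2(infrared, wavelet, level=levels)

    fused = [(cA_v + cA_i) / 2.0]                      # base band: average
    for dv, di in zip(details_v, details_i):           # detail bands: max-abs selection
        fused.append(tuple(np.where(np.abs(a) >= np.abs(b), a, b) for a, b in zip(dv, di)))
    return pywt.waverec2(fused, wavelet)

# Hypothetical registered source images (random stand-ins for a VI frame and an IR frame).
rng = np.random.default_rng(1)
vi, ir = rng.random((128, 128)), rng.random((128, 128))
print(fuse_mst(vi, ir).shape)
```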

Keywords: Image fusion, IR thermal imager, multi-sensor, Multi-Scale Transform.

1623 A TISM Model for Structuring the Productivity Elements of Flexible Manufacturing System

Authors: Sandhya Dixit, Tilak Raj

Abstract:

A Flexible Manufacturing System (FMS) is seen as an option for industries which want to boost productivity as well as respond quickly to an increasingly changing marketplace. An FMS produces in the mid-variety, mid-volume range and can meet changing market demands very quickly. Still, the impact of the adoption of FMS on the productivity of any industry is not very clear. In this paper an attempt has been made to model the various factors affecting the productivity of an FMS installation using the Total Interpretive Structural Modelling (TISM) technique.
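
The abstract does not reproduce the TISM steps, but the central mechanical step shared by ISM and TISM, deriving a transitively closed reachability matrix from pairwise influence judgements before partitioning factors into levels, can be sketched as follows; the 4-factor adjacency matrix is invented for illustration and is not the paper's factor set.

```python
import numpy as np

# Hypothetical initial reachability matrix: entry (i, j) = 1 if factor i influences factor j.
A = np.array([
    [1, 1, 0, 0],
    [0, 1, 1, 0],
    [0, 0, 1, 1],
    [0, 0, 0, 1],
])

# Enforce transitivity (Warshall-style closure): if i -> k and k -> j then i -> j.
R = A.copy()
for k in range(len(R)):
    R = ((R + np.outer(R[:, k], R[k, :])) > 0).astype(int)

print(R)  # final reachability matrix used to partition the factors into levels
```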

Keywords: Flexible manufacturing system, productivity, total interpretive structural modelling.

1622 Analysis of Security Vulnerabilities for Mobile Health Applications

Authors: Y. Cifuentes, L. Beltrán, L. Ramírez

Abstract:

The availability of mobile applications for health care is increasing daily through different mobile app stores. Alongside these capabilities, the number of hacking attacks has also increased, in particular against medical mobile applications. The security vulnerabilities in medical mobile apps can be triggered by errors in code, incorrect logic, and poor design, among other causes, and are usually exploited by malicious attackers to steal or modify users' information. The aim of this research is to analyze the vulnerabilities detected in mobile medical apps according to the risk factor standards defined by OWASP in 2014.

Keywords: mHealth apps, OWASP, protocols, security vulnerabilities, risk factors.

1621 Rigorous Electromagnetic Model of Fourier Transform Infrared (FT-IR) Spectroscopic Imaging Applied to Automated Histology of Prostate Tissue Specimens

Authors: Rohith K Reddy, David Mayerich, Michael Walsh, P Scott Carney, Rohit Bhargava

Abstract:

Fourier transform infrared (FT-IR) spectroscopic imaging is an emerging technique that provides both chemically and spatially resolved information. The rich chemical content of data may be utilized for computer-aided determinations of structure and pathologic state (cancer diagnosis) in histological tissue sections for prostate cancer. FT-IR spectroscopic imaging of prostate tissue has shown that tissue type (histological) classification can be performed to a high degree of accuracy [1] and cancer diagnosis can be performed with an accuracy of about 80% [2] on a microscopic (≈ 6μm) length scale. In performing these analyses, it has been observed that there is large variability (more than 60%) between spectra from different points on tissue that is expected to consist of the same essential chemical constituents. Spectra at the edges of tissues are characteristically and consistently different from chemically similar tissue in the middle of the same sample. Here, we explain these differences using a rigorous electromagnetic model for light-sample interaction. Spectra from FT-IR spectroscopic imaging of chemically heterogeneous samples are different from bulk spectra of individual chemical constituents of the sample. This is because spectra not only depend on chemistry, but also on the shape of the sample. Using coupled wave analysis, we characterize and quantify the nature of spectral distortions at the edges of tissues. Furthermore, we present a method of performing histological classification of tissue samples. Since the mid-infrared spectrum is typically assumed to be a quantitative measure of chemical composition, classification results can vary widely due to spectral distortions. However, we demonstrate that the selection of localized metrics based on chemical information can make our data robust to the spectral distortions caused by scattering at the tissue boundary.

Keywords: Infrared, Spectroscopy, Imaging, Tissue classification

1620 Taguchi Robust Design for Optimal Setting of Process Wastes Parameters in an Automotive Parts Manufacturing Company

Authors: Charles Chikwendu Okpala, Christopher Chukwutoo Ihueze

Abstract:

As a technique that reduces variation in a product by lessening the sensitivity of the design to sources of variation, rather than by controlling those sources, Taguchi Robust Design entails the design of ideal goods by developing a product that has minimal variance in its characteristics and also meets the desired exact performance. This paper examines the concept of this manufacturing approach and its application to the brake pad product of an automotive parts manufacturing company. Although the firm claimed that defects, excess inventory, and over-production were the only wastes that grossly affect its productivity and profitability, a careful study and analysis of its manufacturing processes with the application of the Single Minute Exchange of Dies (SMED) tool showed that the waste of waiting is a fourth waste that bedevils the firm. The Taguchi L9 orthogonal array, based on the four parameters and the three levels of variation for each parameter, revealed, with a range of 2.17, that waiting is the major waste that the company must reduce in order to remain viable. Also, to enhance the company's throughput and profitability, the wastes of over-production, excess inventory, and defects, with ranges of 2.01, 1.46, and 0.82, ranking second, third, and fourth respectively, must also be reduced to the barest minimum. After proposing -33.84 as the highest optimum signal-to-noise ratio to be maintained for the waste of waiting, the paper advocates the adoption of all the tools and techniques of the Lean Production System (LPS) and Continuous Improvement (CI), and concludes by recommending SMED in order to drastically reduce setup time, which leads to unnecessary waiting.
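
A minimal sketch of the smaller-the-better signal-to-noise ratio and range ranking that the analysis above relies on; the per-waste observations are hypothetical, and only the formula S/N = -10*log10(mean(y^2)) and the ranking step mirror the described procedure.

```python
import numpy as np

def sn_smaller_the_better(y):
    """Taguchi smaller-the-better ratio: S/N = -10 * log10(mean(y^2))."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(y ** 2))

# Hypothetical waste measurements (e.g., hours lost) for each waste across trial runs.
wastes = {
    "waiting":          [55, 48, 60],
    "over-production":  [40, 37, 44],
    "excess inventory": [30, 28, 33],
    "defects":          [22, 20, 25],
}

sn = {name: sn_smaller_the_better(vals) for name, vals in wastes.items()}
for name in sorted(sn, key=sn.get):       # lowest (most negative) S/N first: largest waste
    print(f"{name}: S/N = {sn[name]:.2f} dB")
```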

Keywords: Taguchi Robust Design, signal to noise ratio, Single Minute Exchange of Dies, lean production system, waste.

1619 Human Capacity Building in Manufacturing Sector: A Factor to Industrial Growth in Nigeria

Authors: Williams S. Ebhota, Ckikaodili Virginia Ugwu

Abstract:

Human ability is a major constraint on manufacturing industries in Nigeria. This paper, therefore, discusses the importance of human factors in manufacturing and, consequently, in industrialization and national development. In this paper, the development of manufacturing is anchored on two main factors: Infrastructural Capacity Development (ICD) and Human Capacity Development (HCD). A wider view is given to HCD and the various contemporary human capacity issues militating against manufacturing in Nigeria. The paper goes on to discuss various ways of acquiring and upgrading workers' skills and, finally, makes suggestions on how to tackle the onerous human capacity issues in manufacturing.

Keywords: Manufacturing, Human, Capacity, Development, Innovation.
