Search results for: Principal Component Analysis.
Paper Count: 9344


8774 Machine Learning Approach for Identifying Dementia from MRI Images

Authors: S. K. Aruna, S. Chitra

Abstract:

This research paper presents a framework for classifying Magnetic Resonance Imaging (MRI) images for dementia. Dementia, an age-related cognitive decline, is indicated by degeneration of cortical and sub-cortical structures. Characterizing these morphological changes helps in understanding disease development and contributes to early prediction and prevention of the disease. Modelling that captures the brain's structural variability, and that remains valid for disease classification and interpretation, is very challenging. Features are extracted using Gabor filters at orientations of 0°, 30°, 60°, and 90°, together with the Gray Level Co-occurrence Matrix (GLCM). It is proposed to normalize and fuse these features. Independent Component Analysis (ICA) is then used to select features. A Support Vector Machine (SVM) classifier with different kernels is evaluated for its efficiency in classifying dementia. This study evaluates the presented framework using MRI images from the OASIS dataset. Results showed that the proposed feature fusion classifier achieves higher classification accuracy.
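
To make the pipeline concrete, the following is a minimal sketch of Gabor plus GLCM feature extraction feeding an SVM, assuming scikit-image and scikit-learn; random arrays stand in for the OASIS MRI slices, the feature set is deliberately small, and the normalization/fusion and ICA selection steps of the paper are omitted.

```python
# Sketch: Gabor + GLCM texture features -> SVM (not the authors' code).
import numpy as np
from skimage.filters import gabor
from skimage.feature import graycomatrix, graycoprops
from sklearn.svm import SVC

def extract_features(img):
    feats = []
    for theta_deg in (0, 30, 60, 90):          # orientations used in the paper
        real, _ = gabor(img, frequency=0.2, theta=np.deg2rad(theta_deg))
        feats += [real.mean(), real.std()]
    q = (img * 255).astype(np.uint8)           # 8-bit quantization for the GLCM
    glcm = graycomatrix(q, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    for prop in ("contrast", "homogeneity", "energy", "correlation"):
        feats.append(graycoprops(glcm, prop)[0, 0])
    return np.array(feats)

rng = np.random.default_rng(0)
X = np.array([extract_features(rng.random((64, 64))) for _ in range(40)])
y = rng.integers(0, 2, size=40)                # toy demented / non-demented labels
clf = SVC(kernel="rbf").fit(X, y)              # different kernels can be compared
print(clf.score(X, y))
```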

Keywords: Magnetic resonance imaging, dementia, Gabor filter, gray level co-occurrence matrix, support vector machine.

8773 Indicator of Small Calcification Detection in Ultrasonography using Decorrelation of Forward Scattered Waves

Authors: Hirofumi Taki, Takuya Sakamoto, Makoto Yamakawa, Tsuyoshi Shiina, Toru Sato

Abstract:

To improve the ability to detect small calcifications using ultrasonography (US), we propose a novel indicator of calcifications in an ultrasound B-mode image that requires no decrease in frame rate. Since the waveform of an ultrasound pulse changes at a calcification position, decorrelation of adjacent scan lines occurs behind a calcification; we therefore employ this decorrelation as the indicator. The proposed indicator depicted wires 0.05 mm in diameter at 2 cm depth, which are hardly detectable in ultrasound B-mode images, with a sensitivity of 86.7% and a specificity of 100%. This study shows the potential of the proposed indicator to bring the detectable calcification size of a US device close to that of an X-ray imager, implying the possibility that a US device will become a convenient, safe, and principal clinical tool for breast cancer screening.
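
A numerical sketch of the indicator, assuming the beamformed RF scan lines are available as a depth-by-line array; the window length and data are illustrative, not the paper's implementation.

```python
# Decorrelation indicator sketch: 1 - correlation between adjacent scan lines,
# computed in a sliding depth window; high values behind a point flag a
# possible calcification. Toy random data stand in for real RF echoes.
import numpy as np

def decorrelation_map(rf, win=32):
    """rf: (n_depth, n_lines) array of RF samples."""
    n_depth, n_lines = rf.shape
    dec = np.zeros((n_depth - win, n_lines - 1))
    for j in range(n_lines - 1):
        for i in range(n_depth - win):
            r = np.corrcoef(rf[i:i + win, j], rf[i:i + win, j + 1])[0, 1]
            dec[i, j] = 1.0 - r                # 0 for identical adjacent lines
    return dec

rng = np.random.default_rng(1)
rf = rng.standard_normal((256, 64))
print(decorrelation_map(rf).max())
```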

Keywords: Ultrasonography, Calcification, Decorrelation, Forward scattered wave

8772 Space Telemetry Anomaly Detection Based on Statistical PCA Algorithm

Authors: B. Nassar, W. Hussein, M. Mokhtar

Abstract:

The critical concern of satellite operations is to ensure the health and safety of satellites. The worst case in this perspective is probably the loss of a mission, but the more common interruption of satellite functionality can still compromise mission objectives. All data acquired from the spacecraft are known as telemetry (TM), which contains a wealth of information related to the health of all its subsystems. Each single item of information is contained in a telemetry parameter, which represents a time-variant property (i.e., a status or a measurement) to be checked. Consequently, TM monitoring systems are continuously improved to reduce the time required to respond to changes in a satellite's state of health; a fast grasp of the current state of the satellite is essential for responding to occurring failures. Statistical multivariate latent-variable techniques are among the vital learning tools used to tackle this problem coherently, although extracting information from such rich data sources with advanced statistical methodologies is challenging due to the massive volume of data. To solve this problem, this paper presents a proposed unsupervised learning algorithm based on the Principal Component Analysis (PCA) technique. The algorithm is applied to an actual remote sensing spacecraft. Data from the Attitude Determination and Control System (ADCS) were acquired under two operating conditions: normal and faulty states. The models were built and tested under these conditions, and the results show that the algorithm could successfully differentiate between the two operating conditions. Furthermore, the algorithm provides useful predictive information and adds insight and physical interpretation to the ADCS operation.
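
The core of such PCA-based monitoring can be sketched generically: fit the model on normal-state telemetry and flag samples whose squared reconstruction error (the Q, or SPE, statistic) exceeds a limit learned from normal data. The sketch below is not the authors' code; synthetic correlated channels stand in for ADCS telemetry, and a simple empirical percentile replaces the theoretical SPE control limit.

```python
# PCA anomaly detection sketch on synthetic telemetry (sklearn).
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
W = rng.standard_normal((4, 12))                   # 12 channels, 4 latent factors
X_normal = rng.standard_normal((500, 4)) @ W + 0.1 * rng.standard_normal((500, 12))
X_faulty = X_normal[:50].copy()
X_faulty[:, -1] += 6.0                             # bias fault on one channel

pca = PCA(n_components=4).fit(X_normal)

def spe(X):
    """Squared prediction error (Q statistic) per sample."""
    X_hat = pca.inverse_transform(pca.transform(X))
    return np.sum((X - X_hat) ** 2, axis=1)

threshold = np.percentile(spe(X_normal), 99)       # empirical 99% control limit
print("faulty samples flagged:", np.mean(spe(X_faulty) > threshold))
```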

Keywords: Space telemetry monitoring, multivariate analysis, PCA algorithm, space operations.

8771 Random Projections for Dimensionality Reduction in ICA

Authors: Sabrina Gaito, Andrea Greppi, Giuliano Grossi

Abstract:

In this paper we present a technique to speed up ICA based on the idea of reducing the dimensionality of the data set while preserving the quality of the results. In particular we refer to the FastICA algorithm, which uses kurtosis as the statistical property to be maximized. By performing a particular Johnson-Lindenstrauss-like projection of the data set, we find the minimum dimensionality reduction rate ρ, defined as the ratio between the size k of the reduced space and the original dimension d, which guarantees a narrow confidence interval for this estimator at a high confidence level. The derived dimensionality reduction rate depends on a system control parameter β that is easily computed a priori on the basis of the observations alone. Extensive simulations have been performed on different sets of real-world signals. They show that the achievable dimensionality reduction is substantial, that it preserves the quality of the decomposition, and that it impressively speeds up FastICA. On the other hand, a set of signals for which the estimated reduction rate is greater than 1 exhibits poor decomposition results if reduced, thus validating the reliability of the parameter β. We are confident that our method will lead to a better approach to real-time applications.
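
The speed-up idea can be outlined in a few lines, assuming scikit-learn: project the d-dimensional mixtures down to k dimensions with a Johnson-Lindenstrauss style Gaussian random matrix and run FastICA in the reduced space. The fixed k below is arbitrary; in the paper the reduction rate ρ is derived from the control parameter β.

```python
# Random projection + FastICA sketch (fun="cube" gives the kurtosis contrast).
import numpy as np
from sklearn.random_projection import GaussianRandomProjection
from sklearn.decomposition import FastICA

rng = np.random.default_rng(3)
n, d, k, n_src = 2000, 200, 40, 5
S = rng.laplace(size=(n, n_src))          # non-Gaussian (kurtotic) sources
X = S @ rng.standard_normal((n_src, d))   # d-dimensional observed mixtures

X_red = GaussianRandomProjection(n_components=k, random_state=0).fit_transform(X)
S_hat = FastICA(n_components=n_src, fun="cube", random_state=0).fit_transform(X_red)
print(S_hat.shape)                        # (2000, 5) sources from the reduced space
```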

Keywords: Independent Component Analysis, FastICA algorithm, Higher-order statistics, Johnson-Lindenstrauss lemma.

8770 Wavelet Enhanced CCA for Minimization of Ocular and Muscle Artifacts in EEG

Authors: B. S. Raghavendra, D. Narayana Dutt

Abstract:

Electroencephalogram (EEG) recordings are often contaminated with ocular and muscle artifacts. In this paper, canonical correlation analysis (CCA) is used as a blind source separation (BSS) technique (BSS-CCA) to decompose the artifact-contaminated EEG into component signals. We combine the BSS-CCA technique with a wavelet filtering approach to minimize both ocular and muscle artifacts simultaneously, and refer to the proposed method as wavelet-enhanced BSS-CCA. In this approach, after careful visual inspection, the muscle artifact components are discarded and the ocular artifact components are subjected to wavelet filtering to retain high-frequency cerebral information, after which clean EEG is reconstructed. The performance of the proposed wavelet-enhanced BSS-CCA method is tested on real EEG recordings contaminated with ocular and muscle artifacts, with power spectral density used as a quantitative measure. Our results suggest that the proposed hybrid approach minimizes ocular and muscle artifacts effectively while minimally affecting the underlying cerebral activity in EEG recordings.
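
In outline, BSS-CCA obtains components by canonically correlating the EEG with a one-sample delayed copy of itself, so the components come out ordered by autocorrelation (muscle artifacts lowest). The sketch below, assuming scikit-learn and PyWavelets on synthetic data, shows the decomposition step and one simple wavelet-filtering variant for an ocular component; in the paper, component selection is done by visual inspection.

```python
# BSS-CCA + wavelet filtering sketch (illustrative, synthetic data).
import numpy as np
import pywt
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(4)
X = rng.standard_normal((2048, 8)).cumsum(axis=0)   # toy 8-channel "EEG"
X -= X.mean(axis=0)

# Canonical correlation between X(t) and X(t-1) = autocorrelation-ordered BSS.
cca = CCA(n_components=8).fit(X[1:], X[:-1])
sources = cca.transform(X[1:])                      # component time courses

c = sources[:, 0]                                   # pretend: flagged as ocular
coeffs = pywt.wavedec(c, "db4", level=5)
coeffs[0] = np.zeros_like(coeffs[0])                # drop the slow ocular swing,
c_clean = pywt.waverec(coeffs, "db4")[:c.size]      # keep high-frequency detail
print(c_clean.shape)
```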

Keywords: Blind source separation, Canonical correlation analysis, Electroencephalogram, Muscle artifact, Ocular artifact, Power spectrum, Wavelet threshold.

8769 A Face-to-Face Education Support System Capable of Lecture Adaptation and Q&A Assistance Based On Probabilistic Inference

Authors: Yoshitaka Fujiwara, Jun-ichirou Fukushima, Yasunari Maeda

Abstract:

Keys to high-quality face-to-face education are ensuring flexibility in the way lectures are given, and providing care and responsiveness to learners. This paper describes a face-to-face education support system that is designed to raise the satisfaction of learners and reduce the workload on instructors. The system consists of a lecture adaptation assistance part, which assists instructors in adapting teaching content and strategy, and a Q&A assistance part, which provides learners with answers to their questions. The core component of the former part is a “learning achievement map”, which is composed of a Bayesian network (BN). From learners' performance in exercises on relevant past lectures, the lecture adaptation assistance part obtains the information required to appropriately adapt the presentation of the next lecture. The core component of the Q&A assistance part is a case base, which accumulates cases consisting of questions expected from learners and their answers; it is a case-based search system equipped with a search index that performs probabilistic inference. A prototype face-to-face education support system intended for the teaching of Java programming has been built and used to evaluate the approach. The expected degree of understanding of each learner for a future lecture was derived from his or her performance in exercises on past lectures, and this expected degree of understanding was used to select one of three adaptation levels. A model for determining the adaptation level most suitable for the individual learner has been identified. An experimental case base was built to examine the search performance of the Q&A assistance part, and the rate of successfully finding an appropriate case was found to be 56%.
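
A toy numerical illustration of the inference step, not the authors' network: a single three-state "understanding" node updated from two past exercise results via Bayes' rule, with hypothetical conditional probabilities, then mapped onto three adaptation levels.

```python
# Toy Bayesian update for a "learning achievement" node (hypothetical CPTs).
import numpy as np

p_u = np.array([0.3, 0.4, 0.3])               # prior: low / medium / high
p_pass = np.array([[0.2, 0.6, 0.9],           # P(passed ex.1 | understanding)
                   [0.1, 0.5, 0.8]])          # P(passed ex.2 | understanding)
evidence = [True, False]                      # passed exercise 1, failed 2

post = p_u.copy()
for passed, row in zip(evidence, p_pass):
    post *= row if passed else (1.0 - row)    # multiply in each likelihood
post /= post.sum()                            # posterior over understanding

expected = post @ np.array([0.0, 0.5, 1.0])   # expected degree of understanding
level = ("remedial", "standard", "advanced")[min(int(expected * 3), 2)]
print(np.round(post, 3), round(expected, 2), level)
```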

Keywords: Bayesian network, face-to-face education, lecture adaptation, Q&A assistance.

8768 Software Reliability Prediction Model Analysis

Authors: L. Mirtskhulava, M. Khunjgurua, N. Lomineishvili, K. Bakuria

Abstract:

Software reliability prediction provides a great opportunity to measure the software failure rate at any point throughout system test, and a software reliability prediction model provides a technique for improving reliability. Software reliability is a very important factor in estimating overall system reliability, which depends on the individual component reliabilities. It differs from hardware reliability in that it reflects design perfection. The main reason for software reliability problems is the high complexity of software, and various approaches can be used to improve its reliability. In this article we focus on a software reliability model, assuming that there is a time redundancy whose value (the number of repeated transmissions of basic blocks) can serve as an optimization parameter. We consider the given mathematical model under the assumption that the system may experience not only irreversible failures but also failures that can be treated as self-repairing, which significantly affect the reliability and accuracy of information transfer. The main task of this paper is to find the time distribution function (DF) for the transmission of an instruction sequence consisting of a random number of basic blocks. We consider the system software unreliable, with the time between adjacent failures following an exponential distribution.

Keywords: Exponential distribution, conditional mean time to failure, distribution function, mathematical model, software reliability.

8767 Feasibility Analysis Studies on New National R&D Programs in Korea

Authors: Seongmin Yim, Hyun-Kyu Kang

Abstract:

As part of its evaluation system for R&D programs, the Korean government has applied feasibility analysis since 2008. Various professionals have put forth great effort to keep up with the high degree of freedom of R&D programs and to contribute to the evolution of the feasibility analysis. We analyze diverse R&D programs from various viewpoints, such as technology, policy, and economics, integrate the separate analyses, and finally arrive at a definite result: whether a program is feasible or not. This paper describes the concept and method of the feasibility analysis as a decision-making tool. The analysis unit and the content of each criterion, which are key elements in a comprehensive decision-making structure, are examined.

Keywords: Decision Making of New Government R&D Program, Feasibility Analysis Study

8766 Assessing Extension of Meeting System Performance in Information Technology in Defense and Aerospace Project

Authors: Hakan Gürkan, Ahmet Denker

Abstract:

The Ministry of Defense (MoD) spends hundreds of millions of dollars on software to support its infrastructure, operate its weapons, and provide command, control, communications, computing, intelligence, surveillance, and reconnaissance (C4ISR) functions. These and other new advanced systems share a common critical component: information technology. The defense and aerospace environment continuously strives to keep up with increasingly sophisticated Information Technology (IT) in order to remain effective in today's dynamic and unpredictable threat environment, making IT one of the largest and fastest growing expenses of defense. Hundreds of millions of dollars are spent each year on IT projects, but too many of those millions are wasted on costly mistakes: systems that do not work properly, new components that are not compatible with old ones, trendy new applications that do not really satisfy defense needs, or money lost through poorly managed contracts. This paper investigates and compiles effective strategies that aim to end exasperation with the low returns and high cost of information technology acquisition for defense; it tries to show how to maximize value while reducing time and expenditure.

Keywords: Iterative Process, Acquisition Management, Project management, Software Economics, Requirement analysis.

8765 Riding the Crest of the Wave: Inclusive Education in New Zealand

Authors: Barbara A. Perry

Abstract:

In 1996, the New Zealand government and the Ministry of Education announced that they were setting up a "world class system of inclusive education". As a parent of a son with high and complex needs, a teacher, a school principal, and a disability studies lecturer, this author tracks the changes in the journey towards inclusive education over the last 20 years. Strategies for partnering with families to ensure educational success, along with insights from one of those on the crest of the wave, are presented. Using a narrative methodology, the author illuminates how far New Zealand has come towards the promised world-class system of inclusion and shares, from personal experience, some of the highlights and risks in the system. This author has challenged the old structures and been part of setting up new structures, particularly for providing parent voice and insight; the paper thus offers a unique view from an insider's voice as well as from a professional in the system.

Keywords: Disability studies, inclusive education, special education, working with families with children with disability.

8764 Performance of Derna Steam Power Plant at Varying Super-Heater Operating Conditions Based on Exergy

Authors: Idris Elfeituri

Abstract:

In the current study, energy and exergy analyses of a 65 MW steam power plant were carried out. The study investigated the effect of variations in the overall conductance of the super-heater on the performance of an existing steam power plant located in Derna, Libya. The performance of the power plant was estimated by a mathematical model that considers the off-design operating conditions of each component, and a fully interactive computer program based on the mass, energy, and exergy balance equations has been developed. The maximum exergy destruction was found in the steam generation unit. A 50% reduction in the design value of the overall conductance of the super-heater decreases the net electrical power that would be generated by at least 13 MW and the overall plant exergy efficiency by at least 6.4%, while increasing the total exergy destruction by at least 14 MW. The results show that the super-heater design and operating conditions play an important role in the thermodynamic performance and fuel utilization of the power plant. Moreover, these considerations are very useful when deciding whether to replace or renovate the super-heater of the power plant.
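
For reference, the specific flow exergy behind such component-level accounting is e = (h - h0) - T0(s - s0); the sketch below applies it to a super-heater with steam-table values at 1 MPa and an entirely hypothetical fuel-side exergy input, not Derna plant data.

```python
# Super-heater exergy balance sketch (hypothetical states, not plant data).
T0 = 298.15                 # dead-state temperature, K
h0, s0 = 104.9, 0.3672      # liquid water near 25 C, 1 atm (kJ/kg, kJ/kg.K)

def flow_exergy(h, s):
    return (h - h0) - T0 * (s - s0)           # specific flow exergy, kJ/kg

m_dot = 60.0                                  # steam mass flow, kg/s (assumed)
e_in = flow_exergy(2778.1, 6.5865)            # saturated steam at 1 MPa
e_out = flow_exergy(3263.9, 7.4651)           # superheated steam, 1 MPa, 400 C
e_fuel = 400.0                                # flue-gas exergy per kg steam (assumed)

E_dest = m_dot * (e_fuel - (e_out - e_in))    # exergy destruction rate, kW
eff = (e_out - e_in) / e_fuel                 # super-heater exergy efficiency
print(f"destruction = {E_dest / 1000:.1f} MW, efficiency = {eff:.0%}")
```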

Keywords: Exergy, super-heater, fouling, steam power plant, off-design.

8763 Finite Element Application to Estimate In-Service Material Properties Using Miniature Specimen

Authors: G. Partheepan, D.K. Sehgal, R.K. Pandey

Abstract:

This paper presents a method for determining uniaxial tensile properties such as Young's modulus, yield strength, and the flow behaviour of a material in a virtually non-destructive manner. To achieve this, a new dumb-bell shaped miniature specimen has been designed, which avoids the removal of large material samples from the in-service component for the evaluation of current material properties. The proposed miniature specimen also has an advantage in finite element modelling with respect to computational time and memory space. Test fixtures have been developed to enable tension tests on the miniature specimen in a testing machine. The studies have been conducted on a chromium (H11) steel and an aluminum alloy (AR66). The output from the miniature test, viz. the load-elongation diagram, is obtained, and a finite element simulation of the test is carried out using a 2D plane stress analysis. The results are compared with the experimental results, and it is observed that the finite element simulation agrees well with the miniature test results. The approach appears to have the potential to predict the mechanical properties of materials, which could be used in estimating the remaining life of various in-service structures.

Keywords: ABAQUS, finite element, miniature test, tensile properties.

8762 Developing New Media Credibility Scale: A Multidimensional Perspective

Authors: Hanaa Farouk Saleh

Abstract:

The main purposes of this study are to develop a scale that reflects emerging theoretical understandings of new media credibility, based on the evolution of credibility studies in Western research; to identify the determinants of credibility in the media and its components by comparing traditional and new media credibility scales; and to build a cumulative scale for testing new media credibility. The approach builds on Western research conceptualizations of media credibility, which focus on four principal components: source (journalist), message (article), medium (newspaper, radio, TV, web, etc.), and organization (owner of the medium), and it adds user and cultural context as key components for assessing new media credibility in particular. This study's value lies in its contribution to the conceptualization and development of new media credibility through the creation of a theoretical measurement tool. Future studies should explore this scale to test new media credibility, which represents a promising new approach in the efforts to define and measure the credibility of all media types.

Keywords: Credibility scale, media credibility components, new media credibility scale, scale development.

8761 Student Records Management System Using Smart Cards and Biometric Technology for Educational Institutions

Authors: Patrick O. Bobbie, Prince S. Attrams

Abstract:

In recent times, the rapid change in new technologies has transformed the way records are handled in educational institutions. There is also a need for reliable and easy access to these records, resulting in increased productivity in organizations. In academic institutions, such benefits help in quality assessments, institutional performance, and the assessment of teaching and evaluation methods. Students in educational institutions benefit the most when advanced technologies are deployed in accessing records. This research paper discusses the use of biometric technologies coupled with smartcard technologies to provide a unique way of identifying students and matching their data to financial records, in order to grant them access to restricted areas such as examination halls. The system developed in this paper has an identity verification component as part of its main functionality. A systematic software development cycle of analysis, design, coding, testing, and support was used. The system provides a secure way of verifying a student's identity and real-time verification of financial records. An advanced prototype version of the system has been developed for testing purposes.

Keywords: Biometrics, fingerprints, identity-verification, smartcards.

8760 Modified Genome-Scale Metabolic Model of Escherichia coli by Adding Hyaluronic Acid Biosynthesis-Related Enzymes (GLMU2 and HYAD) from Pasteurella multocida

Authors: P. Pasomboon, P. Chumnanpuen, T. E-kobon

Abstract:

Hyaluronic acid (HA) is a linear heteropolysaccharide consisting of repeating units of D-glucuronic acid and N-acetyl-D-glucosamine. HA has various useful properties: it maintains skin elasticity and moisture, reduces inflammation, and lubricates the movement of various body parts without causing immunogenic allergy. HA can be found in several animal tissues as well as in the capsule component of some bacteria, including Pasteurella multocida. This study aimed to modify a genome-scale metabolic model of Escherichia coli by adding two HA biosynthesis-related enzymes (GLMU2 and HYAD) from P. multocida, and to use computational simulation and flux analysis methods to predict HA productivity under different carbon sources and nitrogen supplements. Results revealed that threonine and aspartate supplementation raised HA production by 12.186%. Our analyses suggest that the genome-scale metabolic model is useful for improving HA production and narrows the number of conditions to be tested further.
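
At the tooling level, such modifications are typically scripted with a constraint-based framework. The sketch below uses COBRApy-style calls; the model file, the precursor chosen, and the stoichiometry are placeholders for illustration, not the study's actual reconstruction.

```python
# COBRApy-style sketch: add a toy HA-synthesis reaction and maximize its flux
# by flux balance analysis. Identifiers and stoichiometry are placeholders.
import cobra

model = cobra.io.read_sbml_model("e_coli_core.xml")   # placeholder model file

ha = cobra.Metabolite("ha_c", name="hyaluronic acid", compartment="c")
hyad = cobra.Reaction("HYAD")                         # toy HA synthase step
hyad.add_metabolites({
    model.metabolites.get_by_id("g6p_c"): -1.0,       # placeholder precursor
    ha: 1.0,
})
demand = cobra.Reaction("DM_ha_c")                    # demand pulls HA out
demand.add_metabolites({ha: -1.0})
model.add_reactions([hyad, demand])

model.objective = "DM_ha_c"
print(model.optimize().objective_value)               # predicted HA flux
```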

Keywords: Pasteurella multocida, Escherichia coli, hyaluronic acid, genome-scale metabolic model, bioinformatics.

8759 Recent Trends in Nonlinear Methods of HRV Analysis: A Review

Authors: Ramesh K. Sunkaria

Abstract:

Linear methods of heart rate variability (HRV) analysis, such as non-parametric methods (e.g., fast Fourier transform analysis) and parametric methods (e.g., autoregressive modelling), have become established non-invasive tools for marking cardiac health, but their sensitivity and specificity were found to be lower than expected, with a positive predictive value <30%. This may be due to treating the RR-interval series as stationary and re-sampling it prior to analysis, whereas it is actually not stationary. This paper reviews the non-linear methods of HRV analysis, such as correlation dimension, largest Lyapunov exponent, power law slope, fractal analysis, detrended fluctuation analysis, and complexity measures, which are currently becoming popular as they use the actual RR-interval series. These methods are expected to provide a highly accurate cardiac health prognosis.
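
One of the listed measures, detrended fluctuation analysis, is compact enough to sketch directly. This is generic textbook DFA-1 on a synthetic RR series, not code from the review; for uncorrelated data the scaling exponent comes out near 0.5.

```python
# DFA-1 sketch: integrate, detrend per window, and fit log F(n) vs log n.
import numpy as np

def dfa(rr, scales=(4, 8, 16, 32, 64)):
    y = np.cumsum(rr - np.mean(rr))                  # integrated series
    F = []
    for n in scales:
        rms = []
        for k in range(len(y) // n):
            seg = y[k * n:(k + 1) * n]
            t = np.arange(n)
            trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear trend
            rms.append(np.sqrt(np.mean((seg - trend) ** 2)))
        F.append(np.mean(rms))
    return np.polyfit(np.log(scales), np.log(F), 1)[0]     # exponent alpha

rng = np.random.default_rng(5)
rr = 0.8 + 0.05 * rng.standard_normal(2048)          # toy RR intervals, seconds
print("alpha =", round(dfa(rr), 2))                  # ~0.5 for white noise
```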

Keywords: chaos, nonlinear dynamics, sample entropy, approximate entropy, detrended fluctuation analysis.

8758 Noise Removal from Surface Respiratory EMG Signal

Authors: Slim Yacoub, Kosai Raoof

Abstract:

The aim of this study was to remove the two principal noises that disturb the surface electromyography signal of the diaphragm: the electrocardiogram (ECG) artefact and the power-line interference artefact. The proposed algorithm focuses on a Widrow Least Mean Squares (LMS) adaptive structure. Such structures require a reference signal that is correlated with the noise contaminating the signal. The noise references are extracted as follows: for the power-line interference, a reference is mathematically constructed from two cosine functions, one at 50 Hz (the fundamental) and one at 150 Hz (its harmonic); for the ECG artefact, a matching pursuit technique combined with an LMS structure is used for estimation. Both removal procedures are achieved without the use of supplementary electrodes. These filtering techniques are validated on real recordings of the surface diaphragm electromyography signal, and the performance of the proposed methods is compared with results from previous research.
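
The structure referred to above is the classic Widrow adaptive noise canceller; a minimal version with a constructed 50 Hz reference is sketched below. Step size, filter length, and signals are illustrative, and the matching-pursuit ECG stage is omitted.

```python
# Widrow LMS noise canceller sketch: cancel 50 Hz interference in a toy "EMG"
# using a mathematically constructed cosine reference (no extra electrode).
import numpy as np

fs = 1000.0
t = np.arange(4000) / fs
emg = 0.3 * np.random.default_rng(6).standard_normal(t.size)  # toy diaphragm EMG
noisy = emg + np.cos(2 * np.pi * 50 * t + 0.7)                # power-line artefact

ref = np.cos(2 * np.pi * 50 * t)      # constructed 50 Hz reference
L, mu = 8, 0.01                       # filter length and LMS step size
w, out = np.zeros(L), np.zeros(t.size)
for n in range(L, t.size):
    x = ref[n - L:n][::-1]            # reference tap vector
    e = noisy[n] - w @ x              # error = cleaned EMG sample
    w += 2 * mu * e * x               # LMS weight update
    out[n] = e

print("interference power before/after:",
      round(np.var(noisy - emg), 3), round(np.var(out[L:] - emg[L:]), 4))
```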

Keywords: Surface EMG, Adaptive filtering, Matching pursuit, Power-line interference.

8757 Life Cycle Assessment of Residential Buildings: A Case Study in Canada

Authors: Venkatesh Kumar, Kasun Hewage, Rehan Sadiq

Abstract:

Residential buildings consume significant amounts of energy and produce large amounts of emissions and waste. However, there is substantial potential for energy savings in this sector, which needs to be evaluated over the life cycle of residential buildings. Life Cycle Assessment (LCA) methodology has been employed to study the primary energy use and associated environmental impacts of the different phases (i.e., product, construction, use, end of life, and beyond building life) of residential buildings. Four alternative residential buildings in Vancouver (BC, Canada) with a 50-year lifespan have been evaluated: High Rise Apartment (HRA), Low Rise Apartment (LRA), Single family Attached House (SAH), and Single family Detached House (SDH). The life cycle performance of the buildings is evaluated for embodied energy, embodied environmental impacts, operational energy, operational environmental impacts, total life-cycle energy, and total life-cycle environmental impacts. Operational energy estimation and LCA are performed using DesignBuilder software and Athena Impact Estimator software, respectively. The results revealed that, over the life span of the buildings, energy use and environmental impacts follow an identical pattern. LRA was found to be the best alternative in terms of embodied energy use and embodied environmental impacts, while HRA showed the best life-cycle performance in terms of minimum energy use and environmental impacts. A sensitivity analysis has also been carried out to study the influence of building service lifespans of 50, 75, and 100 years on the relative significance of embodied energy and total life-cycle energy. The life-cycle energy requirements of SDH were found to be the most significant among the four types of residential buildings. Overall, the results disclose that the operational phase of these buildings accounts for 90% of the total life-cycle energy, which far outweighs the minor differences in embodied effects between the buildings.

Keywords: Building simulation, environmental impacts, life cycle assessment, life cycle energy analysis, residential buildings.

8756 KM Practices in Service SMEs

Authors: K. Cormican, G. Coppola, S. Farina

Abstract:

Knowledge management is a critical component of competitive success in service organizations. Knowledge management centers on creating new knowledge and utilizing existing knowledge. While utilizing existing knowledge relates to input and control and can lead to a reduction in costs, creating new knowledge relates to output and growth and can lead to an increase in revenue. Managers must therefore ensure that they can successfully optimize the knowledge and talent in their organizations. To do this, they must try to develop an environment that promotes the generation, acquisition, transfer, and use of valuable knowledge in creative ways. However, knowledge management is complex and diverse, and research suggests that organizations in general, and SMEs in particular, are finding it difficult to implement successful knowledge management initiatives. Our research attempts to understand whether organizations are adopting best-practice initiatives. This paper presents findings from an exploratory study of 139 SMEs operating in the tourism sector across Europe. The goals of the survey are to assess the level of awareness of knowledge and talent management strategies and methodologies and to determine whether the responding companies implement best-practice knowledge management initiatives. An analysis of the findings from the study is presented and discussed.

Keywords: service sector, small enterprise, success factors, survey

8755 Genetic Variation of Durum Wheat Landraces and Cultivars Using Morphological and Protein Markers

Authors: M. R. Naghavi, S. Rashidi Monfared, A. H. Ahkami, M. A. Ombidbakhsh

Abstract:

Knowledge of patterns of genetic diversity enhances the efficiency of germplasm conservation and improvement. In this study, 96 Iranian landraces of Triticum turgidum originating from different geographical areas of Iran, along with 18 durum cultivars from ten countries, were evaluated for variation in morphological traits and high molecular weight glutenin subunit (HMW-GS) composition. The first two principal components clearly separated the Iranian landraces from the cultivars. Three alleles were present at the Glu-A1 locus and 11 alleles at Glu-B1. In both cultivars and landraces of durum wheat, the null allele (Glu-A1c) was observed more frequently than the Glu-A1a and Glu-A1b alleles. Two alleles, namely Glu-B1a (subunit 7) and Glu-B1e (subunit 20), were the most frequent at the Glu-B1 locus. The results showed that the evaluated Iranian landraces form an interesting source of favourable glutenin subunits that might be very desirable in breeding activities for improving pasta-making quality.
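
The principal component step reported above follows standard practice: standardize the trait matrix, project onto the first two components, and inspect the group separation. The sketch below uses synthetic stand-in data, not the study's measurements.

```python
# PCA on standardized traits: landraces vs. cultivars in the PC1-PC2 plane.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

rng = np.random.default_rng(7)
landraces = rng.normal(0.0, 1.0, size=(96, 10))   # 96 accessions x 10 traits (toy)
cultivars = rng.normal(1.5, 1.0, size=(18, 10))   # shifted group of 18 cultivars
X = np.vstack([landraces, cultivars])

scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
print("landrace centroid:", scores[:96].mean(axis=0).round(2))
print("cultivar centroid:", scores[96:].mean(axis=0).round(2))
```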

Keywords: Triticum turgidum var. durum, glutenin subunits, morphological characters.

8754 Application of Pearson Parametric Distribution Model in Fatigue Life Reliability Evaluation

Authors: E. A. Azrulhisham, Y. M. Asri, A. W. Dzuraidah, A. H. Hairul Fahmi

Abstract:

The aim of this paper is to introduce a parametric distribution model for fatigue life reliability analysis that deals with variation in material properties. Service loads, in the form of response-time history signals of Belgian pavé, were replicated on a multi-axial spindle-coupled road simulator, and the stress-life method was used to estimate the fatigue life of an automotive stub axle. A PSN curve was obtained by monotonic tension testing, and a two-parameter Weibull distribution function was used to obtain the mean life of the component. A Pearson system was developed to evaluate the fatigue life reliability by treating the stress range intercept and the slope of the PSN curve as random variables. Assuming a normal distribution of fatigue strength, the fatigue life of the stub axle is found to have the highest reliability between 10,000 and 15,000 cycles. Taking into account the variation of material properties associated with the size effect and with machining and manufacturing conditions, the method described in this study can be effectively applied to determine the probability of failure of mass-produced parts.
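
The Weibull step can be sketched generically: fit the shape and scale to replicated fatigue lives and read off the mean life and a reliability value. The lives below are synthetic; SciPy's weibull_min with the location fixed at zero gives the two-parameter form.

```python
# Two-parameter Weibull fit of (synthetic) fatigue lives.
import numpy as np
from scipy.stats import weibull_min
from scipy.special import gamma

lives = weibull_min.rvs(2.5, scale=12000, size=30, random_state=8)  # cycles

shape, _, scale = weibull_min.fit(lives, floc=0)    # two-parameter fit
mean_life = scale * gamma(1 + 1 / shape)            # mean = eta * Gamma(1 + 1/beta)
R_10k = np.exp(-(10000 / scale) ** shape)           # reliability R(N) at 10,000 cycles
print(round(shape, 2), round(scale), round(mean_life), round(R_10k, 3))
```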

Keywords: Stub axle, Fatigue life reliability, Stress-life, PSN curve, Weibull distribution, Pearson system

8753 Improving Taint Analysis of Android Applications Using Finite State Machines

Authors: Assad Maalouf, Lunjin Lu, James Lynott

Abstract:

We present a taint analysis that can automatically detect when string operations result in a string that is free of taints, where all the tainted patterns have been removed. This is an improvement on the conservative behavior of previous taint analyzers, where a string operation on a tainted string always leads to a tainted string unless the operation is manually marked as a sanitizer. The taint analysis is built on top of a string analysis that uses finite state automata to approximate the sets of values that string variables can take during the execution of a program. The proposed approach has been implemented as an extension of FlowDroid and experimental results show that the resulting taint analyzer is much more precise than the original FlowDroid.
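
As a degenerate, single-string illustration of the idea (the analysis itself reasons over finite automata that approximate sets of string values inside FlowDroid), an operation's result can be declared taint-free once the tainted pattern can no longer match it:

```python
# Toy illustration only: on one concrete string, a replace() kills the taint
# when the tainted pattern no longer matches the result. The real analysis
# performs this check on automata over-approximating all possible strings.
import re

TAINT = re.compile(r"<script\b", re.IGNORECASE)

def is_tainted(s: str) -> bool:
    return TAINT.search(s) is not None

user_input = "hello <script>alert(1)</script>"
sanitized = re.sub(r"(?i)<script\b[^>]*>", "", user_input)

print(is_tainted(user_input))   # True  -> still tainted
print(is_tainted(sanitized))    # False -> operation removed the tainted pattern
```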

Keywords: Android, static analysis, string analysis, taint analysis.

8752 The Contemporary Visual Spectacle — Critical Visual Literacy

Authors: Lai-Fen Yang

Abstract:

In this increasingly visual world, how can we best decipher and understand the many ways that our everyday lives are organized around looking practices and the many images we encounter each day? How we interact with and interpret visual images is a basic component of human life. Today, however, we are living in one of the most artificial, image-saturated visual cultures in human history, which makes understanding the complex construction and multiple social functions of visual imagery more important than ever before. This paper addresses themes concerning our experience of this visually pervasive, mediated culture, here termed the visual spectacle.

Keywords: Visual culture, contemporary, visual spectacle.

8751 New Security Approach of Confidential Resources in Hybrid Clouds

Authors: Haythem Yahyaoui, Samir Moalla, Mounir Bouden, Skander Ghorbel

Abstract:

Nowadays, cloud environments are becoming a necessity for companies: this technology provides the opportunity to access data anywhere and at any time, and it offers optimized, secure access to resources and greater security for the data stored on the platform. However, some companies do not trust cloud providers, believing that providers can access and modify confidential data such as bank accounts. Much work has been done in this context, concluding that encryption methods implemented by providers ensure confidentiality, but overlooking the fact that cloud providers can also decrypt the confidential resources. The best solution is therefore to apply operations to the data before sending them to the provider's cloud, with the objective of making them unreadable. The principal idea is to allow users to protect their data with their own methods. In this paper, we demonstrate our approach and show that it is more efficient, in terms of execution time, than some existing methods. This work aims at enhancing the quality of service of providers and ensuring the trust of customers.
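
The general principle of client-side protection before upload can be sketched with symmetric encryption, assuming the Python cryptography package; Fernet here is a stand-in for the paper's own operations, which are not reproduced.

```python
# Client-side encryption sketch: data become unreadable *before* reaching the
# cloud provider, and the key never leaves the user.
from cryptography.fernet import Fernet

key = Fernet.generate_key()        # kept by the user, never sent to the provider
f = Fernet(key)

confidential = b"IBAN: DE89 3704 0044 0532 0130 00"
blob = f.encrypt(confidential)     # this ciphertext is what gets uploaded

# ... later, after downloading the blob back from the provider ...
assert f.decrypt(blob) == confidential
print(len(blob), "bytes stored at the provider, unreadable without the key")
```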

Keywords: Confidentiality, cryptography, security issues, trust issues.

8750 Maintenance of Philosophical, Humanistic and Religious Values of Security of the Kazakh Nation

Authors: K. K. Kaldybay, T. K. Abdrassilov, G. K. Abdygalieva, P. M. Suleymenov, M. O. Nassimov

Abstract:

People have always needed to believe in some supernatural power that could explain natural phenomena. Different religions, such as Christianity, Hinduism, Islam, and Buddhism, have taught believers all over the world how to behave. We think the most important role of religion in modern society is the safety of the people. World and traditional religions have played a prominent role in socio-cultural progress and in the development of man as a spiritual being. At the heart of religious morals lies the belief in God and responsibility before Him, which specifies religious and ethical values and categories. Religion is based on ethical standards, requirements, and concepts historically developed by society, but it makes all social and moral relations of the person dependent on religious values. For everything the believer does out of duty, he bears moral responsibility before his conscience, other people, and God. The concept of value takes the central place in religious morals because, of all forms of public consciousness, religion is the most value-laden, as it is called upon to answer vital questions. Any religion not only considers questions of the creation of the world, the meaning of human existence, and the relationship between God and the person, but also offers an ethical concept and develops rules of behavior. Religion long dominated the history of culture, and during this time it created a set of cultural and material values. The identity of Kazakh culture can be defined through cultural, traditional, and national identity and through the values developed by the Kazakh people in the course of cultural-historical development, which promote the formation of Kazakh cultural identity in public consciousness. Identity is a historical process, but tradition always exists within it as a component of stability and of the self from which this identity is formed.

Keywords: Philosophy, religion, education, culture, human, national value, security, religious value.

8749 Prediction Heating Values of Lignocellulosics from Biomass Characteristics

Authors: Kaltima Phichai, Pornchanoke Pragrobpondee, Thaweesak Khumpart, Samorn Hirunpraditkoon

Abstract:

The paper provides biomass characteristics from proximate analysis (volatile matter, fixed carbon, and ash) and ultimate analysis (carbon, hydrogen, nitrogen, and oxygen) for the prediction of heating value equations. Such heating value estimation for various biomasses can be used for energy evaluation. Thirteen types of biomass were studied. Proximate analysis was carried out by the mass loss method and an infrared moisture analyzer, and ultimate analysis was performed with a CHNO analyzer. The heating values varied from 15 to 22.4 MJ kg-1. Correlations of the calculated heating value with proximate and ultimate analyses were derived using multiple regression analysis and summarized into three and two equations, respectively. The correlations based on proximate analysis showed a higher deviation of calculated heating values from experimental heating values than the correlations based on ultimate analysis.
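
The ultimate-analysis correlation has the generic linear form HHV = a·C + b·H + c·N + d·O + e; a least-squares fit of that form is sketched below with made-up compositions (the paper's fitted coefficients are not reproduced here).

```python
# Multiple linear regression of higher heating value (MJ/kg) on ultimate
# analysis (wt% C, H, N, O). All compositions and HHVs below are synthetic.
import numpy as np

comp = np.array([[48.5, 6.0, 0.5, 44.0],
                 [45.2, 5.6, 0.9, 47.1],
                 [50.1, 6.2, 0.3, 42.2],
                 [42.8, 5.1, 1.2, 49.7],
                 [47.0, 5.8, 0.6, 45.4],
                 [46.3, 5.9, 0.4, 46.0],
                 [49.4, 6.1, 0.7, 43.0],
                 [44.0, 5.4, 1.0, 48.3]])
hhv = np.array([19.1, 17.4, 20.3, 15.8, 18.2, 18.0, 19.6, 16.5])

A = np.hstack([comp, np.ones((len(comp), 1))])    # append intercept column
coef, *_ = np.linalg.lstsq(A, hhv, rcond=None)
print("coefficients:", np.round(coef, 3))
print("max deviation:", round(np.abs(A @ coef - hhv).max(), 3), "MJ/kg")
```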

Keywords: Heating value equation, Proximate analysis, Ultimate analysis.

8748 A Paradigm Shift towards Personalized and Scalable Product Development and Lifecycle Management Systems in the Aerospace Industry

Authors: David E. Culler, Noah D. Anderson

Abstract:

Integrated systems for product design, manufacturing, and lifecycle management are difficult to implement and customize. Commercial software vendors, including CAD/CAM and third-party PDM/PLM developers, create user interfaces and functionality that allow their products to be applied across many industries. The result is that systems become overloaded with functionality, are difficult to navigate, and use terminology that is unfamiliar to engineers and production personnel. For example, manufacturers of automotive, aeronautical, electronics, and household products use similar but distinct methods and processes, and each company tends to have its own preferred tools and programs for controlling work and information flow and for connecting design, planning, and manufacturing processes to business applications. This paper presents a methodology and a case study that address these issues, and it suggests that in the future more companies will develop personalized applications that fit the natural way their business operates. A functioning system has been implemented at a highly competitive U.S. aerospace tooling and component supplier that works with many prominent aircraft manufacturers around the world, including The Boeing Company, Airbus, Embraer, and Bombardier Aerospace. During the last three years, the program has produced significant benefits, such as the automatic creation and management of component and assembly designs (parametric models and drawings), the extensive use of lightweight 3D data, and changes to the way projects are executed from beginning to end. CATIA (CAD/CAE/CAM) and a variety of programs developed in C#, VB.Net, HTML, and SQL make up the current system. The web-based platform is facilitating collaborative work across multiple sites around the world and improving communications with customers and suppliers. This work demonstrates that the creative use of Application Programming Interface (API) utilities, libraries, and methods is key to automating many time-consuming tasks and linking applications together.

Keywords: CAD/CAM, CAPP, PDM, PLM, Scalable Systems.

8747 Generation of Artificial Earthquake Accelerogram Compatible with Spectrum using the Wavelet Packet Transform and Neuro-Fuzzy Networks

Authors: Peyman Shadman Heidari, Mohammad Khorasani

Abstract:

The principal purpose of this article is to present a new method based on the Adaptive Neuro-Fuzzy Inference System (ANFIS) for generating additional artificial earthquake accelerograms from presented data, compatible with specified response spectra. The proposed method uses the learning abilities of ANFIS to develop knowledge of the inverse mapping from response spectrum to earthquake record. In addition, the wavelet packet transform is used to decompose specified earthquake records, and ANFISs are trained to relate the response spectra of the records to their wavelet packet coefficients. Finally, an illustrative example using an ensemble of recorded accelerograms is presented to demonstrate the effectiveness of the proposed method.
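
The decomposition stage feeding the ANFIS training can be sketched with PyWavelets; the record below is synthetic noise standing in for a real accelerogram, and the ANFIS mapping itself is not reproduced.

```python
# Wavelet packet decomposition of a (synthetic) accelerogram: the level-3 leaf
# coefficients are the targets an ANFIS would learn from the response spectrum.
import numpy as np
import pywt

rng = np.random.default_rng(9)
accel = rng.standard_normal(1024)          # stand-in for a recorded accelerogram

wp = pywt.WaveletPacket(data=accel, wavelet="db4", mode="symmetric", maxlevel=3)
leaves = wp.get_level(3, order="natural")  # 2**3 = 8 frequency sub-bands
print([node.path for node in leaves])

# Perfect-reconstruction check: rebuilding from the leaves recovers the record.
wp2 = pywt.WaveletPacket(data=None, wavelet="db4", mode="symmetric")
for node in leaves:
    wp2[node.path] = node.data
print(np.allclose(wp2.reconstruct(update=False)[:accel.size], accel))
```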

Keywords: Adaptive Neuro-Fuzzy Inference System, Wavelet Packet Transform, Response Spectrum.

8746 Organisational Blogging: Reviewing Its Effectiveness as an Organisational Learning Tool

Authors: Gavin J. Baxter, Mark H. Stansfield

Abstract:

This paper reviews the internal use of blogs and their potential effectiveness as organisational learning tools. Since the emergence of the concept of ‘Enterprise 2.0’, there remains a lack of empirical evidence on how organisations are applying social media tools and whether these tools are effective in supporting organisational learning. Surprisingly, blogs, one of the more traditional social media tools, remain under-researched in the context of ‘Enterprise 2.0’ and organisational learning. The aim of this paper is to identify the theoretical linkage between blogs and organisational learning, to review prior research on organisational blogging, and to explore why this area remains under-researched. Through a literature review, one of the principal findings of this paper is that organisational blogs have a mutual compatibility with the interpretivist aspect of organisational learning. The paper further advocates that additional empirical work in this subject area is required to substantiate this theoretical assumption.

Keywords: Blogs, Enterprise 2.0, Organisational Learning, Social Media Tools.

8745 Perception-Oriented Model Driven Development for Designing Data Acquisition Process in Wireless Sensor Networks

Authors: K. Indra Gandhi

Abstract:

Wireless Sensor Networks (WSNs) have always been characterized by application-specific sensing, relaying, and collection of information for further analysis. However, software development has not been considered a separate entity in this data collection process, which has posed severe limitations on software development for WSNs. Software development for WSNs is a complex process, since the components involved are data-driven, network-driven, and application-driven in nature, implying a tremendous need for separation of concerns from the software development perspective. A layered approach to designing the data acquisition process based on Model Driven Development (MDD) is proposed, as the sensed data collection process itself varies depending on the application under consideration. This work focuses on a layered view of the data acquisition process so as to ease software development. A metamodel is proposed that enables reusability and the realization of the software as an adaptable component for WSN systems. Further, observation of users' perception indicates that the proposed model helps improve programmer productivity by realizing the collaborative system involved.

Keywords: Model-driven development, wireless sensor networks, data acquisition, separation of concern, layered design.
