Search results for: CAM - Computer Aided Manufacturing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2129

1529 Making Data Structures and Algorithms more Understandable by Programming Sudoku the Human Way

Authors: Roelien Goede

Abstract:

Data Structures and Algorithms is a module in most Computer Science or Information Technology curricula. It is one of the modules students most often identify as difficult. This paper demonstrates how programming a solution for Sudoku can make abstract concepts more concrete. The paper relates concepts of a typical Data Structures and Algorithms module to a step-by-step solution for Sudoku written the way a human would solve it, as opposed to a computer-oriented solution.
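As a concrete illustration of the kind of "human-way" step the paper advocates, the following minimal Python sketch (not taken from the paper) fills in cells that have exactly one remaining candidate, assuming the grid is a 9x9 list of lists with 0 for empty cells:

```python
# Hypothetical sketch (not the paper's code): one "human-style" pass over a
# Sudoku grid, filling cells that have exactly one remaining candidate.
def candidates(grid, r, c):
    """Digits 1-9 not yet used in the row, column, or 3x3 box of (r, c)."""
    used = set(grid[r]) | {grid[i][c] for i in range(9)}
    br, bc = 3 * (r // 3), 3 * (c // 3)
    used |= {grid[i][j] for i in range(br, br + 3) for j in range(bc, bc + 3)}
    return {d for d in range(1, 10) if d not in used}

def naked_singles_pass(grid):
    """Place every value that is the only candidate for its cell; return count placed."""
    placed = 0
    for r in range(9):
        for c in range(9):
            if grid[r][c] == 0:
                cand = candidates(grid, r, c)
                if len(cand) == 1:
                    grid[r][c] = cand.pop()
                    placed += 1
    return placed
```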

Keywords: Data Structures, Algorithms, Sudoku, Object-Oriented Programming, Programming Teaching, Education.

1528 Prediction of Computer and Video Game Playing Population: An Age Structured Model

Authors: T. K. Sriram, Joydip Dhar

Abstract:

Models based on stage structure have found varied applications in population modeling. This paper proposes a stage-structured model to study trends in the computer and video game playing population of the US. The game playing population is divided into three compartments based on age group. After simulating the mathematical model, a forecast of the number of game players in each stage, as well as an approximation of the average age of game players in the future, is made.
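The abstract does not give the model equations; the toy Python sketch below shows what a discrete-time, three-stage compartment model of this kind can look like. The stage labels, rates, and initial values are hypothetical, not the paper's calibrated parameters.

```python
# Illustrative three-stage, discrete-time model of a game-playing population.
def simulate(years=10, young=30.0, adult=60.0, senior=10.0,
             recruit=2.0, age_ya=0.10, age_as=0.05, quit_rate=0.02):
    """Return yearly (young, adult, senior) player counts in millions."""
    history = [(young, adult, senior)]
    for _ in range(years):
        young_new = young + recruit - age_ya * young - quit_rate * young
        adult_new = adult + age_ya * young - age_as * adult - quit_rate * adult
        senior_new = senior + age_as * adult - quit_rate * senior
        young, adult, senior = young_new, adult_new, senior_new
        history.append((young, adult, senior))
    return history

for year, (y, a, s) in enumerate(simulate()):
    print(f"year {year}: young={y:.1f}M adult={a:.1f}M senior={s:.1f}M")
```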

Keywords: Age structure, Forecasting, Mathematical modeling, Stage structure.

1527 Analysis of Modified Heap Sort Algorithm on Different Environment

Authors: Vandana Sharma, Parvinder S. Sandhu, Satwinder Singh, Baljit Saini

Abstract:

In Computer Science and Mathematics, a sorting algorithm is an algorithm that puts the elements of a list in a certain order, i.e. ascending or descending. Sorting is perhaps the most widely studied problem in computer science and is frequently used as a benchmark of a system's performance. This paper presents a comparative performance study of four sorting algorithms on different platforms. For each machine, it is found that performance depends on the number of elements to be sorted. In addition, as expected, the results show that the relative performance of the algorithms differed on the various machines. Thus, algorithm performance depends on data size, and hardware also has an impact.
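For reference, a classical in-place heap sort with a simple timing harness is sketched below in Python; the paper's modified variant is not reproduced here.

```python
# Classical (unmodified) heap sort plus a crude benchmark of sorting time.
import random, time

def heapsort(a):
    def sift_down(start, end):
        root = start
        while 2 * root + 1 <= end:
            child = 2 * root + 1
            if child + 1 <= end and a[child] < a[child + 1]:
                child += 1
            if a[root] < a[child]:
                a[root], a[child] = a[child], a[root]
                root = child
            else:
                return
    n = len(a)
    for start in range(n // 2 - 1, -1, -1):   # build the max-heap
        sift_down(start, n - 1)
    for end in range(n - 1, 0, -1):           # repeatedly extract the maximum
        a[0], a[end] = a[end], a[0]
        sift_down(0, end - 1)

data = [random.random() for _ in range(100_000)]
t0 = time.perf_counter()
heapsort(data)
print(f"sorted {len(data)} elements in {time.perf_counter() - t0:.3f} s")
```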

Keywords: Algorithm, Analysis, Complexity, Sorting.

1526 Noninvasive Brain-Machine Interface to Control Both Mecha TE Robotic Hands Using Emotiv EEG Neuroheadset

Authors: Adrienne Kline, Jaydip Desai

Abstract:

Electroencephalogram (EEG) is a noninvasive technique that registers signals originating from the firing of neurons in the brain. The Emotiv EEG Neuroheadset is a consumer product comprising 14 EEG channels and was used to record the reactions of neurons within the brain to two forms of stimuli in 10 participants. These stimuli consisted of auditory and visual formats that provided directions of ‘right’ or ‘left.’ Participants were instructed to raise their right or left arm in accordance with the instruction given. A scenario in OpenViBE was generated to stimulate the participants while recording their data. In OpenViBE, the Graz Motor BCI Stimulator algorithm was configured to govern the duration and number of visual stimuli. Utilizing EEGLAB under cross-platform MATLAB®, the electrodes most stimulated during the study were identified. Data outputs from EEGLAB were analyzed using IBM SPSS Statistics® Version 20. This aided in determining the electrodes to use in the development of a brain-machine interface (BMI) using real-time EEG signals from the Emotiv EEG Neuroheadset. Signal processing and feature extraction were accomplished via the Simulink® signal processing toolbox. An Arduino™ Duemilanove microcontroller was used to link the Emotiv EEG Neuroheadset and the right and left Mecha TE™ Hands.
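A hypothetical sketch of the final control step is shown below (the authors used Simulink and MATLAB, not Python): a left/right command is derived from the mu-band power of two EEG channels and forwarded to an Arduino over serial. The channel indices, 128 Hz sampling rate, band limits, threshold rule, and serial port are assumptions.

```python
# Hypothetical control-step sketch, not the authors' Simulink model.
import numpy as np
from scipy.signal import welch
import serial  # pyserial

FS = 128                                               # assumed sampling rate
port = serial.Serial("/dev/ttyUSB0", 9600, timeout=1)  # hypothetical Arduino port

def mu_band_power(x, fs=FS, band=(8.0, 12.0)):
    """Average power of a 1-D EEG segment in the mu band."""
    f, pxx = welch(x, fs=fs, nperseg=fs)
    mask = (f >= band[0]) & (f <= band[1])
    return pxx[mask].mean()

def classify(left_ch, right_ch):
    """Crude laterality rule: lower mu power ~ stronger motor activity."""
    return b"L" if mu_band_power(left_ch) < mu_band_power(right_ch) else b"R"

segment = np.random.randn(2, FS * 2)          # placeholder 2-second, 2-channel window
port.write(classify(segment[0], segment[1]))  # Arduino maps 'L'/'R' to a hand
```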

Keywords: Brain-machine interface, EEGLAB, Emotiv EEG Neuroheadset, OpenViBE, Simulink.

1525 Applications of Stable Distributions in Time Series Analysis, Computer Sciences and Financial Markets

Authors: Mohammad Ali Baradaran Ghahfarokhi, Parvin Baradaran Ghahfarokhi

Abstract:

In this paper, we first introduce the stable distribution, the stable process, and their characteristics. The α-stable distribution family has received great interest in the last decade due to its success in modeling data that are too impulsive to be accommodated by the Gaussian distribution. In the second part, we describe major applications of the α-stable distribution in telecommunications, in computer science (such as network delays and signal processing), and in financial markets. Finally, we focus on using the stable distribution to estimate a measure of risk in stock markets and show simulated data with statistical software.
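As a minimal illustration of the last point, the sketch below (not the paper's code) simulates α-stable "returns" with SciPy and estimates a one-day Value at Risk as an empirical quantile; the parameter values are purely illustrative.

```python
# Simulate heavy-tailed returns and read off an empirical VaR.
import numpy as np
from scipy.stats import levy_stable

alpha, beta = 1.7, 0.0            # heavy-tailed, symmetric (SaS) case
returns = levy_stable.rvs(alpha, beta, loc=0.0, scale=0.01, size=100_000)

confidence = 0.99
var_99 = -np.quantile(returns, 1 - confidence)   # loss at the 1% left tail
print(f"Simulated 99% VaR: {var_99:.4f} (fraction of portfolio value)")
```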

Keywords: Stable distribution, SαS, infinite variance, heavy-tail networks, VaR.

1524 Comparison of Different Types of Sources of Traffic Using SFQ Scheduling Discipline

Authors: Alejandro Gomez Suarez, H. Srikanth Kamath

Abstract:

In this paper, the SFQ (Start-time Fair Queuing) algorithm is analyzed when applied in computer networks, in order to understand how traffic behaves when different data sources are managed by the scheduler. The computer networks were simulated with the NS2 software to obtain graphs showing the performance of the scheduler. Different traffic sources were introduced in the scripts to approximate a realistic scenario. The results show that traffic is affected to different degrees depending on the data source: when a Constant Bit Rate source is applied, the scheduler ensures a constant level of data sent and received, although in practice it is impossible to guarantee a level that withstands changes in workload.
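To make the scheduling discipline concrete, here is a conceptual Python sketch of SFQ start/finish tag computation, not an NS2 script; flow names, packet sizes, and weights are illustrative.

```python
# Start-time Fair Queuing: packets get start/finish tags and are served in
# increasing start-tag order; virtual time follows the packet in service.
import heapq

class SFQ:
    def __init__(self):
        self.virtual_time = 0.0
        self.last_finish = {}          # per-flow finish tag of the last packet
        self.queue = []                # (start_tag, seq, flow, length)
        self.seq = 0

    def enqueue(self, flow, length, weight):
        start = max(self.virtual_time, self.last_finish.get(flow, 0.0))
        self.last_finish[flow] = start + length / weight
        heapq.heappush(self.queue, (start, self.seq, flow, length))
        self.seq += 1

    def dequeue(self):
        start, _, flow, length = heapq.heappop(self.queue)
        self.virtual_time = start      # v(t) tracks the packet in service
        return flow, length

sched = SFQ()
for _ in range(3):
    sched.enqueue("cbr", 1000, weight=1)    # constant bit rate source
    sched.enqueue("ftp", 1500, weight=2)    # bulk transfer source
while sched.queue:
    print(sched.dequeue())
```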

Keywords: CBQ, CBR, NAM, NS2.

1523 Object-Oriented Simulation of Simulating Anticipatory Systems

Authors: Eugene Kindler

Abstract:

The present paper addresses problems in the simulation of anticipatory systems, namely those that use simulation models as an aid to anticipation. A certain analogy between the use of simulation and imagining is applied to make the explanation more comprehensible. The paper is completed by notes on open problems and on some existing applications. The problems stem from the fact that simulating such anticipatory systems is, in the end, simulation of simulating systems, i.e. computer models must handle two or more modeled time axes that should be mapped to the real time flow in a non-descending manner. Languages oriented to objects, processes and blocks can be used to surmount these problems.
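A toy Python illustration of the nesting described above (not from the paper): an outer model advances on its own time axis and, at each decision point, runs an inner simulation on a second, nested time axis to anticipate the outcome of each candidate action. All quantities are invented for the example.

```python
# One simulating system nested inside another: the inner simulation is the
# "imagined" model the outer system uses to anticipate.
def inner_simulation(state, action, horizon=5):
    """Anticipatory model: project `state` forward under `action`."""
    level = state
    for _ in range(horizon):                 # nested (imagined) time axis
        level += action - 1                  # action 0 drains, 2 fills
    return level

def outer_simulation(steps=10, target=10):
    level = 0
    for t in range(steps):                   # real (outer) time axis
        # choose the action whose anticipated outcome lands closest to target
        action = min((0, 1, 2),
                     key=lambda a: abs(inner_simulation(level, a) - target))
        level += action - 1
        print(f"t={t}: chose action {action}, level now {level}")

outer_simulation()
```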

Keywords: Anticipatory systems, Nested computer models, Discrete event simulation, Simula.

1522 Risk Assessment in Durations and Costs for Construction of Industrial Facilities in Egypt Using Equations and Computer

Authors: M. Kamal Elbokl, Negadi Kheira

Abstract:

Risk evaluation is an important step in protecting your workers and your business, as well as complying with the law. It helps you focus on the risks that really matter in your workplace, the ones with the potential to cause real harm. In this paper we introduce the basics of risk assessment and then describe ways of evaluating risk by computer, especially Monte Carlo simulation and Microsoft Project.

We use the Program Evaluation and Review Technique (PERT) to evaluate and assess risks in industrial facilities. Using the PERT technique in Microsoft Project (via the PERT toolbar) and the PERTMASTER program together with Primavera, we evaluate many hazards and carry out the corresponding calculations with mathematical equations in order to make the right decisions. We define and calculate the risk factor and risk severity to rank the type of risk, and then deal with it in several ways, such as probability computation, curves, and tables. By introducing variables into the equations of functions in computer programs, we calculate the risk in time and cost for the general case and then present some examples from the field of industrial facilities.
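The two calculations mentioned above can be illustrated as follows; the sketch uses the standard PERT three-point formulas and a Monte Carlo run over a hypothetical three-activity serial schedule, not the paper's project data.

```python
# PERT three-point estimate plus a Monte Carlo duration run for a toy schedule.
import random

def pert_estimate(optimistic, most_likely, pessimistic):
    mean = (optimistic + 4 * most_likely + pessimistic) / 6
    std = (pessimistic - optimistic) / 6
    return mean, std

activities = [(4, 6, 10), (8, 10, 16), (3, 5, 9)]     # (O, M, P) in days
for o, m, p in activities:
    print("PERT mean/std:", pert_estimate(o, m, p))

def monte_carlo_duration(trials=10_000):
    totals = []
    for _ in range(trials):
        # triangular sampling is a common stand-in for the beta-PERT density
        totals.append(sum(random.triangular(o, p, m) for o, m, p in activities))
    totals.sort()
    return totals[int(0.5 * trials)], totals[int(0.9 * trials)]

p50, p90 = monte_carlo_duration()
print(f"Project duration: P50 = {p50:.1f} days, P90 = {p90:.1f} days")
```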

Keywords: Risk, Industrial Facilities, PERT, Monte Carlo Simulation.

1521 The Application of Non-quantitative Modelling in the Analysis of a Network Warfare Environment

Authors: N. Veerasamy, JPH Eloff

Abstract:

Network warfare is an emerging concept that focuses on the network- and computer-based forms through which information is attacked and defended. Various computer and network security concepts thus play a role in network warfare. Due to the intricacy of the various interacting components, a model to better understand the complexity of a network warfare environment would be beneficial. Non-quantitative modeling is a useful method to characterize the field due to the rich ideas that can be generated based on the use of secular associations, chronological origins, linked concepts, categorizations and context specifications. This paper proposes the use of non-quantitative methods, through a morphological analysis, to better explore and define the influential conditions in a network warfare environment.

Keywords: Morphological, non-quantitative, network warfare.

1520 Prediction of Optimum Cutting Parameters to Obtain Desired Surface in Finish Pass End Milling of Aluminium Alloy with Carbide Tool Using Artificial Neural Network

Authors: Anjan Kumar Kakati, M. Chandrasekaran, Amitava Mandal, Amit Kumar Singh

Abstract:

End milling is one of the common metal cutting operations used for machining parts in the manufacturing industry. It is usually performed at the final stage of manufacturing a product, and the surface roughness of the produced job plays an important role. In general, surface roughness affects wear resistance, ductility, tensile and fatigue strength, etc., of machined parts and cannot be neglected in design. In the present work, an experimental investigation of end milling of an aluminium alloy with a carbide tool is carried out, and the effect of different cutting parameters on the response is studied with three-dimensional surface plots. An artificial neural network (ANN) is used to establish the relationship between the surface roughness and the input cutting parameters (i.e., spindle speed, feed, and depth of cut). The MATLAB ANN toolbox, which works on a feed-forward back-propagation algorithm, is used for modeling. A 3-12-1 network structure having the minimum average prediction error was found to be the best network architecture for predicting the surface roughness value. The network predicts surface roughness well for unseen data. For a desired surface finish of the component to be produced, many different combinations of cutting parameters are available. The optimum cutting parameters for obtaining the desired surface finish while maximizing tool life are predicted. The methodology is demonstrated, a number of problems are solved, and the algorithm is coded in MATLAB®.
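A rough scikit-learn stand-in for the 3-12-1 network described above is sketched below; the authors used the MATLAB ANN toolbox, and the training data here are synthetic placeholders rather than the experimental measurements.

```python
# 3 inputs (speed, feed, depth of cut) -> 12 hidden units -> 1 output (Ra).
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# columns: spindle speed (rpm), feed (mm/tooth), depth of cut (mm)
X = rng.uniform([1000, 0.05, 0.5], [4000, 0.20, 2.0], size=(200, 3))
# synthetic surface roughness values standing in for measured Ra
Ra = 40 * X[:, 1] + 0.3 * X[:, 2] - 0.0001 * X[:, 0] + rng.normal(0, 0.05, 200)

scaler = StandardScaler().fit(X)
model = MLPRegressor(hidden_layer_sizes=(12,), activation="tanh",
                     solver="lbfgs", max_iter=5000, random_state=0)
model.fit(scaler.transform(X), Ra)

query = scaler.transform([[2500, 0.10, 1.0]])
print("Predicted surface roughness Ra:", model.predict(query)[0])
```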

Keywords: End milling, Surface roughness, Neural networks.

1519 A Study on Remote On-Line Diagnostic System for Vehicles by Integrating the Technology of OBD, GPS, and 3G

Authors: Jyong Lin, Shih-Chang Chen, Yu-Tsen Shih, Shi-Huang Chen

Abstract:

This paper presents a remote on-line diagnostic system for vehicles via the use of On-Board Diagnostics (OBD), GPS, and 3G techniques. The main parts of the proposed system are the on-board computer, the vehicle monitor server, and the vehicle status browser. First, the on-board computer obtains the location of the driver and the vehicle status from the GPS receiver and the OBD interface, respectively. The on-board computer then connects to the vehicle monitor server through the 3G network to transmit the real-time vehicle system status. Finally, the vehicle status browser shows the remote vehicle status, including vehicle speed, engine rpm, battery voltage, engine coolant temperature, and diagnostic trouble codes. According to the experimental results, the proposed system can help fleet managers and car knockers understand the remote vehicle status. Therefore, this system can decrease fleet management and vehicle repair time, because fleet managers and car knockers can find the diagnostic trouble messages in time.
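The OBD-II decoding step can be sketched as follows using the standard SAE J1979 Mode 01 conversion formulas; the reader/serial plumbing, GPS parsing, and 3G upload described in the paper are omitted, and the example byte payloads are hypothetical.

```python
# Decode a few Mode 01 OBD-II payloads (data bytes only) into physical values.
def decode_pid(pid, data):
    A = data[0]
    B = data[1] if len(data) > 1 else 0
    if pid == 0x0C:                 # engine RPM
        return ((A << 8) + B) / 4.0
    if pid == 0x0D:                 # vehicle speed, km/h
        return float(A)
    if pid == 0x05:                 # engine coolant temperature, deg C
        return float(A - 40)
    if pid == 0x42:                 # control module (battery) voltage, V
        return ((A << 8) + B) / 1000.0
    raise ValueError(f"PID {pid:#04x} not handled in this sketch")

# Hypothetical raw payloads, as an OBD reader might return them.
print("RPM:", decode_pid(0x0C, [0x1A, 0xF8]))      # -> 1726.0
print("Speed:", decode_pid(0x0D, [0x3C]))          # -> 60.0 km/h
print("Coolant:", decode_pid(0x05, [0x5A]))        # -> 50.0 deg C
```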

Keywords: Diagnostic Trouble Code (DTC), Electronic Control Unit (ECU), Global Position System (GPS), On-Board Diagnostic (OBD).

1518 Combining Skin Color and Optical Flow for Computer Vision Systems

Authors: Muhammad Raza Ali, Tim Morris

Abstract:

Skin color is an important visual cue for computer vision systems involving human users. In this paper we combine skin color and optical flow for detection and tracking of skin regions. We apply these techniques to gesture recognition with encouraging results. We propose a novel skin similarity measure. For grouping detected skin regions we propose a novel skin region grouping mechanism. The proposed techniques work with any number of skin regions making them suitable for a multiuser scenario.
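An illustrative OpenCV sketch of the combination, not the authors' implementation, is given below; the chromaticity thresholds and the flow-magnitude cut-off are arbitrary demonstration values.

```python
# Combine a rough chromaticity-based skin mask with dense optical flow to
# highlight moving skin regions from a webcam.
import cv2
import numpy as np

def skin_mask(bgr):
    """Rough skin detection in normalized r-g chromaticity space."""
    f = bgr.astype(np.float32) + 1e-6
    total = f.sum(axis=2)
    r, g = f[:, :, 2] / total, f[:, :, 1] / total
    mask = (r > 0.36) & (r < 0.47) & (g > 0.28) & (g < 0.36)
    return mask.astype(np.uint8) * 255

cap = cv2.VideoCapture(0)                      # any available camera
ok, prev = cap.read()
if not ok:
    raise RuntimeError("no camera available")
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    moving = (np.linalg.norm(flow, axis=2) > 1.0).astype(np.uint8) * 255
    cv2.imshow("moving skin regions", cv2.bitwise_and(skin_mask(frame), moving))
    prev_gray = gray
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
```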

Keywords: Bayesian tracking, chromaticity space, optical flow, gesture recognition.

1517 Analysis of Histogram Asymmetry for Waste Recognition

Authors: Janusz Bobulski, Kamila Pasternak

Abstract:

Despite many years of effort and research, the problem of waste management is still current. There is a lack of fast and effective algorithms for classifying individual waste fractions. Many programs and projects improve the statistics on the percentage of waste recycled every year. In these efforts, it is worth using modern computer vision techniques supported by artificial intelligence. In this article, we present a method of identifying plastic waste based on asymmetry analysis of the histogram of the image containing the waste. The method is simple but effective (94%), which allows it to be implemented on devices with low computing power, in particular on microcomputers. Such devices can be used both at home and in waste sorting plants.
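The general idea can be sketched as follows (the 94% accuracy comes from the authors' pipeline, not from this code): compute the asymmetry (skewness) of the image intensity histogram and compare it against a decision threshold. The threshold and file name below are hypothetical.

```python
# Histogram-asymmetry check: skewness of the grayscale intensity distribution.
import numpy as np
from PIL import Image

def histogram_skewness(path):
    gray = np.asarray(Image.open(path).convert("L"), dtype=np.float64).ravel()
    mean, std = gray.mean(), gray.std()
    return ((gray - mean) ** 3).mean() / std ** 3

THRESHOLD = 0.5                                  # hypothetical decision boundary
skew = histogram_skewness("waste_sample.jpg")    # placeholder file name
print("plastic" if abs(skew) > THRESHOLD else "other", f"(skewness={skew:.2f})")
```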

Keywords: Computer vision, environmental protection, image processing, waste management.

1516 Occurrence of Foreign Matter in Food: Applied Identification Method - Association of Official Agricultural Chemists (AOAC) and Food and Drug Administration (FDA)

Authors: E. C. Mattos, V. S. M. G. Daros, R. Dal Col, A. L. Nascimento

Abstract:

The aim of this study is to present the results of a retrospective survey of the foreign matter found in foods analyzed at the Adolfo Lutz Institute from July 2001 to July 2015. All the analyses were conducted according to the official methods described by the Association of Official Agricultural Chemists (AOAC) for the microanalytical procedures and by the Food and Drug Administration (FDA) for the macroanalytical procedures. The results showed that flours, cereals and derivatives such as baking and pasta products were the types of food in which foreign matter was found most frequently, followed by condiments and teas. Fragments of stored-grain insects, their larvae, nets, excrement, dead mites and rodent excrement were the foreign matter found most often in food. Foreign matter that can pose a physical risk to the consumer’s health, such as metal, stones, glass and wood, was found, but rarely. Miscellaneous items (shell, sand, dirt and seeds) were also reported. Many extraneous materials are considered unavoidable since they are inherent to the product itself, such as insect fragments in grains. In contrast, avoidable extraneous materials are less tolerated because they are preventable with Good Manufacturing Practice. The conclusion of this work is that, although most extraneous materials found in food are considered unavoidable, it is necessary to maintain Good Manufacturing Practice throughout food processing, as well as constant surveillance of the production process, in order to avoid accidents that may lead to the occurrence of these extraneous materials in food.

Keywords: Food contamination, extraneous materials, foreign matter, surveillance.

1515 A Multi Steps Algorithm for Sperm Segmentation in Microscopic Image

Authors: Fereidoon Nowshiravan Rahatabad, Mohammad Hassan Moradi, Vahid Reza Nafisi

Abstract:

Since an effective treatment for infertility depends on finding the right solution, a great deal of study has been done in this field, and it remains an active research subject today. Analyzing a man's semen makes it possible to assess fertility and infertility and, from there, to work toward a treatment; because this is a noninvasive and low-risk procedure, it is greatly welcomed. In this research, the procedure is based on a few enhancement and segmentation algorithms applied to images taken with a microscope at different fertility institutions; suitable results were obtained from the computed images, which in turn help us to distinguish the sperms from the fluid and their surroundings.

Keywords: Computer-Assisted Sperm Analysis (CASA), Sperm identification, Segmentation.

1514 Expressive Modes and Species of Language

Authors: Richard Elling Moe

Abstract:

Computer languages are usually lumped together into broad 'paradigms', leaving us in want of a finer classification of kinds of language. Theories distinguishing 'genuine differences' between languages have been called for, and we propose that such differences can be observed through a notion of expressive mode. We outline this concept, propose how it could be operationalized, and indicate a possible context for the development of a corresponding theory. Finally, we consider a possible application in connection with the evaluation of language revision. We illustrate this with a case, investigating possible revisions of the relational algebra in order to overcome weaknesses of the division operator in connection with universal queries.
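For readers unfamiliar with the operator in question, the small Python sketch below illustrates relational division, the operator behind "universal" queries; it does not attempt the revision the paper discusses, and the example relations are invented.

```python
# R(student, course) divided by S(course) yields the students enrolled in
# every course of S, i.e. the answer to a universally quantified query.
def divide(r, s):
    """R / S for R a set of (x, y) pairs and S a set of y values."""
    xs = {x for x, _ in r}
    return {x for x in xs if all((x, y) in r for y in s)}

enrolled = {("ann", "db"), ("ann", "os"), ("bob", "db"), ("cid", "db"), ("cid", "os")}
required = {"db", "os"}
print(divide(enrolled, required))     # -> {'ann', 'cid'}
```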

Keywords: Expressive mode, Computer language species, Evaluation of revision, Relational algebra, Universal database queries

1513 Forces Association-Based Active Contour

Authors: Aicha Baya Goumeidane, Nafaa Nacereddine

Abstract:

A welded structure must be inspected to guarantee that the weld quality meets the design requirements and to assure safety and reliability. However, X-ray image analysis and defect recognition with computer vision techniques are very complex. Most difficulties lie in finding the small, irregular defects in poor-contrast images, which requires preprocessing the image to extract and classify features from strong background noise. This paper addresses the design of a methodology to extract defects from noisy-background radiographs with image processing. Based on the use of active contours, this methodology gives good results.

Keywords: Welding, Radiography, Computer vision, Active contour.

1512 Multi-Layer Perceptron Neural Network Classifier with Binary Particle Swarm Optimization Based Feature Selection for Brain-Computer Interfaces

Authors: K. Akilandeswari, G. M. Nasira

Abstract:

Brain-Computer Interfaces (BCIs) measure brain signal activity, intentionally and unintentionally induced by users, and provide a communication channel that does not depend on the brain's normal output pathway of peripheral nerves and muscles. Feature Selection (FS) is a global optimization machine learning problem that reduces the number of features and removes irrelevant and noisy data while retaining acceptable recognition accuracy. It is a vital step affecting the performance of a pattern recognition system. This study presents a new Binary Particle Swarm Optimization (BPSO) based feature selection algorithm. A Multi-Layer Perceptron Neural Network (MLPNN) classifier, trained with the backpropagation and Levenberg-Marquardt training algorithms, classifies the selected features.
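A hedged sketch of the BPSO-wrapped feature selection loop is given below, using synthetic data and scikit-learn's MLP; the paper's EEG features, Walsh-Hadamard preprocessing, and Levenberg-Marquardt training are not reproduced, and all swarm parameters are illustrative.

```python
# Binary PSO feature selection: each particle is a 0/1 mask over the features,
# and fitness is cross-validated MLP accuracy on the selected columns.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
X, y = make_classification(n_samples=300, n_features=20, n_informative=6,
                           random_state=1)

def fitness(mask):
    if mask.sum() == 0:
        return 0.0
    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=1)
    return cross_val_score(clf, X[:, mask.astype(bool)], y, cv=3).mean()

n_particles, n_feat, w, c1, c2 = 10, X.shape[1], 0.7, 1.5, 1.5
pos = rng.integers(0, 2, (n_particles, n_feat))
vel = rng.normal(0, 1, (n_particles, n_feat))
pbest, pbest_fit = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_fit.argmax()].copy()

for _ in range(15):
    r1, r2 = rng.random((2, n_particles, n_feat))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    # sigmoid of the velocity gives the probability that a bit is set to 1
    pos = (rng.random((n_particles, n_feat)) < 1 / (1 + np.exp(-vel))).astype(int)
    fits = np.array([fitness(p) for p in pos])
    improved = fits > pbest_fit
    pbest[improved], pbest_fit[improved] = pos[improved], fits[improved]
    gbest = pbest[pbest_fit.argmax()].copy()

print("selected features:", np.flatnonzero(gbest), "accuracy:", pbest_fit.max())
```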

Keywords: Brain-Computer Interfaces (BCI), Feature Selection (FS), Walsh–Hadamard Transform (WHT), Binary Particle Swarm Optimization (BPSO), Multi-Layer Perceptron (MLP), Levenberg–Marquardt algorithm.

1511 Retina Based Mouse Control (RBMC)

Authors: Arslan Qamar Malik, Jehanzeb Ahmad

Abstract:

The paper presents a novel idea for controlling computer mouse cursor movement with the human eyes. It describes how the system works and how it helps people with special needs share their knowledge with the world. A number of traditional techniques, such as head and eye movement tracking systems, exist for cursor control and make use of image processing in which light is the primary source. Electro-oculography (EOG) is a newer technology for sensing eye signals with which the mouse cursor can be controlled. The signals captured by the sensors are first amplified, then denoised and digitized, before being transferred to a PC for software interfacing.

Keywords: Human Computer Interaction, Real-Time System, Electro-oculography, Signal Processing.

1510 Green Lean TQM Human Resource Management Practices in Malaysian Automotive Companies

Authors: Noor Azlina Mohd Salleh, Salmiah Kasolang, Ahmed Jaffar

Abstract:

The Green Lean Total Quality Management (LTQM) Human Resource Management (HRM) System comprises HRM within Environmental Management System (EMS) practices, integrated with TQM and Lean Manufacturing (LM) principles. HRM is essential, especially in dealing with poorly motivated and less productive employees. The ultimate goal of this system is to achieve total human resource development, with employees who are motivated and able to optimize their creativity as part of a Green and Lean TQM organization. A survey questionnaire was developed, distributed to 30 highly active automotive vendors in Malaysia, and analyzed with Minitab v16 and SPSS v17. It was found that companies practicing Green LTQM HRM have generated more revenue and have R&D capability. However, the number of years since a company's establishment does not affect its openness to adopting new initiatives that can help improve the effectiveness of its operations. The importance of training, communication and rewards for employees was also confirmed. The Green LTQM HRM practices framework model established in this study will hopefully give preliminary insight, especially to companies that are still looking for a system that can improve their productivity through human resource management. This is a preliminary study that combined four award practices (ISO/TS 16949, Toyota Production System SAE J4000, MAJAICO Lean Production System and EMS), focusing on highly active companies that have been involved in the MAJAICO Program and the Proton Vendor Development Program. Future studies can be conducted to determine the status in other industries, as well as case studies pertaining to this system.

Keywords: Automotive Industry, Lean Manufacturing, Operational Engineering Management, Total Quality Management, Environmental Management System.

1509 VHL, PBRM1 and SETD2 Genes in Kidney Cancer: A Molecular Investigation

Authors: Rozhgar A. Khailany, Mehri Igci, Emine Bayraktar, Sakip Erturhan, Metin Karakok, Ahmet Arslan

Abstract:

Kidney cancer is the most lethal urological cancer, accounting for 3% of adult malignancies. VHL, a tumor-suppressor gene, is best known for its association with renal cell carcinoma (RCC). VHL functions as a negative regulator of hypoxia-inducible factors. Recent sequencing efforts have identified several novel frequent mutations of histone-modifying and chromatin-remodeling genes in ccRCC (clear cell RCC), including PBRM1 and SETD2. The PBRM1 gene encodes the BAF180 protein, which is involved in transcriptional activation and repression of selected genes. SETD2 encodes a histone methyltransferase, which may play a role in suppressing tumor development. In this study, RNAs from 30 paired tumor and normal samples, grouped according to the type of kidney cancer and the clinical characteristics of the patients, including gender and average age, were examined by RT-PCR, SSCP and sequencing techniques. VHL, PBRM1 and SETD2 expression was relatively down-regulated; however, this was not statistically significant (Wilcoxon signed rank test, p>0.05). Interestingly, no mutation was observed, contrary to previous studies. Understanding the molecular mechanisms involved in the pathogenesis of RCC has aided the development of molecular-targeted drugs for kidney cancer. Further analysis is required to identify the responsible genes other than VHL, PBRM1 and SETD2 in kidney cancer.

Keywords: Kidney cancer, molecular biomarker, expression analysis, mutation screening.

1508 Wasting Human and Computer Resources

Authors: Mária Csernoch, Piroska Biró

Abstract:

The legends about “user-friendly” and “easy-to-use” birotical tools (computer-related office tools) have been spreading and misleading end-users. This has led to an extremely high number of incorrect documents, causing serious financial losses in the creating, modifying, and retrieving processes. Our research proved that there are at least two sources of this underachievement: (1) The lack of a definition of correctly edited, formatted documents. Consequently, end-users do not know whether their methods and results are correct or not. They are not aware of their ignorance; indeed, their ignorance does not allow them to realize their lack of knowledge. (2) The end-users' problem-solving methods. We have found that in non-traditional programming environments end-users apply, almost exclusively, surface-approach metacognitive methods to carry out their computer-related activities, which have proved less effective than deep-approach methods. Based on these findings we have developed deep-approach methods which are based on and adapted from traditional programming languages. In this study, we focus on the most popular type of birotical documents, text-based documents. We have provided the definition of correctly edited text and, based on this definition, adapted the debugging method known from programming. According to the method, before real text editing takes place, a thorough debugging of already existing texts and a categorization of errors are carried out. With this method, in advance of real text editing, users learn the requirements of text-based documents and of correctly formatted text. The method has proved much more effective than the previously applied surface-approach methods. The advantages of the method are that real text handling requires far fewer human and computer resources than clicking aimlessly in the GUI (Graphical User Interface), and that data retrieval is much more effective than from error-prone documents.

Keywords: Deep approach metacognitive methods, error-prone birotical documents, financial losses, human and computer resources.

1507 Assessment-Assisted and Relationship-Based Financial Advising: Using an Empirical Assessment to Understand Personal Investor Risk Tolerance in Professional Advising Relationships

Authors: Jerry Szatko, Edan L. Jorgensen, Stacia Jorgensen

Abstract:

A crucial component to the success of any financial advising relationship is for the financial professional to understand the perceptions, preferences and thought-processes carried by the financial clients they serve. Armed with this information, financial professionals are more quickly able to understand how they can tailor their approach to best match the individual preferences and needs of each personal investor. Our research explores the use of a quantitative assessment tool in the financial services industry to assist in the identification of the personal investor’s consumer behaviors, especially in terms of financial risk tolerance, as it relates to their financial decision making. Through this process, the Unitifi Consumer Insight Tool (UCIT) was created and refined to capture and categorize personal investor financial behavioral categories and the financial personality tendencies of individuals prior to the initiation of a financial advisement relationship. This paper discusses the use of this tool to place individuals in one of four behavior-based financial risk tolerance categories. Our discoveries and research were aided through administration of a web-based survey to a group of over 1,000 individuals. Our findings indicate that it is possible to use a quantitative assessment tool to assist in predicting the behavioral tendencies of personal consumers when faced with consumer financial risk and decisions.

Keywords: Behavior based advising, behavioral finance, financial advising, financial advisor tools, financial risk tolerance.

1506 A Real Time Set Up for Retrieval of Emotional States from Human Neural Responses

Authors: Rashima Mahajan, Dipali Bansal, Shweta Singh

Abstract:

Real time non-invasive Brain Computer Interfaces have a significant progressive role in restoring or maintaining a quality life for medically challenged people. This manuscript provides a comprehensive review of emerging research in the field of cognitive/affective computing in context of human neural responses. The perspectives of different emotion assessment modalities like face expressions, speech, text, gestures, and human physiological responses have also been discussed. Focus has been paid to explore the ability of EEG (Electroencephalogram) signals to portray thoughts, feelings, and unspoken words. An automated workflow-based protocol to design an EEG-based real time Brain Computer Interface system for analysis and classification of human emotions elicited by external audio/visual stimuli has been proposed. The front end hardware includes a cost effective and portable Emotiv EEG Neuroheadset unit, a personal computer and a set of external stimulators. Primary signal analysis and processing of real time acquired EEG shall be performed using MATLAB based advanced brain mapping toolbox EEGLab/BCILab. This shall be followed by the development of MATLAB based self-defined algorithm to capture and characterize temporal and spectral variations in EEG under emotional stimulations. The extracted hybrid feature set shall be used to classify emotional states using artificial intelligence tools like Artificial Neural Network. The final system would result in an inexpensive, portable and more intuitive Brain Computer Interface in real time scenario to control prosthetic devices by translating different brain states into operative control signals.
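The spectral feature-extraction step of such a workflow might look like the following sketch (temporal features, EEGLab/BCILab integration, and the final ANN are omitted); the epoch is random placeholder data, and the 128 Hz sampling rate and band limits are assumptions.

```python
# Band-power features per EEG channel via Welch's method.
import numpy as np
from scipy.signal import welch

FS = 128                                        # assumed Emotiv sampling rate
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_power_features(epoch, fs=FS):
    """epoch: (channels, samples) array -> flat vector of band powers."""
    feats = []
    for ch in epoch:
        f, pxx = welch(ch, fs=fs, nperseg=fs * 2)
        for lo, hi in BANDS.values():
            feats.append(pxx[(f >= lo) & (f < hi)].mean())
    return np.array(feats)

epoch = np.random.randn(14, FS * 4)             # 14 channels, 4-second window
features = band_power_features(epoch)
print(features.shape)                           # -> (42,) = 14 channels x 3 bands
```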

Keywords: Brain Computer Interface (BCI), Electroencephalogram (EEG), EEGLab, BCILab, Emotiv, Emotions, Interval features, Spectral features, Artificial Neural Network, Control applications.

1505 A Review of in-orbit Observations of Radiation- Induced Effects in Commercial Memories onboard Alsat-1

Authors: Y. Bentoutou, A.M. Si Mohammed

Abstract:

This paper presents a review of an 8-year study on radiation effects in commercial memory devices operating within the main on-board computer system OBC386 of the Algerian microsatellite Alsat-1. A statistical analysis of single-event upset (SEU) and multiple-bit upset (MBU) activity in these commercial memories shows that the typical SEU rate at Alsat-1's orbit is 4.04 × 10^-7 SEU/bit/day, where 98.6% of these SEUs cause single-bit errors, 1.22% cause double-byte errors, and the remaining SEUs result in multiple-bit and severe errors.
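As a back-of-the-envelope use of the reported rate, the snippet below scales it to a hypothetical 16 MB memory over one year; only the rate and the error-type split come from the abstract.

```python
# Scale the reported per-bit SEU rate to a whole memory over a year.
SEU_RATE = 4.04e-7            # SEU / bit / day (reported for Alsat-1's orbit)
bits = 16 * 1024 * 1024 * 8   # a hypothetical 16 MB memory
days = 365

expected_seus = SEU_RATE * bits * days
print(f"Expected SEUs per year: {expected_seus:.0f}")
print(f"  single-bit errors : {0.986 * expected_seus:.0f}")
print(f"  double-byte errors: {0.0122 * expected_seus:.0f}")
```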

Keywords: Radiation effects, error detection and correction, satellite computer, small satellite mission.

1504 Quality Classification and Monitoring Using Adaptive Metric Distance and Neural Networks: Application in Pickling Process

Authors: S. Bouhouche, M. Lahreche, S. Ziani, J. Bast

Abstract:

Modern manufacturing facilities are large scale, highly complex, and operate with a large number of variables under closed-loop control. Early and accurate fault detection and diagnosis for these plants can minimise down time, increase the safety of plant operations, and reduce manufacturing costs. Fault detection and isolation is more complex, particularly in the case of faulty analog control systems. Analog control systems are not equipped with a monitoring function in which the process parameters are continually visualised. In this situation, it is very difficult to find the relationship between the importance of a fault and its consequences for product failure. In this paper we consider an approach to fault detection, and to analysis of its effect on production quality, using adaptive centring and scaling in the pickling process in cold rolling. The fault appeared on one of the power units driving a rotary machine; this machine cannot track a reference speed given by another machine. The length of the metal loop therefore oscillates continuously, which affects product quality. Using a computerised data acquisition system, the main machine parameters have been monitored. The fault has been detected and isolated on the basis of an analysis of the monitored data. Normal and faulty situations have been reproduced by an artificial neural network (ANN) model implemented to simulate the normal and faulty status of the rotary machine. The correlation between product quality, defined by an index, and the residual is used for quality classification.

Keywords: Modeling, parameter estimation, neural networks, Fault Detection and Diagnosis (FDD), pickling process.

1503 Program Memories Error Detection and Correction On-Board Earth Observation Satellites

Authors: Y. Bentoutou

Abstract:

Memory error detection and correction (EDAC) aims to secure the transfer of data between the central processing unit of a satellite on-board computer and its local memory. In this paper, the application of a double-bit error detection and correction method is described and implemented in Field Programmable Gate Array (FPGA) technology. The performance of the proposed EDAC method is measured and compared with two different EDAC devices, using the same FPGA technology. A statistical analysis of single-event upset (SEU) and multiple-bit upset (MBU) activity in commercial memories on board the first Algerian microsatellite Alsat-1 is given.
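The underlying idea of single-error correction with double-error detection can be illustrated with an extended Hamming code on a 4-bit nibble; this educational sketch is not the paper's FPGA implementation, which protects whole memory words.

```python
# SEC-DED on 4 data bits: Hamming(7,4) plus an overall parity bit at position 0.
def encode(nibble):
    """nibble: [d1, d2, d3, d4] -> 8-bit codeword [p0, p1, p2, d1, p4, d2, d3, d4]."""
    d1, d2, d3, d4 = nibble
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p4 = d2 ^ d3 ^ d4
    word = [0, p1, p2, d1, p4, d2, d3, d4]
    word[0] = sum(word) % 2          # overall parity over positions 1..7
    return word

def decode(word):
    """Return (status, data bits after any correction)."""
    syndrome = 0
    for pos in range(1, 8):          # XOR of set positions locates a single error
        if word[pos]:
            syndrome ^= pos
    overall_ok = sum(word) % 2 == 0
    if syndrome == 0 and overall_ok:
        status = "no error"
    elif syndrome != 0 and not overall_ok:
        word[syndrome] ^= 1          # single-bit error: correct it
        status = f"corrected bit {syndrome}"
    elif syndrome != 0 and overall_ok:
        status = "double error detected (uncorrectable)"
    else:
        status = "overall parity bit flipped"
    return status, [word[3], word[5], word[6], word[7]]

cw = encode([1, 0, 1, 1])
cw[5] ^= 1                            # inject a single-bit upset
print(decode(cw))                     # -> ('corrected bit 5', [1, 0, 1, 1])
```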

Keywords: Error Detection and Correction, On-board computer, small satellite missions.

1502 Evaluation and Analysis of Lean-Based Manufacturing Equipment and Technology System for Jordanian Industries

Authors: Mohammad D. AL-Tahat, Shahnaz M. Alkhalil

Abstract:

The driving forces of international markets are changing continuously; therefore, companies need to gain a competitive edge in such markets. Improving the company's products, processes and practices is no longer auxiliary. Lean production is a production management philosophy that consolidates work tasks with minimum waste, resulting in improved productivity. Lean production practices can be mapped onto many production areas. One of these is Manufacturing Equipment and Technology (MET). Many lean production practices can be implemented in MET, namely, specific equipment configurations, total preventive maintenance, visual control, new equipment/technologies, production process reengineering and a shared vision of perfection. The purpose of this paper is to investigate the implementation level of these six practices in Jordanian industries. To achieve this, a questionnaire survey was designed using a five-point Likert scale. The questionnaire was validated through a pilot study and through expert review. A sample of 350 Jordanian companies was surveyed, and the response rate was 83%. The respondents were asked to rate the extent of implementation of each practice. A conceptual relationship model is developed, hypotheses are proposed, and the essential statistical analyses are then performed. An assessment tool that enables management to monitor the progress and effectiveness of lean practices implementation is designed and presented. The results show that the average implementation level of lean practices in MET is 77%, that Jordanian companies are successfully implementing the considered lean production practices, and that the presented model has a Cronbach's alpha value of 0.87, which is good evidence of model consistency and result validity.

Keywords: Lean Production, SME applications, Visual Control, New equipment/technologies, Specific equipment configurations, Jordan

1501 Interactive Chinese Character Learning System though Pictograph Evolution

Authors: J.H. Low, C.O. Wong, E.J. Han, K.R. Kim, K.C. Jung, H.K. Yang

Abstract:

This paper proposes an Interactive Chinese Character Learning System (ICCLS) based on pictorial evolution as an edutainment concept in computer-based language learning. The origin of the language itself is taken as the learning platform, given the complexity of Chinese compared to other languages. Users, especially children, enjoy this learning system more because they are able to memorize Chinese characters easily and better understand the origin of each character in a pleasurable learning environment, compared to the traditional approach in which children must learn Chinese characters by rote in an unpleasant environment. Skeletonization is used as the representation of a Chinese character and of an object, with an animated pictograph evolution to facilitate learning of the language. A shortest-skeleton-path matching technique is employed for fast and accurate matching in our implementation. The user is required either to write a word or to draw a simple 2D object in the input panel, and the matched word and object are displayed together with the pictograph evolution to instill learning. The target of the computer-based learning system is for pre-school children between 4 and 6 years old to learn Chinese characters in a flexible and entertaining manner, utilizing visual and mind-mapping strategies as the learning methodology.

Keywords: Computer-based learning, Chinese character, pictograph evolution, skeletonization.

1500 Influence of Internal Topologies on Components Produced by Selective Laser Melting: Numerical Analysis

Authors: C. Malça, P. Gonçalves, N. Alves, A. Mateus

Abstract:

Regardless of the manufacturing process used, subtractive or additive, and of the material, purpose and application, produced components are conventionally solid masses with more or less complex shapes depending on the production technology selected. Aspects such as reducing the weight of components, associated with the low volume of material required and the almost non-existent material waste, speed and flexibility of production and, primarily, high mechanical strength combined with high structural performance, are competitive advantages in any industrial sector, from automotive, molds, aviation, aerospace, construction, pharmaceuticals and medicine to, more recently, human tissue engineering. Such features, properties and functionalities are attained in metal components produced using the additive Rapid Prototyping technique for metal powders commonly known as Selective Laser Melting (SLM), with optimized internal topologies and varying densities. In order to produce components with high strength and high structural and functional performance, regardless of the type of application, three different internal topologies were developed and analyzed using numerical computational tools. The developed topologies were numerically subjected to mechanical compression and four-point bending tests. Finite Element Analysis results demonstrate how different internal topologies can contribute to improved mechanical properties, even with a high degree of porosity relative to fully dense components. The results are very promising, not only from the point of view of mechanical resistance, but especially through the achievement of considerable variation in density without loss of high structural and functional performance.

Keywords: Additive Manufacturing, Internal topologies, Porosity, Rapid Prototyping, Selective Laser Melting.
