Search results for: communication technique
906 On Determining the Most Effective Technique Available in Software Testing
Authors: Qasim Zafar, Matthew Anderson, Esteban Garcia, Steven Drager
Abstract:
Software failures can present an enormous detriment to people's lives and cost millions of dollars to repair when they are unexpectedly encountered in the wild. Although a significant portion of the software development lifecycle and its resources is dedicated to testing, software failures remain relatively frequent. The evaluation of testing effectiveness therefore remains at the forefront of ensuring high-quality software, and software metrics play a critical role in providing valuable insights into quantifiable objectives to assess the level of assurance and confidence in the system. As the selection of appropriate metrics can be an arduous process, the goal of this paper is to shed light on the significance of software metrics by examining a range of testing techniques and metrics as well as identifying key areas for improvement. In doing so, this paper presents a method to compare the effectiveness of testing techniques with heterogeneous output metrics. Additionally, through this investigation, readers will gain a deeper understanding of how metrics can help to drive informed decision-making on delivering high-quality software and facilitate continuous improvement in testing practices.
Keywords: Software testing, software metrics, testing effectiveness, black box testing, random testing, adaptive random testing, combinatorial testing, fuzz testing, equivalence partition, boundary value analysis, white box testing.
905 Experiences and Coping of Adults with Death of Siblings during Childhood in Chinese Context: Implications for Therapeutic Interventions
Authors: Sze Yee Lee
Abstract:
The death of a sibling in childhood has significant impacts on both the personal and family development of the surviving siblings; however, both the short-term and long-term effects of sibling loss in Chinese societies such as Hong Kong have been inadequately documented in the literature. This paper explores the experience of encountering a sibling’s death during childhood with the use of semi-structured interviews. Through thematic analysis, the author explores the impacts on surviving siblings’ emotions, coping styles, struggles and challenges, and personal development. Furthermore, the influences on family dynamics are explored thoroughly, including changes in family atmosphere, family roles, family relationships, family communication and parenting styles. More importantly, the author identifies (i) existing continuing bonds; (ii) crying; (iii) adequate social support; and (iv) hiding one’s own emotions as a gesture of protecting parents as the crucial elements pertinent to surviving siblings’ successful adaptation in the face of sibling loss. In addition, “child-centered” and “family-centered” service implications for families with a sibling’s death in a Chinese context are discussed.
Keywords: Surviving children, sibling’s death, child-centered, family-centered.
904 Gait Biometric for Person Re-Identification
Authors: Lavanya Srinivasan
Abstract:
Biometric identification identifies unique features of a person, such as fingerprints, iris, ear, and voice, which require the subject's permission and physical contact. Gait biometrics identify a person by the unique way they walk, using extracted motion features. The main advantage of gait biometrics is that the gait of a person can be identified at a distance, without any physical contact. In this work, the gait biometric is used for person re-identification. A person walking naturally is compared with the same person walking with a bag, coat, and case, recorded using long-wave infrared, short-wave infrared, medium-wave infrared, and visible cameras. The videos are recorded in rural and in urban environments. The pre-processing pipeline includes human detection using You Only Look Once, background subtraction, silhouette extraction, and synthesis of the Gait Entropy Image by averaging the silhouettes. The moving features are extracted from the Gait Entropy Image. The extracted features are reduced in dimensionality by Principal Component Analysis and recognized using different classifiers. The comparative results with the different classifiers show that Linear Discriminant Analysis outperforms the other classifiers, with 95.8% accuracy for the visible camera in the rural dataset and 94.8% for long-wave infrared in the urban dataset.
Keywords: Biometric, gait, silhouettes, You Only Look Once.
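For illustration only, the following is a minimal sketch (not the authors' implementation) of how a gait entropy image can be synthesized by averaging binary silhouettes and how the resulting features might be reduced with PCA and classified with LDA; the array shapes, names and parameter values are assumptions.

```python
# Hypothetical sketch: gait entropy image from binary silhouettes, then PCA + LDA.
# Shapes, names and parameters are assumptions, not the authors' implementation.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def gait_entropy_image(silhouettes):
    """silhouettes: (T, H, W) binary frames covering one gait cycle."""
    p = silhouettes.mean(axis=0)                 # per-pixel probability of being foreground
    eps = 1e-12
    return -(p * np.log2(p + eps) + (1 - p) * np.log2(1 - p + eps))   # Shannon entropy per pixel

def train(sequences, labels, n_components=20):
    """sequences: list of (T, H, W) silhouette arrays; labels: identity of each sequence."""
    X = np.stack([gait_entropy_image(s).ravel() for s in sequences])
    pca = PCA(n_components=min(n_components, len(X) - 1)).fit(X)       # dimensionality reduction
    lda = LinearDiscriminantAnalysis().fit(pca.transform(X), labels)   # identity classifier
    return pca, lda
```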
903 Electronic Nose Based On Metal Oxide Semiconductor Sensors as an Alternative Technique for the Spoilage Classification of Oat Milk
Authors: A. Deswal, N. S. Deora, H. N. Mishra
Abstract:
The aim of the present study was to develop a rapid electronic nose method for online quality control of oat milk. Analysis by electronic nose and bacteriological measurements were performed to analyze the spoilage kinetics of oat milk samples stored at room temperature and under refrigerated conditions for up to 15 days. Principal component analysis (PCA), Discriminant Factorial Analysis (DFA) and Soft Independent Modelling by Class Analogy (SIMCA) classification techniques were used to differentiate the oat milk samples at different days. The total plate count (bacteriological method) was selected as the reference method to consistently train the electronic nose system. The e-nose was able to differentiate between oat milk samples of varying microbial load. The results obtained from the total viable counts showed that the shelf life of oat milk stored at room temperature and under refrigerated conditions was 20 hours and 13 days, respectively. The models built classified oat milk samples, based on the total microbial population, into “unspoiled” and “spoiled”.
Keywords: Electronic-nose, bacteriological, shelf-life, classification.
902 Perception of Hygiene Knowledge among Staff Working in Top Five Famous Restaurants of Male’
Authors: Zulaikha Reesha Rashaad
Abstract:
One of the major factors which can contribute greatly to the success of catering businesses is to employ food and beverage staff with sound hygiene knowledge. Individuals with sound knowledge of hygiene have a higher chance of following safe food practices in food production. Lack of hygiene knowledge among food and beverage staff working in catering establishments and restaurants has been identified as one of the leading causes of food poisoning and food-borne illness. This research analyzes the hygiene knowledge of food and beverage staff working in the top five restaurants of Male’, in relation to their age, educational background, occupation and training. The research uses quantitative and descriptive methods in data collection and data analysis. Data were obtained through a random sampling technique using self-administered survey questionnaires, which were completed by 60 respondents working in five top-level restaurants in Male’. The respondents were service staff and chefs working in these restaurants. The responses to the questionnaires were analyzed using SPSS. The results indicated that age, education level, occupation and training correlated with hygiene knowledge perception scores.
Keywords: Food and beverage staff, food poisoning, food production, hygiene knowledge.
901 Resveratrol Incorporated Liposomes Prepared from Pegylated Phospholipids and Cholesterol
Authors: Mont Kumpugdee-Vollrath, Khaled Abdallah
Abstract:
Liposomes and pegylated liposomes have long been used as drug delivery systems in the pharmaceutical field. Conventionally, however, polyethylene glycol (PEG) was attached to the phospholipid after the liposomes had already been prepared. In this paper, we study the possibility of using phospholipids that are already conjugated with PEG to prepare liposomes. Resveratrol was used as the model drug because it can be applied against different diseases. Cholesterol was applied to stabilize the liposome membrane. The preparation method was the thin-film technique at laboratory scale. The liposomes were then characterized by nanoparticle tracking analysis (NTA), photon correlation spectroscopy (PCS) and light microscopy. Stable liposomes could be produced, and the particle sizes after filtration were in the nanometer range. The 2- and 3-chain PEG-phospholipid (PL) resulted in smaller particle sizes than the 4-chain PEG-PL. Liposomes from PL 90G and cholesterol were stable during storage at 8 °C for 56 days, as the particle sizes measured by PCS remained almost unchanged. There was almost no leakage of resveratrol from the PL 90G liposomes with cholesterol after a diffusion test in a dialysis tube for 28 days. All liposomes showed sustained release over the measuring time of 270 min. The maximum release of 16-20% was detected for liposomes from 2- and 3-chain PEG-PL; the other liposomes released only about 10% of the resveratrol. The release kinetics can be described by the Korsmeyer-Peppas equation.
Keywords: Liposome, NTA, resveratrol, pegylation, cholesterol.
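For reference, the Korsmeyer-Peppas model mentioned above relates the cumulative fraction of drug released to time through a power law, M_t/M_inf = k t^n. The sketch below fits that law to illustrative data points; the values and variable names are assumptions, not the authors' measurements.

```python
# Hypothetical sketch: fitting the Korsmeyer-Peppas power law M_t/M_inf = k * t**n
# to cumulative-release data. The data points below are illustrative, not measured values.
import numpy as np
from scipy.optimize import curve_fit

def korsmeyer_peppas(t, k, n):
    return k * np.power(t, n)

t_min = np.array([30, 60, 120, 180, 270], dtype=float)   # sampling times (min), assumed
released = np.array([0.04, 0.07, 0.11, 0.14, 0.18])      # fraction released, assumed

(k, n), _ = curve_fit(korsmeyer_peppas, t_min, released, p0=(0.01, 0.5))
print(f"k = {k:.4f}, n = {n:.2f}")   # the exponent n is commonly read as the release mechanism indicator
```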
900 Detection of Ultrasonic Images in the Presence of a Random Number of Scatterers: A Statistical Learning Approach
Authors: J. P. Dubois, O. M. Abdul-Latif
Abstract:
Support Vector Machine (SVM) is a statistical learning tool that was initially developed by Vapnik in 1979 and later extended to the more general concept of structural risk minimization (SRM). SVM is playing an increasing role in detection problems across various engineering fields, notably in statistical signal processing, pattern recognition, image analysis, and communication systems. In this paper, SVM was applied to the detection of medical ultrasound images in the presence of partially developed speckle noise. The simulation was done for single-look and multi-look speckle models to give a complete overview of, and insight into, the proposed SVM-based detector. The structure of the SVM was derived and applied to clinical ultrasound images, and its performance in terms of the mean square error (MSE) metric was calculated. We showed that the SVM-detected ultrasound images have a very low MSE and are of good quality. The quality of the processed speckled images improved for the multi-look model. Furthermore, the contrast of the SVM-detected images was higher than that of the original non-noisy images, indicating that the SVM approach increased the distance between the pixel reflectivity levels (detection hypotheses) in the original images.
Keywords: LS-SVM, medical ultrasound imaging, partially developed speckle, multi-look model.
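As an illustration of the general idea, the sketch below trains an SVM on patches of a synthetically speckled image and scores the result with MSE; it uses support vector regression on toy data as a stand-in and is not the authors' detector or their speckle model.

```python
# Hypothetical sketch: SVM mapping from speckled patches back to clean reflectivity,
# scored with mean squared error. Patch size, speckle model and data are assumptions.
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics import mean_squared_error

def extract_patches(img, size=5):
    """Return flattened size x size patches centred on every interior pixel."""
    h, w = img.shape
    r = size // 2
    patches = [img[i - r:i + r + 1, j - r:j + r + 1].ravel()
               for i in range(r, h - r) for j in range(r, w - r)]
    return np.array(patches)

rng = np.random.default_rng(0)
clean = rng.uniform(0.2, 1.0, size=(32, 32))           # stand-in reflectivity image
speckled = clean * rng.exponential(1.0, clean.shape)   # crude multiplicative speckle model

X = extract_patches(speckled)
y = extract_patches(clean)[:, X.shape[1] // 2]         # centre pixel of each clean patch

model = SVR(kernel="rbf").fit(X, y)
print("MSE:", mean_squared_error(y, model.predict(X)))
```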
899 Tracing Quality Cost in a Luggage Manufacturing Industry
Authors: S. B. Jaju, R. R. Lakhe
Abstract:
Quality costs are the costs associated with preventing, finding, and correcting defective work. Since the main language of corporate management is money, quality-related costs act as a means of communication between the staff of quality engineering departments and the company managers. The objective of quality engineering is to minimize the total quality cost across the life of the product. Quality costs provide a benchmark against which improvement can be measured over time, and they provide a rupee-based report on quality improvement efforts. They are an effective tool to identify, prioritize and select quality improvement projects. A review of the literature showed that a simplified methodology for collecting quality cost data in a manufacturing industry was required. A quantified standard methodology is therefore proposed for collecting data on the various elements of the quality cost categories in a manufacturing industry. In the light of the research carried out so far, it is also necessary to standardise the cost elements in each of the prevention, appraisal, internal failure and external failure categories. An attempt is made here to standardise the various cost elements applicable to a manufacturing industry, and data are collected using the proposed quantified methodology. This paper discusses the case study carried out in a luggage manufacturing industry.
Keywords: Quality costs, PAF model, quantified methodology, case study.
898 SVM-Based Detection of SAR Images in Partially Developed Speckle Noise
Authors: J. P. Dubois, O. M. Abdul-Latif
Abstract:
Support Vector Machine (SVM) is a statistical learning tool that was initially developed by Vapnik in 1979 and later extended to the more general concept of structural risk minimization (SRM). SVM is playing an increasing role in detection problems across various engineering fields, notably in statistical signal processing, pattern recognition, image analysis, and communication systems. In this paper, SVM was applied to the detection of SAR (synthetic aperture radar) images in the presence of partially developed speckle noise. The simulation was done for single-look and multi-look speckle models to give a complete overview of, and insight into, the proposed SVM-based detector. The structure of the SVM was derived and applied to real SAR images, and its performance in terms of the mean square error (MSE) metric was calculated. We showed that the SVM-detected SAR images have a very low MSE and are of good quality. The quality of the processed speckled images improved for the multi-look model. Furthermore, the contrast of the SVM-detected images was higher than that of the original non-noisy images, indicating that the SVM approach increased the distance between the pixel reflectivity levels (the detection hypotheses) in the original images.
Keywords: Least Square-Support Vector Machine, Synthetic Aperture Radar, partially developed speckle, multi-look model.
897 An Improved Method on Static Binary Analysis to Enhance the Context-Sensitive CFI
Authors: Qintao Shen, Lei Luo, Jun Ma, Jie Yu, Qingbo Wu, Yongqi Ma, Zhengji Liu
Abstract:
Control Flow Integrity (CFI) is one of the most promising techniques to defend against Code-Reuse Attacks (CRAs). Traditional CFI systems and recent context-sensitive CFI use coarse control flow graphs (CFGs) to analyze whether a control-flow hijack occurs, leaving vast space for attackers at indirect call-sites. Coarse CFGs make it difficult to decide which target to execute at indirect control-flow transfers and, in practice, weaken existing CFI systems. Extracting CFGs precisely and completely from binaries is still an unsolved problem. In this paper, we present an algorithm to obtain a more precise CFG from binaries. First, parameters are analyzed at indirect call-sites and in functions. By comparing the number of parameters prepared before a call-site with the number consumed by candidate functions, the set of targets of indirect calls is reduced. The control flow is then more tightly constrained at indirect call-sites at runtime. Combined with CCFI, we implement our policy. Experimental results on some popular programs show that our approach is efficient. Further analysis shows that it can mitigate COOP and other advanced attacks.
Keywords: Context-sensitive, CFI, binary analysis, code reuse attack.
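A minimal sketch of the parameter-count matching idea described above (illustrative addresses and counts, not the paper's binary-analysis implementation): an indirect call-site is only allowed to target functions that consume no more arguments than the site prepares.

```python
# Hypothetical sketch: pruning indirect-call targets by comparing the number of argument
# registers/slots set up at a call-site with the number a callee consumes.
# The addresses and counts below would normally come from static binary analysis.

callsite_prepared_args = {          # call-site address -> arguments prepared before the call
    0x401A10: 2,
    0x4020F4: 4,
}
function_consumed_args = {          # function address -> arguments it actually reads
    0x401200: 2,
    0x401340: 3,
    0x402000: 4,
}

def allowed_targets(site):
    """Keep only functions whose consumed-argument count does not exceed what the
    call-site prepares (a conservative filter under the stated assumptions)."""
    prepared = callsite_prepared_args[site]
    return [f for f, consumed in function_consumed_args.items() if consumed <= prepared]

for site in callsite_prepared_args:
    print(hex(site), [hex(f) for f in allowed_targets(site)])
```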
896 Matching Pursuit based Removal of Cardiac Pulse-Related Artifacts in EEG/fMRI
Authors: Rainer Schneider, Stephan Lau, Levin Kuhlmann, Simon Vogrin, Maciej Gratkowski, Mark Cook, Jens Haueisen
Abstract:
Cardiac pulse-related artifacts in the EEG recorded simultaneously with fMRI are complex and highly variable. Their effective removal is an unsolved problem. Our aim is to develop an adaptive removal algorithm based on the matching pursuit (MP) technique and to compare it to established methods using a visual evoked potential (VEP). We recorded the VEP inside the static magnetic field of an MR scanner (with artifacts) as well as in an electrically shielded room (artifact free). The MP-based artifact removal outperformed average artifact subtraction (AAS) and optimal basis set removal (OBS) in terms of restoring the EEG field map topography of the VEP. Subsequently, a dipole model was fitted to the VEP under each condition using a realistic boundary element head model. The source location of the VEP recorded inside the MR scanner was closest to that of the artifact free VEP after cleaning with the MP-based algorithm as well as with AAS. While none of the tested algorithms offered complete removal, MP showed promising results due to its ability to adapt to variations of latency, frequency and amplitude of individual artifact occurrences while still utilizing a common template.
Keywords: matching pursuit, ballistocardiogram, artifact removal, EEG/fMRI.
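For illustration, the sketch below implements generic matching pursuit over a dictionary of unit-norm atoms; it is not the authors' adaptive artifact-template algorithm, and the dictionary and signal are assumptions.

```python
# Hypothetical sketch of generic matching pursuit: greedily project a signal onto a
# dictionary of unit-norm atoms and subtract the best match at each iteration.
# The dictionary and toy signal are illustrative; the paper's adaptive templates are not reproduced.
import numpy as np

def matching_pursuit(signal, dictionary, n_iter=10):
    """dictionary: (n_atoms, n_samples) array of unit-norm atoms."""
    residual = signal.astype(float).copy()
    approximation = np.zeros_like(residual)
    for _ in range(n_iter):
        correlations = dictionary @ residual          # inner product with every atom
        best = np.argmax(np.abs(correlations))
        coeff = correlations[best]
        approximation += coeff * dictionary[best]
        residual -= coeff * dictionary[best]
    return approximation, residual

# Toy usage: a dictionary of normalized sinusoids and a noisy single-tone signal.
n = 256
t = np.arange(n)
atoms = np.array([np.sin(2 * np.pi * f * t / n) for f in range(1, 33)])
atoms /= np.linalg.norm(atoms, axis=1, keepdims=True)
signal = 3.0 * atoms[4] + 0.5 * np.random.default_rng(1).standard_normal(n)
approx, resid = matching_pursuit(signal, atoms, n_iter=5)
```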
895 Influencing of Rice Residue Management Method on GHG Emission from Rice Cultivation
Authors: Cheewaphongphan P., Garivait S., Pongpullponsak A., Patumsawad S.
Abstract:
Thailand is one of the world's leading rice producers and exporters. Farmers have to increase the frequency of rice cultivation to serve the nation's increasing export demand. This leads to the elimination of rice residues by open burning, which is the quickest and cheapest management method. Open burning of rice residue is one of the major causes of air pollutant and greenhouse gas (GHG) emissions. Under the ASEAN agreement on transboundary haze, Thailand set up a master plan to mitigate air pollutant emissions from open burning of agricultural residues. In this master plan, residue incorporation is promoted as an alternative to open burning. However, assessments of both options in terms of GHG emissions, which would reveal their contribution to long-term global warming, are still scarce or nonexistent. In this study, a method for rice residue assessment was first developed in order to estimate and compare GHG emissions from rice cultivation under open burning of rice residues and under incorporation of the same amount of residues, using the 2006 IPCC Guidelines for emission estimation and a Life Cycle Analysis technique. Emissions from rice cultivation under different area preparation practices are also discussed.
Keywords: Greenhouse gases, incorporation, rice cultivation, rice field residue, rice residue management.
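As a simple illustration of the emission-estimation step, the sketch below follows the general Tier 1 pattern of the 2006 IPCC Guidelines, where an emission is the product of activity data and an emission factor; all numbers are placeholders rather than values from the study or the Guidelines.

```python
# Hypothetical sketch of an IPCC-style Tier 1 estimate: emission = activity data x emission factor.
# All numbers below are placeholders, not values from the study or the 2006 IPCC Guidelines.
residue_burned_t = 1000.0          # dry matter burned (tonnes), assumed activity data
ef_ch4_g_per_kg = 2.7              # placeholder CH4 emission factor (g per kg dry matter)
ef_n2o_g_per_kg = 0.07             # placeholder N2O emission factor (g per kg dry matter)
gwp = {"CH4": 25.0, "N2O": 298.0}  # 100-year GWP values used in many inventories (assumed)

ch4_t = residue_burned_t * 1000 * ef_ch4_g_per_kg / 1e6   # tonnes of CH4
n2o_t = residue_burned_t * 1000 * ef_n2o_g_per_kg / 1e6   # tonnes of N2O
co2e_t = ch4_t * gwp["CH4"] + n2o_t * gwp["N2O"]          # combined CO2-equivalent
print(f"CH4: {ch4_t:.2f} t, N2O: {n2o_t:.3f} t, total: {co2e_t:.1f} t CO2e")
```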
894 Automated Vehicle Traffic Control Tower: A Solution to Support the Next Level Automation
Authors: Xiaoyun Zhao, Rami Darwish, Anna Pernestål
Abstract:
Automated vehicles (AVs) have the potential to enhance road capacity and improve road safety and traffic efficiency. Research and development on AVs have been going on for many years. However, when complicated traffic rules interact with real situations, AVs fail to make decisions in contradictory situations and cannot maintain control in all conditions because of highly dynamic driving scenarios. This limits the usage of AVs and restricts the full potential benefits that they can bring. Furthermore, regulations, infrastructure development, and public acceptance cannot keep up at the same pace as technology breakthroughs. Facing these challenges, this paper proposes an automated vehicle traffic control tower (AVTCT) acting as a safe, efficient and integrated solution for AV control. It introduces a concept of the AVTCT for control, management, decision-making, communication and interaction with various aspects of transportation. Through prototype demonstrations and simulations, the AVTCT shows potential to overcome the control challenges of AVs and can help AVs reach their full potential. Possible functionalities, benefits and challenges of the AVTCT are discussed, which set the foundation for the conceptual model, simulation and real application of the AVTCT.
Keywords: Automated vehicle, connectivity and automation, intelligent transport system, traffic control, traffic safety.
893 Magnetohydrodynamic Maxwell Nanofluids Flow over a Stretching Surface through a Porous Medium: Effects of Non-Linear Thermal Radiation, Convective Boundary Conditions and Heat Generation/Absorption
Authors: Sameh E. Ahmed, Ramadan A. Mohamed, Abd Elraheem M. Aly, Mahmoud S. Soliman
Abstract:
In this paper, an enhancement of heat transfer using non-Newtonian nanofluids by magnetohydrodynamic (MHD) mixed convection along stretching sheets embedded in an isotropic porous medium is investigated. The case of Maxwell nanofluids is studied using the two-phase mathematical model of nanofluids, and the Darcy model is applied for the porous medium. Important effects are taken into account, namely, non-linear thermal radiation, convective boundary conditions, electromagnetic force and the presence of a heat source/sink. Suitable similarity transformations are used to convert the governing equations to a system of ordinary differential equations, which is then solved numerically using a fourth-order Runge-Kutta method with a shooting technique. The main results of the study reveal that the velocity profiles are decreasing functions of the Darcy number, the Deborah number and the magnetic field parameter. Also, an increase in the non-linear radiation parameter causes an enhancement in the local Nusselt number.
Keywords: MHD, nanofluids, stretching surface, non-linear thermal radiation, convective condition.
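A minimal sketch of the shooting technique named above, applied to a generic two-point boundary value problem; the ODE, boundary values and bracketing interval are illustrative and are not the paper's transformed similarity equations.

```python
# Hypothetical sketch: fourth-order Runge-Kutta integration combined with a shooting
# technique for a generic BVP u'' = -u, u(0) = 0, u(1) = 1 (illustrative only).
import numpy as np
from scipy.optimize import brentq

def rk4(f, y0, t):
    y = np.array(y0, dtype=float)
    ys = [y.copy()]
    for i in range(len(t) - 1):
        h = t[i + 1] - t[i]
        k1 = f(t[i], y)
        k2 = f(t[i] + h / 2, y + h / 2 * k1)
        k3 = f(t[i] + h / 2, y + h / 2 * k2)
        k4 = f(t[i] + h, y + h * k3)
        y = y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        ys.append(y.copy())
    return np.array(ys)

def ode(t, y):                       # state y = [u, u']
    return np.array([y[1], -y[0]])   # u'' = -u

t = np.linspace(0.0, 1.0, 101)

def endpoint_error(slope):           # residual at the far boundary for a guessed u'(0)
    return rk4(ode, [0.0, slope], t)[-1, 0] - 1.0

slope = brentq(endpoint_error, 0.1, 5.0)   # shoot on the unknown initial slope
print("u'(0) =", slope)                    # analytic answer is 1/sin(1), about 1.188
```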
892 Participatory Democracy to the Contemporary Problems of Polish Social Policy
Authors: Agnieszka Szczudlińska-Kanoś
Abstract:
Socio-economic development, which is seen around the world today, has contributed to the emergence of new problems of a social nature. Because of differing political, historical, geographical and economic conditions, countries and regions face, in addition to global social policy issues such as an aging population, unemployment and migration, specific new problems that require diagnosis, an individualized approach and efficient, planned solutions. These include, among others, digital addiction, peer violence, childhood obesity, ‘legal highs’, stress, depression and diseases associated with environmental pollution. Central authorities, most often selected with the tools of representative democracy, that is, general elections, are for many reasons, organizational and communicational among them, unable to diagnose the intensity and territorial distribution of these problems effectively, and thus to fight them effectively. This article aims to show how citizens in Poland influence the solving of problems related to broader social policy implemented at the local government level, and it indicates possibilities for improving those solutions. The conclusions of the theoretical analysis are supported by empirical studies, which tested the use of instruments of participatory democracy in the planning and creation of communal strategies for solving social problems in one of the Polish voivodeships.
Keywords: Commune, democracy, participation, social policy, social problems.
891 Rheological Properties of Polyethylene and Polypropylene Modified Bitumen
Authors: Noor Zainab Habib, Ibrahim Kamaruddin, Madzalan Napiah, Isa Mohd Tan
Abstract:
This paper presents a part of research on the rheological properties of bitumen modified by thermoplastics, namely linear low density polyethylene (LLDPE), high density polyethylene (HDPE) and polypropylene (PP), and their interaction with 80 pen base bitumen. Modification of bitumen with polymers is known to enhance its performance characteristics but, at the same time, to significantly alter its rheological properties. The rheological study of polymer modified bitumen (PMB) was made through penetration, ring and ball softening point and viscosity tests. The results were then related to the changes in the rheological properties of the polymer modified bitumen. It was observed that the thermoplastic copolymer has a more pronounced effect on penetration than on softening point. The viscoelastic behavior of polymer modified bitumen depends on the concentration of polymer, mixing temperature, mixing technique, solvating power of the base bitumen and molecular structure of the polymer used. PP offers a better blend in comparison to HDPE and LLDPE. The viscosity of the base bitumen was also enhanced by the addition of polymer. The pseudoplastic behavior was more prominent for HDPE and LLDPE than for PP. The best results were obtained when the polymer concentration was kept below 3%.
Keywords: Polymer modified bitumen, linear low density polyethylene, high density polyethylene, polypropylene.
890 Using the PARIS Method for Multiple Criteria Decision Making in Unmanned Combat Aircraft Evaluation and Selection
Authors: C. Ardil
Abstract:
Unmanned combat aircraft (UCA) are expanding significantly in several defense industries, along with advances in artificial intelligence and highly precise technology. UCA are crucial in military settings for targeting enemy elements and objects, and are also utilized for highly precise reconnaissance and surveillance tasks. To select the best alternative for critical missions, a methodical and effective strategy for UCA selection is required. Multiple criteria decision-making (MCDM) methodologies are ideally equipped to handle the complexity of alternative aircraft selection. To analyze UCA alternatives for the selection process, an integrated methodology built on objective criteria weights and preference analysis for reference ideal solution (PARIS) is used. First, the weights of the essential criteria are determined using the average weight (AW), standard deviation weight (SW) and entropy weight (EW) approaches. The weights of the evaluation criteria affect the decision-making process. The aircraft alternatives in the decision problem are then ranked using the objective criteria weights together with the PARIS technique. The validation and sensitivity analysis of the proposed MCDM approach are discussed.
Keywords: Unmanned combat aircraft (UCA), multiple criteria decision making, MCDM, PARIS.
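A minimal sketch of the entropy-weighting step described above (illustrative decision matrix, not the paper's aircraft data): criteria whose values vary more across the alternatives receive larger weights.

```python
# Hypothetical sketch of entropy weighting for an MCDM decision matrix.
# Rows are alternatives (e.g. UCA candidates), columns are criteria; all values are illustrative.
import numpy as np

X = np.array([[120.0, 0.85, 3.2],
              [150.0, 0.78, 2.9],
              [135.0, 0.92, 3.5]])

P = X / X.sum(axis=0)                          # normalize each criterion column
k = 1.0 / np.log(X.shape[0])                   # entropy scaling constant
E = -k * (P * np.log(P)).sum(axis=0)           # entropy of each criterion
d = 1.0 - E                                    # degree of divergence
weights = d / d.sum()                          # entropy weights, summing to 1
print("entropy weights:", np.round(weights, 3))
```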
889 Multiplayer Game System for Therapeutic Exercise in Which Players with Different Athletic Abilities Can Participate on an Even Competitive Footing
Authors: Kazumoto Tanaka, Takayuki Fujino
Abstract:
Sports games conducted as a group are a form of therapeutic exercise for aged people with decreased strength and for people suffering from permanent damage caused by stroke and other conditions. However, it is difficult for patients with different athletic abilities to play a game on an equal footing. This study specifically examines a computer video game designed for therapeutic exercise and a game system that gives support according to athletic ability, so that anyone playing the game can participate equally. This video game, to be specific, is a popular variant of balloon volleyball, in which players hit a balloon by hand before it falls to the floor. In this game system, each player plays the game watching a monitor on which the system displays tailor-made video-game images adjusted to the person’s athletic ability, providing players with player-adaptive assist support. We have developed a multiplayer game system with an image generation technique for the tailor-made video game and conducted tests to evaluate it.
Keywords: Therapeutic exercise, computer video game, disability-adaptive assist, tailor-made video-game image.
888 Multi-Agent System for Irrigation Using Fuzzy Logic Algorithm and Open Platform Communication Data Access
Authors: T. Wanyama, B. Far
Abstract:
Automatic irrigation systems conveniently protect landscape investment. While conventional irrigation systems are known to be inefficient, automated ones have the potential to optimize water usage. In fact, there is a new generation of irrigation systems that are smart in the sense that they monitor the weather, soil conditions, evaporation and plant water use, and automatically adjust the irrigation schedule. In this paper, we present an agent-based smart irrigation system. The agents are built using a mix of commercial off-the-shelf software, including MATLAB, Microsoft Excel and the KEPServer Ex5 OPC server, and custom written code. The Irrigation Scheduler Agent uses fuzzy logic to integrate the information that affects the irrigation schedule. In addition, the multi-agent system uses Open Platform Connectivity (OPC) technology to share data. OPC technology enables the Irrigation Scheduler Agent to communicate over the Internet, making the system scalable to a municipal or regional agent-based water monitoring, management, and optimization system. Finally, this paper presents simulation and pilot installation test results that show the operational effectiveness of our system.
Keywords: Community water usage, fuzzy logic, irrigation, multi-agent system.
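For illustration, the sketch below shows a tiny fuzzy rule base of the kind an irrigation scheduler agent might use; the membership functions, rules and thresholds are assumptions, not the authors' agent.

```python
# Hypothetical sketch: a tiny fuzzy rule base mapping soil moisture and forecast rain
# probability to an irrigation duration. Membership shapes and thresholds are assumptions.
import numpy as np

def falling(x, a, b):
    """1 below a, 0 above b, linear in between (a 'low' shoulder)."""
    return float(np.clip((b - x) / (b - a), 0.0, 1.0))

def rising(x, a, b):
    """0 below a, 1 above b, linear in between (a 'high' shoulder)."""
    return float(np.clip((x - a) / (b - a), 0.0, 1.0))

def irrigation_minutes(soil_moisture_pct, rain_prob_pct):
    dry = falling(soil_moisture_pct, 20, 40)       # "soil is dry"
    wet = rising(soil_moisture_pct, 30, 60)        # "soil is wet"
    rain_likely = rising(rain_prob_pct, 40, 80)    # "rain is likely"

    fire_long = min(dry, 1.0 - rain_likely)        # rule 1: dry AND rain unlikely -> water long
    fire_none = max(wet, rain_likely)              # rule 2: wet OR rain likely    -> do not water

    long_minutes, none_minutes = 30.0, 0.0         # singleton output durations
    total = fire_long + fire_none + 1e-9
    return (fire_long * long_minutes + fire_none * none_minutes) / total   # weighted-average defuzzification

print(irrigation_minutes(soil_moisture_pct=15, rain_prob_pct=10))   # dry soil, dry forecast: about 30
print(irrigation_minutes(soil_moisture_pct=55, rain_prob_pct=85))   # wet soil, rain coming: about 0
```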
887 A Novel Approach to Iris Localization for Iris Biometric Processing
Authors: Somnath Dey, Debasis Samanta
Abstract:
Iris-based biometric systems are gaining importance in several applications. However, processing of iris biometrics is a challenging and time consuming task. Detection of the iris in an eye image poses a number of challenges, such as inferior image quality and occlusion by eyelids and eyelashes. Because of these problems, it is not possible to achieve a 100% accuracy rate in any iris-based biometric authentication system. Further, iris detection is a computationally intensive part of the overall iris biometric processing. In this paper, we address these two problems and propose a technique to localize the iris efficiently and accurately. We propose scaling and a color level transform followed by thresholding and detection of pupil boundary points for pupil boundary detection, and dilation, thresholding, vertical edge detection and removal of unnecessary edges in the eye image for iris boundary detection. Scaling reduces the search space significantly, and the intensity level transform is helpful for image thresholding. Experimental results show that our approach is comparable with existing approaches. Following our approach, it is possible to detect the iris with 95-99% accuracy, as substantiated by our experiments on the CASIA Ver-3.0, ICE 2005, UBIRIS, Bath and MMU iris image databases.
Keywords: Iris recognition, iris localization, biometrics, image processing.
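In the spirit of the pipeline above, the sketch below downscales a grayscale eye image, thresholds the dark pupil region and estimates a circular boundary; the scale factor, threshold and circle fit are illustrative assumptions rather than the paper's exact steps.

```python
# Hypothetical sketch: locate a dark pupil by downscaling, thresholding, and fitting a circle
# to the dark region. The threshold and scale factor are illustrative, not the paper's values.
import numpy as np

def localize_pupil(gray, scale=4, dark_threshold=60):
    """gray: 2-D uint8 eye image. Returns (row, col, radius) in full-resolution pixels."""
    small = gray[::scale, ::scale]                      # crude downscaling shrinks the search space
    mask = small < dark_threshold                       # pupil pixels are the darkest in the image
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    r0, c0 = rows.mean(), cols.mean()                   # centroid of the dark blob
    radius = np.sqrt(mask.sum() / np.pi)                # radius of a circle with the same area
    return r0 * scale, c0 * scale, radius * scale

# Toy usage on a synthetic eye-like image: bright background with a dark disc.
img = np.full((240, 320), 180, dtype=np.uint8)
yy, xx = np.ogrid[:240, :320]
img[(yy - 120) ** 2 + (xx - 160) ** 2 < 30 ** 2] = 20   # synthetic pupil of radius 30
print(localize_pupil(img))                              # about (120, 160, 30)
```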
886 Interactive Chinese Character Learning System through Pictograph Evolution
Authors: J.H. Low, C.O. Wong, E.J. Han, K.R. Kim, K.C. Jung, H.K. Yang
Abstract:
This paper proposes an Interactive Chinese Character Learning System (ICCLS) based on pictorial evolution as an edutainment concept in computer-based language learning. The origin of the language itself is used as a learning platform, given the complexity of Chinese compared with other languages. Users, especially children, enjoy the system more because they are able to memorize Chinese characters easily and understand more about the origin of each character in a pleasurable learning environment, compared with the traditional approach in which children learn Chinese characters by rote in an unenjoyable environment. Skeletonization is used as the representation of a Chinese character or object, with an animated pictograph evolution to facilitate learning of the language. A shortest skeleton path matching technique is employed for fast and accurate matching in our implementation. The user is required to either write a word or draw a simple 2D object in the input panel, and the matched word or object is displayed together with its pictograph evolution to instill learning. The target of the computer-based learning system is pre-school children between 4 and 6 years old, who can learn Chinese characters in a flexible and entertaining manner while utilizing a visual and mind-mapping strategy as the learning methodology.
Keywords: Computer-based learning, Chinese character, pictograph evolution, skeletonization.
885 PUMA 560 Optimal Trajectory Control using Genetic Algorithm, Simulated Annealing and Generalized Pattern Search Techniques
Authors: Sufian Ashraf Mazhari, Surendra Kumar
Abstract:
Robot manipulators are highly coupled nonlinear systems; therefore, the real system and the mathematical model of the dynamics used for control system design are not the same. Hence, fine-tuning of the controller is always needed. For better tuning, a fast simulation speed is desired. Since MATLAB incorporates LAPACK to increase the speed of matrix computation, the dynamics and the forward and inverse kinematics of the PUMA 560 are modeled in MATLAB/Simulink in such a way that all operations are matrix based, which greatly reduces simulation time. This paper compares PID parameter tuning using Genetic Algorithm, Simulated Annealing, Generalized Pattern Search (GPS) and hybrid search techniques. Controller performance for all these methods is compared in terms of joint-space ITSE and Cartesian-space ISE for tracking circular and butterfly trajectories. A disturbance signal is added to check the robustness of the controller. The GA-GPS hybrid search technique shows the best results for tuning the PID controller parameters in terms of ITSE and robustness.
Keywords: Controller Tuning, Genetic Algorithm, Pattern Search, Robotic Controller, Simulated Annealing.
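For illustration, the sketch below evaluates the ITSE criterion used above on a toy first-order closed loop and searches over a single proportional gain; the plant and the simple random search are stand-ins for the PUMA 560 model and the GA/SA/GPS tuning in the paper.

```python
# Hypothetical sketch: ITSE (integral of time-weighted squared error) for a toy first-order
# closed loop under P control, plus a crude random search over the gain. The plant and the
# search are illustrative stand-ins, not the PUMA 560 dynamics or a real genetic algorithm.
import numpy as np

def itse_for_gain(kp, t_end=5.0, dt=1e-3):
    t = np.arange(0.0, t_end, dt)
    y, integral = 0.0, 0.0
    for ti in t:
        error = 1.0 - y                     # unit step reference
        u = kp * error                      # proportional control
        y += dt * (-y + u)                  # plant: y' = -y + u (Euler step)
        integral += ti * error ** 2 * dt    # time-weighted squared error
    return integral

rng = np.random.default_rng(0)
candidates = rng.uniform(0.5, 20.0, size=50)            # stand-in for a GA population
best_kp = min(candidates, key=itse_for_gain)
print(f"best Kp = {best_kp:.2f}, ITSE = {itse_for_gain(best_kp):.4f}")
```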
884 Determinants of Students' Intentions to Use a Mobile Messaging Service in Educational Institutions: A Theoretical Model
Authors: Boonlert Watjatrakul
Abstract:
Mobile marketing through mobile messaging services has grown impressively, as it enables e-business firms to communicate with their customers effectively. Educational institutions have therefore started using this service to enhance communication with their students. Previous studies, however, provide limited understanding of the application of mobile messaging services in education. This study proposes a theoretical model to understand the drivers of students' intentions to use a university's mobile messaging service. The model indicates that social influence, perceived control and attitudes affect students' intention to use the university's mobile messaging service. It also provides five antecedents of students' attitudes: perceived utility (information utility, entertainment utility, and social utility), innovativeness, information seeking, transaction specificity (content specificity, sender specificity, and time specificity) and privacy concern. The proposed model enables universities to understand what concerns students about the use of a mobile messaging service and to handle the service more effectively. The paper discusses the model development and concludes with limitations and implications of the proposed model.
Keywords: Education, intention, mobile marketing, mobile messaging.
883 Semantically Enriched Web Usage Mining for Personalization
Authors: Suresh Shirgave, Prakash Kulkarni, José Borges
Abstract:
The continuous growth in the size of the World Wide Web has resulted in intricate Web sites, demanding enhanced user skills and more sophisticated tools to help the Web user find the desired information. In order to make the Web more user friendly, it is necessary to provide personalized services and recommendations to the Web user. Many Web usage mining techniques have been applied to discover interesting and frequent navigation patterns from Web server logs. The recommendation accuracy of usage-based techniques can be improved by integrating Web site content and site structure into the personalization process.
Herein, we propose a semantically enriched Web Usage Mining method for Personalization (SWUMP), an extension to the solely usage-based technique. This approach is a combination of the fields of Web Usage Mining and the Semantic Web. In the proposed method, we envisage enriching the undirected graph derived from usage data with rich semantic information extracted from the Web pages and the Web site structure. The experimental results show that SWUMP generates accurate recommendations and is able to achieve 10-20% better accuracy than the solely usage-based model. SWUMP also addresses the new-item problem inherent to solely usage-based techniques.
Keywords: Prediction, Recommendation, Semantic Web Usage Mining, Web Usage Mining.
882 Role of Process Parameters on Pocket Milling with Abrasive Water Jet Machining Technique
Authors: T. V. K. Gupta, J. Ramkumar, Puneet Tandon, N. S. Vyas
Abstract:
Abrasive Water Jet Machining is an unconventional machining process well known for machining hard-to-cut materials. The primary research focus on the process has been on through-cutting, and very limited literature is available on pocket milling using AWJM. The present work is an attempt to use this process for milling applications, considering a set of process parameters. Four input parameters that researchers have considered for part separation are selected for this application, i.e., abrasive size, flow rate, standoff distance and traverse speed. Pockets of definite size are machined to investigate surface roughness, material removal rate and pocket depth. Based on data from experiments on SS304 material, it is observed that higher traverse speeds give a better finish because of the reduction in particle energy density, and a lower depth is also observed. Increasing the standoff distance and the abrasive flow rate reduces the rate of material removal, as the jet loses its focus and collisions occur between the particles. ANOVA has been carried out for each output parameter to identify the significant process parameters.
Keywords: Abrasive flow rate, surface finish, abrasive size, standoff distance, traverse speed.
881 Corporate Governance Role of Audit Committees in the Banking Sector: Evidence from Libya
Authors: Abdulaziz Abdulsaleh
Abstract:
This study aims at identifying the practices that should be taken into consideration by audit committees as a tool of corporate governance in Libyan commercial banks by investigating various perceptions on this topic. The study is based on a questionnaire submitted to audit committee members at Libyan commercial banks, directors of internal audit departments and members of boards of directors at these banks, in addition to a number of external auditors and academic staff from Libyan universities. The study reveals that the role of audit committees has to shift from the traditional areas of accounting to a broader role that includes functions related to financial reporting, audit planning, supporting the independence of internal and external auditors, acting as a channel of communication between external auditors and the board of directors, reviewing the external audit, and evaluating internal control systems. Although the study is a starting point in developing a framework of good audit committee practices in Libya, it is believed that the adoption of its results can enhance corporate governance practices not only in the banking sector but also in the entire corporate sector in Libya.
Keywords: Audit committees, Corporate Governance, Commercial Banks, Libya.
880 An Integrative Bayesian Approach to Supporting the Prediction of Protein-Protein Interactions: A Case Study in Human Heart Failure
Authors: Fiona Browne, Huiru Zheng, Haiying Wang, Francisco Azuaje
Abstract:
Recent years have seen a growing trend towards the integration of multiple information sources to support large-scale prediction of protein-protein interaction (PPI) networks in model organisms. Despite advances in computational approaches, the combination of multiple "omic" datasets representing the same type of data, e.g. different gene expression datasets, has not been rigorously studied. Furthermore, there is a need to further investigate the inference capability of powerful approaches, such as fully connected Bayesian networks, in the context of the prediction of PPI networks. This paper addresses these limitations by proposing a Bayesian approach to integrate multiple datasets, some of which encode the same type of "omic" data, to support the identification of PPI networks. The case study reported involved the combination of three gene expression datasets relevant to human heart failure (HF). In comparison with two traditional methods, the Naive Bayesian and maximum likelihood ratio approaches, the proposed technique can accurately identify known PPIs and can be applied to infer potentially novel interactions.
Keywords: Bayesian network, Classification, Data integration, Protein interaction networks.
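A minimal sketch of the naive Bayesian style of evidence integration mentioned above as a baseline: under an independence assumption, likelihood ratios from separate evidence sources multiply into a combined score for each candidate protein pair. All numbers are illustrative.

```python
# Hypothetical sketch: naive-Bayes integration of several evidence sources for PPI prediction.
# Each source contributes a likelihood ratio P(evidence | interaction) / P(evidence | no interaction);
# under the naive independence assumption these multiply. All values below are made up.
import math

prior_odds = 1 / 600          # assumed prior odds that a random protein pair interacts

evidence = {                  # likelihood ratios per evidence source for two candidate pairs
    ("P1", "P2"): {"coexpression_set_A": 4.0, "coexpression_set_B": 3.5, "shared_GO_term": 6.0},
    ("P1", "P3"): {"coexpression_set_A": 0.8, "coexpression_set_B": 1.1, "shared_GO_term": 0.6},
}

for pair, ratios in evidence.items():
    combined_lr = math.prod(ratios.values())          # naive Bayes: multiply likelihood ratios
    posterior_odds = prior_odds * combined_lr
    posterior_prob = posterior_odds / (1 + posterior_odds)
    print(pair, f"combined LR = {combined_lr:.1f}, P(interaction) = {posterior_prob:.3f}")
```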
879 Efficient High Fidelity Signal Reconstruction Based on Level Crossing Sampling
Authors: Negar Riazifar, Nigel G. Stocks
Abstract:
This paper proposes strategies in level crossing (LC) sampling and reconstruction that provide high fidelity signal reconstruction for speech signals; these strategies circumvent the problem of an exponentially increasing number of samples as the bit depth is increased and are hence highly efficient. Specifically, the results indicate that the distribution of the intervals between samples is one of the key factors in the quality of signal reconstruction; including samples with short intervals does not improve the accuracy of the signal reconstruction, whilst samples with large intervals lead to numerical instability. The proposed sampling method, termed reduced conventional level crossing (RCLC) sampling, exploits redundancy between samples to improve the efficiency of the sampling without compromising performance. A reconstruction technique is also proposed that enhances the numerical stability through linear interpolation of samples separated by large intervals. Interpolation is demonstrated to improve the accuracy of the signal reconstruction in addition to the numerical stability. We further demonstrate that the RCLC and interpolation methods can give useful levels of signal recovery even if the average sampling rate is less than the Nyquist rate.
Keywords: Level crossing sampling, numerical stability, speech processing, trigonometric polynomial.
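For illustration, the sketch below performs plain level-crossing sampling over a uniform level grid and reconstructs the waveform by linear interpolation; the levels, bit depth and toy tone are assumptions, and this is not the RCLC scheme or the trigonometric-polynomial reconstruction studied in the paper.

```python
# Hypothetical sketch: record a sample whenever the signal crosses one of a set of uniform
# amplitude levels, then rebuild the waveform by linear interpolation between those samples.
# The level grid, bit depth and signal are illustrative, not the paper's RCLC method.
import numpy as np

fs = 8000
t = np.arange(0, 0.02, 1 / fs)
x = 0.8 * np.sin(2 * np.pi * 440 * t)                     # toy stand-in for a speech segment

levels = np.linspace(-1, 1, 2 ** 4 + 1)                   # 4-bit uniform level grid

def level_crossing_sample(x, levels):
    idx = [0]
    for i in range(1, len(x)):
        lo, hi = sorted((x[i - 1], x[i]))
        if np.any((levels > lo) & (levels <= hi)):         # a level was crossed in this step
            idx.append(i)
    return np.array(idx)

idx = level_crossing_sample(x, levels)
x_hat = np.interp(t, t[idx], x[idx])                       # linear interpolation reconstruction

snr_db = 10 * np.log10(np.mean(x ** 2) / np.mean((x - x_hat) ** 2))
print(f"kept {len(idx)} of {len(x)} samples, reconstruction SNR = {snr_db:.1f} dB")
```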
878 Study on a Nested Cartesian Grid Method
Authors: Yih-Ferng Peng
Abstract:
In this paper, local grid refinement is addressed using a nested grid technique. A Cartesian grid numerical method is developed for simulating unsteady, viscous, incompressible flows with complex immersed boundaries. A finite volume method is used in conjunction with a two-step fractional-step procedure. The key aspects that need to be considered in developing such a nested grid solver are the imposition of interface conditions at the inter-block boundary and the accurate discretization of the governing equations in cells that have the inter-block boundary as a control surface. A new interpolation procedure is presented which allows the systematic development of a spatial discretization scheme that preserves the spatial accuracy of the underlying solver. The present nested grid method has been tested on two numerical examples to examine its performance on two-dimensional problems. The numerical examples include flow past a circular cylinder symmetrically installed in a channel and flow past two circular cylinders with different diameters. The numerical experiments demonstrate the ability of the solver to simulate flows with complicated immersed boundaries and show that the nested grid approach can efficiently speed up the numerical solution.
Keywords: local grid refinement, Cartesian grid, nested grid, fractional-step method.
877 Measuring Process Component Design on Achieving Managerial Goals
Authors: Eakong Atiptamvaree, Twittie Senivongse
Abstract:
Process-oriented software development is a new software development paradigm in which software design is modeled by a business process, which is in turn translated into a process execution language for execution. The building blocks of this paradigm are software units that are composed to work according to the flow of the business process. This new paradigm still exhibits the characteristics of applications built with traditional software component technology. This paper discusses an approach that applies a traditional technique for software component fabrication to the design of process-oriented software units, called process components. These process components result from decomposing a business process of a particular application domain into subprocesses, and they can be reused to design the business processes of other application domains. The decomposition considers five managerial goals, namely cost effectiveness, ease of assembly, customization, reusability, and maintainability. The paper presents how to design or decompose process components from a business process model and how to measure technical features of the design that would affect the managerial goals. Comparing the measurement values from different designs can tell which process component design is more appropriate for the managerial goals that have been set. The proposed approach can be applied in a Web Services environment, which accommodates process-oriented software development.
Keywords: Business Process Model, Managerial Goals, Process Component.