Search results for: control methods
1738 The Estimation of Human Vital Signs Complexity
Authors: L. Bikulciene, E. Venskaityte, G. Jarusevicius
Abstract:
Nonstationary and nonlinear signals generated by living complex systems defy traditional mechanistic approaches, which are based on homeostasis. Our previous studies have shown that evaluating the interactions of physiological signals with special analysis methods is suitable for observing physiological processes. We demonstrate the possibility of using a deep physiological model, based on the interpretation of changes in the human body's functional states, combined with an analytical method based on matrix theory for physiological signal analysis, applied to high-risk cardiac patients. It is shown that the evaluation of cardiac signal interactions reveals functional changes peculiar to each individual at the onset of the hemodynamic restoration procedure. Therefore, we suggest that the assessment of alterations in the functional state of the body after patients undergo surgery can be complemented by data obtained from the suggested approach of evaluating the interactions of functional variables.
Keywords: Cardiac diseases, Complex systems theory, ECG analysis, matrix analysis.
1737 Implementation of a Low-Cost Instrumentation for an Open Cycle Wind Tunnel to Evaluate Pressure Coefficient
Authors: Cristian P. Topa, Esteban A. Valencia, Victor H. Hidalgo, Marco A. Martinez
Abstract:
Wind tunnel experiments for aerodynamic profiles offer numerous advantages, such as clean steady laminar flow, controlled environmental conditions, streamline visualization, and real data acquisition. However, the experimental instrumentation is usually expensive, and hence each test implies an increase in design cost. The aim of this work is to select and implement a low-cost static pressure data acquisition system for a NACA 2412 airfoil in an open cycle wind tunnel. This work compares the wind tunnel experiment with a Computational Fluid Dynamics (CFD) simulation and a parametric analysis. The experiment was evaluated at a Reynolds number of 1.65e5, with angles increasing from -5° to 15°. The comparison between the experiment and CFD shows good accuracy, while the parametric analysis results differ widely from the other methods, which is consistent with the limited accuracy of this lateral approach due to its simplicity.
Keywords: Wind tunnel, low cost instrumentation, experimental testing, CFD simulation.
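As an illustration of how such static pressure readings are typically reduced to the pressure coefficient, the sketch below applies the standard definition Cp = (p - p_inf) / (0.5 * rho * U^2); the tap values, density and velocity are hypothetical placeholders, not data from this study.

```python
import numpy as np

# Hypothetical tap readings: static pressures along the chord (Pa),
# free-stream static pressure p_inf (Pa), density rho (kg/m^3) and
# free-stream velocity U (m/s). Values are illustrative only.
p_taps = np.array([101180.0, 101050.0, 100970.0, 101110.0])
p_inf, rho, U = 101325.0, 1.20, 12.0

q_inf = 0.5 * rho * U**2          # free-stream dynamic pressure
cp = (p_taps - p_inf) / q_inf     # pressure coefficient at each tap
print(cp)
```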
1736 An Efficient Adaptive Thresholding Technique for Wavelet Based Image Denoising
Authors: D.Gnanadurai, V.Sadasivam
Abstract:
This framework describes a computationally more efficient and adaptive threshold estimation method for image denoising in the wavelet domain, based on Generalized Gaussian Distribution (GGD) modeling of subband coefficients. In the proposed method, the threshold is estimated by analysing statistical parameters of the wavelet subband coefficients such as the standard deviation, arithmetic mean and geometric mean. The noisy image is first decomposed into many levels to obtain different frequency bands. Soft thresholding is then used to remove the noisy coefficients, with the optimum threshold value fixed by the proposed method. Experimental results on several test images show that this method yields significantly superior image quality and a better Peak Signal to Noise Ratio (PSNR). To prove the efficiency of this method in image denoising, it is compared with various denoising methods such as the Wiener filter, average filter, VisuShrink and BayesShrink.
Keywords: Wavelet Transform, Gaussian Noise, Image Denoising, Filter Banks and Thresholding.
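A minimal sketch of wavelet-domain denoising with soft thresholding is given below. It uses PyWavelets and the universal (VisuShrink-style) threshold rather than the authors' GGD-based estimator, so the threshold rule is an assumption for illustration only.

```python
import numpy as np
import pywt

def denoise_soft(noisy, wavelet="db4", level=3):
    """Soft-threshold the detail subbands of a 2-D image."""
    coeffs = pywt.wavedec2(noisy, wavelet, level=level)
    # Noise std estimated from the finest diagonal detail subband.
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    thr = sigma * np.sqrt(2.0 * np.log(noisy.size))  # universal threshold
    new_coeffs = [coeffs[0]]                         # keep the approximation untouched
    for (cH, cV, cD) in coeffs[1:]:
        new_coeffs.append(tuple(pywt.threshold(c, thr, mode="soft")
                                for c in (cH, cV, cD)))
    return pywt.waverec2(new_coeffs, wavelet)

img = np.random.rand(128, 128)                       # placeholder clean image
noisy = img + 0.1 * np.random.randn(128, 128)
restored = denoise_soft(noisy)
```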
1735 Using the Nerlovian Adjustment Model to Assess the Response of Farmers to Price and Other Related Factors: Evidence from Sierra Leone Rice Cultivation
Authors: Alhaji M. H. Conteh, Xiangbin Yan, Alfred V. Gborie
Abstract:
The goal of this study was to increase awareness of the description and assessment of rice acreage response and to offer mechanisms for agricultural policy scrutiny. The ordinary least squares (OLS) technique was utilized to determine the coefficients of the acreage response models for the rice varieties. The magnitudes of the coefficients (λ) of both the ROK lagged and NERICA lagged acreages were found to be positive and highly significant, which indicates that the farmers' adjustment rate was very low. Regarding the lagged actual price for both the ROK and NERICA rice varieties, the short-run price elasticities were lower than the long-run ones, suggesting a long-term adjustment of the acreage under the crop.
However, the apparent recommendations for policy transformation are to open up farm gate prices and to decrease government involvement in the agricultural sector, especially in the acquisition of agricultural inputs. Future research should be centered on how this might best be realized. The necessary conditions should be made available to the private sector by minimizing price volatility. In line with structural reforms, it is necessary to convey output prices to farmers with minimum distortion. There is a need to eliminate price subsidies and controls, which generate distortion in the market in addition to huge financial costs.
Keywords: Acreage response, rate of adjustment, rice varieties, Sierra Leone.
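For illustration, the sketch below fits a Nerlovian partial-adjustment acreage response model of the form A_t = α + β·P_{t-1} + λ·A_{t-1} + ε_t by OLS; the series are synthetic placeholders, not the Sierra Leone data, and the long-run price response follows the usual relation long-run = short-run / (1 - λ).

```python
import numpy as np

# Synthetic placeholder series: acreage A_t and lagged farm-gate price P_{t-1}.
rng = np.random.default_rng(0)
T = 30
P = rng.uniform(50, 150, T)                    # price series (illustrative)
A = np.zeros(T)
A[0] = 100.0
for t in range(1, T):                          # generate data with known dynamics
    A[t] = 20 + 0.3 * P[t - 1] + 0.6 * A[t - 1] + rng.normal(0, 2)

# OLS on A_t = alpha + beta*P_{t-1} + lam*A_{t-1}
y = A[1:]
X = np.column_stack([np.ones(T - 1), P[:-1], A[:-1]])
alpha, beta, lam = np.linalg.lstsq(X, y, rcond=None)[0]

adjustment_rate = 1.0 - lam                    # Nerlovian coefficient of adjustment
long_run_beta = beta / (1.0 - lam)             # long-run price response
print(alpha, beta, lam, adjustment_rate, long_run_beta)
```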
1734 Designing of the Heating Process for Fiber-Reinforced Thermoplastics with Middle-Wave Infrared Radiators
Abstract:
Manufacturing components from fiber-reinforced thermoplastics requires three steps: heating the matrix, forming and consolidating the composite, and finally cooling the matrix. For the heating process, a pre-determined temperature distribution through the layers and the thickness of the pre-consolidated sheets is recommended to enable the forming mechanism. Thus, a design of the heating process for forming composites with thermoplastic matrices is necessary. To obtain a constant temperature through the thickness and width of the sheet, the heating process was analyzed with the help of the finite element method. The simulation models were validated by experiments with resistance thermometers as well as with an infrared camera. Based on the finite element simulation, heating methods for the infrared radiators have been developed. Using the numerical simulation, many iteration loops are required to determine the process parameters. Hence, a model for calculating the relevant process parameters was initiated by applying regression functions.
Keywords: Fiber-reinforced thermoplastics, heating strategies, middle-wave infrared radiator.
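As a minimal, hypothetical illustration of replacing repeated simulation loops with a regression model, the sketch below fits a polynomial regression of reached sheet temperature against heating time from a handful of assumed finite element results and then inverts it numerically to pick a heating time for a target temperature; all values are made up.

```python
import numpy as np

# Hypothetical FE results: heating time (s) vs. reached core temperature (degC).
t_heat = np.array([10, 20, 30, 40, 50, 60], dtype=float)
T_core = np.array([95, 150, 195, 228, 252, 268], dtype=float)

coeffs = np.polyfit(t_heat, T_core, deg=2)        # quadratic regression function
T_model = np.poly1d(coeffs)

# Pick the shortest heating time that reaches a target forming temperature.
target = 220.0
t_grid = np.linspace(t_heat.min(), t_heat.max(), 1000)
t_required = t_grid[np.argmax(T_model(t_grid) >= target)]
print(round(t_required, 1), "s to reach", target, "degC (regression estimate)")
```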
1733 A Tabu Search Heuristic for Scratch-Pad Memory Management
Authors: Maha Idrissi Aouad, Rene Schott, Olivier Zendra
Abstract:
Reducing the energy consumption of embedded systems requires careful memory management. It has been shown that Scratch-Pad Memories (SPMs) are small, low-cost, energy-efficient data structures managed directly at the software level. In this paper, the focus is on heuristic methods for SPM management. A method is efficient if the number of accesses to the SPM is as large as possible and if all available space (i.e. bits) is used. A Tabu Search (TS) approach for memory management is proposed which is, to the best of our knowledge, a new alternative to the best known existing heuristic (BEH). Experiments performed on benchmarks show that the Tabu Search method is as efficient as BEH in terms of energy consumption, but BEH requires a sorting step which can be computationally expensive for a large amount of data. TS is easy to implement and, since no sorting is necessary, the corresponding sorting time is saved. In addition, in a dynamic setting where the maximum capacity of the SPM is not known in advance, the TS heuristic will perform better than BEH.
Keywords: Energy consumption, memory allocation management, optimization, tabu search heuristic.
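The sketch below is a generic tabu search for a knapsack-style allocation (which variables to place in a limited SPM to maximize served accesses); the cost model, flip neighborhood and tabu tenure are assumptions for illustration, not the paper's exact formulation.

```python
def tabu_search_spm(accesses, sizes, capacity, iters=500, tenure=7):
    """Choose a subset of variables to place in the SPM, maximizing served accesses."""
    n = len(accesses)

    def value(sol):
        used = sum(s for s, x in zip(sizes, sol) if x)
        return -1 if used > capacity else sum(a for a, x in zip(accesses, sol) if x)

    current = [0] * n                      # 0 = main memory, 1 = placed in SPM
    best, best_val = current[:], 0
    tabu = {}                              # variable index -> iteration until which it is tabu

    for it in range(iters):
        candidates = []
        for i in range(n):                 # neighborhood: flip one variable
            neigh = current[:]
            neigh[i] ^= 1
            v = value(neigh)
            if v < 0:
                continue                   # violates the SPM capacity
            if tabu.get(i, -1) < it or v > best_val:   # tabu check with aspiration
                candidates.append((v, i, neigh))
        if not candidates:
            break
        v, i, current = max(candidates)
        tabu[i] = it + tenure              # forbid flipping i back for `tenure` iterations
        if v > best_val:
            best, best_val = current[:], v
    return best, best_val

acc = [120, 80, 60, 200, 30, 90]           # hypothetical access counts per variable
sz = [16, 8, 32, 24, 4, 12]                # hypothetical variable sizes (bytes)
print(tabu_search_spm(acc, sz, capacity=48))
```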
1732 Vibration Analysis of an Alstom Typhoon Gas Turbine Power Plant Related to Iran Oil Industry
Authors: Omid A. Zargar
Abstract:
Vibration analysis is one of the most important factors in preventive maintenance, and gas turbine vibration analysis is among the most challenging categories in critical equipment monitoring systems. Utilities are the heart of the process in big industrial plants such as petrochemical zones. Vibration analysis methods and condition monitoring systems for this kind of equipment have developed considerably in recent years. On the other hand, many operating conditions of this kind of equipment, such as the inlet and outlet pressure and temperature for both the turbine and the compressor, should be adjusted properly. In this paper, the most important tools and hypotheses used for analyzing gas turbine power plants are discussed in detail through a real case history of an Alstom Typhoon gas turbine power plant in the Iranian oil industry. In addition, the basic principle of the vibration behavior caused by mechanical unbalance in a gas turbine rotor is discussed in detail.
Keywords: Vibration analysis, gas turbine, time wave form (TWF), fast Fourier transform (FFT), phase angle.
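As a small illustration of the TWF/FFT tools mentioned in the keywords, the sketch below converts a synthetic time waveform into an amplitude spectrum with NumPy and reads off the dominant frequency; the sampling rate, running speed and signal content are made-up values, not measurements from the case history.

```python
import numpy as np

fs = 5000.0                                   # sampling rate, Hz (assumed)
t = np.arange(0, 1.0, 1 / fs)
run_speed_hz = 170.0                          # hypothetical 1x running speed
# Synthetic time waveform: 1x unbalance component + a small 2x harmonic + noise.
twf = 3.0 * np.sin(2 * np.pi * run_speed_hz * t) \
    + 0.5 * np.sin(2 * np.pi * 2 * run_speed_hz * t) \
    + 0.2 * np.random.randn(t.size)

spectrum = np.abs(np.fft.rfft(twf)) * 2 / t.size   # single-sided amplitude spectrum
freqs = np.fft.rfftfreq(t.size, 1 / fs)
dominant = freqs[np.argmax(spectrum[1:]) + 1]      # skip the DC bin
print("dominant component:", dominant, "Hz")
```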
1731 Analysis of Heuristic Based Hybrid Simulated Annealing Algorithm for Multiprocessor Task Scheduling
Authors: Supriya Arya, Sunita Dhingra
Abstract:
The multiprocessor task scheduling problem for dependent and independent tasks is a computationally complex problem. Many methods have been proposed to achieve the optimal running time. As multiprocessor task scheduling is NP-hard in nature, many heuristics have been proposed which improve the makespan of the problem. However, due to the problem-specific nature, a heuristic method which provides the best results for one problem might not provide good results for another. Therefore, Simulated Annealing, which is a metaheuristic approach, is considered; it can be applied to all types of problems. However, due to the many runs required, a metaheuristic approach takes a large computation time. Hence, a hybrid approach is proposed by combining the Duplication Scheduling Heuristic and Simulated Annealing (SA), and the makespan results of simple Simulated Annealing and the hybrid approach are analyzed.
Keywords: Multiprocessor task scheduling Problem, Makespan, Duplication Scheduling Heuristic, Simulated Annealing, Hybrid Approach.
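Below is a minimal simulated annealing sketch for makespan minimization of independent tasks on m processors; the move-one-task neighborhood and geometric cooling schedule are illustrative assumptions, and the duplication-heuristic seeding of the hybrid approach is not reproduced.

```python
import math
import random

def makespan(assign, times, m):
    load = [0.0] * m
    for task, proc in enumerate(assign):
        load[proc] += times[task]
    return max(load)

def simulated_annealing(times, m, T0=50.0, alpha=0.97, iters=3000, seed=0):
    rng = random.Random(seed)
    assign = [rng.randrange(m) for _ in times]       # random initial schedule
    best = assign[:]
    T = T0
    for _ in range(iters):
        cand = assign[:]
        cand[rng.randrange(len(times))] = rng.randrange(m)   # move one task
        delta = makespan(cand, times, m) - makespan(assign, times, m)
        if delta <= 0 or rng.random() < math.exp(-delta / T):
            assign = cand
            if makespan(assign, times, m) < makespan(best, times, m):
                best = assign[:]
        T *= alpha                                   # geometric cooling
    return best, makespan(best, times, m)

task_times = [4, 7, 2, 9, 5, 3, 6, 8]                # hypothetical task runtimes
print(simulated_annealing(task_times, m=3))
```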
1730 Structural and Electronic Characterization of Supported Ni and Au Catalysts used in Environment Protection Determined by XRD, XAS and XPS methods
Authors: N. Aldea, V. Rednic, F. Matei, Tiandou Hu, M. Neumann
Abstract:
The nickel and gold nanoclusters as supported catalysts were analyzed by XAS, XRD and XPS in order to determine their local, global and electronic structure. The present study points out a strong deformation of the local structure of the metal due to its interaction with the oxide supports. The average particle size, the mean squares of the microstrain, and the particle size distribution and microstrain functions of the supported Ni and Au catalysts were determined by the XRD method, using the Generalized Fermi Function for the approximation of the X-ray line profiles. Based on the EXAFS analysis, we consider that the local structure of the investigated systems is strongly distorted concerning the atomic pair numbers. Metal-support interaction is confirmed by the shape changes of the probability densities of electron transitions: Ni K edge (1s → continuum and 2p), Au LIII edge (2p3/2 → continuum, 6s, 6d5/2 and 6d3/2). XPS investigations confirm the metal-support interaction at their interface.
Keywords: Local and global structure, metal-support interaction, supported metal catalysts, synchrotron radiation, X-ray absorption spectroscopy, X-ray diffraction, X-ray photoelectron spectroscopy.
1729 Metal(loids) Speciation Using HPLC-ICP-MS Technique in Klodnica River, Upper Silesia, Poland
Authors: Magdalena Jabłońska-Czapla
Abstract:
This work provided knowledge about the redox and speciation changes of As, Cr and Sb ionic forms in Klodnica River water. Such studies have never before been conducted in this region of Poland. Previously optimized and validated HPLC-ICP-MS methods for the determination of As, Sb and Cr were used. The separation step was performed using a high-performance liquid chromatograph equipped with an ion-exchange column, followed by an ICP-MS detector. Preliminary studies included determination of the total concentrations of As, Sb and Cr, as well as the pH, Eh, temperature and conductivity of the water samples. The study was conducted monthly from March to August 2014 at six points on the Klodnica River. The results indicate that the acceptable concentrations of total Cr and Sb were exceeded, and the waters of the Klodnica River should be classified below the second purity class. Oxidized antimony and arsenic forms dominate in the Klodnica River waters, together with both chromium forms, Cr(VI) and Cr(III). The studies also showed the presence of a methyl derivative of arsenic.
Keywords: Antimony, arsenic, chromium, HPLC-ICP-MS, river water, speciation.
1728 Evaluation of the Microbiological, Chemical and Sensory Quality of Carp Processed by the Sous Vide Method
Authors: Özlem Pelin Can
Abstract:
This study evaluated the microbiological quality and the sensory characteristics of carp fillets processed by the sous vide method when stored at 2 and 10 °C. Four different sauce and storage combinations were studied, and samples stored at 2 or 10 °C were periodically evaluated for sensory, microbiological and chemical quality. Batches stored at 2 °C had lower growth rates of mesophiles and psychrotrophs. Moreover, these counts decreased with increasing heating temperature and time. Staphylococcus aureus, Bacillus cereus, Clostridium perfringens and Listeria monocytogenes were not found in any of the samples. The heat treatment of 90 °C for 15 min with sauce was the most effective at ensuring the safety and extending the shelf-life of sous vide carp while preserving its sensory characteristics. This study establishes the microbiological quality of sous vide carp and emphasizes the relevance of the raw materials, heat treatment and storage temperature to ensuring the safety of the product.
Keywords: Sous vide method, carp, sauce, microbiological, chemical and sensory quality.
1727 Modeling and Analysis for Effective Capacity of a Cross-Layer Optimized Wireless Networks
Authors: Reham A. El-mayet, Hesham M. El-Badawy, Salwa H. Elramly
Abstract:
New generation mobile communication networks have the ability to support triple play. To this end, Orthogonal Frequency Division Multiplexing (OFDM) access techniques have been chosen to enlarge the system capability for high data rate networks. Many cross-layer modeling and optimization schemes for the Quality of Service (QoS) and capacity of the downlink multiuser OFDM system have been proposed. In this paper, Maximum Weighted Capacity (MWC) based resource allocation at the Physical (PHY) layer is used. This resource allocation scheme provides much better QoS than previous resource allocation schemes, while maintaining the highest or nearly the highest capacity at similar complexity. In addition, Delay Satisfaction (DS) scheduling at the Medium Access Control (MAC) layer, which allows more than one connection to be served in each slot, is used. This scheduling technique is more efficient than conventional scheduling for investigating both the number of users and the number of subcarriers against system capacity. The system is optimized for different operational environments: outdoor as well as indoor deployment scenarios are investigated, and also different channel models. In addition, the effective capacity approach [1] is used not only to provide QoS to different mobile users, but also to increase the total throughput of the wireless network.
Keywords: Cross-layer, effective capacity, LTE, OFDM, QoS, resource allocation, wireless networks.
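For readers unfamiliar with the effective capacity concept referenced above, the sketch below gives a finite-horizon Monte Carlo estimate of EC(theta) ≈ -(1/(theta*T)) * ln E[exp(-theta*S(T))], where S(T) is the cumulative service over T slots; the i.i.d. Rayleigh-fading service model and all parameters are illustrative assumptions, not the system model of the paper.

```python
import numpy as np
from scipy.special import logsumexp

def effective_capacity(theta, service):
    """Finite-horizon Monte Carlo estimate of EC(theta).

    service: array of shape (runs, T) with per-slot service in bits/slot.
    """
    runs, T = service.shape
    S_T = service.sum(axis=1)                        # cumulative service per run
    log_mgf = logsumexp(-theta * S_T) - np.log(runs) # log E[exp(-theta * S_T)]
    return -log_mgf / (theta * T)

rng = np.random.default_rng(0)
runs, T = 20000, 50
snr = 4.0 * rng.exponential(size=(runs, T))          # Rayleigh fading -> exponential SNR
service = np.log2(1.0 + snr)                         # Shannon rate per slot (bits/s/Hz)

for theta in (0.01, 0.1, 1.0):                       # larger theta = stricter delay QoS
    print(theta, round(float(effective_capacity(theta, service)), 3))
```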
1726 Evaluation of Food Safety Management Systems of Food Service Establishments within the Greater Accra Region
Authors: Benjamin Osei-Tutu
Abstract:
Food contaminated with biological, chemical and physical hazards usually leads to foodborne illnesses, which in turn increase the disease burden of developing and developed economies. Restaurants play a key role in the food service industry, and violations in the application of standardized food safety management systems in these establishments have been associated with foodborne disease outbreaks. This study was undertaken to assess the level of compliance with the Code of Practice that was developed and implemented after conducting a needs assessment of the food safety management systems employed by food service establishments in Ghana. Data on pre-licence inspections were reviewed to assess the compliance of the food service establishments. During the period under review (2012-2016), 74.52% of the food service facilities in the hospitality industry were in compliance with the FDA's code of practice. The main violations observed during the study bordered on facility layout and fabrication (61.8%), because these facilities may not have been built for use as food service establishments. Another fact that came to the fore was that redesigning the facilities to bring them into compliance required capital-intensive investments, which some establishments are not prepared for. Other challenges faced by the industry concerned records and documentation, personnel facilities and hygiene, raw material acquisition, storage and control, and cold storage.
Keywords: Assessment, Accra, food safety management systems, restaurants, hotel.
1725 Quality as an Approach to Organizational Change and Its Role in the Reorganization of Enterprises: Case of Four Moroccan Small and Medium-Sized Enterprises
Authors: A. Boudiaf
Abstract:
The purpose of this paper is to analyze and understand, through four case studies, the interest of the project of implementing a quality management system (QMS) at four Moroccan small and medium-sized enterprises (SMEs). Such a project can generate significant organizational change to improve the functioning of the organization. In fact, quality is becoming a necessity in the current business world and is considered a major component of companies' competitive strategies. It should be noted that quality management is characterized by a set of methods and techniques that can be used to solve malfunctions and reorganize companies. It is useful to point out that the choice to adopt the quality approach could be influenced by the circumstances of the business context, or it could be derived from the company's strategic vision; this means that the choice can be characterized as either strategic or reactive. This would probably have a major impact on the functioning of the QMS and also on the perception of the quality issue by company managers and their employees.
Keywords: Business context, organizational change, quality, reorganization.
1724 Ranking Alternatives in Multi-Criteria Decision Analysis using Common Weights Based on Ideal and Anti-ideal Frontiers
Authors: Saber Saati Mohtadi, Ali Payan, Azizallah Kord
Abstract:
One of the most important issues in multi-criteria decision analysis (MCDA) is determining the weights of criteria so that all alternatives can be compared based on the collective performance of the criteria. In this paper, one of the popular methods in data envelopment analysis (DEA), known as common weights (CWs), is used to determine the weights in MCDA. Two frontiers, named the ideal and anti-ideal frontiers, are defined based on two newly proposed CWs models, instead of ideal and anti-ideal alternatives. The ideal and anti-ideal frontiers are more flexible than ideal and anti-ideal alternatives. According to the optimal solutions of these two models, the distances of an alternative from the ideal and anti-ideal frontiers are derived. Then, a relative distance is introduced to measure the value of each alternative. The suggested models are linear and remain feasible despite the weight restrictions. An example is presented to explain the method and to compare it with the existing literature.
Keywords: Anti-ideal frontier, Common weights (CWs), Ideal frontier, Multi-criteria decision analysis (MCDA)
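The sketch below illustrates only the final ranking step described above: given each alternative's distances to an ideal and an anti-ideal reference, a relative distance score is computed and the alternatives are ranked. The distance values are placeholders, and the two proposed linear CWs models that would produce them are not reproduced here.

```python
# Hypothetical distances of four alternatives from the ideal and anti-ideal frontiers,
# e.g. obtained from the optimal solutions of the two CWs models.
d_ideal = {"A1": 0.20, "A2": 0.45, "A3": 0.10, "A4": 0.60}
d_anti = {"A1": 0.70, "A2": 0.40, "A3": 0.85, "A4": 0.30}

# Relative distance: closer to the ideal and farther from the anti-ideal is better.
scores = {a: d_anti[a] / (d_ideal[a] + d_anti[a]) for a in d_ideal}
ranking = sorted(scores, key=scores.get, reverse=True)
print(scores)
print("ranking:", ranking)
```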
1723 A Finite Difference Calculation Procedure for the Navier-Stokes Equations on a Staggered Curvilinear Grid
Authors: R. M. Barron, B. Zogheib
Abstract:
A new numerical method for solving the two-dimensional, steady, incompressible, viscous flow equations on a curvilinear staggered grid is presented in this paper. The proposed methodology is finite difference based but essentially takes advantage of the best features of two well-established numerical formulations, the finite difference and finite volume methods. Some weaknesses of the finite difference approach are removed by exploiting the strengths of the finite volume method. In particular, the issue of velocity-pressure coupling is dealt with in the proposed finite difference formulation by developing a pressure correction equation in a manner similar to the SIMPLE approach commonly used in finite volume formulations. However, since this is purely a finite difference formulation, numerical approximation of fluxes is not required. Results obtained from the present method are based on the first-order upwind scheme for the convective terms, but the methodology can easily be modified to accommodate higher order differencing schemes.
Keywords: Curvilinear, finite difference, finite volume, SIMPLE.
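As a small, self-contained illustration of the first-order upwind differencing mentioned for the convective terms, the sketch below advects a 1-D scalar profile with a finite difference upwind scheme on a periodic domain; it is of course far simpler than the 2-D staggered-grid Navier-Stokes formulation of the paper, and all parameters are arbitrary.

```python
import numpy as np

nx, L = 200, 1.0
dx = L / nx
u = 1.0                                   # constant advection velocity (> 0)
dt = 0.4 * dx / u                         # CFL number 0.4
x = np.linspace(0.0, L, nx, endpoint=False)
phi = np.where((x > 0.1) & (x < 0.3), 1.0, 0.0)   # initial square pulse

for _ in range(200):
    # First-order upwind: for u > 0 use the backward difference (phi_i - phi_{i-1}) / dx.
    phi = phi - u * dt / dx * (phi - np.roll(phi, 1))

print(phi.max(), x[np.argmax(phi)])       # pulse has moved and diffused numerically
```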
1722 Proximate and Mineral Composition of Chicken Giblets from Vojvodina (Northern Serbia)
Authors: M. R. Jokanović, V. M. Tomović, M. T. Jović, S. B. Škaljac, B. V. Šojić, P. M. Ikonić, T. A. Tasić
Abstract:
The proximate (moisture, protein, total fat, total ash) and mineral (K, P, Na, Mg, Ca, Zn, Fe, Cu and Mn) composition of chicken giblets (heart, liver and gizzard) was investigated. The phosphorus content, as well as the proximate composition, was determined according to the recommended ISO methods. The contents of all elements except phosphorus were determined using inductively coupled plasma-optical emission spectrometry (ICP-OES) after dry ashing mineralization. Regarding the proximate composition, the heart had the highest total fat content and the lowest protein content. The liver had the highest protein and total ash contents, while the gizzard had the highest moisture and the lowest total fat content. Regarding the mineral composition, the liver had the highest contents of K, P, Ca, Mg, Fe, Zn, Cu and Mn, while the heart had the highest Na content. The contents of almost all investigated minerals in the analysed giblet tissues of chickens from Vojvodina were similar to values reported in the literature, i.e. in the national food composition databases of other countries.
Keywords: Chicken giblets, proximate composition, mineral composition.
1721 Musical Notation Reading versus Alphabet Reading - Comparison and Implications for Teaching Music Reading to Students with Dyslexia
Authors: Ora Geiger
Abstract:
This paper discusses the question of whether a person diagnosed with dyslexia will necessarily have difficulty in reading musical notes. The author specifies the characteristics of alphabet reading in comparison to musical notation reading and concludes that there should be no contraindication to teaching standard music reading to children with dyslexia if an appropriate process is offered. This conclusion is based on a long-term case study and relies on two main characteristics of music reading: (1) the musical notation system is a systematic, logical, relative set of symbols written on a staff; and (2) learning to read music in connection with playing a musical instrument is a multi-sensory activity that combines sight, hearing, touch, and movement. The paper describes music reading teaching procedures using soprano recorders and provides unique teaching methods that have been found to be effective for students diagnosed with dyslexia. It provides theoretical explanations in addition to guidelines for music education practices.
Keywords: Alphabet reading, music reading, multisensory teaching method, dyslexia, recorder playing.
1720 A Novel Architecture for Wavelet based Image Fusion
Authors: Susmitha Vekkot, Pancham Shukla
Abstract:
In this paper, we focus on the fusion of images from different sources using multiresolution wavelet transforms. Based on reviews of popular image fusion techniques used in data analysis, different pixel and energy based methods are experimented with. A novel architecture with a hybrid algorithm is proposed, which applies a pixel based maximum selection rule to the low frequency approximations and filter mask based fusion to the high frequency details of the wavelet decomposition. The key feature of the hybrid architecture is the combination of the advantages of pixel and region based fusion in a single image, which can help the development of sophisticated algorithms enhancing the edges and structural details. A Graphical User Interface is developed for image fusion to make the research outcomes available to the end user. To utilize the GUI capabilities for medical, industrial and commercial activities without a MATLAB installation, a standalone executable application is also developed using MATLAB Compiler Runtime.
Keywords: Filter mask, GUI, hybrid architecture, image fusion, Matlab Compiler Runtime, wavelet transform.
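A minimal wavelet-fusion sketch in Python (PyWavelets) is given below: the approximation subbands are fused with a pixel-wise maximum rule, and the detail subbands with a maximum-absolute-value rule, which is a simplification of the filter mask based detail fusion described above; the input images are random placeholders.

```python
import numpy as np
import pywt

def fuse_wavelet(img_a, img_b, wavelet="db2", level=2):
    ca = pywt.wavedec2(img_a, wavelet, level=level)
    cb = pywt.wavedec2(img_b, wavelet, level=level)
    # Low-frequency approximations: pixel-wise maximum selection.
    fused = [np.maximum(ca[0], cb[0])]
    # High-frequency details: keep the coefficient with the larger magnitude
    # (a simplified stand-in for the filter mask based rule).
    for (da, db) in zip(ca[1:], cb[1:]):
        fused.append(tuple(np.where(np.abs(a) >= np.abs(b), a, b)
                           for a, b in zip(da, db)))
    return pywt.waverec2(fused, wavelet)

a = np.random.rand(128, 128)   # e.g. a visible-band image (placeholder)
b = np.random.rand(128, 128)   # e.g. an infrared image (placeholder)
fused_img = fuse_wavelet(a, b)
```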
1719 Dempster-Shafer Evidence Theory for Image Segmentation: Application in Cells Images
Authors: S. Ben Chaabane, M. Sayadi, F. Fnaiech, E. Brassart
Abstract:
In this paper, we propose a new knowledge model using the Dempster-Shafer evidence theory for image segmentation and fusion. The proposed method consists essentially of two steps. First, mass distributions in Dempster-Shafer theory are obtained from the membership degrees of each pixel over the three image components (R, G and B). Each membership degree is determined by applying Fuzzy C-Means (FCM) clustering to the gray levels of the three images. Second, the fusion process consists in defining three discernment frames, which are associated with the three images to be fused, and then combining them to form a new frame of discernment. The strategy used to define mass distributions in the combined framework is discussed in detail. The proposed fusion method is illustrated in the context of image segmentation. Experimental investigations and comparative studies with other previous methods are carried out, showing the robustness and superiority of the proposed method in terms of image segmentation.
Keywords: Fuzzy C-means, color image, data fusion, Dempster-Shafer evidence theory.
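For reference, the sketch below implements Dempster's rule of combination for two mass functions over a small frame of discernment (here two classes, as one might obtain from per-channel FCM memberships); the mass values are illustrative, and the mapping from FCM memberships to masses used in the paper is not reproduced.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for mass functions with frozenset focal elements."""
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb                    # mass assigned to the empty set
    k = 1.0 - conflict                             # normalization factor
    return {s: v / k for s, v in combined.items()}

# Frame of discernment {object, background}; masses from two channels (placeholders).
OBJ, BG = frozenset({"obj"}), frozenset({"bg"})
BOTH = OBJ | BG                                    # ignorance
m_red = {OBJ: 0.6, BG: 0.1, BOTH: 0.3}
m_green = {OBJ: 0.5, BG: 0.2, BOTH: 0.3}
print(dempster_combine(m_red, m_green))
```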
1718 Comparison of Hough Transform and Mean Shift Algorithm for Estimation of the Orientation Angle of Industrial Data Matrix Codes
Authors: Ion-Cosmin Dita, Vasile Gui, Franz Quint, Marius Otesteanu
Abstract:
In the automatic manufacturing and assembly of mechanical, electrical and electronic parts, one needs to reliably identify the position of components and to extract the information carried by these components. Data Matrix Codes (DMC) are nowadays established in many areas of industrial manufacturing thanks to their high concentration of information in small spaces. In today's typically order-driven industry, where increased tracing requirements prevail, they offer further advantages over other identification systems. This underlines in an impressive way the necessity of a robust code reading system for detecting DMC on components in factories. This paper compares two methods for estimating the orientation angle of Data Matrix Codes: one based on the Hough Transform and the other based on the Mean Shift Algorithm. We concentrate on Data Matrix Codes in industrial environments, punched, milled, lasered or etched on different materials in arbitrary orientation.
Keywords: Industrial data matrix code, Hough transform, mean shift.
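The sketch below shows one common way to estimate a code's orientation with the Hough transform using OpenCV: detect edges, accumulate straight lines, and take the dominant line angle (the finder pattern of a DMC produces strong perpendicular lines). The thresholds and the synthetic test image are assumptions for illustration, not the paper's pipeline.

```python
import cv2
import numpy as np

# Synthetic test image: a rotated filled square standing in for a Data Matrix Code.
img = np.zeros((200, 200), dtype=np.uint8)
cv2.rectangle(img, (60, 60), (140, 140), 255, thickness=-1)
M = cv2.getRotationMatrix2D((100, 100), 25.0, 1.0)        # rotate by 25 degrees
img = cv2.warpAffine(img, M, (200, 200))

edges = cv2.Canny(img, 50, 150)
lines = cv2.HoughLines(edges, 1, np.pi / 180.0, threshold=40)

if lines is not None:
    # Each detected line is (rho, theta); theta is the angle of the line normal.
    angles_deg = [np.degrees(l[0][1]) % 90.0 for l in lines]
    orientation = float(np.median(angles_deg))             # fold to [0, 90) for a square code
    print("estimated orientation:", orientation, "deg")
```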
1717 Autonomic Sonar Sensor Fault Manager for Mobile Robots
Authors: Martin Doran, Roy Sterritt, George Wilkie
Abstract:
NASA, ESA, and NSSC space agencies have plans to put planetary rovers on Mars in 2020. For these future planetary rovers to succeed, they will depend heavily on sensors to detect obstacles. This will become of vital importance in the future if rovers become less dependent on commands received from Earth-based control and more dependent on self-configuration and self-decision making. These planetary rovers will face harsh environments, and the possibility of hardware failure is high, as seen in missions of the past. In this paper, we focus on using autonomic principles, where self-healing, self-optimization, and self-adaptation are explored using the MAPE-K model, expanding this model to encapsulate attributes such as Awareness, Analysis, and Adjustment (AAA-3). In the experimentation, a Pioneer P3-DX research robot is used to simulate a planetary rover. The sonar sensors on the P3-DX robot are used to simulate the sensors on a planetary rover (even though, in reality, sonar sensors cannot operate in a vacuum). Experiments using the P3-DX robot focus on how our software system can adapt to the loss of sonar sensor functionality. The autonomic manager system is responsible for deciding how to make use of the remaining 'enabled' sonar sensors to compensate for those that are 'disabled'. The key to this research is that the robot can still detect objects even with reduced sonar sensor capability.
Keywords: Autonomic, self-adaption, self-healing, self-optimization.
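As a toy illustration of the kind of autonomic manager described above, the sketch below runs a simple MAPE-K style loop over a simulated sonar ring: it Monitors readings, Analyzes which sensors have failed, Plans a compensation by widening the neighbouring sensors' coverage, and Executes the new configuration. The fault model, eight-sensor ring and compensation rule are hypothetical, not the P3-DX implementation.

```python
import random

NUM_SONARS = 8                              # assumed ring of 8 sonar sensors

def monitor(ring):
    """Simulated readings in metres; None means the sensor returned no echo (failed)."""
    return [None if not ok else round(random.uniform(0.3, 4.0), 2) for ok in ring]

def analyze(readings):
    return [i for i, r in enumerate(readings) if r is None]   # indices of failed sonars

def plan(failed):
    # Compensate each failed sensor by widening the neighbours' coverage sectors.
    widened = set()
    for i in failed:
        widened.update({(i - 1) % NUM_SONARS, (i + 1) % NUM_SONARS})
    return widened - set(failed)

def execute(widened):
    for i in sorted(widened):
        print(f"sonar {i}: widening coverage sector to cover a disabled neighbour")

ring_ok = [True] * NUM_SONARS
ring_ok[3] = False                          # inject a fault into sonar 3
readings = monitor(ring_ok)                 # M
failed = analyze(readings)                  # A
widened = plan(failed)                      # P
execute(widened)                            # E (Knowledge: the fixed ring layout)
```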
1716 Performance Evaluation of AOMDV-PAMAC Protocols for Ad Hoc Networks
Authors: B. Malarkodi, S. K. Riyaz Hussain, B. Venkataramani
Abstract:
The power consumption of nodes in ad hoc networks is a critical issue, as they predominantly operate on batteries. In order to improve the lifetime of an ad hoc network, all the nodes must be utilized evenly and the power required for connections must be minimized. In this project, a link layer algorithm known as the Power Aware Medium Access Control (PAMAC) protocol is proposed, which enables the network layer to select a route with the minimum total power requirement among the possible routes between a source and a destination, provided all nodes in the route have a battery capacity above a threshold. When the battery capacity goes below the predefined threshold, routes going through these nodes are avoided and these nodes act only as source and destination. Further, the first few nodes whose battery power has drained to the threshold value are pushed to the exterior part of the network and nodes in the exterior are brought to the interior, since less total power is then required to forward packets for each connection. The network layer protocol AOMDV is basically an extension of the AODV routing protocol. AOMDV is designed to form multiple routes to the destination, and it also avoids loop formation, thereby reducing unnecessary congestion on the channel. In this project, the performance of AOMDV is evaluated using PAMAC as the MAC layer protocol; the average power consumption, throughput and average end-to-end delay of the network are calculated, and the results are compared with those of the other network layer protocol, AODV.
Keywords: AODV, PAMAC, AOMDV, Power consumption.
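The route selection rule described above (pick the route with the minimum total transmission power among routes whose intermediate nodes all have battery capacity above a threshold) can be sketched as follows; the network, power costs and threshold are made-up values, and this shows only the selection logic, not the MAC protocol itself.

```python
def select_route(routes, battery, link_power, threshold):
    """Return the feasible route with the minimum total link power, or None."""
    feasible = []
    for route in routes:
        # Intermediate nodes must all have battery capacity above the threshold.
        if all(battery[n] > threshold for n in route[1:-1]):
            cost = sum(link_power[(route[i], route[i + 1])]
                       for i in range(len(route) - 1))
            feasible.append((cost, route))
    return min(feasible)[1] if feasible else None

battery = {"S": 90, "A": 70, "B": 25, "C": 80, "D": 95}     # % charge (hypothetical)
link_power = {("S", "A"): 2, ("A", "D"): 3,                 # per-hop power cost units
              ("S", "B"): 1, ("B", "D"): 1,
              ("S", "C"): 2, ("C", "D"): 4}
routes = [["S", "A", "D"], ["S", "B", "D"], ["S", "C", "D"]]
print(select_route(routes, battery, link_power, threshold=30))   # B is below threshold
```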
1715 A Real-time 4M Collecting Method for Production Information System
Authors: Seung Woo Lee, So Jeong Nam, Jai-Kyung Lee
Abstract:
It can be said that the business sector is faced with a range of challenges, such as a rapidly changing business environment, an increase and diversification of customers' demands, and the consequent need for quick response, and hence needs to have flexible management and production information systems in place. As a matter of fact, many manufacturers have adopted production information management systems such as MES and ERP. Nevertheless, managers have difficulties obtaining ever-changing production process information in real time, or responding quickly to any change in production-related needs on the basis of such information. This is because they rely on poor production information systems which are not capable of providing real-time factory settings. If the manufacturer does not have the capacity to collect or digitalize the 4 Ms (Man, Machine, Material, Method), which are the resources for production, on a real-time basis, it may be difficult to effectively maintain information on the production process. In this regard, this paper introduces some new alternatives to the existing methods of collecting the 4 Ms, which comprise the production field, in real time.
Keywords: 4M, Acquisition of Data on shop-floor, Real-time machine interface
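A minimal sketch of what a real-time 4M collection record and interface could look like is given below; the field names, queue-based transport and topic string are hypothetical design choices for illustration, not the system described in the paper.

```python
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone
from queue import Queue

@dataclass
class FourMEvent:
    """One shop-floor event tagged with the 4M resources it involves."""
    man: str          # operator id
    machine: str      # machine id
    material: str     # material or lot id
    method: str       # operation / routing step
    value: float      # measured quantity (e.g. cycle time in seconds)
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

bus: "Queue[dict]" = Queue()                       # stands in for a real-time message bus

def publish(event: FourMEvent) -> None:
    bus.put({"topic": "shopfloor/4m", **asdict(event)})   # hypothetical topic name

publish(FourMEvent(man="OP-07", machine="CNC-3", material="LOT-2210",
                   method="drilling", value=42.5))
print(bus.get())
```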
1714 The Risk Assessment of Nano-particles and Investigation of Their Environmental Impact
Authors: Nader Nabhani, Amir Tofighi
Abstract:
Nanotechnology is the science of creating, using and manipulating objects which have at least one dimension in the range of 0.1 to 100 nanometers. In other words, nanotechnology is reconstructing a substance using its individual atoms and arranging them in a way that is desirable for our purpose. The main reason that nanotechnology has been attracting attention is the unique properties that objects show when they are formed at the nano scale. The characteristics that nano-scale materials show, which differ from those of their naturally occurring forms, are both useful for creating high quality products and dangerous when in contact with the body or spread in the environment. In order to control and lower the risk of such nano-scale particles, the following three main topics should be considered: 1) First, these materials can cause long-term diseases that may show their effects on the body years after penetrating human organs, and since this science has only recently been developed on an industrial scale, not enough information is available about their hazards to the body. 2) Second, these particles can easily spread out in the environment and remain in air, soil or water for a very long time, in addition to their high ability to penetrate the skin and cause new kinds of diseases. 3) Third, to protect the body and the environment against the danger of these particles, the protective barriers must be finer than these small objects, and such defenses are hard to accomplish. This paper reviews, discusses and assesses the risks that humans and the environment face as this new science develops at a high rate.
Keywords: Nanotechnology, risk assessment, environment.
1713 Immunohistochemical Expression of β-catenin and Epidermal Growth Factor Receptor in Adamantinomatous Craniopharyngioma
Authors: Ghada Esheba, Fatimah Alturkistani, Arwa Obaid, Ahdab Bashehab, Moayad Alturkistani
Abstract:
Introduction: Craniopharyngiomas (CPs) are rare epithelial tumors located mainly in the sellar/parasellar region. CPs have been classified histopathologically, genetically, clinically and prognostically into two distinctive subtypes: the adamantinomatous and papillary variants. Aim: To examine the pattern of expression of both β-catenin and the epidermal growth factor receptor (EGFR) in surgically resected samples of adamantinomatous CP (ACP), and to assess the possibility of using anti-EGFR agents in the management of ACP patients. Materials and methods: β-catenin and EGFR immunostaining was performed on paraffin-embedded tissue sections of 18 ACP cases. Results: 17 out of 18 cases (94%) of ACP exhibited strong nuclear/cytoplasmic expression of β-catenin, and 15 (83%) of the ACP cases were positive for EGFR. Conclusion: Nuclear accumulation of β-catenin is a diagnostic hallmark of ACP. EGFR positivity in most cases of ACP could support the use of anti-EGFR therapy.
Keywords: Craniopharyngioma, adamantinomatous, papillary, epidermal growth factor receptor, β-catenin.
1712 An Intelligent Human-Computer Interaction System for Decision Support
Authors: Chee Siong Teh, Chee Peng Lim
Abstract:
This paper proposes a novel architecture for developing decision support systems. Unlike conventional decision support systems, the proposed architecture endeavors to reveal the decision-making process so that humans' subjectivity can be incorporated into a computerized system while, at the same time, preserving the capability of the computerized system to process information objectively. A number of techniques used in developing the decision support system are elaborated to make the decision-making process transparent. These include procedures for high-dimensional data visualization, pattern classification, prediction, and evolutionary computational search. An artificial data set is first employed to compare the proposed approach with other methods. A simulated handwritten data set and a real data set on liver disease diagnosis are then employed to evaluate the efficacy of the proposed approach. The results are analyzed and discussed. The potential of the proposed architecture as a useful decision support system is demonstrated.
Keywords: Interactive evolutionary computation, multivariate data projection, pattern classification, topographic map.
1711 Toxicity Study of Two Different Synthesized Silver Nanoparticles on Bacteria Vibrio Fischeri
Authors: E. Binaeian, A.M. Rashidi, H. Attar
Abstract:
A comparative evaluation of the acute toxicity to the bacterium Vibrio fischeri of nano silver synthesized by two different procedures (biological and chemical reduction methods) and of silver ions was carried out. The bacterial light inhibition test was used as the toxicological endpoint, applying a homemade luminometer. To compare the toxicity effects quantitatively, the nominal effective concentrations (EC) of the chemicals and the susceptibility constant (Z-value) of the bacteria were calculated after 5 min and 30 min exposure times. After 5 and 30 min contact times, the EC50 values of the two silver nanoparticles were similar, as were the EC20 values, demonstrating that the toxicity of the nano silvers was independent of the synthesis procedure. The EC values of the nanoparticles were larger than those of the silver ions. The susceptibilities (Z-values) of V. fischeri (L/mg) to the silver ions were greater than those to the nano silvers. According to the EC and Z values, the toxicity of the silvers decreased in the following order: silver ions >> silver nanoparticles from the chemical reduction method ~ silver nanoparticles from the biological method.
Keywords: Bioluminescence, luminometer, silver nanoparticles, toxicity, Vibrio fischeri.
1710 Forecasting e-Learning Efficiency by Using Artificial Neural Networks and a Balanced Score Card
Authors: Petar Halachev
Abstract:
Forecasting the values of indicators which characterize the effectiveness of an organization's performance is of great importance for its successful development. Such forecasting is necessary in order to assess the current state and to foresee future developments, so that measures to improve the organization's activity can be undertaken in time. The article presents an overview of the applied mathematical and statistical methods for developing forecasts. Special attention is paid to artificial neural networks as a forecasting tool. Their strengths and weaknesses are analyzed, and a synopsis is made of the application of artificial neural networks to forecasting the values of different education efficiency indicators. A method for evaluating the activity of universities using the Balanced Scorecard is proposed, and Key Performance Indicators (KPIs) for the assessment of e-learning are selected. Resulting indicators for the evaluation of the efficiency of the activity are proposed. An artificial neural network is constructed and applied to forecasting the values of indicators for e-learning efficiency on the basis of the KPI values.
Keywords: Artificial neural network, balanced scorecard, e-learning.
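As a minimal illustration of forecasting a KPI series with an artificial neural network, the sketch below trains a small scikit-learn MLP on lagged values of a synthetic e-learning KPI and predicts the next period; the KPI series, lag window and network size are assumptions, not the indicators or architecture used in the article.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(42)
# Synthetic monthly KPI (e.g. e-learning course completion rate, %), with a mild trend.
kpi = 60 + 0.4 * np.arange(48) + 3 * np.sin(np.arange(48) / 3) + rng.normal(0, 1, 48)

lags = 4                                             # use the last 4 months as inputs
X = np.array([kpi[i:i + lags] for i in range(len(kpi) - lags)])
y = kpi[lags:]

model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
model.fit(X[:-6], y[:-6])                            # hold out the last 6 months

print("held-out predictions:", model.predict(X[-6:]).round(1))
print("next-period forecast:", model.predict(kpi[-lags:].reshape(1, -1))[0].round(1))
```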
1709 A Framework to Support Reuse in Object-Oriented Software Development
Authors: Fathi Taibi
Abstract:
Reusability is a desired quality attribute in software products. Generally, it can be achieved by adopting development methods that promote it and by achieving software qualities that have been linked with high reusability proneness. With the exponential growth in mobile application development, software reuse has become an integral part of a substantial number of projects. Similarly, software reuse has become widely practiced in start-up companies. However, this has led to new emerging problems: firstly, the reused code does not meet the required quality, and secondly, the reuse intentions are dubious. This work aims to propose a framework to support reuse in Object-Oriented (OO) software development. The framework comprises a process that uses a proposed reusability assessment metric and a formal foundation to specify the elements of the reused code and the relationships between them. The framework is empirically evaluated using a wide range of open-source projects and mobile applications. The results are analyzed to help understand the reusability proneness of OO software and the possible means to improve it.
Keywords: Software reusability, software metrics, object-oriented software, modularity, low complexity, understandability.