Search results for: fuzzy data
5817 Dynamic Modeling and Simulation of Industrial Naphtha Reforming Reactor
Authors: Gholamreza Zahedi, M. Tarin, M. Biglari
Abstract:
This work investigated the steady state and dynamic simulation of a fixed bed industrial naphtha reforming reactor. The performance of the reactor was investigated using a heterogeneous model. For process simulation, the differential equations were solved using the 4th-order Runge-Kutta method. The models were validated against measured process data of an existing naphtha reforming plant. The simulation results, in terms of component yields and outlet temperature, were in good agreement with empirical data. The simple model provides a useful tool for dynamic simulation, optimization and control of naphtha reforming.
Keywords: Dynamic simulation, fixed bed reactor, modeling, reforming
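As a rough illustration of the numerical scheme mentioned in the abstract, the sketch below integrates a generic set of component-balance ODEs with a classical 4th-order Runge-Kutta step. The rate function, rate constants and step size are hypothetical placeholders, not the heterogeneous reactor model used by the authors.

```python
import numpy as np

def rk4_step(f, t, y, h):
    """One classical 4th-order Runge-Kutta step for y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

def rates(t, y):
    # Hypothetical first-order kinetics for three lumped components
    # (stand-ins for naphthenes, paraffins, aromatics); not the paper's model.
    k = np.array([0.05, 0.02, 0.01])
    return np.array([-k[0] * y[0],
                     k[0] * y[0] - k[1] * y[1],
                     k[1] * y[1] - k[2] * y[2]])

y = np.array([1.0, 0.0, 0.0])   # initial mole fractions along the bed
h = 0.1                          # integration step (dimensionless length)
for i in range(100):
    y = rk4_step(rates, i * h, y, h)
print(y)
```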
5816 Health Monitoring and Failure Detection of Electronic and Structural Components in Small Unmanned Aerial Vehicles
Authors: Gopi Kandaswamy, P. Balamuralidhar
Abstract:
Fully autonomous small Unmanned Aerial Vehicles (UAVs) are increasingly being used in many commercial applications. Although a lot of research has been done to develop safe, reliable and durable UAVs, accidents due to electronic and structural failures are not uncommon and pose a huge safety risk to UAV operators and the public. Hence there is a strong need for an automated health monitoring system for UAVs, with a view to minimizing mission failures and thereby increasing safety. This paper describes our approach to monitoring the electronic and structural components in a small UAV without the need for additional sensors. Our system monitors data from four sources: sensors, navigation algorithms, control inputs from the operator and flight controller outputs. It then performs statistical analysis on the data and applies a rule-based engine to detect failures. This information can be fed back into the UAV, and a decision to continue or abort the mission can be taken automatically by the UAV, independent of the operator. Our system has been verified using data obtained from real flights over the past year from UAVs of various sizes that we have designed and deployed for various applications.
Keywords: Fault detection, health monitoring, unmanned aerial vehicles, vibration analysis.
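A minimal sketch of the statistics-plus-rules idea described above: rolling statistics over a single sensor stream feed two simple threshold rules that flag a failure. The window size, thresholds and synthetic trace are illustrative assumptions, not the authors' tuned monitoring rules.

```python
import numpy as np

def detect_failures(signal, window=50, z_limit=4.0, var_limit=2.5):
    """Flag samples whose rolling z-score or variance ratio breaks a rule threshold."""
    signal = np.asarray(signal, dtype=float)
    baseline_var = np.var(signal[:window]) + 1e-12
    alarms = []
    for i in range(window, len(signal)):
        win = signal[i - window:i]
        mu, sigma = win.mean(), win.std() + 1e-12
        z = abs(signal[i] - mu) / sigma            # sudden-jump rule
        var_ratio = win.var() / baseline_var       # sustained-vibration rule
        if z > z_limit or var_ratio > var_limit:
            alarms.append(i)
    return alarms

# Synthetic vibration-like trace with an injected fault after sample 400.
rng = np.random.default_rng(0)
trace = np.concatenate([rng.normal(0, 1, 400), rng.normal(0, 4, 100)])
print(detect_failures(trace)[:5])
```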
5815 Transmit Sub-aperture Optimization in MSTA Ultrasound Imaging Method
Authors: Yuriy Tasinkevych, Ihor Trots, Andrzej Nowicki, Marcin Lewandowski
Abstract:
The paper presents the optimization problem for the multi-element synthetic transmit aperture (MSTA) method in ultrasound imaging applications. The optimal choice of the transmit aperture size is performed as a trade-off between the lateral resolution, penetration depth and frame rate. Results of the analysis obtained by the developed optimization algorithm are presented. Maximum penetration depth and the best lateral resolution at given depths are chosen as the optimization criteria. The optimization algorithm was tested using synthetic aperture data of point reflectors simulated with the Field II program for Matlab® for a 5 MHz 128-element linear transducer array with 0.48 mm pitch. The visualization of experimentally obtained synthetic aperture data of a tissue-mimicking phantom and in vitro measurements of beef liver are also shown. The data were obtained using the SonixTOUCH Research system equipped with a linear 4 MHz 128-element transducer with 0.3 mm element pitch, 0.28 mm element width and 70% fractional bandwidth, excited by a one-cycle sine pulse burst at the transducer's center frequency.
Keywords: synthetic aperture method, ultrasound imaging, beamforming.
5814 FPGA Implementation of RSA Encryption Algorithm for E-Passport Application
Authors: Khaled Shehata, Hanady Hussien, Sara Yehia
Abstract:
Securing the data stored on an E-passport is a very important issue. The RSA encryption algorithm is suitable for such applications with low data size. In this paper, the design and implementation of a 1024-bit key RSA encryption and decryption module on an FPGA is presented. The module is verified by comparing its results with those obtained from MATLAB tools. The design runs at a frequency of 36.3 MHz on a Virtex-5 Xilinx FPGA. The key size is designed to be 1024 bits to achieve high security for the passport information. The whole design is realized through VHDL design entry, which makes it portable and able to be targeted to any hardware platform.
Keywords: RSA, VHDL, FPGA, modular multiplication, modular exponential.
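For readers unfamiliar with the core RSA operation verified in the paper, the sketch below shows textbook key setup and square-and-multiply modular exponentiation in software. It illustrates only the arithmetic, not the paper's FPGA architecture, and the toy primes are far too small for a real 1024-bit E-passport key.

```python
def mod_exp(base, exp, mod):
    """Square-and-multiply modular exponentiation (the core RSA primitive)."""
    result = 1
    base %= mod
    while exp > 0:
        if exp & 1:
            result = (result * base) % mod
        base = (base * base) % mod
        exp >>= 1
    return result

# Toy RSA parameters for illustration only; real keys use ~1024-bit moduli.
p, q = 61, 53
n = p * q
phi = (p - 1) * (q - 1)
e = 17
d = pow(e, -1, phi)           # modular inverse of e (Python 3.8+)

message = 42
cipher = mod_exp(message, e, n)
plain = mod_exp(cipher, d, n)
assert plain == message
print(cipher, plain)
```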
5813 Microclimate Variations in Rio de Janeiro Related to Massive Public Transportation
Authors: Marco E. O. Jardim, Frederico A. M. Souza, Valeria M. Bastos, Myrian C. A. Costa, Nelson F. F. Ebecken
Abstract:
Urban public transportation in Rio de Janeiro is based on bus lines, powered by diesel, and four limited metro lines that serve only some neighborhoods. This work presents an infrastructure built to better understand microclimate variations related to massive urban transportation in some specific areas of the city. The use of sensor nodes with small analytics capacity provides environmental information to the population or public services. The analysis of data collected from a few small sensors positioned near some heavy-traffic streets shows the harmful impact of poor bus route planning.
Keywords: Big data, IoT, public transportation, public health system.
5812 An Approach for Reducing the Computational Complexity of LAMSTAR Intrusion Detection System using Principal Component Analysis
Authors: V. Venkatachalam, S. Selvan
Abstract:
The security of computer networks plays a strategic role in modern computer systems. Intrusion Detection Systems (IDS) act as the 'second line of defense' placed inside a protected network, looking for known or potential threats in network traffic and/or audit data recorded by hosts. We developed an Intrusion Detection System using the LAMSTAR neural network to learn patterns of normal and intrusive activities and to classify observed system activities, and compared the performance of the LAMSTAR IDS with other classification techniques using 5 classes of KDDCup99 data. The LAMSTAR IDS gives better performance at the cost of high computational complexity, training time and testing time, when compared to other classification techniques (Binary Tree classifier, RBF classifier, Gaussian Mixture classifier). We further reduced the computational complexity of the LAMSTAR IDS by reducing the dimension of the data using principal component analysis, which in turn reduces the training and testing time with almost the same performance.
Keywords: Binary Tree Classifier, Gaussian Mixture, Intrusion Detection System, LAMSTAR, Radial Basis Function.
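The dimensionality-reduction step can be illustrated with a scikit-learn pipeline: project the features with PCA before training a classifier and compare fitting time and accuracy. The RBF-kernel SVM and the synthetic data below are stand-ins, since the LAMSTAR network and the KDDCup99 preprocessing are not reproduced here.

```python
import time
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

# Synthetic stand-in for 41-feature, 5-class intrusion data.
X, y = make_classification(n_samples=5000, n_features=41, n_informative=15,
                           n_classes=5, n_clusters_per_class=1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

for n_comp in (None, 10):                       # full feature set vs. reduced
    if n_comp:
        pca = PCA(n_components=n_comp).fit(X_tr)
        A_tr, A_te = pca.transform(X_tr), pca.transform(X_te)
    else:
        A_tr, A_te = X_tr, X_te
    t0 = time.time()
    clf = SVC(kernel="rbf").fit(A_tr, y_tr)     # stand-in classifier
    print(n_comp, round(time.time() - t0, 2), "s,",
          round(clf.score(A_te, y_te), 3), "accuracy")
```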
5811 Application of Computational Intelligence for Sensor Fault Detection and Isolation
Authors: A. Jabbari, R. Jedermann, W. Lang
Abstract:
The new idea of this research is the application of a new fault detection and isolation (FDI) technique for supervision of sensor networks in transportation systems. In measurement systems, it is necessary to detect all types of faults and failures based on a predefined algorithm. Recent improvements in artificial neural network (ANN) studies have led to their use for some FDI purposes. In this paper, the application of new probabilistic neural network features for data approximation and data classification is considered for plausibility checking in temperature measurement. For this purpose, a two-phase FDI mechanism was considered for residual generation and evaluation.
Keywords: Fault detection and Isolation, Neural network, Temperature measurement, measurement approximation and classification.
5810 Improving Classification Accuracy with Discretization on Datasets Including Continuous Valued Features
Authors: Mehmet Hacibeyoglu, Ahmet Arslan, Sirzat Kahramanli
Abstract:
This study analyzes the effect of discretization on the classification of datasets including continuous valued features. Six datasets from UCI containing continuous valued features are discretized with an entropy-based discretization method. The performance improvement between the dataset with original features and the dataset with discretized features is compared with the k-nearest neighbors, Naive Bayes, C4.5 and CN2 data mining classification algorithms. As a result, the classification accuracies of the six datasets are improved by 1.71% to 12.31% on average.
Keywords: Data mining classification algorithms, entropy-based discretization method
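A compact sketch of the entropy-based splitting idea: recursively choose the cut point that minimizes the weighted class entropy of the two resulting bins. A simple depth limit is used as the stopping rule here instead of the full Fayyad-Irani MDL criterion usually applied in practice.

```python
import numpy as np

def entropy(labels):
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()

def best_cut(x, y):
    """Cut point on feature x minimizing the weighted class entropy of the split."""
    order = np.argsort(x)
    x, y = x[order], y[order]
    best, best_e = None, entropy(y)
    for i in range(1, len(x)):
        if x[i] == x[i - 1]:
            continue
        e = (i * entropy(y[:i]) + (len(x) - i) * entropy(y[i:])) / len(x)
        if e < best_e:
            best, best_e = (x[i - 1] + x[i]) / 2, e
    return best

def discretize(x, y, depth=2):
    """Recursively collect cut points (depth limit stands in for the MDL stop)."""
    cut = best_cut(x, y)
    if cut is None or depth == 0:
        return []
    left, right = x <= cut, x > cut
    return (discretize(x[left], y[left], depth - 1) + [cut]
            + discretize(x[right], y[right], depth - 1))

rng = np.random.default_rng(1)
feature = rng.normal(size=200)
label = (feature > 0.3).astype(int)
print(sorted(discretize(feature, label)))
```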
5809 Velocity Distribution in Open Channels: Combination of Log-law and Parabolic-law
Authors: Snehasis Kundu, Koeli Ghoshal
Abstract:
In this paper, based on flume experimental data, the velocity distribution in open channel flows is re-investigated. From the analysis, it is proposed that the wake layer in the outer region may be divided into two regions, the relatively weak outer region and the relatively strong outer region. Combining the log law for the inner region and the parabolic law for the relatively strong outer region, an explicit equation for the mean velocity distribution of steady and uniform turbulent flow through straight open channels is proposed and verified with the experimental data. It is found that the sediment concentration has a significant effect on the velocity distribution in the relatively weak outer region.
Keywords: Inner and outer region, Log law, Parabolic law, Richardson number.
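For reference, the two classical profiles being combined are usually written in the standard forms below; the symbols and coefficients are generic, not the values fitted in the paper.

```latex
% Inner region: logarithmic law of the wall
\frac{u}{u_*} = \frac{1}{\kappa}\ln\!\left(\frac{y}{y_0}\right), \qquad \kappa \approx 0.41
% Outer (wake) region: parabolic velocity-defect law
\frac{u_{\max}-u}{u_*} = C\left(1-\frac{y}{\delta}\right)^{2}
```

Here u_* is the shear velocity, y the distance from the bed, y_0 the zero-velocity level, δ the outer-layer thickness and C an empirical coefficient.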
5808 Efficient Web Usage Mining Based on K-Medoids Clustering Technique
Authors: P. Sengottuvelan, T. Gopalakrishnan
Abstract:
Web Usage Mining is the application of data mining techniques to find usage patterns from web log data, so as to grasp required patterns and serve the requirements of Web-based applications. The user's experience on the internet may be improved by minimizing web access latency. This may be done by predicting the pages a user will request next, so that they can be prefetched and cached. Therefore, to enhance the standard of web services, it is necessary to study user web navigation behavior. Analysis of user web navigation behavior is achieved through modeling web navigation history. We propose a technique that clusters user sessions based on the K-medoids algorithm.
Keywords: Clustering, K-medoids, Recommendation, User Session, Web Usage Mining.
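A minimal k-medoids (PAM-style) sketch to make the clustering step concrete. Sessions are represented here as plain numeric vectors with a Euclidean distance, whereas real web-usage mining would normally use a similarity measure defined over page-visit sequences.

```python
import numpy as np

def k_medoids(X, k, n_iter=100, seed=0):
    """Alternating k-medoids: assign points to the nearest medoid, then pick the
    member minimizing total in-cluster distance as the new medoid."""
    rng = np.random.default_rng(seed)
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)  # pairwise distances
    medoids = rng.choice(len(X), k, replace=False)
    for _ in range(n_iter):
        labels = np.argmin(D[:, medoids], axis=1)
        new_medoids = medoids.copy()
        for j in range(k):
            members = np.where(labels == j)[0]
            if len(members):
                new_medoids[j] = members[np.argmin(D[np.ix_(members, members)].sum(axis=1))]
        if np.array_equal(new_medoids, medoids):
            break
        medoids = new_medoids
    labels = np.argmin(D[:, medoids], axis=1)
    return medoids, labels

# Toy "session" vectors, e.g. time spent on a handful of page categories.
rng = np.random.default_rng(1)
sessions = np.vstack([rng.normal(0, 1, (20, 4)), rng.normal(5, 1, (20, 4))])
medoids, labels = k_medoids(sessions, k=2)
print(medoids, np.bincount(labels))
```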
5807 High-Value Health System for All: Technologies for Promoting Health Education and Awareness
Authors: M. P. Sebastian
Abstract:
Health for all is considered a sign of well-being and inclusive growth. New healthcare technologies are contributing to the quality of human lives by promoting health education and awareness, leading to the prevention, early diagnosis and treatment of the symptoms of diseases. Healthcare technologies have now migrated from medical and institutionalized settings to the home and everyday life. This paper explores these new technologies and investigates how they contribute to health education and awareness, promoting the objective of a high-value health system for all. The methodology used for the research is a literature review. The paper also discusses the opportunities and challenges of futuristic healthcare technologies. The combined advances in genomic medicine, wearables and the IoT, with enhanced data collection in electronic health record (EHR) systems, environmental sensors, and mobile device applications, can contribute in a big way to a high-value health system for all. The promise of these technologies includes reduced total cost of healthcare, reduced incidence of medical diagnosis errors, and reduced treatment variability. The major barriers to adoption include concerns with security, privacy, and integrity of healthcare data, regulation and compliance issues, service reliability, interoperability and portability of data, and user friendliness and convenience of these technologies.
Keywords: Big data, education, healthcare, ICT, patients, technologies.
5806 Experiment and Simulation of Laser Effect on Thermal Field of Porcine Liver
Authors: K. Ting, K. T. Chen, Y. L. Su, C. J. Chang
Abstract:
In medical therapy, lasers have been widely used for cosmetic, tumor and other treatments. During laser irradiation, thermal damage may be caused by excessive laser exposure. Thus, the establishment of a complete thermal analysis model provides clinically helpful reference data for physicians. In this study, porcine liver, in place of human tissue, was subjected to laser irradiation to establish experimental data on the surface thermal field and the thermal damage region under different conditions of power, laser irradiation time, and distance between the laser and the porcine liver. In the experimental process, the surface temperature distribution of the porcine liver was measured by an infrared thermal imager. In the simulation, the Pennes bioheat transfer equation was solved with the SYSWELD software used for welding processes. The double ellipsoid function as a laser source term is considered for the first time in the prediction of the surface thermal field and internal tissue damage. The simulation results are compared with the experimental data to validate the mathematical model established herein.
Keywords: laser infrared thermal imager, bio-heat transfer, double ellipsoid function.
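The Pennes bioheat transfer equation referred to above has the standard form shown below; the term Q_laser stands in for the double-ellipsoid source used in the paper, whose exact parameterization is not reproduced here.

```latex
\rho c \,\frac{\partial T}{\partial t}
  = \nabla \cdot \left(k\, \nabla T\right)
  + \rho_b c_b \omega_b \left(T_a - T\right)
  + Q_{\mathrm{met}} + Q_{\mathrm{laser}}
```

where ρ, c and k are the tissue density, specific heat and thermal conductivity; ρ_b, c_b and ω_b the blood density, specific heat and perfusion rate; T_a the arterial temperature; and Q_met the metabolic heat generation.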
5805 Adsorption of Paracetamol Using Activated Carbon of Dende and Babassu Coconut Mesocarp
Authors: R. C. Ferreira, H. H. C. De Lima, A. A. Cândido, O. M. Couto Junior, P. A. Arroyo, K. Q. De Carvalho, G. F. Gauze, M. A. S. D. Barros
Abstract:
Removal of the widely used drug paracetamol from water was investigated using activated carbon originating from dende coconut mesocarp and babassu coconut mesocarp. Kinetic and equilibrium data were obtained at different values of pH. Both activated carbons showed high efficiency when pH ≤ pHPZC, as the carbonyl group of the paracetamol molecule is adsorbed onto the positively charged carbon surface. Microporosity also played an important role in the process. The pseudo-second-order model gave the better fit to the kinetic results. The equilibrium data may be represented by the Langmuir equation.
Keywords: Adsorption, activated carbon, babassu, dende.
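The two models named above can be fitted directly by nonlinear least squares, as in this sketch; the data points are synthetic placeholders, not the dende or babassu measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(Ce, qm, KL):
    """Langmuir isotherm: qe = qm*KL*Ce / (1 + KL*Ce)."""
    return qm * KL * Ce / (1 + KL * Ce)

def pseudo_second_order(t, qe, k2):
    """Integrated pseudo-second-order kinetics: qt = k2*qe^2*t / (1 + k2*qe*t)."""
    return k2 * qe**2 * t / (1 + k2 * qe * t)

# Placeholder equilibrium (mg/L, mg/g) and kinetic (min, mg/g) data.
Ce = np.array([5.0, 10, 20, 40, 80])
qe = np.array([12.0, 20, 29, 36, 40])
t = np.array([5.0, 10, 20, 40, 90, 180])
qt = np.array([8.0, 14, 22, 30, 36, 38])

(qm, KL), _ = curve_fit(langmuir, Ce, qe, p0=[40, 0.1])
(qe_fit, k2), _ = curve_fit(pseudo_second_order, t, qt, p0=[40, 0.01])
print(f"Langmuir: qm={qm:.1f} mg/g, KL={KL:.3f} L/mg")
print(f"PSO kinetics: qe={qe_fit:.1f} mg/g, k2={k2:.4f} g/(mg*min)")
```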
5804 Cirrhosis Mortality Prediction as Classification Using Frequent Subgraph Mining
Authors: Abdolghani Ebrahimi, Diego Klabjan, Chenxi Ge, Daniela Ladner, Parker Stride
Abstract:
In this work, we use machine learning and data analysis techniques to predict the one-year mortality of cirrhotic patients. Data from 2,322 patients with liver cirrhosis are collected at a single medical center. Different machine learning models are applied to predict one-year mortality. A comprehensive feature space including demographic information, comorbidity, clinical procedures and laboratory tests is analyzed. A temporal pattern mining technique called Frequent Subgraph Mining (FSM) is used. The Model for End-Stage Liver Disease (MELD) prediction of mortality is used as a comparator. All of our models statistically significantly outperform the MELD-score model and show an average 10% improvement in the area under the curve (AUC). The FSM technique by itself does not improve the model significantly, but FSM together with a machine learning technique called an ensemble further improves the model performance. With the abundance of data available in healthcare through electronic health records (EHR), existing predictive models can be refined to identify and treat patients at risk for higher mortality. However, due to the sparsity of the temporal information needed by FSM, the FSM model does not yield significant improvements. Our work applies modern machine learning algorithms and data analysis methods to predicting the one-year mortality of cirrhotic patients and builds a model that predicts one-year mortality significantly more accurately than the MELD score. We have also tested the potential of FSM and provided a new perspective on the importance of clinical features.
Keywords: machine learning, liver cirrhosis, subgraph mining, supervised learning
5803 Optimizing Spatial Trend Detection By Artificial Immune Systems
Authors: M. Derakhshanfar, B. Minaei-Bidgoli
Abstract:
Spatial trends are one of the valuable patterns in geo-databases. They play an important role in data analysis and knowledge discovery from spatial data. A spatial trend is a regular change of one or more non-spatial attributes when spatially moving away from a start object. Spatial trend detection is a graph search problem; therefore, heuristic methods can be a good solution. The artificial immune system (AIS) is a special method for searching and optimizing. AIS is a novel evolutionary paradigm inspired by the biological immune system. Models based on immune system principles, such as the clonal selection theory, the immune network model or the negative selection algorithm, have been finding increasing applications in fields of science and engineering. In this paper, we develop a novel immunological algorithm based on the clonal selection algorithm (CSA) for spatial trend detection. We create a neighborhood graph and neighborhood paths, then select the spatial trends with high affinity as antibodies. In an evolutionary process with the artificial immune algorithm, the affinity of low-affinity trends is increased through mutation until a stop condition is satisfied.
Keywords: Spatial Data Mining, Spatial Trend Detection, Heuristic Methods, Artificial Immune System, Clonal Selection Algorithm (CSA)
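A generic clonal-selection loop in the spirit of the CSA described above, applied to a toy affinity function. Representing candidate trends as real-valued vectors and the particular mutation schedule are illustrative assumptions, not the paper's encoding of neighborhood paths.

```python
import numpy as np

def clonal_selection(affinity, dim=5, pop=20, n_clones=5, gens=50, seed=0):
    """Keep the fittest antibodies, clone them, and hypermutate the clones with a
    step inversely related to their rank (better antibodies mutate less)."""
    rng = np.random.default_rng(seed)
    antibodies = rng.uniform(-5, 5, (pop, dim))
    for _ in range(gens):
        fit = np.array([affinity(a) for a in antibodies])
        elite = antibodies[np.argsort(fit)[-pop // 2:]]              # best half
        clones = np.repeat(elite, n_clones, axis=0)
        ranks = np.repeat(np.arange(len(elite), 0, -1), n_clones)
        clones += rng.normal(0, 0.1 * ranks[:, None], clones.shape)  # hypermutation
        merged = np.vstack([antibodies, clones])
        fit = np.array([affinity(a) for a in merged])
        antibodies = merged[np.argsort(fit)[-pop:]]                  # reselection
    return antibodies[np.argmax([affinity(a) for a in antibodies])]

# Toy affinity: higher near the origin (stands in for a trend-quality score).
best = clonal_selection(lambda x: -np.sum(x**2))
print(best)
```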
5802 Ontology-Based Systemizing of the Science Information Devoted to Waste Utilizing by Methanogenesis
Authors: Ye. Shapovalov, V. Shapovalov, O. Stryzhak, A. Salyuk
Abstract:
Over the past decades, the amount of scientific information has been growing exponentially, and it has become more complicated to process and systemize this amount of data. An approach to the systematization of scientific information on the production of biogas, based on the ontological IT platform “T.O.D.O.S.”, has been developed. It is proposed to select semantic characteristics of each work for their further introduction into the IT platform “T.O.D.O.S.”. An ontological graph with a ranking function for previous scientific research and for a system of selection of microorganisms has been worked out. These systems provide high-performance management of scientific information.
Keywords: Ontology-based analysis, analysis of scientific data, methanogenesis, microorganism hierarchy, T.O.D.O.S.
5801 Analyzing Current Transformers Saturation Characteristics for Different Connected Burden Using LabVIEW Data Acquisition Tool
Authors: D. Subedi, S. Pradhan
Abstract:
Current transformers are an integral part of the power system because they provide a proportional, safe amount of current for protection and measurement applications. However, when the power system experiences an abnormal situation leading to huge current flow, this huge current is proportionally injected into the protection and metering circuit. Since the protection and metering equipment is designed to withstand only a certain amount of current with respect to time, these high currents pose a risk to people and equipment. Therefore, during such instances, the CT saturation characteristics have a huge influence on the safety of both people and equipment and on the reliability of the protection and metering system. This paper shows the effect of burden on the Accuracy Limiting Factor/Instrument Security Factor of current transformers and the change in the saturation characteristics of the CTs. The response of the CT to varying levels of overcurrent at different connected burdens is captured using the data acquisition software LabVIEW. Analysis is done on the real-time data gathered using LabVIEW, and the variation of current transformer saturation characteristics with changes in burden is discussed.
Keywords: Accuracy limiting factor, burden, current transformer, instrument security factor, saturation characteristics.
5800 Entrepreneurs’ Perceptions of the Economic, Social and Physical Impacts of Tourism
Authors: Oktay Emir
Abstract:
The objective of this study is to determine how entrepreneurs perceive the economic, social and physical impacts of tourism. The study was conducted in the city of Afyonkarahisar, Turkey, which is rich in thermal tourism resources and investments. A survey was used as the data collection method, and the questionnaire was administered to 472 entrepreneurs. A simple random sampling method was used to identify the sample. Independent-samples t-tests and ANOVA tests were used to analyse the data obtained. Additionally, some statistically significant differences (p<0.05) were found based on the participants’ demographic characteristics regarding their opinions about the social, economic and physical impacts of tourism activities.
Keywords: Tourism, perception, entrepreneurship, entrepreneurs, structural equation modelling.
5799 Tracing Quality Cost in a Luggage Manufacturing Industry
Authors: S. B. Jaju, R. R. Lakhe
Abstract:
Quality costs are the costs associated with preventing, finding, and correcting defective work. Since the main language of corporate management is money, quality-related costs act as a means of communication between the staff of quality engineering departments and the company managers. The objective of quality engineering is to minimize the total quality cost across the life of the product. Quality costs provide a benchmark against which improvement can be measured over time, and a rupee-based report on quality improvement efforts. They are an effective tool to identify, prioritize and select quality improvement projects. After reviewing the literature, it was noticed that a simplified methodology for collecting quality cost data in a manufacturing industry was required. A quantified standard methodology is proposed for collecting data on the various elements of the quality cost categories for a manufacturing industry. Also, in the light of research carried out so far, it is felt necessary to standardise the cost elements in each of the prevention, appraisal, internal failure and external failure cost categories. Here an attempt is made to standardise the various cost elements applicable to manufacturing industry, and data are collected using the proposed quantified methodology. This paper discusses a case study carried out in a luggage manufacturing industry.
Keywords: Quality Costs, PAF model, quantified methodology, Case study.
5798 Using Electrical Impedance Tomography to Control a Robot
Authors: Shayan Rezvanigilkolaei, Shayesteh Vefaghnematollahi
Abstract:
Electrical impedance tomography (EIT) is a non-invasive imaging technique suitable for medical applications. This paper describes an electrical impedance tomography device with the ability to navigate a robotic arm to manipulate a target object. The design of the device includes various hardware and software sections to perform medical imaging and control the robotic arm. In the hardware section, an image is formed by 16 electrodes located around a container. This image is used to navigate a 3-DOF robotic arm to reach the exact location of the target object. The data set used to form the impedance image is obtained by repeated current injections and voltage measurements between all electrode pairs. After performing the necessary calculations to obtain the impedance, the information is transmitted to the computer. This data is then processed in MATLAB, which is interfaced with EIDORS (Electrical Impedance Tomography Reconstruction Software) to reconstruct the image based on the acquired data. In the next step, the coordinates of the center of the target object are calculated with the MATLAB Image Processing Toolbox (IPT). Finally, these coordinates are used to calculate the angles of each joint of the robotic arm. The robotic arm moves to the desired tissue on the user's command.
Keywords: Electrical impedance tomography, EIT, Surgeon robot, image processing of Electrical impedance tomography.
5797 Increase of Error Detection Effectiveness in the Data Transmission Channels with Pulse-Amplitude Modulation
Authors: Akram A. Mustafa
Abstract:
In this paper, approaches for increasing the effectiveness of error detection in computer network channels with Pulse-Amplitude Modulation (PAM) are proposed. The proposed approaches are based on consideration of the special features of the errors that appear in lines with PAM. The first approach consists of a CRC modification specifically for lines with PAM. The second approach is based on the use of weighted checksums, and a method for coding the checksum components has been developed. It is shown that the proposed checksum modification ensures superior reliability of digital data control for channels with PAM compared to CRC.
Keywords: Pulse-Amplitude Modulation, checksum, transmission, discrete.
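To make the weighted-checksum idea concrete, the sketch below multiplies each data symbol by a position-dependent weight before summing, so that the typical PAM error (a symbol slipping to an adjacent amplitude level) changes the checksum in a position-dependent way. The weights and modulus are illustrative choices, not the coding developed in the paper.

```python
def weighted_checksum(symbols, modulus=251):
    """Sum of symbol * weight (weight = 1-based position) modulo a prime."""
    return sum((i + 1) * s for i, s in enumerate(symbols)) % modulus

def check(symbols, checksum, modulus=251):
    return weighted_checksum(symbols, modulus) == checksum

# PAM-8 symbol block (levels 0..7).
block = [3, 7, 1, 0, 5, 6, 2, 4]
cs = weighted_checksum(block)

corrupted = block.copy()
corrupted[4] += 1          # typical PAM error: slip to the neighboring level
print(check(block, cs), check(corrupted, cs))   # True False
```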
5796 Artificial Intelligence Techniques applied to Biomedical Patterns
Authors: Giovanni Luca Masala
Abstract:
Pattern recognition is the research area of Artificial Intelligence that studies the operation and design of systems that recognize patterns in data. Important application areas are image analysis, character recognition, fingerprint classification, speech analysis, DNA sequence identification, man and machine diagnostics, person identification and industrial inspection. The interest in improving classification systems for data analysis is independent of the context of application. In fact, in many studies it is often necessary to recognize and distinguish groups of various objects, which requires valid instruments capable of performing this task. The objective of this article is to show several Artificial Intelligence methodologies for data classification applied to biomedical patterns. In particular, this work deals with the realization of a Computer-Aided Detection system (CADe) that is able to assist the radiologist in identifying types of mammary tumor lesions. As an additional biomedical application of the classification systems, we present a study conducted on blood samples which shows how these methods may help to distinguish between carriers of Thalassemia (or Mediterranean Anaemia) and healthy subjects.
Keywords: Computer Aided Detection, mammary tumor, pattern recognition, thalassemia.
5795 A Stochastic Diffusion Process Based on the Two-Parameters Weibull Density Function
Authors: Meriem Bahij, Ahmed Nafidi, Boujemâa Achchab, Sílvio M. A. Gama, José A. O. Matos
Abstract:
Stochastic modeling concerns the use of probability to model real-world situations in which uncertainty is present. The purpose of stochastic modeling is therefore to estimate the probability of outcomes within a forecast, i.e. to be able to predict what conditions or decisions might occur under different situations. In the present study, we present a model of a stochastic diffusion process based on the bi-Weibull distribution function (its trend is proportional to the bi-Weibull probability density function). In general, the Weibull distribution has the ability to assume the characteristics of many different types of distributions. This has made it very popular among engineers and quality practitioners, who have considered it the most commonly used distribution for studying problems such as modeling reliability data, accelerated life testing, and maintainability modeling and analysis. In this work, we start by obtaining the probabilistic characteristics of this model, such as the explicit expression of the process, its trends, and its distribution, by transforming the diffusion process into a Wiener process as shown in the Ricciardi theorem. Then, we develop the statistical inference of this model using the maximum likelihood methodology. Finally, we analyse with simulated data the computational problems associated with the parameters, an issue of great importance for its application to real data, using convergence analysis methods. Overall, the use of a stochastic model reflects only a pragmatic decision on the part of the modeler. Given the data that is available and the universe of models known to the modeler, this model represents the best currently available description of the phenomenon under consideration.
Keywords: Diffusion process, discrete sampling, likelihood estimation method, simulation, stochastic diffusion equation, trends functions, bi-parameters Weibull density function.
5794 Application of Single Subject Experimental Designs in Adapted Physical Activity Research: A Descriptive Analysis
Authors: Jiabei Zhang, Ying Qi
Abstract:
The purpose of this study was to develop a descriptive profile of adapted physical activity research using single subject experimental designs. All research articles using single subject experimental designs published in the journal Adapted Physical Activity Quarterly from 1984 to 2013 were employed as the data source. Each article was coded in a subcategory of seven categories: (a) the size of the sample; (b) the age of the participants; (c) the type of disabilities; (d) the type of data analysis; (e) the type of designs; (f) the independent variable; and (g) the dependent variable. Frequencies, percentages, and trend inspection were used to analyze the data and develop a profile. The profile developed characterizes the small portion of research articles that used single subject designs, in which most researchers used a small sample size, recruited children as subjects, emphasized learning and behavior impairments, selected visual inspection with descriptive statistics, preferred a multiple baseline design, focused on effects of therapy, inclusion, and strategy, and measured desired behaviors more often, with a decreasing trend over the years.
Keywords: Adapted physical activity research, single subject experimental designs.
5793 An Approach to Improvement of Information Integrity in Key Areas of Portfolio Management
Authors: Victoria A. Bakhtina
Abstract:
At a time of growing market turbulence and a strong shift towards increasingly complex risk models and more stringent audit requirements, it is more critical than ever to maintain the highest quality of financial and credit information. IFC implemented an approach that helps increase data integrity and quality significantly. This approach is called “Screening”. Screening is based on linking information from different sources to identify potential inconsistencies in key financial and credit data. That, in turn, can help to ease the trials of portfolio supervision, and improve overall company global reporting and assessment systems. IFC experience showed that, when used regularly, Screening led to improved information.
Keywords: Information Integrity, Information Quality, Business Rules, Portfolio Management
5792 Knowledge and Eating Behavior of Teenage Pregnancy
Authors: Udomporn Yingpaisuk, Premwadee Karuhadej
Abstract:
The purpose of this research was to study the eating habits of pregnant teenagers and their relationship to knowledge of nutrition during pregnancy. The 100 samples were derived by a simple random sampling technique from pregnant teenagers in Bangkae District. A questionnaire with a reliability of 0.8 was used to collect the data. The data were analyzed with SPSS for Windows using the multiple regression technique. Percentages, means and the relationship between knowledge of nutrition and eating behavior were obtained. The research results revealed that their knowledge of nutrition averaged 4.07, the eating habit they mentioned most was refraining from alcohol and caffeine (82%), and knowledge of nutrition influenced their eating habits at 54%, with a statistically significant level of 0.001.
Keywords: Teenage pregnancy, knowledge of nutrition, eating habit.
5791 Image Features Comparison-Based Position Estimation Method Using a Camera Sensor
Authors: Jinseon Song, Yongwan Park
Abstract:
In this paper, we propose a method that can estimate a user's position based on a database built from a single camera. Previous positioning methods calculate distance from the arrival time of signals, as in GPS (Global Positioning System) and RF (Radio Frequency) systems. However, these methods have a weakness: a large error range caused by signal interference. Our method instead estimates position with a camera sensor. A single camera has difficulty obtaining relative position data, and a stereo camera has difficulty providing real-time position data because of the large amount of image data. Therefore, we first build an image database of the space in which the positioning service is to be provided, using a single camera. Next, we judge similarity through image matching between the database images and the image transmitted from the user. Finally, we determine the user's position from the position of the most similar database image. To verify the proposed method, we experimented in real indoor and outdoor environments. The proposed method has a wide positioning range and can determine not only the user's position but also the direction.
Keywords: Positioning, Distance, Camera, Features, SURF (Speed-Up Robust Features), Database, Estimation.
5790 Perception of Hygiene Knowledge among Staff Working in Top Five Famous Restaurants of Male’
Authors: Zulaikha Reesha Rashaad
Abstract:
One of the major factors that can contribute greatly to the success of catering businesses is to employ food and beverage staff with sound hygiene knowledge. Individuals with sound knowledge of hygiene have a higher chance of following safe food practices in food production. Lack of hygiene knowledge among food and beverage staff working in catering establishments and restaurants has been identified as one of the leading causes of food poisoning and food-borne illnesses. This research aims to analyze the hygiene knowledge of food and beverage staff working in the top five restaurants of Male’, in relation to their age, educational background, occupation and training. The research uses quantitative and descriptive methods in data collection and data analysis. Data were obtained through a random sampling technique with self-administered survey questionnaires completed by 60 respondents working in 5 different restaurants operating at the top level in Male’. The respondents of the research were service staff and chefs working in these restaurants. The responses to the questionnaires were analyzed using SPSS. The results of the research indicated that age, education level, occupation and training correlated with hygiene knowledge perception scores.
Keywords: Food and beverage staff, food poisoning, food production, hygiene knowledge.
5789 Modeling Stress-Induced Regulatory Cascades with Artificial Neural Networks
Authors: Maria E. Manioudaki, Panayiota Poirazi
Abstract:
Yeast cells live in a constantly changing environment that requires the continuous adaptation of their genomic program in order to sustain their homeostasis, survive and proliferate. Due to the advancement of high throughput technologies, there is currently a large amount of data such as gene expression, gene deletion and protein-protein interactions for S. Cerevisiae under various environmental conditions. Mining these datasets requires efficient computational methods capable of integrating different types of data, identifying inter-relations between different components and inferring functional groups or 'modules' that shape intracellular processes. This study uses computational methods to delineate some of the mechanisms used by yeast cells to respond to environmental changes. The GRAM algorithm is first used to integrate gene expression data and ChIP-chip data in order to find modules of coexpressed and co-regulated genes as well as the transcription factors (TFs) that regulate these modules. Since transcription factors are themselves transcriptionally regulated, a three-layer regulatory cascade consisting of the TF-regulators, the TFs and the regulated modules is subsequently considered. This three-layer cascade is then modeled quantitatively using artificial neural networks (ANNs) where the input layer corresponds to the expression of the up-stream transcription factors (TF-regulators) and the output layer corresponds to the expression of genes within each module. This work shows that (a) the expression of at least 33 genes over time and for different stress conditions is well predicted by the expression of the top layer transcription factors, including cases in which the effect of up-stream regulators is shifted in time and (b) identifies at least 6 novel regulatory interactions that were not previously associated with stress-induced changes in gene expression. These findings suggest that the combination of gene expression and protein-DNA interaction data with artificial neural networks can successfully model biological pathways and capture quantitative dependencies between distant regulators and downstream genes.
Keywords: gene modules, artificial neural networks, yeast, stress
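A minimal version of the regulator-to-module mapping described above, using a small feed-forward network: inputs are expression levels of upstream TF-regulators and outputs are expression levels of genes in one module. The network size and the synthetic data are placeholders, not the GRAM-derived modules or the yeast measurements.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n_samples, n_regulators, n_module_genes = 200, 6, 10

# Synthetic stand-ins: TF-regulator expression and a noisy nonlinear response of
# the module genes (the real inputs would come from expression measurements).
X = rng.normal(size=(n_samples, n_regulators))
W = rng.normal(size=(n_regulators, n_module_genes))
Y = np.tanh(X @ W) + 0.1 * rng.normal(size=(n_samples, n_module_genes))

# One hidden layer mapping regulator expression to module-gene expression.
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X[:150], Y[:150])
print("held-out R^2:", round(model.score(X[150:], Y[150:]), 3))
```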
5788 Study of a BVAR(p) Process Applied to U.S. Commodity Market Data
Authors: Jan Sindelar
Abstract:
The paper presents an applied study of a multivariate AR(p) process fitted to daily data from U.S. commodity futures markets with the use of Bayesian statistics. The first part gives a detailed description of the methods used. In the second part, two BVAR models are chosen, one assuming a lognormal distribution of prices conditioned on the parameters and the second a normal distribution. For comparison, two simple benchmark models commonly used in today's financial mathematics are chosen. The article compares the quality of the predictions of all the models, tries to find an adequate rate of forgetting of information, and questions the validity of the Efficient Market Hypothesis in its semi-strong form.
Keywords: Vector auto-regression, forecasting, financial, Bayesian, efficient markets.
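A rough sketch of the kind of Bayesian VAR(p) estimation discussed above: the posterior mean of the coefficient matrix under a simple zero-mean Gaussian shrinkage prior (ridge-like, with precision lambda), applied to synthetic price series. The specific prior structure, the lognormal variant and the forgetting mechanism studied in the paper are not reproduced.

```python
import numpy as np

def bvar_posterior_mean(Y, p=2, lam=1.0):
    """Posterior mean of VAR(p) coefficients under a zero-mean Gaussian prior
    with precision lam (equivalent to ridge regression on the lagged design)."""
    T, k = Y.shape
    X = np.hstack([Y[p - i - 1:T - i - 1] for i in range(p)])  # lagged regressors
    X = np.hstack([np.ones((T - p, 1)), X])                    # intercept column
    Z = Y[p:]
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ Z)

def forecast_one_step(Y, B, p=2):
    x = np.concatenate([[1.0], Y[-1:-p - 1:-1].ravel()])       # most recent lags
    return x @ B

rng = np.random.default_rng(0)
prices = np.cumsum(rng.normal(size=(300, 3)), axis=0) + 100    # synthetic series
B = bvar_posterior_mean(prices, p=2, lam=5.0)
print(forecast_one_step(prices, B, p=2))
```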