Search results for: Data Aggregation
5962 Analyzing Current Transformers Saturation Characteristics for Different Connected Burden Using LabVIEW Data Acquisition Tool
Authors: D. Subedi, S. Pradhan
Abstract:
Current transformers are an integral part of the power system because they provide a proportional, safe amount of current for protection and measurement applications. However, when the power system experiences an abnormal situation leading to a huge current flow, this huge current is proportionally injected into the protection and metering circuits. Since protection and metering equipment is designed to withstand only a certain amount of current with respect to time, these high currents pose a risk to people and equipment. During such instances, the CT saturation characteristics therefore have a strong influence on the safety of both people and equipment and on the reliability of the protection and metering system. This paper shows the effect of burden on the accuracy limiting factor/instrument security factor of current transformers and the resulting change in the CTs' saturation characteristics. The response of the CT to varying levels of overcurrent at different connected burdens is captured using the data acquisition software LabVIEW, and the analysis is carried out on the real-time data gathered with it. The variation of current transformer saturation characteristics with changes in burden is discussed.
Keywords: Accuracy limiting factor, burden, current transformer, instrument security factor, saturation characteristics.
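The burden dependence the abstract investigates is commonly expressed through the standard relation between the rated and the operating accuracy limiting factor. A minimal sketch of that relation, with all VA figures as illustrative assumptions rather than values from the paper:

```python
# Minimal sketch: how the operating accuracy limiting factor (ALF) of a CT shifts
# when the connected burden differs from the rated burden.
# Standard relation: ALF_op = ALF_rated * (S_internal + S_rated) / (S_internal + S_actual)
# All VA values below are illustrative placeholders, not measurements from the paper.

def operating_alf(alf_rated, s_internal_va, s_rated_va, s_actual_va):
    """Recompute the ALF for the actually connected burden (all burdens in VA)."""
    return alf_rated * (s_internal_va + s_rated_va) / (s_internal_va + s_actual_va)

if __name__ == "__main__":
    alf_rated = 10.0   # e.g. a 5P10 protection core (assumed)
    s_internal = 2.0   # internal burden of the CT winding in VA (assumed)
    s_rated = 15.0     # rated burden in VA (assumed)
    for s_actual in (15.0, 10.0, 5.0, 2.5):
        alf = operating_alf(alf_rated, s_internal, s_rated, s_actual)
        print(f"connected burden {s_actual:5.1f} VA -> operating ALF ~ {alf:.1f}")
```

Lowering the connected burden below the rated value pushes the operating ALF up, i.e. the core saturates at a higher overcurrent multiple, which is the effect the paper measures.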
5961 Entrepreneurs’ Perceptions of the Economic, Social and Physical Impacts of Tourism
Authors: Oktay Emir
Abstract:
The objective of this study is to determine how entrepreneurs perceive the economic, social and physical impacts of tourism. The study was conducted in the city of Afyonkarahisar, Turkey, which is rich in thermal tourism resources and investments. A survey was used as the data collection method, and the questionnaire was administered to 472 entrepreneurs identified by simple random sampling. Independent-samples t-tests and ANOVA tests were used to analyse the data obtained. Statistically significant differences (p<0.05) were found based on the participants' demographic characteristics regarding their opinions about the social, economic and physical impacts of tourism activities.
Keywords: Tourism, perception, entrepreneurship, entrepreneurs, structural equation modelling.
5960 Tracing Quality Cost in a Luggage Manufacturing Industry
Authors: S. B. Jaju, R. R. Lakhe
Abstract:
Quality costs are the costs associated with preventing, finding, and correcting defective work. Since the main language of corporate management is money, quality-related costs act as a means of communication between the staff of quality engineering departments and company managers. The objective of quality engineering is to minimize the total quality cost across the life of a product. Quality costs provide a benchmark against which improvement can be measured over time and a rupee-based report on quality improvement efforts; they are an effective tool to identify, prioritize and select quality improvement projects. A review of the literature showed that a simplified methodology for collecting quality cost data in a manufacturing industry was required. A quantified standard methodology is therefore proposed for collecting data on the various elements of the quality cost categories in a manufacturing industry. In the light of the research carried out so far, it is also felt necessary to standardise the cost elements in each of the prevention, appraisal, internal failure and external failure cost categories. An attempt is made here to standardise the various cost elements applicable to manufacturing industry, and data are collected using the proposed quantified methodology. The paper discusses a case study carried out in a luggage manufacturing industry.
Keywords: Quality costs, PAF model, quantified methodology, case study.
5959 Using Electrical Impedance Tomography to Control a Robot
Authors: Shayan Rezvanigilkolaei, Shayesteh Vefaghnematollahi
Abstract:
Electrical impedance tomography is a non-invasive medical imaging technique suitable for medical applications. This paper describes an electrical impedance tomography device with the ability to navigate a robotic arm to manipulate a target object. The design of the device includes hardware and software sections to perform medical imaging and control the robotic arm. In the hardware section, an image is formed by 16 electrodes located around a container. This image is used to navigate a 3-DOF robotic arm to reach the exact location of the target object. The data set used to form the impedance image is obtained by repeated current injections and voltage measurements between all electrode pairs. After the calculations needed to obtain the impedance are performed, the information is transmitted to the computer. These data are processed in MATLAB, which is interfaced with EIDORS (Electrical Impedance Tomography Reconstruction Software) to reconstruct the image from the acquired data. In the next step, the coordinates of the center of the target object are calculated with the MATLAB Image Processing Toolbox (IPT). Finally, these coordinates are used to calculate the angles of each joint of the robotic arm, and the arm moves to the desired tissue on the user's command.
Keywords: Electrical impedance tomography, EIT, surgeon robot, image processing of electrical impedance tomography.
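The final step described, converting the target's coordinates into joint angles, is a standard inverse-kinematics calculation. A minimal sketch for a generic 3-DOF arm (base rotation plus two links in a vertical plane); the link lengths and geometry are assumptions for illustration, not the authors' hardware:

```python
import math

# Minimal inverse-kinematics sketch for a generic 3-DOF arm: joint 0 rotates the
# base about the vertical axis, joints 1 and 2 move two links of lengths L1, L2 in
# the vertical plane. Geometry is assumed, not the robotic arm used in the paper.
L1, L2 = 0.20, 0.15  # link lengths in metres (assumed)

def ik_3dof(x, y, z):
    """Return joint angles (base, shoulder, elbow) in radians for target (x, y, z)."""
    base = math.atan2(y, x)                 # rotate the base towards the target
    r = math.hypot(x, y)                    # horizontal reach in the arm plane
    d2 = r * r + z * z                      # squared shoulder-to-target distance
    cos_elbow = (d2 - L1**2 - L2**2) / (2 * L1 * L2)
    if abs(cos_elbow) > 1.0:
        raise ValueError("target out of reach")
    elbow = math.acos(cos_elbow)            # one of the two elbow solutions
    shoulder = math.atan2(z, r) - math.atan2(L2 * math.sin(elbow), L1 + L2 * math.cos(elbow))
    return base, shoulder, elbow

print([round(math.degrees(a), 1) for a in ik_3dof(0.15, 0.10, 0.05)])
```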
5958 Increase of Error Detection Effectiveness in the Data Transmission Channels with Pulse-Amplitude Modulation
Authors: Akram A. Mustafa
Abstract:
In this paper, approaches for increasing the effectiveness of error detection in computer network channels with Pulse-Amplitude Modulation (PAM) are proposed. The proposed approaches are based on the special features of the errors that appear in lines with PAM. The first approach consists of a CRC modification designed specifically for lines with PAM. The second approach is based on the use of weighted checksums, and a way of coding the checksum components has been developed. It is shown that the proposed checksum modification ensures superior reliability of digital data transmission control for channels with PAM in comparison with CRC.
Keywords: Pulse-Amplitude Modulation, checksum, transmission, discrete.
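As a rough illustration of the weighted-checksum idea (not the paper's specific coding of checksum components), a position-weighted sum detects error patterns, such as exchanged symbols, that a plain modular sum misses:

```python
# Illustrative position-weighted checksum; the weights and modulus are assumptions,
# not the coding scheme developed in the paper.
MOD = 2**16 - 1  # checksum modulus (assumed)

def plain_checksum(words):
    return sum(words) % MOD

def weighted_checksum(words):
    # Each data word is multiplied by its (1-based) position before summing, so
    # reordered or shifted symbols change the checksum even when a plain sum would not.
    return sum(i * w for i, w in enumerate(words, start=1)) % MOD

data = [17, 250, 3, 99]
swapped = [250, 17, 3, 99]          # two symbols exchanged in transit
print(plain_checksum(data) == plain_checksum(swapped))        # True  -> error missed
print(weighted_checksum(data) == weighted_checksum(swapped))  # False -> error detected
```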
5957 Artificial Intelligence Techniques applied to Biomedical Patterns
Authors: Giovanni Luca Masala
Abstract:
Pattern recognition is the research area of Artificial Intelligence that studies the operation and design of systems that recognize patterns in data. Important application areas are image analysis, character recognition, fingerprint classification, speech analysis, DNA sequence identification, man and machine diagnostics, person identification and industrial inspection. Interest in improving classification systems for data analysis is independent of the application context: in many studies it is necessary to recognize and distinguish groups of various objects, which requires valid instruments capable of performing this task. The objective of this article is to present several Artificial Intelligence methodologies for data classification applied to biomedical patterns. In particular, this work deals with the realization of a Computer-Aided Detection (CADe) system that is able to assist the radiologist in identifying types of mammary tumor lesions. As an additional biomedical application of the classification systems, we present a study conducted on blood samples which shows how these methods may help to distinguish between carriers of thalassemia (or Mediterranean anaemia) and healthy subjects.
Keywords: Computer Aided Detection, mammary tumor, pattern recognition, thalassemia.
5956 A Stochastic Diffusion Process Based on the Two-Parameters Weibull Density Function
Authors: Meriem Bahij, Ahmed Nafidi, Boujemâa Achchab, Sílvio M. A. Gama, José A. O. Matos
Abstract:
Stochastic modeling concerns the use of probability to model real-world situations in which uncertainty is present. The purpose of stochastic modeling is therefore to estimate the probability of outcomes within a forecast, i.e. to be able to predict what conditions or decisions might occur under different situations. In the present study, we present a model of a stochastic diffusion process based on the two-parameter (bi-)Weibull distribution function, in the sense that its trend is proportional to the bi-Weibull probability density function. In general, the Weibull distribution has the ability to assume the characteristics of many different types of distributions. This has made it very popular among engineers and quality practitioners, who consider it the most commonly used distribution for studying problems such as modeling reliability data, accelerated life testing, and maintainability modeling and analysis. In this work, we start by obtaining the probabilistic characteristics of the model, namely the explicit expression of the process, its trends, and its distribution, by transforming the diffusion process into a Wiener process as shown in the Ricciardi theorem. We then develop the statistical inference of the model using the maximum likelihood methodology. Finally, we analyse with simulated data the computational problems associated with the parameters, an issue of great importance for application to real data, using convergence analysis methods. Overall, the use of a stochastic model reflects only a pragmatic decision on the part of the modeler: given the data that are available and the universe of models known to the modeler, this model represents the best currently available description of the phenomenon under consideration.
Keywords: Diffusion process, discrete sampling, likelihood estimation method, simulation, stochastic diffusion equation, trend functions, two-parameter Weibull density function.
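A minimal sketch of the estimation step on simulated data, using SciPy's maximum-likelihood fit of the two-parameter Weibull distribution (location fixed at zero); it illustrates parameter recovery only, not the full diffusion-process inference developed in the paper:

```python
import numpy as np
from scipy.stats import weibull_min

# Minimal sketch: recover two-parameter Weibull parameters from simulated data by
# maximum likelihood. This illustrates the estimation step only, not the authors'
# stochastic-diffusion inference.
rng = np.random.default_rng(0)
shape_true, scale_true = 1.8, 2.5
sample = weibull_min.rvs(shape_true, loc=0, scale=scale_true, size=2000, random_state=rng)

# floc=0 fixes the location parameter so only shape and scale are estimated.
shape_hat, _, scale_hat = weibull_min.fit(sample, floc=0)
print(f"shape: true {shape_true:.2f}, MLE {shape_hat:.2f}")
print(f"scale: true {scale_true:.2f}, MLE {scale_hat:.2f}")
```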
5955 Application of Single Subject Experimental Designs in Adapted Physical Activity Research: A Descriptive Analysis
Authors: Jiabei Zhang, Ying Qi
Abstract:
The purpose of this study was to develop a descriptive profile of adapted physical activity research using single subject experimental designs. All research articles using single subject experimental designs published in the journal Adapted Physical Activity Quarterly from 1984 to 2013 served as the data source. Each article was coded into a subcategory of seven categories: (a) sample size; (b) age of participants; (c) type of disability; (d) type of data analysis; (e) type of design; (f) independent variable; and (g) dependent variable. Frequencies, percentages, and trend inspection were used to analyze the data and develop the profile. The profile shows that a small portion of research articles used single subject designs, and that within them most researchers used a small sample size, recruited children as subjects, emphasized learning and behavior impairments, selected visual inspection with descriptive statistics, preferred a multiple baseline design, focused on the effects of therapy, inclusion, and strategy, and most often measured desired behaviors, with a decreasing trend over the years.
Keywords: Adapted physical activity research, single subject experimental designs.
5954 An Approach to Improvement of Information Integrity in Key Areas of Portfolio Management
Authors: Victoria A. Bakhtina
Abstract:
At a time of growing market turbulence and a strong shift towards increasingly complex risk models and more stringent audit requirements, it is more critical than ever to maintain the highest quality of financial and credit information. IFC implemented an approach, called "Screening", that helps increase data integrity and quality significantly. Screening is based on linking information from different sources to identify potential inconsistencies in key financial and credit data. That, in turn, can help to ease the trials of portfolio supervision and improve the company's global reporting and assessment systems. IFC's experience showed that, when used regularly, Screening led to improved information.
Keywords: Information integrity, information quality, business rules, portfolio management.
5953 Knowledge and Eating Behavior of Teenage Pregnancy
Authors: Udomporn Yingpaisuk, Premwadee Karuhadej
Abstract:
The purpose of this research was to study the eating habits of teenage pregnant women and their relationship to knowledge of nutrition during pregnancy. The 100 samples were derived by a simple random sampling technique from teenage pregnant women in the Bangkae District. A questionnaire with a reliability of 0.8 was used to collect the data, which were analyzed in SPSS for Windows using multiple regression. Percentages, means and the relationship between knowledge of eating and eating behavior were obtained. The results revealed that knowledge of nutrition averaged 4.07, that the eating habit mentioned most was refraining from alcohol and caffeine (82%), and that knowledge of nutrition influenced eating habits at 54%, statistically significant at the 0.001 level.
Keywords: Teenage pregnancy, knowledge of nutrition, eating habit.
5952 Image Features Comparison-Based Position Estimation Method Using a Camera Sensor
Authors: Jinseon Song, Yongwan Park
Abstract:
In this paper, we propose a method that can estimate a user's position based on a database built from a single camera. Previous positioning approaches calculate distance from the arrival time of signals, as in GPS (Global Positioning System) or RF (Radio Frequency) systems. However, these methods have a weakness: they have a large error range due to signal interference. One solution is to estimate position with a camera sensor, but a single camera makes it difficult to obtain relative position data, and a stereo camera makes it difficult to provide real-time position data because of the large amount of image data. First, we build an image database of the space in which the positioning service is to be provided, using a single camera. Next, we judge similarity through image matching between the database images and the image transmitted by the user. Finally, we determine the position of the user from the position of the most similar database image. To verify the proposed method, we experimented in real indoor and outdoor environments. The proposed method has a wide positioning range and can estimate not only the position of the user but also the direction.
Keywords: Positioning, distance, camera, features, SURF (Speeded-Up Robust Features), database, estimation.
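A minimal sketch of the matching step with OpenCV; the paper uses SURF, which sits in the non-free opencv-contrib module, so ORB is used here as a freely available stand-in, and the file names are placeholders:

```python
import cv2

# Minimal sketch of database-vs-query image matching. The paper uses SURF features;
# ORB is used here as a freely available stand-in, and file names are placeholders.
def match_score(query_path, db_path, max_matches=50):
    orb = cv2.ORB_create(nfeatures=1000)
    img_q = cv2.imread(query_path, cv2.IMREAD_GRAYSCALE)
    img_d = cv2.imread(db_path, cv2.IMREAD_GRAYSCALE)
    if img_q is None or img_d is None:
        raise FileNotFoundError("could not read one of the images")
    _, des_q = orb.detectAndCompute(img_q, None)
    _, des_d = orb.detectAndCompute(img_d, None)
    if des_q is None or des_d is None:
        return float("inf")
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_q, des_d), key=lambda m: m.distance)
    best = matches[:max_matches]
    # Lower mean descriptor distance over the best matches = more similar image.
    return sum(m.distance for m in best) / max(len(best), 1)

# The database image with the lowest score gives the estimated user position.
db = ["db_lobby.jpg", "db_corridor.jpg", "db_entrance.jpg"]
print(min(db, key=lambda p: match_score("user_photo.jpg", p)))
```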
5951 Perception of Hygiene Knowledge among Staff Working in Top Five Famous Restaurants of Male’
Authors: Zulaikha Reesha Rashaad
Abstract:
One of the major factors that can contribute greatly to the success of catering businesses is employing food and beverage staff with sound hygiene knowledge. Individuals with sound hygiene knowledge have a higher chance of following safe food practices in food production. A lack of hygiene knowledge among food and beverage staff working in catering establishments and restaurants has been identified as one of the leading causes of food poisoning and food-borne illnesses. This research aims to analyze the hygiene knowledge of food and beverage staff working in the top five restaurants of Male’, in relation to their age, educational background, occupation and training. The research uses quantitative and descriptive methods in data collection and analysis. Data were obtained through a random sampling technique with self-administered survey questionnaires completed by 60 respondents working in five different restaurants operating at the top level in Male’. The respondents were service staff and chefs working in these restaurants. The responses to the questionnaires were analyzed using SPSS. The results indicated that age, education level, occupation and training correlated with hygiene knowledge perception scores.
Keywords: Food and beverage staff, food poisoning, food production, hygiene knowledge.
5950 Modeling Stress-Induced Regulatory Cascades with Artificial Neural Networks
Authors: Maria E. Manioudaki, Panayiota Poirazi
Abstract:
Yeast cells live in a constantly changing environment that requires the continuous adaptation of their genomic program in order to sustain their homeostasis, survive and proliferate. Due to the advancement of high throughput technologies, there is currently a large amount of data such as gene expression, gene deletion and protein-protein interactions for S. cerevisiae under various environmental conditions. Mining these datasets requires efficient computational methods capable of integrating different types of data, identifying inter-relations between different components and inferring functional groups or 'modules' that shape intracellular processes. This study uses computational methods to delineate some of the mechanisms used by yeast cells to respond to environmental changes. The GRAM algorithm is first used to integrate gene expression data and ChIP-chip data in order to find modules of co-expressed and co-regulated genes as well as the transcription factors (TFs) that regulate these modules. Since transcription factors are themselves transcriptionally regulated, a three-layer regulatory cascade consisting of the TF-regulators, the TFs and the regulated modules is subsequently considered. This three-layer cascade is then modeled quantitatively using artificial neural networks (ANNs) where the input layer corresponds to the expression of the up-stream transcription factors (TF-regulators) and the output layer corresponds to the expression of genes within each module. This work shows that (a) the expression of at least 33 genes over time and for different stress conditions is well predicted by the expression of the top layer transcription factors, including cases in which the effect of up-stream regulators is shifted in time and (b) identifies at least 6 novel regulatory interactions that were not previously associated with stress-induced changes in gene expression. These findings suggest that the combination of gene expression and protein-DNA interaction data with artificial neural networks can successfully model biological pathways and capture quantitative dependencies between distant regulators and downstream genes.
Keywords: gene modules, artificial neural networks, yeast, stress
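A minimal sketch of the quantitative modelling step: a small feed-forward network mapping upstream TF-regulator expression to the expression of genes in a module. The data and network size are synthetic placeholders; the GRAM-derived module structure itself is not reproduced here:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

# Minimal sketch: predict module gene expression from upstream TF-regulator expression
# with a small feed-forward ANN. Shapes and data are synthetic placeholders; the real
# inputs would come from the GRAM-derived modules and the expression compendium.
rng = np.random.default_rng(1)
n_conditions, n_regulators, n_module_genes = 120, 8, 12
X = rng.normal(size=(n_conditions, n_regulators))                 # TF-regulator expression
W = rng.normal(size=(n_regulators, n_module_genes))
Y = np.tanh(X @ W) + 0.1 * rng.normal(size=(n_conditions, n_module_genes))  # module genes

X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.25, random_state=0)
ann = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
ann.fit(X_tr, Y_tr)                      # one output unit per gene in the module
print("held-out R^2:", round(ann.score(X_te, Y_te), 3))
```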
5949 Study of a BVAR(p) Process Applied to U.S. Commodity Market Data
Authors: Jan Sindelar
Abstract:
The paper presents an applied study of a multivariate AR(p) process fitted to daily data from U.S. commodity futures markets using Bayesian statistics. The first part gives a detailed description of the methods used. In the second part, two BVAR models are chosen: one assuming a lognormal distribution of prices conditioned on the parameters, the other a normal distribution. For comparison, two simple benchmark models commonly used in today's financial mathematics are chosen. The article compares the quality of the predictions of all the models, tries to find an adequate rate of forgetting of information, and questions the validity of the Efficient Market Hypothesis in its semi-strong form.
Keywords: Vector auto-regression, forecasting, financial, Bayesian, efficient markets.
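A minimal sketch of the model structure with statsmodels; this is a classical, non-Bayesian VAR used only to illustrate the vector autoregression itself, not the Bayesian estimation with forgetting studied in the paper, and the price series are simulated placeholders:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

# Minimal sketch: classical VAR(p) on log price differences as a stand-in for the
# paper's Bayesian estimation. The series below are simulated placeholders, not
# actual U.S. commodity futures data.
rng = np.random.default_rng(42)
n = 500
prices = pd.DataFrame(
    np.exp(np.cumsum(0.01 * rng.normal(size=(n, 3)), axis=0)),
    columns=["corn", "wheat", "soybeans"],
)
log_returns = np.log(prices).diff().dropna()

model = VAR(log_returns)
result = model.fit(2)                         # fixed lag order p = 2 for illustration
last_obs = log_returns.values[-result.k_ar:]  # most recent p observations
print("one-step-ahead forecast of log returns:")
print(result.forecast(last_obs, steps=1))
```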
5948 Prediction of a Human Facial Image by ANN using Image Data and its Content on Web Pages
Authors: Chutimon Thitipornvanid, Siripun Sanguansintukul
Abstract:
Choosing the right metadata is critical, as good information (metadata) attached to an image facilitates its visibility in a pile of other images. The image's value is enhanced not only by the quality of the attached metadata but also by the search technique. This study proposes a technique that is simple but efficient for predicting a single human image from a website using the basic image data and the embedded metadata of the image's content appearing on web pages. The result is very encouraging, with a prediction accuracy of 95%. This technique may become a great assist to librarians, researchers and many others in automatically and efficiently identifying a set of human images out of a greater set of images.
Keywords: Metadata, prediction, multi-layer perceptron, human facial image, image mining.
5947 Environmental Efficiency of Electric Power Industry of the United States: A Data Envelopment Analysis Approach
Authors: Alexander Y. Vaninsky
Abstract:
The importance of the environmental efficiency of the electric power industry stems from high demand for energy combined with global warming concerns. It is especially essential for the world's largest economies, such as that of the United States. The paper introduces a Data Envelopment Analysis (DEA) model of environmental efficiency using indicators of fossil fuel utilization, emissions rate, and electric power losses. Using DEA is advantageous over other approaches in this situation due to its nonparametric nature. The paper analyzes data for the period 1990-2006 by comparing actual yearly levels in each dimension with the best values of the partial indicators for the period. Positive factors of efficiency include the tendency of emissions rates to decline starting in 2000 and of electric power losses starting in 2004, together with an increasing trend in fuel utilization starting in 1999. As a result, the dynamics of environmental efficiency are positive starting in 2002. The main concern is the decline in fossil fuel utilization in 2006; this negative change should be reversed to comply with ecological and economic requirements.
Keywords: Environmental efficiency, electric power industry, DEA, United States.
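A minimal sketch of an input-oriented CCR DEA efficiency score solved as a linear program; the input/output table is an illustrative placeholder arranged in the spirit of the paper (fuel use, emissions rate and losses as inputs, generation as output), not its actual data:

```python
import numpy as np
from scipy.optimize import linprog

# Minimal sketch: input-oriented CCR DEA. Columns are decision-making units (here,
# years); X holds inputs, Y outputs. The numbers are illustrative placeholders, not
# the U.S. electric power data used in the paper.
X = np.array([[10.5, 10.2, 9.9, 10.1],    # e.g. fuel input
              [ 7.0,  6.8, 6.9,  6.5],    # e.g. emissions rate
              [ 6.2,  6.1, 6.0,  6.3]])   # e.g. power losses
Y = np.array([[100., 103., 105., 104.]])  # e.g. net generation

def ccr_efficiency(X, Y, o):
    """Efficiency of unit o: min theta s.t. X@lam <= theta*x_o, Y@lam >= y_o, lam >= 0."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                 # variables: [theta, lambda_1..lambda_n]
    A_in = np.hstack([-X[:, [o]], X])           # X@lam - theta*x_o <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y])   # -Y@lam <= -y_o
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[:, o]]
    bounds = [(None, None)] + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.fun

for unit in range(X.shape[1]):
    print(f"unit {unit}: efficiency = {ccr_efficiency(X, Y, unit):.3f}")
```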
5946 Enhancing the Performance of Wireless Sensor Networks Using Low Power Design
Authors: N. Mahendran, R. Madhuranthi
Abstract:
Wireless sensor networks (WSNs) are constantly in demand to process information more rapidly with less energy and area cost. At present, processor-based solutions find it difficult to achieve high processing speed with low power consumption. This paper presents a simple and accurate data processing scheme for a low-power wireless sensor node, based on a reduced number of processing elements (PEs). The presented model provides a simple recursive structure (SRS) to process the sampled data in the wireless sensor environment and to reduce the power consumption of the wireless sensor node. Based on this model, the incoming samples are processed to produce a smaller amount of data sufficient to reconstruct the original signal. The ModelSim simulator was used to simulate the SRS structure, and functional simulation was carried out to validate the presented architecture. The Xilinx Power Estimator (XPE) tool was used to measure the power consumption. The experimental results show an average power consumption of 91 mW, a 42% improvement compared to the folded-tree architecture.
Keywords: Power consumption, energy efficiency, low power WSN node, recursive structure, sleep/wake scheduling.
5945 Clustering Methods Applied to the Tracking of user Traces Interacting with an e-Learning System
Authors: Larbi Omar, Elberrichi Zakaria
Abstract:
Many research works have been carried out on the analysis of traces in digital learning environments. These studies produce large volumes of usage traces from the various actions performed by a user. However, exploiting these data to compare and improve performance raises several issues, and several recent works address this problem. This research studied a series of questions about the format and description of the data to be shared. Our goal is to share thoughts on these issues by presenting our experience in the analysis of trace-based log files, comparing several automatic classification approaches applied to e-learning platforms. Finally, the obtained results are discussed.
Keywords: Classification, e-learning platform, log file, trace.
5944 A Content-Based Optimization of Data Stream Television Multiplex
Authors: Jaroslav Polec, Martin Šimek, Michal Martinovič, Elena Šikudová
Abstract:
A television multiplex has a reserved capacity, and therefore only a limited number of videos can be carried on it. The composition of the multiplex has a major impact on how many videos it can carry, so this paper designs a simple algorithm to optimize the utilization of the multiplex capacity. The content from which the multiplex is composed also has a significant impact on the number of programmes it can hold: the content can be movies, news, sport, animated stories, documentaries, etc., and these types have specific characteristics that affect their resulting data streams. The paper also analyses the impact of the composition of the multiplex on the utilization of its capacity by video content.
Keywords: Multiplex, content, group of pictures, frame, capacity.
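As one simple illustration of the capacity-utilization idea (not the algorithm designed in the paper), the sketch below packs as many streams as possible into a fixed multiplex capacity by taking the lowest-bitrate candidates first; the bitrates and capacity are assumed values:

```python
# Minimal sketch: pack as many video streams as possible into a fixed multiplex
# capacity by choosing the cheapest (lowest bitrate) streams first. Numbers are
# illustrative placeholders; this is not the paper's algorithm.
CAPACITY_MBPS = 19.0  # reserved multiplex capacity (assumed)

candidates = {        # average stream bitrate in Mbps by content type (assumed)
    "news": 2.5, "documentary": 3.0, "animation": 2.0,
    "sport": 5.5, "movie": 4.5, "talk_show": 2.2,
}

def pack(candidates, capacity):
    chosen, used = [], 0.0
    for name, rate in sorted(candidates.items(), key=lambda kv: kv[1]):
        if used + rate <= capacity:   # greedy: cheapest streams maximise the count
            chosen.append(name)
            used += rate
    return chosen, used

chosen, used = pack(candidates, CAPACITY_MBPS)
print(chosen, f"-> {used:.1f} of {CAPACITY_MBPS} Mbps used")
```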
5943 Elimination of Redundant Links in Web Pages – Mathematical Approach
Authors: G. Poonkuzhali, K.Thiagarajan, K.Sarukesi
Abstract:
With the enormous growth of the web, users easily get lost in its rich hyper-structure. Developing user-friendly, automated tools that provide relevant information without redundant links is therefore the primary task for website owners wishing to cater to users' needs. Most existing web mining algorithms concentrate on finding frequent patterns while neglecting the less frequent ones that are likely to contain outlying data such as noise and irrelevant or redundant data. This paper proposes a new algorithm for mining web content by detecting redundant links in web documents using set-theoretic (classical mathematics) operations such as subset, union and intersection. The redundant links are then removed from the original web content so that the user obtains the required information.
Keywords: Web documents, web content mining, redundant link, outliers, set theory.
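A minimal sketch of the set-theoretic idea: treat each page's links as a set, take the intersection across pages to find links repeated everywhere (navigation and template links), and subtract that set from each page. The URLs are placeholders and the rule shown is an illustration, not the paper's complete algorithm:

```python
# Minimal sketch: links appearing on every page (the intersection of the link sets)
# are treated as redundant and removed from each page. URLs are placeholders; the
# paper's algorithm may use further set operations (subset, union) on top of this.
pages = {
    "index.html":   {"/home", "/about", "/contact", "/news/today", "/login"},
    "news.html":    {"/home", "/about", "/contact", "/news/today", "/news/archive"},
    "archive.html": {"/home", "/about", "/contact", "/news/archive", "/news/2009"},
}

redundant = set.intersection(*pages.values())        # links shared by all pages
cleaned = {page: links - redundant for page, links in pages.items()}

print("redundant links:", sorted(redundant))
for page, links in cleaned.items():
    print(page, "->", sorted(links))
```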
5942 An Integrative Bayesian Approach to Supporting the Prediction of Protein-Protein Interactions: A Case Study in Human Heart Failure
Authors: Fiona Browne, Huiru Zheng, Haiying Wang, Francisco Azuaje
Abstract:
Recent years have seen a growing trend towards the integration of multiple information sources to support large-scale prediction of protein-protein interaction (PPI) networks in model organisms. Despite advances in computational approaches, the combination of multiple "omic" datasets representing the same type of data, e.g. different gene expression datasets, has not been rigorously studied. Furthermore, there is a need to further investigate the inference capability of powerful approaches, such as fully-connected Bayesian networks, in the context of the prediction of PPI networks. This paper addresses these limitations by proposing a Bayesian approach to integrate multiple datasets, some of which encode the same type of "omic" data, to support the identification of PPI networks. The case study reported involved the combination of three gene expression datasets relevant to human heart failure (HF). In comparison with two traditional methods, the naive Bayesian and maximum likelihood ratio approaches, the proposed technique can accurately identify known PPIs and can be applied to infer potentially novel interactions.
Keywords: Bayesian network, classification, data integration, protein interaction networks.
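A minimal sketch of the naive Bayes baseline mentioned in the abstract, applied to features concatenated from several expression datasets; the arrays are synthetic placeholders standing in for gene-pair feature vectors labelled as interacting or not:

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

# Minimal sketch of the naive Bayes baseline: each protein pair is described by
# features drawn from several "omic" datasets (here three synthetic expression
# correlation features) and labelled as interacting (1) or not (0). The data are
# placeholders, not the heart-failure datasets used in the paper.
rng = np.random.default_rng(7)
n_pairs = 400
X = np.column_stack([rng.normal(size=n_pairs) for _ in range(3)])  # one column per dataset
# Synthetic labels: interaction more likely when all correlation features are high.
y = (X.mean(axis=1) + 0.5 * rng.normal(size=n_pairs) > 0.3).astype(int)

scores = cross_val_score(GaussianNB(), X, y, cv=5, scoring="roc_auc")
print("5-fold AUC:", scores.round(3))
```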
5941 Privacy of RFID Systems: Security of Personal Data for End-Users
Authors: Firoz Khan
Abstract:
Privacy of RFID systems is receiving increasing attention in the RFID community. RFID privacy is important as RFID tags will be attached to all kinds of products and physical objects, including people. The possible abuse or excessive use of RFID tracking capability by malicious users can lead to potential privacy violations. In this paper, we discuss how different industries use RFID and the potential privacy and security issues that arise when RFID is implemented in these industries. Although RFID technology offers interesting services to customers and retailers, it could also endanger the privacy of end-users. Personal data can be leaked if a protection mechanism is not deployed in the RFID system. The paper summarizes many different solutions for implementing privacy and security while deploying RFID systems.
Keywords: RFID, privacy, security, encryption.
5940 Development of a Telemedical Network Supporting an Automated Flow Cytometric Analysis for the Clinical Follow-up of Leukaemia
Authors: Claude Takenga, Rolf-Dietrich Berndt, Erling Si, Markus Diem, Guohui Qiao, Melanie Gau, Michael Brandstoetter, Martin Kampel, Michael Dworzak
Abstract:
In patients with acute lymphoblastic leukaemia (ALL), treatment response is increasingly evaluated with minimal residual disease (MRD) analyses. Flow cytometry (FCM) is a fast and sensitive method to detect MRD. However, the interpretation of these multi-parametric data requires intensive operator training and experience. This paper presents a pipeline software as a ready-to-use FCM-based MRD-assessment tool for daily clinical practice for patients with ALL. The new tool increases accuracy in the assessment of FCM-MRD in samples which are difficult to analyse by conventional operator-based gating, since computer-aided analysis potentially has a superior resolution due to utilization of the whole multi-parametric FCM data space at once instead of step-wise, two-dimensional plot-based visualization. The system, developed as a telemedical network, reduces the workload, laboratory costs and staff time needed for training, continuous quality control and operator-based data interpretation. It allows dissemination of automated FCM-MRD analysis to medical centres which have no established expertise, for the benefit of an even larger community of diseased children worldwide. We established a telemedical network system for analysis, clinical follow-up and treatment monitoring of leukaemia. The system is scalable and adapted to link several centres and laboratories worldwide.
Keywords: Data security, flow cytometry, leukaemia, telematics platform, telemedicine.
5939 Knowledge Transfer in Industrial Clusters
Authors: Ana Paula Lisboa Sohn, Filipa Dionísio Vieria, Nelson Casarotto, Idaulo José Cunha
Abstract:
This paper aims at identifying and analyzing the knowledge transmission channels in textile and clothing clusters located in Brazil and in Europe. Primary data were obtained through interviews with key individuals, collected using a questionnaire with ten categories of indicators of knowledge transmission. Secondary data were collected through a literature review and from the websites of international organizations. Similarities in the use of the main knowledge transmission channels are observed in all cases, the main ones being: the influence of suppliers of machinery, equipment and raw materials; imitation of products and best practices; training promoted by technical institutions and businesses; and the openness of cluster companies to acquiring new knowledge. The main differences lie in the relationships between companies, which are more intense in Europe than in Brazil. Differences also occur in the importance and frequency of the relationship with government, with the cultural environment, and with research and development activities. Factors are also found that reduce the importance of geographical proximity in the transmission of knowledge and in generating trust and establishing collaborative behavior.
Keywords: Industrial clusters, interorganizational learning, knowledge transmission channels, textile and clothing industry.
5938 Thermodynamic Approach of Lanthanide-Iron Double Oxides Formation
Authors: Vera Varazashvili, Murman Tsarakhov, Tamar Mirianashvili, Teimuraz Pavlenishvili, Tengiz Machaladze, Mzia Khundadze
Abstract:
The standard Gibbs energies of formation ΔG°for(298.15) of lanthanide-iron double oxides with the garnet-type crystal structure R3Fe5O12 (RIG, where R are rare earth ions) from the initial oxides are evaluated. The calculation is based on the standard entropies S°(298.15) and standard enthalpies of formation ΔH°(298.15) of the compounds involved in the garnet synthesis. The Gibbs energy of formation is presented as a temperature function ΔG°for(T) over the range 300-1600 K. The necessary starting thermodynamic data were obtained from a calorimetric study of the heat capacity-temperature functions and by using a semi-empirical method for calculating ΔH°(298.15) of formation. The thermodynamic functions at standard temperature (enthalpy, entropy and Gibbs energy) are recommended as reference data for technological evaluations. Across the structural series of rare earth iron garnets, the correlation between the thermodynamic properties and the characteristics of the lanthanide ions is elucidated.
Keywords: Calorimetry, entropy, enthalpy, heat capacity, Gibbs energy of formation, rare earth iron garnets.
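In its simplest form, the reported temperature dependence reduces to the standard relation dG(T) = dH - T*dS, neglecting heat-capacity corrections. A minimal sketch with placeholder reaction values, not the measured garnet data:

```python
# Minimal sketch: Gibbs energy of formation from oxides as a function of temperature,
# using dG(T) = dH - T*dS and neglecting heat-capacity corrections. The reaction
# enthalpy and entropy below are placeholders, not the measured garnet values.
dH_r = -85_000.0   # J/mol, assumed reaction enthalpy at 298.15 K
dS_r = -12.0       # J/(mol*K), assumed reaction entropy at 298.15 K

def gibbs_formation(T):
    """Approximate Gibbs energy of the formation reaction at temperature T (K)."""
    return dH_r - T * dS_r

for T in (300, 600, 900, 1200, 1600):
    print(f"T = {T:4d} K  ->  dG_for = {gibbs_formation(T) / 1000:7.1f} kJ/mol")
```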
5937 Parkinsons Disease Classification using Neural Network and Feature Selection
Authors: Anchana Khemphila, Veera Boonjing
Abstract:
In this study, a Multi-Layer Perceptron (MLP) with the back-propagation learning algorithm is used to classify and support the diagnosis of Parkinson's disease (PD), a challenging problem for the medical community. Typically characterized by tremor, PD occurs due to the loss of dopamine in the brain's thalamic region, which results in involuntary or oscillatory movement in the body. A feature selection algorithm is used along with biomedical test values to diagnose Parkinson's disease. Clinical diagnosis is done mostly by the doctor's expertise and experience, but cases of wrong diagnosis and treatment are still reported. Patients are asked to take a number of tests for diagnosis, yet in many cases not all the tests contribute towards an effective diagnosis of the disease. Our work classifies the presence of Parkinson's disease with a reduced number of attributes. Originally, 22 attributes are involved in the classification. We use information gain to determine the attributes to be taken from patients, reducing their number, and artificial neural networks are used to classify the diagnosis. The 22 attributes are reduced to 16. The accuracy on the training data set is 82.051% and on the validation data set 83.333%.
Keywords: Data mining, classification, Parkinson disease, artificial neural networks, feature selection, information gain.
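A minimal sketch of the two-stage pipeline described, ranking attributes by mutual information (used here as a stand-in for information gain) and classifying with a back-propagation MLP; the feature matrix is synthetic rather than the clinical Parkinson's dataset:

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Minimal sketch of the described pipeline: keep the 16 most informative of 22
# attributes (mutual information as a stand-in for information gain), then train an
# MLP with back-propagation. Synthetic data stands in for the clinical PD dataset.
X, y = make_classification(n_samples=195, n_features=22, n_informative=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

clf = make_pipeline(
    StandardScaler(),
    SelectKBest(mutual_info_classif, k=16),
    MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0),
)
clf.fit(X_tr, y_tr)
print("training accuracy:  ", round(clf.score(X_tr, y_tr), 3))
print("validation accuracy:", round(clf.score(X_te, y_te), 3))
```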
5936 An Overview of the Application of Fuzzy Inference System for the Automation of Breast Cancer Grading with Spectral Data
Authors: Shabbar Naqvi, Jonathan M. Garibaldi
Abstract:
Breast cancer is one of the most frequently occurring cancers in women throughout the world, including the U.K. The grading of this cancer plays a vital role in the prognosis of the disease. In this paper we present an overview of the use of the advanced computational method of fuzzy inference systems as a tool for the automation of breast cancer grading. A new spectral data set obtained by Fourier Transform Infrared Spectroscopy (FTIR) of cancer patients has been used for this study. The future work outlines the potential areas of fuzzy systems that can be used for the automation of breast cancer grading.
Keywords: Breast cancer, FTIR, fuzzy inference system, principal component analysis
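A minimal, self-contained sketch of a Mamdani-style fuzzy inference step with triangular membership functions; the input features, membership ranges and rules are invented for illustration and are not the FTIR-based grading system the paper overviews:

```python
import numpy as np

# Minimal sketch of a fuzzy inference step with triangular membership functions.
# Feature names, ranges and rules are illustrative assumptions, not the FTIR-based
# grading system discussed in the paper.
def tri(x, a, b, c):
    """Triangular membership function with feet at a, c and peak at b."""
    return float(np.clip(min((x - a) / (b - a + 1e-12), (c - x) / (c - b + 1e-12)), 0.0, 1.0))

def grade(feature1, feature2):
    # Fuzzify two hypothetical normalised spectral features into low/high sets.
    f1_low, f1_high = tri(feature1, 0.0, 0.0, 0.5), tri(feature1, 0.3, 1.0, 1.0)
    f2_low, f2_high = tri(feature2, 0.0, 0.0, 0.5), tri(feature2, 0.3, 1.0, 1.0)
    # Rule base (min as AND), each rule pointing to a crisp grade value.
    rules = [
        (min(f1_low, f2_low), 1.0),    # both low  -> grade 1
        (min(f1_high, f2_low), 2.0),   # mixed     -> grade 2
        (min(f1_low, f2_high), 2.0),
        (min(f1_high, f2_high), 3.0),  # both high -> grade 3
    ]
    # Defuzzify by the weighted average of the rule outputs.
    num = sum(w * g for w, g in rules)
    den = sum(w for w, _ in rules) or 1.0
    return num / den

print(round(grade(0.2, 0.25), 2), round(grade(0.8, 0.9), 2))
```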
5935 Management of Air Pollutants from Point Sources
Authors: N. Lokeshwari, G. Srinikethan, V. S. Hegde
Abstract:
Monitoring is essential to assessing the effectiveness of air pollution control actions. The goal of an air quality information system is, through monitoring, to keep authorities, major polluters and the public informed of short- and long-term changes in air quality, thereby helping to raise awareness. Mathematical models are the best tools available for prediction in air quality management. The main objective of the work was to apply a model that predicts the concentration levels of different pollutants at any instant of time. In this study, the distributions of air pollutant concentrations from industries, such as nitrogen dioxide (NO2), sulphur dioxide (SO2) and total suspended particulates (TSP), are determined using a Gaussian model. In addition, the effect of wind speed and direction on the pollutant concentrations within the affected area is evaluated. In order to determine the efficiency and percentage error of the modeling, a validation process was carried out. Air quality sampling was conducted to obtain the existing air quality around a factory; the concentrations of pollutants in a plume were inversely proportional to wind velocity. The resulting ground-level concentrations were then compared to the quality standards to determine whether there could be a negative impact on health. This study concludes that pollutant concentrations can be significantly predicted using the Gaussian model. A database management system is developed for the air data of the Hubli-Dharwad region.
Keywords: DBMS, NO2, SO2, Wind rose plots.
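The Gaussian model referred to is the standard Gaussian plume equation for a continuous elevated point source. A minimal sketch in which the dispersion coefficients are passed in directly rather than derived from stability classes, with all numbers as illustrative placeholders rather than the Hubli-Dharwad data:

```python
import math

# Minimal sketch of the standard Gaussian plume equation for a continuous point source:
# C(x, y, z) = Q / (2*pi*u*sy*sz) * exp(-y^2 / (2*sy^2))
#              * [exp(-(z-H)^2 / (2*sz^2)) + exp(-(z+H)^2 / (2*sz^2))]
# The dispersion coefficients sy, sz are supplied directly instead of being read from
# Pasquill-Gifford stability tables; all numbers are illustrative, not the study's data.
def plume_concentration(Q, u, H, y, z, sy, sz):
    """Ground-reflected concentration (g/m^3) at crosswind offset y and height z."""
    lateral = math.exp(-y**2 / (2 * sy**2))
    vertical = math.exp(-(z - H) ** 2 / (2 * sz**2)) + math.exp(-(z + H) ** 2 / (2 * sz**2))
    return Q / (2 * math.pi * u * sy * sz) * lateral * vertical

# Example: 50 g/s SO2 source, 3 m/s wind, 40 m effective stack height, ground-level
# centreline concentration with assumed sigma_y = 60 m and sigma_z = 30 m.
c = plume_concentration(Q=50.0, u=3.0, H=40.0, y=0.0, z=0.0, sy=60.0, sz=30.0)
print(f"{c * 1e6:.1f} microgram/m^3")
```

The inverse dependence on wind speed u noted in the abstract is visible directly in the leading factor Q / (2*pi*u*sy*sz).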
5934 SQL Generator Based On MVC Pattern
Authors: Chanchai Supaartagorn
Abstract:
Structured Query Language (SQL) is the de facto standard language to access and manipulate data in a relational database. Although SQL is simple and powerful, most novice users have trouble with SQL syntax. We therefore present a SQL generator tool which is capable of translating user actions into SQL and displaying the SQL commands and resulting data sets simultaneously. The tool was developed based on the Model-View-Controller (MVC) pattern, a widely used software design pattern that enforces the separation between the input, processing, and output of an application. Developers take full advantage of it to reduce the complexity of the architectural design and to increase the flexibility and reuse of code. In addition, we use white-box testing for code verification in the Model module.
Keywords: MVC, relational database, SQL, White-Box testing.
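A minimal sketch of the MVC separation for such a tool: the model builds and executes the SQL, the view renders the generated command together with the result set, and the controller maps user actions onto the model. Table and column names are placeholders, and the in-memory SQLite database only stands in for a real relational backend:

```python
import sqlite3

# Minimal MVC sketch for a SQL generator: Model builds/executes SQL, View shows the
# generated command together with the result set, Controller maps user actions to the
# model. Table/column names and the in-memory SQLite store are illustrative only.
class QueryModel:
    def __init__(self, conn):
        self.conn = conn

    def select(self, table, columns, where=None):
        sql = f"SELECT {', '.join(columns)} FROM {table}"
        params = []
        if where:                      # where = {"column": value}
            sql += " WHERE " + " AND ".join(f"{col} = ?" for col in where)
            params = list(where.values())
        rows = self.conn.execute(sql, params).fetchall()
        return sql, rows

class QueryView:
    @staticmethod
    def render(sql, rows):
        print("Generated SQL:", sql)
        print("Result set:   ", rows)

class QueryController:
    def __init__(self, model, view):
        self.model, self.view = model, view

    def on_select(self, table, columns, where=None):
        self.view.render(*self.model.select(table, columns, where))

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE student (id INTEGER, name TEXT, dept TEXT)")
conn.executemany("INSERT INTO student VALUES (?, ?, ?)",
                 [(1, "Anna", "CS"), (2, "Boon", "EE")])
QueryController(QueryModel(conn), QueryView()).on_select("student", ["id", "name"], {"dept": "CS"})
```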
5933 Gender Based Variability Time Series Complexity Analysis
Authors: Ramesh K. Sunkaria, Puneeta Marwaha
Abstract:
Nonlinear methods of heart rate variability (HRV) analysis are becoming more popular. It has been observed that complexity measures quantify the regularity and uncertainty of cardiovascular RR-interval time series. In the present work, SampEn has been evaluated in healthy normal sinus rhythm (NSR) male and female subjects for different data lengths and tolerance levels r. It is demonstrated that SampEn is small for higher values of the tolerance r. The SampEn value of the healthy female group is higher than that of the healthy male group for short data lengths, and with increasing data length the two groups overlap and become difficult to distinguish. SampEn gives inaccurate results by assigning a higher value to the female group, because male subjects have a more complex HRV pattern than female subjects. This traditional algorithm therefore exhibits higher complexity for healthy female subjects than for healthy male subjects, which is a misleading observation. This may be because SampEn does not account for the multiple time scales inherent in physiologic time series, leaving the hidden spatial and temporal fluctuations unexplored.
Keywords: Heart rate variability, normal sinus rhythm group, RR interval time series, sample entropy.
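A minimal, self-contained implementation of sample entropy for an RR-interval series, following the standard definition (template length m, tolerance r as a fraction of the series standard deviation, Chebyshev distance, self-matches excluded); the RR data are simulated placeholders:

```python
import numpy as np

# Minimal sample entropy (SampEn) implementation: count template matches of length m
# and m+1 under the Chebyshev distance with tolerance r, excluding self-matches, and
# return -ln(A/B). The RR-interval series below is a simulated placeholder.
def sample_entropy(x, m=2, r_frac=0.2):
    x = np.asarray(x, dtype=float)
    r = r_frac * x.std(ddof=0)
    n_templates = len(x) - m            # same number of templates for lengths m and m+1

    def count_matches(length):
        templates = np.array([x[i:i + length] for i in range(n_templates)])
        count = 0
        for i in range(n_templates):
            dist = np.max(np.abs(templates - templates[i]), axis=1)  # Chebyshev distance
            count += int(np.sum(dist <= r)) - 1                      # exclude self-match
        return count

    B, A = count_matches(m), count_matches(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else float("inf")

rng = np.random.default_rng(3)
rr = 0.8 + 0.05 * rng.standard_normal(1000)   # simulated RR intervals in seconds
print("SampEn(m=2, r=0.2*SD):", round(sample_entropy(rr), 3))
```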