Search results for: mathematical data analysis.
13093 Investigating the Demand for Short-shelf Life Food Products for SME Wholesalers
Authors: Yamini Raju, Parminder S. Kang, Adam Moroz, Ross Clement, Ashley Hopwell, Alistair Duffy
Abstract:
Accurate forecasting of fresh produce demand is one of the challenges faced by Small and Medium Enterprise (SME) wholesalers. This paper is an attempt to understand the causes, such as weather and holidays, of the high variability in the demand faced by SME wholesalers, since understanding the significance of otherwise unidentified factors may improve forecasting accuracy. The paper reviews the current literature on the factors used to predict demand and on existing forecasting techniques for short shelf-life products. It then investigates a variety of possible internal and external factors, some of which have not been used by other researchers in the demand prediction process. The results presented in this paper are further analysed using a number of techniques to minimise noise in the data. The analysis uses past sales data (January 2009 to May 2014) from a UK-based SME wholesaler, and the results presented are limited to the product 'Milk' sold to cafés in Derby. Correlation analysis is performed to check the dependence of the actual demand on each variability factor, and PCA is then applied to assess the significance of the factors identified through correlation. The PCA results suggest that cloud cover, weather summary and temperature are the most significant factors for forecasting demand. The correlation of these three factors increases for monthly demand and becomes more stable than for weekly and daily demand.
Keywords: Demand Forecasting, Deteriorating Products, Food Wholesalers, Principal Component Analysis, Variability Factors.
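As a sketch of the correlation-then-PCA screening described above, the following fragment computes factor-demand correlations and principal components. The file and column names are hypothetical stand-ins for the wholesaler's data, and the weather summary is assumed to be numerically encoded.

```python
# A minimal sketch of the correlation + PCA screening step described above.
# Column names (cloud_cover, weather_summary, temperature, demand) are
# hypothetical stand-ins for the wholesaler's actual variables.
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

df = pd.read_csv("milk_sales.csv")          # hypothetical file: one row per period
factors = ["cloud_cover", "weather_summary", "temperature"]

# Step 1: correlation of each candidate factor with actual demand
print(df[factors + ["demand"]].corr()["demand"])

# Step 2: PCA on the standardised factors to judge their joint significance
X = StandardScaler().fit_transform(df[factors])
pca = PCA()
pca.fit(X)
print(pca.explained_variance_ratio_)        # variance captured by each component
print(pca.components_)                      # factor loadings per component
```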
13092 Stability of Concrete Moment Resisting Frames in View of Current Codes Requirements
Authors: Mahmoud A. Mahmoud, Ashraf Osman
Abstract:
In this study, the different approaches currently followed by design codes to assess the stability of buildings that use concrete moment resisting frames as a structural system are evaluated. For this purpose, a parametric study was performed. It involved analysing groups of concrete moment resisting frames having different slenderness (height/width) ratios, designed for different lateral-to-vertical load ratios, and constructed from ordinary reinforced concrete and high strength concrete, checking stability and overall buckling using both the code approaches and computer buckling analysis. The objectives were to examine the influence of these parameters, which are directly linked to the frames' lateral stiffness, on building stability, and to evaluate the code approaches in view of the buckling analysis results. Based on this study, it was concluded that the buildings most susceptible to instability and to magnification of second order effects are those having high aspect (height/width) ratios, low lateral-to-vertical load ratios, and construction materials of high strength. In addition, the study showed that the instability limits imposed by codes are mainly mathematical limits that ensure reliable analysis rather than physical ones, and that they are in general conservative. It was also shown that the upper limit set by one of the codes, that the second order moment for structural elements should be limited to 1.4 times the first order moment, is not justified; instead, the overall storey check is more reliable.
Keywords: Buckling, lateral stability, p-delta, second order.
13091 Deep iCrawl: An Intelligent Vision-Based Deep Web Crawler
Authors: R.Anita, V.Ganga Bharani, N.Nityanandam, Pradeep Kumar Sahoo
Abstract:
The explosive growth of the World Wide Web has posed a challenging problem in extracting relevant data. Traditional web crawlers focus only on the surface web, while the deep web keeps expanding behind the scenes. Deep web pages are created dynamically as a result of queries posed to specific web databases, and their structure makes it impossible for traditional web crawlers to access deep web content. This paper presents Deep iCrawl, a novel vision-based approach for extracting data from the deep web. Deep iCrawl splits the process into two phases: the first includes query analysis and query translation, and the second covers vision-based extraction of data from the dynamically created deep web pages. There are several established approaches for the extraction of deep web pages, but the proposed method aims at overcoming their inherent limitations. This paper also aims at comparing the extracted data items and presenting them in the required order.
Keywords: Crawler, Deep web, Web Database.
13090 Teachers and Learners Perceptions on the Impact of Different Test Procedures on Reading: A Case Study
Authors: Bahloul Amel
Abstract:
The main aim of this research was to investigate the perspectives of English language teachers and learners on the effect of test techniques on reading comprehension, test performance and assessment. The research also aimed at finding the differences between teacher and learner perspectives, specifying the test techniques that have the strongest effect, investigating the other factors affecting reading comprehension, and comparing the results with similar studies. In order to achieve these objectives, the perspectives and findings of different researchers were reviewed, two questionnaires were prepared to collect data on the perspectives of teachers and learners, the questionnaires were administered to 26 learners and 8 teachers from the University of Batna (Algeria), and quantitative and qualitative analyses of the results were carried out. The results and their analysis show that different test techniques affect reading comprehension, test performance and assessment to different degrees.
Keywords: Reading Comprehension, Reading Assessment, Test Performance, Test Techniques.
13089 Trust and Reliability for Public Sector Data
Authors: Klaus Stranacher, Vesna Krnjic, Thomas Zefferer
Abstract:
The public sector holds large amounts of data on various areas such as social affairs, economy, or tourism. Various initiatives such as Open Government Data or the EU Directive on public sector information aim to make these data available for public and private service providers. Requirements for the provision of public sector data are defined by legal and organizational frameworks. Surprisingly, the defined requirements hardly cover security aspects such as integrity or authenticity. In this paper we discuss the importance of these missing requirements and present a concept to assure the integrity and authenticity of provided data based on electronic signatures. We show that our concept is well suited for the provisioning of unaltered data, and that it can be extended to data that need to be anonymized before provisioning by incorporating redactable signatures. Our proposed concept enhances the trust and reliability of provided public sector data.
Keywords: Trusted Public Sector Data, Integrity, Authenticity, Reliability, Redactable Signatures.
13088 Towards End-To-End Disease Prediction from Raw Metagenomic Data
Authors: Maxence Queyrel, Edi Prifti, Alexandre Templier, Jean-Daniel Zucker
Abstract:
Analysis of the human microbiome using metagenomic sequencing data has demonstrated a high ability to discriminate various human diseases. Raw metagenomic sequencing data require multiple complex and computationally heavy bioinformatics steps prior to data analysis. Such data contain millions of short sequence reads from the fragmented DNA, stored as fastq files. Conventional processing pipelines consist of multiple steps, including quality control, filtering, and alignment of sequences against genomic catalogs (genes, species, taxonomic levels, functional pathways, etc.). These pipelines are complex to use and time consuming, and they rely on a large number of parameters that often introduce variability and affect the estimation of the microbiome elements. Training Deep Neural Networks directly on raw sequencing data is a promising approach to bypass some of the challenges associated with mainstream bioinformatics pipelines. Most such methods use the concept of word and sentence embeddings to create a meaningful numerical representation of DNA sequences, extracting features and reducing the dimensionality of the data. In this paper we present metagenome2vec, an end-to-end approach that classifies patients into disease groups directly from raw metagenomic reads. The approach is composed of four steps: (i) generating a vocabulary of k-mers and learning their numerical embeddings; (ii) learning DNA sequence (read) embeddings; (iii) identifying the genome from which each sequence is most likely to come; and (iv) training a multiple instance learning classifier that predicts the phenotype from the vector representation of the raw data. An attention mechanism is applied in the network so that the model can be interpreted, assigning a weight to each genome's influence on the prediction. Using two public real-life datasets as well as a simulated one, we demonstrate that this original approach reaches high performance, comparable with state-of-the-art methods applied to data processed through mainstream bioinformatics workflows. These results are encouraging for this proof-of-concept work. We believe that, with further dedication, DNN models have the potential to surpass mainstream bioinformatics workflows in disease classification tasks.
Keywords: Metagenomics, phenotype prediction, deep learning, embeddings, multiple instance learning.
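Steps (i)-(ii) can be pictured with the short sketch below, which builds overlapping k-mers from reads and learns embeddings with a generic Word2Vec model as a stand-in for the paper's embedding method; the toy reads and the mean-pooling choice are illustrative.

```python
# A minimal sketch of steps (i)-(ii): build a k-mer vocabulary from reads,
# learn k-mer embeddings, then pool them into a read embedding. A generic
# Word2Vec model stands in for the embedding method in the paper.
import numpy as np
from gensim.models import Word2Vec

def kmers(read, k=4):
    """Split a DNA read into overlapping k-mers ('words')."""
    return [read[i:i + k] for i in range(len(read) - k + 1)]

reads = ["ATCGGCTAAGCT", "GGCTAAGCTTAC"]     # toy reads; real data come from fastq
corpus = [kmers(r) for r in reads]           # each read becomes a 'sentence'

model = Word2Vec(corpus, vector_size=16, window=5, min_count=1, sg=1)

# Read embedding = mean of its k-mer vectors (one simple pooling choice)
read_vec = np.mean([model.wv[w] for w in corpus[0]], axis=0)
print(read_vec.shape)                        # (16,)
```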
13087 Two DEA Based Ant Algorithms for CMS Problems
Authors: Hossein Ali Akbarpour, Fatemeh Dadkhah
Abstract:
This paper considers a multi-criteria cell formation problem in Cellular Manufacturing Systems (CMS). Simultaneously minimizing the number of voids and the number of exceptional elements in cells are the two proposed objective functions. According to the literature this problem is NP-hard, and therefore the optimal solution cannot be found by an exact method. In this paper we develop two ant algorithms, Ant Colony Optimization (ACO) and Max-Min Ant System (MMAS), based on Data Envelopment Analysis (DEA). Both try to find efficient solutions based on the efficiency concept in DEA, with each artificial ant considered as a Decision Making Unit (DMU). For each DMU we considered two inputs, the values of the objective functions, and one output, with the value of one for all DMUs. In order to evaluate the performance of the proposed methods, we provide an experimental design with empirical problems of three different sizes: small, medium and large. We define three criteria that show which algorithm performs best.
Keywords: Ant algorithm, Cellular manufacturing system, Data envelopment analysis, Efficiency.
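The DEA building block, scoring each ant/DMU with two inputs (the objective values) and a constant output of one, can be sketched as the input-oriented CCR multiplier model solved as one linear program per DMU. The input values below are invented for illustration.

```python
# A minimal sketch of the DEA scoring used for each ant/DMU: the
# input-oriented CCR multiplier model with two inputs (objective values)
# and a constant output of 1, solved as one LP per DMU.
import numpy as np
from scipy.optimize import linprog

X = np.array([[3.0, 5.0],    # inputs per DMU: (voids, exceptional elements)
              [4.0, 2.0],
              [6.0, 6.0]])   # invented values
y = np.ones(len(X))          # single output fixed to 1 for every DMU

def ccr_efficiency(o):
    """Input-oriented CCR efficiency of DMU o (1.0 = efficient)."""
    n = len(X)
    c = np.r_[-y[o], 0.0, 0.0]                  # maximise u * y_o
    A_eq = np.array([[0.0, X[o, 0], X[o, 1]]])  # v . x_o = 1 (normalisation)
    A_ub = np.c_[y, -X]                         # u * y_j - v . x_j <= 0 for all j
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n),
                  A_eq=A_eq, b_eq=[1.0], bounds=[(0, None)] * 3)
    return -res.fun

print([round(ccr_efficiency(o), 3) for o in range(len(X))])
```

Because both objectives are minimized, treating them as DEA inputs means an ant is efficient when no other ant achieves both fewer voids and fewer exceptional elements per unit of output.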
13086 An Advanced Time-Frequency Domain Method for PD Extraction with Non-Intrusive Measurement
Authors: Guomin Luo, Daming Zhang, Yong Kwee Koh, Kim Teck Ng, Helmi Kurniawan, Weng Hoe Leong
Abstract:
Partial discharge (PD) detection is an important method for evaluating the insulation condition of metal-clad apparatus. Non-intrusive sensors, which are easy to install and do not interrupt operation, are preferred in on-site PD detection. However, such detection often lacks accuracy due to interference in the PD signals. In this paper a novel PD extraction method that uses frequency analysis and entropy-based time-frequency (TF) analysis is introduced. The repetitive pulses from the converter are first removed via frequency analysis. Then, the relative entropy and relative peak frequency of each pulse (i.e., its time-indexed TF spectrum) are calculated, and all pulses with similar parameters are grouped. According to the characteristics of the non-intrusive sensor and the frequency distribution of PDs, the PD pulses and interference are separated. Finally the PD signal and interference are recovered via the inverse TF transform. The de-noised result on noisy PD data demonstrates that the combination of frequency and time-frequency techniques can discriminate PDs from interference with various frequency distributions.
Keywords: Entropy, Fourier analysis, non-intrusive measurement, time-frequency analysis, partial discharge.
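A minimal sketch of the pulse-feature step is shown below: compute a time-frequency spectrum per pulse, then derive a spectral entropy and a peak frequency so that similar pulses can be grouped. The sampling rate and window length are assumptions, and this plain spectral entropy may differ from the paper's "relative" entropy definition.

```python
# A minimal sketch of the pulse-feature step: a TF spectrum per pulse,
# from which an entropy and a peak frequency are derived for grouping.
import numpy as np
from scipy.signal import stft

fs = 10e6                                    # assumed sampling rate, Hz

def pulse_features(pulse):
    f, t, Z = stft(pulse, fs=fs, nperseg=64)
    P = np.abs(Z) ** 2
    p = P.sum(axis=1)                        # marginal spectrum over time
    p = p / p.sum()                          # normalise to a distribution
    entropy = -np.sum(p * np.log(p + 1e-12)) # spectral entropy of the pulse
    peak_freq = f[np.argmax(p)]              # dominant frequency of the pulse
    return entropy, peak_freq

pulse = np.random.randn(512)                 # placeholder for an extracted pulse
print(pulse_features(pulse))
```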
13085 Time-Domain Analysis of Pulse Parameters Effects on Crosstalk (In High Speed Circuits)
Authors: L. Tani, N. El Ouzzani
Abstract:
Crosstalk among interconnects and printed-circuit board (PCB) traces is a major limiting factor of signal quality in high-speed digital and communication equipment, especially when fast data buses are involved. Such a bus is considered as a planar multiconductor transmission line. This paper demonstrates how the finite difference time domain (FDTD) method provides an exact solution of the transmission-line equations for analysing near-end and far-end crosstalk. In addition, the study makes it possible to analyse the effect of rise time on the near-end and far-end voltages of the victim conductor. The paper also presents a statistical analysis based on a set of several simulations. Such analysis leads to a better understanding of the phenomenon and yields useful information.
Keywords: Multiconductor transmission line, Crosstalk, Finite difference time domain (FDTD), Printed-circuit board (PCB), Rise time, Statistical analysis.
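For readers unfamiliar with the method, a minimal FDTD sketch for two coupled lossless lines is given below. The leapfrog loop discretises the multiconductor telegrapher's equations dV/dx = -L dI/dt and dI/dx = -C dV/dt; the per-unit-length L and C matrices, the rise time, and the crude short-circuit terminations are illustrative assumptions, not the paper's configuration.

```python
# A minimal 1-D FDTD sketch for two coupled lossless lines, showing how
# near-end crosstalk on the victim line responds to a finite rise time.
# L/C values and terminations are illustrative only (ends act as shorts;
# real designs would use matched resistive loads).
import numpy as np

nx, nt = 200, 2000
dx = 1e-3                                   # spatial step, m
L = np.array([[250e-9, 40e-9],              # per-unit-length inductance, H/m
              [40e-9, 250e-9]])             # off-diagonal terms couple the lines
C = np.array([[100e-12, -10e-12],           # per-unit-length capacitance, F/m
              [-10e-12, 100e-12]])
Linv, Cinv = np.linalg.inv(L), np.linalg.inv(C)
eig = np.linalg.eigvals(L @ C).real
dt = 0.9 * dx * np.sqrt(eig.min())          # CFL-limited time step

V = np.zeros((2, nx))                       # node voltages (aggressor, victim)
I = np.zeros((2, nx - 1))                   # branch currents (staggered grid)
t_rise = 200 * dt                           # illustrative source rise time
near_end_xtalk = []

for n in range(nt):
    V[0, 0] = min(n * dt / t_rise, 1.0)     # ramp source on the aggressor line
    I -= dt / dx * (Linv @ (V[:, 1:] - V[:, :-1]))
    V[:, 1:-1] -= dt / dx * (Cinv @ (I[:, 1:] - I[:, :-1]))
    near_end_xtalk.append(V[1, 1])          # victim voltage at the near end

print(max(near_end_xtalk))                  # peak near-end crosstalk estimate
```

Sweeping t_rise in this loop reproduces the qualitative rise-time study: shorter rise times inject more high-frequency energy and raise the coupled voltage on the victim line.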
13084 Delay Analysis of Sampled-Data Systems in Hard RTOS
Authors: A. M. Azad, M. Alam, C. M. Hussain
Abstract:
In this paper, we present the effect of varying time delays on the performance and stability of a single-channel multi-rate sampled-data system in a hard real-time (RT-Linux) environment. The sampling task requires response times that might exceed the capacity of RT-Linux, so a straightforward implementation is not feasible because of system latency; hence, the sampling period should be kept short enough to handle this task. The best sampling rate chosen for the sampled-data system is the slowest rate that meets all performance requirements. RT-Linux is consistent with its specifications, and a real-time resolution of 0.01 seconds is used to achieve efficient results. The results of our laboratory experiment show that the multi-rate control technique in a hard real-time operating system (RTOS) can mitigate the stability problems caused by random access delays and loss of synchronization.
Keywords: Multi-rate, PID, RT-Linux, Sampled-data, Servo.
13083 Bridge Health Monitoring: A Review
Authors: Mohammad Bakhshandeh
Abstract:
Structural Health Monitoring (SHM) is a crucial and necessary practice that plays a vital role in ensuring the safety and integrity of critical structures, and in particular, bridges. The continuous monitoring of bridges for signs of damage or degradation through Bridge Health Monitoring (BHM) enables early detection of potential problems, allowing for prompt corrective action to be taken before significant damage occurs. Although all monitoring techniques aim to provide accurate and decisive information regarding the remaining useful life, safety, integrity, and serviceability of bridges, understanding the development and propagation of damage is vital for maintaining uninterrupted bridge operation. Over the years, extensive research has been conducted on BHM methods, and experts in the field have increasingly adopted new methodologies. In this article, we provide a comprehensive exploration of the various BHM approaches, including sensor-based, non-destructive testing (NDT), model-based, and artificial intelligence (AI)-based methods. We also discuss the challenges associated with BHM, including sensor placement and data acquisition, data analysis and interpretation, cost and complexity, and environmental effects, through an extensive review of relevant literature and research studies. Additionally, we examine potential solutions to these challenges and propose future research ideas to address critical gaps in BHM.
Keywords: Structural health monitoring, bridge health monitoring, sensor-based methods, machine-learning algorithms, model-based techniques, sensor placement, data acquisition, data analysis.
13082 Elemental Graph Data Model: A Semantic and Topological Representation of Building Elements
Authors: Yasmeen A. S. Essawy, Khaled Nassar
Abstract:
With the rapid increase of complexity in the building industry, professionals in the A/E/C industry were forced to adopt Building Information Modeling (BIM) in order to enhance communication between the different project stakeholders throughout the project life cycle and create a semantic object-oriented building model that can support geometric-topological analysis of building elements during design and construction. This paper presents a model that extracts topological relationships and geometrical properties of building elements from an existing fully designed BIM, and maps this information into a directed acyclic Elemental Graph Data Model (EGDM). The model incorporates BIM-based search algorithms for automatic deduction of geometrical data and topological relationships for each building element type. Using graph algorithms, such as Depth First Search (DFS) and topological sorting, all possible construction sequences can be generated and compared against production and construction rules to generate an optimized construction sequence and its associated schedule (a minimal illustration follows the keywords below). The model is implemented in C#.
Keywords: Building information modeling, elemental graph data model, geometric and topological data models, graph theory.
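As the abstract notes, once elements and their precedence relations form a DAG, topological sorting yields feasible construction sequences. The paper's implementation is in C#; the sketch below uses Python for brevity, with invented element names.

```python
# A minimal sketch of the graph step: building elements and their support
# relationships form a DAG, and a topological sort yields one feasible
# construction sequence. Element names are illustrative.
from graphlib import TopologicalSorter   # Python 3.9+

# element -> set of elements that must be built before it
egdm = {
    "footing": set(),
    "column": {"footing"},
    "beam": {"column"},
    "slab": {"beam", "column"},
}
order = list(TopologicalSorter(egdm).static_order())
print(order)   # e.g. ['footing', 'column', 'beam', 'slab']
```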
13081 Constructing a Bayesian Network for Solar Energy in Egypt Using Life Cycle Analysis and Machine Learning Algorithms
Authors: Rawaa H. El-Bidweihy, Hisham M. Abdelsalam, Ihab A. El-Khodary
Abstract:
In an era where machines run and shape our world, the need for a stable, unending source of energy emerges. This study focuses on solar energy in Egypt as a renewable source. The most important factors that could affect solar energy's market share throughout its life cycle were analyzed and filtered, and the relationships between them were derived before structuring a Bayesian network. Forecasting models were also built for multiple factors to predict their states in Egypt by 2035, based on historical data and patterns, for use as the nodes' states in the network. 37 factors were found that might have an impact on the use of solar energy; these were reduced to the 12 factors judged most relevant to the solar energy life cycle in Egypt, based on expert surveys and data analysis, with some factors found to recur in multiple stages. The presented Bayesian network could later be used for scenario and decision analysis of using solar energy in Egypt as a stable renewable source for generating any type of energy needed.
Keywords: ARIMA, auto correlation, Bayesian network, forecasting models, life cycle, partial correlation, renewable energy, SARIMA, solar energy.
13080 Analytical Slope Stability Analysis Based on the Statistical Characterization of Soil Shear Strength
Authors: Bernardo C. P. Albuquerque, Darym J. F. Campos
Abstract:
Our ability to solve complex engineering problems is directly related to the processing capacity of computers, which allow numerical algorithms to be run quickly and accurately. Besides the increasing interest in numerical simulations, probabilistic approaches are also of great importance, and statistical tools have shown their relevance to the modelling of practical engineering problems. In general, statistical approaches to such problems assume that the random variables involved follow a normal distribution. This assumption tends to provide incorrect results when skewed data are present, since normal distributions are symmetric about their means. Thus, in order to visualize and quantify this aspect, 9 statistical distributions (symmetric and skewed) were considered to model a hypothetical slope stability problem. The data modelled are the friction angle of a superficial soil in Brasilia, Brazil. Despite its apparent universality, the normal distribution did not qualify as the best fit. In the present effort, data obtained from consolidated-drained triaxial tests and saturated direct shear tests were modelled and used to analytically derive the probability density function (PDF) of the safety factor of a hypothetical slope based on the Mohr-Coulomb rupture criterion. Based on this analysis, it is possible to explicitly derive the failure probability considering the friction angle as a random variable, and to compare the stability analysis when the friction angle is modelled as a Dagum distribution (the distribution that presented the best fit to the histogram) and as a normal distribution. This comparison leads to relevant differences when analyzed in light of risk management.
Keywords: Statistical slope stability analysis, Skew distributions, Probability of failure, Functions of random variables.
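A Monte Carlo sketch of the same idea is given below: sample the friction angle from a fitted skewed distribution, push it through a factor-of-safety expression, and estimate P(FS < 1). The cohesionless infinite-slope formula FS = tan(phi)/tan(beta) and all parameter values are illustrative simplifications of the paper's analytical derivation; scipy's burr is the Burr Type III family, to which the Dagum distribution belongs.

```python
# A minimal Monte Carlo sketch: friction angle as a random variable,
# pushed through a simplified factor-of-safety expression to estimate the
# failure probability P(FS < 1). All values are illustrative placeholders,
# not the paper's fitted parameters or exact slope model.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 100_000
beta = np.radians(30)                       # slope inclination (assumed)

# Burr Type III (the Dagum family in scipy); shape/scale are placeholders
phi_deg = stats.burr.rvs(c=6, d=2, scale=30, size=n, random_state=rng)
phi = np.radians(phi_deg)

fs = np.tan(phi) / np.tan(beta)             # cohesionless infinite slope
print("P(failure) ~", np.mean(fs < 1.0))
```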
13079 An SVM based Classification Method for Cancer Data using Minimum Microarray Gene Expressions
Authors: R. Mallika, V. Saravanan
Abstract:
This paper presents a novel method for improving classification performance in cancer classification with very few microarray gene expression data. The method employs classification with individual gene ranking and gene subset ranking, using the same classifier for both selection and classification. The method is applied to three publicly available cancer gene expression datasets: Lymphoma, Liver and Leukaemia. Three different classifiers, namely Support Vector Machines one-against-all (SVM-OAA), K-nearest neighbour (KNN) and Linear Discriminant Analysis (LDA), were tested, and the results indicate improved performance of the SVM-OAA classifier, with satisfactory results on all three datasets when compared with the other two classifiers.
Keywords: Support vector machines one-against-all, cancer classification, Linear Discriminant Analysis, K-nearest neighbour, microarray gene expression, gene pair ranking.
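The two-stage idea (rank genes, keep a small subset, classify with one-vs-rest SVMs) can be sketched as below. The data are random placeholders, and the F-test ranking is a generic stand-in for the paper's classifier-based ranking.

```python
# A minimal sketch of the two-stage idea: rank genes with a simple filter,
# keep a small subset, then classify with one-vs-rest SVMs. Shapes and the
# scoring function are illustrative stand-ins for the paper's setup.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(72, 5000))          # samples x genes (placeholder data)
y = rng.integers(0, 2, size=72)          # class labels (placeholder)

X_small = SelectKBest(f_classif, k=20).fit_transform(X, y)   # gene ranking
clf = OneVsRestClassifier(SVC(kernel="linear"))
print(cross_val_score(clf, X_small, y, cv=5).mean())
```

Note that selecting genes before cross-validation, as in this compact sketch, leaks information; for an unbiased estimate the selection step should be refit inside each fold, e.g. via a sklearn Pipeline.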
13078 A Fuzzy Linear Regression Model Based on Dissemblance Index
Authors: Shih-Pin Chen, Shih-Syuan You
Abstract:
Fuzzy regression models are useful for investigating the relationship between explanatory variables and responses in fuzzy environments. To overcome the deficiencies of previous models and increase the explanatory power of fuzzy data, the graded mean integration (GMI) representation is applied to determine representative crisp regression coefficients. A fuzzy regression model is then constructed based on the modified dissemblance index (MDI), which can precisely measure the actual total error. Based on the proposed MDI and a distance criterion, results from commonly used test examples show that the proposed fuzzy linear regression model has higher explanatory power and forecasting accuracy than previous studies.
Keywords: Dissemblance index, fuzzy linear regression, graded mean integration, mathematical programming.
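As an illustration of the GMI step, the sketch below defuzzifies triangular fuzzy observations with the standard graded mean integration representation P(a, b, c) = (a + 4b + c)/6 and fits crisp coefficients by least squares. The fuzzy data are invented, and the paper's MDI-based model goes further than this simple fit.

```python
# A minimal sketch of the GMI step: defuzzify triangular fuzzy numbers with
# P(a, b, c) = (a + 4b + c) / 6, then fit crisp regression coefficients by
# least squares. Observations below are illustrative.
import numpy as np

def gmi(tfn):
    """Graded mean integration of a triangular fuzzy number (a, b, c)."""
    a, b, c = tfn
    return (a + 4 * b + c) / 6.0

# fuzzy observations: (lower, mode, upper)
X_fuzzy = [(1, 2, 3), (2, 3, 5), (4, 5, 6), (6, 7, 9)]
Y_fuzzy = [(3, 4, 6), (5, 6, 7), (8, 9, 11), (11, 13, 14)]

x = np.array([gmi(t) for t in X_fuzzy])
y = np.array([gmi(t) for t in Y_fuzzy])

A = np.c_[np.ones_like(x), x]               # design matrix for y = b0 + b1*x
b0, b1 = np.linalg.lstsq(A, y, rcond=None)[0]
print(f"y = {b0:.3f} + {b1:.3f} x")
```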
13077 Social Network Analysis & Information Disclosure: A Case Study
Authors: Shilpi Sharma, J. S. Sodhi
Abstract:
The advent of social networking technologies has been met with mixed reactions in academic and corporate circles around the world. This study explored the influence of social networks in the current era and the relationship maintained between a social networking site and its users, in terms of extent of use, benefits and the latest technologies. The study followed a descriptive research design in which a questionnaire was used as the main research tool. The data collected were analyzed using SPSS 16. Data were gathered from 1205 users and analyzed in accordance with the objectives of the study. The analysis of the results suggests that the majority of users were mainly using Facebook, and that despite concerns raised about the disclosure of personal information on social network sites, users continue to disclose large quantities of personal information; they find that reading privacy policies is time consuming and that changes they make can result in improper settings.
Keywords: Social Networking Sites, Privacy Policy, Disclosure of Personal Information.
13076 Scale Development for Measuring E-Service Quality in Banking
Authors: Vivek Agrawal, Vikas Tripathi, Nitin Seth
Abstract:
This study examines several critical dimensions of e-service quality overlooked in the existing literature and proposes a model and instrument framework for measuring customer-perceived e-service quality in the banking sector. The initial design was derived from a pool of instrument dimensions and items identified in the literature by content analysis. Based on a focus group discussion, nine dimensions were extracted. An exploratory factor analysis approach was applied to data from a survey of 323 respondents. The instrument has been designed specifically for the banking sector; research data were collected from bank customers who use electronic banking in a developing economy. A nine-factor instrument is proposed to measure e-service quality, and it has been checked for reliability. The validity checks and the sampling location limit the applicability of the instrument across economies and service categories, so future research must be conducted to verify its validity. This instrument can help bankers in developing economies like India to measure e-service quality and make improvements. The present study offers a systematic procedure that provides insights into the conceptual and empirical comprehension of customer-perceived e-service quality and its constituents.
Keywords: Testing, instrument, e-service quality, factor analysis.
13075 Modeling and Design of MPPT Controller Using Stepped P&O Algorithm in Solar Photovoltaic System
Authors: R. Prakash, B. Meenakshipriya, R. Kumaravelan
Abstract:
This paper presents the modeling and simulation of a grid-connected photovoltaic (PV) system using an improved mathematical model. The model is used to study the effects of different parameter variations on the PV array, including operating temperature and solar irradiation level. A stepped P&O algorithm is proposed for MPPT control. This algorithm identifies the suitable duty ratio at which the DC-DC converter should be operated to maximize the power output. A photovoltaic array with the proposed stepped P&O MPPT controller can operate at the maximum power point over the whole range of solar data (irradiance and temperature).
Keywords: Photovoltaic (PV), Maximum Power Point Tracking (MPPT), Boost converter, Stepped Perturb & Observe method (Stepped P&O).
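The duty-ratio update at the heart of a P&O MPPT loop can be sketched as below. This is a minimal illustration, not the paper's exact algorithm: the variable step rule (larger steps when the power change is large, smaller near the MPP) is one plausible reading of the "stepped" refinement, and the perturbation sign convention depends on the converter topology.

```python
# A minimal sketch of a P&O duty-ratio update. The two-level step size is
# an assumed reading of the 'stepped' refinement; sign conventions depend
# on the DC-DC converter topology.
def perturb_and_observe(v, i, state):
    """One MPPT iteration on a (voltage, current) sample; returns new duty."""
    p = v * i
    dp, dv = p - state["p"], v - state["v"]
    step = 0.02 if abs(dp) > state["p_tol"] else 0.005   # stepped step size
    if dp != 0:
        # keep perturbing in the direction that increased power
        direction = 1 if (dp > 0) == (dv > 0) else -1
        state["d"] = min(max(state["d"] + direction * step, 0.0), 1.0)
    state["p"], state["v"] = p, v
    return state["d"]

state = {"d": 0.5, "p": 0.0, "v": 0.0, "p_tol": 1.0}
duty = perturb_and_observe(v=30.1, i=5.2, state=state)   # sample PV reading
print(duty)
```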
13074 A Model-Reference Sliding Mode for Dual-Stage Actuator Servo Control in HDD
Authors: S. Sonkham, U. Pinsopon, W. Chatlatanagulchai
Abstract:
This paper presents the design and development of a sliding mode control (SMC) method for the servo system of a dual-stage actuator (DSA) hard disk drive. Mathematical models of the hard disk drive actuators are obtained by measuring the frequency responses of the voice-coil motor (VCM) and the PZT micro-actuator separately. Matlab software tools are used for mathematical model estimation as well as for controller design and simulation. A model-reference approach is selected as the proposed technique for the tracking requirement. The simulation results show that the performance of the model-reference SMC controller in DSA servo control is satisfactory in terms of tracking error, keeping the position of the head within the boundary of +/-5% of the track width in the presence of internal and external disturbances. The overall results of the model-reference SMC design in the DSA meet the requirement specifications, and a significant reduction in percentage off-track is found compared to a single-stage actuator (SSA).
Keywords: Hard Disk Drive, Dual-Stage Actuator, Track Following, HDD Servo Control, Sliding Mode Control, Model-Reference, Tracking Control.
13073 Preliminary Overview of Data Mining Technology for Knowledge Management System in Institutions of Higher Learning
Authors: Muslihah Wook, Zawiyah M. Yusof, Mohd Zakree Ahmad Nazri
Abstract:
Data mining has been integrated into application systems to enhance the quality of the decision-making process. This study focuses on the integration of data mining technology and Knowledge Management Systems (KMS), owing to the ability of data mining technology to create useful knowledge from large volumes of data, while KMS vitally supports the creation and use of knowledge. The integration of data mining technology and KMS is popular in business for enhancing and sustaining organizational performance. However, there is a lack of studies that apply data mining technology and KMS in the education sector, particularly to students' academic performance, even though this could reflect the performance of Institutions of Higher Learning (IHLs). Realizing its importance, this study seeks to integrate data mining technology and KMS to promote effective management of knowledge within IHLs. Several concepts from the literature are adapted to propose a new integrative data mining technology and KMS framework for an IHL.
Keywords: Data mining, Institutions of Higher Learning, Knowledge Management System, Students' academic performance.
13072 Application of Data Envelopment Analysis and Performance Indicators to Irrigation Systems in Thessaloniki Plain (Greece)
Authors: Ntantos P. N., Karpouzos D. K.
Abstract:
In this paper, a benchmarking framework is presented for the performance assessment of irrigation systems. Firstly, data envelopment analysis (DEA) is applied to measure the technical efficiency of irrigation systems. This method, based on linear programming, aims to determine a consistent efficiency ranking of irrigation systems in which known inputs, such as water volume supplied and total irrigated area, and a given output corresponding to the total value of irrigation production are taken into account simultaneously. Secondly, in order to examine irrigation efficiency in more detail, a cross-system comparison is elaborated using a set of performance indicators selected by IWMI. These methodologies were applied to the Thessaloniki plain, located in northern Greece, and the results of the application are presented and discussed. The conjunctive use of DEA and performance indicators appears to be a very useful tool for efficiency assessment and for identifying best practices in the management of irrigation systems.
Keywords: Benchmarking, D.E.A, Performance Indicators, Irrigation systems.
13071 Energy Separation Mechanism in Uni-Flow Vortex Tube Using Compressible Vortex Flow
Authors: Hiroshi Katanoda, Mohd Hazwan bin Yusof
Abstract:
A theoretical investigation from the viewpoints of gas dynamics and thermodynamics was carried out in order to clarify the energy separation mechanism in a viscous compressible vortex, the primary flow element in a uni-flow vortex tube. The mathematical solutions for tangential velocity, density and temperature in a viscous compressible vortical flow were used in this study. It is shown that the total temperature in the vortex core falls well below that far from the core in the radial direction, creating a region of higher total temperature peripheral to the vortex core compared with the distant region.
Keywords: Energy separation mechanism, theoretical analysis, vortex tube, vortical flow.
13070 Freighter Aircraft Selection Using Entropic Programming for Multiple Criteria Decision Making Analysis
Authors: C. Ardil
Abstract:
This paper proposes entropic programming for the freighter aircraft selection problem using multiple criteria decision analysis. The study aims to provide a systematic and comprehensive framework focused on freighter aircraft selection. To this end, an integrated entropic programming approach is proposed to evaluate and rank the alternatives. The decision criteria and aircraft alternatives were identified from the research data analysis. The objective criteria weights were determined by the mean weight method and the standard deviation method. The proposed entropic programming model was applied to a practical decision problem of evaluating and selecting freighter aircraft. The technique gives robust, reliable, and efficient results in modeling decision-making analysis problems. As a result of the entropic programming analysis, the Boeing B747-8F, a freighter aircraft alternative (a3), was chosen as the most suitable candidate.
Keywords: Entropic programming, additive weighted model, multiple criteria decision making analysis, MCDMA, TOPSIS, aircraft selection, freighter aircraft, Boeing B747-8F, Boeing B777F, Airbus A350F.
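A minimal sketch of entropy-based criteria weighting followed by an additive weighted score, in the spirit of the entropic programming and additive weighted model named above, is shown below. The 3x4 decision matrix (three aircraft, four benefit criteria) holds invented numbers and does not reproduce the paper's data.

```python
# A minimal sketch: objective entropy weights per criterion, then an
# additive weighted score per alternative. Matrix values are invented.
import numpy as np

X = np.array([[7.0, 9.0, 6.0, 8.0],    # a1
              [8.0, 7.0, 7.0, 6.0],    # a2
              [9.0, 8.0, 9.0, 9.0]])   # a3 (e.g. the B747-8F alternative)

P = X / X.sum(axis=0)                              # column-normalised matrix
E = -(P * np.log(P)).sum(axis=0) / np.log(len(X))  # entropy per criterion
w = (1 - E) / (1 - E).sum()                        # entropy weights

scores = (X / X.max(axis=0)) @ w                   # additive weighted scores
print("weights:", np.round(w, 3))
print("ranking:", np.argsort(-scores) + 1)         # best alternative first
```

Criteria with more dispersion across alternatives carry lower entropy and therefore higher weight, which is what makes the weighting "objective" rather than judgment-based.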
13069 Towards a Secure Storage in Cloud Computing
Authors: Mohamed Elkholy, Ahmed Elfatatry
Abstract:
Cloud computing has emerged as a flexible computing paradigm that has reshaped the Information Technology map. However, cloud computing brings a number of security challenges as a result of the physical distribution of computational resources and the limited control that users have over the physical storage. This situation raises many security challenges for data integrity and confidentiality as well as authentication and access control. This work proposes a security mechanism for data integrity that allows a data owner to be aware of any modification that takes place to his data. The data integrity mechanism is integrated with an extended Kerberos authentication that ensures authorized access control. The proposed mechanism protects data confidentiality even if data are stored on untrusted storage. It has been evaluated against different types of attacks and proved efficient in protecting cloud data storage from malicious attacks.
Keywords: Access control, data integrity, data confidentiality, Kerberos authentication, cloud security.
13068 Dynamic Metadata Schemes in the Neutron and Photon Science Communities: A Case Study of X-Ray Photon Correlation Spectroscopy
Authors: Amir Tosson, Mohammad Reza, Christian Gutt
Abstract:
Metadata is one of the most important aspects of advancing data management practices within all research communities. Definitions and schemes of metadata are, inter alia, of particular significance in the domain of neutron and photon scattering experiments, which cover a broad range of scientific disciplines. The demand for describing continuously evolving, highly non-standardized experiments, including the resulting processed and published data, poses a considerable challenge for any static definition of metadata. Here, we present the concept of dynamic metadata for the neutron and photon scientific community, which enriches a static set of defined basic metadata. We explore the idea of dynamic metadata through the use case of X-ray Photon Correlation Spectroscopy (XPCS), a synchrotron-based scattering technique that allows the investigation of nanoscale dynamic processes; it serves here as a demonstrator of how dynamic metadata can improve data acquisition, sharing, and analysis workflows. Our approach enables researchers to tailor metadata definitions dynamically and adapt them to the evolving demands of describing data and results from a diverse set of experiments. We demonstrate that dynamic metadata standards yield advantages that enhance data reproducibility, interoperability, and the dissemination of knowledge.
Keywords: Big data, metadata, schemas, XPCS, X-ray Photon Correlation Spectroscopy.
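One way to picture the dynamic-metadata idea is sketched below: a static base schema is extended at run time with technique-specific fields for an XPCS record. The field names, values, and the use of JSON Schema are illustrative assumptions, not the community's actual schema definitions.

```python
# A minimal sketch of the idea: a static base schema every experiment must
# satisfy, extended at run time with technique-specific (dynamic) fields,
# here for a hypothetical XPCS record. All field names are illustrative.
import jsonschema

BASE_SCHEMA = {
    "type": "object",
    "properties": {
        "facility": {"type": "string"},
        "technique": {"type": "string"},
        "sample": {"type": "string"},
    },
    "required": ["facility", "technique", "sample"],
    "additionalProperties": True,          # leave room for dynamic fields
}

def extend_schema(base, dynamic_properties):
    """Merge technique-specific metadata definitions into the base schema."""
    return {**base, "properties": {**base["properties"], **dynamic_properties}}

xpcs_schema = extend_schema(BASE_SCHEMA, {
    "q_range_nm^-1": {"type": "array", "items": {"type": "number"}},
    "exposure_time_s": {"type": "number"},
})

record = {"facility": "PETRA III", "technique": "XPCS", "sample": "silica",
          "q_range_nm^-1": [0.01, 0.1], "exposure_time_s": 0.005}
jsonschema.validate(record, xpcs_schema)   # raises if the record is invalid
print("record valid")
```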
13067 A Simple Heat and Mass Transfer Model for Salt Gradient Solar Ponds
Authors: Safwan Kanan, Jonathan Dewsbury, Gregory Lane-Serff
Abstract:
A salinity gradient solar pond is a free-energy-source system for collecting, converting and storing solar energy as heat. In this paper, the principles of the solar pond are explained, and a mathematical model is developed to describe and simulate the heat and mass transfer behaviour of a salinity gradient solar pond. MATLAB code is written to solve the one-dimensional heat and mass transfer equations using the finite difference method. Temperature profiles and concentration distributions are calculated. The numerical results are validated against experimental data and are found to be in good agreement.
Keywords: Finite difference method, Salt-gradient solar pond, Solar energy, Transient heat and mass transfer.
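The numerical core described above can be sketched as an explicit finite-difference update for 1-D diffusion; the same stencil applies to both the heat and the salt-transport equations. The diffusivity, grid, source term and boundary conditions below are illustrative, not the paper's calibrated values.

```python
# A minimal sketch of the numerical core: an explicit finite-difference
# update for 1-D diffusion with a source term. Layer properties, the solar
# absorption source, and boundary conditions are illustrative only.
import numpy as np

nz, nt = 100, 5000
dz, dt = 0.02, 10.0                     # grid spacing (m) and time step (s)
alpha = 1.5e-7                          # thermal diffusivity, m^2/s
assert alpha * dt / dz**2 <= 0.5        # explicit-scheme stability limit

T = np.full(nz, 25.0)                   # initial temperature profile, deg C
q = np.zeros(nz)
q[nz // 2] = 1e-5                       # crude solar absorption term, K/s

for n in range(nt):
    T[1:-1] += alpha * dt / dz**2 * (T[2:] - 2 * T[1:-1] + T[:-2]) + dt * q[1:-1]
    T[0], T[-1] = T[1], 25.0            # zero-flux top, fixed bottom (illustrative BCs)

print(T.max())                          # peak storage-zone temperature
```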
13066 A Software Tool Design for Cerebral Infarction of MR Images
Authors: Kyoung-Jong Park, Woong-Gi Jeon, Hee-Cheol Kim, Dong-Eog Kim, Heung-Kook Choi
Abstract:
A clinical research and analysis system based on brain MR imaging was specifically built, targeting the development of large-scale data and using the general clinical data available. A region growing algorithm was used for the selection of the lesion ROI during the registration stage, a mesh-warp algorithm was implemented for matching, and the matching errors were corrected individually for accuracy. In addition, large amounts of ROI research data can be accumulated with the compression method we developed, suggesting sound decision criteria for the research results. The experimental groups were age, sex, MR type, patient ID and smoking, which can easily be queried. The result data were visualized as overlapped images using a color table and processed with a statistical package. The utility of the system for chronic ischemic damage was evaluated on patients with acute cerebral infarction; the location associated with the neurologic disability index was found in the corona radiata, in the central portion facing the lateral ventricle. Finally, system reliability was measured by both inter-user and intra-user registration correlation.
Keywords: Software tool design, Cerebral infarction, Brain MR image, Registration
13065 Banks Profitability Indicators in CEE Countries
Abstract:
The aim of the present article is to determine the impact of external and internal factors of bank performance on the profitability indicators of banks in the CEE countries in the period from 2006 to 2012. Building on research conducted abroad on bank-specific and macroeconomic profitability indicators, the authors evaluated the return on average assets (ROAA) and return on average equity (ROAE) indicators of the CEE countries' banks. The authors analyzed the profitability indicators using descriptive methods, SPSS data analysis methods, and correlation and linear regression analysis. The authors concluded that most internal and external indicators of bank performance have no direct influence on the profitability of banks in the CEE countries. The only exceptions are credit risk and bank size, which affect one of the measures of bank profitability: return on average equity.
Keywords: Banks, CEE countries, Profitability, ROAA, ROAE.
13064 1/Sigma Term Weighting Scheme for Sentiment Analysis
Authors: Hanan Alshaher, Jinsheng Xu
Abstract:
Large amounts of data on the web can provide valuable information; for example, product reviews help business owners measure customer satisfaction. Sentiment analysis classifies texts into two polarities: positive and negative. This paper examines movie reviews and tweets using a new term weighting scheme, called one-over-sigma (1/sigma), on benchmark datasets for sentiment classification. The proposed method aims to improve the performance of sentiment classification, and the results show that 1/sigma is more accurate than the popular term weighting schemes. In order to verify whether entropy reflects the discriminating power of terms, we also report a comparison of entropy values for the different term weighting schemes.
Keywords: Sentiment analysis, term weighting scheme, 1/sigma.