Search results for: CAP mining and modelling
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1139

269 Decision Trees for Predicting Risk of Mortality using Routinely Collected Data

Authors: Tessy Badriyah, Jim S. Briggs, Dave R. Prytherch

Abstract:

Logistic Regression is widely regarded as the gold standard method for predicting clinical outcomes, especially risk of mortality. In this paper, the Decision Tree method is proposed as an alternative for problems that are commonly solved with Logistic Regression. The Biochemistry and Haematology Outcome Model (BHOM) dataset, obtained from Portsmouth NHS Hospital for the period 1 January to 31 December 2001, was divided into four subsets. One subset was used as training data to generate a model, and the resulting model was then applied to the three remaining test datasets. The performance of each model from both methods was compared using calibration (the χ² test) and discrimination (area under the ROC curve, or c-index). The experiments showed that both methods give reasonable results in terms of the c-index; however, in some cases the calibration statistic (χ²) was quite high. After conducting the experiments and weighing the advantages and disadvantages of each method, we conclude that Decision Trees can be seen as a worthy alternative to Logistic Regression in the area of Data Mining.
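
The comparison described above can be reproduced in outline with scikit-learn. The sketch below is a minimal illustration rather than the authors' code: it assumes a generic tabular dataset with a binary outcome, trains both classifiers, and evaluates discrimination with the c-index (ROC AUC) and calibration with a Hosmer-Lemeshow style chi-squared statistic computed over risk deciles.

    # Minimal sketch of the Decision Tree vs. Logistic Regression comparison.
    # Assumes a generic binary-outcome dataset; not the BHOM data used in the paper.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    def hosmer_lemeshow_chi2(y_true, y_prob, groups=10):
        """Calibration chi-squared over risk deciles (illustrative)."""
        order = np.argsort(y_prob)
        chi2 = 0.0
        for idx in np.array_split(order, groups):
            obs = y_true[idx].sum()
            exp = y_prob[idx].sum()
            n = len(idx)
            exp = min(max(exp, 1e-9), n - 1e-9)   # guard against division by zero
            chi2 += (obs - exp) ** 2 / (exp * (1 - exp / n))
        return chi2

    X, y = make_classification(n_samples=4000, n_features=12, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.75, random_state=0)

    for name, model in [("Decision Tree", DecisionTreeClassifier(max_depth=5, random_state=0)),
                        ("Logistic Regression", LogisticRegression(max_iter=1000))]:
        p = model.fit(X_tr, y_tr).predict_proba(X_te)[:, 1]
        print(name, "c-index:", round(roc_auc_score(y_te, p), 3),
              "chi2:", round(hosmer_lemeshow_chi2(y_te, p), 1))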

Keywords: Decision Trees, Logistic Regression, clinical outcome, risk of mortality.

268 The Development of the Multi-Agent Classification System (MACS) in Compliance with FIPA Specifications

Authors: Mohamed R. Mhereeg

Abstract:

The paper investigates the feasibility of constructing a software multi-agent based monitoring and classification system and utilizing it to provide an automated and accurate classification of end users developing applications in the spreadsheet domain. The agents function autonomously to provide continuous and periodic monitoring of Excel spreadsheet workbooks, resulting in the development of the Multi-Agent Classification System (MACS), which complies with the specifications of the Foundation for Intelligent Physical Agents (FIPA). Different technologies have been brought together to build MACS. The strength of the system is the integration of agent technology and the FIPA specifications with other technologies, namely Windows Communication Foundation (WCF) services, Service Oriented Architecture (SOA), and Oracle Data Mining (ODM). Microsoft's .NET Windows service based agents were used to develop the monitoring agents of MACS, while the .NET WCF services together with the SOA approach allowed the distribution of, and communication between, agents over the WWW in order to support monitoring and classification across multiple developers. ODM was used to automate the classification phase of MACS.

Keywords: Autonomous, Classification, MACS, Multi-Agent, SOA, WCF.

267 Towards an Intelligent Ontology Construction Cost Estimation System: Using BIM and New Rules of Measurement Techniques

Authors: F. H. Abanda, B. Kamsu-Foguem, J. H. M. Tah

Abstract:

Construction cost estimation is one of the most important aspects of construction project design. For generations, the process of cost estimating has been manual, time-consuming and error-prone. This has partly led to most cost estimates being unclear and riddled with inaccuracies that at times lead to over- or underestimation of construction cost. The development of standard sets of measurement rules that are understandable by all those involved in a construction project has not totally solved these challenges. Emerging Building Information Modelling (BIM) technologies can exploit standard measurement methods to automate the cost estimation process and improve accuracy. This requires standard measurement methods to be structured in an ontological and machine-readable format so that BIM software packages can easily read them. Most standard measurement methods are still text-based in textbooks and require manual transfer into tables or spreadsheets during cost estimation. The aim of this study is to explore the development of an ontology based on the New Rules of Measurement (NRM) commonly used in the UK for cost estimation. The methodology adopted is Methontology, one of the most widely used ontology engineering methodologies. The challenges encountered in this exploratory study are also reported and recommendations for future studies proposed.

Keywords: BIM, Construction projects, Cost estimation, NRM, Ontology.

266 Performance of Derna Steam Power Plant at Varying Super-Heater Operating Conditions Based on Exergy

Authors: Idris Elfeituri

Abstract:

In the current study, energy and exergy analyses of a 65 MW steam power plant were carried out. The study investigated the effect of variations in the overall conductance of the super-heater on the performance of an existing steam power plant located in Derna, Libya. The performance of the power plant was estimated by a mathematical model which considers the off-design operating conditions of each component. A fully interactive computer program based on the mass, energy and exergy balance equations has been developed. The maximum exergy destruction was found in the steam generation unit. A 50% reduction in the design value of the overall conductance of the super-heater was found to decrease the net electrical power generated by at least 13 MW and the overall plant exergy efficiency by at least 6.4%, while increasing the total exergy destruction by at least 14 MW. The results show that the super-heater design and operating conditions play an important role in the thermodynamic performance and fuel utilization of the power plant. Moreover, these considerations are very useful when deciding whether to replace or renovate the super-heater of the power plant.
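
As a pointer to the kind of balance used in such an analysis, the short sketch below evaluates the specific flow exergy of a steam stream and the exergy destroyed across a component from an entropy balance. The state values are illustrative assumptions, not data from the Derna plant model.

    # Specific flow exergy and exergy destruction: illustrative values only,
    # not taken from the Derna plant model.
    T0 = 298.15            # dead-state temperature, K
    h0, s0 = 104.9, 0.3672 # dead-state enthalpy (kJ/kg) and entropy (kJ/kg.K) of water (assumed)

    def flow_exergy(h, s):
        """psi = (h - h0) - T0*(s - s0); kinetic and potential terms neglected."""
        return (h - h0) - T0 * (s - s0)

    # Assumed super-heater inlet/outlet states (kJ/kg, kJ/kg.K)
    h_in, s_in = 2780.0, 6.05
    h_out, s_out = 3450.0, 6.80
    m_dot = 60.0           # steam mass flow rate, kg/s (assumed)
    T_source = 1200.0      # mean temperature of heat addition, K (assumed)

    q_in = h_out - h_in                          # heat added per kg of steam
    s_gen = (s_out - s_in) - q_in / T_source     # entropy generated per kg
    x_dest = m_dot * T0 * s_gen                  # exergy destruction rate, kW
    print("exergy in :", round(flow_exergy(h_in, s_in), 1), "kJ/kg")
    print("exergy out:", round(flow_exergy(h_out, s_out), 1), "kJ/kg")
    print("exergy destroyed:", round(x_dest, 1), "kW")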

Keywords: Exergy, super-heater, fouling, steam power plant, off-design.

265 Finite Element Application to Estimate In-Service Material Properties Using Miniature Specimen

Authors: G. Partheepan, D.K. Sehgal, R.K. Pandey

Abstract:

This paper presents a method for determining uniaxial tensile properties such as Young's modulus, yield strength and the flow behaviour of a material in a virtually non-destructive manner. To achieve this, a new dumb-bell shaped miniature specimen has been designed. This avoids the removal of large material samples from the in-service component for the evaluation of current material properties. The proposed miniature specimen also has an advantage in finite element modelling with respect to computational time and memory space. Test fixtures have been developed to enable tension tests on the miniature specimen in a testing machine. The studies have been conducted on a chromium (H11) steel and an aluminum alloy (AR66). The output from the miniature test, i.e. the load-elongation diagram, is obtained and the finite element simulation of the test is carried out using a 2D plane stress analysis. The results are compared with the experimental results. It is observed that the results from the finite element simulation corroborate well with the miniature test results. The approach appears to have the potential to predict the mechanical properties of materials, which could be used in remaining life estimation of various in-service structures.
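
For reference, the conversion of a measured load-elongation record into engineering stress-strain, followed by a least-squares estimate of Young's modulus from the initial linear region, can be sketched as follows; the specimen dimensions and data points are invented for illustration.

    # Engineering stress-strain from a load-elongation record (illustrative data).
    import numpy as np

    A0 = 3.0e-6    # initial cross-sectional area of the miniature gauge section, m^2 (assumed)
    L0 = 10.0e-3   # initial gauge length, m (assumed)

    load = np.array([0, 150, 300, 450, 600, 700])                  # N
    elong = np.array([0, 2.5e-6, 5.0e-6, 7.5e-6, 10e-6, 14e-6])    # m

    stress = load / A0          # Pa
    strain = elong / L0         # dimensionless

    # Fit Young's modulus on the initial (assumed linear) portion of the curve.
    linear = strain <= 1.0e-3
    E = np.polyfit(strain[linear], stress[linear], 1)[0]
    print("Estimated Young's modulus:", E / 1e9, "GPa")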

Keywords: ABAQUS, finite element, miniature test, tensile properties.

264 Bearing Capacity of Sheet Hanger Connection to the Trapezoidal Metal Sheet

Authors: Kateřina Jurdová

Abstract:

Hanging services from the trapezoidal sheet by means of a decking hanger is a widespread solution used in civil engineering to route energy distribution, sanitary services, air distribution systems, etc. under a roof or floor structure. The trapezoidal decking hanger is usually part of a complete installation system for a specific distribution medium. The leading companies offer installation systems for each specific distribution type, e.g. pipe rings, sprinkler systems, installation channels, etc. Every specific part is connected to the base connector, which is the decking hanger. The connection itself has three main components: the decking hanger, the threaded bar with nuts, and the web of the trapezoidal sheet. The aim of this contribution is to determine the failure mechanism of each component in the connection. The load bearing capacity of most components in the connection can be calculated by formulas in the European codes. This contribution is focused on the bearing resistance of the threaded bar in the web of the trapezoidal sheet. This issue is studied by experimental research and numerical modelling. The contribution presents the initial results of the experiment, which are compared with a numerical model of the specimen.

Keywords: Decking hanger, concentrated load, connection, load bearing capacity, trapezoidal metal sheet.

263 Using Data Mining Methodology to Build the Predictive Model of Gold Passbook Price

Authors: Chien-Hui Yang, Che-Yang Lin, Ya-Chen Hsu

Abstract:

The gold passbook is an investment tool that is especially suitable for investors who want to make small investments in physical gold. The gold passbook has lower risk than other ways of investing in gold, but its price is still affected by the gold price. However, many factors influence the gold price. Therefore, building a model to predict the price of the gold passbook can both reduce the risk of investment and increase the benefits. This study investigates the important factors that influence the gold passbook price and utilizes the Group Method of Data Handling (GMDH) to build the predictive model. This method can not only identify the significant variables but also performs well in prediction. The significant variables of the gold passbook price identified by GMDH are the US dollar exchange rate, international petroleum price, unemployment rate, wholesale price index, rediscount rate, foreign exchange reserves, misery index, prosperity coincident index and industrial index.

Keywords: Gold price, Gold passbook price, Group Method of Data Handling (GMDH), Regression.

262 Mathematical Analysis of Stock Prices Prediction in a Financial Market Using Geometric Brownian Motion Model

Authors: Edikan E. Akpanibah, Ogunmodimu Dupe Catherine

Abstract:

The relevance of geometric Brownian motion (GBM) in modelling the behaviour of stock market prices (SMP) cannot be overemphasized, taking into consideration the volatility of the SMP. Consequently, there is a need to investigate how GBM models are estimated and used in financial markets to predict SMP. To achieve this, GBM estimation and its application to the SMP of some selected companies are studied. The normal and log-normal distributions were used to determine the expected value, variance and covariance. Furthermore, the GBM model was used to predict the SMP of the selected companies over a period of time, and the mean absolute percentage error (MAPE) was calculated and used to determine the accuracy of the GBM model in predicting the SMP of the four companies under consideration. It was observed that for all four companies, the MAPE values were within the region of acceptance. The MAPE values of our data were also compared with existing literature to test the accuracy of our prediction with respect to time of investment. Finally, numerical simulations of the SMP, expectations and variances of the four companies over a period of time are presented using the MATLAB programming software.
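
A minimal version of this workflow (estimating drift and volatility from historical log-returns, simulating a GBM forecast, and scoring it with MAPE) is sketched below in Python, whereas the paper itself uses MATLAB; the price series here is synthetic.

    # GBM calibration, forecast and MAPE on a synthetic price series (illustration only).
    import numpy as np

    rng = np.random.default_rng(1)
    dt = 1.0 / 252                                   # daily step in trading years
    true_path = 100 * np.exp(np.cumsum((0.08 - 0.5 * 0.2**2) * dt
                                       + 0.2 * np.sqrt(dt) * rng.standard_normal(500)))

    train, test = true_path[:400], true_path[400:]
    log_ret = np.diff(np.log(train))
    sigma = log_ret.std(ddof=1) / np.sqrt(dt)        # annualized volatility estimate
    mu = log_ret.mean() / dt + 0.5 * sigma**2        # annualized drift estimate

    # Expected-value forecast E[S_t] = S_0 * exp(mu * t) over the test horizon.
    t = dt * np.arange(1, len(test) + 1)
    forecast = train[-1] * np.exp(mu * t)

    mape = np.mean(np.abs((test - forecast) / test)) * 100
    print(f"mu = {mu:.3f}, sigma = {sigma:.3f}, MAPE = {mape:.2f}%")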

Keywords: Stock Market, Geometric Brownian Motion, normal and log-normal distribution, mean absolute percentage error.

261 Modelling, Simulation and Validation of Plastic Zone Size during Deformation of Mild Steel

Authors: S. O. Adeosun, E. I. Akpan, S. A. Balogun, O. O. Taiwo

Abstract:

A model to predict the plastic zone size for a material under plane stress conditions has been developed and verified experimentally. The developed model is a function of crack size, crack angle and a material property (dislocation density). Simulation and validation results show that the model is in good agreement with experimental results. Samples of low carbon steel (0.035% C) with included surface crack angles of 45°, 50°, 60°, 70° and 90° and crack depths of 2 mm and 4 mm were subjected to low strain rates between 0.48 × 10⁻³ s⁻¹ and 2.38 × 10⁻³ s⁻¹. The mechanical properties studied were ductility, tensile strength, modulus of elasticity, yield strength, yield strain, stress at fracture and fracture toughness. The experimental study shows that strain rate has no appreciable effect on the size of the plastic zone, while crack depth and crack angle play an imperative role in determining the size of the plastic zone of mild steel materials.

Keywords: Applied stress, crack angle, crack size, material property, plastic zone size, strain rate.

260 Methodology for Developing an Intelligent Tutoring System Based on Marzano’s Taxonomy

Authors: Joaquin Navarro Perales, Ana Lidia Franzoni Velázquez, Francisco Cervantes Pérez

Abstract:

The Mexican educational system faces diverse challenges related to the quality and coverage of education. The development of Intelligent Tutoring Systems (ITS) may help to solve some of them by helping teachers to customize their classes according to the performance of the students in online courses. In this work, we propose the adaptation of a functional ITS based on Bloom's taxonomy, called Sistema de Apoyo Generalizado para la Enseñanza Individualizada (SAGE), to measure students' metacognition and their emotional response based on Marzano's taxonomy. The students and the system will share control over the advance through the course, so that students can improve their metacognitive skills. The system will not allow students to access subjects they have not yet mastered. The interaction between the system and the student will be implemented through Natural Language Processing techniques, thus avoiding the use of sensors to evaluate the students' responses. The teacher will evaluate the students' knowledge utilization, which is equivalent to the last cognitive level in Marzano's taxonomy.

Keywords: Intelligent tutoring systems, student modelling, metacognition, affective computing, natural language processing.

259 Developing a New Relationship between Undrained Shear Strength and Over-Consolidation Ratio

Authors: Wael M Albadri, Hassnen M Jafer, Ehab H Sfoog

Abstract:

The relationship between undrained shear strength (Su) and over-consolidation ratio (OCR) of clay soil (marine clay) is very important in geotechnical engineering for estimating the settlement behaviour of clay and for preparing small-scale physical modelling tests. In this study, the relationship between the shear strength and OCR parameters was determined using a laboratory vane shear apparatus and a fully automatic consolidation apparatus. The main objective was to establish a non-linear correlation formula between shear strength and OCR and to compare it with previous studies. To achieve this objective, three points were chosen from which 18 undisturbed samples were collected at depths increasing from 1.0 m to 3.5 m in 0.5 m increments. The clay samples were prepared under undrained conditions for both tests. It was found that the OCR and shear strength are inversely proportional at similar depths and under the same undrained conditions. A good correlation was obtained from the relationships, with R² values very close to 1.0 using polynomial equations. The comparison between the experimental results and previous equations from other researchers produced a non-linear correlation with a pattern similar to that of this study.
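
The curve-fitting step reported here, a polynomial regression of Su against OCR together with its R² value, can be reproduced generically as below; the data pairs are placeholders, not the measured marine clay values.

    # Second-order polynomial fit of undrained shear strength (Su) vs. OCR
    # with its coefficient of determination R^2 (placeholder data).
    import numpy as np

    ocr = np.array([1.2, 1.6, 2.0, 2.5, 3.1, 3.8])       # over-consolidation ratio (assumed)
    su  = np.array([42.0, 36.5, 31.0, 27.2, 24.1, 21.5]) # undrained shear strength, kPa (assumed)

    coeffs = np.polyfit(ocr, su, deg=2)        # Su ~ a*OCR^2 + b*OCR + c
    su_fit = np.polyval(coeffs, ocr)

    ss_res = np.sum((su - su_fit) ** 2)
    ss_tot = np.sum((su - su.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    print("coefficients:", np.round(coeffs, 3), " R^2 =", round(r2, 4))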

Keywords: Shear strength, over-consolidation ratio, vane shear test, clayey soil.

258 An Investigation on Overstrength Factor (Ω) of Reinforced Concrete Buildings in Turkish Earthquake Draft Code (TEC-2016)

Authors: M. Hakan Arslan, I. Hakkı Erkan

Abstract:

The overstrength factor is an important component of the load reduction factor. In this research, the overstrength factor (Ω) of reinforced concrete (RC) buildings and the parameters affecting Ω in the TEC-2016 draft version are explored. For this aim, 48 RC buildings were modelled according to the current seismic code TEC-2007 and the Turkish Building Code-500-2000 criteria. After the modelling step, nonlinear static pushover analyses were applied to these buildings using TEC-2007 Section 7. After the nonlinear pushover analyses, capacity curves (lateral load versus lateral top displacement) were plotted for the 48 RC buildings. Using these capacity curves, overstrength factors (Ω) were derived for each building. The obtained overstrength factor (Ω) values were compared with the TEC-2016 values for the related building types, and the results were interpreted. According to the values obtained in the study, the overstrength factor (Ω) given in the TEC-2016 draft code is found to be quite suitable.
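
For orientation, the overstrength factor extracted from a pushover capacity curve is commonly taken as the ratio of the maximum (or yield) base shear to the design base shear; a toy computation under that assumption is given below with invented curve data.

    # Overstrength factor Omega from a pushover capacity curve (invented data).
    import numpy as np

    top_disp = np.array([0.0, 0.05, 0.10, 0.15, 0.20, 0.25])              # lateral top displacement, m
    base_shear = np.array([0.0, 900.0, 1500.0, 1750.0, 1820.0, 1800.0])   # lateral load (base shear), kN

    V_design = 650.0                     # design base shear from the code, kN (assumed)
    V_max = base_shear.max()             # maximum base shear reached in the pushover

    omega = V_max / V_design
    print("Overstrength factor Omega =", round(omega, 2))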

Keywords: Reinforced concrete buildings, overstrength factor, earthquake, static pushover analysis.

257 Performance Evaluation of a ‘Priority-Controlled’ Intersection Converted to Signal-Controlled Intersection

Authors: Ezenwa Chinenye Amanamba

Abstract:

There is a call to ensure that the issues of safety and efficient throughput are considered during design; solutions to these issues can also be retrofitted at locations where they were not captured during design but have become problems to road users over time. This paper adopts several methods to analyze the performance of an intersection which was formerly priority-controlled but has now been converted to a signal-controlled intersection. An extensive review of the literature helped form the basis for the analysis and discussion of results. The Ikot-Ekpene/Anagha-Ezikpe intersection, located in the heart of Umuahia, was adopted as the case study, considering the high traffic volume on the route. Anecdotal evidence revealed that the traffic signals imposed enormous delays at the intersection, especially for traffic on the major road. The major road has an arrival flow which surpasses the saturation flow obtained from modelling of the isolated signalized intersection. Similarly, several geometric elements did not agree with the specific function of the road. A roundabout, particularly a flower roundabout, was recommended as a better traffic control measure.
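
As a rough reference for the capacity argument above, the standard check for a signalized approach compares the arrival flow with the approach capacity, i.e. the saturation flow scaled by the effective green ratio. A small sketch with assumed values follows.

    # Degree of saturation of a signalized approach (assumed values, not survey data).
    saturation_flow = 1800.0   # veh/h of effective green (assumed)
    green_time = 30.0          # effective green, s (assumed)
    cycle_length = 90.0        # signal cycle, s (assumed)
    arrival_flow = 950.0       # measured/modelled arrival flow, veh/h (assumed)

    capacity = saturation_flow * green_time / cycle_length   # veh/h
    x = arrival_flow / capacity                              # degree of saturation (v/c)
    print("capacity =", round(capacity, 1), "veh/h, degree of saturation x =", round(x, 2))
    # x > 1 indicates an oversaturated approach, consistent with the delays reported above.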

Keywords: Highway function, level of service, roundabout, traffic delays, Umuahia.

256 A Text Clustering System based on k-means Type Subspace Clustering and Ontology

Authors: Liping Jing, Michael K. Ng, Xinhua Yang, Joshua Zhexue Huang

Abstract:

This paper presents a text clustering system developed around a k-means type subspace clustering algorithm to cluster large, high-dimensional and sparse text data. In this algorithm, a new step is added to the k-means clustering process to automatically calculate the weights of keywords in each cluster, so that the important words of a cluster can be identified by their weight values. For understanding and interpretation of the clustering results, a few keywords that best represent the semantic topic are extracted from each cluster. Two methods are used to extract the representative words. The candidate words are first selected according to the weights calculated by our new algorithm. Then, the candidates are fed to WordNet to identify the set of noun words and to consolidate synonyms and hyponyms. Experimental results have shown that the clustering algorithm is superior to other subspace clustering algorithms, such as PROCLUS and HARP, and to k-means type algorithms, e.g. Bisecting-KMeans. Furthermore, the word extraction method is effective in selecting the words that represent the topics of the clusters.
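
The general pipeline (vectorize documents, cluster them, then read off the highest-weighted terms per cluster) can be illustrated with scikit-learn as below. Note that this uses plain TF-IDF weights and standard k-means, not the subspace feature-weighting algorithm or the WordNet consolidation step described in the paper.

    # Cluster documents and list top-weighted terms per cluster (generic illustration).
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.cluster import KMeans

    docs = ["gold price rises on weak dollar", "dollar falls, gold and oil climb",
            "hospital patients wait in queue", "queueing delays for hospital patients"]

    vec = TfidfVectorizer(stop_words="english")
    X = vec.fit_transform(docs)

    km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

    terms = vec.get_feature_names_out()
    for c in range(km.n_clusters):
        top = km.cluster_centers_[c].argsort()[::-1][:3]   # indices of heaviest terms
        print("cluster", c, "->", [terms[i] for i in top])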

Keywords: Subspace Clustering, Text Mining, Feature Weighting, Cluster Interpretation, Ontology

255 Finite Element Analysis of Different Architectures for Bone Scaffold

Authors: Nimisha R. Shirbhate, Sanjay Bokade

Abstract:

Bone scaffolds are the fundamental architecture or support structure that allows the regeneration of lost or damaged tissue, and they have been developed as a crucial tool in biomedical engineering. The structure of a bone scaffold plays an important role in treating bone defects. The geometry of the bone scaffold, specifically pore size and shape, performs a vital role in understanding the behaviour and strength of the scaffold. In this article, the fundamental aspects of bone scaffold design are first established. Second, the behaviour of each bone scaffold architecture with the chosen biomaterials is discussed. Finally, a stress analysis is carried out for each structure. This study aimed to design a porous and mechanically strong bone regeneration scaffold that can be successfully manufactured. Four porous architectures of the bone scaffold were designed using the Rhinoceros solid modelling software. Each structural model consisted of repeatable unit cells arranged in layers to fill the chosen scaffold volume. The mechanical behaviour of the selected biocompatible material is studied with the help of the ANSYS 19.2 software, which also plays a significant role in predicting the strength of the defined structures or 3-dimensional models.

Keywords: Bone scaffold, stress analysis, porous structure, static loading.

254 Sediment Patterns from Fluid-Bed Interactions: A Direct Numerical Simulations Study on Fluvial Turbulent Flows

Authors: Nadim Zgheib, Sivaramakrishnan Balachandar

Abstract:

We present results on the initial formation of ripples from an initially flattened erodible bed. We use direct numerical simulations (DNS) of turbulent open channel flow over a fixed sinusoidal bed coupled with hydrodynamic stability analysis. We use the direct forcing immersed boundary method to account for the presence of the sediment bed. The resolved flow provides the bed shear stress and consequently the sediment transport rate, which is needed in the stability analysis of the Exner equation. The approach differs from traditional linear stability analysis in the sense that the phase lag between the bed topology and the sediment flux is obtained from the DNS. We ran 11 simulations at a fixed shear Reynolds number of 180, but for different sediment bed wavelengths. The analysis allows us to sweep a large range of physical and modelling parameters to predict their effects on linear growth. The Froude number appears to be the critical controlling parameter in the early linear development of ripples, in contrast with the dominant role of the particle Reynolds number during the equilibrium stage.
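
For context, the bed-evolution (Exner) equation referred to above relates the rate of change of bed elevation to the divergence of the bedload flux. In its common one-dimensional form (notation assumed here, not taken from the paper) it reads

    (1 - \lambda_p) \, \frac{\partial \eta}{\partial t} + \frac{\partial q_b}{\partial x} = 0,

where \eta is the bed elevation, q_b the sediment (bedload) flux per unit width and \lambda_p the bed porosity. The stability analysis perturbs \eta and q_b about the flat bed and tracks the growth rate of each wavelength, with the phase lag between them supplied by the DNS.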

Keywords: Direct numerical simulation, immersed boundary method, sediment-bed interactions, turbulent multiphase flow, linear stability analysis.

253 Computer Modeling of Drug Distribution after Intravitreal Administration

Authors: N. Haghjou, M. J. Abdekhodaie, Y. L. Cheng, M. Saadatmand

Abstract:

Intravitreal injection (IVI) is the most common treatment for posterior segment eye diseases such as endophthalmitis, retinitis, age-related macular degeneration, diabetic retinopathy, uveitis, and retinal detachment. Most of the drugs used to treat vitreoretinal diseases have a narrow concentration range in which they are effective and may be toxic at higher concentrations. Therefore, it is critical to know the drug distribution within the eye following intravitreal injection. With knowledge of drug distribution, ophthalmologists can decide on the drug injection frequency while minimizing damage to tissues. The goal of this study was to develop a computer model to predict intraocular concentrations and pharmacokinetics of intravitreally injected drugs. A finite volume model was created to predict the distribution of two drugs with different physicochemical properties in the rabbit eye. The model parameters were obtained from a literature review. To validate this numerical model, in vivo data on the spatial concentration profile from the lens to the retina were compared with the numerical results. The difference was less than 5% between the numerical and experimental data. This validation provides strong support for the numerical methodology and associated assumptions of the current study.
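
A stripped-down version of such a transport model, namely one-dimensional diffusion of drug concentration across the vitreous discretized with a finite-volume scheme, is sketched below; the diffusivity, domain length and boundary treatment are illustrative assumptions only.

    # 1D finite-volume diffusion of drug concentration (illustrative parameters).
    import numpy as np

    D = 5.0e-10           # drug diffusivity in vitreous, m^2/s (assumed)
    L = 0.01              # lens-to-retina path length, m (assumed)
    N = 50                # number of finite volumes
    dx = L / N
    dt = 0.4 * dx**2 / D  # explicit time step satisfying the stability limit

    c = np.zeros(N)
    c[:5] = 1.0           # normalized initial bolus near the injection site

    def step(c):
        """One explicit finite-volume update with zero-flux (sealed) boundaries."""
        flux = np.zeros(N + 1)                    # diffusive flux at cell faces
        flux[1:-1] = -D * (c[1:] - c[:-1]) / dx   # interior faces; boundary faces stay 0
        return c - dt / dx * (flux[1:] - flux[:-1])

    for _ in range(20000):
        c = step(c)
    print("peak concentration:", round(c.max(), 3),
          "after", round(20000 * dt / 3600, 1), "hours of simulated time")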

Keywords: Posterior segment, Intravitreal injection (IVI), Pharmacokinetic, Modelling, Finite volume method.

252 Natural Frequency Analysis of a Porous Functionally Graded Shaft System

Authors:

Abstract:

The vibration characteristics of a functionally graded (FG) rotor model containing porosities and micro-voids are investigated using three-dimensional finite element analysis. The FG shaft is mounted with a steel disc located at midspan, and the shaft ends are supported on isotropic bearings. The FG material is composed of a metallic phase (stainless steel) and a ceramic phase (zirconium oxide) as its constituents. The layer-wise material property variation is governed by a power law. Material property equations are developed for the porosity modelling. A Python code is developed to assign the material properties to each layer, including the effect of porosities. The ANSYS commercial software is used to extract the natural frequencies and whirl frequencies of the FG shaft system. The obtained results show the influence of the porosity volume fraction and the power-law index on the vibration characteristics of the ceramic-based FG shaft system.
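
The abstract mentions a Python routine that assigns layer-wise properties; a generic sketch of that idea is shown below, using a power-law mixture rule with an evenly distributed porosity correction. The specific form of the porosity term and all numerical values are assumptions, not the authors' equations.

    # Layer-wise Young's modulus for a two-phase FG section with porosity (generic sketch).
    import numpy as np

    E_ceramic, E_metal = 168.0e9, 200.0e9   # zirconium oxide / stainless steel, Pa (typical assumed values)
    k = 2.0                                  # power-law index (assumed)
    alpha = 0.1                              # porosity volume fraction (assumed)
    n_layers = 10

    # Non-dimensional through-thickness coordinate of each layer midpoint, from -0.5 to 0.5.
    z = (np.arange(n_layers) + 0.5) / n_layers - 0.5

    V_ceramic = (z + 0.5) ** k               # power-law volume fraction of the ceramic phase
    E = ((E_ceramic - E_metal) * V_ceramic + E_metal
         - 0.5 * alpha * (E_ceramic + E_metal))   # even-porosity correction (assumed model)

    for i, e in enumerate(E):
        print(f"layer {i}: E = {e/1e9:.1f} GPa")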

Keywords: Finite element method, functionally graded material, porosity volume fraction, power law.

251 Material Flow Modeling in Friction Stir Welding of AA6061-T6 Alloy and Study of the Effect of Process Parameters

Authors: B. Saha Roy, T. Medhi, S. C. Saha

Abstract:

To understand the friction stir welding process, it is very important to know the nature of the material flow in and around the tool. The process combines thermal and mechanical work, i.e. it is a coupled thermo-mechanical process. Numerical simulations are essential in order to obtain a complete knowledge of the process and the physics underlying it. In the present work, a model-based approach is adopted to study the material flow. A thermo-mechanical CFD model is developed using the finite element package Comsol Multiphysics, and a fluid flow analysis is carried out. The model simultaneously predicts the shear strain fields, shear strain rates and shear stresses over the entire workpiece for the given conditions. The flow fields shown by the streamline plots give an idea of the material flow. The variation of dynamic viscosity, velocity field and shear strain fields with various welding parameters is studied. Finally, the results obtained for the above-mentioned conditions are discussed in detail and conclusions are drawn.

Keywords: AA6061-T6, friction stir welding, material flow, CFD modelling.

250 DCBOR: A Density Clustering Based on Outlier Removal

Authors: A. M. Fahim, G. Saake, A. M. Salem, F. A. Torkey, M. A. Ramadan

Abstract:

Data clustering is an important data exploration technique with many applications in data mining. We present an enhanced version of the well-known single-link clustering algorithm, which we refer to as DCBOR. The proposed algorithm alleviates the chaining effect by removing the outliers from the given dataset, so it provides outlier detection and data clustering simultaneously. The algorithm does not need to update the distance matrix, since it merges the k nearest objects in one step and a cluster continues to grow as long as possible under the specified condition. The algorithm consists of two phases: in the first phase, it removes the outliers from the input dataset; in the second phase, it performs the clustering process. The algorithm discovers clusters of different shapes, sizes and densities, and requires only one input parameter, which represents a threshold for outlier points. The value of this parameter ranges from 0 to 1, and the algorithm supports the user in determining an appropriate value for it. We have tested the algorithm on different datasets containing outliers and clusters connected by chains of dense points, and the algorithm discovers the correct clusters. The results of our experiments demonstrate the effectiveness and efficiency of DCBOR.
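
The two-phase idea (discard points whose neighbourhood density marks them as outliers, then run single-link clustering on what remains) can be sketched generically with scikit-learn and SciPy. This illustrates the concept rather than implementing DCBOR itself.

    # Phase 1: remove low-density (outlier) points; Phase 2: single-link clustering.
    # Generic illustration of the two-phase idea, not the DCBOR algorithm itself.
    import numpy as np
    from sklearn.neighbors import NearestNeighbors
    from scipy.cluster.hierarchy import linkage, fcluster

    rng = np.random.default_rng(0)
    blob1 = rng.normal([0, 0], 0.3, size=(100, 2))
    blob2 = rng.normal([4, 4], 0.3, size=(100, 2))
    noise = rng.uniform(-2, 6, size=(10, 2))
    X = np.vstack([blob1, blob2, noise])

    # Phase 1: a point is an outlier if the distance to its k-th neighbour is large.
    k = 5
    dist, _ = NearestNeighbors(n_neighbors=k + 1).fit(X).kneighbors(X)
    kth = dist[:, -1]
    keep = kth <= np.quantile(kth, 0.95)       # threshold plays the role of the input parameter
    X_clean = X[keep]

    # Phase 2: single-link agglomerative clustering on the cleaned data.
    Z = linkage(X_clean, method="single")
    labels = fcluster(Z, t=2, criterion="maxclust")
    print("points kept:", len(X_clean), " cluster sizes:", np.bincount(labels)[1:])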

Keywords: Data Clustering, Clustering Algorithms, Handling Noise, Arbitrary Shape of Clusters.

249 Dynamics of Mini Hydraulic Backhoe Excavator: A Lagrange-Euler (L-E) Approach

Authors: Bhaveshkumar P. Patel, J. M. Prajapati

Abstract:

Excavators are high-power machines used in the mining, agricultural and construction industries, whose principal functions are digging (material removal), ground levelling and material transport operations. During the digging task, certain unknown forces are exerted by the bucket on the soil, and the digging operation is repetitive in nature. Automation of the digging task can be performed by an automatically controlled excavator system which not only controls the forces but also follows the planned digging trajectories. To develop such a controller for automated excavation, a dynamic model is required to describe the behaviour of the control system during the digging operation and the motion of the excavator with time. The presented work describes a dynamic model needed for controller design, derived by applying the Lagrange-Euler approach. The developed dynamic model is intended for further development of an automated excavation control system for light-duty construction work and can be applied to heavy-duty or all other types of backhoe excavators.
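
For reference, the Lagrange-Euler formulation used for such linkages starts from the Lagrangian L = T - V (kinetic minus potential energy) and leads to the standard joint-space equation of motion; the generic robot-dynamics notation below is assumed rather than taken from the paper:

    \frac{d}{dt}\left(\frac{\partial L}{\partial \dot{q}_i}\right) - \frac{\partial L}{\partial q_i} = \tau_i,
    \qquad \text{i.e.} \qquad
    M(q)\,\ddot{q} + C(q,\dot{q})\,\dot{q} + G(q) = \tau,

where q collects the boom, arm and bucket joint variables, M(q) is the inertia matrix, C(q,\dot{q}) gathers the Coriolis and centrifugal terms, G(q) the gravity terms, and \tau the joint torques, including the soil-bucket interaction forces mapped to the joints.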

Keywords: Backhoe excavator, controller, digging, excavation, trajectory.

248 On the Efficient Implementation of a Serial and Parallel Decomposition Algorithm for Fast Support Vector Machine Training Including a Multi-Parameter Kernel

Authors: Tatjana Eitrich, Bruno Lang

Abstract:

This work deals with aspects of support vector machine learning for large-scale data mining tasks. Based on a decomposition algorithm for support vector machine training that can be run in serial as well as in shared-memory parallel mode, we introduce a transformation of the training data that allows for the usage of an expensive generalized kernel without additional cost. We present experiments for the Gaussian kernel, but other kernel functions can be used, too. In order to further speed up the decomposition algorithm, we analyze the critical problem of working set selection for large training data sets. In addition, we analyze the influence of the working set sizes on the scalability of the parallel decomposition scheme. Our tests and conclusions led to several modifications of the algorithm and an improvement of the overall support vector machine learning performance. Our method allows extensive parameter search methods to be used to optimize classification accuracy.
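
The data-transformation trick mentioned above can be illustrated for a multi-parameter (per-feature bandwidth) Gaussian kernel: dividing each feature by its own bandwidth makes the generalized kernel numerically identical to the standard RBF kernel on the rescaled data, so an ordinary SVM implementation can be reused. The sketch below uses scikit-learn and invented bandwidths; it illustrates the idea, not the authors' parallel decomposition code.

    # A per-feature (multi-parameter) Gaussian kernel
    #   k(x, y) = exp( -sum_i (x_i - y_i)^2 / (2 * theta_i^2) )
    # equals the standard RBF kernel with gamma = 0.5 applied to x_i / theta_i.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=300, n_features=6, random_state=0)
    theta = np.array([0.5, 1.0, 2.0, 1.5, 0.8, 1.2])   # per-feature bandwidths (assumed)

    X_scaled = X / theta                               # the data transformation
    clf = SVC(kernel="rbf", gamma=0.5).fit(X_scaled, y)
    print("training accuracy:", round(clf.score(X_scaled, y), 3))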

Keywords: Support Vector Machine Training, Multi-Parameter Kernels, Shared Memory Parallel Computing, Large Data

247 Mathematical Modelling and Numerical Simulation of Maisotsenko Cycle

Authors: Rasikh Tariq, Fatima Z. Benarab

Abstract:

Evaporative coolers can at best reach the wet-bulb temperature of the intake air, which is not enough to handle a large cooling load; therefore, they are not a feasible option for meeting the full cooling requirement of a building. The invention of the Maisotsenko (M) cycle has allowed evaporative cooling technology to reach sub-wet-bulb temperatures of the intake air and therefore brings an innovation in evaporative cooling techniques. In this work, we developed a mathematical model of a Maisotsenko-based air cooler by applying energy and mass balance laws to the different air channels. The governing ordinary differential equations are discretized and simulated in MATLAB. The temperature and humidity plots are shown in the simulation results. A parametric study is conducted by varying the working air inlet conditions (temperature and humidity), inlet air velocity, geometric parameters and water temperature. The influence of these parameters on the cooling effectiveness of the HMX is reported. The results show that the effectiveness of the M-cycle is increased by increasing the ambient temperature and decreasing the absolute humidity. An air velocity of 0.5 m/s and a channel height of 6-8 mm are recommended.
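
For reference, cooling effectiveness in such studies is usually reported against the wet-bulb and dew-point limits of the intake air; a small computation with assumed temperatures is shown below (the definitions are standard, but the values are not from the paper).

    # Wet-bulb and dew-point effectiveness of an indirect evaporative cooler (assumed values).
    T_in_db = 35.0    # intake dry-bulb temperature, C
    T_in_wb = 22.0    # intake wet-bulb temperature, C
    T_in_dp = 17.0    # intake dew-point temperature, C
    T_out = 20.5      # product air outlet temperature, C (sub-wet-bulb, as the M-cycle allows)

    eff_wb = (T_in_db - T_out) / (T_in_db - T_in_wb)
    eff_dp = (T_in_db - T_out) / (T_in_db - T_in_dp)
    print(f"wet-bulb effectiveness  = {eff_wb:.2f}")   # values > 1 are possible for the M-cycle
    print(f"dew-point effectiveness = {eff_dp:.2f}")   # <= 1 by definition of the limit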

Keywords: Renewable energy, indirect evaporative cooling, Maisotsenko cycle, HMX, mathematical model, numerical simulation.

246 Product Features Extraction from Opinions According to Time

Authors: Kamal Amarouche, Houda Benbrahim, Ismail Kassou

Abstract:

Nowadays, e-commerce shopping websites have experienced noticeable growth and have gained consumers' trust. After purchasing a product, many consumers share comments in which opinions about the product are usually embedded. Research on the automatic management of opinions, which gives suggestions to potential consumers and portrays an image of the product to manufacturers, has been growing recently. Just after a product is launched in the market, the reviews generated around it do not usually contain helpful information, only generic opinions about the product (e.g. telephone: great phone), since the product is still in its launching phase. Over time, the product matures and consumers perceive the advantages and disadvantages of each specific product feature, so they generate comments that contain their sentiments about these features. In this paper, we present an unsupervised method to extract the different product features hidden in the opinions which influence its purchase; the method combines Time Weighting (TW), which depends on the time at which opinions were expressed, with Term Frequency-Inverse Document Frequency (TF-IDF). We conduct several experiments using two different datasets about cell phones and hotels. The results show the effectiveness of our automatic feature extraction, as well as its domain-independent characteristic.
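
One simple way to realize the combination of TW with TF-IDF, though not necessarily the exact weighting used by the authors, is to multiply each review's TF-IDF vector by a time-dependent factor so that later, feature-rich reviews dominate the candidate feature terms:

    # Combine TF-IDF with a per-review time weight (illustrative scheme, not the paper's exact formula).
    import numpy as np
    from sklearn.feature_extraction.text import TfidfVectorizer

    reviews = ["great phone",
               "battery life is short but the camera is excellent",
               "screen scratches easily, camera still great after a year"]
    months_since_launch = np.array([0, 6, 14])        # product age when each review was posted

    tw = 1.0 - np.exp(-months_since_launch / 6.0)     # later reviews weighted more (assumed form)

    vec = TfidfVectorizer(stop_words="english")
    X = vec.fit_transform(reviews).toarray()
    weighted = X * tw[:, None]                        # time-weighted TF-IDF matrix

    scores = weighted.sum(axis=0)                     # aggregate term importance
    terms = vec.get_feature_names_out()
    top = scores.argsort()[::-1][:4]
    print("candidate feature terms:", [terms[i] for i in top])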

Keywords: Opinion mining, product feature extraction, sentiment analysis, SentiWordNet.

243 GeoSEMA: A Modelling Platform, Emerging “GeoSpatial-based Evolutionary and Mobile Agents”

Authors: Mohamed Dbouk, Ihab Sbeity

Abstract:

Spatial and mobile computing continue to evolve. This paper describes a smart modelling platform called “GeoSEMA”. The approach models multidimensional GeoSpatial Evolutionary and Mobile Agents. Beyond 3D and location-based issues, there are other dimensions that may characterize spatial agents, e.g. discrete-continuous time and agent behaviours. GeoSEMA is seen as a devoted design pattern motivating temporal geographic-based applications; it is a firm foundation for multipurpose and multidimensional spatial-based applications. It deals with multipurpose smart objects (buildings, shapes, missiles, etc.) by stimulating geospatial agents. Formally, GeoSEMA refers to geospatial, spatio-evolutive and mobile space constituents, and a conceptual geospatial space model is given in this paper. In addition to modelling and categorizing geospatial agents, the model incorporates the concept of inter-agent event-based protocols. Finally, a rapid software-architecture prototype of the GeoSEMA platform is also given; it will be implemented and validated in the next phase of our work.

Keywords: Location-Trajectory management, GIS, Mobile-Moving Objects/Agents, Multipurpose/Spatiotemporal data, Multi-Agent Systems.

244 Development of Basic Patternmaking Using Parametric Modelling and AutoLISP

Authors: Haziyah Hussin, Syazwan Abdul Samad, Rosnani Jusoh

Abstract:

This study is aimed at the automation of basic patternmaking for traditional clothes for the purpose of mass production, using AutoCAD with the AutoLISP feature through the Hazi Attire software. A standard dress form (industrial form) in small (S), medium (M) and large (L) sizes is measured using a full body scanning machine. The pattern for the clothes is then designed parametrically based on the measured dress form. The Hazi Attire program is used within the framework of AutoCAD to generate the basic patterns of the front bodice, back bodice, front skirt, back skirt and sleeve block (sloper). The generation of the pattern is based on the parameters inputted by the user; in this study, the parameters were determined from the measured size of the dress form. The finalized pattern parameters show that the pattern fits perfectly on the dress form. Since the pattern is generated almost instantly, this proves that, using AutoLISP programming, the manufacturing lead time for the mass production of traditional clothes can be decreased.

Keywords: Apparel, AutoLISP, Malay Traditional Clothes, Pattern Generation.

243 Modelling a Hospital as a Queueing Network: Analysis for Improving Performance

Authors: Emad Alenany, M. Adel El-Baz

Abstract:

In this paper, the flow of different classes of patients into a hospital is modelled and analyzed using the queueing network analyzer (QNA) algorithm and discrete event simulation. The input data for QNA are the rate and variability parameters of the arrival and service times, in addition to the number of servers in each facility. The patient flows closely match the real flows of a hospital in Egypt. Based on the analysis of the waiting times, two approaches are suggested for improving performance: separating patients into service groups, and adopting different service policies for sequencing patients through hospital units. Separating a specific group of patients with a higher performance target, to be served apart from the rest of the patients requiring a lower performance target, requires the same capacity while improving performance for the selected group with the higher target. In addition, it is shown that adopting the shortest processing time and shortest remaining processing time service policies, among the other tested policies, would result in 11.47% and 13.75% reductions in average waiting time, respectively, relative to the first-come-first-served policy.
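
QNA-style analyses rest on two-moment approximations of the waiting time at each station. The sketch below computes the M/M/c waiting time via the Erlang C formula and then applies the usual squared-coefficient-of-variation correction (an Allen-Cunneen type approximation) for general arrival and service variability; all numbers are assumptions, not the hospital data.

    # Two-moment (QNA-style) waiting-time approximation for one station (assumed parameters).
    from math import factorial

    lam, mu, c = 10.0, 4.0, 3        # arrivals/hour, service rate per server, number of servers
    ca2, cs2 = 1.5, 0.8              # squared coefficients of variation of interarrival/service times

    rho = lam / (c * mu)             # utilization, must be < 1
    a = lam / mu                     # offered load in Erlangs

    # Erlang C probability that an arriving patient must wait (M/M/c).
    p_wait = (a**c / (factorial(c) * (1 - rho))) / (
        sum(a**k / factorial(k) for k in range(c)) + a**c / (factorial(c) * (1 - rho)))

    wq_mmc = p_wait / (c * mu - lam)             # mean wait in queue for M/M/c, hours
    wq_ggc = wq_mmc * (ca2 + cs2) / 2.0          # two-moment correction for G/G/c
    print(f"M/M/c wait = {wq_mmc*60:.1f} min, approx G/G/c wait = {wq_ggc*60:.1f} min")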

Keywords: Queueing network, discrete-event simulation, health applications, SPT.

242 Effects of the Mass and Damping Matrix Model in the Nonlinear Seismic Response of Steel Frames

Authors: A. Reyes-Salazar, M. D. Llanes-Tizoc, E. Bojorquez, F. Valenzuela-Beltran, J. Bojorquez, J. R. Gaxiola-Camacho, A. Haldar

Abstract:

Seismic analysis of steel buildings is usually based on the use of the concentrated (lumped) mass matrix (ML) and the Rayleigh damping matrix (C). Similarly, the initial stiffness matrix (KO) and the first two modes associated with lateral vibrations are commonly used to develop the matrix C. The evaluation of the accuracy of these practices for the particular case of steel buildings with moment-resisting steel frames constitutes the main objective of this research. To this end, the nonlinear seismic responses of three steel frame models, representing low-, medium- and high-rise steel buildings, are considered. The results indicate that if the ML matrix is used, shears and bending moments in columns are underestimated by up to 30% and 65%, respectively, when compared with the corresponding results obtained with the consistent mass matrix (MC). It is also shown that if KO is used in C instead of the tangent stiffness matrix (Kt), axial loads in columns are underestimated by up to 80%. It is concluded that the consistent mass matrix should be used in the structural modelling of moment-resisting steel frames and that the tangent stiffness matrix should be used to develop the Rayleigh damping matrix.
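
As background to the damping-matrix discussion, the Rayleigh model sets C = alpha*M + beta*K, with alpha and beta chosen so that two selected modes receive the target damping ratio; a quick calculation under the common equal-damping assumption is shown below, with invented frequencies.

    # Rayleigh damping coefficients from two target modes (invented frequencies).
    import numpy as np

    f1, f2 = 0.8, 2.5                 # frequencies of the two lateral modes, Hz (assumed)
    zeta = 0.05                       # target damping ratio for both modes

    w1, w2 = 2 * np.pi * f1, 2 * np.pi * f2
    alpha = 2 * zeta * w1 * w2 / (w1 + w2)   # mass-proportional coefficient
    beta = 2 * zeta / (w1 + w2)              # stiffness-proportional coefficient

    # Check: the damping ratio implied at any frequency w is 0.5*(alpha/w + beta*w).
    for w in (w1, w2):
        print(f"zeta at {w/(2*np.pi):.2f} Hz = {0.5 * (alpha / w + beta * w):.3f}")
    # C = alpha*M + beta*K; the paper's point is which K (initial vs. tangent) enters this product.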

Keywords: Moment-resisting steel frames, consistent and concentrated mass matrices, nonlinear seismic response, Rayleigh damping.

241 The Link between Unemployment and Inflation Using Johansen’s Co-Integration Approach and Vector Error Correction Modelling

Authors: Sagaren Pillay

Abstract:

In this paper, bi-annual time series data on unemployment rates (from the Labour Force Survey) are expanded to quarterly rates and linked to the quarterly unemployment rates from the Quarterly Labour Force Survey. The resulting linked series and the consumer price index (CPI) series are examined using Johansen's cointegration approach and vector error correction modelling. The study finds that both series are integrated of order one and are cointegrated. A statistically significant cointegrating relationship is found to exist between the time series of unemployment rates and the CPI. Given this significant relationship, the study models it using Vector Error Correction Models (VECM), one with a restriction on the deterministic term and the other with no restriction.

A formal statistical confirmation of the existence of a unique linear and lagged relationship between inflation and unemployment for the period between September 2000 and June 2011 is presented. For this period, the CPI was found to be an unbiased predictor of the unemployment rate. This relationship can be explored further for the development of appropriate forecasting models incorporating other study variables.
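
The Johansen test and VECM estimation described here are available in statsmodels; the fragment below sketches that workflow on synthetic data (series names, lag order and the deterministic-term choice are placeholders, not the study's specification).

    # Johansen cointegration test and VECM fit with statsmodels (synthetic data).
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.vector_ar.vecm import coint_johansen, VECM

    rng = np.random.default_rng(0)
    n = 120                                   # e.g. quarterly observations
    common = np.cumsum(rng.normal(size=n))    # shared stochastic trend -> cointegration
    data = pd.DataFrame({
        "unemployment": common + rng.normal(scale=0.5, size=n),
        "cpi": 0.8 * common + rng.normal(scale=0.5, size=n),
    })

    # Trace test for the cointegration rank (det_order=0: constant term, 1 lagged difference).
    jres = coint_johansen(data, det_order=0, k_ar_diff=1)
    print("trace statistics:", np.round(jres.lr1, 2))
    print("95% critical values:", jres.cvt[:, 1])

    # Fit a VECM with cointegration rank 1; 'ci' restricts the constant to the cointegration relation.
    vecm = VECM(data, k_ar_diff=1, coint_rank=1, deterministic="ci").fit()
    print(vecm.alpha)                         # adjustment (error-correction) coefficients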

Keywords: Forecasting, lagged, linear, relationship.

240 Investigating the Relation between Student Engagement and Attainment in a Flexible Learning Environment

Authors: Y. Bi, T. Anderson, M. Huang

Abstract:

The use of technology is increasingly adopted to support flexible learning in Higher Education institutions. The adoption of more sophisticated technologies offers a broad range of facilities for communication and resource sharing, thereby creating a flexible learning environment that facilitates and even encourages students not to physically attend classes. However, this emerging trend seems to contradict class attendance requirements within universities, inevitably leading to a dilemma between amending traditional regulations and creating new policies for higher education institutions. This study presents an investigation into student engagement in a technology-enhanced/driven flexible environment and its relationship to attainment. We propose an approach to modelling engagement from different perspectives in terms of indicators, and then consider what impact these indicators have on student academic performance. We have carried out a case study on the relation between attendance and attainment in a flexible environment. Although our preliminary results show that attendance is quantitatively correlated with successful student development and learning outcomes, they also indicate that there is a cohort that did not follow such a pattern. Nevertheless, the preliminary results could provide an insight for pilot studies in the wider deployment of new technology to support flexible learning.

Keywords: Engagement, flexible learning, attendance and attainment.
