Search results for: algorithms and data structure
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 32286

31146 Hyper Tuned RBF SVM: Approach for the Prediction of the Breast Cancer

Authors: Surita Maini, Sanjay Dhanka

Abstract:

Machine learning (ML) involves developing algorithms and statistical models that enable computers to learn and make predictions or decisions based on data without being explicitly programmed. Thanks to this broad applicability, ML is gaining popularity in the medical sector: medical imaging, electronic health records, genomic data analysis, wearable devices, disease outbreak prediction, disease diagnosis, and more. Over the last few decades, many researchers have tried to diagnose breast cancer (BC) using ML, because early detection of any disease can save millions of lives. Working in this direction, the authors propose a hybrid ML technique, a hyper-tuned RBF SVM, to predict BC at an earlier stage. The proposed method is implemented on the UCI breast cancer ML dataset, which has 569 instances and 32 attributes. The authors recorded the following performance metrics for the proposed model: accuracy 98.24%, sensitivity 98.67%, specificity 97.43%, F1 score 98.67%, precision 98.67%, and a run time of 0.044769 seconds. The proposed method is validated by k-fold cross-validation.
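As a minimal sketch of the pipeline this abstract describes, the snippet below fits an RBF-kernel SVM to the same 569-instance UCI breast cancer dataset (as shipped with scikit-learn) and validates it with k-fold cross-validation. The C and gamma values are illustrative guesses, not the authors' tuned settings.

```python
# Hedged sketch: RBF-kernel SVM on the UCI breast cancer dataset (569
# instances), validated with 5-fold cross-validation. C and gamma here
# are placeholder values, not the paper's hyper-tuned ones.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)

# Feature scaling matters for RBF kernels; in practice C and gamma
# would be searched over a grid rather than fixed like this.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))

scores = cross_val_score(model, X, y, cv=5)  # 5-fold cross-validation
mean_accuracy = scores.mean()
```

With scaling, this baseline typically lands in the high-90% accuracy range, in the same neighborhood as the figures the abstract reports.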

Keywords: breast cancer, support vector classifier, machine learning, hyperparameter tuning

Procedia PDF Downloads 71
31145 Efficient Monolithic FEM for Compressible Flow and Conjugate Heat Transfer

Authors: Santhosh A. K.

Abstract:

This work presents an efficient monolithic finite element strategy for solving thermo-fluid-structure interaction problems involving compressible fluids and linear-elastic structures. The formulation uses displacement variables for the structure and velocity variables for the fluid, with no additional variables required to ensure traction, velocity, temperature, and heat flux continuity at the fluid-structure interface. The rate of convergence in each time step is quadratic, which is achieved by deriving an exact tangent stiffness matrix. The robustness and good performance of the method are ascertained by applying the proposed strategy to a wide spectrum of problems from the literature pertaining to steady, transient, two-dimensional, axisymmetric, and three-dimensional fluid flow and conjugate heat transfer. The current formulation gives excellent results on all the case studies conducted, which include problems involving compressibility effects as well as problems where the fluid can be treated as incompressible.
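The quadratic convergence claimed above is the standard payoff of Newton's method with an exact tangent. A toy scalar analogue (not the paper's FEM system) illustrates the point:

```python
# Toy analogue of the abstract's convergence claim: Newton's method with
# an *exact* tangent (derivative) converges quadratically. The scalar
# residual below stands in for the paper's nonlinear FEM system.
def newton(residual, tangent, x0, tol=1e-12, max_iter=50):
    """Newton iteration; returns the root and the residual history."""
    x, history = x0, []
    for _ in range(max_iter):
        r = residual(x)
        history.append(abs(r))
        if abs(r) < tol:
            break
        x -= r / tangent(x)  # exact tangent => quadratic convergence
    return x, history

# Toy problem: r(x) = x^3 - 2, with exact tangent r'(x) = 3 x^2.
root, hist = newton(lambda x: x**3 - 2, lambda x: 3 * x**2, x0=1.5)
```

The residual history shrinks roughly as the square of the previous residual each step, so convergence to machine precision takes only a handful of iterations.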

Keywords: linear thermoelasticity, compressible flow, conjugate heat transfer, monolithic FEM

Procedia PDF Downloads 201
31144 An Energy-Balanced Clustering Method on Wireless Sensor Networks

Authors: Yu-Ting Tsai, Chiun-Chieh Hsu, Yu-Chun Chu

Abstract:

In recent years, owing to the development of wireless network technology, many researchers have devoted themselves to the study of wireless sensor networks. Applications of wireless sensor networks mainly use sensor nodes to collect the required information and send it back to the users. Since the sensed area is often difficult to reach, there are many restrictions on the design of sensor nodes, the most important of which is their limited energy. Because of this limitation, researchers have proposed a number of ways to reduce energy consumption and balance the load of sensor nodes in order to increase the network lifetime. In this paper, we propose the Energy-Balanced Clustering method with Auxiliary Members on Wireless Sensor Networks (EBCAM), based on cluster routing. The main purpose is to balance energy consumption over the sensed area and even out the distribution of dead nodes, in order to avoid the excessive energy consumption caused by increasing transmission distance. In addition, we use the residual energy and average energy consumption of the nodes within a cluster to choose the cluster heads, use multi-hop transmission to deliver the data, and dynamically adjust the transmission radius according to the load conditions. We also use the auxiliary cluster members to change the delivery path according to the residual energy of the cluster head, in order to reduce its load. Finally, we compare the proposed method with related algorithms in simulated experiments and analyze the results, which show that the proposed method outperforms the other algorithms in the number of rounds achieved and the average energy consumption.
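A minimal sketch of the cluster-head selection rule described above: prefer nodes with high residual energy and below-average consumption. The field names and scoring formula are illustrative assumptions, not the EBCAM paper's exact equations.

```python
# Hedged sketch of energy-aware cluster-head selection: within a cluster,
# favor high residual energy relative to the cluster's average consumption.
# The score function is an illustrative assumption, not EBCAM's formula.
def choose_cluster_head(nodes):
    """nodes: list of dicts with 'id', 'residual', and 'consumed' keys."""
    avg_consumed = sum(n["consumed"] for n in nodes) / len(nodes)

    def score(n):
        # High residual energy is good; above-average consumption is penalized.
        return n["residual"] - (n["consumed"] - avg_consumed)

    return max(nodes, key=score)["id"]

cluster = [
    {"id": "A", "residual": 0.9, "consumed": 0.3},
    {"id": "B", "residual": 0.5, "consumed": 0.1},
    {"id": "C", "residual": 0.9, "consumed": 0.6},
]
head = choose_cluster_head(cluster)  # node A: high residual, moderate use
```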

Keywords: auxiliary nodes, cluster, load balance, routing algorithm, wireless sensor network

Procedia PDF Downloads 279
31143 Experimental Analysis of Tuned Liquid Damper (TLD) for High Raised Structures

Authors: Mohamad Saberi, Arash Sohrabi

Abstract:

The tuned liquid damper (TLD) is one of the passive structural control techniques that has been used since the mid-1980s for seismic control in civil engineering. The system consists of one or more tanks filled with fluid, usually water, installed on top of a high-rise structure to suppress structural vibration. In this article, we show how to build a shake-table setup containing a TLD system and analyze the results of using this system on our structure. The results imply that when the frequency ratio approaches 1, the system performs best in both dissipating energy and increasing structural damping. The results of these serial experiments also prove compatible with Housner's linear theory of sloshing behaviour.
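The "frequency ratio approaches 1" condition can be made concrete with the standard linear-sloshing estimate (used in Housner-type models) for the fundamental frequency of water in a rectangular tank. The tank dimensions and structural frequency below are illustrative assumptions, not values from the experiments.

```python
# Linear-sloshing estimate of a TLD's fundamental frequency for a
# rectangular tank of length L and water depth h:
#   f = (1 / 2*pi) * sqrt((pi * g / L) * tanh(pi * h / L))
# Tank size and structural frequency are illustrative assumptions.
import math

def sloshing_frequency(L, h, g=9.81):
    """Fundamental sloshing frequency (Hz) of water in a rectangular tank."""
    return math.sqrt((math.pi * g / L) * math.tanh(math.pi * h / L)) / (2 * math.pi)

f_tld = sloshing_frequency(L=0.5, h=0.1)  # hypothetical tank geometry
f_structure = 1.0                          # assumed structural frequency (Hz)
tuning_ratio = f_tld / f_structure         # TLD works best when this is near 1
```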

Keywords: TLD, seismic table, structural system, Housner linear behaviour

Procedia PDF Downloads 339
31142 Using Crowd-Sourced Data to Assess Safety in Developing Countries: The Case Study of Eastern Cairo, Egypt

Authors: Mahmoud Ahmed Farrag, Ali Zain Elabdeen Heikal, Mohamed Shawky Ahmed, Ahmed Osama Amer

Abstract:

Crowd-sourced data refers to data that is collected and shared by a large number of individuals or organizations, often through the use of digital technologies such as mobile devices and social media. The shortage of crash data collection in developing countries makes it difficult to fully understand and address road safety issues in these regions. In developing countries, crowd-sourced data can be a valuable tool for improving road safety, particularly in urban areas where the majority of road crashes occur. This study is, to our best knowledge, the first to develop safety performance functions using crowd-sourced data, adopting a negative binomial structure model and a Full Bayes model to investigate traffic safety for urban road networks and provide insights into the impact of roadway characteristics. Furthermore, as part of the safety management process, network screening was carried out by applying two different methods to rank the most hazardous road segments: the PCR method (adopted in the Highway Capacity Manual, HCM) and a graphical method using GIS tools, in order to compare and validate the results. Lastly, recommendations are suggested for policymakers to ensure safer roads.
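Safety performance functions of the negative binomial family are commonly written as a log-linear model of exposure and segment characteristics. A minimal sketch of that functional form, with made-up placeholder coefficients rather than the study's estimates:

```python
# Hedged sketch of a safety performance function (SPF) of the common
# negative binomial form:  E[crashes] = exp(b0 + b1*ln(AADT) + b2*L).
# The coefficients are made-up placeholders, not the study's estimates.
import math

def spf_expected_crashes(aadt, seg_len_km, b0=-8.0, b1=0.8, b2=0.5):
    """Predicted annual crash frequency for one road segment."""
    return math.exp(b0 + b1 * math.log(aadt) + b2 * seg_len_km)

mu_low = spf_expected_crashes(aadt=20000, seg_len_km=1.2)
mu_high = spf_expected_crashes(aadt=40000, seg_len_km=1.2)  # more traffic
```

In a full NB model, the same mean function is paired with an overdispersion parameter estimated from the crash counts.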

Keywords: crowdsourced data, road crashes, safety performance functions, Full Bayes models, network screening

Procedia PDF Downloads 61
31141 The Effects of English Contractions on the Application of Syntactic Theories

Authors: Wakkai Hosanna Hussaini

Abstract:

A formal structure of the English clause is composed of at least two elements (subject and verb) in structural grammar, and at least one element (predicate) in systemic (functional) and generative grammars. Each of the elements can be represented by a word or a group of words. In modern English, speakers very often merge two words into one with the use of an apostrophe. The two words may come from different elements or belong to the same element. In either case, the result of the merger is called a contraction. Although contractions constitute a part of modern English structure, they are considered informal in nature, being more frequent in spoken than in written English, which is why they were initially viewed as evidence of language deterioration. To our knowledge, no formal syntactic theory has yet dealt specifically with contractions, because they deviate from the formal rules of syntax that seek to identify the elements that form a clause in English. The inconsistency between the formal rules and a contraction arises when two words representing two elements in a non-contracted form are merged into one element to form a contraction. Thus the paper presents the various syntactic issues that arise from converting non-contracted to contracted forms. It categorizes English contractions and describes each category according to its syntactic relations (position and relationship) and morphological formation (form and content) as an integral part of the modern structure of English. This is a position paper, so the methodology is observational, descriptive, and explanatory/analytical, based on existing related literature. The inventory of English contractions contained in books on syntax forms the data from which specific examples are drawn. It is concluded that the existing syntactic theories were not originally established to account for English contractions.
The paper further exposes the inadequacies of the existing syntactic theories and gives reasons for establishing a more comprehensive syntactic theory for analyzing English clause and sentence structures involving contractions. The method used reveals the extent of these inadequacies when the three major syntactic theories (structural, systemic-functional, and generative) are applied to English contractions. Although no theory is without limits of scope, the major theories' failure to recognize English contractions needs to be remedied because of the increasing popularity of their use in modern English. The paper therefore recommends that, as the use of contractions gains popularity even in formal speech today, a syntactic theory be established to handle their patterns of syntactic relations and morphological formation.

Keywords: application, effects, English contractions, syntactic theories

Procedia PDF Downloads 273
31140 Numerical Simulation of Fluid-Structure Interaction on Wedge Slamming Impact by Using Particle Method

Authors: Sung-Chul Hwang, Di Ren, Sang-Moon Yoon, Jong-Chun Park, Abbas Khayyer, Hitoshi Gotoh

Abstract:

The slamming impact problem has a very important engineering background. For seaplane landing, recovery of satellite re-entry capsules, and the impact load on the bow in adverse sea conditions, the slamming problem always plays an important role. Due to its strong nonlinearity, however, it is not easy to obtain accurate simulation results, and the strong interaction between the fluid field and the elastic structure raises the difficulty of the simulation to a new level. This paper presents a fully Lagrangian coupled solver for simulations of fluid-structure interactions, based on the Moving Particle Semi-implicit (MPS) method, which solves the governing equations for incompressible flows as well as elastic structures. The developed solver is verified by reproducing the high-velocity impact loads of deformable thin wedges of two different materials, aluminum and steel, on water entry. The present simulation results are compared with the analytical solution derived using the hydrodynamic Wagner model and the linear theory by Wan.

Keywords: fluid-structure interaction, moving particle semi-implicit (MPS) method, elastic structure, incompressible flow, wedge slamming impact

Procedia PDF Downloads 609
31139 Housing Price Prediction Using Machine Learning Algorithms: The Case of Melbourne City, Australia

Authors: The Danh Phan

Abstract:

House price forecasting is a central topic in real estate market research. Effective house price prediction models could not only allow home buyers and real estate agents to make better data-driven decisions but may also benefit the property policymaking process. This study investigates the housing market by using machine learning techniques to analyze real historical house sale transactions in Australia. It seeks useful models that could be deployed as an application for house buyers and sellers. Data analytics shows a high discrepancy between house prices in the most expensive and the most affordable suburbs in the city of Melbourne. In addition, experiments demonstrate that the combination of stepwise selection and a Support Vector Machine (SVM), evaluated by the Mean Squared Error (MSE), consistently outperforms other models in terms of prediction accuracy.
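The "Stepwise" half of the Stepwise + SVM combination can be sketched as greedy forward feature selection scored by mean squared error of a linear fit. Synthetic data stands in for the Melbourne sales records, and the selection loop is a generic sketch rather than the author's exact procedure.

```python
# Hedged sketch of forward stepwise feature selection scored by MSE.
# Synthetic regression data stands in for the Melbourne transactions;
# only features 1 and 3 actually drive the target.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = 3.0 * X[:, 1] - 2.0 * X[:, 3] + rng.normal(scale=0.1, size=200)

def fit_mse(X, y, cols):
    """MSE of an ordinary least-squares fit on the chosen columns."""
    A = X[:, cols]
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return np.mean((A @ coef - y) ** 2)

def forward_select(X, y, k):
    """Greedily add the feature that most reduces MSE, k times."""
    chosen = []
    for _ in range(k):
        rest = [j for j in range(X.shape[1]) if j not in chosen]
        best = min(rest, key=lambda j: fit_mse(X, y, chosen + [j]))
        chosen.append(best)
    return chosen

selected = forward_select(X, y, k=2)  # recovers the informative features
```

In the study's pipeline, the surviving features would then be fed to an SVM regressor instead of the plain linear fit used for scoring here.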

Keywords: house price prediction, regression trees, neural network, support vector machine, stepwise

Procedia PDF Downloads 236
31138 Dynamic Measurement System Modeling with Machine Learning Algorithms

Authors: Changqiao Wu, Guoqing Ding, Xin Chen

Abstract:

In this paper, ways of modeling dynamic measurement systems are discussed. Specifically, a linear single-input single-output system can be modeled with a shallow neural network, with gradient-based optimization algorithms used to search for the proper coefficients. In addition, methods based on the normal equation and on second-order gradient descent are proposed to accelerate the modeling process, and ways of obtaining better gradient estimates are discussed. It is shown that the mathematical essence of the learning objective is maximum likelihood under Gaussian noise. For conventional gradient descent, mini-batch learning and gradient with momentum contribute to faster convergence and enhance model capability. Lastly, experimental results prove the effectiveness of the second-order gradient descent algorithm and indicate that optimization with the normal equation is the most suitable for linear dynamic models.
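The comparison at the heart of the abstract, the normal equation versus iterative gradient descent on the same least-squares objective, can be shown in a few lines of numpy. The synthetic system below is an illustrative stand-in for a real measurement model.

```python
# Normal equation vs. plain batch gradient descent on a linear model
# with Gaussian noise (matching the abstract's maximum-likelihood view).
# The synthetic data is an illustrative stand-in for measurement records.
import numpy as np

rng = np.random.default_rng(1)
X = np.c_[np.ones(100), rng.normal(size=(100, 2))]   # bias + 2 inputs
true_theta = np.array([0.5, 2.0, -1.0])
y = X @ true_theta + rng.normal(scale=0.05, size=100)  # Gaussian noise

# Closed form: solve (X^T X) theta = X^T y in one shot.
theta_ne = np.linalg.solve(X.T @ X, X.T @ y)

# Iterative alternative: batch gradient descent on the same objective.
theta_gd = np.zeros(3)
lr = 0.05
for _ in range(2000):
    grad = X.T @ (X @ theta_gd - y) / len(y)
    theta_gd -= lr * grad
```

Both routes land on the same coefficients; the normal equation simply gets there without tuning a learning rate, which is why the paper finds it most suitable for linear dynamic models.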

Keywords: dynamic system modeling, neural network, normal equation, second order gradient descent

Procedia PDF Downloads 130
31137 Analyzing the Perceptions of Emotions in Aesthetic Music

Authors: Abigail Wiafe, Charles Nutrokpor, Adelaide Oduro-Asante

Abstract:

The advancement of technology is rapidly making people more receptive to music, as computer-generated music requires minimal human intervention. Although algorithms are applied to generate the music, the human experience of emotion is still worth exploring. Thus, this study investigates the emotions humans experience when listening to computer-generated music that possesses aesthetic qualities. Forty-two subjects participated in the survey; they were selected by convenience sampling. Subjects listened to and evaluated the emotions experienced from the computer-generated music through an online questionnaire. A Likert scale was used to rate emotional levels after the listening experience. The findings suggest that computer-generated music possesses aesthetic qualities that do not affect subjects' emotions as long as they are pleased with the music. Furthermore, computer-generated music has unique creativity and expression; even though the music produced may be meaningless, the computational models developed are unable to present emotional content in music as humans do.

Keywords: aesthetic, algorithms, emotions, computer-generated music

Procedia PDF Downloads 139
31136 A Comparative Study of Linearly Graded and without Graded Photonic Crystal Structure

Authors: Rajeev Kumar, Angad Singh Kushwaha, Amritanshu Pandey, S. K. Srivastava

Abstract:

Photonic crystals (PCs) have attracted much attention due to their electromagnetic properties and potential applications. In PCs, the range of wavelengths over which electromagnetic waves are not allowed to pass is called the photonic band gap (PBG). A localized defect mode appears within the PBG, due to a change in the interference behavior of light, when a defect is created in the periodic structure. Different types of defect structures can also be created by inserting or removing a layer from the periodic layered structure in two- and three-dimensional PCs. Microcavities, waveguides, and perfect mirrors can be designed by creating point, line, and planar defects, respectively, in two- and three-dimensional PC structures. One-dimensional and two-dimensional PCs with defects in conventional photonic band gap structures were reported theoretically and experimentally by Smith et al. In the present paper, we present the defect mode tunability in tilted non-graded photonic crystals (NGPC) and linearly graded photonic crystals (LGPC) using lead sulphide (PbS) and titanium dioxide (TiO2) in the infrared region. A birefringent defect layer is created in the NGPC and LGPC using potassium titanyl phosphate (KTP). With the help of the transfer matrix method, the transmission properties of the proposed structure are investigated for transverse electric (TE) and transverse magnetic (TM) polarization. NGPC and LGPC without a defect layer are also investigated. We find that a photonic band gap arises in the infrared region, and that when an additional defect layer of KTP is created in the NGPC and LGPC structures, an additional transmission mode appears in the PBG region, due to the added defect layer. We also examine the effects of linear gradation in thickness, angle of incidence, tilt angle, and defect-layer thickness on the PBG and the additional transmission mode, and observe that both can be tuned by changing these parameters.
The proposed structure may be used as a channeled filter, optical switch, monochromator, or broadband optical reflector.
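The transfer matrix method invoked above reduces, at normal incidence, to multiplying one 2x2 characteristic matrix per layer. The sketch below computes transmittance through a simple ungraded quarter-wave stack; the refractive indices (TiO2-like and PbS-like) and thicknesses are illustrative, not the paper's graded design.

```python
# Hedged sketch of the transfer (characteristic) matrix method for a
# 1D periodic stack at normal incidence. Indices/thicknesses are
# illustrative TiO2-like and PbS-like values, not the paper's design.
import numpy as np

def layer_matrix(n, d, wavelength):
    """2x2 characteristic matrix of one homogeneous layer."""
    delta = 2 * np.pi * n * d / wavelength  # phase thickness
    return np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                     [1j * n * np.sin(delta), np.cos(delta)]])

def transmittance(layers, wavelength, n_in=1.0, n_out=1.0):
    """Intensity transmittance of a stack of (index, thickness) layers."""
    M = np.eye(2, dtype=complex)
    for n, d in layers:
        M = M @ layer_matrix(n, d, wavelength)
    t = 2 * n_in / (n_in * M[0, 0] + n_in * n_out * M[0, 1]
                    + M[1, 0] + n_out * M[1, 1])
    return (n_out / n_in) * abs(t) ** 2

lam0 = 1500.0  # design wavelength in nm (infrared, as in the abstract)
# Quarter-wave stack: each layer has optical thickness lam0 / 4.
stack = [(2.3, lam0 / (4 * 2.3)), (4.1, lam0 / (4 * 4.1))] * 8

T_gap = transmittance(stack, lam0)         # inside the band gap: low T
T_pass = transmittance(stack, 2.2 * lam0)  # away from the gap: higher T
```

Introducing a defect layer into `stack` would open a narrow transmission peak inside the gap, which is the tunable defect mode the abstract studies.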

Keywords: defect modes, graded photonic crystal, photonic crystal, tilt angle

Procedia PDF Downloads 377
31135 Use of Machine Learning Algorithms to Pediatric MR Images for Tumor Classification

Authors: I. Stathopoulos, V. Syrgiamiotis, E. Karavasilis, A. Ploussi, I. Nikas, C. Hatzigiorgi, K. Platoni, E. P. Efstathopoulos

Abstract:

Introduction: Brain and central nervous system (CNS) tumors form the second most common group of cancers in children, accounting for 30% of all childhood cancers. MRI is the key imaging technique used for the visualization and management of pediatric brain tumors. Initial characterization of tumors from MRI scans is usually performed via a radiologist's visual assessment. However, different brain tumor types do not always demonstrate clear differences in visual appearance. Using only conventional MRI to provide a definite diagnosis could potentially lead to inaccurate results, so histopathological examination of biopsy samples is currently considered the gold standard for obtaining a definite diagnosis. Machine learning is the study of computational algorithms that can use mathematical relationships and patterns, complex or not, from empirical and scientific data to make reliable decisions. Accordingly, machine learning techniques could provide effective and accurate ways to automate and speed up the analysis and diagnosis of medical images. Machine learning applications in radiology are or could potentially be useful in practice for medical image segmentation and registration, computer-aided detection and diagnosis systems for CT, MR, or radiography images, and functional MR (fMRI) images for brain activity analysis and neurological disease diagnosis. Purpose: The objective of this study is to provide an automated tool, which may assist in the imaging evaluation and classification of brain neoplasms in pediatric patients by determining the glioma type and grade and differentiating between different brain tissue types. Moreover, a future purpose is to present an alternative way of quick and accurate diagnosis in order to save time and resources in the daily medical workflow.
Materials and Methods: A cohort of 80 pediatric patients with a diagnosis of posterior fossa tumor was used: 20 ependymomas, 20 astrocytomas, 20 medulloblastomas, and 20 healthy children. The MR sequences used for every patient were the following: axial T1-weighted (T1), axial T2-weighted (T2), Fluid-Attenuated Inversion Recovery (FLAIR), axial diffusion-weighted images (DWI), and axial contrast-enhanced T1-weighted (T1ce). From every sequence, only a principal slice was used, manually traced by two expert radiologists. Image acquisition was carried out on a GE HDxt 1.5 T scanner. The images were preprocessed through a number of steps, including noise reduction, bias-field correction, thresholding, co-registration of all sequences (T1, T2, T1ce, FLAIR, DWI), skull stripping, and histogram matching. A large number of candidate features were chosen, including age, tumor shape characteristics, image intensity characteristics, and texture features. After selecting the features that achieve the highest accuracy with the least number of variables, four machine learning classification algorithms were used: k-Nearest Neighbour, Support Vector Machines, C4.5 Decision Tree, and Convolutional Neural Network. The machine learning schemes and the image analysis are implemented in the WEKA and MATLAB platforms, respectively. Results-Conclusions: The results and the accuracy of image classification for each type of glioma by the four different algorithms are still in process.
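The classification stage can be sketched compactly: cross-validated comparison of classifiers on per-patient feature vectors. The snippet below uses two of the four algorithms named above (k-NN and SVM) on purely synthetic "radiomics-like" features; nothing here touches real patient data, and the class-dependent means are an assumption for illustration.

```python
# Hedged sketch of the study's classifier comparison: k-NN vs. SVM with
# cross-validation on synthetic feature vectors standing in for the
# shape/intensity/texture features extracted from the MR slices.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(42)
# Four classes (ependymoma, astrocytoma, medulloblastoma, healthy),
# 20 "patients" each, 6 synthetic features with class-dependent means.
X = np.vstack([rng.normal(loc=c, scale=0.7, size=(20, 6)) for c in range(4)])
y = np.repeat(np.arange(4), 20)

acc_knn = cross_val_score(KNeighborsClassifier(n_neighbors=5), X, y, cv=5).mean()
acc_svm = cross_val_score(SVC(kernel="rbf"), X, y, cv=5).mean()
```

The same harness extends directly to the C4.5-style decision tree and CNN mentioned in the abstract once real feature vectors are available.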

Keywords: image classification, machine learning algorithms, pediatric MRI, pediatric oncology

Procedia PDF Downloads 151
31134 Using Machine Learning Techniques to Extract Useful Information from Dark Data

Authors: Nigar Hussain

Abstract:

Dark data is a subset of big data: data that organizations collect but fail to use for future decisions. There are many open issues in existing work, and powerful tools and sufficient techniques are needed to utilize dark data, enabling users to exploit its richness, adaptability, speed, low time consumption, execution, and accessibility. Another issue is how to utilize dark data to extract helpful information for making better choices. In this paper, we propose strategies to remove the dark side from dark data. Using a supervised model and machine learning techniques, we utilized dark data and achieved an F1 score of 89.48%.
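A minimal version of the supervised pipeline the abstract sketches: a random forest (the keyword list's classifier) scored with F1. A synthetic dataset stands in for the unspecified dark-data corpus, so the 89.48% figure is the paper's, not something this sketch reproduces.

```python
# Hedged sketch: random forest scored by F1, on synthetic data standing
# in for the paper's (unspecified) dark-data corpus.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=600, n_features=12, n_informative=6,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
f1 = f1_score(y_te, clf.predict(X_te))  # the abstract's headline metric
```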

Keywords: big data, dark data, machine learning, heatmap, random forest

Procedia PDF Downloads 35
31133 Annexation (Al-Iḍāfah) in Thariq bin Ziyad’s Speech

Authors: Annisa D. Febryandini

Abstract:

Annexation is a typical construction commonly used in the Arabic language. The construction appears in Arabic speeches such as the speech of Thariq bin Ziyad, one of the most famous speeches in the history of Islam, which uses many annexations. This qualitative research paper uses secondary data gathered through the library method. Based on the data, the paper concludes that the speech has two basic structures with some variations, along with certain grammatical relationships. Unlike other studies that examine the speech from a sociological standpoint, this paper analyzes it linguistically, looking at the structure of its annexations as well as their grammatical relationships.

Keywords: annexation, Thariq bin Ziyad, grammatical relationship, Arabic syntax

Procedia PDF Downloads 325
31132 Evolution of Structure and Magnetic Behavior by Pr Doping in SrRuO3

Authors: Renu Gupta, Ashim K. Pramanik

Abstract:

We report the evolution of structure and magnetic properties in the perovskite ruthenates Sr1-xPrxRuO3 (x = 0.0 and 0.1). Our main aim is to induce structural modification and change the Ru charge state by Pr doping at the Sr site. Pr doping on the Sr site retains the orthorhombic structure, while we find minor changes in the structural parameters. SrRuO3 shows an itinerant type of ferromagnetism with an ordering temperature of ~160 K. With Pr doping, the magnetic moment decreases, and the ZFC curve shows three distinct peaks (three transition temperatures: TM1, TM2, and TM3). Further analysis of the magnetization of both samples at high temperature follows a modified Curie-Weiss law, and Pr doping gives a Curie temperature of ~129 K, which is close to TM2. Between TM2 and TM3, the inverse susceptibility shows an upward deviation from Curie-Weiss behavior, indicating the existence of AFM-like clusters in this regime. The low-temperature isothermal magnetization M(H) shows that the moment decreases with Pr doping. The Arrott plot gives the spontaneous magnetization (Ms), which also decreases with Pr doping. The Rhodes-Wohlfarth ratio increases, which suggests that the ferromagnetism in this system evolves toward the itinerant type with Pr doping.

Keywords: itinerant ferromagnet, perovskite structure, ruthenates, Rhodes-Wohlfarth ratio

Procedia PDF Downloads 360
31131 X-Ray Analysis and Grain Size of CuInxGa1-xSe2 Solar Cells

Authors: A. I. Al-Bassam, A. M. El-Nggar

Abstract:

Polycrystalline CuIn1-xGaxSe2 thin films have been fabricated. Some physical properties, such as the lattice parameters, crystal structure, and microstructure of CuIn1-xGaxSe2, were determined using X-ray diffractometry and scanning electron microscopy. X-ray diffraction analysis showed that films with x ≥ 0.5 have a chalcopyrite structure and films with x ≤ 0.5 have a zinc blende structure. The lattice parameters were found to vary linearly with composition over the whole range from x = 0 to x = 1.0, and this variation obeys Vegard's law. The variation of the c/a ratio with composition was also linear. The quality of a wide range of CuIn1-xGaxSe2 thin-film absorbers, from CuInSe2 to CuGaSe2, was evaluated by photoluminescence (PL) measurements.
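Vegard's law, which the abstract reports the lattice parameters obey, is simply a linear interpolation between the end-member lattice constants. The end values below are approximate literature numbers (in angstroms) for CuInSe2 and CuGaSe2, used only for illustration; they are not taken from this paper's measurements.

```python
# Vegard's law: linear interpolation of the lattice parameter between
# the two end members. End values are approximate literature numbers
# (angstroms) for illustration, not this paper's measured values.
def vegard_a(x, a_CuInSe2=5.78, a_CuGaSe2=5.61):
    """Lattice parameter a of CuIn(1-x)Ga(x)Se2 at Ga fraction x."""
    return (1 - x) * a_CuInSe2 + x * a_CuGaSe2

a_mid = vegard_a(0.5)  # midpoint composition
```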

Keywords: grain size, polycrystalline, solar cells, lattice parameters

Procedia PDF Downloads 506
31130 Implementation and Comparative Analysis of PET and CT Image Fusion Algorithms

Authors: S. Guruprasad, M. Z. Kurian, H. N. Suma

Abstract:

Medical imaging modalities are becoming life-saving components, essential to doctors for proper diagnosis, treatment planning, and follow-up. Some modalities provide anatomical information, such as Computed Tomography (CT), Magnetic Resonance Imaging (MRI), and X-rays, while others provide only functional information, such as Positron Emission Tomography (PET). Therefore, a single-modality image does not give complete information. This paper presents the fusion of the structural information in CT with the functional information present in PET images. The fused image is essential in detecting the stages and locations of abnormalities, and is particularly needed in oncology for improved diagnosis and treatment. We have implemented and compared image fusion techniques such as pyramid, wavelet, and principal component fusion methods, along with a hybrid method combining the Discrete Wavelet Transform (DWT) and Principal Component Analysis (PCA). The performance of the algorithms is evaluated quantitatively and qualitatively. The system is implemented and tested using MATLAB. Based on MSE, PSNR, and entropy analysis, the PCA and DWT-PCA methods showed the best results over all experiments.
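The PCA fusion rule compared above weights the two source images by the components of the leading eigenvector of their joint covariance. A hedged numpy sketch, with random arrays standing in for registered CT and PET slices:

```python
# Hedged sketch of PCA image fusion: weight the two sources by the
# leading eigenvector of their 2x2 joint covariance. Random arrays
# stand in for co-registered CT and PET slices.
import numpy as np

def pca_fuse(img1, img2):
    """Fuse two equally sized grayscale images with PCA-derived weights."""
    data = np.vstack([img1.ravel(), img2.ravel()])
    cov = np.cov(data)                      # 2x2 covariance of the sources
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
    v = np.abs(eigvecs[:, -1])              # leading eigenvector
    w1, w2 = v / v.sum()                    # weights normalized to sum to 1
    return w1 * img1 + w2 * img2, (w1, w2)

rng = np.random.default_rng(0)
ct = rng.random((64, 64))    # stand-in for an anatomical CT slice
pet = rng.random((64, 64))   # stand-in for a functional PET slice
fused, (w1, w2) = pca_fuse(ct, pet)
```

In the DWT-PCA hybrid, the same weighting is applied per wavelet subband instead of on the raw pixels.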

Keywords: image fusion, pyramid, wavelets, principal component analysis

Procedia PDF Downloads 287
31129 The Comparative Analysis of International Financial Reporting Standard Adoption through Earnings Response Coefficient and Conservatism Principle: Case Study in Jakarta Islamic Index 2010 – 2014

Authors: Dwi Wijiastutik, Tarjo, Yuni Rimawati

Abstract:

The purpose of this empirical study is to analyze the market reaction and the change in the degree of conservatism upon the adoption of International Financial Reporting Standards (IFRS) on the Jakarta Islamic Index. The study also provides additional analysis of profitability, capital structure, and company size in relation to IFRS adoption. This study uses secondary data, drawing on companies' annual reports and daily stock prices from Yahoo Finance. We analyze 40 companies listed on the Jakarta Islamic Index from 2010 to 2014. Two analyses are applied to test the developed hypotheses: Moderated Regression Analysis and the Wilcoxon signed-rank test. The regression results show that the market response and the conservatism principle did not change significantly after IFRS adoption in the Jakarta Islamic Index, while the additional analyses of profitability, capital structure, and company size show significant changes after IFRS adoption. The findings help investors by showing the impact of IFRS on investment decisions.
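The Wilcoxon signed-rank test named above compares paired firm-level observations before and after IFRS adoption. A sketch with made-up paired values for a handful of hypothetical firms (not the study's data):

```python
# Hedged sketch of the Wilcoxon signed-rank comparison of a firm-level
# metric before vs. after IFRS adoption. The paired values below are
# made up for illustration, not drawn from the study's sample.
from scipy.stats import wilcoxon

pre_ifrs = [0.41, 0.35, 0.52, 0.48, 0.39, 0.44,
            0.50, 0.37, 0.46, 0.42, 0.38, 0.45]
post_ifrs = [0.43, 0.36, 0.50, 0.49, 0.41, 0.43,
             0.52, 0.39, 0.45, 0.44, 0.40, 0.46]

stat, p_value = wilcoxon(pre_ifrs, post_ifrs)
# Compare p_value against a significance level (e.g., 0.05) to decide
# whether the metric changed significantly after adoption.
```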

Keywords: IFRS, earnings response coefficient, conservatism principle

Procedia PDF Downloads 275
31128 The Contrastive Survey of Phonetic Structure in Two Iranian Dialects

Authors: Iran Kalbasi, Foroozandeh Zardashti

Abstract:

Dialectology is a branch of sociolinguistics that studies systematic language variation. Dialects are branches of a single language that have structural, morphological, and phonetic differences from one another. In Iran, these dialects and language variations carry heavy cultural loads, and studying them has both linguistic and cultural importance. In this study, the phonetic structures of two Iranian dialects, the Bakhtiyari Lori of Masjedsoleyman and Shushtari, both in Khuzestan Province of Iran, are surveyed. The statistical community includes twenty speakers of the two dialects. The theoretical basis of this research is structuralism. The data were collected through interviews using a questionnaire consisting of 3000 words, 410 sentences, and 110 complex and simple verbs, and they are analyzed and described synchronically. The phonetic characteristics of the two dialects and standard Persian are then compared, showing clear differences at the phonetic level between these two dialects and standard Persian.

Keywords: standard language, dialectology, bakhtiyari lori dialect of Masjedsoleyman, Shushtari dialect, vowel, consonant

Procedia PDF Downloads 596
31127 Optimal Operation of Bakhtiari and Roudbar Dam Using Differential Evolution Algorithms

Authors: Ramin Mansouri

Abstract:

Because rivers' discharge regimes contrast with water demands, one of the best ways to use water resources is to regulate the natural flow of rivers and supply water needs by constructing dams. In the optimal utilization of reservoirs, considering multiple important goals simultaneously is of very high importance. To study this method, statistical data for the Bakhtiari and Roudbar dams over 46 years (1955 to 2001) are used. Initially, an appropriate objective function was specified, and the rule curve was developed using the differential evolution (DE) algorithm. The operation policy based on rule curves was then compared to the standard operation policy. The proposed method distributed the deficit over the whole year, and the lowest damage was inflicted on the system. The standard deviation of the monthly shortfall in each year with the proposed algorithm was smaller than with the other two methods. The results show that median values for the coefficients F and Cr provide the optimum situation and keep the DE algorithm from being trapped in a local optimum; the most optimal answers are 0.6 and 0.5 for the F and Cr coefficients, respectively. After finding the best combination of values for F and Cr, the algorithm was examined for independent populations: populations of 4, 25, 50, 100, 500, and 1000 members were studied over two generation counts (G = 50 and 100). The results indicate that 200 generations are suitable for optimization. The increase in run time with population size has an almost linear trend, which indicates the effect of population on the algorithm's run time; hence, specifying a suitable population for obtaining optimal results is very important. The standard operation policy had a better reliability percentage but inflicted severe vulnerability on the system. The results obtained in years of low rainfall were very good compared to the other comparative methods.
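The DE scheme tuned above can be sketched as the classic DE/rand/1/bin loop with the coefficients the abstract found optimal (F = 0.6, Cr = 0.5). A toy sphere function stands in for the reservoir-operation objective:

```python
# Hedged sketch of DE/rand/1/bin with the abstract's reported optimal
# coefficients (F = 0.6, Cr = 0.5). A sphere function stands in for the
# reservoir-operation objective; bounds and sizes are illustrative.
import numpy as np

def differential_evolution(obj, bounds, pop_size=30, gens=200,
                           F=0.6, Cr=0.5, seed=0):
    rng = np.random.default_rng(seed)
    dim = len(bounds)
    lo, hi = np.array(bounds).T
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    fit = np.array([obj(p) for p in pop])
    for _ in range(gens):
        for i in range(pop_size):
            # Mutation: three distinct individuals other than i.
            a, b, c = pop[rng.choice([j for j in range(pop_size) if j != i],
                                     size=3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)
            # Binomial crossover with at least one gene from the mutant.
            cross = rng.random(dim) < Cr
            cross[rng.integers(dim)] = True
            trial = np.where(cross, mutant, pop[i])
            # Greedy selection.
            if (f := obj(trial)) < fit[i]:
                pop[i], fit[i] = trial, f
    best = np.argmin(fit)
    return pop[best], fit[best]

sphere = lambda x: float(np.sum(x ** 2))
best_x, best_f = differential_evolution(sphere, bounds=[(-5, 5)] * 4)
```

In the actual application, `obj` would evaluate the shortfall/damage of a candidate rule curve over the 46-year inflow record instead of the sphere function.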

Keywords: reservoirs, differential evolution, dam, optimal operation

Procedia PDF Downloads 80
31126 An Attempt at the Multi-Criterion Classification of Small Towns

Authors: Jerzy Banski

Abstract:

The basic aim of this study is to discuss and assess different classifications and research approaches to small towns that take their social and economic functions into account, as well as their relations with surrounding areas. The subject literature typically includes three types of approaches to the classification of small towns: 1) the structural, 2) the location-related, and 3) the mixed. The structural approach allows for the grouping of towns from the point of view of the social, cultural and economic functions they discharge. The location-related approach draws on the idea of a continuum between the center and the periphery. A mixed classification, making simultaneous use of the different research approaches, yields the most information about categories of urban locality. Bearing these approaches in mind, it is possible to propose a synthetic method for classifying small towns that takes account of economic structure, location and the relationship between the towns and their surroundings. In the case of economic structure, the small centers may be divided into two basic groups: those featuring a multi-branch structure and those that are economically specialized. A second element of the classification reflects the locations of urban centers. Two basic types can be identified: the small town within the range of impact of a large agglomeration, and the town outside such areas, which is to say located peripherally. The third component of the classification arises out of small towns’ relations with their surroundings. In consequence, it is possible to indicate eight types of small town: from local centers enjoying good accessibility and a multi-branch economic structure to peripheral supra-local centers characterised by a specialized economic structure.

Keywords: small towns, classification, functional structure, localization

Procedia PDF Downloads 185
31125 Portable and Parallel Accelerated Development Method for Field-Programmable Gate Array (FPGA)-Central Processing Unit (CPU)-Graphics Processing Unit (GPU) Heterogeneous Computing

Authors: Nan Hu, Chao Wang, Xi Li, Xuehai Zhou

Abstract:

The field-programmable gate array (FPGA) has been widely adopted in the high-performance computing domain. In recent years, embedded systems-on-a-chip (SoCs) have come to contain coarse-grained multi-core CPUs (central processing units) and mobile GPUs (graphics processing units) that can be used as general-purpose accelerators. The motivation is that algorithms with various parallel characteristics can be efficiently mapped to a heterogeneous architecture coupling these three processors. The CPU and GPU offload some computationally intensive tasks from the FPGA to reduce resource consumption and lower the overall cost of the system. However, in common present-day scenarios, applications utilize only one type of accelerator, because development approaches supporting the collaboration of heterogeneous processors face challenges. A systematic approach is therefore needed that offers write-once-run-anywhere portability, high execution performance for modules mapped to the various architectures, and easy exploration of the design space. In this paper, a servant-execution-flow model is proposed to abstract the cooperation of the heterogeneous processors, supporting task partitioning, communication, and synchronization. At first run, the intermediate language, represented as a data-flow diagram, can generate executable code for the target processor or be converted into high-level programming languages. Instantiation parameters efficiently control the relationship between modules and computational units, including the mapping of two hierarchical processing-unit levels and the adjustment of data-level parallelism. An embedded three-dimensional waveform oscilloscope is selected as a case study, and the performance of algorithms such as contrast stretching is analyzed across implementations on various combinations of these processors.
The experimental results show that the heterogeneous computing system achieves, with less than 35% of the resources, performance similar to the pure-FPGA implementation and approximately the same energy efficiency.
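Contrast stretching, the case-study workload named above, is a simple per-pixel kernel, which is what makes it a natural candidate for mapping onto any of the three processors. The following is a minimal sketch of linear contrast stretching on a flat list of intensities; the exact variant used in the oscilloscope case study is not specified in the abstract.

```python
def contrast_stretch(pixels, out_min=0, out_max=255):
    """Linear contrast stretching: map the input intensity range onto
    [out_min, out_max]. Each output pixel depends only on its input pixel
    plus two global extrema, so the loop parallelizes trivially."""
    lo, hi = min(pixels), max(pixels)
    if hi == lo:                      # flat image: nothing to stretch
        return [out_min] * len(pixels)
    scale = (out_max - out_min) / (hi - lo)
    return [round(out_min + (p - lo) * scale) for p in pixels]

stretched = contrast_stretch([50, 100, 150, 200])
```

The two-pass structure (a reduction for min/max, then an elementwise map) is typical of kernels that partition cleanly between a host processor and an accelerator.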

Keywords: FPGA-CPU-GPU collaboration, design space exploration, heterogeneous computing, intermediate language, parameterized instantiation

Procedia PDF Downloads 121
31124 Blood Glucose Measurement and Analysis: Methodology

Authors: I. M. Abd Rahim, H. Abdul Rahim, R. Ghazali

Abstract:

There are numerous non-invasive blood glucose measurement techniques developed by researchers, and near-infrared (NIR) spectroscopy is currently among the most promising. However, there is some disagreement on the optimal wavelength range to use as the reference for the glucose substance in the blood. This paper focuses on the experimental data collection technique and on the method used to analyze the data gained from the experiment. The selection of a suitable linear or non-linear model structure is essential in a prediction system, as the system developed needs to be acceptably accurate.
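The linear-versus-nonlinear model-structure choice the abstract refers to can be illustrated with a least-squares polynomial fit, where degree 1 gives a linear calibration and higher degrees give a simple nonlinear one. The sketch below uses synthetic absorbance-style data, not the authors' NIR measurements, and a generic normal-equations solver.

```python
def fit_polynomial(xs, ys, degree):
    """Least-squares polynomial fit via the normal equations, solved with
    Gaussian elimination and partial pivoting. Degree 1 is a linear model
    structure; degree >= 2 is a simple nonlinear one."""
    n = degree + 1
    # Normal-equation system A c = b for coefficients c[0..degree]
    A = [[sum(x ** (i + j) for x in xs) for j in range(n)] for i in range(n)]
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(n)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c2 in range(col, n):
                A[r][c2] -= f * A[col][c2]
            b[r] -= f * b[col]
    coeffs = [0.0] * n
    for r in range(n - 1, -1, -1):
        coeffs[r] = (b[r] - sum(A[r][c] * coeffs[c]
                                for c in range(r + 1, n))) / A[r][r]
    return coeffs  # coeffs[i] multiplies x**i

xs = [0.1, 0.2, 0.3, 0.4, 0.5]          # synthetic absorbance values
ys = [2.0 * x + 1.0 for x in xs]        # exactly linear synthetic response
linear = fit_polynomial(xs, ys, 1)
```

In practice the structure would be selected by comparing validation error across candidate degrees rather than assumed in advance.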

Keywords: linear, near-infrared (NIR), non-invasive, non-linear, prediction system

Procedia PDF Downloads 463
31123 Multi-Source Data Fusion for Urban Comprehensive Management

Authors: Bolin Hua

Abstract:

In city governance, various data are involved, including city component data, demographic data, housing data and all kinds of business data. These data reflect different aspects of people, events and activities. Data generated by the various systems differ in form and in source, because they may come from different sectors. In order to reflect one or several facets of an event or rule, data from multiple sources need to be fused together. Data collected from different sources in different ways raise several issues that need to be resolved. Problems in data fusion include data update and synchronization, data exchange and sharing, file parsing and entry, duplicate-data detection and comparison, and resource catalogue construction. Governments adopt statistical analysis, time-series analysis, extrapolation, monitoring analysis, value mining and scenario prediction in order to achieve pattern discovery, law verification, root cause analysis and public opinion monitoring. The result of multi-source data fusion is a uniform central database, which includes people data, location data, object data, institution data, business data and space data. Metadata must be referred to and read whenever an application needs to access, manipulate or display the data. Uniform metadata management ensures the effectiveness and consistency of data throughout data exchange, data modeling, data cleansing, data loading, data storage, data analysis, data search and data delivery.
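The core idea of fusing records from several sector systems into one central database, with metadata recording where each field came from, can be sketched as follows. The source names and field names here are illustrative assumptions, not taken from the paper.

```python
def fuse_records(sources, key="id"):
    """Merge records from multiple source systems into one central record per
    entity key. Later sources override earlier ones on conflicting fields, and
    per-field provenance is tracked in a '_meta' map, standing in for the
    uniform metadata management the fusion process relies on."""
    fused = {}
    for source_name, records in sources.items():
        for rec in records:
            k = rec[key]
            entry = fused.setdefault(k, {key: k, "_meta": {}})
            for field, value in rec.items():
                if field == key:
                    continue
                entry[field] = value
                entry["_meta"][field] = source_name  # which system supplied it
    return fused

# Illustrative records for one building entity from two sector systems
housing = [{"id": "B1", "address": "12 Elm St", "units": 40}]
census = [{"id": "B1", "population": 96}]
central = fuse_records({"housing": housing, "census": census})
```

A real deployment would add the update-synchronization and duplicate-comparison steps the abstract lists; here the last-writer-wins rule is the simplest possible conflict policy.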

Keywords: multi-source data fusion, urban comprehensive management, information fusion, government data

Procedia PDF Downloads 397
31122 A Study of Permission-Based Malware Detection Using Machine Learning

Authors: Ratun Rahman, Rafid Islam, Akin Ahmed, Kamrul Hasan, Hasan Mahmud

Abstract:

Malware is becoming more prevalent, and several threat categories have risen dramatically in recent years. This paper provides a bird's-eye view of the world of malware analysis. It investigates the efficiency of five different machine learning methods (Naive Bayes, K-Nearest Neighbor, Decision Tree, Random Forest, and TensorFlow Decision Forests), combined with features extracted from Android permissions, in categorizing applications as harmful or benign. The dataset consists of 1,168 samples (602 malware and 566 benign Android applications), each described by 948 features (permissions). On this permission-based dataset, the machine learning algorithms achieve accuracy rates above 80%, except for the Naive Bayes algorithm at 65%. Of the algorithms considered, TensorFlow Decision Forests performed best, with an accuracy of 90%.
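Permission-based detection treats each app as a binary vector (permission requested or not) and trains a classifier on labeled examples. The following is a minimal from-scratch sketch of one of the compared methods, Bernoulli Naive Bayes, on tiny toy data; the permission names are illustrative and the real study used 948 permission features and library implementations.

```python
from math import log

def train_bernoulli_nb(X, y, alpha=1.0):
    """Bernoulli Naive Bayes over binary permission features (1 = requested).
    Returns per-class log prior and Laplace-smoothed presence probabilities."""
    classes = sorted(set(y))
    n_feat = len(X[0])
    model = {}
    for c in classes:
        rows = [x for x, label in zip(X, y) if label == c]
        prior = log(len(rows) / len(X))
        theta = [(sum(r[j] for r in rows) + alpha) / (len(rows) + 2 * alpha)
                 for j in range(n_feat)]
        model[c] = (prior, theta)
    return model

def predict(model, x):
    """Pick the class maximizing log prior + sum of per-feature log likelihoods."""
    def score(c):
        prior, theta = model[c]
        return prior + sum(log(t) if xi else log(1 - t)
                           for xi, t in zip(x, theta))
    return max(model, key=score)

# Toy feature order: [SEND_SMS, READ_CONTACTS, INTERNET] (illustrative)
X = [[1, 1, 1], [1, 0, 1], [0, 0, 1], [0, 1, 0]]
y = ["malware", "malware", "benign", "benign"]
model = train_bernoulli_nb(X, y)
label = predict(model, [1, 1, 0])
```

The study's finding that Naive Bayes trails the tree ensembles is consistent with its conditional-independence assumption, which permission sets frequently violate.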

Keywords: android malware detection, machine learning, malware, malware analysis

Procedia PDF Downloads 173
31121 Students’ Awareness of the Use of Poster, Power Point and Animated Video Presentations: A Case Study of Third Year Students of the Department of English of Batna University

Authors: Bahloul Amel

Abstract:

The present study examines students’ perceptions of the use of technology in learning English as a Foreign Language. Its aim is to explore and understand students’ preparation and presentation of posters, PowerPoint slides, and animated videos, drawing attention to visual and oral elements. The data were collected through observations and semi-structured interviews and analyzed following phenomenological data-analysis steps. The themes that emerged from the data (satisfaction with visual learning through information and communication technology, providing structure to oral presentations, and learning from peers’ presentations) draw attention to the use of posters, PowerPoint, and animated videos, as each supports visual learning and the organization of thoughts in oral presentations.

Keywords: EFL, posters, PowerPoint presentations, animated videos, visual learning

Procedia PDF Downloads 448
31120 Fluid Structure Interaction of Flow and Heat Transfer around a Microcantilever

Authors: Khalil Khanafer

Abstract:

This study analyzes the effect of flow conditions and of geometric variation of the microcantilever’s bluff body on the microcantilever's detection capabilities within a fluidic device, using a finite element fluid-structure interaction model. The parameters studied include inlet velocity, flow direction, and the height of the microcantilever’s supporting system within the fluidic cell. The transport equations are solved using a finite element formulation based on the Galerkin method of weighted residuals. For a flexible microcantilever, a fully coupled fluid-structure interaction (FSI) analysis is utilized, and the fluid domain is described by an arbitrary Lagrangian-Eulerian (ALE) formulation that is fully coupled to the structural domain. The results of this study show a profound effect of the magnitude and direction of the inlet velocity, and of the height of the bluff body, on the deflection of the microcantilever. The vibration characteristics are also investigated. This work paves the road for researchers to design efficient microcantilevers that exhibit the least measurement error.
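The structural side of such an FSI model must reproduce classic beam behavior in simple limits, so closed-form cantilever formulas are a common sanity check. The sketch below evaluates the Euler-Bernoulli tip deflection of a cantilever under a uniform distributed load; the dimensions and material values are illustrative assumptions, not the paper's geometry, and the paper itself solves the fully coupled problem numerically rather than with this formula.

```python
def rect_section_inertia(w, t):
    """Second moment of area of a rectangular cross-section: I = w * t**3 / 12."""
    return w * t ** 3 / 12

def tip_deflection_uniform_load(q, L, E, I):
    """Euler-Bernoulli tip deflection of a cantilever under uniform distributed
    load q (N/m): delta = q * L**4 / (8 * E * I)."""
    return q * L ** 4 / (8 * E * I)

# Illustrative silicon microcantilever (assumed values, not from the paper)
E = 170e9                         # Young's modulus of silicon, Pa
L, w, t = 200e-6, 40e-6, 1e-6     # length, width, thickness, m
I = rect_section_inertia(w, t)
delta = tip_deflection_uniform_load(q=0.01, L=L, E=E, I=I)  # q = 0.01 N/m
```

For these assumed values the tip deflection comes out on the order of microns, comparable to the beam thickness, which is precisely the regime where a fully coupled FSI treatment, rather than a one-way load transfer, becomes necessary.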

Keywords: fluidic cell, FSI, microcantilever, flow direction

Procedia PDF Downloads 376
31119 Loss Allocation in Radial Distribution Networks for Loads of Composite Types

Authors: Sumit Banerjee, Chandan Kumar Chanda

Abstract:

The paper presents the allocation of active power losses and energy losses to consumers connected to radial distribution networks in a deregulated environment, for loads of composite types. A detailed comparison among four algorithms, namely the quadratic, proportional, pro rata and exact loss allocation methods, is presented. Quadratic and proportional loss allocation are based on identifying the active and reactive components of the current in each branch, with the losses allocated to each consumer accordingly; the pro rata method is based on the load demand of each consumer; and the exact method is based on each consumer's actual contribution to the active power loss. The effectiveness of the comparison among the four algorithms for composite loads is demonstrated through an example.
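Of the four methods, pro rata allocation is the simplest to state: each consumer bears total losses in proportion to its demand, regardless of where it sits in the feeder. A minimal sketch, with illustrative consumer names and values:

```python
def pro_rata_allocation(total_loss_kw, demands_kw):
    """Pro rata loss allocation: split the total active power loss among
    consumers in proportion to their load demand. Unlike the exact method,
    this ignores each consumer's actual contribution to branch currents."""
    total_demand = sum(demands_kw.values())
    return {consumer: total_loss_kw * d / total_demand
            for consumer, d in demands_kw.items()}

shares = pro_rata_allocation(12.0, {"C1": 100.0, "C2": 50.0, "C3": 50.0})
```

The known criticism, which motivates the branch-current-based and exact methods, is that a remote consumer and one adjacent to the substation pay the same per-kW rate despite causing very different losses.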

Keywords: composite type, deregulation, loss allocation, radial distribution networks

Procedia PDF Downloads 289
31118 Reviewing Privacy Preserving Distributed Data Mining

Authors: Sajjad Baghernezhad, Saeideh Baghernezhad

Abstract:

Given the ever-increasing growth of data, methods such as data mining for extracting knowledge are nowadays unavoidable. One issue in data mining is the inherently distributed nature of the data: the bases creating or receiving such data usually belong to corporate or non-corporate persons who do not give their information freely to others, and there is no guarantee that particular data can be mined without intruding on the owner's privacy. Sending the data and then gathering them, whether the partitioning is vertical or horizontal, depends on the type of privacy preservation applied and is carried out to improve data privacy. This study attempts a comprehensive comparison of privacy-preserving data mining methods, examining general techniques such as random data perturbation and coding, along with the strengths and weaknesses of each.
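The random-data technique mentioned above perturbs each value with noise before it leaves the owner's site, so the miner never sees exact records, while aggregate statistics remain approximately recoverable. A minimal sketch (Gaussian additive noise; the noise model and scale are assumptions for illustration):

```python
import random

def perturb(values, scale=1.0, rng=None):
    """Random-data perturbation: add zero-mean Gaussian noise to each value
    before sharing. Individual records are distorted, but the sample mean of
    the shared data still estimates the true mean."""
    rng = rng or random.Random()
    return [v + rng.gauss(0.0, scale) for v in values]

original = [30.0, 42.0, 55.0, 61.0, 48.0]     # private values (e.g. ages)
shared = perturb(original, scale=2.0, rng=random.Random(7))
mean_estimate = sum(shared) / len(shared)     # true mean is 47.2
```

The trade-off this illustrates is the one the survey compares across methods: larger noise scales strengthen privacy but widen the error of any statistic the miner reconstructs.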

Keywords: data mining, distributed data mining, privacy protection, privacy preserving

Procedia PDF Downloads 529
31117 Numerical Modeling of Various Support Systems to Stabilize Deep Excavations

Authors: M. Abdallah

Abstract:

Urban development requires deep excavations near buildings and other structures. Deep excavation has become a necessity for better utilization of space, as the world's population has increased dramatically. In Lebanon, some urban areas are very crowded and lack space for new buildings and underground projects, which makes the use of underground space indispensable. In this paper, numerical modeling is performed using the finite element method to study the deep excavation-diaphragm wall soil-structure interaction in the case of nonlinear soil behavior. The study focuses on a comparison of the results obtained using different support systems. Furthermore, a parametric study is performed according to the remoteness of the neighboring structure.

Keywords: deep excavation, ground anchors, soil-structure interaction, struts

Procedia PDF Downloads 419