Search results for: Medical Data Exchange
5414 Fuzzy Hierarchical Clustering Applied for Quality Estimation in Manufacturing System
Authors: Y. Q. Lv, C.K.M. Lee
Abstract:
This paper develops a quality estimation method based on fuzzy hierarchical clustering. Quality estimation is essential to quality control and quality improvement, as a precise estimate supports sound decision-making and thus better quality control. Normally, the quality of finished products in a manufacturing system can be differentiated by quality standards. In real-life situations, however, the collected data may be vague, difficult to classify, and usually represented as fuzzy numbers, which makes estimating product quality from them nontrivial. In this research, trapezoidal fuzzy numbers are collected from the manufacturing process and classified into different clusters to obtain the estimation. Since conventional hierarchical clustering methods apply only to real numbers, fuzzy hierarchical clustering is selected to handle this problem based on quality standards.
Keywords: Quality estimation, fuzzy quality mean, fuzzy hierarchical clustering, fuzzy number, manufacturing system.
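As an illustration of the clustering step (a sketch under assumed data and distance choices, not the authors' implementation), the following groups trapezoidal fuzzy numbers with agglomerative hierarchical clustering on a simple parameter-wise distance:

```python
# Illustrative sketch: hierarchical clustering of trapezoidal fuzzy numbers (a, b, c, d)
# using the mean absolute difference of their four parameters as the distance.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# Each row is a trapezoidal fuzzy quality measurement (a <= b <= c <= d); values are made up.
samples = np.array([
    [4.0, 4.5, 5.0, 5.5],
    [4.1, 4.6, 5.1, 5.6],
    [7.0, 7.4, 7.9, 8.3],
    [7.2, 7.5, 8.0, 8.4],
    [1.0, 1.3, 1.8, 2.2],
])

# Distance between two trapezoidal numbers: mean absolute difference of their parameters.
dist = pdist(samples, metric="cityblock") / samples.shape[1]

tree = linkage(dist, method="average")               # agglomerative, average linkage
labels = fcluster(tree, t=3, criterion="maxclust")   # cut into 3 quality clusters
print(labels)
```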
5413 Design and Construction Validation of Pile Performance through High Strain Pile Dynamic Tests for both Contiguous Flight Auger and Drilled Displacement Piles
Authors: S. Pirrello
Abstract:
Sydney’s booming real estate market has pushed property developers to invest in historically “no-go” areas, which were previously too expensive to develop. These areas are usually near rivers, where the sites are underlain by deep alluvial and estuarine sediments. In these ground conditions, conventional bored pile techniques are often not competitive. Contiguous Flight Auger (CFA) and Drilled Displacement (DD) pile techniques are, on the other hand, well suited to these ground conditions. This paper deals with the design and construction challenges encountered with these piling techniques for a series of high-rise towers in Sydney’s West. The advantages of DD over CFA piles, such as reduced overall spoil with substantial cost savings and achievable rock sockets in medium-strength bedrock, are discussed. Design performance was assessed with PIGLET. Pile performance is validated in two stages: during construction, by interpreting real-time data from the piling rigs’ on-board computers, and after construction, with analyses of results from high strain pile dynamic testing (PDA). Results are then presented and discussed. High strain testing data are presented as Case Pile Wave Analysis Program (CAPWAP) analyses.
Keywords: Contiguous flight auger, case pile wave analysis, high strain pile, drilled displacement, pile performance.
5412 E-Appointment Scheduling (EAS)
Authors: Noraziah Ahmad, Roslina Mohd Sidek, Mohd Affendy Omardin
Abstract:
E-Appointment Scheduling (EAS) has been developed to handle appointments for UMP students and lecturers in the Faculty of Computer Systems & Software Engineering (FCSSE) and the Student Medical Center. The schedules are based on the timetable and university activities. Constraint Logic Programming (CLP) has been implemented to solve the scheduling problem by recommending available slots from the lecturers' and doctors' timetables. By using this system, we can avoid wasting time and cost because the application generates appointments automatically. In addition, this system offers the lecturers and doctors an alternative way to decide whether to approve or reject the appointments.
Keywords: EAS, Constraint Logic Programming, PHP, Apache.
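A minimal sketch of the slot-recommendation idea (the hourly-slot model and timetable data are assumptions for illustration; the actual EAS uses CLP and PHP):

```python
# Minimal sketch (not the EAS implementation): recommend appointment slots as the
# intersection of a lecturer's and a student's free time slots.
WORKING_HOURS = range(9, 17)  # hourly slots, 09:00-16:00

def free_slots(busy):
    """Return the hourly slots not occupied by the given busy hours."""
    return sorted(set(WORKING_HOURS) - set(busy))

def recommend(lecturer_busy, student_busy):
    """Slots acceptable to both parties (the constraint to satisfy)."""
    return sorted(set(free_slots(lecturer_busy)) & set(free_slots(student_busy)))

print(recommend(lecturer_busy=[9, 10, 13], student_busy=[11, 15]))  # -> [12, 14, 16]
```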
5411 The Investigation of 5th Grade Turkish Students' Comprehension Scores According to Different Variables
Authors: Omer Kutlu, Ozen Yildirim, Safiye Bilican
Abstract:
The aim of this study is to examine the reading comprehension scores of Turkish 5th grade students according to the variables in the student questionnaire. In this descriptive survey study, 279 5th grade students from 10 different primary schools in four districts of Ankara participated during the 2008-2009 academic year. Two data collection tools were used: a “Reading Comprehension Test” and a “Student Information Questionnaire”. Independent-samples t-tests, one-way ANOVA and two-way ANOVA were used to analyse the gathered data. The results indicate that the students' reading comprehension scores differ significantly according to their sex, the number of books in their homes, the frequency of summarizing activities on reading texts and the frequency of free reading hours provided by their teachers, but do not differ significantly according to the educational level of their mothers and fathers.
Keywords: Primary school education, reading, reading comprehension.
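For readers unfamiliar with the named tests, a tiny illustration with made-up scores (not the study's data) of how such comparisons are computed:

```python
# Illustrative sketch of the statistical tests named in the abstract, on invented scores.
from scipy import stats

boys  = [62, 70, 55, 68, 74, 61]
girls = [71, 78, 66, 80, 73, 69]
print(stats.ttest_ind(boys, girls, equal_var=False))       # comparison by sex

few_books, some_books, many_books = [55, 60, 58], [64, 70, 67], [75, 72, 80]
print(stats.f_oneway(few_books, some_books, many_books))    # comparison by number of books at home
```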
5410 New Multi-Solid Thermodynamic Model for the Prediction of Wax Formation
Authors: Ehsan Ghanaei, Feridun Esmaeilzadeh, Jamshid Fathi Kaljahi
Abstract:
In the previous multi-solid models, the φ (fugacity coefficient) approach is used to calculate fugacities in the liquid phase. For the first time, the proposed multi-solid thermodynamic model uses the γ (activity coefficient) approach to calculate fugacities in the liquid mixture. Several activity coefficient models were therefore studied, and the results show that the predictive Wilson model is more appropriate than the others. The results demonstrate that the γ approach using the predictive Wilson model agrees better with experimental data than the previous multi-solid models. This method also generates a new approach to stability analysis in phase equilibrium calculations. Meanwhile, the run time of the γ approach is shorter than that of the previous methods using the φ approach. The new model shows an average absolute deviation (AAD) of 0.75% from the experimental data, which is clearly lower than the error of the previous multi-solid models.
Keywords: Multi-solid thermodynamic model, predictive Wilson model, wax formation.
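The error metric quoted above is the standard average absolute deviation; a small sketch of its computation (the definition is assumed to match the paper's usage, and the values are invented):

```python
# Sketch of the reported error metric: AAD% between calculated and experimental values.
def aad_percent(calculated, experimental):
    """AAD% = (100/N) * sum(|calc_i - exp_i| / exp_i)."""
    pairs = list(zip(calculated, experimental))
    return 100.0 / len(pairs) * sum(abs(c - e) / e for c, e in pairs)

# Example with made-up wax-precipitation values:
print(aad_percent([0.101, 0.198, 0.305], [0.100, 0.200, 0.300]))  # ~1.22 %
```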
5409 Application of Data Envelopment Analysis and Performance Indicators to Irrigation Systems in Thessaloniki Plain (Greece)
Authors: Ntantos P.N, Karpouzos D.K
Abstract:
In this paper, a benchmarking framework is presented for the performance assessment of irrigation systems. First, data envelopment analysis (DEA) is applied to measure the technical efficiency of irrigation systems. This method, based on linear programming, aims to determine a consistent efficiency ranking of irrigation systems in which known inputs, such as the water volume supplied and the total irrigated area, and a given output corresponding to the total value of irrigation production are taken into account simultaneously. Second, in order to examine irrigation efficiency in more detail, a cross-system comparison is elaborated using a set of performance indicators selected by IWMI. The above methodologies were applied to the Thessaloniki plain, located in northern Greece, and the results of the application are presented and discussed. The conjunctive use of DEA and performance indicators appears to be a very useful tool for efficiency assessment and for identifying best practices in irrigation systems management.
Keywords: Benchmarking, DEA, performance indicators, irrigation systems.
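A compact sketch of the DEA step (a standard input-oriented CCR multiplier formulation solved as a linear program; the irrigation data are invented placeholders, and this is not the paper's code):

```python
# CCR efficiency of irrigation systems via the DEA multiplier form, solved with linprog.
import numpy as np
from scipy.optimize import linprog

# Inputs per system: [water volume supplied (hm^3), irrigated area (ha)]
X = np.array([[30.0, 1200.0], [45.0, 1500.0], [25.0, 900.0], [50.0, 2000.0]])
# Output per system: [value of irrigation production (M euro)]
Y = np.array([[6.0], [7.5], [5.5], [8.0]])

def ccr_efficiency(j0):
    n, m = X.shape
    s = Y.shape[1]
    c = np.concatenate([-Y[j0], np.zeros(m)])                   # maximise u.y0 -> minimise -u.y0
    A_ub = np.hstack([Y, -X])                                    # u.yj - v.xj <= 0 for every system j
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(s), X[j0]]).reshape(1, -1)   # v.x0 = 1 (normalisation)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0], bounds=(0, None))
    return -res.fun                                              # efficiency score in (0, 1]

for j in range(len(X)):
    print(f"system {j}: efficiency = {ccr_efficiency(j):.3f}")
```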
5408 Gene Selection Guided by Feature Interdependence
Authors: Hung-Ming Lai, Andreas Albrecht, Kathleen Steinhöfel
Abstract:
Cancers can normally be marked by a number of differentially expressed genes, which show enormous potential as biomarkers for a certain disease. In recent years, cancer classification based on the investigation of gene expression profiles derived from high-throughput microarrays has been widely used. The selection of discriminative genes is therefore an essential preprocessing step in carcinogenesis studies. In this paper, we propose a novel gene selector using information-theoretic measures for biological discovery. This multivariate filter is a four-stage framework comprising analyses of feature relevance, feature interdependence, feature redundancy-dependence and subset rankings, and it has been examined on the colon cancer data set. Our experimental results show that the proposed method outperformed other information-theoretic filters in terms of both classification error and classification performance.
Keywords: Colon cancer, feature interdependence, feature subset selection, gene selection, microarray data analysis.
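A sketch of the feature-relevance stage only, using mutual information on synthetic expression data (an illustration of the general idea, not the proposed four-stage filter):

```python
# Rank genes by mutual information with the class label on a synthetic microarray matrix.
import numpy as np
from sklearn.feature_selection import mutual_info_classif

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 200))          # 60 samples x 200 genes (synthetic expression matrix)
y = rng.integers(0, 2, size=60)         # tumour vs normal labels
X[:, 5] += 2.0 * y                      # make gene 5 genuinely differentially expressed

relevance = mutual_info_classif(X, y, random_state=0)
top_genes = np.argsort(relevance)[::-1][:10]
print(top_genes)                        # gene 5 should appear near the top
```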
5407 Correlation-based Feature Selection using Ant Colony Optimization
Authors: M. Sadeghzadeh, M. Teshnehlab
Abstract:
Feature selection has recently been the subject of intensive research in data mining, especially for datasets with a large number of attributes. Recent work has shown that feature selection can have a positive effect on the performance of machine learning algorithms. The success of many learning algorithms, in their attempts to construct models of data, hinges on the reliable identification of a small set of highly predictive attributes. The inclusion of irrelevant, redundant and noisy attributes in the model building phase can result in poor predictive performance and increased computation. In this paper, a novel feature search procedure that utilizes Ant Colony Optimization (ACO) is presented. ACO is a metaheuristic inspired by the behavior of real ants in their search for the shortest paths to food sources. It looks for optimal solutions by considering both local heuristics and previous knowledge. When applied to two different classification problems, the proposed algorithm achieved very promising results.
Keywords: Ant colony optimization, classification, data mining, feature selection.
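A heavily simplified sketch of ACO-style feature subset search (illustrative only; the pheromone model, fitness function and dataset are assumptions, not the authors' algorithm):

```python
# Ants build feature subsets guided by pheromone; subsets are scored with a kNN classifier,
# and pheromone is reinforced on the best subset found so far.
import numpy as np
from sklearn.datasets import load_wine
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(1)
X, y = load_wine(return_X_y=True)
n_features = X.shape[1]

def fitness(subset):
    if not subset:
        return 0.0
    acc = cross_val_score(KNeighborsClassifier(), X[:, subset], y, cv=3).mean()
    return acc - 0.01 * len(subset)           # small penalty for large subsets

pheromone = np.ones(n_features)
best_subset, best_fit = None, -1.0
for iteration in range(15):
    for ant in range(10):
        prob = pheromone / pheromone.sum()
        k = rng.integers(3, 8)                # subset size chosen by the ant
        subset = list(rng.choice(n_features, size=k, replace=False, p=prob))
        f = fitness(subset)
        if f > best_fit:
            best_subset, best_fit = subset, f
    pheromone *= 0.9                          # evaporation
    pheromone[best_subset] += best_fit        # reinforce features of the best subset
print(sorted(best_subset), round(best_fit, 3))
```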
5406 Operating Model of Obstructive Sleep Apnea Patients in North Karelia Central Hospital
Authors: L. Korpinen, T. Kava, I. Salmi
Abstract:
This study aimed to describe the operating model for obstructive sleep apnea patients. Due to the large number of patients, the role of nurses in the diagnosis and treatment of sleep apnea is important. Pulmonary physicians met only a minority of the patients. The sleep apnea study in 2018 included about 800 patients, of whom about 28% were normal and 180 were classified as severe (apnea-hypopnea index [AHI] over 30). The operating model has proven to be workable and appropriate. The patients understand well that they may not be referred to a pulmonary doctor. However, specialized medical follow-up of professional drivers continues every year.
Keywords: Sleep, apnea patient, operating model, hospital.
5405 A Decision Boundary based Discretization Technique using Resampling
Authors: Taimur Qureshi, Djamel A Zighed
Abstract:
Many supervised induction algorithms require discrete data, even though real data often comes in both discrete and continuous formats. Quality discretization of continuous attributes is an important problem that affects the speed, accuracy and understandability of the induction models. Usually, discretization and other types of statistical processes are applied to subsets of the population, as the entire population is practically inaccessible. For this reason, we argue that a discretization performed on a sample of the population is only an estimate of that for the entire population. Most existing discretization methods partition the attribute range into two or several intervals using a single cut point or a set of cut points. In this paper, we introduce a technique that uses resampling (such as the bootstrap) to generate a set of candidate discretization points, thereby improving discretization quality by providing a better estimate with respect to the entire population. The goal of this paper is thus to observe whether the resampling technique can lead to better discretization points, which opens up a new paradigm for the construction of soft decision trees.
Keywords: Bootstrap, discretization, resampling, soft decision trees.
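One possible reading of the resampling idea, sketched below (the entropy-based cut criterion and the aggregation of bootstrap cut points are assumptions for illustration, not the paper's exact procedure):

```python
# Bootstrap the data, find the best entropy-based cut point on each resample, and use the
# distribution of those cut points as candidate discretization boundaries.
import numpy as np

def entropy(labels):
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()

def best_cut(x, y):
    """Cut point that minimises the class entropy of the two resulting intervals."""
    order = np.argsort(x)
    x, y = x[order], y[order]
    cuts = (x[1:] + x[:-1]) / 2
    scores = [(i / len(x)) * entropy(y[:i]) + (1 - i / len(x)) * entropy(y[i:])
              for i in range(1, len(x))]
    return cuts[int(np.argmin(scores))]

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 100), rng.normal(3, 1, 100)])   # continuous attribute
y = np.array([0] * 100 + [1] * 100)                                  # class labels

boot_cuts = []
for _ in range(200):
    idx = rng.integers(0, len(x), len(x))                            # bootstrap resample
    boot_cuts.append(best_cut(x[idx], y[idx]))
print(np.mean(boot_cuts), np.std(boot_cuts))   # candidate cut and its sampling variability
```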
5404 On-Time Performance and Service Regularity of Stage Buses in Mixed Traffic
Authors: Suwardo, Madzlan B. Napiah, Ibrahim B. Kamaruddin
Abstract:
Stage buses operating in mixed traffic often face problems of low service quality and reliability. Low quality and reliability of bus service can make the system unattractive and directly reduce interest in using the bus service. This paper presents the results of a field investigation and an analysis of the on-time performance and service regularity of stage buses in mixed traffic. Data for the analysis were collected in the field by on-board observation along the Ipoh-Lumut corridor in Perak, Malaysia. From the analysis and discussion, it can be concluded that on-time performance and service regularity vary depending on the station, type of day, time period, operating characteristics of the bus and characteristics of the traffic. The on-time performance and service regularity of stage buses in mixed traffic can be derived from data collected by on-board survey. It is clear that the on-time performance and service regularity of the existing stage bus system were low.
Keywords: mixed traffic, on-time performance, service regularity, stage bus
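A small sketch of how such indicators can be computed from observed departure times (the tolerance and the regularity measure are assumptions; the times are made up):

```python
# On-time performance as the share of departures within a tolerance of the schedule, and
# regularity as the coefficient of variation of headways. Times are in minutes.
import numpy as np

scheduled = np.array([0, 15, 30, 45, 60, 75])
actual    = np.array([2, 14, 38, 47, 70, 76])

on_time = np.abs(actual - scheduled) <= 5            # within a 5-minute tolerance (assumed)
print("on-time performance:", on_time.mean())        # 4 of 6 departures here

headways = np.diff(actual)
cov = headways.std() / headways.mean()
print("headway coefficient of variation:", round(cov, 2))
```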
5403 Reinforced Concrete Bridge Deck Condition Assessment Methods Using Ground Penetrating Radar and Infrared Thermography
Authors: Nicole M. Martino
Abstract:
Reinforced concrete bridge deck condition assessments primarily use visual inspection methods, where an inspector looks for and records locations of cracks, potholes, efflorescence and other signs of probable deterioration. Sounding is another technique used to diagnose the condition of a bridge deck; however, this method listens for damage within the subsurface as the surface is struck with a hammer or chain. Even though extensive procedures are in place for using these inspection techniques, neither one provides the inspector with a comprehensive understanding of the internal condition of a bridge deck – the location where damage originates. In order to make accurate estimates of repair locations and quantities, in addition to allocating the necessary funding, a total understanding of the deck’s deteriorated state is key. The research presented in this paper collected infrared thermography and ground penetrating radar data from reinforced concrete bridge decks without an asphalt overlay. These decks were of various ages, and their condition varied from brand new to in need of replacement. The goals of this work were first to verify that these nondestructive evaluation methods could identify similar areas of healthy and damaged concrete, and then to see whether combining the results of both methods would provide higher confidence than a condition assessment completed using only one method. The results from each method were presented as plan-view color contour plots. The results from one of the decks assessed as part of this research, including these plan-view plots, are presented in this paper. Furthermore, in order to answer the interest of transportation agencies throughout the United States, this research developed a step-by-step guide which demonstrates how to collect and assess a bridge deck using these nondestructive evaluation methods. This guide addresses setup procedures on the deck during the day of data collection, system setups and settings for different bridge decks, data post-processing for each method, and data visualization and quantification.
Keywords: Bridge deck deterioration, ground penetrating radar, infrared thermography, NDT of bridge decks.
5402 Combining ASTER Thermal Data and Spatial-Based Insolation Model for Identification of Geothermal Active Areas
Authors: Khalid Hussein, Waleed Abdalati, Pakorn Petchprayoon, Khaula Alkaabi
Abstract:
In this study, we integrated ASTER thermal data with an area-based spatial insolation model to identify and delineate geothermally active areas in Yellowstone National Park (YNP). Two pairs of L1B ASTER day- and nighttime scenes were used to calculate land surface temperature. We employed the Emissivity Normalization Algorithm, which separates temperature from emissivity, to calculate surface temperature. We calculated the incoming solar radiation for the area covered by each of the four ASTER scenes using an insolation model and used this information to compute the temperature due to solar radiation. We then identified the statistical thermal anomalies using land surface temperature and the residuals calculated from modeled temperatures and ASTER-derived surface temperatures. Areas with temperatures or temperature residuals greater than 2σ, or between 1σ and 2σ, were considered ASTER-modeled thermal anomalies. The areas identified as thermal anomalies were in strong agreement with the thermal areas obtained from the YNP GIS database. Also, the YNP hot springs and geysers were located within areas identified as anomalous thermal areas. The consistency between our results and known geothermally active areas indicates that thermal remote sensing data, integrated with a spatial-based insolation model, provide an effective means for identifying and locating areas of geothermal activity over large areas and rough terrain.
Keywords: Thermal remote sensing, insolation model, land surface temperature, geothermal anomalies.
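A sketch of the anomaly-flagging step as described, on synthetic arrays rather than ASTER data:

```python
# Pixels whose residual (observed minus modelled temperature) exceeds 1-sigma or 2-sigma of
# the scene statistics are flagged as possible geothermal anomalies.
import numpy as np

rng = np.random.default_rng(0)
observed = rng.normal(10.0, 2.0, size=(100, 100))   # ASTER-derived surface temperature (degC)
modelled = rng.normal(10.0, 1.0, size=(100, 100))   # temperature due to solar radiation alone
observed[40:43, 60:63] += 15.0                      # synthetic hot-spring cluster

residual = observed - modelled
mu, sigma = residual.mean(), residual.std()
strong_anomaly = residual > mu + 2 * sigma                  # greater than 2 sigma
weak_anomaly = (residual > mu + sigma) & ~strong_anomaly    # between 1 and 2 sigma
print(strong_anomaly.sum(), weak_anomaly.sum())
```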
5401 Unsupervised Feature Selection Using Feature Density Functions
Authors: Mina Alibeigi, Sattar Hashemi, Ali Hamzeh
Abstract:
Since dealing with high-dimensional data is computationally complex and sometimes even intractable, several feature reduction methods have recently been developed to reduce the dimensionality of the data in order to simplify the analysis in various applications such as text categorization, signal processing, image retrieval and gene expression analysis. Among feature reduction techniques, feature selection is one of the most popular methods due to the preservation of the original features. In this paper, we propose a new unsupervised feature selection method which removes redundant features from the original feature space by using the probability density functions of the features. To show the effectiveness of the proposed method, popular feature selection methods have been implemented and compared. Experimental results on several datasets from the UCI repository illustrate the effectiveness of our proposed method in comparison with the other methods in terms of both classification accuracy and the number of selected features.
Keywords: Feature, feature selection, filter, probability density function.
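An assumed reading of the density-based redundancy idea, sketched with kernel density estimates (not the authors' algorithm):

```python
# Estimate each feature's probability density with a KDE and drop a feature when its density
# is nearly identical to that of an already-kept feature (i.e. it is redundant).
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
f1 = rng.normal(0, 1, 500)
f2 = f1 + rng.normal(0, 0.05, 500)      # nearly a copy of f1 -> redundant
f3 = rng.exponential(1.0, 500)          # genuinely different feature
features = [f1, f2, f3]

grid = np.linspace(-4, 6, 200)
densities = [gaussian_kde(f)(grid) for f in features]

kept = []
for i, d in enumerate(densities):
    redundant = any(np.mean(np.abs(d - densities[j])) < 0.02 for j in kept)
    if not redundant:
        kept.append(i)
print("kept features:", kept)           # expected: [0, 2]
```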
5400 Classification of Earthquake Distribution in the Banda Sea Collision Zone with Point Process Approach
Authors: Henry J. Wattimanela, Udjianna S. Pasaribu, Nanang T. Puspito, Sapto W. Indratno
Abstract:
The Banda Sea Collision Zone (BSCZ) is the result of the interaction and convergence of the Indo-Australian, Eurasian and Pacific plates, and is located in eastern Indonesia. This zone has very high seismic activity. In this research, we calculate the rate (λ) and the mean square error (MSE) and, based on these results, classify the earthquake distribution in the BSCZ with a point process approach. A chi-square test is used to determine the type of earthquake distribution in the sub-regions of the BSCZ. The data used in this research are earthquakes with magnitude ≥ 6 on the Richter scale for the period 1964-2013, sourced from BMKG Jakarta. This research is expected to help the Moluccas Province and surrounding local governments in preparing spatial planning documents related to disaster management.
Keywords: Banda Sea Collision Zone, earthquakes, mean square error, Poisson distribution, chi-square test.
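A sketch of the distribution check described, testing synthetic yearly counts against a Poisson model with a chi-square statistic (the catalogue values are invented, not the BMKG data):

```python
# Test whether yearly earthquake counts in a sub-region are consistent with a Poisson
# distribution whose rate is the observed mean.
import numpy as np
from scipy import stats

yearly_counts = np.array([2, 1, 3, 0, 2, 4, 1, 2, 3, 1, 0, 2, 2, 3, 1])  # events per year
lam = yearly_counts.mean()                                               # estimated rate

k = np.arange(0, 6)
observed = np.array([(yearly_counts == v).sum() for v in k], dtype=float)
expected = stats.poisson.pmf(k, lam) * len(yearly_counts)
expected *= observed.sum() / expected.sum()        # rescale so both sum to the sample size

chi2, p = stats.chisquare(observed, expected, ddof=1)   # 1 df lost for the estimated rate
print(f"lambda = {lam:.2f}, chi2 = {chi2:.2f}, p = {p:.3f}")
```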
5399 Banks Profitability Indicators in CEE Countries
Abstract:
The aim of the present article is to determine the impact of external and internal factors of bank performance on the profitability indicators of CEE countries' banks in the period from 2006 to 2012. On the basis of research conducted abroad on bank and macroeconomic profitability indicators, the authors evaluated the return on average assets (ROAA) and return on average equity (ROAE) indicators of the CEE countries' banks. The authors analyzed the profitability indicators of banks using descriptive methods, SPSS data analysis methods, and data correlation and linear regression analysis. The authors concluded that most internal and external indicators of bank performance have no direct influence on the profitability of the banks in the CEE countries. The only exceptions are credit risk and bank size, which affect one of the measures of bank profitability, return on average equity.
Keywords: Banks, CEE countries, Profitability ROAA, ROAE.
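A tiny sketch of the kind of correlation and regression analysis described (the figures are invented, not the study's data):

```python
# Correlation and a linear regression of ROAE on bank size and credit risk.
import numpy as np
from scipy import stats

roae        = np.array([8.2, 10.1, 5.5, 12.3, 7.8, 9.4])   # %
bank_size   = np.array([9.1, 10.2, 8.5, 11.0, 9.3, 9.8])   # log of total assets
credit_risk = np.array([4.0, 2.5, 6.1, 1.8, 4.5, 3.2])     # impaired loans / gross loans, %

print(stats.pearsonr(roae, credit_risk))                     # expected negative correlation

X = np.column_stack([np.ones_like(bank_size), bank_size, credit_risk])
coef, *_ = np.linalg.lstsq(X, roae, rcond=None)
print("intercept, size, credit risk:", np.round(coef, 2))
```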
5398 Mining of Interesting Prediction Rules with Uniform Two-Level Genetic Algorithm
Authors: Bilal Alatas, Ahmet Arslan
Abstract:
The main goal of data mining is to extract accurate, comprehensible and interesting knowledge from databases that may be considered large search spaces. In this paper, a new, efficient type of Genetic Algorithm (GA) called the uniform two-level GA is proposed as a search strategy to discover truly interesting, high-level prediction rules, a difficult and relatively little-researched problem, rather than discovering classification knowledge as is usual in the literature. The proposed method uses the advantage of the uniform population method and addresses the task of generalized rule induction, which can be regarded as a generalization of the task of classification. Although the task of generalized rule induction requires a great deal of computation, which normal algorithms usually do not handle well, it was demonstrated that this method increased the performance of GAs and rapidly found interesting rules.
Keywords: Classification rule mining, data mining, genetic algorithms.
5397 Community Resilience in Response to the Population Growth in Al-Thahabiah Neighborhood
Authors: Layla Mujahed
Abstract:
Amman, the capital of Jordan, is the main political, economic, social and cultural center of Jordan and beyond. The city faces a multitude of demographic challenges related to the unstable political situation in the surrounding countries, as it hosts regional and local migrants who left their homes to find a better life in the capital. This has resulted in a random and unequal population distribution: some districts have a high population and more pressure on infrastructure and services than other districts. The government works to resolve this challenge in compliance with the 100 Cities Resilience Framework (CRF). Amman joined this framework as a member in December 2014 to work toward achieving its four goals: health and welfare, infrastructure and utilities, economy and education, as well as administration and government. Previous research has not studied Amman's resilience work at the neighborhood scale or population growth as a resilience challenge. Therefore, this study focuses on the Al-Thahabiah neighborhood in the Shafa Badran district of Amman. This paper studies the reasons and drivers behind the population growth in this area during the selected period and then provides strategies to improve resilience work at the neighborhood scale. The methodology comprises primary and secondary data. The primary data consist of interviews with a chief officer in the executive branch of the Greater Amman Municipality and with the resilience officer. The secondary data consist of papers, journals, newspaper articles and books, as well as maps and statistical data describing the infrastructural and social situation in the neighborhood and district during the study period. Based upon these data, more detailed information can be obtained, e.g., on the concentration of population and the infrastructure provided for them, which will help provide services and infrastructure to other neighborhoods and improve population distribution. This study develops an analytical framework to assess urban demographic time series in accordance with the criteria of the CRF, in order to make accurate, detailed projections of the requirements for future development at the neighborhood scale and to organize the human requirements for affordable quality housing, employment, transportation, health and education in this neighborhood, improving the social relations between its inhabitants and the community. This study highlights the localization of resilience work at the neighborhood scale and spreads resilience knowledge, which is under-researched in Jordan. Studying resilience work from the perspective of the population growth challenge helps improve the facilities provided to the inhabitants and their quality of life.
Keywords: City resilience framework, CRF, demography, population growth, stakeholders, urban resilience.
5396 Analyzing Factors Impacting COVID-19 Vaccination Rates
Authors: Dongseok Cho, Mitchell Driedger, Sera Han, Noman Khan, Mohammed Elmorsy, Mohamad El-Hajj
Abstract:
Since the approval of the COVID-19 vaccines in late 2020, vaccination rates have varied around the globe. Access to a vaccine supply, mandated vaccination policy, and vaccine hesitancy contribute to these rates. This study used COVID-19 vaccination data from Our World in Data and the Multilateral Leaders Task Force on COVID-19 to create two COVID-19 vaccination indices. The first index is the Vaccine Utilization Index (VUI), which measures how effectively each country has utilized its vaccine supply to doubly vaccinate its population. The second index is the Vaccination Acceleration Index (VAI), which evaluates how efficiently each country vaccinated its population within its first 150 days. Pearson correlations were computed between these indices and country indicators obtained from the World Bank. The results of these correlations show that countries with stronger health indicators, such as lower mortality rates, lower age-dependency ratios, and higher rates of immunization against other diseases, display higher VUI and VAI scores than countries with weaker values. VAI scores are also positively correlated with governance and economic indicators, such as regulatory quality, control of corruption, and GDP per capita. As represented by the VUI, proper utilization of the COVID-19 vaccine supply is observed in countries that display excellence in health practices. A country’s motivation to accelerate its vaccination rates within the first 150 days of vaccinating, as represented by the VAI, was largely a product of the governing body’s effectiveness and economic status, as well as overall excellence in health practices.
Keywords: Data mining, Pearson Correlation, COVID-19, vaccination rates, hesitancy.
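A minimal sketch of the correlation analysis; the VUI definition and all values below are assumptions for illustration, not the study's data:

```python
# VUI taken here as fully vaccinated people per two-dose supply delivered, correlated
# against a country indicator with a Pearson coefficient.
import numpy as np
from scipy import stats

doses_delivered  = np.array([10.0, 25.0, 40.0, 8.0, 60.0])   # millions
fully_vaccinated = np.array([ 4.2, 11.0, 19.5, 2.1, 29.0])   # millions of people
vui = fully_vaccinated / (doses_delivered / 2)                # share of delivered supply used for full coverage

gdp_per_capita = np.array([12_000, 30_000, 45_000, 6_000, 55_000])
r, p = stats.pearsonr(vui, gdp_per_capita)
print(f"r = {r:.2f}, p = {p:.3f}")
```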
5395 Distributed Cost-Based Scheduling in Cloud Computing Environment
Authors: Rupali, Anil Kumar Jaiswal
Abstract:
Cloud computing can be defined as one of the prominent technologies that lets users change, configure and access services online. It can be said that this is a computing paradigm that helps users save cost and time; in practice, the use of cloud computing can be found in various fields like education, health and banking. Cloud computing is an internet-dependent technology, thus it is the major responsibility of Cloud Service Providers (CSPs) to take care of the data stored by users at data centers. Scheduling in a cloud computing environment plays a vital role, as cloud providers need to schedule resources effectively to achieve maximum utilization and user satisfaction. Job scheduling for cloud computing is analyzed in the following work, and CloudSim 3.0.3 is utilized to simulate the task calculations and the distributed scheduling methods. This research work discusses job scheduling for a distributed processing environment; by exploring this issue, we find that it works with minimum time and lower cost. In this work, two load balancing techniques have been employed, ‘Throttled stack adjustment policy’ and ‘Active VM load balancing policy’, with two brokerage services, ‘Advanced Response Time’ and ‘Reconfigure Dynamically’, to evaluate the VM_Cost, DC_Cost, Response Time, and Data Processing Time. The proposed techniques are compared with the Round Robin scheduling policy.
Keywords: Physical machines, virtual machines, support for repetition, self-healing, highly scalable programming model.
5394 Mining Frequent Patterns with Functional Programming
Authors: Nittaya Kerdprasop, Kittisak Kerdprasop
Abstract:
Frequent patterns are patterns, such as sets of features or items, that appear in data frequently. Finding such frequent patterns has become an important data mining task because it reveals associations, correlations, and many other interesting relationships hidden in a dataset. Most of the proposed frequent pattern mining algorithms have been implemented with imperative programming languages such as C, C++ and Java. The imperative paradigm is significantly inefficient when the itemset is large and the frequent patterns are long. We suggest a high-level declarative style of programming using a functional language. Our supposition is that the problem of frequent pattern discovery can be efficiently and concisely implemented via a functional paradigm, since pattern matching is a fundamental feature supported by most functional languages. Our frequent pattern mining implementation using the Haskell language confirms our hypothesis about the conciseness of the program. The performance studies on speed and memory usage support our intuition on the efficiency of the functional language.
Keywords: Association, frequent pattern mining, functional programming, pattern matching.
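A concise frequent-itemset count in a declarative style, given here in Python purely for illustration (the paper's implementation is in Haskell):

```python
# Count every candidate itemset of size <= 2 and keep the ones reaching the minimum support.
from itertools import chain, combinations
from collections import Counter

transactions = [
    {"milk", "bread", "butter"},
    {"milk", "bread"},
    {"bread", "butter"},
    {"milk", "butter"},
]
min_support = 2

candidates = chain.from_iterable(
    combinations(sorted(t), r) for t in transactions for r in (1, 2)
)
support = Counter(candidates)
frequent = {itemset: n for itemset, n in support.items() if n >= min_support}
print(frequent)
```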
5393 Using a Semantic Self-Organising Web Page-Ranking Mechanism for Public Administration and Education
Authors: Marios Poulos, Sozon Papavlasopoulos, V. S. Belesiotis
Abstract:
In the proposed method for Web page-ranking, a novel theoretic model is introduced and tested by examples of order relationships among IP addresses. Ranking is induced using a convexity feature, which is learned from these examples using a self-organizing procedure. We consider the problem of self-organizing learning from IP data to be represented by a semi-random convex polygon procedure, in which the vertices correspond to IP addresses. Based on recent developments in our regularization theory for convex polygons and the corresponding Euclidean-distance-based methods for classification, we develop an algorithmic framework for learning ranking functions based on computational geometric theory. We show that our algorithm is generic, and present experimental results explaining the potential of our approach. In addition, we explain the generality of our approach by showing its possible use as a visualization tool for data obtained from diverse domains, such as public administration and education.
Keywords: Computational geometry, education, e-Governance, Semantic Web.
5392 Sonic Therapeutic Intervention for Preventing Financial Fraud: A Phenomenological Study
Authors: Vasudev Das
Abstract:
The specific problem is that private and public organizational leaders often do not understand the importance of sonic therapeutic intervention in preventing financial fraud. The study aimed to explore sonic therapeutic intervention practitioners' lived experiences regarding the value of sonic therapeutic intervention in preventing financial fraud. The data collection methods were semi-structured interviews of purposeful samples and documentary reviews, which were analyzed thematically. Four themes emerged from the analysis of the interview transcription data: sonic therapeutic intervention enabled self-control, pro-spiritual values, a consequentiality mindset, and post-conventional consciousness. The four identified themes supported non-engagement in financial fraud. Implications for positive social change include enhanced financial fraud management, more significant financial leadership, and results-oriented decision-making in the financial market. Also, the study results can contribute to de-escalating the anxiety/stress associated with defrauding.
Keywords: consciousness, consequentiality, rehabilitation, reintegration
5391 Deployment of Beyond 4G Wireless Communication Networks with Carrier Aggregation
Authors: Bahram Khan, Anderson Rocha Ramos, Rui R. Paulo, Fernando J. Velez
Abstract:
With the growing demand for a new blend of applications, users' dependency on the internet is increasing day by day. Mobile internet users are paying more attention to their own experience, especially in terms of communication reliability, high data rates and service stability on the move. This increase in demand is causing saturation of the existing radio frequency bands. To address these challenges, researchers are investigating the best approaches; Carrier Aggregation (CA) is one of the newest innovations, which seems able to fulfill the demands of the future spectrum, and CA is also one of the most important features of Long Term Evolution-Advanced (LTE-Advanced). To meet the upcoming International Mobile Telecommunications-Advanced (IMT-Advanced) requirements (1 Gb/s peak data rate), the CA scheme presented by 3GPP sustains a high data rate using widespread frequency bandwidth of up to 100 MHz. Technical issues such as the aggregation structure, its implementations, deployment scenarios, control signaling techniques, and challenges of the CA technique in LTE-Advanced, with consideration of backward compatibility, are highlighted in this paper. In addition, a performance evaluation in macro-cellular scenarios through a simulation approach is presented, which shows the benefits of applying CA and low-complexity multi-band schedulers to service quality and system capacity enhancement, and concludes that the enhanced multi-band scheduler is less complex than the general multi-band scheduler and performs better for cell radii longer than 1800 m (and a PLR threshold of 2%).
Keywords: Component carrier, carrier aggregation, LTE-Advanced, scheduling, spectrum management.
5390 A Design of Elliptic Curve Cryptography Processor Based on SM2 over GF(p)
Authors: Shiji Hu, Lei Li, Wanting Zhou, Daohong Yang
Abstract:
Data encryption is the foundation of today’s communication. On this basis, improving the speed of data encryption and decryption is always an important goal for high-speed applications. This paper proposes an elliptic curve crypto processor architecture based on the SM2 prime field. Regarding the hardware implementation, we optimized the algorithms in different stages of the structure. For the modular operations on the finite field, we propose an optimized improvement of the Karatsuba-Ofman multiplication algorithm and shorten the critical path through a pipeline structure in the algorithm implementation. Based on the SM2 recommended prime field, a fast modular reduction algorithm is used to reduce the 512-bit data obtained from the multiplication unit. The radix-4 extended Euclidean algorithm is used to realize the conversion between the affine coordinate system and the Jacobian projective coordinate system. For the parallel scheduling of point operations on elliptic curves, we propose a three-level parallel structure for point addition and point doubling based on the Jacobian projective coordinate system. Combined with the scalar multiplication algorithm, we add mutual pre-operations to the point addition and point doubling operations to improve the efficiency of scalar multiplication. The proposed ECC hardware architecture was verified and implemented on Xilinx Virtex-7 and ZYNQ-7 platforms, and each 256-bit scalar multiplication operation takes 0.275 ms. The performance for handling scalar multiplication is 32 times that of a CPU (dual-core ARM Cortex-A9).
Keywords: Elliptic curve cryptosystems, SM2, modular multiplication, point multiplication.
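An educational sketch of the scalar multiplication such a core accelerates, using a small toy curve over GF(17) and plain affine double-and-add (not the SM2 parameters or the paper's parallel Jacobian-coordinate design):

```python
# Toy curve y^2 = x^3 + 2x + 2 over GF(17) with generator G = (5, 1).
P_MOD, A, B = 17, 2, 2
G = (5, 1)

def inv(x):
    return pow(x, P_MOD - 2, P_MOD)          # modular inverse via Fermat's little theorem

def point_add(P, Q):
    if P is None: return Q
    if Q is None: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return None                                        # point at infinity
    if P == Q:
        lam = (3 * x1 * x1 + A) * inv(2 * y1) % P_MOD      # tangent slope (doubling)
    else:
        lam = (y2 - y1) * inv(x2 - x1) % P_MOD             # chord slope (addition)
    x3 = (lam * lam - x1 - x2) % P_MOD
    y3 = (lam * (x1 - x3) - y1) % P_MOD
    return (x3, y3)

def scalar_mult(k, P):
    """Left-to-right double-and-add: the operation an ECC core spends its time on."""
    R = None
    for bit in bin(k)[2:]:
        R = point_add(R, R)                  # double
        if bit == "1":
            R = point_add(R, P)              # add
    return R

print([scalar_mult(k, G) for k in range(1, 5)])  # 1G..4G on the toy curve
```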
5389 Value Analysis of Islamic Banking and Conventional Banking to Measure Value Co-creation
Authors: Amna Javed, Hisashi Masuda, Youji Kohda
Abstract:
This study examines value analysis in Islamic and conventional banking services in Pakistan. Many scholars have focused on the co-creation of value in services, but mainly on economic rather than non-economic values.
Keywords: Economic values, Islamic banking, Non-economic values, Value system.
5388 Heat Transfer Enhancement Studies in a Circular Tube Fitted with Right-Left Helical Inserts with Spacer
Authors: P. K. Nagarajan, P. Sivashanmugam
Abstract:
An experimental investigation of the heat transfer and friction factor characteristics of a circular tube fitted with 300 right-left helical screw inserts with a 100 mm spacer, for different twist ratios, is presented for laminar and turbulent flow. The experimental data obtained were compared with published plain tube data. The heat transfer coefficient enhancement for 300 R-L inserts with a 100 mm spacer is quite comparable with that for 300 R-L inserts. A performance evaluation analysis has been made, and it was found that the performance ratio increases with increasing Reynolds number and decreasing twist ratio, with the maximum at a twist ratio of 2.93. Also, a performance ratio of more than one indicates that this type of twisted insert can be used effectively for heat transfer augmentation.
Keywords: Heat transfer augmentation, right-left helical screw inserts with spacer, twist ratio, heat transfer.
5387 Fracture Pressure Predict Based on Well Logs of Depleted Reservoir in Southern Iraqi Oilfield
Authors: Raed H. Allawi
Abstract:
Fracture pressure is the main parameter applied in well design and is used to avoid drilling problems like lost circulation. Thus, this study aims to predict the fracture pressure of oil reservoirs in a southern Iraqi oilfield. The data required to implement this study included bulk density, compressional wave velocity, gamma ray, and leak-off tests. In addition, the model is based on pore pressure measured with the Modular Formation Dynamics Tester (MDT). Many measured values of pore pressure were used to validate the model. Using sonic velocity approaches, the mean absolute percentage error (MAPE) was about 4%. The fracture pressure results were consistent with the measured data and with actual drilling reports and events. The model's results will be a guide for successful drilling of future wells in the same oilfield.
Keywords: Pore pressure, fracture pressure, overburden pressure, effective stress, drilling events.
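A generic sketch of a fracture-pressure workflow of this kind; the Eaton-type relation, Poisson's ratio and all log values are assumptions for illustration, not the paper's correlation or data:

```python
# Integrate bulk density to get overburden stress, combine it with pore pressure through an
# Eaton-type relation, and compare the prediction against leak-off test points with MAPE.
import numpy as np

g = 9.81                                   # m/s^2
depth = np.arange(0, 3000.0, 10.0)         # m
rho_b = 2300.0 + 0.1 * depth               # synthetic bulk density log, kg/m^3

overburden = np.cumsum(rho_b * g * 10.0)   # Pa, running sum of rho*g*dz (rectangle rule)
pore_pressure = 1000.0 * g * depth         # hydrostatic pore pressure as a simple baseline

nu = 0.25                                  # Poisson's ratio (assumed)
k = nu / (1.0 - nu)
fracture_pressure = k * (overburden - pore_pressure) + pore_pressure

def mape(pred, meas):
    return 100.0 * np.mean(np.abs((pred - meas) / meas))

# Compare against two made-up leak-off test values at 2000 m and 2500 m:
idx = [200, 250]
lot = np.array([30.0e6, 37.0e6])           # Pa
print("MAPE vs LOT points: %.1f %%" % mape(fracture_pressure[idx], lot))
```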
5386 A Real Time Collision Avoidance Algorithm for Mobile Robot based on Elastic Force
Authors: Kyung Hyun Choi, Minh Ngoc Nong, M. Asif Ali Rehmani
Abstract:
This paper proposes a modified Elastic Strip method for a mobile robot to avoid obstacles in real time in an uncertain environment. The method deals with the problem of a robot driving from an initial position to a target position based on elastic force and potential field force. To avoid obstacles, the robot has to modify its trajectory based on signals received from the sensor system at each sampling time. It was evident that, by combining the modified Elastic Strip with a pseudomedian filter to process the nonlinear sensor data, uncertainties in the data received from the sensor system can be reduced. Simulations and experiments with these methods were carried out.
Keywords: Collision avoidance, obstacle avoidance, Elastic Strip, real-time collision avoidance.
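A simple sketch of the force-based idea using a standard attractive/repulsive potential-field step (assumed for illustration; not the authors' modified Elastic Strip formulation):

```python
# One sampling-time update: move along the attractive force toward the goal plus repulsive
# forces away from obstacles inside an influence radius.
import numpy as np

def step(robot, goal, obstacles, k_att=1.0, k_rep=2.0, influence=2.0, dt=0.1):
    force = k_att * (goal - robot)                       # attractive component
    for obs in obstacles:
        diff = robot - obs
        d = np.linalg.norm(diff)
        if 0.0 < d < influence:
            force += k_rep * (1.0 / d - 1.0 / influence) / d**2 * (diff / d)
    return robot + dt * force

robot = np.array([0.0, 0.0])
goal = np.array([5.0, 5.0])
obstacles = [np.array([2.5, 2.4])]
for _ in range(80):
    robot = step(robot, goal, obstacles)
print(np.round(robot, 2))   # the robot should end up near the goal after skirting the obstacle
```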
5385 Motions of Multiple Objects Detection Based On Video Frames
Authors: Khin Thandar Lwin, Than Htike, Zaw Min Naing
Abstract:
This paper introduces an intelligent system which can be applied to the monitoring of vehicle speed using a single camera. The ability to track motion is extremely useful in many automation problems, and the solution to this problem will open up many future applications. One of the most common problems in our daily life is the detection of vehicle speeds on a highway. In this paper, a novel technique is developed to track multiple moving objects, with their speeds estimated from a sequence of video frames. A field test was conducted to capture real-life data, and the processed results are presented. Multiple-object problems and noise in the data are also considered. Implementing this system in real time is straightforward. The proposal can accurately evaluate the position and orientation of moving objects in real time. The transformations and calibration between the 2D image and the actual road are also considered.
Keywords: Motion Estimation, Image Analyses, Speed Detection
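A back-of-the-envelope sketch of speed estimation from two frames; the single metres-per-pixel calibration is an assumed simplification of the image-to-road transformation discussed above:

```python
# Estimate vehicle speed from the pixel displacement of a tracked centroid between frames.
frame_rate = 25.0          # frames per second
metres_per_pixel = 0.05    # from calibrating known road markings (assumed)

centroid_frame_1 = (120, 340)   # (x, y) pixel position of the tracked vehicle
centroid_frame_2 = (131, 336)

dx = centroid_frame_2[0] - centroid_frame_1[0]
dy = centroid_frame_2[1] - centroid_frame_1[1]
pixels_moved = (dx**2 + dy**2) ** 0.5

speed_mps = pixels_moved * metres_per_pixel * frame_rate
print(f"estimated speed: {speed_mps:.1f} m/s ({speed_mps * 3.6:.0f} km/h)")
```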