Search results for: multiple data stores
7061 School Design and Energy Efficiency
Authors: B. Su
Abstract:
Auckland has a temperate climate with comfortable, warm, dry summers and mild, wet winters. An Auckland school normally does not need air conditioning for cooling during the summer and only needs heating during the winter. Space heating is the major portion of winter school energy consumption, and winter energy consumption is the major portion of annual school energy consumption. School building thermal design should therefore focus on winter thermal performance in order to reduce space heating energy. Design data and energy consumption data from a number of Auckland schools are used for this study. This pilot study investigates the relationships between their energy consumption data and school building design data to improve future school design for energy efficiency.
Keywords: Building energy efficiency, building thermal performance, school building design, school energy consumption.
7060 Comparison between Higher-Order SVD and Third-order Orthogonal Tensor Product Expansion
Authors: Chiharu Okuma, Jun Murakami, Naoki Yamamoto
Abstract:
In digital signal processing it is important to approximate multi-dimensional data by a method called rank reduction, in which the rank of multi-dimensional data is reduced from higher to lower. For 2-dimensional data, singular value decomposition (SVD) is one of the best-known rank reduction techniques. In addition, an outer product expansion extended from SVD was proposed and implemented for multi-dimensional data, and it has been widely applied to image processing and pattern recognition. However, the multi-dimensional outer product expansion has high computational complexity and no orthogonality between the expansion terms. We have therefore proposed an alternative method, the Third-order Orthogonal Tensor Product Expansion (3-OTPE). 3-OTPE uses the power method instead of a nonlinear optimization method in order to decrease computing time. At the same time, the group of L. De Lathauwer proposed Higher-Order SVD (HOSVD), which is also developed as an SVD extension for multi-dimensional data. 3-OTPE and HOSVD are similar in their approach to the rank reduction of multi-dimensional data. Using these two methods we can obtain computation results that are sometimes the same and sometimes slightly different. In this paper, we compare 3-OTPE to HOSVD in terms of calculation accuracy and computing time, and clarify the difference between these two methods.
Keywords: Singular value decomposition (SVD), higher-order SVD (HOSVD), higher-order tensor, outer product expansion, power method.
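As background for the rank-reduction idea that both 3-OTPE and HOSVD generalize, the sketch below truncates an ordinary matrix SVD to rank r with NumPy. It is an illustration of 2-dimensional rank reduction only, not the authors' tensor code; the matrix and rank are made up.

```python
import numpy as np

def truncated_svd(X, r):
    """Best rank-r approximation of a 2-D array X (Eckart-Young theorem)."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]

# Example: reduce a random 6x4 matrix to rank 2 and check the approximation error.
X = np.random.rand(6, 4)
X2 = truncated_svd(X, 2)
print(np.linalg.norm(X - X2))
```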
7059 A Novel QoS Optimization Architecture for 4G Networks
Authors: Aaqif Afzaal Abbasi, Javaid Iqbal, Akhtar Nawaz Malik
Abstract:
4G communication networks provide heterogeneous wireless technologies to mobile subscribers through IP-based networks, and users can enjoy high-speed access while roaming across multiple wireless channels; this is made possible by an organized way of managing the Quality of Service (QoS) functionalities in these networks. This paper proposes a novel QoS optimization architecture that judges user requirements and, by knowing the peak times of service utilization, can save bandwidth and cost. The proposed architecture can be customized according to the network usage priorities so as to considerably improve a network's QoS performance.
Keywords: QoS, Network Coverage Boundary, Services Archives Units (SAU), Cumulative Services Archives Units (CSAU).
7058 Low Power Approach for Decimation Filter Hardware Realization
Authors: Kar Foo Chong, Pradeep K. Gopalakrishnan, T. Hui Teo
Abstract:
There are multiple ways to implement a decimation filter. This paper addresses the use of a CIC (cascaded integrator-comb) filter and an HB (half-band) filter as the decimation filter to reduce the sample rate by a factor of 64, and details the implementation steps to realize this design in hardware. A low-power design approach for the CIC filter and the half-band filter is discussed. The filter design is implemented through MATLAB system modeling and an ASIC (application-specific integrated circuit) design flow, and verified using an FPGA (field-programmable gate array) board and MATLAB analysis.
Keywords: CIC filter, decimation filter, half-band filter, low power.
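To make the CIC stage concrete, here is a minimal software model of an N-stage CIC decimator (integrators, downsampling, then combs). It is a behavioural sketch in NumPy, not the authors' hardware realization, and the order, decimation factor, and test signal are assumed for illustration.

```python
import numpy as np

def cic_decimate(x, R=8, N=3):
    """Model of an N-stage CIC decimator with decimation factor R.
    Gain is R**N; a real hardware design would also scale/truncate the output."""
    y = np.asarray(x, dtype=np.int64)
    for _ in range(N):          # N cascaded integrators (running sums)
        y = np.cumsum(y)
    y = y[::R]                  # downsample by R
    for _ in range(N):          # N cascaded combs (first differences at the low rate)
        y = np.diff(y, prepend=0)
    return y

# Example: decimate a slow sine sampled at 64 kHz down to 8 kHz.
fs = 64000
t = np.arange(fs) / fs
x = (1000 * np.sin(2 * np.pi * 100 * t)).astype(np.int64)
print(cic_decimate(x, R=8, N=3)[:5])
```

The remaining decimation stages and the half-band filtering (for example designed with scipy.signal.firwin) would follow the same pattern at the lower rates.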
7057 Electron-Impact Excitation of Kr 5s, 5p Levels
Authors: Alla A. Mityureva
Abstract:
The available data on the cross sections of electron-impact excitation of the krypton 5s and 5p configuration levels out of the ground state are represented in a convenient and compact form. The results are obtained by regression through all known published data related to this process.
Keywords: Cross section, electron excitation, krypton, regression.
7056 Seamless Flow of Voluminous Data in High Speed Network without Congestion Using Feedback Mechanism
Abstract:
Continuously growing needs for Internet applications that transmit massive amounts of data have led to the emergence of high-speed networks. Data transfer must take place without congestion, so feedback parameters must be sent from the receiver to the sender in order to restrict the sending rate. Although TCP tries to avoid congestion by restricting the sending rate and window size, it never informs the sender of the capacity of data that can be sent, and it halves the window size at the time of congestion, resulting in decreased throughput, low utilization of the bandwidth and maximum delay. In this paper, the XCP protocol is used and feedback parameters are calculated based on arrival rate, service rate, traffic rate and queue size; the receiver then informs the sender about the throughput, the capacity of data to be sent and the window size adjustment. As a result there is no drastic decrease in window size and a better increase in sending rate, so data flow continuously without congestion, giving maximum throughput, high utilization of the bandwidth and minimum delay. The results of the proposed work are presented as graphs of throughput, delay and window size. Thus the XCP protocol is illustrated and the various parameters are thoroughly analyzed and presented.
Keywords: Bandwidth-Delay Product, Congestion Control, Congestion Window, TCP/IP.
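For intuition, the sketch below computes an XCP-style aggregate feedback from spare capacity and queue size. The formula, gain constants and variable names are taken from the published XCP design as an assumption; they are not this paper's implementation, and the example numbers are invented.

```python
def aggregate_feedback(capacity_bps, arrival_rate_bps, queue_bytes, rtt_s,
                       alpha=0.4, beta=0.226):
    """XCP-style aggregate feedback per control interval (bytes):
    reward spare bandwidth, penalize a standing queue."""
    spare_bytes_per_s = (capacity_bps - arrival_rate_bps) / 8.0
    return alpha * rtt_s * spare_bytes_per_s - beta * queue_bytes

# Example: a 100 Mbit/s link, 80 Mbit/s arriving, 30 kB queued, 50 ms average RTT.
phi = aggregate_feedback(100e6, 80e6, 30_000, 0.05)
print(f"aggregate feedback: {phi:.0f} bytes per interval")
```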
7055 Ginzburg-Landau Model for Curved Two-Phase Shallow Mixing Layers
Authors: Irina Eglite, Andrei A. Kolyshkin
Abstract:
The method of multiple scales is used in this paper to derive an amplitude evolution equation for the most unstable mode from the two-dimensional shallow water equations under the rigid-lid assumption. It is assumed that the shallow mixing layer is slightly curved in the longitudinal direction and contains small particles. Dynamic interaction between the carrier fluid and the particles is neglected. It is shown that the evolution equation is the complex Ginzburg-Landau equation. Explicit formulas for the computation of the coefficients of the equation are obtained.
Keywords: Shallow water equations, mixing layer, weakly nonlinear analysis, Ginzburg-Landau equation.
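For reference, the complex Ginzburg-Landau equation for the slowly varying amplitude A of the most unstable mode typically takes the generic form below; the paper's explicit formulas for the complex coefficients are not reproduced here.

```latex
\frac{\partial A}{\partial \tau}
  = \sigma A
  + \delta \frac{\partial^{2} A}{\partial \xi^{2}}
  - \mu \lvert A \rvert^{2} A ,
\qquad \sigma,\ \delta,\ \mu \in \mathbb{C},
```

where τ and ξ are the slow time and space variables of the multiple-scales expansion.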
7054 Eco-Connectivity: Sustainable Practices in Telecom Networks Using Big Data
Authors: Tharunika Sridhar
Abstract:
This paper addresses sustainable eco-connectivity within the telecommunications sector, studying its importance for tackling contemporary challenges and data regulation issues. The paper also investigates the role of Big Data and its integration in this context, specific to the telecom industry. A major focus area is the examination of the pathways explored, namely state-of-the-art ecological infrastructure solutions and sector-led measures derived from expert analyses and reviews. Additionally, the paper analyses critical factors involving cost-effective route planning and the development of green telecommunications infrastructure, which adds qualitative reasoning to the research idea. Furthermore, the study discusses in detail a potential green roadmap towards sustainability by exploring green routing software, eco-friendly infrastructure and other eco-focused initiatives. The paper also addresses the specific needs of the telecommunications sector by focusing on a targeted, select range of telecom environments.
Keywords: Big Data, telecom, sustainable telecom sector, telecom networks.
7053 Comparison of Irradiance Decomposition and Energy Production Methods in a Solar Photovoltaic System
Authors: Tisciane Perpetuo e Oliveira, Dante Inga Narvaez, Marcelo Gradella Villalva
Abstract:
Installations of solar photovoltaic systems have increased considerably in the last decade. It has therefore become important to monitor meteorological data (solar irradiance, air temperature, wind velocity, etc.) in order to predict the solar energy production potential of a given geographical area. In this sense, the present work compares two computational tools that are capable of estimating the energy generation of a photovoltaic system through correlation analyses of solar radiation data: the PVsyst software and an algorithm based on the PVlib package implemented in MATLAB. To achieve this objective, it was necessary to obtain solar radiation data (measured and from a solarimetric database), analyze the decomposition of global solar irradiance into direct normal and horizontal diffuse components, and model the devices of a photovoltaic system (solar modules and inverters) for energy production calculations. Simulated results were compared with experimental data in order to evaluate the performance of the studied methods. Errors in the estimation of energy production were less than 30% for the MATLAB algorithm and less than 20% for the PVsyst software.
Keywords: Energy production, meteorological data, irradiance decomposition, solar photovoltaic system.
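As an illustration of the irradiance decomposition step, the sketch below uses the Python pvlib package (the paper used the MATLAB PVLib toolbox) to split global horizontal irradiance into direct normal and diffuse components with the Erbs model. The location, timestamp and GHI value are made up, and Erbs is only one of several decomposition models such tools offer.

```python
import pandas as pd
import pvlib

# Hypothetical site and timestamp, plus one measured GHI value (W/m^2).
times = pd.DatetimeIndex(["2019-01-15 12:00"], tz="America/Sao_Paulo")
ghi = pd.Series([850.0], index=times)

solpos = pvlib.solarposition.get_solarposition(times, latitude=-22.9, longitude=-47.06)
decomp = pvlib.irradiance.erbs(ghi, solpos["zenith"], times)

print(decomp["dni"])   # direct normal irradiance estimate
print(decomp["dhi"])   # diffuse horizontal irradiance estimate
```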
7052 The Effect of Insurance on Foreign Direct Investments Inflow to Nigeria
Authors: Chimaobi V. Okolo, Afamefuna J. Ani, Ebere U. Okolo
Abstract:
This paper seeks to assess the implications of insurance for foreign direct investment inflow in Nigeria. A multiple linear regression technique and a correlation matrix test were employed to measure the extent to which foreign direct investment was influenced. The results showed that insurance premium (IP), the asset size of the insurance industry (AS), and the total investment of the industry (TI) impacted significantly and positively on foreign direct investment inflow in Nigeria. There should be an effective risk transfer mechanism and financial intermediation, which give investors confidence in the risk management strength of the host country.
Keywords: Foreign direct investment, insurance.
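A multiple linear regression of this kind can be sketched as follows; the CSV file, the column names and the data are hypothetical stand-ins for the Nigerian series the authors used, and statsmodels is only one of several packages that would serve.

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical annual data: FDI inflow and the three insurance-sector regressors.
df = pd.read_csv("nigeria_fdi_insurance.csv")   # columns: FDI, IP, AS, TI (assumed)

X = sm.add_constant(df[["IP", "AS", "TI"]])     # add intercept term
model = sm.OLS(df["FDI"], X).fit()

print(model.summary())                          # coefficients, t-stats, R^2
print(df[["FDI", "IP", "AS", "TI"]].corr())     # correlation matrix
```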
7051 Automating Test Activities: Test Cases Creation, Test Execution, and Test Reporting with Multiple Test Automation Tools
Authors: Loke Mun Sei
Abstract:
Software testing has become a mandatory process for assuring software product quality. Hence, test management is needed in order to manage the test activities conducted in the software test life cycle. This paper discusses the challenges faced in the software test life cycle, and how the test processes and test activities, mainly test case creation, test execution, and test reporting, are managed and automated using several test automation tools, i.e. Jira, Robot Framework, and Jenkins.
Keywords: Test automation tools, test case, test execution, test reporting.
7050 Cloud Computing Cryptography "State-of-the-Art"
Authors: Omer K. Jasim, Safia Abbas, El-Sayed M. El-Horbaty, Abdel-Badeeh M. Salem
Abstract:
Cloud computing technology is very useful in present-day life; it uses the internet and central remote servers to provide and maintain data as well as applications. Such applications can in turn be used by end users via cloud communications without any installation. Moreover, the end users' data files can be accessed and manipulated from any other computer using internet services. Despite the flexibility of data and application access and usage that cloud computing environments provide, many questions remain about how to build a trusted environment that protects data and applications in the cloud from hackers and intruders. This paper surveys the key generation and management mechanisms and the encryption/decryption algorithms used in cloud computing environments, and we propose a new security architecture for the cloud computing environment that considers the various security gaps as much as possible. A new cryptographic environment that applies quantum mechanics in order to obtain more trusted cloud communications with less computation is given.
Keywords: Cloud Computing, Cloud Encryption Model, Quantum Key Distribution.
7049 Deep iCrawl: An Intelligent Vision-Based Deep Web Crawler
Authors: R. Anita, V. Ganga Bharani, N. Nityanandam, Pradeep Kumar Sahoo
Abstract:
The explosive growth of the World Wide Web has posed a challenging problem in extracting relevant data. Traditional web crawlers focus only on the surface web, while the deep web keeps expanding behind the scenes. Deep web pages are created dynamically as a result of queries posed to specific web databases. The structure of deep web pages makes it impossible for traditional web crawlers to access deep web content. This paper presents Deep iCrawl, a novel vision-based approach for extracting data from the deep web. Deep iCrawl splits the process into two phases: the first includes query analysis and query translation, and the second covers the vision-based extraction of data from the dynamically created deep web pages. There are several established approaches for the extraction of deep web pages, but the proposed method aims at overcoming their inherent limitations. The paper also compares the extracted data items and presents them in the required order.
Keywords: Crawler, Deep web, Web Database.
7048 Searchable Encryption in Cloud Storage
Authors: Ren-Junn Hwang, Chung-Chien Lu, Jain-Shing Wu
Abstract:
Cloud outsourced storage is one of the important services in cloud computing. Cloud users upload data to cloud servers to reduce the cost of managing data and maintaining hardware and software. To ensure data confidentiality, users can encrypt their files before uploading them to a cloud system. However, retrieving the target file exactly from the encrypted files is difficult for the cloud server. This study proposes a protocol for performing multi-keyword searches over encrypted cloud data by applying k-nearest neighbor technology. The protocol ranks the relevance scores of encrypted files and keywords, and prevents cloud servers from learning the search keywords submitted by a cloud user. To reduce file transfer communication costs, the cloud server returns encrypted files in order of relevance. Moreover, when a cloud user inputs an incorrect keyword and the number of wrong letters does not exceed a given threshold, the user can still retrieve the target files from the cloud server. In addition, the proposed scheme satisfies the security requirements for outsourced data storage.
Keywords: Fault-tolerance search, multi-keywords search, outsource storage, ranked search, searchable encryption.
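To illustrate the ranked multi-keyword retrieval that the protocol performs over encrypted data, the sketch below shows only the plaintext analogue: TF-IDF vectors scored with a k-nearest-neighbour search. The encryption, trapdoor generation and fault-tolerant keyword matching of the actual scheme are not represented, and the documents and query are invented.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neighbors import NearestNeighbors

# Hypothetical plaintext documents standing in for the (encrypted) cloud files.
docs = [
    "cloud storage cost report",
    "encryption keys and key management",
    "ranked search over outsourced storage",
]
vec = TfidfVectorizer()
doc_vectors = vec.fit_transform(docs)

# Multi-keyword query, ranked by nearest-neighbour distance in TF-IDF space.
knn = NearestNeighbors(n_neighbors=2, metric="cosine").fit(doc_vectors)
dist, idx = knn.kneighbors(vec.transform(["ranked storage search"]))
for d, i in zip(dist[0], idx[0]):
    print(f"doc {i} (distance {d:.3f}): {docs[i]}")
```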
7047 Unstructured-Data Content Search Based on Optimized EEG Signal Processing and Multi-Objective Feature Extraction
Authors: Qais M. Yousef, Yasmeen A. Alshaer
Abstract:
Over the last few years, the amount of data available around the globe has increased rapidly. This has come with the emergence of recent concepts, such as big data and the Internet of Things, which have furnished a suitable solution for the availability of data all over the world. However, managing this massive amount of data remains a challenge due to the large variety of types and the wide distribution of the data. Therefore, locating the required file on the first attempt has become a difficult task, due to the large similarity of names for different files distributed on the web, and the accuracy and speed of search have consequently been negatively affected. This work presents a method that uses electroencephalography (EEG) signals to locate files based on their contents. Building on the concept of natural mind-wave processing, this work analyses the mind-wave signals of different people, extracting their most appropriate features using a multi-objective metaheuristic algorithm and then classifying them using an artificial neural network to distinguish among files with similar names. The aim of this work is to provide the ability to find files based on their contents using human thoughts only. Implementing this approach and testing it on real people proved its ability to find the desired files accurately within a noticeably shorter time and retrieve them as the first choice for the user.
Keywords: Artificial intelligence, data contents search, human active memory, mind wave, multi-objective optimization.
7046 A Modified AES Based Algorithm for Image Encryption
Authors: M. Zeghid, M. Machhout, L. Khriji, A. Baganne, R. Tourki
Abstract:
With the fast evolution of digital data exchange, the security of information becomes very important in data storage and transmission. Due to the increasing use of images in industrial processes, it is essential to protect confidential image data from unauthorized access. In this paper, we analyze the Advanced Encryption Standard (AES), and we add a key stream generator (A5/1, W7) to AES to improve the encryption performance, mainly for images characterised by reduced entropy. The implementation of both techniques has been realized for experimental purposes. Detailed results in terms of security analysis and implementation are given. A comparative study with traditional encryption algorithms shows the superiority of the modified algorithm.
Keywords: Cryptography, Encryption, Advanced Encryption Standard (AES), ECB mode, statistical analysis, key stream generator.
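The sketch below illustrates the general idea of combining a keystream with AES: a toy linear-feedback shift register (standing in for A5/1 or W7, which are not reproduced here) whitens the plaintext before AES-ECB encryption. It is a structural illustration, not the authors' algorithm; the key, seed and data are invented.

```python
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def lfsr_keystream(seed, nbytes, taps=(0, 1, 3, 5)):
    """Toy 16-bit LFSR keystream generator (illustrative only, NOT A5/1 or W7)."""
    state = seed & 0xFFFF
    out = bytearray()
    for _ in range(nbytes):
        byte = 0
        for _ in range(8):
            bit = 0
            for t in taps:
                bit ^= (state >> t) & 1
            state = ((state >> 1) | (bit << 15)) & 0xFFFF
            byte = (byte << 1) | (state & 1)
        out.append(byte)
    return bytes(out)

def encrypt_block_with_keystream(plaintext16, key, seed=0xACE1):
    """XOR one 16-byte block with the keystream, then AES-ECB encrypt it."""
    ks = lfsr_keystream(seed, len(plaintext16))
    whitened = bytes(p ^ k for p, k in zip(plaintext16, ks))
    enc = Cipher(algorithms.AES(key), modes.ECB()).encryptor()
    return enc.update(whitened) + enc.finalize()

key = os.urandom(16)
print(encrypt_block_with_keystream(b"low entropy img ", key).hex())
```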
7045 Positive Solutions for Semipositone Discrete Eigenvalue Problems via Three Critical Points Theorem
Authors: Benshi Zhu
Abstract:
In this paper, multiple positive solutions for semipositone discrete eigenvalue problems are obtained by using a three critical points theorem for nondifferentiable functionals.
Keywords: Discrete eigenvalue problems, positive solutions, semipositone, three critical points theorem.
7044 Incremental Mining of Shocking Association Patterns
Authors: Eiad Yafi, Ahmed Sultan Al-Hegami, M. A. Alam, Ranjit Biswas
Abstract:
Association rule mining is an important problem in data mining. The massively increasing volume of data in real-life databases has motivated researchers to design novel and incremental algorithms for association rule mining. In this paper, we propose an incremental association rule mining algorithm that integrates a shocking interestingness criterion during the process of building the model. A new interestingness measure, called the shocking measure, is introduced. One of the main features of the proposed approach is to capture the user background knowledge, which is monotonically augmented. The incremental model that reflects the changing data and the user beliefs is attractive in order to make the overall KDD process more effective and efficient. We implemented the proposed approach and experimented with it on some public datasets, and found the results quite promising.
Keywords: Knowledge discovery in databases (KDD), Data mining, Incremental Association rules, Domain knowledge, Interestingness, Shocking rules (SHR).
7043 Zero Inflated Strict Arcsine Regression Model
Authors: Y. N. Phang, E. F. Loh
Abstract:
The zero-inflated strict arcsine model is a newly developed model that has been found appropriate for modeling overdispersed count data. In this study, we extend the zero-inflated strict arcsine model to a zero-inflated strict arcsine regression model by taking into consideration the extra variability caused by extra zeros and covariates in count data. The maximum likelihood estimation method is used to estimate the parameters of this zero-inflated strict arcsine regression model.
Keywords: Overdispersed count data, maximum likelihood estimation, simulated annealing.
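For context, a zero-inflated count model mixes a point mass at zero with a count distribution f(y; θ), here the strict arcsine distribution (whose probability mass function is not reproduced); in the regression extension the zero-inflation probability and the count parameter are linked to covariates. A generic statement of the mixture is:

```latex
P(Y_i = 0) = \pi_i + (1 - \pi_i)\, f(0;\, \theta_i), \qquad
P(Y_i = y) = (1 - \pi_i)\, f(y;\, \theta_i), \quad y = 1, 2, \dots,
```

with, for example, $\operatorname{logit}(\pi_i) = \mathbf{z}_i^{\top}\boldsymbol{\gamma}$ and $\log(\theta_i) = \mathbf{x}_i^{\top}\boldsymbol{\beta}$ as assumed link functions.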
7042 Comparison of Different k-NN Models for Speed Prediction in an Urban Traffic Network
Authors: Seyoung Kim, Jeongmin Kim, Kwang Ryel Ryu
Abstract:
We use a database that records average traffic speeds measured at five-minute intervals for all the links in the traffic network of a metropolitan city. While models learned from these data that can predict future traffic speeds would be beneficial for applications such as car navigation systems, building predictive models for every link becomes a nontrivial job if the number of links in a given network is huge. An advantage of adopting k-nearest neighbor (k-NN) as the predictive model is that it does not require any explicit model building. Instead, k-NN takes a long time to make a prediction because it needs to search for the k nearest neighbors in the database at prediction time. In this paper, we investigate how much we can speed up k-NN in making traffic speed predictions by reducing the amount of data to be searched, without a significant sacrifice of prediction accuracy. The rationale is that it may suffice to look only at recent data, because traffic patterns not only repeat daily or weekly but also change over time. In our experiments, we build several different k-NN models employing different sets of features, namely the current and past traffic speeds of the target link and of the neighboring links up- and downstream. The performances of these models are compared by measuring the average prediction accuracy and the average time taken to make a prediction using various amounts of data.
Keywords: Big data, k-NN, machine learning, traffic speed prediction.
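A minimal version of such a k-NN speed predictor, restricted to a recent window of the history, might look like the sketch below. The feature layout (current and lagged speeds of a link and one neighbor) and all data are synthetic, and scikit-learn's KNeighborsRegressor stands in for whatever k-NN implementation the authors used.

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(0)

# Synthetic history: rows = 5-minute intervals,
# columns = [target link speed, lag-1 speed, upstream neighbor speed].
X_hist = rng.uniform(20, 80, size=(5000, 3))
# Synthetic label standing in for the speed 15 minutes ahead.
y_hist = X_hist[:, 0] + rng.normal(0, 3, size=5000)

# Search only the most recent part of the history to cut prediction time.
recent = slice(-1000, None)
knn = KNeighborsRegressor(n_neighbors=5).fit(X_hist[recent], y_hist[recent])

x_now = np.array([[55.0, 57.0, 49.0]])     # current feature vector for the target link
print(knn.predict(x_now))                  # predicted speed for the next horizon
```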
7041 GA Based Optimal Feature Extraction Method for Functional Data Classification
Authors: Jun Wan, Zehua Chen, Yingwu Chen, Zhidong Bai
Abstract:
Classification is an interesting problem in functional data analysis (FDA), because many science and application problems end up as classification problems, such as recognition, prediction, control, decision making, management, etc. Because of the high dimension and high correlation of functional data (FD), a key problem is to extract features from FD while keeping its global character, which strongly affects classification efficiency and precision. In this paper, a novel automatic method that combines a Genetic Algorithm (GA) and a classification algorithm to extract classification features is proposed. In this method, the optimal features and classification model are approached step by step via evolutionary search. Theoretical analysis and experimental tests show that this method improves classification efficiency, precision and robustness while using fewer features, and that the dimension of the extracted classification features can be controlled.
Keywords: Classification, functional data, feature extraction, genetic algorithm, wavelet.
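The GA-plus-classifier loop described here can be sketched generically as a bit-mask feature selection wrapped around any classifier. The sketch below uses synthetic data, logistic regression and scikit-learn cross-validation as stand-ins, so it illustrates the search structure rather than the paper's wavelet-based FDA pipeline.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)

# Stand-in data: 60 correlated features per sample (real FD would supply
# wavelet coefficients of curves, as in the paper's setting).
X, y = make_classification(n_samples=200, n_features=60, n_informative=8, random_state=0)

def fitness(mask):
    """Cross-validated accuracy of a classifier on the selected feature subset."""
    if mask.sum() == 0:
        return 0.0
    clf = LogisticRegression(max_iter=500)
    return cross_val_score(clf, X[:, mask.astype(bool)], y, cv=3).mean()

pop = rng.integers(0, 2, size=(20, X.shape[1]))            # random bit-mask population
for gen in range(15):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-10:]]                 # keep the 10 fittest
    children = []
    for _ in range(10):                                     # one-point crossover + mutation
        a, b = parents[rng.integers(0, 10, size=2)]
        cut = rng.integers(1, X.shape[1])
        child = np.concatenate([a[:cut], b[cut:]])
        flip = rng.random(X.shape[1]) < 0.02
        child[flip] ^= 1
        children.append(child)
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("selected features:", np.flatnonzero(best), "accuracy:", fitness(best))
```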
7040 Delay Analysis of Sampled-Data Systems in Hard RTOS
Authors: A. M. Azad, M. Alam, C. M. Hussain
Abstract:
In this paper, we present the effect of varying time delays on performance and stability in a single-channel multi-rate sampled-data system in a hard real-time (RT-Linux) environment. The sampling tasks require response times that might exceed the capacity of RT-Linux, so a straightforward implementation is not feasible because of system latency; hence the sampling period should be short enough to handle the task. The best sampling rate is chosen for the sampled-data system, namely the slowest rate that meets all performance requirements. RT-Linux is consistent with its specifications, and the real-time resolution is taken as 0.01 seconds to achieve an efficient result. The test results of our laboratory experiment show that the multi-rate control technique in a hard real-time operating system (RTOS) can improve the stability problem caused by random access delays and asynchrony.
Keywords: Multi-rate, PID, RT-Linux, Sampled-data, Servo.
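Since the servo loop in such a sampled-data system is typically a discrete PID controller run at the chosen sampling period, a minimal form of that update is sketched below; the gains and sampling period are illustrative, not the values used in the paper's RT-Linux experiment.

```python
class DiscretePID:
    """Positional discrete-time PID: u[k] = Kp*e + Ki*Ts*sum(e) + Kd*(e - e_prev)/Ts."""
    def __init__(self, kp, ki, kd, ts):
        self.kp, self.ki, self.kd, self.ts = kp, ki, kd, ts
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.ts
        derivative = (error - self.prev_error) / self.ts
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Example: one control step at a 10 ms sampling period (matching the 0.01 s resolution above).
pid = DiscretePID(kp=2.0, ki=0.5, kd=0.1, ts=0.01)
print(pid.update(setpoint=1.0, measurement=0.8))
```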
7039 Client Importance and Audit Quality under Civil Law versus Common Law Societies
Authors: Kelly Grani Yuen
Abstract:
Accounting scandals and auditing frauds are perceived to be driven by aggressive companies and the misrepresentation of audit reports. However, local legal systems and law enforcement may affect the services auditors provide to their ‘important’ clients. Under civil law and common law jurisdictions, the standard setters, the government, and the regulatory bodies treat cases differently. As such, whether different forms of legal systems and the extent of law enforcement play an important role in auditors' audit quality is a question this paper attempts to explore. The paper focuses on Asia, where Hong Kong represents the common law jurisdiction, while Taiwan and China represent the civil law jurisdiction. Only the ten reputable accounting firms are used in this study, due to the differences in rankings and establishment of some of the small local audit firms, and the data cover the years 2007-2013. Using multiple regression based on the dependent variable (Audit Quality) and the independent variables (Client Importance, Law Enforcement, and Press Freedom), six different models are established. Results demonstrate that since different jurisdictions have different legal systems and market regulations, auditors' treatment of ‘important’ clients will vary. However, with the moderators in place (law enforcement and press freedom), the relationship between client importance and audit quality may be smoothed out. With that in mind, this study provides local governments and standard setters with considerations on legal reform and proper law enforcement in the market. Perhaps, with such modifications to the economic systems, collusion between companies and auditors can finally be put to a halt.
Keywords: Audit quality, client importance, jurisdiction, modified audit opinions.
7038 A Study of the Adaptive Reuse for School Land Use Strategy: An Application of the Analytic Network Process and Big Data
Authors: Wann-Ming Wey
Abstract:
With today's popularity and progress of information technology, big data sets and their analysis are no longer a major conundrum. We can now not only use relevant big data to analyze and simulate the possible status of urban development in the near future, but also provide a more comprehensive and reasonable basis for policy implementation for government units or decision-makers via the analysis and simulation results mentioned above. In this research, we set Taipei City as the research scope, and use relevant big data variables (e.g., population, facility utilization and related social policy ratings) and the Analytic Network Process (ANP) approach to carry out in-depth research and discussion on the possible reduction of land use in the primary and secondary schools of Taipei City. In addition to enhancing urban activities through better public facility utilization, the final results of this research could help improve the efficiency of urban land use in the future. Furthermore, the assessment model and research framework established in this research also provide a good reference for land use and adaptive reuse strategies for schools or other public facilities in the future.
Keywords: Adaptive reuse, analytic network process, big data, land use strategy.
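The core numerical step of the ANP is raising the weighted (column-stochastic) supermatrix to successive powers until it converges to the limit supermatrix, whose columns give the global priorities. A minimal sketch of that step is below, with a made-up 3x3 supermatrix rather than the paper's actual criteria network.

```python
import numpy as np

def limit_supermatrix(W, tol=1e-9, max_iter=1000):
    """Raise a column-stochastic supermatrix to successive powers until convergence."""
    M = W.copy()
    for _ in range(max_iter):
        M_next = M @ W
        if np.max(np.abs(M_next - M)) < tol:
            return M_next
        M = M_next
    return M

# Hypothetical weighted supermatrix for three interdependent clusters (columns sum to 1).
W = np.array([[0.2, 0.5, 0.3],
              [0.5, 0.3, 0.4],
              [0.3, 0.2, 0.3]])
L = limit_supermatrix(W)
print(L[:, 0])   # limiting priorities (columns become identical at convergence)
```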
7037 A Review and Comparative Analysis on Cluster Ensemble Methods
Authors: S. Sarumathi, P. Ranjetha, C. Saraswathy, M. Vaishnavi, S. Geetha
Abstract:
Clustering is an unsupervised learning technique for aggregating data objects into meaningful classes so that intra-cluster similarity is maximized and inter-cluster similarity is minimized in data mining. However, no single clustering algorithm proves to be the most effective in producing the best result. As a result, a new and challenging technique known as the cluster ensemble approach has blossomed in order to solve this problem, and it is a successful approach to the cluster analysis issue. The cluster ensemble's main goal is to combine similar clustering solutions in a way that achieves precision while also improving the quality of individual data clusterings. Because of the massive and rapid creation of new approaches in the field of data mining, the ongoing interest in inventing novel algorithms necessitates a thorough examination of current techniques and future innovation. This paper presents a comparative analysis of various cluster ensemble approaches, including their methodologies, formal working processes, and standard accuracy and error rates. As a result, the community of clustering practitioners will benefit from this exploratory and clear research, which will aid in determining the most appropriate solution to the problem at hand.
Keywords: Clustering, cluster ensemble methods, consensus function, data mining, unsupervised learning.
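One of the most common consensus functions reviewed in such comparisons is evidence accumulation via a co-association matrix: count how often each pair of points is clustered together across the base partitions, then cluster that matrix. The sketch below is a generic illustration of that idea (not any specific method from the review), with random data and scikit-learn/SciPy standing in for the original implementations.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

X, _ = make_blobs(n_samples=150, centers=3, random_state=0)
n = len(X)

# Build an ensemble of base partitions with different k and seeds.
coassoc = np.zeros((n, n))
for k, seed in [(2, 0), (3, 1), (4, 2), (3, 3), (5, 4)]:
    labels = KMeans(n_clusters=k, n_init=10, random_state=seed).fit_predict(X)
    coassoc += (labels[:, None] == labels[None, :])
coassoc /= 5.0                                   # fraction of runs in which each pair co-clusters

# Consensus partition: hierarchical clustering on 1 - co-association (a dissimilarity).
dist = squareform(1.0 - coassoc, checks=False)   # condensed distance vector
consensus = fcluster(linkage(dist, method="average"), t=3, criterion="maxclust")
print(np.bincount(consensus))                    # sizes of the consensus clusters
```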
7036 Simultaneous Clustering and Feature Selection Method for Gene Expression Data
Authors: T. Chandrasekhar, K. Thangavel, E. N. Sathishkumar
Abstract:
Microarrays have made it possible to simultaneously monitor the expression profiles of thousands of genes under various experimental conditions. They are used to identify the co-expressed genes in specific cells or tissues that are actively used to make proteins, and to analyse gene expression, an important task in bioinformatics research. Cluster analysis of gene expression data has proved to be a useful tool for identifying co-expressed genes and biologically relevant groupings of genes and samples. In this work the K-Means algorithm has been applied for clustering gene expression data. Further, the rough set based Quick Reduct algorithm has been applied to each cluster in order to select the most similar genes having high correlation. The ACV measure is then used to evaluate the refined clusters, and classification is used to evaluate the proposed method. The proposed method identifies compact clusters, from which genes are selected using the feature selection method.
Keywords: Clustering, Feature selection, Gene expression data, Quick reduct.
7035 Segmentation Free Nastalique Urdu OCR
Authors: Sobia T. Javed, Sarmad Hussain, Ameera Maqbool, Samia Asloob, Sehrish Jamil, Huma Moin
Abstract:
The electronically available Urdu data is in image form, which is very difficult to process; printed Urdu data is the root cause of the problem. So, for the rapid progress of the Urdu language, we need OCR systems that can help make Urdu data available to the common person. Research has been carried out for years to automate the recognition of Arabic and Urdu script. However, the biggest hurdle in the development of Urdu OCR is the challenge of recognizing the Nastalique script, which is taken as the standard for writing the Urdu language. Nastalique script is written diagonally with no fixed baseline, which makes the script somewhat complex. Overlap is present not only in characters but in ligatures as well. This paper proposes a method which allows successful recognition of Nastalique script.
Keywords: HMM, Image processing, Optical Character Recognition, Urdu OCR.
7034 Performance Analysis of MATLAB Solvers in the Case of a Quadratic Programming Generation Scheduling Optimization Problem
Authors: Dávid Csercsik, Péter Kádár
Abstract:
In the proposed method, the problem is parallelized by considering multiple possible mode-of-operation profiles, which determine the range in which the generators operate in each period. For each of these profiles, the optimization is carried out independently, and the best resulting dispatch is chosen. For each such profile, the resulting problem is a quadratic programming (QP) problem with a potentially negative definite quadratic term Q, and constraints that depend on the actual operation profile. In this paper we analyze the performance of the available MATLAB optimization methods and solvers for the corresponding QP.
Keywords: Economic dispatch, optimization, quadratic programming, MATLAB.
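For reference, each per-profile subproblem has the standard quadratic programming form below, where x collects the generator outputs over the scheduling periods; the symbols are generic, and when Q is not positive semidefinite the problem is non-convex, which is what makes the choice of solver relevant.

```latex
\min_{x}\ \tfrac{1}{2}\, x^{\top} Q\, x + c^{\top} x
\quad \text{subject to} \quad
A x \le b, \qquad A_{\mathrm{eq}} x = b_{\mathrm{eq}}, \qquad l \le x \le u .
```

This is, for example, the form accepted by MATLAB's quadprog, one of the solvers such a comparison would typically include.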
7033 Determination of the Concentrated State Using Multiple EEG Channels
Authors: Tae Jin Choi, Jong Ok Kim, Sang Min Jin, Gilwon Yoon
Abstract:
Analysis of EEG brainwaves provides information on mental or emotional states. One particular state that can have various applications in human machine interface (HMI) is concentration. 8-channel EEG signals were measured and analyzed, and the concentration index was compared between resting and concentrating periods. Among the eight channels, the locations on the frontal lobe (Fp1 and Fp2) showed a clear increase of the concentration index during concentration regardless of subject. The remaining six channels produced conflicting observations depending on the subject. At this time, it is not clear whether individual differences or the way subjects concentrated produced these results for the remaining six channels. Nevertheless, Fp1 and Fp2 are expected to be promising locations for extracting control signals for HMI applications.
Keywords: Concentration, EEG, human machine interface.
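The abstract does not give the exact definition of its concentration index, but such indices are commonly built from EEG band-power ratios (for example beta power relative to alpha plus theta). The sketch below computes that kind of ratio for one channel with Welch's method; the ratio definition, sampling rate and signal are assumptions for illustration.

```python
import numpy as np
from scipy.signal import welch

def band_power(freqs, psd, lo, hi):
    """Integrate the power spectral density over a frequency band."""
    mask = (freqs >= lo) & (freqs < hi)
    return np.trapz(psd[mask], freqs[mask])

fs = 256                                   # assumed sampling rate (Hz)
rng = np.random.default_rng(1)
eeg = rng.normal(size=10 * fs)             # stand-in for 10 s of one EEG channel (e.g. Fp1)

freqs, psd = welch(eeg, fs=fs, nperseg=2 * fs)
theta = band_power(freqs, psd, 4, 8)
alpha = band_power(freqs, psd, 8, 13)
beta  = band_power(freqs, psd, 13, 30)

concentration_index = beta / (alpha + theta)   # one common definition, assumed here
print(f"concentration index: {concentration_index:.3f}")
```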
7032 The Advent of Electronic Logbook Technology - Reducing Cost and Risk to Both Marine Resources and the Fishing Industry
Authors: Amos Barkai, Guy Meredith, Fatima Felaar, Zahrah Dantie, Dave de Buys
Abstract:
Fisheries management all around the world is hampered by the lack, or poor quality, of critical data on fish resources and fishing operations. The main reasons for the chronic inability to collect good quality data during fishing operations are the culture of secrecy common among fishers and the lack of modern data gathering technology onboard most fishing vessels. In response, OLRAC-SPS, a South African company, developed fisheries data-logging software (eLog for short) and named it Olrac. The Olrac eLog solution is capable of collecting, analysing, plotting, mapping, reporting, tracing and transmitting all data related to fishing operations. Olrac can be used by skippers, fleet/company managers, offshore mariculture farmers, scientists, observers, compliance inspectors and fisheries management authorities. The authors believe that using an eLog onboard fishing vessels has the potential to revolutionise the entire process of data collection and reporting during fishing operations and, if properly deployed and utilised, could transform the entire commercial fleet into a provider of good quality data and forever change the way fish resources are managed. In addition, it will make it possible to trace catches back to the actual individual fishing operation, improve fishing efficiency, and dramatically improve control of fishing operations and enforcement of fishing regulations.
Keywords: data management, electronic logbook (eLog), electronic reporting system (ERS), fisheries management.