Search results for: minimum data set
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26525

24875 Data Mining Approach: Classification Model Evaluation

Authors: Lubabatu Sada Sodangi

Abstract:

The rapid growth in the exchange and accessibility of information via the internet has led many organisations to acquire data on their own operations. The aim of data mining is to analyse the different behaviours of a dataset using observation. However, the subset of the dataset being analysed may not display all the behaviours and relationships of the entire dataset and, therefore, may not represent other parts that exist in it. A range of techniques is used in data mining to determine the hidden or unknown information in datasets. In this paper, the performance of two algorithms, Chi-Square Automatic Interaction Detection (CHAID) and the multilayer perceptron (MLP), is compared using the Adult dataset to find out the percentage of adults that earn > 50k and those that earn <= 50k per year. The two algorithms were studied and compared using IBM SPSS Statistics software. The result for CHAID shows that the most important predictors are relationship and education: those who are married (husbands) and hold a Bachelor's, Master's, Doctorate, or Prof-school qualification, and whose age is between 41 and 57, earn > 50k. The multilayer perceptron, in turn, identifies marital status and capital gain as the most important predictors of income: individuals whose capital gain is less than 6,849 and who are single, separated, or widowed earn <= 50k, whereas individuals whose capital gain is greater than 6,849, who work > 35 hrs/wk, and who are older than 27 years earn > 50k. By comparing the two algorithms, it is observed that both are reliable, but CHAID shows stronger reliability, clearly indicating that relationship and education contribute to the prediction, as displayed in the data visualisation.
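
The study itself was carried out in IBM SPSS Statistics; the following is only a minimal scikit-learn sketch of the same comparison, assuming a local CSV export of the UCI Adult dataset with the column names shown. CHAID is not available in scikit-learn, so a depth-limited decision tree stands in for it here.

```python
# Hedged sketch: decision tree (CHAID stand-in) vs. MLP on Adult-style data.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import OrdinalEncoder
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

cols = ["age", "education", "marital-status", "relationship",
        "capital-gain", "hours-per-week", "income"]          # assumed subset
df = pd.read_csv("adult.csv", usecols=cols)                  # hypothetical path

X = OrdinalEncoder().fit_transform(df.drop(columns="income"))  # crude encoding
y = (df["income"] == ">50K").astype(int)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

tree = DecisionTreeClassifier(max_depth=5).fit(X_tr, y_tr)   # CHAID stand-in
mlp = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500).fit(X_tr, y_tr)

for name, model in [("tree", tree), ("MLP", mlp)]:
    print(name, accuracy_score(y_te, model.predict(X_te)))
```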

Keywords: data mining, CHAID, multi-layer perceptron, SPSS, Adult dataset

Procedia PDF Downloads 378
24874 Exploring the Possibility of Islamic Banking as a Viable Alternative to the Conventional Banking Model

Authors: Lavan Vickneson

Abstract:

In today’s modern economy, the conventional banking model is the primary banking system used around the world. A significant problem faced by the conventional banking model is the recurring nature of banking crises. History’s record of the various banking crises, ranging from the Great Depression to the 2008 subprime mortgage crisis, is testament to the fact that banking crises continue to strike despite the preventive measures in place, such as banks’ minimum capital requirements and deposit guarantee schemes. If banking crises continue to occur despite these preventive measures, it necessarily follows that there are inherent flaws in the conventional banking model itself. In light of this, a possible alternative to the conventional banking model is Islamic banking. To date, Islamic banking has been a niche market, predominantly serving Muslim investors. This paper seeks to explore the possibility of Islamic banking being more than just a niche market and playing a greater role in banking sectors around the world by being a viable alternative to the conventional banking model.

Keywords: bank crises, conventional banking model, Islamic banking, niche market

Procedia PDF Downloads 282
24873 UWB Channel Estimation Using an Efficient Sub-Nyquist Sampling Scheme

Authors: Yaacoub Tina, Youssef Roua, Radoi Emanuel, Burel Gilles

Abstract:

Recently, low-complexity sub-Nyquist sampling schemes based on the Finite Rate of Innovation (FRI) theory have been introduced to sample parametric signals at minimum rates. The multichannel modulating waveforms (MCMW) scheme is one such efficient approach, in which the received signal is mixed with an appropriate set of arbitrary waveforms, integrated, and sampled at rates far below the Nyquist rate. In this paper, the MCMW scheme is adapted to the special case of ultra wideband (UWB) channel estimation, characterized by dense multipath. First, an appropriate structure, which accounts for the bandpass spectrum feature of UWB signals, is defined. Then, a novel approach to decrease the number of processing channels and reduce the complexity of this sampling scheme is presented. Finally, the proposed concepts are validated by simulation results, obtained with real filters, in the framework of a coherent Rake receiver.
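
To make the mix-integrate-sample idea concrete, here is a toy numpy sketch under simplifying assumptions: a synthetic multipath signal is mixed with K arbitrary waveforms and integrated over the observation window, giving one measurement per channel far below the Nyquist rate. The pulse shape, delays, and gains are illustrative, and the FRI recovery step itself is not reproduced.

```python
# Toy MCMW front-end: K channels of mix, integrate, and single-sample.
import numpy as np

rng = np.random.default_rng(0)
fs, T = 20e9, 50e-9                      # dense simulation grid, 50 ns window
t = np.arange(0, T, 1 / fs)

# Toy UWB channel: a few multipath echoes of a short Gaussian monocycle
pulse = lambda t0: (t - t0) * np.exp(-((t - t0) ** 2) / (0.5e-9) ** 2)
delays, gains = [5e-9, 12e-9, 31e-9], [1.0, -0.6, 0.3]
received = sum(g * pulse(d) for g, d in zip(gains, delays))

K = 8                                                    # processing channels
waveforms = rng.choice([-1.0, 1.0], size=(K, t.size))    # arbitrary mixing signals
samples = (waveforms * received).sum(axis=1) / fs        # integrate and sample

print(samples)   # K measurements from which FRI recovery estimates delays/gains
```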

Keywords: coherent Rake receiver, finite rate of innovation, sub-Nyquist sampling, ultra wideband

Procedia PDF Downloads 256
24872 Developing an Information Model of Manufacturing Process for Sustainability

Authors: Jae Hyun Lee

Abstract:

Manufacturing companies use life-cycle inventory databases to analyze the sustainability of their manufacturing processes. Life-cycle inventory data provides reference data which may not be accurate for a specific company. Collecting accurate data on manufacturing processes for a specific company requires enormous time and effort. An information model of typical manufacturing processes can reduce the time and effort needed to get appropriate reference data for a specific company. This paper shows an attempt to build an abstract information model which can be used to develop information models for specific manufacturing processes.
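
Since the keywords name OWL, a minimal rdflib sketch of how such an abstract process model might be seeded is given below. The namespace, class, and property names are hypothetical illustrations, not the paper's actual ontology.

```python
# Seed a tiny OWL ontology: an abstract process class with a specialization.
from rdflib import Graph, Namespace
from rdflib.namespace import OWL, RDF, RDFS

MFG = Namespace("http://example.org/mfg#")   # hypothetical namespace
g = Graph()
g.bind("mfg", MFG)

# Abstract process class, a concrete subclass, and an energy-flow property
g.add((MFG.ManufacturingProcess, RDF.type, OWL.Class))
g.add((MFG.Milling, RDF.type, OWL.Class))
g.add((MFG.Milling, RDFS.subClassOf, MFG.ManufacturingProcess))
g.add((MFG.hasEnergyInput, RDF.type, OWL.DatatypeProperty))
g.add((MFG.hasEnergyInput, RDFS.domain, MFG.ManufacturingProcess))

print(g.serialize(format="turtle"))
```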

Keywords: process information model, sustainability, OWL, manufacturing

Procedia PDF Downloads 430
24871 An Interpretable Data-Driven Approach for the Stratification of Cardiorespiratory Fitness

Authors: D. Mendes, J. Henriques, P. Carvalho, T. Rocha, S. Paredes, R. Cabiddu, R. Trimer, R. Mendes, A. Borghi-Silva, L. Kaminsky, E. Ashley, R. Arena, J. Myers

Abstract:

The exploration of clinically relevant predictive models remains an important pursuit. Cardiorespiratory fitness (CRF) conveys vital clinical information, and as such its accurate prediction is of high importance. Therefore, the aim of the current study was to develop a data-driven model, based on computational intelligence techniques and, in particular, clustering approaches, to predict CRF. Two prediction models were implemented and compared: 1) the traditional Wasserman/Hansen equations; and 2) an interpretable clustering approach. Data used for this analysis were from the 'FRIEND - Fitness Registry and the Importance of Exercise: The National Data Base'; in the present study, a subset of 10690 apparently healthy individuals was utilized. The accuracy of the models was assessed through the computation of sensitivity, specificity, and geometric mean values. The results show the superiority of the clustering approach in the accurate estimation of CRF (i.e., maximal oxygen consumption).
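
A schematic sketch of this kind of pipeline follows; it is not the authors' exact method. Subjects are clustered on simple predictors, CRF is estimated per cluster, and the stratification is scored with sensitivity, specificity, and their geometric mean. The data here are synthetic and the "low CRF" cut-off is an assumption.

```python
# Cluster-wise CRF estimation scored by sensitivity/specificity/geometric mean.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 3))            # e.g., age, weight, activity level
vo2 = 35 - 4 * X[:, 0] + rng.normal(scale=3, size=1000)   # synthetic CRF

labels = KMeans(n_clusters=5, n_init=10, random_state=1).fit_predict(X)
cluster_mean = {c: vo2[labels == c].mean() for c in range(5)}
pred = np.array([cluster_mean[c] for c in labels])        # cluster-wise estimate

low_true, low_pred = vo2 < 30, pred < 30                  # assumed "low CRF" stratum
sens = (low_true & low_pred).sum() / low_true.sum()
spec = (~low_true & ~low_pred).sum() / (~low_true).sum()
print(sens, spec, np.sqrt(sens * spec))                   # geometric mean
```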

Keywords: cardiorespiratory fitness, data-driven models, knowledge extraction, machine learning

Procedia PDF Downloads 286
24870 Functional Mortality of Anopheles stephensi, the Urban Malaria Vector as Induced by the Sublethal Exposure to Deltamethrin

Authors: P. Aarumugam, N. Krishnamoorthy, K. Gunasekaran

Abstract:

Mosquitoes that lose a minimum of three legs, especially the hind legs, suffer a negative impact on their survivorship. Three-day-old unfed adult females of a laboratory strain were selected in each generation against sublethal dosages (0.004%, 0.005%, 0.007%, and 0.01%) of deltamethrin for up to 40 generations. Papers impregnated with acetone were used as controls. Every fourth generation, the surviving mosquitoes were observed for functional mortality. Hind-leg loss was significantly (P < 0.05) higher in treated mosquitoes than in controls up to generation 24; thereafter, no significant loss was observed. In contrast, there was no significant foreleg loss among exposed mosquitoes. Mid-leg loss was also not significant in the exposed mosquitoes, except in the first generation (F1). The field strain (Chennai) did not show any significant loss of legs (fore, mid, or hind) compared to the control. The selection pressure on the mosquito population drives strong natural selection to develop various adaptive mechanisms.

Keywords: Anopheles stephensi, deltamethrin, functional mortality, synthetic pyrethroids

Procedia PDF Downloads 396
24869 Dissecting Big Trajectory Data to Analyse Road Network Travel Efficiency

Authors: Rania Alshikhe, Vinita Jindal

Abstract:

Digital innovation has played a crucial role in managing smart transportation. For this, big trajectory data collected from traveling vehicles, such as taxis, through installed global positioning system (GPS)-enabled devices can be utilized. Such data offer an unprecedented opportunity to trace the movements of vehicles at fine spatiotemporal granularity. This paper aims to explore big trajectory data to measure the travel efficiency of road networks using the proposed statistical travel efficiency measure (STEM) across an entire city. Further, it identifies the causes of low travel efficiency by the proposed least square approximation network-based causality exploration (LANCE). Finally, the resulting data analysis reveals the causes of low travel efficiency, along with the road segments that need to be optimized to improve traffic conditions and thus minimize the average travel time from a given point A to point B in the road network. The obtained results show that the proposed approach outperforms the baseline algorithms for measuring the travel efficiency of a road network.
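
The STEM itself is not defined in this abstract, so the following pandas sketch shows only one plausible efficiency measure under stated assumptions: observed segment speed relative to an empirical free-flow speed, aggregated per road segment from GPS trip records. The file name and columns are hypothetical.

```python
# Per-segment travel efficiency from GPS trip records (assumed schema).
import pandas as pd

trips = pd.read_csv("taxi_trips.csv")    # hypothetical: segment_id, length_m, travel_s
trips["speed_kmh"] = trips["length_m"] / trips["travel_s"] * 3.6

free_flow = trips.groupby("segment_id")["speed_kmh"].quantile(0.95)  # proxy
mean_speed = trips.groupby("segment_id")["speed_kmh"].mean()
efficiency = (mean_speed / free_flow).clip(upper=1.0)

print(efficiency.nsmallest(10))          # candidate segments for optimization
```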

Keywords: GPS trajectory, road network, taxi trips, digital map, big data, STEM, LANCE

Procedia PDF Downloads 157
24868 Mitigating Supply Chain Risk for Sustainability Using Big Data Knowledge: Evidence from the Manufacturing Supply Chain

Authors: Mani Venkatesh, Catarina Delgado, Purvishkumar Patel

Abstract:

The sustainable supply chain is gaining popularity among practitioners because of increased environmental degradation and stakeholder awareness. On the other hand, supply chain risk management is crucial for practitioners, as risk can potentially disrupt supply chain operations. Predicting and addressing risks caused by social issues in the supply chain is of paramount importance to the sustainable enterprise. More recently, the use of big data analytics for forecasting business trends has been gaining momentum among professionals. The aim of this research is to explore the application of big data predictive analytics in successfully mitigating supply chain social risk and to demonstrate how such mitigation can help in achieving sustainability (environmental, economic, and social). The method involves the identification and validation of social issues in the supply chain by an expert panel and a survey. Later, we used a case study to illustrate the application of big data in the successful identification and mitigation of social issues in the supply chain. Our results show that a company can predict various social issues through big data predictive analytics and mitigate the social risk. We also discuss the implications of this research for the body of knowledge and practice.

Keywords: big data, sustainability, supply chain social sustainability, social risk, case study

Procedia PDF Downloads 408
24867 Economics of Sugandhakokila (Cinnamomum Glaucescens (Nees) Dury) in Dang District of Nepal: A Value Chain Perspective

Authors: Keshav Raj Acharya, Prabina Sharma

Abstract:

Sugandhakokila (Cinnamomum glaucescens (Nees) Dury) is a large evergreen native tree species, mostly confined naturally to the mid-hills of the Rapti Zone of Nepal. The species has been prioritized for agro-technology development as well as for research and development by the Department of Plant Resources. The government of Nepal has banned the export of this species outside the country without processing, to encourage value addition within the country. The present study was carried out in Chillikot village of Dang district to find out the economic contribution of C. glaucescens to the local economy and to document the major conservation threats to this species. Participatory Rural Appraisal (PRA) tools such as household surveys, key informant interviews, and focus group discussions were used to collect the data. The present study reveals that about 1.7 million Nepalese rupees (NPR) are contributed annually to the local economy of 29 households from the collection of C. glaucescens berries in the study area. The average annual income of each family was around NPR 67,165.38 (US$ 569.19) from the sale of the berries, which contributes about 53% of the total household income. Six different value chain actors are involved in the C. glaucescens business. The maximum profit margin was taken by collectors, followed by producers, exporters, and processors; the profit margin was found to be minimum for regional and village traders. The total profit margin for producers was NPR 138.86/kg, and regional traders gained NPR 17/kg. However, there is a possibility of increasing producers' profit by NPR 8.00 more per kg of berries through the initiation of community forest user groups and village cooperatives in the area. Open access to the resource, infestation of over-matured trees by insects, and browsing by goats were identified as the major conservation threats to this species. Handing over the national forest as a community forest, linking producers with processors through organized market channels, and replacing old trees through new plantation are recommended for the future.

Keywords: community forest, conservation threats, C. glaucescens, value chain analysis

Procedia PDF Downloads 140
24866 Improving the Analytical Power of Dynamic DEA Models, by the Consideration of the Shape of the Distribution of Inputs/Outputs Data: A Linear Piecewise Decomposition Approach

Authors: Elias K. Maragos, Petros E. Maravelakis

Abstract:

In Dynamic Data Envelopment Analysis (DDEA), which is a subfield of Data Envelopment Analysis (DEA), the productivity of Decision Making Units (DMUs) is considered in relation to time. In this case, as accepted by most researchers, there are outputs produced by a DMU in one period to be used as inputs in a future period. Those outputs are known as intermediates. The common models in DDEA do not take into account the shape of the distribution of those input, output, or intermediate data, assuming that the distribution of their virtual values does not deviate from linearity. This weakness limits the accuracy and analytical power of traditional DDEA models. In this paper, the authors, using the concept of piecewise linear inputs and outputs, propose an extended DDEA model. The proposed model increases the flexibility of traditional DDEA models and improves the measurement of the dynamic performance of DMUs.
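
For orientation, here is a sketch of the standard input-oriented CCR DEA score computed by linear programming, i.e., the base model that dynamic and piecewise-linear extensions build on; it is not the paper's DDEA formulation. The input/output data are illustrative.

```python
# Input-oriented CCR DEA: minimize theta s.t. X*lam <= theta*x0, Y*lam >= y0.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 3.0, 4.0], [1.0, 2.0, 5.0]])   # inputs  (m x n DMUs)
Y = np.array([[1.0, 2.0, 1.5]])                    # outputs (s x n)

def ccr_efficiency(j0):
    m, n = X.shape
    c = np.r_[1.0, np.zeros(n)]                    # variables: [theta, lam_1..n]
    A_in = np.c_[-X[:, j0], X]                     # X*lam - theta*x0 <= 0
    A_out = np.c_[np.zeros(Y.shape[0]), -Y]        # -Y*lam <= -y0
    res = linprog(c, A_ub=np.r_[A_in, A_out],
                  b_ub=np.r_[np.zeros(m), -Y[:, j0]],
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.x[0]

print([round(ccr_efficiency(j), 3) for j in range(3)])
```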

Keywords: Dynamic Data Envelopment Analysis, DDEA, piecewise linear inputs, piecewise linear outputs

Procedia PDF Downloads 161
24865 Adaption Model for Building Agile Pronunciation Dictionaries Using Phonemic Distance Measurements

Authors: Akella Amarendra Babu, Rama Devi Yellasiri, Natukula Sainath

Abstract:

Whereas human beings can easily learn and adopt pronunciation variations, machines need training before being put into use. Humans also keep a minimum vocabulary, with pronunciation variations stored at the front of their memory for ready reference, while machines keep the entire pronunciation dictionary for ready reference. Supervised methods used for the preparation of pronunciation dictionaries take large amounts of manual effort, cost, and time, and are not suitable for real-time use. This paper presents an unsupervised adaptation model for building agile and dynamic pronunciation dictionaries online. These methods mimic the human approach of learning new pronunciations in real time. A new algorithm for measuring sound distances, called Dynamic Phone Warping, is presented and tested. The performance of the system is measured using an adaptation model, and the precision is found to be better than 86 percent.
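
In the spirit of Dynamic Phone Warping, the following sketch aligns two phone sequences with edit-distance-style dynamic programming. The uniform substitution cost is an illustrative assumption; the paper's measured phonemic distances would replace it.

```python
# DP alignment between phoneme sequences (illustrative costs, not the paper's).
def phone_distance(ref, hyp, sub_cost=None):
    """Edit-distance-style dynamic programming over phone symbols."""
    cost = sub_cost or (lambda a, b: 0.0 if a == b else 1.0)
    D = [[0.0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(1, len(ref) + 1):
        D[i][0] = i
    for j in range(1, len(hyp) + 1):
        D[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            D[i][j] = min(D[i - 1][j] + 1,                      # deletion
                          D[i][j - 1] + 1,                      # insertion
                          D[i - 1][j - 1] + cost(ref[i - 1], hyp[j - 1]))
    return D[-1][-1]

# Two pronunciations of "tomato": a small distance suggests the same word
print(phone_distance("T AH M EY T OW".split(), "T AH M AA T OW".split()))
```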

Keywords: pronunciation variations, dynamic programming, machine learning, natural language processing

Procedia PDF Downloads 176
24864 A Proposal of Advanced Key Performance Indicators for Assessing Six Performances of Construction Projects

Authors: Wi Sung Yoo, Seung Woo Lee, Youn Kyoung Hur, Sung Hwan Kim

Abstract:

Large-scale construction projects are continuously increasing in number, and the need for tools to monitor and evaluate project success is growing. At the construction industry level, there are limitations in deriving performance evaluation factors that reflect the diversity of construction sites, and systems that can objectively evaluate and manage performance are lacking. Additionally, there are difficulties in integrating the structured and unstructured data generated at construction sites and deriving improvements. In this study, we propose Key Performance Indicators (KPIs) that enable performance evaluation reflecting the increased diversity of construction sites and the unstructured data generated, and we present a model for measuring performance with the derived indicators. The comprehensive performance of a unit construction site is assessed based on 6 areas (time, cost, quality, safety, environment, productivity) and 26 indicators. We collect performance indicator information from 30 construction sites that meet legal standards and have been successfully completed, and we apply data augmentation and optimization techniques to establish measurement standards for each indicator. In other words, the KPIs for construction site performance evaluation presented in this study provide standards for evaluating performance in six areas using institutional requirement data and document data. This can be expanded to establish a performance evaluation system considering the scale and type of construction project. The indicators are also expected to serve as a comprehensive measure for the construction industry and as basic data for tracking competitiveness at the national level and establishing policies.
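
One plausible way such a composite score could be formed is sketched below: min-max normalize each of the 26 indicators against the 30 benchmark sites and weight them by area. The per-area indicator split and the area weights are illustrative assumptions, not the study's values.

```python
# Hedged sketch of a 6-area, 26-indicator composite KPI score.
import numpy as np

indicators = np.random.default_rng(2).uniform(size=(30, 26))  # sites x KPIs
lo, hi = indicators.min(axis=0), indicators.max(axis=0)
norm = (indicators - lo) / (hi - lo)                          # 0..1 per KPI

area_of_kpi = np.repeat([0, 1, 2, 3, 4, 5], [5, 5, 4, 4, 4, 4])  # assumed split
area_weights = np.array([0.2, 0.2, 0.15, 0.2, 0.1, 0.15])        # assumed weights
weights = area_weights[area_of_kpi] / np.bincount(area_of_kpi)[area_of_kpi]

site_score = norm @ weights                                   # per-site composite
print(site_score.round(3))
```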

Keywords: key performance indicator, performance measurement, structured and unstructured data, data augmentation

Procedia PDF Downloads 42
24863 Adopting a New Policy in Maritime Law for Protecting Ship Mortgagees Against Maritime Liens

Authors: Mojtaba Eshraghi Arani

Abstract:

Ship financing is a vital element in the development of the shipping industry because, while the ship constitutes the owner's main asset, she is considered a reliable security from the financiers' viewpoint as well. However, it is most probable that a financier who has accepted a ship as security will face many creditors who are privileged and rank before him in collecting, out of the ship, the money that they are owed. In fact, according to the current rule of maritime law, which was established by the “Convention Internationale pour l’Unification de Certaines Règles Relatives aux Privilèges et Hypothèques Maritimes, Brussels, 10 April 1926”, mortgages, hypotheques, and other charges on vessels rank after several secured claims referred to as “maritime liens”. Such maritime liens form an exhaustive list of claims, including “expenses incurred in the common interest of the creditors to preserve the vessel or to procure its sale and the distribution of the proceeds of sale”, “tonnage dues, light or harbour dues, and other public taxes and charges of the same character”, “claims arising out of the contract of engagement of the master, crew and other persons hired on board”, “remuneration for assistance and salvage”, “the contribution of the vessel in general average”, “indemnities for collision or other damage caused to works forming part of harbours, docks, etc.”, “indemnities for personal injury to passengers or crew or for loss of or damage to cargo”, and “claims resulting from contracts entered into or acts done by the master”. The same rule survived, with only minor changes in the categories of maritime liens, in the substitute conventions of 1967 and 1993. The status quo in maritime law has always been considered a major obstacle to the development of the shipping market and has inevitably led to increases in interest rates and other related costs of ship financing. It seems that national and international policy makers have yet to change their minds, being worried about deviating from old marine traditions. However, it is crystal clear that the continuation of the status quo will harm, to a great extent, shipowners and, consequently, international merchants as a whole. It is argued in this article that the raison d'être for many categories of maritime liens has ceased to exist, in view of which the international community has to recognize only a minimal set of maritime liens that are created in the common interest of all creditors; to this effect, only the two categories of “compensation due for the salvage of ship” and “extraordinary expenses indispensable for the preservation of the ship” can be declared as taking priority over the mortgagee's rights, in analogy with the Geneva Convention on the International Recognition of Rights in Aircraft (1948). A qualitative method based on the interpretation of collected data has been used in this manuscript. The sources of the data are international conventions and domestic laws.

Keywords: ship finance, mortgage, maritime liens, Brussels Convention, Geneva Convention 1948

Procedia PDF Downloads 72
24862 Surface Sterilization of Aquatic Plant, Cryptocoryne affinis by Using Clorox and Mercury Chloride

Authors: Sridevi Devadas

Abstract:

This study aimed to examine the combined efficiency of Clorox (5.25% sodium hypochlorite) and mercury chloride (HgCl2) as reagents for the surface sterilization of the aquatic plant Cryptocoryne affinis (C. affinis). The treatment applied 10% Clorox and 0.1 ppm mercury chloride. The maximum exposure times for Clorox and mercury chloride were 10 min and 60 sec, respectively. After exposure to the treatment protocols (T1-T15), the explants were transferred to a culture room under controlled temperature at 25°C ± 2°C and subjected to 16 hours of fluorescent light (2000 lumens) for 30 days. Neither sterilizing agent was applied to the control specimens. Upon analysis, the results indicate that all treatment protocols produced sterile explants, ranging from a minimum of 1.5 ± 0.7 (30%) to a maximum of 5.0 ± 0.0 (100%). Meanwhile, a maximum of 1.0 ± 0.7 leaves and 1.4 ± 0.6 roots were produced. The optimized exposure time was 0 to 15 min for Clorox and 30 sec for HgCl2, whereby 90% to 100% sterilization was achieved under these conditions.

Keywords: Cryptocoryne affinis, surface sterilization, tissue culture, clorox, mercury chloride

Procedia PDF Downloads 600
24861 A Fuzzy TOPSIS Based Model for Safety Risk Assessment of Operational Flight Data

Authors: N. Borjalilu, P. Rabiei, A. Enjoo

Abstract:

A Flight Data Monitoring (FDM) program assists aviation operators in identifying, quantifying, assessing, and addressing operational safety risks in order to improve the safety of flight operations. FDM is a powerful tool for an aircraft operator when integrated into the operator's Safety Management System (SMS), allowing the operator to detect, confirm, and assess safety issues and to check the effectiveness of corrective actions associated with human errors. This article proposes a model for assessing the safety risk level of flight data across different aspects of event focus, based on fuzzy set values. It permits evaluation of the operational safety level from the point of view of flight activities. The main advantage of this method is the proposed qualitative safety analysis of flight data. This research applies the opinions of aviation experts, gathered through a number of questionnaires related to flight data, in four categories of occurrences that can take place during an accident or an incident: Runway Excursion (RE), Controlled Flight Into Terrain (CFIT), Mid-Air Collision (MAC), and Loss of Control in Flight (LOC-I). By weighting each category (by F-TOPSIS) and applying the weights to the number of risks of the event, the safety risk of each related event can be obtained.
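
Below is a numpy sketch of the crisp TOPSIS core applied to the four occurrence categories; the paper's fuzzy variant would replace the crisp ratings with triangular fuzzy numbers aggregated over the expert questionnaires. The rating matrix and criterion weights are illustrative assumptions.

```python
# Crisp TOPSIS ranking of the four occurrence categories (illustrative data).
import numpy as np

# rows: RE, CFIT, MAC, LOC-I; columns: assumed expert criteria ratings
R = np.array([[7, 6, 8], [9, 8, 6], [8, 9, 7], [9, 7, 9]], float)
w = np.array([0.5, 0.3, 0.2])                       # assumed criterion weights

V = R / np.linalg.norm(R, axis=0) * w               # weighted normalized matrix
ideal, anti = V.max(axis=0), V.min(axis=0)          # all criteria as "benefit"
d_pos = np.linalg.norm(V - ideal, axis=1)
d_neg = np.linalg.norm(V - anti, axis=1)
closeness = d_neg / (d_pos + d_neg)                 # closeness coefficient

print(dict(zip(["RE", "CFIT", "MAC", "LOC-I"], closeness.round(3))))
```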

Keywords: F-TOPSIS, fuzzy set, flight data monitoring (FDM), flight safety

Procedia PDF Downloads 168
24860 From Modeling of Data Structures towards Automatic Programs Generating

Authors: Valentin P. Velikov

Abstract:

Automatic program generation saves time and human resources and yields syntactically clear and logically correct modules. Fourth-generation programming languages are oriented toward drawing the data and the processes of the subject area, as well as toward obtaining a frame of the respective information system. The application can be separated into an interface and business logic. That means that, for interactive generation of the needed system, either an already existing toolkit is used or a new one is created.

Keywords: computer science, graphical user interface, user dialog interface, dialog frames, data modeling, subject area modeling

Procedia PDF Downloads 305
24859 Optimized Weight Selection of Control Data Based on Quotient Space of Multi-Geometric Features

Authors: Bo Wang

Abstract:

The geometric processing of multi-source remote sensing data using control data of different scales and different accuracies is an important research direction for multi-platform earth observation systems. In existing block bundle adjustment methods, the control information in the adjustment system relies on a single observation scale and precision, which makes it impossible to screen the control information and to assign reasonable and effective weights, reducing the convergence and reliability of the adjustment results. Referring to the relevant theory and technology of quotient space, this project researches several subjects. A multi-layer quotient space of multi-geometric features is constructed to describe and filter control data. A normalized granularity merging mechanism for multi-layer control information is studied, and, based on the normalized scale factor, a strategy is realized to optimize the weight selection of control data that are less relevant to the adjustment system. At the same time, geometric positioning experiments are conducted using multi-source remote sensing data, aerial images, and multiple classes of control data to verify the theoretical research results. This research is expected to break through the convention of single-scale, single-accuracy control data in the adjustment process and to expand the theory and technology of photogrammetry, so that the problem of processing multi-source remote sensing data will be solved both theoretically and practically.

Keywords: multi-source image geometric process, high precision geometric positioning, quotient space of multi-geometric features, optimized weight selection

Procedia PDF Downloads 284
24858 Consortium Blockchain-based Model for Data Management Applications in the Healthcare Sector

Authors: Teo Hao Jing, Shane Ho Ken Wae, Lee Jin Yu, Burra Venkata Durga Kumar

Abstract:

Current distributed healthcare systems face the challenge of interoperability of health data. Storing electronic health records (EHR) in local databases causes them to be fragmented. This problem is aggravated as patients visit multiple healthcare providers in their lifetimes. Existing solutions are unable to solve this issue and have placed burdens on healthcare specialists and patients alike. Blockchain technology was found to be able to increase the interoperability of health data by implementing digital access rules, enabling a uniform patient identity, and providing data aggregation. Consortium blockchains were found to have high read throughput, to be more trustworthy and more secure against external disruptions, and to accommodate transactions without fees. Therefore, this paper proposes a blockchain-based model for data management applications. In this model, a consortium blockchain is implemented using delegated proof of stake (DPoS) as its consensus mechanism. This blockchain allows collaboration between users from different organizations, such as hospitals and medical bureaus. Patients serve as the owners of their information, and users from other parties require authorization from the patient to view it. Hospitals upload the hash values of patients' generated data to the blockchain, whereas the encrypted information is stored in distributed cloud storage.
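
A toy sketch of the on-chain/off-chain split described above follows: the hash of a patient record goes on a simulated ledger, while the encrypted record lives in a dict standing in for distributed cloud storage. This is not a real DPoS chain, and the cipher and names are illustrative only.

```python
# Toy ledger: hash on-chain, encrypted blob off-chain, with integrity check.
import hashlib, json

ledger, cloud = [], {}                        # stand-ins for chain and storage

def upload_record(patient_id, record, encrypt):
    blob = encrypt(json.dumps(record).encode())
    digest = hashlib.sha256(blob).hexdigest()
    cloud[digest] = blob                                     # encrypted, off-chain
    ledger.append({"patient": patient_id, "hash": digest})   # hash, on-chain
    return digest

def verify(digest):
    return hashlib.sha256(cloud[digest]).hexdigest() == digest

h = upload_record("P-001", {"bp": "120/80"}, encrypt=lambda b: b[::-1])  # toy cipher
print(verify(h))   # True unless the stored blob was tampered with
```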

Keywords: blockchain technology, data management applications, healthcare, interoperability, delegated proof of stake

Procedia PDF Downloads 138
24857 Newspaper Coverage and the Prevention of Child Sexual Abuse in Nigeria

Authors: Grace Iember Anweh, Er Shipp

Abstract:

Child Sexual Abuse (CSA) has been a contentious issue across the globe; the menace of child sexual violence cuts across all continents. Children from 0 to 13 years have been sexually abused, some to the extent that their reproductive organs have been permanently damaged. The research in view is timely, as it will contribute data on CSA and the media's role to the field of communication. This study holds that the adverse effects of this menace, through the psychological, health, and social consequences of sexual abuse, can hinder children, who are potential leaders of tomorrow, from harnessing their potential to contribute to the growth and development of society. Where government policies, the law, cultural beliefs, and the bottlenecks surrounding the processes of fighting child sexual abuse have failed, this study assumes that adequate coverage by the mass media, especially the newspapers known for their in-depth coverage and reporting, can help to eradicate the menace of CSA or reduce it to its barest minimum. Therefore, this study aims at assessing the coverage of newspapers, their policies and content towards preventive strategies, and how the public accesses and receives the messages, to the extent of taking action to forestall the persistence of sexual violation of children in Nigeria. Methodologically, the study adopted qualitative and quantitative methods. The study used the in-depth interview method to find out from journalists and editors of newspapers the policies that define the production of news content on sexual and gender-based violence. In addition, selected national daily newspapers are content-analysed to determine the focus of media coverage and whether the contents are preventive-based or case-based. Furthermore, caregivers of reproductive age, from 16 years and above, ranging from parents and guardians to school management, form the study population through a survey using a questionnaire. The aim is to determine their views regarding mass media coverage of sexual violence against children and the effectiveness of the content, to the extent of prompting them to keep children safe from sexual molesters. Findings from the content analysis so far show that newspapers in Nigeria are not engaged in preventive coverage of CSA; their contents are rather case-based.

Keywords: newspaper, coverage, prevention, child, sexual abuse

Procedia PDF Downloads 120
24856 Astronomical Object Classification

Authors: Alina Muradyan, Lina Babayan, Arsen Nanyan, Gohar Galstyan, Vigen Khachatryan

Abstract:

We present a photometric method for identifying stars, galaxies, and quasars in multi-color surveys, which uses a library of more than ~65,000 color templates for comparison with observed objects. The method aims to extract the information content of object colors in a statistically correct way and performs classification as well as redshift estimation for galaxies and quasars in a unified approach based on the same probability density functions. For the redshift estimation, we employ an advanced version of the Minimum Error Variance estimator, which determines the redshift error from the redshift-dependent probability density function itself. The method was originally developed for the Calar Alto Deep Imaging Survey (CADIS) but is now used in a wide variety of survey projects. We checked its performance by spectroscopy of CADIS objects, where the method provides high reliability (6 errors among 151 objects with R < 24), especially for quasar selection, and redshifts accurate to within σz ≈ 0.03 for galaxies and σz ≈ 0.1 for quasars. For an optimization of future survey efforts, a few model surveys are compared, designed to use the same total amount of telescope time but different sets of broad-band and medium-band filters. Their performance is investigated by Monte Carlo simulations as well as by analytic evaluation in terms of classification and redshift estimation. If photon noise were the only error source, broad-band and medium-band surveys should perform equally well, as long as they provide the same spectral coverage. In practice, medium-band surveys show superior performance due to their higher tolerance for calibration errors and cosmic variance. Finally, we discuss the relevance of color calibration and derive important conclusions for the issues of library design and choice of filters. The calibration accuracy poses strong constraints on an accurate classification, which are most critical for surveys with few, broad, and deeply exposed filters, but less severe for surveys with many, narrow, and less deep filters.
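
A schematic numpy sketch of template-library classification is given below: each observed color vector is compared against star/galaxy/quasar templates with a chi-square statistic, and the class likelihoods are summed over the library. The template values and noise level are synthetic stand-ins, not the actual CADIS library.

```python
# Chi-square template matching over a toy color-template library.
import numpy as np

rng = np.random.default_rng(3)
templates = {c: rng.normal(loc=m, size=(500, 4))            # synthetic library
             for c, m in [("star", 0.0), ("galaxy", 0.8), ("quasar", -0.5)]}

def classify(colors, sigma):
    like = {}
    for cls, T in templates.items():
        chi2 = (((colors - T) / sigma) ** 2).sum(axis=1)
        like[cls] = np.exp(-0.5 * chi2).sum()               # sum over templates
    total = sum(like.values())
    return max(like, key=like.get), {c: v / total for c, v in like.items()}

obs = np.array([0.75, 0.9, 0.7, 0.85])                      # observed colors
print(classify(obs, sigma=0.1))
```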

Keywords: VO, ArVO, DFBS, FITS, image processing, data analysis

Procedia PDF Downloads 80
24855 Multi-Objectives Genetic Algorithm for Optimizing Machining Process Parameters

Authors: Dylan Santos De Pinho, Nabil Ouerhani

Abstract:

Energy consumption of machine-tools is becoming critical for machine-tool builders and end-users because of economic, ecological, and legislation-related reasons. Many machine-tool builders are seeking solutions that allow the reduction of energy consumption of machine-tools while preserving the same productivity rate and the same quality of machined parts. In this paper, we present the first results of a project conducted jointly by academic and industrial partners to reduce the energy consumption of a Swiss-type lathe. We employ genetic algorithms to find optimal machining parameters, i.e., the set of parameters that leads to the best trade-off between energy consumption, part quality, and tool lifetime. Three main machining process parameters are considered in our optimization technique, namely depth of cut, spindle rotation speed, and material feed rate. These machining process parameters have been identified as the most influential ones in the configuration of the Swiss-type machining process. A state-of-the-art multi-objective genetic algorithm has been used. The algorithm combines three fitness functions, which are objective functions that permit the evaluation of a set of parameters against the three objectives: energy consumption, quality of the machined parts, and tool lifetime. In this paper, we focus on the investigation of the fitness functions related to energy consumption. Four different energy-consumption-related fitness functions have been investigated and compared. The first fitness function refers to the Kienzle cutting force model. The second fitness function uses the Material Removal Rate (MRR) as an indicator of energy consumption. The two other fitness functions are non-deterministic, learning-based functions. One uses a simple neural network to learn the relation between the process parameters and the energy consumption from experimental data, and the other uses Lasso regression to determine the same relation. The goal is, then, to find out which fitness function best predicts the energy consumption of a Swiss-type machining process for a given set of machining process parameters. Once determined, these functions may be used for optimization purposes: determining the optimal machining process parameters leading to minimum energy consumption. The performance of the four fitness functions has been evaluated. The Tornos DT13 Swiss-type lathe has been used to carry out the experiments. A mechanical part including various Swiss-type machining operations has been selected for the experiments. The evaluation process starts with generating a set of CNC (Computer Numerical Control) programs for machining the part at hand. Each CNC program considers a different set of machining process parameters. During the machining process, the power consumption of the spindle is measured. All collected data are assigned to the appropriate CNC program and thus to the set of machining process parameters. The evaluation approach consists in calculating the correlation between the normalized measured power consumption and the normalized power consumption prediction for each of the four fitness functions. The evaluation shows that the Lasso and neural network fitness functions have the highest correlation coefficients, with 97%. The fitness function 'Material Removal Rate' (MRR) has a correlation coefficient of 90%, whereas the Kienzle-based fitness function has a correlation coefficient of 80%.
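
The two learning-based fitness functions and the correlation-based evaluation can be sketched as follows; the power model used to generate the data here is synthetic, so the script only illustrates the fit-and-correlate procedure, not the paper's measurements.

```python
# Fit Lasso and a small NN from (depth of cut, spindle speed, feed rate) to
# spindle power, then score each by correlation, as in the evaluation above.
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(4)
P = rng.uniform([0.1, 1000, 0.01], [2.0, 8000, 0.3], size=(200, 3))  # parameters
power = 0.8 * P[:, 0] * P[:, 2] * P[:, 1] / 1000 + rng.normal(0, 0.1, 200)

for name, model in [("lasso", Lasso(alpha=0.01)),
                    ("nn", MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000))]:
    model.fit(P[:100], power[:100])                  # train on half the data
    pred = model.predict(P[100:])
    r = np.corrcoef(pred, power[100:])[0, 1]
    print(name, round(r, 3))                         # correlation with measurement
```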

Keywords: adaptive machining, genetic algorithms, smart manufacturing, parameters optimization

Procedia PDF Downloads 147
24854 Cellular Mobile Telecommunication GSM Radio Base Station Network Planning

Authors: Saeed Alzahrani, Yaser Miaji

Abstract:

The project involves the design and simulation of a mobile cellular telecommunication network using the software tool CelPlanner. The design is mainly concerned with the Global System for Mobile Communications (GSM). The design and simulation of the network are done for a small part of the area allocated to us in the terrain of Shreveport city. The project is concerned with designing a network that is cost-effective and that also efficiently meets the required Grade of Service (GOS) and Quality of Service (QOS). The expected outcome of this project is the design of a network that gives good coverage of the allocated area with minimum co-channel and adjacent-channel interference. Handover and traffic handling capacity should also be taken into consideration and should be good for the given area; the traffic handling capacity of the network largely decides whether the designed network is good or bad. The design also takes into consideration topographical and morphological information.
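
The Grade of Service check behind such a design rests on the Erlang B formula; a minimal sketch of the standard recursive computation is shown below. The channel count and traffic loads are illustrative examples, not the project's planning figures.

```python
# Erlang B blocking probability for N traffic channels carrying A erlangs.
def erlang_b(channels, traffic_erlangs):
    b = 1.0
    for n in range(1, channels + 1):            # stable recursive form
        b = traffic_erlangs * b / (n + traffic_erlangs * b)
    return b

# e.g., one GSM carrier with ~7 traffic channels, against a 2% GOS target
for a in (2.0, 3.0, 4.0):
    print(a, "erlangs ->", round(erlang_b(7, a), 4))
```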

Keywords: mobile communication, GSM, radio base station, network planning

Procedia PDF Downloads 439
24853 Exploring the Availability and Distribution of Public Green Spaces among Riyadh Residential Neighborhoods

Authors: Abdulwahab Alalyani, Mahbub Rashid

Abstract:

Public green space promotes community health, including daily activities, but these resources may not be sufficiently available or equitably distributed. This paper measures and compares the availability of public green spaces (PGS) among low-, middle-, and high-income neighborhoods in Riyadh city. Additionally, it compares the total availability of PGS to the WHO standard and to Dubai's availability of PGS per person. All PGS were mapped using geographic information systems, and the total area of available PGS was compared to the WHO and Dubai standards. To evaluate the differences in PGS availability across low-, medium-, and high-income Riyadh neighborhoods, we used a one-way analysis of variance (ANOVA). As a result, comparing the PGS of Riyadh neighborhoods to the WHO and Dubai availability figures, it was found that Riyadh's PGS is lower than the minimum WHO standard as well as Dubai's; Riyadh has only 1.13 m2 of PGS per capita. Second, the availability of PGS differed significantly among Riyadh neighborhoods based on socioeconomic status. Future development of PGS should focus on increasing PGS availability and should give priority to low-income and unhealthy communities.
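
The statistical test described above can be sketched with scipy on synthetic per-neighborhood availability figures; the group means below are illustrative assumptions, not the study's data.

```python
# One-way ANOVA on PGS availability grouped by neighborhood income level.
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(5)
low = rng.normal(0.8, 0.3, 40)      # m2 of PGS per capita, low-income areas
mid = rng.normal(1.1, 0.3, 40)      # middle-income areas
high = rng.normal(1.6, 0.4, 40)     # high-income areas

F, p = f_oneway(low, mid, high)
print(f"F = {F:.2f}, p = {p:.4f}")  # p < 0.05 -> availability differs by income
```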

Keywords: spatial equity, green space, quality of life, built environment

Procedia PDF Downloads 128
24852 Antibacterial Activity of Nisin: Comparison the Role of Free and Encapsulated Nisin to Control Staphylococcus Aureus Inoculated in Minced Beef

Authors: Zh. Ghasemi, S. Nouri Saeedlou, A. Ghasemi, SL. Nasiri, P. Ayremlou, P. Mahasti

Abstract:

Nisin is successfully used as an antibacterial agent in various food products, although previous studies concluded that nisin is not very effective in meat environments; reduced antimicrobial efficacy of nisin when applied in food has been frequently observed. The aim of this study is to evaluate the potential of free and encapsulated nisin to inhibit the growth of Staphylococcus aureus in minced beef. The minimum inhibitory concentration (MIC) of nisin is determined against S. aureus using the agar dilution method. Nisin is encapsulated by spray drying, and the encapsulation efficiency, mass yield, and total solids content values are 47.79%, 61%, and 96.41%, respectively. The study of in vitro release kinetics shows that the highest release of nisin from zein capsules is obtained after 72 hours. This work shows that an appropriate delivery system is necessary to obtain the desirable effect of nisin in meat and meat products.

Keywords: nisin, encapsulation, Staphylococcus aureus, minced beef, antibacterial activity

Procedia PDF Downloads 291
24851 Study and Design of Solar Inverter System

Authors: Khaled A. Madi, Abdulalhakim O. Naji, Hassouna A. Aalaoh, Elmahdi Eldeeb

Abstract:

Solar energy is one of the cleanest energy sources, with no environmental impact. Due to the rapid increase in industrial as well as domestic needs, solar energy has become a good candidate as a safe and easy-to-handle energy source, especially after it became widely available due to reductions in manufacturing price. The main part of the solar inverter system is the inverter, where DC is inverted to AC and where we try to reduce the power loss to the minimum possible level by the use of a microcontroller. In this work, a deep investigation is made experimentally as well as theoretically of a microcontroller-based variable-frequency power inverter. The microcontroller provides the variable-frequency pulse width modulation (PWM) signal that controls the switching of the gate of the Insulated Gate Bipolar Transistor (IGBT), producing fewer harmonics at the output of the power inverter, which can be fed to the public grid at high quality. The proposed work, for single phase as well as three phases, is also simulated using Matlab/Simulink, where we found good agreement between the simulated and the practical results, even though the experimental work was done in the laboratory of the academy.
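
The sinusoidal PWM generation that such a microcontroller performs can be sketched offline as follows: a reference sine is compared against a triangular carrier to produce the IGBT gate signal. The frequencies, modulation index, and resolution are illustrative assumptions.

```python
# Sinusoidal PWM: reference sine vs. triangular carrier -> gate drive signal.
import numpy as np

f_ref, f_carrier, fs = 50.0, 2000.0, 200_000.0     # Hz; assumed values
t = np.arange(0, 0.02, 1 / fs)                     # one 50 Hz period

reference = 0.9 * np.sin(2 * np.pi * f_ref * t)    # modulation index 0.9
carrier = 2 * (2 * np.abs((t * f_carrier) % 1 - 0.5)) - 1   # triangle in [-1, 1]
gate = (reference > carrier).astype(int)           # IGBT gate drive signal

print(gate[:20], "mean duty over period:", gate.mean())
```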

Keywords: solar, inverter, PV, solar inverter system

Procedia PDF Downloads 462
24850 Finding the Free Stream Velocity Using Flow Generated Sound

Authors: Saeed Hosseini, Ali Reza Tahavvor

Abstract:

Sound processing is a subject that has recently attracted many researchers. It is efficient and usually less expensive than other methods. In this paper, flow-generated sound is used to estimate the speed of free flows. Many sound samples were gathered, and after analyzing the data, a parameter named wave power was chosen. For all samples, the wave power was calculated and averaged for each flow speed. A curve was fitted to the averaged data, and a correlation between wave power and flow speed was found. Test data were used to validate the method, and the errors for all test data were under 10 percent. The speed of the flow can thus be estimated by calculating the wave power of the flow-generated sound and using the proposed correlation.
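
A hedged sketch of the calibration idea follows. The mean-squared-amplitude definition of wave power and the log-log (power-law) fit are assumptions, since the paper's exact definitions are not given here, and the "recordings" are synthetic stand-ins.

```python
# Fit speed vs. wave power on calibration recordings, then invert the fit.
import numpy as np

def wave_power(samples):
    return np.mean(np.asarray(samples, float) ** 2)   # assumed definition

speeds = np.array([2.0, 4.0, 6.0, 8.0])               # known flow speeds
powers = np.array([wave_power(np.random.default_rng(int(s)).normal(0, s / 4, 8000))
                   for s in speeds])                   # stand-in recordings

coeffs = np.polyfit(np.log(powers), np.log(speeds), 1)   # log-log curve fit
estimate = lambda p: np.exp(np.polyval(coeffs, np.log(p)))
print(estimate(powers[2]))                                # ~6.0 expected
```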

Keywords: the flow generated sound, free stream, sound processing, speed, wave power

Procedia PDF Downloads 415
24849 Applying Big Data Analysis to Efficiently Exploit the Vast Unconventional Tight Oil Reserves

Authors: Shengnan Chen, Shuhua Wang

Abstract:

Successful production of hydrocarbons from unconventional tight oil reserves has changed the energy landscape in North America. The oil contained within these reservoirs typically will not flow to the wellbore at economic rates without assistance from advanced horizontal wells and multi-stage hydraulic fracturing. Efficient and economic development of these reserves is a priority for society, government, and industry, especially under the current low oil prices. Meanwhile, society needs technological and process innovations to enhance oil recovery while concurrently reducing environmental impacts. Recently, big data analysis and artificial intelligence have become very popular, developing data-driven insights for better designs and decisions in various engineering disciplines. However, the application of data mining in petroleum engineering is still in its infancy. The objective of this research is to apply intelligent data analysis and data-driven models to exploit unconventional oil reserves both efficiently and economically. More specifically, a comprehensive database including the reservoir geological data, reservoir geophysical data, well completion data, and production data for thousands of wells is first established to discover valuable insights and knowledge related to tight oil reserves development. Several data analysis methods are introduced to analyze such a huge dataset. For example, K-means clustering is used to partition all observations into clusters; principal component analysis is applied to emphasize the variation and bring out strong patterns in the dataset, making the big data easy to explore and visualize; exploratory factor analysis (EFA) is used to identify the complex interrelationships between well completion data and well production data. Different data mining techniques, such as artificial neural networks, fuzzy logic, and machine learning techniques, are then summarized, and appropriate ones are selected to analyze the database based on prediction accuracy, model robustness, and reproducibility. Advanced knowledge and patterns are finally recognized and integrated into a modified self-adaptive differential evolution optimization workflow to enhance the oil recovery and maximize the net present value (NPV) of the unconventional oil resources. This research will advance the knowledge in the development of unconventional oil reserves and bridge the gap between big data and performance optimization in these formations. The newly developed data-driven optimization workflow is a powerful approach to guide field operations, leading to better designs, higher oil recovery, and greater economic return from future wells in unconventional oil reserves.
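
A minimal sketch of the exploratory stage described above is given below: standardize well records, compress them with PCA, and partition wells with K-means. The file name and column names are assumed placeholders for the geological, completion, and production data.

```python
# Standardize -> PCA -> K-means exploration of a well database (assumed schema).
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

wells = pd.read_csv("tight_oil_wells.csv")        # hypothetical database export
features = wells[["porosity", "stage_count", "proppant_t", "lateral_m"]]

Z = StandardScaler().fit_transform(features)
pcs = PCA(n_components=2).fit_transform(Z)        # emphasize strong patterns
wells["cluster"] = KMeans(n_clusters=4, n_init=10).fit_predict(pcs)

print(wells.groupby("cluster")["cum_oil_bbl"].mean())   # assumed production column
```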

Keywords: big data, artificial intelligence, enhance oil recovery, unconventional oil reserves

Procedia PDF Downloads 283
24848 Credit Card Fraud Detection with Ensemble Model: A Meta-Heuristic Approach

Authors: Gong Zhilin, Jing Yang, Jian Yin

Abstract:

The purpose of this paper is to develop a novel system for credit card fraud detection based on sequential modeling of data using hybrid deep learning models. The proposed model encompasses five major phases: pre-processing, imbalanced-data handling, feature extraction, optimal feature selection, and fraud detection with an ensemble classifier. The collected raw data (input) are pre-processed to enhance their quality through the alleviation of missing data, noisy data, and null values. The pre-processed data are class-imbalanced in nature and are therefore handled effectively with a K-means clustering-based SMOTE model. From the class-balanced data, the most relevant features are extracted: improved Principal Component Analysis (PCA) features, statistical features (mean, median, standard deviation), and higher-order statistical features (skewness and kurtosis). Among the extracted features, the most optimal features are selected with the Self-improved Arithmetic Optimization Algorithm (SI-AOA). This SI-AOA model is a conceptual improvement of the standard Arithmetic Optimization Algorithm. The deep learning models are Long Short-Term Memory (LSTM), Convolutional Neural Network (CNN), and an optimized Quantum Deep Neural Network (QDNN). The LSTM and CNN are trained with the extracted optimal features, and their outcomes enter as input to the optimized QDNN, which provides the final detection outcome. Since the QDNN is the ultimate detector, its weight function is fine-tuned with the Self-improved Arithmetic Optimization Algorithm (SI-AOA).
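
The two front phases can be sketched under stated assumptions: rebalance the classes with SMOTE (from imbalanced-learn, standing in for the paper's K-means-based variant) and extract the statistical features named above. The data are synthetic, and the SI-AOA and QDNN stages are not reproduced.

```python
# SMOTE rebalancing plus statistical feature extraction (illustrative data).
import numpy as np
from imblearn.over_sampling import SMOTE
from scipy.stats import skew, kurtosis

rng = np.random.default_rng(6)
X = rng.normal(size=(1000, 8))                     # transaction feature windows
y = (rng.uniform(size=1000) < 0.05).astype(int)    # ~5% fraud: imbalanced

X_bal, y_bal = SMOTE(random_state=0).fit_resample(X, y)

def stat_features(rows):
    return np.c_[rows.mean(axis=1), np.median(rows, axis=1), rows.std(axis=1),
                 skew(rows, axis=1), kurtosis(rows, axis=1)]

F = stat_features(X_bal)                           # inputs for the LSTM/CNN stage
print(F.shape, np.bincount(y_bal))                 # balanced class counts
```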

Keywords: credit card, data mining, fraud detection, money transactions

Procedia PDF Downloads 131
24847 Tea and Its Working Methodology in the Biomass Estimation of Poplar Species

Authors: Pratima Poudel, Austin Himes, Heidi Renninger, Eric McConnel

Abstract:

Populus spp. (poplar) are among the fastest-growing trees in North America, making them ideal for a range of applications, as they can achieve high yields on short rotations and regenerate by coppice. Furthermore, poplar undergoes biochemical conversion to fuels without complexity, making it one of the most promising purpose-grown, woody perennial energy sources. Employing wood-based biomass for bioenergy offers numerous benefits, including reduced greenhouse gas (GHG) emissions compared to non-renewable traditional fuels, the preservation of robust forest ecosystems, and economic prospects for rural communities. In order to gain a better understanding of the potential use of poplar as a biomass feedstock for biofuel in the southeastern US, we conducted a techno-economic assessment (TEA). This assessment is an analytical approach that integrates the technical and economic factors of a production system to evaluate its economic viability. Our TEA specifically focused on a short rotation coppice system employing a single-pass cut-and-chip harvesting method for poplar. It encompassed all the costs associated with establishing dedicated poplar plantations, including land rent, site preparation, planting, fertilizers, and herbicides. Additionally, we performed a sensitivity analysis to evaluate how different costs can affect the economic performance of the poplar cropping system. This analysis aimed to determine the minimum average delivered selling price of one metric ton of biomass necessary to achieve a desired rate of return over the cropping period. To inform the TEA, data on establishment, crop care activities, and crop yields were derived from a field study conducted at the Mississippi Agricultural and Forestry Experiment Station's Bearden Dairy Research Center in Oktibbeha County and the Pontotoc Ridge-Flatwood Branch Experiment Station in Pontotoc County.
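
The break-even logic behind a minimum delivered selling price can be illustrated with a simplified worked example: find the price per tonne that makes the discounted cash flow zero at the target rate of return. All cost, yield, and rate figures below are illustrative assumptions, not the study's field data.

```python
# Bisection for the break-even (NPV = 0) delivered biomass price.
costs = {0: 900.0, 1: 150.0, 2: 150.0, 3: 400.0}   # $/ha: establish, care, harvest
yields = {3: 35.0}                                  # odt/ha harvested in year 3
rate = 0.08                                         # desired rate of return

def npv(price):
    return sum((yields.get(t, 0.0) * price - c) / (1 + rate) ** t
               for t, c in costs.items())

lo, hi = 0.0, 500.0
while hi - lo > 0.01:                               # NPV is increasing in price
    mid = (lo + hi) / 2
    lo, hi = (mid, hi) if npv(mid) < 0 else (lo, mid)

print(f"minimum delivered selling price ~ ${hi:.2f}/odt")
```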

Keywords: biomass, populus species, sensitivity analysis, technoeconomic analysis

Procedia PDF Downloads 83
24846 Speech Intelligibility Improvement Using Variable Level Decomposition DWT

Authors: Samba Raju, Chiluveru, Manoj Tripathy

Abstract:

Intelligibility is an essential characteristic of a speech signal; it is used to help in the understanding of the information in the speech signal. Background noise in the environment can deteriorate the intelligibility of recorded speech. In this paper, we present a simple variance-subtracted, variable-level discrete wavelet transform that improves the intelligibility of speech. The proposed algorithm does not require an explicit estimation of the noise, i.e., prior knowledge of the noise; hence, it is easy to implement, and it reduces the computational burden. The proposed algorithm decides a separate decomposition level for each frame based on signal-dominant and noise-dominant criteria. The performance of the proposed algorithm is evaluated with the short-time objective intelligibility (STOI) measure, and the results obtained are compared with universal discrete wavelet transform (DWT) thresholding and Minimum Mean Square Error (MMSE) methods. The experimental results reveal that the proposed scheme outperforms the competing methods.
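
A PyWavelets sketch of the per-frame idea follows, with a simple variance-based rule standing in for the paper's signal-dominant/noise-dominant criterion (the exact rule is a labeled assumption here, as is the MAD-based threshold).

```python
# Per-frame variable-level DWT denoising with soft thresholding.
import numpy as np
import pywt

def denoise_frame(frame, max_level=5):
    # assumed proxy rule: relatively noisier frames get deeper decomposition
    level = int(np.clip(1 + 4 * frame.std() / (np.abs(frame).max() + 1e-9),
                        1, max_level))
    coeffs = pywt.wavedec(frame, "db4", level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745          # MAD noise estimate
    thr = sigma * np.sqrt(2 * np.log(frame.size))           # universal threshold
    coeffs = [coeffs[0]] + [pywt.threshold(c, thr, "soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, "db4")[: frame.size]

noisy = (np.sin(np.linspace(0, 40, 4000))
         + 0.3 * np.random.default_rng(7).normal(size=4000))
clean = np.concatenate([denoise_frame(f) for f in np.split(noisy, 10)])
print(clean.shape)
```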

Keywords: discrete wavelet transform, speech intelligibility, STOI, standard deviation

Procedia PDF Downloads 148