Search results for: relay of sensing data
23848 An Evaluation of the Impact of E-Banking on Operational Efficiency of Banks in Nigeria
Authors: Ibrahim Rabiu Darazo
Abstract:
This research examined the impact of e-banking on the operational efficiency of banks in Nigeria, taking three selected banks (Diamond Bank Plc, GTBank Plc, and Fidelity Bank Plc) as cases. The research is quantitative and uses both primary and secondary sources of data. Questionnaires were used to obtain primary data: 150 questionnaires were distributed among staff and customers of the three banks, and the data collected were analysed using the chi-square test, whereas the secondary data were obtained from relevant textbooks, journals, and websites. The findings make clear that the use of e-banking has improved the efficiency of these banks in providing services to customers electronically through internet banking, telephone banking, and ATMs; it reduces the time taken to serve customers, allows new customers to open accounts online, and gives customers access to their accounts at all times (24/7). E-banking provides access to customer information from the database, and the costs of cheques and postage were eliminated. The recommendations at the end of the research include: the banks should keep their electronic equipment up to date; e-fraud (internal and external) should be controlled; the banks should employ qualified manpower; and biometric ATMs should be introduced to reduce fraud with ATM cards, as is done in other countries such as the USA.
Keywords: banks, electronic banking, operational efficiency of banks, biometric ATMs
Procedia PDF Downloads 336
23847 Optimize Data Evaluation Metrics for Fraud Detection Using Machine Learning
Authors: Jennifer Leach, Umashanger Thayasivam
Abstract:
The use of technology has benefited society in more ways than one ever thought possible. Unfortunately, as society's knowledge of technology has advanced, so has its knowledge of ways to use technology to manipulate people. This has led to a simultaneous advancement in the world of fraud. Machine learning techniques can offer a possible solution to help counter this advancement. This research explores how various machine learning techniques can aid in detecting fraudulent activity across two different types of fraudulent data; the accuracy, precision, recall, and F1 score were recorded for each method. Each machine learning model was also tested across five different training and testing splits in order to discover which split and technique would lead to the most optimal results.
Keywords: data science, fraud detection, machine learning, supervised learning
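As a concrete illustration of the evaluation metrics this abstract names, the sketch below computes accuracy, precision, recall, and F1 from confusion-matrix counts. The function name and the counts are invented for illustration, not taken from the paper.

```python
# Hypothetical sketch: accuracy, precision, recall and F1 computed from
# binary confusion-matrix counts (tp/fp/fn/tn). Illustrative only.
def classification_metrics(tp, fp, fn, tn):
    """Return accuracy, precision, recall and F1 for one binary classifier."""
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return accuracy, precision, recall, f1

# Example: 80 frauds caught, 20 false alarms, 10 frauds missed, 890 clean.
acc, prec, rec, f1 = classification_metrics(tp=80, fp=20, fn=10, tn=890)
print(round(acc, 3), round(prec, 3), round(rec, 3), round(f1, 3))
```

Recording these four numbers per model and per train/test split, as the abstract describes, then reduces to calling this on each model's confusion counts.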
Procedia PDF Downloads 199
23846 Suitability of Satellite-Based Data for Groundwater Modelling in Southwest Nigeria
Authors: O. O. Aiyelokun, O. A. Agbede
Abstract:
Numerical modelling of groundwater flow can be susceptible to calibration errors due to the lack of adequate ground-based hydro-meteorological stations in river basins. Groundwater resources management in Southwest Nigeria is currently challenged by overexploitation, lack of planning and monitoring, urbanization, and climate change; hence, to adopt models as decision support tools for the sustainable management of groundwater, they must be adequately calibrated. Since river basins in Southwest Nigeria are characterized by missing data and lack adequate ground-based hydro-meteorological stations, adopting satellite-based data for constructing distributed models is crucial. This study seeks to evaluate the suitability of satellite-based data as a substitute for ground-based data for computing boundary conditions, by determining whether ground- and satellite-based meteorological data fit well in the Ogun and Oshun River basins. The Climate Forecast System Reanalysis (CFSR) global meteorological dataset was first obtained in daily form and converted to monthly form for a period of 432 months (January 1979 to June 2014). Afterwards, ground-based meteorological data for Ikeja (1981-2010), Abeokuta (1983-2010), and Oshogbo (1981-2010) were compared with the CFSR data using goodness-of-fit (GOF) statistics. The study revealed that, based on mean absolute error (MAE), coefficient of correlation (r), and coefficient of determination (R²), all meteorological variables except wind speed fit well. It was further revealed that maximum and minimum temperature, relative humidity, and rainfall had a high index of agreement (d) and ratio of standard deviations (rSD), implying that the CFSR dataset could be used to compute boundary conditions such as groundwater recharge and potential evapotranspiration.
The study concluded that satellite-based data such as the CFSR should be used as input when constructing groundwater flow models in river basins in Southwest Nigeria, where the majority of river basins are only partially gauged and characterized by long runs of missing hydro-meteorological data.
Keywords: boundary condition, goodness of fit, groundwater, satellite-based data
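The goodness-of-fit statistics the abstract lists (MAE, correlation r, index of agreement d, ratio of standard deviations rSD) can be sketched as below. The two series are invented stand-ins for a ground record and the CFSR series, and the paper's exact formulations may differ.

```python
import math

def gof_stats(obs, sim):
    """Goodness-of-fit measures for a ground (obs) vs satellite (sim) series:
    mean absolute error (MAE), Pearson correlation (r), Willmott's index of
    agreement (d) and the ratio of standard deviations (rSD).
    Illustrative sketch only."""
    n = len(obs)
    mo = sum(obs) / n
    ms = sum(sim) / n
    mae = sum(abs(o - s) for o, s in zip(obs, sim)) / n
    cov = sum((o - mo) * (s - ms) for o, s in zip(obs, sim))
    so = math.sqrt(sum((o - mo) ** 2 for o in obs))
    ss = math.sqrt(sum((s - ms) ** 2 for s in sim))
    r = cov / (so * ss)
    d = 1 - (sum((s - o) ** 2 for o, s in zip(obs, sim))
             / sum((abs(s - mo) + abs(o - mo)) ** 2 for o, s in zip(obs, sim)))
    rsd = ss / so  # sums of squares only; the 1/n factors cancel in the ratio
    return mae, r, d, rsd

ground = [110.0, 95.0, 130.0, 80.0, 140.0]   # e.g. monthly rainfall, mm
cfsr = [105.0, 100.0, 125.0, 85.0, 135.0]
mae, r, d, rsd = gof_stats(ground, cfsr)
```

A variable "fits well" in the abstract's sense when r and d are close to 1 and rSD is close to 1 while MAE stays small.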
Procedia PDF Downloads 132
23845 An Intelligent Prediction Method for Annular Pressure Driven by Mechanism and Data
Authors: Zhaopeng Zhu, Xianzhi Song, Gensheng Li, Shuo Zhu, Shiming Duan, Xuezhe Yao
Abstract:
Accurate calculation of wellbore pressure is of great significance in preventing wellbore risk during drilling. The traditional mechanism model requires many iterative solving procedures, which reduces calculation efficiency and makes it difficult to meet the demand for dynamic control of wellbore pressure. In recent years, many scholars have introduced artificial intelligence algorithms into wellbore pressure calculation, significantly improving its efficiency and accuracy. However, due to the 'black box' property of intelligent algorithms, existing intelligent wellbore pressure models struggle to generalize beyond the scope of their training data and overreact to data noise, often producing abnormal calculation results. In this study, the multi-phase flow mechanism is embedded into the objective function of the neural network model as a constraint condition, and an intelligent prediction model of wellbore pressure under this constraint is established based on more than 400,000 sets of pressure measurement while drilling (MPD) data. The multi-phase flow constraint makes the predictions of the neural network model more consistent with the distribution law of wellbore pressure, which overcomes the black-box attribute of the model to some extent: accuracy on the independent test data set is further improved, and abnormal calculated values essentially disappear. This prediction method, driven jointly by MPD data and the multi-phase flow mechanism, points the way to accurate and efficient wellbore pressure prediction in the future.
Keywords: multiphase flow mechanism, pressure while drilling data, wellbore pressure, mechanism constraints, combined drive
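The mechanism-as-constraint idea can be sketched with a toy penalized objective: a data-misfit term plus a penalty that keeps predictions close to a simplified physical model. The one-parameter "network" and linear "mechanism" below are placeholders for the paper's neural network and multi-phase flow model; all values are synthetic.

```python
import random

# Minimal sketch of a mechanism-constrained objective:
#   loss = mean((pred - y)^2) + lam * mean((pred - mech(x))^2)
# trained by plain gradient descent on a single-parameter model pred = w*x.
def train(xs, ys, mech, lam=0.5, lr=0.01, epochs=500):
    w = 0.0
    n = len(xs)
    for _ in range(epochs):
        grad = 0.0
        for x, y in zip(xs, ys):
            pred = w * x
            grad += 2 * (pred - y) * x / n              # data-misfit gradient
            grad += lam * 2 * (pred - mech(x)) * x / n  # mechanism penalty
        w -= lr * grad
    return w

random.seed(0)
xs = [i / 10 for i in range(1, 21)]
ys = [2.0 * x + random.gauss(0, 0.3) for x in xs]  # noisy "measurements"
mech = lambda x: 2.0 * x                           # simplified mechanism
w = train(xs, ys, mech)
```

Raising `lam` pulls the fitted parameter toward the mechanism's prediction, which is the sense in which the constraint suppresses noise-driven abnormal outputs.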
Procedia PDF Downloads 176
23844 A Hybrid Data Mining Algorithm Based System for Intelligent Defence Mission Readiness and Maintenance Scheduling
Authors: Shivam Dwivedi, Sumit Prakash Gupta, Durga Toshniwal
Abstract:
It is a challenging task today to keep defence forces in the highest state of combat readiness under budgetary constraints. A huge amount of time and money is squandered on unnecessary and expensive traditional maintenance activities. To overcome this limitation, the Defence Intelligent Mission Readiness and Maintenance Scheduling System has been proposed, which ameliorates the maintenance system by diagnosing equipment condition and predicting maintenance requirements. Based on new data mining algorithms, this system intelligently optimises mission readiness for imminent operations and maintenance scheduling in repair echelons. With modified data mining algorithms such as the Weighted Feature Ranking Genetic Algorithm and an SVM-Random Forest-Linear ensemble, it improves reliability, availability, and safety while reducing maintenance cost and Equipment Out of Action (EOA) time. The results clearly show that the introduced algorithms have an edge over conventional data mining algorithms. The system, utilizing an intelligent condition-based maintenance approach, improves the operational and maintenance decision strategy of the defence force.
Keywords: condition based maintenance, data mining, defence maintenance, ensemble, genetic algorithms, maintenance scheduling, mission capability
Procedia PDF Downloads 299
23843 Using Emerging Hot Spot Analysis to Analyze Overall Effectiveness of Policing Policy and Strategy in Chicago
Authors: Tyler Gill, Sophia Daniels
Abstract:
The paper examines how assessing the spatial-temporal constraints of data can help inform policymakers and law enforcement officials. The authors utilize Chicago crime data from 2006-2016 to demonstrate that the Emerging Hot Spot tool is an ideal hot spot clustering approach for analyzing crime data. Traditional approaches include density maps or creating a spatial weights matrix to incorporate the spatial-temporal constraints. This newer approach utilizes a space-time implementation of the Getis-Ord Gi* statistic to visualize the data more quickly and support better decisions. The research will complement socio-cultural research in finding key patterns that can frame future policies and evaluate the implementation of prior strategies. Through this analysis, homicide trends and patterns are found more effectively, and recommendations are offered for real-life use by non-traditional users of GIS.
Keywords: crime mapping, emerging hot spot analysis, Getis-Ord Gi*, spatial-temporal analysis
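A minimal sketch of the Getis-Ord Gi* statistic underlying the tool is given below; real emerging hot spot analysis adds the time dimension (space-time bins and a Mann-Kendall trend test) on top of this. The toy incident strip and binary neighbourhood are invented for illustration.

```python
import math

# Pure-Python Getis-Ord Gi* for one spatial unit, with binary weights that
# include the unit itself, as in the standard hot spot formulation.
def getis_ord_gi_star(values, neighbors, i):
    """values: incident counts; neighbors[i]: indices inside unit i's
    weight band (self included). Returns the z-scored Gi* statistic."""
    n = len(values)
    xbar = sum(values) / n
    s = math.sqrt(sum(v * v for v in values) / n - xbar ** 2)
    wi = len(neighbors[i])                       # sum of binary weights
    num = sum(values[j] for j in neighbors[i]) - xbar * wi
    den = s * math.sqrt((n * wi - wi ** 2) / (n - 1))
    return num / den

# 1-D strip of 9 cells with a spike in the middle; neighbours = self +/- 1.
counts = [1, 1, 1, 8, 9, 8, 1, 1, 1]
nbrs = [[j for j in (i - 1, i, i + 1) if 0 <= j < len(counts)]
        for i in range(len(counts))]
z_hot = getis_ord_gi_star(counts, nbrs, 4)   # centre of the spike
z_cold = getis_ord_gi_star(counts, nbrs, 0)  # quiet edge cell
```

A large positive z marks a statistically hot cluster; values near or below zero mark quiet or cold areas.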
Procedia PDF Downloads 246
23842 Active Learning in Engineering Courses Using Excel Spreadsheet
Authors: Promothes Saha
Abstract:
Recently, transportation engineering industry members at the study university expressed concern that students lacked the skills needed to solve real-world engineering problems using spreadsheet data analysis. In response, this study investigated how to engage students more effectively by incorporating spreadsheet analysis during class, and thereby help them learn the course topics. Helping students link theoretical knowledge to real-world problems can be a challenge. In this effort, in-class activities and worksheets were redesigned to integrate Excel in solving example problems, using built-in tools including cell referencing, equations, the data analysis tool pack, the solver tool, conditional formatting, charts, etc. The effectiveness of this technique was investigated using students' evaluations of the course, enrollment data, and students' comments. Based on those criteria, it is evident that the spreadsheet activities may increase student learning.
Keywords: civil, engineering, active learning, transportation
Procedia PDF Downloads 139
23841 Understanding Cruise Passengers’ On-board Experience throughout the Customer Decision Journey
Authors: Sabina Akter, Osiris Valdez Banda, Pentti Kujala, Jani Romanoff
Abstract:
This paper examines the relationship between on-board environmental factors and overall customer satisfaction in the context of the cruise on-board experience. The on-board environmental factors considered are ambient, layout/design, social, product/service, and on-board enjoyment factors. The study presents a data-driven framework and model for the on-board cruise experience. The data were collected from 893 respondents through a self-administered online questionnaire about their cruise experience. The study traces cruise passengers' on-board experience through the customer decision journey based on publicly available data. Pearson correlation and regression analysis were applied, and the results show a positive and significant relationship between the environmental factors and the on-board experience. These data help in understanding the cruise passengers' on-board experience, which will feed into the ultimate decision-making process in cruise ship design.
Keywords: cruise behavior, customer activities, on-board environmental factors, on-board experience, user or customer satisfaction
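The correlation-and-regression step can be sketched in a few lines; the factor scores and satisfaction ratings below are invented illustrations, not survey responses from the study.

```python
# Pearson correlation plus a simple least-squares fit between one
# environmental factor score and overall satisfaction. Illustrative data.
def pearson_and_fit(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    r = sxy / (sxx * syy) ** 0.5
    slope = sxy / sxx                 # regression line y = a + b*x
    intercept = my - slope * mx
    return r, slope, intercept

ambient = [3.0, 3.5, 4.0, 4.5, 5.0]       # factor score (made up)
satisfaction = [3.2, 3.6, 3.9, 4.6, 4.8]  # overall rating (made up)
r, b, a = pearson_and_fit(ambient, satisfaction)
```

An r close to 1 with a positive slope is what the abstract's "positive and significant relationship" would look like numerically (significance itself needs a t-test on r).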
Procedia PDF Downloads 172
23840 Holistic Risk Assessment Based on Continuous Data from the User’s Behavior and Environment
Authors: Cinzia Carrodano, Dimitri Konstantas
Abstract:
Risk is part of our lives. In today's society, risk is connected to our safety, and safety has become a major priority in our life. Each person lives his or her life based on an evaluation of the risk he or she is ready to accept and sustain, and the level of safety he or she wishes to reach, based on highly personal criteria. The risk a person takes in a complex environment, and the impact of other people's actions and of external events on our perception of risk, are elements to be considered. The concept of Holistic Risk Assessment (HRA) aims at developing a methodology and a model that allow us to take into account elements outside the direct influence of the individual and provide a personalized risk assessment. The concept is based on the fact that, in the near future, we will be able to gather and process extremely large amounts of data about an individual and his or her environment in real time. The interaction and correlation of these data are the key element of the holistic risk assessment. In this paper, we present the HRA concept and describe its most important elements and considerations.
Keywords: continuous data, dynamic risk, holistic risk assessment, risk concept
Procedia PDF Downloads 129
23839 A Comparative Analysis of Classification Models with Wrapper-Based Feature Selection for Predicting Student Academic Performance
Authors: Abdullah Al Farwan, Ya Zhang
Abstract:
In today’s educational arena, it is critical to understand educational data and be able to evaluate important aspects of it, particularly data on student achievement. Educational Data Mining (EDM) is a research area that focuses on uncovering patterns and information in data from educational institutions. Teachers who can predict their students' class performance can use this information to improve their teaching. It has evolved into valuable knowledge that can be used for a wide range of objectives, for example in a strategic plan for generating high-quality education. Based on historical data, this paper recommends employing data mining techniques to forecast students' final grades. In this study, five data mining methods, Decision Tree, JRip, Naive Bayes, Multi-layer Perceptron, and Random Forest, with wrapper feature selection, were applied to two datasets relating to Portuguese language and mathematics classes. The results showed the effectiveness of data mining learning methodologies in predicting student academic success. The classification accuracy achieved with the selected algorithms lies in the range of roughly 70-94%: the lowest accuracy is achieved by the Multi-layer Perceptron algorithm, at about 70.45%, and the highest by the Random Forest algorithm, at about 94.10%. This proposed work can assist educational administrators in identifying poorly performing students at an early stage and perhaps implementing motivational interventions to improve their academic success and prevent educational dropout.
Keywords: classification algorithms, decision tree, feature selection, multi-layer perceptron, Naïve Bayes, random forest, students’ academic performance
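Wrapper-based feature selection can be sketched as greedy forward selection around a wrapped classifier: a feature is kept only if adding it raises the classifier's cross-validated accuracy. The toy nearest-centroid model and two-feature dataset below are invented stand-ins for the paper's five algorithms and its Portuguese/mathematics datasets.

```python
# Hedged sketch of a wrapper: leave-one-out accuracy of a nearest-centroid
# classifier drives greedy forward feature selection. All data invented.
def nearest_centroid_acc(rows, labels, feats):
    """Leave-one-out accuracy using only the feature indices in feats."""
    correct = 0
    for i in range(len(rows)):
        cents = {}
        for j, (r, y) in enumerate(zip(rows, labels)):
            if j != i:
                cents.setdefault(y, []).append([r[f] for f in feats])
        best, best_d = None, float("inf")
        for y, pts in cents.items():
            c = [sum(p[k] for p in pts) / len(pts) for k in range(len(feats))]
            d = sum((rows[i][f] - c[k]) ** 2 for k, f in enumerate(feats))
            if d < best_d:
                best, best_d = y, d
        correct += best == labels[i]
    return correct / len(rows)

def forward_select(rows, labels, n_feats):
    chosen, best_acc = [], 0.0
    improved = True
    while improved:
        improved = False
        for f in range(n_feats):
            if f in chosen:
                continue
            acc = nearest_centroid_acc(rows, labels, chosen + [f])
            if acc > best_acc:               # keep f only if accuracy rises
                chosen, best_acc, improved = chosen + [f], acc, True
    return chosen, best_acc

# Feature 0 separates the classes; feature 1 is pure noise.
rows = [[1.0, 5.0], [1.2, 1.0], [0.9, 9.0], [5.0, 5.2], [5.2, 0.8], [4.9, 8.8]]
labels = ["fail", "fail", "fail", "pass", "pass", "pass"]
feats, acc = forward_select(rows, labels, n_feats=2)
```

On this toy data the wrapper keeps the informative feature and rejects the noisy one, which is exactly the behaviour that makes wrappers attractive before training the final classifiers.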
Procedia PDF Downloads 170
23838 A Novel Framework for User-Friendly Ontology-Mediated Access to Relational Databases
Authors: Efthymios Chondrogiannis, Vassiliki Andronikou, Efstathios Karanastasis, Theodora Varvarigou
Abstract:
A large amount of data is typically stored in relational databases (DBs). The latter can efficiently handle user queries intended to elicit the appropriate information from data sources. However, direct access to and use of these data require end users to have an adequate technical background, while they must also cope with the internal data structures and values presented. Consequently, information retrieval is quite a difficult process even for IT or DB experts, given the limited contribution of relational databases from the conceptual point of view. Ontologies enable users to formally describe a domain of knowledge in terms of concepts and the relations among them, and hence they can be used for unambiguously specifying the information captured by a relational database. However, accessing information residing in a database using ontologies is feasible only provided that the users are keen on using semantic web technologies. To enable users from different disciplines to retrieve the appropriate data, the design of a graphical user interface is necessary. In this work, we present an interactive, ontology-based, semantically enabled web tool that can be used for information retrieval purposes. The tool is entirely based on the ontological representation of the underlying database schema, and it provides a user-friendly environment through which users can graphically form and execute their queries.
Keywords: ontologies, relational databases, SPARQL, web interface
Procedia PDF Downloads 274
23837 Anomaly Detection in Financial Markets Using Tucker Decomposition
Authors: Salma Krafessi
Abstract:
Financial markets are a multifaceted, intricate environment, and enormous volumes of data are produced every day. Accurate anomaly identification in these data is essential for finding investment possibilities, possible fraudulent activity, and market oddities. Conventional methods for detecting anomalies frequently fail to capture the complex organization of financial data. In order to improve the identification of abnormalities in financial time series data, this study presents Tucker Decomposition as a reliable multi-way analysis approach. We start by gathering closing prices for the S&P 500 index across a number of decades. The information is converted to a three-dimensional tensor format, which contains internal characteristics and temporal sequences in a sliding-window structure. The tensor is then broken down using Tucker Decomposition into a core tensor and matching factor matrices, allowing latent patterns and relationships in the data to be captured. The reconstruction error from the Tucker Decomposition is a possible sign of abnormalities: by setting a statistical threshold, we are able to identify large deviations that indicate unusual behavior. A thorough examination contrasting the Tucker-based method with traditional anomaly detection approaches validates our methodology. The outcomes demonstrate the superiority of Tucker Decomposition in identifying intricate and subtle abnormalities that are otherwise missed. This work opens the door for more research into multi-way data analysis approaches across a range of disciplines and emphasizes the value of tensor-based methods in financial analysis.
Keywords: tucker decomposition, financial markets, financial engineering, artificial intelligence, decomposition models
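A much-simplified stand-in for the pipeline is sketched below: price windows are stacked into a matrix (a 2-way analogue of the 3-way tensor), a rank-1 approximation is computed by power iteration, and windows with large reconstruction error are flagged. A real implementation would run a Tucker decomposition of the full tensor, typically via a tensor library; the price series here is synthetic.

```python
import math

# Low-rank reconstruction-error anomaly detection: rank-1 power-iteration
# approximation of the window matrix stands in for the paper's Tucker
# decomposition of a 3-way tensor. Data and threshold choice illustrative.
def rank1_errors(windows, iters=100):
    m, n = len(windows), len(windows[0])
    v = [1.0] * n
    for _ in range(iters):                 # power iteration on W^T W
        u = [sum(w[j] * v[j] for j in range(n)) for w in windows]
        v = [sum(windows[i][j] * u[i] for i in range(m)) for j in range(n)]
        norm = math.sqrt(sum(x * x for x in v))
        v = [x / norm for x in v]
    scores = [sum(w[j] * v[j] for j in range(n)) for w in windows]
    return [math.sqrt(sum((w[j] - s * v[j]) ** 2 for j in range(n)))
            for w, s in zip(windows, scores)]

# Smooth synthetic series with one injected shock around t = 60.
series = [100 + 0.1 * t for t in range(100)]
series[60] += 15                           # the "anomaly"
win = 5
windows = [series[i:i + win] for i in range(len(series) - win + 1)]
errs = rank1_errors(windows)
mean_e = sum(errs) / len(errs)
sd = (sum((e - mean_e) ** 2 for e in errs) / len(errs)) ** 0.5
flagged = [i for i, e in enumerate(errs) if e > mean_e + 3 * sd]
```

Only the windows overlapping the shock exceed the mean-plus-three-sigma threshold, which is the reconstruction-error logic the abstract describes, here in its cheapest possible form.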
Procedia PDF Downloads 71
23836 Photophysics and Torsional Dynamics of Thioflavin T in Deep Eutectic Solvents
Authors: Rajesh Kumar Gautam, Debabrata Seth
Abstract:
Thioflavin T (ThT) plays a key role as an important biologically active fluorescent sensor for amyloid fibrils. ThT-based methods have been developed to detect several types of disease, such as neurodegenerative disorders, Alzheimer's, Parkinson's, and type II diabetes. ThT is used as a fluorescent marker to detect the formation of amyloid fibrils; in their presence, ThT becomes highly fluorescent. ThT undergoes a twisting motion around the C-C bond between its two adjacent benzothiazole and dimethylaniline aromatic rings, which is predominantly affected by the micro-viscosity of the local environment. The present study articulates the photophysics and torsional dynamics of the biologically active molecule ThT in deep eutectic solvents (DESs). DESs are environment-friendly, low-cost, and biodegradable alternatives to ionic liquids. A DES resembles an ionic liquid, but its constituents include hydrogen bond donor and acceptor species in addition to ions. Due to the presence of an H-bonding network, a DES exhibits structural heterogeneity. Herein, we prepared two different DESs by mixing urea with choline chloride and with N,N-diethyl ethanol ammonium chloride at ~340 K. It has been reported that the deep eutectic mixture of choline chloride with urea gives a liquid with a freezing point of 12°C. We experimented with two different concentrations of ThT and observed that at the higher concentration (50 µM) ThT forms aggregates in the DES. The photophysics of ThT as a function of temperature was explored using steady-state and picosecond time-resolved fluorescence emission spectroscopy. From the spectroscopic analysis, we observed that with rising temperature the fluorescence quantum yield and lifetime values of ThT gradually decrease; this is the cumulative effect of thermal quenching and an increase in the torsional rate constant.
The fluorescence quantum yield and fluorescence lifetime values were always higher for DES-II (urea and N,N-diethyl ethanol ammonium chloride) than for DES-I (urea and choline chloride), mainly due to the structural heterogeneity of the medium. This was further confirmed by comparing the activation energy of viscous flow with the activation energy of non-radiative decay. In less viscous media, ThT undergoes a very fast twisting process that leads to deactivation from the photoexcited state, and this torsional motion increases with increasing temperature. We conclude that, besides the bulk viscosity of the medium, its structural heterogeneity plays a crucial role in guiding the photophysics of ThT in DESs. The analysis of the experimental data was carried out in the temperature range 288 ≤ T ≤ 333 K. The aim of the present article is to obtain insight into DESs as media for studying various photophysical processes of the amyloid-fibril-sensing molecule ThT.
Keywords: deep eutectic solvent, photophysics, Thioflavin T, torsional rate constant
Procedia PDF Downloads 164
23835 Crab Shell Waste Chitosan-Based Thin Film for Acoustic Sensor Applications
Authors: Maydariana Ayuningtyas, Bambang Riyanto, Akhiruddin Maddu
Abstract:
Industrial waste from crustacean shells, such as shrimp and crab, has been considered one of the major contributors to environmental pollution. Waste-processing mechanisms that form new, practical substances with added value have been developed. Chitosan, derived from chitin obtained from crab and shrimp shells, performs prodigiously in a broad range of applications. A chitosan composite-based diaphragm is a new inspiration in fiber-optic acoustic sensor development. The elastic modulus, dynamic response, and acoustic-wave sensitivity of a chitosan-based composite film give this organic sound-detecting material great potential. The objective of this research was to develop a chitosan diaphragm for application in a fiber-optic microphone system. The formulation was conducted by blending 5% polyvinyl alcohol (PVA) solution with dissolved chitosan at 0%, 1%, and 2% in a 1:1 ratio, respectively. The composite diaphragms were characterized for morphological and mechanical properties to predict the desired acoustic sensor sensitivity. The composite with 2% chitosan showed optimum performance, with 242.55 µm thickness, 67.9% relative humidity, and 29-76% light transmittance. The Young's modulus of the 2%-chitosan composite material was 4.89×10⁴ N/m², which generated a voltage amplitude of 0.013 V and a sensitivity of 3.28 mV/Pa at 1 kHz. Based on these results, chitosan from crustacean shell waste can be considered a viable alternative material for developing the sensing pad of a fiber-optic acoustic sensor. Further research into chitosan utilisation is proposed for developing a novel optical microphone for anthropogenic noise control in environmental and biodiversity conservation.
Keywords: acoustic sensor, chitosan, composite, crab shell, diaphragm, waste utilisation
Procedia PDF Downloads 260
23834 Analyzing the Relationship between the Spatial Characteristics of Cultural Structure, Activities, and the Tourism Demand
Authors: Deniz Karagöz
Abstract:
This study attempts to comprehend the relationship between the spatial characteristics of cultural structure and activities and the tourism demand in Turkey. The analysis is divided into four parts. The first part consists of a cultural structure and cultural activity (CSCA) index provided by principal component analysis. The analysis determined four distinct dimensions, namely cultural activity/structure, accessing culture, consumption, and cultural management. Exploratory spatial data analysis was employed to determine the spatial patterns of cultural structure and cultural activities in the 81 provinces of Turkey, with Global Moran's I indices used to ascertain the cultural activity and structural clusters. Finally, the relationship between cultural activities/cultural structure and tourism demand was analyzed. The raw data of the study were obtained from official databases: the data on cultural structure and activities were gathered from the Turkish Statistical Institute, and the data on tourism demand were provided by the Republic of Turkey Ministry of Culture and Tourism.
Keywords: cultural activities, cultural structure, spatial characteristics, tourism demand, Turkey
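Global Moran's I, the clustering statistic named above, can be sketched in a few lines; the four "provinces" and their adjacency matrix below are invented for illustration, not the Turkish data.

```python
# Pure-Python Global Moran's I with a binary contiguity matrix.
# Positive I: similar values cluster; negative I: they alternate.
def morans_i(values, weights):
    """weights[i][j] = 1 if units i and j are neighbours, else 0."""
    n = len(values)
    xbar = sum(values) / n
    dev = [v - xbar for v in values]
    w_sum = sum(sum(row) for row in weights)
    num = sum(weights[i][j] * dev[i] * dev[j]
              for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    return (n / w_sum) * num / den

# Four units on a line; high values clustered on the left ...
vals = [10.0, 9.0, 1.0, 2.0]
w = [[0, 1, 0, 0],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [0, 0, 1, 0]]
i_clustered = morans_i(vals, w)
# ... versus the same values alternating along the line.
i_dispersed = morans_i([10.0, 1.0, 9.0, 2.0], w)
```

The clustered arrangement gives a clearly positive I and the alternating one a negative I, which is how the study's provincial clusters would show up.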
Procedia PDF Downloads 563
23833 The Synergistic Effects of Blockchain and AI on Enhancing Data Integrity and Decision-Making Accuracy in Smart Contracts
Authors: Sayor Ajfar Aaron, Sajjat Hossain Abir, Ashif Newaz, Mushfiqur Rahman
Abstract:
Investigating the convergence of blockchain technology and artificial intelligence, this paper examines their synergistic effects on data integrity and decision-making within smart contracts. By implementing AI-driven analytics on blockchain-based platforms, the research identifies improvements in automated contract enforcement and decision accuracy. The paper presents a framework that leverages AI to enhance transparency and trust while blockchain ensures immutable record-keeping, culminating in significantly optimized operational efficiencies in various industries.
Keywords: artificial intelligence, blockchain, data integrity, smart contracts
Procedia PDF Downloads 62
23832 Time-Series Load Data Analysis for User Power Profiling
Authors: Mahdi Daghmhehci Firoozjaei, Minchang Kim, Dima Alhadidi
Abstract:
In this paper, we present a power profiling model for smart grid consumers based on real-time load data acquired from smart meters. It profiles consumers' power consumption behaviour using the dynamic time warping (DTW) clustering algorithm. Because this algorithm is invariant to signal warping, time-disordered load data can be profiled and consumption features extracted. Two load types are defined, and the related load patterns are extracted for classifying consumption behaviour by DTW. The classification methodology is discussed in detail. To evaluate the performance of the method, we analyze the time-series load data measured by a smart meter in a real case. The results verify the effectiveness of the proposed profiling method, with a 90.91% true positive rate for load type clustering in the best case.
Keywords: power profiling, user privacy, dynamic time warping, smart grid
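The DTW dissimilarity at the core of the clustering can be sketched as the classic dynamic program below. It is what gives the method its tolerance to time-shifted load curves; the load series are illustrative, not meter data.

```python
# Minimal dynamic time warping distance. A warping path can align a shifted
# copy of a load curve at zero cost, where Euclidean distance would not.
def dtw(a, b):
    n, m = len(a), len(b)
    inf = float("inf")
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]

morning_peak = [1, 1, 5, 5, 1, 1]               # illustrative hourly loads
shifted_peak = [1, 1, 1, 5, 5, 1]               # same shape, one step later
d_shift = dtw(morning_peak, shifted_peak)       # warping absorbs the shift
d_flat = dtw(morning_peak, [1, 1, 1, 1, 1, 1])  # genuinely different profile
```

Clustering then assigns each consumer's curve to the load-type pattern with the smallest DTW distance.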
Procedia PDF Downloads 158
23831 Evaluation of Dual Polarization Rainfall Estimation Algorithm Applicability in Korea: A Case Study on Biseulsan Radar
Authors: Chulsang Yoo, Gildo Kim
Abstract:
Dual polarization radar provides comprehensive information about rainfall by measuring multiple parameters. In Korea, the JPOLE and CSU-HIDRO algorithms are generally used for rainfall estimation. This study evaluated the local applicability of the JPOLE and CSU-HIDRO algorithms in Korea using rainfall observations collected in August 2014 by the Biseulsan dual polarization radar and KMA AWS gauges. A total of 11,372 pairs of radar-ground rain rate data were classified, according to the thresholds of the synthetic algorithms, into suitable and unsuitable data. Evaluation criteria were then derived by comparing radar rain rates and ground rain rates for the entire, suitable, and unsuitable data, respectively. The results are as follows: (1) The radar rain rate equations including KDP were found to perform better than the other equations for both the JPOLE and CSU-HIDRO algorithms, and the thresholds were found to be adequate for both algorithms' relations involving specific differential phase. (2) The radar rain rate equations based on horizontal reflectivity and differential reflectivity performed poorly compared to the others, and the result did not improve even when only the suitable data were used. Acknowledgments: This work was supported by the Basic Science Research Program through the National Research Foundation of Korea, funded by the Ministry of Education (NRF-2013R1A1A2011012).
Keywords: CSU-HIDRO algorithm, dual polarization radar, JPOLE algorithm, radar rainfall estimation algorithm
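The R(KDP) relations referred to in finding (1) take the power-law form sketched below. The coefficients used here are illustrative placeholders only: JPOLE and CSU-HIDRO each fix their own a and b per radar band, and the paper's values are not reproduced here.

```python
# Sketch of a specific-differential-phase rain-rate relation,
#   R = a * |KDP|**b * sign(KDP),
# with placeholder coefficients (not the operational JPOLE/CSU-HIDRO values).
def rain_rate_kdp(kdp, a=50.7, b=0.85):
    """Rain rate [mm/h] from specific differential phase KDP [deg/km]."""
    sign = 1.0 if kdp >= 0 else -1.0
    return sign * a * abs(kdp) ** b

for kdp in (0.25, 0.5, 1.0, 2.0):
    print(f"KDP = {kdp:4.2f} deg/km  ->  R = {rain_rate_kdp(kdp):6.1f} mm/h")
```

The sign handling matters because measured KDP can be slightly negative in light rain, which is one reason both algorithms gate this relation behind thresholds.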
Procedia PDF Downloads 218
23830 Framework for Socio-Technical Issues in Requirements Engineering for Developing Resilient Machine Vision Systems Using Levels of Automation through the Lifecycle
Authors: Ryan Messina, Mehedi Hasan
Abstract:
This research examines the impact of using data to generate performance requirements for automating visual inspections with machine vision. The focus is on system design and on how projects can smooth the transfer of tacit knowledge into an algorithm. We propose a framework for specifying machine vision systems that utilizes varying levels of automation as contingency planning to reduce data-processing complexity. Using data assists in extracting tacit knowledge from those who can perform the manual tasks, to assist in designing the system; this means that real data from the system is always referenced, minimizing errors between participating parties. We propose three indicators for knowing whether a project is at high risk of failing to meet requirements related to accuracy and reliability. All systems tested achieved better integration into operations after applying the framework.
Keywords: automation, contingency planning, continuous engineering, control theory, machine vision, system requirements, system thinking
Procedia PDF Downloads 210
23829 Wreathed Hornbill (Rhyticeros undulatus) on Mount Ungaran: Are their Habitat Threatened?
Authors: Margareta Rahayuningsih, Nugroho Edi K., Siti Alimah
Abstract:
The Wreathed Hornbill (Rhyticeros undulatus) is one of the hornbill species (family Bucerotidae) found on Mount Ungaran. Preservation and planning of in situ conservation of the Wreathed Hornbill require habitat condition data. The objective of the research was to determine land cover change on Mount Ungaran using satellite image data and GIS. Based on land cover data from 1999-2009, the research showed that the primary forest on Mount Ungaran decreased by almost 50%, while secondary forest, tea and coffee plantations, and settlements increased.
Keywords: GIS, Mount Ungaran, threatened habitat, Wreathed Hornbill (Rhyticeros undulatus)
Procedia PDF Downloads 363
23828 Performance Comparison of ADTree and Naive Bayes Algorithms for Spam Filtering
Authors: Thanh Nguyen, Andrei Doncescu, Pierre Siegel
Abstract:
Classification is an important data mining technique and can be used for data filtering in artificial intelligence. Because classification applies to all kinds of data, it is used in nearly every field of modern life: it helps group items according to the features judged interesting and useful. In this paper, we compare two classification methods, Naïve Bayes and ADTree, used to detect spam e-mail. This choice is motivated by the fact that the Naïve Bayes algorithm is based on probability calculus while the ADTree algorithm is based on a decision tree. The classifiers' parameter settings maximize the true positive rate and minimize the false positive rate. The experimental results present classification accuracy and cost analysis with a view to choosing the optimal classifier for spam detection. We also point out the number of attributes needed to obtain a trade-off between attribute count and classification accuracy.
Keywords: classification, data mining, spam filtering, naive bayes, decision tree
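As a minimal illustration of the Naïve Bayes side of the comparison, the sketch below implements a tiny multinomial Naïve Bayes spam filter with add-one (Laplace) smoothing in pure Python. The toy messages and vocabulary are invented; the paper's experiments use real e-mail corpora and compare against ADTree.

```python
import math
from collections import Counter, defaultdict

def train_nb(docs):
    """Count classes and words from (tokens, label) pairs."""
    class_counts = Counter()
    word_counts = defaultdict(Counter)
    vocab = set()
    for tokens, label in docs:
        class_counts[label] += 1
        word_counts[label].update(tokens)
        vocab.update(tokens)
    return class_counts, word_counts, vocab

def classify_nb(tokens, model):
    """Pick the label with the highest log posterior under add-one smoothing."""
    class_counts, word_counts, vocab = model
    total = sum(class_counts.values())
    best_label, best_score = None, -math.inf
    for label in class_counts:
        score = math.log(class_counts[label] / total)          # log prior
        denom = sum(word_counts[label].values()) + len(vocab)  # smoothed denominator
        for t in tokens:
            score += math.log((word_counts[label][t] + 1) / denom)
        if score > best_score:
            best_label, best_score = label, score
    return best_label

# Invented toy training data (not the paper's corpus).
docs = [
    ("win free prize now".split(), "spam"),
    ("claim your free money".split(), "spam"),
    ("meeting agenda for monday".split(), "ham"),
    ("lunch with the project team".split(), "ham"),
]
model = train_nb(docs)
print(classify_nb("free prize money".split(), model))        # spam-like message
print(classify_nb("project meeting monday".split(), model))  # ham-like message
```

The decision rule is the probability-calculus side the abstract mentions; ADTree would instead learn alternating decision and prediction nodes from the same feature counts.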
Procedia PDF Downloads 414
23827 Mapping of Electrical Energy Consumption Yogyakarta Province in 2014-2025
Authors: Alfi Al Fahreizy
Abstract:
Yogyakarta is one of the provinces in Indonesia that often experiences power outages because of high electrical consumption loads. The authors mapped electrical energy consumption [GWh] for the province of Yogyakarta in 2014-2025 using the LEAP (Long-range Energy Alternatives Planning system) software. This paper uses the BAU (Business As Usual) scenario, in which the projection assumes that electricity consumption will keep growing as it has before. The goal is to estimate electrical energy consumption in the household, industry, business, social, government office building, and street lighting sectors. The inputs are projected population statistics and electricity consumption data [GWh] for 2010, 2011, and 2012 in Yogyakarta province.
Keywords: LEAP, energy consumption, Yogyakarta, BAU
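A BAU projection of this kind reduces to compounding a historical growth rate forward. The sketch below is not LEAP itself, only the arithmetic idea behind a BAU scenario; the base load and growth rate are invented for illustration.

```python
def bau_projection(base_gwh, growth_rate, start_year, end_year):
    """Project annual consumption assuming a constant historical growth rate (BAU)."""
    series = {}
    value = base_gwh
    for year in range(start_year, end_year + 1):
        series[year] = round(value, 1)
        value *= 1 + growth_rate
    return series

# Hypothetical numbers for illustration only (not the paper's data):
proj = bau_projection(base_gwh=2000.0, growth_rate=0.06, start_year=2014, end_year=2025)
print(proj[2014], proj[2025])
```

LEAP layers sector-by-sector demand (household, industry, street lighting, and so on) on top of this compounding logic, each sector with its own drivers.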
Procedia PDF Downloads 601
23826 Research and Application of Multi-Scale Three Dimensional Plant Modeling
Authors: Weiliang Wen, Xinyu Guo, Ying Zhang, Jianjun Du, Boxiang Xiao
Abstract:
Reconstructing and analyzing three-dimensional (3D) models from in situ measured data is important for a number of research topics and applications in plant science, including plant phenotyping, functional-structural plant modeling (FSPM), plant germplasm resource protection, and agricultural technology popularization. Plant modeling spans many scales, from micro to macroscopic: cell, tissue, organ, plant, and canopy. The techniques currently used for data capture, feature analysis, and 3D reconstruction differ considerably between scales. In this context, morphological data acquisition, 3D analysis, and modeling of plants at different scales are introduced systematically, along with the data capture equipment commonly used at each scale. Hot issues and difficulties at each scale are then described. Examples are given, such as micron-scale phenotype quantification and 3D microstructure reconstruction of vascular bundles within maize stalks based on micro-CT scanning; 3D reconstruction of leaf surfaces and feature extraction from point clouds acquired with a 3D handheld scanner; and plant modeling that combines parameter-driven 3D organ templates. Several applications of the resulting 3D models and analyses are also introduced. A 3D maize canopy was constructed and light distribution was simulated within it, which was used to design an ideal plant type. A grape tree model was constructed from 3D digitizing and point cloud data and used to produce scientific content for the 11th International Conference on Grapevine Breeding and Genetics. Using tissue-scale plant models, Google Glass was used to look around visually inside the plant and understand its internal structure.
With the development of information technology, 3D data acquisition and data processing techniques will play a greater role in plant science.
Keywords: plant, three dimensional modeling, multi-scale, plant phenotyping, three dimensional data acquisition
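For point-cloud data such as scanned leaf surfaces, even simple geometric features fall out of the raw coordinates. A minimal sketch, with an invented four-point "leaf" standing in for real scanner output:

```python
def cloud_features(points):
    """Centroid and axis-aligned bounding-box size of a 3D point cloud."""
    n = len(points)
    centroid = tuple(sum(p[i] for p in points) / n for i in range(3))
    size = tuple(max(p[i] for p in points) - min(p[i] for p in points)
                 for i in range(3))
    return centroid, size

# Invented points standing in for a scanned leaf surface (units: cm).
leaf = [(0.0, 0.0, 0.0), (4.0, 0.5, 0.2), (2.0, 6.0, 0.4), (4.0, 6.0, 0.1)]
centroid, size = cloud_features(leaf)
print(centroid, size)
```

Real pipelines (e.g. for leaf-surface reconstruction) would mesh or fit a surface to thousands of such points before extracting phenotypic traits.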
Procedia PDF Downloads 279
23825 Enhancement of Primary User Detection in Cognitive Radio by Scattering Transform
Authors: A. Moawad, K. C. Yao, A. Mansour, R. Gautier
Abstract:
Detecting an occupied frequency band is a major issue in cognitive radio systems. Detection becomes difficult when the signal occupying the band of interest has faded amplitude due to multipath effects, which make an occupying user hard to detect. This work mitigates the missed-detection problem for cognitive radio in a frequency-selective fading channel by proposing a blind channel estimation method based on the scattering transform. Conventional energy detection is applied first; if the missed-detection probability is greater than or equal to 50%, channel estimation is applied to the received signal, followed by channel equalization to reduce the channel effects. In the proposed channel estimator, we modify the Morlet wavelet by using its first derivative for better frequency resolution; a mathematical description of the modified function and its frequency resolution is formulated in this work. The improved frequency resolution is required to follow the spectral variation of the channel. The channel estimation error is evaluated in the mean-square sense for different channel settings, and energy detection is applied to the equalized received signal. Simulation results show an improvement in the missed-detection probability compared to detection based on principal component analysis. This improvement comes at the expense of increased estimator complexity, which depends on the number of wavelet filters relative to the channel taps. The detection probability also improves over principal-component-analysis-based energy detection in low signal-to-noise scenarios.
Keywords: channel estimation, cognitive radio, scattering transform, spectrum sensing
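The conventional energy detection step that the proposed method starts from can be sketched as follows. The threshold and the synthetic noise/signal samples are illustrative, not the paper's settings; the scattering-transform channel estimator itself is far more involved.

```python
import random

def energy_detect(samples, threshold):
    """Declare the band occupied when the average sample energy exceeds the threshold."""
    energy = sum(s * s for s in samples) / len(samples)
    return energy > threshold

random.seed(0)
noise = [random.gauss(0, 1) for _ in range(4096)]   # noise-only band, unit variance
signal = [n + 2.0 for n in noise]                   # same band with a strong component added
# Threshold set a few dB above the unit noise floor (illustrative choice).
print(energy_detect(noise, threshold=2.0), energy_detect(signal, threshold=2.0))
```

Fading shrinks the received signal's energy toward the noise floor, which is exactly when this detector misses; equalizing the channel first restores the energy gap the detector relies on.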
Procedia PDF Downloads 199
23824 An Extensive Review of Drought Indices
Authors: Shamsulhaq Amin
Abstract:
Drought can arise from several hydrometeorological phenomena that result in insufficient precipitation, soil moisture, and surface and groundwater flow, leading to conditions considerably drier than the usual water content or availability. Drought is often assessed using indices associated with meteorological, agricultural, and hydrological phenomena. To handle drought disasters effectively, it is essential to accurately determine the kind, intensity, and extent of the drought through drought characterization; this information is critical for managing the drought before, during, and after the rehabilitation process. Over a hundred drought indices have been proposed in the literature, encompassing a range of factors and variables. Some rely solely on hydrometeorological drivers, others employ remote sensing, and some combine both. Comprehending the whole notion of drought, and understanding drought indices along with their calculation procedures, is crucial for researchers in this discipline; examining the many drought metrics scattered across studies requires considerable time and concentration. Hence, a thorough examination of the approaches used in drought indices is needed to identify the most straightforward approach and to avoid discrepancies across scientific studies. For practical, real-world application, categorizing indices by their use in meteorological, agricultural, and hydrological settings can help researchers work more efficiently: users can explore different indices side by side, compare their ease of use, and weigh the benefits and drawbacks of each. Moreover, certain indices are interdependent, and understanding these connections helps in judging their suitability for various scenarios.
This study provides a comprehensive assessment of various drought indices, analysing their types and computation methodologies in a detailed and systematic manner.
Keywords: drought classification, drought severity, drought indices, agriculture, hydrological
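Many meteorological indices reduce to standardizing current precipitation against a historical record. The sketch below is a crude SPI-like z-score with category thresholds loosely following common SPI conventions; the rainfall record is invented, and a real SPI fits a gamma distribution rather than assuming normality.

```python
import statistics

def standardized_index(record, current):
    """Z-score of current precipitation against the historical record (SPI-like)."""
    mu = statistics.mean(record)
    sigma = statistics.stdev(record)
    return (current - mu) / sigma

def classify(z):
    """Illustrative category thresholds, loosely following common SPI bands."""
    if z <= -2.0:
        return "extreme drought"
    if z <= -1.5:
        return "severe drought"
    if z <= -1.0:
        return "moderate drought"
    return "near normal or wet"

# Hypothetical seasonal rainfall record (mm), for illustration only.
history = [80, 95, 110, 100, 90, 105, 85, 120, 100, 95]
z = standardized_index(history, current=70)
print(round(z, 2), classify(z))
```

Agricultural and hydrological indices follow the same standardize-and-classify pattern but substitute soil moisture or streamflow for precipitation.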
Procedia PDF Downloads 48
23823 Principal Component Analysis in Drug-Excipient Interactions
Authors: Farzad Khajavi
Abstract:
Studies of the interaction between active pharmaceutical ingredients (APIs) and excipients are important in the pre-formulation stage of development of all dosage forms. Analytical techniques such as differential scanning calorimetry (DSC), thermal gravimetry (TG), and Fourier transform infrared spectroscopy (FTIR) are commonly used to investigate the compatibility or incompatibility of APIs with excipients. Interpreting the data from these techniques can be difficult because the API spectrum overlaps severely with those of the excipients in their mixtures. Principal component analysis (PCA), a powerful factor-analytical method, is used in these situations to resolve the data matrices acquired from these techniques. Binary mixtures of the API and the excipients of interest are prepared; the peaks of the FTIR, DSC, or TG curves of the pure API, the pure excipient, and their mixtures at different mole ratios form the rows of the data matrix. Applying PCA to the data matrix determines the number of principal components (PCs) that captures the total variance of the data. The scores are then plotted in two-dimensional space: if the pure API, its mixture with a high proportion of API, and the 1:1 mixture form one cluster, while the pure excipient and its blend with a high proportion of excipient form another, compatibility between the API and the excipient is confirmed. Otherwise, incompatibility dominates the API-excipient mixture.
Keywords: API, compatibility, DSC, TG, interactions
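The clustering logic described above can be sketched with a pure-Python PCA (first component via power iteration) on an invented "spectra" matrix; a real study would use measured DSC/TG/FTIR peaks and typically plot two components. Here rows 0-2 stand in for API-rich samples and rows 3-4 for excipient-rich samples; all values are made up.

```python
def first_pc(X, iters=200):
    """First principal component of mean-centered data X via power iteration."""
    n, d = len(X), len(X[0])
    means = [sum(row[j] for row in X) / n for j in range(d)]
    Xc = [[row[j] - means[j] for j in range(d)] for row in X]
    # covariance matrix (d x d)
    C = [[sum(Xc[i][a] * Xc[i][b] for i in range(n)) / (n - 1)
          for b in range(d)] for a in range(d)]
    v = [1.0] * d
    for _ in range(iters):
        w = [sum(C[a][b] * v[b] for b in range(d)) for a in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    # scores: projection of each centered sample onto the component
    scores = [sum(Xc[i][j] * v[j] for j in range(d)) for i in range(n)]
    return v, scores

# Toy "spectra": rows = samples, columns = intensities at a few wavenumbers.
X = [
    [9.0, 1.0, 8.5, 1.2],   # API-rich
    [8.7, 1.3, 8.2, 1.0],   # API-rich
    [8.9, 1.1, 8.4, 1.1],   # API-rich
    [1.0, 9.2, 1.3, 8.8],   # excipient-rich
    [1.2, 9.0, 1.1, 9.1],   # excipient-rich
]
v, scores = first_pc(X)
print([round(s, 1) for s in scores])
```

The API-rich samples land on one side of the component and the excipient-rich samples on the other, which is the two-cluster picture the abstract uses as its compatibility criterion.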
Procedia PDF Downloads 134
23822 Activity Data Analysis for Status Classification Using Fitness Trackers
Authors: Rock-Hyun Choi, Won-Seok Kang, Chang-Sik Son
Abstract:
Physical activity is important for healthy living. Wearable devices that motivate physical activity are developing quickly and becoming cheaper and more comfortable. Fitness trackers in particular provide a variety of information and need to deliver well-analyzed, user-friendly results. In this study, frequency analysis was performed to classify various Fitbit data sets into simple activity statuses. The data, from the Fitbit cloud server, cover 263 healthy factory and office workers in Korea from March 7 to April 30, 2016. The results suggest that the assumptions underlying the activity-state classification are sufficient and reasonable.
Keywords: activity status, fitness tracker, heart rate, steps
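The paper classifies activity via frequency analysis; as a simpler stand-in, a threshold rule on per-minute steps and heart rate already yields coarse activity statuses. The cutoffs below are illustrative assumptions, not Fitbit's or the authors'.

```python
def activity_status(steps_per_min, heart_rate):
    """Rough activity-state label from per-minute steps and heart rate (illustrative cutoffs)."""
    if steps_per_min >= 100 or heart_rate >= 120:
        return "active"
    if steps_per_min >= 20:
        return "walking"
    if heart_rate >= 60:
        return "sedentary"
    return "resting"

# Invented (steps/min, heart rate) samples for four minutes of wear time.
minutes = [(0, 55), (5, 72), (30, 95), (115, 140)]
print([activity_status(s, hr) for s, hr in minutes])
```

A frequency-analysis approach would instead look at how often each (steps, heart-rate) bin occurs over the day before assigning states.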
Procedia PDF Downloads 385
23821 A Crowdsourced Homeless Data Collection System and Its Econometric Analysis: Strengthening Inclusive Public Administration Policies
Authors: Praniil Nagaraj
Abstract:
This paper proposes a method to collect homeless data using crowdsourcing and presents an approach to analyzing the data, demonstrating its potential to strengthen existing and future policies aimed at promoting socio-economic equilibrium. The paper's contributions fall into three main areas. First, a unique method for collecting homeless data is introduced, using a user-friendly smartphone app (currently available for Android). The app enables the general public to quickly record information about homeless individuals, including the number of people and details about their living conditions. The collected data, including date, time, and location, are anonymized and securely transmitted to the cloud. It is anticipated that a growing number of users motivated to contribute to society will adopt the app, expanding the data collection effort. Duplicate data are handled through simple classification methods, and historical data are used to fill in missing information. The second contribution is a description of the data analysis techniques applied to the collected data. Combining the new data with existing information, statistical regression analysis is employed to gain insights such as distinguishing unsheltered from sheltered homeless populations and examining their correlation with factors like unemployment rates, housing affordability, and labor demand. Initial data are collected in San Francisco, while pre-existing information is drawn from three cities (San Francisco, New York City, and Washington D.C.) to run simulations. The third contribution demonstrates the practical implications of the data processing results, taking into account the challenges faced by key stakeholders, including charitable organizations and local city governments. Two case studies are presented.
The first case study explores improving the efficiency of the distribution of food, necessities, and medical assistance by charitable organizations. The second examines the correlation between local governments' micro-geographic budget expenditures and homeless information to justify budget allocation and expenditures. The ultimate objective is to enable continuous improvement in the quality of life of the underprivileged. It is hoped that, through increased crowdsourcing of data from the public, the Generosity Curve and the Need Curve will intersect, leading to a better world for all.
Keywords: crowdsourcing, homelessness, socio-economic policies, statistical analysis
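The regression component can be sketched as a one-predictor least-squares fit; the city-level points below are invented for illustration, not the paper's data.

```python
def ols(xs, ys):
    """Slope and intercept of a one-predictor least-squares fit (normal equations)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    return slope, my - slope * mx

# Invented points: (unemployment rate %, unsheltered count per 10k residents).
data = [(3.0, 8), (4.5, 12), (6.0, 17), (7.5, 20), (9.0, 26)]
slope, intercept = ols([d[0] for d in data], [d[1] for d in data])
print(round(slope, 2), round(intercept, 2))
```

A multi-predictor version (adding housing affordability and labor demand) follows the same normal-equations idea with a matrix solve.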
Procedia PDF Downloads 50
23820 Magnetoresistance Transition from Negative to Positive in Functionalization of Carbon Nanotube and Composite with Polyaniline
Authors: Krishna Prasad Maity, Narendra Tanty, Ananya Patra, V. Prasad
Abstract:
Carbon nanotubes (CNTs) are well known for excellent electrical and thermal conductivity and high tensile strength, and are therefore widely used in fields such as nanotechnology, electronics, and optics. Over the last two decades, polyaniline (PANI) combined with CNTs and functionalized CNTs (fCNTs) has been a promising material for gas sensing, electromagnetic shielding, capacitor electrodes, and more. Studying the electrical conductivity of PANI/CNT and PANI/fCNT composites is therefore important for understanding charge transport and the interaction between PANI and CNTs. We observe a transition in magnetoresistance (MR) with lowering temperature, increasing magnetic field, and decreasing CNT percentage in the CNT/PANI composite. Functionalization prevents nanotube aggregation and improves interfacial interaction, dispersion, and stability in the polymer matrix; however, it shortens the tubes, breaks C-C sp² bonds, and enhances disorder by creating defects on the side walls. We studied electrical resistivity and MR in PANI composites with CNT and fCNT at different weight percentages, down to 4.2 K and up to a magnetic field of 5 T. Resistivity increases significantly at low temperature in the fCNT composites compared to CNT alone. Interestingly, a transition from negative to positive magnetoresistance is observed when the filler changes from pure CNT to functionalized CNT beyond a certain loading (10 wt%), an effect of the greater disorder in the fCNT/PANI composite. The MR transition is explained by the polaron-bipolaron model: the long-range Coulomb interaction between two polarons, screened by disorder in the fCNT/PANI composite, increases the effective on-site Coulomb repulsion energy for bipolaron formation, which changes the sign of the MR from negative to positive.
Keywords: coulomb interaction, magnetoresistance transition, polyaniline composite, polaron-bipolaron
Procedia PDF Downloads 174
23819 Does Level of Countries Corruption Affect Firms Working Capital Management?
Authors: Ebrahim Mansoori, Datin Joriah Muhammad
Abstract:
Recent studies in finance have focused on the effect of external variables on working capital management. This study investigates the effect of corruption indexes on firms' working capital management. A large data set covering 2005 to 2013 from five ASEAN countries (Malaysia, Indonesia, Singapore, Thailand, and the Philippines) was used to investigate how the level of corruption in these countries affects working capital management. The results of the panel data analysis, including fixed-effect estimations, showed that high national corruption indexes encourage managers to shorten the cash conversion cycle (CCC) and to reduce investment in cash and cash equivalents as corruption indexes rise. Increasing national corruption indexes thus encourage managers to adopt conservative working capital strategies by reducing the net liquidity balance (NLB).
Keywords: ASEAN, corruption indexes, panel data analysis, working capital management
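The fixed-effect (within) estimator mentioned above demeans each firm's variables before pooling. A minimal sketch with an invented two-firm panel, where a rising corruption index coincides with a shorter cash conversion cycle:

```python
def within_slope(panel):
    """Fixed-effects (within) slope: demean x and y within each firm, then pooled OLS."""
    dx, dy = [], []
    for firm_obs in panel.values():
        xs = [x for x, _ in firm_obs]
        ys = [y for _, y in firm_obs]
        mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
        dx.extend(x - mx for x in xs)
        dy.extend(y - my for y in ys)
    return sum(a * b for a, b in zip(dx, dy)) / sum(a * a for a in dx)

# Invented panel: firm -> [(corruption index, cash conversion cycle in days), ...]
panel = {
    "firm_A": [(2.0, 80), (3.0, 74), (4.0, 68)],
    "firm_B": [(5.0, 60), (6.0, 54), (7.0, 48)],
}
print(within_slope(panel))
```

Demeaning removes each firm's time-invariant level (the fixed effect), so the slope reflects only within-firm variation, which is the identification strategy the abstract's fixed-effect estimation relies on.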
Procedia PDF Downloads 438