Search results for: zonal metrics
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 587

437 Geospatial Techniques and VHR Imagery Use for Identification and Classification of Slums in Gujrat City, Pakistan

Authors: Muhammad Ameer Nawaz Akram

Abstract:

The 21st century has revealed that more individuals around the world now live in urban settlements than in rural zones. The growth of numerous cities in emerging and newly developed countries is accompanied by the rise of slums. The precise definition of a slum varies from country to country, but the universal consensus is that slums are dilapidated settlements facing severe poverty and lacking access to sanitation, water, electricity, adequate living conditions, and land tenure. Slum settlements vary in unique patterns within and among countries and cities. The core objective of this study is the spatial identification and classification of slums in Gujrat city, Pakistan, from very high-resolution GeoEye-1 (0.41 m) satellite imagery. Slums were first identified using GPS for sample site identification and ground-truthing; through this process, 425 slums were identified. Object-Oriented Analysis (OOA) was then applied to classify slums on the digital image. Spatial analysis software packages, e.g., ArcGIS 10.3, Erdas Imagine 9.3, and Envi 5.1, were used for data processing and analysis. Results show that OOA provides up to 90% accuracy for the identification of slums. The Jalal Cheema and Allah Ho colonies are severely affected by slum settlements, and the rate of criminal activity there is also higher than in other areas. Slums are expanding in urban areas over time and will become a hazardous problem in the near future. Executive bodies therefore need to formulate effective policies and move towards the amelioration of the city.
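
As a simple illustration of the accuracy-assessment step, the following Python sketch (using scikit-learn; the labels are placeholders, not the study's ground-truth data) compares OOA-classified labels against GPS-surveyed reference points:

```python
# Hedged sketch: overall accuracy of an OOA slum classification against
# GPS-surveyed ground-truth points. Labels below are illustrative only.
from sklearn.metrics import accuracy_score, confusion_matrix

# 1 = slum, 0 = non-slum; in practice these would be sampled from the
# classified raster at the surveyed GPS locations.
ground_truth = [1, 1, 0, 1, 0, 1, 0, 0, 1, 1]
ooa_predicted = [1, 1, 0, 1, 0, 0, 0, 1, 1, 1]

print("Overall accuracy:", accuracy_score(ground_truth, ooa_predicted))
print("Confusion matrix:\n", confusion_matrix(ground_truth, ooa_predicted))
```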

Keywords: slums, GPS, satellite imagery, object oriented analysis, zonal change detection

Procedia PDF Downloads 104
436 Improved Computational Efficiency of Machine Learning Algorithm Based on Evaluation Metrics to Control the Spread of Coronavirus in the UK

Authors: Swathi Ganesan, Nalinda Somasiri, Rebecca Jeyavadhanam, Gayathri Karthick

Abstract:

The COVID-19 crisis presents a substantial and critical hazard to worldwide health. Since the occurrence of the disease in late January 2020 in the UK, the number of people confirmed to have acquired the illness has increased tremendously across the country, and the number of individuals affected is undoubtedly considerably high. The purpose of this research is to develop a predictive machine learning model that can forecast COVID-19 cases within the UK. This study concentrates on the statistical data collected from 31st January 2020 to 31st March 2021 in the United Kingdom. Information on total COVID cases registered, new cases encountered on a daily basis, total deaths registered, and daily deaths due to Coronavirus is collected from the World Health Organisation (WHO). Data preprocessing is carried out to identify any missing values, outliers, or anomalies in the dataset. The data is split in an 8:2 ratio for training and testing purposes to forecast future new COVID cases. Support Vector Machines (SVM), Random Forests, and linear regression algorithms are chosen to study the model performance in the prediction of new COVID-19 cases. The statistical performance of each model in predicting new COVID cases is evaluated using metrics such as the r-squared value and the mean squared error. Random Forest outperformed the other two machine learning algorithms with a training accuracy of 99.47% and a testing accuracy of 98.26% when n=30. The mean squared error obtained for Random Forest is 4.05e11, which is lower than that of the other predictive models used in this study. The experimental analysis shows that the Random Forest algorithm performs more effectively and efficiently in predicting new COVID cases, which could help the health sector take relevant measures to control the spread of the virus.
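
A minimal sketch of the kind of pipeline described, an 80/20 split and a Random Forest regressor scored with R² and mean squared error, is shown below; the features and targets are synthetic placeholders, not the WHO data used in the study:

```python
# Hedged sketch of the described setup: 80/20 split, Random Forest regressor,
# R^2 and MSE as evaluation metrics. Data are placeholders, not the study's.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(0)
X = rng.random((400, 3))                                   # e.g. lagged daily cases/deaths
y = X @ np.array([5.0, 2.0, 1.0]) + rng.normal(0, 0.1, 400)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = RandomForestRegressor(n_estimators=30, random_state=0)  # n = 30, as in the abstract
model.fit(X_train, y_train)

pred = model.predict(X_test)
print("R^2:", r2_score(y_test, pred))
print("MSE:", mean_squared_error(y_test, pred))
```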

Keywords: COVID-19, machine learning, supervised learning, unsupervised learning, linear regression, support vector machine, random forest

Procedia PDF Downloads 87
435 Flood Devastation Assessment Through Mapping in Nigeria-2022 using Geospatial Techniques

Authors: Hafiz Muhammad Tayyab Bhatti, Munazza Usmani

Abstract:

Floods are one of nature's most destructive occurrences, causing immense damage to communities and major economic losses. Nigeria, specifically southern Nigeria, is known to be prone to flooding. Even though periodic flooding occurs frequently in Nigeria, the floods of 2022 were the worst since those of 2012. Flood vulnerability analysis and mapping are still lacking in this region due to the very limited historical hydrological measurements and surveys on the effects of floods, which makes it difficult to develop and implement efficient flood protection measures. Remote sensing and Geographic Information Systems (GIS) are useful approaches for detecting, determining, and estimating the flood extent and its impacts. In this study, NOAA VIIRS flood water fraction data have been used to extract the flood extent and were afterwards fused with GIS data for zonal statistical analysis. The estimated flooded areas are validated using satellite imagery from the Moderate Resolution Imaging Spectroradiometer (MODIS). The goal is to map and study the flood extent, flood hazards, and their effects on the population, schools, and health facilities in each state of Nigeria. The resulting flood hazard maps clearly show areas with high risk levels and serve as an important reference for planning and implementing future flood mitigation and control strategies. Overall, the study demonstrated the viability of the chosen GIS and remote sensing approaches for detecting at-risk regions, securing local populations, and enhancing disaster response capabilities during natural disasters.
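
The zonal statistical analysis step can be illustrated with a short Python sketch using the rasterstats package; the file paths are hypothetical and this is not the authors' processing chain:

```python
# Hedged sketch of zonal statistics: aggregating a flood-water-fraction
# raster over state polygons. File names are hypothetical placeholders.
from rasterstats import zonal_stats

stats = zonal_stats(
    "nigeria_states.shp",        # state boundary polygons (hypothetical path)
    "viirs_flood_fraction.tif",  # flood water fraction raster (hypothetical path)
    stats=["mean", "max", "count"],
)
for state_stats in stats[:3]:
    print(state_stats)           # e.g. {'mean': 0.12, 'max': 0.87, 'count': 54210}
```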

Keywords: flood hazards, remote sensing, damage assessment, GIS, geospatial analysis

Procedia PDF Downloads 93
434 Detecting Covid-19 Fake News Using Deep Learning Technique

Authors: Anjali A. Prasad

Abstract:

Nowadays, social media plays an important role in spreading misinformation and fake news. This study analyzes the fake news related to the COVID-19 pandemic spread on social media. This paper aims at evaluating and comparing different approaches used to mitigate this issue, including popular deep learning approaches such as CNN, RNN, LSTM, and the BERT algorithm, for classification. To evaluate the models' performance, we use accuracy, precision, recall, and F1-score as the evaluation metrics and, finally, compare which of the four algorithms shows the better result.
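
The evaluation metrics named above can be computed as in the following sketch (scikit-learn; the labels and predictions are placeholders, not outputs of the paper's CNN/RNN/LSTM/BERT models):

```python
# Hedged sketch of the evaluation step: accuracy, precision, recall, and
# F1-score for a fake-news classifier. Predictions are placeholders.
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

y_true = [1, 0, 1, 1, 0, 0, 1, 0]   # 1 = fake, 0 = real
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]   # hypothetical model output

print("accuracy :", accuracy_score(y_true, y_pred))
print("precision:", precision_score(y_true, y_pred))
print("recall   :", recall_score(y_true, y_pred))
print("f1-score :", f1_score(y_true, y_pred))
```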

Keywords: BERT, CNN, LSTM, RNN

Procedia PDF Downloads 174
433 Dynamic Communications Mapping in NoC-Based Heterogeneous MPSoCs

Authors: M. K. Benhaoua, A. K. Singh, A. E. H. Benyamina

Abstract:

In this paper, we propose a heuristic for dynamic communications mapping that considers the placement of communications in order to optimize overall performance. The mapping technique uses a newly proposed algorithm to place communications between the tasks. The proposed placement of communications leads to better optimization of several performance metrics (execution time and energy consumption). Experimental results show that the proposed mapping approach provides significant performance improvements compared to approaches using static routing.
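
A toy greedy sketch of the general idea, placing heavily communicating tasks on nearby tiles of a 2-D mesh NoC so that hop count times traffic volume stays low, is given below; it is only illustrative and is not the heuristic proposed in the paper:

```python
# Illustrative greedy sketch, not the paper's heuristic: place communicating
# task pairs on a 2-D mesh NoC, heaviest communications first, so that
# hop count (Manhattan distance) x traffic volume stays low.
from itertools import product

MESH = 3                                   # 3x3 mesh of tiles
tiles = list(product(range(MESH), range(MESH)))

# (src_task, dst_task, traffic volume) -- hypothetical task graph edges
comms = [("t0", "t1", 90), ("t1", "t2", 60), ("t0", "t3", 30), ("t2", "t3", 10)]

def hops(a, b):
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

placement, free = {}, set(tiles)
for src, dst, vol in sorted(comms, key=lambda c: -c[2]):
    for task in (src, dst):
        if task not in placement:
            # pick the free tile closest to the task's already-placed partners
            partners = [placement[p] for s, d, _ in comms for p in (s, d)
                        if task in (s, d) and p != task and p in placement]
            best = min(free, key=lambda t: sum(hops(t, q) for q in partners) if partners else 0)
            placement[task] = best
            free.remove(best)

cost = sum(vol * hops(placement[s], placement[d]) for s, d, vol in comms)
print(placement, "total hop x volume:", cost)
```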

Keywords: Multi-Processor Systems-on-Chip (MPSoCs), Network-on-Chip (NoC), heterogeneous architectures, dynamic mapping heuristics

Procedia PDF Downloads 503
432 Normalized Compression Distance Based Scene Alteration Analysis of a Video

Authors: Lakshay Kharbanda, Aabhas Chauhan

Abstract:

In this paper, an application of Normalized Compression Distance (NCD) to detect notable scene alterations occurring in videos is presented. Several research groups have been developing methods to perform image classification using NCD, a computable approximation to the Normalized Information Distance (NID), by studying the degree of similarity between images. The timeframes where significant aberrations between the frames of a video have occurred are identified by obtaining a threshold NCD value, using two compressors, LZMA and BZIP2, and by defining scene alterations using Pixel Difference Percentage metrics.
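
The NCD itself is straightforward to compute; a minimal sketch using Python's built-in lzma and bz2 modules (the frame bytes are placeholders, not the paper's data) follows:

```python
# Hedged sketch of the Normalized Compression Distance between two frames,
# NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)),
# with the two compressors mentioned in the abstract: LZMA and BZIP2.
import bz2, lzma

def ncd(x: bytes, y: bytes, compress=lzma.compress) -> float:
    cx, cy, cxy = len(compress(x)), len(compress(y)), len(compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)

frame_a = b"\x10\x11\x12" * 1000   # placeholder frame bytes
frame_b = b"\x10\x11\x13" * 1000   # slightly altered frame

print("NCD (LZMA) :", ncd(frame_a, frame_b, lzma.compress))
print("NCD (BZIP2):", ncd(frame_a, frame_b, bz2.compress))
```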

Keywords: image compression, Kolmogorov complexity, normalized compression distance, root mean square error

Procedia PDF Downloads 303
431 Active Features Determination: A Unified Framework

Authors: Meenal Badki

Abstract:

We address the issue of active feature determination, where the objective is to determine the set of examples on which additional data (such as lab tests) needs to be gathered, given a large number of examples with some features (such as demographics) and some examples with all the features (such as the complete Electronic Health Record). We note that certain features may be more costly, unique, or laborious to gather. Our proposal is a general active learning approach that is independent of classifiers and similarity metrics. It allows us to identify examples that differ from the full data set and obtain all the features for the examples that match. Our comprehensive evaluation shows the efficacy of this approach, which is driven by four authentic clinical tasks.
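
One simple way to realize the idea of identifying examples that differ from the fully observed data, sketched here purely for illustration and not as the paper's method, is to rank partially observed examples by their distance, in the shared-feature space, from the fully observed pool:

```python
# Illustrative sketch (not the paper's approach): prioritize full-feature
# acquisition (e.g. lab tests) for the partially observed examples farthest
# from the pool of fully observed examples in the shared-feature space.
import numpy as np

rng = np.random.default_rng(1)
fully_observed = rng.random((50, 4))    # examples with all features (shared columns shown)
partial = rng.random((200, 4))          # examples with only the shared features

# distance from each partial example to its nearest fully observed example
dists = np.linalg.norm(partial[:, None, :] - fully_observed[None, :, :], axis=2).min(axis=1)

budget = 10                              # how many acquisitions we can afford
to_acquire = np.argsort(dists)[-budget:]
print("indices selected for full feature acquisition:", to_acquire)
```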

Keywords: feature determination, classification, active learning, sample-efficiency

Procedia PDF Downloads 30
430 A Fourier Method for Risk Quantification and Allocation of Credit Portfolios

Authors: Xiaoyu Shen, Fang Fang, Chujun Qiu

Abstract:

Herewith we present a Fourier method for credit risk quantification and allocation in the factor-copula model framework. The key insight is that, compared to directly computing the cumulative distribution function of the portfolio loss via Monte Carlo simulation, it is, in fact, more efficient to calculate the transform of the distribution function in the Fourier domain instead; inverting back to the real domain can then be done in just one step and semi-analytically, thanks to the popular COS method (with some adjustments). We also show that the Euler risk allocation problem can be solved in the same way, since it can be transformed into the problem of evaluating a conditional cumulative distribution function. Once the conditional or unconditional cumulative distribution function is known, one can easily calculate various risk metrics. The proposed method not only fills a niche in the literature, to the best of our knowledge, of accurate numerical methods for risk allocation but may also serve as a much faster alternative to Monte Carlo simulation for risk quantification in general. It can cope with various factor-copula model choices, which we demonstrate via examples of a two-factor Gaussian copula and a two-factor Gaussian-t hybrid copula. The fast error convergence is proved mathematically and then verified by numerical experiments, in which Value-at-Risk, Expected Shortfall, and conditional Expected Shortfall are taken as examples of commonly used risk metrics. The calculation speed and accuracy are tested to be significantly superior to those of Monte Carlo simulation for real-sized portfolios. The computational complexity is, by design, primarily driven by the number of factors instead of the number of obligors, as in the case of Monte Carlo simulation. The limitation of this method lies in the "curse of dimensionality" that is intrinsic to multi-dimensional numerical integration, which, however, can be relaxed with the help of dimension reduction techniques and/or parallel computing, as we will demonstrate in a separate paper. The potential application of this method has a wide range: from credit derivatives pricing to economic capital calculation of the banking book, default risk charge and incremental risk charge computation of the trading book, and even to other risk types than credit risk.
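
The risk metrics named above are simple to read off once a loss distribution is available; the sketch below computes Value-at-Risk and Expected Shortfall from a simulated loss sample as a plain Monte Carlo baseline, not via the authors' COS/Fourier method:

```python
# Hedged sketch of the risk metrics mentioned in the abstract, Value-at-Risk
# and Expected Shortfall, computed from a simulated loss sample. This is the
# Monte Carlo baseline, not the paper's COS/Fourier approach.
import numpy as np

rng = np.random.default_rng(42)
losses = rng.lognormal(mean=0.0, sigma=1.0, size=100_000)   # placeholder loss sample

alpha = 0.99
var = np.quantile(losses, alpha)                  # Value-at-Risk at the 99% level
es = losses[losses >= var].mean()                 # Expected Shortfall beyond VaR

print(f"VaR(99%) = {var:.3f}, ES(99%) = {es:.3f}")
```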

Keywords: credit portfolio, risk allocation, factor copula model, the COS method, Fourier method

Procedia PDF Downloads 118
429 Sea of Light: A Game-Based Approach for Evidence-Centered Assessment of Collaborative Problem Solving

Authors: Svenja Pieritz, Jakab Pilaszanovich

Abstract:

Collaborative Problem Solving (CPS) is recognized as one of the most important skills of the 21st century, with a potential impact on education, job selection, and collaborative systems design. Therefore, CPS has been adopted in several standardized tests, including the Programme for International Student Assessment (PISA) in 2015. A significant challenge in evaluating CPS is the underlying interplay of cognitive and social skills, which requires a more holistic assessment. However, the majority of existing tests use a questionnaire-based assessment, which oversimplifies this interplay and undermines ecological validity. Two major difficulties were identified: firstly, the creation of a controllable, real-time environment allowing natural behaviors and communication between at least two people; secondly, the development of an appropriate method to collect and synthesize both cognitive and social metrics of collaboration. This paper proposes a more holistic and automated approach to the assessment of CPS. To address these two difficulties, a multiplayer problem-solving game called Sea of Light was developed: an environment allowing students to deploy a variety of measurable collaborative strategies. This controlled environment enables researchers to monitor behavior through the analysis of game actions and chat. The corresponding statistical model is a combined approach of Natural Language Processing (NLP) and Bayesian network analysis. Social exchanges via the in-game chat are analyzed through NLP and fed into the Bayesian network along with other game actions. This Bayesian network synthesizes evidence to track and update different subdimensions of CPS. Major findings focus on the correlations between the evidence collected through in-game actions, the participants' chat features, and the CPS self-evaluation metrics. These results give an indication of which game mechanics can best describe CPS evaluation. Overall, Sea of Light gives test administrators control over different problem-solving scenarios and difficulties while keeping the student engaged. It enables a more complete assessment based on complex, socio-cognitive information on actions and communication. This tool permits further investigation of the effects of group constellations and personality in collaborative problem-solving.

Keywords: bayesian network, collaborative problem solving, game-based assessment, natural language processing

Procedia PDF Downloads 106
428 Data Model to Predict Customized Skin Care Product Using Biosensor

Authors: Ashi Gautam, Isha Shukla, Akhil Seghal

Abstract:

Biosensors are analytical devices that use a biological sensing element to detect and measure a specific chemical substance or biomolecule in a sample. These devices are widely used in various fields, including medical diagnostics, environmental monitoring, and food analysis, due to their high specificity, sensitivity, and selectivity. In this research paper, a machine learning model is proposed for predicting the suitability of skin care products based on biosensor readings. The proposed model takes in features extracted from biosensor readings, such as biomarker concentration, skin hydration level, inflammation presence, sensitivity, and free radicals, and outputs the most appropriate skin care product for an individual. This model is trained on a dataset of biosensor readings and corresponding skin care product information. The model's performance is evaluated using several metrics, including accuracy, precision, recall, and F1 score. The aim of this research is to develop a personalised skin care product recommendation system using biosensor data. By leveraging the power of machine learning, the proposed model can accurately predict the most suitable skin care product for an individual based on their biosensor readings. This is particularly useful in the skin care industry, where personalised recommendations can lead to better outcomes for consumers. The developed model is based on supervised learning, which means that it is trained on a labeled dataset of biosensor readings and corresponding skin care product information. The model uses these labeled data to learn patterns and relationships between the biosensor readings and skin care products. Once trained, the model can predict the most suitable skin care product for an individual based on their biosensor readings. The results of this study show that the proposed machine learning model can accurately predict the most appropriate skin care product for an individual based on their biosensor readings. The evaluation metrics used in this study demonstrate the effectiveness of the model in predicting skin care products. This model has significant potential for practical use in the skin care industry for personalised skin care product recommendations. The proposed machine learning model for predicting the suitability of skin care products based on biosensor readings is a promising development in the skin care industry. The model's ability to accurately predict the most appropriate skin care product for an individual based on their biosensor readings can lead to better outcomes for consumers. Further research can be done to improve the model's accuracy and effectiveness.

Keywords: biosensors, data model, machine learning, skin care

Procedia PDF Downloads 47
427 Evaluating Daylight Performance in an Office Environment in Malaysia, Using Venetian Blind Systems

Authors: Fatemeh Deldarabdolmaleki, Mohamad Fakri Zaky Bin Ja'afar

Abstract:

This paper presents a fenestration analysis to study the balance between utilizing daylight and eliminating disturbing parameters in a private office room with interior venetian blinds, taking into account different slat angles. The mean luminance of the scene and of the window, the luminance ratio of the workplane and the window, the workplane illuminance, and the daylight glare probability (DGP) were calculated as a function of venetian blind design properties. Recently developed software for analyzing High Dynamic Range Images (HDRI) captured by a CCD camera, such as the Radiance-based evalglare and hdrscope, helps to investigate luminance-based metrics. An eight-day measurement experiment was conducted to investigate the impact of different venetian blind angles in an office environment under daylight conditions in Serdang, Malaysia. Detailed results for the selected case study showed that artificial lighting is necessary during the morning session for Malaysian buildings with southwest windows regardless of the venetian blind's slat angle. However, in some afternoon conditions, such as the 10° and 40° slat angles, the workplane illuminance level exceeds the maximum illuminance of 2000 lx. Generally, a rising trend in the mean window luminance level is observed during the day. All conditions have less than 10% of pixels exceeding 2000 cd/m² before 1:00 P.M.; however, 40% of the selected hours have more than 10% of the scene pixels above 2000 cd/m² after 1:00 P.M. Surprisingly, in the no-blind condition there is no extreme case of the window/task ratio, whereas extreme cases do occur for the 20°, 30°, 40°, and 50° slat angles. As expected, the mean window luminance level is higher than 2000 cd/m² after 2:00 P.M. for most cases, except for the 60° slat angle condition. Regarding daylight glare probability, no DGP value higher than 0.35 was found in this experiment, owing to the window's direction, the location of the building, and the studied workplane. Specifically, this paper reviews how different blind angles respond to the metrics suggested by previous standards; finally, conclusions and knowledge gaps are summarized and suggested next steps for research are provided. Addressing these gaps is critical for the continued progress of the energy efficiency movement.

Keywords: daylighting, office environment, energy simulation, venetian blind

Procedia PDF Downloads 196
426 A Basic Metric Model: Foundation for an Evidence-Based HRM System

Authors: K. M. Anusha, R. Krishnaveni

Abstract:

More than a decade into the 21st century, the human resources paradigm can be seen evolving with a strategic gene induced into it. A radical shift is descending as the corporate sector calls on its HR teams to become strategic rather than administrative. This shift eventually requires the metrics employed by these HR teams not to be merely operationally reactive but to be aligned with evidence-based strategic thinking. Recognizing the growing need for a prescriptive metric model for effective HR analytics, this study designs a conceptual framework for a basic metric model that can assist IT-HRM professionals in transitioning to a practice of evidence-based decision-making to enhance organizational performance.

Keywords: metric model, evidence based HR, HR analytics, strategic HR practices, IT sector

Procedia PDF Downloads 373
425 Optimizing Residential Housing Renovation Strategies at Territorial Scale: A Data Driven Approach and Insights from the French Context

Authors: Rit M., Girard R., Villot J., Thorel M.

Abstract:

In a scenario of extensive residential housing renovation, stakeholders need models that support decision-making through a deep understanding of the existing building stock and accurate energy demand simulations. To address this need, we have modified an optimization model using open data that enables the study of renovation strategies at both territorial and national scales. This approach provides (1) a definition of a strategy to simplify decision trees from theoretical combinations, (2) input to decision makers on real-world renovation constraints, (3) more reliable identification of energy-saving measures (changes in technology or behaviour), and (4) discrepancies between currently planned and actually achieved strategies. The main contribution of the studies described in this document is the geographic scale: all residential buildings in the areas of interest were modeled and simulated using national data (geometries and attributes). These buildings were then renovated, when necessary, in accordance with the environmental objectives, taking into account the constraints applicable to each territory (number of renovations per year) or at the national level (renovation of thermal deficiencies (Energy Performance Certificates F&G)). This differs from traditional approaches that focus only on a few buildings or archetypes. This model can also be used to analyze the evolution of a building stock as a whole, as it can take into account both the construction of new buildings and their demolition or sale. Using specific case studies of French territories, this paper highlights a significant discrepancy between the strategies currently advocated by decision-makers and those proposed by our optimization model. This discrepancy is particularly evident in critical metrics such as the relationship between the number of renovations per year and achievable climate targets or the financial support currently available to households and the remaining costs. In addition, users are free to seek optimizations for their building stock across a range of different metrics (e.g., financial, energy, environmental, or life cycle analysis). These results are a clear call to re-evaluate existing renovation strategies and take a more nuanced and customized approach. As the climate crisis moves inexorably forward, harnessing the potential of advanced technologies and data-driven methodologies is imperative.
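
For a concrete flavour of such an optimization, the toy mixed-integer sketch below (using the PuLP package) selects dwellings to renovate under an annual-renovation cap and a budget; all data are placeholders and this is not the authors' territorial model:

```python
# Minimal MILP sketch of renovation selection: maximize energy savings subject
# to a cap on renovations per year and a budget. Data are placeholders.
import pulp

dwellings = ["d1", "d2", "d3", "d4"]
energy_saving = {"d1": 12.0, "d2": 7.5, "d3": 9.0, "d4": 4.0}   # MWh/year if renovated
cost = {"d1": 30.0, "d2": 15.0, "d3": 22.0, "d4": 8.0}          # k-euro

max_renovations_per_year = 2
budget = 50.0

prob = pulp.LpProblem("renovation_selection", pulp.LpMaximize)
x = pulp.LpVariable.dicts("renovate", dwellings, cat="Binary")

prob += pulp.lpSum(energy_saving[d] * x[d] for d in dwellings)             # objective
prob += pulp.lpSum(x[d] for d in dwellings) <= max_renovations_per_year    # rate constraint
prob += pulp.lpSum(cost[d] * x[d] for d in dwellings) <= budget            # budget constraint

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print({d: int(x[d].value()) for d in dwellings})
```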

Keywords: residential housing renovation, MILP, energy demand simulations, data-driven methodology

Procedia PDF Downloads 43
424 Urban River As Living Infrastructure: Tidal Flooding And Sea Level Rise In A Working Waterway In Hampton Roads, Virginia

Authors: William Luke Hamel

Abstract:

Existing conceptions of urban flooding caused by tidal fluctuations and sea-level rise have been inadequately conceptualized by metrics of resilience and methods of flow modeling. While a great deal of research has been devoted to the effects of urbanization on pluvial flooding, the kind of tidal flooding experienced by locations like Hampton Roads, Virginia, has not been adequately conceptualized as being a result of human factors such as urbanization and gray infrastructure. Resilience from sea level rise and its associated flooding has been pioneered in the region with the 2015 Norfolk Resilience Plan from 100 Resilient Cities as well as the 2016 Norfolk Vision 2100 plan, which envisions different patterns of land use for the city. Urban resilience still conceptualizes the city as having the ability to maintain an equilibrium in the face of disruptions. This economic and social equilibrium relies on the Elizabeth River, narrowly conceptualized. Intentionally or accidentally, the river was made to be a piece of infrastructure. Its development was meant to serve the docks, shipyards, naval yards, and port infrastructure that gives the region so much of its economic life. Inasmuch as it functions to permit the movement of cargo; the raising and lowering of ships to be repaired, commissioned, or decommissioned; or the provisioning of military vessels, the river as infrastructure is functioning properly. The idea that the infrastructure is malfunctioning when high tides and sea-level rise create flooding is predicated on the idea that the infrastructure is truly a human creation and can be controlled. The natural flooding cycles of an urban river, combined with the action of climate change and sea-level rise, are only abnormal so much as they encroach on the development that first encroached on the river. The urban political ecology of water provides the ability to view the river as an infrastructural extension of urban networks while also calling for its emancipation from stationarity and human control. Understanding the river and city as a hydrosocial territory or as a socio-natural system liberates both actors from the duality of the natural and the social while repositioning river flooding as a normal part of coexistence on a floodplain. This paper argues for the adoption of an urban political ecology lens in the analysis and governance of urban rivers like the Elizabeth River as a departure from the equilibrium-seeking and stability metrics of urban resilience.

Keywords: urban flooding, political ecology, Elizabeth River, Hampton Roads

Procedia PDF Downloads 135
423 Assessing Supply Chain Performance through Data Mining Techniques: A Case of Automotive Industry

Authors: Emin Gundogar, Burak Erkayman, Nusret Sazak

Abstract:

Providing effective management performance throughout the whole supply chain is a critical issue and hard to put into practice. Proper evaluation of integrated data can yield accurate information. Analysing supply chain data through OLAP (On-Line Analytical Processing) technologies can provide a multi-angle, consolidated view of the work. In this study, association rules and classification techniques are applied to measure the supply chain performance metrics of an automotive manufacturer in Turkey. The main criteria and important rules are determined, and a comparison of the results of the algorithms is presented.
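
The association-rule step can be illustrated with a short sketch using the mlxtend package; the event table and column names are invented for illustration and are not the manufacturer's data:

```python
# Hedged sketch of association-rule mining on a one-hot-encoded table of
# supply chain events. Columns and transactions are invented placeholders.
import pandas as pd
from mlxtend.frequent_patterns import apriori, association_rules

events = pd.DataFrame(
    [
        {"late_supplier": 1, "line_stoppage": 1, "expedited_freight": 1},
        {"late_supplier": 1, "line_stoppage": 1, "expedited_freight": 0},
        {"late_supplier": 0, "line_stoppage": 0, "expedited_freight": 0},
        {"late_supplier": 1, "line_stoppage": 0, "expedited_freight": 1},
    ]
).astype(bool)

frequent = apriori(events, min_support=0.5, use_colnames=True)
rules = association_rules(frequent, metric="confidence", min_threshold=0.7)
print(rules[["antecedents", "consequents", "support", "confidence"]])
```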

Keywords: supply chain performance, performance measurement, data mining, automotive

Procedia PDF Downloads 480
422 Using Digital Innovations to Increase Awareness and Intent to Use Depo-Medroxy Progesterone Acetate-Subcutaneous Contraception among Women of Reproductive Age in Nigeria, Uganda, and Malawi

Authors: Oluwaseun Adeleke, Samuel O. Ikani, Fidelis Edet, Anthony Nwala, Mopelola Raji, Simeon Christian Chukwu

Abstract:

Introduction: Digital innovations have been useful in supporting a client’s contraceptive user journey from awareness to method initiation. The concept of contraceptive self-care is being promoted globally as a means of achieving universal access to quality contraceptive care; however, information about this approach is limited. Important determinants of the scale of awareness are the message construct, the choice of information channel, and an understanding of the socio-epidemiological dynamics within the target audience. Significant gains have been made recently in expanding the awareness base of DMPA-SC, a relatively new entrant into the family planning method mix. The cornerstone of this success is a multichannel promotion campaign themed Discover your Power (DYP). The DYP campaign combines content marketing across select social media platforms, chatbots, Cyber-IPC, Interactive Voice Response (IVR), and radio campaigns. Methodology: During implementation, the project monitored predefined metrics of awareness and intent, such as the number of persons reached with the messages, the number of impressions, and meaningful engagement (link clicks). Metrics/indicators are extracted through native insight/analytics tools across the various platforms. The project also enlists community mobilizers (CMs) who go door-to-door, engage WRA to advertise DISC’s online presence, and support them in engaging with the IVR, the digital companion (chatbot), the Facebook page, and the DiscoverYourPower website. Results: The results showed that the digital platforms recorded 242 million impressions and reached 82 million users with key DMPA-SC self-injection messaging in the three countries. As many as 3.4 million persons engaged with (liked, clicked, shared, or reposted) digital posts, an indication of intent. Conclusion: Digital solutions and innovations are gradually becoming the archetype for the advancement of the self-care agenda. Digital innovations can also be used to increase awareness and normalize contraceptive self-care behavior amongst women of reproductive age if they are made an integral part of reproductive health programming.

Keywords: digital transformation, health systems, DMPA-SC, family planning, self-care

Procedia PDF Downloads 50
421 Towards a Goal-Question-Metric Based Approach to Assess Social Sustainability of Software Systems

Authors: Rahma Amri, Narjès Bellamine Ben Saoud

Abstract:

Sustainable development, or sustainability, is one of the most urgent issues in current debate across almost all domains. In particular, the significant way software pervades our lives should place it at the center of sustainability concerns. The social aspects of sustainability have not been well studied in the context of software systems and remain an immature research field that needs more interest from the research community. This paper presents a Goal-Question-Metric based approach to assess the social sustainability of software systems. The approach is based on a generic social sustainability model taken from the social sciences.

Keywords: software assessment approach, social sustainability, goal-question-metric paradigm, software project metrics

Procedia PDF Downloads 365
420 Performance Analysis of Ad-Hoc Network Routing Protocols

Authors: I. Baddari, A. Riahla, M. Mezghich

Abstract:

Today the literature offers many routing algorithms, some of which have been standardized. Two broad classes of routing algorithms are defined: reactive algorithms and proactive algorithms. The aim of this work is to make a comparative study of some routing algorithms. Two comparisons are considered: the first focuses on protocols of the same class, and the second on algorithms of different classes (one reactive, the other proactive). Since they are not based on analytical models, the exact evaluation of some aspects of these protocols is challenging, and simulations have to be done in order to study their performance. Our simulation is performed in NS2 (Network Simulator 2). It yields a classification of the different routing algorithms studied with respect to metrics such as message loss, transmission time, and mobility.

Keywords: ad-hoc network routing protocol, simulation, NS2, delay, packet loss, wideband, mobility

Procedia PDF Downloads 355
419 Application of Machine Learning Techniques in Forest Cover-Type Prediction

Authors: Saba Ebrahimi, Hedieh Ashrafi

Abstract:

Predicting the cover type of forests is a challenge for natural resource managers. In this project, we aim to perform a comprehensive comparative study of two well-known classification methods, support vector machine (SVM) and decision tree (DT). The comparison is first performed among different types of each classifier, and then the best of each classifier will be compared by considering different evaluation metrics. The effect of boosting and bagging for decision trees is also explored. Furthermore, the effect of principal component analysis (PCA) and feature selection is also investigated. During the project, the forest cover-type dataset from the remote sensing and GIS program is used in all computations.

Keywords: classification methods, support vector machine, decision tree, forest cover-type dataset

Procedia PDF Downloads 178
418 On Multiobjective Optimization to Improve the Scalability of Fog Application Deployments Using Fogtorch

Authors: Suleiman Aliyu

Abstract:

Integrating IoT applications with Fog systems presents challenges in optimization due to diverse environments and conflicting objectives. This study explores achieving Pareto optimal deployments for Fog-based IoT systems to address growing QoS demands. We introduce Pareto optimality to balance competing performance metrics. Using the FogTorch optimization framework, we propose a hybrid approach (Backtracking search with branch and bound) for scalable IoT deployments. Our research highlights the advantages of Pareto optimality over single-objective methods and emphasizes the role of FogTorch in this context. Initial results show improvements in IoT deployment cost in Fog systems, promoting resource-efficient strategies.
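
As a reminder of what Pareto optimality means operationally, the tiny sketch below filters a set of candidate deployments down to the non-dominated ones for two objectives to be minimized; the numbers are invented and this is not FogTorch code:

```python
# Illustrative Pareto-front filter (not FogTorch): keep only deployments not
# dominated on both objectives, here cost and latency, both to be minimized.
def pareto_front(points):
    """Return the non-dominated points for a two-objective minimization."""
    front = []
    for p in points:
        dominated = any(q[0] <= p[0] and q[1] <= p[1] and q != p for q in points)
        if not dominated:
            front.append(p)
    return front

deployments = [(10.0, 120.0), (12.0, 90.0), (15.0, 80.0), (11.0, 130.0)]  # (cost, latency)
print(pareto_front(deployments))
```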

Keywords: pareto optimality, fog application deployment, resource allocation, internet of things

Procedia PDF Downloads 42
417 Solving the Pseudo-Geometric Traveling Salesman Problem with the “Onion Husk” Algorithm

Authors: Boris Melnikov, Ye Zhang, Dmitrii Chaikovskii

Abstract:

This study explores the pseudo-geometric version of the extensively researched Traveling Salesman Problem (TSP), proposing a novel generalization of existing algorithms which are traditionally confined to the geometric version. By adapting the "onion husk" method and introducing auxiliary algorithms, this research fills a notable gap in the existing literature. Through computational experiments using randomly generated data, several metrics were analyzed to validate the proposed approach's efficacy. Preliminary results align with expected outcomes, indicating a promising advancement in TSP solutions.

Keywords: optimization problems, traveling salesman problem, heuristic algorithms, “onion husk” algorithm, pseudo-geometric version

Procedia PDF Downloads 169
416 Optimize Data Evaluation Metrics for Fraud Detection Using Machine Learning

Authors: Jennifer Leach, Umashanger Thayasivam

Abstract:

The use of technology has benefited society in more ways than one ever thought possible. Unfortunately, though, as society’s knowledge of technology has advanced, so has its knowledge of ways to use technology to manipulate people. This has led to a simultaneous advancement in the world of fraud. Machine learning techniques can offer a possible solution to help curb this advancement. This research explores how the use of various machine learning techniques can aid in detecting fraudulent activity across two different types of fraudulent data; the accuracy, precision, recall, and F1-score were recorded for each method. Each machine learning model was also tested across five different training and testing splits in order to discover which testing split and technique would lead to the best results.

Keywords: data science, fraud detection, machine learning, supervised learning

Procedia PDF Downloads 157
415 Unsupervised Learning of Spatiotemporally Coherent Metrics

Authors: Ross Goroshin, Joan Bruna, Jonathan Tompson, David Eigen, Yann LeCun

Abstract:

Current state-of-the-art classification and detection algorithms rely on supervised training. In this work, we study unsupervised feature learning in the context of temporally coherent video data. We focus on feature learning from unlabeled video data, using the assumption that adjacent video frames contain semantically similar information. This assumption is exploited to train a convolutional pooling auto-encoder regularized by slowness and sparsity. We establish a connection between slow feature learning and metric learning and show that the trained encoder can be used to define a more temporally and semantically coherent metric.
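
The slowness and sparsity regularization described above can be written down in a few lines; the sketch below (PyTorch, with placeholder encoder outputs and arbitrarily chosen weights) is only an illustration of the loss terms, not the paper's architecture:

```python
# Hedged sketch of a slowness + sparsity regularizer: penalize the L1
# difference between codes of adjacent frames, plus the L1 norm of the codes.
# Encoder outputs and weights below are placeholders.
import torch

def slowness_sparsity_loss(z_t, z_tp1, alpha=1.0, beta=0.1):
    slowness = (z_t - z_tp1).abs().mean()    # adjacent frames -> similar codes
    sparsity = z_t.abs().mean() + z_tp1.abs().mean()
    return alpha * slowness + beta * sparsity

z_t, z_tp1 = torch.randn(8, 64), torch.randn(8, 64)   # placeholder encoder outputs
print(slowness_sparsity_loss(z_t, z_tp1))
```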

Keywords: machine learning, pattern clustering, pooling, classification

Procedia PDF Downloads 420
414 Classification of Red, Green and Blue Values from Face Images Using k-NN Classifier to Predict the Skin or Non-Skin

Authors: Kemal Polat

Abstract:

In this study, whether a pixel is skin or not has been estimated by using RGB values obtained from the camera and a k-nearest neighbor (k-NN) classifier. The dataset used in this study has an unbalanced distribution and a linearly non-separable structure. This problem can also be called a big data problem. The Skin dataset was taken from the UCI machine learning repository. As the classifier, we have used the k-NN method to handle this big data problem. For the k value of the k-NN classifier, we have used 1. To train and test the k-NN classifier, a 50-50% training-testing partition has been used. As the performance metrics, the TP rate, FP rate, precision, recall, F-measure, and AUC values have been used to evaluate the performance of the k-NN classifier. The obtained results are as follows: 0.999, 0.001, 0.999, 0.999, 0.999, and 1.00. As can be seen from these results, the proposed method could be used to predict whether an image pixel is skin or not.
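
A minimal reproduction of that setup, a k-NN classifier with k = 1, a 50/50 split, and the usual metrics, might look like the sketch below; the data are random placeholders rather than the UCI Skin Segmentation dataset:

```python
# Hedged sketch of the described setup: k-NN with k = 1 on RGB values,
# 50/50 train-test split, standard metrics. Data are random placeholders.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report, roc_auc_score

rng = np.random.default_rng(0)
X = rng.integers(0, 256, size=(1000, 3)).astype(float)   # R, G, B values
y = (X[:, 0] > 120).astype(int)                           # toy skin / non-skin label

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.5, random_state=0)
knn = KNeighborsClassifier(n_neighbors=1)
knn.fit(X_train, y_train)

y_pred = knn.predict(X_test)
print(classification_report(y_test, y_pred))
print("AUC:", roc_auc_score(y_test, y_pred))
```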

Keywords: k-NN classifier, skin or non-skin classification, RGB values, classification

Procedia PDF Downloads 219
413 Qualitative Meta-analysis of ICT4D Implementations

Authors: Miftah Hassen Jemal, Solomon Negash

Abstract:

This study undertakes a qualitative meta-analysis of qualitative studies conducted on ICT4D implementations. An interpretive approach to synthesizing the interpretations of the qualitative studies is adopted to guide the whole process of the study. The traditional criteria of trustworthiness of qualitative studies, namely transferability, consistency, and credibility, are used as quality metrics for the output of the interpretive synthesis process. The findings of the study are anticipated to be of value to policymakers by providing guidance for decisions related to ICT4D implementations. The study is also anticipated to contribute to research by extracting valuable insights from the extant literature and identifying potential areas that warrant further investigation.

Keywords: ICT4D implementations, interpretive synthesis, qualitative meta-analysis, qualitative studies

Procedia PDF Downloads 115
412 An Expert System Designed to Be Used with MOEAs for Efficient Portfolio Selection

Authors: Kostas Metaxiotis, Kostas Liagkouras

Abstract:

This study presents an expert system specially designed to be used with Multiobjective Evolutionary Algorithms (MOEAs) for the solution of the portfolio selection problem. The validation of the proposed hybrid system is done using data sets from the Hang Seng 31 in Hong Kong, the DAX 100 in Germany, and the FTSE 100 in the UK. The performance of the proposed system is assessed in comparison with the Non-dominated Sorting Genetic Algorithm II (NSGA-II). The evaluation of performance is based on different performance metrics that evaluate both the proximity of the solutions to the Pareto front and their dispersion on it. The results show that the proposed hybrid system is efficient for the solution of this kind of problem.

Keywords: expert systems, multi-objective optimization, evolutionary algorithms, portfolio selection

Procedia PDF Downloads 407
411 Cellular Traffic Prediction through Multi-Layer Hybrid Network

Authors: Supriya H. S., Chandrakala B. M.

Abstract:

Deep learning based models have recently seen successful adoption for network traffic prediction. However, training a deep learning model for various prediction tasks is considered one of the critical tasks due to various reasons. This research work develops a Multi-Layer Hybrid Network (MLHN) for network traffic prediction and analysis; MLHN comprises three distinctive networks for handling the different inputs for custom feature extraction. Furthermore, an optimized and efficient parameter-tuning algorithm is introduced to enhance parameter learning. MLHN is evaluated on the “Big Data Challenge” dataset using the Mean Absolute Error, Root Mean Square Error, and R² as metrics; furthermore, MLHN’s efficiency is proved through comparison with a state-of-the-art approach.

Keywords: MLHN, network traffic prediction

Procedia PDF Downloads 53
410 Secure Transfer of Medical Images Using Hybrid Encryption

Authors: Boukhatem Mohamed Belkaid, Lahdi Mourad

Abstract:

In this paper, we propose a new encryption system for securing medical images. The hybrid encryption scheme is based on the AES and RSA algorithms to provide three security services: authentication, integrity, and confidentiality. Privacy is ensured by AES, and authenticity is ensured by the RSA algorithm. Integrity is assured by the basic function of the correlation between adjacent pixels. Our system generates a unique password for every new encryption session, which is used to encrypt each frame of the medical image in order to strengthen and ensure its safety. Several metrics have been used for the various tests of our analysis. For the integrity test, we observed the efficiency of our system and how the cryptographic fingerprint changes at reception if a change affects the image in the transmission channel.
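
A generic sketch of such a hybrid pattern, AES for the bulk image data and RSA for protecting the per-session key, using Python's cryptography package, is shown below; it illustrates the general technique, not the authors' system:

```python
# Hedged sketch of a hybrid scheme: AES-GCM for the image frame, RSA-OAEP to
# wrap the fresh session key. Generic illustration, not the paper's system.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# Receiver's RSA key pair (generated here only for the demo)
rsa_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
rsa_public = rsa_private.public_key()

image_bytes = b"\x00\x01\x02" * 1024          # placeholder medical image frame

# 1. Fresh AES session key per encryption session, as in the abstract
session_key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
ciphertext = AESGCM(session_key).encrypt(nonce, image_bytes, None)

# 2. Session key protected with the receiver's RSA public key
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
wrapped_key = rsa_public.encrypt(session_key, oaep)

# Receiver side: unwrap the key, then decrypt the frame
recovered_key = rsa_private.decrypt(wrapped_key, oaep)
plaintext = AESGCM(recovered_key).decrypt(nonce, ciphertext, None)
assert plaintext == image_bytes
```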

Keywords: AES, RSA, integrity, confidentiality, authentication, medical images, encryption, decryption, key, correlation

Procedia PDF Downloads 407
409 Residual Analysis and Ground Motion Prediction Equation Ranking Metrics for Western Balkan Strong Motion Database

Authors: Manuela Villani, Anila Xhahysa, Christopher Brooks, Marco Pagani

Abstract:

The geological structure of the Western Balkans is strongly affected by the collision between the Adria microplate and the southwestern Eurasia margin, resulting in a considerably active seismic region. The Harmonization of Seismic Hazard Maps in the Western Balkan Countries Project (BSHAP) (2007-2011, 2012-2015), supported by NATO, assisted the preparation of new seismic hazard maps of the Western Balkans, but when inspecting the seismic hazard models later produced by these countries on a national scale, significant differences in design PGA values are observed at the borders, for instance, northern Albania-Montenegro, southern Albania-Greece, etc. Considering the fact that the catalogues were unified and seismic sources were defined within the BSHAP framework, the differences obviously arise from the selection of Ground Motion Prediction Equations, which are generally the component with the highest impact on seismic hazard assessment. At the time of the project, a modest database was available, namely 672 three-component records, whereas nowadays this strong motion database has increased considerably, up to 20,939 records with Mw ranging in the interval 3.7-7 and epicentral distances from 0.47 km to 490 km. Statistical analysis of the strong motion database showed a lack of recordings in the moderate-to-large magnitude and short distance ranges; therefore, there is a need to re-evaluate the Ground Motion Prediction Equations in light of the recently updated database and the new generations of GMMs. In some cases, it was observed that some events were more extensively documented in one database than the other, like the 1979 Montenegro earthquake, with a considerably larger number of records in the BSHAP Analogue SM database when compared to ESM23. Therefore, the strong motion flat-file provided by the Harmonization of Seismic Hazard Maps in the Western Balkan Countries Project was merged with the ESM23 database for the polygon studied in this project. After performing the preliminary residual analysis, the candidate GMPEs were identified. This process was done using the GMPE performance metrics available within the SMT in the OpenQuake platform. The Likelihood Model and Euclidean Distance Based Ranking (EDR) were used. Finally, a GMPE logic tree was selected for this study and, following the selection of candidate GMPEs, model weights were assigned using the average sample log-likelihood approach of Scherbaum.
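
The average-sample-log-likelihood ranking mentioned above can be sketched in a few lines; in the sketch below the observed and predicted values are invented placeholders, and the computation is only the generic LLH formula, not the study's actual residual analysis:

```python
# Hedged sketch of the average-sample-log-likelihood (LLH) ranking idea:
# for each GMPE, LLH = -(1/N) * sum(log2 p(obs_i)), where p is the GMPE's
# predicted lognormal density at the observation. Values are placeholders.
import numpy as np
from scipy.stats import norm

ln_observed = np.array([-3.1, -2.7, -3.5, -2.9])          # ln(PGA) of records

gmpes = {
    # name: (predicted mean of ln(PGA) per record, total sigma) -- placeholders
    "GMPE_A": (np.array([-3.0, -2.8, -3.4, -3.0]), 0.6),
    "GMPE_B": (np.array([-2.5, -2.4, -2.9, -2.6]), 0.6),
}

for name, (mu, sigma) in gmpes.items():
    llh = -np.mean(np.log2(norm.pdf(ln_observed, loc=mu, scale=sigma)))
    print(f"{name}: LLH = {llh:.3f}  (lower is better)")
```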

Keywords: residual analysis, GMPE, Western Balkan, strong motion, OpenQuake

Procedia PDF Downloads 44
408 Towards a Conscious Design in AI by Overcoming Dark Patterns

Authors: Ayse Arslan

Abstract:

One of the important elements underpinning a conscious design is the degree of toxicity in communication. This study explores the mechanisms and strategies for identifying toxic content by avoiding dark patterns. Given the breadth of hate and harassment attacks, this study explores a threat model and taxonomy to assist in reasoning about strategies for detection, prevention, mitigation, and recovery. In addition to identifying some relevant techniques such as nudges, automatic detection, or human-ranking, the study suggests the use of major metrics such as the overhead and friction of solutions on platforms and users or balancing false positives (e.g., incorrectly penalizing legitimate users) against false negatives (e.g., users exposed to hate and harassment) to maintain a conscious design towards fairness.

Keywords: AI, ML, algorithms, policy, system design

Procedia PDF Downloads 95