Search results for: cohesion metrics
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 896

446 Social Vulnerability Mapping in New York City to Discuss Current Adaptation Practice

Authors: Diana Reckien

Abstract:

Vulnerability assessments are increasingly used to support policy-making in complex environments such as urban areas. Usually, vulnerability studies include the construction of aggregate (sub-)indices and the subsequent mapping of those indices across an area of interest. Vulnerability studies have several advantages: they are effective communication tools, can inform a wider debate about environmental issues, and can help allocate and efficiently target scarce resources for adaptation policy and planning. However, they also face a number of challenges: vulnerability assessments are constructed on the basis of a wide range of methodologies, and no single framework or methodology has proven to serve best in a given environment; indicators vary greatly with the spatial scale used; different variables and metrics produce different results; and aggregate or composite vulnerability indicators that are mapped easily distort or bias the picture of vulnerability, as they hide the underlying causes of vulnerability and level out conflicting drivers of vulnerability in space. There is therefore an urgent need to develop the methodology of vulnerability studies further towards a common framework, which is one motivation for this paper. We introduce a social vulnerability approach that, compared with other approaches such as bio-physical or sectoral vulnerability studies, is relatively well developed in terms of a common methodology for index construction, guidelines for mapping, assessment of sensitivity, and verification of variables. Two approaches are commonly pursued in the literature. The first is an additive approach, in which all potentially influential variables are weighted according to their importance for the vulnerability aspect and then added to form a composite vulnerability index per unit area.
The second approach includes variable reduction, most commonly Principal Component Analysis (PCA), which reduces a set of interrelated variables to a smaller number of less correlated components that are likewise added to form a composite index. We test these two approaches to constructing indices on the area of New York City, along with two different metrics of input variables, and compare the outcomes for the five boroughs of NY. Our analysis shows that the mapping exercise produces markedly different results in the outer parts of the boroughs, such as outer Queens and Staten Island. However, some of these parts, particularly the coastal areas, receive the highest attention in current adaptation policy. We infer from this that current adaptation policy and practice in NY may need to be discussed, as these outer urban areas show relatively low social vulnerability compared with the more central parts, i.e., the densely populated areas of Manhattan, central Brooklyn, central Queens, and the southern Bronx. The inner urban parts receive less adaptation attention but bear a higher risk of damage should hazards occur there. This is conceivable, for example, during large heat waves, which would affect the inner and poorer parts of the city more than the outer urban areas. In light of recent planning practice in NY, one needs to question and discuss who in NY makes adaptation policy for whom; the presented analyses point towards an underrepresentation of the needs of the socially vulnerable population, such as the poor, the elderly, and ethnic minorities, in current adaptation practice in New York City.
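The additive approach described above can be sketched in a few lines. Everything here is illustrative: the tract names, indicator columns (share in poverty, share elderly, share without a vehicle), and expert weights are invented for demonstration and are not from the study.

```python
# Sketch of the additive approach to a composite vulnerability index:
# normalize each indicator across unit areas, then form a weighted sum.
# All data and weights below are hypothetical.

def min_max_normalize(values):
    """Rescale raw indicator values to [0, 1] across all unit areas."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

def additive_index(indicators, weights):
    """Weighted sum of normalized indicator values for one unit area."""
    return sum(w * x for w, x in zip(weights, indicators))

# Hypothetical census-tract data: % poverty, % elderly, % no vehicle.
tracts = {
    "tract_a": [0.30, 0.18, 0.55],
    "tract_b": [0.10, 0.25, 0.20],
    "tract_c": [0.45, 0.10, 0.60],
}
weights = [0.5, 0.2, 0.3]  # expert-assigned importance, summing to 1

# Normalize each indicator column across tracts, then aggregate per tract.
cols = list(zip(*tracts.values()))
norm_cols = [min_max_normalize(list(c)) for c in cols]
norm_rows = list(zip(*norm_cols))
index = {name: additive_index(row, weights)
         for name, row in zip(tracts, norm_rows)}
```

The PCA variant would replace the weighted sum with component scores from a variable-reduction step, but the normalization and aggregation scaffolding stays the same.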

Keywords: vulnerability mapping, social vulnerability, additive approach, Principal Component Analysis (PCA), New York City, United States, adaptation, social sensitivity

Procedia PDF Downloads 395
445 A Weighted K-Medoids Clustering Algorithm for Effective Stability in Vehicular Ad Hoc Networks

Authors: Rejab Hajlaoui, Tarek Moulahi, Hervé Guyennet

Abstract:

In a highway scenario, vehicle speeds can exceed 120 km/h. Therefore, any vehicle can enter or leave the network within a very short time. This mobility adversely affects network connectivity and decreases the lifetime of all established links. To ensure effective stability in vehicular ad hoc networks with a minimum broadcasting storm, we have developed a weighted algorithm based on the k-medoids clustering algorithm (WKCA). Rather than selecting the number of clusters and the initial cluster heads randomly, as is usual, we choose them with respect to the available transmission range and the size of the environment. Then, to ensure optimal assignment of nodes to clusters in both k-medoids phases, the combined weight of each node is computed from additional metrics including direction, relative speed, and proximity. Empirical results show that, in addition to the convergence speed that characterizes the k-medoids algorithm, our proposed model outperforms both the AODV-Clustering and OLSR-Clustering protocols under different densities and velocities in terms of end-to-end delay, packet delivery ratio, and throughput.
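The combined-weight idea can be illustrated with a small scoring function. The weighting coefficients, transmission range, and node records below are assumptions for demonstration; the paper's actual weight formula is not given in the abstract.

```python
import math

# Illustrative combined weight for assigning a node to a cluster head,
# scored from proximity, relative speed, and heading difference.
# Lower weight = better assignment. All coefficients are assumptions.

def combined_weight(node, head, w_dist=0.5, w_speed=0.3, w_dir=0.2,
                    tx_range=300.0, v_max=120.0):
    dx = node["x"] - head["x"]
    dy = node["y"] - head["y"]
    proximity = math.hypot(dx, dy) / tx_range           # 0..1 inside range
    rel_speed = abs(node["v"] - head["v"]) / v_max      # 0..1
    direction = abs(node["dir"] - head["dir"]) / 180.0  # 0..1 (degrees)
    return w_dist * proximity + w_speed * rel_speed + w_dir * direction

head = {"x": 0.0, "y": 0.0, "v": 100.0, "dir": 0.0}
near_same = {"x": 50.0, "y": 0.0, "v": 105.0, "dir": 5.0}     # same platoon
far_opposite = {"x": 280.0, "y": 0.0, "v": 40.0, "dir": 170.0}  # oncoming

w1 = combined_weight(near_same, head)
w2 = combined_weight(far_opposite, head)
```

A nearby node moving in the same direction at a similar speed scores a much lower (better) weight than a distant oncoming one, which is what keeps cluster membership stable on a highway.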

Keywords: communication, clustering algorithm, k-medoids, sensor, vehicular ad hoc network

Procedia PDF Downloads 238
444 Malaria Parasite Detection Using Deep Learning Methods

Authors: Kaustubh Chakradeo, Michael Delves, Sofya Titarenko

Abstract:

Malaria is a serious disease that affects hundreds of millions of people around the world each year. If not treated in time, it can be fatal. Despite recent developments in malaria diagnostics, microscopy remains the most common method of detection. Unfortunately, the accuracy of microscopic diagnosis depends on the skill of the microscopist, which limits the throughput of malaria diagnosis. With the development of Artificial Intelligence tools, and Deep Learning techniques in particular, it is possible to lower the cost while achieving higher overall accuracy. In this paper, we present a VGG-based model and compare it with previously developed models for identifying infected cells. Our model surpasses most previously developed models across a range of accuracy metrics. The model has the advantage of being constructed from a relatively small number of layers, which reduces the required computational resources and time. Moreover, we test our model on two types of datasets and argue that currently developed deep-learning-based methods cannot efficiently distinguish between infected and contaminated cells; a more precise study of suspicious regions is required.
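The accuracy metrics used to compare such models all derive from the binary confusion matrix. The counts below are invented for illustration and are not results from the paper.

```python
# Standard classification metrics from a binary confusion matrix,
# with "infected" as the positive class. Counts are hypothetical.

def classification_metrics(tp, fp, fn, tn):
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return {"accuracy": accuracy, "precision": precision,
            "recall": recall, "f1": f1}

# e.g. a held-out test set of 2000 blood-smear cell images:
m = classification_metrics(tp=940, fp=30, fn=60, tn=970)
```

Reporting several of these metrics together matters here: a model that confuses contaminated cells with infected ones can keep high accuracy while losing precision.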

Keywords: convolution neural network, deep learning, malaria, thin blood smears

Procedia PDF Downloads 130
443 Metrics and Methods for Improving Resilience in Agribusiness Supply Chains

Authors: Golnar Behzadi, Michael O'Sullivan, Tava Olsen, Abraham Zhang

Abstract:

By definition, increasing supply chain resilience improves the supply chain's ability to return to normal, or to an even more desirable situation, quickly and efficiently after being hit by a disruption. This is especially critical in agribusiness supply chains, where products are perishable and have a short life cycle. In this paper, we propose a resilience metric that captures and improves the recovery process of an agribusiness supply chain, in terms of both performance and time, following either a supply-side or demand-side disruption. We build a model that determines optimal supply chain recovery planning decisions and selects the best resilient strategies to minimize the loss of profit during the recovery time window. The model is formulated as a two-stage stochastic mixed-integer linear programming problem and solved with a branch-and-cut algorithm. The results show that the optimal recovery schedule is highly dependent on the duration of the time window allowed for recovery. In addition, the profit loss during recovery is reduced by utilizing the proposed resilient actions.
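A common way to capture "performance and time" in one resilience number is the ratio of the area under the recovery curve to the area the chain would have delivered without disruption. This is a generic sketch of that idea, not the paper's exact metric; the recovery trajectory and target level are invented.

```python
# Resilience over a recovery window: achieved performance integrated
# over time, divided by the no-disruption (target) performance.
# Data below is hypothetical.

def resilience(performance, target):
    """Ratio of area under the recovery curve to the undisrupted area.

    `performance` lists per-period performance over the recovery window;
    `target` is the normal, pre-disruption level.
    """
    horizon = len(performance)
    return sum(performance) / (target * horizon)

# Weekly throughput of a perishable-goods chain after a supply disruption:
recovery_curve = [40, 55, 70, 85, 100, 100]  # climbs back to normal
target_level = 100

r = resilience(recovery_curve, target_level)
loss = 1.0 - r  # fraction of capacity (and thus profit) lost in the window
```

A shorter allowed window truncates the curve early, which is why the optimal recovery schedule in the paper depends so strongly on the window's duration.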

Keywords: agribusiness supply chain, recovery, resilience metric, risk management

Procedia PDF Downloads 397
442 Modeling User Context Using CEAR Diagram

Authors: Ravindra Dastikop, G. S. Thyagaraju, U. P. Kulkarni

Abstract:

Even though the number of context-aware applications, along with their users, is increasing day by day, there is still no generic programming paradigm for context-aware applications. This situation could be remedied by designing and developing an appropriate context model and programming paradigm for context-aware applications. In this paper, we propose a static context model and metrics for validating the expressiveness and understandability of the model. The proposed context model describes a user's situation using context entities, attributes, and relationships. The model, an extended and hybrid version of the ER model, ontology model, and graphical model, is specifically meant for expressing and understanding the user's situation in a context-aware environment. It is useful for understanding context-aware problems, preparing documentation, and designing programs and databases. The model uses the context entity attributes relationship (CEAR) diagram to represent associations between context entities and attributes. We have identified a new set of graphical notations for improving the expressiveness and understandability of context from the end-user perspective.

Keywords: user context, context entity, context entity attributes, situation, sensors, devices, relationships, actors, expressiveness, understandability

Procedia PDF Downloads 344
441 Building Green Infrastructure Networks Based on Cadastral Parcels Using Network Analysis

Authors: Gon Park

Abstract:

Seoul in South Korea established the 2030 Seoul City Master Plan, which contains green-link projects to connect critical green areas within the city. However, the plan does not include detailed analyses of green infrastructure that incorporate land-cover information into structural classes. This study maps green infrastructure networks of Seoul to complement the city's green plans by identifying and ranking green areas. Hubs and links, the main elements of green infrastructure, were identified by incorporating cadastral data for 967,502 parcels into 135 land-use maps using a geographic information system. Network analyses were used to rank the hubs and links of a green infrastructure map by applying a force-directed algorithm, weighted values, and binary relationships, with metrics of density, distance, and centrality. The results indicate that network analyses using cadastral parcel data can serve as a framework to identify and rank hubs, links, and networks for green infrastructure planning under variable scenarios of green areas in cities.
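Ranking hubs by a centrality metric can be sketched with a tiny weighted graph. The hub names and link strengths below are hypothetical stand-ins for the parcel-derived hubs and links; weighted degree is just one of the centrality measures such an analysis might use.

```python
# Rank green-infrastructure hubs by weighted degree centrality.
# The adjacency list is illustrative; real input would come from
# the parcel-derived hubs and links.

links = [  # (hub, hub, weighted link strength)
    ("park_a", "park_b", 3.0),
    ("park_a", "stream_c", 1.5),
    ("park_b", "stream_c", 2.0),
    ("park_b", "forest_d", 4.0),
]

def weighted_degree(links):
    """Weighted degree centrality: sum of link weights at each hub."""
    degree = {}
    for u, v, w in links:
        degree[u] = degree.get(u, 0.0) + w
        degree[v] = degree.get(v, 0.0) + w
    return degree

ranked = sorted(weighted_degree(links).items(),
                key=lambda kv: kv[1], reverse=True)
```

Here `park_b` ranks first because it anchors the most and strongest links, which is exactly the kind of hub a green-link project would prioritize connecting.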

Keywords: cadastral data, green infrastructure, network analysis, parcel data

Procedia PDF Downloads 205
440 Developing a Machine Learning-based Cost Prediction Model for Construction Projects using Particle Swarm Optimization

Authors: Soheila Sadeghi

Abstract:

Accurate cost prediction is essential for effective project management and decision-making in the construction industry. This study aims to develop a cost prediction model for construction projects using Machine Learning techniques and Particle Swarm Optimization (PSO). The research utilizes a comprehensive dataset containing project cost estimates, actual costs, resource details, and project performance metrics from a road reconstruction project. The methodology involves data preprocessing, feature selection, and the development of an Artificial Neural Network (ANN) model optimized using PSO. The study investigates the impact of various input features, including cost estimates, resource allocation, and project progress, on the accuracy of cost predictions. The performance of the optimized ANN model is evaluated using metrics such as Mean Squared Error (MSE), Root Mean Squared Error (RMSE), Mean Absolute Error (MAE), and R-squared. The results demonstrate the effectiveness of the proposed approach in predicting project costs, outperforming traditional benchmark models. The feature selection process identifies the most influential variables contributing to cost variations, providing valuable insights for project managers. However, this study has several limitations. Firstly, the model's performance may be influenced by the quality and quantity of the dataset used. A larger and more diverse dataset covering different types of construction projects would enhance the model's generalizability. Secondly, the study focuses on a specific optimization technique (PSO) and a single Machine Learning algorithm (ANN). Exploring other optimization methods and comparing the performance of various ML algorithms could provide a more comprehensive understanding of the cost prediction problem. Future research should focus on several key areas. 
Firstly, expanding the dataset to include a wider range of construction projects, such as residential buildings, commercial complexes, and infrastructure projects, would improve the model's applicability. Secondly, investigating the integration of additional data sources, such as economic indicators, weather data, and supplier information, could enhance the predictive power of the model. Thirdly, exploring the potential of ensemble learning techniques, which combine multiple ML algorithms, may further improve cost prediction accuracy. Additionally, developing user-friendly interfaces and tools to facilitate the adoption of the proposed cost prediction model in real-world construction projects would be a valuable contribution to the industry. The findings of this study have significant implications for construction project management, enabling proactive cost estimation, resource allocation, budget planning, and risk assessment, ultimately leading to improved project performance and cost control. This research contributes to the advancement of cost prediction techniques in the construction industry and highlights the potential of Machine Learning and PSO in addressing this critical challenge. However, further research is needed to address the limitations and explore the identified future research directions to fully realize the potential of ML-based cost prediction models in the construction domain.
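The PSO component described above can be sketched with a minimal particle swarm optimizer. To keep the example self-contained, it minimizes a stand-in quadratic error surface rather than the paper's ANN validation MSE, and all hyperparameters (inertia `w`, acceleration coefficients `c1`/`c2`, bounds, seed) are illustrative assumptions.

```python
import random

# Minimal particle swarm optimizer: each particle tracks its personal
# best, the swarm tracks a global best, and velocities blend inertia
# with attraction to both. Objective is a stand-in for a model's MSE.

def pso(objective, dim, n_particles=20, iters=200,
        w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0), seed=42):
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)]
           for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Stand-in error surface with its minimum at (1, 2):
mse_like = lambda p: (p[0] - 1.0) ** 2 + (p[1] - 2.0) ** 2
best, best_val = pso(mse_like, dim=2)
```

In the paper's setting, `objective` would wrap an ANN training/validation run and the particle position would encode the weights or hyperparameters being tuned.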

Keywords: cost prediction, construction projects, machine learning, artificial neural networks, particle swarm optimization, project management, feature selection, road reconstruction

Procedia PDF Downloads 58
439 Eco Scale: A Tool for Assessing the Greenness of Pharmaceuticals Analysis

Authors: Heba M. Mohamed

Abstract:

Owing to scientific and public concern about health and the environment, and the pursuit of a better quality of life, "green", "environmentally friendly", and "eco-friendly" practices have been introduced and implemented in different research areas. Consequently, researchers' attention has been drawn toward greening analytical methodologies and taking the principles of Green Analytical Chemistry (GAC) into consideration. It is highly important to appraise the environmental impact of each of the implemented green approaches. Compared with other traditional green metrics (the E-factor, atom economy, and the process profile), the Eco-Scale is the optimum choice for assessing the environmental impact of the analytical procedures used in pharmaceuticals analysis. For analytical methodologies, the Eco-Scale is calculated by allotting penalty points to any aspect of the analytical procedure that deviates from ideal green analysis, where a perfect green analysis has an Eco-Scale value of 100. In this work, the Eco-Scale was calculated and compared for several reported green analytical methods to accentuate their greening potential; the different scores reveal how green each method is relative to the ideal value. The study emphasizes that greenness measurement is not only about determining the quantity of waste but follows a holistic scheme that considers all factors.
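The Eco-Scale arithmetic itself is simple enough to show directly: start from the ideal score of 100 and subtract penalty points for each deviation. The specific penalty entries below are illustrative examples of the kind of items penalized (solvent, waste, hazards, energy), not values taken from the methods assessed in this work.

```python
# Analytical Eco-Scale: 100 minus the sum of penalty points, where
# each penalty marks a deviation from ideal green analysis.
# Penalty values below are illustrative.

def eco_scale(penalties):
    return 100 - sum(penalties.values())

method_penalties = {
    "organic solvent, 10-100 mL": 8,
    "waste > 10 mL, no treatment": 8,
    "occupational hazard": 3,
    "instrument energy <= 1.5 kWh per sample": 1,
}
score = eco_scale(method_penalties)
```

A score above 75 is conventionally read as excellent green analysis, 50 to 75 as acceptable, and below 50 as inadequate, which is how the computed scores can be compared against the ideal.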

Keywords: eco scale, green analysis, environmentally friendly, pharmaceuticals analysis

Procedia PDF Downloads 438
438 Comparison of the Factor of Safety and Strength Reduction Factor Values from Slope Stability Analysis of a Large Open Pit

Authors: James Killian, Sarah Cox

Abstract:

Stability criteria are the way the results of geotechnical analyses are conveyed and the basis on which sensitivities and risk assessments are performed. Historically, the primary stability criterion for slope design has been the Factor of Safety (FOS) obtained from a limit equilibrium calculation. Increasingly, the value derived from Strength Reduction Factor (SRF) analysis is being used as the criterion for stability assessment. The purpose of this work was to study in detail the relationship between SRF values produced by a numerical modeling technique and the traditional FOS values produced by Limit Equilibrium Method (LEM) analyses. This study utilized a model of a 3000-foot-high slope with a 45-degree slope angle, assuming a perfectly plastic Mohr-Coulomb constitutive model with high cohesion and friction angle values typical of a large hard-rock mine slope. A number of variables affecting the SRF value in a numerical analysis were tested, including zone size, in-situ stress, tensile strength, and dilation angle. This paper demonstrates that in most cases, SRF values are lower than the corresponding LEM FOS values. Modeled zone size has the greatest effect on the estimated SRF value, which can be as much as 15% lower than the FOS. For consistency when using SRF as a stability criterion, the authors suggest that numerical model zone sizes should be no smaller than about 1% of the overall slope height and no greater than 2%. Future work could include investigations of the effect of anisotropic strength assumptions or advanced constitutive models.
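The strength reduction procedure can be sketched numerically: cohesion and the tangent of the friction angle are divided by a trial factor F until the slope reaches limiting equilibrium (FOS = 1). To stay self-contained, the stability function below is a simple planar limit-equilibrium formula standing in for the numerical model, and all material values are illustrative; note that for such a linear formula SRF equals FOS exactly, whereas the paper's point is that a real numerical model yields lower SRF values.

```python
import math

# Strength Reduction Factor via bisection on a stand-in stability
# function. Units and material parameters are illustrative.

def planar_fos(c, phi_deg, slope_deg, height, gamma=25.0):
    """FOS of a planar slide along the slope angle (illustrative)."""
    beta = math.radians(slope_deg)
    phi = math.radians(phi_deg)
    weight_term = 0.5 * gamma * height  # unit-thickness planar block
    resisting = c + weight_term * math.cos(beta) ** 2 * math.tan(phi)
    driving = weight_term * math.sin(beta) * math.cos(beta)
    return resisting / driving

def srf(c, phi_deg, slope_deg, height, lo=0.5, hi=5.0, tol=1e-6):
    """Bisect for the factor F that brings reduced strengths to FOS = 1."""
    def reduced_fos(f):
        c_r = c / f
        phi_r = math.degrees(math.atan(math.tan(math.radians(phi_deg)) / f))
        return planar_fos(c_r, phi_r, slope_deg, height)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if reduced_fos(mid) > 1.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# High cohesion (kPa) and friction angle typical of a hard-rock slope:
f = srf(c=800.0, phi_deg=45.0, slope_deg=45.0, height=900.0)
```

In an actual SRF analysis, `reduced_fos` would be replaced by a run of the numerical model with the reduced strengths, and "failure" detected by non-convergence or runaway displacement rather than an explicit FOS.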

Keywords: FOS, SRF, LEM, comparison

Procedia PDF Downloads 307
437 Efficient Deep Neural Networks for Real-Time Strawberry Freshness Monitoring: A Transfer Learning Approach

Authors: Mst. Tuhin Akter, Sharun Akter Khushbu, S. M. Shaqib

Abstract:

A real-time system architecture is highly effective for monitoring and detecting damaged products or fruits that may deteriorate over time or become infected with diseases. Deep learning models have proven effective in building such architectures. However, building a deep learning model from scratch is a time-consuming and costly process. A more efficient solution is to utilize deep neural network (DNN) based transfer learning models in the real-time monitoring architecture. This study focuses on using a novel strawberry dataset to develop effective transfer learning models for the proposed real-time monitoring system architecture, specifically for evaluating and detecting strawberry freshness. Several state-of-the-art transfer learning models were employed, and the best-performing model was found to be Xception, demonstrating higher performance across evaluation metrics such as accuracy, recall, precision, and F1-score.

Keywords: strawberry freshness evaluation, deep neural network, transfer learning, image augmentation

Procedia PDF Downloads 90
436 Generalized Approach to Linear Data Transformation

Authors: Abhijith Asok

Abstract:

This paper presents a generalized approach to the simple linear data transformation, Y = bX, through an integration of multidimensional coordinate geometry, vector space theory, and polygonal geometry. The scaling is performed by adding an additional 'Dummy Dimension' to the n-dimensional data, which makes it possible to plot two-dimensional component-wise straight lines on pairs of dimensions. The end result is a set of scaled extensions of observations in any of the 2^n spatial divisions, where n is the total number of applicable dimensions/dataset variables, created by shifting the n-dimensional plane along the 'Dummy Axis'. The derived scaling factor was found to depend on the coordinates of the common point of origin of the diverging straight lines and on the plane of extension, chosen on and perpendicular to the 'Dummy Axis', respectively. This result gives a geometrical interpretation of a linear data transformation and hence opens opportunities for a more informed choice of the factor b, based on a better choice of these coordinate values. The paper goes on to identify the effect of this transformation on certain popular distance metrics; for many of them, the distance metric retains the same scaling factor as the features.
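The closing observation can be checked numerically: under the uniform scaling Y = bX, many distance metrics rescale by exactly the same factor b. The 3-D points and scaling factor below are illustrative.

```python
import math

# Under Y = bX, Euclidean distance scales by |b|:
# d(bp, bq) = |b| * d(p, q). Demonstrated on illustrative points.

def euclidean(p, q):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def scale(point, b):
    return [b * coord for coord in point]

p, q, b = [1.0, 2.0, 3.0], [4.0, 0.0, 8.0], 2.5

d_before = euclidean(p, q)
d_after = euclidean(scale(p, b), scale(q, b))
```

The same factoring argument applies to any metric that is homogeneous of degree one in its coordinates (e.g. Manhattan distance), which is the family the paper's remark concerns.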

Keywords: data transformation, dummy dimension, linear transformation, scaling

Procedia PDF Downloads 297
435 Education in Schools and Public Policy in India

Authors: Sujeet Kumar

Abstract:

Education is of great importance, particularly in terms of increasing human capital and economic competitiveness. It plays a crucial role in cognitive and skill development, and a vital role in the process of socialization, fostering social justice, and enhancing social cohesion. Education policy has always been a priority for developed countries, and is later adopted by developing countries as well. The government of India has likewise changed its education policies in recognition of change at the national and supranational levels. However, quality education has still not become an open door for every child in India, and reports are produced from year to year on the level of school education in India. This paper is concerned with schooling in India. In particular, it focuses on two government and two private schools in Bihar, but reference is also made to schools in Delhi, especially around slum communities. The paper presents a brief historical context and an overview of the current school system in India. It then analyzes current policy developments with reference to field observation, anchored around choice, diversity, market orientation, and the gap between different groups of pupils. A considerable degree of difference was observed between private and government schools in terms of the quality of teachers, methods of teaching, and the overall learning environment. The paper concludes that recent policy developments in education, particularly the Sarva Shiksha Abhiyan (SSA) and the Right to Education Act (2009), require a new approach to bridge the gap through broader grassroots consultation and a participatory approach with different stakeholders.

Keywords: education, public policy, participatory approach

Procedia PDF Downloads 394
434 Analysis of Diabetes Patients Using Pearson, Cost Optimization, Control Chart Methods

Authors: Devatha Kalyan Kumar, R. Poovarasan

Abstract:

In this paper, we consider important factors and health parameters of diabetes patients, especially children diabetic by birth (pediatric congenital diabetes). Using three methods (Spearman correlation, cost optimization, and control charts), we assess the importance of each attribute in the dataset and thereby determine the attribute most highly correlated with diabetes among young patients. We apply the cost optimization, control chart, and Spearman methodologies in a real-time application of finding the data efficiency in this diabetes dataset. The Spearman methodology is a correlation methodology also used in the software development process to identify the complexity between the various modules of the software; identifying this complexity is important because higher complexity implies a higher chance of risk occurring in the software. With the control chart, the mean, variance, and standard deviation of the data are calculated. With the cost optimization model, we optimize the variables. Hence, we choose the Spearman, control chart, and cost optimization methods to assess the data efficiency in diabetes datasets.
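Spearman correlation, the ranking method named above, measures how monotonically two attributes move together. A minimal tie-aware implementation is sketched below; the patient values are invented for illustration and are not from the study's dataset.

```python
# Spearman rank correlation: Pearson correlation applied to the
# ranks of the data, with average ranks for ties.
# Data values are illustrative.

def ranks(values):
    """Average ranks (1-based), handling ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# e.g. HbA1c level vs. a symptom-severity score for six patients:
hba1c = [5.8, 6.4, 7.1, 7.9, 8.6, 9.2]
severity = [1, 2, 2, 4, 5, 6]
rho = spearman(hba1c, severity)
```

Because Spearman works on ranks, it captures any monotonic relationship between an attribute and the outcome, not only a linear one, which suits heterogeneous health parameters.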

Keywords: correlation, congenital diabetics, linear relationship, monotonic function, ranking samples, pediatric

Procedia PDF Downloads 256
433 Digital Image Steganography with Multilayer Security

Authors: Amar Partap Singh Pharwaha, Balkrishan Jindal

Abstract:

In this paper, a new method is developed for hiding an image in a digital image with multilayer security. In the proposed method, the secret image is first encrypted using a flexible-matrix-based symmetric key to add the first layer of security. A second layer of security is then added by encrypting the ciphered data using a Pythagorean-theorem-based method. The ciphered data bits (4 bits) produced after double encryption are then embedded within the digital image in the spatial domain using Least Significant Bit (LSB) substitution. To improve the image quality of the stego-image, an improved form of the pixel adjustment process is proposed. To evaluate the effectiveness of the proposed method, image quality metrics including Peak Signal-to-Noise Ratio (PSNR), Mean Square Error (MSE), entropy, correlation, mean value, and the Universal Image Quality Index (UIQI) are measured. It has been found experimentally that the proposed method provides higher security as well as robustness; the results of this study are quite promising.
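The final embedding and measurement stages can be sketched directly: substitute the low bits of each pixel with payload bits, then score the distortion with MSE/PSNR. The encryption layers are omitted here, and the cover pixels and payload bits are illustrative.

```python
import math

# 4-bit LSB substitution into 8-bit grayscale pixels, plus the
# MSE/PSNR quality metrics mentioned above. Data is hypothetical.

def embed_lsb(pixels, bits, n_lsb=4):
    """Replace the n_lsb low bits of each pixel with payload bits."""
    out = []
    it = iter(bits)
    for p in pixels:
        chunk = 0
        for _ in range(n_lsb):
            chunk = (chunk << 1) | next(it, 0)  # pad with 0s when exhausted
        out.append((p & ~((1 << n_lsb) - 1)) | chunk)
    return out

def psnr(original, stego, peak=255):
    mse = sum((a - b) ** 2 for a, b in zip(original, stego)) / len(original)
    if mse == 0:
        return float("inf")
    return 10 * math.log10(peak ** 2 / mse)

cover = [120, 33, 250, 7, 88, 199, 64, 141]
payload = [1, 0, 1, 1, 0, 1, 0, 0]  # doubly-encrypted data bits
stego = embed_lsb(cover, payload, n_lsb=4)
quality = psnr(cover, stego)
```

Replacing four LSBs per pixel is aggressive, which is why the paper adds a pixel adjustment step: it nudges the high bits so the stego pixel lands closer to the original value, raising PSNR without disturbing the payload bits.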

Keywords: Pythagorean theorem, pixel adjustment, ciphered data, image hiding, least significant bit, flexible matrix

Procedia PDF Downloads 337
432 Machine Learning-Driven Prediction of Cardiovascular Diseases: A Supervised Approach

Authors: Thota Sai Prakash, B. Yaswanth, Jhade Bhuvaneswar, Marreddy Divakar Reddy, Shyam Ji Gupta

Abstract:

Across the globe, there are a lot of chronic diseases, and heart disease stands out as one of the most perilous. Sadly, many lives are lost to this condition, even though early intervention could prevent such tragedies. However, identifying heart disease in its initial stages is not easy. To address this challenge, we propose an automated system aimed at predicting the presence of heart disease using advanced techniques. By doing so, we hope to empower individuals with the knowledge needed to take proactive measures against this potentially fatal illness. Our approach towards this problem involves meticulous data preprocessing and the development of predictive models utilizing classification algorithms such as Support Vector Machines (SVM), Decision Tree, and Random Forest. We assess the efficiency of every model based on metrics like accuracy, ensuring that we select the most reliable option. Additionally, we conduct thorough data analysis to reveal the importance of different attributes. Among the models considered, Random Forest emerges as the standout performer with an accuracy rate of 96.04% in our study.

Keywords: support vector machines, decision tree, random forest

Procedia PDF Downloads 40
431 An Efficient Resource Management Algorithm for Mobility Management in Wireless Mesh Networks

Authors: Mallikarjuna Rao Yamarthy, Subramanyam Makam Venkata, Satya Prasad Kodati

Abstract:

The main objective of the proposed work is to reduce the overall network traffic incurred by mobility management, reduce the packet delivery cost, and increase resource utilization. The proposed algorithm, an Efficient Resource Management Algorithm (ERMA) for mobility management in wireless mesh networks, relies on a pointer-based mobility management scheme. Whenever a mesh client moves from one mesh router to another, a pointer is set up dynamically between the previous mesh router and the current mesh router, subject to distance constraints. The algorithm is evaluated using signaling cost, data delivery cost, and total communication cost as performance metrics, and is demonstrated for both internet and intranet sessions. The proposed algorithm yields significantly better performance in terms of signaling cost, data delivery cost, and total communication cost.

Keywords: data delivery cost, mobility management, pointer forwarding, resource management, wireless mesh networks

Procedia PDF Downloads 367
430 Generating Insights from Data Using a Hybrid Approach

Authors: Allmin Susaiyah, Aki Härmä, Milan Petković

Abstract:

Automatic generation of insights from data using insight mining systems (IMS) is useful in many applications, such as personal health tracking, patient monitoring, and business process management. Existing IMS face challenges in controlling insight extraction, scaling to large databases, and generalizing to unseen domains. In this work, we propose a hybrid approach consisting of rule-based and neural components for generating insights from data while overcoming the aforementioned challenges. Firstly, a rule-based data2CNL component is used to extract statistically significant insights from data and represent them in a controlled natural language (CNL). Secondly, a BERTSum-based CNL2NL component is used to convert these CNLs into natural language texts. We improve the model using task-specific and domain-specific fine-tuning. Our approach has been evaluated using statistical techniques and standard evaluation metrics. We overcame the aforementioned challenges and observed significant improvement with domain-specific fine-tuning.

Keywords: data mining, insight mining, natural language generation, pre-trained language models

Procedia PDF Downloads 119
429 A Review of Routing Protocols for Mobile Ad-Hoc NETworks (MANET)

Authors: Hafiza Khaddija Saman, Muhammad Sufyan

Abstract:

The increase in availability and popularity of mobile wireless devices has led researchers to develop a wide variety of Mobile Ad-hoc NETwork (MANET) protocols to exploit the unique communication opportunities presented by these devices. Devices are able to communicate directly using the wireless spectrum in a peer-to-peer fashion and to route messages through intermediate nodes; however, the nature of shared wireless communication and mobile devices results in many routing and security challenges that must be addressed before deploying a MANET. In this paper, we investigate the range of available MANET routing protocols and discuss the functionality of several, ranging from early protocols such as DSDV to more advanced ones such as MAODV; our protocol study focuses on the work by Perkins in developing and improving MANET routing. A range of literature relating to the field of MANET routing was identified and reviewed, and we also reviewed literature on securing AODV-based MANETs, as AODV may be the most popular MANET protocol. The literature review identified a number of trends within research papers, such as exclusive use of the random waypoint mobility model, excluding key metrics from simulation results, and not comparing protocol performance against available alternatives.

Keywords: protocol, MANET, ad-Hoc, communication

Procedia PDF Downloads 261
428 Leveraging Quality Metrics in Voting Model Based Thread Retrieval

Authors: Atefeh Heydari, Mohammadali Tavakoli, Zuriati Ismail, Naomie Salim

Abstract:

Seeking and sharing knowledge on online forums has made them popular in recent years. Although online forums are valuable sources of information, the variety of message sources makes retrieving reliable threads with high-quality content an issue. The majority of existing information retrieval systems ignore the quality of retrieved documents, particularly in the field of thread retrieval. In this research, we present an approach that employs various quality features to investigate the quality of retrieved threads. Different aspects of content quality, including completeness, comprehensiveness, and politeness, are assessed using these features, which leads to finding not only textually but also conceptually relevant threads for a user query within a forum. To analyze the influence of the features, we used an adapted version of the voting model for thread search as the retrieval system. We equipped it with each feature on its own, and with various combinations of features in turn, during multiple runs. The results show that incorporating the quality features significantly enhances the effectiveness of the retrieval system.
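The voting-model idea can be sketched with a small score combination: candidate threads earn votes from their posts' retrieval scores, and a quality feature re-weights the result. The thread names, post scores, quality values, and interpolation weight `alpha` are all illustrative assumptions, not the paper's exact formulation.

```python
# CombSUM-style voting over posts, then interpolation with a
# normalized content-quality feature per thread. Data is hypothetical.

post_scores = {  # post-level retrieval scores for a query, by thread
    "thread_1": [0.9, 0.7, 0.2],
    "thread_2": [0.6, 0.5],
    "thread_3": [0.95],
}
quality = {"thread_1": 0.8, "thread_2": 0.9, "thread_3": 0.3}

def combsum(scores):
    """CombSUM voting: a thread's score is the sum of its posts' votes."""
    return {t: sum(s) for t, s in scores.items()}

def quality_weighted(scores, quality, alpha=0.5):
    """Interpolate the normalized voting score with the quality feature."""
    base = combsum(scores)
    zmax = max(base.values())
    return {t: (1 - alpha) * base[t] / zmax + alpha * quality[t]
            for t in base}

ranked = sorted(quality_weighted(post_scores, quality).items(),
                key=lambda kv: kv[1], reverse=True)
```

Note how `thread_3`, despite holding the single highest-scoring post, drops to last once its low quality feature is folded in; that is the behavior the quality features are meant to induce.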

Keywords: content quality, forum search, thread retrieval, voting techniques

Procedia PDF Downloads 213
427 Integrated Grey Relational Analysis-Standard Deviation Method for Handover in Heterogeneous Networks

Authors: Mohanad Alhabo, Naveed Nawaz, Mahmoud Al-Faris

Abstract:

The dense deployment of small cells is a promising solution to enhance the coverage and capacity of heterogeneous networks (HetNets). However, unplanned deployment can bring new challenges to the network, ranging from interference to unnecessary handovers and handover failures, degrading the quality of service (QoS) delivered to the end user. In this paper, we propose an integrated Grey Relational Analysis-Standard Deviation handover method (GRA-SD) for HetNets. The proposed method uses the Standard Deviation (SD) technique to acquire the weights of the handover metrics and the GRA method to select the best handover base station. The performance of the GRA-SD method is evaluated and compared with traditional Multiple Attribute Decision Making (MADM) methods, including the Simple Additive Weighting (SAW) and VIKOR methods. Results reveal that the proposed method outperforms the others in minimizing the number of frequent unnecessary handovers and handover failures, in addition to improving energy efficiency.
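The weighting-and-ranking step the abstract describes can be sketched roughly as follows. This is a hypothetical reconstruction: the min-max normalisation and the 0.5 distinguishing coefficient are standard GRA conventions, not details confirmed by the paper, and the example metrics are our own.

```python
import numpy as np

def gra_sd_rank(decision_matrix, benefit):
    """Rank candidate base stations with SD weighting + grey relational analysis.

    decision_matrix: rows = candidate stations, columns = handover metrics.
    benefit: per-column flag, True if larger is better (e.g. signal strength),
    False if smaller is better (e.g. delay or load).
    """
    X = np.asarray(decision_matrix, dtype=float)
    # Normalise each metric to [0, 1], flipping cost criteria so 1 is best.
    lo, hi = X.min(axis=0), X.max(axis=0)
    span = np.where(hi - lo == 0, 1.0, hi - lo)
    N = np.where(benefit, (X - lo) / span, (hi - X) / span)
    # Standard-deviation weights: metrics that vary more across candidates
    # carry more weight in the decision.
    sd = N.std(axis=0)
    w = sd / sd.sum() if sd.sum() > 0 else np.full(X.shape[1], 1 / X.shape[1])
    # Grey relational coefficient against the ideal reference series (all ones),
    # with the conventional distinguishing coefficient 0.5.
    delta = np.abs(1.0 - N)
    xi = (delta.min() + 0.5 * delta.max()) / (delta + 0.5 * delta.max())
    grade = (xi * w).sum(axis=1)
    return int(np.argmax(grade)), grade
```

A candidate that dominates on every metric attains the maximum grey relational grade of 1 and is selected as the handover target.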

Keywords: energy efficiency, handover, HetNets, MADM, small cells

Procedia PDF Downloads 116
426 Comparison of Techniques for Detection and Diagnosis of Eccentricity in the Air-Gap Fault in Induction Motors

Authors: Abrahão S. Fontes, Carlos A. V. Cardoso, Levi P. B. Oliveira

Abstract:

Induction motors are used worldwide in various industries. Several maintenance techniques are applied to increase the operating time and lifespan of these motors. Among them, predictive maintenance techniques such as Motor Current Signature Analysis (MCSA), Motor Square Current Signature Analysis (MSCSA), Park's Vector Approach (PVA) and Park's Vector Square Modulus (PVSM) are used to detect and diagnose faults in electric motors, characterized by patterns in the stator current frequency spectrum. In this article, these techniques are applied and compared on a real motor with an air-gap eccentricity fault. A theoretical model of a fault-free induction motor was used to assist the comparison between the stator current frequency spectrum patterns with and without faults. Metrics were proposed and applied to evaluate the fault-detection sensitivity of each technique. The results presented here show that the above techniques are suitable for detecting air-gap eccentricity, and the comparison between them established the suitability of each one.
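The spectral signature such techniques look for can be illustrated with a minimal MCSA-style sketch. The sideband pattern f_supply ± k·f_rotor used below is the textbook mixed-eccentricity signature; the function and its parameters are illustrative, not the authors' implementation.

```python
import numpy as np

def sideband_amplitudes(current, fs_hz, supply_hz, rotor_hz, n_harmonics=2):
    """Amplitudes of the mixed-eccentricity sidebands f_supply +/- k*f_rotor
    in a stator-current spectrum (the classic MCSA signature).

    current: sampled stator current; fs_hz: sampling rate;
    supply_hz: mains frequency; rotor_hz: rotor rotational frequency.
    """
    n = len(current)
    # Hanning window to limit spectral leakage from the dominant supply line.
    spectrum = np.abs(np.fft.rfft(current * np.hanning(n))) / n
    freqs = np.fft.rfftfreq(n, d=1.0 / fs_hz)
    out = {}
    for k in range(1, n_harmonics + 1):
        for f in (supply_hz - k * rotor_hz, supply_hz + k * rotor_hz):
            if 0 < f < fs_hz / 2:
                out[round(float(f), 3)] = float(spectrum[np.argmin(np.abs(freqs - f))])
    return out
```

In a diagnosis setting, a faulty motor shows clearly raised sideband amplitudes relative to the fault-free reference spectrum.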

Keywords: eccentricity in the air-gap, fault diagnosis, induction motors, predictive maintenance

Procedia PDF Downloads 350
425 Framework for Detecting External Plagiarism from Monolingual Documents: Use of Shallow NLP and N-Gram Frequency Comparison

Authors: Saugata Bose, Ritambhra Korpal

Abstract:

The internet has increased copy-paste scenarios among students as well as researchers, leading to different levels of plagiarized documents. For this reason, much research has focused on detecting plagiarism automatically. In this paper, an initiative is discussed in which Natural Language Processing (NLP) techniques and supervised machine learning algorithms are combined to detect plagiarized texts. The major emphasis is on constructing a framework that successfully detects external plagiarism in monolingual texts. To detect plagiarism, an n-gram frequency comparison approach was implemented to construct the model framework. The framework is based on 120 characteristics extracted during document pre-processing using NLP. Afterwards, filter metrics were applied to select the most relevant characteristics, and a supervised classification algorithm was used to classify the documents into four levels of plagiarism. A confusion matrix was built to estimate the false positives and false negatives. Our plagiarism framework achieved a very high accuracy score.
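The core n-gram frequency comparison can be sketched as a containment score. This is a minimal illustration of the general technique, not the paper's 120-characteristic framework:

```python
from collections import Counter

def ngram_overlap(suspect, source, n=3):
    """Containment of the suspect text's word n-grams in the source (0..1).
    A score near 1 suggests heavy copy-paste; near 0 suggests original text."""
    def grams(text):
        words = text.lower().split()
        return Counter(tuple(words[i:i + n]) for i in range(len(words) - n + 1))
    s, t = grams(suspect), grams(source)
    if not s:
        return 0.0
    # Count shared n-grams, respecting repetition via the min of the counts.
    shared = sum(min(count, t[g]) for g, count in s.items())
    return shared / sum(s.values())
```

In a full system, such a score per document pair would be one feature among many fed to the supervised classifier that assigns a plagiarism level.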

Keywords: lexical matching, shallow NLP, supervised machine learning algorithm, word n-gram

Procedia PDF Downloads 357
424 Aggregate Fluctuations and the Global Network of Input-Output Linkages

Authors: Alexander Hempfing

Abstract:

The desire to understand business cycle fluctuations, trade interdependencies and co-movement has a long tradition in economic thinking. From input-output economics to business cycle theory, researchers have sought appropriate answers from both empirical and theoretical perspectives. This paper empirically analyses how the production structure of the global economy and of several states developed over time, what their distributional properties are, and whether there are network-specific metrics that allow identifying structurally important nodes on a global, national and sectoral scale. For this, the World Input-Output Database was used, and different statistical methods were applied. Empirical evidence is provided that the importance of the Eastern hemisphere in the global production network increased significantly between 2000 and 2014. Moreover, it was possible to show that the sectoral eigenvector centrality indices on a global level are power-law distributed, providing evidence that specific national sectors exist which are more critical to the world economy than others, serving as hubs within the global production network. However, further findings suggest that global production cannot be characterized as a scale-free network.
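Eigenvector centrality on a weighted input-output matrix can be computed by power iteration; the sketch below is illustrative (the paper works with the World Input-Output Database, not a toy matrix, and its exact centrality variant is not specified here):

```python
import numpy as np

def eigenvector_centrality(W, iters=200, tol=1e-10):
    """Power iteration on a weighted input-output matrix W, where W[i, j] is
    the flow from sector i to sector j. Returns a unit-norm score per sector."""
    W = np.asarray(W, dtype=float)
    x = np.full(W.shape[0], 1.0 / W.shape[0])
    for _ in range(iters):
        # A sector is central if central sectors source heavily from it.
        x_new = W.T @ x
        norm = np.linalg.norm(x_new)
        if norm == 0:
            return x
        x_new /= norm
        if np.abs(x_new - x).max() < tol:
            return x_new
        x = x_new
    return x
```

Fitting a power law to the resulting score distribution (and testing it against alternatives) is then what supports or rejects the scale-free characterization.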

Keywords: economic integration, industrial organization, input-output economics, network economics, production networks

Procedia PDF Downloads 276
423 Bibliometrics of 'Community Garden' and Associated Keywords

Authors: Guilherme Reis Ranieri, Guilherme Leite Gaudereto, Michele Toledo, Luis Fernando Amato-Lourenco, Thais Mauad

Abstract:

Given the importance to urban sustainability and the growing relevance of the term ‘community garden’, this paper conducts a bibliometric analysis of the term. Using SCOPUS as the database, we analyzed 105 articles containing the keyword ‘community garden’ and conducted a cluster analysis of the associated keywords. As results, we found 205 articles and 404 different keywords. Among the keywords, 334 appear only once, 44 appear twice, and 9 appear three times. The most frequent keywords are: community food systems (74), urban activism (14), communities of practice (6), food production (6) and public rhetoric (5). The areas containing the most articles are: social sciences (74), environmental science (29) and agricultural and biological sciences (24). The three countries concentrating the most papers are the United States (54), Canada (15) and Australia (12). The main journal with these keywords is Local Environment (10). The first publication was in 1999, and publications up to 2010 account for 30.5% of the total; the remaining 69.5% appeared between 2010 and 2015, indicating an increase in frequency. We conclude that, based on the distribution of keywords, the papers are still scattered across various research topics and present high variability between subjects.
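The counting step underlying such a keyword analysis can be sketched as follows; the records and names are illustrative, not the paper's data pipeline:

```python
from collections import Counter
from itertools import combinations

def keyword_stats(records):
    """Frequency and pairwise co-occurrence counts for per-article keyword
    lists - the raw input for a keyword cluster analysis."""
    freq = Counter()
    cooc = Counter()
    for keywords in records:
        # Normalise case and de-duplicate within an article; sort so each
        # unordered pair is counted under one canonical key.
        kws = sorted(set(k.lower() for k in keywords))
        freq.update(kws)
        cooc.update(combinations(kws, 2))
    return freq, cooc
```

Clustering then typically operates on the co-occurrence counts, grouping keywords that repeatedly appear together in the same articles.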

Keywords: bibliometrics, community garden, metrics, urban agriculture

Procedia PDF Downloads 367
422 An Application of Lean Thinking at the Cargo Transport Area

Authors: Caroline Demartin, Natalia Camaras, Nelson Maestrelli, Max Filipe Gonçalves

Abstract:

This paper presents a case study of Lean Thinking in the cargo transport area. Lean Office principles are the application of Lean Thinking to the service area and are based on Lean Production concepts. Lean Production is a philosophy that was born and gained ground after the Second World War, when the Japanese Toyota Company developed a process of identifying and eliminating waste. Many researchers show that most companies adopting the principles created at Toyota do so in the manufacturing sector; until the 1990s, there were no major applications in the service sector. Due to increased competition and the need for competitive advantage, many companies began to observe the lean transformation and take it as a reference. In this study, a key process at a cargo transport company was analyzed using Lean Office tools and methods: a current-state map was developed, the main wastes were identified, metrics were used to evaluate improvements, and a priority matrix was used to identify action plans. The results showed that Lean Office has great potential to be successfully applied in cargo air transport companies.

Keywords: lean production, lean office, logistic, service sector

Procedia PDF Downloads 190
421 Social Impact Evaluation in the Housing Sector

Authors: Edgard Barki, Tânia Modesto Veludo-de-Oliveira, Felipe Zambaldi

Abstract:

The social enterprise sector can be characterized as organizations that aim to solve social problems with financial sustainability, using market mechanisms. The sector has attracted increasing interest worldwide. Despite its growth and relevance, there is still a gap regarding the assessment of the social impact of organizations' initiatives in this field. A number of metrics have been designed worldwide to evaluate the impact of social enterprises (e.g., IRIS, GIIRS, BACO), and some ad hoc studies have been carried out, mainly in the microcredit sector, but research on social impact evaluation remains underdeveloped. Therefore, this research evaluates the social impact of two social enterprises (Terra Nova and Vivenda) in the housing area in Brazil. To evaluate these impacts and their dimensions, we conducted exploratory research through three focus groups, thirty in-depth interviews and a survey with beneficiaries of both organizations. The results allowed us to evaluate how the two organizations created a deep social impact in the populations served. Terra Nova has a more collective perspective, with clear benefits of social inclusion and improvement of the community’s infrastructure, while Vivenda has a more individualized perspective, improving self-esteem, sociability and family coexistence.

Keywords: Brazil, housing, social enterprise, social impact evaluation

Procedia PDF Downloads 442
420 Application of VE in Healthcare Services: An Overview of Healthcare Facility

Authors: Safeer Ahmad, Pratheek Sudhakran, M. Arif Kamal, Tarique Anwar

Abstract:

In healthcare facility design, efficient MEP (mechanical, electrical and plumbing) services are crucial because the built environment affects not only patients and their families but also healthcare staff and their outcomes. This paper covers the basics of value engineering (VE) and the phases that can be applied at the MEP design stage to optimize a healthcare facility; VE can also reduce the unnecessary costs associated with healthcare services. The paper explores healthcare facility services and a VE job plan for the successful application of the technique. A workshop with end users, the design team and associated experts is conducted using concepts, tools, methods and mechanisms developed to select what is appropriate and ideal among the many value engineering processes and tools that have long proven their ability to enhance value, following the concept of Total Quality Management, while achieving the most efficient resource allocation to satisfy the key functions and requirements of the project without sacrificing the targeted level of service for any design metric. A detailed study is discussed, with analysis carried out at different phases of the process to achieve a better outcome; various tools are used for analysis at each phase, and the results obtained after implementing the techniques are discussed.

Keywords: value engineering, healthcare facility, design, services

Procedia PDF Downloads 197
419 The Mashishing Marking Memories Project: A Culture-Centered Approach to Participation

Authors: Nongcebo Ngcobo

Abstract:

This research explores the importance of including a multitude of voices in the cultural heritage narrative, particularly in South Africa. The Mashishing project is an extension of the existing ‘Biesje Poort project’, a rock art project funded by the National Heritage Council in 2010 - 2013. Hence, the Mashishing Marking Memories project applies comparable objectives, though in a different geographical area. The wider project objectives are to transfer skills, promote social cohesion and empowerment, and add to the knowledge base of the Mashishing region and the repository of the local museum in Lydenburg. The study is located in the Mashishing area, in Mpumalanga, South Africa, where no multi-vocal heritage projects previously existed. The research assesses the Mashishing Marking Memories project through the culture-centered approach to communication for social change, examining the impact that the diverse participants have on the operations of the project and investigating whether culturally diverse participation facilitates or hinders effective participation within the project. Key findings uncovered the significance of participation and diverse voices in the cultural heritage field. Furthermore, the study highlights how unequal power relations affect effective participation. As a result, the study stresses the importance of bringing the researcher and the participant into a safe space that facilitates mutual learning, and encourages an exchange of roles in which the researcher shifts from being an authoritarian figure to being a listener.

Keywords: culture, heritage, participation, social change

Procedia PDF Downloads 121
418 Elder Abuse: An Exploration of China, the United States, and Israel’s Perspectives on Elder Abuse and What Their Differences Reveal about Its Underreported Nature

Authors: Sydney Burnett

Abstract:

The tendency of elder abuse to go underreported is rooted in the oppressive nature of ageism and victimization. Approximately 8% of the world's population was aged sixty or over in 1950, whereas in 2020 the share more than doubled to 16.9%; by 2050, it is expected to reach 22%. Although it is difficult for individuals of any age to feel completely supported in society, this proves especially difficult for the elderly demographic. As the elderly population continues to grow, the systemic abuse and neglect this group encounters, and thus its underreported nature, multiply at a similar rate. Although a recent increase in awareness has initiated stronger efforts to address the meager resources, processes, and personnel available to manage elder abuse, both reported and unreported, the destructive complexities of ageism and victimization persist. Examining the byproducts of the rapidly growing elderly demographic in China, the United States, and Israel, in conjunction with the inherent challenges in the terminology, definitions, and typologies of elder abuse, should provide insight into the pernicious influences that contribute to the non-identification and non-recognition of elder maltreatment in these three countries at different stages of development. This investigation aims to understand the intricacy of elder abuse, its correlation with a lack of acknowledgment, and its consequences for society by exploring the variation between China, the United States, and Israel's attitudes surrounding the subject. Furthermore, the systemic abuse and neglect embedded in global ageism can be revealed by the differences between the three countries' approaches to reporting elder abuse.

Keywords: elder abuse, ageism, mistreatment, underreported

Procedia PDF Downloads 91
417 Analysis of the Predictive Performance of Value at Risk Estimations in Times of Financial Crisis

Authors: Alexander Marx

Abstract:

Measuring and mitigating market risk is essential for the stability of enterprises, especially for major banking corporations and investment banks. For these risk measurement and mitigation processes, Value at Risk (VaR) is the risk metric most commonly used by practitioners. In recent years, significant weaknesses have appeared in the predictive performance of VaR in times of financial market crisis. To address this issue, this study investigates VaR estimation models and their predictive performance by applying a series of backtesting methods to the stock market indices of the G7 countries (Canada, France, Germany, Italy, Japan, the UK and the US) and Europe. The study employs parametric, non-parametric and semi-parametric VaR estimation models and is conducted over three periods covering the most recent financial market crises: the overall period (2006–2022), the global financial crisis period (2008–2009) and the COVID-19 period (2020–2022). Since regulatory authorities have introduced and mandated the Conditional Value at Risk (Expected Shortfall) as an additional regulatory risk management metric, the study also analyzes and compares both risk metrics on their predictive performance.
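The parametric and non-parametric estimator families the study compares, plus a minimal violation-count backtest, can be sketched as follows. This is a simplification under standard textbook definitions; the paper's actual backtesting battery and model set are more elaborate.

```python
from statistics import NormalDist, mean, stdev

def parametric_var(returns, alpha=0.99):
    """Variance-covariance VaR: assumes returns are normally distributed and
    reports the alpha-quantile of the loss as a positive number."""
    z = NormalDist().inv_cdf(alpha)
    return -(mean(returns) - z * stdev(returns))

def historical_var(returns, alpha=0.99):
    """Non-parametric VaR: the empirical alpha-quantile of historical losses."""
    losses = sorted(-r for r in returns)
    return losses[min(int(alpha * len(losses)), len(losses) - 1)]

def violation_rate(returns, var_forecasts):
    """Simple backtest: the fraction of days on which the realised loss
    exceeded the VaR forecast. A well-calibrated 99% VaR should be near 1%."""
    hits = sum(1 for r, v in zip(returns, var_forecasts) if -r > v)
    return hits / len(returns)
```

Formal backtests such as Kupiec's unconditional coverage test then judge whether the observed violation rate is statistically consistent with the target coverage level.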

Keywords: value at risk, financial market risk, banking, quantitative risk management

Procedia PDF Downloads 94