Search results for: data envelopment analysis (DEA)
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 42282

40962 Purchasing Decision-Making in Supply Chain Management: A Bibliometric Analysis

Authors: Ahlem Dhahri, Waleed Omri, Audrey Becuwe, Abdelwahed Omri

Abstract:

In industrial processes, decision-making ranges across different scales, from process control to supply chain management. The purchasing decision-making process in the supply chain is presently gaining more attention as a critical contributor to a company's strategic success. Given the scarcity of thorough summaries in prior studies, this bibliometric analysis adopts a meticulous approach to building quantitative knowledge on the constantly evolving subject of purchasing decision-making in supply chain management. Through bibliometric analysis, we examine a sample of 358 peer-reviewed articles from the Scopus database. VOSviewer and Gephi software were employed to analyze, combine, and visualize the data. Data analytic techniques, including citation network analysis, PageRank analysis, co-citation analysis, and publication trends, have been used to identify influential works and outline the discipline's intellectual structure. The outcomes of this descriptive analysis highlight the most prominent articles, authors, journals, and countries based on their citations and publications. The findings illustrate an increase in the number of publications, exhibiting a slightly growing trend in this field. Co-citation analysis coupled with content analysis of the most cited articles identified five research themes: integrating sustainability into the supplier selection process; supplier selection under disruption risks, assessment, and mitigation strategies; fuzzy MCDM approaches for supplier evaluation and selection; purchasing decisions in vendor problems; and decision-making techniques in supplier selection and order lot-sizing problems. With the help of a graphic timeline, this exhaustive map of the field visualizes the evolution of publications and demonstrates a gradual shift in research interest from vendor selection problems to integrating sustainability into the supplier selection process. These clusters offer insights into the wide variety of purchasing methods and conceptual frameworks that have emerged; however, they have not been validated empirically. The findings suggest that future research should provide greater depth of practical and empirical analysis to enrich the theories. These outcomes provide a powerful road map for further study in this area.
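
As a rough illustration of the citation-network and PageRank analysis described above, the sketch below builds a small directed citation graph and ranks papers with networkx; the edge list and paper names are hypothetical placeholders, not data from the study.

```python
import networkx as nx

# Hypothetical citation edges: (citing paper, cited paper).
citations = [
    ("P3", "P1"), ("P3", "P2"), ("P4", "P1"),
    ("P5", "P3"), ("P5", "P4"), ("P6", "P3"),
]

G = nx.DiGraph(citations)

# PageRank identifies influential works in the citation network.
scores = nx.pagerank(G, alpha=0.85)
for paper, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(paper, round(score, 3))

# Co-citation: two papers are co-cited when the same paper cites both.
cocitation = nx.Graph()
for citing in G.nodes:
    cited = list(G.successors(citing))
    for i in range(len(cited)):
        for j in range(i + 1, len(cited)):
            cocitation.add_edge(cited[i], cited[j])
```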

Keywords: bibliometric analysis, citation analysis, co-citation, Gephi, network analysis, purchasing, SCM, VOSviewer

Procedia PDF Downloads 86
40961 Empirical Study of Running Correlations in Exam Marks: Same Statistical Pattern as Chance

Authors: Weisi Guo

Abstract:

It is well established that there may be running correlations in sequential exam marks because students sit in the order of course registration patterns. As such, random, non-sequential sampling of exam marks is a standard recommended practice. Here, the paper examines a large body of exam data stretching several years across different modules to see the degree to which this is true. Using the real mark distribution as a generative process, it was found that randomly simulated data exhibited the same degree of running correlation as the real data. That is to say, the running correlations that one often observes are statistically identical to chance. Digging deeper, it was found that some high running correlations involve students who indeed share a common course history and make similar mistakes. However, at the statistical scale of a module question, the combined effect is statistically similar to a random shuffling of papers. As such, there may be no need to take random samples of marks, but it remains good practice to mark papers in a random sequence to reduce repetitive marking bias and errors.
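
One simple way to check whether observed running correlations exceed chance, in the spirit of the study, is to compare the lag-1 correlation of the real mark sequence against the same marks randomly shuffled; the sketch below uses synthetic marks as a stand-in for a real mark sheet.

```python
import numpy as np

def running_correlation(marks):
    """Lag-1 Pearson correlation between consecutive marks in a sequence."""
    return np.corrcoef(marks[:-1], marks[1:])[0, 1]

rng = np.random.default_rng(0)
real_marks = rng.normal(60, 12, size=200)          # stand-in for a real mark sheet

observed = running_correlation(real_marks)

# Null distribution: shuffle (i.e. resample from) the same mark distribution.
null = [running_correlation(rng.permutation(real_marks)) for _ in range(2000)]
p_value = np.mean(np.abs(null) >= abs(observed))
print(f"observed r = {observed:.3f}, p = {p_value:.3f}")
```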

Keywords: data analysis, empirical study, exams, marking

Procedia PDF Downloads 183
40960 Mostar Type Indices and QSPR Analysis of Octane Isomers

Authors: B. Roopa Sri, Y. Lakshmi Naidu

Abstract:

Chemical Graph Theory (CGT) is the branch of mathematical chemistry in which molecules are modeled to study their physicochemical properties using molecular descriptors. Amongst these descriptors, topological indices play a vital role in predicting the properties by defining the graph topology of the molecule. Recently, the bond-additive topological index known as the Mostar index has been proposed. In this paper, we compute the Mostar-type indices of octane isomers and use the data obtained to perform QSPR analysis. Furthermore, we show the correlation between the Mostar type indices and the properties.
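
For readers unfamiliar with the Mostar index, the sketch below computes it for a molecular graph with networkx: Mo(G) sums |n_u - n_v| over all edges uv, where n_u counts the vertices strictly closer to u than to v. The example graph (the hydrogen-suppressed skeleton of 2-methylheptane, one octane isomer) is purely illustrative.

```python
import networkx as nx

def mostar_index(G):
    """Mo(G) = sum over edges uv of |n_u - n_v|, where n_u is the number of
    vertices strictly closer to u than to v (and vice versa)."""
    dist = dict(nx.all_pairs_shortest_path_length(G))
    total = 0
    for u, v in G.edges():
        n_u = sum(1 for w in G if dist[w][u] < dist[w][v])
        n_v = sum(1 for w in G if dist[w][v] < dist[w][u])
        total += abs(n_u - n_v)
    return total

# Hydrogen-suppressed graph of 2-methylheptane (an octane isomer).
edges = [(1, 2), (2, 3), (3, 4), (4, 5), (5, 6), (6, 7), (2, 8)]
print(mostar_index(nx.Graph(edges)))
```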

Keywords: chemical graph theory, Mostar type indices, octane isomers, QSPR analysis, topological index

Procedia PDF Downloads 130
40959 Examining Social Connectivity through Email Network Analysis: Study of Librarians' Emailing Groups in Pakistan

Authors: Muhammad Arif Khan, Haroon Idrees, Imran Aziz, Sidra Mushtaq

Abstract:

Social platforms such as online discussion and mailing groups are well aligned with academic as well as professional learning spaces. Professional communities are increasingly moving to online forums for sharing and capturing intellectual output. This study investigated the dynamics of social connectivity of Yahoo mailing groups of Pakistani Library and Information Science (LIS) professionals using graph theory techniques. Design/Methodology: Social network analysis is a domain of increasing concern for scientists trying to identify whether people grow together through online social interaction or whether they merely reflect connectivity. We conducted a longitudinal study using network graph theory to analyze a large data set of email communication. The data were collected from three Yahoo mailing groups using network analysis software over a period of six months, i.e., January to June 2016. Findings of the network analysis were reviewed through focus group discussion with LIS experts and selected respondents of the study. Data were analyzed in Microsoft Excel, and network diagrams were visualized using NodeXL and the ORA-NetScenes package. Findings: The findings demonstrate that professionals and students exhibit intellectual growth the more they become tied within a network by interacting and participating in communication through online forums. The study reports on the dynamics of the large network by visualizing the email correspondence among group members in a network consisting of vertices (members) and edges (randomized correspondence). The pairwise relationships between group members were modelled and illustrated to show the characteristics, reasons, and strength of ties. Connectivity of nodes illustrated the frequency of communication among group members; node coupling, network diffusion, and node clustering were examined in depth. Network analysis was found to be a useful technique for investigating the dynamics of a large network.
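
A minimal sketch of the kind of connectivity metrics such a study computes, using networkx on a hypothetical email edge list of (sender, receiver) pairs; the members and message counts are placeholders, not data from the mailing groups.

```python
import networkx as nx

# Hypothetical email interactions within a mailing group: (sender, receiver).
emails = [("A", "B"), ("B", "C"), ("A", "C"), ("C", "D"),
          ("D", "A"), ("B", "D"), ("E", "A")]

G = nx.Graph()
for sender, receiver in emails:
    # Edge weight counts how often the pair corresponded.
    w = G.get_edge_data(sender, receiver, default={"weight": 0})["weight"]
    G.add_edge(sender, receiver, weight=w + 1)

print("degree:", dict(G.degree()))        # number of distinct contacts per member
print("clustering:", nx.clustering(G))    # local node clustering
print("density:", nx.density(G))          # overall connectivity of the group
```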

Keywords: emailing networks, network graph theory, online social platforms, yahoo mailing groups

Procedia PDF Downloads 241
40958 Qualitative Case Studies in Reading Specialist Education

Authors: Carol Leroy

Abstract:

This presentation focuses on the analysis of qualitative case studies in the graduate education of reading specialists. The presentation describes the development and application of an integrated conceptual framework for reading specialist education, drawing on Robert Stake's work on case study research, Kenneth Zeichner's work on professional learning, and various tools for reading assessment (e.g., the Qualitative Reading Inventory). Social constructivist theory is used to provide intersecting links between the various influences on the processes used to assess and teach reading within the case study framework. Illustrative examples are described to show the application of the framework in reading specialist education in a teaching clinic at a large urban university. Central to the education of reading specialists in this teaching clinic is the collection, analysis, and interpretation of data for the design and implementation of reading and writing programs for struggling readers and writers. The case study process involves the integrated interpretation of data, which is central to qualitative case study inquiry. An emerging theme in this approach to graduate education is the ambiguity and uncertainty that governs work with the adults and children who attend the clinic for assistance. Tensions and contradictions are explored insofar as they reveal overlapping but intersecting frameworks for case study analysis in the area of literacy education. An additional theme is the interplay of multiple layers of data, with a resulting depth that goes beyond the practical needs of the client and toward the deeper pedagogical growth of the reading specialist. The presentation makes a case for the value of qualitative case studies in reading specialist education. Further, the use of social constructivism as a unifying paradigm lends robustness to the conceptual framework as a tool for understanding the pedagogy involved.

Keywords: assessment, case study, professional education, reading

Procedia PDF Downloads 459
40957 A Computational Cost-Effective Clustering Algorithm in Multidimensional Space Using the Manhattan Metric: Application to the Global Terrorism Database

Authors: Semeh Ben Salem, Sami Naouali, Moetez Sallami

Abstract:

The increasing amount of collected data has limited the performance of current analysis algorithms. Thus, developing new cost-effective algorithms in terms of complexity, scalability, and accuracy has raised significant interest. In this paper, a modified, effective k-means based algorithm is developed and evaluated experimentally. The new algorithm aims to reduce the computational load without significantly affecting the quality of the clusterings. The algorithm uses the City Block (Manhattan) distance and a new stop criterion to guarantee convergence. Experiments conducted on a real data set show its high performance when compared with the original k-means version.
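
The abstract does not detail the modified algorithm, so the sketch below shows only the baseline idea it builds on: k-means-style clustering with the City Block (L1) distance, where cluster centres are updated with the component-wise median (the usual choice for L1). The data and parameters are placeholders.

```python
import numpy as np

def l1_kmeans(X, k, max_iter=100, tol=1e-4, seed=0):
    """Clustering with Manhattan (City Block) distance; centres are medians."""
    rng = np.random.default_rng(seed)
    centres = X[rng.choice(len(X), k, replace=False)]
    for _ in range(max_iter):
        # Assign each point to the centre with the smallest L1 distance.
        d = np.abs(X[:, None, :] - centres[None, :, :]).sum(axis=2)
        labels = d.argmin(axis=1)
        new_centres = np.array([
            np.median(X[labels == j], axis=0) if np.any(labels == j) else centres[j]
            for j in range(k)
        ])
        if np.abs(new_centres - centres).max() < tol:   # simple stop criterion
            break
        centres = new_centres
    return labels, centres

X = np.random.default_rng(1).normal(size=(500, 4))      # placeholder data
labels, centres = l1_kmeans(X, k=3)
```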

Keywords: pattern recognition, global terrorism database, Manhattan distance, k-means clustering, terrorism data analysis

Procedia PDF Downloads 386
40956 Analysis of Path Nonparametric Truncated Spline Maximum Cubic Order in Farmers Loyalty Modeling

Authors: Adji Achmad Rinaldo Fernandes

Abstract:

Path analysis tests the relationships between variables through cause and effect. Before conducting further tests in path analysis, the assumption of linearity must be met. If the shape of the relationship is not linear and the shape of the curve is unknown, a nonparametric approach, such as the truncated spline, can be used. The purpose of this study is to estimate the function and obtain the best nonparametric truncated spline path model of linear, quadratic, and cubic orders with one and two knot points, and to determine the significance of the best function estimator in modeling farmer loyalty through the jackknife resampling method. This study uses secondary data collected through questionnaires from 100 farmers in Sumbawa Regency who use SP-36 subsidized fertilizer products. Based on the results of the analysis, the best truncated spline nonparametric path model is the quadratic order with two knots, with a coefficient of determination of 85.50%; the significance tests of the best truncated spline nonparametric path estimator show that all exogenous variables have a significant effect on the endogenous variables.
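
To make the idea of a truncated spline concrete, the sketch below fits a quadratic truncated power basis with two knots by least squares; the data, knot locations, and variable interpretations are hypothetical and not taken from the study.

```python
import numpy as np

def truncated_quadratic_basis(x, knots):
    """Design matrix [1, x, x^2, (x-k1)_+^2, (x-k2)_+^2, ...]."""
    cols = [np.ones_like(x), x, x ** 2]
    for k in knots:
        cols.append(np.clip(x - k, 0, None) ** 2)   # truncated power term
    return np.column_stack(cols)

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 10, 100))                 # e.g. a satisfaction score
y = np.sin(x / 2) + 0.1 * rng.normal(size=x.size)    # e.g. a loyalty response

knots = (3.0, 7.0)                                    # two knot points (assumed)
B = truncated_quadratic_basis(x, knots)
coef, *_ = np.linalg.lstsq(B, y, rcond=None)
y_hat = B @ coef
r_squared = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"R^2 = {r_squared:.3f}")
```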

Keywords: nonparametric path analysis, farmer loyalty, jackknife resampling, truncated spline

Procedia PDF Downloads 48
40955 Developing Rice Disease Analysis System on Mobile via iOS Operating System

Authors: Rujijan Vichivanives, Kittiya Poonsilp, Canasanan Wanavijit

Abstract:

This research aims to create mobile tools to analyze rice diseases quickly and easily. The principles of object-oriented software engineering and the Objective-C language were used as the software development methodology, and a decision tree technique was used as the analysis method. Application users can select the features of a rice disease or the colors appearing on the rice leaves, and the recognition results are displayed on the iOS mobile screen. After completing the software development, unit testing and integration testing were used to check program validity. In addition, three plant experts and forty farmers assessed the usability and benefit of this system. Overall user satisfaction was found to be at a good level (57%). The plant experts commented that various disease symptoms should be added to the database for more precise analysis results. For further research, it is suggested that an image processing system be developed as a tool that allows users to search for and analyze rice diseases more conveniently and with greater accuracy.
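
The abstract names a decision tree over symptom and leaf-colour features; the sketch below shows that idea in Python with scikit-learn (the app itself is written in Objective-C), using made-up symptom encodings and disease labels purely for illustration.

```python
from sklearn.tree import DecisionTreeClassifier

# Hypothetical training samples: [lesion_shape, lesion_colour, leaf_colour]
# encoded as small integers, with the diagnosed disease as the label.
X = [
    [0, 1, 2],  # spindle lesion, brown, yellowing leaf  -> blast
    [1, 0, 1],  # stripe lesion,  grey,  green leaf      -> leaf streak
    [0, 1, 1],
    [2, 2, 0],  # oval lesion,    dark,  pale leaf       -> brown spot
]
y = ["blast", "leaf_streak", "blast", "brown_spot"]

clf = DecisionTreeClassifier(max_depth=3).fit(X, y)
print(clf.predict([[0, 1, 2]]))   # classify a newly observed symptom pattern
```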

Keywords: rice disease, data analysis system, mobile application, iOS operating system

Procedia PDF Downloads 289
40954 Fuzzy Expert Approach for Risk Mitigation on Functional Urban Areas Affected by Anthropogenic Ground Movements

Authors: Agnieszka A. Malinowska, R. Hejmanowski

Abstract:

A number of European cities are strongly affected by ground movements caused by anthropogenic activities or post-anthropogenic metamorphosis. These are mainly water pumping, current mining operations, the collapse of post-mining underground voids, or mining-induced earthquakes. These activities lead to large- and small-scale ground displacements and ground ruptures. The ground movements occurring in urban areas can considerably affect the stability and safety of structures and infrastructure. The complexity of the ground deformation phenomenon in relation to the vulnerability of structures and infrastructure leads to considerable constraints in assessing the threat to those objects. However, increasing access to free software and satellite data could pave the way for developing new methods and strategies for environmental risk mitigation and management. Open source geographical information systems (OS GIS) may support data integration, management, and risk analysis. Recently developed methods, based on fuzzy logic and expert methods, for building and infrastructure damage risk assessment could be integrated into OS GIS. Those methods were verified by back analysis, proving their accuracy. Moreover, those methods could be supported by ground displacement observations. Based on freely available data from the European Space Agency and free software, ground deformation can be estimated. The main innovation presented in the paper is the application of open source software (OS GIS) for integrating the developed models and assessing the threat to urban areas. This approach is reinforced by analysis of ground movement based on free satellite data, which supports the verification of ground movement prediction models. Moreover, satellite data enable the mapping of ground deformation in urbanized areas. The developed models and methods have been implemented in one of the urban areas endangered by underground mining activity. Vulnerability maps supported by satellite ground movement observations would mitigate the hazards of land displacement in urban areas close to mines.

Keywords: fuzzy logic, open source geographic information science (OS GIS), risk assessment on urbanized areas, satellite interferometry (InSAR)

Procedia PDF Downloads 160
40953 A Mutually Exclusive Task Generation Method Based on Data Augmentation

Authors: Haojie Wang, Xun Li, Rui Yin

Abstract:

In order to solve memorization overfitting in the model-agnostic meta-learning (MAML) algorithm, a method of generating mutually exclusive tasks based on data augmentation is proposed. This method generates a mutually exclusive (mutex) task by mapping one feature of the data to multiple labels, so that the generated mutex task is inconsistent with the data distribution in the initial dataset. Because generating mutex tasks for all data would produce a large amount of invalid data and, in the worst case, lead to an exponential growth of computation, this paper also proposes a key data extraction method that extracts only part of the data to generate the mutex tasks. The experiments show that the method of generating mutually exclusive tasks can effectively solve memorization overfitting in the meta-learning MAML algorithm.
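
The paper's exact construction is not given in the abstract; the sketch below shows one common way to make meta-learning tasks mutually exclusive: the same inputs receive a different (permuted) label assignment in each generated task, so no single feature-to-label mapping can be memorized. All names and shapes are illustrative assumptions.

```python
import numpy as np

def make_mutex_tasks(x, y, n_tasks, n_classes, seed=0):
    """Generate tasks in which identical features map to different labels.

    Each task applies its own random permutation of the label set, so a model
    cannot solve every task by memorizing one fixed feature-to-label mapping.
    """
    rng = np.random.default_rng(seed)
    tasks = []
    for _ in range(n_tasks):
        perm = rng.permutation(n_classes)     # task-specific label shuffle
        tasks.append((x, perm[y]))
    return tasks

x = np.random.default_rng(1).normal(size=(32, 8))     # placeholder features
y = np.random.default_rng(2).integers(0, 5, size=32)  # placeholder labels
tasks = make_mutex_tasks(x, y, n_tasks=4, n_classes=5)
```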

Keywords: mutex task generation, data augmentation, meta-learning, text classification

Procedia PDF Downloads 144
40952 Developing Indicators in System Mapping Process Through Science-Based Visual Tools

Authors: Cristian Matti, Valerie Fowles, Eva Enyedi, Piotr Pogorzelski

Abstract:

The system mapping process can be defined as a knowledge service in which a team of facilitators, experts, and practitioners facilitates a guided conversation, enables the exchange of information, and supports an iterative curation process. System mapping processes rely on science-based tools to introduce and simplify a variety of components and concepts of socio-technical systems through metaphors while facilitating an interactive dialogue process to enable the design of co-created maps. System maps then work as “artifacts” that provide information and focus the conversation on specific areas around the defined challenge and the related decision-making process. Knowledge management facilitates the curation of the data gathered during the system mapping sessions through practices of documentation and subsequent knowledge co-production, for which common practices from data science are applied to identify new patterns, hidden insights, recurrent loops, and unexpected elements. This study presents empirical evidence on the application of these techniques to explore mechanisms by which visual tools provide guiding principles to portray system components, key variables, and types of data through the lens of climate change. In addition, data science facilitates the structuring of elements that allow the analysis of layers of information through affinity and clustering analysis and, therefore, the development of simple indicators for supporting the decision-making process. This paper addresses methodological and empirical elements of the horizontal learning process that integrates system mapping through visual tools, interpretation, cognitive transformation, and analysis. The process is designed to introduce practitioners to simple, iterative, and inclusive processes that create actionable knowledge and enable a shared understanding of the system in which they are embedded.

Keywords: indicators, knowledge management, system mapping, visual tools

Procedia PDF Downloads 195
40951 Application of Drones in Agriculture

Authors: Reza Taherlouei Safa, Mohammad Aboonajmi

Abstract:

Agriculture plays an essential role in providing food for the world's population. It also offers numerous other benefits to countries, including non-food products, transportation, and environmental balance. Precision agriculture, which employs advanced tools to monitor variability and manage inputs, can help achieve these benefits. The increasing demand for food security puts pressure on decision-makers to ensure sufficient food production worldwide. To support sustainable agriculture, unmanned aerial vehicles (UAVs) can be utilized to manage farms and increase yields. This paper aims to provide an understanding of UAV usage and to review its various applications in agriculture. Based on a comprehensive review of existing research, it was found that different sensors provide different analyses for agricultural applications. Therefore, the purpose of the project must be determined before using UAV technology in order to obtain better data quality and analysis. In conclusion, identifying a suitable sensor and UAV is crucial to gathering accurate data and producing precise analysis when using UAVs in agriculture.

Keywords: drone, precision agriculture, farmer income, UAV

Procedia PDF Downloads 82
40950 Challenges for IoT Adoption in India: A Study Based on Foresight Analysis for 2025

Authors: Shruti Chopra, Vikas Rao Vadi

Abstract:

In the era of the digital world, the Internet of Things (IoT) has been receiving significant attention. Its ubiquitous connectivity between humans, machine to machine (M2M), and machines to humans gives it the potential to transform society and establish an ecosystem that adds new dimensions to the economy of the country. This study has therefore attempted to identify, through a literature survey, the challenges that seem prevalent in IoT adoption in India. Further, data were collected from the opinions of experts to conduct a foresight analysis, and they were analyzed with the help of the scenario planning process using Micmac, Mactor, Multipol, and Smic-Prob. As a methodology, the study identified the relationships between variables through variable analysis using Micmac and actor analysis using Mactor; it then attempted to generate the entire field of possibilities in terms of hypotheses and construct various scenarios through Multipol. Lastly, the findings of the study include final scenarios that were selected using Smic-Prob by assigning probabilities to all the scenarios (including conditional probabilities). This study may help practitioners and policymakers remove the obstacles to successfully implementing IoT in India.

Keywords: Internet of Things (IoT), foresight analysis, scenario planning, challenges, policymaking

Procedia PDF Downloads 148
40949 Revolutionizing Traditional Farming Using Big Data/Cloud Computing: A Review on Vertical Farming

Authors: Milind Chaudhari, Suhail Balasinor

Abstract:

Due to massive deforestation and an ever-increasing population, the organic content of the soil is depleting at a much faster rate. Because of this, there is a high chance that the world's entire food production will drop by 40% in the next two decades. Vertical farming can help aid food production by leveraging big data and cloud computing to ensure plants are grown naturally, providing the optimum nutrients and sunlight by analyzing millions of data points. This paper outlines the most important parameters in vertical farming and how a combination of big data and AI helps in calculating and analyzing these millions of data points. Finally, the paper outlines how different organizations are controlling the indoor environment by leveraging big data to enhance food quantity and quality.

Keywords: big data, IoT, vertical farming, indoor farming

Procedia PDF Downloads 176
40948 SA-SPKC: Secure and Efficient Aggregation Scheme for Wireless Sensor Networks Using Stateful Public Key Cryptography

Authors: Merad Boudia Omar Rafik, Feham Mohammed

Abstract:

Data aggregation in wireless sensor networks (WSNs) provides a great reduction in energy consumption. The limited resources of sensor nodes make the choice of an encryption algorithm very important for securing data aggregation. Asymmetric cryptography involves large ciphertexts and heavy computations but solves, on the other hand, the key distribution problem of symmetric cryptography, which in turn provides smaller ciphertexts and faster computations. Also, recent research has shown that achieving end-to-end confidentiality and end-to-end integrity at the same time is a challenging task. In this paper, we propose SA-SPKC, a novel security protocol that addresses both security services for WSNs, and in which only the base station can verify the individual data and identify malicious nodes. Our scheme is based on stateful public key encryption (StPKE), which combines the best features of both kinds of encryption along with state in order to reduce the computation overhead. Our analysis

Keywords: secure data aggregation, wireless sensor networks, elliptic curve cryptography, homomorphic encryption

Procedia PDF Downloads 300
40947 Data Gathering and Analysis for Arabic Historical Documents

Authors: Ali Dulla

Abstract:

This paper introduces a new dataset (and the methodology used to generate it) based on a wide range of historical Arabic documents containing clean data and simple, homogeneous page layouts. The experiments are carried out on printed and handwritten documents obtained from important libraries such as the Qatar Digital Library, the British Library, and the Library of Congress. We have gathered and annotated 150 archival document images from different locations and time periods, based on documents from the 17th to the 19th century. The dataset comprises differing page layouts and degradations that challenge text line segmentation methods. Ground truth is produced using the Aletheia tool by PRImA and stored in an XML representation, in the PAGE (Page Analysis and Ground truth Elements) format. The dataset presented will be made easily available to researchers worldwide for research into the obstacles facing historical Arabic documents, such as the geometric correction of historical Arabic documents.

Keywords: dataset production, ground truth production, historical documents, arbitrary warping, geometric correction

Procedia PDF Downloads 169
40946 Automatic Tagging and Accuracy in Assamese Text Data

Authors: Chayanika Hazarika Bordoloi

Abstract:

This paper is an attempt to work on a highly inflectional language, Assamese. Assamese is also one of the national languages of India, and very little has been achieved in terms of computational research on it. Building a language processing tool for a natural language is not very smooth, as the standard and the language representation change at various levels. This paper presents the inflectional suffixes of Assamese verbs and shows how statistical tools, along with linguistic features, can improve tagging accuracy. Conditional random fields (the CRF tool) were used to automatically tag and train the text data; however, accuracy improved after linguistic features were fed into the training data. Assamese is a highly inflectional language; hence, it is challenging to standardize its morphology. Inflectional suffixes are used as a feature of the text data. In order to analyze the inflections of Assamese word forms, a list of suffixes is prepared, comprising all possible suffixes that the various categories can take. Assamese words can be classified into inflected classes (noun, pronoun, adjective, and verb) and un-inflected classes (adverb and particle). The corpus used for this morphological analysis contains a very large number of tokens. The corpus is a mixed corpus, and it has given satisfactory accuracy. The accuracy rate of the tagger gradually improved with the modified training data.
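
A minimal sketch of suffix-based features for CRF tagging, assuming the sklearn-crfsuite package (the paper does not name its CRF implementation); the sample sentence, tags, and suffix lengths are placeholders rather than data from the study's corpus.

```python
import sklearn_crfsuite

def word_features(sentence, i):
    """Simple features: the word itself plus its last 1-3 characters as
    suffix features, which matter for a highly inflectional language."""
    word = sentence[i]
    return {
        "word": word,
        "suffix1": word[-1:],
        "suffix2": word[-2:],
        "suffix3": word[-3:],
        "is_first": i == 0,
        "prev_word": sentence[i - 1] if i > 0 else "<BOS>",
    }

# Placeholder training data: tokenized sentences with their POS tags.
sentences = [["moi", "bhat", "khaisu"]]
tags = [["PRON", "NOUN", "VERB"]]

X = [[word_features(s, i) for i in range(len(s))] for s in sentences]
y = tags

crf = sklearn_crfsuite.CRF(algorithm="lbfgs", c1=0.1, c2=0.1, max_iterations=100)
crf.fit(X, y)
print(crf.predict(X))
```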

Keywords: CRF, morphology, tagging, tagset

Procedia PDF Downloads 195
40945 Parameter Identification Analysis in the Design of Rock Fill Dams

Authors: G. Shahzadi, A. Soulaimani

Abstract:

This research work aims to identify the physical parameters of the constitutive soil model in the design of a rockfill dam by inverse analysis. The best parameters of the constitutive soil model are those that minimize the objective function, defined as the difference between the measured and numerical results. The finite element code Plaxis has been utilized for the numerical simulation. Polynomial and neural network-based response surfaces have been generated to analyze the relationship between soil parameters and displacements. The performance of the surrogate models has been analyzed and compared by evaluating the root mean square error. A comparative study has been carried out based on objective functions and optimization techniques. Objective functions are categorized by considering measured data with and without uncertainty in the instruments, and are defined by the least squares method, which estimates the norm between the predicted displacements and the measured values. Hydro-Québec provided the data sets for the measured values of the Romaine-2 dam. Stochastic optimization, an approach that can overcome local minima and solve non-convex and non-differentiable problems with ease, is used to obtain an optimum value. Genetic Algorithm (GA), Particle Swarm Optimization (PSO), and Differential Evolution (DE) are compared for the minimization problem; although all these techniques take time to converge to an optimum value, PSO provided the best convergence and the best soil parameters. Overall, parameter identification analysis can be used effectively for the rockfill dam application and has the potential to become a valuable tool for geotechnical engineers in assessing dam performance and dam safety.
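
A minimal sketch of the PSO step in this kind of inverse analysis: particles are candidate soil-parameter vectors, and the objective is the RMSE between measured displacements and those predicted by a surrogate model. The `surrogate` function and all numbers below are stand-ins, not the study's actual response surface or data.

```python
import numpy as np

def pso_minimize(objective, bounds, n_particles=30, n_iter=200, seed=0):
    """Basic particle swarm optimization over box-constrained parameters."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    x = rng.uniform(lo, hi, size=(n_particles, len(bounds)))
    v = np.zeros_like(x)
    p_best, p_val = x.copy(), np.array([objective(p) for p in x])
    g_best = p_best[p_val.argmin()]
    for _ in range(n_iter):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = 0.7 * v + 1.5 * r1 * (p_best - x) + 1.5 * r2 * (g_best - x)
        x = np.clip(x + v, lo, hi)
        val = np.array([objective(p) for p in x])
        improved = val < p_val
        p_best[improved], p_val[improved] = x[improved], val[improved]
        g_best = p_best[p_val.argmin()]
    return g_best, p_val.min()

measured = np.array([0.012, 0.018, 0.025])            # placeholder displacements

def surrogate(params):                                 # stand-in response surface
    return params[0] * np.array([1.0, 1.5, 2.1]) + params[1]

def rmse(params):
    return np.sqrt(np.mean((surrogate(params) - measured) ** 2))

best_params, best_rmse = pso_minimize(rmse, bounds=[(0.0, 0.1), (0.0, 0.05)])
```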

Keywords: rockfill dam, parameter identification, stochastic analysis, regression, PLAXIS

Procedia PDF Downloads 146
40944 Study of Influencing Factors of Shrinking Cities Based on Factor Analysis: The Example of Halle Germany

Authors: Fang Yao, Minglei Chen

Abstract:

City shrinkage is one of the thorny problems that many European cities have to face nowadays. It is mainly expressed as a decrease in the population of these cities. Eastern Germany is one of the pioneers of European shrinking cities, with a long history of shrinkage. This study selects one representative shrinking city, Halle (Saale) in eastern Germany, as its research object, collecting and investigating nearly 20 years (1993-2010) of municipal data after the reunification of Germany. These data cover five dimensions (demographic, economic, social, spatial, and environmental) and a total of 16 eligible variables. Factor analysis is used to deal with these variables in order to assess the most important factors affecting the shrinkage of Halle. The factor analysis shows that there are three main factors determining the shrinkage of Halle, namely a demographic and economic factor, a social stability factor, and a city vitality factor. The three factors acted at different periods of Halle's shrinkage: from 1993 to 1997 the demographic and economic factor played an important role; from 1997 to 2004 social stability was significant to city shrinkage; since 2005 the city vitality factor has determined the shrinkage of Halle. In recent years, the shrinkage of Halle has mitigated, showing signs of a growing population. Thus, the city of Halle should focus on attaching more importance to the city vitality factor to prevent the city from shrinking. Meanwhile, the city should adopt a positive perspective, shifting from growth-oriented development to tapping the potential of shrinking cities. This method is expected to be applicable to further research and to other shrinking cities.
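
A minimal sketch of the kind of factor analysis described, using scikit-learn on a standardized indicator matrix; the 16-variable, three-factor setup mirrors the abstract, but the data values below are entirely hypothetical.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import FactorAnalysis

# Hypothetical yearly indicators (rows: years 1993-2010, cols: 16 variables).
rng = np.random.default_rng(0)
X = rng.normal(size=(18, 16))

X_std = StandardScaler().fit_transform(X)        # put variables on comparable scales
fa = FactorAnalysis(n_components=3, random_state=0).fit(X_std)

loadings = fa.components_                        # factor loadings (3 x 16)
scores = fa.transform(X_std)                     # yearly factor scores (18 x 3)
print(loadings.round(2))
```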

Keywords: demography, factor analysis, Halle, shrinking cities

Procedia PDF Downloads 415
40943 Data Challenges Facing Implementation of Road Safety Management Systems in Egypt

Authors: A. Anis, W. Bekheet, A. El Hakim

Abstract:

Implementing a Road Safety Management System (SMS) in a crowded developing country such as Egypt is a necessity. Establishing a sustainable SMS requires a comprehensive, reliable data system for all information pertinent to road crashes. This paper surveys the data available in Egypt and validates it for use in an SMS. The research provides some missing data and refers to the data that are unavailable in Egypt, looking forward to the contribution of the scientific society, the authorities, and the public in solving the problem of missing or unreliable crash data. The data required for implementing an SMS in Egypt are divided into three categories: the first is available data, such as fatality and injury rates, which this research shows may be inconsistent and unreliable; the second category is data that are not available but may be estimated, and an example of estimating vehicle cost is given in this research; the third is data that are not available and must be measured case by case, such as the functional and geometric properties of a facility. Some inquiries are provided in this research for the scientific society, such as how to improve the links among road safety stakeholders in order to obtain a consistent, unbiased, and reliable data system.

Keywords: road safety management system, road crash, road fatality, road injury

Procedia PDF Downloads 152
40942 Supervised Learning for Cyber Threat Intelligence

Authors: Jihen Bennaceur, Wissem Zouaghi, Ali Mabrouk

Abstract:

The major aim of cyber threat intelligence (CTI) is to provide sophisticated knowledge about cybersecurity threats to ensure internal and external safeguards against modern cyberattacks. Inaccurate, incomplete, outdated, and low-value threat intelligence is the main problem. Therefore, data analysis based on AI algorithms is one of the emergent solutions to overcome threat information-sharing issues. In this paper, we propose a supervised machine learning-based algorithm to improve threat information sharing by providing a sophisticated classification of cyber threats and data. Extensive simulations investigate the overall accuracy, precision, recall, F1-score, and support in order to validate the designed algorithm and to compare it with several supervised machine learning algorithms.
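
A minimal sketch of the evaluation described, training one supervised classifier and reporting precision, recall, F1-score, and support per class with scikit-learn; the feature matrix and threat labels are synthetic placeholders, and the paper's own algorithm is not reproduced here.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Synthetic stand-in for threat records: feature vectors and threat categories.
rng = np.random.default_rng(0)
X = rng.normal(size=(600, 12))
y = rng.choice(["malware", "phishing", "ddos"], size=600)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

# Precision, recall, F1-score, and support for each threat class.
print(classification_report(y_te, clf.predict(X_te)))
```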

Keywords: threat information sharing, supervised learning, data classification, performance evaluation

Procedia PDF Downloads 150
40941 Digital Twin of Real Electrical Distribution System with Real Time Recursive Load Flow Calculation and State Estimation

Authors: Anosh Arshad Sundhu, Francesco Giordano, Giacomo Della Croce, Maurizio Arnone

Abstract:

A Digital Twin (DT) is a technology that generates a virtual representation of a physical system or process, enabling real-time monitoring, analysis, and simulation. A DT of an Electrical Distribution System (EDS) can perform online analysis by integrating static and real-time data in order to show the current grid status, and predictions about the future status, to the Distribution System Operator (DSO), producers, and consumers. DT technology for an EDS also offers the DSO the opportunity to test hypothetical scenarios. This paper discusses the development of a DT of an EDS through a Smart Grid Controller (SGC) application, which is developed using open-source libraries and languages. The developed application can be integrated with the Supervisory Control and Data Acquisition (SCADA) system of any EDS to create the DT. The paper shows the performance of the tools developed inside the application, tested on a real EDS for grid observability, Smart Recursive Load Flow (SRLF) calculation, and state estimation of loads in MV feeders.
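
The SRLF method itself is not detailed in the abstract; as background, the sketch below implements a conventional backward/forward sweep load flow on a simple radial feeder in per unit, with placeholder loads and line impedances.

```python
import numpy as np

def backward_forward_sweep(p_load, q_load, r_line, x_line,
                           v_slack=1.0, tol=1e-6, max_iter=50):
    """Minimal backward/forward sweep load flow on a radial feeder (per unit).

    Buses are numbered 0..n with bus 0 as the slack (substation); line k
    connects bus k to bus k+1, and p_load[k], q_load[k] belong to bus k+1.
    """
    n = len(p_load)
    s_load = np.asarray(p_load) + 1j * np.asarray(q_load)
    z_line = np.asarray(r_line) + 1j * np.asarray(x_line)
    v = np.full(n + 1, v_slack, dtype=complex)
    for _ in range(max_iter):
        i_load = np.conj(s_load / v[1:])               # load currents at buses 1..n
        i_branch = np.cumsum(i_load[::-1])[::-1]       # backward sweep
        v_new = v.copy()
        for k in range(n):                             # forward sweep
            v_new[k + 1] = v_new[k] - z_line[k] * i_branch[k]
        if np.max(np.abs(v_new - v)) < tol:
            return v_new
        v = v_new
    return v

# Toy feeder: two line sections and loads at buses 1 and 2 (per unit values).
v = backward_forward_sweep(p_load=[0.02, 0.03], q_load=[0.01, 0.015],
                           r_line=[0.01, 0.02], x_line=[0.03, 0.04])
print(np.abs(v))
```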

Keywords: digital twin, distributed energy resources, remote terminal units, supervisory control and data acquisition system, smart recursive load flow

Procedia PDF Downloads 112
40940 Sparsity-Based Unsupervised Unmixing of Hyperspectral Imaging Data Using Basis Pursuit

Authors: Ahmed Elrewainy

Abstract:

Mixing in hyperspectral imaging occurs due to the low spatial resolution of the cameras used. The pure materials (“endmembers”) present in the scene share the spectral pixels in different amounts called “abundances”. Unmixing the data cube is an important task for determining the endmembers present in the cube for the analysis of these images. Unsupervised unmixing is done with no prior information about the given data cube. Sparsity is one of the recent approaches used in source recovery and unmixing techniques. The l1-norm optimization problem “basis pursuit” can be used as a sparsity-based approach to solve this unmixing problem, where the endmembers are assumed to be sparse in an appropriate domain known as a dictionary. This optimization problem is solved using the proximal method of iterative thresholding. The l1-norm basis pursuit optimization problem, as a sparsity-based unmixing technique, was used to unmix real and synthetic hyperspectral data cubes.
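
A minimal sketch of the iterative (soft-)thresholding proximal method for an l1-regularized basis pursuit formulation: D is a dictionary (e.g., candidate endmember spectra), y is one pixel's spectrum, and the recovered sparse vector plays the role of the abundances. The dimensions and regularization weight are placeholders.

```python
import numpy as np

def ista(D, y, lam=0.05, n_iter=500):
    """Iterative soft-thresholding for min_x 0.5*||D x - y||^2 + lam*||x||_1."""
    step = 1.0 / np.linalg.norm(D, 2) ** 2        # 1 / Lipschitz constant
    x = np.zeros(D.shape[1])
    for _ in range(n_iter):
        z = x - step * D.T @ (D @ x - y)          # gradient step on the data term
        x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # prox of l1
    return x

rng = np.random.default_rng(0)
D = rng.normal(size=(100, 20))                    # dictionary: 20 candidate spectra
true_abund = np.zeros(20)
true_abund[[3, 7]] = [0.6, 0.4]                   # only two materials present
y = D @ true_abund + 0.01 * rng.normal(size=100)  # observed mixed pixel
print(np.round(ista(D, y), 2))
```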

Keywords: basis pursuit, blind source separation, hyperspectral imaging, spectral unmixing, wavelets

Procedia PDF Downloads 197
40939 A Design for Customer Preferences Model by Cluster Analysis of Geometric Features and Customer Preferences

Authors: Yuan-Jye Tseng, Ching-Yen Chen

Abstract:

In the design cycle, a main design task is to determine the external shape of the product. The external shape of a product is one of the key factors that can affect customers' preferences, linking to the motivation to buy the product, especially in the case of a consumer electronic product such as a mobile phone. The relationship between the external shape and the customer preferences needs to be studied to enhance the customer's purchase desire and action. In this research, a design for customer preferences model is developed for investigating the relationships between the external shape and the customer preferences of a product. In the first stage, the names of the geometric features are collected and evaluated from the data of the specified internet web pages using the developed text miner. The key geometric features can be determined if the number of occurrences on the web pages is relatively high. For each key geometric feature, the numerical values are explored using the text miner to collect the internet data from the web pages. In the second stage, a cluster analysis model is developed to evaluate the numerical values of the key geometric features to divide the external shapes into several groups. Several design suggestion cases can be proposed, for example, a large model, a mid-size model, and a mini model, for designing a mobile phone. A customer preference index is developed by evaluating the numerical data of each of the key geometric features of the design suggestion cases. The design suggestion case with the top ranking of the customer preference index can be selected as the final design of the product. In this paper, an example product of a notebook computer is illustrated. It shows that the external shape of a product can be used to drive customer preferences. The presented design for customer preferences model is useful for determining a suitable external shape of the product to increase customer preferences.
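
A minimal sketch of the second-stage clustering of geometric feature values into shape groups, using scikit-learn KMeans; the feature columns (e.g., screen width, body thickness, mass) and the three-group choice are illustrative assumptions, not the study's mined data.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

# Hypothetical mined values per product: [screen_width_mm, thickness_mm, mass_g]
features = np.array([
    [70, 8.1, 190], [68, 7.9, 180], [77, 8.8, 220],
    [75, 8.5, 210], [62, 7.2, 150], [60, 7.0, 145],
])

X = StandardScaler().fit_transform(features)
groups = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print(groups)        # e.g., large / mid-size / mini shape groups
```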

Keywords: cluster analysis, customer preferences, design evaluation, design for customer preferences, product design

Procedia PDF Downloads 191
40938 Induction Motor Stator Fault Analysis Using Phase-Angle and Magnitude of the Line Currents Spectra

Authors: Ahmed Hamida Boudinar, Noureddine Benouzza, Azeddine Bendiabdellah, Mohamed El Amine Khodja

Abstract:

This paper describes a new diagnosis approach for identifying progressive stator winding inter-turn short-circuit faults in induction motors. The approach is based on simple monitoring of the combined information, related to both magnitude and phase-angle, obtained from the fundamental component through frequency analysis of the three line currents. In addition, to simplify the interpretation and analysis of the data, a new graphical tool based on a triangular representation is suggested. This representation, depending on its size, makes it possible to visualize, in a simple and clear manner, the existence of a stator inter-turn short-circuit fault and to discriminate it from a healthy stator. Experimental results show well the benefit and effectiveness of the proposed approach.
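
A minimal sketch of extracting the magnitude and phase-angle of the fundamental component from the three sampled line currents with an FFT; the sampling rate, supply frequency, and current waveforms below are placeholders rather than measured motor data.

```python
import numpy as np

def fundamental_phasor(current, fs, f0=50.0):
    """Magnitude and phase-angle of the supply-frequency component of one current."""
    n = len(current)
    spectrum = np.fft.rfft(current * np.hanning(n))
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    k = np.argmin(np.abs(freqs - f0))          # FFT bin nearest the fundamental
    return np.abs(spectrum[k]), np.angle(spectrum[k])

fs, f0 = 10_000, 50.0
t = np.arange(0, 0.2, 1 / fs)
ia = 10.0 * np.cos(2 * np.pi * f0 * t)                     # placeholder currents
ib = 10.0 * np.cos(2 * np.pi * f0 * t - 2 * np.pi / 3)
ic = 9.2 * np.cos(2 * np.pi * f0 * t + 2 * np.pi / 3)      # slight unbalance (fault-like)

for name, i in (("ia", ia), ("ib", ib), ("ic", ic)):
    mag, ang = fundamental_phasor(i, fs, f0)
    print(name, round(mag, 1), round(np.degrees(ang), 1))
```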

Keywords: induction motor, magnitude, phase-angle, spectral analysis, stator fault

Procedia PDF Downloads 361
40937 A Stochastic Diffusion Process Based on the Two-Parameters Weibull Density Function

Authors: Meriem Bahij, Ahmed Nafidi, Boujemâa Achchab, Sílvio M. A. Gama, José A. O. Matos

Abstract:

Stochastic modeling concerns the use of probability to model real-world situations in which uncertainty is present. The purpose of stochastic modeling is therefore to estimate the probability of outcomes within a forecast, i.e., to be able to predict which conditions or decisions might occur under different situations. In the present study, we present a model of a stochastic diffusion process based on the bi-Weibull distribution function (its trend is proportional to the bi-Weibull probability density function). In general, the Weibull distribution has the ability to assume the characteristics of many different types of distributions. This has made it very popular among engineers and quality practitioners, who consider it the most commonly used distribution for studying problems such as modeling reliability data, accelerated life testing, and maintainability modeling and analysis. In this work, we start by obtaining the probabilistic characteristics of this model, such as the explicit expression of the process, its trends, and its distribution, by transforming the diffusion process into a Wiener process as shown in Ricciardi's theorem. Then, we develop the statistical inference for this model using the maximum likelihood methodology. Finally, we analyse, with simulated data, the computational problems associated with the parameters, an issue of great importance in the application to real data, using convergence analysis methods. Overall, the use of a stochastic model reflects only a pragmatic decision on the part of the modeler. Given the data that are available and the universe of models known to the modeler, this model represents the best currently available description of the phenomenon under consideration.
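
A minimal sketch of simulating sample paths of a diffusion whose drift follows a two-parameter Weibull density, as one would do when studying parameter estimation with simulated data; the specific drift/diffusion form, parameter values, and step size below are illustrative assumptions rather than the paper's exact model.

```python
import numpy as np
from scipy.stats import weibull_min

def simulate_paths(x0=1.0, shape=1.5, scale=2.0, sigma=0.2,
                   t_max=5.0, dt=0.01, n_paths=100, seed=0):
    """Euler-Maruyama simulation of a diffusion whose drift is proportional
    to the two-parameter Weibull density (illustrative assumption)."""
    rng = np.random.default_rng(seed)
    n_steps = int(t_max / dt)
    t = np.linspace(dt, t_max, n_steps)
    x = np.full(n_paths, x0)
    paths = np.empty((n_steps, n_paths))
    for k, tk in enumerate(t):
        drift = weibull_min.pdf(tk, shape, scale=scale) * x    # trend term
        x = x + drift * dt + sigma * x * np.sqrt(dt) * rng.standard_normal(n_paths)
        paths[k] = x
    return t, paths

t, paths = simulate_paths()
print(paths[-1].mean())   # mean terminal value across simulated paths
```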

Keywords: diffusion process, discrete sampling, likelihood estimation method, simulation, stochastic diffusion process, trend functions, bi-parameter Weibull density function

Procedia PDF Downloads 309
40936 Genomic Diversity and Relationship among Arabian Peninsula Dromedary Camels Using Full Genome Sequencing Approach

Authors: H. Bahbahani, H. Musa, F. Al Mathen

Abstract:

The dromedary camel (Camelus dromedarius) is a single-humped, even-toed ungulate populating the African Sahara, the Arabian Peninsula, and Southwest Asia. The genome of this desert-adapted species has been only minimally investigated, using autosomal microsatellite and mitochondrial DNA markers. In this study, the genomes of 33 dromedary camel samples from different parts of the Arabian Peninsula were sequenced using the Illumina Next Generation Sequencing (NGS) platform. These data were combined with Genotyping-by-Sequencing (GBS) data from African (Sudanese) dromedaries to investigate the genomic relationship between African and Arabian Peninsula dromedary camels. Principal Component Analysis (PCA) and average genome-wide admixture analysis were conducted on these data to address the objectives of the study. Both analyses revealed a phylogeographic distinction between the two camel populations. However, no breed-wise genetic classification was revealed among the African (Sudanese) camel breeds. The Arabian Peninsula camel populations also show higher heterozygosity than the Sudanese camels. The results of this study illuminate the evolutionary history and migration of African dromedary camels from their center of domestication in the southern Arabian Peninsula. These outputs help scientists to further understand the evolutionary history of dromedary camels, which might have an impact on conserving the favorable genetics of this species.

Keywords: dromedary, genotyping-by-sequencing, Arabian Peninsula, Sudan

Procedia PDF Downloads 206
40935 Analyzing the Risk Based Approach in General Data Protection Regulation: Basic Challenges Connected with Adapting the Regulation

Authors: Natalia Kalinowska

Abstract:

The adoption of the General Data Protection Regulation (GDPR) concluded four years of work by the European Commission in this area in the European Union. Considering the far-reaching changes that will be introduced by the GDPR, the European legislator envisaged a two-year transitional period: member states and companies have to prepare for the new regulation by 25 May 2018. The idea that represents a new way of looking at data protection in the European Union is the risk-based approach. So far, as a result of the implementation of Directive 95/46/WE, many European countries (including Poland) have adopted very specific regulations specifying technical and organisational security measures; e.g., the Polish implementing rules even indicate how long a password should be. Under the new approach, from May 2018 controllers and processors will be obliged to apply security measures adequate to the level of risk associated with the specific data processing. Risk in the GDPR should be interpreted as the likelihood of a breach of the rights and freedoms of the data subject. According to Recital 76, the likelihood and severity of the risk to the rights and freedoms of the data subject should be determined by reference to the nature, scope, context, and purposes of the processing. The GDPR does not indicate which security measures should be applied; the recitals give only examples such as anonymization or encryption. It depends on the controller's decision which security measures the controller considers sufficient, and the controller will be responsible if these measures are not sufficient or if the identification of the risk level is incorrect. The data protection regulation indicates a few levels of risk. Recital 76 mentions risk and high risk, but some lawyers think that there is one more category: low risk/no risk. Low risk/no risk data processing is a situation in which the processing is unlikely to result in a risk to the rights and freedoms of natural persons. The GDPR mentions types of data processing for which a controller does not have to evaluate the level of risk because they have been classified as "high risk" processing, e.g., large-scale processing of special categories of data or processing using new technologies. The methodology includes an analysis of legal regulations, e.g., the GDPR and the Polish Act on the Protection of Personal Data, as well as ICO guidelines and articles concerning the risk-based approach in the GDPR. The main conclusion is that an appropriate risk assessment is the key to keeping data safe and avoiding financial penalties. On the one hand, this approach seems to be more equitable, not only for controllers and processors but also for data subjects; on the other hand, it increases controllers' uncertainty in the assessment, which could have a direct impact on incorrect data protection and potential responsibility for infringement of the regulation.

Keywords: general data protection regulation, personal data protection, privacy protection, risk based approach

Procedia PDF Downloads 253
40934 Model Order Reduction for Frequency Response and Effect of Order of Method for Matching Condition

Authors: Aref Ghafouri, Mohammad javad Mollakazemi, Farhad Asadi

Abstract:

In this paper, a model order reduction method is used for the approximation of linear and nonlinear aspects of some experimental data. This method can be used to obtain an offline reduced model that approximates the experimental data, reproduces the behaviour and order of the system, and matches the experimental data at some frequency ratios. In this study, the method is compared on different experimental data sets, and the influence of the choice of reduction order on obtaining the best and sufficient matching condition for following the data is investigated in terms of the imaginary and real parts of the frequency response curve. Finally, the effect of the reduction order, an important parameter, on nonlinear experimental data is explained further.

Keywords: frequency response, order of model reduction, frequency matching condition, nonlinear experimental data

Procedia PDF Downloads 404
40933 The Solvent Extraction of Uranium, Plutonium and Thorium from Aqueous Solution by 1-Hydroxyhexadecylidene-1,1-Diphosphonic Acid

Authors: M. Bouhoun Ali, A. Y. Badjah Hadj Ahmed, M. Attou, A. Elias, M. A. Didi

Abstract:

In this paper, the solvent extraction of uranium(VI), plutonium(IV), and thorium(IV) from aqueous solutions using 1-hydroxyhexadecylidene-1,1-diphosphonic acid (HHDPA) in treated kerosene has been investigated. The HHDPA was previously synthesized and characterized by FT-IR, 1H NMR, and 31P NMR spectroscopy and elemental analysis. The effects of contact time, initial pH, initial metal concentration, aqueous/organic phase ratio, extractant concentration, and temperature on the extraction process have been studied. Empirical modelling was performed using a 2⁵ full factorial design, and regression equations for metal extraction were determined from the data. The conventional log-log analysis of the extraction data reveals that the ratios of extractant to extracted U(VI), Pu(IV), and Th(IV) are 1:1, 1:2, and 1:2, respectively. The thermodynamic parameters showed that the extraction process was exothermic and spontaneous. The obtained optimal parameters were applied to real effluents containing uranium(VI), plutonium(IV), and thorium(IV) ions.

Keywords: solvent extraction, uranium, plutonium, thorium, 1-hydroxyhexadecylidene-1,1-diphosphonic acid, aqueous solution

Procedia PDF Downloads 288