Search results for: distributed process mining
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 17252

16922 Real-Time Mine Safety System with the Internet of Things

Authors: Şakir Bingöl, Bayram İslamoğlu, Ebubekir Furkan Tepeli, Fatih Mehmet Karakule, Fatih Küçük, Merve Sena Arpacık, Mustafa Taha Kabar, Muhammet Metin Molak, Osman Emre Turan, Ömer Faruk Yesir, Sıla İnanır

Abstract:

This study introduces an IoT-based real-time safety system for mining, addressing global safety challenges. The wearable device, seamlessly integrated into miners' jackets, employs LoRa technology for communication and offers real-time monitoring of vital health and environmental data. Unique features include an LCD panel for immediate information display and sound-based location tracking for emergency response. The methodology involves sensor integration, data transmission, and ethical testing. Validation confirms the system's effectiveness in diverse mining scenarios. The study calls for ongoing research to adapt the system to different mining contexts, emphasizing its potential to significantly enhance safety standards in the industry.

Keywords: mining safety, internet of things, wearable technology, LoRa, RFID tracking, real-time safety system, safety alerts, safety measures

Procedia PDF Downloads 28
16921 IoT-Based Process Model for Heart Monitoring Process

Authors: Dalyah Y. Al-Jamal, Maryam H. Eshtaiwi, Liyakathunisa Syed

Abstract:

Connecting health services with technology is in high demand as people's health conditions worsen day by day. Engaging new technologies such as the Internet of Things (IoT) in medical services can enhance patient care. Specifically, patients suffering from chronic diseases, such as cardiac patients, need special care and monitoring. Some efforts were previously made to automate and improve patient monitoring systems; however, these efforts have limitations and lack the real-time capability needed for chronic diseases. In this paper, an improved process model for a patient monitoring system specialized for cardiac patients is presented. A survey was distributed and interviews were conducted to gather the requirements needed to improve the cardiac patient monitoring system. The Business Process Model and Notation (BPMN) language was used to model the proposed process. The proposed system uses IoT technology to assist doctors in remotely monitoring and following up with their heart patients in real time. In order to validate the effectiveness of the proposed solution, simulation analysis was performed using the Bizagi Modeler tool. Analysis results show performance improvements in the heart monitoring process. For future work, the authors suggest extending the proposed system to cover all chronic diseases.

Keywords: IoT, process model, remote patient monitoring system, smart watch

Procedia PDF Downloads 303
16920 Exploring the Correlation between Population Distribution and Urban Heat Island under Urban Data: Taking Shenzhen Urban Heat Island as an Example

Authors: Wang Yang

Abstract:

Shenzhen is a modern city of China's reform and opening-up policy, whose urban morphology has developed under the administration of the Chinese government. The city's planning paradigm is primarily shaped by spatial structure and human behavior, and the urban agglomeration is subjectively divided into several groups and centers; under this paradigm, the laws of the city's own development have tended to be neglected. With the continuous development of the Internet, big data technology has been introduced in China, and data mining and data analysis have become important tools in municipal research. Data mining improves data acquisition and cleaning for sources such as business data, traffic data, and population data. Before data mining, government data were collected by traditional means and then analyzed in city-relationship research, delaying the timeliness of urban development studies; this matters especially for the contemporary city, where Internet-based data are updated very quickly. Mining the city's points of interest (POIs) serves as a data source affecting city design, while satellite remote sensing is used as a reference object; analyzing the city from both directions breaks the administrative paradigm of government and restores urban research. Therefore, the use of data mining in urban analysis is very important. Satellite remote sensing data of Shenzhen from July 2018, measured by the MODIS sensor, were used to perform land surface temperature inversion and to analyze the heat island distribution of Shenzhen. This article acquired and classified data on Shenzhen using data crawler technology. Data on the Shenzhen heat island and on points of interest were simulated and analyzed on a GIS platform to discover the main features of the distribution of functionally equivalent areas. Shenzhen extends in an east-west direction, and the city's main streets follow the direction of city development; the functional areas of the city are therefore also distributed east to west. The urban heat island can be expressed as a heat map that follows the functional urban areas, and regional POIs show a corresponding pattern. The results clearly show that the distribution of the urban heat island and the distribution of urban POIs are in one-to-one correspondence. The urban heat island is primarily influenced by the properties of the underlying surface, setting aside the impact of the urban climate. Taking urban POIs as the object of analysis, the distribution of POIs and population aggregation are closely connected, so the distribution of the population corresponds with the distribution of the urban heat island.

Keywords: POI, satellite remote sensing, the population distribution, urban heat island thermal map

Procedia PDF Downloads 85
16919 Mining Coupled to Agriculture: Systems Thinking in Scalable Food Production

Authors: Jason West

Abstract:

Low profitability in agricultural production, along with increasing scrutiny of environmental effects, is limiting food production at scale. In contrast, the mining sector offers access to resources, including energy, water, transport, and chemicals, for food production at low marginal cost. Scalable agricultural production can thus benefit from the nexus of resources (water, energy, transport) offered by mining activity in remote locations. A decision-support bioeconomic model for controlled-environment vertical farms was used, built from four submodels: crop structure, nutrient requirements, resource-crop integration, and economics, which escalate to a macro mathematical model. A demonstrable dynamic-systems framework is needed to prove that productive outcomes are feasible. We demonstrate a generalized bioeconomic macro model for controlled-environment production systems at mine sites using systems dynamics modeling methodology. Despite the complexity of bioeconomic modelling of resource-agriculture dynamic processes and interactions, the economic potential is greater than general economic models would assume. Scalability of production as an input becomes a key success feature.

Keywords: crop production systems, mathematical model, mining, agriculture, dynamic systems

Procedia PDF Downloads 54
16918 Q-Map: Clinical Concept Mining from Clinical Documents

Authors: Sheikh Shams Azam, Manoj Raju, Venkatesh Pagidimarri, Vamsi Kasivajjala

Abstract:

Over the past decade, there has been a steep rise in data-driven analysis in major areas of medicine, such as clinical decision support systems, survival analysis, patient similarity analysis, and image analytics. Most of the data in the field are well structured and available in numerical or categorical formats that can be used for experiments directly. But on the opposite end of the spectrum, there exists a wide expanse of data that is intractable for direct analysis owing to its unstructured nature; it is found in discharge summaries, clinical notes, and procedural notes, which are in human-written narrative format and have neither a relational model nor any standard grammatical structure. An important step in utilizing these texts for such studies is to transform and process the data to retrieve structured information from the haystack of irrelevant data using information retrieval and data mining techniques. To address this problem, the authors present Q-Map, a simple yet robust system that can sift through massive datasets with unregulated formats to retrieve structured information aggressively and efficiently. It is backed by an effective mining technique based on a string-matching algorithm indexed on curated knowledge sources, which is both fast and configurable. The authors also briefly examine its comparative performance against MetaMap, one of the most reputed tools for medical concept retrieval, and present the advantages the former displays over the latter.
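
The retrieval core described here is, at heart, a fast dictionary lookup over curated vocabularies. A minimal sketch of that idea, not the authors' implementation: the terms, concept codes, and greedy longest-match policy below are illustrative assumptions.

```python
# Dictionary-backed clinical concept matching (toy terms and invented codes).
CONCEPTS = {
    "myocardial infarction": "C0027051",
    "diabetes mellitus": "C0011849",
    "hypertension": "C0020538",
    "chest pain": "C0008031",
}
MAX_WORDS = max(len(t.split()) for t in CONCEPTS)

def extract_concepts(text):
    """Greedy longest-match lookup of curated terms in free-text notes."""
    tokens = text.lower().replace(",", " ").replace(".", " ").replace(";", " ").split()
    found, i = [], 0
    while i < len(tokens):
        for n in range(min(MAX_WORDS, len(tokens) - i), 0, -1):
            phrase = " ".join(tokens[i:i + n])
            if phrase in CONCEPTS:
                found.append((phrase, CONCEPTS[phrase]))
                i += n
                break
        else:
            i += 1
    return found

print(extract_concepts("Patient denies chest pain; history of diabetes mellitus."))
```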

Keywords: information retrieval, unified medical language system, syntax based analysis, natural language processing, medical informatics

Procedia PDF Downloads 110
16917 Reuse of Huge Industrial Areas

Authors: Martina Perinkova, Lenka Kolarcikova, Marketa Twrda

Abstract:

Brownfields are among the most important problems that today's cities must solve. This article describes the comprehensive transformation of the post-industrial area of the former ironworks and national cultural heritage site Lower Vítkovice. The city of Ostrava used to be an industrial powerhouse of the Czechoslovak Republic, especially in coal mining and iron production; the decline of industrial production and mining in the 1980s left many unused areas of former factories, generally brownfields and backfields. Since the late 1990s, we have been observing how city officials and private entities seek to remedy this situation. Regeneration of brownfields is a very expensive and long-term process. The area has now been rebuilt for tourists and residents of the city as an entertainment, cultural, and social center. It was necessary to reconstruct the industrial monuments, and equally important was the construction of new buildings, which helped in the reuse of the entire complex. This is a unique example of the transformation of technical monuments and the completion of the necessary new buildings, so that the area could start working again and reintegrate back into the urban system.

Keywords: brown fields, conversion, historical and industrial buildings, reconstruction

Procedia PDF Downloads 301
16916 Analytical Study of Data Mining Techniques for Software Quality Assurance

Authors: Mariam Bibi, Rubab Mehboob, Mehreen Sirshar

Abstract:

Satisfying customer requirements is the ultimate goal of producing or developing any product, and the quality of the product is decided on the basis of the level of customer satisfaction. This survey reports different techniques that enhance product quality through software defect prediction and by locating missing software requirements. Some mining techniques have been proposed to assess individual performance indicators in collaborative environments in order to reduce errors at the individual level. The basic intention is to produce a product with zero or few defects, thereby achieving the best possible quality. In the analysis, techniques such as genetic algorithms, artificial neural networks, classification and clustering techniques, and decision trees are studied. The analysis reveals that these techniques have contributed much to the improvement and enhancement of product quality.

Keywords: data mining, defect prediction, missing requirements, software quality

Procedia PDF Downloads 434
16915 Hybrid Reliability-Similarity-Based Approach for Supervised Machine Learning

Authors: Walid Cherif

Abstract:

Data mining has, over recent years, seen big advances because of the spread of the Internet, which generates a tremendous volume of data every day, and because of immense advances in the technologies that facilitate the analysis of these data. In particular, classification techniques are a subdomain of data mining that determines to which group each data instance belongs within a given dataset; they are used to classify data into different classes according to desired criteria. Generally, a classification technique is either statistical or machine-learning based, and each type has its own limits. Nowadays, data are becoming increasingly heterogeneous; consequently, current classification techniques encounter many difficulties. This paper defines new measure functions to quantify the resemblance between instances and then combines them in a new approach that differs from existing algorithms in its reliability computations. Results of the proposed approach exceeded the most common classification techniques, with an F-measure exceeding 97% on the Iris dataset.
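
The paper's reliability computations and similarity functions are its own; as a hedged point of reference only, a plain similarity-based (nearest-neighbour) classifier on the same Iris dataset, scored with the same F-measure, can be sketched as:

```python
# Baseline similarity-based classification on Iris (not the authors' algorithm).
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import f1_score

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

clf = KNeighborsClassifier(n_neighbors=5)  # votes among the most similar instances
clf.fit(X_tr, y_tr)
print("macro F-measure:", f1_score(y_te, clf.predict(X_te), average="macro"))
```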

Keywords: data mining, knowledge discovery, machine learning, similarity measurement, supervised classification

Procedia PDF Downloads 439
16914 Design of an Improved Distributed Framework for Intrusion Detection System Based on Artificial Immune System and Neural Network

Authors: Yulin Rao, Zhixuan Li, Burra Venkata Durga Kumar

Abstract:

Intrusion detection refers to monitoring the actions of internal and external intruders on a system and detecting behaviours that violate security policies in real time. In intrusion detection, there has been much discussion about the application of neural network technology and artificial immune systems (AIS). However, many solutions use static methods (signature-based and stateful protocol analysis) or centralized intrusion detection systems (CIDS), which are unsuitable for real-time intrusion detection systems that need to process large amounts of data and detect unknown intrusions. This article proposes a framework for a distributed intrusion detection system (DIDS) with multiple agents, based on the concepts of AIS and neural network technology, to detect anomalies and intrusions. In this framework, multiple agents are assigned to each host and work together, improving the system's detection efficiency and robustness. The trainer agent in the central server of the framework uses an artificial neural network (ANN), rather than the negative selection algorithm of AIS, to generate mature detectors, which can distinguish between self-files and non-self-files after learning. Our analyzer agents use genetic algorithms to generate memory-cell detectors. This kind of detector effectively reduces false positive and false negative errors and acts quickly on known intrusions.
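
A minimal sketch of the trainer-agent idea, with synthetic feature vectors standing in for real traffic: an ANN learns to separate "self" (normal) from "non-self" (anomalous) inputs.

```python
# Self/non-self detector training sketch (synthetic data, illustrative only).
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
normal = rng.normal(0.0, 1.0, size=(500, 10))   # "self" traffic features
attack = rng.normal(3.0, 1.5, size=(100, 10))   # "non-self" deviations
X = np.vstack([normal, attack])
y = np.array([0] * 500 + [1] * 100)

detector = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
detector.fit(X, y)                               # the "mature detector"
print("flagged as non-self:", detector.predict(attack[:5]))
```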

Keywords: artificial immune system, distributed artificial intelligence, multi-agent, intrusion detection system, neural network

Procedia PDF Downloads 83
16913 Emergence of Information Centric Networking and Web Content Mining: A Future Efficient Internet Architecture

Authors: Sajjad Akbar, Rabia Bashir

Abstract:

With the growth of the number of users, Internet usage has evolved, and due to its key design principle there has been an incredible expansion in its size. This tremendous growth of the Internet has brought new applications (mobile video and cloud computing) as well as new user requirements, i.e., a content distribution environment, mobility, ubiquity, security, and trust. Users are more interested in contents than in their communicating peer nodes. The current Internet architecture is a host-centric networking approach, which is not well suited to these types of applications. With the growing use of multiple interactive applications, the host-centric approach is considered less efficient, as it depends on physical location; for this reason, Information Centric Networking (ICN) is considered the potential future Internet architecture. It is an approach that introduces uniquely named data as a core Internet principle, uses a receiver-oriented rather than sender-oriented approach, and introduces a naming-based information system at the network layer. Although ICN is considered a future Internet architecture, it has drawn much criticism, mainly concerning how ICN will manage the most relevant content. Web content mining (WCM) approaches can help with the appropriate data management of ICN. To address this issue, this paper contributes by (i) discussing multiple ICN approaches, (ii) analyzing different web content mining approaches, and (iii) creating a new Internet architecture by merging ICN and WCM to solve the data management issues of ICN. From ICN, Content-Centric Networking (CCN) is selected for the new architecture, whereas an agent-based approach from web content mining is selected to find the most appropriate data.

Keywords: agent based web content mining, content centric networking, information centric networking

Procedia PDF Downloads 446
16912 Voltage Profile Enhancement in the Unbalanced Distribution Systems during Fault Conditions

Authors: K. Jithendra Gowd, Ch. Sai Babu, S. Sivanagaraju

Abstract:

Electric power systems are exposed daily to service interruptions, mainly due to faults and accidental human interference. Short-circuit currents are responsible for several types of disturbances in power systems: fault currents are high, and voltages are reduced at the time of a fault. This paper presents two suitable methods, consideration of fault resistance and of distributed generation, which are implemented and analyzed for the enhancement of the voltage profile during fault conditions. Fault resistance is a critical parameter of power system operation due to its stochastic nature; if not considered, this parameter may interfere with fault analysis studies and protection scheme efficiency. The effect of distributed generation is also considered. The proposed methods are tested on the IEEE 37-bus test system, and the results are compared.

Keywords: distributed generation, electrical distribution systems, fault resistance

Procedia PDF Downloads 493
16911 Application of Association Rule Using Apriori Algorithm for Analysis of Industrial Accidents in 2013-2014 in Indonesia

Authors: Triano Nurhikmat

Abstract:

Along with the progress of science and technology, the development of the industrialized world in Indonesia has taken place very rapidly, leading to a faster industrialization of Indonesian society with the establishment of diverse companies and workplaces. Industrial development relates to the activity of the worker, and these work activities do not exclude the possibility of an accident affecting either the workers or a construction project. Causes of industrial accidents include electrical damage, faulty work procedures, and technical errors. The association rule method is one of the main techniques in data mining and the most common form used in finding patterns in data collections. This research investigates the associations between occurrences of industrial accidents. Using association rule analysis, the patterns obtained after two iterations (2-large itemsets) associate each accident factor with a location: industrial accidents in West Jakarta caused by electrical damage occur with support = 0.2 and confidence = 1, and the reverse pattern holds with support = 0.2 and confidence = 0.75.
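
The support and confidence quantities quoted above are the standard Apriori measures. A toy illustration with invented accident transactions (the resulting numbers will not match the study's):

```python
# Support/confidence computation behind Apriori (invented transactions;
# each set lists the attributes of one recorded accident).
transactions = [
    {"West Jakarta", "electrical damage"},
    {"West Jakarta", "electrical damage"},
    {"East Jakarta", "work procedure"},
    {"West Jakarta", "technical error"},
    {"East Jakarta", "electrical damage"},
]

def support(itemset):
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(antecedent, consequent):
    return support(antecedent | consequent) / support(antecedent)

rule = ({"electrical damage"}, {"West Jakarta"})
print("support:", support(rule[0] | rule[1]))      # P(damage AND West Jakarta)
print("confidence:", confidence(*rule))            # P(West Jakarta | damage)
```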

Keywords: association rule, data mining, industrial accidents, rules

Procedia PDF Downloads 260
16910 The Distributed Pattern of the Neurovascular Structures under Clavicle to Minimize Structural Injury in Clinical Field: Anatomical Study

Authors: Anna Jeon, Seung-Ho Han, Je-Hun Lee

Abstract:

The aim of this study was to determine the location and distribution pattern of neurovascular structures superior and inferior to the clavicle by detailed dissection. Fifteen adult non-embalmed cadavers with a mean age of 71.5 years were studied. For measurements, the most prominent point of the sternal end of the clavicle (SEC) in anterior view and the most prominent point of the acromial end of the clavicle (AEC) were identified before dissection, and a line connecting the SEC and AEC was used as a reference line. The surrounding neurovascular structures were investigated. The supraclavicular nerve was most densely distributed at 71.73% along the reference line, branches of the thoracoacromial artery were located at 76.92%, and branches of the subclavian vein were evenly distributed across all sections. The subclavian vein and artery and the brachial plexus were located from 31.3% to 57.5%. This area requires caution because major neurovascular structures run underneath the clavicle.

Keywords: clavicle, ORIF, neurovascular structure, anatomical study

Procedia PDF Downloads 137
16909 Comparing Xbar Charts: Conventional versus Reweighted Robust Estimation Methods for Univariate Data Sets

Authors: Ece Cigdem Mutlu, Burak Alakent

Abstract:

Maintaining the quality of manufactured products at a desired level depends on the stability of process dispersion and location parameters and on detecting perturbations in these parameters as promptly as possible. The Shewhart control chart is the most widely used technique in statistical process monitoring to monitor the quality of products and control process mean and variability. In the application of Xbar control charts, the sample standard deviation and sample mean are known to be the most efficient conventional estimators of process dispersion and location, respectively, based on the assumption of independent and normally distributed datasets. On the other hand, there is no guarantee that real-world data are normally distributed. When process parameters are estimated from Phase I data clouded with outliers, the efficiency of traditional estimators is significantly reduced and the performance of Xbar charts is undesirably low; e.g., occasional outliers in the rational subgroups in the Phase I data set may considerably affect the sample mean and standard deviation, resulting in a serious delay in the detection of inferior products in Phase II. For more efficient application of control charts, estimators robust against the contaminations that may exist in Phase I are required. In the current study, we present a simple approach to constructing robust Xbar control charts using the average distance to the median, the Qn-estimator of scale, and the M-estimator of scale with logistic psi-function in the estimation of the process dispersion parameter, and the Harrell-Davis qth quantile estimator, the Hodges-Lehmann estimator, and the M-estimator of location with Huber and logistic psi-functions in the estimation of the process location parameter. Phase I efficiency of the proposed estimators and Phase II performance of the Xbar charts constructed from them are compared with the conventional mean and standard deviation statistics, both under normality and against diffuse-localized and symmetric-asymmetric contaminations, using 50,000 Monte Carlo simulations in MATLAB. Consequently, robust estimators are found to yield parameter estimates with higher efficiency against all types of contaminations, and Xbar charts constructed using robust estimators have higher power in detecting disturbances compared to conventional methods. Additionally, utilizing individuals charts to screen outlier subgroups and employing different combinations of dispersion and location estimators on subgroups and individual observations are found to improve the performance of Xbar charts.
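
Most of the estimators above have involved forms; two of the simpler ones, the Hodges-Lehmann location estimate and a median-based scale estimate, can be sketched on a contaminated subgroup. The data here are synthetic, not the study's simulation setup.

```python
# Robust location/scale estimates on a Phase-I subgroup with one gross outlier.
import numpy as np
from itertools import combinations_with_replacement

def hodges_lehmann(x):
    """Median of pairwise Walsh averages -- a robust location estimate."""
    pairs = combinations_with_replacement(x, 2)
    return np.median([(a + b) / 2 for a, b in pairs])

def mad_scale(x):
    """Median absolute deviation, scaled for consistency at the normal."""
    return 1.4826 * np.median(np.abs(x - np.median(x)))

rng = np.random.default_rng(1)
subgroup = np.append(rng.normal(10, 1, 9), 25.0)   # nine clean points, one outlier
print("mean:", subgroup.mean(), " HL :", hodges_lehmann(subgroup))
print("std :", subgroup.std(ddof=1), " MAD:", mad_scale(subgroup))
```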

Keywords: average run length, M-estimators, quality control, robust estimators

Procedia PDF Downloads 170
16908 Distributed Leadership and Emergency Response: A Study on Seafarers

Authors: Delna Shroff

Abstract:

Merchant shipping is an occupation with a high rate of fatal injuries caused by organizational accidents and maritime disasters. In most accident investigations, the leader's actions come under scrutiny, pointing to the necessity of examining the leader's decisions in critical conditions. While several leadership studies have been carried out in the past, most research tends to focus on holders of formal positions, with the 'individual' as the unit of analysis. A need is therefore felt to adopt a practice-based perspective on leadership and to understand how leadership emerges to affect maritime safety. This paper explores the phenomenon of distributed leadership among seafarers more holistically. It further examines the role of one form of distributed leadership, planfully aligned leadership, in the emergency response of the team. A mixed design will be applied. In the first phase, data gathered through semi-structured interviews will be used to explore seafarers' implicit understanding of leadership and to develop a conceptual framework of distributed leadership specific to the maritime context. This framework will be used to develop a simulation. An experimental design will be used to examine the relationship between planfully aligned leadership and the emergency response of team members during navigation. Findings show that planfully aligned leadership significantly and positively predicts the emergency response of team members and leads to a better emergency response than authoritarian leadership. In the third, qualitative phase, additional data will be gathered through semi-structured interviews to further validate the findings and gain a more complete understanding of distributed leadership and its relation to emergency response. The above are predictive results; the study is expected to be a cornerstone of safety leadership research and has important implications for leadership development and training within the maritime industry.

Keywords: authoritarian leadership, distributed leadership, emergency response , planfully aligned leadership

Procedia PDF Downloads 146
16907 Open Source Knowledge Management Approach to Manage and Disseminate Distributed Content in a Global Enterprise

Authors: Rahul Thakur, Onkar Chandel

Abstract:

Red Hat is the world leader in providing open source software and solutions. A global enterprise like Red Hat faces unique issues in connecting employees with content because of distributed offices, multiple teams spread across geographies, multiple languages, and different cultures. Employees of a global company create content that is distributed across departments, teams, regions, and countries. This makes finding the best content difficult, since owners keep iterating on the existing content. When employees are unable to find content, they end up creating it once again, duplicating existing material and effort. Employees may also fail to find the relevant content and spend time reviewing obsolete, duplicate, or irrelevant content. On average, a person spends 15 minutes per day in failed searches, which can result in missed business opportunities, employee frustration, and substandard deliverables. The Red Hat Knowledge Management Office (KMO) applied an 'open source strategy' to solve the above problems. Under the open source strategy, decisions are taken collectively; the strategy aims at accomplishing common goals with the help of communities. The objectives of this initiative were to save employees' time, provide them with authentic content, improve their content search experience, avoid duplicate content creation, provide context-based search, improve analytics, improve content management workflows, automate content classification, and automate content upload. This session will describe the open source strategy, its applicability to content management, challenges, recommended solutions, and outcomes.

Keywords: content classification, content management, knowledge management, open source

Procedia PDF Downloads 185
16906 Focus-Latent Dirichlet Allocation for Aspect-Level Opinion Mining

Authors: Mohsen Farhadloo, Majid Farhadloo

Abstract:

Aspect-level opinion mining, which aims at discovering aspects (aspect identification) and their corresponding ratings (sentiment identification) from customer reviews, has increasingly attracted the attention of researchers and practitioners, as it provides valuable insights about products/services from the customer's point of view. Instead of addressing aspect identification and sentiment identification in two separate steps, it is possible to identify both aspects and sentiments simultaneously. In recent years, many graphical models based on Latent Dirichlet Allocation (LDA) have been proposed to solve both aspect and sentiment identification in a single step. Although LDA models have been effective tools for the statistical analysis of document collections, they also have shortcomings in addressing some unique characteristics of opinion mining. Our goal in this paper is to address one of the limitations of topic models to date: they fail to directly model the associations among topics. Indeed, in many text corpora it is natural to expect that subsets of the latent topics have higher probabilities. We propose a probabilistic graphical model called focus-LDA to better capture the associations among topics when applied to aspect-level opinion mining. Our experiments on real-life data sets demonstrate the improved effectiveness of the focus-LDA model in terms of the accuracy of the predictive distributions over held-out documents. Furthermore, we demonstrate qualitatively that the focus-LDA topic model provides a natural way of visualizing and exploring unstructured collections of textual data.
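
As a baseline illustration only of aspect discovery with the standard LDA that focus-LDA extends (not the proposed model itself), on an invented toy corpus of review snippets:

```python
# Plain LDA recovering latent "aspects" from short reviews (illustrative).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

reviews = [
    "great battery life and fast charging",
    "battery drains quickly, poor charging",
    "excellent screen, bright display",
    "display too dim, screen flickers",
]
vec = CountVectorizer(stop_words="english")
counts = vec.fit_transform(reviews)

lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)
terms = vec.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    top = [terms[i] for i in topic.argsort()[-3:]]   # top words per aspect
    print(f"aspect {k}: {top}")
```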

Keywords: aspect-level opinion mining, document modeling, Latent Dirichlet Allocation, LDA, sentiment analysis

Procedia PDF Downloads 78
16905 A High-Level Co-Evolutionary Hybrid Algorithm for the Multi-Objective Job Shop Scheduling Problem

Authors: Aydin Teymourifar, Gurkan Ozturk

Abstract:

In this paper, a hybrid distributed algorithm is suggested for the multi-objective job shop scheduling problem. Many new approaches are used in the design steps of the distributed algorithm. The co-evolutionary structure of the algorithm and the competition between different communicating hybrid algorithms, executed simultaneously, lead to an efficient search. Using several machines to distribute the algorithms, at the iteration and solution levels, increases computational speed. The proposed algorithm is able to find the Pareto solutions of large problems in a shorter time than other algorithms in the literature. The Apache Spark and Hadoop platforms have been used for the distribution of the algorithm. The suggested algorithm and its implementations have been compared with the results of successful algorithms in the literature. The results prove the efficiency and high speed of the algorithm.
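
A hedged sketch of the distribution idea only, not the authors' hybrid algorithm: independent, competing search runs evaluated in parallel with Spark, keeping the best result. The objective here is a dummy stand-in for a real makespan evaluation.

```python
# Parallel multi-start search with Apache Spark (illustrative skeleton).
import random
from pyspark import SparkContext

def local_search(seed):
    """Stand-in for one hybrid-algorithm instance; returns (makespan, seed)."""
    rng = random.Random(seed)
    best = min(rng.uniform(80, 120) for _ in range(1000))  # dummy objective
    return (best, seed)

sc = SparkContext(appName="distributed-jobshop-sketch")
results = sc.parallelize(range(64)).map(local_search)      # 64 competing runs
print("best makespan found:", results.min())
sc.stop()
```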

Keywords: distributed algorithms, Apache Spark, Hadoop, job shop scheduling, multi-objective optimization

Procedia PDF Downloads 338
16904 The Role of Strategic Alliances, Innovation Capability, Cost Reduction in Enhancing Customer Loyalty and Firm’s Competitive Advantage

Authors: Soebowo Musa

Abstract:

Mining industries, particularly coal mining, are known to be very volatile due to their sensitivity to changes in the environment. Heavy equipment distributors and coal mining contractors are among those heavily affected by such volatility and face growing uncertainty about the sustainability of the coal mining industry. Strategic alliances and organizational capabilities such as innovation capability have long been seen as ways to stay competitive, with a focus mostly on partner-to-partner strategic alliances in serving customers. Given today's rapidly changing environment, shifting consumer behaviors, and the human-centric business approach, this study examines the partner-to-customer strategic alliance relationship through both industrial organization and resource-based theories. The study is based on 250 respondents from the partner-to-customer strategic alliances between heavy equipment distributors and coal mining contractors in Indonesia. It finds that strategic alliances have the strongest association with cost reduction, a proxy for operational efficiency, followed by their association with innovation capability. Further, strategic alliances and innovation capability have a positive relationship with customer loyalty, while innovation capability and customer loyalty have no significant relationship with the firm's competitive advantage. The study also indicates that cost reduction is not a condition for developing customer loyalty in the partner-to-customer strategic alliance relationship. It confirms that strategic alliances are a strategy that creates a firm's operational efficiency, an innovation capability that develops customer loyalty, and competitive advantage.

Keywords: strategic alliance, innovation capability, cost reduction, customer loyalty, competitive advantage

Procedia PDF Downloads 91
16903 Application of Fuzzy Logic in Voltage Regulation of Radial Feeder with Distributed Generators

Authors: Anubhav Shrivastava, Lakshya Bhat, Shivarudraswamy

Abstract:

Distributed generation (DG) is the need of the hour. With current advancements in DG technology, some major issues need to be tackled in order to make this method of energy generation more efficient and feasible. Among other problems, control of voltage is the major issue that needs to be addressed. This paper focuses on controlling voltage through the reactive power control of DGs with the help of fuzzy logic. The membership functions have been defined accordingly, and control of the system is achieved. Finally, with the help of simulation results in MATLAB, control of voltage within the set tolerance limit (+/- 5%) is achieved. The voltage waveform graphs for the IEEE 14-bus system are obtained first using a simple algorithm in MATLAB and then with fuzzy logic. The goal of this project was to keep the voltage within limits by controlling the reactive power of the DG using fuzzy logic.
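
A hand-rolled sketch of the fuzzy control idea, mapping per-unit voltage to a reactive-power correction through triangular membership functions. The shapes, rule base, and +/- 5% band placement below are invented for illustration, not the paper's design.

```python
# Fuzzy voltage-to-reactive-power mapping with three triangular sets.
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def q_correction(v_pu):
    """Weighted-average defuzzification of three rules on voltage (p.u.)."""
    low  = tri(v_pu, 0.90, 0.95, 1.00)   # voltage LOW  -> inject Q  (+1)
    ok   = tri(v_pu, 0.95, 1.00, 1.05)   # voltage OK   -> hold      ( 0)
    high = tri(v_pu, 1.00, 1.05, 1.10)   # voltage HIGH -> absorb Q  (-1)
    w = low + ok + high
    return (low * 1.0 + ok * 0.0 + high * -1.0) / w if w else 0.0

for v in (0.93, 1.00, 1.06):
    print(v, "->", round(q_correction(v), 2))
```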

Keywords: distributed generation, fuzzy logic, matlab, newton raphson, IEEE 14 bus, voltage regulation, radial network

Procedia PDF Downloads 605
16902 Opinion Mining to Extract Community Emotions on Covid-19 Immunization Possible Side Effects

Authors: Yahya Almurtadha, Mukhtar Ghaleb, Ahmed M. Shamsan Saleh

Abstract:

The world witnessed a fierce attack from the Covid-19 virus, which affected public life socially, economically, physically, and psychologically. The world's governments tried to confront the pandemic by imposing a number of precautionary measures such as general closures, curfews, and social distancing. Scientists also made strenuous efforts to develop an effective vaccine to train the immune system to develop antibodies to combat the virus, thus reducing its symptoms and limiting its spread. Artificial intelligence, along with researchers and medical authorities, accelerated the vaccine development process through big data processing and simulation. On the other hand, one of the most important negative impacts of Covid-19 was the state of anxiety and fear due to the spread of rumors through social media, which prompted governments to try to reassure the public by the available means. This study proposes using sentiment analysis (also known as opinion mining) and deep learning as efficient artificial intelligence techniques to retrieve public tweets from Twitter and then analyze them automatically to extract opinions, expressions, and feelings, negative or positive, about the symptoms people may feel after vaccination. Sentiment analysis is characterized by its ability to access what the public posts on social media in record time and at a lower cost than traditional means such as questionnaires and interviews, not to mention the accuracy of the information, as it comes from what the public expresses voluntarily.

Keywords: deep learning, opinion mining, natural language processing, sentiment analysis

Procedia PDF Downloads 146
16901 Investigating Data Normalization Techniques in Swarm Intelligence Forecasting for Energy Commodity Spot Price

Authors: Yuhanis Yusof, Zuriani Mustaffa, Siti Sakira Kamaruddin

Abstract:

Data mining is a fundamental technique for identifying patterns in large data sets. The extracted facts and patterns contribute to various domains such as marketing, forecasting, and medicine. Prior to mining, data are consolidated so that the resulting mining process may be more efficient. This study investigates the effect of different data normalization techniques, namely min-max, Z-score, and decimal scaling, on swarm-based forecasting models. The recent swarm intelligence algorithms employed include the Grey Wolf Optimizer (GWO) and Artificial Bee Colony (ABC). Forecasting models are then developed to predict the daily spot prices of crude oil and gasoline. Results showed that GWO works better with the Z-score normalization technique, while ABC produces better accuracy with min-max. Nevertheless, GWO is superior to ABC, as its model generates the highest accuracy for both crude oil and gasoline prices. Such a result indicates that GWO is a promising competitor in the family of swarm intelligence algorithms.
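
The three normalization techniques compared in the study have simple closed forms and can be sketched directly; the price vector below is invented.

```python
# Min-max, Z-score, and decimal-scaling normalization in NumPy.
import numpy as np

def min_max(x, lo=0.0, hi=1.0):
    return lo + (x - x.min()) * (hi - lo) / (x.max() - x.min())

def z_score(x):
    return (x - x.mean()) / x.std()

def decimal_scaling(x):
    j = np.ceil(np.log10(np.abs(x).max()))   # smallest j with max|x|/10^j < 1
    return x / 10 ** j

prices = np.array([52.3, 61.8, 48.9, 70.4, 66.1])   # e.g. daily spot prices
for f in (min_max, z_score, decimal_scaling):
    print(f.__name__, np.round(f(prices), 3))
```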

Keywords: artificial bee colony, data normalization, forecasting, Grey Wolf optimizer

Procedia PDF Downloads 450
16900 Distributed Acoustic Sensing Signal Model under Static Fiber Conditions

Authors: G. Punithavathy

Abstract:

The research proposes a statistical model for distributed acoustic sensing (DAS) interrogation units that launch a laser pulse into an optical fiber, where interactions within the fiber determine the localized acoustic energy that causes light reflections known as backscatter. The backscattered signal's amplitude and phase can be calculated using explicit equations. The created model makes predictions of the amplitude signal spectrum and autocorrelation that are confirmed by experimental findings. Phase signal characteristics useful for researching optical time domain reflectometry (OTDR) sensing applications are provided and examined, showing good agreement with the experiment. The simulations were successfully carried out in Python. With this model, the component parts of the DAS system can be analyzed separately. The model assumes that the fiber is in a static condition, meaning that no external force or vibration is applied to the cable, i.e., no external acoustic disturbances are present. The backscattered signal consists of a random noise component, caused by the intrinsic imperfections of the fiber, and a coherent component, due to the laser pulse interacting with the fiber.
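
A minimal sketch of the static-fiber picture, assuming each gauge section returns a sum of scatterers with random amplitudes and phases (Rayleigh-like fading); the section counts and scales are invented, not the paper's parameters.

```python
# Random-phasor-sum backscatter model for a static fiber (illustrative).
import numpy as np

rng = np.random.default_rng(42)
n_sections, n_scatterers = 2000, 50

# Complex backscatter per fiber section: sum of random phasors
phases = rng.uniform(0, 2 * np.pi, (n_sections, n_scatterers))
amps = rng.rayleigh(1.0, (n_sections, n_scatterers))
field = (amps * np.exp(1j * phases)).sum(axis=1)

amplitude = np.abs(field)    # fading amplitude trace along the fiber
phase = np.angle(field)      # phase trace used for sensing
print("mean amplitude:", amplitude.mean(), "amplitude std:", amplitude.std())
```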

Keywords: distributed acoustic sensing, optical fiber devices, optical time domain reflectometry, Rayleigh scattering

Procedia PDF Downloads 49
16899 Multicriteria for Optimal Land Use after Mining

Authors: Carla Idely Palencia-Aguilar

Abstract:

Mining in Colombia represents around 2% of GDP (USD 8 billion in 2018), with the main products being coal, nickel, gold, silver, emeralds, iron, limestone, and gypsum, among others. Sand and gravel production had been decreasing its share of GDP, falling from 33.2 million m3 in 2015 to 27.4 in 2016, 22.7 in 2017, and 15.8 in 2018, with a consumption of approximately 3 tons per inhabitant; however, with the new government policies, it is expected to increase in the following years. Mining causes temporary environmental impacts; once restoration and rehabilitation take place, the social, environmental, and economic benefits are higher than in the initial state. To demonstrate how mining interventions have contributed to improving the characteristics of the region after sand and gravel mining, the NDVI (Normalized Difference Vegetation Index) from MODIS and ASTER was employed. The histograms show not only increases in vegetation in the area (8 times higher) but also topographies similar to those before the intervention, according to the sustainable development application selected: agriculture, forestry, cattle raising, artificial wetlands, or doing nothing. The decision was based upon a multicriteria analysis for optimal land use, with three main variables: geostatistics, evapotranspiration, and groundwater characteristics. The use of remote sensing, meteorological stations, piezometers, sun photometers, and geoelectric analysis, among others, provides the information required for the multicriteria decision. For cattle-raising and agricultural applications (where various crops were implemented), preservation of the products was tested by means of nanotechnology. The results showed a shelf life of 2 years, with no chemicals added for preservation, and retention of the vitamin content of the tested products.
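
The NDVI used above has a closed form, NDVI = (NIR - Red)/(NIR + Red). A small sketch with synthetic reflectance values standing in for the MODIS/ASTER rasters:

```python
# NDVI from red and near-infrared reflectance bands (toy 2x2 rasters).
import numpy as np

red = np.array([[0.08, 0.10], [0.30, 0.25]])   # red reflectance
nir = np.array([[0.50, 0.45], [0.32, 0.28]])   # near-infrared reflectance

ndvi = (nir - red) / (nir + red)               # dense vegetation -> values near 1
print(np.round(ndvi, 2))
```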

Keywords: ASTER, Geostatistics, MODIS, Multicriteria

Procedia PDF Downloads 107
16898 Combined Safety and Cybersecurity Risk Assessment for Intelligent Distributed Grids

Authors: Anders Thorsén, Behrooz Sangchoolie, Peter Folkesson, Ted Strandberg

Abstract:

As more parts of the power grid become connected to the internet, the risk of cyberattacks increases. To identify the cybersecurity threats and subsequently reduce vulnerabilities, the common practice is to carry out a cybersecurity risk assessment. For safety classified systems and products, there is also a need for safety risk assessments in addition to the cybersecurity risk assessment in order to identify and reduce safety risks. These two risk assessments are usually done separately, but since cybersecurity and functional safety are often related, a more comprehensive method covering both aspects is needed. Some work addressing this has been done for specific domains like the automotive domain, but more general methods suitable for, e.g., intelligent distributed grids, are still missing. One such method from the automotive domain is the Security-Aware Hazard Analysis and Risk Assessment (SAHARA) method that combines safety and cybersecurity risk assessments. This paper presents an approach where the SAHARA method has been modified in order to be more suitable for larger distributed systems. The adapted SAHARA method has a more general risk assessment approach than the original SAHARA. The proposed method has been successfully applied on two use cases of an intelligent distributed grid.

Keywords: intelligent distribution grids, threat analysis, risk assessment, safety, cybersecurity

Procedia PDF Downloads 126
16897 Modeling and Simulation of Pad Surface Topography by Diamond Dressing in Chemical-Mechanical Polishing Process

Authors: A. Chen Chao-Chang, Phong Pham-Quoc

Abstract:

The chemical-mechanical polishing (CMP) process has been widely applied in fabricating integrated circuits (ICs): a soft polishing pad, combined with a slurry of micron- or nano-scale abrasives, generates a chemical reaction to remove substrate or film materials from the wafer. During the CMP process, pad uniformity usually serves as the datum surface for wafer planarization, and pad asperities can dominate the microscopic pad-slurry-wafer interaction. However, pad topography is changed by the mechanics of CMP, and the pad needs to be re-conditioned or dressed by a diamond dresser with well-distributed diamond grits on a disc surface. It is still very complicated to analyze and understand the kinematics of the diamond dressing process under the effects of input variables, including the oscillation of the diamond dresser and the rotation speed ratio between the pad and the dresser. This paper develops a generic geometric model to clarify the kinematic modeling of diamond dressing processes, covering dresser/pad motion, the pad cutting locus, the relative velocity of the diamond abrasive grits on the pad surface, and the overlap of cutting, for the prediction of pad surface topography. Simulation results focus on comparing and analyzing the kinematics of diamond dressing on certain CMP tools. The significant parameters of the diamond dressing process are identified and discussed. Future studies can apply the model to diamond dresser design and to experimental verification of the pad dressing process.
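
A hedged sketch of the underlying kinematics, assuming rigid co-planar rotations and ignoring the dresser's oscillation (all geometry values invented): the velocity of a dresser grit relative to the rotating pad, whose magnitude drives the cutting locus.

```python
# Relative velocity of a diamond grit over a rotating pad (illustrative).
import numpy as np

omega_pad = 2 * np.pi * 60 / 60      # pad speed: 60 rpm -> rad/s
omega_dresser = 2 * np.pi * 90 / 60  # dresser speed: 90 rpm -> rad/s
center = np.array([0.15, 0.0])       # dresser center on the pad (m)
r_grit = 0.05                        # grit radius on the dresser disc (m)

t = np.linspace(0, 2, 500)
# Grit position: dresser center plus rotation about that center
pos = center + r_grit * np.c_[np.cos(omega_dresser * t), np.sin(omega_dresser * t)]
v_grit = r_grit * omega_dresser * np.c_[-np.sin(omega_dresser * t), np.cos(omega_dresser * t)]
# Pad surface velocity at the grit location: omega x r for a z-axis rotation
v_pad = omega_pad * np.c_[-pos[:, 1], pos[:, 0]]
rel_speed = np.linalg.norm(v_grit - v_pad, axis=1)
print("relative speed min/max (m/s):", rel_speed.min(), rel_speed.max())
```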

Keywords: kinematic modeling, diamond dresser, pad cutting locus, CMP

Procedia PDF Downloads 231
16896 Data Mining of Students' Performance Using Artificial Neural Network: Turkish Students as a Case Study

Authors: Samuel Nii Tackie, Oyebade K. Oyedotun, Ebenezer O. Olaniyi, Adnan Khashman

Abstract:

Artificial neural networks have been used in different fields of artificial intelligence, and more specifically in machine learning. Although other machine learning options are feasible in most situations, the ease with which neural networks lend themselves to different problems, including pattern recognition, image compression, classification, computer vision, and regression, has earned them a remarkable place in the machine learning field. This research exploits neural networks as a data mining tool for predicting the number of times a student repeats a course, considering attributes relating to the course itself, the teacher, and the particular student. Neural networks were used in this work to map the relationship between attributes of students' course assessment and the number of times a student will possibly repeat a course before passing. The hope is that the ability to predict students' performance from such complex relationships can help facilitate the fine-tuning of academic systems and policies implemented in learning environments. To validate the power of neural networks in data mining, a database of Turkish students' performance was used; feedforward and radial basis function networks were trained for this task, and the performance of these networks was evaluated with respect to achieved recognition rates and training time.
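
A minimal sketch of the mapping described above, with synthetic course/teacher/student attributes and a toy labeling rule standing in for the Turkish database, and a feedforward network as the predictor:

```python
# Feedforward network mapping assessment attributes to repeat/no-repeat.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n = 400
X = np.c_[rng.integers(0, 5, n),      # course difficulty rating
          rng.integers(0, 3, n),      # teacher evaluation band
          rng.uniform(0, 100, n)]     # student's prior average
y = (X[:, 0] * 10 + (100 - X[:, 2]) > 70).astype(int)  # toy rule: repeats or not

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
net = MLPClassifier(hidden_layer_sizes=(8,), max_iter=800, random_state=0)
net.fit(X_tr, y_tr)
print("holdout accuracy:", net.score(X_te, y_te))
```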

Keywords: artificial neural network, data mining, classification, students’ evaluation

Procedia PDF Downloads 576
16895 Identification of Risks Associated with Process Automation Systems

Authors: J. K. Visser, H. T. Malan

Abstract:

A need exists to identify the sources of risk associated with process automation systems within petrochemical companies and similar energy-related industries. These companies use many different process automation technologies in their value chains. A crucial part of the process automation system is the information technology component in the supervisory control layer. The ever-changing technology within the process automation layers, and the rate at which it advances, pose a risk to safe and predictable automation system performance. The age of the automation equipment also challenges the plant's operations and maintenance managers due to obsolescence and unavailability of spare parts. The main objective of this research was to determine the risk sources associated with the equipment that is part of process automation systems. A secondary objective was to establish whether technology managers and technicians were aware of the risks and shared the same viewpoint on the importance of the risks associated with automation systems. A conceptual model for the risk sources of automation systems was formulated from models and frameworks in the literature. This model comprised six categories of risk, which form the basis for identifying specific risks, and was used to develop a questionnaire that was sent to 172 instrument technicians and technology managers in the company to obtain primary data; 75 completed and useful responses were received. These responses were analyzed statistically to determine the highest risk sources and whether there was a difference in opinion between technology managers and technicians. The most important risks revealed in this study are: 1) the lack of skilled technicians, 2) the integration capability of third-party system software, 3) the reliability of the process automation hardware, 4) excessive costs of performing maintenance and migrations on process automation systems, and 5) the requirement for third-party communication interfacing compatibility as well as real-time communication networks.

Keywords: distributed control system, identification of risks, information technology, process automation system

Procedia PDF Downloads 110
16894 Power Quality Improvement Using UPQC Integrated with Distributed Generation Network

Authors: B. Gopal, Pannala Krishna Murthy, G. N. Sreenivas

Abstract:

The increasing demand for electric power places emphasis on the need for maximum utilization of renewable energy sources, while maintaining power quality to the satisfaction of the utility remains an essential requirement. This paper presents the design aspects of a Unified Power Quality Conditioner (UPQC) integrated with a photovoltaic system in a distributed generation network. The proposed system consists of a series inverter and a shunt inverter connected back to back on the DC side, sharing a common DC-link capacitor with the distributed generation through a boost converter. The primary task of the UPQC is to minimize grid voltage and load current disturbances along with reactive and harmonic power compensation. In addition to these primary tasks, other functionalities, such as compensation of voltage interruptions and active power transfer to the load and grid in both islanding and interconnected modes, have been addressed. The simulation model was designed in the MATLAB/Simulink environment, and the results are in good agreement with published work.

Keywords: distributed generation (DG), interconnected mode, islanding mode, maximum power point tracking (mppt), power quality (PQ), unified power quality conditioner (UPQC), photovoltaic array (PV)

Procedia PDF Downloads 483
16893 Arabic Light Stemmer for Better Search Accuracy

Authors: Sahar Khedr, Dina Sayed, Ayman Hanafy

Abstract:

Arabic is one of the most ancient and critical languages in the world, with more than 250 million native speakers and more than twenty countries having Arabic as one of their official languages. In the past decade, we have witnessed a rapid evolution in smart devices, social networks, and the technology sector, which has led to the need for tools and libraries that properly handle the Arabic language in different domains. Stemming is one of the most crucial linguistic fundamentals; it is used in many applications, especially in information extraction and text mining. The motivation behind this work is to enhance the Arabic light stemmer to serve the data mining industry and to make it available to the open source community. The presented implementation enhances the Arabic light stemmer by extending an existing algorithm with a new set of rules and patterns accompanied by an adjusted procedure. This study demonstrates a significant enhancement in search accuracy, with an average 10% improvement in comparison with previous works.
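
Light stemming reduces, at its simplest, to stripping frequent affixes. A toy illustration of that mechanism only; the rule lists below are heavily abbreviated assumptions, not the enhanced rule set presented in the paper.

```python
# Minimal Arabic light-stemming sketch: strip one prefix and one suffix,
# keeping a stem of at least three letters (real stemmers use far more rules).
PREFIXES = ["ال", "وال", "بال", "كال", "فال", "لل", "و"]
SUFFIXES = ["ها", "ان", "ات", "ون", "ين", "ية", "ه", "ة", "ي"]

def light_stem(word):
    for p in sorted(PREFIXES, key=len, reverse=True):
        if word.startswith(p) and len(word) - len(p) >= 3:
            word = word[len(p):]
            break
    for s in sorted(SUFFIXES, key=len, reverse=True):
        if word.endswith(s) and len(word) - len(s) >= 3:
            word = word[:-len(s)]
            break
    return word

print(light_stem("المكتبات"))   # -> مكتب
```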

Keywords: Arabic data mining, Arabic Information extraction, Arabic Light stemmer, Arabic stemmer

Procedia PDF Downloads 284