
Search results for: Interval Type-2 Fuzzy Logic

1200 A Business Model Design Process for Social Enterprises: The Critical Role of the Environment

Authors: Hadia Abdel Aziz, Raghda El Ebrashi

Abstract:

Business models are shaped by their design space, or the environment in which they are designed to be implemented. The rapidly changing economic, technological, political, regulatory, and market environment severely affects business logic. This is particularly true for social enterprises, whose core mission is to transform their environments, and thus, whose whole business logic revolves around the interchange between the enterprise and the environment. The context in which a social business operates imposes different business model design constraints while, at the same time, opening up new design opportunities. The business model is also affected to a great extent by the impact that successful enterprises generate: a continuous loop of interaction that needs to be managed through a dynamic capability in order to generate a lasting, powerful impact. This conceptual research synthesizes and analyzes the literature on social enterprise, social enterprise business models, business model innovation, business model design, and the open systems view to propose a new business model design process for social enterprises that takes into account the critical role of environmental factors. This process would help the social enterprise develop a dynamic capability that ensures the alignment of its business model to its environmental context, thus maximizing its probability of success.

Keywords: social enterprise, business model, business model design, business model environment

Procedia PDF Downloads 346
1199 Attentional Engagement for Movies

Authors: Wuon-Shik Kim, Hyoung-Min Choi, Jeonggeon Woo, Sun Jung Kwon, SeungHee Lee

Abstract:

Research on attentional engagement (AE) in movies using physiological signals is rare and controversial. Therefore, whether physiological responses can be applied to evaluate AE in actual movies is unclear. To clarify this, we measured the electrocardiogram (ECG) and electroencephalogram (EEG) of 16 Japanese university students as they watched the American movie Iron Man. After the viewing, we evaluated the subjective AE and affection levels for 11 film content segments in Iron Man. Based on the self-reports for AE, we selected two film content segments as stimuli: Film Content 9, describing Tony Stark (the main character) flying through the night sky (with the highest AE score), and Film Content 1, describing Tony Stark and his colleagues telling indecent jokes (with the lowest score). We divided each of these two content segments into two time intervals. Results indicated that the Film Content by Interval interaction for heart rate (HR) was significant, at F(1, 11)=35.64, p<.001, η²=.76; while HR in Film Content 1 decreased, that in Film Content 9 increased. In Film Content 9, the main effects of Interval for respiratory sinus arrhythmia (RSA) (F(1, 11)=5.91, p<.05, η²=.35) and for the attention index of the EEG (F(1, 11)=5.23, p<.05, η²=.37) were significant. The increase in RSA was significant (p<.05) as well, whereas that of the EEG attention index was nearly significant (p=.069). In conclusion, while RSA increases, HR decreases when people direct their attention toward ordinary film content. However, while paying attention to a film segment evoking excitement, HR as well as RSA can increase.
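
As an illustration of the statistics reported here, the sketch below runs a two-way repeated-measures ANOVA (Film Content x Interval) of the kind behind the F(1, 11) values above; the data-frame layout and heart-rate values are made-up assumptions, not the study's data.

```python
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "subject":  list(range(12)) * 4,
    "content":  ["FC1"] * 24 + ["FC9"] * 24,
    "interval": (["t1"] * 12 + ["t2"] * 12) * 2,
    "hr":       72 + rng.normal(0, 3, 48),   # placeholder heart-rate means
})

# Two-way repeated-measures ANOVA, one observation per subject and cell.
res = AnovaRM(df, depvar="hr", subject="subject",
              within=["content", "interval"]).fit()
print(res.anova_table)  # F value, num/den df, p-value per effect
```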

Keywords: attentional engagement, electroencephalogram, movie, respiratory sinus arrhythmia

Procedia PDF Downloads 349
1198 A Review of the Paleo-Depositional Environment and Thermal Alteration Index of the Carboniferous, Permian and Triassic of the A1-9 Well, NW Libya

Authors: Mohamed Ali Alrabib

Abstract:

This paper reviews the paleo-depositional environment and hydrocarbon shows of the A1-9 well. The hydrocarbon show identified in the interval from the Dembaba Formation to the Hassouna Formation was a poor to very poor oil show, and the palaeoenvironmental analysis indicates that neither a particularly good reservoir nor a good source rock has developed in the area. Recent palaeoenvironmental work shows that the sedimentary succession in this area comprises Upper Paleozoic rocks of the Carboniferous and Permian and Mesozoic (Triassic) sedimentary sequences. No Early Paleozoic rocks have been found in this area; these rocks were eroded during Late Carboniferous and Early Permian time. There is evidence that a major marine transgression occurred during latest Permian and earliest Triassic time. From depths of 5930-5940 feet to 10800-10810 feet, the Thermal Alteration Index (TAI) of the Al Guidr, Bir Al Jaja, Al Uotia, and Hebilia units varies between 3+ and 4- (mature to dry gas); this interval incorporates the upper part of the Dembaba Formation. From depths of 10800-10810 feet to the total sediment depth (11944 feet, log), an interval that incorporates the rest of the Dembaba Formation, the underlying equivalents of the Assedjefar and Mrar Formations, and the underlying indeterminate unit (Hassouna Formation), the TAI varies between 4 and 5 (dry gas; black and deformed).

Keywords: paleoenvironments, thermal alteration index, Carboniferous, Libya

Procedia PDF Downloads 404
1197 Effect of Drought Stress on Yield and Yield Components of Maize Cultivars in Golestan Province

Authors: Mojtaba Esmaeilzad Limoudehi, Ebrahim Amiri

Abstract:

Water scarcity is now one of the leading challenges for human societies. In this regard, recognizing the relationships between soil, water, plant growth, and plant response to stress is very significant. In this paper, considering the importance of drought stress and the role of choosing suitable cultivars in resistance to drought, a split-plot experiment using early-, intermediate-, and late-maturing cultivars was carried out in the Katul field, Golestan Province, during the two cultivation years 2015 and 2016. The main-plot factor was the irrigation interval at four levels: 7, 14, 21, and 28 days. The subplot factor comprised six maize cultivars (two early-maturing, two intermediate-maturing, and two late-maturing). The analysis of variance revealed that the irrigation interval and cultivar treatments have significant effects on the number of grains per ear, the number of rows per ear, the number of grains per row, the 1000-grain weight, grain yield, and biomass yield. Moreover, the interaction of these two factors on the mentioned attributes was also significant. The best grain yield, 12301 kg/ha, was achieved with the 7-day irrigation interval and the late-maturing maize cultivars.

Keywords: corn, growth period, optimization, stress

Procedia PDF Downloads 125
1196 Hybrid Wavelet-Adaptive Neuro-Fuzzy Inference System Model for a Greenhouse Energy Demand Prediction

Authors: Azzedine Hamza, Chouaib Chakour, Messaoud Ramdani

Abstract:

Energy demand prediction plays a crucial role in achieving next-generation power systems for agricultural greenhouses. As a result, high prediction quality is required for efficient smart grid management and, therefore, low-cost energy consumption. The aim of this paper is to investigate the effectiveness of a hybrid data-driven model in day-ahead energy demand prediction. The proposed model consists of the Discrete Wavelet Transform (DWT) and an Adaptive Neuro-Fuzzy Inference System (ANFIS). The DWT is employed to decompose the original signal into a set of subseries, and an ANFIS is then used to generate the forecast for each subseries. The proposed hybrid method (DWT-ANFIS) was evaluated using one week of greenhouse energy demand data and compared with a standalone ANFIS. The performances of the different models were evaluated by comparing the corresponding values of the Mean Absolute Percentage Error (MAPE). It was demonstrated that the discrete wavelet transform can improve agricultural greenhouse energy demand modeling.
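
A minimal sketch of the decompose-then-forecast pipeline is shown below, using PyWavelets for the DWT. Since no standard ANFIS implementation is assumed here, a plain least-squares autoregression stands in for the ANFIS stage, and the demand series is synthetic.

```python
import numpy as np
import pywt

def band_signals(x, wavelet="db4", level=3):
    # Split x into full-length subseries (one approximation + detail bands)
    coeffs = pywt.wavedec(x, wavelet, level=level)
    bands = []
    for i in range(len(coeffs)):
        sel = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
        bands.append(pywt.waverec(sel, wavelet)[:len(x)])
    return bands

def ar_forecast(x, order=24, steps=24):
    # Least-squares autoregression as a stand-in for the paper's ANFIS
    X = np.array([x[i:i + order] for i in range(len(x) - order)])
    w, *_ = np.linalg.lstsq(X, x[order:], rcond=None)
    hist, out = list(x[-order:]), []
    for _ in range(steps):
        nxt = float(np.dot(w, hist[-order:]))
        out.append(nxt)
        hist.append(nxt)
    return np.array(out)

t = np.arange(24 * 7)  # one week of hourly data
demand = 50 + 20 * np.sin(2 * np.pi * t / 24) \
       + np.random.default_rng(1).normal(0, 2, t.size)
train, test = demand[:-24], demand[-24:]
pred = sum(ar_forecast(b) for b in band_signals(train))
mape = 100 * np.mean(np.abs((test - pred) / test))
print(f"day-ahead MAPE: {mape:.1f}%")
```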

Keywords: wavelet transform, ANFIS, energy consumption prediction, greenhouse

Procedia PDF Downloads 69
1195 The Application of Insects in Forensic Investigations

Authors: Shirin Jalili, Hadi Shirzad, Samaneh Nabavi, Somayeh Khanjani

Abstract:

Forensic entomology is the science of studying and analyzing insect evidence to aid criminal investigation. Awareness of the distribution, biology, ecology, and behavior of the insects found at a crime scene can provide information about when, where, and how the crime was committed, and it has many applications in criminal investigations. Its main use is the estimation of the minimum time since death in suspicious deaths. The close association between insects and corpses and the use of insects in criminal investigations are the subject of forensic entomology, because insects colonize a decomposing corpse and lay eggs on it from the initial stages of decomposition. Forensic scientists can estimate the post-mortem interval by studying the insect populations and the developing larval stages. In addition, toxicological and molecular studies of these insects can reveal the cause of death or even the identity of a victim. They can also be used to detect drugs and poisons and to determine the incident location. Recent techniques make it possible for experts to gather robust entomological evidence, which can provide vital information about death, corpse movement or burial, submersion interval, time of decapitation, identification of specific sites of trauma, post-mortem artefacts on the body, use of drugs, linking a suspect to the scene of a crime, sexual molestation, and the identification of suspects.
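
Although the abstract does not spell out the calculation, the minimum PMI is commonly estimated from insect development using accumulated degree-days (ADD); the sketch below shows that logic. The base temperature and degree-day requirement are hypothetical placeholders, not published species values.

```python
# Accumulated degree-day (ADD) logic for a minimum PMI estimate.
BASE_TEMP_C = 10.0    # hypothetical lower development threshold
ADD_REQUIRED = 120.0  # hypothetical degree-days to reach the observed larval stage

def min_pmi_days(daily_mean_temps_c):
    """Count back from discovery until the required degree-days accumulate."""
    add = 0.0
    for days, temp in enumerate(reversed(daily_mean_temps_c), start=1):
        add += max(0.0, temp - BASE_TEMP_C)
        if add >= ADD_REQUIRED:
            return days
    return None  # weather record too short to explain the observed development

temps = [24, 23, 25, 22, 21, 24, 26, 25, 23, 22, 24, 25]  # scene weather record
print(min_pmi_days(temps), "days (minimum PMI estimate)")
```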

Keywords: forensic entomology, post-mortem interval, insects, larvae

Procedia PDF Downloads 486
1194 Advances of Image Processing in Precision Agriculture: Using a Deep Learning Convolutional Neural Network for Soil Nutrient Classification

Authors: Halimatu S. Abdullahi, Ray E. Sheriff, Fatima Mahieddine

Abstract:

Agriculture is essential to the continuous existence of human life, as humans directly depend on it for the production of food. The exponential rise in population calls for a rapid increase in food production, with the application of technology to reduce laborious work and maximize production. Technology can aid and improve agriculture in several ways, through pre-planning and post-harvest use of computer vision and image processing: to determine the soil nutrient composition and the right amount, right time, and right place for the application of farm input resources such as fertilizers, herbicides, and water, as well as for weed detection and the early detection of pests and diseases. This is precision agriculture, which is thought to be the solution required to achieve our goals. There has been significant improvement in the areas of image processing and data processing, which had been a major challenge. A database of images is collected through remote sensing and analyzed, and a model is developed to determine the right treatment plans for different crop types and different regions. Features of vegetation images need to be extracted, classified, segmented, and finally fed into the model. Different techniques have been applied to these processes, from neural networks, support vector machines, and fuzzy logic approaches to, most recently, the deep learning approach of convolutional neural networks, the most effective approach, which generates excellent results for image classification. Here, a deep convolutional neural network is used to determine the soil nutrients required in a plantation for maximum production. The experimental results on the developed model yielded an average accuracy of 99.58%.
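
A minimal sketch of such a convolutional classifier is given below in Keras; the input size, class count, and layer configuration are illustrative assumptions, since the paper does not specify its architecture.

```python
import tensorflow as tf

NUM_CLASSES = 4  # assumed number of soil nutrient categories

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(64, 64, 3)),          # soil-image patches
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
# model.fit(train_images, train_labels, validation_split=0.2, epochs=20)
```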

Keywords: convolution, feature extraction, image analysis, validation, precision agriculture

Procedia PDF Downloads 298
1193 General Time-Dependent Sequenced Route Queries in Road Networks

Authors: Mohammad Hossein Ahmadi, Vahid Haghighatdoost

Abstract:

Spatial databases have been an active area of research over the years. In this paper, we study how to answer General Time-Dependent Sequenced Route queries. Given the origin and destination of a user over a time-dependent road network graph, an ordered list of categories of interest, and a departure time interval, our goal is to find the minimum travel time path, along with the best departure time, that minimizes the total travel time from the source location to the given destination while passing through a sequence of points of interest belonging to each of the specified categories of interest. The challenge of this problem is the added complexity over optimal sequenced route queries: first, the road network is time-dependent, and second, the user defines a departure time interval instead of a single departure time instance. For processing general time-dependent sequenced route queries, we propose two solutions, the Discrete-Time and Continuous-Time Sequenced Route approaches, which find approximate and exact solutions, respectively. Our proposed approaches traverse the road network based on the A* search paradigm, equipped with an efficient heuristic function for shrinking the search space. Extensive experiments are conducted to verify the efficiency of our proposed approaches.
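
The core building block can be sketched as a label-setting search over a time-dependent graph. The sketch below simplifies the paper's method (no A* heuristic, no category-sequence constraint) and uses a made-up network with a rush-hour slowdown; it then scans a departure-time interval as the query defines.

```python
import heapq

def travel_time(u, v, t):
    base = graph[u][v]
    rush = 1.5 if 8 <= (t / 60) % 24 <= 10 else 1.0  # slower in the morning peak
    return base * rush

def earliest_arrival(src, dst, depart):
    best = {src: depart}
    pq = [(depart, src)]
    while pq:
        t, u = heapq.heappop(pq)
        if u == dst:
            return t
        if t > best.get(u, float("inf")):
            continue
        for v in graph[u]:
            ta = t + travel_time(u, v, t)  # edge cost depends on arrival time
            if ta < best.get(v, float("inf")):
                best[v] = ta
                heapq.heappush(pq, (ta, v))
    return float("inf")

graph = {"s": {"a": 10, "b": 15}, "a": {"d": 20}, "b": {"d": 10}, "d": {}}
for depart in range(7 * 60, 9 * 60, 30):  # minutes after midnight
    print(depart, earliest_arrival("s", "d", depart) - depart)
```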

Keywords: trip planning, time dependent, sequenced route query, road networks

Procedia PDF Downloads 303
1192 A Sub-Conjunctival Injection of Rosiglitazone for Anti-Fibrosis Treatment after Glaucoma Filtration Surgery

Authors: Yang Zhao, Feng Zhang, Xuanchu Duan

Abstract:

Trans-differentiation of human Tenon fibroblasts (HTFs) to myofibroblasts and fibrosis of episcleral tissue are the most common reasons for the failure of glaucoma filtration surgery, with limited treatment options such as antimetabolites, which often have side effects such as leakage of the filtering bleb, infection, hypotony, and endophthalmitis. Rosiglitazone, a thiazolidinedione, is a synthetic high-affinity ligand for PPAR-γ that has been used in the treatment of type 2 diabetes and found to have pleiotropic functions against inflammatory response, cell proliferation, and tissue fibrosis, benefiting a variety of diseases in animal myocardium models, steatohepatitis models, etc. Here, in vitro, we cultured primary HTFs, stimulated them with TGF-β to induce a myofibrogenic phenotype, and then treated the cells with rosiglitazone to assess the fibrogenic response. In vivo, we used a rabbit glaucoma model to establish the formation of post-trabeculectomy scarring. We administered a subconjunctival injection of rosiglitazone beside the filtering bleb; the protein and mRNA levels and immunofluorescence of fibrogenic markers were then examined, and the condition of the filtering bleb was assessed. In vitro, we found that rosiglitazone could suppress the proliferation and migration of fibroblasts through macroautophagy via the TGF-β/Smad signaling pathway. In vivo, on postoperative day 28, the mean number of fibroblasts in the rosiglitazone injection group was significantly the lowest, with the least collagen content and connective tissue growth factor. Rosiglitazone effectively controlled human and rabbit fibroblasts in vivo and in vitro. Its subconjunctival application may represent an effective new avenue for the prevention of scarring after glaucoma surgery.

Keywords: fibrosis, glaucoma, macroautophagy, rosiglitazone

Procedia PDF Downloads 251
1191 Application of a Two-Stage Adaptive Neuro-Fuzzy Inference System to Improve Dissolved Gas Analysis Interpretation Techniques

Authors: Kharisma Utomo Mulyodinoto, Suwarno, A. Abu-Siada

Abstract:

Dissolved Gas Analysis (DGA) is an effective technique to detect and predict internal faults in transformers using the gases generated in a transformer oil sample. A number of methods are used to interpret the dissolved gases from a transformer oil sample: the Doernenberg Ratio Method, the IEC (International Electrotechnical Commission) Ratio Method, and the Duval Triangle Method. While the assessment of dissolved gases within transformer oil samples has been standardized over the past two decades, the analysis of the results is not always straightforward, as it depends on personnel expertise more than on mathematical formulas. To overcome this limitation, this paper aims at improving the interpretation of the Doernenberg Ratio Method, the IEC Ratio Method, and the Duval Triangle Method using a two-stage Adaptive Neuro-Fuzzy Inference System (ANFIS). Dissolved gas analysis data from 520 faulty transformers were analyzed to establish the proposed ANFIS model. Results show that the developed ANFIS model is accurate and can standardize the dissolved gas interpretation process with an accuracy higher than 90%.
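
For reference, the three basic gas ratios of the IEC Ratio Method can be computed as below; the fault rule shown is a simplified illustration, not the full IEC code table that the paper's ANFIS stages learn to interpret.

```python
# IEC 60599 basic gas ratios; concentrations are in ppm.
def iec_ratios(h2, ch4, c2h6, c2h4, c2h2):
    eps = 1e-9  # guard against division by zero
    r1 = c2h2 / (c2h4 + eps)   # acetylene / ethylene
    r2 = ch4 / (h2 + eps)      # methane / hydrogen
    r3 = c2h4 / (c2h6 + eps)   # ethylene / ethane
    return r1, r2, r3

r1, r2, r3 = iec_ratios(h2=50, ch4=100, c2h6=30, c2h4=120, c2h2=1)
# Illustrative rule only: low R1 with high R3 suggests a thermal fault.
if r1 < 0.1 and r3 > 1:
    print("indicative of a thermal fault (illustrative rule)")
print(round(r1, 3), round(r2, 3), round(r3, 3))
```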

Keywords: ANFIS, dissolved gas analysis, Doernenberg ratio method, Duval triangular method, IEC ratio method, transformer

Procedia PDF Downloads 134
1190 Assessment and Forecasting of the Impact of Negative Environmental Factors on Public Health

Authors: Nurlan Smagulov, Aiman Konkabayeva, Akerke Sadykova, Arailym Serik

Abstract:

Introduction. Adverse environmental factors do not immediately lead to pathological changes in the body. They can instead promote the growth of pre-pathology, characterized by shifts in the physiological, biochemical, immunological, and other indicators of the body's state. These disorders are unstable, reversible, and indicative of the body's reactions, and they make it possible to objectively judge the internal structure of adaptive body reactions at the level of individual organs and systems. To obtain a stable response of the body to the chronic effects of unfavorable environmental factors of low intensity (compared to factors of the production environment), a period called the «lag time» is needed. Results obtained without considering this factor distort reality and, for the most part, cannot reliably support the main conclusions of any study. A technique is needed that reduces methodological errors and combines mathematical logic, statistical methods, and a medical point of view; this ultimately affects the results obtained and avoids false correlations. Objective. Development of a methodology for assessing and predicting the impact of environmental factors on population health, considering the «lag time». Methods. Research objects: environmental indicators and population morbidity indicators. The database on the environmental state was compiled from the monthly newsletters of Kazhydromet. Data on population morbidity were obtained from regional statistical yearbooks. When processing the statistical data, a time interval (lag) was determined for each «argument-function» pair, that is, the required interval after which the effect of the harmful factor (argument) fully manifests itself in the indicators of the organism's state (function). The lag value was determined from the cross-correlation functions of the arguments (environmental indicators) with the functions (morbidity). Correlation coefficients (r) and their reliability (t), Fisher's criterion (F), and the influence share (R²) of the main factor (argument) on the indicator (function), as a percentage, were calculated. Results. The ecological situation of an industrially developed region has an impact on health indicators, but with some nuances. Fundamentally opposite results were obtained when the mathematical data processing considered the «lag time»: namely, a pronounced correlation was revealed after the two databases (ecology and morbidity) were shifted. For example, the lag period was 4 years for dust concentration and general morbidity, and 3 years for childhood morbidity. These periods accounted for the maximum values of the correlation coefficients and the largest percentage contribution of the influencing factor. Similar results were observed for the concentrations of soot, dioxide, etc. Comprehensive statistical processing using multiple correlation-regression and variance analysis confirms the correctness of the above statement. This method provided an integrated approach to predicting the degree of pollution of the main environmental components and to identifying the most dangerous combinations of concentrations of the leading negative environmental factors. Conclusion. The method of assessing the «environment-public health» system considering the «lag time» is qualitatively different from the traditional method (without considering the «lag time»). The results differ significantly and are more amenable to a logical explanation of the obtained dependences. The method allows the quantitative and qualitative dependences within the «environment-public health» system to be presented in a different way.
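
The lag-selection step can be sketched as follows: shift the two series against each other and keep the lag that maximizes the absolute correlation, with R² = r² as the influence share. The series below are synthetic, with a built-in 4-year lag mimicking the dust example.

```python
import numpy as np

def best_lag(pollutant, morbidity, max_lag=6):
    lags, corrs = [], []
    for lag in range(max_lag + 1):
        x = pollutant[:len(pollutant) - lag]  # pollutant leads morbidity by `lag`
        y = morbidity[lag:]
        corrs.append(np.corrcoef(x, y)[0, 1])
        lags.append(lag)
    k = int(np.argmax(np.abs(corrs)))
    return lags[k], corrs[k]

years = 20
rng = np.random.default_rng(2)
dust = rng.normal(50, 10, years)
morbidity = np.r_[rng.normal(300, 20, 4), 2.5 * dust[:-4]] + rng.normal(0, 5, years)
lag, r = best_lag(dust, morbidity)
print(f"lag = {lag} years, r = {r:.2f}, influence share R^2 = {r**2:.2f}")
```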

Keywords: ecology, morbidity, population, lag time

Procedia PDF Downloads 64
1189 Performance Evaluation of Microcontroller-Based Fuzzy Controller for Fruit Drying System

Authors: Salisu Umar

Abstract:

Fruits are a seasonal crop and spoil quickly, so they are dried to preserve them for a long period. The natural drying process requires more time, and the investment in space and infrastructure is large and cannot be afforded by a middle-class farmer. Therefore, there is a need for a comparatively small unit with reduced drying times that a middle-class farmer can afford. A controlled environment suitable for fruit drying is developed within a closed chamber in a three-step process. Firstly, infrared light is used internally to preheat the fruit, speedily removing the water content inside the fruit for fast drying. Secondly, hot air of a specified temperature is blown inside the chamber to maintain the humidity below a specified level and to exhaust the humid air from the chamber. Thirdly, the microcontroller disconnects the power to the chamber and idles once the weight of the fruit is reduced to a known fraction of its original weight; this activates a buzzer for a duration of ten seconds to indicate the end of the drying process. The results obtained indicate that the system significantly reduces the drying time without affecting the quality of the fruits compared with existing dryers.
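
A hypothetical sketch of the three-step control logic is given below; the target weight fraction, humidity limit, preheat duration, and I/O functions are all assumptions standing in for the actual microcontroller firmware.

```python
import time

TARGET_FRACTION = 0.25   # assumed final-to-initial weight ratio
HUMIDITY_LIMIT = 40.0    # assumed % RH threshold inside the chamber
PREHEAT_S = 600          # assumed infrared preheat duration, seconds

def dry(read_weight, read_humidity, ir_lamp, heater, fan, buzzer):
    w0 = read_weight()
    ir_lamp(True); time.sleep(PREHEAT_S); ir_lamp(False)   # step 1: IR preheat
    while read_weight() > TARGET_FRACTION * w0:            # step 2: hot-air drying
        heater(read_humidity() > HUMIDITY_LIMIT)           # heat only when too humid
        fan(True)                                          # exhaust humid air
        time.sleep(5)
    heater(False); fan(False)                              # step 3: idle
    buzzer(True); time.sleep(10); buzzer(False)            # 10 s end-of-cycle buzzer
```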

Keywords: fruit, fuzzy controller, microcontroller, temperature, weight and humidity

Procedia PDF Downloads 427
1188 Characterising Stable Model by Extended Labelled Dependency Graph

Authors: Asraful Islam

Abstract:

The extended dependency graph (EDG) is a state-of-the-art isomorphic graph for representing normal logic programs (NLPs) that can characterize the consistency of an NLP by graph analysis. To construct the vertices and arcs of an EDG, additional renaming atoms and rules beyond those the given program provides are used, resulting in higher space complexity compared to the corresponding traditional dependency graph (TDG). In this article, we propose an extended labelled dependency graph (ELDG) to represent an NLP that shares an equal number of nodes and arcs with the TDG, and we prove that it is isomorphic to the domain program. The numbers of nodes and arcs used in the underlying dependency graphs are formulated to compare the space complexity. Results show that the ELDG uses less memory to store nodes, arcs, and cycles compared to the EDG. To exhibit the desirability of the ELDG, firstly, the stable models of the kernel form of an NLP are characterized by the admissible coloring of the ELDG; secondly, a relation between the stable models of a kernel program and the handles of the minimal odd cycles appearing in the corresponding ELDG is established; thirdly, to the best of our knowledge, for the first time an inverse transformation from a dependency graph to the represented NLP w.r.t. the ELDG is defined, which enables transferring analytical results from the graph to the program straightforwardly.

Keywords: normal logic program, isomorphism of graph, extended labelled dependency graph, inverse graph transformation, graph colouring

Procedia PDF Downloads 196
1187 Future of Nanotechnology in Digital Circuit Design

Authors: Pejman Hosseinioun, Abolghasem Ghasempour, Elham Gholami, Hamed Sarbazi

Abstract:

Considering the development of global semiconductor technology, it is anticipated that devices such as resonant tunneling diodes and transistors (RTD/RTT), single-electron transistors (SET), and quantum cellular automata (QCA) will substitute for CMOS (Complementary Metal-Oxide-Semiconductor) devices in many applications. Unfortunately, these new technologies cannot implement common Boolean logic efficiently and are only appropriate for threshold logic, so there is no doubt that, with the development of these new devices, it will be necessary to find new design technologies compatible with them. Resonant tunneling devices, and the circuit designs with enhanced computing abilities they enable, are candidates for realizing nanoscale circuits in the future. Quantum cellular automata (QCA) are also emerging nanotechnological devices for electrical circuits; their advantages, such as higher speed, smaller dimensions, and lower power consumption, are of great interest. QCA devices are basic building blocks for manufacturing gates, fuses, and memories. Given the physical complexity at the nanoscale, circuit designers can focus on logical and structural design to decrease complexity in the design process. Moreover, single-electron technology (SET) is another noteworthy device family considered in nanotechnology. This article is a survey of the future of nanotechnology in digital circuit design.

Keywords: nanotechnology, resonant tunneling transistors, quantum cellular automata, semiconductor

Procedia PDF Downloads 249
1186 An Ontology-Based Framework to Support Asset Integrity Modeling: Case Study of Offshore Riser Integrity

Authors: Mohammad Sheikhalishahi, Vahid Ebrahimipour, Amir Hossein Radman-Kian

Abstract:

This paper proposes an ontology framework for knowledge modeling and representation of the equipment integrity process in a typical oil and gas production plant. Our aim is to construct a knowledge model that facilitates the translation, interpretation, and conversion of human-readable integrity interpretations into computer-readable representations. The framework provides a function structure related to fault propagation, using ISO 14224 and ISO 15926 with OWL-Lite/Resource Description Framework (RDF), to obtain a generic system-level model of asset integrity that can be utilized in the integrity engineering process during the equipment life cycle. It employs the standard terminology developed by ISO 15926 and ISO 14224 to map textual descriptions of equipment failure and then converts them to causality-driven logic by semantic interpretation and computer-based representation using OWL-Lite/RDF. The framework was applied to an offshore gas riser. The result shows that the approach can cross-link failure-related integrity terms and domain-specific logic to obtain a representation structure of equipment integrity with causality inference based on semantic extraction of the inspection report context.

Keywords: asset integrity modeling, interoperability, OWL, RDF/XML

Procedia PDF Downloads 168
1185 Determinants of Diarrhoea Prevalence Variations in Mountainous Informal Settlements of Kigali City, Rwanda

Authors: Dieudonne Uwizeye

Abstract:

Introduction: Diarrhoea is one of the major causes of morbidity and mortality among communities living in the urban informal settlements of developing countries. It is assumed that a mountainous environment introduces variations in this burden among residents of the same settlements. Design and Objective: A cross-sectional study was done in Kigali to explore the effect of mountainous informal settlements on diarrhoea risk variations. Data were collected from 1,152 households through a household survey and transect walks to observe the status of sanitation. The outcome variable was the incidence of diarrhoea among household members of any age. The study used the most knowledgeable person in the household as the main respondent. Mostly this was the woman of the house, as she was most likely to know the health status of every household member, playing various roles: mother, wife, and head of the household, among others. The analysis used cross-tabulation and logistic regression. Results: The results suggest that the risks of diarrhoea vary depending on home location in the settlements; diarrhoea risk increased as the distance from the road increased. The logistic regression analysis indicates adjusted odds ratios of 2.97 (95% confidence interval: 1.35-6.55) and 3.50 (95% confidence interval: 1.61-7.60) for levels two and three, respectively, compared with level one. The status of sanitation within and around homes was also significantly associated with an increase in diarrhoea. Equally, stable households were less likely to have diarrhoea: the logistic regression analysis indicated an adjusted odds ratio of 0.45 (95% confidence interval: 0.25-0.81). However, the study did not find evidence for a significant association between diarrhoea risks and household socioeconomic status in the multivariable model; it is assumed that environmental factors in mountainous settings prevailed. Households using the available public water sources were more likely to have diarrhoea. Recommendation: The study recommends the provision and extension of infrastructure for improved water, drainage, sanitation, and waste management facilities. Equally, studies should be done to identify the level of contamination and the potential origin of contaminants in the water sources in the valleys, to adequately control the risks of diarrhoea in mountainous urban settings.

Keywords: urbanisation, diarrhoea risk, mountainous environment, urban informal settlements in Rwanda

Procedia PDF Downloads 156
1184 Improvement of Process Competitiveness Using Intelligent Reference Models

Authors: Julio Macedo

Abstract:

Several methodologies are now available for conceiving the improvements of a process so that it becomes competitive, for example total quality management, process reengineering, six sigma, and the define-measure-analyze-improve-control (DMAIC) method. These improvements are of different natures and can be external to the process, which is represented by an optimization model or a discrete simulation model. In addition, the process stakeholders are several and have different desired performances for the process. Hence, the methodologies above lack a tool to aid in the conception of the required improvements. In order to fill this void, we suggest the use of intelligent reference models. A reference model is a set of qualitative differential equations and an objective function that minimizes the gap between the current and the desired performance indexes of the process. The reference models are intelligent: when they receive the current state of the problematic process and the desired performance indexes, they generate the required improvements for the problematic process. The reference models are fuzzy cognitive maps augmented with an objective function and trained using the improvements implemented by high-performance firms. Experiments with a set of students show that the reference models allow them to conceive more improvements than students who do not use these models.
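
A minimal sketch of the underlying fuzzy-cognitive-map iteration is shown below, with made-up concepts and weights; the paper's reference models additionally attach an objective function and train the weights on improvements from high-performance firms.

```python
import numpy as np

concepts = ["defect rate", "training", "automation", "competitiveness"]
W = np.array([             # W[j, i]: influence of concept j on concept i
    [ 0.0, 0.0, 0.0, -0.7],
    [ 0.0, 0.0, 0.3,  0.4],
    [-0.5, 0.0, 0.0,  0.5],
    [ 0.0, 0.0, 0.0,  0.0],
])
sigmoid = lambda x: 1 / (1 + np.exp(-x))

a = np.array([0.8, 0.6, 0.2, 0.3])  # current activation state of the process
for _ in range(20):                  # iterate A <- f(A + A @ W) to a fixed point
    a = sigmoid(a + a @ W)
print(dict(zip(concepts, a.round(2))))
```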

Keywords: continuous improvement, fuzzy cognitive maps, process competitiveness, qualitative simulation, system dynamics

Procedia PDF Downloads 71
1183 Material Handling Equipment Selection Using Fuzzy AHP Approach

Authors: Priyanka Verma, Vijaya Dixit, Rishabh Bajpai

Abstract:

This research paper is aimed at selecting the appropriate material handling equipment among given choices so that the level of automation in material handling can be enhanced. This work is a practical case scenario of material handling systems in a consumer electronic appliances manufacturing organization. The choices of material handling equipment among which the decision has to be made are Automated Guided Vehicles (AGVs), Autonomous Mobile Robots (AMRs), Overhead Conveyors (OCs), and Battery Operated Trucks/Vehicles (BOTs). A certain level of automation needs to be attained in order to reduce human intervention in the organization, and this requirement can be met by the material handling equipment mentioned above. The main motive for selecting this equipment for study was the corporate financial strategy of investment and the return obtained through that investment within a stipulated time frame. Since low-cost automation of material handling had to be achieved, this equipment was selected: the investment in each unit is less than 20 lakh rupees (INR), and the recovery period is less than five years. The fuzzy analytic hierarchy process (FAHP) is applied here for selecting the equipment; the four choices are evaluated on the basis of four major criteria and 13 sub-criteria and are prioritized on the basis of the weights obtained. The FAHP used here makes use of triangular fuzzy numbers (TFNs). FAHP remedies the inability of traditional AHP to deal with the subjectiveness and imprecision of the pairwise comparison process. The range of values for general rating purposes for all decision-making parameters is kept between 0 and 1 on the basis of expert opinions captured on the shop floor; these experts were familiar with the operating environment and shop floor activity control. Instead of generating exact values, the FAHP generates ranges of values to accommodate the uncertainty in the decision-making process. The four major criteria selected for the evaluation of the available material handling equipment are materials, technical capabilities, cost, and other features. The thirteen sub-criteria listed under these four major criteria are weighing capacity, load per hour, material compatibility, capital cost, operating cost, maintenance cost, speed, distance moved, space required, frequency of trips, control required, safety, and reliability issues. The key finding is that, among the four major criteria, cost emerged as the most important and is one of the key aspects on which material handling equipment selection is based. On further evaluating the choices of equipment for each sub-criterion, it is found that the AGV scores the highest weight in most of the sub-criteria. The complete analysis shows that the AGV is the best material handling equipment, suiting all the decision criteria selected in the FAHP, and that it is therefore beneficial for the organization to carry out automated material handling in the facility using AGVs.
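
A sketch of the TFN-based criteria weighting step is shown below, using Buckley's geometric-mean method with centroid defuzzification; the 4x4 pairwise judgments (materials, technical capabilities, cost, other features) are illustrative, not the study's expert data, and the paper's exact FAHP variant may differ.

```python
import numpy as np

# Upper-triangle pairwise comparisons as triangular fuzzy numbers (l, m, u);
# reciprocal entries are (1/u, 1/m, 1/l), diagonal is (1, 1, 1).
tfn = {
    (0, 1): (1, 2, 3), (0, 2): (1/4, 1/3, 1/2), (0, 3): (1, 2, 3),
    (1, 2): (1/4, 1/3, 1/2), (1, 3): (1, 1, 2), (2, 3): (2, 3, 4),
}
n = 4
M = np.ones((n, n, 3))
for (i, j), (l, m, u) in tfn.items():
    M[i, j] = (l, m, u)
    M[j, i] = (1/u, 1/m, 1/l)

geo = np.prod(M, axis=1) ** (1 / n)     # fuzzy geometric mean per criterion
total = geo.sum(axis=0)                 # (sum of l, sum of m, sum of u)
fuzzy_w = geo / total[::-1]             # (l/sum_u, m/sum_m, u/sum_l)
crisp = fuzzy_w.mean(axis=1)            # centroid defuzzification
weights = crisp / crisp.sum()
print(weights.round(3))                 # cost carries the largest weight here
```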

Keywords: fuzzy analytic hierarchy process (FAHP), material handling equipment, subjectiveness, triangular fuzzy number (TFN)

Procedia PDF Downloads 422
1182 Formal Group Laws and Toposes in Gauge Theory

Authors: Patrascu Andrei Tudor

Abstract:

One of the main problems in high energy physics is the fact that we do not have a complete understanding of the interaction between local and global effects in gauge theory. This has an increasing impact on our ability to access the non-perturbative regime of most of our theories. Our theories, while based on gauge groups considered to be simple or semi-simple and connected, are expected to be described by their simple local linear approximations, namely the Lie algebras. However, higher homotopy properties resulting in gauge anomalies appear frequently in theories of physical interest. Our assumption that the groups we deal with are simple and simply connected is probably not suitable, and ways to go beyond such assumptions, particularly in gauge theories, where the Lie algebra linear approximation is prevalent, are not known. We approach this problem from two directions: on one side, we explain the potential role of formal group laws in describing certain higher homotopical properties and their interference with local or perturbative effects; on the other side, we employ a categorical approach leading to synthetic theory and a new way of looking at gauge theories. The topos approach is based on a geometry whose fundamental logic is intuitionistic, and hence the ‘tertium non datur’ principle is abandoned. This has a remarkable impact on understanding conformal symmetry and its anomalies in string theory in various dimensions.
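
For reference, the first ingredient has a compact textbook definition, stated here for the one-dimensional commutative case; this is standard material, not specific to the paper.

```latex
% A one-dimensional formal group law over a commutative ring R is a
% power series F(x, y) \in R[[x, y]] satisfying
\begin{align*}
  F(x, 0) &= x, \qquad F(0, y) = y && \text{(identity)} \\
  F(F(x, y), z) &= F(x, F(y, z)) && \text{(associativity)}
\end{align*}
% Basic examples: the additive law F_a(x, y) = x + y and the multiplicative
% law F_m(x, y) = x + y + xy; the higher-order terms beyond x + y encode
% data past the linear (Lie-algebra) approximation discussed above.
```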

Keywords: gauge theory, formal group laws, topos theory, conformal symmetry

Procedia PDF Downloads 0
1181 Modified Fuzzy Delphi Method to Incorporate Healthcare Stakeholders’ Perspectives in Selecting Quality Improvement Projects’ Criteria

Authors: Alia Aldarmaki, Ahmad Elshennawy

Abstract:

There is a global shift in healthcare systems toward engaging different stakeholders in selecting quality improvement initiatives and incorporating their preferences to improve healthcare efficiency and outcomes. Although experts bring scientific knowledge based on the scientific model and their personal experience, other stakeholders can bring new insights and information into the decision-making process. This study explores the impact of incorporating different stakeholders' preferences when identifying the most significant criteria that should be considered in healthcare for selecting improvement projects. A framework based on a modified Fuzzy Delphi Method (FDM) was built. In addition to subject matter experts, groups of doctors/physicians, nurses, administrators, and managers contribute to the selection process. The research identifies potential criteria for evaluating projects in healthcare and then utilizes FDM to capture expert knowledge. The first round of FDM is intended to validate the identified list of criteria with the experts, which includes collecting additional criteria that the literature might have overlooked. When an acceptable level of consensus has been reached, a second round is conducted to obtain the experts' and the other related stakeholders' opinions on the appropriate weight of each criterion's importance, using linguistic variables. The FDM analysis eliminates or retains criteria to produce a final list of the critical criteria for selecting improvement projects in healthcare. Finally, reliability and validity were investigated using Cronbach's alpha and factor analysis, respectively. Two case studies were carried out in a public hospital in the United Arab Emirates to test the framework. Both cases demonstrate that, even though there were common criteria between the experts and the stakeholders, the stakeholders' perceptions still bring additional critical criteria into the evaluation process, which can impact the outcomes. The experts selected criteria related to strategic and managerial aspects, while the other participants preferred criteria related to social aspects such as health and safety and patient satisfaction. The health and safety criterion had the highest importance weight in both cases. The analysis showed that the Cronbach's alpha value is 0.977 and that all criteria have factor loadings greater than 0.3. In conclusion, the inclusion of stakeholders' perspectives is intended to enhance stakeholder engagement, improve transparency throughout the decision process, and support robust decisions.
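
A sketch of one FDM screening round is given below: linguistic ratings are mapped to triangular fuzzy numbers (TFNs), aggregated across respondents, defuzzified, and retained only if they clear a consensus threshold. The scale, aggregation rule, and threshold are common choices from the FDM literature, not the values used in the study.

```python
import numpy as np

SCALE = {"very low": (0.0, 0.0, 0.25), "low": (0.0, 0.25, 0.5),
         "medium": (0.25, 0.5, 0.75), "high": (0.5, 0.75, 1.0),
         "very high": (0.75, 1.0, 1.0)}
THRESHOLD = 0.7  # assumed retention threshold

def screen(criterion, ratings):
    t = np.array([SCALE[r] for r in ratings])
    # aggregate: (min of lowers, geometric mean of modes, max of uppers)
    agg = (t[:, 0].min(), np.exp(np.log(t[:, 1] + 1e-9).mean()), t[:, 2].max())
    score = sum(agg) / 3  # centroid defuzzification
    verdict = "retain" if score >= THRESHOLD else "eliminate"
    print(f"{criterion}: score={score:.2f} -> {verdict}")

screen("health and safety", ["very high", "high", "very high", "high"])
screen("strategic alignment", ["medium", "high", "low", "medium"])
```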

Keywords: Fuzzy Delphi Method, fuzzy number, healthcare, stakeholders

Procedia PDF Downloads 108
1180 Seismic Microzonation Analysis for Damage Mapping of the 2006 Yogyakarta Earthquake, Indonesia

Authors: Fathul Mubin, Budi E. Nurcahya

Abstract:

In 2006, a large earthquake occurred in the province of Yogyakarta, causing considerable damage. This is the basis of the need to investigate the seismic vulnerability index in and around the earthquake zone; such research is called microzonation of earthquake hazard. This research was conducted at the site and surroundings of Prambanan Temple, including homes and civil buildings, because the 2006 earthquake damaged the temples in the Prambanan temple complex and its surroundings. In this research, data were collected for 60 minutes using three-component seismograph measurements at 165 points with a spacing of 1000 meters. The recorded time series were analyzed using the spectral ratio method known as the Horizontal to Vertical Spectral Ratio (HVSR). The results of this analysis, the dominant frequency (Fg) and the maximum amplification factor (Ag), are used to obtain the seismic vulnerability index. The results show that the dominant frequency ranges from 0.5 to 30 Hz and the amplification factor from 0.5 to 9; the seismic vulnerability index ranges from 0.1 to 50. The distribution maps of the seismic vulnerability index appear consistent with the observed building damage. For further research, surveys should extend to the east (Klaten) and south (Bantul, DIY) to produce full distribution maps of the seismic vulnerability index.
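
The per-station computation can be sketched as below, with Nakamura's vulnerability index Kg = Ag²/Fg; the signal is synthetic, and the spectral smoothing and windowing used in practice are omitted.

```python
import numpy as np

fs = 100.0                                  # sampling rate, Hz (assumed)
t = np.arange(0, 600, 1 / fs)               # 10 minutes of synthetic microtremor
rng = np.random.default_rng(3)
ns = rng.normal(size=t.size) + 3 * np.sin(2 * np.pi * 1.5 * t)  # resonance ~1.5 Hz
ew = rng.normal(size=t.size) + 3 * np.sin(2 * np.pi * 1.5 * t)
ud = rng.normal(size=t.size)                # vertical component: noise only

freqs = np.fft.rfftfreq(t.size, 1 / fs)
spec = lambda x: np.abs(np.fft.rfft(x))
h = np.sqrt(spec(ns) * spec(ew))            # geometric mean of horizontals
hvsr = h / (spec(ud) + 1e-9)

band = (freqs > 0.5) & (freqs < 30)         # the study's frequency range
k = np.argmax(hvsr[band])
fg, ag = freqs[band][k], hvsr[band][k]
print(f"Fg = {fg:.2f} Hz, Ag = {ag:.1f}, Kg = {ag**2 / fg:.1f}")
```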

Keywords: amplification factor, dominant frequency, microzonation analysis, seismic vulnerability index

Procedia PDF Downloads 182
1179 Grammar as a Logic of Labeling: A Computer Model

Authors: Jacques Lamarche, Juhani Dickinson

Abstract:

This paper introduces a computational model of a Grammar as a Logic of Labeling (GLL), where the lexical primitives of morphosyntax are phonological matrixes, the forms of words, understood as labels that apply to realities (or targets) assumed to be outside of grammar altogether. The hypothesis is that, even though a lexical label relates to its target arbitrarily, this label in a complex (constituent) label is part of a labeling pattern which, depending on its value (i.e., N, V, Adj, etc.), imposes language-specific restrictions on what it targets outside of grammar (in the world/semantics or in cognitive knowledge). Lexical forms categorized as nouns, verbs, adjectives, etc., are effectively targets of labeling patterns in use. The paper illustrates GLL through a computer model of basic patterns in English NPs. A constituent label is a binary object that encodes: i) alignment of input forms, so that labels occurring at different points in time are understood as applying at once; ii) endocentric structuring: every grammatical constituent has a head label that determines the target of the constituent and a limiter label (the non-head) that restricts this target. The N and A values are restricted to the limiter label, the two differing in terms of alignment with a head. Consider the head-initial DP ‘the dog’: the label ‘dog’ gets an N value because it is a limiter that is evenly aligned with the head ‘the’, restricting the application of the DP. Adapting a traditional analysis of ‘the’ to GLL (apply the label to something familiar), the DP targets and identifies one reality familiar to the participants by applying to it the label ‘dog’ (singular). Consider next the DP ‘the large dog’: ‘large dog’ is nominal by even alignment with ‘the’, as before, and since ‘dog’ is the head of the (head-final) ‘large dog’, it is also nominal. The label ‘large’, however, is adjectival by narrow alignment with the head ‘dog’: it doesn’t target the head but a property of what ‘dog’ applies to (a property or attribute value). In other words, the internal composition of constituents determines whether a form targets a property or a reality: ‘large’ and ‘dog’ happen to be valid targets to realize this constituent. In the presentation, the computer model of the analysis derives the eight possible sequences of grammatical values with three labels after the determiner (the x y z): 1- D [ N [ N N ]]; 2- D [ A [ N N ]]; 3- D [ N [ A N ]]; 4- D [ A [ A N ]]; 5- D [ [ N N ] N ]; 6- D [ [ A N ] N ]; 7- D [ [ N A ] N ]; 8- D [ [ Adv A ] N ]. This approach suggests that a computer model of these grammatical patterns could be used to construct ontologies/knowledge using speakers’ judgments about the validity of lexical meaning in grammatical patterns.

Keywords: syntactic theory, computational linguistics, logic and grammar, semantics, knowledge and grammar

Procedia PDF Downloads 12
1178 Effect of Tissue Preservation Chemicals on Decomposition in Different Soil Types

Authors: Onyekachi Ogbonnaya Iroanya, Taiye Abdullahi Gegele, Frank Tochukwu Egwuatu

Abstract:

Introduction: Forensic taphonomy is a multifaceted area that incorporates decomposition and chemical and biological cadaver exposure into post-mortem event chronology and reconstruction to predict the post-mortem interval (PMI). The aim of this study was to evaluate the integrity of DNA extracted from the remains of embalmed, decomposed Sus domesticus tissues buried in different soil types. Method: A total of 12 limbs of Sus domesticus weighing between 0.7 and 1.4 kg were used. The samples across the groups were treated with 10% formaldehyde, absolute methanol, or 50% pine oil for 24 hours before burial, except the control samples, which were buried immediately. All samples were buried in shallow simulated clay, sandy, and loamy soil graves for 12 months. The DNA of each sample was extracted and quantified with a Nanodrop spectrophotometer (6305 JENWAY spectrometer). The rate of decomposition was examined through modified qualitative decomposition analysis. Extracted DNA was amplified through PCR, and bands were visualized via gel electrophoresis. A biochemical enzyme assay was done on each burial grave soil. Result: The limbs in all burial groups lost weight over the burial period. There was a significant increase in the soil urease level for the samples preserved in formaldehyde across the three soil-type groups (p≤0.01). Also, the control grave soils recorded significantly higher alkaline phosphatase, dehydrogenase, and calcium carbonate values compared to the experimental grave soils (p≤0.01). The experimental samples showed a significant decrease in DNA concentration and purity when compared to the control groups (p≤0.01). The findings of the soil biochemical analysis showed that the embalming treatment altered the relationship between organic matter decomposition and soil biochemical properties, as observed in the fluctuations recorded in the soil biochemical parameters. The PCR-amplified DNA showed no bands on the gel electrophoresis plates. Conclusion: In criminal investigations, factors such as the burial grave soil, the grave soil's biochemical properties, and antemortem exposure to embalming chemicals should be considered in post-mortem interval (PMI) determination.

Keywords: forensic taphonomy, post-mortem interval (PMI), embalmment, decomposition, grave soil

Procedia PDF Downloads 145
1177 Value-Based Argumentation Frameworks and Judicial Moral Reasoning

Authors: Sonia Anand Knowlton

Abstract:

As Artificial Intelligence becomes increasingly integrated into virtually every area of life, the need for, and interest in, logically formalizing the law and judicial reasoning is growing tremendously. The study of argumentation frameworks (AFs) provides promise in this respect. AFs provide a way of structuring human reasoning using a formal system of non-monotonic logic. P.M. Dung first introduced this framework and demonstrated that certain arguments must prevail and certain arguments must perish based on whether they are logically “attacked” by other arguments. Dung labelled the set of prevailing arguments the “preferred extension” of the given argumentation framework. Trevor Bench-Capon's Value-based Argumentation Frameworks (VAFs) extended Dung's AF system by allowing arguments to derive their force from the promotion of “preferred” values. In VAF systems, the success of an attack from argument A on argument B (i.e., the triumph of argument A) requires that argument B does not promote a value that is preferred to the value promoted by argument A. There has been thorough discussion of the application of VAFs to the law within the computer science literature, mainly demonstrating that legal cases can be effectively mapped out using VAFs. This article analyses VAFs from a jurisprudential standpoint to provide a philosophical and theoretical analysis of what VAFs tell the legal community about judicial reasoning, specifically distinguishing between legal and moral reasoning. It highlights the limitations of using VAFs to account for judicial moral reasoning in theory and in practice.
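
The defeat condition and the resulting preferred extensions can be sketched in a few lines; the arguments, attacks, and value ranking below are a made-up toy example, with the extension computed by brute force (adequate only for small frameworks).

```python
from itertools import combinations

arguments = {"A": "fairness", "B": "efficiency", "C": "fairness"}
attacks = [("A", "B"), ("B", "C")]
value_rank = {"fairness": 2, "efficiency": 1}  # the audience's preference

def defeats(a, b):
    # Bench-Capon: an attack succeeds unless the target's value outranks the attacker's.
    return (a, b) in attacks and value_rank[arguments[b]] <= value_rank[arguments[a]]

def conflict_free(s):
    return not any(defeats(a, b) for a in s for b in s)

def acceptable(arg, s):
    # every defeater of arg is itself defeated by some member of s
    return all(any(defeats(d, a) for d in s) for a in arguments if defeats(a, arg))

def admissible(s):
    return conflict_free(s) and all(acceptable(a, s) for a in s)

subsets = [set(c) for r in range(len(arguments) + 1)
           for c in combinations(arguments, r)]
adm = [s for s in subsets if admissible(s)]
preferred = [s for s in adm if not any(s < t for t in adm)]
print(preferred)  # B's attack on C fails, since C promotes the preferred value
```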

Keywords: nonmonotonic logic, legal formalization, computer science, artificial intelligence, morality

Procedia PDF Downloads 59
1176 Effect of Hybrid Fibers on Mechanical Properties in Autoclaved Aerated Concrete

Authors: B. Vijay Antony Raj, Umarani Gunasekaran, R. Thiru Kumara Raja Vallaban

Abstract:

Fibrous autoclaved aerated concrete (FAAC) is concrete containing fibrous material, which helps to increase its structural integrity compared with conventional autoclaved aerated concrete (CAAC). The short, discrete fibers are uniformly distributed and randomly oriented, which enhances the bond strength within the aerated concrete matrix. Conventional red-clay bricks create a larger environmental impact due to red soil depletion, and they also consume a large amount of construction time, whereas AAC blocks are larger in size, lighter in weight, and environmentally friendly, and hence a viable replacement for red-clay bricks. Internal micro-cracks and corner cracks are the only disadvantages of conventional autoclaved aerated concrete, and to resolve this particular issue it is preferable to make use of fibers. These fibers are bonded together within the matrix, and they enable the aerated concrete to withstand considerable stresses, especially during the post-cracking stage. Hence, FAAC has the capability of enhancing the mechanical properties and energy absorption capacity of CAAC. In this research work, individual fibers such as glass, nylon, polyester, and polypropylene are used; they generally reduce the brittle fracture of AAC. To study the fibers' surface topography and composition, SEM analysis is performed; EDAX mapping is then carried out to determine the composition of a specimen as a whole as well as the composition of its individual components. An experimental approach was then followed to determine the mechanical properties of autoclaved aerated concrete with hybrid (multiple) fibers at various dosages (0.5%, 1%, 1.5%), with a curing temperature of 180-200°C maintained. As an analytical part, the experimental results are compared with fuzzy logic predictions using MATLAB.

Keywords: fibrous AAC, crack control, energy absorption, mechanical properties, SEM, EDAX, MATLAB

Procedia PDF Downloads 255
1175 Near-Miss Deep Learning Approach for Neuro-Fuzzy Risk Assessment in Pipelines

Authors: Alexander Guzman Urbina, Atsushi Aoyama

Abstract:

The sustainability of the traditional technologies employed in energy and chemical infrastructure poses a big challenge for our society. When making decisions related to the safety of industrial infrastructure, the values of accidental risk are becoming relevant points of discussion. However, the challenge is the reliability of the models employed to obtain the risk data; such models usually involve a large number of variables and large amounts of uncertainty. The most efficient techniques to overcome those problems are built using Artificial Intelligence (AI), and more specifically hybrid systems such as neuro-fuzzy algorithms. Therefore, this paper aims to introduce a hybrid algorithm for risk assessment trained using near-miss accident data. Besides the sustainability challenge mentioned above, the adaptation of these technologies to the effects of climate change in sensitive environments represents a critical concern for safety and risk management. Regarding this issue, the social consequences of catastrophic risks are increasing rapidly, due mainly to the concentration of people and energy infrastructure in hazard-prone areas, aggravated by a lack of knowledge about the risks. In addition to these social consequences, and considering the industrial sector as critical infrastructure due to its large impact on the economy in case of a failure, industrial safety has become a critical issue for today's society. Regarding this safety concern, pipeline operators and regulators have been performing risk assessments in attempts to accurately evaluate the probabilities of failure of the infrastructure and the consequences associated with those failures. However, estimating accidental risks in critical infrastructure involves substantial effort and cost due to the number of variables involved, the complexity, and the lack of information. Therefore, this paper aims to introduce a well-trained algorithm for risk assessment using deep learning, capable of dealing efficiently with the complexity and uncertainty. The advantage of deep learning on near-miss accident data is that it can be employed in risk assessment as an efficient engineering tool to treat the uncertainty of risk values in complex environments. The basic idea of using a near-miss deep learning approach for neuro-fuzzy risk assessment in pipelines is to improve the validity of the risk values by learning from near-miss accidents and imitating the human expertise of scoring risks and setting tolerance levels. In summary, the method of deep learning for neuro-fuzzy risk assessment involves a regression analysis called the group method of data handling (GMDH), which consists of determining the optimal configuration of the risk assessment model and its parameters employing polynomial theory.
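
The building block named at the end, a GMDH partial description, is the quadratic Ivakhnenko polynomial fitted by least squares; the sketch below fits one such unit on synthetic stand-in data rather than real near-miss records. A full GMDH network grows layers of these units and keeps the best by an external (validation) criterion.

```python
import numpy as np

rng = np.random.default_rng(4)
x1 = rng.uniform(0, 1, 200)  # e.g., normalized corrosion indicator (made up)
x2 = rng.uniform(0, 1, 200)  # e.g., normalized operating pressure (made up)
y = 0.2 + 0.5*x1 + 0.3*x2 + 0.4*x1*x2 + rng.normal(0, 0.02, 200)  # "risk score"

# Ivakhnenko polynomial: y = a0 + a1*x1 + a2*x2 + a3*x1*x2 + a4*x1^2 + a5*x2^2
X = np.column_stack([np.ones_like(x1), x1, x2, x1*x2, x1**2, x2**2])
a, *_ = np.linalg.lstsq(X, y, rcond=None)
pred = X @ a
print("coefficients:", a.round(3))
print("RMSE:", np.sqrt(np.mean((y - pred) ** 2)).round(4))
```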

Keywords: deep learning, risk assessment, neuro fuzzy, pipelines

Procedia PDF Downloads 279
1174 Fuzzy Optimization for Identifying Anticancer Targets in Genome-Scale Metabolic Models of Colon Cancer

Authors: Feng-Sheng Wang, Chao-Ting Cheng

Abstract:

Developing a drug from conception to launch is costly and time-consuming. Computer-aided methods can reduce research costs and accelerate the development process during the early drug discovery and development stages. This study developed a fuzzy multi-objective hierarchical optimization framework for identifying potential anticancer targets in a metabolic model. First, RNA-seq expression data of colorectal cancer samples and their healthy counterparts were used to reconstruct tissue-specific genome-scale metabolic models. The aim of the optimization framework was to identify anticancer targets that lead to cancer cell death and to evaluate the metabolic flux perturbations in normal cells caused by the cancer treatment. Four objectives were established in the optimization framework to evaluate the mortality of cancer cells under treatment and to minimize the side effects causing toxicity-induced tumorigenesis in normal cells, as well as to favor smaller metabolic perturbations. Through fuzzy set theory, the multi-objective optimization problem was converted into a trilevel maximizing decision-making (MDM) problem. Nested hybrid differential evolution was applied to solve the trilevel MDM problem, using two nutrient media, to identify anticancer targets in the genome-scale metabolic model of colorectal cancer. Using Dulbecco's Modified Eagle Medium (DMEM), the computational results reveal that the identified anticancer targets were mostly involved in cholesterol biosynthesis, pyrimidine and purine metabolism, the glycerophospholipid biosynthetic pathway, and the sphingolipid pathway. However, using Ham's medium, the genes involved in cholesterol biosynthesis were unidentifiable. A comparison of the uptake reactions for DMEM and Ham's medium revealed that no cholesterol uptake reaction was included in DMEM. Two additional media, i.e., DMEM with a cholesterol uptake reaction included and Ham's medium with it excluded, were used to investigate the relationship of tumor cell growth with nutrient components and anticancer target genes. The genes involved in cholesterol biosynthesis were revealed to be identifiable if no cholesterol uptake reaction was induced when the cells were in the culture medium; however, they became unidentifiable if such a reaction was induced.

Keywords: cancer metabolism, genome-scale metabolic model, constraint-based model, multilevel optimization, fuzzy optimization, hybrid differential evolution

Procedia PDF Downloads 63
1173 Process Optimization for Albanian Crude Oil Characterization

Authors: Xhaklina Cani, Ilirjan Malollari, Ismet Beqiraj, Lorina Lici

Abstract:

Oil characterization is an essential step in the design, simulation, and optimization of refining facilities. To achieve optimal crude selection and processing decisions, a refiner must have exact information about crude oil quality. This includes the crude oil true boiling point (TBP) curve, the main data required for the correct operation of refinery crude oil atmospheric distillation plants. Crude oil is typically characterized based on a distillation assay. This procedure is reasonably well defined and is based on representing the mixture of actual components that boil within a boiling point interval by hypothetical components that boil at the average boiling temperature of the interval. The crude oil assay typically includes TBP distillation according to ASTM D-2892, which can characterize the part of the oil that boils up to a 400°C atmospheric equivalent boiling point. To model the yield curves obtained by physical distillation, it is necessary to compare the differences between the modelled and the experimental data. Most commercial simulators use different numbers of components and pseudo-components to represent crude oil. Laboratory tests include distillations, vapor pressures, flash points, pour points, cetane numbers, octane numbers, densities, and viscosities. The aim of the study is to draw true boiling point curves for different crude oil resources in Albania and to compare the differences between the modelled and the experimental data for the optimal characterization of crude oil.
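
The pseudo-component construction described here can be sketched directly: interpolate the TBP curve and average the boiling temperature over each volume cut. The assay points and cut boundaries below are hypothetical.

```python
import numpy as np

# Hypothetical TBP assay: cumulative volume % distilled vs. temperature (°C).
vol = np.array([0, 10, 30, 50, 70, 90, 100])
tbp = np.array([35, 90, 180, 260, 340, 430, 520])

cuts = [(0, 10), (10, 30), (30, 50), (50, 70), (70, 90), (90, 100)]
for lo, hi in cuts:
    v = np.linspace(lo, hi, 50)
    t_avg = np.interp(v, vol, tbp).mean()  # mean boiling temperature of the cut
    print(f"pseudo-component {lo}-{hi} vol%: NBP ≈ {t_avg:.0f} °C")
```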

Keywords: TBP distillation curves, crude oil, optimization, simulation

Procedia PDF Downloads 287
1172 A Sustainable Supplier Selection and Order Allocation Based on Manufacturing Processes and Product Tolerances: A Multi-Criteria Decision Making and Multi-Objective Optimization Approach

Authors: Ravi Patel, Krishna K. Krishnan

Abstract:

In global supply chains, appropriate and sustainable suppliers play a vital role in supply chain development and feasibility. In a large organization with a huge number of suppliers, it is necessary to classify suppliers based on their past history of quality and delivery for each product category. Since the performance of any organization depends widely on its suppliers, well-evaluated selection criteria and decision-making models lead to improved supplier assessment and development. In this paper, the SCOR® performance evaluation approach and ISO standards are used to determine selection criteria for better supplier assessment, using a hybrid model of the Analytic Hierarchy Process (AHP) and the Fuzzy Technique for Order Preference by Similarity to Ideal Solution (FTOPSIS). AHP is used to determine the global weights of the criteria, which help FTOPSIS obtain supplier scores using triangular fuzzy set theory. Both qualitative and quantitative criteria are taken into consideration in the proposed model. In addition, a multi-product and multi-time-period model is selected for order allocation. The optimization model integrates multi-objective integer linear programming (MOILP) for order allocation with the hybrid approach for supplier selection. The proposed MOILP model optimizes order allocation based on manufacturing processes and product tolerances, as per the manufacturer's requirements for a quality product. The integrated model and solution approach are tested to find optimized solutions for different scenarios. The detailed analysis shows the superiority of the proposed model over other solutions that considered individual decision-making models.

Keywords: AHP, fuzzy set theory, multi-criteria decision making, multi-objective integer linear programming, TOPSIS

Procedia PDF Downloads 153
1171 Gaussian Probability Density for Forest Fire Detection Using Satellite Imagery

Authors: S. Benkraouda, Z. Djelloul-Khedda, B. Yagoubi

Abstract:

We present a method for the early detection of forest fires from a thermal infrared satellite image, using the image matrix of the probability of belonging. The principle of the method is to compare a theoretical mathematical model with an experimental model. We consider each line of the image matrix as a realization of a non-stationary random process. Since the distribution of pixels in the satellite image is statistically dependent, we divided these lines into small stationary and ergodic intervals to characterize the image by an adequate mathematical model. A standard deviation was chosen to generate the random variables, so that each interval behaves naturally like white Gaussian noise. The latter was selected as the mathematical model that represents the large majority of pixels, which can be considered the image background. Before modeling the image, we applied a few preprocessing steps; then the parameters of the theoretical Gaussian model were extracted from the modeled image. These parameters are used to calculate the probability that each interval of the modeled image belongs to the theoretical Gaussian model. High-intensity pixels are regarded as elements foreign to this model, so they have a low probability, while the pixels that belong to the image background have a high probability. Finally, we present the inverse of the matrix of probabilities of these intervals for better fire detection.
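
A sketch of the probability-of-belonging map is given below: each row interval is modeled as Gaussian background, pixel probabilities are normalized per interval, and the inverted map highlights anomalous hot pixels. The window size, threshold, and synthetic image are assumptions.

```python
import numpy as np

def fire_probability_map(img, win=32):
    # Model each row interval as white Gaussian background and score pixels
    # by how unlikely they are under that model.
    h, w = img.shape
    pmap = np.zeros_like(img, dtype=float)
    for i in range(h):
        for j in range(0, w, win):
            seg = img[i, j:j + win].astype(float)
            mu, sigma = seg.mean(), seg.std() + 1e-9
            p = np.exp(-0.5 * ((seg - mu) / sigma) ** 2) \
              / (sigma * np.sqrt(2 * np.pi))
            pmap[i, j:j + win] = p / p.max()  # probability of "belonging"
    return 1.0 - pmap  # inverted map: anomalous (hot) pixels stand out

img = (20 * np.random.randn(64, 256) + 300).clip(0, None)  # synthetic thermal image
img[30, 100:104] += 200                                     # injected hot spot
alarm = fire_probability_map(img) > 0.95
print(np.argwhere(alarm)[:5])  # candidate fire pixels (row, col)
```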

Keywords: forest fire, forest fire detection, satellite image, normal distribution, theoretical Gaussian model, thermal infrared matrix image

Procedia PDF Downloads 123