Search results for: effectual logic
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 655

445 The U.S.-Taliban Peace Deal: Two-level Game Logic and Actors’ Payoffs

Authors: Zafar Iqbal

Abstract:

This article analyzes the U.S.-Taliban peace deal in light of the cross-pressures that both parties (the U.S. and the Taliban) faced, pressures that eventually paved the way for a negotiated settlement to the two-decade-long war. The paper first discusses the peace process initiated by President Obama in 2009 and then explores the factors that compelled both parties to sign this deal. The study is based on secondary data and interviews conducted with leading experts on Afghanistan, along with an interview with the Taliban Qatar office spokesperson. The theoretical framework is based on the interplay of diplomacy and domestic politics: the two-level game logic proposed by Robert D. Putnam. The two-level game model suggests that actors involved in negotiations face cross-pressures and are constrained both by the expectations of the domestic audience and by their counterpart’s zone of possible agreement. This paper takes the cross-pressures on both sides as the permissive factors for the entire process of negotiations. However, there is a slight aberration in the application of Putnam’s two-level games: in this case, the negotiations are not inter-state but between an all-powerful state and unyielding non-state actors. The study concludes that both parties faced domestic as well as international pressure, which compelled them to sign a deal that could lead to an end of the two-decade-long war. Furthermore, it looks at the potential prospects and challenges of the deal following the U.S. withdrawal.

Keywords: neo-Taliban insurgency, negotiations, two-level game, U.S.-Taliban peace deal, U.S. withdrawal

Procedia PDF Downloads 191
444 Fuzzy Logic Modeling of Evaluation the Urban Skylines by the Entropy Approach

Authors: Murat Oral, Seda Bostancı, Sadık Ata, Kevser Dincer

Abstract:

When evaluating the aesthetics of cities, an analysis of urban form development, which depends on design properties and a variety of factors, is performed together with a study of the effects of this appearance on human beings. Different methods are used when making an aesthetic evaluation of a city. Entropy, in its original meaning, is the mathematical representation of thermodynamic results. Measuring entropy relates to the probability distribution of the positional elements of a message or of information. In this study, the entropy-based evaluation of urban skylines was modelled with the Rule-Based Mamdani-Type Fuzzy (RBMTF) modelling technique. Input-output parameters were described by RBMTF if-then rules. Numerical parameters of the input and output variables were fuzzified into linguistic variables: Very Very Low (L1), Very Low (L2), Low (L3), Negative Medium (L4), Medium (L5), Positive Medium (L6), High (L7), Very High (L8) and Very Very High (L9) linguistic classes. The comparison between the application data and RBMTF was made using the absolute fraction of variance (R²). The actual values and the RBMTF results indicated that RBMTF can be successfully used for the entropy-based evaluation of urban skylines. As a result, the RBMTF model showed a satisfactory agreement with the experimental results, which suggests it as an alternative method for the entropy-based evaluation of urban skylines.
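As an illustration of the fuzzification and the R² comparison described above, the following is a minimal Python sketch rather than the authors' code; the nine-class breakpoints, the normalized universe, and the sample values are assumptions made only for illustration, and the R² shown is one common form of the absolute fraction of variance.

```python
# A minimal sketch (not the authors' code): triangular membership functions for the
# nine linguistic classes used in an RBMTF-style model, plus an absolute fraction of
# variance R^2 used to compare measured values with model output. Class breakpoints
# and the sample data below are illustrative assumptions.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet at a, c and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-12),
                                 (c - x) / (c - b + 1e-12)), 0.0)

# Nine evenly spaced classes on a normalized [0, 1] universe (assumed spacing).
labels = ["VVL", "VL", "L", "NM", "M", "PM", "H", "VH", "VVH"]
peaks = np.linspace(0.0, 1.0, 9)
width = peaks[1] - peaks[0]

def fuzzify(x):
    """Return the membership degree of x in each linguistic class."""
    return {lab: float(tri(x, p - width, p, p + width))
            for lab, p in zip(labels, peaks)}

def r2_abs_fraction_of_variance(actual, predicted):
    """Absolute fraction of variance: R^2 = 1 - sum((y - y_hat)^2) / sum(y^2)."""
    actual, predicted = np.asarray(actual, float), np.asarray(predicted, float)
    return 1.0 - np.sum((actual - predicted) ** 2) / np.sum(actual ** 2)

print(fuzzify(0.62))   # mostly "PM", slightly "M"
print(r2_abs_fraction_of_variance([0.41, 0.55, 0.73], [0.40, 0.57, 0.70]))
```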

Keywords: urban skylines, entropy, rule-based Mamdani type, fuzzy logic

Procedia PDF Downloads 272
443 A Novel Approach to Design and Implement Context Aware Mobile Phone

Authors: G. S. Thyagaraju, U. P. Kulkarni

Abstract:

Context-aware computing refers to a general class of computing systems that can sense their physical environment and adapt their behaviour accordingly. Context-aware computing makes systems aware of situations of interest, enhances services to users, automates systems, and personalizes applications. Context-aware services have been introduced into mobile devices such as PDAs and mobile phones. In this paper, we present a novel approach used to realize a context-aware mobile phone. The context-aware mobile phone (CAMP) proposed in this paper senses the user's situation automatically and provides the services required by that context. The proposed system is developed using artificial intelligence techniques such as Bayesian networks, fuzzy logic, and a rough-set-theory-based decision table: a Bayesian network classifies incoming calls (high priority, low priority, and unknown calls), fuzzy linguistic variables and membership degrees define the context situations, and decision-table-based rules are used for service recommendation. To exemplify and demonstrate the effectiveness of the proposed methods, the context-aware mobile phone is tested in a college campus scenario including different locations such as the library, classroom, meeting room, administrative building, and college canteen.
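A minimal sketch of the decision-table step described above is given below; it is not the authors' system, and the locations, call priorities, and recommended actions are hypothetical.

```python
# A minimal sketch of the decision-table idea (not the authors' system): the sensed
# context (location, call priority) is looked up in a rule table to recommend a phone
# behaviour. Locations, priorities, and actions are hypothetical.
from typing import Dict, Tuple

DecisionTable = Dict[Tuple[str, str], str]

rules: DecisionTable = {
    ("library", "high"):      "vibrate and show caller on screen",
    ("library", "low"):       "silent, divert to voicemail",
    ("classroom", "high"):    "vibrate and send busy SMS",
    ("classroom", "low"):     "silent, divert to voicemail",
    ("meeting_room", "high"): "vibrate and show caller on screen",
    ("meeting_room", "low"):  "silent, divert to voicemail",
    ("canteen", "high"):      "ring normally",
    ("canteen", "low"):       "ring at low volume",
}

def recommend(location: str, call_priority: str) -> str:
    """Return the recommended action; unknown contexts fall back to a default rule."""
    return rules.get((location, call_priority), "ring normally")

print(recommend("library", "high"))     # vibrate and show caller on screen
print(recommend("canteen", "unknown"))  # falls back to the default action
```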

Keywords: context aware mobile, fuzzy logic, decision table, Bayesian probability

Procedia PDF Downloads 347
442 A Combined Approach Based on Artificial Intelligence and Computer Vision for Qualitative Grading of Rice Grains

Authors: Hemad Zareiforoush, Saeed Minaei, Ahmad Banakar, Mohammad Reza Alizadeh

Abstract:

The quality inspection of rice (Oryza sativa L.) during its various processing stages is very important. In this research, an artificial intelligence-based model coupled with computer vision techniques was developed as a decision support system for the qualitative grading of rice grains. For conducting the experiments, 25 samples of rice grains with different levels of percentage of broken kernels (PBK) and degree of milling (DOM) were first prepared, and their qualitative grade was assessed by experienced experts. Then, the quality parameters of the same samples examined by the experts were determined using a machine vision system. A grading model was developed based on fuzzy logic theory in MATLAB software to relate the qualitative characteristics of the product to its quality. In total, 25 rules based on the AND operator and a Mamdani inference system were used for qualitative grading. The fuzzy inference system consisted of two input linguistic variables, namely DOM and PBK, which were obtained by the machine vision system, and one output variable (quality of the product). The model output was finally defuzzified using the Center of Maximum (COM) method. In order to evaluate the developed model, the output of the fuzzy system was compared with the experts' assessments. It was revealed that the developed model can estimate the qualitative grade of the product with an accuracy of 95.74%.
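The Mamdani inference and Center of Maximum defuzzification described above can be sketched as follows; this is not the authors' MATLAB model, and the membership breakpoints and the three sample rules (the paper uses 25) are assumptions for illustration.

```python
# A minimal Mamdani-style sketch (not the authors' MATLAB model): two inputs (DOM, PBK),
# one output (quality grade), AND implemented as min, and Center of Maximum (COM)
# defuzzification. Membership breakpoints and the three illustrative rules are assumptions.
import numpy as np

def tri(x, a, b, c):
    return max(min((x - a) / (b - a + 1e-12), (c - x) / (c - b + 1e-12)), 0.0)

# Input memberships on percentage scales (assumed breakpoints).
dom_sets = {"low": (0, 0, 50), "medium": (30, 50, 70), "high": (50, 100, 100)}
pbk_sets = {"low": (0, 0, 20), "medium": (10, 25, 40), "high": (30, 100, 100)}
# Output quality on a 0-100 scale.
out_sets = {"poor": (0, 0, 50), "fair": (30, 50, 70), "good": (50, 100, 100)}
out_universe = np.linspace(0, 100, 1001)

# (DOM term, PBK term) -> quality term; three illustrative rules only.
rules = [("high", "low", "good"), ("medium", "medium", "fair"), ("low", "high", "poor")]

def grade(dom, pbk):
    aggregated = np.zeros_like(out_universe)
    for d_term, p_term, q_term in rules:
        strength = min(tri(dom, *dom_sets[d_term]), tri(pbk, *pbk_sets[p_term]))  # AND = min
        clipped = np.minimum(strength, [tri(z, *out_sets[q_term]) for z in out_universe])
        aggregated = np.maximum(aggregated, clipped)
    peak = aggregated.max()
    if peak == 0:
        return 50.0  # no rule fired; arbitrary mid-scale fallback
    return float(out_universe[np.isclose(aggregated, peak)].mean())  # Center of Maximum

print(grade(dom=80, pbk=5))   # high DOM, few broken kernels -> high quality score
```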

Keywords: machine vision, fuzzy logic, rice, quality

Procedia PDF Downloads 395
441 Spatio-Temporal Pest Risk Analysis with ‘BioClass’

Authors: Vladimir A. Todiras

Abstract:

Spatio-temporal models provide new possibilities for real-time action in pest risk analysis. It should be noted that estimating the possibility and probability of the introduction of a pest and of its economic consequences involves many uncertainties. We present a new mapping technique that assesses pest invasion risk using the online BioClass software. BioClass is a GIS tool designed to solve multiple-criteria classification and optimization problems based on fuzzy logic and level set methods. This research describes a method for predicting the potential establishment and spread of plant pests into new areas using three case studies: corn rootworm (Diabrotica spp.), tomato leaf miner (Tuta absoluta) and plum fruit moth (Grapholita funebrana). Our study demonstrated that in BioClass we can combine fuzzy logic and geographic information systems with knowledge of pest biology and environmental data to derive new information for decision making. Pests are sensitive to a warming climate, as temperature greatly affects their survival and their reproductive rate and capacity. Changes have been observed in the distribution, frequency and severity of outbreaks of Helicoverpa armigera on tomato. BioClass has proven to be a powerful tool for applying dynamic models and mapping the potential future distribution of a species, enabling resource managers to make decisions about the management and control of dangerous and invasive species.

Keywords: classification, model, pest, risk

Procedia PDF Downloads 268
440 A Cognitive Training Program in Learning Disability: A Program Evaluation and Follow-Up Study

Authors: Krisztina Bohacs, Klaudia Markus

Abstract:

To the authors’ best knowledge, studies on cognitive program evaluation are lacking, and programs that prove to have high effect sizes with strong retention results are certainly scarce. The purpose of our study was to investigate the effectiveness of a comprehensive cognitive training program, namely BrainRx. This cognitive rehabilitation program targets and remediates seven core cognitive skills and related systems of sub-skills through repeated engagement in game-like mental procedures delivered one-on-one by a clinician, supplemented by digital training. A large sample of children with learning disabilities were given pretest and post-test cognitive assessments. The experimental group completed a twenty-week cognitive training program in a BrainRx center. A matched control group received another twenty-week intervention with Feuerstein’s Instrumental Enrichment programs. A second matched control group did not receive training. For the pre- and post-tests, we used a general intelligence test to assess IQ and a computer-based test battery for assessing cognition across the lifespan. Multiple regression analyses indicated that the experimental BrainRx treatment group had statistically significantly higher outcomes in attention, working memory, processing speed, logic and reasoning, auditory processing, visual processing and long-term memory compared to the non-treatment control group, with very large effect sizes. With the exception of logic and reasoning, the BrainRx treatment group realized significantly greater gains in six of the seven cognitive measures listed above compared to the Feuerstein control group. Our one-year retention measures showed that retention of the cognitive training gains was above ninety percent, with the greatest retention in visual processing, auditory processing, logic, and reasoning. The BrainRx program may be an effective tool for establishing long-term cognitive changes in students with learning disabilities. Recommendations are made for treatment centers and special education institutions on the cognitive training of students with special needs. The importance of our study is that a targeted, systematic, progressively loaded and intensive brain training approach may bring about significant changes in learning disabilities.

Keywords: cognitive rehabilitation training, cognitive skills, learning disability, permanent structural cognitive changes

Procedia PDF Downloads 185
439 Ant Lion Optimization in a Fuzzy System for Benchmark Control Problem

Authors: Leticia Cervantes, Edith Garcia, Oscar Castillo

Abstract:

Today there are several control problems in which the main objective is to obtain the best possible control and to decrease the error in the application. Many techniques can be used for such problems, including neural networks, PID control, fuzzy logic, optimization techniques, and many more. In this work, fuzzy logic, through a fuzzy system, is combined with an optimization technique to control the case study: Ant Lion Optimization (ALO) is used to optimize a fuzzy system that controls the velocity of a simple treadmill. The main objective is to achieve control of the velocity in the control problem using ALO. First, a simple fuzzy system with two inputs (error and error change) and one output (desired speed) was used to control the velocity of the treadmill; results were obtained, but to decrease the error, ALO was then applied to optimize the fuzzy system of the treadmill. With the optimization in place, the simulation was performed, and the results show that with ALO the velocity control was better than with a conventional fuzzy system. This paper describes some basic concepts that help to understand the idea of the work and the methodology of the investigation (control problem, fuzzy system design, optimization); the results are then presented for the fuzzy system with and without optimization. A comparison between the simple fuzzy system and the optimized fuzzy system is presented, showing that the optimization improved the control with good results. The major finding of the study is that ALO is a good alternative for improving control, because it helped to decrease the error in control applications regardless of the control technique being optimized. As a final statement, it is important to mention that the selected methodology was appropriate because the control of the treadmill was improved using the optimization technique.
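A minimal Python sketch of the idea is given below; it is not the authors' model. A small Sugeno-style fuzzy controller on (error, error change) drives a first-order treadmill approximation, and a crude random-search loop stands in for ALO to tune the nine rule gains; the plant model, membership universes, and gain ranges are all assumptions.

```python
# Sketch only: a fuzzy speed controller tuned by a simple search standing in for ALO.
import random

def tri(x, a, b, c):
    return max(min((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0)

LABELS = ("neg", "zero", "pos")

def fuzzy_correction(err, derr, gains):
    """Sugeno-style output: firing strengths weight the 9 crisp rule gains."""
    err = max(-1.0, min(1.0, err))           # keep inputs inside the membership universe
    derr = max(-1.0, min(1.0, derr))
    e = {"neg": tri(err, -2, -1, 0), "zero": tri(err, -1, 0, 1), "pos": tri(err, 0, 1, 2)}
    d = {"neg": tri(derr, -2, -1, 0), "zero": tri(derr, -1, 0, 1), "pos": tri(derr, 0, 1, 2)}
    num = den = 0.0
    for i, et in enumerate(LABELS):
        for j, dt in enumerate(LABELS):
            w = min(e[et], d[dt])            # AND = min
            num += w * gains[3 * i + j]
            den += w
    return num / den if den else 0.0

def cost(gains, target=1.0, dt=0.05, steps=300, tau=0.5):
    """Integral of |error| for a first-order treadmill lag driven by the fuzzy controller."""
    v = cmd = 0.0
    prev_err, total = target, 0.0
    for _ in range(steps):
        err = target - v
        cmd += fuzzy_correction(err, (err - prev_err) / dt, gains) * dt  # incremental command
        v += dt * (cmd - v) / tau                                        # treadmill speed lag
        prev_err = err
        total += abs(err) * dt
    return total

random.seed(0)
best = [random.uniform(-1.0, 1.0) for _ in range(9)]
best_cost = cost(best)
for _ in range(300):                         # crude random search standing in for ALO
    cand = [g + random.gauss(0.0, 0.1) for g in best]
    c = cost(cand)
    if c < best_cost:
        best, best_cost = cand, c
print(f"integrated |error| after tuning: {best_cost:.3f}")
```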

Keywords: ant lion optimization, control problem, fuzzy control, fuzzy system

Procedia PDF Downloads 378
438 Twitter Ego Networks and the Capital Markets: A Social Network Analysis Perspective of Market Reactions to Earnings Announcement Events

Authors: Gregory D. Saxton

Abstract:

Networks are everywhere: lunch ties among co-workers, golfing partnerships among employees, interlocking board-of-director connections, Facebook friendship ties, etc. Each network varies in terms of its structure: its size, how interconnected network members are, and the prevalence of sub-groups and cliques. At the same time, within any given network, some network members will have a more important, more central position on account of their greater number of connections or their capacity as “bridges” connecting members of different network cliques. The logic of network structure and position is at the heart of what is known as social network analysis, and this paper applies this logic to the study of the stock market. Using an array of data analytics and machine learning tools, this study will examine 17 million Twitter messages discussing the stocks of the firms in the S&P 1,500 index in 2018. Each of these 1,500 stocks has a distinct Twitter discussion network that varies in terms of core network characteristics such as size, density, influence, norms and values, level of activity, and embedded resources. The study’s core proposition is that the ultimate effect of any market-relevant information is contingent on the characteristics of the network through which it flows. To test this proposition, this study operationalizes each of the core network characteristics and examines their influence on market reactions to 2018 quarterly earnings announcement events.
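The core network characteristics named above (size, density, and the centrality of bridging users) can be computed as in the following minimal networkx sketch; the users, ties, and ego node are hypothetical and not taken from the study's Twitter data.

```python
# A minimal sketch (hypothetical users and edges, not the study's data) of the network
# measures discussed above, using networkx: a per-stock discussion network, its density,
# and the centrality of users who bridge otherwise separate cliques.
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("alice", "bob"), ("alice", "carol"), ("bob", "carol"),   # one clique
    ("dave", "erin"), ("erin", "frank"), ("dave", "frank"),   # another clique
    ("carol", "dave"),                                        # bridge between cliques
])

print("size (nodes):", G.number_of_nodes())
print("density:", round(nx.density(G), 3))
print("degree centrality:", {n: round(c, 2) for n, c in nx.degree_centrality(G).items()})
# Betweenness highlights the 'bridge' users connecting different cliques.
print("betweenness:", {n: round(c, 2) for n, c in nx.betweenness_centrality(G).items()})

# Ego network of a single central user (the user plus its direct ties).
ego = nx.ego_graph(G, "carol")
print("carol's ego network edges:", list(ego.edges()))
```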

Keywords: data analytics, investor-to-investor communication, social network analysis, Twitter

Procedia PDF Downloads 99
437 The Academic Experience of Vocational Training Teachers

Authors: Andréanne Gagné, Jo Anni Joncas, Éric Tendon

Abstract:

Teaching in vocational training requires an excellent mastery of the trade being taught, but also solid professional skills in pedagogy. Teachers are typically recruited on the basis of their trade expertise, and they do not necessarily have training or experience in pedagogy. To address this gap, the Ministry of Education (Québec, Canada) requires them to complete a 120-credit university program to obtain their teaching certificate. They must complete this training in addition to their teaching duties. This training was rarely planned in the teacher’s life course, and each teacher approaches it differently: some are enthusiastic, but many feel reluctance, discouragement, and even frustration at the idea of committing to a training program lasting an average of 10 years to completion. However, Quebec is experiencing an unprecedented shortage of teachers, and the perseverance of vocational teachers in their careers requires special attention because of their specific integration conditions. Our research examines the perceptions that vocational teachers in training have of their academic experience in pre-service teaching. It differs from previous research in that it focuses on the influence of the academic experience on the teaching employment experience. The goal is that, by better understanding the university experience of teachers in vocational education, we can identify strategies to support their school experience and their teaching. To do this, the research is based on the theoretical framework of the sociology of experience, which allows us to study the way in which these “teachers-students” give meaning to their university program in articulation with their jobs, according to three logics of action. The logic of integration is based on the process of socialization, where action is preceded by the internalization of values, norms, and cultural models associated with the training context. The logic of strategy refers to the usefulness of this experience, where the individual constructs a form of rationality according to his or her objectives, resources, social position, and situational constraints. The logic of subjectivation refers to reflexivity activities aimed at solving problems and making choices. These logics served as a framework for the development of an online questionnaire. Three hundred respondents, newly enrolled in an undergraduate teaching program (bachelor's degree in vocational education), expressed themselves about their academic experience. This paper relates the qualitative data (open-ended questions), analyzed with an interpretive repertory analysis approach, to the descriptive data (closed-ended questions) that emerged. The results shed light on how the respondents perceive themselves as teachers and students, their perceptions of university training and the support offered, and the place that training occupies in their professional path. Indeed, their professional and academic paths are inextricably linked, and it seems essential to take them into account simultaneously to better meet their needs and foster the development of their expertise in pedagogy. The discussion focuses on the strengths and limitations of university training from the perspective of the logics of action. The results also suggest strategies that can be implemented to better support the integration and retention of student teachers in professional education.

Keywords: teacher, vocational training, pre-service training, academic experience

Procedia PDF Downloads 101
436 An Application of Integrated Multi-Objective Particles Swarm Optimization and Genetic Algorithm Metaheuristic through Fuzzy Logic for Optimization of Vehicle Routing Problems in Sugar Industry

Authors: Mukhtiar Singh, Sumeet Nagar

Abstract:

The vehicle routing problem (VRP) is a combinatorial optimization and nonlinear programming problem that aims to optimize decisions regarding a given set of routes for a fleet of vehicles in order to provide cost-effective and efficient delivery of both services and goods to the intended customers. This paper proposes the application of integrated particle swarm optimization (PSO) and genetic algorithm (GA) to address the vehicle routing problem in the sugarcane industry in India. The sugar industry is a very prominent agro-based industry in India due to its impact on rural livelihoods, and it is estimated to employ around 5 lakh workers directly in sugar mills. Due to various inadequacies, inefficiencies, and inappropriate features associated with the current vehicle routing model, the industry incurs heavy monetary losses, which need to be addressed in the proper context. The proposed algorithm utilizes the crossover operation that originally appears in the genetic algorithm to improve its flexibility and manipulation and to avoid being trapped in a local optimum; simultaneously, to improve the convergence speed of the algorithm, level set theory is also added to it. We apply the hybrid approach to an example VRP and compare its results with those generated by PSO, GA, and parallel PSO algorithms. The experimental comparison results indicate that the performance of the hybrid algorithm is superior to the others, and it will become an effective approach for solving discrete combinatorial problems.
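The GA crossover step borrowed by such hybrids can be sketched as an ordered-crossover (OX) operator on a permutation-encoded route, as below; the customer labels and this particular OX variant are illustrative assumptions, not the authors' implementation.

```python
# A minimal sketch of a GA-style crossover for permutation-encoded routes (not the
# authors' implementation): an ordered-crossover (OX) variant that keeps a slice of
# parent 1 and fills the remaining positions in parent 2's order.
import random

def ordered_crossover(parent1, parent2):
    """OX-style operator: the child is always a valid permutation of the customers."""
    n = len(parent1)
    i, j = sorted(random.sample(range(n), 2))
    child = [None] * n
    child[i:j + 1] = parent1[i:j + 1]                     # copy a slice from parent 1
    remaining = [c for c in parent2 if c not in child]    # fill the rest in parent-2 order
    pos = 0
    for k in list(range(j + 1, n)) + list(range(i)):
        child[k] = remaining[pos]
        pos += 1
    return child

random.seed(1)
route_a = ["c1", "c2", "c3", "c4", "c5", "c6", "c7"]      # hypothetical customers
route_b = ["c3", "c7", "c5", "c1", "c6", "c2", "c4"]
print(ordered_crossover(route_a, route_b))
```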

Keywords: fuzzy logic, genetic algorithm, particle swarm optimization, vehicle routing problem

Procedia PDF Downloads 377
435 The Role of Language Strategy on International Survival of Firm: A Conceptual Framework from Resource Dependence Perspective

Authors: Sazzad Hossain Talukder

Abstract:

Survival in the competitive international market, with its unforeseen environmental contingencies, has always been a concern of firms and has led them to adopt different strategies to deal with different situations. Language strategy is considered to enhance the international performance of a firm by organizing language diversity and fostering communication within and outside the firm. Yet there is a lack of theoretical attention and model development on the role of language strategy in firm international survival. From a resource dependence perspective, the adoption of a language strategy and its relationship with firm survival are determined by the firm's capability to prevent dependency concentration and/or increase its relative power over the external environment. However, the impact of language strategy on firm survival is complex and multifaceted, as the strategy influences firm performance indirectly through communication, coordination, learning and value creation. The evidence of various types of language strategies and different forms of firm survival also brings in complexities in understanding the effects of a language strategy on the international survival of a firm. Based on the language literature and resource dependence logic, certain propositions are developed in this conceptual paper to conceptualize the relationship between language strategy and firm international survival. For the purpose of this paper, a conceptual model is proposed to examine how different kinds of language strategy foster the reduction of resource dependency and thereby lead to firm international survival in response to local responsiveness and global integration. In this proposed model, it is theorized that language strategy has a positive relationship with the international survival of the firm, as the strategy is likely to reduce external resource dependency and increase the ability to continue independent operations both in the short and the long term.

Keywords: language strategy, language diversity, firm international survival, resource dependence logic

Procedia PDF Downloads 253
434 Soft Computing Employment to Optimize Safety Stock Levels in Supply Chain Dairy Product under Supply and Demand Uncertainty

Authors: Riyadh Jamegh, Alla Eldin Kassam, Sawsan Sabih

Abstract:

In order to overcome uncertainty and the resulting inability to meet customers' requests, organizations tend to reserve a certain safety stock level (SSL). This level must be chosen carefully in order to avoid an increase in holding cost due to an excessive SSL or a shortage cost due to too low an SSL. This paper uses soft-computing fuzzy logic to identify the optimal SSL; the fuzzy model uses a dynamic concept to cope with a highly complex environment. The proposed model deals with three input variables, i.e., demand stability level, raw material availability level, and on-hand inventory level, by using dynamic fuzzy logic to obtain the best SSL as an output. In this model, demand stability, raw material, and on-hand inventory levels are described linguistically and then treated by the inference rules of the fuzzy model to extract the best level of safety stock. The aim of this research is to provide a dynamic approach for identifying the safety stock level, one that can be implemented in different industries. A numerical case study in the dairy industry, with a 200 g yogurt cup product, is presented to demonstrate the validity of the proposed model. The obtained results are compared with the current safety stock level, which is calculated using the traditional approach. The importance of the proposed model has been demonstrated by the significant reduction in the safety stock level.

Keywords: inventory optimization, soft computing, safety stock optimization, dairy industries inventory optimization

Procedia PDF Downloads 109
433 A User-Directed Approach to Optimization via Metaprogramming

Authors: Eashan Hatti

Abstract:

In software development, programmers often must make a choice between high-level programming and high-performance programs. High-level programming encourages the use of complex, pervasive abstractions. However, the use of these abstractions degrades performance; high performance demands that programs be low-level. In a compiler, the optimizer attempts to let the user have both. The optimizer takes high-level, abstract code as an input and produces low-level, performant code as an output. However, there is a problem with having the optimizer be a built-in part of the compiler. Domain-specific abstractions implemented as libraries are common in high-level languages. As a language’s library ecosystem grows, so does the number of abstractions that programmers will use. If these abstractions are to be performant, the optimizer must be extended with new optimizations to target them, or these abstractions must rely on existing general-purpose optimizations. The latter is often not as effective as needed. The former presents too significant an effort for the compiler developers, as they are the only ones who can extend the language with new optimizations. Thus, the language becomes more high-level, yet the optimizer – and, in turn, program performance – falls behind. Programmers are again confronted with a choice between high-level programming and high-performance programs. To investigate a potential solution to this problem, we developed Peridot, a prototype programming language. Peridot’s main contribution is that it enables library developers to easily extend the language with new optimizations themselves. This allows the optimization workload to be taken off the compiler developers’ hands and given to a much larger set of people who can specialize in each problem domain. Because of this, optimizations can be much more effective while also being much more numerous. To enable this, Peridot supports metaprogramming designed for implementing program transformations. The language is split into two fragments or “levels”, one for metaprogramming, the other for high-level general-purpose programming. The metaprogramming level supports logic programming. Peridot’s key idea is that optimizations are simply implemented as metaprograms. The meta level supports several specific features which make it particularly suited to implementing optimizers. For instance, metaprograms can automatically deduce equalities between the programs they are optimizing via unification, deal with variable binding declaratively via higher-order abstract syntax, and avoid the phase-ordering problem via non-determinism. We have found that this design centered around logic programming makes optimizers concise and easy to write compared to their equivalents in functional or imperative languages. Overall, implementing Peridot has shown that its design is a viable solution to the problem of writing code which is both high-level and performant.
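The idea of optimizations written as extensible metaprograms can be illustrated with the following minimal Python sketch; Peridot's meta level is a logic-programming language, so this is only an analogy in which library code registers rewrite rules and the "optimizer" is a naive fixpoint of rule application. The term constructors and rules are illustrative.

```python
# Sketch only (not Peridot): optimizations as user-supplied rewrite rules over terms.
from dataclasses import dataclass
from typing import Callable, List, Optional, Union

Term = Union["Lit", "Var", "Add", "Mul"]

@dataclass(frozen=True)
class Lit: value: int
@dataclass(frozen=True)
class Var: name: str
@dataclass(frozen=True)
class Add: left: Term; right: Term
@dataclass(frozen=True)
class Mul: left: Term; right: Term

Rule = Callable[[Term], Optional[Term]]   # returns a rewritten term, or None if not applicable
RULES: List[Rule] = []

def rule(fn: Rule) -> Rule:
    RULES.append(fn)                       # libraries extend the optimizer by registering rules
    return fn

@rule
def add_zero(t):                           # x + 0  ->  x
    if isinstance(t, Add) and t.right == Lit(0):
        return t.left

@rule
def mul_one(t):                            # x * 1  ->  x
    if isinstance(t, Mul) and t.right == Lit(1):
        return t.left

def optimize(t: Term) -> Term:
    """Rewrite bottom-up until no rule applies (a naive fixpoint, ignoring phase ordering)."""
    if isinstance(t, (Add, Mul)):
        t = type(t)(optimize(t.left), optimize(t.right))
    for r in RULES:
        out = r(t)
        if out is not None:
            return optimize(out)
    return t

print(optimize(Mul(Add(Var("x"), Lit(0)), Lit(1))))   # -> Var(name='x')
```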

Keywords: optimization, metaprogramming, logic programming, abstraction

Procedia PDF Downloads 69
432 Heuristic Algorithms for Time Based Weapon-Target Assignment Problem

Authors: Hyun Seop Uhm, Yong Ho Choi, Ji Eun Kim, Young Hoon Lee

Abstract:

Weapon-target assignment (WTA) is the problem of assigning available launchers to appropriate targets in order to defend assets. Various algorithms for WTA have been developed over the past years for both the static and the dynamic environment (denoted SWTA and DWTA, respectively). Because the problem must be solved within a relevant computational time, WTA has suffered from solution-efficiency issues. As a result, SWTA and DWTA problems have been solved only for limited battlefield situations. In this paper, the general situation under continuous time is considered through the Time-based Weapon-Target Assignment (TWTA) problem. TWTA is studied using a mixed integer programming model, and three heuristic algorithms are suggested: decomposed opt-opt, decomposed opt-greedy, and greedy algorithms. Although the TWTA optimization model works inefficiently when the problem size is large, the decomposed opt-opt algorithm, based on linearization and decomposition, extracted efficient solutions in a reasonable computation time. Because the computation time of the scheduling part is too long for the optimization model, several greedy-based algorithms are proposed. These show lower performance values than the decomposed opt-opt algorithm, but very little computation time is needed. Hence, this paper proposes an improved method by applying decomposition to TWTA, and more practical and effectual methods can be developed for using TWTA on the battlefield.
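A greedy heuristic of the kind mentioned above can be sketched as follows; this is not the paper's TWTA model, and the kill probabilities and target values are illustrative: each weapon is assigned in turn to the target where it currently adds the most expected destroyed value.

```python
# A minimal greedy weapon-target assignment sketch (illustrative data, not the paper's model).
from typing import Dict, List

def greedy_wta(values: List[float], p_kill: List[List[float]]) -> Dict[int, int]:
    """values[t]: target value; p_kill[w][t]: kill probability of weapon w on target t."""
    survival = [1.0] * len(values)           # probability each target survives so far
    assignment = {}
    for w, row in enumerate(p_kill):
        # Marginal gain of firing weapon w at target t, given earlier assignments.
        gains = [values[t] * survival[t] * row[t] for t in range(len(values))]
        best_t = max(range(len(values)), key=lambda t: gains[t])
        assignment[w] = best_t
        survival[best_t] *= (1.0 - row[best_t])
        print(f"weapon {w} -> target {best_t} (marginal gain {gains[best_t]:.2f})")
    return assignment

target_values = [10.0, 6.0, 8.0]                          # hypothetical asset values
kill_probabilities = [[0.7, 0.4, 0.5],                    # weapon 0
                      [0.6, 0.5, 0.4],                    # weapon 1
                      [0.3, 0.8, 0.6]]                    # weapon 2
print(greedy_wta(target_values, kill_probabilities))
```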

Keywords: air and missile defense, weapon target assignment, mixed integer programming, piecewise linearization, decomposition algorithm, military operations research

Procedia PDF Downloads 322
431 Formulation Development and Characterization of Oligonucleotide Containing Chitosan Nanoparticles

Authors: Gyati Shilakari Asthana, Abhay Asthana

Abstract:

Purpose: The therapeutic potential of an oligonucleotide (ODN) is primarily dependent upon its safe and efficient delivery to specific cells, overcoming degradation and maximizing cellular uptake in vivo. The present study is focused on designing low molecular weight chitosan nanoconstructs to meet the requirements of safe and effectual delivery of ODNs. LMW chitosan is a biodegradable, water-soluble, biocompatible polymer and is useful as a non-viral vector for gene delivery due to its better stability in water. Methods: LMW chitosan ODN nanoparticles (CHODN NPs) were formulated by a self-assembly method using various N/P ratios (molar ratio of the amine groups of CH to the phosphate moieties of the ODNs; 0.5:1, 1:1, 3:1, 5:1 and 7:1) of CH to ODN. The developed CHODN NPs were evaluated with respect to gel retardation assay, particle size, zeta potential, cytotoxicity, and transfection efficiency. Results: Complete complexation of CH/ODN was achieved at a charge ratio of 0.5:1 or above, and the CHODN NPs displayed resistance against DNase I. On increasing the N/P ratio of CH/ODN, the particle size of the NPs decreased whereas the zeta potential (ZV) value increased. No significant toxicity was observed at any CH concentration. The transfection efficiency increased as the N/P ratio rose from 1:1 to 3:1, whereas it decreased with a further increase in the N/P ratio up to 7:1. Maximum transfection of CHODN NPs with both cell lines (Raw 267.4 cells and HeLa cells) was achieved at an N/P ratio of 3:1. The results suggest that the transfection efficiency of CHODN NPs is dependent on the N/P ratio. Conclusion: Thus, the present study indicates that LMW chitosan nanoparticulate carriers would be an acceptable choice to improve transfection efficiency for in vitro as well as in vivo delivery of oligonucleotides.

Keywords: LMW-chitosan, chitosan nanoparticles, biocompatibility, cytotoxicity study, transfection efficiency, oligonucleotide

Procedia PDF Downloads 478
430 Ecosystem Model for Environmental Applications

Authors: Cristina Schreiner, Romeo Ciobanu, Marius Pislaru

Abstract:

This paper aims to build a system based on fuzzy models that can be implemented in the assessment of ecological systems, in order to determine appropriate methods of action for reducing adverse effects on the environment and, implicitly, on the population. The proposed model provides a new perspective for environmental assessment, and it can be used as a practical instrument for decision-making.

Keywords: ecosystem model, environmental security, fuzzy logic, sustainability of habitable regions

Procedia PDF Downloads 401
429 Harmonic Assessment and Mitigation in Medical Diagnosis Equipment

Authors: S. S. Adamu, H. S. Muhammad, D. S. Shuaibu

Abstract:

Poor power quality in electrical power systems can cause medical equipment at healthcare centres to malfunction and present wrong medical diagnoses. Equipment such as X-ray and computerized axial tomography machines can pollute the system due to their high level of harmonic production, which may cause a number of undesirable effects such as heating, equipment damage, and electromagnetic interference. The conventional approach to mitigation uses passive inductor/capacitor (LC) filters, which have some drawbacks such as large size, resonance problems, and fixed compensation behaviour. Current solutions generally employ active power filters using suitable control algorithms. This work focuses on assessing the level of total harmonic distortion (THD) in medical facilities and on various ways of mitigating it, using the radiology unit of an existing hospital as a case study. The measurement of the harmonics is conducted with a power quality analyzer at the point of common coupling (PCC). The measured THD levels are found to be higher than the IEEE 519-1992 standard limits. The system is then modelled as a harmonic current source using MATLAB/SIMULINK. To mitigate the unwanted harmonic currents, a shunt active filter is developed using a synchronous detection algorithm to extract the fundamental component of the source currents. A fuzzy logic controller is then developed to control the filter. The THD values without the active power filter are validated against the measured values. The THD values with the developed filter show that the harmonics are now within the recommended limits.
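The THD figure referred to above is computed, in essence, as the ratio of the harmonic content to the fundamental obtained from an FFT of the measured current; the following is a minimal sketch with a synthetic waveform, not the hospital measurements.

```python
# A minimal THD sketch: THD = sqrt(sum of squared harmonic magnitudes) / fundamental
# magnitude, taken from an FFT of the current waveform. The waveform is synthetic.
import numpy as np

def thd(signal, fs, f1=50.0, n_harmonics=20):
    """THD of a sampled waveform with fundamental frequency f1 (Hz)."""
    n = len(signal)
    spectrum = np.abs(np.fft.rfft(signal)) / n
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    def mag(f):                               # magnitude of the bin closest to frequency f
        return spectrum[np.argmin(np.abs(freqs - f))]
    fundamental = mag(f1)
    harmonics = [mag(k * f1) for k in range(2, n_harmonics + 1)]
    return np.sqrt(np.sum(np.square(harmonics))) / fundamental

# Synthetic load current: 50 Hz fundamental plus 5th and 7th harmonics (typical of rectifier loads).
fs = 10_000
t = np.arange(0, 1.0, 1.0 / fs)
i_load = (np.sin(2 * np.pi * 50 * t)
          + 0.20 * np.sin(2 * np.pi * 250 * t)
          + 0.14 * np.sin(2 * np.pi * 350 * t))
print(f"THD = {100 * thd(i_load, fs):.1f} %")   # about 24.4 % for these amplitudes
```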

Keywords: power quality, total harmonics distortion, shunt active filters, fuzzy logic

Procedia PDF Downloads 463
428 Study and Acquisition of the Duality of the Arabic Language

Authors: Oleg Redkin, Olga Bernikova

Abstract:

It is commonly accepted that every language is both a purely linguistic phenomenon and a socially significant communicative system, which exists on the basis of a certain society, its collective 'native speaker'. Therefore, the evolution and features of a language, besides its own linguistic rules and regulations, are also defined by the influence of a number of extra-linguistic factors. The above-mentioned statement may be illustrated by the example of the Arabic language, which may be characterized by the following peculiarities: - the inner logic of the Arabic language, the 'algebraicity' of its morphological paradigms and grammar rules; - the association of the Arabic language with the sacred texts of Islam and its close ties with the pre-Islamic and Islamic cultural heritage, the pre-Islamic poetry and Islamic literature and science; - its territorial distribution, which in recent years went far beyond the boundaries of its traditional realm due to the development of new technologies and the spread of mass media and, what is more important, migration processes; - the association of the Arabic language with the so-called 'Renaissance of Islam'. These peculiarities should be remembered while considering the status of the Modern Standard Arabic (MSA) language or the Classical Arabic (CA) language, as well as the Modern Arabic (MA) dialects, in synchrony or from the diachronic point of view. The continuity of any system in diachrony depends, on the one hand, on its ability to adapt itself to a changing environment and, on the other, on its internal ties. The structural durability of a language is characterized by its inner logic, the hierarchy of its paradigms and its grammar rules, as well as the continuity of their implementation in acts of everyday communication. Since the Arabic language is both a linguistic and a social phenomenon, the process of its acquisition and study should not focus only on knowledge about linguistic features or the development of communicative skills alone, but must be supplemented with information related to the culture, history and religion of the peoples of the region, which will expand and enrich the competences of the target audience.

Keywords: Arabic, culture, Islam, language

Procedia PDF Downloads 261
427 Parametric Appraisal of Robotic Arc Welding of Mild Steel Material by Principal Component Analysis-Fuzzy with Taguchi Technique

Authors: Amruta Rout, Golak Bihari Mahanta, Gunji Bala Murali, Bibhuti Bhusan Biswal, B. B. V. L. Deepak

Abstract:

The use of industrial robots to perform welding operations is one of the chief signs of contemporary welding these days. The modeling of weld joint parameters and weld process parameters is one of the most crucial aspects of robotic welding. As the weld process parameters affect the weld joint parameters differently, a multi-objective optimization technique has to be utilized to obtain the optimal setting of the weld process parameters. In this paper, a hybrid optimization technique, i.e., Principal Component Analysis (PCA) combined with fuzzy logic, is proposed to obtain the optimal setting of weld process parameters such as wire feed rate, welding current, gas flow rate, welding speed, and nozzle-tip-to-plate distance. The weld joint parameters considered for optimization are the depth of penetration, yield strength, and ultimate strength. PCA is a very efficient multi-objective technique for converting correlated and dependent parameters, such as the weld joint parameters, into uncorrelated and independent variables. Also, in this approach there is no need to check the correlation among responses, as no individual weights are assigned to the responses. A fuzzy inference engine can efficiently incorporate these aspects into its internal hierarchy, thereby overcoming various limitations of existing optimization approaches. Finally, the Taguchi method is used to obtain the optimal setting of the weld process parameters. Therefore, it is concluded that the hybrid technique has its own advantages and can be used for quality improvement in industrial applications.
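The PCA step, which converts the correlated weld-joint responses into uncorrelated components, can be sketched with plain numpy as below; the response values are hypothetical and not the authors' experimental data.

```python
# A minimal numpy PCA sketch over correlated weld-joint responses (hypothetical values):
# standardize the responses, eigen-decompose their correlation matrix, and project the
# data onto uncorrelated principal components.
import numpy as np

# Rows: experimental runs; columns: depth of penetration (mm), yield strength (MPa),
# ultimate strength (MPa) -- illustrative numbers only.
responses = np.array([[3.1, 310.0, 420.0],
                      [3.6, 335.0, 450.0],
                      [2.8, 300.0, 405.0],
                      [4.0, 350.0, 470.0],
                      [3.3, 322.0, 438.0]])

z = (responses - responses.mean(axis=0)) / responses.std(axis=0, ddof=1)  # standardize
corr = np.cov(z, rowvar=False)                      # correlation matrix of the responses
eigvals, eigvecs = np.linalg.eigh(corr)             # eigen-decomposition (ascending order)
order = np.argsort(eigvals)[::-1]                   # sort components by explained variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

explained = eigvals / eigvals.sum()
scores = z @ eigvecs                                # uncorrelated principal-component scores
print("explained variance ratio:", np.round(explained, 3))
print("first principal component scores:", np.round(scores[:, 0], 3))
```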

Keywords: robotic arc welding, weld process parameters, weld joint parameters, principal component analysis, fuzzy logic, Taguchi method

Procedia PDF Downloads 166
426 Recent Developments in the Application of Deep Learning to Stock Market Prediction

Authors: Shraddha Jain Sharma, Ratnalata Gupta

Abstract:

Predicting stock movements in the financial market is both difficult and rewarding. Analysts and academics are increasingly using advanced approaches such as machine learning techniques to anticipate stock price patterns, thanks to the expanding capacity of computing and the recent advent of graphics processing units and tensor processing units. Stock market prediction is a type of time series prediction that is incredibly difficult, since stock prices are influenced by a variety of financial, socioeconomic, and political factors. Furthermore, even minor mistakes in stock market price forecasts can result in significant losses for companies that employ the findings of stock market price prediction for financial analysis and investment. Soft computing techniques are increasingly being employed for stock market prediction due to their better accuracy compared with traditional statistical methodologies. The proposed research looks at the need for soft computing techniques in stock market prediction, the numerous soft computing approaches that are important to the field, past work in the area with its prominent features, and the significant problems or issue domains that the area involves. For constructing a predictive model, the major focus is on neural networks and fuzzy logic. The stock market is extremely unpredictable, and it is unquestionably tough to predict correctly based on certain characteristics. This study provides a complete overview of the numerous strategies investigated for high-accuracy prediction, with a focus on the most important characteristics.

Keywords: stock market prediction, artificial intelligence, artificial neural networks, fuzzy logic, accuracy, deep learning, machine learning, stock price, trading volume

Procedia PDF Downloads 67
425 Preparation and Characterization of Chitosan Nanoparticles for Delivery of Oligonucleotides

Authors: Gyati Shilakari Asthana, Abhay Asthana, Dharm Veer Kohli, Suresh Prasad Vyas

Abstract:

Purpose: The therapeutic potential of an oligonucleotide (ODN) is primarily dependent upon its safe and efficient delivery to specific cells, overcoming degradation and maximizing cellular uptake in vivo. The present study is focused on designing low molecular weight chitosan nanoconstructs to meet the requirements of safe and effectual delivery of ODNs. LMW chitosan is a biodegradable, water-soluble, biocompatible polymer and is useful as a non-viral vector for gene delivery due to its better stability in water. Methods: LMW chitosan ODN nanoparticles (CHODN NPs) were formulated by a self-assembly method using various N/P ratios (molar ratio of the amine groups of CH to the phosphate moieties of the ODNs; 0.5:1, 1:1, 3:1, 5:1, and 7:1) of CH to ODN. The developed CHODN NPs were evaluated with respect to gel retardation assay, particle size, zeta potential, cytotoxicity, and transfection efficiency. Results: Complete complexation of CH/ODN was achieved at a charge ratio of 0.5:1 or above, and the CHODN NPs displayed resistance against DNase I. On increasing the N/P ratio of CH/ODN, the particle size of the NPs decreased whereas the zeta potential (ZV) value increased. No significant toxicity was observed at any CH concentration. The transfection efficiency increased as the N/P ratio rose from 1:1 to 3:1, whereas it decreased with a further increase in the N/P ratio up to 7:1. Maximum transfection of CHODN NPs with both cell lines (Raw 267.4 cells and HeLa cells) was achieved at an N/P ratio of 3:1. The results suggest that the transfection efficiency of CHODN NPs is dependent on the N/P ratio. Conclusion: Thus, the present study indicates that LMW chitosan nanoparticulate carriers would be an acceptable choice to improve transfection efficiency for in vitro as well as in vivo delivery of oligonucleotides.

Keywords: LMW-chitosan, chitosan nanoparticles, biocompatibility, cytotoxicity study, transfection efficiency, oligonucleotide

Procedia PDF Downloads 831
424 Effectual Role of Local Level Partnership Schemes in Affordable Housing Delivery

Authors: Hala S. Mekawy

Abstract:

Affordable housing delivery for low- and lower-middle-income families is a prominent problem in many developing countries; governments alone are unable to address this challenge due to diverse financial and regulatory constraints, and the private sector's contribution is rare and assists only middle-income households, even when institutional and legal reforms are conducted to persuade it to go down market. Also, the market-enabling policy measures advocated by the World Bank since the early nineties have been strongly criticized and proven to be inappropriate for developing-country contexts, where it is highly unlikely that the formal private sector can reach the low-income population. In addition to governments and private developers, affordable housing delivery systems involve an intricate network of relationships between a diverse range of actors. Collaboration between them has proven to be vital; hence, an approach based on partnership schemes for affordable housing delivery has emerged. The basic premise of this paper is that addressing housing affordability challenges in Egypt demands direct public support, as markets and market actors alone would never succeed in delivering decent affordable housing to low- and lower-middle-income groups. It argues that this support would ideally come through local-level partnership schemes, with a leading role for decentralized local government and with partners identified according to specific local conditions. It attempts to identify the major attributes that would ensure the fulfilment of the goals of such schemes in the Egyptian context. This is based upon evidence from diverse worldwide experiences, in addition to the main outcomes of a questionnaire administered to specialists and chief actors in the field.

Keywords: affordable housing, partnership schemes, housing, urban environments

Procedia PDF Downloads 203
423 Power Energy Management For A Grid-Connected PV System Using Rule-Base Fuzzy Logic

Authors: Nousheen Hashmi, Shoab Ahmad Khan

Abstract:

The active interplay between green energy sources and load demand leads to serious issues related to power quality and stability. The growing number of green energy resources and distributed generators requires newer operational strategies to be incorporated in order to maintain power stability between the green energy resources and the micro-grid/utility grid. This paper presents a novel technique for power and energy management in a grid-connected photovoltaic system with an energy storage system, under a set of constraints including weather conditions, load-shedding hours, and peak-pricing hours, by using a rule-based fuzzy smart-grid controller to schedule the power coming from multiple sources (photovoltaic, grid, battery) under the above set of constraints. The technique fuzzifies all the inputs and establishes a fuzzy rule set, whose fuzzy outputs are then defuzzified. Simulations are run for a 24-hour period, and a rule-based power scheduler is developed. The proposed fuzzy control strategy is able to sense the continuous fluctuations in photovoltaic power generation, load demand, the grid (load-shedding patterns), and battery state of charge in order to make correct and quick decisions. The suggested fuzzy rule-based scheduler can operate well with vague inputs; thus, it does not require an exact numerical model and can handle nonlinearity. This technique provides a framework for extension to handle multiple special cases for optimized operation of the system.

Keywords: photovoltaic, power, fuzzy logic, distributed generators, state of charge, load shedding, membership functions

Procedia PDF Downloads 465
422 Beyond Baudrillard: A Critical Intersection between Semiotics and Materialism

Authors: Francesco Piluso

Abstract:

Nowadays, restoring the deconstructive power of semiotics implies a critical analysis of neoliberal ideology and, even more critically, a confrontation with the materialist perspective. The theoretical path of Jean Baudrillard is crucial to understanding the ambivalence of this intersection. A semiotic critique of Baudrillard's work, through the tools of both structuralism and interpretative semiotics, aims to give materialism a new, consistent semiotic approach and vice versa. According to Baudrillard, the commodity form is characterized by the same abstract and systemic logic as the sign-form, in which the production of the signified (use-value) is a mere ideological means for the reproduction of the chain of signifiers (exchange-value). Nevertheless, this parallelism is broken by the author himself: if use-value is deconstructed in its relative logic, the signified and the referent, both as discrete and positive elements, are collapsed onto the same plane in the shadow of the signified forms. These divergent considerations lead Baudrillard to the same crucial point: the dismissal of the material world, replaced by hyperreality as the reproduction of a semiotic (genetic) Code. The stress on the concept of form, as an epistemological and semiotic tool for analysing the construction of values in consumer society, has led to the Code as its ontological drift. In other words, Baudrillard seems to enclose consumer society (and reality) in this immanent and self-fetishized world of signs, an ideological perspective that mystifies the gravity of the material relationships between the Northern-Western world and the Third World. The notion of the Encyclopaedia by Umberto Eco is the key to overturning the relationship of immanence/transcendence between the Code and the political economy of the sign, by understanding the former as an ideological plane within the encyclopaedia itself. Therefore, rather than building semiotic (hyper)realities, semiotics has to deal with materialism in terms of material relationships of power, which are mystified and reproduced through such ideological ontologies of signs.

Keywords: Baudrillard, Code, Eco, Encyclopaedia, epistemology vs. ontology, semiotics vs. materialism

Procedia PDF Downloads 139
421 Design of a Fuzzy Expert System for the Impact of Diabetes Mellitus on Cardiac and Renal Impediments

Authors: E. Rama Devi Jothilingam

Abstract:

Diabetes mellitus is now one of the most common non-communicable diseases globally. India leads the world with the largest number of diabetic subjects, earning it the title "diabetes capital of the world". In order to reduce the mortality rate, a fuzzy expert system is designed to predict the severity of the cardiac and renal problems of diabetic patients using fuzzy logic. Since uncertainty is inherent in medicine, fuzzy logic is used in this research work to remove the inherent fuzziness of linguistic concepts and the uncertain status of diabetes mellitus, which is a prime cause of cardiac arrest and renal failure. In this work, the controllable risk factors (blood sugar, insulin, ketones, lipids, obesity, blood pressure, and protein/creatinine ratio) are considered as input parameters, and the 'stages of cardiac' (SOC) and 'stages of renal' (SORD) impediments are considered as the output parameters. Triangular membership functions are used to model the input and output parameters. The rule base for the proposed expert system is constructed based on knowledge from medical experts. A Mamdani inference engine is used to infer information from the rule base and take the major decisions in diagnosis. The mean-of-maximum method is used to obtain a non-fuzzy control action that best represents the possibility distribution of the inferred fuzzy control action. The proposed system also classifies patients into high-risk and low-risk groups using fuzzy c-means clustering so that high-risk patients are treated immediately. The system is validated with MATLAB and is used as a tracking system with accuracy and robustness.
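The fuzzy c-means step used to separate high-risk from low-risk patients can be sketched as follows; this is not the authors' MATLAB code, and the two-feature patient records are hypothetical stand-ins for the controllable risk factors.

```python
# A minimal numpy fuzzy c-means sketch (illustrative data, not the authors' system).
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, iters=100, seed=0):
    """Return (centers, membership matrix U) for c fuzzy clusters with fuzzifier m."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)                 # memberships of each point sum to 1
    for _ in range(iters):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]            # weighted cluster centers
        dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U = 1.0 / np.sum((dist[:, :, None] / dist[:, None, :]) ** (2.0 / (m - 1.0)), axis=2)
    return centers, U

# Hypothetical (normalized blood sugar, normalized blood pressure) for eight patients.
X = np.array([[0.2, 0.3], [0.25, 0.35], [0.3, 0.2], [0.35, 0.3],
              [0.8, 0.85], [0.85, 0.9], [0.9, 0.8], [0.75, 0.95]])
centers, U = fuzzy_c_means(X)
print("cluster centers:\n", np.round(centers, 2))
print("membership of each patient in each cluster:\n", np.round(U, 2))
```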

Keywords: Diabetes mellitus, fuzzy expert system, Mamdani, MATLAB

Procedia PDF Downloads 275
420 Testing of Complicated Bus Bar Protection Using Smart Testing Methodology

Authors: K. N. Dinesh Babu

Abstract:

In this paper, the protection of a complicated bus arrangement with a dual bus coupler and bus sectionalizer, using low-impedance differential protection applicable to very high voltages such as 220 kV and 400 kV, is discussed. In many power generation stations, several operational procedures are implemented to utilize the transfer bus as the main bus and to facilitate the maintenance of circuit breakers and current transformers (in each section) without shutting down the bay(s). Owing to this fact, the complications in operational philosophy have posed challenges for bus bar protection implementation. Many bus topologies allow any one of the main buses available in the station to be used as an auxiliary bus. In such a system, pre-defined precautions and procedures are laid down as guidelines, which are followed before assigning any bus as an auxiliary bus. The procedure involves shifting of links, changing rotary switches, insertion of test blocks, and so on, thereby causing unreliable operation. This kind of unreliable operation or inadvertent procedural lapse may result in the isolation of the bus bar from the grid due to the unpredictable operation of the bus bar protection relay, which is a commonly occurring phenomenon due to manual mistakes. With the sophisticated configuration and implementation of logic in modern intelligent electronic devices, the operator is free to select the transfer arrangement without sacrificing the protection required by a bus differential system for reliable operation, and labor-intensive processes are completely eliminated. This paper deals with the procedure for testing the security logic for such special scenarios using the Megger SMRT relay test set on the bus bar protection relay, in order to assure system stability and do away with all the specific operational precautions/procedures.

Keywords: bus bar protection, by-pass isolator, blind spot, breaker failure, intelligent electronic device, end fault, bus unification, directional principle, zones of protection, breaker re-trip, under voltage security, smart megger relay tester

Procedia PDF Downloads 53
419 Design and Development of an 'Optimisation Controller' and a SCADA Based Monitoring System for Renewable Energy Management in Telecom Towers

Authors: M. Sundaram, H. R. Sanath Kumar, A. Ramprakash

Abstract:

Energy saving is a key sustainability focus area for the Indian telecom industry today. This is especially true in rural India, where energy consumption contributes 70% of the total network operating cost. In urban areas, the energy cost for network operation ranges between 15% and 30%. This expenditure on energy, a result of the lack of grid power availability, highlights a potential barrier to telecom industry growth. As a result, telecom tower companies switch to diesel generators, making them the second largest consumer of diesel in India, consuming over 2.5 billion litres per annum. The growing cost of energy due to increasing diesel prices and concerns over rising greenhouse gas emissions have caused these companies to look at other, renewable energy options. Even the TRAI (Telecom Regulatory Authority of India) has issued a number of guidelines to implement Renewable Energy Technologies (RETs) in telecom towers as part of its ‘Implementation of Green Technologies in Telecom Sector’ initiative. Our proposal suggests the implementation of a Programmable Logic Controller (PLC) based ‘optimisation controller’ that can not only efficiently utilize the energy from RETs but also help to conserve the power used in the telecom towers. When there are multiple RETs available to supply energy, this controller will pick the optimum amount of energy from each RET based on the availability and feasibility at that point in time, reducing the dependence on diesel generators. For effective maintenance of the towers, we are planning to implement a SCADA based monitoring system along with the ‘optimisation controller’.

Keywords: operation costs, consumption of fuel and carbon footprint, implementation of a programmable logic controller (PLC) based ‘optimisation controller’, efficient SCADA based monitoring system

Procedia PDF Downloads 405
418 Phyto-Therapeutic, Functional and Nutritional Acclaims of Turnip (Brassica rapus L.): An Overview

Authors: Tabussam Tufail

Abstract:

Purpose: The core purpose of the current review article is to elaborate on the phytochemicals present in turnip (Brassica rapus L.) and the allied health claims. Plant-based foods contain a significant amount of bioactive compounds, which provide desirable health benefits beyond basic nutrition. Epidemiological evidence suggests that consumption of a diet rich in vegetables and fruits has positive implications for human health. Design: The potential of turnip peroxidase (TP) for the treatment of phenolic-contaminated solutions has been reviewed. However, issues of taste along with behavioral nutrition ought to be considered. Thus, in recent decades, special attention has been paid to edible plants, especially those that are rich in secondary metabolites (frequently called phytochemicals), and nowadays there is an increasing interest in the antioxidant activity of such phytochemicals present in the diet. These chemicals support nutritional therapy and phytotherapy, which have emerged as new concepts of health aid in recent years. Turnip is rich in these valuable ingredients and can therefore be employed for its health-promoting and healing properties. Findings: Numerous bioactive components, i.e., organic acids, phenolic compounds, turnip peroxidase, kaempferol, vitamin K, etc., are present in turnip. The review focuses on the significance of plant-derived (especially turnip) phenolic compounds as a source of certain compounds beneficial for human health. Owing to the presence of bioactive moieties, turnip has high antioxidant activity, plays a positive role in blood clotting, is effectual in phenobarbital-induced sleeping time, is effective against hepatic injury in diabetics, and also has a good hepatoprotective role. Strong recommendations for the consumption of nutraceuticals from turnip have become progressively popular to improve health and to prevent disease.

Keywords: phytochemicals, turnip, antioxidants, health benefits

Procedia PDF Downloads 222
417 Facile Synthesis of Sulfur Doped TiO2 Nanoparticles with Enhanced Photocatalytic Activity

Authors: Vishnu V. Pillai, Sunil P. Lonkar, Akhil M. Abraham, Saeed M. Alhassan

Abstract:

An effectual technology for wastewater treatment is in great demand in order to counter the water pollution caused by organic pollutants. Photocatalytic oxidation technology is widely used in the removal of such unsafe contaminants. Among the semiconducting metal oxides, robust and thermally stable TiO2 has emerged as a fascinating material for photocatalysis. Enhanced catalytic activity has been observed for nanostructured TiO2 due to its higher surface area, chemical stability, and higher oxidation ability. However, high charge-carrier recombination and the wide band gap of TiO2 limit its use as a photocatalyst to the UV region. It is desirable to develop a photocatalyst that can efficiently absorb visible light, which occupies the main part of the solar spectrum. Hence, in order to extend their photocatalytic efficiency into the visible range, TiO2 nanoparticles are often doped with metallic or non-metallic elements. Non-metallic doping of TiO2 has attracted much attention due to the low thermal stability and enhanced recombination of charge carriers that result from metallic doping of TiO2. Among these, sulfur-doped TiO2 is the most widely used photocatalyst in environmental purification. However, most S-TiO2 synthesis techniques use toxic chemicals and complex procedures. Hence, a facile, scalable, and environmentally benign preparation process for S-TiO2 is highly desirable. In the present work, we have demonstrated a new and facile solid-state reaction method for S-TiO2 synthesis that uses abundant elemental sulfur as the S source and moderate temperatures. The resulting nano-sized S-TiO2 has been successfully employed as a visible-light photocatalyst for methylene blue dye removal from aqueous media.

Keywords: ecofriendly, nanomaterials, methylene blue, photocatalysts

Procedia PDF Downloads 333
416 An Intelligent Controller Augmented with Variable Zero Lag Compensation for Antilock Braking System

Authors: Benjamin Chijioke Agwah, Paulinus Chinaenye Eze

Abstract:

The antilock braking system (ABS) is one of the important contributions of the automobile industry, designed to ensure road safety in such a way that vehicles are kept steerable and stable during emergency braking. This paper presents a wheel-slip-based intelligent controller with variable zero lag compensation for ABS. It is required to achieve very fast, accurate wheel slip tracking during hard braking conditions and to eliminate chattering, with improved transient and steady-state performance, while shortening the stopping distance using an effective braking torque less than the maximum allowable torque to bring a braking vehicle to a stop. The dynamics of a vehicle braking from a velocity of 30 ms⁻¹ on a straight line were determined and modelled in the MATLAB/Simulink environment to represent a conventional ABS system without a controller. Simulation results indicated that the system without a controller was not able to track the desired wheel slip, and the stopping distance was 135.2 m. Hence, an intelligent controller based on a fuzzy logic controller (FLC) was designed, with a variable zero lag compensator (VZLC) added to enhance the performance of the FLC control variable by eliminating steady-state error and providing improved bandwidth to eliminate the effect of high-frequency noise such as chattering during braking. The simulation results showed that the FLC-VZLC provided fast tracking of the desired wheel slip, eliminated chattering, and reduced the stopping distance by 70.5% (39.92 m), 63.3% (49.59 m), 57.6% (57.35 m) and 50% (69.13 m) on dry, wet, cobblestone and snow road surface conditions, respectively. Generally, the proposed system used an effective braking torque that is less than the maximum allowable braking torque to achieve efficient wheel slip tracking and overall robust control performance on different road surfaces.
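The wheel slip being tracked, and the use of a braking torque below the allowable maximum, can be sketched on a quarter-car model as below; this is not the paper's Simulink model or its FLC-VZLC controller, and the vehicle parameters, friction curve, bang-bang rule, and 0.2 slip target are assumptions.

```python
# Sketch only: longitudinal wheel slip and a crude bang-bang torque rule on a quarter car.
m, J, R, g = 300.0, 1.2, 0.30, 9.81          # quarter-car mass, wheel inertia, radius, gravity

def mu(slip, mu_peak=0.9, slip_opt=0.2):
    """Simple rising/falling tyre friction curve versus longitudinal slip (assumed shape)."""
    if slip <= slip_opt:
        return mu_peak * slip / slip_opt
    return mu_peak * (1.0 - 0.3 * (slip - slip_opt) / (1.0 - slip_opt))

def simulate(target_slip=0.2, dt=1e-3, v0=30.0, t_max=10.0):
    v, omega, torque, distance = v0, v0 / R, 0.0, 0.0
    for _ in range(int(t_max / dt)):
        slip = (v - omega * R) / max(v, 1e-3)          # longitudinal wheel slip
        torque += 800.0 * dt if slip < target_slip else -800.0 * dt   # bang-bang slip rule
        torque = min(max(torque, 0.0), 1200.0)         # stay below a (hypothetical) allowable torque
        Fx = mu(slip) * m * g                          # braking force from the tyre
        v += dt * (-Fx / m)                            # vehicle deceleration
        omega = max(omega + dt * (Fx * R - torque) / J, 0.0)   # wheel dynamics
        distance += v * dt
        if v <= 0.1:
            break
    return distance

print(f"stopping distance from 30 m/s: {simulate():.1f} m")
```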

Keywords: ABS, fuzzy logic controller, variable zero lag compensator, wheel slip tracking

Procedia PDF Downloads 132