Search results for: contractual complexity
1195 Land Use Change Modeling Using Cellular Automata, Case Study: Karawang City, West Java Province, Indonesia
Authors: Bagus Indrawan Hardi
Abstract:
Cellular Automata (CA) are widely used in land use modeling and have proven powerful for simulating small-scale land use change in many large cities around the world. In this paper, we implement CA for land use modeling in Karawang, a distinctive city in Indonesia. Unlike complex numerical approaches, CA are simple yet accurate, and their behaviour depends strongly on the rule set (rule-based modeling). The most important task in CA is forming and calculating the neighborhood effect, which represents the environment and the relationships between the occupied cell and the cells around it. We adopted a circular neighborhood of 196 cells with a radius of 8 cells. CA worked well in this study, and we present several analyzed and zoomed-in portions of the Karawang region. The rule set can handle the complexity of land use modeling. However, the results should not be taken at face value: many non-technical factors, such as politics and natural disasters, may change them dramatically.
Keywords: cellular automata (CA), land use change, spatial dynamics, urban sprawl
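As a rough illustration of the neighborhood effect described above, the following sketch (a generic example, not the authors' implementation) builds the 196-cell circular neighborhood of radius 8 and computes the share of urban cells around a given location; the toy grid, the 0/1 land use coding, and the unweighted share are assumptions made for illustration only.

```python
import numpy as np

# Circular neighborhood mask of radius 8 cells, centre excluded.
# With r = 8 this contains exactly 196 neighbor cells, as in the abstract.
r = 8
yy, xx = np.mgrid[-r:r + 1, -r:r + 1]
mask = (xx**2 + yy**2 <= r**2)
mask[r, r] = False                      # exclude the occupied (centre) cell
assert mask.sum() == 196

def neighborhood_effect(grid, i, j, target=1):
    """Share of cells of class `target` (e.g. built-up) inside the circular
    neighborhood of cell (i, j); cells falling outside the grid are ignored."""
    hits, total = 0, 0
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            if not mask[dy + r, dx + r]:
                continue
            y, x = i + dy, j + dx
            if 0 <= y < grid.shape[0] and 0 <= x < grid.shape[1]:
                total += 1
                hits += int(grid[y, x] == target)
    return hits / total if total else 0.0

# Toy land use grid: 0 = non-urban, 1 = urban
rng = np.random.default_rng(0)
land = (rng.random((100, 100)) < 0.2).astype(int)
print(neighborhood_effect(land, 50, 50))
```

In a rule-based CA, a transition rule would then compare this score against a threshold to decide whether a non-urban cell converts to urban in the next time step.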
Procedia PDF Downloads 242
1194 Multilayer Neural Network and Fuzzy Logic Based Software Quality Prediction
Authors: Sadaf Sahar, Usman Qamar, Sadaf Ayaz
Abstract:
In the software development lifecycle, quality prediction techniques are of prime importance for minimizing future design errors and expensive maintenance. Many techniques have been proposed by various researchers, but with the increasing complexity of software lifecycle models, it is crucial to develop a flexible system that can account for the factors that ultimately affect the quality of the end product. These factors include properties of the software development process and of the product, along with its operating conditions. In this paper, a neural network (perceptron) based software quality prediction technique is proposed. Using this technique, stakeholders can predict the quality of the resulting software during the early phases of the lifecycle, saving the time and resources otherwise spent on eliminating design errors and on costly maintenance. With successful training, the technique can be brought into practical use.
Keywords: software quality, fuzzy logic, perceptron, prediction
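A minimal sketch of the kind of predictor the abstract describes is given below. The features, labels, and network size are invented for illustration; the paper's actual quality indicators and training data are not specified here.

```python
# Predict a software quality class from process/product metrics with a
# multilayer perceptron (toy data only).
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
# Hypothetical metrics: complexity, code churn, review coverage, test coverage
X = rng.random((500, 4))
y = (X[:, 0] + X[:, 1] - X[:, 2] > 1.0).astype(int)   # 1 = low quality (toy rule)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                                    random_state=0))
model.fit(X_tr, y_tr)
print("held-out accuracy:", model.score(X_te, y_te))
```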
Procedia PDF Downloads 316
1193 The Origins of Representations: Cognitive and Brain Development
Authors: Athanasios Raftopoulos
Abstract:
In this paper, an attempt is made to explain the evolution or development of humans' representational arsenal from its humble beginnings to its modern abstract symbols. Representations are physical entities that represent something else. To represent a thing (in a general sense of "thing") means to use in the mind or in an external medium a sign that stands for it. The sign can be used as a proxy of the represented thing when the thing is absent. Representations come in many varieties, from signs that perceptually resemble their representata to abstract symbols that are related to their representata through conventions. Relying on the distinction among indices, icons, and symbols, it is explained how symbolic representations gradually emerged from indices and icons. To understand the development or evolution of our representational arsenal, the development of the cognitive capacities that enabled the gradual emergence of representations of increasing complexity and expressive capability should be examined. The examination of these factors should rely on a careful assessment of the available empirical neuroscientific and paleo-anthropological evidence. These pieces of evidence should be synthesized to produce arguments whose conclusions provide clues concerning the developmental process of our representational capabilities. The analysis of the empirical findings in this paper shows that Homo erectus was able to use both icons and symbols. Icons were used as external representations, while symbols were used in language. The first step in the emergence of representations is that a purely causal sensory-motor schema involved in indices is decoupled from its normal causal sensory-motor functions and serves as a representation of the object that initially called it into play. Sensory-motor schemata are tied to specific contexts of organism-environment interaction and are activated only within these contexts. For a representation of an object to be possible, this schema must be de-contextualized so that the same object can be represented in different contexts; a decoupled schema loses its direct ties to reality and becomes mental content. The analysis suggests that symbols emerged due to selection pressures of the social environment. The need to establish and maintain social relationships in ever-enlarging groups, to the benefit of the group, was a sufficient environmental pressure to lead to the appearance of the symbolic capacity. Symbols could serve this need because they can express abstract relationships, such as marriage or monogamy. Icons, being firmly attached to what can be observed, could not go beyond surface properties to express abstract relations. The cognitive capacities required for having iconic and then symbolic representations were present in Homo erectus, which had a language that started without syntactic rules but was structured so as to mirror the structure of the world. This language became increasingly complex, and grammatical rules started to appear to allow for the construction of the more complex expressions required to keep up with the increasing complexity of social niches. This created evolutionary pressures that eventually led to increasing cranial size and a restructuring of the brain that allowed more complex representational systems to emerge.
Keywords: mental representations, iconic representations, symbols, human evolution
Procedia PDF Downloads 54
1192 A Study on Game Theory Approaches for Wireless Sensor Networks
Authors: M. Shoukath Ali, Rajendra Prasad Singh
Abstract:
Game theory approaches and their application to improving the performance of Wireless Sensor Networks (WSNs) are discussed in this paper. The mathematical modeling and analysis of WSNs may have a low success rate due to the complexity of topology, modeling, link quality, etc. Game theory, however, can be used efficiently to analyze WSNs. Game theory is a branch of applied mathematics that describes and analyzes interactive decision situations. It can model independent, individual decision makers whose actions affect the surrounding decision makers, and the outcome of complex interactions among rational entities can be predicted by a set of analytical tools. Rationality, however, demands strict adherence to a strategy based on measured or perceived results. Researchers are adopting game theory approaches to model and analyze leading wireless communication networking issues, including QoS, power control, and resource sharing.
Keywords: wireless sensor network, game theory, cooperative game theory, non-cooperative game theory
Procedia PDF Downloads 429
1191 Video-Observation: A Phenomenological Research Tool for International Relation?
Authors: Andreas Aagaard Nohr
Abstract:
International Relations is an academic discipline which is rarely in direct contact with its field. However, there has in recent years been a growing interest in the different agents within and beyond the state and their associated practices; yet some of the research tools with which to study them are not widely used. This paper introduces video-observation as a method for the study of IR and argues that it offers a unique way of studying the complexity of the everyday context of actors. The paper is divided into two main parts: First, the philosophical and methodological underpinnings of the kind of data that video-observation produces are discussed, primarily through a discussion of the phenomenology of Husserl, Heidegger, and Merleau-Ponty. Second, taking a simulation of a WTO negotiation round as an example, the paper discusses how the data created can be analysed: in particular with regard to the structure of events, the temporal and spatial organization of activities, rhythm and periodicity, and the concrete role of artefacts and documents. The paper concludes with a discussion of the ontological, epistemological, and practical challenges and limitations that ought to be considered if video-observation is chosen as a method within the field of IR.
Keywords: video-observation, phenomenology, international relations
Procedia PDF Downloads 445
1190 Critical Pedagogy and Literacy Development
Authors: Rajendra Chetty
Abstract:
This paper analyses the experiences of teachers of literacy in underprivileged schools in the Western Cape, South Africa. The purpose is to give teachers in poorly resourced schools within economically deprived areas an opportunity to voice their experiences of teaching literacy. The paper is based on an empirical study using interviews and classroom observation. A descriptive account of the observation data was followed by an interpretive analysis, and the content analysis of the interview data led to the development of themes and patterns for the discussion. The study reveals key factors behind literacy underachievement, including the lack of critical and emancipatory pedagogies, resources, parental support, and teacher knowledge; the absence of cognitive activities; and the social complexity of poverty. The paper recommends a new model of literacy, underpinned by critical pedagogy, that challenges inequality and provides strategic and sustained teacher support in disadvantaged schools; such a model is crucial in a society emerging from oppression and racism.
Keywords: critical pedagogy, disadvantaged schools, literacy, poverty
Procedia PDF Downloads 109
1189 A Low Insertion Loss Variation 10-35 GHz Phase Shifter
Authors: Soroush Rasti Boroujeni, S. Hassan Mousavi, Javad Ebrahimizadeh, Ardeshir Palizban, Mohammad-Reza Nezhad-Ahmadi, Safieddin Safavi-Naeini
Abstract:
This paper presents a wideband True Time Delay (TTD) phase shifter with low insertion loss variation. The circuit benefits from a controllable resistive load shunted with transmission line segments to optimize return loss variations, addressing the unbalanced capacitive nature of the varactor. The phase shifter reduces the complexity of the calibration process because the dependency of insertion loss on the control voltages is improved by up to 3 dB. The TTD phase shifter provides a continuously adjustable delay of 6.4 ps with low insertion loss (IL) in the 10-35 GHz frequency range. The proposed circuit benefits from low-loss phase shifting with a small footprint. Fabricated in a 65 nm CMOS process, the TTD phase shifter occupies only 388 × 615 µm² of chip area, achieving a 20% improvement compared to conventional TTD phase shifters.
Keywords: millimeter-wave phased-array, true time delay phase shifter, insertion loss variation, compact size
Procedia PDF Downloads 5
1188 Digital Individual Benefit Statement: The Use of a Triangulation Methodology to Design a Digital Platform for Switzerland
Authors: Catherine Equey Balzli
Abstract:
Old age retirement pensions are an important concern among the Swiss, but estimating one's income after retirement is difficult due to the complexity of the Swiss insurance system. This project's aim is to prepare for developing a digital platform that will allow individuals to plan for retirement in a simplified manner. The main objective of the platform will be to give individuals the tools to check that their savings and retirement benefits will allow them to continue the lifestyle to which they are accustomed once they are retired. The research results, drawn from qualitative (focus group) and quantitative (survey) methodologies, recommend the scope and functionalities of the digital platform to be developed. A main outcome is the need to limit the platform's scope to the old-age pension only (excluding survivors' or disability pensions, for instance). A further outcome regarding the functionalities is the proposition of scenarios such as early retirement, changes to income, or modifications to personal status. The development of the digital platform will be a subsequent project.
Keywords: benefit statement, digital platform, retirement financial planning, social insurance
Procedia PDF Downloads 110
1187 The Capability of Organizational Leadership: Development of Conceptual Framework
Authors: Kurmet Kivipõld, Maaja Vadi
Abstract:
This paper develops a conceptual framework for organizational leadership capability. Organizational leadership is understood here as a collective multi-level phenomenon that has been embedded into organizational processes as a capability at the level of the entire organization. The paper analyses and systematises the theoretical approaches to multi-level leadership in the existing literature. This analysis marks the foundation of collective leadership at the organizational level, which forms the basis for the development of the conceptual framework of organizational leadership capability. The developed conceptual framework of organizational leadership capability is formed from the synthesis of three groups of base theories: traditional leadership theories, the resource-based view from strategic management, and complexity theory from system theories. These conceptual sources present the main characteristics that determine the nature of organizational leadership capability and are the basis for its measurement.
Keywords: leadership, organizational capability, organizational leadership, resource-based view, system theory
Procedia PDF Downloads 349
1186 Software Component Identification from Its Object-Oriented Code: Graph Metrics Based Approach
Authors: Manel Brichni, Abdelhak-Djamel Seriai
Abstract:
Systems are increasingly complex, and an abstract view of a system can simplify its development. To address this problem, we propose a method to decompose systems into subsystems while reducing their coupling; these subsystems represent components. Starting from an existing object-oriented system, the main idea of our approach is to model all entities of the object-oriented source code as graphs. Such a model is easy to handle, so restructuring algorithms based on graph metrics can be applied. The particularity of our approach is that, in addition to standard metrics such as coupling and cohesion, it integrates graph metrics that give more precision during component identification. To treat this problem, we relied on the ROMANTIC approach, which proposed component-based software architecture recovery from an object-oriented system.
Keywords: software reengineering, software component and interfaces, metrics, graphs
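The general idea of treating classes as graph nodes and scoring candidate components with coupling and cohesion metrics can be sketched as follows. This is only an illustrative example built on made-up class names and a generic community-detection step, not the ROMANTIC tooling or the exact metrics used in the paper.

```python
# Model classes of an object-oriented code base as a dependency graph and use
# graph metrics to group them into candidate components (toy example).
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

G = nx.Graph()
G.add_edges_from([
    ("Order", "OrderItem"), ("Order", "Customer"), ("OrderItem", "Product"),
    ("Invoice", "Order"), ("Invoice", "PdfRenderer"), ("PdfRenderer", "FontCache"),
    ("Customer", "Address"),
])

# Candidate components: communities with high internal connectivity.
components = [set(c) for c in greedy_modularity_communities(G)]

def cohesion(g, part):
    """Density of edges inside a candidate component."""
    return nx.density(g.subgraph(part))

def coupling(g, part):
    """Edges leaving a candidate component, relative to its size."""
    boundary = list(nx.edge_boundary(g, part))
    return len(boundary) / max(len(part), 1)

for comp in components:
    print(sorted(comp), "cohesion=%.2f" % cohesion(G, comp),
          "coupling=%.2f" % coupling(G, comp))
```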
Procedia PDF Downloads 500
1185 Effects of Epinephrine on Gene Expressions during the Metamorphosis of Pacific Oyster Crassostrea gigas
Authors: Fei Xu, Guofan Zhang, Xiao Liu
Abstract:
Many major marine invertebrate phyla are characterized by indirect development. These animals transition from planktonic larvae to benthic adults via settlement and metamorphosis, which offers many advantages for adapting to the marine environment. Studying the biological process of metamorphosis is thus key to understanding the origin and evolution of indirect development. Although the mechanism of metamorphosis has been studied largely in terms of its relationships with the marine environment, microorganisms, and neurohormones, little is known about the gene regulation network (GRN) during metamorphosis. We treated competent oyster pediveligers with epinephrine, which is known to effectively induce oyster metamorphosis, and analyzed the dynamics of genes and proteins with transcriptomic and proteomic methods. The results indicated significant upregulation of the protein synthesis system, as well as of some transcription factors, including Homeobox, basic helix-loop-helix, and nuclear receptors. The results suggest the complexity of the GRN governing the transition stage of oyster metamorphosis.
Keywords: indirect development, gene regulation network, protein synthesis, transcription factors
Procedia PDF Downloads 136
1184 Merging Sequence Diagrams Based Slicing
Authors: Bouras Zine Eddine, Talai Abdelouaheb
Abstract:
The need to merge software artifacts seems inherent to modern software development. Distributing development over several teams and breaking tasks into smaller, more manageable pieces are effective means of dealing with this kind of complexity. In each case, the separately developed artifacts need to be assembled as efficiently as possible into a consistent whole in which the parts still function as described. Moreover, the earlier changes are introduced into the life cycle, the easier they are for designers to manage. Interaction-based specifications such as UML sequence diagrams have been found effective in this regard. As a result, sequence diagrams can be used not only for capturing system behaviors but also for merging changes in order to create a new version. The objective of this paper is to suggest a new approach to the problem of software merging at the level of sequence diagrams by using the concept of dependence analysis, which formally captures all mappings and differences between elements of sequence diagrams and serves as the key concept for creating a new version of a sequence diagram.
Keywords: system behaviors, sequence diagram merging, dependence analysis, sequence diagram slicing
Procedia PDF Downloads 338
1183 On Block Vandermonde Matrix Constructed from Matrix Polynomial Solvents
Authors: Malika Yaici, Kamel Hariche
Abstract:
In control engineering, systems described by matrix fractions are studied through the properties of block roots, also called solvents. These solvents are usually dealt with in block Vandermonde matrix form. Inverses and determinants of Vandermonde matrices and block Vandermonde matrices are used in solving problems of numerical analysis in many domains but require costly computations. Even though Vandermonde matrices are well known, and methods to compute their inverses and determinants are numerous and generally based on interpolation techniques, methods to compute the inverse and determinant of a block Vandermonde matrix have not been well studied. In this paper, some properties of these matrices and iterative algorithms to compute the determinant and the inverse of a block Vandermonde matrix are given. These methods are deduced from partitioned-matrix inversion and determinant computation methods. Due to their large size, parallelization may be a way to reduce the computational cost, so a parallelization of these algorithms is proposed and validated by a comparison using algorithmic complexity.
Keywords: block vandermonde matrix, solvents, matrix polynomial, matrix inverse, matrix determinant, parallelization
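For readers unfamiliar with the object under study, the following sketch shows one common way a block Vandermonde matrix can be assembled from solvents (assuming the right-solvent convention where block (i, j) is the i-th power of solvent R_j). NumPy's generic routines are used only as a baseline check of the determinant and inverse; this is not the iterative or parallel algorithm proposed in the paper.

```python
import numpy as np

def block_vandermonde(solvents):
    """Assemble V whose (i, j) block is solvents[j] raised to the power i."""
    l = len(solvents)
    m = solvents[0].shape[0]
    V = np.zeros((l * m, l * m))
    for j, R in enumerate(solvents):
        P = np.eye(m)                       # R**0 = identity block
        for i in range(l):
            V[i * m:(i + 1) * m, j * m:(j + 1) * m] = P
            P = P @ R                       # next power of the solvent
    return V

# Two 2x2 solvents of a matrix polynomial (toy values)
R1 = np.array([[1.0, 1.0], [0.0, 2.0]])
R2 = np.array([[3.0, 0.0], [1.0, 4.0]])
V = block_vandermonde([R1, R2])
print(V)
print("det(V) =", np.linalg.det(V))
print("inverse of V =\n", np.linalg.inv(V))
```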
Procedia PDF Downloads 237
1182 Government Intervention Strategies in Providing Water to Rural Communities in the O R Tambo District Municipality, South Africa
Authors: Cecilia Kunseh Betek
Abstract:
Managing rural water supply systems effectively and efficiently is a challenge in the O R Tambo District Municipality due to the long distances between consumers and municipal centres. This is coupled with the low income of most residents and the government's policy of free basic water, which together make rural water provision very difficult. With regard to cartage, the results reveal that the majority (84.4%) of the population covers distances of about 1 kilometre to fetch water, and 15.6% travel up to a kilometre to access water facilities. This means that the water sources are located very far from households, outside the officially legislated range of 200 metres. There are many reasons that account for this situation. Firstly, it implies that there are inadequate standpipes to cater for all the homesteads scattered across the rugged terrain of the O R Tambo District Municipality. Secondly, and following from the first explanation, it can be seen that the funding made available is either inadequate or not efficiently spent on the targeted projects. The situation in the rural areas of South Africa is fraught with complexity when it comes to service delivery.
Keywords: water, management, government, rural
Procedia PDF Downloads 283
1181 New Method to Increase Contrast of Electromicrograph of Rat Tissues Sections
Authors: Lise Paule Labéjof, Raíza Sales Pereira Bizerra, Galileu Barbosa Costa, Thaísa Barros dos Santos
Abstract:
Since the beginning of microscopy, improving image quality has always been a concern of its users. For transmission electron microscopy (TEM) in particular, the problem is even more important due to the complexity of the sample preparation technique and the many variables that can affect the preservation of structures, the proper operation of the equipment used, and therefore the quality of the images obtained. Animal tissues being transparent, it is necessary to apply a contrast agent in order to identify the elements of their ultrastructural morphology. Several methods of tissue contrastation for TEM imaging have already been developed; the most used are "in block" contrastation and "in situ" contrastation. This report presents an alternative technique in which the contrast agent is applied in vivo, i.e. before sampling. With this new method, the electron micrographs of the tissue sections show better contrast than those obtained in situ and present no artefacts from precipitation of the contrast agent. Another advantage is that only a small amount of contrast agent is needed to obtain a good result, which matters given that most contrast agents are expensive and extremely toxic.
Keywords: image quality, microscopy research, staining technique, ultra thin section
Procedia PDF Downloads 431
1180 Liquidity and Cash Management in Business-A Key to Business Survival and Growth: The Nigerian Case
Authors: Ugbor Raphael Oluchukwu
Abstract:
Focusing on liquidity comes more naturally to a Chief Executive Officer than to an Accountant, who is trained to practice accrual accounting. When a business is just commencing, it is essentially run on a cheque book (cash accounting), and for as long as there is cash in the accounts, the business is solvent. When complexity sets in and the business adopts financial accounting, the effect of liquidity and cash management becomes more pronounced. The management of cash no doubt impacts positively on the survival and growth of firms. What is in doubt is how much cash a firm should hold to be able to stay "afloat". The focus of this paper is liquidity and cash management in business, taking Nigeria as the case. The specific objectives of the study are to carry out a theoretical review of how much cash a firm should hold to stay afloat, and a theoretical analysis of the effect of cash flow on the survival and growth of firms in Nigeria.
Keywords: cash, firm survival, growth, liquidity management
Procedia PDF Downloads 583
1179 Lean Implementation Analysis on the Safety Performance of Construction Projects in the Philippines
Authors: Kim Lindsay F. Restua, Jeehan Kyra A. Rivero, Joneka Myles D. Taguba
Abstract:
Lean construction is defined as an approach to construction that aims to reduce waste in the process without compromising the value of the project. Numerous lean construction tools are applied in the construction process to maximize work efficiency and customer satisfaction while minimizing waste. However, the complexity of and differences among construction projects raise challenges in achieving the benefits lean construction can deliver, such as improved safety performance. The objective of this study is to determine the relationship between lean construction tools and their effects on safety performance. The relationship between the tools applied in construction and safety performance is identified through logistic regression analysis, followed by correlation analysis. Based on the findings, almost 60% of the factors listed in the study, covering different lean construction tools and effects, have a significant relationship with the level of safety in construction projects.
Keywords: correlation analysis, lean construction tools, lean construction, logistic regression analysis, risk management, safety
Procedia PDF Downloads 183
1178 Problem of Services Selection in Ubiquitous Systems
Authors: Malika Yaici, Assia Arab, Betitra Yakouben, Samia Zermani
Abstract:
Ubiquitous computing is nowadays a reality through the networking of a growing number of computing devices. It makes it possible to provide users with context-aware information and services in a heterogeneous environment, anywhere and anytime. Selecting the best context-aware service among many available services and providers is a tedious problem. In this paper, a service selection method based on the Constraint Satisfaction Problem (CSP) formalism is proposed. The services are modeled as variables and domains, and the user context, preferences, and provider characteristics are modeled as constraints. The backtracking algorithm is used to solve the problem and find the service and provider that best match the user requirements. Even though this algorithm has exponential complexity, its use guarantees that the service that best matches the user requirements will be found. A comparison of the proposed method with existing solutions concludes the paper.
Keywords: ubiquitous computing, services selection, constraint satisfaction problem, backtrack algorithm
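A bare-bones sketch of this CSP formulation is shown below. The variables, domains, and constraints (distance and cost limits) are invented for illustration and stand in for the user context and provider characteristics the abstract mentions; it is not the authors' implementation.

```python
# Variables are required service types, domains are candidate providers, and
# constraints encode user context and preferences (toy data).
variables = ["printing", "display"]
domains = {
    "printing": [{"provider": "P1", "cost": 3, "distance": 40},
                 {"provider": "P2", "cost": 1, "distance": 250}],
    "display":  [{"provider": "P1", "cost": 2, "distance": 40},
                 {"provider": "P3", "cost": 2, "distance": 90}],
}

def satisfies(assignment):
    # Example constraints: everything within 100 m of the user, total cost <= 5.
    if any(s["distance"] > 100 for s in assignment.values()):
        return False
    return sum(s["cost"] for s in assignment.values()) <= 5

def backtrack(assignment=None):
    assignment = assignment or {}
    if len(assignment) == len(variables):
        return assignment
    var = next(v for v in variables if v not in assignment)
    for candidate in domains[var]:
        assignment[var] = candidate
        if satisfies(assignment) and (result := backtrack(assignment)):
            return result
        del assignment[var]
    return None

print(backtrack())
```

A real selector would also rank the feasible assignments (for instance by total cost or user preference), but the pruning structure of the backtracking search is the same.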
Procedia PDF Downloads 242
1177 A Model for Teaching Arabic Grammar in Light of the Common European Framework of Reference for Languages
Authors: Erfan Abdeldaim Mohamed Ahmed Abdalla
Abstract:
The complexity of Arabic grammar poses challenges for learners, particularly in relation to its arrangement, classification, abundance, and bifurcation. The challenge at hand is a result of the contextual factors that gave rise to the grammatical rules in question, as well as the pedagogical approach employed at the time, which was tailored to the needs of learners during that particular historical period. Consequently, modern-day students encounter this same obstacle. This requires a thorough examination of the arrangement and categorization of Arabic grammatical rules based on particular criteria, as well as an assessment of their objectives. Additionally, it is necessary to identify the prevalent and renowned grammatical rules, as well as those that are infrequently encountered, obscure, and disregarded. This paper presents a compilation of grammatical rules that require arrangement and categorization in accordance with the standards outlined in the Common European Framework of Reference for Languages (CEFR). In addition to facilitating comprehension of the curriculum, accommodating learners' requirements, and establishing the fundamental competencies for achieving proficiency in Arabic, it is imperative to ascertain the conventions that language learners need in alignment with explicitly delineated benchmarks such as the CEFR criteria. The aim of this study is to reduce the quantity of grammatical rules that are typically presented to non-native Arabic speakers in Arabic textbooks. This reduction is expected to enhance the motivation of learners to continue their Arabic language acquisition and to approach the level of proficiency of native speakers. The primary obstacle faced by learners is the intricate nature of Arabic grammar. The proliferation and complexity of the rules evident in Arabic language textbooks designed for individuals who are not native speakers are noteworthy. The inadequate organisation and delivery of the material create the impression that the grammar is being imparted to a student with the intention of memorising "Alfiyyat-Ibn-Malik." Consequently, the sequence of grammatical rules instruction was altered, with rules originally intended for later instruction being presented first and those intended for earlier instruction being presented subsequently. Students often focus on learning grammatical rules that are not necessarily required while neglecting the rules that are commonly used in everyday speech and writing. Non-Arab students are taught Arabic grammar chapters that are infrequently utilised in Arabic literature and may be a topic of debate among grammarians. These findings are derived from the statistical analysis and investigations conducted by the researcher, which will be disclosed in due course of the research. To instruct non-Arabic speakers in grammatical rules, it is imperative to discern the most prevalent grammatical frameworks in grammar manuals and linguistic literature (the study sample). The present proposal suggests the allocation of grammatical structures across linguistic levels, taking into account the guidelines of the CEFR, as well as the grammatical structures that are necessary for non-Arabic-speaking learners to produce modern, cohesive, and comprehensible language.
Keywords: grammar, Arabic, functional, framework, problems, standards, statistical, popularity, analysis
Procedia PDF Downloads 87
1176 Integrating RAG with Prompt Engineering for Dynamic Log Parsing and Anomaly Detections
Authors: Liu Lin Xin
Abstract:
With the increasing complexity of systems, log parsing and anomaly detection have become crucial for maintaining system stability. However, traditional methods often struggle with adaptability and accuracy, especially when dealing with rapidly evolving log content and unfamiliar domains. To address these challenges, this paper proposes an approach that integrates Retrieval-Augmented Generation (RAG) with prompt engineering for large language models, applied specifically in LogPrompt. This approach enables dynamic log parsing and intelligent anomaly detection by combining real-time information retrieval with prompt optimization. The proposed method significantly enhances the adaptability of log analysis and improves the interpretability of results. Experimental results on several public datasets demonstrate the method's superior performance, particularly in scenarios lacking training data, where it significantly outperforms traditional methods. This paper introduces a novel technical pathway for log parsing and anomaly detection, showcasing substantial theoretical value and practical potential.
Keywords: log parsing, anomaly detection, RAG, prompt engineering, LLMs
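The retrieval-plus-prompting idea can be illustrated very roughly as follows. Everything in the snippet (the knowledge base, the TF-IDF retriever, the prompt template, and the omitted LLM call) is an assumption made for illustration; it is not the paper's LogPrompt pipeline.

```python
# Retrieve the most similar known log patterns and embed them in a prompt
# that asks an LLM to parse and classify a new log line.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

knowledge_base = [
    "ERROR disk_full: /var/log partition exceeded 95% usage",
    "WARN retry_timeout: upstream service did not answer within 30s",
    "INFO rotation: log file rotated successfully",
]

def retrieve(query, k=2):
    vec = TfidfVectorizer().fit(knowledge_base + [query])
    kb, q = vec.transform(knowledge_base), vec.transform([query])
    scores = cosine_similarity(q, kb)[0]
    return [knowledge_base[i] for i in scores.argsort()[::-1][:k]]

def build_prompt(new_log_line):
    context = "\n".join(retrieve(new_log_line))
    return (
        "You are a log analysis assistant.\n"
        f"Known log patterns:\n{context}\n"
        f"New log line:\n{new_log_line}\n"
        "Parse the line into template and variables, then state whether it is anomalous."
    )

print(build_prompt("ERROR disk_full: /data partition exceeded 99% usage"))
# The resulting prompt string would then be sent to an LLM of choice (omitted here).
```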
Procedia PDF Downloads 31
1175 Comparison of Seismic Retrofitting Methods for Existing Foundations in Seismological Active Regions
Authors: Peyman Amini Motlagh, Ali Pak
Abstract:
Seismic retrofitting of important structures is essential in seismologically active zones. The importance is doubled when it comes to buildings such as schools, hospitals, and bridges, because they are required to remain serviceable even after a major earthquake. Generally, seismic retrofitting codes have paid little attention to the retrofitting of foundations due to its construction complexity. In this paper, different methods for the seismic retrofitting of tall buildings' foundations are discussed and evaluated. Foundations are considered in three categories. First, foundations that are in danger of liquefaction of their underlying soil. Second, foundations located on slopes in seismologically active regions. Third, foundations designed according to former design codes, which may show structural defects under earthquake loads. After describing the different methods used in different countries for retrofitting existing foundations in seismologically active regions, a comprehensive comparison between these methods with regard to the above-mentioned categories is carried out. The paper gives some guidelines for choosing the best method for the seismic retrofitting of tall buildings' foundations in retrofitting projects.
Keywords: existing foundation, landslide, liquefaction, seismic retrofitting
Procedia PDF Downloads 389
1174 Development of a Decision-Making Method by Using Machine Learning Algorithms in the Early Stage of School Building Design
Authors: Rajaian Hoonejani Mohammad, Eshraghi Pegah, Zomorodian Zahra Sadat, Tahsildoost Mohammad
Abstract:
Over the past decade, energy consumption in educational buildings has steadily increased. The purpose of this research is to provide a method to quickly predict the energy consumption of buildings by evaluating zones separately and decomposing the building, thus eliminating geometric complexity at the early design stage. To produce this framework, machine learning algorithms such as support vector regression (SVR) and artificial neural networks (ANN) are used to predict energy consumption and thermal comfort metrics in a school used as a case study. The database consists of more than 55,000 samples from three climates of Iran. Cross-validation and unseen data have been used for validation. For one specific label, cooling energy, the prediction accuracy is at least 84% and 89% for SVR and ANN, respectively. Overall, the results show that the SVR performed much better than the ANN.
Keywords: early stage of design, energy, thermal comfort, validation, machine learning
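A compact sketch of the SVR-plus-cross-validation setup described above is given below. The synthetic features (zone area, window-to-wall ratio, occupancy density, climate index) and the toy target are assumptions standing in for the paper's 55,000-sample database.

```python
# Predict a zone's cooling energy from simple early-design descriptors with SVR
# and evaluate it by cross-validation (synthetic data only).
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n = 1000
X = np.column_stack([rng.uniform(20, 200, n),      # zone area (m2)
                     rng.uniform(0.1, 0.6, n),     # window-to-wall ratio
                     rng.uniform(0.05, 0.3, n),    # occupancy density
                     rng.integers(0, 3, n)])       # climate index
y = 40 * X[:, 1] + 0.5 * X[:, 0] + 30 * X[:, 2] + 5 * X[:, 3] + rng.normal(0, 2, n)

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print("cross-validated R2: %.3f +/- %.3f" % (scores.mean(), scores.std()))
```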
Procedia PDF Downloads 71
1173 Process Driven Architecture For The ‘Lessons Learnt’ Knowledge Sharing Framework: The Case Of A ‘Lessons Learnt’ Framework For KOC
Authors: Rima Al-Awadhi, Abdul Jaleel Tharayil
Abstract:
On a regular basis, KOC engages in various types of projects. However, due to the very nature and complexity involved, each project generates a lot of ‘learnings’ that need to be factored in while drafting a new contract, so that the same mistakes are not repeated. But many a time these learnings are localized and remain tacit, leading to scope rework, longer cycle times, schedule overruns, adjustment orders, and claims. Also, these experiences are not readily available to new employees, leading to a steep learning curve and a longer time to competency. This paper shares our experience in designing and implementing a process-driven architecture for the ‘lessons learnt’ knowledge sharing framework in KOC. It highlights the ‘lessons learnt’ sharing process adopted, its integration with organizational processes, the governance framework, the challenges faced, and the learning from our experience in implementing a ‘lessons learnt’ framework.
Keywords: lessons learnt, knowledge transfer, knowledge sharing, successful practices, Lessons Learnt Workshop, governance framework
Procedia PDF Downloads 575
1172 A Network-Theorical Perspective on Music Analysis
Authors: Alberto Alcalá-Alvarez, Pablo Padilla-Longoria
Abstract:
The present paper describes a framework for constructing mathematical networks that encode relevant musical information from a music score for structural analysis. These graphs capture statistical information about music elements such as notes, chords, rhythms, and intervals, and the relations among them, and so become helpful in visualizing and understanding important stylistic features of a music fragment. In order to build such networks, musical data is parsed out of a digital symbolic music file. This data undergoes different analytical procedures from graph theory, such as measuring the centrality of nodes, community detection, and entropy calculation. The resulting networks reflect important structural characteristics of the fragment in question: the predominant elements, the connectivity between them, and the complexity of the information contained in it. Music pieces in different styles are analyzed, and the results are contrasted with the outcome of traditional analysis in order to show the consistency and potential utility of this method for music analysis.
Keywords: computational musicology, mathematical music modelling, music analysis, style classification
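As a toy illustration of the kind of network described above (the note sequence is invented; real input would be parsed from a symbolic score such as MIDI or MusicXML, and the choice of measures here is only an example), one can build a graph of note-to-note transitions and apply standard graph-theoretic procedures:

```python
# Build a directed graph of note-to-note transitions from a toy melody and
# inspect simple graph-theoretic measures.
from collections import Counter
import networkx as nx

notes = ["C4", "E4", "G4", "C5", "G4", "E4", "C4", "D4", "F4", "A4", "F4", "D4", "C4"]

transitions = Counter(zip(notes, notes[1:]))
G = nx.DiGraph()
for (a, b), w in transitions.items():
    G.add_edge(a, b, weight=w)

# Predominant elements: nodes with high degree centrality.
centrality = nx.degree_centrality(G)
print(sorted(centrality.items(), key=lambda kv: -kv[1])[:3])

# Communities of closely related pitches (on the undirected projection).
communities = nx.algorithms.community.greedy_modularity_communities(G.to_undirected())
print([sorted(c) for c in communities])
```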
Procedia PDF Downloads 101
1171 Phytoadaptation in Desert Soil Prediction Using Fuzzy Logic Modeling
Authors: S. Bouharati, F. Allag, M. Belmahdi, M. Bounechada
Abstract:
In terms of forecasting the ecological effects of desertification, the purpose of this study is to develop a predictive model of the growth and adaptation of species under arid environmental and bioclimatic conditions. The impact of climate change and desertification is the result of the combined effects of the magnitude and frequency of these phenomena. Because the data involved in the phytopathogenic process and in bacterial growth in arid soil arise in an uncertain environment, owing to their complexity, a suitable methodology is needed for the analysis of these variables, and the basic principles of fuzzy logic are perfectly suited to this purpose. As input variables, we consider the physical parameters, soil type, nature of the bacteria, and the plant species concerned. The output variable is the adaptability of the species, expressed by the growth rate or extinction. In conclusion, we present possible strategies for adaptation, with or without shifting plantation areas, and the nature of adequate vegetation.
Keywords: climate changes, dry soil, phytopathogenicity, predictive model, fuzzy logic
Procedia PDF Downloads 320
1170 A Review of HVDC Modular Multilevel Converters Subjected to DC and AC Faults
Authors: Jude Inwumoh, Adam P. R. Taylor, Kosala Gunawardane
Abstract:
Modular multilevel converters (MMCs) exhibit a highly scalable and modular character, with good voltage/power expansion, fault tolerance capability, low output harmonic content, good redundancy, and a flexible front-end configuration. Fault detection, location, and isolation, as well as maintaining fault ride-through (FRT), are major challenges to MMC reliability and power supply sustainability. Different papers have been reviewed to seek the best MMC configuration with fault-handling capability. DC faults are the most common, while the probability of an AC fault occurring in a modular multilevel converter (MMC) is low; however, the consequences of AC faults are severe. This paper reviews several MMC topologies and modulation techniques for tackling faults. These fault control strategies are compared based on cost, complexity, controllability, and power loss. A meshed network of half-bridge (HB) MMC topology was found to be optimal in providing fault ride-through compared with other MMC topologies, but only when combined with DC circuit breakers (CBs), AC CBs, and fault current limiters (FCLs).
Keywords: MMC-HVDC, DC faults, fault current limiters, control scheme
Procedia PDF Downloads 137
1169 Navigating the Complexity of Guillain-Barré Syndrome and Miller Fisher Syndrome Overlap Syndrome: A Pediatric Case Report
Authors: Kamal Chafiq, Youssef Hadzine, Adel Elmekkaoui, Othmane Benlenda, Houssam Rajad, Soukaina Wakrim, Hicham Nassik
Abstract:
Guillain-Barré syndrome/Miller Fisher syndrome (GBS/MFS) overlap syndrome is an extremely rare variant of Guillain-Barré syndrome (GBS) in which Miller Fisher syndrome (MFS) coexists with other characteristics of GBS, such as limb weakness, paresthesia, and facial paralysis. We report the clinical case of a 12-year-old patient, with no pathological history, who acutely presented with ophthalmoplegia, areflexia, facial diplegia, and swallowing and phonation disorders, followed by progressive, descending, and symmetrical paresis affecting first the upper limbs and then the lower limbs. Albuminocytological dissociation was found on cerebrospinal fluid analysis. Magnetic resonance imaging of the spinal cord showed enhancement and thickening of the cauda equina roots. The patient was treated with immunoglobulins, with a favorable clinical outcome.
Keywords: Guillain-Barré syndrome, Miller Fisher syndrome, overlap syndrome, anti-GQ1b antibodies
Procedia PDF Downloads 75
1168 AM/E/c Queuing Hub Maximal Covering Location Model with Fuzzy Parameter
Authors: M. H. Fazel Zarandi, N. Moshahedi
Abstract:
The hub location problem appears in a variety of applications such as medical centers, firefighting facilities, cargo delivery systems, and telecommunication network design. The location of service centers has a strong influence on the congestion at each of them and, consequently, on the quality of service. This paper presents a fuzzy maximal hub covering location problem (FMCHLP) in which the travel cost between any pair of nodes is considered a fuzzy variable. In order to consider the quality of service, we model each hub as a queue, where arrivals follow a Poisson distribution and service times follow an Erlang distribution. In this paper, a nonlinear mathematical programming model is presented first and then converted to a linear one. We solved the linear model using GAMS software for instances of up to 25 nodes; for larger sizes, given the complexity of hub covering location problems, a simulated annealing algorithm is developed to solve and test the model. We also used the possibilistic c-means clustering method to find an initial solution.
Keywords: fuzzy modeling, location, possibilistic clustering, queuing
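The simulated annealing component mentioned above can be sketched in a few lines. The instance data, the neighbourhood move (swapping one hub), the cooling schedule, and the simple assignment objective are all assumptions for illustration; the paper's actual FMCHLP objective, fuzzy travel costs, and queuing constraints are not reproduced here.

```python
# Simulated annealing loop for selecting p hub nodes out of n (toy instance).
import math, random

random.seed(0)
n, p = 12, 3
cost = [[abs(i - j) + random.random() for j in range(n)] for i in range(n)]

def objective(hubs):
    # Total cost of assigning every node to its cheapest hub (to be minimized).
    return sum(min(cost[i][h] for h in hubs) for i in range(n))

current = random.sample(range(n), p)
best, best_val = list(current), objective(current)
T = 10.0
while T > 1e-3:
    # Neighbor: swap one hub for a non-hub node.
    candidate = list(current)
    candidate[random.randrange(p)] = random.choice(
        [i for i in range(n) if i not in current])
    delta = objective(candidate) - objective(current)
    if delta < 0 or random.random() < math.exp(-delta / T):
        current = candidate
        if objective(current) < best_val:
            best, best_val = list(current), objective(current)
    T *= 0.95          # geometric cooling

print("selected hubs:", sorted(best), "objective:", round(best_val, 2))
```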
Procedia PDF Downloads 391
1167 Adaptive Neuro Fuzzy Inference System Model Based on Support Vector Regression for Stock Time Series Forecasting
Authors: Anita Setianingrum, Oki S. Jaya, Zuherman Rustam
Abstract:
Forecasting stock prices is a challenging task due to the complex time series nature of the data. The complexity arises from the many variables that affect the stock market. Many time series models have been proposed before, but these previous models still have some problems: 1) they involve subjectivity in choosing the technical indicators, and 2) they rely on assumptions about the variables, which limits their applicability across datasets. Therefore, this paper studies a novel Adaptive Neuro-Fuzzy Inference System (ANFIS) time series model based on Support Vector Regression (SVR) for forecasting the stock market. In order to evaluate the performance of the proposed models, stock market transaction data of TAIEX and HIS from January to December 2015 were collected as experimental datasets. As a result, the method outperformed its counterparts in terms of accuracy.
Keywords: ANFIS, fuzzy time series, stock forecasting, SVR
Procedia PDF Downloads 244
1166 Active Power Flow Control Using a TCSC Based Backstepping Controller in Multimachine Power System
Authors: Naimi Abdelhamid, Othmane Abdelkhalek
Abstract:
With the current rise in the demand for electrical energy, present-day power systems, which are already large and complex, will continue to grow in both size and complexity. Flexible AC Transmission System (FACTS) controllers provide new capabilities in both steady-state power flow control and dynamic stability control. The Thyristor Controlled Series Capacitor (TCSC) is a FACTS device used for controlling the flow of active power in electric power systems and for increasing the capacity of transmission lines. In this paper, a Backstepping Power Flow Controller (BPFC) for a TCSC in a multimachine power system is developed and tested. The simulation results show that the proposed TCSC controller is capable of controlling the transmitted active power and improving transient stability when compared with a conventional PI Power Flow Controller (PIPFC).
Keywords: FACTS, thyristor controlled series capacitor (TCSC), backstepping, BPFC, PIPFC
Procedia PDF Downloads 527