Search results for: low complexity
1167 Construction Quality Perception of Construction Professionals and Their Expectations from a Quality Improvement Technique in Pakistan
Authors: Muhammad Yousaf Sadiq
Abstract:
The complexity in defining construction quality arises from its perception, which is based on inherent market conditions and their requirements, the diversity of the stakeholders themselves, and their desired output. A quantitative, survey-based approach was adopted in this study. A questionnaire-based survey was conducted to assess construction quality perception and expectations in the context of a quality improvement technique. The survey feedback of professionals from leading construction organizations/companies in the Pakistani construction industry was analyzed. The financial capacity, organizational structure, and construction experience of the construction firms formed the basis for their selection. The quality perception was found to be project-scope-oriented and regarded as an excess cost for a construction project. Any quality improvement technique was expected to maximize the employer's profit by improving productivity in a construction project. The study is beneficial for construction professionals to assess the prevailing construction quality perception and the expectations from the implementation of any quality improvement technique in construction projects.
Keywords: construction quality, expectation, improvement, perception
Procedia PDF Downloads 456
1166 Dewatering Agents for Granular Bauxite
Authors: Bruno Diniz Fecchio
Abstract:
Operations have been setting increasingly challenging targets for the dewatering process, requiring lower humidity for concentrates. Chemical dewatering agents are able to improve solid/liquid separation processes, allowing operations to deal with increased complexity caused by either mineralogical changes or seasonal events that impose challenging moisture requirements for transportation and downstream steps. These chemicals reduce water retention by reducing the capillary pressure of the mineral and contributing to improved water drainage. The current study addresses the effects of such reagents on pile dewatering of bauxite. The chemicals were able to decrease the moisture of granulated bauxite (particle size of 5–50 mm). Laboratory-scale tests and industrial trials achieved up to an 11% relative moisture reduction, which reinforces the strong interaction between dewatering agents and the particle surface of granulated bauxite. Moreover, the evaluated dewatering agents did not present any negative impact on these operations.
Keywords: bauxite, dewatering agents, pile dewatering, moisture reduction
Procedia PDF Downloads 73
1165 A Minimum Spanning Tree-Based Method for Initializing the K-Means Clustering Algorithm
Authors: J. Yang, Y. Ma, X. Zhang, S. Li, Y. Zhang
Abstract:
The traditional k-means algorithm has been widely used as a simple and efficient clustering method. However, the algorithm often converges to local minima because it is sensitive to the initial cluster centers. In this paper, an algorithm for selecting initial cluster centers on the basis of a minimum spanning tree (MST) is presented. The set of vertices in the MST with the same degree is regarded as a whole and used to find the skeleton data points. Furthermore, a distance measure between the skeleton data points that takes both degree and Euclidean distance into account is introduced. Finally, the MST-based initialization method for the k-means algorithm is presented, and its time complexity is analyzed. The algorithm is tested on five data sets from the UCI Machine Learning Repository. The experimental results illustrate its effectiveness compared to three existing initialization methods.
Keywords: degree, initial cluster center, k-means, minimum spanning tree
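The degree-based skeleton selection is specific to this paper, but the general idea of seeding k-means from a minimum spanning tree can be sketched in a few lines. The following is a minimal illustration under a common simplification, not the authors' exact algorithm: it builds the MST with Prim's algorithm, cuts the k-1 heaviest edges, and uses the resulting component means as the initial centers.

```python
import numpy as np

def mst_prim(X):
    """Return the MST edges (i, j, weight) over points X via Prim's algorithm."""
    n = len(X)
    d = np.linalg.norm(X[:, None] - X[None, :], axis=2)  # pairwise distances
    in_tree = np.zeros(n, dtype=bool)
    in_tree[0] = True
    best = d[0].copy()                 # cheapest known edge into each vertex
    parent = np.zeros(n, dtype=int)
    edges = []
    for _ in range(n - 1):
        j = int(np.argmin(np.where(in_tree, np.inf, best)))
        edges.append((parent[j], j, best[j]))
        in_tree[j] = True
        closer = d[j] < best
        best[closer] = d[j][closer]
        parent[closer] = j
    return edges

def mst_init_centers(X, k):
    """Cut the k-1 heaviest MST edges; return the mean of each component."""
    edges = sorted(mst_prim(X), key=lambda e: e[2])
    if k > 1:
        edges = edges[:-(k - 1)]
    parent = list(range(len(X)))       # union-find over remaining edges
    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]
            a = parent[a]
        return a
    for i, j, _ in edges:
        parent[find(i)] = find(j)
    labels = np.array([find(i) for i in range(len(X))])
    return np.array([X[labels == c].mean(axis=0) for c in np.unique(labels)])
```

Because the heaviest MST edges tend to bridge well-separated groups, the resulting centers start inside distinct clusters rather than at random points.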
Procedia PDF Downloads 397
1164 A Long Short-Term Memory Based Deep Learning Model for Corporate Bond Price Predictions
Authors: Vikrant Gupta, Amrit Goswami
Abstract:
The fixed income market forms the basis of the modern financial market. All other assets in financial markets derive their value from the bond market. Owing to their over-the-counter nature, corporate bonds have relatively little publicly available data and are thus researched far less than equities. Bond price prediction is a complex financial time series forecasting problem and is considered crucial in the domain of finance. Bond prices are highly volatile and noisy, which makes it very difficult for traditional statistical time-series models to capture the complexity of the series patterns, leading to inefficient forecasts. To overcome the inefficiencies of statistical models, various machine learning techniques were initially used in the literature for more accurate forecasting of time series. However, simple machine learning methods such as linear regression, support vector machines, and random forests fail to provide efficient results when tested on highly complex sequences such as stock prices and bond prices. Hence, to capture these intricate sequence patterns, various deep learning-based methodologies have been discussed in the literature. In this study, a recurrent neural network-based deep learning model using long short-term memory networks for the prediction of corporate bond prices is discussed. Long short-term memory networks (LSTMs) have been widely used in the literature for sequence learning tasks in domains such as machine translation and speech recognition. In recent years, various studies have discussed the effectiveness of LSTMs in forecasting complex time-series sequences and have shown promising results compared to other methodologies. LSTMs are a special kind of recurrent neural network capable of learning the long-term dependencies that traditional neural networks fail to capture, thanks to their memory function.
In this study, a simple LSTM, a stacked LSTM, and a masked LSTM based model are discussed with respect to varying input sequences (three days, seven days, and 14 days). In order to facilitate faster learning and to gradually decompose the complexity of the bond price sequence, Empirical Mode Decomposition (EMD) has been used, which resulted in an accuracy improvement over the standalone LSTM model. With a variety of technical indicators and the EMD-decomposed time series, the masked LSTM outperformed the other two counterparts in terms of prediction accuracy. To benchmark the proposed model, the results have been compared with traditional time series models (ARIMA), shallow neural networks, and the three LSTM variants discussed above. In summary, our results show that LSTM models provide more accurate results and should be explored further within the asset management industry.
Keywords: bond prices, long short-term memory, time series forecasting, empirical mode decomposition
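The varying input sequences mentioned above (three, seven, and 14 days) amount to a sliding-window transformation of the price series. A minimal, framework-free sketch of that preprocessing step might look like the following; the LSTM architectures themselves are not reproduced here, and the variable names are illustrative only:

```python
import numpy as np

def make_windows(prices, lookback):
    """Slice a 1-D price series into (samples, lookback) inputs
    and next-step targets for a sequence model."""
    prices = np.asarray(prices, dtype=float)
    X = np.array([prices[i:i + lookback] for i in range(len(prices) - lookback)])
    y = prices[lookback:]
    return X, y

# e.g. X3, y3 = make_windows(series, 3)  # three-day lookback
```

Each row of X would then be fed to the network; in the EMD variant, the same windowing would be applied to each intrinsic mode function separately before recombining the forecasts.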
Procedia PDF Downloads 124
1163 The Estimation of Human Vital Signs Complexity
Authors: L. Bikulciene, E. Venskaityte, G. Jarusevicius
Abstract:
Non-stationary and nonlinear signals generated by living complex systems defy traditional mechanistic approaches, which are based on homeostasis. Our previous studies have shown that evaluating the interactions of physiological signals using special analysis methods is suitable for the observation of physiological processes. We demonstrate the possibility of interpreting changes in the human body's functional states using a deep physiological model, combined with an analytical method based on matrix theory for physiological signal analysis, applied to high-risk cardiac patients. It is shown that the evaluation of cardiac signal interactions reveals functional changes peculiar to each individual at the onset of a hemodynamic restoration procedure. We therefore suggest that data on the alterations of the body's functional state after patients overcome surgery can be complemented by the suggested approach of evaluating the interactions of functional variables.
Keywords: cardiac diseases, complex systems theory, ECG analysis, matrix analysis
Procedia PDF Downloads 330
1162 Vocabulary Paradigm in Learning Romanian as a Foreign Language
Authors: Georgiana Ciobotaru
Abstract:
The vocabulary that foreign students assimilate once they start studying the Romanian language must allow them to develop the linguistic competence of oral and written expression, but also the intercultural competence necessary for their integration into the new socio-cultural environment. Therefore, familiarization courses in Romanian as a foreign language aim at fundamental language acquisitions in order to reach the expected level of Romanian. Students also relate differently to the new culture and the new language they come into contact with, having a distinct way of expressing themselves. Foreign students want to continue their university and postgraduate studies at specialized faculties in the country; therefore, they need both a general language for their integration into society and for interaction with others, Romanians or students from countries other than their own, and a specialized language that facilitates didactic communication and professional development. The complexity of the vocabulary must thus cover daily communication needs as well as the subsequent evolution of each student. This paper aims to illustrate the most important semantic fields that students must assimilate in order to crystallize a linguistic identity in the new context of their personal and professional development and to help them cope with culture shock.
Keywords: integration, intercultural, language, linguistic, vocabulary
Procedia PDF Downloads 186
1161 Linear Quadratic Gaussian/Loop Transfer Recovery Flight Control on a Nonlinear Model
Authors: T. Sanches, K. Bousson
Abstract:
As part of the development of a 4D autopilot system for unmanned aerial vehicles (UAVs), i.e., a time-dependent robust trajectory generation and control algorithm, this work addresses the problem of optimal path control based on flight sensor data output that may be unreliable due to noise in data acquisition and/or transmission under certain circumstances. Although several filtering methods, such as the Kalman-Bucy filter or Linear Quadratic Gaussian/Loop Transfer Recovery (LQG/LTR) control, are available, the sheer complexity of the control system, together with the robustness and reliability required of such a system on a UAV for airworthiness-certifiable autonomous flight, required the development of a proper robust filter for a nonlinear system, as a way of further mitigating error propagation to the control system and improving its performance. As such, a nonlinear algorithm based upon LQG/LTR is proposed in this paper and validated through computational simulation testing.
Keywords: autonomous flight, LQG/LTR, nonlinear state estimator, robust flight control
Procedia PDF Downloads 127
1160 Freight Time and Cost Optimization in Complex Logistics Networks, Using a Dimensional Reduction Method and K-Means Algorithm
Authors: Egemen Sert, Leila Hedayatifar, Rachel A. Rigg, Amir Akhavan, Olha Buchel, Dominic Elias Saadi, Aabir Abubaker Kar, Alfredo J. Morales, Yaneer Bar-Yam
Abstract:
The complexity of providing timely and cost-effective distribution of finished goods from industrial facilities to customers makes effective operational coordination difficult, yet effectiveness is crucial for maintaining customer service levels and sustaining a business. Logistics planning becomes increasingly complex with growing numbers of customers, varied geographical locations, the uncertainty of future orders, and sometimes extreme competitive pressure to reduce inventory costs. Linear optimization methods become cumbersome or intractable due to the large number of variables and nonlinear dependencies involved. Here we develop a complex systems approach to optimizing logistics networks based upon dimensional reduction methods and apply our approach to a case study of a manufacturing company. In order to characterize the complexity of customer behavior, we define a "customer space" in which individual customer behavior is described by only the two most relevant dimensions: the distance to production facilities over current transportation routes and the customer's demand frequency. These dimensions provide essential insight into the domain of effective strategies for customers: direct and indirect. In the direct strategy, goods are sent to the customer directly from a production facility using box or bulk trucks. In the indirect strategy, in advance of an order by the customer, goods are shipped to an external warehouse near the customer using trains and then "last-mile" shipped by trucks when orders are placed. Each strategy applies to an area of the customer space, with an indeterminate boundary between them whose location is generally determined by specific company policies. We then identify the optimal delivery strategy for each customer by constructing a detailed model of the costs of transportation and temporary storage in a set of specified external warehouses.
Customer spaces help give an aggregate view of customer behaviors and characteristics. They allow policymakers to compare customers and develop strategies based on the aggregate behavior of the system as a whole. In addition to optimization over existing facilities, we propose additional warehouse locations using customer logistics and the k-means algorithm. We apply these methods to a medium-sized American manufacturing company with a particular logistics network consisting of multiple production facilities, external warehouses, and customers, along with three types of shipment methods (box truck, bulk truck, and train). For the case study, our method forecasts 10.5% savings on yearly transportation costs and an additional 4.6% savings with three new warehouses.
Keywords: logistics network optimization, direct and indirect strategies, K-means algorithm, dimensional reduction
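Clustering customers in the two-dimensional customer space (route distance vs. demand frequency) and siting a candidate warehouse near each cluster centroid can be sketched with plain Lloyd's k-means. This is a generic illustration on assumed toy data, not the company's actual network or the authors' full pipeline:

```python
import numpy as np

def kmeans_customers(X, k, iters=50, seed=0):
    """Lloyd's k-means on a (distance, demand-frequency) customer space.

    Features are standardized so kilometres and order counts weigh equally;
    centroids are mapped back to original units before being returned."""
    rng = np.random.default_rng(seed)
    mu, sd = X.mean(0), X.std(0)
    Z = (X - mu) / sd
    # farthest-point seeding: spread the initial centers across the data
    C = [Z[rng.integers(len(Z))]]
    for _ in range(1, k):
        d2 = np.min([np.square(Z - c).sum(1) for c in C], axis=0)
        C.append(Z[int(np.argmax(d2))])
    C = np.array(C)
    for _ in range(iters):
        labels = np.argmin(np.square(Z[:, None] - C[None]).sum(-1), axis=1)
        C = np.array([Z[labels == c].mean(0) if (labels == c).any() else C[c]
                      for c in range(k)])
    return labels, C * sd + mu
```

Each returned centroid is a point in the original customer space; a new warehouse placed near it minimizes the within-cluster spread for that group of customers.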
Procedia PDF Downloads 128
1159 Land Use Change Modeling Using Cellular Automata, Case Study: Karawang City, West Java Province, Indonesia
Authors: Bagus Indrawan Hardi
Abstract:
Cellular Automata (CA) are widely used in land use modeling and have proven powerful for simulating small-scale land use change in many large cities around the world. In this paper, we implement CA for land use modeling in Karawang, a unique city in Indonesia. Unlike complex numerical implementations, CA are simple and accurate, and they are highly dependent on the rules (rule-based). The most important task in CA is forming and calculating the neighborhood effect, which represents the environment and the relationship between the occupied cell and the others. We adopted a circular neighborhood of 196 cells with a radius of 8 cells. As for the results, CA works well in this study: we exhibit several analyzed and processed zoomed-in parts of the Karawang region. The rule set can handle the complexity of land use modeling. However, the result cannot be accepted uncritically; many non-technical parameters, such as politics and natural disaster activity, may change the outcomes dramatically.
Keywords: cellular automata (CA), land use change, spatial dynamics, urban sprawl
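The neighborhood effect emphasized above can be made concrete with a small sketch. The following is a generic illustration, not the paper's calibrated rules: it computes, for every cell, the fraction of cells within a circular neighborhood that hold a given land-use state. Note that a radius-8 circular mask on a square grid contains exactly 196 cells (excluding the center), matching the figure quoted in the abstract.

```python
import numpy as np

def circular_kernel(radius):
    """Boolean mask of grid cells within `radius` of the centre (centre excluded)."""
    y, x = np.ogrid[-radius:radius + 1, -radius:radius + 1]
    mask = x ** 2 + y ** 2 <= radius ** 2
    mask[radius, radius] = False
    return mask

def neighborhood_effect(grid, land_use, radius=8):
    """Fraction of neighbours in state `land_use` around every cell.

    Cells beyond the grid edge are treated as not holding the state."""
    mask = circular_kernel(radius)
    n = mask.sum()
    H, W = grid.shape
    padded = np.pad((grid == land_use).astype(float), radius)
    out = np.zeros(grid.shape, dtype=float)
    for dy, dx in np.argwhere(mask) - radius:      # shift-and-add over offsets
        out += padded[radius + dy:radius + dy + H, radius + dx:radius + dx + W]
    return out / n
```

A transition rule would then, for example, convert a cell to urban use whenever its urban neighborhood effect exceeds a calibrated threshold.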
Procedia PDF Downloads 231
1158 Multilayer Neural Network and Fuzzy Logic Based Software Quality Prediction
Authors: Sadaf Sahar, Usman Qamar, Sadaf Ayaz
Abstract:
In the software development lifecycle, quality prediction techniques hold prime importance in order to minimize future design errors and expensive maintenance. Many techniques have been proposed by various researchers, but with the increasing complexity of the software lifecycle model, it is crucial to develop a flexible system which can cater for the factors that ultimately impact the quality of the end product. These factors include properties of the software development process and of the product, along with its operating conditions. In this paper, a neural network (perceptron) based software quality prediction technique is proposed. Using this technique, stakeholders can predict the quality of the resulting software during the early phases of the lifecycle, saving time and resources otherwise spent on future elimination of design errors and costly maintenance. This technique can be brought into practical use through successful training.
Keywords: software quality, fuzzy logic, perceptron, prediction
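As a minimal illustration of the perceptron idea behind the proposed technique, a single perceptron can be trained to flag modules as acceptable or low quality. The feature set here is hypothetical (e.g., normalized complexity and defect-density metrics), not the paper's actual process and product factors, and the multilayer/fuzzy aspects are not reproduced:

```python
import numpy as np

def train_perceptron(X, y, lr=0.1, epochs=100):
    """Classic perceptron rule on binary labels (0 = low quality, 1 = acceptable).

    X rows are hypothetical quality metrics; a bias column is prepended."""
    w = np.zeros(X.shape[1] + 1)
    Xb = np.hstack([np.ones((len(X), 1)), X])
    for _ in range(epochs):
        for xi, yi in zip(Xb, y):
            pred = 1 if xi @ w > 0 else 0
            w += lr * (yi - pred) * xi          # update only on mistakes
    return w

def predict(w, X):
    Xb = np.hstack([np.ones((len(X), 1)), X])
    return (Xb @ w > 0).astype(int)
```

On linearly separable training data, the perceptron convergence theorem guarantees this loop reaches a separating weight vector; the multilayer network in the paper would relax that separability requirement.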
Procedia PDF Downloads 305
1157 The Origins of Representations: Cognitive and Brain Development
Authors: Athanasios Raftopoulos
Abstract:
In this paper, an attempt is made to explain the evolution or development of humans' representational arsenal from its humble beginnings to its modern abstract symbols. Representations are physical entities that represent something else. To represent a thing (in a general sense of "thing") means to use, in the mind or in an external medium, a sign that stands for it. The sign can be used as a proxy for the represented thing when the thing is absent. Representations come in many varieties, from signs that perceptually resemble what they represent to abstract symbols that are related to their representata through conventions. Relying on the distinction among indices, icons, and symbols, it is explained how symbolic representations gradually emerged from indices and icons. To understand the development or evolution of our representational arsenal, the development of the cognitive capacities that enabled the gradual emergence of representations of increasing complexity and expressive capability should be examined. This examination should rely on a careful assessment of the available empirical neuroscientific and paleo-anthropological evidence, synthesized to produce arguments whose conclusions provide clues concerning the developmental process of our representational capabilities. The analysis of the empirical findings in this paper shows that Homo erectus was able to use both icons and symbols: icons as external representations, and symbols in language. The first step in the emergence of representations is that a purely causal sensory-motor schema involved in indices is decoupled from its normal causal sensory-motor functions and serves as a representation of the object that initially called it into play. Sensory-motor schemas are tied to specific contexts of organism-environment interactions and are activated only within these contexts.
For a representation of an object to be possible, this schema must be de-contextualized so that the same object can be represented in different contexts; a decoupled schema loses its direct ties to reality and becomes mental content. The analysis suggests that symbols emerged due to selection pressures of the social environment. The need to establish and maintain social relationships in ever-enlarging groups, to the group's benefit, was a sufficient environmental pressure to lead to the appearance of the symbolic capacity. Symbols could serve this need because they can express abstract relationships, such as marriage or monogamy. Icons, being firmly attached to what can be observed, could not go beyond surface properties to express abstract relations. The cognitive capacities required for having iconic and then symbolic representations were present in Homo erectus, which had a language that started without syntactic rules but was structured so as to mirror the structure of the world. This language became increasingly complex, and grammatical rules started to appear to allow for the construction of the more complex expressions required to keep up with the increasing complexity of social niches. This created evolutionary pressures that eventually led to increasing cranial size and a restructuring of the brain that allowed more complex representational systems to emerge.
Keywords: mental representations, iconic representations, symbols, human evolution
Procedia PDF Downloads 40
1156 A Study on Game Theory Approaches for Wireless Sensor Networks
Authors: M. Shoukath Ali, Rajendra Prasad Singh
Abstract:
Game theory approaches and their application in improving the performance of Wireless Sensor Networks (WSNs) are discussed in this paper. The mathematical modeling and analysis of WSNs may have a low success rate due to the complexity of topology, modeling, link quality, etc. Game theory, however, is a field that can be used efficiently to analyze WSNs. Game theory is related to applied mathematics and describes and analyzes interactive decision situations. It has the ability to model independent, individual decision makers whose actions affect the surrounding decision makers. The outcome of complex interactions among rational entities can be predicted by a set of analytical tools. However, rationality demands stringent observance of a strategy based on measured or perceived results. Researchers are adopting game theory approaches to model and analyze leading wireless communication networking issues, including QoS, power control, and resource sharing.
Keywords: wireless sensor network, game theory, cooperative game theory, non-cooperative game theory
Procedia PDF Downloads 414
1155 Video-Observation: A Phenomenological Research Tool for International Relations?
Authors: Andreas Aagaard Nohr
Abstract:
International Relations is an academic discipline which is rarely in direct contact with its field. However, there has in recent years been a growing interest in the different agents within and beyond the state and their associated practices; yet some of the research tools with which to study them are not widely used. This paper introduces video-observation as a method for the study of IR and argues that it offers a unique way of studying the complexity of actors' everyday contexts. The paper is divided into two main parts. First, the philosophical and methodological underpinnings of the kind of data that video-observation produces are discussed, primarily through a discussion of the phenomenology of Husserl, Heidegger, and Merleau-Ponty. Second, taking a simulation of a WTO negotiation round as an example, the paper discusses how the data created can be analysed: in particular with regard to the structure of events, the temporal and spatial organization of activities, rhythm and periodicity, and the concrete role of artefacts and documents. The paper concludes with a discussion of the ontological, epistemological, and practical challenges and limitations that ought to be considered if video-observation is chosen as a method within the field of IR.
Keywords: video-observation, phenomenology, international relations
Procedia PDF Downloads 432
1154 Critical Pedagogy and Literacy Development
Authors: Rajendra Chetty
Abstract:
This paper analyses the experiences of teachers of literacy in underprivileged schools in the Western Cape, South Africa. The purpose is to give teachers in poorly resourced schools within economically deprived areas an opportunity to voice their experiences of teaching literacy. The paper is based on an empirical study using interviews and classroom observation. A descriptive account of the observation data was followed by an interpretive analysis, and content analysis of the interview data led to the development of themes and patterns for the discussion. The study reveals key factors in literacy underachievement, including the lack of critical and emancipatory pedagogies, resources, parental support, and teacher knowledge; the absence of cognitive activities; and the social complexity of poverty. The paper recommends a new model of literacy, underpinned by critical pedagogy, that challenges inequality and provides strategic and sustained teacher support in disadvantaged schools; such a model is crucial in a society emerging from oppression and racism.
Keywords: critical pedagogy, disadvantaged schools, literacy, poverty
Procedia PDF Downloads 98
1153 Digital Individual Benefit Statement: The Use of a Triangulation Methodology to Design a Digital Platform for Switzerland
Authors: Catherine Equey Balzli
Abstract:
Old-age retirement pensions are an important concern among the Swiss, but estimating one's income after retirement is difficult due to the complexity of the Swiss insurance system. This project's aim is to prepare for developing a digital platform that will allow individuals to plan for retirement in a simplified manner. The main objective of the platform will be to give individuals the tools to check that their savings and retirement benefits will allow them to continue the lifestyle to which they are accustomed once they are retired. The research results, derived from qualitative (focus group) and quantitative (survey) methodologies, recommend the scope and functionalities for the digital platform to be developed. A main outcome is the need to limit the platform's scope to the old-age pension only (excluding survivors' or disability pensions, for instance). A further outcome regarding the functionalities is the proposal of scenarios such as early retirement, changes to income, or modifications to personal status. The development of the digital platform will be a subsequent project.
Keywords: benefit statement, digital platform, retirement financial planning, social insurance
Procedia PDF Downloads 98
1152 The Capability of Organizational Leadership: Development of Conceptual Framework
Authors: Kurmet Kivipõld, Maaja Vadi
Abstract:
This paper develops the conceptual framework for organizational leadership capability. Organizational leadership is understood here as a collective multi-level phenomenon which has been embedded into organizational processes as a capability at the level of the entire organization. The paper analyses and systematises the theoretical approaches to multi-level leadership in the existing literature. This analysis marks the foundation of collective leadership at the organizational level, which forms the basis for the development of the conceptual framework of organizational leadership capability. The developed conceptual framework is formed from the synthesis of three groups of base theories: traditional leadership theories, the resource-based view from strategic management, and complexity theory from system theories. These conceptual sources present the main characteristics that determine the nature of organizational leadership capability and are the basis for its measurement.
Keywords: leadership, organizational capability, organizational leadership, resource-based view, system theory
Procedia PDF Downloads 337
1151 Software Component Identification from Its Object-Oriented Code: Graph Metrics Based Approach
Authors: Manel Brichni, Abdelhak-Djamel Seriai
Abstract:
Systems are increasingly complex. To reduce their complexity, an abstract view of the system can simplify its development. To this end, we propose a method to decompose systems into subsystems while reducing their coupling. These subsystems represent components. Starting from an existing object-oriented system, the main idea of our approach is to model all entities of the object-oriented source code as graphs. Such a model is easy to handle, so we can apply restructuring algorithms based on graph metrics. The particularity of our approach consists in integrating, in addition to standard metrics such as coupling and cohesion, some graph metrics that give more precision during component identification. To treat this problem, we relied on the ROMANTIC approach, which proposes component-based software architecture recovery from an object-oriented system.
Keywords: software reengineering, software component and interfaces, metrics, graphs
Procedia PDF Downloads 488
1150 Effects of Epinephrine on Gene Expressions during the Metamorphosis of Pacific Oyster Crassostrea gigas
Authors: Fei Xu, Guofan Zhang, Xiao Liu
Abstract:
Many major marine invertebrate phyla are characterized by indirect development. These animals transit from planktonic larvae to benthic adults via settlement and metamorphosis, which has many advantages for organisms adapting to the marine environment. Studying the biological process of metamorphosis is thus key to understanding the origin and evolution of indirect development. Although the mechanism of metamorphosis has been studied largely in relation to the marine environment, microorganisms, and neurohormones, little is known about the gene regulation network (GRN) during metamorphosis. We treated competent oyster pediveligers with epinephrine, which is known to effectively induce oyster metamorphosis, and analyzed gene and protein dynamics with transcriptomics and proteomics methods. The results indicated significant upregulation of the protein synthesis system, as well as of some transcription factors, including homeobox, basic helix-loop-helix, and nuclear receptors. The results suggest the complexity of the GRN during the transition stage of oyster metamorphosis.
Keywords: indirect development, gene regulation network, protein synthesis, transcription factors
Procedia PDF Downloads 125
1149 Merging Sequence Diagrams Based Slicing
Authors: Bouras Zine Eddine, Talai Abdelouaheb
Abstract:
The need to merge software artifacts seems inherent to modern software development. Distributing development over several teams and breaking tasks into smaller, more manageable pieces are effective means of dealing with this kind of complexity. In each case, the separately developed artifacts need to be assembled as efficiently as possible into a consistent whole in which the parts still function as described. Moreover, the earlier changes are introduced into the life cycle, the easier they are for designers to manage. Interaction-based specifications such as UML sequence diagrams have been found effective in this regard. As a result, sequence diagrams can be used not only for capturing system behaviors but also for merging changes in order to create a new version. The objective of this paper is to suggest a new approach to the problem of software merging at the level of sequence diagrams, using the concept of dependence analysis. Dependence analysis formally captures all mappings and differences between elements of sequence diagrams and serves as a key concept in creating a new version of a sequence diagram.
Keywords: system behaviors, sequence diagram merging, dependence analysis, sequence diagram slicing
Procedia PDF Downloads 332
1148 On Block Vandermonde Matrix Constructed from Matrix Polynomial Solvents
Authors: Malika Yaici, Kamel Hariche
Abstract:
In control engineering, systems described by matrix fractions are studied through properties of block roots, also called solvents. These solvents are usually dealt with in block Vandermonde matrix form. Inverses and determinants of Vandermonde and block Vandermonde matrices are used in solving problems of numerical analysis in many domains but require costly computations. Even though Vandermonde matrices are well known and methods to compute their inverses and determinants are many and generally based on interpolation techniques, methods to compute the inverse and determinant of a block Vandermonde matrix have not been well studied. In this paper, some properties of these matrices and iterative algorithms to compute the determinant and the inverse of a block Vandermonde matrix are given. These methods are deduced from the partitioned matrix inversion and determinant computing methods. Due to the great size of these matrices, parallelization may be a solution to reduce the computational cost, so a parallelization of these algorithms is proposed and validated by a comparison using algorithmic complexity.
Keywords: block Vandermonde matrix, solvents, matrix polynomial, matrix inverse, matrix determinant, parallelization
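For concreteness, a block Vandermonde matrix can be assembled from a list of solvents as follows. This is a straightforward construction sketch under the usual convention (row block p, column block i holds the p-th power of solvent R_i), not the paper's partitioned inversion algorithm:

```python
import numpy as np

def block_vandermonde(solvents):
    """Stack powers of each m-by-m solvent R_i into a block Vandermonde matrix.

    For l solvents the result is an (l*m) x (l*m) matrix whose block (p, i)
    equals R_i raised to the power p, with p = 0 .. l-1."""
    l = len(solvents)
    m = solvents[0].shape[0]
    V = np.empty((l * m, l * m))
    for i, R in enumerate(solvents):
        P = np.eye(m)                      # R_i ** 0
        for p in range(l):
            V[p * m:(p + 1) * m, i * m:(i + 1) * m] = P
            P = P @ R                      # next power of the solvent
    return V
```

In the scalar case (m = 1) this reduces to the ordinary Vandermonde matrix, and `np.linalg.det` / `np.linalg.inv` on the assembled matrix give the dense baseline against which iterative or partitioned methods can be compared.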
Procedia PDF Downloads 224
1147 Government Intervention Strategies in Providing Water to Rural Communities in the O R Tambo District Municipality, South Africa
Authors: Cecilia Kunseh Betek
Abstract:
Managing rural water supply systems effectively and efficiently is a challenge in the O R Tambo District Municipality because of the long distances between consumers and municipal centres. This is coupled with the low income of most residents and the government's policy of free basic water, which together make rural water provision very difficult. With regard to cartage, the results reveal that the majority (84.4%) of the population cover distances of about 1 kilometre to fetch water, and 15.6% travel upwards of a kilometre to access water facilities. This means that the water sources are located very far from households, beyond the officially legislated range of 200 metres. There are many reasons for this situation. Firstly, it implies that there are too few standpipes to cater for all the homesteads scattered across the rugged terrain of the O R Tambo District Municipality. Secondly, and following from the first explanation, the funding that is made available is either inadequate or not efficiently spent on the targeted projects. The situation in the rural areas of South Africa is fraught with cumbersome complexity when it comes to service delivery.
Keywords: water, management, government, rural
Procedia PDF Downloads 273
1146 New Method to Increase Contrast of Electromicrograph of Rat Tissues Sections
Authors: Lise Paule Labéjof, Raíza Sales Pereira Bizerra, Galileu Barbosa Costa, Thaísa Barros dos Santos
Abstract:
Since the beginning of microscopy, improving image quality has always been a concern of its users. For transmission electron microscopy (TEM) in particular, the problem is even more important because of the complexity of the sample preparation technique and the many variables that can affect the preservation of structures, the proper operation of the equipment used and, consequently, the quality of the images obtained. Animal tissues being transparent, it is necessary to apply a contrast agent in order to identify the elements of their ultrastructural morphology. Several methods of staining tissues for TEM imaging have already been developed; the most used are "en bloc" staining and "in situ" staining. This report presents an alternative technique in which the contrast agent is applied in vivo, i.e. before sampling. With this new method, the electron micrographs of tissue sections show better contrast than those stained in situ and present no precipitation artefacts of the contrast agent. Another advantage is that only a small amount of contrast agent is needed to obtain a good result, given that most contrast agents are expensive and extremely toxic.
Keywords: image quality, microscopy research, staining technique, ultra thin section
Procedia PDF Downloads 419
1145 Liquidity and Cash Management in Business-A Key to Business Survival and Growth: The Nigerian Case
Authors: Ugbor Raphael Oluchukwu
Abstract:
Focusing on liquidity comes more naturally to a chief executive officer than to an accountant, who is trained to practice accrual accounting. When a business is just commencing, it is essentially run on a cheque book (cash accounting), and for as long as there is cash in the accounts, the business is solvent. When complexity sets in and the business adopts financial accounting, the effect of liquidity and cash management becomes more pronounced. The management of cash no doubt impacts positively on the survival and growth of firms. What is in doubt is how much cash a firm should hold to stay "afloat". The focus of this paper is liquidity and cash management in business in the Nigerian case. The specific objectives of the study are to review, theoretically, the amount of cash a firm should hold to stay afloat and to analyse, theoretically, the effect of cash flow on the survival and growth of firms in Nigeria.
Keywords: cash, firm survival, growth, liquidity management
Procedia PDF Downloads 570
1144 Lean Implementation Analysis on the Safety Performance of Construction Projects in the Philippines
Authors: Kim Lindsay F. Restua, Jeehan Kyra A. Rivero, Joneka Myles D. Taguba
Abstract:
Lean construction is defined as an approach to construction whose purpose is to reduce waste in the process without compromising the value of the project. Numerous lean construction tools are applied in the construction process to maximize the efficiency of work and the satisfaction of customers while minimizing waste. However, the complexity of construction projects and the differences among them raise challenges to achieving the benefits lean construction can deliver, such as improved safety performance. The objective of this study is to determine the relationship between lean construction tools and their effects on safety performance. The relationship between the tools applied in construction and safety performance was identified through logistic regression analysis, followed by correlation analysis. Based on the findings, it was concluded that almost 60% of the factors listed in the study, which are different tools and effects of lean construction, have a significant relationship with the level of safety in construction projects.
Keywords: correlation analysis, lean construction tools, lean construction, logistic regression analysis, risk management, safety
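The kind of logistic model used for such an analysis can be sketched as follows (entirely synthetic data, not the study's survey; the generating coefficients and tool names are our own). Binary features encode whether a lean tool was used on a project, and the binary outcome encodes acceptable safety performance.

```python
import numpy as np

# Synthetic illustration: two binary "lean tool used" flags and a binary
# safety outcome generated from known coefficients, then recovered by
# fitting a logistic regression with full-batch gradient descent.
rng = np.random.default_rng(1)
X = rng.integers(0, 2, size=(300, 2)).astype(float)   # tool usage flags
true_logits = 1.5 * X[:, 0] + 0.8 * X[:, 1] - 1.0
y = (rng.uniform(size=300) < 1 / (1 + np.exp(-true_logits))).astype(float)

Xb = np.column_stack([X, np.ones(len(y))])            # add intercept column
w = np.zeros(3)
for _ in range(5000):
    p = 1 / (1 + np.exp(-Xb @ w))                     # predicted probabilities
    w -= 0.1 * Xb.T @ (p - y) / len(y)                # gradient of log-loss
print(np.round(w, 2))  # fitted coefficients for tool 1, tool 2, intercept
```

A positive fitted coefficient indicates that using the corresponding tool is associated with better safety outcomes, which is the kind of significance the study tests per factor.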
Procedia PDF Downloads 171
1143 Problem of Services Selection in Ubiquitous Systems
Authors: Malika Yaici, Assia Arab, Betitra Yakouben, Samia Zermani
Abstract:
Ubiquitous computing is nowadays a reality through the networking of a growing number of computing devices. It allows users to be provided with context-aware information and services in a heterogeneous environment, anywhere and anytime. Selecting the best context-aware service among many available services and providers is a tedious problem. In this paper, a service selection method based on the Constraint Satisfaction Problem (CSP) formalism is proposed. The services are modelled as variables and domains, and the user context, preferences, and provider characteristics as constraints. The backtrack algorithm is used to solve the problem and find the service and provider that best match the user requirements. Although this algorithm has exponential complexity, its use guarantees that the service that best matches the user requirements will be found. A comparison of the proposed method with existing solutions concludes the paper.
Keywords: ubiquitous computing, services selection, constraint satisfaction problem, backtrack algorithm
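The CSP formulation can be sketched as follows (a minimal example of our own; the service names, providers, and latency constraint are hypothetical, and the paper's actual constraint model is richer). Each service is a variable, its candidate providers form the domain, and a user requirement acts as the constraint checked during backtracking.

```python
def backtrack(assignment, variables, domains, consistent):
    """Depth-first backtracking: assign one provider per service,
    undoing an assignment as soon as it violates a constraint."""
    if len(assignment) == len(variables):
        return dict(assignment)            # all services assigned
    var = variables[len(assignment)]
    for value in domains[var]:
        assignment[var] = value
        if consistent(assignment):
            result = backtrack(assignment, variables, domains, consistent)
            if result is not None:
                return result
        del assignment[var]                # undo and try the next provider
    return None

# Hypothetical scenario: pick a provider (name, declared latency in ms)
# for each of two services, subject to a user constraint on total latency.
domains = {
    "weather": [("p1", 30), ("p2", 10)],
    "maps":    [("p3", 45), ("p4", 20)],
}
def consistent(a):
    return sum(latency for _, latency in a.values()) < 50

solution = backtrack({}, list(domains), domains, consistent)
print(solution)
```

The search abandons provider combinations whose total latency reaches 50 ms and backtracks to try alternatives, guaranteeing a satisfying assignment is found if one exists, which mirrors the exponential-but-complete behaviour noted in the abstract.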
Procedia PDF Downloads 228
1142 A Model for Teaching Arabic Grammar in Light of the Common European Framework of Reference for Languages
Authors: Erfan Abdeldaim Mohamed Ahmed Abdalla
Abstract:
The complexity of Arabic grammar poses challenges for learners, particularly its arrangement, classification, abundance, and bifurcation. This challenge is a result of the contextual factors that gave rise to the grammatical rules in question, as well as the pedagogical approach employed at the time, which was tailored to the needs of learners during that particular historical period. Consequently, modern-day students encounter the same obstacle. This requires a thorough examination of the arrangement and categorization of Arabic grammatical rules based on particular criteria, as well as an assessment of their objectives. Additionally, it is necessary to identify the prevalent and renowned grammatical rules, as well as those that are infrequently encountered, obscure, and disregarded. This paper presents a compilation of grammatical rules that require arrangement and categorization in accordance with the standards outlined in the Common European Framework of Reference for Languages (CEFR). In addition to facilitating comprehension of the curriculum, accommodating learners' requirements, and establishing the fundamental competencies for achieving proficiency in Arabic, it is imperative to ascertain the rules that language learners need, in alignment with explicitly delineated benchmarks such as the CEFR criteria. The aim of this study is to reduce the quantity of grammatical rules typically presented to non-native Arabic speakers in Arabic textbooks. This reduction is expected to enhance learners' motivation to continue their Arabic language acquisition and to approach the proficiency of native speakers. The primary obstacle faced by learners is the intricate nature of Arabic grammar, which poses a significant challenge to study. The proliferation and complexity of the rules evident in Arabic language textbooks designed for non-native speakers are noteworthy.
The inadequate organisation and delivery of the material create the impression that grammar is being imparted to a student with the intention of memorising "Alfiyyat-Ibn-Malik". Consequently, the sequence of instruction in grammatical rules has been altered: rules originally intended for later instruction are presented first, and those intended for earlier instruction are presented subsequently. Students often focus on learning grammatical rules that are not necessarily required while neglecting the rules that are commonly used in everyday speech and writing. Non-Arab students are taught chapters of Arabic grammar that are infrequently encountered in Arabic literature and may be a topic of debate among grammarians. These findings are derived from the statistical analyses and investigations conducted by the researcher, which will be disclosed in the course of the research. To teach grammatical rules to non-Arabic speakers, it is imperative to discern the most prevalent grammatical structures in grammar manuals and linguistic literature (the study sample). The present proposal suggests allocating grammatical structures across linguistic levels, taking into account the guidelines of the CEFR, as well as the grammatical structures that non-Arabic-speaking learners need in order to produce modern, cohesive, and comprehensible language.
Keywords: grammar, Arabic, functional, framework, problems, standards, statistical, popularity, analysis
Procedia PDF Downloads 74
1141 Comparison of Seismic Retrofitting Methods for Existing Foundations in Seismological Active Regions
Authors: Peyman Amini Motlagh, Ali Pak
Abstract:
Seismic retrofitting of important structures is essential in seismologically active zones. The importance is doubled for buildings such as schools, hospitals, and bridges, because they are required to remain serviceable even after a major earthquake. Generally, seismic retrofitting codes have paid little attention to the retrofitting of foundations because of its construction complexity. In this paper, different methods for the seismic retrofitting of tall buildings' foundations are discussed and evaluated. Foundations are considered in three categories: first, foundations in danger of liquefaction of their underlying soil; second, foundations located on slopes in seismologically active regions; third, foundations designed according to former design codes that may show structural defects under earthquake loads. After describing the methods used in different countries for retrofitting existing foundations in seismologically active regions, a comprehensive comparison of these methods with regard to the above-mentioned categories is carried out. This paper gives some guidelines for choosing the best method for the seismic retrofitting of tall buildings' foundations in retrofitting projects.
Keywords: existing foundation, landslide, liquefaction, seismic retrofitting
Procedia PDF Downloads 381
1140 Development of a Decision-Making Method by Using Machine Learning Algorithms in the Early Stage of School Building Design
Authors: Rajaian Hoonejani Mohammad, Eshraghi Pegah, Zomorodian Zahra Sadat, Tahsildoost Mohammad
Abstract:
Over the past decade, energy consumption in educational buildings has steadily increased. The purpose of this research is to provide a method that quickly predicts the energy consumption of buildings by evaluating zones separately and decomposing the building, eliminating the complexity of geometry at the early design stage. To build this framework, machine learning algorithms, namely support vector regression (SVR) and an artificial neural network (ANN), are used to predict energy consumption and thermal comfort metrics in a school as a case study. The database consists of more than 55,000 samples from three climates of Iran. Cross-validation and unseen data were used for validation. For one specific target, cooling energy, the prediction accuracy is at least 84% for SVR and 89% for ANN. Overall, the results show that the SVR performed much better than the ANN.
Keywords: early stage of design, energy, thermal comfort, validation, machine learning
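The cross-validation scheme mentioned above can be sketched in a few lines (synthetic data and a simple least-squares model of our own as a stand-in; the paper itself uses SVR and ANN on its 55,000-sample database). Each fold is held out in turn and scored with R² on data the model never saw during fitting.

```python
import numpy as np

def kfold_r2(X, y, k=5):
    """k-fold cross-validation: fit on k-1 folds, score R^2 on the
    held-out fold. A least-squares linear model stands in for SVR/ANN."""
    n = len(y)
    idx = np.arange(n)
    Xb = np.column_stack([X, np.ones(n)])      # add intercept column
    scores = []
    for fold in range(k):
        test = idx[fold::k]                    # every k-th sample held out
        train = np.setdiff1d(idx, test)
        w, *_ = np.linalg.lstsq(Xb[train], y[train], rcond=None)
        pred = Xb[test] @ w
        ss_res = np.sum((y[test] - pred) ** 2)
        ss_tot = np.sum((y[test] - y[test].mean()) ** 2)
        scores.append(1.0 - ss_res / ss_tot)   # R^2 on unseen data
    return np.array(scores)

# Hypothetical zone features (e.g. area, window ratio, occupancy) and a
# synthetic cooling-energy target with a small noise term.
rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 3))
y = 5 * X[:, 0] + 2 * X[:, 1] - 3 * X[:, 2] + rng.normal(0, 0.1, 200)
scores = kfold_r2(X, y)
print(f"mean R2 over 5 folds: {scores.mean():.2f}")
```

Reporting accuracy only on held-out folds, as the study does, guards against the optimistic bias of scoring a model on the data it was trained on.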
Procedia PDF Downloads 52
1139 Process Driven Architecture For The ‘Lessons Learnt’ Knowledge Sharing Framework: The Case Of A ‘Lessons Learnt’ Framework For KOC
Authors: Rima Al-Awadhi, Abdul Jaleel Tharayil
Abstract:
KOC regularly engages in various types of projects. Because of the very nature and complexity involved, each project generates many ‘learnings’ that need to be factored in when drafting a new contract, so that the same mistakes are not repeated. But many a time these learnings are localized and remain tacit, leading to scope rework, longer cycle times, schedule overruns, adjustment orders, and claims. These experiences are also not readily available to new employees, leading to a steep learning curve and a longer time to competency. This paper shares our experience in designing and implementing a process-driven architecture for the ‘lessons learnt’ knowledge-sharing framework in KOC. It highlights the ‘lessons learnt’ sharing process adopted, its integration with the organizational processes, the governance framework, the challenges faced, and the learning from our experience in implementing a ‘lessons learnt’ framework.
Keywords: lessons learnt, knowledge transfer, knowledge sharing, successful practices, Lessons Learnt Workshop, governance framework
Procedia PDF Downloads 567
1138 A Network-Theorical Perspective on Music Analysis
Authors: Alberto Alcalá-Alvarez, Pablo Padilla-Longoria
Abstract:
The present paper describes a framework for constructing mathematical networks that encode relevant musical information from a music score for structural analysis. These graphs capture statistical information about musical elements such as notes, chords, rhythms, and intervals, and the relations among them, and so become helpful for visualizing and understanding important stylistic features of a music fragment. To build such networks, musical data is parsed out of a digital symbolic music file. This data undergoes different analytical procedures from graph theory, such as measuring the centrality of nodes, community detection, and entropy calculation. The resulting networks reflect important structural characteristics of the fragment in question: predominant elements, the connectivity between them, and the complexity of the information they contain. Music pieces in different styles are analyzed, and the results are contrasted with traditional analysis outcomes in order to show the consistency and potential utility of this method for music analysis.
Keywords: computational musicology, mathematical music modelling, music analysis, style classification
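Two of the measures mentioned above, node centrality and entropy, can be sketched on a toy note-transition network (the melody below is a made-up example, and a weighted degree count stands in for the paper's centrality measures):

```python
from collections import Counter
import math

# Hypothetical melody as a note sequence; each pair of consecutive notes
# defines a weighted edge in the transition network.
melody = ["C", "E", "G", "E", "C", "E", "G", "C", "D", "E", "C"]
edges = Counter(zip(melody, melody[1:]))

# Weighted degree: how strongly each note is connected in the network,
# a simple stand-in for node centrality.
degree = Counter()
for (a, b), w in edges.items():
    degree[a] += w
    degree[b] += w

# Shannon entropy of the transition distribution: a measure of the
# complexity of the information contained in the fragment.
total = sum(edges.values())
entropy = -sum((w / total) * math.log2(w / total) for w in edges.values())
print(degree.most_common(1)[0][0])   # predominant element of the fragment
print(round(entropy, 3))             # complexity of the transition network
```

The most central note identifies the predominant element of the fragment, and a higher entropy indicates a more varied, less predictable set of transitions, the sort of stylistic signal the paper contrasts across pieces.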
Procedia PDF Downloads 88