Search results for: online flood prediction system
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 21421

17491 Numerical Prediction of Wall Eroded Area by Cavitation

Authors: Ridha Zgolli, Ahmed Belhaj, Maroua Ennouri

Abstract:

This study presents a new method to predict the cavitation areas that may be eroded. It is based on the post-treatment of URANS simulations of cavitating flows. Most RANS calculations under the incompressible assumption rely on a cavitation model that uses a mixture fluid whose density (ρm) is calculated as a function of the liquid density (ρliq), the vapour or gas density (ρvap) and the vapour or gas volume fraction α (ρm = αρvap + (1-α)ρliq). The calculations are performed on hydrofoil geometries and compared with experimental work on the flow characteristics (pocket size, pressure, velocity). We present here the cavitation model used and the approach followed to evaluate the value of α that fixes the shape of the pocket near the wall before collapse.
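
As an illustration of the mixture-density relation quoted above, the short Python sketch below evaluates ρm = αρvap + (1-α)ρliq over a range of vapour volume fractions; the water and vapour densities and the α threshold used to mark the pocket boundary are assumptions for demonstration, not values from the study.

```python
# Illustrative sketch of the mixture-density relation used in homogeneous
# cavitation models: rho_m = alpha * rho_vap + (1 - alpha) * rho_liq.
# The property values and the alpha threshold are assumed for demonstration.
import numpy as np

RHO_LIQ = 998.0   # liquid (water) density, kg/m^3 (assumed)
RHO_VAP = 0.02    # vapour density, kg/m^3 (assumed)

def mixture_density(alpha):
    """Mixture density for a given vapour volume fraction alpha in [0, 1]."""
    alpha = np.clip(alpha, 0.0, 1.0)
    return alpha * RHO_VAP + (1.0 - alpha) * RHO_LIQ

if __name__ == "__main__":
    for a in np.linspace(0.0, 1.0, 11):
        marker = "pocket" if a >= 0.5 else "liquid"   # assumed threshold
        print(f"alpha = {a:4.2f}  rho_m = {mixture_density(a):8.2f} kg/m^3  ({marker})")
```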

Keywords: flows, CFD, cavitation, erosion

Procedia PDF Downloads 334
17490 Seismic Assessment of Flat Slab and Conventional Slab System for Irregular Building Equipped with Shear Wall

Authors: Muhammad Aji Fajari, Ririt Aprilin Sumarsono

Abstract:

Particular instability of a building structure under lateral load (e.g. earthquake) arises from irregularity in the vertical and horizontal directions, as stated in SNI 03-1726-2012. The conventional slab is considered to contribute little to the stability of the structure, unless a special slab system such as a flat slab is taken into account. In this paper, the flat slab system of Sequis Tower, located in South Jakarta, is assessed for its performance under earthquake loading. The building consists of six basement floors where the flat slab system is applied. The flat slab system is the main focus of this paper, and its performance under earthquake is compared with that of a conventional slab system. Regarding the floor plan of the Sequis Tower basement, the re-entrant corner of this building is 43.21%, which exceeds the 15% allowed in ASCE 7-05. Based on that, horizontal irregularity is a further concern for the analysis, whereas vertical irregularity does not exist for this building. A flat slab system is a system in which the slabs are supported by drop panels with shear heads instead of beams. The major advantages of the flat slab are a reduced structural dead load, the removal of beams so that the clear height can be maximized, and the provision of lateral resistance under lateral load; on the other hand, deflection at the middle strip and punching shear must be considered in detail. Torsion usually appears when a structural member under flexure, such as a beam or column, has an improper dimension ratio; considering a flat slab as an alternative slab system keeps collapse due to torsion down. A common seismic-load-resisting system applied in buildings is the shear wall; installing shear walls makes the structural system stronger and stiffer, which reduces displacement under earthquake. The eccentricity of the shear wall locations in this building resolves the instability due to the horizontal irregularity so that the earthquake load can be absorbed. Linear dynamic analyses, namely response spectrum and time history analyses, are suitable because of the irregularity, so that the performance of the structure can be observed in detail. The response spectrum data for South Jakarta, with a PGA of 0.389 g, is the basis for idealizing the earthquake load in the load combinations stated in SNI 03-1726-2012. The analysis yields basic seismic parameters such as the period, displacement, and base shear of the system; in addition, the internal forces of the critical members are presented. The predicted period of the structure under earthquake load is 0.45 s, but as different slab systems are applied in the analysis the period will take different values. The flat slab system will probably perform better in terms of displacement than the conventional slab system, owing to its higher stiffness contribution to the whole building; in line with the displacement, the slab deflection will be smaller for the flat slab than for the conventional slab. Conversely, the shear wall is expected to be more effective in strengthening the conventional slab system than the flat slab system.
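
A minimal sketch of the re-entrant corner check mentioned above, following the ASCE 7-05 style criterion that a projection beyond a corner larger than 15% of the plan dimension in that direction flags a horizontal irregularity. It is a simplified, single-direction check; the plan dimensions are hypothetical, and only the 43.21% figure comes from the abstract.

```python
# Simplified re-entrant corner irregularity check (ASCE 7-05 style): a plan is
# flagged when a projection exceeds 15% of the plan dimension in that direction.
# Dimensions below are hypothetical; only the ~43% ratio mirrors the abstract.
def reentrant_corner_ratio(projection_m, plan_dimension_m):
    return projection_m / plan_dimension_m

LIMIT = 0.15  # 15% allowable ratio

ratio = reentrant_corner_ratio(projection_m=30.0, plan_dimension_m=69.43)
print(f"re-entrant corner ratio = {ratio:.2%}")          # ~43.21%
print("horizontal irregularity" if ratio > LIMIT else "regular plan")
```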

Keywords: conventional slab, flat slab, horizontal irregularity, response spectrum, shear wall

Procedia PDF Downloads 185
17489 Decision Support System for Solving Multi-Objective Routing Problem

Authors: Ismail El Gayar, Ossama Ismail, Yousri El Gamal

Abstract:

This paper presents a technique to solve one of the transportation problems that we face in real life, the bus scheduling problem. Many countries use buses in schools, companies, and travel offices, for example, to transfer multiple passengers from many places to a specific place and vice versa. This transfer process costs time and money, so we built a decision support system that can solve this problem. In this paper, a genetic algorithm combined with a shortest path technique is used to generate a solution that is competitive with other well-known techniques. We also present a comparison between our solution and other solutions for this problem.
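
A compact sketch of the kind of genetic algorithm described above, here applied to ordering bus stops so that the total travel distance (a stand-in for the shortest-path cost between stops) is minimised. The stop coordinates, population size and operators are assumptions for illustration, not the authors' implementation.

```python
# Minimal genetic-algorithm sketch for ordering bus stops to minimise total
# travel cost. Coordinates, GA parameters and operators are illustrative only.
import math
import random

STOPS = [(0, 0), (2, 6), (5, 1), (7, 7), (3, 3), (8, 2)]  # assumed depot + stops

def route_cost(order):
    """Total Euclidean length of visiting the stops in the given order."""
    return sum(math.dist(STOPS[a], STOPS[b]) for a, b in zip(order, order[1:]))

def crossover(p1, p2):
    """Order crossover: keep a slice of p1, fill the rest in p2's order."""
    i, j = sorted(random.sample(range(len(p1)), 2))
    child = [None] * len(p1)
    child[i:j] = p1[i:j]
    fill = [g for g in p2 if g not in child]
    for k in range(len(child)):
        if child[k] is None:
            child[k] = fill.pop(0)
    return child

def mutate(order, rate=0.2):
    if random.random() < rate:                 # swap two stops
        a, b = random.sample(range(len(order)), 2)
        order[a], order[b] = order[b], order[a]
    return order

def evolve(generations=200, pop_size=30):
    pop = [random.sample(range(len(STOPS)), len(STOPS)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=route_cost)
        elite = pop[: pop_size // 2]           # simple truncation selection
        children = [mutate(crossover(*random.sample(elite, 2))) for _ in elite]
        pop = elite + children
    return min(pop, key=route_cost)

best = evolve()
print("best stop order:", best, "cost:", round(route_cost(best), 2))
```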

Keywords: bus scheduling problem, decision support system, genetic algorithm, shortest path

Procedia PDF Downloads 400
17488 An In-Depth Experimental Study of Wax Deposition in Pipelines

Authors: Arias M. L., D’Adamo J., Novosad M. N., Raffo P. A., Burbridge H. P., Artana G.

Abstract:

Shale oils are highly paraffinic and, consequently, can create wax deposits that foul pipelines during transportation. Several factors must be considered when designing pipelines or treatment programs that prevent wax deposition, including the chemical species in the crude oil, flow rates, pipe diameters and temperature. This paper describes the wax deposition study carried out within the framework of Y-TEC's flow assurance projects, as part of the process to achieve a better understanding of wax deposition issues. Laboratory experiments were performed on a medium-size wax deposition loop, 1 inch in diameter and 15 m long, equipped with a solid detector system, an online microscope to visualize crystals, and temperature and pressure sensors along the loop pipe. A baseline test was performed with diesel with no paraffin or additive content. Tests were undertaken with different temperatures of the circulating and cooling fluids at different flow conditions. Then, a solution formed by adding paraffin to the diesel was considered, and tests varying flow rate and cooling rate were run again. Viscosity, density, WAT (Wax Appearance Temperature) with DSC (Differential Scanning Calorimetry), pour point and cold finger measurements were carried out to determine the physical properties of the working fluids. The results obtained in the loop were analyzed through momentum balance and heat transfer models. To determine possible paraffin deposition scenarios, the temperature and pressure output signals of the loop were studied and compared with static laboratory WAT methods. Finally, we scrutinized the effect of adding a chemical inhibitor to the working fluid on the dynamics of the wax deposition process in the loop.

Keywords: paraffin deposition, flow assurance, chemical inhibitors, flow loop

Procedia PDF Downloads 97
17487 Parkinson’s Disease Hand-Eye Coordination and Dexterity Evaluation System

Authors: Wann-Yun Shieh, Chin-Man Wang, Ya-Cheng Shieh

Abstract:

This study aims to develop an objective scoring system to evaluate hand-eye coordination and hand dexterity in Parkinson’s disease. The system contains three boards, each of which is equipped with sensors to detect a user’s finger operations. The operations include the peg test, the block test, and the blind block test. A user has to use visual, auditory, and tactile abilities to complete these operations, and the board records the results automatically. These results can help physicians evaluate a user’s reaction, coordination, and dexterity functions. The results are collected in a cloud database for further analysis and statistics. A researcher can use this system to obtain systematic, graphic reports for an individual or a group of users. In particular, a deep learning model is developed to learn the features of the data from different users. This model will help physicians assess Parkinson’s disease symptoms with a more intelligent algorithm.

Keywords: deep learning, hand-eye coordination, reaction, hand dexterity

Procedia PDF Downloads 58
17486 Local Binary Patterns-Based Statistical Data Analysis for Accurate Soccer Match Prediction

Authors: Mohammad Ghahramani, Fahimeh Saei Manesh

Abstract:

Winning a soccer game requires thorough and deep analysis of the ongoing match. On the other hand, giant gambling companies are in vital need of such analysis to reduce their losses to their customers. In this research work, we perform deep, real-time analysis of every soccer match around the world; our work is distinguished from others by its focus on particular seasons, teams and partial analytics. Our contributions are presented in the platform called “Analyst Masters.” First, we introduce various sources of information available for soccer analysis for teams around the world that helped us record live statistical data and information from more than 50,000 soccer matches a year. Our second and main contribution is to introduce our proposed in-play performance evaluation. The third contribution is developing new features from stable soccer matches. The statistics of soccer matches and their odds, both pre-match and in-play, are represented in image format versus time, including the halftime. Local Binary Patterns (LBP) are then employed to extract features from the image. Our analyses reveal remarkably interesting features and rules once a soccer match has reached sufficient stability. For example, our “8-minute rule” implies that if 'Team A' scores a goal and can maintain the result for at least 8 minutes, then a stable match would end in their favor. We could also make accurate pre-match predictions of whether fewer or more than 2.5 goals would be scored. We use Gradient Boosting Trees (GBT) to extract highly related features. Once the features are selected from this pool of data, decision trees decide whether the match is stable. A stable match is then passed to a post-processing stage that checks its properties, such as bettors’ and punters’ behavior and its statistical data, before issuing the prediction. The proposed method was trained using 140,000 soccer matches and tested on more than 100,000 samples, achieving 98% accuracy in selecting stable matches. Our database of 240,000 matches shows that one can obtain over 20% betting profit per month using Analyst Masters. Such consistent profit outperforms human experts and shows the inefficiency of the betting market. Top soccer tipsters achieve 50% accuracy and 8% monthly profit on average, and only on regional matches. Both our collected database of more than 240,000 soccer matches from 2012 onward and our algorithm would greatly benefit coaches and punters seeking accurate analysis.
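
A short sketch of the feature-extraction step described above: match statistics rendered as an image are converted into a Local Binary Pattern histogram that can be fed to gradient-boosted trees. The random "match image", the LBP parameters and the classifier settings are placeholders, not the Analyst Masters pipeline.

```python
# Sketch of LBP feature extraction from a time-vs-statistics "match image",
# followed by a gradient-boosting classifier. All data here is synthetic.
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)

def lbp_histogram(image, points=8, radius=1):
    """Uniform LBP histogram used as the feature vector for one match."""
    lbp = local_binary_pattern(image, points, radius, method="uniform")
    bins = points + 2                      # number of uniform patterns
    hist, _ = np.histogram(lbp, bins=bins, range=(0, bins), density=True)
    return hist

# Synthetic stand-in: 200 "match images" (odds/statistics vs. time) and labels
images = rng.random((200, 32, 32))
labels = rng.integers(0, 2, 200)           # 1 = stable match (placeholder)

X = np.array([lbp_histogram(img) for img in images])
clf = GradientBoostingClassifier().fit(X, labels)
print("training accuracy on synthetic data:", clf.score(X, labels))
```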

Keywords: soccer, analytics, machine learning, database

Procedia PDF Downloads 231
17485 Research on Quality Assurance in African Higher Education: A Bibliometric Mapping from 1999 to 2019

Authors: Luís M. João, Patrício Langa

Abstract:

The article reviews the literature on quality assurance (QA) in African higher education studies (HES) through a bibliometric mapping of papers published between 1999 and 2019. Specifically, the article highlights the nuances of knowledge production in four scientific databases: Scopus, Web of Science (WoS), African Journal Online (AJOL), and Google Scholar. The analysis included 531 papers, of which 127 are from Scopus, 30 from Web of Science, 85 from African Journal Online, and 259 from Google Scholar. In essence, these papers were written by 284 authors from 231 institutions and 69 different countries (i.e., 54 in Africa and 15 outside Africa). The results map the existing body of knowledge: the analysis allows readers to understand the growth and development of the field during the two-decade period, identify key contributors, and observe potential trends or gaps in the research. The paper employs bibliometric mapping as its primary analytical lens. By utilizing this method, the study quantitatively assesses the publications related to QA in African HES, helping to identify patterns, collaboration networks, and disparities in research output. The bibliometric approach allows for a systematic and objective analysis of large datasets, offering a comprehensive view of knowledge production in the field. Furthermore, the study highlights the lack of shared resources available to enhance quality in higher education institutions (HEIs) in Africa. This finding underscores the importance of promoting collaborative research efforts, knowledge exchange, and capacity building within the region to improve the overall quality of higher education. The paper argues that despite the growing quantity of QA research in African higher education, there are challenges related to citation impact and access to high-impact publication avenues for African researchers. It emphasises the need to promote collaborative research and resource-sharing to enhance the quality of HEIs in Africa. The analytical lenses of bibliometric mapping and the examination of the publication landscape contribute to a comprehensive understanding of the field and its implications for African higher education.

Keywords: Africa, bibliometric research, higher education studies, quality assurance, scientific database, systematic review

Procedia PDF Downloads 37
17484 Design of Speed Bump Recognition System Integrated with Adjustable Shock Absorber Control

Authors: Ming-Yen Chang, Sheng-Hung Ke

Abstract:

This research focuses on the development of a speed bump identification system for real-time control of adjustable shock absorbers in vehicle suspension systems. The study initially involved the collection of images of various speed bumps and rubber speed bump profiles found on roadways. These images were utilized for training and recognition purposes with the deep learning object detection algorithm YOLOv5. Subsequently, the trained speed bump identification program was integrated with an in-vehicle camera system for live image capture during driving. These images were instantly transmitted to a computer for processing. Using the principles of monocular vision ranging, the distance between the vehicle and an approaching speed bump was determined. The appropriate control distance was established through both practical vehicle measurements and theoretical calculations. In combination with the electronically adjustable shock absorbers fitted to the vehicle, a shock absorber control system was devised to dynamically adapt the damping force just prior to encountering a speed bump. This system effectively mitigates passenger discomfort and enhances ride quality.
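
A minimal sketch of the monocular-vision ranging principle used to trigger the damper adjustment: with a pinhole-camera model, distance is roughly focal length times the real object height divided by the height in pixels. The focal length, bump height and trigger distance below are assumed values, and the YOLOv5 detection step is represented only by a bounding-box height.

```python
# Pinhole-camera monocular ranging sketch: estimate distance to a detected
# speed bump from its bounding-box height, then decide whether to adapt the
# adjustable dampers. All numeric values are assumptions for illustration.
FOCAL_LENGTH_PX = 1400.0     # camera focal length in pixels (assumed)
BUMP_HEIGHT_M = 0.08         # physical height of a rubber speed bump (assumed)
CONTROL_DISTANCE_M = 10.0    # distance at which damping is adapted (assumed)

def estimate_distance(bbox_height_px):
    """Distance to the bump from the detected bounding-box height."""
    return FOCAL_LENGTH_PX * BUMP_HEIGHT_M / bbox_height_px

def damper_command(bbox_height_px):
    d = estimate_distance(bbox_height_px)
    action = "soften dampers" if d <= CONTROL_DISTANCE_M else "keep current damping"
    return action, d

for h in (6, 12, 24):        # bounding-box heights in pixels from the detector
    cmd, d = damper_command(h)
    print(f"bbox height {h:2d} px -> distance {d:5.1f} m -> {cmd}")
```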

Keywords: adjustable shock absorbers, image recognition, monocular vision ranging, ride quality

Procedia PDF Downloads 63
17483 Tracing the History of Indian Legal System Vis-A-Vis the Code of Hammurabi

Authors: Vandana Kumari

Abstract:

One of the most ancient and detailed legal codes, proclaimed by the Babylonian King Hammurabi during his reign over the erstwhile Mesopotamian society, provides a fascinating account of the social and justice system of Babylon. The 282 laws, intricately carved on an eight-foot black stone stela, serve as an important source of contemporary commercial, family and criminal laws. This paper attempts an inquiry into the contemporary relevance of this legal code to our current legal system. An exhaustive study of one of the most ancient legal systems, based on a series of practical experiences rather than founded on mere theoretical ideologies, can be assumed pertinent to the promulgation of practically viable laws in our country. The first chapter of the paper focuses on law seven, which established the rules of commerce and the role of government in overseeing justice and honesty regarding the law of property. The second chapter deals with the laws of family, marriage, divorce and adoption prevailing in the Babylonian era. The third chapter traces the earliest known history of criminal jurisprudence, which enshrined the principle of an eye for an eye. The paper is not merely a theoretical account of the Mesopotamian way of living but a modest attempt to discover the roots of Indian laws in the ruins of the courtrooms of the Hammurabi Empire.

Keywords: Babylonian legal system, contemporary relevance, criminal jurisprudence, Hammurabi Code

Procedia PDF Downloads 302
17482 Features Vector Selection for the Recognition of the Fragmented Handwritten Numeric Chains

Authors: Salim Ouchtati, Aissa Belmeguenai, Mouldi Bedda

Abstract:

In this study, we propose an offline system for the recognition of fragmented handwritten numeric chains. First, we developed a recognition system for isolated handwritten digits; this part of the study is based mainly on the evaluation of the performance of a neural network trained with the gradient backpropagation algorithm. The parameters used to form the input vector of the neural network are extracted from the binary images of the isolated handwritten digits by several methods: the distribution sequence, the application of probes (sondes), the Barr features, and the centered moments of the different projections and profiles. Second, the study is extended to the reading of fragmented handwritten numeric chains composed of a variable number of digits. The vertical projection is used to segment the numeric chain into isolated digits, and every digit (or segment) is presented separately to the input of the system developed in the first part (the recognition system for isolated handwritten digits).
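
A short sketch of the vertical-projection segmentation step described above: column sums of the binary image locate the blank gaps between digits, and each resulting segment can then be passed to the isolated-digit recogniser. The toy image below is synthetic.

```python
# Vertical-projection segmentation sketch: split a binary image of a numeric
# chain into isolated digits by finding empty columns. The image is synthetic.
import numpy as np

def segment_digits(binary_image):
    """Return (start, end) column ranges of connected non-empty column runs."""
    profile = binary_image.sum(axis=0)          # vertical projection
    segments, start = [], None
    for col, value in enumerate(profile):
        if value > 0 and start is None:
            start = col                          # entering a digit
        elif value == 0 and start is not None:
            segments.append((start, col))        # leaving a digit
            start = None
    if start is not None:
        segments.append((start, binary_image.shape[1]))
    return segments

# Toy chain: two 3-column "digits" separated by two blank columns
chain = np.zeros((5, 8), dtype=int)
chain[:, 0:3] = 1
chain[:, 5:8] = 1
print(segment_digits(chain))   # -> [(0, 3), (5, 8)]
```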

Keywords: features extraction, handwritten numeric chains, image processing, neural networks

Procedia PDF Downloads 261
17481 The Contribution of SMES to Improve the Transient Stability of Multimachine Power System

Authors: N. Chérif, T. Allaoui, M. Benasla, H. Chaib

Abstract:

Industrialization and population growth are the prime factors behind the steady increase in electricity consumption. Thus, to maintain a balance between production and consumption, it is at first necessary to increase the number of power plants, lines and transformers, which implies an increase in cost and environmental degradation. As a result, it is now important to operate meshed networks close to their stability limits in order to meet these new requirements. Transient stability studies involve large disturbances such as short circuits or the loss of a transmission facility or generating unit. The consequences of these faults can be very serious and can even lead to the complete collapse of the network. This work focuses on the regulation means that can help networks maintain their stability when subjected to strong disturbances. The superconducting magnetic energy storage (SMES) device comprises a superconducting coil short-circuited on itself. When such a system is connected to a power grid, it is able to inject or absorb active and reactive power. This system can be used to improve the stability of power systems.
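
A brief sketch of the energy relations behind the SMES description above: the coil stores E = ½LI², and ramping the coil current down (or up) releases (or absorbs) that energy, i.e. injects or absorbs active power. The inductance, currents and ramp duration are assumed values for illustration only.

```python
# SMES energy/power sketch: E = 0.5 * L * I^2; the average power exchanged over
# a current ramp is the energy difference divided by its duration. Values assumed.
L_COIL = 2.0              # coil inductance, henry (assumed)
I0, I1 = 1500.0, 1400.0   # coil current before/after a discharge step, A (assumed)
DT = 0.1                  # duration of the current ramp, s (assumed)

energy_before = 0.5 * L_COIL * I0**2
energy_after = 0.5 * L_COIL * I1**2
avg_power = (energy_before - energy_after) / DT   # positive -> power injected

print(f"stored energy: {energy_before/1e6:.2f} MJ -> {energy_after/1e6:.2f} MJ")
print(f"average power exchanged with the grid: {avg_power/1e6:.2f} MW (injected)")
```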

Keywords: short-circuit, power oscillations, multiband PSS, power system, SMES, transient stability

Procedia PDF Downloads 447
17480 An Immune-Inspired Web Defense Architecture

Authors: Islam Khalil, Amr El-Kadi

Abstract:

With the increased use of web technologies, microservices, and Application Programming Interfaces (APIs) for integration between systems, and with the development of containerization of services at the operating-system level as a method of isolating system execution and easing the deployment and scaling of systems, there is a growing need, as well as an opportunity, to provide platforms that improve the security of such services. In our work, we propose an architecture for a containerization platform that utilizes various concepts derived from the human immune system. The goal of the proposed containerization platform is to introduce the concept of slowing down or throttling suspected malicious digital pathogens (intrusions) to reduce their damage footprint, while providing more opportunities for forensic inspection of suspected pathogens in addition to the ability to snapshot, roll back, and recover from possible damage. The proposed platform also leverages existing intrusion detection algorithms by integrating and orchestrating their cooperative operation for more effective intrusion detection. We show how this model reduces the damage footprint of intrusions and gives a greater time window for forensic investigation. Moreover, during our experiments, our proposed platform was able to uncover unintentional system design flaws that resulted in internal DDoS-like attacks by submodules of the system itself rather than external intrusions.

Keywords: containers, human immunity, intrusion detection, security, web services

Procedia PDF Downloads 86
17479 The Extent of Virgin Olive-Oil Prices' Distribution Revealing the Behavior of Market Speculators

Authors: Fathi Abid, Bilel Kaffel

Abstract:

The olive tree, the winter olive harvest and the production of olive oil (better known to professionals as the crushing operation) have long interested institutional traders such as olive-oil offices, private companies in the food industry refining and extracting pomace olive oil, and public and private export-import companies specializing in olive oil. The major problem facing olive oil producers each winter campaign, contrary to what might be expected, is not whether the harvest will be good but whether the sale price will allow them to cover production costs and achieve a reasonable profit margin. These questions are entirely legitimate judging by the importance of the issue and the heavy complexity of the uncertainty and competition, made tougher by high levels of indebtedness and by the experience and expertise of speculators and producers whose objectives are sometimes conflicting. The aim of this paper is to study the formation mechanism of olive oil prices in order to learn about speculators’ behavior and expectations in the market, how they contribute through their industry knowledge and financial alliances, and the size of the financial challenge that may be involved for them in building private information channels globally to take advantage of the market. The methodology used in this paper is based on two stages: in the first stage, we study econometrically the formation mechanisms of the olive oil price in order to understand market participant behavior by implementing ARMA, SARMA and GARCH models and stochastic diffusion processes; the second stage is devoted to prediction, for which we use a combined wavelet-ANN approach. Our main findings indicate that olive oil market participants interact with each other in ways that promote the formation of stylized facts. Unstable participant behavior creates the volatility clustering, non-linear dependence and cyclicity phenomena. By imitating each other in some periods of the campaign, different participants contribute to the fat tails observed in the olive oil price distribution. The best prediction model for the olive oil price is based on a backpropagation artificial neural network with input information based on wavelet decomposition and recent past history.
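
A compact sketch of the combined wavelet-ANN idea described above: a price series is decomposed with a discrete wavelet transform, the coefficients of a sliding window form the input features, and a backpropagation neural network predicts the next value. The synthetic series, the wavelet choice and the network size are assumptions, not the authors' configuration.

```python
# Wavelet + ANN prediction sketch: DWT coefficients of a sliding window feed a
# small MLP that predicts the next price. Data and settings are illustrative.
import numpy as np
import pywt
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
prices = np.cumsum(rng.normal(0, 0.5, 600)) + 300.0     # synthetic price series

WINDOW = 64
def wavelet_features(window):
    coeffs = pywt.wavedec(window, "db4", level=3)       # approximation + details
    return np.concatenate(coeffs)

X = np.array([wavelet_features(prices[i:i + WINDOW])
              for i in range(len(prices) - WINDOW)])
y = prices[WINDOW:]                                     # next-step price

split = int(0.8 * len(X))
model = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
model.fit(X[:split], y[:split])
rmse = np.sqrt(np.mean((model.predict(X[split:]) - y[split:]) ** 2))
print(f"one-step-ahead RMSE on held-out synthetic data: {rmse:.3f}")
```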

Keywords: olive oil price, stylized facts, ARMA model, SARMA model, GARCH model, combined wavelet-artificial neural network, continuous-time stochastic volatility model

Procedia PDF Downloads 332
17478 An In-Situ Integrated Micromachining System for Intricate Micro-Parts Machining

Authors: Shun-Tong Chen, Wei-Ping Huang, Hong-Ye Yang, Ming-Chieh Yeh, Chih-Wei Du

Abstract:

This study presents a novel, versatile, high-precision integrated micromachining system that combines contact and non-contact micromachining techniques to machine intricate micro-parts precisely. Two broad methods of micro-fabrication, (1) volume additive (micro co-deposition) and (2) volume subtractive (nanometric flycutting, ultrafine w-EDM (wire Electrical Discharge Machining), and micro honing), are integrated in the developed micromachining system, and their effectiveness is verified. A multidirectional headstock that supports various machining orientations is designed to evaluate the feasibility of multifunctional micromachining. An exchangeable working tank that allows for various machining mechanisms is also incorporated into the system. Hence, the micro tool and workpiece need not be unloaded or repositioned until all the planned tasks have been completed. By using the designed servo rotary mechanism, a nanometric flycutting approach with a concentric rotary accuracy of 5 nm is constructed and used with the system to machine a diffraction-grating element with a nanometric-scale V-groove array. To improve the wear resistance of the micro tool, the micro co-deposition function is used to provide a micro-abrasive coating by an electrochemical method. The construction of the ultrafine w-EDM facilitates the fabrication of micro slots with a width of less than 20 µm on a hardened tool. The hardened tool can thus be employed as a micro honing tool to hone a micro hole with an internal diameter of 200 µm in SKD-11 mold steel. Experimental results prove that intricate micro-parts can be manufactured in situ with high precision by the developed integrated micromachining system.

Keywords: integrated micromachining system, in-situ micromachining, nanometric flycutting, ultrafine w-EDM, micro honing

Procedia PDF Downloads 404
17477 Investigation of the Effect of Impulse Voltage to Flashover by Using Water Jet

Authors: Harun Gülan, Muhsin Tunay Gencoglu, Mehmet Cebeci

Abstract:

The main function of the insulators used in high voltage (HV) transmission lines is to insulate the energized conductor from the pole and hence from the ground. However, when the insulators fail to perform this insulation function due to various effects, failures occur. The deterioration of the insulation results either from breakdown or from surface flashover. Surface flashover is caused by the layer of pollution that forms a conductive coating on the surface of the insulator, such as salt, carbonaceous compounds, rain, moisture, fog, dew, industrial pollution and desert dust. Surface flashover is the source of the majority of failures and interruptions in HV lines; this threatens the continuity of supply and causes significant economic losses. Pollution flashover in HV insulators is still a serious problem that has not been fully resolved. In this study, a water jet test system was established in order to investigate the behavior of insulators under polluted conditions and to determine their flashover performance. The flashover behavior of the insulators is examined by applying impulse voltages in the test system. This study aims to investigate insulator behavior under high impulse voltages. For this purpose, a water jet test system was installed, and experimental results were obtained on a real system and analyzed. By using the water jet test system instead of an actual insulator, the damage to the insulator that would result from flashover under impulse voltage was prevented. The results of the test system played an important role in determining the insulator behavior and provided predictability.

Keywords: insulator, pollution flashover, high impulse voltage, water jet model

Procedia PDF Downloads 104
17476 Modification of a Human Powered Lawn Mower

Authors: Akinwale S. O., Koya O. A.

Abstract:

The need to provide an ecologically friendly and effective lawn mowing solution is crucial for human well-being. This study involved the modification of a human-powered lawn mower designed to cut tall grasses in residential areas. The study designed and fabricated a reel-type mower blade system and a pedal-powered test rig for the blade system, and evaluated the performance of the machine. The machine was tested on overgrown grass plots at the College of Education Staff School, Ilesa. Parameters such as theoretical field capacity, field efficiency and effective field capacity were determined from the data gathered. The quality of cut achieved by the unit was also documented. Test results showed that the fabricated cutting system produced a theoretical field capacity of 0.11 ha/h and an effective field capacity of 0.08 ha/h. Moreover, the unit’s cutting system showed a substantial improvement over existing reel mower designs in its ability to cut on both the forward and reverse phases of its motion. This study established that the blade system described herein has the capacity to cut tall grasses; hence, this device can eliminate the need for powered mowers entirely on small residential lawns.
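
A tiny sketch of the field-capacity figures reported above: theoretical field capacity is commonly taken as working width times forward speed divided by 10 (giving ha/h when width is in m and speed in km/h), and field efficiency is the ratio of effective to theoretical capacity. The width and speed below are assumed for illustration; only the 0.11 and 0.08 ha/h figures come from the abstract.

```python
# Field-capacity sketch: theoretical capacity = width [m] * speed [km/h] / 10 (ha/h);
# field efficiency = effective / theoretical. Width and speed are assumed values.
def theoretical_field_capacity(width_m, speed_kmh):
    return width_m * speed_kmh / 10.0          # ha/h

def field_efficiency(effective_ha_h, theoretical_ha_h):
    return effective_ha_h / theoretical_ha_h

theoretical = theoretical_field_capacity(width_m=0.4, speed_kmh=2.75)  # ~0.11 ha/h
effective = 0.08                                                       # measured, ha/h
print(f"theoretical capacity: {theoretical:.2f} ha/h")
print(f"field efficiency: {field_efficiency(effective, theoretical):.0%}")
```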

Keywords: effective field capacity, field efficiency, theoretical field capacity, quality of cut

Procedia PDF Downloads 142
17475 Comparison of Extended Kalman Filter and Unscented Kalman Filter for Autonomous Orbit Determination of Lagrangian Navigation Constellation

Authors: Youtao Gao, Bingyu Jin, Tanran Zhao, Bo Xu

Abstract:

The history of satellite navigation can be dated back to the 1960s. From the U.S. Transit system and the Russian Tsikada system to the modern Global Positioning System (GPS) and the Globalnaya Navigatsionnaya Sputnikovaya Sistema (GLONASS), the performance of satellite navigation has been greatly improved. Nowadays, the navigation accuracy and coverage of these existing systems fully fulfil the requirements of near-Earth users, but these systems are still beyond the reach of deep space targets. Due to the renewed interest in space exploration, a novel high-precision satellite navigation system is becoming even more important. The increasing demand for such a deep space navigation system has contributed to the emergence of a variety of new constellation architectures, such as the Lunar Global Positioning System. Apart from a Walker constellation, which is similar to the one adopted by GPS on Earth, a novel constellation architecture consisting of libration point satellites in the Earth-Moon system is also available for constructing the lunar navigation system, which can accordingly be called the libration point satellite navigation system. The concept of using Earth-Moon libration point satellites for lunar navigation was first proposed by Farquhar and then followed by many other researchers. Moreover, due to the special characteristics of libration point orbits, an autonomous orbit determination technique called ‘Liaison navigation’ can be adopted by the libration point satellites. Using only scalar satellite-to-satellite tracking data, the orbits of both the user and the libration point satellites can be determined autonomously. In this way, extensive Earth-based tracking measurements can be eliminated, and an autonomous satellite navigation system can be developed for future space exploration missions. The state estimation method is a non-negligible factor affecting orbit determination accuracy, alongside the orbit type, the initial state accuracy and the measurement accuracy. We apply the extended Kalman filter (EKF) and the unscented Kalman filter (UKF) to determine the orbits of Lagrangian navigation satellites, and the autonomous orbit determination errors are compared. The simulation results illustrate that the UKF can improve the accuracy and z-axis convergence to some extent.
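
A bare-bones sketch of one extended Kalman filter predict/update cycle of the kind compared above, written for a generic discrete-time system; the linearised dynamics and measurement models here are simple placeholders, not the satellite-to-satellite tracking formulation of the paper.

```python
# Generic EKF predict/update sketch with placeholder (linear) models standing in
# for the satellite dynamics and the scalar inter-satellite range measurement.
import numpy as np

def ekf_step(x, P, z, f, F, h, H, Q, R):
    """One EKF cycle: x, P state/covariance; z measurement; f/h models; F/H Jacobians."""
    # Predict
    x_pred = f(x)
    P_pred = F @ P @ F.T + Q
    # Update
    y = z - h(x_pred)                          # innovation
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)        # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Placeholder 1-D constant-velocity state (position, velocity), range-only measurement
dt = 60.0
F = np.array([[1.0, dt], [0.0, 1.0]])
H = np.array([[1.0, 0.0]])
f = lambda x: F @ x
h = lambda x: H @ x
Q = 1e-4 * np.eye(2)
R = np.array([[25.0]])

x, P = np.array([7000.0, 1.0]), np.eye(2)
x, P = ekf_step(x, P, z=np.array([7062.0]), f=f, F=F, h=h, H=H, Q=Q, R=R)
print("updated state estimate:", x)
```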

Keywords: extended Kalman filter, autonomous orbit determination, unscented Kalman filter, navigation constellation

Procedia PDF Downloads 276
17474 A Machine Learning Approach for Efficient Resource Management in Construction Projects

Authors: Soheila Sadeghi

Abstract:

Construction projects are complex and often subject to significant cost overruns due to the multifaceted nature of the activities involved. Accurate cost estimation is crucial for effective budget planning and resource allocation. Traditional methods for predicting overruns often rely on expert judgment or analysis of historical data, which can be time-consuming, subjective, and may fail to consider important factors. However, with the increasing availability of data from construction projects, machine learning techniques can be leveraged to improve the accuracy of overrun predictions. This study applied machine learning algorithms to enhance the prediction of cost overruns in a case study of a construction project. The methodology involved the development and evaluation of two machine learning models: Random Forest and Neural Networks. Random Forest can handle high-dimensional data, capture complex relationships, and provide feature importance estimates. Neural Networks, particularly Deep Neural Networks (DNNs), are capable of automatically learning and modeling complex, non-linear relationships between input features and the target variable. These models can adapt to new data, reduce human bias, and uncover hidden patterns in the dataset. The findings of this study demonstrate that both Random Forest and Neural Networks can significantly improve the accuracy of cost overrun predictions compared to traditional methods. The Random Forest model also identified key cost drivers and risk factors, such as changes in the scope of work and delays in material delivery, which can inform better project risk management. However, the study acknowledges several limitations. First, the findings are based on a single construction project, which may limit the generalizability of the results to other projects or contexts. Second, the dataset, although comprehensive, may not capture all relevant factors influencing cost overruns, such as external economic conditions or political factors. Third, the study focuses primarily on cost overruns, while schedule overruns are not explicitly addressed. Future research should explore the application of machine learning techniques to a broader range of projects, incorporate additional data sources, and investigate the prediction of both cost and schedule overruns simultaneously.
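
A minimal sketch of the modelling step described above: a Random Forest regressor trained on project records to predict cost overrun and to rank feature importance. The feature names and synthetic data are assumptions for illustration; the study's actual dataset and preprocessing are not reproduced here.

```python
# Random Forest sketch for cost-overrun prediction with feature importances.
# Feature names and data are synthetic placeholders, not the study's dataset.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 500
data = pd.DataFrame({
    "scope_changes": rng.integers(0, 10, n),          # assumed cost drivers
    "material_delay_days": rng.integers(0, 60, n),
    "planned_duration_months": rng.integers(6, 36, n),
    "planned_budget_musd": rng.uniform(1, 50, n),
})
# Synthetic target: overrun (%) loosely driven by scope changes and delays
overrun = 2.0 * data["scope_changes"] + 0.3 * data["material_delay_days"] \
          + rng.normal(0, 3, n)

X_train, X_test, y_train, y_test = train_test_split(data, overrun, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)

print("R^2 on held-out synthetic data:", round(model.score(X_test, y_test), 3))
for name, imp in sorted(zip(data.columns, model.feature_importances_),
                        key=lambda t: -t[1]):
    print(f"  {name:25s} importance = {imp:.2f}")
```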

Keywords: resource allocation, machine learning, optimization, data-driven decision-making, project management

Procedia PDF Downloads 28
17473 A System Dynamic Based DSS for Ecological Urban Management in Alexandria, Egypt

Authors: Mona M. Salem, Khaled S. Al-Hagla, Hany M. Ayad

Abstract:

The concept of urban metabolism has increasingly been employed in a diverse range of disciplines as a means to analyze and theorize the city. Urban ecology has a particular focus on the implications of applying the metabolism concept to the urban realm. This approach has been developed by a few researchers, though it has rarely, if ever, been used in policy development for city planning. The aim of this research is to use ecologically informed urban planning interventions to increase the sustainability of urban metabolism, with a special focus on land stock as the most important city resource, by developing a system-dynamics-based DSS. This model identifies two critical management strategy variables for the Strategic Urban Plan Alexandria (SUP) 2032. As a result, this comprehensive and precise quantitative approach is needed to monitor, measure, evaluate and observe dynamic urban changes, working as a decision support system (DSS) for policy making.

Keywords: ecology, land resource, LULCC, management, metabolism, model, scenarios, system dynamics, urban development

Procedia PDF Downloads 373
17472 Decision Support System for the Management and Maintenance of Sewer Networks

Authors: A. Bouamrane, M. T. Bouziane, K. Boutebba, Y. Djebbar

Abstract:

This paper aims to develop a decision support tool that provides solutions to the problems of sewer network management and maintenance, in order to assist the manager in ranking sections by priority of intervention, taking into account technical, economic, social and environmental standards as well as the manager’s strategy. The solution uses the Analytic Network Process (ANP) developed by Thomas Saaty, coupled with a set of tools for modelling and collecting integrated data from a geographic information system (GIS). It provides the decision maker with a tool adapted to the reality on the ground and effective in use with respect to the manager’s means and objectives.
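
A small sketch of the priority-calculation step at the heart of Saaty's ANP/AHP family of methods mentioned above: a pairwise comparison matrix of criteria (here technical, economic, social, environmental) is reduced to a priority vector via its principal eigenvector. The comparison judgements are hypothetical, and the full ANP supermatrix treatment is not shown.

```python
# Priority vector from a pairwise comparison matrix (Saaty-style), using the
# principal eigenvector. The comparison judgements below are hypothetical.
import numpy as np

criteria = ["technical", "economic", "social", "environmental"]
A = np.array([
    [1.0, 2.0, 3.0, 2.0],
    [1/2, 1.0, 2.0, 1.0],
    [1/3, 1/2, 1.0, 1/2],
    [1/2, 1.0, 2.0, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                    # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                   # normalised priorities

for name, weight in zip(criteria, w):
    print(f"{name:13s} weight = {weight:.3f}")

# Consistency check (RI = 0.90 for a 4x4 matrix, from Saaty's random index table)
ci = (eigvals.real.max() - len(A)) / (len(A) - 1)
print("consistency ratio:", round(ci / 0.90, 3))
```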

Keywords: multi-criteria decision support, maintenance, Geographic Information System, modelling

Procedia PDF Downloads 621
17471 Variability of Hydrological Modeling of the Blue Nile

Authors: Abeer Samy, Oliver C. Saavedra Valeriano, Abdelazim Negm

Abstract:

The Blue Nile Basin is the most important tributary of the Nile River. Egypt and Sudan are almost entirely dependent on water originating from the Blue Nile. This multi-dependency creates conflicts among the three countries, Egypt, Sudan, and Ethiopia, making the management of these conflicts an international issue. A good assessment of the water resources of the Blue Nile is important to help manage such conflicts, and hydrological models are a good tool for such an assessment. This paper presents a critical review of the nature and variability of the climate and hydrology of the Blue Nile Basin as a first step towards using hydrological modeling to assess the water resources of the Blue Nile. Several attempts have been made to develop basin-scale hydrological models of the Blue Nile. Lumped and semi-distributed models use averages of meteorological inputs and watershed characteristics in hydrological simulation to analyze runoff for flood control and water resources management. Distributed models include the temporal and spatial variability of catchment conditions and meteorological inputs, allowing a better representation of the hydrological processes. The main challenge for all the models used to assess the water resources of the basin is the shortage of the data needed for model calibration and validation. It is recommended to use distributed models, for their higher accuracy, to cope with the great variability and complexity of the Blue Nile Basin, and to collect sufficient data to enable more sophisticated and accurate hydrological modeling.

Keywords: Blue Nile Basin, climate change, hydrological modeling, watershed

Procedia PDF Downloads 361
17470 Hydroinformatics of Smart Cities: Real-Time Water Quality Prediction Model Using a Hybrid Approach

Authors: Elisa Coraggio, Dawei Han, Weiru Liu, Theo Tryfonas

Abstract:

Water is one of the most important resources for human society. The world is currently undergoing a wave of urban growth, and pollution problems have a great impact. Monitoring water quality is a key task for the future of the environment and the human species. In recent times, researchers using Smart Cities technologies are trying to mitigate the problems generated by population growth in urban areas. The availability of huge amounts of data collected by a pervasive urban IoT can increase the transparency of decision making. Several services have already been implemented in Smart Cities, but more and more services will be involved in the future. Water quality monitoring can successfully be implemented in the urban IoT. The combination of water quality sensors, cloud computing, smart city infrastructure, and IoT technology can lead to a bright future for environmental monitoring. In the past decades, much effort has been put into monitoring and predicting water quality using traditional approaches based on manual collection and laboratory-based analysis, which are slow and laborious. The present study proposes a methodology for implementing a water quality prediction model using artificial intelligence techniques and comparing the results obtained with different algorithms. Furthermore, a 3D numerical model will be created using the software D-Water Quality, and simulation results will be used as a training dataset for the artificial intelligence algorithm. This study derives the methodology and demonstrates its implementation based on information and data collected at the floating harbour in the city of Bristol (UK). The city of Bristol is blessed with the Bristol-Is-Open infrastructure, which includes a Wi-Fi network and virtual machines; it was also named the UK’s smartest city in 2017.
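
A condensed sketch of the workflow outlined above: simulated water-quality output (here a synthetic stand-in for D-Water Quality results) is used as a training set for a machine-learning model, and candidate algorithms can then be compared on held-out data. The variable names and the two regressors chosen are assumptions for illustration.

```python
# Sketch: train ML water-quality predictors on (synthetic) simulation output
# and compare two algorithms. Variables and data are illustrative placeholders.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n = 1000
# Stand-ins for simulated drivers: temperature, flow, rainfall, turbidity
X = np.column_stack([
    rng.uniform(5, 25, n),      # water temperature, deg C
    rng.uniform(0.1, 5.0, n),   # flow, m^3/s
    rng.uniform(0, 30, n),      # rainfall, mm/day
    rng.uniform(1, 50, n),      # turbidity, NTU
])
# Synthetic target: dissolved oxygen loosely dependent on the drivers
y = 12 - 0.2 * X[:, 0] - 0.05 * X[:, 3] + rng.normal(0, 0.3, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
for name, model in [("random forest", RandomForestRegressor(random_state=0)),
                    ("neural network", MLPRegressor(hidden_layer_sizes=(32,),
                                                    max_iter=2000, random_state=0))]:
    model.fit(X_tr, y_tr)
    print(f"{name}: R^2 on held-out simulated data = {model.score(X_te, y_te):.3f}")
```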

Keywords: artificial intelligence, hydroinformatics, numerical modelling, smart cities, water quality

Procedia PDF Downloads 177
17469 A Method for Quantitative Assessment of the Dependencies between Input Signals and Output Indicators in Production Systems

Authors: Maciej Zaręba, Sławomir Lasota

Abstract:

Knowing the degree of dependency between sets of input signals and selected sets of indicators that measure a production system's effectiveness is of great importance in industry. This paper introduces the SELM method, which enables the selection of the sets of input signals that most affect a selected subset of indicators measuring the effectiveness of a production system. For a defined set of output indicators, the method quantifies the impact of the input signals gathered by the continuous monitoring of the production system.

Keywords: manufacturing operation management, signal relationship, continuous monitoring, production systems

Procedia PDF Downloads 108
17462 Key Parameters Analysis of the Stirring Systems in the Optimization Procedures

Authors: T. Gomes, J. Manzi

Abstract:

The inclusion of stirring systems in calculation and optimization procedures has received very little attention, which can be reflected in the results, because such systems provide additional energy to the process and promote a better distribution of mass and energy. This is meaningful for reactive systems, particularly for the Continuous Stirred Tank Reactor (CSTR), for which the key variables and parameters, as well as the operating conditions of the stirring system, can play a pivotal role; it has been shown in the literature that neglecting these factors can lead to sub-optimal results. It is also well known that the sole use of the First Law of Thermodynamics as an optimization tool cannot yield satisfactory results, whereas the joint use of the First and Second Laws, condensed into a procedure called entropy generation minimization (EGM), has shown itself able to drive the system towards better results. Therefore, the main objective of this paper is to determine the effects of the key parameters of the stirring system on optimization procedures by means of EGM applied to reactive systems. Such considerations have been made possible by dimensional analysis according to the Rayleigh-Buckingham method, which takes into account the physical and geometric parameters and the variables of the reactive system. For the simulation, based on the production of propylene glycol, the results have shown a significant increase in the conversion rate from 36% (non-optimized system) to 95% (optimized system), with a consequent reduction of by-products. In addition, it has been possible to establish the influence of the work of the stirrer in the optimization procedure, which can be described as a function of the fluid viscosity and, consequently, of the temperature. The conclusions also indicate that the use of entropic analysis as an optimization tool is simple, easy to apply and requires low computational effort.
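
To make the reactive-system setting concrete, the sketch below evaluates the steady-state conversion of a first-order reaction in a CSTR, X = kτ/(1 + kτ), for a few residence times. It is a simplified stand-in for illustration only; the rate constant is assumed, and it does not reproduce the paper's entropy-generation-minimization procedure or its propylene glycol kinetics.

```python
# Steady-state CSTR conversion for a first-order reaction: X = k*tau / (1 + k*tau).
# The rate constant and residence times are assumed; this is only a simple
# reactive-system example, not the paper's EGM-based optimization model.
K = 0.5          # first-order rate constant, 1/min (assumed)

def cstr_conversion(residence_time_min):
    kt = K * residence_time_min
    return kt / (1.0 + kt)

for tau in (1, 5, 20, 60):
    print(f"tau = {tau:3d} min -> conversion X = {cstr_conversion(tau):.2f}")
```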

Keywords: stirring systems, entropy, reactive system, optimization

Procedia PDF Downloads 243
17467 Adequacy of Advanced Earthquake Intensity Measures for Estimation of Damage under Seismic Excitation with Arbitrary Orientation

Authors: Konstantinos G. Kostinakis, Manthos K. Papadopoulos, Asimina M. Athanatopoulou

Abstract:

An important area of research in seismic risk analysis is the evaluation of the expected seismic damage of structures under a specific earthquake ground motion. Several conventional intensity measures of ground motion have been used to estimate their damage potential to structures, yet none of them has proved able to predict adequately the seismic damage of any structural system. Therefore, alternative advanced intensity measures, which take into account not only ground motion characteristics but also structural information, have been proposed. The adequacy of a number of advanced earthquake intensity measures in predicting the structural damage of 3D R/C buildings under seismic excitation striking the building at an arbitrary incident angle is investigated in the present paper. To achieve this purpose, a plan-symmetric and a plan-asymmetric 5-story R/C building are studied. The two buildings are subjected to 20 bidirectional earthquake ground motions. The two horizontal accelerograms of each ground motion are applied along horizontal orthogonal axes forming 72 different angles with the structural axes. The response is computed by non-linear time history analysis. The structural damage is expressed in terms of the maximum interstory drift as well as the overall structural damage index. The values of the aforementioned seismic damage measures determined for an incident angle of 0°, as well as their maximum values over all seismic incident angles, are correlated with 9 structure-specific ground motion intensity measures. The research identified certain intensity measures which exhibited strong correlation with the seismic damage of the two buildings. However, their adequacy for estimation of the structural damage depends on the response parameter adopted. Furthermore, it was confirmed that the widely used spectral acceleration at the fundamental period of the structure is a good indicator of the expected earthquake damage level.
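
A short sketch of the correlation analysis described above: Pearson correlation coefficients between candidate intensity measures and a damage index over a suite of analyses. The arrays below are random placeholders standing in for the 20 records and 72 incident angles, not the study's results.

```python
# Correlation sketch: Pearson correlation between candidate intensity measures
# (IMs) and a damage index over a suite of analyses. Data is a random placeholder.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(3)
n_cases = 20 * 72                 # 20 records x 72 incident angles (placeholder)

sa_t1 = rng.uniform(0.1, 1.5, n_cases)        # spectral acceleration at T1, g
pga = rng.uniform(0.05, 0.8, n_cases)         # peak ground acceleration, g
damage_index = 0.6 * sa_t1 + 0.1 * pga + rng.normal(0, 0.1, n_cases)  # synthetic

for name, im in (("Sa(T1)", sa_t1), ("PGA", pga)):
    r, p = pearsonr(im, damage_index)
    print(f"{name:7s} r = {r:+.2f} (p = {p:.1e})")
```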

Keywords: damage indices, non-linear response, seismic excitation angle, structure-specific intensity measures

Procedia PDF Downloads 490
17466 Media Literacy: Information and Communication Technology Impact on Teaching and Learning Methods in Albanian Education System

Authors: Loreta Axhami

Abstract:

Media literacy in the digital age emerges not only as a set of skills to generate true knowledge and information but also as a pedagogical methodology, a kind of educational philosophy. In addition to such innovations as the integration of information and communication technologies, media infrastructures, and web usage in the educational system, media literacy enables change in learning methods, pedagogy, teaching programs, and the school curriculum itself. In this framework, this study focuses on ICT's impact on teaching and learning methods and the degree to which they are reflected in the Albanian education system. The study is based on a combination of quantitative and qualitative methods of scientific research. The findings show that students' limited access to the internet at school, the focus on hardcopy textbooks, and the role of the teacher as the only or main source of knowledge and information are some of the main factors contributing to the persistence of authoritarian pedagogical methods in the Albanian education system. In these circumstances, the implementation of media literacy is recommended as an apt educational process for the 21st century, one which requires a reconceptualization of textbooks as well as the application of modern teaching and learning methods that integrate information and communication technologies.

Keywords: authoritarian pedagogic model, education system, ICT, media literacy

Procedia PDF Downloads 132
17465 The Solid-Phase Sensor Systems for Fluorescent and SERS-Recognition of Neurotransmitters for Their Visualization and Determination in Biomaterials

Authors: Irina Veselova, Maria Makedonskaya, Olga Eremina, Alexandr Sidorov, Eugene Goodilin, Tatyana Shekhovtsova

Abstract:

Catecholamines such as dopamine, norepinephrine, and epinephrine are the principal neurotransmitters in the sympathetic nervous system. Catecholamines and their metabolites are considered important markers of socially significant diseases such as atherosclerosis, diabetes, coronary heart disease, carcinogenesis, and Alzheimer's and Parkinson's diseases. Currently, neurotransmitters can be studied via electrochemical and chromatographic techniques that allow their characterization and quantification, although these techniques can only provide crude spatial information. Besides, the difficulty of catecholamine determination in biological materials is associated with their low normal concentrations (~1 nM) in biomaterials, which may become even one order of magnitude lower because of some disorders. In addition, in blood they are rapidly oxidized by monoamine oxidases from thrombocytes and, for this reason, the determination of neurotransmitter metabolism indicators in an organism should be very rapid (15-30 min), especially in critical states. Unfortunately, modern instrumental analysis does not offer a comprehensive solution to this problem: despite its high sensitivity and selectivity, HPLC-MS cannot provide sufficiently rapid analysis, while enzymatic biosensors and immunoassays for the determination of the considered analytes lack sufficient sensitivity and reproducibility. Fluorescent and SERS sensors remain a compelling technology for approaching the general problem of selective neurotransmitter detection. In recent years, a number of catecholamine sensors have been reported, including RNA aptamers, fluorescent ribonucleopeptide (RNP) complexes, and boronic acid based synthetic receptors, with the sensors operating in a turn-off mode. In this work we present fluorescent and SERS turn-on sensor systems based on bio- or chemorecognizing nanostructured films {chitosan/collagen-Tb/Eu/Cu-nanoparticles-indicator reagents} that provide the selective recognition, visualization, and sensing of the above-mentioned catecholamines at nanomolar concentrations in biomaterials (cell cultures, tissue, etc.). We have (1) developed optically transparent porous films and gels of chitosan/collagen; (2) functionalized the surface with 'recognizer' molecules (by impregnation and immobilization of the components of the indicator systems: biorecognizing and auxiliary reagents); and (3) performed computer simulations for the theoretical prediction and interpretation of some properties of the developed materials and of the analytical signals obtained in biomaterials. We are grateful for the financial support of this research from the Russian Foundation for Basic Research (grants no. 15-03-05064 a and 15-29-01330 ofi_m).

Keywords: biomaterials, fluorescent and SERS-recognition, neurotransmitters, solid-phase turn-on sensor system

Procedia PDF Downloads 400
17464 Autoantibodies against Central Nervous System Antigens and the Serum Levels of IL-32 in Patients with Schizophrenia

Authors: Fatemeh Keshavarz

Abstract:

Background: Schizophrenia is a disease of the nervous system, and immune system disorders can affect its pathogenesis. Activation of microglia, proinflammatory cytokines, disruption of the blood-brain barrier (BBB) due to inflammation, activation of autoreactive B cells, and consequently the production of autoantibodies against nervous system antigens are among the immune processes involved in neurological diseases. Interleukin 32 (IL-32) is a proinflammatory cytokine that is an important player in the activation of the innate and adaptive immune responses. This study aimed to measure the serum level of IL-32 as well as the frequency of autoantibody positivity against several nervous system antigens in patients with schizophrenia. Material and Methods: This study was conducted on 40 patients with schizophrenia and 40 healthy individuals as a control group. Serum IL-32 levels were measured by ELISA. The frequency of autoantibodies against the Hu, Ri, Yo, Tr, CV2, Amphiphysin, SOX1, Zic4, ITPR1, CARP, GAD, Recoverin, Titin, and Ganglioside antigens was measured by the indirect immunofluorescence method. Results: Serum IL-32 levels in patients with schizophrenia were significantly higher than in the control group. Autoantibodies were positive in 8 patients for the GAD antigen and in 5 patients for the Ri antigen, frequencies that were significantly higher than in the control group. Autoantibodies were also positive in 2 patients for CV2, in 1 patient for Hu, and in 1 patient for CARP. Negative results were reported for the other antigens. Conclusion: Our findings suggest that the elevated serum IL-32 level and autoantibody positivity against several nervous system antigens may be involved in the pathogenesis of schizophrenia.

Keywords: schizophrenia, microglia, autoantibodies, IL-32

Procedia PDF Downloads 121
17463 A New Converter Topology for Wind Energy Conversion System

Authors: Mahmoud Khamaira, Ahmed Abu-Siada, Yasser Alharbi

Abstract:

Doubly Fed Induction Generators (DFIGs) are currently extensively used in variable speed wind power plants due to their superior advantages, which include reduced converter rating, low cost, reduced losses, easy implementation of power factor correction schemes, variable speed operation, and four-quadrant active and reactive power control capabilities. On the other hand, DFIG sensitivity to grid disturbances, especially voltage sags, represents the main disadvantage of the equipment. In this paper, a coil is proposed to be integrated within the DFIG converters to improve the overall performance of a DFIG-based wind energy conversion system (WECS). The charging and discharging of the coil are controlled by controlling the duty cycle of the switches of the dc-dc chopper. Simulation results reveal the effectiveness of the proposed topology in improving the overall performance of the WECS under study.

Keywords: doubly fed induction generator, coil, wind energy conversion system, converter topology

Procedia PDF Downloads 654
17462 Android Graphics System: Study of Dual-Software VSync Synchronization Architecture and Optimization

Authors: Prafulla Kumar Choubey, Krishna Kishor Jha, S. B. Vaisakh Punnekkattu Chirayil

Abstract:

In the graphics-display subsystem, frame buffers are shared between the producer (content rendering) and the consumer (display). If a common buffer is operated on by both producer and consumer simultaneously, a mismatch in their processing rates can cause a tearing effect in the displayed content. Therefore, the Android OS employs a triple-buffered system, taking into account an additional composition stage. The three stages, rendering, composition and display refresh, operate synchronously on three different buffers, which is achieved by using VSync pulses. This synchronization, however, brings into the pipeline an additional latency of up to 26 ms. The present study details the existing synchronization mechanism of the Android graphics-display pipeline and discusses a new adaptive architecture which reduces the wait time to 5-16 ms in all use cases. The proposed method uses two adaptive software VSync signals (PLLs) to achieve the same result.
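
A simplified, illustrative model of the effect discussed above: in a VSync-gated pipeline each hand-off waits for the next tick, so with zero phase offsets the worst-case wait approaches two full refresh periods, while software VSync signals phase-shifted closer to when work typically finishes cut that wait down. The offsets, composition time and timings below are assumptions, not the dual-PLL design of the paper.

```python
# Toy model of VSync-gated pipeline latency at 60 Hz. A rendered frame waits for
# the composition VSync, composition takes some time, then the composed frame
# waits for the hardware VSync to be scanned out. All values are assumptions.
PERIOD_MS = 1000.0 / 60.0          # ~16.67 ms refresh period

def wait_for_tick(t_ms, phase_ms):
    """Time from t until the next tick of a period-PERIOD signal offset by phase."""
    return (phase_ms - t_ms) % PERIOD_MS

def pipeline_latency(render_done_ms, sf_phase_ms, compose_time_ms=4.0):
    t = render_done_ms + wait_for_tick(render_done_ms, sf_phase_ms)  # composition VSync
    t += compose_time_ms                                             # composition work
    t += wait_for_tick(t, 0.0)                                       # display refresh
    return t - render_done_ms

render_done = 5.0                                   # frame ready 5 ms into the cycle
print("zero SF offset:  %.1f ms" % pipeline_latency(render_done, sf_phase_ms=0.0))
print("tuned SF offset: %.1f ms" % pipeline_latency(render_done, sf_phase_ms=10.0))
```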

Keywords: Android graphics system, vertical synchronization, atrace, adaptive system

Procedia PDF Downloads 307