Search results for: distribution system and optimization
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 23253


19203 Visual Analysis of Picturesque Urban Landscape Case of Sultanahmet, Istanbul

Authors: Saidu Dalhat Dansadau, Aykut Karaman

Abstract:

The integration of photography into architecture was a pivotal point in the history of architectural representation; photography proved useful for the advancement of architecture early on and established itself as a necessary tool in the discipline. The main study from which this paper was extracted asked what the key picturesque locations and structures in Sultanahmet, Fatih-Istanbul are, and how their spatial distribution and cultural significance can be characterized and mapped for urban design and development. The secondary objective, on which this paper focuses, is to investigate the role of perception in urban environments and how photography serves as a tool for capturing and conveying the perception of Sultanahmet's picturesque structures and locations. The study pursued these objectives using geo-tagged photography, sequential photography, social media metadata extraction, GIS mapping, spatial analysis, and visual analysis, focusing on the historically rich and culturally significant study area of Sultanahmet, Fatih-Istanbul. By identifying candidate structures and locations and then examining their spatial distribution and cultural significance, the main study achieved its primary objective and, through sequential photography, revealed a more nuanced understanding of the dynamics between photography, architecture, urban design, and perception.

Keywords: perception, architectural photography, picturesque, urban design, Sultanahmet, Istanbul

Procedia PDF Downloads 39
19202 Examining the Relationship Between Traditional Property Rights and Online Intellectual Property Rights in the Digital Age

Authors: Luljeta Plakolli-Kasumi

Abstract:

In the digital age, the relationship between traditional property rights and online intellectual property rights is becoming increasingly complex. On the one hand, the internet and advancements in technology have allowed for the widespread distribution and use of digital content, making it easier for individuals and businesses to access and share information. On the other hand, the rise of digital piracy and illegal file-sharing has led to increased concerns about the protection of intellectual property rights. This paper aims to examine the relationship between traditional property rights and online intellectual property rights in the digital age by analyzing the current legal frameworks, key challenges and controversies that arise, and potential solutions for addressing these issues. The paper will look at how traditional property rights concepts such as ownership and possession are being applied in the online context and how they intersect with new and evolving forms of intellectual property such as digital downloads, streaming services, and online content creation. It will also discuss the tension between the need for strong intellectual property protection to encourage creativity and innovation and the public interest in promoting access to information and knowledge. Ultimately, the paper will explore how the legal system can adapt to better balance the interests of property owners, creators, and users in the digital age.

Keywords: intellectual property, traditional property, digital age, digital content

Procedia PDF Downloads 87
19201 Experimental and Numerical Studies of Droplet Formation

Authors: Khaled Al-Badani, James Ren, Lisa Li, David Allanson

Abstract:

Droplet formation is an important process in many engineering systems and manufacturing procedures, including welding, biotechnology, 3D printing, and the biochemical and biomedical fields. The volume and characteristics of a formed droplet generally depend on various material properties, microfluidics, and fluid-mechanics considerations. Hence, a detailed investigation of this process, with the aid of numerical computational tools, is essential for future design optimization and process control of many engineering systems. It will also improve the understanding of changes in the properties and structures of materials during droplet formation, which is important for developing new materials with different functions, depending on the requirements of the application. For example, the shape of the formed droplet is critical for the function of some final products, such as the weld nugget in the capacitor discharge welding process or in PLA 3D printing. Although most academic publications on droplet formation focus on issues of material transfer rate, surface tension, and residual stresses, the characteristics of droplet shape have largely been overlooked. The work proposed for this project will examine theoretical methodologies, experimental techniques, and numerical modelling, using ANSYS FLUENT, to critically analyse and highlight optimization methods for the formation of a pendant droplet. The project will also compare published data with experimental and numerical results concerning the effects of key material parameters on droplet shape, including changes in heating/cooling rates, solidification/melting progression, and separation/break-up times. From these tests, a set of objectives is prepared with the intention of improving quality, stability, and productivity in modelling metal welding and 3D printing.

Keywords: computer modelling, droplet formation, material distortion, materials forming, welding

Procedia PDF Downloads 282
19200 Lessons Learned from Push-Plus Implementation in Northern Nigeria

Authors: Aisha Giwa, Mohammed-Faosy Adeniran, Olufunke Femi-Ojo

Abstract:

Four decades ago, the World Health Organization (WHO) launched the Expanded Programme on Immunization (EPI). The EPI blueprint laid out the technical and managerial functions necessary to routinely vaccinate children with a limited number of vaccines, providing protection against diphtheria, tetanus, whooping cough, measles, polio, and tuberculosis, and to prevent maternal and neonatal tetanus by vaccinating women of childbearing age with tetanus toxoid. Despite global efforts, Routine Immunization (RI) coverage in two WHO regions, the African Region and the South-East Asia Region, still falls short of its targets. As a result, the WHO Regional Director for Africa declared 2012 the year for intensifying RI in these regions, which coincided with the WHO Executive Board's declaration of polio as a programmatic emergency. In order to intensify routine immunization, the National Routine Immunization Strategic Plan (2013-2015) stated that its core priority is to ensure 100% adequacy and availability of vaccines for safe immunization. To achieve 100% availability, the "PUSH System" and then "Push-Plus" were adopted for vaccine distribution, replacing the inefficient "PULL" method. The NPHCDA plays the key role in coordinating activities in advocacy, capacity building, engagement of third-party logistics (3PL) providers for the states, and monitoring and evaluation of the vaccine delivery process. eHealth Africa (eHA) is a 3PL service provider engaged by State Primary Health Care Boards (SPHCDB) to ensure vaccine availability through the Vaccine Direct Delivery (VDD) project, which is essential to successful routine immunization services. The VDD project ensures the availability and adequate supply of high-quality vaccines and immunization-related materials to last-mile facilities. eHA's commitment to the VDD project prompted an assessment of overall project performance, an evaluation of the delivery process to suggest necessary improvements, and a review of general impact across Kano State (where eHA has transitioned operations to the state), Bauchi State (where eHA currently manages delivery to all LGAs except three managed by the state), Sokoto State (where eHA currently covers all LGAs), and Zamfara State (currently in-sourced and managed solely by the state).

Keywords: cold chain logistics, health supply chain system strengthening, logistics management information system, vaccine delivery traceability and accountability

Procedia PDF Downloads 295
19199 Application of GIS-Based Construction Engineering: An Electronic Document Management System

Authors: Mansour N. Jadid

Abstract:

This paper describes the implementation of a GIS to provide decision support for successfully monitoring the movement and storage of materials, hence ensuring that finished products travel from the point of origin to the destination construction site through the supply-chain management (SCM) system. The system ensures the efficient operation of suppliers, manufacturers, and distributors by determining the shortest path from the point of origin to the final destination, in order to reduce construction costs, minimize time, and enhance productivity. Such systems are essential to the construction industry because they reduce costs and save time, thereby improving productivity and effectiveness. This study describes a typical supply-chain model and a geographical information system (GIS)-based SCM that focuses on implementing an electronic document management system, which maps the application framework to integrate geodetic support with the supply-chain system. This process provides guidance for locating the nearest suppliers to fill the information needs of project members in different locations. Moreover, this study illustrates the use of the GIS-based SCM as a collaborative tool in innovative methods for implementing web mapping services, as well as aspects of their integration, by generating an interactive GIS platform for the construction industry.
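
The shortest-path component described above can be illustrated with a minimal sketch. The snippet below is an illustrative assumption rather than the system described in the abstract: it applies Dijkstra's algorithm to a hypothetical road network (made-up node names and travel times) to compare candidate suppliers by travel time to a construction site.

```python
import heapq

def dijkstra(graph, source):
    """Shortest travel times from `source` to every node in a weighted graph.
    `graph` maps node -> list of (neighbour, edge_cost) pairs."""
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry
        for neighbour, cost in graph.get(node, []):
            nd = d + cost
            if nd < dist.get(neighbour, float("inf")):
                dist[neighbour] = nd
                heapq.heappush(heap, (nd, neighbour))
    return dist

# Hypothetical road network: travel times (minutes) between suppliers, a depot, and a site.
road_network = {
    "supplier_A": [("depot", 25), ("site", 70)],
    "supplier_B": [("depot", 40), ("site", 35)],
    "depot": [("site", 30)],
    "site": [],
}
for supplier in ("supplier_A", "supplier_B"):
    print(supplier, "-> site:", dijkstra(road_network, supplier).get("site"), "min")
```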

Keywords: construction, coordinate, engineering, GIS, management, map

Procedia PDF Downloads 301
19198 Enhanced Production of Endo-β-1,4-Xylanase from a Newly Isolated Thermophile Geobacillus stearothermophilus KIBGE-IB29 for Prospective Industrial Applications

Authors: Zainab Bibi, Afsheen Aman, Shah Ali Ul Qader

Abstract:

Endo-β-1,4-xylanases [EC 3.2.1.8] are one of the major groups of enzymes involved in the degradation of xylan and have several applications in the food, textile, and paper processing industries. Due to the broad utility of endo-β-1,4-xylanase, researchers are focusing on increasing the productivity of this hydrolase from various microbial species. Tolerance of harsh industrial conditions, a faster reaction rate, and efficient hydrolysis of xylan with a low risk of contamination are critical industrial requirements that can be fulfilled by synthesizing an enzyme with suitable properties. In the current study, a newly isolated thermophile, Geobacillus stearothermophilus KIBGE-IB29, was used in order to attain maximum production of endo-1,4-β-xylanase. The bacterial culture was isolated from soil collected around the blast furnace site of a steel processing mill in Karachi. Optimization of various nutritional and physical factors resulted in maximum synthesis of endo-1,4-β-xylanase from the thermophile. A high production yield was achieved at 60°C and pH 6.0 after 24 hours of incubation. Various nitrogen sources, viz. peptone, yeast extract, and meat extract, improved enzyme synthesis at optimum concentrations of 0.5%, 0.2%, and 0.1%, respectively. Dipotassium hydrogen phosphate (0.25%), potassium dihydrogen phosphate (0.05%), ammonium sulfate (0.05%), and calcium chloride (0.01%) were found to be valuable salts for improving enzyme production. The thermophilic nature of the isolate, with its broad pH stability profile and reduced fermentation time, indicates its importance for effective xylan saccharification and for large-scale production of endo-1,4-β-xylanase.

Keywords: geobacillus, optimization, production, xylanase

Procedia PDF Downloads 306
19197 A Model of Empowerment Evaluation of Knowledge Management in Private Banks Using Fuzzy Inference System

Authors: Nazanin Pilevari, Kamyar Mahmoodi

Abstract:

The purpose of this research is to provide a model based on a fuzzy inference system for evaluating the empowerment of knowledge management. The first prototype of the model was developed from a study of the literature. In the next step, the model was provided to experts and, after consensus-based revision using the fuzzy Delphi technique, the components and indices of the research model were finalized. Culture, structure, information technology, and leadership were considered as the dimensions of empowerment. Then, in order to collect and extract data for the fuzzy inference system based on knowledge and experience, the experts were interviewed. The values obtained from the designed fuzzy inference system made it possible to review and assess the organization's knowledge management empowerment. After the design and validation of the system for measuring the indices (knowledge management empowerment and the inputs to the fuzzy inference system), a questionnaire was used at AYANDEH Bank. For this bank, the system output indicates that the empowerment of knowledge management, culture, organizational structure, and leadership is at a moderate level, while information technology empowerment is relatively high. Based on these results, the overall status of knowledge management empowerment at AYANDEH Bank was moderate. Finally, some suggestions for improving the bank's current situation were provided. According to the review of prior research, the use of a fuzzy inference system as a powerful tool for assessing knowledge management and knowledge management empowerment, and the application of such an assessment in the banking field, constitute the innovation of this research.

Keywords: knowledge management, knowledge management empowerment, fuzzy inference system, fuzzy Delphi

Procedia PDF Downloads 353
19196 Thermal Imaging of Aircraft Piston Engine in Laboratory Conditions

Authors: Lukasz Grabowski, Marcin Szlachetka, Tytus Tulwin

Abstract:

The main task of an engine cooling system is to maintain the engine's average operating temperatures within strictly defined limits. Average temperatures that are too high or too low result in accelerated wear or even damage to the engine or its individual components. In order to avoid local overheating or significant temperature gradients, which lead to high stresses in a component, the aim is to ensure an even flow of air. In analyses related to heat exchange, one of the main problems is the comparison of temperature fields, because standard measuring instruments such as thermocouples or thermistors only provide information about the temperature at a given point. Thermal imaging tests can be helpful in this case. With appropriate camera settings and taking environmental conditions into account, accurate temperature fields can be obtained in the form of thermograms. Emission of heat from the engine into the engine compartment is an important issue when designing a cooling system. In the case of liquid cooling, the main sources of heat, in the form of emissions from the engine block, cylinders, etc., should also be identified, which is important when redesigning the engine compartment ventilation system. Ensuring proper cooling of an aircraft reciprocating engine is difficult not only because of the variable operating range but mainly because of the different cooling conditions related to changes in flight speed or altitude. Engine temperature also has a direct and significant impact on the properties of the engine oil, which changes under the influence of this parameter, in particular its viscosity; a value that is too low or too high can result in fast wear of engine parts. One way to determine the temperatures occurring on individual parts of the engine is the use of thermal imaging measurements. The article presents the results of preliminary thermal imaging tests of an aircraft piston diesel engine with a maximum power of about 100 HP. In order to perform the heat emission tests of the tested engine, the ThermaCAM S65 thermal imaging monitoring system from FLIR (Forward-Looking Infrared), together with the ThermaCAM Researcher Professional software, was used. The measurements were carried out after the engine warm-up. The engine speed was 5300 rpm. The measurements were taken for the following environmental parameters: air temperature 17 °C, ambient pressure 1004 hPa, relative humidity 38%. The temperature distributions on the engine cylinder and on the exhaust manifold were analysed. Thermal imaging tests made it possible to relate the results of simulation tests to the real object by measuring the rib temperature of the cylinders. The results obtained are necessary to develop a CFD (Computational Fluid Dynamics) model of heat emission from the engine bay. The project/research was financed in the framework of the project Lublin University of Technology-Regional Excellence Initiative, funded by the Polish Ministry of Science and Higher Education (contract no. 030/RID/2018/19).

Keywords: aircraft, piston engine, heat, emission

Procedia PDF Downloads 114
19195 A High Reliable Space-Borne File System with Applications of Device Partition and Intra-Channel Pipeline in Nand Flash

Authors: Xin Li, Ji-Yang Yu, Yue-Hua Niu, Lu-Yuan Wang

Abstract:

As an essential link in the space data acquisition chain, space-borne storage systems based on NAND flash have gradually been implemented in spacecraft. Faced with massive, parallel, and varied data on board, efficient data management becomes an important issue in storage research. To meet the requirements of high performance and reliability in NAND flash storage systems, a combination of hardware and file system design can drastically increase system dependability, even for missions with a very long duration. More sophisticated flash storage concepts with advanced operating systems have been researched to improve the reliability of NAND flash storage systems on satellites. In this paper, an architecture for a file system with multi-channel data acquisition and storage on board is proposed, which achieves large capacity and high performance by combining intra-channel pipelining and device partitioning in NAND flash. Multi-channel data at different rates are stored as independent files in a parallel-storage system with device partitioning, which ensures efficient and reliable throughput of file operations. For massive and high-speed data storage, an efficiency assessment model is established to derive the bandwidth formula of the intra-channel pipeline. Information tables designed in Magnetoresistive RAM (MRAM) hold the bad-block management of the NAND flash and the arrangement of the file system addresses for high-reliability data storage. During the full-load test, the throughput of the 3D PLUS Module 160Gb NAND flash reached 120 Mbps for storage and 120 Mbps for playback, which satisfies the requirement of multi-channel data acquisition on a satellite. Compared with the previous literature, the experimental results verify the advantages of the proposed system.
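
The intra-channel pipeline bandwidth mentioned above can be approximated with a simple back-of-the-envelope model. The sketch below is an assumption for illustration only (hypothetical page size and NAND timing values, and a simplified steady-state formula), not the efficiency assessment model derived in the paper.

```python
def pipelined_write_bandwidth(page_kb, t_transfer_us, t_program_us, n_ways):
    """Rough effective write bandwidth (MB/s) of one NAND channel with n_ways
    interleaved dies. The shared bus needs t_transfer_us per page, while each die
    needs t_transfer_us + t_program_us per page, so the steady-state time per page
    is bounded below by both the bus limit and the per-die limit."""
    cycle_us = max(t_transfer_us, (t_transfer_us + t_program_us) / n_ways)
    return (page_kb / 1024.0) / (cycle_us * 1e-6)

# Hypothetical NAND timing: 8 KB page, 50 us bus transfer, 600 us program time.
for ways in (1, 2, 4, 8):
    mbps = pipelined_write_bandwidth(8, 50, 600, ways)
    print(f"{ways} way(s): {mbps:6.1f} MB/s")
```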

Keywords: device partition architecture, intra-channel pipelining, nand flash, parallel storage

Procedia PDF Downloads 287
19194 Transversal Connection Strengthening of T Section Beam Bridge with Brace System

Authors: Chen Chen

Abstract:

The T section beam bridge has been widely used in China as it is low cost and easy to erect. Some T section beam bridges only have end diaphragms, and the adjacent girders are connected by wet joints along the span, so damage to the transversal connection has become a serious problem in operation and maintenance. This paper presents a brace system to strengthen the transversal connection of T section beam bridges. The strengthening effect was examined through experiments and finite element analysis. The results show that the proposed brace system can improve load transfer between adjacent girders. Based on the experiments and the FEA model, the displacement of the T section beam with the proposed brace system was reduced by 14.9% and 19.1%, respectively. The integral rigidity increased by 19.4% in the static experiments. The transversal connection of T section beam bridges can thus be improved efficiently.

Keywords: experiment, strengthening, T section beam bridge, transversal connection

Procedia PDF Downloads 277
19193 An Algorithm of Regulation of Glucose-Insulin Concentration in the Blood

Authors: B. Selma, S. Chouraqui

Abstract:

The pancreas is an elongated organ that extends across the abdomen, below the stomach. It secretes certain enzymes that aid in food digestion, and it also manufactures the hormones responsible for regulating blood glucose levels. In the present paper, we propose a mathematical model to study the homeostasis of glucose and insulin in healthy humans, and a simulation of this model, which depicts the physiological events after a meal, is presented for ordinary humans. The aim of this paper is to design an algorithm that regulates the level of glucose in the blood. The algorithm applies the concept of an expert system to perform control, actively prescribing the rate of insulin infusion. By decomposing the system into subsystems, we developed parametric models of each subsystem using a forcing-function strategy. The results demonstrate the performance of the control system.
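
A minimal numerical illustration of glucose-insulin homeostasis after a meal is sketched below. It is a generic two-compartment toy model integrated with forward Euler; the equations, rate constants, and meal profile are assumptions for illustration and are not the authors' model or parameters.

```python
# Forward-Euler simulation of a simplified two-compartment glucose-insulin model.
def simulate(hours=8.0, dt=1.0 / 60.0):
    G, I = 90.0, 10.0            # plasma glucose (mg/dL) and insulin (uU/mL)
    Gb, Ib = 90.0, 10.0          # basal levels the system relaxes towards
    k1, k2, k3, k4 = 0.03, 0.0004, 0.09, 0.004   # hypothetical rate constants (per minute)
    trace, t = [], 0.0
    while t < hours:
        meal = 4.0 if 1.0 <= t <= 1.5 else 0.0   # glucose appearance during a meal
        dG = -k1 * (G - Gb) - k2 * I * G + meal  # clearance + insulin action + intake
        dI = -k3 * (I - Ib) + k4 * max(G - Gb, 0.0)  # insulin decay + glucose-driven secretion
        G += dG * dt * 60.0      # dt is in hours, rates are per minute
        I += dI * dt * 60.0
        trace.append((round(t, 2), round(G, 1), round(I, 1)))
        t += dt
    return trace

for t, G, I in simulate()[::60]:     # print one sample per simulated hour
    print(f"t={t:4.1f} h  G={G:6.1f} mg/dL  I={I:5.1f} uU/mL")
```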

Keywords: modeling, algorithm, regulation, glucose-insulin, blood, control system

Procedia PDF Downloads 173
19192 Machine Learning Approaches Based on Recency, Frequency, Monetary (RFM) and K-Means for Predicting Electrical Failures and Voltage Reliability in Smart Cities

Authors: Panaya Sudta, Wanchalerm Patanacharoenwong, Prachya Bumrungkun

Abstract:

With the evolution of smart grids, ensuring the reliability and efficiency of electrical systems in smart cities has become crucial. This paper proposes a distinct approach that combines advanced machine learning techniques to accurately predict electrical failures and address voltage reliability issues, aiming to improve the accuracy and efficiency of reliability evaluations in smart cities. The aim of this research is to develop a comprehensive predictive model that accurately predicts electrical failures and voltage reliability in smart cities; the model integrates RFM analysis, K-means clustering, and LSTM networks to achieve this objective. RFM analysis, traditionally used in customer value assessment, is applied to categorize and analyze electrical components based on their failure recency, frequency, and monetary impact. K-means clustering is employed to segment electrical components into distinct groups with similar characteristics and failure patterns. LSTM networks are used to capture the temporal dependencies and patterns in customer data. This integration of RFM, K-means, and LSTM results in a robust predictive tool for electrical failures and voltage reliability. The proposed model has been developed, tested, and validated on diverse electrical utility datasets. The results show a significant improvement in prediction accuracy and reliability compared to traditional methods, achieving an accuracy of 92.78% and an F1-score of 0.83. This research contributes to the proactive maintenance and optimization of electrical infrastructures in smart cities and enhances overall energy management and sustainability. The integration of advanced machine learning techniques in the predictive model demonstrates the potential for transforming the landscape of electrical system management within smart cities. The research addresses the question of how accurately electrical failures and voltage reliability can be predicted in smart cities, and it investigates the effectiveness of integrating RFM analysis, K-means clustering, and LSTM networks in achieving this goal. The proposed approach thus presents an efficient and effective solution for predicting and mitigating electrical failures and voltage issues in smart cities, significantly improving prediction accuracy and reliability compared to traditional methods and contributing to proactive maintenance, optimized electrical infrastructure, overall energy management, and sustainability.
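
As a concrete illustration of the RFM-plus-clustering stage only (the LSTM stage is omitted), the sketch below segments hypothetical components by failure recency, frequency, and monetary impact with scikit-learn's K-means; the synthetic data, feature ranges, and cluster count are assumptions, not the paper's dataset or settings.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical failure log: one row per component with
#   recency   = days since last recorded failure,
#   frequency = failures in the observation window,
#   monetary  = estimated repair / outage cost of those failures.
rng = np.random.default_rng(0)
rfm = np.column_stack([
    rng.integers(1, 365, 200),          # recency (days)
    rng.poisson(3, 200),                # frequency (count)
    rng.gamma(2.0, 500.0, 200),         # monetary impact (currency units)
]).astype(float)

scaled = StandardScaler().fit_transform(rfm)          # put R, F, M on a common scale
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(scaled)

# Inspect cluster centres in the original units to interpret each segment.
for k in range(4):
    centre = rfm[labels == k].mean(axis=0)
    print(f"cluster {k}: R={centre[0]:6.1f} d  F={centre[1]:4.1f}  M={centre[2]:8.1f}")
```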

Keywords: electrical state prediction, smart grids, data-driven method, long short-term memory, RFM, k-means, machine learning

Procedia PDF Downloads 50
19191 Regret-Regression for Multi-Armed Bandit Problem

Authors: Deyadeen Ali Alshibani

Abstract:

In the literature, the multi-armed bandit problem is treated as a statistical decision model of an agent trying to optimize its decisions while improving its information at the same time. Several algorithmic models and applications of this problem exist. In this paper, we evaluate regret-regression by comparing it with the Q-learning method. A simulation on the determination of an optimal treatment regime is presented in detail.
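
To make the regret notion concrete, the sketch below simulates a Bernoulli multi-armed bandit with a simple epsilon-greedy learner and tracks cumulative regret. This is a generic baseline for illustration, not the regret-regression or Q-learning procedures evaluated in the paper, and the arm probabilities are assumptions.

```python
import numpy as np

def run_bandit(true_means, steps=5000, eps=0.1, seed=0):
    """Epsilon-greedy play of a Bernoulli multi-armed bandit; returns cumulative regret."""
    rng = np.random.default_rng(seed)
    n_arms = len(true_means)
    counts = np.zeros(n_arms)
    values = np.zeros(n_arms)            # running mean reward per arm
    best = max(true_means)
    regret, total = [], 0.0
    for _ in range(steps):
        if rng.random() < eps:
            arm = int(rng.integers(n_arms))      # explore
        else:
            arm = int(np.argmax(values))         # exploit the current estimate
        reward = float(rng.random() < true_means[arm])
        counts[arm] += 1
        values[arm] += (reward - values[arm]) / counts[arm]
        total += best - true_means[arm]          # expected regret of this pull
        regret.append(total)
    return regret

regret = run_bandit([0.2, 0.5, 0.65])
print("cumulative regret after 5000 pulls:", round(regret[-1], 1))
```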

Keywords: optimal, bandit problem, optimization, dynamic programming

Procedia PDF Downloads 450
19190 A Palmprint Identification System Based Multi-Layer Perceptron

Authors: David P. Tantua, Abdulkader Helwan

Abstract:

Biometrics has recently been used for human identification systems based on biological traits such as fingerprints and iris scans. Biometrics-based identification systems show great efficiency and accuracy in such human identification applications. However, these systems have so far been based on image processing techniques only, which may decrease the efficiency of such applications. Thus, this paper aims to develop a human palmprint identification system using a multi-layer perceptron neural network, which has the capability to learn via a backpropagation learning algorithm. The developed system uses images obtained from a public database available on the internet (CASIA). The image processing pipeline is as follows: image filtering using a median filter, image adjustment, image skeletonizing, edge detection using the Canny operator to extract features, and removal of unwanted components of the image. The second phase feeds the processed images into a neural network classifier, which adaptively learns and creates a class for each different image. 100 different images are used for training the system. Since this is an identification system, it should be tested with the same images; therefore, the same 100 images are used for testing it, and any image outside the training set should be unrecognized. The experimental results show that the developed system has a high accuracy of 100% and can be implemented in real-life applications.
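
A compressed version of such a pipeline is sketched below: median filtering and Canny edge extraction with OpenCV feed a scikit-learn multi-layer perceptron trained by backpropagation. Synthetic line patterns stand in for the CASIA palmprint images, and the image adjustment and skeletonizing steps are omitted, so this is an illustrative sketch rather than the system described in the abstract.

```python
import cv2
import numpy as np
from sklearn.neural_network import MLPClassifier

def extract_features(img):
    """Median-filter and Canny-edge a grey-level image, then flatten the edge map."""
    img = cv2.medianBlur(img, 5)          # noise removal
    edges = cv2.Canny(img, 50, 150)       # edge map used as the feature image
    return edges.flatten() / 255.0

# Synthetic stand-ins for palmprint images: one noisy line pattern per class.
rng = np.random.default_rng(0)
images, labels = [], []
for subject in range(20):                 # 20 hypothetical subjects/classes
    base = np.zeros((64, 64), np.uint8)
    cv2.line(base, (0, 3 * subject), (63, 63 - 2 * subject), 255, 2)
    images.append(cv2.add(base, rng.integers(0, 30, (64, 64), dtype=np.uint8)))
    labels.append(subject)

X = np.array([extract_features(im) for im in images])
clf = MLPClassifier(hidden_layer_sizes=(128,), max_iter=500, random_state=0)
clf.fit(X, labels)                        # backpropagation training
print("training accuracy:", clf.score(X, labels))
```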

Keywords: biometrics, biological traits, multi-layer perceptron neural network, image skeletonizing, edge detection using canny operator

Procedia PDF Downloads 368
19189 A Generic Middleware to Instantly Sync Intensive Writes of Heterogeneous Massive Data via Internet

Authors: Haitao Yang, Zhenjiang Ruan, Fei Xu, Lanting Xia

Abstract:

Industry data centers often need to sync data changes reliably and instantly across a large number of heterogeneous, autonomous relational databases accessed via the not-so-reliable Internet, for which a practical universal sync middleware of low maintenance and operation cost is much wanted; however, developing such a product and adapting it to various scenarios is a sophisticated and continuous practice. The authors have been devising, applying, and optimizing a generic sync middleware system named GSMS since 2006, holding to the principles, or advantages, that the middleware must be SyncML-compliant and transparent to data application layer logic, need not refer to implementation details of the databases being synced, must not rely on the host operating systems deployed, and must be lightweight in construction and hence of low cost. A series of extreme-load experiments on GSMS sync performance was conducted as a persuasive example, with a source relational database that underwent a broad range of write loads, from one thousand to one million intensive writes within a few minutes. The tests proved that GSMS achieves an instant sync level of well below a fraction of a millisecond per record synced, and GSMS' smooth performance under extreme write loads also showed that it is feasible and competent.

Keywords: heterogeneous massive data, instantly sync intensive writes, Internet generic middleware design, optimization

Procedia PDF Downloads 117
19188 Optimizing Data Integration and Management Strategies for Upstream Oil and Gas Operations

Authors: Deepak Singh, Rail Kuliev

Abstract:

This paper highlights the critical importance of optimizing data integration and management strategies in the upstream oil and gas industry. With its complex and dynamic nature generating vast volumes of data, efficient data integration and management are essential for informed decision-making, cost reduction, and maximizing operational performance. Challenges such as data silos, heterogeneity, real-time data management, and data quality issues are addressed, prompting the proposal of several strategies. These strategies include implementing a centralized data repository, adopting industry-wide data standards, employing master data management (MDM), utilizing real-time data integration technologies, and ensuring data quality assurance. Training and developing the workforce, "reskilling and upskilling" employees, and establishing robust data management training programs play an essential and integral part in this strategy. The article also emphasizes the significance of data governance and best practices, as well as the role of technological advancements such as big data analytics, cloud computing, the Internet of Things (IoT), artificial intelligence (AI), and machine learning (ML). To illustrate the practicality of these strategies, real-world case studies are presented, showcasing successful implementations that improve operational efficiency and decision-making. By embracing the proposed optimization strategies, leveraging technological advancements, and adhering to best practices, upstream oil and gas companies can harness the full potential of data-driven decision-making, ultimately achieving increased profitability and a competitive edge in an ever-evolving industry.

Keywords: master data management, IoT, AI&ML, cloud Computing, data optimization

Procedia PDF Downloads 63
19187 A Multilayer Perceptron Neural Network Model Optimized by Genetic Algorithm for Significant Wave Height Prediction

Authors: Luis C. Parra

Abstract:

Significant wave height prediction is an issue of great interest in the field of coastal activities because of the non-linear behaviour of the wave height and its complexity of prediction. This study presents a machine learning model to forecast the significant wave height measured by the oceanographic wave buoys anchored at Mooloolaba, from the Queensland Government Data. Modelling was performed with a multilayer perceptron neural network optimized by a genetic algorithm (GA-MLP), with ReLU(x) as the activation function of the MLPNN. The GA is in charge of optimizing the MLPNN hyperparameters (learning rate, hidden layers, neurons, and activation functions) and of wrapper feature selection for the window width size. Results are assessed using the Mean Square Error (MSE), Root Mean Square Error (RMSE), and Mean Absolute Error (MAE). The GA-MLPNN algorithm was run with a population size of thirty individuals for eight generations for the optimization of 5-step-ahead prediction, obtaining a performance of 0.00104 MSE, 0.03222 RMSE, 0.02338 MAE, and 0.71163% MAPE. The results of the analysis suggest that the GA-MLPNN model is effective in predicting significant wave height in a one-step forecast with distant time windows, presenting 0.00014 MSE, 0.01180 RMSE, 0.00912 MAE, and 0.52500% MAPE, with a correlation factor of 0.99940. The GA-MLP algorithm was compared with the ARIMA forecasting model and performed better on all performance criteria, validating the potential of this algorithm.
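
A compact sketch of the GA-over-MLP idea is given below. It evolves only two hyperparameters (hidden units and learning rate) on a synthetic sine series with scikit-learn's MLPRegressor; the data, chromosome encoding, population size, and mutation scheme are assumptions for illustration and differ from the GA-MLP configuration reported in the paper.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

# Hypothetical stand-in for the buoy series: a noisy periodic signal turned into
# (window -> next value) supervised pairs; the window width is fixed at 12 here.
rng = np.random.default_rng(1)
series = np.sin(np.linspace(0, 60, 2000)) + 0.1 * rng.normal(size=2000)
W = 12
X = np.array([series[i:i + W] for i in range(len(series) - W)])
y = series[W:]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, shuffle=False, test_size=0.2)

def fitness(genes):
    """Negative test MSE of an MLP built from one chromosome (hidden units, learning rate)."""
    hidden, lr = int(genes[0]), float(genes[1])
    model = MLPRegressor(hidden_layer_sizes=(hidden,), learning_rate_init=lr,
                         activation="relu", max_iter=300, random_state=0)
    model.fit(X_tr, y_tr)
    return -np.mean((model.predict(X_te) - y_te) ** 2)

# A very small GA: truncation selection plus Gaussian mutation on the chromosome.
pop = np.column_stack([rng.integers(4, 64, 8), rng.uniform(1e-4, 1e-1, 8)])
for gen in range(3):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-4:]]                  # keep the best half
    children = parents + rng.normal(0, [2.0, 0.005], parents.shape)
    children[:, 0] = np.clip(children[:, 0], 4, 64)
    children[:, 1] = np.clip(children[:, 1], 1e-4, 1e-1)
    pop = np.vstack([parents, children])
best = pop[np.argmax([fitness(ind) for ind in pop])]
print("best hidden units, learning rate:", int(best[0]), round(float(best[1]), 4))
```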

Keywords: significant wave height, machine learning optimization, multilayer perceptron neural networks, evolutionary algorithms

Procedia PDF Downloads 102
19186 Reliability Improvement of Power System Networks Using Adaptive Genetic Algorithm

Authors: Alireza Alesaadi

Abstract:

Reliability analysis is a powerful method for determining the weak points of electrical networks. In electrical network design, the aim is to obtain the most reliable network with minimal system shutdown, but this is usually associated with increased cost. In this paper, using an adaptive genetic algorithm, a method is presented that provides the most reliable system at a given economic cost. Finally, the proposed method is applied to a sample network and the results are analyzed.

Keywords: reliability, adaptive genetic algorithm, electrical network, communication engineering

Procedia PDF Downloads 497
19185 The Impact of a Sustainable Solar Heating System on the Growth of Strawberry Plants in an Agricultural Greenhouse

Authors: Ilham Ihoume, Rachid Tadili, Nora Arbaoui

Abstract:

The use of solar energy is a crucial tactic in the agricultural industry's plan to decrease greenhouse gas emissions. This clean source of energy can greatly lower the sector's carbon footprint and make a significant impact in the fight against climate change. In this regard, this study examines the effects of a solar-based heating system, in a north-south oriented agricultural greenhouse, on the development of strawberry plants during winter. This system relies on the circulation of water as a heat transfer fluid in a closed circuit installed on the greenhouse roof to store heat during the day and release it inside at night. A comparative experimental study was conducted in two greenhouses, one experimental with the solar heating system and the other for control without any heating system. Both greenhouses are located on the terrace of the Solar Energy and Environment Laboratory of the Mohammed V University in Rabat, Morocco. The developed heating system consists of a copper coil inserted in double glazing and placed on the roof of the greenhouse, a water pump circulator, a battery, and a photovoltaic solar panel to power the electrical components. This inexpensive and environmentally friendly system allows the greenhouse to be heated during the winter and improves its microclimate. This improvement resulted in an increase in the air temperature inside the experimental greenhouse by 6 °C and 8 °C, and a reduction in its relative humidity by 23% and 35%, compared to the control greenhouse and the ambient air, respectively, throughout the winter. For the agronomic performance, it was observed that the production was 17 days earlier than in the control greenhouse.

Keywords: sustainability, thermal energy storage, solar energy, agriculture greenhouse

Procedia PDF Downloads 82
19184 Extreme Value Modelling of Ghana Stock Exchange Indices

Authors: Kwabena Asare, Ezekiel N. N. Nortey, Felix O. Mettle

Abstract:

Modelling of extreme events has always been of interest in fields such as hydrology and meteorology. However, after the recent global financial crises, appropriate models for such rare events, which can lead to these crises, have become quite essential in the finance and risk management fields. This paper models the extreme values of the Ghana Stock Exchange All-Share indices (2000-2010) by applying Extreme Value Theory (EVT) to fit a model to the tails of the daily stock returns data. A conditional approach to EVT was preferred, and hence an ARMA-GARCH model was fitted to the data to correct for the effects of autocorrelation and conditional heteroscedasticity present in the returns series before the EVT method was applied. The Peak Over Threshold (POT) approach of EVT, which fits a Generalized Pareto Distribution (GPD) to the excesses above a selected threshold, was employed. Maximum likelihood estimates of the model parameters were obtained, and the model's goodness of fit was assessed graphically using Q-Q, P-P, and density plots. The findings indicate that the GPD provides an adequate fit to the data of excesses. The size of extreme daily Ghanaian stock market movements was then computed using the Value at Risk (VaR) and Expected Shortfall (ES) risk measures at some high quantiles, based on the fitted GPD model.
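
The POT tail fit and the resulting risk measures can be sketched as follows. The snippet fits a Generalized Pareto Distribution to threshold excesses with SciPy and applies the standard POT formulas for VaR and ES; the simulated losses stand in for the ARMA-GARCH residuals used in the paper, so the numbers are purely illustrative.

```python
import numpy as np
from scipy.stats import genpareto

# Hypothetical daily losses (heavy-tailed), standing in for the filtered returns.
rng = np.random.default_rng(0)
losses = rng.standard_t(df=4, size=2500) * 0.01

u = np.quantile(losses, 0.95)                     # threshold for the POT approach
excesses = losses[losses > u] - u
xi, _, beta = genpareto.fit(excesses, floc=0)     # GPD shape (xi) and scale (beta)

def var_es(p):
    """VaR and ES at confidence level p from the fitted GPD tail (standard POT formulas)."""
    n, n_u = len(losses), len(excesses)
    var = u + (beta / xi) * ((n / n_u * (1 - p)) ** (-xi) - 1)
    es = (var + beta - xi * u) / (1 - xi)         # valid for xi < 1
    return var, es

for p in (0.99, 0.999):
    var, es = var_es(p)
    print(f"p={p}: VaR={var:.4f}  ES={es:.4f}")
```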

Keywords: extreme value theory, expected shortfall, generalized pareto distribution, peak over threshold, value at risk

Procedia PDF Downloads 550
19183 Analysing Maximum Power Point Tracking in a Stand Alone Photovoltaic System

Authors: Osamede Asowata

Abstract:

Optimized gain with respect to output power of stand-alone photovoltaic (PV) systems has been one of the major focuses of PV research in recent times, owing to its low carbon emissions and efficiency. Power failures or outages from commercial providers generally do not promote development in the public and private sectors; they essentially limit the development of industries. A well-structured PV system is therefore important for an efficient and cost-effective monitoring system. The purpose of this paper is to validate the maximum power point of an off-grid PV system, taking into consideration the most effective tilt and orientation angles for PV panels in the southern hemisphere. The paper is based on analyzing the system using a solar charger with maximum power point tracking (MPPT) from a pulse width modulation (PWM) perspective. The power conditioning device chosen is a solar charger with MPPT. The practical setup consists of a PV panel set to an orientation angle of 0°N, with corresponding tilt angles of 36°, 26°, and 16°. Preliminary results include a regression analysis (normal probability plot) showing the maximum power point of the system as well as the best tilt angle for maximum power point tracking.
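
For illustration, the maximum power point itself can be located on a simplified PV curve as sketched below; the exponential I-V shape and the panel parameters are assumptions, not measurements from the experimental setup.

```python
import numpy as np

def pv_power(v, i_sc=8.5, v_oc=37.0, a=1.2):
    """Very simplified PV I-V curve (exponential knee) and the resulting power P = V*I."""
    i = i_sc * (1.0 - np.exp((v - v_oc) / a))
    return np.clip(i, 0.0, None) * v

voltages = np.linspace(0.0, 37.0, 500)
powers = pv_power(voltages)
k = int(np.argmax(powers))
print(f"MPP ~ {voltages[k]:.1f} V, {powers[k]:.0f} W")   # the operating point an MPPT charger seeks
```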

Keywords: poly-crystalline PV panels, solar chargers, tilt and orientation angles, maximum power point tracking (MPPT), pulse width modulation (PWM)

Procedia PDF Downloads 169
19182 Impacts of Land Use and Land Cover Change on Stream Flow and Sediment Yield of Genale Dawa Dam III Watershed, Ethiopia

Authors: Aklilu Getahun Sulito

Abstract:

Land use and land cover (LULC) change dynamics result from complex interactions between several biophysical and socio-economic conditions. The impacts of land cover change on stream flow and sediment yield were analyzed statistically using the hydrological model SWAT. The Genale Dawa Dam III watershed is highly affected by deforestation, overgrazing, and agricultural land expansion. This study aimed to use the SWAT model to assess the impacts of land use and land cover change on sediment yield, to evaluate stream flow in the wet and dry seasons, and to map the spatial distribution of sediment yield from the sub-basins of the Genale Dawa Dam III watershed. Land use land cover maps of 2000, 2008, and 2016 were used with the corresponding climate data. During the study period, most of the forest, dense evergreen forest, and grassland changed to cultivated land. Cultivated land increased by 26.2%, while forest land, evergreen forest land, and grassland decreased by 21.33%, 11.59%, and 7.28%, respectively; consequently, the mean annual sediment yield of the watershed increased by 7.37 ton/ha over the 16-year period (2000-2016). The analysis of stream flow for the wet and dry seasons showed that stream flow increased by 25.5% during the wet season but decreased by 29.6% in the dry season. The average annual spatial distribution of sediment yield increased by 7.73 ton/ha/yr from 2000 to 2016. The calibration results for both stream flow and sediment yield showed good agreement between observed and simulated data, with coefficients of determination of 0.87 and 0.84, Nash-Sutcliffe efficiencies of 0.83 and 0.78, and percentage biases of -7.39% and -10.90%, respectively. The validation results for both stream flow and sediment were also good, with coefficients of determination of 0.83 and 0.80, Nash-Sutcliffe efficiencies of 0.78 and 0.75, and percentage biases of 7.09% and 3.95%. The result obtained from the model based on the above method is that the mean annual sediment load in the Genale Dawa Dam III watershed increased from 2000 to 2016 because of the land use change. Therefore, to make use of Genale Dawa Dam III, land use management practices are needed in the future to prevent a further increase of the sediment yield of the watershed.
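
The calibration statistics quoted above can be computed as in the sketch below; the observed and simulated flow values are hypothetical placeholders, not the Genale Dawa series, and the PBIAS sign convention follows the usual SWAT usage (positive means underestimation).

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 is no better than the mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def pbias(obs, sim):
    """Percentage bias: positive values indicate model underestimation."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 100.0 * np.sum(obs - sim) / np.sum(obs)

def r2(obs, sim):
    """Coefficient of determination as the squared Pearson correlation."""
    return float(np.corrcoef(obs, sim)[0, 1] ** 2)

# Hypothetical monthly flows (m^3/s), for illustration only.
obs = np.array([12.0, 30.0, 55.0, 80.0, 60.0, 25.0, 10.0, 8.0])
sim = np.array([14.0, 28.0, 50.0, 84.0, 58.0, 27.0, 12.0, 9.0])
print(f"NSE={nse(obs, sim):.2f}  PBIAS={pbias(obs, sim):.1f}%  R2={r2(obs, sim):.2f}")
```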

Keywords: Genale Dawa Dam III watershed, land use land cover change, SWAT, spatial distribution, sediment yield, stream flow

Procedia PDF Downloads 48
19181 An A-Star Approach for the Quickest Path Problem with Time Windows

Authors: Christofas Stergianos, Jason Atkin, Herve Morvan

Abstract:

As air traffic increases, more airports are interested in utilizing optimization methods. Many processes happen in parallel at an airport, and complex models are needed in order to have a reliable solution that can be implemented for ground movement operations. The ground movement of aircraft in an airport, allocating a path to each aircraft to follow in order to reach its destination (e.g. runway or gate), is one process that could be optimized. The Quickest Path Problem with Time Windows (QPPTW) algorithm has been developed to provide conflict-free routing of vehicles and has been applied to routing aircraft around an airport. It was subsequently modified to increase the accuracy for airport applications. These modifications take into consideration specific characteristics of the problem, such as: the pushback process, which considers the extra time that is needed for pushing back an aircraft and turning its engines on; stand holding, where any waiting should be allocated to the stand; and runway sequencing, where the sequence of the aircraft that take off is optimized and has to be respected. QPPTW involves searching for the quickest path by expanding the search in all directions, similarly to Dijkstra's algorithm. Finding a way to direct the expansion can potentially assist the search and achieve better performance. We have further modified the QPPTW algorithm to use a heuristic approach in order to guide the search. This new algorithm is based on the A-star search method but estimates the remaining time (instead of distance) in order to assess how far the target is. It is important to consider the remaining time needed to reach the target, so that delays that are caused by other aircraft can be part of the optimization method. All of the other characteristics are still considered, and time windows are still used in order to route multiple aircraft rather than a single aircraft. In this way, the quickest path is found for each aircraft while taking into account the movements of the previously routed aircraft. After running experiments using a week of real aircraft data from Zurich Airport, the new algorithm (A-star QPPTW) was found to route aircraft much more quickly, being especially fast in routing the departing aircraft, where pushback delays are significant. On average, A-star QPPTW could route a full day (755 to 837 aircraft movements) 56% faster than the original algorithm. In total, the routing of a full week of aircraft took only 12 seconds with the new algorithm, 15 seconds faster than the original algorithm. For real-time application, the algorithm needs to be very fast, and this speed increase will allow us to add additional features and complexity, allowing further integration with other processes in airports and leading to more optimized and environmentally friendly airports.
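
A minimal, generic A* sketch with a time-based heuristic is shown below. It omits time windows, pushback, stand holding, and runway sequencing, and the taxiway fragment and remaining-time estimates are hypothetical, so it only illustrates the guided-expansion idea rather than the A-star QPPTW algorithm itself.

```python
import heapq

def a_star_time(graph, h_time, start, goal):
    """A* over a graph whose edge weights are taxi times (s); h_time gives a
    lower-bound estimate of the remaining time from each node to the goal."""
    open_heap = [(h_time[start], 0.0, start, [start])]
    best_g = {start: 0.0}
    while open_heap:
        f, g, node, path = heapq.heappop(open_heap)
        if node == goal:
            return g, path
        for nxt, t in graph.get(node, []):
            ng = g + t
            if ng < best_g.get(nxt, float("inf")):
                best_g[nxt] = ng
                heapq.heappush(open_heap, (ng + h_time[nxt], ng, nxt, path + [nxt]))
    return None

# Hypothetical taxiway fragment: stand -> A -> B -> runway, plus a slower detour via C.
taxiways = {
    "stand": [("A", 60), ("C", 45)],
    "A": [("B", 90)],
    "B": [("runway", 30)],
    "C": [("B", 150)],
    "runway": [],
}
h = {"stand": 150, "A": 110, "B": 30, "C": 140, "runway": 0}   # admissible time estimates
print(a_star_time(taxiways, h, "stand", "runway"))
```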

Keywords: a-star search, airport operations, ground movement optimization, routing and scheduling

Procedia PDF Downloads 225
19180 Study of Ion Density Distribution and Sheath Thickness in Warm Electronegative Plasma

Authors: Rajat Dhawan, Hitendra K. Malik

Abstract:

Electronegative plasmas, comprising electrons, positive ions, and negative ions, are advantageous for their expanding industrial applications. In plasma cleaning, plasma etching, and plasma deposition processes, electronegative plasmas are preferred because a relatively lower potential develops on the surface of the material under investigation. The presence of negative ions also avoids irregularity in etching shapes and enhances material processing during fabrication. Understanding the interaction of a metallic conducting surface with the plasma is therefore essential for these applications. A metallic conducting probe immersed in a plasma results in the formation of a thin layer of charged species around the probe, called a sheath. The density of the ions embedded on the surface of the material and the sheath thickness are important parameters for the surface-plasma interaction. The sheath thickness indicates the extent of the plasma region affected by the conducting surface/probe. Knowledge of the ion density in the sheath region is advantageous in plasma nitriding, and the ion temperature is equally important as it strongly influences the thickness of the modified layer during surface-plasma interaction. In the present work, we consider a negatively biased metallic probe immersed in a warm electronegative plasma. For this system, we adopt the continuity and momentum transfer equations for both the positive and negative ions, whereas the electrons are described by the Boltzmann distribution; finally, we use Poisson's equation. Spherical geometry is assumed for a small probe radius. Poisson's equation, together with the continuity and momentum transfer equations and proper boundary conditions, reveals the behaviour of the potential surrounding the conducting metallic probe; in turn, it provides the density profiles of the charged species and, most importantly, the thickness of the sheath. All calculations are carried out keeping in mind the well-known Bohm sheath criterion. We found that the positive ion density decreases with an increase in positive ion temperature, whereas it increases with higher negative ion temperature. The positive ion density decreases as we move away from the centre of the probe and shows a discontinuity at a particular distance from the centre of the probe. The distance where the discontinuity occurs is designated as the sheath edge, i.e., the point where the sheath ends. These results are beneficial for industrial applications, as the density of ions embedded on a material surface is strongly affected by the temperature of the plasma species, which has a drastic influence on the surface properties of the material, e.g., hardness and corrosion resistance.
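
For reference, one standard way to write the governing equations of such a model (steady-state fluid ions, Boltzmann electrons, and Poisson's equation in spherical geometry) is given below; the exact normalization and source terms used in the paper may differ.

```latex
% Continuity and momentum for each ion species j (positive or negative),
% Boltzmann electrons, and Poisson's equation in spherical geometry around the probe.
\frac{1}{r^{2}}\frac{d}{dr}\!\left(r^{2} n_{j} u_{j}\right) = 0,
\qquad
m_{j} n_{j} u_{j}\frac{d u_{j}}{dr}
  = -q_{j} n_{j}\frac{d\phi}{dr} - k_{B} T_{j}\frac{d n_{j}}{dr},
\qquad
n_{e} = n_{e0}\exp\!\left(\frac{e\phi}{k_{B} T_{e}}\right),
\qquad
\frac{1}{r^{2}}\frac{d}{dr}\!\left(r^{2}\frac{d\phi}{dr}\right)
  = -\frac{e}{\varepsilon_{0}}\left(n_{+} - n_{-} - n_{e}\right).
```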

Keywords: electronegative plasmas, plasma surface interaction positive ion density, sheath thickness

Procedia PDF Downloads 127
19179 Web Map Service for Fragmentary Rockfall Inventory

Authors: M. Amparo Nunez-Andres, Nieves Lantada

Abstract:

One of the most harmful geological risks is rockfall. Rockfalls cause both economic losses, through damage to buildings and infrastructure, and personal ones. Therefore, in order to estimate the risk to the exposed elements, it is necessary to know the mechanism of this kind of event, from the characteristics of the rock walls to the propagation of the fragments generated by the initially detached rock mass. In the framework of the RockModels research project, several inventories of rockfalls were carried out along the northeast of the Spanish peninsula and on the island of Mallorca. These inventories contain general information about the events and, importantly, detailed information about fragmentation. Specifically, the IBSD (In-situ Block Size Distribution) is obtained by photogrammetry from drone imagery or TLS (Terrestrial Laser Scanner), and the RBSD (Rock Block Size Distribution) from the volumes of the fragments in the deposit measured by hand. In order to share all this information with other scientists, engineers, members of civil protection, and stakeholders, a platform accessible from the internet and compliant with interoperability standards is necessary. Throughout the process, open-source software has been used: PostGIS 2.1, GeoServer, and the OpenLayers library. In the first step, a spatial database was implemented to manage all the information, following the INSPIRE data specifications for natural risks and adding specific, detailed data about the fragmentation distribution. The next step was to develop a WMS with GeoServer. A preceding phase was the creation of several views in PostGIS to show the information at different visualization scales and with different degrees of detail. In the first view, the sites are identified with a point, and basic information about the rockfall event is provided. At the next zoom level, at medium scale, the convex hull of the rockfall appears with its real shape, and the source of the event and the fragments are represented by symbols; the queries at this level offer greater detail about the movement. Finally, the third level shows all elements: deposit, source, and blocks, at their real size where possible and in their real locations. The last task was the publication of all the information on a web mapping site (www.rockdb.upc.edu), with data classified by levels, using JavaScript libraries such as OpenLayers.

Keywords: geological risk, web mapping, WMS, rockfalls

Procedia PDF Downloads 157
19178 Texture Identification Using Vision System: A Method to Predict Functionality of a Component

Authors: Varsha Singh, Shraddha Prajapati, M. B. Kiran

Abstract:

Texture identification is useful in predicting the functionality of a component. Many of the existing texture identification methods are contact methods, which limits their measuring speed. These contact measurement techniques use a diamond stylus; the stylus, being sharp, is liable to damage the surface under inspection, and hence these techniques can only be used for statistical sampling. Though these contact methods are very accurate, they do not give complete information for full characterization of the surface. In this context, the presented method assumes special significance. The method uses a relatively low-cost vision system for image acquisition. Software based on the wavelet transform was developed for analyzing texture images. Specimens were made using different manufacturing processes (shaping, grinding, milling, etc.). During experimentation, the specimens were illuminated using proper lighting, and texture images were captured using a CCD camera connected to the vision system. The software installed in the vision system processes these images and subsequently identifies the texture produced by the manufacturing processes.
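
The wavelet-based feature extraction step can be sketched as below, using sub-band energies from a 2-D wavelet decomposition (PyWavelets) as texture descriptors; the synthetic images, wavelet family, and decomposition depth are assumptions for illustration, not the configuration used in the study.

```python
import numpy as np
import pywt

def wavelet_texture_features(image, wavelet="db2", levels=2):
    """Energy of each wavelet detail sub-band, a common texture descriptor."""
    coeffs = pywt.wavedec2(image, wavelet, level=levels)
    features = []
    for detail in coeffs[1:]:                      # (cH, cV, cD) at each level
        for band in detail:
            features.append(float(np.mean(band ** 2)))
    return np.array(features)

# Two hypothetical 64x64 grey-level textures: a smooth gradient vs. a noisy surface.
rng = np.random.default_rng(0)
smooth = np.tile(np.linspace(0, 1, 64), (64, 1))
rough = smooth + 0.3 * rng.normal(size=(64, 64))
print("smooth:", np.round(wavelet_texture_features(smooth), 4))
print("rough: ", np.round(wavelet_texture_features(rough), 4))
```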

Keywords: diamond stylus, manufacturing process, texture identification, vision system

Procedia PDF Downloads 283
19177 Fuzzy Availability Analysis of a Battery Production System

Authors: Merve Uzuner Sahin, Kumru D. Atalay, Berna Dengiz

Abstract:

In today's competitive market, there are many alternative products that can be used in a similar manner and for a similar purpose. Therefore, the utility of a product is an important issue for the preferability of a brand. This utility can be measured in terms of functionality, durability, and reliability, all of which are affected by the system's capabilities. Reliability is an important system design criterion for manufacturers aiming at high availability. Availability is the probability that a system (or a component) is operating properly, performing its function, at a specific point in time or over a specific period of time. System availability provides valuable input for estimating the production rate the company needs to realize its production plan. When considering only the corrective maintenance downtime of the system, the mean time between failures (MTBF) and the mean time to repair (MTTR) are used to obtain the system availability. The MTBF and MTTR values are also important measures for reliability engineers and practitioners working on a system, who can improve system performance by adopting suitable maintenance strategies. The failure and repair time probability distributions of each component in the system must be known for a conventional availability analysis. Generally, however, companies do not have statistics or quality control departments that store such a large amount of data, and real events or situations are described deterministically instead of with the stochastic data needed for a complete description of real systems. Fuzzy set theory is an alternative theory used to analyze uncertainty and vagueness in real systems. The aim of this study is to present a novel approach to computing system availability with MTBF and MTTR represented as fuzzy numbers. Based on experience with the system, three different spreads of MTBF and MTTR (15%, 20%, and 25%) were chosen to obtain the lower and upper limits of the fuzzy numbers. To the best of our knowledge, the proposed method is the first application that uses fuzzy MTBF and fuzzy MTTR for fuzzy system availability estimation. The method is easy for practitioners in industry to apply to any repairable production system, and it enables reliability engineers, managers, and practitioners to analyze system performance in a more consistent and logical manner based on fuzzy availability. This paper presents a real case study of a repairable multi-stage production line in a lead-acid battery factory in Turkey, focusing on the wet-charging battery process, which has a higher production level than the other battery types. In this system, components can exist in only two states, working or failed, and it is assumed that when a component fails, it becomes as good as new after repair. Instead of classical methods, using fuzzy set theory and obtaining intervals for these measures is very useful for system managers and practitioners in analyzing system qualifications and finding better results for their working conditions. Thus, much more detailed information about the system characteristics is obtained.
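
The core fuzzy-availability computation can be sketched directly from triangular fuzzy MTBF and MTTR, as below; the MTBF/MTTR values and the 20% spread are hypothetical, and the interval bounds exploit the monotonicity of A = MTBF/(MTBF+MTTR) rather than any particular defuzzification used in the paper.

```python
def tfn(center, spread):
    """Triangular fuzzy number (low, mode, high) from a centre and a relative spread."""
    return (center * (1 - spread), center, center * (1 + spread))

def fuzzy_availability(mtbf, mttr):
    """Availability A = MTBF / (MTBF + MTTR) propagated through TFNs.
    A increases with MTBF and decreases with MTTR, so the lower bound pairs the
    smallest MTBF with the largest MTTR, and vice versa for the upper bound."""
    low = mtbf[0] / (mtbf[0] + mttr[2])
    mode = mtbf[1] / (mtbf[1] + mttr[1])
    high = mtbf[2] / (mtbf[2] + mttr[0])
    return (low, mode, high)

# Hypothetical station data: MTBF = 120 h, MTTR = 4 h, each with a 20% spread.
mtbf = tfn(120.0, 0.20)
mttr = tfn(4.0, 0.20)
print("fuzzy availability (low, mode, high):",
      tuple(round(a, 4) for a in fuzzy_availability(mtbf, mttr)))
```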

Keywords: availability analysis, battery production system, fuzzy sets, triangular fuzzy numbers (TFNs)

Procedia PDF Downloads 221
19176 Effect of Cap and Trade Policies for Carbon Emission Reduction on Delhi Households

Authors: Vikram Singh

Abstract:

This paper takes into account carbon tax and cap-and-trade legislation to manage Delhi's carbon emissions under a post-Kyoto treaty. It estimates the influence of carbon taxes and rebate/compensation costs at the household level. Three possible scenarios help to compare a straightforward compensation/rebate with two clearly progressive formulas. The straightforward compensation essentially minimizes the regressive burden of the cost. On the other hand, both progressive formulas generate extra revenue, which helps make more efficient vehicles, appliances, and buildings feasible for low-income households. For the hypothetical case of a carbon price of $40/tonne, low-income households in urban and rural regions could experience a price burden of up to 5% and 9% of their income, compared to 3% and 7% for high-income households, respectively. The survey also showed that carbon emissions from low-income households arise primarily from essential requirements such as housing and transportation, whereas almost 40% of the emissions from high-income households come from luxury and non-essential items. An equal distribution of revenue and incentives will not completely offset high-income households' spending on non-essential items; it will merely encourage them to invest their income in energy-efficient and less carbon-intensive items. Therefore, distributing the rebate on a per capita basis instead of per household will provide greater benefit, especially to large, low-income families.

Keywords: household emission, carbon credit, carbon intensity, greenhouse gas emission, carbon generation based incentives

Procedia PDF Downloads 430
19175 Creating a Virtual Perception for Upper Limb Rehabilitation

Authors: Nina Robson, Kenneth John Faller II, Vishalkumar Ahir, Arthur Ricardo Deps Miguel Ferreira, John Buchanan, Amarnath Banerjee

Abstract:

This paper describes the development of a virtual-reality system ARWED, which will be used in physical rehabilitation of patients with reduced upper extremity mobility to increase limb Active Range of Motion (AROM). The ARWED system performs a symmetric reflection and real-time mapping of the patient’s healthy limb on to their most affected limb, tapping into the mirror neuron system and facilitating the initial learning phase. Using the ARWED, future experiments will test the extension of the action-observation priming effect linked to the mirror-neuron system on healthy subjects and then stroke patients.

Keywords: physical rehabilitation, mirror neuron, virtual reality, stroke therapy

Procedia PDF Downloads 426
19174 Distribution of HLA-DQA1 and HLA-DQB1 Alleles in Thais: Genetics Database Insight for COVID-19 Severity

Authors: Jinu Phonamontham

Abstract:

COVID-19 is a disease caused by the SARS-CoV-2 coronavirus. The pandemic caused over 10 million cases and 500,000 deaths worldwide through the end of June 2020. In a previous study, the HLA-DQA1*01:02 allele was associated with COVID-19 disease (p-value = 0.0121). Furthermore, there was a statistically significant association between HLA-DQB1*06:02 and COVID-19 in the Italian population after Bonferroni correction (p-value = 0.0016). Nevertheless, there are no data describing the distribution of HLA alleles as a valid marker for prediction of COVID-19 in the Thai population. We therefore investigated the prevalence of the HLA-DQA1*01:02 and HLA-DQB1*06:02 alleles, which are associated with severe COVID-19, in the Thai population. In this study, we recruited 200 healthy Thai individuals. Genomic DNA samples were isolated from EDTA blood using a Genomic DNA Mini Kit. HLA genotyping was conducted using the Lifecodes HLA SSO typing kits (Immucor, West Avenue, Stamford, USA). The most frequent HLA-DQA1 alleles in the Thai population were HLA-DQA1*01:01 (27.75%), HLA-DQA1*01:02 (24.50%), HLA-DQA1*03:03 (13.00%), HLA-DQA1*06:01 (10.25%), and HLA-DQA1*02:01 (6.75%). The most frequent HLA-DQB1 alleles were HLA-DQB1*05:02 (21.50%), HLA-DQB1*03:01 (15.75%), HLA-DQB1*05:01 (14.50%), HLA-DQB1*03:03 (11.00%), and HLA-DQB1*02:02 (8.25%). In particular, the HLA-DQA1*01:02 allele had its highest frequency (29.00%) in the North-East group, but there was no significant difference compared with the other regions of Thailand (p-value = 0.4202). The HLA-DQB1*06:02 allele was similarly distributed in the Thai population, with no significant difference between Thais and the Chinese (3.8%), South Korean (6.4%), and Japanese (8.2%) populations (p-value > 0.05), whereas the South African population (15.7%) differed significantly from Thais (p-value = 0.0013). This study supports specific genotyping of the HLA-DQA1*01:02 and HLA-DQB1*06:02 alleles to screen for severe COVID-19 in the Thai and other populations.

Keywords: HLA-DQA1*01:02, HLA-DQB1*06:02, Asian, Thai population

Procedia PDF Downloads 89