Search results for: optimization/inverse mapping
1745 Mapping the Quotidian Life of Practitioners of Various Religious Sects in Late Medieval Bengal: Portrayals on the Front Façades of the Baranagar Temple Cluster
Authors: I. Gupta, B. Karmakar
Abstract:
Bengal has a long history (8th century A.D. onwards) of decorating the walls of brick-built temples with carved terracotta plaques on a diverse range of subjects. These can be considered one of the most significant visual archives for understanding the various facets of the societies of the time. The temples under focus include the Char-bangla temple complex (circa 1755 A.D.), the Bhavanishvara temple (circa 1755 A.D.) and the Gangeshvara Shiva Jor-bangla temple (circa 1753 A.D.), located within a part of the river Bhagirathi basin in Baranagar, Murshidabad, West Bengal, India. Though a diverse range of subjects has been intricately carved, mainly on the front façades of the Baranagar temple cluster, the study specifically concentrates on depictions of religious and non-religious acts performed by practitioners of various religious sects of late medieval Bengal, with the intention of acquiring knowledge about the various facets of their life. Apart from this, the paper also maps the spatial location of these religious performers on the temples’ façades to examine whether any systematic plan or arrangement had been employed to connote a particular idea. Further, an attempt is made to provide a commentary on the attire worn by followers of various religious sects of late medieval Bengal. The primary materials for the study comprise the depictions of religious activities carved on the terracotta plaques. The secondary material has been collected from published and unpublished theses, journals and books. These data have been further supplemented with photographic documentation, useful line drawings and descriptions in table format to give a clear understanding of the issues concerned.
Keywords: attire, scheme of allocation, terracotta temple, various religious sects
Procedia PDF Downloads 138
1744 Collective Potential: A Network of Acupuncture Interventions for Flood Resilience
Authors: Sachini Wickramanayaka
Abstract:
The occurrence of natural disasters has increased at an alarming rate in recent times due to the escalating effects of climate change. One such natural disaster that has continued to grow in frequency and intensity is ‘flooding’, which adversely affects communities around the globe. This is an exploration of how architecture can intervene and help preserve communities in the face of disaster, specifically in battling floods. ‘Resilience’ is one of the concepts that have been brought forward to be instilled in vulnerable communities to lower the impact of such disasters, as a preventative and coping mechanism. While there are a number of ways to achieve resilience in the built environment, this paper aims to create a synthesis between resilience and ‘urban acupuncture’. It considers strengthening communities from within, by layering a network of relatively small-scale, fast-paced interventions on pre-existing conventional large-scale flood-prevention engineering infrastructure. By investigating ‘The Woodlands’, a planned neighborhood, as a case study, this paper argues that large-scale water management solutions, while extremely important, will not suffice as a single solution, particularly during a time of frequent and extreme weather events. The different projects try to synthesize non-architectural aspects such as neighborhood aspirations, requirements, potential and awareness into a network of architectural forms that would collectively increase neighborhood resilience to floods. A mapping study of the selected study area identifies the problematic areas that flood in the neighborhood, while empirical data from previously implemented case studies assess the success of each solution. If successful, the different solutions for each of the identified problem areas will exhibit how flooding and water management can be integrated as part and parcel of daily life.
Keywords: acupuncture, architecture, resiliency, micro-interventions, neighborhood
Procedia PDF Downloads 170
1743 Mapping Social and Natural Hazards: A Survey of Potential for Managed Retreat in the United States
Authors: Karim Ahmed
Abstract:
The purpose of this study was to investigate how factoring in the impact of natural disasters beyond flooding would affect managed retreat policy eligibility in the United States. For the study design, a correlation analysis method compared weighted measures of flooding and other natural disasters (e.g., wildfires, tornadoes, heatwaves, etc.) to CBSA-populated areas, the prevalence of cropland, and relative poverty at the county level. The study found that the vast majority of CBSAs eligible for managed retreat programs under a policy inclusive of non-flooding events would have already been covered by flood-only managed retreat policies. However, it is noteworthy that a majority of those counties that are not covered by a flood-only managed retreat policy have high rates of poverty and are heavily populated and/or agriculturally active. The correlation is particularly strong between counties that are subject to multiple natural hazards and those that have both high rates of relative poverty and cropland prevalence. There is currently no managed retreat policy for agricultural land in the United States despite the environmental implications and food supply chain vulnerabilities related to at-risk cropland. The findings of this study suggest both that such a policy should be created and, when it is, that special attention should be paid to non-flood natural disasters affecting agricultural areas. These findings also reveal that, while current flood-based policies in the United States serve many areas that do need access to managed retreat funding and implementation, other vulnerable areas are overlooked by this approach. These areas are often deeply impoverished and are therefore particularly vulnerable to natural disasters; if and when those disasters do occur, these areas are often less financially prepared to recover or retreat from the disaster’s advance and, due to the limitations of the current policies discussed above, are less able to take the precautionary measures necessary to mitigate their risk.
Keywords: flood, hazard, land use, managed retreat, wildfire
Procedia PDF Downloads 126
1742 Analysis of Dynamics Underlying the Observation Time Series by Using a Singular Spectrum Approach
Authors: O. Delage, H. Bencherif, T. Portafaix, A. Bourdier
Abstract:
The main purpose of time series analysis is to learn about the dynamics behind some time-ordered measurement data. Two approaches are used in the literature to gain a better knowledge of the dynamics contained in observation data sequences. The first of these approaches concerns time series decomposition, an important analysis step allowing patterns and behaviors to be extracted as components, providing insight into the mechanisms producing the time series. In many cases, time series are short, noisy, and non-stationary. To provide components which are physically meaningful, methods such as Empirical Mode Decomposition (EMD), the Empirical Wavelet Transform (EWT) or, more recently, Empirical Adaptive Wavelet Decomposition (EAWD) have been proposed. The second approach is to reconstruct the dynamics underlying the time series as a trajectory in state space by mapping the time series into a set of Rᵐ lag vectors using the method of delays (MOD). Takens proved that the trajectory obtained with the MOD technique is equivalent to the trajectory representing the dynamics behind the original time series. This work introduces singular spectrum decomposition (SSD), a new adaptive method for decomposing non-linear and non-stationary time series into narrow-banded components. This method takes its origin from singular spectrum analysis (SSA), a nonparametric spectral estimation method used for the analysis and prediction of time series. As the first step of SSD is to constitute a trajectory matrix by embedding a one-dimensional time series into a set of lagged vectors, SSD can also be seen as a reconstruction method like MOD. We first give a brief overview of the existing decomposition methods (EMD, EWT, EAWD). The SSD method is then described in detail and applied to experimental time series of observations resulting from total-column ozone measurements. The results obtained are compared with those provided by the previously mentioned decomposition methods. We also compare the reconstruction qualities of the observed dynamics obtained from the SSD and MOD methods.
Keywords: time series analysis, adaptive time series decomposition, wavelet, phase space reconstruction, singular spectrum analysis
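As a concrete illustration of the embedding and decomposition steps described above, the following is a minimal SSA-style sketch in Python (not the authors' SSD implementation): it builds the trajectory matrix from lagged vectors, decomposes it by SVD, and reconstructs components by anti-diagonal averaging. The window length and the toy series are assumptions.

```python
# Minimal SSA-style sketch, assuming a window length L chosen by the user.
import numpy as np

def trajectory_matrix(x, L):
    """Embed series x into lagged vectors of length L (method of delays)."""
    N = len(x)
    K = N - L + 1
    return np.column_stack([x[i:i + L] for i in range(K)])  # shape (L, K)

def reconstruct(Xi):
    """Anti-diagonal averaging of an elementary matrix back into a series."""
    L, K = Xi.shape
    N = L + K - 1
    series = np.zeros(N)
    counts = np.zeros(N)
    for i in range(L):
        for j in range(K):
            series[i + j] += Xi[i, j]
            counts[i + j] += 1
    return series / counts

def ssa_components(x, L, n_components=3):
    X = trajectory_matrix(x, L)
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    comps = []
    for k in range(n_components):
        Xk = s[k] * np.outer(U[:, k], Vt[k])   # rank-1 elementary matrix
        comps.append(reconstruct(Xk))
    return comps

# Toy example: trend + oscillation + noise, loosely mimicking a noisy geophysical record.
t = np.arange(500)
x = 0.01 * t + np.sin(2 * np.pi * t / 50) + 0.3 * np.random.randn(500)
components = ssa_components(x, L=100, n_components=3)
```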
Procedia PDF Downloads 104
1741 A Simulative Approach for JIT Parts-Feeding Policies
Authors: Zhou BingHai, Fradet Victor
Abstract:
Lean philosophy follows the simple principle of “creating more value with fewer resources”. In accordance with this policy, material handling can be managed by means of Kanban, which, by triggering a feeding tour only when needed, regulates the flow of material in one of the most efficient ways. This paper focuses on the Kanban supermarket’s parameters and their optimization from a purely cost-based point of view. The number and size of forklifts, as well as the size of the containers they carry, are the variables of a cost function which includes handling costs, inventory costs and also shortage costs. With an innovative computational approach encoded in the industrial engineering software Tecnomatix and reproducing real-life conditions, a fictitious assembly line is established and produces a random list of orders. Multiple scenarios are then run to study the impact of each parameter change and the cost variation it implies. Lastly, the financially best-performing scenarios are selected.
Keywords: Kanban, supermarket, parts-feeding policies, multi-scenario simulation, assembly line
Procedia PDF Downloads 195
1740 Optimization of Ethanol Extract of Gotu Kola and Majapahit Composition as Natural Antioxidant Source
Authors: Mustofa Ahda, Fiqri Rozi, Gina Noor Habibah, Mas Ulfah Lestari, Tomy Hardianto, Yuni Andriani
Abstract:
The development of natural antioxidants from Centella asiatica and Majapahit holds great potential. This research optimized the composition of ethanol extracts of Centella asiatica and Majapahit leaves as an antioxidant source by measuring DPPH free radical scavenging activity. The results showed that both the ethanol extract of Centella asiatica and that of Majapahit leaves have a total phenol content, shown by their ability to reduce the Folin-Ciocalteu reagent to a blue colour. The optimized composition of Centella asiatica extract to Majapahit leaf extract of 30:70 has the best DPPH free radical scavenging activity compared with either ethanol extract alone. The IC50 value for the 30:70 composition of Centella asiatica extract to Majapahit leaf extract is 0.103 mg/mL.
Keywords: antioxidant activity, Centella asiatica, Crescentia cujete, extract composition
Procedia PDF Downloads 329
1739 Stock Prediction and Portfolio Optimization Thesis
Authors: Deniz Peksen
Abstract:
This thesis aims to predict the trend movement of the closing price of stocks and to maximize portfolio returns by utilizing the predictions. In this context, the study aims to define a stock portfolio strategy from models created using Logistic Regression, Gradient Boosting and Random Forest. Recently, predicting the trend of stock prices has gained a significant role in making buy and sell decisions and in generating returns with investment strategies based on machine learning. There are plenty of studies in the literature on the prediction of stock prices in capital markets using machine learning methods, but most of them focus on closing prices instead of the direction of the price trend. Our study differs from the literature in terms of target definition: ours is a classification problem focusing on the market trend over the next 20 trading days. To predict trend direction, fourteen years of data were used for training, the following three years were used for validation, and the last three years were used for testing. Training data are between 2002-06-18 and 2016-12-30, validation data are between 2017-01-02 and 2019-12-31, and testing data are between 2020-01-02 and 2022-03-17. We define the Hold Stock Portfolio, the Best Stock Portfolio and the USD-TRY exchange rate as benchmarks which we should outperform. We compared the return of our machine-learning-based portfolio on test data with the returns of the Hold Stock Portfolio, the Best Stock Portfolio and the USD-TRY exchange rate. We assessed model performance with the help of ROC-AUC scores and lift charts. We use Logistic Regression, Gradient Boosting and Random Forest with a grid search approach to fine-tune hyper-parameters. As a result of the empirical study, the existence of uptrends and downtrends in five stocks could not be predicted by the models. When we use these predictions to define buy and sell decisions in order to generate a model-based portfolio, the model-based portfolio fails on the test dataset. It was found that model-based buy and sell decisions generated a stock portfolio strategy whose returns cannot outperform non-model portfolio strategies on the test dataset. We found that any effort to predict a trend formulated on the stock price is a challenge. Our results agree with the claim of the Random Walk Theory that stock prices, or price changes, are unpredictable. Our model iterations failed on the test dataset: although we built several good models on the validation dataset, we failed on the test dataset. We implemented Random Forest, Gradient Boosting and Logistic Regression and discovered that the complex models did not provide an advantage or additional performance compared with Logistic Regression. More complexity did not lead to better performance, so using a complex model is not the answer to the stock prediction problem. Our approach was to predict the trend instead of the price, which converted the problem into classification. However, this label approach does not solve the stock prediction problem, nor does it deny or refute the accuracy of the Random Walk Theory for stock prices.
Keywords: stock prediction, portfolio optimization, data science, machine learning
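The time-ordered evaluation protocol described above translates naturally into code. The sketch below is illustrative only: it assumes a hypothetical pandas DataFrame `df` with a DatetimeIndex, engineered feature columns, and a binary 20-day-trend label (none of which are specified in the abstract), and compares the three model families on validation ROC-AUC.

```python
# Sketch of the train/validation/test protocol, assuming a hypothetical DataFrame `df`
# with a sorted DatetimeIndex, feature columns, and a binary label `trend_up_20d`.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.metrics import roc_auc_score

def time_ordered_split(df):
    train = df.loc["2002-06-18":"2016-12-30"]
    valid = df.loc["2017-01-02":"2019-12-31"]
    test  = df.loc["2020-01-02":"2022-03-17"]
    return train, valid, test

def compare_models(df, feature_cols, label_col="trend_up_20d"):
    train, valid, test = time_ordered_split(df)
    models = {
        "logreg": LogisticRegression(max_iter=1000),
        "gbm": GradientBoostingClassifier(),
        "rf": RandomForestClassifier(n_estimators=300, random_state=0),
    }
    scores = {}
    for name, model in models.items():
        model.fit(train[feature_cols], train[label_col])
        proba = model.predict_proba(valid[feature_cols])[:, 1]
        scores[name] = roc_auc_score(valid[label_col], proba)  # model selection on validation
    return scores  # the chosen model would then be scored once on the held-out test period
```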
Procedia PDF Downloads 80
1738 Design of Non-uniform Circular Antenna Arrays Using Firefly Algorithm for Side Lobe Level Reduction
Authors: Gopi Ram, Durbadal Mandal, Rajib Kar, Sakti Prasad Ghoshal
Abstract:
A design problem of non-uniform circular antenna arrays for maximum reduction of both the side lobe level (SLL) and the first null beam width (FNBW) is dealt with. This problem is modeled as a simple optimization problem. The firefly algorithm (FFA) is used to determine an optimal set of current excitation weights and antenna inter-element separations that provide a radiation pattern with maximum SLL reduction and considerable improvement in FNBW as well. A circular array antenna laid on the x-y plane is assumed. FFA is applied to circular arrays of 8, 10, and 12 elements. Various simulation results are presented, and the side lobe and FNBW performances are analyzed. Experimental results show considerable reductions of both the SLL and FNBW with respect to those of the uniform case and of the standard algorithms GA, PSO, and SA applied to the same problem.
Keywords: circular arrays, first null beam width, side lobe level, FFA
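For readers unfamiliar with the optimizer itself, the following is a generic firefly-algorithm core loop in Python, a sketch only: the cost function is a placeholder, whereas in the paper it would score the circular-array radiation pattern (SLL and FNBW) for a candidate vector of excitation weights and inter-element spacings.

```python
# Generic firefly-algorithm sketch; `cost` is a stand-in for the radiation-pattern objective.
import numpy as np

def firefly_minimize(cost, dim, n_fireflies=25, iters=200,
                     alpha=0.2, beta0=1.0, gamma=1.0, bounds=(0.0, 1.0), seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, size=(n_fireflies, dim))
    f = np.array([cost(xi) for xi in x])
    for _ in range(iters):
        for i in range(n_fireflies):
            for j in range(n_fireflies):
                if f[j] < f[i]:                      # firefly j is brighter (lower cost)
                    r2 = np.sum((x[i] - x[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)
                    x[i] += beta * (x[j] - x[i]) + alpha * (rng.random(dim) - 0.5)
                    x[i] = np.clip(x[i], lo, hi)
                    f[i] = cost(x[i])
    best = np.argmin(f)
    return x[best], f[best]

# Hypothetical usage with a dummy quadratic cost standing in for the SLL/FNBW objective:
best_x, best_f = firefly_minimize(lambda w: np.sum((w - 0.5) ** 2), dim=8)
```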
Procedia PDF Downloads 259
1737 Crude Oil Electrostatic Mathematical Modelling on an Existing Industrial Plant
Authors: Fatemeh Yazdanmehr, Iulian Nistor
Abstract:
The scope of the current study is the prediction of water separation in a two-stage industrial crude oil desalting plant. This research study focused on developing the desalting operation in an existing production unit of an Iranian heavy oil field with a capacity of 75 MBPD. Because of some operational issues, such as oil dehydration at high temperatures, optimization of the desalter operational parameters was essential. The desalting process is modeled mathematically based on the population balance method. The existing operational data are used for tuning and validating the accuracy of the model. The inlet oil temperature to the desalter was decreased from 110°C to 80°C, and the desalter electrical field was increased from 0.75 kV to 2.5 kV. The proposed desalter conditions also meet the water-in-oil specification. Under these conditions, oil recovery is increased by 574 BBL/D, and gas flaring decreases by 2.8 MMSCF/D. Depending on the oil price, the additional oil production can increase the annual income by about $15 MM and reduce the greenhouse gas production caused by gas flaring.
Keywords: desalter, demulsification, modelling, water-oil separation, crude oil emulsion
Procedia PDF Downloads 77
1736 Optimized and Secured Digital Watermarking Using Fuzzy Entropy, Bezier Curve and Visual Cryptography
Authors: R. Rama Kishore, Sunesh
Abstract:
Recent developments in the use of the internet for different purposes create a great threat to the copyright protection of digital images. Digital watermarking can be used to address this problem. This paper presents a detailed review of the different watermarking techniques and the latest trends in the field of secure, robust and imperceptible watermarking. It also discusses the different optimization techniques used in the field of watermarking in order to improve the robustness and imperceptibility of the method. Different measures for evaluating the performance of a watermarking algorithm are discussed. Finally, this paper proposes a watermarking algorithm using (2, 2)-share visual cryptography and a Bezier-curve-based algorithm to improve the security of the watermark. The proposed method uses a fractional transform to improve the robustness of the copyright protection. The algorithm is optimized using fuzzy entropy for better results.
Keywords: digital watermarking, fractional transform, visual cryptography, Bezier curve, fuzzy entropy
Procedia PDF Downloads 366
1735 Monomial Form Approach to Rectangular Surface Modeling
Authors: Taweechai Nuntawisuttiwong, Natasha Dejdumrong
Abstract:
Geometric modeling plays an important role in the construction and manufacturing of curve, surface and solid models. Their algorithms are critically important not only in the automobile, ship and aircraft manufacturing business, but are also absolutely necessary in a wide variety of modern applications, e.g., robotics, optimization, computer vision, data analytics and visualization. The calculation and display of geometric objects can be accomplished by six techniques: polynomial basis, recursive, iterative, coefficient matrix, polar form and pyramidal algorithms. In this research, the coefficient matrix (simply called the monomial form approach) is used to model polynomial rectangular patches, i.e., Said-Ball, Wang-Ball, DP, Dejdumrong and NB1 surfaces. Examples of the monomial forms for these surface models are illustrated in many aspects, e.g., construction, derivatives, model transformation, degree elevation and degree reduction.
Keywords: monomial forms, rectangular surfaces, CAGD curves, monomial matrix applications
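To make the coefficient-matrix idea concrete, here is a minimal Python sketch of evaluating a rectangular patch in monomial form, S(u, v) = U(u)·C·V(v)ᵀ with U(u) = [1, u, u², …]. The example coefficient matrix is assumed for illustration and is not taken from the paper; in practice C would come from multiplying the control net by the basis-conversion matrix of the chosen scheme (Said-Ball, Wang-Ball, DP, …), applied per coordinate.

```python
# Monomial-form patch evaluation sketch; C below is an assumed example matrix.
import numpy as np

def monomial_surface_point(C, u, v):
    m, n = C.shape
    U = np.array([u ** i for i in range(m)])
    V = np.array([v ** j for j in range(n)])
    return U @ C @ V

def monomial_partial_u(C, u, v):
    """Partial derivative in u follows directly from differentiating the monomial row."""
    m, n = C.shape
    dU = np.array([0.0] + [i * u ** (i - 1) for i in range(1, m)])
    V = np.array([v ** j for j in range(n)])
    return dU @ C @ V

# Assumed bicubic coefficient matrix for one scalar coordinate:
C = np.array([[0.0,  0.0,  0.0,  1.0],
              [0.0,  0.0,  3.0, -3.0],
              [0.0,  3.0, -6.0,  3.0],
              [1.0, -3.0,  3.0, -1.0]])
print(monomial_surface_point(C, 0.5, 0.5), monomial_partial_u(C, 0.5, 0.5))
```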
Procedia PDF Downloads 146
1734 Recovery of Fried Soybean Oil Using Bentonite as an Adsorbent: Optimization, Isotherm and Kinetics Studies
Authors: Prakash Kumar Nayak, Avinash Kumar, Uma Dash, Kalpana Rayaguru
Abstract:
Soybean oil is one of the most widely consumed cooking oils worldwide. Deep-fat frying of foods at higher temperatures adds a unique flavour, golden brown colour and crispy texture to foods, but it brings about various changes in the oil, such as hydrolysis, oxidation, hydrogenation and thermal alteration. The peroxide value (PV) is one of the most important factors affecting the quality of deep-fat fried oil. Using bentonite as an adsorbent, the PV can be reduced, thereby improving the quality of the soybean oil. In this study, operating parameters such as the heating time of the oil (10, 15, 20, 25 and 30 h), contact time (5, 10, 15, 20, 25 h) and concentration of adsorbent (0.25, 0.5, 0.75, 1.0 and 1.25 g/100 ml of oil) have been optimized by response surface methodology (RSM), considering the percentage reduction of PV as the response. Adsorption data were analysed by fitting them with the Langmuir and Freundlich isotherm models. The results show that the Langmuir model gives the best fit compared to the Freundlich model. The adsorption process was also found to follow a pseudo-second-order kinetic model.
Keywords: bentonite, Langmuir isotherm, peroxide value, RSM, soybean oil
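The isotherm comparison mentioned above can be reproduced with a standard non-linear fit. The sketch below uses SciPy's curve_fit with placeholder equilibrium data (the Ce and qe values are assumptions, not the study's measurements) and compares the two models by R².

```python
# Langmuir vs. Freundlich isotherm fitting sketch with assumed equilibrium data.
import numpy as np
from scipy.optimize import curve_fit

def langmuir(Ce, qmax, KL):
    return qmax * KL * Ce / (1.0 + KL * Ce)

def freundlich(Ce, KF, n):
    return KF * Ce ** (1.0 / n)

Ce = np.array([0.5, 1.0, 2.0, 4.0, 8.0])   # hypothetical equilibrium concentrations
qe = np.array([1.8, 2.9, 4.1, 5.0, 5.5])   # hypothetical adsorbed amounts

def r_squared(y, y_hat):
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    return 1.0 - ss_res / ss_tot

popt_L, _ = curve_fit(langmuir, Ce, qe, p0=[6.0, 1.0])
popt_F, _ = curve_fit(freundlich, Ce, qe, p0=[2.0, 2.0])
print("Langmuir   R2:", r_squared(qe, langmuir(Ce, *popt_L)))
print("Freundlich R2:", r_squared(qe, freundlich(Ce, *popt_F)))
```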
Procedia PDF Downloads 375
1733 Experimental and Finite Element Analysis for Mechanics of Soil-Tool Interaction
Authors: A. Armin, R. Fotouhi, W. Szyszkowski
Abstract:
In this paper, a 3-D finite element (FE) investigation of soil-blade interaction is described. The effects of the blade’s shape and rake angle are examined both numerically and experimentally. The soil is considered an elastic-plastic granular material with a non-associated Drucker-Prager material model. Contact elements with different properties are used to mimic the soil-blade sliding and soil-soil cutting phenomena. A separation criterion is presented, and a procedure to evaluate the forces acting on the blade is given and discussed in detail. Experimental results were derived from tests using the soil bin facility and instruments at the University of Saskatchewan. During motion of the blade, load cells collect data and send them to a computer. The forces measured using the load cells had noisy signals, which needed to be filtered. The FE results are compared with the experimental results for verification. This technique can be used in blade shape optimization and in the design of more complicated blade shapes.
Keywords: finite element analysis, experimental results, blade force, soil-blade contact modeling
Procedia PDF Downloads 320
1732 Optimization of Machining Parameters in AlSi/10%AlN Metal Matrix Composite Material by TiN Coating Insert
Authors: Nurul Na'imy Wan, Mohamad Sazali Said, Jaharah Ab. Ghani, Rusli Othman
Abstract:
This paper presents the surface roughness of an aluminium silicon alloy (AlSi) matrix composite which has been reinforced with aluminium nitride (AlN). Experiments were conducted at various cutting speeds, feed rates, and depths of cut, according to a standard L27 orthogonal array of the Taguchi method, using a TiN-coated insert. The signal-to-noise (S/N) ratio and analysis of variance are applied to study the characteristic performance of cutting speeds, feed rates and depths of cut on the surface roughness measured during the milling operation. The surface roughness was observed using a Mitutoyo Formtracer CS-500 and analyzed using the Taguchi method. From the Taguchi analysis, it was found that a cutting speed of 230 m/min, a feed rate of 0.4 mm/tooth and a depth of cut of 0.3 mm were the optimum machining parameters for the TiN-coated insert.
Keywords: AlSi/AlN metal matrix composite (MMC), surface roughness, Taguchi method, machining parameters
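Since surface roughness is a smaller-is-better response, the Taguchi S/N ratio used above is S/N = -10·log₁₀(mean(yᵢ²)) per run, followed by level means per factor. The Python sketch below illustrates this on a few placeholder runs; the factor levels and Ra replicates are assumptions, not the study's measurements.

```python
# Smaller-is-better S/N ratio sketch on an assumed fragment of a Taguchi-style table.
import numpy as np
import pandas as pd

def sn_smaller_is_better(replicates):
    y = np.asarray(replicates, dtype=float)
    return -10.0 * np.log10(np.mean(y ** 2))

runs = pd.DataFrame({
    "speed": [180, 180, 230, 230, 280, 280],          # cutting speed (m/min), assumed
    "feed":  [0.4, 0.8, 0.4, 0.8, 0.4, 0.8],          # feed rate (mm/tooth), assumed
    "doc":   [0.3, 0.5, 0.3, 0.5, 0.3, 0.5],          # depth of cut (mm), assumed
    "Ra":    [[0.61, 0.63], [0.84, 0.80], [0.47, 0.50],
              [0.72, 0.75], [0.55, 0.58], [0.90, 0.88]],  # replicate roughness values
})
runs["SN"] = runs["Ra"].apply(sn_smaller_is_better)
print(runs.groupby("speed")["SN"].mean())   # the level with the highest mean S/N is preferred
```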
Procedia PDF Downloads 433
1731 Design and Implementation of an AI-Enabled Task Assistance and Management System
Authors: Arun Prasad Jaganathan
Abstract:
In today's dynamic industrial world, traditional task allocation methods often fall short in adapting to evolving operational conditions. This paper introduces an AI-enabled task assistance and management system designed to overcome the limitations of conventional approaches. By using artificial intelligence (AI) and machine learning (ML), the system intelligently interprets user instructions, analyzes tasks, and allocates resources based on real-time data and environmental factors. Additionally, geolocation tracking enables proactive identification of potential delays, ensuring timely interventions. With its transparent reporting mechanisms, the system provides stakeholders with clear insights into task progress, fostering accountability and informed decision-making. The paper presents a comprehensive overview of the system architecture, algorithm, and implementation, highlighting its potential to revolutionize task management across diverse industries.
Keywords: artificial intelligence, machine learning, task allocation, operational efficiency, resource optimization
Procedia PDF Downloads 59
1730 Oil Producing Wells Using a Technique of Gas Lift on Prosper Software
Authors: Nikhil Yadav, Shubham Verma
Abstract:
Gas lift is a common technique used to optimize oil production in wells. Prosper software is a powerful tool for modeling and optimizing gas lift systems in oil wells. This review paper examines the effectiveness of Prosper software in optimizing gas lift systems in oil-producing wells. The literature review identified several studies that demonstrated the use of Prosper software to adjust injection rate, depth, and valve characteristics to optimize gas lift system performance. The results showed that Prosper software can significantly improve production rates and reduce operating costs in oil-producing wells. However, the accuracy of the model depends on the accuracy of the input data, and the cost of Prosper software can be high. Therefore, further research is needed to improve the accuracy of the model and evaluate the cost-effectiveness of using Prosper software in gas lift system optimization.
Keywords: gas lift, prosper software, injection rate, operating costs, oil-producing wells
Procedia PDF Downloads 88
1729 Numerical Design and Characterization of SiC Single Crystals Obtained with PVT Method
Authors: T. Wejrzanowski, M. Grybczuk, E. Tymicki, K. J. Kurzydlowski
Abstract:
In the present study, numerical simulations of heat and mass transfer in a Physical Vapor Transport reactor during silicon carbide single crystal growth are addressed. Silicon carbide is a wide-bandgap material with unique properties making it highly applicable for high-power electronics applications. Because of high manufacturing costs, improvements to the SiC production process are required. In this study, numerical simulations were used as a tool for process optimization. Computer modeling allows for cost- and time-effective analysis of the processes occurring during SiC single crystal growth and provides essential information needed for improvement of the process. The quantitative relationship between process conditions, such as temperature or pressure, and the crystal growth rate and shape of the crystallization front has been studied and verified using experimental data. Based on the modeling results, several process improvements were proposed and implemented.
Keywords: Finite Volume Method, semiconductors, Physical Vapor Transport, silicon carbide
Procedia PDF Downloads 498
1728 Design and Optimization of a Customized External Fixation Device for Lower Limb Injuries
Authors: Mohammed S. Alqahtani, Paulo J. Bartolo
Abstract:
External fixation is a common technique for the treatment and stabilization of bone fractures. Different designs have been proposed by companies and research groups, but all of them present limitations such as high weight, lack of comfort in use, and lack of customization to individual patients. This paper proposes a lightweight customized external fixator, overcoming some of these limitations. The external fixators are designed using a set of techniques such as medical imaging, CAD modelling, finite element analysis, and a full factorial design of experiments. Key design parameters are discussed, and the optimal set of parameters is used to design the final external fixator. Numerical simulations are used to validate the design concepts. The results present an optimal external fixation design with a weight reduction of 13% without compromising its stiffness and structural integrity. The external fixators are also designed to be additively manufactured, allowing a strategy for personalization to be developed.
Keywords: computer-aided design modelling, external fixation, finite element analysis, full factorial, personalization
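As a small illustration of the full factorial design of experiments mentioned above, the sketch below enumerates all level combinations for a handful of design parameters in Python; the parameter names and levels are assumptions for illustration, not the paper's actual factors.

```python
# Full factorial enumeration sketch with assumed fixator design parameters and levels.
from itertools import product

factors = {
    "rod_diameter_mm": [8, 10, 12],
    "clamp_offset_mm": [30, 40],
    "pin_count": [4, 6],
}
design = [dict(zip(factors, levels)) for levels in product(*factors.values())]
for run_id, run in enumerate(design, start=1):
    print(run_id, run)   # each run would then be meshed and solved in the FE package
```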
Procedia PDF Downloads 160
1727 Credit Risk Evaluation Using Genetic Programming
Authors: Ines Gasmi, Salima Smiti, Makram Soui, Khaled Ghedira
Abstract:
Credit risk is considered one of the most important issues for financial institutions, as it causes great losses for banks. To this end, numerous methods for credit risk evaluation have been proposed. Many evaluation methods are black-box models that cannot adequately reveal the information hidden in the data. However, several works have focused on building transparent rule-based models. For credit risk assessment, the generated rules must be not only highly accurate but also highly interpretable. In this paper, we aim to build an accurate and transparent credit risk evaluation model which proposes a set of classification rules. In fact, we consider credit risk evaluation as an optimization problem addressed with a genetic programming (GP) algorithm, where the goal is to maximize the accuracy of the generated rules. We evaluate our proposed approach on the German and Australian credit datasets. We compared our findings with some existing works; the results show that the proposed GP outperforms the other models.
Keywords: credit risk assessment, rule generation, genetic programming, feature selection
Procedia PDF Downloads 353
1726 Analysis of Joint Source Channel LDPC Coding for Correlated Sources Transmission over Noisy Channels
Authors: Marwa Ben Abdessalem, Amin Zribi, Ammar Bouallègue
Abstract:
In this paper, a joint source-channel coding scheme based on LDPC codes is investigated. We consider two concatenated LDPC codes: one to compress a correlated source and the second to protect it against channel degradations. The original information can be reconstructed at the receiver by a joint decoder, in which the source decoder and the channel decoder run in parallel by transferring extrinsic information. We investigate the performance of the JSC LDPC code in terms of Bit Error Rate (BER) for transmission over an Additive White Gaussian Noise (AWGN) channel, and for different source and channel rate parameters. We emphasize how JSC LDPC coding presents a performance tradeoff depending on the channel state and on the source correlation. We show that JSC LDPC coding is an efficient solution for a relatively low Signal-to-Noise Ratio (SNR) channel, especially with highly correlated sources. Finally, a source-channel rate optimization has to be applied to guarantee the best JSC LDPC system performance for a given channel.
Keywords: AWGN channel, belief propagation, joint source channel coding, LDPC codes
Procedia PDF Downloads 357
1725 Arsenite Remediation by Green Nano Zero Valent Iron
Authors: Ratthiwa Deewan, Visanu Tanboonchuy
Abstract:
The optimal conditions for the green synthesis of nano zero-valent iron (G-NZVI) are investigated in this study using a Box-Behnken design. Three factors were used in the study: the ratio of iron solution to mango peel extract (1:1-1:3), the feeding rate of the mango peel extract (1-5 mL/min), and the agitation speed (30-300 rpm). The results showed that the optimization of the conditions using the regression model was appropriate. The optimal conditions for the synthesis of G-NZVI for arsenate removal are an iron solution to mango peel extract ratio of 1:1, a mango peel extract feeding rate of 5 mL/min, and an agitation speed of 300 rpm, which achieved 100% arsenate removal.
Keywords: Box Behnken design, arsenate removal, green nano zero valent iron, arsenic
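For reference, the three-factor Box-Behnken layout behind the optimization above can be generated as follows; this Python sketch writes the runs in coded (-1, 0, +1) units and decodes them to the stated ranges, with the number of centre points being an assumption.

```python
# Box-Behnken design sketch for three factors; centre-point count is assumed.
from itertools import combinations
import numpy as np

def box_behnken_3(n_center=3):
    runs = []
    for i, j in combinations(range(3), 2):       # each pair of factors at ±1, third at 0
        for a in (-1, 1):
            for b in (-1, 1):
                row = [0, 0, 0]
                row[i], row[j] = a, b
                runs.append(row)
    runs += [[0, 0, 0]] * n_center
    return np.array(runs, dtype=float)

def decode(coded, lows, highs):
    lows, highs = np.asarray(lows, float), np.asarray(highs, float)
    return lows + (coded + 1.0) / 2.0 * (highs - lows)

coded = box_behnken_3()
# Factors: extract ratio expressed as 1:x, feeding rate (mL/min), agitation speed (rpm).
actual = decode(coded, lows=[1.0, 1.0, 30.0], highs=[3.0, 5.0, 300.0])
print(actual)
```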
Procedia PDF Downloads 30
1724 Earth Observations and Hydrodynamic Modeling to Monitor and Simulate the Oil Pollution in the Gulf of Suez, Red Sea, Egypt
Authors: Islam Abou El-Magd, Elham Ali, Moahmed Zakzouk, Nesreen Khairy, Naglaa Zanaty
Abstract:
The marine environment and coastal zone are wealthy with natural resources that contribute to the local economy of Egypt. The Gulf of Suez and Red Sea area accommodates diverse human activities that contribute to the local economy, including oil exploration and production, touristic activities, and export and import harbors; however, it is always under the threat of pollution due to human interaction and activities. This research aimed at integrating in-situ measurements and remotely sensed data with a hydrodynamic model to map and simulate oil pollution. High-resolution satellite sensors, including Sentinel 2 and Planet Labs, were used to trace the oil pollution. The spectral band ratio of band 4 (infrared) over band 3 (red) underpinned the mapping of point-source pollution from the oil industrial estates. This ratio supports the absorption windows detected in the hyperspectral profiles. An ASD in-situ hyperspectral device was used to measure the oil pollution in the marine environment experimentally. The experiment measured water behavior in three cases: a) clear water without oil, b) water covered with raw oil, and c) water a while after the raw oil was poured. The spectral curves clearly identify absorption windows for oil pollution, particularly at 600-700 nm. The MIKE 21 model was applied to simulate the dispersion of the oil contamination and to create scenarios for crisis management. The model requires precise preparation of the bathymetry, tide, wave and atmospheric parameters, which were partially obtained from online modeled data and otherwise from historical in-situ stations. The simulation made it possible to project the movement of the oil spill and could form a warning system for mitigation. Details of the research results are described in the paper.
Keywords: oil pollution, remote sensing, modelling, Red Sea, Egypt
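The band-ratio screening step described above (the infrared band, band 4 in the authors' numbering, divided by the red band, band 3) can be sketched in a few lines of Python with rasterio; the file names and the threshold are assumptions, not values from the study.

```python
# Band-ratio (band 4 / band 3) sketch; input rasters and threshold are assumed.
import numpy as np
import rasterio

def band_ratio_mask(band4_path, band3_path, threshold=1.2):
    with rasterio.open(band4_path) as b4, rasterio.open(band3_path) as b3:
        infrared = b4.read(1).astype("float32")
        red = b3.read(1).astype("float32")
    ratio = np.divide(infrared, red, out=np.zeros_like(infrared), where=red > 0)
    return ratio, ratio > threshold   # pixels exceeding the assumed threshold are flagged

# Example with assumed file names:
# ratio, candidate_oil = band_ratio_mask("scene_band4.tif", "scene_band3.tif")
```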
Procedia PDF Downloads 347
1723 Optimizing Load Shedding Schedule Problem Based on Harmony Search
Authors: Almahd Alshereef, Ahmed Alkilany, Hammad Said, Azuraliza Abu Bakar
Abstract:
From time to time, the electrical power grid is directed by the National Electricity Operator to conduct load shedding, which involves hours-long power outages in the area of this study, the Southern Electrical Grid of Libya (SEGL). Load shedding is conducted in order to alleviate pressure on the national electricity grid at times of peak demand. This approach has chosen a set of categories to study the load-shedding problem, considering the effect of demand priorities on the operation of the power system during emergencies. The classification of category regions for the load-shedding problem is solved by a new algorithm (the harmony algorithm) based on a "random generation list of category regions", which is a possible solution with a degree of proximity to the optimum. The obtained results show additional enhancements compared to other heuristic approaches. The case studies are carried out on the SEGL.
Keywords: optimization, harmony algorithm, load shedding, classification
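To show the mechanics of the harmony-search idea used above, here is a generic discrete harmony-search loop in Python in which each region is assigned a shedding-category index; the cost function and the usage figures are placeholders, not the paper's priority-based objective or the SEGL data.

```python
# Generic discrete harmony-search sketch; `cost` is a stand-in objective (lower is better).
import random

def harmony_search(cost, n_regions, n_categories, hms=20, hmcr=0.9, par=0.3, iters=2000, seed=0):
    rng = random.Random(seed)
    memory = [[rng.randrange(n_categories) for _ in range(n_regions)] for _ in range(hms)]
    scores = [cost(h) for h in memory]
    for _ in range(iters):
        new = []
        for r in range(n_regions):
            if rng.random() < hmcr:                  # memory consideration
                value = rng.choice(memory)[r]
                if rng.random() < par:               # pitch adjustment: shift the category
                    value = (value + rng.choice([-1, 1])) % n_categories
            else:                                    # random re-generation
                value = rng.randrange(n_categories)
            new.append(value)
        worst = max(range(hms), key=lambda i: scores[i])
        c = cost(new)
        if c < scores[worst]:                        # replace the worst harmony if improved
            memory[worst], scores[worst] = new, c
    best = min(range(hms), key=lambda i: scores[i])
    return memory[best], scores[best]

# Hypothetical usage: 12 regions, 4 shedding categories, dummy cost preferring category 0.
best, best_cost = harmony_search(lambda h: sum(h), n_regions=12, n_categories=4)
```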
Procedia PDF Downloads 397
1722 Computational Fluid Dynamics Investigation of the Effect of Geometric Parameters on the Ejector Performance
Authors: Michel Wakim, Rodrigo Rivera Tinoco
Abstract:
A supersonic ejector is an economical device that uses high-pressure vapor to compress a low-pressure vapor without any rotating parts or external power sources. The entrainment ratio is a major characteristic of ejector performance, and the ejector performance is highly dependent on its geometry. The aim of this paper is to design an ejector geometry based on pre-specified operating conditions and to study the flow behavior inside the ejector using computational fluid dynamics (CFD) with the ANSYS FLUENT 15.0 software. In the first section, a 1-D mathematical model is used to predict the ejector geometry. The second part describes the flow behavior inside the designed model. CFD is the most reliable tool to reveal the mixing process in different parts of the supersonic turbulent flow and to study the effect of the geometry on the effective ejector area. Finally, the results show the effect of the geometry on the entrainment ratio.
Keywords: computational fluid dynamics, ejector, entrainment ratio, geometry optimization, performance
Procedia PDF Downloads 275
1721 Land Use and Natal Multimammate Mouse Abundance in Lassa Fever Endemic Villages of Eastern Sierra Leone
Authors: J. T. Koininga, J. E. Teigen, A. Wilkinson, D. Kanneh, F. Kanneh, M. Foday, D. S. Grant, M. Leach, L. M. Moses
Abstract:
Lassa fever (LF) is a severe febrile illness endemic to West Africa. While human-to-human transmission occurs, evidence suggests most LF cases originate from exposure to rodents, particularly the Natal multimammate mouse, Mastomys natalensis. Within West Africa, LF occurs primarily in rural communities where agriculture is the main economic activity. Seasonality of LF has also been linked to agricultural cycles, with peak incidence occurring in the dry season when fields are burned and plowed. To investigate this pattern of seasonality, four agricultural communities were selected for this two-year longitudinal study. Each community was to be sampled four times each year, but this was interrupted by the Ebola virus disease outbreak. Agricultural land use, forested, and fallow areas were identified through participatory mapping. Transects were plotted in each area and Sherman traps were set for four nights. Captured small mammals were identified, ear tagged, and released. Mastomys natalensis abundance was found to be highest in areas of converted fallow land and rice swamps in the dry season and upland mixed crop areas toward the onset of the rainy season. All peak times were associated with heavy perturbation of soil. All ages and genders were present during these time points. These results suggest that peak abundance of Mastomys natalensis in agricultural areas coincides with peak incidence of LF reported in this region. Although contact with rodents may be higher in villages, our study suggests human behaviors in agricultural areas may increase risk of transmission of Lassa virus.
Keywords: agriculture, land use, Lassa Fever, rodent abundance
Procedia PDF Downloads 120
1720 Optimal Cropping Pattern in an Irrigation Project: A Hybrid Model of Artificial Neural Network and Modified Simplex Algorithm
Authors: Safayat Ali Shaikh
Abstract:
Software has been developed for determining the optimal cropping pattern in an irrigation project, considering a land constraint, a water availability constraint and a pick-up flow constraint, using a modified simplex algorithm. Artificial neural network (ANN) models have been developed to predict rainfall. An AR(1) model was used to generate 1000 years of rainfall data to train the ANN. Simulation has been done with the expected rainfall data. Eight crops and three soil classes have been considered for the optimization model. The area under each crop and each soil class has been quantified using the modified simplex algorithm to obtain the optimum net return. The efficacy of the software has been tested using data from a large irrigation project in India.
Keywords: artificial neural network, large irrigation project, modified simplex algorithm, optimal cropping pattern
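The optimization core described above is a linear program. The Python sketch below poses a cropping-pattern LP of the same shape, with SciPy's linprog standing in for the modified simplex implementation; the crop returns, water duties and resource limits are placeholder numbers, not the project's data.

```python
# Cropping-pattern LP sketch with assumed coefficients (4 example crops).
import numpy as np
from scipy.optimize import linprog

net_return = np.array([420, 380, 510, 300])     # net return per hectare, assumed
water_duty = np.array([7.5, 5.0, 9.0, 4.0])     # water requirement per hectare (10^3 m3), assumed
land_limit, water_limit = 1000.0, 6500.0        # available land (ha) and water (10^3 m3), assumed

res = linprog(
    c=-net_return,                               # maximise return = minimise its negative
    A_ub=np.vstack([np.ones(4), water_duty]),    # land and water availability constraints
    b_ub=[land_limit, water_limit],
    bounds=[(0, None)] * 4,
)
print(res.x, -res.fun)                            # optimal crop areas and total net return
```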
Procedia PDF Downloads 203
1719 Using Greywolf Optimized Machine Learning Algorithms to Improve Accuracy for Predicting Hospital Readmission for Diabetes
Authors: Vincent Liu
Abstract:
Machine learning (ML) algorithms can achieve high accuracy in predicting outcomes compared to classical models. Metaheuristic, nature-inspired algorithms can enhance traditional ML algorithms by optimizing them, for example by performing feature selection. We compare ten ML algorithms for predicting 30-day hospital readmission rates for diabetes patients in the US, using a dataset from the UCI Machine Learning Repository with feature selection performed by the Greywolf nature-inspired algorithm. The baseline accuracy for the initial random forest model was 65%. After performing feature engineering, SMOTE for class balancing, and Greywolf optimization, the machine learning algorithms showed better metrics, including F1 scores, accuracy, and confusion matrices, with improvements ranging from 10% to 30%, and a best model of XGBoost with an accuracy of 95%. Applying machine learning in this way can improve patient outcomes, as unnecessary rehospitalizations can be prevented by focusing on patients who are at a higher risk of readmission.
Keywords: diabetes, machine learning, 30-day readmission, metaheuristic
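The feature-selection step can be sketched with the standard grey wolf optimizer (GWO) update equations, wrapped for binary masks via a sigmoid threshold. This is a generic sketch, not the paper's code: the `fitness` callable would wrap cross-validated model error, and the stand-in fitness in the usage line is only for illustration.

```python
# Grey-wolf-optimizer feature-selection sketch; `fitness` is a placeholder (lower is better).
import numpy as np

def gwo_feature_select(fitness, n_features, n_wolves=10, iters=50, seed=0):
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-1, 1, size=(n_wolves, n_features))
    def mask(p):
        return 1.0 / (1.0 + np.exp(-p)) > 0.5        # sigmoid threshold -> boolean feature mask
    scores = np.array([fitness(mask(p)) for p in pos])
    for t in range(iters):
        a = 2.0 - 2.0 * t / iters                     # a decreases linearly from 2 to 0
        order = np.argsort(scores)
        leaders = pos[order[:3]]                      # alpha, beta, delta wolves
        for i in range(n_wolves):
            new = np.zeros(n_features)
            for leader in leaders:
                r1, r2 = rng.random(n_features), rng.random(n_features)
                A, C = 2 * a * r1 - a, 2 * r2
                new += leader - A * np.abs(C * leader - pos[i])
            pos[i] = new / 3.0
            scores[i] = fitness(mask(pos[i]))
    return mask(pos[np.argmin(scores)])

# Stand-in fitness that merely rewards larger feature subsets, for demonstration only:
selected = gwo_feature_select(lambda m: 1.0 if m.sum() == 0 else 1.0 / m.sum(), n_features=30)
```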
Procedia PDF Downloads 62
1718 Effect of Process Parameters on Tensile Strength of Aluminum Alloy ADC 10 Produced through Ceramic Shell Investment Casting
Authors: Balwinder Singh
Abstract:
Castings are produced using aluminum alloy ADC 10 through the process of ceramic shell investment casting. Experiments are conducted as per the Taguchi L9 orthogonal array. In order to evaluate the effect of process parameters such as mould preheat temperature, preheat time, firing temperature and pouring temperature on the surface roughness of the ceramic shell investment castings, the Taguchi parameter design and optimization approach is used. Plots of the means of significant factors and S/N ratios have been used to determine the best relationship between the responses and the model parameters. It is found that the pouring temperature is the most significant factor. The best tensile strength of aluminum alloy ADC 10 is given by a shell preheat temperature of 150 ºC, a preheat time of 45 minutes, a firing temperature of 900 ºC and a pouring temperature of 650 ºC.
Keywords: investment casting, shell preheat temperature, firing temperature, Taguchi method
Procedia PDF Downloads 175
1717 Order Picking Problem: Exact and Heuristic Algorithms for the Generalized Travelling Salesman Problem With Geographical Overlap Between Clusters
Authors: Farzaneh Rajabighamchi, Stan van Hoesel, Christof Defryn
Abstract:
The generalized traveling salesman problem (GTSP) is an extension of the traveling salesman problem (TSP) in which the set of nodes is partitioned into clusters, and the salesman must visit exactly one node per cluster. In this research, we apply the GTSP formulation to an order picker routing problem with multiple locations per product. As such, each product represents a cluster, and its corresponding nodes are the locations at which the product can be retrieved. To pick a certain product item from the warehouse, the picker needs to visit one of these locations during its pick tour. As all products are scattered throughout the warehouse, the product clusters are not separated geographically. We propose an exact LP model as well as heuristic and meta-heuristic solution algorithms for the order picking problem with multiple product locations.
Keywords: warehouse optimization, order picking problem, generalised travelling salesman problem, heuristic algorithm
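A toy nearest-neighbour construction makes the cluster structure tangible: every product is a cluster of candidate storage locations, and the tour visits exactly one location per cluster. The Python sketch below is illustrative only; the depot, coordinates and Manhattan distances are assumptions, and it does not reproduce the paper's exact LP or (meta)heuristics.

```python
# Greedy nearest-neighbour sketch of the GTSP view of order picking (assumed layout).
def manhattan(a, b):
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

def greedy_gtsp_tour(depot, clusters):
    """clusters: dict product -> list of (x, y) candidate storage locations."""
    tour, position = [depot], depot
    remaining = dict(clusters)
    while remaining:
        product, loc = min(
            ((p, l) for p, locs in remaining.items() for l in locs),
            key=lambda pl: manhattan(position, pl[1]),
        )
        tour.append(loc)
        position = loc
        del remaining[product]               # exactly one location visited per product cluster
    tour.append(depot)
    return tour, sum(manhattan(a, b) for a, b in zip(tour, tour[1:]))

clusters = {"A": [(2, 5), (9, 1)], "B": [(4, 3)], "C": [(7, 8), (1, 7)]}
tour, length = greedy_gtsp_tour(depot=(0, 0), clusters=clusters)
print(tour, length)
```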
Procedia PDF Downloads 112
1716 How to Improve Teaching and Learning Strategies Through Educational Research. An Experience of Peer Observation in Legal Education
Authors: Luigina Mortari, Alessia Bevilacqua, Roberta Silva
Abstract:
The experience presented in this paper aims to understand how educational research can support the introduction and optimization of teaching innovations in legal education. In this increasingly complex context, a strong need emerges to introduce paths aimed at acquiring not only professional knowledge and skills but also transversal skills, such as reflective, critical, and problem-solving skills. Through peer observation intertwined with an analysis of discursive practices, the researchers and the teacher worked together in a process of participatory and transformative accompaniment whose objective was to promote the active participation and engagement of students in learning processes, an element indispensable for the more specific aim of strengthening key competences. This reflective faculty development path led the teacher to activate metacognitive processes, thus becoming aware of the strengths and areas for improvement of his teaching innovation.
Keywords: legal education, teaching innovation, peer observation, discursive analysis, faculty development
Procedia PDF Downloads 167