Search results for: optimized asset allocation
2461 Effect of Compost Application on Uptake and Allocation of Heavy Metals and Plant Nutrients and Quality of Oriental Tobacco Krumovgrad 90
Authors: Violina R. Angelova, Venelina T. Popova, Radka V. Ivanova, Givko T. Ivanov, Krasimir I. Ivanov
Abstract:
A comparative study of the impact of compost on the uptake and allocation of nutrients and heavy metals and on the quality of Oriental tobacco Krumovgrad 90 has been carried out. The experiment was performed on an agricultural field contaminated by the lead-zinc smelter near the town of Kardzali, Bulgaria, after the closure of lead production. The compost treatments had significant effects on the uptake and allocation of plant nutrients and heavy metals. The incorporation of compost leads to a decrease in the amount of heavy metals present in the tobacco leaves, with reductions of 36%, 12% and 6% for Cd, Pb and Zn, respectively. Application of the compost leads to an increased content of potassium, calcium and magnesium in the leaves of tobacco and may therefore favorably affect the burning properties of tobacco. The incorporation of compost in the soil has a negative impact on the quality and typicality of the Oriental tobacco variety Krumovgrad 90. The incorporation of compost leads to an increase in the size of the tobacco plant leaves; the leaves become darker in colour, less fleshy and undergo a change in form, becoming (much) broader in the second, third and fourth stalk positions. This is accompanied by a decrease in the quality of the tobacco. The incorporation of compost also results in an increase in mineral substances (pure ash), total nicotine and nitrogen, and a reduction in the amount of reducing sugars, which causes the quality of the tobacco leaves to deteriorate (particularly in the third and fourth harvests).
Keywords: chemical composition, compost, heavy metals, oriental tobacco, quality
Procedia PDF Downloads 273
2460 Operations Research Applications in Audit Planning and Scheduling
Authors: Abdel-Aziz M. Mohamed
Abstract:
This paper presents a state-of-the-art survey of the operations research models developed for internal audit planning. Two alternative approaches have been followed in the literature for audit planning: (1) identifying the optimal audit frequency; and (2) determining the optimal audit resource allocation. The first approach identifies the elapsed time between two successive audits, which can be presented as the optimal number of audits in a given planning horizon, or the optimal number of transactions after which an audit should be performed. It also includes the optimal audit schedule. The second approach determines the optimal allocation of audit frequency among all auditable units in the firm. In our review, we discuss both the deterministic and probabilistic models developed for audit planning. In addition, game theory models are reviewed to find the optimal auditing strategy based on the interactions between the auditors and the clients.
Keywords: operations research applications, audit frequency, audit-staff scheduling, audit planning
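To make the first approach concrete, the following Python sketch frames "optimal audit frequency" as a trade-off between a fixed cost per audit and an exposure loss that accumulates between audits. The cost structure and all numbers are textbook-style assumptions for illustration only, not a model taken from the surveyed literature.

```python
# Stylized illustration (assumed numbers): choose the number of audits in a
# planning horizon that balances audit cost against exposure accumulating
# between audits.
import numpy as np

horizon_days = 365
audit_cost = 800.0      # assumed fixed cost per audit
loss_rate = 0.05        # assumed exposure accumulation per day since last audit

def total_cost(n_audits):
    interval = horizon_days / n_audits
    # exposure grows with time since the last audit, so the expected loss in one
    # interval is roughly loss_rate * interval**2 / 2 (triangular accumulation)
    return n_audits * audit_cost + n_audits * loss_rate * interval**2 / 2

candidates = np.arange(1, 31)
best = candidates[np.argmin([total_cost(n) for n in candidates])]
print("audits per horizon minimizing total cost:", best)
```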
Procedia PDF Downloads 815
2459 Capacitated Multiple Allocation P-Hub Median Problem on a Cluster Based Network under Congestion
Authors: Çağrı Özgün Kibiroğlu, Zeynep Turgut
Abstract:
This paper considers a hub location problem in which the network service area is partitioned into predetermined zones (represented by given node clusters) and the capacity levels of potential hub nodes are determined a priori as a hub selection criterion, in order to investigate the effect of congestion on the network. The objective is to design the hub network by determining all required hub locations in the node clusters and allocating non-hub nodes to hubs such that the total cost, including transportation cost, hub opening cost and a penalty cost for exceeding the capacity level at hubs, is minimized. A mixed-integer linear programming model is developed by introducing additional constraints to the traditional capacitated multiple allocation hub location model and is empirically tested.
Keywords: hub location problem, p-hub median problem, clustering, congestion
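For readers unfamiliar with the underlying formulation, the sketch below sets up a toy capacitated multiple-allocation p-hub median model in PuLP (assuming the bundled CBC solver is available). The network data and costs are invented placeholders, and the clustering and congestion-penalty features of the paper's extended model are omitted.

```python
# Toy capacitated multiple-allocation p-hub median sketch (assumed data; no
# clustering or congestion penalty, unlike the paper's extended model).
from itertools import product
from pulp import LpBinary, LpMinimize, LpProblem, LpVariable, lpSum

nodes = range(4)
w = {(i, j): 10 for i, j in product(nodes, nodes) if i != j}    # flow i -> j
c = {(i, j): abs(i - j) + 1 for i, j in product(nodes, nodes)}  # unit transport cost
cap = {k: 200 for k in nodes}   # hub collection capacity
f = {k: 50 for k in nodes}      # hub opening cost
alpha, p = 0.6, 2               # inter-hub discount, number of hubs

prob = LpProblem("capacitated_multi_allocation_p_hub", LpMinimize)
z = LpVariable.dicts("hub", nodes, cat=LpBinary)
# x[i, j, k, m]: fraction of flow i -> j routed via first hub k and second hub m
x = LpVariable.dicts("route", [(i, j, k, m) for (i, j) in w for k in nodes for m in nodes],
                     lowBound=0)

prob += lpSum(f[k] * z[k] for k in nodes) + lpSum(
    w[i, j] * (c[i, k] + alpha * c[k, m] + c[m, j]) * x[i, j, k, m]
    for (i, j, k, m) in x)

prob += lpSum(z[k] for k in nodes) == p
for (i, j) in w:                                   # every flow fully routed
    prob += lpSum(x[i, j, k, m] for k in nodes for m in nodes) == 1
for k in nodes:                                    # capacity of the collecting hub
    prob += lpSum(w[i, j] * x[i, j, k, m] for (i, j) in w for m in nodes) <= cap[k] * z[k]
for (i, j, k, m) in x:                             # route only through open hubs
    prob += x[i, j, k, m] <= z[k]
    prob += x[i, j, k, m] <= z[m]

prob.solve()
print("open hubs:", [k for k in nodes if z[k].value() == 1])
```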
Procedia PDF Downloads 492
2458 Developing Medium Term Maintenance Plan For Road Networks
Authors: Helen S. Ghali, Haidy S. Ghali, Salma Ibrahim, Ossama Hosny, Hatem S. Elbehairy
Abstract:
Infrastructure systems are essential assets in any community; accordingly, authorities aim to maximize their life span while minimizing the life cycle cost. This requires studying the asset conditions throughout their operation and forming a cost-efficient maintenance strategy plan. The objective of this study is to develop a highway management system that provides medium-term maintenance plans with the minimum life cycle cost subject to budget constraints. The model is applied to data collected for the highway network in India with the aim of outputting a 5-year maintenance plan strategy from 2019 to 2023. The main element considered is the surface course, either rigid or flexible pavement. The model outputs a 5-year maintenance plan for each segment given the budget constraint while maximizing the new pavement condition rating and minimizing its life cycle cost.
Keywords: infrastructure, asset management, optimization, maintenance plan
Procedia PDF Downloads 218
2457 Optimization of Fermentation Parameters for Bioethanol Production from Waste Glycerol by Microwave Induced Mutant Escherichia coli EC-MW (ATCC 11105)
Authors: Refal Hussain, Saifuddin M. Nomanbhay
Abstract:
Glycerol is a valuable raw material for the production of industrially useful metabolites. Among many promising applications for the use of glycerol is its bioconversion to high value-added compounds, such as bioethanol, through microbial fermentation. Bioethanol is an important industrial chemical with emerging potential as a biofuel to replace dwindling fossil fuels. The yield of liquid fuel in this process is greatly influenced by various parameters, viz. temperature, pH, glycerol concentration, organic source concentration, and agitation speed. The present study was undertaken to investigate the optimum parameters for bioethanol production from raw glycerol by a mutant Escherichia coli (E. coli) strain (ATCC 11505) immobilized on glutaraldehyde-crosslinked chitosan, optimized by the Taguchi statistical method in shake flasks. The parameters were each set at four levels, and an orthogonal array layout of L16 (4⁵) was conducted. The important controlling parameters for optimized operational fermentation were a temperature of 38 °C, medium pH of 6.5, initial glycerol concentration of 250 g/l, and organic source concentration of 5 g/l. Fermentation with the optimized parameters was carried out in a custom-fabricated shake flask. The predicted value of bioethanol production under optimized conditions was 118.13 g/l. Immobilized cells are mainly used for the economic benefits of continuous production or repeated use in continuous as well as batch mode.
Keywords: bioethanol, Escherichia coli, immobilization, optimization
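As an illustration of how Taguchi results are typically analyzed, the short sketch below computes larger-the-better signal-to-noise ratios for one hypothetical four-level factor; the yields and levels are made up and do not come from this study.

```python
# Minimal sketch (made-up yields): ranking factor levels in a Taguchi-style
# analysis with the larger-the-better signal-to-noise ratio, as one would do
# for an L16 (4^5) layout. Only one 4-level factor is shown for brevity.
import numpy as np

# ethanol yield (g/l) of the runs grouped by the temperature level used
runs_by_level = {
    "30 C": [72.1, 75.4, 70.8, 74.0],
    "34 C": [88.3, 85.9, 90.2, 87.5],
    "38 C": [112.6, 115.0, 110.9, 114.3],
    "42 C": [95.4, 93.8, 97.1, 96.0],
}

def sn_larger_the_better(y):
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y**2))

sn = {lvl: sn_larger_the_better(y) for lvl, y in runs_by_level.items()}
print({k: round(v, 2) for k, v in sn.items()})
print("best temperature level by S/N ratio:", max(sn, key=sn.get))
```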
Procedia PDF Downloads 653
2456 Effects of Cash Transfers Mitigation Impacts in the Face of Socioeconomic External Shocks: Evidence from Egypt
Authors: Basma Yassa
Abstract:
Evidence on cash transfers’ effectiveness in mitigating the impacts of macro and idiosyncratic shocks has been mixed and is mostly concentrated in Latin America, Sub-Saharan Africa, and South Asia, with very limited evidence from the MENA region. Yet conditional cash transfer schemes have been continually used, especially in Egypt, as the main social protection tool in response to the recent socioeconomic crises and macro shocks. We use two panel datasets and one cross-sectional dataset to estimate the effectiveness of cash transfers as a shock-mitigative mechanism in the Egyptian context. In this paper, the results from the different models (the panel fixed effects model and the regression discontinuity design (RDD) model) confirm that micro and macro shocks lead to a significant decline in several household-level welfare outcomes and that Takaful cash transfers have a significant positive impact in mitigating the negative shock impacts, especially on households’ debt incidence, debt levels, and asset ownership, but not necessarily on food and non-food expenditure levels. The results indicate large, significant effects, decreasing the household incidence of debt by up to 12.4 percent and lowering debt size by approximately 18 percent among Takaful beneficiaries compared to non-beneficiaries. Similar evidence is found on asset ownership levels, as the RDD model shows significant positive effects on total asset ownership and productive asset ownership, but the model failed to detect positive impacts on per capita food and non-food expenditures. Further extensions are in progress to compare the models’ results with the DID model results using the nationally representative ELMPS panel data rounds (2018/2024). Finally, our initial analysis suggests that conditional cash transfers are effective in buffering the negative shock impacts on certain welfare indicators even after the successive macroeconomic shocks of 2022 and 2023 in the Egyptian context.
Keywords: cash transfers, fixed effects, household welfare, household debt, micro shocks, regression discontinuity design
Procedia PDF Downloads 44
2455 Artificial Intelligent-Based Approaches for Task Offloading, Resource Allocation and Service Placement of Internet of Things Applications: State of the Art
Authors: Fatima Z. Cherhabil, Mammar Sedrati, Sonia-Sabrina Bendib
Abstract:
In order to support the continued growth and critical latency requirements of IoT applications and to overcome various obstacles of traditional data centers, mobile edge computing (MEC) has emerged as a promising solution that extends cloud data processing and decision-making to edge devices. By adopting a MEC structure, IoT applications could be executed locally, on an edge server, on different fog nodes, or in distant cloud data centers. However, one is often faced with optimizing conflicting criteria, such as minimizing the energy consumption of the limited local capabilities (in terms of CPU, RAM, storage, and bandwidth) of mobile edge devices while trying to keep performance high (reducing response time, increasing throughput and service availability). Achieving one goal may affect the other, making task offloading (TO), resource allocation (RA), and service placement (SP) complex processes; studying the trade-off between conflicting criteria is a nontrivial multi-objective optimization problem. The paper provides a survey of different recent multi-objective optimization (MOO) approaches to TO, SP, and RA used in edge computing environments, particularly artificial intelligence (AI) ones, to satisfy various objectives, constraints, and dynamic conditions related to IoT applications.
Keywords: mobile edge computing, multi-objective optimization, artificial intelligence approaches, task offloading, resource allocation, service placement
Procedia PDF Downloads 115
2454 Life Table and Functional Response of Scolothrips takahashii (Thysanoptera: Thripidae) on Tetranychus urticae (Acari: Tetranychidae)
Authors: Kuang-Chi Pan, Shu-Jen Tuan
Abstract:
Scolothrips takahashii Priesner (Thysanoptera: Thripidae) is a common predatory thrips which feeds on spider mites; it is considered an important natural enemy and a potential biological control agent against spider mites. In order to evaluate the efficacy of S. takahashii against tetranychid mites, life table and functional response studies were conducted at 25±1 °C, with Tetranychus urticae as prey. The intrinsic rate of increase (r), finite rate of increase (λ), net reproduction rate (R₀), and mean generation time (T) were 0.1674 d⁻¹, 1.1822 d⁻¹, 62.26 offspring/individual, and 24.68 d, respectively. The net consumption rate (C₀) was 846.15, and the mean daily consumption rate was 51.92 eggs for females and 19.28 eggs for males. S. takahashii exhibited a type III functional response when offered T. urticae deutonymphs. Based on the random predator equation, the estimated maximum attack rate (a) and handling time (Th) were 0.1376 h⁻¹ and 0.7883 h, respectively. In addition, a life table experiment was conducted to evaluate the offspring sex allocation and population dynamics of Tetranychus ludeni Zacher under group-rearing conditions with different sex ratios. All bisexual groups produced offspring with similar sex allocation patterns, which started with a majority of females, shifted during the middle of the oviposition period, and became male-biased at the end of the oviposition period.
Keywords: Scolothrips takahashii, Tetranychus urticae, Tetranychus ludeni, two-sex life table, functional response, sex allocation
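To illustrate the parameter-fitting step behind attack rate and handling time estimates, the sketch below fits the simpler Holling type III disc equation to synthetic attack counts with scipy; the study itself fits Rogers' random predator equation, which accounts for prey depletion, so this is only a schematic of the approach.

```python
# Illustrative fit of a type III functional response (synthetic data; the study
# uses Rogers' random predator equation rather than this disc-equation form).
import numpy as np
from scipy.optimize import curve_fit

def type_iii(N, a, Th, T=1.0):
    """Type III disc equation: prey attacked over exposure time T."""
    return a * N**2 * T / (1.0 + a * Th * N**2)

N = np.array([2, 4, 8, 16, 32, 64], dtype=float)   # prey densities offered
true_a, true_Th = 0.05, 0.04
Na = type_iii(N, true_a, true_Th) + np.array([0.1, -0.2, 0.3, -0.4, 0.5, -0.3])

(a_hat, Th_hat), _ = curve_fit(type_iii, N, Na, p0=[0.01, 0.03])
print(f"attack coefficient a = {a_hat:.4f}, handling time Th = {Th_hat:.4f}")
```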
Procedia PDF Downloads 90
2453 Impact of Modern Beehive on Income of Rural Households: Evidence from Bugina District of Northern Ethiopia
Authors: Wondmnew Derebe Yohannis
Abstract:
The utilization of modern beehives holds significant potential to enhance the livelihoods of smallholder farmers who heavily rely on mixed crop-livestock farming for their income. Recognizing this, the distribution of improved beehives has been implemented across various regions in Ethiopia, including the Bugina district. However, the precise impact of these improved beehives on farmers' income has received limited attention. To address this gap, this study aims to assess the influence of adopting upgraded beehives on rural households' income and asset accumulation. To conduct this research, survey data were gathered from a sample of 350 households selected through random sampling. The collected data were then analyzed using an econometric endogenous switching regression model (ESRM) approach. The findings reveal that the adoption of improved beehives has resulted in higher annual income and asset growth for beekeepers. On average, those who adopted the improved beehives earned approximately 6,077 Ethiopian Birr (ETB) more than their counterparts who did not adopt these beehives. However, it is worth noting that the impact of adoption would have been even greater for non-adopters, as evidenced by the negative transitional heterogeneity effect of 1,792 ETB. Furthermore, the analysis indicates that the decision to adopt or not adopt improved beehives was driven by individual self-selection. The adoption of improved beehives also led to an increase in fixed assets for households, establishing it as a viable strategy for poverty reduction. Overall, this study underscores the positive effect of adopting improved beehives on rural households' income and asset holdings, showcasing its potential to uplift smallholder farmers and serve as an alternative mechanism for reducing poverty.
Keywords: impact, adoption, endogenous switching regression, income, improved beehives
Procedia PDF Downloads 54
2452 Supplier Selection and Order Allocation Using a Stochastic Multi-Objective Programming Model and Genetic Algorithm
Authors: Rouhallah Bagheri, Morteza Mahmoudi, Hadi Moheb-Alizadeh
Abstract:
In this paper, we develop a multi-objective supplier selection and order allocation model in a stochastic environment, in which the purchasing cost, the percentage of items delivered with delay, and the percentage of rejected items provided by each supplier are assumed to be stochastic parameters following arbitrary probability distributions. To do so, we use dependent chance programming (DCP), which maximizes the probability of the event that the total purchasing cost, total items delivered with delay, and total rejected items are less than or equal to predetermined values given by the decision maker. After transforming the above-mentioned stochastic multi-objective programming problem into a stochastic single-objective problem using the minimum deviation method, we apply a genetic algorithm to solve the latter. The employed genetic algorithm performs a simulation process in order to calculate the stochastic objective function as its fitness function. Finally, we explore the impact of the stochastic parameters on the obtained solution via a sensitivity analysis based on the coefficient of variation. The results show that as the stochastic parameters have larger coefficients of variation, the objective value of the stochastic single-objective programming problem deteriorates.
Keywords: dependent chance programming, genetic algorithm, minimum deviation method, order allocation, supplier selection
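The coupling of a genetic algorithm with a simulation-based fitness is the key computational idea here. The sketch below shows one minimal way to wire it up for a toy order-allocation problem: the fitness is a Monte-Carlo estimate of the chance that total purchasing cost stays within budget. Supplier data, distributions, and GA settings are assumptions, not the paper's.

```python
# Minimal sketch (assumed toy data): a genetic algorithm whose fitness is a
# Monte-Carlo estimate of P(total cost <= budget), in the spirit of dependent
# chance programming. Order quantities to 3 suppliers, stochastic unit costs.
import numpy as np

rng = np.random.default_rng(0)
demand, budget = 100, 1200
cost_mean = np.array([10.0, 11.0, 9.5])
cost_sd = np.array([1.0, 0.5, 2.0])

def fitness(q, n_sim=500):
    q = demand * q / q.sum()                 # scale allocation to meet demand
    costs = rng.normal(cost_mean, cost_sd, size=(n_sim, 3))
    return (costs @ q <= budget).mean()      # estimated chance measure

def ga(pop_size=40, gens=60, mut=0.1):
    pop = rng.random((pop_size, 3)) + 1e-6
    for _ in range(gens):
        fit = np.array([fitness(ind) for ind in pop])
        parents = pop[np.argsort(fit)[-pop_size // 2:]]         # truncation selection
        kids = []
        while len(kids) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            child = np.where(rng.random(3) < 0.5, a, b)         # uniform crossover
            child += rng.normal(0, mut, 3)                      # Gaussian mutation
            kids.append(np.clip(child, 1e-6, None))
        pop = np.vstack([parents, kids])
    best = max(pop, key=fitness)
    return demand * best / best.sum(), fitness(best)

allocation, chance = ga()
print("order allocation:", allocation.round(1), "P(cost<=budget) ~", round(chance, 3))
```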
Procedia PDF Downloads 256
2451 Volatility Index, Fear Sentiment and Cross-Section of Stock Returns: Indian Evidence
Authors: Pratap Chandra Pati, Prabina Rajib, Parama Barai
Abstract:
Traditional finance theory neglects the role of the sentiment factor in asset pricing. However, the behavioral approach to asset pricing, based on the noise trader model and limits to arbitrage, includes investor sentiment as a priced risk factor in the asset pricing model. Investor sentiment has a stronger effect on stocks that are vulnerable to speculation, hard to value, and risky to arbitrage. These include small stocks, high-volatility stocks, growth stocks, distressed stocks, young stocks, and non-dividend-paying stocks. Since the introduction of the Chicago Board Options Exchange (CBOE) volatility index (VIX) in 1993, it has been used as a measure of expected future volatility in the stock market and also as a measure of investor sentiment. The CBOE VIX index, in particular, is often referred to as the ‘investors’ fear gauge’ by the public media and prior literature. Upward spikes in the volatility index are associated with bouts of market turmoil and uncertainty. High levels of the volatility index indicate fear, anxiety and pessimistic expectations of investors about the stock market. On the contrary, low levels of the volatility index reflect a confident and optimistic attitude of investors. Based on the above discussion, we investigate whether market-wide fear, measured by the volatility index, is a priced factor in the standard asset pricing model for the Indian stock market. First, we investigate the performance and validity of the Fama-French three-factor model and the Carhart four-factor model in the Indian stock market. Second, we explore whether the India volatility index, as a proxy for a fear-based market sentiment indicator, affects the cross-section of stock returns after controlling for well-established risk factors such as market excess return, size, book-to-market, and momentum. Asset pricing tests are performed using monthly data on CNX 500 index constituent stocks listed on the National Stock Exchange of India Limited (NSE) over the sample period that extends from January 2008 to March 2017. To examine whether the India volatility index, as an indicator of fear sentiment, is a priced risk factor, changes in India VIX are included as an explanatory variable in the Fama-French three-factor model as well as the Carhart four-factor model. For the empirical testing, we use three different sets of test portfolios as the dependent variables in the asset pricing regressions. The first portfolio set is a 4x4 sort on size and the B/M ratio. The second portfolio set is a 4x4 sort on size and the sensitivity beta to changes in IVIX. The third portfolio set is a 2x3x2 independent triple sort on size, B/M, and the sensitivity beta to changes in IVIX. We find evidence that the size, value and momentum factors continue to exist in the Indian stock market. However, the VIX index does not constitute a priced risk factor in the cross-section of returns. The inseparability of volatility and jump risk in the VIX is a possible explanation of the findings of the study.
Keywords: India VIX, Fama-French model, Carhart four-factor model, asset pricing
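The augmented time-series regression the abstract describes has a simple structure: portfolio excess returns regressed on the Carhart factors plus changes in India VIX. The sketch below shows that structure with statsmodels on simulated placeholder data; the factor values, portfolio returns, and resulting loadings are assumptions, not results from the paper.

```python
# Minimal sketch: portfolio excess returns on the Carhart factors plus changes
# in India VIX. All data below are simulated placeholders, not the study's data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 111  # monthly observations, Jan 2008 - Mar 2017
factors = pd.DataFrame({
    "MKT": rng.normal(0.8, 5, n),   # market excess return (%)
    "SMB": rng.normal(0.2, 3, n),   # size factor
    "HML": rng.normal(0.3, 3, n),   # value factor
    "WML": rng.normal(0.5, 4, n),   # momentum factor
    "dVIX": rng.normal(0.0, 4, n),  # change in India VIX
})
# hypothetical test-portfolio excess return
r_p = 1.1 * factors["MKT"] + 0.4 * factors["SMB"] + rng.normal(0, 2, n)

model = sm.OLS(r_p, sm.add_constant(factors)).fit()
print(model.params)            # factor loadings; a significant dVIX loading
print(model.tvalues["dVIX"])   # would suggest exposure to the fear-sentiment factor
```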
Procedia PDF Downloads 252
2450 Formulation and Optimization of Topical 5-Fluorouracil Microemulsions Using Central Composite Design
Authors: Sudhir Kumar, V. R. Sinha
Abstract:
Water-in-oil topical microemulsions of 5-FU were developed and optimized using a face-centered central composite design. Topical w/o microemulsions of 5-FU were prepared using sorbitan monooleate (Span 80) and polysorbate 80 (Tween 80) with different oils such as oleic acid (OA), triacetin (TA), and isopropyl myristate (IPM). The ternary phase diagrams designated the microemulsion region, and the face-centered central composite design helped determine the effects of the selected variables, viz. type of oil, Smix ratio and water concentration, on responses such as drug content, globule size and viscosity of the microemulsions. The CCD showed that the factors have statistically significant effects (p<0.01) on the selected responses. The actual responses showed excellent agreement with the values predicted by the CCD, with a low residual standard error. Similarly, the optimized values were found to be within the range predicted by the model. Furthermore, other characteristics of the microemulsions, such as pH and conductivity, were investigated. For the optimized microemulsion batch, ex vivo skin flux, skin irritation and retention studies were performed and compared with a marketed 5-FU formulation. In the ex vivo skin permeation studies, higher skin retention of the drug and minimal flux were achieved for the optimized microemulsion batch than for the marketed cream. The results confirmed the actual responses to be in agreement with the predicted ones, with low residual standard errors. Controlled release of the drug was achieved for the optimized batch with higher skin retention of 5-FU, which can further be utilized for the treatment of many dermatological disorders.
Keywords: 5-FU, central composite design, microemulsion, ternary phase diagram
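For readers unfamiliar with the design, the sketch below builds a face-centred central composite design for three coded factors and fits a quadratic response surface by least squares; the layout is generic and the response values are synthetic, not the formulation data of this study.

```python
# Minimal sketch: face-centred central composite design for three coded factors
# and a quadratic response-surface fit (synthetic response values).
import itertools
import numpy as np

# factorial corners (±1), axial "face" points (alpha = 1), and centre replicates
corners = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)
faces = np.vstack([v * sign for v in np.eye(3) for sign in (-1, 1)])
center = np.zeros((3, 3))
X = np.vstack([corners, faces, center])          # 8 + 6 + 3 = 17 runs

def quad_terms(x):
    a, b, c = x
    return [1, a, b, c, a*b, a*c, b*c, a*a, b*b, c*c]

D = np.array([quad_terms(x) for x in X])
rng = np.random.default_rng(1)
y = 80 + 5*X[:, 0] - 3*X[:, 1]**2 + rng.normal(0, 0.5, len(X))  # fake drug-content response

coef, *_ = np.linalg.lstsq(D, y, rcond=None)
print(np.round(coef, 2))   # intercept, linear, interaction and quadratic coefficients
```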
Procedia PDF Downloads 479
2449 Maintenance Work Order Management Tool (Desktop & Mobile Solution)
Authors: Haitham Al Rawahi
Abstract:
Oman Electricity Transmission Company (OETC) has implemented a Computerized Maintenance Management System (CMMS), which is based on the Oracle enterprise asset management model e-AM. This was implemented in cooperation with Nama Shared Services (NSS). CMMS is mainly used to create maintenance work orders with a preconfigured workflow of defined maintenance schedules/plans, required resources and materials, obtaining shutdown approvals, completing maintenance activities, and closing the work orders. Furthermore, CMMS is also configured with asset failure classifications, asset hierarchy, asset maintenance activities, integration with spare inventories, etc. Since 2017, site engineers have been working on CMMS by manually filling in all related maintenance and inspection records on paper forms and then scanning and attaching them in CMMS for further analysis. The site engineer finalizes all paperwork at the site and then goes back to the office to scan and attach it to the work order in CMMS. This creates subtasks for the site engineer and makes the process very difficult and lengthy. Also, there is a significant risk of missing or lost fields on the paper forms because they are filled in by pen. In addition, the site engineer may spend days working outside the office. Therefore, OETC has decided to digitize these inspection and maintenance forms on one platform in CMMS that can be used both online and offline. ArcGIS product formats or web-enabled solutions, which can be accessed from mobile and desktop devices via ArcMap modules, will be used too. The purpose of the interlinking is to connect the maintenance and inspection forms to the work orders in e-AM, with which the site engineer interacts daily. This ArcGIS environment or tool is designed to link with e-AM, so that when the site engineer opens the application from the site, a window takes him through ArcGIS. This window opens the maintenance forms and shows the required fields to fill in and save through his mobile application. After the work is saved, depending on network availability (offline/online), a notification is triggered to his line manager to review it and take further action (approve/reject/request more information). In this function, the user can see the work orders assigned to his department as well as a chart of all work orders with their status. The approver is able to see the statistics of all work.
Keywords: e-AM, GIS, CMMS, integration
Procedia PDF Downloads 97
2448 An Evolutionary Multi-Objective Optimization for Airport Gate Assignment Problem
Authors: Seyedmirsajad Mokhtarimousavi, Danial Talebi, Hamidreza Asgari
Abstract:
The Gate Assignment Problem (GAP) is one of the most substantial issues in airport operations. In principle, GAP intends to maintain the maximum capacity of the airport through the best possible allocation of resources (gates) in order to reach the optimum outcome. The problem involves a wide range of dependent and independent resources and their limitations, which add to the complexity of GAP from both theoretical and practical perspectives. In this study, GAP was mathematically formulated as a three-objective problem. The primary goal of the multi-objective formulation was to address a higher number of objectives that can be simultaneously optimized and therefore increase the practical efficiency of the final solution. The problem is solved by applying the second version of the Non-dominated Sorting Genetic Algorithm (NSGA-II). Results showed that the proposed mathematical model could address most of the major criteria in the decision-making process in airport management in terms of minimizing both airport/airline cost and passenger walking time. Moreover, the proposed approach could properly find acceptable possible solutions.
Keywords: airport management, gate assignment problem, mathematical modeling, genetic algorithm, NSGA-II
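At the core of NSGA-II is non-dominated sorting of candidate solutions into Pareto fronts. The sketch below implements just that step in plain Python on hypothetical gate-assignment objective vectors; a full NSGA-II run would also need crowding-distance ranking and the usual genetic operators.

```python
# Minimal sketch of the non-dominated sorting step at the core of NSGA-II,
# applied to hypothetical (cost, walking time, ungated flights) objective vectors.
def dominates(a, b):
    """a dominates b if it is no worse in every objective and better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_sort(points):
    fronts, remaining = [], list(range(len(points)))
    while remaining:
        front = [i for i in remaining
                 if not any(dominates(points[j], points[i]) for j in remaining if j != i)]
        fronts.append(front)
        remaining = [i for i in remaining if i not in front]
    return fronts

# (airport/airline cost, total walking time, flights without gates) per candidate
candidates = [(120, 950, 2), (100, 1100, 1), (130, 900, 0), (125, 980, 2), (150, 1200, 3)]
print(non_dominated_sort(candidates))   # index lists, first list = Pareto front
```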
Procedia PDF Downloads 299
2447 Direct Conversion of Crude Oils into Petrochemicals under High Severity Conditions
Authors: Anaam H. Al-ShaikhAli, Mansour A. Al-Herz
Abstract:
The research leverages the proven HS-FCC technology to directly crack crude oils into petrochemical building blocks. Crude oils were subjected to an optimized hydroprocessing step in which metal contaminants and sulfur were reduced to a level acceptable for feeding the crudes into the HS-FCC technology. The hydroprocessing is achieved in a fixed-bed reactor composed of three layers of catalysts. The crude oil is passed through a demetallization catalyst, followed by a desulfurization catalyst and finally a de-aromatization catalyst. The hydroprocessing was conducted at an optimized liquid hourly space velocity (LHSV), temperature, and pressure for an optimal reduction of metals and sulfur from the crudes. The hydroprocessed crudes were then fed into a micro activity testing (MAT) unit to simulate the HS-FCC technology. The catalytic cracking of the crude oils was conducted over tailored catalyst formulations under an optimized catalyst/oil ratio and cracking temperature for optimal production of total light olefins.
Keywords: petrochemical, catalytic cracking, catalyst synthesis, HS-FCC technology
Procedia PDF Downloads 93
2446 Predicting Data Center Resource Usage Using Quantile Regression to Conserve Energy While Fulfilling the Service Level Agreement
Authors: Ahmed I. Alutabi, Naghmeh Dezhabad, Sudhakar Ganti
Abstract:
Data centers have been growing in size and demand continuously over the last two decades. Planning for the deployment of resources has been shallow and has always resorted to over-provisioning. Data center operators try to maximize the availability of their services by allocating multiples of the needed resources. One resource that has been wasted, with little thought, has been energy. In recent years, programmable resource allocation has paved the way for more efficient and robust data centers. In this work, we examine the predictability of resource usage in a data center environment. We use a number of models that cover a wide spectrum of machine learning categories. We then establish a framework to guarantee the client service level agreement (SLA). Our results show that using prediction can cut energy loss by up to 55%.
Keywords: machine learning, artificial intelligence, prediction, data center, resource allocation, green computing
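To illustrate why quantile prediction suits this setting, the sketch below trains upper-quantile and median models of resource usage on synthetic load data with scikit-learn's gradient boosting; provisioning at the upper quantile keeps SLA violations rare without blanket over-provisioning. The 0.95 target, feature, and load model are assumptions, not this paper's setup.

```python
# Minimal sketch (synthetic data): predicting an upper quantile of resource usage
# so that provisioning at that quantile rarely violates the SLA while avoiding
# blanket over-provisioning. The 0.95 quantile here is an assumed SLA target.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(7)
hours = rng.integers(0, 24, 2000)
load = 40 + 20 * np.sin(hours / 24 * 2 * np.pi) + rng.gamma(2.0, 5.0, 2000)  # CPU %

X = hours.reshape(-1, 1)
q95 = GradientBoostingRegressor(loss="quantile", alpha=0.95).fit(X, load)
median = GradientBoostingRegressor(loss="quantile", alpha=0.5).fit(X, load)

x_new = np.array([[9], [21]])
print("median forecast:", median.predict(x_new).round(1))
print("95th-percentile provisioning level:", q95.predict(x_new).round(1))
```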
Procedia PDF Downloads 108
2445 Developing a Machine Learning-based Cost Prediction Model for Construction Projects using Particle Swarm Optimization
Authors: Soheila Sadeghi
Abstract:
Accurate cost prediction is essential for effective project management and decision-making in the construction industry. This study aims to develop a cost prediction model for construction projects using Machine Learning techniques and Particle Swarm Optimization (PSO). The research utilizes a comprehensive dataset containing project cost estimates, actual costs, resource details, and project performance metrics from a road reconstruction project. The methodology involves data preprocessing, feature selection, and the development of an Artificial Neural Network (ANN) model optimized using PSO. The study investigates the impact of various input features, including cost estimates, resource allocation, and project progress, on the accuracy of cost predictions. The performance of the optimized ANN model is evaluated using metrics such as Mean Squared Error (MSE), Root Mean Squared Error (RMSE), Mean Absolute Error (MAE), and R-squared. The results demonstrate the effectiveness of the proposed approach in predicting project costs, outperforming traditional benchmark models. The feature selection process identifies the most influential variables contributing to cost variations, providing valuable insights for project managers. However, this study has several limitations. Firstly, the model's performance may be influenced by the quality and quantity of the dataset used. A larger and more diverse dataset covering different types of construction projects would enhance the model's generalizability. Secondly, the study focuses on a specific optimization technique (PSO) and a single Machine Learning algorithm (ANN). Exploring other optimization methods and comparing the performance of various ML algorithms could provide a more comprehensive understanding of the cost prediction problem. Future research should focus on several key areas. Firstly, expanding the dataset to include a wider range of construction projects, such as residential buildings, commercial complexes, and infrastructure projects, would improve the model's applicability. Secondly, investigating the integration of additional data sources, such as economic indicators, weather data, and supplier information, could enhance the predictive power of the model. Thirdly, exploring the potential of ensemble learning techniques, which combine multiple ML algorithms, may further improve cost prediction accuracy. Additionally, developing user-friendly interfaces and tools to facilitate the adoption of the proposed cost prediction model in real-world construction projects would be a valuable contribution to the industry. The findings of this study have significant implications for construction project management, enabling proactive cost estimation, resource allocation, budget planning, and risk assessment, ultimately leading to improved project performance and cost control. This research contributes to the advancement of cost prediction techniques in the construction industry and highlights the potential of Machine Learning and PSO in addressing this critical challenge. However, further research is needed to address the limitations and explore the identified future research directions to fully realize the potential of ML-based cost prediction models in the construction domain.
Keywords: cost prediction, construction projects, machine learning, artificial neural networks, particle swarm optimization, project management, feature selection, road reconstruction
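As a compact illustration of the core idea, the sketch below uses particle swarm optimization to tune the weights of a small neural network on synthetic cost data, in place of gradient-based training. Network size, swarm coefficients, and features are illustrative assumptions; the study's actual model, dataset, and evaluation metrics are not reproduced here.

```python
# Minimal sketch (synthetic data): PSO over the flattened weights of a small
# neural network used as a cost predictor, instead of gradient-based training.
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((200, 3))                     # e.g. estimate, progress, resources (scaled)
y = 2.0 * X[:, 0] + 0.5 * X[:, 1] * X[:, 2] + rng.normal(0, 0.05, 200)  # actual cost

H = 8                                        # hidden units; weights flattened into one vector
dim = 3 * H + H + H + 1                      # W1, b1, W2, b2

def predict(w, X):
    W1 = w[:3 * H].reshape(3, H)
    b1 = w[3 * H:3 * H + H]
    W2 = w[3 * H + H:3 * H + 2 * H]
    b2 = w[-1]
    return np.tanh(X @ W1 + b1) @ W2 + b2

def mse(w):
    return np.mean((predict(w, X) - y) ** 2)

# PSO with standard inertia / cognitive / social coefficients
n_particles, iters = 30, 200
pos = rng.normal(0, 0.5, (n_particles, dim))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([mse(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dim))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos += vel
    vals = np.array([mse(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("training MSE of PSO-optimized network:", round(mse(gbest), 5))
```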
Procedia PDF Downloads 59
2444 Discrete Breeding Swarm for Cost Minimization of Parallel Job Shop Scheduling Problem
Authors: Tarek Aboueldahab, Hanan Farag
Abstract:
The parallel Job Shop Scheduling Problem (JSP) is a multi-objective, multi-constraint NP-hard optimization problem. Traditional Artificial Intelligence techniques have been widely used; however, they can be trapped in a local minimum without reaching the optimum solution, so we propose a hybrid Artificial Intelligence (AI) model with Discrete Breeding Swarm (DBS) added to traditional Artificial Intelligence to avoid this trapping. This model is applied to the cost minimization of the Car Sequencing and Operator Allocation (CSOA) problem. The practical experiment shows that our model outperforms other techniques in cost minimization.
Keywords: parallel job shop scheduling problem, artificial intelligence, discrete breeding swarm, car sequencing and operator allocation, cost minimization
Procedia PDF Downloads 188
2443 The Effects of Urbanization on Peri-Urban Livelihood in Ghana: A Case of Kumasi Peri-Urban Communities
Authors: Charles Kwaku Oppong
Abstract:
The research linked urban expansion resulting from urbanization with the changing morphology processes happening in peri-urban communities. Two villages in the peri-urban area of Kumasi City were used as a case study. An appropriate analytical framework and methodology (literature review and empirical evidence) were employed to ensure that all pertinent issues of the peri-urban interface are brought to light. The study found that, since peri-urban livelihoods are linked to the asset base, the stock of assets as well as transformation processes were major factors in the shaping of livelihood strategies. For that reason, the success or failure of household livelihoods was seen to relate to the kind of livelihood strategy employed. In an effort to mitigate livelihood failure due to peri-urban development, households have recourse to remittances, land disposal, and other means as alternative livelihood approaches. The study calls for local government policy interventions in regulating the peri-urban transformation process and providing safety nets for the vulnerable.
Keywords: urban expansion, peri-urban interface, livelihoods, asset
Procedia PDF Downloads 265
2442 Design of a 4-DOF Robot Manipulator with Optimized Algorithm for Inverse Kinematics
Authors: S. Gómez, G. Sánchez, J. Zarama, M. Castañeda Ramos, J. Escoto Alcántar, J. Torres, A. Núñez, S. Santana, F. Nájera, J. A. Lopez
Abstract:
This paper shows in detail the mathematical model of the direct and inverse kinematics of a robot manipulator (welding type) with four degrees of freedom. Using the D-H parameters, screw theory, and numerical, geometric and interpolation methods, the theoretical and practical values of the robot's position were determined using an optimized algorithm for inverse kinematics, obtaining the values of the particular joints in order to determine the virtual paths in a relatively short time.
Keywords: kinematics, degree of freedom, optimization, robot manipulator
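The D-H convention mentioned above reduces forward kinematics to a chain of standard homogeneous transforms. The sketch below shows that chain for a generic 4-DOF arm; the D-H table and joint angles are illustrative placeholders, not the parameters of the welding manipulator in the paper.

```python
# Minimal sketch: forward kinematics of a 4-DOF arm from Denavit-Hartenberg
# parameters (illustrative DH table, not the manipulator described in the paper).
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Standard DH homogeneous transform for one joint."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(joint_angles, dh_table):
    T = np.eye(4)
    for theta, (d, a, alpha) in zip(joint_angles, dh_table):
        T = T @ dh_transform(theta, d, a, alpha)
    return T

# (d, a, alpha) per joint -- hypothetical link parameters in metres / radians
dh_table = [(0.30, 0.05, np.pi / 2), (0.0, 0.25, 0.0), (0.0, 0.20, 0.0), (0.10, 0.0, 0.0)]
q = np.radians([30, 45, -30, 10])
print(forward_kinematics(q, dh_table)[:3, 3])   # end-effector position
```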
Procedia PDF Downloads 465
2441 Combining an Optimized Closed Principal Curve-Based Method and Evolutionary Neural Network for Ultrasound Prostate Segmentation
Authors: Tao Peng, Jing Zhao, Yanqing Xu, Jing Cai
Abstract:
Due to missing/ambiguous boundaries between the prostate and neighboring structures, the presence of shadow artifacts, and the large variability in prostate shapes, ultrasound prostate segmentation is challenging. To handle these issues, this paper develops a hybrid method for ultrasound prostate segmentation by combining an optimized closed principal curve-based method and an evolutionary neural network; the former can fit curves with great curvature and generates a contour composed of line segments connected by sorted vertices, while the latter is used to express an appropriate mapping function (represented by the parameters of the evolutionary neural network) for generating a smooth prostate contour that matches the ground-truth contour. Both qualitative and quantitative experimental results showed that our proposed method achieves accurate and robust performance.
Keywords: ultrasound prostate segmentation, optimized closed polygonal segment method, evolutionary neural network, smooth mathematical model, principal curve
Procedia PDF Downloads 200
2440 Finding the Theory of Riba Avoidance: A Scoping Review to Set the Research Agenda
Authors: Randa Ismail Sharafeddine
Abstract:
The Islamic economic system is distinctive in that it implicitly recognizes money as a separate, independent component of production capable of assuming risk and so entitled to the same reward as other Entrepreneurial Factors of Production (EFP). Conventional theory does not identify money capital explicitly as a component of production; rather, interest is recognized as a reward for capital, the interest rate is the cost of money capital, and it is also seen as a cost of physical capital. The conventional theory of production examines how diverse non-entrepreneurial resources (Land, Labor, and Capital) are selected; however, the economic theory community is largely unaware of the reasons why these resources choose to remain non-entrepreneurial resources as opposed to becoming entrepreneurial resources. Should land, labor, and financial asset owners choose to work for others in return for rent, income, or interest, or should they engage in entrepreneurial risk-taking in order to profit? This is a decision made often in the actual world, but it has never been effectively treated in economic theory. This article will conduct a critical analysis of the conventional classification of factors of production and propose a classification for resource allocation and income distribution (Rent, Wages, Interest, and Profits) that is more rational, even within the conventional theoretical framework for evaluating and developing production and distribution theories. Money is an essential component of production in an Islamic economy, and it must be used to sustain economic activity.
Keywords: financial capital, production theory, distribution theory, economic activity, riba avoidance, institution of participation
Procedia PDF Downloads 91
2439 Focus-Latent Dirichlet Allocation for Aspect-Level Opinion Mining
Authors: Mohsen Farhadloo, Majid Farhadloo
Abstract:
Aspect-level opinion mining, which aims at discovering aspects (aspect identification) and their corresponding ratings (sentiment identification) from customer reviews, has increasingly attracted the attention of researchers and practitioners, as it provides valuable insights about products/services from the customers' point of view. Instead of addressing aspect identification and sentiment identification in two separate steps, it is possible to simultaneously identify both aspects and sentiments. In recent years, many graphical models based on Latent Dirichlet Allocation (LDA) have been proposed to solve both aspect and sentiment identification in a single step. Although LDA models have been effective tools for the statistical analysis of document collections, they also have shortcomings in addressing some unique characteristics of opinion mining. Our goal in this paper is to address one of the limitations of topic models to date; that is, they fail to directly model the associations among topics. Indeed, in many text corpora, it is natural to expect that subsets of the latent topics have higher probabilities. We propose a probabilistic graphical model called focus-LDA to better capture the associations among topics when applied to aspect-level opinion mining. Our experiments on real-life data sets demonstrate the improved effectiveness of the focus-LDA model in terms of the accuracy of the predictive distributions over held-out documents. Furthermore, we demonstrate qualitatively that the focus-LDA topic model provides a natural way of visualizing and exploring unstructured collections of textual data.
Keywords: aspect-level opinion mining, document modeling, Latent Dirichlet Allocation, LDA, sentiment analysis
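For context on the baseline being extended, the sketch below fits a plain LDA topic model to a few toy review snippets with scikit-learn as a stand-in for aspect discovery. This is standard LDA, not the focus-LDA extension proposed in the paper, and the corpus is invented.

```python
# Minimal sketch: plain LDA on toy review snippets as a baseline for aspect
# discovery (not the focus-LDA model described in the abstract).
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

reviews = [
    "battery life is great but the screen is dim",
    "screen resolution is sharp and bright",
    "battery drains fast, charger is slow",
    "customer service was helpful and friendly",
    "terrible service, rude support staff",
]

vec = CountVectorizer(stop_words="english")
X = vec.fit_transform(reviews)

lda = LatentDirichletAllocation(n_components=3, random_state=0).fit(X)
terms = vec.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    top = [terms[i] for i in topic.argsort()[-4:][::-1]]
    print(f"aspect-like topic {k}: {top}")
```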
Procedia PDF Downloads 94
2438 Effects of Dividend Policy on Firm Profitability and Growth in Light of Present Economic Conditions
Authors: Madani Chahinaz
Abstract:
This study aims to shed light on the impact of dividend policy on corporate profitability and its relationship to growth, considering the economic developments taking place. The study was conducted on a sample of seven companies for the period from 2014 to 2020, based on a set of determinants used to select the variables affecting dividend distribution, where the descriptive analytical approach relied on graphical data models. The study concluded that companies that follow a well-studied dividend distribution policy enjoy higher profitability rates, which contributes to enhancing their growth in light of the economic developments taking place. There is also no statistically significant relationship between the total asset growth and fixed asset growth variables and profitability. The study also concluded that the relationship between the sales volume growth variable, the self-financing ratio variable, and dividend distribution is statistically significant at the 0.05 level, with the random effects model explaining 68% of the changes in dividend distribution policy.
Keywords: dividend distribution policy, profitability, growth, self-financing ratio
Procedia PDF Downloads 9
2437 Identification and Prioritisation of Students Requiring Literacy Intervention and Subsequent Communication with Key Stakeholders
Authors: Emilie Zimet
Abstract:
During networking and NCCD moderation meetings, best practices for identifying students who require Literacy Intervention are often discussed. Once these students are identified, consideration is given to the most effective process for prioritising those who have the greatest need for Literacy Support, the allocation of resources, tracking of intervention effectiveness, and communicating with teachers/external providers/parents. Through a workshop, the group will investigate best practices to identify students who require literacy support and strategies to communicate and track their progress. In groups, participants will examine what they do in their settings and then compare with other models, including the researcher’s model, to decide on the most effective path to identification and communication. Participants will complete a worksheet at the beginning of the session to deeply consider their current approaches. The participants will be asked to critically analyse their own identification processes for Literacy Intervention, ensuring students are not overlooked if they fall into the borderline category. A cut-off for students to access intervention will be considered, so as not to place strain on already stretched resources, along with the most effective allocation of those resources. Furthermore, communicating learning needs and differentiation strategies to staff is paramount to the success of an intervention, and participants will look at the frequency of communication to share such strategies and updates. At the end of the session, the group will look at creating or evolving models that allow for best practices in the identification and communication of Literacy Interventions. The proposed outcome of this research is to develop a model for the identification of students requiring Literacy Intervention that incorporates the allocation of resources and communication to key stakeholders. This will be done by pooling information and discussing a variety of models used in the participants' school settings.
Keywords: identification, student selection, communication, special education, school policy, planning for intervention
Procedia PDF Downloads 47
2436 Design Optimization of a Compact Quadrupole Electromagnet for CLS 2.0
Authors: Md. Armin Islam, Les Dallin, Mark Boland, W. J. Zhang
Abstract:
This paper reports a study on the optimal magnetic design of a compact quadrupole electromagnet for the Canadian Light Source (CLS 2.0). The aim of the design is to obtain a quadrupole with low relative higher-order harmonics and better field quality. The design problem was formulated as an optimization model in which the objective function is the higher-order harmonics (multipole errors) and the variable to be optimized is the material distribution on the pole. The higher-order harmonics arise in the quadrupole from truncating the ideal hyperbola at a certain point to make the pole. In this project, these harmonics have been optimized both transversely and longitudinally by adjusting the material on the poles in a controlled way. For the optimization, finite element analysis (FEA) has been conducted. Lower higher-order harmonic amplitudes and better field quality have been achieved through the optimization. On the basis of the optimized magnetic design, electrical and cooling calculations have been performed for the magnet.
Keywords: drift, electrical, and cooling calculation, integrated field, magnetic field gradient, multipole errors, quadrupole
Procedia PDF Downloads 143
2435 Enhanced Growth of Microalgae Chlamydomonas reinhardtii Cultivated in Different Organic Waste and Effective Conversion of Algal Oil to Biodiesel
Authors: Ajith J. Kings, L. R. Monisha Miriam, R. Edwin Raj, S. Julyes Jaisingh, S. Gavaskar
Abstract:
Microalgae are a potential bio-source of rejuvenated solutions in various disciplines of science and technology, especially in medicine and energy. Biodiesel is replacing conventional fuels in the automobile industry, with reduced pollution and equivalent performance. Since it is a carbon-neutral fuel that recycles CO2 through photosynthesis, global warming potential can be held in control using this fuel source. One of the ways to meet the rising demand for automotive fuel is to adopt eco-friendly, green alternative fuels such as sustainable microalgal biodiesel. In this work, the microalga Chlamydomonas reinhardtii was cultivated and optimized in different media compositions developed from under-utilized waste materials at lab scale. Using the optimized process conditions, the cultures were then mass-propagated in outdoor ponds, harvested, dried, and the oils extracted for optimization under ambient conditions. The microalgal oil was subjected to a two-step esterification process using an acid catalyst to reduce the acid value (0.52 mg KOH/g) in the initial stage, followed by transesterification to maximize the biodiesel yield. The optimized esterification process parameters are methanol/oil ratio 0.32 (v/v), sulphuric acid 10 vol.%, and duration 45 min at 65 ºC. In the transesterification process, a commercially available alkali catalyst (KOH) was used and optimized to obtain a maximum biodiesel yield of 95.4%. The optimized parameters are methanol/oil ratio 0.33 (v/v), alkali catalyst 0.1 wt.%, and duration 90 min at 65 ºC with smooth stirring. Response Surface Methodology (RSM) is employed as a tool for optimizing the process parameters. The biodiesel was then characterized by standard procedures and especially by GC-MS to confirm its suitability for use in internal combustion engines.
Keywords: microalgae, organic media, optimization, transesterification, characterization
Procedia PDF Downloads 234
2434 Developing a Maturity Model of Digital Twin Application for Infrastructure Asset Management
Authors: Qingqing Feng, S. Thomas Ng, Frank J. Xu, Jiduo Xing
Abstract:
Faced with unprecedented challenges, including aging assets, lack of maintenance budget, overtaxed and inefficient usage, and an outcry from society for better service quality, today’s infrastructure systems have become the main focus of many metropolises pursuing sustainable urban development and improved resilience. Digital twin, being one of the most innovative enabling technologies nowadays, may open up new ways of tackling various infrastructure asset management (IAM) problems. A digital twin application for IAM, as its name indicates, represents an evolving digital model of the intended infrastructure that possesses functions including real-time monitoring; what-if event simulation; and scheduling, maintenance, and management optimization based on technologies like IoT, big data and AI. Up to now, there are already numerous global initiatives of digital twin applications, such as 'Virtual Singapore' and 'Digital Built Britain'. With digital twin technology progressively permeating the IAM field, it is necessary to consider the maturity of the application and how institutional or industrial digital twin application processes will evolve in the future. In order to deal with the lack of such a benchmark, a draft maturity model is developed for digital twin application in the IAM field. Firstly, an overview of current smart city maturity models is given, based on which the draft Maturity Model of Digital Twin Application for Infrastructure Asset Management (MM-DTIAM) is developed for multiple stakeholders to evaluate and derive informed decisions. The process of development follows a systematic approach with four major procedures, namely scoping, designing, populating and testing. Through in-depth literature review, interviews and focus group meetings, the key domain areas are populated, defined and iteratively tuned. Finally, a case study of several digital twin projects is conducted for self-verification. The findings of the research reveal that: (i) the developed maturity model outlines five maturity levels leading to an optimised digital twin application across the aspects of strategic intent, data, technology, governance, and stakeholder engagement; (ii) based on the case study, levels 1 to 3 are already partially implemented in some initiatives while level 4 is on the way; and (iii) more practice is still needed to refine the draft to be mutually exclusive and collectively exhaustive in key domain areas.
Keywords: digital twin, infrastructure asset management, maturity model, smart city
Procedia PDF Downloads 157
2433 Life Cycle Assessment of Biogas Energy Production from a Small-Scale Wastewater Treatment Plant in Central Mexico
Authors: Joel Bonales, Venecia Solorzano, Carlos Garcia
Abstract:
A large percentage of the wastewater generated in developing countries does not receive any treatment, which leads to numerous environmental impacts. In response to this, a paradigm change from the current wastewater treatment model based on large-scale plants towards a model based on small and medium scales has been proposed. Nevertheless, small-scale wastewater treatment (SS-WWTP) with novel technologies such as anaerobic digesters, as well as the utilization of derivative co-products such as biogas, still presents diverse environmental impacts which must be assessed. This study consisted of a Life Cycle Assessment (LCA) performed on an SS-WWTP which treats wastewater from a small commercial block in the city of Morelia, Mexico. The treatment performed in the SS-WWTP consists of anaerobic and aerobic digesters with a daily capacity of 5,040 L. Two different scenarios were analyzed: the current plant conditions and a hypothetical energy use of the biogas obtained in situ. Furthermore, two different allocation criteria were applied: full impact allocation to the system’s main product (treated water) and substitution credits for replacing Mexican grid electricity (biogas) and clean water pumping (treated water). The results showed that the analyzed plant had larger impacts than those reported in the literature on the basis of wastewater volume treated, which may imply that this plant is currently operating inefficiently. The evaluated impacts appeared to be concentrated in the aerobic digestion and electricity generation phases due to the plant’s particular configuration. Additional findings show that the allocation criteria applied are crucial for the interpretation of impacts and that the energy use of the biogas obtained in this plant can help mitigate associated climate change impacts. It is concluded that SS-WWTP is an environmentally sound alternative for wastewater treatment from a systemic perspective. However, this type of study must be careful in the selection of the allocation criteria and replaced products, since these factors have a great influence on the results of the assessment.
Keywords: biogas, life cycle assessment, small scale treatment, wastewater treatment
Procedia PDF Downloads 124
2432 Well Inventory Data Entry: Utilization of Developed Technologies to Progress the Integrated Asset Plan
Authors: Danah Al-Selahi, Sulaiman Al-Ghunaim, Bashayer Sadiq, Fatma Al-Otaibi, Ali Ameen
Abstract:
In light of recent changes affecting the Oil & Gas Industry, optimization measures have become imperative for all companies globally, including Kuwait Oil Company (KOC). To keep abreast of the dynamic market, a detailed Integrated Asset Plan (IAP) was developed to drive optimization across the organization, which was facilitated through the in-house developed software “Well Inventory Data Entry” (WIDE). This comprehensive and integrated approach enabled the centralization of all planned asset components for better well planning and enhancement of performance, and facilitated continuous improvement through performance tracking and midterm forecasting. Traditionally, this was hard to achieve as, in the past, various legacy methods were used. This paper briefly describes the methods successfully adopted to meet the company’s objective. IAPs were initially designed using computerized spreadsheets. However, as the captured data became more complex and the number of stakeholders requiring and updating this information grew, the need to automate the conventional spreadsheets became apparent. WIDE, already existing in other aspects of the company (namely, the Workover Optimization project), was utilized to meet the dynamic requirements of the IAP cycle. With the growth of extensive features to enhance the planning process, the tool evolved into a centralized data hub for all asset groups and technical support functions to analyze and infer from, leading WIDE to become the reference two-year operational plan for the entire company. To achieve WIDE’s goal of operational efficiency, asset groups continuously add their parameters in a series of predefined workflows that enable the creation of a structured process which allows risk factors to be flagged and helps mitigate them. The tool dictates assigned responsibilities for all stakeholders in a manner that enables continuous updates of daily performance measures and operational use. The reliable availability of WIDE, combined with its user-friendliness and easy accessibility, created a platform of cross-functionality amongst all asset groups and technical support groups for updating the contents of their respective planning parameters. The home-grown tool was implemented across the entire company and tailored to feed the internal processes of several stakeholders across the company. Furthermore, the implementation of change management and root cause analysis techniques captured the dysfunctionality of previous plans, which in turn resulted in the improvement of already existing planning mechanisms within the IAP. The detailed elucidation of the two-year plan flagged upcoming risks and shortfalls foreseen in the plan. All results were translated into a series of developments that propelled the tool’s capabilities beyond planning and into operations (such as Asset Production Forecasts, setting KPIs, and estimating operational needs). This process exemplifies the ability and reach of applying advanced development techniques to seamlessly integrate the planning parameters of various assets and technical support groups. These techniques enable enhanced integration of planning data workflows that ultimately lay the foundation for greater accuracy and reliability. As such, benchmarks establishing a set of standard goals are created to ensure the constant improvement of the efficiency of the entire planning and operational structure.
Keywords: automation, integration, value, communication
Procedia PDF Downloads 146