Search results for: cost analysis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 30656

30416 Solving Operating Room Scheduling Problem by Using Dispatching Rule

Authors: Yang-Kuei Lin, Yin-Yi Chou

Abstract:

In this research, we consider the operating room scheduling problem. The objective is to minimize total operating cost, which comprises idle cost and overtime cost. We propose a dispatching rule that is guaranteed to find feasible solutions for the studied problem efficiently. We compared the proposed dispatching rule with the optimal solutions found by solving an Integer Programming formulation, and with solutions obtained from modified versions of existing dispatching rules. The computational results indicate that the proposed heuristic finds near-optimal solutions efficiently.
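The abstract does not spell out the rule itself, so the sketch below only illustrates the general shape of such a dispatching rule under stated assumptions: surgeries (represented here just by their durations) are dispatched longest-first to the room that frees up earliest, and the resulting schedule is costed as idle time plus overtime against a fixed shift length. All names and figures are hypothetical, not the paper's method.

```python
# Hypothetical greedy dispatching rule for operating room scheduling:
# sort surgeries by duration (longest first) and assign each to the
# room that becomes free earliest, then cost the schedule.

def dispatch(surgeries, n_rooms, shift_len, idle_rate, overtime_rate):
    """Assign surgery durations to rooms; return (assignment, total cost)."""
    rooms = [[] for _ in range(n_rooms)]
    finish = [0] * n_rooms
    for dur in sorted(surgeries, reverse=True):      # longest first
        r = finish.index(min(finish))                # earliest-free room
        rooms[r].append(dur)
        finish[r] += dur
    # idle cost for rooms finishing early, overtime cost for late ones
    cost = sum(idle_rate * max(shift_len - f, 0) +
               overtime_rate * max(f - shift_len, 0) for f in finish)
    return rooms, cost
```

Any priority order and room-choice rule can be swapped in; the point is that a single pass over the surgery list always yields a feasible schedule, which is what makes dispatching rules attractive for this problem.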

Keywords: assignment, dispatching rule, operating rooms, scheduling

Procedia PDF Downloads 204
30415 Comparative Study on Inhibiting Factors of Cost and Time Control in Nigerian Construction Practice

Authors: S. Abdulkadir, I. Y. Moh’d, S. U. Kunya, U. Nuruddeen

Abstract:

The budgeted cost and the estimated duration of a project form the basis of any contract between client and contractor. These variables are of paramount importance to a project's sponsor and in assessing the success or viability of construction projects. Despite the availability of various techniques for cost and time control, many projects fail to achieve their initial estimated cost and time. This paper evaluates the inhibiting factors of cost and time control in Nigerian construction practice and compares the result with United Kingdom practice as identified in earlier research. The population of the study comprises construction professionals within Bauchi and Gombe states, Nigeria; judgmental sampling was employed in determining the number of respondents. Descriptive statistics were used to analyze the data in SPSS. Design changes, project fraud and corruption, and financing and payment for completed work were found to be common among the top five inhibiting factors of cost and time control in the study area. Furthermore, the result showed broad agreement, with slight contrasts, with United Kingdom practice. The study recommends adopting the mitigation measures developed in the UK, after assessing their effectiveness, and developing mitigation measures for the other top factors not covered by those developed in United Kingdom practice. It also recommends a wider comparative assessment of the modified inhibiting factors of cost and time control revealed by the study, covering most parts of Nigeria.

Keywords: comparison, cost, inhibiting factor, United Kingdom, time

Procedia PDF Downloads 409
30414 Design Transformation to Reduce Cost in Irrigation Using Value Engineering

Authors: F. S. Al-Anzi, M. Sarfraz, A. Elmi, A. R. Khan

Abstract:

Researchers are responding to the environmental challenges of Kuwait in localized, innovative, effective, and economic ways. One of the most significant natural challenges is lack of water and desertification. In this research, the project team focuses on redesigning a prototype, using Value Engineering Methodology, that would provide functionality similar to the well-known Waterboxx technology while reducing capital and operational costs and simplifying manufacturing and use by regular farmers. The design employs used tires and recycled plastic sheets as raw materials. Hence, this approach helps not just in fighting desertification but also in getting rid of the ever-growing tire dumpsters in Kuwait and in avoiding the hazards of tire fires, yielding a safer and friendlier environment. Several alternatives for implementing the prototype were considered, and the best alternative in terms of value was selected after a thorough Function Analysis System Technique (FAST) exercise. A prototype was fabricated and tested in a controlled, simulated lab environment, to be followed by field testing in a real environment. Water and soil analyses were conducted on the experiment site to compare the composition of the soil before and after the experiment, to ensure that the prototype being tested is environmentally safe. Experimentation shows that the design was as effective as, and may exceed, the original design, with significant savings in cost: an estimated total cost reduction of 43.84% over the original design using the VE approach. This cost reduction does not consider the intangible environmental benefits of waste recycling, which may further increase the total savings of the alternative VE design.
This case study shows that Value Engineering Methodology can be an important tool in innovating new designs that reduce costs.

Keywords: desertification, functional analysis, scrap tires, value engineering, waste recycling, water irrigation rationing

Procedia PDF Downloads 167
30413 A Data-Driven Optimal Control Model for the Dynamics of Monkeypox in a Variable Population with a Comprehensive Cost-Effectiveness Analysis

Authors: Martins Onyekwelu Onuorah, Jnr Dahiru Usman

Abstract:

Introduction: In the realm of public health, the threat posed by Monkeypox continues to elicit concern, prompting rigorous studies to understand its dynamics and devise effective containment strategies. Particularly significant is its recurrence in variable populations, such as the observed outbreak in Nigeria in 2022. In light of this, our study undertakes a meticulous analysis, employing a data-driven approach to explore, validate, and propose optimized intervention strategies tailored to the distinct dynamics of Monkeypox within varying demographic structures. Utilizing a deterministic mathematical model, we delved into the intricate dynamics of Monkeypox, with a particular focus on a variable population context. Our qualitative analysis provided insights into the disease-free equilibrium, revealing its stability when R0 is less than one and discounting the possibility of backward bifurcation, as substantiated by the presence of a single stable endemic equilibrium. The model was rigorously validated using real-time data from the Nigerian 2022 recorded cases for Epi weeks 1 – 52. Transitioning from qualitative to quantitative, we augmented our deterministic model with optimal control, introducing three time-dependent interventions to scrutinize their efficacy and influence on the epidemic's trajectory. Numerical simulations unveiled a pronounced impact of the interventions, offering a data-supported blueprint for informed decision-making in containing the disease. A comprehensive cost-effectiveness analysis employing the Infection Averted Ratio (IAR), Average Cost-Effectiveness Ratio (ACER), and Incremental Cost-Effectiveness Ratio (ICER) facilitated a balanced evaluation of the interventions’ economic and health impacts. In essence, our study epitomizes a holistic approach to understanding and mitigating Monkeypox, intertwining rigorous mathematical modeling, empirical validation, and economic evaluation. 
The insights derived not only bolster our comprehension of Monkeypox's intricate dynamics but also unveil optimized, cost-effective interventions. This integration of methodologies and findings underscores a pivotal stride towards aligning public health imperatives with economic sustainability, marking a significant contribution to global efforts in combating infectious diseases.
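The two cost-effectiveness ratios central to the evaluation above can be illustrated with a short calculation. The strategy names and all figures below are hypothetical, not the paper's results: ACER is the total cost of a strategy per infection it averts, and ICER is the extra cost per extra infection averted when moving from a cheaper strategy to a more effective one.

```python
# Illustrative computation of ACER and ICER; all numbers hypothetical.

def acer(cost, infections_averted):
    """Average Cost-Effectiveness Ratio: cost per infection averted."""
    return cost / infections_averted

def icer(costlier, cheaper):
    """Incremental Cost-Effectiveness Ratio between two strategies,
    each given as (total cost, infections averted)."""
    (c1, e1), (c0, e0) = costlier, cheaper
    return (c1 - c0) / (e1 - e0)

# hypothetical strategies, ordered by effectiveness
A = (5000.0, 400)    # e.g. one intervention alone
B = (9000.0, 650)    # e.g. two interventions combined

acer_A = acer(*A)    # 12.5 currency units per infection averted
icer_BA = icer(B, A) # 16.0 extra units per extra infection averted
```

In practice, strategies are sorted by effectiveness and dominated ones are removed before ICERs are compared, which is how a single most cost-effective strategy is identified.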

Keywords: monkeypox, equilibrium states, stability, bifurcation, optimal control, cost-effectiveness

Procedia PDF Downloads 36
30412 Finding Viable Pollution Routes in an Urban Network under a Predefined Cost

Authors: Dimitra Alexiou, Stefanos Katsavounis, Ria Kalfakakou

Abstract:

In an urban area, transportation routes should be planned so as to minimize the pollution they provoke while taking into account their cost. In what follows, such routes are referred to as pollution routes. The transportation network is expressed by a weighted graph G = (V, E, D, P), where every vertex represents a location to be served and E contains unordered pairs (edges) of elements of V, each indicating a simple road. Each road is assigned a distance/cost and a weight depicting the air pollution provoked by a vehicle traversing it; these are the items of the sets D and P, respectively. Furthermore, the investigated pollution routes must not exceed predefined values for the route cost and the route pollution level incurred during the vehicle transition. In this paper, we present an algorithm that generates such routes so that the decision maker can select the most appropriate one.
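The paper's own algorithm is not given in the abstract; as a minimal sketch of the problem setting, the depth-first enumeration below generates simple paths whose cumulative cost and cumulative pollution both stay within the predefined bounds. The graph, bounds, and enumeration strategy are illustrative assumptions only.

```python
# Sketch: enumerate simple "pollution routes" from src to dst whose
# total cost and total pollution stay within predefined bounds.

def pollution_routes(adj, src, dst, max_cost, max_poll):
    """adj maps u -> list of (v, cost, pollution) for each road (u, v)."""
    routes = []

    def dfs(u, path, c, p):
        if c > max_cost or p > max_poll:      # prune over-budget prefixes
            return
        if u == dst:
            routes.append((tuple(path), c, p))
            return
        for v, dc, dp in adj.get(u, []):
            if v not in path:                 # keep the path simple
                path.append(v)
                dfs(v, path, c + dc, p + dp)
                path.pop()

    dfs(src, [src], 0, 0)
    return routes

# hypothetical network: roads as (neighbour, cost, pollution)
adj = {'A': [('B', 2, 1), ('C', 4, 1)], 'B': [('D', 3, 5)], 'C': [('D', 1, 1)]}
routes = pollution_routes(adj, 'A', 'D', 6, 4)
```

Here the cheaper A-B-D route is rejected for exceeding the pollution bound, while A-C-D survives, which is exactly the bi-criteria trade-off the decision maker would inspect.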

Keywords: bi-criteria, pollution, shortest paths, computation

Procedia PDF Downloads 343
30411 A Ground Structure Method to Minimize the Total Installed Cost of Steel Frame Structures

Authors: Filippo Ranalli, Forest Flager, Martin Fischer

Abstract:

This paper presents a ground structure method to optimize the topology and discrete member sizing of steel frame structures in order to minimize total installed cost, including material, fabrication, and erection components. The proposed method improves upon existing cost-based ground structure methods by incorporating constructability considerations as well as satisfying both strength and serviceability constraints. The architecture for the method is a bi-level Multidisciplinary Feasible (MDF) architecture in which the discrete member sizing optimization is nested within the topology optimization process. For each structural topology generated, the sizing optimization process seeks a set of discrete member sizes that results in the lowest total installed cost while satisfying strength (member utilization) and serviceability (node deflection and story drift) criteria. To assess cost accurately, the connection details for the structure are generated automatically using site-specific cost information obtained directly from fabricators and erectors. Member continuity rules are also applied at each node in the structure to improve constructability. The proposed optimization method is benchmarked against conventional weight-based ground structure optimization methods, yielding cost savings of up to 30% on average with comparable computational efficiency.

Keywords: cost-based structural optimization, cost-based topology and sizing, optimization, steel frame ground structure optimization, multidisciplinary optimization of steel structures

Procedia PDF Downloads 309
30410 Income and Factor Analysis of Small Scale Broiler Production in Imo State, Nigeria

Authors: Ubon Asuquo Essien, Okwudili Bismark Ibeagwa, Daberechi Peace Ubabuko

Abstract:

The broiler poultry subsector is dominated by small-scale production with low aggregate output. The high cost of inputs currently experienced in Nigeria tends to aggravate the situation; hence, many broiler farmers struggle to break even. This study was designed to examine income and input factors in small-scale deep-litter broiler production in Imo State, Nigeria. Specifically, the study examined the socio-economic characteristics of small-scale broiler farmers using the deep-litter system; estimated the costs and returns of broiler production in the area; analyzed input factors in broiler production; and examined the marketing age and profitability of the enterprise. A multi-stage sampling technique was adopted in selecting 60 small-scale broiler farmers who use the deep-litter system from 6 communities, through the use of a structured questionnaire. The socio-economic characteristics of the broiler farmers and the profitability and marketing age of the birds were described using descriptive statistical tools such as frequencies, means, and percentages. Gross margin analysis was used to analyze the costs of and returns to broiler production, while a Cobb-Douglas production function was employed to analyze input factors. The results revealed that the costs of feed (P<0.1), deep-litter material (P<0.05), and medication (P<0.05) had a significant positive relationship with the gross return of broiler farmers in the study area, while the costs of labour, fuel, and day-old chicks were not significant. Furthermore, the gross profit margin was 80.7% for farmers who market their broilers at the 8th week of rearing, and 78.7% and 60.8% for farmers who market at the 10th and 12th weeks, respectively. The business is therefore profitable, but to varying degrees.
Government and development partners should make deliberate efforts to curb the current rise in the prices of poultry feed, drugs, and the timber materials used as bedding, so as to widen the profit margin and encourage more farmers to enter the business. Farmers equally need more technical assistance from extension agents with regard to timely and profitable marketing.
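The gross margin analysis used above is a simple revenue-minus-variable-cost calculation; the sketch below illustrates it with hypothetical per-cycle figures, not the survey data.

```python
# Gross margin analysis sketch: gross margin = revenue - total variable
# cost, with the margin also expressed as a percentage of revenue.
# All figures are hypothetical.

def gross_margin(revenue, variable_costs):
    tvc = sum(variable_costs.values())
    gm = revenue - tvc
    return gm, 100.0 * gm / revenue

# hypothetical per-cycle variable costs for one small flock
costs = {"feed": 300.0, "chicks": 120.0, "litter": 30.0, "medication": 50.0}
gm, pct = gross_margin(1000.0, costs)
```

Comparing `pct` across flocks marketed at different ages is how the 8th-, 10th-, and 12th-week margins reported in the abstract would be contrasted.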

Keywords: broilers, factor analysis, income, small scale

Procedia PDF Downloads 41
30409 Cost-Effective and Optimal Control Analysis for Mitigation Strategy to Chocolate Spot Disease of Faba Bean

Authors: Haileyesus Tessema Alemneh, Abiyu Enyew Molla, Oluwole Daniel Makinde

Abstract:

Introduction: Faba bean is one of the most important plants grown worldwide for humans and animals. Despite its diverse significance, faba bean output is limited by several biotic and abiotic factors. Many faba bean pathogens have been reported, of which the most important yield-limiting disease is chocolate spot disease (Botrytis fabae). The dynamics of disease transmission, and decision-making processes for disease-control intervention programs, are now better understood through the use of mathematical modeling, and many mathematical modeling researchers are currently interested in plant disease modeling. Objective: In this paper, a deterministic mathematical model for chocolate spot disease (CSD) on the faba bean plant, together with an optimal control model, is developed and analyzed to identify the best strategy for controlling CSD. Methodology: Three control interventions, prevention (u1), quarantine (u2), and chemical control (u3), are employed to establish the optimal control model. The optimality system, the characterization of the controls, the adjoint variables, and the Hamiltonian are all generated employing Pontryagin's maximum principle. A cost-effective approach is chosen from a set of possible integrated strategies using the incremental cost-effectiveness ratio (ICER). The forward-backward sweep iterative approach is used to run the numerical simulations. Results: The Hamiltonian, the optimality system, the characterization of the controls, and the adjoint variables were established. The numerical results demonstrate that each integrated strategy can reduce the disease within the specified period. However, due to limited resources, an integrated strategy of prevention and uprooting was found to be the most cost-effective strategy to combat CSD.
Conclusion: Attention should therefore be given by stakeholders and policymakers to this integrated, cost-effective, and environmentally friendly strategy to control CSD, and the integrated intervention should be disseminated to farmers in order to fight the spread of CSD in the faba bean population and produce the expected yield from the field.
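The forward-backward sweep named in the methodology can be shown on a deliberately tiny problem. The paper's model has several states and three controls; the scalar linear-quadratic example below (minimize the integral of x² + u² subject to x' = -x + u, x(0) = 1) is an assumption chosen only to expose the mechanics: integrate the state forward, integrate the adjoint backward from its terminal condition, then update the control from the Hamiltonian's stationarity condition dH/du = 2u + λ = 0, with relaxation.

```python
# Minimal forward-backward sweep for a scalar optimal control problem
# (illustrative stand-in, not the paper's CSD model).
# H = x^2 + u^2 + lam*(-x + u)  =>  u* = -lam/2,  lam' = -2x + lam.

def sweep(T=1.0, n=200, iters=60, relax=0.5):
    dt = T / n
    x = [0.0] * (n + 1)
    lam = [0.0] * (n + 1)
    u = [0.0] * (n + 1)
    for _ in range(iters):
        x[0] = 1.0
        for k in range(n):                         # forward: x' = -x + u
            x[k + 1] = x[k] + dt * (-x[k] + u[k])
        lam[n] = 0.0                               # free terminal state
        for k in range(n, 0, -1):                  # backward: lam' = -2x + lam
            lam[k - 1] = lam[k] - dt * (-2 * x[k] + lam[k])
        u = [(1 - relax) * uk + relax * (-lk / 2)  # from dH/du = 0
             for uk, lk in zip(u, lam)]
    x[0] = 1.0                                     # final forward pass
    for k in range(n):
        x[k + 1] = x[k] + dt * (-x[k] + u[k])
    cost = dt * sum(xk * xk + uk * uk for xk, uk in zip(x[:-1], u[:-1]))
    return x, lam, u, cost
```

At convergence the stationarity residual 2u + λ vanishes, and the controlled cost falls below the uncontrolled (u = 0) cost, which is the qualitative behaviour the paper's simulations rely on.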

Keywords: CSD, optimal control theory, Pontryagin’s maximum principle, numerical simulation, cost-effectiveness analysis

Procedia PDF Downloads 35
30408 Energy-Led Sustainability Assessment Approach for Energy-Efficient Manufacturing

Authors: Aldona Kluczek

Abstract:

In recent years, manufacturing processes have had to address sustainability issues in cost-effective ways that minimize energy use, decrease negative impacts on the environment, and are safe for society. However, attention has mostly been on separate sustainability assessment methods considering energy and material flow, energy consumption, emission release, or process control. In this paper, an energy-led sustainability assessment approach combining three methods, energy Life Cycle Assessment to assess environmental impact, Life Cycle Cost to analyze costs, and Social Life Cycle Assessment, through an 'energy LCA-based value stream map', is used to assess the energy sustainability of the hardwood lumber manufacturing process in terms of its technologies. The approach, integrating environmental, economic, and social issues, can be visualized for the considered energy-efficient technologies on a map of energy LCA-related (input and output) inventory data. It enables the most efficient technology for a given process to be identified through an effective analysis of energy flow. It also indicates that interventions in the considered technology should focus on environmental and economic improvements to achieve energy sustainability. The results indicate that the most intense energy losses are caused by the cogeneration technology, and the environmental impact analysis shows that a substantial reduction of 34% can be achieved by improving it. From the LCC point of view, the improvement appears cost-effective when implemented at the plant where it is used. On the social dimension, every component of plant labor energy use in the life cycle of lumber production has positive energy benefits. The energy required to install the energy-efficient technology amounts to 30.32 kJ, the highest value among the components of plant labor energy in terms of energy-related social indicators.
The paper uses the example of hardwood lumber production to demonstrate the applicability of the sustainability assessment method.

Keywords: energy efficiency, energy life cycle assessment, life cycle cost, social life cycle analysis, manufacturing process, sustainability assessment

Procedia PDF Downloads 211
30407 Essentiality of Core Strategic Vision in Continuous Cost Reduction Management

Authors: Lai Ving Kam

Abstract:

Many markets are maturing, consumer buying power is weakening, and customer preferences change rapidly. To survive, many firms adopt fast-paced continuous cost reduction and competitive pricing to remain relevant. Marketers' desire to push for more sales to increase revenue has intensified competition and at times cannibalized products and markets. Rapid technological change has created both hope and despair for industries. The pressure to constantly reduce cost, on the one hand, and to create and market new products at cheaper prices and with shorter life cycles, on the other, has become a continuous endeavour, and the twin trends appear irreconcilable. Can a core strategic vision provide and adapt new directions in continuous cost reduction? This study investigates whether a core strategic vision is able to meet this need, allowing firms to survive and stay profitable. In today's uncertain markets, are firms falling back on their core strategic visions to take them out of unfavourable positions?

Keywords: core strategic vision, continuous cost reduction, fashionable products industry, competitive pricing

Procedia PDF Downloads 290
30406 Addressing the Exorbitant Cost of Labeling Medical Images with Active Learning

Authors: Saba Rahimi, Ozan Oktay, Javier Alvarez-Valle, Sujeeth Bharadwaj

Abstract:

Successful application of deep learning in medical image analysis necessitates unprecedented amounts of labeled training data. Unlike conventional 2D applications, radiological images can be three-dimensional (e.g., CT, MRI), consisting of many instances within each image. The problem is exacerbated when expert annotations are required for effective pixel-wise labeling, which incurs exorbitant labeling effort and cost. Active learning is an established research domain that aims to reduce labeling workload by prioritizing a subset of informative unlabeled examples to annotate. Our contribution is a cost-effective approach for U-Net 3D models that uses Monte Carlo sampling to analyze pixel-wise uncertainty. Experiments on the AAPM 2017 lung CT segmentation challenge dataset show that our proposed framework can achieve promising segmentation results by using only 42% of the training data.
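The selection step of such an active learning loop can be sketched independently of the network itself. The code below is an illustrative assumption, not the paper's implementation: it abstracts the 3D U-Net with Monte Carlo sampling into any callable that returns per-pixel foreground probabilities, runs T stochastic passes per unlabeled volume, scores pixels by predictive entropy, and ranks volumes so the most uncertain are annotated first.

```python
import math

# Sketch of uncertainty-based sample selection for active learning.
# `mc_forward` stands in for a stochastic model forward pass (e.g. a
# 3D U-Net with dropout active); volumes are flattened pixel lists.

def entropy(p, eps=1e-12):
    """Binary predictive entropy of one pixel's mean probability."""
    return -(p * math.log(p + eps) + (1 - p) * math.log(1 - p + eps))

def rank_by_uncertainty(volumes, mc_forward, T=10):
    """Return volume indices, most uncertain first."""
    scores = []
    for v in volumes:
        passes = [mc_forward(v) for _ in range(T)]       # T MC samples
        means = [sum(px) / T for px in zip(*passes)]     # per-pixel mean
        scores.append(sum(entropy(p) for p in means) / len(means))
    return sorted(range(len(volumes)), key=lambda i: -scores[i])
```

A volume whose pixels hover near probability 0.5 has high entropy and is prioritized over one the model already predicts confidently, which is how labeling effort is concentrated on informative examples.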

Keywords: image segmentation, active learning, convolutional neural network, 3D U-Net

Procedia PDF Downloads 116
30405 Economic Analysis of Cassava Value Chain by Farmers in Ilesa West Local Government Area of Osun State

Authors: Maikasuwa Mohammed Abubakar, Okebiorun Ola, M. H. Sidi, Ala Ahmed Ladan, Ango Aabdullahi Kamba

Abstract:

This study examines the economics of the cassava value chain among farmers in Ilesa West Local Government Area of Osun State. A simple random sampling technique was used to collect data from 200 respondents from purposively selected wards in the L.G.A. The data collected were analyzed using budgetary analysis and a value addition model. The results show that the average total cost incurred by input dealers was ₦9,062,127.74, while the average net profit realized was ₦1,038,102.40. Other actors, namely producers, processors, and marketers, incurred average total costs of ₦23,324.00, ₦130,177.00, and ₦523,755.00 per production season, respectively, and the average net profit realized was ₦102,614.00 for cassava producers, ₦51,131.00 for cassava processors, and ₦79,045.00 for cassava marketers during the production season. Further analysis shows that the return per naira invested was ₦0.1 for cassava input dealers, ₦4.4 for cassava producers, ₦0.40 for cassava processors, and ₦0.20 for cassava marketers, indicating that the rate of return was higher in cassava production than in the other segments of the cassava value chain. However, the value added by cassava producers (₦102,536.16/season) was the highest when compared with the value added by cassava processors (₦51,853.82/season) and cassava marketers (₦100,885.56/season).

Keywords: cassava, value chain, Ilesa West, Nigeria

Procedia PDF Downloads 294
30404 Application of Transportation Linear Programming Algorithms to Cost Reduction in Nigeria Soft Drinks Industry

Authors: Salami Akeem Olanrewaju

Abstract:

Transportation models or problems are primarily concerned with the optimal (best possible) way in which a product produced at different factories or plants (called supply origins) can be transported to a number of warehouses or customers (called demand destinations). The objective in a transportation problem is to fully satisfy the destination requirements, within the operating production capacity constraints, at the minimum possible cost. The objective of this study is to determine ways of minimizing transportation cost in order to maximize profit. Data were gathered from the records of the Distribution Department of 7-Up Bottling Company Plc, Ilorin, Kwara State, Nigeria. The data were analyzed using SPSS (Statistical Package for the Social Sciences) while applying three standard methods of solving a transportation problem. The three methods produced the same results; therefore, any of the methods can be adopted by the company in transporting its final products to the wholesale dealers in order to minimize total cost.
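The abstract does not name the three methods; a common trio for building an initial feasible solution is the northwest-corner method, the least-cost method, and the Vogel approximation method. As a hedged illustration, the sketch below implements only the least-cost idea (fill the cheapest routes first) for a balanced problem, with hypothetical plant and warehouse data.

```python
# Least-cost method sketch for a balanced transportation problem
# (total supply == total demand). Data are hypothetical.

def least_cost_method(supply, demand, cost):
    supply, demand = supply[:], demand[:]          # don't mutate inputs
    alloc = [[0] * len(demand) for _ in supply]
    cells = sorted((cost[i][j], i, j)
                   for i in range(len(supply))
                   for j in range(len(demand)))
    for c, i, j in cells:                          # cheapest routes first
        q = min(supply[i], demand[j])
        alloc[i][j] = q
        supply[i] -= q
        demand[j] -= q
    total = sum(alloc[i][j] * cost[i][j]
                for i in range(len(alloc)) for j in range(len(alloc[0])))
    return alloc, total

# 2 plants, 2 warehouses: supplies, demands, unit shipping costs
alloc, total = least_cost_method([20, 30], [25, 25], [[2, 4], [3, 1]])
```

This only gives an initial feasible solution; in practice it would be refined to optimality (e.g. by the stepping-stone or MODI method, or a linear programming solver).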

Keywords: cost minimization, resources utilization, distribution system, allocation problem

Procedia PDF Downloads 223
30403 Analysis of the Omnichannel Delivery Network with Application to Last Mile Delivery

Authors: Colette Malyack, Pius Egbelu

Abstract:

Business-to-Consumer (B2C) delivery options have improved to meet increased demand in recent years. The change in end users has forced logistics networks to focus on customer service and sentiment, which would previously have been the priority of the company or organization of origin. This has put increased pressure on logistics companies to extend traditional B2B networks into a B2C solution while accommodating additional costs, roadblocks, and customer sentiment; the result has been the creation of the omnichannel delivery network, encompassing a number of traditional and modern methods of package delivery. In this paper, the many solutions within the omnichannel delivery network are defined and discussed. This analysis shows that the omnichannel delivery network can be applied to reduce the complexity of package delivery and provide customers with more options. Applied correctly, the result is a reduction in cost to the logistics company over time, even with an initial increase in cost to obtain the technology.

Keywords: network planning, last mile delivery, omnichannel delivery network, omnichannel logistics

Procedia PDF Downloads 112
30402 Ecological-Economics Evaluation of Water Treatment Systems

Authors: Hwasuk Jung, Seoi Lee, Dongchoon Ryou, Pyungjong Yoo, Seokmo Lee

Abstract:

The Nakdong River, used as the drinking water source for the Busan metropolitan city, is vulnerable in terms of water management because industrial areas are located on the upper Nakdong River. Most citizens of Busan think that the water quality of the Nakdong River is not good, so they boil tap water or use home filters before drinking it, which imposes unnecessary individual costs on Busan citizens. Intake therefore needs to be diversified to reduce this cost and to supplement the vulnerable water source. Against this background, this study carried out the environmental accounting of the Namgang dam water treatment system compared to the Nakdong River water treatment system, using the emergy analysis method to support reasonable decision-making. The emergy analysis method quantitatively evaluates both the natural environment and human economic activities in an equal unit of measure. The emergy transformity of the Namgang dam's water was 1.16 times larger than that of the Nakdong River's water; the Namgang dam's water shows a larger emergy transformity because of its better water quality. The emergy used in making 1 m3 of tap water in the Namgang dam water treatment system was 1.26 times larger than in the Nakdong River system, owing to the construction cost of a new pipeline for taking in Namgang dam water. If the won used in making 1 m3 of tap water in the Nakdong River water treatment system is 1, the Namgang dam system used 1.66; in Em-won terms, the Namgang dam system used 1.26. The cost-benefit ratio in Em-won was thus smaller than that in won. When emergy analysis is used, which accounts for the benefit of the natural environment such as the good water quality of the Namgang dam, the Namgang dam water treatment system could be a good alternative for diversifying the intake source.

Keywords: emergy, emergy transformity, Em-won, water treatment system

Procedia PDF Downloads 267
30401 Congestion Mitigation on an Urban Arterial through Infrastructure Intervention

Authors: Attiq Ur Rahman Dogar, Sohaib Ishaq

Abstract:

Pakistan has experienced rapid motorization in the last decade. Due to soft leasing schemes from banks and an increase in average household income, even the middle class can now afford cars. The public transit system is inadequate and sparse. For these reasons, traffic demand on urban arterials has increased manifold, and poor urban transit planning and aging transportation systems have resulted in traffic congestion. The focus of this study is to improve traffic flow on a section of the N-5 passing through downtown Rawalpindi. The present effort analyzes traffic conditions on this section and investigates the impact of traffic signal coordination on travel time. In addition to signal coordination, we also examined the effect of different infrastructure improvements on travel time. After an economic analysis of the alternatives and discussions, an improvement plan for the Rawalpindi downtown urban arterial section is proposed for implementation.

Keywords: signal coordination, infrastructure intervention, infrastructure improvement, cycle length, fuel consumption cost, travel time cost, economic analysis, travel time, Rawalpindi, Pakistan, traffic signals

Procedia PDF Downloads 290
30400 Modelling Conceptual Quantities Using Support Vector Machines

Authors: Ka C. Lam, Oluwafunmibi S. Idowu

Abstract:

Uncertainty in cost is a major factor affecting performance of construction projects. To our knowledge, several conceptual cost models have been developed with varying degrees of accuracy. Incorporating conceptual quantities into conceptual cost models could improve the accuracy of early predesign cost estimates. Hence, the development of quantity models for estimating conceptual quantities of framed reinforced concrete structures using supervised machine learning is the aim of the current research. Using measured quantities of structural elements and design variables such as live loads and soil bearing pressures, response and predictor variables were defined and used for constructing conceptual quantities models. Twenty-four models were developed for comparison using a combination of non-parametric support vector regression, linear regression, and bootstrap resampling techniques. R programming language was used for data analysis and model implementation. Gross soil bearing pressure and gross floor loading were discovered to have a major influence on the quantities of concrete and reinforcement used for foundations. Building footprint and gross floor loading had a similar influence on beams and slabs. Future research could explore the modelling of other conceptual quantities for walls, finishes, and services using machine learning techniques. Estimation of conceptual quantities would assist construction planners in early resource planning and enable detailed performance evaluation of early cost predictions.
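Of the techniques the abstract combines, the bootstrap resampling step is easy to show in isolation. The sketch below is an assumption-laden stand-in for the paper's workflow: it estimates a 95% percentile interval for a mean quantity (say, concrete volume per unit of building footprint) from a small hypothetical sample; the support vector regression component is omitted.

```python
import random

# Percentile bootstrap sketch: resample the data with replacement many
# times and read off quantiles of the resampled means. Sample values
# are hypothetical quantities, not the study's data.

def bootstrap_ci(sample, n_boot=2000, alpha=0.05, seed=42):
    rng = random.Random(seed)
    means = sorted(
        sum(rng.choices(sample, k=len(sample))) / len(sample)
        for _ in range(n_boot))
    lo = means[int(n_boot * alpha / 2)]
    hi = means[int(n_boot * (1 - alpha / 2)) - 1]
    return lo, hi

# e.g. concrete volume (m3) per m2 of footprint across 8 past projects
lo, hi = bootstrap_ci([0.31, 0.28, 0.35, 0.30, 0.33, 0.29, 0.32, 0.34])
```

Reporting an interval rather than a point estimate is what lets a planner attach uncertainty to early predesign quantities, which is the motivation the abstract gives for combining regression with bootstrapping.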

Keywords: bootstrapping, conceptual quantities, modelling, reinforced concrete, support vector regression

Procedia PDF Downloads 182
30399 Component Level Flood Vulnerability Framework for the United Kingdom

Authors: Mohammad Shoraka, Francesco Preti, Karen Angeles, Raulina Wojtkiewicz, Karthik Ramanathan

Abstract:

Catastrophe modeling has evolved significantly over the last four decades. Verisk introduced its pioneering comprehensive inland flood model tailored for the U.K. in 2008, and over the course of the last 15 years has built a suite of physically driven flood models for several countries and regions across the globe. This paper spotlights a selection of these advancements related to the development of vulnerability estimation, which forms an integral part of a forthcoming update to Verisk's U.K. inland flood model. Vulnerability functions are critical to evaluating and robustly modeling flood-induced damage to buildings and contents; the resulting damage assessments then allow direct quantification of losses for entire building portfolios. Notably, today's flood loss models tend to prioritize enhanced hazard characterization, while vulnerability functions often lack sufficient granularity for a robust assessment. This study proposes a novel, engineering-driven, physically based component-level flood vulnerability framework for the U.K. Various aspects of the framework are discussed, including component classification and a comprehensive cost analysis tailored to capture the distinct building characteristics unique to the U.K. The analysis elucidates how the cost distribution across individual components contributes to translating component-level damage functions into building-level damage functions. Finally, a succinct overview is given of the essential datasets employed to gauge regional building vulnerability.
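The translation from component-level to building-level damage functions can be sketched as a cost-weighted roll-up: each component carries a share of building replacement cost and its own depth-damage curve, and the building damage ratio at a given flood depth is the share-weighted sum. The components, shares, and curves below are hypothetical illustrations, not Verisk's data.

```python
# Sketch: roll component depth-damage curves up into a building-level
# damage ratio, weighting each component by its cost share.
# All components, shares, and curve points are hypothetical.

def interp(curve, depth):
    """Piecewise-linear damage fraction from (depth, fraction) points."""
    pts = sorted(curve)
    if depth <= pts[0][0]:
        return pts[0][1]
    for (d0, f0), (d1, f1) in zip(pts, pts[1:]):
        if depth <= d1:
            return f0 + (f1 - f0) * (depth - d0) / (d1 - d0)
    return pts[-1][1]

components = {            # cost share, depth (m) -> damage fraction
    "flooring":  (0.20, [(0.0, 0.0), (0.3, 1.0)]),
    "walls":     (0.50, [(0.0, 0.0), (1.5, 0.8)]),
    "services":  (0.30, [(0.0, 0.0), (1.0, 0.6)]),
}

def building_damage(depth):
    return sum(share * interp(curve, depth)
               for share, curve in components.values())
```

This makes explicit why the cost distribution across components matters: shifting cost share from flooring to services, for example, changes the building-level curve even if every component curve stays the same.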

Keywords: catastrophe modeling, inland flood, vulnerability, cost analysis

Procedia PDF Downloads 31
30398 The Influence of Website Quality on Customer E-Satisfaction in Low Cost Airline

Authors: Zainab Khalifah, Wong Chiet Bing, Noor Hazarina Hashim

Abstract:

The evolution of customer behavior in purchasing products or services through the Internet has led airline companies to engage in e-ticketing in order to maintain their business. A well-designed website is vitally important for airline companies to provide effective communication, support, and competitive advantage. This study was conducted to identify the dimensions of website quality for a low cost airline and to investigate the relationship between website quality and customer e-satisfaction at a low cost airline. A total of 381 responses were collected by convenience sampling among local passengers at the Low Cost Carrier Terminal, Kuala Lumpur, via questionnaire distribution. The study found that the five determinant factors of website quality for AirAsia were Information Content, Navigation, Responsiveness, Personalization, and Security and Privacy. The results revealed a positive relationship between the five dimensions of website quality and customer e-satisfaction, with information content being the most significant contributor to customer e-satisfaction.

Keywords: website quality, customer e-satisfaction, low cost airline, e-ticketing

Procedia PDF Downloads 384
30397 Identification of Coauthors in Scientific Database

Authors: Thiago M. R Dias, Gray F. Moita

Abstract:

The analysis of scientific collaboration networks has contributed significantly to understanding how collaboration between researchers takes place and how the scientific production of researchers or research groups evolves. However, identifying collaborations in large scientific databases is not a trivial task, given the high computational cost of the methods commonly used. This paper proposes a method for identifying collaboration in large databases of researcher curricula. The proposed method has low computational cost and yields satisfactory results, proving to be an interesting alternative for the modeling and characterization of large scientific collaboration networks.
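One low-cost way to identify collaborations without all-pairs comparison is to group publication records by a normalized key, so that cost stays roughly linear in the number of records; a minimal sketch with illustrative data (the paper's actual matching method may differ):

```python
from collections import defaultdict
from itertools import combinations

# Each researcher's curriculum lists their publications. Grouping records by a
# normalized publication key avoids comparing every pair of curricula.
curricula = {
    "alice": ["Graph mining at scale", "Deep learning survey"],
    "bob":   ["Graph Mining at Scale", "Query optimization notes"],
    "carol": ["Deep Learning Survey"],
}

def normalize(title):
    return " ".join(title.lower().split())

by_publication = defaultdict(set)
for author, titles in curricula.items():
    for t in titles:
        by_publication[normalize(t)].add(author)

# Any publication shared by two or more curricula yields collaboration edges.
edges = set()
for authors in by_publication.values():
    for a, b in combinations(sorted(authors), 2):
        edges.add((a, b))

print(sorted(edges))
```

The resulting edge set is the coauthorship network, ready for standard network analysis.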

Keywords: extraction, data integration, information retrieval, scientific collaboration

Procedia PDF Downloads 359
30396 Fast Approximate Bayesian Contextual Cold Start Learning (FAB-COST)

Authors: Jack R. McKenzie, Peter A. Appleby, Thomas House, Neil Walton

Abstract:

Cold-start is a notoriously difficult problem that can occur in recommendation systems and arises when there is insufficient information to draw inferences about users or items. To address this challenge, a contextual bandit algorithm, the Fast Approximate Bayesian Contextual Cold Start Learning algorithm (FAB-COST), is proposed. It is designed to provide improved accuracy compared to the traditionally used Laplace approximation in the logistic contextual bandit, while controlling both algorithmic complexity and computational cost. To this end, FAB-COST uses a combination of two moment-projection variational methods: Expectation Propagation (EP), which performs well at the cold start but becomes slow as the amount of data increases; and Assumed Density Filtering (ADF), which has slower growth of computational cost with data size but requires more data to reach an acceptable level of accuracy. By switching from EP to ADF when the dataset becomes large, FAB-COST exploits their complementary strengths. Empirical justification for FAB-COST is presented and systematically compared to other approaches on simulated data. In a benchmark against the Laplace approximation on real data consisting of over 670,000 impressions from autotrader.co.uk, FAB-COST demonstrates at one point an increase of over 16% in user clicks. On the basis of these results, it is argued that FAB-COST is likely to be an attractive approach to cold-start recommendation systems in a variety of contexts.
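To illustrate the flavor of assumed density filtering, here is a one-dimensional ADF update for a *probit* likelihood, which has closed-form Gaussian moments (the paper's logistic setting needs numerical moment matching, so this is a simplified stand-in, not the authors' implementation):

```python
import math

def phi(x):   # standard normal pdf
    return math.exp(-0.5 * x * x) / math.sqrt(2 * math.pi)

def Phi(x):   # standard normal cdf
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def adf_probit_update(mu, var, y):
    """One ADF step: project the posterior after observing label y in {-1, +1}
    under a probit likelihood back onto a Gaussian by moment matching."""
    z = y * mu / math.sqrt(1 + var)
    r = phi(z) / Phi(z)
    mu_new = mu + y * var * r / math.sqrt(1 + var)
    var_new = var - var * var * r * (z + r) / (1 + var)
    return mu_new, var_new

# Stream of positive labels: the posterior mean should drift positive and
# the variance should shrink as evidence accumulates.
mu, var = 0.0, 1.0
for _ in range(20):
    mu, var = adf_probit_update(mu, var, +1)
print(mu, var)
```

Each update costs O(1) per observation regardless of how much data has been seen, which is why ADF scales better than EP as the dataset grows.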

Keywords: cold-start learning, expectation propagation, multi-armed bandits, Thompson Sampling, variational inference

Procedia PDF Downloads 78
30395 Enabling Oral Communication and Accelerating Recovery: The Creation of a Novel Low-Cost Electroencephalography-Based Brain-Computer Interface for the Differently Abled

Authors: Rishabh Ambavanekar

Abstract:

Expressive Aphasia (EA) is an oral disability, common among stroke victims, in which the Broca’s area of the brain is damaged, interfering with verbal communication abilities. EA currently has no technological solutions; its only viable remedies are inefficient or available only to the affluent. This prompts the need for an affordable, innovative solution to facilitate recovery and assist in speech generation. This project proposes a novel concept: using a wearable low-cost electroencephalography (EEG) device-based brain-computer interface (BCI) to translate a user’s inner dialogue into words. A low-cost EEG device was developed and found to be 10 to 100 times less expensive than any current EEG device on the market. As part of the BCI, a machine learning (ML) model was developed and trained using the EEG data. Two stages of testing were conducted to analyze the effectiveness of the device: a proof-of-concept test and a final solution test. The proof-of-concept test demonstrated an average accuracy above 90%, and the final solution test demonstrated an average accuracy above 75%. These two successful tests demonstrate the viability of BCI research in developing lower-cost verbal communication devices. Additionally, the device not only enables users to communicate verbally but also has the potential to assist in accelerated recovery from the disorder.

Keywords: neurotechnology, brain-computer interface, neuroscience, human-machine interface, BCI, HMI, aphasia, verbal disability, stroke, low-cost, machine learning, ML, image recognition, EEG, signal analysis

Procedia PDF Downloads 87
30394 Input-Output Analysis in Laptop Computer Manufacturing

Authors: H. Z. Ulukan, E. Demircioğlu, M. Erol Genevois

Abstract:

The scope of this paper and the aim of the proposed model are to apply monetary input-output (I-O) analysis to highlight the importance of reusing know-how and other requirements in order to reduce production costs in the manufacturing process for a laptop computer. The I-O approach, using the monetary input-output model, is employed to demonstrate the impacts of different factors in the manufacturing process. A sensitivity analysis showing the correlation between these factors is also presented. The recommended model is expected to have an advantageous effect on the cost-minimization process.
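Monetary I-O analysis rests on the Leontief relation x = (I - A)^(-1) d; a minimal sketch with an illustrative two-sector technical coefficient matrix (the sector split and all coefficients are assumptions, not the paper's data):

```python
import numpy as np

# Technical coefficients A[i, j]: monetary input from sector i needed per
# unit of output of sector j (illustrative two-sector example:
# component manufacturing and laptop assembly).
A = np.array([
    [0.10, 0.30],
    [0.05, 0.15],
])

# Final demand d for each sector's output (monetary units).
d = np.array([100.0, 400.0])

# Total output x solves x = A @ x + d, i.e. x = (I - A)^{-1} d.
x = np.linalg.solve(np.eye(2) - A, d)
print(x)

# A sensitivity check of the kind the abstract mentions: perturb one
# coefficient slightly and observe the change in total output requirements.
A2 = A.copy()
A2[0, 1] += 0.05
x2 = np.linalg.solve(np.eye(2) - A2, d)
print(x2 - x)
```

Comparing x2 to x shows how sensitive total production requirements, and hence costs, are to each technical coefficient.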

Keywords: input-output analysis, monetary input-output model, manufacturing process, laptop computer

Procedia PDF Downloads 363
30393 Immobilization of Lipase Enzyme by Low Cost Material: A Statistical Approach

Authors: Md. Z. Alam, Devi R. Asih, Md. N. Salleh

Abstract:

The immobilization of lipase enzyme produced from palm oil mill effluent (POME) on activated carbon (AC), selected from among low-cost support materials, was optimized. The results indicated that 94% immobilization was achieved with AC as the most suitable support material. A sequential optimization strategy based on a statistical experimental design, including the one-factor-at-a-time (OFAT) method, was used to determine the equilibrium time. Three components influencing lipase immobilization were optimized by response surface methodology (RSM) based on a face-centered central composite design (FCCCD). From the statistical analysis of the results, the optimum enzyme loading concentration, agitation rate, and activated carbon dosage were found to be 30 U/ml, 300 rpm, and 8 g/L respectively, with a maximum immobilization activity of 3732.9 U/g-AC after 2 hours of immobilization. Analysis of variance (ANOVA) showed a high regression coefficient (R²) of 0.999, indicating a satisfactory fit of the model to the experimental data. The parameters were statistically significant at p<0.05.
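A face-centered central composite design places factorial corners, axial points on the faces (alpha = 1), and replicated center points; a minimal generator for the three factors studied. The factor ranges below are illustrative assumptions, chosen so the design center happens to coincide with the abstract's reported optimum (30 U/ml, 300 rpm, 8 g/L):

```python
from itertools import product

def face_centered_ccd(k, center_replicates=1):
    """Coded design points for a face-centered central composite design:
    2^k factorial corners, 2k axial points at +/-1 on each axis (alpha = 1),
    and replicated center points."""
    corners = [list(p) for p in product([-1, 1], repeat=k)]
    axial = []
    for i in range(k):
        for level in (-1, 1):
            pt = [0] * k
            pt[i] = level
            axial.append(pt)
    center = [[0] * k for _ in range(center_replicates)]
    return corners + axial + center

# Three factors from the abstract: enzyme loading, agitation rate, AC dosage.
design = face_centered_ccd(3, center_replicates=3)
print(len(design))  # 8 corners + 6 axial + 3 center = 17 runs

# Decode a coded point to actual units (illustrative factor ranges).
ranges = {"enzyme_U_per_ml": (10, 50), "agitation_rpm": (100, 500), "ac_g_per_L": (2, 14)}
def decode(coded):
    return {name: (lo + hi) / 2 + coded[i] * (hi - lo) / 2
            for i, (name, (lo, hi)) in enumerate(ranges.items())}
print(decode([0, 0, 0]))
```

Fitting a second-order response surface to the immobilization activity measured at these 17 runs is what RSM then uses to locate the optimum.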

Keywords: activated carbon, POME based lipase, immobilization, adsorption

Procedia PDF Downloads 213
30392 Create a Dynamic Model in Project Control and Management

Authors: Hamed Saremi, Shahla Saremi

Abstract:

In this study, the control and management of construction projects is evaluated by developing a dynamic model. The model is used to evaluate planning assumptions and to review the effectiveness of project control policies, drawing on previous research on time, cost, schedule-pressure management, resource management, and project control. Elements and sub-systems from cost management, such as estimating the consumption budget from the overall budget and the effects of budget shortages, are added to the model. Using sensitivity analysis, the model was simulated in VENSIM software. Assuming optimistic activity times, incorporating information on work progress and change rates, a 15% annual inflation rate, and cost rates in accordance with planned amounts, the project is forecast at 373 days (2 days sooner than originally forecast) with a final profit of $1,960,670 (23% of the contract amount).

Keywords: dynamic planning, cost, time, performance, project management

Procedia PDF Downloads 437
30391 Determination Optimum Strike Price of FX Option Call Spread with USD/IDR Volatility and Garman–Kohlhagen Model Analysis

Authors: Bangkit Adhi Nugraha, Bambang Suripto

Abstract:

In September 2016, Bank Indonesia (BI) released regulation no.18/18/PBI/2016, which permits bank clients to use the FX option call spread on USD/IDR. This product combines a client buying an FX call option (paying a premium) and selling an FX call option (receiving a premium) to protect against currency depreciation while capping the potential upside, at a low premium cost. BI classifies this product as a structured product, that is, a combination of at least two financial instruments, either derivative or non-derivative. The call spread is the first structured product against IDR permitted by BI since 2009, in response to increased demand from Indonesian firms for FX hedging through derivatives to protect their foreign-currency assets or liabilities against market risk. The share of hedging products in the Indonesian FX market increased from 35% in 2015 to 40% in 2016, with the majority in swap products (FX forward, FX swap, cross-currency swap). Swap pricing is driven by the interest rate differential of the currency pair. The cost of a swap is about 7% for USD/IDR, with one-year USD/IDR volatility at 13%. This cost level makes swap products seem expensive to hedging buyers. Because the call spread cost (around 1.5-3%) is cheaper than a swap, most Indonesian firms use NDF FX call spreads on USD/IDR offshore, with an outstanding amount of around 10 billion USD. The cheaper cost of the call spread is its main advantage for hedging buyers. The problem arises because BI regulation requires the call spread buyer to perform dynamic hedging: if the buyer chooses strike price 1 and strike price 2 and the USD/IDR exchange rate surpasses strike price 2, the buyer must purchase another call spread with strike price 1’ (strike price 1’ = strike price 2) and strike price 2’ (strike price 2’ > strike price 1‘).
This can double the premium cost of the call spread, or more, defeating the buyer’s purpose of finding the cheapest hedging cost. It is therefore crucial for the buyer to choose the optimum strike prices before entering into the transaction. To help hedging buyers find the optimum strike price and avoid expensive multiple premium costs, we examined ten years (2005-2015) of historical USD/IDR volatility data and compared it with the price movement of the USD/IDR call spread using the Garman–Kohlhagen model (a common formula for FX option pricing). We used statistical tools to analyze data correlation, understand the nature of call spread price movements over the ten years, and determine the factors affecting price movement. We selected ranges of strike prices and tenors and calculated the probability of dynamic hedging occurring and its cost. We found the USD/IDR currency pair to be too uncertain, making dynamic hedging riskier and more expensive. We validated this result using one year of data and found a small RMS error. The study's results can be used to understand the nature of the FX call spread and to determine the optimum strike price for a hedging plan.
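The Garman–Kohlhagen pricing of a call spread named in the abstract can be sketched directly; the spot level, rates, volatility, and strikes below are illustrative assumptions, not market data from the study:

```python
import math

def norm_cdf(x):
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def gk_call(S, K, T, rd, rf, sigma):
    """Garman-Kohlhagen price of an FX call: S is spot in domestic per foreign
    unit, rd/rf the domestic/foreign risk-free rates, sigma the annualized
    volatility, T the tenor in years."""
    d1 = (math.log(S / K) + (rd - rf + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * math.exp(-rf * T) * norm_cdf(d1) - K * math.exp(-rd * T) * norm_cdf(d2)

def call_spread_premium(S, K1, K2, T, rd, rf, sigma):
    """Buy a call at strike K1 and sell a call at strike K2 > K1."""
    return gk_call(S, K1, T, rd, rf, sigma) - gk_call(S, K2, T, rd, rf, sigma)

# Illustrative USD/IDR inputs (assumptions, not the study's data)
S, T, rd, rf, sigma = 13500.0, 1.0, 0.07, 0.01, 0.13
premium = call_spread_premium(S, 13500.0, 14500.0, T, rd, rf, sigma)
print(premium, premium / S)  # premium and its cost as a fraction of spot
```

Sweeping K1 and K2 over candidate strike ranges with this pricer, and weighting the extra premium by the probability of the spot breaching K2, is the kind of optimum-strike search the abstract describes.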

Keywords: FX call spread USD/IDR, USD/IDR volatility statistical analysis, Garman–Kohlhagen Model on FX Option USD/IDR, Bank Indonesia Regulation no.18/18/PBI/2016

Procedia PDF Downloads 350
30390 A Query Optimization Strategy for Autonomous Distributed Database Systems

Authors: Dina K. Badawy, Dina M. Ibrahim, Alsayed A. Sallam

Abstract:

A distributed database is a collection of logically related databases that cooperate in a transparent manner. Query processing, which uses a communication network for transmitting data between sites, is one of the challenges in the database world. The development of sophisticated query optimization technology is a reason for the commercial success of database systems, and its complexity and cost increase with the number of relations in the query. Mariposa, query trading, and query trading with processing-task trading are strategies developed for autonomous distributed database systems, but they incur high optimization cost because all nodes are involved in generating an optimal plan. In this paper, we propose a modification of the autonomous strategy K-QTPT that gives the seller nodes with the lowest cost progressively higher priorities, in order to reduce the optimization time. We implemented our proposed strategy and present the results and their analysis.

Keywords: autonomous strategies, distributed database systems, high priority, query optimization

Procedia PDF Downloads 489
30389 The Impact of Voluntary Disclosure Level on the Cost of Equity Capital in Tunisian's Listed Firms

Authors: Nouha Ben Salah, Mohamed Ali Omri

Abstract:

This paper examines the association between disclosure level and the cost of equity capital in Tunisian listed firms. The relation is tested using two models. The first tests the relation directly by regressing firm-specific estimates of the cost of equity capital on market beta, firm size, and a measure of disclosure level. The second introduces information asymmetry as a mediator variable, following the approach suggested by Baron and Kenny (1986) for demonstrating the role of a mediator variable. Based on a sample of 21 non-financial Tunisian listed firms over the period 2000 to 2004, the results show that greater disclosure is associated with a lower cost of equity capital. However, the results for the indirect relationship indicate a significant positive association between the level of voluntary disclosure and information asymmetry, and a significant negative association between information asymmetry and the cost of equity capital, contrary to our predictions. This result may be due to bias in the measure of information asymmetry.
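The Baron and Kenny (1986) mediation procedure cited in the abstract consists of three regressions; a minimal sketch on fully synthetic data in which disclosure lowers the cost of equity partly through information asymmetry (the coefficients and sample are illustrative, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200

def ols_coefs(X, y):
    """OLS coefficients of y on the columns of X, with an intercept added."""
    Xd = np.column_stack([np.ones(len(X))] + [X[:, j] for j in range(X.shape[1])])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    return beta

disclosure = rng.normal(0, 1, n)
# Mediator: higher disclosure -> lower information asymmetry (synthetic)
asymmetry = -0.6 * disclosure + rng.normal(0, 0.5, n)
# Outcome: asymmetry raises the cost of equity; a small direct effect remains
cost_of_equity = 0.8 * asymmetry - 0.1 * disclosure + rng.normal(0, 0.5, n)

# Baron & Kenny (1986) steps:
c_total = ols_coefs(disclosure[:, None], cost_of_equity)[1]   # step 1: X -> Y
a_path = ols_coefs(disclosure[:, None], asymmetry)[1]         # step 2: X -> M
coefs = ols_coefs(np.column_stack([disclosure, asymmetry]), cost_of_equity)
c_direct, b_path = coefs[1], coefs[2]                         # step 3: X + M -> Y

print(c_total, a_path, b_path, c_direct)
```

Mediation is indicated when the a and b paths are significant and the direct effect c' shrinks relative to the total effect c; significance testing is omitted here for brevity.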

Keywords: cost of equity capital, voluntary disclosure, information asymmetry, Tunisian listed non-financial firms

Procedia PDF Downloads 481
30388 MAS Capped CdTe/ZnS Core/Shell Quantum Dot Based Sensor for Detection of Hg(II)

Authors: Dilip Saikia, Suparna Bhattacharjee, Nirab Adhikary

Abstract:

In this work, we present the synthesis and characterization of CdTe/ZnS core/shell (CS) quantum dots (QDs). The CS QDs are used as a fluorescence probe to design a simple, cost-effective, and ultrasensitive sensor for the detection of toxic Hg(II) in an aqueous medium. Mercaptosuccinic acid (MSA) was used as the capping agent for the synthesis of the CdTe/ZnS CS QDs. A photoluminescence quenching mechanism is used for the detection of Hg(II). The designed sensing technique shows a remarkably low detection limit of about 1 picomolar (pM). The CS QDs are synthesized by a simple one-pot aqueous method and characterized using advanced diagnostic tools such as UV-vis, photoluminescence, XRD, FTIR, TEM, and zeta potential analysis. The interaction between the CS QDs and Hg(II) ions quenches the photoluminescence (PL) intensity of the QDs via excited-state electron transfer. The proposed mechanism is explained using cyclic voltammetry and zeta potential analysis. The designed sensor is highly selective towards Hg(II) ions. Analysis of real samples such as drinking water and tap water was carried out, and the CS QDs show remarkably good results. Using this simple sensing method, we designed a prototype low-cost electronic device for the detection of Hg(II) in an aqueous medium. The experimental results of the designed sensor were cross-checked using AAS analysis.
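PL quenching sensors of this kind are commonly calibrated with a Stern–Volmer plot, F0/F = 1 + Ksv[Q]; this calibration is a standard approach for quenching-based sensors and is not stated in the abstract, so the sketch below, on noiseless synthetic data, is illustrative only:

```python
import numpy as np

# Synthetic quenching data: F0/F rises linearly with quencher concentration
# (Stern-Volmer behavior). Concentrations in picomolar, intensities arbitrary.
conc_pM = np.array([0.0, 1.0, 2.0, 5.0, 10.0])
F0 = 1000.0                      # unquenched PL intensity
Ksv_true = 0.08                  # Stern-Volmer constant per pM, illustrative
F = F0 / (1 + Ksv_true * conc_pM)

# Fit F0/F - 1 = Ksv * [Q] by least squares through the origin.
y = F0 / F - 1.0
Ksv_fit = float(np.dot(conc_pM, y) / np.dot(conc_pM, conc_pM))
print(Ksv_fit)

# Invert the calibration to estimate an unknown Hg(II) concentration.
F_sample = 800.0
conc_est = (F0 / F_sample - 1.0) / Ksv_fit
print(conc_est)
```

Inverting the fitted line is how a measured intensity drop maps back to an Hg(II) concentration in a real sample.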

Keywords: photoluminescence, quantum dots, quenching, sensor

Procedia PDF Downloads 239
30387 Application of Discrete-Event Simulation in Health Technology Assessment: A Cost-Effectiveness Analysis of Alzheimer’s Disease Treatment Using Real-World Evidence in Thailand

Authors: Khachen Kongpakwattana, Nathorn Chaiyakunapruk

Abstract:

Background: Decision-analytic models for Alzheimer’s disease (AD) have advanced to discrete-event simulation (DES), in which individual-level modelling of disease progression across continuous severity spectra and the incorporation of key parameters, such as treatment persistence, become feasible. This study aimed to apply DES to a cost-effectiveness analysis of AD treatment in Thailand. Methods: A dataset of Thai patients with AD, representing unique demographic and clinical characteristics, was bootstrapped to generate a baseline cohort of patients. Each patient was cloned and assigned to donepezil, galantamine, rivastigmine, memantine, or no treatment. Throughout the simulation period, the model randomly assigned each patient to discrete events including hospital visits, treatment discontinuation, and death. Correlated changes in cognitive and behavioral status over time were modelled using patient-level data. Treatment effects were obtained from the most recent network meta-analysis. Treatment persistence, mortality, and predictive equations for functional status, costs (Thai baht (THB) in 2017), and quality-adjusted life years (QALY) were derived from country-specific real-world data. The time horizon was 10 years, with a discount rate of 3% per annum. Cost-effectiveness was evaluated against the willingness-to-pay (WTP) threshold of 160,000 THB/QALY gained (4,994 US$/QALY gained) in Thailand. Results: Under a societal perspective, only the prescription of donepezil was found to be cost-effective for AD patients across all disease-severity levels. Compared to untreated patients, patients receiving donepezil incurred an additional discounted cost of 2,161 THB but gained 0.021 discounted QALYs, yielding an incremental cost-effectiveness ratio (ICER) of 138,524 THB/QALY (4,062 US$/QALY).
Moreover, providing early treatment with donepezil to mild AD patients further reduced the ICER to 61,652 THB/QALY (1,808 US$/QALY). However, the advantage of donepezil waned when treatment was delayed to a subgroup of moderate and severe AD patients [ICER: 284,388 THB/QALY (8,340 US$/QALY)]. Introducing a treatment stopping rule for the mild AD cohort when the Mini-Mental State Exam (MMSE) score falls below 10 did not reduce the cost-effectiveness of donepezil at the current treatment persistence level. On the other hand, none of the AD medications was cost-effective under a healthcare perspective. Conclusions: DES greatly enhances the real-world representativeness of decision-analytic models for AD. Under a societal perspective, treatment with donepezil improves patients’ quality of life and is cost-effective for AD patients across all disease-severity levels in Thailand. Treatment benefits are greatest when donepezil is prescribed early in the course of AD. Given healthcare budget constraints in Thailand, coverage of donepezil is most likely feasible starting with mild AD patients, together with the stopping rule.
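The ICER arithmetic behind the abstract's comparisons can be sketched directly; the per-patient figures below are illustrative (the QALY gain is back-calculated from the reported ICER, since the abstract's 0.021 is rounded):

```python
WTP_THB_PER_QALY = 160_000  # Thai willingness-to-pay threshold (from the abstract)

def icer(cost_new, cost_ref, qaly_new, qaly_ref):
    """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
    d_cost = cost_new - cost_ref
    d_qaly = qaly_new - qaly_ref
    if d_qaly <= 0:
        raise ValueError("comparator dominates or no QALY gain")
    return d_cost / d_qaly

def is_cost_effective(cost_new, cost_ref, qaly_new, qaly_ref, wtp=WTP_THB_PER_QALY):
    # Dominant strategy: cheaper and at least as effective.
    if cost_new <= cost_ref and qaly_new >= qaly_ref:
        return True
    return icer(cost_new, cost_ref, qaly_new, qaly_ref) <= wtp

# Illustrative per-patient increments over no treatment: +2,161 THB
# discounted cost and roughly +0.0156 discounted QALYs.
ratio = icer(2161, 0, 0.0156, 0)
print(round(ratio), is_cost_effective(2161, 0, 0.0156, 0))
```

A strategy is judged cost-effective when its ICER against the next-best comparator falls below the WTP threshold, which is exactly the comparison the abstract makes for donepezil.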

Keywords: Alzheimer's disease, cost-effectiveness analysis, discrete event simulation, health technology assessment

Procedia PDF Downloads 98