Search results for: Cost assessment
10665 Agency Cost, Firm Performance, Corporate Governance: Evidence from Indonesia
Authors: Arnold Sanda Layuk
Abstract:
Fraud in the disclosure of financial statements by management shows that agency conflict is an important issue in the company. The conflict has consequences for the agency costs that must be borne and has an impact on the firm's performance. This study investigates the effect of agency costs on firm performance, as well as whether variables such as corporate governance mechanisms can positively moderate the relationship between agency cost and firm performance. Agency cost is measured by the asset utilization ratio and the discretionary expenditure ratio. Firm performance is represented by return on equity. Data were collected from the manufacturing companies listed on the Indonesia Stock Exchange from 2015 to 2019 and regressed as panel data using the panel-corrected standard error (PCSE) model. According to the findings, agency costs are negatively related to firm performance, which supports previous empirical research findings. The study also found that the relationship between agency cost and firm performance is significantly moderated by board size and ownership concentration as representatives of corporate governance mechanisms. This suggests that corporate governance can become a tool to reduce agency costs and improve firm performance. The empirical evidence adds to previous research on agency conflict, particularly in emerging markets. These findings are expected to supplement previous research and provide additional information to shareholders in order to control opportunistic management decisions that affect their investments and discretionary operational expenses.
Keywords: agency cost, corporate governance, asset utilization ratio, firm performance
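The moderation effect described in this abstract can be illustrated with a toy regression. The sketch below uses synthetic data and plain least squares rather than the paper's PCSE estimator; all variable names and numbers are hypothetical, chosen only to show how an interaction term between agency cost and board size captures a governance moderation effect.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Hypothetical firm-year data: an agency-cost proxy and a governance moderator.
agency_cost = rng.uniform(0.1, 0.9, n)            # e.g. discretionary expenditure ratio
board_size = rng.integers(4, 12, n).astype(float)  # governance mechanism
noise = rng.normal(0, 0.01, n)

# Simulated ROE: agency cost hurts performance; board size dampens the effect.
roe = 0.20 - 0.30 * agency_cost + 0.015 * agency_cost * board_size + noise

# Moderated regression: ROE ~ 1 + AC + BS + AC*BS (interaction = moderation).
X = np.column_stack([np.ones(n), agency_cost, board_size, agency_cost * board_size])
beta, *_ = np.linalg.lstsq(X, roe, rcond=None)

# A negative beta[1] with a positive beta[3] is the moderation pattern
# the study reports: governance weakens the negative agency-cost effect.
print(beta)
```

The interaction coefficient is the quantity of interest: its sign and significance indicate whether the governance variable moderates the agency cost-performance link.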
Procedia PDF Downloads 193
10664 Neuropsychological Assessment and Rehabilitation Settings for Developmental Dyslexia in Children in Greece: The Use of Music at Intervention Protocols
Authors: Argyris B. Karapetsas, Rozi M. Laskaraki, Aikaterini A. Karapetsa, Maria Bampou, Valentini N. Vamvaka
Abstract:
The main aim of the current protocol is to demonstrate the contribution of neuropsychology to both assessment and rehabilitation settings for children with dyslexia. Objectives: The purpose of this study was to evaluate the significant role of neuropsychological assessment, including both psychometric and electrophysiological tests, and to investigate the effectiveness of a music-based auditory training program designed for children with Developmental Dyslexia (DD). Materials: Forty-five third- and fourth-grade students with DD participated in our study, along with a matched control group (n=45). Method: In the first phase of the protocol, children underwent a clinical assessment including both electrophysiological tests, i.e., Event-Related Potentials (ERPs), especially the P300 waveform, and psychometric tests, conducted in the Laboratory of Neuropsychology at the University of Thessaly in Volos, Greece. The assessment results confirmed statistically significantly lower performance for children with DD on all tests compared to the typical readers of the control group. After evaluation, a subgroup of children with DD participated in a rehabilitation program including digitized musical auditory training activities. Results: The electrophysiological recordings after the intervention revealed shorter P300 latency values for children with DD, close to those of the control group, indicating the beneficial effects of the intervention, thus enabling children to develop reading skills and become successful readers. Discussion: Similar research data confirm the crucial role of neuropsychology in both the diagnosis and treatment of common disorders observed in children. Indeed, as for DD, there is growing evidence that brain activity dysfunction does occur, as confirmed by neuropsychological assessment, and that musical auditory training may have remedial effects.
Conclusions: The outcomes of the current study suggest that, given the neurobiological origin of DD, neuropsychology may provide the means for both neuropsychological assessment and rehabilitation, enabling professionals to cope with cerebral dysfunction and recovery more efficiently.
Keywords: diagnosis, dyslexia, ERPs, music, neuropsychology, rehabilitation
Procedia PDF Downloads 132
10663 Habitat Model Review and a Proposed Methodology to Value Economic Trade-Off between Cage Culture and Habitat of an Endemic Species in Lake Maninjau, Indonesia
Authors: Ivana Yuniarti, Iwan Ridwansyah
Abstract:
This paper delivers a review of various methodologies for habitat assessment and proposes a methodology to assess the habitat of an endemic fish species in Lake Maninjau, Indonesia, as part of a Ph.D. project. The application is mainly aimed at assessing the trade-off between the economic value of aquaculture and that of the fisheries. The proposed methodology is a generalized linear model (GLM) combined with GIS to assess presence-absence data, or a habitat suitability index (HSI) combined with the analytic hierarchy process (AHP). Further, a habitat replacement cost approach is planned to be used to calculate the habitat value as well as its trade-off with the economic value of aquaculture. The results of the study are expected to provide scientific input to local decision making and a reference for other areas in the country.
Keywords: AHP, habitat, GLM, HSI, Maninjau
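The AHP step mentioned above derives criterion weights from a pairwise comparison matrix via its principal eigenvector. A minimal sketch, assuming three hypothetical habitat criteria and an illustrative Saaty-scale matrix (not the study's actual data):

```python
import numpy as np

# Hypothetical 3-criterion pairwise comparison matrix (Saaty 1-9 scale);
# criteria might be, e.g., water depth, substrate type, vegetation cover.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 3.0],
    [1/5, 1/3, 1.0],
])

# AHP priority weights = principal eigenvector of A, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency check: CR = CI / RI, with random index RI = 0.58 for n = 3.
lam_max = eigvals.real[k]
CI = (lam_max - 3) / (3 - 1)
CR = CI / 0.58

print(w, CR)  # weights roughly [0.64, 0.26, 0.10]; CR < 0.1 is acceptable
```

The resulting weights would then scale the criterion layers of an HSI map; judgments with CR above 0.1 are conventionally revised.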
Procedia PDF Downloads 150
10662 Chemical Life Cycle Alternative Assessment as a Green Chemical Substitution Framework: A Feasibility Study
Authors: Sami Ayad, Mengshan Lee
Abstract:
The Sustainable Development Goals (SDGs) were designed to be the best possible blueprint to achieve peace, prosperity, and, overall, a better and more sustainable future for the Earth and all its people, and such a blueprint is needed more than ever. The SDGs face many hurdles that may prevent them from becoming a reality; one such hurdle, arguably, is the chemical pollution and unintended chemical impacts generated through the production of the various goods and resources that we consume. Chemical Alternatives Assessment has proven to be a viable solution for chemical pollution management in terms of filtering out hazardous chemicals for greener alternatives. However, current substitution practice lacks crucial quantitative datasets (exposures and life cycle impacts) to ensure no unintended trade-offs occur in the substitution process. A Chemical Life Cycle Alternative Assessment (CLiCAA) framework is proposed as a reliable and replicable alternative to Life Cycle Based Alternative Assessment (LCAA), as it integrates chemical molecular structure analysis and the Chemical Life Cycle Collaborative (CLiCC) web-based tool to fill in the data gaps that the former frameworks suffer from. The CLiCAA framework consists of four filtering layers, the first two being mandatory and the final two being optional assessment and data extrapolation steps. Each layer includes relevant impact categories for each chemical, ranging from human to environmental impacts, which are assessed and aggregated into unique scores for overall comparable results, with little to no data.
A feasibility study will demonstrate the efficiency and accuracy of CLiCAA while bridging both cancer potency and exposure limit data, hoping to provide the necessary categorical impact information to every firm possible, especially those disadvantaged in terms of research and resource management.
Keywords: chemical alternative assessment, LCA, LCAA, CLiCC, CLiCAA, chemical substitution framework, cancer potency data, chemical molecular structure analysis
Procedia PDF Downloads 90
10661 Integrating Deterministic and Probabilistic Safety Assessment to Decrease Risk & Energy Consumption in a Typical PWR
Authors: Ebrahim Ghanbari, Mohammad Reza Nematollahi
Abstract:
Integrating deterministic and probabilistic safety assessment (IDPSA) is one of the most widely used approaches in the safety analysis of power plant accidents. It is also recognized today that the role of human error in causing these accidents is no less than that of systemic errors, so both human interference and system errors must be represented in fault and event sequences. The integration of these analytical topics is reflected in the core damage frequency, as well as in the study of the use of water resources in an accident such as the loss of all electrical power at the plant. In this regard, a station blackout (SBO) accident was simulated for a pressurized water reactor as the deterministic analysis, and by analyzing the operator's behavior in controlling the accident, the results of the combined deterministic and probabilistic assessment were identified. The results showed that the best performance of the plant operator would reduce the risk of an accident by 10% and decrease consumption of the plant's water sources by 6.82 liters/second.
Keywords: IDPSA, human error, SBO, risk
Procedia PDF Downloads 128
10660 Performance and Economics of Goats Fed Poultry Litter and Rumen Content
Authors: A. Mohammed, A. M. Umar, S. H. Adamu
Abstract:
The study was conducted to evaluate the growth performance and nutrient utilization of 20 entire male Sahelian goats fed rumen content (fore-stomach digest, FSD) and poultry litter waste (PLW) at various levels of inclusion. The experimental animals were randomly allocated to diet A (control), B (10% each of FSD and PLW), C (6.67% PLW and 13.33% FSD), and D (13.33% PLW and 6.67% FSD) at the rate of five animals per treatment. After a 90-day feeding trial, it was observed that diet D gave the best feed intake and body weight gain, which might be due to the good palatability of PLW and the less pronounced odour of FSD in the diet. Diet C had the lowest feed cost, followed by diet B, while diet A (control) was more expensive than the other treatments. There was a significant difference (P<0.05) between the treatments in the cost of daily feed consumption: treatment A had the highest value, while treatment C recorded the lowest. There was no significant difference (P>0.05) between treatments in the cost of feed per kg of live weight gain, where treatment B had the highest value and treatment D the lowest. It is recommended that further research trials be carried out to ascertain the true value of incorporating poultry litter waste and fore-stomach digest.
Keywords: poultry litter, rumen content, weight gain, economics
Procedia PDF Downloads 640
10659 Ground Water Sustainable Management in Ethiopia, Africa
Authors: Ebissa Gadissa Kedir
Abstract:
This paper presents a groundwater potential assessment and sustainable management plan for the selected study area. Groundwater is the most preferred water source in all climatic zones for its convenient availability, drought dependability, excellent quality, and low development cost. The rural areas, which account for more than 85% of the country's population, face a shortage of potable water supply that can be addressed by proper groundwater utilization. For the present study area, the groundwater potential is assessed and analysed; the study area falls into four groundwater potential zones ranging from poor to high. However, current groundwater management practices in the study area are poor. Despite pervasive and devastating challenges, immediate and proper responses have not yet been given to the problem. These threats and challenges motivated the researcher to work in the project area.
Keywords: GW potential, GW management, GW sustainability, South Gonder, Ethiopia
Procedia PDF Downloads 63
10658 Batch and Dynamic Investigations on Magnesium Separation by Ion Exchange Adsorption: Performance and Cost Evaluation
Authors: Mohamed H. Sorour, Hayam F. Shaalan, Heba A. Hani, Eman S. Sayed
Abstract:
Ion exchange adsorption has a long-standing history of success in seawater softening and selective ion removal from saline sources. Strong, weak, and mixed-type ion exchange systems can be designed and optimized for target separations. In this paper, different types of adsorbents comprising zeolite 13X and kaolin, in addition to polyacrylate/zeolite (AZ), polyacrylate/kaolin (AK), and stand-alone polyacrylate (A) hydrogel types, were prepared via microwave (M) and ultrasonic (U) irradiation techniques. They were characterized using X-ray diffraction (XRD), Fourier transform infrared spectroscopy (FTIR), and scanning electron microscopy (SEM). The developed adsorbents were evaluated at bench scale, and based on the assessment results, a composite bed was formulated for performance evaluation in pilot-scale column investigations. Owing to the hydrogel nature of the partially crosslinked polyacrylate, the developed adsorbents manifested a swelling capacity of about 50 g/g. The pilot trials were carried out using magnesium-enriched Red Sea water to simulate Red Sea desalination brine. Batch studies indicated varying uptake efficiencies, where Mg adsorption decreases across the prepared hydrogel types in the order AU>AM>AKM>AKU>AZM>AZU, at 108, 107, 78, 69, 66, and 63 mg/g, respectively. The composite bed adsorbent tested in up-flow-mode column studies indicated good performance for Mg uptake. For an operating cycle of 12 h, the maximum uptake during the loading cycle approached 92.5-100 mg/g, which is comparable to the performance of some commercial resins. Different regenerants were explored to maximize regeneration and minimize the quantity of regenerants, including 15% NaCl, 0.1 M HCl, and sodium carbonate. The best results were obtained with acidified sodium chloride solution. In conclusion, the developed cation exchange adsorbents comprising a clay or zeolite support indicated adequate performance for Mg recovery in a saline environment.
Column design operated in the up-flow mode (approaching an expanded bed) is appropriate for this type of separation. Preliminary cost indicators for Mg recovery via ion exchange have been developed and analyzed.
Keywords: batch and dynamic magnesium separation, seawater, polyacrylate hydrogel, cost evaluation
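The batch uptake figures quoted in mg/g follow from the standard mass-balance formula q = (C0 − Ce)·V/m. A small sketch with hypothetical concentrations (not the study's measured values):

```python
def uptake_mg_per_g(c0_mg_l, ce_mg_l, volume_l, mass_g):
    """Batch adsorption capacity q = (C0 - Ce) * V / m,
    with concentrations in mg/L, volume in L, adsorbent mass in g."""
    return (c0_mg_l - ce_mg_l) * volume_l / mass_g

# Hypothetical run: 1 g adsorbent in 0.5 L; Mg falls from 1300 to 1084 mg/L.
q = uptake_mg_per_g(1300, 1084, 0.5, 1.0)
print(q)  # 108.0 mg/g, the order of the AU hydrogel uptake reported above
```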
Procedia PDF Downloads 133
10657 Project Time Prediction Model: A Case Study of Construction Projects in Sindh, Pakistan
Authors: Tauha Hussain Ali, Shabir Hussain Khahro, Nafees Ahmed Memon
Abstract:
Accurate prediction of project duration at the planning and bid preparation stage should rest on realistic dates. Constructors use their experience to estimate the duration of new projects, which is based on intuition. Accurate prediction of project duration at the bid preparation stage has been a constant concern to both researchers and constructors. In Pakistan, studies of the time-cost relationship needed to predict duration performance for construction projects have been lacking. This study is an attempt to explore the time-cost relationship and concludes with a mathematical model to predict the duration of drainage rehabilitation projects in the province of Sindh, Pakistan. The data were collected from National Engineering Services Pakistan (NESPAK), and regression analysis was carried out. A significant relationship was found between the time and cost of construction projects in Sindh, and the resulting mathematical model can be used by constructors to predict project duration for upcoming projects of the same nature. This study also provides professionals with the requisite knowledge to make decisions regarding project duration, which is significantly important for winning projects at the bid stage.
Keywords: BTC model, project time, time-cost relationship, regression
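The time-cost model referenced in the keywords (the BTC model) is conventionally Bromilow's T = a·C^b, fitted by regressing log duration on log cost. The sketch below uses invented cost/duration pairs, not the NESPAK data, purely to show the fitting procedure:

```python
import numpy as np

# Hypothetical drainage-project data: cost (million PKR) and duration (days).
cost = np.array([50, 80, 120, 200, 350, 500], dtype=float)
time = np.array([210, 260, 310, 380, 470, 540], dtype=float)

# Bromilow's time-cost (BTC) model: T = a * C^b  ->  ln T = ln a + b ln C,
# so a straight-line fit in log-log space recovers the parameters.
b, ln_a = np.polyfit(np.log(cost), np.log(time), 1)
a = np.exp(ln_a)

def predict_days(c):
    """Predicted duration (days) for a project costing c million PKR."""
    return a * c ** b

print(a, b, predict_days(150.0))
```

Once a and b are estimated from historical projects, the model gives a duration benchmark for a new project directly from its estimated cost at bid time.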
Procedia PDF Downloads 380
10656 The Optimal Order Policy for the Newsvendor Model under Worker Learning
Authors: Sunantha Teyarachakul
Abstract:
We consider the worker-learning newsvendor model, under the case of lost sales for unmet demand, with the research objective of proposing the cost-minimizing order policy and lot size, scheduled to arrive at the beginning of the selling period. In general, the newsvendor model is used to find the optimal order quantity for perishable items such as fashionable products or those with seasonal demand or short life cycles. Technically, it is used when product demand is stochastic and limited to a single selling season, and when the vendor has only one opportunity to purchase, possibly with long ordering lead times. Our work differs from the classical newsvendor model in that we incorporate the human factor (specifically worker learning) and its influence on unit processing costs into the model. We describe this using the well-known Wright's learning curve. Most of the assumptions of the classical newsvendor model are maintained in our work, such as constant per-unit costs of leftover and shortage, zero initial inventory, and continuous time. Our problem is challenging in that the best order quantity in the classical model, which balances the overstocking and understocking costs, is no longer optimal. Specifically, when the cost savings from worker learning are added to the expected total cost, the convexity of the cost function will likely not be maintained. This has called for a new way of determining the optimal order policy. In response to these challenges, we found a number of characteristics of the expected cost function and its derivatives, which we then used in formulating the optimal ordering policy.
Examples of such characteristics are: the optimal order quantity exists and is unique if the demand follows a uniform distribution; if the demand follows a beta distribution with certain properties of its parameters, the second derivative of the expected cost function has at most two roots; and there exists a specific lot size that satisfies the first-order condition. Our research results could be helpful for the analysis of supply chain coordination and of the periodic review system for similar problems.
Keywords: inventory management, newsvendor model, order policy, worker learning
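For contrast with the learning case, the classical newsvendor solution that this work departs from can be computed from the critical fractile F(Q*) = cu/(cu + co). A minimal sketch for uniform demand with hypothetical underage/overage costs (the paper's point is that this Q is no longer optimal once learning-driven cost savings enter the objective):

```python
def newsvendor_q_uniform(a, b, cu, co):
    """Classical newsvendor: optimal order quantity for Uniform(a, b) demand.
    The critical fractile F(Q*) = cu / (cu + co) is inverted through the
    uniform CDF F(q) = (q - a) / (b - a)."""
    frac = cu / (cu + co)
    return a + (b - a) * frac

# Hypothetical: demand Uniform(100, 300), underage $4/unit, overage $1/unit.
q_star = newsvendor_q_uniform(100, 300, 4.0, 1.0)
print(q_star)  # 260.0: stock high because lost sales are 4x costlier than leftovers
```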
Procedia PDF Downloads 416
10655 Analysis of Efficiency Production of Grass Black Jelly (Mesona palustris) in Double Scale
Authors: Irvan Adhin Cholilie, Susinggih Wijana, Yusron Sugiarto
Abstract:
The aim of this research is to compare black grass jelly produced at laboratory scale and at double scale. In this research, the laboratory-scale production uses 1 kg of black grass jelly with 5 liters of water, while the double scale uses 5 kg of black grass jelly and 75 liters of water. The results of organoleptic tests performed by 30 (general) panelists on grass black jelly powder samples produced at both scales do not differ significantly in color, odor, flavor, or texture. Proximate test results for the grass black jelly powder produced at laboratory and double scale also show no significant differences in any parameter. Grass black jelly powder from the double scale contains water, carbohydrate, crude fiber, and yield of 12.25%, 43.7%, 5.89%, and 16.28%, respectively. The energy efficiencies of the boiling, draining, evaporation, drying, and milling processes are 85.11%, 76.97%, 99.64%, 99.99%, and 99.39%, respectively. The utility needs include water of 0.1 m3 per batch, costing Rp 220.5 per batch; electricity of 20.01 kWh per batch, costing Rp 18,569.28 per batch; and LPG of 30 kg per batch, costing Rp 234,000.00, so the total cost spent on the process is Rp 252,789.78.
Keywords: black grass jelly, powder, mass balance, energy balance, cost
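The per-batch utility costs quoted above can be tallied directly; the sketch below simply reproduces the stated arithmetic to confirm the total:

```python
# Per-batch utility costs reported in the abstract (Rp).
water_cost = 220.5            # 0.1 m^3 of water
electricity_cost = 18569.28   # 20.01 kWh of electricity
lpg_cost = 234000.0           # 30 kg of LPG

total = water_cost + electricity_cost + lpg_cost
print(total)  # 252789.78, matching the stated per-batch total
```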
Procedia PDF Downloads 383
10654 The Low-Cost Design and 3D Printing of Structural Knee Orthotics for Athletic Knee Injury Patients
Authors: Alexander Hendricks, Sean Nevin, Clayton Wikoff, Melissa Dougherty, Jacob Orlita, Rafiqul Noorani
Abstract:
Knee orthotics play an important role in aiding the recovery of those with knee injuries, especially athletes. However, structural knee orthotics are often very expensive, ranging between $300 and $800. The primary reason for this project was to answer the question: can 3D-printed orthotics represent a viable and cost-effective alternative to present structural knee orthotics? The primary objective of this research project was to design a knee orthotic for athletes with knee injuries at a low cost, under $100, and to evaluate its effectiveness. The initial design of the orthotic was done in SolidWorks, a computer-aided design (CAD) software package available at Loyola Marymount University. After this design was completed, finite element analysis (FEA) was utilized to understand how the normal stresses placed upon the knee affected the orthotic. The knee orthotic was then adjusted and redesigned to meet a specified factor of safety of 3.25, based on the data gathered during FEA and literature sources. Once the FEA was completed and the orthotic redesigned based on the data gathered, the next step was to 3D-print the first design of the knee brace. Subsequently, physical therapy movement trials were used to evaluate physical performance. Using the data from these movement trials, the CAD design of the brace was refined to accommodate the design requirements. The final goal of this research is to explore the possibility of replacing high-cost, outsourced knee orthotics with a readily available low-cost alternative.
Keywords: 3D printing, knee orthotics, finite element analysis, design for additive manufacturing
Procedia PDF Downloads 178
10653 A Framework for Teaching the Intracranial Pressure Measurement through an Experimental Model
Authors: Christina Klippel, Lucia Pezzi, Silvio Neto, Rafael Bertani, Priscila Mendes, Flavio Machado, Aline Szeliga, Maria Cosendey, Adilson Mariz, Raquel Santos, Lys Bendett, Pedro Velasco, Thalita Rolleigh, Bruna Bellote, Daria Coelho, Bruna Martins, Julia Almeida, Juliana Cerqueira
Abstract:
This project presents a framework for teaching intracranial pressure (ICP) monitoring concepts using a low-cost experimental model in a neurointensive care education program. Data from ICP monitoring contribute to the patient's clinical assessment, may dictate the course of action of a health team (nursing, medical staff), and influence decisions to determine the appropriate intervention. This study aims to present a safe method for teaching ICP monitoring to medical students in a simulation center. Methodology: Medical school teachers, along with students from the 4th year, built an experimental model for teaching ICP measurement. The model consists of a mannequin's head with a plastic bag inside simulating the cerebral ventricle and an inserted ventricular catheter connected to the ICP monitoring system. The bag simulating the ventricle can also be exchanged for others containing simulated bloody or infected cerebrospinal fluid. On the mannequin's ear, there is a blue point indicating the right place to set the "zero point" for accurate pressure reading. The educational program includes four steps: 1st, students receive a script on ICP measurement to read before training; 2nd, students watch a video about the subject, created in the simulation center, demonstrating each step of ICP monitoring and the proper care, such as correct positioning of the patient, the anatomical structures used to establish the zero point for ICP measurement, and the secure range of ICP; 3rd, students practice the procedure on the model, with teachers helping them during training; 4th, student assessment based on a checklist form, with feedback and correction of wrong actions. Results: Students expressed interest in learning ICP monitoring. Tests concerning the hit rate are still being performed. The final ICP results and video will be shown at the event.
Conclusion: The study of intracranial pressure measurement based on an experimental model constitutes an effective and controlled method of learning and research, appropriate for teaching neurointensive care practices. Assessment based on a checklist form helps teachers keep track of students' learning progress. This project offers medical students a safe method to develop intensive neurological monitoring skills for the clinical assessment of patients with neurological disorders.
Keywords: neurology, intracranial pressure, medical education, simulation
Procedia PDF Downloads 170
10652 Sentinel-2 Based Burn Area Severity Assessment Tool in Google Earth Engine
Authors: D. Madhushanka, Y. Liu, H. C. Fernando
Abstract:
Fires are one of the foremost factors of land surface disturbance in diverse ecosystems, causing soil erosion, land-cover changes, and atmospheric effects that affect people's lives and properties. Generally, the severity of a fire is calculated with the Normalized Burn Ratio (NBR) index. This is performed manually by comparing two images, one obtained before the fire and one afterward: the dNBR is calculated as the bitemporal difference of the preprocessed satellite images. The burnt area is then classified as either unburnt (dNBR<0.1) or burnt (dNBR>=0.1). Furthermore, Wildfire Severity Assessment (WSA) classifies burnt and unburnt areas using classification levels proposed by the USGS and comprises seven classes. This procedure generates a burn severity report for an area chosen manually by the user. This study is carried out with the objective of producing an automated tool for the above-mentioned process, namely the World Wildfire Severity Assessment Tool (WWSAT). It is implemented in Google Earth Engine (GEE), a free cloud-computing platform for satellite data processing, with several data catalogs at different resolutions (notably Landsat, Sentinel-2, and MODIS) and planetary-scale analysis capabilities. Sentinel-2 MSI was chosen to support regular burnt-area severity mapping using a medium-spatial-resolution sensor. The tool uses machine learning classification techniques to identify burnt areas using the NBR and to classify their severity over the user-selected extent and period automatically. Cloud coverage is one of the biggest concerns when fire severity mapping is performed. In WWSAT, based on GEE, we present a fully automatic workflow to aggregate cloud-free Sentinel-2 images for both pre-fire and post-fire image compositing. The parallel processing capabilities and preloaded geospatial datasets of GEE facilitated the production of this tool. The tool includes a Graphical User Interface (GUI) to make it user-friendly.
The advantage of this tool is the ability to obtain burn area severity over large extents and extended temporal periods. Two case studies were carried out to demonstrate the performance of the tool. The Blue Mountains National Park forest, affected by the Australian fire season between 2019 and 2020, is used to describe the workflow of the WWSAT. At this site, more than 7809 km2 was detected using Sentinel-2 data, giving an error below 6.5% when compared with the area detected in the field. Furthermore, 86.77% of the detected area was recognized as fully burnt out, comprising high severity (17.29%), moderate-high severity (19.63%), moderate-low severity (22.35%), and low severity (27.51%). The Arapaho and Roosevelt National Forests, Colorado, USA, affected by the Cameron Peak fire in 2020, were chosen for the second case study. It was found that around 983 km2 had burned, comprising high severity (2.73%), moderate-high severity (1.57%), moderate-low severity (1.18%), and low severity (5.45%). These spots can also be detected through visual inspection, made possible by the cloud-free images generated by WWSAT. This tool is cost-effective in calculating the burnt area, since satellite images are free and the cost of field surveys is avoided.
Keywords: burnt area, burnt severity, fires, Google Earth Engine (GEE), Sentinel-2
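The NBR/dNBR computation and the seven-class severity scheme can be sketched outside GEE as well. The snippet below uses plain NumPy, the widely cited USGS dNBR thresholds, and a single hypothetical pixel (the reflectance values are invented for illustration; Sentinel-2 would supply bands B8 for NIR and B12 for SWIR):

```python
import numpy as np

def nbr(nir, swir):
    """Normalized Burn Ratio from NIR and SWIR reflectance."""
    return (nir - swir) / (nir + swir)

def classify_dnbr(dnbr):
    """Seven-class burn severity using the standard USGS dNBR thresholds."""
    bins = [-0.25, -0.1, 0.1, 0.27, 0.44, 0.66]
    labels = ["enhanced regrowth, high", "enhanced regrowth, low", "unburned",
              "low severity", "moderate-low severity",
              "moderate-high severity", "high severity"]
    return labels[np.digitize(dnbr, bins)]

# Hypothetical pre-fire and post-fire reflectances for one pixel.
pre = nbr(0.45, 0.15)    # healthy vegetation: high NBR
post = nbr(0.20, 0.35)   # burned surface: low (here negative) NBR
dnbr = pre - post         # bitemporal difference
print(dnbr, classify_dnbr(dnbr))
```

Applied to whole arrays instead of scalars, the same two functions produce the per-pixel severity map that WWSAT automates on the GEE side.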
Procedia PDF Downloads 233
10651 A Comparison of Image Data Representations for Local Stereo Matching
Authors: André Smith, Amr Abdel-Dayem
Abstract:
The stereo matching problem, while having been present for several decades, continues to be an active area of research. The goal of this research is to find correspondences between elements found in a set of stereoscopic images. With these pairings, it is possible to infer the distance of objects within a scene relative to the observer. Advancements in this field have led to experimentation with various techniques, from graph-cut energy minimization to artificial neural networks. At the basis of these techniques is a cost function, which is used to evaluate the likelihood of a particular match between points in each image. While, at its core, the cost is based on comparing image pixel data, there is a general lack of consistency as to which image data representation to use. This paper presents an experimental analysis comparing the effectiveness of the more common image data representations. The goal is to determine how effectively these data representations reduce the cost for the correct correspondence relative to other possible matches.
Keywords: colour data, local stereo matching, stereo correspondence, disparity map
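A concrete example of such a cost function is the sum of absolute differences (SAD) over raw intensity data, one representative of the representations such a study would compare. A minimal sketch on a synthetic image pair, where the left image is the right image shifted by a known disparity:

```python
import numpy as np

def sad_cost(left, right, y, x, d, w=3):
    """Sum of absolute differences between a w-by-w window at (y, x) in the
    left image and the window shifted by disparity d in the right image."""
    h = w // 2
    patch_l = left[y-h:y+h+1, x-h:x+h+1].astype(np.int32)
    patch_r = right[y-h:y+h+1, x-d-h:x-d+h+1].astype(np.int32)
    return np.abs(patch_l - patch_r).sum()

rng = np.random.default_rng(1)
right = rng.integers(0, 256, (20, 20))
true_d = 4
left = np.roll(right, true_d, axis=1)  # left image = right shifted by disparity 4

# Evaluate the cost at one pixel over a range of candidate disparities.
costs = [sad_cost(left, right, 10, 10, d) for d in range(8)]
print(int(np.argmin(costs)))  # 4: the true disparity minimizes the cost
```

In a local matcher, the disparity map is simply this argmin evaluated at every pixel; the paper's question is how the cost landscape changes when intensities are replaced by other representations (e.g. colour channels or gradients).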
Procedia PDF Downloads 368
10650 Prime Mover Sizing for Base-Loaded Combined Heating and Power Systems
Authors: Djalal Boualili
Abstract:
This article considers the problem of sizing prime movers for combined heating and power (CHP) systems operating at full load to satisfy a fraction of a facility's electric load, i.e., a base load. Prime mover sizing is examined using three criteria: operational cost, carbon dioxide emissions (CDE), and primary energy consumption (PEC). The sizing process leads to ratios of the conversion factors applied to imported electricity to the conversion factors applied to fuel consumed. These ratios are labelled R_Cost, R_CDE, and R_PEC, depending on whether the conversion factors are associated with operational cost, CDE, or PEC, respectively. Analytical results show that in order to achieve savings in operational cost, CDE, or PEC, the ratios must be larger than a unique constant R_Min that depends only on the efficiencies of the CHP components. Savings in operational cost, CDE, or PEC due to CHP operation are explicitly formulated using simple equations. This facilitates the process of comparing the trade-offs of optimizing the savings of one criterion over the other two, a task that has traditionally been accomplished through computer simulations. A hospital building located in Chlef, Algeria, was used as an example to apply the methodology presented in this article.
Keywords: sizing, heating and power, ratios, energy consumption, carbon dioxide emissions
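One plausible way to arrive at a threshold that depends only on component efficiencies: per unit of fuel burned in the CHP (fuel conversion factor f), the system produces η_e units of electricity that displace imports (factor e) and η_q units of heat that displace η_q/η_b units of boiler fuel, so savings are positive exactly when R = e/f exceeds (1 − η_q/η_b)/η_e. This is a hedged reconstruction of the kind of result described, not the article's actual formula, and the efficiencies below are hypothetical:

```python
def r_min(eta_e, eta_q, eta_b):
    """Minimum electricity-to-fuel conversion-factor ratio for CHP savings,
    under the simplified displacement accounting described above
    (eta_e: CHP electric, eta_q: CHP thermal, eta_b: displaced boiler)."""
    return (1 - eta_q / eta_b) / eta_e

def savings_per_fuel_unit(R, eta_e, eta_q, eta_b):
    # R*eta_e: value of displaced imports; eta_q/eta_b: displaced boiler fuel;
    # minus the 1 unit of fuel burned in the CHP.
    return R * eta_e + eta_q / eta_b - 1

eta_e, eta_q, eta_b = 0.35, 0.45, 0.85   # hypothetical efficiencies
Rm = r_min(eta_e, eta_q, eta_b)
print(Rm)                                 # break-even ratio, ~1.34 here
print(savings_per_fuel_unit(Rm + 0.5, eta_e, eta_q, eta_b) > 0)
```

Note the threshold is the same whether the conversion factors are prices, CO2 intensities, or primary energy factors, which is consistent with a single R_Min governing all three criteria.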
Procedia PDF Downloads 227
10649 The Significance of Seasonality on the Airport Efficiency in Touristic Regions
Authors: Ioanna Pagoni, Annitsa Koumoutsidi
Abstract:
The aim of this paper is to estimate the efficiency of airports located in touristic regions. It focuses on the regional airports of Greece, located on the mainland and on the islands that constitute touristic destinations. Most of these airports share the following characteristics: they operate below capacity, with highly seasonal traffic, and the operation of charter and low-cost airlines is significant. The efficiency of the study airports is calculated using non-parametric data envelopment analysis over the period 2010-2016. The selected inputs include several airport infrastructure measures, such as passenger terminal size, aircraft parking area, runway length, and the number of check-in counters, as well as the number of employees at each airport. The numbers of passengers and aircraft movements are selected as outputs. The effects of seasonality and of the operation of charter airlines and low-cost carriers on airport efficiency are estimated by running appropriate regression models. Preliminary findings indicate that low-cost and charter airlines contribute to increasing airport efficiency at most of the study airports. The results of this research could be useful for airlines, airport operators, hotel businesses, and other tourism-related operators.
Keywords: airport efficiency, data envelopment analysis, low-cost carriers, charter airlines, seasonality
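Full DEA requires solving one linear program per airport, but in the single-input, single-output special case under constant returns to scale, the CCR efficiency reduces to each unit's output/input ratio normalized by the best ratio in the sample. A toy sketch with invented airport data (the study itself uses multiple inputs and outputs, which needs an LP solver):

```python
import numpy as np

# Hypothetical single-input, single-output data for five regional airports:
# input = terminal size (1000 m^2), output = annual passengers (millions).
terminal = np.array([12.0, 8.0, 20.0, 5.0, 15.0])
pax = np.array([1.8, 1.6, 2.5, 0.4, 2.7])

# One input, one output, constant returns to scale: CCR efficiency is the
# output/input ratio divided by the frontier (best) ratio.
ratio = pax / terminal
eff = ratio / ratio.max()
print(np.round(eff, 3))  # efficient airport scores 1.0; the rest score below
```

Scores below 1 measure how far each airport sits from the efficiency frontier; the paper then regresses such scores on seasonality and carrier-mix variables.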
Procedia PDF Downloads 124
10648 Replacement Time and Number of Preventive Maintenance Actions for Second-Hand Device
Authors: Wen Liang Chang
Abstract:
In this study, the optimal replacement time and number of preventive maintenance (PM) actions were investigated for a second-hand device. Suppose that a user intends to use a second-hand device for manufacturing products and that the device is eventually replaced with a new one. Any device failure is rectified through minimal repair, thereby incurring a fixed repair cost to the user. If the new device fails within the free repair warranty (FRW) period, minimal repair is performed at no cost to the user; after the FRW expires, the cost of repairing a failed device is borne by the user. Two profit models were developed, and the optimal replacement time and number of PM actions were determined to maximize profits. Finally, the influence of the optimal replacement time and number of PM actions is elaborated on using numerical examples.
Keywords: second-hand device, preventive maintenance, replacement time, device failure
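The paper's profit models are not reproduced in the abstract; the sketch below uses an assumed toy cost structure (a power-law failure intensity rescaled by PM actions, a fixed minimal-repair cost after the FRW, and hypothetical parameter values) purely to illustrate the grid search for a profit-maximizing replacement time and number of PM actions.

```python
# Toy profit model (assumed, not the paper's): revenue accrues at a
# constant rate; failures follow a power-law intensity 2t/(1 + n_pm),
# so PM actions rescale the intensity; minimal repairs inside the FRW
# period w are free to the user. All parameter values are hypothetical.
def profit(T, n_pm, revenue_rate=90.0, c_rep=40.0, c_pm=25.0, w=1.0):
    # Expected failures billed to the user: integral of 2t/(1 + n_pm)
    # from w to T, i.e. (T^2 - w^2) / (1 + n_pm) for T >= w.
    billed_failures = (T**2 - min(T, w)**2) / (1.0 + n_pm)
    return revenue_rate * T - c_rep * billed_failures - c_pm * n_pm

# Grid search over candidate replacement times and PM counts.
best = max(((T, n) for T in range(1, 11) for n in range(0, 6)),
           key=lambda tn: profit(*tn))
# best == (7, 5): replace after 7 periods, performing 5 PM actions
```

With these numbers, skipping PM entirely is heavily loss-making at the same replacement time, which is the qualitative tradeoff the paper's models formalize.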
Procedia PDF Downloads 466
10647 Technology Blending as an Innovative Construction Mechanism in the Global South
Authors: Janet Kaningen, Richard N. Kaningen, Jonas Kaningen
Abstract:
This paper aims to identify the best ways to improve production efficiency, cost efficiency, community cohesion, and long-term sustainability in Ghana's housing delivery. Advanced construction technologies (ACTs) are set to become the sustainable mainstay of the construction industry, driven by the demand for innovative housing solutions. Advances in material science, building component production, and assembly technologies are leading to next-generation materials such as polymeric-fiber-based products, light-metal alloys, and eco-materials. Modular housing construction has become more efficient and cost-effective than traditional building methods and is increasingly popular for commercial, industrial, and residential projects. Effective project management and logistics will be imperative to the future speed and cost of modular housing construction.
Keywords: technology blending, sustainability, housing, Ghana
Procedia PDF Downloads 85
10646 Examining the Coverage of CO2-Related Indicators in a Sample of Sustainable Rating Systems
Authors: Wesam Rababa, Jamal Al-Qawasmi
Abstract:
The global climate is negatively impacted by CO2 emissions, a large share of which is produced by buildings. Several green building rating systems (GBRS) have been proposed to impose low-carbon criteria in order to address this problem. The Green Globes certification is one such system; it evaluates a building's sustainability level by assessing different categories of environmental impact and emerging concepts aimed at reducing environmental harm. Assessment tools at the national level are especially crucial in the developing world, where specific local conditions require a more precise evaluation. This study analyzed eight sustainable building assessment systems from different regions of the world, comparing a comprehensive list of CO2-related indicators against each assessment system to conduct a coverage analysis. The results show that the GBRS include both direct and indirect indicators in this regard. They reveal deep variation between the examined practices and a lack of consensus, not only on the type and optimal number of indicators used in a system, but also on the depth and breadth of coverage of the various sustainable building (SB) attributes. Generally, most of the examined systems reflect low comprehensive coverage, the highest of which is found in the materials category, and most also reveal very low representative coverage.
Keywords: assessment tools, CO2-related indicators, comparative study, green building rating systems
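A coverage analysis of this kind can be expressed as simple set arithmetic: the share of a master list of CO2-related indicators that each rating system addresses. The indicator names and system contents below are hypothetical, not taken from the study.

```python
# Coverage analysis as set arithmetic: the share of a master list of
# CO2-related indicators that each rating system addresses. Indicator
# names and system contents are hypothetical, not the study's.
master = {"embodied_carbon", "operational_carbon", "renewable_energy",
          "material_reuse", "transport_emissions", "refrigerant_gwp"}

systems = {
    "System A": {"embodied_carbon", "operational_carbon",
                 "renewable_energy"},
    "System B": {"operational_carbon", "refrigerant_gwp"},
}

coverage = {name: len(indicators & master) / len(master)
            for name, indicators in systems.items()}
# System A covers 3/6 = 0.5 of the master list, System B 2/6
```

The intersection with the master list also guards against counting indicators that fall outside the CO2-related scope being compared.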
Procedia PDF Downloads 56
10645 Analysis of Risk Factors Affecting the Motor Insurance Pricing with Generalized Linear Models
Authors: Puttharapong Sakulwaropas, Uraiwan Jaroengeratikun
Abstract:
In the casualty insurance business, optimal premium pricing and adequate costing are important to an insurance company's risk management. Normally, the insurance pure premium can be determined by multiplying the claim frequency by the claim cost. The aim of this research was to study the application of generalized linear models in selecting the risk factors for the claim frequency and claim cost models used to estimate a pure premium. The data set comprised comprehensive motor insurance claims provided by an insurance company in Thailand. The study found that the risk factors significantly related to the pure premium at the 0.05 level were the no-claim bonus (NCB) and the use of the car (car code).
Keywords: generalized linear models, risk factor, pure premium, regression model
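The pure-premium identity stated above (frequency times claim cost) can be illustrated per risk cell for the two significant factors found in the study, NCB and car use; the GLMs in the paper generalize this cell-by-cell calculation to a full regression. The policy records below are hypothetical.

```python
from collections import defaultdict

# Pure premium = claim frequency x average claim cost, computed per risk
# cell defined by the two significant factors found in the study: NCB
# status and car use. The policy records are hypothetical.
policies = [
    # (ncb, car_code, exposure_years, n_claims, total_claim_cost)
    ("no_ncb", "private",    1.0, 1,  8000.0),
    ("no_ncb", "private",    1.0, 0,     0.0),
    ("ncb",    "private",    1.0, 0,     0.0),
    ("ncb",    "private",    1.0, 1,  4000.0),
    ("no_ncb", "commercial", 1.0, 2, 30000.0),
    ("ncb",    "commercial", 1.0, 0,     0.0),
]

agg = defaultdict(lambda: [0.0, 0, 0.0])   # exposure, claims, cost
for ncb, car, exposure, n_claims, cost in policies:
    cell = agg[(ncb, car)]
    cell[0] += exposure
    cell[1] += n_claims
    cell[2] += cost

pure_premium = {}
for cell, (exposure, n_claims, cost) in agg.items():
    freq = n_claims / exposure              # claims per policy-year
    severity = cost / n_claims if n_claims else 0.0
    pure_premium[cell] = freq * severity
# no-NCB private drivers: frequency 0.5, severity 8000 -> premium 4000
```

A Poisson GLM for frequency and a gamma GLM for severity, as in the paper, smooth these cell estimates and allow many factors to be tested jointly for significance.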
Procedia PDF Downloads 462
10644 Environmental Life Cycle Assessment of Two Technological Scenarios of Wind Turbine Blade Composition for an Optimized Wind Turbine Design Using the Impact 2002+ Method and 15 Environmental Impact Indicators
Authors: A. Jarrou, A. Iranzo, C. Nana
Abstract:
The rapid development of the onshore/offshore wind industry and continuous, strong, long-term government support have made it possible to create factories specializing in the manufacture of the different parts of wind turbines. In the literature, however, life cycle assessment (LCA) analyses consider the wind turbine as a whole and do not allow impacts to be allocated to its different components. Here, we propose to treat each part of the wind turbine as a system in its own right, which is more in line with the current production system. This article aims to assess the environmental impacts associated with 1 kg of wind turbine blades. In order to carry out a realistic and precise study, the different stages of the life cycle of a wind turbine installation are included in the study (manufacture, installation, use, maintenance, dismantling, and waste treatment). The Impact 2002+ method used makes it possible to assess 15 impact indicators (human toxicity, terrestrial and aquatic ecotoxicity, climate change, land use, etc.). Finally, a sensitivity study is carried out to analyze the different types of uncertainty in the data collected.
Keywords: life cycle assessment, wind turbine, turbine blade, environmental impact
Procedia PDF Downloads 175
10643 Quick Covering Machine for Grain Drying Pavement
Authors: Fatima S. Rodriguez, Victorino T. Taylan, Manolito C. Bulaong, Helen F. Gavino, Vitaliana U. Malamug
Abstract:
In sun drying, grain quality is greatly reduced when paddy grains are caught by the rain unsacked and unstored, resulting in reduced profit. The objectives of this study were to design and fabricate a quick covering machine for a grain drying pavement and to test and evaluate the operating characteristics of the machine: deployment speed, recovery speed, deployment time, recovery time, power consumption, and aesthetics of the laminated sack, together with a partial budget and cost curve analysis. The machine was able to cover the grains on a 12.8 m x 22.5 m grain drying pavement in an average time of 17.13 s. It consumed 0.53 W-h for the deployment and recovery of the cover. The machine entailed an investment cost of $1,344.40 and an annual cost charge of $647.32. Moreover, the savings per year from using the quick covering machine was $101.83.
Keywords: quick covering machine, grain, drying pavement
Procedia PDF Downloads 372
10642 Risk Reduction of Household Refuse, a Case Study of Shagari Low-Cost, Mubi North (LGA) Adamawa State, Nigeria
Authors: Maryam Tijjani Kolo
Abstract:
The lack of refuse dumping points has left the residents of Shagari low-cost exposed to a number of health- and environment-related hazards. This study investigates the effect of household refuse on the residents of Shagari low-cost. A well-structured questionnaire was administered to elicit the views of respondents in the study area, adopting a cluster sampling method. A total of 100 questionnaires were distributed, 50 to each of the two sections of the study area, and each household head was interviewed. The data obtained were analyzed using simple percentages to determine the major hazards in the area. Results showed that the majority of the households are civil servants and traders earning a reasonable monthly income. 68% of the respondents have experienced the effects of living close to waste dumping areas, which include unpleasant smells and polluted smoke when refuse is burnt, causing eye and respiratory irritation; injury from broken bottles or sharp objects; and water-, insect-, and air-borne diseases. Hence, the need to urgently address this menace before it overwhelms the capacities of the community becomes paramount. The community should be given more enlightenment, and refuse dumping sites should be created by the local government area.
Keywords: household, refuse, refuse dumping points, Shagari low-cost
Procedia PDF Downloads 318
10641 Assessment Methodology of E-government Projects for the Regions of Georgia
Authors: Tina Melkoshvili
Abstract:
The rapid development of information and communication technologies in Georgia has made it necessary to launch a conceptually new, effective, flexible, transparent, and society-oriented form of government: e-government. By applying information technologies, the electronic system makes it possible to raise the efficacy of state governance and increase citizens' participation in the process. Focusing on e-government allows us to analyze success stories and attributed benefits and, at the same time, observe the challenges hampering the development process. A number of methodologies have been elaborated to study conditions in the field of electronic governance. They make it possible to find out whether a government is ready to apply the broad opportunities of information and communication technologies and whether it is apt to improve the accessibility and quality of delivering, mainly, social services. This article provides a comparative analysis of widely used methodologies for assessing e-government projects. It concludes that applying current assessment methods in Georgia is difficult owing to inaccessible data and the need to involve a number of experts. The article presents new indicators for assessing e-government development that reflect how effectively the e-government conception is realized in the regions of Georgia and that enable a quantitative evaluation of regional e-government projects covering all significant aspects of development.
Keywords: development methodology, e-government in Georgia, information and communication technologies, regional government
Procedia PDF Downloads 274
10640 Security Risks Assessment: A Conceptualization and Extension of NFC Touch-And-Go Application
Authors: Ku Aina Afiqah Ku Adzman, Manmeet Mahinderjit Singh, Zarul Fitri Zaaba
Abstract:
NFC operates at the low-range 13.56 MHz frequency over distances from 4 cm to 10 cm, and its applications can be categorized as touch and go, touch and confirm, touch and connect, and touch and explore. Owing to its physical nature, NFC is vulnerable to various security and privacy attacks, such as attacks on unprotected data stored in the NFC tag and on insecure communication between applications. This paper aims to determine the likelihood of security risks occurring in NFC technology and applications. We present an NFC technology taxonomy covering NFC standards, types of application, and various security and privacy attacks. A risk assessment of the touch-and-go application, based on observations and a survey, identifies two high-risk security attacks, namely data corruption and denial-of-service (DoS) attacks. Once the risks are determined, risk countermeasures are selected using the analytic hierarchy process (AHP). The guidelines and solutions for these two high-risk attacks are then applied to a secure NFC-enabled smartphone attendance system.
Keywords: Near Field Communication (NFC), risk assessment, multi-criteria decision making, Analytical Hierarchy Process (AHP)
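The abstract does not give the AHP judgements used; the sketch below derives criterion weights by the common geometric-mean approximation to the principal eigenvector, from an illustrative pairwise comparison matrix on the Saaty 1-9 scale. The criteria and judgements are hypothetical.

```python
import math

# AHP weights by the geometric-mean (approximate principal eigenvector)
# method for three hypothetical risk criteria; the pairwise judgements
# are illustrative, not the study's.
criteria = ["likelihood", "impact", "detectability"]
# pairwise[i][j]: how much more important criterion i is than j
# (Saaty 1-9 scale; reciprocals below the diagonal)
pairwise = [
    [1.0,   3.0,   5.0],
    [1 / 3, 1.0,   3.0],
    [1 / 5, 1 / 3, 1.0],
]

geo = [math.prod(row) ** (1 / len(row)) for row in pairwise]
weights = [g / sum(geo) for g in geo]
# weights sum to 1; "likelihood" dominates with a weight of about 0.64
```

In practice, a consistency ratio is also computed from the matrix before the weights are trusted; countermeasures are then scored against each weighted criterion.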
Procedia PDF Downloads 301
10639 Simplified INS/GPS Integration Algorithm in Land Vehicle Navigation
Authors: Othman Maklouf, Abdunnaser Tresh
Abstract:
Land vehicle navigation is a subject of great interest today. The Global Positioning System (GPS) is the main navigation system for positioning in such applications. GPS alone is incapable of providing continuous and reliable positioning because of its inherent dependency on external electromagnetic signals. Inertial navigation (INS) uses inertial sensors to determine the position and orientation of a vehicle. The availability of low-cost micro-electro-mechanical-system (MEMS) inertial sensors is now making it feasible to develop an INS from an inertial measurement unit (IMU). INS exhibits unbounded error growth, since errors accumulate at each step. Usually, GPS and INS are integrated in a loosely coupled scheme. With the development of low-cost MEMS inertial sensors and GPS technology, integrated INS/GPS systems are beginning to meet the growing demand for lower-cost, smaller, seamless navigation solutions for land vehicles. Although MEMS inertial sensors are very inexpensive compared to conventional sensors, their cost (especially that of MEMS gyros) is still not acceptable for many low-end civilian applications (for example, commercial car navigation or personal location systems). An efficient way to reduce the expense of these systems is to reduce the number of gyros and accelerometers, i.e., to use a partial IMU (ParIMU) configuration. For land vehicle use, the most important gyroscope is the vertical gyro, which senses the heading of the vehicle, together with two horizontal accelerometers for determining its velocity. This paper presents a field experiment with a low-cost strapdown ParIMU/GPS combination, with data post-processing for the determination of the 2-D components of position (trajectory), velocity, and heading.
In the present approach, we have neglected earth rotation and gravity variations because of the poor gyroscope sensitivities of our low-cost IMU and the relatively small area of the trajectory.
Keywords: GPS, IMU, Kalman filter, materials engineering
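A loosely coupled integration of the kind described can be sketched in one dimension: the INS dead-reckons position and velocity from the accelerometer, and each GPS position fix corrects the estimate through a two-state Kalman filter. This is a simplification of the paper's 2-D post-processing, and the noise values are illustrative, not calibrated to any particular IMU.

```python
# 1-D loosely coupled INS/GPS sketch: the INS integrates a (here
# noise-free, for illustration) accelerometer to predict position and
# velocity; each GPS position fix corrects the estimate via a 2-state
# Kalman filter. Noise values q and r are illustrative.

def predict(x, P, accel, dt, q):
    """Dead-reckoning prediction with F = [[1, dt], [0, 1]] and
    discretized white-acceleration process noise of strength q."""
    px, v = x
    x_p = [px + v * dt + 0.5 * accel * dt * dt, v + accel * dt]
    # P' = F P F^T + Q
    p00 = P[0][0] + dt * (P[0][1] + P[1][0]) + dt * dt * P[1][1] + q * dt**4 / 4
    p01 = P[0][1] + dt * P[1][1] + q * dt**3 / 2
    p10 = P[1][0] + dt * P[1][1] + q * dt**3 / 2
    p11 = P[1][1] + q * dt * dt
    return x_p, [[p00, p01], [p10, p11]]

def update(x, P, z, r):
    """Measurement update with H = [1, 0]: GPS observes position only."""
    s = P[0][0] + r                       # innovation covariance
    k0, k1 = P[0][0] / s, P[1][0] / s     # Kalman gain
    innov = z - x[0]
    x_u = [x[0] + k0 * innov, x[1] + k1 * innov]
    P_u = [[(1 - k0) * P[0][0], (1 - k0) * P[0][1]],
           [P[1][0] - k1 * P[0][0], P[1][1] - k1 * P[0][1]]]
    return x_u, P_u

# Constant 1 m/s^2 acceleration from rest; true position at t is 0.5 t^2.
x, P = [0.0, 0.0], [[10.0, 0.0], [0.0, 10.0]]
for t in range(1, 6):
    x, P = predict(x, P, accel=1.0, dt=1.0, q=0.01)
    x, P = update(x, P, z=0.5 * t * t, r=4.0)  # noise-free GPS fix
# x tracks the true state [12.5, 5.0]; P shrinks below its prior.
```

In the 2-D ParIMU case, the heading gyro rotates the body-frame accelerations into the navigation frame before this same predict/update cycle is applied to each horizontal axis.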
Procedia PDF Downloads 417
10638 Design, Synthesis and Pharmacological Investigation of Novel 2-Phenazinamine Derivatives as a Mutant BCR-ABL (T315I) Inhibitor
Authors: Gajanan M. Sonwane
Abstract:
Nowadays, the entire pharmaceutical industry faces the challenge of increasing efficiency and innovation. The major hurdles are the growing cost of research and development and a concurrently stagnating number of new chemical entities (NCEs). Hence, the challenge is to select the most druggable targets and to search for the corresponding drug-like compounds, which must also possess the specific pharmacokinetic and toxicological properties that allow them to be developed as drugs. The present research work includes studies toward developing new anticancer heterocycles using molecular modeling techniques. Heterocycles synthesized through such a methodology are more effective, as the various physicochemical parameters have already been studied and the structure optimized for its best fit in the receptor. Hence, on the basis of a literature survey and considering the need to develop newer anticancer agents, new phenazinamine derivatives were designed by subjecting the nucleus to molecular modeling, viz., GQSAR analysis and docking studies. Simultaneously, these designed derivatives were subjected to in silico prediction of biological activity through PASS studies and then to in silico toxicity risk assessment. In the PASS studies, all the derivatives exhibited a good spectrum of biological activities, confirming their anticancer potential. The toxicity risk assessment revealed that all the derivatives obey Lipinski's rule. Among the series, compounds 4c, 5b, and 6c were found to possess logP and drug-likeness values comparable with those of the standard imatinib (used for the anticancer activity studies) and the standard drug methotrexate (used for the antimitotic activity studies). One of the most notable mutations is the threonine-to-isoleucine mutation at codon 315 (T315I), which is known to be resistant to all currently available tyrosine kinase inhibitors (TKIs).
An enzyme assay is planned to confirm target-selective activity.
Keywords: drug design, tyrosine kinases, anticancer, phenazinamine
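Lipinski's rule of five, cited above, can be checked mechanically once the molecular descriptors are known. The descriptor values for the candidate below are illustrative; in practice they would come from a cheminformatics toolkit.

```python
# Lipinski's rule of five applied to precomputed molecular descriptors.
# The candidate's descriptor values are hypothetical, for illustration.
def lipinski_violations(mw, logp, h_donors, h_acceptors):
    """Count violations: MW > 500 Da, logP > 5, H-bond donors > 5,
    H-bond acceptors > 10."""
    return sum([mw > 500, logp > 5, h_donors > 5, h_acceptors > 10])

def drug_like(**desc):
    # Commonly, at most one violation is tolerated.
    return lipinski_violations(**desc) <= 1

candidate = dict(mw=412.3, logp=3.8, h_donors=2, h_acceptors=6)
# this candidate has zero violations, so it passes the rule of five
```

The rule is a filter on oral bioavailability, not activity, which is why the study pairs it with PASS activity prediction and docking.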
Procedia PDF Downloads 115
10637 Vulnerability of People to Climate Change: Influence of Methods and Computation Approaches on Assessment Outcomes
Authors: Adandé Belarmain Fandohan
Abstract:
Climate change has become a major concern globally, particularly for rural communities that must find rapid coping solutions. Several vulnerability assessment approaches have been developed in the last decades. This raises the risk that different methods will reach different conclusions, making comparisons difficult and decision-making inconsistent across areas. The effect of methods and computational approaches on estimates of people's vulnerability was assessed using data collected from the Gambia. Twenty-four indicators reflecting the vulnerability components (exposure, sensitivity, and adaptive capacity) were selected for this purpose. Data were collected through household surveys and key informant interviews, with 115 respondents surveyed across six communities and two administrative districts. Results were compared over three computational approaches: maximum-value transformation normalization, z-score transformation normalization, and simple averaging. Regardless of the approach used, communities with high exposure to climate change and extreme events were the most vulnerable, and vulnerability was strongly related to the socio-economic characteristics of farmers. The survey evidenced variability in vulnerability among communities and administrative districts. Comparing outputs across approaches, people in the study area were found to be highly vulnerable overall under simple averaging and maximum-value transformation, but only moderately vulnerable under the z-score transformation approach. It is suggested that assessment-approach-induced discrepancies be accounted for in international debates on harmonizing and standardizing assessment approaches, so that outputs become comparable across regions.
This would also likely increase the relevance of decision-making for adaptation policies.
Keywords: maximum value transformation, simple averaging, vulnerability assessment, West Africa, z-score transformation
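The effect of the computational approach can be illustrated with hypothetical indicator values: maximum-value and z-score normalization preserve the ranking of communities but put the composite index on different scales, so a fixed "highly vulnerable" cutoff can classify the same community differently.

```python
import statistics

# Hypothetical indicator values (three indicators, four communities);
# the figures are illustrative only, not the Gambia survey data.
data = {
    "Comm1": [20.0, 0.6, 3.0],
    "Comm2": [35.0, 0.8, 5.0],
    "Comm3": [50.0, 0.4, 2.0],
    "Comm4": [15.0, 0.9, 4.0],
}

def max_value(col):
    m = max(col)
    return [v / m for v in col]

def z_score(col):
    mu, sd = statistics.mean(col), statistics.pstdev(col)
    return [(v - mu) / sd for v in col]  # assumes sd > 0

def composite(norm):
    """Normalize each indicator column, then average per community."""
    cols = list(zip(*data.values()))
    normed_rows = zip(*(norm(list(c)) for c in cols))
    return {k: statistics.mean(row) for k, row in zip(data, normed_rows)}

idx_max = composite(max_value)
idx_z = composite(z_score)

def rank(d):
    return sorted(d, key=d.get, reverse=True)
# The community ordering agrees under both approaches, but the scales
# differ: a cutoff such as "highly vulnerable above 0.5" flags Comm1
# under max-value normalization, while its z-score index is below the
# sample average.
```

This mirrors the study's finding that classification (highly vs. moderately vulnerable), rather than relative ordering, is what shifts with the computational approach.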
Procedia PDF Downloads 103
10636 Modelling Conceptual Quantities Using Support Vector Machines
Authors: Ka C. Lam, Oluwafunmibi S. Idowu
Abstract:
Uncertainty in cost is a major factor affecting the performance of construction projects. To our knowledge, several conceptual cost models have been developed with varying degrees of accuracy, and incorporating conceptual quantities into such models could improve the accuracy of early pre-design cost estimates. Hence, the aim of the current research is the development of quantity models for estimating conceptual quantities of framed reinforced concrete structures using supervised machine learning. Using measured quantities of structural elements and design variables such as live loads and soil bearing pressures, response and predictor variables were defined and used to construct conceptual quantity models. Twenty-four models were developed for comparison using a combination of non-parametric support vector regression, linear regression, and bootstrap resampling techniques. The R programming language was used for data analysis and model implementation. Gross soil bearing pressure and gross floor loading were found to have a major influence on the quantities of concrete and reinforcement used for foundations, while building footprint and gross floor loading had a similar influence on beams and slabs. Future research could explore the modelling of other conceptual quantities for walls, finishes, and services using machine learning techniques. Estimation of conceptual quantities would assist construction planners in early resource planning and enable detailed performance evaluation of early cost predictions.
Keywords: bootstrapping, conceptual quantities, modelling, reinforced concrete, support vector regression
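The bootstrap component of the methodology can be sketched independently of the SVR models (which the study fitted in R): resample the observations with replacement, refit a simple quantity model each time, and read a percentile confidence interval off the resampled estimates. The data pairs below are hypothetical, and ordinary least squares stands in for SVR; the resampling logic is the same.

```python
import random

# Bootstrap percentile interval for a least-squares slope (e.g. a
# concrete quantity against gross floor loading). Data pairs are
# hypothetical; OLS stands in for the study's support vector regression.
random.seed(7)
data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2), (4.0, 8.1), (5.0, 9.8)]

def ols_slope(pairs):
    n = len(pairs)
    mx = sum(x for x, _ in pairs) / n
    my = sum(y for _, y in pairs) / n
    num = sum((x - mx) * (y - my) for x, y in pairs)
    den = sum((x - mx) ** 2 for x, _ in pairs)
    return num / den

slopes = []
for _ in range(1000):
    sample = [random.choice(data) for _ in data]
    if len({x for x, _ in sample}) < 2:   # degenerate resample: skip
        continue
    slopes.append(ols_slope(sample))

slopes.sort()
ci = (slopes[int(0.025 * len(slopes))], slopes[int(0.975 * len(slopes))])
# ci brackets the full-sample slope of roughly 1.96
```

The width of `ci` is the kind of uncertainty measure that lets early cost predictions be evaluated rather than reported as bare point estimates.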
Procedia PDF Downloads 204